Project Information

This project took place straight after the completion of the Lego project, with an equally tight deadline. The output was two applications. The first was a VR application in which users could open the app, walk around the scene, and see the people connected through the second application (a desktop app).

If a user has their webcam enabled in the desktop app, the VR user can see them in the scene. VR users can also view a desktop user's screen share. Finally, VR users can spawn and grab virtual objects in the Unity scene.

Essentially, this project was built to showcase Agora's real-time messaging system. Using their SDK, we created a way for desktop and VR users to share the same environment and communicate with each other.
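
In the project itself the SDK calls lived behind a framework interface (the IRtcSystem used by the scripts below) rather than in scene scripts. Still, as a rough illustration of the raw SDK surface, a minimal join with Agora's Video SDK for Unity might look like the sketch below; this is my own 3.x-era sketch with placeholder credentials, not code from the project.

using agora_gaming_rtc;
using UnityEngine;

public class AgoraJoinExample : MonoBehaviour
{
    // Placeholder credentials; a real app would load these from configuration.
    private const string AppId = "<your-agora-app-id>";
    private const string ChannelName = "demo-channel";

    private IRtcEngine rtcEngine;

    private void Start()
    {
        // Create (or fetch) the singleton engine instance for this app ID.
        rtcEngine = IRtcEngine.GetEngine(AppId);

        // Enable video capture and rendering before joining.
        rtcEngine.EnableVideo();
        rtcEngine.EnableVideoObserver();

        // Join with no token and an auto-assigned user id.
        rtcEngine.JoinChannel(ChannelName, null, 0);
    }

    private void OnDestroy()
    {
        // Leave the channel and release the engine on teardown.
        rtcEngine.LeaveChannel();
        IRtcEngine.Destroy();
    }
}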

I was not heavily involved in this project; I only contributed some code to the desktop application. Most, if not all, of the other code was written by Solarflare's lead developer, who managed an extraordinary amount of work within such a tight deadline.

Image showing the complete Agora demo

Development:

As mentioned, Solarflare's lead developer carried out the majority of the development, so I only participated in a fraction of the overall work. For that reason, I will only write briefly about the development process, covering the parts I worked on.

MicButton.cs
using Solarflare.HTCDemo.Framework.App;
using Solarflare.HTCDemo.Framework.Rtc;
using UnityEngine;
using UnityEngine.UI;

namespace Solarflare.HTCDemo.DesktopApp.UI
{
    public class MicButton : MonoBehaviour
    {
        private IRtcSystem rtcSystem;

        private void Awake()
        {
            // Grab the shared RTC system from the framework singleton.
            rtcSystem = HTCDemoFrameworkApp.Instance.RtcSystem;

            GetComponent<Button>().onClick.AddListener(OnClicked);
        }

        private void OnClicked()
        {
            // Toggle the local microphone's mute state.
            rtcSystem.IsMuted = !rtcSystem.IsMuted;
        }
    }
}
The above code is typical of what I was writing. In Unity, I implemented the UI for the desktop application, then added scripts to each button to control both the button's state and its behaviour when pressed.

Because Basar (the lead developer) wrote very clean, high-quality code, I could simply use the interface he created (IRtcSystem) and update values through it, which in turn affected both applications.
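
The scripts in this post only consume that interface; its definition never appears. Reconstructed from the members they call, it plausibly looked something like the sketch below, where the member names come straight from the snippets but the exact signatures, and any VideoMode value beyond VirtualCamera, are my assumptions rather than the original declarations.

using System;
using System.Threading.Tasks;
using UnityEngine;
using UnityEngine.UI;

namespace Solarflare.HTCDemo.Framework.Rtc
{
    // Hypothetical video modes; only VirtualCamera appears in this post.
    public enum VideoMode
    {
        Webcam,
        VirtualCamera
    }

    // Reconstruction of IRtcSystem inferred from the scripts in this post.
    // Member names come from the snippets; exact signatures are assumptions.
    public interface IRtcSystem
    {
        bool IsMuted { get; set; }            // toggled by MicButton
        bool LocalVideoEnabled { get; set; }  // read by CameraStateView (setter assumed)

        event Action<string> UserJoined;      // raised with the remote user's id

        Task JoinChannel(string channelName, VideoMode videoMode);
        void AttachVideoToRawImage(RawImage target, string userId);
        void StartVirtualCameraViewStream(Camera renderTextureCamera);
        void StopVirtualCameraViewStream();
    }
}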

CameraStateView.cs
using Solarflare.HTCDemo.Framework.App;
using Solarflare.HTCDemo.Framework.Rtc;
using UnityEngine;
using UnityEngine.UI;

namespace Solarflare.HTCDemo.DesktopApp.UI
{
    [RequireComponent(typeof(Image))]
    public class CameraStateView : MonoBehaviour
    {
        [SerializeField] private Sprite camOnSprite;
        [SerializeField] private Sprite camOffSprite;

        private Image image;
        private IRtcSystem rtcSystem;

        private void Awake()
        {
            image = GetComponent<Image>();
            rtcSystem = HTCDemoFrameworkApp.Instance.RtcSystem;
        }

        private void Update()
        {
            // Mirror the RTC camera state onto the button icon each frame.
            image.sprite = rtcSystem.LocalVideoEnabled ? camOnSprite : camOffSprite;
        }
    }
}

Above is a typical state-view script I wrote to change a button's appearance when it was selected or deselected.
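
The state view above only reads LocalVideoEnabled, so the matching toggle would have lived elsewhere, presumably in a button script mirroring MicButton. Below is a minimal sketch of such a companion, assuming LocalVideoEnabled also exposes a setter; the post never shows the real camera button.

using Solarflare.HTCDemo.Framework.App;
using Solarflare.HTCDemo.Framework.Rtc;
using UnityEngine;
using UnityEngine.UI;

namespace Solarflare.HTCDemo.DesktopApp.UI
{
    // Hypothetical camera toggle following the same pattern as MicButton.
    // Assumes LocalVideoEnabled has a setter, which the post does not show.
    public class CameraButton : MonoBehaviour
    {
        private IRtcSystem rtcSystem;

        private void Awake()
        {
            rtcSystem = HTCDemoFrameworkApp.Instance.RtcSystem;
            GetComponent<Button>().onClick.AddListener(OnClicked);
        }

        private void OnClicked()
        {
            // Flip the local webcam feed on or off.
            rtcSystem.LocalVideoEnabled = !rtcSystem.LocalVideoEnabled;
        }
    }
}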

StreamVirtualCamera.cs
using System;
using Solarflare.HTCDemo.Framework.App;
using Solarflare.HTCDemo.Framework.Rtc;
using Solarflare.HTCDemo.Framework.Sessions;
using UnityEngine;
using UnityEngine.UI;

namespace Solarflare.HTCDemo.DesktopApp.Rtc
{
    public class StreamVirtualCamera : MonoBehaviour
    {
        [SerializeField] private RawImage remoteVideo;
        [SerializeField] private bool streamVirtualCameraView;
        [SerializeField] private Camera renderTextureCamera;

        private IRtcSystem rtcSystem;
        private ISession session;
        private bool streamingVirtualCamera;

        private void Awake()
        {
            rtcSystem = HTCDemoFrameworkApp.Instance.RtcSystem;
            session = HTCDemoFrameworkApp.Instance.Session;
        }

        private async void Start()
        {
            // Join the session-specific RTC channel, then listen for remote users.
            await rtcSystem.JoinChannel($"{session.SessionId}-Rtc", VideoMode.VirtualCamera);

            rtcSystem.UserJoined += OnUserJoined;
        }

        private void OnUserJoined(string userId)
        {
            if (string.IsNullOrEmpty(userId)) return;

            // Render the remote user's video feed onto the assigned RawImage.
            rtcSystem.AttachVideoToRawImage(remoteVideo, userId);
        }

        private void Update()
        {
            // Start or stop the virtual camera stream when the inspector flag flips.
            if (streamVirtualCameraView)
            {
                if (streamingVirtualCamera == false)
                {
                    rtcSystem.StartVirtualCameraViewStream(renderTextureCamera);
                    streamingVirtualCamera = true;
                }
            }
            else
            {
                if (streamingVirtualCamera)
                {
                    rtcSystem.StopVirtualCameraViewStream();
                    streamingVirtualCamera = false;
                }
            }
        }
    }
}
I also contributed some of the code for streaming the virtual camera, as shown above.
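
One small caveat with the script above: it subscribes to UserJoined but never unsubscribes, and a stream could be left running if the component is destroyed. A cleanup of my own devising (not part of the original code) could be added to the same class:

        // Hypothetical cleanup for StreamVirtualCamera; not part of the original code.
        private void OnDestroy()
        {
            if (rtcSystem == null) return;

            // Detach the handler so a destroyed component no longer receives events.
            rtcSystem.UserJoined -= OnUserJoined;

            // Ensure the virtual camera stream is not left running.
            if (streamingVirtualCamera)
            {
                rtcSystem.StopVirtualCameraViewStream();
            }
        }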

Showcase:

Extra Information