I am pretty new to ROS and Unity. I have managed to display a live video stream coming from another computer inside Unity on a RawImage, using ROS TCP Connector and ROS TCP Endpoint to get the stream into Unity. But I need to do this on an Oculus Quest 2. Do I need to install something on the VR headset to run the ROS TCP Endpoint? Or, as long as my VR headset is connected to the computer running Unity, will the computer handle that part and send the live stream to the headset?
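For reference, the display part of my project is roughly like this (a simplified sketch of my subscriber; the topic name and the raw rgb8 encoding are just my setup, and the frame may arrive vertically flipped since ROS and Unity use opposite row orders):

using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Sensor;
using UnityEngine;
using UnityEngine.UI;

public class VideoFeedSubscriber : MonoBehaviour {
    // The RawImage in the scene that shows the stream.
    public RawImage rawImage;
    // Assumption: whatever topic your camera publisher advertises.
    public string topicName = "/camera/image_raw";

    private Texture2D texture;

    private void Start() {
        ROSConnection.GetOrCreateInstance().Subscribe<ImageMsg>(topicName, OnImage);
    }

    private void OnImage(ImageMsg msg) {
        // Assumes raw rgb8 frames; a CompressedImage topic would need
        // CompressedImageMsg and Texture2D.LoadImage instead.
        if (texture == null) {
            texture = new Texture2D((int)msg.width, (int)msg.height, TextureFormat.RGB24, false);
            rawImage.texture = texture;
        }
        texture.LoadRawTextureData(msg.data);
        texture.Apply();
    }
}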
So there are generally two options: either you run your application as a standalone Quest application directly on the headset, or as a PC VR application (the application runs on a PC and only displays its output to the headset). I would suggest the second one, as it is less complicated, and if you have your ROS application running correctly in Unity, you are basically halfway there.
The only thing left for you to figure out is how to develop and build PC VR applications, as the ROS configuration will stay the same. Though here I don't know how deep your knowledge and background go. If you are a total beginner with VR in Unity, I would suggest this course (at least skim through it): Create with VR - Unity Learn
I did the same with my ROS project, also using a Quest 2. Basically, I would use the OpenXR Plugin and the XR Interaction Toolkit in Unity, SteamVR to run the application in VR, and then you can easily use your Quest 2 either wired to your PC or wirelessly (Air Link). The trickiest part is probably setting up Unity so you can test the application in play mode in VR without having to build everything.
Hello @Envilon, thank you for your kind advice and for pointing me to the VR course; it helped a lot as a total beginner. I took the PC VR path and even managed to test the application in play mode using Oculus Link. But there is a problem: when I press Play in the Unity window inside Oculus Link, the VR space goes black for a couple of seconds, then returns to the Oculus Link space with the Unity window displayed just as I would see it on the PC. When I turn my headset or move the controllers, I can see the Game view, but it doesn't react to the VR controllers; instead of switching to VR controls, the Oculus controllers keep controlling the mouse cursor and clicking with their triggers.
Hey, sorry for taking so long; I always forget to check the Unity forums now and then, and unfortunately the robotics forum is less active than the others here.
As I said in my previous post, this is in my opinion the trickiest part to set up, and it took me some time to figure out. Unfortunately, it was something I solved at the beginning of my project and then forgot about, but it sure took a lot of googling and fiddling.
First of all, do the VR controls function normally in a build, and is this only an issue in the editor's play mode?
If I remember correctly (but I’m not 100% sure), to solve this issue in play mode, I used this package that I found: GitHub - shiena/OpenXRRuntimeSelector: Runtime Json Selector for Unity OpenXR
You'll probably need to do something similar to the provided sample code. But I remember it was not easy to get running correctly, and there were some errors, so I had to rewrite parts of it. So here's my AutomaticXRLoading, and maybe it will work for you too, or you'll at least be pretty close to a working solution:
using System.Collections;
using OpenXRRuntimeJsons;
using UnityEngine;
using UnityEngine.XR.Management;

public class AutomaticXRLoading : MonoBehaviour {
    private void Start() {
        // Get the OpenXR runtime JSONs available on this machine.
        var openXRRuntimes = OpenXRRuntimeJson.GetRuntimeJsonPaths();
        if (openXRRuntimes.ContainsKey(OpenXRRuntimeType.Oculus)) {
            // Select the Oculus OpenXR runtime, then initialize XR manually.
            OpenXRRuntimeJson.SetRuntimeJsonPath(OpenXRRuntimeType.Oculus);
            StartCoroutine(StartXrCoroutine());
        }
    }

    private void OnDestroy() {
        // Tear XR down again when leaving play mode / destroying the object.
        if (!XRGeneralSettings.Instance.Manager.isInitializationComplete) {
            return;
        }
        Debug.Log("Stopping XR...");
        XRGeneralSettings.Instance.Manager.StopSubsystems();
        XRGeneralSettings.Instance.Manager.DeinitializeLoader();
        Debug.Log("XR stopped completely.");
    }

    public static IEnumerator StartXrCoroutine() {
        // Nothing to do if an XR loader is already active.
        if (XRGeneralSettings.Instance.Manager.isInitializationComplete) {
            yield break;
        }
        Debug.Log("Initializing XR...");
        yield return XRGeneralSettings.Instance.Manager.InitializeLoader();
        if (!XRGeneralSettings.Instance.Manager.isInitializationComplete) {
            Debug.LogError("Initializing XR failed. Check Editor or Player log for details.");
        } else {
            Debug.Log("Starting XR...");
            XRGeneralSettings.Instance.Manager.StartSubsystems();
        }
    }
}
Just add it to some game object in the scene (it does not matter which one). For example, I had it attached to a parent object holding all my VR stuff.
No problem about taking long, and thank you for your kind advice. I just switched my XR plug-in provider from OpenXR to Oculus, both for Windows and Android. Now when I press the Play button, Oculus Link inside my VR headset switches to the Unity app automatically, and I am able to use the VR controls.
Incidentally, how would I go about that first option you mentioned earlier? Do I need to install rosjava? Or is the only thing I need to do to run roslaunch ros_tcp_endpoint endpoint.launch on the computer from which I am going to build my app for Android, before the build?
Unfortunately, at the time of my project I was unable to create a standalone Oculus application, and since I did not need it, I left the idea there. However, in my limited understanding of the subject, you'll need to:
Figure out the networking: You'll need a working networking solution for how your Oculus Quest will connect and communicate with the machine / virtual machine / Docker container where your ROS application runs. Since you won't be running your Unity application on the same machine, it will be a bit more complicated. Something similar was discussed here, I believe.
Build your Unity application with Oculus Quest as the target device: an Android build is probably enough; I don't know if there are any other requirements.
Get the built application onto the Oculus: Nowadays there might be more ways to do this (I have not used the headset for a few months), but I know about SideQuest, for example, which lets you load and run third-party applications.
In my opinion, there is no need for rosjava; I don't see why you would need it here.
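As far as I understand it, the Unity side otherwise stays the same: ROS TCP Connector in your build just has to point at the machine running ros_tcp_endpoint instead of localhost, and the endpoint has to listen on all interfaces (e.g. roslaunch ros_tcp_endpoint endpoint.launch tcp_ip:=0.0.0.0 tcp_port:=10000). Here is a minimal sketch of setting that from code, assuming ROSConnection still exposes RosIPAddress and RosPort the way I remember (the address below is a placeholder for your endpoint PC's LAN IP):

using Unity.Robotics.ROSTCPConnector;
using UnityEngine;

public class EndpointConfig : MonoBehaviour {
    private void Awake() {
        // Point the connector at the PC running ros_tcp_endpoint.
        // 192.168.1.50 is a placeholder -- use the endpoint PC's LAN IP.
        ROSConnection ros = ROSConnection.GetOrCreateInstance();
        ros.RosIPAddress = "192.168.1.50";
        ros.RosPort = 10000;
    }
}

You can also just type the IP into the ROSConnection prefab's ROS Settings before building; the main point is that 127.0.0.1 will not work from the headset, because from the Quest's point of view that address is the headset itself.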
I have two PCs and the VR headset on the same network. One PC runs the ROS master and publishes the live video feed as a ROS message. The other PC (the one with Unity installed) runs
roslaunch ros_tcp_endpoint endpoint.launch
and the VR headset subscribes to the ROS topic for the live feed. Here are my ROS Settings:
When I build and run for Android, I can see from the output of
roslaunch ros_tcp_endpoint endpoint.launch
running on the PC with Unity that the app inside the VR headset never connects to the ROS master or subscribes to the ROS topics published from the PC running roscore.
For the build part, selecting Build and Run in Unity installs the app on the VR headset.