Hey there! I’m a teacher trying to set up a VR game design class at my school. We have a bunch of Oculus Quests and Unity seems to run fine on our computers, but they are far from VR-Ready computers. Is there any way you can think of to develop a VR Unity game on our computers, and then rely on the hardware of the Oculus Quest to run the game instead of our underpowered computers? That or any other recommendations you have for this situation would be much appreciated. Thanks!
If they can run a Quest with Link cable then the development will be nice. Otherwise it will be hell.
That is a terrible, terrible idea. Link for Quest means needing above the Rift requirements (which are already the lowest of any tethered headset).
Without powerful computers it will just be hellish to run it in the editor over Link – Link has higher computer requirements than running the same app on a Rift, because of the required video compression over USB or WiFi. Link puts all of the game’s processing work on the PC, with the Quest being little more than a fancy controller-tracking monitor running Android.
While what they suggested above is technically possible, developing anything where you have to compile it before running it (dozens to hundreds of times per development session) is hellish in a completely different regard: your iteration speed is cut down to at minimum a 3-4 minute wait before testing anything, and you lose any ability to try out or change things in the app while it’s running, because it’s never actually being “run in editor”, only built and bundled there. All the Link cable would be providing in that setup is an error log and app build transfer. Plus the required Oculus desktop app for Link won’t even install on unsupported hardware – for good reason.
Wouldn’t Quest WITHOUT Link work OK? It requires doing Android builds and deploying and debugging with ADB, but it is doable. I did my first VR app dev this way on a Go with Unity.
The clunkiness of that workflow is painful to use long term. Possible – yes – recommended… as @hippocoder said:
[quote=“hippocoder, post:2, topic: 859385, username:hippocoder”]
If they can run a Quest with Link cable then the development will be nice. Otherwise it will be hell.
[/quote]
You don’t seem to understand. Link is fairly mandatory for Quest development. Not for the performance (that’s far from the point) but because VR is the most iterative of all disciplines in game development. You will lose pretty much 90% of your lesson time on compiling, and you will need to compile pretty much with every tiny change you make - unlike other hardware.
Without Link, it’s hell. I know because I started development on Quest in 2019, when Link didn’t exist.
Would appreciate less arrogance in replies in future.
Rubbish! I’ve been doing Quest development professionally for years now — since before the Quest came out, in fact — on a Mac, with no Link in sight. Link might be nice (I wouldn’t know) but it is certainly not necessary, and development without it is not hellish. I find it quite enjoyable.
My projects are set up such that I can run them in the editor, and they work, but obviously I’m not wearing the headset or using the controllers, so I have keyboard/mouse stand-ins (or sometimes just grab things in the Scene view and drag them around) as a substitute. This works well enough for 90% of the stuff I need to iterate on. When I really am doing something that requires testing in VR, like some new wand gesture or sword-based combat or whatever, then obviously I hit cmd-B to build to the device, and put it on. This takes maybe 30-45 seconds on my MacBook Pro. Slower than hitting “Play”, but much faster than, say, doing a web build of literally any Unity program.
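To give a concrete (if simplified) idea of what such a stand-in can look like: a rough sketch along these lines, where the hand transform and the grab messages are placeholder names rather than anything from my actual projects, is usually enough to exercise interaction code from the editor:

```csharp
using UnityEngine;

// Rough editor-only stand-in: moves a "hand" transform with the mouse and
// fires grab messages on the space bar, so interaction code can be exercised
// without the headset. handTransform / OnGrabPressed are placeholder names.
public class EditorHandStandIn : MonoBehaviour
{
    [SerializeField] Transform handTransform; // whatever transform your grab logic already follows
    [SerializeField] float depth = 0.5f;      // distance in front of the camera to place the hand

    void Update()
    {
        if (!Application.isEditor) return;    // does nothing in a device build

        var cam = Camera.main;
        if (cam == null || handTransform == null) return;

        // Follow the mouse cursor, projected a fixed distance in front of the camera.
        Vector3 screenPos = Input.mousePosition;
        screenPos.z = depth;
        handTransform.position = cam.ScreenToWorldPoint(screenPos);

        // Space bar stands in for the controller's grip button.
        if (Input.GetKeyDown(KeyCode.Space))
            SendMessage("OnGrabPressed", SendMessageOptions.DontRequireReceiver);
        if (Input.GetKeyUp(KeyCode.Space))
            SendMessage("OnGrabReleased", SendMessageOptions.DontRequireReceiver);
    }
}
```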
Also, I often redirect Debug.Log to a scrolling text on a big canvas hanging out in the VR environment somewhere, so I can see my log messages in the headset. Even better than relying on Link, which would require you to take the headset off to read these.
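The log redirect is just a hook on Application.logMessageReceived feeding a world-space canvas. A minimal sketch (the logText field and the line limit are my own illustration, not a drop-in from my projects):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

// Minimal "Debug.Log on a canvas in VR" sketch: subscribes to Unity's log
// callback and appends messages to a world-space UI Text element.
public class VRLogPanel : MonoBehaviour
{
    [SerializeField] Text logText;      // Text on a world-space canvas in the scene
    [SerializeField] int maxLines = 30; // keep the panel from growing forever

    readonly Queue<string> lines = new Queue<string>();

    void OnEnable()  { Application.logMessageReceived += HandleLog; }
    void OnDisable() { Application.logMessageReceived -= HandleLog; }

    void HandleLog(string message, string stackTrace, LogType type)
    {
        lines.Enqueue($"[{type}] {message}");
        while (lines.Count > maxLines) lines.Dequeue();
        logText.text = string.Join("\n", lines);
    }
}
```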
So @robageejammin , yes, you can do VR development with Quest headsets on your computers, provided they can run Unity decently at all. Give your students a starter project that already has the VR basics set up, and show them how to test in the editor as well as the headset.
I apologize, I wasn’t trying to be arrogant, but I see why you would think so. I posed that as a question because I haven’t done ADB dev with VR since the Oculus Go, so I am uncertain and not trying to be flippant. I shouldn’t have capitalized the “without”; I was just trying to make sure I was clear. If you are budget-constrained but want to learn, you might still want to go with an older-model Android-based solution. GPU silicon is ridiculously priced right now.
That is the Truth.
That is less time than it can take Unity to enter Play mode in the editor over Link on a good day, even for clean projects.
Edit: (That was with what I later learned was a bad Link cable. Getting a good-quality (fabric) Link cable, as was suggested below, made everything so much better for me; all it required was a $20 quality cord. Entering/exiting Play mode is now about 30-45 seconds max for me in 2021.2.0f1. I was previously using a 10 ft rubber cable which had been rolled over 150 times by the desk chair. Each time it disconnected it would require a 2-minute editor reboot, or sometimes a full power cycle on the Quest to get it operational again.)
It’s being able to see Inspectors which makes the real difference for me. Debugging without the Debug tab on in the Inspector is like fishing without boots while standing in time-sucking leeches… (I’ve never lacked system support for Link, I just avoided developing for Quest before Link was around – after going through hell waiting on 10-minute il2cpp builds for a past WebGL project) [also much improved by the multithreaded compiler since then]… There is a reason I advise doing VR development on Windows, or anything else with Link support.
My main argument against teaching through that Mac-style build-and-deploy workflow is that you really do have to be physically inside your game’s interface constantly – with fast iterative changes – to really feel what needs to change or be developed. There is no academic or structural substitute for rapid experiential testing and value tweaking. Tweak a value and see how it feels two seconds later rather than two minutes. It’s also why you see $10k mountain bikes and $2k production-tier GPUs.
I am developing a game for the Quest 2 and have no problem with Unity.
You do NOT need a powerful graphics card on the PC, precisely because when you deploy to the headset, the headset takes care of everything.
And the PC’s computing power is more than enough to see what’s going on (except for the controls, of course, and a fully faithful view).
I am developing WITHOUT using direct simulation from Unity. It’s not the best thing, but it’s doable.
I deploy through the cable, then disconnect and see what is going on.
I can’t agree more. Build & Deploy takes about as long as it takes me to put on the headset (first build excepted…shaders always take forever). I dev on Mac and Win and have only needed Link once to help me align controllers with virtual hands in a VR body.
@robageejammin You are on a very technical forum with passionate engineers, so I wouldn’t let this back and forth concern you.
I think there are good points from everyone that depend on your perspective and more importantly the demands of your projects.
I feel like you are doing more of an introductory course based on your communication. With this, I personally wouldn’t shy away from trying to create a VR course where you use Unity with the Quest devices on slower machines. Especially since the alternative is no VR course for your students.
I suggest you explore this yourself to see if you feel like you can build a course with your available equipment. I would start with Unity XR and the XR Interaction Toolkit. There are a number of great basic example projects and tutorials you could use to build a course, and the code requirements are reduced.
The main concern about needing to be in VR regularly to properly build a game can be alleviated by building simple projects with basic interactivity, which can be run in the Unity editor before building and testing in VR. The tutorials and samples should help.
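As a rough illustration of how little code basic interactivity takes once the XR Interaction Toolkit is set up (this sketch assumes the toolkit package is installed and uses its pre-3.0 namespace and events; the highlight material is purely illustrative, and most of the real setup happens in the Inspector rather than in code):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative sketch: highlight an XR Grab Interactable while it is held.
// Assumes a Renderer on the same object and a highlight material assigned
// in the Inspector; event names may differ slightly across toolkit versions.
[RequireComponent(typeof(XRGrabInteractable))]
public class GrabHighlight : MonoBehaviour
{
    [SerializeField] Material highlightMaterial;
    Material originalMaterial;
    Renderer rend;

    void Awake()
    {
        rend = GetComponent<Renderer>();
        originalMaterial = rend.material;

        var grab = GetComponent<XRGrabInteractable>();
        grab.selectEntered.AddListener(_ => rend.material = highlightMaterial);
        grab.selectExited.AddListener(_ => rend.material = originalMaterial);
    }
}
```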
I think this is a great idea in that (assuming this is an introductory course) you will be teaching them the general concepts of 3D, programming, and game design, as well as VR basics. In addition, I feel like VR may attract people who might not otherwise go to a game dev class.
I am happy to help with more specifics if you like, just reply to this thread.
I invested in a 6-foot flexible (fabric-sheathed rather than plastic) USB cable, so I don’t even disconnect. I leave my headset plugged in all the time, so it’s just build & run, scooch my chair back, stand up, and put the headset on. If I need to do an “adb logcat | fgrep Unity” in Terminal while testing, I can do that too, because it’s still plugged in.
I do agree that being able to use the Inspector to see and tweak values while running would be nice. But I’ve gotten comfortable and productive without it.
Good for you… unfortunately I don’t have enough space in my room to stand up and test with the cable connected, so I’m forced to move into the living room, even though my cable is 8 meters long.
But anyway, the game I’m developing is Quest-resident, and I never try to run it on the PC; my deploys are all Quest-native, so it’s perfectly doable.
For anyone who is forced to go without an in-game inspector, you can use this – it has one you can turn on in debug or release builds, with an in-game overlay and a great console. I’ve only tested it over Link, and on the Rift 1 when running HDRP. I use it all the time. Just make sure not to leave it enabled in release. Debugging Essentials | Utilities Tools | Unity Asset Store
I can’t thank you all enough for your diligent and helpful responses! It seems like there is a decent possibility to make this work based on what I’ve heard so I will absolutely look into all of the solutions you mentioned and will be in touch. Thank you again!