Victories and Trials in VR

Holy cow! I was finally able to get Unity to play a build without having to build it! First time it's happened! Previously I had to build an APK and use an external loader to make it happen, but now I can see edits live! Saving me hours every day! Maybe even more time is saved now that I can see problems and fix them at the same time!

How I fixed it:

  • Got a new motherboard for the laptop
  • Uninstalled all graphics drivers and reinstalled only the essential ones from Windows and Asus/Nvidia
  • Got a proper USB-C cable with max data and power transfer (it wasn't cheap, nor was it flexible, but it was still 1/3 the cost of the "official" Link cable). Yeah, it's still jittery (as was the official one), but not as bad. Only used while seated.
  • Tried sooo many things, re-installed, etc.
  • Used a 2020.3.21f1 build with OVR Integration (hand tracking etc. not working yet, but I'll try something else; it also needs the XR transportation rig [etc.] installed for proper use), removed Vulkan graphics (as per the console notes), then installed the XR plugin and changed the colour space to Linear (as per the console)
  • Searching found this little gem (https://forum.unity.com/threads/how-to-preview-using-play-button-on-oculus-quest.828174/) and it worked!
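Some of the settings changes above (Linear colour space, dropping Vulkan) can also be made from an editor script rather than clicking through the menus. A minimal sketch, assuming the Android build target; the `Tools` menu entry and class name are my own invention:

```csharp
using UnityEditor;
using UnityEngine;
using UnityEngine.Rendering;

public static class QuestBuildSetup
{
    [MenuItem("Tools/Configure For Quest")] // hypothetical menu entry
    public static void Configure()
    {
        // Colour space -> Linear, as the console notes requested
        PlayerSettings.colorSpace = ColorSpace.Linear;

        // Remove Vulkan: take manual control of the API list,
        // then use only OpenGLES3 on Android
        PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.Android, false);
        PlayerSettings.SetGraphicsAPIs(
            BuildTarget.Android,
            new[] { GraphicsDeviceType.OpenGLES3 });
    }
}
```

This would live under an `Editor` folder; the inspector route does exactly the same thing.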

Unfortunately, Unity shuts down seconds after stopping play 🙁 Fixed by creating a new project (lol). Now I have the Link cable working and can play directly from Unity without it crashing on completion. But I have a "new" problem, and that is no hand tracking when using the Link cable.

From another video (https://www.youtube.com/watch?v=a8YJmPx6HcQ) I should switch the build target to Android, and then he says to choose graphics settings (but he doesn't, as it's just a demo [normally I choose ASTC]). He then goes to Player Settings and chooses a deprecated setting, "VR Supported". And now I see it's a 2019 release and not XR (he said the info would be in a future video, but this is his last ever video and it's over a year old), so this step doesn't work. I'll try something else.

I'm keeping the build target as Android atm (it takes time to change) and trying the Oculus page (https://developer.oculus.com/documentation/unity/unity-handtracking/). I followed its steps, changing OVRCameraRig (Hierarchy) → OVR Manager/Quest Features/General/Hand Tracking Support → Controllers and Hands (seems like this should've been part of the tutorial), as well as OVR Manager/Tracking/Tracking Origin Type → Floor Level. Oh no, Unity crashed. So something in the last few steps causes it to crash.
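For reference, the Tracking Origin change can also be made at runtime. A sketch assuming the Oculus Integration's OVRManager API (names may differ between integration versions; the Hand Tracking Support option itself lives in the project config, so only the origin is shown here):

```csharp
using UnityEngine;

// Sketch: the Tracking Origin Type -> Floor Level inspector tweak,
// done from a startup script instead. Assumes OVRManager exists in the scene.
public class RigSetup : MonoBehaviour
{
    void Start()
    {
        OVRManager.instance.trackingOriginType = OVRManager.TrackingOrigin.FloorLevel;
    }
}
```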

Checking the console, the tutorial missed a number of steps. Firstly, the cube needed a Rigidbody (duh), and there is a message "You are trying to read Input using the UnityEngine.Input class, but you have switched active Input handling to Input System package in Player Settings" repeated over and over (probably in a loop checked every frame). There are many potential solutions to that problem (https://forum.unity.com/threads/unity-render-pipeline-debug-clashes-with-new-input-system.735179/page-2), but I'll be using the generally liked one on that page: Project Settings/Player/Settings for Android/Other Settings/Configuration/Active Input Handling → from Input System Package (New) to Both.
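With Active Input Handling set to Both, scripts can legally read from either API, which is exactly what silences that console spam. A quick sketch (the new-system half assumes the Input System package is installed):

```csharp
using UnityEngine;
using UnityEngine.InputSystem; // new Input System package

public class DualInputProbe : MonoBehaviour
{
    void Update()
    {
        // Legacy UnityEngine.Input: only allowed when handling is "Both" or "Old"
        if (Input.GetKeyDown(KeyCode.Space))
            Debug.Log("Space pressed (legacy API)");

        // New Input System equivalent of the same check
        if (Keyboard.current != null && Keyboard.current.spaceKey.wasPressedThisFrame)
            Debug.Log("Space pressed (new API)");
    }
}
```

With "Both" selected, each `Update` can mix the two styles while migrating.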

Serendipitously, I can now locomote using my controllers, which I couldn't do before!! My left thumbstick moves while the right rotates. No teleport, but I can use the right grip to pick up the cube and move it around. I still don't have hand tracking though… I also noticed that the console messages stayed clean, but Unity quit as soon as I took the headset off 🙁 Yep, I'm able to replicate it: if I quit inside the headset, take it off, and then hit the play button (to stop the game in Unity), Unity quits completely in about 10 seconds.
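The move-with-left, rotate-with-right behaviour I'm seeing would look something like this as a script; a sketch assuming the Oculus Integration's OVRInput API, with made-up speed values:

```csharp
using UnityEngine;

// Sketch of thumbstick locomotion: left stick translates, right stick yaws.
// Attach to the player rig root; moveSpeed/turnSpeed are assumed numbers.
public class SimpleLocomotion : MonoBehaviour
{
    public float moveSpeed = 2f;   // metres per second (assumed)
    public float turnSpeed = 60f;  // degrees per second (assumed)

    void Update()
    {
        Vector2 move = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
        Vector2 turn = OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick);

        transform.position += (transform.forward * move.y + transform.right * move.x)
                              * moveSpeed * Time.deltaTime;
        transform.Rotate(0f, turn.x * turnSpeed * Time.deltaTime, 0f);
    }
}
```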

I've fixed the crashing problem: I do NOT quit inside the headset; I hit the play button in Unity when I want to quit. And yep, I was able to duplicate the crash inside the headset (by adding Unity as a new panel and quitting there). Now onto hand tracking… OK, one article says it's available on Virtual Desktop (which I have), but there are no links in the Oculus app, so I'll assume not, and just hope it will auto-magically work in Unity even though it doesn't seem to want to work in the preceding app.

Back to the Oculus steps (now that I know why it crashed and why I was unable to interact with objects)… A notable difference is that the tutorial I'm currently working through grabbed the OVRPlayerController, which already had OVRCameraRig as a child. If this hybrid doesn't work, I'll do it the Oculus way. Nope, I'm stopping the hybrid now (ovr v2) and starting a new project (OVR – OculusPage – 0).

The Oculus page doesn't even start with the plugin/package that you need, nor a link on what to install to get it to work (I may be learning this, but I wish my teachers were credible); luckily, the steps to install it seem to be similar in both tracks. I'll do their step-by-step and will only interject with my discoveries if I get compilation errors. Luckily, this time around Oculus Integration has stayed in My Assets as an option. This is enough: once I get it working, I'll build my garage (later I'll make a page for each setup). Lol. The Oculus steps are missing too much. It says "Under Quest Features, from the Hand Tracking Support list, select Controllers and Hands", but that option doesn't exist unless I load up XR and follow other steps that were in the original tutorial. Good to know I can't trust this one. Back to the original file.

ovr-v2. I set up the prefabs correctly (using the steps from the tute as well as the Oculus page), but still the hands don't appear. I'm 100% sure that if they appeared in the Oculus menu (on the desktop), they'd appear in the app… OK, it still doesn't work, but some form of hand tracking does appear in the scenes in the OVR Integration folder (HandTest and HandTest_Custom). The hands are in the wrong orientation, though: they're flipped horizontally, so the fingers point towards me, palm up, when my hands are outwardly oriented, palm up. I suppose I could manually orient them, but I don't think that's a wise long-term solution. I could go through both scenes with a fine-tooth comb to find out where the differences are, but I might miss it and it'd take longer than I think. I'll keep looking for a solution.
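If I did resort to manually reorienting the hands, it would look something like this. A hack, not a fix; the 180° Y rotation is just my guess at the flip I'm seeing:

```csharp
using UnityEngine;

// Hack sketch: compensate for the horizontally flipped hand models by
// rotating each hand root 180 degrees around its local Y axis on load.
public class HandOrientationFix : MonoBehaviour
{
    void Start()
    {
        transform.localRotation *= Quaternion.Euler(0f, 180f, 0f);
    }
}
```

The real fix is presumably somewhere in the scene setup differences, which is why I'm not keeping this.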

Crap. There's a long one (https://github.com/handzlikchris/Unity.QuestRemoteHandTracking) – HandTracking_FromGithubTute. A key step here explained why the hand tracking option wasn't available in the Oculus page tute: the project wasn't set to Android (it was still on the default desktop platform). Why does switching platforms take so long? There are no game objects. Dang this GitHub tute: it doesn't say WHERE to add the hand prefab. I'm guessing as a child object. Also, they renamed a directory from "feeders" to "Customisations" and didn't bother updating the readme. Groan. I hate having to trust tutorials. I've sent them a long letter requesting an update of their readme…

I've spent the better part of the day trying to get hand tracking working, so I was overjoyed to stumble across your GitHub (referenced from: https://forums.oculusvr.com/t5/Oculus-Quest-2-and-Quest/Oculus-Link-does-not-support-hand-tracking/td-p/750569). Unfortunately, your Readme seems to be missing a step or two... normally I could try to work around them by guessing (and I've guessed a few things so far), but there's now no way forward without clarity, and I was hoping you could help :)

On the "Assign Scene Dependencies" step, it says "You'll need to add Hands game object reference to Feeder objects" from the directory "feeders". Unfortunately, the "feeders" directory doesn't exist in the Project window, and the files mentioned in this step don't really exist in the Project window either.

However, I think I got it: you were talking about the child of OVRHandsDataTransmission in the Hierarchy, not the Project (which was my assumption at the time, as the previous step seemed to imply the next steps would take place in the same area). Is that right?... But that opens up a new problem... when the readme says I'll "need to add Hands game object reference to Feeder objects", it seems an impossible task. For one thing, they're already assigned in the hierarchy, with all 3 showing the same error message ("The associated script can not be loaded", etc.). For another, if I were to remove the script links (refresh them, so to speak) from (what I think is) where the scripts are stored in the Customisations directory, I'd be unable to re-add them, as I get an error message ("script class cannot be found", etc.)... It's the same error message if I try to attach those scripts to any game object, whether I create them or they already existed...

If I were to "skip" this step to check the "If hands on screen are flickering" section by booting up my Quest 2, attaching a Link cable, and hitting play, Unity wouldn't enter play mode, as there's a red compilation error ("OVRPlugin.BoneCapsule doesn't contain a definition"), so I'm stuck :(

I'd really love to get your project working :) I'm using Unity version 2020.3.21f1, so I don't even know if it's compatible, but I'm hoping it will be.

Oh, and one more thing: I wasn't sure where to add the Hand prefab, so I guessed I should make the left one a child of the LeftHandAnchor object and the right one a child of RightHandAnchor...? I hope I got that right. That being said, it was very helpful that you mentioned I should ensure it was set to an Android build (if you can't find tracking); so many tutes on the internet skip this vital step (including Oculus: https://developer.oculus.com/documentation/unity/unity-handtracking/).

Thank you again :)

I guess I'll have to try something else 🙁 Luckily, one of my ex-classmates from study, Ben, got back to me regarding hand tracking. He suggested I make sure automatic switching between controllers and hands is enabled, as a recent update disabled it for him (not relevant to me, but I was thankful for the guess), and also swapping back to 2019 LTS, which seemed to work a lot better with Geoff's tutorial… I forgot which version he was using; I'll check his video tutorial… He was using 2019.3.14f1 at one point and then switched up to 2020.3.18f1. I'll try the 2019 one he originally used 🙂

While waiting, I'll try the video tutorial that's been in the background since this sleigh ride began (https://www.youtube.com/watch?v=u6Rlr2021vw), which unfortunately only works in 2019.4 and above (lol), so I'll do it in 2019.4.26 (YouTube-HandTrack-Tute). It says it's the old method, but as of 2020.2 there's a new version which allows builds to work on the Vive and Valve Index (which isn't going to help me; I just want a working platform atm), and it looks like that's the version I'm going to have to use to follow this tute: 2020.2.1f1. OK, it's downloading; both are downloading…

In other news, I saw my hands on the desktop when using Virtual Desktop, but was unable to "play" in the headset straight from Unity over VD. I could probably do it if I could be bothered, but I'm OK with just getting anything working, and right now that is the Link cable.

Going back to Geoff's video (project name: GeoffsTute-PostBenMsg), I'll be using the version he ended up using, 2020.3.18f1 (if it doesn't work, I'll try his earlier one later). This would be so much faster if I had the Oculus Integration files (and Unity engines) stored locally. Oh well. I switched the platform to Android before the import; doing it now should save many, many minutes, but only time will tell if I SHOULD have waited till after the import.