First of all, I'm completely new to all of this (since yesterday, to be precise).
I'm trying to build a hologram for the HoloLens. I installed everything: Visual Studio 2017 with the Unity and Universal Windows Platform workloads, and Unity 2018 with .NET support.
However, when I import the MRTK v2 package into Unity, I'm missing some important settings. I already uninstalled Unity and reinstalled it.
For example, I'm missing the "Apply Mixed Reality Project Settings" menu item.
Scripting Gyroscope Drift
Hello,
I am developing an AR application.
In this application I use head tracking on the camera, which lets me look around in a 360° environment.
As I have seen in other threads, some of you also have a "drift" problem. The pr...
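For context, a minimal gyroscope-driven camera script of the kind these threads usually start from looks like the sketch below. Drift shows up because the gyroscope's orientation is integrated from angular velocity, so small errors accumulate; the usual mitigations (fusing compass/accelerometer data, or periodic recentering) sit on top of this. This is a generic sketch, not anyone's specific project code.

```csharp
using UnityEngine;

// Attach to the camera. Drives its rotation from the device gyroscope,
// converting the sensor's right-handed attitude into Unity's left-handed space.
public class GyroCamera : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true; // the gyroscope is disabled by default
    }

    void Update()
    {
        Quaternion q = Input.gyro.attitude;
        // Flip handedness and re-orient so the sensor's "up" matches Unity's.
        transform.localRotation =
            Quaternion.Euler(90f, 0f, 0f) * new Quaternion(q.x, q.y, -q.z, -q.w);
    }
}
```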
Now, I want to put the two together, and I'm not sure where to start. I want the arm of the character model to move and be controlled by my GvrControllerPointer, just like the Daydream Elements swing arm.
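A rough starting point is to drive one arm joint directly from the controller's orientation, which is essentially what the Daydream Elements arm model refines with elbow/shoulder offsets. In this sketch, `armJoint` is a hypothetical reference to the character's arm bone, and `GvrControllerInput.Orientation` is the controller pose as I recall it being exposed by the GVR SDK — verify the exact property name against the SDK version you have imported.

```csharp
using UnityEngine;

// Sketch: rotate a character's arm bone with the Daydream controller.
public class ControllerArm : MonoBehaviour
{
    public Transform armJoint; // assign the arm bone in the Inspector (assumption)

    void Update()
    {
        // Apply the controller's rotation directly to the joint.
        // Daydream Elements' ArmModel layers an elbow offset on top of this.
        armJoint.localRotation = GvrControllerInput.Orientation;
    }
}
```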
I need to get the precise world positions of the left and right eye in Unity 5.1's native VR support (read-only). Is there an API call for this, given that the VR camera rig is now constructed and handled by Unity behind the scenes? It's easy if one uses the OVRCameraRig prefab from the Oculus Tools for Unity, but I prefer to use Unity's native VR camera for compatibility reasons.
I am aware of "InputTracking.GetLocalPosition(VRNode.RightEye)", but this gets the "position of node local to...
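One way to get from that local position to a world position: the node positions returned by `InputTracking` are local to the tracking space, which for Unity's native VR is the transform the camera is parented under. Transforming through that parent yields world space. `trackingSpace` below is an assumption standing in for whatever transform parents your VR camera.

```csharp
using UnityEngine;
using UnityEngine.VR;

// Sketch: convert the per-eye tracked-node positions (local to the tracking
// space) into world space via the VR camera's parent transform.
public class EyePositions : MonoBehaviour
{
    public Transform trackingSpace; // assumption: the VR camera's parent

    void Update()
    {
        Vector3 leftLocal  = InputTracking.GetLocalPosition(VRNode.LeftEye);
        Vector3 rightLocal = InputTracking.GetLocalPosition(VRNode.RightEye);

        Vector3 leftWorld  = trackingSpace.TransformPoint(leftLocal);
        Vector3 rightWorld = trackingSpace.TransformPoint(rightLocal);
        Debug.Log("Left: " + leftWorld + "  Right: " + rightWorld);
    }
}
```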
Mock HMD returns zero for left/right eye
I have two cameras in my scene (set up to render different content in each eye).
However, I can't get the eye offsets in my "OnPreRender" method, which is called per eye:
InputTracking.GetLocalPosition(XRNode.LeftEye) returns Vector3.zero.
I'm using a Mock HMD. Is this a bug in the Mock HMD, or am I missing something?
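One possible workaround, assuming the Mock HMD still fills in the stereo view matrices even when it does not populate the XR node positions: derive each eye's position from the camera's stereo view matrix instead. The translation column of the inverse view matrix is the eye's world position.

```csharp
using UnityEngine;

// Sketch: recover per-eye positions from the stereo view matrices rather
// than from InputTracking, as a fallback for the Mock HMD.
public class EyeOffsetFromMatrix : MonoBehaviour
{
    void OnPreRender()
    {
        Camera cam = Camera.current;
        Matrix4x4 leftView  = cam.GetStereoViewMatrix(Camera.StereoscopicEye.Left);
        Matrix4x4 rightView = cam.GetStereoViewMatrix(Camera.StereoscopicEye.Right);

        // Invert to map view space back to world space; column 3 holds the
        // eye's world-space position.
        Vector3 leftEye  = leftView.inverse.GetColumn(3);
        Vector3 rightEye = rightView.inverse.GetColumn(3);
        Debug.Log("Left eye: " + leftEye + "  Right eye: " + rightEye);
    }
}
```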
I'm building a project where I have 4 videos (360, 4K, H.265).
Each is lined up to play (using a skybox and a Video Player) in its own scene; a script tells it to play, and a coroutine counts the seconds before telling Unity to move on to the next scene. Pretty simple.
Within the Unity editor the project runs flawlessly. (I do have to swap the videos to H.264 versions to preview, as H.265 isn't yet supported there, then swap them back to H.265 to export for Android.) I use H.265 to try to...
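The per-scene flow described above can be sketched as the script below. `nextSceneName` is a placeholder; the real project presumably wires up its own scene names, and the scenes must be listed in Build Settings for `LoadScene` to find them.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;
using UnityEngine.Video;

// Sketch: play the scene's 360 skybox video, wait out its duration,
// then load the next scene.
public class PlayThenAdvance : MonoBehaviour
{
    public VideoPlayer videoPlayer;   // the scene's 360 video player
    public string nextSceneName;      // scene to load when the clip ends (assumption)

    IEnumerator Start()
    {
        videoPlayer.Play();
        // Count out the clip's full length before moving on.
        yield return new WaitForSeconds((float)videoPlayer.clip.length);
        SceneManager.LoadScene(nextSceneName);
    }
}
```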
Reference: Oculus Go
How do I create an OnClick event on a UI button to change the scene in VR? I would like to point the Oculus Go remote at an object (a button), click the primary trigger, and have the button perform an OnClick event that does something else (like change a scene). I can already do this on other platforms, just not Android. I have most of it down — just not the part that actually performs the press in VR world space. Thoughts? Help, I'm drowning!!!!! Thank you beforehand (I hope). Have a...
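A simple way to perform the press, assuming the Oculus Utilities for Unity are imported: raycast from the controller and invoke the Button's `onClick` when the trigger goes down. The button's GameObject needs a Collider for `Physics.Raycast` to hit it; the tidier alternative is a world-space Canvas driven by the OVR input module, but this sketch shows the direct route.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: point the Go controller at a UI Button (with a Collider) and
// fire its onClick when the primary trigger is pressed.
public class GoPointerClick : MonoBehaviour
{
    public Transform controller; // the tracked controller anchor (assumption)

    void Update()
    {
        if (!OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger))
            return;

        RaycastHit hit;
        if (Physics.Raycast(controller.position, controller.forward, out hit, 10f))
        {
            Button button = hit.collider.GetComponent<Button>();
            if (button != null)
                button.onClick.Invoke(); // fire the same event a mouse click would
        }
    }
}
```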
Low-latency Mixed Reality I'm trying to import live video into Unity through the use of a native plugin. Trouble is, we're seeing 7 frames of latency from start-to-finish, and trying to figure out where these are coming from. We know the capture front-end (everything before it gets to Unity) takes 2 frames, but after it gets into Unity, it's another 5 frames before the photons come out of the headset.
To do the live textures, we create 2 textures in Unity (stereo) and send the native texture pointers to the native...
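The external-texture setup described above typically looks like the sketch below. The plugin name and the `SetTexturePointers` / `GetRenderEventFunc` entry points are placeholders for whatever the actual native interface exposes; the Unity-side calls (`GetNativeTexturePtr`, `GL.IssuePluginEvent`) are the standard low-level native plugin mechanism. Issuing the plugin event runs the copy on the render thread, which matters when hunting frames of latency.

```csharp
using System;
using UnityEngine;

// Sketch: create two Unity textures (stereo), hand their native pointers to
// the plugin once, then pump the plugin's render-thread callback each frame.
public class LiveStereoTexture : MonoBehaviour
{
    [System.Runtime.InteropServices.DllImport("VideoPlugin")] // placeholder name
    static extern void SetTexturePointers(IntPtr left, IntPtr right);

    [System.Runtime.InteropServices.DllImport("VideoPlugin")] // placeholder name
    static extern IntPtr GetRenderEventFunc();

    Texture2D leftTex, rightTex;

    void Start()
    {
        leftTex  = new Texture2D(1920, 1080, TextureFormat.RGBA32, false);
        rightTex = new Texture2D(1920, 1080, TextureFormat.RGBA32, false);
        SetTexturePointers(leftTex.GetNativeTexturePtr(),
                           rightTex.GetNativeTexturePtr());
    }

    void Update()
    {
        // Executes the plugin callback on the render thread, so the frame
        // copy happens alongside rendering rather than a frame later.
        GL.IssuePluginEvent(GetRenderEventFunc(), 1);
    }
}
```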