Forums

  • Point cloud Ids A requested new feature is now in: point cloud IDs. They have been in ARKit since 1.5, but are finally exposed to C#. Added an example scene showing usage. Let us know how you get creative with this! Thanks.
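
    For reference, a minimal sketch of consuming point cloud IDs, written against a recent ARFoundation's ARPointCloud rather than the legacy plugin discussed in the thread (the manager reference and dictionary are assumptions). The useful property is that the same identifier refers to the same physical feature point across frames:

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Track individual feature points across frames by their stable identifiers.
    public class PointCloudIdTracker : MonoBehaviour
    {
        [SerializeField] ARPointCloudManager pointCloudManager;

        // Last known position of each feature point, keyed by identifier.
        readonly Dictionary<ulong, Vector3> knownPoints = new Dictionary<ulong, Vector3>();

        void Update()
        {
            foreach (var cloud in pointCloudManager.trackables)
            {
                if (!cloud.identifiers.HasValue || !cloud.positions.HasValue)
                    continue;

                var ids = cloud.identifiers.Value;
                var positions = cloud.positions.Value;

                for (int i = 0; i < ids.Length; i++)
                    knownPoints[ids[i]] = positions[i]; // same id = same physical point
            }
        }
    }
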
  • Only ‘part’ of app using ARCore? Can you have just part of your app use ARCore and still have it installable on devices that don't support ARCore? I don't want to have two releases to solve this issue. I want the ARCore 'level' to be part of the app while the other parts don't require ARCore.
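
    On the installability side, ARCore supports "AR Optional" apps: marking ARCore as optional rather than required in the project/manifest settings keeps the app installable on unsupported devices. A minimal runtime-gating sketch, assuming ARFoundation (the GameObject references are placeholders):

    using System.Collections;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Enable the AR "level" only when the device actually supports ARCore.
    public class ArAvailabilityGate : MonoBehaviour
    {
        [SerializeField] GameObject arLevelRoot;      // assumed root of the AR-only content
        [SerializeField] GameObject fallbackMenuRoot; // assumed non-AR fallback UI

        IEnumerator Start()
        {
            // Ask the device whether AR is supported (and installed).
            yield return ARSession.CheckAvailability();

            if (ARSession.state == ARSessionState.NeedsInstall)
                yield return ARSession.Install();

            bool arSupported = ARSession.state == ARSessionState.Ready ||
                               ARSession.state == ARSessionState.SessionInitializing ||
                               ARSession.state == ARSessionState.SessionTracking;

            arLevelRoot.SetActive(arSupported);
            fallbackMenuRoot.SetActive(!arSupported);
        }
    }
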
  • Eye gaze tracking not working I'm trying to get eye gaze tracking to work in the "UnityTongueAndEyes" sample scene, but the eye tracking is not working.

    ARKit plugin, commit d381878 on 11/27/2019
    Mac 10.13.6
    Unity 2018.2.13f1
    iPhone X iOS 12.1
    ARKitSettings: "ARKit Uses Facetracking" is checked; "App Requires ARKit" is checked.

    I built the UnityARKitRemote app and installed it on the device. Then, from the editor, I run the "UnityTongueAndEyes" test scene and connect to the device through ARKitRemote; the video feed...

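    For comparison, this is how eye gaze is read in ARFoundation (not the legacy ARKit plugin used in this thread); the sketch assumes an ARFaceManager in the scene. The per-eye transforms are null when the provider doesn't supply eye data, which is worth checking before concluding that tracking is broken:

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Log eye poses and the fixation point for every tracked face.
    public class EyeGazeLogger : MonoBehaviour
    {
        [SerializeField] ARFaceManager faceManager;

        void Update()
        {
            foreach (var face in faceManager.trackables)
            {
                // Null when the device/provider doesn't support eye tracking.
                if (face.leftEye == null || face.rightEye == null || face.fixationPoint == null)
                    continue;

                Debug.Log($"Left eye: {face.leftEye.position}, right eye: {face.rightEye.position}, " +
                          $"fixation point: {face.fixationPoint.position}");
            }
        }
    }
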
  • Autodesk AR/VR toolkit Forge I am a beginner with Unity. I want to learn Unity because it has a collaborative relationship with Autodesk. I want to follow the tutorials on the following site: http://forgetoolkit.com/#/helloworld
    The tutorial under Getting Started goes well: I can download the models and open them in Unity with the toolkit.
    When I try the OculusGo example under Guides, downloading the assets from the Asset Store goes wrong. Once I have downloaded and imported them, I have to...

  • Unity 360 Video Player Library – Implementation / Solution Hello everyone!

    I wanted to share with everyone a simple wrapper library I wrote that makes managing and playing 360 video content super easy and painless, especially if those video assets are hosted externally.

    Description
    Please note that this library has not been unit tested and is therefore not production ready.
    With that said, I'd greatly appreciate any PRs or feedback on improving this library.

    You may find a very *light* overview of the library's...

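    For context, a minimal sketch of the underlying approach such a wrapper typically builds on: stream an externally hosted equirectangular video with Unity's VideoPlayer into a RenderTexture and show it on a panoramic skybox. The RenderTexture, skybox material, and URL here are assumptions, not part of the library above:

    using UnityEngine;
    using UnityEngine.Video;

    // Play an external 360 video onto the scene's skybox.
    public class Simple360Player : MonoBehaviour
    {
        [SerializeField] string videoUrl;             // externally hosted equirectangular video
        [SerializeField] RenderTexture targetTexture; // e.g. 4096x2048
        [SerializeField] Material skyboxMaterial;     // uses the "Skybox/Panoramic" shader

        void Start()
        {
            var player = gameObject.AddComponent<VideoPlayer>();
            player.source = VideoSource.Url;
            player.url = videoUrl;
            player.renderMode = VideoRenderMode.RenderTexture;
            player.targetTexture = targetTexture;
            player.isLooping = true;

            // Route decoded frames into the skybox so the whole scene is wrapped in video.
            skyboxMaterial.SetTexture("_MainTex", targetTexture);
            RenderSettings.skybox = skyboxMaterial;

            player.prepareCompleted += _ => player.Play();
            player.Prepare();
        }
    }
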
  • ARCore no longer finding surfaces (it did, but now doesn’t ?!?) Hi, has anyone noticed ARCore not being able to find surfaces? We’ve been dabbling with ARCore for a few months and it’s been working well: it found surfaces pretty quickly and got a stable lock on them. However, in the last few weeks it’s turned to ...
  • Application crash when trying to export an anchor with the WorldAnchorTransferBatch Hello,

    I programmed a HoloLens application with persistent anchors. Now I'm trying to share these anchors with another HoloLens, so I created a connection comparable to the SharingWithUNET example in the HoloToolkit. Unfortunately, I get an error when using the WorldAnchorTransferBatch.ExportAsync function.

    Looking for bf881d22-e02d-4c07-a1ff-f5ffdbeda445
    found a good mesh mostVerts = 751 processed 6 meshes in 0.3814697 ms
    preparing...


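    For reference, a minimal sketch of the export side of WorldAnchorTransferBatch (everything other than the Unity API names is an assumption). A commonly suggested precaution in similar threads is to confirm WorldAnchor.isLocated before exporting:

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.WSA;
    using UnityEngine.XR.WSA.Sharing;

    // Serialize one anchor into bytes that can be sent over the network connection.
    public class AnchorExporter : MonoBehaviour
    {
        readonly List<byte> exportedData = new List<byte>();

        public void ExportAnchor(string anchorId, WorldAnchor anchor)
        {
            var batch = new WorldAnchorTransferBatch();
            batch.AddWorldAnchor(anchorId, anchor);

            // ExportAsync streams the serialized anchor in chunks, then reports completion.
            WorldAnchorTransferBatch.ExportAsync(batch, OnExportDataAvailable, OnExportComplete);
        }

        void OnExportDataAvailable(byte[] data)
        {
            exportedData.AddRange(data); // accumulate chunks; send once export succeeds
        }

        void OnExportComplete(SerializationCompletionReason reason)
        {
            if (reason == SerializationCompletionReason.Succeeded)
                Debug.Log($"Anchor exported, {exportedData.Count} bytes ready to send.");
            else
                Debug.LogError($"Anchor export failed: {reason}");
        }
    }
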
  • Best tutorial for finding a plane? I have gotten the HelloAR sample scene working. In my app I'm trying to get an object to interact with a plane that ARCore finds.

    Basically I'm looking to just find a plane that I can set up a collider on. I've been looking through tutorials on Google and haven't found a good simple one. Does anyone have a good tutorial/link for this?

    Thanks in advance for any help with this issue.
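
    For reference, a minimal sketch of giving detected planes colliders, assuming ARFoundation rather than the HelloAR-era ARCore SDK, and assuming the plane prefab has a MeshFilter populated by ARPlaneMeshVisualizer (all names besides the Unity/ARFoundation API are placeholders):

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Keep a MeshCollider on every detected plane in sync with its boundary mesh,
    // so rigidbodies can rest on real-world surfaces.
    [RequireComponent(typeof(ARPlaneManager))]
    public class PlaneColliderAdder : MonoBehaviour
    {
        ARPlaneManager planeManager;

        void Awake() { planeManager = GetComponent<ARPlaneManager>(); }
        void OnEnable() { planeManager.planesChanged += OnPlanesChanged; }
        void OnDisable() { planeManager.planesChanged -= OnPlanesChanged; }

        void OnPlanesChanged(ARPlanesChangedEventArgs args)
        {
            foreach (var plane in args.added)
                RefreshCollider(plane);
            foreach (var plane in args.updated)
                RefreshCollider(plane); // boundary may have changed
        }

        static void RefreshCollider(ARPlane plane)
        {
            var meshFilter = plane.GetComponent<MeshFilter>();
            if (meshFilter == null || meshFilter.sharedMesh == null)
                return;

            var collider = plane.GetComponent<MeshCollider>();
            if (collider == null)
                collider = plane.gameObject.AddComponent<MeshCollider>();

            collider.sharedMesh = meshFilter.sharedMesh;
        }
    }
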
  • How to implement rendering logic for stereoscopic 360 video Hello,

    I am trying to implement my own 360 stereo renderer and I'm not going to use any existing solution for this task; the reasons are specific to my scenario. I understand that the 360 stereo image is usually produced by rendering two cubemaps (one per eye) and converting them to equirectangular projection - my question generally concerns the first part.
    Currently I know about three implementations of cubemap rendering logic:
    • ...

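    As a reference for the first part, Unity already exposes per-eye cubemap capture and cubemap-to-equirectangular conversion; a minimal sketch (resolutions, field names and the per-frame capture cadence are assumptions):

    using UnityEngine;
    using UnityEngine.Rendering;

    // Render one cubemap per eye, then repack both into a top/bottom stereo
    // equirectangular RenderTexture.
    public class StereoEquirectCapture : MonoBehaviour
    {
        [SerializeField] Camera captureCamera;
        [SerializeField] int cubemapSize = 1024;

        RenderTexture leftCube, rightCube, equirect;

        void Start()
        {
            leftCube  = new RenderTexture(cubemapSize, cubemapSize, 24) { dimension = TextureDimension.Cube };
            rightCube = new RenderTexture(cubemapSize, cubemapSize, 24) { dimension = TextureDimension.Cube };

            // Stereo equirect output is packed top/bottom, hence the doubled height.
            equirect = new RenderTexture(4096, 4096, 0);

            captureCamera.stereoSeparation = 0.064f; // assumed inter-eye distance in metres
        }

        void LateUpdate()
        {
            // 63 = render all six cube faces; the eye argument applies the stereo offset.
            captureCamera.RenderToCubemap(leftCube, 63, Camera.MonoOrStereoscopicEye.Left);
            captureCamera.RenderToCubemap(rightCube, 63, Camera.MonoOrStereoscopicEye.Right);

            // Reproject each cubemap into its half of the equirectangular target.
            leftCube.ConvertToEquirect(equirect, Camera.MonoOrStereoscopicEye.Left);
            rightCube.ConvertToEquirect(equirect, Camera.MonoOrStereoscopicEye.Right);
        }
    }
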
  • Steam VR Plugin: Reset position and orientation Hello,

    I'm currently working with the SteamVR Plugin for Unity, and I'm converting my whole project for the fantastic HTC Vive Pre and onward.

    Unfortunately, I haven't figured out the equivalent of:
    " UnityEngine.VR.InputTracking.Recenter(); "

    I tried SteamVR.instance.hmd.ResetSeatedZeroPose(); without any success...
    It seems from the web that I'm not alone, but I haven't found any answer.

    Can anyone help?
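
    One commonly used workaround, sketched below under the assumption of a standard [CameraRig]/head hierarchy (the field names are placeholders): instead of recentering the tracking origin, move and rotate the rig's parent transform so the HMD ends up at a chosen position and yaw.

    using UnityEngine;

    // Recenter by moving the rig so the tracked head lands on a target pose.
    public class RigRecenter : MonoBehaviour
    {
        [SerializeField] Transform cameraRig; // parent of the tracked camera
        [SerializeField] Transform head;      // the tracked HMD camera
        [SerializeField] Transform target;    // where the player should stand and face

        public void Recenter()
        {
            // Rotate the rig about the head so the head's horizontal forward matches the target's.
            Vector3 headForward = Vector3.ProjectOnPlane(head.forward, Vector3.up);
            Vector3 targetForward = Vector3.ProjectOnPlane(target.forward, Vector3.up);
            float yawOffset = Vector3.SignedAngle(headForward, targetForward, Vector3.up);
            cameraRig.RotateAround(head.position, Vector3.up, yawOffset);

            // Then translate the rig so the head sits on the target position (keeping floor height).
            Vector3 offset = target.position - head.position;
            offset.y = 0f;
            cameraRig.position += offset;
        }
    }
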
  • HMD loses 6DOF in gaming Hello: I use an Acer AH100, Windows Mixed Reality and SteamVR for my project. I set the position setting in the WMR portal (Standing Only). The HMD has 6DOF at the beginning, but it loses 6DOF after a period of time (dropping to just 3DOF). I want to know ha...
  • Proper way to rotate VR camera. I'm having difficulty figuring out the proper way to rotate a VR camera.

    Currently the position and rotation of the headset can't be set directly; they are driven by your hardware's tracking. The current well-known workaround for creating movement or rotation for the player is to create a parent object and move or rotate that parent object instead of the camera. The problem with this is that when rotating the parent, the rotation is based on the pivot of the parent object and not of the...

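    A common fix for the pivot problem is to rotate the parent around the headset's current position rather than around the parent's own origin; a minimal sketch (the rig/head references and the input axis are assumptions):

    using UnityEngine;

    // Smooth-turn the rig about the head so the player rotates in place.
    public class RigRotator : MonoBehaviour
    {
        [SerializeField] Transform rig;  // parent object of the VR camera
        [SerializeField] Transform head; // the tracked VR camera itself
        [SerializeField] float degreesPerSecond = 45f;

        void Update()
        {
            float input = Input.GetAxis("Horizontal"); // assumed turn input
            if (Mathf.Approximately(input, 0f))
                return;

            // Pivoting around the head avoids the orbiting effect that appears when
            // the player isn't standing exactly at the rig's pivot.
            rig.RotateAround(head.position, Vector3.up, input * degreesPerSecond * Time.deltaTime);
        }
    }
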
  • Getting Depth Textures to work in VR (Single-pass & Multi-pass)
    NOTE: I have also posted this question in the Shaders forum here but have had no responses and I'm not sure which forum is best suited to getting feedback, hence this reposting.

    I'm in the (increasingly drawn out) process of adding VR support to my asset Fantastic Glass.

    I...

  • unity_StereoEyeIndex with GLSL (Single Pass implementation details) I'm trying to play a 3D video on a texture in Unity on Android. Because the video has a high resolution and framerate, I need to use OES textures for the video, which forces me to use a GLSL shader (as OES texture sampling is not currently supported in CG).

    My question is, how do I get the current eye being rendered when using GLSL? With CG this can easily be accessed with unity_StereoEyeIndex, but I don't seem to be able to access that variable when the shader is written directly in GLSL.
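
    One partial workaround (multi-pass only, so it does not answer the single-pass case): when each eye is rendered in its own pass, the eye index can be pushed to a GLSL shader as an ordinary uniform from C#. The material reference and the "_EyeIndex" property name below are assumptions:

    using UnityEngine;

    // Per-eye uniform for multi-pass stereo; not usable under single-pass, where
    // both eyes are drawn in the same pass.
    [RequireComponent(typeof(Camera))]
    public class EyeIndexUniform : MonoBehaviour
    {
        [SerializeField] Material videoMaterial;
        Camera cam;

        void Awake() { cam = GetComponent<Camera>(); }

        void OnPreRender()
        {
            // stereoActiveEye reports which eye this camera is currently rendering.
            float eye = (cam.stereoActiveEye == Camera.MonoOrStereoscopicEye.Right) ? 1f : 0f;
            videoMaterial.SetFloat("_EyeIndex", eye);
        }
    }
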
  • I want to make an app I want to know how a 3D model can fetch the colour and shape of 2D images from outside. My project here is online shopping with augmented reality, so I took Vuforia with Unity3D. But my problem is, when we click images I want to display a 3D model dre...