I'm doing some research with Unity and ARCore. Here's what I'm trying to achieve:
1. the first player is in a room, starts the app, and navigates around to get ARCore to recognize the space
2. the first player then creates a "game session" and places a cloud anchor in the room's space
3. the second player is in the same physical room as the first player, starts the app (on their own device), and moves around so it also recognizes the space in the room
4. the second player then connects...
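For context, the hosting and resolving halves of this flow (steps 2 and 4) could be sketched with the cross-platform Cloud Anchors API from the ARCore SDK for Unity. This is a minimal sketch, assuming the cloud anchor id gets shared between players by some networking layer of your own:

```csharp
using GoogleARCore;
using GoogleARCore.CrossPlatform;
using UnityEngine;

public class CloudAnchorSession : MonoBehaviour
{
    // Step 2: player one hosts a cloud anchor at a local anchor's pose.
    public void HostAnchor(Anchor localAnchor)
    {
        XPSession.CreateCloudAnchor(localAnchor).ThenAction(result =>
        {
            if (result.Response == CloudServiceResponse.Success)
            {
                // Share result.Anchor.CloudId with player two (e.g. via your
                // game-session backend -- the networking layer is up to you).
                Debug.Log("Hosted cloud anchor: " + result.Anchor.CloudId);
            }
        });
    }

    // Step 4: player two resolves the same anchor from the shared id.
    public void ResolveAnchor(string cloudId)
    {
        XPSession.ResolveCloudAnchor(cloudId).ThenAction(result =>
        {
            if (result.Response == CloudServiceResponse.Success)
            {
                // Both devices now share a common reference frame.
                Debug.Log("Resolved anchor at " + result.Anchor.transform.position);
            }
        });
    }
}
```

Once both devices hold the same anchor, game objects parented to it line up in the shared physical space.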
I'm having trouble connecting two HoloLens devices to one network, even after following the Microsoft tutorial/course step by step.
I have reached and completed the third section, 'Chapter 3 - Shared Coordinates', but with no successful results. Here's the link. I am using the 'HolographicAcademy-Holograms-240-SharedHolograms' package, which I downloaded and successfully imported into Unity.
HoloLens unable to send or receive data via BT and TCP
Greetings,
I am working on HoloLens (Unity-UWP) and trying to make a connection with a PC (UWP) or an Android phone (Xamarin) work. So far I have tried client and host with both Bluetooth and TCP (even two versions with different libraries) on Android and UWP. I...
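For what it's worth, the plain-TCP side on UWP can be reduced to a small `StreamSocket` sketch; the host IP and port below are placeholders. One common gotcha: without the InternetClientServer and PrivateNetworkClientServer capabilities declared in Package.appxmanifest, connection attempts on HoloLens fail with no obvious error.

```csharp
#if WINDOWS_UWP
using System.IO;
using Windows.Networking;
using Windows.Networking.Sockets;

public static class TcpClientSketch
{
    // Connects to a listener and sends one line of text.
    public static async void Connect(string hostIp, string port)
    {
        var socket = new StreamSocket();
        await socket.ConnectAsync(new HostName(hostIp), port);

        using (var writer = new StreamWriter(socket.OutputStream.AsStreamForWrite()))
        {
            await writer.WriteLineAsync("hello from HoloLens");
            await writer.FlushAsync();
        }
    }
}
#endif
```

The `#if WINDOWS_UWP` guard keeps the UWP-only namespaces out of the Unity editor compile.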
This example extends the functionality of the HelloAR scene from the official Google ARCore SDK for Unity samples. The demo loads collections of 3D models on startup. When the user selects a 3D model, it is downloaded on demand from the ARCapt CDN and loaded on the user's phone. No prebuilt 3D models! Everything is dynamically loaded. The models are...
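A minimal sketch of the on-demand loading step, assuming the CDN serves Unity AssetBundles (`bundleUrl` and `assetName` are placeholders for whatever the backend actually provides):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class ModelLoader : MonoBehaviour
{
    // Downloads an AssetBundle on demand and instantiates the model in it.
    public IEnumerator LoadModel(string bundleUrl, string assetName, Transform anchor)
    {
        using (var request = UnityWebRequestAssetBundle.GetAssetBundle(bundleUrl))
        {
            yield return request.SendWebRequest();

            if (request.isNetworkError || request.isHttpError)
            {
                Debug.LogError("Download failed: " + request.error);
                yield break;
            }

            AssetBundle bundle = DownloadHandlerAssetBundle.GetContent(request);
            GameObject prefab = bundle.LoadAsset<GameObject>(assetName);
            Instantiate(prefab, anchor.position, anchor.rotation);
            bundle.Unload(false); // keep the instantiated model alive
        }
    }
}
```

In the HelloAR context, `anchor` would be the `Anchor` transform created at the tapped trackable's pose.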
HoloLens BLE communication
Hi,
Is there a way for a Unity3D HoloLens app to communicate with and get position data from an Android phone sending out position values over BLE?
Kindly help me out on this. I'm very new to BLE and plugin development on Windows.
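One possible route (a sketch, not a tested plugin) is the UWP `BluetoothLEAdvertisementWatcher` API, compiled only in the UWP player. The manufacturer-data payload layout is an assumption here, and the app needs the Bluetooth capability in its manifest:

```csharp
#if WINDOWS_UWP
using Windows.Devices.Bluetooth.Advertisement;
using Windows.Storage.Streams;

public class BleListener
{
    private BluetoothLEAdvertisementWatcher watcher;

    public void Start()
    {
        watcher = new BluetoothLEAdvertisementWatcher();
        watcher.Received += (sender, args) =>
        {
            // Assumes the phone packs its position values into a
            // manufacturer-data section of the advertisement.
            foreach (var section in args.Advertisement.ManufacturerData)
            {
                var reader = DataReader.FromBuffer(section.Data);
                byte[] payload = new byte[section.Data.Length];
                reader.ReadBytes(payload);
                // Parse your position values out of the payload here.
            }
        };
        watcher.Start();
    }

    public void Stop() { watcher.Stop(); }
}
#endif
```

Advertisement listening avoids pairing entirely; if the phone instead exposes a GATT service, the connection-oriented `BluetoothLEDevice` APIs would be needed instead.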
Windows Store rejection due to unused HoloLens API. Using the native Mixed Reality package from Unity, my app was rejected because somewhere in the app a HoloLens API was declared but not utilized, and declaring an API without using it is a problem according to their rules: your app gets rejected as a potential security problem.
There were some instructions on how to fix it; I had to encapsulate something, sorry, I can't recall. But I am not a programmer, and my app didn't need a programmer except for that, so I...
The best example of this would be the vignette effect I'm using during moments of sudden/fast motion (similar to Google Earth VR's comfort mode). In the past, it looked like a "circle" of fully visible pixels in the center of my view (as vignettes typically do), but after upgrading to Unity 2018 it...
I am seeking some developers for a Unity VR project. If this is not the correct forum for this, please excuse my ignorance and point me to the appropriate forum if I have made a mistake. If this is correct, read on.
We are seeking a developer to integrate a non-player observer character into an existing Unity VR game. The player's camera needs to render a full 180-degree equirectangular view. We have a very tight timeframe to get to a demo. This does not need to be production code....
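As a starting point, Unity 2018.1+ ships a built-in 360-capture API (`Camera.RenderToCubemap` plus `RenderTexture.ConvertToEquirect`) that could be adapted; this sketch renders a full equirect from which the front 180 degrees can be cropped. Texture sizes are placeholder values:

```csharp
using UnityEngine;

public class EquirectObserver : MonoBehaviour
{
    public Camera observerCam;
    private RenderTexture cubemap, equirect;

    void Start()
    {
        cubemap = new RenderTexture(2048, 2048, 24);
        cubemap.dimension = UnityEngine.Rendering.TextureDimension.Cube;
        equirect = new RenderTexture(4096, 2048, 24);
    }

    void LateUpdate()
    {
        // 63 = bitmask for all six cubemap faces.
        observerCam.RenderToCubemap(cubemap, 63, Camera.MonoOrStereoscopicEye.Mono);
        cubemap.ConvertToEquirect(equirect, Camera.MonoOrStereoscopicEye.Mono);
        // For a 180-degree view, sample/crop the front half of `equirect`.
    }
}
```

Rendering all six faces every frame is expensive; for a demo-quality build it may be acceptable, but a custom multi-camera rig would be the optimized alternative.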
Lightweight XR Rig – Haptic feedback? I've got a VR project running in 2018.2.6, using the same XR rig (with the TrackedPoseDrivers) that gets created when you make a new LW VR project.
I'm trying to enable/fire haptic feedback events to the controllers. I'm unable to use the SteamVR plugin, because it seems to be fully unusable in LW. (Every time I try using it, the entire editor crashes after rendering a single frame. This happens even on an empty project.)
I also noticed there's an OpenVR package available in the package...
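For reference, from Unity 2019.1 onward the built-in XR input API exposes haptics without SteamVR (this API does not exist yet in 2018.2), so upgrading is one way out. A minimal sketch:

```csharp
using UnityEngine.XR;

public static class Haptics
{
    // Sends a single haptic impulse to the controller at the given node,
    // after checking the device actually supports impulses.
    public static void Pulse(XRNode hand, float amplitude, float duration)
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(hand);
        HapticCapabilities caps;
        if (device.TryGetHapticCapabilities(out caps) && caps.supportsImpulse)
        {
            device.SendHapticImpulse(0u, amplitude, duration);
        }
    }
}
```

Usage would be something like `Haptics.Pulse(XRNode.RightHand, 0.5f, 0.1f)` on a collision or UI event.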
GestureRecognizer.OnHoldStarted Delay When using Unity's built-in gesture recognizer (UnityEngine.XR.WSA.Input.GestureRecognizer) on the HoloLens, we're encountering a strange delay in the triggering of the gesture depending on what is being grabbed.
When pinching while looking at a simple-ish UI panel, our gesture is captured and the drag method is triggered almost instantaneously. When pinching over a mesh, the grab gesture must be held for nearly 3 seconds before the system recognizes it.
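For reference, this is roughly how the recognizer in question gets wired up (a minimal sketch; logging a timestamp in the handler makes the delay measurable):

```csharp
using UnityEngine;
using UnityEngine.XR.WSA.Input;

public class HoldGestureHandler : MonoBehaviour
{
    private GestureRecognizer recognizer;

    void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Hold |
                                           GestureSettings.ManipulationTranslate);
        recognizer.HoldStarted += args =>
            Debug.Log("Hold started at " + Time.time); // timestamp the trigger
        recognizer.StartCapturingGestures();
    }

    void OnDestroy()
    {
        recognizer.StopCapturingGestures();
        recognizer.Dispose();
    }
}
```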
The problem I'm facing now is that I've no idea how to add physics to it, so the player falls to the floor, or how to add touchpad locomotion to it. I've searched all over the internet and there's a serious shortage of information on how to do this. Any help is appreciated!
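A minimal sketch of one common approach, assuming Unity 2019.1+'s XR input API: a `CharacterController` on the rig root supplies gravity and collision, and the touchpad/thumbstick 2D axis drives head-relative movement (the speed value is an arbitrary assumption to tune):

```csharp
using UnityEngine;
using UnityEngine.XR;

[RequireComponent(typeof(CharacterController))]
public class RigLocomotion : MonoBehaviour
{
    public Transform head;   // the camera, used for movement direction
    public float speed = 2f;

    private CharacterController controller;

    void Start() { controller = GetComponent<CharacterController>(); }

    void Update()
    {
        Vector2 axis;
        InputDevice left = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
        left.TryGetFeatureValue(CommonUsages.primary2DAxis, out axis);

        // Move relative to where the player is looking, flattened to the ground.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right = Vector3.Cross(Vector3.up, forward);
        Vector3 move = (forward * axis.y + right * axis.x) * speed;

        controller.SimpleMove(move); // SimpleMove applies gravity each frame
    }
}
```

`SimpleMove` handles gravity itself, which is why no manual vertical velocity is tracked; smooth vs. snap turning and comfort options would layer on top of this.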
accessing ARCore and phone camera settings
Hey guys, is there any way to adjust camera settings like exposure within Unity? I don't want the phone camera to adjust light/brightness by itself; it's really bad for my app and I'd like to have control over it myself. I know we can do it outside of Unity, but I can't write plugins. If there is no way, then how can I request it from the Unity team?
PS: also, focusing on the area that I tapped is another useful...
I am making an AR application with HoloLens, and I have created a thin rectangular GameObject (a cube) to which I need to assign a material that is transparent and that also conceals what is below it.
I project things on a wall, and this wall has shelves. If I look at the shelf from above, I should not see what is below it, but if I look at it from the front, it should be visible. So where the shelf theoretically sits, I will place this material.
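One common approach for this (a sketch of the well-known "depth mask" technique, not the only option) is a shader that draws no color at all but still writes to the depth buffer early in the render queue, so anything behind the masked surface fails the depth test and is hidden:

```shaderlab
Shader "Custom/DepthMask"
{
    SubShader
    {
        // Render before normal geometry (Geometry = 2000) so that anything
        // behind this surface fails the depth test and is culled.
        Tags { "Queue" = "Geometry-10" }
        ColorMask 0   // draw no color -- the material itself stays invisible
        ZWrite On     // but still write depth, occluding what is behind it
        Pass { }
    }
}
```

Assign a material using this shader to the shelf cube; the objects it should hide must render in the normal Geometry queue or later for the occlusion to take effect.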