It is available on iOS (Unity, or SceneKit with Swift/Objective-C). If you're comfortable building an ARKit app, then you'll be OK working with our platform. If you've ever built a HoloLens or Tango app, it's exactly the same.
Pimax issued a statement on their forums stating that their 8K headset will not be shipped to all backers this month as previously claimed. The statement was also sent out via email.
The company explained that the 4K LCD panels used in the Pimax 8K have been rejected at a much higher rate since they enacted stricter QA testing.
So when will the remaining backers of the 8K model get their headsets? Pimax made clear that it would not be before the Chinese Spring Festival, which runs until February 10th. This means it will be weeks, and possibly months, before all remaining units are produced and shipped.
The 8K headset isn't the company's only offering, however. There is also the 'Pimax 5K+'. Despite its lower-resolution (1440p) panels, many actually prefer the visual quality of the 5K+. This is another reminder that resolution is only one spec of a panel. While marketing departments everywhere would have you believe otherwise, a lower-resolution panel can look better if its other specs are sufficiently superior.
The 5K+ headset isn't free of QA issues either, however. Pimax now has a replacement program for a "black dots" issue, a flaw in some 5K+ units where, as the name suggests, black dots appear across the panel.
It's now 16 months since the Pimax Kickstarter launched. While the company has finally shipped almost all of its 5K+ model, backers of the original 8K are still waiting. When we spoke to Pimax at CES, the company's new Head of US Operations described the many ways in which they plan to overcome their production, shipping and support issues. Let's hope this is the year they finally deliver.
Hello, I currently have a WinMR game live in the MS store. On two other platforms, I added the option for users to play it either in 2D or with their VR headsets. I achieved this on one platform by packaging two executables, and on the other with a single executable that lets the user select VR or 2D at runtime. I would like to do the same for UWP. I was wondering if it is possible, and if so, how to do it. For the record, I'm using Unity 2017 to create my game.
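For the single-executable approach, Unity 2017 does expose a runtime switch via `XRSettings` (called `VRSettings` in versions before 2017.2). A minimal sketch, assuming "None" is included alongside "WindowsMR" in the project's Virtual Reality SDKs list (the class and device names here are real Unity 2017 APIs, but the surrounding script is illustrative):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR; // UnityEngine.VR before Unity 2017.2

public class VRModeSwitcher : MonoBehaviour
{
    // Call with "WindowsMR" to enter VR, or "None" to return to flat 2D.
    public IEnumerator SwitchMode(string deviceName)
    {
        if (XRSettings.loadedDeviceName != deviceName)
        {
            XRSettings.LoadDeviceByName(deviceName);
            // The device isn't actually loaded until the next frame.
            yield return null;
        }
        XRSettings.enabled = (deviceName != "None");
    }
}
```

You would start the coroutine from a launcher menu, e.g. `StartCoroutine(SwitchMode("WindowsMR"))`. Whether this passes UWP/Store certification for a WinMR title is a separate question worth checking.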
If you drag a teleport point from the SteamVR Interaction prefabs into a group of objects in your scene, and then show that group at runtime with SetActive, the teleport point will always be visible, unlike the default behavior where points only appear while you hold down the thumbpad.
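This appears to happen because the Teleport system gathers its markers once at startup, so a point that was inactive at that moment is never driven by the thumbpad show/hide logic. One possible workaround, sketched below, is to leave the TeleportPoint's own GameObject active from scene start and toggle only its marker flag instead of the parent group (`markerActive` is a field on SteamVR's `TeleportMarkerBase`; the wrapper script and method name here are my own):

```csharp
using UnityEngine;
using Valve.VR.InteractionSystem; // SteamVR Interaction System

public class TeleportPointToggler : MonoBehaviour
{
    public TeleportPoint point; // assign in the Inspector

    // Hypothetical helper: hide/show the point without deactivating
    // its GameObject, so the Teleport system keeps tracking it.
    public void SetPointUsable(bool usable)
    {
        point.markerActive = usable;
    }
}
```

This is untested against every SteamVR plugin version, so treat it as a starting point rather than a confirmed fix.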
The added multiplayer functionality in a new Early Access build gives players the ability to share music-making sessions with others around the world. One player hosts a room and EXA keeps layouts synced for the various instruments as well as “items, playback states, metronome, and live ringer events.”
“The room can be made available publicly, can be hidden until a player enters the room name, or can be limited to your local network (LAN). The room creator can even put players into a ‘spectator’ mode by disabling some of their room permissions,” developer Zach Kinstner wrote in an update explaining the change.
Calling All Bands
A video further explains the syncing functionality and how it works better under lower-latency conditions. Players can talk to one another and record loops in any network condition, arranging instruments, adding sounds and building up compositions together. Loops, however, transfer to other players upon completion. That process could take several seconds for detailed loops with lots of data to transfer, according to Kinstner. In extremely low-latency sessions, like over a local area network, musicians can add live sounds on top of the loops via their shared instruments, just like a real-life band.
“When latency is low, each player’s ringer events can transfer fast enough for other players to hear the full ringer sounds at the correct time,” Kinstner explains. “In these conditions, you could conceivably play a live performance in EXA, with everyone playing their virtual instruments at the same time, rather than sharing loops. With higher latency levels, you won’t hear the full sound from a ringer event. For example, if an event reaches you 80 milliseconds late, you’ll miss the first 80 milliseconds of that ringer’s sound. As latency increases, it becomes more difficult for live performers to stay in sync with each other, and players should collaborate using recorded loops instead.”
EXA lists support for Oculus Rift, HTC Vive and Windows Mixed Reality headsets.