Submission + - Hands-on with Fove's First Eye-tracking HMD Prototype (roadtovr.com)

muterobert writes: Eye-tracking is the future of VR head-mounted displays. Ben Lang tries out Fove's first publicly demonstrated prototype, which detects where you're looking and uses that information as input for games and applications.

"Kojima walked me through a few different experiences demonstrating the eye tracking capabilities of the Fove HMD. The first had me in a dark city street with some futuristic-looking super-soldiers lined up before me. Looking at them caused me to shoot them and one after another they dropped to the ground after being blasted by my eyes."

Submission + - Samsung Gear VR Gets Wireless Positional Tracking Via Sixense STEM (roadtovr.com)

An anonymous reader writes: Samsung's Note 4-powered Gear VR is hotly anticipated as the first fully-fledged virtual reality headset to hit retail next month. It's an impressive system, but it lacks a crucial component for combating motion sickness: positional tracking.

Now, Sixense has announced that its cutting-edge VR motion control system 'STEM' has been adapted to work with the Gear VR — plugging a big gap in the device's feature set.

"Sixense’s Rubin went on to detail the company’s success with not only getting STEM tracking modules working with Gear VR, but claimed that they’ve managed to achieve around 7.5ms latency between STEM and the Note 4 via Bluetooth. This figure is impressively close to the company’s ‘full fat’ PC targeted system, which touts 4.2ms of latency."

Submission + - 'UniverseVR' Lets You Visualize 20 Billion Galaxies Using the Oculus Rift (roadtovr.com)

An anonymous reader writes: 'UniverseVR' uses data from the 'Millennium Simulation Project' to allow the real-time interactive VR visualization of some 20 billion galaxies. The Millennium Simulation Project (http://bit.ly/10i2nCg), run by the Max Planck Institute for Astrophysics, used more than a month of supercomputing time to simulate the evolution of billions of galaxies in a section of our universe that's 2 billion light-years wide. UniverseVR lets users explore the visualization of this simulation inside the Oculus Rift.

Submission + - 'NewRetroArcade' Uses VR to Send Players to a Brilliantly Detailed 80's Arcade (roadtovr.com)

An anonymous reader writes: Ever wish you could revisit the classic arcades of the '80s? Well grab an Oculus Rift and fire up your PC, because your wish just came true. Developer Digital Cybercherries (http://digitalcybercherries.com/) has just released 'NewRetroArcade', a brilliantly detailed '80s arcade with playable arcade machines featuring classic games from the '80s and early '90s, plus bowling, basketball, darts, and more. Using the Oculus Rift, players feel as if they're strolling through the corridors of arcade machines, all blinking and buzzing for their attention. The best part? NewRetroArcade is completely free. The game is built on UE4 and represents an impressive benchmark for the kind of detailed scenes that indies can create within the game engine.

Submission + - Epic Games Talk Optimization: Getting 'Showdown' to 90 FPS in UE4 on Oculus Rift (roadtovr.com)

An anonymous reader writes: Oculus has repeatedly tapped Epic Games to whip up demos to show off new iterations of Oculus Rift VR headset hardware. The latest demo, built in UE4, is 'Showdown', an action-packed scene of slow motion explosions, bullets, and debris. The challenge? Oculus asked Epic to make it run at 90 FPS to match the 90 Hz refresh rate of the latest Oculus Rift 'Crescent Bay' prototype. At the Oculus Connect conference, two of the developers from the team that created the demo share the tricks and tools they used to hit that target on a single GPU.
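The 90 FPS target translates directly into a hard per-frame rendering budget, which a quick calculation makes concrete (simple arithmetic for illustration; the figures below are not from Epic's talk):

```python
# Milliseconds available to render one frame at a given refresh rate.
# Missing this budget means a dropped frame, which is especially
# jarring in VR.
def frame_budget_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

if __name__ == "__main__":
    for hz in (60, 75, 90):
        print(f"{hz} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
```

At 90 Hz the whole scene must be rendered (for both eyes) in roughly 11 ms, versus about 16.7 ms for a conventional 60 Hz display — which is why the optimization tricks discussed in the talk matter.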

Submission + - Nimble Sense Aims to Bring Time-of-flight Depth-sensing to VR on the Cheap (roadtovr.com)

An anonymous reader writes: Today Nimble VR launches a Kickstarter campaign for Nimble Sense, a natural input controller that the company says was designed for virtual reality input, but flexible enough for other user interaction. And while Nimble Sense doesn’t at first appear to be much different than Leap Motion, the company says they’re using ‘time-of-flight’ depth sensing technology, like what’s used in the Kinect 2, which they say has unique benefits. The company claims to have “achieved a breakthrough in the accuracy, cost, and power consumption” of time-of-flight sensors, and they're aiming to bring the tech to the world of VR at an affordable price point: starting at $99, which includes the camera and a mount for the Oculus Rift DK2 headset.
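Time-of-flight sensing works by timing how long emitted light takes to bounce off a surface and return: the distance is half the round trip at the speed of light. A minimal sketch of that principle (illustrative only; Nimble VR's actual sensing pipeline is not public):

```python
# Depth from round-trip time of light -- the core idea behind
# time-of-flight sensors like the one in the Kinect 2.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_depth_m(round_trip_s: float) -> float:
    """Distance to the reflecting surface, in meters."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2.0

# A surface ~1 m away returns light after roughly 6.67 nanoseconds,
# which shows why ToF sensors need extremely precise timing circuitry.
print(tof_depth_m(6.67e-9))
```

The sub-nanosecond timing precision this demands is exactly where the cost and power-consumption challenges the company claims to have solved come from.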

Submission + - 'World of Comenius' Brings Virtual Reality Into the Classroom in a Big Way (roadtovr.com)

An anonymous reader writes: 'World of Comenius' uses the Oculus Rift DK2 VR headset and Leap Motion natural input controller to enable students to intuitively interact with software to learn about human anatomy. With the Leap Motion mounted to the Oculus Rift, users reach out to manipulate a skeletal model with removable bones and organs. Students can even zoom into the bloodstream and watch as blood cells make their way through the body. 'World of Comenius' developer Solirax partnered with the Mendel Grammar School in Opava City, Czech Republic to bring a fleet of the VR systems into the classroom to give students their first lesson in virtual reality education.

Submission + - New Oculus SDK Adds Experimental Linux Support and Unity Free for Rift Headset (roadtovr.com)

An anonymous reader writes: Oculus, creator of the Rift VR headset, has released a new version of its SDK which brings with it long-sought-after support for Linux, which the company says is "experimental". Linux support had been unavailable since the launch of the company's second development kit, the DK2 (http://www.oculus.com/dk2/). The latest SDK update also adds support for Unity Free (https://unity3d.com/unity/download), the non-commercial version of the popular game authoring engine. Previously, Unity developers needed the Pro version—costing $1,500 or $75/month—to create experiences for the Oculus Rift.

Submission + - Magic Leap Just Raised $542m, Now Hiring to Develop Their Lightfield AR Wearable (roadtovr.com)

An anonymous reader writes: After rumors broke last week, Magic Leap has officially closed the deal on a $542 million Series B investment led by Google (http://bit.ly/1x5xQSK). The company has been extremely tight-lipped about what they're working on, but some digging reveals it is most likely an augmented reality wearable that uses a lightfield display. "Using our Dynamic Digitized Lightfield Signal, imagine being able to generate images indistinguishable from real objects and then being able to place those images seamlessly into the real world," the company teases. Having closed an investment round, Magic Leap is now soliciting developers (http://bit.ly/1wmvj6I) to create for its platform and hiring for a huge swath of positions (http://bit.ly/1s3UBSF).

Submission + - Oculus Hiring Programmers, Hardware Engineers, and More for VR Research Division (roadtovr.com)

An anonymous reader writes: Buried toward the end of the must-watch keynote (http://bit.ly/1vQHUzD) by Oculus VR's Chief Scientist, Michael Abrash, was the announcement of a new research division within Oculus which Abrash says is the “first complete, well funded VR research team in close to 20 years.” He says that their mission is to advance VR and that the research division will publish its findings and also work with university researchers. The company is now hiring "first-rate programmers, hardware engineers, and researchers of many sorts, including optics, displays, computer vision and tracking, user experience, audio, haptics, and perceptual psychology," to be part of Oculus Research.

Submission + - Simple Hack Enables VR Mode for Oculus Rift in 'Alien: Isolation' (roadtovr.com)

An anonymous reader writes: In a surprising appearance at E3 2014, Oculus showed a virtual reality demo version of Creative Assembly's forthcoming first-person horror game, Alien: Isolation. Despite intense reactions to the demo (http://bit.ly/1lrinsT), the publisher stated that the full game would not feature Oculus Rift support. However, intentional or not, the developer left the code hidden in the game, and a simple hack enables it, restoring full support for the Oculus Rift including positional tracking.

Submission + - Reverse Engineering the Oculus Rift DK2's Positional Tracking Tech (roadtovr.com)

An anonymous reader writes: The Oculus Rift DK2 VR headset hides under its IR-transparent shell an array of IR LEDs which are picked up by the positional tracker. The data is used to understand where the user's head is in 3D space so that the game engine can update the view accordingly, a critical function for reducing sim sickness and increasing immersion. Unsurprisingly, some endeavoring folks wanted to uncover the magic behind Oculus' tech and began reverse engineering the system. Along the way, they discovered some curious info including a firmware bug which, when fixed, revealed the true view of the positional tracker.
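This style of tracking works because the camera sees the 2D image positions of LEDs whose 3D layout on the headset is known; pose estimation then inverts the projection. A toy sketch of the forward step, projecting a headset-frame LED through a simple pinhole camera (illustrative geometry only, not Oculus' actual code; the focal length and image-center values are made-up example parameters):

```python
# Project a 3D LED position (meters, camera frame) into 2D pixel
# coordinates with a pinhole camera model. Real constellation
# tracking solves the inverse problem: given the observed 2D blobs
# and the known LED layout, recover the headset's 3D pose.
def project(led_xyz, focal_px=700.0, cx=320.0, cy=240.0):
    x, y, z = led_xyz
    u = focal_px * x / z + cx
    v = focal_px * y / z + cy
    return (u, v)

# An LED 5 cm right of the optical axis, 1 m from the camera,
# lands 35 px right of the image center.
print(project((0.05, 0.0, 1.0)))  # -> (355.0, 240.0)
```

Because each LED's identity must be known for the inversion to work, the DK2 blinks its LEDs in coded patterns — one of the details the reverse-engineering effort uncovered.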

Submission + - DC Entertainment Bringing Batman Experience to Gear VR and Oculus Rift (roadtovr.com)

muterobert writes: Today it’s been announced that Warner Bros., DC Entertainment, and OTOY are collaborating to recreate the iconic Batcave from Batman: The Animated Series in virtual reality for Samsung Gear VR and Oculus Rift. OTOY is providing what they call “holographic video” technology to render the scene in a way that’s true to the Batcave of the classic 90s show.

Submission + - Experiment Shows Stylized Rendering Enhances Presence in Immersive AR (roadtovr.com)

An anonymous reader writes: William Steptoe, a senior researcher in the Virtual Environments and Computer Graphics group at University College London, published a paper detailing experiments dealing with the seamless integration of virtual objects into a real scene. Participants were tested to see if they could correctly identify which objects in the scene were real or virtual. With standard rendering, participants were able to correctly guess 73% of the time. Once a stylized rendering outline was applied, accuracy dropped to 56% (around chance) and even further to 38% as the stylized rendering was increased. Less accuracy means users were less able to tell the difference between real and virtual objects. Steptoe says that this blurring of real and virtual can increase 'presence', the feeling of being truly present in another space, in immersive augmented reality applications.
