Taeyong Kim, VP of Samsung Electronics and Head of Graphics R&D, took the stage last month at the company’s annual developer conference to talk about the future of VR at the company. While no big reveals were expected from the talk, Kim told the crowd that the company is actively working on a standalone VR headset with inside-out tracking and 6 DoF controllers as its ‘next mobile VR product’.
“The question is, how do we combine the benefits of [Gear VR and Odyssey] together for our next VR system?” said Kim. “That’s the question we’re asking at this point. So we think the next mobile VR system would have inside-out positional tracking along with 6 DoF motion controllers. Imagine what you can do with a system where you can track your motion […] along with your hands […] and also use it wherever you are with its full mobility.”
Kim says the company is partnering with Intel to bring inside-out positional tracking technology to a standalone VR headset. Intel had been working on its own standalone headset, called Project Alloy, until the project was scrapped back in September, possibly making for a fortuitous turn of events that lets the tech live on in the hands of the well-seasoned Korean manufacturer.
If Samsung follows through with its own standalone VR headset with 6 DoF controllers, it would be in direct competition with Oculus’ upcoming Santa Cruz prototype, and to a lesser extent with the newly revealed Vive Focus. The future of the Vive Focus is uncertain however, as HTC hasn’t said specifically whether the headset is staying in China or making its way to the West at some point. Either way, it won’t be running the Daydream app store and will likely only have a 3 DoF controller despite its similar inside-out tracking capabilities.
As for the growing divide between Oculus and Samsung: with the advent of the $200 Oculus Go mobile VR headset announced back at Oculus Connect, it will certainly be interesting to see if the app marketplaces diverge, as for now Gear VR (powered by Oculus) is still very much a Samsung product.
Kim says more details will arrive in the near future.
Toyota recently revealed T-HR3, the company’s third-generation humanoid robot. Designed primarily as an experiment to explore new technologies to make robots more physically capable, T-HR3 demonstrates a new “remote maneuvering system” that not only mirrors a user’s movements to the robot, but lets them see and interact with the world through the ‘eyes’ and arms of the robot using a robotic exoskeleton, an HTC Vive headset, and a pair of Vive Trackers.
Controlled via what Toyota calls a “Master Maneuvering System,” T-HR3 allows the entire body of the robot to be operated by a person thanks to wearable controls that the company says mirror the user’s head, hand, arm and foot movements.
Both the robot itself and the Master Maneuvering System contain a series of motors, reduction gears and torque sensors connected to each joint. A total of 16 controls command 29 individual robot body parts, making for what the company calls “a smooth, synchronized user experience.”
Toyota is positioning the robot as the next logical step in the ultimate goal of creating a friendly assistant capable of helping people in a variety of settings, including the home and medical facilities, and more dangerous places like construction sites, disaster-stricken areas and outer space. As investment in telepresence-controlled humanoid robots grows though, there’s bound to be a number of happy side effects for VR users, like better force-feedback haptics and full immersion rigs that could equally be used to control VR avatars. Because projects like these are still in prototyping, we’ll just have to wait and see what happens during the inevitable rise of our robotic companions. In any case, we’ll be here reporting (until the journo-bots take our jobs, that is).
Without explicitly announcing a new specific product, Samsung quietly implied that they may be developing a new standalone mobile VR HMD during a session during their developer conference last month. While there were no major VR announcements during the main keynote at SDC, in a session titled What’s on The Horizon: A Look at the Future of VR at Samsung, Tae Yong Kim, Samsung Electronics’ VP, Head of Graphics R&D, showed a graphic with a question mark in between a Gear VR mobile VR headset and a Samsung Odyssey Windows Mixed Reality HMD. Kim said that the Gear VR is “fully mobile, quickly attaches via a cell phone, and affordable” while the Odyssey offers a “premium experience coming from the positional tracking of the headset and the controllers, and the computing power of the PC.” He said, “The question is ‘How do we combine the benefits of those two technologies together for our next VR system?’”
Kim then showed a slide saying the next steps for Samsung’s mobile VR include inside-out tracking and 6 degree-of-freedom controllers, and he said, “We are partnering with global partners like Intel to bring inside-out technology to our next mobile product portfolio.” Neither Intel nor Samsung had any further comment about this quiet announcement of a “next VR system” and “next mobile product” in Samsung’s portfolio, which seems more significant than merely adding positional tracking and 6-DoF controllers to existing Gear VR devices.
It looks like we’ll have to wait until CES this year to learn whether this is more than a positional tracking and 6-DoF controller update to Gear VR, and whether Samsung is developing its own standalone headsets independent of Facebook’s Oculus Go. It’s unclear what software would run on Samsung’s new headsets. Samsung appears to have a non-exclusive agreement with Oculus, since the Samsung S8, S8+, and Note 8 are both Daydream and Gear VR-enabled, but it doesn’t appear that Facebook has a similar non-exclusive arrangement with Samsung; if Facebook is able to expand to OEMs beyond Samsung, it appears they have not done so yet. It could be that Facebook is planning a walled-garden hardware ecosystem similar to Apple’s, and will be focusing its energy on the control that comes with building its own standalone headsets.
It’s unclear how healthy and sustainable the current partnership between Facebook and Samsung is. It appears as though Facebook mostly handles the software while Samsung handles the hardware, and while there’s obviously overlap between the two, it’s possible that these next HMDs will indicate whether Facebook takes more control over the hardware and Samsung takes more control over the software.
I had a chance to talk with Samsung’s Tom Harding, who is the Director of Immersive Products in charge of product strategy and bringing VR to the market. We talked about the Gear VR, marketing VR, Samsung Internet VR, the Gear 360 and Round cameras, and the 3-DoF Gear VR controller, as well as the collaborations with Google on Daydream and ARCore and with Facebook/Oculus on Gear VR.
LISTEN TO THE VOICES OF VR PODCAST
I challenged Samsung for not investing many resources in the VR content ecosystem or attending many community VR events over the past couple of years. Harding says that Samsung’s focus has been on scale and making VR solutions available to all, and that they’ve been primarily focusing on driving adoption. But I wonder how much you can drive adoption of VR technologies without also investing in the content that will ultimately drive grassroots word of mouth and adoption.
A number of independent video creators expressed frustration that Samsung has not been doing more to support the needs of content creators, including how Samsung has not created any marketplace for immersive content creators to sell their work. One creator told me that Samsung did not offer them any licensing fees to feature their work in the Samsung VR app, and a survey of content creators whose work was featured at Samsung’s Evening of 360 show revealed that no payment was offered for featuring their work.
A lot of the content curation and marketplace development has been offloaded to Oculus, since they serve as the primary point of contact with the VR development community, and so Samsung has been really disconnected from the needs of content creators. Samsung is in a financial position to invest a lot more in the future of the VR medium, but it appears as though they have not been taking a holistic approach to supporting the VR content ecosystem or directly engaging the grassroots of the VR community. I hope to see a lot more from Samsung in the year to come, and that they take the initiative to engage, listen, and help serve some of the larger needs of the VR community.
Facebook Spaces, the company’s social VR platform available on Oculus Rift, has updated to include a new mini-game that aims to pave the way for more yet to come. Called Bait! Arctic Open, the game lets you ice fish while you shoot the breeze with your VR buddies.
Announced during the Oculus Connect developer conference this year alongside the news that Home will soon get a full social overhaul, Facebook has finally released their first game for Spaces, a Rift-native iteration of Resolution Games’ Gear VR title Bait! (2016). Called “an early experiment” by the company, the game may be a simple fishing game, but Facebook considers it a milestone in a much larger effort to not only get more people using the social VR space, but to keep them there.
Facebook Spaces Head of Product Mike Booth says the main reason the company used a game like Bait! was “because it’s a 3-dimensional, real-time simulation that’s social—so it could be used as a prototype to start building our third-party developer tools.” Booth, who’s overseen the project since the beginning, says putting the game into Spaces forced them “to do the kinds of things we’ll need to do in a 3D environment for developers, rather than a 2D environment.”
As a game that admittedly allows you to be socially active without taking “100% of your mental bandwidth,” as Resolution Games CEO Tommy Palm puts it, it’s an interesting first entry that plays to the strengths of the platform as it currently stands. Facebook Spaces still doesn’t feature any sort of artificial locomotion, making an ice fishing game, where you’re necessarily tied to a single location, a natural starting point. Being fairly simple doesn’t hurt either, making it an easy thing to pick up and put down.
Facebook’s social VR team has also been experimenting heavily with traditional table games like cards and dice too, and while those things will likely come at some point in the future, the company seems more focused on building out a ‘target platform’ and creating developer tools so more third-party developers can come in with their own ideas.
“This is only the beginning,” says Booth. “Experiments like this are helping us learn what’s possible to build in Spaces, what works, what doesn’t, and what tools will empower developers to bring great ideas to life. Eventually, we want even more developers to build with us. We’ll share more on our plans next year, so stay tuned!”
Pixvana, a Seattle-based startup creating a cloud-based processing and delivery platform for VR video, today announced a $14 million Series A funding round led by Vulcan Capital with participation from Raine Ventures, Microsoft Ventures, Cisco Investments and Hearst Ventures, and existing investor Madrona Venture Group.
The company exited stealth when they announced their initial seed round back in December 2015. The Series A brings their overall investment to $20 million. Now, Pixvana hopes to “enable anyone to create and distribute next-generation video experiences” using its main service SPIN Studio.
SPIN Studio is an end-to-end solution for authoring and delivering virtual, augmented, and mixed reality video content, which the company encapsulates in the term ‘XR video’. Both playback and direct-to-store publishing functions are already available to beta testers looking to deliver up to 8K video without the usual caveats of questionable streaming quality and stuttery playback—a feat accomplished in part by the company’s Field of View Adaptive Streaming (FOVAS) technology, which optimizes playback resolution by delivering video in discrete sections and rendering video at full quality only where you’re actively looking. FOVAS is currently supported on Gear VR, HTC Vive, Oculus Rift, Windows VR headsets, and Daydream.
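To make the idea of view-adaptive streaming concrete, here is a toy sketch of how a player might decide which tiles of an equirectangular frame to fetch at high quality based on the viewer’s gaze direction. The tile grid, field of view, and two-level quality scheme are illustrative assumptions, not Pixvana’s actual FOVAS implementation, which is proprietary.

```python
import math

def tile_qualities(view_yaw_deg, view_pitch_deg,
                   cols=8, rows=4, fov_deg=105.0):
    """Assign a streaming quality to each tile of an equirectangular
    frame: full resolution inside the viewer's field of view, low
    resolution elsewhere. A simplified model of view-adaptive streaming."""
    qualities = {}
    for r in range(rows):
        for c in range(cols):
            # Tile centre in spherical coordinates (yaw: -180..180, pitch: -90..90).
            tile_yaw = (c + 0.5) / cols * 360.0 - 180.0
            tile_pitch = 90.0 - (r + 0.5) / rows * 180.0
            # Angular distance from gaze, wrapping yaw into [-180, 180].
            d_yaw = (tile_yaw - view_yaw_deg + 180.0) % 360.0 - 180.0
            d_pitch = tile_pitch - view_pitch_deg
            dist = math.hypot(d_yaw, d_pitch)
            qualities[(r, c)] = "high" if dist <= fov_deg / 2 else "low"
    return qualities

q = tile_qualities(0.0, 0.0)  # viewer looking straight ahead
print(sum(1 for v in q.values() if v == "high"), "of", len(q), "tiles at high quality")
```

A real system would re-run this selection as the viewer’s head moves, pre-fetching neighboring tiles to hide switching latency.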
According to a company blog post announcing the investment, SPIN Studio takes care of stitching, editing, publishing, and playback in a single “seamless workflow.”
“We are thrilled to work with this amazing group of investors backing us, to realize our vision of XR storytelling,” said Forest Key, Pixvana CEO. “Faster iteration and blending of video and 3D content will allow creatives, brands and media companies to create amazing XR experiences. More announcements like the recent devices for Windows MR and Oculus Go will create a rapidly expanding market for XR video experiences.”
To seal the investment, Key and his team created a 360 pitch video entitled Sofia, named after the famous director Sofia Coppola. Below you can get a taste of the 15-minute video, albeit edited for public viewing. While the video is available on YouTube, you won’t get the benefits of FOVAS unless you view it through the Spin Play app on Steam, the company’s own showcase of the videos made possible by its cloud-streaming and adaptive playback capabilities.
The company formerly known as AxonVR, which has raised more than $5 million in venture capital, is rebranding to HaptX, and revealing a functional prototype of a VR glove which uses micro-pneumatics for detailed haptics and force feedback to the fingers. After trying the prototype for myself, I came away impressed with the tech. The company’s next challenge is to turn the prototype into something sleeker, smaller, and far more practical.
Meeting with HaptX co-founder Jake Rubin in Silicon Valley earlier this month, I got to try the latest prototype of the company’s wild-looking haptic VR glove—a monstrous piece of equipment hooked up to some massive cabling. Putting it on—with the help of two people by my sides—I felt like I was preparing for a medical procedure, as the pair showed me how to carefully guide my fingers into the right places, pull out some fabric slack, and then tighten the glove to my hand with a ratcheting mechanism to ensure a snug fit. But the point of this prototype is not about size and fit, it’s all about function. And function it did—the haptics and force feedback were the most responsive and detailed I’ve tried to date, in part thanks to the glove’s micro-pneumatics.
The HaptX glove is based on innovative micro-pneumatic technology. The company has developed a method for producing thin, bendable fabrics manufactured with a series of tiny air channels along their length, which terminate in small inflatable circles that act as “haptic pixels,” according to Jake Rubin, one of the company’s two co-founders and its CEO. The inflatable circles, just a few millimeters across, are aligned into grids; by precisely controlling when and which haptic pixels inflate, convincing sensations can be created, simulating the feeling of an insect crawling along your finger or a marble rolling around in the palm of your hand.
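The grid-of-pixels idea can be sketched in a few lines: given a contact point on the skin, compute an inflation level for each pixel under the contact patch. The grid size, pixel pitch, and linear falloff here are illustrative assumptions; HaptX’s actual control scheme is unpublished.

```python
def haptic_frame(contact_x, contact_y, grid_w=10, grid_h=10,
                 pixel_mm=3.0, contact_radius_mm=4.0):
    """Return inflation levels (0.0-1.0) for a grid of pneumatic
    'haptic pixels', inflating those under a circular contact patch
    centred at (contact_x, contact_y) in millimetres. Inflation falls
    off linearly toward the edge of the patch. Simplified illustration."""
    frame = [[0.0] * grid_w for _ in range(grid_h)]
    for row in range(grid_h):
        for col in range(grid_w):
            # Pixel centre position in millimetres.
            px = (col + 0.5) * pixel_mm
            py = (row + 0.5) * pixel_mm
            dist = ((px - contact_x) ** 2 + (py - contact_y) ** 2) ** 0.5
            if dist < contact_radius_mm:
                # Stronger inflation nearer the contact centre.
                frame[row][col] = 1.0 - dist / contact_radius_mm
    return frame
```

Animating the contact point across successive frames would approximate the “marble rolling around in your palm” effect described above.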
The glove also features force feedback: the ability to restrict the movement of your fingers to simulate holding objects. This too is based on the company’s micro-pneumatic technology, which Rubin explained works by inflating stoppers along the joints of your fingers to restrict their movement. The effect is that when you reach out to grab an object, say a baseball, your fingers stop right where they would be coming into contact with the baseball.
Both effects were impressively responsive and quite convincing. I’ve tried a few other similar systems, but the haptics from the HaptX glove blew the others away. The glove puts the haptic material across the palm of your hand and on the tips of your fingers—totaling some 100 individual haptic pixels—allowing you to feel a finely detailed sensation of pressure in all those places. The range of tactile sensations was ultimately surprising, revealed to me when I was thrown into the company’s farm-themed menagerie of tactile examples.
Feeling the Farm
Rubin walked me through the demo experience, built using an SDK of HaptX’s design, which he says is largely created by leveraging Unreal Engine’s physics system to tell the glove when and where to apply haptic effects and when and how to engage the force feedback.
With the glove on my right hand, and wearing an HTC Vive headset, I was looking down at a miniature barnyard with some little sunflowers off to the right and a tiny patch of wheat in front of a barn. Rubin encouraged me to start poking and prodding at the scene. Each of the glove’s fingers is tracked by a proprietary magnetic tracking system which Rubin claims is capable of sub-millimeter precision. Indeed it worked well.
As I reached out with my index finger to gently touch the leaves on the side of the sunflower, I could feel pressure against my finger that quickly and closely followed the visuals, making it easy to connect the feeling with the image.
Next to the sunflowers was a little grey storm cloud, and when I poked it, pea-sized raindrops began falling from within. Stretching out my palm to catch them, I felt a convincing pitter-patter of pressure right on my palm. A similarly convincing moment came when I brushed my palm across the tops of the tiny wheat plants.
Eventually a baseball-sized tractor came rolling out of the barn. When I went to pick it up like a little toy, my fingers stuck in place—seemingly right against the virtual tractor’s surface—and wouldn’t budge. It was a convincing effect, especially combined with the haptics putting pressure on the tips of my fingers as though I was holding something. While there’s no way for the glove to simulate the weight of the object, making it feel real at least in terms of the volume it takes up is a big step forward. ‘Mock grabbing’ items with controller-less VR hand input feels really unnatural, but the quality of this force feedback remedied that with ease.
There was more to do and see in the demo, including some tiny critters that came out of the barn to dance around on my palm so that I could feel their little steps (including the eight legs of a spider, which Rubin tells me is the most contested part of the demo). In the end though, the whole ordeal provided me with a new benchmark for small-scale haptics in VR.
Czech developer iNFINITE Production has released UVRF – a free, cross-platform template for hand presence in VR. The open-source demo offers a framework for use in any Unreal Engine project, as well as a ‘Playground’ scene containing an underground bunker and shooting range to showcase hand interactivity.
Detailed in a post on the Unreal Engine VR developer forum, UVRF’s framework aims to be a useful starting point for implementing hand presence in an Unreal-based VR experience, offering 17 grab animations to cover most objects, per-platform input mapping and logic, basic haptics, teleport locomotion using NavMesh (with rotation support on Rift), touch UI elements, and several other useful features. The framework is released under the CC0 license, meaning it can be used by anyone without restriction.
In a message to Road to VR, Jan Horský of iNFINITE Production explained how this template could be particularly useful to new developers. “While Unreal does a very good job at making development accessible, building hands that properly animate, are properly positioned, with grabs and throws that feel natural and so on, is still not a trivial task,” he writes. “While it’s not a problem for experienced dev teams, it is a problem for newcomers. And they’re the ones that are likely to have ideas that will surprise us all. This little demo is an attempt to make VR development easier for them.”
The included ‘Playground’ demo shown in the video features a functional shooting range in an underground bunker, littered with magazines to show the multi-object interaction of reloading a gun, along with many other features to highlight the hand animations.
Originally developed as an internal tool for prototyping at iNFINITE Production, the team decided to kindly share it with the world. “I expected such a project would come from companies that are more interested in VR growth like Oculus, Valve, or HTC,” says Horský. “It’s nearly a year since Touch was released and there is still no such thing publicly available, so we decided to take it into our own hands.”
Black Friday is nearly here, and while the stores are soon to be jam-packed with people hoping to get a great deal on this year’s must-buys, online sales have already begun for the newest entries into the world of VR hardware. Microsoft’s Windows “Mixed Reality” headsets are now discounted by up to $100, bringing the cheapest among them to just $300.
The sale runs until 11:59 PM local time on November 27, 2017, and is only available at Microsoft retail and online stores in the US and Canada. The deal includes both the headset and wireless motion controllers.
Besides the Samsung Odyssey, which features higher resolution displays and integrated audio from AKG, every headset listed above is basically the same in terms of specs, save a few ergonomic and aesthetic differences. We’ve listed the basic specs and also Samsung Odyssey specs at the bottom of the article.
What sets these apart from other VR headsets on the market though is the ability to do inside-out positional tracking, which means you don’t have to put up sensors or basestations to have a room-scale VR experience. Not only that, but Microsoft says the headsets work on integrated graphics cards, albeit for less intense functions like using your standard Windows-flavored productivity tools, or watching a video in your own private cinema.
If you’re looking to use the headset for gaming though, you’re in luck. Microsoft recently pushed support for SteamVR compatibility, so you can buy and play VR games from VR’s largest digital marketplace, Steam. Remember, you’ll need a sufficiently powerful computer to run more graphically intensive applications, so you’ll want, at the very least, a computer that meets the specs published by Microsoft.
Oh, and don’t be confused by the “Mixed Reality” naming scheme. These are definitely VR headsets, with no appreciable use for augmented reality (AR) like the company’s HoloLens headset.
Basic Headset Specs
Two high-resolution liquid crystal displays at 1440 x 1440
2.89” diagonal display size (x2)
Front hinged display, so you can flip the headset up while working
Up to 105 degrees horizontal field of view
Display refresh rate up to 90 Hz (native)
Built-in audio out and microphone support through 3.5mm jack
Single cable with HDMI 2.0 (display) and USB 3.0 (data) for connectivity
Foundry10, a Seattle-based philanthropic educational research organization, wanted to explore what happens when you bring VR into the classroom. Following a pilot project started in 2015, Foundry10 has now put VR in the hands of 40 schools and community centers around the world, and measured students’ attitudes toward the technology.
The data from Foundry10’s initiative is available in two free reports, which begin to lay a foundation of data about how VR might be most useful in the classroom:
The majority of the schools in the study were in the United States and Canada. Headsets were mostly distributed through Foundry10 contacting teachers who had expressed interest in bringing VR to their classroom. A total of 1,351 students from 6th to 12th grade participated, the majority of them in the 7th and 8th grades. As for gender, half the students were male, a third were female, and a sixth did not specify. Most of the students had not tried VR before trying it in school or in a community center.
To gather data, student surveys were administered both before and after VR experiences. The surveyed students were grouped into two categories: VR consumers, those who viewed and engaged with VR content; and VR creators, those who consumed as well as created and/or modified existing content as part of their learning. VR content creation was offered in classes such as Advanced Computer Science, Game and App Development, and Fundamentals of Digital and Visual Arts. Some students were using VR to create artistic work, but those were not counted as VR content creators since they were not coding. Content creators also tended to be in higher grade levels (the majority in 12th grade), and the majority were male.
Of the students surveyed (regardless of class content), the majority were interested in both consumption and creation of VR content. There was a small decrease (a shift toward consumption only) when comparing post-surveys to pre-surveys, but a majority still wanted to do both. As for subject matter, students mostly wanted to experience content in concrete subjects such as history or science.
Initially students were unsure of what to see or what could be done in VR, but there were distinct shifts between the before and after surveys. There were positive shifts in categories such as trying new things and historic experiences, but negative shifts in emotions. Students felt that they could learn about places through VR. A teacher offered an anecdote about their students in a rural classroom experiencing a virtual subway ride. This was very impactful on this group of students because the majority of them had never seen anything like a subway except in images and video.
Additional questions were recorded in the survey, such as what causes breaks in immersion in VR content (the answer will most likely not surprise you). There was also data presented on discomfort experienced by students in VR, which was not overly common, but still something teachers and creators of educational VR content will need to consider in the future.
Also, there was evidence that, in schools where the VR program did not have the support of the school’s administration and IT services, the technology was draining on teachers, regardless of the teacher’s previous interest. Students also had comments on the hardware, in particular that the cables were troublesome, so a wireless or otherwise less cumbersome version would be preferred.
Overall, students had confidence that VR ed content developers were knowledgeable about the content they were creating. They also understood that the technology has a long way to go, but felt the simulations they experienced were realistic. Students also felt that VR was helpful to people, and should be more accessible. At the time of publication, 30 schools were enrolled for the 2017-2018 program. For more in-depth analysis of these findings, please visit foundry10; links for both the in-depth study and the summary are available there. This was a very intriguing study, and we look forward to seeing their results in the future.
Benchmark specialists Futuremark are launching a new ‘room’ on November 22nd as a free update for VRMark Advanced & Professional Editions. Adding to the existing ‘Orange Room’ and ‘Blue Room’ benchmarks, ‘Cyan Room’ uses a pure DirectX 12 engine optimised for VR.
Finnish software development company Futuremark launched VRMark in 2016, a dedicated virtual reality benchmarking application to complement their popular performance testing software, 3DMark. VRMark features two tests: the ‘Orange Room’, a specific ‘VR readiness’ test designed around the minimum PC hardware requirements suggested for the HTC Vive and Oculus Rift; and the ‘Blue Room’, a much more demanding test designed to run at 5K resolution, stretching the legs of the latest graphics cards.
‘Cyan Room’ is the latest VRMark test, using a pure DirectX 12 engine built in-house. According to the VRMark website, this test was designed to show “how using an API with less overhead can help developers deliver impressive VR experiences even on modest PC systems.”
As with the other test rooms, the new scene can run on a monitor or VR headset, and runs on a fixed path for consistency. Resolution and other visual settings can be adjusted, with frame-by-frame performance charts. There is also an Experience mode where the user can explore the sequence at their own pace, to judge the quality for themselves. As explained in a previous article, because VR rendering techniques compensate for dropped frames or low performance so effectively, and the experience varies significantly between individuals, Futuremark believe that VR benchmarking should offer a combination of objective and subjective tests.
The benchmark results will tell you objectively whether your PC was able to meet the target frame rate, along with a comparison with other systems, but sometimes the numbers don’t tell the whole story, and a subjective look around in your headset could give a different impression, depending on how sensitive you are to various VR rendering tricks, like reprojection, which are used to cover up moments of spotty performance.
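The objective half of such a benchmark boils down to comparing per-frame render times against the headset’s frame budget. The following sketch shows that calculation; the function name, the 1% miss threshold, and the sample numbers are illustrative assumptions, not how VRMark actually scores results.

```python
def benchmark_summary(frame_times_ms, target_fps=90.0):
    """Summarise a captured list of per-frame render times, reporting
    average FPS and how many frames missed the target frame budget --
    the kind of objective result a VR benchmark reports."""
    budget_ms = 1000.0 / target_fps          # ~11.1 ms per frame at 90 Hz
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    missed = sum(1 for t in frame_times_ms if t > budget_ms)
    return {
        "avg_fps": 1000.0 / avg_ms,
        "frames_missed": missed,
        # Call the target met if under 1% of frames blew the budget.
        "target_met": missed / len(frame_times_ms) < 0.01,
    }

result = benchmark_summary([10.5, 11.0, 10.8, 12.5, 10.9])
print(result)
```

The subjective half—looking around in the headset yourself—matters precisely because reprojection can mask a missed frame in these numbers while still being perceptible to a sensitive viewer.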
The Cyan Room scene will be a free update for VRMark Advanced Edition and VRMark Professional Edition. VRMark Basic Edition is a free download, containing just the Orange Room benchmark.