No Comment Diary

The News Without Comment

Google Plans To Make Major AR Push

Apple’s ARKit has been a major success, with a whole host of apps and videogames created since its launch. Not surprisingly, Google have been keen to keep pace with Apple in this area, as seen by the announcement of ARCore in August last year. Now reports indicate that ARCore is almost ready for launch.

An insider familiar with Google’s plans in the augmented reality (AR) area has been speaking to Variety, and has revealed that the ARCore framework will be released at or just after the Mobile World Congress in Barcelona next week.

Preview versions of ARCore have been made available for Google’s own Pixel smartphones, allowing the company to test some of its capabilities. Google plans to bring ARCore to over 100 million phones in the near future, a goal it aims to achieve by working with key manufacturing partners to install the framework on other Android devices.

Analysts are predicting that Samsung’s Galaxy flagship phones will be among the first to receive the ARCore update once it goes live. Google and Samsung have already worked together to introduce Google Daydream compatibility to the Samsung Galaxy S8 and Galaxy Note 8 series of phones. In fact, a preview version of ARCore is already available on some Samsung Galaxy S8 phones.

Google recently demonstrated the ARCore technology on its Pixel phones to allow users to capture videos of virtual Star Wars Stormtroopers. Two developer preview versions have been made accessible to selected developers, but third-party developers have so far been barred from making the resulting apps available on the Google Play Store. That will likely change with the official release of ARCore 1.0.

The ARCore technology allows for more sophisticated uses of AR, letting virtual objects behave as real-life objects would: move closer to an AR object and it will get bigger, and if it is moved, its shadow will move with it.

Google have been experimenting with AR technology for quite some time; ARCore replaces the earlier Tango technology, which is soon to be discontinued in ARCore’s favour. VRFocus will continue to keep you informed on Google’s work in VR and AR.

http://ift.tt/2HHoqtl Source: https://www.vrfocus.com



Oculus To Attend GDC, Jason Rubin Discusses The Future For The Company

The Game Developers Conference (GDC) is set to take place in San Francisco next month, and among those attending will be Oculus, bringing with them plenty of exciting projects.

Oculus Touch

Talking to the GDC website in a preshow interview, Jason Rubin, Oculus’s Vice President of Content, spoke about what attendees can expect from the company at this year’s show: “Oculus is coming off a monumental year: Rift continued dominating the PC VR market as prices dropped, Touch launched and became the de facto input device, our content library grew to over 2,000 titles with numerous accolades, we unveiled the first standalone VR headset with Oculus Go, and we continue work on the Santa Cruz prototype.”

The Santa Cruz prototype was revealed back at Oculus Connect 3 as a virtual reality (VR) headset that needs neither a computer nor a mobile device. In a hands-on session, VRFocus Senior Staff Writer Peter Graham described it as “…lightweight, soft to the touch and of a much better build quality. It’s also surprisingly small.”

Oculus Go headset

Talking to GDC, Jason Rubin described the Oculus Go as: “a winning combination of accessibility and freedom, and it will be the easiest way for people of all ages to get into VR when it launches. It’s binary compatible with Gear VR and uses the Oculus Mobile SDK, so developers will have a streamlined path to publishing. We’re really excited about what that means for consumers and developers alike.”

Oculus will be focusing on the future of VR at this year’s GDC, and as Jason Rubin says, the company is: “investing aggressively on all fronts: hardware, software, content. We’re hard at work in Oculus Research pioneering what VR (and AR) will look like 5 or 10 years down the line. We’re in deep with developers and we’re not slowing down. We’ll keep working to provide the best tooling for developers. You’ll see us building on early work with things like OpenVR to create more cross-platform systems and services. 2017 saw the launch of numerous social apps and games, like Echo Arena, that redefined shared presence. Once you’ve played a co-op game in VR, there’s no going back—imagine a made-for-VR MMO—and the potential is astounding. We are helping more developers envision and enable these types of meaningful interactions.”

GDC 2018 will be taking place 19th March to 23rd March at the Moscone Center in San Francisco. Further news about Oculus, their developments and GDC 2018 will be here on VRFocus.

http://ift.tt/2GDdA66 Source: https://www.vrfocus.com



GORN Gets Even Bigger With Big Things Update

Over-the-top violence and copious amounts of mangled body parts have been the trademark of GORN since it was released. Apparently developer Free Lives and publisher Devolver Digital are believers in the concept of ‘Go Big or Go Home’, judging by the upcoming Big Things update.

GORN is ostensibly a gladiator simulator in virtual reality (VR) featuring a suitably ridiculous combat engine that allows players to gleefully smash, stab, rip or slice opponents to bits in a shower of blood and body parts, using a variety of weapons such as knives, axes, maces and bows.

The Big Things update will be the third major update released for the title, and the development team are promising that this update will be the biggest yet. The first addition is a new opponent in the form of tag-team Giant and Mitch. Giant lives up to his name, while Mitch is smaller, but also quicker and sneakier, making them a pairing sure to prove a challenge to even experienced gladiators. Players will also be able to access new weapons, all of which are suitably oversized: the Great Axe, Chainblade and Throwing Shield.

The arena itself will be undergoing something of an overhaul as well, with a new larger location that offers more room to manoeuvre as well as tricky environmental hazards that can be used to the player’s advantage if they are careful.

Other new additions include a new crossbow companion and a Nightmare difficulty mode for those really wanting to push themselves. For those concerned about the effect all this ultraviolence might have on impressionable minds, a Low Violence mode will also be made available in the new update.

Until Monday 26th February 2018, GORN will be available at 25% off as part of the Steam Weekend Deal. The Big Things update is available for free to all current GORN players. Further information can be found on the Steam store page.

Further news on updates and offers for VR titles will be here on VRFocus.

http://ift.tt/2CD0JP3 Source: https://www.vrfocus.com



Audi Quattro Coaster Brings the Showroom to Your Home

New augmented reality app allows users to explore and drive the Audi Quattro in a virtual showroom experience

Car manufacturer Audi have released a new augmented reality (AR) app that allows a user to bring one of four models of the Audi Quattro into their own home.

Audi Quattro Coaster screenshot 1

By using a device to scan a room, a user will be able to view a detailed model of the Audi Quattro in the comfort of their own home. The digital recreation of the car can be viewed at full size to see how it fits in your driveway, or placed on a table for more compact exploration. All the details are faithfully captured, right down to the car’s interior, resulting in an experience much like visiting a showroom.

On top of viewing the Audi Quattro, the Audi Quattro Coaster app also allows users to create their own track and take the car for a test drive. By holding down the red button in the app and then moving forward, you can create a test track limited only by your available space and imagination. Take the course around chairs, under and over a table, and up to the ceiling and back. Once finished, the track remains in virtual space, allowing a user to move around freely and study both the road and the cars from different angles. To showcase the Audi Quattro’s four-wheel-drive power, the track will cycle through all four seasons and the cars will react accordingly, giving you a detailed test drive up close.

Audi Quattro Coaster screenshot 2

The Audi Quattro Coaster app will also let users see an extended version of the TV commercial by holding a device with the app open up to their TV. When viewed this way, the camera will scan a point in the commercial, and the car will then continue to move through the device screen, extending the content shown in the commercial.

The Audi Quattro Coaster app is currently only available via the App Store on Apple iPhone and iPad devices running iOS 11, but Audi is hoping to bring the app to Android in the future. As and when further details are released about the Android version, VRFocus will keep you updated.

http://ift.tt/2osOeAy Source: https://www.vrfocus.com



XRI: Cross-Reality Interaction

Widespread consumer adoption of XR devices will redefine how humans interact with both technology and each other. In coming decades, the standard mouse and QWERTY keyboard may fade as the dominant computing UX, giving way to holographic UI; precise hand-, eye- and body-tracking; and, eventually, powerful brain-to-computer interfaces. One key UX question that designers and developers must answer is: how to input?

That is, by what means does a user communicate and interact with your software and to what end? Aging 2D input paradigms are of limited use, while new ones are little understood or undiscovered altogether. Further, XRI best practices will vary widely per application, use case and individual mechanic.

The mind reels. Though these interaction patterns will become commonplace in time, right now we’re very much living through the “Cinema of Attractions” era of XR tech. As such, we’re privileged to witness the advent of a broad range of wildly creative immersive design solutions, some as fantastic as they are impractical. How have industry best practices evolved?

Controllers

These may seem pedestrian, but it’s easy to forget that the first controllers offering room-scale, six degrees-of-freedom (6-DoF) tracking only hit the market in 2016 (first Vive’s Wands then Oculus’ more ergonomic Touch, followed by Windows’ muddled bastardization of the two in 2017). With 6-DoF XR likely coming to mobile and standalone systems in 2018, where are controller interfaces headed?

Well, Vive’s been developing its “Grip controllers” (aka the “knuckles controllers”) — which are worn as much as held, allowing users freer gestural tracking and expression — for over a year, but they were conspicuously excluded from the CES launch announcement of the Vive Pro.

One controller trend we did see at CES: haptics. Until now, handheld inputs have largely relied on general vibration for haptic feedback. The strength of the rumble can be throttled up or down, but restricted to just one vibratory output, developers’ power to express information through physical feedback has been limited. It’s a challenging problem: how do you simulate physical resistance where there is none?
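
To make concrete just how little bandwidth that single channel offers, here is a minimal, engine-agnostic Python sketch (all names hypothetical, not any vendor’s API) of encoding an impact into nothing more than an amplitude envelope: a snappy decay hints at “hard” contact, a slow one at “soft”.

```python
import math

def rumble_envelope(impact_force, material_hardness, duration=0.2, steps=20):
    """Map a collision to a single-channel amplitude envelope (0.0-1.0).

    With only one vibratory output, 'hard' vs 'soft' contact can only be
    hinted at by how sharply the amplitude attacks and decays.
    """
    peak = min(1.0, impact_force)           # clamp to the motor's range
    decay = 5.0 + 20.0 * material_hardness  # harder material -> snappier decay
    return [peak * math.exp(-decay * (i / steps) * duration)
            for i in range(steps)]

# A hard metal clang vs. a soft rubber thud, at the same impact force:
print(rumble_envelope(0.8, material_hardness=1.0)[:5])
print(rumble_envelope(0.8, material_hardness=0.1)[:5])
```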

VR Controllers
Left: the HaptX Glove, Right: the Tactical Haptics Reactive Grip Motion Controller

HaptX Inc. is one firm leading advances in this field with their HaptX Gloves, two Nintendo Power Glove-style offerings featuring tiny air pockets that dynamically expand and contract to provide simulated touch and pressure in VR in real time. All reports indicate some truly impressive tech demos, though perhaps at the cost of form factor — the hardware involved looks heavy-duty, and removing the glove would appear to be several degrees more difficult than setting down a Vive Wand, by contrast.

Theirs strikes me as a specialty solution, perhaps more suited to location-based VR or commercial/industrial applications. (Hypothetical: would a Wand/Touch-like controller with this type of actuator built into the grips provide any UX benefit at the consumer level?) Meanwhile, Tactical Haptics is exploring this tech through a different lens, using a series of sliding plates and ballasts in their Reactive Grip Motion Controller, which tries to simulate some of the physical forces and resistance one feels wielding objects with mass in meatspace. This is perhaps a more practical haptics approach for consumer adoption — they’re still simple controllers, but the added illusion of physical force could be a truly compelling XRI mechanic (for more, check out their white paper on the tech).

Hand-Tracking

Who needs a controller? For some XR applications, the optimal UX will take advantage of the same built-in implements with which humans have explored the material world for thousands of years: their hands.

Tracking a user’s hands in real time with 27 degrees of freedom (four per finger, five in the thumb, six in the wrist), absent any handheld implement, allows them to interact with physical objects in their environment as one normally would (useful in MR contexts) — or to interact with virtual assets and UI in a more natural, frictionless and immersive way than, say, pulling a trigger on a controller.
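
To make that arithmetic concrete, here is a minimal Python sketch of how such a 27-DoF hand pose might be structured as data. The field names are illustrative only and don’t correspond to any particular SDK.

```python
from dataclasses import dataclass, field

@dataclass
class FingerPose:
    # Four degrees of freedom per finger: one flex angle per joint
    # (MCP, PIP, DIP) plus side-to-side abduction at the knuckle.
    mcp_flex: float = 0.0
    pip_flex: float = 0.0
    dip_flex: float = 0.0
    abduction: float = 0.0

@dataclass
class ThumbPose(FingerPose):
    # The thumb's extra opposition axis brings it to five DoF.
    opposition: float = 0.0

@dataclass
class HandPose:
    # Six DoF at the wrist: 3D position plus 3D orientation.
    wrist_position: tuple = (0.0, 0.0, 0.0)
    wrist_rotation: tuple = (0.0, 0.0, 0.0)
    thumb: ThumbPose = field(default_factory=ThumbPose)
    fingers: list = field(default_factory=lambda: [FingerPose() for _ in range(4)])

# 4 fingers x 4 + thumb x 5 + wrist x 6 = 27 degrees of freedom
pose = HandPose()
print(4 * 4 + 5 + 6)  # 27
```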

And of course, I defy you to test such software without immediately making rude gestures with it.

Pricier AR/MR rigs like Microsoft’s HoloLens have hand-tracking technology baked in — though reliability, field of view and latency vary. However, most popular VR headsets on the market don’t offer this integration natively thus far. Thankfully, the Leap Motion hand-tracking sensor, available as a desktop peripheral for years, is being retrofitted by XR developers with compelling results. For additional reading, and to see some UX possibilities in action, I’d recommend checking out this great series by Leap Motion designer Martin Schubert.

These hand-eye interaction patterns have been entrenched in our brains over thousands of years of evolution and (for most of us) decades of first-hand experience. This makes them feel real and natural in XR. As drawbacks go, the device adds yet another USB peripheral and extension cable to my life (surely I will drown in a sea of them), and there are still field of view and reliability issues. But as the technology improves, this set of interactions works so well that it can’t help but become an integral piece of XRI. To allow for the broadest range of use cases, I’d argue that all advanced/future XR HMDs need to feature hand-tracking natively (though optionally, per application, of course).

Interestingly enough, the upcoming Vive Pro features dual forward-facing cameras in addition to its beefed-up pixel density. Vive has now confirmed that hand-tracking can be done using these. Developers and designers would do well to start grokking XR hand-tracking principles now.

Eye-Tracking

Though the state of the art has advanced, too much of XRI has been relegated to holographic panels attached at the wrist. While this is no doubt an extremely useful practice, endless new possibilities for UI and gameplay mechanics emerge once you add high-quality, low-latency eye tracking to any HMD-relative heads-up display UI and/or any XR environment beyond it.

Imagine browsing menus more effortlessly than ever, using only your eyes to make selections, or targeting distant enemies better in shooters. Consider also the effects of eye-tracking in multiplayer VR and the possibilities that unlocks. Once combined with 3D photogrammetry scans of users’ faces or hyper-expressive 3D avatars, we’ll be looking at real-time, photorealistic telepresence in XR spaces (if you’re into that sort of thing).
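
One of the simplest gaze-selection patterns is dwell-based selection: a per-frame timer that “clicks” once the eyes have rested on a target long enough. A minimal Python sketch under that assumption, with the gaze raycast and frame loop assumed to exist elsewhere:

```python
DWELL_TIME = 0.8  # seconds of sustained gaze required to select

class DwellSelector:
    def __init__(self, dwell_time=DWELL_TIME):
        self.dwell_time = dwell_time
        self.current_target = None
        self.elapsed = 0.0

    def update(self, gazed_target, dt):
        """Call once per frame with whatever the gaze ray currently hits."""
        if gazed_target is not self.current_target:
            self.current_target = gazed_target   # gaze moved: restart timer
            self.elapsed = 0.0
            return None
        if gazed_target is None:
            return None
        self.elapsed += dt
        if self.elapsed >= self.dwell_time:
            self.elapsed = 0.0                   # timer restarts; holding gaze re-fires
            return gazed_target                  # fire a selection event
        return None

selector = DwellSelector()
# e.g. inside the render loop: selected = selector.update(hit_object, dt)
```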

Wrist-Mounted UI
Wrist-mounted UI has proliferated in XR — but only goes so far. Eye-tracking will usher in many HMD-relative UI possibilities.

Eye-tracking isn’t just promising as an input mechanism. This tech will also allow hardware and software developers to utilise a technique called foveated rendering. Basically, the human eye only sees sharply near the very center of your gaze — things get more blurred further out into your visual periphery. Foveated rendering takes advantage of this wetware limitation by precisely tracking the position of your eyes from frame to frame and rendering whatever you’re looking at in full detail on (theoretically) higher-resolution screens. Simultaneously, the quality of everything you’re not looking directly at is downgraded — which you won’t notice, because your pathetic human eyes literally can’t. This will allow for more XR on lower-powered systems, and will allow high-end systems to stretch possibilities even further with higher-resolution screens.
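
At its core this reduces to a radial falloff: shading quality as a function of angular distance from the gaze point. A toy Python sketch of the idea; the tier thresholds are invented for illustration, and real pipelines do this per-region on the GPU:

```python
import math

def shading_rate(pixel_angle_deg):
    """Pick a quality tier from a pixel's angular distance to the gaze point.

    1.0 = full resolution, 0.25 = quarter resolution. The fovea covers only
    a couple of degrees, so quality can fall off quickly without being seen.
    """
    if pixel_angle_deg < 5.0:
        return 1.0    # foveal region: full quality
    elif pixel_angle_deg < 15.0:
        return 0.5    # near periphery: half resolution
    else:
        return 0.25   # far periphery: quarter resolution

def angle_from_gaze(px, py, gaze_x, gaze_y, deg_per_pixel=0.05):
    # Rough small-angle conversion from screen distance to visual angle.
    return math.hypot(px - gaze_x, py - gaze_y) * deg_per_pixel

# A pixel 400px from the gaze point sits ~20 degrees out: quarter-res shading.
print(shading_rate(angle_from_gaze(1200, 600, 800, 600)))
```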

Tobii & HTC Vive
Tobii’s eye-tracking technology embedded in a custom Vive

While Oculus and Google have acquired eye-tracking companies in recent years, the current industry leader appears to be Tobii. Their CES demos were reportedly extremely impressive; but considering they retrofit a new Vive for each devkit, their solution is not mass-market at this point — and likely pricey, since you have to seek approval to even receive a quote. Still, the potential benefits of eye-tracking for XRI are so great, surely we’ll see native adoption of this tech by major HMD manufacturers in coming hardware generations (hopefully through a licensing deal with Tobii).

Voice & Natural Language Processing

As the trend of exploding Alexa use has taught us, many users love interacting with technology using their voices. Frankly, the tech to implement keyword and phrase recognition at relatively low cost is already there for developers to utilise — it’s officially low-hanging fruit in 2018.

On the local processing side, Windows 10 voice recognition tech runs on any PC with that OS — though it currently fares better with shorter keywords and a low confidence threshold. (Check out this great tutorial for Unity implementation on Lightbuzz.com.) Alternatively, you can offload more complex phrases and vocal data to powerful, highly-optimized Google or Amazon processing centers. At their most basic, these services transform vocal data into string values you can store and program logic against — but certainly many other kinds of analyses of, and programmatic responses to, the human voice are possible through the lens of machine learning: stress signals, emotional cues, sentiment evaluation, behavior anticipation, etc.
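
Once a recogniser hands you a string, the developer-facing part really can be that simple: map phrases to callbacks and gate them on confidence. An engine-agnostic Python sketch with the speech recogniser itself stubbed out (in Unity this role is played by something like KeywordRecognizer, or by a cloud speech-to-text result):

```python
class VoiceCommandRouter:
    """Map recognised phrases to actions; the recogniser feeds us strings."""

    def __init__(self, confidence_threshold=0.6):
        self.confidence_threshold = confidence_threshold
        self.commands = {}

    def register(self, phrase, callback):
        self.commands[phrase.lower()] = callback

    def on_phrase_recognised(self, phrase, confidence):
        # Low-confidence hits are ignored rather than misfired.
        if confidence < self.confidence_threshold:
            return
        action = self.commands.get(phrase.lower())
        if action:
            action()

router = VoiceCommandRouter()
router.register("computer, fire", lambda: print("firing main cannons!"))
router.on_phrase_recognised("Computer, fire", confidence=0.9)
```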

At the OS/always-on level, some Alexa-like voice-controlled task rabbit has to be in the pipeline (Rift Core 2.0 already gives me access to my Windows desktop, and therefore Cortana) — that’s assuming Amazon’s automated assistant doesn’t grace the XR app stores herself. At the individual app level, this powerful input may be the most widely available yet underutilised in XR (though for the record, I do see it as primarily an optional mechanic, not one that should be required for many experiences). When I’m dashing starboard to take on space pirates in From Other Suns, I want to be able to yell “Computer, fire!” so badly — this would be so pure. In Fallout 4 VR, I want to yell “Go!” and point to exactly where Dogmeat should run (I pulled this off with my buddy BB-8 in a recent project). Developers and designers should look for more chances to use voice recognition as the implementation costs continue to fall.

Brain-Computer Input

Will we eventually arrive at a point where the most human of inputs — our physical and vocal communications — are no longer necessary to order each and every task? Can we interact with a computer using our minds alone? Proponents of a new generation of brain-computer interfaces (BCIs) say yes.

At a high level, the current generation of such technology exists as helmet- or headband-like devices that generally use safe and portable electroencephalography (EEG) sensors to monitor various brain waves. These sensors typically output floating point values per type of wave tracked, and developers can program different responses to that data as they please.
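
In other words, the developer-facing data is just a handful of floats per frame, one per frequency band, to be smoothed and thresholded like any other noisy signal. A hypothetical Python sketch; the band names are standard EEG terminology, but the “focus” heuristic is purely illustrative and not any vendor’s API:

```python
class BrainwaveInput:
    """Smooth per-band EEG power values and expose a simple 'focus' signal."""

    BANDS = ("delta", "theta", "alpha", "beta", "gamma")

    def __init__(self, smoothing=0.9):
        self.smoothing = smoothing
        self.levels = {band: 0.0 for band in self.BANDS}

    def update(self, raw_sample):
        # Exponential moving average tames frame-to-frame sensor noise.
        for band in self.BANDS:
            old = self.levels[band]
            self.levels[band] = self.smoothing * old + (1 - self.smoothing) * raw_sample[band]

    def focus_level(self):
        # One common heuristic: beta power relative to alpha suggests focus.
        alpha = self.levels["alpha"] or 1e-6
        return self.levels["beta"] / alpha

bci = BrainwaveInput()
bci.update({"delta": 0.1, "theta": 0.2, "alpha": 0.3, "beta": 0.6, "gamma": 0.1})
if bci.focus_level() > 1.5:
    print("user is concentrating: advance the meditation scene")
```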

Neurable HTC Vive
Neurable’s Vive integration

Though studied for decades, this technology has not yet reached maturity. The major caveat right now is that a given person’s ability to project and/or manipulate the specific brainwaves accurately (as tracked by each device’s array of EEG sensors) will vary, and can sometimes require lots of calibration and practice.

Still, recent advances appear promising. Neurable is perhaps the leader in integrating an array of EEG and other BCI sensors with a Vive VR headset. On the content side, the Midwest US-based StoryUp XR is using another BCI, the Muse, to drive a mobile VR app with the users’ “positivity,” which they say corresponds to a particular brainwave picked up by the headset that users can learn to manipulate. StoryUp, who are part of the inaugural Women In XR Fund cohort, hope to bring these kinds of therapeutic and meditative XR experiences to deployed military, combat veterans and the general public using BCI interfaces as both a critical input and a monitor of user progress.

It will likely be decades before you’re able to dictate an email via inner monologue or directly drive a cursor with your thoughts — and who knows whether such sensitive operations will even be possible without invasive surgery to hack directly into the wetware. (Yes, that was a fun and terrifying sentence to write). I would wager, however, that an eye-tracking-based cursor combined with “click” or “select” actions driven by an external BCI will become possible within a few hardware generations, and may well end up being the fastest, most natural input in the world.

Machine Learning

Imagine an AI-powered XR OS a decade from now: one that can utilise and analyse all the above inputs, divining user intent and taking action on their behalf. One that, if unsure of itself, can seek clarification in natural language or in a hundred other ways. It could acquire your likes and dislikes through experience and observation as easily as you might a new friend’s, constructing a model of your overall XR interaction preferences — with the AI itself, with other humans, and with the virtual realities you visit and the physical ones you augment. Such a system will, at the very least, be able to model and emulate human social graces and friendship.

Any such system will also have unparalleled access to your most sensitive personal and biometric data. The security, privacy and ethical concerns involved will be enormous and should be given all due consideration. In his talk on XR UX at Unity HQ last fall, Unity Labs designer and developer Dylan Urquidi said he sees blockchain technology as a possible medium for context-aware, OS-level storage of these kinds of permissions or preferences. This would allow ultimate ownership of, and decision-making power over, this data to remain with the user, who could allow or deny access to individual applications and subsystems as desired.

I’m currently working on a VR mechanic using a neural net trained from Google QuickDraw data to recognize basic shapes drawn with Leap Motion hand-tracking — check out my next piece for more.
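
As a taste of what that involves, the gist of the preprocessing is to flatten the 3D fingertip trail to 2D and rasterise it into the small bitmap a QuickDraw-style classifier expects. A rough Python sketch under those assumptions, with the classifier itself left out:

```python
import numpy as np

def stroke_to_bitmap(points_3d, size=28):
    """Flatten a 3D fingertip trail into the 28x28 bitmap a QuickDraw-style
    classifier expects. Assumes the user draws roughly on a vertical plane,
    so the depth axis can simply be dropped."""
    pts = np.array(points_3d, dtype=float)[:, :2]   # drop z
    pts -= pts.min(axis=0)                          # translate to origin
    scale = pts.max() or 1.0
    pts = pts / scale * (size - 1)                  # fit the drawing box
    bitmap = np.zeros((size, size))
    for x, y in pts.astype(int):
        bitmap[size - 1 - y, x] = 1.0               # flip y: image rows go down
    return bitmap

# A diagonal swipe in hand-tracking space becomes a diagonal line of pixels;
# the bitmap would then be fed to a classifier trained on QuickDraw rasters.
stroke = [(0.0, 0.0, 0.4), (0.1, 0.1, 0.4), (0.2, 0.2, 0.41)]
print(stroke_to_bitmap(stroke).sum())  # pixels set along the stroke
```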

Machine learning is likely the most important yet least understood technology coming to XR and computing at large. It’s on designers and developers to educate themselves and the public on how they’re leveraging these technologies and their users’ data safely and responsibly. For myself, machine learning is the first problem domain I’ve encountered in programming where I don’t grok all the mathematics involved.

As such, I’m currently digging through applied linear algebra coursework and Andrew Ng’s great machine learning class on Coursera.org in an effort to better understand this most arcane frontier (look out for my next piece, where I’ll apply some of these concepts and train a neural net to identify shapes drawn in VR spaces). While I’m not ready to write the obituary for the QWERTY keyboard just yet, these advances make it clear that, in terms of XRI, the times are a-changin’.

http://ift.tt/2CHjVva Source: https://www.vrfocus.com



Introducing the Workshop Leaders of the VR Diversity Initiative 2018 VR Kick-Off!

The VR Diversity Initiative 2018 VR Kick-Off! will be the first of six events this year. Sign-ups are now closed and 23 participants have been selected to take part. They will be learning how to create a rough virtual reality (VR) prototype for the HTC Vive and Oculus Rift on VR-ready laptops. Experienced VR workshop leaders will be helping participants along the way, teaching them the basic skills needed to get started in VR, ranging from 360 film workflows to creating an interactive Unity project. Listed below are the workshop leaders taking part in the VR Diversity Initiative 2018 VR Kick-Off! event.

Full-Day Workshop Leaders

Kyaw Tun Sein, Co-Founder and CTO

Ikigai Factory

Kyaw Tun Sein

After participating in many VR game jams, Kyaw started Ikigai Factory to develop useful tools powered by VR, like REVR (a mobile virtual tour creation app). His mission with VR is to shrink our world much, much further, to the point where we say farewell to the limits of physical distance.

Kyaw is from a diverse background himself: his background was in industrial design, not software or game development. He is also from Myanmar and currently based in London as an entrepreneur. He is very excited about VRDI because he will be able to welcome other diverse people to VR.

Sam Perrin, Founder

Virtual Vault

Sam Perrin

A self-taught game designer and programmer, Sam Perrin studied game development during a lengthy hospital recovery following a car accident in 2012. Since then he has gone on to make games and apps that have been downloaded and enjoyed across the world, and is director of Virtual Vault. Alongside his development work, Sam enjoys mentoring new developers as they hone their skills in the industry.

I’ve loved VR from the first moment I put on a headset. The power and potential of the medium never ceases to amaze me, not just for gaming but for healthcare, education and for allowing those with disabilities and reduced mobility to experience things they may never have thought possible. I feel the inclusive nature of the medium must be matched by the diversity of its developers; we are building new worlds in virtual space and everyone should have a chance to be included in their creation.

Half-Day Workshop Leaders

Zayne Beatham, VFX Specialist and Post Production in VR/AR/MR

Jaunt

Zayne Beatham
Having studied programming, I soon discovered the world of VFX. Not having enough money to pay for a VFX course, I worked 3 1/2 years at a hospital in engineering to save up enough money to study VFX at the Met Film School.
After working in the industry for a few years, my gaming passion won out, and I started playing around with game development; this led to my involvement in VR and AR development.
From start to finish, I’ll take you through the process of creating a VR short film: showing how to shoot on a 360 camera, going through stitching and stabilisation, adding a title and basic GFX, and finally rendering out your final piece in the best formats, touching upon software such as Adobe After Effects, Premiere Pro and Insta360 Stitcher.
Jaunt does not just talk about believing in diversity, we act on it as a business. We are producers and distributors of new media and are conscious of the rich diversity with which those stories are being told. Through our content and products, our aim is to create an inclusive culture because our products are created for everyone and therefore should be built and experienced by everyone.

Sitara Shefta, Senior Producer

Dream Reality Interactive

Sitara Shefta

I’m a Senior Producer who’s been developing games for over six years. Previously, I worked at EA on the Need for Speed franchise and at Sumo Digital on titles including LittleBigPlanet 3, Sonic Dash 2 and Snake Pass. In 2016, I was awarded the Women in Games Hall of Fame award and last year, I was featured in Develop’s 30 under 30. From organising the team to running creative reviews, at DRI I use my Production skills to drive the team to create compelling VR and AR experiences.
I’m supporting the VR Diversity Initiative as I believe it’s a great opportunity for people from all walks of life to come together in such a creative way. Growing up, I never saw games and VR as a career option and I think it’s important to vocalise that this is an opportunity for everyone, regardless of your background. So I’m happy to be supporting the initiative as I’m excited to be part of our industry growing to be as inclusive and diverse as possible.

Laura Dodds, Senior Artist

Dream Reality Interactive

Laura Dodds
In my spare time, I am an indie developer and designer on small games that have gone on to be showcased at EGX, Develop, Rezzed, PC Gamer Weekender and FMX.  I studied Film and Television at Warwick and did an MA in Games Design and Development at the National Film and Television School.
I am a Senior Artist at DRI, a role which includes a wide range of responsibilities, from art direction and concepting to creating assets and implementing them in the game engine.
I am keen to be involved in the VR Diversity Initiative because I think it is important that audiovisual media isn’t dominated by the voices of a select few. It took me a while to get into the games industry, as growing up I never saw it as a possibility for me. I think the VR Diversity Initiative is a great way to encourage people with a variety of backgrounds and talents to get involved.

Joel Herber, Senior Game Programmer

Dream Reality Interactive

Joel Herber

My greatest passion is to bring the worlds of stunning art and great tech together to make great experiences. I’m all about working on things with good stories to tell and inspired art styles that push the boundaries of what interactive entertainment can do. After studying Computer Science at Brighton University I quickly moved into creating video games for clients such as Nickelodeon, Sesame Street and the BBC. Now working at Dream Reality Interactive as a Senior Game Programmer, my job revolves around working with the rest of the team to plan feasibility and design, and to implement our game designs.
I wanted to get involved with the initiative because I believe that games, like any art-form, are only enriched by having a diverse range of creatives with different backgrounds and stories to tell.

 

The VR Diversity Initiative 2018 Kick-Off event will take place on 28th February 2018 at Hobs Studio, Here East, Unit 3, 3-4 East Bay Lane, London E20 3BS. Additional supporters of the VR Diversity Initiative include HTC Vive, Innovate UK, Oculus, Barclays Eagle Labs, BlueHire and the Realities Centre, with more announcements to follow.

http://ift.tt/2EONbSz Source: https://www.vrfocus.com



Guns’n’Stories: Bulletproof VR Nears Full Release With Act III Update

Indie studio MiroWin released its first virtual reality (VR) title Guns’n’Stories: Bulletproof VR in September 2017 on Steam Early Access for Oculus Rift and HTC Vive. That was followed in October by a major content update introducing ‘Act II’. In preparation for leaving Early Access, the studio has now launched ‘Act III’, introducing the final part of the campaign.

Guns n Stories Bulletproof VR Act3 screenshot 1

In the update MiroWin has added three new levels; two new weapons, the paired Tommy guns “Bella and Talula” and one the studio calls “Mrs. Jumbo”; a couple of new enemies; and a final boss called Harry. Additionally, there are three new trials — Warfare in the Saloon, Wall to Wall and Obstacle Race — and new gameplay mechanics in which players have to protect objects, protect allies and take part in Chase!, where the hero is teleported to a new location in town in the final round after each shootout.

Guns’n’Stories: Bulletproof VR has a sense of humour throughout, and revolves around an old bounty hunter telling his grandkid about his adventures. Just like any old story, it’s embellished to make it sound even more amazing and fantastical, and it’s these stories that players must complete. So players will come across steampunk-style enemies, flying drones, cowboys riding Segways and more.

Guns n Stories Bulletproof VR Act3 screenshot 2

In a hands-on preview of Guns ‘n’ Stories: Bulletproof VR VRFocus said: “Mention wave shooters to a VR player and they’ll probably roll their eyes. Titles in this genre need to have something special about them to even be worthwhile. Secret Location’s Blasters of the Universe had it and Guns’n’Stories: Bulletproof VR looks like it might as well.”

MiroWin hasn’t yet said when Guns’n’Stories: Bulletproof VR will leave Early Access, simply teasing that further information will be released next week. When that happens, VRFocus will let you know all the latest details as they are announced.

http://ift.tt/2CdP9yb Source: https://www.vrfocus.com



Life In 360°: Beavers and Bagpipes

Nature is pretty wonderful, and it’s always a pleasure when there’s something a little different to show you here on Life In 360° that you may not have previously seen. What you have previously seen in this instance is the location and the people behind the video. We’re back in Scotland with the team from Bristol-based 360° natural history production house Biome Productions, who were responsible for the two-part series Wild Tour: The Cairngorms, which we featured on VRFocus at the tail end of last month and the beginning of this one.

Life In 360° / 360 Degree Video

This time we’re talking about our buck-toothed, wood-chomping friends: beavers. You may not think of beavers as a Scottish animal, being more inclined to link them with Canada, and you’d be right to think that way… well, taking only modern history into account. In actuality the UK was home to beavers until relatively recently in historical terms, and conservation groups have been working to reintroduce them around the UK — and in particular Scotland.

Now, as you can imagine, reintroducing a species that was native to the ecosystem but was effectively hunted to extinction in the country isn’t a simple process. Moreover, some five hundred years after the last beaver in Scotland was killed, what right do humans have to meddle with the ecology of the forest in such a direct way? Isn’t inserting an animal into the natural order in many ways just as destructive to the harmony of the forest as removing said animal five centuries ago? It’s certainly a controversial subject for many. To that end we’re off to Argyll in Scotland with Anthony De Unger to find out just what benefits there are to introducing wild Eurasian beavers to the British Isles. A species that just cuts down trees and clogs up water supplies with dams… doesn’t it?

“Scotland is infamous for its rain, even in the height of the summer. Our team and our unwaterproofed rigs can now confidently confirm that this is, in fact, true,” jokes the studio in their summary of the main challenges regarding the film. “The main challenge was certainly working around these deluges in what was already a tight shoot. The final shot of the film, the close-up of the wild beaver, was no easy feat. The crew met someone that’s been on the reserve weekly for the last five years and still hasn’t seen one! With the help of a local guide, they managed to tempt Milly the beaver up onto the bank with small pieces of apple next to our camera rig, whilst they were sitting in the car praying it doesn’t rain. Using some elegant continuous recording solutions, they managed to leave rigs recording for four hours to make absolutely sure they filmed this key moment.”

No animals were harmed during the making of this film. That said, ten apples were apparently consumed by beavers. You can check out the video below, and VRFocus will be back with another example of 360-degree video on Monday at the usual time.

 

http://ift.tt/2CEkbL2 Source: https://www.vrfocus.com



Microsoft Gains New German Partners for Mixed Reality Partner Program

Four new firms have joined the Microsoft Mixed Reality Partner Program in Germany. All four companies are specialists in their field and will be able to take advantage of hands-on training and technical support from Microsoft.

The four firms are Data Experts, Medialesson, Reflekt and Viscopic, who will strengthen the German-speaking mixed reality partner network in Europe.

WorkLink - HoloLens

Data Experts specialise in the development of interactive content, including cross-device experiences. “We believe Mixed Reality is the way we will consume digital content in the future,” said Chris Papenfuß, team leader Holographic at Data Experts GmbH. “Microsoft offers Microsoft HoloLens, the leading technology in the field of mixed reality. By participating in the Partner Program, we want to benefit from having direct access to Microsoft’s know-how and to promote our own solutions.”

Medialesson has been working on human-machine interaction for over 15 years, with particular emphasis on usability and user experience. One of the company’s flagship projects is a collaboration with Porsche to utilise mixed reality in vehicle design.

Reflekt have been working with augmented reality (AR) for five years, concentrating on remote site support and interactive user manuals and training.

“Mixed Reality is still a new platform and the direct exchange with the manufacturer is very important,” says Wolfgang Stelzle, CEO & Founder of Reflekt GmbH. “We are now able to link our products even more closely to Microsoft HoloLens by using the knowledge of our Microsoft colleagues.”

Viscopic have also been working with AR technology, offering advice and workshops on how the technology can be utilised through prototype development and application case evaluation, including industrial training.

“The demand for Mixed Reality applications is constantly increasing, and we are increasingly asked if certain use cases are feasible,” says Marco Maier, Co-Founder and Head of Business Development at Viscopic. “With the Microsoft HoloLens we can now implement mixed reality applications for companies that were previously difficult to realize. We see great potential for growth in technology and want to work with Microsoft to promote mixed reality in the industrial market.”

Further news about the Microsoft Mixed Reality Partner program will be here on VRFocus.

http://ift.tt/2EVhzOr Source: https://www.vrfocus.com



Los Angeles Architects Look To Cut Down Revision Costs With New Virtual Reality Counterpart

We’ve often seen at VRFocus that, to many, virtual reality’s (VR) be-all and end-all will be videogames. Yet the majority of the news we carry is about what VR and other immersive technologies, such as augmented reality (AR) and mixed reality (MR), can bring to a variety of industries.

Architecture Virtual Reality (AVR)

For younger users it can be a great source of education, used to bring topics such as science or history to life; a good example of the former being the various lesson packs provided by the team at MEL Science. In professional fields it is being used for purposes as varied as reconstituting archaeological treasures and training doctors, surgeons and other medical staff in procedures and treatments. It’s used to make interactive movies, lets you watch existing movies in a new way, and shows you the world as you have not seen it before.

Another field we are talking about increasingly is that of architecture and design, and that particular part of the immersive industry has a new name to take notice of: AVR Studio.

The company in question is a new creation of IR Architects, an existing Los Angeles-based firm specialising in luxury architecture, with AVR — standing for Architectural Virtual Reality — today announced as its new sister company. AVR, which will be staffed by a number of seasoned professional VR developers and architects, will look to develop virtual walkthroughs of properties at various stages of development.

It comes after much speculation and development within the real estate industry, such as Matterport teaming up with Truss to make a VR real estate viewer, at a time when demand for virtual house tours is on the rise. A recent US-based poll by Coldwell Banker Real Estate revealed that 77% of the 3,000 adults surveyed were interested in a VR tour of their potential future home.

Targeting the same luxury end of the market as its sister company, AVR Studio will, according to the firm, provide a service that helps ease many of the issues that come with a new house. Thanks to its role as a ‘virtual blueprint’, it is believed it will help with everything from planning approvals to tweaking elements of the design, from the house itself to the furnishings inside it.

architecture design building plan

“AVR Studio stems from the needs of IR Architects’ clientele, who required a better way to visualize their property beyond traditional renderings, in order to cut down on costly revisions. Clients, developers and financial investors can experience future properties at their leisure with a VR head-mounted display,” explains the company in a statement. “Viewers are immersed in a fully interactive three-dimensional environment that provides a virtual representation of the building as a whole, including every room and hallway, down to pieces of furniture within. Some of the highest-level VR models allow viewers to manoeuvre throughout the property and open doors, turn on the television, experience future poolside and city views.”

Founder of both IR Architects and AVR Studio, Ignacio Rodriguez, believes it is a necessary next step for the industry in the 21st century. “AVR Studio is going to radically change how clients design, review, and approve projects in the future. The challenge that many people feel when trying to understand two-dimensional architectural plans will no longer be an obstacle, as with VR, we are all speaking the universal language of volume and space.”

You can see a trailer for the company below. VRFocus will bring you more news on the developments taking place over the months to come.

 

 

http://ift.tt/2opDL8V Source: https://www.vrfocus.com



