9 Augmented Reality Trends to Watch in 2020: The Future is Here

On the eve of 2020, we decided to summarize the achievements of #augmentedreality technology in the outgoing year, 2019.

9 Augmented Reality Trends to Watch in 2020

Trend #1: Mobile AR: Apple announced ARKit 3.0, Google’s ARCore is rapidly growing its installed base

The 2017 introduction of Apple’s ARKit and Google’s ARCore software development kits (SDKs) standardized the development tools and democratized mobile AR app creation, more than doubling the number of mobile AR-enabled devices and tripling the number of active users within a year and a half. Having brought AR to the mass audience of mobile users, Apple secured its AR market leadership by unveiling ARKit 2.0 at WWDC 2018 and then ARKit 3.0 at WWDC 2019. Technologically, these advances put mobile AR on par with headset-based AR, if not ahead of it. ARKit still holds a significant lead over ARCore, but the latter has grown almost tenfold in absolute terms. The installed base of ARCore-compatible Android devices grew from 250 million devices in December 2018 to 400 million in May 2019.

Major mobile device manufacturers anticipate that they’ll see brisk improvements in adoption figures as new phones continue to hit the mobile market.

Trend #2: Augmented Reality as a novel way of shopping

According to a Gartner report, at least 100 million users were expected to use AR-enabled shopping technologies in 2019, making it one of the hottest retail trends of the year. The boom in mobile devices that employ AR means the sector is now occupied by robust and mature technologies. Developers, retailers, and customers are now comfortably using them as part of their daily experience. A BRP report indicated that 48% of consumers said they’d be more likely to buy from a retailer that provided AR experiences. Unfortunately, only 15% of retailers currently put AR to use, and just a further 32% stated they plan to deploy virtual or augmented reality applications over the next three years.

Several companies have gotten out in front of consumer demand for AR shopping. American Apparel, Uniqlo and Lacoste have deployed showrooms and fitting rooms that provide try-before-you-buy options in augmented reality spaces. Smart mirror technologies that scan RFID tags also offer the ability to bring recommendations to the brick-and-mortar shopping experience. IKEA customers have access to an app that permits them to point their phones at spaces and see what different products would look like in their own homes.

Makeup, fashion and lifestyle brands all stand to gain significant appeal with customers by using technologies that handle facial recognition, adapt to local lighting conditions and provide personalized advice. Virtual assistants will also significantly change the shopping experience.

Trend #3: AR for navigation solutions

One of the most obvious use cases for AR technologies is indoor navigation, and 2019 was expected to be the year the average consumer got a first real taste of its potential. People already lean heavily on map services from both Google and Apple to get around outside, but indoor navigation stands to be the use case that blows the public away. ARKit- and ARCore-based applications for indoor navigation can provide directions in airports, malls, hospitals, and office campuses. Gatwick Airport has already deployed its own smartphone solution that provides routes to terminals and gates based on a user’s flight number.

In August 2019, Google launched a beta of its augmented reality walking directions feature for Google Maps, available to all AR-compatible iOS and Android mobile devices. Users can simply whip out their phones, point their cameras and see information about surrounding features in real time. Google’s software is likely to move beyond the smartphone space and include integration with smart glasses.

Working from an installed base of maps users, AR-powered navigation is expected to move into new territory.

Trend #4: AR-powered solutions for the enterprise

Smart glasses are currently at a stage where consumer solutions are likely a few years off. Military, medical and enterprise solutions, however, are beginning to prove the value of combining AR with headsets and smart glasses.

One of the major current headwinds for AR is battery life. Announced in February 2019, Microsoft HoloLens 2 was likely the most anticipated product in this space in 2019. The company hoped to roll out its technology to great fanfare by demonstrating improvements in raw processing power, battery life, and wearability. The U.S. Army has awarded a $480 million contract to Microsoft, and the company is also working with the industrial IoT firm PTC to streamline the development of both augmented and mixed reality products.

A Forrester report estimates that 14 million American workers will use smart glasses regularly on the job by 2025. Industry 4.0 applications that integrate AR are expected to be a strong driver of adoption. Companies plan to streamline processes like training and to provide self-help to workers in the field with AR overlays that deliver information from manuals.

Walmart and Tyson are piloting programs that will move traditional training methods into mixed reality settings. Workers will have new ways to learn about compliance and safety issues by looking around mixed-reality environments and identifying problems in a way that’s practical and engaging. Integration with other recent workplace training trends, especially gamification, may compound the returns that AR and MR solutions generate. According to ABI Research, AR-based training in the enterprise will be a $6 billion industry by 2022.

Improvements in prototyping, testing, troubleshooting, and quality control are expected to emerge from this trend, too, as workers will be able to make on-the-fly comparisons of real-world items against available documentation and specifications. Jobs that call for workers’ hands to be free will also benefit significantly from AR headsets and glasses.

Trend #5: Augmented Reality enhanced by Artificial Intelligence

Artificial intelligence and machine learning are fast-growing sectors in tech. Combining them with augmented and mixed reality systems is a natural extension of the tasks AI and ML are best suited to, particularly computer vision. Likewise, the ability to create human-machine processes that handle problems like disease diagnosis has immense potential to improve outcomes. 35% of sales on Amazon are derived from its recommendation engine, which leans heavily on data science and machine learning to deliver search results and match advertisers with customers. Moving out of the web browser and into the real world has immense commercial potential. By pairing consumer profiles with AR and ML, retailers can identify customer needs based on their environments and provide them with recommendations.

Point-and-shoot retail AR solutions will also be major drivers of innovation. A shopper in a store can get AI-based customer support while walking around. If they have questions about pricing, features or current offers, answers can be supplied by a chatbot based on natural language processing (NLP) technologies. Responses can even be tailored to the customer’s unique profile, allowing greater personalization on the fly.

Robust AI and ML solutions can be extended to the AR and MR spaces to provide value to everyday users of mobile devices.

Deloitte Research concludes that augmented reality and artificial intelligence will transform the traditional healthcare business model, offering AR/MR-enabled hands-free solutions and AI-based diagnostic tools.

Trend #6: WebAR

In the web space, Chrome AR was a highly anticipated product of 2019. Instead of needing to use specialized apps, users can simply log on to AR-enabled websites to access the same level of functionality. In order to foster adoption, an unofficial and unsupported version of the WebAR code is made available to developers on GitHub, too.

Mozilla is also engaged with WebAR and trying to bring AR solutions to Firefox. The goal is to make AR adoption significantly more friction-free by using the installed user bases of web browser audiences. Apple, Samsung and Microsoft web browser offerings are also rapidly adopting the WebAR standards.

Although these standards have yet to be established, the implementation of AR in browsers is under active development, by means of either porting existing libraries (e.g. AR.js) or developing new ones (e.g. A-Frame, React 360). 2020 is the year that WebAR may become available on virtually every up-to-date web browser in the world.
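To make the idea concrete, here is a minimal sketch of how a web page can feature-detect AR support using the WebXR Device API that these browser efforts are converging on. The function name is illustrative, and the snippet only assumes a browser that exposes navigator.xr; older setups would fall back to libraries like AR.js.

```typescript
// Minimal sketch: detect whether the current browser/device can run an
// immersive AR session via the WebXR Device API.
async function checkArSupport(): Promise<void> {
  const xr = (navigator as any).xr;
  if (!xr) {
    console.log("WebXR not available; fall back to a library such as AR.js.");
    return;
  }
  const supported: boolean = await xr.isSessionSupported("immersive-ar");
  console.log(supported ? "Immersive AR supported." : "No immersive AR support.");
}

checkArSupport();
```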

Trend #7: Collaboration and remote assistance via shared Augmented Reality

Collaborative efforts, such as conference calls, are often undermined by the lack of a direct personal presence. AR, however, can create mixed-reality settings where everyone on a conference call can see each other in a more socially conducive environment. Microsoft is moving forward with a beta of a video-calling system that employs augmented reality to create holographic-style representations of participants. Cisco Systems is also working on a project called Musion that brings together its networking products with AR technologies.

It is worth noting that adoption faces several headwinds, and neither consumer nor enterprise-grade products are currently on the market. In particular, the use of headsets and current costs have made commercial rollouts unappealing to manufacturers and potential customers.

AR-based remote assistance sessions are a use case that may promote innovation. A combination of WebRTC and AR makes it possible to conduct real-time maintenance work and troubleshooting. By leveraging concurrent data streaming, assistance providers can join more directly in the maintenance, configuration, and repair processes.
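As a rough illustration of that WebRTC-plus-AR pairing, the sketch below sets up a peer connection with a data channel that carries lightweight AR annotations alongside the incoming video feed. Signaling (exchanging offers, answers, and ICE candidates through your own server) is omitted, and the annotation payload format is hypothetical.

```typescript
// Sketch of the remote expert's side: video arrives over WebRTC while a
// data channel carries AR annotations (e.g. "highlight this part").
const peer = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});

// Separate, low-latency channel for annotation data.
const annotations = peer.createDataChannel("ar-annotations");

annotations.onopen = () => {
  // Hypothetical payload: anchor an arrow at a 3D point the expert tapped.
  annotations.send(JSON.stringify({ type: "arrow", position: [0.12, -0.05, 0.6] }));
};

// Show the field technician's camera feed as it streams in.
peer.ontrack = (event) => {
  const video = document.querySelector("video");
  if (video) video.srcObject = event.streams[0];
};
```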

Ford’s Ford X in-house incubator has been developing a spatial system that creates shared AR workspaces for its employees. It’s easy to see how companies that place a premium on agile development frameworks like Scrum could become early adopters in this sector. Shared AR spaces also seem like a natural fit for training sessions, conferences, and education.

As we’ve seen in the world of video gaming, collaborative AR experiences are perceived by users as highly engaging and worth sharing with others. The challenges are bringing down costs, making wearables more accessible and finding use cases that allow wider adoption. AR-based collaboration and remote assistance remain underdeveloped sectors that will call for continued hardware improvements going into the 2020s.

Trend #8: AR in the automotive industry

At CES 2019, a number of car manufacturers showed off on-the-road AR solutions. For example, the Genesis G80 uses a number of features to ensure accuracy, including tracking the driver’s line of sight so that holographic overlays are always in the right spot. Instead of having to look down at a GPS panel in the dashboard, the driver sees arrows on a heads-up display providing live directions. Porsche is also making major investments in similar technologies.

Heads-up displays have been a fixture in military aerospace for decades, but AR is only now beginning to bring that potential to the automotive world. Dashboard-mounted displays can project AR overlays into the driver’s line of sight on the windshield. Motorists can be alerted to hazards, provided with directions, and given warnings about traffic. On a more whimsical note, systems can also provide drivers and passengers with information about nearby landmarks and destinations. WayRay, a company based in Switzerland, is the current biggest player in this sector outside of the big auto companies themselves. It showed off the i-Cockpit 3D HUD in a brand-new Peugeot 208, and Porsche has put Series C funding into the company’s HUD technology.

Hyundai, in particular, has been a leader in AR research that goes beyond the cockpit-style view of the motorist’s experience. Starting in 2015, the company has been merging collaborative and assistance technologies with AR to breathe new life into maintenance manuals. Hyundai has apps that allow users to point their phones at their cars to get information. If you’re trying to figure out where the fuse panel is, for example, the app will highlight it on the screen. Mercedes has a similar app, but its version adds a chatbot to provide virtual assistance.

One major advantage of automotive AR is that many of the problems that are present in other use cases are easily overcome. Cars already have alternators to generate electrical power for use on the fly, largely eliminating battery concerns. Likewise, the windshield serves as a ready stand-in for cumbersome headsets. Multiple generations of drivers have also been acclimated to onscreen data and instructions by means of car-themed video games.

Trend #9: The market evolves and remains open to innovative business-driven solutions

2017 heralded the Cambrian explosion of the AR world: the introduction of easy-to-use software development kits from both Apple (ARKit) and Google (ARCore). Brands, development companies, agencies, and startups rapidly followed, taking advantage of their potential. ARKit 2 landed at WWDC 2018, with Apple introducing the USDZ format, which makes adding models, data and animations to AR landscapes simple. 3D object recognition, environment texturing and face tracking were also introduced.

2018 was also the year that the Magic Leap headset arrived. Shipped in August 2018, the headset represents a major step forward in consumer-grade products. Highly wearable, durable and adaptable, it is powered by Nvidia technology and priced in the $2,000 range. Selling points include 8 GB of memory, 128 GB of storage and USB-C charging, and Magic Leap says the battery sustains three hours of continuous use. With a growing developer program, Magic Leap will become a jumping-off point for many companies diving into the AR space to experiment with new ideas and use cases.

The so-called ARCloud also stands out as something to watch. The ARCloud is a concept built around cross-platform compatibility, persistence and sharing. It is intended to provide a seamless experience with the real world, too.

2019 did not turn out to be a year of revolutionary changes in augmented reality technology. However, evolutionary improvements in software and hardware, such as ARKit 3.0 and Apple’s A13 chip in the iPhone 11, will contribute to the technology’s maturity in 2020.

Future of Augmented Reality

Experts predict the AR/VR industry will exceed $25B by 2025, and the growth will continue steadily. That is the bright future of augmented reality, and it will be defined by investments from business domains that find its practical potential enticing. While gaming will remain dominant in terms of revenue, more practical industries, such as healthcare and engineering, are expected to pick up steam. In real estate and home improvement, use cases include interactive walkthroughs in mixed or virtual reality environments and instant delivery of information via mobile AR; the latter can also be used in travel. In the near future, we also expect the AR concept itself to evolve, with new software, hardware, and use cases emerging on the market.

 

Source:

9 Augmented Reality Trends to Watch in 2020: The Future is Here (r/augmentedreality)

Interactive Experience Worlds Are Conquering Hamburg

In Hamburg, ever more interactive and digital experience worlds are emerging, providing fresh variety in the tourism metropolis alongside the classic attractions. The newly created offerings rely on interaction, virtual reality, or edutainment.

A new trend is already apparent: compared with the classic attractions, passive spectating will no longer be enough in the future. Visitors are increasingly placed at the center of the action and get to interact with state-of-the-art technology, elaborate sets, or performers. The experience concept OPOLUM Adventures has existed since 2018 in the heart of the Speicherstadt and focuses on interaction in fictional worlds.

“We want to transport our visitors into other worlds in an entirely analog way and give them a mission,” says the 29-year-old managing director Christoph Freese. Upon entering the attraction, visitors find themselves directly in an imaginative world and are sent on an adventurous mission to save the world. As the operator of the Hamburg escape room TwistedRooms, Freese realized after a visit to an immersive theatre in London that the future of escape rooms lies not in puzzles but in immersive stories with interactive tasks. Freese adds: “We have done away entirely with brain teasers and complex puzzle logic. We therefore no longer speak of puzzles but of tasks that have to be completed within the story told by the actors.”

A new trend has been noticeable in the Hanseatic city for several years. Museums, exhibitions, and experience worlds are creating versatile new ways of involving visitors interactively. Alongside Hamburg’s many classic museums and exhibitions, two new edutainment formats have bet on innovation and technology. Discovery Dock Hamburg lets visitors become explorers themselves at various stations; anyone can become a crane operator in the harbor with the help of a VR headset. Panik City presents the life of Udo Lindenberg in entirely new ways: here, visitors can watch a live concert with a VR headset or paint liqueur pictures on touch screens.

OPOLUM Adventures deliberately does without virtual reality. “We combine analog forms of entertainment into a unique experience that surprises,” explains Freese. For the visitor, the experience arises from the skillful interplay of state-of-the-art technology, lovingly crafted film sets, interactive tasks, and passionate actors. “Attractions no longer work without modern technology these days. People are used to a lot and always want to see new technologies,” Freese reports. OPOLUM runs fully automated across 700 square meters and relies, among other things, on projection mapping, in which projectors illuminate only specific areas to create impressive illusions for the human eye.

New experience worlds will play an important role in tourism in the future, and state-of-the-art technology will be indispensable. Locals and tourists alike want to be at the center of the action, to be addressed personally, and to interact with their surroundings. Whether that happens with or without virtual reality is a matter of taste. The high diversity of experience concepts already shows that Hamburg is a pioneer in the tourism of the future.

 

Source:

https://www.presseportal.de/pm/140301/4488987

Oculus Go Price Cut, Sony Skips E3 & Win Eclipse! VRecap

It’s a relatively news-lite week given that we’ve just passed CES, but there’s still a few things to talk about.

Firstly, Facebook just gave the Oculus Go its first official price cut. The low-cost standalone headset now costs even… lower. You can get the 32GB model for $149 (or £139) and the 64GB for $199 (or £189)! That’s a great price but Go is really showing its 3DOF age in 2020. Is it still worth picking one up? Well only you can decide that (we’d probably say no, though).

Next up, Sony is skipping E3 for the second year in a row. Not massive VR news, no, but it does mean that we probably won’t see any PS5 news — and by extension PSVR2 news — during the event in June. Then again, we’re not expecting PSVR2 to launch alongside PS5 later this year, so it’s not the biggest loss. And besides, there will probably be a certain VR showcase to tide hungry fans over…

HINTS!

Anyway, our last story of the week is about SideQuest and how Crisis VRigade has been downloaded over 50,000 times on the platform. That’s a whole lot of downloads, but don’t forget the game is in beta and a free download. Still, congrats to Crisis! Now, where’s that sequel?

GIVEAWAY: Win A Free Copy Of Eclipse: Edge Of Light on SteamVR!

Release-wise there’s a smattering of VR treats this week. Eclipse: Edge of Light is finally out on PSVR and PC VR, and you can win a copy of it on SteamVR in our competition. Meanwhile, Fail Factory comes to Rift and Vive and you can watch the excellent Doctor Who: The Runaway on more headsets!

Finally, we asked you what the future holds for HTC, now that it has a ‘new vision’ for Vive. You gave us some really interesting talking points, which we rounded up in the video.

Okay, we’re off to cry about Cyberpunk 2077’s delay. We’ll see you next week!

The post Oculus Go Price Cut, Sony Skips E3 & Win Eclipse! VRecap appeared first on UploadVR.

Japan’s Coolest VR Arcades + New Resident Evil VR Games: The VR Culture Show #3 Coming Monday

Welcome back to The VR Culture Show!

For our third episode, due on Monday, we’re in Japan. Yup, the land of stunning scenery, bustling metropolises and robotic toilets. How could we venture over to the East and not bring a camera in search of every cool VR gadget and game we could find?

And, believe us, we found plenty. In this episode, we visit three different VR arcades to see some of the amazing, unique experiences they offer. First up we’re in Tokyo’s Ikebukuro district to visit Bandai Namco’s amazing Mazaria VR park. Located in the Sunshine City shopping mall, the dream-like center offers all kinds of amazing VR goodies, including games based on Dragon Quest and Pac-Man.

We also journey over the road to the Capcom Plaza, where two exclusive Resident Evil VR games are on offer. This is a rare look inside two location-based spin-offs that you can’t play in the West so, if you’re a fan of the series, you really won’t want to miss this.

Finally on the arcade front, we also head up to another Bandai Namco VR Zone, this time in the chilly northern city of Sapporo. There we find yet more legendary franchises have been virtualized, including Gundam and Evangelion.

Outside of arcades, we also got to hang out with Psychic VR Lab, an intriguing Tokyo-based company making a web-based VR/AR creation platform called Styly. The team took us on a whirlwind tour of some of its work across the Shibuya district, where it has plenty of cool AR experiences littering the streets and some strange VR apps too. We also attended the company’s New View Awards, which hosted a bunch more indie-made VR ideas.

So, when will this burst of exotic VR excitement land on your plate? Very soon; be back here at 10am PT on Monday, January 20 to be the first to watch our best episode yet.

Liked this episode of The VR Culture Show? Let us know! We’ll be back soon with another installment, so keep your eyes peeled.

The post Japan’s Coolest VR Arcades + New Resident Evil VR Games: The VR Culture Show #3 Coming Monday appeared first on UploadVR.

The Walking Dead: Saints And Sinners Is Another Physics-Driven, Super Gory Powerhouse For VR Gaming

I think it’s safe to say a lot of us have Walking Dead fatigue. What started as a groundbreaking comic book flourished into a promising TV show and a landmark episodic game. But, like a zombie that just won’t die, the series just sort of… keeps shuffling on.

Spin-off series, console tie-ins, AR games; nowhere is safe from the hunger of the undead. A series so refreshingly concerned with the humanity behind such a cataclysmic event soon succumbed to the phenomenon it generated, recycling the same tired tropes, time after time.

I mean, heck, there are even two Walking Dead VR games on the horizon (this one I played from Skydance and Onslaught from Survios), which is perhaps as embarrassingly unnecessary a piece of brand overlap as you’ll ever see. But I’ll say this as someone who parted ways with the series a long time ago: The Walking Dead: Saints and Sinners is the most promising extension of the franchise I’ve seen in years.

Their releases may be too close together to claim any source of inspiration, but Saints and Sinners, developed by Archangel studio Skydance Interactive, definitely graduates from the Boneworks school of VR design. While not every item in the game can be picked up and wielded against your brain-dead enemies, axes need to be swung with heft to make an impact, gun muzzles can be used to bash heads out of the way, and your hands are your first line of defense against the incessant gnashing of a Walker’s teeth.

Combat in Saints and Sinners, then, can be an uncomfortably personal, thrillingly grotesque, and intentionally messy affair. Like Survios’ upcoming The Walking Dead: Onslaught, great pride is taken in the stabbing of heads, though I note a sickeningly authentic feel to this approach. Knives, tools, and even serving spoons must be thrust into your enemy’s brains with intent, and successful blows then dislodged with queasy fiddling. No gory detail is spared either; at one point I take an axe to a zombified companion, only to accidentally split his chin in two, much to the disgust of the developers and PR representatives in the room.

High-powered assault rifles and handguns, meanwhile, tempt Rambo-style action but in practice need a much more considered touch. If you don’t hold a rifle at the end of the barrel, it’ll flail around with a rubbery consistency, but even if you do grab it with two hands you’ll need to prop it up higher than you’re used to, to help account for the weight you can’t physically feel. Skydance has clearly gone to great lengths to balance every weapon in the game, best seen in the measured reload animations, which are often specific to the gun you’re holding. They’re unique in their handling and yet streamlined just enough to be manageable, provided you keep your cool under pressure.

It’s an encouraging set of rules and restrictions, suggesting Saints and Sinners genuinely belongs to that most wince-inducing of buzzwordy labels: a ‘next-gen VR game’. And it’s not just the combat that makes that promise.

Structurally, Skydance says there’s a beefy campaign with 15+ hours of single-player action, complete with your standard assortment of stamina meters and crafting elements. Saints and Sinners is set to a moody backdrop of a zombie-infested New Orleans, a series of flooded roads connecting several explorable areas to a main hub environment. You venture out in search of supplies and essentials, meeting other survivors that designate side-missions with the lure of big rewards.


One woman I meet straight off the boat asks me to put her zombie husband out of his misery in return for a safe code. I could comply or, living up to the title, I’m told I could just kill her and take the code right now. Why wouldn’t you just do that? Well, there may be other rewards to gain from accepting the mission and you may want to play the path of the Saint; there are multiple endings depending on the choices you make.

Given the welcome, crunching impact of the combat and the generally impressive production values (New Orleans is convincingly dilapidated, and character models and performances are a step above your usual VR NPCs), I’m inclined to believe Skydance when it makes these lofty commitments. Saints and Sinners appears surprisingly comprehensive, almost as if the issues of short VR games with repetitive content were a distant memory. Granted, I haven’t played enough of the game to claim it will maintain this quality throughout, but it’s looking promising.

The Walking Dead may remain a dogged franchise with no end in sight, but Saints and Sinners looks to at least put its name to good use. VR has a lot of zombie-slaying ahead of it in 2020 but, from what I’ve seen, Saints and Sinners should be setting an early high bar.

The Walking Dead: Saints and Sinners comes to PC VR headsets on January 23rd. PSVR and Quest versions are set to follow later in the year.

The post The Walking Dead: Saints And Sinners Is Another Physics-Driven, Super Gory Powerhouse For VR Gaming appeared first on UploadVR.

Tea For God To Receive Oculus Quest Hand Tracking Support In Future Update

At the end of a change log for the latest build of Tea for God, developer Void Room stated that they are working on implementing the Oculus Quest’s controller-free hand tracking into the game in a future update.

The Oculus Quest received the experimental controller-free hand tracking feature in the latest V12 update. It is currently not on by default, and is only available to use officially in Oculus Home, Browser and TV. Once the update is no longer deemed experimental, developers will be able to submit updates to their Oculus Store apps to support hand tracking. In the meantime, however, many developers have created demonstration or proof-of-concept apps that use the controller-free hand tracking in various ways, and made the apps available for sideloading on the Quest.

As sideloaded apps are not available in the Oculus Store and not officially approved, they are not required to comply with Facebook’s content curation policies. This allows sideloaded apps and games, such as Tea for God, to implement the hand tracking SDK immediately, and potentially long before any official Oculus Store apps support the feature.

If Tea for God does receive hand tracking support, this would make it the first full game to implement hand tracking beyond just a basic proof-of-concept or demonstration-level implementation. Importantly, hand tracking won’t always be a viable input method for games – The Thrill of the Fight developer Ian Fitz recently stated the Quest’s hand tracking wouldn’t be ‘capable enough’ for boxing.

There is currently no indication on how long the implementation of the SDK might take, so we could be waiting a while. The developers also noted that additional tutorials might be required for hand tracking, as the player would need to learn specific gestures to perform in-game actions.

Tea for God also received an update a few days ago that adds a “persistence layer” to the game, allowing you to manage gear and unlock new items with “data blocks” that you collect throughout the game. You can read more about that update here.

Tea for God is not available on the Oculus Store, but it is one of the best apps available to sideload on the Oculus Quest. If you’re unfamiliar with sideloading, you can check out our how-to guide here.

The post Tea For God To Receive Oculus Quest Hand Tracking Support In Future Update appeared first on UploadVR.

‘Moon Rider’ is a WebVR Game That’s Quietly Amassed Thousands of Daily Players

Moon Rider is a free VR rhythm game built on the WebVR standard, which means it runs directly from a web browser rather than being downloaded and installed from a specific VR storefront. Its creators say the game has garnered thousands of daily players.

Launched in May 2019, Moon Rider is a relatively simple VR rhythm game, but its web-based foundation makes it as easy to play as visiting a website, and just as easy to share with others.

Want to see for yourself?

  • On Oculus Quest: just launch the browser and enter moonrider.xyz, then click the ‘Enter VR’ button at the bottom right.
  • On PC VR headsets: Launch a WebVR compatible browser (Firefox currently has the most frictionless support) then ready your VR headset by launching its base software (Oculus desktop software or SteamVR for most), then visit moonrider.xyz and click the ‘Enter VR’ button at the bottom right.

And… voilà! You’re playing.
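For the curious, here’s roughly what an ‘Enter VR’ button does under the hood. Moon Rider itself was built in the WebVR era (via A-Frame, covered below), so this sketch uses the newer WebXR API as a stand-in; the "#enter-vr" selector is illustrative, not the site’s actual markup.

```typescript
// Rough sketch of an 'Enter VR' button using the WebXR Device API.
const button = document.querySelector<HTMLButtonElement>("#enter-vr");

button?.addEventListener("click", async () => {
  const xr = (navigator as any).xr;
  if (!xr || !(await xr.isSessionSupported("immersive-vr"))) {
    alert("This browser/headset combination doesn't support immersive VR.");
    return;
  }
  // Session requests must come from a user gesture, hence the click handler.
  const session = await xr.requestSession("immersive-vr");
  session.addEventListener("end", () => console.log("Left VR, back to the 2D page."));
});
```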

Photo captured by Road to VR

It’s this web-like ease of access that’s the crux of WebVR (and its forthcoming successor, WebXR), and what’s allowed Moon Rider to organically reach a surprisingly large audience, says one of the game’s creators, Diego Marcos, who is also the founder of Supermedium, which built a browser specifically for leveraging WebVR content on VR headsets.

Marcos tells Road to VR that Moon Rider is seeing somewhere between 2,000 and 3,000 daily active users, with an average session duration of 45 to 60 minutes and 50% player retention. That likely makes Moon Rider the leading WebVR game to date by those metrics.

“It’s head and shoulders above anything else in the [WebVR/WebXR] space,” Marcos says.

Moon Rider is built on A-Frame (of which Marcos is a maintainer), a framework that makes it easier for web developers to build WebVR content (and WebXR, which brings AR into the mix).
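To give a feel for why A-Frame lowers the barrier for web developers, here’s a minimal sketch of a scene built with it. It assumes the A-Frame script from aframe.io is already loaded on the page; the entities and colors are purely illustrative.

```typescript
// A-Frame scenes are built from custom HTML elements; this constructs one
// programmatically. A-Frame adds its own 'Enter VR' button automatically.
const scene = document.createElement("a-scene");

const box = document.createElement("a-box");
box.setAttribute("position", "0 1.5 -3");
box.setAttribute("color", "#4CC3D9");

const floor = document.createElement("a-plane");
floor.setAttribute("rotation", "-90 0 0");
floor.setAttribute("width", "10");
floor.setAttribute("height", "10");
floor.setAttribute("color", "#7BC8A4");

scene.appendChild(box);
scene.appendChild(floor);
document.body.appendChild(scene);
```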

“The message we wanted to send with Moon Rider is that A-Frame and the Web are now ready to deliver compelling VR content with user reach,” he added.

Like Moon Rider, some other seriously impressive VR web content has also been built atop A-Frame, like Supercraft, a Google Blocks-like VR environment builder with seamless web sharing, and Mozilla Hubs, a web-based social VR chatroom that works across almost any headset, smartphone, or computer.

Moon Rider itself is open source, giving developers an opportunity to see how it was built and to use it as a foundation for their own web-based VR experiments.

The post ‘Moon Rider’ is a WebVR Game That’s Quietly Amassed Thousands of Daily Players appeared first on Road to VR.

Quest Update Adds Digital IPD Indicator

Oculus Quest includes an IPD slider on the bottom of the headset which adjusts the distance between the lenses for ideal alignment with the user’s eyes, which is important for visual comfort and clarity. Since launch however, there’s been no way to know the actual width of the IPD setting, leaving users to mostly guess if they’ve set it correctly. In a recent update, Oculus added a digital indicator which shows the current IPD position.

Update (January 17th, 2020): A recent update to Quest has added a digital IPD indicator which makes it easier to adjust the distance between the lenses as needed for each user.

Now when you slide the IPD adjustment on the bottom of the headset, a small pop-up with an eye icon and a number shows the current IPD setting in millimeters.

On one headset we tested, the indicator ranged from 59mm to 71mm, just slightly off from the officially claimed 58mm to 72mm IPD range. Considering the indicator only reads to the nearest millimeter, we’d chalk the difference up to a margin of error from mechanical tolerances and measuring precision.

The hardware necessary for measuring the IPD position was built into Quest at launch but not active until the recent update.  As Oculus is steadily adding features to the headset with updates on a rolling basis, it isn’t clear exactly when the feature was added, but we’ve spotted it as of ‘Version’ 12.0.0.226.469.188362039.

The original article—which speculated about the feature and discussed the importance of an IPD indicator as well as how you can measure your own IPD—continues below.

Original Article (October 24th, 2019): It’s only been a few months since the launch of Quest, but the headset has already seen a range of improvements added via software updates. Passthrough+, for instance, recently rolled out, greatly improving the headset’s passthrough view with lower latency and more accurate visuals. The headset is soon to get both hand-tracking and PC tethering capabilities too. A future update is also likely to add an IPD indicator to make it easier to set the correct distance between lenses to maximize visual fidelity and comfort.

Quest IPD Indicator

At Oculus Connect last month during a Quest demo, I noticed that upon adjusting the IPD slider, a small pop-up appeared over my view which specified the current IPD width in millimeters. The readout updated as I moved the slider, allowing me to easily dial into my known IPD of 64mm.

A digital readout like this is common among other headsets which have hardware IPD adjustments, but it doesn’t yet exist in the consumer Quest. This week an Oculus spokesperson confirmed to Road to VR that the company is considering adding the feature, though they didn’t have any concrete information on when we might see it.

How do we know that this is likely to come via a software update rather than a hardware tweak? Looking at teardown photos of Quest shows that the headset’s IPD slider already includes the necessary electronics to measure its setting. This would make sense, as ideally the headset should be aware of the IPD setting so that it can adjust the rendered image accordingly (as an offset IPD can impact the sense of scale that comes from 3D visuals).
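As a simplified illustration of why the headset wants to know the physical setting (this is not Oculus’ actual code), stereo rendering typically offsets each eye’s camera by half the IPD from the head’s center point; if that rendered separation doesn’t match the lenses and your eyes, scale and comfort suffer.

```typescript
// Illustrative only: map a measured IPD (in millimeters) to per-eye camera
// offsets along the head's local x axis, in meters.
type Vec3 = [number, number, number];

function eyeOffsets(ipdMillimeters: number): { left: Vec3; right: Vec3 } {
  const half = ipdMillimeters / 1000 / 2;
  return {
    left: [-half, 0, 0],
    right: [half, 0, 0],
  };
}

// A 64mm IPD places each eye camera 32mm from center.
console.log(eyeOffsets(64));
```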

Why an IPD Indicator is Important

Though it’s possible to move Quest’s IPD slider, without a readout of the current setting, users are mostly left to guess if they’ve got the setting right since there’s really nothing to compare it to. Manually setting your IPD with no means of calibration is really difficult; even if you get close (say, within 5mm of your actual IPD), this can still notably affect visual fidelity and comfort.


On Quest it is possible to get a calibration screen, but it’s a clunky process and still not a particularly reliable way to get the correct IPD setting. Not having a reliable means of determining and replicating the correct setting is especially a pain when passing the headset around to others; not only is it difficult to guess where to put the setting for each user, you’ll ultimately need to dial it back in for yourself when you’re done. With a readout you could easily return the IPD to a known value.

Measuring Your IPD

Actually determining your own IPD measurement is still a clunky process in itself, and one that isn’t likely to get easier until eye-tracking hardware finds its way into more headsets. Short of asking for a precise measurement next time you visit the eye-doctor, you can also use a ruler and mirror to measure your own IPD, or ask a friend to hold a ruler close to your eyes and measure the distance between the center of each pupil.


In headsets of the future we could expect automatic IPD measurement and adjustment which would make this all much more seamless.

The post Quest Update Adds Digital IPD Indicator appeared first on Road to VR.

Mojo is Making Contacts Smarter with Its Incredibly Tiny Microdisplay

Mojo Vision is a company that’s working to produce smart contact lenses, which it hopes in the near future will let users have a non-obtrusive display without needing to wear a pair of smart glasses.

CNET’s Scott Stein got a chance to go hands-on with a prototype at CES 2020 earlier this month, and although the company isn’t at the point just yet where it will insert the prototype tech into an unsuspecting journalist’s eyeballs, Mojo is adamant about heading in that direction; the team regularly wears the current smart contact lens prototype.

While Mojo maintains its contact lenses are still years away from getting squarely onto the eyeballs of consumers, Mojo is confident enough to say it’ll come sometime in this decade, something the company sees landing in the purview of optometrists so users can get their microdisplay-laden lenses tailored to fit their eyes.

Image courtesy Mojo Vision, via Fast Company

But just how ‘micro’ is that display supposed to be? Fast Company reports that Mojo’s latest and greatest squeezes 70,000 pixels into less than half a millimeter. Granted, that’s serving up a green monochrome microLED to the eye’s fovea, but it’s an impressive feat nonetheless.
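A quick back-of-the-envelope check of that figure, assuming a roughly square display about 0.48mm across (consistent with the ‘less than half a millimeter’ claim): 70,000 pixels works out to roughly 265 pixels per side, a pixel pitch of about 1.8 µm, or on the order of 14,000 pixels per inch.

```typescript
// Back-of-the-envelope density check (assumed ~0.48mm square active area).
const pixels = 70_000;
const sideMm = 0.48;

const pixelsPerSide = Math.sqrt(pixels);               // ≈ 265 pixels
const pitchMicrons = (sideMm / pixelsPerSide) * 1000;  // ≈ 1.8 µm pixel pitch
const ppi = (pixelsPerSide / sideMm) * 25.4;           // ≈ 14,000 pixels per inch

console.log({ pixelsPerSide, pitchMicrons, ppi });
```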

On its way to consumers, Mojo is first seeking FDA approval for its contacts as a medical device; the company says the device will display text, sense objects, track eye motion, and see in the dark to some degree, which is intended to help users with degenerative eye diseases.


Fast Company reports that Mojo integrates a thin solid-state battery within the lens, which is meant to last all day and be charged wirelessly in something similar to an AirPods case when not in use. The farther-reaching goal, however, is continuous charging via a thin, necklace-like device. All of this tiny tech, which will also include a radio for smartphone tethering, will be covered with a painted iris.

Image courtesy Mojo Vision, via CNET

Mojo also maintains that its upcoming version will have eye-tracking and some amount of computer vision—two elements that separate smart glasses from augmented reality glasses.

Smart glasses overlay simple information onto the user’s field of view, although that information doesn’t interact naturally with the environment. Augmented reality, which is designed to insert digital objects and information seamlessly into reality, requires accurate depth mapping and machine learning. That typically means more processing power, bigger batteries, more sensors, and larger optics for a wide enough field of view to be useful. Whether Mojo’s lenses will be able to do that remains to be seen, but it at least has a promising start as a basically invisible pair of smart glasses.

Whatever the case, it appears investors are buying into Mojo Vision’s vision. It has thus far garnered $108 million in venture capital investment, coming from the likes of Google’s Gradient Ventures, Stanford’s StartX fund, HP Tech Ventures, Motorola Solutions Venture Capital, and LG.

The post Mojo is Making Contacts Smarter with Its Incredibly Tiny Microdisplay appeared first on Road to VR.