Vuzix Blade AR Smart Glasses Will Soon Support Real-Time Language Translation

The Zoi Meet language subtitling service makes its AR debut next month.

The Vuzix Corporation, a manufacturer of AR smart glasses, has announced a partnership with Verizon and Zoi Meet that will bring the popular multilingual communication platform to their Vuzix Blade Smart Glasses, establishing the AR headset as one of the first to incorporate real-time language translation.

Using a combination of proprietary voice-to-text and language-translation algorithms, Zoi Meet can provide real-time translations between 12 languages, including Arabic, Chinese (simplified), Dutch, English, French, German, Italian, Japanese, Korean, Portuguese, and Spanish.

Image Credit: PRNewsfoto / Vuzix Corporation

Beginning this July, the Zoi Meet service will be available as an app on the Vuzix Blade Smart Glasses, allowing for hands-free language translation in which speech is automatically transcribed into text and projected directly onto the display of the headset.

“Spoken language is an integral part of communication and bringing real-time live multilingual transcription services to smart glasses really levels the communication playing field for everyone. This application can be utilized by individuals and business travelers alike communicating with others on a daily basis,” said Paul Travers, President and Chief Executive Officer at Vuzix, in an official release.

“We are excited to work with Vuzix and Verizon to bring a powerful language translation and transcription application live on the Vuzix Blade Smart Glasses.  Whether you’re traveling or using the Vuzix Blade for business, having hands-free language translation on the display of the Blade is a game changer for face-to-face conversations,” added Nick Yap, Founder of Zoi Meet.

The Zoi Meet app is expected to arrive on the Vuzix App Store this July. A beta experience of real-time AR language translation using Zoi Meet on the Vuzix Blade was recently showcased at the “Verizon Tech Day” event held in Basking Ridge, NJ, as well as at the Japanese accelerator “Plug and Play Japan” and at Vuzix’s annual stockholder meeting in Rochester, NY.

The Vuzix Blade Smart Glasses are currently available for purchase at $699.

The post Vuzix Blade AR Smart Glasses Will Soon Support Real-Time Language Translation appeared first on VRScout.

Watch: ‘Beat Saber’ 360 Mode Feels like a New Way to Play

Developer Beat Games last week revealed a new ‘360 mode’ for their VR hit Beat Saber. Notes can now come from all around the player instead of just straight ahead. While it feels like a fun new way to play, it makes beat mapping considerably more complex. It’ll take innovative and creative mapping to really make 360 mode great, and for that, Beat Games should turn to its community.

Beat Saber’s primary mode throws a series of blocks at you from directly ahead. It’s simple and straightforward, but can be very challenging at high levels, which is part of the reason why Beat Saber has such broad appeal. A new ‘360 mode’ planned for Oculus Quest allows beat maps to send lines of blocks at you from arbitrary directions as the song plays out.

At E3 2019 I got to play an early version of the 360 mode. As a relatively high-level Beat Saber player, I found it very intriguing and I see a lot of potential, but it’ll take more time to discover what kind of note patterns really make this mode feel unique and awesome.

You can think of the 360 mode much like the normal mode, except that the direction where the blocks are coming from can rotate around you on the fly. No matter which direction they’re coming from, they’re still traveling in straight lines, but notes can come along multiple tracks at once with different angles to you. Lines on the ground are used to show where you can expect the next string of blocks and to give you an idea of which direction you should be facing.

I got to play two songs, both of which were only mapped up to Hard difficulty. At first the songs started with notes coming head-on, but pretty quickly I saw the lines on the ground shift to indicate that notes would be coming from the side. The lines worked well to tell me ahead of time what to expect, and even slicing notes that moved between tracks felt pretty natural as they shifted gradually around me. However, sometimes the lines and note paths intersected in front of me and it was a little more difficult to sort out the order of the notes because of the way they intertwined as they got closer to me.

In the end, playing in 360 mode was fun and different, and definitely felt like a new way to play Beat Saber compared to the standard mode, but it’s clear that it’ll take more time to learn how to make 360 mode beat maps that are really innovative and interesting. Ideally 360 mode should allow note maps which create totally unique movements for players compared to the standard mode. Because the design space of 360 mode is larger in scope though, it’s going to be more difficult to discover what really makes a great beat map.

To accelerate that learning process and find out what kind of 360 mode beat mapping could really be awesome, I think Beat Games ought to put these tools out to their passionate community to see what kind of interesting 360 beat maps they come up with.

SEE ALSO
Hands-on: Oculus Quest is Ready and Able to Handle ‘Beat Saber’s’ Highest Difficulty

While the 360 mode could be a cool addition to Beat Saber, it could also represent ‘feature-creep’, which is dangerous for an indie team like Beat Games and for a game which thrives because of its simplicity.

Beat Saber already features the standard mode and ‘one saber’ mode, both of which have unique maps across four difficulties. That means that to make one song level for Beat Saber, the developers need to hand-craft eight unique maps if they want to serve every difficulty of both modes. Introducing 360 mode means not just more complex mapping, but four additional beat maps for each song (if they choose to cover all songs and difficulties with the 360 mode), meaning each song needs 12 hand-crafted beat maps.

Beat Games says that 360 mode will debut first on Quest because of its untethered 360 tracking, but it expects that a similar mode would come to other headsets later in a way that confines the rotating notes to some area in front of the player (so that they don’t get wrapped up in their cable), perhaps a ‘180 mode’. Without incredibly careful mapping and testing, it seems unlikely that 360 maps could easily be automatically converted into 180 maps though, so again, a 180 mode might mean yet more complexity and work when it comes to beat mapping.

Image courtesy Beat Games

Speaking with the developers though, it sounds like 360 mode is still very early and both the mechanics of how it works and the extent to which it will or won’t cover all of the game’s music is unclear. So we’ll still have to wait and see if it ends up being a boon for the game or extra baggage.

The post Watch: ‘Beat Saber’ 360 Mode Feels like a New Way to Play appeared first on Road to VR.

The Unreal Garden AR Environment Was A Much Needed Escape From The Chaos Of E3

Large-scale multiplayer AR turns 6,000 sq. ft. of show floor into a digital jungle.

If you’ve ever attended an event like E3, you’re no doubt familiar with the sheer chaos of a jam-packed show floor, whether it’s the massive crowds of attendees bumping shoulder-to-shoulder as they squeeze through narrow walkways, or the endless barrage of in-your-face marketing from companies desperately trying to attract your attention.

The Unreal Garden, a large-scale multiplayer AR experience from The Entertainment Software Association and Onedome, offered E3 2019 attendees an escape from the chaos, turning 6,000 sq. ft. of show floor into an interactive AR jungle.

Originating as a San Francisco-based pop-up developed in 2018, The Unreal Garden is a combination of art and technology, fusing entertainment with augmented reality, projections, and soundscape technology. Built on the Enklu platform and powered by Microsoft HoloLens, the experience offers users a mixed reality journey through an environment composed of both physical and AR elements.

“E3 provides unparalleled interactive experiences to our attendees,” said Dan Hewitt, Vice President of Communications for the Entertainment Software Association, the U.S. video game trade association that owns and manages E3, in an official release. “The Unreal Garden @ E3 provides a world-class opportunity for E3 attendees to explore the intersection of technology and human experience.”

“We are truly honored to have the opportunity to participate in E3 2019, which is always regarded as the place where the world of interactive entertainment gathers to encounter innovative new technologies,” said Leila Amirsadeghi, Co-Founder and CMO, Onedome. “The Unreal Garden @ E3 will complement this year’s show floor and bring this augmented reality environment further to life. Attendees will be immersed in this never-before-seen multiplayer AR activation, which will reveal hidden messages and content as they journey through the experience.”

We had the opportunity to check out the unique experience ourselves while attending E3, and suffice it to say our time inside the Garden was a welcome change of pace from some of the other, more intense immersive experiences offered on the show floor. Donning a pair of Microsoft HoloLens headsets, we ventured into the sealed-off environment and were immediately greeted by a colorful spectacle of tropical foliage. Layered over these prop set pieces was a healthy assortment of augmented wildlife. By directing our attention at these creatures, we were able to access thought-provoking messages from the artists who contributed to the project.

Not only were we able to interact with the existing AR environment—playing with magic mushrooms, watching AR whales float above the treeline—we were also able to manifest our own “energy” by holding up our index fingers in front of our visors, upon which a ball of energy would form. From there, we could paint colorful lines throughout our environment and even cast a few spells, all of which could be seen by fellow users. It’s just a shame the limited FOV of the gen 1 Microsoft HoloLens made the experience so restricting. Still, it was an interesting project that demonstrates the potential of large-scale multiplayer AR experiences.

If you happen to be near San Francisco, check out the original installation by picking up some tickets here.

The post The Unreal Garden AR Environment Was A Much Needed Escape From The Chaos Of E3 appeared first on VRScout.

Virtual reality can spot navigation problems in early Alzheimer’s disease

Virtual reality (VR) can identify early Alzheimer’s disease more accurately than ‘gold standard’ cognitive tests currently in use, suggests new research from the University of Cambridge.

The study highlights the potential of new technologies to help diagnose and monitor conditions such as Alzheimer’s disease, which affects more than 525,000 people in the UK.

In 2014, Professor John O’Keefe of UCL was jointly awarded the Nobel Prize in Physiology or Medicine for ‘discoveries of cells that constitute a positioning system in the brain’. Essentially, this means that the brain contains a mental ‘satnav’ of where we are, where we have been, and how to find our way around.

A key component of this internal satnav is a region of the brain known as the entorhinal cortex. This is one of the first regions to be damaged in Alzheimer’s disease, which may explain why ‘getting lost’ is one of the first symptoms of the disease. However, the pen-and-paper tests used in clinics to diagnose the condition are unable to test for navigation difficulties.

In collaboration with Professor Neil Burgess at UCL, a team of scientists at the Department of Clinical Neurosciences at the University of Cambridge led by Dr. Dennis Chan, previously Professor O’Keefe’s Ph.D. student, developed and trialled a VR navigation test in patients at risk of developing dementia. The results of their study are published today in the journal Brain.

In the test, a patient dons a VR headset and undertakes a test of navigation while walking within a simulated environment. Successful completion of the task requires intact functioning of the entorhinal cortex, so Dr. Chan’s team hypothesised that patients with early Alzheimer’s disease would be disproportionately affected on the test.

The team recruited 45 patients with mild cognitive impairment (MCI) from the Cambridge University Hospitals NHS Trust Mild Cognitive Impairment and Memory Clinics. Patients with MCI typically exhibit memory impairment, but while MCI can indicate early Alzheimer’s, it can also be caused by other conditions such as anxiety and even normal aging. As such, establishing the cause of MCI is crucial for determining whether affected individuals are at risk of developing dementia in the future.

The researchers took samples of cerebrospinal fluid (CSF) to look for biomarkers of underlying Alzheimer’s disease in their MCI patients, with 12 testing positive. The researchers also recruited 41 age-matched healthy controls for comparison.

All of the patients with MCI performed worse on the navigation task than the healthy controls. However, the study yielded two crucial additional observations. First, MCI patients with positive CSF markers—indicating the presence of Alzheimer’s disease, thus placing them at risk of developing dementia—performed worse than those with negative CSF markers at low risk of future dementia.

Secondly, the VR navigation task was better at differentiating between these low and high risk MCI patients than a battery of currently-used tests considered to be gold standard for the diagnosis of early Alzheimer’s.

“These results suggest a VR test of navigation may be better at identifying early Alzheimer’s disease than tests we use at present in clinic and in research studies,” says Dr. Chan.

VR could also help clinical trials of future drugs aimed at slowing down, or even halting, progression of Alzheimer’s disease. Currently, the first stage of drug trials involves testing in animals, typically mouse models of the disease. To determine whether treatments are effective, scientists study their effect on navigation using tests such as a water maze, where mice have to learn the location of hidden platforms beneath the surface of opaque pools of water. If new drugs are found to improve memory on this task, they proceed to trials in humans, but using word and picture memory tests. This lack of comparability of memory tests between animal models and human participants represents a major problem for current clinical trials.

“The brain cells underpinning navigation are similar in rodents and humans, so testing navigation may allow us to overcome this roadblock in Alzheimer’s drug trials and help translate basic science discoveries into clinical use,” says Dr. Chan. “We’ve wanted to do this for years, but it’s only now that VR technology has evolved to the point that we can readily undertake this research in patients.”

In fact, Dr. Chan believes technology could play a crucial role in diagnosing and monitoring Alzheimer’s disease. He is working with Professor Cecilia Mascolo at Cambridge’s Centre for Mobile, Wearable Systems and Augmented Intelligence to develop apps for detecting the disease and monitoring its progression. These apps would run on smartphones and smartwatches. As well as looking for changes in how we navigate, the apps will track changes in other behaviours such as sleep and communication.

“We know that Alzheimer’s affects the brain long before symptoms become apparent,” says Dr. Chan. “We’re getting to the point where everyday tech can be used to spot the warning signs of the disease well before we become aware of them.

“We live in a world where mobile devices are almost ubiquitous, and so app-based approaches have the potential to diagnose Alzheimer’s disease at minimal extra cost and at a scale way beyond that of brain scanning and other current diagnostic approaches.”

Source:

https://medicalxpress.com/news/2019-05-virtual-reality-problems-early-alzheimer.html

Astronauts Will Bring Their Kids To Mars In VR

Immersive simulation could battle loneliness and improve mental health on long missions.

Within the next two decades, NASA hopes to launch a human mission to Mars. The roundtrip journey could take an estimated 21 months and experts have predicted a critical need for mental health support for the astronauts who experience the extreme isolation.

That focus on the crew’s overall well-being was a major theme of a recent workshop hosted by Baylor College of Medicine’s Translational Research Institute for Space Health (TRISH) and Z3VR, a Houston-based digital therapeutics company. Speakers during the daylong event, streamed live online, addressed how extended augmented reality tools could support the health of astronauts during long-duration space travel—particularly in the areas of emotional and mental support.

One of the more thought-provoking suggestions? Avatars.

Program speaker Javier Fadul, director of innovation for HTX Labs—a local tech company that specializes in creating immersive virtual reality (VR) training experiences—recommended avatars as a tool to combat loneliness.

An image of a virtual reality hospital room with a VR patient and nurse created by HTX Labs. (Credit: HTX Labs)

 

In the computing world, avatars are graphical representations of a user or that person’s alter ego. Through virtual reality technologies, these representations can take on 3-D forms so remarkably realistic that they could be the next best thing to physically having one’s relatives join them in space.

“I’m often thinking about the big picture and the grand vision of what this technology could provide for humans,” Fadul said during a follow-up discussion at the HTX Labs offices in Houston’s Montrose neighborhood. “For the seminar, I wanted to discuss our experiences with immersive training, all of the different variables that we’ve had experience with so far and also what lessons we have learned that could be applied to this long-term vision of humans in space—which is such an exciting thing.”

Fadul, who is responsible for developing virtual reality experiences at HTX Labs from a creative design perspective through user interface and user experience (UI/UX) technologies, said that VR capabilities have become so advanced that a fully immersive experience is now possible. Operating a high-grade headset—HTX Labs uses one called the HTC Vive—users can be transported through sight, sound and, in many cases, touch, to entirely new realities.

“A lot of people think of VR and have experienced VR as if it was just a new screen—a new display—and it makes sense. It’s definitely a very immersive screen, but not only does it display information, it also enables you to perform behaviors within that virtual world—and I think that’s the part that’s particularly significant. It’s not just about consuming content, but also about being able to provide agency, which adds to the level of engagement,” Fadul said.

He added that “being a full human in a simulation” has therapeutic benefits.

“That’s a pretty impactful thing—and for people who are going to space, where isolation and the extreme conditions will be highly prevalent, there are a lot of applications there,” Fadul said.

Research has shown that VR technology can help combat loneliness among seniors. In a Massachusetts Institute of Technology (MIT) study published in 2018, older participants who had access to a provided virtual reality system reported less social isolation, were less likely to show signs of depression and reported better overall well-being than those who did not use VR. Most of the systems provided in these settings transported individuals to a new environment or experience. The VR company Rendever, for example, uses the technology to give residents in assisted living and senior care facilities access to a bucket list adventure, an exotic locale or even their old neighborhood.

What Fadul is proposing is that NASA take it a step further for a Mars mission by creating something like the ultimate personalized VR. One of the applications he envisions would allow an astronaut to bring avatars of loved ones—partners or children or friends—as envoys of comfort during the long journey. Through a virtual reality headset, the astronaut could potentially be transported into a VR copy of his or her home to eat a meal with their partner or walk into their children’s rooms and read a bedtime story.

Fadul has made avatars of his 15-month-old son during different stages of his development so that someday his son could have the experience of going back in time. He said that for someone who works to create life-like virtual humans for a living, it’s been fascinating to watch his own son grow and develop.

“There is something to be said about seeing his development, of watching him discover his own hands and then sort of modeling those things in the work I do,” Fadul said. “He’s just starting to understand language and, similarly, our virtual humans are starting to respond in those ways as well. So, it’s really remarkable—this sort of parallel process that’s happening.”

Always thinking bigger, Fadul added that someday this same technology could prove beneficial for people dealing with loss.

“You can walk around within our systems, use tools and actually communicate naturally with virtual humans; you could hold your virtual baby again, for example, things like that—that could be hugely therapeutic,” he said. “I think the opportunities that immersive technology offer might be the only ones that can address some of these issues. A book can only go so far, a photo can only go so far, but really getting that sense of connection might only be possible with these systems.”

 

 

Source:

Image: Pathfinder on Mars. (Credit: NASA/JPL)

Facebook Open-sources AI Habitat To Help Robots Navigate Realistic Environments

Facebook AI Research is making available AI Habitat, a simulator that can train AI agents that embody things like a home robot to operate in environments meant to mimic typical real-world settings like an apartment or office.

For a home robot to understand what to do when you say “Can you check if my laptop is in the other room and if it is, can you bring it to me?” will require drawing together multiple forms of intelligence.

Embodied AI research can be put to use to help robots navigate indoor environments by marrying together a number of AI systems related to computer vision, natural language understanding, and reinforcement learning.

“Habitat-Sim achieves several thousand frames per second (fps) running single-threaded, and can reach over 10,000 fps multi-process on a single GPU, which is orders of magnitude faster than the closest simulator,” a dozen AI researchers said in a paper about Habitat. “Once a promising approach has been developed and tested in simulation, it can be transferred to physical platforms that operate in the real world.”

Facebook Reality Labs, formerly named Oculus Research, is also open-sourcing Replica, a data set of photorealistic 3D environments like a retail store, apartment, and other indoor environments that resemble the real world. AI Habitat can work with Replica but also works with other embodied AI research data sets like Matterport3D for indoor environments.

Simulated data is commonly used in AI to train robotic systems, create reinforcement learning models, and power AI systems from Amazon Go to enterprise applications of few-shot learning with small amounts of data. Simulations can allow environmental control, reducing costs that arise from the need to collect real-world data.

AI Habitat was introduced in an effort to create a unified environment and address standardization for embodied research by the robotics and AI community. To that end, Facebook also released PyTorch Hub earlier this week.

“We aim to learn from the successes of previous frameworks and develop a unifying platform that combines their desirable characteristics while addressing their limitations. A common, unifying platform can significantly accelerate research by enabling code re-use and consistent experimental methodology. Moreover, a common platform enables us to easily carry out experiments testing agents based on different paradigms (learned vs. classical) and generalization of agents between datasets,” said Facebook.

In addition to the Habitat simulation engine, the Habitat API provides a library of high-level embodied AI algorithms for things like navigation, instruction following, and question answering.
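For a sense of what driving that API looks like in practice, here is a minimal sketch of running a point-goal navigation episode with a random agent, modeled on the project’s published quick-start example; the exact config filename and the availability of bundled test scenes vary by release, so treat those details as assumptions.

import habitat

# Load an embodied PointNav task and its simulated agent.
# The config path is an assumption; it differs between habitat-api releases.
env = habitat.Env(
    config=habitat.get_config("configs/tasks/pointnav.yaml")
)

# reset() returns the first set of observations (e.g. RGB, depth, GPS+compass).
observations = env.reset()

# Step through the environment with random actions until the episode ends.
while not env.episode_over:
    observations = env.step(env.action_space.sample())

A trained agent would simply replace env.action_space.sample() with the action chosen by its policy given the current observations.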

Facebook released the PyTorch Hub platform for reproducibility of AI models earlier this week.

Researchers found that “learning outperforms SLAM if scaled to an order of magnitude more experience than previous investigations” and that only agents with depth sensors generalize well across datasets.

“AI Habitat consists of a stack of three modular layers, each of which can be configured or even replaced to work with different kinds of agents, training techniques, evaluation protocols, and environments. Separating these layers differentiates the platform from other simulators, whose design can make it difficult to decouple parameters in order to reuse assets or compare results,” the paper reads.

AI Habitat is the latest Facebook AI initiative to use embodied AI research, and follows research to train an AI agent to navigate the streets of New York with 360-degree images and to get around an office by watching videos.

Facebook VP and chief AI scientist Yann LeCun told VentureBeat the company is interested in robotics because the opportunity to tackle complex tasks attracts the top AI talent.

AI Habitat is the most recent example of tech giants attempting to deliver a robotics creation platform for AI developers and researchers. Microsoft introduced a robotics and AI platform in limited preview last month, while Amazon’s AWS RoboMaker, which draws on Amazon’s cloud and AI systems, made its debut in fall 2018.

How AI Habitat works was detailed in an arXiv paper written by a team that includes Facebook AI Research, Facebook Reality Labs, Intel AI Labs, Georgia Institute of Technology, Simon Fraser University, and University of California, Berkeley.

AI Habitat will be showcased in a workshop next week at the Computer Vision and Pattern Recognition (CVPR) conference in Long Beach, California.

In other recent contributions to the wider AI community, Facebook AI research scientist Mike Lewis and AI resident Sean Vasquez introduced MelNet, a generative model that can imitate music and the voices of people like Bill Gates.

Major object detection AI systems from Google, Microsoft, Amazon, and Facebook are less likely to work for people in South America and Africa than North America and Europe, and less likely to work for households that make less than $50 a month.

Facebook VP of AR/VR Andrew Bosworth earlier this week said new Portal devices — the first after the video chat devices were introduced in October 2018 — will make their public debut this fall.

Facebook also announced plans to open an office with 100 new AI roles in London.

This post by Khari Johnson originally appeared on VentureBeat.


The post Facebook Open-sources AI Habitat To Help Robots Navigate Realistic Environments appeared first on UploadVR.

Today’s Advanced Research Goes From Free-flying Robots to Anti-Gravity Pants

NASA astronaut Anne McClain checks out the new Astrobee robotics hardware earlier this year inside the Japanese Kibo laboratory module.

Robotics, combustion and human research were the primary focus of today’s science schedule aboard the International Space Station. The Expedition 59 crewmembers also checked out U.S. spacesuits and specialized pants designed to counteract some of the effects of living in microgravity.

Astrobee, a tiny cube-shaped free-flying robotic assistant, is being tested aboard the orbital lab for its sighting and motion abilities. Flight Engineer David Saint-Jacques of the Canadian Space Agency (CSA) set up Astrobee for more mobility tests today inside the Japanese Kibo laboratory module. The device may support routine maintenance tasks and lab monitoring capabilities. Northrop Grumman’s Cygnus space freighter delivered Astrobee to the station April 19.

The safe observation of how fuels and materials burn in microgravity takes place in the space station’s Combustion Integrated Rack (CIR). The research takes place in the U.S. Destiny laboratory module and may help engineers design more fuel-efficient spacecraft engines and safer, less flammable environments. NASA astronaut Christina Koch replaced a burner and igniter tip in the CIR to maintain continuing combustion research operations.

Flight Engineer Anne McClain of NASA attached cuffs to her legs and sensors to her chest for a series of blood pressure checks and ultrasound scans today. The Vascular Echo biomedical study from CSA, ongoing since March 2015, analyzes an astronaut’s cardiovascular system for conditions such as arterial stiffness.

U.S. spacesuits continue to be serviced after a set of three spacewalks that took place earlier this year. Astronaut Nick Hague cleaned the suits’ cooling loops, cycled their pressure valves and tested water samples inside the Quest airlock where U.S. spacewalks are staged.

Cosmonauts Oleg Kononenko and Alexey Ovchinin have been training this week to use the Lower Body Negative Pressure suit. The Russian suit, also known as Chibis, counteracts the upward fluid shifts in the human body caused by microgravity. This may alleviate the head and eye pressure reported by astronauts. An easily recognizable symptom of these fluid shifts that all crews experience is “puffy face.”

E3 2019: Asgard’s Wrath Could Be The VR RPG We’ve All Been Waiting For

Explore complex dungeons with animal-human hybrids in this Norse-inspired epic.

I’m following a brazen Shield Maiden and her crew of Viking compatriots as they sail across dark waters on a perilous quest to exact revenge on the Norse god Týr. Before we reach the coastline, however, a towering god-like creature erupts from beneath the surface of the violent waters, eviscerating half of her compatriots; the Shield Maiden herself is thrown into the water. She then wakes upon a beach, surrounded by the wreckage of her warship and the scattered remains of her crew.

This is Asgard’s Wrath, an ambitious new VR RPG adventure that places you in the role of both mythical Nordic deity and mortal hero in a battle against the Gods themselves. Brought to us by Sanzaru Games (Marvel Powers United, VR Sports Challenge) and Oculus Studios, the bold new RPG experience promises in-depth immersive combat, unique puzzle-solving, a fully-developed narrative featuring 30+ hours of gameplay, and jaw-dropping set pieces, all set against the rich backdrop of Norse mythology.

Image Credit: VRScout

Asgard’s Wrath was large and in charge at E3 2019, allowing attendees—including myself—the chance to go hands-on with the highly-anticipated title.

Taking place directly after the events featured in the PAX East demo—where players rescued the famous Norse God Loki from the clutches of the legendary Kraken—I began my experience inside an Asgardian tavern, surrounded by various other mythological deities. As thanks for saving his mischievous skin, Loki treats me to a hearty round of mead. After clinking our glasses and downing our drinks—an action that felt surprisingly satisfying in VR—Loki informs me of an ambitious Shield Maiden hellbent on avenging the death of her brother, who was killed by the Norse god Týr. In order to rise through the ranks and become a true Norse God, it’s my job to assist this woman, along with various other adventurous mortals, by possessing their bodies and helping them fulfill their ultimate destinies.

After awakening on the beach following the attack on her fleet, I step into the shoes of the foul-mouthed warrior and begin exploring the coastline. After walking up to a special stone pedestal located further up the beachhead, I’m instantly transported back to godhood, allowing me to stare down at my companion and the surrounding environment as if looking at a board game. In this form, I’m actually able to recruit warrior companions out of local wildlife to assist my character in her journey; each animal features different strengths that can come in handy during certain scenarios.

Image Credit: RoadtoVR

For instance, one portion of the demo asked me to navigate through a zombie-ridden dungeon by activating a series of hidden levers disguised as hanging cages, all of which sat just beyond my character’s reach. While in God-mode, I reached down and picked up a shark that had become stranded in a shallow pool following the recent storm. Applying my mystical god-like powers to the helpless creature, I watched as the wild animal turned into a half-man, half-shark hybrid with a hunger for flesh. I could then task my new NPC companion with activating said levers by jumping up and dragging down the devices using its massive jaws.

The following dungeon required me to navigate past a series of precarious fire traps. For this puzzle, I enlisted the help of a sea turtle that—when transformed into a humanoid creature—used its rugged shell to protect me from the painful flames. Each of these characters, while simplistic in terms of AI (you can tell them where to go, who to attack, and when to bypass a trap or puzzle), feature an incredible amount of personality. From the various fish strapped to the belt of my sea turtle warrior, to the fin-shaped armor featured on my humanoid shark, each aspect of the characters, as well as the environments, are bursting with small details that—while seemingly insignificant—all come together to create a captivating and engaging world.

Image Credit: Sanzaru Games

These creatures, while invaluable tools for puzzle-solving, will also assist you in combat. While navigating the cliff-side dungeons in search of powerful weaponry, I came across several bands of undead warriors hungry for an ass-whooping. With my man-shark companion in tow, I engaged the enemy in melee combat using a legendary sword and shield I had uncovered earlier in the tombs. The combat I experienced, while brief, was satisfying. Dismembering different parts of my enemies depending on where I struck was a nice touch, if not a little too easy; especially when assisted by my NPC companion. Honestly, it almost felt as though combat was an afterthought, at least in terms of the demo. These small battles were definitely the least memorable portions of my experience.

I found the UI and menu system slightly less fluid. In order to access your bag and retrieve certain items—a task that was asked of me multiple times throughout my 30-minute demo—I needed to open the menu by pushing a button, navigate to the object, and then click and drag it out of the menu and into my hand. This was especially cumbersome considering there are tabs within your inventory that filter certain objects, forcing you to scroll through multiple pages in order to locate your desired item; not a game-breaking mechanic by any means, but an unfortunate one nonetheless.

Image Credit: Sanzaru Games

Honestly, my favorite moments during the demo were the points where I was allowed to explore the detailed environments and lose myself in the lore behind this Norse-inspired universe. That’s what excited me most about Asgard’s Wrath: the potential for a truly captivating narrative previously unheard of in modern VR RPGs. It’s no secret that story-driven VR games are currently few and far between. However, titles such as Vader Immortal and Lone Echo have proven that story-driven VR games are not only possible, but in demand.

I also really enjoyed the way the game plays with scale, whether you’re staring up at a titanic God while in mortal form, or gazing down at the miniature landscapes while in god-mode.

With 30+ hours of gameplay promised, Asgard’s Wrath could be the first real AAA RPG experience on Oculus, an idea that excites me to no end. Asgard’s Wrath will be available exclusively on the Oculus Rift and Oculus Rift S sometime later this year.

The post E3 2019: Asgard’s Wrath Could Be The VR RPG We’ve All Been Waiting For appeared first on VRScout.

This Is What the Future of Learning Looks Like: Schoolflix and Hologram Glasses

A ninth-grade class at the Werkstattschule in Rostock has entered the “Prototype for Education” competition at MVpreneur Day. With “Schoolflix” and a pair of AR glasses, the students hope to win over the jury on Wednesday.

A pair of 3D glasses from the cinema, with bottle caps for buttons and cables along the side: this is what the future of learning could look like, at least if the students of a ninth-grade class at the Werkstattschule in Rostock had their way. For the “Prototype for Education” competition at the University of Rostock’s MVpreneur Day 2019, the students considered how they could make learning more interesting. “Two exciting ideas, or rather prototypes, came out of it,” says their teacher Anne Buhrand. “They are meant to support knowledge acquisition.”

The 29-year-old has been teaching maths and computer science at the Rostock school for two years. A colleague made her aware of the competition, and she told her students about it. “We found it interesting to take part because it affects us,” says Lennart Karsten.

An Augmented Reality

Together with his classmates Jonas Wirth, Pepe Kordhase, Hannes Utech, Arwin Krone and Lenny Schmalisch, the 15-year-old developed the prototype of a pair of augmented reality (AR) glasses: glasses meant to extend reality with holograms. Unlike a VR headset used for video games, the real surroundings remain visible through AR glasses. “So the students can still see the teacher,” explains Lenny Schmalisch. The group got the idea for the glasses from the WDR apps, which already make an AR experience possible on a smartphone.

The boys agree: whether it is the individual layers of the Earth floating within arm’s reach in the room, or Marie-Antoinette telling the students how she lived at the French court in the 18th century, the three-dimensional visualisation from the mini-computer on your nose makes learning more interesting, more tangible, more exciting. “Robots or even contemporary witnesses could be brought into the classroom this way.” “One downside of our idea is the cost,” Hannes concedes. Lessons could also become more restless. The ninth-graders’ idea is comparable to Microsoft’s “HoloLens”, which costs around 1,800 euros.

Schoolflix: A Mix of Documentary and Series

Merle Siems, Anne Stachs, Ronja Hoch and Mette Matthes also developed a prototype for the competition. Their project is called “Schoolflix”. “Just like Netflix, Schoolflix is a website with films and series that are suitable for lessons,” says 16-year-old Mette. The inspiration came from the US series “Reign”, which is about the life of the young Mary Stuart. “The series sparked such great interest in the girls that they went on to read more about the subject afterwards,” says their teacher. Unlike “Reign”, however, the content on “Schoolflix” is meant to stay closer to reality and offer a good mix of documentary and series.

Buhrand has an idea for maths lessons: “You could also make a series about Pythagoras: how he lived in his sect, made music, and then found the proof of the Pythagorean theorem.” The teacher would gladly use the prototypes in her own lessons. “I think they would be an enrichment.”

MVpreneur Day 2019

The students’ ideas can already be seen on YouTube. “You just have to search for the ‘Prototype for Education’ competition,” says Anne Buhrand.

On Wednesday at MVpreneur Day it will become clear whether the students can win over the jury with their ideas. “We’ll just let ourselves be surprised,” says the maths teacher. If they win, the money will be invested in the school’s computer room.

Source:

Photo: The ninth-grade class of the Werkstattschule in Rostock takes part in the “Prototype for Education” competition at MVpreneur Day 2019. Their idea is a pair of AR glasses. Source: Lena Hackauf

https://www.ostsee-zeitung.de/Nachrichten/MV-aktuell/So-sieht-die-Zukunft-des-Lernens-aus-Schoolflix-und-Hologramm-Brille