Space Gardening Aboard Station Ahead of Spacewalks

NASA astronaut Jessica Meir dines on fresh Mizuna mustard greens she harvested aboard the International Space Station.

The Expedition 61 crewmembers started taking turns “weighing in” Wednesday before a slew of space gardening and life science activities. The orbital residents are also nearing the start of a series of spacewalks to repair the International Space Station’s cosmic particle detector.

Isaac Newton’s Second Law of Motion allows for the calculation of mass in space using a variant of the equation — force equals mass times acceleration. Each crewmate attached themselves to a device this morning that applies a known force to the subject. The resulting acceleration provides a value used to calculate an astronaut’s mass.
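As a rough illustration, the calculation behind such a body-mass measurement device reduces to Newton's second law rearranged as m = F / a. The numbers below are assumed example values, not actual device specifications.

```python
# Sketch: estimating an astronaut's mass in microgravity from a known
# applied force and the measured acceleration, per m = F / a.
def mass_from_force(force_n: float, acceleration_ms2: float) -> float:
    """Return mass in kg given an applied force (N) and measured acceleration (m/s^2)."""
    if acceleration_ms2 <= 0:
        raise ValueError("acceleration must be positive")
    return force_n / acceleration_ms2

# Example (assumed values): a 150 N force producing 2.0 m/s^2 implies a 75 kg subject.
print(mass_from_force(150.0, 2.0))  # 75.0
```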

It is harvest time once again aboard the orbiting lab. NASA astronauts Jessica Meir and Christina Koch cut leaves from Mizuna mustard greens grown inside ESA’s (European Space Agency) Columbus lab module. Half of the space crop was destined for crew tasting, while the other half was stowed in science freezers for analysis on Earth.

NASA Flight Engineer Andrew Morgan is getting ready for Friday’s spacewalk with ESA Commander Luca Parmitano. Morgan reviewed robotics activities planned for Friday’s excursion and checked spacewalking gear. The duo will set their U.S. spacesuits to battery power at 7:05 a.m. EST inside the Quest airlock signifying the start of their spacewalk. NASA TV begins its live coverage at 5:30 a.m.

Parmitano focused on science today, attaching electrodes to himself after his “weigh-in” to measure any changes to his body fat. Afterward, he collected noise level measurements in Russia’s Zarya module. He then set up samples in the U.S. Destiny lab module to explore how fluids behave in microgravity, research that could improve medical treatments on Earth.

Cosmonaut Oleg Skripochka handed over radiation detectors to Meir for deployment in the station’s U.S. segment. He later joined cosmonaut Alexander Skvortsov and recorded video to share historical space accomplishments with Russian audiences.

Last Labyrinth Is A Nightmare-Fueled Escape Room Tailor-Made For VR

Silent Hill meets Saw in this uniquely crafted horror puzzler.

Last Labyrinth released worldwide on major VR headsets today, and suffice it to say AMATA K.K.’s escape-room-style experience is nothing short of horrific. Not in terms of quality; far from it, in fact. The adventure puzzle experience is loaded with complex challenges and features a natural character interaction system that perfectly complements the VR format. No, when I say “horrific,” I’m referencing the absolutely nightmarish consequences that come as a result of your failures. And I don’t know about you, but I fail a lot.

Tasked with escaping the confines of a mysterious mansion, Last Labyrinth begins with the player tied to a wheelchair — hands and legs bound together — inside a dark room; their only other company being a young girl who speaks an unrecognizable foreign language. Right off the bat, the immersive escape room experience sets itself apart from its contemporaries with its unique interaction system. Unable to physically move their legs or arms, players can trigger levers, activate switches, and interact with various other environmental elements by directing the young girl, Katia, to different positions throughout each room. 

Of course, neither Katia nor the player can understand one another. As a result, players “communicate” using a laser pointer attached to their head, which allows them to guide the young companion toward specific objects. Before Katia performs each desired action, she’ll “ask” for confirmation by pointing at the object, at which point the player can nod “yes” or shake their head “no” to confirm or cancel the action.

Image Credit: AMATA K.K.

These interactions range from opening doors and progressing through rooms, to pushing buttons and throwing levers in order to activate different chain reactions. Sometimes this involves aligning tracks to correctly redirect a toy train to a specific destination; other times it means navigating a series of deadly traps. Though I’ve only managed to tackle a handful of puzzles, so far I’ve come across a healthy variety of unique challenges, each more violent than the last. While the experience can sometimes slow to a crawl during a few of the lengthier challenges, the comfortable, seated experience keeps you constantly engaged throughout, thanks in large part to its consistently unsettling environments.

Last Labyrinth is creepy. Like, super creepy. Fans of Japanese horror will no doubt feel right at home skulking through the terrifying halls of the decrepit mansion alongside their younger companion. And herein lies my only serious complaint with the experience so far: child murder. Look, I’m all for developers trying to evoke a sense of dread and shock with their projects, but every so often an experience goes a bit too far in terms of its content. One of Last Labyrinth’s most defining elements is the gruesome deaths you and Katia are subject to as a result of failing certain puzzles. These include everything from decapitation and impalement, to falling or being crushed to death.

Image Credit: AMATA K.K.

In terms of you — the player — it’s a smart choice that adds more weight to your decisions, forcing you to be more strategic with each action you take and adding an additional level of immersion to the experience. Seeing as you’re confined to a wheelchair with no way of moving, your fate is completely dependent on your puzzle-solving skills. That being said, these gruesome deaths have the opposite effect when it comes to Katia. Call me crazy, but there’s something about watching a 10- to 12-year-old child get repeatedly murdered in a variety of horrifying ways that tends to put me off.

In the context of the story, it makes sense to have your companion, an NPC completely dependent on the player, be of a younger age. However, the choice to include this character in all of the deaths is best described as off-putting. Had the developers chosen to kill only the player, or at the very least increased Katia’s age, these sequences would have been much more tolerable.

Image Credit: AMATA K.K.

All in all, Last Labyrinth is a quality escape-the-room-style experience that lends itself incredibly well to the VR format. It’s also refreshing to see a new quality horror experience on major headsets. While PC VR headsets already have a fair share of adult-themed experiences available, the PSVR and Oculus Quest are severely lacking in terms of scarier content. With the arrival of Last Labyrinth and Five Nights at Freddy’s VR: Help Wanted, it appears as though the VR horror genre is finally getting the attention it deserves.

Last Labyrinth is available now on SteamVR, PS VR, and Oculus.

Feature Image Credit: AMATA K.K.

The post Last Labyrinth Is A Nightmare-Fueled Escape Room Tailor-Made For VR appeared first on VRScout.

Turn Your Home Into A Mixed Reality Record Store With Spotify On Magic Leap

Playlists change depending on what room you’re currently in.

Spotify — the Starbucks Coffee equivalent of digital music streaming — this week announced its arrival on the Magic Leap One mixed reality headset, adding yet another entry to its list of compatible devices.

Available for download now via Magic Leap World, the Spotify mixed reality app allows Premium subscribers to access their personal catalog of music and playlists in a mixed reality setting. Using a spatially aware user interface, you can pin artists, tracks, albums, and playlists anywhere within your physical space, turning your home into a futuristic jukebox. Not only that, but you can also create personalized soundscapes that trigger certain playlists depending on which room you’re in.

“Share your sofa with up-and-coming bands, meet your idols in your kitchen, explore new albums in your garage,” says the company in the official blog post.

Alongside Spotify integration comes Background Music Service (BMS), a feature that allows you to continue listening to Spotify while you navigate other applications. For a totally uninterrupted experience, you can also navigate the Spotify app via your desktop, smartphone, or tablet device. Spotify is the first app available on Magic Leap World to include BMS; as of this week, the feature has been made available to Magic Leap developers as well as Overture in Landscape. 

Image Credit: Justin_tech

“We see a time in the not too distant future when spatial computing will extend into the wider world of podcasts, audiobooks and storytelling,” continues Magic Leap in their post. “And this is just the beginning. The launch of Spotify marks an evolution in the way you can see, hear and experience the bands and artists that you love. It’s an exciting time for music. For musicians. For developers. And for music-lovers the world over.”

Magic Leap users can download Spotify right now for free via Magic Leap World. Magic Leap currently features a generous catalog of immersive mixed reality tools, games, and applications, including Star Wars: Project Porg, Magic Leap: Undersea, Dr. Grordbort’s Invaders, Angry Birds FPS: First-Person Slingshot and numerous other impressive experiences.

Feature Image Credit: Magic Leap

The post Turn Your Home Into A Mixed Reality Record Store With Spotify On Magic Leap appeared first on VRScout.

Fighting Fear of Public Speaking with Easy Speech: Training in Virtual Reality

How can executives and speakers overcome their fear of public speaking? With virtual reality (VR) and artificial intelligence! With its VR app Easy Speech, the publisher Dashöfer puts you in a virtual meeting room where you can give a presentation and train your soft skills.

The only way to truly overcome your fears is to face them. The Hamburg-based specialist publisher and seminar provider Dashöfer knows this, too.

Because, according to a 2016 study, roughly 60 to 70 percent of all people, and 58 percent of surveyed executives, are afraid of speaking in public in front of an audience, the publisher developed the VR app Easy Speech.

Easy Speech: The Virtual Reality App Against Speech Anxiety

Easy Speech runs on the Oculus Go VR headset. The application is designed to help people overcome their fear of public speaking and stage fright.

To that end, Easy Speech places us in a virtual conference room where ten people sit at a conference table and look at us. In front of them, we are expected to deliver an unscripted presentation.

“We projected real people, not avatars, into the virtual seminar room in order to create a lifelike situation for participants,” explains managing director Fabian Friedrichs of the realistic depiction of the audience.

“Just as in a real seminar, there are inattentive listeners and hecklers.”

Different Modules and Settings

We can choose between different modules and, for example, give an impromptu speech or a sample presentation. It is also possible to upload our own presentation via Wi-Fi.

There are also various difficulty levels and settings. The audience can listen attentively, but it can also be distracted.

Sometimes we hear disruptive, almost impertinent heckling, or a smartphone rings and the siren of a passing ambulance blares.

Easy Speech thus simulates stressful situations that can distract an insecure speaker or throw them off completely. In the simulation, we are expected to respond to these as confidently as possible.

Artificial Intelligence Provides Feedback

But so that speakers don't merely face their fear in the virtual room, and instead genuinely train their soft skills and become more self-assured as a result, Easy Speech includes an artificial intelligence (AI).

The AI records and analyzes every presentation. Immediately after an exercise, the speaker receives feedback on the following parameters: Did I use too many filler words? Did I maintain even eye contact with all listeners? And: Did I speak too slowly or too quickly?

The AI evaluates these and other factors and summarizes them in a point system. A maximum of ten points can be earned per parameter. Afterwards, the speaker can also listen back to their complete presentation.

A VR headset with the Easy Speech application installed currently costs 2,999 euros. This includes a license for 30 users as well as installation and setup by Dashöfer.

Easy Speech Doesn't Solve the Underlying Problem

The question, then, is how successfully the technology can really help people overcome their fear of public speaking.

After all, we are usually afraid when we feel uncomfortable or insecure. The real problem lies not in the fear itself but in what stands behind it: a lack of self-esteem and insecurity.

With regard to presentations specifically, we can of course dissolve these feelings of inadequacy through training and routine.

Those who speak in front of audiences often will eventually find public speaking perfectly normal, regardless of how foreign or uncomfortable the particular setting feels.

In this sense, Easy Speech is an excellent tool, and above all a very good supplement, for training soft skills and becoming more self-confident in the process. The underlying problem, the lack of self-esteem, is something you can only solve yourself by working on yourself.

Virtual Reality Versus Imagination

Moreover, communication and presentation skills can also be trained on your own, using your imagination, provided it is vivid enough.

Easy Speech naturally makes it very easy for us: we are thrown into a lifelike conference room and respond to the projection with corresponding feelings such as stress and excitement.

In principle, though, we can also imagine such a situation ourselves and deliver our presentation that way. In that case, however, we receive no AI-based feedback.

Anyone who places great value on numbers, data, and facts will thus find Easy Speech a fine tool for improving their soft skills. But a good presentation doesn't have to be perfect; it has to be human. After all, it is always also about how comfortable we feel in front of, and with, our colleagues and customers.

If, for example, we present a product to a customer and notice during the presentation that we feel uncomfortable and are not on the same wavelength personally, does a partnership or collaboration even make sense?

The Working World Is Changing

Many companies are moving further and further from conventional toward more collegial working models. At least as important as an applicant's professional qualifications, for example, is whether they personally fit the team.

The working world needs more people who make decisions with their hearts again rather than just their heads. If we feel comfortable with ourselves and with our colleagues, customers, and superiors, we have no need to fear meetings and presentations.

That is the fundamental basis we should create. Then a VR app like Easy Speech can also be used successfully to train communication and presentation skills.




This app uses AI and augmented reality to get you speaking a new language

When you consider the prospect of learning a new language, chances are your mind flashes back to those old high school classes where you repeated useless phrases like “Where is the library?” or “What color is the car?”

Thankfully, language learning has come a long way since then. Whether you’re at home or on the go, you can get conversant in a new language with Mondly, the app that combines cutting-edge tech with an innovative curriculum to get you speaking a new language faster.

Mondly makes good use of the latest things your phone can do. Its speech recognition is first-rate, allowing the app to tweak your pronunciation as you go. The aim is to not only teach you the phrase, but have you sound like you say it every day.

Another great add-on is MondlyAR, which is a really fun way to learn on the go. It can pick up on everyday objects around you and prompt you to identify them in your new language of choice.

But the real standout here is the actual curriculum. Mondly breaks each language down into bite-sized lessons, each one building on the last. You get a sense of accomplishment at each step, and it actually gets you excited about moving on to the next set of words and phrases.

Right now, you can get a variety of plans on sale. A lifetime subscription to Mondly with access to all languages is 95% off the retail price.




17th-Century Shipwreck Comes To Life In VR

In the depths of the cold North Atlantic near the coast of Iceland lies the wreck of a Dutch ship that sank 360 years ago while pretending to be Danish.

At the time, the Netherlands (and all European nations) were barred from trading with Iceland by the country’s ruler, the king of Denmark. But Dutch smugglers skirted the ban by sailing to Icelandic ports in ships that flew a false Danish flag.

One of the smugglers’ ships, named “Melckmeyd” (“Milkmaid”), met a violent end, smashed by a storm on Oct. 16, 1659. The sunken vessel lay forgotten on the sea bottom for centuries. But recent efforts by archaeologists and digital modelers have made the long-lost shipwreck accessible through a virtual reality (VR) “dive.” As a digital model, Milkmaid can be explored by VR users through a headset or as an interactive video on YouTube.

Local divers found the Milkmaid wreck in 1992 near a small island called Flatey, off Iceland’s western coast. There, the frigid waters preserved much of the ship’s 108-foot-long (33 meters) lower hull in exceptional detail, representatives with the recent digital reconstruction project said in a statement.

The ship sank with a full cargo of fish, and one crew member died during the escape, project leader Kevin Martin, a doctoral candidate at the University of Iceland, reported in July at the 23rd International Conference in Information Visualization in Paris.

The wreck was first investigated in 1993 by maritime archaeologists with the National Museum of Iceland. They identified Milkmaid as a flute ship, a type of merchant vessel that was common during the 17th century.

Then in 2016, Martin and other researchers from the University of Iceland and the Cultural Heritage Agency of the Netherlands conducted high-resolution scans of Milkmaid, generating a digital model of the battered ship. They then used that data to create a VR dive experience for an exhibit at the Reykjavik Maritime Museum, according to the statement.

During the three-minute animated video — described in the presentation as “2.5D” rather than true 3D — users can explore the underwater environment around Milkmaid as “divers,” looking around in 360 degrees as the camera “swims” over and past the wreck. Through this VR experience, anyone who can put on a headset or watch a YouTube video can instantly gain access to an important archaeological site and artifact, Martin and his co-author John McCarthy, a researcher with the College of Humanities, Arts and Social Sciences at Flinders University in Australia, wrote in the conference presentation.

“This approach maximizes the sense of immersion in the underwater environment and replicates as closely as possible the experience of diving for the non-diver,” the co-authors wrote.

Milkmaid was just one of a fleet of illegal ships sent by Dutch merchants to secretly carry grain, ceramics and timber to Icelandic ports in 1659, according to the statement. As Iceland’s oldest shipwreck, Milkmaid offers a glimpse of this troubled time in the country’s past, “when Denmark ruled the island and had a monopoly over trade here for a period of 200 years,” Martin said. “It shines a light on a fascinating period of Icelandic history.”



Enterprise VR: How VRgineers is Creating the Ultimate HMD for Simulation

VRFocus interviewed CEO and co-founder Marek Polčák.

The consumer virtual reality (VR) market is a tough industry to be in, for hardware and software companies alike. It’s not easy convincing the public to spend £400+ on a device that straps to their face for immersive entertainment. The same can’t be said for the enterprise sector, where demand seems to be steadily increasing thanks to VR’s range of use cases. It’s why some companies have either moved over or focused solely on business applications, as there’s more money to be made. Depending on the application, companies have a range of head-mounted displays (HMDs) to choose from, with only an elite few competing in the upper echelons, one of which is VRgineers’ XTAL headset.

VRgineers’ first product was the VRHero 5K Plus before its successor arrived in 2018, offering improved visuals, a greater field of view (FoV), automatic IPD adjustment (AutoEye) and integrated hand tracking thanks to built-in Leap Motion technology. An imposing-looking headset thanks to its width – the design allows for a 180-degree FoV – the XTAL is most certainly aimed at enterprise applications, most notably simulation.

With that in mind, XTAL boasts a 5K resolution – 5120 x 1440 (2560 x 1440 per eye) – from two Quad HD high-density OLED displays, offering companies plenty of resolution to make even the finest detail visible. It’s why car manufacturers like Audi and BMW use XTAL for both design iteration and sales applications, allowing customers to look at new models and customise them.

Having first encountered the XTAL HMD during CES 2019 and come away from the very brief demo rather underwhelmed – it does cost over £5,000 – VRFocus recently headed over to the Czech Republic to visit VRgineers’ headquarters. There will be a deeper hands-on article later this week, but before that VRFocus interviewed VRgineers’ CEO and co-founder Marek Polčák.

Polčák is certainly passionate about VR, having started in the industry by offering immersive tours of Prague. The trouble was that at the time the headsets weren’t up to the challenge, unable to provide the sort of quality Polčák wanted. So he co-founded VRgineers alongside Martin Holecko and Vaclav Bittner in a bid to create a device which could offer the quality he was after.

Today, that has led to XTAL, a headset which mixes high-end industrial components with consumer tech – it supports SteamVR and Lighthouse tracking. Check out Polčák’s interview below to learn more about VRgineers and what the company is working on for the future. For VRgineers updates, keep reading VRFocus.




Designing Safer Social VR

Using the ideology of sexual consent to make social VR a better place.

Imagine it’s your first time entering a social virtual reality experience. You quickly set up an avatar, choosing feminine characteristics because you identify as female. You choose an outfit that seems appropriate, and when you’re done, you spawn into a space. You have no idea where you are or who is around you. As you’re getting your sea legs in this new environment, all the other avatars look at you and notice that you’re different. Strange avatars quickly approach you, asking inappropriate questions about your real-life body; touching and kissing you without your consent. You try blocking them, but you don’t know how. You remove your headset fearing that you don’t belong in this community.

New worlds, old problems

The above role play is based on various accounts of avatar harassment in social VR applications, reported by women in recent years. In 2016, Taylor Lorenz, a staff tech writer at The Atlantic, garnered attention from various social VR start-ups by sharing her experience in a virtual reality chat room. In that essay, she describes being greeted with unsolicited “virtual kisses” and asked about her real-life body by multiple users, noting that she felt ripped from the virtual world and transported back to middle school.

Soon after, VRChat publicly vowed to make safety a top priority after VR game designer Katie Chironis shared a graphic recording of sexual harassment in one of its chat rooms. After that, a 2018 study conducted by Jessica Outlaw for the VR communication service Pluto reported that nearly half of female-identifying VR participants had experienced at least one instance of virtual sexual harassment. And while these cases are unique in the broader harassment landscape, they are a notable facet of an emerging market.

As female designers working in VR, my co-worker Andrea Zeller and I decided to join forces on our own time and write a comprehensive paper. We wrote about the potential threat of virtual harassment, instructing readers on how to use body sovereignty and consent ideology to design safer virtual spaces from the ground up. The text will soon become a chapter in the upcoming book: Ethics in Design and Communication: New Critical Perspectives (Bloomsbury Visual Arts: London).

After years of flagging potentially-triggering social VR interactions to male co-workers in critiques, it seemed prime time to solidify this design practice into documented research. This article is the product of our journey.

The illusory virtual self

So why did we feel like we needed to take action on social VR harassment? Because when you’re in VR, interactions can feel real. During an early social VR demo, we discovered a bug that caused avatar hands to stick together when two users were in a virtual room. Two participants who didn’t know each other in real life found themselves holding hands in VR, and when they took off their headsets they were blushing, as if they had really held hands.

This sensation of experiencing a virtual body as your own is called “virtual embodiment.” Take, for example, the “virtual hand illusion,” a VR variant of the classic “rubber hand illusion,” studied by VR researcher Mel Slater. When a visible fake hand (in this case, a virtual one) is placed in front of a test subject, they tend to process potential sensations and threats inflicted on the fake hand as real experiences. This is an example of how the brain can form a connection to a foreign body.

Images: “Inducing Illusory Ownership of a Virtual Body,” Frontiers in Neuroscience (2009) by Slater, Perez-Marcos, Ehrsson, Sanchez-Vives.

When this happens in virtual space, and someone threatens or violates your virtual body, it can feel very real. This is particularly worrisome as harassment on the internet is a long-running issue; from trolling in chat rooms in the ’90s to cyber-bullying on various social media platforms today. When there’s no accountability on new platforms, abuse has often followed — and the innate physicality of VR gives harassers troubling new ways to attack. The visceral quality of VR abuse can be especially triggering for survivors of violent physical assault.

According to Abraham Maslow’s hierarchy of needs, feeling safe is a basic human need — in any place. And since social VR places have many of the hallmarks of real-world social places, we should be crafting safety into our virtual experiences. It’s important that we do it now, while social VR is still young and the standards are being set. Safety and inclusion need to be the virtual status quo. This notion is likely so obvious to us because, as women, we think a lot more about safety in real life.

Don’t believe us? See this Jackson Katz experiment: he asks men and women what they do daily to avoid being sexually assaulted. For women, the list begins with, “Hold my keys as a potential weapon, check the back seat before I get in the car, don’t drink too much, don’t leave my drink unattended, carry mace, don’t have a listed number […]” and continues seemingly indefinitely. While for men, this isn’t something they think about; their go-to answer was, “Nothing.”


We knew that it was important to look at the problem of virtual reality harassment from our unique perspective as women in VR, and we started by looking at consent language. Having written our paper in the year of #MeToo, we had a lot of consent-focused debate in the media to take inspiration from.

We started with primary definitions of consent, such as, “all people should have complete ownership of their bodies and any interactions that should occur to them,” a quote from Jaclyn Friedman and Jessica Valenti’s Yes Means Yes!: Visions of Female Sexual Power and A World Without Rape (Berkeley: Seal Press). We grew that practice into looking at body sovereignty and ownership as an interactive principle to ensure safe, inclusive social VR spaces and help maintain a healthy virtual embodiment.


Fostering safety in virtual spaces

Well, that’s all well and good, but how do we — as designers — bring consent, body sovereignty, and respect into the virtual world? By empowering people with easy-to-understand social norms, accessible tools, and appropriate behavior engagement. Our theory was that we could develop these features by looking for consent-acquisition paradigms in the real world and proposing virtual equivalents.

To begin this process of digitizing consent, we knew it would be critical to understand how people perceive appropriate behaviors in the real world. In our day-to-day lives, there is etiquette in how we interact with people. You don’t wear your pajamas in public. You don’t skip the line or cut somebody off in traffic. And if this does happen, you can take action to stop that behavior. VR has very similar social modalities to what we experience in our real lives but, because VR is such a nascent format, the social norms we experience in reality have yet to be applied. In order to bring equity to VR, we would have to pull in real-world conduct expectations.

So, to create codes of conduct for VR, we looked to the factors that make up our real-world environments. Proxemics — a term coined by anthropologist Edward T. Hall — refers to the relationship between your identity, your surroundings, and the social norms of the community around you. Hall divides experiences into zones of distance from the body.

Diagram: Illustration of Edward T. Hall’s Zones of Interpersonal Space

Proxemics can be viewed as four distinct categories: intimate, personal, social, and public. The boundaries of these zones help us understand what is appropriate at various distances. In the real world, each zone has an established code of conduct that offers explicit rules for what behaviors are acceptable and unacceptable. We can use these zones to help people understand what behavior is appropriate at specific times and locations.

Using proxemics as a spatial scale, we can define explicit structures for behavior expectations and build natural boundaries into virtual social relationships. By separating these regions, we’re able to look at consent-acquisition models unique to each and provide VR equivalents, cumulatively building an infrastructure for virtual safety. This results in inspiration for consent introspection tailored to each zone — the architecture of our code of conduct.
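As a rough sketch of how a social VR system might apply this spatial scale, the snippet below classifies an interpersonal distance into one of Hall's four zones. The metric thresholds follow Hall's commonly cited ranges; the function name and structure are illustrative, not any shipping platform's API.

```python
# Hall's proxemic zones as distance thresholds (approximate metric
# equivalents of his published ranges).
HALL_ZONES = [
    (0.45, "intimate"),   # up to ~0.45 m
    (1.2,  "personal"),   # ~0.45-1.2 m
    (3.6,  "social"),     # ~1.2-3.6 m
]

def proxemic_zone(distance_m: float) -> str:
    """Classify an interpersonal distance (meters) into a proxemic zone."""
    for limit, zone in HALL_ZONES:
        if distance_m < limit:
            return zone
    return "public"  # anything beyond ~3.6 m

print(proxemic_zone(0.3))   # intimate
print(proxemic_zone(2.0))   # social
```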

As we go through each zone, we will accompany our inclusive design suggestions with examples from various social VR experiences.


Intimate space

Let’s begin with the nearest zone: intimate space. In the real world, an example of this would be a bedroom. To build safety in intimate virtual spaces, we suggest designers build granular controls that are easy to access and surfaced before intimate interactions begin. It’s important that people can customize and control the types of experiences they’re willing to have with other people in these close quarters before they happen.

The inspiration for this comes from the real-world intimate consent paradigms found in “Yes, No, Maybe” charts. These are procedures — often used by the BDSM community — in which individuals may list all intimate acts imaginable and categorize them into (1) experiences they would enjoy, (2) experiences they don’t ever want, and (3) experiences they’re not sure about. These individuals would then share these lists with each other before engaging in any precarious intimate acts.

In VR, we can empower people by allowing them to define their ideal experience up front, to avoid violation in their digital intimate space. Our example here is from Rec Room, and shows granular controls for interactions within the Experiences tab of the Settings panel. This dialog allows people to define how close other users can get to them by setting the parameters of their personal safety bubble before any interaction happens.
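The underlying mechanic of a safety bubble can be sketched in a few lines. Assuming a hypothetical settings object (the field and function names below are invented for illustration and are not Rec Room’s actual API), interactions from avatars inside a user-defined radius are simply suppressed unless the user granted consent up front:

```python
from dataclasses import dataclass, field

@dataclass
class SafetySettings:
    """Hypothetical per-user safety preferences, defined before any interaction."""
    bubble_radius_m: float = 1.0           # radius of the personal safety bubble
    allow_inside: set = field(default_factory=set)  # users granted consent up front

def interaction_allowed(settings: SafetySettings,
                        other_user_id: str,
                        distance_m: float) -> bool:
    """Block interactions inside the bubble unless consent was granted beforehand."""
    if distance_m >= settings.bubble_radius_m:
        return True
    return other_user_id in settings.allow_inside

s = SafetySettings(bubble_radius_m=1.0, allow_inside={"friend_42"})
print(interaction_allowed(s, "stranger", 0.5))   # False
print(interaction_allowed(s, "friend_42", 0.5))  # True
```

The key design point matches the article’s argument: the consent decision is made ahead of time, in settings, rather than negotiated in the moment of a close encounter.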

Personal space

Next, let’s look at personal space. In the real world, an example would be a living room or other shared household space.

To build safety into virtual personal spaces, we can look at how medical practices negotiate consent through nonverbal cues. Specifically, we took our inspiration from the way the National Institutes of Health secures ongoing consent from deaf participants in clinical trials using universal gestures. Designers should incorporate simple communication gestures and easy-access shortcuts to allow their users quick-action remediation in tough situations. These simple shortcuts can allow users to quickly report a problematic experience without interrupting or further degrading their experience.

We designed the upcoming Facebook Horizon with easy-to-access shortcuts for moments when people would need quick-action remediation in tough situations. A one-touch button can quickly remove you from a situation. You simply touch the button and you land in a space where you can take a break and access your controls to adjust your experience.

Social space

In the real world, an example of a social space could be a college campus. To make social virtual spaces safer, we can refer to the unspoken conduct agreements that keep interactions appropriate in specific environments.

We looked at the rule sets colleges create to prevent on-campus assault, and at how campuses need to state those rules explicitly in order to reinforce them. Designers can introduce local behavior expectations in VR social spaces by creating conduct codes customized to the activities of the space and weaving them into its fabric.

Our example of local behavior codes is from the [now-defunct] social VR app, Facebook Spaces. As people entered a room that belonged to a specific Facebook group, we set expectations for conduct in this space with these rules. Designers can reinforce these sorts of local behavior expectations by administering rewards to users who uphold the rules or report violators.

Public space

And finally, public spaces. In the real world, a great example of a public space could be a public park or an entire city: any place in which you could potentially meet any kind of person. To ensure inclusivity in public virtual spaces, we can look to real-world legal systems for inspiration, specifically their definitions of consent, evaluations of public behavior violations, and criminal consequences. We should consider comparably universal rules and persistent consequences for virtual violation and harassment.

For example, VRChat created a universal system (across all their worlds) that defines appropriacy and allows people to report offensive behavior. By pushing timely consequences to violators, these systems reinforce conduct expectations.

More than zones

As VR designers, we hold the unique opportunity to imagine worlds unbound by reality’s constraints. When approaching the responsibility of constructing new social environments — regardless of how surreal they may be — we should remind ourselves to treat virtual embodiment with the same respect given to physical bodies, even if the reality we inhabit often fails to do so.

It is our responsibility to design innately safe virtual spaces and interactions, laying the groundwork for a future of inclusive, secure and empowering VR communities.

A safe future is in our virtual hands.



Image: Michelle gets chased around Rec Room by a man dumping water on her


Red Dead Redemption 2 VR Mod Not Coming Anytime Soon, Says GTA V VR Mod Creator

For those of you hoping that the creator of the recent GTA V VR mod might be able to work some magic on the recently-released Red Dead Redemption 2 PC port, don’t hold your breath. For a multitude of technical reasons, the game won’t be modded for VR in the same fashion as the recent GTA V VR mod anytime soon.

Luke Ross, the creator of the recently-released GTA V VR mod, responded to a Reddit thread asking about whether RDR2 might get the same VR treatment now that it has been released on PC. He shut down the idea pretty fast.

Ross wrote that he finished some benchmarking sessions with RDR2 and “the news is not good”.

The GTA V VR mod works by using alternate eye rendering, meaning frames are rendered alternately for each eye in VR. If the mod were running at 60 frames per second, the game would alternate rendering a frame for each eye, resulting in an effective 30 frames per eye. Ross wrote that RDR2 on default settings “struggles to reach 40 fps on a system with i9-9900K, 32 GB dual channel, GTX 1080 Ti, and of course SSD.” With alternate eye rendering, that means an effective 20 frames per eye. Ross suggested “there is no way that during 2020 a new GPU will come out that will be able to render RDR2 at the combination of quality, resolution, FOV and frame rate that is needed for proper VR.”
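The arithmetic behind Ross’s numbers is straightforward: alternate eye rendering halves the effective per-eye frame rate, since each eye only receives every other frame. A quick sketch (the function name is ours, purely for illustration):

```python
def per_eye_fps(total_fps: float) -> float:
    """With alternate eye rendering, each eye receives every other frame,
    so the effective per-eye rate is half the total render rate."""
    return total_fps / 2

# GTA V VR mod at 60 fps -> 30 frames per eye
print(per_eye_fps(60))  # 30.0
# RDR2 struggling to reach 40 fps -> 20 frames per eye
print(per_eye_fps(40))  # 20.0
```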

“Please don’t take this to mean that the game is not ‘optimized,’ as kids on the forums love to say,” wrote Ross. “At 1080p, it pushes more than 80 fps (in other terms more than needed) while looking gorgeous.”

There are also other technical hurdles that an RDR2 VR mod would hypothetically face during development, but the biggest problem is that modern hardware can’t push the game at a high enough resolution and frame rate to make it comfortable.

Were you holding out for an RDR2 VR mod? Let us know what you think in the comments below.

The post Red Dead Redemption 2 VR Mod Not Coming Anytime Soon, Says GTA V VR Mod Creator appeared first on UploadVR.

John Carmack Transitions To ‘Consulting’ Technical Role For Facebook’s Oculus

Facebook’s technical VR guide at Oculus, John Carmack, is transitioning to a new “consulting” role as his interests extend to areas outside virtual reality.

Carmack joined Oculus in 2013 as the startup’s Chief Technology Officer and played a critical role in popularizing the idea of modern consumer VR before that. He continued guiding technical thinking at Facebook after the ad giant acquired Oculus in 2014.

Here’s the full text of Carmack’s update which says he still has a voice in VR development work, but it will only be “a modest slice of my time”:

Starting this week, I’m moving to a “Consulting CTO” position with Oculus.

I will still have a voice in the development work, but it will only be consuming a modest slice of my time.

As for what I am going to be doing with the rest of my time: When I think back over everything I have done across games, aerospace, and VR, I have always felt that I had at least a vague “line of sight” to the solutions, even if they were unconventional or unproven. I have sometimes wondered how I would fare with a problem where the solution really isn’t in sight. I decided that I should give it a try before I get too old.

I’m going to work on artificial general intelligence (AGI).

I think it is possible, enormously valuable, and that I have a non-negligible chance of making a difference there, so by a Pascal’s Mugging sort of logic, I should be working on it.

For the time being at least, I am going to be going about it “Victorian Gentleman Scientist” style, pursuing my inquiries from home, and drafting my son into the work.

Runner up for next project was cost effective nuclear fission reactors, which wouldn’t have been as suitable for that style of work.

On the Joe Rogan podcast recently, Carmack said: “I think that we will have — we will potentially have — clear signs of [Artificial General Intelligence] maybe as soon as a decade from now. Now lots of people disagree; the majority of scientists working on it think ‘oh, it’s gonna be at least a few decades,’ and you still have a few holdouts who say ‘oh, it can’t happen at all.’ But I’m a strict materialist. I think our minds are just our body in action, and there’s no reason why we can’t wind up simulating that in some way.”

Earlier this year Carmack described work being done at Elon Musk-backed brain-computer interface company Neuralink as “very bold.”

The post John Carmack Transitions To ‘Consulting’ Technical Role For Facebook’s Oculus appeared first on UploadVR.