All posts by Daniel

Biology and Physics on Station Today Promote Moon Mission Success in 2024

A moonrise from 2016 is photographed from the space station with the Earth’s limb, the atmospheric glow and the aurora below.

The six residents aboard the International Space Station kicked off the workweek today exploring microgravity’s long-term impacts on biology and physics. The Expedition 59 crew is also ramping up for a fourth spacewalk at the orbital lab this year.

NASA is planning to send men and women to the Moon in 2024 and life science on the station will help flight surgeons keep lunar astronauts healthy. The space physics research will also provide critical insights to engineers designing future spacecraft and habitats for exploration missions.

Several dozen mice, whose immune systems are similar to humans’, are being continuously observed in specialized habitats. Flight Engineer Anne McClain tended to the mice today, cleaning cages and restocking food in Japan’s Kibo laboratory module. Doctors are testing the hypothesis that the immune response decreases in space and exploring advanced vaccines and therapies benefiting both astronauts and Earthlings.

NASA astronauts Christina Koch and Nick Hague also researched a variety of space biological phenomena. Koch wrapped up a pathogen study today seeking to understand why virulence increases in microgravity. Hague cleaned up Veggie Ponds botany hardware in Europe’s Columbus laboratory module where small crops of edible plants are grown. He then photographed protein crystal samples in the afternoon for a student-designed investigation as Koch assisted him.

David Saint-Jacques of the Canadian Space Agency recorded a video demonstrating Isaac Newton’s Second and Third Laws. The video will help young students understand how force and acceleration influence air and space missions. He also transferred data captured from tiny internal satellites exploring space debris cleanup technology.
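As a quick refresher, the two laws in the demonstration can be stated as follows. The thruster figures below are illustrative, not from the video; the roughly 420,000 kg station mass is a commonly cited approximation:

```latex
% Newton's Second Law: acceleration is proportional to net force
F = m\,a
% Newton's Third Law: forces come in equal and opposite pairs
\vec{F}_{AB} = -\vec{F}_{BA}
% Illustrative example: a 1000 N thruster burn on a ~420,000 kg station
a = \frac{F}{m} = \frac{1000\ \text{N}}{420{,}000\ \text{kg}} \approx 0.0024\ \text{m/s}^2
```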

Commander Oleg Kononenko and Flight Engineer Alexey Ovchinin are getting ready for a spacewalk planned for May 29. The cosmonauts are resizing their spacesuits, inspecting the components and checking for leaks today. The duo will remove experiments, sample station surfaces and jettison obsolete hardware during their six-hour excursion.

AR Gaming Robot ‘MekaMon’ Adds Support For STEAM Education

Reach EDU looks to expand science, tech, engineering, arts, and mathematics through AR robots.

Reach Robotics, the creator of the world’s first AR gaming robot, MekaMon, has launched the next generation of EdTech with a new coding platform – Reach EDU.

If you’re not familiar with Reach Robotics’ line-up of kid-friendly robots, MekaMon are programmable machines that players control via a smart device to battle digital opponents—both real and AI—in an augmented space, adding an extra layer of visual excitement to conventional playtime. Now the company is adding even more educational value in the form of STEAM-based learning with Reach EDU.

Learning through play is a key part of the Reach EDU experience. Originally announced at CES 2019, the Reach EDU app has officially launched and now works alongside the existing MekaMon augmented reality gaming app, bringing together creative STEAM learning and advanced robotics in one approachable package.

The World Economic Forum recently listed problem-solving, critical thinking, and creativity as the top three skills children need for success. Reach EDU Missions have been created using gamification techniques to reward and challenge students in ways that encourage creative problem-solving and deeper engagement with coding concepts.

The Reach EDU app incorporates engaging storytelling to guide learning. Led by Ivy, the Head Engineer at the Mekacademy, students learn to code their MekaMon in a series of game-like challenges in preparation for a mission to Mars.

The app is structured around four core features which can be used independently of the missions to encourage creative play and experimentation:

  1. Free Drive: Experiment freely with MekaMon’s fluid movement and lifelike animations.
  2. MekaDraw: Trace a line across the screen and MekaMon will follow.
  3. MekaMotion: Code directly on the MekaMon robot by moving each of its limbs to build up a series of commands through stop motion animation.
  4. MekaCode: Code direct commands in Scratch-based block coding.
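The MekaMotion and MekaCode features both boil down to the same idea: build up a sequence of commands, then replay it on the robot. Purely as an illustration, here is a minimal sketch of that pattern; the `MekaBot` class and its method names are invented for this example and are not the real Reach SDK.

```python
# Hypothetical sketch of the "record commands, then replay them" pattern
# behind MekaMotion-style coding. Not the actual Reach Robotics API.

class MekaBot:
    def __init__(self):
        self.log = []  # commands recorded so far

    def step_forward(self, n=1):
        self.log.append(("step_forward", n))

    def turn(self, degrees):
        self.log.append(("turn", degrees))

    def replay(self):
        """Return the recorded sequence, as a stop-motion replay would run it."""
        return list(self.log)

# Build up a routine command by command, as a student would in the app.
bot = MekaBot()
bot.step_forward(3)
bot.turn(90)
bot.step_forward(1)
print(bot.replay())  # [('step_forward', 3), ('turn', 90), ('step_forward', 1)]
```

The same command log could just as easily be produced by Scratch-style blocks (MekaCode) or by physically posing the robot’s limbs (MekaMotion); only the input method differs.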

Additionally, Reach EDU is designed with a range of learners and experience levels in mind.

For those just getting started, MekaDraw, MekaMotion, and MekaCode help build familiarity with coding concepts. Experienced learners can unlock a world of new programming potential with further applications such as Swift Playgrounds.

For students further along in their studies, including those at degree and postgraduate level, third-party educational platforms are being developed to work with MekaMon, in addition to the existing Swift Playgrounds integration.

These will include a Reach Raspberry Pi processing module and a browser version of Reach EDU incorporating Python, to support the KS3 and KS4 curriculums and open the platform up to advanced experimentation.

Reach EDU is the realization of the primary inspiration behind MekaMon. Reach Robotics CEO, Silas Adekunle, created the first prototype while teaching in schools. He quickly learned that robotics and gaming captured students’ attention, making learning infinitely more engaging.

“There’s a huge amount of creative potential with MekaMon, due to the scope of its expressive movement and personality. Reach EDU is about delivering the tools to take advantage of this, by creating a versatile, accessible, and fun platform for effective STEAM education and ongoing innovation.” – Silas Adekunle, CEO, Reach Robotics

Reach Robotics is exploring how its AR advances with MekaMon will evolve the Reach EDU experience. Reach EDU will receive ongoing updates, with new missions, content, and supporting materials continually being developed.

The Reach EDU app is available for free on the App Store and Google Play. MekaMon is available via Apple (in-store and online) and Amazon, among other retailers. Schools can order multiple units at a discount via edu@reachrobotics.com.

Image Credit: Reach Robotics

According to Change the Equation, the number of STEM jobs will grow 13 percent between 2017 and 2027, compared to 9 percent for non-STEM jobs, with computing, engineering, and advanced manufacturing leading the way.

With platforms like Reach EDU, today’s learners will be prepared to become future leaders through the power of creativity, problem-solving, advanced robotics, and fun learning experiences.

The post AR Gaming Robot ‘MekaMon’ Adds Support For STEAM Education appeared first on VRScout.

Google Reveals Glass Enterprise Edition 2 AR Headset

The latest iteration of Glass features an improved camera and a more powerful CPU for $999.

Earlier today, Google announced the latest addition to its Google Glass hardware line-up with Google Glass Enterprise Edition 2. The company promises the new AR headset will help businesses increase the efficiency of their employees by offering them hands-free access to a world of information in real time.

Available exclusively for enterprise use, this newest iteration is built on Qualcomm’s Snapdragon XR1 platform and features a new AI engine, offering users noticeably more powerful hardware. This has resulted in considerable improvements to power and performance—including an improved camera for higher-quality video streaming and collaborative features—and opens up the possibility of computer vision and advanced machine learning.

To help protect all the delicate technology stuffed into the sleek device, Google teamed up with Smith Optics to create a set of Glass-compatible safety frames capable of withstanding the harsh conditions of working environments such as manufacturing floors and maintenance facilities.

Here’s a detailed spec breakdown of the Google Glass Enterprise Edition 2 headset:

  • SoC — Qualcomm Quad Core, 1.7GHz, 10nm
  • OS — Android Oreo
  • Memory & Storage — 3GB LPDDR4 / 32GB eMMC Flash
  • Wi-Fi — 802.11ac, dual-band, single antenna
  • Bluetooth — 5.x AoA
  • Camera — 8MP, 80° DFOV
  • Display — 640×360 Optical Display Module
  • Audio Out — Mono Speaker, USB audio, BT audio
  • Microphones — 3 beam-forming microphones
  • Touch — Multi-touch gesture touchpad
  • Charging & Data — USB Type-C, USB 2.0 480Mbps
  • LED — Privacy (camera), power (rear)
  • Battery — 820mAh with fast charge
  • IMU — Single 6-axis Accel/Gyro, single 3-axis Mag
  • Power Saving features — On-head detection sensor, eye-on-screen sensor
  • Ruggedization — Water and dust resistance
  • Weight — 46g (pod)
Image Credit: Google

Thanks to multiple improvements to hardware, Google Glass Enterprise Edition 2 allows employees to collaborate remotely in real-time via live video streaming, reference helpful documentation, and safely access specific applications using hands-free voice commands.

Those interested in incorporating this technology into their own workplace will be delighted to hear that Glass Enterprise Edition 2 features a much simpler development process, due in large part to its Android foundation and support for Android Enterprise Mobile Device Management.

For more information visit google.com/glass.

The post Google Reveals Glass Enterprise Edition 2 AR Headset appeared first on VRScout.

Google Lens and Search: New Visual Features and AR Support

This is probably one of the most important pieces of AR news in recent months. If, in the future, Google Search shows us AR content and lets us place it in the room around us, that would be a very important development for AR. It also shows how far-reaching it is for companies to move into 3D models and 3D worlds and to build up competencies and skills in this area. Upskilling is the theme: making employees fit for AR and 3D.
Is your organization 3D-ready?, Torsten Fell

 

I have collected two articles on this topic here; enjoy.

 

Source article:

https://germany.googleblog.com/2019/05/google-suche-und-lens-mit-ar.html

New information is best illustrated by, as the word itself suggests, making it directly visible. Today at I/O we presented new features for Google Search and Google Lens. Using the camera, computer vision, and augmented reality (AR), certain content can be displayed in your real-world surroundings, making your everyday life even easier.

AR in Google Search

A flying great white shark? Google Lens makes it possible

In the coming weeks we will roll out a new feature in Google Search that lets users place 3D objects from Search directly into their surroundings. This not only makes it possible to examine certain things more closely; you also get a better feel for their actual shape and size. It is one thing, for example, to read that a great white shark can grow up to five and a half meters long. You only get a real sense of that when you see one in front of you at its actual size. In the future, when you search for certain animals, you will be able to view some of them in 3D and AR directly in the Knowledge Panel.
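Rendering an object "at its actual size" ultimately comes down to a simple scale computation: divide the real-world length by the model's native bounding-box length to get a uniform scale factor. A minimal sketch follows; the 5.5 m shark length comes from the text, while the 2.2-unit bounding box is an invented example value.

```python
def true_scale_factor(real_length_m, model_bbox_length):
    """Uniform scale that makes a model's longest axis match its real-world size."""
    if model_bbox_length <= 0:
        raise ValueError("model bounding box length must be positive")
    return real_length_m / model_bbox_length

# A great white shark up to 5.5 m long, authored as a 2.2-unit 3D model:
scale = true_scale_factor(5.5, 2.2)
print(scale)  # ≈ 2.5
```

An AR runtime would apply this factor to the model's transform after anchoring it to a detected surface, so the shark appears at life size in the camera view.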

An animated 3D model shown as a search result for “muscle tension”

We are also working with partners to display their content in 3D in Google Search in the future. Soon you will be able to bring many more three-dimensional objects to life right before your eyes, straight from Search, whether anatomical models of the human body or a new pair of sneakers.

New features for Google Lens
People have already used Google Lens more than a billion times to learn more about the things around them. For the most precise answers possible, Google Lens draws on machine learning (ML), computer vision, and billions of facts in the Knowledge Graph. Starting now, we are working to provide even more visual answers to these visual questions.

Imagine you are in a restaurant and cannot decide what to order. Google Lens automatically highlights the most popular dishes on the menu. If you then tap a dish, photos and reviews on Google Maps instantly show you what it actually looks like and how other guests have liked it so far.

What should I eat? Google Lens helps with the decision.

But how exactly does this work? First, Google Lens recognizes all the dishes on the menu and distinguishes them from their descriptions based on font size, typeface, and color. Then the names of the dishes are matched against the corresponding photos from the restaurant’s reviews on Google Maps.
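Google has not published how this matching works internally. Purely as an illustration, the "match dish names to review photos" step can be approximated with simple fuzzy string matching; the menu items, captions, and photo filenames below are invented example data.

```python
import difflib

menu_items = ["Spaghetti Carbonara", "Margherita Pizza", "Tiramisu"]

# Captions attached to review photos (invented example data)
photo_captions = {
    "the carbonara was perfect": "photo_014.jpg",
    "best margherita in town": "photo_027.jpg",
    "tiramisu for dessert": "photo_031.jpg",
}

def best_photo(dish, captions):
    """Match a dish name to the closest review-photo caption, if any."""
    words = dish.lower().split()
    best, best_score = None, 0.0
    # Score each caption by its closest word-level resemblance to the dish name
    for caption, photo in captions.items():
        for cap_word in caption.split():
            for w in words:
                score = difflib.SequenceMatcher(None, w, cap_word).ratio()
                if score > best_score:
                    best, best_score = photo, score
    return best if best_score > 0.8 else None

print(best_photo("Spaghetti Carbonara", photo_captions))  # photo_014.jpg
```

A production system would of course combine visual signals (the photo itself) with text, but the sketch shows why distinguishing dish names from descriptions first matters: only the names are worth matching.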
Google Lens is especially helpful when you are on vacation and do not understand the local language. In that case, simply point the camera at the text you want translated. Google Lens then displays the translation directly on top of the original text, in more than 100 languages!

Google Lens translates the text and displays it directly over the original

But we are working on even more ways to connect helpful digital information with the real world. At the De Young Museum in San Francisco, for example, Google Lens gives you hidden background stories from the curators about the paintings. Or, if you want to cook a recipe from a cooking magazine, simply point your camera at the recipe and you can get started.

Recipes come to life with Google Lens

Lens for Google Go
Worldwide, more than 800 million adults struggle to read everyday documents such as bus schedules or bank forms. With Google Lens, these and other texts can now simply be read aloud by your smartphone.
All you have to do is point your camera at the text in question. The individual words are highlighted as they are read, so you can follow along and understand the text in its full context. You can also tap individual words to look up their meaning. This feature is initially available in Google Go, the search app for smartphone newcomers. The Lens feature for Google Go is only 100 KB in size and even works on smartphones that cost 50 euros.
—– More articles on this topic —–

Source article:

Google Lens and Search: New Visual Features and AR Support

At Google I/O today, Google presented several new features for Search as well as Google Lens. The new features were even confirmed for our region, and Google plans to roll out the new visual functions within the next few weeks, sprucing up Google Search and Google Lens with new visual extras.

For example, Google showed how 3D models and AR content will now be embedded in Google Search when browsing certain topics or products. For instance: you search for a particular animal or a sneaker and can view it in augmented reality directly in your surroundings.

In the case of an animal, this helps you gauge its size; with an item of clothing, you can see right away how it fits with the rest of your wardrobe. Google plans to roll out the new AR feature for Search within this month. As partners, Google has already named organizations and companies such as NASA, New Balance, Samsung, Target, Visible Body, Volvo, and Wayfair.

Google Lens, in turn, is getting a feature that can recognize text, on signs for example, via the camera and read it aloud directly. While reading, Google Lens even highlights the words currently being read, making it easier to follow what Lens is telling you. It will even be possible to tap specific words and get further explanations of them. Translations are also possible this way, with Google Lens translating the content both visually, in the image, and verbally, when reading aloud.

In addition, Google Lens can highlight popular dishes on restaurant menus and pull up reviews and pictures of those dishes from Google Maps. So if you do not know what to order in a restaurant, you now get extra help. That can of course be particularly useful when traveling abroad, for example when you cannot even read the menu because it is written in Japanese characters.

For museums, in turn, the idea is that Google Lens can directly display additional information about paintings or other artworks, information provided directly by the museum’s curators. This effectively gives you a free digital tour right on site. One last new feature: you browse a magazine with recipes, point your camera at one, and get a video tutorial for cooking the recipe.

Specifically for translations, Google explicitly named German as an available language, and this Lens feature is expected to roll out here in the coming weeks. Let’s see how it goes. The features definitely sound extremely exciting and could also help people with visual or reading impairments, for example.


Watch This Guy Actually Walk Through An Apex Construct Level On Quest

Apex Construct Quest Tracking

Oculus Quest units are beginning to slip out into the wild ahead of official launch tomorrow. One of the coolest uses of the standalone VR headset we’ve seen thus far? Someone playing an entire level of Apex Construct… using only their legs. That’s right, their actual legs, not their fake virtual ones.

YouTuber Jugon Virtual just posted this video of the Quest port of Fast Travel Games’ debut. In it, he tackles an entire level of the game by physically walking through it. Jugon runs around a football field covering roughly 6,050 square meters, battling robots and dodging projectiles.

It’s pretty cool to see. Jugon is able to skip backward when he’s rushed by exploding enemies and jump around cover to avoid incoming fire. At one point he’s even brave enough to roll onto his back. Quest’s inside-out positional tracking is able to handle all of this with the help of four onboard cameras. The tracking isn’t quite as extensive as, say, the original Oculus Rift, but it’s close enough.

Of course, most of us won’t have an entire field to play Apex Construct in. We’ll have to make do with the teleport and artificial locomotion options the game provides. The Quest port consists of the entire original game and includes recent updates too.

Oculus Quest launches tomorrow and Apex Construct will be one of the first games you can buy for it. If you already own it on Oculus Rift via Oculus Home then you’ll get it for free. We thought the port of the game was first-rate.


The post Watch This Guy Actually Walk Through An Apex Construct Level On Quest appeared first on UploadVR.

Superhot VR Quest Review: The Best Version Of A Genuine Classic


When we first reviewed Superhot VR nearly three years ago we said this:

“SUPERHOT VR is a pure, distilled, injection of unadulterated adrenaline that will get your blood pumping just as quickly as time stops in the game itself. With every movement you make, time creeps forward ever so slightly, and everything from the level design to the way it feels to dodge a series of bullets in slow-motion is orchestrated to reinforce the core ideals of the experience.”

It’s a testament to those ideals that, in mid-2019, Superhot remains arguably the best first-person shooter in VR. In fact, on Oculus Quest, it’s even better than it ever was. It’s somewhat fitting that time has been so kind to a game that’s all about manipulating it.

As with most Quest ports, Superhot’s development team managed to squeeze the entire original game onto Oculus’ new standalone. Unlike most others, though, it’s survived the transition with barely any noticeable concessions. Save for a few inconsequential lighting drawbacks and slightly slower loading speeds, Superhot VR is just as crisp and striking as the PC VR versions. Granted, this was never the most visually demanding game, but it arguably looks even better than the 2017 PSVR port.

More importantly, though, Quest’s tether-free tracking provides a more open, liberating version of the game than what’s come before. Previously Superhot was a game of two battles; one inside the headset and one outside. All of your moves had to consider the physical limitations of the cord connecting you to a PC or console. On Quest, that simply isn’t an issue. The game’s dystopian narrative often asks you to ‘Prove Your Devotion’ and now you can by throwing yourself to the floor and spinning around behind you without the worry of tangling yourself up or yanking a PC off of a desk.

My Quest playthrough was my third time running through Superhot (it’s the only VR game I’ve completed twice, let alone a third time). It’s nothing short of remarkable how fresh, relevant and immediate the game still feels in 2019. Every element of Superhot feels organic in VR, from the way it commands your body to bend and twist with slow-motion precision to the stylish flair of catching a gun mid-air and shooting a blank-faced goon seconds before his knife reaches your eyes. It’s a game about being in control, a game in constant pursuit of empowering the player. There’s nothing else in VR that articulates these emotions as consistently.

It’s just a shame there still isn’t anything ‘new’ to speak of here. While Superhot’s post-game is more robust than it used to be, with speedrun and skill-based modes, we’re way past due for extra levels. If you have already played through the game, it probably isn’t worth reinvesting (the game doesn’t support cross-buy with Oculus Rift) unless you’re jonesing for another playthrough.

Final Score: 9/10 – Amazing

Superhot VR’s hypnotic blend of physical, cinematic action is just as entrancing as it’s ever been on Quest. In fact, the lack of wires truly allows you to devote yourself to its endlessly entertaining levels. It might not be worth a second purchase for existing owners but, for those that haven’t played it already, this is the best version of a genuine classic.

Superhot VR launches on Oculus Quest on May 21st for $24.99 and is already available on Oculus Rift, HTC Vive, Windows VR, and PSVR headsets. Read our Game Review Guidelines for more information on how we arrived at this score.

The post Superhot VR Quest Review: The Best Version Of A Genuine Classic appeared first on UploadVR.

Everybody’s Golf VR Review: Swinging For The Green


Everybody’s Golf VR is the type of game that shouldn’t work as well as it does. The PS Move controllers are archaic by modern technology standards, with significant issues in terms of jitter, tracking consistency, and tracking coverage compared to the competition, but somehow Sony and its stable of developers continue to make things work.

To be clear: Everybody’s Golf VR is not a super-accurate golf simulation. This is not a replacement for actually practicing and should not be treated as a 1:1 golfing experience, but it’s damn fun. Maybe it’s the bright, colorful visuals, the cheery voice-over and animations, or just the fact that it makes me feel like I’m better at golf than I really am; maybe it’s a combination of all three. Either way, this is an excellent example of how to adapt a sports game for VR.

Everybody’s Golf VR can be played with either a single PS Move controller or the DualShock 4. When playing with Move, it works a lot like you’d expect: you swing the controller wide to hit the ball and actually have to pay attention to your wrist rotation and placement. The biggest issue was drift with the PS Move controller, something that has plagued PSVR since launch.

Like I said earlier, tracking is good enough and better than it should be but is still far from ideal. A game like this would shine even more with better tracking, but it gets the job done. I found myself really getting into things after a few courses and played standing up with my body turned to the side just as I would on an actual golf course. This isn’t a sports simulation so I didn’t mind if the shot was a bit wonkier than I intended or thought it would be based on my swing. As it stands, there’s great “pass the headset” style multiplayer appeal here even if that isn’t actually supported. The lack of multiplayer in general feels like a major missed opportunity.

You don’t need a PS Move controller to play Everybody’s Golf VR though, at least not technically. Since the DualShock 4 has a light bar, you can hold it and swing it like you would the PS Move, but it just feels awkward, and since the light bar is flat rather than rounded on top, the tracking is even worse. Playing with a DualShock 4 left a lot to be desired and felt like a tacked-on feature. I wouldn’t recommend that anyone play the game this way.

Between the practice range for hitting, a practice green for putting, and a handful of courses (it looks like there are four from what I can see), there is a good amount of content. Each course can be mirrored to offer a different experience, and there are 3-hole, 9-hole out, 9-hole in, and 18-hole variations. There is very good variety between the courses available. You can also unlock new tees, holes, clubs, and caddies, as well as outfits for each caddy.

At the end of the day it just feels like there needed to be a bit more to round things out. The reception area is finely detailed and has a good country club vibe, but I was left hoping for a more robust campaign of some kind rather than single courses.

Visually it looks great. The art style lends itself very well to the PSVR and the bright colors look excellent, especially on a PS4 Pro. Replaying holes to do better is extremely addictive, especially with all of the unlocks built into the game.

Final Score: 7.5/10 – Good

Everybody’s Golf VR is a solid adaptation of the franchise for the PSVR. The gameplay is extremely fun and engaging, even if lacking in terms of accuracy a bit due to the limitations of the PSVR as a platform. I was left wanting multiplayer support and more courses to pick from, but the variety offered within each course and amount of unlockables available provides plenty of goodies for fans to dig into.

Everybody’s Golf VR releases tomorrow on PSVR for $29.99. Read our Game Review Guidelines for more information on how we arrived at this score. 


The post Everybody’s Golf VR Review: Swinging For The Green appeared first on UploadVR.

Netflix Is Officially Coming To Oculus Quest At Launch


Despite the fact that the Oculus Quest standalone VR headset is primarily a gaming device, plenty of people anticipated using it for media streaming as well. After all, that’s one of the most popular use cases for the Oculus Go. Now we’ve finally got confirmation that Netflix will be coming to Quest and it’s slated as a day one launch app alongside Oculus Video, Sling TV, FOX Now, Red Bull, and various other video streaming portals.

Obviously you still need an active Netflix subscription to use the app, but it’s great to know that people will be able to watch their favorite shows and movies from a virtual theater inside their headset. We don’t believe there will be multi-user support so you can’t invite friends over in VR to watch along with you, but multiplayer theater apps like Bigscreen can be used as a workaround.

On the Oculus Go you can watch videos from Facebook and a few other select sources in Oculus Rooms with your friends, but sadly that and other social apps aren’t announced for launch on Quest.

Oculus announced YouTube VR was also officially coming to Quest, offering a dedicated way to view VR and 360 content without needing to open the Browser. The dedicated Netflix app on Quest should offer a similar functionality, although we don’t expect support for downloading content for later viewing. You would have to sideload the Android version of Netflix to do that.

Are you excited about having Netflix in VR on a wireless, standalone, 6DOF VR headset? Let us know down in the comments below!


The post Netflix Is Officially Coming To Oculus Quest At Launch appeared first on UploadVR.

Oculus Quest Game Library Preview Livestream: Launch Day Lineup


Curious about how we livestream the way we do? Then look no further than this handy guide for general tips and this guide specific to our Oculus Quest setup.

Do you know what day it is? It’s the day before the Oculus Quest releases: the standalone VR headset finally arrives tomorrow, May 21. So naturally we are back with some more wireless, standalone, roomscale streaming for all of your beautiful eyeballs to see.

Last week we had a ton of fun showing off lots of new games we hadn’t played before, and today we’re back with the same setup. The visuals will be a fisheye view of the right eye, but we can now stream the entire library, not just what’s available via Chromecast. And we have finally decided to stream directly to YouTube, since that’s where the vast majority of our audience is located. This way we can focus specifically on that platform.

The stream is planned to start around 2:00 PM PT and we’ll aim to last for about an hour or two. We’ll be livestreaming to the UploadVR YouTube directly. You can see the full stream embedded right here down below once it’s up:

You can see our most recent archived streams over in our YouTube playlist. There’s lots of good stuff there, so make sure to subscribe!

And please let us know which games or discussions you want us to livestream next! We have lots of VR games in the queue that we would love to show off more completely.

Featured image credit: VRGameCritic


The post Oculus Quest Game Library Preview Livestream: Launch Day Lineup appeared first on UploadVR.

Index Specs Were Driven by Valve’s VR Game Dev Teams

When revealing the upcoming Index headset, Valve was clear that its goal was to move the bar forward for VR fidelity, even if that meant a premium price tag. The company said that its internal game developers working on “AAA VR content” pushed the VR hardware team to reach the fidelity they wanted.

Valve’s Index headset was fully revealed at the end of April. And while the $1,000 full kit is more than twice as much as the Rift S ($400), the company believes the headset will offer the best experience for its upcoming “flagship VR game,” which hasn’t been revealed yet but is confirmed to launch later this year.


And that certainly makes sense, as Valve says that its VR game teams were pushing for greater capabilities from the hardware.

“Valve game teams requested increased fidelity to support AAA VR content development, which in turn drove Index’s specific technical innovations,” the company wrote in press materials shared with Road to VR.

Those “technical innovations” likely refer to Index’s dual-element optics, super-low-persistence display, class-leading refresh rates, and the headset’s unique and surprisingly good ‘off-ear’ headphone design.

Image courtesy Valve

It isn’t just specs and performance that Valve’s game teams were looking for, though; during a press reveal of the headset last month, the company’s VR hardware team said that the internal teams building VR content kept talking about creating “full-length” experiences, and needed a headset with the long-term comfort to match.

“There isn’t one single factor that makes this HMD great, it’s all of these things together that contribute,” a member of the VR hardware team said.

Meeting those performance and ergonomic needs—”fidelity first” as the VR hardware team said—was priority number one for Valve, even knowing that it would necessitate a premium price tag. The company was clear about who the headset is positioned for.

“Valve Index is for experienced, existing VR customers who want more [fidelity] and don’t want to wait,” the VR hardware team said.

And that fits the bill for VR’s early adopters, many of whom bought into the PC VR space at $800 for a Vive or Rift, not because it was cheap, but because of the promise of immersion. Index represents a real step forward in fidelity over first-gen VR headsets, while other soon-to-launch headsets like Rift S and Quest primarily focus on low cost and ease of use.

– – — – –

Valve Index began pre-orders in limited regions on April 30th, and the first headsets are due to ship on June 28th, though it’s already backordered by two months or more. The Index ‘full kit’, which includes the headset, controllers, and tracking base stations, is priced at $1,000, while the headset and controllers can be bought without base stations for $750, and the headset by itself for $500.

The post Index Specs Were Driven by Valve’s VR Game Dev Teams appeared first on Road to VR.