All posts by Daniel

Rocket Rolls Out Ready to Launch New Station Crew on Apollo 50th

The Soyuz rocket stands at its launch pad in Kazakhstan
The Soyuz rocket that will launch three new Expedition 60-61 crewmembers to the station on Saturday stands at its launch pad in Kazakhstan. Credit: NASA/Joel Kowsky

A Soyuz rocket stands at its launch pad today at the Baikonur Cosmodrome in Kazakhstan ready to launch three new flight engineers to the International Space Station on Saturday. NASA astronaut Drew Morgan will embark on his first space mission with veteran station residents Luca Parmitano and Alexander Skvortsov.

The Expedition 60-61 trio from the United States, Italy and Russia will lift off Saturday at 12:28 p.m. EDT aboard the Soyuz MS-13 spacecraft. They will dock to the Zvezda service module less than six-and-a-half hours later, 50 years to the day after NASA first landed humans on the Moon.

About two-and-a-half hours later the Soyuz and station hatches will open and they will enter their new home in space. NASA astronauts Nick Hague and Christina Koch and station Commander Alexey Ovchinin of Roscosmos will greet their new crewmates then hold a ceremony with family, friends and mission officials on the ground.

NASA TV is broadcasting all the activities live, with launch coverage beginning Saturday at 11:30 a.m. Docking coverage begins at 6 p.m. as the Soyuz begins its approach to the orbiting lab. Finally, NASA TV’s live coverage of the hatch opening and crew welcoming ceremony begins at 8 p.m.

Microsoft Shows Off Language-Translating Hologram Powered By HoloLens 2

Microsoft’s Japanese-speaking hologram steals the show at Inspire 2019. 

Microsoft’s HoloLens 2 Mixed Reality headset was front-and-center at the 2019 Inspire conference as the company showcased its latest advancements in artificial intelligence and mixed reality hologram technology. During a keynote held yesterday at the Las Vegas-based event, Microsoft executive Julia White unveiled an exciting new project that uses a combination of body and voice capture, Azure AI, and HoloLens 2 technology to convert a human presenter into a 3D hologram capable of delivering a presentation in any language.

White begins the keynote by donning a Microsoft HoloLens 2 headset and holding out her open hand so that it can be seen by the headset, at which point a miniaturized green model of White, or “mini-me” as she refers to it, warps into existence. After demonstrating how the tiny hologram tracks and follows her hand, White states a simple voice command: “Japanese keynote. Render keynote.” This causes the simplistic green model to disintegrate into a cloud of particles, which then realign to form a near photorealistic holographic rendition of White.

Image Credit: Microsoft

This new hologram then proceeds to deliver the rest of the keynote in Japanese, an especially impressive feat considering White isn’t fluent in the language. The Japanese-speaking hologram even sounds like White thanks to Microsoft’s Neural Text-To-Speech (Neural TTS) technology, which harnesses the power of AI to automatically create personal voice signatures based on simple voice samples.

Microsoft’s combination of mixed reality and Neural TTS technology could revolutionize how businesses, clients, and other professionals across multiple industries deliver live presentations such as lectures, talks, and conference keynotes, allowing them to create personable, engaging presentations in multiple languages simultaneously around the world.

Image Credit: Microsoft

Much like the HoloLens 2 itself, however, the technology is still in its early stages and remains relatively expensive to produce. White’s hologram, for instance, was shot in near-perfect conditions at Microsoft’s dedicated mixed reality studio, which features professional lighting and camera equipment specifically designed for MR capture.

Regardless, White’s demonstration paints a fascinating picture for the future of live talks, Q&A’s, lectures, and various other forms of live presentations.

The post Microsoft Shows Off Language-Translating Hologram Powered By HoloLens 2 appeared first on VRScout.

This Psychedelic AR Book Cover Is Turning Heads Online

Forget the old phrase “you can’t judge a book by its cover.”

UK-based artist Alexander Ward was shocked to see his AR-enabled cover for Allowah Lani’s book I Am Love circulating online. When scanned with a smartphone camera, the book’s cover opens a 3D portal to a visually captivating psychedelic cosmos.

“It certainly was very surprising,” Ward explained, “[I] can’t imagine anyone expects their video to travel around merrily on Twitter gathering almost a million views.”

Ward said he didn’t even have a Twitter account — but he certainly does now: @wardyworks.

“For a while now I’ve been excited by the interesting new avenues this tech opens up for my art, and the book cover was a small snippet of what I’m working towards with that tech,” Ward said. “So it’s very encouraging to see the response that many others are excited by it also!”

This is the book cover image. Once you have opened the camera effect on your phone, you can point the camera towards this image and the Augmented Reality will unravel.
Image Credit: Alexander Ward, Wardy Works

The cover depicts a woman made of, and surrounded by, celestial and mystical elements. Her hair intertwines creatures like snakes and butterflies with light and faces, and she reaches out toward an unknown hand wrapped in gauze — one that, when viewed in AR, feels like your own.

“For this particular Augmented Reality book cover I created, I wanted to give the illusion of looking into the cover itself,” Ward said, “as if the book cover is a window or portal into the artwork.”

Image Credit: Alexander Ward, Wardy Works

“I achieved this by separating the artwork and text into many individual layers, which I placed at receding depths of 3D space in a 3D program on the computer,” Ward said.
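Ward hasn’t published his exact pipeline, but the layered-depth trick he describes is classic parallax: layers placed farther from the virtual camera appear to shift less as the viewer moves, which is what creates the sense of looking through a window. A minimal sketch of that relationship, with purely illustrative layer names and numbers (nothing here is from Ward’s actual project):

```python
def parallax_shift(camera_shift: float, focal_length: float, depth: float) -> float:
    """Apparent on-screen shift of a flat layer at `depth` when the
    camera translates sideways by `camera_shift`, using the
    pinhole-projection relation: shift = camera_shift * focal / depth."""
    return camera_shift * focal_length / depth

# Three illustrative layers: cover text (near), the figure (middle),
# and the cosmos backdrop (far). Depths are arbitrary example values.
layers = {"text": 1.0, "figure": 2.0, "cosmos": 8.0}
shifts = {name: parallax_shift(0.1, 1.0, depth) for name, depth in layers.items()}
# Nearer layers shift more than distant ones, which sells the
# "portal into the artwork" illusion as the phone moves.
```

Because the relationship is just inverse in depth, even a handful of flat layers is enough to read convincingly as a 3D scene.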

Ward’s design works similarly to how an Instagram lens or Snapchat filter recognizes your face and then adds a layered AR element that follows your movements. Ward has also designed face filters in a similar ethereal style.

“If for example there is a book that already has a million copies in circulation,” Ward explained, “I could make an effect, and it will work on all those million copies. Provided those covers are the same.”

A similar effect was achieved by the New Yorker in 2017 with a magazine cover that came to life in AR using the UNCVR app.

Image Credit: Alexander Ward, Wardy Works

Want to try it yourself? Click this link to open up the AR effect (Facebook app required) and point the camera towards the book cover image designed by Alexander. You can also scan the QR code provided above.

I Am Love by Allowah Lani is a guide meant for those using psychedelics in order to enhance their spiritual growth and is a sequel to his 2016 book, Who Am I? Yoga, Psychedelics, and the Quest for Enlightenment.

To check out more of Ward’s AR projects, visit

The post This Psychedelic AR Book Cover Is Turning Heads Online appeared first on VRScout.

Answering your top questions about Unity Reflect

Visualizing BIM and CAD projects in real-time 3D offers many advantages, but the workflow for creating these interactive, immersive experiences needs streamlining. That’s why we recently announced Unity Reflect, our new product that creates interactive real-time 3D experiences for any device from Autodesk Revit in one click. It helps every project stakeholder, from ownership to design and construction teams, make the right design decisions in real time.

At the AIA Conference on Architecture (A’19) last month, we announced Unity Reflect, our new product for the architecture, engineering, and construction (AEC) industry. In a nutshell, Unity Reflect enables key stakeholders to review federated BIM/CAD models on the device of their choice and make better design decisions together. Unity Reflect is accessible to both technical and non-technical users.

In case you missed it, check out our previous blog post for more details on how it works or watch this 1-minute video.

Since taking the wraps off this product, we’ve demoed Unity Reflect to thousands of AEC professionals at A’19 as well as at Autodesk University London 2019, and the industry response has been overwhelmingly positive.

See what others had to say in AEC Magazine, BIMstore, and ENR. Unity Reflect is currently available in limited private release and will be available to the public in the fall. Make sure you stay in the loop on all things Unity Reflect by signing up for our mailing list below.

Unity Reflect: Key questions

As we approach the launch of Unity Reflect, we thought it would be helpful to share answers to the five most commonly asked questions we heard at A’19, AU London, and elsewhere.

How do I get early access to Unity Reflect?

While we would love to give access to everyone, Unity Reflect is not an open beta program or preview release, so only a limited number of AEC firms will be able to get early access. Please contact your Unity business development manager, who can put you up for consideration. If you are not a Unity customer, reach out here. Slots are filling up fast, so don’t delay!

Which design applications will Unity Reflect support?

We will support Autodesk Revit and Trimble SketchUp out-of-the-box and are developing an API to extend support to other third-party applications. While this will comprise the initial set of integrations available at launch, that’s just the start. We plan to release many more integrations over time based on the needs of our customers.

Can I use the Unity Editor in tandem with Unity Reflect?

You do not need access to the Editor to use Unity Reflect. The beauty of Unity Reflect is you don’t have to learn new software and can instead continue working from your go-to design tools. In fact, you don’t even need software development skills of any kind. With one click, Unity Reflect does all the complex work of preparing, federating, and transferring BIM/CAD data to any Unity-supported platform, making it easy to review models simultaneously on phones, tablets, computers, and AR devices.

However, if you are a Unity developer, you can bring this data into the Editor to customize and build upon these projects to create the best experience for your end users. Once your data is in the Editor, you can add anything Unity supports to your project, such as the Unity Particle System, Terrain to create landscapes, and Skyboxes to represent the sky of your choosing. You can also take advantage of Unity’s extensible platform to seamlessly deploy to over 25 platforms, including AR and VR wearables.

Does Unity Reflect have the ability to overlay BIM models localized to the real world?

Unity Reflect is built with AR Foundation and supports the latest features of Apple’s ARKit and Google’s ARCore. This allows a user to place a model in the real world in a localized way, but the alignment of models to the real world has to be done manually.

How does BIM filtering work?

Because Unity Reflect preserves BIM/CAD metadata, you can zero in on different parts of the central, federated model in real-time 3D (as shown in the image above). From the structural columns and framing to doors and furniture, Unity Reflect makes it easy to get an interactive, immersive 3D view of various building components to inform design decisions.
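Unity hasn’t published Reflect’s filtering API, but conceptually, category-based filtering over preserved BIM metadata amounts to a simple predicate over tagged elements. A sketch of that idea in Python — the element data, field names, and function are hypothetical illustrations, not the actual Unity Reflect API:

```python
# Hypothetical BIM elements carrying a "category" metadata field,
# echoing the Revit categories named in the text above.
elements = [
    {"id": 1, "category": "Structural Columns"},
    {"id": 2, "category": "Structural Framing"},
    {"id": 3, "category": "Doors"},
    {"id": 4, "category": "Furniture"},
]

def filter_by_category(elements, visible_categories):
    """Return only the elements whose BIM category is toggled visible."""
    return [e for e in elements if e["category"] in visible_categories]

# Show structure only: columns and framing, hiding doors and furniture.
structure_only = filter_by_category(
    elements, {"Structural Columns", "Structural Framing"}
)
```

Because the metadata survives the trip from Revit, the viewer can apply this kind of toggle at review time rather than re-exporting a stripped-down model for each audience.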



Indie VR Gem ‘Racket: NX’ Launches on Quest with Cross-platform Multiplayer

Indie VR gem Racket: Nx (2018) launches today on Oculus Quest, with cross-platform multiplayer against the Rift version of the game. Playing out as a techno-future-infused mashup of Breakout and racquetball, Racket: Nx feels right at home on Quest thanks to the headset’s 360-degree tracking and lack of tether.

Lesser known but well received, Racket: Nx is a polished, high-energy game that feels like a far-flung imagining of racquetball fused with elements of Breakout. Players stand at the center of a 360-degree arena lined with neon targets that pulsate to the game’s excellent soundtrack. With a racket in hand, players smack the glowing orb to destroy some targets while avoiding others.

Having launched in Early Access back in 2017, Racket: Nx has been honed over the years, eventually hitting its full release for PC VR headsets in 2018 [our review]. Today Racket: Nx launches on Oculus Quest, and developer One Hamsa claims the game “is now tighter than ever, runs perfectly on the Quest, and without any significant compromises in visual fidelity or feel.”

On Quest, the game benefits from the headset’s 360-degree tracking and lack of tether. Priced at $20 (the same as the PC version), the game also offers cross-platform multiplayer between Quest and Rift. It’s unclear at the moment whether the Quest version supports cross-buy with the Rift version, or whether it can join multiplayer games with players on the Steam version. We’ve reached out to the developers for confirmation.

The post Indie VR Gem ‘Racket: NX’ Launches on Quest with Cross-platform Multiplayer appeared first on Road to VR.