The Expedition 60-61 trio from the United States, Italy and Russia lifts off Saturday at 12:28 p.m. EDT aboard the Soyuz MS-13 spacecraft. They will dock at the station's Zvezda service module less than six-and-a-half hours later, 50 years to the day after NASA first landed humans on the Moon.
About two-and-a-half hours later the Soyuz and station hatches will open and they will enter their new home in space. NASA astronauts Nick Hague and Christina Koch and station Commander Alexey Ovchinin of Roscosmos will greet their new crewmates then hold a ceremony with family, friends and mission officials on the ground.
NASA TV is broadcasting all the activities live, with launch coverage beginning Saturday at 11:30 a.m. Docking coverage begins at 6 p.m. as the Soyuz begins its approach to the orbiting lab. Finally, NASA TV’s live coverage of the hatch opening and crew welcoming ceremony begins at 8 p.m.
Visualizing BIM and CAD projects in real-time 3D offers many advantages, but the workflow for creating these interactive, immersive experiences needs streamlining. That’s why we recently announced Unity Reflect, our new product that creates interactive real-time 3D experiences for any device from Autodesk Revit in one click. It helps every project stakeholder, from ownership to design and construction teams, make the right design decisions in real-time.
At the AIA Conference on Architecture (A’19) last month, we announced Unity Reflect, our new product for the architecture, engineering, and construction (AEC) industry. In a nutshell, Unity Reflect enables key stakeholders to review federated BIM/CAD models on the device of their choice and make better design decisions together. Unity Reflect is accessible to both technical and non-technical users.
In case you missed it, check out our previous blog post for more details on how it works or watch this 1-minute video.
Since taking the wraps off this product, we’ve demoed Unity Reflect to thousands of AEC professionals at A’19 as well as at Autodesk University London 2019. The industry response has been overwhelmingly positive. A few highlights:
See what others had to say in AEC Magazine, BIMstore, and ENR. Unity Reflect is currently available in limited private release and will be available to the public in the fall. Make sure you stay in the loop on all things Unity Reflect by signing up for our mailing list below.
Unity Reflect: Key questions
As we approach the launch of Unity Reflect, we thought it would be helpful to share answers to the five most commonly asked questions we heard at A’19, AU London, and elsewhere.
How do I get early access to Unity Reflect?
While we would love to give access to everyone, Unity Reflect is not an open beta program or preview release, so only a limited number of AEC firms will be able to get early access. Please contact your Unity business development manager, who can put you forward for consideration. If you are not a Unity customer, reach out here. Slots are filling up fast, so don’t delay!
Which design applications will Unity Reflect support?
We will support Autodesk Revit and Trimble SketchUp out-of-the-box and are developing an API to extend support to other third-party applications. While this will comprise the initial set of integrations available at launch, that’s just the start. We plan to release many more integrations over time based on the needs of our customers.
Can I use the Unity Editor in tandem with Unity Reflect?
You do not need access to the Editor to use Unity Reflect. The beauty of Unity Reflect is you don’t have to learn new software and can instead continue working from your go-to design tools. In fact, you don’t even need software development skills of any kind. With one click, Unity Reflect does all the complex work of preparing, federating, and transferring BIM/CAD data to any Unity-supported platform, making it easy to review models simultaneously on phones, tablets, computers, and AR devices.
However, if you are a Unity developer, you can bring this data into the Editor to customize and build upon these projects to create the best experience for your end users. Once your data is in the Editor, you can add anything Unity supports to your project, such as the Unity Particle System, Terrain to create landscapes, and Skyboxes to represent the sky of your choosing. You can also take advantage of Unity’s extensible platform to seamlessly deploy to over 25 platforms, including AR and VR wearables.
Does Unity Reflect have the ability to overlay BIM models localized to the real world?
Unity Reflect is built with AR Foundation and supports the latest and greatest features of Apple’s ARKit and Google’s ARCore. This allows a user to place a model in the real world in a localized way, but the alignment of models to the real world has to be done manually.
How does BIM filtering work?
Because Unity Reflect preserves BIM/CAD metadata, you can zero in on different parts of the central, federated model in real-time 3D (as shown in the image above). From the structural columns and framing to doors and furniture, Unity Reflect makes it easy to get an interactive, immersive 3D view of various building components to inform design decisions.
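To make the idea concrete, here is a minimal sketch of category-based filtering over a federated model. The element structure and category names are illustrative only; they are not Unity Reflect's actual API or data format.

```python
# Hypothetical federated model: elements merged from several source files,
# each carrying its preserved BIM category metadata.
elements = [
    {"id": 1, "category": "Structural Columns", "source": "structure.rvt"},
    {"id": 2, "category": "Doors", "source": "architecture.rvt"},
    {"id": 3, "category": "Furniture", "source": "architecture.rvt"},
    {"id": 4, "category": "Structural Framing", "source": "structure.rvt"},
]

def filter_by_categories(elements, categories):
    """Return only the elements whose BIM category is in the visible set."""
    visible = set(categories)
    return [e for e in elements if e["category"] in visible]

# Show only the structural elements of the federated model.
structural = filter_by_categories(
    elements, {"Structural Columns", "Structural Framing"}
)
print([e["id"] for e in structural])  # → [1, 4]
```

Because the metadata survives the transfer, the same filter logic works regardless of which source application each element originally came from.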
Forget the old phrase “you can’t judge a book by its cover.”
UK-based artist Alexander Ward was shocked to see his AR-enabled cover for Allowah Lani’s book I Am Love circulating online. When scanned with a smartphone camera, the book’s cover opens a 3D portal to a visually captivating psychedelic cosmos.
“It certainly was very surprising,” Ward explained, “[I] can’t imagine anyone expects their video to travel around merrily on Twitter gathering almost a million views.”
Ward said he didn’t even have a Twitter account — but he certainly does now, @wardyworks.
“For a while now I’ve been excited by the interesting new avenues this tech opens up for my art, and the book cover was a small snippet of what I’m working towards with that tech,” Ward said. “So it’s very encouraging to see the response that many others are excited by it also!”
The cover depicts a woman surrounded by, and composed of, celestial and mystical elements. Her hair intertwines creatures such as snakes and butterflies with light and faces, and she reaches out toward an unknown hand wrapped in gauze, one that feels like your own when you view the cover in AR.
“For this particular Augmented Reality book cover I created, I wanted to give the illusion of looking into the cover itself,” Ward said, “as if the book cover is a window or portal into the artwork.”
“I achieved this by separating the artwork and text into many individual layers, that I placed in receding layers of 3D depth, in a 3D program on the computer,” Ward said.
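The depth trick Ward describes can be sketched with simple pinhole projection. This is an illustrative model only, not his actual pipeline: layers placed at increasing depth behind the "window" of the cover shift less on screen as the viewer moves, which is what sells the illusion of looking through the cover.

```python
def parallax_offset(viewer_shift, layer_depth, window_depth=1.0):
    """Sideways screen offset of a layer as the viewer moves.

    A layer sitting at the window plane (layer_depth == window_depth)
    stays pinned to the cover; deeper layers drift by progressively
    larger amounts, approaching the viewer's own shift at infinity.
    Derived from projecting a point at layer_depth onto the window
    plane from the viewer's offset position (pinhole camera model).
    """
    return viewer_shift * (1.0 - window_depth / layer_depth)

# Viewer steps 0.2 units to the right; each receding layer slides a
# different amount, producing the portal-like parallax.
for depth in (1.0, 2.0, 4.0, 8.0):
    print(depth, round(parallax_offset(0.2, depth), 3))
```

The key design point is that every layer needs its own depth value; flattening the art into many separately placed layers, as Ward describes, is what makes the differential motion possible.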
Ward’s design works similarly to how an Instagram lens or Snapchat filter recognizes your face and then overlays an AR element that tracks your movements. Ward has also designed face filters in his similarly ethereal style.
“If for example there is a book that already has a million copies in circulation,” Ward explained, “I could make an effect, and it will work on all those million copies. Provided those covers are the same.”
A similar effect was achieved by the New Yorker in 2017 with a magazine cover that came to life in AR using the UNCVR app.
Want to try it yourself? Click this link to open up the AR effect (Facebook app required) and point the camera towards the book cover image designed by Alexander. You can also scan the QR code provided above.
I Am Love by Allowah Lani is a guide for those using psychedelics to enhance their spiritual growth, and a sequel to his 2016 book, Who Am I? Yoga, Psychedelics, and the Quest for Enlightenment.
Microsoft’s Japanese-speaking hologram steals the show at Inspire 2019.
Microsoft’s HoloLens 2 Mixed Reality headset was front-and-center at the 2019 Inspire conference as the company showcased their latest advancements in artificial intelligence as well as mixed reality hologram technology. During a keynote held yesterday at the Las Vegas-based event, Microsoft Executive Julia White unveiled an exciting new project that uses a combination of body and voice capture, Azure AI, and HoloLens 2 technology to convert a human presenter into a 3D hologram capable of delivering a presentation in any language.
White begins the keynote by donning a Microsoft HoloLens 2 headset and holding out her open hand so the headset can see it, at which point a miniaturized green model of White, or “mini-me” as she refers to it, warps into existence. After demonstrating how the tiny hologram tracks and follows her hand, White states a simple voice command: “Japanese keynote. Render keynote.” This causes the simplistic green model to disintegrate into a cloud of particles, which then realign to form a near photorealistic holographic rendition of White.
This new hologram then proceeds to deliver the rest of the keynote in Japanese, an especially impressive feat considering White isn’t fluent in the language. The Japanese-speaking hologram even sounds like White thanks to Microsoft’s Neural Text-To-Speech (Neural TTS) technology, which harnesses the power of AI to automatically create personal voice signatures based on simple voice samples.
Microsoft’s combination of mixed reality and Neural TTS technology could revolutionize how businesses, clients, and professionals across industries deliver live lectures, talks, and conference presentations, by letting them give personable and engaging presentations in multiple languages simultaneously around the world.
Much like the HoloLens 2, however, the technology is still in its early stages, and capturing a hologram remains a relatively expensive procedure. White’s hologram, for instance, was shot in near-perfect conditions at Microsoft’s dedicated mixed reality studio, which features professional lighting and camera equipment specifically designed for MR capture.
Regardless, White’s demonstration paints a fascinating picture for the future of live talks, Q&A’s, lectures, and various other forms of live presentations.
Echo VR, the hub for both ‘Echo Arena’ and ‘Echo Combat’, got a sizeable update today with a revamped lobby which turns zero-G into under-the-sea. A new Skirmish zone lets players hop into pick-up battles for practice before diving into matches.
While Echo VR offers up both Echo Arena and Echo Combat, the game’s lobby itself actually has plenty to do inside, and with today’s Summer Splash update there’s even more. The lobby has been transformed into an undersea paradise with new objects to play around with and new customizations to unlock. Echo VR has seen similar seasonal lobby makeovers for Winter and Halloween.
The Summer Splash lobby will be available until August 2nd at 12AM PT, and players can unlock some new customizations by completing an Echo Arena or Echo Combat match any time before it ends.
Players can also unlock some unique decals by participating in two scheduled Echo Arena events:
On July 25th from 10:00AM – 10:00PM PDT / 17:00 – 5:00 UTC, complete 1 Event Match to earn a special Summer decal (check the in-game poster on the event day for more info!).
On August 1st from 10:00AM – 10:00PM PDT / 17:00 – 5:00 UTC, complete 1 Event Match to earn a special Summer decal (check the in-game poster on the event day for more info!).
A new Skirmish zone has been added to the lobby as well. Similar to the Echo Arena practice area, this zone lets players fight one another in pick-up matches, like a game of laser tag.