In August of this year, Intel chief executive Brian Krzanich appeared onstage during the company’s developers conference wearing a clunky VR headset. He then participated in a live demo that involved using a dollar bill to cut through (virtual) chunks of gold. The headset was called Project Alloy, Krzanich said, and Intel was officially working with Microsoft on the device to offer a “merged reality” experience.
With the race to get humans wearing face computers heating up, it’s no surprise that Intel decided to join the likes of Samsung, Google, Facebook, Microsoft, Sony, HTC, Qualcomm, and others in creating a solution that could hopefully, possibly, maybe help VR make the leap from ultra-nerdy to sorta-mainstream. And after having missed the boat on mobile, Intel is trying to ensure that it’s well-positioned for the next big wave of computing.
Intel’s pitch with Project Alloy is that it has figured out a way to offer a PC-like VR experience without the need to tether the headset to a computer system. The reference design has two processors, Intel’s RealSense 3D cameras, and a detachable, rechargeable battery. The first headset that ships will be based on Microsoft’s Windows Holographic desktop software.
Okay. So what’s “merged reality”?
It’s a good question, and it’s one I’ve been asking ever since Intel first dropped the term this summer. Fortunately, Intel was willing to let The Verge into its labs and let us try on an early version of a Project Alloy headset to get a better sense of how it works.
In short, merged reality means you can see real-world stuff in front of you, even while you’re wearing a full headset. The RealSense cameras on the headset capture images of the things in front of you and project them back into your virtual environment in milliseconds. If Microsoft’s HoloLens headset creates a layer of augmented reality on top of the real world, and Oculus Rift blocks you out entirely from the real world, Project Alloy falls into a bizarre place somewhere in between.
I could see my hands and take a selfie while wearing a full VR headset
While I was wearing the Project Alloy headset, I could still see my Verge colleagues Vjeran and Tyler standing nearby. I could use a physical pen as a tool within a virtual game. If someone handed me a piece of paper and a pen, I could see well enough to write a note — while I had the VR headset on. I could check the time on a real, physical watch. I could even take a selfie with my phone while wearing the headset.
Arguably, this type of virtual-world-meets-real-world interaction is safer because you’re able to see people and objects in front of you, even while your head is enveloped in a giant face computer. Project Alloy also eliminates the need for additional sensors or hand controllers in order to have an interactive experience. I once played an interactive Raiders of the Lost Ark-type game in a dark room while wearing the Samsung Gear VR headset, but that room was filled with sensors that made the experience interactive. My mind was blown the first time I tried Oculus Rift and Oculus Touch together, throwing balls and other objects into a virtual void, but I was also using hand controllers.
Heads floated in and out of frame, pixelated versions of people who were there in real life
The Project Alloy headset I tried was only version one, though, and it was easy to see why Intel isn’t rolling this one out to market. The prototype was ill-fitting on my head, with enough room on the underside to see the (real) floor below me. Headset fit, like headphone fit, is too subjective to judge harshly, but the device felt heavy, too. Intel says an ideal weight for a headset built off the Project Alloy reference design is 750 grams, or around 1.5 pounds.
The real-life objects I saw before me while I was wearing the headset — my hands waving in the frame, an Intel employee standing in the lab — were pixelated. Heads floated in and out of frame, depending on how close I was standing to a person in real life. Interactions with these digitized humans in my virtual world had a slightly morbid element to them, as though I was watching an old, low-quality home movie of someone waving and smiling at the camera, never quite sure whether that person existed in real life or not.
It was more of a conceptual experience than a demo of a finished product, a half-glimpse at what could be.
But Tim Parker, Intel’s vice president and GM of marketing for perceptual computing, insists that next year’s Project Alloy, the one that it plans to ship in the second half of 2017, will offer a better experience. The one I tried runs on an Intel Core processor built on last year’s Skylake architecture and uses an Atom chip for computer vision. These will be upgraded to a Kaby Lake processor and a chip from Movidius, which Intel acquired this fall. The two R200 RealSense cameras used in the current headset will be upgraded to a single 400 series RealSense camera, shown off at the same developers conference earlier this year. And the real Project Alloy headset will include the option for a discrete graphics card.
So far, Microsoft is the only company that has publicly committed to working with Intel on Project Alloy, although the idea is that this VR solution will be intriguing — and unique — enough to attract other potential partners. Intel is becoming well-known for making neat tech demos that don’t always find traction in the real world, and Project Alloy could absolutely become another one of those.
At the same time, if anyone is going to figure out how to strike the right balance between just-okay mobile VR and heavy-duty tethered VR headsets, it might as well be Intel, one of the world’s largest chipmakers. Intel knows there’s no way to win the virtual reality game if you don’t play, and so we have Project Alloy. As unfinished as it is, it’s still fun to slip in and out of reality with it.
Photos by Vjeran Pavic.
Video by Vjeran Pavic and Tyler Pina.