Don’t get me wrong, it’s not perfect. But then again, neither were early mobile phones. However, just like early mobiles, ML1 is more about possibilities and a vision for the future, which it has in spades.
What’s so great about it?
Fundamentally, it’s great because it is joyous to use.
Contrast this with HoloLens which, for me, has always been frustrating to use. I could talk technicals here, about how the UI is intuitive, how the experiences are fun, how the control mechanisms just make sense, and so on. But that would be missing the point, because it’s not about individual features, but rather about the holistic experience. The best analogy from my personal experience is comparing Windows Mobile 6.x with early iPhones. On paper, they both had similar specs and features. In reality, one was (mostly) joyous, the other (mostly) frustrating.
In a nutshell, ML1 is fun, fun, fun!
What’s not so great?
As I suspect you already know if you’re reading this, the field-of-view is too small. This is important because it truly does impact some otherwise compelling experiences. For example, a big screen is just not fun when only part of it is visible at any one time. On the other hand, experiences like Tónandi do a great job despite the small field-of-view.
Beyond that, through observing closely and based on experience with other VR/AR devices, I notice that:
– The tracking is not perfect, but it’s very good for inside-out. I have not experienced any lag-induced motion sickness.
– Likewise, meshing the room isn’t perfect, but it’s again very good.
– The hardware is comfortable, but very delicate.
– Hand/eye control works pretty well, but needs to be usable from more applications.
– The number of “experiences” available is quite small at the moment.
Things I am eagerly waiting for
I don’t expect most of these things anytime soon, but there is no harm in dreaming!
- Live meshing fast enough to work with dynamic objects. That is, when one of our dogs comes into the room, I want to be able to put a sticky virtual object on the dog and have it move around with the dog.
- Lighting of virtual objects that takes into account the lighting in the room. (For example, if I’m in a darkish room and switch on a light, then virtual objects should appear to be illuminated from the direction of that light.)
- Pet dragons
- Shared environments. That is, multiple people in the same physical space, both magic-enabled, should be able to interact with the same virtual objects. For example, playing catch with a virtual ball. (I suspect this is possible to some degree already, but I haven’t been able to experience it.)
- Pet dragons
- Enough field-of-view and graphical fidelity to replace my monitors and code on virtual monitors. Or better, in a spatial environment that makes me more productive than the 2D world of monitors.
- Did I mention that I want a pet dragon?
I first blogged about Magic Leap back in 2014. I may have been a bit optimistic in that post about when the age of magic will arrive, but ML1 is a great leap along that path. Keep believing.