Apple shipped the Vision Pro in February at $3,500. It delivers exactly what Apple demoed: floating iPad-style apps that look physically present, a MacBook Pro's worth of compute packed into a headset, and the best single-person home cinema available. It is also too heavy to wear all day, limited to two hours on battery, and has no VR games. Meta has been building toward this category since its $2 billion Oculus acquisition in 2014. Apple arrived from the opposite direction, betting that general computing, not games, is the right foundation.

The real question is not the price or the weight. Both will fall. The question every reviewer lands on is: what is it for? Benedict Evans frames the Vision Pro as a stress test for the entire spatial computing thesis. Previous headsets could hide behind future hardware promises. This one cannot. It has a display system good enough to qualify as a real computer, and it still struggles to justify itself as one. Evans is skeptical that more screen real estate is the future of productivity, and even more skeptical that 3D interfaces solve a problem users actually have. He draws a direct comparison to the Apple Pencil: technically flawless for a decade, and largely irrelevant to how people actually compute.

The piece is worth reading in full because Evans does not stop at the hardware critique. He works through why the mobile analogy (start with existing apps, then wait for native experiences) may not hold for spatial computing the way it did for smartphones. He also raises a question no one has answered cleanly: if AI systems are moving toward showing users less and summarizing more, a bigger and more immersive display may be moving in exactly the wrong direction at exactly the wrong time.

[READ ORIGINAL →]