The Vision Pro isn't designed to rely on any physical controller for input. The headset is said to use 12 cameras, five sensors, and six microphones to track a user's hand movements. It also apparently uses high-speed cameras and a ring of LEDs that project light patterns onto a user's eyes to understand where they're looking. To select an object in an app, such as a button, the user looks at it and then pinches their fingers together.

From Apple's demos, the hand-based controls appear to work well. Then again, gamers would argue that a physical controller is necessary for first-person shooters and other titles, and Apple didn't offer one for the Vision Pro. Presumably a Bluetooth gamepad would work, however, as gamepads are already supported on iOS. Meta has been experimenting with similarly letting users control its Quest headsets with just their hands and eyes.

Apple says the Vision Pro can track where a user is looking to understand when they're looking at an app or at someone nearby. If the wearer begins to look toward someone, the Vision Pro's glass screen is supposed to become transparent so the wearer can see their surroundings. That transparency also means users could, for instance, head to the kitchen to grab a snack without taking off the Vision Pro. When the wearer looks at an app, the glass is supposed to darken, immersing them in it.

The technology, called EyeSight, is also designed to give visual cues to people outside the goggles so they know whether they have the wearer's attention or are being ignored. If the wearer is ignoring you and looking at an app, for instance, you wouldn't be able to see their eyes.

Apple also demonstrated a spatial photography feature, designed to let the wearer capture recordings of their surroundings while wearing the headset.
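The look-then-pinch interaction described earlier can be sketched as a tiny state machine, assuming entirely hypothetical names (nothing here is Apple's visionOS API): eye tracking continuously updates which element is being gazed at, and a pinch confirms whatever the eyes currently rest on, much like a mouse click.

```python
# Hypothetical sketch of the look-then-pinch selection model described above.
# Class and method names are illustrative, not Apple's visionOS API.

class GazePinchSelector:
    def __init__(self):
        self.gazed_target = None  # element the eye tracker says we're looking at

    def update_gaze(self, target):
        # Eye tracking runs continuously, updating the gazed element each frame.
        self.gazed_target = target

    def pinch(self):
        # A pinch confirms whatever the eyes rest on, like a mouse click.
        return self.gazed_target


selector = GazePinchSelector()
selector.update_gaze("play_button")   # user looks at the Play button...
print(selector.pinch())               # ...and pinches to select it: play_button
```

The point of the split is that targeting (gaze) and confirmation (pinch) are independent channels, which is why no handheld controller is needed.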
The headset would signal to other people that it's recording, but the idea is that because the wearer can still see their surroundings while recording, they can experience the moment unencumbered, unlike when holding up a phone, and then relive it later.

The EyeSight technology is crucial to Apple's vision for a personal-computing headset: CEO Tim Cook has long said he envisions an experience that doesn't remove people from their surroundings the way a traditional virtual-reality headset might.

Immersive video calls have been a rumored feature of the Vision Pro, but it was unclear how the wearer would be displayed on calls made through a headset. "This is one of the most difficult challenges we faced in building Apple Vision Pro," Mike Rockwell, the leader of Apple's AR/VR project team, said at WWDC. "There's no videoconferencing camera looking at you, and even if there were, you're wearing something over your eyes."

Apple devised a novel solution: users take a scan of their face, and the headset uses machine learning to create a digital 3D version of it. When the user talks on FaceTime, inward-facing sensors are designed to detect facial movements and represent them as a digital persona to others on the call. If someone else on the call is also using a Vision Pro, the two would see each other in fully immersive 3D.

Google demonstrated similar technology, called Project Starline, in 2021; it creates a 3D representation of people on video calls as if they're all in the same room. The Quest Pro features face tracking to sync expressions with avatars, and Meta CEO Mark Zuckerberg has demoed similar technology for creating lifelike versions of a person's face that could be implemented in its metaverse.
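The sensor-to-persona step described above can be sketched roughly as mapping tracked facial-expression coefficients onto a face model. The coefficient names below mimic ARKit-style blend shapes for illustration only; the Vision Pro's actual persona pipeline is not public.

```python
# Hypothetical sketch: applying tracked facial-expression coefficients to a
# digital persona, in the spirit of the FaceTime feature described above.
# Coefficient names mimic ARKit-style blend shapes; the Vision Pro's actual
# pipeline is not public.

class Persona:
    def __init__(self):
        self.expression = {}  # blend-shape name -> weight in [0, 1]

    def apply(self, coefficients):
        # Clamp noisy sensor readings so they can't over-drive the face model.
        for name, value in coefficients.items():
            self.expression[name] = min(max(value, 0.0), 1.0)


persona = Persona()
# Inward-facing sensors report a smile and an over-range blink this frame.
persona.apply({"mouthSmileLeft": 0.8, "eyeBlinkLeft": 1.3})
print(persona.expression["eyeBlinkLeft"])  # clamped to 1.0
```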
But Apple will no doubt benefit from the popularity of iMessage and FaceTime for intimate conversations. Apple says Vision Pro users can perform other tasks during a FaceTime call, like collaboratively viewing a presentation or watching an Apple TV+ show together using SharePlay.

Apple also says a new R1 chip inside the Vision Pro processes a user's eye movements and ensures content appears in front of their eyes in real time, without lag.

When screen space isn't a limitation and your entire 360-degree surroundings are a canvas to project anything onto, it's possible to extend your desktop far beyond a physical monitor. Instead of working on your MacBook's small 13-inch display and constantly toggling through tabs, you could display windows on the wall in front of you. Apple says keyboards and mice are supported. Apple's tight integration through iOS makes this possible: Meta might have hoped to create an ecosystem between Quest and the Facebook social graph, but Cupertino already has a strong ecosystem spanning iOS, its Mac lineup, and iMessage. The ability to use a MacBook through the Vision Pro is possible only because of Apple's control of that ecosystem.

WWDC heavily touted entertainment, with Disney CEO Bob Iger taking the stage to announce the company would bring its content to the headset, starting with Disney+ at launch. It's easy to imagine how a captivating, fully immersive headset like the Vision Pro could be an ideal way to watch content. The headset features two 4K displays, one for each eye, and Apple's spatial audio feature could give movies and TV shows theater-like sound that seems to come from all around you. Additionally, SharePlay is expected to let anyone with FaceTime watch films and TV shows together through the headset. Apple also demonstrated the compelling idea of using the Vision Pro to watch movies on an airplane, distracting oneself from the usual claustrophobia of an airplane seat.
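The appeal of the "unlimited canvas" idea above can be made concrete with a little geometry: a virtual screen is as wide as the field of view it fills at a given distance. The 60-degree and 2-meter figures below are illustrative assumptions, not Apple's specifications.

```python
import math

# Back-of-the-envelope sketch of the "unlimited canvas" idea: the width of a
# flat virtual screen that fills a given field of view at a given distance.
# The 60-degree / 2-meter numbers are illustrative, not Apple specs.

def virtual_screen_width(fov_degrees, distance_m):
    # Simple geometry: width = 2 * d * tan(fov / 2).
    return 2 * distance_m * math.tan(math.radians(fov_degrees) / 2)


width = virtual_screen_width(60, 2.0)
print(f"{width:.2f} m wide")  # ~2.31 m, far wider than a 13-inch MacBook display
```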
The company says the Vision Pro should get about two hours on a charge.