Early this month, Apple announced Apple Vision Pro, a new mixed reality headset that offers a fully three-dimensional user interface controlled by the user’s eyes, hands, and voice. The device is powered by visionOS, Apple’s spatial operating system. Yesterday, Apple announced the availability of the visionOS SDK, which will allow developers to build new app experiences for Apple Vision Pro.
In July, Apple will open developer labs in Cupertino, London, Munich, Shanghai, Singapore, and Tokyo to give developers hands-on experience testing their apps on Apple Vision Pro hardware and to provide support from Apple engineers. App development teams can also apply for developer kits to help them quickly build, iterate, and test directly on Apple Vision Pro.
- Developers can build new apps for Apple Vision Pro using the same foundational tools and frameworks they already know from other Apple platforms, including Xcode, SwiftUI, RealityKit, ARKit, and TestFlight.
- To help developers optimize 3D content for their visionOS apps and games, Reality Composer Pro, an all-new tool included with Xcode, lets them preview and prepare 3D models, animations, images, and sounds so they look great on Apple Vision Pro.
- Developers can also interact with their apps in the new visionOS simulator to explore and test various room layouts and lighting conditions.
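Because visionOS apps share the SwiftUI app lifecycle used on other Apple platforms, a minimal windowed app can be sketched as below. This is an illustrative sketch only; the type names `HelloVisionApp` and `ContentView` are assumptions, not from the announcement.

```swift
import SwiftUI

// Illustrative sketch of a minimal visionOS app using the standard
// SwiftUI entry point. On Apple Vision Pro, a WindowGroup appears as
// a 2D window placed in the user's space.
@main
struct HelloVisionApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

// A simple SwiftUI view; the same view code runs on other Apple
// platforms that support SwiftUI.
struct ContentView: View {
    var body: some View {
        Text("Hello, Apple Vision Pro")
            .font(.largeTitle)
            .padding()
    }
}
```

Such an app can be built in the updated Xcode and run in the visionOS simulator without Vision Pro hardware.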
Starting next month, developers building 3D apps and games with Unity can port their Unity apps to Apple Vision Pro.
The visionOS SDK, updated Xcode, Simulator, and Reality Composer Pro are available for Apple Developer Program members at developer.apple.com.