Apple’s ARKit gives iOS developers the ability to easily add augmented reality (AR) experiences to their existing apps. With Apple’s RealityKit framework and Reality Composer app, developers can take that to the next level and create AR experiences that, just a few years ago, were typically the province of production companies with huge budgets.
Let’s dive into RealityKit and Reality Composer to get a better idea of what these AR tools can do and how developers will use them. This article is also available as a download, Cheat sheet: AR Tools from Apple: RealityKit and Reality Composer (free PDF).
SEE: Mixed reality in business (ZDNet/TechRepublic special feature) | Download the free PDF version (TechRepublic)
What is RealityKit?
RealityKit is a new framework that’s baked into Xcode 11 and iOS 13 to allow developers to get photo-realistic rendering in their augmented reality scenes. It also allows for animations, effects, physics effects, and more on those AR objects.
RealityKit is a native Swift API, available when linking against the iOS 13 SDK. This means you can use Swift’s language features to build out an AR experience with RealityKit more quickly than before.
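To give a sense of what that Swift API looks like, here is a minimal sketch of setting up a RealityKit scene, assuming iOS 13 and Xcode 11. The entity size and material values are illustrative choices, not from Apple’s documentation:

```swift
import UIKit
import RealityKit

// A minimal RealityKit setup: anchor a simple box to a detected plane.
class ARViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // ARView is RealityKit's camera-backed rendering view.
        let arView = ARView(frame: view.bounds)
        view.addSubview(arView)

        // Build a box entity with a physically based material.
        let box = ModelEntity(
            mesh: .generateBox(size: 0.1),
            materials: [SimpleMaterial(color: .systemBlue, isMetallic: true)]
        )

        // Anchor it to the first horizontal plane ARKit detects.
        let anchor = AnchorEntity(plane: .horizontal)
        anchor.addChild(box)
        arView.scene.addAnchor(anchor)
    }
}
```

Because RealityKit handles rendering, physics, and anchoring behind these few calls, this is roughly all the code needed to place a rendered object in the real world.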
SEE: All of TechRepublic’s cheat sheets and smart person’s guides
What are the key features of RealityKit?
Scalable performance
RealityKit uses the GPU to get the most rendering performance available; it also takes advantage of CPU caches and multi-core processing to keep simulations smooth for users. Apple advertises that you now only need to build a single AR experience, which scales to match the performance of each iOS device it runs on.
Shared AR experiences
Last year, Apple added the ability to build shared experiences in ARKit; this year, Apple is taking it a step further to make it simpler for developers to handle the networking side of the experiences.
With RealityKit, networking tasks like maintaining consistent state, optimizing network traffic, handling packet loss, and performing ownership transfers are all handled automatically by the kit without the developer needing to write this semi-boilerplate code themselves.
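As a hedged sketch of how little networking code that leaves for the developer, here is what attaching RealityKit’s synchronization service over Multipeer Connectivity looks like, assuming the iOS 13 APIs; the session setup shown here is illustrative:

```swift
import UIKit
import MultipeerConnectivity
import RealityKit

// Wire up RealityKit's automatic scene synchronization over
// Multipeer Connectivity for a shared AR experience.
func enableSharedExperience(for arView: ARView) throws {
    let peerID = MCPeerID(displayName: UIDevice.current.name)
    let session = MCSession(peer: peerID,
                            securityIdentity: nil,
                            encryptionPreference: .required)

    // Once the synchronization service is attached, RealityKit keeps
    // entity state, ownership transfers, and network traffic in sync
    // across peers without additional developer code.
    arView.scene.synchronizationService =
        try MultipeerConnectivityService(session: session)
}
```

Peer discovery (browsing and advertising the `MCSession`) still happens through the standard Multipeer Connectivity flow; RealityKit takes over once the session is connected.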
Additional resources
- Apple doesn’t have an AR headset yet, but its AR toolkit is paving the way (CNET)
- Infographic: The history of AR and VR, and what the future holds (TechRepublic)
- Virtual and augmented reality policy (TechRepublic Premium)
- Apple Developer: RealityKit documentation (Apple)
- Video: WWDC 2019: Everything Apple announced and what really matters to business (TechRepublic)
- WWDC Session 605: Building Apps with RealityKit (Apple)
- WWDC Session 610: Building Collaborative AR Experiences (Apple)
What is Reality Composer?
Reality Composer was introduced alongside RealityKit at WWDC 2019. This app, which is available for iOS and macOS, lets developers and graphic artists build out realistic AR experiences and add animations and interactions more easily than ever before.
USDZ files are at the heart of creating augmented reality experiences on iOS; until now, these files typically had to be built with third-party tools. Reality Composer can import your existing USDZ files, or you can create your own in the app by combining some of the hundreds of pre-built 3D objects into an AR scene.
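Once a USDZ asset exists, bringing it into a RealityKit scene is a short operation. This is a sketch assuming a hypothetical `toy_robot.usdz` file bundled with the app:

```swift
import RealityKit

// Load a bundled USDZ asset and place it in the AR scene.
// "toy_robot" is a hypothetical file name for illustration.
func loadRobot(into arView: ARView) {
    do {
        // Entity.loadModel reads a .usdz (or .reality) file
        // from the app bundle and returns a ModelEntity.
        let robot = try Entity.loadModel(named: "toy_robot")
        let anchor = AnchorEntity(plane: .horizontal)
        anchor.addChild(robot)
        arView.scene.addAnchor(anchor)
    } catch {
        print("Failed to load model: \(error)")
    }
}
```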
SEE: Apple’s first employee: The remarkable odyssey of Bill Fernandez (cover story PDF) (TechRepublic)

Image: Apple, Inc.
What are the key features of Reality Composer?
Animations and audio
Reality Composer allows you to breathe new life into existing scenes and make them more lifelike and interactive: you can change the 3D object’s size, style, and more, as well as add animations and audio.
Animations like wiggle or spin are fun things to add to draw emphasis to certain objects. The app also allows for starting animations when a user gets closer to an object, moves their device, or encounters some other trigger that can be specified using Reality Composer.
Audio inside augmented reality scenes is usually flat, but Apple has done something interesting with Reality Composer. Inside the app, you can take advantage of spatial audio, which allows different audio clips to play as a user moves into a certain part of the scene, giving the scene an even greater sense of realism.
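For scenes driven from code rather than Reality Composer, RealityKit exposes the same spatialized audio directly. A sketch, assuming a hypothetical `ambient_loop` audio file in the app bundle:

```swift
import RealityKit

// Attach looping, spatialized audio to an entity in the scene.
// "ambient_loop" is a hypothetical file name for illustration.
func playSpatialAudio(on entity: Entity) {
    do {
        // Audio played from an entity is spatialized by default:
        // it gets louder and pans as the user approaches the entity.
        let resource = try AudioFileResource.load(named: "ambient_loop",
                                                  shouldLoop: true)
        entity.playAudio(resource)
    } catch {
        print("Failed to load audio: \(error)")
    }
}
```

`playAudio` returns a playback controller, so volume and playback can also be adjusted after the clip starts.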
Record and play on device
Reality Composer on iOS lets you easily record sensor data (such as moving the device around, zooming in on a particular location of the AR scene, etc.), then play it back while developing the AR experience. This is helpful because you don’t need to constantly build and run the app on a device and manually test every time you make a change to the AR scene.
Portability
Apple made Reality Composer cross-platform, allowing developers and graphic designers to edit AR scenes, along with their animations and events, right from Mac and iOS devices. This means that you can edit your scenes from anywhere, and it lowers the cost of getting started with AR scene design for graphic designers.
Additional resources
- WWDC Session 609: Building AR Experiences with Reality Composer (Apple)
- WWDC 602: Working with USD (Apple)
- Apple’s coolest new AR features only work on iPads and iPhones with the newest processors (CNET)
- Get our latest Apple-related news and tips (TechRepublic on Flipboard)
When were RealityKit and Reality Composer released?
Apple introduced RealityKit and Reality Composer at WWDC 2019. Here are more details about the releases:
- June 3, 2019: Apple introduces RealityKit and Reality Composer.
- June 3, 2019: Apple releases beta of RealityKit in Xcode 11.
- June 3, 2019: Apple releases beta of Reality Composer.
Additional resources
- WWDC Session 603: Introducing RealityKit and Reality Composer (Apple)
- WWDC 2019: Mac Pro, iPadOS, iOS 13, WatchOS 6, and everything Apple announced (ZDNet)
- Apple’s ARKit gives AR apps motion capture, lets you step inside a digital creation (CNET)
How can I get RealityKit and Reality Composer?
RealityKit is available in Xcode starting with version 11. You can download the beta of Xcode 11 from the Apple Developer website under the Applications tab.
Reality Composer is available within Xcode 11 on the Mac by choosing Xcode | Open Developer Tool | Reality Composer from the menu bar. This will launch the Reality Composer app on your Mac.
Reality Composer is currently in closed beta on iOS, but you can request access to the TestFlight version of the app from the Apple Developer website. Once accepted, you will be able to install Reality Composer beta on your iOS devices from the TestFlight app.
It is expected that when Reality Composer exits beta later this year, it will be available for everyone to download from the iOS App Store.
Additional resources
- The Apple Developer Program: An insider’s guide (free PDF) (TechRepublic)
- Project Catalyst: What developers need to know (TechRepublic)
- More programming and developer-related coverage (TechRepublic on Flipboard)

Image: Apple, Inc.