How To Control AAC With Eye Tracking (iOS 18 Tutorial)

Two iPhone screens. On the left, the Apple accessibility settings menu. On the right, Spoken Tap to Talk AAC using the new eye tracking or eye gaze feature.

Apple recently announced that eye tracking will be available for certain iPhones and iPads that receive the iOS 18 update this fall. At Spoken, we’ve been testing this feature to ensure seamless compatibility with our app. We believe it will make Spoken and other AAC apps accessible to entirely new user groups, and we’re very excited!

What is Eye Tracking?

Eye tracking is a hands-free way of controlling your device. Using the front-facing camera, the device can detect what you’re looking at on the screen. This allows you to navigate and interact with your device without the need for touch or voice commands.
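If you’re curious how gaze detection works under the hood, Apple’s ARKit framework gives developers a taste of the same building blocks: on devices with a TrueDepth front camera, a face anchor carries an estimated “look-at” point. The sketch below is purely illustrative; it isn’t how Apple implements the system-level eye tracking feature, and the GazeTracker class is a name we made up for this example.

```swift
import ARKit

// Illustrative sketch only: Apple's system-level eye tracking is not exposed
// as a public API. This shows how an app could estimate gaze with ARKit's
// face tracking on TrueDepth-equipped devices. `GazeTracker` is hypothetical.
final class GazeTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking requires a TrueDepth front camera.")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called as anchors update; face anchors carry an estimated gaze point.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is a position in face-anchor space that the eyes
            // converge on; projecting it into screen coordinates is omitted
            // here for brevity.
            print("Estimated gaze point: \(face.lookAtPoint)")
        }
    }
}
```

Turning that 3D look-at point into a stable on-screen cursor takes additional projection and smoothing work, which is exactly what iOS 18 now handles for you.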

Benefits of Eye Tracking in AAC Apps

The inclusion of eye tracking in iOS 18 could be significant for augmentative and alternative communication (AAC) users — people who have trouble speaking and rely on their devices to communicate. This new form of navigation will improve the accessibility of AAC for individuals with physical disabilities or motor impairments that make touch controls challenging or impossible to use. This includes individuals with conditions like ALS or cerebral palsy, who may not be able to use their hands. With eye tracking, these users can navigate and control AAC apps like Spoken just by moving their eyes.

Spoken on an iPhone with eye tracking enabled.

While eye gaze controls were already available on certain dedicated AAC devices, they weren’t widely available in apps until now. This is great news because AAC apps on consumer hardware like iPads tend to be much more affordable. Plus, you can use the same device for other things like texting or browsing the web. For all these reasons, the wide availability of eye tracking for AAC iPad apps could be a game changer.

Existing AAC users may also benefit from the addition of eye tracking, even if they don’t have any physical disabilities. Some may find eye gaze faster or more efficient than touch controls. It can be rather intuitive once you get used to it.

How To Enable Eye Tracking on Your iPhone or iPad

iOS 18 won’t be available to the public until fall 2024, but in the meantime you can enroll in the Apple Beta Software Program to test the feature if you have a compatible device. Note that eye tracking won’t be part of the iOS 18 update on every device due to hardware limitations: don’t expect to find it on anything older than an iPhone 12 or 10th-generation iPad.

Once you have iOS 18 (or iPadOS 18) running on a device that supports eye tracking, you can enable the feature by navigating to your device’s settings.

Step-by-Step Guide to Enabling the New iOS Eye Tracking Feature

  1. Ensure you’re running the correct version of iOS or iPadOS (18 or above).
  2. Open the Settings app.
  3. Tap ‘Accessibility’.
  4. Scroll to ‘Eye Tracking’ (under the ‘Physical and Motor’ heading).
  5. Tap the first toggle to turn on the new feature.
  6. Follow the on-screen directions for calibration. You’ll be instructed to follow a circle around the screen with your eyes. For best results, place your device on a stable surface around 1.5 feet from your face.
  7. You’ll see a checkmark appear on-screen when you’re done — eye tracking will now be enabled and ready to use.
A visual representation of the steps required to enable eye gaze controls.

How To Use Eye Tracking on iOS

Using eye tracking to control your phone or tablet is simple. Once the feature is enabled, an invisible cursor will follow your eye movements. When you look at an interactive element on the screen, it will be highlighted with a white outline. If you hold your gaze on that element, it will be selected or “clicked” and the chosen action will be performed.

The ability to select or click objects by fixing your gaze on them is called “dwell control.” In the Settings app, you can adjust how long you need to maintain focus (dwell) on an object before it will be selected.
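For a concrete sense of how dwell control behaves, here’s a minimal sketch of the timing logic, assuming gaze positions arrive as screen coordinates. This is our own illustration of the general technique, not Apple’s implementation; DwellSelector and its properties are hypothetical names.

```swift
import Foundation
import CoreGraphics

// Hypothetical illustration of dwell-control timing, not Apple's code.
// Selection fires once the gaze stays within a small radius for the
// configured dwell duration.
final class DwellSelector {
    var dwellDuration: TimeInterval = 1.0   // how long to hold your gaze
    var tolerance: CGFloat = 30             // how far the gaze may wander (points)
    var onSelect: ((CGPoint) -> Void)?      // fired when a dwell completes

    private var anchor: CGPoint?
    private var dwellStart: Date?

    // Feed this with each new gaze sample from the tracker.
    func update(gaze: CGPoint, at time: Date = Date()) {
        if let anchor, let start = dwellStart {
            let dx = gaze.x - anchor.x
            let dy = gaze.y - anchor.y
            // Squared-distance comparison avoids a square root.
            if dx * dx + dy * dy <= tolerance * tolerance {
                // Still fixating near the same spot: has the dwell elapsed?
                if time.timeIntervalSince(start) >= dwellDuration {
                    onSelect?(anchor)
                    reset()
                }
                return
            }
        }
        // Gaze moved to a new spot (or this is the first sample):
        // restart the dwell timer there.
        anchor = gaze
        dwellStart = time
    }

    private func reset() {
        anchor = nil
        dwellStart = nil
    }
}
```

A real implementation would also smooth the gaze signal and hit-test actual UI elements rather than raw points, but the core idea is the same: hold still long enough, and the look becomes a tap.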

Eye tracking pairs well with Apple’s existing AssistiveTouch feature, which is automatically enabled when you activate eye tracking. AssistiveTouch places a circular button near the bottom right corner of your screen; dwelling on it opens a menu of shortcuts to options that would usually require swiping or other gestures. From here you can also access scrolling options.

Select the AssistiveTouch icon near the bottom right corner of the screen for assistance with actions that require swiping or gestures.

Does Apple Eye Tracking Work?

Apple’s eye tracking feature isn’t perfect yet. Because it is in beta, some bugginess is expected. Our hope is that it will be improved by the final release of iOS 18. To get the most out of it as-is, we suggest keeping your device on a stable surface around 1.5 feet from your face, recalibrating whenever the cursor starts to drift, and adjusting the dwell duration until selections feel comfortable.

Can I Use Eye Tracking on Android Devices?

Unfortunately, eye tracking is not a built-in accessibility feature on Android yet, and it’s unlikely to be added anytime soon. The accuracy of eye tracking depends on the camera hardware, which varies greatly between Android devices.

About Spoken

Spoken is an app that helps people with aphasia, nonverbal autism, and other speech and language disorders.