Everything you need to know about eye tracking on Apple's iPhone and iPad


Apple announced a host of accessibility features for the iPhone this spring and said they would arrive "later this year." They are now officially coming with iOS 18, including Eye Tracking, Music Haptics, and Vocal Shortcuts.

Apple says the new Eye Tracking accessibility feature in iOS 18 uses artificial intelligence to let people with physical disabilities operate iPhones and iPads using only the movement of their eyes.

This is not Apple Intelligence, which is supported only by the iPhone 15 Pro and, presumably, the iPhone 16 models. Eye Tracking uses the front-facing camera to set up and calibrate in seconds, and thanks to on-device machine learning, all data used to set up and control the feature is kept securely on the device.

It remains to be seen how well Eye Tracking performs at selecting, launching, and controlling iPadOS and iOS applications. Users can navigate between screen elements with their eyes and activate on-screen buttons using the so-called Dwell Control, or scroll up and down with their eyes alone.

With Dwell Control, the user activates a screen element by holding their gaze on it for a predetermined amount of time.
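Apple has not published implementation details, but the dwell idea can be sketched as a small state machine: a target fires once the gaze has rested inside its bounds for a set duration. The sketch below is illustrative only; the class and parameter names (`DwellDetector`, `dwell_time`) are assumptions, not Apple's API.

```python
from dataclasses import dataclass

@dataclass
class Target:
    """A rectangular on-screen element, e.g. a button."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

class DwellDetector:
    """Fires once the gaze has stayed on a target for `dwell_time` seconds."""

    def __init__(self, dwell_time: float = 1.0):
        self.dwell_time = dwell_time
        self.current = None     # target the gaze is currently resting on
        self.entered_at = None  # timestamp when the gaze entered it

    def update(self, target: Target, px: float, py: float, t: float) -> bool:
        """Feed one gaze sample (position + timestamp); True means 'activate'."""
        if target.contains(px, py):
            if self.current is not target:
                # Gaze just entered the target: start the dwell timer.
                self.current, self.entered_at = target, t
            elif t - self.entered_at >= self.dwell_time:
                # Dwell time reached: fire once, then reset.
                self.current = None
                return True
        else:
            # Gaze left the target: cancel any pending dwell.
            self.current = None
        return False
```

In practice a real system would feed samples at the camera's frame rate and tolerate small gaze jitter (e.g. by padding the target bounds), but the one-shot timer above captures the core behavior described.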

Among the other new accessibility options in iOS 18, Apple is introducing Music Haptics, which uses the iPhone's Taptic Engine to play taps, textures, and refined vibrations in time with the audio of songs in the Apple Music catalog.