
Revolutionary Eye-Tracking Feature Coming To Apple Devices

By Eric George


Reviewed by: Eric George



Apple has never failed to amaze us tech nerds. Many of the features it introduces in new versions and updates are genuinely thoughtful, and some are revolutionary.

The tech giant is committed to giving all of its users access to the latest advances in the field and has rarely lagged behind when it comes to new technology.

Many of these features have also proved thoughtful in serving the many different kinds of users the company has around the globe. Apple has transformed the lives of differently abled people with its voice assistant and other accessibility features. And when it comes to user privacy, Apple has generally had your back.

The tech behemoth has even fought for user privacy in high-profile lawsuits and has pushed back when governments asked it to unlock users' phones and devices.

We also cannot go without mentioning the safety measures Apple offers its users. You can find out whether someone is following you with a Bluetooth tracker, and the recently launched cross-platform detection of third-party trackers is the latest example.

Here we discuss the top-notch eye-tracking feature that will soon be available on Apple devices. Let us see what it is all about; we will tell you everything we know.

What Is The Apple Eye Tracking Feature?

  • Apple’s eye-tracking feature is designed for users with physical disabilities. It is powered by AI and uses the front camera to set up and calibrate. 
  • Once enabled, it lets you navigate through the elements of apps on the device. You can use Dwell Control to activate each element and reach additional functions such as physical buttons, swipes, and other gestures, all with just your eyes (see the dwell-selection sketch after this list). 
  • It runs on on-device machine learning. All the data used to set up and control the feature stays on the device and is not shared with Apple. You do not need any additional accessories, and it works across iOS and iPadOS apps. 
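Apple has not published a developer API for the eye-tracking feature itself, but the Dwell Control idea is easy to illustrate: an action fires only when your gaze rests on a target for an uninterrupted interval. The Swift sketch below is a conceptual illustration under that assumption, not Apple's implementation; the DwellSelector type, its target rectangle, and the one-second dwell are made up for the example.

```swift
import Foundation
import CoreGraphics

// Conceptual dwell-selection sketch (not Apple's implementation):
// an action triggers only when the estimated gaze point stays inside
// a target rectangle for an uninterrupted dwell interval.
struct DwellSelector {
    let target: CGRect            // hypothetical on-screen control bounds
    let dwellDuration: TimeInterval
    private var dwellStart: Date?

    // Feed each new gaze sample; returns true when the dwell completes.
    mutating func update(gazePoint: CGPoint, at time: Date = Date()) -> Bool {
        guard target.contains(gazePoint) else {
            dwellStart = nil       // gaze left the target, reset the timer
            return false
        }
        if let start = dwellStart {
            if time.timeIntervalSince(start) >= dwellDuration {
                dwellStart = nil   // fire once, then require a fresh dwell
                return true
            }
            return false
        }
        dwellStart = time          // gaze just entered the target
        return false
    }
}

// Example: a control that activates after the gaze rests on it for one second.
var selector = DwellSelector(target: CGRect(x: 100, y: 200, width: 80, height: 44),
                             dwellDuration: 1.0)
let activated = selector.update(gazePoint: CGPoint(x: 120, y: 220))
print(activated)   // false on the first sample; true once the dwell elapses
```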

Eye tracking was not the only feature Apple announced on May 15, 2024. The announcement also covered Vocal Shortcuts and Music Haptics.

Apple Eye Tracking Feature

What Do Vocal Shortcuts Do?

With the new Vocal Shortcuts feature, users can train Siri to carry out particular tasks when they say a custom phrase, rather than uttering the whole command.

As an Apple user, you will be able to assign a custom utterance to launch a shortcut on your device. When Siri hears that utterance, the digital assistant carries out the task associated with the vocal shortcut. Vocal Shortcuts will be available to iPhone and iPad users.
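Vocal Shortcuts themselves are set up by the user in Settings, but the actions Siri can trigger on an app's behalf are typically exposed through Apple's App Intents framework. The snippet below is a minimal, hypothetical sketch of such an action; the StartWorkoutIntent name and its dialog text are invented for illustration.

```swift
import AppIntents

// Hypothetical app action that a user could wire up to a Vocal Shortcut.
// App Intents is the framework apps use to expose actions to Siri and Shortcuts.
struct StartWorkoutIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Workout"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // In a real app, kick off the workout session here.
        return .result(dialog: "Workout started.")
    }
}
```

A user could then record a custom utterance pointing at this shortcut, and Siri would run perform() whenever it hears that phrase.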

Another voice feature announced by Apple is ‘Listen for Atypical Speech’, which gives you an enhanced speech-recognition option that covers a wider range of speech. It is especially intended for users with conditions that affect speech delivery, such as 

  • stroke, 
  • cerebral palsy, 
  • ALS (amyotrophic lateral sclerosis), and so on. 

The feature uses on-device machine learning to recognize each user's speech patterns. For those who cannot issue standard commands or utter certain sentences, it provides a whole new level of control and customization.

It builds on the features introduced in the iOS 17 update for users who are at risk of losing their ability to speak and for nonspeaking users.
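Apple has not said whether ‘Listen for Atypical Speech’ will be available to developers, but on-device speech recognition in general already exists in the public Speech framework. The sketch below shows that existing API with a placeholder audio file path; it is a generic illustration, not the new atypical-speech feature.

```swift
import Speech

// Generic on-device speech recognition with the existing Speech framework.
// (Illustration only; this is not the new "Listen for Atypical Speech" feature.)
// In a real app, call SFSpeechRecognizer.requestAuthorization first.
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))!
let request = SFSpeechURLRecognitionRequest(url: URL(fileURLWithPath: "/path/to/recording.m4a"))
request.requiresOnDeviceRecognition = true   // keep audio and transcripts on the device

recognizer.recognitionTask(with: request) { result, error in
    if let result = result, result.isFinal {
        print(result.bestTranscription.formattedString)
    } else if let error = error {
        print("Recognition failed: \(error)")
    }
}
```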

Music Haptics

This feature is built for users who are deaf or hard of hearing, letting them experience music through touch. When the accessibility setting is turned on, the iPhone's Taptic Engine plays taps, textures, and refined vibrations that follow the audio of a song. It works with millions of songs in the Apple Music catalog.
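Apple has not published how Music Haptics is implemented, but the taps and textures it describes are the kind of output the public Core Haptics framework produces on the same Taptic Engine. The sketch below is a rough illustration of that idea only, not the Music Haptics feature itself; the event timings and intensities are arbitrary.

```swift
import CoreHaptics

// Rough illustration of Taptic Engine output via Core Haptics
// (not the actual Music Haptics implementation).
func playBeatPattern() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    // A sharp transient "tap" followed by a softer continuous "texture".
    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
        ],
        relativeTime: 0)
    let texture = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
        ],
        relativeTime: 0.15,
        duration: 0.5)

    let pattern = try CHHapticPattern(events: [tap, texture], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: 0)
}
```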

From Apple

The CEO

Apple's CEO, Tim Cook, stated that the company believes in the transformative power of innovation to enrich lives. That, he said, is why Apple has championed inclusive design for almost 40 years.

This has been done by embedding accessibility at the core of all of its hardware and software. He added that the company is continuously pushing the boundaries of technology and that the new features reflect its long-standing commitment to delivering the best possible experience to all of its users.

Senior Director

Apple's senior director of Global Accessibility Policy and Initiatives said that each year the company aims to break new ground in device accessibility. She added that the new features will make an impact on the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.

Final Thoughts

Not everyone is an Apple fan, and plenty of people think Apple products are overrated and overhyped. Still, the company has managed to position its devices not merely as gadgets but as a lifestyle, a status symbol, and much more.

Apple has garnered its share of criticism over time; however, when it comes to the privacy and safety features of the tech leader, there have been few serious disputes.

The new features empower users who are differently abled in hearing, speaking, and movement. With Music Haptics, deaf and hard-of-hearing users can experience millions of songs; Vocal Shortcuts help those with different speech patterns or difficulty speaking set custom commands to carry out tasks; and eye tracking lets users with physical disabilities control their devices with their eyes. What do you think about the latest features? Do you think other tech giants will implement similar features? Take a moment to let us know in the comment box below.

Eric George

Eric George, a retired journalist, focused primarily on market research and current tech trends. Over a career in news media, he made significant contributions to understanding the intersection of technology and finance. Today, he continues to engage with these topics in various capacities.

