May 19, 2024

Apple machine learning speech focuses on benefits for accessibility and health – 9to5Mac

NPR reports:

Apple has given a rare speech at a global AI gathering, with vice president Ge Yue choosing to concentrate on machine learning in accessibility features […]

The company has chosen to illustrate the technology through accessibility features in Apple Watch and AirPods Pro […]

She said that “Machine Learning plays a crucial role” in Apple’s hope that its products “can help people innovate and create, and provide the support they need in their daily lives.”

“We believe that the best products in the world should meet everyone’s needs,” she continued. “Accessibility is one of our core values and an important part of all products. We are committed to manufacturing products that are truly suitable for everyone.”

“We know that machine learning can help provide disabled users with independence and convenience,” she said, “including people with visual impairments, hearing impairments, physical and motor disabilities, and cognitive impairments.”

Ge Yue gave the example of the AssistiveTouch feature on Apple Watch, which the company introduced last year, alongside eye-tracking on iPad.

To support users with limited mobility, Apple is introducing a revolutionary new accessibility feature for Apple Watch. AssistiveTouch for watchOS allows users with upper body limb differences to enjoy the benefits of Apple Watch without ever having to touch the display or controls.

Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench. AssistiveTouch on Apple Watch enables customers who have limb differences to more easily answer incoming calls, control an onscreen motion pointer, and access Notification Center, Control Center, and more.

She explained that this relies on on-device machine learning:

“This function combines machine learning on the device with data from the built-in sensors of Apple Watch to help detect subtle differences in muscle movement and tendon activity, thus replacing the display tapping.”
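For a sense of what that pipeline looks like in practice, here’s a minimal Swift sketch: stream wrist motion data on-device and classify a hand gesture from a short window of readings. This is an illustration, not Apple’s actual AssistiveTouch code; the `HandGestureDetector` name, the 100 Hz sample rate, and the threshold standing in for a trained Core ML model are all assumptions, and the real system also draws on the optical heart rate sensor, which this sketch omits.

```swift
import Foundation
import CoreMotion

// Hypothetical sketch of on-device gesture detection: fuse accelerometer
// and gyroscope readings, then classify a window of samples.
final class HandGestureDetector {
    private let motionManager = CMMotionManager()
    private var window: [Double] = []   // sliding window of motion features
    private let windowSize = 100        // ~1 second of samples at 100 Hz

    func start(onGesture: @escaping (String) -> Void) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self = self, let motion = motion else { return }
            // Fuse accelerometer and gyroscope readings into one magnitude.
            let a = motion.userAcceleration
            let r = motion.rotationRate
            let feature = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
                        + sqrt(r.x * r.x + r.y * r.y + r.z * r.z)
            self.window.append(feature)
            guard self.window.count >= self.windowSize else { return }
            // A shipping implementation would hand this window to a trained
            // Core ML classifier; a fixed threshold stands in for it here.
            let meanEnergy = self.window.reduce(0, +) / Double(self.window.count)
            onGesture(meanEnergy > 1.5 ? "clench" : "pinch")
            self.window.removeAll()
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```

The design mirrors what Ge Yue describes: all sensing and classification happens on the device itself, with no display interaction required, which is what lets the detected pinch or clench drive the on-screen cursor.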

Apple views accessibility as one of the company’s core values, and its tech can make a huge difference to the lives of people with disabilities. One reader spoke earlier this year about small things making a big difference.

I always thought it bonkers that when using Siri on iPhones, users have for years been able to place a call by saying “Hey Siri, call…”, but until now there’s been no “Hey Siri, end call” command. It led to a lot of daily frustration, as I can’t press the red button on the iPhone screen to hang up a phone call, so this prompted me to campaign for it. I’m really glad Apple has listened and resolved the contradiction in iOS 16! Hopefully, it will also be of use to anyone who has their hands full.

That point is one others have echoed: accessibility features may be aimed primarily at those with disabilities, but they can often prove beneficial to a much wider audience.

Apple also sees machine learning as having huge potential for future health features, says Ge Yue.

Noting that “our exploration in the field of health has just begun,” she said Apple believes that “machine learning and sensor technology have unlimited potential in providing health insights and encouraging healthy lifestyles.”

Photo: Xu Haiwei/Unsplash
