
Apple’s machine learning talk focuses on accessibility benefits

Apple’s machine learning projects cover nearly every aspect of the company’s business, but in a new keynote at an AI conference, a senior executive spoke specifically about the benefits for accessibility and health.

Ge Yue, Vice President of Apple and General Manager of Apple Greater China, delivered her keynote address at the 2022 World Artificial Intelligence Conference in Shanghai…

NPR reports:

Apple gave a rare speech at a global AI gathering as Vice President Ge Yue chose to focus on machine learning in accessibility features […]

The company chose to illustrate the technology through accessibility features in Apple Watch and AirPods Pro […]

She said “machine learning plays a crucial role” in Apple’s hope that its products “can help people innovate and create, and provide the support they need in their daily lives.”

“We believe that the best products in the world should meet everyone’s needs,” she continued. “Accessibility is one of our core values and an important part of all products. We are committed to making products that truly work for everyone.

“We know that machine learning can help provide independence and convenience for users with disabilities,” she said, “including people who are visually impaired or hearing impaired, people with physical and motor disabilities, and people with cognitive disabilities.”

Ge Yue gave the example of the AssistiveTouch feature on Apple Watch, which the company introduced last year alongside eye tracking on iPad.

To support users with reduced mobility, Apple is introducing a revolutionary new accessibility feature for Apple Watch. AssistiveTouch for watchOS allows users with upper body differences to enjoy the benefits of Apple Watch without ever having to touch the screen or controls.

Using built-in motion sensors such as the gyroscope and accelerometer, as well as the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which allows users to move a cursor on the screen through a series of hand gestures, such as pinching or clenching. AssistiveTouch on Apple Watch makes it easier for customers who have limb differences to answer incoming calls, control an on-screen motion pointer, and access Notification Center, Control Center, and more.
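For readers curious what this kind of sensor-driven gesture detection looks like in practice, here is a minimal, hypothetical Swift sketch built on Apple’s public Core Motion framework. It is not Apple’s actual AssistiveTouch implementation; the `GestureDetector` class and the classification step are illustrative assumptions, showing only how an app might stream accelerometer and gyroscope readings and hand them to an on-device model.

```swift
import CoreMotion

// Hypothetical sketch: streaming wrist-motion data in the spirit of the
// sensor fusion Apple describes. Not Apple's AssistiveTouch implementation.
final class GestureDetector {
    private let motionManager = CMMotionManager()

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        // Sample device motion at roughly 50 Hz.
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0

        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let motion = motion else { return }
            // Accelerometer and gyroscope readings of the kind a gesture
            // model could consume as input features.
            let features = [
                motion.userAcceleration.x, motion.userAcceleration.y, motion.userAcceleration.z,
                motion.rotationRate.x, motion.rotationRate.y, motion.rotationRate.z,
            ]
            self?.classify(features)
        }
    }

    private func classify(_ features: [Double]) {
        // A real implementation would buffer a short window of samples and
        // run it through an on-device ML model (e.g. via Core ML) that maps
        // muscle- and tendon-driven motion to gestures such as a pinch or
        // a clench. The model itself is out of scope for this sketch.
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```

The key design point Apple emphasizes is that all of this inference happens on the device itself, so the raw motion data never needs to leave the watch.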

She noted that the feature relies on machine learning running on the device itself.

“This feature combines on-device machine learning with data from the Apple Watch’s built-in sensors to help detect subtle differences in muscle movement and tendon activity, replacing the need to tap the screen.”

Apple considers accessibility one of the company’s core values, and its technology can make a huge difference in the lives of people with disabilities. A reader wrote earlier this year about the little things that make a big difference.

I always found it crazy that with Siri on iPhone, users have for years been able to make a call by saying “Hey Siri, call…”, yet until now there was no “Hey Siri, end call” command. This led to a lot of daily frustration, as I cannot press the red button on the iPhone screen to hang up a phone call, and it prompted me to campaign for the feature. I’m really glad Apple listened and resolved the inconsistency in iOS 16! Hopefully it will also be useful to anyone with their hands full.

Others echoed this point: accessibility features may be primarily aimed at people with disabilities, but can often prove beneficial for a much wider audience.

Apple also sees machine learning as having huge potential for future health features, said Ge Yue.

Noting that “our exploration into health is just beginning,” she said Apple believes “machine learning and sensor technology have limitless potential to provide health insights and encourage healthy lifestyles.”

Photo: Xu Haiwei/Unsplash

FTC: We use income earning auto affiliate links. More.


Check out 9to5Mac on YouTube for more Apple news: