We are just weeks away from the keynote for this year’s Worldwide Developers Conference, where Apple is expected to give the world a first glimpse at the next software versions for the iPhone and iPad. But the stream of updates continues. A few days ago came confirmation of the much-awaited arrival of Final Cut Pro and Logic Pro, Apple’s popular video and music creation apps, on the iPad Pro, an announcement that would normally figure in a WWDC keynote.


Now, Apple has given us a first glimpse at the upcoming additions to an already extensive list of accessibility features, which will help users with cognitive, vision, hearing and mobility disabilities. They’re coming “later this year”, which points to a rollout with iOS 17, likely scheduled for release sometime after the iPhone keynote in September.

This is a mission that Apple touches upon annually, around Global Accessibility Awareness Day on May 18. “We don’t ever consider accessibility to be something that is a checkbox that gets done and moved on,” Sarah Herrlinger, senior director of accessibility policy and initiatives at Apple, told HT.

Last year, the important additions included Door Detection for users with vision disabilities, extending the iPhone’s Voice Control and Switch Control functionality to the Apple Watch, on-device generation of Live Captions for anything on an iPhone, iPad or Mac’s screen, and new language support for VoiceOver.

One of the standout upcoming additions will be a feature called Personal Voice, which will allow users at risk of losing their ability to speak, such as those with a recent diagnosis of ALS (amyotrophic lateral sclerosis), to create a virtual voice that sounds like them. For this, they’ll need to read what Apple calls a randomised set of text prompts, for about 15 minutes.

We asked Herrlinger how challenging it is to refine accessibility features, since there is absolutely no room for error, and she attributes the success to a multi-layered approach to getting things right. She cites the Point and Speak feature as an example; built to support blind and low-vision users, it sits atop tools such as VoiceOver that have been part of Apple’s iPhone, iPad and Mac devices for more than a decade now.

“I think that, in a lot of ways, comes back to the same mantra of ‘Nothing about us, without us’. As we do the designing of our features, it’s not a single conversation or gathering feedback and then going and building. It really is that iterative process along the way,” she said.

Herrlinger tells us about particular feedback from an Apple Watch user who is an amputee, describing how they had to perform much of the Watch’s functionality, such as answering phone calls, with their nose. “I just wish I had a more dignified way to use this device,” the user had written to Apple.

Apple has since expanded the AssistiveTouch gesture controls for the Watch, which include tapping the pointer finger on the thumb to move to the next item in the on-screen interface, or clenching the hand into a fist to select an item or option on the Apple Watch interface.

The text for the Personal Voice setup is randomly chosen, and a user can pause and resume the process. If, during this process, the user moves too far away from the iPhone or iPad, or there is background noise, they’ll be prompted. The processing for Personal Voice is done on-device, and it will integrate with the Live Speech feature, which allows users to type what they want to say and have it spoken out during calls or in-person conversations. There will be an option to save commonly used phrases for quick access during conversations.

Herrlinger points to Apple’s approach, which relies heavily on feedback. “Whether we’re thinking about something like Assistive Access, designed to lighten the cognitive load for people with cognitive disabilities, or things like Live Speech, to enable non-speaking users to type what they want to say and have it spoken aloud, we’re really trying to make sure that when we do that it’s in collaboration with members of the communities, to be able to build the best technology that we can,” she said.


There are significant interface tweaks incoming for some of the most-used apps on an iPhone or iPad. This includes Phone and FaceTime, which can be combined into a single Calls app, making it easier to find who to call and to see recently dialled numbers.

The Messages, Camera, Music and Photos apps will also take advantage of high-contrast buttons and large text labels, which will make them simpler to use for people with cognitive disabilities. The Camera app, for instance, will have a larger, clearly marked ‘Take Photo’ on-screen button, while the Messages app will have a bigger, emoji-only keyboard and a larger send button.

One of the persistent elements within these accessibility-focused app interfaces is a large ‘back’ button, to help with navigation. The home screen layout on an iPhone or iPad can be altered too for easier viewing, and much like with the apps, users can choose either a grid-based layout or rows with large text labels.

Apple is expanding the scope of the iPhone’s Magnifier app with a Point and Speak feature that will describe to users with vision disabilities what the phone’s camera sees, and guide them accordingly. The Magnifier app opens the rear camera on an iPhone and, using the LiDAR Scanner, on-device machine learning and VoiceOver, will be able to audibly read out text, such as the labels on home appliances. Over the past couple of years, Apple has added People Detection, Door Detection and Image Descriptions to the Magnifier app.

“We try to make sure that our devices work for the largest group of individuals possible. When we think about accessibility, it’s not about compliance or a checkbox. It really is about customization and making sure that what we design is really built for the continuum of the human experience,” Herrlinger said.
