
Apple expands accessibility features, including Live Captions, Magnifier and Sound Recognition

Apple on Tuesday shared a set of accessibility updates coming across its product range, designed to help with everything from reading text to viewing real-time captions to reducing motion sickness. The announcement comes ahead of Global Accessibility Awareness Day on May 15, with the features scheduled to roll out later this year.

The tech giant is gearing up for its annual Worldwide Developers Conference on June 9, during which it will share software updates across its platforms, including what's in store for iOS 19. It will also likely share Apple Intelligence updates, especially as other companies like Samsung and Google continue to pack their phones with AI. Many of these AI-powered features have also enhanced accessibility on devices like iPhones and Pixel phones.

“At Apple, accessibility is part of our DNA,” Apple CEO Tim Cook said in a statement. “Making technology for everyone is a priority for all of us, and we’re proud of the innovations we’re sharing this year. That includes tools to help people access crucial information, explore the world around them and do what they love.”

Apple’s accessibility updates will reach the iPhone, iPad, Mac, Apple Watch and Apple Vision Pro. Here are some of the features that will soon be available across these devices.

Accessibility Nutrition Labels

Accessibility Nutrition Labels will show which App Store apps and games support the features you need, such as VoiceOver, Voice Control and Larger Text.

In the App Store, a new section on app and game product pages will highlight the accessibility features each one supports, so you can know before downloading whether the features you need are included. These features include VoiceOver, Voice Control, Larger Text, Sufficient Contrast, Reduced Motion, captions and more.

Accessibility Nutrition Labels will launch worldwide on the App Store. Developers will have access to guidance on the criteria their apps need to meet before displaying accessibility information on their product pages.

Magnifier on Mac

Magnifier is a tool that lets people who are blind or have low vision zoom in, read text and detect objects around them using their iPhone or iPad. Now the feature is coming to Mac as well.

Magnifier on Mac connects to a camera, such as your iPhone's, so you can zoom in on your surroundings, like a screen or a whiteboard. You can link your iPhone to your Mac using Continuity Camera, or connect a USB camera instead. The feature also supports reading documents using Desk View. You can adjust what's on screen, including brightness, contrast and color filters, to make text and images easier to see.

Accessibility Reader

This new reading mode on iPhone, iPad, Mac and Vision Pro is designed to make text easier to read, including for people with disabilities such as dyslexia or low vision. Accessibility Reader lets you customize text and focus on what you're reading by adjusting fonts, colors and spacing. It also supports Spoken Content, so your device can read what's on screen aloud.

Accessibility Reader can be launched from any app and is built into the Magnifier app on iOS, iPadOS and macOS. You can also activate the feature when interacting with real-world text, such as in menus and books.

Braille Access

Braille Access essentially lets you turn your iPhone, iPad, Mac or Vision Pro into a braille note-taker. You can open any app by typing with Braille Screen Input or a connected braille device, take notes in braille format and do calculations using Nemeth Braille.

You can also open Braille Ready Format (BRF) files in Braille Access, giving you access to books and files created on braille note-taking devices.

Live Captions on Apple Watch

Live Listen with Live Captions will display real-time text on your Apple Watch and let you remotely control Live Listen sessions on your iPhone.

Live Listen is a feature that takes audio captured by your iPhone and streams it to your AirPods, Beats or compatible hearing aids, essentially turning your phone into a remote microphone. The feature is now coming to Apple Watch along with Live Captions, which display real-time text of what your iPhone hears. That way, people can listen to audio while reading live captions on their Apple Watch.

You can also use your Apple Watch as a remote to start or stop a Live Listen session, and you can jump back if you missed something. That means you don't have to get up in class or in a meeting to grab your iPhone; you can control it from across the room with your watch. Live Listen can also be used with the Hearing Aid feature on the AirPods Pro 2.

Vision accessibility on Apple Vision Pro

Apple Vision Pro is adding features for people who are blind or have low vision. An update to Zoom will let you use Vision Pro's main camera to magnify anything in your surroundings. Live Recognition will describe what's around you, find objects and read documents using VoiceOver.

A new developer API will also allow approved apps to access the headset's main camera, so you can get live visual interpretation assistance in apps like Be My Eyes.

Other accessibility updates

Apple shared several other updates to its accessibility features, including bringing Vehicle Motion Cues, which can help reduce motion sickness, to the Mac. You'll also be able to customize the animated on-screen dots on your iPhone, iPad and Mac.

Personal Voice makes it possible for people at risk of losing their ability to speak to create a voice that sounds like them, using AI and on-device machine learning. Now it's faster and easier to use: instead of reading 150 phrases to set up the feature and waiting overnight for processing, Personal Voice can now create a smoother, more natural-sounding voice replica from only 10 recorded phrases, in less than a minute. Apple has also added support for Spanish (Mexico).

Name Recognition will alert you when your name is called.

Similar to Eye Tracking, which lets you control your iPhone and iPad with just your eyes, Head Tracking will let you navigate and control your device with head movements.

You can now customize Music Haptics on your iPhone, which plays a series of taps, textures and vibrations along with songs in Apple Music. You can choose to experience these haptics for an entire song or for vocals only, and you can also adjust the overall intensity.

Sound Recognition alerts people who are deaf or hard of hearing to sounds like doorbells or car horns. It's now adding Name Recognition, so they can also know when their name is being called.

Live Captions is also adding support for more languages and regions, including English (India, Australia, UK, Singapore), Mandarin Chinese (China mainland), Cantonese (China mainland, Hong Kong), Spanish (Latin America, Spain), French (France, Canada), Japanese, German (Germany) and Korean.


