Apple brings Magnifier to the Mac and introduces a new Accessibility Reader mode

This Thursday is Global Accessibility Awareness Day (GAAD), and as has become a rite over the past few years, Apple's accessibility team is taking the occasion to share some new assistive features coming to its ecosystem of products, in addition to bringing Accessibility Nutrition Labels to the App Store.

According to the company's press release, this year is a special one, marking "40 years of accessibility at Apple." Many of this year's updates are designed to help those with some degree of vision impairment.

One of the most significant is the arrival of Magnifier on the Mac. The camera-based assistive feature has been available on iPhones and iPads since 2016, allowing people to point their phones at their surroundings and get audible readouts of the scene. Magnifier also makes hard-to-read things easier to see, with options to increase brightness, zoom in, add color filters and adjust the perspective.

With Magnifier for Mac, you can use any USB-connected camera, or your iPhone (via Continuity Camera), to get feedback on what's around you. In a video, Apple showed a student in a large lecture hall attaching their iPhone to the top of their MacBook and using it to see what was written on a distant whiteboard. Magnifier for Mac also works with Desk View, so you can use it to more easily read documents placed in front of you. Multiple live session windows are available, so you can, say, keep watching the whiteboard through your webcam while using Desk View to read a textbook at the same time.

Magnifier for Mac works with another new tool Apple unveiled today: Accessibility Reader. It's a new systemwide reading mode "designed to make text easier to read for users with a wide range of disabilities, such as dyslexia or low vision." Accessibility Reader is available on iPhones, iPads, Macs and the Apple Vision Pro, and is also part of the Magnifier app. It lets you customize text with "extensive options for font, color, and spacing," and can help reduce distractions by, for example, stripping away clutter.

Accessibility Reader also supports Spoken Content, and because it's built into the Magnifier app, it can be used to make real-world text, such as signs or menus, easier to read. And since it's an OS-level mode, you can launch it from any app.
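Apple hasn't shared implementation details for Accessibility Reader, but the kind of on-device recognition of real-world text it relies on has long been available to developers through the Vision framework. Here's a minimal sketch of that underlying technique in Swift; the function and file names are illustrative, not Apple's:

```swift
import AppKit
import Vision

// Minimal sketch: on-device OCR of a photographed menu with Apple's Vision
// framework — the same kind of recognition a reading mode could build on.
// "menu.jpg" and recognizeText(in:) are illustrative names, not Apple's.
func recognizeText(in url: URL) throws -> [String] {
    guard let image = NSImage(contentsOf: url),
          let cgImage = image.cgImage(forProposedRect: nil, context: nil, hints: nil) else {
        return []
    }

    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate      // favor accuracy over speed
    request.usesLanguageCorrection = true     // clean up raw OCR output

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    // Each observation carries ranked candidate strings; keep the best one.
    return request.results?.compactMap { $0.topCandidates(1).first?.string } ?? []
}

for line in try recognizeText(in: URL(fileURLWithPath: "menu.jpg")) {
    print(line)
}
```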

[Image: Two phones side by side showing Apple's new Accessibility Reader mode. On the left, a narrow column of text in a document; on the right, its Accessibility Reader version with larger white-on-black text and playback controls along the bottom. Credit: Apple]

For those who are most comfortable working in braille, Apple has supported braille input for a few years and more recently started working with braille displays. This year, the company is bringing Braille Access to iPhones, iPads, Macs and the Vision Pro, designed to make it easy to take notes in braille. It comes with a dedicated app launcher that lets people "open any app by typing with Braille Screen Input or a connected braille device." Braille Access lets users take notes in braille format and use Nemeth code for math and science calculations. It can also open files in Braille Ready Format (BRF), so you can return to documents created on other devices. Finally, "an integrated form of Live Captions allows users to transcribe conversations in real time directly on braille displays."

Wrapping up the vision-related updates is the expansion of existing accessibility features on the Vision Pro. For example, the headset's Zoom function is being improved to let wearers magnify what they see, in both virtual reality and the real world around them. That uses the Vision Pro's cameras to see what's in your surroundings, and Apple is making a new API available so that "approved apps" can access the main camera to provide live, person-to-person assistance for visual interpretation. Finally, Live Recognition is coming to VoiceOver on the Vision Pro, using on-device machine learning to identify and describe what's around you. It can even read flyers or invitations and tell you what's on them.
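Apple didn't publish documentation for the new camera API alongside the announcement, but visionOS already gates main-camera access behind an entitlement via ARKit's CameraFrameProvider for enterprise apps. Here's a sketch of that existing pattern; the exact signatures should be treated as assumptions, and the new assistance API may well differ:

```swift
import ARKit
import CoreVideo

// Hypothetical transport for streaming frames to a remote sighted assistant.
func sendToAssistant(_ buffer: CVPixelBuffer) { /* illustrative stub */ }

// Sketch of entitlement-gated main-camera access on visionOS through ARKit.
// Signatures here are assumptions; Apple's new assistance API may differ.
func streamMainCamera() async throws {
    let session = ARKitSession()
    let provider = CameraFrameProvider()

    // Camera access requires an entitlement plus user authorization.
    _ = await session.requestAuthorization(for: [.cameraAccess])
    try await session.run([provider])

    let formats = CameraVideoFormat.supportedVideoFormats(for: .main, cameraPositions: [.left])
    guard let format = formats.first,
          let updates = provider.cameraFrameUpdates(for: format) else { return }

    for await frame in updates {
        if let sample = frame.sample(for: .left) {
            // The pixel buffer holds the raw camera image for the assistant.
            sendToAssistant(sample.pixelBuffer)
        }
    }
}
```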

For those with hearing loss, the Live Listen feature already on iPhones is being enhanced with controls on the Apple Watch and some bonus capabilities. When you start a Live Listen session on your iPhone, it transmits whatever its microphone picks up to your connected AirPods, headphones or compatible hearing devices, and you'll soon be able to see live captions of that audio on your paired Apple Watch. You also get controls on your wrist, so you can start, stop or rewind a session. That means you could, for example, leave your iPhone in the kitchen and hear what your partner says while they cook, starting the Live Listen session from your watch without having to walk over to grab the phone. Live Listen works alongside the hearing health and hearing aid features that launched on the AirPods Pro 2.

While we're on the subject of sound, Apple is updating Background Sounds, which helps people with tinnitus manage their symptoms by playing white noise (or other types of audio). Later this year, Background Sounds will gain new EQ settings to personalize the sounds, automatic timers to stop playback after a set period and automation actions in Shortcuts.

Personal Voice, which helps those at risk of losing their ability to speak preserve their vocal identity, is also getting a big improvement. When I tested the feature to write a tutorial on how to create a Personal Voice on your iPhone, I was struck by the fact that users had to read 150 phrases aloud. Not only that, but the system needed to percolate overnight to generate the voice. With the upcoming update, a Personal Voice can be created in under a minute from just 10 recorded phrases. The resulting voice also sounds smoother, with less clipping and fewer artifacts. Apple is adding Spanish-language support for the US and Mexico, too.

Last year, Apple introduced built-in eye tracking on iPhones and iPads, as well as Vehicle Motion Cues to reduce motion sickness in cars. This year, it's improving those features by bringing Vehicle Motion Cues to Macs and adding new ways to customize the feature's onscreen dots. Meanwhile, among other updates to keyboard typing with Eye Tracking, you're getting the option to use dwell or a switch to confirm selections.

Apple's ecosystem is so sprawling that it's nearly impossible to list every individual accessibility-related change coming to all of its products. I'll quickly shout out Head Tracking, which will let people control their iPhones and iPads more easily by moving their heads, "similar to Eye Tracking." Apple hasn't shared much about it, like whether head tracking on iPhones and iPads is built in or supported through connected devices. The "similar to Eye Tracking" comparison suggests built-in support, but we don't know for sure. I've asked Apple for more information and will update this piece with what I find out.

Speaking of connected devices, Apple is also adding a new protocol to Switch Control that supports brain-computer interfaces (BCIs). In theory, that means brainwave-based control of your devices, and Apple lists iOS, iPadOS and visionOS as supporting the new protocol. Again, it's unclear how far off brainwave-based control might be, and I've asked Apple for more information on this as well.

Apple TV users are getting accessibility updates, too.

Apple is adding Name Recognition to Sound Recognition, letting users who are deaf or hard of hearing know when their name is being called (on top of existing alerts for sounds like alarms or crying babies). In CarPlay specifically, Sound Recognition will notify users when it hears a crying child, alongside the existing support for outside sounds like horns and sirens. CarPlay is also gaining support for Large Text, making it easier to take in important information at a glance.

Other updates include more language support for Live Captions and Voice Control, as well as the ability to quickly and temporarily share accessibility settings across iPhones and iPads, so you can use a friend's device without having to reconfigure it to your needs.

From Apple's retail locations to Music playlists, Books, Podcasts, TV, News, Fitness+ and the App Store, many more accessibility updates are on the way, mostly centered around greater representation and inclusion. Most of the new features and updates I've covered here don't have a firm release window, though they typically arrive in subsequent releases of iOS, iPadOS, macOS and visionOS.

We'll have to wait until iOS 19, iPadOS 19 and the rest roll out publicly to try these for ourselves, but for now, most of them look genuinely helpful. And as always, it's heartening to see design that's inclusive and considers a wide range of needs.
