Tools like screen readers have long helped people with a range of physical abilities get more out of their phones.

More accessibility features have been introduced recently, including improved live-transcription tools and apps that use artificial intelligence. When enabled, your phone can alert you when a baby is crying or when you are approaching a door, for example.

Here is a tour of some of the tools that can make your phone easier to use.

To find all of the accessibility tools and features available, open the settings app on your phone. It is a good idea to take time to experiment with them.

Apple's iOS software, left, and Google's Android system each have an Accessibility section in the main settings area that can be used to configure the phone for specific needs. Credit: Apple; Google

The websites of Apple and Google have dedicated sections for accessibility, but note that the exact features available will vary based on your phone model.

Swiping and tapping the screen to navigate a phone's features doesn't work for everyone, but there are other ways to move through the screens and menus.

The Back Tap function, which performs assigned actions when you tap the back of the phone, is in the iOS Touch settings.

Similar options can be found in Android's accessibility shortcut settings. One way to access these is to open the main settings app and select System, then Gestures and System navigation.

Both platforms also support navigation through third-party adaptive devices or by using the camera to recognize facial expressions, like looking to the left or to the right. The devices and actions can be configured in the accessibility settings.
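For those curious how this works under the hood, here is a rough sketch of camera-based gesture detection on iOS using Apple's ARKit face tracking. The 0.7 threshold and the "back" action it triggers are illustrative assumptions, not either platform's actual implementation.

```swift
import ARKit

// A rough sketch of detecting a "look left" gesture with ARKit face
// tracking. Blend-shape values run from 0.0 to 1.0; the 0.7 threshold
// and the triggered action are illustrative assumptions.
class GestureNavigator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a device with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            if let glance = face.blendShapes[.eyeLookOutLeft]?.floatValue,
               glance > 0.7 {
                print("Look-left detected: could trigger a 'back' action")
            }
        }
    }
}
```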

Both iOS, left, and Android can map specific phone functions to facial expressions and gestures. Credit: Apple; Google

Those who can't see the screen have tools from Apple and Google to help. The VoiceOver feature in Apple's software gives you an audio description of what's on the screen as you move your finger around it.
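To illustrate the kind of engine behind such spoken feedback, here is a minimal sketch using Apple's AVSpeechSynthesizer, the system's text-to-speech API. The sample phrase is invented, and this is not VoiceOver itself.

```swift
import AVFoundation

// A minimal sketch of spoken feedback: AVSpeechSynthesizer reads a
// string aloud, much as a screen reader announces the element under
// your finger. The sample phrase here is made up for illustration.
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "Settings button. Double-tap to open.")
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
utterance.rate = AVSpeechUtteranceDefaultSpeechRate
synthesizer.speak(utterance)
```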

The Voice Control feature lets you operate the phone with spoken commands, and Android's Select to Speak setting reads aloud whatever is on the screen.

Don't forget the hands-free ways of using your phone: you can open apps and perform actions with spoken commands, and Apple's keyboard includes a dictation feature that lets you write with your voice.
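For a sense of how voice input works programmatically, here is a hedged sketch using Apple's Speech framework to transcribe a recorded audio file. The file name is hypothetical, and real dictation streams live microphone audio rather than reading a file.

```swift
import Speech

// A sketch of speech-to-text with Apple's Speech framework. The
// bundled file "memo.m4a" is hypothetical; live dictation would use
// a microphone-based request instead of a file-based one.
SFSpeechRecognizer.requestAuthorization { status in
    guard status == .authorized,
          let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          let url = Bundle.main.url(forResource: "memo", withExtension: "m4a")
    else { return }

    let request = SFSpeechURLRecognitionRequest(url: url)
    _ = recognizer.recognitionTask(with: request) { result, error in
        // Print the final transcription once recognition completes.
        if let result = result, result.isFinal {
            print(result.bestTranscription.formattedString)
        }
    }
}
```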

The accessibility settings also make it possible to zoom in on parts of the phone's screen. If you prefer larger, bolder text and other display adjustments, open the settings app and select Display & Text Size on iOS, or Display size and text on Android.
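On the developer side, larger text works in apps that adopt what Apple calls Dynamic Type. A minimal sketch of how an app honors the system text-size setting:

```swift
import UIKit

// A minimal sketch of Dynamic Type: fonts requested by text style
// scale automatically when the user raises the system text size.
let label = UILabel()
label.font = UIFont.preferredFont(forTextStyle: .body)
label.adjustsFontForContentSizeCategory = true
label.text = "This text grows and shrinks with the system setting."
```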

Apple's Magnifier app got an upgrade in iOS 16. People who are blind or have low vision can use the app to detect doors and people nearby, as well as to identify objects and surroundings.
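Door Detection itself relies on the iPhone's LiDAR scanner, which is beyond a short example, but the text-reading piece, such as picking out a door number, can be sketched with Apple's Vision framework. The image and function name here are hypothetical.

```swift
import UIKit
import Vision

// A sketch of the text-reading piece only: Vision recognizes text
// (a door number, a sign) in a still image. The real Door Detection
// feature also uses LiDAR depth data, which this does not attempt.
func readText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        for line in observations.compactMap({ $0.topCandidates(1).first?.string }) {
            print(line)  // e.g., "Suite 204" on a door plaque
        }
    }
    request.recognitionLevel = .accurate
    try? VNImageRequestHandler(cgImage: cgImage).perform([request])
}
```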

The Magnifier app's Door Detection feature in iOS 16 uses the iPhone's LiDAR scanner to identify doors and door numbers in unfamiliar places. Credit: Apple
Google's free Lookout app for Android offers assistance by using the camera and audibly identifying currency, food labels, text, images and more. Credit: Google
The Live Captions tool on iOS, left, and Android's Live Caption feature automatically convert speech from video and other apps to captions on the screen, although accuracy can vary. Credit: Apple; Google

Apple's Live Captions is a real-time transcription feature that converts the dialogue around you into text, and Android's Live Caption setting captions audio and video playing on your phone.

The Sound Recognition feature in iOS, left, allows you to have the phone listen for certain sounds, like a ringing doorbell, and provide a visual alert. Right, Android's Live Transcribe software converts nearby speech into text on the fly. Credit: Apple; Google
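Sound-alert features like these rest on sound classification. Here is a hedged sketch using Apple's SoundAnalysis framework and its built-in classifier; the recording path is hypothetical, and this is not the Sound Recognition feature itself.

```swift
import SoundAnalysis

// A sketch of sound classification, the kind of analysis behind
// alerts for a doorbell or a crying baby. The file path below is
// hypothetical; the built-in classifier labels hundreds of sounds.
class SoundObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        print("Heard: \(top.identifier), confidence \(top.confidence)")
    }
}

let observer = SoundObserver()
let url = URL(fileURLWithPath: "/path/to/recording.wav")  // hypothetical file
if let analyzer = try? SNAudioFileAnalyzer(url: url),
   let request = try? SNClassifySoundRequest(classifierIdentifier: .version1) {
    try? analyzer.add(request, withObserver: observer)
    analyzer.analyze()
}
```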