Let’s do a little experiment:
Wait, did you get to Step 3 with your eyes closed? Maybe read all the steps first, then execute… How’d it go? I’m guessing it went rather poorly, since your eyes were shut, and you likely had no tactile feedback about anything going on in your app or what your finger swipes were doing. Now imagine you have a visual impairment and this wasn’t just an experiment. Imagine that you live every day trying to navigate through different technologies such as apps on your phone.
To be blunt, accessibility does not affect me or anyone I'm close to. That doesn't mean that I lack empathy for those it does affect. Just like I'm happy to help a shorter person get something off the top shelf at the grocery store, I'm happy to put the small amount of effort into development to make an app work for those who struggle with day-to-day tasks on their phone.
The good news is that Apple recognizes that this is a real problem that needs a solution, and for years has continued to improve accessibility settings and support in iOS. I recently spent some time adding VoiceOver accessibility support for an iOS app, and I thought it would be great to share my experience with this topic.
In this post I'm going to navigate my way through one of my favorite apps (designed by the amazing team at Atomic Robot) and see how well it functions with VoiceOver turned on, if at all. CoasterRadio mobile is a pet project of mine, and during development I was focused on cranking out features as quickly as possible in my spare time rather than thinking about making the app accessible to all users. In a typical development cycle, VoiceOver support would be included in the requirements for a feature and called out in the acceptance criteria. My personal projects don't have either of those things, which meant that VoiceOver support was completely overlooked.
First, let’s turn on VoiceOver:
If this is your first time enabling VoiceOver, you’re probably going to notice a few things:
If you take a moment to think about how a blind user might interact with apps on their phone, these things probably start to make more sense. If a user can’t see visual components on the screen, how will they know where exactly to tap their fingers? How will they know the primary function of the current screen? Between Siri describing the visual elements and these special swipe and tap actions, a user can start to interact with the app without actually having to see it.
Apple has a nice list of VoiceOver gestures, but for simplicity we’re only going to focus on three of them:
We’ll swipe to navigate through the screen and double-tap to take action on the selected element. Now, let’s try our experiment again from the beginning but this time with VoiceOver turned on. How was that?
I’m guessing it wasn’t a perfect experience, but a big improvement over navigating your app with your eyes closed and VoiceOver turned off. Without further ado, here’s a VoiceOver demo with a fresh installation of the CoasterRadio.com mobile app:
If you watched the screen recording with audio, VoiceOver does a pretty nice job of reading the content of the elements as you interact with them. This is because the default iOS controls in UIKit and SwiftUI (e.g., Text) already support accessibility out of the box, and VoiceOver knows how to handle them. Where things start to fall apart for us is when swiping right doesn't seem to navigate between the main cards correctly. In fact, it looks like it's starting in the upper left corner and proceeding left to right, top to bottom through each element…because this is the default behavior. This might work in many cases, but in ours it doesn't make sense.
Another thing you might notice is that VoiceOver is reading the names of the images on the cards. In some situations reading information about an image might be useful. However, in this example the image doesn't really provide any value for someone with a visual impairment, because it's designed to provide visual context when looking at the cards on the screen. It may be best to skip these images altogether when swiping through elements.
The last thing that could use some improvement is the focus over a tall, skinny element that appears empty and doesn’t do anything. I’m not completely sure what this is, but we can find out!
Our “Play Now” buttons work fine, but how might we address these other three issues? The CoasterRadio.com app is built primarily with SwiftUI, meaning all code examples will use that toolkit, but accessibility is also fully supported in UIKit. I'm going to break this down into fixing the three issues, easiest to hardest:
This one is easy. Have a look at .accessibilityHidden(_:). Setting this value to true tells iOS that we want to ignore this element for anything related to accessibility support. In our case, it means VoiceOver won't focus on this element.
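As a quick sketch (the view and image names here are placeholders, not the app's actual code), hiding a purely decorative image from VoiceOver looks like this:

```swift
import SwiftUI

struct CardHeaderView: View {
    var body: some View {
        HStack {
            // Decorative artwork that adds no information for VoiceOver
            // users, so we remove it from the accessibility tree entirely.
            Image("coasterArtwork")
                .resizable()
                .frame(width: 44, height: 44)
                .accessibilityHidden(true)

            // The text still reads normally when focused.
            Text("Latest Episode")
        }
    }
}
```

With that one modifier, swiping through the screen skips straight from the previous element to the text, and the image is never announced.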
I think I know what this is… when implementing these cards, I wanted the title to always be tall enough to hold two lines of text whether or not there was enough content to force a line wrap. The workaround [at the time] was to place the Text element in a ZStack with another Text element that only contained a newline, like this:
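The original snippet isn't reproduced here, but based on that description the workaround looked roughly like this (the title property is illustrative):

```swift
import SwiftUI

struct CardTitleView: View {
    let title: String

    var body: some View {
        ZStack {
            // Invisible Text containing only a newline, which forces the
            // ZStack to always reserve two lines' worth of height.
            Text("\n")
                .font(.headline)

            // The real title, which may occupy one or two lines.
            Text(title)
                .font(.headline)
        }
    }
}
```

That empty-looking two-line Text is exactly the tall, skinny element VoiceOver was focusing on.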
That spacer Text view should simply be ignored, which is another easy fix for .accessibilityHidden(_:).
This one might be a bit more complicated, but we'll have to dig in and find out. In code, the large cards (LargeCardView) are all self-contained Views with nested HStacks to hold the text and images, and the LargeCardViews all live inside a ScrollView(.horizontal). The issue here is that VoiceOver doesn't know how to handle these custom views, and/or that LargeCardView should even be a focusable element. To fix this, I'm going to apply the .accessibilityElement modifier and pass in children: .contain. You can read the docs, but basically this is saying “this whole card should be considered a single accessible element, but we also want to include the child elements so they can be focused as well.” It might be nice for the user to know what a card contains without having to swipe between all the children, so I also added the .accessibilityLabel(<card type>) modifier directly to the LargeCardView, and set the redundant Text view to .accessibilityHidden(true).
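Here's a sketch of how those modifiers fit together; LargeCardView's real layout and property names are assumptions on my part, not the app's actual source:

```swift
import SwiftUI

struct LargeCardView: View {
    let title: String
    let cardType: String   // e.g. "Latest Episode" — assumed property

    var body: some View {
        VStack(alignment: .leading) {
            // Decorative artwork, hidden from VoiceOver as in issue #1.
            Image("cardArtwork")
                .accessibilityHidden(true)

            Text(title)
                .font(.headline)
        }
        // Treat the card as a single accessible container: VoiceOver
        // focuses the card first, then lets the user drill into children.
        .accessibilityElement(children: .contain)
        // Announce what kind of card this is as soon as it's focused.
        .accessibilityLabel(cardType)
    }
}
```

With the cards marked as containers, a right swipe in the horizontal ScrollView now moves card to card instead of wandering element by element across the screen.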
Let’s see how we did:
There you have it! With a fairly minimal amount of effort, we've made one of our custom view components completely VoiceOver accessible, making it easier to use for someone with a visual impairment. We're well on our way to making the entire app work well with accessibility features.
As you build and test your changes on a physical device, you might find it useful to toggle VoiceOver on and off quickly. On a device, you can set the Accessibility Shortcut and toggle VoiceOver with a triple-click of either the Home button or the side button. I found this extremely useful while making code changes and trying to test them. For testing your accessibility changes on the iOS Simulator, there's also the Accessibility Inspector, but quite frankly, I found it to be unreliable and defaulted to testing on a physical device.
We've only started to scratch the surface of accessibility and VoiceOver in iOS. If you're interested in diving deeper, you may enjoy reading Apple's Human Interface Guidelines for accessibility, and the specific guidelines for VoiceOver. We haven't even talked about neat stuff like the rotor or custom actions! I hope you've found this content useful and that you start to think about accessibility support in your apps, especially with new features. Keeping accessibility in mind from the beginning demonstrates support for all your users, and will save refactoring time down the road. Until next time, go build something cool!