“If you’re a visual learner and you’re vision impaired, you’re a bit screwed – and I think I’m one of those people,” laughs Alex Man, a digital marketer and the Assistive Technology Officer for the Royal Society for Blind Children (RSBC). Alex was born with glaucoma, which damaged his optic nerve, impairing his sight from a very young age.

As both a blind person and a digital marketer, Alex is well aware of how issues of accessibility – particularly alt text – can become distorted by other marketing priorities.

“It’s rare that companies will add a descriptive alt text versus an alt text they think Google would like,” he says. “I’ve even been on websites that use keyword stuffing. They’ll have a graphic and the alt text says, ‘Cheap, blah, blah, blah, blah’. ‘Cheap’ is not a visually descriptive word, so why would you put cheap in there?”

The risk of treating alt text as just another SEO opportunity is that it reduces digital accessibility to an abstract idea: a handful of core practices that we’re encouraged to follow, but that are inconsistently applied and far from compulsory.
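To make that contrast concrete, here is a minimal sketch (the product, image path and wording are invented for illustration) of alt text written for a search engine versus alt text written for someone using a screen reader:

```python
# A small illustration of the difference Alex describes. The product, image
# path and wording are invented for the example; only the principle matters.

def img_tag(src: str, alt: str) -> str:
    """Build an HTML <img> tag. The alt attribute is what a screen reader announces."""
    return f'<img src="{src}" alt="{alt}">'

# Alt text written for a search engine: a list of keywords with no visual meaning.
keyword_stuffed = img_tag("trainers.jpg", "cheap trainers cheap running shoes buy shoes sale")

# Alt text written for a person who cannot see the image.
descriptive = img_tag("trainers.jpg", "Red mesh running shoe with a white sole, shown side-on")

print(keyword_stuffed)
print(descriptive)
```

A screen reader announces the second version as a description of the picture; the first is read aloud as a meaningless string of keywords.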

“People still get confused. They’ll post in a forum asking, ‘Do I really need alt text?’ and everyone’s replying, ‘SEO, SEO, SEO’. And I’m the only blind dude pointing out it’s also a good thing for screen reader users as well, by the way.”

While marketers and content creators routinely debate and swap advice on if, when and how to use these practices, a disabled person can’t choose if, when and how to be disabled. For them, digital accessibility is a full-time, lived experience.

Chloe Tear was born with mild cerebral palsy and began writing an award-winning disability blog about her experiences when she was 15. During her first year at university, Chloe began to lose her sight due to a visual cortex disorder. “Considering I was doing my degree, I was having to constantly consume information. There wasn’t time to think, ‘I can’t access this,’ because I needed to. I had to adapt and find ways of doing things again.”

Chloe did adapt and graduated from university with a First. Today, she works in the charity sector as a freelance writer and content designer and also moderates on a disability forum.

Yes, blind people use Instagram

Like Chloe and Alex, 93% of registered blind people can still see something, according to the Royal National Institute of Blind People. Whatever partial sight remains is therefore extremely valuable to them.

“Prior to losing my sight, I was a very visual person. In a way I still am, which is quite ironic,” says Chloe, who was active on Instagram long before her vision began to deteriorate. Instagram has remained part of her content mix despite being an image-focused platform.

“When posting to Instagram, I will go through my photos where I can zoom in and see which ones they are, saving them to a separate album or to my favourites. Then, when I’m going to post them, I’ve already done the editing and choosing in a more accessible way. I know the photos I’m putting up, rather than guessing from little thumbnail pictures.”

Chloe is also active on Facebook and Twitter, relying on heavily enlarged text rather than a screen reader.

Alex’s sight has deteriorated more severely than Chloe’s. As he describes on his website, large text is not an option unless it is blown up “insanely large, and I mean one-word-taking-up-a-whole-monitor-screen large”.

Being born with a visual impairment means Alex has only ever experienced social media through assistive technology. “I use Instagram occasionally, but I use Facebook a lot more because there’s accompanying text. A lot of people post a picture on Instagram without any captions apart from some hashtags, and I don’t get much out of that.”

Because he prefers to use his remaining vision whenever possible, Alex switches between visual aids such as magnifiers and text-to-speech aids such as screen readers. “When I’m going through my feed on Instagram, I disable VoiceOver and look at the pictures. If it’s something I’m interested in, then I turn on VoiceOver to see if they’ve added a caption or description. But normally I would just zoom in and use my remaining vision to access it.”

For most users, social media is a fast-moving, spontaneous space where the impetus is to post in real time. Twitter’s Tweet compose box still prompts the user with the question “What’s happening?”, while Facebook asks, “What’s on your mind?”, reinforcing that social media operates very much in the present tense – this moment, right here, right now.

But for Alex and Chloe, going through their feeds takes more time and effort, making social media a much slower experience. As a result, it becomes a more considered and deliberate space.

“Being more considerate of when I post and the fact it takes me longer, I don’t think is a bad thing,” says Chloe. She thinks the urge to post on social media immediately after the photo is taken or in the heat of the moment too often gets in the way of the moment itself. “Because of the extra work I have to do, I’m more in the moment when I’m there. Rather than trying to write the perfect caption 30 seconds after you’ve taken a photo, because it needs to go up now, just enjoy what you’re doing and post it later.”

Accessibility frustrations

As adept as both Alex and Chloe are at using assistive technology, some visual content remains stubbornly invisible to them.

You might not think of a PDF as visual content, but to a blind person there is often little difference – which is why PDFs are Chloe’s biggest pet hate. Many PDFs flatten the images and text within a document into a single layer. “That’s a flattened image,” says Chloe.

“Even with enlarged text, you’re constantly scrolling to one side to read the line, and then you’ve got to scroll all the way back to read the next line and so on. You can do a text PDF, but that requires a lot more skill than the average person creating a PDF has.”

If your reason for creating a PDF is to prevent others from editing the document, Chloe says to simply lock the Word document instead.
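If you’re not sure whether your own PDFs fall into this trap, a quick test is whether the file exposes any selectable text at all. The sketch below uses the open-source pypdf library, with a placeholder filename; if no page returns text, assistive technology is effectively looking at a flattened image.

```python
# A minimal sketch, assuming pypdf is installed ("pip install pypdf") and
# "report.pdf" stands in for whichever document you want to check.
from pypdf import PdfReader

reader = PdfReader("report.pdf")

# extract_text() returns an empty string for pages that are just flattened images.
has_text_layer = any((page.extract_text() or "").strip() for page in reader.pages)

if has_text_layer:
    print("Text layer found: screen readers and enlarged, reflowed text have something to work with.")
else:
    print("No extractable text: to assistive technology, this PDF is a flattened image.")
```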

Alex’s biggest frustration is tutorial videos where the only audio is music, with no narration. “As soon as I come across that, I go out of it and find another one. As a marketer with my SEO hat on, I’m thinking that I just clicked away from your thing. Google or YouTube might report one more bounce and that makes me feel a little bit better. That person won’t get as many views of the video because they didn’t put in the effort, versus someone who does a really detailed instructional video with a voice and guidance. But in the long run it doesn’t really change anything, because I’m just one person and blind people are still a minority.”

Music-only videos are also common on TikTok and Instagram Reels, relying on on-screen captions to convey a few nuggets of text-based information. This may be fine for deaf users – and for others who prefer to watch videos with the sound off – but such videos can be completely inaccessible to blind users.

To be truly accessible, content creators should therefore also consider whether, with a little narration or dialogue, the audio can convey the information independently of the visuals.

Is technology the solution to accessibility?

Alex accepts that most people will try to keep text to a minimum on social media. “A picture is still worth a thousand words,” he says, even though the opposite is true for him. “I like genuine conversational stuff. That’s why platforms like Clubhouse have been so popular in the blind community. Obviously, that has caused another issue, because what Instagram is to blind people, Clubhouse is to deaf people.”

Live transcription isn’t yet available on Clubhouse, but some users are already creating workarounds with third-party apps like Otter.ai. The technology to automatically transcribe speech into text has been around for years, and its accuracy is steadily improving. In 2020, IGTV launched auto-generated captions, making it easier for users to provide reasonably accurate captions for their spoken-word videos.
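The underlying speech-to-text capability is now freely available. As a rough sketch, the snippet below uses OpenAI’s open-source Whisper model purely as a stand-in for this kind of technology (Otter.ai and Instagram run their own systems, and the filename is a placeholder):

```python
# A minimal sketch, assuming the openai-whisper package and ffmpeg are installed.
# Whisper is only a stand-in for the kind of speech-to-text these services build on.
import whisper

model = whisper.load_model("base")            # small, general-purpose model
result = model.transcribe("spoken_clip.mp3")  # placeholder filename

print(result["text"])  # the auto-generated transcript, ready to be edited into captions
```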

However, live transcription is far easier for AI to handle than interpreting and describing a static image. Alex believes deaf users are better served by automated live transcriptions and captions than blind users are by automatically generated image descriptions, such as those now supported on Facebook and Instagram.

“It’s minimal what the computer can tell you,” Alex says. “It can say the image is outdoors with trees and sky and stuff like that. But it doesn’t tell you anything worth knowing, especially with things like memes and jokes. AI can tell you the image is of a man standing outside, but you won’t ‘get’ the meme unless you know the source of it. It won’t tell you this is a screenshot from a movie. The context is really important, and AI can’t do that yet.”
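It is easy to see what Alex means by pointing an off-the-shelf captioning model at an image. The sketch below uses the open-source BLIP model from Hugging Face – not whatever Facebook or Instagram run in production – with a placeholder image path; the output tends to be exactly the kind of flat, context-free description he is talking about.

```python
# A minimal sketch, assuming the transformers, torch and Pillow packages are installed.
# BLIP is an open-source captioning model, used here only to illustrate the point.
from transformers import BlipProcessor, BlipForConditionalGeneration
from PIL import Image

processor = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-base")
model = BlipForConditionalGeneration.from_pretrained("Salesforce/blip-image-captioning-base")

image = Image.open("meme.jpg").convert("RGB")   # placeholder image path
inputs = processor(images=image, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=30)

# Typically produces something like "a man standing outside" - accurate, but with
# none of the context (which film, why it is funny) that makes the image meaningful.
print(processor.decode(output[0], skip_special_tokens=True))
```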

Chloe cautions against content creators and businesses relying on platforms and devices to solve their content accessibility problems, as not every user will have access to (or be able to afford) the same technologies. Through her blog, Chloe has had the opportunity to test and review many assistive products that would otherwise be too expensive for her to consider using. Meanwhile, most users don’t get to try every new innovation on the market.

“Technology is becoming better and better at making the internet accessible, which is great. But it means businesses and organisations are still getting away with not following these practices. And people with older forms of assistive technology still won’t be able to access it.

“It shouldn’t be down to the technology to be smarter, because then it becomes more expensive,” says Chloe. “It should be down to making the content accessible, so users get the same experience regardless of the assistive technology they use.”