I don’t personally use generative AIs that create alt text from images, but I’ve seen a lot of people who do, and I think that’s the one use of AI I can’t find anything negative about.
The fact is that many people don’t know how to write good alt text, so they either don’t add any or add one that is just the word “photo/image”, which is even worse.
I think if AI had started there, as a tool to help with accessibility, it wouldn’t have the stigma it has today.
But of course, there are not enough blind people in the world from whom to get absurdly, ridiculously, vulgarly obscene amounts of profit…
(edit: I kind of regret posting this image for reasons described below, but I’ll leave it up for context.)
I’m glad Mary speaks for all disabled people and we can finally put this topic to rest.
Mary doesn’t get an opinion because she doesn’t represent 100% of people like her!
Nah you just don’t get to use Mary’s opinion as the final word on the matter.
Mary speaks for herself and we are acknowledging her personal opinion.
In my case (ADHD) I agree with Mary; I certainly don’t need inaccurate AI summaries shoved down my throat while being told it’s a great thing for making things accessible to people like me.
I don’t think that applies in this context: the person in the image is talking about people who use accessibility as an excuse to keep asking ChatGPT to do their homework, whereas I’m referring specifically to the legitimate use of AI solely and exclusively to facilitate accessibility. An example would be the bots on Mastodon that, if you follow them, reply to any toot you post with an image lacking alt text, adding a detailed description of the image.
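The core logic of a bot like that is simple: look at a status, collect the media attachments that have no description, and compose a reply if any are missing. Here's a minimal sketch of that check in Python. The field names (`media_attachments`, `description`, `account`/`acct`) follow the Mastodon API's status objects, but the helper names and the placeholder reply text are my own invention, and the actual streaming/posting plumbing is left out.

```python
# Sketch of the decision logic for an alt-text reminder bot.
# A real bot would receive status dicts from the Mastodon API
# (e.g. via Mastodon.py's streaming listener); here we just
# operate on plain dicts shaped like those API objects.

def attachments_missing_alt_text(status):
    """Return the media attachments of a status that lack alt text."""
    return [
        media
        for media in status.get("media_attachments", [])
        # "description" is Mastodon's alt-text field; None or "" means missing
        if not media.get("description")
    ]

def build_reminder(status):
    """Compose a reply mentioning the author if alt text is missing, else None."""
    missing = attachments_missing_alt_text(status)
    if not missing:
        return None  # nothing to do; every attachment is described
    return (
        f"@{status['account']['acct']} "
        f"{len(missing)} of your attachments have no alt text. "
        "Here is a generated description: …"  # placeholder for the AI step
    )
```

For example, a status with one described and one undescribed image would get a reply, while a fully described status would be left alone. The AI image-description call itself would slot in where the placeholder ellipsis is.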
Where is the context about using AI to do their homework?