Yep. Following standards has great side-effects all around. The same things that break sites for the blind also break them for UX-enhancement extensions like Tridactyl, which lets you click elements from the keyboard, as long as sites don't go out of their way to make clickable buttons undiscoverable.
(Extreme apologies for any implied equivalence between myself and the blind.)
I actually think this is a great example of the Curb Cut effect, where accessibility features that are vitally important to one group of people (wrt curb cuts, wheelchair users) also provide broad benefits to many others (wrt curb cuts, one example would be parents with small children in strollers).
Web scraping isn't bad per se. It's being made to look bad by parties that want to eat their cake and have it too. If you show something to a person, that person should also be able to use custom automation to view it. If you show something publicly, the public should be able to use custom automation just as well. Don't want someone scraping your site and using it for profit? Limit the audience and sue the people making a profit off your data for copyright infringement.
> It has solutions, like Google voicing out the contents, instead of doing a webpage that is so scrappable that screenreaders can parse it.
The real solution is to find a business model, or a way to organize society, that does not depend on us building nonsensical prisons or restrictions for each other. It's almost as if surveillance capitalism is not the final answer...