I really liked this article because it gave me my first real look into a11y development at the GNOME/OS level from a developer's perspective.
And it really hit home: I have done a ton of accessibility development and testing in the past, mostly for websites and mobile apps.
For that, since I work on Debian/GNOME, I had to set up Windows virtual machines to run screen readers like JAWS or NVDA.
I also remember wanting to try Orca when I first got into a11y development, but I failed to get it to work.
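In case anyone else got stuck there: on a reasonably current GNOME session, Orca can usually be toggled with Super+Alt+S, or enabled from a terminal. A minimal sketch, assuming a default GNOME install with the orca package present:

    # Enable the screen reader via GSettings (persists across sessions)
    gsettings set org.gnome.desktop.a11y.applications screen-reader-enabled true

    # Or start (or restart) Orca directly for the current session
    orca --replace

I have not verified this on every distro, so treat it as a starting point rather than a guarantee.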
And since Orca is just not widely used and does not represent the market well, I had an argument for not digging deeper.
On desktops, NVDA and JAWS are the big players, both Windows-only, with NVDA being open source under GPLv2 and JAWS being proprietary. On mobile, it is all about Apple's VoiceOver and Android's TalkBack.
From my experience, the projects that really nail accessibility on mobile and the web start thinking about it before a single line of code is written or any UI design is done.
That is probably because accessibility is so tied up with UI/UX design and user flow.
Because accessibility comes with all those tricky non-functional requirements, it takes a cross-functional team effort involving designers, developers, content creators, and product managers.
Trying to improve accessibility after the software is already designed and built is tough.
Sometimes you have to completely redo the UI and UX. When I think about doing that at the OS level or within GNOME itself, it feels overwhelming, like trying to wear down a stone drop by drop. And before you can even drop anything, you have to talk to ten different people and teams.