Update, 2025: I finally got Orca working. I don't even know why it works. I switched to Wayland (Hyprland), and I think installing elogind also helped, or something. There have also been quite a few developments in the a11y scene since I wrote the post below.
I wish there were a screen reader I could use on Linux. Of course I knew of Orca's existence, but getting it to work turns out to be a hassle, maybe even impossible.
I remember that on Arch Linux I somehow got Orca to work, but maybe that was because I was using PulseAudio at the time, or something like that.
And now that I run Gentoo Linux, Orca has stopped working. The furthest I ever got was making it say "screen reader on". I had passed ALSAPCM=jack to force its ALSA output through JACK, matching my audio setup, and while that did make it speak, "screen reader on" was all it ever said.
So basically, Orca is non-functional software. Even with PulseAudio it doesn't work anymore. My WM is i3, if that matters, but I remember it working with i3 on Arch.
Most GUI programs have bad a11y. "How?" you might ask. Well, my first guess would be ignorance. While that's probably true, there's another reason that's actually worse. I wanted to make my own GUI program accessible, so I decided to dig into GNOME's AT-SPI and ATK APIs, as well as Microsoft's MSAA API and the IAccessible2 API, and what did I find? A kludge of awful documentation. In GNOME's case, it was all Doxygen-generated. In Microsoft's case, there were only confusing, nonsensical MSDN pages. No examples at all, from either side! No wonder software isn't accessible: M$ and GNOME don't offer any good documentation.
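To show just how little the docs give you, here's roughly what the happy path looks like. This is a minimal sketch, assuming GTK3, where every widget already exposes an AtkObject; the name and description strings are placeholders I made up:

```c
/* Minimal sketch: exposing accessible metadata on a GTK3 widget via ATK.
 * Build (assumption): cc demo.c $(pkg-config --cflags --libs gtk+-3.0) */
#include <gtk/gtk.h>

int main(int argc, char *argv[])
{
    gtk_init(&argc, &argv);

    GtkWidget *window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    GtkWidget *button = gtk_button_new_with_label("Save");

    /* Every GTK3 widget has an AtkObject behind it; a screen reader
     * like Orca reads its name and description over AT-SPI. */
    AtkObject *acc = gtk_widget_get_accessible(button);
    atk_object_set_name(acc, "Save document");  /* placeholder text */
    atk_object_set_description(acc, "Writes the current document to disk");

    gtk_container_add(GTK_CONTAINER(window), button);
    g_signal_connect(window, "destroy", G_CALLBACK(gtk_main_quit), NULL);
    gtk_widget_show_all(window);
    gtk_main();
    return 0;
}
```

That's not hard once you've seen it, but I had to piece it together from Doxygen stubs; neither side shows you even this much in one place.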
So even if you want to make your programs accessible, the learning curve is unacceptably high due to the poor documentation. But that's only the low-level APIs. Over the years, GUI toolkits like GTK and Qt have seen wide adoption, and it's these toolkits that talk to the low-level accessibility APIs on your behalf. You know what that means?
Yep. We're all dependent on these toolkit devs, and yet we still fuck up making our programs accessible even while using either toolkit, e.g. because we don't know how, or because we build custom widgets the toolkit can't describe for us. It's a real fucking mess. I don't even like either toolkit: GTK4 isn't properly themeable anymore, and while GTK2 might be okay, it's not cross-platform enough (no a11y on Windows). Qt is written in C++, and since I'm mostly a C programmer, I avoid C++ libraries.
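Here's what I mean by custom widgets breaking things, again as a GTK3 sketch with arbitrary contents: two widgets that look the same on screen, but only the stock GtkLabel shows up in the AT-SPI tree; the hand-drawn text is just pixels to a screen reader.

```c
#include <gtk/gtk.h>

/* Paints "Hello" by hand; AT-SPI sees nothing but an empty drawing area. */
static gboolean draw_cb(GtkWidget *widget, cairo_t *cr, gpointer data)
{
    (void)widget; (void)data;
    cairo_set_font_size(cr, 20);
    cairo_move_to(cr, 10, 30);
    cairo_show_text(cr, "Hello");
    return FALSE;
}

int main(int argc, char *argv[])
{
    gtk_init(&argc, &argv);
    GtkWidget *window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    GtkWidget *box = gtk_box_new(GTK_ORIENTATION_VERTICAL, 4);

    /* Stock widget: Orca can announce this label with zero extra work. */
    gtk_box_pack_start(GTK_BOX(box), gtk_label_new("Hello"), FALSE, FALSE, 0);

    /* Custom widget: same pixels, but nothing in the a11y tree unless
     * you implement the ATK interfaces yourself. */
    GtkWidget *area = gtk_drawing_area_new();
    gtk_widget_set_size_request(area, 100, 40);
    g_signal_connect(area, "draw", G_CALLBACK(draw_cb), NULL);
    gtk_box_pack_start(GTK_BOX(box), area, FALSE, FALSE, 0);

    gtk_container_add(GTK_CONTAINER(window), box);
    g_signal_connect(window, "destroy", G_CALLBACK(gtk_main_quit), NULL);
    gtk_widget_show_all(window);
    gtk_main();
    return 0;
}
```

Making that drawing area accessible means implementing interfaces like AtkText by hand, which is exactly the under-documented work nobody does.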
I've been thinking, and I feel like textual environments make the most sense for blind people. There's little point to a GUI, although it does provide a tabbable UI with a DOM-like tree of elements, which some blind people seem to prefer over a CLI program. But CLI programs are the easiest to make accessible, as there's a lot less kludge to deal with. As long as you avoid having multiple things on the screen at once and abusing ncurses, you'll be fine. For inspiration, enable Links's braille mode.
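By contrast, here's the kind of CLI program I mean, as a plain C sketch with made-up prompt text: one prompt, one full line of output per action, no cursor addressing, so a VT screen reader or brltty can read the transcript top to bottom.

```c
#include <stdio.h>
#include <string.h>

int main(void)
{
    char line[256];

    for (;;) {
        /* One simple prompt; no redraws, no screen regions to guess at. */
        fputs("> ", stdout);
        fflush(stdout);

        if (!fgets(line, sizeof line, stdin))
            break;                          /* EOF ends the session */
        line[strcspn(line, "\n")] = '\0';   /* strip trailing newline */

        if (strcmp(line, "quit") == 0)
            break;

        /* Respond with one complete line that reads well as speech. */
        printf("you said: %s\n", line);
    }
    return 0;
}
```

No accessibility API involved at all; the terminal's linear text flow is the interface.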
I tried running brltty in a VT, but I never got speech to work. I even tried with plain ALSA, and still got no output. I did borrow a friend's refreshable braille display to try with brltty, and it worked, so if you've got a braille display you're good to go. These things are god-awfully expensive, though! They easily cost 4000 euros.