Touch is for input. It has the affordances of a D-pad: up, down, back, forward, select. Also: options, a.k.a. right-click. Implemented on a phone, these map to swipe up, swipe down, swipe left, swipe right, and tap. Up and down traverse a browseable list. Left and right traverse history. Tap performs the default action on the current item. Long press activates options.
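That gesture mapping could be sketched as a small classifier over touch coordinates. The thresholds and action names below are illustrative assumptions, not a tested design:

```javascript
// Sketch of mapping raw touch data to the D-pad-style actions above.
// Threshold values are illustrative assumptions.
const SWIPE_DISTANCE = 30; // px of movement before a touch counts as a swipe
const LONG_PRESS_MS = 500; // hold time before a tap becomes "options"

// Classify one touch, given where it started, where it ended,
// and how long the finger stayed down.
function classifyGesture(startX, startY, endX, endY, durationMs) {
  const dx = endX - startX;
  const dy = endY - startY;
  if (Math.abs(dx) < SWIPE_DISTANCE && Math.abs(dy) < SWIPE_DISTANCE) {
    // Little movement: either a tap (select) or a long press (options).
    return durationMs >= LONG_PRESS_MS ? "options" : "select";
  }
  if (Math.abs(dx) > Math.abs(dy)) {
    return dx > 0 ? "forward" : "back"; // horizontal swipes traverse history
  }
  return dy > 0 ? "down" : "up"; // vertical swipes traverse the list
}
```

In a real page this would be driven by `touchstart`/`touchend` listeners: record the start point and timestamp on `touchstart`, then classify on `touchend`.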
Audio is for output. It reads off menu items. It is a screen reader. It should let the user speed up speech. Audio would be best delivered through an earbud.
A second input type would be voice. I don’t see this as a navigation tool as much as an endpoint. Yes: talking on the phone. No: voice search.
Driver-oriented apps. A driver could navigate available radio stations without taking their eyes off the road. There could be a podcast directory. Or a news site where the stories are audio, but the ability to pick among them adds value.
Workout UX. A person who is jogging or doing other intense exercise could navigate a UI without interrupting their flow. For example, there could be a music player with UX targeting exercisers.
Cheaper and smaller mobile devices, with longer battery life. A screen adds a lot of cost, consumes a lot of space, and burns a lot of battery.
The app could be implemented as a web page on a mobile device. There would be little displayed, only a prompt explaining the touch input controls. Audio output would be via ARIA/WCAG.
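As a sketch of what the ARIA side might look like (the text and IDs here are illustrative), the page could hold the current item in a polite live region, which assistive technology announces whenever its contents change:

```html
<!-- Minimal sketch: almost nothing on screen, just the prompt
     and a live region for the current item. IDs are illustrative. -->
<main>
  <p>Swipe up/down to browse, left/right for history, tap to select.</p>
  <!-- role="status" / aria-live="polite": screen readers announce
       changes to this element without the user doing anything. -->
  <div id="current-item" role="status" aria-live="polite">
    Station 1: Morning News
  </div>
</main>
```

Navigation code would then just swap the text inside `#current-item` as the user swipes through the list.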
But there would need to be a screen reader built into the app. Usually the end user supplies their own screen reader, but that only works for apps whose target audience is visually impaired. It wouldn’t work for apps using audio accessibility technology for a general audience.
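One way to build the screen reader into the app itself would be the browser's Web Speech API (`speechSynthesis` is a real, widely supported browser interface; the helper names here are mine). A minimal sketch, including the speed-up control mentioned above:

```javascript
// Sketch of an in-app "screen reader" via the Web Speech API.
// Helper names are illustrative, not from any existing library.

// Clamp the speech rate to the range SpeechSynthesisUtterance accepts
// (0.1 to 10), so a "speed up speech" control can't produce a bad value.
function clampRate(rate) {
  return Math.min(10, Math.max(0.1, rate));
}

// Speak a menu item aloud, cancelling any announcement still in
// progress so rapid navigation stays responsive.
function announce(text, rate = 1.0) {
  if (typeof speechSynthesis === "undefined") return; // non-browser environment
  speechSynthesis.cancel();
  const utterance = new SpeechSynthesisUtterance(text);
  utterance.rate = clampRate(rate);
  speechSynthesis.speak(utterance);
}
```

A swipe-down handler could then call something like `announce("Station 2: Jazz", 1.5)` as it moves through the list.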
I’m specifically interested in the _web_, not native apps. What inspires me is how web accessibility technology can serve the general public.
But maybe that’s bad factoring. To the extent that what I’m working on here is a type of user experience, and a value proposition, the user doesn’t care whether I’m delivering with web or native technologies. It may be easier to build on web accessibility than native SDKs, but if not then this isn’t truly a web technology.
When I googled the term “screenless web” today I didn’t find any comparable idea. I found “screenless displays”, as a reference to projector-based UI, and I found “screenless web” as an idea about wearable tech. My idea is related to wearables, but can work fine on a phone as well.
I worked out sample flows for a radio experience. You’ll need to open up the image below, zoom in, and explore.