Google’s “Voice Access” is decent for controlling the device through verbal commands, but you have to be looking at the screen to get results - it won’t read anything back to you.
Google’s “TalkBack” will read what’s on screen to you, but you have to interact with the screen physically (never mind the significant change in how interactions work - I understand the need for it, but it’s still a serious mental PITA to switch between the two interaction methodologies frequently).
Is there no way to just interact with it entirely verbally? A (very) simple example of what I’m looking for:
- “What are the current Google News headlines?”
- Starts reading each one aloud, along with the names of the sources.
- “Read the article about Trump caught making out with Elon from AP News.”
- Proceeds to load the article & read it aloud.
(Yeah, I know there are podcasts for this - it’s meant to illustrate the basic idea of completely verbal interaction with the device, not be an actual problem I’m looking for someone to provide a solution to.)
It just seems to me that we should be able to do this by now - especially with all the AI blow-up over the past couple of years. Can anybody point me to a usable solution to accomplish this?
TIA.
EDIT: I thought of a better example (I think), because it occurred to me that the above one could (sort of) be done with a Google Home speaker. I’m looking to be able to interact with Android apps verbally wherever possible, so my better example is “What are the latest posts made to the ‘No Stupid Questions’ community on Lemmy?” So far as I know, Google Home is not able to do such a thing. I’d like to tell Android to open my Lemmy client and start reading post headlines until it hit one I wanted to have it open & read to me.
I’m basically looking to use apps verbally to fill in gaps that Google Home/Assistant don’t cover.
EDIT 2: Here’s an even better, more universally applicable description of what I’m after - copied from a response I gave to another comment:
Imagine someone doing a relatively mindless, menial job such as working an assembly line, janitorial work, or chauffeuring - something where your mind is relatively unoccupied, but you’re not free to look at and/or touch your device (whether due to practicality or job rules). While doing that job, I want to be able to have the device read and interact with something of interest to me at that moment (ADHD is a fickle mistress), rather than just relying on podcasts with predefined content. Kind of like having someone next to me doing all the interfacing between me and the device.
iOS will open your Lemmy client, start reading posts to you aloud, and go into a post of interest upon command without you ever looking at or touching the screen (using my newer example that I added to the OP)? I’m seriously going to have to look into getting an iDevice of some sort if so.
I don’t know if this is still the case, but I know years back iPhones were preferred by a lot of blind people in terms of accessibility. Digging through the accessibility settings, it looks like you can use Voice Control to tell it to open Lemmy, and VoiceOver to read all the text on the screen without touching it. I don’t know about the example you added to your OP; adding phrases it would need to interpret seems more like a Siri thing (which I don’t use), so I don’t know how well that plays with Voice Control.
I wouldn’t rush out to buy anything unless some Android people confirm it’s not doable. Apple does have people who know the software working at their stores, so they could tell you specifics for sure. And check that I’m not totally wrong, lol.
Yeah, I wouldn’t just jump in without looking first. If I can’t find a way to do this, then I’m definitely gonna have to take a trip to the nearest Apple Store, though. Thanks very much for the input!