If only Apple’s Voice Memos did transcription
Posted 30th October 2019 in Apple
I often have an idea for a blog post on my way to or from work. It’s easy to get the idea out if I’m sitting on the bus or Metro, but if I’m walking, typing out the outline for a blog post isn’t the easiest. This is where dictation comes in.
Dictating an idea into Apple’s Voice Memos app is easy, but that seems to be where it stops. I never check back: mainly because it’s a pain to listen back to a memo and transcribe it by hand, but also because there’s no way to search my memos for that brilliant idea I had. Transcription would solve both of these problems.
It would also make a decent accessibility feature for people who find typing awkward.
Apple’s software is certainly capable of transcribing. I sometimes dictate a blog post by opening iA Writer and hitting the dictation button on the keyboard, but I focus so much on the text being produced that it ends up not much quicker than typing. Voice Memos removes that friction.
When Apple revamped Voice Memos in iOS 12 I searched in vain for an ‘export to text’ option of some sort; when iOS 13 arrived this year I had another look but, again, no sign of any transcription feature.
I thought I might be able to hack my way around it. Maybe a Siri Shortcut that transcribed a memo? Nope. Maybe I could play a memo on my Mac (now that they sync to Voice Memos on Catalina) with my phone held up to the speaker, iA Writer open and dictation enabled? Really…?
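In theory a little app could do it: Apple’s Speech framework (the same machinery behind dictation) will happily transcribe an audio file you point it at. Here’s a rough Swift sketch – assuming you’ve already exported a memo as an audio file to some `memoURL`, which is exactly the manual step I’d rather not have to do:

```swift
import Speech

// Rough sketch: transcribe an exported Voice Memo with Apple's Speech framework.
// Assumes the memo has been exported (e.g. as an .m4a) to `memoURL`.
func transcribeMemo(at memoURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              // Assuming a British English memo; swap the locale as needed.
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-GB")),
              recognizer.isAvailable else {
            print("Speech recognition isn't available")
            return
        }

        let request = SFSpeechURLRecognitionRequest(url: memoURL)
        // On iOS 13 you can ask for on-device recognition, but I'd happily take server-side:
        // request.requiresOnDeviceRecognition = true

        _ = recognizer.recognitionTask(with: request) { result, error in
            if let error = error {
                print("Transcription failed: \(error.localizedDescription)")
            } else if let result = result, result.isFinal {
                // The finished transcript, ready to search or paste into a draft.
                print(result.bestTranscription.formattedString)
            }
        }
    }
}
```

The missing piece is the hand-off: Voice Memos won’t pass its recordings to anything like this on its own, which is exactly the gap I wish Apple would close.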
So imagine my jealousy when Google announced that the new Pixel 4 phone transcribes your dictations into text.
It’s not that Apple couldn’t do it. And I don’t even care about live transcription, or even on-device transcription – I’d be happy to let iCloud do its thing on my memos and come back later to find them transcribed.
I’m hoping Google’s new feature shakes Apple into building the same kind of thing. Live or as post-processing, it’d be hugely useful to me.