
The new “text recognition” feature in iOS15 is absolutely magical. It’s one of those things that instantly makes you think “Damn, we’re really living in the science fiction future of the old movies now”. It works ridiculously well and I imagine it will be extremely useful. If my phone wasn’t new enough to support it, experiencing this feature would probably be reason enough for me to upgrade.

There are different ways to trigger/use it, and sadly one of them is driving me up the wall. Before iOS15, tapping into an empty text input would present you with only one option in the context menu that pops up – “Paste”. And when you selected text, “Paste” would be on the right edge of the context menu. I don’t know exactly how long it has been this way, but at least a couple of years. More than long enough for my muscle memory to deeply and thoroughly internalize the “tap empty space, instantly tap rightmost spot of the context menu” sequence.

As you can see in the screenshot, in iOS15 that rather valuable position¹ has been taken over by a fancy new icon – the trigger for “text recognition”. Tapping it replaces the keyboard with a camera view in which you can then select recognized text. Sadly, it’s not even automatically dismissed by tapping somewhere into the conversation or the text input field (as is the case with e.g. the photo picker, which occupies the same space as this camera view). The only way to cancel is tapping the “x” in the top right corner.

I’ve been using iOS15 for less than 24 hours and already I have accidentally triggered “text recognition” instead of pasting something so often that I’m just glad I have not yet thrown my phone out of the window in anger. It’s extremely jarring and stops you in your tracks while doing something that previously only took a split second. Maybe I’ll get used to the new placement quickly, but as of now it just feels needlessly disruptive.


  1. It’s not quite the same, but I can’t help but think of Fitts’s Law here. ↩︎
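For the curious: Fitts’s Law models the time needed to hit a target of width $W$ at distance $D$ – in the common Shannon formulation,

$$ MT = a + b \log_2\!\left(\frac{D}{W} + 1\right) $$

with $a$ and $b$ fitted empirically. It’s about pointer travel rather than muscle memory, hence “not quite the same” – but it captures why a big, predictable target at the edge of the menu is such prime real estate.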

Gee thanks, Apple Maps. Exactly what I needed while trying to check my ETA at the dentist here in my hometown in Lower Bavaria. I might out myself as the fun-hating, at-cloud-yelling grandpa I am deep down inside, but why the heck are there any “Editor’s Picks” in my maps app at all?


But as a more concrete nitpick: When you have properly detailed calendar entries with location/address filled in, Apple Maps (or rather: ON-DEVICE INTELLIGENCE Siri) normally does a pretty good job of suggesting that as the target location when you open the app within a reasonable time frame before that meeting. The suggestion, which you can tap to start proper navigation towards that target, already shows the ETA based on your current location.

Before taking the screenshot at the top I used this several times to check my ETA. I was running a little late and wanted to see if I had to keep up my brisk pace to make it on time. That worked perfectly fine, but starting ten minutes before the appointment it was no longer suggested – hence me scrolling down in confusion and stumbling upon the Editor’s Picks.

Not suggesting appointments that are very close makes no sense to me – aren’t those final ten minutes precisely when most people would open the app to check where exactly they need to go after arriving in the general area?

Too bad there is absolutely no space left on this entire screen for them to put this new “source” indicator, so instead the Music.app icon has to be put on top of the album art.

Why would you place the “Next” button in the left corner? Over the ten pages of this Google Form, I almost accidentally cleared the entire thing twice, simply because of how deeply ingrained the “correct” placement of a “next page” button is in my brain.

Today in “native apps excellence”: Messages.app with some amazing responsive web behavior. (The screenshot on the right is for reference – that’s what it’s supposed to look like.) This happens every other day for me; I suppose it doesn’t get the memo about retina resolution quickly enough, or something along those lines?

I really can’t believe how Messages.app is still so ridiculously bad. These are all from within one year, after the great rewrite of the macOS version. And we didn’t even publish everything that’s bad about it, because some things are so hard to capture. Most of all, it is so damn slow all the time. I often send ":D" instead of “😃” because I hit the enter key faster than the text replacement happens. Philipp regularly sends me all kinds of broken texts for the same reason.

Seriously, what is going on there? How can it be that the richest company on the planet is unable to maintain at least some semblance of quality in one of their flagship apps, one of the strongest lock-in components in their entire portfolio? There is no end of articles about how important the blue bubbles are. I used to be elated when my contacts used iMessage; these days I dread it. Encryption issues aside, how is it possible that Telegram has better UI/UX than iMessage in every conceivable way?

This one is truly baffling to me. In the end it’s probably “just” a cache mix-up, but I still find it impressive. What’s wrong? The track¹ is showing incorrect cover art on my iPhone.

On the right you can see the correct cover art, a screenshot of my local iTunes Music library from which I sync music to the phone, because I hate services. I verified that the track in question does include that very cover art (and only that cover art) within the file itself – it’s not just that weird, incomprehensible magic album-based cover mapping they introduced a decade ago.

When I saw the incorrect cover art on the phone, I immediately did a double-take. Because I knew that picture quite well – I had hand-picked it years ago for a very, very different purpose. It feels like 20 years ago, but during and after the 2016 US election, I found Seth Meyers’ “A Closer Look” segments to be worth watching. Sadly the YouTube website itself is pure cancer, so I built a custom video podcast for them using youtube-dl and dircaster. For that, I also wanted a pretty picture, so it would show up nicely in my Podcast.app. And that’s where that photo is from. I stole it from the DDG image search results myself, cropped and edited it a bit, and put it on my web server for my podcast client to pick up. You can even see it in action in a previous annoyance!
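For the curious, the setup was roughly this – the original script is long gone, so take the following as a minimal sketch, with the playlist URL and filenames as stand-ins rather than the real values:

```python
# Minimal sketch: fetch new episodes of a YouTube playlist as mp4 files
# into a directory that dircaster then serves as a podcast RSS feed.
# pip install youtube-dl
import youtube_dl

ydl_opts = {
    "format": "mp4",                                  # container Podcast.app can play
    "outtmpl": "%(upload_date)s-%(title)s.%(ext)s",   # sortable episode filenames
    "download_archive": "downloaded.txt",             # skip episodes fetched earlier
}

with youtube_dl.YoutubeDL(ydl_opts) as ydl:
    # Placeholder URL – the real one pointed at the “A Closer Look” uploads.
    ydl.download(["https://www.youtube.com/playlist?list=EXAMPLE"])
```

Run something like that from cron, point dircaster at the download directory, and you get an RSS feed Podcast.app can subscribe to – with the hand-picked picture sitting on the web server as the feed artwork.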

I most certainly never used it for anything in my Music library. I only added this custom video podcast to the Podcast.app on my phone, and of course that synced the subscription back into iTunes Music. And from there it seems to have found its way into the music cover art cache on the phone. As I said, it’s kinda impressive. Sometimes I wonder if they even know that local syncing is still “supported”.

Nelson writes about trying to sign up at Amazon:

When trying to create an account, I was told an account already existed with the email I was trying to use (and they were right, I have an AWS account with that email). So I tried to sign in using that email. Then I was told they couldn’t find an account with that email address. Odd. I tried signing into AWS with that email and sure enough I got in. It looks like emails associated with AWS cannot be used to create an account for AWS certifications, which is a bit ironic.

Thanks Nelson for the submission!

I’d love to know what this problem is. I don’t just mean that in the “if you tell me there is a problem, at least tell me something about the problem” way – I’m genuinely curious about the technical details.

It can’t be that bad, because force-closing and reopening Mail.app fixes it. Usually it seems to occur when you have a spotty connection and Mail.app isn’t able to download the actual body of the email. If they can detect the problem – as demonstrated by the error message – and clearing some cache (or whatever other voodoo force-closing achieves) can resolve it, why can’t it be fixed in a non-shitty way?

So recently I’ve been forced to reboot my iPhone more often than usual. One thing I noticed is that Spotlight (is it called Spotlight over there?) is basically useless for a solid two or three minutes after each reboot. The above screenshot (left: shortly after reboot, right: normal) isn’t some crazy trick where I captured a split second that doesn’t make sense – that’s after several retries of the same search term over the course of several minutes after a reboot.

Opening 1Password after an iPhone reboot is always the first thing I do, because 1Password requires one unlock via Master Password after a reboot before you can use Face ID again. For some annoying reason, this initial unlock can’t be done from within the Safari extension. You can still log in to websites via the extension just fine – instead of Face ID, you simply type your Master Password in Safari. But that does not re-enable Face ID for subsequent unlocks; you’d have to keep typing your Master Password like some kind of Neanderthal. Re-enabling Face ID can only be achieved by actually opening the app itself and unlocking it there via Master Password. Afterwards, you can use Face ID from within Safari.
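To make the logic explicit, here’s a toy model of the behavior as I understand it from the outside – the names and structure are entirely made up, this is obviously not how 1Password is actually implemented:

```python
# Hypothetical model of the post-reboot unlock behavior described above.
class Vault:
    def __init__(self):
        self.unlocked = False      # vault open right now
        self.faceid_armed = False  # Face ID allowed for future unlocks

    def unlock_with_master_password(self, in_main_app: bool):
        self.unlocked = True
        # Only an unlock in the main app re-arms Face ID after a reboot;
        # the same password typed into the Safari extension does not.
        if in_main_app:
            self.faceid_armed = True

    def unlock_with_faceid(self) -> bool:
        if self.faceid_armed:
            self.unlocked = True
            return True
        return False               # back to typing the Master Password
```

In other words: `unlock_with_master_password(in_main_app=False)` gets you into the one website you’re on, but only `in_main_app=True` flips the switch that makes `unlock_with_faceid()` work afterwards.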

That’s why instantly opening 1Password via Spotlight after a reboot is something I have a lot of experience with. I’m not quite sure when it started being this unreliable after a reboot, but it seems to rebuild the search index every time. No progress indication, no hint that searching is currently completely useless, nothing.