Even tech reporters can be caught out by how fast technology improves. It was only today that I learned I could identify plants and flowers from a photo right on my phone.
The last time I tried this, the third-party apps I tested disappointed me with their speed and accuracy, and opening a separate app I wouldn't otherwise use is less convenient anyway.
It turns out Apple has its own version of this visual search feature, introduced in the fall of last year: it's called Visual Look Up.
It's simple to use. Open a photo in the Photos app and look for the blue "i" icon; if it has a small ring around it, the photo contains something that machine learning can identify. Tap the icon and Visual Look Up will attempt to dredge up some useful information.
It works for landmarks, art, pets, and more. It's not perfect, but it has surprised me more often than it has let me down. My camera roll alone turned up plenty of examples.
The feature was announced at last year's WWDC, but Apple hasn't exactly trumpeted it; I only found it through a link in a tech newsletter. Even the official Visual Look Up support page says in one place that it's US-only, while another page lists additional compatible regions.
Access has in fact expanded since launch: Visual Look Up is now available in English in the US, Australia, Canada, the UK, Singapore, and Indonesia, as well as in French, German, Italian, and Spanish.
It's a great feature, but it makes me wonder what else visual search could do. Snap a picture of a plant, for example, and your phone might offer to set up reminders for a watering schedule.
It's foolish to pin too many hopes on anything that far ahead, and capabilities like these may be the sort of thing we get with future headsets. But if Apple does introduce that kind of function, I expect it to make a much bigger splash.