Snapchat’s Scan feature can identify dogs, plants, clothes, and more

Snapchat's camera is mostly associated with disappearing messages and silly AR effects like a virtual hot dog. But the company wants the camera to do much more than capture messages, whether that's suggesting ways to improve the quality of your videos or showing you a shirt similar to the one you're looking at.
A new feature called Scan is getting a prominent place in the app's camera, letting it identify a range of things in the real world, from clothing items to dog breeds.

The prominent position Scan has in Snapchat signals that the app is becoming more than a messaging service; it's also becoming a visual search engine. Scan helps solve a real problem for Snapchat users: how to find the millions of AR effects, called Lenses, that Snap's creators have built. By suggesting Lenses based on what you're looking at, Scan could increase visibility for the Lenses people create and give creators more incentive to keep making AR content for Snapchat.

Visual search isn't new. Google introduced Lens in 2017, letting users point their phone's camera at items and identify them using the company's extensive search index. Lens comes on Google's Pixel phones and a variety of other Android handsets, and it's also integrated into the main Google mobile app. Pinterest has its own visual search tool, also called Lens, which surfaces similar images based on what you scan within its app.

Snap may be playing catch-up, but it has the potential to make visual search mainstream. Snapchat opens directly to the camera, so any change to that screen has huge implications for how the app's nearly 300 million daily users interact with it. Snap says more than 170 million people already use Scan every month, and that was before the feature was given such a visible place in the camera.

We see the camera doing much more long-term than it can today.

In an interview, Eva Zhan, Snap's head of camera product, said that Scan is a priority for the Snapchat camera going forward: "We see the camera being able to do a lot more in the long term than it is currently capable of."

Snap began work on Scan in 2019 after seeing how much Snapchat users loved scanning QR codes to add friends. It initially partnered with Shazam to identify songs and Photomath to solve math problems, and later added the ability to identify products available for purchase on Amazon.

Snap showed the latest version of Scan at its developer conference earlier this year, adding detection for dogs, plants, wine, cars, and nutrition information. Outside companies power most of Scan's features; Vivino, for example, is the app behind the wine-scanning feature. Allrecipes will soon power a Scan feature that recommends recipes based on specific food ingredients. Snap plans to keep adding capabilities to Scan through a combination of outside partners and what it builds in-house.

Snap's latest addition to Scan is a shopping feature, made possible by its recent acquisition of Screenshop, an app that lets you upload photos of clothes and shop for similar items. Scan will suggest similar clothes based on what you're looking at and let you buy the items it finds. A Scan shopping feature is also coming to Memories, Snapchat's camera roll section, letting users shop for clothes based on what they've captured with their camera or in screenshots.

Scan can also suggest what Snap calls camera shortcuts, which recommend a combination of a camera mode and a soundtrack. Point the camera at the sky, for example, and Lenses made specifically for the sky will be shown alongside a music clip and a color filter, letting you apply all the changes in one shot. Zhan says Snap is also adding camera shortcuts to Spotlight, its TikTok rival, which could let viewers quickly jump into their own camera with the same setup used to create the video they just watched.

Scan's camera shortcuts were fun to use in my testing, though they're currently limited to shots of the sky, human feet, dogs, or dancing. Snap plans to expand the situations in which camera shortcuts appear, and the Spotlight integration shows how they could become an integral part of the video creation process.

Going forward, Snap hopes Scan will become a key way for users to discover AR Lenses. It recently began letting AR creators tag their Lenses with relevant keywords, which will help Scan suggest the best Lenses based on what the camera is seeing.

After trying Scan, I wasn't blown away.

After using Scan for the past few weeks, I found that it isn't yet very reliable. In many cases it failed outright, not recognizing clothes I wanted results for, and there were instances when it misidentified objects. Some of the Lenses it suggested were appropriate; others weren't relevant at all.

Snap says Scan will improve over time, identifying objects more accurately and detecting new kinds of objects. And while Scan data isn't currently used to target ads, it's easy to see how the feature could eventually be monetized with more advertising or shopping tie-ins.

Scanning the world could also become more compelling with AR glasses like Snap's Spectacles. Pointing my smartphone at objects in the real world to identify them doesn't feel natural to me, but smart glasses that can scan my surroundings would make the behavior more logical.

Snap has already anticipated this: the new Spectacles feature a dedicated Scan button that triggers Lenses based on what the wearer is seeing. The new Spectacles won't be available for purchase; instead, Snap is giving them to select AR creators and partners who apply.

Scan is still very basic, but it shows how Snap is expanding what its camera can be used for. Zhan says Snap sees Scan as an integral part of Spectacles, and possibly other cameras, in the future: "We don't want to restrict Scan to Snapchat."