Shoppability is the new black. There’s a trend toward all things being shoppable. We’re talking buy buttons on everything from YouTube videos to Instagram Stories. This isn’t necessarily a new phenomenon, but it’s one of many trends that have been Covid-accelerated.
Elsewhere — and for similar reasons — we see a separate trend: visual commerce. This includes product visualization and visual search. The former lets you try on everything from shoes to lipstick to couches using AR lenses. The latter identifies things you point your phone at.
Panning back, these trends — shoppability and visual commerce — are on a collision course. Point your phone at a jacket a friend is wearing using Google Lens or Snap Scan, then buy it right on the spot. That compresses the purchase funnel into a visually informed decision flow.
All of the above is underway, but there’s a ways to go in capability and cultural acclimation. Helping to accelerate things are the self-motivated efforts of tech giants to future-proof their core businesses. This is the theme of our AR and Shopping Collide series, continuing here with Google.
Well-Traveled Touchpoints
After covering Snap, Pinterest, and Shopify in earlier parts of this series, we now turn to Google: what is it doing in AR shopping? The company has spent 20+ years as the internet’s front door, including for high-intent commercial searches. That position has driven it to own a growing piece of consumer shopping.
But how does AR play in? When it comes to emerging tech, some companies are in a unique position to accelerate adoption. That often happens by tapping into large, established networks or user bases to expose and distribute the technology (we’re looking at you, Apple).
This is what Google has begun to do with its multi-dimensional AR play. As we’ve examined, it’s using its position as the world’s search engine to incubate and expose AR. So far, it has done this by planting AR throughout its well-traveled touchpoints and search results pages (SERPs).
This has played out in a few ways. For one, Google offers AR-enabled search results that come to life in 3D. It also grants prime real estate to Google Lens with activation buttons in the places where people launch mobile searches. Let’s tackle these strategies one at a time…
Ten Blue Links
Starting with AR-enabled SERPs, Google continues to offer search results that animate in 3D and AR. To define these two terms, 3D is when searchers can spin a 3D graphic (often on desktop SERPs), while AR offers the same effect but overlaid in one’s space (on smartphones).
This has played out so far with topics that are conducive to visualization, and in educational contexts. These 3D/AR results include subjects like a human skeleton or members of the animal kingdom. These use cases and categories will continue to broaden as Google tests the waters.
We also see this moving toward more monetizable searches. In classic Google fashion, it’s letting the format gain organic traction before flipping the monetization switch. Monetization could involve branded characters that promote entertainment releases, as Google recently did with Baby Yoda.
Another way monetization will play out is Google Swirl. This 3D/AR format works like the results above, but is built for advertisers to develop interactive search results. In early tests, these campaigns are already showing high engagement versus non-AR benchmarks.
All of the above represents an ongoing evolution of the SERP from its “10 blue links” origins. After years of expanding into the broader knowledge graph, 3D models are the next logical step. They’re also a way to future-proof search by bringing it into a more camera-forward era.
Alternate Input
Speaking of future-proofing, Google Lens could represent an alternative search input that lets Google keep growing its business. More search modalities, including voice and visual search, provide greater surface area for users to tap into Google.
Tying that back to the “incubation” play noted above, Google has accelerated Lens by giving it prime real estate on the main search bar in its mobile apps on iOS and Android. Planted right next to the voice search button, this gives Lens more exposure than any product could ask for.
While it does that, Google is simultaneously beefing up Lens’ capabilities. As it has announced, Google Lens recognizes 15 billion products — up 15x in two years. This taps into Google’s 20+ years of indexing media for Google Images to form an AI training set for visual object recognition.
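To make the visual search piece slightly more concrete, here’s a minimal sketch of an “identify, then shop” flow. Lens itself has no public API, so this uses Google’s Cloud Vision Node.js client as a rough analog for the object recognition step; the product lookup at the end is a hypothetical placeholder, not a real Google endpoint.

```typescript
// Sketch only: the consumer Lens product has no public API, so this uses the
// Cloud Vision Node.js client (@google-cloud/vision) as a rough analog for the
// visual object recognition described above. lookUpProducts() is hypothetical.
import { ImageAnnotatorClient } from '@google-cloud/vision';

const client = new ImageAnnotatorClient();

// Hypothetical placeholder for the "shoppable" half of the flow: in a real
// pipeline, a detected object would be matched against a product index to
// surface buyable results.
async function lookUpProducts(label: string): Promise<void> {
  console.log(`(placeholder) searching product index for: ${label}`);
}

async function identifyAndShop(imagePath: string): Promise<void> {
  // Detect objects in a photo, e.g. a jacket a friend is wearing.
  const [result] = await client.objectLocalization(imagePath);
  const objects = result.localizedObjectAnnotations ?? [];

  for (const obj of objects) {
    console.log(`Found "${obj.name}" (confidence ${obj.score?.toFixed(2)})`);
    await lookUpProducts(obj.name ?? 'unknown object');
  }
}

identifyAndShop('./jacket-photo.jpg').catch(console.error);
```

The point of the sketch is the shape of the funnel compression: one camera capture feeds recognition, and recognition feeds a purchase decision, with no typed query in between.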
Meanwhile, there’s a similar incubation play for Live View AR navigation. Over time, Google has given it more front & center positioning as it’s increasingly ready for prime time. It can now be found in the main tray of options when getting walking directions in Google Maps.
All of the above represent orbiting parts in Google’s AR play. It knows it can utilize its massive scale to accelerate things. AR can in turn help Google future-proof search as noted. This makes Google’s AR incubation a sort of virtuous cycle. We’ll see if that translates to dollars.
Originally published at https://arinsider.co on April 6, 2022.