
5 Future Scenarios for Google Lens

    When David Pierce covered the debut of Google Lens for WIRED in 2017, he described the tool as “a long-term bet for the business and a platform for many use cases.” In the years that followed, as Google Lens outlasted the company’s other experimental forays, senior writer Lauren Goode chronicled the long journey toward perfecting visual search.

    Five years later, Google Lens remains great for identifying strange plants and walking students through their algebra homework. The longer it sticks around, the more it seems to encompass. Ever used reverse image search? That’s now powered by Google Lens. Was Lens software running in the augmented reality glasses previewed at Google I/O 2022? Unconfirmed, but very possible.

    During the company’s presentation in May, Google CEO Sundar Pichai envisioned a world full of augmented reality that can be accessed without a smartphone. He said: “These AR capabilities are already useful on phones, and the magic will really come to life if you can use them in the real world without the technology getting in the way.”

    Wanting to better understand what’s in store for Google Lens, I visited the company’s San Francisco office and sat down with Lou Wang, Google’s director of product management, who has worked on Lens for years.

    When asked about the balance between creating quality features for now and building for the future, Wang sounds confident that phones and desktop computers will continue to dominate the present. “Personally, I’m very excited about glasses, but it will take a while for that to scale up,” he says. “So our focus is very much on smartphones, with the understanding that some of the things we’re talking about, like exploring scenes, get more powerful when you don’t have to grab your phone.”

    After this conversation, I went on vacation to Yosemite and considered what to expect as AR applications are layered over real-world spaces, even the places we visit to disconnect and experience the natural world. What follows are five future scenarios for Google Lens. The predictions are illustrative, not exhaustive.

    Hiking trails led by AR

    It’s 2030, and you’re underwhelmed by how well the electric car automatically navigates the winding mountain roads as you arrive at what’s left of Yosemite. The motor of the car’s air-purifying system whistles incessantly on this smoky morning. As you look out the window, Google Lens traces the outlines of cliffs that are visible on clear days.

    After arriving at camp and pitching a tent, you head south toward the Mariposa Grove Trail. Halfway through the walk, you see a sign that reads, “Activate AR for accurate historical recreations.” Okay. A pulsating arrow hovering in the air points you toward the fallen Wawona Tunnel Tree. When you turn the corner, all that remains is a small charred log. A towering 3D model of the mammoth tree is overlaid on the scene. You stand at a distance and watch as horses carry families through the tunnel. On the walk back to the car, you wonder why people used to dress like that.

    SEO strategies focused on unique images

    Sure, you’re spending a night in Yosemite to relax. But you’re also here to snap some eye-catching, scenic product shots for a burgeoning ecommerce store that sells custom climbing gear. As more people shop with product photos and short videos through Lens and Lens-powered store search tools, this has become one of your favorite search engine optimization tactics. You unpack lighting equipment for the shoot as the sun sets. The photos you take will be recreated by competing companies using artificial intelligence programs to chase the best shopping results for items such as carabiners.

    You like to think your photographic touch captures a special, human essence. You wonder if you’re fooling yourself.

    Late-night snacks chosen by algorithms

    It’s well into the night when you finish the photo shoot. None of the camping snacks you’ve stashed in the bear box, a relic of more ecologically diverse times, look appetizing. You send the car to pick up snacks from the only nearby gas station that’s open all night. It’s a 45-minute drive each way. You doze off under the stars beside the concrete-filled fire pit.

    A soft chime from your glasses lets you know the car has arrived. A video feed from the car flashes in front of your face via Google Lens. As you browse the different ice creams on offer, the AR software tracks your gaze and circles the top three flavors based on past late-night snack purchases. Tonight you want something new, so you say out loud at the campsite, “Most popular flavor around me.” A digital blue jay descends from the top of your altered field of vision and settles on a vanilla ice cream mixed with animal crackers. Almost too excited, you whisper, “Buy. Confirm.” A small arm extends from the car and places the treat in a compact freezer.