This website shows how much Google's AI can glean from your photos

    Software engineer Vishnu Mohandas decided he would quit Google in more ways than one when he learned that the tech giant had briefly helped the US military develop AI to study drone footage. In 2020, he left his job at Google Assistant and also stopped backing up all his images to Google Photos. He feared that his content could be used to train AI systems, even if they weren't specifically related to the Pentagon project. “I have no control over the future outcomes this will enable,” Mohandas thought. “So shouldn't I be more responsible?”

    Mohandas, who taught himself to code and is based in Bengaluru, India, decided he wanted to develop an alternative service for storing and sharing photos that is open source and end-to-end encrypted. Something “more private, healthy and reliable,” he says. The paid service he designed, Ente, is profitable and claims to have more than 100,000 users, many of whom are already part of the privacy-obsessed crowd. But Mohandas struggled to convey to a broader audience why they should reconsider relying on Google Photos, despite all the conveniences it offers.

    Then, one weekend in May, an intern at Ente came up with an idea: give people a sense of what some of Google's AI models can learn from studying images. Last month, Ente launched https://Theyseeyourphotos.com, a website and marketing stunt designed to turn Google's technology against itself. People can upload any photo to the website, which then sends it to a Google Cloud computer vision program that writes a surprisingly thorough three-paragraph description of it. (Ente asks the AI model to document small details in the uploaded images.)
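
    The article doesn't say which Google Cloud API Ente calls, but a minimal sketch of this kind of pipeline, assuming a vision-capable Gemini model accessed through Google's google-generativeai Python client, might look like the following. The model name, prompt wording, and file path are illustrative, not Ente's actual implementation.

        import google.generativeai as genai
        from PIL import Image

        genai.configure(api_key="YOUR_API_KEY")  # placeholder, not a real key

        # Hypothetical detail-hungry prompt in the spirit of the one the
        # article describes; the exact wording is an assumption.
        DETAILED_PROMPT = (
            "Describe this photo in three paragraphs. Document every small "
            "detail you can infer: objects, brands, clothing, location, time "
            "of day, and anything the surroundings suggest about the people."
        )

        model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model choice
        photo = Image.open("uploaded_photo.jpg")           # any user-supplied image

        response = model.generate_content([DETAILED_PROMPT, photo])
        print(response.text)  # a multi-paragraph description of the image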

    One of the first photos Mohandas tried uploading was a selfie with his wife and daughter in front of a temple in Indonesia. Google's analysis was exhaustive, documenting even the specific watch model his wife was wearing, a Casio F-91W. But then, Mohandas says, the AI did something strange: it noted that Casio F-91W watches are often associated with Islamic extremists. “We had to tweak the prompts to make it a little more wholesome but still creepy,” says Mohandas. Ente now asks the model to produce short, objective outputs, nothing dark.
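
    Translated into the sketch above, the adjustment Mohandas describes amounts to swapping in a more restrained prompt. Again, the wording here is hypothetical, not Ente's actual prompt:

        # Hypothetical toned-down prompt: short, factual output, with no
        # speculation about affiliations; reuses `model` and `photo` from above.
        OBJECTIVE_PROMPT = (
            "Describe this photo briefly and objectively. Note visible objects, "
            "setting, and weather. Do not speculate about the people's beliefs, "
            "affiliations, or background."
        )

        response = model.generate_content([OBJECTIVE_PROMPT, photo])
        print(response.text)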

    The same family photo uploaded to Theyseeyourphotos now produces a more generic result, giving the name of the temple and noting the “partly cloudy sky and lush greenery” surrounding it. But the AI still makes some assumptions about Mohandas and his family, such as that their faces express “shared contentment” and that “the parents are likely of South Asian descent and middle-class.” It judges their attire (“suitable for sightseeing”) and notes that “the woman's watch shows a time of approximately 2 p.m., which matches the image's metadata.”

    Google spokesperson Colin Smith declined to comment directly on Ente's project. He directed WIRED to support pages stating that uploads to Google Photos are only used to train generative AI models that help people manage their image libraries, such as models that analyze the age and location of photo subjects. The company says it does not sell the content stored in Google Photos to third parties or use it for advertising purposes. Users can disable some analysis features in Photos, but they can't prevent Google from having full access to their images, because the data isn't end-to-end encrypted.