Meredith Whittaker, a noted privacy expert and president of the encrypted messaging app Signal, made some pointed comments about AI and how tech companies are approaching generative AI today, calling it a security and privacy nightmare.
At the Axios AI+ Summit on Wednesday, Meredith Whittaker, a noted privacy expert, highlighted the significant privacy risks associated with the burgeoning artificial intelligence (AI) industry. Whittaker, who is the president of the encrypted messaging app Signal and a co-founder of the AI Now Institute, emphasized the dangers posed by tech giants that have spent the last decade collecting vast amounts of personal data from users.
Whittaker urged the audience to consider the implications of their personal data being easily accessible. “If you close your eyes and imagine every email you’ve ever sent put in a database searchable by everyone, you know why you should care about protecting your data,” she said. This stark imagery underscores the privacy invasions that could arise in the AI era.
Critique of the Surveillance Business Model
As a long-time critic of big tech’s “surveillance business model,” Whittaker pointed out that the immense costs involved in creating and running AI models drive companies to monetize personal data. “It costs hundreds of millions of dollars to train these models,” Whittaker explained. “So there is deep pressure from companies — that are basically promising God and delivering email prompts — to make some return on investment in this technology.”
Whittaker contrasted Signal’s approach with that of for-profit tech companies. Signal, a non-profit, offers fully encrypted messaging services, ensuring user privacy is maintained. “We looked at the cold, hard business model of tech,” Whittaker said, “and realized that if we were a for-profit, it is very likely that we would be pushed to erode privacy guarantees in an industry where collecting, selling, and making use of personal data is the primary economic driver.”
Concerns Over Microsoft’s Recall Feature
A particular focus of Whittaker’s criticism was Microsoft’s new Recall feature, which maintains an AI-searchable database of a user’s activity on their computer. She described the feature as “a serious hijacking of trust” and a “dangerous honeypot for hackers,” highlighting the security and privacy risks it poses.
Whittaker expressed concern for Signal users who expect a high degree of privacy and encryption. “I think we should be mortified,” she said. “Especially those of us who have a bit of understanding of what this tech actually does and the track record of these companies and the political environment in which they’re operating in.”
In response to these concerns, Microsoft has stated that the Recall feature allows users to delete specific snapshots, exclude certain websites or apps, and snooze the feature as needed. The company also suggested that Recall data could assist security teams in identifying malware infections.
The Potential of AI
Despite her criticisms, Whittaker acknowledged that AI has its merits. “It’s not a useless technology,” she said. However, she emphasized the need for critical evaluation of AI applications. “There’s a sort of groupthink around AI where people aren’t pausing to actually differentiate between where our large models that recognize patterns are useful…and where we need to leave them alone.”
Whittaker’s comments at the AI+ Summit bring to light the critical issues surrounding data privacy in the age of AI. As AI technologies continue to evolve, the challenge lies in balancing innovation with robust privacy protections. Her insights call for a more discerning approach to AI development and deployment, ensuring that privacy considerations remain paramount.
The discussion at the summit underscores the necessity for ongoing dialogue and regulation to responsibly navigate the complex landscape of AI and data privacy. As tech companies push the boundaries of AI capabilities, it is crucial that they do so with a clear commitment to protecting user privacy and maintaining trust.