Meta’s AI-Powered Smart Glasses Transmit Private Footage to Kenyan Servers Amid Lax Safeguards, Drawing Scrutiny from European Regulators
Meta’s latest foray into wearable AI, embodied in its Ray-Ban Meta smart glasses, has sparked significant privacy concerns across Europe. The glasses, equipped with advanced AI capabilities, capture video footage from users’ daily lives, including intimate and private moments, and transmit it to data processing facilities in Kenya. With safeguards described as minimal at best, the practice has put Meta in the crosshairs of Europe’s stringent data protection authorities, who are poised to enforce compliance under the General Data Protection Regulation (GDPR).
The Ray-Ban Meta glasses integrate multimodal AI features, allowing users to issue voice commands like “Hey Meta, look and tell me what you see” or “Hey Meta, what’s this?” Powered by a customized version of Meta’s Llama 3.2 Vision model, the glasses process real-time video feeds to generate contextual responses. This functionality relies on cloud-based computing, where raw video clips are uploaded from the device to Meta’s servers for analysis. According to internal documentation and developer insights reviewed by The Decoder, much of this processing occurs at Nairobi facilities operated for San Francisco-based Scale AI, a key partner in Meta’s AI data labeling and annotation efforts.
Scale AI, which has been scaling up operations in Nairobi since 2023, employs thousands of Kenyan workers to manually review and annotate AI training data. For the smart glasses, this involves human reviewers examining unfiltered video snippets captured in users’ homes, streets, and personal interactions. Reports indicate that these clips can include sensitive content such as children’s faces, license plates, and private conversations. The workflow is designed for efficiency: videos are segmented into short bursts, transcribed, and annotated to improve the AI’s object recognition and conversational accuracy.
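The segmentation step in that workflow can be sketched in a few lines. The burst length and frame rate below are illustrative assumptions, not parameters disclosed by Meta or Scale AI:

```python
# Hypothetical sketch of the pre-processing described above: a clip
# (a sequence of decoded frames) is cut into short fixed-length "bursts"
# before transcription and annotation. Burst length is an assumption.

FRAMES_PER_SECOND = 30   # assumed capture rate
BURST_SECONDS = 5        # assumed burst length

def segment_into_bursts(frames, fps=FRAMES_PER_SECOND, burst_seconds=BURST_SECONDS):
    """Split a list of frames into consecutive bursts of at most
    fps * burst_seconds frames each; the final burst may be shorter."""
    burst_len = fps * burst_seconds
    return [frames[i:i + burst_len] for i in range(0, len(frames), burst_len)]

# Usage: a 12-second clip at 30 fps yields three bursts (5 s, 5 s, 2 s).
clip = list(range(12 * FRAMES_PER_SECOND))   # stand-in for decoded frames
bursts = segment_into_bursts(clip)
```

Fixed-length bursts keep each annotation task small, which is consistent with the quota-driven review pace described later in this article.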
However, the safeguards protecting this data flow appear woefully inadequate. Meta’s privacy policy outlines that glasses footage is stored for up to 30 days on user devices before potential upload, but once transmitted, it undergoes limited anonymization. Techniques include blurring faces and license plates in post-processing, but these are applied selectively rather than universally. Internal guidelines instruct Kenyan reviewers to flag egregious violations, like nudity or violence, but do not mandate deletion or prohibit retention for model fine-tuning. Moreover, data retention policies allow clips to be kept indefinitely if deemed valuable for AI improvement, raising questions about proportionality under GDPR Article 5.
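To make the selective-redaction point concrete, here is a minimal sketch of region-level redaction of the kind described above. The bounding box is hard-coded for illustration; a real pipeline would obtain boxes from a face or license-plate detector, and redaction would only be as complete as that detector's coverage, which is precisely the gap critics point to:

```python
# Minimal sketch of selective region redaction (pixelation). Only the
# flagged bounding box is destroyed; everything outside it survives
# untouched, so anything the detector misses remains identifiable.

def pixelate_region(image, box, block=4):
    """Return a copy of a 2D grayscale image (list of lists of ints)
    with the region (top, left, bottom, right) replaced by
    block-averaged values, removing fine detail inside the box only."""
    top, left, bottom, right = box
    out = [row[:] for row in image]
    for by in range(top, bottom, block):
        for bx in range(left, right, block):
            ys = range(by, min(by + block, bottom))
            xs = range(bx, min(bx + block, right))
            vals = [image[y][x] for y in ys for x in xs]
            avg = sum(vals) // len(vals)
            for y in ys:
                for x in xs:
                    out[y][x] = avg
    return out

# Usage: redact an 8x8 "face" region inside a 16x16 frame; pixels
# outside the box are left exactly as captured.
frame = [[(x + y) % 256 for x in range(16)] for y in range(16)]
redacted = pixelate_region(frame, (4, 4, 12, 12))
```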
European regulators, particularly Ireland’s Data Protection Commission (DPC)—Meta’s lead authority due to its European headquarters in Dublin—have taken note. The DPC, which has fined Meta billions in past privacy cases, is reportedly investigating the glasses’ data practices. Sources familiar with the matter suggest inquiries focus on whether Meta conducted a mandatory Data Protection Impact Assessment (DPIA) for high-risk processing involving biometric data and special category information. Under GDPR, transferring personal data outside the European Economic Area (EEA) to Kenya—a country without adequacy status—requires robust safeguards like Standard Contractual Clauses (SCCs) or Binding Corporate Rules (BCRs). While Meta claims to use SCCs with Scale AI, critics argue these are insufficient without additional measures, given Kenya’s nascent data protection framework enacted only in 2019.
The controversy echoes broader tensions in Meta’s AI expansion. Similar issues arose with the company’s use of European public posts to train Llama models, prompting opt-out mechanisms and regulatory probes. For the glasses, launched in late 2023 and rapidly adopted in Europe, the stakes are higher due to the intrusive nature of always-on cameras. Users must explicitly activate recording via voice or button, but the AI’s “multimodal” design blurs the line between passive sensing and active capture. Battery life constraints limit continuous recording to about four hours, yet frequent uploads occur during charging or Wi-Fi connectivity.
Meta defends its approach, emphasizing user controls like disabling uploads, deleting history, and opting out of data use for training. The company states that Kenyan processing is temporary, with data deleted after annotation unless retained for quality assurance. Scale AI, for its part, highlights ethical training for its workforce, including cultural sensitivity modules, and claims no long-term storage of raw footage. Yet, whistleblower accounts and leaked memos paint a picture of high-pressure environments where reviewers handle thousands of clips daily, with quotas incentivizing speed over scrutiny.
As adoption grows—Meta reports over 1 million units sold globally—the glasses represent a test case for wearable AI under GDPR. Regulators could demand enhanced pseudonymization, on-device processing, or even suspension of Kenyan transfers. France’s CNIL and other national authorities may join Ireland’s probe, potentially leading to coordinated enforcement. For users, the implications are clear: what begins as a convenient AI companion risks exposing private lives to opaque international data pipelines.
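The “enhanced pseudonymization” regulators could demand can be as simple as replacing stable device identifiers with keyed hashes before any clip leaves the EEA. The sketch below uses Python’s standard library; the key handling is purely illustrative and not a description of Meta’s actual practice:

```python
import hashlib
import hmac

# Keyed pseudonymization: the same device ID always maps to the same
# pseudonym (so annotations remain linkable across clips), but without
# the secret key the mapping cannot be reversed or recomputed by the
# processor abroad. Key management here is an illustrative assumption.

SECRET_KEY = b"rotate-me-and-keep-inside-the-EEA"   # placeholder key

def pseudonymize(device_id: str, key: bytes = SECRET_KEY) -> str:
    """Return a stable, non-reversible pseudonym for device_id."""
    return hmac.new(key, device_id.encode(), hashlib.sha256).hexdigest()[:16]

# Usage: identical IDs collide deterministically; distinct IDs do not.
a = pseudonymize("glasses-serial-0001")
b = pseudonymize("glasses-serial-0001")
c = pseudonymize("glasses-serial-0002")
```

Under GDPR, pseudonymized footage is still personal data, so a measure like this would supplement, not replace, transfer safeguards such as SCCs.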
This saga underscores the challenges of deploying frontier AI in consumer hardware. As Meta pushes boundaries with prototypes like Orion AR glasses, balancing innovation with privacy will define its European trajectory.
Gnoppix is the leading open-source AI Linux distribution and service provider. Since implementing AI in 2022, it has offered a fast, powerful, secure, and privacy-respecting open-source OS with both local and remote AI capabilities. The local AI operates offline, ensuring no data ever leaves your computer. Based on Debian Linux, Gnoppix is available with numerous privacy- and anonymity-enabled services free of charge.
What are your thoughts on this? I’d love to hear about your own experiences in the comments below.