
Eufy Paid Users for Theft Videos to Train AI Systems

Eufy, a popular security camera brand, offered customers $2 per video for recordings of thefts to train its AI, raising questions about data privacy and security.

By Chloe Sullivan

Chloe Sullivan is a consumer technology and privacy correspondent for Neurozzio. She reports on how consumer electronics companies handle user data, the security of smart home devices, and the ethical implications of AI training programs.

Anker, the parent company of Eufy security cameras, launched a program offering customers monetary compensation in exchange for video recordings of package and car thefts. The initiative was designed to gather data to train the company's artificial intelligence systems for improved thief detection.

The program explicitly requested both real and staged incidents to build a comprehensive dataset. This practice of incentivizing user data submission highlights a growing trend among technology companies to leverage their customer base for AI development, while also raising significant questions about user privacy and data security.

Key Takeaways

  • Eufy offered customers $2 per video of package or car thefts to improve its AI detection algorithms.
  • The company encouraged users to submit both authentic and staged videos of criminal activity.
  • An ongoing "Video Donation Program" gamifies data submission, with one user reportedly donating over 200,000 videos.
  • The data collection initiatives exist alongside a history of privacy concerns for the company, including a 2023 incident involving unencrypted video streams.
  • The practice reflects a broader industry trend where companies pay for user data, which carries inherent privacy and security risks.

A Direct Appeal for AI Training Data

Earlier this year, Eufy initiated a campaign to enhance its AI capabilities by directly sourcing video content from its users. The program, which ran from December 18, 2024, to February 25, 2025, offered a payment of $2 for each submitted video depicting either a package theft or an individual attempting to open a car door.

The company's goal was to collect a substantial volume of data to refine its security algorithms. On its website, Eufy stated its objective was to gather 20,000 videos for each category: package thefts and "pulling car doors."

To facilitate participation, users were directed to a Google Form where they could upload their video files and provide a PayPal account for payment. This streamlined process was designed to encourage a high volume of submissions from its customer base.

Staged Events Encouraged

A notable aspect of the campaign was Eufy's encouragement of staged events. The company explicitly told users they could create their own scenarios to contribute to the dataset. "To ensure we have enough data, we are looking for videos of both real and staged events, to help train the AI what to be on the lookout for," the company's website explained.

Eufy even suggested that users could act as thieves themselves to generate content. "You can even create events by pretending to be a thief and donate those events," the instructions read. The company also outlined how users could maximize their earnings, suggesting that a single staged act captured by multiple cameras could quickly increase payments.

Potential Earnings

Eufy's website suggested users could earn up to $80 by staging multiple events, such as a package theft and a car-door-pulling incident, captured simultaneously by two different cameras.
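The payout math works out as a simple back-of-the-envelope calculation. The per-round breakdown below is an illustrative assumption based on the campaign's $2-per-video rate; Eufy did not publish an exact formula for reaching the $80 figure.

```python
# Illustrative estimate of earnings under Eufy's campaign terms:
# $2 per submitted video, with each staged event counted once per camera
# that captures it. (The round structure here is an assumption.)
PAY_PER_VIDEO = 2  # dollars, per Eufy's campaign page

def payout(events: int, cameras: int) -> int:
    """Videos submitted = events x cameras; each video pays $2."""
    return events * cameras * PAY_PER_VIDEO

# Staging two events (a package theft and a car-door pull) in view of
# two cameras yields four videos, or $8 per round.
per_round = payout(events=2, cameras=2)
print(per_round)       # 8

# Reaching the advertised $80 would take ten such rounds.
print(80 // per_round)  # 10
```

This illustrates why the company's note about multiple cameras matters: each additional camera multiplies the number of billable videos a single staged act produces.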

The company assured participants that the collected data would be used exclusively for internal purposes. "The data collected from these staged events is used solely for training our AI algorithms and not for any other purposes," Eufy stated. However, the company did not publicly respond to inquiries about the total number of participants, the amount paid out, or its data retention policies after the AI models were trained.

Ongoing Data Collection and Gamification

While the paid campaign has concluded, Eufy continues to solicit video data from its customers through another initiative called the "Video Donation Program." This program, accessible through the Eufy app, uses gamification to incentivize submissions rather than direct monetary payment.

Participants in this program earn rewards ranging from an "Apprentice Medal," a digital badge displayed next to their username, to physical gifts like cameras or gift cards for high-volume contributors. For this program, Eufy specifies it is only seeking videos that involve humans.

The Power of Gamification

Gamification is a strategy that applies game-design elements to non-game contexts to increase user engagement. By using leaderboards, badges, and rewards, companies can encourage behaviors like data submission without direct financial cost for every item, creating a steady stream of user-generated content.
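A tiered-reward scheme of this kind can be sketched in a few lines. Note that only the "Apprentice Medal" name comes from Eufy's app; every tier name and threshold below is hypothetical, invented for illustration, as Eufy has not published its actual reward criteria.

```python
# Hypothetical tier thresholds for a video-donation leaderboard.
# Only "Apprentice Medal" is a real Eufy reward name; all thresholds
# and other tiers are invented for illustration.
TIERS = [
    (10_000, "Physical gift (camera or gift card)"),
    (1_000, "Gold Medal"),
    (100, "Silver Medal"),
    (1, "Apprentice Medal"),
]

def reward_for(donated_videos: int) -> str:
    """Return the highest tier a donor qualifies for."""
    for threshold, reward in TIERS:
        if donated_videos >= threshold:
            return reward
    return "No badge yet"

print(reward_for(201_531))  # Physical gift (camera or gift card)
print(reward_for(5))        # Apprentice Medal
```

The design point is that badges cost the company nothing per award, while the visible leaderboard supplies social pressure to keep donating, so the marginal cost of each additional video collected approaches zero.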

The effectiveness of this approach is demonstrated on the app's "Honor Wall," a leaderboard that ranks users by the number of videos they have donated. According to a screenshot of the app, the top-ranking user has contributed a staggering 201,531 video events. This indicates a massive volume of user data being transferred to the company for AI training.

Similar to its previous campaign, Eufy provides assurances about data usage within the app. The donation program's page clarifies that "donated videos are only used for AI training and improvement. Eufy will not provide the video to third parties." The company also requests video donations from its baby monitors, although the support page for this feature does not mention any rewards.

A History of Privacy Lapses

Eufy's data collection efforts are viewed with skepticism by some privacy advocates due to the company's past security issues. Commitments to protect user data are often weighed against a company's track record, and in Eufy's case, there are documented instances of privacy failures.

In 2023, an investigation by The Verge revealed a significant security flaw. The company had advertised that its camera streams featured end-to-end encryption, a security standard that should prevent anyone other than the user from viewing the footage. However, the report found that video streams accessed through Eufy's web portal were unencrypted and could be viewed using a simple media player like VLC.

Initially, the company attempted to downplay the severity of the issue. After continued reporting and public pressure, parent company Anker eventually admitted it had misled users about its security practices and promised to implement fixes, including fully encrypting its web portal streams.

This incident damaged the company's reputation and created a trust deficit for some customers. When a company with a history of misrepresenting its security practices asks for sensitive video data, even for seemingly benign purposes like AI training, it naturally invites scrutiny.

The Wider Context of Paid Data Collection

Eufy is not alone in its willingness to pay for user data. The practice is becoming more common as the demand for high-quality, real-world data to train sophisticated AI models intensifies. While this can empower users to monetize their own data, it also introduces significant risks.

A recent example highlights the potential dangers. A viral calling app named Neon offered users money for sharing recordings and transcripts of their phone calls. Shortly after its launch, security researchers discovered a flaw that allowed any user to access the call data of all other users on the platform. After being notified of the vulnerability, the app was taken offline.

These cases demonstrate a critical challenge in the age of AI: companies are racing to collect vast amounts of data, but the security infrastructure to protect that data may not always keep pace. For users, the decision to participate in such programs involves a trade-off between a small financial reward and the potential risk of their personal data being exposed or misused.