Anker-owned Eufy, a prominent maker of internet-connected security cameras, recently paid its customers for videos depicting package and car thefts. The company offered $2 per video, accepting both real and staged events, to train its artificial intelligence (AI) systems. The initiative aimed to improve the cameras' ability to detect thieves, but it also raised important questions about user data privacy and security.
Key Takeaways
- Eufy offered $2 per video for theft footage to train its AI.
- Both real and staged theft events were accepted.
- Over 120 users reportedly participated in one campaign.
- The company's goal was to collect 40,000 videos across two categories.
- Concerns about data privacy and security have emerged.
Eufy's AI Training Initiative
Eufy ran the campaign from late 2024 into early 2025 to gather video data, explicitly stating that it sought footage of both real and staged package and car thefts. The aim was to provide its AI models with sufficient data and teach them what to look for when identifying potential thieves.
The campaign guidelines allowed users to create their own theft scenarios: a user could, for example, pretend to be a thief and record the act with their own cameras. This approach made participation quick and easy for customers, and Eufy suggested that capturing one staged act with two outdoor cameras could maximize efficiency.
Quick Facts
- Payment per video: $2
- Campaign dates: December 18, 2024, to February 25, 2025
- Video targets: 20,000 package thefts, 20,000 car door pulls
- Reported participants: Over 120 users in one campaign
Incentives for User Participation
Eufy offered financial incentives for video submissions. The company stated on its website, "You can even create events by pretending to be a thief and donate those events." It also highlighted potential earnings: "If you also stage a car door theft, you might earn $80," the website noted, suggesting that payouts could add up across multiple submissions.
Eufy assured users that the collected data had a specific purpose. The company wrote, "the data collected from these staged events is used solely for training our AI algorithms and not for any other purposes." This statement aimed to address potential privacy concerns from the outset.
Campaign Scope and User Engagement
The primary campaign offering $2 per video for theft footage ran from December 18, 2024, to February 25, 2025. During this period, more than 120 users commented on the campaign's announcement page. These comments indicated their participation in the program.
Eufy set ambitious targets for video collection. The company aimed to gather 20,000 videos specifically showing package thefts. Additionally, it sought another 20,000 videos of what it termed "pulling car doors." Users submitted their videos and PayPal account details through a Google Form to receive payment.
"To ensure we have enough data, we are looking for videos of both real and staged events, to help train the Al what to be on the lookout for." ā Eufy Website Statement
Ongoing Video Donation Programs
Beyond the initial paid campaign, Eufy has continued similar initiatives. These programs incentivize customers to submit videos for AI training. An in-app program, called the Video Donation Program, offers various rewards. These range from an "Apprentice Medal," which is a badge displayed next to a user's name in the app, to physical gifts like cameras or gift cards.
For this ongoing program, Eufy specifically requests videos involving human activity. The Eufy app also features an "Honor Wall." This wall ranks users based on the number of video events they have donated. One user, at the time of publication, had reportedly donated 201,531 videos, demonstrating significant user engagement.
On the app's donation program page, Eufy reiterates how donated data will be used, clarifying that "donated videos are only used for AI training and improvement. Eufy will not provide the video to third parties."
Background on AI Training
Training AI models often requires large amounts of diverse data. For security cameras, this data helps the AI learn to identify specific actions, like theft, and differentiate them from normal activities. Companies sometimes pay users for data to quickly build robust datasets, but this practice can create privacy challenges.
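As a rough illustration of that process, the sketch below trains a tiny binary classifier to separate frames labeled as theft events from ordinary activity. It is a hypothetical, minimal example using PyTorch, with randomly generated tensors standing in for real camera footage; Eufy has not disclosed its actual models or training pipeline.

```python
# Minimal, hypothetical sketch of training a "theft vs. normal" frame classifier.
# Random tensors stand in for labeled video frames; this does not represent
# Eufy's actual architecture or data pipeline.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Fake dataset: 256 RGB frames (3x64x64) with binary labels
# (1 = staged/real theft event, 0 = ordinary activity).
frames = torch.randn(256, 3, 64, 64)
labels = torch.randint(0, 2, (256,)).float()
loader = DataLoader(TensorDataset(frames, labels), batch_size=32, shuffle=True)

# Small convolutional network producing a single logit per frame.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 1),
)

criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(3):
    for batch_frames, batch_labels in loader:
        optimizer.zero_grad()
        logits = model(batch_frames).squeeze(1)
        loss = criterion(logits, batch_labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

A production system would work on video clips rather than single frames and would need far more labeled examples than this toy dataset, which is why companies run large collection campaigns like Eufy's.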
Privacy and Security Concerns
While Eufy's initiative offers users a way to monetize their data, it also presents security and privacy risks. The collection of user-generated videos, even for AI training, necessitates strong data protection measures. The potential for misuse or security vulnerabilities remains a key concern for many.
A recent incident involving another app highlights these risks. TechCrunch reported that Neon, a calling app that paid users for call recordings and transcripts, had a security flaw that allowed users to access other users' data. After being informed, Neon shut down its services.
Past Privacy Issues with Eufy
Eufy has faced privacy scrutiny in the past. The Verge revealed a significant issue with Eufy's camera streams: the company had advertised them as end-to-end encrypted, yet when accessed through its web portal they were found to be unencrypted, contradicting Eufy's public statements and marketing.
After initial denials, Anker, Eufy's parent company, admitted in 2023 to misleading its users and committed to fixing the security flaw. Such incidents contribute to user skepticism regarding Eufy's data protection promises and underscore the importance of transparency and robust security protocols when handling sensitive user data.
Eufy also requests videos recorded with its baby monitors for donation. The support page for sharing these videos does not mention any monetary compensation. The company did not respond to inquiries about this specific program.