Everalbum: Consumer Photos to Surveillance Tech
The FTC's template for algorithmic disgorgement
Consumer Tech
FTC
$0
2021
Outcome
Algorithmic disgorgement ordered — defined 'Affected Work Product' framework
Everalbum offered free photo storage. What users didn't know: their family photos were training facial recognition sold to law enforcement and military agencies. The FTC ordered the algorithms deleted, establishing the first "Affected Work Product" definition that has since become the template for algorithmic disgorgement. The company survived by rebuilding on legitimate data, proving disgorgement is enforceable.
This case is foundational for understanding data liability in AI-driven businesses. It demonstrates how consumer data pipelines can create regulatory exposure, how biometric data triggers specific legal obligations, and how business model pivots that monetize user data without consent can unravel.
Part 1: The Company
Ever App Origins
Everalbum launched in 2015 as a consumer photo storage application competing with Google Photos and Apple's iCloud Photos. The value proposition was straightforward: unlimited free storage with automatic organization. Users uploaded personal photos and videos to cloud storage for backup and management. The company raised venture capital based on the consumer photo storage market opportunity.
The Facial Recognition Pivot
In 2017, Everalbum added a "Friends" feature using facial recognition technology. The feature automatically grouped photos by the people appearing in them. Users could tag individuals, and the system would find all photos containing that person. Similar functionality existed in Google Photos and Apple Photos.
The consent problem emerged immediately. Until April 2019, facial recognition was enabled by default for users outside Texas, Illinois, Washington, and the European Union. Users in those jurisdictions received opt-in consent requests because state biometric laws and GDPR required it. Everyone else had the feature switched on automatically, with no way to turn it off and no opt-out mechanism.
As consumer photo storage became commoditized, Everalbum pivoted to enterprise facial recognition:
| Date | Development |
|---|---|
| 2017 | Facial recognition features added to Ever app |
| 2017-2019 | Face embeddings extracted from user photos |
| 2018 | "Ever AI" brand launched for enterprise sales |
| 2019 | Company rebrands enterprise division as "Paravision" |
| August 2020 | Ever consumer app shut down |
| September 2020 | Paravision releases new models without Ever user data |
The business model shift was complete: Everalbum transformed from a B2C photo storage company into a B2B facial recognition vendor, using consumer photo data to train enterprise-grade AI.
Enterprise Customers
The pivot's most controversial aspect was the customer base Paravision pursued. According to NBC News reporting from 2019, customers included private security companies, law enforcement agencies, military and defense contractors, and government identity verification programs.
In 2024, Paravision announced a "major new contract with a Five Eyes government partner," the intelligence alliance comprising the US, UK, Canada, Australia, and New Zealand. As of 2024, Paravision technology is deployed or planned for deployment at over 50 airports and event venues globally.
This trajectory from family photo storage to government surveillance exemplifies the data pipeline concerns that motivated FTC enforcement.
Part 2: The Data Pipeline
What Users Uploaded
The Ever app collected millions of personal photos from users. These included family photos, children's images, personal events, and intimate moments. Users understood they were uploading to a photo storage service. They did not understand they were contributing to a facial recognition training dataset.
The FTC found that Everalbum retained photos and videos from users who deactivated their accounts, despite promising deletion. This retention violation compounded the consent problem.
Face Embeddings Explained
Face embeddings are the structured biometric data extracted from photos that enable facial recognition. Understanding this technology is essential for assessing similar liability risks.
A face embedding is a numerical vector, typically 128 or 512 dimensions, that represents the unique geometric and textural features of a face. The values jointly encode facial characteristics learned by the network rather than mapping one-to-one onto interpretable features.
How face embeddings are generated:
- Detection: A neural network identifies faces within an image
- Alignment: The face is normalized (rotated, scaled) to a standard orientation
- Feature Extraction: A convolutional neural network processes the aligned face
- Embedding Output: The network's final layer produces a fixed-length vector
The resulting vector might look like: [0.023, -0.156, 0.089, 0.412, -0.067, ...] (128 or 512 values).
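The four steps above can be illustrated with a minimal Python sketch. It uses the open-source face_recognition library (a dlib-based package) purely as an example; it is not the stack Everalbum or Paravision used, and the input filename is hypothetical.

```python
# Illustrative only: a dlib-based open-source pipeline, not the stack
# Everalbum or Paravision actually used. Requires: pip install face_recognition
import face_recognition

def embed_faces(image_path: str):
    """Return one 128-dimensional embedding per face found in the image."""
    image = face_recognition.load_image_file(image_path)       # decode to an RGB array
    locations = face_recognition.face_locations(image)         # step 1: detection
    # step 2 (alignment) happens internally via facial landmarks before encoding
    return face_recognition.face_encodings(image, locations)   # steps 3-4: CNN -> vectors

if __name__ == "__main__":
    for vector in embed_faces("family_photo.jpg"):             # hypothetical input file
        print(vector.shape, vector[:5])                        # (128,) [ 0.023 -0.156 ...]
```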
Key properties of face embeddings:
- Identity-preserving: Two photos of the same person produce similar vectors
- Distance-based matching: Euclidean distance or cosine similarity measures identity match (see the sketch after this list)
- Difficult to invert: the original face generally cannot be reconstructed from the embedding alone, though research has shown approximate reconstructions are possible
- Compact: a 128-dimensional float32 embedding occupies roughly 512 bytes, enough to represent a face for matching purposes
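To make the distance-based matching property concrete, here is a minimal NumPy sketch. The 0.6 Euclidean threshold is illustrative (a common default for dlib-style 128-dimensional embeddings); production systems tune thresholds per model and use case.

```python
import numpy as np

def euclidean_distance(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.linalg.norm(a - b))

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_person(a: np.ndarray, b: np.ndarray, threshold: float = 0.6) -> bool:
    # Smaller distance means the two faces are more likely the same identity.
    return euclidean_distance(a, b) < threshold

query, gallery = np.random.rand(128), np.random.rand(128)   # stand-ins for real embeddings
print(euclidean_distance(query, gallery), same_person(query, gallery))
```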
Why this matters for liability:
Face embeddings are biometric data under state laws including Illinois BIPA, Texas CUBI, and Washington's biometric law. They constitute personal data under GDPR because they uniquely identify individuals. They are structured data (numerical vectors in databases) and derived data (created from photos, but distinct from them).
The FTC's requirement to delete face embeddings established that derived biometric data, not just source images, must be destroyed when the underlying consent is inadequate.
Training Dataset Creation
The FTC complaint detailed how Everalbum built training datasets:
"Between September 2017 and August 2019, Everalbum combined millions of facial images that it extracted from Ever users' photos with facial images that Everalbum obtained from publicly available datasets to create four datasets for use in the development of its facial recognition technology."
Consistent with that description, the training corpus drew on Ever user photos collected without adequate consent and on publicly available academic face databases, which were combined and augmented into the four datasets used to develop the production enterprise models.
The resulting facial recognition system achieved competitive accuracy on industry benchmarks and was marketed to enterprise customers first as "Ever AI" and then as "Paravision."
Part 3: The Violations
Consent Misrepresentation
Everalbum's help center stated that facial recognition was used "only with your consent." The reality was different. For users outside Texas, Illinois, Washington, and the EU, facial recognition was enabled by default. Users could not turn it off. There was no opt-out mechanism available.
The FTC found the help center statement "misleading" because consent was neither obtained nor meaningfully available for the majority of users.
Geographic Consent Disparity
Everalbum applied different consent standards based on user location:
| Jurisdiction | Consent Model | Reason |
|---|---|---|
| Texas | Opt-in required | TX CUBI law |
| Illinois | Opt-in required | BIPA (strongest US biometric law) |
| Washington | Opt-in required | WA biometric law |
| European Union | Opt-in required | GDPR |
| All other US states | Default-on, no opt-out | No applicable law |
This disparity is significant for due diligence purposes. The company knew how to implement proper consent. It did so where legally required. It chose not to elsewhere. This pattern indicates awareness of the problematic nature of the data practices.
Account Deletion Failures
Users who deactivated their accounts were told their photos and videos would be deleted. Everalbum retained this content indefinitely, continuing to use it for facial recognition development. This retention violated both the company's own privacy representations and reasonable user expectations.
Undisclosed Enterprise Use
Users were never informed that their family photos would train commercial facial recognition systems, that the resulting technology would be sold to enterprise customers, or that customers might include law enforcement and military agencies.
This information asymmetry was central to the FTC's deception finding. Consumers provided intimate photos for "free storage" that actually funded a surveillance technology business.
Part 4: FTC Enforcement
Section 5 Complaint
The FTC announced the proposed settlement on January 11, 2021, alleging violations of Section 5 of the FTC Act (unfair or deceptive practices).
The key allegations were:
- Misrepresented that facial recognition required consent
- Failed to delete photos and videos upon account deactivation
- Used consumer data to develop commercial facial recognition without disclosure
- Retained biometric data (face embeddings) without consent
The "Affected Work Product" Definition
The settlement introduced terminology that has become standard in FTC algorithmic disgorgement orders:
"Affected Work Product" means any models or algorithms developed in whole or in part using Biometric Information Respondent collected from Users of the "Ever" mobile application.
This definition established several important precedents:
- Covers models trained even partially on non-compliant data
- Includes algorithms at any stage of development
- Extends to derived products (models trained on outputs of other models)
- Requires deletion regardless of commercial value
The "in whole or in part" language is particularly significant. A model trained on 95% legitimate data and 5% non-compliant data is still Affected Work Product subject to deletion. This creates strong incentives for data provenance tracking throughout the ML pipeline.
Three-Part Deletion Requirement
The settlement ordered deletion of three distinct data categories:
Photos and Videos (Unstructured): All content from users who deactivated accounts. This content should never have been retained.
Face Embeddings (Structured Biometric): All embeddings derived from users who did not provide express consent. The numerical vector representations, not just the source images.
Models and Algorithms (Work Product): Any facial recognition models trained using Ever user data. Any algorithms developed using biometric information from the app.
This three-tier structure addresses the full data lifecycle: raw data, derived structured data, and trained models.
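A minimal sketch of a deletion manifest mirroring those three tiers, with hypothetical storage paths, identifiers, and a toy model registry:

```python
def deletion_manifest(non_consenting_user_ids, model_registry):
    """Enumerate deletions across the order's three tiers.

    `model_registry` maps model IDs to the user IDs whose data trained them;
    both inputs are hypothetical structures used for illustration.
    """
    affected_users = set(non_consenting_user_ids)
    return {
        "raw_content": [f"photos_and_videos/{uid}" for uid in affected_users],
        "embeddings":  [f"face_embeddings/{uid}" for uid in affected_users],
        "models":      [mid for mid, users in model_registry.items()
                        if affected_users & set(users)],   # trained in whole or in part
    }

print(deletion_manifest(["u123"], {"model_v1": ["u123", "u999"], "model_v2": ["u555"]}))
```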
No Monetary Penalty
Unlike later cases, the Everalbum settlement included no financial penalty.
Commissioner Rohit Chopra, in a separate statement, criticized the absence of a fine: "It is unfortunate that the Commission is not seeking any monetary relief from Everalbum."
Several factors may explain the absence of a fine. The settlement was reached before the FTC established a pattern of high penalties for algorithmic misconduct. The company was transitioning away from its consumer business. This was the first facial recognition-focused enforcement action, testing legal theories that had not been litigated.
Subsequent cases have included substantial monetary penalties alongside disgorgement requirements.
Settlement Finalized
The FTC finalized the settlement in May 2021 after a public comment period.
Part 5: Aftermath
Paravision Compliance
In a rare public disclosure, Paravision submitted a compliance report to the FTC confirming:
- Ever app shutdown: August 2020 (before settlement finalized)
- Data deletion: All photos and videos from deactivated accounts deleted
- Face embeddings: All non-consensual embeddings destroyed
- Algorithm rebuild: New models released September 2020 using only compliant data
This public confirmation provides evidence that algorithmic disgorgement is practically enforceable. Companies can and do delete algorithms when required.
Business Continuity
Unlike Cambridge Analytica (bankruptcy) or Edmodo (shutdown following FTC action), Paravision continues operating successfully.
Current status (2024-2025):
- NIST Face Recognition Technology Evaluation top 5 ranking (January 2025)
- Five Eyes government contract announced (October 2024)
- 50+ airport and venue deployments globally
- New product releases including Liveness 2.0 and Deepfake Detection 2.0
The company rebuilt its technology on compliant data foundations and continued its enterprise business. Disgorgement was not a death sentence. It was a reset requiring legitimate data sourcing.
Model Rebuild Timeline
| Date | Event |
|---|---|
| August 2020 | Ever app shut down |
| September 2020 | New facial recognition model released |
| January 2021 | FTC settlement announced |
| May 2021 | Settlement finalized |
| 2022 | Paravision compliance report filed |
The company proactively rebuilt its models before the settlement was even announced, demonstrating that compliance was technically feasible within a compressed timeline.
Part 6: Lessons for Data Liability
Consumer-to-Enterprise Data Flow
The Everalbum case illustrates a specific liability pattern: company collects data through consumer app, extracts valuable features, builds B2B product, sells to enterprise customers.
Consumer consent for a photo storage app does not extend to facial recognition development, enterprise product training, or government and law enforcement applications. Each of these use cases represents a material departure from the original data collection context.
For due diligence purposes, any AI/ML company should be evaluated for training data provenance. Consumer-facing data collection that feeds enterprise products requires scrutiny of consent scope.
Biometric Data Sensitivity
The case establishes face embeddings as regulated biometric data with specific legal triggers:
- Illinois BIPA: Private right of action with statutory damages of $1,000 per negligent violation and $5,000 per intentional or reckless violation
- Texas CUBI: Attorney General enforcement with penalties up to $25,000 per violation
- Washington: Attorney General enforcement
The geographic consent disparity in Everalbum reveals a due diligence heuristic. If a company implements different consent models for Texas, Illinois, and Washington users, this indicates awareness of biometric law requirements. The question becomes: what data practices would violate those state laws, and are those practices applied to users in other jurisdictions?
Applying different consent standards in different jurisdictions is a red flag indicating the company knows its practices are problematic where stricter laws apply.
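This heuristic can be expressed as a simple check on a consent configuration. The mapping below is a hypothetical reconstruction of the practice described in the table above, not Everalbum's actual configuration:

```python
CONSENT_CONFIG = {                      # hypothetical reconstruction, for illustration
    "IL": "opt_in", "TX": "opt_in", "WA": "opt_in", "EU": "opt_in",
    "default": "on_by_default_no_opt_out",
}

def consent_disparity(config: dict) -> bool:
    """True if some jurisdictions get meaningful consent while others do not."""
    return len(set(config.values())) > 1

if consent_disparity(CONSENT_CONFIG):
    print("Red flag: consent varies by jurisdiction; ask which practices the "
          "stricter jurisdictions are being shielded from.")
```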
Business Model Pivot Risks
Companies that pivot from consumer to enterprise face particular risks. Data repurposing is the core issue: consumer data collected under one set of representations cannot automatically be used for different commercial purposes.
Consent scope matters. Terms of service for "free photo storage" do not grant rights for "facial recognition development" or "enterprise AI training." Material changes in data use may require affirmative re-consent from users.
The Everalbum case demonstrates that business model evolution does not automatically carry forward data rights. Each new use case must be evaluated against the original consent basis.
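One way to operationalize that evaluation is a consent-scope check that treats any use outside the originally consented purposes as requiring affirmative re-consent. The purpose labels below are hypothetical:

```python
ORIGINAL_CONSENT = {"photo_storage", "photo_organization"}      # hypothetical purpose labels

def requires_reconsent(proposed_use: str, consented: set = ORIGINAL_CONSENT) -> bool:
    """A use outside the originally consented purposes needs affirmative re-consent."""
    return proposed_use not in consented

for use in ("photo_storage", "facial_recognition_training", "enterprise_model_sales"):
    print(use, "->", "re-consent required" if requires_reconsent(use) else "within scope")
```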
Technical Feasibility of Disgorgement
The Everalbum case proves algorithmic disgorgement is technically achievable:
- Models can be deleted
- New models can be trained on compliant data
- Business can continue post-disgorgement
- Compliance can be documented and verified
Companies cannot argue that disgorgement is technically impossible or commercially fatal. Paravision's continued success demonstrates that the remedy is survivable for well-managed organizations with legitimate technology.
The implication for deal analysis is that algorithmic disgorgement should be modeled as a potential outcome for companies with questionable training data provenance. The cost is the investment required to rebuild models on compliant data, not necessarily the value of the entire business.
The "Affected Work Product" Template
The Everalbum settlement established the framework now applied across FTC algorithmic enforcement. Subsequent cases involving Weight Watchers (Kurbo), Rite Aid, and others have used similar "Affected Work Product" definitions requiring deletion of models trained on improperly collected data.
Understanding this template is essential for assessing regulatory exposure. Any model trained "in whole or in part" on non-compliant data is potentially subject to disgorgement. Data provenance documentation is not merely good practice; it is the defense against remedies that could require destroying core IP.
Sources
Primary FTC Documents
FTC Press Release: Everalbum Settlement (January 11, 2021) https://www.ftc.gov/news-events/news/press-releases/2021/01/california-company-settles-ftc-allegations-it-deceived-consumers-about-use-facial-recognition-photo
FTC Finalized Settlement (May 2021) https://www.ftc.gov/news-events/news/press-releases/2021/05/ftc-finalizes-settlement-photo-app-developer-related-misuse-facial-recognition-technology
Paravision Compliance Report (redacted) https://www.ftc.gov/system/files/ftc_gov/pdf/paravision_compliance_report_redacted.pdf
FTC Case Page https://www.ftc.gov/legal-library/browse/cases-proceedings/192-3172-everalbum-inc-matter
Legal Analysis
Harvard JOLT: First Facial Recognition Misuse Settlement https://jolt.law.harvard.edu/digest/everalbum-inc-in-first-facial-recognition-misuse-settlement-ftc-requires-destruction-of-algorithms-trained-on-deceptively-obtained-photos
Perkins Coie: Everalbum Settlement Analysis https://www.perkinscoie.com/en/news-insights/everalbum-settles-ftc-claims-alleging-deceptive-use-of-facial-recognition-technology.html
News Coverage
TechCrunch: Ever App FTC Settlement https://techcrunch.com/2021/01/12/ftc-settlement-with-ever-orders-data-and-ais-deleted-after-facial-recognition-pivot/
Biometric Update: Paravision Company Profile https://www.biometricupdate.com/companies/paravision