Comprehensive definitions of privacy concepts for institutional due diligence.
Privacy model ensuring every combination of quasi-identifiers appears at least k times
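The k-anonymity property described above can be checked mechanically: find the smallest group of records sharing the same quasi-identifier combination. A minimal sketch (field names and records are illustrative, not from any real dataset):

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the dataset's k: the size of the smallest group of
    records sharing one combination of quasi-identifier values."""
    combos = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return min(combos.values())

rows = [
    {"zip": "02138", "age": "30-39", "diagnosis": "flu"},
    {"zip": "02138", "age": "30-39", "diagnosis": "cold"},
    {"zip": "02139", "age": "40-49", "diagnosis": "flu"},
    {"zip": "02139", "age": "40-49", "diagnosis": "asthma"},
]
print(k_anonymity(rows, ["zip", "age"]))  # -> 2: each (zip, age) pair appears twice
```

A dataset satisfies k-anonymity for a given k when this function returns at least k for the chosen quasi-identifiers.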
Mathematical framework providing quantifiable privacy guarantees, typically by injecting calibrated noise into query results
Data points that appear anonymous alone but can identify individuals when combined
Technique combining quasi-identifiers across datasets to re-identify individuals
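A toy version of such a linkage attack joins a "de-identified" release against a public record on shared quasi-identifiers. All names, values, and fields below are fabricated for illustration:

```python
# "De-identified" medical table: names removed, quasi-identifiers kept.
medical = [
    {"zip": "02138", "birth_year": 1954, "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02141", "birth_year": 1988, "sex": "M", "diagnosis": "asthma"},
]
# Public voter roll: names present alongside the same quasi-identifiers.
voters = [
    {"name": "A. Smith", "zip": "02138", "birth_year": 1954, "sex": "F"},
    {"name": "B. Jones", "zip": "02144", "birth_year": 1971, "sex": "M"},
]

def link(released, public, keys):
    """Re-identify records by matching quasi-identifier values across datasets."""
    return [
        (p["name"], r["diagnosis"])
        for r in released
        for p in public
        if all(r[k] == p[k] for k in keys)
    ]

print(link(medical, voters, ["zip", "birth_year", "sex"]))
# -> [('A. Smith', 'hypertension')]: the diagnosis is re-identified
```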
Statistical likelihood that an individual can be identified from de-identified data
Privacy metric measuring the fraction of individuals uniquely identifiable given a small number of data points
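This uniqueness metric can be computed directly as the fraction of records whose quasi-identifier combination occurs exactly once. A sketch with illustrative data:

```python
from collections import Counter

def uniqueness(records, fields):
    """Fraction of records whose combination of the given field
    values appears exactly once in the dataset."""
    combos = Counter(tuple(r[f] for f in fields) for r in records)
    n_unique = sum(1 for count in combos.values() if count == 1)
    return n_unique / len(records)

rows = [
    {"zip": "02138", "age": 34},
    {"zip": "02138", "age": 34},
    {"zip": "02139", "age": 41},
    {"zip": "02140", "age": 52},
]
print(uniqueness(rows, ["zip", "age"]))  # -> 0.5: two of four records are unique
```

Higher uniqueness for a small set of fields signals higher re-identification risk.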
Category of personal data whose exposure poses elevated risk of discrimination, identity theft, or physical harm
Processing personal data so it cannot be attributed to an individual without additional separately-held information
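One common pseudonymization technique is replacing direct identifiers with a keyed hash, where the key is the separately held additional information. A sketch using HMAC-SHA256 from the standard library (the key and identifiers are hypothetical):

```python
import hmac
import hashlib

def pseudonymize(identifier, key):
    """Replace a direct identifier with a keyed hash. Without the
    separately held key the pseudonym cannot be traced back to the
    individual; with the same key, records remain joinable."""
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()[:16]

key = b"held-by-a-separate-custodian"  # hypothetical key, stored apart from the data
p1 = pseudonymize("alice@example.com", key)
p2 = pseudonymize("alice@example.com", key)
print(p1 == p2)  # True: same input and key yield the same pseudonym
```

Unlike full anonymization, the mapping is reversible for whoever holds the key, which is why pseudonymized data generally remains personal data under GDPR.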
Privacy model requiring at least l distinct sensitive attribute values per equivalence class
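The l-diversity property can be checked by grouping records into equivalence classes on the quasi-identifiers and counting distinct sensitive values per class. A minimal sketch with illustrative records:

```python
from collections import defaultdict

def l_diversity(records, quasi_identifiers, sensitive):
    """Return the dataset's l: the smallest number of distinct
    sensitive values found in any equivalence class (a group of
    records sharing the same quasi-identifier values)."""
    classes = defaultdict(set)
    for r in records:
        classes[tuple(r[q] for q in quasi_identifiers)].add(r[sensitive])
    return min(len(values) for values in classes.values())

rows = [
    {"zip": "02138", "age": "30-39", "diagnosis": "flu"},
    {"zip": "02138", "age": "30-39", "diagnosis": "cold"},
    {"zip": "02139", "age": "40-49", "diagnosis": "flu"},
    {"zip": "02139", "age": "40-49", "diagnosis": "flu"},
]
print(l_diversity(rows, ["zip", "age"], "diagnosis"))
# -> 1: the second class is 2-anonymous but everyone in it has the same diagnosis
```

This illustrates why l-diversity complements k-anonymity: a class can satisfy k-anonymity yet still disclose the sensitive attribute if all its members share one value.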
Privacy attack deducing sensitive information by analyzing query results or model outputs
Entity that determines the purposes and means of processing personal data
Third party engaged by a data processor to carry out specific data processing activities on the processor's behalf
Independent public body empowered to investigate and enforce data protection laws
Formal regulatory measure to address an organization's failure to comply with data protection laws
Individual's right to obtain confirmation of whether their personal data is being processed and access to that data
Privacy disclosure provided to individuals at or before the moment their personal data is collected
Defined duration for which an organization stores personal data before deletion or anonymization
Personal data derived from technical processing of physical or behavioral characteristics enabling unique identification
EU Commission determination that a non-EU country provides data protection essentially equivalent to EU standards
General Data Protection Regulation (Regulation (EU) 2016/679) - the EU's comprehensive data protection framework
California Consumer Privacy Act / California Privacy Rights Act - California's consumer privacy statutes granting residents rights over their personal information
Regulatory remedy forcing deletion of AI models trained on non-compliant data
GDPR Article 17 - individual's right to have personal data deleted
Decisions made by algorithmic systems without meaningful human involvement in the final determination
Statutory mechanism enabling individuals to sue organizations directly for privacy violations without relying on government enforcement
Final liability metric combining asset toxicity with contextual multipliers
Intrinsic risk score of a dataset, independent of owner context
Risk from combining individually innocuous data into identifying information
Privacy risks specific to large language model (LLM) systems, including memorization and leakage of training data
Techniques for removing the influence of specific training data from an already-trained model
Privacy attack reconstructing training data by exploiting machine learning model outputs