
Data Risk Assessments: AI and Personal Data Processing


As states implement new privacy laws requiring data protection impact assessments (DPIAs) for high-risk processing activities, organizations must develop systematic approaches to evaluate and document risks related to AI, automated decision-making, and processing of sensitive data. Expert panelists provided a practical framework for implementing assessment processes while addressing constitutional and attorney-client privilege considerations.

Key Takeaways

Privacy impact assessments (PIAs) are increasingly required for high-risk data processing activities, including targeted advertising, processing of sensitive data, and AI/automated decision-making systems. These assessments serve multiple purposes: identifying and mitigating risks, maintaining current data inventories, and supporting privacy-by-design principles. Organizations must carefully document their assessment processes and retain records for potential regulatory review.

The Colorado AI Act sets new standards for high-risk artificial intelligence system (HAIS) assessments starting January 2026. Organizations must implement risk management policies and programs that identify and mitigate potential algorithmic discrimination and other harms throughout an AI system's lifecycle. Regular impact assessments are required annually and within 90 days of substantial modifications, with specific requirements for documenting system inputs, outputs, performance metrics, and monitoring plans.

California's draft regulations build on existing requirements with new provisions covering AI training data, automated decision-making, and behavioral advertising. They require detailed documentation of all internal and external parties involved in data practices, certification by designated approvers, and filing of abridged versions with the regulator. Organizations must balance compliance with maintaining appropriate protections for proprietary information and attorney-client privilege.

Implementation requires a systematic approach including creating comprehensive data inventories, understanding regulatory scope, developing assessment workflows, and establishing remediation processes. Organizations should leverage technology platforms to streamline assessments while ensuring integration with existing data management systems. Regular updates and monitoring are essential for maintaining effectiveness and demonstrating compliance.

Action Steps

  1. Create comprehensive data privacy inventory and mapping.
  2. Determine applicable regulatory requirements and assessment triggers.
  3. Develop standardized assessment workflows and templates.
  4. Implement technology solutions to manage assessment processes.
  5. Establish clear roles and responsibilities for assessment completion.
  6. Create remediation tracking and monitoring procedures.
  7. Document assessment outcomes and maintain required records.
  8. Develop reporting templates for regulatory submissions.
  9. Establish processes to protect privileged information.
  10. Regularly review and update assessment procedures.

Source

"Data Risk Assessments: AI and Personal Data Processing." Alan Friel, partner at Squire Patton Boggs; David Manek, senior managing director at Ankura Consulting; Taylor Ball, account manager at Exterro; Kimberly Wong, VP and chief counsel at Kellanova. ANA Masters of Advertising Law Conference, 11/11/24.
