ISACA Advanced in AI Audit (AAIA) Certification

Course 2020

  • Duration: 2 days
  • Language: English
  • Level: Intermediate

This two-day, instructor-led course gives IS auditors the foundational knowledge of AI solutions needed to evaluate their governance, design, development, and security, and to apply that expertise in enterprise audit and assurance activities. The course is structured to align with the AAIA job practice and features knowledge check questions, case studies, activities, and discussions designed to apply the concepts to real-life business scenarios.

ISACA AI Audit Certification Delivery Methods

  • In-Person

  • Online

  • Private Team Training, brought to your facility to upskill your whole team.

ISACA AI Audit Certification Course Information

In this course, you will:

  • Explain the principles of AI Governance and Risk Management
  • Implement effective AI Operations practices
  • Utilize AI Auditing Tools and Techniques

Prerequisites

This course is designed for IT audit professionals holding a CISA, CIA, or CPA certification who want to enhance their expertise in navigating AI-driven challenges while upholding the highest industry standards.

ISACA AI Audit Certification Course Outline

Domain 1. AI Governance and Risk

Learning Objectives:

Within this domain, the AI auditor should be able to:

  • Evaluate impacts, opportunities, and risk when integrating AI solutions within the audit process.
  • Evaluate AI solutions to advise on impact, opportunities, and risk to the organization.
  • Evaluate the impact of AI solutions on system interactions, environment, and humans.
  • Evaluate the role and impact of AI decision-making systems on the organization and stakeholders.
  • Evaluate the organization’s AI policies and procedures, including compliance with legal and regulatory requirements.
  • Evaluate the monitoring and reporting of metrics (e.g., KPIs, KRIs) specific to AI.
  • Evaluate whether the organization has defined ownership of AI-related risk, controls, procedures, decisions, and standards.
  • Evaluate the organization’s data governance program specific to AI.
  • Evaluate the organization’s privacy program specific to AI.
  • Evaluate the organization’s problem and incident management programs specific to AI.
  • Evaluate the organization’s change management program specific to AI.
  • Evaluate the organization’s configuration management program specific to AI.
  • Evaluate the organization’s threat and vulnerability management programs specific to AI.
  • Evaluate the organization’s identity and access management program specific to AI.
  • Evaluate vendor and supply chain management programs specific to AI solutions.
  • Evaluate the design and effectiveness of controls specific to AI.
  • Evaluate data input requirements for AI models (e.g., data appropriateness, bias, and privacy).
  • Evaluate system/business requirements for AI solutions to ensure alignment with enterprise architecture.
  • Evaluate AI solution life cycle (e.g., design, development, deployment, monitoring, and decommissioning) and inputs/outputs for compliance and risk.
  • Evaluate algorithms and models to ensure AI solutions are aligned to business objectives, policies, and procedures.
  • Analyze the impact of AI on the workforce to advise stakeholders on how to address AI-related workforce impacts, training, and education.
  • Evaluate whether awareness programs align with the organization’s AI-related policies and procedures.

 

Section A. AI Models, Considerations, and Requirements

1. Types of AI

  • Generative
  • Predictive
  • Narrow
  • General

2. Machine learning/AI Models

  • Basic models (see the sketch after this list)
  • Neural networks
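
To ground the distinction between the basic models and neural networks listed above, here is a minimal sketch of a basic model: a logistic-regression classifier trained with scikit-learn. The library and toy dataset are illustrative choices, not part of the AAIA materials.

    # A "basic model": logistic regression on a bundled toy dataset.
    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = LogisticRegression(max_iter=5000)
    model.fit(X_train, y_train)

    # An auditor would ask how such an accuracy figure was produced,
    # on what data, and whether the holdout set was truly held out.
    print(f"Holdout accuracy: {model.score(X_test, y_test):.3f}")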

3. Algorithms

  • Classes of Algorithms
  • Additional AI Considerations (technical terms and concepts relevant to the IS auditor)

4. AI Lifecycle Overview

  • Plan and Design
  • Collect and Process Data
  • Build and/or Adapt Model(s)
  • Test, Evaluate, Verify, and Validate
  • Make Available for Use/Deploy
  • Operate and Monitor
  • Retire/Decommission

5. Business Considerations

  • Business Use Cases, Needs, Scope, and Objectives
  • Cost-Benefit Analysis
  • Return on Investment (worked example after this list)
  • Internal vs. Cloud Hosting
  • Vendors
  • Shared Responsibility
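
To illustrate the cost-benefit analysis and return-on-investment topics above, a worked sketch with entirely hypothetical figures:

    # Hypothetical figures for an AI document-review tool.
    implementation_cost = 250_000    # licensing, integration, training
    annual_benefit = 400_000         # hours saved x loaded labor rate
    annual_operating_cost = 100_000  # hosting, monitoring, model upkeep

    net_annual_benefit = annual_benefit - annual_operating_cost
    roi_year_one = (net_annual_benefit - implementation_cost) / implementation_cost
    print(f"Year-one ROI: {roi_year_one:.0%}")  # 20%; later years omit the one-time cost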

 

Section B. AI Governance and Program Management

1. AI Strategy

  • Strategies
  • Opportunities
  • Vision and Mission
  • Value Alignment

2. AI-related Roles and Responsibilities

  • Categories, Focuses, and Common Examples

3. AI-related Policies and Procedures

  • Usage Policies

4. AI Training and Awareness

  • Skills, Knowledge, and Competencies

5. Program Metrics

  • Examples of Metrics with Objectives and Definitions

 

Section C. AI Risk Management

1. AI-related Risk Identification

  • AI Threat Landscape
  • AI Risks
  • Challenges for AI Risk Management

2. Risk Assessment

  • Risk Assessment
  • Risk Appetite and Tolerance
  • Risk Mitigation and Prioritization
  • Remediation Plans/Best Practices

3. Risk Monitoring

  • Continuous Improvement
  • Risk and Performance Metrics

 

Section D. Privacy and Data Governance Programs

1. Data Governance

  • Data Classification
  • Data Clustering
  • Data Licensing
  • Data Cleansing and Retention

2. Privacy Considerations

  • Data Privacy
  • Data Ownership (Governance and Privacy)

3. Privacy Regulatory Considerations

  • Data Consent
  • Collection, Use, and Disclosure

 

Section E. Leading Practices, Ethics, Regulations, and Standards for AI

1. Standards, Frameworks, and Regulations Related to AI

  • Best Practices
  • Industry Standards and Frameworks
  • Laws and Regulations

2. Ethical Considerations

  • Ethical Use
  • Bias and Fairness
  • Transparency and Explainability
  • Trust and Safety
  • IP Considerations
  • Human Rights

 

Domain 2. AI Operations

Learning Objectives:

Within this domain, the AI auditor should be able to:

  • Evaluate impacts, opportunities, and risk when integrating AI solutions within the audit process.
  • Evaluate AI solutions to advise on impact, opportunities, and risk to the organization.
  • Evaluate the impact of AI solutions on system interactions, environment, and humans.
  • Evaluate the role and impact of AI decision-making systems on the organization and stakeholders.
  • Evaluate the organization’s AI policies and procedures, including compliance with legal and regulatory requirements.
  • Evaluate the monitoring and reporting of metrics (e.g., KPIs, KRIs) specific to AI.
  • Evaluate whether the organization has defined ownership of AI-related risk, controls, procedures, decisions, and standards.
  • Evaluate the organization’s data governance program specific to AI.
  • Evaluate the organization’s privacy program specific to AI.
  • Evaluate the organization’s problem and incident management programs specific to AI.
  • Evaluate the organization’s change management program specific to AI.
  • Evaluate the organization’s configuration management program specific to AI.
  • Evaluate the organization’s threat and vulnerability management programs specific to AI.
  • Evaluate the organization’s identity and access management program specific to AI.
  • Evaluate vendor and supply chain management programs specific to AI solutions.
  • Evaluate the design and effectiveness of controls specific to AI.
  • Evaluate data input requirements for AI models (e.g., data appropriateness, bias, and privacy).
  • Evaluate system/business requirements for AI solutions to ensure alignment with enterprise architecture.
  • Evaluate AI solution life cycle (e.g., design, development, deployment, monitoring, and decommissioning) and inputs/outputs for compliance and risk.
  • Evaluate algorithms and models to ensure AI solutions are aligned to business objectives, policies, and procedures.
  • Analyze the impact of AI on the workforce to advise stakeholders on how to address AI-related workforce impacts, training, and education.
  • Evaluate whether awareness programs align with the organization’s AI-related policies and procedures.

 

Section A. Data Management Specific to AI

1. Data Collection

  • Consent
  • Fit for Purpose
  • Data Lag

2. Data Classification

3. Data Confidentiality

4. Data Quality

5. Data Balancing

6. Data Scarcity

7. Data Security

  • Data Encoding
  • Data Access
  • Data Secrecy
  • Data Replication
  • Data Backup

 

Section B. AI Solution Development Methodologies and Lifecycle

1. AI Solution Development Life Cycle

  • Use Case Development
  • Design
  • Development
  • Deployment
  • Monitoring and Maintenance
  • Decommission

2. Privacy and Security by Design

  • Explainability
  • Robustness

 

Section C. Change Management Specific to AI

1. Change Management Considerations

  • Data Dependency
  • AI Model
  • Regulatory and Societal Impact
  • Emergency Changes
  • Configuration Management

Section D. Supervision of AI Solutions

1. AI Agency

  • Logging and Monitoring
  • AI Observability
  • Human in the Loop (HITL), illustrated in the sketch after this list
  • Hallucination
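
A minimal human-in-the-loop sketch, assuming a confidence-threshold escalation policy (the threshold, field names, and routing logic are all hypothetical): low-confidence outputs go to a human reviewer, and every decision is logged so the trail is auditable.

    import json
    import logging
    from datetime import datetime, timezone

    logging.basicConfig(level=logging.INFO)
    CONFIDENCE_THRESHOLD = 0.85  # assumed risk-appetite setting, not a standard

    def handle_prediction(record_id: str, label: str, confidence: float) -> str:
        """Route a model output: auto-accept or escalate to human review."""
        decision = {
            "record_id": record_id,
            "label": label,
            "confidence": confidence,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "route": "human_review" if confidence < CONFIDENCE_THRESHOLD else "auto_accept",
        }
        logging.info(json.dumps(decision))  # evidence for later audit
        return decision["route"]

    handle_prediction("txn-001", "approve", 0.97)  # auto_accept
    handle_prediction("txn-002", "approve", 0.61)  # human_review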

Section E. Testing Techniques for AI Solutions

1. Conventional Software Testing Techniques

  • A/B Testing
  • Unit and Integration Testing (see the sketch after this list)
  • Objective Verification
  • Code Reviews
  • Black Box Testing
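
As a concrete instance of the unit and integration testing item above, a minimal unittest sketch for a model wrapper. The wrapper and its contract are hypothetical; the point is that AI components still carry conventional, assertable requirements such as output bounds and determinism.

    import unittest

    def score_transaction(amount: float) -> float:
        """Toy stand-in for a deployed risk model; must return a score in [0, 1]."""
        return min(max(amount / 10_000.0, 0.0), 1.0)

    class TestScoreTransaction(unittest.TestCase):
        def test_score_is_bounded(self):
            for amount in (-50.0, 0.0, 500.0, 1_000_000.0):
                self.assertTrue(0.0 <= score_transaction(amount) <= 1.0)

        def test_score_is_deterministic(self):
            self.assertEqual(score_transaction(500.0), score_transaction(500.0))

    if __name__ == "__main__":
        unittest.main()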

2. AI-Specific Testing Techniques

  • Model Cards
  • Bias Testing (see the sketch after this list)
  • Adversarial Testing

 

Section F. Threats and Vulnerabilities Specific to AI

1. Types of AI-related Threats

  • Training Data Leakage
  • Data Poisoning
  • Model Poisoning
  • Model Theft
  • Prompt Injections
  • Model Evasion
  • Model Inversion
  • Threats from Using Vendor-Supplied AI
  • AI Solution Disruption

2. Controls for AI-related Threats

  • Threat and Vulnerability Identification
  • Prompt Templates
  • Defensive Distillation
  • Regularization
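
A minimal sketch of regularization as a hardening control, assuming scikit-learn (where a smaller C means a heavier L2 penalty): heavier regularization shrinks model coefficients, which tends to reduce sensitivity to small, adversarially chosen input changes. Settings are illustrative, not prescriptive.

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression

    X, y = make_classification(n_samples=500, n_features=20, random_state=0)

    light = LogisticRegression(C=100.0, max_iter=2000).fit(X, y)  # light penalty
    heavy = LogisticRegression(C=0.01, max_iter=2000).fit(X, y)   # heavy penalty

    # Smaller coefficients make a smoother, less probe-sensitive decision surface.
    print("max |coef|, light penalty:", abs(light.coef_).max())
    print("max |coef|, heavy penalty:", abs(heavy.coef_).max())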

 

Section G. Incident Response Management Specific to AI

1. Prepare

  • Policies, Procedures, and Model Documentation
  • Incident Response Team
  • Tabletop Exercises

2. Identify and Report

3. Assess

4. Respond

  • Containment
  • Eradication
  • Recovery

 

5. Post-Incident Review

Domain 3. AI Auditing Tools and Techniques

Learning Objectives:

Within this domain, the AI auditor should be able to:

  • Evaluate impacts, opportunities, and risk when integrating AI solutions within the audit process.
  • Utilize AI solutions to enhance audit processes, including planning, execution, and reporting.
  • Evaluate the monitoring and reporting of metrics (e.g., KPIs, KRIs) specific to AI.
  • Evaluate data input requirements for AI models (e.g., data appropriateness, bias, and privacy).

 

Section A. Audit Planning and Design

1. Identification of AI Assets and Controls

  • Inventory Objective and Procedure
  • Inventory and Data Gathering Methods
  • Documentation
  • Surveys
  • Interviews

2. Types of AI Controls

  • Examples including Control Categories, Controls, and Explanations

3. Audit Use Cases

  • Large Language Models
  • Audit Process Improvement
  • Generative AI
  • Audit-Specific AI Applications

4. Internal Training for AI Use

  • Key Components for Auditor Knowledge
  • Practical Skills Development

 

Section B. Audit Testing and Sampling Methodologies

1. Designing an AI Audit

  • AI Audit Objectives
  • Audit Scoping and Resources

2. AI Audit Testing Methodologies

  • Overall Testing of AI Systems
  • Financial Models

3. AI Sampling

  • Judgmental Sampling
  • AI Sampling
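
A minimal sketch contrasting the two sampling approaches above: a uniform random sample versus a risk-weighted sample that concentrates on high-risk items. The population and risk scores are fabricated; in practice the scores might come from an anomaly-detection model.

    import random

    random.seed(7)
    population = [{"id": i, "risk_score": random.random()} for i in range(1000)]

    # Baseline: 25 items drawn uniformly at random.
    random_sample = random.sample(population, 25)

    # Risk-weighted: the 25 highest-scoring items.
    risky_sample = sorted(population, key=lambda r: r["risk_score"], reverse=True)[:25]

    print("mean risk, random sample:", sum(r["risk_score"] for r in random_sample) / 25)
    print("mean risk, risk-weighted:", sum(r["risk_score"] for r in risky_sample) / 25)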

4. Outcomes of AI Testing

  • Reduce False Positives
  • Reduce Workforce Needs
  • Outliers

 

Section C. Audit Evidence Collection Techniques

1. Data Collection

  • Training and Testing Data
  • Unstructured and Structured Data Collection
  • Extract, Transform, and Load (see the sketch after this list)
  • Data Manipulation
  • Scraping
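
A minimal extract-transform-load sketch with pandas, matching the ETL item flagged above. The file name and column names are hypothetical; the pattern, and retaining row counts for the audit trail, is the point.

    import pandas as pd

    # Extract: read a raw evidence export (hypothetical source system file).
    raw = pd.read_csv("access_log_export.csv")

    # Transform: normalize timestamps, drop duplicates, keep needed fields.
    raw["timestamp"] = pd.to_datetime(raw["timestamp"], errors="coerce")
    clean = (
        raw.drop_duplicates()
           .dropna(subset=["timestamp"])[["user_id", "timestamp", "action"]]
    )

    # Load: persist the working dataset; keep counts as part of the trail.
    clean.to_csv("access_log_clean.csv", index=False)
    print(f"Extracted {len(raw)} rows, loaded {len(clean)} rows")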

2. Walkthroughs and Interviews

  • Design Interview Questions

3. AI Collection Tools

  • Using AI to Collect Logs
  • AI Agents to Create Outputs
  • Speech to Text
  • Optical Character Recognition
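
A minimal optical character recognition sketch using pytesseract, one common open-source option (it assumes the Tesseract binary is installed; the file name is hypothetical):

    from PIL import Image
    import pytesseract

    # Extract text from a scanned evidence document.
    scanned_invoice = Image.open("invoice_scan.png")
    text = pytesseract.image_to_string(scanned_invoice)

    # The extracted text can then feed keyword searches or data analytics.
    print(text[:500])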

 

Section D. Audit Data Quality and Data Analytics

1. Data Quality

  • Optimization

2. Data Analytics

  • Sentiment Analysis (see the sketch after this list)
  • Run Data Analytics
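
A minimal sentiment-analysis sketch using the Hugging Face transformers pipeline, one possible tool (it downloads a default model on first run; the sample comments are fabricated):

    from transformers import pipeline

    classifier = pipeline("sentiment-analysis")
    comments = [
        "The new expense tool is fast and easy to use.",
        "Approvals keep getting stuck and nobody responds.",
    ]
    for comment, result in zip(comments, classifier(comments)):
        print(f"{result['label']:>8} ({result['score']:.2f}): {comment}")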

3. Data Reporting

  • Reports
  • Dashboards

 

Section E. AI Audit Outputs and Reports

1. Reports

  • Report Types (examples and details)
  • Advisory Reports
  • Charts and Visualizations

2. Audit Follow-up

  • Automated Follow-Up

3. Quality Assurance and Risk Mitigation


ISACA AI Audit Certification FAQs

  • Exam Format: The exam is computer-based and administered as a closed-book, remotely proctored assessment.
  • Number of Questions: The exam comprises 55 multiple-choice questions.
  • Duration: Candidates are allotted 2 hours to complete the exam.
  • Passing Score: A score of 65% or higher is required to pass the exam.