
As Artificial Intelligence moves deeper into core processes—credit decisions, fraud detection, underwriting, claims, clinical workflows, customer journeys, and public services—internal audit and risk functions are under growing pressure.
Chief Audit Executives (CAEs), Heads of Risk, and Compliance leaders are being asked:
- “Have our AI models been independently reviewed?”
- “How do we know AI is working as intended—and still aligned with policy and regulation?”
- “Where does AI sit in our risk-based audit plan?”
- “Do we have enough skills and tools to challenge AI systems effectively?”
At the same time, most internal audit teams are:
- Comfortable with process, controls, and IT general controls audits
- Experienced in model risk from credit risk or pricing models
- But still building confidence in how to assess machine learning and generative AI—especially at scale
This article offers a practical roadmap for internal audit and risk functions, showing how Dawgen Global’s proprietary AI assurance methodologies—Dawgen AI Lifecycle Assurance (DALA)™, Dawgen Generative AI Controls Framework (DGACF)™, Dawgen AI Governance & Ethics Index (DAGEI)™, and Dawgen Continuous AI Monitoring & Assurance (DCAMA)™—can be integrated into audit plans and three-lines-of-defence models.
1. Why Internal Audit Cannot Ignore AI
Internal audit’s mandate is to provide independent assurance on risk management, control, and governance. As soon as AI is embedded in:
- Product approval and pricing
- Credit, AML, and fraud decisions
- Claims and underwriting
- Customer complaints and servicing
- Eligibility and public-service decisions
- Internal productivity and coding assistants
…it becomes a material part of the control environment.
Three reasons internal audit must engage:
1.1 AI Changes the Control Landscape
Traditional controls focus on:
- Approvals and authorisations
- Segregation of duties
- Reconciliations and exception reports
- IT general controls and access management
AI introduces new layers:
- Data pipelines and feature engineering
- Model training, validation and deployment pipelines
- Monitoring metrics (performance, drift, fairness, safety)
- Generative AI prompt and output controls
If internal audit ignores these layers, gaps open between traditional controls and actual decision-making reality.
1.2 Regulators Expect Independent Assurance
Across sectors, regulators increasingly expect:
- Robust model risk management for AI and ML models
- Evidence of independent review and challenge
- Demonstrable governance and oversight from boards and risk committees
Internal audit is a natural vehicle for this assurance—but only if it has a structured approach.
1.3 AI Risk is Cross-Cutting
AI risk is not just “IT risk” or “data risk.” It cuts across:
- Conduct and consumer protection
- Prudential risk and capital
- Legal and compliance
- Operational resilience
- Data protection and confidentiality
- Ethics, fairness and human rights
Internal audit is one of the few functions with a mandate to look across the entire enterprise, making it central to credible AI assurance.
2. Typical Challenges for Internal Audit
When internal audit teams start looking at AI, they often face similar obstacles:
- Limited technical depth – auditors are not data scientists and may feel uncomfortable challenging model architecture or training methods.
- Lack of inventory and visibility – no clear list of AI systems, where they sit, or how critical they are.
- Unclear scope boundaries – questions like “Are we auditing the model, the process, or the vendor?” often go unanswered.
- Ad hoc approaches – one team audits a chatbot, another a fraud model, each using their own checklists.
- Rapid change – AI models are updated, retrained or replaced more frequently than traditional systems, making annual audits feel insufficient.
Dawgen’s methodologies are designed to complement internal audit—bringing structured AI expertise while leaving ownership of the internal audit plan and reporting firmly with the CAE.
3. How Dawgen’s Frameworks Support Internal Audit
From an internal audit perspective, Dawgen’s four proprietary methodologies can be used as tools in the audit toolkit:
- DALA™ (Dawgen AI Lifecycle Assurance) – used for deep, end-to-end reviews of critical AI systems, aligned with internal audit or model risk engagements.
- DGACF™ (Dawgen Generative AI Controls Framework) – applied when internal audit needs to assess LLMs, copilots, chatbots and other generative AI.
- DAGEI™ (Dawgen AI Governance & Ethics Index) – used to provide a baseline of governance maturity and to inform risk-based audit planning.
- DCAMA™ (Dawgen Continuous AI Monitoring & Assurance) – offers ongoing monitoring and periodic mini-assurance, which internal audit can rely upon, test or incorporate into its own work.
Each framework can be integrated into the three lines of defence:
- 1st line (Business & IT) – responsible for safe development and operation
- 2nd line (Risk & Compliance) – responsible for policies, standards and oversight
- 3rd line (Internal Audit) – provides independent assurance, leveraging Dawgen as specialist support where needed
4. Building an AI-Aware Risk-Based Audit Plan
A practical first step for internal audit is to update its risk universe to explicitly include AI.
4.1 Create or Validate the AI Use Case Register
If the organisation does not yet have an AI Use Case Register, internal audit can:
- Recommend its creation, or
- Partner with risk and data teams to validate and refine it.
The register should list:
- AI systems and use cases
- Business owners and technical owners
- Model types (predictive, optimisation, generative, etc.)
- Data sources
- Impact and regulatory exposure
- Deployment stage (pilot, production)
This inventory becomes the starting point for risk assessment.
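To make the register concrete, here is a minimal sketch of how such an inventory could be modelled in code. The field names and the sample entry are illustrative assumptions, not a prescribed Dawgen schema:

```python
from dataclasses import dataclass

@dataclass
class AIUseCase:
    """One row of an AI Use Case Register (illustrative fields only)."""
    name: str                # e.g. "Retail credit scoring"
    business_owner: str
    technical_owner: str
    model_type: str          # "predictive", "optimisation", "generative", ...
    data_sources: list
    impact: str              # "high" / "medium" / "low"
    regulatory_exposure: bool
    deployment_stage: str    # "pilot" or "production"

register = [
    AIUseCase("Retail credit scoring", "Head of Lending", "ML Lead",
              "predictive", ["bureau data", "application data"],
              "high", True, "production"),
]

# High-impact production systems are natural candidates for deep review
deep_review = [u.name for u in register
               if u.deployment_stage == "production" and u.impact == "high"]
```

Even a register this simple lets audit filter systematically rather than relying on ad hoc knowledge of where AI is deployed.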
4.2 Risk-Rank AI Use Cases
Internal audit should work with risk and management to classify AI systems by:
- Impact on customers/patients/citizens
- Financial/material impact
- Regulatory visibility and sensitivity
- Degree of automation (advisory vs. decision-making)
- Exposure to generative AI risk (hallucinations, unsafe content, leakage)
This ranking informs the multi-year audit plan:
- Critical/high-risk AI → DALA™ reviews + DCAMA™ monitoring + periodic internal audit follow-up
- Medium-risk AI → thematic or process-level audits, sampling, reliance on DCAMA™
- Low-risk AI → oversight via policy and governance audits
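The tiering above can be sketched as a simple scoring rule. The 1–3 factor scales, the weights and the thresholds below are illustrative assumptions, not a Dawgen formula:

```python
def risk_tier(impact: int, automation: int, genai_exposure: int) -> str:
    """Map 1-3 factor scores to an audit tier (weights/thresholds illustrative)."""
    score = 2 * impact + automation + genai_exposure  # impact weighted highest
    if score >= 9:
        return "critical"  # deep lifecycle review + continuous monitoring
    if score >= 6:
        return "medium"    # thematic audits and sampling
    return "low"           # covered via policy and governance audits
```

For example, a fully automated, high-impact generative AI system scores into the critical tier, while an advisory, low-impact tool lands in the low tier.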
5. Using DAGEI™ to Inform Audit Planning
The Dawgen AI Governance & Ethics Index (DAGEI)™ offers internal audit a powerful input into planning:
- It scores the organisation across six dimensions, including governance, policy alignment, data & privacy, fairness, resilience, and transparency.
- The results highlight where controls are weakest and where audit focus will have greatest value.
Internal audit can:
- Use DAGEI™ output as part of the annual risk assessment
- Align audits to low-scoring dimensions (e.g., fairness monitoring, generative AI guardrails)
- Track how repeated audits and remediation improve DAGEI™ scores over time
This ensures that internal audit activity is strategic, not random.
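As an illustration of how index scores can feed planning, the sketch below aggregates per-dimension scores and flags the weakest areas. The dimension names come from the article, but the 0–100 scale and the simple-average aggregation are assumptions, not the DAGEI™ methodology:

```python
def governance_index(scores: dict) -> tuple:
    """Aggregate per-dimension maturity scores (0-100) into an overall index
    and pick the two lowest-scoring dimensions as audit focus areas."""
    overall = sum(scores.values()) / len(scores)
    focus = sorted(scores, key=scores.get)[:2]  # weakest dimensions first
    return round(overall, 1), focus

scores = {"governance": 72, "policy alignment": 65, "data & privacy": 80,
          "fairness": 48, "resilience": 70, "transparency": 55}
overall, focus = governance_index(scores)
```

In this hypothetical, fairness and transparency score lowest, so the annual plan would prioritise fairness monitoring and generative AI guardrail audits.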
6. Embedding DALA™ into Internal Audit Engagements
When internal audit selects a critical AI system—for example, a credit-scoring model or claims triage engine—it can use DALA™ as the backbone of the engagement.
6.1 Scoping Around the Seven Phases
DALA™ covers:
- Strategy & use case qualification
- Governance & risk context
- Data & model due diligence
- Pre-deployment testing & scenario validation
- Deployment & change management
- Monitoring & incident management
- Governance, compliance & continuous improvement
Internal audit can:
- Scope the audit around these phases
- Ask for evidence at each step (policies, documentation, logs, validation reports, monitoring dashboards)
- Use Dawgen’s specialists to perform technical deep dives where needed, while internal audit leads on controls, governance and reporting
6.2 Outputs Internal Audit Can Rely On
A DALA™-based engagement typically delivers:
- A structured findings matrix – by lifecycle phase and risk theme
- Control ratings and remediation recommendations
- Clear traceability to policies, risk appetite and regulatory expectations
Internal audit can incorporate DALA™ results directly into its own audit reports to management and the audit committee, referencing Dawgen as an independent expert contributor.
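A findings matrix of this kind can be represented very simply, so that open items can be tracked by phase and theme. The records below are hypothetical examples, not real findings:

```python
# Findings keyed by lifecycle phase and risk theme (illustrative records)
findings = [
    {"phase": "Data & model due diligence", "theme": "data quality",
     "rating": "needs improvement", "remediation": "document feature lineage"},
    {"phase": "Monitoring & incident management", "theme": "drift",
     "rating": "satisfactory", "remediation": None},
]

# Anything not rated satisfactory carries into the remediation tracker
open_items = [f for f in findings if f["rating"] != "satisfactory"]
```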
7. Auditing Generative AI with DGACF™
Generative AI presents a new set of audit questions:
- What models and providers are being used (public, private, open-source)?
- What data is being sent to these models?
- What decisions or outputs are being influenced?
- How do we manage hallucinations, unsafe outputs and prompt injection?
Using DGACF™, internal audit can structure reviews around six dimensions:
- Model provenance & documentation
- Use-case scoping & guardrails
- Prompt, context & output controls
- Data protection, privacy & IP
- Human oversight & explainability
- Monitoring & feedback loops
Internal audit might, for example:
- Audit a customer-facing chatbot to ensure it cannot provide unapproved advice, reveal confidential data or generate harmful content.
- Review a developer copilot to check that code suggestions are appropriately governed and logged.
- Assess a knowledge assistant used by staff for policy or regulatory queries, ensuring that outputs are treated as drafts, not gospel.
DGACF™ gives auditors concrete control points to test, rather than leaving them to invent their own lists from scratch.
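One such control point, output filtering, can be illustrated with a minimal post-generation check that internal audit could test. The two blocked patterns below are purely illustrative stand-ins for a real control library:

```python
import re

# Illustrative patterns only; a production control set would be far richer
BLOCKED_PATTERNS = [
    r"\bguarantee(d)? returns?\b",  # unapproved financial advice
    r"\b\d{3}-\d{2}-\d{4}\b",       # SSN-like identifiers (data leakage)
]

def output_passes_guardrails(text: str) -> bool:
    """Return False if a draft chatbot reply matches any blocked pattern."""
    return not any(re.search(p, text, re.IGNORECASE) for p in BLOCKED_PATTERNS)
```

An auditor can exercise a control like this directly: feed it known-bad outputs and confirm they are blocked, then review how failures are logged and escalated.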
8. Leveraging DCAMA™ for Ongoing Assurance
One of internal audit’s challenges is that AI changes faster than the annual audit cycle.
DCAMA™ (Dawgen Continuous AI Monitoring & Assurance) provides a layer of ongoing oversight that internal audit can:
- Rely on – by reviewing DCAMA™ outputs and sampling test work;
- Validate – by auditing the DCAMA™ process itself;
- Extend – by asking Dawgen to include specific focus areas in future DCAMA™ cycles.
Examples of how internal audit can use DCAMA™:
- Use DCAMA™ dashboards in audit planning to spot deteriorating performance or emerging incident patterns.
- Draw on DCAMA™ incident reviews when auditing operational risk, conduct, or specific product lines.
- Check whether management responses to DCAMA™ findings are timely and effective.
This creates a continuous assurance fabric where internal audit, Dawgen and management each play their part.
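As one example of the kind of metric a monitoring layer might track, the Population Stability Index (PSI) is a common screen for input or score drift. This sketch is generic and is not a description of DCAMA™ internals:

```python
import math

def psi(expected: list, actual: list) -> float:
    """Population Stability Index between two binned distributions
    (proportions summing to 1). Conventional thresholds: ~0.1 = watch,
    ~0.25 = significant drift worth investigating."""
    return sum((a - e) * math.log(a / e)
               for e, a in zip(expected, actual) if e > 0 and a > 0)
```

A stable model shows PSI near zero; a shift in the scored population pushes it upward, which is exactly the kind of signal audit can use to target follow-up work.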
9. Practical Tips for CAEs and Internal Audit Leaders
To make AI assurance real—without overwhelming the audit function—CAEs can take a staged approach:
- Acknowledge AI explicitly in the audit universe and risk assessment.
- Partner with Dawgen for initial DAGEI™, DALA™ and DGACF™ engagements on 1–3 critical AI systems.
- Build internal capability gradually—pairing internal auditors with Dawgen’s specialists to transfer knowledge.
- Integrate AI into existing audit themes (e.g., credit risk, AML, customer conduct, IT risk) instead of treating it as a separate silo.
- Use DCAMA™ as a force multiplier—especially where internal resources are constrained.
- Report clearly to the audit committee: what AI systems have been reviewed, key findings and remediation status, and how AI will feature in next year’s audit plan.
Over time, internal audit evolves from “AI is scary and opaque” to “AI is another risk area we can handle—with the right tools and partners.”
10. Benefits of an AI-Savvy Internal Audit Function
When internal audit embraces AI assurance with Dawgen’s support, organisations gain:
- Better risk insight – a clearer view of where AI is used, how it behaves, and where exposure lies.
- Stronger regulatory posture – credible evidence that independent assurance has been applied to critical AI systems.
- Improved collaboration – between data teams, risk, compliance, internal audit and the board.
- Higher-quality AI deployments – because assurance feedback is fed into the design and lifecycle from the start.
- Greater stakeholder confidence – that AI is being used thoughtfully and responsibly, not recklessly.
For internal audit itself, AI assurance becomes a career-enhancing capability, positioning the function as a strategic advisor, not just a compliance checker.
Next Step: Partner with Dawgen Global to Audit AI Effectively
Internal audit and risk functions are pivotal to ensuring AI is controlled, compliant and aligned with organisational values. But they do not have to do it alone.
Dawgen Global’s proprietary methodologies—Dawgen AI Lifecycle Assurance (DALA)™, Dawgen Generative AI Controls Framework (DGACF)™, Dawgen AI Governance & Ethics Index (DAGEI)™, and Dawgen Continuous AI Monitoring & Assurance (DCAMA)™—are designed to plug directly into your risk-based audit plan and three-lines-of-defence model.
At Dawgen Global, we help you make Smarter and More Effective Decisions—including how you oversee and assure AI.
📧 To explore how Dawgen can support your internal audit and risk teams in auditing AI, email [email protected] to request a tailored AI assurance and internal audit support proposal.
Our multidisciplinary team will work with your CAE, CRO and management to design engagements that fit your risk profile, regulatory context, and resource constraints—so your internal audit function can confidently step into the age of AI.
About Dawgen Global
“Embrace BIG FIRM capabilities without the big firm price at Dawgen Global, your committed partner in carving a pathway to continual progress in the vibrant Caribbean region. Our integrated, multidisciplinary approach is finely tuned to address the unique intricacies and lucrative prospects that the region has to offer. Offering a rich array of services, including audit, accounting, tax, IT, HR, risk management, and more, we facilitate smarter and more effective decisions that set the stage for unprecedented triumphs. Let’s collaborate and craft a future where every decision is a stepping stone to greater success. Reach out to explore a partnership that promises not just growth but a future beaming with opportunities and achievements.”
Email: [email protected]
Visit: Dawgen Global Website
WhatsApp Global Number: +1 555-795-9071
Caribbean Office: +1 876-665-5926 / 876-929-3670 / 876-926-5210
USA Office: 855-354-2447
Join hands with Dawgen Global. Together, let’s venture into a future brimming with opportunities and achievements.

