How do you know your AI provider hasn't silently changed the model — or the MCP server — you approved?

AI DNA Integrity Verification

Behavioural fingerprinting for AI models and MCP servers. Detect silent model swaps, verify deployment integrity, and stream verification events to your SIEM — all from a single binary.

Australia/NZ ISM Ready
Zero Data Storage
Single Binary
8 Industry Genome Profiles
8 SIEM Integrations

The Hidden Risk in Your AI Infrastructure

Organisations deploying AI face critical integrity challenges that can lead to compliance violations, security breaches, and operational failures:

  • Silent Model Updates: AI providers update models without notification, potentially changing behaviour and breaking compliance
  • Accidental Misconfigurations: IT teams may deploy the wrong model version or endpoint without realising it
  • MCP Server Drift: Model Context Protocol servers can silently change tool manifests, capabilities, or timing profiles
  • No Audit Trail: Regulated industries require proof of which AI model version was used, but most organisations lack the tooling

At a glance:

  • 4 dimensions of genome verification analysis
  • SHA256 cryptographic fingerprint hashing for tamper-proof integrity
  • 5 MCP server verification dimensions
  • 8 SIEM platforms supported out of the box with mTLS

Complete AI Integrity Platform

Multi-dimensional behavioural fingerprinting with enterprise SIEM integration and MCP server verification

Behavioural Fingerprinting

Creates unique DNA signatures by analysing the construction and characteristics of your model and MCP server across 60+ probe interactions.

Multi-Dimensional Verification

Compares live model outputs across multiple verification dimensions with configurable weights and thresholds for precise match scoring.

MCP Server Fingerprinting

Fingerprints Model Context Protocol servers via JSON-RPC 2.0, capturing multiple verification dimensions for comprehensive server integrity analysis.

Cryptographic Integrity

Every fingerprint includes a SHA256 hash for tamper-proof verification. Immutable audit trails prove exactly which model version was deployed.
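As a complementary check, the saved fingerprint file itself can be hashed at creation time so later tampering with the file is detectable. This sketch uses standard coreutils and a stand-in baseline file; it is independent of the SHA256 DNAI embeds in the fingerprint.

```shell
# Record an independent SHA256 of the saved baseline at fingerprint time.
# The JSON below is a stand-in for a real file produced by dnai -savefp.
printf '{"model":"gpt-4o","probes":60}' > baseline.json

sha256sum baseline.json > baseline.json.sha256   # store alongside the baseline

# Later, before trusting the baseline in a verification run:
sha256sum -c baseline.json.sha256 && echo "baseline file intact"
```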

Industry Genome Profiles

Eight domain-specific genome profiles — general, medical, finance, energy, legal, cybersecurity, education, and retail — plus support for custom profile files.

SIEM Streaming

Stream verification events to Splunk, Azure Sentinel, Datadog, Elasticsearch, Grafana Loki, or any webhook endpoint with retry, backoff, and mTLS.

Supported AI Providers

Fingerprint and verify models from any major provider or local deployment

  • OpenAI: GPT-4o, GPT-4, GPT-3.5
  • Anthropic: Claude Opus, Sonnet, Haiku
  • Google Gemini: Gemini 1.5 Pro, Flash
  • Azure OpenAI: any Azure-hosted deployment
  • OpenAI-compatible: Ollama, vLLM, LM Studio

Built-in Industry-Specific Genome Profiles

Each genome profile contains 55 expert-crafted analysis sequences across domain subcategories, producing far more discriminating fingerprints than generic probing

General

  • Algorithms & Data Structures
  • Operating Systems
  • Networking & Protocols
  • Databases & Storage
  • Software Engineering
  • Distributed Systems
  • Security Fundamentals
  • AI & Machine Learning

Medical

  • Clinical Decision-Making
  • Medical Ethics & Consent
  • Pharmacology
  • Diagnostic Reasoning
  • Public Health
  • Health Informatics
  • Clinical Trials
  • Patient Safety

Finance

  • Risk Management
  • Regulatory & Compliance
  • Trading & Markets
  • Portfolio Theory
  • Financial Instruments
  • Valuation & Accounting
  • Fintech & Digital Finance
  • Corporate Finance

Energy

  • Grid Operations
  • Renewable Energy
  • Energy Storage
  • Oil & Gas
  • Energy Markets
  • Nuclear Energy
  • Energy Efficiency
  • Energy Policy

Legal

  • Contract Law
  • Intellectual Property
  • Employment Law
  • Privacy & Data Protection
  • Corporate Governance
  • Dispute Resolution
  • Regulatory Compliance
  • International Law

Cybersecurity

  • Threat Analysis
  • Network Security
  • Cryptography
  • Incident Response
  • Application Security
  • Identity & Access
  • Cloud Security
  • Forensics

Education

  • Learning Theory
  • Curriculum Design
  • Assessment & Evaluation
  • Educational Technology
  • Inclusion & Accessibility
  • Teacher Development
  • Higher Education
  • Research Methods

Retail

  • Supply Chain & Logistics
  • Customer Experience & Loyalty
  • Merchandising & Pricing
  • E-Commerce & Omnichannel
  • Store Operations
  • Retail Analytics & Data
  • Loss Prevention & Shrinkage
  • Retail Technology
  • Consumer Behaviour
  • Retail Regulation & Compliance
Why industry genome profiles matter: A medical AI fingerprinted with a clinical genome profile produces a far more discriminating signature than one analysed with generic sequences. You can also supply your own profile via -promptfile.

How DNAI Works

Three-step workflow: fingerprint, verify, alert

1. Fingerprint

Probe your approved AI model 60+ times to build a behavioural DNA baseline. DNAI analyses the construction and characteristics of your model and MCP server to create a unique cryptographic fingerprint.

2. Verify

Compare live model outputs against your stored baseline. Multi-dimensional matching detects silent model swaps, version changes, parameter modifications, and behavioural drift.

3. Alert

Stream structured verification events to your SIEM platform. Anomalies trigger immediate alerts with detailed match scores for your security and compliance teams.

Built for Enterprise AI Governance

Maintain AI integrity and compliance across your organisation

Healthcare

Verify medical AI assistants use only TGA-approved model versions with complete audit trails for patient safety and regulatory compliance.

Financial Services

Ensure AI-powered risk assessment and trading systems use approved models. Detect unauthorised changes that could impact financial decisions.

Legal & Regulatory

Maintain chain of custody for AI-generated legal documents, proving specific model versions were used to meet court and regulatory requirements.

Enterprise IT

Prevent shadow AI usage and ensure all departments use security-approved models with centralised governance and policy enforcement.

Research & Development

Track model versions used in experiments for reproducibility. Ensure research integrity with accurate documentation of AI methodologies.

CI/CD Pipelines

Integrate fingerprint verification into deployment pipelines. Automatically gate releases when AI model integrity checks fail.
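A minimal sketch of such a gate, assuming dnai exits non-zero when verification fails (confirm this against your version's behaviour). The real dnai invocation is shown in a comment and replaced by a stand-in so the sketch runs anywhere:

```shell
#!/bin/sh
set -u

verify_model() {
  # A real pipeline would run something like:
  #   dnai -provider openai -model gpt-4o \
  #        -loadfp baseline.json -probe 30 -json \
  #        -license-key "$DNAI_LICENSE_KEY"
  true   # stand-in so the sketch is runnable without a license key
}

if verify_model; then
  echo "AI integrity check passed: releasing"
else
  echo "AI integrity check FAILED: blocking release" >&2
  exit 1
fi
```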

Enterprise SIEM Integrations

Stream structured verification events directly to your security operations platform with retry, backoff, and mTLS support

  • Splunk: HTTP Event Collector
  • Azure Sentinel: Log Analytics API
  • Datadog: Logs API
  • Elasticsearch / OpenSearch: REST API
  • Grafana Loki: Push API
  • Webhook: any HTTP endpoint

Technical Specifications

Built with security, scalability, and compliance at its core

AI Model Verification

  • Multi-dimensional genome match analysis
  • Configurable verification weights & thresholds
  • SHA256 fingerprint hashing
  • 60+ probe interaction baseline
  • Silent model swap detection
  • Behavioural drift monitoring

MCP Server Verification

  • JSON-RPC 2.0 protocol probing
  • Multi-dimensional server integrity analysis
  • Tool manifest SHA256 hashing
  • Comprehensive server genome profiling
  • Configurable verification thresholds
  • Silent server swap detection

SIEM & Security

  • Exponential backoff with jitter
  • Configurable retries (default 5)
  • mTLS client certificate support
  • Custom CA bundle (PEM)
  • Structured event schema v1.0
  • Automatic HTTP 429/5xx retry
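The retry behaviour listed above follows the standard exponential-backoff-with-jitter pattern. This sketch only prints the computed delays; the base delay and jitter range are illustrative numbers, not DNAI's actual defaults:

```shell
max_retries=5
base_ms=500

attempt=0
while [ "$attempt" -lt "$max_retries" ]; do
  backoff_ms=$(( base_ms * (1 << attempt) ))   # 500, 1000, 2000, 4000, 8000
  jitter_ms=$(( RANDOM % 250 ))                # 0-249 ms of random spread (bash)
  echo "attempt $(( attempt + 1 )): retry after $(( backoff_ms + jitter_ms )) ms"
  attempt=$(( attempt + 1 ))
done
```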

Deployment & Operations

  • Single binary, zero dependencies
  • Checkpoint/resume for long probes
  • Concurrent probe workers (1–8)
  • JSON or human-readable output
  • Fingerprint diff comparison mode
  • Environment variable configuration

User Guide

Get started with DNAI in minutes. Fingerprint, verify, and monitor your AI models from the command line.

1 Quick Start — Fingerprint an AI Model

Probe your AI provider 60 times to build a behavioural DNA baseline. The fingerprint captures the construction and characteristics of your model across multiple verification dimensions.

Fingerprint with OpenAI
# Set your API key
export OPENAI_API_KEY="sk-..."

# Fingerprint GPT-4o with 60 probes
dnai -provider openai \
     -model gpt-4o \
     -analysegenome 60 \
     -savefp gpt4o-baseline.json \
     -license-key YOUR_LICENSE_KEY
Fingerprint with Anthropic
# Set your API key
export ANTHROPIC_API_KEY="sk-ant-..."

# Fingerprint Claude Sonnet
dnai -provider anthropic \
     -model claude-sonnet-4-20250514 \
     -analysegenome 60 \
     -savefp sonnet-baseline.json \
     -license-key YOUR_LICENSE_KEY
Fingerprint with Google Gemini
# Set your API key
export GOOGLE_API_KEY="AIza..."

# Fingerprint Gemini 1.5 Pro
dnai -provider gemini \
     -model gemini-1.5-pro \
     -analysegenome 60 \
     -savefp gemini-baseline.json \
     -license-key YOUR_LICENSE_KEY
Fingerprint with Azure OpenAI
# Set your Azure API key
export OPENAI_API_KEY="your-azure-key"

# Fingerprint Azure-hosted deployment
dnai -provider openai_compat \
     -endpoint "https://myorg.openai.azure.com/openai/deployments/gpt4o/chat/completions?api-version=2024-02-15-preview" \
     -model gpt-4o \
     -azure \
     -analysegenome 60 \
     -savefp azure-gpt4o-baseline.json \
     -license-key YOUR_LICENSE_KEY
Fingerprint with Ollama / Local
# No API key needed for local models
dnai -provider openai_compat \
     -endpoint "http://localhost:11434/v1/chat/completions" \
     -model llama3 \
     -analysegenome 60 \
     -savefp llama3-baseline.json \
     -license-key YOUR_LICENSE_KEY

2 Verify a Model Against Its Baseline

Load a saved fingerprint and probe the live model to check if it still matches. DNAI reports match scores across all verification dimensions and flags anomalies.

Verification
# Verify GPT-4o against its saved fingerprint
dnai -provider openai \
     -model gpt-4o \
     -loadfp gpt4o-baseline.json \
     -probe 30 \
     -json \
     -license-key YOUR_LICENSE_KEY

# Output includes match result, confidence score, and detailed analysis
Deterministic Verification: DNAI automatically replays the exact genome fingerprinting sequence during verification, ensuring the most accurate comparison. Use -json for structured output suitable for automation.
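For automation, the -json output can be parsed to gate on a score. The sample payload and the "confidence" field name below are assumptions for illustration; inspect your actual dnai -json output to confirm the real schema:

```shell
# Stand-in for: result=$(dnai -provider openai -model gpt-4o -loadfp ... -probe 30 -json ...)
result='{"match": true, "confidence": 0.97}'

# Extract the assumed "confidence" field with sed (no jq dependency)
confidence=$(printf '%s' "$result" | sed -n 's/.*"confidence": *\([0-9.]*\).*/\1/p')

# Gate on an example threshold of 0.90
if awk -v c="$confidence" 'BEGIN { exit (c >= 0.90) ? 0 : 1 }'; then
  echo "confidence $confidence: OK"
else
  echo "confidence $confidence: below threshold" >&2
fi
```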

3 MCP Server Fingerprinting

Fingerprint and verify Model Context Protocol servers. DNAI probes the server via JSON-RPC 2.0, performing multi-dimensional integrity analysis to create a comprehensive server genome.

MCP Fingerprint & Verify
# Fingerprint an MCP server
dnai -mcpprobe "http://localhost:3000/mcp" \
     -mcpsave mcp-baseline.json \
     -license-key YOUR_LICENSE_KEY

# Later: verify the live server against the baseline
dnai -mcpprobe "http://localhost:3000/mcp" \
     -mcpload mcp-baseline.json \
     -mcpverify \
     -mcpjson \
     -license-key YOUR_LICENSE_KEY

4 SIEM Integration

Stream verification events to your SIEM platform. All SIEM modes support retry with exponential backoff, configurable timeouts, and mTLS.

Splunk HEC
dnai -provider openai -model gpt-4o \
     -loadfp gpt4o-baseline.json -probe 30 \
     -siem splunk \
     -splunk.url "https://splunk.example.com:8088" \
     -splunk.token "your-hec-token" \
     -splunk.index "ai_integrity" \
     -license-key YOUR_LICENSE_KEY
Azure Sentinel
dnai -provider openai -model gpt-4o \
     -loadfp gpt4o-baseline.json -probe 30 \
     -siem sentinel \
     -sentinel.workspace "your-workspace-id" \
     -sentinel.key "your-shared-key-base64" \
     -license-key YOUR_LICENSE_KEY
Datadog
dnai -provider openai -model gpt-4o \
     -loadfp gpt4o-baseline.json -probe 30 \
     -siem datadog \
     -datadog.api-key "your-dd-api-key" \
     -datadog.site "datadoghq.com" \
     -license-key YOUR_LICENSE_KEY
Elasticsearch / OpenSearch
dnai -provider openai -model gpt-4o \
     -loadfp gpt4o-baseline.json -probe 30 \
     -siem elastic \
     -elastic.url "https://localhost:9200" \
     -elastic.index "ai-dna-verifications" \
     -elastic.api-key "your-api-key" \
     -license-key YOUR_LICENSE_KEY
Grafana Loki
dnai -provider openai -model gpt-4o \
     -loadfp gpt4o-baseline.json -probe 30 \
     -siem loki \
     -loki.url "http://localhost:3100" \
     -loki.labels "app=ai-dna,env=prod" \
     -license-key YOUR_LICENSE_KEY
Generic Webhook
dnai -provider openai -model gpt-4o \
     -loadfp gpt4o-baseline.json -probe 30 \
     -siem webhook \
     -webhook.url "https://hooks.example.com/ai-integrity" \
     -webhook.headers "Authorization:Bearer tok123,X-Source:dnai" \
     -license-key YOUR_LICENSE_KEY

5 Container Deployment

Run DNAI in any OCI-compatible container runtime. The examples below cover Docker and Kubernetes — create a fingerprint once with persistent storage, then monitor continuously from the same volume.

Docker / Docker Compose

Dockerfile
# DNAI — AI Genome Fingerprinting Container
FROM alpine:latest

# Download latest DNAI binary
RUN apk add --no-cache curl \
    && curl -fSL -o /usr/local/bin/dnai \
       "https://www.cyberautomation.com.au/DNAI/dist/linux-amd64/DNAI" \
    && chmod +x /usr/local/bin/dnai

# Persistent volume for fingerprints
VOLUME /data

ENTRYPOINT ["dnai"]
docker-compose.yml
version: "3.8"
services:
  dnai-fingerprint:
    build: .
    volumes:
      - dnai-data:/data
    environment:
      - OPENAI_API_KEY=sk-your-api-key         # ← Replace
    command:
      - -provider
      - openai                                # ← Your provider
      - -model
      - gpt-4o                                # ← Your model
      - -analysegenome
      - "60"
      - -savefp
      - /data/baseline.json
      - -license-key
      - YOUR_LICENSE_KEY                      # ← Replace

volumes:
  dnai-data:
Usage: Run docker compose up dnai-fingerprint to create the genome fingerprint. Replace the provider, model, API key, and license key with your values. The fingerprint is saved to the dnai-data volume at /data/baseline.json.
docker-compose.yml — Verification Service
version: "3.8"
services:
  dnai-monitor:
    build: .                                  # Same Dockerfile as above
    volumes:
      - dnai-data:/data                          # Same volume — reads baseline.json
    environment:
      - OPENAI_API_KEY=sk-your-api-key         # ← Replace
    command:
      - -provider
      - openai
      - -model
      - gpt-4o
      - -loadfp
      - /data/baseline.json                    # Fingerprint from step above
      - -probe
      - "30"
      - -json
      - -license-key
      - YOUR_LICENSE_KEY                      # ← Replace
      # Optional: stream results to your SIEM
      # - -siem
      # - splunk
      # - -splunk.url
      # - "https://splunk.example.com:8088"
      # - -splunk.token
      # - "your-hec-token"

volumes:
  dnai-data:
Scheduled Monitoring (cron)
# Run verification every 6 hours via cron
# Add to crontab: crontab -e
0 */6 * * * docker compose run --rm dnai-monitor >> /var/log/dnai-monitor.log 2>&1
Shared Storage: This uses the same dnai-data volume as the fingerprinting container, so it reads the baseline created in the previous step. Schedule docker compose run --rm dnai-monitor via cron, systemd timer, or your CI/CD pipeline for continuous integrity monitoring.
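For hosts using systemd rather than cron, an equivalent timer looks like the sketch below. The unit names and the /opt/dnai working directory are examples, not fixed paths:

```ini
# /etc/systemd/system/dnai-monitor.service
[Unit]
Description=DNAI model integrity verification

[Service]
Type=oneshot
WorkingDirectory=/opt/dnai
ExecStart=/usr/bin/docker compose run --rm dnai-monitor

# /etc/systemd/system/dnai-monitor.timer
[Unit]
Description=Run DNAI verification every 6 hours

[Timer]
OnCalendar=*-*-* 00/6:00:00
Persistent=true

[Install]
WantedBy=timers.target
```

Enable with systemctl enable --now dnai-monitor.timer.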

Kubernetes

dnai-secret.yaml
# Create with: kubectl create secret generic dnai-secrets \
#   --from-literal=api-key=sk-your-api-key \
#   --from-literal=license-key=YOUR_LICENSE_KEY
apiVersion: v1
kind: Secret
metadata:
  name: dnai-secrets
type: Opaque
stringData:
  api-key: sk-your-api-key            # ← Replace
  license-key: YOUR_LICENSE_KEY       # ← Replace
dnai-pvc.yaml
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: dnai-data
spec:
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 1Gi
dnai-fingerprint-job.yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: dnai-fingerprint
spec:
  backoffLimit: 2
  template:
    spec:
      initContainers:
        - name: download-dnai
          image: alpine:latest
          command: ["sh", "-c"]
          args:
            - |
              apk add --no-cache curl &&
              curl -fSL -o /data/dnai \
                "https://www.cyberautomation.com.au/DNAI/dist/linux-amd64/DNAI" &&
              chmod +x /data/dnai
          volumeMounts:
            - name: dnai-data
              mountPath: /data
      containers:
        - name: dnai
          image: alpine:latest
          command: ["/data/dnai"]
          args:
            - "-provider"
            - "openai"                    # ← Your provider
            - "-model"
            - "gpt-4o"                    # ← Your model
            - "-analysegenome"
            - "60"
            - "-savefp"
            - "/data/baseline.json"
            - "-license-key"
            - "$(LICENSE_KEY)"
          env:
            - name: OPENAI_API_KEY       # ← Match your provider's env var
              valueFrom:
                secretKeyRef:
                  name: dnai-secrets
                  key: api-key
            - name: LICENSE_KEY
              valueFrom:
                secretKeyRef:
                  name: dnai-secrets
                  key: license-key
          volumeMounts:
            - name: dnai-data
              mountPath: /data
      volumes:
        - name: dnai-data
          persistentVolumeClaim:
            claimName: dnai-data
      restartPolicy: Never
Usage: Apply with kubectl apply -f dnai-secret.yaml -f dnai-pvc.yaml -f dnai-fingerprint-job.yaml. The init container downloads the latest DNAI binary, then the main container runs the genome fingerprint analysis. The fingerprint is persisted to the dnai-data PVC at /data/baseline.json. Replace the provider, model, API key, and license key with your values.
dnai-monitor-cronjob.yaml
apiVersion: batch/v1
kind: CronJob
metadata:
  name: dnai-monitor
spec:
  schedule: "0 */6 * * *"              # Every 6 hours
  concurrencyPolicy: Forbid
  successfulJobsHistoryLimit: 5
  failedJobsHistoryLimit: 3
  jobTemplate:
    spec:
      backoffLimit: 2
      template:
        spec:
          initContainers:
            - name: download-dnai
              image: alpine:latest
              command: ["sh", "-c"]
              args:
                - |
                  apk add --no-cache curl &&
                  curl -fSL -o /data/dnai \
                    "https://www.cyberautomation.com.au/DNAI/dist/linux-amd64/DNAI" &&
                  chmod +x /data/dnai
              volumeMounts:
                - name: dnai-data
                  mountPath: /data
          containers:
            - name: dnai
              image: alpine:latest
              command: ["/data/dnai"]
              args:
                - "-provider"
                - "openai"
                - "-model"
                - "gpt-4o"
                - "-loadfp"
                - "/data/baseline.json"   # Fingerprint from Job above
                - "-probe"
                - "30"
                - "-json"
                - "-license-key"
                - "$(LICENSE_KEY)"
                # Optional: stream results to your SIEM
                # - "-siem"
                # - "splunk"
                # - "-splunk.url"
                # - "https://splunk.example.com:8088"
                # - "-splunk.token"
                # - "your-hec-token"
              env:
                - name: OPENAI_API_KEY
                  valueFrom:
                    secretKeyRef:
                      name: dnai-secrets
                      key: api-key
                - name: LICENSE_KEY
                  valueFrom:
                    secretKeyRef:
                      name: dnai-secrets
                      key: license-key
              volumeMounts:
                - name: dnai-data
                  mountPath: /data
          volumes:
            - name: dnai-data
              persistentVolumeClaim:
                claimName: dnai-data
          restartPolicy: Never
Shared Storage: This CronJob uses the same dnai-data PersistentVolumeClaim as the fingerprinting Job, so it reads the baseline automatically. The schedule runs every 6 hours — adjust the cron expression to suit your monitoring frequency. Apply with kubectl apply -f dnai-monitor-cronjob.yaml.

6 Advanced Usage

Fingerprint Diff
# Compare two fingerprints side-by-side
dnai -diff1 gpt4o-jan.json -diff2 gpt4o-mar.json \
     -license-key YOUR_LICENSE_KEY
Industry-Specific Genome Profile
# Use finance-specific genome profile for fingerprinting
dnai -provider openai -model gpt-4o \
     -analysegenome 60 \
     -promptset finance \
     -savefp gpt4o-finance.json \
     -license-key YOUR_LICENSE_KEY

# Available sets: general, medical, finance, energy, legal, cybersecurity, education, retail
Checkpoint & Resume
# Save progress during long probes (resumes on interrupt)
dnai -provider openai -model gpt-4o \
     -analysegenome 100 \
     -checkpoint gpt4o-progress.jsonl \
     -savefp gpt4o-baseline.json \
     -workers 5 \
     -license-key YOUR_LICENSE_KEY
Custom Thresholds & Weights
# Tune verification sensitivity with custom weights
dnai -provider openai -model gpt-4o \
     -loadfp gpt4o-baseline.json -probe 30 \
     -license-key YOUR_LICENSE_KEY
# Use dnai -h to see all available weight and threshold flags

CLI Reference

DNAI ships with a comprehensive set of flags for fingerprinting, verification, MCP probing, SIEM integration, and more. Run dnai -h to see the full list of options and their defaults.

Environment Variables: API keys and license keys can be configured via environment variables for seamless CI/CD integration. See dnai -h for details.

Download DNAI

Single binary, zero dependencies. Download for your platform, add your license key, and start fingerprinting.

  • Windows x86_64 (AMD64): DNAI.exe
  • macOS Apple Silicon (ARM64, M1 / M2 / M3 / M4): DNAI
  • macOS Intel (x86_64): DNAI
  • Linux x86_64 (AMD64): binary, .deb, .rpm
  • Linux ARM64 (AArch64): binary, .deb
  • Linux ARMv7 (armhf): binary, .deb

Requirements: A valid license key is required. Run dnai -h for usage. All binaries are code-signed and notarised where applicable.

Need a License Key or Have Questions?

Contact our team to get a license key, request a demo, or discuss enterprise deployment options.