Getting Started With NeMo Auditor


NVIDIA NeMo Auditor audits LLMs by probing them with a variety of prompts to identify vulnerabilities. You can use the results to help assess model and system safety. Auditor is powered by the open source LLM scanner Garak.

In this tutorial we will audit an LLM hosted on build.nvidia.com.

Typical Audit Workflow

The audit workflow covered in this notebook is:

  1. Create an audit target that specifies the LLM model to probe.
  2. Create an audit configuration.
  3. Run an audit job.
  4. View the audit results.

NeMo Auditor Microservice Deployment

Deployment options

Auditor deploys as part of the NeMo Platform. See NeMo Platform documentation for help setting up the platform.

Prerequisites

Before starting this tutorial, ensure you have the following:

  • NeMo Platform up and running
  • An API key for accessing build.nvidia.com

Instantiate NeMo SDK client

Point the client at your NeMo Platform endpoint and call the Auditor info method to confirm that the Auditor service is ready.

[45]
{'platform_version': '2.0.0',
 'version': '25.12',
 'garak.__version__': '0.14.0'}
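A minimal sketch of a readiness check against the info response shown above. The set of required keys is an assumption drawn from this particular output, not a documented contract.

```python
# Readiness check based on the Auditor info dict shown above.
# REQUIRED_KEYS is an assumption inferred from that output.
REQUIRED_KEYS = ("platform_version", "version", "garak.__version__")

def auditor_ready(info: dict) -> bool:
    """True when every expected version field is present and non-empty."""
    return all(info.get(key) for key in REQUIRED_KEYS)

info = {
    "platform_version": "2.0.0",
    "version": "25.12",
    "garak.__version__": "0.14.0",
}
print(auditor_ready(info))  # True
```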

Basic Audit Target

When you run an audit job in NVIDIA NeMo Auditor, you create a separate audit target and audit configuration for the job. The target specifies the model name, model type, and free-form key-value pairs for model-specific inference options.

In this notebook, our target model will be NVIDIA Llama 3.1 Nemotron Nano V1 8B NIM.

We will add an Inference Gateway provider, which provides access to models hosted on build.nvidia.com. The target specifies the LLM model to audit and the provider used to reach it.

1. Add your build.nvidia.com API key as a secret

The inference provider will use this secret to access the model that we will audit.
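The secret can be assembled locally before sending it to the platform. The field names below mirror the PlatformSecretResponse shown in this cell's output; the exact request schema is an assumption.

```python
from getpass import getpass  # used to read the key interactively, as in the notebook

def build_secret_payload(name: str, api_key: str, workspace: str = "default") -> dict:
    """Assemble a secret-creation request. Field names mirror the
    PlatformSecretResponse above; the request shape itself is an assumption."""
    if not api_key:
        raise ValueError("API key must not be empty")
    return {"name": name, "workspace": workspace, "value": api_key}

# Interactive use, as in the notebook cell:
# payload = build_secret_payload("nim-api-key", getpass("Enter your NIM API Key: "))
```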

[46]
Enter your NIM API Key:  ········
PlatformSecretResponse(name='nim-api-key', workspace='default', created_at='2026-03-12T21:33:20.117133', description=None, updated_at='2026-03-12T21:33:20.117139')

2. Add an inference provider

This inference provider will point at the build.nvidia.com inference endpoint (https://integrate.api.nvidia.com) and provide access to the models hosted there.
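A sketch of the provider definition. The field names mirror the ModelProvider response in this cell's output; whether the create call accepts exactly this shape is an assumption.

```python
def build_provider_payload(name: str, host_url: str, api_key_secret_name: str,
                           workspace: str = "default") -> dict:
    """Field names mirror the ModelProvider response; request schema is an assumption."""
    return {
        "name": name,
        "workspace": workspace,
        "host_url": host_url,
        # Reference the secret created in step 1, so the provider can authenticate
        "api_key_secret_name": api_key_secret_name,
    }

provider = build_provider_payload(
    "build", "https://integrate.api.nvidia.com", "nim-api-key"
)
```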

[47]
ModelProvider(created_at=datetime.datetime(2026, 3, 12, 21, 33, 33, 605621), host_url='https://integrate.api.nvidia.com', name='build', updated_at=datetime.datetime(2026, 3, 12, 21, 33, 33, 605630), workspace='default', id='model-provider-8pk5a845qYHyZaqzr3x6zS', api_key_secret_name='nim-api-key', auth_context=AuthContext(principal_id='', principal_email=None, principal_groups=[], principal_on_behalf_of=None), default_extra_body=None, default_extra_headers=None, description=None, enabled_models=None, model_deployment_id=None, project=None, required_extra_body=None, required_extra_headers=None, served_models=[], status='CREATED', status_message='Model provider created')

3. Add the target

The target specifies the model and associated options. Within the options, the provider created above is referenced.
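The target definition can be sketched as a plain dict. Every value below is taken from the AuditTarget response in this cell's output; whether the create call accepts exactly this shape is an assumption.

```python
# Target definition mirroring the AuditTarget response shown in this cell.
target = {
    "name": "demo-simple-target",
    "type": "nim",
    "model": "nvidia/llama-3.1-nemotron-nano-8b-v1",
    "options": {
        "nim": {
            # Strip the reasoning-model "thinking" span from responses
            "skip_seq_start": "<think>",
            "skip_seq_end": "</think>",
            "max_tokens": 4000,
            # Route inference through the "build" provider created in step 2
            "nmp_uri_spec": {
                "inference_gateway": {"workspace": "default", "provider": "build"}
            },
        }
    },
}
```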

[48]
AuditTarget(id='audit-target-FCbYfG1jzTr1o3SAX6J56G', created_at='2026-03-12T21:33:41.975704', created_by='service:platform', entity_id='audit-target-FCbYfG1jzTr1o3SAX6J56G', model='nvidia/llama-3.1-nemotron-nano-8b-v1', parent=None, type='nim', updated_at='2026-03-12T21:33:41.975709', updated_by='service:platform', workspace='default', description=None, name='demo-simple-target', options={'nim': {'skip_seq_start': '<think>', 'skip_seq_end': '</think>', 'max_tokens': 4000, 'nmp_uri_spec': {'inference_gateway': {'workspace': 'default', 'provider': 'build'}}}}, project=None)

Check the docs to learn how to update or delete an existing target.

Basic Audit Configuration

The configuration describes which probes to run and the parameters of the garak audit.
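A sketch of the configuration as a dict. The values below are taken from the AuditConfig response in this cell's output; the request shape and the comments on each field are assumptions based on garak's settings.

```python
# Configuration values mirroring the AuditConfig response shown in this cell.
audit_config = {
    "name": "demo-simple-config",
    "plugins": {
        # Two probes: an AutoDAN jailbreak probe and a tag-injection probe
        "probe_spec": "dan.AutoDANCached,goodside.Tag",
        "detector_spec": "auto",  # let garak choose detectors per probe
    },
    "run": {
        "generations": 7,        # responses generated per prompt
        "eval_threshold": 0.5,   # detector score threshold used in evaluation
    },
    "system": {"parallel_attempts": 32},
    "reporting": {"report_prefix": "run1"},
}
```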

[49]
AuditConfig(id='audit-config-DEW15RM2BSQd4CitKhwfXW', created_at='2026-03-12T21:33:45.986205', created_by='service:platform', entity_id='audit-config-DEW15RM2BSQd4CitKhwfXW', parent=None, updated_at='2026-03-12T21:33:45.986210', updated_by='service:platform', workspace='default', description=None, name='demo-simple-config', plugins=AuditPluginsData(buff_max=None, buff_spec=None, buffs={}, buffs_include_original_prompt=False, detector_spec='auto', detectors={}, extended_detectors=False, generators={}, harnesses={}, model_name=None, model_type=None, probe_spec='dan.AutoDANCached,goodside.Tag', probes={}), project=None, reporting=AuditReportData(report_dir='garak_runs', report_prefix='run1', show_100_pass_modules=True, taxonomy=None), run=AuditRunData(deprefix=True, eval_threshold=0.5, generations=7, probe_tags=None, seed=None, user_agent='garak/{version} (LLM vulnerability scanner https://garak.ai)'), system=AuditSystemData(enable_experimental=False, lite=True, narrow_output=False, parallel_attempts=32, parallel_requests=False, show_z=False, verbose=0))

Check the docs to learn how to update or delete an existing audit configuration.

Run and Manage Audit Jobs

After you create an audit target and an audit configuration, you are ready to run an audit job.
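A job ties the target and configuration together. The AuditJob response in this cell embeds the full target and config, but the create-job request schema is not shown, so the shape below, referencing the IDs from the earlier responses, is hypothetical.

```python
def build_job_request(name: str, target_id: str, config_id: str,
                      workspace: str = "default") -> dict:
    """Hypothetical create-job request: the response embeds the resolved target
    and config, but the exact request schema is an assumption."""
    return {
        "name": name,
        "workspace": workspace,
        "target": target_id,
        "config": config_id,
    }

job = build_job_request(
    "demo-simple-job",
    "audit-target-FCbYfG1jzTr1o3SAX6J56G",   # from the target response above
    "audit-config-DEW15RM2BSQd4CitKhwfXW",   # from the config response above
)
```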

[50]
AuditJob(name='demo-simple-job', spec=AuditJobConfig(config={'name': '', 'workspace': 'default', 'project': None, 'description': None, 'system': {'verbose': 0, 'narrow_output': False, 'parallel_requests': False, 'parallel_attempts': 32, 'lite': True, 'show_z': False, 'enable_experimental': False}, 'run': {'seed': None, 'deprefix': True, 'eval_threshold': 0.5, 'generations': 7, 'probe_tags': None, 'user_agent': 'garak/{version} (LLM vulnerability scanner https://garak.ai)'}, 'plugins': {'model_type': None, 'model_name': None, 'probe_spec': 'dan.AutoDANCached,goodside.Tag', 'detector_spec': 'auto', 'extended_detectors': False, 'buff_spec': None, 'buffs_include_original_prompt': False, 'buff_max': None, 'detectors': {}, 'generators': {}, 'buffs': {}, 'harnesses': {}, 'probes': {}}, 'reporting': {'report_prefix': 'run1', 'taxonomy': None, 'report_dir': 'garak_runs', 'show_100_pass_modules': True}, 'id': '', 'created_at': None, 'created_by': None, 'updated_at': None, 'updated_by': None, 'entity_id': '', 'parent': None}, target={'name': '', 'workspace': 'default', 'project': None, 'description': None, 'type': 'nim', 'model': 'nvidia/llama-3.1-nemotron-nano-8b-v1', 'options': {'nim': {'skip_seq_start': '<think>', 'skip_seq_end': '</think>', 'max_tokens': 4000, 'nmp_uri_spec': {'inference_gateway': {'workspace': 'default', 'provider': 'build'}}}}, 'id': '', 'created_at': None, 'created_by': None, 'updated_at': None, 'updated_by': None, 'entity_id': '', 'parent': None}, task_options=AuditTaskOptions(fail_job_on_retries_exhausted=True, max_probe_retries=0)), id='platform-job-3ErNBobRme8Yb81j7ASqf9', created_at='2026-03-12T21:33:50.744703', custom_fields=None, description=None, error_details=None, ownership=None, project=None, status='created', status_details={}, updated_at='2026-03-12T21:33:50.756946', workspace='default')

Get Audit Job Status

Now that the job is running, you can check its status until it completes.

[51]
PlatformJobStatusResponse(id='platform-job-3ErNBobRme8Yb81j7ASqf9', error_details=None, name='demo-simple-job', status='active', status_details={'message': 'Job is running'}, steps=[PlatformJobStepStatusResponse(id='platform-job-step-C2815muc3ShqSxqGaFfgQy', error_details={}, name='audit', status='active', status_details={'message': 'Job is running'}, tasks=[PlatformJobTaskStatusResponse(id='platform-job-task-YDWSkMaUouwqw2bHWj4iUR', error_details={}, error_stack='', name='task-7d4b19cc1bfd43e6b62dc87bab946f1e', status='active', status_details={'message': 'Job is running'})])])

When the job completes, its status will change from active to completed.
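The status check above can be wrapped in a simple polling loop. Because the exact SDK status call is not shown here, it is passed in as a callable; the terminal-state names are taken from the outputs in this section.

```python
import time

def wait_for_job(get_status, poll_seconds: float = 10,
                 timeout_seconds: float = 3600) -> str:
    """Poll until the job leaves a non-terminal state ('created'/'active').
    `get_status` stands in for the SDK status call, whose name is an assumption."""
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        status = get_status()
        if status not in ("created", "active"):
            return status  # e.g. 'completed' or 'error'
        time.sleep(poll_seconds)
    raise TimeoutError("audit job did not reach a terminal state in time")

# Example with a stubbed status sequence:
statuses = iter(["created", "active", "completed"])
print(wait_for_job(lambda: next(statuses), poll_seconds=0))  # completed
```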

[52]
PlatformJobStatusResponse(id='platform-job-3ErNBobRme8Yb81j7ASqf9', error_details=None, name='demo-simple-job', status='completed', status_details={'message': 'Job completed successfully with exit code 0'}, steps=[PlatformJobStepStatusResponse(id='platform-job-step-C2815muc3ShqSxqGaFfgQy', error_details={}, name='audit', status='completed', status_details={'message': 'Job completed successfully with exit code 0'}, tasks=[PlatformJobTaskStatusResponse(id='platform-job-task-YDWSkMaUouwqw2bHWj4iUR', error_details={}, error_stack='', name='task-7d4b19cc1bfd43e6b62dc87bab946f1e', status='completed', status_details={'message': 'Job completed successfully with exit code 0'})])])

If an error occurs, the status will change to error, and a descriptive message will appear in status_details.

Check the docs to learn how to pause a running job and resume it when needed.

Viewing Audit Job Results

[53]
[54]
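Garak writes its findings as JSON Lines, so a report can be summarized with a short script. The sample records below are illustrative only, not actual output from the job above, and their exact fields are an assumption based on garak's eval records.

```python
import json

# Illustrative sample; probe names come from the config above, the rest is assumed.
SAMPLE_REPORT = """\
{"entry_type": "eval", "probe": "dan.AutoDANCached", "detector": "mitigation.MitigationBypass", "passed": 5, "total": 7}
{"entry_type": "eval", "probe": "goodside.Tag", "detector": "base.TriggerListDetector", "passed": 7, "total": 7}
"""

def summarize(report_jsonl: str) -> list[tuple[str, float]]:
    """Return (probe, pass_rate) for each eval record in a garak-style report."""
    rows = []
    for line in report_jsonl.splitlines():
        if not line.strip():
            continue
        entry = json.loads(line)
        if entry.get("entry_type") == "eval" and entry.get("total"):
            rows.append((entry["probe"], entry["passed"] / entry["total"]))
    return rows

for probe, rate in summarize(SAMPLE_REPORT):
    print(f"{probe}: {rate:.0%} passed")
```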