JSON Schema¶
The JsonSchemaMetric validates that the actual output is valid JSON and conforms to a specified JSON schema.
How It Works¶
- Parses the actual output as JSON
- Validates the parsed JSON against the provided schema using the `jsonschema` library
- Returns `1.0` if validation passes, `0.0` if the output is not valid JSON or does not conform to the schema
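The steps above can be sketched as follows. This is a minimal illustration of the described behavior, not the metric's actual implementation; `score_output` is a hypothetical helper name:

```python
import json

from jsonschema import ValidationError, validate


def score_output(actual_output: str, schema: dict) -> float:
    # Step 1: parse the actual output as JSON; unparseable output scores 0.0
    try:
        parsed = json.loads(actual_output)
    except json.JSONDecodeError:
        return 0.0
    # Step 2: validate against the schema; any violation scores 0.0
    try:
        validate(instance=parsed, schema=schema)
    except ValidationError:
        return 0.0
    # Both steps succeeded
    return 1.0
```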
Parameters¶
| Parameter | Type | Default | Description |
|---|---|---|---|
| threshold | float | 1.0 | Minimum score to pass |
| schema | dict | required | JSON schema definition (follows the JSON Schema specification) |
Required Fields¶
| Field | Required |
|---|---|
| input | No |
| actual_output | Yes (must be valid JSON) |
| expected_output | No |
Dependencies¶
This metric requires the `jsonschema` library:
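If it is not already present, `jsonschema` can be installed from PyPI:

```shell
pip install jsonschema
```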
Usage¶
from eval_lib.metrics import JsonSchemaMetric
from eval_lib.test_case import EvalTestCase
import asyncio
test_case = EvalTestCase(
input="Return user info as JSON",
actual_output='{"name": "Alice", "age": 30, "email": "alice@example.com"}'
)
metric = JsonSchemaMetric(
threshold=1.0,
schema={
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
"email": {"type": "string", "format": "email"}
},
"required": ["name", "age", "email"]
}
)
result = asyncio.run(metric.evaluate(test_case))
print(result.score) # 1.0
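One caveat about the example above: by default, the `jsonschema` library treats `"format": "email"` as an annotation and does not enforce it; enforcement requires passing a `FormatChecker` explicitly. Whether `JsonSchemaMetric` enables format checking internally is implementation-dependent. A standalone sketch using `jsonschema` directly:

```python
from jsonschema import FormatChecker, ValidationError, validate

schema = {"type": "string", "format": "email"}

# Without a FormatChecker, "format" is annotation-only: this passes.
validate("not-an-email", schema)

# With a FormatChecker, the email format is actually enforced.
try:
    validate("not-an-email", schema, format_checker=FormatChecker())
    enforced = False
except ValidationError:
    enforced = True

print(enforced)  # True
```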
Example Scenarios¶
Pass (1.0)¶
metric = JsonSchemaMetric(schema={
"type": "array",
"items": {"type": "string"},
"minItems": 1
})
EvalTestCase(
input="Return a list of colors",
actual_output='["red", "green", "blue"]'
)
# Valid JSON array of strings with at least one item
Fail (0.0) — invalid JSON¶
metric = JsonSchemaMetric(schema={"type": "object"})
EvalTestCase(
input="Return JSON",
actual_output="This is not JSON"
)
# Output cannot be parsed as JSON
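A third failure mode implied by the description above: output that parses as JSON but violates the schema also scores 0.0. A standalone sketch using the `jsonschema` library directly (the schema and output here are illustrative, not from the metric's test suite):

```python
import json

from jsonschema import ValidationError, validate

schema = {"type": "object", "required": ["name"]}
output = '{"age": 30}'  # valid JSON, but missing the required "name" key

parsed = json.loads(output)  # parsing succeeds
try:
    validate(instance=parsed, schema=schema)
    score = 1.0
except ValidationError:
    score = 0.0  # schema violation, so the metric's described score is 0.0

print(score)  # 0.0
```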