Documentation Index
Fetch the complete documentation index at: https://docs.responsibleailabs.ai/llms.txt
Use this file to discover all available pages before exploring further.
Installation
pip install "rail-score-sdk[telemetry]"
Setup
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from rail_score_sdk import RailScoreClient
from rail_score_sdk.telemetry import RAILTelemetry
# Configure OTEL exporter
provider = TracerProvider()
exporter = OTLPSpanExporter(endpoint="http://localhost:4317")
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)
# Enable RAIL telemetry
rail = RailScoreClient(api_key="YOUR_RAIL_API_KEY")
RAILTelemetry.instrument(rail)
# All eval() calls now emit spans automatically
result = rail.eval(content="Your text here", mode="basic")
Span attributes
Every rail.eval() call emits a span with these attributes:
| Attribute | Type | Description |
|---|---|---|
| rail.score | float | Overall RAIL score |
| rail.confidence | float | Score confidence |
| rail.mode | string | basic or deep |
| rail.credits_consumed | float | Credits charged |
| rail.from_cache | bool | Whether the result was served from cache |
| rail.dim.{name}.score | float | Per-dimension score |
| rail.dim.{name}.confidence | float | Per-dimension confidence |
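The per-dimension rows above follow a flat naming scheme: each dimension expands into a `rail.dim.{name}.score` and `rail.dim.{name}.confidence` key. A minimal sketch of that flattening, assuming an illustrative result shape (not the SDK's actual return type):

```python
# Sketch: how per-dimension results flatten into span attribute keys.
# The "result" dict shape here is illustrative only.
def flatten_rail_attributes(result: dict) -> dict:
    attrs = {
        "rail.score": result["score"],
        "rail.mode": result["mode"],
    }
    # Each dimension contributes two flat keys: .score and .confidence
    for name, dim in result.get("dimensions", {}).items():
        attrs[f"rail.dim.{name}.score"] = dim["score"]
        attrs[f"rail.dim.{name}.confidence"] = dim["confidence"]
    return attrs

example = {
    "score": 0.91,
    "mode": "basic",
    "dimensions": {"fairness": {"score": 0.88, "confidence": 0.95}},
}
print(flatten_rail_attributes(example)["rail.dim.fairness.score"])  # 0.88
```

Flat keys like these are queryable in most tracing backends without any custom parsing.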
Viewing traces
RAIL spans integrate with any OTEL-compatible backend: Jaeger, Tempo, Honeycomb, Datadog, New Relic, or Langfuse.
# Start a local Jaeger instance for development
docker run -p 16686:16686 -p 4317:4317 jaegertracing/all-in-one
Open http://localhost:16686 to view traces.
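To send traces to a hosted backend instead of local Jaeger, repoint the OTLP exporter from the Setup section. A sketch, with a placeholder endpoint and header name — substitute the values from your backend's OTLP ingestion docs (Honeycomb, for example, authenticates via an x-honeycomb-team header):

```python
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Placeholder endpoint and header: replace with your backend's values.
exporter = OTLPSpanExporter(
    endpoint="https://otlp.example.com:4317",
    headers=(("x-api-key", "YOUR_BACKEND_API_KEY"),),
)
```

The rest of the pipeline (TracerProvider, BatchSpanProcessor, RAILTelemetry.instrument) is unchanged.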