Offline Trace Export
Export traces to a JSON file when you cannot send data to Deepchecks in real time, then upload them in a separate step.
If you cannot send traces to Deepchecks in real time (for example, in isolated environments without internet access, or in batch-processing workflows), you can export traces to a local JSON file and upload them in a separate step.
This two-step process works with all framework integrations (LanggraphIntegration, CrewaiIntegration, GoogleAdkIntegration).
Step 1: Export traces to JSON
Instead of calling register_dc_exporter, call register_json_exporter. Traces are then written to local JSON files as your pipeline runs.
```python
from deepchecks_llm_client.otel import LanggraphIntegration

LanggraphIntegration().register_json_exporter(
    output_folder="./traces",
    log_to_console=True,
)

# Run your pipeline as normal - traces are saved to ./traces/ as JSON files
```

Step 2: Upload the JSON to Deepchecks
Once you are ready to upload, use the log_spans_file function:
```python
from deepchecks_llm_client.client import DeepchecksLLMClient

dc_client = DeepchecksLLMClient(api_token="your-api-key")

dc_client.log_spans_file(
    app_name="Your App Name",
    version_name="v1",
    env_type="EVAL",
    json_path="./traces/your-trace-file.json",
)
```

After upload, Deepchecks processes the traces exactly as it would for real-time exports: parsing attributes, computing metrics, calculating properties, and running annotations.
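A long pipeline run may produce several trace files in the output folder. You can upload them all in one pass by looping over the folder and calling log_spans_file once per file. A minimal sketch, assuming the client from Step 2 is already configured; the `collect_trace_files` and `upload_traces` helpers are illustrative, not part of the SDK:

```python
from pathlib import Path


def collect_trace_files(folder):
    # Illustrative helper: gather every exported JSON trace file
    # in the folder, sorted by name for a deterministic upload order.
    return sorted(Path(folder).glob("*.json"))


def upload_traces(dc_client, folder, app_name, version_name, env_type="EVAL"):
    # Illustrative helper: upload each exported trace file using the
    # log_spans_file call shown in Step 2.
    for trace_file in collect_trace_files(folder):
        dc_client.log_spans_file(
            app_name=app_name,
            version_name=version_name,
            env_type=env_type,
            json_path=str(trace_file),
        )
```

With the client from Step 2, this would be invoked as `upload_traces(dc_client, "./traces", "Your App Name", "v1")`.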