Environments
For instructions on how to authenticate to use this endpoint, see API overview.
Endpoints
POST | /api/environments/:project_id/llm_analytics/summarization/batch_check
POST | /api/environments/:project_id/llm_analytics/text_repr
POST | /api/environments/:project_id/llm_analytics/translate
GET | /api/environments/:project_id/llm_prompts
POST | /api/environments/:project_id/llm_prompts
GET | /api/environments/:project_id/llm_prompts/:id
PATCH | /api/environments/:project_id/llm_prompts/:id
DELETE | /api/environments/:project_id/llm_prompts/:id
GET | /api/environments/:project_id/llm_prompts/name/:prompt_name
GET | /api/environments/:project_id/logs/attributes
Create environments llm analytics summarization batch check
Path parameters
- project_id: string
Request parameters
- trace_ids: array
- mode (Default: minimal)
- provider
- model: string
Response
Example request
POST /api/environments/:project_id/llm_analytics/summarization/batch_check
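A minimal request sketch in Python, assuming the US Cloud host, a personal API key exported as POSTHOG_API_KEY (see API overview for authentication), and the requests library; the trace IDs, provider, and model values are illustrative.

```python
import os

import requests

HOST = "https://us.posthog.com"          # assumed US Cloud host
API_KEY = os.environ["POSTHOG_API_KEY"]  # personal API key; see API overview
PROJECT_ID = "your-project-id"           # placeholder project ID

# Body follows the request parameters listed above; values are illustrative.
body = {
    "trace_ids": ["trace-id-1", "trace-id-2"],
    "mode": "minimal",       # default mode
    "provider": "openai",    # illustrative provider
    "model": "gpt-4o-mini",  # illustrative model
}

response = requests.post(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_analytics/summarization/batch_check",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=body,
)
response.raise_for_status()
print(response.json())
```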
Example response
Status 200
Status 400
Status 403
Create environments llm analytics text repr
Generate a human-readable text representation of an LLM trace event.
This endpoint converts LLM analytics events ($ai_generation, $ai_span, $ai_embedding, or $ai_trace) into formatted text representations suitable for display, logging, or analysis.
Supported Event Types:
- $ai_generation: Individual LLM API calls with input/output messages
- $ai_span: Logical spans with state transitions
- $ai_embedding: Embedding generation events (text input → vector)
- $ai_trace: Full traces with hierarchical structure
Options:
- max_length: Maximum character count (default: 2000000)
- truncated: Enable middle-content truncation within events (default: true)
- truncate_buffer: Characters at start/end when truncating (default: 1000)
- include_markers: Use interactive markers vs plain text indicators (default: true)
  - Frontend: set true for <<<TRUNCATED|base64|...>>> markers
  - Backend/LLM: set false for ... (X chars truncated) ... text
- collapsed: Show summary vs full trace tree (default: false)
- include_hierarchy: Include tree structure for traces (default: true)
- max_depth: Maximum depth for hierarchical rendering (default: unlimited)
- tools_collapse_threshold: Number of tools before auto-collapsing list (default: 5)
  - Tool lists >5 items show <<<TOOLS_EXPANDABLE|...>>> marker for frontend
  - Or [+] AVAILABLE TOOLS: N for backend when include_markers: false
- include_line_numbers: Prefix each line with a line number like L001:, L010: (default: false)
Use Cases:
- Frontend display: truncated: true, include_markers: true, include_line_numbers: true
- Backend LLM context (summary): truncated: true, include_markers: false, collapsed: true
- Backend LLM context (full): truncated: false
The response includes the formatted text and metadata about the rendering.
Required API key scopes
llm_analytics:write
Path parameters
- project_idstring
Request parameters
- event_type
- data
- options
Response
Example request
POST /api/environments/:project_id/llm_analytics/text_repr
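A sketch of the "Backend LLM context (summary)" use case above, under the same host and API key assumptions as the earlier example; the data payload is a placeholder for the raw event you want rendered.

```python
import os

import requests

HOST = "https://us.posthog.com"          # assumed US Cloud host
API_KEY = os.environ["POSTHOG_API_KEY"]  # key with llm_analytics:write scope
PROJECT_ID = "your-project-id"           # placeholder project ID

body = {
    "event_type": "$ai_generation",
    # Placeholder for the raw event to render; pass your captured event here.
    "data": {"event": "$ai_generation", "properties": {}},
    # Options taken from the "Backend LLM context (summary)" use case above.
    "options": {"truncated": True, "include_markers": False, "collapsed": True},
}

response = requests.post(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_analytics/text_repr",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=body,
)
response.raise_for_status()
result = response.json()  # formatted text plus rendering metadata
print(result)
```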
Example response
Status 200
Status 400
Status 500
Status 503
Create environments llm analytics translate
Translate text to a target language.
Required API key scopes
llm_analytics:write
Path parameters
- project_id: string
Example request
POST /api/environments/:project_id/llm_analytics/translate
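This section does not list request parameters, so the body fields in this sketch (text, target_language) are hypothetical placeholders rather than the documented contract; host and key assumptions match the earlier examples.

```python
import os

import requests

HOST = "https://us.posthog.com"          # assumed US Cloud host
API_KEY = os.environ["POSTHOG_API_KEY"]  # key with llm_analytics:write scope
PROJECT_ID = "your-project-id"           # placeholder project ID

response = requests.post(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_analytics/translate",
    headers={"Authorization": f"Bearer {API_KEY}"},
    # Hypothetical field names; this section does not document the request body.
    json={"text": "Bonjour le monde", "target_language": "en"},
)
response.raise_for_status()  # 201 with no response body on success
```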
Example response
Status 201 No response body
List all environments llm prompts
Required API key scopes
llm_prompt:read
Path parameters
- project_id: string
Query parameters
- limit: integer
- offset: integer
Response
Example request
GET /api/environments/:project_id/llm_prompts
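A paginated list call, under the same host and API key assumptions as above; the limit and offset values are illustrative.

```python
import os

import requests

HOST = "https://us.posthog.com"          # assumed US Cloud host
API_KEY = os.environ["POSTHOG_API_KEY"]  # key with llm_prompt:read scope
PROJECT_ID = "your-project-id"           # placeholder project ID

response = requests.get(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_prompts",
    headers={"Authorization": f"Bearer {API_KEY}"},
    params={"limit": 20, "offset": 0},  # illustrative pagination values
)
response.raise_for_status()
print(response.json())
```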
Example response
Status 200
Create environments llm prompts
Required API key scopes
llm_prompt:write
Path parameters
- project_id: string
Request parameters
- name: string
- prompt
- deleted: boolean
Response
Example request
POST /api/environments/:project_id/llm_prompts
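A creation sketch under the same assumptions; the name is illustrative, and since this section does not specify the type of the prompt field, a plain string is assumed here.

```python
import os

import requests

HOST = "https://us.posthog.com"          # assumed US Cloud host
API_KEY = os.environ["POSTHOG_API_KEY"]  # key with llm_prompt:write scope
PROJECT_ID = "your-project-id"           # placeholder project ID

body = {
    "name": "support-triage",  # illustrative name
    # The type of "prompt" is not specified in this section; a string is assumed.
    "prompt": "You are a helpful support triage assistant.",
    "deleted": False,
}

response = requests.post(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_prompts",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=body,
)
response.raise_for_status()  # 201 on success
print(response.json())
```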
Example response
Status 201
Retrieve environments llm prompts
Required API key scopes
llm_prompt:read
Path parameters
- id: string
- project_id: string
Response
Example request
GET /api/environments/:project_id/llm_prompts/:id
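Fetching a single prompt by ID, with the same host and key assumptions; the prompt ID is a placeholder.

```python
import os

import requests

HOST = "https://us.posthog.com"          # assumed US Cloud host
API_KEY = os.environ["POSTHOG_API_KEY"]  # key with llm_prompt:read scope
PROJECT_ID = "your-project-id"           # placeholder project ID
PROMPT_ID = "prompt-id"                  # placeholder prompt ID

response = requests.get(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_prompts/{PROMPT_ID}",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
response.raise_for_status()
print(response.json())
```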
Example response
Status 200
Update environments llm prompts
Required API key scopes
llm_prompt:write
Path parameters
- id: string
- project_id: string
Request parameters
- name: string
- prompt
- deleted: boolean
Response
Example request
PATCH /api/environments/:project_id/llm_prompts/:id
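An update sketch under the same assumptions; the field values sent in the PATCH body are illustrative.

```python
import os

import requests

HOST = "https://us.posthog.com"          # assumed US Cloud host
API_KEY = os.environ["POSTHOG_API_KEY"]  # key with llm_prompt:write scope
PROJECT_ID = "your-project-id"           # placeholder project ID
PROMPT_ID = "prompt-id"                  # placeholder prompt ID

# Fields to update; values are illustrative.
body = {"name": "support-triage-v2"}

response = requests.patch(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_prompts/{PROMPT_ID}",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=body,
)
response.raise_for_status()
print(response.json())
```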
Example response
Status 200
Delete environments llm prompts
Hard delete of this model is not allowed. Use a PATCH API call to set "deleted" to true.
Required API key scopes
llm_prompt:write
Path parameters
- id: string
- project_id: string
Example request
DELETE /api/environments/:project_id/llm_prompts/:id
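Because a DELETE to this endpoint returns 405, this sketch follows the instruction above and soft-deletes the prompt via PATCH instead; host and key assumptions match the earlier examples.

```python
import os

import requests

HOST = "https://us.posthog.com"          # assumed US Cloud host
API_KEY = os.environ["POSTHOG_API_KEY"]  # key with llm_prompt:write scope
PROJECT_ID = "your-project-id"           # placeholder project ID
PROMPT_ID = "prompt-id"                  # placeholder prompt ID

# Hard deletes are not allowed (DELETE returns 405); set "deleted" to true
# with a PATCH call, as described above.
response = requests.patch(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_prompts/{PROMPT_ID}",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"deleted": True},
)
response.raise_for_status()
```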
Example response
Status 405 No response body
Retrieve environments llm prompts name
Required API key scopes
llm_prompt:read
Path parameters
- project_id: string
- prompt_name: string
Response
Example request
GET /api/environments/:project_id/llm_prompts/name/:prompt_name
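Fetching a prompt by name rather than ID, under the same assumptions; the prompt name is a placeholder.

```python
import os

import requests

HOST = "https://us.posthog.com"          # assumed US Cloud host
API_KEY = os.environ["POSTHOG_API_KEY"]  # key with llm_prompt:read scope
PROJECT_ID = "your-project-id"           # placeholder project ID
PROMPT_NAME = "support-triage"           # placeholder prompt name

response = requests.get(
    f"{HOST}/api/environments/{PROJECT_ID}/llm_prompts/name/{PROMPT_NAME}",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
response.raise_for_status()
print(response.json())
```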
Example response
Status 200
Retrieve environments logs attributes
Required API key scopes
logs:read
Path parameters
- project_id: string
Example request
GET /api/environments/:project_id/logs/attributes
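A simple read of log attributes, with the same host and key assumptions; no query parameters are documented for this endpoint.

```python
import os

import requests

HOST = "https://us.posthog.com"          # assumed US Cloud host
API_KEY = os.environ["POSTHOG_API_KEY"]  # key with logs:read scope
PROJECT_ID = "your-project-id"           # placeholder project ID

response = requests.get(
    f"{HOST}/api/environments/{PROJECT_ID}/logs/attributes",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
response.raise_for_status()  # this section documents Status 200 with no response body
```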
Example response
Status 200 No response body