Why do my OpenTelemetry logs look like this? Shouldn't they be nested dictionaries?
[{'key': 'openinference.span.kind', 'value': {'stringValue': 'llm'}},
{'key': 'llm.model_name', 'value': {'stringValue': 'gpt-3.5-turbo-0125'}},
{'key': 'input.value',
'value': {'stringValue': '{"model":"gpt-3.5-turbo","temperature":0,"seed":123,"messages":[{"role":"system","content":"You are a helpful frog who gives life advice to people. You say *ribbit* at the end of each sentence and make other frog noises in between. You answer shortly in less than 10 words."},{"role":"user","content":"What\'s the capital of Fashion ?"}],"stream":false}'}},
{'key': 'input.mime_type', 'value': {'stringValue': 'application/json'}},
{'key': 'llm.invocation_parameters',
'value': {'stringValue': '{"model":"gpt-3.5-turbo","temperature":0,"seed":123,"stream":false}'}},
{'key': 'llm.input_messages.0.message.role',
'value': {'stringValue': 'system'}},
{'key': 'llm.input_messages.0.message.content',
'value': {'stringValue': 'You are a helpful frog who gives life advice to people. You say *ribbit* at the end of each sentence and make other frog noises in between. You answer shortly in less than 10 words.'}},
{'key': 'llm.input_messages.1.message.role',
'value': {'stringValue': 'user'}},
{'key': 'llm.input_messages.1.message.content',
'value': {'stringValue': "What's the capital of Fashion ?"}},
{'key': 'output.value',
'value': {'stringValue': '{"id":"chatcmpl-9RIh5725gc2SeTt67Oqv0pjKbUoQX","object":"chat.completion","created":1716293743,"model":"gpt-3.5-turbo-0125","choices":[{"index":0,"message":{"role":"assistant","content":"Paris *ribbit*"},"logprobs":null,"finish_reason":"stop"}],"usage":{"prompt_tokens":60,"completion_tokens":5,"total_tokens":65},"system_fingerprint":null}'}},
{'key': 'output.mime_type', 'value': {'stringValue': 'application/json'}},
{'key': 'llm.output_messages.0.message.role',
'value': {'stringValue': 'assistant'}},
{'key': 'llm.output_messages.0.message.content',
'value': {'stringValue': 'Paris *ribbit*'}},
{'key': 'llm.token_count.completion', 'value': {'intValue': 5}},
{'key': 'llm.token_count.prompt', 'value': {'intValue': 60}},
{'key': 'llm.token_count.total', 'value': {'intValue': 65}}]
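For context: this is the OTLP attribute encoding. Every attribute is exported as a flat {key, value} pair with a typed wrapper such as stringValue or intValue, and any nesting lives only in the dotted key names like llm.input_messages.0.message.role, since span attribute values are limited to primitives and arrays of primitives. If you want nested dictionaries you have to rebuild them yourself after the fact. Below is a minimal TypeScript sketch of such a helper; unflattenAttributes is an illustrative name, not an SDK function, and it assumes the attribute list has exactly the JSON shape shown above.

// Hypothetical helper (not part of the OpenTelemetry SDK): rebuild nested
// objects from the flat, dot-separated OTLP attribute list shown above.
type OtlpValue = {
  stringValue?: string;
  intValue?: number;
  doubleValue?: number;
  boolValue?: boolean;
};
type OtlpKeyValue = { key: string; value: OtlpValue };

// OTLP wraps every attribute value in a typed field; return whichever is set.
function unwrap(value: OtlpValue): string | number | boolean | undefined {
  return value.stringValue ?? value.intValue ?? value.doubleValue ?? value.boolValue;
}

function unflattenAttributes(attributes: OtlpKeyValue[]): Record<string, unknown> {
  const root: Record<string, unknown> = {};
  for (const { key, value } of attributes) {
    const parts = key.split('.');
    let node: Record<string, unknown> = root;
    for (let i = 0; i < parts.length - 1; i++) {
      // Create the intermediate object the first time a prefix is seen.
      node[parts[i]] = node[parts[i]] ?? {};
      node = node[parts[i]] as Record<string, unknown>;
    }
    node[parts[parts.length - 1]] = unwrap(value);
  }
  return root;
}

// A few of the attributes from the dump above, for demonstration.
const attributes: OtlpKeyValue[] = [
  { key: 'llm.model_name', value: { stringValue: 'gpt-3.5-turbo-0125' } },
  { key: 'llm.input_messages.0.message.role', value: { stringValue: 'system' } },
  { key: 'llm.input_messages.1.message.role', value: { stringValue: 'user' } },
  { key: 'llm.token_count.total', value: { intValue: 65 } },
];
console.log(JSON.stringify(unflattenAttributes(attributes), null, 2));

Running this on the list above reproduces the nested message structure, with the numeric segments ("0", "1") ending up as object keys rather than array elements, which is usually close enough for inspection or pretty-printing.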
How should these attributes be interpreted?
How can I configure OpenTelemetry to log data in a nested dictionary format?
Show an example of the configuration using the OpenTelemetry JavaScript SDK.
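As far as I know there is no switch in the SDK or the OTLP exporters that makes them emit nested dictionaries; the flat key/value encoding is what the protocol defines. What you can do is register your own exporter that reshapes span.attributes before logging. Below is a minimal sketch for the OpenTelemetry JavaScript SDK, written in TypeScript and assuming @opentelemetry/sdk-node, @opentelemetry/sdk-trace-base and @opentelemetry/core are installed; NestedConsoleSpanExporter and nest are illustrative names of my own, not SDK APIs.

// Minimal sketch: a custom exporter that prints spans with their dotted
// attribute keys rebuilt into nested objects. NestedConsoleSpanExporter
// and nest() are illustrative names, not part of the OpenTelemetry SDK.
import { NodeSDK } from '@opentelemetry/sdk-node';
import { ReadableSpan, SpanExporter } from '@opentelemetry/sdk-trace-base';
import { ExportResult, ExportResultCode } from '@opentelemetry/core';

// In the JS SDK, span.attributes is already a flat record such as
// { 'llm.model_name': 'gpt-3.5-turbo-0125', 'llm.token_count.total': 65 }.
function nest(flat: Record<string, unknown>): Record<string, unknown> {
  const root: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(flat)) {
    const parts = key.split('.');
    let node: Record<string, unknown> = root;
    for (let i = 0; i < parts.length - 1; i++) {
      node[parts[i]] = node[parts[i]] ?? {};
      node = node[parts[i]] as Record<string, unknown>;
    }
    node[parts[parts.length - 1]] = value;
  }
  return root;
}

class NestedConsoleSpanExporter implements SpanExporter {
  export(spans: ReadableSpan[], resultCallback: (result: ExportResult) => void): void {
    for (const span of spans) {
      // Log each finished span with its attributes unflattened.
      console.log(JSON.stringify({ name: span.name, attributes: nest(span.attributes) }, null, 2));
    }
    resultCallback({ code: ExportResultCode.SUCCESS });
  }

  shutdown(): Promise<void> {
    return Promise.resolve();
  }
}

// Route all spans through the custom exporter instead of an OTLP exporter.
const sdk = new NodeSDK({
  traceExporter: new NestedConsoleSpanExporter(),
});
sdk.start();

If you still need to ship spans to a collector, keep the regular OTLP exporter for that and use an exporter like this one only for local, human-readable logging; the nested form is for your eyes, not for the wire format.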



