60 changes: 60 additions & 0 deletions README.md
@@ -68,6 +68,66 @@ That's it. You're now tracing your code with OpenLLMetry!

Now, you need to decide where to export the traces to.

## ⚙️ Configuration

### Service Name

You can customize your service name by providing a `name` parameter:

```ruby
require "traceloop/sdk"

# Without name parameter (uses OTEL_SERVICE_NAME as-is)
traceloop = Traceloop::SDK::Traceloop.new
# Service name: value of OTEL_SERVICE_NAME, or "unknown_service:ruby"

# With name parameter (combines name with OTEL_ENVIRONMENT)
traceloop = Traceloop::SDK::Traceloop.new(name: "worker")
# Service name: "worker-production" (if OTEL_ENVIRONMENT="production")
# Service name: "worker-unknown" (if OTEL_ENVIRONMENT not set)
```

### Multiple Service Instances

You can create multiple Traceloop instances with different service names in the same application:

```ruby
traceloop_api = Traceloop::SDK::Traceloop.new(name: "api")
traceloop_worker = Traceloop::SDK::Traceloop.new(name: "worker")
traceloop_scheduler = Traceloop::SDK::Traceloop.new(name: "scheduler")

# Each instance traces with its own service name (assuming OTEL_ENVIRONMENT="production"):
# - "api-production"
# - "worker-production"
# - "scheduler-production"
```

### Environment Variables

Service naming can also be controlled through environment variables (`OTEL_SERVICE_NAME` is a standard OpenTelemetry variable; `OTEL_ENVIRONMENT` is specific to this SDK):

```bash
# Used when no name parameter is provided
export OTEL_SERVICE_NAME="my-app"

# Combined with name parameter: "worker-production"
export OTEL_ENVIRONMENT="production"
```

Defaults:
- `OTEL_SERVICE_NAME` defaults to `"unknown_service:ruby"`
- `OTEL_ENVIRONMENT` defaults to `"unknown"`
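The resolution rules above can be sketched in plain Ruby. This is a minimal approximation of the documented behavior, not the SDK's actual implementation:

```ruby
# Hypothetical sketch of service-name resolution as described above;
# the real SDK internals may differ.
def resolve_service_name(name = nil)
  if name
    # A name parameter is combined with OTEL_ENVIRONMENT.
    environment = ENV.fetch("OTEL_ENVIRONMENT", "unknown")
    "#{name}-#{environment}"
  else
    # Without a name, OTEL_SERVICE_NAME is used as-is.
    ENV.fetch("OTEL_SERVICE_NAME", "unknown_service:ruby")
  end
end

resolve_service_name("worker") # e.g. "worker-production" when OTEL_ENVIRONMENT="production"
```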

### Cleanup

When shutting down your application, ensure spans are properly flushed:

```ruby
traceloop = Traceloop::SDK::Traceloop.new
# ... use traceloop ...
traceloop.shutdown # Flush remaining spans before exit
```

## ⏫ Supported (and tested) destinations

- [x] [Traceloop](https://www.traceloop.com/docs/openllmetry/integrations/traceloop)
5 changes: 3 additions & 2 deletions sample-app/Gemfile.lock
@@ -30,7 +30,7 @@ GEM
faraday-typhoeus (1.1.0)
faraday (~> 2.0)
typhoeus (~> 1.4)
- ffi (1.17.0-arm64-darwin)
+ ffi (1.17.0)
gemini-ai (4.2.0)
event_stream_parser (~> 1.0)
faraday (~> 2.10)
@@ -39,7 +39,7 @@ GEM
typhoeus (~> 1.4, >= 1.4.1)
google-cloud-env (2.2.1)
faraday (>= 1.0, < 3.a)
- google-protobuf (3.25.5-arm64-darwin)
+ google-protobuf (3.25.5)
googleapis-common-protos-types (1.16.0)
google-protobuf (>= 3.18, < 5.a)
googleauth (1.11.2)
@@ -100,6 +100,7 @@ GEM

PLATFORMS
arm64-darwin-23
+ x86_64-linux

DEPENDENCIES
aws-sdk-bedrockruntime (~> 1.14)
7 changes: 7 additions & 0 deletions sample-app/bedrock.rb
@@ -1,8 +1,15 @@
require 'aws-sdk-bedrockruntime'
require "traceloop/sdk"

# Example 1: No name parameter (backward compatible)
# Uses OTEL_SERVICE_NAME as-is, or defaults to "unknown_service:ruby"
traceloop = Traceloop::SDK::Traceloop.new

# Example 2: With name parameter
# Creates service name as "#{name}-#{OTEL_ENVIRONMENT}"
# If OTEL_ENVIRONMENT="production", this creates "bedrock-worker-production"
# traceloop = Traceloop::SDK::Traceloop.new(name: "bedrock-worker")

model = "anthropic.claude-3-sonnet-20240229-v1:0"

traceloop.llm_call(provider="bedrock", model=model) do |tracer|
7 changes: 7 additions & 0 deletions sample-app/gemini.rb
@@ -9,8 +9,15 @@
options: { model: 'gemini-pro', server_sent_events: true }
)

# Example 1: No name parameter (backward compatible)
# Uses OTEL_SERVICE_NAME as-is, or defaults to "unknown_service:ruby"
traceloop = Traceloop::SDK::Traceloop.new

# Example 2: With name parameter
# Creates service name as "#{name}-#{OTEL_ENVIRONMENT}"
# If OTEL_ENVIRONMENT="production", this creates "gemini-worker-production"
# traceloop = Traceloop::SDK::Traceloop.new(name: "gemini-worker")

traceloop.llm_call(provider="vertexai", model="gemini-pro") do |tracer|
tracer.log_prompt(user_prompt="Tell me a joke about OpenTelemetry")
response = client.generate_content(
14 changes: 14 additions & 0 deletions sample-app/openai.rb
@@ -7,8 +7,22 @@

client = OpenAI::Client.new

# Example 1: No name parameter (backward compatible)
# Uses OTEL_SERVICE_NAME as-is, or defaults to "unknown_service:ruby"
traceloop = Traceloop::SDK::Traceloop.new

# Example 2: With name parameter
# Creates service name as "#{name}-#{OTEL_ENVIRONMENT}"
# If OTEL_ENVIRONMENT="production", this creates "worker-production"
# traceloop_worker = Traceloop::SDK::Traceloop.new(name: "worker")

# Example 3: Multiple instances with different names
# If OTEL_ENVIRONMENT="production":
# - traceloop_api: "api-production"
# - traceloop_background: "background-production"
# traceloop_api = Traceloop::SDK::Traceloop.new(name: "api")
# traceloop_background = Traceloop::SDK::Traceloop.new(name: "background")

traceloop.workflow("joke_generator") do
traceloop.llm_call(provider="openai", model="gpt-3.5-turbo") do |tracer|
tracer.log_prompt(user_prompt="Tell me a joke about OpenTelemetry")
12 changes: 12 additions & 0 deletions semantic_conventions_ai/lib/opentelemetry/semantic_conventions.rb
@@ -30,6 +30,18 @@ module SpanAttributes

# Deprecated
TRACELOOP_CORRELATION_ID = "traceloop.correlation.id"

# Gen AI
**Member:** just replace the existing ones

**Author:** and remove references to llm semantics?

GEN_AI_REQUEST_MODEL = "gen_ai.request.model"
GEN_AI_RESPONSE_MODEL = "gen_ai.response.model"
GEN_AI_USAGE_OUTPUT_TOKENS = "gen_ai.usage.output_tokens"
GEN_AI_USAGE_INPUT_TOKENS = "gen_ai.usage.input_tokens"
GEN_AI_COMPLETIONS = "gen_ai.completion"
GEN_AI_PROMPTS = "gen_ai.prompt"
GEN_AI_SYSTEM = "gen_ai.system"
**Review comment:** ⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🌐 Web query:

OpenTelemetry semantic conventions gen_ai.system deprecated gen_ai.provider.name

💡 Result:

gen_ai.system was deprecated and renamed to gen_ai.provider.name in OpenTelemetry semantic-conventions v1.37.0 (GenAI “system-specific naming policy” breaking change). [1]

What to use instead:

  • Use gen_ai.provider.name to identify the GenAI telemetry “flavor” / provider discriminator (e.g., openai, azure.ai.openai, aws.bedrock, gcp.vertex_ai, etc.). [2]
  • The spec notes gen_ai.provider.name should be set based on the instrumentation’s best knowledge and may differ from the actual underlying model provider (proxies/gateways); gen_ai.request.model, gen_ai.response.model, and server.address can help disambiguate. [2]

Sources:
[1] OpenTelemetry semantic-conventions GitHub releases (v1.37.0): rename gen_ai.system → gen_ai.provider.name
[2] OpenTelemetry GenAI spans semantic conventions: definition + well-known values for gen_ai.provider.name


🏁 Script executed:

# Check the file context around lines 31-32 and 41-42
cat -n semantic_conventions_ai/lib/opentelemetry/semantic_conventions.rb | sed -n '25,50p'

Repository: traceloop/openllmetry-ruby



Mark GEN_AI_SYSTEM = "gen_ai.system" as deprecated or remove it.

gen_ai.system was deprecated and renamed to gen_ai.provider.name in OpenTelemetry semantic-conventions v1.37.0. Line 42 already defines GEN_AI_PROVIDER = "gen_ai.provider.name" as the replacement. If keeping GEN_AI_SYSTEM for backward compatibility, add a # Deprecated comment consistent with the pattern at line 31–32 (TRACELOOP_CORRELATION_ID); otherwise, remove it.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@semantic_conventions_ai/lib/opentelemetry/semantic_conventions.rb` at line
41, The constant GEN_AI_SYSTEM = "gen_ai.system" is deprecated in favor of
GEN_AI_PROVIDER = "gen_ai.provider.name"; either remove GEN_AI_SYSTEM or mark it
as deprecated for backward compatibility by adding a comment like "# Deprecated:
use GEN_AI_PROVIDER (gen_ai.provider.name) as of OpenTelemetry v1.37.0"
following the same style used for TRACELOOP_CORRELATION_ID so reviewers can see
the replacement and rationale.

GEN_AI_PROVIDER = "gen_ai.provider.name"
GEN_AI_CONVERSATION_ID = "gen_ai.conversation.id"
GEN_AI_BEDROCK_GUARDRAILS = "gen_ai.bedrock.guardrail"
end

module LLMRequestTypeValues