The AWS SDK for JavaScript v3 Bedrock Runtime client (@aws-sdk/client-bedrock-runtime) provides execution APIs for model invocation via ConverseCommand, ConverseStreamCommand, InvokeModelCommand, and InvokeModelWithResponseStreamCommand. This repository has no direct instrumentation for any Bedrock Runtime SDK surface — no wrapper, no diagnostics channels, no plugin, no auto-instrumentation config. Users who call the AWS Bedrock Runtime SDK directly get no Braintrust spans.
What instrumentation is missing
The @aws-sdk/client-bedrock-runtime package exposes these execution surfaces, none of which are instrumented:
| SDK Method | Description |
| --- | --- |
| client.send(new ConverseCommand(...)) | Unified chat completions across all Bedrock models |
| client.send(new ConverseStreamCommand(...)) | Streaming chat completions |
| client.send(new InvokeModelCommand(...)) | Direct model invocation with provider-specific payloads |
| client.send(new InvokeModelWithResponseStreamCommand(...)) | Streaming direct invocation with provider-specific payloads |
The Converse API is AWS's unified interface for chat completions across all Bedrock-hosted models (Claude, Titan, Llama, Mistral, Cohere, etc.). It has its own request/response format distinct from any provider's native API. Users cannot use wrapOpenAI() or any other existing wrapper with this SDK.
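To make the mismatch concrete, here is a minimal sketch of the two request/response shapes side by side. Field names follow the public OpenAI chat completions and Bedrock Converse APIs; the extractUsage helper is purely illustrative and is not an existing Braintrust function.

```javascript
// Illustrative only: the Converse request/response shapes differ from
// OpenAI's, so an OpenAI-oriented wrapper cannot parse them.

// OpenAI-style chat request (what wrapOpenAI understands):
const openAiRequest = {
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello" }],
  max_tokens: 256,
};

// Bedrock Converse request: content is an array of blocks, the model is
// a modelId, and sampling parameters live under inferenceConfig.
const converseRequest = {
  modelId: "anthropic.claude-3-5-sonnet-20240620-v1:0",
  messages: [{ role: "user", content: [{ text: "Hello" }] }],
  inferenceConfig: { maxTokens: 256 },
};

// Converse response: the reply is nested under output.message, and token
// counts use inputTokens/outputTokens rather than prompt/completion_tokens.
const converseResponse = {
  output: { message: { role: "assistant", content: [{ text: "Hi!" }] } },
  stopReason: "end_turn",
  usage: { inputTokens: 8, outputTokens: 3, totalTokens: 11 },
};

// An instrumentation layer would need Converse-specific extraction logic
// to normalize usage into the names an OpenAI-style span expects:
function extractUsage(resp) {
  return {
    prompt_tokens: resp.usage.inputTokens,
    completion_tokens: resp.usage.outputTokens,
  };
}
console.log(extractUsage(converseResponse)); // { prompt_tokens: 8, completion_tokens: 3 }
```

Because message content blocks, parameter nesting, and usage key names all differ, Converse support would need its own extraction logic rather than a shim over an existing wrapper.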
No coverage in any instrumentation layer:
No wrapper function (e.g. wrapBedrockRuntime())
No diagnostics channels for Bedrock methods
No plugin handler in js/src/instrumentation/plugins/
No auto-instrumentation config in js/src/auto-instrumentations/configs/
No vendor SDK types in js/src/vendor-sdk-types/
No e2e test scenarios
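Streaming widens the gap further. With ConverseStreamCommand, the SDK delivers the reply as response.stream, an async iterable of delta events, so any future plugin would have to aggregate events to recover the output text and token usage. A dependency-free sketch of that aggregation, with a stub generator standing in for the real stream (event shapes are assumed to follow the Bedrock ConverseStream API):

```javascript
// Stub generator standing in for response.stream from ConverseStreamCommand.
// Event shapes (messageStart, contentBlockDelta, messageStop, metadata) are
// assumptions based on the public ConverseStream API, not code from this repo.
async function* stubConverseStream() {
  yield { messageStart: { role: "assistant" } };
  yield { contentBlockDelta: { delta: { text: "Hel" }, contentBlockIndex: 0 } };
  yield { contentBlockDelta: { delta: { text: "lo" }, contentBlockIndex: 0 } };
  yield { messageStop: { stopReason: "end_turn" } };
  yield { metadata: { usage: { inputTokens: 5, outputTokens: 2, totalTokens: 7 } } };
}

// Aggregate the event stream into the fields a span would record.
async function collectStream(stream) {
  let text = "";
  let stopReason;
  let usage;
  for await (const event of stream) {
    if (event.contentBlockDelta?.delta?.text) text += event.contentBlockDelta.delta.text;
    if (event.messageStop) stopReason = event.messageStop.stopReason;
    if (event.metadata?.usage) usage = event.metadata.usage;
  }
  return { text, stopReason, usage };
}

collectStream(stubConverseStream()).then((span) => {
  console.log(span.text, span.stopReason, span.usage.totalTokens); // Hello end_turn 7
});
```

Token usage arrives only in the trailing metadata event, so a wrapper cannot finalize a span until the stream is fully consumed.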
A grep for bedrock (case-insensitive) across js/src/ returns zero matches. The only repo references are in an example project's package.json (@ai-sdk/amazon-bedrock) and a user bug report (issue #928) showing Bedrock usage via LangChain.
Indirect coverage exists but is limited:
Users can access Bedrock models through the Vercel AI SDK (@ai-sdk/amazon-bedrock) or LangChain (@langchain/aws / ChatBedrockConverse), both of which are instrumented. However, many enterprise users use the AWS SDK directly for tighter control over IAM, regions, and model parameters. Issue #928 demonstrates real user demand — a user was already using Bedrock via LangChain and encountered tracing issues.
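For comparison, a hypothetical wrapBedrockRuntime() (the name is illustrative; no such export exists in this repository) could intercept client.send() much as the existing wrappers intercept provider methods. A minimal sketch with a stub client standing in for the real BedrockRuntimeClient:

```javascript
// Hypothetical sketch: wrap a Bedrock Runtime client's send() so every
// command produces a trace record. wrapBedrockRuntime is NOT an existing
// Braintrust API, and the real SDK client is replaced here by a stub.
function wrapBedrockRuntime(client, onSpan) {
  return new Proxy(client, {
    get(target, prop, receiver) {
      if (prop !== "send") return Reflect.get(target, prop, receiver);
      return async (command, ...rest) => {
        const start = Date.now();
        // Name the span after the command class, e.g. "ConverseCommand".
        const span = { name: command.constructor.name, input: command.input };
        try {
          const response = await target.send(command, ...rest);
          span.output = response.output;
          span.usage = response.usage;
          return response;
        } finally {
          // Report the span even when send() throws.
          span.durationMs = Date.now() - start;
          onSpan(span);
        }
      };
    },
  });
}

// Stubs standing in for BedrockRuntimeClient and ConverseCommand:
class ConverseCommand {
  constructor(input) { this.input = input; }
}
const stubClient = {
  async send(command) {
    return {
      output: { message: { role: "assistant", content: [{ text: "Hi!" }] } },
      usage: { inputTokens: 8, outputTokens: 3, totalTokens: 11 },
    };
  },
};

const spans = [];
const wrapped = wrapBedrockRuntime(stubClient, (s) => spans.push(s));
wrapped
  .send(new ConverseCommand({ modelId: "anthropic.claude-3-5-sonnet-20240620-v1:0" }))
  .then(() => {
    console.log(spans[0].name, spans[0].usage.totalTokens); // ConverseCommand 11
  });
```

A Proxy keeps the wrapped client drop-in compatible: every property other than send passes through untouched, which is what lets the same client still be used for configuration and middleware.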
Context
The Braintrust tracing docs at https://www.braintrust.dev/docs/guides/tracing list "AWS Bedrock" as a supported AI provider, but this JS SDK has no direct wrapper for the AWS Bedrock Runtime SDK. The listed support likely refers to indirect coverage through frameworks (LangChain, AI SDK) or Python SDK support.
Braintrust docs status
unclear — Braintrust's tracing overview page lists "AWS Bedrock" as a supported provider, but the wrap-providers documentation does not include any Bedrock-specific setup instructions, and this JS SDK has no direct instrumentation.
Upstream references
Local files inspected
js/src/wrappers/ — no Bedrock wrapper
js/src/instrumentation/plugins/ — no Bedrock channels or plugin
js/src/auto-instrumentations/configs/ — no Bedrock config
js/src/vendor-sdk-types/ — no Bedrock types
e2e/scenarios/ — no Bedrock test scenarios
js/examples/ai-sdk/next-openai-app/package.json — only reference is @ai-sdk/amazon-bedrock (AI SDK adapter, not the direct SDK)