Working with Large Language Models (LLMs) can often feel like a black box. You send a prompt and get a response, but what happens during that interaction? Just as software engineers rely on trace logging to debug complex applications, developers building with LLMs need a clear way to see what’s happening at runtime. This is where a powerful tool like LangSmith becomes essential.
LLM applications present unique complexities beyond traditional software. We’re not just tracking latency; we’re also concerned with token counts, non-determinism in responses, and the “elephant in the room”: issues like hallucinations, toxic content, and readability. This makes robust tracing not just helpful but non-negotiable for building reliable and trustworthy applications.
LangSmith elegantly solves this by providing the @traceable decorator, which you can use to annotate functions. With this simple addition, all your runtime traces, including detailed run trees, appear automatically in the LangSmith interface. Developers can also attach custom metadata at build time or at runtime, which lets you filter and drill down into specific records, saving countless hours you’d otherwise spend manually searching through logs. Best of all, the tracing logic runs in a background thread, so it won’t add latency to your application. LangSmith makes debugging complex LLM applications both simple and efficient.
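To see what a decorator like this captures, here is a minimal, self-contained sketch of the idea: a wrapper that records a function’s name, inputs, output, and latency on each call. This is an illustration of the pattern, not LangSmith’s actual implementation (which additionally builds run trees and ships traces to its backend in a background thread); the names `traceable_sketch` and `last_trace` are invented for this example.

```python
import functools
import time

def traceable_sketch(func):
    """Minimal sketch of what a tracing decorator records (not LangSmith's real code)."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        # A real tracer would send this record to a tracing backend asynchronously;
        # here we just stash the most recent trace on the wrapper for inspection.
        wrapper.last_trace = {
            "name": func.__name__,
            "inputs": {"args": args, "kwargs": kwargs},
            "output": result,
            "latency_ms": (time.time() - start) * 1000,
        }
        return result
    return wrapper

@traceable_sketch
def answer(question: str) -> str:
    # Stand-in for an LLM call, to keep the example runnable offline.
    return f"Echo: {question}"

answer("What is tracing?")
print(answer.last_trace["name"])  # -> answer
```

With the real library, the usage is just as light: decorate a function with @traceable and the captured runs appear in the LangSmith UI instead of a local variable.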
To enable tracing in your application, the first step is to sign up for a free LangSmith account on their website. You can do so using your Google or GitHub account, or an email address. Once you’re signed in, navigate to the Settings page (look for the gear icon ⚙️) to find and create your API key. Make sure you copy this key and store it in a secure location, as it’s shown only once.
With your API key in hand, you can enable tracing by setting two environment variables in your application’s environment:
LANGCHAIN_TRACING_V2=true
LANGSMITH_API_KEY=<your-api-key>
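If you prefer to configure this from Python rather than your shell, the same two variables can be set programmatically before your application initializes. This is a sketch using the variable names from the snippet above; the key value is a placeholder, not a real credential.

```python
import os

# Enable LangSmith tracing and supply the API key.
# "<your-api-key>" is a placeholder; use your real key (ideally loaded
# from a secrets manager or .env file, not hard-coded).
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGSMITH_API_KEY"] = "<your-api-key>"
```

Set these before importing or constructing any LangChain components, since tracing is configured from the environment at startup.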
Setting LANGCHAIN_TRACING_V2 to true is what activates the tracing feature. Now, whenever a LangChain-based application runs, the tracing data will be automatically logged to your LangSmith project, giving you immediate visibility into your application’s behavior. In the next part of this series, we’ll dive deeper into more advanced features.