What is LangChain and why should you care?

With the recent developments in Large Language Models (LLMs), what captured my interest lately was LangChain 🦜

Let me break it down for you in simple terms. LangChain helps developers like us connect data to language models, like the GPT models from OpenAI and various others, all through their APIs. It also supports agent workflows. What are those, you ask? Well, they're smart, automated processes that make our lives easier. Imagine having tasks automated and complex stuff streamlined. It's a total game-changer!

Why do we need LangChain?

Guess what's challenging for developers these days while working with language models? Since the whole ecosystem is still evolving, we often lack the right tools to smoothly deploy language models in real-world scenarios. But here's where LangChain comes in to save the day! It's an all-in-one framework equipped with all the cool features we need. With it, devs essentially won't have to struggle with prompt chaining, logging, callbacks, persisting memory, and establishing efficient connections to multiple data sources manually.

It also provides a model-agnostic toolset that lets both companies and developers explore multiple LLM offerings. So, you can easily test which one works best for your specific use cases. The best part is that you can do all of this within a single interface. No more crazy scaling of code bases just to support different providers!

The Community

When it comes to checking out a tool, one of the big factors is the community supporting it. And for open-source projects like LangChain, this is even more crucial. You want to know you're not flying solo, right?

LangChain seems to have already built a massive fan base! As of today, it's got over 51k stars on GitHub, which is a pretty solid way to gauge its popularity in the open-source ecosystem. And guess what? It's racking up a million downloads every single month! That's some serious love from the developer community.

The community also maintains an active Discord channel. You know when there's a bustling chat going on, people are seriously into the project.

Bottom line: with that kind of following and engagement, you can bet Langchain is doing something right.

What exactly runs at the backend?

To put it in layman's terms, the objective of this framework is to support building applications on top of LLMs. It lets developers connect their language model to other data sources, which means it can tap into a vast amount of additional information and interact with its surroundings, making it super responsive and dynamic!

Picture this: you can integrate AI chatbots like ChatGPT into your apps, and then hook them up to external sources like Wikipedia and Google Drive. This means you can create apps that are seriously powerful and language-driven! They can churn out personalized content based on what users input and the data they fetch from different sources.

How do you get started using LangChain in your app? LangChain provides robust APIs that let us seamlessly work with powerful LLMs for all kinds of use cases. The best part is that it ships as Python and JavaScript libraries, which makes working with these models very easy.

The fundamental concept here is the ability to "chain" together different components, fittingly known as "chaining", which enables the development of advanced use cases on top of LLMs.
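To build intuition for what chaining means, here's a toy sketch (illustrative only, not LangChain's real API): each component is just a function, and a chain pipes one component's output into the next. `makeChain`, `buildPrompt`, and `fakeModel` are all made up for this example.

```javascript
// A toy "chain": compose steps so each one's output feeds the next.
const makeChain = (...steps) => async (input) => {
  let result = input;
  for (const step of steps) {
    result = await step(result);
  }
  return result;
};

// Two made-up components: build a prompt, then "call" a stand-in model.
const buildPrompt = (topic) => `Tell me a joke about ${topic}`;
const fakeModel = (prompt) => `LLM response to: "${prompt}"`;

const jokeChain = makeChain(buildPrompt, fakeModel);
jokeChain("developers").then(console.log);
// prints: LLM response to: "Tell me a joke about developers"
```

Swap `fakeModel` for a real LLM call and you have the basic shape of what the framework does for you, with far more plumbing handled out of the box.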

If you're curious to try out LangChain, you're in luck! I loved the documentation and it seems just enough to get you started.

Once you're up and running, you can try out things like text summarization, generative question answering (GQA), and chatbots. What I found while hacking around with it was that it's really efficient for use cases like creating accurate summaries, providing personalized answers to user queries, and building engaging conversational experiences. These capabilities seem to have huge potential for implementations in Customer Support and Sales Engineering.

The tech behind?

Under the hood, LLMs are statistical models that predict the next text chunks based on the initial ones you feed them. These initial chunks are called "prompts," and crafting the right prompts is what's known as "Prompt Engineering."
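At its simplest, a reusable prompt boils down to substituting variables into a fixed instruction string before it's sent to the model. Here's a minimal, made-up sketch of that idea (LangChain's `PromptTemplate` does this and more); the `formatPrompt` helper is hypothetical, not part of any library.

```javascript
// Substitute {placeholders} in a template string with supplied values.
function formatPrompt(template, variables) {
  return template.replace(/\{(\w+)\}/g, (_, name) => variables[name]);
}

const template = "Summarize the following text in {length} sentences: {text}";
console.log(formatPrompt(template, { length: "two", text: "LangChain is a framework..." }));
// -> "Summarize the following text in two sentences: LangChain is a framework..."
```

Separating the template from its variables is what lets you reuse one well-crafted prompt across many inputs.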

At its core, it's got two powerful capabilities:

  • Abstraction Layer: The capability to interact with various LLM providers using a standardized set of commands.
  • Chaining for Complexity: The capability to chain components together to build complex interactions.
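The abstraction-layer idea can be sketched like this: every provider is wrapped behind the same call signature, so application code doesn't change when you swap providers. The provider objects below are stand-ins I made up, not real SDK clients.

```javascript
// Each provider exposes the same call(prompt) interface.
const providers = {
  openai: { call: (prompt) => `openai answer for: ${prompt}` },
  anthropic: { call: (prompt) => `anthropic answer for: ${prompt}` },
};

// Application code talks to the uniform interface, not a specific SDK.
function ask(providerName, prompt) {
  return providers[providerName].call(prompt);
}

console.log(ask("openai", "What is LangChain?"));
console.log(ask("anthropic", "What is LangChain?"));
```

This is exactly what makes it cheap to A/B test different LLM offerings for a given use case: you change one string, not your codebase.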

Check out the snippet below:

import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";
import { LLMChain } from "langchain/chains";

const model = new OpenAI();
const prompt = PromptTemplate.fromTemplate(`Tell me a joke about {topic}`);
const chain = new LLMChain({ llm: model, prompt: prompt });
const response = await chain.call({ topic: "developers" });

With this snippet, we essentially create a chain that asks the LLM for a joke about a specific topic (in this case, "developers"). We can level this up into a more complex application by adding a translation twist:

import { SimpleSequentialChain } from "langchain/chains";

const translatePrompt = PromptTemplate.fromTemplate(`Translate the following text to Spanish: {text}`);
const translateChain = new LLMChain({ llm: model, prompt: translatePrompt });
const overallChain = new SimpleSequentialChain({
    chains: [chain, translateChain],
    verbose: true,
});
const results = await overallChain.run("developers");

By setting verbose: true in SimpleSequentialChain, we can log the generation process, which comes in handy for debugging.

What makes things even better is the module support for chaining that ships with LangChain:

  • Memory Module: This allows devs to store state across chains, whether in external DBs like Redis or DynamoDB, or simply in memory.
  • Agents Module: With this module, chains can interact with external providers (agents) and take actions based on their responses.
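The Memory idea above can be sketched in a few lines: keep prior turns around and prepend them to each new prompt so the model sees the conversation context. This `ConversationMemory` class is a toy I made up for illustration; a real setup might persist the turns in Redis or DynamoDB instead of an in-process array.

```javascript
// Toy conversation memory: store turns, render them as prompt context.
class ConversationMemory {
  constructor() {
    this.turns = [];
  }
  save(user, ai) {
    this.turns.push({ user, ai });
  }
  asContext() {
    return this.turns.map((t) => `User: ${t.user}\nAI: ${t.ai}`).join("\n");
  }
}

const memory = new ConversationMemory();
memory.save("Hi!", "Hello, how can I help?");
memory.save("Tell me a joke.", "Why did the developer go broke? ...");

// The next prompt carries the history along:
const prompt = `${memory.asContext()}\nUser: Another one, please.\nAI:`;
console.log(prompt);
```

Because the model itself is stateless, this "replay the history" trick is what makes a chain feel like an ongoing conversation.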

I'm super pumped about LangChain, which is why I took the time to write this up. I genuinely believe it's the answer to many of the wiring blockers developers face while working with LLMs.

Check out the LangChain community and documentation to learn more, with all the helpful guides, tutorials, and examples that walk you through every step of building LLM-powered applications using LangChain.