langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications

Top Related Projects

  • langchain: 🦜🔗 The platform for reliable agents. (119,163 stars)
  • semantic-kernel: Integrate cutting-edge LLM technology quickly and easily into your apps
  • llama_index: The leading framework for building LLM-powered agents over your data.
  • openai-cookbook: Examples and guides for using the OpenAI API
  • haystack: AI orchestration framework to build customizable, production-ready LLM applications. (23,837 stars)
  • gpt4all: Run Local LLMs on Any Device. Open-source and available for commercial use. (76,895 stars)

Quick Overview

LangChain.js is a JavaScript library designed to assist developers in building applications with large language models (LLMs). It provides a set of tools and abstractions to simplify the process of creating AI-powered applications, focusing on composability and ease of use.

Pros

  • Offers a wide range of integrations with popular LLMs and vector stores
  • Provides high-level abstractions for common AI-driven tasks
  • Supports both Node.js and browser environments
  • Active development and community support

Cons

  • Learning curve for developers new to LLM-based applications
  • Documentation can be overwhelming due to the breadth of features
  • Some advanced features may require additional setup or dependencies
  • Performance can vary depending on the chosen LLM and integrations

Code Examples

  1. Creating a simple chain with OpenAI:
import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";
import { LLMChain } from "langchain/chains";

const model = new OpenAI({ temperature: 0.9 });
const prompt = PromptTemplate.fromTemplate(
  "What is a good name for a company that makes {product}?"
);
const chain = new LLMChain({ llm: model, prompt });

const result = await chain.call({ product: "colorful socks" });
console.log(result.text);

  2. Using a vector store for similarity search:
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { FaissStore } from "langchain/vectorstores/faiss";
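// Note: FaissStore uses the faiss-node package, which must be installed separately.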

const embeddings = new OpenAIEmbeddings();
const vectorStore = await FaissStore.fromTexts(
  ["Hello world", "Bye bye", "Hello nice world"],
  [{ id: 2 }, { id: 1 }, { id: 3 }],
  embeddings
);

const resultOne = await vectorStore.similaritySearch("hello world", 1);
console.log(resultOne);

  3. Creating a conversational agent:
import { OpenAI } from "langchain/llms/openai";
import { BufferMemory } from "langchain/memory";
import { ConversationChain } from "langchain/chains";

const model = new OpenAI({});
const memory = new BufferMemory();
const chain = new ConversationChain({ llm: model, memory: memory });

const result1 = await chain.call({ input: "Hi! I'm Jim." });
console.log(result1.response);

const result2 = await chain.call({ input: "What's my name?" });
console.log(result2.response);

Getting Started

To get started with LangChain.js, follow these steps:

  1. Install the package:

    npm install langchain
    
  2. Set up your OpenAI API key:

    import { OpenAI } from "langchain/llms/openai";
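    // If openAIApiKey is omitted, the wrapper falls back to the OPENAI_API_KEY environment variable.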
    
    const model = new OpenAI({
      openAIApiKey: "your-api-key-here",
      temperature: 0.9,
    });
    
  3. Create a simple chain:

    import { PromptTemplate } from "langchain/prompts";
    import { LLMChain } from "langchain/chains";
    
    const prompt = PromptTemplate.fromTemplate("Tell me a {adjective} joke about {topic}.");
    const chain = new LLMChain({ llm: model, prompt });
    
    const result = await chain.call({ adjective: "funny", topic: "programming" });
    console.log(result.text);
    

Competitor Comparisons

langchain (119,163 stars): 🦜🔗 The platform for reliable agents.

Pros of langchain

  • More mature and feature-rich, with a larger ecosystem and community support
  • Extensive documentation and examples available
  • Supports a wider range of integrations and use cases

Cons of langchain

  • Steeper learning curve due to more complex architecture
  • Potentially slower development cycle for new features
  • May be overkill for simpler projects or prototypes

Code Comparison

langchain (Python):

from langchain import OpenAI, LLMChain
from langchain.prompts import PromptTemplate

llm = OpenAI(temperature=0.9)
prompt = PromptTemplate(input_variables=["product"], template="What is a good name for a company that makes {product}?")
chain = LLMChain(llm=llm, prompt=prompt)

langchainjs (JavaScript):

import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";
import { LLMChain } from "langchain/chains";

const model = new OpenAI({ temperature: 0.9 });
const prompt = PromptTemplate.fromTemplate("What is a good name for a company that makes {product}?");
const chain = new LLMChain({ llm: model, prompt: prompt });

Both repositories offer similar functionality, with langchain providing a more comprehensive set of features and integrations. langchainjs, being newer, may be more suitable for JavaScript developers or those looking for a lighter-weight solution. The code structure is similar, making it easier for developers to switch between the two if needed.

Semantic Kernel: Integrate cutting-edge LLM technology quickly and easily into your apps

Pros of Semantic Kernel

  • More extensive documentation and examples
  • Stronger integration with Azure services
  • Built-in memory and planning capabilities

Cons of Semantic Kernel

  • Less flexible plugin system compared to LangChain.js
  • Smaller community and ecosystem
  • More focused on Microsoft technologies, potentially limiting cross-platform usage

Code Comparison

Semantic Kernel (C#):

var kernel = Kernel.Builder.Build();
var promptConfig = new PromptTemplateConfig();
var function = kernel.CreateSemanticFunction(promptTemplate, promptConfig);
var result = await kernel.RunAsync(function);

LangChain.js (JavaScript):

const model = new OpenAI({ temperature: 0.9 });
const prompt = PromptTemplate.fromTemplate("What is a good name for a company that makes {product}?");
const chain = new LLMChain({ llm: model, prompt: prompt });
const result = await chain.call({ product: "colorful socks" });

Both frameworks provide abstraction layers for working with language models, but LangChain.js offers a more modular approach with its chain concept, while Semantic Kernel focuses on semantic functions and skills.

llama_index: The leading framework for building LLM-powered agents over your data.

Pros of llama_index

  • Specialized in efficient indexing and retrieval of large language model outputs
  • Offers advanced features for document chunking and semantic caching
  • Provides built-in support for various data sources and formats

Cons of llama_index

  • Less extensive ecosystem and community support compared to LangChain.js
  • More focused on indexing and retrieval, potentially limiting flexibility for other LLM tasks
  • Steeper learning curve for developers new to LLM-based applications

Code Comparison

LangChain.js example:

import { OpenAI } from "langchain/llms/openai";
import { RetrievalQAChain } from "langchain/chains";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";

const model = new OpenAI({ temperature: 0 });
const vectorStore = await HNSWLib.fromTexts(texts, metadata, new OpenAIEmbeddings());
const chain = RetrievalQAChain.fromLLM(model, vectorStore.asRetriever());

llama_index example:

from llama_index import GPTSimpleVectorIndex, SimpleDirectoryReader
from llama_index.indices.query.query_transform import DecomposeQueryTransform

documents = SimpleDirectoryReader('data').load_data()
index = GPTSimpleVectorIndex.from_documents(documents)
query_engine = index.as_query_engine(query_transform=DecomposeQueryTransform())

openai-cookbook: Examples and guides for using the OpenAI API

Pros of openai-cookbook

  • Focuses specifically on OpenAI's API, providing in-depth examples and best practices
  • Offers a wide range of use cases and tutorials for various OpenAI models
  • Maintained directly by OpenAI, ensuring up-to-date and accurate information

Cons of openai-cookbook

  • Limited to OpenAI's ecosystem, lacking integration with other AI providers
  • Less abstraction and higher-level functionality compared to LangChain.js
  • Requires more boilerplate code for complex tasks

Code Comparison

openai-cookbook:

import openai

text = "Hello, world!"
response = openai.Completion.create(
  engine="text-davinci-002",
  prompt=f"Translate the following English text to French: '{text}'",
  max_tokens=60
)

LangChain.js:

import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";

const model = new OpenAI({ temperature: 0 });
const prompt = PromptTemplate.fromTemplate(
  "Translate the following English text to French: {text}"
);
const chain = prompt.pipe(model);
const result = await chain.invoke({ text: "Hello, world!" });

LangChain.js provides a more abstracted and chainable approach, while openai-cookbook offers direct API access with more granular control.

Haystack (23,837 stars): AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.

Pros of Haystack

  • Specialized for question answering and document retrieval tasks
  • Supports multiple languages out-of-the-box
  • Offers pre-built pipelines for common NLP tasks

Cons of Haystack

  • Less flexible for general-purpose LLM applications
  • Smaller community and ecosystem compared to LangChain
  • Steeper learning curve for beginners

Code Comparison

Haystack:

from haystack import Pipeline
from haystack.nodes import TfidfRetriever, FARMReader

pipeline = Pipeline()
pipeline.add_node(component=TfidfRetriever(document_store=document_store), name="Retriever", inputs=["Query"])
pipeline.add_node(component=FARMReader(model_name_or_path="deepset/roberta-base-squad2"), name="Reader", inputs=["Retriever"])

LangChain:

import { RetrievalQAChain } from "langchain/chains";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { OpenAI } from "langchain/llms/openai";

const model = new OpenAI();
// `vectorStore` is assumed to be built beforehand (see the HNSWLib example above).
const chain = RetrievalQAChain.fromLLM(model, vectorStore.asRetriever());

Both libraries offer powerful tools for building NLP applications, but LangChain provides more flexibility for general LLM tasks, while Haystack excels in specific document retrieval and question answering scenarios.

gpt4all (76,895 stars): Run Local LLMs on Any Device. Open-source and available for commercial use.

Pros of gpt4all

  • Focuses on local, offline language models, providing privacy and reduced latency
  • Offers a user-friendly interface for non-technical users
  • Supports multiple platforms including mobile devices

Cons of gpt4all

  • Limited integration capabilities compared to LangChain's extensive ecosystem
  • Less flexibility for custom workflows and advanced AI applications
  • Narrower scope, primarily centered around GPT-based models

Code Comparison

gpt4all:

from gpt4all import GPT4All

model = GPT4All("ggml-gpt4all-j-v1.3-groovy")
output = model.generate("Once upon a time", max_tokens=50)
print(output)

LangChain:

import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";

const model = new OpenAI({ temperature: 0.9 });
const prompt = PromptTemplate.fromTemplate("Tell me a story about {topic}");
const chain = prompt.pipe(model);
const result = await chain.invoke({ topic: "a magical forest" });
console.log(result);

gpt4all is more straightforward for simple text generation, while LangChain offers more complex chains and integrations for advanced use cases.

README

🦜️🔗 LangChain.js

LangChain is a framework for building LLM-powered applications. It helps you chain together interoperable components and third-party integrations to simplify AI application development — all while future-proofing decisions as the underlying technology evolves.
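
For a taste of what that composition looks like, here is a minimal sketch. It reuses the entrypoints from the examples earlier on this page (newer releases expose equivalent classes under the @langchain/* scoped packages) and assumes an OpenAI API key is available in the environment:

import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";

// Pipe a prompt template into a model to get a single runnable chain.
const model = new OpenAI({ temperature: 0 });
const prompt = PromptTemplate.fromTemplate("Summarize this in one sentence: {input}");
const chain = prompt.pipe(model);

console.log(await chain.invoke({ input: "LangChain chains interoperable components into LLM-powered applications." }));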

Documentation: To learn more about LangChain, check out the docs.

If you're looking for more advanced customization or agent orchestration, check out LangGraph.js, our framework for building agents and controllable workflows.

[!NOTE] Looking for the Python version? Check out LangChain.

To help you ship LangChain apps to production faster, check out LangSmith. LangSmith is a unified developer platform for building, testing, and monitoring LLM applications.

⚡️ Quick Install

You can use npm, pnpm, or yarn to install LangChain.js

npm install -S langchain or pnpm install langchain or yarn add langchain
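
As a quick check that the install worked, the following sketch formats a prompt template locally without calling any model or needing an API key (import paths follow the examples above; newer releases also expose PromptTemplate from @langchain/core/prompts):

import { PromptTemplate } from "langchain/prompts";

const prompt = PromptTemplate.fromTemplate("Tell me a {adjective} joke about {topic}.");
// format() only fills in the template variables; no network call is made.
console.log(await prompt.format({ adjective: "funny", topic: "programming" }));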

🚀 Why use LangChain?

LangChain helps developers build applications powered by LLMs through a standard interface for agents, models, embeddings, vector stores, and more.

Use LangChain for:

  • Real-time data augmentation. Easily connect LLMs to diverse data sources and external/internal systems, drawing from LangChain’s vast library of integrations with model providers, tools, vector stores, retrievers, and more.
  • Model interoperability. Swap models in and out as your engineering team experiments to find the best choice for your application’s needs, as shown in the sketch after this list. As the industry frontier evolves, adapt quickly — LangChain’s abstractions keep you moving without losing momentum.
  • Rapid prototyping. Quickly build and iterate on LLM applications with LangChain's modular, component-based architecture. Test different approaches and workflows without rebuilding from scratch, accelerating your development cycle.
  • Production-ready features. Deploy reliable applications with built-in support for monitoring, evaluation, and debugging through integrations like LangSmith. Scale with confidence using battle-tested patterns and best practices.
  • Vibrant community and ecosystem. Leverage a rich ecosystem of integrations, templates, and community-contributed components. Benefit from continuous improvements and stay up-to-date with the latest AI developments through an active open-source community.
  • Flexible abstraction layers. Work at the level of abstraction that suits your needs - from high-level chains for quick starts to low-level components for fine-grained control. LangChain grows with your application's complexity.
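
The model-interoperability point is easiest to see in code. The sketch below is only illustrative: it reuses the legacy langchain/* entrypoints from the examples earlier on this page, and the commented-out Cohere import is an assumed alternative provider that needs its own peer dependency. The chain is written against the shared LLM interface, so swapping providers means changing only the constructor call:

import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";
import { LLMChain } from "langchain/chains";
// To switch providers, import a different model class instead, e.g.:
// import { Cohere } from "langchain/llms/cohere"; // requires the cohere-ai peer dependency

const prompt = PromptTemplate.fromTemplate("What is a good name for a company that makes {product}?");

// The chain depends only on the shared LLM interface, so model classes are interchangeable.
async function nameCompany(llm, product) {
  const chain = new LLMChain({ llm, prompt });
  const { text } = await chain.call({ product });
  return text;
}

console.log(await nameCompany(new OpenAI({ temperature: 0.9 }), "colorful socks"));
// console.log(await nameCompany(new Cohere({ temperature: 0.9 }), "colorful socks"));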

📦 LangChain's ecosystem

  • LangSmith - Unified developer platform for building, testing, and monitoring LLM applications. With LangSmith, you can debug poor-performing LLM app runs, evaluate agent trajectories, gain visibility in production, and deploy agents with confidence.
  • LangGraph - Build agents that can reliably handle complex tasks with LangGraph, our low-level agent orchestration framework. LangGraph offers customizable architecture, long-term memory, and human-in-the-loop workflows — and is trusted in production by companies like LinkedIn, Uber, Klarna, and GitLab.

🌐 Supported Environments

LangChain.js is written in TypeScript and can be used in:

  • Node.js (ESM and CommonJS) - 20.x, 22.x, 24.x
  • Cloudflare Workers
  • Vercel / Next.js (Browser, Serverless and Edge functions)
  • Supabase Edge Functions
  • Browser
  • Deno
  • Bun
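
Because the package ships both ESM and CommonJS builds, the same entrypoint can be loaded either way; a minimal sketch:

// ESM (also works in Deno, Bun, and most bundlers)
import { PromptTemplate } from "langchain/prompts";

// CommonJS
// const { PromptTemplate } = require("langchain/prompts");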

📖 Additional Resources

  • Getting started: Installation, setting up the environment, simple examples
  • Learn: Learn about the core concepts of LangChain.
  • LangChain Forum: Connect with the community and share all of your technical questions, ideas, and feedback.
  • Chat LangChain: Ask questions & chat with our documentation.

💁 Contributing

As an open-source project in a rapidly developing field, we are extremely open to contributions, whether it be in the form of a new feature, improved infrastructure, or better documentation.

For detailed information on how to contribute, see CONTRIBUTING.md.

Please report any security issues or concerns following our security guidelines.
