superagent
Ajax for Node.js and browsers (JS HTTP client). Maintained for @forwardemail, @ladjs, @spamscanner, @breejs, @cabinjs, and @lassjs.
Top Related Projects
Official JavaScript / TypeScript library for the OpenAI API
Your API ⇒ Paid MCP. Instantly.
🦜🔗 Build context-aware reasoning applications
Integrate cutting-edge LLM technology quickly and easily into your apps
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal models, for both inference and training.
Quick Overview
Superagent is an AI-powered Node.js library that simplifies the creation of AI agents and workflows. It provides a high-level API for building, running, and managing AI agents, supporting various language models and tools. Superagent aims to streamline the development of AI-powered applications and services.
Pros
- Easy-to-use API for creating and managing AI agents
- Supports multiple language models (e.g., OpenAI, Anthropic)
- Extensible with custom tools and integrations
- Built-in memory and caching capabilities
Cons
- Relatively new project, may have some stability issues
- Limited documentation and examples compared to more established libraries
- Potential learning curve for developers new to AI agent concepts
- Dependency on third-party language models and services
Code Examples
- Creating a simple AI agent:
import { Agent } from 'superagent';
const agent = new Agent({
  llm: { provider: 'openai', model: 'gpt-3.5-turbo' },
  tools: ['search', 'calculator']
});
const response = await agent.run('What is the population of France?');
console.log(response);
- Using memory in an agent:
import { Agent } from 'superagent';
const agent = new Agent({
  llm: { provider: 'anthropic', model: 'claude-2' },
  memory: { type: 'buffer', capacity: 5 }
});
await agent.run('My name is Alice.');
const response = await agent.run('What is my name?');
console.log(response); // Should remember and output "Alice"
- Creating a custom tool:
import { Agent, Tool } from 'superagent';
const weatherTool = new Tool({
  name: 'weather',
  description: 'Get current weather for a location',
  func: async (location) => {
    // Implement weather API call here
    return `The weather in ${location} is sunny.`;
  }
});
const agent = new Agent({
  llm: { provider: 'openai', model: 'gpt-4' },
  tools: [weatherTool]
});
const response = await agent.run('What is the weather in New York?');
console.log(response);
Getting Started
To get started with Superagent, follow these steps:
- Install the package:
  npm install superagent
- Set up your environment variables for the language model provider (e.g., OpenAI API key).
- Create a basic agent:
  import { Agent } from 'superagent';
  const agent = new Agent({
    llm: { provider: 'openai', model: 'gpt-3.5-turbo' },
    tools: ['search', 'calculator']
  });
  const response = await agent.run('Hello, how can you help me today?');
  console.log(response);
- Explore more advanced features like memory, custom tools, and different language models in the documentation.
Competitor Comparisons
Official JavaScript / TypeScript library for the OpenAI API
Pros of openai-node
- Official OpenAI SDK, ensuring compatibility and up-to-date features
- Comprehensive documentation and examples for OpenAI-specific use cases
- Streamlined integration with OpenAI services
Cons of openai-node
- Limited to OpenAI services, less versatile for general HTTP requests
- May have a steeper learning curve for developers new to OpenAI's ecosystem
Code Comparison
openai-node:
import OpenAI from 'openai';
const openai = new OpenAI({ apiKey: 'your-api-key' });
const completion = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [{ "role": "user", "content": "Hello world" }]
});
superagent:
const superagent = require('superagent');
const response = await superagent
  .post('https://api.openai.com/v1/chat/completions')
  .set('Authorization', 'Bearer your-api-key')
  .send({
    model: "gpt-3.5-turbo",
    messages: [{ "role": "user", "content": "Hello world" }]
  });
Summary
openai-node is tailored for OpenAI services, offering a more specialized and potentially easier-to-use solution for OpenAI-specific tasks. However, superagent provides greater flexibility for general HTTP requests across various APIs, not limited to OpenAI. The choice between the two depends on the specific needs of the project and the developer's familiarity with each library.
Your API ⇒ Paid MCP. Instantly.
Pros of Agentic
- Focuses on AI agent orchestration and workflows
- Provides a flexible framework for building complex AI systems
- Supports various LLM providers and tools out of the box
Cons of Agentic
- Less mature project with fewer contributors
- Documentation is still a work in progress
- May have a steeper learning curve for beginners
Code Comparison
Agentic:
from agentic import Agent, Task
agent = Agent()
task = Task("Summarize this text: ...")
result = agent.run(task)
print(result)
Superagent:
const superagent = require('superagent');
superagent
  .get('https://api.example.com/data')
  .end((err, res) => {
    console.log(res.body);
  });
Key Differences
- Agentic is designed for AI agent workflows, while Superagent is a general-purpose HTTP client library
- Agentic is written in Python, whereas Superagent is a JavaScript library
- Superagent has a larger community and more extensive documentation
- Agentic provides higher-level abstractions for AI tasks, while Superagent focuses on HTTP requests and responses
Use Cases
- Choose Agentic for building complex AI systems and agent-based workflows
- Opt for Superagent when working with RESTful APIs or performing HTTP operations in JavaScript applications
Both libraries serve different purposes and excel in their respective domains. The choice between them depends on the specific requirements of your project and the programming language you're using.
🦜🔗 Build context-aware reasoning applications
Pros of LangChain
- More comprehensive framework for building AI applications
- Extensive documentation and community support
- Wider range of integrations with various AI models and tools
Cons of LangChain
- Steeper learning curve due to its extensive features
- Potentially more complex setup for simple projects
- Heavier dependency footprint
Code Comparison
LangChain:
from langchain import OpenAI, LLMChain, PromptTemplate
llm = OpenAI(temperature=0.9)
prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}?",
)
chain = LLMChain(llm=llm, prompt=prompt)
Superagent:
const superagent = require('superagent');
superagent
  .post('https://api.openai.com/v1/engines/davinci-codex/completions')
  .send({ prompt: 'What is a good name for a company that makes shoes?' })
  .set('Authorization', 'Bearer YOUR_API_KEY')
  .end((err, res) => {
    console.log(res.body.choices[0].text);
  });
LangChain offers a more structured approach to working with language models, while Superagent provides a simpler, more direct method for making API requests. LangChain's abstraction allows for easier chaining of operations and integration with various AI tools, whereas Superagent is more flexible for general HTTP requests but requires more manual setup for AI-specific tasks.
Integrate cutting-edge LLM technology quickly and easily into your apps
Pros of Semantic Kernel
- More comprehensive framework for building AI applications
- Better integration with Azure OpenAI and other Microsoft services
- Larger community and more frequent updates
Cons of Semantic Kernel
- Steeper learning curve due to its complexity
- Primarily focused on .NET ecosystem, limiting language options
- Heavier resource requirements for implementation
Code Comparison
Semantic Kernel (C#):
var kernel = Kernel.Builder.Build();
var openAIFunction = kernel.CreateSemanticFunction("Generate a story about {{$input}}");
var result = await kernel.RunAsync("a brave knight", openAIFunction);
Superagent (JavaScript):
const superagent = require('superagent');
const response = await superagent
  .post('https://api.openai.com/v1/chat/completions')
  .set('Authorization', 'Bearer YOUR_API_KEY')
  .send({
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: 'Generate a story about a brave knight' }]
  });
Summary
Semantic Kernel offers a more robust framework for AI applications, particularly within the Microsoft ecosystem, but comes with increased complexity. Superagent provides a simpler, more lightweight approach for basic AI interactions, making it easier to integrate into existing JavaScript projects. The choice between the two depends on the specific project requirements, development ecosystem, and desired level of AI integration.
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal models, for both inference and training.
Pros of Transformers
- Extensive library for state-of-the-art NLP models and tasks
- Large community support and frequent updates
- Comprehensive documentation and examples
Cons of Transformers
- Steeper learning curve for beginners
- Larger package size and resource requirements
- More complex setup for simple tasks
Code Comparison
Transformers:
from transformers import pipeline
classifier = pipeline("sentiment-analysis")
result = classifier("I love this product!")[0]
print(f"Label: {result['label']}, Score: {result['score']:.4f}")
Superagent:
const superagent = require('superagent');
superagent
  .get('https://api.example.com/product/review')
  .query({ text: 'I love this product!' })
  .end((err, res) => {
    console.log(res.body);
  });
Summary
Transformers is a powerful library for advanced NLP tasks, offering a wide range of pre-trained models and tools. It excels in complex language processing scenarios but may be overkill for simpler applications. Superagent, on the other hand, is a lightweight HTTP client that's easier to set up and use for basic API interactions. While not directly comparable in functionality, Superagent is more suitable for straightforward web requests and data fetching, whereas Transformers is the go-to choice for sophisticated NLP projects.
README
superagent
Small progressive client-side HTTP request library, and Node.js module with the same API, supporting many high-level HTTP client features. Maintained for Forward Email and Lad.
Install
npm:
npm install superagent
yarn:
yarn add superagent
Usage
Node
const superagent = require('superagent');
// callback
superagent
  .post('/api/pet')
  .send({ name: 'Manny', species: 'cat' }) // sends a JSON post body
  .set('X-API-Key', 'foobar')
  .set('accept', 'json')
  .end((err, res) => {
    // Calling the end function will send the request
  });
// promise with then/catch
superagent.post('/api/pet').then(console.log).catch(console.error);
// promise with async/await
(async () => {
  try {
    const res = await superagent.post('/api/pet');
    console.log(res);
  } catch (err) {
    console.error(err);
  }
})();
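As a related note, superagent treats non-2xx responses as errors by default, and the error object lets you distinguish HTTP failures from network failures. The following is a minimal sketch; the path is just a placeholder:
(async () => {
  try {
    await superagent.get('/api/pet/missing'); // placeholder path
  } catch (err) {
    if (err.response) {
      // the server replied with a non-2xx status
      console.error(err.status, err.response.body);
    } else {
      // no response at all (DNS failure, connection reset, ...)
      console.error(err);
    }
  }
})();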
Browser
The browser-ready, minified version of superagent is only 50 KB (minified and gzipped).
Browser-ready versions of this module are available via jsdelivr, unpkg, and also in the node_modules/superagent/dist folder in downloads of the superagent package.
Note that we also provide unminified versions with .js instead of .min.js file extensions.
VanillaJS
This is the solution for you if you're just using <script> tags everywhere!
<script src="https://cdnjs.cloudflare.com/polyfill/v3/polyfill.min.js?features=WeakRef,BigInt"></script>
<script src="https://cdn.jsdelivr.net/npm/superagent"></script>
<!-- if you wish to use unpkg.com instead: -->
<!-- <script src="https://unpkg.com/superagent"></script> -->
<script type="text/javascript">
(function () {
  // superagent is exposed as `window.superagent`
  // if you wish to use "request" instead please
  // uncomment the following line of code:
  // `window.request = superagent;`
  superagent
    .post('/api/pet')
    .send({ name: 'Manny', species: 'cat' }) // sends a JSON post body
    .set('X-API-Key', 'foobar')
    .set('accept', 'json')
    .end(function (err, res) {
      // Calling the end function will send the request
    });
})();
</script>
Bundler
If you are using browserify, webpack, rollup, or another bundler, then you can follow the same usage as Node above.
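For reference, an ESM-style import also works in typical bundler setups (and modern Node); this is a minimal sketch, and the /api/pet path is only illustrative:
import superagent from 'superagent';

(async () => {
  // same chainable API as the require() examples above
  const res = await superagent
    .get('/api/pet')
    .set('accept', 'json');
  console.log(res.body);
})();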
Supported Platforms
- Node: v14.18.0+
- Browsers (see .browserslistrc), as reported by npx browserslist:
and_chr 102 and_ff 101 and_qq 10.4 and_uc 12.12 android 101 chrome 103 chrome 102 chrome 101 chrome 100 edge 103 edge 102 edge 101 firefox 101 firefox 100 firefox 91 ios_saf 15.5 ios_saf 15.4 ios_saf 15.2-15.3 ios_saf 15.0-15.1 ios_saf 14.5-14.8 ios_saf 14.0-14.4 ios_saf 12.2-12.5 kaios 2.5 op_mini all op_mob 64 opera 86 opera 85 safari 15.5 safari 15.4 samsung 17.0 samsung 16.0
Required Browser Features
We recommend using https://cdnjs.cloudflare.com/polyfill/ (specifically with the bundle mentioned in VanillaJS above):
<script src="https://cdnjs.cloudflare.com/polyfill/v3/polyfill.min.js?features=WeakRef,BigInt"></script>
- WeakRef is not supported in Opera 85, iOS Safari 12.2-12.5
- BigInt is not supported in iOS Safari 12.2-12.5
Plugins
SuperAgent is easily extended via plugins.
const nocache = require('superagent-no-cache');
const superagent = require('superagent');
const prefix = require('superagent-prefix')('/static');
superagent
  .get('/some-url')
  .query({ action: 'edit', city: 'London' }) // query string
  .use(prefix) // Prefixes *only* this request
  .use(nocache) // Prevents caching of *only* this request
  .end((err, res) => {
    // Do something
  });
Existing plugins:
- superagent-no-cache - prevents caching by including Cache-Control header
- superagent-prefix - prefixes absolute URLs (useful in test environment)
- superagent-suffix - suffix URLs with a given path
- superagent-mock - simulate HTTP calls by returning data fixtures based on the requested URL
- superagent-mocker - simulate REST API
- superagent-cache - A global SuperAgent patch with built-in, flexible caching
- superagent-cache-plugin - A SuperAgent plugin with built-in, flexible caching
- superagent-jsonapify - A lightweight json-api client addon for superagent
- superagent-serializer - Converts server payload into different cases
- superagent-httpbackend - stub out requests using AngularJS' $httpBackend syntax
- superagent-throttle - queues and intelligently throttles requests
- superagent-charset - add charset support for node's SuperAgent
- superagent-verbose-errors - include response body in error messages for failed requests
- superagent-declare - A simple declarative API for SuperAgent
- superagent-node-http-timings - measure http timings in node.js
- superagent-cheerio - add cheerio to your response content automatically. Adds res.$ for HTML and XML response bodies.
- @certible/superagent-aws-sign - sign AWS endpoint requests; uses aws4 to authenticate the SuperAgent requests
Please prefix your plugin with superagent-* so that it can easily be found by others.
For SuperAgent extensions such as couchdb and oauth visit the wiki.
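If you want to write your own, a plugin is simply a function that receives the request instance via .use() and may modify it before the request is sent. The sketch below is illustrative; the header name is made up:
const superagent = require('superagent');

// A plugin is a function that receives the Request instance and may modify it.
function stampRequestId(req) {
  req.set('X-Request-Id', Date.now().toString(36)); // illustrative header
  return req;
}

superagent
  .get('/some-url')
  .use(stampRequestId)
  .end((err, res) => {
    // the request was sent with the X-Request-Id header attached
  });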
Upgrading from previous versions
Please see GitHub releases page for the current changelog.
Our breaking changes are mostly in rarely used functionality and from stricter error handling.
- 6.0 to 6.1:
  - Browser behaviour changed to match Node when serializing application/x-www-form-urlencoded, using arrayFormat: 'indices' semantics of the qs library. (See: https://www.npmjs.com/package/qs#stringifying)
- 5.x to 6.x:
  - Retry behavior is still opt-in, however we now have a more fine-grained list of status codes and error codes that we retry against (see the updated docs, and the sketch after this list)
  - A specific issue with Content-Type matching not being case-insensitive is fixed
  - Set is now required for IE 9, see Required Browser Features for more insight
- 4.x to 5.x:
  - We've implemented the build setup of Lass to simplify our stack and linting
  - Unminified browserified build size has been reduced from 48KB to 20KB (via tinyify and the latest version of Babel using @babel/preset-env and .browserslistrc)
  - Linting support has been added using caniuse-lite and eslint-plugin-compat
  - We can now target what versions of Node we wish to support more easily using .babelrc
- 3.x to 4.x:
  - Ensure you're running Node 6 or later. We've dropped support for Node 4.
  - We've started using ES6, so for compatibility with Internet Explorer you may need to use Babel.
  - We suggest migrating from .end() callbacks to .then() or await.
- 2.x to 3.x:
  - Ensure you're running Node 4 or later. We've dropped support for Node 0.x.
  - Test code that calls .send() multiple times. Invalid calls to .send() will now throw instead of sending garbage.
- 1.x to 2.x:
  - If you use .parse() in the browser version, rename it to .serialize().
  - If you rely on undefined in query-string values being sent literally as the text "undefined", switch to checking for missing values instead. ?key=undefined is now ?key (without a value).
  - If you use .then() in Internet Explorer, ensure that you have a polyfill that adds a global Promise object.
- 0.x to 1.x:
  - Instead of the 1-argument callback .end(function(res){}), use .then(res => {}).
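For reference on the opt-in retry behavior mentioned in the 5.x to 6.x notes above, a request can be retried with .retry(n). This is a minimal sketch; the URL is just a placeholder:
const superagent = require('superagent');

superagent
  .get('https://api.example.com/flaky-endpoint') // placeholder URL
  .retry(2) // opt in: retry up to 2 more times on transient failures
  .then((res) => console.log(res.status))
  .catch((err) => console.error('failed after retries:', err.status));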
Contributors
- Kornel Lesiński
- Peter Lyons
- Hunter Loftis
- Nick Baugh
License
MIT © TJ Holowaychuk