Installation
Install Agentary JS via npm:
```bash
npm install agentary-js
```
Requirements
For Device Provider (On-Device Inference)
- Node.js 18 or higher
- Modern browser with WebGPU or WebAssembly support
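To confirm your local Node.js version meets the requirement, check it from the command line:
```bash
# Should print v18.0.0 or later
node --version
```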
Browser Support
WebGPU Support
| Browser / Platform | Version (Shipped/Enabled by Default) | Notes |
|---|---|---|
| Chromium (Chrome / Edge / other) on Windows/macOS/ChromeOS | Chrome 113 / Edge 113 and later (desktop) | Initial release: Chrome 113 on Windows, macOS, and ChromeOS |
| Chromium on Android | Chrome 121 and later | Android support shipped later than desktop |
| Firefox (desktop) | Firefox 141 and later (Windows) | Windows first; macOS/Linux support in progress, Android not yet enabled by default |
| Safari / WebKit (macOS/iOS/iPadOS) | Safari 26 (macOS Tahoe) and iOS/iPadOS 26 | Enabled by default starting with Safari 26 |
WebAssembly Support
| Browser / Platform | Version (Full Wasm 1.0 MVP Support) | Notes |
|---|---|---|
| Chrome (desktop) | Chrome 57 and later | Long-standing support |
| Edge (desktop) | Edge 16 and later | Introduced in legacy (EdgeHTML) Edge; Chromium-based Edge matches Chrome |
| Firefox (desktop) | Firefox 52 and later | Long-standing support |
| Safari (desktop) | Safari 11 and later | iOS Safari uses the same WebKit engine, so behaviour matches |
| Chrome for Android | All current versions | Matches desktop Chrome |
| Firefox for Android | All current versions | - |
WebAssembly is used as a fallback when WebGPU is not available, providing broad compatibility across devices.
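Agentary JS handles this fallback for you, but if you want to see which backend a given browser can use, a quick capability probe looks like the following. This is a diagnostic sketch only; the function name is illustrative and not part of the Agentary JS API:
```js
// Rough capability probe (illustrative only; Agentary JS selects its backend internally).
async function detectInferenceBackend() {
  // WebGPU: navigator.gpu must exist and an adapter must actually be available.
  if (typeof navigator !== 'undefined' && 'gpu' in navigator) {
    const adapter = await navigator.gpu.requestAdapter();
    if (adapter) return 'webgpu';
  }
  // WebAssembly: available in all browsers listed above.
  if (typeof WebAssembly === 'object') return 'wasm';
  return 'unsupported';
}

detectInferenceBackend().then((backend) => console.log('Best available backend:', backend));
```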
Minimum Hardware
- RAM: 4GB recommended for small models (270M-500M parameters)
- GPU: Optional but recommended for WebGPU acceleration
- Storage: Varies by model (typically 100MB-500MB per quantized model)
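As a rough pre-flight check before downloading a model, the browser's Storage API reports how much quota is available, and `navigator.deviceMemory` (Chromium-only) gives an approximate RAM figure. This sketch is not part of the Agentary JS API:
```js
// Sketch: sanity-check device resources before fetching a quantized model.
async function checkDeviceResources() {
  const { quota = 0, usage = 0 } = await navigator.storage.estimate();
  const freeMB = Math.round((quota - usage) / (1024 * 1024));
  console.log(`Approx. free storage quota: ${freeMB} MB`);

  // navigator.deviceMemory is Chromium-only and reports an approximate value in GiB.
  if ('deviceMemory' in navigator) {
    console.log(`Approx. device memory: ${navigator.deviceMemory} GiB`);
  }
}

checkDeviceResources();
```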
For Cloud Provider (Cloud Inference)
- A backend server to proxy API requests
- API keys from cloud providers:
  - Anthropic API Key for Claude models
  - OpenAI API Key for GPT models
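API keys belong on the server only, never in browser code. One common pattern (an assumption here, not a requirement of Agentary JS) is a `.env` file that your proxy loads at startup; the variable names below are the defaults read by the official Anthropic and OpenAI SDKs, and the values are placeholders:
```bash
# .env on your backend (placeholder values)
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key
```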
Cloud Provider Setup:
- Set up a proxy server (see Cloud Provider Guide; a minimal sketch follows the install commands below)
- Store API keys securely on your backend
- Configure Agentary JS to use your proxy endpoint
```bash
# Example: Install proxy dependencies
npm install express @anthropic-ai/sdk
# or
npm install express openai
```
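The Cloud Provider Guide covers the proxy in detail, including the exact request/response contract Agentary JS expects. As a rough illustration only, a minimal Express server for the Anthropic route used in the verification snippet below could look like this; the route path and port are chosen to match that snippet's `proxyUrl`:
```js
// server.js — minimal proxy sketch (not production-ready: add auth, rate limiting, and CORS rules).
import express from 'express';
import Anthropic from '@anthropic-ai/sdk';

const app = express();
app.use(express.json());

// Reads ANTHROPIC_API_KEY from the environment by default.
const anthropic = new Anthropic();

app.post('/api/anthropic', async (req, res) => {
  try {
    // Assumes the incoming JSON body is already a valid Messages API payload.
    const message = await anthropic.messages.create(req.body);
    res.json(message);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

app.listen(3001, () => console.log('Proxy listening on http://localhost:3001'));
```
An OpenAI-based variant would swap in the `openai` package and its chat completions call.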
Verifying Installation
Test Device Provider
Create a simple test file to verify device provider installation:
```js
import { createSession, isSupportedModel, getSupportedModelIds } from 'agentary-js';

console.log('Agentary JS installed successfully!');
console.log('Supported models:', getSupportedModelIds());
console.log('Qwen3 supported:', isSupportedModel('onnx-community/Qwen3-0.6B-ONNX'));
```
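If the import resolves and the supported model list prints, the device provider is installed correctly. Note that this is an ES module intended to run in the browser, so serve it through your bundler or a local dev server rather than opening the file directly from disk.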
Test Cloud Provider
Verify cloud provider connectivity:
```js
import { createSession } from 'agentary-js';

const session = await createSession({
  models: [{
    runtime: 'anthropic',
    model: 'claude-3-5-sonnet-20241022',
    proxyUrl: 'http://localhost:3001/api/anthropic',
    modelProvider: 'anthropic'
  }]
});

console.log('Cloud provider configured successfully!');

await session.dispose();
```
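If session creation fails, make sure your proxy backend is running and that `proxyUrl` matches the route it exposes before re-running the check.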
Next Steps
Now that you have Agentary JS installed:
- Choose between on-device or cloud inference
- Set up a Cloud Provider (if using cloud models)
- Review Core Concepts to understand the architecture
- Build your first application with the Quick Start guide