Installation

Basic Installation

Install Agentary JS via npm:

npm install agentary-js
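If your project uses pnpm or yarn instead, the equivalent commands are:

pnpm add agentary-js
# or
yarn add agentary-js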

Device Inference Setup

If you plan to use on-device inference (local models running in the browser), you need to install the peer dependency and configure your bundler:

1. Install Transformers.js

npm install @huggingface/transformers

2. Configure Your Bundler

Agentary JS requires your bundler to be configured to handle Web Workers and copy ONNX Runtime assets.

For Vite users (recommended):

See the detailed Vite Configuration Guide for complete setup instructions.

Quick configuration:

// vite.config.js
import { defineConfig } from 'vite';
 
export default defineConfig({
  worker: {
    // Emit workers as ES modules; the inference code runs in module workers
    format: 'es',
  },
  optimizeDeps: {
    // Pre-bundle Transformers.js so Vite's dependency optimizer handles it correctly
    include: ['@huggingface/transformers'],
  },
  build: {
    // es2022 output target (supports modern features such as top-level await)
    target: 'es2022',
  },
});

Requirements

For Device Provider (On-Device Inference)

  • Node.js 18 or higher
  • Modern browser with WebGPU or WebAssembly support
  • @huggingface/transformers peer dependency installed
  • Bundler configured (see above)

Browser Support

WebGPU Support
| Browser / Platform | Version (Shipped/Enabled by Default) | Notes |
| --- | --- | --- |
| Chromium (Chrome / Edge / other) on Windows/macOS/ChromeOS | Chrome 113 / Edge 113 and later (desktop) | Initial release: Chrome 113 on Windows, macOS, and ChromeOS |
| Chromium on Android | Chrome 121 and later | Android support shipped later |
| Firefox (desktop) | Firefox 141 and later (Windows) | Windows first; macOS/Linux to follow. Not yet enabled by default on Android |
| Safari / WebKit (macOS/iOS/iPadOS) | Safari 26 (macOS Tahoe) / iOS/iPadOS 26 | Enabled by default in Safari 26 |

WebAssembly Support
| Browser / Platform | Version (Full Support for Wasm 1.0 MVP) | Notes |
| --- | --- | --- |
| Chrome (desktop) | Chrome 57 and later | Supported early and widely |
| Edge (desktop) | Edge 16 and later | EdgeHTML-era legacy Edge |
| Firefox (desktop) | Firefox 52 and later | Long-standing full support |
| Safari (desktop) | Safari 11 and later | iOS Safari uses the same WebKit engine, so behaviour aligns |
| Chrome for Android | All currently supported versions | Good mobile support |
| Firefox for Android | Fully supported | — |

WebAssembly is used as a fallback when WebGPU is not available, providing broad compatibility across devices.
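To check at runtime which backend a browser actually offers, you can probe it directly. This is a minimal sketch using only standard web APIs (navigator.gpu for WebGPU, the WebAssembly global for Wasm); it is independent of Agentary JS itself:

// Probe for the best available inference backend (standard web APIs only)
async function detectBackend() {
  if (navigator.gpu) {
    // requestAdapter() can still return null (e.g. blocklisted GPUs)
    const adapter = await navigator.gpu.requestAdapter();
    if (adapter) return 'webgpu';
  }
  if (typeof WebAssembly === 'object') return 'wasm';
  return 'none';
}
 
detectBackend().then((backend) => console.log('Best available backend:', backend));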

Minimum Hardware

  • RAM: 4GB recommended for small models (270M-500M parameters)
  • GPU: Optional but recommended for WebGPU acceleration
  • Storage: Varies by model (typically 100MB-500MB per quantized model)
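You can get a rough sense of whether a device meets these guidelines from the browser itself. The sketch below uses navigator.deviceMemory (Chromium-only, reports approximate RAM in GB) and navigator.storage.estimate() (reports the origin's storage quota in bytes); treat both as hints, not guarantees:

// Rough client-side capability hints; both APIs report approximations by design
async function logDeviceHints() {
  // deviceMemory is Chromium-only and returns bucketed RAM in GB
  if (navigator.deviceMemory) {
    console.log(`Approx. RAM: ${navigator.deviceMemory} GB`);
  }
  // storage.estimate() reports this origin's quota and usage in bytes
  if (navigator.storage?.estimate) {
    const { quota, usage } = await navigator.storage.estimate();
    console.log(`Storage quota: ${Math.round(quota / 1e6)} MB, used: ${Math.round(usage / 1e6)} MB`);
  }
}
 
logDeviceHints();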

For Cloud Provider (Cloud Inference)

Cloud Provider Setup:

  1. Set up a proxy server (see Cloud Provider Guide; a minimal sketch follows below)
  2. Store API keys securely on your backend
  3. Configure Agentary JS to use your proxy endpoint

# Example: Install proxy dependencies
npm install express @anthropic-ai/sdk
# or
npm install express openai
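As an illustration of step 1, here is a minimal sketch of an Anthropic proxy built with Express. It is not production-ready (no auth, no rate limiting, no streaming), and it assumes Agentary JS posts an Anthropic Messages API-shaped JSON body to the endpoint; consult the Cloud Provider Guide for the exact contract.

// proxy.js — minimal sketch only; see the Cloud Provider Guide for the real setup
import express from 'express';
import Anthropic from '@anthropic-ai/sdk';
 
const app = express();
app.use(express.json());
 
// The API key stays on the server; the browser only ever sees the proxy URL
const client = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });
 
app.post('/api/anthropic', async (req, res) => {
  try {
    // Assumes the client sends a Messages API request body (model, messages, max_tokens, ...)
    const message = await client.messages.create(req.body);
    res.json(message);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});
 
app.listen(3001, () => console.log('Proxy listening on http://localhost:3001'));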

Verifying Installation

Test Device Provider

Create a simple test file to verify device provider installation:

import { createSession, isSupportedModel, getSupportedModelIds } from 'agentary-js';
 
console.log('Agentary JS installed successfully!');
console.log('Supported models:', getSupportedModelIds());
console.log('Qwen3 supported:', isSupportedModel('onnx-community/Qwen3-0.6B-ONNX'));

Test Cloud Provider

Verify cloud provider connectivity:

import { createSession } from 'agentary-js';
 
const session = await createSession({
  models: [{
    runtime: 'anthropic',
    model: 'claude-3-5-sonnet-20241022',
    proxyUrl: 'http://localhost:3001/api/anthropic',
    modelProvider: 'anthropic'
  }]
});
 
console.log('Cloud provider configured successfully!');
await session.dispose();
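If you installed the openai proxy dependencies instead, the session config would point at your OpenAI proxy route. Note that the 'openai' runtime and modelProvider strings below are an assumption for illustration (only the Anthropic values above are confirmed); check the Cloud Provider Guide for the supported values.

import { createSession } from 'agentary-js';
 
// Hypothetical OpenAI variant — the 'openai' runtime/provider strings are assumed,
// not confirmed; only the Anthropic configuration above comes from these docs
const session = await createSession({
  models: [{
    runtime: 'openai',
    model: 'gpt-4o',
    proxyUrl: 'http://localhost:3001/api/openai',
    modelProvider: 'openai'
  }]
});
 
await session.dispose();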

Next Steps

Now that you have Agentary JS installed:

  1. Choose between on-device or cloud inference
  2. Set up a Cloud Provider (if using cloud models)
  3. Review Core Concepts to understand the architecture
  4. Build your first application with the Quick Start guide