🚀 Get Started
Integrate AI models with WebAssembly in just a few steps. Build, compile, and deploy secure, high-performance AI experiences.
Install WasmAI CLI
Install the WasmAI command-line interface to compile and package AI models as WebAssembly modules.
Convert AI Model to Wasm
Use the CLI to convert your trained model (PyTorch, TensorFlow, or ONNX) into a WebAssembly module.
Integrate Into Your Project
Load the WebAssembly module in your JavaScript/TypeScript code and run AI inference directly in the browser.
const result = await Module.runInference(input);
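Putting it together, a minimal sketch of loading the module and calling it from the browser might look like the following. The "./wasmai-loader.js" import path, the loadModel() initializer, and the Float32Array input are illustrative assumptions, not the WasmAI API; runInference() is the call shown above.

// Minimal sketch: load the compiled module once, then run inference in the browser.
// loadModel() and the loader path are hypothetical placeholders for whatever
// glue code your build produces.
import { loadModel } from "./wasmai-loader.js";

export async function classify(pixels: Float32Array) {
  // Fetch, compile, and instantiate the WebAssembly module.
  const Module = await loadModel("/models/model.wasm");

  // Inference runs inside the page's sandbox; the input never leaves the browser.
  const result = await Module.runInference(pixels);
  return result;
}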
Deploy Securely
WasmAI modules are sandboxed by default. Host them on any platform and run inference safely in users' browsers.
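Deployment is ordinary static hosting, but one detail is worth checking: serve .wasm files with the application/wasm Content-Type so browsers can stream-compile them. Below is a minimal Node static-server sketch, assuming your page and compiled module sit in a local public/ directory (not a hardened production setup).

import { createServer } from "node:http";
import { readFile } from "node:fs/promises";
import { extname, join } from "node:path";

// application/wasm lets browsers use streaming compilation
// (WebAssembly.instantiateStreaming) for faster startup.
const MIME: Record<string, string> = {
  ".html": "text/html",
  ".js": "text/javascript",
  ".wasm": "application/wasm",
};

createServer(async (req, res) => {
  try {
    const file = join("public", req.url === "/" ? "index.html" : req.url!);
    const body = await readFile(file);
    res.writeHead(200, { "Content-Type": MIME[extname(file)] ?? "application/octet-stream" });
    res.end(body);
  } catch {
    res.writeHead(404);
    res.end("Not found");
  }
}).listen(8080);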
Need More Help?
Check out the full documentation, join the community, or explore ready-to-use templates in the CLI.