Netsaur




Powerful Machine Learning library for Deno

Installation

There is no installation step required. You can simply import the library and you're good to go :)
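
For example, the following two lines (taken from the QuickStart below) are all it takes to get a working backend:

import { CPU, setupBackend } from "jsr:@denosaurs/netsaur";

// Deno fetches and caches the module on first run; no install command needed.
await setupBackend(CPU);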

Features

  • Lightweight and easy-to-use neural network library for Deno.
  • Blazingly fast and efficient.
  • Provides a simple API for creating and training neural networks.
  • Can run on both the CPU and the GPU (WIP).
  • Allows you to simply run the code without downloading any prior dependencies.
  • Perfect for serverless environments.
  • Allows you to quickly build and deploy machine learning models for a variety of applications with just a few lines of code.
  • Suitable for both beginners and experienced machine learning practitioners.

Backends

  • CPU - Native backend written in Rust.
  • WASM - WebAssembly backend written in Rust.
  • GPU (TODO)
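
Switching backends is a one-line change at setup time. A minimal sketch, using only the CPU and WASM exports from the examples below (the DENO_DEPLOYMENT_ID check is just one common way to detect Deno Deploy, not part of netsaur):

import { CPU, setupBackend, WASM } from "jsr:@denosaurs/netsaur";

// Prefer the fast native CPU backend locally; fall back to WASM on the
// edge, where native binaries can't be loaded.
const onEdge = Deno.env.get("DENO_DEPLOYMENT_ID") !== undefined;
await setupBackend(onEdge ? WASM : CPU);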

QuickStart

This example shows how to train a neural network to predict the output of the XOR function using our speedy CPU backend written in Rust.

import {
  Cost,
  CPU,
  DenseLayer,
  Sequential,
  setupBackend,
  SigmoidLayer,
  tensor1D,
  tensor2D,
} from "jsr:@denosaurs/netsaur";

/**
 * Setup the CPU backend. This backend is fast but doesn't work on the Edge.
 */
await setupBackend(CPU);

/**
 * Creates a sequential neural network.
 */
const net = new Sequential({
  /**
   * The input shape: minibatches of 4 samples, each with 2 input features.
   */
  size: [4, 2],

  /**
   * The silent option is set to true, which means that the network will not output any logs during training.
   */
  silent: true,

  /**
   * Defines the layers of a neural network in the XOR function example.
   * The neural network has two input neurons and one output neuron.
   * The layers are defined as follows:
   * - A dense layer with 3 neurons.
   * - A sigmoid activation layer.
   * - A dense layer with 1 neuron.
   * - A sigmoid activation layer.
   */
  layers: [
    DenseLayer({ size: [3] }),
    SigmoidLayer(),
    DenseLayer({ size: [1] }),
    SigmoidLayer(),
  ],

  /**
   * The cost function used for training the network is the mean squared error (MSE).
   */
  cost: Cost.MSE,
});

/**
 * Train the network on the given data.
 */
net.train(
  [
    {
      inputs: tensor2D([
        [0, 0],
        [1, 0],
        [0, 1],
        [1, 1],
      ]),
      outputs: tensor2D([[0], [1], [1], [0]]),
    },
  ],
  /**
   * The number of iterations is set to 10000.
   */
  10000,
);

/**
 * Predict the output of the XOR function for the given inputs.
 */
const out1 = (await net.predict(tensor1D([0, 0]))).data;
console.log(`0 xor 0 = ${out1[0]} (should be close to 0)`);

const out2 = (await net.predict(tensor1D([1, 0]))).data;
console.log(`1 xor 0 = ${out2[0]} (should be close to 1)`);

const out3 = (await net.predict(tensor1D([0, 1]))).data;
console.log(`0 xor 1 = ${out3[0]} (should be close to 1)`);

const out4 = (await net.predict(tensor1D([1, 1]))).data;
console.log(`1 xor 1 = ${out4[0]} (should be close to 0)`);
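
To try it out, save the snippet to a file (the name xor.ts is arbitrary) and run it with Deno; -A grants the permissions the CPU backend needs to fetch and load its native binary:

deno run -A xor.ts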

Use the WASM Backend

By changing the CPU backend to the WASM backend, we sacrifice some speed, but gain the ability to run on the edge.

import {
  Cost,
  DenseLayer,
  Sequential,
  setupBackend,
  SigmoidLayer,
  tensor1D,
  tensor2D,
  WASM,
} from "jsr:@denosaurs/netsaur";

/**
 * Setup the WASM backend. This backend is slower than the CPU backend but works on the Edge.
 */
await setupBackend(WASM);

/**
 * Creates a sequential neural network.
 */
const net = new Sequential({
  /**
   * The input shape: minibatches of 4 samples, each with 2 input features.
   */
  size: [4, 2],

  /**
   * The silent option is set to true, which means that the network will not output any logs during training.
   */
  silent: true,

  /**
   * Defines the layers of a neural network in the XOR function example.
   * The neural network has two input neurons and one output neuron.
   * The layers are defined as follows:
   * - A dense layer with 3 neurons.
   * - A sigmoid activation layer.
   * - A dense layer with 1 neuron.
   * - A sigmoid activation layer.
   */
  layers: [
    DenseLayer({ size: [3] }),
    SigmoidLayer(),
    DenseLayer({ size: [1] }),
    SigmoidLayer(),
  ],

  /**
   * The cost function used for training the network is the mean squared error (MSE).
   */
  cost: Cost.MSE,
});

/**
 * Train the network on the given data.
 */
net.train(
  [
    {
      inputs: tensor2D([
        [0, 0],
        [1, 0],
        [0, 1],
        [1, 1],
      ]),
      outputs: tensor2D([[0], [1], [1], [0]]),
    },
  ],
  /**
   * The number of iterations is set to 10000.
   */
  10000,
);

/**
 * Predict the output of the XOR function for the given inputs.
 */
const out1 = (await net.predict(tensor1D([0, 0]))).data;
console.log(`0 xor 0 = ${out1[0]} (should be close to 0)`);

const out2 = (await net.predict(tensor1D([1, 0]))).data;
console.log(`1 xor 0 = ${out2[0]} (should be close to 1)`);

const out3 = (await net.predict(tensor1D([0, 1]))).data;
console.log(`0 xor 1 = ${out3[0]} (should be close to 1)`);

const out4 = (await net.predict(tensor1D([1, 1]))).data;
console.log(`1 xor 1 = ${out4[0]} (should be close to 0)`);

Documentation

The full documentation for Netsaur can be found here.

License

Netsaur is licensed under the MIT License.


Add Package

  • Deno: deno add jsr:@denosaurs/netsaur
  • pnpm: pnpm i jsr:@denosaurs/netsaur (or, using pnpm 10.8 or older: pnpm dlx jsr add @denosaurs/netsaur)
  • Yarn: yarn add jsr:@denosaurs/netsaur (or, using Yarn 4.8 or older: yarn dlx jsr add @denosaurs/netsaur)
  • npm: npx jsr add @denosaurs/netsaur
  • Bun: bunx jsr add @denosaurs/netsaur

Import symbol

import * as netsaur from "@denosaurs/netsaur";

or, in Deno, import directly with a jsr specifier:

import * as netsaur from "jsr:@denosaurs/netsaur";