LRelu: Leaky ReLU Activation Layer
forward(x: Value[]): Value[]
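
A leaky ReLU layer passes positive inputs through unchanged and scales negative inputs by a small slope. The snippet below is an independent sketch of that mapping on plain numbers, not the package's implementation; the slope of 0.01 is a common default assumed here for illustration, and the slope actually used by LRelu is not stated in this section.

// Independent sketch of the leaky ReLU mapping on plain numbers.
// The slope 0.01 is an assumed common default, not LRelu's documented value.
function leakyRelu(x: number, slope = 0.01): number {
  return x > 0 ? x : slope * x;
}

console.log([-2, -0.5, 0, 1, 3].map((v) => leakyRelu(v)));
// [-0.02, -0.005, 0, 1, 3]
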
Pure-Deno implementation of a neural network, inspired by Micrograd.
Add Package

Deno:  deno add jsr:@sauber/neurons
pnpm:  pnpm i jsr:@sauber/neurons  (or: pnpm dlx jsr add @sauber/neurons)
Yarn:  yarn add jsr:@sauber/neurons  (or: yarn dlx jsr add @sauber/neurons)
vlt:   vlt install jsr:@sauber/neurons
npm:   npx jsr add @sauber/neurons
Bun:   bunx jsr add @sauber/neurons

Import symbol

import { LRelu } from "@sauber/neurons";

With Deno, the symbol can also be imported directly with a jsr: specifier:

import { LRelu } from "jsr:@sauber/neurons";
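
A minimal usage sketch follows. It assumes that LRelu can be constructed with no arguments and that the package also exports the Value type appearing in the forward signature; both are assumptions made for illustration, since this section documents only forward(x: Value[]): Value[].

// Hypothetical usage sketch: the no-argument LRelu constructor, the Value
// export, and the Value(number) constructor are assumptions for illustration.
import { LRelu, Value } from "@sauber/neurons";

const layer = new LRelu();
const inputs: Value[] = [new Value(-1), new Value(2)];
const outputs: Value[] = layer.forward(inputs); // one activated Value per input
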