Built and signed on GitHub Actions

A simple neural network library for JavaScript and TypeScript.

This package works with Cloudflare Workers, Node.js, Deno, Bun, and Browsers.
JSR Score: 100%
Published: 4 weeks ago (1.0.0)
class ReLU extends Activation

Rectified Linear Unit (ReLU) activation function. Outputs the input directly if it is positive; otherwise, it outputs zero. Formula: ReLU(x) = max(0, x). For example, ReLU(3) = 3 and ReLU(-2) = 0.

Properties

private
lastInput: number[][]

Stores the input from the forward pass, used during the backward pass.

Methods

backward(outputGradient: number[][]): number[][]

Computes the gradient of the loss with respect to the input of the ReLU layer. The derivative of ReLU(x) is 1 if x > 0 and 0 otherwise, so the returned gradient is the element-wise product of outputGradient with that 0/1 mask over the stored input.

forward(input: number[][]): number[][]

Applies the ReLU function element-wise to the input. ReLU(x) = max(0, x)
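
For reference, here is a minimal sketch of how a layer with this interface could be implemented. The Activation base class shown here is an assumption inferred from the documented signatures; this is illustrative, not the package's actual source.

// Assumed base class, inferred from the documented method signatures.
abstract class Activation {
  abstract forward(input: number[][]): number[][];
  abstract backward(outputGradient: number[][]): number[][];
}

// Illustrative ReLU sketch matching the documentation above.
class ReLU extends Activation {
  // Stores the input from the forward pass, used during the backward pass.
  private lastInput: number[][] = [];

  // Applies ReLU(x) = max(0, x) element-wise and caches the input.
  forward(input: number[][]): number[][] {
    this.lastInput = input;
    return input.map((row) => row.map((x) => Math.max(0, x)));
  }

  // Chain rule: the gradient passes through where the cached input was
  // positive and is zeroed elsewhere. Assumes forward() was called first.
  backward(outputGradient: number[][]): number[][] {
    return outputGradient.map((row, i) =>
      row.map((g, j) => (this.lastInput[i][j] > 0 ? g : 0))
    );
  }
}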


Add Package (Deno)

deno add jsr:@am/neuralnetwork

Import symbol

import { ReLU } from "@am/neuralnetwork/layers";
or

Import directly with a jsr specifier

import { ReLU } from "jsr:@am/neuralnetwork/layers";

Add Package (pnpm)

pnpm i jsr:@am/neuralnetwork
or (using pnpm 10.8 or older)
pnpm dlx jsr add @am/neuralnetwork

Import symbol

import { ReLU } from "@am/neuralnetwork/layers";

Add Package (Yarn)

yarn add jsr:@am/neuralnetwork
or (using Yarn 4.8 or older)
yarn dlx jsr add @am/neuralnetwork

Import symbol

import { ReLU } from "@am/neuralnetwork/layers";

Add Package (vlt)

vlt install jsr:@am/neuralnetwork

Import symbol

import { ReLU } from "@am/neuralnetwork/layers";

Add Package (npm)

npx jsr add @am/neuralnetwork

Import symbol

import { ReLU } from "@am/neuralnetwork/layers";

Add Package (Bun)

bunx jsr add @am/neuralnetwork

Import symbol

import { ReLU } from "@am/neuralnetwork/layers";
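
Usage example

A small usage sketch, assuming a no-argument constructor and the forward/backward signatures documented above (the rest of the network API is not shown on this page):

import { ReLU } from "@am/neuralnetwork/layers";

const relu = new ReLU(); // assumes a no-argument constructor

// Forward pass: negative entries are zeroed, positive entries pass through.
const output = relu.forward([[-1, 2], [3, -4]]);
// output: [[0, 2], [3, 0]]

// Backward pass: gradients flow only where the input was positive.
const inputGradient = relu.backward([[1, 1], [1, 1]]);
// inputGradient: [[0, 1], [1, 0]]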