Represents a fully connected (dense) layer in a neural network. Implements both the Layer and TrainableLayer interfaces.
inputUnits: number
outputUnits: number
Performs the backward pass (backpropagation) through the dense layer. Calculates the gradient of the loss with respect to the layer's input: dL/dX_i = Σ_j (dL/dY_j * W_ij)
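A minimal sketch of this computation, assuming the weights are stored as an inputUnits × outputUnits matrix; the function name `denseBackward` is illustrative, not part of the class's API:

```typescript
// Gradient of the loss w.r.t. the layer input: dL/dX_i = Σ_j dL/dY_j * W[i][j].
// `weights` is assumed to be inputUnits × outputUnits.
function denseBackward(outputGradient: number[], weights: number[][]): number[] {
  return weights.map(row =>
    row.reduce((sum, w, j) => sum + w * outputGradient[j], 0)
  );
}
```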
Performs the forward pass through the dense layer. Output = Input * Weights + Biases
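The forward formula above can be sketched as follows, again assuming an inputUnits × outputUnits weight matrix (`denseForward` is a hypothetical name, not the class's method):

```typescript
// Forward pass: Y_j = Σ_i X_i * W[i][j] + B_j.
function denseForward(input: number[], weights: number[][], biases: number[]): number[] {
  return biases.map((b, j) =>
    input.reduce((sum, x, i) => sum + x * weights[i][j], b)
  );
}
```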
Returns the number of input units.
Returns the number of output units.
getWeightGradients(): Map<string, number[] | number[][]>
Calculates the gradients of the loss with respect to the layer's weights and biases:
dL/dW_ij = dL/dY_j * X_i
dL/dB_j = dL/dY_j
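These two formulas amount to an outer product for the weights and a pass-through for the biases. A sketch under the same assumptions as above (the `{ dW, dB }` return shape is illustrative; the actual method packs its results into a Map):

```typescript
// dL/dW_ij = dL/dY_j * X_i (outer product of input and output gradient);
// dL/dB_j = dL/dY_j (biases receive the output gradient directly).
function weightGradients(
  input: number[],
  outputGradient: number[]
): { dW: number[][]; dB: number[] } {
  const dW = input.map(x => outputGradient.map(g => x * g));
  const dB = [...outputGradient];
  return { dW, dB };
}
```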
getWeights(): Map<string, number[] | number[][]>
Retrieves the current weights and biases of the layer.
initializeBiases(): number[]
Initializes the biases for the layer. Biases are initialized with random values between 0 and 1.
initializeWeights(): number[][]
Initializes the weights for the layer. Weights are initialized with random values between 0 and 1.
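The documented initialization (uniform random values between 0 and 1) could be sketched like this; the free-function signatures are hypothetical, since the actual methods take no arguments and read the unit counts from the layer:

```typescript
// Weights: inputUnits × outputUnits matrix of uniform random values in [0, 1).
function initializeWeights(inputUnits: number, outputUnits: number): number[][] {
  return Array.from({ length: inputUnits }, () =>
    Array.from({ length: outputUnits }, () => Math.random())
  );
}

// Biases: one uniform random value in [0, 1) per output unit.
function initializeBiases(outputUnits: number): number[] {
  return Array.from({ length: outputUnits }, () => Math.random());
}
```

Note that uniform [0, 1) initialization is the simplest possible scheme; variance-scaled schemes such as Xavier or He initialization are more common in practice, but the documented behavior here is plain uniform.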
updateWeights(updatedWeights: Map<string, number[] | number[][]>): void
Updates the layer's weights and biases.
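In a training loop, the updated weights passed to this method would typically come from a gradient-descent step. A minimal sketch of that arithmetic, with the Map-based plumbing omitted (`applyGradient` and the plain-SGD rule W ← W − η * dL/dW are assumptions for illustration, not part of this API):

```typescript
// One plain gradient-descent step on a weight matrix: W ← W − η * dL/dW.
function applyGradient(
  weights: number[][],
  gradients: number[][],
  learningRate: number
): number[][] {
  return weights.map((row, i) =>
    row.map((w, j) => w - learningRate * gradients[i][j])
  );
}
```

The result would then be wrapped back into the `Map<string, number[] | number[][]>` shape expected by updateWeights.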