rmath.nn

Neural network building blocks: activation functions, loss functions, normalization layers, and pooling. All operate on rmath.Array.

Activation Functions

| Function | Description |
| --- | --- |
| relu(a) | Rectified linear unit: max(0, x) |
| leaky_relu(a, alpha) | Leaky ReLU with slope alpha for negative inputs |
| sigmoid(a) | Logistic sigmoid: 1 / (1 + e⁻ˣ) |
| gelu(a) | Gaussian error linear unit |
| softmax(a) | Row-wise softmax (numerically stable) |
| hardswish(a) | Hard-swish activation |
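rmath's implementation isn't shown here, but the stability trick behind a numerically stable softmax is standard: subtract each row's maximum before exponentiating, which changes nothing mathematically but keeps exp() from overflowing. A NumPy sketch:

```python
import numpy as np

def softmax(a):
    # Subtracting the row max leaves the result unchanged
    # (exp(x - m) / sum(exp(x - m)) == exp(x) / sum(exp(x)))
    # but bounds every exponent at 0, so exp() cannot overflow.
    shifted = a - a.max(axis=1, keepdims=True)
    e = np.exp(shifted)
    return e / e.sum(axis=1, keepdims=True)

x = np.array([[1.0, 2.0, 3.0],
              [1000.0, 1000.0, 1000.0]])  # naive exp(1000) would overflow
probs = softmax(x)
```

Each row of the result sums to 1; the second row comes out uniform even though a naive implementation would produce NaNs.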

Loss Functions

| Function | Returns | Description |
| --- | --- | --- |
| mse_loss(a, target) | float | Mean squared error |
| cross_entropy_loss(a, labels) | float | Cross-entropy loss. Labels are integer class indices. |
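To illustrate what "labels are integer class indices" means (this is a NumPy sketch of the usual definition, not rmath's code), cross-entropy over logits picks out the log-probability of the correct class in each row and averages the negatives:

```python
import numpy as np

def cross_entropy_loss(logits, labels):
    # Log-softmax computed stably: shift by the row max first.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # labels[i] is the integer class index for row i; gather those entries.
    return -log_probs[np.arange(len(labels)), labels].mean()

logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 3.0, 0.2]])
labels = np.array([0, 1])  # one class index per row, not one-hot vectors
loss = cross_entropy_loss(logits, labels)
```

Note the labels are plain indices (0 and 1 here), not one-hot vectors.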

Normalization & Regularization

| Function | Description |
| --- | --- |
| batch_norm(a, mu, sigma, gamma, beta) | Batch normalization. mu, sigma, gamma, beta are Vectors. |
| layer_norm(a, eps) | Layer normalization; eps is an epsilon added for numerical stability. |
| dropout(a, p) | Randomly zeroes elements with probability p. |
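As a sketch of the batch-norm arithmetic (the per-feature parameter shapes and the assumption that sigma is a standard deviation are mine, not confirmed by rmath's docs), each column is normalized by mu and sigma, then scaled by gamma and shifted by beta:

```python
import numpy as np

def batch_norm(a, mu, sigma, gamma, beta, eps=1e-5):
    # Normalize each feature (column): (a - mu) / sqrt(sigma^2 + eps),
    # then apply the learned per-feature scale gamma and shift beta.
    # eps guards against division by zero; its presence here is an assumption.
    return gamma * (a - mu) / np.sqrt(sigma ** 2 + eps) + beta

a = np.array([[1.0, 2.0],
              [3.0, 4.0]])
mu = a.mean(axis=0)     # per-feature mean
sigma = a.std(axis=0)   # per-feature standard deviation (assumed, not variance)
gamma = np.ones(2)      # scale
beta = np.zeros(2)      # shift
out = batch_norm(a, mu, sigma, gamma, beta)
```

With gamma = 1 and beta = 0, each output column has mean zero and (up to eps) unit variance.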

Pooling

| Function | Returns | Description |
| --- | --- | --- |
| max_pool2d(a, kernel_size) | Array | 2D max pooling |
| avg_pool2d(a, kernel_size) | Array | 2D average pooling |
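A minimal NumPy sketch of 2D max pooling, assuming non-overlapping windows (stride equal to kernel_size — rmath's actual stride behavior isn't documented above):

```python
import numpy as np

def max_pool2d(a, kernel_size):
    # Tile the array into non-overlapping kernel_size x kernel_size blocks,
    # trimming any trailing rows/columns that don't fill a full window,
    # and take the max within each block.
    k = kernel_size
    h, w = a.shape[0] // k * k, a.shape[1] // k * k
    blocks = a[:h, :w].reshape(h // k, k, w // k, k)
    return blocks.max(axis=(1, 3))

a = np.arange(16, dtype=float).reshape(4, 4)
out = max_pool2d(a, 2)  # 4x4 input, 2x2 windows -> 2x2 output
```

avg_pool2d follows the same pattern with `.mean(axis=(1, 3))` in place of the max.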