# gradator
A pedagogical C++20 automatic differentiation library implementing reverse-mode AD with a functional API.
## Features

- Functional API: `grad(f)` returns a callable, enabling `grad(grad(f))` for higher-order derivatives (see the sketch after this list)
- Explicit context: No global state; the computational graph is passed explicitly
- Type distinction: `var<T>` (differentiable) vs `val<T>` (constant)
- Concept-based: Works with any type satisfying the `Matrix` concept
- Comprehensive operations with analytical Jacobians:
  - Arithmetic: `+`, `-`, `*`, `/`, negation
  - Transcendental: `exp`, `log`, `sqrt`, `pow`
  - Matrix: `matmul`, `transpose`, `trace`, `sum`, `hadamard`
  - Linear algebra: `det`, `logdet`, `inverse`, `solve`
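A minimal sketch of that higher-order composition, assuming (as in the Usage example below) that a scalar is modeled as a 1×1 matrix, so that `grad(f)` is itself scalar-valued and differentiable:

```cpp
#include <gradator.hpp>

using namespace gradator;
using namespace elementa;

int main() {
    // f(x) = x^3, with x a 1x1 "scalar" matrix
    auto f = [](const auto& x) { return sum(pow(x, 3.0)); };

    matrix<double> x{{3.0}};
    auto d1 = grad(f)(x);       // f'(x)  = 3x^2 = 27
    auto d2 = grad(grad(f))(x); // f''(x) = 6x   = 18
    return 0;
}
```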
## Requirements
- C++20 compiler (GCC 12+, Clang 15+)
- CMake 3.20+
- elementa (expected as a sibling directory)
## Usage
```cpp
#include <gradator.hpp>

using namespace gradator;
using namespace elementa;

int main() {
    // Define a function
    auto f = [](const auto& x) {
        return sum(pow(x, 2.0)); // f(x) = sum(x^2)
    };

    // Compute gradient
    auto df = grad(f);
    matrix<double> x{{1, 2}, {3, 4}};
    auto gradient = df(x); // gradient = 2*x

    // Higher-order: Hessian
    auto H = hessian(f)(x); // H = 2*I

    return 0;
}
```
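As a sanity check on the comments above: for f(x) = Σᵢⱼ xᵢⱼ², each partial derivative is ∂f/∂xᵢⱼ = 2xᵢⱼ, so the gradient at the sample input is {{2, 4}, {6, 8}}, and because f is quadratic the Hessian is the constant 2I.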
## Building

```sh
mkdir build && cd build
cmake ..
cmake --build .
ctest   # Run tests
```
## Examples

### Gradient Descent

```cpp
auto loss = [](const auto& theta) {
    return sum(pow(theta, 2.0));
};

auto grad_loss = grad(loss);
matrix<double> theta{{1}, {2}};
double lr = 0.1;

for (int i = 0; i < 100; ++i) {
    auto g = grad_loss(theta);
    theta = theta - g * lr;
}
```
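Because the gradient of this loss is 2θ, each iteration computes θ ← θ − 0.1 · 2θ = 0.8θ, so after 100 steps θ has contracted by a factor of 0.8¹⁰⁰ ≈ 2 × 10⁻¹⁰ toward the minimizer at zero.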
### With Constants (val)

```cpp
auto loss = [](const auto& x) {
    val<matrix<double>> A(some_matrix); // Constant, not differentiated
    return sum(matmul(A, x));
};

auto gradient = grad(loss)(input); // Only w.r.t. x
```
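For a vector input, f(x) = sum(Ax) is linear in x, so its gradient is the constant Aᵀ𝟙, i.e. the vector of column sums of A; because A is wrapped in `val`, it receives no gradient of its own.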
## Core Types

| Type | Description |
|---|---|
| `graph` | Computational graph holding all nodes |
| `var<T>` | Differentiable variable (participates in gradient computation) |
| `val<T>` | Constant (not differentiated) |
## Operations

All operations record their Jacobians for the automatic backward pass:

| Category | Operations |
|---|---|
| Arithmetic | `+`, `-`, `*`, `/`, unary `-` |
| Transcendental | `exp`, `log`, `sqrt`, `pow` |
| Matrix | `matmul`, `transpose`, `trace`, `sum`, `hadamard` |
| Linear Algebra | `det`, `logdet`, `inverse`, `solve` |
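To give a flavor of these analytical Jacobians, two standard matrix-calculus identities of the kind a reverse pass records (textbook results, not code taken from gradator; the second assumes X is invertible with positive determinant):

$$
\frac{\partial}{\partial X}\operatorname{tr}(AX) = A^{\mathsf{T}},
\qquad
\frac{\partial}{\partial X}\log\det X = X^{-\mathsf{T}}
$$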
## Testing

Gradients are verified against finite differences:

```cpp
auto analytical = grad(f)(x);
auto numerical  = finite_diff_gradient(f, x);
// These should match within tolerance
```
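For reference, a standalone sketch of the central-difference scheme such a check is based on; the `finite_diff` helper and the `std::vector` representation here are illustrative, not gradator's actual `finite_diff_gradient`:

```cpp
#include <cstddef>
#include <functional>
#include <vector>

// Central differences: g_i ≈ (f(x + h*e_i) - f(x - h*e_i)) / (2h)
std::vector<double> finite_diff(
    const std::function<double(const std::vector<double>&)>& f,
    std::vector<double> x, double h = 1e-6) {
    std::vector<double> g(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) {
        const double xi = x[i];
        x[i] = xi + h; const double fp = f(x); // forward point
        x[i] = xi - h; const double fm = f(x); // backward point
        x[i] = xi;                             // restore x
        g[i] = (fp - fm) / (2.0 * h);
    }
    return g;
}

int main() {
    // f(x) = sum(x^2), whose analytical gradient is 2x
    auto f = [](const std::vector<double>& v) {
        double s = 0.0;
        for (double t : v) s += t * t;
        return s;
    };
    auto g = finite_diff(f, {1.0, 2.0, 3.0}); // ≈ {2, 4, 6}
    return 0;
}
```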
## Design Philosophy
- Pedagogical clarity: Demonstrates autograd fundamentals
- Functional style: Immutable nodes, pure functions
- Mathematical transparency: Jacobians visible in code comments
- Stepanov-inspired: Generic over matrix types via concepts
## License
MIT