# 🖼️ Example gallery

Gradient Accumulation.

Meta-Learning.

Optimistic Gradient Descent in a Bilinear Min-Max Problem.
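The idea behind this example: on the bilinear game f(x, y) = x·y, plain simultaneous gradient descent-ascent cycles or diverges, while the optimistic update (which extrapolates using the previous gradient) converges to the equilibrium (0, 0). A from-scratch sketch of the optimistic update rule, written with raw JAX for self-containment (step size and iteration count are illustrative choices, not taken from the gallery example):

```python
import jax
import jax.numpy as jnp

# Bilinear game: min over x, max over y of f(x, y) = x * y.
# Unique equilibrium at (0, 0).
f = lambda x, y: x * y
eta = 0.1

x = jnp.array(1.0)
y = jnp.array(1.0)
gx_prev = jnp.array(0.0)
gy_prev = jnp.array(0.0)
for _ in range(500):
    gx = jax.grad(f, argnums=0)(x, y)  # = y
    gy = jax.grad(f, argnums=1)(x, y)  # = x
    # Optimistic step: use 2 * current gradient minus the previous one.
    x = x - eta * (2 * gx - gx_prev)   # descent in x
    y = y + eta * (2 * gy - gy_prev)   # ascent in y
    gx_prev, gy_prev = gx, gy
```

Replacing the optimistic step with the plain `x - eta * gx` / `y + eta * gy` update makes the iterates spiral outward instead of contracting, which is the contrast the gallery example illustrates.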

ResNet on CIFAR10 with Flax and Optax.

Train the parameters of a Flax module.

Adam optimizer and lookahead wrapper on the MNIST dataset.

Train an MLP classifier on MNIST using Optax.

Adversarial training of CNN on MNIST.

Character-level Transformer on Tiny Shakespeare.

Using LBFGS and linesearch.

Differentiable functions with perturbations.

Solving the linear assignment problem.

## Contrib Examples

Examples that make use of the 🔧 Contrib module.

Differentially private convolutional neural network on MNIST.

Example usage of the reduce_on_plateau learning rate scheduler.

Sharpness-Aware Minimization (SAM).