Training a pattern classifier almost always amounts to solving an optimization problem. Most optimization algorithms require the gradient of the objective with respect to the parameters being optimized, and some also require second-order derivatives. Deriving and coding such derivatives by hand is time-consuming and error-prone; automatic differentiation removes this burden. Automatic differentiation should not be confused with numerical differentiation or symbolic differentiation: numerical differentiation needs one extra function evaluation per parameter and suffers from truncation and round-off error, while symbolic differentiation tends to produce unwieldy expressions, so both are of little practical use for optimization. I'll discuss and demonstrate my MATLAB implementations of two flavours of automatic differentiation and show an example of how they can be used to solve a simple pattern recognition problem.
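To make the idea concrete before diving into the MATLAB code, here is a minimal sketch of one flavour of automatic differentiation (forward mode, implemented with dual numbers) in Python. This is purely illustrative and is not the MATLAB implementation discussed below; the class and function names are my own.

```python
# Minimal forward-mode automatic differentiation via dual numbers.
# Illustrative sketch only; names (Dual, derivative) are hypothetical.

class Dual:
    """A number a + b*eps with eps^2 = 0; b carries the derivative."""
    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (f + g)' = f' + g'
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (f * g)' = f*g' + f'*g
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)
    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate f and df/dx at x in a single pass."""
    result = f(Dual(x, 1.0))
    return result.value, result.deriv


# Example: f(x) = 3*x*x + 2*x, so f(4) = 56 and f'(4) = 6*4 + 2 = 26.
val, grad = derivative(lambda x: 3 * x * x + 2 * x, 4.0)
```

Unlike numerical differentiation, the derivative here is exact (up to floating-point rounding), and unlike symbolic differentiation, no expression for f'(x) is ever built: the chain rule is applied numerically as the function executes. MATLAB's operator overloading permits the same construction with a custom class.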