The Fréchet derivative is the "infinite-dimensional" analogue of the usual derivative. In particular, one should think of the derivative as a "first-order" linear approximation of your function.
For example, in the usual case of a function f : R^n --> R, we have a representation of the derivative (the gradient) as a row of partial derivatives (df/dx_1, ..., df/dx_n). This is really a matrix representation of a linear map T : R^n --> R that "best approximates" f near a point, in a sense that can be made precise.
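The sense in which the gradient "best approximates" f can be written out explicitly; here h in R^n is a small increment, and the little-o term records that the error shrinks faster than linearly:

```latex
% First-order approximation by the gradient: the linear map is
% T(h) = \nabla f(x) \cdot h, and the error is sublinear in h.
f(x + h) = f(x) + \nabla f(x) \cdot h + o(\lVert h \rVert)
\quad \text{as } h \to 0.
```

No other linear map satisfies this, which is why T (equivalently, the gradient) is uniquely determined when it exists.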
The Fréchet derivative does something similar, but for infinite-dimensional vector spaces. That means we consider a vector space equipped with what is called a "norm", which is just a way to measure the lengths of vectors (and hence distances between them). If this normed space has no "holes" (i.e., it is complete, making it a Banach space), then the best linear approximation of a function between such spaces, when it exists, is called its Fréchet derivative.
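The definition mirrors the finite-dimensional one above, just with norms in place of absolute values. Here V and W are assumed to be Banach spaces and U an open subset of V:

```latex
% Definition: f : U \subseteq V \to W is Fréchet differentiable at
% x \in U if there exists a bounded linear map T : V \to W with
\lim_{\lVert h \rVert_V \to 0}
  \frac{\lVert f(x + h) - f(x) - T h \rVert_W}{\lVert h \rVert_V} = 0.
% When such a T exists it is unique, and one writes T = Df(x).
```

Note that boundedness of T is automatic in finite dimensions but must be required explicitly here, since infinite-dimensional spaces admit unbounded linear maps.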