"How Good an Estimator is Conditional Expectation?"
Prof. Gary Wise
UT Austin
Friday, March 5th, 2:00 PM, ENS 302
wise@mail.utexas.edu
Abstract
In many studies of estimation theory it is bluntly stated that conditional
expectation yields the minimum mean square error estimate. We will show
that this claim is false. Specifically, for any real number B, we will
exhibit a probability space, two bounded random variables X and Y defined
on that probability space, and a function f mapping the reals into the reals
such that using E[Y|X] to estimate Y results in a mean square error of at
least B, and yet, at the same time, f(X) = Y pointwise on the underlying
probability space. With this result to whet our interest, we will then
develop necessary and sufficient conditions for conditional expectation to
do the job. These conditions should be of interest to any engineer working
in estimation theory.
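As a minimal sketch of the standard claim being questioned (not part of the
speaker's abstract): the usual optimality statement quantifies only over Borel
measurable functions g of X, which is presumably where the counterexample
finds its room, since a Borel measurable f with f(X) = Y would force
E[Y|X] = Y almost surely and hence zero error. In LaTeX:
\[
  \mathbb{E}\!\left[ \bigl( Y - \mathbb{E}[Y \mid X] \bigr)^{2} \right]
  \;\le\;
  \mathbb{E}\!\left[ \bigl( Y - g(X) \bigr)^{2} \right]
  \quad \text{for every Borel measurable } g : \mathbb{R} \to \mathbb{R},
\]
\[
  \text{whereas the talk exhibits } f : \mathbb{R} \to \mathbb{R}
  \text{ with } f(X) = Y \text{ pointwise, yet }
  \mathbb{E}\!\left[ \bigl( Y - \mathbb{E}[Y \mid X] \bigr)^{2} \right] \ge B.
\]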
Biography
Gary L. Wise received his B. A. in electrical engineering and in
mathematics from Rice University in 1971. He received his M. A. and his M.
S. E. in electrical engineering, with a minor in mathematics, from Princeton
University in 1973. In 1974, he received his Ph.D. from Princeton
University. He is currently a Professor in Electrical and Computer
Engineering and in Mathematics at The University of Texas at Austin. He
held visiting positions in the Department of Statistics at the
University of California at Berkeley in 1989 and 1991, where he came by
the nickname of Dr. Counterexample. His first book was written with Eric
Hall and is entitled "Counterexamples in Probability and Real Analysis".
A list of Signal and Image Processing Seminars is available
from the ECE department Web pages under "Seminars".
The Web address for the Signal and Image Processing Seminars is
http://anchovy.ece.utexas.edu/seminars