Quote Originally Posted by musicheck
Here's a question I've been thinking a bit about recently.

In a linear regression, the least squares estimator is also the maximum likelihood estimator when we assume the errors are normally distributed. In a lot of financial work, a Cauchy distribution fits better than a normal. What are the properties of the maximum likelihood estimator of a regression coefficient under the assumption of Cauchy-distributed errors, and in what cases will this lead to an answer that's noticeably different from least squares?
Short story (I hate myself for spending one semester in Quantitative Methods V): if the errors are actually well-behaved, it's all the same shit. But with Cauchy tails it isn't. Least squares minimizes squared residuals, so one extreme observation can drag the whole fit, and since the Cauchy has no finite variance the OLS estimator doesn't even converge as n grows (a linear combination of Cauchy errors is still Cauchy). The Cauchy MLE, by contrast, has a bounded influence function, so it effectively downweights outliers; it's consistent and asymptotically normal. The two will differ noticeably exactly when the sample contains the big outliers the Cauchy predicts.
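
A quick way to see the difference is to simulate it. Here's a minimal sketch (my own example, not from the thread) that fits the same line by OLS and by Cauchy maximum likelihood, using `scipy.optimize.minimize` on the negative log-likelihood; the data, parameter values, and seed are all arbitrary choices for illustration:

```python
import numpy as np
from scipy.optimize import minimize

# Simulate a regression with Cauchy (heavy-tailed) errors.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-5, 5, n)
true_a, true_b = 2.0, 1.0
y = true_a + true_b * x + rng.standard_cauchy(n)

# OLS fit: minimizes sum of squared residuals.
X = np.column_stack([np.ones(n), x])
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Cauchy MLE: minimize the negative log-likelihood
#   -sum log f(r_i) with f the Cauchy density, r_i = y_i - a - b*x_i.
# Parameterize the scale as exp(log_s) to keep it positive.
def negloglik(params):
    a, b, log_s = params
    s = np.exp(log_s)
    r = y - a - b * x
    return np.sum(np.log(np.pi * s * (1.0 + (r / s) ** 2)))

res = minimize(negloglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
a_mle, b_mle = res.x[0], res.x[1]

print("OLS:", beta_ols[:2])
print("Cauchy MLE:", a_mle, b_mle)
```

On draws that happen to contain a few huge residuals, the OLS slope gets pulled around while the MLE stays near the truth; one caveat is that the Cauchy likelihood can be multimodal, so in practice you'd want a robust starting point (e.g. a median-based fit) rather than zeros.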