
MBTIc Math thread

FDG

pathwise dependent
Joined
Aug 13, 2007
Messages
5,903
MBTI Type
ENTJ
Enneagram
7w8
Here's a question I've been thinking a bit about recently.

In a linear regression, the least squares estimator is also the maximum likelihood estimator when we assume the errors are normally distributed. In a lot of financial work, a Cauchy distribution fits better than a normal. What are the properties of the maximum likelihood estimator of a regression coefficient under the assumption of Cauchy-distributed errors, and in what cases will this lead to an answer that's noticeably different from least squares?
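A rough way to see where the two can diverge: the Cauchy log-likelihood grows only logarithmically in the residual, so extreme observations get bounded influence, whereas OLS squares them. With heavy-tailed data, a single outlier can drag the OLS slope while barely moving the Cauchy MLE. A minimal sketch of that comparison (simulated data and all names are illustrative, not from the thread):

[CODE]
import numpy as np
from scipy.optimize import minimize
from scipy.stats import cauchy

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
# True model: intercept 1.0, slope 2.0, heavy-tailed (Cauchy) noise
y = 1.0 + 2.0 * x + 0.5 * rng.standard_cauchy(n)

# OLS (which is also the Gaussian MLE)
X = np.column_stack([np.ones(n), x])
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Cauchy MLE: minimize the negative log-likelihood of the residuals,
# with the scale parameterized on the log scale to keep it positive
def neg_loglik(params):
    a, b, log_scale = params
    resid = y - (a + b * x)
    return -np.sum(cauchy.logpdf(resid, scale=np.exp(log_scale)))

res = minimize(neg_loglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
beta_cauchy = res.x[:2]

print("OLS:        intercept %.3f, slope %.3f" % tuple(beta_ols))
print("Cauchy MLE: intercept %.3f, slope %.3f" % tuple(beta_cauchy))
[/CODE]

With well-behaved (near-normal) residuals the two estimates come out close; inject a few gross outliers and the OLS coefficients move much more than the Cauchy MLE ones. The likelihood here is non-convex, so a multi-start or a robust-regression routine would be safer in practice.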

Short story (I hate myself for spending a semester on Quantitative Methods V): if you use linear regression, it's all the same shit.