[img]https://na.cx/i/56G0244.png[/img]
[img]https://na.cx/i/p53fz50.png[/img]
6.2 (Part 1)
This part covers the Cramer-Rao Lower Bound.
Before getting there, we first define two new conditions, (R3) and (R4), both of which are smoothness requirements on the pdf.
With (R3) and (R4) in hand, we can differentiate the identity "∫f = 1" twice, which gives us (6.2.2) and (6.2.3).
Don't ask me when exactly you're allowed to move the derivative inside the integral.
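A sketch of those two differentiations, assuming (as (R3)/(R4) are there to guarantee) that we may differentiate under the integral sign. The equation numbers match the ones named in the text, though the exact displayed form in the book may differ slightly:

```latex
% Start from \int f(x;\theta)\,dx = 1 and differentiate in \theta.
% First derivative, using \partial_\theta f = f\,\partial_\theta \log f:
\int \frac{\partial f(x;\theta)}{\partial \theta}\,dx
  = \int \frac{\partial \log f(x;\theta)}{\partial \theta}\, f(x;\theta)\,dx = 0
\;\Longrightarrow\;
E\!\left[\frac{\partial \log f(X;\theta)}{\partial \theta}\right] = 0 \tag{6.2.2}

% Differentiate once more (product rule on the integrand above):
\int \frac{\partial^2 \log f}{\partial \theta^2}\, f\,dx
  + \int \left(\frac{\partial \log f}{\partial \theta}\right)^{\!2} f\,dx = 0
\;\Longrightarrow\;
E\!\left[\left(\frac{\partial \log f(X;\theta)}{\partial \theta}\right)^{\!2}\right]
  = -\,E\!\left[\frac{\partial^2 \log f(X;\theta)}{\partial \theta^2}\right] \tag{6.2.3}
```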
After defining the score function,
(6.2.2) and (6.2.3) give us the mean and variance of the score.
And we see that the score has mean 0, while the variance of the score is what we define as the Fisher Information.
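A quick sanity check of those two facts on a concrete example of my own choosing (not from the text): for X ~ N(θ, 1) the score is ∂/∂θ log f(x;θ) = x − θ, and the Fisher Information is I(θ) = 1. A Monte Carlo run should show the score averaging to 0 with variance close to 1:

```python
import random

# Illustrative sketch: X ~ N(theta, 1), score(x) = x - theta, I(theta) = 1.
random.seed(0)
theta = 2.0
n = 200_000
scores = [random.gauss(theta, 1.0) - theta for _ in range(n)]

mean_score = sum(scores) / n
# E[score^2] estimates the variance directly, since the true mean is 0.
var_score = sum(s * s for s in scores) / n

print(round(mean_score, 2))  # close to 0
print(round(var_score, 2))   # close to I(theta) = 1
```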
What we talked about above is the score and Fisher Information of a single sample.
When we have multiple i.i.d. samples, we can view the whole bunch as one random vector,
and then define the score and Fisher Information of that vector.
But the score and Fisher Information of multiple samples are just the single-sample ones added up,
so we only need to care about how to compute the single-sample case.
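The additivity claim above in symbols (a sketch, for X_1, …, X_n i.i.d. with common pdf f):

```latex
% The joint pdf factorizes, so the log splits into a sum and the
% score of the whole sample is the sum of the individual scores:
\frac{\partial}{\partial \theta} \log \prod_{i=1}^{n} f(X_i;\theta)
  = \sum_{i=1}^{n} \frac{\partial \log f(X_i;\theta)}{\partial \theta}

% Independence makes the variances add, so the sample's Fisher Information is
I_n(\theta)
  = \operatorname{Var}\!\left( \sum_{i=1}^{n}
      \frac{\partial \log f(X_i;\theta)}{\partial \theta} \right)
  = \sum_{i=1}^{n} I(\theta) = n\, I(\theta)
```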
After all that fiddling, is this "fisherman's information" actually that useful?
Useful enough for us IT dogs?
It turns out we can use it to develop the so-called Cramer-Rao Lower bound
(Theorem 6.2.1):
Under (R0)~(R4), the variance of any statistic Y (with mean k(θ)) has a lower bound, and it cannot go any lower!
In other words, as long as Y is an unbiased estimator whose Var attains this bound, it's the so-called MVUE!
(MVUE means the unbiased estimator with the minimum variance, i.e. the Minimum Variance Unbiased Estimator.)
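A worked example of my own (not from the text) of the bound being attained: for X_1, …, X_n i.i.d. N(θ, σ²) with σ known, I(θ) = 1/σ², so the CRLB for an unbiased estimator is 1/(nI(θ)) = σ²/n. The sample mean is unbiased with exactly this variance, so it attains the bound and is the MVUE. A simulation should show its variance sitting right at the bound:

```python
import random

# Illustrative sketch: X_i ~ N(theta, sigma^2), sigma known.
random.seed(1)
theta, sigma, n = 5.0, 2.0, 10
crlb = sigma**2 / n  # CRLB = 1 / (n * I(theta)) = sigma^2 / n

reps = 100_000
means = [sum(random.gauss(theta, sigma) for _ in range(n)) / n
         for _ in range(reps)]
bias = sum(means) / reps - theta
# Mean-squared deviation from the true theta; since x-bar is unbiased,
# this estimates Var(x-bar).
var_xbar = sum((m - theta) ** 2 for m in means) / reps

print(round(crlb, 3))      # 0.4
print(round(var_xbar, 2))  # close to 0.4, i.e. x-bar attains the bound
```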