PhD/Research Discussion Thread (21)

1001 Replies
14 Like 5 Dislike
2017-06-05 21:52:16
Any brothers/sisters here who worked for two or three years before going back for a PhD? Is it hard to get a prof to write you a reference letter after that long?

The longer I work, the more I feel I want to get back into research..

P.S. Master grad with Distinction
2017-06-05 22:29:09
As far as I know, entropy in information theory is about the minimum number of bits needed to represent symbols with some given probabilities,
and then Huffman coding can achieve lossless compression.

And this also seems to be somehow related to KLD.
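A minimal sketch of that claim (the toy distribution is made up for illustration): Shannon entropy is the lower bound on average bits per symbol, and a Huffman code meets it exactly when the probabilities are powers of two.

import heapq
import math

def entropy(p):
    # Shannon entropy in bits: H(p) = -sum_i p_i * log2(p_i)
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def huffman_code(p):
    # Build a Huffman code for a {symbol: probability} dict.
    # Heap entries: (probability, tiebreak, {symbol: codeword so far});
    # the integer tiebreak keeps tuple comparison away from the dicts.
    heap = [(q, i, {s: ""}) for i, (s, q) in enumerate(p.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        q1, _, c1 = heapq.heappop(heap)  # two least probable subtrees
        q2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}        # left branch
        merged.update({s: "1" + w for s, w in c2.items()})  # right branch
        heapq.heappush(heap, (q1 + q2, count, merged))
        count += 1
    return heap[0][2]

p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # dyadic toy distribution
code = huffman_code(p)
avg_len = sum(p[s] * len(code[s]) for s in p)
print(code)                 # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(entropy(p), avg_len)  # 1.75 1.75 -- average length hits the entropy bound

For non-dyadic probabilities, Huffman stays within one bit of the entropy per symbol, which is the usual statement of the bound.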
2017-06-05 22:36:59
As far as I know, entropy in information theory is about the minimum number of bits needed to represent symbols with some given probabilities,
and then Huffman coding can achieve lossless compression.

And this also seems to be somehow related to KLD.

In a discrete probability setting, that's roughly it.

KLD is short for which divergence again?
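KLD is the Kullback–Leibler divergence, KL(p||q) = sum_i p_i * log2(p_i / q_i). The coding connection hinted at above: if the data follow p but the code is built for q, the average cost is the cross-entropy H(p, q) = H(p) + KL(p||q), so KL(p||q) is exactly the extra bits per symbol you pay for the mismatch. A minimal numerical check (both distributions made up for illustration):

import math

def entropy(p):
    # H(p) in bits
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    # Expected bits per symbol when data ~ p is coded with lengths -log2(q_i)
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    # KL(p || q) in bits
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]  # true distribution
q = [1/3, 1/3, 1/3]    # mismatched model
print(entropy(p))                        # 1.5
print(cross_entropy(p, q))               # ~1.585 (= log2 3)
print(kl_divergence(p, q))               # ~0.085
print(cross_entropy(p, q) - entropy(p))  # same ~0.085: H(p,q) = H(p) + KL(p||q)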
