As far as I know, entropy in information theory is about the minimum number of bits needed to represent a symbol that occurs with some probability.
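A minimal sketch of that idea in Python (the distribution here is made up purely for illustration): the Shannon entropy H(X) = -∑ p(x) log2 p(x) is the lower bound on the average number of bits per symbol for any lossless code over the source.

```python
import math

# A made-up symbol distribution, purely for illustration.
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Shannon entropy in bits: H(X) = -sum_x p(x) * log2(p(x)).
# No lossless code can beat this average length per symbol.
entropy = -sum(px * math.log2(px) for px in p.values())
print(entropy)  # 1.75 bits per symbol
```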
Then, using a Huffman code, you can achieve lossless compression.
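A rough Huffman-coding sketch (heapq-based, not production code) on the same made-up distribution; because these probabilities are powers of 1/2, the code lengths come out exactly -log2 p(x) and the average length hits the entropy.

```python
import heapq
import itertools

def huffman_code(p):
    """Build a Huffman code (symbol -> bitstring) for distribution p."""
    counter = itertools.count()  # tie-breaker so heapq never compares dicts
    heap = [(prob, next(counter), {sym: ""}) for sym, prob in p.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least-probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        # Prefix "0" onto one subtree's codes and "1" onto the other's.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(counter), merged))
    return heap[0][2]

p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(p)
avg_len = sum(p[s] * len(code[s]) for s in p)
print(code)     # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(avg_len)  # 1.75 -- matches the entropy for this dyadic distribution
```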
And this seems to be somehow related to KLD.
That's roughly the picture in the discrete probability setting.
What divergence does KLD stand for, anyway?
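For what it's worth, a sketch assuming KLD here means the Kullback–Leibler divergence, D_KL(P‖Q) = ∑ p(x) log2(p(x)/q(x)): the coding interpretation is that it equals the expected extra bits per symbol you pay when symbols drawn from P are coded with a code optimized for Q, i.e. cross-entropy minus entropy. The distributions p and q below are again made up for illustration.

```python
import math

def entropy(p):
    """H(P) in bits."""
    return -sum(px * math.log2(px) for px in p.values() if px > 0)

def cross_entropy(p, q):
    """H(P, Q) in bits: average length when symbols from P are coded
    with lengths -log2 q(x), i.e. a code optimal for Q."""
    return -sum(p[s] * math.log2(q[s]) for s in p if p[s] > 0)

def kl_divergence(p, q):
    """D_KL(P || Q) in bits: sum_x p(x) * log2(p(x) / q(x))."""
    return sum(p[s] * math.log2(p[s] / q[s]) for s in p if p[s] > 0)

p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # true source
q = {"a": 0.25, "b": 0.25, "c": 0.25, "d": 0.25}   # assumed (wrong) model

print(entropy(p))           # 1.75
print(cross_entropy(p, q))  # 2.0
print(kl_divergence(p, q))  # 0.25 == 2.0 - 1.75 (extra bits per symbol)
```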