Information Theory
Pages: 24
Date: 2019-08-06

Machine Learning, Srihari

Information Theory
Sargur N. Srihari

Topics
1. Entropy as an Information Measure
   1. Discrete variable definition; relationship to code length
   2. Continuous variable: differential entropy
2. Maximum Entropy
3. Conditional Entropy
4. Kullback-Leibler Divergence (Relative Entropy)
5. Mutual Information

Information Measure
• How much information is received when we observe a specific value for a discrete random variable x?
• The amount of information is the degree of surprise
  – A certain event carries no information
  – More information when the event is unlikely
• It depends on the probability distribution p(x); define a quantity h(x)
• If there are two unrelated events x and y, we want h(x,y) = h(x) + h(y)
• Thus we choose h(x) = -log2 p(x)
  – The negative sign ensures that the information measure is positive
• The average amount of information transmitted is the expectation with respect to p(x), referred to as the entropy:
  H(x) = -Σx p(x) log2 p(x)

Usefulness of Entropy
• Uniform distribution
  – A random variable x has 8 possible states, each equally likely
  – We would need 3 bits to transmit its value
  – Also, H(x) = -8 × (1/8) log2 (1/8) = 3 bits
• Non-uniform distribution
  – If x has 8 states with probabilities (1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64), then H(x) = 2 bits
• The non-uniform distribution has smaller entropy than the uniform one
  – This has an interpretation in terms of disorder

Relationship of Entropy to Code Length
• Take advantage of a non-uniform distribution to use shorter codes for more probable events
• If x has 8 states (a, b, c, d, e, f, g, h) with probabilities (1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64), we can use the codes 0, 10, 110, 1110, 111100, 111101, 111110, 111111
  – average code length = (1/2)×1 + (1/4)×2 + (1/8)×3 + (1/16)×4 + 4×(1/64)×6 = 2 bits
• This is the same as the entropy of the random variable
• A shorter code string is not possible, due to the need to disambiguate a string into its component parts
  – 11001110 is uniquely decoded as the sequence c, a, d

Relationship between Entropy and Shortest Coding Length
• Noiseless coding theorem of Shannon
  – Entropy is a lower bound on the number of bits needed to transmit a random variable
• Natural logarithms are used in relating entropy to other topics
  – Nats instead of bits

History of Entropy: from thermodynamics to information theory
• Entropy is the average amount…
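The entropy figures quoted for the uniform and non-uniform 8-state distributions can be checked with a few lines of Python. This is a minimal sketch; `entropy` and the distribution lists are illustrative names, not from the slides.

```python
from math import log2

def entropy(probs):
    """Entropy in bits, H = -sum_x p(x) log2 p(x), skipping zero-probability states."""
    return -sum(p * log2(p) for p in probs if p > 0)

uniform = [1/8] * 8
nonuniform = [1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64]

print(entropy(uniform))     # 3.0 bits
print(entropy(nonuniform))  # 2.0 bits
```

Both distributions here have probabilities that are powers of 1/2, so the entropies come out as exact integers; the non-uniform distribution is more predictable and correspondingly has lower entropy.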

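The variable-length coding example above can also be verified directly: the expected code length equals the 2-bit entropy, and the bit string 11001110 decodes unambiguously. A sketch under the assumption that the four least-probable states get the length-6 codewords 111100, 111101, 111110, 111111 (consistent with the average-length arithmetic); the helper names are illustrative.

```python
# Codeword table: shorter codes for more probable states (assumed assignment).
code = {"a": "0", "b": "10", "c": "110", "d": "1110",
        "e": "111100", "f": "111101", "g": "111110", "h": "111111"}
probs = {"a": 1/2, "b": 1/4, "c": 1/8, "d": 1/16,
         "e": 1/64, "f": 1/64, "g": 1/64, "h": 1/64}

# Expected code length = sum over states of p(state) * codeword length.
avg_len = sum(probs[s] * len(code[s]) for s in code)

def decode(bits, code):
    """Greedy left-to-right decoding; valid because no codeword
    is a prefix of another (a prefix code)."""
    inverse = {v: k for k, v in code.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

print(avg_len)                   # 2.0
print(decode("11001110", code))  # cad
```

The prefix property is what makes the decoding unambiguous without separators: the decoder can commit to a symbol as soon as its accumulated bits match a codeword.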