Basic Mathematics You Should Master


2017-08-17  21:22:40 

In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that bounds the total variation distance (or statistical distance) in terms of the Kullback–Leibler divergence. The inequality is tight up to constant factors.[1]
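In the discrete case, Pinsker's inequality reads δ(P, Q) ≤ √(D_KL(P‖Q)/2), with the KL divergence taken in nats. A minimal numeric check in Python (the helper names and the example distributions `p`, `q` are ours, chosen only for illustration):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in nats, for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance: half the L1 distance, for discrete distributions."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Two arbitrary example distributions on a 3-point space.
p = [0.1, 0.4, 0.5]
q = [0.25, 0.25, 0.5]

tv = total_variation(p, q)
kl = kl_divergence(p, q)

# Pinsker's inequality: TV is bounded by sqrt(KL / 2).
assert tv <= math.sqrt(kl / 2)
```

Here the bound is not tight for this particular pair, but it holds for every pair of distributions on the same space.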


In statistics, probability theory, and information theory, a statistical distance quantifies the distance between two statistical objects, which can be two random variables, or two probability distributions or samples, or the distance can be between an individual sample point and a population or a wider sample of points. 
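For the sample-to-sample case, one common statistical distance is the total variation distance between the two empirical distributions. A short sketch (the function name `empirical_tv` is ours, not a standard API):

```python
from collections import Counter

def empirical_tv(sample_a, sample_b):
    """Total variation distance between the empirical distributions of two samples."""
    ca, cb = Counter(sample_a), Counter(sample_b)
    na, nb = len(sample_a), len(sample_b)
    support = set(ca) | set(cb)
    # Half the L1 distance between the two empirical probability vectors.
    return 0.5 * sum(abs(ca[x] / na - cb[x] / nb) for x in support)

d = empirical_tv("aabbc", "abbbc")
```

The same idea extends to any pair of discrete samples; for continuous data one would first bin or smooth the samples.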

1. Statistical distance

2. Pinsker's inequality

3. Total variation distance of probability measures

4. σ-algebra

5. The definition of TV
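Two of the definitions listed above can be stated briefly. A σ-algebra is the collection of measurable events on which probability measures are defined, and the total variation distance is the largest disagreement between two measures over those events; a sketch of the standard definitions:

```latex
% A sigma-algebra \mathcal{F} on a set \Omega is a collection of subsets of
% \Omega that contains \Omega and is closed under complementation and
% countable unions.
%
% Total variation distance between probability measures P, Q on
% (\Omega, \mathcal{F}):
\delta(P, Q) = \sup_{A \in \mathcal{F}} \lvert P(A) - Q(A) \rvert
```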