Tag: entropy

50 What is the role of the logarithm in Shannon's entropy? 2014-02-19T17:33:54.330

43 Measuring entropy / information / patterns of a 2D binary matrix 2011-10-17T12:39:47.917

33 What does the Akaike Information Criterion (AIC) score of a model mean? 2016-06-01T00:04:04.030

31 What does entropy tell us? 2016-02-05T18:52:08.630

30 Entropy-based refutation of Shalizi's Bayesian backward arrow of time paradox? 2012-05-09T00:41:06.787

21 Why is Entropy maximised when the probability distribution is uniform? 2013-07-31T10:20:03.673

20 Comparison between MaxEnt, ML, Bayes and other kind of statistical inference methods 2010-11-28T13:12:18.827

20 Kullback-Leibler divergence WITHOUT information theory 2016-11-16T12:22:18.640

17 Good introduction to different kinds of entropy 2010-07-20T16:03:40.877

15 Bounding mutual information given bounds on pointwise mutual information 2011-06-24T17:48:21.227

14 Clustering: Should I use the Jensen-Shannon Divergence or its square? 2011-02-25T18:01:07.703

14 Does Breiman's random forest use information gain or Gini index? 2015-04-04T16:17:17.093

13 Is there any use for the quantity $\int f(x)^2 dx$ in statistics or information theory? 2011-04-24T15:05:10.177

13 What is empirical entropy? 2012-05-10T12:44:06.637

13 Statistical interpretation of Maximum Entropy Distribution 2013-07-31T20:44:01.823

13 Qualitatively, what is Cross Entropy? 2014-01-01T05:24:12.653

13 What is the meaning of the eigenvectors of a mutual information matrix? 2015-11-05T21:06:05.267

12 Definition and origin of “cross entropy” 2012-07-10T05:45:27.677

12 Deriving Negentropy. Getting stuck 2012-08-27T21:10:50.310

12 Is differential entropy always less than infinity? 2015-06-08T03:40:47.287

12 Prove that the maximum entropy distribution with a fixed covariance matrix is a Gaussian 2016-02-04T17:13:48.347

12 Typical set concept 2016-09-09T18:55:13.257

11 Differential Entropy 2013-02-18T16:13:45.037

9 How do I vertically stack two graphs with the same x scale, but a different y scale in R? 2010-08-18T14:55:55.133

9 "Since $x$ is near-gaussian, its PDF can be written as..." 2012-08-28T21:29:48.457

8 What is Shannon's source entropy? 2012-07-30T20:54:34.407

8 Comparison of entropy and distribution of bytes in compressed/encrypted data 2013-02-02T16:24:03.600

8 How to determine forecastability of a time series? 2014-12-05T22:17:20.690

7 Hypothesis testing that one time-series of a measure of entropy doesn't belong to a population 2010-08-18T15:19:44.807

7 Shannon entropy for non-stationary and non-linear signal 2011-07-15T10:56:53.870

7 Multinomial Logistic Loss vs (Cross Entropy vs Square Error) 2015-08-13T08:25:41.273

7 Entropy of distribution with uniform sub-distribution 2017-08-28T00:13:01.983

6 Test randomness of a generated password? 2012-04-04T16:47:48.370

6 Calculating entropy of a binary matrix 2012-07-14T23:50:27.007

6 Information gain and information gain ratio: Do I have to pick just one? 2012-09-01T14:15:08.993

6 Why can we use entropy to measure the quality of a language model? 2014-04-09T03:11:14.933

6 Does Random Forest ever compare the splitting of one node to the splitting of a **different** node? 2016-01-10T05:44:13.010

6 Interpretation of entropy in terms of coding length? 2016-05-31T13:02:29.703

6 Maximum entropy distribution of a proportion with known mean and variance? Is it a beta? 2016-07-25T17:41:23.790

6 Entropy of an image 2016-09-16T06:35:33.823

6 How to maximize min-entropy from a bounded log-normal distribution? 2016-10-16T02:53:31.067

5 Species Richness, Dominance and Diversity Differences 2011-07-25T16:28:34.833

5 Number of needed samples for entropy estimation 2012-10-19T07:46:02.490

5 How to avoid 0 determinant when sample covariance matrix has very small values 2013-02-19T10:53:19.263

5 Entropy of Inverse-Wishart distribution 2013-10-24T22:01:38.600

5 Difference between different kinds of entropy 2014-01-20T22:31:02.673

5 Entropy estimation for a symbol sequence 2014-04-08T15:13:27.927

5 Why am I getting information entropy greater than 1? 2014-04-26T02:28:25.233

5 Kozachenko-Leonenko entropy estimation 2014-05-14T13:16:52.367

5 How does the log(p(x,y)) normalize the point-wise mutual information? 2015-03-09T02:54:28.640

5 Why would perfectly similar data have 0 mutual information? 2015-07-14T17:01:27.600

5 Sum of squared Poisson probability masses 2015-07-21T10:15:47.217

5 Hypothesis test based on entropy 2015-08-04T15:19:55.870

5 TensorFlow Cross Entropy for Regression? 2016-07-12T00:08:31.183

5 Is "Shannon entropy" used incorrectly in machine learning related literature? 2017-05-11T18:33:54.413

5 Where does the Kullback-Leibler divergence come from? 2017-08-22T12:07:53.533

5 Why does entropy as an error measure lead to overfitting? 2018-02-08T10:43:31.373

4 All-Purpose Sample Entropy 2010-09-14T21:43:41.883

4 Asymmetry between high order and low order interaction terms 2010-11-29T02:08:24.257

4 Calculating the transfer entropy in R 2011-06-30T00:19:13.453

4 Struggling to understand the relationship between two definitions of entropy 2012-05-14T16:57:54.500

4 Information gain as a feature selection for 3-class classification problem 2012-05-15T05:54:23.387

4 How do I find fields related to a class in a naive Bayesian classifier? 2012-12-17T19:49:31.673

4 Alternative to Shannon's entropy when probability equal to zero 2013-04-24T12:57:24.720

4 Logistic regression and maximum entropy 2013-11-22T17:09:01.420

4 Interpreting Shannon entropy 2013-12-01T23:02:09.110

4 Compute Shannon entropy between every row of a large, sparse matrix 2014-03-24T20:07:31.053

4 $\phi$-divergence? 2014-04-28T20:36:42.787

4 Estimating entropy of multidimensional variable through dimension reduction 2014-06-01T21:44:23.430

4 Entropy and information content 2014-06-05T21:45:22.947

4 Difference between Weighted Average Entropy and Adjusted Mutual Information (for evaluating Clustering) 2014-09-24T14:33:20.100

4 Entropy of Sum vs Difference of Random Variables 2014-11-11T04:38:19.467

4 Is it possible to use SD instead of entropy? 2015-01-13T22:53:47.993

4 Decision tree with AdaBoost 2015-01-16T15:06:16.447

4 Sum of squared Negative Binomial probability masses 2015-07-21T10:17:22.253

4 Does correlation imply mutual information? 2015-07-31T08:39:50.063

4 Relationship between least-squares regression and information theory 2015-09-10T16:45:15.117

4 If a decision tree already has very low entropy, do we still need a random forest? 2015-09-24T18:24:35.620

4 Additivity of Shannon's entropy 2015-10-20T21:26:35.563

4 How to estimate a probability distribution 2015-10-21T16:57:15.070

4 Weighted entropy as a measure of diversity 2015-12-19T20:35:54.490

4 How to interpret differential entropy? 2017-01-14T11:07:18.457

4 Perplexity and cross-entropy for n-gram models 2017-06-16T19:31:40.570

4 Only minimizers of quadratic penalized least squares can be linear 2017-08-08T02:15:45.443

3 Property of entropy 2011-06-09T05:58:45.603

3 Significance of transfer entropy calculations 2012-07-02T11:49:50.050

3 Maximum entropy of dice 2012-10-27T21:01:21.837

3 ID3 and C4.5: how does “gain ratio” normalize “gain”? 2012-11-05T13:50:23.897

3 Multiplying two entropy values 2013-02-09T11:20:29.333

3 Is Mutual Information really invariant to invertible transformations? 2013-02-17T15:42:09.487

3 Lognormal Distribution as Maximum Entropy Probability Distribution 2013-02-25T23:05:31.260

3 Word entropy / frequency in human speech 2013-04-01T18:15:50.470

3 Hypothetical measure of variability similar to entropy 2013-06-03T07:18:54.260

3 Regression forest: Leaf node and information gain 2013-08-05T16:18:08.907

3 Maximum entropy and non-informative distribution 2013-08-16T03:04:39.260

3 Additional parameter adds entropy to the centre of the density function 2013-08-18T09:22:25.983

3 Why does entropy increase with dispersion for continuous but not for discrete distributions? 2013-10-27T21:15:30.070

3 When and why does the "brittleness" of mutual information cause overfitting? 2014-02-14T19:54:14.897

3 Using similarity matrix to measure diversity of a group 2014-05-09T13:27:52.697