Science Log


23-mar-2015

Quote: "no testimony is sufficient to establish a miracle, unless the testimony be of such kind, that its falsehood would be more miraculous, than the fact which it endeavours to establish." David Hume.

Reading: Why Most Published Research Findings Are False, John P. A. Ioannidis, PLoS Med (2005) 2(8): e124. DOI: 10.1371/journal.pmed.0020124
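
The arithmetic behind Ioannidis's argument is his positive predictive value formula: with pre-study odds R that a tested relationship is real, significance level alpha, and power 1 - beta, the chance that a "significant" finding is true is PPV = (1 - beta)R / (R - beta*R + alpha). A few lines make the point concrete (my own sketch; the numbers are illustrative, not taken from the paper):

    # Positive predictive value of a "significant" finding, following the
    # pre-study-odds argument in Ioannidis (2005). Illustrative numbers only.

    def ppv(prior_odds, alpha=0.05, power=0.8):
        """Probability that a p < alpha finding reflects a real relationship."""
        true_pos = power * prior_odds      # real relationships that get flagged
        false_pos = alpha                  # null relationships that get flagged
        return true_pos / (true_pos + false_pos)

    for R in (1.0, 0.25, 0.1, 0.01):       # pre-study odds of a true effect
        print(f"pre-study odds {R:5.2f}: PPV = {ppv(R):.2f}")

Even with good power, pre-study odds of 1 in 10 leave a "significant" result real only about 60% of the time, and at 1 in 100 it is mostly noise.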

Working on: Crowding: large molecules are in fact WORSE at crowding than small ones like water!
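
A back-of-the-envelope way to see this (my own dilute-limit sketch with made-up radii, not the actual calculation): at the same crowder volume fraction, the volume excluded to a test macromolecule of radius R by crowders of radius r scales as ((R + r)/r)^3 per unit crowder volume fraction, so small crowders like water exclude enormously more volume than large ones.

    # Dilute-limit comparison of crowding effectiveness: excluded volume per
    # unit crowder volume fraction, for a spherical test molecule of radius R
    # among spherical crowders of radius r. Exclusion-shell overlaps are
    # ignored, so only the comparison between crowder sizes is meaningful;
    # the radii are illustrative, not measured values.

    def exclusion_per_volume_fraction(R, r):
        """Crowder number density times per-crowder excluded volume,
        divided by crowder volume fraction: ((R + r) / r)**3."""
        return ((R + r) / r) ** 3

    R_test = 2.0   # nm, roughly a small protein
    for r, name in [(0.14, "water-sized"), (2.0, "protein-sized"), (10.0, "large polymer")]:
        print(f"{name:14s} r = {r:5.2f} nm: {exclusion_per_volume_fraction(R_test, r):8.1f}")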

6-apr-2015

Translation of Boltzmann's paper now published online in Entropy (2015) 17:1971

30-apr-2015

Paper: Calculation of entropy from data of motion, S-K Ma, J. Stat. Phys. 26:221 (1981). A classic.

Books: Harold Jeffreys, "Scientific Inference" and "Theory of Probability"
Should be read by every scientist. Two chapters of Jeffreys, an actual scientist, are worth the entire oeuvre of professional statisticians or professional philosophers of science such as Popper or Kuhn, who, as HJ notes, seem to get their notions of physics from popular writings.

For example, HJ on the uncertainty principle in QM: "The existence of errors of observation seems to have escaped the attention of many philosophers that have discussed the uncertainty principle; this is perhaps because they tend to get their notions of physics from popular writings" - Theory of Probability.

HJ on statistical tests:

"What the use of P implies, therefore, is that a hypothesis that may be true may be rejected because it has not predicted observable results that have not occurred. This seems a remarkable procedure." (Jeffreys, 1961, p. 385)



Sept-2016
After publication of the translation of Boltzmann's paper, I came to appreciate that his explanation of entropy is much clearer than many subsequent explanations, especially semi-popular or popular science accounts. And of course it is absolutely accurate, as many accounts aren't. In response I decided to write a short explication of entropy that follows his approach, has only the simplest math in it, and is hopefully both accurate and understandable by almost anyone with a minimal science background:
Entropy According to Boltzmann
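
For anyone who wants to poke at the counting itself, here is a toy version of Boltzmann's combinatorial argument (my own illustration, not an excerpt from the paper or the explication above): distribute n discrete energy elements among N molecules, compute the number of permutations W = N!/(w0! w1! ...) for each set of occupation numbers, and see that one distribution dominates; log W is, up to a constant, its entropy.

    # Toy version of Boltzmann's counting: N molecules share n indivisible
    # energy elements. Each distinct set of molecular energies is one "state
    # distribution"; its number of complexions is W = N! / (w_0! w_1! ...),
    # where w_k is the number of molecules holding k elements. The state
    # distribution with the largest W is the most probable one, and log W
    # plays the role of entropy.

    from math import factorial
    from itertools import combinations_with_replacement
    from collections import Counter

    N, n = 7, 7   # a small example: 7 molecules sharing 7 energy elements

    results = []
    for energies in combinations_with_replacement(range(n + 1), N):
        if sum(energies) != n:          # keep total energy fixed at n elements
            continue
        w = Counter(energies)           # w[k] = number of molecules with k elements
        W = factorial(N)
        for count in w.values():
            W //= factorial(count)
        results.append((W, sorted(energies, reverse=True)))

    results.sort(reverse=True)
    for W, dist in results[:3]:         # the three most probable distributions
        print(f"W = {W:4d}   molecular energies {dist}")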

Gregory Bateson's "Metalogue: Why Do Things Get in a Muddle?" is unsurpassed as a completely non-technical, but surprisingly accurate and witty, explanation of entropy and the Second Law.

1. Bateson G (1972) Steps to an Ecology of Mind (Ballantine Books, New York).

Oct-2016
Now teaching a tutorial, Scientific Inference and Reasoning, as a trial run for a Bayesian-based course aimed at teaching students how to really analyze data and actually draw inferences and reason. Inspired by Sivia's book "Data Analysis: A Bayesian Tutorial." Also to break from the mindless application and misuse of 'classical statistics hypothesis testing' that is still taught to students in spite of devastating criticism in dozens of papers by prominent statisticians. And the NIH wonders why there are rigor and reproducibility problems in biomedical science!

Case in point: in 1994 Cohen, in "The Earth Is Round (p < .05)," explains exactly what is wrong and offers some recommendations (personally I think he would have been on firmer ground if he had gone fully Bayesian). Twenty-two years later, Greenland S, et al. (Statistical tests, P values, confidence intervals, and power: a guide to misinterpretations. European Journal of Epidemiology (2016) 31:337) "review 25 common misconceptions ... of significance tests, confidence intervals, and statistical power." Twenty-five of them!! Surely this must call into question the traditional teaching of statistical analysis of data, and suggest 'new' approaches such as Bayesian analysis.
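
To make the most common misinterpretation concrete for the tutorial, a short simulation (all numbers invented for illustration; needs numpy and scipy): run many small two-group experiments where only a minority of the tested hypotheses are real, and ask what fraction of the p < 0.05 results correspond to a real effect. The answer is nowhere near 95%, which is exactly the confusion of p(data|hypothesis) with p(hypothesis|data) that a Bayesian treatment avoids.

    # Simulate many small two-group experiments in which only 10% of the
    # tested hypotheses correspond to a real effect (Cohen's d = 0.5,
    # n = 20 per group), then ask: of the results with p < 0.05, what
    # fraction are actually real? All settings are illustrative.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_experiments, n_per_group = 5000, 20
    prior_true, effect_size = 0.10, 0.5

    is_real = rng.random(n_experiments) < prior_true
    p_values = np.empty(n_experiments)
    for i in range(n_experiments):
        control = rng.normal(0.0, 1.0, n_per_group)
        shift = effect_size if is_real[i] else 0.0
        treated = rng.normal(shift, 1.0, n_per_group)
        p_values[i] = stats.ttest_ind(treated, control).pvalue

    significant = p_values < 0.05
    print(f"'significant' results:            {significant.sum()}")
    print(f"fraction of those actually real:  {is_real[significant].mean():.2f}")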




