By WILLIAM MAYER
May 5, 2020 - San Francisco, CA - PipeLineNews.org – We ask the reader to allow a bit of a preamble, which is absolutely necessary to understand the nature of the problem we are examining.
Western thought regarding the nature of inquiry has almost entirely been shaped by the notion of empiricism which grew out of what is loosely called the Great Enlightenment, the product of which was the supposed decoupling of certifiable reality from superstition, meaning scripture. The primary [and intended] effect of the Enlightenment was to elevate science and the scientific method above all else.
The movement radically changed how Western man conceived of reality, his epistemology, i.e., how he knows what he knows. The cultural effect brought about by scientific rationality is so great that it’s impossible to quantify; suffice it to say that the entirety of the modern Western World would be unimaginably different had this not occurred.
All of Western science is predicated upon empirical investigation, which shapes our world-view. The mechanics of rational scientific inquiry are fairly simple: phenomena are observed, hypotheses are developed, and predictions are made about how the phenomena behave under this or that condition. If the predicted results are borne out by experiment, the hypothesis becomes a scientific theory - fact - and rational man might well adjust his beliefs to reflect the new understanding.
However, here is where things become complicated, because science is never “settled” and facts are not forever propositions. As the great 20th century philosopher Karl Popper noted, a proposition, no matter its source or presumed authority, can never rise to the status of scientific fact unless the theory allows for it to be disproven - possibly shown to be in error. As an example, Newtonian physics was until fairly recently considered to be the fundamental law of how the universe operates, but with the coming of quantum theory, Newtonian mechanics became subordinate to the new laws of quantum physics, which then became the fundamental truth.
This method works well within the natural sciences because the matters being investigated easily yield to quantification, i.e., we “know” this or that because it can be measured, again quantified. However, the social “sciences” are different since there is no scientifically valid theory of human behavior.
In the social sciences this degree of precision and certitude is largely absent because individuals are complex systems and many aspects of their behavior resist any attempt at quantification. Hence there is no such thing as political science; it might be such an intensive study of politics that it leads to a doctorate, but it sure as hell ain't science. Yes of course, we can, for example, count the number of people who voted in the last election with a fair degree of accuracy, and from that we can formulate trends. So, let's say that in the last election “x” percent of the population voted and that number represents a decreasing trend compared with previous years. Fine, that is a supportable conclusion; however, it does not allow the researcher to predict with any certitude how many voters might participate in the next election - any number cited would be a guess.
Exploring this example, let's say that in the last election 40% of the voting-age population actually went to the polls, and since that number represents a decreasing trend, the researcher might guess that somewhat fewer will participate in the next election. But it will simply be an inductive guess, something intuited from the trend. And what if some national calamity [an epidemic?] occurs that reflects poorly on the current leadership? Well then, perhaps the educated guess goes out the window, since the cultural dynamic has changed, and all that can be said in that case is that voter participation will be greater, lesser or about the same.
No one, certainly no social scientist, can possibly quantify a metric based upon a system that is dynamic but not predictably so. There are simply no data points to hang on to: how angry voters are, how they express their anger in terms of voting, how various campaigns respond to voter displeasure, weather conditions…an unlimited number of variables that are not, and are never likely to be, quantifiable.
So let's get specific. We are dealing with a huge bureaucracy, the CDC, which does do some very good science, such as working with pharmaceutical companies to develop flu vaccines. But even here the strain for which the vaccine is being created is a guess, since there are dozens and dozens of differing flu viruses which may or may not assume prominence. Sometimes the CDC, in a perfectly defensible manner, produces the wrong vaccine, or not enough of it, so we know that by nature this stuff is imprecise. Hence, the type of medicine the CDC practices is a mix of both social and hard science of the form “x” kills “y” [but even that might be conditional - maybe “y” is only vulnerable during period “z”]. The reality is that much of what comes out of the Centers for Disease Control is educated guesswork, which is fine; it's the best we can do.
So let's take up the topic of our buried lede: Blinded By “Science” - The False Empiricism Of CDC Statistics. As everyone knows, the Wuhan flu epidemic has shut down much of American commerce, and that has taken place because we have been told in ever more ghastly detail just how dangerous this flu virus is. So many of us are under virtual house arrest, locked away by edict of state governors and local health departments. No jobs, obviously no income, and the rent is already a month past due.
As previously stated, in many cases the CDC tries its best, which is fine, but here is the kicker: because we are talking about making projections based upon imperfect data, what we do as a society falls upon the political leadership, because only it is answerable to the people.
But the level of alarm coming from every public health institution, from the World Health Organization [do not get me going…] on down to city boards of supervisors, combines to become the primary driver of political decisions.
We take action “x” based upon the metric “y” which, though it represents a number, is one with little statistical validity - and this is easy to prove.
Turning to the CDC itself, we note the page entitled Provisional Death Counts for Coronavirus Disease (COVID-19). This page is reporting approximately 40,000 COVID/Wuhan flu deaths as determined by cause-of-death coding on death certificates. This is the best data we have, because the cause of death has been certified by a doctor using the new CDC-assigned ICD code. These codes have been developed by the National Center for Health Statistics (NCHS), which is the Federal agency responsible for use of the International Statistical Classification of Diseases and Related Health Problems.
But the CDC also has a page entitled Cases in the US, which claims that 68,279 have died of Wuhan flu/COVID-19, and that is the figure that the entirety of the Western media is using to scare populations into compliance. This number is supposedly a summing of data provided by the health departments of the 50 states, the District of Columbia and the US territories. But each state gathers its data from county health departments, of which there are 58 in California.
So the chain of possession, so to speak, goes from the local, to the state, to the federal level. In California, county-level “data” is compiled from CalREDIE, and hospitalization data is collected via electronic survey. All data points could be snapshots of different times. (“We are continually working on improving the data collection.”) CalREDIE is another huge bureaucracy that gathers its information from a variety of sources - by its own estimate, at least 350 of them.
One can only wonder, when dealing with incremental reporting from a multiplicity of sources, what the stacking error levels [a helpful engineering concept] might be. We are dealing with a very large number of data inputs, and each component comes with a certain degree of precision - how much it might differ [deviate, statistically] from the actual figure. So even with a high degree of confidence in each data point, when you stack or sum imperfect information, the total imprecision increases as the data is summed.
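The point about stacking error can be sketched numerically. The following is a purely illustrative toy model, not the CDC's or CalREDIE's actual pipeline; the per-source count and error figures are hypothetical, and only the source count of 350 comes from the discussion above. It shows the standard engineering result: independent errors add in quadrature [the root-sum-square rule], while systematic errors in a common direction stack linearly.

```python
import math
import random

random.seed(42)

# Hypothetical setup: sum reports from many sources, each carrying an
# independent error, and watch how the uncertainty of the total grows.
N_SOURCES = 350      # CalREDIE's own estimate of its input sources
TRUE_COUNT = 100     # hypothetical true count per source
SIGMA = 10           # hypothetical per-source standard error

# Root-sum-square rule: independent errors add in quadrature, so the
# standard error of the sum is sigma * sqrt(N), not sigma * N.
rss_error = SIGMA * math.sqrt(N_SOURCES)

# Worst case: if every source errs in the same direction [systematic
# bias, e.g. a shared miscoding practice], errors stack linearly.
worst_case_error = SIGMA * N_SOURCES

print(f"RSS error of the sum:       {rss_error:.1f}")
print(f"Worst-case stacked error:   {worst_case_error}")

# Quick simulation to check the RSS rule against random draws.
trials = [sum(random.gauss(TRUE_COUNT, SIGMA) for _ in range(N_SOURCES))
          for _ in range(2000)]
mean = sum(trials) / len(trials)
sd = math.sqrt(sum((t - mean) ** 2 for t in trials) / (len(trials) - 1))
print(f"Simulated error of the sum: {sd:.1f}")
```

Even in this friendly scenario - every source unbiased, every error independent - the uncertainty of the national total is roughly 19 times the per-source error; with any shared bias it is far worse.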
Here is the money-line: our bright and shiny method of gathering health statistics is inherently flawed; there are entirely too many uncontrollable and unknowable variables involved, all of which affect the level of confidence that the numbers square with the reality on the ground. Regardless, the CDC presents us with two prominent but contradictory data points, one of which we now understand is just an informed guess. One number, 40,000 +/-, was obtained in the most accurate manner, directly from the cause of death on signed death certificates; the second number, the alarming one [70,000 +/-], is based upon crude guesstimates percolated through state bureaucracies, and it is that number that has been the justification for essentially collectivizing a formerly free people.
Readers, though CDC deals with scientific matters, the information coming out of our nation’s largest health bureaucracy is often not grounded in science. Instead CDC employs a false empiricism operating under color of authority. Manipulation of flawed data yields false conclusions, irrespective of how supposedly rigorous the process of data acquisition is.
It’s instructive to look at CDC estimated flu deaths over the last 9 years:
[Table: CDC estimated influenza disease burden by season. Source: CDC, Past Seasons Estimated Influenza Disease Burden]
Note 2017-2018, a 48% variance, and 2018-19, a 45% variance. As an experiment I computed a standard deviation from the 17-18 figures, and the result diverged so much from the reported data that using it would have amounted to junk or pretend science. Additionally, when presented with only two data points [one degree of freedom, technically], the concept of standard deviation becomes moot; one cannot even begin to estimate a numeric degree of confidence.
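The two-data-point problem is easy to demonstrate. The numbers below are hypothetical round figures chosen to produce roughly a 48% spread, not the CDC's actual 2017-18 estimates; the point is what happens to any confidence interval built on n = 2. The sample standard deviation technically exists [one degree of freedom], but the Student's t critical value at that degree of freedom is 12.706, which blows the interval out to uselessness.

```python
import math
import statistics

# Hypothetical low/high burden estimates for one flu season
# (illustrative numbers, not CDC's actual figures):
# spread = (96,000 - 50,000) / 96,000, roughly 48%.
low, high = 50_000, 96_000

mean = statistics.mean([low, high])
sd = statistics.stdev([low, high])   # sample sd, n - 1 = 1 degree of freedom

# 95% confidence interval for the mean of n = 2 points uses Student's t
# with 1 degree of freedom: t = 12.706 (standard table value).
t_crit = 12.706
margin = t_crit * sd / math.sqrt(2)

print(f"mean = {mean:,.0f}, sd = {sd:,.0f}")
print(f"95% CI: {mean - margin:,.0f} .. {mean + margin:,.0f}")
# The interval runs from a large negative death count to several hundred
# thousand - numerically defined, but practically meaningless.
```

A confidence interval that includes negative deaths is the formal version of the point above: with two data points you can run the arithmetic, but you learn nothing.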
So what have we learned?
Plenty. It's so easy to destroy the façade of integrity surrounding the almighty CDC's public pronouncements that it should be considered best policy to simply ignore them. Fewer people are dying from Wuhan flu than even the most conservative estimates suggest, owing to statistical chatter induced by too many uncontrollable, unaccountable variables - or to fraud, the forced determination of cause of death, which we absolutely know is happening, at least in California.
The economy of a major industrial power must never again be placed in the hands of practitioners of voodoo science such as Comrade Fauci and Field Marshal Neckscarf Birx. They have no idea what they are doing, and following their idiotic advice to close the economy for the foreseeable future is a prescription for disaster.
Gentle reader, consider yourself warned. In the spirit of full disclosure, we reached out this morning to the CDC's PIOs [public information officers]. Some very simple questions were posed by voicemail as well as email; as we go to press, the CDC remains silent, though we kept our deadline open 4 hours past our intended time of publication, about which the CDC was fully informed.
Correction/Retraction: CDC did in fact contact us at 2:30 PDT, within the deadline, but the email was delivered into the junk file. So we thank the CDC PIO for responding, however the information presented rehashed what we have just reviewed. We have requested that the channel of communication remain open, so that we can further explore these matters as new developments occur.
©2020 PipeLineNews.org LLC. All rights reserved. No part of this publication may be reproduced, distributed, or transmitted in any form or by any means, including photocopying, recording, or other electronic or mechanical methods, without the prior written permission of the author except in the case of brief quotations embodied in critical reviews and certain other noncommercial uses permitted by copyright law.