Becoming Us & How

For good or ill, as John Adams put it, “Facts are stubborn things,” but our minds are even more stubborn. Truth always plays second fiddle to groupthink, or, as George Orwell is often quoted, “The further a society drifts from truth, the more it will hate those who speak it.”


Why do so many people hold such strong opinions and beliefs that seem so obviously wrong to us? How can individuals stand on both sides of what would appear to be a simple issue? How can both sides think they are right? Time and time again we encounter people who swear something is a fact and, even when shown evidence that their position cannot be true, still cling to their opinion. In the real world, shouldn’t facts be facts? Polarization in politics seems to be infecting all of society. Something is changing.

What are the processes that go into fashioning our world views and how we perceive the very reality that we experience? Understanding the past, our origins, our evolution and how we see reality itself, is the key to understanding what we, as a species, have become and what is happening.

Becoming Human

As mammals go, they weren’t very big or strong, but they were unusual. They had the ability to walk mostly on two feet, with hands capable of gripping and manipulating objects. Broadly, they shared ancestry with chimpanzees, monkeys and baboons, but there was also one line that would evolve into what we now characterize as hominids. These walked completely erect, were losing their fur and were using rudimentary tools. While we evolved from this hominid line that was a few million years old, our modern beginnings go back only about two hundred thousand years. At that time, giant mammals roamed the earth: mammoths, giant ground sloths, saber-toothed cats and huge herds of bison. In addition to the hominid that was our great-grandparent, there were also four or five first-cousin hominids scattered across the Earth. Physically, all of them had unusually large brains, stood completely erect, were capable of running, had opposable thumbs and were “naked”. Evidence suggests that all of them possessed the ability to speak. They were hunter-gatherers and also capable tool makers. While primate origins were in Africa, many of our first cousins had moved out of Africa to Europe, Asia, India and the islands south of Asia. Of these cousins, we have solid evidence that two were similar to us in most ways, one was significantly smaller and one was much more robust with a larger brain. We call ourselves Homo sapiens, and at that point in history we had not yet left Africa. The cousin that was bigger and stronger and had a larger brain than us was the Neanderthal, and they had already been living in Europe for two or three hundred thousand years.

Between two hundred and one hundred thousand years ago, something earth-shattering seems to have occurred. The hominid that was our great-grandparent started thinking very differently. It didn’t seem to have anything to do with brain size, because they didn’t have the largest brain. Up until about one hundred thousand years ago, these new-thinking Homo sapiens were still living exclusively in Africa, but that would soon change.

Our very way of thinking, and therefore our speech, changed significantly. Linguists call what happened in the line of hominids that were our ancestors the development of “context free speech”. What that meant was that they no longer simply thought and talked about things that were obviously practical, happening near them in space and in the “here and now”. This new type of thinking allowed Homo sapiens to discuss not only things that weren’t present locally or in the here and now, but things that didn’t actually exist at all. They began to talk about hunts past and future, tools and tactics, and what they could do to improve their fortunes. Maybe they discussed what caused rain or lightning, why the seasons changed, why they should follow a specific leader or what made them ill. With this new thinking and speaking came the emergence of the ability to actually link minds, their very thoughts, together. The real leap forward in these hominids was based on a change in thinking, but it was also multi-faceted: it enhanced their ability to work together, acquire knowledge, share thoughts with others, build on past experiences, and push thinking into the future and even into non-existent imaginary places.

It’s difficult to overstate how important this new thinking was. It completely upset the balance of nature that existed at that time. The change was so significant that, over the next fifty thousand years or so, Homo sapiens spread across the planet while all the other hominid cousins disappeared completely, along with unimaginably large numbers of the mega-mammals. While there remains controversy regarding what part Homo sapiens had in the disappearance of their cousins (see end notes), there is no question that our singular line of hominids did start thinking differently and the world began to change. As these younger, smarter hominids moved out of Africa, through the Middle East and on, the world changed wherever they went. In Europe, they encountered their nearest cousin, the Neanderthals. Those hominid first cousins had lived in Europe for hundreds of thousands of years, yet within ten thousand years of Homo sapiens’ arrival in Europe, Neanderthals seemed to disappear. The same seemed to happen in other regions as well.

What was so special about this new way of thinking and speaking? First, context free thinking and speech could be about anything, including abstract thoughts. It allowed them to describe in detail not just what they saw, but also what they were thinking and what they were planning to do. Most important of all, it allowed them to make up stories. These stories could be as much entertainment as fact. They could describe dreams, tell epic tales of adventure, assign supernatural causes to natural occurrences and organize social life around rituals involving hierarchies and beliefs in an unseen reality, thereby assigning additional properties to the things that happened in life. In actuality, it was probably the invention of storytelling that began to truly redefine the world and enhance the ability not only to see into but to actually create a future world.

Over thousands of years we created the world’s first complex social structures and gave names to social positions and individual skills and abilities. We invented mythological creatures and gods, endowed them with supernatural powers and credited them as the causes of natural phenomena. We began to assign relative value to essential goods and assigned equivalent value to other objects, like stones and shells, to facilitate trading.

Tens of thousands of years later, Homo sapiens made another huge leap forward and developed a way to express thoughts in a permanent form using visual symbols and writing. No longer did information have to be passed along orally, through a chain from one person to the next; it could be recorded for transmission over great distances and through time to distant generations, and stored just like the staples of life.

Along with the emergence of context free speech, language and the linking of minds, there was one more trait evolving that greatly facilitated our advancement as a species. Today we often refer to this ability with derision, calling it gullibility or naivety. Yet it remains ingrained in each of us, and it too was a very significant building block in the advancement of the human race. Generally speaking, we, as a species, are inclined to accept what others tell us as fact. As much as context free thinking and speech, this allowed us to share our acquired knowledge with others more quickly and spread it more widely. And while we often make fun of our nature to trust other people, it is part of what makes us human and was another foundation of our developing greatness.

While we evolved slowly, by taking advantage of these traits our species was able to advance intellectually, inventing and exploring science, philosophy, mathematics and myriad other fields simply by acquiring and spreading information (knowledge), discoveries and new ideas throughout large groups of people. For centuries this has been the foundation of formal teaching, and it is what makes our current education system possible. By linking our minds and accepting shared information as fact, we quickly built on the experiences and knowledge of hundreds and thousands of other humans. Sir Isaac Newton described it best when he declared that he succeeded because he was “standing on the shoulders of giants”.

With incredible inventiveness and this ability to link minds, Homo sapiens began to bond into larger and larger groups. Again, it didn’t happen overnight, but we went from hunter-gatherers to farmers and herders, to living in villages, and eventually began to create real civilizations. It started with the natural enlargement of family groups into clans and tribes and resulted in a growing ability to actually change our local environments. By building structures, raising animals and growing crops, man greatly improved the stability of the food supply while providing for an increasing localized population. Civilization wasn’t initially created so much as it simply grew along with human populations and knowledge. It grew as groups got larger, allowing for agricultural settlements that consistently produced surplus food, which allowed some people to specialize in non-agricultural work. That in turn allowed for the production of more goods and for trade within the group and with outside groups, while supporting ever-growing populations and more sophisticated social structures.

This accounts for the first civilizations appearing in locations where the geography was favorable to large-scale agriculture. In order to manage resources and protect the community, government was another natural evolutionary step. Context free thinking allowed humans to create categories to describe tasks and assign social standing to individuals. No other primate could have conceived of or characterized other members of its species as king, commander, teacher, priest, or even potter or miller. Cities and kingdoms emerged from this thinking, with rulers, a ruling class, warriors and priests. They gained increasing control over larger areas of land, expanding resources and populations. As complexity increased, the ruling class assembled armies, developed writing and religion to maintain social order, and consolidated power over ever larger territories and populations.

The use of writing now facilitated the recording of knowledge, the codification of laws, methods of record-keeping and the birth of literature, which fostered the spread of shared cultural practices among growing populations.

Since before recorded history, humans have experienced a number of epic ages: the bronze and iron ages, the first agricultural revolution, the age of empires, the dark ages, an intellectual revolution, the age of exploration, the industrial revolution, the second agricultural revolution, worldwide wars and the modern information age. Each age created its own turmoil and change. Coming out of the dark ages we witnessed a rise of religious philosophy, which then led to the dawn of an age of ideologies. These new ideologies were based on the beliefs and writings of philosophers, economists, social scientists and psychologists, and promised mankind a number of possible futures based on new ideas and theories. They suggested new forms of belief, social structure, government and economic institutions, all promising to improve the lot of the human race itself. Many ideas were experimented with; some failed terribly while others found some level of success. The one thing they all had in common was large numbers of ardent followers believing in a new future for the species and themselves, and a willingness to adopt a new set of rules.

Understanding Reality

Most people consider the experience of reality to be the same for everyone and think this is so obvious that it doesn’t warrant much serious consideration. That is simply not the case at all and, to better understand how we and society evolved, we need to understand more about the true nature of reality and how we, as individuals, groups and as a species, experience it.

The way we experience reality and interact with the physical world around us is entirely through our senses. Without our senses of sight, hearing, touch and smell, the world simply would not exist for us, and the only experience we would have is an unnamed and undefinable sense of self. Reducing all experience to what we can absolutely be sure about brings us to Descartes’s famous “I think, therefore I am”, suggesting that all we can truly believe exists, for each of us, is our sense of self. It is the senses, and our brain’s interpretation of the signals from them, that create our sense of a real world. To better picture what this suggests, realize that our senses are limited in what they detect. For example, a dog’s sense of smell detects a whole world we have no understanding of. In the dark, bats and dolphins see with sound, and numerous creatures navigate using the Earth’s magnetic field, to which we are generally blind. What we now believe about reality is that it is completely a mental construct built on sensory input, organized by our context free thinking, relying heavily on memories of past experience and on the adoption of notions from having our minds linked with others who describe and share their sensory experiences.

Based on a number of examples, it is also becoming clear that our personal reality can be, and often is, very different from the realities of other people. On hundreds of occasions we are confronted with the knowledge that what our senses are telling us cannot actually be true, and often we struggle to deal with it. The tricks of illusionists are one of the more obvious cases, but there are also hundreds of common sight and even sound illusions. Experiencing many illusions can leave us completely disoriented because our sense of reality is challenged. We now know that what we believe we are observing can change significantly based on our circumstances and state of mind. One established study into our ability to judge height demonstrates that the judgment depends on whether we are simply observing the height or believe we are going to climb or descend it.

One researcher into consciousness, perception and reality came to the conclusion that reality is actually a shared illusion and that our interpretation of reality is actually biased toward survival in preference to true accuracy.

Another way of thinking about shifting reality involves feelings of anxiety. We all experience episodes of anxiety in which we begin to worry about what could happen to us. It can involve a disagreement with someone, or a fear that we aren’t properly prepared for a coming event. Anxiety actually changes the way our mind processes information, so that we experience the symptoms of fear when there is no actual danger around. This leads to negative thinking, overthinking, and the tendency for our minds to notice cues that match our worst expectations while ignoring most others (confirmation bias).

Because we live in a reality fashioned by context free thinking, our world is much more complex than the world of any other species. Unlike other species we are capable of rethinking and changing how we view the world at any time and in numerous ways. By learning new information or listening to another person’s views we can completely shift our perspective and understanding of the world and reality.

Over the course of human existence, we as a species have created an ever more complex world built upon a foundation made possible by our context free thinking and speech. It facilitated the invention of abstract creations, elaborated on by inventive storytelling, and it required a reliance on believing that what others told us was true. The world that we live in today is more a mental creation than a physical place. Sure, there are trees, rivers, rocks, clouds and so on, but we also believe in angels, traffic laws, the existence of corporations, currency values, electrons, courts, stars and planets, ghosts, diplomas, aliens and more.

Understanding the foundational mental processes that make up our reality can be simplified into three basic categories:

The first is the process that provides us with raw input through our senses. It is as near as we can come to observing true reality, and it is usually referred to as objective phenomenon. Broadly, it is the observation of a world that exists independently of human consciousness and beliefs and would remain even if humans weren’t present.

The next is an internalized reality created by a combination of observation and comparison with our previous experiences, weighed against our accumulated knowledge. We call this subjective phenomenon; it exists entirely within the consciousness of a specific individual and represents a singular, personal and isolated reality.

The last is actually the most dominant in our thinking and represents a broader reality constructed by blending our subjective understanding of the world with shared knowledge accepted as reality by the various groups we consider ourselves members of. This is called inter-subjective phenomenon: a group reality that exists because of communication within a community or group that has linked minds to form a shared, mutually recognized and usually localized reality.
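The three processes can be pictured as layers in a toy model. The sketch below is purely illustrative; the function names (`sense`, `interpret`, `group_consensus`) are invented for this example, not drawn from any real framework, and the logic is a deliberate simplification of the ideas above:

```python
from collections import Counter

def sense(world_fact):
    """Objective phenomenon: raw sensory input, independent of any observer."""
    return world_fact

def interpret(observation, prior_beliefs):
    """Subjective phenomenon: one person's reading of an observation,
    weighed against their own accumulated beliefs."""
    return prior_beliefs.get(observation, observation)

def group_consensus(individual_views):
    """Inter-subjective phenomenon: the view most widely shared within
    the group becomes the group's accepted reality."""
    return Counter(individual_views).most_common(1)[0][0]

# The Sun appears to cross the sky (the raw observation)...
observation = sense("sun moves across sky")

# ...but each member interprets it through beliefs learned from others.
beliefs = {"sun moves across sky": "earth rotates"}
views = [interpret(observation, beliefs) for _ in range(3)]

print(group_consensus(views))  # the shared, taught interpretation wins
```

Note how the raw observation never changes in this model; only the beliefs applied to it do, which is the point the Earth-and-Sun discussion later develops.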

Our human experience of the world, and by extension of reality itself, is based on these three generally accepted processes. It has also been described as a shared illusion broadly accepted within a social structure, often not based on objective observation or even on accepted, fact-based information. Once any notion or belief is broadly accepted by a group, it becomes an inter-subjective phenomenon and generates a localized group reality that often dominates even objective observation, and it can spread to more individuals and other groups by a number of means.

A simple example of inter-subjective phenomenon and our objective observation is the understanding of the Earth’s place in the universe. Our objective observation initially informs us that we are standing still on a fixed Earth while we watch the Sun move across the sky above us. The original, easy, subjective interpretation that aligns with this observation is to believe that the Sun travels around a fixed Earth. That thinking is so obvious that it was the predominant inter-subjective reality for thousands of years. In the sixteenth century, astronomers began to argue that the Earth revolved on an axis and actually traveled around the Sun, and that the illusion of the Sun traveling around us was created by the Earth’s rotation. Over time this knowledge spread and was passed on, eventually becoming the new predominant inter-subjective reality. Today we still have no more personal, objective information than we had a thousand years ago, but we have been taught things that alter our subjective beliefs, mostly because they became widely adopted into an accepted inter-subjective reality regarding the Earth’s and the Sun’s place in the universe. Today we almost universally incorporate that changed inter-subjective reality of the Sun’s relationship to the Earth into how we see and process even our objective observations.

In simpler terms, through this linking of minds and our receiving of widely believed knowledge regarding the Sun and Earth, we have incorporated this understanding of their movements into our personal subjective reality. There is no way that we, as modern individuals, can observe the Sun crossing the sky and visualize it as traveling around the Earth, even though our objective observations are unchanged.

Because our world is created more in our thoughts than in a physical sense, we need to take into account these processes and how they brought us to where we are today.

Our Reality and the Influence of the Group

The Earth-Sun example explains how our objective perception of the physical world becomes changed by group inter-subjective reality and is adopted into each person’s individual subjective reality. This is the process that changes even our objective perceptions of the world. In accepting inter-subjective phenomena and modifying individual subjective reality, we often integrate things that cannot be directly observed or measured by our senses into our broader sense of reality and general understanding. A great number of people accept Heaven and God as a reality. We see this as an example of an inter-subjective, though non-material, reality, but there are far more mundane, everyday things that we accept in the same way. Civil authority and laws, economics, the existence of corporations, money and thousands of other non-material and representative items exist only in individuals’ subjective reality and as part of that wider inter-subjective phenomenon.

Take the case of modern currency. We think of currency as representing a number of complex things: a measure of individual wealth, the value of human labor, invested ownership, and a transactional promise of future goods or services. The truth is, it has no actual objective reality other than bits of paper and metal with little real value. In a larger sense, currency’s true value is based principally on the subjective faith that individuals put in it. There are dozens of historical cases of national economic collapse that occurred because of a loss of faith in a currency. One day the currency had a high inter-subjective transactional value; then a shift occurred in subjective and inter-subjective phenomena that reduced its transactional value to a small fraction of its previous worth.

In the Earth-Sun example it seems obvious that we are accepting a more accurate version of reality, but that isn’t always the case with inter-subjective phenomena. There are hundreds of examples, large and small, where groups fashion and accept inter-subjective realities that are far less grounded in fact. They range from manifest destiny and eugenics to Bigfoot, flying saucers and numerous health risks or cures later proven to be completely false.

A majority of human behavior today is based mostly on inter-subjective beliefs, with some individual subjective integration and little if any significant objective observation. Looking back at the roots of our development as a species, we can see that this willingness to believe what others tell us has become an integral part of who we are and an overwhelming contributor to modern reality. Considering the abstract character of our modern world, objective observations play a surprisingly minor role in life today.

We see the influence that inter-subjective phenomena have on an individual’s subjective reality, but what forces contribute specifically to a group’s inter-subjective reality? Those influences today are probably not much different from influences thousands of years ago. People look up to certain individuals: natural leaders, those who seem knowledgeable or educated, charismatic personalities and popular cultural figures. These individuals contribute to, and often dominate, the foundation of most group beliefs. Not only are we humans prone to accepting a group’s reality, but the group is inclined to accept the reality of natural leaders and seemingly well-informed individuals.

To understand the process, think about the influence associated with broadcast news, where a charismatic anchor makes a report claiming “experts studying… have concluded”. Our very nature inclines us to accept information as true, but in an increasingly complex world with multiple opposing streams of information, those sorts of presentations incline us to accept specific information on the basis of who is presenting it and where the information is reported to have originated.

As influential as individuals and ideas may be, ideas still originate somewhere and exist in a broader environment where groups and ideas compete against other ideas and groups. Our individual subjective reality, as well as inter-subjective reality, can at times change quickly. Generally, to be successful, beliefs must provide some argument that resonates with large numbers of individuals and their associated group-held beliefs, and they also have to be compatible with personal values long held by significant segments of a group or society.

The preceding looks at the evolution of human understanding and interaction, but there is also a darker side to the nature of inter-subjective phenomena. They can be manipulated and artificially forced upon a group and, in some cases, even on a whole society. Artificially created and forced changes to group realities usually result from fanatical movements, militaristic societies, radical philosophies and repressive despotic regimes. Forcing a change in a group’s inter-subjective reality, and thus individual subjective beliefs, requires very controlled environments and usually major resources. It employs methods characterized as forced mass reeducation, group and regional isolation, the elimination of outside information, outright brainwashing, coercion and even enslavement. It has been understood for a long time that radical new inter-subjective realities can be forced on individuals, groups and whole populations, and the most successful tool is controlling access to information.

There are literally hundreds of major cases (see end notes) of these techniques being employed. One recent example is communist China’s system of managing the information available to its people. It has become so effective that on the thirtieth anniversary of the Tiananmen Square massacre, interviewers could not find a single person on the streets of Beijing who seemed to have any historical memory of what had happened. They simply did not know that in early June of 1989 the Chinese government cracked down on the student demonstrators in Tiananmen Square seeking democratic reform. Hundreds were killed, thousands were arrested and thousands more of the students completely vanished, never to be seen again. Following that event, the government officially forbade any mention of it. This is an example of a reality that has been effectively erased from the minds of much of the Chinese people, and it demonstrates the effective manufacturing of a new inter-subjective reality.

A New World

As the current information age matured, a whole new landscape emerged. At its center is the internet and the World Wide Web. It quickly grew into an incredibly efficient system for distributing information throughout societies and, effectively, the whole world. The web spread and quickly reached unimagined numbers of people. It was significantly different from anything that preceded it. Unlike long-established systems for “publishing” information that required resources and substantial infrastructure, all it required was a personal computer and a connection to the internet, and now simply a cell phone. It has quickly replaced the need for printed media in all forms, officially sanctioned sources, authoritative fact checking and traditional forums for weighing opposing views. The information that flowed around the internet had few censors and little control at all. People could access vast amounts of information on millions of subjects, including radical causes and ideas, all in the comfort of home.

With the introduction of search engines, Wikipedia, social media and low-cost web page creation, access to ideas and information grew exponentially. The low cost and remarkable growth of the internet caused mainstream producers and suppliers of information to lose their market. Encyclopedias, newspapers, magazines, broadcasters and even educational outlets found it difficult to compete with the free-flowing environment of the internet.

Search engines quickly became the new gateway to information, and by early in the twenty-first century they were the dominant resource for acquiring and expanding the one thing that made us humans what we are: knowledge. At the same time, without really understanding what was happening, search engines began to be the gatekeepers of inter-subjective and subjective reality.

Tyrants and dictators had to conquer territory, build walls and create armies of secret police to alter and control the inter-subjective reality of their subjects. While they had to isolate whole populations, burn libraries and take over all media, the search engine corporations had the potential to accomplish the same thing, and began to influence inter-subjective and subjective realities simply by providing their internet service. Combine internet searching with open source databases like Wikipedia and some social media providers, and you can account for more than two-thirds of humanity’s regular access to general knowledge and current news.

Normally, the flow of information is part of what is characterized as the free market of ideas. The way search engines should work in that marketplace is to compete with other similar services to deliver the broadest, most useful results while being understood to be unbiased. Currently, there are almost a dozen major search engine companies to select from and over a dozen smaller competitors, but that doesn’t tell the real story.

The way a search works is that someone looking for information on something like the economics of commercial logging enters a set of keywords. The results returned can normally include numerous related articles: tree farms, strip logging, regions of logging operations, jobs created, environmental impact, and major logging corporations, including their costs, profitability and sustainability.
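At its simplest, that keyword-matching step can be sketched in a few lines. This is a deliberately naive illustration, assuming invented document titles and a bare word-overlap score; real engines rank results with vastly more complex (and, as discussed below, less transparent) criteria:

```python
# Naive keyword-search sketch: score each document by how many of the
# query's words it contains, then return titles ranked by that score.

def search(query, documents):
    keywords = set(query.lower().split())
    scored = []
    for title, text in documents.items():
        words = set(text.lower().split())
        score = len(keywords & words)  # count of matching keywords
        if score:
            scored.append((score, title))
    # Highest-scoring documents first
    return [title for score, title in sorted(scored, reverse=True)]

docs = {
    "Tree farm economics": "commercial tree farms logging costs profitability",
    "Strip logging impact": "environmental impact of strip logging regions",
    "Gardening tips": "home gardening soil and compost",
}

print(search("economics of commercial logging", docs))
```

Both logging articles match and are returned, while the unrelated page is filtered out; everything contentious about a real engine lives in how it breaks ties and orders that list.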

Unfortunately, the results aren’t always unbiased. Google was not the first search engine; it launched after earlier pioneers and was soon competing with Yahoo and others. But, through a number of processes, Google became so dominant that “googling” something is now a generic term. No competitor holds more than a few percent of total searches, and Google’s share of searches worldwide is usually estimated above 85%. There have been many articles written contrasting Google’s results with those of other search engines, but they rarely gain much popular attention.

Because logging isn’t especially controversial, comparing results between Google and, say, Yahoo or Bing doesn’t produce much difference, but if you search something more controversial, like global warming, the differences can be significant. Search political figures or topics and many times the differences are shocking. Google and its many defenders claim some amount of censorship is necessary to protect society, and they claim a responsibility to try to stop access to false, dangerous or radical information. Google isn’t alone in deciding what information people should have access to. Social media sites like Facebook, Twitter and others now claim that they also need to protect people from dangerous, false, misleading and inappropriate ideas, and even Wikipedia is subject to the introduction of bias as individuals act as self-appointed arbiters capable of altering information.

While many consider this a reasonable argument, consider that America was founded on the principle of freedom of speech in all its forms. Throughout our history, hundreds of cases have been brought that attempted to suppress that freedom, including cases involving distributing pornography, revealing classified documents, displaying hateful symbols like swastikas, and vulgar and offensive speech. In virtually every case, after lawyers, judges, elected representatives, and ordinary citizens debated and testified, uncensored freedom of expression has prevailed.

Now our information is being filtered by a handful of people at a few corporations who judge what needs to be censored. The American system of free speech is actually based on the belief that no individual or group can be trusted with that responsibility. We have institutionally preferred to be subjected to offensive, radical, hateful, and misleading information rather than trust anyone to decide what information fits those categories and represents a public danger.

Understanding What Has Happened

How does all this fit together? What is the connection between how we developed as a species, the processes that have made us what we are, and what is currently happening to us? The answer is actually straightforward. Based on the stepping stones of our evolution and how we create our individual reality, we can draw conclusions about what has happened. Those stepping stones include:

  • The advent of context-free speech and thinking.
  • Our linking of minds together to share thoughts.
  • The tendency to trust what others tell us.
  • Our creation of a world full of abstract creations.
  • The existence of personal subjective reality.
  • The acceptance and influence of a group’s inter-subjective reality.
  • The controlling of information to modify inter-subjective reality.

We are creatures that consume and utilize information to build our reality and change our world. We do this because of our relationships with others within our society, relationships often based on non-material creations and on our willingness to believe others, especially members of groups we belong to.

First, we need to realize that reality itself is a mental construct and that, by extension, there can be numerous realities. By accepting membership in any group we become influenced by, and usually adopt, the group’s inter-subjective reality, incorporating it as part of our own. Our objective observations and evaluations of supposedly fact-based concepts exercise only minor influence on our subjective reality in the context of inter-subjective phenomena. Individuals and groups regularly employ confirmation bias to strengthen their beliefs and positions, further reinforcing their personal reality.

So the conclusion has to be that these multitudes of individual and group realities are neither new nor unusual. The size of the groups and the strength of those realities, however, may actually be new in our modern free society, and there are several reasons for that change.

The first is a profound change in the core historical institution for passing knowledge on to new generations of humans. We identify the process as education, and as an institution it has been allowed to reduce the importance of passing on accumulated factual information like math, science, and even history in favor of teaching soft subjects and new philosophies about social issues and causes. This has led to an enlargement of the importance of non-fact-based ideas.

Perhaps growing out of those major changes in education priorities is a change in what was a fact-based profession: news reporting. Once a principle-driven institution, the news has largely been replaced by reporting from biased individuals sharing opinions. Even how, or whether, fact-based information is reported has changed. America created protections for the news business to prevent the institutions of government from inordinately controlling public access to information. What wasn’t expected was the watchdog becoming a lapdog.

Add to those a new phenomenon that is partly the result of internet companies deliberately taking on the role of information censors. On a number of fronts we are seriously restricting the public’s access to information that would allow them to reach informed conclusions. In the book 1984, George Orwell put forward that “Those who control the present, control the past and those who control the past control the future.” A number of governments and institutions have taken this notion to heart and seek to control reality by altering history itself.


• One theory is that Homo sapiens hunted Neanderthals down and deliberately killed them to eliminate food competition. Some scientists believe that their purpose in hunting Neanderthals was to eat them. The evidence for this was Neanderthal bones with butchery marks on them.

• There is another theory that Neanderthals died out due to disease. This theory claims that Homo sapiens in their advance into Europe from Africa carried with them viral diseases. Neanderthals’ immune systems, although well adapted to the germs in their own European environment, could not cope with these alien germs.

• One recent research project concludes that Homo sapiens had no part in Neanderthal extinction. The experts believe Homo sapiens did not eradicate Neanderthals because Neanderthals were already nearly extinct when Homo sapiens came out of Africa into Europe. Some experts believe that Neanderthals, perfectly adapted for the Ice Age, could not adapt to a warming climate, and the changes it brought to the landscape and animals.

• Recently, cheap genetic-sequencing techniques and new strategies for reducing and detecting modern-human DNA contamination have produced a breakthrough in understanding. It seems modern European Homo sapiens carry Neanderthal genes, proving that Neanderthals and Homo sapiens interbred after the two species diverged from our common ancestor and met again in Europe.

• China’s first emperor completely altered the nation’s history by demanding the destruction of all Confucian philosophy. It changed Chinese morals for hundreds of years.

• Reportedly at the command of Julius Caesar, the Library of Alexandria, one of the world’s first, was burned. Many believe it was the largest destruction of accumulated knowledge and cultural artifacts in history. It’s probable that many original discoveries and inventions were lost, along with original works by Plato, Aristotle, and others.

• India’s Nalanda University, sacked by Muslim invaders in 1193, dates back to A.D. 427 and was the origin of many common advanced educational processes and standards. It contained the largest collection of important Buddhist and Hindu texts in the world, and its destruction is said to have set the civilization back five centuries.

• The Chinese Qianlong Emperor (reigned 1735 to 1796) assembled a massive, biased collection of Chinese history, art, and literature known as the Siku Quanshu. To ensure its dominance in his society’s consciousness, hundreds of thousands of other written works were destroyed. Over 50 authors were executed after being labeled evil for criticizing ideas of the ruling class. Among the volumes purged were major Chinese encyclopedias and histories, with information lost forever.

• Literally the entire knowledge base of Mayan civilization was destroyed on the orders of Diego de Landa, Bishop of Yucatán, on July 12, 1562. The exact number of books destroyed isn’t known, but it was at least several thousand. The entire religion, art, and traditions of a complete ancient society died in that one event.

• In support of Hitler and the Nazi party, on the night of May 10, 1933, German students from universities once regarded as among the finest in the world gathered in Berlin to burn books containing “unGerman” ideas. Several hundred thousand books were destroyed that night, some of them the only copies in existence. The purge extended all over Germany and suppressed anti-Nazi sentiment.


• History is littered with examples of these attempts to control or force a new inter-subjective reality on large groups. China has employed the process on a number of occasions in modern times. In August 1966, Mao Zedong called for the start of a Cultural Revolution at the meeting of the Communist Central Committee. He urged the creation of corps of “Red Guards” to punish party officials and any other persons who showed bourgeois tendencies. Between 1966 and 1976, the young people of China, following guidelines created by Mao, rose up in an effort to completely purge the nation of the “Four Olds”: old customs, old culture, old habits, and old ideas, the very pillars of China’s historical inter-subjective reality. Additionally, Chinese re-education through forced labor was a system of detention active from 1957 to 2013, used to detain persons accused of minor crimes, political dissidents, and religious believers. China’s system of managing the information available to its people has become so effective that on the thirtieth anniversary of the Tiananmen Square event, journalists could not find anyone on the streets of Beijing who had any memory of what happened. They simply did not know that on June 3rd and 4th, 1989, the Chinese government cracked down on student demonstrators seeking democratic reform in Tiananmen Square. Hundreds were killed, thousands were arrested, and a significant number of students vanished, never to be seen again. This event has been effectively erased from the minds of the Chinese people.

• At least in the modern western world, we have come to believe in a number of notions about what freedoms and rights people should have and what tools should be employed to ensure an informed and educated population. Discussions concerning culture, society, and political institutions almost universally reject techniques designed to control access to information and thereby manipulate broad-based inter-subjective reality. The very foundation of western democratic government was built on the notion of an open and free marketplace of ideas, regardless of how appealing or abhorrent. Censorship is the path that leads to authoritarianism.

• Confirmation bias explains why the stronger our subjective beliefs are, the harder they are to overcome. We have ingrained habits that continually reinforce those beliefs: we seek out confirming information while completely ignoring contradicting information.

COVID-19 – Coronavirus Data?

The policies and rules dictated by our government in reaction to the COVID-19 pandemic are impacting all of us. Any approach we take should focus on three principal areas:

First, it should collect and analyze data about the disease: likely symptoms, its spread, its lethality, and susceptible populations.

Second, it should identify where the disease is and how it’s spreading in order to direct an effective medical response involving supplies and quarantine requirements, and to identify effective medications and support the development of treatments and a vaccine.

Third, it should study the pandemic’s economic costs to the country and adopt policies to best mitigate the financial impact of the pandemic on all communities.

Collecting data on the threat should be the highest priority; if we don’t understand the threat, we cannot effectively formulate a proper response. In the COVID-19 pandemic, accurate information is the most important thing for every citizen to have if they’re to make reasonable decisions about keeping themselves and their families safe.

Individuals make assessments about risks and associated costs all the time. Cost-benefit analysis is a part of life at every level: how much insurance to carry, whether to get a seasonal vaccine, fire extinguishers, smoke detectors, seat belts, medical treatments, and on and on. Governments do the same, but for those decisions to be appropriate, they and all of us need good data.

What Data Matters

With COVID-19, what information is critically important for meaningful decisions? We need to know how many Americans were or are infected, how quickly the disease is spreading and where, who the most at-risk people are, and how many are actually dying from it. We also need accurate data on how many Americans have recovered. After almost three months of cases in the United States, we actually have meaningful information on only one of those questions: who is most at risk.

The majority of data that we have access to and that is being widely reported includes:

Total People Infected. The issues with this number are twofold. First, how important is the number, actually? Since a majority of infected people have minor or no symptoms, the reported total shouldn’t be assumed to characterize the actual number of infections or the threat. Second, a number of studies of local populations looking for antibodies strongly suggest we are undercounting cases by at least 500%, which further diminishes the reliability of that data as a measure of the actual threat.

Deaths from COVID-19. While there is no doubt that this is a serious and deadly disease, much of the fatality reporting is also inaccurate. The total death count is more than likely exaggerated. The primary websites tracking COVID-19 are Johns Hopkins and the CDC; their data generally do not match, and often they do not cover the same dates. There have been numerous reports that health providers are being asked to submit data that over-reports deaths from COVID-19, and because of the financial actions taken during the pandemic, hospitals and local governments can actually benefit financially from over-reporting.

Total Patients Recovered. Johns Hopkins stopped keeping a total count of recovered cases on their site weeks ago. They now report by state, and well over half of the states do not report recoveries at all. Whatever number is reported is also seriously under-reported: confirmed cases are based on lab tests, and a hospitalized person is tested again before discharge, but with a majority of cases being asked to just stay home and supposedly remain in self-quarantine for 14 days after recovery, almost all of those cases go unreported as recovered.

The COVID-19 Lethality. We do know that the elderly, and specifically those with serious health issues, are at high risk of dying. To calculate the death rate from infection, we need the total number of people infected as the denominator and the number who have died from the disease as the numerator. The fatality rate has been reported at anywhere between 0.3% and 3.5%, but the simple fact is we don’t have an accurate count for either number, so we just don’t know.
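The arithmetic above can be sketched in a few lines. The counts below are hypothetical placeholders, not real data; the point is only to show how a several-fold undercount of infections, like the one the antibody studies suggest, swings the computed rate:

```python
# Back-of-envelope sketch of why the fatality rate is so uncertain.
# All counts below are illustrative placeholders, not real data.

def fatality_rate(deaths: int, infections: int) -> float:
    """Crude fatality rate: deaths divided by total infections."""
    return deaths / infections

reported_cases = 1_000_000   # hypothetical confirmed case count
reported_deaths = 35_000     # hypothetical death count

# Naive rate using confirmed cases only.
naive = fatality_rate(reported_deaths, reported_cases)

# Antibody studies suggest cases may be undercounted several-fold;
# assume true infections are 5x the confirmed count.
adjusted = fatality_rate(reported_deaths, reported_cases * 5)

print(f"naive rate:    {naive:.1%}")     # 3.5%
print(f"adjusted rate: {adjusted:.1%}")  # 0.7%
```

With these made-up numbers, the same death count yields a rate anywhere from 3.5% down to 0.7%, which is roughly the spread of published estimates.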

The Data is a Mess

While there is active research gathering data to understand how this pandemic is unfolding, the government primarily reports “model” forecasts without supplying the underlying data. What we do know is that we have no idea how many Americans have been infected. We don’t really know how many people have died “of” COVID-19, and because of that we don’t know the actual mortality of the disease. But most importantly, we don’t know the financial consequences of having shut down a majority of businesses in the United States of America.

Consider the Following

While Johns Hopkins graphs total cases, they don’t graph recoveries or deaths, and those graphs are much more important. The CDC’s public data is spotty and suspect, and while the “Task Force” has asked for specific reporting from every hospital in the country, less than half have complied.

On April 25th alone, the Johns Hopkins COVID-19 total case count in the United States increased by 32,793, while the CDC reported 7,875 deaths attributed to COVID-19 and 3,310 who died of pneumonia and COVID-19.

On May 2nd, the Johns Hopkins COVID-19 total case count in the United States increased by 20,340, while the CDC reported 1,636 deaths attributed to COVID-19 and 637 who died of pneumonia and COVID-19.

Johns Hopkins stopped keeping a total count of people recovered. It is now suggested that symptoms appear about a week after exposure and that most cases resolve after another two weeks, so statistically about half of the people recorded as infected should by now be recovered. On May 11th we were officially reporting 1,371,000 cases in America and only 224,000 recovered, which seems like total nonsense; it should be at least double that. It also doesn’t really matter how many total people have been infected unless the intent of the reporting is to spread and increase panic.
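The "about half should be recovered" estimate follows from a simple rule of thumb: a case confirmed more than roughly three weeks ago should have resolved, one way or the other. A minimal sketch of that estimate, using entirely made-up case numbers and the simplifying assumption of a fixed 21-day resolution window:

```python
# Rough sketch of the "expected recovered" argument.
# Assumption: a case confirmed more than ~21 days ago has resolved
# (recovered or died), per the rough timeline described in the text.
# The case series below is invented for illustration.

def expected_recovered(cumulative_cases_by_day, deaths_to_date, resolution_days=21):
    """Estimate recoveries: cases old enough to have resolved, minus deaths."""
    if len(cumulative_cases_by_day) <= resolution_days:
        return 0  # outbreak too young for any case to have resolved
    # Cumulative count as of `resolution_days` ago: all of those have resolved.
    resolved = cumulative_cases_by_day[-(resolution_days + 1)]
    return max(resolved - deaths_to_date, 0)

# Hypothetical cumulative case counts, one entry per day (oldest first),
# growing by 10,000 a day for 60 days.
cases = [10_000 * d for d in range(1, 61)]
print(expected_recovered(cases, deaths_to_date=30_000))  # 360000
```

Even this crude model predicts that a majority of three-week-old cases should appear in the recovered column, which is exactly what the official totals fail to show.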

Your Thoughts Are Now A Commodity And They Are Being Sold

It’s much worse than you think. Think Google…

There was a time that psychologists and advertising companies worked at embedding the name of a detergent or other product in your mind so you would reflexively reach for their product on the store shelf. One of their favorite techniques was ads that mentioned the product name ten to twenty times in a broadcast commercial. It did have an impact and there was a time when people were concerned about the practice. How naive we were.

We now have to admit that the internet has taken over our lives and has become our preferred way to access information. Once it was newspapers, the evening news, magazines, encyclopedias, and on and on, but now it’s the world wide web. Early on we thought this was a good thing, and I still marvel at what I can access on my laptop and phone. I admit it, I am addicted too, but I’m becoming concerned.

The gateway to literally everything is the search engine. I can remember my son’s teacher showing me how to “Yahoo” and my reaction – it seemed a miracle. That was almost thirty years ago and that miracle has now metastasized. We’re so used to it that I don’t think we question it as much as we should, but occasionally I stumble upon something that gets my attention.

Google With Love And Hate

When it comes to the current gatekeeper to information, it’s Google. There are other search engine choices, but Google’s use is about double all the others combined. As much as I am addicted to the internet, I have also become addicted to Google. I prefer Android phones, my calendar is Google, I use Gmail, my TV has Chromecast, and my home’s wireless uses Google hardware. I transfer my photographs between devices with Google Photos, and on and on, but I have serious issues with their search engine.

A decade or so ago I became concerned about how Google was presenting me with search results. There were a couple of topics that I was interested in and I was well aware that there were conflicting arguments, but Google seemed to lean toward only one of the views. Often I had to go three, five or more pages into the search results to get information on the opposing argument. At the time it disturbed me to think that Google was demonstrating a bias in presenting me with results on topics involving economics, politics, science and more. An unbiased process in returning search results seemed to me a requirement for such an important “obligation”.

My hobbies are travel and photography, and I combine them in producing a travel blog. I’ve been at it for a couple of years and will admit that I am a very small fish in a very large ocean. I’ve always believed that there were rules providing everyone an opportunity on the web. I no longer believe that. I now believe the process is rigged to an extreme level, and much of it has nothing to do with social causes – it’s money, and it’s hidden from view.

The second-largest general topic on the internet is travel (the first is porn), and there are huge amounts of money involved. If you think about it for a moment you will recognize that people book flights, hotels, cruises, tours, trains, rental cars, and more, world-wide, primarily on the internet. Billions of dollars are at play. In that internet travel space are a couple of sub-categories that focus primarily on supplying general travel information, like Frommer’s guides, Rick Steves, and National Geographic, and websites like Lonely Planet and TripAdvisor.

Founded in 2000, TripAdvisor was intended to be a compilation of information from other travel sources, but as an afterthought it included a feature for visitors to add a review. The site exploded and was later acquired by Expedia’s parent company; the companies subsequently split, partly to preserve TripAdvisor’s impartiality. Today TripAdvisor is the eight-hundred-pound gorilla in the space. If you are a member you get email offers for cruises and resorts along with recent articles about travel, and responding to those articles you will discover offers from cruise companies, airlines, hotels…

Here’s the issue with Google searches in this space. First, the top of the results page always shows entries with that little indicator that they are ads. Okay, that’s where they make money; no big deal. Next, you will often discover that a lot of travel results take you to TripAdvisor, and at times you can’t be sure why. Once, the first eight results were all TripAdvisor, most being a single member’s review without much information. Something seems fishy here. Is Google biased toward TripAdvisor, and is there money involved?

There is something different about Google searches. Recently my travel blog had three referrals with the search term “destinations index”. I have a number of index posts that link to content, and one is Destinations Index. It wasn’t something I planned, but if you use any search engine other than Google (Bing, Yahoo, DuckDuckGo, Dogpile, and more) with that search term, you will find my page at the top of the list, or at least in the top four. Use that search term on Google and it isn’t there; I gave up after fourteen pages.

If you are thinking about travel, Google is selling your thoughts to specific travel websites. Their playing field isn’t fair. There is a lot of discussion about their bias feeding you with results slanted to one side of an issue – global warming, Democratic candidates, rights to live or die, social causes… They are influencing our thoughts by what information they allow us to see but it seems they are also selling our thoughts about what we are interested in to specific companies.

The Google information gatekeeper charges a toll, and we never knew the commodity was actually our thoughts. What gave them the right? There is a possible solution: start using a different search engine (Bing even pays you to use theirs), and you might also question using Chrome as your browser.

A Veneer of Legitimate Science Crafted To Advance Questionable Claims


I was doing research for an article on the effects of a chemical being broadly used in industry when I stumbled on an online piece at CNN. The piece was titled “Study estimates 15,000 cancer cases could stem from chemicals in California tap water,” reported by Nadia Kounang, CNN, updated 9:23 AM ET, Tue April 30, 2019. (Why was it necessary to update this article?)

I do a fair amount of work that involves scientific journals, I have a background in statistics, and this piece struck me as odd. The article noted that Californians could see an increase of 15,000 cancers statewide over the period of a lifetime(??). It bothered me because in a state with a population of roughly 40,000,000, over a period of roughly 75 years, that incidence is basically insignificant and falls below the detection threshold of even a research project with careful controls.
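The scale argument is simple arithmetic, and it is worth making explicit. The figures below are rough round numbers (approximate state population, a nominal 75-year lifetime), used only to show the order of magnitude of the claim:

```python
# Quick sanity check of the scale of the headline claim,
# using rough round figures rather than exact data.
population = 40_000_000      # approximate California population
excess_cancers = 15_000      # claimed excess cases over a lifetime
lifetime_years = 75          # nominal lifetime used for the estimate

per_year = excess_cancers / lifetime_years            # statewide cases per year
rate_per_100k = per_year / population * 100_000       # annual rate per 100,000 people
print(f"{per_year:.0f} cases/year, {rate_per_100k:.2f} per 100,000 per year")
# 200 cases/year, 0.50 per 100,000 per year
```

An excess on the order of half a case per 100,000 people per year is far below what any epidemiological study could reliably detect, which is what made the headline suspicious.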

The study was from researchers at the advocacy group Environmental Working Group of Washington, D.C., and was published in the journal Environmental Health. The first difficulty I had was locating the publication and the article. Eventually I located the journal, one of the supposedly two to three hundred scientific journals published by BMC, a part of Springer Nature. Their mission statement on the website reads:

A pioneer of open access publishing, BMC has an evolving portfolio of high quality peer-reviewed journals including broad interest titles such as BMC Biology and BMC Medicine, specialist journals such as Malaria Journal and Microbiome, and the BMC Series.

The article in question was published in April 2019 and was an attempt to apply methods used in air-quality risk assessment to drinking water. The paper is reasonably short and includes the two following statements:

“Cumulative risk assessment for drinking water has lagged behind similar methodologies already standard in air quality evaluations. In our estimate, the slow adoption of cumulative methods in drinking water assessments is at least partly due to the variety of health outcomes caused by drinking water contaminants. While acknowledging the scientific challenges of assessing the impacts of co-occurring chemicals on multiple body systems, we believe that the drinking water field can start with the application of existing cumulative risk methodologies established for air quality.”

“The EPA’s technical support materials for the National Air Toxics Assessment note that the true value of the cumulative risk is not known and that the actual risks could be lower than predicted [2].”

So it seems they are applying air-quality evaluation standards to water while admitting that the true value of cumulative risk in air isn’t known and may be lower than predicted?

What they did come up with is one hell of a headline and a remarkable amount of media coverage. It seems they got mentioned in The Washington Post and The New York Times and probably a number of other news outlets.

In browsing other BMC “scientific” journals I found a number of other shocking headlines, with one particular organization repeatedly getting published: the Ramazzini Institute of Italy. They have announced findings like “Splenda Causes Cancer,” “Cellphone Radiation Causes Brain Tumors,” and “New Results On Glyphosate Danger.” The Ramazzini Institute’s research is widely distrusted by the Food and Drug Administration, the European Food Safety Authority, and the Environmental Protection Agency, as well as a virtual who’s who of modern science.

I am saddened by this afternoon’s walk through borderline research, questionable institutions, and scientific publications. They seem to form a large network of radical organizations, lobbying groups, litigation law firms, and pseudo-scientific institutes financed by millions in donations, litigation proceeds, and well-intentioned private and government grants. They now represent a vicious conspiracy that plants misleading information in the media to stir up public fears and gain support, then uses the fruits of this misrepresentation to legally attack businesses and win or extort hundreds of millions of dollars while making a number of participants rich.

I have always been puzzled by why bad scientific arguments seem to prevail in areas from breast implants to talcum powder to Roundup, but I am beginning to get the picture.

Selective Blank Slatism

Bo Winegard & Cory Clark

Published on May 3, 2019 

Selective Blank Slatism and Ideologically Motivated Misunderstandings

Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I’ll guarantee to take any one at random and train him to become any type of specialist I might select—doctor, lawyer, artist, merchant-chief and, yes, even beggar-man and thief, regardless of his talents, penchants, tendencies, abilities, vocations, and race of his ancestors. ~John B. Watson

Blank slatism is the view, exemplified here with John B. Watson’s characteristic arrogance, that human nature is highly flexible and largely determined by environmental forces. Because almost all the available evidence suggests that blank slatism is incorrect, many scholars are puzzled that versions of this philosophy appear to remain popular in certain university departments and among the intelligentsia more broadly. Some critics of progressivism, such as the economist Thomas Sowell, have contended that political progressives are particularly likely to hold blank slate beliefs as a result of their tendency to attribute many social disparities to environmental and social causes and to de-emphasize genetic ones.

Others—usually those favorably inclined to progressivism, like the Guardian‘s Ed Rooksby—have argued that this is a misrepresentation, a lazy straw man argument, and that, properly understood, most progressives are not blank slatists at all. Rather, they are simply sensitive to the effects of social forces and injustices, and this sensitivity is often mischaracterized by ideological opponents as naive environmentalism. Many of these people argue that, in fact, progressives are more likely than conservatives to accept that genetics contribute to human behavior. After all, conservatives still appear reluctant to believe that sexuality is caused by genes or constrained by one’s nature. Rather, they believe it is a choice, an exercise of a person’s free will. Similarly, they are equally unlikely to attribute social failure to a person’s genes, but instead blame a person’s attitudes to work and commitment, contending that the destitute are often lazy and undisciplined.

Those who defend progressivism are partially correct; progressives aren’t blank slatists in general, and indeed they appear more likely than conservatives to accept genetic causes for many human behaviors and life outcomes. However, progressives are selective or ideological blank slatists. That is, they generally accept that there is some kind of nature that constrains individuals. Richard Dawkins could never be LeBron James nor vice versa, no matter their respective diets, upbringings, or effort exerted.

However, they are selectively skeptical that an appeal to this nature (genetics) can explain certain kinds of differences between humans, between sexes, and among ethnic populations. Specifically, they are skeptical of genetic explanations if they appear to suggest that social inequalities are “natural” or caused by genetic differences between groups, and especially when those differences appear to favor the higher status group (for instance, that men are better than women at something on average because of genetic differences between men and women).

The notion that progressives are selective blank slatists is congruent with theory, observational evidence, and systematic survey evidence.

Selective Blank Slatism: Theory and Evidence

One of the chief psychological differences between conservatives and progressives is that progressives are more averse to inequality. Both, of course, see disparities in the world—they see that a professor has more status than a construction worker or that a lawyer makes more money than a social worker and so on. But progressives find these disparities more disconcerting.

Where such disparities exist, there are at least two possible explanations. The first is that genes have endowed certain individuals and groups with natures that lead to better life outcomes than others (for instance, some have higher intelligence, superior athletic ability, greater musical talent, exceptional beauty, and captivating charisma). The second is that all individuals and groups are born genetically equal in their capacities to develop desirable traits and abilities, but then these natural equalities are distorted by environmental and social forces, which thwart certain individuals and groups trying to achieve their full potential.

Those who particularly abhor inequality appear to prefer the latter explanation for two reasons. First, it suggests that groups and individuals are naturally equal. Second, it suggests that equality in life outcomes can be achieved in a genuinely free and meritocratic society. The pressing political project at hand, then, is to create such a society. Accepting the first explanation (that individuals and groups naturally differ) is morally unpleasant for progressives simply because it violates their preference for equality; but it is also unpleasant because it means that society can only make individuals and groups equal by violating meritocratic principles with interventionist policies that favor certain groups.

The view that most humans and all groups are basically equal is a kind of cosmic egalitarianism that suggests that the universe is just and fair, but that people are not. This view ineluctably leads to selective blank slatism because if humans are, in fact, naturally equal, then the only thing that could explain social disparities are environmental forces.

So, selective blank slatism is theoretically consistent with progressives’ psychological inclinations and preferences. It also conforms to informal inferences we can draw from what we see in the world. For example, when James Damore’s “Google memo” was released, progressives immediately assailed him, accusing him of perpetuating sexism in the tech industry. Despite how scurrilous many of the attacks on Damore were, his actual memo was a generally judicious and cautious document. He simply asserted that some of Google’s diversity policies were unfair and likely doomed to failure because they failed to consider biological (read, natural or genetically caused) differences between men and women.

The consternation and outrage the memo provoked among progressives is readily explicable if we accept that progressives are cosmic egalitarians. Women are under-represented in the tech industry and, because a cosmic egalitarian cannot countenance genetic differences between men and women, this disparity must necessarily be attributed to sexism. Furthermore, anyone who claims otherwise is wilfully defending an intolerable status quo.

We now have strong, systematic evidence that supports the theory and the informal observations that progressives are cosmic egalitarians and selective blank slatists. We collected survey data from 3,274 people. We first asked traditional demographic questions, including political ideology on a 7-point scale (from 1 = very conservative to 7 = very liberal), and then asked many questions about sex and ethnic differences and the causes of social disparities. For analytical purposes, we divided participants into extreme conservatives (those who answered 1 on political ideology), conservatives (answered 2-3), moderates (answered 4), liberals (answered 5-6), and progressives (answered 7). It’s important to note that our scale did not use the label “progressive.” The term is ours to describe extreme liberals. Overall, 488 participants, or roughly 15 percent, were progressives as we defined the term. Although we asked a variety of questions, we will only report seven of the most directly germane here (curious readers can examine this, which reports all of the data).
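The grouping scheme described above can be sketched in a few lines. This is a minimal illustration, not the authors' analysis code; the function and variable names are ours, and only the headline counts (3,274 respondents, 488 progressives, 130 full endorsers) come from the text.

```python
# Map a 1-7 ideology self-placement (1 = very conservative,
# 7 = very liberal) onto the five analytic groups used above.
def ideology_bucket(score: int) -> str:
    buckets = {
        1: "extreme conservative",
        2: "conservative", 3: "conservative",
        4: "moderate",
        5: "liberal", 6: "liberal",
        7: "progressive",
    }
    if score not in buckets:
        raise ValueError("score must be an integer from 1 to 7")
    return buckets[score]

# Headline proportions as reported in the text:
progressive_share = 488 / 3274   # about 0.15 of the full sample
endorse_share = 130 / 488        # about 0.27 of progressives endorsed at 7
```

(The second ratio is 26.6 percent, which the text truncates to "26 percent.")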

First, consistent with selective blank slatism, progressives more than others reported that men and women have equal abilities on all tasks. (Questions were on a 7-point scale, from 1 = do not agree at all to 7 = agree completely.)

And they also agreed, more than others did, that all ethnic groups have equal abilities on all tasks.

Consistent with these answers, they also reported, more than others did, that differences between the sexes (and between ethnic groups) were likely caused by discrimination. (Notice that the question/statement claims that the only reason there are differences is because of sexism. A full 130 progressives, or 26 percent, endorsed this at 7, indicating that they agree completely.)

Predictably, they also agreed more than other groups did that people use science to justify existing inequalities. These findings are consistent with progressives’ response to the Damore Google memo. Progressives are likely to impute nefarious motives to anyone who asserts that men and women differ biologically—even when such assertions are supported by science.


Progressives do accept a genetically caused human nature; but, consistent with the claims of their critics, they accept this much less when it is ideologically inconvenient. In other words, progressives are selective or ideological blank slatists. They accept genetic explanations for things such as homosexuality, transsexuality, obesity, addiction, and a variety of mental illnesses, but not for sex or group differences, and especially not when those sex or group differences could explain (and thus potentially justify) existing inequalities between sexes and groups.

Our data, although limited, provide compelling support for the contention that progressives are selective blank slatists. Progressives agreed more strongly than any other ideological group with statements that convey blank slate attitudes about sex and ethnic differences (precisely the kind of blank slatism a priori theory would predict progressives would hold). Most supportive and perhaps most surprising, a full 26 percent of progressives fully endorsed the statement that “the only reason there are sex differences is because society is sexist,” which is, to put it mildly, a wildly implausible claim.

It should not surprise us that progressives have an ideologically saturated view of human nature. On all sides, concerns about human nature are intense and passionate because today’s competing ideologies are premised on different conceptions of human variation and its relation to the ideal social order. With so much at stake, few are capable of approaching the evidence with an open mind. Conservatives too, as noted in the introduction, are almost certainly selective blank slatists. They appear, for example, to be more skeptical that mental illnesses, drug addiction, and sexual orientation are caused by genetics. And although conservatives do appear to accept a more constrained view of humans than do progressives, they often argue that all (or almost all) people, if they just work hard, can succeed. Furthermore, they often blame social pathologies exclusively on cultural deficits and decadence.

So, it is unlikely that either the Left or the Right has a monopoly on bias; and it is unlikely that either is absolutely correct about human nature (although, it is possible that one is more correct than the other). If we begin to understand these biases and the errors into which they lead us, then we can begin to adjust for them. We can, as it were, correct our distorted vision with the spectacles of self-conscious and disciplined reflection. The first step might be to ask ourselves a simple question: How likely is it that what we want to be true of human nature is true of human nature? In other words, if all of the “facts” about humans conform to our desires then that is strong evidence not that we are lucky, but that we are wrong.


Bo Winegard is an essayist and an assistant professor at Marietta College. You can follow him on Twitter @EPoe187

Cory Clark is an assistant professor of psychology at Durham University who studies moral psychology and free will. You can follow her on Twitter @ImHardcory

Second-Hand Smoke Again?

From the “sometimes the fraud is so obvious only really smart people can’t see it” department.

The EPA Particulate Matter studies greatly exaggerated the health risks of exposure to PM 2.5 (Particulate Matter as small as 2.5 microns) to justify proposed regulations to control second-hand smoke.

A new study indicates you can smoke (each cigarette is a massive PM2.5 exposure) until you are age 35, quit, and then by age 50 statistically have normal life expectancy — despite inhaling over 4 pounds of PM2.5 as a smoker. For comparison, a non-smoker inhales about 2 ounces of PM2.5 over the course of an 80-year lifespan.

I Ask Only Because I Can

I raise this only because I am here. If I weren’t here I certainly would not raise this issue with you.

Just as Goldilocks found the porridge that was just right, the Earth seems to be just right for living creatures. The Earth seems to be the perfect distance from the sun with lots of water.

The fine-tuned Universe is the proposition that the conditions that allow life in the Universe can occur only when certain universal dimensionless physical constants lie within a very narrow range of values, so that if any of several fundamental constants were only slightly different, the Universe would be unlikely to be conducive to the establishment and development of matter, astronomical structures, elemental diversity, or life as it is understood.

As Stephen Hawking has noted, “The laws of science, as we know them at present, contain many fundamental numbers, like the size of the electric charge of the electron and the ratio of the masses of the proton and the electron. … The remarkable fact is that the values of these numbers seem to have been very finely adjusted to make possible the development of life.”

Martin Rees formulates the fine-tuning of the Universe in terms of the following six dimensionless physical constants.

N, the ratio of the strength of electromagnetism to the strength of gravity for a pair of protons, is approximately 10³⁶. According to Rees, if it were significantly smaller, only a small and short-lived universe could exist.

Epsilon (ε), a measure of the nuclear efficiency of fusion from hydrogen to helium, is 0.007: when four nucleons fuse into helium, 0.007 (0.7%) of their mass is converted to energy. The value of ε is in part determined by the strength of the strong nuclear force.[13] If ε were 0.006, only hydrogen could exist, and complex chemistry would be impossible. According to Rees, if it were above 0.008, no hydrogen would exist, as all the hydrogen would have been fused shortly after the big bang. Other physicists disagree, calculating that substantial hydrogen remains as long as the strong force coupling constant increases by less than about 50%.
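The first two of Rees's numbers can be recomputed directly from standard physical constants. The sketch below hard-codes CODATA values so it is self-contained; it ignores the electrons and neutrino losses in the fusion chain, so it is an order-of-magnitude check rather than a precise derivation.

```python
import math

e     = 1.602176634e-19    # elementary charge, C
eps0  = 8.8541878128e-12   # vacuum permittivity, F/m
G     = 6.67430e-11        # gravitational constant, m^3 kg^-1 s^-2
m_p   = 1.67262192369e-27  # proton mass, kg
m_He4 = 6.6446573357e-27   # helium-4 nucleus (alpha particle) mass, kg

# N: electrostatic vs. gravitational attraction between two protons.
# Both forces fall off as 1/r^2, so the separation cancels.
N = (e**2 / (4 * math.pi * eps0)) / (G * m_p**2)   # on the order of 10^36

# Epsilon: fraction of rest mass released when four protons end up
# (via intermediate fusion steps) as one helium-4 nucleus.
eps = (4 * m_p - m_He4) / (4 * m_p)                # roughly 0.007
```

Running this gives N ≈ 1.2 × 10³⁶ and ε ≈ 0.0069, matching the values Rees quotes.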

Omega (Ω), commonly known as the density parameter, is the relative importance of gravity and expansion energy in the Universe. It is the ratio of the mass density of the Universe to the “critical density” and is approximately 1. If gravity were too strong compared with dark energy and the initial metric expansion, the universe would have collapsed before life could have evolved. On the other side, if gravity were too weak, no stars would have formed.

Lambda (λ), commonly known as the cosmological constant, describes the ratio of the density of dark energy to the critical energy density of the universe, given certain reasonable assumptions such as positing that dark energy density is a constant. In terms of Planck units, and as a natural dimensionless value, the cosmological constant, λ, is on the order of 10⁻¹²². This is so small that it has no significant effect on cosmic structures that are smaller than a billion light-years across. If the cosmological constant were not extremely small, stars and other astronomical structures would not be able to form.[12]

Q, the ratio of the gravitational energy required to pull a large galaxy apart to the energy equivalent of its mass, is around 10⁻⁵. If it is too small, no stars can form. If it is too large, no stars can survive because the universe is too violent, according to Rees.

D, the number of spatial dimensions in spacetime, is 3. Rees claims that life could not exist if there were 2 or 4 dimensions of spacetime nor if any other than 1 time dimension existed in spacetime.

Various possible explanations of ostensible fine-tuning are discussed among philosophers, scientists, theologians, and proponents and detractors of creationism. The fine-tuned Universe observation is closely related to, but is not exactly synonymous with, the anthropic principle, which is often used as an explanation of apparent fine-tuning.

The anthropic principle is a philosophical consideration that observations of the universe must be compatible with the conscious and sapient life that observes it. Some proponents of the anthropic principle reason that it explains why this universe has the age and the fundamental physical constants necessary to accommodate conscious life. As a result, they believe it is unremarkable that this universe has fundamental constants that happen to fall within the narrow range thought to be compatible with life. The strong anthropic principle (SAP) as explained by John D. Barrow and Frank Tipler states that this is all the case because the universe is in some sense compelled to eventually have conscious and sapient life emerge within it. Some critics of the SAP argue in favor of a weak anthropic principle (WAP) similar to the one defined by Brandon Carter, which states that the universe’s ostensible fine tuning is the result of selection bias (specifically survivor bias): i.e., only in a universe capable of eventually supporting life will there be living beings capable of observing and reflecting on the matter. Often such arguments draw upon some notion of the multiverse for there to be a statistical population of universes to select from and from which selection bias (our observance of only this universe, compatible with our life) could occur.

The First Americans

I have long been interested in the settling of our continent by people from somewhere else. My interest intensified when I found a Clovis spear point on our property in Georgia. When I looked into Clovis sites I became even more puzzled by what I found. Most Clovis sites are dated between 8,000 and 11,500 years old and, surprisingly, many more are found in the Southeastern United States than anywhere else. That seemed odd to me for a migration that was supposed to have come through Alaska. Research into how humans came to the American continents has been going on for a long time, and there are as many inconsistencies and gaps as there are theories, but it is looking like genetics may be the path to answers.

I found the following article this morning and thought it worth a read…

A BBC Report By Melissa Hogenboom

30 March 2017

Many thousands of years ago, not a single human being lived in the Americas.

This only changed during the last Ice Age. It was a time when most of North America was covered with a thick sheet of ice, which made the Americas difficult to inhabit.

But at some point during this time, adventurous humans started their journey into a new world.

They probably came on foot from Siberia across the Bering Land Bridge, which existed between Alaska and Eurasia from the end of the last Ice Age until about 10,000 years ago. The area is now submerged by water.

There is still debate about when these first Americans actually arrived and where they came from. But we are now getting closer to uncovering the original narrative, and finding out who these first Americans really were.

During the last Ice Age lower sea levels exposed a land bridge across the Bering Sea (Credit: Tom Thulen/Alamy)

During the peak of the last Ice Age about 20,000 years ago, a journey from Asia into the Americas would not have been particularly desirable. North America was covered in icy permafrost and tall glaciers. But, paradoxically, the presence of so much ice meant that the journey was, in a way, easier than it would be today.

The abundance of ice meant that sea levels were much lower than they are now, and a stretch of land emerged between Siberia and Alaska. Humans and animals could simply walk from Asia to North America. The land bridge was called Beringia.

At some point around this time – known as the Last Glacial Maximum – groups of hunter-gatherers moved east from what is now Siberia to set up camp there.

“The first people who arrived in Beringia were probably small, highly mobile groups evolving in a large landscape, probably depending on the availability of seasonal resources,” says Lauriane Bourgeon of the University of Montreal, Canada.

These people did well to seek refuge there. Central Beringia was a much more desirable environment than the icy lands they had left behind. The climate was a bit damper. Vegetation, in the form of woody shrubs, would have given them access to wood that they could burn to keep warm.

Beringia was also an ideal environment for large grazing mammals, giving early hunter-gatherers something to hunt, says Scott Elias of Royal Holloway, University of London in the UK, who reconstructs past climates.

During the last Ice Age humans could walk from Siberia into the Americas (Credit: Gary Hinks/SPL)

“Our hypothesis is that people were using the woody shrubs from the land bridge to ignite bones on the landscape. The bones of big animals contain lots of fatty deposits of marrow, and they will burn.”

When humans got to Beringia, they would have had little choice but to set up camp there. The vast Laurentide and Cordilleran ice sheets further east cut them off from North America.

It is now becoming clear that they made Beringia their home, staying put for several thousand years. This idea is called the Beringian Standstill Hypothesis. This standstill helped these isolated groups of people to become genetically distinct from those they had left behind, according to a 2007 study.

This long standstill therefore meant that the people who arrived in the Americas – when the ice finally retreated and allowed entry – were genetically different to the individuals who had left Siberia thousands of years earlier. “Arguably one of the most important parts of the process is what happened in Beringia. That’s when they differentiated from Asians and started becoming Native Americans,” says Connie Mulligan of the University of Florida in Gainesville, US, who took part in this early analysis.

Since then, other genetic insights have further supported the standstill hypothesis. Elias and colleagues even propose that people stayed in Beringia for as long as 10,000 years.

When the ice finally started to retreat, groups of people then travelled to different pockets of the Americas.

There has long been debate over whether these early settlers arrived from several migrations from different areas, or just one.

Over 20 years ago, Mulligan proposed that there was just one migration from Beringia into the “New World”. She came to this conclusion by analysing the genetic variation in the DNA of modern-day Native Americans and comparing it with the variation in Asia. The same rare pattern appeared in all the Native Americans she studied, but very rarely appeared in modern-day Asians. This meant Native Americans likely arose from a single population of people who had lived in Beringia, isolated for many years.

In 2015, a study using more advanced genetic techniques came to a similar conclusion. Rasmus Nielsen of the University of California, Berkeley, US, and colleagues found that the “vast majority” of Native Americans must have originated from just one colonisation event.

“There’s been no turnover or change in the population group as some people had previously hypothesised,” says Nielsen. In fact, about 80% of Native Americans today are direct descendants of the Clovis people, who lived across North America about 13,000 years ago. This discovery came from a 2014 genetic study of a one-year-old Clovis boy who died about 12,700 years ago.

But we now know there must have been staggered migrations from Beringia.

Many Native Americans today are direct descendants of the Clovis people (Credit: William Scott/Alamy)

That is because there are small groups of people in the Amazonian region of South America – such as the Suruí and Karitiana – with additional mysterious “arctic gene flow”, unrelated to the Clovis boy. Another 2015 study therefore proposed there was more than one “founding population of the Americas”.

The indigenous populations of the Americas, the team found, have distant genetic links in common with people of Australia, Papua New Guinea and the Andaman Islands.

This means, says Pontus Skoglund of Harvard University in Boston, Massachusetts, that people came into Beringia over different times during the “standstill” and went on to populate different parts of the Americas. Those early dispersals are still reflected by differences in the genomes of people living today.

“It wasn’t simply a single homogenous founding population. There must have been some type of patchwork of people, and maybe there were multiple pulses,” says Skoglund.

In other words, the Beringian inhabitants did not all arrive or leave at the same time.

This makes sense when you consider that Beringia was not a narrow land bridge with ocean on either side. “It was a huge region about twice the size of Texas,” says Elias. The people living there would have had no idea that it was a land bridge at all. “There were no sign posts saying they were leaving Siberia.”

This makes it highly likely that there were different groups of Beringian inhabitants that never met.

A study published in February 2017 strengthens this idea further. After examining the shapes of 800- to 500-year-old skulls from Mexico, researchers found they were so distinct, the people the skulls belonged to must have remained genetically isolated for at least 20,000 years.

There is evidence humans were present in Oregon 14,500 years ago (Credit: John R. Foster/SPL)

To understand who the first Americans really were, we have to consider when they arrived. While the exact timing is hard to pin down, Nielsen’s work gives some insight. By sequencing the genomes of people from the Americas, Siberia, and Oceania, he and colleagues could understand when these populations diverged. The team concludes that the ancestors of the first Americans came to Beringia at some point between 23,000 years and 13,000 years ago.

We now have archaeological evidence to suggest that the people who left Siberia – and then Beringia – did so even earlier than the 23,000-year-limit proposed by Nielsen and colleagues. In January 2017, Lauriane Bourgeon and her team found evidence of people living in a cave system in the northern Yukon Territory of western Canada, called the Bluefish Caves, that dates to as early as 24,000 years ago. It was previously believed that people had only arrived in this area 10,000 years later.

“They reached Beringia as early as 24,000 years ago, and they remained genetically and geographically isolated until about 16-15,000 years ago, before dispersing south of the ice sheets that covered most of North America during this period,” says Bourgeon.

The caves “were only used on brief occasions for hunting activities”, she says. “We found cut marks on bones from horse, caribou and wapiti, so we know that humans were relying on those species.”

Human cut marks were discovered on this 24,000-year-old horse mandible (Credit: Lauriane Bourgeon)

This work provides further evidence that people were in the Beringia area at this early date. But it does not reveal the exact dates these people first ventured further south.

For that, we can turn to archaeological evidence. For decades, stone tools left by the Clovis people have been found throughout North America. Some date to as early as 13,000 years ago. This might suggest that humans moved south very late. But in recent years evidence has begun to emerge that questions this idea.

For instance, at a site called Monte Verde in southern Chile, there is evidence of human occupation that dates between 14,500 and 18,500 years ago. We know these people built fires, ate seafood and used stone tools – but because they did not leave any human remains behind, much about this early group remains mysterious.

“We really know little about them, because most preserved remains are stone tools and sometimes bones of animals, thus technology and diet,” explains Tom Dillehay of Vanderbilt University in Tennessee, US, who is studying these people. “Monte Verde in south-central Chile, where I am at present, has several organic remains – animal hide, meat, plant remains that reveal a wider diet, wood technology – but these types of sites are rare to find.”

Another conundrum remains. Ice sheets still covered North America 18,500 years ago, making journeying south difficult. How did people arrive in southern Chile so early?

Animal remains were discovered in the Bluefish Caves site in northern Yukon (Credit: Lauriane Bourgeon/Canadian Museum of History)

A leading idea had been that an ice-free corridor opened up, which allowed humans to travel south. However, the latest evidence suggests this corridor only opened about 12,600 years ago, long after these early Chileans arrived.

Elias also points out how difficult this journey would have been. “Even if there was a small gap in between these enormous ice sheets, the environment left in that gap would have been so horrible, with mud, ice, meltwater and slush. It would not have been a habitable place for people or the animals they would have wanted to follow,” he says.

There is an alternative. These early people could have travelled by boat, taking a route along the Pacific coast. There is no archaeological evidence to support this idea, but that is not entirely unexpected: wooden boats are rarely preserved in the archaeological record.

There are still many unanswered questions, but Mulligan says that studying how and when early hunter-gatherers spread across the Americas can help us to understand the process of migration itself. That is, how population sizes change and which genetic traits persist.

In many ways, the peopling of America presents scientists with a golden opportunity to study these processes. There have been multiple migrations both into and out of other regions of the world – Africa, Europe and Asia, for instance. But the people who moved into the Americas were on a one-way journey. “We know the original inhabitants came from Asia into the New World with no other people there, and no major back migrations, so it’s the simplest model you can conceive of.”

That it was a one-way journey, coupled with the increased interest in studying the genetics of these ancient people, means we should soon understand even more about who these first Americans really were, and exactly when they arrived.

Melissa Hogenboom is BBC Earth’s associate editor. She is @melissasuzanneh on Twitter.

The Case Is Closed. Period.

Another post from my favorite soldier of fortune.

Darwin’s Vigilantes, Richard Sternberg, and Conventional Pseudoscience

Posted on September 22, 2018 by Fred Reed

I am sorry. I admit it: I am a bad person. I promise I will never write about this again. Well, sort of never. It’s just too much fun. Anyway, it’s not my fault. My childhood makes me do it. Maybe I ate lead paint.

Science is supposed to be the objective study of nature, impelled by a willingness to follow the evidence impartially wherever it leads. For the most part it works this way. In the case of emotionally charged topics, it does not. For example, racial intelligence, cognitive differences between the sexes, and weaknesses in Darwinian evolution. Scientists who do perfectly good research in these fields, but arrive at forbidden conclusions, will be hounded out of their fields, fired from academic and research positions, blackballed from employment, and have their careers destroyed.

A prime example is Richard Sternberg, who holds a Ph.D. in biology (Molecular Evolution) from Florida International University and a Ph.D. in Systems Science (Theoretical Biology) from Binghamton University. He is not a lightweight. From 2001 to 2007 he was a staff scientist at the National Center for Biotechnology Information, and he was also a Research Associate at the Smithsonian’s National Museum of Natural History.

Hell broke loose when, in 2004, he authorized the publication in the Proceedings of the Biological Society of Washington, an organ of the Smithsonian Institution, of a peer-reviewed article, “The Origin of Biological Information and the Higher Taxonomic Categories” by Stephen Meyer. It dealt with the possibility of intelligent design as an explanation of aspects of life not explainable by conventional Darwinian theory. This is a serious no-no among the guardians of conventional Darwinism, the political correctness of science.

At the Smithsonian, he was demoted, denied access to specimens he needed in his work, transferred to work under a hostile supervisor, and lost his office space. In the ensuing storm of hatred, two separate federal investigations concluded that he had been made the target of malicious treatment.

Predictably, the establishment dismisses Meyer’s idea as “pseudoscience”:

Wikipedia: The Sternberg peer review controversy concerns the conflict arising from the publication of an article supporting the pseudo-scientific concept of intelligent design in a scientific journal, and the subsequent questions of whether proper editorial procedures had been followed and whether it was properly peer reviewed.

Pseudoscience? Does not Darwinism itself qualify as pseudoscience? It is firmly based on no evidence. For most readers this assertion will seem as delusional as saying that the sun revolves around the earth. This is because we have been indoctrinated since birth in the Darwinian myth. But look at the facts.


We are told that life arose by chance in the primeval oceans. Do we know of what those oceans consisted? (Know, not speculate, hope, it stands to reason, must have been, everybody says so). No, we do not. Do we know of what those oceans would have had to consist to bring about life? No. Do we even know what we think evolved? No. Has the chance appearance of life been replicated in the laboratory? No. Has a metabolizing, reproducing chemical complex been constructed in the laboratory, showing that it might be possible? No. Can the chance appearance be shown to be mathematically probable? No. Can Darwinism explain the existence of irreducibly complex structures? No. Does the fossil record, particularly of the Ediacaran and Cambrian, support Darwin? No.

Darwinism was a clever metaphysical idea formed when almost nothing was known about the matter, and imposed by impassioned supporters on a near-total lack of evidence. Should not intensely believing in something that you cannot support by observation or experiment be called pseudoscience?

The ardent of evolution, like Christians, base their creation myth on a sacred book, The Origin of Species, both resting on about as much evidence. Thereafter they assume what is to be proved. Since Darwinists posit the unchallengeable truth of their version of creation, no reason exists for questioning it. If you know it happened, then obviously it was mathematically possible. The math can be discovered later. If you know that life originated in ancient seas, then how it originated becomes a mere detail. If you know the theory is correct, then anyone who doubts must necessarily be at least wrong, and thus ignorable, and perhaps a crank or fool or lunatic.

A classic example of starting from certainty is Darwinism’s reaction to the apparent irreducible complexity of the bacterial flagellum, though hundreds of others could be adduced. This is an immensely complex cellular organelle which would cease to function if any of its parts were removed. It could not have evolved by Darwin’s gradual changes. The Darwinians say, “Well, we aren’t sure just at the moment, but it is possible that we will figure out later how it could have happened.” Yes, and it is possible that I will win three Irish Sweepstakes in a row. They are, again, saying that they know that Darwinism is correct, and therefore the evidence will be forthcoming. This is called “faith,” the belief in the unestablishable.

As a friend has written in another context, “When utterly astonishing claims of an extremely controversial nature are made over a period of many years by numerous seemingly reputable academics and other experts, and they are entirely ignored or suppressed but never effectively refuted, reasonable conclusions seem to point in an obvious direction.”

Just so. A lot of highly credentialed researchers express doubts about doctrinaire Darwinism, asserting that it cannot explain many aspects of nature. What does explain them is a separate question. Why is wondering about this a firing offense?

A difficulty in conveying doubts about Neo-Darwinism (the correct name of the current theory) is that very few people, including the highly intelligent, know anything about the issue. The world is full of esoteric specialities from the decipherment of ancient Sumerian inscriptions to the neural anatomy of squids. Few will have chosen Darwin’s defects for careful study.

This is convenient for Darwinists as the dim will believe whatever they hear on television and the bright usually have other things to do with their brains. As the case of Mr. Sternberg shows, scientists who doubt Darwin–again, there are many–know better than to say anything.

The fury is telling. If the Darwinists could prove the many highly credentialed proponents of ID wrong, they would do so, and that would be that. If they could prove their own propositions correct, they would, and that would also be that. But they can’t (or they would have).

If you follow the controversy, you quickly see patterns. One is that the Darwinists are fiercely defensive, suggesting doubt of their own position. People seldom become infuriated at doubts of something that they believe with genuine certainty. If a physicist at CalTech expressed doubts about general relativity, he would certainly be challenged to prove his theory. He would not be hounded, belittled, forced to resign, charged with pseudoscience, and banned from publication.

Unfortunately for Neo-Darwinism, the likelihood of confirmation diminishes with time. Year by year, the fossil record becomes less incomplete, and still the intermediates are not found. As molecular biology rapidly advances, the failure to find a chemically possible chain of events that might produce life leads ever more convincingly to a simple conclusion: There isn’t one.

Publications by Richard Sternberg.


Science and Government for Our Own Good?

We Have Been Manipulated and We are Paying for It

From condo boards to Capitol Hill we are generally managed by a group of people who consider themselves our benefactors. More often than not these people consider themselves highly intelligent and possess the conviction that everyone would be happier, healthier and generally much better off if they would follow their rules and recommendations. Another characteristic of these leaders is that they seem to gravitate toward academia or politics. Over decades the relationship between like-minded academics and politicians has reinforced the notion that they are capable of finding fundamental solutions to most of our problems.

One example of this marriage is seen in government officials’ concern about the American diet and its effect on the “nation’s health”. While we may ask what business the mayor, the statehouse, or Washington has in trying to tell any of us what we should eat, we need to understand what actions they have taken in controlling our choices over the years and the consequences of those actions.

The truth is that our political class has often decided that it and its educated advisors are much smarter than average citizens, and it has acted hundreds of times over the past several decades to control what we eat.

In addition to instituting policies designed to improve our eating habits they have also managed (manipulated) the system that produces our food for any number of other “worthy” purposes.

At the end of World War I, the destructive effects of the war bankrupted much of Europe, closing major export markets for the United States and beginning a series of events that would lead to the development of agricultural price and income support policies. United States price and income support, otherwise known as agricultural subsidy, grew out of a serious farm income and financial crisis, which was understood to jeopardize the future ability of American agriculture to meet the food needs of the American people. This led to the widespread political belief that the free market system was not adequately rewarding farm people for their agricultural commodities.

Beginning with the 1921 Packers and Stockyards Act and the 1922 Capper-Volstead Act, which regulated livestock markets and protected farmer cooperatives against anti-trust suits, United States agricultural policy became more and more sweeping in its scope. In reaction to falling grain prices and the widespread economic turmoil of the Dust Bowl and Great Depression, three bills established price subsidies for farmers in the United States: the 1922 Grain Futures Act, the 1929 Agricultural Marketing Act, and finally the 1933 Agricultural Adjustment Act.

Out of these bills grew a system of government-controlled agricultural commodity prices and government supply control (farmers being paid to leave land unused). Supply control would continue to be used to decrease overproduction, leading to over 50,000,000 acres being set aside during times of low commodity prices (1955–1973, 1984–1995). The practice wasn’t curtailed until the Federal Agriculture Improvement and Reform Act of 1996. While the 1996 reform act was supposed to be a first step toward returning the agricultural economy to free enterprise, it left in place hundreds of specific supports and direct programs for farmers, so the result was at best a series of half measures. Even this reform act was a victim of the power of special interest lobbies over the general welfare of the public. Sugar is a classic example of how this system corrupts the political process.


In 1981 the U.S. government passed laws and implemented policies that were designed to support the price of domestically produced sugar (beet and cane) by using tariffs and purchase programs.

In 2001 sugar imported into the U.S. outside of the Tariff Rate Quotas (TRQ) paid a duty of over 15¢ per pound. At that time world sugar prices were averaging 7¢ per pound while U.S. sugar was selling at 22¢ per pound.

The cost in tax dollars flowing to support the sugar industry ran as high as $1.4 million per month in 2000 for storage of surplus sugar alone, with an overall estimated $200,000,000 per year currently in direct purchase costs.

One private U.S. company, US Sugar of Clewiston, Florida, produces over 18% of all U.S. sugar, and this government policy has made Clewiston one of the wealthiest per capita towns in the country.

In 1996, the year of the agriculture reform bill, sugar companies paid, directly and indirectly, $13 million to the 49 members of the House Agriculture Committee and, remarkably, managed to leave sugar price supports exactly as they were.

In a claimed effort to protect an economic segment that probably employs fewer than 17,000 people in total (with estimated job losses closer to 3,000–4,000 should price supports be removed), the U.S. government has spent hundreds of millions of dollars and caused the American people to suffer multiple negative unintended consequences.

As a result of this sugar policy, food companies have switched from refined sugar to fructose in the manufacture of most processed foods. Beyond processed foods, virtually all U.S. soft drinks and a majority of juices are now formulated with high fructose corn syrup. This has been done for one very simple reason… cost.

While there is an ongoing debate about the health problems associated with fructose versus sugar, there are troubling indications in some areas. Results published… by the journal Pharmacology, Biochemistry and Behavior in 2010, by researchers from the Department of Psychology and the Princeton Neuroscience Institute, reported on two experiments investigating the link between the consumption of high-fructose corn syrup (HFCS) and obesity. In lab animal experiments, the researchers found a strong association between HFCS and obesity that was not found in a diet using equal amounts of table sugar.

In 2002 the Department of Commerce, International Trade Administration issued an official report on the sugar laws’ impact on the confectionery industry in the U.S., which characterized the results as follows:

  1. Employment in sugar containing products (SCPs) industries decreased by more than 10,000 jobs between 1997 and 2002, according to the Bureau of Labor Statistics.
  2. For each sugar growing and harvesting job saved through high U.S. sugar prices, nearly three confectionery manufacturing jobs are lost.
  3. For the confectionery industry in particular, evidence suggests that sugar costs are a major factor in relocation decisions because high U.S. sugar prices represent a larger share of total production costs than labor. In 2004, the price of U.S. refined sugar was 23.5 cents per pound compared to the world price of 10.9 cents.
  4. Many U.S. SCP manufacturers have closed or relocated to Canada, where sugar prices are less than half of U.S. prices, and to Mexico, where sugar prices are about two-thirds of U.S. prices.
  5. Imports of SCPs have grown rapidly from $6.7 billion in 1990, to $10.2 billion in 1997, up to an estimated $18.7 billion in 2004 based on 2002 trends.


The fundamental problem with price supports is that no matter how justified they were at some point, as with most laws and government programs, they live on well past their usefulness.


Want to understand why Americans are getting fat? Look at the unintended consequence of trying to protect sugar growing jobs.


Another area where the anointed have taken action to protect us from ourselves is direct government intervention in controlling the American diet. While the changes worked on the American diet are significant, so is the quantity of our money (taxes) expended in the task.




It seems to be almost impossible to keep up with dietary research into which things in our diet are good for us and which are bad. Salt is bad (recent research has now reversed that), eggs are bad (it seems eggs switch back and forth regularly), trans-fatty acids are bad (they were originally introduced as a healthy substitute for animal fat in our diet). For almost sixty years the recommendations regarding a healthy diet have emphasized that animal fat is a major contributor to coronary disease.


As a recent article in the New York Times notes “The new findings are part of a growing body of research that has challenged the accepted wisdom that saturated fat is inherently bad for you and will continue the debate about what foods are best to eat.


For decades, health officials have urged the public to avoid saturated fat as much as possible, saying it should be replaced with the unsaturated fats in foods like nuts, fish, seeds and vegetable oils.


But the new research, published on Monday in the journal Annals of Internal Medicine, did not find that people who ate higher levels of saturated fat had more heart disease than those who ate less.”


Since the 1960s we have been pressured by our government and academics to avoid consuming fat in our diet. And while some will argue that the American people have not taken this government advice seriously, the facts actually tell a very different story. So why did they urge this and what have been the results?


Historically, in the late 1930s medical professionals began to notice an alarming increase in coronary disease. It was first made public in 1938 with an article in the Journal of the American Medical Association. By the 1950s it was declared an epidemic.


Along came Ancel Benjamin Keys, a scientist at the University of Minnesota, with the explanation. His research pointed to high dietary fat as the culprit. He published a seven-country study “proving” that the level of coronary disease was a result of the fat content of various national diets. Other researchers and government officials embraced the findings, and it didn’t hurt that he was a charismatic person with solid academic credentials. From that point on, funding and emphasis in additional research, dietary recommendations and government policy changed.


The government so embraced his findings that within a decade whole new departments were created to inform and help the public make healthy eating choices. There were departments to educate, others to provide packaging requirements so the public could better identify what they were eating, and others to do research on better nutrition.


In 2013 there was an $8.7 million budget just for the Center for Nutrition Policy and Promotion within the Department of Agriculture. While politicians are always quick to point out the need for the USDA to protect our food supply, inspection is actually one of the smallest parts of its budget. Consider 2010’s expenditures:


USDA 2010 budget (partial)

Farm and Foreign Agricultural Services $25 billion (not subsidies)

Food Safety $1 billion

Research, Education, and Economics $2.89 billion

Marketing and Regulatory Programs $2.75 billion

Food, Nutrition, and Consumer Services $97.95 billion*

  • (includes the Supplemental Nutrition Program for Women, Infants and Children [$7 billion] and food stamps [$41.2 billion]). What is the remaining $49.75 billion used for?

That is $78.64 billion in non-food expenses at the USDA, which represents the total average annual tax collected from over 17 million Americans.
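The itemized remainder in the Food, Nutrition, and Consumer Services line can be checked directly from the figures quoted above (a minimal arithmetic sketch; the dollar amounts are the ones stated in the budget list, in billions):

```python
# Checking the USDA 2010 "Food, Nutrition, and Consumer Services" line
# against its two itemized programs. All amounts in billions of dollars,
# taken from the figures quoted in the text.
food_nutrition_consumer = 97.95  # total line item
wic = 7.0                        # Women, Infants and Children program
food_stamps = 41.2               # food stamps

# Remainder of the line after subtracting the two itemized programs.
remainder = food_nutrition_consumer - wic - food_stamps
print(f"Unexplained remainder: ${remainder:.2f} billion")  # $49.75 billion
```

Nothing in the budget list accounts for this remainder, which is the point the question in the bullet is making.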


At the government’s urging, butter was replaced by margarine, lard by vegetable shortening (Crisco, which was a major funder of the American Heart Association) and other vegetable oils, and most importantly of all, eggs, cheese, and meat were replaced to a great extent by pasta and grain. As a result, the actual American diet switched from animal fat to vegetable oil and from beef and pork to poultry, along with a twenty-five percent increase in carbohydrates in the average American’s diet since the early 1970s.


Dr. Keys sealed the saturated fat argument in 1961 by taking a position on the nutrition committee of the American Heart Association, whose dietary guidelines are considered the definitive statement on the issue.


What were some of the results of this government-led diet change in America? Increased consumption of carbohydrates could be one of the most significant negative changes. One of the problems is that carbohydrates break down into glucose, which causes the body to release insulin, a hormone that is extremely efficient at helping the body store fat. As if carbohydrates weren’t enough, early clinical studies showed that diets high in vegetable oil resulted in higher rates of cancer and gallstones.


In probability and statistics there is an adage that correlation is not causation. Without controls and consideration of multiple variables, there cannot be high confidence, let alone certainty, in statistical results. It has also become common for nontechnical people (media, government) to report study results as significant when the statistics actually conclude that the data is not outside normal variation and probability. Oddly, in recent years an alarming number of researchers do not attempt to correct the record when the media seriously misstates the nature of their results, although in the past such corrections were common. Perhaps they do not think it is their job to educate anyone in probability and statistical hypothesis testing, but the result is usually the uneducated media and politicians overstating the significance of the results.
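The adage is easy to demonstrate with synthetic data: two variables that never influence each other can still be strongly correlated when a hidden third variable drives both. The sketch below is purely illustrative (all data is randomly generated, not from any study discussed here):

```python
# Illustration of "correlation is not causation": x and y have no direct
# effect on each other, yet correlate strongly because a hidden
# confounder drives both. All data here is synthetic.
import random

random.seed(0)

n = 10_000
confounder = [random.gauss(0, 1) for _ in range(n)]
# x and y each depend on the confounder plus independent noise;
# neither influences the other in any way.
x = [c + random.gauss(0, 0.5) for c in confounder]
y = [c + random.gauss(0, 0.5) for c in confounder]

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    var_a = sum((u - ma) ** 2 for u in a)
    var_b = sum((v - mb) ** 2 for v in b)
    return cov / (var_a * var_b) ** 0.5

print(f"correlation(x, y) = {pearson(x, y):.2f}")  # strong, yet not causal
```

An observational diet study has exactly this structure: unmeasured variables (smoking, exercise, income) can manufacture a strong correlation between a food and a disease where no causal link exists.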


During all this nobody thought to question the type of diet Americans had at the turn of the twentieth century and before, and what negative health results it should have produced. While there doesn’t seem to be much research documenting that historical diet, it is highly unlikely that the American diet at that time was lower in fat or salt than the diet in the first third of the twentieth century. Butter, bacon, beef, biscuits, fatback and lard were all staples of most nineteenth-century Americans’ diets. The Southern diet as recently as the mid twentieth century was characterized as a heart attack waiting to happen. With that history, why did fat suddenly become the primary suspect in the cause of coronary disease?


What is now emerging is the very real possibility that Americans have borne the high cost of a misguided information campaign sponsored by our elected officials and their bureaucrats. This cost was not just in tax dollars wasted but in the negative impact on Americans’ health. And now our government is very concerned with our diet and the obesity problem that it is partly responsible for creating.


In attempting to defend the government position, the argument has been put forward that the crisis was real, existed long before it was recognized, and became apparent only after medical procedures for identifying heart attacks improved in the twentieth century. While death certificates may have missed actual causes of death in those early years, examinations of detailed autopsy records from specific large institutions have put that argument to rest. There was a very real and dramatic increase in coronary disease from the 1930s to the 1950s and beyond. The question remains: if the American diet had not changed significantly in the early twentieth century, what was causing the major increase in coronary disease?


Actually there have been a few papers offering a credible explanation, one published as early as 1962. It seems that tobacco smoking, while common, involved light consumption until the late nineteenth century. Around 1885 American companies began to manufacture and market machine-made cigarettes. Public consumption went up dramatically from a pre-machine rate of 40 cigarettes per year to 40 per month by 1900 and 80 per week by 1925. Allow a 20- to 30-year lag time for symptoms to occur and you have a credible model to explain the 1930s spike in heart problems and the continuing increases through the 1950s.
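Putting the quoted consumption figures on a common per-year basis makes the scale of the change, and the timing of the lag, concrete (a simple arithmetic sketch using only the numbers in the paragraph above):

```python
# Converting the quoted cigarette consumption figures to a common
# per-year basis. All inputs come from the paragraph above.
per_year_1885 = 40        # ~40 per year before machine manufacture
per_year_1900 = 40 * 12   # 40 per month by 1900
per_year_1925 = 80 * 52   # 80 per week by 1925

print(per_year_1900)                   # 480
print(per_year_1925)                   # 4160
print(per_year_1925 / per_year_1885)   # 104.0, roughly a hundredfold rise

# A 20- to 30-year lag from the 1900 surge lands in the 1920s-1930s,
# consistent with the coronary epidemic first reported in 1938.
```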


What should be taken away from all this? One thing would be to question the wisdom of our political leaders in areas beyond the basic needs our government was originally intended to provide. Many research results and scientific opinions are proven wrong over time.


Another is to question the ever-growing influence of government funds in scientific research. It seems that our political leaders have developed a preference for funding scientists whose research leans toward confirming popular, already held beliefs. A final takeaway is the possibility that our lives might be better, healthier and more productive if government stopped being so concerned about protecting us from ourselves.