Episode 8: The Paris Agreement on Climate

Peasants & Emperors is a podcast presenting topics related to democracy, science, culture, women’s issues, current events and critical thinking. A new podcast is produced and available for listening/download approximately every two weeks.

In episode eight, the Hooligans discuss some noteworthy things related to the Paris Agreement generally and talk specifically about how the agreement might impact Canada.

Episode 8: The Paris Agreement on Climate

Click on the hyperlink above to download and listen to the podcast. Feel free to leave a comment or question in the comments section below. One of the cast members will respond.

Thanks in advance for listening and check back regularly for updates to the site and podcast.

Notes & Clarifications
1). In the podcast Rick mentioned a website, http://www.topten.info, that provides consumers with information about how eco-friendly certain products are. Regrettably, this initiative's membership appears to be limited to European countries. Nonetheless, if you click here you can learn something about the initiative. Canadians interested in learning about eco-label initiatives in Canada can click here for more information.

2). The six island nations threatened by rising oceans if warming is not kept to the 1.5 degrees Celsius mark are: Kiribati, the Maldives, Fiji, Palau, the Federated States of Micronesia and Cape Verde.

3). For members of the audience interested in learning more about non-market approaches to tackling climate change please click here. This document was produced by the United Nations Framework Convention on Climate Change (UNFCCC).

4). For people interested in learning more about Forest Bonds (as it relates to being part of the non-market solution) click here.

5). Rick observed that we would need more than one Earth in order to support a Western lifestyle for every person on the planet. Below is an infographic taken from the Popular Science website illustrating how many Earths one would need to live a certain type of lifestyle. Additional information on this concept is available on the Global Footprint Network.

6). Rick could not recall the name of the legal body overseeing and enforcing the provisions of the North American Free Trade Agreement (NAFTA). The technical term for such a body is a Free Trade Tribunal.

7). In the podcast, it was observed that India and China would not qualify for financial support under the 100 billion dollar international fund established under the Paris Agreement. This actually is not the case: both India and China qualify to draw from this fund, easing the economic pressure to meet their growing energy needs through coal rather than renewables.


The Role of Perception in Science: Part 5 “Logical Conjecture”

Testing Assumptions around a Long-Established Model
In the 1930s, the Land Bridge Theory was the best available explanation for the settlement of North America. Although the model had merit, it could not yet be effectively tested. Models incapable of being either confirmed or falsified have more in common with speculation (or magical thinking) than with genuine science. Effective models enable scientists to make predictions; if a prediction is confirmed, it suggests the model is, at the very least, approximately true; and over time, each new piece of supporting evidence increases the trustworthiness of the model.

Concrete evidence in support of the Land Bridge Theory came in 1949 when chemist Willard Libby introduced the world to radiocarbon dating. Radiocarbon dating was the first reliable technique for determining the absolute age of carbon-based (organic) objects. After radiocarbon tests were conducted on archaeological sites throughout North America, it was established that the oldest sites of human settlement were on the West Coast (just as the model predicted).
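The arithmetic behind radiocarbon dating is simple enough to sketch: carbon-14 decays with a half-life of roughly 5,730 years, so the fraction of C-14 remaining in an organic sample gives its approximate age. Below is a minimal illustration (the function name and sample values are my own, not from the article):

```python
import math

C14_HALF_LIFE_YEARS = 5730  # approximate half-life of carbon-14

def radiocarbon_age(fraction_remaining):
    """Estimate a sample's age from the fraction of its original C-14 left."""
    if not 0 < fraction_remaining <= 1:
        raise ValueError("fraction must be in (0, 1]")
    # N(t) = N0 * (1/2)^(t / half_life)  =>  t = half_life * log2(N0 / N)
    return C14_HALF_LIFE_YEARS * math.log2(1 / fraction_remaining)

# A sample retaining half its original C-14 is one half-life old:
print(round(radiocarbon_age(0.5)))   # 5730
print(round(radiocarbon_age(0.25)))  # 11460
```

In practice, dates are calibrated against known fluctuations in atmospheric C-14, but the exponential-decay relationship above is the core of the technique.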

Scientists predicted the earliest settlements were located on the West Coast because the climate there was comparatively more hospitable than that of the interior. Predictions like these are developed in part through evidence gathering and logical conjecture, e.g. no people would settle, let alone survive, in too harsh a climate (especially when better options existed on the coast).

Another piece of evidence favoring a coastal settlement hypothesis was the fact that during the most recent ice age North America's interior was not only too cold to support people but also unreachable because of glaciers. By contrast, people living in coastal settlements could take advantage of not only the abundance of fish and shellfish but also a climate made more livable by the influence of warm ocean air. Scientists conjectured that the interior was gradually populated as the glaciers receded, beginning about 11,700 years ago.

If the Land Bridge Theory had a weakness, it was the assumption that the Western Hemisphere was settled by humans only once in the planet's entire history. If there was only one migration event then, logically speaking, Native Americans currently living in North America would be genetically similar to people living in Siberia (the assumed source of the initial migration to North America 13,000 years ago). In the 1970s, improvements in our understanding of genetics allowed us to finally test this prediction.

Test results revealed Siberians and North America’s indigenous peoples did in fact belong to the same haplogroup (and therefore shared a common ancestor in the not too distant past). The one migration hypothesis seemed to be getting stronger and stronger with every subsequent piece of evidence—logical, archaeological or genetic.

Logical Conjecture & Theory Construction
In his book The Grand Design, theoretical physicist Stephen Hawking observed that scientists prefer building theories that are simple and elegant over ones which are unnecessarily complex. At first glance the Land Bridge Theory seems to fit Hawking's requirements: it is simple, straightforward, and supported by multiple lines of evidence. So what could possibly be wrong with such a theory?

Well, a theory is ultimately only as strong as the evidence supporting it. Since scientists do not know everything, they frequently have to make (logical) assumptions; and if experts do not have all the information, those assumptions might lead them to make mistakes or miss important details. Despite this limitation, the scientific method is still a powerful tool of discovery: when new evidence emerges, we improve our theories by changing them to take the new information into account.

Some critics of science point out that since we keep making mistakes, and scientific knowledge is continually changing, science is inherently untrustworthy. People who hold such a view do not understand the scientific method very well; in many respects science works on the basis of trial and error, e.g. a scientist tries one approach, it doesn't work, so they try another. In the end, when scientists admit they've made mistakes, they use what they've learned to refine their theories, making them more accurate (which most sane people would admit is a good thing). But to refine a theory you need more and better information; you need to confirm assumptions.

Logical conjecture is used whenever experts need to reach a tentative conclusion. In other words, conjecture acts as a temporary stepping stone towards a better understanding of things. Logical conjecture is not quite the same thing as an educated guess, because when such conjecture is used no guessing is attempted at all. Rather, experts make use of the available evidence (you need a concrete starting point) to make predictions about what probably happened, what is probably happening, or what might happen.

The Power of Logical Conjecture: Hubble & Lemaître
Conjecture is a powerful tool. For example, while walking in an alley you notice a broken window in the apartment building to your right. You see shards of glass and a baseball on the ground. If you were just guessing at what broke the window, you might resort to a claim not supported by the available evidence, e.g. powerful sound waves broke the window. Though it isn't impossible that sound was responsible for the damage, the claim is not supported by the available evidence (the source of such a sound is absent). Also, invoking Ockham's Razor (the notion that the simplest explanation is the most likely), we can assume the cause of the broken glass is probably not so dramatic.

When using logical conjecture you do not guess: you begin with what evidence you have (a ball, shards of glass nearby) and move towards a more reasonable conclusion, e.g. the ball was thrown and probably broke the window. This explanation, compared to the sound hypothesis, is simpler and more likely. But is it the only possible explanation? No. Yet it appears to be the only one supported by the available evidence; and we use this information to construct a predictive model.

Logical conjecture is useful for helping us organize small pieces of evidence into a "bigger picture." Specifically, it allows us to complete a puzzle even though many pieces might be missing. For example, in 1924 astrophysicist Edwin Hubble (1889-1953) observed that some galaxies appeared "blue-ish" while others appeared "red-ish." Hubble was initially confused by what he observed; however, he knew that both light and sound travel as waves. In the case of sound, waves sound different depending upon whether the source is moving away from or towards an observer. This well-known phenomenon is called the Doppler Effect. Hubble extrapolated from his understanding of sound, reasoning that light exhibits a Doppler Effect of its own; that is, objects moving away from an observer cast red-shifted light while objects moving towards an observer cast blue-shifted light. Physicists call this confirmed phenomenon "red shift."
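The red shift idea can be put in numbers: the shift z is the fractional change in wavelength between what a galaxy emits and what we observe, and for modest speeds the recession velocity is roughly the speed of light times z. A small sketch (the wavelength figures are illustrative, not Hubble's actual data):

```python
C_KM_S = 299_792.458  # speed of light in km/s

def redshift(observed_nm, emitted_nm):
    """Fractional wavelength shift z; positive means receding (red-shifted),
    negative means approaching (blue-shifted)."""
    return (observed_nm - emitted_nm) / emitted_nm

def recession_velocity(z):
    """Non-relativistic approximation v = c * z (valid only for small z)."""
    return C_KM_S * z

# Hydrogen-alpha line emitted at 656.3 nm but observed at 662.9 nm:
z = redshift(662.9, 656.3)
print(f"z = {z:.4f}")  # 0.0101 -> positive, so the galaxy is receding
print(f"v is roughly {recession_velocity(z):.0f} km/s")
```

For large z, a fuller relativistic (and cosmological) treatment replaces the v = cz shortcut, but the sign convention above is all the argument in this section requires.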


In 1927, astronomer Georges Lemaître (1894-1966) built on the work of Hubble. He conjectured that since galaxies were moving away from one another, at some time in the distant past star systems must have been closer together, even occupying a single point in space (13.82 billion years ago according to the most current work on the topic). To put it another way: the fact that galaxies were moving away from us implied space was expanding (which likewise implied the prior existence of a central starting point or singularity).

The idea of an expanding universe is one most of us find reasonable or take for granted today. Yet, prior to Lemaître's work the vast majority of astrophysicists (including Albert Einstein) believed the universe was in fact static, immovable, set in place "as is." (Most physicists also figured the universe did not have a distinct beginning but rather simply "always was.")

Lemaître's work led to the eventual development of the Big Bang Theory. The Big Bang Theory is all fine and good, but what does it have to do with our discussion of logical conjecture? Well, we cannot actually go back into the past and see the Big Bang occurring. Therefore, we use the available evidence (red-shifted galaxies) to deduce that all matter at some point in the distant past occupied a single point, eventually exploding into the expanding universe we observe today. (Stephen Hawking's work with black holes in the 1960s actually established a theoretical framework for the universe not only having a definite beginning but emerging out of literally "nothing.") These conclusions were not guessed at; they were the products of a disciplined use of logic and evidence.

The Necessity of Gathering More Evidence
No method of reasoning or evidence gathering is perfect: a theory is only as good as the evidence acquired in support of it. If experts are missing critical pieces of information their models will not be accurate. Such theories might even lead you to make false assumptions about reality.

For example, let's return our attention to the alley (see image below). When we formed our first hypothesis (the ball broke the window), we weren't aware of several other vital pieces of information. When we examine the scene more closely we discover a cat hiding behind a garbage can, where we also find a broken pot and plant. This new evidence requires we revisit our original hypothesis: specifically, our assumption that the ball broke the window is no longer the only reasonable explanation for what happened. Based on this new information we would be justified in forming a second hypothesis, e.g. the cat knocked the plant through the window above. Which hypothesis is correct? Until both hypotheses are tested we don't actually know which model to use (both models make useful predictions). Ultimately, hypotheses are only useful starting points for an investigation.


The biggest problem with logical conjecture is that in order to use it we have to make additional, secondary assumptions. Our understanding of particle physics, for example, has improved over the past century of study; however, we do not know everything there is to know about this field. Yet we are still developing some pretty interesting technologies using the Standard Model of physics, e.g. GPS, space travel, satellites, etc. Nonetheless, some humility is in order: when we are too confident in the explanatory power of a theory we risk blindness (failing either to improve our existing understanding or to see how that understanding might be flawed).

The Problem with Inheriting Assumptions
Scientists, just like everyone else, inherit assumptions from their parent culture, and the parent culture shapes how they look at the world. In the previous section, I mentioned how most people simply take it for granted that the Big Bang is a fact. Why do the majority of people accept it as true? Based on the evidence? Perhaps. Though to be honest I'm not so sure. The theory itself is simple enough to grasp, but how many people are actually acquainted with the Big Bang's supporting evidence, like "red shift" or the "cosmic background radiation"? In my honest opinion, the theory has been around for so long, and is so well-established in the parent culture, that most of us just grow up believing (rather than understanding) that the Big Bang is an accurate description of the universe's origins. We are, after all, creatures with believing brains. With this in mind, let's return our attention to the Land Bridge Theory.

According to the earliest renditions of the Land Bridge Theory there was only ever one migration of people from Asia to North America. Only one. Even the evidence in our possession as recently as the 1990s seemed to support the one-migration view. But what are scholars saying about it in 2015? Can we still learn anything new about how North, Central and South America were populated? I guarantee we have not searched every inch of land. There must still be important discoveries to make. Therefore, it seems to me incredibly likely we'll discover archaeological sites older than 13,000 years of age (and, as coincidence would have it, we have).

Sometimes when we think we've answered a question once and for all, or when we stop asking meaningful questions, we grow too confident in our theories; and that confidence frequently gets us into trouble. When theories cease being accepted on the basis of evidence alone, and instead become something simply believed in, the affected scientist becomes less willing to conceive of (or, worse still, less capable of conceiving) alternate explanations. For example, if I believe there was only one migration event then evidence of a settlement dated to 20,000 BCE will either be missed or possibly ignored; expectation blinds us to the possibility that reality is different from what we were taught, and when we believe in theories too strongly, bias makes us incapable of seeing the forest for the trees.

Successfully Challenging the Land Bridge Theory
If you took a history or social studies class in North America over the past several decades, it is likely you’ve been taught one or both of the following ideas:

1). Before the arrival of Columbus in 1492 AD the “New World” was an untamed wilderness.
2). The sparse populations who lived on this virtually empty land were unremarkable civilizations.

Now, if I were taught (and believed) that these two ideas accurately described the settlement process of North America, the one-migration hypothesis would become more plausible. Nonetheless, practicing good science requires that we do not become indefinitely beholden to, or mere believers in, any one idea.

(For this very reason many physicists in England were reluctant to accept the validity of Einstein's General Theory of Relativity when it was first proposed in 1915. For three centuries Newtonian mechanics had sufficiently explained and given order to the universe. Thus, physicists ignored Einstein's challenge until Arthur Eddington confirmed by observation one of relativity's chief predictions in 1919, i.e. that light bends under gravity's influence. Newton was effectively unseated. That Einstein never received a Nobel Prize for relativity perhaps reflects in part the bitterness felt by some English members of the scientific community at seeing Isaac Newton's position diminished by a German-Jewish scientist. Click here for more on Eddington's experiment.)

We must be free, like Arthur Eddington, to question any authority—scientific or otherwise. We must go where the evidence takes us and it is wise to remain skeptical of even our best established scientific theories. The importance of skepticism cannot be over-stated: questioning established wisdom is the only way to grow in our knowledge or understanding of the world.

The Land Bridge Theory started coming under fire in 1997 when an archaeological dig in Southern Chile (see Chinchihuapi Creek at Monte Verde) revealed compelling evidence of human habitation much earlier than 13,000 years ago. Scientists initially struggled to reconcile this new information with the Land Bridge Theory as it was first proposed. According to the original theory this ancient Chilean settlement should not have even existed… Yet, exist it did. Scientists have subsequently conjectured that the early settlers of Chile must have made the journey southward before the ice-free corridor ever appeared (if it ever appeared at all; new evidence has even placed the existence of an ice-free corridor in doubt). Moreover, there appears to be a haplogroup among peoples living in the interior of South America that does not match up with Siberian peoples; therefore, there must have been at least two separate migrations.

In his book 1491, science writer Charles C. Mann observed that “[given] the near impossibility of surpassing the glaciers without the corridor, some archaeologists suggested the first Americans must have arrived twenty thousand years ago, when the ice pack was smaller. Or even earlier than that—the Chilean site had suggestive evidence of artifacts more than thirty thousand years old. Or perhaps the first Indians traveled by boat, and didn’t need the land bridge. Or maybe they arrived via Australia, passing the South Pole” (Mann, Page 19).

Although confidence in the Land Bridge Theory has been shaken by new evidence, the model itself has not been discarded. Despite the theory's apparent shortcomings it nonetheless still has explanatory power, e.g. we can still use it to describe the most recent human migration event to North America (which will form part of a bigger, more encompassing theory in the future). For this reason the theory is changing to accommodate new findings like the archaeological sites in Chile and elsewhere. Again, to quote Charles Mann: "Scientists do not necessarily agree on all of the details; some scientists have theorized that the Americas may have been hit with as many as five waves of settlement before Columbus, with the earliest occurring as much as fifty thousand years ago. In many versions, though, today's Indians are seen [comparatively speaking] as relative latecomers" (19).

SOURCE: Mann, Charles. 1491: New Revelations of the Americas Before Columbus. New York: Vintage Books, 2005.


The Role of Perception in Science: Part 4 “Concepts & Ideas”

Introduction: Deduction vs. Induction
In the history of science, spontaneous generation and abiogenesis have been used as synonyms describing the same phenomenon, i.e. an event by which animate life formed from inanimate matter. If I were an ancient Greek philosopher, like Aristotle or Anaximander, I would arguably feel right at home using either term to explain how life originated, e.g. Anaximander (610-546 BCE) hypothesized humans spontaneously formed out of mud exposed to sunlight. The idea humans emerged out of something like mud agreed with the worldview of the ancient Greeks because they believed everything was composed of a combination of earth, air, fire and water. The concept of spontaneous generation, or abiogenesis, was developed well before the advent of the modern sciences of chemistry, biology, physics or geology.


For all their cultural achievements the ancient Greeks conducted not a single experiment to confirm or falsify even a single assumption they made about physical reality (not a one). Instead, they trusted their ability to intuit or think up answers. (In philosophy, this type of reasoning is called a priori reasoning or deduction: through deduction we can come up with all sorts of conclusions about physical reality, like mice being the product of dirty shirts and wheat, without any need to confirm or disprove a hypothesis.)

Aristotle (384-322 BCE) deduced that larger objects were by their nature heavier and therefore fell to the ground faster than smaller objects. Aristotle's view was considered authoritative until an experiment falsifying this assumption was finally conducted in Pisa (some 1,900 years later, by students of Galileo). Once experimental science came into widespread use with the work of Galileo, Isaac Newton and Sir Francis Bacon, we reduced our dependency upon deduction, replacing it with an increasing use of inductive reasoning: through induction we "induce" or "compel" nature to relinquish her secrets by conducting experiments. That is, we aren't satisfied by simply deducing an answer to a complex question; rather, we look for real or material causes to explain why the world works as it does.

Abiogenesis as a Scientific Concept
Not unlike Aristotle's teaching on gravity, the ancient and pre-scientific concept of spontaneous generation persisted for centuries unchallenged; that is, until an alternative explanation for life's origins was proposed by Charles Darwin (1809-1882) in his book On the Origin of Species (published in 1859). Darwin's work presented a different explanation for how life might have developed on Earth: life began as something simple and, over the enormous span of geologic time, evolved into something more and more complex and various. The problem with evolutionary theory, at least as it existed in the 19th century, was that it still did not explain how life began in the first place; it explained how natural selection produced a variety of life forms but not the origin of life itself (attempts would come later). For this reason spontaneous generation continued to be regarded as a useful idea well into the 19th century.

As our understanding of the cell and life cycles improved in the latter 19th century, scientists placed increasing emphasis on looking for material causes for observed effects in biology, and in the study of disease in particular. In this context, Louis Pasteur (1822-1895) developed a germ theory of disease whereby disease was understood as the product not of the work of evil spirits but of microscopic, disease-causing pathogens. In other words, people (well, most of us) abandoned magical explanations, preferring to explain reality by appealing to observable mechanisms. Science posits that the existence of any organism, whether we are talking mice, bacteria or alligators, is explainable through an appeal to a series of natural processes interacting with one another in a chain of causality. No spontaneity required.

Historically speaking, the words abiogenesis and spontaneous generation have been used interchangeably, much as the words hot and warm are used to refer to essentially the same thing. Honestly, some ideas are so simple we can be careless with language without losing too much in the translation. This is the case with the terms spontaneous generation and abiogenesis if we are just speaking off the cuff and in general terms, as opposed to about specific concepts.

Where we get into trouble, however, is in mistaking the two words for the same concept. Spontaneous generation was coined by Aristotle, who attempted to explain the mysterious appearance of flies from rotting meat; abiogenesis, by contrast, is a modern term used in reference to a chain of causality leading to the appearance of life from inanimate matter. The French philosopher and logician Peter Abelard (1079-1142 AD) argued that for the sake of clarity we must make deliberate and good use of the appropriate words when communicating with one another (or else we cannot expect either to get at the truth or to transmit it accurately).

Abelard had four recommendations for those of us hoping to practice clear thinking (the third of which is most relevant to the topic at hand):

1). Use systematic doubt and question everything.
2). Learn the difference between statements of rational proof and those merely of persuasion.
3). Be precise in your use of words while expecting precision from others.
4). Watch for error (even in Holy Scriptures).

Mark Twain, the 19th-century American author, echoed Abelard when he observed that the difference between using the right word and the almost right word is the difference between "lightning" and "lightning bug." The way we use words matters and certainly shouldn't become a barrier to understanding.

Confusion in Secondary Classrooms
Despite the conceptual differences between the two terms, I've come across science texts used in secondary classrooms that continue to use abiogenesis and spontaneous generation interchangeably. In my own academic experience, I've never come across any serious evolutionary biologist who confuses the two.

In the case of the overall origins of life, the term abiogenesis is used in modern science to describe the eventual product of a lengthy, gradual process: elements like carbon, hydrogen, oxygen and nitrogen, mixing about in pools for hundreds of millions of years and exposed to a combination of radiation, heat, pressure and electricity, inevitably formed into simple organic compounds (through ordinary chemical reactions).

During the Miller-Urey experiment in 1953, the conditions hypothesized as necessary for the development of life (articulated above) were simulated. In this experiment, scientists were able to generate amino acids from a primordial solution (it should be noted that the experiment has been revisited over the years and continues to yield amino acids). Yet the only thing "spontaneous" about generating these acids is that one could not predict the exact moment the solution would produce an organic compound; we could, and did, predict that the primordial solution was capable of eventually producing organic compounds with the passage of time.

(There’s an interesting caveat here with respect to the Miller-Urey experiment. The scientists involved have acknowledged it is problematic to posit life emerged from their hypothesized primordial pools. Click here to visit a Scientific American blog discussing some of the associated problems with the experiment.)

Chemistry, biology and physics being what they are, it is inevitable that molecules and compounds are created through the sheer number of interactions between bits of matter. The products of these interactions, in this case amino acids, are properly understood as organic by definition, i.e. "of, relating to, or denoting compounds containing carbon." Amino acids, however, are not properly understood as "living." They are simply the building blocks of life. We are still left with the task of bridging the gap between the inanimate and the animate (or, to be a little more precise, bridging the gap from simple amino acids to RNA, from RNA to DNA, and from DNA to the development of complex organisms).

Lastly, although these compounds formed "suddenly," there was no massive or categorical leap from one thing (a log) turning into something completely different (an alligator).

Conclusion: Using Words Correctly Matters
The terms abiogenesis and spontaneous generation are not properly regarded in science as equivalents. I've read no serious modern biologist, e.g. Richard Dawkins, Stephen Jay Gould, or Ken Miller, who has ever confused these two terms with one another when popularizing scientific ideas. Again, yes, you could describe the moment something organic appeared by saying it was spontaneously generated, but the idea that dust, rather than eggs, generates dust mites is now considered an absurd and unsupportable notion. The differences between the two terms, despite some incidental denotative similarities, point to two fundamentally different conceptions of reality: spontaneous generation reflects a pre-scientific worldview without any need for proof, while abiogenesis reflects a modern scientific worldview whereby phenomena are understood to exist as part of a chain of causality (no magic involved) which is both discoverable and falsifiable.

Why would a high school text use these terms interchangeably? I cannot answer this question without first taking a look at the background of the text's authors and the context in which the book itself was written; it's not as though mistakes or errors cannot creep into a high school textbook. In my honest opinion, any unqualified and continuing association of the two terms in any text is both problematic and confusing. (Though, to be fair, if a student or teacher did not know the history of the two concepts, there'd be no reason to suppose a logical problem exists in treating them as equivalents.) In the end, textbook creators can and do make errors; moreover, not every scientist is necessarily an expert on the history of science.

The fundamental issue I have with the unquestioned interchangeability of these terms, therefore, is part literary-philosophical and part historical: philosophically speaking, if we treat these two ideas as identical concepts we commit a sort of categorical error, because they do not describe the same process; historically speaking, one concept continues to be used by serious scientists (abiogenesis) while the other is best reserved for conversations about a discredited notion relating to dirty clothes and mice.

Students and teachers need to appreciate that words, concepts and beliefs come from somewhere; that these words have a beginning; that they do not just appear as if out of nowhere (spontaneously generated); and that when we understand the history of a term's usage we begin to appreciate how our knowledge and conceptual understanding are the products of a process of unfolding.

The Role of Perception in Science: Part 3 “Model Dependent Realism”

The scientific method is the best tool of discernment we have. We don't have to believe in things when using science; we just have to observe and make sense of what we see. Nonetheless, we have to interpret the evidence we gather through the scientific process in order to construct models.

For example, we have experimentally confirmed the theory of relativity's prediction that travel into the future is possible, e.g. the faster a particle (or person) travels, the slower time flows for it relative to a particle (or person) at rest on Earth. The implications of relativity for time travel are interesting to say the least, e.g. a person traveling at close to the speed of light for two weeks would return to an Earth where months have passed for the rest of us.
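The effect just described follows from the Lorentz factor, gamma = 1/sqrt(1 - (v/c)^2): the time that passes on Earth is the traveler's elapsed time multiplied by gamma. A minimal sketch with illustrative numbers of my own choosing (note that a massive object can approach, but never reach, the speed of light):

```python
import math

def lorentz_factor(v_fraction_of_c):
    """Gamma = 1 / sqrt(1 - (v/c)^2), with speed given as a fraction of c."""
    if not 0 <= v_fraction_of_c < 1:
        raise ValueError("speed must be non-negative and below c")
    return 1.0 / math.sqrt(1.0 - v_fraction_of_c ** 2)

def days_on_earth(traveler_days, v_fraction_of_c):
    """Days that pass on Earth while the traveler experiences traveler_days."""
    return traveler_days * lorentz_factor(v_fraction_of_c)

print(round(lorentz_factor(0.6), 3))      # 1.25: at 60% of c, clocks run 25% slow
print(round(days_on_earth(14, 0.99), 1))  # two weeks at 99% of c: ~99 Earth days
```

Notice how sharply the effect grows near c: at 60% of light speed the dilation is modest, while at 99% two traveler-weeks already correspond to over three Earth-months.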

Many theoretical physicists believe there’s symmetry to the universe. (When I use the word belief here I am using it as a synonym for prediction.) For this reason some theorists argue that if we can travel into the future we should, at least in principle, be capable of traveling into the past.

Enter Kurt Gödel.

Gödel (1906-1978) was an Austrian mathematician best known for developing the “incompleteness theorems” (theorems describing the limits of what mathematics can and cannot prove). Gödel used a combination of relativity’s predictions and mathematics to develop a model of the universe in which the Cosmos rotated (preventing the universe from crushing in on itself) while allowing for time travel into the past.

According to Gödel’s model a traveler could return to a point in their own past by completing a circuit around a “cylinder” in spacetime (see diagram above). (Also, for more information see the September 2015 edition of Scientific American (Page 73).)

A closed timelike curve is any path through spacetime which loops back on itself. Before the relevant observations were made, Gödel pored over all sorts of data to see if his model reflected reality; as it turns out, observations made after Gödel developed his model have firmly established the Cosmos is expanding, not rotating. Despite Gödel’s error, the math underpinning his incorrect model of the universe was consistent with relativity (suggesting, at least at the level of mathematics, we could still learn something from the mistake: traveling back in time was theoretically possible).

The implication of Gödel’s work is obvious: scientists can construct models making valid predictions about reality even though the model itself might only be symbolic, or representational, of that reality. So despite the weaknesses in Gödel’s theory, there’s nothing about physics ruling out time travel to the past.

Reality is complex and it is rare indeed to be able to construct a model capable of fully explaining what’s going on “out there.” In fact scientists can create two models to describe the exact same phenomenon: light can be described as traveling both as a wave and as a discrete particle; gravity can be explained by appealing to the bending of spacetime by massive objects, to the action of conjectured gravitons, or to the physical property known as acceleration (see Einstein’s acceleration-gravity equivalence). There is nothing contradictory about describing the same phenomenon using more than one model; rather, this capacity speaks to the explanatory power of science; that is, we can use science to develop different ways to describe the same reality in terms we can understand (even though reality, at its most fundamental or quantum level, makes absolutely no sense to anyone).

The KT Controversy
To get a basic appreciation for the nuances related to model creation, let’s consider the relatively recent controversy over the Cretaceous-Tertiary (KT) extinction event, i.e. a model positing an asteroid impact was responsible for the disappearance of the dinosaurs 65 million years ago.

What is incontrovertible about the KT Event is that the fossil record firmly establishes the existence of dinosaurs and their relatively rapid disappearance at the time of the KT impact; however, the view perpetuated by well-meaning elementary and secondary teachers, that the asteroid was solely responsible for the extinction of the dinosaurs, might be overly simplistic.

Subsequent analysis of the rock and sediment associated with the time period of the extinction event suggests the possibility these creatures were already in steady decline before the impact. The iridium found in the sediment at the KT boundary (see Cretaceous-Tertiary boundary in diagram) may or may not be the product of an asteroid impact. Complicating things, iridium is also a product of volcanoes; as such, increased volcanic activity during the Cretaceous could account for both the iridium layer (and the attendant climate change) and the mass extinction event observed at and around the end of the Cretaceous (for more see this article from the University of California (Berkeley): http://www.ucmp.berkeley.edu/diapsids/extinctheory.html).

This KT controversy is not the product of conflicting imaginations or contradictory stories; it is born out of the genuine difficulty of constructing models from an interpretation of the available facts. Since science is not a fixed form of knowledge, it is constantly being improved and amended; moreover, we are not eyewitnesses to the primary event, forcing us to use a combination of evidence gathering and logical conjecture to put together a plausible picture. Scientists are detectives in the truest sense.

Suffice it to say that although the scientific method helps us make some sense of things, the fact remains multiple and relatively trustworthy models can be constructed to describe the same reality or phenomenon. I do not mean to imply science is therefore arbitrary, only that the rationalist would be wise to appreciate the scientific process produces plausibility, not certainty; and I would further caution rationalists against doing expressly what they accuse non-rationalists of doing: believing science explains things, or rules out possibilities, that it does not.

Charles Darwin: The How Versus the Why

In his autobiography, the naturalist Charles Darwin (1809-1882) intimates that science answers how physical processes unfold, not why they exist in the first place. Ultimately, it is an assumption that nature is rationally organized, as per the philosopher Immanuel Kant’s assertion that the language of nature is mathematics. Nature can be described using mathematics but it is not mathematics itself. In the great scheme of things, a model is our way of trying to make an irrational world—a reality shaped by both statistical determinism and randomness—rational.

In the 17th Century, Isaac Newton (1643-1727) explained the influence of magnets as the work of an invisible “soul.” Thinking of magnets in this way seems alien to us but it was natural to the people of Newton’s time, i.e. people believed some form of intentionality existed behind all things in nature. Unsurprisingly, most of us still perceive this intentionality in nature (contributing to our apparent inability at times to distinguish between correlation and causation). (For more on causation versus correlation see my article The Role of Perception in Science: Part 1 “Correlation versus Causation”.)

Although we no longer think of magnets in terms of souls, it is naïve to assume a scientist is immune either to the human penchant for storytelling or to the influence of their inherited worldview. Again, I do not mean to imply science is inherently untrustworthy. On the contrary, the Germ Theory of Disease would still be explained in roughly identical terms—pathogens cause disease—whether the author of the theory was a democratic-minded Frenchman or an authoritarian Russian.

Instead, the issue is whether the questions we ask lead us to answers reflective of what is actually happening in the world. In the end, and this is important to understand, we have no reason to assume the universe complies with our intellectual preferences for causal or rational order. Reality might just be, as the Enlightenment philosopher David Hume (1711-1776) asserted, a brute fact (no story or underlying purpose required).

Model Dependent Realism

Inasmuch as the scientific method helps us construct plausible models of what is or is not happening, rationalists still, to varying degrees, assume the scientific method is capable of describing the world with perfect objectivity.

How many high school (or university) science teachers teach the following fact? Even our best model is only ever provisional. In his book The Grand Design (2010), theoretical physicist Stephen Hawking (1942-2018) describes the implications of our provisional knowledge by appealing to a concept called “model dependent realism.” Model dependent realism, roughly stated, is the notion our models (or perceptions) are merely representational (or symbolic) of reality. The models are not reality itself. Any assertion to the contrary, that “reality is one” as Plato insisted, that there’s no such thing as an “uncaused cause” as Thomas Aquinas argued, or “the earth is at the center of all creation because Man is God’s greatest creation” as the early Church intimated, etc. are just stories and illusions.

Hawking argues different models can describe the same phenomena and be equally useful; that is, the criterion for a successful model is not whether or not it disqualifies or even falsifies competing models; rather, a successful model is one which makes valid predictions while being simpler than a competing model. To illustrate his point Hawking contrasts Ptolemy’s “geo-centric view” with the Copernican or “helio-centric view” of the solar system.

The geo-centric view presented a description of the solar system with the earth at the center and the planets (and Sun) orbiting us. The helio-centric view by contrast posited the Sun at the center being orbited by all the planets.

Hawking asserted Ptolemy’s model is just as reliable a descriptive tool as the heliocentric one when it comes to describing how the solar system functions, e.g. we could use Ptolemy’s model and still manage to send an astronaut to Mars or any other point in the solar system.

Hawking argued what ultimately set the heliocentric model apart from the geo-centric was that the former provided a “simpler, more elegant” explanation for planetary motion while the latter required Ptolemy to posit something into existence that did not in fact exist: epicycles (see diagram on the left). Viewed from the moving earth, planets, unlike stars, appeared to the ancients to zig-zag back and forth against the sky. Ptolemy (90-168 AD) accounted for this zig-zagging with the epicycle, a small local circle a “planet” (a word in ancient Greek literally meaning “wanderer”) traced while orbiting the earth.


The implications of gravity for epicycles and the geo-centric view notwithstanding, the strange implication of Hawking’s thinking is the following: it seems almost incidental that planets in fact orbit the Sun as opposed to zig-zagging in fictional epicycles around the earth. Both models are functional and can be used to make successful predictions about where planets will be and when they will be there; it just so happens that the Copernican view is preferable because it does not postulate things into existence (like epicycles) just for the sake of making theory match observation.

Subjectivity is part of any scientific investigation; even so, science possesses unparalleled predictive and explanatory power for describing reality. Some critics claim that since scientists continually modify models, change their thinking, or cast aside obsolete theories, science itself is inherently untrustworthy. Nothing could be further from the truth. We should save such criticisms for those systems or ideologies which do not countenance or tolerate change, or which do not stand up to testing and reasonable challenge. To change one’s mind in light of new evidence is wisdom, not folly; it is the mark of a thinker with a more plausible view of how the Cosmos actually operates.

Are You Smarter than Jonathan Bernier?

Reporters ask questions. That’s what they do. Sometimes they ask meaningful and tough questions; sometimes not so much. The answers people give are frequently insightful or, as the case may be, sometimes strange.

In December of 2014 Jonathan Bernier, a goalie playing for the Maple Leafs, was asked his opinion on the importance of Nelson Mandela:

Reporter: Just wanted to ask you, I mean, obviously Nelson Mandela [was] one of the most significant historical figures of the twentieth century. What knowledge or awareness did you have of him growing up or when did you learn of him?

Bernier: Well, obviously, growing up, you uh, he’s one of the most known athletes, uh, in the world and, uh, lot of impact in any kind of sport that he did, and uh…that I, you know, even playing hockey, everyone knows him from being the type of person he was off the ice and on the ice and so, it’s unfortunate that he passed away a year ago, but, um, he changed a lot while he was with us and he’s a tremendous guy.

Here’s a link to the interview: https://www.youtube.com/watch?v=JrAWrsb13bU


After watching the interview I felt sorry for Bernier…and then I started feeling sorry for society. We are surrounded by a lot of misinformed and ignorant people. In this case, the reporter exposed one of Bernier’s blind spots (and to be fair we all have them). Some blind spots are harmless but others are problematic, i.e. in order to preserve democracy citizens need to be informed and educated.

We’ve all seen it on television: reporters asking people random questions on the street testing the public’s knowledge about particular topics. In 2008 I recall watching television interviews of reporters asking people their opinion about President Obama. Obama was accused by critics of being a fascist and by others a communist. People were asked to explain what fascism was and nobody could. Likewise people were asked to explain what communism was. Again, nobody could provide a cogent answer.

The answer from one person was telling: “I don’t know what fascism is I just know it’s bad and it’s what Obama is.” Such is the power of the meme: for this reason I wrote this article. I wanted to provide the public with information related to the context, principles and nuances of communism and fascism. If for no other reason, people can read this article, learn a few technical terms, and then cast more philosophically accurate aspersions at politicians.

What is Fascism?
Fascism emerged formally as an ideology in the early 20th century, but its roots lie in the response of conservative thinkers to the perceived excesses of the Enlightenment. The Enlightenment, also known as the Age of Reason (1650-1800), placed all traditional understanding of life, morality and the world into question; to that end thinkers like France’s Voltaire, England’s John Locke, and Scotland’s David Hume all encouraged people to possess a questioning attitude (especially towards power). The result was both an intellectual revolution and two political revolutions—the American (1776) and the French (1789)—sweeping away traditional authority.

Nineteenth-century conservative thinkers like France’s Joseph de Maistre responded to the Enlightenment in two ways. First, de Maistre used his significant skill as a writer to oppose democracy, because representative systems of government gave power to people who neither deserved it nor understood how to wield it; the only people worthy of ruling, he argued, were an aristocratic and educated elite. Second, de Maistre was a proto-fascist: like all fascists from Gabriele d’Annunzio to Benito Mussolini, or from Houston Stewart Chamberlain to Adolf Hitler, he held an idealized view of the past: the present was chaotic and disorderly while the past was a simple and pure time.

Taken at face value simplicity and purity appear to be virtues; however, in their 20th-century expression simplicity took the form of anti-intellectualism (think Nuremberg rallies and book burnings) and purity took the form of master race theory taught in German classrooms, giving birth to a generation of mass murderers (think Holocaust). Lastly, fascists did not embrace skepticism or the questioning attitude promoted by the vast majority of the Enlightenment’s philosophers; on the contrary, proto-fascists and fascists alike hated the disorder resulting from a society unwilling to give unqualified support to its elites, and likewise despised the intellectual and cultural disorder unleashed by the Age of Reason.

Fascism & Mass Movements
Fascism is always tied to a mass movement. In 1933 Germany came to be ruled by the National Socialists (Nazis). The Nazi movement stressed the glorification of the state and the importance of racial purity. For this reason the Nazis sought to control every aspect of German society—education, politics, the arts, the media and, of course, the economy. Reinforcing the Nazis’ hold over the minds of Germans was the pseudo-science of ethnology. Ethnologists borrowed H. S. Chamberlain’s term “Aryan” to refer to the German people—a mythologized group of apparently genetically superior people originally from Northern India who eventually migrated and settled, as coincidence would have it, in Central Europe (aka Germany).

Not all fascists are racists like the Nazis. For example, Italian fascists in the 1920s were more or less ultra-nationalists wanting to establish law and order in Italy. Italy became a country in 1861 and from its inception suffered from constant political division. There were so many political parties—some representing a Catholic bloc, some socialist, others fascist, and still others a liberal worldview—that no party could establish a majority in the Parliament of Italy.  Italy actually continues to suffer from a similar problem with plurality in the present day. Again, Italian fascists, far from being racists or promoters of silly master race theories, simply wanted to rein in the divisive forces unleashed by democracy in order to make Italy strong again.

Fascism & Freedom
Although fascists are not democratic they typically allow people a small measure of freedom; that is, a citizen is free so long as freedom does not interfere with the greater aims of the state. For example, in Germany Aryans could marry anyone who was an Aryan but they were forbidden to marry a non-Aryan (to preserve racial purity).

(By the way, “racial purity” is not a scientific concept whatsoever. There is no such thing as “pure genes” either; moreover, race doesn’t exist objectively; rather, it is a historically conditioned and created idea. In the case of genes: there simply exists genetic variation, suggesting no particular group of people is better, superior, weaker or inferior than another; every person belongs to a single human gene pool. Some scientists in modern biology even argue there’s no separate human genome but a universal genome to which all life on earth belongs. This is a minority view but it is gaining adherents as our understanding of genetics improves over time. I digress.)

In the context of the 21st century, some political theorists like Sheldon Wolin argue the control corporations exercise over the political decision making process in the United States is proof positive of America being a fascist state (an expression of so-called “inverted totalitarianism”). Although counter-intuitive at a certain level, Wolin argues freedom can co-exist with fascism provided citizens remain incapable of effectively challenging the power of either the government or the corporations. For this reason fascism can be an insidious thing; you can evolve into it. Communism, by contrast, is historically born out of something more obvious: violent revolution.

Fascism & the Individual
In the case of fascist Italy or Germany, individuals do not own themselves but exist as indistinguishable from the state. In this sense, fascism owes a lot of its inspiration to the Catholic Church: the Church likewise treated the faithful as a “corporate whole” who belonged to one body and who were required to unquestioningly accept Rome’s authority. That is not to say the Catholic Church was fascist; however, fascists, like Church authorities, desired to recreate an orderly past, a time before democracy (or a time before The Fall)—a period when the people were one and did not challenge either the Church or the king (or, in the case of the Church, God).

In reality no such period ever existed. Kings like Henry II (1133-1189 AD) of England successfully challenged Rome by pressuring the Vatican to appoint the bishops he wanted; moreover, Peter Abelard (1079-1142 AD), a French thinker, wrote Sic et Non (literally “Yes and No”), effectively placing into question the wisdom of unquestioningly accepting a number of the Church’s best-established doctrines. Yet fascists do not fret over whether their view of the past is accurate; they are worried about creating a view of the past that supports their narrative in the present and justifies future action. For this reason businessman turned politician Donald Trump appeals to many people (forming the basis of a mass movement in the United States). Trump argues America “doesn’t win any more,” implying America always won (a quick study of the War of 1812 should disabuse Americans of this view); and he promises that if he’s made president he’ll solve all the country’s problems which, interestingly, have a lot to do with America’s longstanding racial issues.

Fascism & Authoritarianism
Fascist governments are authoritarian by nature: decisions are not made through consensus building or parliamentary debates. Rather, fascist states dictate to the people what can or cannot be done. The assumption is the average person doesn’t know what is good for them. Therefore, people need a benevolent (infallible) leader to make important decisions for them. Fascists by and large hate the disorder perceived to be inherent in democracies. In particular, the Nazis rejected any notions of entertaining pluralism (that there’s more than one valid way of looking at the world)—a situation where minorities maintain their unique identity independent of the majority. To the Nazis all of the people must agree both in their thinking and in their genes for order to be preserved.

Fascism & the Internal Threat
Another feature is common to most fascist states, from Germany and Italy to “Communist” China and even the United States: the conviction that the greatest threat to society comes from within the country. Thus, fascist states seek to create and promote national unity at all costs. In Germany, this meant purging “undesirable” or “racially questionable” elements from the body politic. In Italy, fascists sought to bring order to a country paralyzed by political disorder by sweeping away democracy altogether and establishing a dictatorship. In China it means rejecting any notion of electing anyone outside of the Communist Party to a position of meaningful influence; and in the United States, a softer, subtler form of fascism has led to the creation of identity cards for Hispanics in the state of Arizona, a concerted effort to deny gay people legal equality, and the passage of the most intrusive piece of legislation passed by any government—democratic or otherwise—in history: the Patriot Act, an act designed to give the government unfettered access to the digital communications of Americans, and one which suspends habeas corpus among other rights.


What is Communism?
During the Industrial Revolution (1750-1900 AD), a new industrial society emerged generating enormous wealth. Wealth, unsurprisingly, was not distributed equitably or enjoyed equally. Those who owned businesses profited significantly while laborers received low wages, lived in squalor and eked out a miserable existence in growing, overcrowded cities.

Karl Marx, the father of communist theory, developed the political philosophy of communism in response to the social, economic and political problems he believed were created by the Industrial Revolution. Almost without exception, communists are motivated by a desire to end the exploitation of workers by the business class. In order to end the exploitation, every citizen must be made economically (and therefore politically) equal by eliminating social class. According to Marx, history is best understood as a conflict between classes (those who have versus those who have not). In the Communist Manifesto Marx describes a struggle between the Proletariat (workers) and the Bourgeoisie (business class). Marx sought to end this perpetual conflict by establishing a “classless society” based upon the common ownership of the “means of production.”

In a communist state, the government abolishes the ownership of private property. Marx believed if no single person possessed more property than another—but shared everything in common—then no one would have the power to exploit others. Marx considered capitalism a great social evil: the business class exploited the fact workers needed money in order to live; and since thousands of workers competed for the same jobs the employer could pay workers whatever they wanted (contributing to exploitation).

Compounding the problem was the fact workers, not employers, were the ones who did all the actual work while employers reaped all of the benefits. With that said, Marx did not adequately take into account the relative risk run by entrepreneurs when setting up a business, i.e. if a business failed the investor lost everything. Yet, in the aggregate, there were more workers than there were capitalists; and, as such, the risk assumed by an individual entrepreneur, or that entrepreneur’s right to make a profit from his business, did not outweigh the injustice of that same entrepreneur exploiting vulnerable workers.

Marx believed communism was a genuine political alternative to representative democracy. According to Marx there were at least two problems with representative institutions. The first was they pitted the different social classes against one another (as exemplified by the situation of both Italy’s Parliament in the 1910s-20s and the Congress of the United States in the 2000s). According to Marx, competing demands meant the needs of the people as a whole were never met. Instead, elites would continue to exploit the resultant division to maintain the status quo.

The second problem was the fact that in democracies, monarchies, etc. the wealthy elite, though small in number, exercised a disproportionate amount of influence over the decision making process. In a communist society (at least in theory), decision-making is supposed to be genuinely democratic; that is, there is no elite (there isn’t even a central government or legislature). Instead, decisions are made at the local or regional level through workers councils (called soviets in the Russian context); and membership on these councils wasn’t based on education or class but was rotational—everyone was given a turn to shape the decision making process. A good idea…until it’s Jonathan Bernier’s turn to chair meetings. I digress, again.

Also, Marx had little confidence in the capacity of democracies like Britain or the United States to truly represent the interests of the people as a whole. The elites of these two countries used their economic (and therefore political) influence to push for the invasion of other countries, etc. in pursuit of profit for their social class. For example, Great Britain required soldiers (“wage earners”) from Canada to fight the Boers in 1899 for control of gold mines discovered in the Transvaal State; and in 2003, the United States invaded Iraq to guarantee its strategic access to Iraqi oil.

In both cases, the 99% fought, bled and died for the 1%. Communist states, in principle, do not go to war with one another over profit. On the contrary, they make common cause with one another because a worker in Germany is a brother to a worker in France, Britain, Canada or the United States. For this reason Karl Marx famously exclaimed, “Workers of the world unite!  You have nothing to lose but your chains!” With that said, the Soviet Union invaded all sorts of countries, e.g. Afghanistan, Hungary, Poland, etc. etc.

So much for principles…


In the second part of the article, I provide a technical and detailed contrast of fascism with communism.

1. Ultra-Nationalism
Communists are “international” in their outlook while fascists are “nationalist,” e.g. Nazi Germany was aggressive to neighboring states like Poland because the Germans believed themselves a “master nation” entitled to rule over so-called “slave states.”

Communist states, like the Soviet Union (Russia), were in principle governed by the working class (a class transcending borders, language or ethnicity). Communists were “international” in their outlook because regardless of the country where the working class was located they all shared the same experience of being exploited and the need to overthrow the capitalists. In short, workers from every country shared a common cause: defeat the Bourgeoisie.

Fascists aim at exclusion not inclusion, e.g. German fascists excluded those who did not belong to the “master race.” Italian fascists excluded non-Italians because Mussolini was attempting to reinvent the Roman Empire. Simply stated fascists tend to look inwards to protect their local community from corrupting influences. However, communists from country to country to country are joined by a common sentiment: the enemy are not workers from other countries but capitalist exploiters outside of the communist’s borders. In other words, communists look outside to find their enemies.

2. Authoritarianism
Both fascism and communism are authoritarian, e.g. decision-making power is not shared but centralized around a dictator; that is, Hitler ruling Germany and Stalin ruling the Soviet Union. People not belonging to either the Nazi or Communist governing parties had no decision-making power. However, members of these two ruling parties received benefits/privileges non-members did not. In the communist context, class was supposed to be abolished; yet, members of the Communist Party definitely enjoyed material benefits the “average Vlad” did not.

Fascists want to build a strong state. In Germany, this was accomplished by ruling the people through the Führerprinzip or “leadership principle.” The Führer’s word was regarded “above all written law.” Basically, whatever Hitler said had to be obeyed. Therefore, all government policies, decisions, ministries, etc. operated in complete and absolute obedience to Hitler.

By contrast authority in a communist society is supposed to be decentralized (democratic). In theory, authority was exercised by a collection of workers councils from all around the country. In practice, however, authority in both the Soviet Union and Communist China was exercised in basically the same way it was in Germany: decisions were centralized, made by a dictator, and enforced by a ruling party through a combination of the military and secret police. The average citizen had no say in decisions directly affecting them.

3. Economics
Economically speaking, fascist states allow citizens freedoms unavailable to those living under communism. For instance, in Nazi Germany you could own your own home, factory or land. In the Soviet Union, the government abolished the practice of private ownership of property.  The government owned everything. Taken at face value fascism would appear to offer opportunities unavailable to communists; however, upon closer inspection the appearance of freedom in Germany is deceiving, e.g. while you could own a factory and even keep profits for yourself the Nazi Party told you what to produce, how much to produce, when to produce, how much to pay workers, etc. In the Soviet Union, the government did the exact same thing—telling people what to produce, etc.—but all profits went directly to the Communist Party before being distributed, in principle, to the people.

4. The State
Fascist states typically glorify war; therefore, a strong government is considered necessary to produce a strong, disciplined people. There is no room for catering to a plurality of public needs in a fascist society. In principle, authoritarianism in a communist system is considered a necessary but short-term evil, e.g. while fascists make a point of permanently establishing a strong state, communists by comparison actually want the central government to eventually “wither away” and society ultimately to be managed by a series of workers councils.

The problem with communism, or any ideology for that matter, is that it reflects an idealized version of reality. Marx's claim that the state would eventually disappear spontaneously out of the goodness of the people is not only without historical precedent, it also ignores the realities of human nature: human beings are selfish, egocentric, ambitious, etc. Anyone in a position of power who is not limited by law or police will not allow the state to disappear; instead, they will do whatever they can to continue benefiting from their position of privilege. In the Soviet Union, for example, communists overthrew the tsar (1917) to establish the ideal workers' society Marx envisioned; yet the Communist Party, far from establishing a classless society, simply became the new ruling class. That is, the exploitation of the proletariat by the capitalists ended and a new, ironic, exploitative relationship emerged: the Communist Party's members exploited the Russian worker.

5. Ideological Issues
Both fascism and communism are ideologies. An ideology is a philosophical and systematic way of looking at the world. As worldviews go, neither fascism nor communism reflects how the world actually works, how people think, or how they behave.

This is a problem.

Specifically, communists and fascists attempt to build new societies based on how the world ought to be or how people ought to behave instead of on how the world actually works. For instance, communists assume people are capable of either cooperation or competition but never both. In reality, people are capable of both when reason, rather than fear and intimidation, forms the basis of consensus building. For his part, Marx did not anticipate that elites in Canada, Britain or the United States would voluntarily share their wealth or weaken their control by allowing the passage of legislation protecting workers, giving people universal healthcare, or providing workers with unemployment insurance. Marx did not anticipate this because his ideology blinded him to the possibility of a change in attitudes over time. The reality is that people are complex and capable of cooperation, competition, and everything in between (see the work of Peter Kropotkin for more on the cooperation-competition dichotomy). Ideologies which fail to take complexity into account—and this includes the world's most dominant ideology, free market fundamentalism—are doomed to fail because they encourage an incomplete understanding of the world and humankind.

Ideologies or systems which take an accurate view of human nature's complexity into account are longer lasting, e.g. the Westminster system of parliamentary democracy has existed uninterrupted since 1689. Democracies, unlike fascist or communist states, encourage citizens to practice toleration, compromise, consensus building and pluralism. Citizens are also involved in decision making, which means the public views decisions coming from government as legitimate; moreover, democratic governments do not try to stamp out differences of opinion or force people to agree with one another. Instead, the role of the government is simply to preserve the peace; that is, democratic governments try to constructively manage disagreement, making sure the parties involved respect the peace and the rule of law. By contrast, communists and fascists make extensive use of secret police, concentration camps, intimidation, murder, and terror to force their citizens to "behave" and "agree" with official party policies and principles.

6. Anti-Democratic Sentiments
Fascists attack other organizations or groups attempting to influence the country's affairs. Fascists despise unions because the labor movement influences a country's economic and political life without being subordinate (obedient) to the aims of the state; this means labor unions can, in principle, provide a healthy counter-balance to governments, which tend over time to become dominated by the political right (especially if the right is left unchecked). In Nazi Germany, the individual was simply expected to obey; disagreement, disobedience, freedom, etc. were considered contrary to the principle of building the perfect society. Thus, fascists sacrifice liberalism and freedom in order to achieve goals like "racial purity" or the "conquest" of weak neighboring countries.

In principle, communist countries are supposed to be democratic. Yet in practice they tend to be anti-democratic: you do not build the perfect society through parliamentary debates; you build it through terror, fear and intimidation. The problem confronting any authoritarian-minded political party is that most people, given the choice, genuinely prefer living in freedom to living under a dictatorship. Therefore, in a fascist society it is believed necessary to rule the people by decree and to remove (or kill) those elements of society who oppose its destiny. In the context of communism, Karl Marx assures us that once the enemies of the movement are destroyed the government mystically "withers away."

7. Racism & Xenophobia
Fascists believe their country is in crisis or "rotting from within" because of the presence of "impure" people/groups. To address this crisis, fascists attack enemies of the country's "true identity" (usually members of an ethnic, racial or religious minority). The current rash of xenophobia in Canada towards Muslims (2015), Syrians in particular, demonstrates the sobering prospect that fascism (however latent) exists in many of us and is never far from the surface. Fascists also consider any citizen who speaks out against attacks on minorities to be an enemy. For this reason, people who support the humanitarian aim of bringing Syrian refugees to Canada are wrongly labelled as either sympathetic or active "supporters of terrorism." (People are just so intellectually lazy sometimes.) Fascists feel a sense of urgency when it comes to purifying society; it cannot be achieved fast enough.

Communists likewise desire purity; yet the purity they are after isn't racial. Instead, they want citizens to be ideologically pure. In other words, the Communist Party in Russia wanted the "thinking" of its citizens to line up perfectly with the teachings of communist theorists like Marx or Vladimir Lenin. Perhaps the greatest expression of this pathological desire for ideological purity came when the Soviet Union liberated its soldiers from prisoner of war camps in Germany during World War II. Instead of being freed and rejoining their comrades in the Red Army, these former prisoners were sent to new prisons in Russia; they could not be trusted because they had seen what life was like outside the Soviet Union and might spread heretical ideas in the USSR. Communists, like fascists, do not tolerate criticism of the government. Individuals do not have the right to freedom of speech or even freedom of conscience. You are expected to give yourself over entirely to the communist cause (a cause which requires great sacrifice now with promises to pay off at some indeterminate time in the future—a future that never seems to arrive, by the way).

Fascists view minorities as the enemy. Communists, on the other hand, insist "big business" and capitalism are the real enemy. Fascists deal with minorities in one of the following ways: they expel them; if this doesn't work, they imprison them (the current "for profit" prison system in America conveniently deals with the disenfranchised African American population); and if imprisonment doesn't work, they kill them. Communists, Marx in particular, argued the extermination of the bourgeoisie might be necessary; however, communists do not have to kill the bourgeoisie in order to establish communism. Instead, the wealthy as a distinct class can be destroyed simply by taking their businesses from them and abolishing private property. In short, communists seek to turn the wealthy elite (and middle class) into "wage earners" like everyone else.

In principle, fascism focuses on what a person intrinsically is or is not, while communism focuses its attention simply on a person's social role. Both communism and fascism have some serious drawbacks (ya think?); but at least it can be said of communism that one doesn't have to resort to mass murder to "purify" society (although Josef Stalin did frequently resort to this, e.g. the Holodomor (1932-1933), the Great Purge (1936-1938) and the Katyn Massacre (1940)). Again, in principle (so many principles), you can change a person's social role without having to first destroy the person. The difference between fascism and communism is fundamental: communism seeks to develop everyone while fascism excludes or eliminates all who don't belong to the "ideal."

Well, that was quite a mouthful. I wonder at what point a reporter asking me a question like "what is fascism?" would walk away. So the next time a reporter asks you a question about fascism, democracy or communism, maybe you'll sound a little smarter than a goalie, or probably anyone playing for (or supporting) the Maple Leafs.


Episode 7.5: Chris Hedges and the “Empire of Illusion”

Peasants & Emperors is a podcast presenting topics related to democracy, science, culture, women’s issues, current events and critical thinking. A new podcast is produced and available for listening/download approximately every two weeks.

In this episode, Rick introduces the audience to a book written by Chris Hedges called Empire of Illusion. Originally published in 2009, Hedges's book provides readers with insights into the political, cultural and economic decay of the United States. Canadians should take note because, to borrow a French aphorism, when America gets a sniffle Canada catches a cold.

Episode 7.5: Chris Hedges and the Empire of Illusion

Click on the hyperlink above to download and listen to the podcast. Feel free to leave a comment or question in the comments section below. One of the cast members will respond.

Thanks in advance for listening and check back regularly for updates to the site and podcast.

Notes & Clarifications
1). In the show, Rick makes reference to an interview Chris Hedges gave with CBC's The Current. Here is the link to the page hosting the interview. The actual link to open the interview is located at left, just under Hedges's picture, by the heading "Listen." Feel free to leave a note on what you thought of either this podcast or Hedges's interview.

2). Click here to open the http://www.amazon.ca page to purchase Hedges's book Empire of Illusion. The other book of his I would recommend reading is The Death of the Liberal Class.