(One simple rule)
Science is not guys in lab coats. Science is not beakers and test tubes, or fancy expensive equipment that requires a degree to operate.
Science isn't something funded by corporations or the government or universities.
Science isn't even chemistry or physics or biology. It’s (unfortunately) not something taught in school.
In its absolute simplest form, science can be boiled down to one straightforward rule:
Check that you're right.
Part "I can't believe I even need to cover this, but lets get it out of
the way":
(Insanity does not equal enlightenment)
I spoke to a number of people about this essay as I was still working on it.
A recurring theme came up.
So I realized I have to get this out of the way before I can even begin...
There was a time when not being able to distinguish between one's own imagination and the outside world around you was considered mental illness. It was generally referred to as schizophrenia.
(note: embedded links, such as this one, are just for reference. Links which are spelled out on separate lines are actually intended to be clicked and read before going on with this essay. I don't summarize them, and assume you read them in what follows)
Apparently an awful lot of us have come to believe that schizophrenia represents the highest level of consciousness:
"What ever you believe is true, is true."
"All that matters is what is true for you."
"Everything is relative"
"Everyone's reality is different"
At the very least, this represents an extreme form of narcissism; to believe that the entire universe literally revolves around oneself, to the point of thinking what is "real" depends on what you believe.
If it were really true, it could only be true for one person.
Since you can feel and think, you (you, reading this right now) know you exist.
The world can't be in someone else's mind.
If it were true that perception defines reality, this would mean that everything else would cease to exist if you died.
Hopefully you don't really believe that, do you?
If all of us are really here together, then there must be some things that are real outside of our individual minds, some world that is not dependent on us thinking about it in order to be there.
Of course it is technically true that there is no way to ever be 100% sure that all of this isn't your dream or hallucination.
But if we go with that, that's the end of the conversation right there.
There is nothing else to talk about. Since literally anything is technically possible, any thought whatsoever gets cut off by the "how do you know this isn't all a fantasy?" argument, and it isn't terribly useful.
Here in the real world, every time you get out of bed, go to work or school or the grocery store, duck out of the way of something thrown at you or dodge a car coming at you, every time you do literally anything at all, you are making the assumption that there is a real world out there outside of your own mind.
If someone really believed that perception alone defines reality, they would simply curl up into a ball and choose to feel unending bliss for the rest of eternity.
The fact that you don't says you already acknowledge there is something outside of your mind which can have meaning.
Within the assumption that there is a reality outside our minds, science says we can build an understanding of that reality. In fact, even if this were a dream or a fantasy, within that dream there is consistency, and we are no less able to figure out how the world (or dream) we live in works.
For the sake of practicality and simplicity, let’s make the assumption that the world does actually exist.
There is actually a world outside of our knowledge of it, and it remains exactly the same whether we know about it or not.
It’s all very fun to think about subjectivity, questions like "if a tree falls in a forest and no one hears it, does it make a sound?"
But if a person believes that gravity pulls objects away from the earth, and consequently lets go of a bowling ball expecting it to float away like a balloon, his belief will not prevent the falling ball from breaking his foot. What he believes does not change reality. Reality is the same whether we know what is true or not. In the end, it's individual belief that doesn't matter at all.
The answer to the tree falling in the forest question is: the disturbance of the falling tree causes vibrations in the air which propagate as waves identical to those that we call "sound". Whether you choose to classify it as sound without it being heard by something conscious is merely
semantic, and in the end doesn't change the reality of the situation one bit.
For the sake of truth and understanding, let’s drop our overblown egos.
Nobody cares what is "true for you"... and nobody should.
Part 2
(This is not merely an academic or philosophical issue)
Here in the real world, actions have consequences.
People can feel joy and pain. All other things being equal, most of us can agree that causing more joy or less suffering in the world is generally a good thing. Most people go through life attempting to make things better; if not for the world at large, at least for those around them (and if not that, at the very least for themselves). In order to turn a plan into effective action, it is necessary to have accurate information.
Our schizophrenic friend does not want a broken foot. The fact that he believes something false about the world leads to an action that ultimately brings slightly more unpleasantness into the world.
When a parent forgoes legitimate medical treatment for their child in favor of faith healing, homeopathy, or acupuncture; when legislators pass laws based on a single memorable (but very rare) case which has broad unintended impacts (3-strikes); when school districts mandate that religion be taught side-by-side with (or even instead of) science; when the public opposes nationalized health care due to believing it will make health care more expensive and opposes the estate tax believing it hurts the middle class; when a jury sentences someone to prison when there was no physical evidence...
All of these things are issues which (complex though they may be) could have been decided very differently, if truth were valued over individual belief. These are all things which end up affecting real people in negative ways, (and not just the people making the decisions).
The idea that the truth is what you make it and that perception defines reality is false, and this is important because it leads to people making bad decisions which end up being destructive.
In the mid-90s, under republican governor Pete Wilson, California began deregulating its electricity industry.
The new rules called for public utilities to sell off a significant part of their electricity generation to wholly private, unregulated companies (such as Enron) which then became the wholesalers from which public utilities needed to buy the electricity which they used to own themselves.
"Wilson admitted publicly that defects in the deregulation system would need fixing by "the next governor"".[wikipedia]
In 1999, before the last of the rules put into place under his predecessor had gone into effect, democratic governor Gray Davis took office. For years energy trading companies - who did not actually produce any energy themselves - had been working behind the scenes to fill the void of government regulation by manipulating markets to their own advantage. As the final stages of
deregulation came into effect, they went too far, deliberately shutting down plants in order to reduce supply and raise prices, and causing major blackouts throughout the state.
Davis inherited this situation and was largely powerless to do anything about it in real time.
Instead of blaming the republican governor who had pushed for deregulation in the first place, or the state senate that had approved it, or the companies which were (illegally) manipulating the energy supply for personal gain, the voters of California went after the most visible target, the person who happened to be in charge right at the moment: Governor Davis.
Primarily as a result of the energy crisis, a recall election was held, and in his place the people elected another republican governor, another fan of free-market economics.
This, to me, is a failure of the American educational system.
It is a failure of people to apply basic science to real life, and to check that beliefs are right, before acting on them.
(Basically, just making stuff up)
The only alternatives to science are intuition, "common sense", and faith.
Intuition is really deep parts of the brain that we don't have conscious access to collecting data, weighing evidence, and making deductions, somewhat similar to how we consciously make decisions. There are in fact occasionally times when this process gets it right even when the conscious mind gets tripped up. Intuition might be called subconscious reasoning. It is subject to the same cognitive errors the human mind naturally makes.
Conscious thought is also subject to those same errors, but the major difference
is that it is possible to be aware of and correct for them when it’s conscious. The subconscious mind will continue to make the same mistakes, even if a person knows better. Prejudices are an excellent example of this.
Common sense, in addition to not always being that common, is not always sensible. There are relatively few things, at least in modern America, which are universal enough to be called "common". Much of what is referred to as "sense" in this context starts as an assumption on
one person's part, and is taken on faith by enough people that eventually no one questions it.
Another definition of common sense is just "conventional wisdom", or "generally accepted claim" but there is no guarantee that it is based on anything valid. An urban legend which many people believe falls under a similar category. A few examples: shaving makes hair grow back thicker, eating sugar leads to being hyper, reading in the dark is bad for the eyes, people need to drink 8 glasses of water per day (http://www.uamshealth.com/?id=866&sid=1). There are a bunch more examples of things which "everyone knows". There's a couple in this list I believed: http://en.wikipedia.org/wiki/List_of_common_misconceptions. If something is false, no matter how many people believe it, it is still just as false. Many things which "everyone knows" started out because one loud and persuasive person was mistaken - or even deliberately making stuff up.
Faith is believing something without any (tangible) evidence at all. It could be an internal feeling, or it could be accepting what someone else says. Either way, one can't (or doesn't) verify that the belief is actually correct, assuming the one source of information to be unfailing. It’s been described to me as "a different way of knowing".
In looking at science vs. internal faith, those who promote the latter generally avoid considering the error rate.
Every time you "know" something and turn out to be wrong, that should call into question everything else which is "known" in the same way. Instead we tend to ignore or forget all the times when intuition or faith or "just knowing" turned out to be wrong, focusing exclusively on the one or two times it was spot-on accurate. That one time could have been random luck, but to the person looking for it, it represents confirmation that they can have internal knowledge without any evidence.
It’s generally more appealing to look for what you want to find than it is to honestly look at both the times that confirm what you want to believe AND the times that don't.
The last big one is another meaning of the word faith: faith in someone else. This could be a book written in the distant past or a preacher, but it is also accepting the word of anyone without questioning it. This goes for teachers and doctors, and even scientists, as much as anyone else. These people are all human, and, though they are more likely to understand and
apply the principles of science and logic themselves, they are still subject to human errors and bias. This isn't to say that no weight at all should be given to an individual's knowledge of a subject we may know less about, but taking anyone's word as absolute gospel is faith, just as it is in church.
If the issue at hand is still disputed among scientists, and if someone seems completely confident about the answer, this is actually a reason to trust them less, not more. It is only when a particular answer has been found to be more or less universal, that a good scientist should be confident about it.
Experts in a particular field are subject to the same cognitive errors, assumptions, mistakes, and acceptance of anecdotes as the rest of us are. It is always a good idea to double check their sources.
There was a person who told me that he was told by the official vehicle inspector in his state that it was illegal to change from power steering to manual. This didn't seem very likely to me, and it took me all of about 5 minutes to search for the vehicle code of his state online, and find the inspection requirements for the steering system. Apparently the official inspector had simply made that up.
This mistake could easily have been avoided if, instead of just making things up, he had applied the simple principle: check that you're right.
One would hope that when a person's life is at stake, people would tend to be more careful. But the people we trust as experts in health - doctors, who have spent 8 years in medical school and another 3 in residency - are no exception:
http://www.utne.com/2007-09-01/Science-Technology/A-Study-a-Day-Keeps-the-Doctor-Away.aspx
(links like this, on their own line, are intended not just as a side note, but as integral to this essay. It's not too long, and it's probably something you're going to want to be aware of)
One also has to consider the vested interests of any one source. Those who call themselves scientists but get all of their funding from the industry they study may not be impartial; bias makes for bad science.
But a much more common problem in our society than putting absolute trust in experts is putting absolute faith in anecdotes.
(It's a story... it isn't anything)
It is unfortunate that we refer to a sample size of one single uncontrolled incident as "anecdotal evidence" because it is in fact not evidence at all.
The fact that your uncle’s friend put a hydrogen generator on his car and got twice the mileage out of it isn't just weak evidence. It is nothing. That your neighbor gave her child a homeopathic remedy and the kid got better is not evidence. Did the friend have a dynamometer handy, or a closed track to test on? Might anything have changed about his driving style? How often do children get sick and recover with no medicine or treatment of any kind? Could it have been a coincidence that the machine and the drug were applied shortly before the change was observed? Might it be that expecting it to work changed the outcome more than the actual intervention did?
If a dozen people tried something and it failed, but for one person it succeeded, who is most likely to repeat their story?
The dozen failures don't bother to mention it, so all we hear about is the one success.
One of the most common errors the human mind makes is the assumption that two things happening at a similar time must be related, but in fact temporal proximity alone does not imply correlation. The fact that two things happen in the same general time period once or twice could easily just be a coincidence. Further, even if two events are correlated, correlation does not imply causation. Just because two things are connected does not show that one caused the other. It could be that one thing causes another in some circumstances, but not all. Perhaps both were caused by some third thing which remains unknown. Since nearly everyone makes that assumption, and outside of formal science pretty much no one controls for it with multiple controlled trials, almost every anecdote rests on nothing more than temporal proximity. But the stories get repeated, and when repeated with confidence, or often enough, they become accepted as true no matter how wrong they may be.
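To make the coincidence problem concrete, here is a minimal sketch (in Python, with made-up numbers purely for illustration): if most illnesses clear up on their own, a remedy that does literally nothing will still be followed by recovery most of the time, producing an endless supply of "I took it and got better" anecdotes.

```python
import random

random.seed(0)
TRIALS = 10_000
NATURAL_RECOVERY_RATE = 0.90   # hypothetical: most colds clear up on their own

recovered_after_remedy = 0
for _ in range(TRIALS):
    # The "remedy" in this simulation has zero effect; recovery depends
    # only on the natural course of the illness.
    if random.random() < NATURAL_RECOVERY_RATE:
        recovered_after_remedy += 1

print(f"Took remedy, then recovered: {recovered_after_remedy / TRIALS:.0%} of anecdotes")
# Roughly 90% of the stories will be "we gave the remedy and the kid got better" --
# pure temporal proximity, with no causation at all.
```

Without a comparison group that didn't take the remedy, the story tells you nothing.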
Part 3:
(Even good science makes mistakes. Just not that often)
Some will point out that science sometimes makes mistakes.
This is true.
First off, though, we need to honestly assess how often it makes mistakes, relative to other bases for belief.
a) Many things one might call mistakes of "science" really were never science at all. Many seeming examples were actually deliberate corruption of research for social and/or political reasons. Remember, science isn't guys in lab coats. The fact that someone claims something is
scientific does not make it scientific. Anything where the conclusion was decided in advance, to fit an agenda, inherently does not fit with the meaning of the word science.
b) Many things which may be seen as mistakes in science by the general public were never fully established or accepted within the scientific community. Hypothesis does not equal theory. No single study is ever considered proof of a new idea. This is a very common mistake people make outside of science. The result of a study is evidence of a hypothesis, not conclusive proof of a theory.
In order to be accepted, a given study or experiment has to be replicated, multiple times, independently. Published research gets peer reviewed, because no matter how objective and careful any one researcher is, we are all human and subject to making mistakes. What becomes popularized in the media was not necessarily ever accepted by the scientific community as a whole.
c) On rare occasion, mistakes actually are widely accepted. These are very rare (far more rare than errors in intuition, faith, common sense, and internal "knowing"). For example, it turns out there (probably) is no luminiferous aether after all, despite being largely accepted among physicists for many years.
But even here, science wins out over faith:
When faith or intuition discover new information which potentially uncovers a mistake, they invariably find an excuse - any excuse, however large of a stretch - to keep the original conclusion.
When mistakes are found in science, science revises its theory.
It is exactly for these reasons - the need for universality and error correction - that science's error rate is so much lower. In admitting its own fallibility, it is more honest than self-assurance or dogma.
(Profound misunderstanding)
Some will claim that a scientific conclusion is dependent on culture, a certain way of thinking, even gender.
This illustrates a profound misunderstanding by the public of what science actually is.
Science is not the exclusive realm of those people who have the official title of "scientist".
Science can not have a bias. Science has no agenda, no point of view. It is universal. No matter who you are, where you live, what you believe, if you take two objects of the same shape and size, but different weights, to the top of a building and drop them, you will observe them falling at the same rate.
Anyone can do any experiment themselves, and they will get the same result, anywhere in the world. No matter what your religion, what background, what beliefs you started out with, you can run the same experiment as someone else, and you will get the same result. You can calculate the gravitational constant, or the speed of sound, or the curvature of the earth yourself, with simple tools and math, if you want to verify that the accepted values are accurate. Anyone can test the validity of genetics by breeding their own varieties of peas, buy a microscope and watch bacteria, or get a telescope and examine the stars and planets.
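As a toy illustration of that "check it yourself" point, here is a minimal sketch of estimating the acceleration due to gravity (a simpler cousin of the gravitational constant mentioned above) from nothing but a tape measure, a stopwatch, and the relation d = ½gt². The height and stopwatch readings below are hypothetical, not real measurements.

```python
# Minimal sketch: estimating gravitational acceleration from drop times.
height_m = 10.0                      # drop height measured with a tape measure (hypothetical)
drop_times_s = [1.41, 1.44, 1.43]    # repeated stopwatch readings (made up for illustration)

# From d = 1/2 * g * t^2, solve for g = 2d / t^2 for each trial, then average.
estimates = [2 * height_m / t**2 for t in drop_times_s]
g_estimate = sum(estimates) / len(estimates)

print(f"Estimated g = {g_estimate:.2f} m/s^2 (accepted value is about 9.81)")
```

A stopwatch, a tall building, and a little algebra: no lab coat required.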
There are, of course, some questions that will probably always be up for debate, or that have no right or wrong answers. Where did the universe come from? What constitutes consciousness? What is fundamental human nature (outside of social influence)? But the fact that we can't answer everything doesn't mean we can't be reasonably sure about plenty of other things.
Part 4:
(Predictably Irrational; The human brain's consistent errors)
Common sense and intuition consistently make certain mistakes time and time again, in everyone. These mistakes are not really the fault of the person who makes them. They are universal parts of the human mind. We make certain assumptions automatically. Our brains developed to make survival on the open grasslands as likely as possible, and sometimes it’s better to err on the side of caution when the stakes of error include potentially getting eaten. But we don’t live on the savanna any longer, and our challenges have changed. Our evolution hasn’t kept up with our technology. This is why our instincts so often get things wrong.
Here is a video that explores our natural cognitive errors a little more closely. It shows some excellent examples of why it is a bad idea to trust your own intuition (and in a much more entertaining way than my dry straightforward prose!). Please take a moment to watch it:
http://www.ted.com/talks/dan_gilbert_researches_happiness.html
Really, actually watch it! No, seriously, it's good. It is a little long, but it’s worth it. It is relevant and important to the point I am making here, so if you don't have time to watch it all right now, it may be best to stop reading here until you do. Then you can read more of this...
...
(...this pause is me waiting for you to finish watching...)
...
See, I told you you'd like it!
It gets deeper than mistaken predictions. This next video perhaps shows better why the natural ways in which our minds are irrational are important to consider. Our predictable irrationality affects our sense of morality, and from there it affects behavior
(this one is only half as long as the last one):
http://www.ted.com/talks/dan_ariely_on_our_buggy_moral_code.html
Here is a completely different example of our natural errors in cognition, a fun one with hidden cameras and unexpected twists. If you are still a fan of astrology, this one is for you especially.
[un-embeddable video, please follow the link]
http://www.youtube.com/watch?v=btP_vy5cQq4
That last video is titled "cold reading", though really the term is more properly applied to a much more interactive art than what it shows. While this video neatly explains 100% of the perceived "accuracy" of written astrology, a good "psychic", a live astrologer, someone who speaks to the dead, or even an alternative healer, can build off of the mark's (um, sorry, I mean, client's) responses and give something much more personalized and therefore even more convincing, as the following video describes:
http://www.youtube.com/watch?v=Xswt8B8-UTM
(if you have time, the following parts are all interesting too, but I'm just meaning for you to watch the first one)
Here is an example of it in practice:
http://www.youtube.com/watch?v=0D5bWfZDIgA
To understand why all this works requires understanding subjective validation:
http://www.skepdic.com/subjectivevalidation.html
(and, if you have time: http://www.skepdic.com/coldread.html)
Subjective validation is an essential part of understanding not just how we can be taken advantage of, but also for understanding faith and intuition, and why they so often feel more accurate than they really are.
Subjective validation goes a long way toward explaining the human mind and the errors we naturally make. And being aware of it can help us avoid making them.
Subjective validation is a reason why our intuition, our "internal knowing", can be so consistently, predictably wrong - and yet we continue to have faith in it.
It is, essentially, us looking for meaning. Whether there really is any or not:
http://www.ted.com/talks/michael_shermer_on_believing_strange_things.html
In this last video, he mentions one of those fundamental errors of human reasoning - which we could easily compensate for if only we were more aware of it.
"You have to keep track of the misses, and not just the hits".
We naturally remember those times that fit what we are looking for and ignore or forget the ones that don't, making it easy to find meaning where there is none.
Subjective validation is why we so often feel that correlation is causation, that temporal proximity is correlation, and believe in superstitions. We tend to find reasons to believe what we had already decided was true. We tend to look for evidence that supports it and ignore whatever conflicts with it. We find ways to "rationalize" things that we really want to believe due to feeling or faith. Fortunate coincidences become religious experiences, and deeply held feelings become spiritual. Alternative treatments make a person feel better, and that's taken as evidence that they must be working. The government lowers taxes, the economy does well, and it’s seen as confirmation of supply-side economics. For those who want to believe, it's all too easy to ignore that people feel equally better from placebo...
http://www.skeptic.com/eskeptic/09-05-20/#feature
or that the economy has done equally well with high tax rates many times in the past.
http://www.slate.com/id/2245781/
Fortunately for modern humans, we also have the ability to think things through, to recognize our own mental limitations, and in doing so to compensate for them.
The Greeks started cataloging some of our common fallacies thousands of years ago. For example: "post hoc ergo propter hoc" - this happened immediately after that, therefore that must have caused this.
We also are particularly prone to thinking that because someone of importance believes something, it’s more likely to be true, or that if popular consensus holds a view it's most likely correct.
http://en.wikipedia.org/wiki/Fallacies
Notice that each one has a fancy Latin or Greek name, and a complicated-sounding technical explanation.
But notice also how each one has a very simple example afterward.
It is easy to see how many people can and do make these sort of incorrect assumptions, constantly - and also why they are wrong, by looking at the examples following each one. We have all made these mistakes now and again.
I was talking to someone on this topic who said she had taken a "logic" class in college. She said she hadn't gotten anything from it, that it was overly complicated and that her "mind doesn't work that way".
But that's just it - no one's mind works that way.
That's the whole reason it is so important to learn.
Unfortunately most college level classes deal with things like the history of philosophers, delving into unsolvable problems, and considering the different schools of thought on various complex issues.
What I am talking about is much more basic than that. What I am saying we need to learn about is the ways common sense leads us to the wrong answers - and it should be at an age early enough that we can assimilate it into our minds as thoroughly as we do reading and counting and basic arithmetic. Nobody comes into the world knowing how to read or add. We learn it young, while our minds are primed for learning, and by adulthood we don't have to think about it. We don't have to sound out each word, or write down a two digit addition problem in order to solve it. If we learned these things for the first time in college though, it would be a lot more challenging.
(Don't believe everything you see)
One simple example of realizing and understanding all of the ways that our natural perception and intuition can get things wrong is optical illusions:
http://www.michaelbach.de/ot/
Scroll through a couple by clicking the "tour" button at the top, or just select some from the list of pictures on the right.
I find "rotating snake" and "motion after effect..." particularly creepy, just because I know it’s not moving, but I just can't stop my mind from seeing it as though it is.
And if you have any doubt it is your own mind creating these effects, and not some web page trick, "spiral aftereffect" makes the real world around you warble.
Here are a few more, in a fun little video:
http://www.ted.com/talks/al_seckel_says_our_brains_are_mis_wired.html
Everyone is aware that these exist, but we generally don't take the lesson they offer and apply it to everyday life:
Seeing is not always believing.
Just for emphasis, let me repeat that. You just saw a bunch of things with your own two eyes which did not accurately reflect reality.
Seeing is not always believing.
Seeing something with your own two eyes is not proof that something exists/happened, because the eyes - and more importantly, the mind - can be tricked or mistaken.
Here's a totally different approach, but equally simple and with the same implications (this one is only a couple minutes):
http://www.psych.ubc.ca/~rensink/flicker/
and another:
http://www.youtube.com/watch?v=2pK0BQ9CUHk
(Don't skip this one. It's less than one minute long, and you'll be tested on it in a second)
We all tend to see what we are looking for, and not notice what we aren't.
This last simple video should make people stop and question the value of eye witness testimony.
But despite the well-documented limitations of human awareness (under even normal, stress-free situations) juries continue to give far more weight to eye-witnesses than to actual concrete evidence.
This ends up having a profound and tragic effect: of hundreds of people wrongly convicted of capital crimes and proven to be innocent by DNA evidence (which was unavailable at trial time) 75% had been positively identified by an eye-witness!
All of these eye-witnesses were not deliberately trying to get innocent people convicted. They believed with absolute certainty that the person at the defendant table was the same person who they saw commit the crime.
And they were wrong.
Every jury should be required to watch the counting basketball video before hearing eye-witness testimony. You just failed to notice a guy moonwalking in a gorilla suit, under perfectly calm conditions when your focus was on the screen - how likely is it an eye-witness saw everything clearly under the stress of a crime-in-progress?
It’s easy to place blame on the particular individuals who made the mistake, but it's something we are all capable of.
It’s something to keep in mind the next time you are 100% sure about something which you don't have any tangible evidence of.
(p.s. in case you were waiting, there really isn't a test on the moonwalking gorilla video)
It doesn't stop at seeing what isn't there and not seeing what is either.
http://www.ted.com/talks/keith_barry_does_brain_magic.html
What this means is not that we should discredit everything we experience, or give up entirely on understanding the world. It just means we have to take the time and effort to look a little deeper. It means we have to be careful. Every single phenomenon shown on the optical illusions page and in the various videos can, with a little investigation and thought, and with an understanding of optics and neurology and psychology, be explained and understood.
Part 5
(Knowledge fish leads to a nation of Americans)
They say "give a man a fish, he eats for a day, teach a man to fish, he eats for a lifetime."
We think of the purpose of education as being teaching people to fish.
But really, what we teach (at least in America) is facts and techniques. We are giving children knowledge fish. What is more important than learning 'stuff' would be learning how to learn.
There are constantly overwhelming amounts of information thrown at us. There are people who believe all sorts of conflicting things, books and TV and internet and friends and teachers and...
We need to know what to filter. We need to know how to think critically, how to make deductions from the facts we know, how to double check our conclusions. Not just on homework assignments, but in real life. If we learn how to make the most of our capacity for intelligence, we can use that to go out into the world and learn on our own.
If we just learn facts and methods to solve homework problems, we can pass tests, but not much else. Once you get out of school, work a few years, you aren't going to remember the steps you used to do your middle school homework. When it comes time to sign a contract and you need to compare possible interest rates and down payment calculations, if you actually
understood the theory behind what you were learning back then, it will be no problem to figure out the steps to take all over again.
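For example (a hedged sketch with hypothetical numbers, using the standard loan amortization formula rather than anything from a particular curriculum), re-deriving the monthly payment M = P·r / (1 − (1+r)^(−n)) is all it takes to compare two offers:

```python
def monthly_payment(principal, annual_rate, years):
    """Standard amortization formula: M = P*r / (1 - (1+r)^-n)."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical comparison: same house price, different rate and down payment.
price = 300_000
print(f"10% down at 6.5%: ${monthly_payment(price - 30_000, 0.065, 30):.2f}/month")
print(f"20% down at 6.0%: ${monthly_payment(price - 60_000, 0.060, 30):.2f}/month")
```

The specific steps from middle school are long forgotten, but knowing what the formula means makes it easy to reconstruct.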
It was true even before "no child left behind" that our educational system "taught to the test", but since then it has only gotten worse. The fastest and easiest way to get test scores up is having kids memorize the right answers, or shortcuts to get to them. We give them intellectual fish.
No surprise then, that on the "Program for International Student Assessment, U.S. 15 year-olds scored below most selected nations in 2006, and... dropped ... in both mathematics and science [compared to 2000]" and "the average mathematics score of U.S. students was lower than scores in 18 comparison nations (out of 24), and higher than those in 4 other countries—3 of them developing economies."
http://www.nsf.gov/statistics/seind10/pdf/c01.pdf
As adults, even basic facts about the world and how it works elude most of us. Chances are even you miss a few fundamentals.
Give it a try:
http://www.ted.com/talks/jonathan_drori_on_what_we_think_we_know.html
These are pretty basic things, which anyone who went to school should know (I missed some myself).
When tested for facts and procedures, US students actually score in the middle among other advanced and developing nations. On the other hand, when tested for problem-solving ability and applying what they have learned to new situations, we come in at the bottom (page 16).
No wonder our adult population fares so poorly on facts. When we were first taught these things as children, we may have memorized the facts, but most of us never truly understood them, so once test time has passed, we tend to forget them.
While many see the solution to our failings in education as either putting more money into education (which is surely true, but certainly not enough by itself) or rewarding and punishing teachers more severely (as though they aren't actually trying), the real problem seems to be that the people who get to set standards and curricula (district administrators, state and federal education secretaries) and those who train the teachers themselves all suffer from ignorance of evidence and a lack of rationality - in short, a failure to apply the principles of science to real life - favoring instead theory and anecdote and what they want to be true.
http://www.nytimes.com/2010/03/07/magazine/07Teachers-t.html?em
I strongly suspect all of this explains why Americans' level of superstition and religion is so much higher than it is in any other developed nation.
http://www.secularhumanism.org/library/fi/bishop_19_3.html
Before we had telescopes, it wasn't so unreasonable to think stars were the flaming chariots of the gods.
Only a few hundred years ago, people believed that the Earth was the center of the entire universe; that the stars rotated around us. Today we know with certainty that we are not the center of the galaxy, or even of our own solar system. The stars are burning balls of hydrogen and helium, just like our sun. They are thousands of light years away (or quadrillions of miles away, give or take). In fact, they are so far away that what we see is not the stars as they are, but how they were thousands of years ago, because it takes even light (the fastest thing that there is) thousands of years to travel quadrillions of miles.
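The arithmetic behind that distance claim is simple enough to check yourself; here is a minimal sketch using only the speed of light and the length of a year:

```python
# Minimal sketch: how many miles is "thousands of light years"?
SECONDS_PER_YEAR = 365.25 * 24 * 3600        # about 3.16e7 seconds
SPEED_OF_LIGHT_MPS = 186_282                 # miles per second

miles_per_light_year = SPEED_OF_LIGHT_MPS * SECONDS_PER_YEAR
print(f"1 light year     ~ {miles_per_light_year:.2e} miles")          # ~5.9e12 (trillions)
print(f"1000 light years ~ {1000 * miles_per_light_year:.2e} miles")   # ~5.9e15 (quadrillions)

# And since that is how far the light traveled, a star 1000 light years
# away is seen as it was 1000 years in the past.
```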
The total effect stars have on anything that happens here is less than the effect of a butterfly flapping its wings on the other side of the planet - except for how they affect human behavior as a result of belief in astrology.
Given that the position of the sun and moon in the sky correlated with the seasons and tides and other earthly events it wasn't so unreasonable to imagine that the stars or planets might have some earthly effects or predictions as well, and from that developed the idea of astrology.
Today, given what we know about how the world works, believing in astrology is indefensible...
Today, 30-40% of Americans believe the stars and planets can affect, or at least predict, what will happen in our lives.
In Europe even the majority of Catholic priests and other Christian leaders do not believe in literal creationism. Not only do we have a strong theory for how evolution occurs, and overwhelming evidence in support of it, but we can actually watch it occur, in real time, by producing bacteria that are resistant to antibiotics (we used to do this in the microbiology lab at my junior college, as part of a lesson). There is no question at all that evolution occurs. We have even sequenced DNA and transplanted genes.
50% of Americans believe in literal creationism.
Another third believe in intelligent design.
http://www.religioustolerance.org/ev_publi.htm
50% of Americans believe in ghosts, 60% of Americans believe in psychic powers, and 70% believe magnets can heal people.
http://www.harrisi.org/harris_poll/index.asp?PID=359
http://www.gallup.com/poll/21811/american-beliefs-evolution-vs-bibles-explanation-human-origins.aspx
One fundamental claim made by the supporters of free-market libertarian economics is that individuals act out of rational self-interest. Aside from the issues around the tragedy of the commons and the public good, this claim is itself demonstrably false:
http://www.ted.com/talks/dan_ariely_asks_are_we_in_control_of_our_own_decisions.html
(this is one of my favorites)
Of course, in the right hands, this could be used in a good way:
http://www.ted.com/talks/rory_sutherland_life_lessons_from_an_ad_man.html
But in reality, human nature and psychology is usually exploited by corporate interests for financial gain:
(ok, the following documentary series is four parts of nearly an hour each. Maybe I can't convince you to watch the whole thing right now. At least watch the first one. Or the first half. And do come back and watch at least the first one. This little-known history explains an extremely profound shift in the collective American consciousness from citizen to consumer, and from community-oriented to individualist. It has led directly to the consumerism and individualism of today, and this history is virtually unknown by most of us.)
Theory is the highest level of science - hence evolution and gravity still both being theories. There is no dogma in science, no axioms or faith, nothing which is sacred or unquestionable, no fundamentals assumed to be true. Science is not a form of faith. Anyone can test it and if it is found to be wrong, it changes. It is literally the opposite of faith.
This is what they taught us about science in elementary school:
Observation -> hypothesis -> prediction -> experiment -> theory.
This is actually a decent summary, but alone it isn't clear how this applies not only to the school subjects called "science" but to every area of human knowledge and understanding. In more general terms we call it logic, rationality, or critical thinking, but those terms suffer from the same limitations that "science" does, of being relegated to the content of a particular class, with no bearing on real life. I was told recently exactly that in conversation, by the person who had taken a college course on logic and didn't feel she had gotten anything from it.
But just as science doesn't mean guys in lab coats and electronauticalspectrograph machines making grand pronouncements about some esoteric topic, logic doesn't mean the history of philosophy, or math based logic problems, ethical conundrums, brain teasers, or memorizing lists of rules.
Like science, logic and reason are real world ways to remove subjectivity and bias from our understanding of the world, so we can actually know something - at least as best as we possibly can - which is true, and not just a guess.
When we use observation and logic to arrive at our understanding of the world, we have the opportunity to make better choices, choices which are more likely to achieve our goals. When we rely instead on faith, we more often start with faulty premises, leading to faulty conclusions, leading to mistakes that could have been avoided.
(One final point: Corruption of the concept)
You have probably gotten the point by now. Ok, ok: rationality is useful, science is not just for the classroom, and being aware of the ways in which the human mind is subject to consistent error can help avoid making those errors.
Unfortunately, people who wish to convince others of their own agenda are well aware of science, and it is all too easy to corrupt.
So, you have creationism leaving Sunday school, and entering regular school in "science" textbooks under the name 'intelligent design':
"Those who do not believe that the Bible is the inspired, inerrant Word of God will find many points in this book puzzling, this book was not written for them" ["Biology: Third Edition" Bob Jones University Press]
http://richarddawkins.net/articles/2478
You have homeopathy pointing to the few small studies that support it while ignoring the overwhelming number which don't, and using arguments such as "the remedy is tailored to the specific individual, therefore it can not be tested in the traditional way"
http://www.skeptic.com/eskeptic/09-01-14/#feature
You have proponents of literal karma (not just that being nice to people causes people to be nice to you in return, but that the cosmos itself returns "good" acts with good fortune, even into different lifetimes), explaining it as the "law of cause and effect"
http://www.theosophy-nw.org/theosnw/karma/ka-jal.htm
I even spoke to someone recently who seemed to feel that astrology was verified by the fact that an astrology book was able to predict an eclipse.
All four of these have the same thing missing, and it's a new issue from what I've mentioned thus far.
It is neatly summed up by this cartoon:
http://www.tc3.edu/instruct/sbrown/pic/miracle.jpg
(which you may recognize if you've been watching all the videos)
They all suffer from missing any clear mechanism of action. This step is skipped all too often. By what means did you "just know" what was going to happen? How could Governor Davis have caused the power shortage? How does karma wrap around time and space and lifetimes and affect a specific individual in a way that restores cosmic balance? When making the jump from "one thing happened and then something else happened after it", question how the one might have exerted direct influence over the other.
Don't have an answer? Certainly that isn't disproof. Maybe it is an issue of cause and effect. But unless you can explain how, it is premature to say with confidence that it is. If there was a connection once, it may be no more than a coincidence. If you run the same experiment over
and over again, and find the same connection, what you have is a correlation. But if you can't explain how A caused B, you still don't have enough to say it did. Where that leaves you is, now you need to do more research.
There are times when perhaps you don't care enough about the answer to do that research. All I am saying is when that is the case, be honest and leave it at: I don't know. There should be no shame in saying "I don't know". There is a lot to know in this world. It is dishonest, and downright dangerous, to pretend to be confident about things you have no way to be sure of.
There are other things which you really do want to get to the bottom of, to know for sure, in order to make good choices that have a positive effect on life.
Part 6
(The good news)
Lucky for us, we happen to live in a time in history when each of us has more access to information than was available to the entire sum of humanity only a few hundred years ago. Here at home (or school or the library or wherever you are right now) we have access to information which would have taken days, or weeks, to look up just a few decades ago.
1000 years ago in Asia, and 500 in Europe, movable type was invented, and the written word gradually became available to the masses. With time education became free and universal and mandatory, libraries became commonplace, and daily newspapers let us know what is happening
almost in real time. For the first 200,000 years or so of human existence it was entirely understandable that people were superstitious and irrational. It was not ignorance; it was simply lack of information.
In the age of Google (which makes peer-reviewed journals, books, and now even newspapers available online, largely for free) and Wikipedia (which has 15 million pages, almost all with references and footnotes) there is no excuse.
Nearly any topic you can think of, someone somewhere has thought of it, investigated it thoroughly, written about what they learned, and then put it online.
There is, obviously, an enormous amount of ridiculous crap online. Here, as I mentioned earlier about education, what matters is not the facts we already know, but our ability to filter. Our tendency is to filter based on what we start out believing, or what feels most right. This obviously will only lead to supporting what one wants to be true, not to learning what is.
What Wikipedia has done, along with providing vast amounts of information, is provide sources for every claim. Where a statement goes un-footnoted, that fact itself is noted. Click on the tiny superscript number and you can trace the origin of a statement. Go back further and you can find who made it, who's paying them, what other claims they make, what agenda they may have and what bias. Then, because no one study provides more than evidence, check out another. Read the opposing arguments - always read the opposing arguments - and not just the ones that support your own belief.
Take your own personal experience out of the equation - one single personal experience, even your own, is merely an anecdote, and it isn't evidence one way or the other. It’s just a story.
I was recently questioning a food-based medicine regimen that sounded like it bordered on "alternative" (while it's thought of as an alternative to mainstream western medicine, the more accurate definition is any medical treatment which has not been scientifically confirmed to be more effective than placebo). I was shown a study that supported it. On further investigation, however, I found that the study was conducted by a group which is funded primarily by major industrial food and drug companies. Given the enormous conflict of interest that presents, I went on to find more independent sources.
Results? Inconclusive. There seems to be legitimate research which points both ways.
This is not the most gratifying conclusion.
But it is the most honest one. It is the truth.
And this is what it all comes down to, in the end. Often times the truth isn't what we want to hear.
We can either get instant gratification and comfort from a fantasy world, OR we can base decisions on the best information available and create positive change in our own lives and the lives of those around us.
In a world where actions have consequences, we cannot do both.
I think it is of absolutely vital importance that we all ask ourselves: which
is more important?