Gleanings from Social Media

[Image: flat earth]

Like most other people today, I spend a certain amount of time perusing social media. Not because I particularly enjoy it, but because it’s a good source of material for this blog. But the downside is that the impression one gets from such outlets is that the present human condition, and the outlook for the human race, are very bleak indeed.

I know someone who literally believes the earth is flat. Literally. She posted the above photo on Facebook not long ago, and declared that the concept of a round earth is a myth promoted by the Great Conspiracy to turn us into mindless drones. And the astronauts? They never really went anywhere. The government has expended billions of dollars and millions of man (and woman) hours and even several lives to advance a “myth” that nearly everyone has already believed for centuries. Oh, and she also believes that chemtrails are used to control us, that global warming is a myth, that vaccines cause autism, and probably that Obama is still trying to take away her guns.

I wish I could tell you that she’s alone, but there are many others out there too, most of them spreading an endless supply of misinformation that swarms the Internet like a plague of locusts. What is perhaps even more troubling is that even the positive and accurate information circulating out there in the hivemind paints a rather grim picture.

For instance, another link I saw posted from the website Daily Kos offers an illuminating explanation for the way “conservatives” in particular are so frequently ensnared in the web of what used to be called fake news before that term was stolen. Basically, right-wing manipulators are playing a game of telephone. And whatever the participants hear, they believe. Unshakably. And permanently.

As yet another link explains, any attempt to introduce verifiable facts to a devotee of alternative facts results in what is called the backfire effect:

As a rule, misinformed people do not change their minds once they have been presented with facts that challenge their beliefs. But, beyond simply not changing their minds when they should, research shows that they are likely to become more attached to their mistaken beliefs. The factual information “backfires.” When people don’t agree with you, research suggests that bringing in facts to support your case might actually make them believe you less. In other words, fighting the ill-informed with facts is like fighting a grease fire with water. It seems like it should work, but it’s actually going to make things worse.

Wow. If this is true, and given everything else that we’ve seen on Facebook et al, what conclusion can we draw except that we’re all doomed?

Well, someone else online posted at least a glimmer of hope. And it comes from comments made more than 350 years ago by the French philosopher Blaise Pascal, whom we’ve encountered before:

When we wish to correct with advantage, and to show another that he errs, we must notice from what side he views the matter, for on that side it is usually true, and admit that truth to him, but reveal to him the side on which it is false. He is satisfied with that, for he sees that he was not mistaken, and that he only failed to see all sides. Now, no one is offended at not seeing everything; but one does not like to be mistaken, and that perhaps arises from the fact that man naturally cannot see everything, and that naturally he cannot err in the side he looks at, since the perceptions of our senses are always true….People are generally better persuaded by the reasons which they have themselves discovered than by those which have come into the mind of others.

In short, if there is any hope at all of encouraging the delusional to actually realize that they are delusional, it lies not in challenging their delusions, but in humoring them as far as possible. That may sound like a slender thread of hope. But it may be the only thread we have. At least if the culture of social media presents an accurate reflection of society at large.

Protests Against Logical Fallacies

[Image: vet]

By any measure, the Women’s March that occurred on Jan. 21 was a resounding success, with an estimated 470,000 participants in Washington and about 4 million worldwide in more than 100 countries. Accordingly, it has come under attack from Trump supporters (even though the event was not specifically geared as an anti-Trump protest), who have summoned a smorgasbord of logical fallacies to justify their criticism.

Many Internet memes making the rounds suggested that the marchers were just spoiled whiners and sore losers; one compared the circumstances of American women to the plight of women elsewhere in the world, including:

Saudi Arabia, women can’t drive, no rights and must always be covered.
China and India, infantcide (sic) of baby girls.

Afghanistan, unequal education rights.
Democratic Republic of Congo, where rapes are brutal and women are left to die, or HIV infected and left to care for children alone.

Mali, where women can not escape the torture of genital mutilation.

Pakistan, in tribal areas where women are gang raped to pay for men’s crime.

This is an example of the fallacy of relative privation, more colloquially known as the “not as bad as” fallacy. Its premise is that you have no right to complain if other people have it worse than you. By this reasoning, you have no right to seek medical attention for a broken arm because other people are dying of cancer. And ultimately, you have no right to seek redress for anything, because there’s always theoretically someone who has it worse.

There was also a corollary of this premise, which we might call the fallacy of relative merit, or the “not as good as” fallacy. For example, there was this photo:


With the caption: THESE ARE THE WOMEN REALLY MARCHING FOR YOUR RIGHTS.

The implication is that because the march on Washington didn’t require the kind of courage and risk that attends putting on a uniform and putting your life on the line, it’s not a valid campaign for human rights. Again, by this reasoning essentially any action is less worthy than some other action, and thus nothing should ever be done. Fortunately, when this meme made the rounds on Facebook, at least one person responded in this fashion:

EXCUSE Me!!! I am a Veteran…I wore the uniform for 24 years. The only reason I was able to serve is because at some point in history…someone MARCHED to allow me to serve as a BLACK and as WOMAN!!! I take nothing away from those serving today… So there is NOTHING wrong with marching.

The relative privation narrative above even mentions protesters showing up with a “5 dollar Starbucks” in hand, suggesting that if anyone can afford such indulgences, then they have no cause for complaint. That’s a huge non sequitur, of course. But even if it were perfectly true, it’s based on the false assumption that each of the marchers is there representing only herself or himself. Which is hardly the case at all; the march was on behalf of all disadvantaged people all over the world.

Another web virus was a video clip of a man purported to be a veteran (which evidently is supposed to make him an authority on these matters) that actually appears to be a commentary on other protests, though it has been applied to the Women’s March as well. Upbraiding the “crybabies” (who, in typical cart-before-the-horse fashion, he proclaims are “the exact reason Donald Trump won the election”), he throws in several straw men that have no relevance to the actual grounds for complaint, including “You’re causing all this destruction just because your candidate lost”; “You don’t always get your way”; “Ain’t nothing free”; and “But you want everything”.

His reference to protesters causing destruction could be considered an instance of the fallacy of composition: a scant handful of protesters (actually masked interlopers who were not part of the protest proper) had been destructive, so he applies that property to all of them. It’s unclear whether he is presenting himself as a phenomenally gifted psychic or a sociologist who has actually studied the demographics of the crowd, but in either case he’s horribly inept:

None of you put on a uniform, but you’re quick to disrespect the flag, to not wanna say the Pledge Of Allegiance, not wanna recognize the Bible.

In fact, a great many of these protesters have put on uniforms of various kinds, including (as he evidently was referring to) military uniforms. Veterans are often involved in protests, because they often feel (not without reason) that they’ve been given the short end of the stick. But what difference does it make how many veterans were there? Does this social critic mean to suggest that nobody who has not personally defended constitutional rights in warfare has a right to exercise them? That has never, ever, been a condition for the rights and privileges of citizenship in the U.S.

Whom did he see “disrespect the flag”? What the hell does the Pledge Of Allegiance have to do with anything, and how would he know how many people in the crowd say it and how many don’t? And the Bible??? Who doesn’t “recognize the Bible” when they see it, and how is it in any way relevant to what is going on here? And how does he presume to know the religious convictions and practices of half a million people? It would be hard to cook up a bigger pot of red herrings than this.

One valid point he makes is that the demonstrations make people late for work. Or is it so valid? One sermon that the critics keep preaching to demonstrators is “you’re responsible for your own circumstances, so quit your bitching.” Which is not only irrelevant, but not quite true — it’s hard to blame people who voted for Hillary Clinton for creating the Trump presidency. But suppose we all really are one hundred percent responsible for our own circumstances. That means that people who are late for work can’t blame it on activity in the street — particularly when they have the capability of finding out about those activities in advance and planning accordingly.

In fact, sometimes we are tardy for our appointed rounds due to circumstances beyond our control. But that applies to a great many types of events, including not only protests but inaugurations. If we used inconvenience to the general populace as a criterion for prohibiting events, we’d never have any large-scale outdoor functions of any kind.

It’s also common for people to respond to the Resistance by saying, “hey, Trump was elected so let’s give him a chance. Every 4 years, people complain about the outcome of the presidential election. Why can’t we all just forget our differences and work together?”

This would have been an excellent speech 4 years ago or 8 years ago, when people were indeed raising a ruckus just because they had an irrational hatred of the guy who won. But it’s the most glaring of false equivalences to try to make a similar case now. The resistance to Trump is not just because the protesters’ candidate lost; it’s because Trump has made it very clear, with virtually everything he says and does, that he is grotesquely unfit for office, and a very real danger to the country and the whole world. It isn’t just a matter of ideological differences; and suggesting such is an effort to “normalize” someone who is anything but normal and healthy. “Working together” with him is “working together” with someone who is doing everything he can to prevent us from all working together.

If you pay careful attention to these attacks on anti-Trump protesters, you’re likely to find more species of faulty logic than you can shake an alternative fact at.

The “War On Christmas” in 4 Minutes

[Image: Jesse Dollemore]

I regret that I didn’t discover it until Christmas Day, but Jesse Dollemore has a nifty little video that addresses the silly “War On Christmas” narrative that surfaces every year around November.

Dollemore shows a clip of Donald Trump proclaiming to an adoring throng that “we’re going to start saying Merry Christmas again.” (Does that mean he’s going to issue an imperial proclamation that everyone must say it?) And clips of the talking heads at Fox “News” reacting in a manner that (honest to Pete) brings to mind teenage girls screaming over The Beatles.

They declare that the War On Christmas has been won now that Trump has been elected, and that because of him people are starting to say “Merry Christmas” again (as if they’d been prohibited from doing so up until now); they even ask “When was the last time you heard a politician say” the taboo magic phrase.

Then, after presenting a few actual facts on the matter, Dollemore serves up a clip of President Obama wishing Americans a Merry Christmas 16 times — and Michelle Obama twice.

When was the last time you heard a politician say “Merry Christmas”? Depends on how much you’ve been paying attention.

 

The Big Misconception About the Electoral College

[Image: 2016 Electoral College map]

The Electoral College has been very much in the news lately, with many people passionately calling for its eradication or staunchly defending it — usually depending on whether their candidate won or lost. There’s certainly room for debate on this matter, but what’s annoying is how frequently supporters of the institution defend it with the same misconception: the EC, they so often say, was designed to provide “balance” in the electoral process. Nothing could be farther from the truth.

The Electoral College, in fact, was designed to promote imbalance — i.e., to give some states disproportionate representation in relation to other states. Specifically, it was designed to give more clout to slave-holders and to advance the interests of wealthy landowners in general. And although slavery has subsequently been abolished, the EC still is doing an excellent job of keeping certain sectors of the populace “in their place” and skewing elections in favor of agrarian communities as opposed to urban communities, and in favor of wealthy white men over everyone else.

Another common phrase you hear from defenders of the EC is that it protects the country from being dominated by California and New York. So does it make more sense to have the country dominated by Oklahoma and Nebraska? Because that’s exactly what’s happening. Under the present system, many states that depend mostly on agriculture are valued far more highly than some states that thrive on agriculture plus the tech industry, publishing and media, banking, insurance, science and medicine. As an extreme example, a vote in Wyoming carries more than three and a half times as much weight as a vote in California.
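
To put that last figure in perspective, here is a minimal back-of-the-envelope sketch in Python. The electoral vote counts are the actual 2016 allocations, but the population figures are rough estimates included purely for illustration, not authoritative data:

```python
# Rough check of the claim above. Electoral vote counts are the 2016
# allocations; the population figures are approximate estimates, included
# only for illustration.

electors = {"Wyoming": 3, "California": 55}
population = {"Wyoming": 585_000, "California": 39_250_000}  # approximate

# Electoral votes per resident, i.e. the relative "weight" of one vote.
weight = {state: electors[state] / population[state] for state in electors}

ratio = weight["Wyoming"] / weight["California"]
print(f"A Wyoming vote carries about {ratio:.1f}x the weight of a California vote")
# Prints about 3.7x, consistent with "more than three and a half times".
```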

And it’s getting even more lopsided. Twenty years ago, it probably would have been unthinkable for a presidential candidate to win the popular vote by nearly 3 million, yet still lose the electoral vote. But as more and more people move into cities, their votes for president will count less and less. If the trend continues, the president ultimately may be selected by a mere handful of voters. (Although technically there’s no limit to the number of votes a state can acquire, there’s a practical limit because the total must be 538.) Among other things, this means it’s going to be increasingly difficult for a Democrat to get elected — which is precisely why the system will never change: the last thing Republicans want is a level playing field. (See mandering, gerry.)

Supporters of the present system — or of the candidates most likely to benefit from it — also like to tout maps of electoral results like the one reproduced above.  This, they declare, is proof that the country overwhelmingly supports Donald Trump, even though most of the voters rejected him. Makes perfect sense, eh?

Sometimes they’ll get even cuter by breaking down the electoral map into counties, resulting in a sea of red with only a few little islands of blue. Where the hell are all the libruls lurking?

[Image: 2016 election results map, by county]

Such maps are bad models because they depict geographic boundaries rather than demographic density. What we are perfectly capable of producing, yet seldom see, is a 3-D map revealing that “blue” voters tend to live in more densely populated areas, and often even in high-rise buildings.

[Image: 3-D election map by county]

The two-dimensional maps are meant to reassure us that the right guy won, because he’s representing more of the country. But what they actually do is betray the big flaw of the Electoral College: the president is elected by land mass more than by the citizens who live on it. Donald Trump was not the choice of the people, but he had the overwhelming support of cows, coyotes and cacti.

Defenders of the system are fond of comparing the Electoral College to the World Series. Specifically, they often cite the 1960 Series, in which the highly favored New York Yankees outscored the lowly Pittsburgh Pirates 55 to 27, and yet the Pirates still won the series — thanks to the storybook finish of a home run by a mediocre hitter in the bottom of the 9th inning of game 7. We accept and even relish this kind of unexpected drama in sports; so why not in elections?

Well, because even though the presidential election has developed into a spectacle in its own right, with its catty debates and October surprises, it was never designed to be entertainment the way baseball was. It was designed to be a way to pick the leader of the nation; and that objective is better served by honoring what the people want and need rather than honoring where they live. Furthermore, determining a baseball champion with a series of games rather than a single game helps minimize the element of chance; but breaking up a national election into state elections actually heightens the role of chance. In 2000, for instance, the fluke of a confusing “butterfly ballot” was enough to flip the entire state of Florida — which in turn was enough to flip the entire election.

Consider a hypothetical race between 3 candidates. In state A, candidate Jones receives 5 million votes, Smith receives 4 million and Brown 1 million. In State B, Brown receives 5 million, Smith 4 million and Jones 1 million. The totals for these two states then are: Smith 8 million, Jones 6 million and Brown 6 million.  So Smith, the candidate with the most votes, is awarded no electoral votes at all. Repeating this 25 and a half times, we see that it’s theoretically possible for a candidate to win the popular vote, and yet receive ZERO electoral votes for her trouble.
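
For the arithmetically inclined, here is a minimal sketch in Python of that hypothetical, scaled up to 50 winner-take-all states. The candidate names come from the example above, but the vote counts and the flat ten electors per state are invented purely for illustration:

```python
# A toy model of the hypothetical race described above, with 25 "Type A"
# states and 25 "Type B" states. Vote counts and the flat 10 electors per
# state are invented for illustration only.

def tally(states):
    popular = {"Jones": 0, "Smith": 0, "Brown": 0}
    electoral = {"Jones": 0, "Smith": 0, "Brown": 0}
    for votes, electors in states:
        for candidate, count in votes.items():
            popular[candidate] += count
        winner = max(votes, key=votes.get)  # winner-take-all
        electoral[winner] += electors
    return popular, electoral

type_a = ({"Jones": 5_000_000, "Smith": 4_000_000, "Brown": 1_000_000}, 10)
type_b = ({"Brown": 5_000_000, "Smith": 4_000_000, "Jones": 1_000_000}, 10)

popular, electoral = tally([type_a] * 25 + [type_b] * 25)
print(popular)    # Smith leads the popular vote with 200,000,000
print(electoral)  # ...and yet Smith receives 0 electoral votes
```

Smith tops the national popular vote by a wide margin yet walks away without a single elector, which is exactly the quirk the hypothetical is meant to expose.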

Does this mean the EC should be abolished? Not necessarily. Because there seems to have been another, more honorable purpose for its existence: to serve as a last line of defense against tyranny. The founders explicitly stated that the institution should be composed of individuals better qualified than the general public to select a national leader, and that it should help ensure that dangerous, unqualified demagogues do not sneak into office just because they happen to hoodwink the voters.

But obviously, the system has failed us big time. The Electoral College has become little more than a rubber stamp; some states have even made it illegal for electors to contradict the choice of the voters in that state.

So perhaps it should be abolished after all. Just don’t hold your breath.

The Myth of a Decaying Society

[Image: the sky is falling]

How many times have you read it or heard it? The world is going (or has gone) to hell in a handbasket. If only we displayed the Ten Commandments in more places. If only we had more capital punishment. If only kids didn’t wear such strange clothes and listen to such strange music. If only parents and teachers beat their kids more, like they used to. If only movies didn’t have four-letter words. If only we had another B-movie president.

This year, 2016, will go down in history as the year when one of America’s two major political parties nominated a presidential candidate with no qualifications whatsoever. All he offered (besides a galactic ego and a very long history of very shady business deals) was a vacuous slogan: Make America Great Again. But that was enough to persuade millions of voters to march behind him. Because many of them, though they’d never had it so good, had been convinced that the nation was in such rotten shape that they were willing to entrust it to a crackpot demagogue snake oil salesman.

This phenomenon is by no means limited to Twenty-First Century America; it’s as timeless as civilization itself. Some 600 years ago, at the height of the Renaissance, a poet wrote what at first appeared to be a celebration of the richness, the vitality, the color and panache of his age, but he wrapped up his poem by concluding that he’d never seen things so bad. A couple of millenniums before that, Confucius was complaining that the music of his time was corrupting the morals of society, and the celebrated Hebrew prophet Jeremiah (or someone) was authoring the Lamentations bemoaning an utterly bleak world. Go back to just about any age, in just about any country, and chances are you’ll find a record of someone complaining about how things were just not like they used to be in the Good Old Days.

Time out. If all of these wailers were correct, then civilization has been on a steady plunge since day one. And the world is now in the worst shape it’s ever been. Take a good look around you; do you really believe that’s the case?

It’s not unusual to look back upon one’s formative years with fondness. But as they age, many people grow more jaded, more disillusioned, more cynical, more bitter. And, unwilling to acknowledge that they themselves have changed, they often ascribe the difference to a worsening environment.

Today, you’ll hear many people say that “this is not the country I grew up in.” And the turning point you’ll often hear them invoke is the Sixties — more specifically, the Summer Of Love, and the few years thereafter. In their minds, everything has gone downhill since then, and we could make America Great Again if only we could return to the Ozzie and Harriet Utopia of the Fifties. But, like the Jeremiahs of ages past, they are viewing their youth through rosy glasses.  To paraphrase a popular saying, the past ain’t what it used to be — and it never was.

As it happens, I’m old enough to remember what times were like before the hippies ruined everything. And what I remember is a bit different from what the nostalgia addicts remember.

I grew up in a world where “colored people” had separate schools, courts and water fountains. Women, if they were lucky enough to get jobs as secretaries and receptionists, could maybe earn half of what men did. Gays could be arrested and beaten for their “crime”. Adults could physically and sexually abuse children with little fear of reprisal. Police brutality occurred with impunity, since nobody was recording it and posting it on YouTube. Those of us who were lucky enough to have telephones had them mounted on the wall in our homes, and we didn’t even own them; they were on loan from the phone company. “Religious freedom” meant forcing kids to pray in school — and there was plenty to pray about.

We were taught to live in constant fear that we could be annihilated at any moment by a nuclear attack from the Evil Empire of the Soviet Union; and just in case ducking and covering under our school desks didn’t protect us, we wore military-style dog tags to school so our bodies could be identified when recovered from the rubble. We knew the end of the world was surely at hand when a Catholic was elected president.

Pollution and crime were sharply on the rise. Plenty of abortions were performed, but in secret by unqualified personnel, with the result that many women and girls died horrible deaths, which were hushed up and disguised as deaths from other causes. When I was 19, men of my age not only had to register for the draft, but were drafted — it was literally the luck of the draw that kept me from being shipped off to Vietnam. In college, I knew a young man who was sentenced to 3 years in prison for possession of marijuana. Yet everyone was encouraged to smoke cigarettes, because they were just so cool.

My, but times have changed, haven’t they?

Today, while racism and racial tension certainly still exist, race relations and equality have never been better. Women can be CEOs or just about anything else they damn well want to be. Gays can get married. Religious freedom includes (in some cases at least) freedom from religion, just as the founders intended. The economy is not in the best shape it’s ever been, but it’s been much worse, even in my lifetime; in fact, it’s been improving for the past… oh, 7 years or so. We’ve had an all-volunteer military for years. Pollution has been declining for about 35 years; and contrary to the impression you might get from the media, crime has been declining steadily for at least 25 years or so.

Sure, we still have problems. And some of those problems would have been unthinkable when I was a kid. We never worried then about AIDS, overpopulation, climate change or ISIS. But every age has its own unique challenges. What’s different now is that we have the knowledge and the technology to meet those challenges head-on. In the past, all people could do when a plague or a meteor collision or an ice age occurred was pray and hope. And today, even though misinformation has an unprecedented stranglehold on the masses, courtesy of the mass media, we also have the means of counteracting it — such as, ahem, websites like this one. Not only are things better than ever in the U.S., they are, on the whole, better than ever worldwide.

In short, the better they have it, the more stridently people complain. Polls consistently show that most Americans think their country is “headed in the wrong direction”. Maybe it has been, in some ways, over the period of a year or two. But many people believe it’s been heading in the wrong direction consistently for decades. What do they want — more crime, more racism, more pollution, more war, more disease, and less freedom?

Creationism, Design and the Watchmaker Fallacy

[Image: watchmaker]

In 1802, British theologian William Paley imagined himself finding a watch on the ground while he was out for a stroll. That imaginary timepiece, though there was nothing intrinsically valuable or distinctive about it, ended up being probably the most celebrated and notorious ticker in the history of theology and philosophy. Because Paley conjured it up to make a point about what he perceived as the inescapable origin of the universe:

In crossing a heath, suppose I pitched my foot against a stone, and were asked how the stone came to be there; I might possibly answer, that, for anything I knew to the contrary, it had lain there forever: nor would it perhaps be very easy to show the absurdity of this answer. But suppose I had found a watch upon the ground, and it should be inquired how the watch happened to be in that place; I should hardly think of the answer I had before given, that for anything I knew, the watch might have always been there. … There must have existed, at some time, and at some place or other, an artificer or artificers, who formed [the watch] for the purpose which we find it actually to answer; who comprehended its construction, and designed its use … Every indication of contrivance, every manifestation of design, which existed in the watch, exists in the works of nature; with the difference, on the side of nature, of being greater or more, and that in a degree which exceeds all computation.

It’s hard to believe that an educated and intelligent individual like Paley would ever even fall for, much less contrive, such a silly and self-refuting argument. But it has shown the kind of staying power that all bad ideas have. Even today, it’s often summoned out of its crypt to defend creationism — which its promoters now believe they can sneak under the radar disguised as “intelligent design”.

The “argument from design” has been making the rounds for centuries. (Robert Frost put a sinister and ironic twist on it in his sonnet titled “Design”.) Basically, the argument from design is the belief that the universe is so complex and intricate that it could not have developed without being guided by a supreme power. This seems derived from the premise that a supreme being would be able to design a universe more complex and intricate than any universe that could develop by “chance” (if you want to think of it in those terms).

The latter is in a sense a reasonable conclusion: if there really is a supreme being, then by definition it would be able to perform feats that nothing else can — including “chance”. But even if we grant this to be true, it does not follow that the particular universe we live in would require a designer. We simply have no way of confirming that assumption.

A related argument practiced by creationists is first cause. The reasoning is that since everything in the universe has a cause, we can trace all the causes back to a First Cause, i.e., God. But the very notion is a self-contradiction: on the one hand, everything has a cause, and yet on the other hand, there is something that presumably doesn’t. The concept of a first cause also posits a naively and drastically oversimplified model of how the universe functions — a linear construct in which A causes B and B causes C and so on. But the real universe does not operate in a straight line; it operates in an unfathomably complex web of mutual influence. To single out a “first cause” is not only impossible but pointless.

The problem with the claim that “God created the universe” isn’t that it’s inaccurate, but that it’s downright meaningless. As we mentioned before, “God” is a term subject to a broad range of definitions. And how exactly would the universe be “created”, anyway? Many people insist that the universe could not have just developed out of nothing, so it must have been created, somehow, out of… well, nothing. By a Creator who came from… well, nothing.

The creationist/design/first cause argument falls prey to a fallacy known as infinite regress: if the existence of the universe proves the existence of a creator, then the existence of a creator must prove the existence of a creator of the creator, and so on and on and on. The decision to cut off the chain after the second link, as creationists do, is purely arbitrary.

As for Paley’s notorious watch metaphor, it’s a classic false equivalence. He juxtaposes a thing that we know to be created with a thing that we don’t know to be created, and concludes that because it’s possible to find characteristics they have in common, both must be created. What he’s overlooking is that their differences are much more significant than their similarities — a problem he even stubbed his own toe on by comparing the watch to a rock.

The essential property that is intended to make the analogy work is the property of irreducibility. Remove one of the tiny gears (each of which was specially crafted for a specific, identifiable purpose) from a watch and you significantly impair or terminate its functioning. That certainly is a strong indication that the watch was designed and manufactured rather than “just happened”.

The universe, however, is quite another matter. On every level from the most microcosmic to the most macrocosmic, the universe is in an unending state of flux, with elements constantly growing old, dying out, being replaced, or just being lost altogether. People lose teeth, hair and organs and life goes on. Indeed, not only do we have wisdom teeth, tonsils and appendices removed, but these organs appear to be useless at best. Countless entire species have become extinct. Stars go nova, obliterating everything in the neighborhood.

While each of these events might make a difference at some level, the universe takes a licking and keeps on ticking. Can the same be said for Paley’s watch?

Ultimately, if you choose to believe in creationism, you must do so on the basis of faith alone. There is no line of reasoning that will make it more logical than any alternative(s). This need not be a problem as long as you keep religion in its proper sphere of cognition. It is when we try to substitute dogma for science that we run into problems of cosmic proportions.

It may be okay to say that “God created the universe”, particularly since nobody will even know what that means. But that does not mean it’s okay to teach kids that the earth is 6000 years old. It isn’t, and it isn’t.


Singular Proof

[Image: Roger Bannister]

If you read the prior post on cherry picking, you may recall that I cautioned about seizing upon isolated incidents as “proof” of something. Now, however, I’d like to draw your attention to the fact that an isolated incident is always proof of something. A contradiction? Not at all. It just depends on what it is you’re trying to prove.

Consider a popular example: the use of guns for self-defense, commonly called defensive gun use (DGU). You’ve no doubt heard about plenty of these incidents; chances are your gun-loving friends will Facebook or Tweet every time such an incident drifts into their crosshairs, perhaps passing it along with the comment that “this proves that guns make us safer”.

But does it? “Us” in this case is a very inclusive pronoun. In order to prove that guns make “us” safer overall, you’d have to demonstrate that they effect a net reduction in crime — i.e., that they are used to prevent more crimes than they are used to commit. Nobody has ever been able to demonstrate anything even close to this.

For that matter, these incidents don’t even prove that guns make the individuals involved safer. They do prove that in some cases, guns can be used in self-defense. And that is essentially the value of an isolated incident: it demonstrates that such an occurrence is a possibility. When Roger Bannister ran the first recorded 4-minute mile in 1954, he proved once and for all that such a feat can be done. He didn’t prove that anyone could do it, but he did prove that it was at least humanly possible. (Since then, it’s been accomplished many times.)

Another thing about the singular proof, then, is that it also disproves something — Bannister disproved the commonly accepted notion that running the 4-minute mile was unattainable. He didn’t disprove a generality (running the 4-minute mile is very difficult) but he did disprove an absolutism (the 4-minute mile is impossible).

It’s been established that tobacco and excessive alcohol are harmful to health. Yet every now and then, you’ll hear about a man who’s lived a century even though he’s savored a cigar and a glass of whisky every day for years. Has he proved that medical science is wrong? Or that — as some of these superannuated persons maintain — these vices actually promote longevity?

Of course not. He disproves the absolutism that tobacco and alcohol invariably lead to premature death. And he proves that some regular smokers and drinkers can live long lives. But he’s done nothing to discredit the science that says such indulgences are harmful in general.

Scientists, however, are sometimes scornful of anecdotal evidence, declaring it to be totally worthless. Which is ironic, given how dependent they are on it. A scientific experiment is preceded by a hypothesis. And where does the hypothesis come from? Anecdotal evidence, quite often. Like the rest of us, scientists exercise inductive reasoning: they notice specific events and extrapolate from them that there might be a general pattern. Unlike the rest of us, they undertake methodical tests in an effort to prove this hypothesis — or hopefully, an effort to disprove it, since that’s really the only way to accomplish either proof or disproof. And how do they do this? By collecting more anecdotes, either in a laboratory or in the wild. But this isn’t considered anecdotal evidence, since the events are collected systematically rather than haphazardly.

Still, every anecdote does prove something. The trick is interpreting correctly what it proves, rather than being led over the lemming ledge of unwarranted conclusion.

(For more on the role of anecdotes in science and medicine, see here and here.)