I think you should learn probability and statistics.

Yes, I think *you* should learn stats.

Even if you’re not good at math. *Especially* if you’re not good at math.

**You can learn the core ideas without needing any significant math background**. This will be a list of posts for learning statistics, regardless of your background. This is for everyone.

**Why?**

*Axiom*, from the Greek, “that which commends itself as evident”. I’m going to list some statements here which seem, if not quite INDISPUTABLE FACTS OF THE UNIVERSE, then at least highly plausible. I think these statements, taken together, make a strong case for the study of statistics, even without the math.

**Axiom 1: Our beliefs should be based on evidence.**

Yes? Yes.

**Axiom 2: The world is big.**

The world is grey-matter-dribbling-out-the-ears complex.

There are over three hundred million people in the United States alone. There are over seven billion people on the globe.

Our world is not just the actions of individuals on islands. The world is the *interaction* of not just millions, but billions of people. Interactions are insane on a level that straight actions can’t match. There are only 52 cards in a standard deck, but those 52 cards can be shuffled and arranged into 80658175170943878571660636856403766975289505440883277824000000000000 different combinations. That’s around the same as the number of atoms in our galaxy.
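Don’t take my word on that monster of a number, by the way. If you have Python handy, you can check it yourself: the count of orderings of a deck is 52 factorial.

```python
import math

# Number of distinct orderings of a standard 52-card deck: 52!
arrangements = math.factorial(52)
print(arrangements)
print(len(str(arrangements)))  # 68 digits, on the order of 8 x 10^67
```

Sixty-eight digits, which really is in the neighborhood of common estimates for the number of atoms in the Milky Way (very roughly 10^68).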

Complexity is not just about the number of pieces that you start with, but the way those pieces can interact with each other. It’s not just that we can fail to predict how tiny things interact with each other, it’s that we must *necessarily* fail to predict these interactions because they work on a scale that we literally cannot conceive. We have to wait to see what has happened, and only then try to untangle the pieces. If we even can.

What all this means, fundamentally, is that a complex system of interacting pieces — like our own society — might function in ways entirely contrary to our expectations or our ideological predispositions. And so we need to look at evidence and to evaluate it fairly.

**Axiom 2.1: We are small. Our brains are tiny. We can be wrong about shit.**

We need some way to be able to change our mind about how the world works, based on the evidence we see. And that evidence has become… intense.

**Axiom 2.2: A complex world creates more data than the tiny human mind can comprehend on its own.**

I have on my computer a dataset that is 60 gigs. Which is big. Comparatively. About 60 billion bytes. Billion with a “b”. Google works with bigger. Amazon. Apple. This would be tiny compared to a dataset used by a modern we-control-the-internet corporation on their own supercomputers. But an off-the-shelf desktop, the one that probably sits on your desk, wouldn’t necessarily have the memory to handle it. Unless you’re a 1337 gamer, yo.

Our own laptops couldn’t handle this dataset. We had to shop for a desktop that would. Then upgrade it.

If we want our beliefs to be based on evidence — and we should — then we should realize that the raw data that makes up that evidence is already at a size that dwarfs imagination. And getting bigger. All the time. We should realize, too, that there are strict logical rules that outline how we should work with data, no matter how large it is. How to approach it honestly and reasonably.

Those rules can be learned.

And those rules are useful for trying to understand the world. Useful for expert and non-expert alike: for the practitioner in the social sciences, and for the professional in the humanities who hasn’t taken a math class in thirty years.

**Argument: Statistics is just applied logic. It’s not necessary to know “the math” to know how the logic works.**

I have evidence for this argument. You already know how logic works.

When I see people in the humanities arguing for the importance of their discipline against the short-sighted naysayers, one of the (many, many, many) arguments tends to be that the humanities teach critical thinking, how to construct an argument, how to argue persuasively and well — in short, the humanities can teach us how to build powerful, logical arguments. I’m an economist, but I have both undergraduate and graduate degrees in the humanities. I can say that this fits my own experience exactly.

And fundamentally, that’s the purpose behind stats, too. You don’t need to be able to derive every line of a proof to know the reasoning behind what’s going on. The math can be tedious. In the standard presentation, the language is linear algebra, the proofs based on calculus. I can’t remember how a Lebesgue integral works myself. (It’s… sideways? Instead of up-and-down?)

Doesn’t matter.

Here’s more evidence in favor of this argument: One of the best social and medical science bloggers on the intertubes dislikes math. He likes puns and hates integrals. And yet he can summarize statistical research in the social sciences better than most direct practitioners in those subjects, without being particularly good at math, without even liking math.

His secret is knowing the logic behind the statistics. He doesn’t do the proofs, doesn’t need the proofs. He was a philosophy major as an undergrad (humanities!). He knows critical thinking. He knows the underlying purpose of what all these funny stats things are for, and he uses that underlying purpose to talk, intelligently and convincingly, about modern research. (I might note here that he is a psychiatrist, meaning that he is a medical doctor. Practicing doctors tend to be shit at statistics, but somehow he is an exception.)

He can do this, because he can logic and he put in the investment to learn. He thought it was important enough to learn this stuff even with a weak math background. He was right! You don’t need to fuck around with algebraic manipulations in order to be able to do basic logic in basic English. Can you do an Aristotelian syllogism? All men are mortal. Socrates is a man. Therefore…

That’s it. That’s the mental toolset you need to be able to decipher the statistics in pretty much any empirical research article you pick up. You don’t need to know how to do an Edgeworth Expansion. You don’t need to know what an Edgeworth Expansion even is if you just want an introduction to the world.

Just learn the logic behind the ideas, and you can follow the discussion. You can read the vast majority of applied social science research papers and correctly interpret them.

**Axiom 3: We can be certain about the logic of uncertainty.**

There’s a tension here that can make people uncomfortable.

The world is big. We are small. Our insignificant brains can be mistaken about many things. Some people go full nihilist at this point, listening to the Cure on loop as they ponder the futility of absolute certainty while crying themselves to sleep under their jet-black vampyre blankets, with embroidered coffins on the hem.

I don’t recommend this route. A sixteen-year-old might pull it off, but eventually, most of us have to take off that eyeliner and face the sunlight.

I recommend instead relying on basic logic. Euclid wrote his *Elements* more than 2,000 years ago. And despite all the changes in the world since the Greek classical era — despite the shifting currents in literally every other field of human thought, from philosophy to political organization to economic structure — every theorem that was proven in that book remains relevant today. (He missed an axiom about triangles, but the edifice remains intact.)

This is the strength of the axiomatic method. If you accept the premises, then you must necessarily accept the conclusions.

Modern mathematicians have cleaned up the axioms from Euclid’s time. And physicists have informed us that the space that we live in, the geometry of our universe, is not actually Euclidean. Space is curved, not straight, as Einstein so helpfully informed us. Nevertheless, the theorems of Euclidean geometry remain inevitable consequences of the original axioms. The theorems remain sound, uncontroversial and uncontested, after more than two millennia.

There is, in fact, an analogous axiomatic method that corresponds to the study of uncertainty. There is a way to be certain — logically, “mathematically” certain — about the way we deal with uncertainty. No need for black eyeliner. Start with basic ideas that strike us as “true”, and then basic logic can take over. And we already know how to logic. We can’t be certain about the conclusions we make about the world, but we can have logical confidence in how we choose to approach that uncertainty.
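To put a name on it (the post doesn’t spell this out, so take it as my own gloss): the usual starting point is the small set of axioms attributed to Kolmogorov. Stated loosely:

```latex
% Kolmogorov's axioms of probability, stated loosely (my gloss, not the author's wording)
\begin{align*}
&P(A) \ge 0                 && \text{a probability is never negative} \\
&P(\Omega) = 1              && \text{something in the space of possibilities happens} \\
&P(A \cup B) = P(A) + P(B)  && \text{for mutually exclusive } A \text{ and } B
\end{align*}
```

Accept those three, and the rest of the machinery follows by the same if-premises-then-conclusions logic that Euclid used.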

**Recommendation: You should be part of this discussion.**

For a lot of topics, we have no choice but to rely on the authority of experts. That will never go away. I don’t think I’ll ever know the math of general relativity (pretty sure you need “tensors” to work with it…), so I’m just going to trust the people who claim to know what Einstein was talking about. That’s good. That’s fine. My GPS works despite the relativistic effects of satellites in orbit. I’m cool with that. A lot of topics must necessarily be like this.

Statistics doesn’t have to be one of them.

This is because statistics can be the gateway to *whichever topic* you find most important. It’s like a universal key. There are rules about how to change our mind based on new evidence. And we can pick up these rules. We can apply them. We can use them personally. We cannot learn about literally every subject, but we can acquire the key that helps unlock whichever subjects are most essential for us in our own lives. We can learn to get our hands dirty with subjects that were formerly strange and foreign and completely out of reach. We can become acquainted with the *logic* and *language* of working with data. I recommend it. It’s good, it’s healthy, it’s even fun when you approach it right. Stats doesn’t have to be “math” in the sense of the dismal teachers of your nightmarish childhood memories. Instead, stats can just be an extension of logic, accompanied by an extended vocabulary lesson. This is a language that any curious person can pick up. Again: regardless of background.
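One of those rules deserves a spoiler, because it captures the whole “change your mind based on new evidence” idea in a single line: Bayes’ rule. Here’s a minimal sketch in Python, with every number made up purely for illustration:

```python
# A hypothetical example of updating a belief on new evidence,
# using Bayes' rule with a made-up diagnostic test.
# P(condition | positive) = P(positive | condition) * P(condition) / P(positive)

prior = 0.01           # assume 1% of people have the condition
sensitivity = 0.95     # P(test positive | condition)
false_positive = 0.05  # P(test positive | no condition)

p_positive = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / p_positive

print(round(posterior, 3))  # roughly 0.161
```

Even with a test that catches 95% of cases, a positive result for a rare condition only moves you to about a 16% chance. Seeing *why* that happens is pure logic, no calculus required, and it’s exactly the kind of payoff I’m talking about.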

There is a huge stats conversation going on around the world, and yet many people feel excluded from that because they feel like they don’t have the background.

Not a problem! Join the conversation! I promise, it’s just basic logic applied to spreadsheets. All you have to do is learn the silly vocabulary and a few important symbols. It’s an investment to pick up the language, but the rewards are worth it. (Fair warning: the vocabulary can be BEYOND silly, but that’s the language they speak.)

**Axiom 4: There are problems in this universe where it is crucially important to get the right answer. There are problems in our own lives where it is crucially important to get the right answer.**

I would not, originally, have considered starting (or restarting) a blog with this as the slogan. This sounds like a platitude, a truism so banal and boring and uncontroversial that it’s not worth any explicit statement, just like saying “basic logic works” does not seem to merit being said out loud.

And yet.

I was having an online conversation once about probability theory. The discussion was with a person who claimed to have thought heavily about probability, not just the mathematics but the underlying philosophical issues. I was trying to explain the importance of the qualitative “rules of plausibility”, although I wouldn’t have phrased it that way at the time. And the person I was talking with made the comment — paraphrasing here — that he didn’t see the importance of these qualitative rules of plausibility. After all, he reasoned, what if you apply these rules of plausibility to a situation that doesn’t matter? Why would they matter then?

After some bewildered consideration of that comment, I tried to write a reasoned response. I acknowledged that there would be no point expending effort in ranking relative plausibilities, if the outcome of the ranking were not important. However, I then suggested the somewhat different task of applying the rules of probability to a situation that he actually found *important*, a situation where he did not know the absolute truth of the matter, but had to make a decision under some level of uncertainty. (The world is big. We are small. There are many things we do not know.) I suggested that rigorous thinking about probabilities was desirable in exactly those cases where it would be critical to him, even essential, that he got the answer right, when he was ultimately ignorant of what would happen.

This same person, who previously claimed deep consideration of these issues, declined to respond to this point, or any other point I made. Conversation over. I’m still not sure what to make of that. But that conversation, among others, is a source of this current emphasis. I understand now, finally, that this is a point that must actually be made explicitly. And emphasized. Repeatedly, if necessary.

There are problems in the world that are actually important. It’s worth thinking about how to improve our chances to solve those problems.

If anybody has an objection to that, well then, I don’t really know what to say except… *I disagree.* Likewise, if I’m discussing the importance of the various “rules of plausible reasoning”, I would like to discuss the importance of these rules within the context of an important problem, one where we genuinely care about getting the answer right, especially when a mistake would put real human suffering on the line.

The world is big. We are small. We cannot be absolutely certain that we are doing the right thing, and DESPITE ALL THAT, there are still problems that are so overwhelmingly important that we really, really, really, really want to get the answer right. This is true in the political world. It’s true in the sciences, as well. The inventors of the H-bomb were certainly smart enough to create the H-bomb, but (as the saying goes, though I can’t find its source at the moment) they were definitely not smart enough to *not* make the H-bomb. So how smart were they? Wouldn’t it be a better world if everyone clever enough to make doomsday devices were also rational enough to NOT make them? There are problems in this universe where it is crucially important to get the right answer. I would say the inventors of thermonuclear explosives didn’t manage that. They solved the physics problem, and thereby spectacularly failed to solve the human problem.

And of course, it’s not just decisions on this scale that we need to get right. It’s also true that there are decisions that we need to make in our own personal lives, maybe not the world-shattering decisions of nuclear physicists, but nevertheless crucial for our own and our family’s happiness. It’s important to get those decisions right, too.

So now, I’m stating that as an axiom up front. *There is shit that we want to get right.* How to improve our chances of getting this stuff right is worth thinking about.