8 votes

A mathematician's lament

9 comments

  1. [9]
    skybrian

    To be fair, there are basic things you do want kids to know, like how money works. The analogy given is imperfect because unlike music or art, sometimes math is just the means to an end. But it’s true that appreciation is important too.

    As a computer programmer, I sometimes get frustrated with higher math because in the end I would like to write programs to do calculations. Earlier this year I picked up a book on geometric algebra that completely skipped anything about doing calculations, because supposedly you could just use a library written by someone else. (Eventually I concluded that for my purposes, quaternions are good enough.)
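
    For anyone curious, the kind of calculation I have in mind is small enough to write by hand. Here's a rough sketch of my own (not from that book or any particular library) of rotating a 3D vector with a unit quaternion:

    ```haskell
    -- Quaternion stored as (w, x, y, z). Purely illustrative, no error handling.
    data Quat = Quat Double Double Double Double

    -- Hamilton product of two quaternions.
    qMul :: Quat -> Quat -> Quat
    qMul (Quat w1 x1 y1 z1) (Quat w2 x2 y2 z2) = Quat
      (w1*w2 - x1*x2 - y1*y2 - z1*z2)
      (w1*x2 + x1*w2 + y1*z2 - z1*y2)
      (w1*y2 - x1*z2 + y1*w2 + z1*x2)
      (w1*z2 + x1*y2 - y1*x2 + z1*w2)

    qConj :: Quat -> Quat
    qConj (Quat w x y z) = Quat w (-x) (-y) (-z)

    -- Rotate v by a unit quaternion q using  v' = q * v * conj q.
    rotate :: Quat -> (Double, Double, Double) -> (Double, Double, Double)
    rotate q (x, y, z) =
      let Quat _ x' y' z' = q `qMul` Quat 0 x y z `qMul` qConj q
      in  (x', y', z')
    ```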

    1 vote
    1. [2]
      blitz

      > The analogy given is imperfect because unlike music or art, sometimes math is just the means to an end.

      I think he addresses this pretty well. We want kids to be able to read nonfiction books for their education but we introduce them to fiction and poetry too.

      4 votes
    2. [6]
      bloup

      Math should not be seen as only a "means to an end", and the fact that people see it that way really illustrates what a bad understanding society has of mathematics. Any time you make some kind of assumption and then try to reason your way to some kind of useful conclusion, you are doing math. Math is not being a human calculator. Math is about proving things. But basically none of math education is devoted to this fundamental aspect of math, preferring instead to drill algorithms and pattern-matching techniques into children's heads without any understanding of why these things even work in the first place. Maybe in geometry class you do some kind of weird 'proof' exercise where you draw a table and write out statements and justifications, but that is basically it. Nothing more than a footnote.

      People come out of school thinking "doing math" means finding some kind of quantity, which is such a shallow and limiting understanding of the subject. People think "why do I even need to learn math when I have a calculator in my pocket literally all the time" and not "now that I have a calculator in my pocket literally all the time and no longer need to memorize tons of different algorithms, can we actually learn real math?"

      We are depriving people of something that has a lot of value here. Being good at math just means you are good at reasoning your way through basically any situation you may encounter. But it takes practice and instruction.

      By the way, programs are proofs. This is my favorite result in all of computer science and mathematics.

      4 votes
      1. [5]
        skybrian

        > Any time you make some kind of assumption, and then try to reason your way to some kind of useful conclusion, you are doing math.

        This seems like an over-broad claim to me? Many forms of thinking aren’t straightforwardly modeled using any kind of formal logic, because they aren’t about propositions that are unambiguously true or false, but rather something more nebulous. (See David Chapman on the rarity of absolute truths.) In Chapman’s terminology, I would say that most thinking is reasonable but not rational, because abstraction is a powerful but specialized tool.

        As an aside, I do agree that the “programs are proofs” result is pretty neat, but have doubts about its importance. As I understand it, for a function to be an interesting proof, it needs to return a complicated type in a language specialized for proofs. There needs to be some doubt over whether any instance of that type exists, and usually there isn’t - most everyday types used in programming languages are trivially inhabited by constructing an instance directly. A function that returns an integer just proves that an integer exists, which we already knew.

        That is, it proves it assuming the function returns at all. A useful language for proofs needs functions that are guaranteed to terminate, so you can show that the return value exists without running the program. But ordinarily we use languages that don’t guarantee termination and show that functions can terminate by running them and seeing that they do return an answer, since we care about what the answer is, not just that it exists.
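
        To make that concrete, here's a sketch of my own in ordinary Haskell (which is not a proof language):

        ```haskell
        -- Reading a type as a proposition and a total function as its proof.
        -- The type (a, b) -> a says "A and B implies A"; this function proves it.
        andElimLeft :: (a, b) -> a
        andElimLeft (x, _) = x

        -- Int is trivially inhabited, so a function returning Int only "proves"
        -- that some integer exists, which we already knew.
        someInt :: Int
        someInt = 42

        -- Without a termination guarantee, anything can be "proved": this
        -- typechecks at every type but never returns a value.
        bogus :: a
        bogus = bogus
        ```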

        2 votes
        1. [2]
          psi

          > Many forms of thinking aren’t usefully modeled using any kind of formal logic, because they aren’t about propositions that are unambiguously true or false, but rather something more nebulous.

          On the contrary, Bayesian probability can be thought of as an extension of propositional logic (rather than a proposition being true/false, it can take intermediate values). Since most people have intuitive a priori beliefs, understanding Bayesian inference (at least in principle) can guide people to make better decisions.

          For example, I've heard a few people express the belief that they had COVID-19 back in January, based on having experienced some general unwellness, usually accompanied by a fever and cough. However, despite these symptoms being compatible with COVID-19, relatively few people had the virus at that time, and therefore it's unlikely the malaise was actually COVID-19 (more likely, it was something else that presented similarly). In Bayesian parlance, we say the prior probability of having caught COVID-19 in January is low; consequently, the posterior probability of actually having caught it should be appropriately penalized.
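
          To make the shape of that update explicit, here's a toy calculation; every number in it is made up purely for illustration, not real data:

          ```haskell
          -- Bayes' rule: P(covid | symptoms) =
          --   P(symptoms | covid) * P(covid) / P(symptoms)
          prior :: Double
          prior = 0.001          -- assumed P(covid in January): prevalence was low

          pSymGivenCovid :: Double
          pSymGivenCovid = 0.8   -- assumed P(fever and cough | covid)

          pSymGivenOther :: Double
          pSymGivenOther = 0.05  -- assumed P(fever and cough | no covid): colds, flu

          posterior :: Double
          posterior =
            pSymGivenCovid * prior
              / (pSymGivenCovid * prior + pSymGivenOther * (1 - prior))

          main :: IO ()
          main = print posterior  -- roughly 0.016: still unlikely, despite the symptoms
          ```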

          For a more concrete example, see this.

          For an example of how people misidentify their priors, see here.

          3 votes
          1. skybrian

            I think you’ve given a good example of a mathematical metaphor. There isn’t actual math here, just an intuitive judgement justified using mathematical concepts (like prior and posterior probability) with informal reasoning.

            Even those of us who talk about Bayesian probability and priors a lot aren’t normally doing any probability calculations in our heads, or making a spreadsheet. If you don’t have any numbers to plug in, it doesn’t make sense to do this kind of calculation.

            Contrast with a gambler playing poker or blackjack, where it actually makes sense to learn to count cards and the like. This works because games of chance are specifically designed to make probability work. But the real world is often more nebulous than a casino.

            I think this is a good reason to learn something about the math, though, to understand such metaphors, even if you rarely use it for calculations.

            2 votes
        2. [2]
          bloup

          I actually don't think it's an overbroad statement at all, and I would also argue that mathematics does not deal in "absolute truth", but rather "conditional truth". It isn't even possible to prove that the "absolute truths" we take for granted in mathematics are consistent with each other, so in a way, even in math, we are never dealing with "absolute truths"; we are only making claims about things on the supposition that our assumptions are true (even though we can never say so for sure). If, for example, you were making a decision and had to determine the consequences of two possible choices, you would have to make some assumptions. Now, you could just be wrong or really bad at making reasonable assumptions, and being good at math won't help you there at all. But if you are good at making reasonable assumptions, then being good at math will only help you figure out reasonable conclusions.

          Also, there are concrete examples of trying to formalize the "rarity of absolute truth" when it comes to certain propositions in mathematics. You might be interested in checking out multi-valued logics or fuzzy logic.
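
          For a taste of what that looks like, here is a toy version using the common min/max (Zadeh) operators, which is just one of several ways to set such a logic up:

          ```haskell
          -- Truth as a degree in [0, 1] rather than a Bool.
          type Truth = Double

          fAnd, fOr :: Truth -> Truth -> Truth
          fAnd = min
          fOr  = max

          fNot :: Truth -> Truth
          fNot t = 1 - t

          -- "It is warm and not raining", with graded truth values:
          example :: Truth
          example = 0.7 `fAnd` fNot 0.2   -- min 0.7 0.8 = 0.7
          ```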

          As for the "proofs are programs" thing, in my opinion its greatest utility is that it means we can develop programming languages that are specifically designed for proving statements. But it also (in my opinion quite importantly) illustrates that when you are writing literally any program, you are actually doing math the entire time! No matter what the program does!
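
          As a tiny taste of such a language (Lean, just as one example), the same term reads as both a program and a proof:

          ```lean
          -- A term of type A → B → A is simultaneously a function and a proof
          -- of the proposition "A implies (B implies A)".
          example (A B : Prop) : A → B → A :=
            fun a _ => a
          ```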

          1. skybrian

            By “absolute truth” I meant something else; perhaps “unambiguous truth” would be a better way to put it. Conditional logic doesn’t really touch on the things that cause statements in everyday language to be somewhat nebulous. I’d encourage reading more of Chapman’s unfinished book. The first part seems to be close to done and does touch on multi-valued logic.

            I agree that computer programming is mostly math, and furthermore, ordinary programming is close to logic, with its if statements and boolean expressions. It’s a realm that was specifically constructed to allow logic to work. The difficulty comes from imperfectly modeling the real world. See "Falsehoods Programmers Believe About Names" for the kinds of issues that come up.

            Machine learning is a somewhat successful way to handle nebulous data within a computer, and it’s all math too, but notice that logic isn’t very useful for describing how image recognition or machine translation happens.