  • Showing only topics with the tag "statistics".
    1. Monocausality bias, essentialism, modernist grand narratives, and the awesomeness of statistical uncertainty

      This is a "shower thought" more than a properly empirically researched idea, so it is presented without any citations. This lack of references is also a reference to many modernist philosophers, whom I dearly appreciate.

      Modernist theories famously tried to get at "the truth behind everything". For example, the majority of both pro- and anti-capitalists thought that history progressed along a linear track, and that there was such a thing as an end of history. So they tried to find the driving force of history. Famously, Marx claimed to have found it in historical materialism. Similarly, many pro-capitalists declared The End of History when the USSR fell.

      Both of these claims rested on the idea that a single mechanism was behind the progress of history, and therefore behind almost everything.

      It is my thesis that this was and is an extension of essentialist thinking. Such a way of thinking looked for "the essence" of the object of study, because it assumed a singular essence drove the object to behave the way it did. There were no multiple causes, only a single cause; if you could find it, you could explain the object in its entirety.

      Modernist philosophers updated this idea a bit. They no longer looked for a Platonic idea, for example, but for "the drive behind the object". While they were more materialist, it was still a quasi-metaphysical endeavor.

      I'll cite Marx's historical materialism again, because it's the modernist narrative I'm most familiar with: simply put, it was such an attempt. While the historical materialist narrative touched on many great things about humanity (e.g. the plasticity of "human nature", the dependence of culture on material conditions), it overreached and overreduced history to a single mechanism. It seemingly recognized the role of other mechanisms, but decidedly explained away their importance in contrast to what Marx saw as "productive forces".

      This was an extension of Hegelian dialectics, but reversed. Hegel assumed thought drove material changes; Marx flipped this over. However, both remained highly metaphysical, highly essentialist.

      Essentialism's mistake, in this context, is not only that it is metaphysical; it's also that it reduces the object of study to a monocausal explanation. It looks for only one cause. However, as the advance of scientific, and most specifically statistical, knowledge shows, complex phenomena always have multiple causes.

      This revolution in thinking was a great attack on modernist and all preceding grand narratives. Statistics was especially important here. The more an explanation (any explanation) was tested in scientific contexts, the more apparent it became that no single cause could explain everything. Never mind everything: as both natural and social scientists became aware, most of the time a single cause couldn't even explain most (>50%) of the variation seen in a study.

      Another result of statistical thinking, if one is willing to consider all its implications, is that uncertainty is an inherent part of everything we do and explain. There is no epistemic certainty, nothing we can know for certain. So everything is always, at some level, a working hypothesis. This doesn't mean that everything is equally plausible, but that we can never be 100% certain about our explanations, neither in science nor in anything else.

      Why is this so? Because inferential statistics is structured to quantify the uncertainty of the inferences we draw from our observations. In short, it always assumes there are "error bars", or something of equivalent function.
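      To make the "error bars" point concrete, here is a minimal sketch (simulated data, standard library only) of an inferential estimate reported together with its uncertainty rather than as a bare number:

```python
import random
import statistics

# Simulated sample; the true mean (10) and spread (2) are invented.
random.seed(42)
sample = [random.gauss(10, 2) for _ in range(100)]

mean = statistics.mean(sample)
sem = statistics.stdev(sample) / len(sample) ** 0.5  # standard error of the mean
z = statistics.NormalDist().inv_cdf(0.975)           # ~1.96 for a 95% interval
low, high = mean - z * sem, mean + z * sem

print(f"estimate: {mean:.2f}, 95% CI: ({low:.2f}, {high:.2f})")
```

      The interval is the formal version of "working hypothesis": the estimate comes packaged with an explicit statement of how wrong it could plausibly be.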

      This is the second implication of this revolution—we should be aware of uncertainty and embrace it.

      In summary, there were two important results of this revolution in thinking.

      1. Monocausality bias hinders thinking. In complex phenomena, natural or social, there are most likely multiple important drives (causes).
      2. Rejecting the inherent epistemic uncertainty of our explanations and embracing the psychological certainty of monocausal explanations would be a folly.

      Again, and I cannot stress this enough, this doesn't mean everything is equally plausible (claiming so would itself run counter to statistical thinking!). But internalizing this approach provides a great deal of mental flexibility, and it makes it much less likely that a person will seek comfort in psychologically certain, essentialist or quasi-essentialist narratives. It makes it less likely that you will fall victim to overly reductive but confident-sounding explanations.

      It also allows one to critically examine modernist and earlier explanations, both positively and negatively. Grand narratives, I think, touch on many great topics and offer insight, but they fall victim to overreductive monocausality bias. If you can separate them from that bias, you find a source of rich styles of thinking. Sociology seems to do this with thinkers such as Marx, Weber, and others.

      This, I think, is one of the greater revolutions of the "post-modern" era. Post-modern thinking is often associated with extreme skepticism, to the point of declaring everything unknowable; however, that would be reductive. In the way I described, being skeptical of such grand explanations while embracing multicausality and uncertainty is an extremely productive approach.

      This, however, does not mean essentialist, monocausal, modernist, etc. thinking is defeated and gone. As Nietzsche put it: "Lightning and thunder require time; the light of the stars requires time; deeds, though done, still require time to be seen and heard."

      Of course, despite the quote, there is nothing sure about the eventual victory of this better way of thinking. And even if it does become the dominant mode of thought, it will take a great deal of time and active struggle against old ideas and the powers that be.

      17 votes
    2. Suggestion: Show number of times a tag has been used

      Roughly knowing how many times each tag has been used would provide users actionable information if they would like to search or filter by tags.

      It might improve UX when applying tags, but might have undesirable side effects in user behavior.

      I can think of three places this might be implemented, and I don't know which, if any, we want:

      When filtering topics by tags:

      • informs users how large or small their scope is
      • this view should probably be kept somewhat up to date

      When looking at a topic's tags:

      • informs users where to start searching/filtering
      • passively builds a frame of reference for how tags are used?
      • this view could be allowed to become outdated and stale without issue

      When applying tags:

      • a more common tag might be less accurate, but it might be more helpful?
      • in the autofill issue, weighting by frequency was proposed, which is somewhat similar but more opaque
      • this should probably use fairly recent counts as well

      17 votes
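      As a rough sketch of what any of the three placements would need under the hood (the topic data and field names below are invented for illustration), tag counts are just a frequency table built over topics' tag lists:

```python
from collections import Counter

# Hypothetical topic records; real storage would be a database query,
# possibly cached and refreshed on a schedule for the "stale is fine" views.
topics = [
    {"title": "Monocausality bias ...", "tags": ["statistics", "philosophy"]},
    {"title": "Tag count suggestion", "tags": ["suggestion", "statistics"]},
    {"title": "Statistics course?", "tags": ["statistics", "recommendations"]},
]

tag_counts = Counter(tag for topic in topics for tag in topic["tags"])
print(tag_counts["statistics"])  # → 3
```

      The freshness trade-offs in the list above then map to how often this counter is recomputed: live for the filtering view, periodically for the others.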
    3. Can anyone recommend a specific type of statistics course?

      I would like to find a good Statistics course to do for myself, and also to recommend to others, down the road ... one that specifically focuses on risk, and the discrepancy between actual statistical probability vs humans' intuitive sense of risk.

      I recall a quote which The Interwebs informs me came from Albert A. Bartlett: "The Greatest Shortcoming of the Human Race Is Man’s Inability To Understand the Exponential Function".

      Alternately, Mark Twain popularized (but did not originate) the saying "There are lies, damned lies, and statistics".

      That's the kind of course I'm looking for, one that focuses on questions like how much we should actually worry about supervolcanoes, asteroid strikes, Covid 2.0, WWIII, Trump getting re-elected, etc.

      There are two parts to this. One: people often obsess (naturally, given how our brains are wired to handle risk) about a short list of risks in life that are overblown, or appear to be more of a concern than they actually are.

      The other part is that some things have a very small risk of actually happening, but, considered in conjunction with their potential consequences (asteroid strikes, WWIII, global pandemic), are still worthy of aggressive prevention efforts. People often focus on the first element (statistically unlikely) and dismiss or overlook the second (devastating consequences).
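      The two parts combine in the standard expected-loss arithmetic (all probabilities and dollar figures below are invented purely for illustration):

```python
# Expected annual loss = probability x consequence.
# A rare catastrophe can dominate a frequent nuisance.
risks = {
    "frequent, mild": (0.30, 1_000),                 # 30% chance, $1k cost
    "rare, catastrophic": (0.0001, 10_000_000_000),  # 0.01% chance, $10B cost
}

for name, (p, cost) in risks.items():
    print(f"{name}: expected annual loss = ${p * cost:,.0f}")
```

      Here the "unlikely" risk carries more than three thousand times the expected loss of the everyday one, which is why dismissing it on probability alone is the mistake described above.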

      Anyway, stuff like that ... ideally an actual, hands-on MOOC-type Statistics course, but even a good youtube video or blog article would suffice.

      As usual, thanks in advance.

      5 votes