What are some exceptions to general statements that you find noteworthy?
For example, nearly every material in its fluid state shrinks in volume as it gets colder and solidifies, with a few exceptions, one of them being water, which expands from 4 °C down to freezing, and again when it freezes into ice. This is also why water freezes from the top instead of the bottom. This is relevant because it means lakes and oceans freeze from the top, and the ice keeps the water below from freezing, making things easier for the aquatic life underneath.
While this is the only example I thought of, something doesn't necessarily need to be important to be "noteworthy", it can also be amusing or unintuitive.
Many logical fallacies have noteworthy exceptions that are ignored by "fallacy bullies" all over the internet. For example, an appeal to authority is only fallacious when the authority is irrelevant to the topic. The sentence "according to Stephen Hawking, the concept of time has no meaning before the beginning of the universe" is not fallacious, because Hawking was superbly qualified in theoretical physics.
While insults are not generally a good thing to have in any discussion, I think people overuse 'ad hominem' as well. Technically the fallacy is "you are an idiot therefore you must be wrong," not "you are wrong, plus you are an idiot". The latter's rude, but it's not, uh, intrinsically illogical.
It's almost never necessary to know "computer science" topics - data structures and algorithms, things like the lookup characteristics of hash tables, what a trie is, how to reverse a linked list, etc. - in daily life. We simply don't work with enough information on a day-to-day basis (without using software that's already handled this for us) for an O(n) algorithm to do much better than an O(n^2) or even higher-degree algorithm, especially if the slower-growing one has a high initial cost.
However, binary search vs. linear search can cut a search across a thousand ordered items - say, looking through a thousand colors to find the one that matches - to at most ten comparisons, since 2^10 = 1024. Rather than, on average, having to compare your chosen color against 500 paint swatches, if they're in order you could find it in roughly ten comparisons.
Really useful!
An example of when choice of algorithm actually matters for small data is the firmware in one of Prusa's 3D printers. They used bubble sort to sort the file listing from an SD card, to save on code size. That's normally fast enough even on the 8-bit microcontroller they use, but it slowed down for long file lists, so they switched to… Shell sort, for a 10x speedup. (They're still optimizing for code space.)
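For anyone curious, Shell sort is essentially insertion sort run over progressively smaller "gaps". Here's a minimal Python sketch of the idea, just for illustration - their actual firmware is C++ on a microcontroller, and I don't know which gap sequence they used:

```python
# Shell sort: insertion sort over elements `gap` apart, with the gap
# shrinking each pass. By the final pass (gap == 1, plain insertion
# sort) the list is nearly sorted, so little work remains.
def shell_sort(items):
    n = len(items)
    gap = n // 2
    while gap > 0:
        for i in range(gap, n):
            current = items[i]
            j = i
            # Shift larger gap-separated elements right to make room.
            while j >= gap and items[j - gap] > current:
                items[j] = items[j - gap]
                j -= gap
            items[j] = current
        gap //= 2
    return items

# Hypothetical SD-card file listing, just to show the call.
print(shell_sort(["PART2.GCO", "BENCHY.GCO", "CAL.GCO"]))
```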
But this is because they chose such a slow processor with so little memory to begin with. In their newer printer they went to a 32-bit microcontroller which shouldn’t have such problems. It’s unlikely that your code will ever run on a system that’s that small and slow.
This is a good example! But I actually meant processing data with your brain - no external hardware involved.
Can you explain how you might do this with your brain, using your paint swatch example? It's incredibly interesting but I'm missing the link.
Okay, so let's say you have a paint color that's "light green". You don't know which of the 1000 greens in the store (probably more than a store would really have) matches it.
You could start at the lightest green, nearly white, and move forward until you reach the correct color. That would take several hundred comparisons on average.
Or, assuming they're already sorted in the store, you could start at the middle color. If that color is too dark, you'd go halfway toward the light end and compare there. If that's still too dark, go halfway again toward the light end; if it's too light instead, go halfway back toward the middle. Repeat at most ~10 times and you've got your color.
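In code, that same halving procedure looks roughly like this - a minimal Python sketch, where the lightness numbers are made-up stand-ins for swatches you'd really compare by eye:

```python
# Binary search over swatches sorted from lightest to darkest.
# Returns the matching index and how many comparisons it took.
def find_swatch(swatches, target):
    lo, hi = 0, len(swatches) - 1
    comparisons = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if swatches[mid] == target:
            return mid, comparisons
        elif swatches[mid] > target:   # too dark: move toward the light end
            hi = mid - 1
        else:                          # too light: move toward the dark end
            lo = mid + 1
    return None, comparisons

swatches = list(range(1000))        # stand-in for 1000 sorted greens
print(find_swatch(swatches, 137))   # finds it in at most 10 comparisons
```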
Thank you for the explanation. That makes it very easy to understand!
I'm not sure I entirely agree. Are you talking about CS students or the general population?
For CS students, I think this is true of the rote-memorization version, but teaching the general principle of complexity classes and O(_) notation is important in case you do end up coming up with your own algorithm, or re-implementing an existing one. Just being mindful of the topic can often speed things up a fair bit and prevent some rookie mistakes. Whether those rookie mistakes end up being relevant is another topic, though; it's very much up to the domain whether a 1000x speedup is even worth it.
For the general population, I agree; but I don't think they're being taught this, right?
Right! I think they should be taught about binary search.
I agree; I'll usually look at making slow operations concurrent before reaching for better algorithms.
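For what it's worth, for I/O-bound slow operations that usually looks something like this for me - a minimal Python sketch, where fetch_url and the URLs are hypothetical stand-ins:

```python
# If the slow part is waiting on I/O (network, disk), a thread pool
# often buys more than a cleverer algorithm would.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def fetch_url(url):
    with urlopen(url) as response:
        return response.read()

urls = ["https://example.com/a", "https://example.com/b"]
with ThreadPoolExecutor(max_workers=8) as pool:
    # Runs the fetches concurrently; results come back in input order.
    pages = list(pool.map(fetch_url, urls))
```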
https://www.youtube.com/watch?v=Ccoj5lhLmSQ
/noise-joke ;)