One of the surprising (at least to me) consequences of the fall of Twitter is the rise of LinkedIn as a social media site. I saw some interesting posts I wanted to call attention to: First, Simon W…
I can explain in basic terms what is happening there. Does that help anybody?
Really depends on where the bug lives.
I would argue that it doesn’t because almost everyone writes code in higher level languages.
Most people write mediocre code. A lot of people write shit code. One reason a particular application or function runs faster than another comes down to how the high-level language is compiled into assembly. Understanding how higher-level languages translate down into lower-level logic helps reveal the inefficient points in the code.
Just from a Big-O notation level, knowing when you’ve moved yourself from O(n log n) to O(n²) complexity is critical to writing efficient code. Knowing when you’re running into caching issues and butting up against processing limits informs how you delegate system resources. This doesn’t even have to go all the way down to programming, either. A classic problem in old Excel and Notepad was excess text preventing you from even opening the files properly. Understanding the underlying limits of your system is fundamental to using it properly.
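As a sketch of that complexity jump (the function names here are my own, Python just for concreteness): checking a list for duplicates pair-by-pair is O(n²), while sorting first brings the same job down to O(n log n).

```python
def has_duplicate_quadratic(items):
    # Compare every pair: roughly n*(n-1)/2 comparisons, i.e. O(n^2).
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_sorted(items):
    # Sort once (O(n log n)); duplicates then sit next to each other,
    # so a single adjacent-pair scan (O(n)) finishes the job.
    s = sorted(items)
    return any(a == b for a, b in zip(s, s[1:]))
```

Both return the same answers on any input; only the second stays usable as the input grows.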
Similarly, I could explain to you how long division works, but the next time you need to divide two numbers you’re still going to reach for a calculator instead of a pencil and paper.
Knowing how to do long division is useful for validating the results of a calculator. People mistype values all the time, and whether they take the result at face value or double-check their work hinges on their ability to intuit whether the result matches their expectations. If I think I typed 4/5 into a calculator and get back 1.2, I know I made a mistake without having to know the true answer.
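That sanity check can even be mechanized: rather than trusting a reported quotient, verify it through the inverse operation. A minimal sketch (the function name and tolerance are my own):

```python
def division_looks_right(a, b, reported, rel_tol=1e-9):
    # Independent validation: multiply the reported quotient back by
    # the divisor and compare against the dividend, instead of simply
    # recomputing the division and trusting it a second time.
    return abs(reported * b - a) <= rel_tol * max(abs(a), 1.0)

# 4/5 reported as 0.8 checks out; reported as 1.2 it does not,
# because 1.2 * 5 = 6, not 4.
```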
One of the cruelest tricks in the math-exam playbook is to include mistyped solutions among the multiple-choice options.
What then is the point of lamenting the loss of knowledge that no one uses directly?
It’s not lamenting the loss of knowledge, but the inability to independently validate truth.
Without an underlying understanding of a system, what you have isn’t a technology but a religion.