I never memorized multiplication tables, and yet I can do such calculations quite easily.
Ah, so you count addition/subtraction on your fingers and do multiplication by iteration? Or do you mean that you didn't bother to learn the operations the way that school taught you and learned them piecemeal as you needed them? If it's the first case, I'd say that you're an idiot. If it's the second, well, to each their own, but I'd say that you still rote-memorized the tables, just in a less-structured way.
In general, I agree that rote memorization doesn't lead to education in any useful sense, but you're deluding yourself if you think it can be avoided in every case.
Beyond that, I'd agree: rote memorization is generally harmful, and when you get into real mathematics, those memorized facts aren't as useful. I don't see math as the real reason we teach arithmetic, though. It's useful to be standing in the grocery store and easily know how much you'll be paying in total if you're buying 4 items at $6.49 and 5 at $2.37. If you disagree about the purpose of memorizing those facts (for most people), or the usefulness (in daily life) of having memorized them, then I'm not going to try to convince you. Your replies sound like you're just trying to be contrary, anyhow.
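(To make that example concrete: 4 × $6.49 = $25.96 and 5 × $2.37 = $11.85, for $37.81 total; or, estimating in your head, $26 + $12 ≈ $38. That's exactly the kind of quick calculation the memorized facts buy you.)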
if you aren't on Android which does everything it can to keep you tethered to Google.
What are you talking about, specifically? On every Android device I own, connecting to Google services is optional (if you're willing to flash the OS yourself), and an internet connection is no more necessary to use the device's functions than it is on an iOS device.
In tech we reach plateaus of 'good enough' for the time and resources involved.
And then someone comes up with some kind of outlier use case that exceeds the requirements of "good enough", and sometimes that use case becomes more and more common over time. "Good enough" is constantly redefined.
I've had fairly good experiences with CFL bulbs, so it's interesting to see how many people have had much more negative ones. I guess I've been lucky.
Anyhow, for me, the "picture" exists, but it's more tactile than visual. There are visual aspects, but that's not how I process most of the information. Loops are spinning wheels when they don't have a clear exit condition, and feel like unrolled spirals when they're "for" loops over specific ranges. Algorithms seem to have a size/weight, which corresponds to my sense of how quickly they'll run on a given set of data (although that sense isn't always accurate yet).
If I don't remember how a section of code works internally, it feels hollow, and reading the code is like looking inside a black box. If I change something outside the box, I feel the domino effect, and when it hits the box, I need to look inside to see what'll fall over. Running a value up through a class hierarchy can also feel like threading a string through the eye of a needle.
I think that the important insight is that a lot of us become very skilled at constructing mental models of what we're working with, and gain some sort of sensory perception (often vision-related) of how the model functions. I think it's telling that (in my case) the world falls away from my perception when I'm working through a complex problem, and closing my eyes sometimes helps, as well.
If the goal is to run a complete older system on new hardware, emulation in some form is the best bet. Running the old OS directly on modern hardware isn't likely to be feasible (without extensive modification to the old OS).
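(In practice, "emulation in some form" usually means a full-system emulator; for example, QEMU can emulate a number of older architectures, and DOSBox covers DOS-era software, though which tool fits depends on the system in question.)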