I'm actually okay with us not using our nukes.
The whole thrust of ESR's Cathedral and the Bazaar essay...
You're about 30 years late WRT your reference. When I said "back in the day"...
I first saw the term "software priesthood" in print in Byte magazine -- it was 1976, I think. It was already in play among those of us who had already been programming for a while, and even more so among certain sectors of management.
For most specific problems thrown at supercomputers, you can go 30 times faster with a custom hardware architecture baked into silicon.
To go 30 times faster for general-purpose supercomputing, you use the latest silicon (2X) and more chips (15X), and come up with a super new interconnect to make it not suck. This would involve making some chips that support low-latency IPC in hardware.
They are free to send me a few billion dollars; I'll get right on it and deliver a 30X faster machine, and I'll even use some blue LEDs on the front panel.
My Happy Hacking Professional 2 keyboard has a control key in the place where a caps lock key usually goes.
Agreed. I wouldn't change a thing. I spend my time worrying about my current project or my programming skills, not the layout of my keyboard. I internalized key positions decades ago, and I don't see any real benefit in trying to relearn what I've already got down pat.
There are a lot of slightly sub-optimal things in our lives that hang around due to simple inertia. The "pain in the ass" factor of learning a new keyboard layout probably outweighs potential efficiencies of completely or partially relearning how to touch-type. Most of us don't rely on maximum efficiency when typing, unless you happen to have a very specialized job in which you type a *lot*. A slightly more efficient layout probably won't make any substantial difference for most of us.
As a practical matter, choosing a non-standard keyboard layout is going to greatly inconvenience you in many ways. You'll have a dramatically limited selection of keyboards to choose from, and you're going to have problems every time you need to temporarily use someone else's standard qwerty keyboard.
You're "misunderstanding" fundamental concepts and basic facts about the figures in Karl et al. 2015. That's not misdirection, it's relevant. If you can't even get the basics right, what makes you think you'll be able to understand anything else?
I'm presuming there are no second graders here. Don't overthink it... it was just an example. Calculus, differential equations, etc. Even basic algebra is quite difficult for some. For others, it's logical, intuitive, and even beautiful. Math instructors would obviously tend to be of the latter group, and as such, might have trouble empathizing with students who "just don't get it".
Similarly, I've seen a lot of programmers who are convinced that *anyone* could easily learn to program, because *they* happen to find it easy.
I'm just not convinced that's the case. Not everyone finds the same things intuitive.
That's why I included parents in that topic (and teachers can also be helpful here). They're supposed to provide some wisdom and guidance about these sorts of things. Just have a conversation with your kids about being careful about what you post online, because those sorts of things can't *really* ever be deleted, and can have real-life consequences.
You need to start from that reality, and not with wishful thinking about being able to magically erase your past. It would be great if it were possible, but it's simply not.
I haven't touched the iPhone stuff. Only Macintosh programming, which I've done on and off since 1984.
I found Swift pretty easy to pick up. Another language amongst many. Each to their own.
I wrote an MVC shim over curses in python for a point of sale application. Now that was a simple API.
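I don't have the original shim, but a minimal sketch of that MVC-over-curses pattern might look something like this (the class names and the cash-register model are my own invention for illustration, not the actual point-of-sale code):

```python
import curses


class Model:
    """Holds application state -- here, a hypothetical sale in progress."""
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)


class View:
    """Thin wrapper around a curses window; knows how to draw and nothing else."""
    def __init__(self, win):
        self.win = win

    def render(self, model):
        self.win.erase()
        for row, (name, price) in enumerate(model.items):
            self.win.addstr(row, 0, f"{name:<20} {price:>8.2f}")
        self.win.addstr(len(model.items) + 1, 0,
                        f"{'TOTAL':<20} {model.total():>8.2f}")
        self.win.refresh()


class Controller:
    """Maps keystrokes to model updates, then asks the view to redraw."""
    def __init__(self, model, view):
        self.model, self.view = model, view

    def handle(self, key):
        if key == ord("c"):  # demo binding: 'c' rings up a coffee
            self.model.add("coffee", 2.50)
        self.view.render(self.model)


def main(stdscr):
    # Entry point you'd hand to curses.wrapper(main); not invoked here
    # so the sketch stays import-safe outside a real terminal.
    ctrl = Controller(Model(), View(stdscr))
    while True:
        key = stdscr.getch()
        if key == ord("q"):
            break
        ctrl.handle(key)
```

The point isn't the cash-register logic; it's that the View is the only class that touches curses, so the model and controller stay trivially testable without a terminal.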
Just to be clear, I do have some sympathy for people who are out in public, and are caught in an embarrassing situation not of their own making. Say, for example, a dog playing at the beach tugs at a person's swimsuit and pulls it down. Pretty embarrassing, and not exactly anyone's fault. Someone films it with their smartphone and posts it online anonymously for kicks. Your situation is another good example.
This sort of thing can happen much more easily, because nowadays *everyone* has a handy videocamera available right in their smartphone. I'm perfectly fine with laws meant to protect people against that sort of abuse, or to compel services to remove photos or videos of that nature upon request. That being said, everyone has to understand that there's no way to permanently remove data from "the internet", only from a few specific sites. And anyone who downloaded that information could always upload it again. That's the hard reality of the world we live in. The information age provides some amazing benefits, but it certainly has downsides as well.
Part of this is a human problem as well. Who exactly posted these rumors on FB about you? Is passing a law about this going to fix the underlying problem here? That's sort of what I'm getting at. I'm saying that people need to understand that this is the new reality. Part of this needs to be some restraint in people NOT posting unfounded rumors about others online at the drop of a hat. I wouldn't shed a tear if the person who spread those rumors about you got reprimanded or fired because of pulling bullshit like that. Responsibility has to go both ways.
Agreed. Swift makes it easier to program, but the notion that "anyone" can write apps is definitely a laugh. There are a lot of programmers who don't understand that some people have a really hard time with the core concepts and skills involved in creating software. It reminds me of math teachers who don't seem to understand that some people have a fairly difficult time with advanced mathematical subjects. People have different areas of competence, and not all are suited to be programmers. It's not just logic... you need to do some creative problem solving in formulating that logic, and you need to keep a LOT of complex things in your head all at the same time to get them to all mesh together at the end.
And that's how I became a developer. In college I was going to major in Economics with a minor in Computer Science -- but then I took an "Intro to Programming" class after 8 years of home computer BASIC, and I was amazed that these engineering students had no ability to understand the logic and problem-solving required for programming.
I have a degree in computer science. I've been programming since I was 9. I learned Swift. It's quite good as languages go. But no amount of language knowledge or computer science knowledge will make the Apple APIs simple. They're not. They're complicated and hard to use. Swift will not make the APIs simple or logical. Making the APIs simple and logical will make the APIs simple and logical.
Kids & Teens: Don't post embarrassing photos or videos of yourself online, or put yourself in a position where others can post embarrassing photos or videos of you online. Don't think you can be anonymous online, because someone WILL recognize you or figure out who you are, given enough incentive. Consider it a valuable life lesson that you actually *can't* retract everything you do in life so easily.
Parent: Get involved and teach your children to be responsible online. Just like in the real world, there are rules for behaving safely and responsibly online. When things go public, there's no way to retrieve those images from everyone who may have gotten a copy, and no amount of legislation is going to change that reality, however much some people may wish it.
Legislators: Stop pretending that you can fix all the world's ills with the sweep of a pen. Start learning what IS and ISN'T possible in the online world. Or for God's sake, at least ask one of your younger tech-savvy interns before you make a fool of yourself with this sort of stuff.
> It's DRAM that's in the crosshairs.
Only to a small extent. This would reduce the need for DRAM cache of SSD data. Computers will still need huge amounts of DRAM for workspace. Workspace memory needs trillions of times more write cycles than this provides.
Or more SRAM cache local to the CPU with cache lines being merrily lobbed twixt the SRAM and the magic new memory. Maybe. A non volatile PC would be neat.
Which, by the way, might itself fail and keep the pilot from unlocking the tail section when it needs to be unlocked, killing the pilot and co-pilot. Helluva world, eh?
True, but engineering is oftentimes about weighing risks against each other. The risk of pilot error in this case has been demonstrated to be a real threat. The mechanical interlock can also be designed with overriding backup systems that can be activated by the pilot in case it fails for some reason. It's far better to have the pilot use a dedicated toggle in case of a rare emergency than have to remember to flip a switch at the correct time on each flight or risk the destruction of the spacecraft.
Oh, don't misunderstand... I absolutely agree. I was simply giving an example of the benefits of higher-level language abstractions.
I'm a videogame programmer, and we generally use C++ for engine code and game systems, since it's a pretty good balance of performance vs abstraction. But we often use C#/.NET for tools, and various scripting languages for game content.
Essentially, I think a reasonable maxim is that a programmer should use the highest level of abstraction possible for the job at hand. Picking the right language for the job is important. I'd hesitate to call anything a "best mix", because it depends on the project at hand, the environment, and the team you'll be working with.