Fortran is still very much in use for mathematical programming.
Any language that could replace C and assembler would need to be statically compiled. So for Java, C#, Python and so on you'd have to define a subset that doesn't require a runtime parser or standard library. You'd also need extensions (or a static module system) that allow you to add assembler for direct hardware access, and a new compiler that generates static native code instead of targeting the intermediate VM they use now. Not impossible by any means, and probably a fairly interesting exercise too, but the resulting languages would end up rather different from, and more restricted than, the full versions people are used to.
Rather, I expect and hope that something like Rust will eventually supplant these languages in this space. Rust gives you the best of both worlds: statically compiled binaries and good memory safety enforced at compile time rather than at runtime. You pay for it by having to be much more explicit about ownership than in those languages, though. I've followed that project for a good while, and it's clear that targeting small embedded systems is a struggle even for such a language; Java and friends would be much more difficult still.
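To make the ownership point concrete, here's a minimal sketch (the function names and example strings are my own, not from any real project): passing a `String` by value moves ownership into the callee, and using the original binding afterwards is a compile error, whereas keeping ownership requires an explicit borrow.

```rust
// Illustrative only: Rust makes ownership transfers explicit.

// Taking `String` by value *moves* ownership into the function;
// the caller can no longer use the original binding afterwards.
fn consume(s: String) -> usize {
    s.len()
} // `s` is dropped here.

// Taking `&str` is an explicit shared borrow; the caller keeps ownership.
fn peek(s: &str) -> usize {
    s.len()
}

fn main() {
    let msg = String::from("sensor reading");
    let n = consume(msg); // ownership of `msg` moves here
    // println!("{}", msg); // would not compile: value used after move
    assert_eq!(n, 14);

    let msg2 = String::from("sensor reading");
    let n2 = peek(&msg2); // the `&` marks the borrow explicitly
    assert_eq!(n2, msg2.len()); // `msg2` is still usable
}
```

In a GC'd language both calls would look identical; in Rust the difference between "moved" and "borrowed" is visible in the source, which is the extra explicitness the comment above refers to.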
FORTH is the rare language that tends to be even more memory efficient than C. The runtime interpreter is truly minimal (really just following a bunch of jump tables); you can have a small environment and application code in less than 8K.
On the other hand - and I say this as someone who likes FORTH a lot - you'd be hard pressed to find people claiming that FORTH is any higher-level (or easier to develop in) than C or assembler.
On the third hand - and off-topic here - it's quite a fun little language to use. Just like you can say that Scheme is programming directly in an AST, using FORTH is writing code directly for a stack machine. It's probably good for you to have a bit of experience even if you never do anything "real" with it.
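To illustrate the stack-machine point, here's a toy sketch in Rust (my own example, not a real FORTH): each "word" operates on a shared data stack, the way `3 4 + DUP *` does in FORTH.

```rust
// A toy stack machine in the spirit of FORTH's inner interpreter:
// a program is just a sequence of words executed against one data stack.
enum Word {
    Push(i64), // FORTH: a literal number
    Add,       // FORTH: +
    Mul,       // FORTH: *
    Dup,       // FORTH: DUP
}

fn run(program: &[Word]) -> Vec<i64> {
    let mut stack: Vec<i64> = Vec::new();
    for w in program {
        match w {
            Word::Push(n) => stack.push(*n),
            Word::Add => {
                let b = stack.pop().unwrap();
                let a = stack.pop().unwrap();
                stack.push(a + b);
            }
            Word::Mul => {
                let b = stack.pop().unwrap();
                let a = stack.pop().unwrap();
                stack.push(a * b);
            }
            Word::Dup => {
                let top = *stack.last().unwrap();
                stack.push(top);
            }
        }
    }
    stack
}

fn main() {
    // FORTH: 3 4 + DUP *   ( computes (3+4)^2 )
    let program = [
        Word::Push(3),
        Word::Push(4),
        Word::Add,
        Word::Dup,
        Word::Mul,
    ];
    assert_eq!(run(&program), vec![49]);
}
```

A real FORTH dispatches through threaded code rather than an enum match, but the programming model - you, the programmer, directly sequencing stack operations - is the same.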
The high-level VMs and the drivers for the specific hardware aren't developed by magical Low-Level Elves in Happy Rainbow Fairy Land. Every IoT device is going to have its own special hardware, and somebody needs to write the low-level code to interface with it. That is done in a combination of C and assembler.
Also, at volume the price difference between an MCU that can run a VM and one that cannot will be on the order of tens of cents (either currency). If you plan to make on the order of a million devices, then 20 cents per unit will more than pay for a programmer who knows how to use the small MCU over a Java hack who does not.
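A quick back-of-the-envelope check of the numbers above (the salary comparison at the end is my own assumption, not from the comment):

```rust
fn main() {
    // 20 cents saved per unit, across a million units:
    let units: u64 = 1_000_000;
    let saving_cents_per_unit: u64 = 20;
    let total_dollars = units * saving_cents_per_unit / 100;
    assert_eq!(total_dollars, 200_000);
    // $200,000 is comfortably more than a year of a (hypothetical)
    // embedded developer's fully loaded cost.
}
```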
All devices capable of FaceTime supported iOS 7. Apple didn't leave any devices behind when they did this.
Not only that, but a company cannot be expected to support unnecessary legacy infrastructure forever.
My memory is hazy, but Flash was a subset of Shockwave that was optimized for the web.
I vaguely remember it being called "Shockwave Flash" for a brief time.
Of course, it's been a while so I could be wrong.
1Gb/s fiber, no caps, and internet phone for ~7000 yen ($60) a month. And I get close to that in practice when I test against a server on the Japanese mainland. Not bad, considering Okinawa is a smallish island in the western Pacific.
I even still have a 10.6 workstation, because the software for my film scanner is PowerPC-only.
Vuescan works OK and is available for OSX, Windows and Linux. It supports a huge range of scanners old and new. I've been using it for many years and I can say it gives quality results, while the UI is at least no worse than other scanner software (not a high bar to clear, of course). And since it's cross-platform it's one less thing to lock you in to any specific system.
BTW, the part about knowing who's going to use the door and who isn't is probably doable with cameras and enough processing power.
It is possible, and it has been built. A couple of colleagues in Sweden did just that for one manufacturer, more than fifteen years ago. The idea was to reduce the amount of heat lost from unnecessary door openings in winter, and to a lesser extent from cooling losses in summer.
It would recognize who was heading for the door versus who was just walking past. It wasn't fooled by dogs or kids (it would open for kids, but not dogs) or by things like suitcases or prams. During development they built a version that would only open if you made the Vulcan hand sign.
But it was too expensive. Automatic doors are not a high-margin business - there are many competitors - and the actual savings did not make up for the higher price. The energy losses are pretty minimal for most shops, and door openings are usually not in error. Shops that have a real problem with this tend to use revolving or double doors already.
Also, it didn't help that the shops might have needed permission to mount what is effectively a camera pointing out on the street.
Today the hardware would be cheaper, and cameras are far more acceptable. But from what I heard customer interest would still be small.
It's funny because it was a fad for a bit in the 1950s, but similarly died when the novelty wore off.
History has an amusing way of repeating itself. Nobody liked having to wear glasses to watch a movie in the 1950s, and the same is true today.
What is it about having money that turns people into such assholes?
I mean really, 700 acres? How can someone not find sufficient privacy for their family on 700 acres, even if it contains a few parcels he doesn't own?
That's all fine and dandy until the day comes when MSFT stops maintaining the WSL subsystem and/or lets subtle incompatibilities creep in.
Bring it up with Microsoft? What do Windows app developers do when Wine doesn't run their application correctly?
How does it compare to offering a build linked against the Cygwin library?
Zero extra work and no need for a separate box or VM, and a Windows licence, to test the build.
Sounds horrible to me. Why bother?
Not sure what MS' motivation is, but it's good news for a lot of scientific software developers. Small teams or single researchers rarely have enough time to even keep the main development going, never mind keeping up with multiple OS targets. With this everybody can simply focus on Linux, and tell Windows users to just run it under the Linux layer and stop asking about a native port.
Oh gods this is so true.
I grew up in the 80s and 90s and cassettes were my main music format at the time.
The hiss. The tape becoming damaged now and then, resulting in parts of your songs being screwed up. The poor speed regulation on many tape decks. The felt pad under the tape becoming damaged or falling out, and having to replace it while hoping not to damage the tape in the process. The tape getting "eaten" by the deck. The fact that almost all prerecorded tapes were made with the lowest quality tape possible (low bias, non-metal), so you didn't even get the best quality tape could provide from your music purchases.
Heck, the technology itself was a hack. Cassettes were originally meant for low fidelity voice dictation.
Cassettes have literally NOTHING to offer except the nostalgia. If you want a physical copy of your music, CDs are the way to go. If you want to be retro-hipster, vinyl is far better in audio quality and durability. Tapes are a clusterfuck and I remember RELISHING the day I got a CD player and didn't have to deal with them for my new music purchases.
This is why when strangers photograph me, I flip them the bird, not a peace sign. Then they don't get my fingerprint, since it is not facing them.
Most parts of your skin have distinctive, unique patterns. You can get a unique print from your elbow, wrist, knuckles, knees... And you tend to leave such marks around too, if less often than with fingers.
"There are some good people in it, but the orchestra as a whole is equivalent to a gang bent on destruction." -- John Cage, composer