Fortran is still very much in use for mathematical programming.
Any language that could replace C and assembler would need to be statically compiled. So for Java, C#, Python and so on you'd have to define a subset that doesn't require a runtime parser or standard library. You'd also need extensions (or a static module system) that allow you to add assembler for direct hardware access, and a new compiler that can generate static native code instead of targeting the intermediate VMs these languages use now. Not impossible by any means, and probably a fairly interesting exercise too, but the languages would end up rather different and more restricted than the full versions people are used to.
Rather, I expect and hope that something like Rust will eventually supplant these languages in this space. Rust gives you the best of both worlds: statically compiled binaries and good memory safety enforced at compile time rather than at runtime. You pay for it by having to be much more explicit about ownership than in those languages, though. I've followed that project for a good while and it's clear that targeting small embedded systems is a struggle even for such a language; Java and friends would be much more difficult still.
FORTH is the rare language that tends to be even more memory efficient than C. The runtime interpreter is truly minimal (really just following a bunch of jump tables); you can have a small environment and application code in less than 8K.
On the other hand - and I say this as someone who likes FORTH a lot - you'd be hard pressed to find people claiming that FORTH is any higher-level (or easier to develop in) than C or assembler.
On the third hand - and off-topic here - it's quite a fun little language to use. Just like you can say that Scheme is programming directly in an AST, using FORTH is writing code directly for a stack machine. It's probably good for you to have a bit of experience even if you never do anything "real" with it.
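To make the stack-machine point concrete, here's a minimal sketch in Python of the execution model FORTH exposes directly to the programmer. (This is not real FORTH; the word set and the `run` helper are just illustrative. Each "word" pops its arguments off a shared data stack and pushes results back, and a program is just a sequence of words.)

```python
stack = []

def dup():  stack.append(stack[-1])
def swap(): stack[-1], stack[-2] = stack[-2], stack[-1]
def add():  stack.append(stack.pop() + stack.pop())
def mul():  stack.append(stack.pop() * stack.pop())

words = {"dup": dup, "swap": swap, "+": add, "*": mul}

def run(program):
    # A program is just whitespace-separated words; unknown tokens
    # are treated as integer literals and pushed onto the stack.
    for token in program.split():
        if token in words:
            words[token]()
        else:
            stack.append(int(token))

run("3 4 + 2 *")    # postfix for (3 + 4) * 2
print(stack)        # [14]
```

A real FORTH inner interpreter is little more than the loop in `run` compiled down to threaded jumps, which is why the whole environment fits in a few kilobytes.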
The high-level VMs and the drivers that drive the specific hardware aren't developed by magical Low-Level Elves in Happy Rainbow Fairy Land. Every IoT device is going to have its own special hardware stuff, and somebody needs to write the low-level code to interface with it. That is done in a combination of C and assembler.
Also, at volume the price difference between an MCU that can run a VM and one that cannot will be on the order of tens of cents (in either currency). If you plan to make on the order of a million devices, then 20 cents per unit will more than pay for a programmer who knows how to use the small MCU over a Java hack who does not.
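The arithmetic behind that claim is worth spelling out; a quick back-of-the-envelope sketch using the comment's own figures (a million units, 20 cents per unit):

```python
# Figures from the comment above: ~1 million units, ~20 cents saved
# per unit by choosing the smaller MCU that can't run a VM.
units = 1_000_000
saving_per_unit = 0.20      # dollars (or euros) per device

total_saving = units * saving_per_unit
print(total_saving)         # 200000.0
```

Two hundred thousand is comfortably more than a year of an embedded developer's time, so the one-off cost of writing the low-level code pays for itself at that volume.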
There are different national versions of Scrabble. Here's Icelandic
The horses have run away and you're now opening the barn door hoping they'll come back, but they long ago found something better.
If they've found greener pastures, that implies that they were out to pasture in the first place, i.e. not in the stable or barn. But... yeah, I knew what you meant.
I actually found it very witty, sorry to be so cocky.
Your original comment came over as sort of funny, albeit not quite as clever as it was obviously meant to be.
However, if you want to be self-congratulatory, it helps not to undermine things completely...
And yes, I know that the original metaphor requires you to shut the door after the horse has run away instead of opening it. But that part of the metaphor didn't make sense anyway; if anything, you would have to OPEN the door when you notice that the horse is gone and hope that it comes back home.
That's because you've entirely missed the point of the original metaphor!
Keeping the stable door shut (to stop the horse escaping) represents the thing that was *supposed* to have been done beforehand.
If you fail to do that and "the horse escapes" then... there is no point in trying to remedy things or make amends by shutting the stable door. Of course it doesn't make sense any more... that's the whole point!! The horse is already gone. Shutting the stable door beforehand would have prevented this. Shutting the stable door now is too late to solve the problem.
The point is that doing what you should have done in the first place only *after* the thing it was intended to prevent has already happened makes no sense.
You open the stable long after the horse has found greener pastures.
I see exactly the point you're making, but that's still a mixed metaphor that makes progressively less sense the more you think about it. (^_^)
The Wii was a fluke that happened at exactly the right time, which is why the Wii U tanked. (Setting the Wii aside, the Wii U sold roughly the expected number of units given the slowly decaying trend in their total console sales.)
I don't know that the original Wii was entirely a fluke; I'd give them the benefit of the doubt and credit them with doing something different to MS and Sony's chasing of the traditional, mainstream "serious" gaming market by going for the casual market (which, to some extent, the DS had already had success in pioneering).
But basically, yes, I agree with you regarding the timing and the fact the casual market had moved on by the point the Wii U came out. I said much the same thing myself a few days back- the Wii U was a contrived attempt to replicate the original Wii's success by doing exactly the same thing (especially its controller, trying too hard to be as original as the Wiimote)... and without recognising that the casual gaming market the DS and original Wii had pioneered had started moving on to tablets and smartphones by then.
The fact that- from what I've heard- a relatively high proportion of the original Wii consoles tended to end up gathering dust in cupboards after the initial burst of enthusiasm and novelty wore off probably didn't help convince the same people to rush out and buy a Wii U. Particularly as the marketing- and name- didn't make clear that it was an entirely new console, and not just a slightly improved Wii.
And then, there's Benghazi. Clear case of treason, and no Democrat is interested.
Clinton clearly dropped the ball with Benghazi. Her negligence had fatal consequences, and her apology did not sound sincere. However, I would not call what she did "treason":
Treason against the United States, shall consist only in levying War against them, or in adhering to their Enemies, giving them Aid and Comfort. No Person shall be convicted of Treason unless on the Testimony of two Witnesses to the same overt Act, or on Confession in open Court.
She was negligent and failed either to provide additional support for the consulate or to authorize its withdrawal to a safer location. She did not levy war against the United States, nor did she adhere to or provide aid and comfort to an enemy of the United States. She was a spineless coward who clearly did not respect her subordinates or value their lives; but she did not commit the crime of treason. I think in general some people throw that word around a little too loosely without understanding what it means.
If you're right (and I think you are) - the Switch looks like it might be repeating this exact mistake.
Oh, indeed. That was my first thought when I saw the Switch- it was like deja-vu all over again. Another contrived attempt to create a "novel" console with a "novel" controller.
My daughter prefers to play Wii U single-player games on the GamePad rather than on the television
I have to admit that I've never played the Wii U. However, I remember when it first came out it- and in particular, the screen-based gamepad- struck me as a contrived attempt to replicate the success of the original Wii.
That- of course- enjoyed success because it *didn't* attempt to go down the well-trodden, stereotypical path of reliant-on-graphical-specs hardware and traditional "serious" gamer demographics, but instead targeted the casual gaming market (which had already been opened up by the Nintendo DS which did much the same thing) and used a novel, interesting and more "active" controller- i.e. the Wiimote.
I won't accuse them of wanting lightning to strike twice- since that would imply the original Wii's success was pure luck the first time round, which I don't believe- but it's obvious that they thought they could pull off the same trick again.
Hence, the Wii had a novel controller, so the Wii U had a (contrivedly) novel controller. The Wii got away with being underpowered, so the Wii U would get away with being underpowered. The Wii was a success by targeting the casual market, thus its lack of traditional mainstream arcade games wasn't such an issue- so the Wii U would do the same thing.
One problem as I see it is that the "casual" market the Wii opened up had already moved on by 2012- towards "Farmville"-type Facebook Skinner boxes and smartphone and tablet games- and that the Wii U's trying-too-hard controller was pretty expensive and hardly ideal for family and multiplayer games.
But the marketing was also pretty crap- failing to make clear enough that "Wii U" was an entirely new, next generation console rather than a tarted-up Wii or giving people who had a Wii gathering dust in a cupboard any reason to buy a new one. (And that was possibly another issue- the Wii seemed like a good idea to many people at the time, but I gather a lot of them ended up not being used, so they weren't likely to rush out and buy the next Wii).
your brain also gets flooded with cerebrospinal fluid, which cleans a type of "plaque" from between pathways
So, you're saying that cerebrospinal fluid is basically mental floss then?
Zombie Nation describes the harm sleep deprivation is doing to the United States.
I was going to ask what the hell a bunch of German techno producers would know about it, but I guess they've spent a few late nights in clubs over the years.
(And yeah- I know. I used to think that too, but "Kernkraft 400" was actually the name of the song...)
But what's being unsaid throughout this is whether this works with a standard Git server, or whether it only works with a special Microsoft-kluged server. While the former is vaguely interesting, the latter merits only a derisive snort.
In other news, in 2062 they will have time travel; otherwise how could you possibly know that a just-released 8TB drive would last 45 years?
You know damn well that's unlikely and you're purposefully misunderstanding this.
It's quite obvious to *anyone* with an ounce of common sense that it refers to an 8TB drive they've been running continuously since 1971. Occam's razor, see?
We warn the reader in advance that the proof presented here depends on a clever but highly unmotivated trick. -- Howard Anton, "Elementary Linear Algebra"