You failed science and they didn't offer Home Ec?
Burn! But not, ironically, the exhaust.
Any idea if one of these re-tooled coal plants could be "easily" converted to run on natural gas, or back again? "Easily" meaning "cheaper than building a new plant from scratch".
Not a clue. Maybe? The difficult bit would be the turbine. You could probably run a gas turbine designed for syngas off methane, but unless it's designed for it, it might not work as well. You could probably replace the gas turbines outright, which would be cheaper than that plus the entire steam set, I suppose.
Overloaded [] does not.
Yeah it does. a[b] means a.operator[](b). It's a function in C++. Some forms are predefined, much like some forms of sqrt, min, and less are predefined, but it's a function nonetheless. C++ is NOT C, and if you keep treating it as such, then you'll get confused by such things.
The worst of all is overloaded ->, which is an operator that can normally only be applied to a dereferenceable type, so you would really have no reason to even look for an overloaded operator to see if something unexpected is happening.
Except that if you know C++, you know that -> can be overloaded, so you would never assume that user-defined types can't be dereferenced. It's not only possible, it's a really really common idiom. Your reasoning only really works if you assume C++ is really C, and apply all the rules you know from there to C++. It is not. Again, all a->b means is (a.operator->())->b, which is yet again a function call with a funny syntax.
In my experience, if there are bad paradigms available in the language, people will use them, even celebrating their ability to do something "clever" and obtuse.
So? It's my experience that people will do stupid things with power tools, up to and including losing digits and limbs. I wouldn't however advocate that we should never use power tools because an awful lot would be vastly slower and more difficult without them. It's the same with powerful language features. You can do stupid stuff with powerful language features. If people are doing stupid stuff with those in your code base, then the problem is not the language, it's your processes. You can, after all, write awful code in any language.
Very often it may not even be in your own code base, which you have control of, but in some open source software that you have to read and understand.
When C was popular, there was an awful lot of bad code in C. When C++ became popular, there was an awful lot of bad code in C++. When Java became popular, there was an awful lot of bad code in Java. Nowadays, there's a lot of bad code in all of those plus PHP, Python, Go and a whole host of others.
And on the flip side, there are shining examples of excellence in all of those.
Of course it would be facetious to claim that no language is either better or worse. However, I do dispute that having powerful features inherently makes the language worse. Features that allow you to write the same code more concisely and so with fewer bugs make languages better.
but it's pragmatic to recognize that in a world of imperfect programmers, it's better to not have language features that generally lead to hard to understand code.
Except I dispute that they make it harder to read. Having special syntax makes things easy to read, essentially (I think that's why LISP never took off). [] generally means "array access". And in C++ it works for builtin arrays, std::vectors, and for various numeric array types, like Eigen. The thing is, in all cases it means logically and semantically the same thing, so having it look the same is a huge advantage.
Poorly used, it makes things confusing, just like any other poorly named function. And that's why library writing is generally done by the experienced members of the community or team.
but there's no way you can ship something like this to the non-geek general public.
No, but you can set up the flash as read-only. I'm not sure precisely why it happens. Something to do with the controller on some SD cards? I think my preferred method would be to have r/o SD, because you have to have the root FS on the SD card and a writable filesystem on an internal USB drive.
Before I get lots of flames for the hardware comment, it really is. I kinda hate saying this because it's completely changed the industry and totally fulfilled its promise as an educational toy, but dear Ghod you don't want to build a product around it.
Depends on the product. Sticking to professional stuff, I've built a custom diagnostic/test device around an RPi. Currently one, but there'll be a herd of 4 eventually. I don't have to worry about people plugging in random USB crap, of course. I certainly don't need gig-E (it won't even be on wifi permanently), though I do use the builtin bluetooth, since I need that. Power is provided by a decent SMPS (100-240V of course), so is good quality. I have one custom rider board that connects it up to various bits and bobs. Oh, and I use the official screen, which is really super handy. It's actually built around that, so everything screws to the screen in a stack, the screen screws into the case lid, and the lid screws to the case, so I can lift out all the hardware very easily.
The SD card is annoyingly slow, but it does make development easy and stress free, in that you can have a production and a dev SD card and simply swap the card to convert the device, and of course with the high-speed reader on my laptop, I can duplicate and back up the card quickly and easily.
The downside is it has to be set up as a read-only root, since even with a journalling FS, there seem to be some obscure circumstances that can nuke the SD card. Though isn't eMMC basically the same: like the old MMC cards (i.e. SD without any of the dumb crap that no one uses) over circuit board traces, rather than a socket? I've not really looked into it deeply. On the other hand, r/o root is not a bad idea generally for kit that some tech might simply yank the cord on at any time.
Given the use case, the RPi was great. It's not cost sensitive, but everything just works out of the box, software wise, and the RPi with its screen is available next day delivery and from loads of vendors. There's also docs on how to do most stuff, which is great.
As usual, horses for courses, etc, etc, but building the current thing I'm on has made me really appreciate the Pi.
The idiots have really taken over, alright
Well, that's pretty unusual for a utility scale gas turbine plant, since it's a massive waste of energy and therefore money to dump out air hot enough to toast a bird. The plants tend to be built as combined cycle, with a gas turbine at the top (high peak temperature, medium rejection temperature) and a Rankine cycle (medium peak temperature, low rejection temperature) at the bottom of the thermodynamic cycle.
That gives you a much larger temperature differential than it's practical to achieve with either cycle independently and so gives you a much more efficient plant, about 60% versus 30% for a gas turbine and 40% for a Rankine cycle.
Interestingly, even the new generation of coal plants are now built this way, with a gasifier to turn the coal into gas, running that through a gas turbine, then using the outlet heat to drive a Rankine cycle. I imagine they use the solid residue to power the gasifier and probably an interstage reheat on the Rankine cycle as well.
Nope that's no different.
a[b] means index(a, b). Information is no more hidden than if you have to look for index() versus looking for operator[]. Operators are not magic, they're just functions with an alternative syntax. If you have a problem where programmers are fooled because someone keeps committing an index() function to the repo which does something else entirely, the problem isn't overloading, it's that you're not doing code reviews properly.
I strongly disagree.
The first mistake is not knowing C++ well. a+b means operator+(a,b). It's a function call with a funny syntax. No more, no less. And now that we're on to function calls: nothing in your language stops someone from writing a function called add() which does something silly. Or sensible.
Why is it worse having a+b than add(a, b) when a and b are complex as per your example?
My opinion has nothing to do with bad drivers. Everybody gets tired. Everybody gets distracted. Anyone who says otherwise is kidding him/herself.
Besides, more than 70% of all drivers eat while driving, and that's responsible (according to one study) for about 80% of all crashes. When I say humans suck as drivers, I mean that the overwhelming majority of human drivers (if not all) suck at driving at least some of the time. The only reason we don't have orders of magnitude more wrecks than we do is that split-second reaction time is only important on very rare occasions (perhaps five seconds in a typical hour of driving), so being distracted usually doesn't result in an accident.
How can anyone seriously care?
Because languages are not static. Their future depends on the development community.
I'm anti-antibiotic and modern medical intervention because I think knowing that they're available just makes people careless and sloppy when they travel in areas where those interventions aren't available. I would much rather a few more people die because we don't use antibiotics at all than for people to become reliant on them and just become careless and unfit.
First, antibiotics are available nearly anywhere in the world you might go. By contrast, these sorts of autopilot features are available on a tiny fraction of a percent of vehicles, and probably will be for some time to come.
Second, the technology is highly limited, basically useful only on the highway, which means that if people get used to having that extra support during highway driving, it could easily result in an increase in accidents in cities, where accidents are much more likely to cause pedestrian fatalities. So there's a good possibility that this could actually make traffic deaths worse on the whole over the long term.
Early studies strongly suggested that partial self-driving solutions did more harm than good, which is why I think we should wait to make self-driving technology available until it can truly take the place of the human driver, rather than introducing a solution that only works part of the time and can lead to false confidence the rest of the time. I could be wrong, and I'd like to be wrong, but my gut says we'd be better off waiting a few more years for a more complete solution, rather than deploying a partial solution more broadly.
unless you go through a lot of hoops to create a read-only file system, and even then it is risky to use it for something in the field or embedded in another system.
The hoops are not that bad. Just mount the root filesystem read-only, with tmpfs for the bits that need to be writable.
There are guides for running Debian as read-only root over NFS: most of those instructions (except the NFS bit) apply.
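For what it's worth, a sketch of the usual setup (the device names and mount points here are assumptions for illustration, not from the post): mark root read-only in /etc/fstab and give the writable paths tmpfs entries.

```shell
# Illustrative /etc/fstab entries for a read-only root with tmpfs scratch space
# (partition names are assumptions; adjust for your board):
#
#   /dev/mmcblk0p2  /          ext4   ro,noatime       0 1
#   tmpfs           /tmp       tmpfs  nosuid,nodev     0 0
#   tmpfs           /var/log   tmpfs  nosuid,nodev     0 0
#
# To make the root temporarily writable for maintenance:
mount -o remount,rw /
# ...make changes...
mount -o remount,ro /
```

Anything that insists on writing (logs, lock files, DHCP leases) either gets pointed at a tmpfs or symlinked into one.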
I'm not saying that coming up with software that allows a car to autonomously drive itself better than a human is impossible. I just challenge the assertion that a computer with multiple cameras is likely superior to a human.
I say that for several reasons:
Besides, Tesla's autopilot feature is designed exclusively for highway driving. (AFAIK, it still ignores stop signs and traffic lights entirely.) Highway driving is, on the whole, some of the safest driving possible, with nearly every accident caused by some combination of fatigue, distraction (particularly involving food/drinks), and/or drunkenness on the part of one of the drivers involved. To beat a human driver under those conditions, all Tesla's autopilot really has to do is keep the car in the current lane, reliably detect cars that have stopped in front of it without nodding off after half an hour or chugging one for the road, and avoid other people who have fallen asleep or are drunk. Of those, only the last one is particularly challenging, which is almost certainly why the crash numbers are only down by 40% instead of 80% or more.
Anyone can hold the helm when the sea is calm. -- Publius Syrus