Comment Re:intel is using fabs for low power (Score 1) 96
They did, at one point. They bought the rights to StrongARM and sold it for some time, then abandoned it completely.
Power is governed by the number of state changes per second. Dynamic power scales linearly with frequency, but with the square of the supply voltage. There's only so much saving from reducing voltage, though, as you run into timing margins and electron-tunnelling (leakage) errors.
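For reference, the standard CMOS dynamic-power relation is:

```latex
P_{\text{dyn}} \approx \alpha \, C \, V^2 f
```

where \alpha is the activity factor (the fraction of gates switching per cycle), C the switched capacitance, V the supply voltage and f the clock frequency. Halving V cuts dynamic power to a quarter at the same frequency, which is why voltage scaling gets attacked first, until leakage and timing margins stop you.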
You are much, much better off saying "bugger that for a lark", exploiting tunnelling to the limit, switching to a lower-resistance interconnect, cooling the silicon below 0°C and ramping up clock speeds. And switching to 128-bit logic and implementing BLAS and FFT in silicon.
True, your tablet will now look like a cross between Chernobyl, a fridge-freezer, and the entire engineering section of the Enterprise NCC-1701-D, but it will now actually have the power to play those 4K movies without lag, freezes or loss of resolution.
Modern programming languages are a fusion of older programming languages, with chunks taken out. Often, it's the useful chunks.
There is no table, that I know of, that lists all the features ("significant" depends on the problem and who cares about solved problems?) versus all the paradigms versus all the languages. (Almost nothing is pure in terms of paradigm, so you need a 3D spreadsheet.)
Without that, you cannot know to what extent the programming language has affected things, although it will have done.
Nor is there anything similar for programming methodology, core skills, operating systems or computer hardware.
Without these tables, all conclusions are idle guesses. There's no data to work with, nothing substantial to base a conclusion on, nothing to derive a hypothesis or experiments from.
However, I can give you my worthless judgement on this matter:
1) Modern methodologies, with the exception of tandem/test first, are crap.
2) Weakly-typed languages are crap.
3) Programmers who can't do maths or basic research (or, indeed, program) are crap.
4) Managers who fire the rest of the staff then hire their girlfriends are... ethically subnormal.
5) Managers who fire hardware engineers for engineering hardware are crap.
6) Managers who sabotage projects that might expose incompetence are normal but still crap.
7) If you can't write it in assembly, you don't understand the problem.
8) An ounce of comprehension has greater value than a tonne of program listing.
9) Never trust an engineer who violates contracts they don't like.
No (http://arxiv.org/abs/1210.1847).
Please stop assuming you're intelligent, you're just embarrassing everyone else.
Given that physicists are seriously studying whether the universe is a computer simulation, that joke might not be too far from the truth. You have been warned.
These theories have their own problems. As noted on Slashdot previously, neither exists around dwarf globular clusters nor in the local region of the Milky Way. It is not altogether impossible that our models of gravity are flawed at supermassive scales and relativistic velocities, and that corrections are needed which would produce the same effect as is currently theorized for these new kinds of matter and energy.
Remembering that one should never multiply entities unnecessarily, one correction factor seems preferable to two exotic phenomena that cannot be directly observed by definition.
But only if such a correction factor is theoretically justified AND explains all related observations AND is actually simpler.
There is just as much evidence these criteria are true as there is for dark stuff - currently none.
Maybe there are radioactive elements underground as well.
For me, the best sleep I ever had was in a hotel room that had air filtering, blackout curtains and was at the end of the top floor, well away from all the other guests banging and clattering their suitcases through the corridors.
But move to the same kind of room right next to the main hallway, and it was impossible to get a deep sleep, because there was always someone every hour who figured the best way to open a door that opened inwards was to hit it with a large suitcase. The same thing happens if the hotel room has emergency lights that come on whenever the main lights are switched off.
Air flow could be another problem. Even on Earth, sleeping under a lie-in (the sloping part of a roof) always gives me a sore head due to the lack of air flow. The CO2 seems to build up. The only way I could stop that was to sleep directly underneath the skylight window and keep it open. Maybe the shape of the sleeping pods leads to CO2 build-up.
Given the premises of this thread (the costs and salaries of work immigration need to be controlled by the state), here's a half-serious suggestion:
Have work immigrants be employed by your federal government, not by the company they work for. The immigrant reports their working hours and conditions to the government, and gets their salary paid out from there. The government dispatches the worker to the company, and gets the salary and other costs paid back from it.
The great benefit is that the worker is no longer at the mercy of the company, and has no incentive to accept bad conditions or missing pay checks from it. And in any labour dispute they have the backing of a major legal and administrative organization. The government gets a clear view of exactly who the work immigrants are and what they do for their employers. The companies are relieved of some of the responsibility for these workers. Everybody has a common, single point of contact to turn to in case of problems.
I definitely second that.
As an aside, you would generally expect a router to properly support the things it advertises, or at least you should be able to. I haven't seen many routers certified as IPv6-ready (there's a comprehensive test suite out there from the TAHI project; it's not as if verification would be hard) or even as IPv6-capable, although a good number are both. So you can't trust the advertised capabilities to be either complete or correct.
There may also be hardware weirdness that means a feature won't work as expected whether with the regular firmware or a replacement.
Getting just the brand and revision is great if you only want basic stuff, which is most people. For freaks and geeks, though, it would be useful to know if there are any really big, ugly omissions.
(I've done compatibility testing between network cards. It is unbelievable - or, at least, it should be unbelievable - how many network chipsets are defective. It's mostly obscure stuff, but bad silicon is expensive to fix, so you'd expect halfway decent testing. It just means all routers will do weird shit, so it's handy to know if it's weird shit that's likely to be a problem.)
Comcast failed to implement the duck switch. They do support rat, pig and ferret, though.
You might want to check out NIST's page on authenticating+encrypting modes.
You might want to look at Diffie-Hellman key exchange, where nothing is transmitted that cannot safely be entrusted to a wiretapper.
You might want to look at the Byzantine class of problems and their use in encryption.
You might want to look at the reasons for and against random oracles.
I see very, very little in cryptography that has to do with trust. Almost everything is dedicated to assuming that nothing can be trusted. People are encouraged to compress data before encrypting it because even the maths isn't trusted.
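The Diffie-Hellman property mentioned above (nothing crosses the wire that a wiretapper couldn't safely see) fits in a few lines. A minimal sketch with toy parameters; the prime here is far too small for real use, where standardized groups such as the RFC 3526 MODP primes apply:

```python
import secrets

# Toy group parameters for illustration only.
p = 2**127 - 1   # a Mersenne prime; far too small for real security
g = 3

# Each side picks a private exponent and publishes only g^x mod p.
a = secrets.randbelow(p - 2) + 1
b = secrets.randbelow(p - 2) + 1
A = pow(g, a, p)   # Alice's public value; safe for a wiretapper to see
B = pow(g, b, p)   # Bob's public value

# Both sides derive the same shared secret without ever sending it.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob
```

The wiretapper sees p, g, A and B, but recovering the shared secret from those is the discrete-logarithm problem, which is exactly the "assume nothing can be trusted" stance: the design survives an adversary who reads every message.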
Android is not Java. Android is Linux with a JVM set up as an "other binary format" engine.
Most of what you're complaining about is in the standard library, not the core language. The standard library is semi-open, you can alter the code, rip out what you don't want. Only the core language is Java, the rest is just a programming aid.
As for what COBOL has, Admiral Hopper was running software on a non-networked sequential architecture. This is rather different from operating in a multicore SMP-architectured server farm. There is nothing complicated about parallelism, but naivety and self-blinding are two great ways to make every mistake in the book - and then some.
Stability, predictability and reliability could be done with Erlang, Occam, Eiffel, Smalltalk or Ada.
Businesses could have built "enterprise" applications with any of these. Most existed before Java or, indeed, the web. Servlets could have churned out WAIS or Gopher data for businesses. Graphics, via SGI's VRML, Adobe's PostScript or the ancient GKS standard, could have given you everything that Swing delivered. Not that businesses use Swing, as a rule.
Portable applications in the form of Tcl/Tk packages could have provided everything Java applets did. Not that anyone uses applets either.
It should be self-evident that absolutely bugger all of the usual explanations hold water. If the explanations were valid, the role would already have been filled and Java would have never taken off.
Businesses flocked to Java and not to any other technology. Even technologies pushed by very large corporations. Businesses liked, and like, Java. That is obvious. "Why" is not obvious, Java does nothing that couldn't be done better in other ways. It isn't done in other ways, it's done in Java. There will be a sound reason for this, but it won't involve stability, reliability or predictability.
Oak was originally designed for household appliances.
D looks intriguing, certainly superior in theory to C++ or C#, but I'm seeing nothing substantial in it so far.
For other C derivatives, there's AspectC and related attempts at adding high-level abstraction. On the other end of the spectrum, you've got Cilk and UPC - efforts to make parallelism simpler, safer and usable. Again, though, how many here have even got these compilers, never mind written anything in them?
For highly protected work, Occam-Pi is unbeatable. And almost unusable. Extraordinarily powerful, but extraordinarily formal. You could easily write an OS or virtual machine using it that could exploit multicore, SMP and clustering transparently. You just couldn't easily get it to do anything else, like hot-swap resources, add memory, access the busses, support RDMA, exploit hardware...
That's the rub. Most of what is needed in an OS is inherently unsafe. It's why there's so much interest in splitting operating systems into unsafe parts (which often need to be fast and low-level) and safe parts (the stuff that does all the managing and abstraction). So long as the unsafe parts are well-behaved with valid data AND the safe bits provably give only valid data (though it doesn't have to be provably correct), then the system is guaranteed to be stable.
You ideally want to split these up further. The safe bit should access an independent security kernel that handles all the access control, for example. The security kernel should be provably correct, which is a very different constraint than that imposed on other safe sections. Some sections of code should be able to self-replicate or migrate, to take advantage of resources rather than create bottlenecks. That would require greater emphasis on abstraction and adaptability, rather than validity or correctness.
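A minimal sketch of that safe/unsafe split, with hypothetical names and Python standing in for the real low-level code: the "unsafe" layer is fast and assumes its inputs are already valid, while the "safe" layer is the only way in and establishes validity first.

```python
import struct

def unsafe_write_record(buf: bytearray, offset: int, value: int) -> None:
    """Fast path: no bounds or range checks. Caller guarantees validity."""
    struct.pack_into("<I", buf, offset, value)

def safe_write_record(buf: bytearray, offset: int, value: int) -> None:
    """Managing layer: admits only valid data into the unsafe fast path."""
    if not (0 <= offset <= len(buf) - 4):
        raise ValueError("offset out of bounds")
    if not (0 <= value < 2**32):
        raise ValueError("value does not fit in 32 bits")
    unsafe_write_record(buf, offset, value)

buf = bytearray(8)
safe_write_record(buf, 4, 0xDEADBEEF)
print(buf.hex())  # → 00000000efbeadde
```

As long as the unsafe layer behaves correctly on valid input, and the safe layer provably emits only valid input, the composed system stays stable, which is exactly the guarantee described above.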
No single language can handle this level of versatility. All languages obtain specific characteristics through constraints and freedoms. This means you need superior linkage between languages and optimization that takes into account that different paradigms are used to solve different problems and that there is insufficient data to optimize at compile time, that it has to be done at link time.
Hackers of the world, unite!