Sorry, but it has absolutely nothing to do with the real world. They're giving twenty people (ten in the experimental group, ten in the control group) 30 minutes to do a bit of analysis. And they measure minutes to apply a few changes, without any qualitative measure of how the code is growing. There's very little proof that the refactoring they did made any sense, the sample size is so low you'd never get reliable results, and pretty much all you can conclude is that refactoring doesn't make hack jobs easier. I never thought it did; a hack job just involves finding the place something's happening and hacking it. If it's a good idea, well... it works there and then.
Well, somebody has to be the first to discover something before somebody else can confirm it. And yes, in human terms it might take a while to build another billion-dollar project to do that. Science works on incomplete information; otherwise there wouldn't be anything left to do science on. Has anybody independently verified the Higgs boson yet? All the exoplanets discovered recently? Probably not. That's how it will always be at the leading edge of science.
Once the country tips though, there will be a short and intense period of violence that I hope stays contained within the country, but I fear will spill out to the south. Once that is over, North Korea will be split into two parts, one unified with the south and a portion annexed into China. I have no idea where the split will be.
Somehow I find that implausible; I expect China to take the whole country or not at all. South Korea would be too worried about a conventional or nuclear counter-attack on Seoul to do much of anything, while China could probably swoop in and install a new authoritarian regime that by NK standards would seem like heaven. All they need to do is bring them into the 21st century. After that I think it'll be a bit like Life of Brian:
Reg: All right, but apart from the sanitation, medicine, education, wine, public order, irrigation, roads, the fresh water system and public health, what have the Romans ever done for us?
Attendee: Brought peace?
Reg: Oh, peace - shut up!
Reg: There is not one of us who would not gladly suffer death to rid this country of the Romans once and for all.
Dissenter: Uh, well, one.
Reg: Oh, yeah, yeah, there's one. But otherwise, we're solid.
So the new spec removes the compiler front end from the graphics driver, greatly improving performance. Only the compiler back end is present in the graphics driver.
However, the IR instructions are probably much simpler than the source language; for example, Java has tons of classes but only ~200 opcodes. It would make graphics drivers not quite, but a lot more like, CPUs running "assembler-ish" code instead of being huge graphics libraries. Basically you're moving most of what's OpenGL/DirectX today over into the application. Stallman might not approve, but it might mean more AAA games being able to run on a thin Vulkan shim rather than a full Mesa stack.
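The "rich source language, tiny IR" point is easy to see by analogy with Python's own bytecode (this is an analogy, not SPIR-V itself; exact opcode names and counts vary by Python version):

```python
import dis

# A toy "shader-like" function: arbitrary source-level expressiveness...
def blend(a, b, t):
    return a * (1.0 - t) + b * t

# ...compiles down to a short sequence of opcodes from a small, fixed set.
dis.dis(blend)

# The entire opcode table fits in one byte, no matter how rich the
# source language above it is:
print(len(dis.opmap))
```

A driver back end consuming an IR like this only has to handle a couple of hundred simple instructions, instead of parsing and validating a full high-level shading language.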
They've come full circle:
1. AMD announces Mantle, a low level graphics API which may give consoles an edge over the PC.
2. Microsoft panics and announces DirectX 12, aiming for pretty much the same thing.
3. Khronos Group panics and announces Vulkan, aiming for pretty much the same thing.
4. AMD announces there'll be no public SDK for Mantle; use OpenGL/DirectX.
So in the end we'll probably have feature parity again. How important that is remains to be seen; outside of draw-call benchmarks it's unclear how much real-world difference it makes. What is certain is that it exposes a lot more of the complexity to the developer. That of course gives you more room to optimize, but it remains to be seen how many developers will be able to take advantage of it.
On the bright side, it might actually mean there's less code that needs to be written, and that open source might catch up a bit. The announcement says it'll run on top of all platforms that support OpenGL ES 3.1, which might become a much bigger target than OpenGL 4.x.
Windows users still need to enable file-extension visibility manually, even though email-borne viruses depend most on less savvy users who will never do this.
6. Records and Audits
You agree to keep accurate books and records related to your development, manufacture, Distribution, and sale of Products and related revenue. Epic may conduct reasonable audits of those books and records. Audits will be conducted during business hours on reasonable prior notice to you. Epic will bear the costs of audits unless the results show a shortfall in payments in excess of 5% during the period audited, in which case you will be responsible for the cost of the audit.
In an actual zombie apocalypse I think my list of threats would be:
1. Opportunistic bastards (thugs, gangs)
2. Desperate bastards (hungry, cold, afraid)
3. Devious bastards (poisoned, stabbed in sleep)
4. Survival skills (and fighting for the good spots)
I have no doubt that the Atom X3 is going to make it cheaper to put an x86 into an LTE-capable tablet/phone. And Intel gets paid for the modem instead of a third party, so it's a big advantage for them.
Not really. The X3s are all made with third-party GPU and modem functionality at TSMC. It's a bought-in design to which they add a CPU and a brand, to pretend they're competing in a market they're really not in. The X5/X7s are Intel's homegrown solution with their own graphics and LTE modem, aimed only at the premium segment. You will not get Intel tech for cheap.
Back on x86 legacy advantages: the other issue I'd raise is that there are quite a few "tablet operating systems" languishing in "Not Android" land that might thrive if more x86 hardware comes out. The stuff Ubuntu and GNOME are trying to make work might, for example, end up turning into something very, very powerful if they can get the UIs fixed and a surfeit of x86 tablets appears.
Why would that help them? Not to be cruel, but being open source and only an ARM recompile away was supposed to be their big boon, and with so much running on Android, that's likely the primary source of ports. If they can't make it as a first-party OS, they'll never go anywhere in a mobile world full of custom, poorly supported, one-generation hardware. I don't see any compelling reason why you'd buy a Linux/x86 tablet if Linux/ARM tablets don't sell, and the aftermarket-install market is probably just as tiny on the x86 side as on the rootable ARM side.
And I could see a longship having a piece break off after getting shot at, and having that debris end up in just the right spot to clog the sub's engines or torpedo bays or something like that. Sure, it's statistically unlikely, probably not even a 1/1000 chance of actually happening, but for the sake of gameplay I can accept it.
At that point you're better off imagining the sub had a critical weapons malfunction and blew itself up, so the longship wins by walkover. Or that the warrior sneaked into the riflemen's camp and poisoned their water supply.
Weren't people saying the same sort of things when the "assembly line" was first invented? After all, the main purpose of the "assembly line" was to make the same amount of stuff with far fewer workers than had been needed previously.
Well, first off, you're not looking back far enough. During the first industrial revolution there was massive unemployment as machines replaced skilled artisans and craftsmen with cheap, expendable factory workers who could receive minimal training in their one task on the line. The assembly line actually comes very late, in an already mostly industrialized society, and an old-fashioned manual assembly line still employs a considerable number of people. And Ford famously doubled wages to improve retention, because assembly-line work was actually getting complex and needed trained workers.
This time we're not just dividing and rearranging the way workers produce their product; we're cutting humans entirely out of the equation except for meta-roles like designers, developers and repairmen. For example, take the banking industry: it used to be huge, with branch offices all over the place. ATMs were the first blow; now online banking has reduced it to next to nothing. I just checked the figures for one bank I know: 250 FTEs (full-time equivalents) supporting 380,000 customers.
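For scale, the figures quoted above work out like this (a back-of-the-envelope check using only the numbers in the comment):

```python
# Figures from the comment above: one bank's staffing vs. its customer base.
customers = 380_000
ftes = 250  # full-time equivalents

customers_per_fte = customers / ftes
print(customers_per_fte)  # 1520.0 customers served per employee
```

Over 1,500 customers per employee is the kind of ratio no branch-office model could ever approach.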
Think about it: in how many service industries is the human staff actually the service? When I go to the grocery store, what I want are the groceries. I don't care if robots automate the whole shop as long as they keep delivering the same service and quality. When it comes to water/sewage/electricity/internet etc., I'd rather not deal with them at all; I pay a bill and it works. But if a lot of those jobs disappear at the same time (and I don't mind seeing them go) while I'm still paying nearly the same for the robot/self-service version, there won't be much left of my paycheck to pay for whatever new jobs these people have found.
If anything, what I want is for my DE not to be based on a major toolkit. This breaks down when it gets to the file manager.
And the system settings; that one is much more tightly integrated with the DE than the file manager is. And it needs to manipulate the pointer, show context menus, arrange menu bars, etc., so it needs some kind of UI toolkit. I don't quite see what it has to gain by reinventing the wheel; it's not like pulling in Qt/GTK drains that many resources by itself.
At the assembly level it isn't so easy to automate with a lot of the designs. There are flex cables, adhesive, and torque-sensitive screws that all rely on a human being able to manipulate them and then quickly respond to misalignment. To automate this, the design constraints placed on the industrial designs need to change.
I think you underestimate how far sensor technology has come and will go. Here, for example, is automated salmon processing. Obviously there's a lot of natural variation; do we need to bioengineer a more robot-friendly salmon? No. The fish are measured by laser and intelligently cut. Head/tail/other cuts are dropped out to go onto another processing line. Each cut is grabbed by a robot with machine vision and placed in pouches to be sealed. Skip to 3:12 if you just want to see that last part. Fillet-making machines are still in the research phase, but there are examples of those too, using X-rays to scan for and find the pin bones. If they can deal with all that, I'm sure they can apply the right torque to a screw.
And half of those "new generation" searchers won't know half the time whether they've been redirected to a phony site.
Half the "old generation" didn't know half the time whether they'd been redirected to a phony site by a phishing email, either. Anyway, that assumes you're going somewhere worth scamming. Email, online banking, eBay, sure... but in the last 15+ years I haven't seen a single phishing attempt for my Slashdot account info. And for stuff that you just read, what's there to phish? That's why the important stuff is moving toward two-factor authentication, so just stealing your password isn't enough.
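The two-factor point can be made concrete with a minimal TOTP sketch along the lines of RFC 6238 (HMAC-SHA1 over a 30-second time counter). This is a simplified illustration, not a vetted authentication library:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, timestamp=None, digits: int = 6, step: int = 30) -> str:
    """One-time code derived from a shared secret and the current time."""
    if timestamp is None:
        timestamp = int(time.time())
    # The moving factor is the number of 30-second steps since the epoch.
    counter = struct.pack(">Q", timestamp // step)
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes based on the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# At the RFC 6238 reference time T=59 with the RFC test secret, the
# 6-digit code is "287082" (matching the published test vectors).
print(totp(b"12345678901234567890", timestamp=59))
```

A phished password alone is useless to an attacker who never sees the shared secret, because the code above changes every 30 seconds.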
It's the same generation