Comment TFA picture? (Score 1) 320
I'm more offended by the article's picture. Crappy chinese plastic chips and a rounded-off red die?
If you're going to fake a casino "moneyshot" picture, you might want to visit one prior to doing so.
Make a prediction. Any prediction.
If you're wrong, no one will remember, or you can make another prediction about something right around the corner that kept your original prediction from coming true. If, on the other hand, it happens to be right, you get bragging rights and credibility in finding your next job as an outside consultant "expert".
You can't really argue managed code isn't several orders of magnitude slower than native code.
Actually you can, and quite correctly too, unless you don't know the meaning of the term 'orders of magnitude'. A quick Google search turns up a myriad of results proving you have absolutely no idea and are just making baseless claims. Here is just the first one I happened upon.
I suppose you can argue that, if you really are that stupid. Where's the source? I can draw a chart too; you'd be a moron to draw conclusions about language throughput without seeing what's actually being written. It's very easy to write inefficient code in any language and stuff the ballot box for such a "benchmark." If you weren't a mouthbreather, you'd know that.
The defending of managed code because the difference between a user waiting 0.2 seconds versus 2 milliseconds for a window to open
Yet another unsubstantiated random bullshit number.
Correction: a mouthbreather that is incapable of understanding examples and hyperbole. If I were going to use actual numbers, they would be expressed in number of cycles for a given platform. But hey, believe what you want. You dragging down the average intelligence of all humans has nothing to do with me.
You can't really argue managed code isn't several orders of magnitude slower than native code.
I'd be interested in seeing what makes you so dismissive of even attempting to argue your assertion that managed code is (even at the lowest end definition of 'several') at least 1000 times slower than native code.
Try taking a compiler design course at your local university and educate yourself.
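For reference in the "orders of magnitude" back-and-forth above, the term is easy to pin down with arithmetic. A minimal sketch (plain Python; the only numbers used are the ones already quoted in the thread):

```python
import math

def orders_of_magnitude(slowdown_factor):
    """One order of magnitude is a factor of 10; 'several' is usually read as at least 3."""
    return math.log10(slowdown_factor)

# "Several orders of magnitude slower" commits you to a >= 1000x slowdown:
print(orders_of_magnitude(1000))          # -> 3.0

# whereas a window opening in 0.2 s instead of 2 ms (the figure quoted
# earlier in the thread) is a 100x factor:
print(orders_of_magnitude(0.2 / 0.002))   # 100x, i.e. two orders of magnitude
```

So the 0.2 s vs 2 ms example, taken at face value, is itself short of "several" orders of magnitude.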
The other day I caught a bit of a documentary on the zombie craze. It ended with the head of some zombie research institute saying something along the lines that, deep down, they view the zombie apocalypse as a metaphor for any disaster, man-made or natural. The same tactics, supplies, and training you need for a zombie outbreak can also be used to survive another Hurricane Katrina.
It's also a plot convenience to allow the main characters to massacre hundreds of humans without people crying over the race, nationality, or color of the cannon fodder.
While Sparkfun does do some circuit design, judging by the comments on their site, the more their "engineers" worked on a board, the more complaints it got. The boards that stay closest to the sample application circuits are the most reliable.
Also, Sparkfun started a spam campaign a few months ago. Fuck them.
The idea that these platforms and languages (Python/PHP/Ruby/.NET/Java) provide "point-and-stick software development" is fucking retarded; anyone who knows even the slightest bit about them knows immediately that such a statement is objectively false. But that very thing is often said by the sort of people who have no understanding of them, either from choosing to ignore them or from a genuine inability to comprehend them.
Frankly, the runtimes are so lightweight and efficient that if you can't manage to write a GUI control panel in, say
There's really only one case for using high-level languages at all: more efficient development, easier support, and better cross-platform compatibility. But all of that is on one side of the equation. Execution is always going to be slower, and it gets worse the higher up you go.
You can't really argue managed code isn't several orders of magnitude slower than native code. Boxing/unboxing, automatic garbage collection, all of it's other benefits: none of it comes free.
Defending managed code because the difference between a user waiting 0.2 seconds versus 2 milliseconds, for a window that is rarely (if ever) opened, is just good economy if it's going to be cheaper to develop. That isn't really a testament to a programmer's skill; it's a testament to computers being so powerful now that we can afford to waste that much computing power forcing the machine to unravel all these human abstractions, when really it just wants to run machine code.
That said, I assure anyone that the actual business-end (not any human interface) of a video card's drivers is most certainly using C or C++, tops.
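As a rough, unscientific sketch of the kind of gap actually being argued about (my own toy comparison, not anything from TFA): timing a pure-Python loop against the interpreter's C-implemented sum() builtin shows a real overhead for the managed/interpreted path, but as a constant factor:

```python
import timeit

N = 100_000

def interpreted_loop():
    # every iteration goes through the bytecode interpreter
    total = 0
    for i in range(N):
        total += i
    return total

def native_builtin():
    # the same summation, but the loop runs inside the C-implemented sum()
    return sum(range(N))

assert interpreted_loop() == native_builtin()  # same answer either way

t_loop = timeit.timeit(interpreted_loop, number=50)
t_builtin = timeit.timeit(native_builtin, number=50)
print(f"interpreted loop is ~{t_loop / t_builtin:.1f}x slower than the C builtin")
```

On a typical CPython build the factor is a single-digit to low-double-digit multiple: the abstraction costs mentioned above are real, but in this toy case they show up as a constant factor, not "several orders of magnitude."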
It's easier to say.
Plus, it's fairly descriptive: it's almost 4000 pixels wide, and it has 4x the pixel count of full HD.
I don't see the problem here, and I don't think it's "just marketing". People would come up with their own shorthand anyway if it was marketed at 3840x2160.
It's also completely backwards to what the established systems are. Resolution for television has always been specified in the number of horizontal lines. This is a consequence of early CRT raster displays only caring how many times HSYNC and VSYNC are flickered. However many distinct analog values you can toss out on the display wires per line is completely open. This is why we talk about 1080p and not "2K" even though a 16:9 display with square pixels will have 1920 pixels per line.
For TV resolution to be rightfully called 4K, it really should have 4000 horizontal lines.
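The numbers behind the naming dispute, for reference (simple arithmetic, using only the resolutions quoted above):

```python
# UHD "4K" vs full HD, as discussed above
hd_w, hd_h = 1920, 1080     # 1080p: named for its line count
uhd_w, uhd_h = 3840, 2160   # consumer "4K" UHD

# "almost 4000 pixels wide, and 4x the pixel count of HD":
assert uhd_w == 2 * hd_w and uhd_h == 2 * hd_h   # doubled in each dimension
assert uhd_w * uhd_h == 4 * (hd_w * hd_h)        # 4x the total pixels

# ...but by the line-count convention that gave us "1080p", the same
# display would be called "2160p"; a true "4K-line" display would need
# ~4000 lines, nearly twice what UHD provides.
print(uhd_h)   # -> 2160
```

Both sides of the argument are describing the same panel; the dispute is over whether the name should count columns (3840, "almost 4K") or lines (2160, nowhere near 4000).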
"and accident rates haven't significantly changed."
false.
They have been dropping significantly.
Accident rates haven't.
Death rates have.
We have safety technology to thank for that, not cops writing tickets to drivers stopped at a light.
In Florida, we have a toll transponder system too. Recently waves of notices have been going out that the older style transponders are being deprecated for newer ones. I always thought that was kind of silly because the new style transponders are currently compatible with the existing system just like old ones are, so it's not really a "protocol" type change (I'm a software guy, not an EE, so there is likely some RFID stuff I don't know about).
The biggest change? The older transponders would beep when scanned, the newer ones no longer have that functionality. Sounds like perpetual tracking is coming to my state.
People will absolutely find out if their prints are indeed uploaded or stored on their device. Apple knows this; they learned it the hard way a few years ago, when someone discovered the stored geo-data and made an app to show the travel log of any iPhone user.
Did anything change as a result? Did iPhone users suddenly wake up and not use their iPhones? Or switch to Android (ha ha, same privacy concerns, different companies)?
They got caught, took a few licks from the press, but ultimately the future refused to change.
With your bare hands?!?