... skynet lives and it is testing its metal...
Retention time in 2003-era flash is tens of years. Retention time for the latest 25nm flash is measured in a year or so, and much less if you wear it out. Your 8MB SD card likely hasn't had the level of cycling needed to see reduced data life.
It is one device to the user. It is a metric boatload of NAND flash under the covers. But then again, all flash drives are some fraction of a metric boatload of NAND parts under the covers.
I hate having to write out-of-context, complicated, error-free programs on them. I can't rely on muscle memory for the syntax my fingers would handle automatically if I were typing, and my normal brain-to-output pathways are unavailable.
When I ask the question, I don't expect perfection. Or error free. I expect someone to stumble through it. It gives me a chance to observe them in ways that they aren't normally observed. It gives me insight into how you think and how you approach the problem. Those things are more important than if you get your ';' right or not: the compiler will tell you when you botch that. It will also show me how you react when you make a mistake, how well, or poorly, you take criticism and how well you can communicate with me, what your style is, etc. There's a lot more going on in the interviewer's mind than playing 'cc'...
I have never, not once written on a whiteboard at work.
Then you are a loser and I don't want to hire you. Your attitude sucks.
I've been using whiteboards all my professional life. I have to use them to explain ideas to others, and have others explain them to me. If you can't express a simple idea of, say, implementing an in-order linked list insertion, then you're useless for my team. How can I expect you to explain the complicated algorithm you are working on? How can I expect you to give an informal talk about your latest work to the team? How can I expect you to socialize ideas that you have to other engineers if you can't whiteboard them?
When I ask candidates to code for me, it shows me how they think. I don't care about every ';' being in the right place, or if you misspell strtok as strtoken. I care about whether you can clearly explain what you are doing and walk me through your thought processes. If you don't know, say so. If you don't know and try to BS me in the interview, you'll try to BS me when I ask why your code is late or broken. The whiteboard programming for me is more about how they approach things, how they think through them, how they test the code to make sure it is right, how they weed out bugs, how they respond to my "what if someone passed in NULL here?" and so on. They don't need all the answers right, but they do need to demonstrate they can think on their feet and take the right sorts of approaches to things.
And besides, you'd be surprised how many people can't write simple in-order insertion code. Or reverse this list. Or count the number of 'w' characters in a string passed in. Or, well, you get the idea. While I like to have "hard" questions, I rarely get to them because these simple ones catch out so many people so badly that I end things early. I make things hard because I want to judge you on a scale of 1 to infinity. When people complained about a calculus teacher giving really hard tests, he responded, "Well, I don't want to make them too easy. After all, everybody in this room is taller than this pencil, but that doesn't tell me anything useful about them if it's the only metric."
"Honey, does this security system make my ass look fat?"
Actually, the work is transformative. The form of the facts was changed from the form in the book to the form in the database. Creativity was certainly involved there.
But that misses the point. It is 100% legal to copy a phone book verbatim, with no transformative work, because it is just a table of facts. It isn't even clear that the tables at issue here qualify for copyright protection at all, since they are very similar to the telephone numbers and addresses listed in a phone book.
Actually, it will. I've read data off of hundreds of old 3.5" floppies over the years using recovery programs like recoverdisk from FreeBSD and GNU ddrescue. Of those, maybe two dozen needed a large number of retries, on the order of 1000, before I was able to read the data.
A couple I've not been successful with on the first drive, but I've been able to read the troublesome sectors by retrying them on other drives enough times.
Maybe I've been lucky. I used to believe that if you couldn't read the media after 10 retries, just give up; it is gone forever. But I accidentally left a disk running for a weekend once and found from the logs that it recovered all but 10 sectors on the first or second try, 5 more on the third, 3 more on the fourth, one on the 453rd try and one after 894 tries.
I've recovered hundreds of floppies over the years. Here's what I've done to good effect.
(1) Find a machine with a floppy drive. If this machine hasn't had its floppy used in a while, either read/write a bunch of disks, or get it cleaned/aligned. I've opted for the former with good effect, but drives are getting old enough now that the latter may be increasingly necessary. For older 5.25" drives, I'd definitely try to clean the heads, but be sure to do research so you don't grind the heads away by using the wrong methods. The reason I start by reading/writing a few new disks is that it gives you a chance to see whether the drive is working on disks that don't matter. It might also have a minor cleaning effect, removing oxides accumulated while the drive sat unused, but I'm unsure if that's what's going on. I have used different drives when the first tests failed, but never paid to have the broken drives fixed; there are just too many surplus floppy drives around. It also helps to have multiple drives.
(2) I have used both GNU ddrescue and recoverdisk; the latter is included with FreeBSD. Both will incrementally read the disk and optionally write out a log of what's been read. Both programs try to read as much data as possible in large blocks, then switch to smaller reads for the damaged areas, to get as much data off as quickly as possible with as few read-head passes as they can. Having said that, oftentimes there are a few stubborn sectors that just need to be tried a lot. For ddrescue, you may need to crank up the retry count to 1000 or more; recoverdisk does this automatically. People have sworn some disks were totally unreadable and placed them in my hand to do something with; I've been able to recover most of them by retrying between 100 and 1000 times. When that fails, and it has in maybe 2 or 3 of the hundreds of disks I've done, I've taken the log of what had been recovered to a different machine with a different drive and tried to read the (usually 1-4) missing sectors there. This hasn't failed me yet for disks that are hard to read merely because they are "old." My experience has been more concentrated on 3.5" floppies than the older 5.25" floppies, though. Different rules may apply there.
I guess I should caveat the above advice with "for disks that are just old." For disks that have been damaged over the years, or have had magnets run over them, and so on, all bets are off short of "extreme" options that might not even work.
Many of these techniques also work for reading damaged audio CDs, DVDs, etc.
for (i = 0; i < 10; i++)
is semantically the same as
for (i = 0; i < 10; ++i)
This is what K&R has brought us. Of course, the reason for this preference is that the PDP-11 had a post-increment addressing mode as well as a pre-decrement one. So you'll see more --foo than foo-- in old-time code. For simple ints like the above, of course, it doesn't matter one whit. But for loops like:
while (*dst++ = *src++) ;
you get much better code on a PDP-11 than the nearly equivalent:
*dst = *src;
while (*++dst = *++src) ;
because the former's data movement is just two instructions, while the latter's can be up to 6. Then again, this loop kinda disproves the usefulness of the ++foo that the parent to this reply expounded. There's really nothing more "logical" about it. It isn't until you find yourself in C++ land that you might think so, since overloading operator++ is a lot easier for pre-increment than for post-increment.
So there you have it. The main reason for foo++'s prevalence in K&R is due to the historical accidents of PDP-11 addressing modes and stack growing direction.
Since grades are supposed to be approximately normally distributed, no technology should raise grades. They should remain the same: approximately normally distributed. The real metric would be "can Johnny read better" or "can Jenny do math better" not "are their grades higher?"
Oh, wait, forgot about that stupid grade inflation thing where we're making the tests easier and not changing the grading curve to match...
Power and power variation. To get enough power out of a piston steam engine, you need high pressures, which steam systems are lousy at. Driving a turbine to generate electricity can be done at lower pressures, and at more constant pressures too.
This is pure arrogance. You aren't *THAT* good. Code is more than inputs and outputs. You should be judged on how you do it. It has been my near universal experience in the last 30 years of reading/reviewing code that the people most opposed to code reviews tend to be the producers of the worst, hardest to maintain code in the tree.
There are two FreeBSD Ports committers who have done more:
20110412 ok 1 16978 miwi
20110517 ok 2 13027 pav
I have the ATTANSIC L1 on one of my systems, and it works great on FreeBSD 8.1.
Unix soit qui mal y pense [a pun on the motto "Honi soit qui mal y pense": shame on him who thinks evil of it]