Comment: Not such a bad idea... (Score 5, Funny) 194

by jmerlin (#41793041) Attached to: Industrial Control Software Easily Hackable

Worse, many of these systems are unnecessarily connected to the Internet, which is a terrible, terrible idea.

Now you're just being paranoid. Instead, you should develop an artificially intelligent system to defeat would-be attackers and malicious software. That sounds like the best idea.

- Skynet

Comment: Re:I think that's all college students (Score 1) 823

by jmerlin (#41790903) Attached to: Ask Slashdot: Rectifying Nerd Arrogance?

As moderators have decided to upvote a troll AC, I suppose I shall respond.

Are you sure, because this is the most arrogant post in the thread.

In what way?

You seem to be doing a good job of feeling superior to all the people who chose to apply themselves in school, or who didn't need to apply themselves to do well. You don't seem to make the connection that your choice to blow off most of high school probably impacted your ability to get a scholarship to MIT or Stanford, and thus drove you to get an education at a place where you admit "the CS curriculum there wasn't challenging".

Excuse me for a moment, I'm going to recount facts. This isn't arrogance; it's historical recollection. I didn't do a lot of homework, and that which I did do was done within 15 hours of it being due. I didn't try and was still in the top 10.2% of my class (ranked 33 out of 322), and still managed to qualify for a full scholarship to a state university. I spent almost all of my time applying myself to things I cared about: programming, hacking, etc., which were not merely recreational and pushed me intellectually, which is far from "blowing off" my education. You seem to insinuate that I did "poorly" because I didn't apply myself, and then further the insult by pointing out that there are people who don't need to try to do well. Statistically speaking, I fit very firmly in the latter group. I did group work with people who were in the top 2% of my class. They didn't understand basic math and didn't take any specialized courses (like CS, which was entirely optional), and almost all of their time was spent doing coursework. I appreciate the effort to excel in school, but spending time trying to perfect work in a field you find completely uninteresting isn't a useful endeavor. Perhaps they found all of those fields interesting, I don't know, but I didn't.

As for your quip about Ivy League schools, I looked into the matter. The scholarships available were not total, and would have required relocation to areas with significantly higher costs of living. The people I knew in High School who went to these schools had very wealthy families, and no one was awarded better than a partial scholarship. If I had managed to score a 100 in every course, earned a perfect SAT/ACT score, done something astonishing like win a national programming competition, and had some pretty epic extracurriculars (as hacking by yourself wouldn't count), it might have been possible to get a completely full scholarship to MIT or Stanford, but I'm no such intellectual monster. So in the real world, as a real person, not hiding behind AC, it was cost-inefficient for me to pursue these schools. My SAT/ACT scores (1330/34) were in the top 1%, and my GPA, when I actually looked it up, was 3.84. I definitely qualified for some lower-level scholarship at MIT or Stanford, but as a guy who, on a full scholarship, living in an apartment that cost only $400/month, worked his way through college on an $8/hr salary at 20 hours a week without ever having more than $2,000 to his name, even paying 20% of a $50,000/yr tuition (not counting the cost-of-living differential) would have left a debt that I would STILL be paying off today. I have no debts today, I own a nice car, and I have sizable savings already. Weighing what could have happened against what actually happened, my decision has worked out well. You are correct that I missed out on networking opportunities, but I have done fairly well in that regard too, in spite of not gambling my future.

Also, even though my school wasn't challenging in CS, I still pursued challenging material. I acquired texts used by MIT in their recommended undergrad course schedule and consumed everything in them and everything that was relevant on OCW, and this took years. It isn't, strictly speaking, necessary to attend a school like MIT to learn the same.

Comment: Re:I think that's all college students (Score 1) 823

by jmerlin (#41772813) Attached to: Ask Slashdot: Rectifying Nerd Arrogance?

This is hilarious. Does this count as cognitive dissonance?

Why would it be? I assume you're confused as to why I would consider myself a valedictorian in some form while simultaneously making a statement implying that position is not much of an accomplishment. Maybe I wasn't clear, and I apologize for that. To be clearer, I didn't think myself smarter than my peers (unconditionally), but I felt that a general measure of academic success was meaningless and as a result, the "actual" valedictorian was a meaningless position. Instead, comparing individuals against one another on a per-field basis would provide a more useful result, and that is the basis of how I compared myself to others. In that case, specialization isn't punished, and there isn't just a single person who's "best" in some manner. How things work right now is much like asking: who's the best athlete? Who's the best programmer? Who's the smartest person? What's the best sorting algorithm? What's the best data structure? The answer to these is always the same: "there isn't, we have specializations." The Olympic Games don't have a single category: best athlete, do they?

These days, I don't much care, outside of the fact that children are still being subjected to such broken systems. While I believe a general well-rounded education is important, I don't think that someone's performance in a completely unrelated field should be relevant to their specialization when seeking a job or admittance in higher education, at least not for technical ones.

Comment: Re:First post! (Score 2) 403

by jmerlin (#41771217) Attached to: Microsoft Releases Windows 8
I haven't used Windows 8, but I do have to admit that I like the simplicity of the Metro UI look and feel. Not necessarily how the UI is functionally constructed in Windows 8, but for a web interface, it's pretty solid IMO. The Windows 8 standalone apps I've used on Windows 7 have been pretty good. I suppose if it were just Windows 7 with less focus on Aero and more focus on Metro-style minimalistic display, I'd be quite happy (none of this full-screen Metro nonsense). When I install Windows 7 on my beastly gaming machine, I turn the visuals to best performance and disable Aero. I still maintain that Aero is, and was, a stupid idea, and I find the fact that Linux desktop distros are heading down the Aero path simply horrifying (wtf @ compiz, seriously?).

Comment: Re:I think that's all college students (Score 5, Interesting) 823

by jmerlin (#41768097) Attached to: Ask Slashdot: Rectifying Nerd Arrogance?

I will recount my experience.

Spend high school overachieving (probably at the expense of social development)

I didn't overachieve in High School because I realized how pointless an effort that was. There were only two things I cared about in High School: Computer Science and Math. I realized this by about grade 5. At that time, there was a little orange book called "Games in Basic" on my teacher's bookshelf. I picked it up and started reading it one day and was fascinated (we had a PC with Windows 3.1, and I could easily boot into DOS and code up basic games). She saw me reading it and said, "I bought that thinking kids might like it, but nobody but you has ever read it, so you can have it if you want." So I took it home and went through all of the exercises in it (just basic word games: input a number/word, output a response, etc.). At that point I was hooked. When we finally upgraded to Windows 95/98, I started playing around in VB, eventually installing my father's copy of VC and learning C. This is where my time not in school was spent (split between that and playing games). I quickly realized I enjoyed this more than just about anything else, and so I did it. I taught myself VB, then C, then x86. By the time I could actually take a CS course in High School, I was a junior, and it was an entry-level Java course. I still learned things -- data structures and some algorithms -- but I was already quite familiar with the majority of the syntax and other material. In those two subjects I cared about, I maintained a 95%+ average. I didn't apply myself in History, English, the other sciences, or any of the nonsensical electives we had to take. I saw no reason to, and I didn't care that I was just outside the top 10% mark in my school; nobody I knew was as good at Math or CS as me, so as far as I was concerned, I was the valedictorian. When I later spoke with people in the top 1%, including the actual valedictorian, the arrogance they exuded was astonishing, as if they had accomplished something worthwhile.

work hard and get into a great college, get knocked down a peg when you realize that you're either somewhere in the meaty part of the curve among other prospective engineers, or that you'll actually need to *try* in order to get that A for the first time in your life

I didn't work hard to get into a great college, but I still managed, even with my crappy GPA (something like 3.4 in HS), to get a scholarship to a local university. I really wanted to go to Stanford or MIT, but the money just wasn't there, and a huge student loan wasn't something I could justify. So I majored in CS, an obvious choice, and figured that this 4-year degree would do nicely in the real world, where experience is more important anyway. I realized pretty quickly that the CS curriculum there wasn't challenging. I could read through the texts and learn what a course would teach me in a few days, and would end up bored sitting in a course going at a snail's pace for the rest of the semester. On the other hand, math courses were actually quite challenging. So three semesters in, I switched from a CS major to a Math major and still took the interesting CS courses as my electives (compilers, AI, operating systems, etc.). The math courses were a fair bit more difficult, especially the more abstract ones, but the only time I actually had to really try to get a decent grade was when I finally started taking graduate courses. There's just too much information to keep in one's head to fully understand why a proof is valid (it doesn't just span that chapter in that book, nor even that entire book, but rather the past three years of courses of abstraction). Needless to say, in my spare time I was still hacking around in CS, and my brain was already prioritizing CS-useful math (including things like Abstract Algebra, Number Theory, Probability, etc.), but the rest was reserved for actual CS work, so I wasn't too interested in pursuing an M.S. in math. No CS course I ever took was difficult, but I do suspect that at MIT or Stanford I would have been really challenged to make a good grade -- maybe one day I'll get around to getting an M.S. or a Ph.D. from either.
It was at this point that I started reading the references in my texts, finding things I was particularly interested in, following references on Wikipedia, and reading articles in the peer-reviewed journals my university subscribed to. I very quickly realized that what I knew was like a single point on the 2D plane, and similarly, what we all know collectively is merely a single 2D plane within 3-space. It's astonishing when you actually open your eyes.

once you do succeed (or maybe just fail to fail) you graduate college thinking you're ready to take on the world... enter the business world and realize that the fancy education you paid so much for is only good enough to get your foot in the door...come to the realization that respect is earned by experience and demonstrated value... spend a few years building up credibility and expertise, then realize that being a manager (or director, or VP, etc.) requires some serious people skills (remember all those parties and extracurricular activities you skipped in high school in favor of hacking and video games?) and either choose to stay on the individual contributor path and hone your skills to guru level or take the plunge and start educating yourself (both formally and informally) in how to effectively manage a bunch of cocky engineers.

I'm still in the initial stages of the business world, but I've already realized I was lucky enough to be correct when it came to my educational choices: the school doesn't matter, and after a few years, your experience trumps it anyway. I have also already made a decision, and that's to never become a manager. I don't want to manage people; I want to be an engineer. Any company where the only way to "advance your career" is by changing your role entirely from engineer to manager is, by definition, not a company I want to work for. It's too bad that vision is so pervasive in this culture. Hackers want to hack. We don't want to go to meetings, read thousands of e-mails, or talk to "senior managers," "VPs," or "C*O"s outside of showing them our awesome hacks. That's all. I don't think I have a lot of arrogance these days. I definitely did when I was younger, but sometime around 22-23 a switch flipped, and it was gone almost overnight. Maybe it's the change from looking at everything subjectively to looking at everything objectively. Maybe it's the realization of how little you actually know. Maybe it's the realization that you no longer have to demonstrate that you're "smarter" than other people (a mentality put in place by schools via grades and grade curves; we are competitive beings). Maybe it's because now, when I look for information, I have to wade through a sea of it to find the little drop I care about, versus consuming all the information in my textbook and thinking: "Is there more? Do I know everything about this topic now?" Maybe it's a combination of all of those things -- but I don't think there's a "fix" for it, other than letting someone eventually figure it out.

Comment: Re:The PC is dying claims are made every few years (Score 2) 291

by jmerlin (#41754517) Attached to: The Greatest Battle of the Personal Computing Revolution Lies Ahead

We had a bunch of horseless carriages designed before the Model-T too. It just needed the right situation to get them to kick off.

I have to point out that, while I admire the appeal to cars, this is a flawed analogy. The horseless carriage was capable of reproducing every single feature of a horse-drawn carriage (except the pooping), which made it an obvious successor. There is a myriad of things that touch-based devices can't do that a mouse/keyboard can. And there are a lot of things a tiny-form-factor micro-PC with an attachment port for a mouse/keyboard can't do that a full-blown desktop can (e.g. power a 30" 2560x1600 monitor at 100+ FPS on the latest games). Even with Apple trying to bridge this with their new uber-thin desktop computer, the specs on it are pretty awkward for a computer built in 2012, and at a price starting at $1,700, even with essentially their Thunderbolt display built in, it's a pretty second-rate system, still not capable of doing the things many people need a full-blown tower for. In short: I'm saying that these devices, while they have obvious and great uses, do not fully reproduce the capabilities of PCs, and these hybrid PC micro-devices don't fully reproduce the capabilities of their larger brethren. So the end result is that the PC is a completely different market, and no new micro-device is ever a sign of "the end of the PC era."

I suspect that until some major scientific breakthroughs occur, it's not even possible for a small-form-factor PC to replace a full-blown tower-based PC. Even if Moore's Law continues for another decade, there's a physical limit to how much computing power you can shove into a hand-held or ultra-thin device. Compare that with a desktop that has a monstrous 2,200+ cubic inches of space for hardware: if you can shove 10x-50x the horsepower into that space, the PC has the capacity to behave like a local supercomputer in comparison. While you might argue that it's possible to use the "cloud" for computationally expensive work like that (and that's an argument, for sure -- a little awkward, but I suppose it works), the problem then lies in latency. We're limited, literally, by the speed of light, a limit which does not exist when the box is sitting under your desk. So there's a use case that will never be supportable in a cloud system: real-time gaming on nearly photorealistic engines with extreme physics support. This was tried, and the implementation failed for that very reason (it's the latency, not the bandwidth, and that's a physical limit of our universe, so it will never work until FTL communication is discovered).
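To put rough numbers on that latency claim, here's a back-of-envelope sketch. The distance (1,500 km to a datacenter) and the ~2/3-of-c propagation speed in fiber are assumed illustrative figures, and the calculation ignores routing, queuing, and render time entirely, so the real gap is worse than shown:

```python
# Back-of-envelope check on the speed-of-light argument.
# Assumed figures: light in fiber travels at roughly 2/3 of c
# (~200,000 km/s), and a "nearby" datacenter sits 1500 km away.

C_FIBER_KM_PER_MS = 200.0   # ~2e8 m/s, expressed in km per millisecond
DISTANCE_KM = 1500.0

# Hard physical floor on round-trip time: out and back in a straight
# line, with no routing, no queuing, and no server-side render time.
min_rtt_ms = 2 * DISTANCE_KM / C_FIBER_KM_PER_MS

# Frame budget for 100 FPS gameplay.
frame_budget_ms = 1000.0 / 100.0

print(f"minimum RTT:  {min_rtt_ms:.1f} ms")
print(f"frame budget: {frame_budget_ms:.1f} ms")
print("cloud render fits in one frame?", min_rtt_ms <= frame_budget_ms)
```

Even under these generous assumptions, the physics-imposed floor alone already exceeds the frame budget, which is the point being made about cloud gaming.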

Comment: Re:Simple answer (Score 1) 360

by jmerlin (#41749253) Attached to: Ask Slashdot: How To Avoid Working With Awful Legacy Code?
Every day at 5, I rm -rf *. I don't even have legacy code that's a day old. When people say "it sucks when you read your own 6 month-old code and have no idea how it works or what you originally did," I have no idea what they mean, because I just remember how I solved the problem and recreate it. I love being WET (reWrite Every Time).

Comment: I just had to respond to this inflammatory remark. (Score 2) 866

by jmerlin (#41683991) Attached to: Parent Questions Mandatory High School Chemistry

There’s a concept in economics called 'opportunity costs,' which you may not have learned about because you were taking chemistry instead of economics. Opportunity costs are the sacrifices we make when we choose one alternative over another. ... When you force my son to take chemistry (and several other subjects, this is not only about chemistry), you are not allowing him that same time to take a public speaking course, which he could be really good at, or music, or political science, or creative writing, or HTML coding for websites.

We took economics. We also pay attention to the world. For instance, we see public speakers all the time who make absurdly stupid statements regarding scientific fact (as in: getting it blatantly wrong) because they don't have a basic understanding of physics, chemistry, biology, etc. These courses in High School aren't meant to be doctorate programs. They aren't significant time investments. They're meant to provide a basic understanding of what we know about our world -- to give people a well-rounded base of knowledge.

What happens, though, when a person finds something they're really good at or enjoy immensely, is that they do it. A lot. Outside of class. If someone truly enjoys public speaking, they will be doing it more outside of class than inside it. Learning more facts and becoming smarter with respect to the world around them is not going to inhibit this. If they're fascinated by political science, they're going to be on Wikipedia, at the library, or even talking to local municipal leaders, if the parent is capable of supporting their child to that end (here: you are the bottleneck, not your child's chemistry course). If they enjoy creative writing, they're probably going to be writing. Look at Harry Potter. What do you think mixing ingredients into a pot and getting some magical result corresponds to in the real world? Chemistry. If you've learned even basic chemistry, it feels like magic. That's a good motivator for creative writing: being inspired by and in awe of something, even if you don't care to learn how it works in every detail. And HTML coding (it's not coding -- HTML is a markup language, so it's more like writing) is the same thing. Students specialize outside of school. If they are talented enough, they can even stop there, but being more well-rounded never hurts a person. It's a few courses in High School, before most students even become lucid to the world around them. It's a background, not a lifetime investment.

Comment: Re:Wrong. (Score 1) 112

Incorrect. The court chose to misinterpret the function of copies in the cloud. An SMTP server makes a backup copy of a transmitted electronic message so that a user can view it at a later time. If the server did not create this backup, the message would simply be lost forever. This is very similar to how phones work: if you answer your phone and walk out of the room, whatever the other person says is lost forever to you. Hence answering machines were invented, to allow users who are not present at the time of a phone call to receive a message. The function of the copy made is to be a backup. Both email and phones allow a user to be present at the time of transmission and receive a message; however, it turned out to be useful to implicitly create backups, and so we've come to rely on that behavior. That does not, however, mean that those copies aren't backups. These backups are consistent with what the law says is protected. The court, out of ignorance of the technology, or in willful incompetence, decided that backups aren't backups in this case. They have decided that electronically stored data is neither electronic nor stored. What's next?

Comment: Re:Courts cannot fix faulty statutes (Score 1) 112

I think we must conclude that this court was horribly mistaken both in its understanding of what this text says (poor interpretation) and in what the "copy" in the cloud represents (poor understanding of technology). Consider, for a moment, the telephone. Few would dispute that this definition of "backups" applies to the use of a recording device to record a telephone conversation. Now, what happens if one end simply enables a recording device, leaves the room, and comes back in 20 minutes, when it is certain that the other party has terminated the connection? The message sent electronically through the phone lines was not received by one party, but it is stored. What is the purpose of that storage? Why would someone leave the room? What if they simply answered the phone, said "I'm not here right now," and then left the room? What if, instead of a person, it was actually a device that simply waited for N rings, answered the phone, told the person on the other end that nobody was present to answer, and activated a recording device? We've moved, very basically and very incrementally, from a normal phone conversation in which a recording is VERY CLEARLY a backup as this law states, to a position IDENTICAL TO CLOUD EMAIL STORAGE, which is used EVERY DAY BY MILLIONS OF PEOPLE. They're called answering machines.

What is the purpose of the recording made by an answering machine? It is very obvious to anyone who's ever used a phone -- it allows a person to communicate a message and have it backed up for later use or review. The recording made by an answering machine IS A BACKUP. Had the machine not enabled the recording device (and nothing requires that it do so), the copy transmitted over the wire would be, and this is the important part, LOST FOREVER AND UNRECOVERABLE. It is very obvious, then, that the recording is an implicit backup so that the absent party can later review the message.

Now consider SMTP, the protocol used to send electronic mail. An SMTP client connects to an SMTP server over TCP (think of this like ringing a phone: the other end has to answer, and then you're connected). Then, using this protocol on top of TCP (think phone line), a message (email) is transmitted from the sender to the receiver. The receiver could choose to discard this message; after all, it initially only reads the message into memory. If a user were present on the SMTP server, he/she could see the message as it is transmitted over the connection. But if that user is not present, a backup must be made so that he/she may, at a later, more convenient time, access the message that was transmitted in his/her absence. This is identical to the answering machine. Electronic mail storage, even as a fundamental part of accessing electronic mail, is a glorified answering machine, and electronic mail is just a super-high-speed, highly efficient, yet glorified phone call.
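The exchange described above can be sketched in a few lines. To be clear about what's real and what isn't: the verbs (HELO, MAIL FROM, RCPT TO, DATA, QUIT), the lone-dot terminator, and the reply codes come from the actual SMTP specification (RFC 5321), but the "server" here is a toy that plays back a scripted in-memory session rather than speaking over TCP; the `deliver` function, `mailbox` list, and addresses are all illustrative:

```python
def deliver(client_lines, mailbox):
    """Process a scripted SMTP-like session, appending the message to mailbox.

    The stored copy is the "backup" described above: once the session
    ends, the bytes on the wire are gone, and only this copy lets an
    absent user read the mail later.
    """
    replies = []
    data_mode = False
    body = []
    for line in client_lines:
        if data_mode:
            if line == ".":            # a lone dot ends the DATA section
                mailbox.append("\n".join(body))   # the implicit backup
                data_mode = False
                replies.append("250 OK: message accepted")
            else:
                body.append(line)
        elif line.startswith("HELO"):
            replies.append("250 Hello")
        elif line.startswith("MAIL FROM:") or line.startswith("RCPT TO:"):
            replies.append("250 OK")
        elif line == "DATA":
            data_mode = True
            replies.append("354 End data with <CRLF>.<CRLF>")
        elif line == "QUIT":
            replies.append("221 Bye")
    return replies

mailbox = []
session = [
    "HELO client.example",
    "MAIL FROM:<alice@example.com>",
    "RCPT TO:<bob@example.com>",
    "DATA",
    "Subject: hi",
    "",
    "Call me when you're back.",
    ".",
    "QUIT",
]
deliver(session, mailbox)
print(mailbox)   # the message outlives the session, like an answering-machine tape
```

The point of the sketch is the last line: after the connection closes, the only surviving copy of the message is the one the server stored, exactly as with an answering machine.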

IANAL, and IANAJ, but IASWE. My job, all day, every day, is to read specifications and implement specifications. I feel I can make a judgment here as an expert. Any mail stored as a "copy" in the cloud is, implicitly, a backup for protection against immediate loss, so that a user can later access the mail without having been present at the time of its transmission. It therefore fits the definition present in the letter of the law. I find it incredibly unnerving that such an obvious and logical conclusion has eluded the court.

Comment: Re:Pipe Dream... (Score 2) 74

by jmerlin (#41593141) Attached to: Start-Up Wants To Open Up Science Journals and Eliminate Paywalls

I don't think it's the journals we have to worry about. All we need to do is get the actual scholars to contribute their work in a more direct and public manner. From what I've gathered reading about how companies like Elsevier and others have treated authors following the dissolution of The Journal of Algorithms (good job, Knuth et al!), my understanding is that a handful of people get these journals free of charge, but most universities (including those who funded the research found in those journals) have to pay substantially overpriced yearly fees; they cannot purchase individual issues but are instead required to purchase large collections of journals, most of which are widely regarded as complete jokes.

Further, these journals, from my understanding, pay authors/universities an utterly insulting royalty. The only reason authors submit their work to these journals is that everyone already buys them, so publication is guaranteed publicity and looks good on paper; and since publication is a year-long endeavor, it makes actual research incredibly inefficient (leading researchers to just create mailing lists and keep each other informed about their work). I think we, the collective internet, have demonstrated that we can provide a better platform for this in every regard. With open peer review, instant publication to millions of people, and reasonable per-article prices (like $0.50, not $30) going primarily to the authors and reviewers, we can do better.

I don't like the solution in TFA. I think it's a tiny step, maybe even just a shuffle, in the right direction, if that. It reads very much like these two are just trying to take over the entire industry by placing themselves at the very top of this massive money machine as the middlemen, much like Apple did with iTunes and the App Store, rather than actually solving the underlying problems. They argue, in effect: "with better accessibility, the same model will work, and we can even drive down prices by being a little less greedy than Elsevier." Nonsense. We neither need nor want a middleman. Much research funded by tax dollars should be made publicly available at no cost, and the rest should be available at a reasonable cost to EVERYONE, not just other researchers. And if you completely cut away the operating costs of middlemen and walled gardens, the prices could be astonishingly low while still providing very good supplemental returns for universities and all contributors. This proposed "solution" should be rejected for the very same reasons the existing publishers are being rejected.
