Comment: Re:Not this shit again (Score 2) 834

by jmerlin (#48366371) Attached to: How To End Online Harassment
Sure. I'll elaborate by correcting your timeline:

1) Dev writes game in 2013 and has people test the game or try it for feedback, including Nathan Grayson (who is thanked in the game's credits, alongside other friends of the dev).
2) Dev is a part of a game jam that fails in late 2013.
3) Dev has a friendship with Nathan Grayson, a writer at RPS at the time (later Kotaku). Grayson publishes an article about 50 games added to Steam, using a picture of Depression Quest, a choose-your-own-adventure text game, as the primary article image (over actual games) and treating it as the number one highlight of the bunch. Grayson does not disclose the friendship, his early access to the game, or any other involvement with the dev or the game. Grayson does not even disclose how he chose the highlights, or whether he played any other game. https://archive.today/iS4Ru.
4) Dev is filmed with friend, Nathan Grayson, in a bed, in a hotel room, at the end of GDC in 2014: https://www.youtube.com/watch?.... Date of recording is at the end of GDC which was held from March 2-6.
5) Nathan Grayson writes a hit piece on GAME_JAM, painting the dev in a very favorable light with choice quotes and plugging her own game jam (from which she profits personally): https://archive.today/0KhZv. Her game is also mentioned. This is published on March 31, almost a month after the GDC video is filmed. No disclosure of their friendship or relationship is made. Totilo had not been informed of anything at this time.
6) Nathan Grayson and dev "allegedly" begin a sexual relationship the next day, April 1, 2014, according to Grayson and Totilo. This contradicts information presented by sources close to the dev, however, and Kotaku claims they were only "professional acquaintances" at the time of publication, which contradicts all available information.

~ None of the previous information, except the publications (and the failed game jam), is publicly known until after the next point ~

7) thezoepost is created in August 2014. In it, the dev's ex-boyfriend describes clearly psychologically abusive behavior by the dev during their relationship and points out that the dev is a pathological liar who slept around, including with people who could help her career, such as Nathan Grayson. He ended it because she could not stop the merry-go-round of "let's make this work" followed by abuse. He advises everyone to be extremely cautious and not trust the dev.
8) Many jokes, including "five guys, burgers and fries" and #TheQuinnspiracy, are born. This is in jest; there is little outrage about the scandal. People begin digging into allegations of impropriety on Nathan Grayson's part, amongst other parties.
9) Discussion around corruption in journalism explodes on major websites such as reddit and 4chan. It is apparent that this is only the tip of the iceberg. People begin stockpiling evidence of other violations, and more still come forward with personal stories from inside the corruption.
10) Mass censorship of the discussion about the entire ordeal, including all discussion of corruption in journalism. Basically everything on every major website is deleted.
11) Huge backlash occurs. IA publishes second video on the topic, mainly focusing on the censorship. Adam Baldwin tweets #gamergate, linking to IA's videos, after the second one is published, responding to the censorship of discussion of the first video.
12) "Gamers are dead" articles are published.
13) GJP is outed and proof of collusion is made public.
14) GJP members begin publishing many articles pushing the narrative that GamerGate supporters are misogynists, racists, and white cis hetero males who just want to keep women out of the industry and out of their games "hobby".
15) Many unaffiliated news outlets (including Wikipedia) take these stories as fact and continue to spread the false narrative. This is not unexpected: studies in the past few years have shown how trivial it is to create a fake news story and get it published on every major news website as fact, a larger example of how little fact checking is actually done by mainstream journalists and news outlets these days.

Grayson is considered to have failed to recuse himself, failed to disclose his conflict of interest to his boss, and failed to disclose his friendship (or more) twice, in both articles. He is widely considered a shining example of the failure of ethics in games journalism, as he still has the job.

Things I didn't bother looking up the timeline for: the dev speaking out against TFYC, the dev creating a Patreon and faking a mugging to get contributions, and other people she's been in relationships with coming forward and corroborating thezoepost's account of her personal behavior. She intentionally tried to kill TFYC's effort, which promoted getting women into gaming; this is where 4chan stepped in and donated to make it happen, and where Vivian James came from. The dev is considered a terrible person by almost everyone who sees these actions. She abuses a person, cries wolf to make money, cheats to get ahead, and starts her own game jam and has it promoted by a lover, with the contributions to make it happen linked to her personal PayPal account. There is zero transparency in this game jam.

Gamergate is not so interested in the moral failures of dev. They are simply musings. And most people agree she's an awful person. However, criticizing someone is not harassment. And there is no action involving dev, only disgust.

Grayson, however, and his employer Kotaku (which is owned by Gawker Media) have received the majority of the attention. Almost all of it. Kotaku has been pressed for Grayson's dismissal, for apologies for the defamatory and blatantly incorrect articles it has published, for disclosure and implementation of ethical policies, and over its ties to known bigots and hate mongers through Twitter and parent company Gawker Media. As a result of its failing to do anything while continuing to violate basic ethical guidelines, and even legal ones, a huge campaign to divorce advertisers from Gawker Media has had success and continues to be successful. This pressure may eventually force Gawker to change or evaporate without support. Illegal activities of Gawker Media are being pursued, as are activities that clearly violate the terms of membership in both Google's and Amazon's advertising programs. Large pressure is being placed on both companies to recognize these violations and terminate their dealings with Gawker Media.

You seem to want to talk about the dev. We want to talk about why the dev's game jam and a shitty CYOA game that could've been made by a 5-year-old are taken seriously by a press that shits on AAA games and studios. That is, we want to talk about corruption.

Comment: Not such a bad idea... (Score 5, Funny) 194

by jmerlin (#41793041) Attached to: Industrial Control Software Easily Hackable

Worse, many of these systems are unnecessarily connected to the Internet, which is a terrible, terrible idea.

Now you're just being paranoid. Instead, you should develop an artificially intelligent system to defeat would-be attackers and malicious software. That sounds like the best idea.

- Skynet

Comment: Re:I think that's all college students (Score 1) 823

by jmerlin (#41790903) Attached to: Ask Slashdot: Rectifying Nerd Arrogance?

As moderators have decided to upvote a troll AC, I suppose I shall respond.

Are you sure, because this is the most arrogant post in the thread.

In what way?

You seem to be doing a good job of feeling superior to all the people who chose to apply themselves in school, or who didn't need to apply themselves to do well. You don't seem to make the connection that your choice to blow off most of high school probably impacted your ability to get a scholarship to MIT or Stanford, and thus drove you to get an education at a place where you admit "the CS curriculum there wasn't challenging".

Excuse me for a moment, I'm going to recount facts. This isn't arrogance, it's historical recollection. I didn't do a lot of homework, and that which I did do was done within 15 hours of it being due. I didn't try and was still in the top 10.2% of my class (ranked 33 out of 322), and still managed to qualify for a full scholarship to a state university. I spent almost all of my time applying myself to things I cared about: programming, hacking, etc, which were not recreational, and pushed me intellectually, which is far from "blowing off" my education. You seem to insinuate that I did "poorly" because I didn't apply myself, and then further the insult by pointing out that there are people who don't need to try to do well. Statistically speaking, I fit very firmly in the latter group. I did group work with people who were in the top 2% of my class. They didn't understand basic math and didn't take any specialized courses (like CS, which was entirely optional), and almost all of their time was spent doing coursework. I appreciate the effort to excel in school, but spending time trying to perfect work in a field you find completely uninteresting isn't a useful endeavor. Perhaps they found all of those fields interesting, I don't know, but I didn't.

As for your quip about ivy league schools, I looked into the matter. The scholarships available were not total, and would require relocation into areas with significantly higher costs of living. The people I knew in High School who went to these schools had very wealthy families, and no one was awarded any better than a partial scholarship. If I managed to score a 100 in every course, managed a perfect SAT/ACT score, did something astonishing like win a national programming competition, and had some pretty epic extra-curriculars (as hacking by yourself wouldn't count) it might have been possible to get a completely full scholarship to MIT or Stanford, but I'm no such intellectual monster. So in the real world, as a real person, not hiding behind AC, it was cost inefficient for me to pursue these schools. My SAT/ACT scores (1330/34) were in the top 1% and my GPA, as I actually looked it up, was 3.84. I definitely qualified for some lower level scholarship at MIT or Stanford, but as a guy who, on a full scholarship, living in an apartment which cost only $400/month, worked his way through college on an $8/hr salary at 20 hours a week without ever having more than $2000 to his name, even paying 20% of a $50,000/yr tuition (not counting cost of living differential) would have resulted in a debt that I would STILL be paying off today. I have no debts today, I own a nice car, and I have a sizable savings already. My decision insofar as what could have happened versus what has happened has been at least positive. You are correct in that I missed out on networking opportunities, but I have also done fairly well in that regard in spite of not gambling my future.

Also, even though my school wasn't challenging in CS, I still pursued challenging material. I acquired texts used by MIT in their recommended undergrad course schedule and consumed everything in them and everything that was relevant on OCW, and this took years. It isn't, strictly speaking, necessary to attend a school like MIT to learn the same.

Comment: Re:I think that's all college students (Score 1) 823

by jmerlin (#41772813) Attached to: Ask Slashdot: Rectifying Nerd Arrogance?

This is hilarious. Does this count as cognitive dissonance?

Why would it be? I assume you're confused as to why I would consider myself a valedictorian in some form while simultaneously making a statement implying that position is not much of an accomplishment. Maybe I wasn't clear, and I apologize for that. To be clearer, I didn't think myself smarter than my peers (unconditionally), but I felt that a general measure of academic success was meaningless and as a result, the "actual" valedictorian was a meaningless position. Instead, comparing individuals against one another on a per-field basis would provide a more useful result, and that is the basis of how I compared myself to others. In that case, specialization isn't punished, and there isn't just a single person who's "best" in some manner. How things work right now is much like asking: who's the best athlete? Who's the best programmer? Who's the smartest person? What's the best sorting algorithm? What's the best data structure? The answer to these is always the same: "there isn't, we have specializations." The Olympic Games don't have a single category: best athlete, do they?

These days, I don't much care, outside of the fact that children are still being subjected to such broken systems. While I believe a general well-rounded education is important, I don't think that someone's performance in a completely unrelated field should be relevant to their specialization when seeking a job or admittance in higher education, at least not for technical ones.

Comment: Re:First post! (Score 2) 403

by jmerlin (#41771217) Attached to: Microsoft Releases Windows 8
I haven't used Windows 8, but I do have to admit that I like the simplicity in the Metro UI look/feel. Not necessarily how the UI is functionally constructed in Windows 8, but for a web interface, it's pretty solid IMO. The Windows 8 standalone apps I've used on Windows 7 have been pretty good. I suppose if it was just Windows 7 with less focus on Aero and more focus on Metro-style minimalistic display, I'd be quite happy (none of this full-screen metro nonsense). When I install Windows 7 on my beastly gaming machine, I turn the visuals to best performance and disable Aero. I still maintain that it is/was a stupid idea, and I find the fact that Linux desktop distros are heading down the Aero path simply horrifying (wtf @ compiz, seriously?).

Comment: Re:I think that's all college students (Score 5, Interesting) 823

by jmerlin (#41768097) Attached to: Ask Slashdot: Rectifying Nerd Arrogance?

I will recount my experience.

Spend high school overachieving (probably at the expense of social development)

I didn't overachieve in High School because I realized how pointless an effort that was. There were only 2 things I cared about in High School: Computer Science and Math. I realized this by about grade 5. At that time, there was a little orange book called "Games in Basic" on my teacher's bookshelf. I picked it up and started reading it one day and was fascinated (we had a PC with Windows 3.1 and I could easily boot into DOS and code up basic games). She saw me reading it and said "I bought that thinking kids might like it, but nobody but you has ever read it, so you can have it if you want." So I took it home and went through all of the exercises in it (just basic word games: input a number/word, output a response, etc). At that point I was hooked. When we finally upgraded to Windows 95/98, I started playing around in VB, eventually installing my father's copy of VC and learning C. This is where my time not in school was spent (split between that and playing games). I quickly realized I enjoyed this more than just about anything else, and so I did it. I taught myself VB, then C, then x86. By the time I could actually take a CS course in High School I was a junior, and it was an entry-level Java course. I still learned things -- data structures and some algorithms -- but I was already quite familiar with the majority of the syntax and everything else. In the two categories I cared about, I maintained a 95%+ average. I didn't apply myself in History, English, other sciences, or any of the nonsensical electives we had to take. I saw no reason to, and I didn't care that I was just outside the top 10% mark in my school; nobody I knew was as good at Math or CS as me, so as far as I was concerned, I was the valedictorian. When I later spoke with people in the top 1%, including the actual valedictorian, the arrogance they exuded was astonishing, as if they had accomplished something worthwhile.

work hard and get into a great college, get knocked down a peg when you realize that you're either somewhere in the meaty part of the curve among other prospective engineers, or that you'll actually need to *try* in order to get that A for the first time in your life

I didn't work hard to get into a great college, but I still managed, even with my crappy GPA (something like 3.4 in HS), to get a scholarship to a local university. I really wanted to go to Stanford or MIT, but the money just wasn't there, and a huge student loan wasn't something I could justify. So I majored in CS, an obvious choice, and figured that this 4-year degree would do nicely in the real world, where experience is more important anyway. I realized pretty quickly that the CS curriculum there wasn't challenging. I could read through the texts and learn what a course would teach me in a few days, and would end up bored sitting in a course going at a snail's pace for the rest of the semester. On the other hand, math courses were actually quite challenging. So 3 semesters in I switched from a CS major to a Math major and still took the interesting CS courses as my electives (compilers, AI, operating systems, etc). The math courses were a fair bit more difficult, especially the more abstract ones, but the only time I actually had to really try to get a decent grade was when I finally started taking graduate courses. There's just too much information to keep in one's head to fully understand why a proof is valid (it doesn't just span that chapter in that book, nor even that entire book, but rather the past 3 years of courses of abstraction). Needless to say, in my spare time I was still hacking around in CS, and my brain was already prioritizing CS-useful math (things like Abstract Algebra, Number Theory, Probability, etc), but the rest was reserved for actual CS work, so I wasn't too interested in pursuing an M.S. in math. No CS course I ever took was difficult, but I do suspect that at MIT or Stanford I would have been really challenged to make a good grade -- maybe one day I'll get around to getting an M.S. or a Ph.D. from either.
It was at this point that I started reading the references in my texts and finding things I was particularly interested in, following references on Wikipedia, and reading articles in peer-reviewed journals my university subscribed to. I very quickly realized that what I knew was like a single point on the 2D plane, and similarly that what we all know, collectively, is merely a single 2D plane within 3-space. It's astonishing when you actually open your eyes.

once you do succeed (or maybe just fail to fail) you graduate college thinking you're ready to take on the world... enter the business world and realize that the fancy education you paid so much for is only good enough to get your foot in the door...come to the realization that respect is earned by experience and demonstrated value... spend a few years building up credibility and expertise, then realize that being a manager (or director, or VP, etc.) requires some serious people skills (remember all those parties and extracurricular activities you skipped in high school in favor of hacking and video games?) and either choose to stay on the individual contributor path and hone your skills to guru level or take the plunge and start educating yourself (both formally and informally) in how to effectively manage a bunch of cocky engineers.

I'm still in the initial stages of the business world, but I've already realized I was lucky enough to be correct when it came to my educational choices: the school doesn't matter, and after a few years, your experience trumps it anyway. I've also already made a decision, and that's to never become a manager. I don't want to manage people, I want to be an engineer. Any company where the only way to "advance your career" is to change your role entirely from engineer to manager is by definition not a company I want to work for. It's too bad that vision is so pervasive in this culture. Hackers want to hack. We don't want to go to meetings, read thousands of e-mails, or talk to "senior managers", "VPs", or "C*O"s outside of showing them our awesome hacks. That's all. I don't think I have a lot of arrogance these days. I definitely did when I was younger, but sometime around 22-23 a switch flipped and it was gone almost overnight. Maybe it's the change from looking at everything subjectively to looking at everything objectively. Maybe it's the realization of how little you actually know. Maybe it's the realization that you no longer have to demonstrate that you're "smarter" than other people (a mentality put in place by schools via grades and grade curves; we are competitive beings). Maybe it's because now, when I look for information, I have to wade through a sea of it to find the little drop I care about, versus consuming all the information available in my textbook and thinking: "Is there more? Do I know everything about this topic now?" Maybe it's a combination of all of those things -- but I don't think there's a "fix" for it, other than letting someone eventually figure it out.

Comment: Re:The PC is dying claims are made every few years (Score 2) 291

by jmerlin (#41754517) Attached to: The Greatest Battle of the Personal Computing Revolution Lies Ahead

We had a bunch of horseless carriages designed before the Model-T too. It just needed the right situation to get them to kick off.

I have to point out that, while I admire the appeal to cars, this is a flawed analogy. The horseless carriage was capable of reproducing every single feature of a horse-drawn carriage (except the pooping), which made it an obvious successor. There are a myriad of things that touch-based devices can't do that a mouse/keyboard can. And there are a lot of things a tiny form-factor micro-PC with an attachment port for a mouse/keyboard can't do that a full-blown desktop can (e.g., power a 30" 2560x1600 monitor at 100+ FPS on the latest games). Even with Apple trying to bridge this with their new uber-thin desktop computer, the specs on it are pretty awkward for a computer built in 2012, and at a price starting at $1700, even with essentially their Thunderbolt display built in, it's a pretty second-rate system, also not capable of doing things many people need a full-blown tower for. In short: I'm saying that these devices, while they have obvious and great uses, do not fully reproduce the capabilities of PCs, and these hybrid PC micro-devices don't fully reproduce the capabilities of their larger brethren. So the end result is that the PC is a completely different market, and no new micro-devices are ever a sign of "the end of the PC era."
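For a rough sense of the rendering load behind that 30" monitor parenthetical, here is a back-of-the-envelope figure (my own arithmetic, not from the original comment):

```python
# Raw pixel throughput needed for a 30" 2560x1600 panel at 100 FPS,
# before any per-pixel shading or physics work is counted.
width, height, fps = 2560, 1600, 100
pixels_per_second = width * height * fps
print(f"{pixels_per_second / 1e6:.1f} Mpixels/s")  # 409.6 Mpixels/s
```

Over 400 million pixels per second is the floor, which is why thermally constrained small-form-factor hardware of that era couldn't keep up.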

I suspect that until some major scientific breakthroughs occur, it's not even possible for a small-form-factor PC to replace a full-blown tower-based PC. Even if Moore's Law continues for another decade, there's a physical limit to how much computing power you can shove into a hand-held or ultra-thin device. Compare that with a desktop that has a monstrous 2200+ cubic inches of space for hardware; if you can shove 10x-50x the amount of horsepower into that space, the PC has the capacity to behave like a local supercomputer in comparison. While you might argue that it's possible to use the "cloud" for computationally expensive work like that (and that's an argument, for sure, a little awkward, but I suppose it works), the problem then lies in latency. We're limited, literally, by the speed of light, a constraint that effectively disappears when the box is sitting under your desk. So there's a use case that will never be supportable in a cloud system: real-time gaming on nearly photorealistic engines with extreme physics support. This was tried, and the implementation failed, for the very same reason (it's the latency, not the bandwidth, and that's a physical limit in our universe, so it will never work until FTL communication is discovered).
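To put rough numbers on the latency point, here is a sketch using my own approximate figures (light in optical fiber propagates at roughly two-thirds of its vacuum speed):

```python
# Lower bound on cloud-gaming round-trip latency, from signal speed alone.
C_FIBER_KM_S = 200_000  # approx. effective signal speed in fiber, km/s

def min_round_trip_ms(distance_km, speed_km_s=C_FIBER_KM_S):
    """Physical floor on round-trip time, ignoring all processing and queuing."""
    return 2 * distance_km / speed_km_s * 1000

# A datacenter 1,500 km away: the round trip alone consumes most of the
# 16.7 ms frame budget of a 60 FPS game, before any rendering happens.
print(f"{min_round_trip_ms(1500):.1f} ms")  # 15.0 ms
```

Real paths are longer than the straight-line distance and add routing and processing delay on top, so actual latency is strictly worse than this floor.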

Comment: Re:Simple answer (Score 1) 360

by jmerlin (#41749253) Attached to: Ask Slashdot: How To Avoid Working With Awful Legacy Code?
Every day at 5, I rm -rf *. I don't even have legacy code that's a day old. When people say "it sucks when you read your own 6 month-old code and have no idea how it works or what you originally did," I have no idea what they mean, because I just remember how I solved the problem and recreate it. I love being WET (reWrite Every Time).

Comment: I just had to respond to this inflammatory remark. (Score 2) 866

by jmerlin (#41683991) Attached to: Parent Questions Mandatory High School Chemistry

There’s a concept in economics called 'opportunity costs,' which you may not have learned about because you were taking chemistry instead of economics. Opportunity costs are the sacrifices we make when we choose one alternative over another. ... When you force my son to take chemistry (and several other subjects, this is not only about chemistry), you are not allowing him that same time to take a public speaking course, which he could be really good at, or music, or political science, or creative writing, or HTML coding for websites.

We took economics. We also pay attention to the world. For instance, we see public speakers all the time who make absurdly stupid statements regarding scientific fact (as in: getting it blatantly wrong) because they don't have a basic understanding of physics, chemistry, biology, etc. These courses in High School aren't meant to be doctorate programs. They aren't significant time investments. They're meant to provide a basic understanding of what we know about our world, to provide well rounded basic knowledge to people.

What happens, though, when a person finds something they're really good at or enjoy immensely, is that they do it. A lot. Outside of class. If someone truly enjoys public speaking, they will be doing it more outside of class than inside of class. Learning more facts and becoming smarter WRT the world around them is not going to inhibit this. If they're fascinated by political science, they're going to be on Wikipedia, at the library, or even talking to local municipal leaders if the parent is capable of supporting their child to that end (here, you are the bottleneck, not your child's chemistry course). If they enjoy creative writing, they're probably going to be writing. Look at Harry Potter. What do you think mixing ingredients into a pot and getting some magical result correlates to in the real world? Chemistry. If you've learned even basic Chemistry, it feels like magic. Being inspired and in awe of something is a good motivator for creative writing, even if you don't care to learn how it works in every detail. And HTML coding (it's not coding, it's a markup language; it's more like writing) is the same thing. Students specialize outside of school. If they are talented enough, they can even stop there, but being more well-rounded never hurts a person. It's a few courses in High School, before most students even become lucid to the world around them. It's a background, not a lifetime investment.

Comment: Re:Wrong. (Score 1) 112

Incorrect. The court decided to misinterpret the function of copies in the cloud. An SMTP server makes a backup copy of a transmitted electronic message so that a user can view it at a later time. If the server did not create this backup, the message would simply be lost forever. This is very similar to how phones work: if you answer your phone and walk out of the room, whatever the other person says is lost forever to you. Hence recording machines were invented to allow users who are not present at the time of a phone call to receive a message. The function of the copy made is to be a backup. Both email and phones allow a user to be present at the time of transmission and receive a message; however, it turned out to be useful to implicitly create backups, and so we've come to rely on that behavior. That does not mean those copies aren't backups. These backups are consistent with what the law says is protected. The court, out of ignorance of the technology, or in willful incompetence, decided that backups aren't backups in this case. They have decided that electronically stored data is neither electronic nor stored. What's next?

Comment: Re:Courts cannot fix faulty statutes (Score 1) 112

I think we must conclude that this court was horribly mistaken both in its understanding of what this text says (poor interpretation) and in what the "copy" in the cloud represents (poor understanding of the technology). Consider, for a moment, the telephone. One would hardly dispute that this definition of "backup" clearly applies to the use of a recording device to record a telephone conversation. Now what happens if one end simply enables a recording device, leaves the room, and comes back in 20 minutes, when it is certain that the other party has terminated the connection? The message sent electronically through the phone lines was not received by one party, but it is stored. What is the purpose of that storage? Why would someone leave the room? What if they simply answered the phone, said "I'm not here right now," and then left the room? What if, instead of a person, it was actually a device that simply waited for N rings, then answered the phone, told the person on the other end that nobody is present to answer, and activated a recording device? We've moved, very basically and very incrementally, from a normal phone conversation in which a recording is VERY CLEARLY a backup as this law states, to a position IDENTICAL TO CLOUD EMAIL STORAGE, which is used EVERY DAY BY MILLIONS OF PEOPLE. They're called answering machines.

What is the purpose of the recording made by an answering machine? It is very obvious to anyone who's ever used a phone -- it allows a person to communicate a message and have it backed up for later use or review. The recording made by an answering machine IS A BACKUP. If the recording device were not enabled (and it does not have to be), the copy transmitted over the wire would be, and this is the important part, LOST FOREVER AND UNRECOVERABLE. It is very obvious, then, that the recording is an implicit backup so that the absent party can later review the message.

Now consider SMTP. SMTP is the protocol used to send electronic mail. An SMTP client connects to an SMTP server over TCP (think of this like ringing a phone, the other end has to answer, and then you're connected). Then, using this protocol on top of TCP (think phone-line), a message (email) is transmitted from the sender to the receiver. The receiver can choose to discard this message, as in fact, it only reads the message into memory. If a user is present on the SMTP server, he/she can see the message transmitted over the connection as it occurs. But again, if that user is not present, a backup must be made so that he/she may at a later, more convenient time, access the message that was transmitted in his/her absence. This is identical to the answering machine. Electronic mail storage, even as a fundamental part of accessing electronic mail, is a glorified answering machine, and electronic mail is just a super high-speed, highly-efficient, yet glorified phone call.
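The store-then-retrieve behavior described above can be sketched in a few lines. This is a toy illustration of the answering-machine analogy, not real SMTP; the class and method names are my own:

```python
# Toy "mail server" mimicking the answering-machine behavior: the message
# arrives over the wire, and a stored copy (the backup) survives the
# connection so the absent recipient can review it later.
class ToyMailServer:
    def __init__(self):
        self.mailbox = {}  # recipient -> list of stored (sender, body) pairs

    def receive(self, sender, recipient, body):
        # Without this stored copy, the message would be gone the moment
        # the connection closes -- just like unrecorded speech on a phone.
        self.mailbox.setdefault(recipient, []).append((sender, body))

    def retrieve(self, recipient):
        # The absent user later reviews the stored copy.
        return self.mailbox.get(recipient, [])

server = ToyMailServer()
server.receive("alice@example.com", "bob@example.com", "Hello, Bob!")
print(server.retrieve("bob@example.com"))
```

The stored copy exists purely so a user who wasn't present at transmission time can still read the message, which is the sense of "backup" the argument turns on.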

IANAL, and IANAJ, but IASWE. My job, all day, every day, is to read specifications and implement specifications. I feel I can make a judgment here as an expert. Any mail stored as a "copy" in the cloud is a backup, implicitly, for protection against immediate loss, so that a user can later access the mail without being present at the time of its transmission. It therefore fits the definition present in the letter of the law. I find it incredibly unnerving that such an obvious and logical conclusion has eluded the court.
