
Comment Not even close to a "first" (Score 1) 326

E3 did this more than ten years ago. Revenues at the conference obviously went through the floor, and they went straight back to the fanboy mosh pit everyone knew and loved from before.

That said, I didn't think the RSA conference would have the same demographic of attendees as E3. Perhaps it makes more sense for them; who knows. At the end of the day, business is business, and this is ultimately about money, not any social statement. They are simply betting that changing their appeal will raise their stature among security experts, attract attendees and media favor, and drive the all-important bottom line.

Comment 70% "Failure Rate"? (Score 4, Informative) 133

Gimme a break. One talking head wants to publish a "study" and suddenly it's canon? What does "failure" of a virtual team even mean? Probably, less money for the talking head.

Virtual-enabled teams, in my own experience, may not produce the massive, curve-jumping gains a click-baity headline needs, but I do think they are at least as good as traditional, rigidly office-bound teams, and decisively better in a few key ways. They are cheaper facilities-wise, and they result in vastly higher morale and loyalty among the employees who are able to work from home. They do not lead to goofing off or a falloff in productivity, except among employees who were worthless anyway.

Ultimately, virtual-enabled teams amplify whatever is good or bad about the leadership and the quality of the hires. Good hires become great hires because they don't waste their work hours stuck in ever-increasing traffic. Do-nothing managers become disaster managers. And the morons who represent the failure of the hiring process will spend their days at the beach and be discovered far earlier than they otherwise would have been.

Virtualization is a Great Thing. But it highlights how the current state of management and hiring in software is a Terrible Thing. Don't believe the haters.

Comment Go eat your applesauce, Grandpa (Score 5, Funny) 205

The software industry just isn't a place for changing direction or starting new things. I mean, come on - learning a new skill is disloyal to the older skills. If everyone just learned things willy-nilly, who would sort the punch cards anymore?

Just keep your head down - you probably only have 2 or 3 more good typing years left before you're too old to sit up or retain bowel control.

Comment Study: This Study Is BS (Score 3, Insightful) 247

Making a judgment about "refactoring" as a single, simplistic concept is like making a judgment about "food" or "government" without going into any further detail. Umm, it's kind of not that simple.

Refactoring in real life is a whole array of different, nuanced activities. Any of them can be wise or foolish depending on the situation. Well-written code requires less of it, but some degree of it will always be needed as we can't tell the future. Each instance is a judgment call with no concrete right answer.
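To make that concrete, here's a small, made-up Java sketch of just one of those activities - extracting a method. The class and field names are hypothetical, not from the study; whether this particular move is wise or foolish depends entirely on whether the subtotal logic is duplicated or likely to change.

import java.util.ArrayList;
import java.util.List;

class Invoice {
    private final List<Double> lineItems = new ArrayList<>();

    void addLineItem(double amount) {
        lineItems.add(amount);
    }

    // Before: the summing loop sits inline wherever a total is needed.
    double totalWithTaxBefore(double taxRate) {
        double subtotal = 0.0;
        for (double item : lineItems) {
            subtotal += item;
        }
        return subtotal * (1.0 + taxRate);
    }

    // After: the loop is pulled out into a named, reusable helper.
    double totalWithTax(double taxRate) {
        return subtotal() * (1.0 + taxRate);
    }

    private double subtotal() {
        double sum = 0.0;
        for (double item : lineItems) {
            sum += item;
        }
        return sum;
    }
}

Multiply that judgment call by renames, inlines, interface extractions, module splits, and so on, and "refactoring" stops being a single thing you can bless or condemn in one study.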

Comment A many-headed cultural problem (Score 1) 158

Thanks for bringing this up and giving it a name, first off; it needed one.

Over the years I have seen this mentality a lot, but it's almost always pushed onto a team by management rather than home-grown in the minds of the engineers. There's a systemic issue at software companies: leaders see their own engineers as the least preferable option for actually building new things. I think this arises from a few factors:

- open-source software has so successfully marketed itself as better than third-party enterprise software that managers think they are supposed to prefer it over in-house solutions that are informed by the knowledge of their own business. For obviously generic components like JSON parsers, this is true, but leaders often take this mentality to comical lengths.

- resume-building is about keywords, because that's all recruiters understand. Junior engineers realize that if they build their own components, they will not "get credit for it" later. But put Spring and Hibernate on your resume and the recruiters will come. So why not cram as many third-party components in as possible and really build that skills section?

- it can appear easier to manage a team that is struggling under the bloat of a third-party framework than one that is creating something better. PMs feel more in control, managers don't have to think as much, and the engineers get the keywords for their resumes. Everyone wins!

- a single phrase: "don't re-invent the wheel." I think this one sentence is responsible for the suppression of countless creative ideas. PMs and managers use it like a weapon to keep control; consultants use it to poison the reputations of in-house engineers in the minds of the managers whose money they want. Senior engineers even use it to squash the ideas of competitors for their position. If we engineers did nothing but thoroughly discredit this phrase, turning it into a shared joke à la "what could go wrong?", the world would be a better place.

Comment 95% of research is fluff, research shows (Score 2) 411

This sounds like the same hand-wavy BS that spawned our current infestation of Agile consultants.

They aren't even trying to be scientific here; this is just bald-faced click-bait, likely commissioned by some unproductive company that wants to look like a "thought leader." What are they even defining as "wheat" and "chaff"? Who decides which lines of code are which? Who decides who gets to decide that? What does it even mean to describe what code "does"?

Smart people can disagree about best practices and what constitutes "good" code - ultimately, I think most of it boils down to personal taste rather than any notion of objective correctness or big-picture productivity. Personally, I feel most productive in Java - but that's because of an interlocking mesh of many subtle reasons and has nothing to do with how many bytes my code files take up.

Comment Re: Then be corrected of the error of your ways. (Score 1) 376

I hate to say it, but you're only reinforcing the pattern I've seen: I obviously wasn't in any way saying age discrimination "does not exist" - that would be moronic. But your attempt to twist my words into such a simplistic straw-man argument is the same sort of passive-aggressive victim theater I've heard a lot around this topic. Most of the alarm-raising about age discrimination (that I've read, anyway) has this same feel.

Please, are there any debate points on this subject that aren't obvious fallacy or appeal to emotion? My guess is, they exist, but are buried under the histrionics.

Comment Not convinced age discrimination is significant (Score 1) 376

I may as well pre-emptively tag this as flamebait, but I will throw out some honest dissent to the idea of (old) age discrimination being as overwhelming as it's portrayed.

I've been in software for 17 years and I have not personally witnessed a single incident of it - not even if I put on my easily-offended hat and really stretch for something that could be interpreted that way. Not a comment during a candidate review, not even an offhand water-cooler crack about "old folks" or whatever. That's obviously not because we engineers are angels - I have heard many, many inappropriate things. But *zero* were ever about being too old, nor have I ever seen any unstated pattern where older engineers were tarred as "not a culture fit." The reverse is another story - it is so common to reject inexperienced candidates that many feel comfortable saying someone is "too young" completely out in the open.

So, where is all of the anti-gray sentiment that I'm repeatedly told is lurking in my future? 40 is on the horizon and I am only in ever-greater demand as an individual contributor thanks to my full-stack independence and the dramatic vacuum of good engineers. In all hiring processes I've ever seen, we were so desperate for anyone with a hair of common sense and reasonable skills that we would have taken someone with three heads if they could crank out good product.

Any reasonable person would be suspicious, given this experience, that all this talk of age discrimination is less of a real problem than an exercise in trying to blame others for letting one's skills fall out of date and becoming un-hireable. True, a young engineer will never get rejected for knowing only COBOL - but there's no excuse for a graying one to have that problem either. If anything, good older engineers should be *more* up-to-date because they can learn new technologies faster (having learned so many before), and are more abreast of useful trends (because their experience lets them discern fads from real evolution).

I feel no pressure to move into management, and plan to code until I am no longer physically able, or financially required, to do so. My advice: double, triple and quadruple down on being the absolute best at what really gets you fired up, and you'll always have a cubicle with your name on it. That's definitely more fulfilling (and often more lucrative) than being a Dilbert-style manager who's only going through the motions.

Comment Hmm, is Mexico the US's Compaq? (Score 1) 433

"This week, President Fiorina welcomed all 174 new Congressmen from the 51st State, South Texas!

Although some have expressed concern that there are now more notorious drug lords in Congress than representatives from all New England states combined, the President emphasizes that 'together, we can realize cross-functional synergies to drive stakeholder value.'

Unfortunately, US Treasury bonds lost 80% of their value in after-hours trading."

Comment Re:Ridiculous, but so are college degrees (Score 1) 173

That's great that those sorts of courses at least exist. I'm wondering if the dearth of practical courses is a US thing, or if US "polytechnic" schools such as CalTech, MIT or Rensselaer would also offer those real-world programs.

The problem with technology-focused colleges is that you have to know you want to commit to studying technology several years earlier - at the point in high school where you're choosing which higher institutions you want to apply to. For some people (like me), that's 2-3 years before you realize you want to make that commitment, and that a technical college probably would have been a better choice.

I wish that "late bloomers" in programming didn't get left out of the opportunity for good real-world preparation by virtue of choosing a university that doesn't focus on tech. Why isn't a degree in Software Engineering widespread across all major US universities (instead of CS, even)? It seems bizarre that it's almost the only major type of engineering that isn't directly represented by a degree program of the same name.

We would never expect our mechanical engineering graduates to go out into the world knowing only science and theory - they spend their college years actually building real-world things that solve problems. But for some reason, software engineers are expected to start out with just such a handicap, which is why we aren't really worth much to an employer until we've gotten a few years' experience. We're basically starting a second degree program our first day on the job, except with no teachers and little feedback other than getting fired. The 10,000-hour idea is totally right; it would just be great if we could start making that investment as students, rather than as professionals.

Comment Ridiculous, but so are college degrees (Score 4, Interesting) 173

Full disclosure: I hold a bachelor's in CS from Stanford and have been an engineer for 14 years since then. I think my degree was, to be polite, poor preparation for any real-world work beyond teaching college CS courses, although I have also never seen any program I think is better.

I've been saying for a while that any "good" engineering education of the future won't look much like today's system. A college degree is a needlessly long, expensive process for qualified candidates to go through to demonstrate their ability (although I definitely think college has many other benefits), and wastes our time with piles of worthless freshman requirements. On top of that, "Computer Science" isn't what engineers do - it goes into far deeper theory than is needed for almost anyone, and at the same time leaves out a lot of real-world skills that are critical for building functioning software.

Ultimately, the only reason CS degrees have the industry importance they do is because it's one of the only things recruiters can understand. For that very reason, boot-camp programs like this, despite their utterly moronic assertion that a decent engineer can be cranked out in three months, are nonetheless a step toward a better solution.

I think the industry needs some sort of advanced trade schools - basically, a prestigious version of DeVry that teaches not just programming using the language of the moment, but *software engineering* as the separate discipline that it truly is (maybe this already exists somewhere, but I think it should be widespread). We need degrees that are good enough to indicate reliably high value in a candidate and provide enduring background knowledge, affordable enough for the average middle-class person to break into engineering, and still provide a black-and-white resume line item that's simple enough to pass the buzzword filters in recruiters' minds. I see no reason why a two-year associate's degree that's packed full of courses on real-world subjects, as well as tons of actual code construction, couldn't theoretically be *far better* than any current CS degree from a top university.

I was never able to take a single class on scalability, security, development methodology trends and how to evaluate them, management of large codebases, refactoring, etc. These are not flash-in-the-pan concepts that only reflect some current fad, but timeless and critical skills that are fully suitable for a university setting. However, universities are too mired in trying to avoid looking like trade schools (and thereby justifying their astronomical prices) to care much about providing real value to their customers, which makes them ripe to be punished by the free market.

Comment Carmack FTW. (Score 5, Insightful) 60

I have been an Android developer for two years and a Java developer for almost 15, and I'm even a former Google employee - and, in my estimation, Carmack is 100% right.

Despite how much more I like Java than lower-level languages, Google's software stack is a complete disaster. It's poorly designed, bug-riddled garbage that I have actually considered re-writing parts of, even in the middle of a high-pressure project. What makes matters so much worse is Android's distribution model: rather than the direct-to-consumer approach that Apple takes, Google distributes Android indirectly via its device vendors, who can provide arbitrarily modified or out-of-date versions of the infrastructure that you're expected to support when dealing with angry customers who don't understand why their network stack mysteriously doesn't work.

The NDK is not an answer. It's a wreck because JNI is a wreck. I've been using JNI since 2002, and almost nothing has evolved since then - it was never anything more than a token olive branch to Luddite C++ developers back in 1995, and it probably never will be. Ultimately, Java is excellent for mature devices (like servers), but it is not suitable for emerging devices (like all the mobile devices we're seeing now) because of its runtime overhead.
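For anyone who hasn't had the pleasure, here's roughly what just the Java side of a single native call looks like - a sketch of my own, with a hypothetical library and class name, not anything from Carmack or Google. The matching C function (Java_com_example_NativeMath_add) still has to be hand-written against jni.h and kept in sync manually for every native method.

package com.example;

public class NativeMath {
    static {
        // Loads libnativemath.so on Android/Linux; fails at runtime if the
        // hand-built native library isn't packaged for the right ABI.
        System.loadLibrary("nativemath");
    }

    // Declared here, implemented separately in C/C++ against the JNI headers.
    public static native int add(int a, int b);

    public static void main(String[] args) {
        System.out.println(NativeMath.add(2, 3));
    }
}

That per-function glue dance - declare, generate a header, implement against JNIEnv, keep the signatures in sync by hand - is essentially unchanged from the original spec, which is the whole complaint.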

Despite Apple's many shortcomings, one of the key points they get right is that mobile development needs natively compiled, non-runtime (or thin-runtime) languages. And, of course, libraries that work. Apple isn't exactly the gold standard on that either, but at least they're miles ahead of "beta early, beta often" Google.
