

Comment: Study: This Study Is BS (Score 3, Insightful) 245

by engineerErrant (#49177331) Attached to: Study: Refactoring Doesn't Improve Code Quality

Making a judgment about "refactoring" as a single, simplistic concept is like making a judgment about "food" or "government" without going into any further detail. Umm, it's kind of not that simple.

Refactoring in real life is a whole array of different, nuanced activities. Any of them can be wise or foolish depending on the situation. Well-written code requires less of it, but some degree of it will always be needed as we can't tell the future. Each instance is a judgment call with no concrete right answer.
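As one concrete example of those nuanced activities, here is a minimal sketch of an "extract method" refactoring in Java. The order-total routine and all names are hypothetical, invented purely for illustration:

```java
import java.util.List;

public class OrderTotals {
    // Before: the discount rule is inlined into the loop, so it can't be
    // tested or reasoned about in isolation.
    static long totalCentsInlined(List<Long> itemCents, boolean member) {
        long total = 0;
        for (long cents : itemCents) {
            total += member ? (cents * 90) / 100 : cents;
        }
        return total;
    }

    // After: the same rule extracted into a named, independently testable
    // method. Behavior is identical; only the structure changed.
    static long totalCents(List<Long> itemCents, boolean member) {
        long total = 0;
        for (long cents : itemCents) {
            total += discounted(cents, member);
        }
        return total;
    }

    static long discounted(long cents, boolean member) {
        return member ? (cents * 90) / 100 : cents;
    }

    public static void main(String[] args) {
        List<Long> items = List.of(1000L, 250L);
        System.out.println(totalCentsInlined(items, true)); // 1125
        System.out.println(totalCents(items, true));        // 1125
    }
}
```

The behavior is identical before and after; the judgment call is whether the extra indirection pays for itself in testability, which is exactly the kind of situational decision a blanket verdict on "refactoring" glosses over.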

Comment: A many-headed cultural problem (Score 1) 158

by engineerErrant (#49148653) Attached to: Invented-Here Syndrome

Thanks for bringing this up and giving it a name, first off; it needed one.

Over the years I have seen this mentality a lot, but it's almost always pushed onto a team from management rather than home-grown in the minds of the engineers. It's a systemic issue at software companies that leaders see their own engineers as the least preferable choice to actually build new things. I think this arises from a few factors:

- open-source software has so successfully marketed itself as better than third-party enterprise software that managers think they are supposed to prefer it over in-house solutions that are informed by the knowledge of their own business. For obviously generic components like JSON parsers, this is true, but leaders often take this mentality to comical lengths.

- resume-building is about keywords, because that's all recruiters understand. Junior engineers realize that if they build their own components, they will not "get credit for it" later. But put Spring and Hibernate on your resume and the recruiters will come. So why not cram as many third-party components in as possible and really build that skills section?

- it can appear easier to manage a team that is struggling under the bloat of a third-party framework rather than creating something better. PMs feel more in control; managers don't have to think as much, and the engineers get their keywords for their resume. Everyone wins!

- a single phrase: "don't re-invent the wheel." I think this one sentence is responsible for the suppression of countless creative ideas. PMs and managers use it like a weapon to keep control; consultants use it to poison the reputations of in-house engineers in the minds of the managers whose money they want. Senior engineers even use it to squash the ideas of competitors for their position. If we engineers did nothing but thoroughly discredit this phrase, turning it into a shared joke à la "what could go wrong?", the world would be a better place.

Comment: 95% of research is fluff, research shows (Score 2) 411

by engineerErrant (#49031877) Attached to: Your Java Code Is Mostly Fluff, New Research Finds

This sounds like the same hand-wavy BS that spawned our current infestation of Agile consultants.

They aren't even trying to be scientific here; this is just bald-faced click-bait, likely commissioned by some unproductive company that wants to look like a "thought leader." What are they even defining as "wheat" and "chaff"? Who decides which lines of code are which? Who decides who gets to decide that? What does it even mean to describe what code "does"?

Smart people can disagree about best practices and what constitutes "good" code - ultimately, I think most of it boils down to personal taste rather than any notion of objective correctness or big-picture productivity. Personally, I feel most productive in Java - but that's because of an interlocking mesh of many subtle reasons and has nothing to do with how many bytes my code files take up.

Comment: Re: Then be corrected of the error of your ways. (Score 1) 376

by engineerErrant (#48504273) Attached to: Ask Slashdot: IT Career Path After 35?

I hate to say it, but you're only reinforcing the pattern I've seen: I obviously wasn't in any way saying age discrimination "does not exist," as that would be moronic. But your attempt to twist my words into such a simplistic straw-man argument is the same sort of passive-aggressive victim-theater that I've heard a lot around this topic. Most of the alarm-raising about age discrimination (that I've read, anyway) has this same feel.

Please, are there any debate points on this subject that aren't obvious fallacy or appeal to emotion? My guess is, they exist, but are buried under the histrionics.

Comment: Not convinced age discrimination is significant (Score 1) 376

by engineerErrant (#48493775) Attached to: Ask Slashdot: IT Career Path After 35?

I may as well pre-emptively tag this as flamebait, but I will throw out some honest dissent to the idea of (old) age discrimination being as overwhelming as it's portrayed.

I've been in software for 17 years and I have not personally witnessed a single incident of it - not even if I put on my easily-offended hat and really stretch for something that could be interpreted that way. Not a comment during a candidate review, not even an offhand water-cooler crack about "old folks" or whatever. That's obviously not because we engineers are angels - I have heard many, many inappropriate things. But *zero* were ever about being too old, nor have I ever seen any unstated pattern where older engineers were tarred as "not a culture fit." The reverse is not true - it is so common to reject inexperienced candidates that many feel comfortable saying someone is "too young" completely out in the open.

So, where is all of the anti-gray sentiment that I'm repeatedly told is lurking in my future? 40 is on the horizon and I am only in ever-greater demand as an individual contributor thanks to my full-stack independence and the dramatic vacuum of good engineers. In all hiring processes I've ever seen, we were so desperate for anyone with a hair of common sense and reasonable skills that we would have taken someone with three heads if they could crank out good product.

Any reasonable person would be suspicious, given this experience, that all this talk of age discrimination is less of a real problem than an exercise in trying to blame others for letting one's skills fall out of date and becoming un-hireable. True, a young engineer will never get rejected for knowing only COBOL - but there's no excuse for a graying one to have that problem either. If anything, good older engineers should be *more* up-to-date because they can learn new technologies faster (having learned so many before), and are more abreast of useful trends (because their experience lets them discern fads from real evolution).

I feel no pressure to move into management, and plan to code until I am no longer physically able, or financially required, to do so. My advice: double, triple and quadruple down on being the absolute best at what really gets you fired up, and you'll always have a cubicle with your name on it. That's definitely more fulfilling (and often more lucrative) than being a Dilbert-style manager who's only going through the motions.

Comment: Hmm, is Mexico the US's Compaq? (Score 1) 433

by engineerErrant (#48470459) Attached to: Former HP CEO Carly Fiorina Considering US Presidential Run

"This week, President Fiorina welcomed all 174 new Congressmen from the 51st State, South Texas!

Although some have expressed concern that there are now more notorious drug lords in Congress than representatives from all New England states combined, the President emphasizes that 'together, we can realize cross-functional synergies to drive stakeholder value.'

Unfortunately, US Treasury bonds lost 80% of their value in after-hours trading."

Comment: Re:Ridiculous, but so are college degrees (Score 1) 173

That's great that those sorts of courses at least exist. I'm wondering if the dearth of practical courses is a US thing, or whether US "polytechnic" schools such as Caltech, MIT or Rensselaer also offer those real-world programs.

The problem with technology-focused colleges is that you have to know you want to commit to studying technology several years earlier - at the point in high school where you're choosing which higher institutions you want to apply to. For some people (like me), that's 2-3 years before you realize you want to make that commitment, and that a technical college probably would have been a better choice.

I wish that "late bloomers" in programming didn't get left out of the opportunity for good real-world preparation by virtue of choosing a university that doesn't focus on tech. Why isn't a degree in Software Engineering widespread across all major US universities (instead of CS, even)? It seems bizarre that it's almost the only major type of engineering that isn't directly represented by a degree program of the same name.

We would never expect our mechanical engineering graduates to go out into the world knowing only science and theory - they spend their college years actually building real-world things that solve problems. But for some reason, software engineers are expected to start out with just such a handicap, which is why we aren't really worth much to an employer until we've gotten a few years' experience. We're basically starting a second degree program our first day on the job, except with no teachers and little feedback other than getting fired. The 10,000-hour idea is totally right; it would just be great if we could start making that investment as students, rather than as professionals.

Comment: Ridiculous, but so are college degrees (Score 4, Interesting) 173

Full disclosure: I hold a bachelor's in CS from Stanford and have been an engineer for 14 years since then. I think my degree was, to be polite, poor preparation for any real-world work beyond teaching college CS courses, although I have also never seen any program I think is better.

I've been saying for a while that any "good" engineering education of the future won't look much like today's system. A college degree is a needlessly long, expensive process for qualified candidates to go through to demonstrate their ability (although I definitely think college has many other benefits), and wastes our time with piles of worthless freshman requirements. On top of that, "Computer Science" isn't what engineers do - it goes into far deeper theory than is needed for almost anyone, and at the same time leaves out a lot of real-world skills that are critical for building functioning software.

Ultimately, the only reason CS degrees have the industry importance they do is because it's one of the only things recruiters can understand. For that very reason, boot-camp programs like this, despite their utterly moronic assertion that a decent engineer can be cranked out in three months, are nonetheless a step toward a better solution.

I think the industry needs some sort of advanced trade schools - basically, a prestigious version of DeVry that teaches not just programming using the language of the moment, but *software engineering* as the separate discipline that it truly is (maybe this already exists somewhere, but I think it should be widespread). We need degrees that are good enough to indicate reliably high value in a candidate and provide enduring background knowledge, affordable enough for the average middle-class person to break into engineering, and still provide a black-and-white resume line item that's simple enough to pass the buzzword filters in recruiters' minds. I see no reason why a two-year associate's degree that's packed full of courses on real-world subjects, as well as tons of actual code construction, couldn't theoretically be *far better* than any current CS degree from a top university.

I was never able to take a single class on scalability, security, development methodology trends and how to evaluate them, management of large codebases, refactoring, etc. These are not flash-in-the-pan concepts that only reflect some current fad, but timeless and critical skills that are fully suitable for a university setting. However, universities are too mired in trying to avoid looking like trade schools (and thereby justify their astronomical prices) to care much about providing real value to their customers, which makes them ripe to be punished by the free market.

Comment: Carmack FTW. (Score 5, Insightful) 60

by engineerErrant (#47839745) Attached to: Carmack On Mobile VR Development

I have been an Android developer for two years, a Java developer for almost 15, and even a former Google employee - and in my estimation, Carmack is 100% right.

Despite how much more I like Java than lower-level languages, Google's software stack is a complete disaster. It's poorly designed, bug-riddled garbage that I have actually considered re-writing parts of, even in the middle of a high-pressure project. What makes matters so much worse is Android's distribution model: rather than the direct-to-consumer approach that Apple takes, Google distributes Android indirectly via its device vendors, who can provide arbitrarily modified or out-of-date versions of the infrastructure that you're expected to support when dealing with angry customers who don't understand why their network stack mysteriously doesn't work.

The NDK is not an answer. It's a wreck because JNI is a wreck. I've been using JNI since 2002, and almost nothing has evolved since then - it was never anything more than a token olive branch to luddite C++ developers in 1995, and probably never will be. Ultimately, Java is excellent for mature devices (like servers), but is not suitable for emerging devices (like all the mobile devices we're seeing now) because of its runtime overhead.

Despite Apple's many shortcomings, one of the key points they get right is that mobile development needs natively compiled, non-runtime (or thin-runtime) languages. And, of course, libraries that work. Apple isn't exactly the gold standard on that either, but at least they're miles ahead of "beta early, beta often" Google.

Comment: Troll (Score 5, Insightful) 794

by engineerErrant (#46371107) Attached to: Whole Foods: America's Temple of Pseudoscience

While Whole Foods does sell a lot of homeopathy items, that is *hardly* its entire character as a store. I, along with no doubt many others, go there because it's a specialty grocery store that has a lot of interesting foods that you can't find other places, including (and especially) a big variety of craft beers and vegetarian stuff. Their produce and bulk sections are also hard to beat for variety and freshness, and the prepared-foods section is great when you're on your way home and don't feel like cooking.

I'm no Whole Foods shill, and it does have its share of silliness. But comparing it to the Creation Museum is completely ridiculous and has no place in serious discourse.

Comment: Typical Kurzweil (Score 5, Interesting) 254

by engineerErrant (#46326175) Attached to: Ray Kurzweil Talks Google's Big Plans For Artificial Intelligence

Ray Kurzweil is no doubt a brilliant thinker and an engaging writer/futurist - I've read some of his books (admittedly, not "Singularity"), and they are fun and thought-provoking. However, disciplined and realistic they are not - his main skill is in firing our imaginations rather than providing realistic interpretations of the evolution of technology.

My favorite case in point is his elevation of Moore's Law into a sort of grand unified theory of computing for all time, and using some very dubious assumptions to arrive at the idea that we'll all have merged with machines into immortal super-beings within the near to mid future. I don't need to pick apart all the reasons why this is fallacious and somewhat silly to treat as a near-term likelihood - the point is, he's basically a sci-fi writer in a lot of ways, and I read most of his statements in the same spirit as I'd read a passage out of "Snow Crash."

That said, Google has some very capable people, and can, in all likelihood, mount our best attempt at human-like intelligence to date. They'll push the envelope, and may make some good progress in working through all the challenges involved, although the notion that they'll create anything truly "human-like" is laughable in the near term.

Comment: Re:The usual consulting snake oil (Score 1) 149

by engineerErrant (#46280705) Attached to: Can Reactive Programming Handle Complexity?

Clearly, one-time structural updates during system upgrades are a different ballgame. The pattern described is for ongoing use in deployed production code, and my assertion is limited to that context.

For upgrading Hibernate-based systems (or any other O/R-based system), I'd totally agree that short SQL scripts are in many cases the only reasonably performant solution.

As an aside, I don't really like "classical" O/R (meaning, every field is a column and object relations are explicitly embodied in the DB layer) either because it is so brittle. It lacks any ability to "soft-upgrade" the data because the code is so rigidly tied to the DDL that you're forced to write tons of SQL or other migration scripts for every system upgrade. This, in turn, drags the deployment process into an hours-long affair and sharply discourages frequent upgrades. Despite being no fan of Agile overall, I have found that frequent, granular upgrades are usually better than months-long waterfall cycles, which I feel that classical O/R tends to promote.
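As a sketch of what a "soft-upgrade" can look like in practice, here is a hypothetical Java read path that tolerates rows written before a field existed by supplying a default at read time, instead of requiring a migration script before the new build can deploy. The record shape and field names are invented for illustration:

```java
import java.util.Map;

// Hypothetical soft-upgrade read path: rows written before the "timezone"
// field existed simply lack the key, so the code fills in a default at
// read time rather than forcing a one-shot SQL migration up front.
public class UserRecord {
    static String timezone(Map<String, String> row) {
        // Default applies only to pre-upgrade rows; new rows carry the value.
        return row.getOrDefault("timezone", "UTC");
    }

    public static void main(String[] args) {
        System.out.println(timezone(Map.of("name", "old-user")));               // UTC
        System.out.println(timezone(Map.of("name", "new", "timezone", "PST"))); // PST
    }
}
```

The trade-off is that the default lives in code rather than in the DDL, which is exactly what classical O/R's rigid column mapping makes difficult.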

Comment: Re:The usual consulting snake oil (Score 3, Insightful) 149

by engineerErrant (#46279737) Attached to: Can Reactive Programming Handle Complexity?

It's certainly valid that proper organization is far more the key to good code than the use of any language - my comments should not be taken as an ad for Java or any other specific technology.

That said, certain language features lend themselves to good organization much better than others. SQL faces several challenges here: (1) it's mostly a declarative language built on set calculus, which (again, in my opinion) makes it ill-suited for non-trivial business logic; (2) because of that, it can't be hooked up to a debugger in any normal sense, making maintenance and troubleshooting that much harder; (3) it's a separate "codebase" and technical competency from the "main" codebase (whether that's in Java, C#, Ruby or whatever), creating a competency barrier that must be crossed every time work needs to be done on that code; (4) it's not stored with the main codebase, but as a form of data, raising the issue of out-of-sync deployments with the app servers; and (5) far fewer developers know it well enough for complex uses than typical app-server languages, making staffing difficult.

Finally, I have personally always found large codebases much more manageable when written in a statically typed language (which SQL is obviously not). Not wanting to spark a flame war with Ruby or PHP fans, though, I will add the caveat that those languages, too, are much better suited for business logic than SQL's declarative style.
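To make the debugging and typing points concrete, here is a hedged sketch of a business rule that would end up as a nested CASE expression in declarative SQL, but is a plain, breakpoint-friendly, unit-testable method in a statically typed language like Java. The shipping rule and its thresholds are entirely made up:

```java
// Hypothetical tiered-shipping rule. As declarative SQL this becomes a
// nested CASE inside a query; as typed Java it is an ordinary method that
// a debugger can step into and a unit test can call directly.
public class ShippingRules {
    static int shippingCents(int orderCents, boolean rush) {
        int base;
        if (orderCents >= 10_000) {
            base = 0;            // free shipping over $100
        } else if (orderCents >= 5_000) {
            base = 300;
        } else {
            base = 700;
        }
        return rush ? base + 500 : base;
    }

    public static void main(String[] args) {
        System.out.println(shippingCents(12_000, false)); // 0
        System.out.println(shippingCents(6_000, true));   // 800
    }
}
```

The compiler also catches a misspelled field or a mismatched type at build time, which a stored procedure typically reveals only at runtime.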

Comment: The usual consulting snake oil (Score 5, Insightful) 149

by engineerErrant (#46279443) Attached to: Can Reactive Programming Handle Complexity?

As background, I am the director of engineering in a small Java/Postgres-based shop. We run a cloud backend for our mobile apps.

This "methodology" reads from the first sentence like an extended infomercial for a consulting shop, or a company trying to create the aura of "thought leadership" to get more investment cash. The formula is simple and time-honored: (1) arbitrarily single out a well-worn software practice to receive a snappy marketing name and be held above all other practices, (2) claim it's new, and (3) offer to implement this bleeding-edge buzzword to clueless executives. For a small fee, of course. It's the same formula that gave us Agile.

In my opinion, what they've described here is a large step *backward.* Not only is this a relatively trivial use of the GoF Observer pattern, but bizarrely, it's done in SQL using triggers, causing immediate database vendor lock-in and creating a maintainability nightmare. It's how software was made back in the 90s when Enterprise SQL database vendors ruled the land. Sprinkling business logic around in the SQL instead of centralizing it in a much more suitable language for logic like Java is a completely terrible idea, unless you're an Oracle sales rep.
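For contrast, here is a minimal sketch of the plain GoF Observer pattern in Java - the centralized, vendor-neutral shape of the same idea, with no triggers involved. All class and event names are illustrative:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Minimal GoF Observer: listeners register with the subject and are notified
// in application code, with no database triggers or vendor-specific SQL.
public class AccountEvents {
    private final List<Consumer<String>> listeners = new ArrayList<>();

    void subscribe(Consumer<String> listener) {
        listeners.add(listener);
    }

    void publish(String event) {
        for (Consumer<String> l : listeners) {
            l.accept(event);
        }
    }

    public static void main(String[] args) {
        AccountEvents events = new AccountEvents();
        List<String> audit = new ArrayList<>();
        events.subscribe(audit::add);                 // e.g. an audit log
        events.subscribe(System.out::println);        // e.g. a notification
        events.publish("balance-changed");
    }
}
```

Because the wiring lives in the application tier, it follows the code through version control, unit tests, and the debugger - none of which reach into trigger bodies stored inside the database.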

This one is safely ignored.
