Comment Re:Odd title (Score 1) 237

My problem is with the strongly opinionated frameworks. You know, the ones where you use framework 'X' and suddenly you have an 'X' website or an 'X' application? Where the framework's author makes the majority of the architectural and project organization decisions. Sure, they make 80% of the common usage patterns easy, but that almost always means making 10% of what's left hard and the remaining 10% nearly, if not entirely, impossible.

In a world where the programmer makes the rules, this is fine - they can find some awkward workaround, like forcing a second login or putting validation logic only on the client side. Unfortunately, in the business world, the customer is the one making the rules, and the customer always wants 100% of the product working the way they want it to work.

They seem to flourish - at least until the next fad framework - thanks to their low barrier to entry and focus on the intuitive space the article author references.

Hipster coder, though? I don't know. Almost every dev I know has some degree of interest in new languages, frameworks, libraries, and so on - their toy language or pet project. In fact, the biggest violators I see are experienced programmers who do know what they ought to, but don't want to retread that same tired ground when they just want to play around.

Not having to reinvent the same solution is exactly what they're interested in.

That's why you'll find a Karaf server running a Camel app in a Microsoft shop, even though there's no one left at the company who knows OSGi. Or why an external build dependency turns up in the form of a Rust script from out of nowhere. The people who should know better just want to play, and the easiest way to do that is to make it part of their day-to-day job.

What this article and my experience tell me is that, more than ever, we need technically competent managers who can evaluate, triage, and rein in individual developer output and provide focus.

At least in a business environment, where product delivery, support, and maintenance are more important than playing around.

Comment Re:Specialization (Score 2) 237

I've noted that businesses with a history of high growth and apparent long-term stability seem to be looking for "T-shaped" employees: people who know a little about a lot (a wide horizontal) and a lot about a little (a narrow vertical). Specialization is good, but you have to know enough to be able to respecialize in anything else quickly, which means dabbling in a bit of everything so that it's at least familiar.

Basically, a jack of all trades AND a master of one.

To use your analogy, you don't need to know how to rebuild the motor in order to drive the car, but you should be aware of all the parts and how they operate. That way, when something makes a noise, you not only know what it likely is, you also know how to fix it (or whether it even needs fixing), or at least where to start looking for that information - even if you lack the tools or experience to do the job yourself.

Comment Re:Glueing things together is how I teach OO desig (Score 3, Interesting) 237

That's my experience as well, at least as of the mid-'90s. Why use a mainstream, commercial language when you could use a theoretically complete one? One where they recognize that features like "output" and "input" are dirty exceptions that spoil the purity of your program when you're trying to get a good, hard proof using Hoare logic.
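
For the unfamiliar, the kind of proof obligation meant here is a Hoare triple - the one below is a standard textbook example, not anything taken from that coursework:

    \{\; x = n \land n \ge 0 \;\} \quad \texttt{while } x > 0 \texttt{ do } x := x - 1 \quad \{\; x = 0 \;\}

The assignment axiom \{ Q[E/x] \}\; x := E \;\{ Q \} pushes postconditions backward through pure assignments mechanically, but the base calculus has no comparable axiom for a print or read statement - which is exactly why the purists treated I/O as a blemish on the proof.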

Better not clutter up the syntax either by using confusing, pre-existing symbols to form constructs, not when you can add whole new symbols requiring a special (APL) keyboard!

I remember a TA once challenging me over an algorithm I had written to operate iteratively rather than recursively - I had noticed that the recursive version ran out of memory when fed large data sets - because to him, recursion was theoretically perfect and not using it was a personal affront. The fact that my code worked and his crashed after 4-5 minutes didn't matter.
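
To make the distinction concrete, here's a minimal sketch in Python - standing in for whatever language the course actually used, with a summing task invented purely for illustration:

    def sum_recursive(values, i=0):
        # Stack depth grows with len(values); past the interpreter's
        # recursion limit (sys.getrecursionlimit(), ~1000 by default),
        # this dies with a RecursionError on large data sets.
        if i == len(values):
            return 0
        return values[i] + sum_recursive(values, i + 1)

    def sum_iterative(values):
        # Same result, constant stack depth - no limit to run into.
        total = 0
        for v in values:
            total += v
        return total

    data = list(range(100_000))
    print(sum_iterative(data))      # fine
    # print(sum_recursive(data))    # RecursionError: maximum recursion depth exceeded

The recursive version is the tidier proof; the loop is the one that survives contact with real input sizes.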

To paraphrase a professor of mine, "There's absolutely no reason for a computer scientist to use a computer."

This sort of attitude was pretty rampant throughout my college career, which was fairly startling to someone who had already been employed in the industry for some years. No concept of real-world usage at all - or worse, some strange bias against it.

Comment There are a lot of strange allergies out there! (Score 3, Informative) 503

For example, every time I see or hear Donald Trump, or hear about his standing in the polls, I experience waves of nausea, get headaches, become irritable, and have trouble thinking anything other than 'dark' thoughts.

I know a lot of people who have the same allergic reaction, and I think it's only fair that we make the US a Donald Trump free zone, to end this sort of suffering.

Comment Re:Definitions (Score 1) 163

Their definition of male and female is explicitly 'sex' - that is, biological gender, as opposed to the social concept of gender. Pretty clearly stated.

What they're attempting to do is find out whether there are any consistent trends within each biological sex that contrast with the other - trends that could be used to reliably tell the two apart. As they found none, speculating on what a trans woman's brain would look like if they had found such trends is about as useful as speculating on the type of hat a Tyrannosaurus rex would prefer, if it had access to hats and the desire to wear them.

That is to say, completely pointless, on a number of levels.

Comment Wrong. (Score 1) 163

It doesn't prove that. It doesn't disprove it either. It makes no claims on that whatsoever.

Interestingly, this phrase shows up early in the article: “Nobody has had a way of quantifying this before,” which indicates that previous claims were invalid - based on supposition and assumption rather than data. They were, at best, hypotheses that lacked any mechanism to test them.

Should you assume that previous claims were valid, even without proof, I can see how you might conclude that we simply lack the tools to detect the differences. If you ignore the possibility that these things are shaped more by chemical, environmental, or cultural factors, and instead assume that brain structure contains an inherent representation of male-like vs. female-like brains, it's a pretty reasonable conclusion.

Except that the article then goes on to point out that all the things previously claimed to indicate gender were not reliable: there was no inherent consistency or commonality between results, except when you zoom out far enough to see a generic trend (one that cannot be used to reliably identify a brain as 'male' or 'female'). Again from the article:

"Some modest disparities have been reported: On average, for example, men tend to have a larger amygdala, a region associated with emotion. Such differences are small and highly influenced by the environment."

So the previous claims are considered invalid, and the assumption - assuming I'm not painting a straw man - requires them to be true and reliable. They are not, and while that doesn't rule the assumption out, it does make it less likely to be true.

Personally speaking, I believe 'genderness', as it relates to social norms rather than biological ones, is predicated primarily on those same social norms and cultural values, with biology serving as a tertiary influence after cultural and environmental factors. If that were not the case, it would be unimportant for a trans individual to look and dress in a stereotypical gender-specific way, except as a label to others. There's nothing that says you're not <gender> unless you dress like your culture's stereotype of <gender> - nothing except insecurity and social pressures. Nothing except what an individual allows inside themselves.

Is there really a problem with it being primarily a psychological state? It's equally valid, isn't it?

Comment This is an easy answer (Score 1) 169

Is there business value in retaining, training, and developing individuals who will become domain experts with cross-functional expertise and proficiency at working within your business structure?

        Well, of course. That's a rhetorical question.

The real question: is that value greater than the cost to retain them vs. the cost of hiring multiple underpaid, low-quality workers, perhaps from another country?

This is going to differ from industry to industry, and company to company. Those selling software as a service may find their apparent short-term advantage results in a severe disadvantage over the long term. Those releasing products may see great gains in that same long-term view, with a painful short term as their teams are brought up to date.

If any of these trends /are/ true on average, though, then the good companies will survive and the bad ones will go under in a socioeconomic version of natural selection.

In either scenario, there will still be some companies that value long-term employees, and some that don't.

Comment Re:15M (Score 1) 291

They've been saying this since the origin of the Luddites in the early 1800s!

Technology is a force multiplier. It allows one person to perform the work of many. Historically, instead of increasing unemployment, it's done the opposite - creating larger numbers of jobs, especially positions that can't exist without the large-scale economy technology provides. The ability to make a shirt by machine, thousands of times faster than by hand, necessitates buildings for the machines, a distributed sales force, logistical concerns like transport and storage, advertising, etc. Each technological advance may reduce the number of employees in that explicit role, but it increases the total number of jobs - and the total hours of work required - by an order of magnitude or better.

However, there's a limit. We haven't reached it yet, but it's there, looming in the distant future. At some point, we will reach peak technological assistance. Everything that can reasonably be automated will be. We'll literally have no need for all the people in the world to be working, much less the ability to employ them all.

Automated cashiers at fast food joints are not the limit, by the by. I'm guessing we're still hundreds of years away.

The hard part isn't getting there, though. The trick is going to be how we handle the run-up to this post-scarcity world. At some point in the interim, we'll still consider money to have a direct relationship to work done via services and goods, and thus to be necessary for comfortable living, while a significant minority will be unemployed and poor.

That 15-hour day you're referencing - that's going to be like the end times. It means either we have a revolution and start all over, or we jump the hurdle and enter some sort of Star Trek-like world where we accept that not everyone must work in order to have a comfortable life.

Personally, I don't see it happening in a positive way. The average human just isn't wired to be altruistic, and altruism is about the furthest thing from what our governments are set up to give us.

Comment Arbitrary choice of definition (Score 1) 568

Engineer doesn't necessarily have to mean "someone with professional certification who performs a task in a pre-designated method begotten wholly by one's predecessors."

The more general definition is simply "creator of systems".

More to the point, we do have professional certifications and professional societies, and while we may not be represented by a national-level federation with standardized capabilities, we still present a much-in-demand skillset at a variety of levels. If I had to guess, I'd say the engineering certifications are themselves simply a relic of the times: one part holdover from when unions were necessary, plus one part CYA legal protection via standardized best practices. Those are two things that are not (yet) needed in the software development world, and I can't see the former ever becoming relevant.

Additionally, as engineer itself is not a protected term like dentist or doctor, anyone can use it. In the US, the best you could say is that you're an NCEES-certified engineer, but that's not a requirement to work as an engineer, nor to claim that you are one. In fact, most states have their own requirements beyond the NCEES, and you couldn't claim to be a licensed engineer without meeting /those/ requirements. In other words, there are various licensing and certification bodies out there, with different goals, requirements, and levels of measure. Just like there are for programmers ...

All that being said, I actually prefer the term Software Developer or Software Programmer. Not for any of the reasons the article lists for why we're not engineers; I just think Developer and Programmer are more accurate terms. And though I detest the term the author coined, "Engineerwashing," I agree with his list of motivations for using it: it's basically to make the job seem fancier, neater, more professional. Software Engineer is what I put on my resume, because nowadays it's a better match for job postings than developer or, god forbid, programmer.

Then again, I liked the original definition of the term "hacker," and only recently stopped correcting people with "You mean 'cracker,' right?" when they use it in the more modern sense. I get it. We're not France, though - language is a continually evolving thing, and you've got to keep up with the times.

The fact is, we're all Software Engineers now. That's the name for what we do.

Comment Fancy title, but misleading (Score 1) 321

This wasn't about the ethics of D&D. This was one person's opinion about how alignment should be handled in an RPG, presented as factual, objective instructions.

For the fun of it - and countering his /suggestions/ - how about these definitions:

"Alignment is how you treat everyone who is not in your party."
"Alignment is a rough pigeonholing of your moral and philosophical outlook, used to qualify for magic spells and effects, and can change on a day-to-day basis based on your actions or justifications for actions, but should restrict you in no way - it's a classification, not an attribute."
"Alignment is a silly thing, and it's implementation is overly restrictive, making it difficult to role play realistic, complex characters. So we're not using it in this campaign." ... and to counter individual suggestions about the impact of alignment:

    There's no need to make every crime result in moral reflection or in-game (negative) consequences - this is a game, for entertainment, not explicitly a forum for ethical reformation.
    It's okay to have a world fully populated with cartoonish, evil beings and villainous stereotypes who can be abused in any number of ways, with joy and abandon.
    Since the goal is fun, you can have fun 'beating the game' without even descending into roleplaying, much less worrying about how an arbitrary classification is supposed to straitjacket your character into behaving in a single, stereotypical way.
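
To put that second definition in concrete terms, here's a toy sketch in Python - the names, spells, and rules are invented for illustration, not taken from any edition of the game:

    from enum import Enum

    class Alignment(Enum):
        GOOD = "good"
        NEUTRAL = "neutral"
        EVIL = "evil"

    class Spell:
        def __init__(self, name, allowed):
            self.name = name
            self.allowed = allowed   # alignments that qualify for the effect

    class Character:
        def __init__(self, name, alignment):
            self.name = name
            self.alignment = alignment   # a label, freely reassigned as play unfolds

        def qualifies_for(self, spell):
            # Alignment gates access to magic; it never dictates
            # what the player chooses to do next.
            return self.alignment in spell.allowed

    smite = Spell("Smite", {Alignment.GOOD})
    pc = Character("Violet", Alignment.NEUTRAL)
    print(pc.qualifies_for(smite))   # False today...
    pc.alignment = Alignment.GOOD    # ...a classification can shift...
    print(pc.qualifies_for(smite))   # ...True tomorrow - a label, not a straitjacket

The point of the design: alignment is read at qualification time, never enforced at action time.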
