No, that doesn't follow. This "toxic employee" thing isn't a big enough problem for anyone to torch their hard-won career by mounting a discrimination lawsuit that's doomed to fail anyway. I clearly said this does not apply to most minority employees, and was merely making the point that such bad behavior *exists,* not that it's prevalent.
This is absolutely true. I, my wife, and my friends have all directly observed this happening, right out in the open. It doesn't happen with all "disadvantaged" employees, but with problem employees who use their political status as a weapon and veiled lawsuit threat against HR.
To be crystal-clear, I and others close to me have explicitly heard sentences of the form "we can't fire him/her; it's not worth the lawsuit," spoken aloud, by decision-makers, clearly as a matter of policy and not as an off-hand crack, more times than can be considered a fluke.
These "poison pill" employees are a minority among minorities, but they definitely exist, and they ruin things for everyone.
Energy production has impacts all over our culture and economy - it's short-sighted to look only at the (clearly negative) environmental effects. We also need to consider the job and GDP growth that oil can produce, at a time when our economy badly needs it. Then there are the (clearly positive) national and economic security implications of being energy-independent.
This doesn't mean we shouldn't also have a balls-to-the-wall, fully government-assisted race toward cleaner energy. But we're far from being able to rely on that for more than a small fraction of our needs. The R&D and infrastructure upgrades will take decades, and our only usable "bridge" to get there is to continue burning anything that will hold a flame.
I am part of a small independent app company as well, and we have been on the opposite side of this issue a couple times. It is just as common, particularly on Android, for low-budget teams (often in third-world countries) to purposely build a confusingly similar product in an attempt to make a few bucks before they get shut down. Honest developers face trolls on both sides.
Half our defense is what steps we've taken to work within our awful, awful system of IP law. Copyrights, trademarks, even a couple patents. The other half is maintaining good relationships with Google and Apple, so that when a problem arises, we can appeal to them quickly. I think both have been absolutely essential.
If you have not spent the time and money to build your legal bullshit-shield, you should do so ASAP, but be prepared to take some heavy losses before the dust settles - your troll has home-turf advantage here.
Good grief, it's a resume point system. It's *supposed* to be over-simplified and callously reduce all the richness of a human being's life efforts to a single, faceless number. Its sole job is to efficiently extract a strong team from a given applicant pool, and to do it fast enough to get the best applicants before other companies do, without wrecking the team's productivity interviewing every candidate under the sun. A willingness to search for hidden gems may sound fair-minded, but it doesn't lead to a good outcome.
And, I hate to say it, since this will likely not help build agreement, but my startup-focused point system also explicitly dings freelancers, as well as former non-military government workers. So, despite your likely objection to this, hopefully you'll grant that the system is at least internally consistent.
No, I wouldn't interview you because you misinterpreted what I said to mean "no interview," rather than a -2 score for a single resume line item, and because you assumed I was a manager, which I never said I was. Engineers need to be precise thinkers. Otherwise, though, I'm sure the rest of your extensive resume would have added up to a pretty good number in my made-up system.
The reason why your certification is both good, and still irrelevant to the posting, is that a pro-serv contractor is a completely different beast than a normal software engineer. Someone being put in front of customers certainly should have all the "pieces of flair" that impress customers, regardless of what they actually represent. My only assertion is about interviewing pure software engineers, which the OP would seem to be about.
This is not about Google - I do not work there. I have not for a long time. I mentioned them solely for their study on certification. And FYI, the Google employees in my group were pissed about the trash-can thing too.
Your final sentence, about each company having its own unique needs, supports my point that one-size-fits-all certifications are BS.
Yeah...accounting turns out to be a different field than software. I am not saying that sheriffs shouldn't be certified in firearms, or surgeons shouldn't be certified by the medical board. But in the specific field of software engineering, certification is a (mostly) sure sign of reduced competence.
Furthermore, I have spent the vast majority of my career (and all of those hundreds of interviews and resume reviews) outside of Google. My personal experience, which I will back up with the firmest of conviction, is that filling an office full of XXX-certified software engineers involves basically the same level of intelligence as buying Powerball tickets.
I have conducted probably 100 interviews and reviewed hundreds more resumes. Over time, I have developed a point scoring system based on various items I see on people's resumes: +1 for each job in the same tech stack we need, -1 for leaving a job in less than 6 months, etc.
I actually give -2 for certification. That's right, certification will, in my book, nullify the positive impact of an engineering degree *and* one relevant job. Why? Because it is, more often than not, a means of hiding shortcomings behind the veneer of something that seems official.
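To make the arithmetic concrete, here's a minimal sketch of how a point system like the one described might look. Only the +1 / -1 / -2 weights come from the comment above; the data model, the +1 for a degree (implied by "nullify the positive impact of an engineering degree *and* one relevant job"), and all names are illustrative assumptions, not anyone's actual system.

```python
# Sketch of a hypothetical resume point system. Weights for matching
# jobs, short tenure, and certification are from the comment; the
# rest is purely illustrative.
from dataclasses import dataclass, field

@dataclass
class Job:
    stack: str    # primary tech stack at this job
    months: int   # tenure in months

@dataclass
class Resume:
    jobs: list = field(default_factory=list)
    has_engineering_degree: bool = False
    has_certification: bool = False

def score(resume: Resume, needed_stack: str) -> int:
    points = 0
    for job in resume.jobs:
        if job.stack == needed_stack:
            points += 1   # +1 for each job in the tech stack we need
        if job.months < 6:
            points -= 1   # -1 for leaving a job in less than 6 months
    if resume.has_engineering_degree:
        points += 1       # assumed weight: one degree = one relevant job
    if resume.has_certification:
        points -= 2       # -2: cancels a degree AND one relevant job
    return points
```

The point of writing it down is how little code it takes: the system is deliberately dumb, and its value is in being fast and consistent across a big applicant pool, not in being fair to any individual resume.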
I am mostly a startup guy, but I have also worked at Google. Google actually conducted a large survey of all their applicants' resumes and cross-referenced the words they contain with how "successful" those people were at the company (I do not know how they defined that). There were no sure-fire words indicating success. But there was one that predicted the opposite: that's right, "certification."
The poster's interpretation seems completely off-base to me; not only is Snowden not encouraging us to blindly trust Apple et al with our privacy, he explicitly warns of the very danger the OP brings up.
As an iOS developer, my perception is certainly not that Apple is trying to grab our data instead of the government - in recent years, they have started a major cultural shift toward real protections of user data - simply not collecting it, encrypting it in transit, etc., etc., even if it's a burden on third-party developers to make the transition. This is a Good Thing, full stop. Props to Apple (as well as Google, who is also making its own efforts).
The makers of this article clearly have no background in computing, or journalism either for that matter. I'm surprised I didn't see a reference to the Illuminati in there somewhere. Bizarrely, the article doesn't even mention Dart, which is no doubt due to the two-minute Bing search that I'd imagine formed the entirety of their background research.
When considered against the status quo for their purposes and eras, all of these languages show significant, useful advances in programming. And if we're going to declare all languages that are created by a for-profit corporation invalid, say goodbye to Java, C#, C++, and C. Hell, even the Jacquard Loom was meant to make money.
...since 2009, and we shouldn't be too shocked here.
I am a longtime Java engineer and was genuinely furious at the slow, humiliating decline I knew was coming for the language when the Lich King decided to buy Sun Microsystems. But I am past that, and for years have been preparing for the day when I would step off that ship instead of going down with it, as I think any engineer who's aware of the industry ought to be doing.
The main complication here is Google's insistence on keeping it alive, and even doubling down on it rather than taking the much smarter approach that Apple did with Objective-C. Go and Dart are both solid languages, but Google just isn't taking them seriously enough to make either the new lingua franca of the 2020s (yet).
Why would anyone take something seriously that was created and peddled by consulting outfits at $700-per-hour bill rates? I had the misfortune of being incarcerated at Pivotal Labs for a month by a misguided boss who thought their bizarre religion was the answer to all of life's ills. Boy, was that eye-opening.
As many people rightly point out, that doesn't mean we can't pick up some new ideas from it - my company now does short daily meetings and keeps a chart with everyone's daily tasking on it, and those have proven very effective. But the other 99% of what the Snowbird 17 vomited forth upon our industry is empty zealotry and jingoism. It was like Scientology right in our codebases, and worked about as well. And no, for god's sake, it wasn't because we just weren't doing it well enough.
We do have shockingly dramatic quality issues in the software industry, but they will never be addressed by the next dumb-ass management fad. We need to sit down and re-think the ways that we learn to code, get serious about "Software Engineering" degrees in our colleges, and let go of fetishized code patterns as the primary unit of engineering ability. In my own experience, we know plenty enough design patterns, but almost no one understands how to exercise coding judgment in the context of a team or long-term project.
FYI, I'm an iOS developer who uses a mix of languages, including Ob-C, every day. My coworkers and I met Swift with a mildly positive reaction - it's a decent, if imperfect, effort. It's not the second coming of Christ, but it definitely isn't a bad thing to try to modernize some of Ob-C's age-related shortcomings. The notion that we'd re-write code just to use it is utterly laughable, but we could certainly see ourselves using it to start a new app, or maybe at our next jobs.
The OP, though, sounds like a marketing intern wrote it. To add a little historical perspective: our apps are a riotous mix of C and C++ (for Android portability) and Ob-C. We chose to do it this way, within the last 3 years, so this is not a legacy issue. Both C++ and Ob-C were, at one time, meant as "replacements" for C, and we know how that turned out.
Swift may very well become the preferred language over Ob-C within, say, 5 years, for Apple-specific development. But the breathless "it'll replace C!" rhetoric is just silly. Over the coming decades, C will surely fade, but it will be replaced by other, newer, even more amazing tech, not just Swift.
The FBI should have no trouble pulling it off then.