To say that brainstorming flat out doesn't work? Now that's just a grabby headline that got me to post this rant.
I think there are two camps here that really need to be addressed, and they showcase how skewed the write-up is:
Yes, brainstorming in a forced group is utterly pointless most of the time. You have people who don't want to be there and are just warm bodies in chairs; ones who do want to be there but shit on every possible solution to protect their 'body of employment' (with less, or more, work); ones who just throw out buzzwords to look important but can't implement or do shit; ones who roadblock everything because they want to wrap some corporate or bureaucratic tape around it to 'process-ify' the idea; etc. The list goes on and on. That's at least my experience with it, anyways.
Now, brainstorming in a group in the sense of you, the brainstormer, seeking out some group (peers, a few colleagues, etc.) for input on your idea --- to make sure there isn't another/better/alternative way you'd miss because you're too deep in your own mulling, and to get actual feedback from people you actually care to hear from --- I'm all for this. The point I'm driving home is the constructive criticism and peer input that solidifies, trims down or confirms your idea in the first place.
Why in the hell does this topic become a recurring post every handful of months? I'm not opposed to fielding a ranty opinion that will be voted down, shit on or maybe even considered, but do we really have to feed the bear on this?
Maybe I'm just rubbed the wrong way on the justification for the question:
1) OP seriously references Windows 3.1/95/98? When was the last time you used a 'computer'? And we're really entertaining this?
2) OP asked for, and used the word, 'easy'. Well, Linux isn't 'easy'; it's a kernel. If you want your experience and interaction with Linux to be 'easy', then say that. If everything were easy, everyone would be doing it. That just tells me you're lazy; this isn't the 1990s like the OSes you referenced, FFS. There are PLENTY of distros with blog reviews you can find with about 30 seconds of actual search engine use, or just try anything --- most have a bootable CD or USB.
I don't even know what mechanical whatever you want to monitor or control. But chances are, your environment will be Linux-distro agnostic. Maybe you should have just explained exactly what you wanted to do in a Linux userland environment, and it wouldn't have turned into yet another BSD vs. RPM-based vs. Gentoo vs. Debian-based vs. inbreds-of-Debian-based flame war.
I'll commend you and the few old hats around you on being self-starters, learning and adopting tech/hardware/development/engineering on your own, and trying to share and communicate that in-house. I think the ability to learn, fully understand and properly implement anything --- to do more than just nod your head and grasp a topic for 5 minutes --- goes a long way.
But I think that's where it starts and stops right now. What you have is a bunch of self-taught experts trying to carry on a vision-less and foundation-less IT department with a 'Fight Club' ruleset of "the first rule of our company is you do not talk about the IT assembly, or the lack thereof". You need IT, not for the knowledge and expertise (because it seems like you have some idea what you need to do and how to be productive with technology), but for two reasons:
1) To get the damn day-to-day IT burden off your shoulders, so someone who has managed, worked and operated in an IT environment can come in and set up a foundation, standards, expectations, operations, training and management of this shit --- not you guys, who are hardcore dabblers.
2) So you can focus on the jobs you are PAID TO DO.
This isn't a new problem; it just means your company doesn't value IT, because you've all been doing it yourselves and nobody sees the pain points while you keep 'making it happen'. But that can only go on so far. If it's a company cheapskate problem, where the idea has been brought up before but got shot down because 'talent is expensive', then I guess find all the pain points you've been quietly absorbing and put a dollar figure on them.
This shit happens A LOT. And having made a career in IT myself, there's nothing worse than seeing, and empathizing with, the other side of the coin: engineers, scientists and other staff doing IT in whatever capacity they can handle, failing at it, and not really focusing on their true job, which wasn't IT to begin with.
So there's an announcement that the Nintendo Switch ships an outdated WebKit framework. Is this anything new? How is that any different from any IoT device, any smartphone (Android or iPhone), or installing any Windows/Linux OS on an Xbox/PlayStation? Does anything deployed out of the box not already ship with packages that have pending security updates?
Fun to report from a journalism perspective, but definitely not news or anything to debate. Just update the Nintendo Switch and stop reaching so hard for a feeble criticism of the console or Nintendo.
I'd say this has very little to do with bubble talk or jobs not existing and everything to do with the following things:
* Where you decided to go to school in relation to the 'quality' of the program
* The quality of the faculty, staff, program and curriculum in terms of a mixture of academic and real world exposure
* If you, in terms of skills and potential, are even worth a damn to any future employer
I see and hear this shit all. the. time. in the computer science, information systems (where I reside) and engineering realms, and guess what? Not everyone who starts, goes through or completes a program is good at it or even cut out for it long-term. STEM, EE and infosec are hot, so people just jump on the degree bandwagon thinking they're going to land amazing jobs, when at most either the curriculum fails them (e.g. shitty professors and a lackluster, poor-ass program), they lack the motivation to be more than a hyper just-out-of-school know-it-all, or they flat out think they're going to land a 6-figure 'side hustle'.
I think we hear a lot of this because college graduates' expectations are sincerely and truthfully out of whack. Yeah, a lot of universities boast some unbelievable 99%+ straight-off-the-stage hire rate, but that's mostly marketing bullshit to get you, the student, enrolled. Just because you 'got a degree' doesn't make you hireable or even desirable to hire. I hate to say it, but there needs to be more ownership and onus on the student-to-be-employee than on always pointing the finger back at the university for not making them 'employable'.
I have a mix of friends I went to college with who have a BS/MS in computer science or engineering and don't do shit with it. I also have friends who are really excellent IT professionals or software engineers without a true computer science BS (one of them has a degree in music education!).
It's a valid argument that holds weight, and I'd even take it a step further: how general users work around the rules to keep making 'new' passwords is really scary, because it's predictable, and in the exploding age of AI, machine learning and modeling, these rules are, indeed, a joke. For instance...
Just what I observe and know to be true: I can't tell you how many people who don't even know what 5cr1p7 k1dd13 language is blatantly substitute the letters S, E, A, I, T and B with 5, 3, 4, 1, 7 and 8. That's an easy substitution and gives you a very predictable 1:1 substitution pattern. Then simple typing-pattern heuristics get you a bit farther, predicting where most people 'prefer' to hit the shift key, which is mostly at the very beginning or very end of a string. Coupled with all the password advice pushing shitty, generic and way-overused mnemonics, it builds a good, solid, guessable foundation for arguing these mandatory rules are bullshit, indeed. I haven't even mentioned that a lot of people just use very linear, horizontal patterns on the keyboard, then on the next password change shift over 'a key' and do it all over again. To the end user, that ensures they'll never trip a bullshit 'last reuse history' rule, but it's even MORE guessable than just building your own rainbow table from predictable typing behavior and mnemonics alone.
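To make the 1:1 substitution point concrete, here's a minimal sketch (my own illustration, not taken from any real cracking tool) of how cheaply an attacker can enumerate every leet-speak variant of a base word --- the search space barely grows:

```python
from itertools import product

# The 1:1 substitutions described above: each letter either stays or swaps.
LEET = {"s": "5", "e": "3", "a": "4", "i": "1", "t": "7", "b": "8"}

def leet_candidates(word):
    """Yield every variant of `word` with each substitutable letter swapped or not."""
    options = [
        (ch, LEET[ch.lower()]) if ch.lower() in LEET else (ch,)
        for ch in word
    ]
    for combo in product(*options):
        yield "".join(combo)

candidates = list(leet_candidates("secret"))
print(len(candidates))         # 2**4 = 16 variants for 4 substitutable letters
print("53cr37" in candidates)  # True
```

Sixteen candidates per dictionary word is nothing; a cracker just folds this into its wordlist, which is exactly why the substitution 'rule' buys you effectively zero entropy.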
Now, the question is: would I actually drop these rules in my own organization like Jeff Atwood wants? Absolutely not. Because then I'm absolutely positive the old 'top 10' most commonly used passwords would for sure be in full damn effect. I'd prefer to feel ignorantly secure with the end users I administer around me.
This really isn't news; it's just countries trying to save face with a quick shaming finger-wag at the US and CIA, a 'get off our digital lawns'. Every country has, does and always will develop in-house cyber-warfare and hacking toolkits for whatever op --- espionage, defensive or offensive --- it runs.
This is easy for China: I mean, who the hell wouldn't jump on the shit-talk bandwagon to get a few jabs in after a release like this, just so you don't look 'as bad'?
All immediate perception here IMHO.
Wow, there is someone I can relate to on this.
I couldn't agree more with getting the first two points out of the way in an interview. Regardless of intellect, exposure, industry or experience, who wants to work with someone the whole team is going to hate? Team mental health far outweighs having that person on board any day, IMHO.
Secondly, I had a similar experience in a job interview where I was asked to write out map/reduce in pure Python program structure (yes, that means including __name__ == '__main__' with full passable arguments, on a whiteboard). I said almost the same thing you did: "I can do it, but I'm sure to flub a few things here that my brain relies on my IDE for; not to mention, I'd just use the built-ins map() and reduce() vs. re-inventing the wheel and sacrificing efficiency in my algorithm."
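For anyone who hasn't had the pleasure, here's roughly what that whiteboard exercise boils down to; the squaring-and-summing example and function names are mine, not the interviewer's, and note that in Python 3 reduce() lives in functools rather than the builtins:

```python
from functools import reduce  # Python 3 moved reduce() out of the builtins

def my_map(fn, items):
    """Hand-rolled map(): apply fn to every item, collect the results."""
    return [fn(item) for item in items]

def my_reduce(fn, items, initial):
    """Hand-rolled reduce(): fold items left-to-right into one accumulated value."""
    acc = initial
    for item in items:
        acc = fn(acc, item)
    return acc

if __name__ == "__main__":
    nums = [1, 2, 3, 4]
    squares = my_map(lambda x: x * x, nums)            # [1, 4, 9, 16]
    total = my_reduce(lambda a, b: a + b, squares, 0)  # 30
    # The built-ins the interviewee would rather reach for do the same job:
    assert squares == list(map(lambda x: x * x, nums))
    assert total == reduce(lambda a, b: a + b, squares, 0)
    print(total)
```

Which is exactly the point: the hand-rolled versions prove you understand the fold, but in production you'd use the one-line built-ins.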
I wasn't really offended or turned off by the idea; sometimes I think the point is: if you can talk the talk, can I figure out whether you can even sort-of walk the walk, instead of just repeating buzz-phrases until, 2 months into the job, it turns out you can't do it? But I think most of these interviewers fall into that hard-on, egotistical, I-know-more-than-you shit, and to me that's like arguing whose dad could win in a fight in 3rd grade. I'm past that in a professional environment where everyone can bring shit to the table.
I'm not sure I'm a fan of the 'software'-driven UI home button; I certainly don't care for it on any of my Android-breed devices. I like the idea and design of a physical hardware button, but I won't like it if ditching this gives Apple more courage to mess with the rounded-screen design --- last time I checked, buttons are flat.
If anything it's going to take me a really long time to get used to not having that little indentation to blindly hover-touch my thumb on to do anything.
I can't speak for everyone else, but all this AI, machine learning, heavy-algorithm, neural network and data mining work that's been going on for well over a decade now, and has become almost normal tech-news conversation, is really scary as hell.
For starters, the claim about the quote-unquote "internet" and the plague of social media is that it's given absolutely everyone a platform to opinionate, alienate, berate, tolerate and flat out hate anyone, any topic, any agenda, any other opinion, idea, thought, preference, look, feel, etc. Let's face it: all of that alone has opened Pandora's box for a metric shit-ton of people who flat out should not be sharing anything that bubbles up in their skulls. So now we all sit here with big thumb-tapping or keyboard-clacking loud mouths who can't act appropriately in a digital world.
But I have to say, when the hell did everyone become a bunch of sensitive Sallys, taking everything at face value and buying into some internet handle's drivel (or lack thereof) as hate speech? Look at Slashdot and the Anonymous Coward approach. Hell, at least we provide anonymity and low rank to toxic troll garbage here.
All that aside, we don't 'remove' it, cover it up and scrub it away just because everyone likes to wave the I-am-offended-all-the-time flag. It becomes part of the culture, the ambiance (if you laugh at it, I guess) and the overall conversation. We don't un-ring bells, do we? I don't see how that's any different digitally.
An upfront caveat: I haven't spun up Upspin yet, but I did look at the code for about 15 minutes on GitHub. So no, I haven't actually run it.
I do have to shit on Brian Fagioli at BetaNews here: stick to objective reporting and keep your less-than-technical, biased opinion out of the article, FFS. All that wanking about 'Unix-like directories' and it being written in 'Go' just proves your ignorance of the tech world in general. My advice, for starters: stop being a tech reporter and stop referring to yourself as 'submersed in technology', because you are clearly a posing, douchey idiot. What world IS NOT built successfully on a 'Unix-like directory structure' and a bleeding-edge language like 'Go'?
Go is a fantastic language for any sort of platform-friendly deployment; I've been using it almost exclusively for very system-heavy development that I need to port seamlessly between lots of UNIX platform variants. What's the problem with that?
Well, Brian, to wrap your head around things you can relate to: better toss that MacBook you authored your article on (a BSD variant with a Unix-like directory structure), stop watching Netflix (hosted on Linux and some distributed, POSIX-friendly, Unix-like filesystem), and don't put anything on Dropbox anymore (hosted on Linux and some distributed, POSIX-friendly, Unix-like filesystem). Get my point? Stop whining. Just because it's over your head doesn't mean it's over anyone else's.
I think a 'good job' for a developer is a really hard thing to quantify. The number of contexts and work scenarios would make your head explode, honestly.
What if we're talking about a one- or two-developer shop where hackish amateurism and 5-minute Wordpress sites seem like 'magic' and just 'work'? Or the complete other side of the spectrum, the Googles, Facebooks, Amazons, Snapchats, Instagrams and Microsofts of the world, fixing an ultra-complex situation in 5 minutes in a way that's nearly bulletproof, with 'all the bases covered' and minimal room for not getting it right the first time?
I'd agree with most on here that, to me, at the end of the day, it takes one to know one --- ESPECIALLY when you've had to do software development in any real-world context, support it, and have a business function rely on it. I've had SO many developers tell me how 'awesome' they are and say "I have two bazillion lines tied into this", and with enough experience to sniff that out, you know it's either genuinely _that_ complex or it's bullshit. And I think the other thing is the functionality piece: it has to work, work well, and accomplish more than the bare minimum (from the start).
Look at what most of us do when we have a car issue and don't know shit about being an auto/engine mechanic: we take the mechanic's word on whatever they tell us is wrong, as long as we get a working-like-we-had car back in return. That 'progress' could mean it took 5 minutes to actually fix your car while 10 hours of labor came straight out of your checkbook. But if I were any sort of mechanic, I could rightfully call them out, right?
So totally true. If anything, companies like Fitbit tried to rally around lazy-ass people who needed a gadget and some really poor apps to hold them accountable. Isn't that how the gimmicky diets that pop up at the turn of every calendar year work, as well? Sell you an unrealistic grand idea/plan when all you need is some humble pie, self-worth, a bit of dedication, and not caving on the day-old donuts Carol from 'Accounting' brought in to share. Genetics aside, shit, people: if staying in shape, having a six-pack, eating like a rabbit and looking like your celebrity-of-the-day were so easy, we'd all be doing it already.
I'm glad I was in the camp of buying the timekeeping-and-notifications wearable as an attempt to wear a wristwatch again, instead of jerking that phone out of my pocket every 3 seconds...
A computer without COBOL and Fortran is like a piece of chocolate cake without ketchup and mustard.