Comment Checking for fundamentals is the way to go (Score 1) 809

One of the problems that I have in IT is that many companies expect that new candidates have experience with all the equipment they would be expected to handle. For example, if you're a systems integration person who works in an HP shop with IBM storage and Juniper networking, it's hard to jump to a Dell/NetApp/Cisco environment. Not technically hard, mind you -- every vendor has their quirks but the same stuff gets done on everything. The hard part is convincing the interviewer that you have enough generalist skills to pick up what's needed. I regularly work with different vendors' equipment, support procedures, ways of handling patching and firmware management, storage configuration and so on. Do I go to training classes and spend months learning? No, I use my knowledge of what needs to be done at a high level and research how that particular vendor implements it, picking up what I need as I go.

That said, there are some basics I agree with the submitter on:
- A person with 20 years' dev experience should have at least encountered PKI at some point!
- A person who just graduated from CS should be able to code a simple example without too much trouble...nothing fancy, just basic code literacy stuff.
- A person with experience managing HP equipment should at least be able to articulate what may or may not be different managing IBM/Dell/whatever equipment.

I think two things need to happen.
1. Companies need to stop hunting for the exact right person and hire someone with the expectation that they'll learn on the job. You may not be 100% ready to "hit the ground running" (I hate that phrase) but if you have the skills and desire to learn you'll figure stuff out. The company I do work for has some very proprietary stuff that no one outside of the industry knows before coming on board. It's expected that new hires take a while to become fully productive.
2. The crop of employees does need some improvement. This is way easier said than done - smart people are scared of taking a job in IT or development because of the threat of outsourcing or automation. I personally think we're still working through some of the people who got into IT in the first dotcom boom, and now we have a second phone/social/mobile/big data bubble pumping more people into the field. In this sense, I feel for employers because their hires need to have the potential to pick up the skills they need if #1 is to be achieved.

Comment Any CS push is probably not a good idea (Score 1) 288

Whether it's girls-first or for everyone, I don't know if pushing a single field or skill ("coding") is the right idea. "Coding" is increasingly becoming stratified due to the outsourcing of routine work. You have people working on the core guts of operating systems, VM platforms, etc. who are very high-end and always in demand, but you also have a huge glut of mid- and low-level coders. These are the corporate IT developers doing Java or .NET CRUD-style applications, and it's becoming pretty clear that outsourcing is killing a lot of that work or making it less profitable. (The other elite-level coder is the serial consultant who flies in to correct the messes the outsourcers deliver, but that's another story.) There's also a whole other bunch of mid-level "coders" writing phone apps or website pieces in various application or web frameworks.

I'd be more interested if there were a focus on developing core skills (logic, troubleshooting, and a comfort level with technology beyond end-user status) early on in school. People with this fundamental layer of knowledge are useful in many different fields, even non-technical ones. Pushing coding, nursing, or any other "hot, in-demand" career path is going to lead to a glut of graduates who have a low skill level and limited prospects once the hot field is cold again. I do systems integration work, and I can't stand seeing "developers" who have absolutely no idea how what they write runs in the real world. There are a finite number of both men and women who are suited for this field. Pushing more people into it rather than finding somewhere they fit better is a bad idea.

The problem is, education-wise, we tend to come back to chasing fads. I'm just barely old enough to remember in the late 80s when the Japanese were supposed to take over the world, and education systems were looking at how to apply their methods here. Then there was the finance boom, then the dotcom boom, then the real estate boom, then the second dotcom boom...who knows what's next?

Comment Yes, there is a shortage, but maybe for a reason (Score 5, Insightful) 254

It's 2015, and most of the egregious geek stereotypes have changed significantly. But the day-to-day of development and IT is still much the same. Development is a very solitary experience, as is IT once you get out of run-of-the-mill support. I know I've spent stretches of a few hours digging through log files, troubleshooting an intermittent problem, etc. by myself. Even with agile development, pair/team programming, and every other coding fad that makes people work together, there is a lot of time spent alone solving problems. I like doing this -- it fits my personality type. Do most women? Probably not; I'm guessing most would rather be in social situations. Do some? Sure, I've worked with a bunch.

Being married to a woman, and now having a daughter, I can safely say that men and women are very different creatures. I think women self-select out of IT and development mainly for the following reasons:
- Perceived lack of socialization, and yes, the nerd stereotypes are still there to a lesser extent.
- Especially in workplaces that suck, the work/life balance is screwed up. My wife and I both work; I'm in IT and she's got a corporate finance job. We are both incredibly lucky to have good employers who don't death-march us on a regular basis. I know many more people who don't have this luxury. If you're a woman, and wired like most women, you will want to take care of your children more than spend extra hours at work. I feel that way too, and this is coming from someone who really loves his job and loves digging into strange problems.
- Women are smart, and they see the writing on the wall for the IT/dev industry. Now that it's "easy" to program an application for a phone, and more aspects of systems management are automated, there will be an inevitable reduction in employment and salaries across the board. These days, you really have to be on top of your game to stay employed at the higher salaries, and be constantly learning. There are a lot of jobs that have less of the constant retraining, are more stable, and have a better balance.
- Especially in the SV startup/web/social media sphere, the rise of the "asshole brogrammer" stereotype as evidenced by many stories all over the tech press might be scaring women away too. This is kind of the opposite end of the nerd spectrum -- now that development is open to more people, the more extroverted fratboy types who got through CS are founding startups and getting themselves into sexual harassment trouble.

Do I think any of this encouragement works? Not really. I think what would work is to keep developing girls' logic, problem solving and math skills at an early age. Those who excel at these and can handle all the other crap that comes with an IT/dev job will gravitate toward it. Others won't, and we just have to live with that.

Comment I wonder what the motive is (Score 3, Interesting) 127

OSS stuff like Linux and xBSD is already out there, and they can build their own back doors. Microsoft already gives companies and governments access to the source code for its products. I guess the mainframe providers (IBM, Fujitsu, etc.) are the only ones left that this would affect. That, and the network device manufacturers...I could definitely see Huawei getting a boost by being the only network device manufacturer allowed to sell to Chinese banks.

I guess the question is why -- every country on earth spies on every other country and its own citizens. So, it's probably being done to boost domestic companies. One of the things that's really going to make China come out on top this century is their ability to do stuff like this...it's one of their greatest strengths. If they decide they want to do something, it's done with zero debate. Their big overarching project right now is a massive urbanization project -- just picking up millions of rural peasants and physically moving them to cities. Can you imagine the US or a European country trying something like that? It would never work, look how much people complain when a local government uses eminent domain to build a road or public works project.

The summary is right though - companies can't ignore China. There are billions of people and a huge growing middle class, all with the full will of their government pushing through whatever is needed. There are always possible bumps in the road, but I'm assuming China will be the dominant superpower in a couple of decades just because they can make stuff happen that we can't/won't.

Comment Re:This is one of the reasons.... (Score 2) 158

I have heard rumors from folks who work at MS that he was basically blinded by his vision, and didn't want to listen to anybody. The result, as we all know, is Windows 8.

I heard the same rumors. What's interesting is that some people (Steve Jobs, etc.) can get away with that, and others (Ballmer/Sinofsky) can't. Jobs had to literally die before Apple made a large-screen iPhone, and I don't think we'll ever see new physical buttons on an Apple product again thanks to his minimalist design manifesto.

If MBA programs actually teach anything useful, the Windows 8 case would be a perfect example. I see mini versions of this in the large companies I've worked in as well -- one person gets a hold of the decision makers, doesn't let go, and blows things up because they stop listening to criticism.

Comment Bye Windows RT (Score 1) 158

I guess that's the end of RT and ARM-powered Windows devices.

In my opinion this is a good thing. Despite all the bashing, Microsoft has done a decent job with server operating systems lately, and Windows 7 was pretty good. It's interesting that they have enough money, power and leverage to recover from a move that would probably have sunk a smaller company -- they were also able to absorb three iterations of Surface Pro before getting it right, and to survive the death of Surface RT. Windows 8 was basically a panic reaction to the iPad/mobile/social/Bubble 2.0 era. I'm sure Windows 10 isn't going to give all that up, but it'll be cool to see them not totally write off desktop/laptop computing yet. Let's hope they don't mess up Windows 10 and Office 2016 too badly before launch. One thing about killing RT is that they're basically admitting they can't make money off Windows Store apps the same way Apple does. This could be a good thing -- let them focus on being a good OS developer instead of trying to be another Apple.

Comment Tough problem (Score 4, Insightful) 271

This is a very current problem. The tech press is talking about IBM's announcements/rumors about yet another huge restructuring. Not so long ago, IBM was one of the most stable places in the world to be employed outside of government and academia. There was an implicit contract that employees who contributed and worked within the framework of the company would be taken care of for an entire career. I think that needs to come back for those who desire it, not necessarily for socioeconomic reasons, but for workforce improvement reasons. This move to contractors and outsourcing for everything is just idiotic MBA management consultants looking at a spreadsheet and seeing a way to shift costs. The long term problem is that loyalty works both ways, and employees who are treated as disposable will treat their employers the same way.

I know that large organizations generate forests of dead wood as well, and that there comes a time when some of it needs to be cleared out. However, an enlightened company in my mind would be better served retraining that dead-wood worker for something else. You get someone who knows the organization's culture and politics, and the institutional knowledge of how their previous job was done doesn't walk out the door.

I know I'm not in the majority on /., but I would love the ability to stay with the same employer for an extended time, without the worry of suddenly losing my job and immediately being branded with The Scarlet Letter U (unemployed) that prevents me from being hired ever again. I actively seek out employers who treat their employees well in exchange for long service -- and they're harder and harder to find. The reality is that the industry is rough - the 25 year old single coder/systems guy is preferred over the experienced person who's done the latest rehashed tech fad over and over again. Anyone with a family would be pretty foolish to go the contractor route - it's hard to explain to the family that you can't pay the bills this month because a customer didn't pay you or there's no work to be had. There's a difference between someone like me, who would put in extra effort in exchange for more security, and someone who just wants job security because they're lazy. I've worked with plenty of those types over my career as well -- they set themselves up as the single point of failure in a system or hold all the knowledge on a particular process just because they're scared someone will come and lay them off. You would get less of this if large companies didn't routinely say "we're cutting 30,000 workers" the way HP just did.

The problem for me with contracting isn't the constant learning - I like that. It's the bouncing around, never knowing where you'll be in 6 months, and never getting to finish anything you start.

In a perfect world, my solution would be twofold:
- Admit that there is going to be huge structural unemployment in the future, and enact European style unemployment insurance and worker protections.
- Take the design/engineering aspects of IT or SW development, draw a clear line between the engineering and the tech tasks, and merge it into the licensed professional engineer track. A professional organization would get a lot more support than the unions that techies irrationally fear. In addition, having a clear career ladder starting out as an entry level tech, spending the time necessary to make mistakes, then graduating to a status that requires you to be responsible for what you build/design is a good thing.

Comment Good idea...outside of the public eye (Score 5, Insightful) 141

One of the things that I always thought about Google Glass was this -- it has a billion good uses for work, but is stupid and creepy when you start walking around in public with it. It's creepy in more than one way - there's the "everyone thinks you're a stalker" thing, the weird head gestures you need to make to control it, the talking to yourself, and also the "Google now knows exactly what my eyes are tracking in any given image" kind of creepy. I'm not a millennial, so I probably sound like an old coot, but Google already knows enough about us - phones, search, Gmail, etc.

Now, that all goes out the window when you're talking about work use. With all these cloud data centers hosting thousands of racks of servers, maintenance techs would be able to get real time info. Warehouses would be able to show human forklift drivers where stuff is. Aircraft and car mechanics would be able to get manuals without having to print/read paper job cards. Stuff like that is very useful - walking around with them in public is a different story.

Maybe Google is realizing this and tailoring future devices for certain applications.

Comment Hacked? Uh huh, sure... (Score 5, Interesting) 128

The PFC appointed as Social Media Officer probably chose a weak password. Seriously, whenever I see a news article about a social media account being "hacked," I really wish journalists would understand these are just password-protected web services!

Celebrities' naked pictures and Twitter feeds get hacked because they have simple passwords, not because some genius hacker spends months looking for an exploit on their personal phone and the opportunity to deploy it. And "security question" based password resets offer no protection when a celebrity chooses answers that anyone can find in 100 gossip rags.

Comment Why aren't these networks air gapped? (Score 3, Interesting) 34

SCADA and the like are the worst things to have available on an accessible network. Vendors never update their software, everything's insecure by default, etc.

I've worked in environments like this, and some of the equipment is just not possible to secure without leaving it on its own network. It makes maintenance a nightmare -- sneakernetting patches, software updates, AV signatures, etc. I know an air gap isn't a guarantee of security, but it at least prevents dumb things like drive by downloads on someone's computer affecting production equipment.

Working with the vendors of some of this stuff is equally bad...most of them deny a problem exists. And even if they acknowledge a problem, they won't lift a finger to fix it, because they just have to say it's "secure if installed per our instructions." I've seen lots of software for control systems, etc. with 15- or 20-year-old software libraries gluing everything together. (Using the 15-year-old version now, I mean.) The vendor knows they're one of a handful of firms providing stuff like this, and they know that companies don't care about information security anyway. (One example of this from outside the manufacturing industry: I was integrating a very specific peripheral for a customer, and the vendor absolutely refused to digitally sign the Windows drivers, rendering them nearly impossible to install on 64-bit Windows.) A lot of people might say "that's what you get with closed source," but open source libraries and other code have their problems as well.

Comment Extreme example here, but... (Score 4, Interesting) 189

Not all legacy stuff is bad. Not all legacy stuff should be kept around to the point where you can't find people to run it, however.

I've had experience working in die-hard IBM mainframe shops as well as places that used the HP MPE operating system on the HP 3000 minicomputer. In the 3000 case, the customer was relying on a service provider whose application was way, way out of date but still worked. All the IBM places I've ever worked have been slowly "modernizing" their application stack, but in most cases the core transaction processing has remained on the mainframe because that was the best solution. It's extremely rare these days to see an end-user-facing green screen application, but they do exist as well. (Yes, I work in "boring" old-school industry sectors; very few web-framework-du-jour hipsters here, but we're also not old farts.)

The problem I've seen is that vendors love the fact that customers are locked in and will do nothing to encourage them to get off. Most ancient mainframe code can run virtually unmodified on newer hardware, and that backwards compatibility is a big selling point. It allows IBM to go in, swap out your entire hardware platform at $x million, and keep billing you by the MIPS without changing any code.

But...the reverse problem is that "mainframe migration" projects often end up becoming case studies of how Big Consulting Company X was paid hundreds of millions to not deliver a working system. The DWP's "Universal Credit" project, which from what I've read has Accenture, IBM or Oracle written all over it, is one example. These kinds of projects usually try to port all the business logic and transaction processing to some horrible-to-maintain J2EE monstrosity backed by an Oracle database. They usually fail because (a) no one correctly estimates the work required to pull all that business logic out of 30+ years of cruft, and (b) the consulting companies replace their star team (which travels with the sales force) with new grads in India (who do the actual work). I've seen this cycle over and over again, and am still amazed that CIOs aren't wary of consultants.

Comment Interesting parallels in IT as well (Score 1) 242

Coming from the IT/systems side of the fence, this isn't just limited to programming languages. There are tons of little niches in industries and technologies. The key to not letting a niche define you is to stay flexible. For example, I do systems work for airline industry customers. Every niche, legacy, arcane, backward standard you can think of is there. Any one of these niches can be followed so far down into the expert level that you can build a career out of it. But these things can change, and if you get too mired in the details and never pick your head up above the water, you can get lost. You wouldn't believe how complex something like tracking lost luggage or managing passenger data is once you peel off the covers...and a lot of that complexity is because of the massive layers of legacy stuff that have been built up over the years.

My approach to my career has worked well so far: (1) be willing to learn a lot about a particular subject so you can do good work on it, (2) keep sharp by slightly changing the things you work on every once in a while, and (3) keep your eyes/ears open so you can figure out which trends to hop on just in case the current job dries up. It's difficult to do this, but not so bad if you really like learning all the time. If I ever get sick of working in my little corner of the world, I'm pretty confident that I would be able to pick up a job in another little corner pretty easily.

Niches are everywhere, and often that translates into knowing about how an industry works at the insider level. Treating exposure like this as a learning opportunity is a good way to keep yourself marketable, and not just in your specialty field. Not being able to do this is a good way to brand yourself as a permanent generalist who doesn't have the desire/ability to dig in and find answers. Not trying to sound like a jerk, but there is a big difference between someone who can really dig into a problem and someone who just does what's in the manual.

Comment Open or closed, same problems (Score 1) 255

From the perspective of most IT customers, bugs are bugs regardless of closed or open source. They still rely on other people to find them, patch them and release changes.

Companies that rely on open source libraries may or may not have the ability or spare resources to go digging through the code of a library, finding a security issue, writing a patch for it, recompiling the library, then using that patched copy in production. Companies in the 'service provider' realm may be able to do this, simply because they are staffed appropriately and have a greater IT focus. I do IT work for airline customers. Airlines want as little to do with IT as possible, even though it's a key part of their business...it's not directly related to the surprisingly low-margin business of moving people and planes. I would never advise a customer to roll their own Linux distribution, for example, even if it was based on a commercial one. There's just no appetite for keeping things maintained in a business that doesn't live and breathe technology.

The problem is that, increasingly, even closed source vendors are relying on open source libraries to provide large blocks of their application's functionality. A company that doesn't write operating systems generally shouldn't try rewriting these very important pieces, of course, but the closed source companies building applications on top of open source libraries need to stay on top of these issues and ideally contribute their patches back.

Whether closed or open source, companies need to be able to respond quickly to security problems, and those problems may end up getting traced back to something like OpenSSL, the Apache stack, etc. Open source has the advantage of "more eyes" looking at the code for vulnerabilities, and less commercial pressure. Closed source companies have the opportunity to provide (usually at a cost) the expertise and support necessary to find and fix a customer problem. I've had both awful and good experiences with both trying to get bugs resolved. If you pay for it, and the closed source vendor has good support, they will move heaven and earth to fix your problem. For non-technology companies, closed source or support-funded open source companies like Red Hat give internal IT teams a good boundary between them and "the vendor," as well as someone to call when they have done their homework and find they can't fix something. For the Googles, Facebooks and maybe some academic institutions, the internal IT department can be staffed with kernel hackers and the like to maintain their own highly-optimized implementations. Techies tend to forget that users and companies have very little desire to mess around with technology -- they just want to use it to get their work done.

Comment Doesn't take into account real world parenting (Score 1) 323

I would say I'm pretty much a technocrat, in that I would take hard data over what feels correct or what has always been done any day. If the data show beyond any doubt that working with children in the manner the article suggests produces better results than thousands of years of corporal punishment, then I would follow the study regardless of what anyone else did.

The problem is that when you're working with people, especially _all_ the people, studies only get you so far. Average IQ is 100 -- so lots of parents are below that. Some parents are poor, or work 3 jobs, or don't give a crap about their children. Whenever I see bad behavior, I have to remember to reserve judgement because of these facts. Some parents lack the ability to reason with their children -- and no parent can reason with a preschooler sometimes! I have 2 little kids and really don't want to screw them up too badly. I'd like to think that treating them like human beings who need training works better than "My dad beat me up all the time, and look how well I turned out!" It must be a pretty lousy job being a social worker for a state child welfare agency and seeing children from the entire cross section of the public as opposed to what you are exposed to regularly.

It seems to me that the study boils down to a consequence of the old adage "Children learn what they live." If your household is a nice tranquil place with two academic parents who take the time to raise their kids, the kids will turn out better than those from a household ripped from an episode of Cops. Now, there's some scientific data behind this, showing that children can model the behavior they're exposed to.

Comment Re:How is this an AP? (Score 1) 208

One of the state universities near me is offering a "pre-intro" CS course (CSE 110) that focuses on the absolute basics before stuffing students into a programming course. It seems to me that this is a good way to scare away people who don't actually want to do CS, and to fill in gaps in knowledge that today's students would have. It's interesting that this is different from the high-level survey course for non-majors, and it's only a "suggested prerequisite" for the more programming- and logic-heavy traditional Computer Science I, II and III.

To me, that seems like a good idea. Typical students who think CS is a good fit because they've messed around with computers are different from those of previous generations. Most will not have the low-level programming, algorithms and other experience that people had to be at least familiar with back before the app revolution. See my other post in this article -- writing a Minecraft mod or cooking up a web application in ReallyCoolFrameworkOnRails doesn't give you the same low-level understanding of how a computer actually does all the magic it does.
