Comment Re:Ask yourself (Score 1) 141

This times a lot. I'm not saying it's an ideal practice that this as-yet-unnamed vendor is following, but I also don't view it as the end of the world, particularly if no ultra-sensitive data (credit card numbers, SSNs, etc.) is stored on the company's servers. In my eyes (admittedly not knowing all of the details), the biggest problem here may be that the vendor is storing passwords in plain text, which I can't quite fathom a reason for. At a bare minimum, they should be encrypted (which would not preclude the company from retrieving the clear-text equivalent), but preferably hashed. As a user, you may not be able to tell the difference between a company that stores passwords in plain text and one that actually e-mails them, but they're at pretty similar levels of (in)security, in my mind (and this is a very good reason for using a different password for every site, as many a Slashdotter has suggested).
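For what it's worth, salted hashing is a few lines in most languages, which is why "I can't fathom a reason for it" rings true. A minimal sketch using Python's standard library (the function names and iteration count here are illustrative, not a recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = None):
    """Derive a salted hash; the site never needs the clear text again."""
    if salt is None:
        salt = os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    """Re-derive from the candidate password and compare."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored)  # constant-time comparison
```

With something like this in place, a breach of the password table yields salts and digests rather than anything the site could e-mail back to you — which is exactly the point.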

There can be a security benefit to a lost-password procedure that doesn't involve e-mailing a password to the user, though. The best ones I've seen e-mail a link back to the company's site containing some sort of token that proves you received the e-mail (at your registered address), then prompt you for the answers to one or more security questions that you configured when you first set up the account, before letting you enter/select a new password.
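The token half of that flow is simple to sketch. This is a toy, assumed implementation (the in-memory store, expiry window, and URL are all made up for illustration; a real site would persist tokens server-side against the account record):

```python
import secrets
import time

# Illustrative in-memory store: token -> (user_id, expiry timestamp)
RESET_TOKENS = {}

def start_reset(user_id: str) -> str:
    """Generate an unguessable token and return the link to e-mail out."""
    token = secrets.token_urlsafe(32)
    RESET_TOKENS[token] = (user_id, time.time() + 3600)  # valid for 1 hour
    return f"https://example.com/reset?token={token}"

def redeem_token(token: str):
    """Clicking the link proves control of the registered address."""
    entry = RESET_TOKENS.pop(token, None)  # single use: pop, don't get
    if entry is None or time.time() > entry[1]:
        return None
    user_id, _expiry = entry
    # At this point the site would prompt for security-question answers,
    # then allow the user to set a new password.
    return user_id
```

Making the token single-use and short-lived is what keeps a forwarded or logged e-mail from becoming a standing back door into the account.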

Security is a fundamentally hard problem, and while there have clearly been many SSL issues as of late, this is just not one of them.

Comment Re:In my experience (Score 1) 118

You are correct, sir. Our experience was that HP did indeed release a BIOS update that was supposed to fix the issue, but it did not. Setting intremap=off alone did not do the trick for us, as was often suggested. Instead, we turned off interrupt remapping, disabled VT-d in the BIOS, and disabled something else related to virtualization as well (I'd know it if I saw it). Obviously we weren't doing virtualization on these systems, but the combination of those three things largely alleviated the problem (or at least enough that we haven't had the need to revisit the issue, with many other things we have going on).
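For anyone hitting the same thing, the kernel-parameter half of that workaround looked roughly like this. This is a from-memory sketch of a RHEL/CentOS-era box (grub 0.97 config syntax); the exact paths and BIOS menu names vary by distro and firmware revision, so treat them as assumptions. It operates on a copy so it's safe to run anywhere:

```shell
# Work on a copy of the grub config (or a stand-in if one doesn't exist):
cp /boot/grub/grub.conf /tmp/grub.conf 2>/dev/null || \
  printf 'kernel /vmlinuz-2.6.32 ro root=/dev/sda1\n' > /tmp/grub.conf

# Append intremap=off to every kernel line:
sed -i 's/^\([[:space:]]*kernel .*\)$/\1 intremap=off/' /tmp/grub.conf
grep 'intremap=off' /tmp/grub.conf

# After a reboot you can confirm it took effect with:
#   grep -o 'intremap=off' /proc/cmdline
# VT-d (and the third virtualization-related setting) had to be disabled
# in the BIOS setup itself; there's no runtime knob for those.
```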

Comment Re:In my experience (Score 1) 118

To be fair, if I remember correctly, the problem was with hardware provided by Intel, and could be worked around by a BIOS update (supposedly), but it would have affected a white box as much as it would a Dell or HP.

There are plenty of arguments for using white boxes or boxes from big brands, but this wasn't one of them.

Comment Re:In my experience (Score 5, Interesting) 118

We were having a "no IRQ handler for vector" problem that was crippling networking on a lot of HP DL360G7 systems we had. We were running CentOS on some of these systems and RHEL on others, though we never reached out to Red Hat ourselves.

Red Hat had a bug open for it (bug 887006, if I recall correctly), and it was interesting to see what their response to paying customers was. They did provide special kernel packages to help fix/troubleshoot the issue, but it still went on for a long time. To make matters worse, even when the bug was visible to me (as a Red Hat customer), lots of it was redacted, to the point where it was difficult to determine key pieces of information. And while I don't have access to my RHN login right now, I don't believe that bug is accessible to anyone outside of Red Hat at this point (which is another problem in itself).

I suppose my point is that even in circumstances where you can hold the vendor responsible, and where they are taking action, there's no guarantee that the problem will be fixed when The Business(TM) wants it to be. And for problems like this, where a large number of people are (or will be) affected, it'll get the attention it needs, paid support or not.

I get paying for support from a CYA perspective, but that's really all it is, IMHO.

Comment Re:35? (Score 1) 376

I realize your experiences are as anecdotal as mine are, but (IMHO), there's nothing easier about management, if you're any good at it.

It takes a completely different set of skills to manage people and projects well. And it's not easy, even if you have the skills. Managing IT *well* (software development and IT operations alike) requires a fair amount of technical knowledge (you don't need to be the expert on everything, but you do need to know your stuff) and the ability to communicate well with those above you and beneath you.

A good coder or sysadmin is hard to find. A good *manager* of those people is even harder to find, and is worth their weight in gold (both to the people who work for them and to the company itself).

Full disclosure: I have a wonderful manager who helps make my job (as an Ops team lead--so I'm still in the trenches but "managing"/"mentoring" those on my team) much easier, and I've seen our best coders rise up to become very effective managers themselves.

Comment Re:top quarter still need to go to college (Score 1) 281

If you think that's what college is to most people who attend, then, with all due respect, you're out of your damn mind. I'd say the vast majority of the people who attended my school when I went were not particularly interested in expanding their minds or truly benefiting from what college was structured to do. They were interested in passing their classes, drinking and partying, and landing a job after graduation (i.e., treating it as a vocational school).

Is there anything wrong with that? I'd say it depends on the cost. And I wholeheartedly agree with you that what you described is what college should be about, but it just isn't. That's the problem.

And, to be fair, this did vary a lot by major, so please don't take it as a blanket assumption about each and every group that attended.

Comment Re:Old Cisco Equipment (Score 1) 241

I use a 3550 at home too, specifically for its layer-3 capabilities. Of course, if you want a gigabit switch that does layer-3, you're talking about $$$$, even on eBay.

Other than that, Cisco gear all the way. It's overpriced, and for the most part, you're going to be limited to 100 megabit, even on eBay, for a reasonable price, but it's rock solid gear.

Comment Re:Alix 2D13 (Score 1) 241

I run a 2D3 myself, and it's rock solid (actually running CentOS/iptables). A tad on the expensive side, particularly considering how relatively low-powered it is by modern standards, but it's x86 compatible with full serial console access.

And it really is solid--I keep all my networking gear at home on a UPS, and it's still far more reliable than any standalone Linksys router was (and uses far less power than its predecessor--a Celeron 366 MHz box that had ~1400 days of uptime before I killed it).

Comment Re:Sounds like my kid (Score 1) 770

While I generally agree with your points, as a 28-year-old who still lives at home, despite a well-paying job, there are some reasons for all this.

First off, a fairly high percentage of kids going to college are just throwing their money away to begin with. How many kids are going to college now who have no business going? How many graduate without being able to think or analyze anything? They graduate with a diploma that means next to nothing, and either they're in tons of debt or mom and dad paid for it all; in any case, there's little to be had from it. The value of a degree has gone down, and the price has skyrocketed. And, more than ever, kids are told, right from their freshman year of high school, that they need to go to college. This topic has been discussed endlessly here, and I don't want to rehash it more than is necessary to prove a point, but it's a big part of the problem that exists today with an entire generation.

Though I'm living at home, I'm more than able to cover my living expenses. I choose to do so, because as much as I want to move out, it would take quite a while to save up for a house between paying rent, utilities, and said student loans. I made some very foolish choices straight out of high school, and I'll be paying for my degree for a number of years, when it has proven entirely unnecessary in my line of work (IT systems engineering/administration). They're my own mistakes, and no one else's, but tons of people keep making these mistakes because of societal expectations and job "requirements" that are hardly requirements. And then they're surprised when they can't find a "real" job. The kids carry a good portion of the blame, but they can't carry all of it.

Want to be a doctor or lawyer? OK, go to college. Want to cure cancer? Go to college. Want to fix the horribly deficient infrastructure throughout the country? Go to college (but don't expect to find a job, since there's no funding for this). Want to party for 4 years and live at home for the rest of your life? Don't bother going to college; you can do that without a diploma. The distinction needs to be accepted by employers, but I doubt it ever will be again.

For the record, since I'm sure the natural inference people will make is that I'm knocking the business majors, the humanities majors, etc., I'm not. Necessarily. I think they're well worth studying, and we're all well served by doing so, but going to college to do it just for the sake of having a degree is rather pointless. Also for the record, I graduated with a "BS" in business, and not once has it proven relevant on the job. Lastly, and again for the record, I am a bit bitter about it. :-)

Comment Re:power consumption (Score 1) 359

It depends on what you're doing. I'm sitting here, using my desktop with two 24" monitors, a Core i5 3570K, 32 GB of RAM, an SSD, and a 7200 RPM platter drive, with a browser, e-mail, and a few VMs for work open, and the Kill-A-Watt tells me the whole shebang is using a whopping 84 watts.

More than a laptop? Sure. Substantially more than a laptop? Not really. Especially if you were to add the screens and peripherals in. And while I'm sure you can find laptops with 32 GB of RAM, I doubt you'll find them as cheaply as I built this setup ($1500, roughly).

You're right, but the difference really isn't that large. And when I'm at my desk, I'll take this setup over the 13" MBA that also lives there any day.

Comment Re:Steve Jobs (Score 5, Interesting) 420

AC seems exactly right to me, based on what I remember of "Apple Confidential." In fact, if memory serves me right, Jobs was trying to get Sculley fired when Sculley was out of town, and Jean-Louis Gassee warned Sculley of the attempted coup.

So when Apple was looking to buy a company for the next-generation Mac OS, Jobs had a very personal motive to get Apple to buy NeXT instead of Be (as Gassee was the president of Be, and in negotiations to sell Be to Apple). That, and he got Apple to buy NeXT at a time when he was considering investing his own (and Larry Ellison's) money to take over Apple. Instead, he got paid to do it, and got the guy who executed the move fired.

Jobs was great at many, many things... but he wasn't exactly a nice guy, or--from everything I've read--the kind of guy you'd want running anything when he was forced out of Apple. I think even Jobs would admit it was probably good for him (and Apple) in the long run.

Comment Re:Ummmm, no (Score 1) 467

There are also cases where upgrades intended to fix a problem make matters far worse. We had a lot of issues rolling out FCoE, and a firmware upgrade meant to address some of them actually made things worse.

Of course, we were sane and didn't blindly apply these updates to all our systems. We tested them on one or two lab boxes first, and once we noticed the upgrade was problematic, we yelled and screamed at the vendor.

Point being, some firmware upgrades are bad and some are good. Blindly applying all of them, or blindly applying none at all, are equally stupid system-administration philosophies. If you're not testing these sorts of upgrades in a lab or testing environment prior to touching your production gear, you're doing it wrong.*

* - Yes, I know not everyone has the luxury of a test/lab environment. But you almost always have critical systems and non-critical ones, and it should be pretty obvious where you'd want to try upgrades first.

Comment Re:It's a culture issue (Score 2) 66

Excellent point, and a practice I've already seen at my current job (tracking service availability instead of server uptime--in fact, since I started, we've tracked nothing but service availability).

That said, this has led us down the path of constantly increasing availability requirements, for things as (relatively) insignificant as an internal company blog. We're currently doing work between two new data centers, and one of the goals is to provide near 100% availability of all systems. It becomes very easy to sell such an idea to the business at little incremental cost (compared to the cost of building out two DCs in the first place), but the actual work involved in making it happen can be tricky at times. Not to mention the real incremental benefit is questionable at best, at least for a lot of the applications in question (IMHO, and given that many systems aren't tied to money-making endeavors).

Sure, it's theoretically possible to have two DCs, and when you want to do patching, you flip to your secondary site, patch your primary, flip back, and patch the secondary. It's a practice I'd certainly expect to see in an environment like NASDAQ. The business likes it, and the technical minutiae are workable (most of the time), but it is a substantial amount of added complexity (and work... and time) for little added benefit, in a lot of cases.

In short, I agree completely with what you said, but it can have the side effect of increasing the "required" availability numbers to the point where it becomes little different than simply looking at uptime (depending upon the environment).
