Comment Re:Public vs private (Score 4, Interesting) 387

Right, it's classic cognitive dissonance due to imperfect information. You can't see the security guard watching the surveillance camera video, so you assume it's fine. Whereas on the street, you are afforded more of a choice and so you take it. Unfortunately, from an economic perspective, this gives security guards with access to surveillance footage a relative advantage over everyone else when it comes to access to video. But what people don't take into account is that the kind of people who are attracted to the job are also the people who enjoy having that relative advantage. Thus, over time, it's likely that the worst people you'd want having access to video footage of you will have it, and the people you'd most want to have it won't. Video is video, and that's the point this guy is trying to make. Just because you can face your accuser in this case doesn't make what he's doing any WORSE than other surveillance. But people feel it is, because they associate it with a person. Any power that can exploit this advantage will end up in a very strong position thanks to the information imbalance.

Comment Re:Code versioning and deployment? (Score 2) 151

Here's what I did, pre-git:

Create an svn repo, e.g. svn.company.lan/systems
Create the standard structure: ./trunk, ./branches, ./tags
Create a directory for each hostname, e.g. ./trunk/sql1, ./trunk/web1, ./trunk/web2, etc.
Then you can svn import configuration directories on the host into the repo (local path first, then the repo URL), e.g. svn import /etc svn.company.lan/systems/trunk/sql1/etc
Then check it out over the live directory: svn co svn.company.lan/systems/trunk/sql1/etc /etc
From that point forward, if you make changes locally you can svn ci, OR you can make them externally (e.g. in a test environment) and then svn up to update your local conf
I keep the same directory structure, so if I have some tomcat conf like /opt/jira/tomcat/conf, it will be in svn as svn.company.lan/systems/trunk/web1/opt/jira/tomcat/conf

With some scripts, I automated the process and since then it's been really easy to maintain. I understand that cfengine is quite a bit more complex and can do a lot more, like verifying your configuration and that sort of thing, but for a small shop this is good enough to prevent Oh Shit moments with minimal extra work and almost no maintenance.
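For illustration, the kind of wrapper I mean is roughly this shape (the script name, URL scheme, and exact repo layout here are my assumptions, so adjust to your setup):

    #!/bin/sh
    # conf-track (hypothetical name): put one config directory for this host under svn
    set -e
    DIR="$1"                                    # e.g. /etc or /opt/jira/tomcat/conf
    HOST=$(hostname -s)                         # e.g. sql1, web1, ...
    REPO="http://svn.company.lan/systems/trunk/$HOST"

    # create the path in the repo, import the live files, then turn the live
    # directory into a working copy so future edits can be checked in
    svn mkdir --parents -m "add $HOST$DIR" "$REPO$DIR"
    svn import -m "initial import of $DIR on $HOST" "$DIR" "$REPO$DIR"
    svn checkout --force "$REPO$DIR" "$DIR"     # --force leaves the live files in place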

Need to make a change? First, check in any local drift so the repo has the latest version. Then make your changes and restart your daemons... if it works, check in. If it doesn't work, you can keep working on it or svn revert back to the previous version.
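In command form, that loop looks something like this (the sshd config file and restart command are just examples):

    svn ci -m "checkpoint before change" /etc   # commit any drift so the repo is current
    vi /etc/ssh/sshd_config                     # make the change
    service sshd restart                        # restart the affected daemon
    svn ci -m "tighten sshd config" /etc        # it works: check it in
    svn revert -R /etc                          # it doesn't: roll back to the last good version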

With git, you'd have a similar thing, but the repo would be local and you'd have to find a way to back it up, or you could have something like Stash running as a central hub. DO NOT use GitHub to store configs out of habit, because conf files sometimes have private keys and the like in them, and it is extremely likely that GitHub will be targeted by crackers at some point. Svn is really easy to set up on a random utility server or even a workstation...
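A rough git equivalent, assuming a bare repo on an internal utility server (the hostnames and paths below are placeholders):

    # on the utility server
    git init --bare /srv/git/sql1-etc.git

    # on the host being tracked
    cd /etc
    git init
    git add -A && git commit -m "initial import of /etc"
    git remote add origin ssh://utility.company.lan/srv/git/sql1-etc.git
    git push -u origin master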

Comment Re:Your first server, in 2012 (Score 1) 152

Well, assuming you're just doing file stuff, one of the commonly available NAS solutions with a box full of disks and multiple file protocols would work great. If you're tiny, your external webserver will be at dreamhost or something (I might have said GoDaddy here in 2008), because you're not going to have a real network connection; more likely your network will be on par with your server equipment, and it'll be a cable modem or DSL. Personally, and this has been my business niche for a LONG time, so I hate to say this, but if you're under 25 employees you can get by with just a great internet connection and Google or Windows Live or one of the other cloud apps services, if, and this is a big if, you don't need the data to do your work. For instance, if you're a plumbing company and you can just do the work and then account for it later with paper slips or something, cloud apps are probably reliable enough.

The thing is, Dell and HP were never in this niche in a big way anyway. I mean, Windows SBS (Small Business Server) never sold many units, and it was designed to be a single-server OS for a small office. I think what's really going on is, first, that we've been in a recession, so big companies have been buying fewer servers. Second, computers have gotten too powerful for standard business workloads, and if you combine that with the tendency over the past few years toward horizontal scaling in the CPU (i.e. more cores, not faster clock speeds), you have a lot of unused capacity if you stick with the old "one server per service" mantra. So people have been virtualizing, building "private clouds" where you have fewer, more powerful hardware units and split them up in software.

What's crazy is that this has basically been IBM's bread and butter since the late '80s, when the AS/400 and later z/OS came out. For them it's always been about one big hardware unit and cutting it up. Hell, you can go back to the timesharing systems of the '60s and see "cloud" computing.

So, there you have it. Dell and HPaq have probably been selling fewer servers, and IBM is probably selling fewer too because of the recession. On the consumer side, there's obviously Apple to blame for a lot of the desktop erosion, but again, we've been in a recession, everyone who wants a computer probably already has one, and there hasn't been a compelling reason or need for new, faster hardware.

Comment Re:If Google sold servers... (Score 1) 152

Cloud computing is a fad. The reason is BGP. With BGP, nothing but statistical luck determines whether your connection to your data will go through. The biggest companies in the world (and the largest purchasers of IT equipment) will not ever use it. It will always be relegated to the consumer and the small business, who don't have much to lose if they can't access their data.

At some point, some genius will invent a new internet protocol that lets data be stored locally by its owner but still be securely and easily shared with everyone. And it won't depend on border routing arrangements; instead it will be a true autonomous mesh. At that point, the 2010-2012 "cloud" (i.e. outsourced, managed software/storage/hardware as a service) will become the 2016 "cloud" of distributed services and storage. It's just that right now there's a flood of the computer illiterate who "grew up" on Facebook and the web and don't know any other way. The idea of having to deal with files and names and stuff is just too hard. And god forbid having to teach your devices to talk to each other rather than to one parent in the sky. Pft. Get off my lawn.

Comment Re:CRC (Score 3, Informative) 440

For the lazy, here are 3 more tools:
fdupes, duff, and rdfind.

Duff claims it's O(n log n), because it will (a rough shell sketch of the same idea follows the list):

Only compare files if they're of equal size.
Compare the beginning of files before calculating digests.
Only calculate digests if the beginning matches.
Compare digests instead of file contents.
Only compare contents if explicitly asked.
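Here's that pruning strategy as a shell sketch (GNU find/awk/md5sum/uniq assumed; it skips duff's compare-the-first-bytes step, and the real tools handle odd filenames and hard links properly):

    # 1. list "size<TAB>path" for every file
    find . -type f -printf '%s\t%p\n' > files.lst

    # 2. keep only files whose size occurs more than once -- the only possible duplicates
    awk -F'\t' 'NR==FNR { n[$1]++; next } n[$1] > 1 { print $2 }' files.lst files.lst > candidates.lst

    # 3. hash only the candidates and group matching digests
    xargs -d '\n' md5sum < candidates.lst | sort | uniq -w32 --all-repeated=separate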
 

Comment Re:Field dependent requirement (Score 1) 1086

I agree with your post and I want to add some comments. I think applied calculus of the kind used in economics (Lagrange multipliers, etc.) is far more useful to the majority of programmers (or anyone, really) in a business setting than applied calculus of the kind used in physics. Even if the two are almost the same (or exactly the same) mathematically, it's the linking of the math to the real world for practical problem solving that is useful in business. Unfortunately, the need for physics calculations is fairly limited these days, with most of it confined to game programmers. Rather than attempting to describe the physical world, as physics does, economics is more concerned with social problems such as resource allocation and the like. Say what you will about the dismal science, but we ALL buy things, use money, and pretty much live our lives in the pursuit and consumption of resources. Very few of us (although at Slashdot this is less true than in most circles) need to calculate electric fields or magnetism or orbits or oscillators, nor would being able to understand those phenomena have any real impact on our lives.
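To make that concrete, the textbook consumer-choice problem is the sort of thing I mean (the utility function and notation here are just an illustrative example):

\[
\max_{x,y}\; U(x,y) = x^{1/2} y^{1/2}
\quad \text{subject to} \quad p_x x + p_y y = m
\]
\[
\mathcal{L} = x^{1/2} y^{1/2} + \lambda\,(m - p_x x - p_y y),
\qquad
\frac{\partial \mathcal{L}}{\partial x} = \frac{\partial \mathcal{L}}{\partial y} = 0
\;\Rightarrow\;
\frac{U_x}{U_y} = \frac{p_x}{p_y}
\]

The optimum equates the marginal rate of substitution to the price ratio, and the multiplier \(\lambda\) is the marginal utility of one more unit of budget: the same partial derivatives a physics course teaches, pointed at resource allocation instead of fields.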

That being said, I hold up the great physicists of our time and of times past for their often pioneering practical applications of mathematical theory, proving it has worth in the most tangible ways. Their efforts have blazed the trail for other disciplines to use advanced mathematics to describe our environment further. But I think physics (the what) is pretty much done, and we have to start looking at relationships and resource needs to further advance society (the why), and the calculus is just as useful there. Of course, at the end of the day our brains are chemical machines subject to the laws of physics, but I'm assuming it'll be quite a while before taking human behavior all the way back to the physical realm yields as much value as a more economic and systemic analysis.

Comment Re:Into the wild? (Score 1) 76

To clarify what I specifically wrote in my post: Amazon.com (Amazon's own application, where they make the money) has not been down in a long time. The Virginia EC2 outage only affected the excess capacity they resell to AWS customers. I'm not singling out Netflix, and I'm not saying that this is a bad or horrible or un-useful tool. I appreciate all the stuff Netflix is open-sourcing.
