Comment Re:Nicely done! (Score 3, Informative) 290

From my understanding, the Moonlight Sonata wasn't even one of the pieces performed by the orchestra in the Kickstarter campaign? It isn't listed at any of the links in the article. Musopen compiles music from many sources, so some of it is complete crap, but my impression was that the point of this project was to get better recordings of a select group of pieces.

Comment Oh CCN (Score 1) 153

This stuff has been around for a while, and I have the following problems with it:

1. We already pretty much have CCN. They're called URLs, and companies like Akamai and others do a great job of dynamically pointing you at whatever server you should be talking to using DNS, HTTP redirects, etc. When I type www.slashdot.org, I already don't care what server it lives on. When I type https://www.slashdot.org/ I still don't care what server it's on, and I have at least some indication that the content is from someone authorized to speak on behalf of www.slashdot.org (PKI crap aside). There's a quick sketch of this name-based indirection at the end of this list.

2. The article mentions that this tech would be used to relieve load at the core -- which I'm not sure I buy. The core is well known to be overprovisioned, and a recent-ish survey ( http://techcrunch.com/2011/05/17/netflix-largest-internet-traffic/ ) showed that Netflix and YouTube consume 40% of downstream bytes -- both services are already served by major CDNs, which push at least some of that traffic away from the core.

3. I'm unclear on the value proposition of redesigning every router to be, effectively, an HTTP proxy cache. These devices are well studied, and even if we got a higher cache-hit rate using CCN, I'm not convinced it would help anything. After all, we are doing just fine.

4. I think this approach is, in the end, fundamentally wrong. Regardless of how much magic we use to figure out which machine to get data from, we will always be transferring data from one computer to another (a caching router is effectively a computer). It seems to me that until we no longer need to move packets from some machine A to some other machine B, it makes sense to have host-centric primitives and build our abstractions on top of them. That's what we've been doing, and it's been working pretty well.
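
Here's the sketch I promised in point 1: ask DNS for a name and you get back whatever servers currently happen to serve it, which is exactly the name-to-location indirection CCN claims to add. Minimal Python; the hostnames are just examples, and the addresses you get back will of course vary by resolver and location.

    import socket

    # Resolve a "content name" (a hostname) to whatever servers currently
    # back it. A CDN like Akamai answers this differently depending on who
    # is asking and from where -- the client never hard-codes a server.
    def resolve(name):
        infos = socket.getaddrinfo(name, 443, proto=socket.IPPROTO_TCP)
        return sorted({info[4][0] for info in infos})

    for host in ("www.slashdot.org", "www.akamai.com"):
        print(host, "->", resolve(host))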

Comment Re:Busy databases (Score 1) 464

However, you are repeating what I also find to be a common misconception -- that latency is king. Will a SAN ever beat the latency of a directly attached set of disks? No way, but it shouldn't be that far off. The rule of thumb I've heard (and observed in our environment) is that SAN latency is generally OK when it's under 10ms, which isn't much more than 2x the random access time of a spinning disk. Most of the time I have seen a SAN be significantly slower, it was because of a misconfiguration.

The above having been said, assuming latency is OK, it's hard to beat the spindle count of a SAN. If you need 5000 IOPS on a disk volume, are you going to directly attach dozens of disks to a single server? I doubt it. Directly attached storage is nice, but apps have IOPS requirements, SANs make satisfying those requirements much more straightforward, and no reduction in latency will make your random seeks faster.
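
To put rough numbers on that "dozens of disks" claim, here's the back-of-the-envelope math, assuming the usual rule of thumb of ~170 IOPS per 15k RPM spindle (my assumption, not a vendor spec):

    import math

    # Rough spindle-count math for a random-I/O workload.
    # Assumption: ~170 IOPS per 15k RPM disk (rule of thumb, not a spec).
    required_iops = 5000
    iops_per_spindle = 170

    spindles = math.ceil(required_iops / iops_per_spindle)
    print(f"~{spindles} spindles for {required_iops} IOPS")  # -> ~30

    # And that's before any RAID write penalty (RAID-10 doubles writes,
    # RAID-5 turns one write into four I/Os), so the real count is
    # higher still -- not something you casually direct-attach.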

Comment Do most of these companies already outsource? (Score 1) 403

I've done a bunch of consulting for the K-12 public sector, and I have to say that educational software is some of the most poorly thought-out software ever (from an IT admin's perspective).

Much of this software's installation guide goes something like "go to each computer and put in the disk", making deployment a massive headache. Those that actually come with a networking component usually require Everyone/Full Control permissions on the server share, because the software was coded on the assumption that it should have access to everything. Furthermore, I recently saw one where the application had a "server" component, but that component had to be manually run from a logged-on console session on a server, so most kinds of automation would fail.
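
For contrast, this is roughly what a deployable package lets an admin do instead. A hypothetical sketch only -- the machine names, share path, and MSI are all invented; msiexec /i ... /qn and PsExec are the standard off-the-shelf pieces:

    import subprocess

    # Hypothetical push install: what "go to each computer and put in
    # the disk" should have been. Assumes an MSI package and PsExec on
    # the path; everything named below is made up for illustration.
    machines = ["LAB1-PC01", "LAB1-PC02", "LAB1-PC03"]
    msi = r"\\fileserver\deploy\edu_app.msi"

    for m in machines:
        # msiexec /i <pkg> /qn is the standard silent-install invocation
        subprocess.run(
            ["psexec", rf"\\{m}", "-s", "msiexec", "/i", msi, "/qn"],
            check=True,
        )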

I always assumed that essentially all of this stuff was already outsourced, given the abysmal quality across the board. I guess I'd argue to keep as much in-house as possible so things don't get worse, but that would be pretty hard.

I know this comes off as a rant, and it is, but if you write educational software, please actually think about the people who will have to deploy and run it while you're designing it, whether you code it yourself or send it to India.

Comment Hmmmm... (Score 2) 96

So somebody else correct me if I'm wrong here, but in a dramatic number of their demonstrations it looked like they didn't have much more data than the amount by which their curve was shifted -- only in a few instances did it really change shape dramatically.

Moreover, it appeared as if the amount of the shift in the curve was directly proportional to how much the object was being touched. Part of me wonders whether they were essentially calibrating these "gestures" based on the amount of contact with the device in question. E.g., two fingers will of course produce more contact than one finger, but less than an entire palm. The whole "we can detect how the object is being grasped" thing seemed contrived.
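
If that hunch is right, the "classifier" could be as dumb as thresholding total contact area, i.e. the integrated shift between the untouched curve and the touched one. A toy sketch of that idea (every number in it is invented):

    # Toy version of the "it's really just contact area" hypothesis:
    # integrate the shift between the baseline frequency-response curve
    # and the touched one, then bucket by magnitude. Thresholds invented.
    def classify(baseline, touched):
        shift = sum(abs(b - t) for b, t in zip(baseline, touched))
        if shift < 5.0:
            return "one finger"
        elif shift < 15.0:
            return "two fingers"
        else:
            return "full palm"

    baseline = [1.0] * 200              # untouched sweep (fake data)
    touched = [1.04] * 200              # small uniform shift (fake data)
    print(classify(baseline, touched))  # -> "two fingers"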

Not that it wasn't cool -- there are definitely uses for this -- but it doesn't seem to me like they're getting quite as much data as they seem to be implying.

Comment In fairness (Score 1) 765

While the Executive branch's job is to enforce the laws, it gets a little muddy when it comes to the other branches. Technically, it is Congress' job to police the membership of all the branches: only Congress has the power of impeachment, and it may impeach officers of any branch.

So here's to waiting for Congress to do something about it :)
