Comment Re:rfc1925.11 proves true, yet again (Score 1) 83

You haven't worked with large scale virtualization much, have you?

In all fairness, I am not at full-scale virtualization yet either. My experience is with pods of 15 production servers, each with 64 CPU cores, ~500 GB of RAM, and four 10-gig ports per physical server, half of those for redundancy, with bandwidth utilization controlled to remain below 50%. I would consider the need for more 10-gig ports, or a move to 40-gig ports, if density increased by a factor of 3: which is probable in a few years, as servers will be shipping with 2 to 4 terabytes of RAM and running 200 large VMs per host before too long.
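
To make the sizing concrete, here is a back-of-the-envelope sketch in Python using the numbers above (the 3x case is the hypothetical density increase):

```python
# Back-of-the-envelope pod sizing, using the numbers above.
SERVERS_PER_POD = 15
PORTS_PER_SERVER = 4       # 10-gig ports per physical server
PORT_GBPS = 10
REDUNDANT_FRACTION = 0.5   # half the ports are for redundancy
MAX_UTILIZATION = 0.5      # utilization held below 50%

usable_gbps = SERVERS_PER_POD * PORTS_PER_SERVER * PORT_GBPS * (1 - REDUNDANT_FRACTION)
budget_gbps = usable_gbps * MAX_UTILIZATION
print(f"usable uplink per pod: {usable_gbps:.0f} Gbps")        # 300 Gbps
print(f"bandwidth budget (<50% util): {budget_gbps:.0f} Gbps") # 150 Gbps

# At 3x density, demand outgrows the usable capacity, hence the need
# for more 10-gig ports or a move to 40-gig.
print(f"demand at 3x density: {budget_gbps * 3:.0f} Gbps vs {usable_gbps:.0f} Gbps usable")
```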

It is thus unreasonable to pretend that large-scale virtualization doesn't exist, or that organizations will be able, in the long run, to justify avoiding large-scale virtualization or avoiding a move to a cloud solution that is itself ultimately hosted on large-scale virtualization.

The efficiencies that can be gained from an SDD strategy versus sparse deployment on physical servers are simply too large for management/shareholders to ignore.

However: the network must be capable of delivering 100%.

I am perfectly content to overallocate CPU, memory, storage, and even network port bandwidth at the server edge. But the network, at a fundamental layer, has to be able to deliver 100% of what is there, just as the SAN needs to deliver within an order of magnitude of the latency/IOPS and volume capacity the vendor quoted for it. We will intentionally choose to assign more storage than we actually have, BUT that is an informed choice. The risks simply become unacceptable if the lower-level core resources can't make some absolute promises about what exists, and the controller architecture forces us to make an uninformed choice, or to guess at what our own network can handle under loads created by completely unrelated networks or VLANs outside our control, e.g. another tenant of the datacenter.
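
The "informed choice" point reduces to a simple invariant; a minimal sketch of the idea (the 3x ceiling is an illustrative policy number, not any vendor's):

```python
# Illustrative thin-provisioning check: overallocating storage is fine,
# but only as an informed choice measured against real physical capacity.
def check_oversubscription(allocated_tb: float, physical_tb: float,
                           max_ratio: float = 3.0) -> float:
    ratio = allocated_tb / physical_tb
    if ratio > max_ratio:
        raise RuntimeError(
            f"{ratio:.1f}x oversubscribed exceeds the {max_ratio:.1f}x policy; "
            "the platform can no longer make absolute promises.")
    return ratio

print(f"{check_oversubscription(allocated_tb=120, physical_tb=60):.1f}x allocated")
```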

This is why a central control system for the network is suddenly problematic. The central controller has removed a fundamental capability of an appropriately designed network: to be heavily subscribed, to be fault-isolated within a physical infrastructure (through Layer 2 separation), and to tolerate failures while minimizing their impact.

Comment Re:rfc1925.11 proves true, yet again (Score 1) 83

I hate it when my problems get angry; it usually just exacerbates things.

I hear most problems can be kept reasonably happy by properly acknowledging their existence and discussing potential resolutions.

Problems tend to be more likely to get frustrated when you ignore them, and anger comes mostly when you attribute their accomplishments to other problems.

Comment Re:rfc1925.11 proves true, yet again (Score 2) 83

Your 300 x 10GB ports on 50 Servers is ... not efficient. Additionally, you're not likely saturating your 60GB off a single server,

It's not so hard to get 50 gigabits off a heavily consolidated server under normal conditions; throw some storage-intensive workloads at it, perhaps some MongoDB instances and a whole variety of in-demand odds and ends, and it adds up quickly.

If you ever saturate any of the links on the server, then it's kind of an error: in critical application network design, a core link within your network being saturated for 15 seconds by some internal demand burst that was not appropriately designed for is potentially a "you get fired or placed on the s***** list immediately after the post-mortem" kind of mistake. Leaf-and-spine fabrics that are unsaturatable except at the edge ports are definitely a great strategy for sizing core infrastructure; from there, most internal bandwidth risk can be alleviated by shifting workloads around.

Latency seriously suffers instability at ~60% or higher utilization, so for latency-sensitive applications especially, it would be a major mistake to provision only enough capacity to avoid saturation, when micro-bursts in bandwidth usage are the reality of real-world workloads.
An internal link with peak usage of 40% or higher should be considered in need of relief, and a link at 50% or higher utilization should be considered seriously congested.
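
Those thresholds map directly onto a monitoring rule; a minimal sketch (the thresholds are the ones stated above; the link names and labels are made up):

```python
# Classify internal-link peak utilization per the thresholds above.
def classify_link(peak_utilization: float) -> str:
    """peak_utilization is a fraction, e.g. 0.42 for 42%."""
    if peak_utilization >= 0.60:
        return "LATENCY-UNSTABLE"    # latency instability sets in at ~60%+
    if peak_utilization >= 0.50:
        return "SERIOUSLY-CONGESTED"
    if peak_utilization >= 0.40:
        return "NEEDS-RELIEF"        # candidate for shifting workloads away
    return "OK"

links = {"leaf1-spine1": 0.35, "leaf2-spine1": 0.44, "leaf3-spine2": 0.61}
for name, peak in links.items():
    print(f"{name}: {peak:.0%} -> {classify_link(peak)}")
```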

Comment rfc1925.11 proves true, yet again (Score 1, Interesting) 83

Every old idea will be proposed again with a different name and a different presentation, regardless of whether it works.

Case in point: ATM To the Desktop.

In a modern datacenter, "2.2 terabits" is not impressive. 300 10-gigabit ports (or about 50 servers) is 3 terabits. And there is no reason to believe you can just add more cores and continue to scale the bitrate linearly. Furthermore, how will Fastpass perform during attempted DoS attacks or other stormy conditions dominated by small packets, which are particularly stressful for any centralized controller?
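
For a sense of scale on the small-packet case, a rough estimate assuming minimum-size Ethernet frames (this is napkin math, not a claim about Fastpass internals):

```python
# Rough small-packet stress estimate for a centralized arbiter.
LINK_TBPS = 2.2
WIRE_BITS_PER_MIN_FRAME = (64 + 20) * 8  # 64B frame + preamble/inter-frame gap

pps = LINK_TBPS * 1e12 / WIRE_BITS_PER_MIN_FRAME
print(f"{LINK_TBPS} Tbps of minimum-size frames ~ {pps / 1e9:.1f} billion packets/sec")

# Versus the modest 50-server comparison above:
print(f"300 x 10GigE = {300 * 10 / 1000:.1f} Tbps")
```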

Furthermore, "zero queuing" does not solve any real problem facing datacenter networks. If limited bandwidth is the problem, the solution is to add more bandwidth; shorter queues do not eliminate bandwidth bottlenecks in the network, and you can't schedule your way into using more capacity than a link supports.

Comment Re:This is because.... (Score 1) 140

that the companies are _former_ employers as that the companies are _future_ employers.

This is problematic. When you sign on with a regulatory agency and participate in writing its regulations, there should be a mandatory period of at least 10 years after you leave during which you cannot be employed by anyone in the industry you regulated; and in particular, accepting any reward or promise of potential future employment should be illegal.

Comment Re: They aren't looking for public comments (Score 1) 140

The problem is that the FCC has limited regulatory power unless it reclassifies Internet access as a telecommunications service, which is considered the "nuclear option."

How about instead they reclassify the cable line or wireless data link as the telecommunications service, and say: provide competing IP providers equal access to the cable or wireless data link to customer facilities, or else all services over that link are telco services for you, including Internet, by stating that a telecommunications service always exists for every end-user connection.

So an ISP is not a telecommunications service, BUT the Internet service itself, carried over an exclusively owned link to the customer facility, IS a telecommunications service UP to the protocol layer at which the customer first has a choice of whom to direct packets to.

In other words: conditional classification. Not all internet services necessarily have to be classified the same. Let's start organizing and classifying IP service for regulation based on the characteristics of the service.

Comment Re:Just ran into this (Score 1) 753

However, small mom & pop shops stayed open, using a hand ledger and accepting cash. I was actually in one store buying supplies that was operating by candlelight.

Not surprising.... big box stores can afford to close, and it's likely cheaper for them to plan to do so.

Which is also one of the reasons local governments should make sure that big box stores can't get 100% of the business for essential goods.

There is much to be said for having $20,000 or so in emergency cash tucked away in a safe-deposit box at a bank with 24x7 access to your locker, just in case the SHTF.

Comment Re:KeePass? (Score 1) 114

An attacker would need my LastPass password (which is not, itself, stored in my LastPass vault); my physical YubiKey; and the knowledge to use both in tandem, in order to gain access to my LastPass account.

Yes, because the LastPass website enforces this two-factor scheme.

On the other hand, once it's open on your computer: the entire database is available for RAM-scraping malware to take a peek.

Or to decrypt using only the master password, since, as I understand it, it's just the LastPass website that requires the second factor before allowing your software to download the DB.
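
That distinction, the website gating the download versus the cryptography protecting the blob, is easy to see in a minimal sketch; assume a PBKDF2-plus-symmetric-cipher design, which is roughly how such vaults work (names and parameters here are illustrative, not LastPass's actual scheme):

```python
# Illustrative sketch: the second factor gates the *download* of the vault,
# not its decryption. Once the blob is local, the master password alone
# derives the key.
import base64
import hashlib
from cryptography.fernet import Fernet

def vault_key(master_password: str, email: str, iterations: int = 100_000) -> bytes:
    raw = hashlib.pbkdf2_hmac("sha256", master_password.encode(),
                              email.encode(), iterations)
    return base64.urlsafe_b64encode(raw)  # 32-byte key, Fernet-encoded

# The server side would only release `blob` after checking password + YubiKey...
blob = Fernet(vault_key("hunter2", "user@example.com")).encrypt(b"vault contents")

# ...but decrypting it afterwards needs only the master password; no second
# factor enters the key derivation at all.
print(Fernet(vault_key("hunter2", "user@example.com")).decrypt(blob))
```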

Comment Re:because drinking water is so pristine (Score 1) 242

How do you get that foul chlorine dioxide back out of your water?

You leave it in there all the way to the end user, so that the treated water can help disinfect the entire system.

If the user so desires, they can remove it through simple aeration. What the end user won't be able to easily remove (without filtering) is the actual chlorine you need to treat the water with or the fluoride that you add.

Comment Re:Price floors are subsidies (Score 1) 309

Actually I'd argue it is the government's job to protect cultural value; that's precisely why they fund libraries and museums.

No... libraries and museums are common goods which the public wants and from which everyone benefits equally, and preservation of cultural history is one of those benefits. It is the job of the government to support such common goods, as long as there is majority support for the good.

Without government support, there would be a free-rider problem: people who paid nothing would, in the long run, get just as much benefit from the existence of the good as those who did pay for the construction of the library or museum.

It's the central purpose of government to provide a structure for funding such goods by requiring a majority to agree; then everyone has to pay their fair share (relative to the benefit they and their descendants will derive from that good over their lifetimes): no free riders, no tragedy of the commons.

Comment Re:Price floors are subsidies (Score 1) 309

And when they don't have it they order it directly from the publisher. Hence they definitely have a useful cultural role.

So you're saying that since there's a portion of the population interested in buying 5- to 6-year-old books, the folks offering those need to be protected against competition on sales of the newest bestsellers?

That's ridiculously anti-consumer.

If the population of aficionados of older books is so small that it cannot support these businesses, or if a visit to the local library meets their needs so that they don't need to buy old books, then these businesses by definition no longer provide sufficient value to society.

Public policy should not be based on nostalgia. It's not the government's job to try to protect "cultural value" either.

Comment Re: Not France vs US (Score 1) 309

that supermarkets would only stock bestsellers and that smaller shops were necessary to ensure the availability of more specialized, less popular books.

Well... if they only stock bestsellers, then they've created a market opportunity for smaller shops to carry the non-bestsellers at higher prices. And how do you know whether a book will be a bestseller before it sells, anyway? :)
