
Comment Re:MoCa (Score 1) 608

D-Link, Netgear, etc. all make multimedia-over-coax (MoCA) converters. They typically let you hit 100 Mbps over existing lines by using frequencies boosted well above the TV range (essentially the same way networking over power lines works).

It really sounds like he doesn't want to pay any money; he just wants to fiddle some connectors together and hook UTP equipment up to coax. I've seen people rig really ad-hoc things over short distances and get by with high amounts of loss and very slow speeds. To do anything decent you really need dedicated signal-processing hardware... you're never going to get the appropriate voltages, line conditioning, noise handling, etc. unless you build your own layer 1 and 2 hardware.

Now you might have luck playing with something that runs on a more forgiving medium: possibly modifying the antenna connections on a pair of older single-antenna (non-MIMO) wireless routers and using the coax between them. But it's probably not going to work without a lot of tweaking, which puts you back into dealing with signaling; besides, the wireless would have worked over the air anyway, and this would cost just as much as replacing the coax or buying a proper media converter.

Idle

Hand Written Clock 86

a3buster writes "This clock does not actually have a man inside, but a flatscreen that plays a 24-hour loop of this video by the artist, who watches his own clock somewhere and painstakingly erases and re-writes each minute. This video was taken at Design Miami during Art Basel Miami Beach 2009."

Comment Re:Of course you should be paid (Score 2, Informative) 735

My company has a pretty simple setup: On Call Primary - $300/wk, On Call Secondary - $150/wk. Flat fee.
1. If you are forced to work more than 24 hours straight, the secondary takes over for 12 hours.
2. If you work more than an additional 40 hours per week, you get equal 'comp. time' (extra paid vacation time).
3. If you were not on call and have to fill in for a last-minute change/emergency, you get the equivalent time in 'comp. time'.
4. During scheduled maintenance (minimum one week's advance warning) the primary and secondary are expected to be logged in and monitoring.
5. You get comp time for working more than 2 hours of maintenance per week while not on call.
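The rules above can be sketched as a small function. This is only my reading of the policy (hour-for-hour accrual, rule 2 meaning hours beyond 40 extra hours); the function name and thresholds are illustrative, not anything official:

```python
def weekly_comp_hours(extra_hours, maintenance_hours, on_call):
    """Comp time earned in one week under an illustrative reading of the policy.

    extra_hours: hours worked beyond the normal schedule
    maintenance_hours: scheduled-maintenance hours worked
    on_call: True if serving as primary or secondary this week
    """
    comp = max(0.0, extra_hours - 40.0)            # rule 2: excess beyond 40 extra hours
    if not on_call:
        comp += max(0.0, maintenance_hours - 2.0)  # rule 5: maintenance beyond 2 h/wk
    return comp

print(weekly_comp_hours(45, 0, True))   # 5.0
print(weekly_comp_hours(0, 5, False))   # 3.0
```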

Comment Re:How much voltage/current? (Score 1) 444

Assuming they weigh about as much as a penny, that's only 340,219.541 lbs (154,320.988 kg) per watt, and every 87.4 days the energy level halves. Luckily it's all beta radiation, so it's pretty harmless unless you eat it, inhale it, etc., and you don't need any shielding unless you're dealing with a sizable amount.
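That halving is just exponential decay; a quick sketch using the 87.4-day figure above (the function name is mine):

```python
HALF_LIFE_DAYS = 87.4  # half-life quoted above for the isotope

def activity_fraction(days):
    """Fraction of the initial beta activity remaining after `days` days."""
    return 0.5 ** (days / HALF_LIFE_DAYS)

print(activity_fraction(87.4))      # 0.5 after one half-life
print(activity_fraction(2 * 87.4))  # 0.25 after two
```

After a year (about 4.2 half-lives) you're down to roughly 5-6% of the original output, which is why the weight-per-watt figure only gets worse.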

Comment Re:How much voltage/current? (Score 1) 444

"Orders of magnitude of charge" is a very broad term here. The sulfur-35 isotope has a half-life of only 87 days. Also, looking at the other work the micro/nano lab does, they seem to generally work in terms of microwatt-seconds. So yes, it might be a great leap in smaller radioactive-decay batteries...

So for those looking to power their future car or home, I hope you like dealing with things in N scale.

Comment Re:Yes, but watch for... (Score 1) 438

IPs are a finite resource, and when there are none left to allocate they will grow dramatically in value. So companies that can segment their networks and sell off chunks of their space will buy and sell them like commodities for a while. Of course, smaller companies will start hopping on the free bandwagon to keep costs down and will demand IPv6. Then, like everything else, larger and larger businesses will cut costs by moving to IPv6 space and selling off their IPv4 networks. Then as retail and consumers start moving, so will everything else. The cost of IPv4 addresses will plummet once almost everyone has replaced their antiquated equipment and carries both the v6 and v4 routing tables. Sure, lots of people will be running v6 on the edge and translating to v4 on the inside for some time... but after a decade or so it'll be cheaper to eliminate all v4 infrastructure.

Comment Re:Yes, but watch for... (Score 1) 438

Even if it didn't violate standards and force a complete rewrite of the IP protocols, MAC addresses would make Internet routing tables impossible to manage. Move a single public router, server, etc. and you would need to push routes out to hundreds if not millions of network providers. Then of course you could do something like prefix the MAC with a network address to make it routable based on network topology... but I dunno, maybe a 64-bit network ID, the 48-bit MAC, and a bit of padding to round it up to a nice manageable 128-bit address. That would be a great idea... you know, something like what IPv6 does.
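The joke is close to how IPv6 really does it: the "padding" is the modified EUI-64 scheme, which splits the 48-bit MAC, inserts 0xFFFE in the middle, and flips the universal/local bit to form the 64-bit interface identifier. A small sketch (the example MAC is made up):

```python
def mac_to_eui64(mac):
    """Derive a modified EUI-64 interface identifier from a 48-bit MAC."""
    octets = [int(b, 16) for b in mac.split(":")]
    octets[0] ^= 0x02                                   # flip the universal/local bit
    eui64 = octets[:3] + [0xFF, 0xFE] + octets[3:]      # insert FFFE in the middle
    return ":".join("%02x%02x" % (eui64[i], eui64[i + 1])
                    for i in range(0, 8, 2))

print(mac_to_eui64("00:1a:2b:3c:4d:5e"))  # 021a:2bff:fe3c:4d5e
```

Prepend a 64-bit network prefix and you have the full 128-bit address, routable by topology exactly as the parent describes.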

Comment Re:Disk replacement? (Score 1) 487

Depends on the software. If your data is distributed in redundant copies scattered across multiple chassis, off-lining a handful of entire chassis for a few hours would just cause a temporary performance decrease. Also, this company is in the backup storage business, which usually means infrequent requests for data. So at any given moment, having 90% of your clients' total backups online is usually considered acceptable in a disaster situation (as long as the chance of the same client's data being in the 10% from failure to failure is small). And since it's all HTTP-based, redirecting and forwarding requests from offline sites to online ones is pretty trivial.
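The availability argument above is easy to quantify. Assuming each backup has r independent copies placed on distinct, independently chosen chassis (my simplifying assumption, not anything from their docs), the chance that every copy of a given backup is in the offline fraction f is about f**r:

```python
def p_all_copies_offline(offline_fraction, copies):
    """Probability that all replicas of one backup are offline at once,
    assuming copies are placed independently on distinct chassis."""
    return offline_fraction ** copies

print(p_all_copies_offline(0.10, 1))  # 0.1   - single copy: 10% chance
print(p_all_copies_offline(0.10, 3))  # 0.001 - three copies: 1 in 1000
```

Which is why 10% of chassis being down is survivable: with even modest replication, almost no individual client loses access to all copies.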

Comment Re:they are missing hardware mgmt (Score 4, Insightful) 487

It's better at what they need it for. Based on the services and software they describe on their site, it looks like they store data in the classic redundant chunks distributed over multiple 'disposable' storage systems. In this situation, most of the added redundancy that vendors put into their products doesn't add much value to their storage application. Thus having racks and racks of basic RAIDs on cheap disks and paying a few on-site monkeys to replace parts is more cost-effective than going with a more stable/tested enterprise storage vendor.

Comment Re:What we don't know (Score 2, Funny) 257

Dude, what do you expect? Humans are still in an alpha release... If you want to know how it works, you're just going to have to read the code. They run pretty crappy because they are mainly a few hacks wrapped around bits and pieces cobbled together from other projects. The betas and the QA lab are still billions of years away. But trust me, the new interface that's coming out is going to be sweet.

Unfortunately, at the next major release they wipe the dev systems to clean out any faulty data. Sorry.

Comment Re:Conspiracy (Score 1) 371

Hell, I remember watching an '80s episode of The Phil Donahue Show where they were talking about Fermilab, and they had a hard time explaining that matter is made up of protons, electrons, and neutrons. News of problems with superconductors was waaay too far over the populace's head to get any attention.

Of course Phil had a few things to say about those quarks being tops and bottoms. And when strange came into the picture, man, it got nasty.

Comment Re:Did anyone else think... (Score 1) 371

Now I just have a picture in my head of the whole thing not working because somebody tripped over the cable connecting it to a standard wall outlet...

No no... someone tripped over a superconducting connection and ruptured the cryogenic lines instead.

Comment Re:anything worth doing (Score 2, Interesting) 371

Detecting the Higgs boson is not a process where you turn on an accelerator, smash some protons, and go "look, there it is."

Basically, they are never going to see a Higgs boson directly; they are going to look at all the stuff that flies out of these collisions, trace back each bit, and try to figure out what its lifecycle was. When you find something that isn't explained by known particles and fits the model of the Higgs boson, you can statistically believe it exists.

If the Higgs does exist, you make runs with the accelerator over and over again (it runs at a peak of about 40 million collisions a second). From this you get a large amount of data, about 2 GBytes a second. The data is more or less filtered for interesting 'events'. These events are then rated on how likely they are to show evidence related to a Higgs boson, based on various models. Then after a long period of time you look at trends, and you can statistically map the masses and energies of particles in an attempt to figure out where the Higgs boson exists.

The LEP (at CERN, before the LHC) and the Tevatron (at Fermilab) have done a lot of work to narrow down which areas should be focused on... but essentially, the more energy you have, the more granularity you're going to have in the resulting data, and thus the more confident they can be about the results.
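Some toy arithmetic on the numbers quoted above (40 million collisions/s, ~2 GB/s of filtered data), plus the usual s/sqrt(b) rule of thumb for how an excess of events turns into statistical confidence. The concrete event counts here are made up for illustration:

```python
import math

COLLISIONS_PER_S = 40e6   # peak collision rate quoted above
BYTES_PER_S = 2e9         # data rate after the first filtering stages

# Average surviving data per collision: only a sliver of each event is kept.
bytes_per_collision = BYTES_PER_S / COLLISIONS_PER_S
print(bytes_per_collision)  # 50.0

def significance(signal_events, background_events):
    """Naive significance of an excess of events over expected background."""
    return signal_events / math.sqrt(background_events)

# e.g. 30 candidate events over an expected background of 900 is only a
# 1-sigma excess - which is why it takes years of runs to claim anything.
print(significance(30, 900))  # 1.0
```

This is why "more energy" and "more collisions" both matter: significance grows only with the square root of the accumulated statistics.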
