
Comment: Re:PS3 (Score 1) 205

by c0nner (#38271526) Attached to: Ask Slashdot: Parallel Cluster In a Box?

That bad boy is just silly... the chassis is expensive, they only support a limited number of cards in it (purchased from them with the special sleds), and it can only connect to a limited number of host machines. Penguin has a more interesting offering with the Relion 4708a. They can stuff 8 GTX 580s in the case along with dual CPUs and a decent amount of memory for $15k or less. You can't even buy the Tesla cards alone for that much. So unless you need the performance advantages of the Teslas, it is a pretty nice solution. It does push a crazy amount of air through the case, though, so it isn't something you would want under your desk.

I have both the c410x (paired with the c6100) and the Penguin solution, and the individual boxes are much easier to deal with and were 1/4 the price. Admittedly, with the Dell solution we had to get Tesla cards, because that is all they support in the chassis.

Comment: Usage will fill the available space (Score 1) 359

by c0nner (#37376014) Attached to: Why We Don't Need Gigabit Networks (Yet)

Looking at the adoption rates of Internet connectivity, or any other technology, you have a Field of Dreams situation: "If you build it, they will come."

While one laptop may not be able to take advantage of a 1Gb connection, two could, and three certainly could, and looking at the rollout of computers, that is the type of household we are looking at, especially as more video content is delivered over the Internet. At some point the cable companies will figure out a better way to deliver their content over an IP-based network, and that Gb connection will not seem like so much when you are trying to watch multiple HD channels in different rooms while also downloading the latest video your aunt Nelly shot of her dogs on her 4K camera. Letting the connectivity companies provide the bare minimum with old and outdated technology will just result in an aging infrastructure that cannot be upgraded.
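
A rough back-of-envelope, with illustrative bitrates (the numbers below are assumptions, not measurements), shows how quickly a multi-stream household adds up:

    # Rough aggregate household demand; all bitrates are illustrative assumptions.
    HD_STREAM_MBPS = 8     # a typical compressed 1080p stream
    UHD_STREAM_MBPS = 25   # a typical compressed 4K stream

    def household_demand_mbps(hd_streams, uhd_streams, other_mbps=0):
        """Aggregate downstream demand in Mbps."""
        return hd_streams * HD_STREAM_MBPS + uhd_streams * UHD_STREAM_MBPS + other_mbps

    # Three rooms watching HD plus one 4K stream and a 50 Mbps download:
    print(household_demand_mbps(3, 1, other_mbps=50))
    # -> 99 Mbps: comfortable on gigabit, hopeless on 768k DSL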

In rural areas where the providers didn't want to spend the money on fiber backhauls, they were stuck with a cap on their maximum supported speeds and tried to ride it out by laying more copper to get by. Unfortunately, they spent that money on a technology that wouldn't scale. Now many areas of the country have broadband available, but that broadband is only 512k or 768k, because the providers can get away with it and don't have the backhauls to provide more than that to all their customers. They are also running over crappy old copper that gets flaky when they try to push 1Mb DSL over it.

Comment: Re:Paranoid much? (Score 1) 290

by c0nner (#37221468) Attached to: IBM Building 120PB Cluster Out of 200,000 Hard Disks

We have many researchers who arrive here and give us similar storage requirements, but when we ask them how they are going to process that captured data, or whether they have any plan for the real output, they end up realizing that spending $5k a month to store, replicate, and protect data that could be regenerated for $1k of compute cycles and a couple days of waiting just doesn't make sense.

Now, admittedly, we do house a bit over 1.2 PB of live data and another ~2 PB of archival and DR data, so it isn't as if we aren't storing a lot. But if we stored as much data as every new lab said they needed, we would need closer to 20 times that.

We see more waste in storage because storage is so cheap compared to the past, even if everyone complains about how expensive it is compared to buying an external hard drive from Best Buy.

Comment: Re:Nobody knows where anything is (Score 1) 266

by c0nner (#37068488) Attached to: How Does GPS Change Us?

Even if they know where things are, too many people can't give good directions.

My dad is a volunteer firefighter in rural Pennsylvania. They would get calls to go to the Smith residence, and when someone asked for clarification it would be something like "take Falls Road until you get to the old Kent farm, then turn left." But the Kents likely haven't lived there in 10 years and the barn is gone, so unless you already know the area to some degree, you will not be able to follow the directions.

Navigating by landmark works only if you have a common understanding of the landmarks. Most people don't know the names of the streets more than one turn removed from the one they live on. When I moved into a new neighborhood, I started by studying the map and trying to memorize the streets, so that I would know where people lived when they introduced themselves and said where they lived. But after telling people the street I lived on, I learned they had no idea where it was. When I said "I live near the mailboxes at the next road past the house with the pink truck," they knew exactly where I was talking about.

Comment: Re:old problem, new medium (Score 3, Interesting) 122

by c0nner (#37055896) Attached to: The Biggest Dangers to Your Fiber

Sadly, too large a portion of utility maps are not accurate.

I had a house where I had to do plumbing repairs right at the first shutoff valve inside the house. I had turned that valve off, but I needed to shut the water off at the curb to replace the internal shutoff. I called the water company and they came out, but they couldn't find the curb shutoff. They looked at their maps and dug many, many holes trying to find it. After two days of looking they were about to give up, but they finally found it right next to the shutoff for the next house over. That was 50 feet from where it was supposed to be, and as a result the pipe run was nowhere close to where the map said it was.

To make it worse, the map was marked as accurate as of just five years before, and no water lines had been pulled up and replaced in that time. So someone claimed to have come out and traced the line where the map said it was, but either never did or had no idea what they were doing.

Comment: Re:Well, there goes my damn corn crop (Score 2) 411

by c0nner (#36513212) Attached to: More Users Are Shunning Facebook

There will be huge foreclosures on farms across the interwebs as the economy falls and owners walk away from Facebook. The short sales will be huge, and the United Bank of Zynga will implode from all the lost money. We will see the CEO go to Congress for a bailout, because how were they to know that people would want privacy and wouldn't be willing to pay for fake goods indefinitely? Think of the children who will never know the joy of sitting around waiting for the chance to harvest their carrots. We have to keep things going, and with this bailout money we will ensure that pixels are available for everyone and get people back to wasting their time farming fake goods.

Really, think of the children and all those unfarmed bits.

Comment: Re:Questions ... (Score 2) 302

by c0nner (#36513088) Attached to: Verizon To Drop Unlimited Data Plans In Two Weeks

I would be almost okay with real pay-as-you-use pricing, except that I have no faith in the wireless industry not to "make mistakes" in calculating the charges.

At least with the electric company there is a box on the side of the building with a little spinning disk and a count-up meter that lets me see how much I am using any time I feel the need. Want to know how crazy that new bandsaw is going to be for your electric bill? No problem: watch the meter for 5 minutes to see how much power you use normally, then run the saw for 5 minutes and see how much faster that little disk spins. And when you get a bill that says you owe $1,000 when you normally owe $150, you can go out and look at the meter to see if there was an error.

The wireless companies have no such transparency. AT&T customers with iPhones hit this whenever they travel: they pay for the international plan and watch the usage meter on the phone to be sure they don't go over their allotment, and even if they shut data off before the meter says they have to, they still come home to huge bills. You can't trust the count on the phone, and the carrier's solution is to make you go to a web page to check your usage, which, if you do it from your phone, adds to that usage.
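
For anyone who hasn't tried the meter-watching trick, the arithmetic is simple. Each revolution of the disk represents a fixed number of watt-hours, the Kh constant printed on the meter faceplate (7.2 is a common residential value, used below as an assumption):

    # Estimate average power draw by timing the meter disk.
    # kh is watt-hours per disk revolution, printed on the meter face.
    def avg_watts(revolutions, seconds, kh=7.2):
        """Average power in watts over the timed interval."""
        return revolutions * kh * 3600 / seconds

    baseline = avg_watts(5, 300)   # 5 revs in 5 minutes  -> 432 W
    with_saw = avg_watts(25, 300)  # 25 revs in 5 minutes -> 2160 W
    print(with_saw - baseline)     # the bandsaw's share: ~1728 W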

Comment: Re:Relevance?! Which client/server can handle that (Score 1) 128

by c0nner (#36512852) Attached to: Brute-Force Password Cracking With GPUs

Using GPUs to crack passwords doesn't work the way you are thinking. There are no network connections to a server; GPUs wouldn't be any faster at that than a normal CPU. What attackers do is get a copy of the hashed passwords in some way, either from a workstation with cached credentials or by gaining enough access to a system to pull the password hashes. Then they run the cracking software against that local file, using the GPUs to do the heavy lifting.
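
The offline loop itself is conceptually simple; the GPU just runs millions of copies of it in parallel. Here is a minimal single-threaded sketch in Python, assuming a hypothetical dump file of unsalted SHA-256 hashes (real password stores vary in format and algorithm):

    import hashlib
    from itertools import product

    # Hypothetical dump format: one "user:hexdigest" per line, unsalted SHA-256.
    hashes = {}
    with open("dumped_hashes.txt") as f:
        for line in f:
            user, digest = line.strip().split(":")
            hashes[digest] = user

    charset = "abcdefghijklmnopqrstuvwxyz0123456789"
    for length in range(1, 9):                 # candidate lengths 1..8
        for combo in product(charset, repeat=length):
            candidate = "".join(combo)
            digest = hashlib.sha256(candidate.encode()).hexdigest()
            if digest in hashes:               # O(1) dict lookup per guess
                print(hashes[digest], candidate)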

Back in the mid '90s, I remember we ran a quick little utility on a Windows NT box connected to the domain, and it gave us a file that could then be fed to a cracking program. After the initial dump it didn't need network access; it just sat there churning away, spitting out passwords as they were found. It took progressively longer the more characters there were in a password.

Fast forward 10 years, and the methods used to hash the passwords have gotten much better, to the point that even a multi-core CPU couldn't make enough headway in a short amount of time; you would need something like a supercomputer or an HPC cluster. Then move forward to GPU computing, where you can throw thousands of little cores that are really good at checking passwords into a single computer, and you go from needing racks of servers to a decent desktop with a few GPU cards doing the same work.

So why is this relevant if you have to get into the system first to get the file? The answer is that you only have to find one weak link in an organization to get hold of the password hashes for the entire system. If you can convince one normal user to run malware that harvests that file and sends it off-site, you can work on cracking it at your leisure.

We recently threw our password file at a single Nvidia Tesla M2050 card, and we recovered all the passwords of 8 characters or fewer in just a few hours, even with complexity requirements. We got pretty much anything under 13 characters in a few days. It prompted us to change the hashing scheme on the passwords stored in LDAP.
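
For scale, here is the raw keyspace arithmetic with an assumed round-number hash rate (not a benchmark of the M2050). It shows why password length dominates, and why practical cracking leans on wordlists, rules, and masks rather than pure exhaustion:

    # Worst-case time to exhaust an all-printable-ASCII keyspace.
    RATE = 1e9  # assumed hashes/sec for a fast unsalted hash on one GPU

    for length in (6, 8, 10, 12):
        seconds = 95 ** length / RATE   # 95 printable ASCII characters
        print(f"{length} chars: {seconds / 86400:,.1f} days")
    # 6 chars fall in minutes, 8 take ~77 days, 12 take millions of years,
    # so recovering "everything under 13" in days implies dictionaries and
    # mangling rules rather than raw brute force.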

Comment: Please... Please... Please! (Score 3, Interesting) 95

by c0nner (#36300728) Attached to: Bringing Old Arcade Machines Into the Internet Age

I have to agree with this. Back in the day, this was the kind of thing I came to /. for, along with the cool techy news from the edge of the mainstream. But now all I am left with are the same stories that broke on CNN earlier in the day. It is CNN, for goodness' sake. They shouldn't be scooping a specialty news site on its own topic.

Anyway, screw news; it is going to be bad anyway and the summary will be wrong, so just go with cool stuff. "Dude hacked his toaster to talk to the coffee pot so that the toast and coffee are ready at the right time every morning"... great. "Apple may or may not be releasing another widget some day, but no one at Apple has announced it and we only have a rumor from someone who thought it would be cool and blogged about it"... bad story idea.

Thanks, and bring on the karma. Last time I got modded down because someone didn't understand the incidence of rabies in opossums.

Comment: Re:Stupid Kernel Question (Score 0) 201

by c0nner (#35612932) Attached to: Red Hat Nears $1 Billion In Revenues, Closing Door On Clones

It really is a difference in goals and philosophy, and you can see it clearly in the contrast between Fedora and RHEL. With Fedora you get the latest and greatest of everything, to the point that you are upgrading your system every year or so. This gives you all the new whizbang features and nifty toys, but you are reinstalling or upgrading a system every year. And every time you upgrade, anything you installed from source instead of from the yum repo has a chance of breaking, and if that application is a business application that makes your organization real money, then you have to spend time testing and verifying that it really does work on the new version.

On the other hand you have RHEL. You get a new version every ~2 years, and it won't have the very newest version of everything; Red Hat chooses versions that are stable and accepted at that point. You install it on your box, install your application, and for the next 7 years you don't have to worry about upgrading, because the only things coming down the update feed will be bug fixes. Unless your application stack was built to depend on one of those bugs, you don't have to worry about the stack breaking; you just keep running the application for a relatively full lifetime. Then around year 6 you pick the current version of RHEL, install it with the newer version of the application and the underlying packages it needs, test it, and you have another long life cycle on the new server. If you don't want to go through the upgrade at that point, you can buy extended support, and Red Hat will continue to provide bug-fix patches for another 3 years, giving you 10 years of life on a single software stack.

So on a desktop computer, it takes all of an hour after an upgrade to check your applications and see if they open, and if one doesn't, you can live without your video editor for a few days until it is fixed. In an enterprise, it may take 6 or 8 months to verify a new system and ensure the workflow gives the same results as expected. At that point, having to go through even 6 months of testing every year is not an acceptable cost in man-hours.

We are on a 3-4 year life cycle for most things here at work, and even that feels like you have only just gotten things settled when you start specing, buying, and testing the next revision. With our compute clusters we don't really upgrade anything once we have built them, because so many software stacks have only been tested against a single version of various packages. We run a 6-12 month overlap between clusters: at the end of year 3 the next cluster arrives and we install and configure it. Over the next few months, people can try their workflows on the new one and see what changes their code needs, given the new versions of pretty much everything. At the 3-6 month mark people are pushed to move their workflows, since we no longer have service contracts on the old hardware and they start losing available nodes. By the 12 month mark everything has to be on the new cluster and the old one is powered off... then just 2 years later we have a new one in house and online and start the migration again. I can't imagine how hard it would be if we couldn't run them in parallel for a period of time and had to do upgrades every year.

Comment: Re:He's right (Score 0) 487

by c0nner (#34760304) Attached to: Rushkoff Proposes We Fork the Internet

The problem with forking the Internet is the question of what content there is going to be on it.

There already is an Internet fork, aka Internet2 (http://www.internet2.edu/network/), where people with content to share decided to create a new network and only let like-minded people play there. They won't let Zynga or Netflix come to I2, because those things are not in line with their goals, and in the same way they are not going to make it available to Joe Sixpack with a cable modem, though admittedly there isn't much content on I2 that Joe would be interested in. But the cost of a build-out like I2 is huge, and I don't see a private network, or even a fork of the Internet, being able to build out to the point that the financial aspect makes sense. You can claim that once the people are there the content will follow, but without the content, the juicy center of the user base that advertisers like won't go there.

And if you think getting things like video chat and video games to work when you are mixing clients is hard, imagine if in the beginning you had to plug into your new ForkNet cable modem to talk to your techy friends and into your Comcast modem to talk to your mom and dad, who don't live close enough to a POP to get on the new Internet fork.
