
Comment Yup. (Score 1) 287

Same conclusion. It's too easy to feel that precarity from the early computing age (not enough storage! not enough cycles! data versions of things are special!) if you were there. I think there's some of that going on here on Slashdot a lot of the time.

People in love with old Unix boxen or supercomputer hardware. People who maintain their own libraries of video, but all that's stored there is mass-market entertainment. And so on. It's like newspaper hoarding.

Storage and computation are now exceedingly cheap. 8-bay eSATA RAID enclosures run a couple hundred bucks, new. 4TB SATA drives run less than that. With six SATA ports on the mainboard and a couple of dual- or quad-port eSATA PCIe cards, you can get into the hundreds of terabytes quickly, for four figures if you don't fill every bay at once. The same goes for processing power: a dual-socket Xeon setup with double-digit core counts per processor again runs a couple thousand dollars.
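If you want to check the arithmetic, here's a back-of-envelope sketch. The port counts and drive sizes are taken from the build described above; treat them as assumptions, not a parts list:

    // Rough capacity of the build sketched above (all figures assumed):
    // six mainboard SATA ports with one 4TB drive each, plus two quad-port
    // eSATA cards, each port driving an 8-bay enclosure of 4TB drives.
    public class StorageMath {
        public static void main(String[] args) {
            int mainboardTb = 6 * 4;        // 24 TB straight off the board
            int esataTb = 2 * 4 * 8 * 4;    // cards * ports * bays * TB = 256 TB
            System.out.println((mainboardTb + esataTb) + " TB total"); // 280 TB
        }
    }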

And data is now cheap and easy. Whatever you want, you can have it as data *already*. Movies? Music? Books? Big social data sets? They're coming out of our ears. All in all, the investment of time and equipment needed to put yourself in a position to rip and store a library of "every movie you've ever rented," and then to actually do it, is much larger than the cost of simply licensing them via streaming. The same goes for music, ebooks, and so on.

There's just no need. Even my desktop is now starting to feel obsolete—for the work computing I do, there's a good chance I'll just go to Amazon cloud services in the next year or two. At that point, an iPad, a wireless keyboard, and a couple apps will probably be all the computing power I need under my own roof. If I have a desktop, it'll just be to connect multiple monitors for screen real estate.

Comment No datacenter. Just a desktop computer (Score 1) 287

with 20 cores, 128GB RAM, 48TB online storage, and gigabit fiber coming in.

Yes, I use all of it, for work. But it's definitely not a "data center." These days, I don't know why anyone would want one—even moderately sized enterprises are increasingly happy to pay someone else to own the data center. Seems nuts to me to try to bring it into your basement.

If you just need the computation and/or the storage, desktops these days run circles around the datacenter hardware from just a few years ago. If you need more than that, it's more cost effective and reliable to buy into someone-or-other's cloud.

Comment Why do this? (Score 4, Interesting) 287

I sort of don't get it. White-box PCs with many cores, dozens of gigabytes of RAM, and multiple gigabit Ethernet ports can be put together for next to nothing these days from a few parts off Amazon.com. If the goal is just to play with powerful hardware, you could assemble one or a few white-box PCs with *many* cores at 4+ GHz, *tons* of RAM, gigabit I/O, and dozens or hundreds of terabytes of online RAID storage for just a few thousand, and plug them straight into the wall. You'd get better computation, and frankly perhaps better I/O performance to boot, depending on the age of the rackware in question.

If you're really doing some crazy hobby experimenting or using massive data storage, you can build it out in nicer, newer ways that use far less (and more readily available) power, run far quieter, generate far less heat, take up a fraction of the space, and don't carry the ugliness or the premium-priced spare parts of the kinds of gear being discussed here. If you need the features, you can easily get VMware and run multiple virtual machines. 100Mbps and gigabit fiber are becoming more common and are easy to saturate with today's commodity hardware. And there is an embarrassment of enterprise-ready operating systems in the FOSS space.

If you really need high reliability/high availability and performance guarantees, I don't get why you wouldn't just provision some service for yourself at Amazon or somewhere else and do what you need to do. Most SaaS and PaaS companies are moving away from trying to maintain their own datacenters because it's not cost effective and it's a PITA—they'd rather leave it to specialists and *really big* data centers.

Why go the opposite direction, even if for some reason you really do have the need for those particular properties?

Comment This. (Score 2) 243

This is a pretty transparent proposal to immediately cap speeds, then approach platforms for extortion money based on user demand.

In short, it's exactly the same thing. The words have changed, but the idea about what to do with the cables is the same.

Comment Re:If there was only one viable choice ... (Score 1) 159

I switched to DuckDuckGo and haven't looked back. They used to be noticeably worse in results quality, but Google has gone a long way downhill. Occasionally I don't find things with DDG and try Google. When I do, I have to wade through pages of totally irrelevant stuff to find that there are no matches, whereas at least DDG tells me straight away that it can only find half a dozen possibly-relevant things. I especially like the way DDG integrates with a number of domain-specific search engines.

Comment Re:What for? (Score 5, Interesting) 183

I maintain the GNUstep / Clang Objective-C stack. Most people who use it now do so in Android applications. A lot of popular apps have a core in Objective-C with the Foundation framework (sometimes they use GNUstep's on Android; more often they'll use one of the proprietary versions that includes code from libFoundation, GNUstep, and Cocotron; but they almost all use clang and the GNUstep Objective-C runtime). Amusingly, there are actually more devices deployed with my Objective-C stack than with Apple's.

The advantage for developers is that their core logic is portable everywhere, while the GUIs can be in Objective-C with UIKit on iOS or Java on Android (or, commonly for games, GLES with a tiny bit of platform-specific setup code). I suspect that one of the big reasons the app situation on Windows Phone sucks is that you can't do this with a Windows port.
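To make the shape of that split concrete, here's a hypothetical sketch of the Android side (the class, library, and method names are invented for illustration, not from any particular app): the Java GUI loads the natively compiled Objective-C core and calls into it through JNI, while the iOS build calls the same core directly from Objective-C.

    // Hypothetical JNI bridge: Java GUI on Android calling a shared core
    // compiled with clang + the GNUstep Objective-C runtime. Names invented.
    public class CoreBridge {
        static {
            // libappcore.so: the portable Objective-C/Foundation core plus thin C glue
            System.loadLibrary("appcore");
        }
        // Implemented in the native library; C glue forwards to the Objective-C logic.
        public static native String lookupTitle(String query);
    }

The point is that the platform-specific layer stays about this thin; everything interesting lives in the portable core.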

It would be great for these people to have an open source Swift that integrated cleanly with open source Objective-C stacks. Let's not forget that that's exactly what Swift is: a higher-level language designed for dealing with Objective-C libraries (not specifically Apple libraries).

Objective-C is a good language for mid-1990s development. Swift looks like a nice language for early 2000s development. Hopefully someone will come up with a good language for late 2010s development soon...

Comment Re:If there was only one viable choice ... (Score 2) 159

It wasn't just about interface. People tend to forget how search engines did an absolutely horrible job of intelligently ranking the sites you wanted to see.

I find it pretty easy to remember - I go to Google today.

The UI was what made me switch both to Google originally and away from it some years later. When I started using Google, and when Google started gaining significant market share, most users were on 56Kb/s or slower modem connections. AltaVista was the market leader, and they'd put so much crap on their front page that it took 30 seconds to load (and then another 20 or so to show the results). Google loaded in 2-3 seconds. AltaVista's search results would have had to be a lot better to be worth the wait. I switched away when Google made the up and down arrow keys in their search box behave differently to every other text field in the system.

Comment Re: Government's a crappy investor (Score 2) 64

My 'precious electronic toys' use about a tenth of the power of the ones I was using a decade ago for the same purpose. Even lighting power consumption has dropped. My fridge, freezer and washing machine are the big electricity consumers in my home - efficiency has improved there, but nowhere near as fast as for gadgets.

Comment Re:Tricky proposition (Score 1) 64

There's a lot more to government than military intelligence gathering and law enforcement (although it would be a good idea for someone to remind most current governments that those are two things, not one). And most government projects end up spending insane budgets; this isn't limited to the US. It amazes me how often government projects to build a database that stores a few million records and serves a few tens to a few thousand queries per second (i.e. the kind of workload you could run with off-the-shelf software on a relatively low-spec server) end up costing millions. Even with someone designing a pretty web-based GUI, people paid to manually enter all of the data from existing paper records, and 10 years of off-site redundancy, I often can't see where the money could have gone. Large companies often manage to do the same sort of thing.
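For scale, here's a minimal sketch of that "few million records" workload using the embedded H2 database as a stand-in for whatever off-the-shelf store you like (the schema and row counts are invented for illustration; assumes the H2 jar on the classpath):

    // Loads a million rows into an embedded H2 database and counts them.
    // Runs in seconds on a laptop; schema and data are invented examples.
    import java.sql.*;

    public class TinyRecords {
        public static void main(String[] args) throws Exception {
            Connection c = DriverManager.getConnection("jdbc:h2:mem:records");
            c.createStatement().execute(
                "CREATE TABLE records(id INT PRIMARY KEY, name VARCHAR(64))");
            PreparedStatement ins =
                c.prepareStatement("INSERT INTO records VALUES (?, ?)");
            for (int i = 0; i < 1_000_000; i++) {
                ins.setInt(1, i);
                ins.setString(2, "record-" + i);
                ins.addBatch();
                if (i % 10_000 == 9_999) ins.executeBatch(); // batch for speed
            }
            ins.executeBatch();
            ResultSet r = c.createStatement()
                           .executeQuery("SELECT COUNT(*) FROM records");
            r.next();
            System.out.println(r.getLong(1) + " rows stored");
        }
    }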

The one thing that the US does well in terms of tech spending is mandate that the big company that wins the project should subcontract a certain percentage to small businesses. A lot of tech startups have got their big breaks from this rule.

Comment Never been a fan of multiplayer. (Score 5, Insightful) 292

Maybe I'm dating myself here, but multiplayer games are still newfangled and weird to me, and I don't know if that will ever change.

When I used to play games, I played to get away from social interaction and enjoy myself in isolation. It was a kind of recuperation. A world of gaming in which you have to face social interaction once again as part of gameplay was unattractive enough to me that I stopped playing games altogether. These days I mainly do crossword puzzles and read e-books for the respite that I used to get from gaming.
