
Comment: Re:I have a serious question.... (Score 2) 66

by Toasterboy (#48313575) Attached to: Court Order: Butterfly Labs Bitcoins To Be Sold

It's built into the design of Bitcoin. The network automatically adjusts the difficulty of mining (in practice, relentlessly upward through the ASIC era) based on the rate at which blocks are mined in each retarget period. Every time a faster mining ASIC comes out, the difficulty shoots up correspondingly. At best you can mine a good percentage of the blocks in the current period with a new machine before the difficulty shoots back up and the new ASIC's performance becomes the new baseline. The new ASICs are still better than the poor schmoes running regular CPUs or GPUs, but it's quite difficult to "get ahead" with mining unless your mining resources are free, such as harvesting CPU cycles from a botnet. There is a finite number of bitcoins, and as blocks get found faster, the difficulty of mining the next batch ratchets up very quickly.
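
Roughly, that feedback loop looks like the sketch below (Python, illustrative only; a real node works on a compact target encoding, but the clamped proportional adjustment is the core rule):

    # Sketch of Bitcoin's difficulty retarget (illustrative, not node code).
    TARGET_BLOCK_TIME = 10 * 60    # seconds per block the network aims for
    RETARGET_INTERVAL = 2016       # blocks between difficulty adjustments

    def next_difficulty(current_difficulty, actual_period_seconds):
        expected = TARGET_BLOCK_TIME * RETARGET_INTERVAL  # two weeks, ideally
        ratio = expected / actual_period_seconds
        # The protocol clamps each adjustment to a factor of 4 either way.
        ratio = max(0.25, min(4.0, ratio))
        return current_difficulty * ratio

    # Faster ASICs -> blocks found early -> actual < expected -> difficulty up.
    print(next_difficulty(1_000_000, 7 * 24 * 3600))  # period done in 1 week -> 2x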

I recently thought about getting some ASIC mining hardware, but after running the numbers and factoring in the cost of electricity and the current price of bitcoin, it was dubious whether the devices would even make back their cost, even if they did ship on time and perform as advertised. Even running mining software on CPUs and GPUs I already own is a losing proposition due to the cost of electricity. Not really a good investment, unless you're on unmetered power, such as in a college dorm.
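
The back-of-the-envelope math is simple enough to script. Every number below is hypothetical; plug in your own hashrate, power draw, electricity rate, and revenue figure:

    # Hypothetical mining ROI check -- all figures are made up for illustration.
    device_cost = 2000.0        # $ for the ASIC
    hashrate_ths = 1.0          # TH/s
    power_watts = 1000.0        # wall draw
    elec_per_kwh = 0.12         # $ per kWh
    revenue_per_ths_day = 0.50  # $ per TH/s per day at current price/difficulty

    revenue_day = hashrate_ths * revenue_per_ths_day
    power_day = power_watts / 1000.0 * 24 * elec_per_kwh   # $2.88/day here
    profit_day = revenue_day - power_day

    if profit_day <= 0:
        print("loses money on electricity alone")
    else:
        # Naive payback; rising difficulty stretches this out further.
        print(f"payback in {device_cost / profit_day:.0f} days, before difficulty growth")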

The way that the mining difficulty cranks up as blocks are returned, the developers of any new fast ASIC hardware will reap the greatest benefit of the faster hardware during development, and by the time you get it in your hands the bitcoin ecosystem will have already cranked up the difficulty. It will still be faster than older hardware, but since the difficulty went up too, it's likely to produce much less than you initially expected.

Comment: CS is not programming (Score 2) 546

by Toasterboy (#47819547) Attached to: Does Learning To Code Outweigh a Degree In Computer Science?

Computer Science is largely very specific applied math and theory. It includes algorithms, algorithm efficiency, a bunch of math, data structures from a theoretical design standpoint, and computer architecture. It tends to be very academic.

University programs vary widely in what they focus on, but generally Comp Sci is about the math and theory, and programming is something you do on the side to get the assignments done and illustrate the theory you're learning. Computer Engineering and Software Engineering programs tend to be more hands-on, focused more on doing than on theory.

Programming, as desired by business, is NOT computer science. Business wants the most simplistic designs (e.g., always use a linked list instead of a more appropriate data structure), and above all, they want you to code whatever it is FAST FAST FAST so you can SHIP SHIP SHIP. Generally, most businesses are not software businesses; they don't value developers or software beyond getting the minimum quick-and-dirty solution out of them as fast and as cheap as possible. Also, most businesses are not doing anything remotely resembling state of the art, and they value the ability to hire a newbie to replace you.
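
To make the "simplistic vs. appropriate" distinction concrete, here's a toy Python contrast over hypothetical user records. The linear scan ships in two minutes; the index is what the CS grad itches to write:

    # Quick-and-dirty: linear scan, O(n) per lookup, done in two minutes.
    def find_user(users, name):
        for user in users:
            if user["name"] == name:
                return user
        return None

    # "Appropriate": build a dict index once, O(1) per lookup thereafter.
    def build_index(users):
        return {user["name"]: user for user in users}

    users = [{"name": "alice"}, {"name": "bob"}]
    index = build_index(users)
    assert find_user(users, "bob") == index.get("bob")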

CS grads have it rough. They know too much theory to be satisfied with basic programmer jobs, but they don't know enough about efficiently slapping out code day in and day out to have an easy time in a basic programmer job. The degree can get you in the door though. A lot of places filter out folks with no degree.

Not that there aren't some grads who still can't code their way out of a wet paper bag.

There's all sorts of stuff about programming that you will never learn in a CS program, such as when to select a design based on implementation risk and ease of maintenance rather than algorithm efficiency. It sucks, but the people who pay you to write the software could not give two shits about how well the code is designed, as long as it mostly works and ships on time. For the most part, that CS theory only becomes directly relevant later in your programming career, when you actually have some autonomy to "do it right" versus "do it yesterday", or if you strike out on your own.

Comment: Re:In the US, business doesn't care. (Score 1) 201

If they require a master's or PhD, it's not an entry level position.

They either a) are trying to change the world with new or hard stuff and want a theory guy to guide things or b) don't know what they are doing or c) don't want to mess around with kids straight out of school who haven't figured out the corporate metagames and "git'er done" culture yet.

There's the optimal implementation on paper, given infinite time for implementation, and there's the "We have two weeks, do what you can pull off" implementation that business is usually looking for. Business values programmer time more than academia does. I know my CS degree didn't prep me for that very well.

Actual raw engineering is a bit less wild-wild-west than software... there are legal definitions of what a certified engineer is responsible for; i.e., if people die as a result of your engineering mistakes, it's your fault, not just some edge-case bug. But the same corporate BS is still driving it, so the same stuff applies... HR is still about risk avoidance; it's just that a guy with a master's or PhD had to jump through more hoops to get to the table, and thus the wheat is separated from the chaff, so to speak.

Business doesn't care about getting the best candidate; they care about getting the guy who looks like he's good enough for the money they are willing to spend on him and won't end up as a disaster. Also, some of those job postings may require a master's or PhD so they can legally justify hiring an H-1B after there is no one "qualified" to be found.

Comment: In the US, business doesn't care. (Score 3) 201

Business (HR specifically) doesn't give a shit about your degree. They care about a) that you have the checkbox, b) who you worked for previously and are not lying about it, and c) whether it looks like you aren't a total fuckup who will cost them. It's about risk avoidance.

The actual team you interview with (if it wasn't an HR drone) cares that you look like you know your shit and can carry your weight.

Engineering and especially computer degrees are such a total crapshoot on the skills you get in a candidate that employers don't know how to weigh your degree. Even degrees from badass schools sometimes come with folks who still can't code their way out of a wet paper bag. Besides, most of that senior-level theory stuff in the degree won't help you much in a real-world job until the late stages of your career, and it will piss off your peers who don't have the same background, and definitely piss off management, who barely understand what a linked list is.

The quality of in person versus remote will depend on your learning style, and whether you actually would make use of those in-person office hours anyway.

Comment: Most people don't understand that it's a bad idea. (Score 5, Informative) 405

by Toasterboy (#38457828) Attached to: Is Overclocking Over?

Look, digital electronics are still subject to analog limitations. When you overclock, you squeeze the hysteresis curve, increasing the probability that your chip incorrectly interprets the state of a particular bit as the opposite value, i.e. you get random data corruption. This is why you eventually start crashing randomly the more you overclock.
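
As a rough illustration of why those rare misreads add up (the flip probabilities below are invented; the arithmetic is the point):

    import math

    # Rough model: probability of at least one bit misread in a time window.
    # p_flip values are invented; note how fast "rare" becomes "hourly".
    def corruption_prob(p_flip, bits_per_cycle, clock_hz, seconds):
        n = bits_per_cycle * clock_hz * seconds          # total bit evaluations
        return -math.expm1(n * math.log1p(-p_flip))      # 1 - (1 - p)^n, stably

    print(corruption_prob(1e-25, 1e4, 4e9, 3600))  # stock margins: ~1e-8 per hour
    print(corruption_prob(1e-18, 1e4, 4e9, 3600))  # overclocked:   ~0.13 per hour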

While overclocking a chip that has been conservatively binned simply to reduce manufacturing costs, but is actually stable at higher clock rates, is reasonable, trying to overclock past the design limits is pretty insane if you care at all about data integrity. You also tend to burn out the electronics well before their expected lifetime due to the increased heat stress.

I never overclock.

Comment: Outside virtualization, it won't work well. (Score 1) 239

by Toasterboy (#37585302) Attached to: Hot Multi-OS Switching — Why Isn't It Everywhere?

Most devices barely work in one operating system, let alone having to deal with being initialized and controlled by multiple driver models and switching back and forth between them hot.

They are simply not designed for that scenario. Hence, the hypervisor, and virtualized devices under it.

Comment: $4 for 5 cent parts isn't going to motivate DIYers (Score 1) 413

by Toasterboy (#36271000) Attached to: RadioShack Trying To Return To Its DIY Roots

Radio Shack has been a ripoff for years. Why the hell would anyone who knows enough to DIY pay $4 for a 5-cent part? Sure, it might take a few days for parts to come from Mouser, but honestly, when you're designing a circuit you need a lot of components and you generally plan out what you need in detail; a retail place just isn't going to stock whatever exotic parts your project calls for anyway.

Since there are far more folks who aren't with it enough to DIY, Radio Shack is far better off overcharging the masses for extension cords, sub par computers, and low grade RC cars in the mall. They just want the masses to THINK that smart people shop there.

Comment: I went to WWU, and have a CS degree... (Score 3, Informative) 298

by Toasterboy (#36130180) Attached to: Western Washington Univ. Considers Cutting Computer Science

Western's CS program is one of the ones that grew out of a math base. It's pretty hardcore on the theory, but you're sort of on your own for learning the stuff that business wants. Which is fine... even if the program focused on exactly whatever buzzwords corps want these days, corps don't generally hire CS grads straight out of school. The stuff you learn in the 400-level classes is great for senior developers to know... but you're not going to start out as one. It wasn't till my third job out of college (which I'm still at) that I actually got to touch source code at work. For long-term personal growth, I'm really glad that I had my ass kicked with the theory; I find that the rigorous methods that were drilled into me really help me tackle the hard problems I work on every day (debugging nasty kernel-mode race conditions in code written by others, for example). Besides, if you can handle the proofs and algorithm stuff, you can handle anything else, though you sure as hell won't enjoy writing silly business apps over and over.

You know what the job-finding folks at Western tell you about finding a job once you graduate? They tell you to forget about finding anything remotely in your field. The real difficulty in getting hired after college has less to do with your skills and what you're taught, and more to do with employers' risk aversion... they don't like hiring green kids who don't understand corporate politics yet. You have to persevere in order to get to do what you love.

Computer science is supposed to be hardcore... unfortunately there is a huge variation in what different universities consider to be computer science, let alone what the business world thinks. For some, any old programming is CS; others focus on software engineering methods and hardly touch on theory and math at all; others still consider web page design to be CS. CS is about understanding the extreme limits of what computers and software are capable of and pushing the limits of what's possible... it's not supposed to train you for "IT" (which most businesses consider to be the guys that fix their computers).

You really should not be doing a computer science degree unless you are going to be some kind of developer and you get off on things that require in-depth knowledge of how to design and compare the performance of different algorithms, want to fix bugs no one else can, want to write really hardcore software (speech recognition, computer vision, or 3D rendering, say) at the bleeding edge, and need to be able to prove why your design is better than someone else's. The industry is already full of very experienced, very competent people who don't have CS degrees. In fact, many of them started before such degree programs even existed. They know how to code, but they generally don't have any exposure to the more advanced theory and are therefore neither inspired by it nor inclined to value it. The degree is MUCH more a long-term investment in your career than a credential to get your foot in the door, as you'll eventually get to apply the theory and start doing things that wow. After you've taken your lumps, that is.

Comment: Duh. (Score 1) 1307

by Toasterboy (#35856952) Attached to: Ask Slashdot: Do I Give IT a Login On Our Dept. Server?

You're doing work for the hospital on the system; therefore they need access to it.
Not only that, but there are all sorts of legal requirements around any data on the damn thing. Technically, your calendar, which includes appointment data and scheduling for when you worked on which patient's stuff, probably falls under the domain of medical records....

There's a reason that bureaucracy isn't really compatible with you throwing up a server for whatever: there are legal requirements that make it so every little thing needs to have enterprise-grade BS and management behind it. At least on paper, anyway.

Not only that, but once you've used it for that, who's going to sanitize the data off it when you're done with it? I'm surprised the IT guys didn't show up with crowbars demanding admin accounts, followed shortly by dismantling the thing.

That said, I'm sure it's a sweet iPhone calendar thingy or whatever.

Comment: Syfy is retarded! (Score 1) 742

by Toasterboy (#35316404) Attached to: Does Syfy Really Love Sci-Fi?

I hate wrestling, and I hate Ghost Hunters. It's all they show now. Neither one is science fiction or epic fantasy. Those idiots who took over Syfy don't understand that the people who used to watch SciFi don't watch anymore, because of their stupidity. They have killed off every show that was even moderately interesting to watch.

The whole point is that Scifi was a place where stuff that wasn't mainstream could flourish. The audience doesn't want the bland stuff that's dumbed down for people with a 50 IQ. Now the morons who own it have turned it into another version of TBS.

With Scifi dead, I have no reason to bother keeping cable other than the History Channel, which is also starting to go downhill with stupid reality shows. (Pawn Stars is great, though... it's actually genuine.)

Comment: Political suicide. (Score 1) 339

by Toasterboy (#35152150) Attached to: Is an Internet Kill Switch Feasible In the US?

It's a stupid idea.

Besides, the economic impact alone from breaking the internet in the US for any period of time makes "pushing the kill switch" political suicide anyway.

Also, it's exactly the same power as "we want to shut down the phone system so you can't communicate or call 911 during a revolt, or whenever, you know, some politician feels like it".

Facebook

Facebook To Make Facebook Credits Mandatory For Games 116

Posted by Soulskill
from the hope-that's-ok-with-you dept.
An anonymous reader sends this excerpt from TechCrunch: "Facebook has confirmed that it is indeed making Facebook Credits mandatory for Games, with the rule going into effect on July 1, 2011. Facebook says that Credits will be the exclusive way for users to get their 'real money' into a game, but developers are still allowed to keep their own in-game currencies (FarmBucks, FishPoints, whatever). For example, Zynga can charge you 90 Facebook Credits for 75 CityCash in CityVille. ... The company acknowledges that some developers may not be pleased with the news, explaining this is why it is announcing the news five months in advance, so it can 'have an open conversation with developers.' The rule only applies to Canvas games (games that use Facebook Connect aren't affected), and while it's games only at this point, Facebook says that it eventually would like to see all apps using Facebook Credits. It's a move that's been a long time coming — there has been speculation that Facebook would do this for a year now, spurring plenty of angst in the developer community."

Comment: Re:Already here (Score 1) 305

by Toasterboy (#34970566) Attached to: British ISPs Embracing Two-Tier Internet

Also, what people don't realize is that the internet is already a loose confederation of networks owned by only a few corporations who have peering deals with each other, and they already throttle each other under the table.

There have already been incidents where the Internet experienced massive failures because these companies got into pissing contests with each other and cut off each other's access to gain leverage in negotiations.

Comment: Re:Already here (Score 5, Insightful) 305

by Toasterboy (#34970532) Attached to: British ISPs Embracing Two-Tier Internet

Akamai is very different from a "two tier strategy".

Akamai is all about having local data centers near high-traffic population centers. This has the side effect of relieving congestion on the main internet backbones by essentially doing local caching: you want the data, and it happens to be located on a server closer to you, so it doesn't have to bottleneck through the backbone, and you get better scaling and performance. This strategy is a net positive because the internet as a whole benefits from reduced waste, and hosts can deliver content more efficiently with a better user experience.
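
In miniature, the idea is nothing more than the toy sketch below (Python; the edge names and latencies are made up):

    # Toy CDN: serve from the nearest edge cache, touch the origin on a miss.
    EDGE_LATENCY_MS = {"us-west": 12, "us-east": 70, "eu": 140}  # from this client
    caches = {edge: {} for edge in EDGE_LATENCY_MS}

    def fetch(url, fetch_from_origin):
        edge = min(EDGE_LATENCY_MS, key=EDGE_LATENCY_MS.get)  # nearest edge
        cache = caches[edge]
        if url not in cache:                 # miss: one trip across the backbone
            cache[url] = fetch_from_origin(url)
        return cache[url]                    # later hits never leave the region

    page = fetch("http://example.com/big.mpg", lambda u: f"<contents of {u}>")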

A two-tier internet is something *very* different. That's taking the same pipe and allocating priority to the rich and powerful at the expense of those who don't pay the premium; there is still the same overall amount of bandwidth available, but they want to allocate less of it to you and more of it to companies that pay. How that will actually work is that those who pay more get internet hosting that works, and everyone else gets stuck with a broken, high-latency, congested network. Oh, and the price for everyone else will also go up while the service goes down.
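
The arithmetic of "same pipe, different shares" is brutally simple (the weights here are hypothetical):

    # Same pipe, reallocated: fixed capacity split by who pays (weights invented).
    CAPACITY_MBPS = 1000.0

    def allocate(weights):
        total = sum(weights.values())
        return {name: CAPACITY_MBPS * w / total for name, w in weights.items()}

    print(allocate({"paying_content_giants": 9, "everyone_else": 1}))
    # -> {'paying_content_giants': 900.0, 'everyone_else': 100.0}
    # Total bandwidth is unchanged; your share is what shrinks.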

Everyone else should get really pissed off about this crap, once they figure out how bad the deal is for them.

Let me put it this way: if this sort of thing is allowed, more advanced internet services developed over the next few years will only be possible when they are run by huge corporations with deep pockets, and all other innovators will be shut out in the cold. And that means you'll get to pay more for those services, because there won't be any competition.

"Consistency requires you to be as ignorant today as you were a year ago." -- Bernard Berenson

Working...