Comment Accepting money from a criminal (Score 2) 68

to do something that furthers his criminal enterprises has a name. It's called "conspiracy".

So if you ever try your hand at hunting down criminals like this, be aware of the potential danger of tying yourself to the criminal's legal fate. If you've done business with him that's the least bit shady, and he's overseas beyond the reach of local authorities, things could get quite ugly for you.

Comment Re:Were known management tools used? (Score 1) 118

Well, we're talking far too abstractly here to be very meaningful. I'm not saying an RDBMS couldn't be *part* of the picture. I'm saying that a system architecture that punted all the persistence and data consistency problems to a distributed RDBMS is a non-starter for something on this scale. People don't build systems like this one that way, and given how mature that technology is, the fact that they don't is a good reason to be skeptical of the idea that that approach would be a panacea.

Comment Re:Were known management tools used? (Score 2) 118

No, it doesn't have the "pop" of NoSQL,

More to the point, it doesn't have the scalability across distributed systems. Show me one application approaching this scale, just *one*, that relies on RDBMS clusters and two-phase commit exclusively to support this kind of transaction volume. Don't get me wrong; I'm an old-school RDBMS guy myself; I know a lot about relational database systems, including their limitations. I'd look to the way outfits like Amazon, Google, or LinkedIn design their infrastructure rather than Oracle on big iron. This is well outside the sweet spot for that approach.

RDBMS servers are made to just do the job quietly and reliably, with very strict ACID compliance...

This is a very simple-minded approach to architecture, one that's admittedly very serviceable in a wide array of applications. But useful as ACID is as a set of assumptions you can rely upon, it's not the only way to create a reliable, serviceable system. In fact there are situations where it's provable that ACID falls short. Google "CAP Theorem" and "eventual consistency". It's fascinating stuff.
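To make "eventual consistency" concrete, here is a toy sketch of the idea: replicas accept writes independently, then reconcile with a last-writer-wins merge so they converge without any ACID-style coordinated commit. Everything here (the replica objects, the merge rule) is a made-up illustration, not a real distributed-database API.

```javascript
// Each replica keeps its own map of key -> { value, ts }.
function makeReplica() {
  return { data: {} };
}

// A write lands on whichever replica the client happens to reach.
function write(replica, key, value, ts) {
  const cur = replica.data[key];
  if (!cur || ts > cur.ts) replica.data[key] = { value, ts };
}

// Anti-entropy sync: for every key, both replicas adopt the entry
// with the latest timestamp (last-writer-wins conflict resolution).
function sync(a, b) {
  const keys = new Set([...Object.keys(a.data), ...Object.keys(b.data)]);
  for (const key of keys) {
    const ea = a.data[key], eb = b.data[key];
    const winner = !ea ? eb : !eb ? ea : (eb.ts > ea.ts ? eb : ea);
    a.data[key] = winner;
    b.data[key] = winner;
  }
}

const r1 = makeReplica(), r2 = makeReplica();
write(r1, "cart", "book", 1);   // client writes to replica 1
write(r2, "cart", "laptop", 2); // a later write lands on replica 2
// Before sync, the replicas disagree -- no isolation guarantee here...
sync(r1, r2);
// ...afterwards both converge on the later write.
```

The CAP-theorem trade is visible in miniature: between the write and the sync, readers of different replicas can see different answers, and that window of inconsistency is the price of staying available without coordination.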

Comment Re:Were known management tools used? (Score 4, Informative) 118

It does not sound to me as though known management tools were used. Did they sit down with the government personnel in charge, and present their approach, and what the site would look like (menus, flow, etc) when finished? Were there testable milestones, and a final presentation of working software? It sure doesn't sound like it.

They might well have done all these things and still failed to catch the problems before the site's launch.

Performance, like security (ack! scary!), is a non-functional requirement -- that is to say it's not the kind of requirement where you can sit down with a checklist and say, 'yep, it works,' or 'no, it doesn't.' You have to develop a more sophisticated test.

Load testing is a step in the right direction, but you also have to look at system architecture. Remember the days before people figured out that you had to load web ads asynchronously, after the page content was loaded? Sometimes the page load would be slow, not because the page's server was loaded, or because of the user's browser or internet connection were slow. Often it would be the ad server that was overwhelmed, which if you think about it is bound to be more common than the content server being overwhelmed. You could functional test and even load test the heck out of a page with synchronous ad loads, but until it went into production chances are you wouldn't catch the fatal performance flaw. That kind of problem is architectural; some of the data being delivered is coming from servers outside your control.

Ordinary tests are about ensuring reproducible results, but when the architecture leaves you vulnerable to things happening on servers and networks outside your control your problems *aren't reliably reproducible*. You have to design around *possibilities*.
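The synchronous-vs-asynchronous ad-load problem above can be sketched as a toy model. The `fetchContent` and `fetchAd` functions here are made-up stand-ins (with the ad server simulated as slow); the point is only the ordering difference between the two page styles.

```javascript
// The content server is healthy; the ad server is overwhelmed.
function fetchContent() { return Promise.resolve("article"); }
function fetchAd() {
  return new Promise((res) => setTimeout(() => res("ad"), 200));
}

// Synchronous-style page: nothing renders until the ad arrives,
// so the slow ad server stalls the whole page.
async function renderSync() {
  const ad = await fetchAd();       // blocks on the ad server
  const content = await fetchContent();
  return [content, ad];
}

// Asynchronous-style page: render content immediately and let the
// ad slot fill in whenever the ad server gets around to answering.
function renderAsync(onContent, onAd) {
  fetchContent().then(onContent);   // user sees the article right away
  fetchAd().then(onAd);             // ad pops in later (or never)
}
```

Under load testing, both versions "work"; only the asynchronous one keeps working for the user when the third-party server degrades, which is exactly the architectural distinction a functional checklist misses.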

Some of the problems with Healthcare.gov were of this nature, although with not so simple a solution as "use window.onload()." The site is supposed to orchestrate a complex, *distributed* process *synchronously*. You have to go out to Homeland Security's database to confirm citizenship, then to the IRS databases to confirm claims about income, then get quotes from the private insurers that meet the customer's needs. There is, in my opinion, no way to be 100% sure, or even 80% sure, that a system like that will work under real-world load, unless you present it with real-world load.

Were I architecting such a site, I'd plan to do a lot of that work in batch; that is, I'd build the healthcare application offline in the user's browser, with internal consistency checks of course. Then I'd send the user's application through a batch verification system, emailing him when a result was ready. This is a clunky and old-fashioned approach, but it wouldn't force the user to chain himself to his browser, and it would have more predictable performance. Predictability is a vastly under-rated quality in a system. A system which is fast most of the time may not be as desirable as one which provides the answer consistently.

Comment Re:Most don't understand the legal argument (Score 1) 243

Let me start with a disclaimer: I'm not an expert in Bitcoin, but this is my understanding.

One of the important differences between Bitcoins and regular currency is that every bitcoin has a unique serial number. This is not true for dollars; oh, a particular banknote may have a serial number, but the dollars in your bank account have no individual identity. The fact that every bitcoin has a serial number makes it possible to distribute information about who owns a particular bitcoin. Thus the physical location of the fact of ownership is the entire network of computers participating in bitcoin transactions. Naturally, the protocol has provisions for resolving contradictory assertions, so that over time the network will converge towards consensus on any particular bitcoin.

Now as to how Uncle Sam can 'seize' a bitcoin, it's simple. Uncle Sam doesn't seize your bitcoin *wallet*, he seizes the cryptographic keys you use to conduct Bitcoin transactions. Then for every bitcoin X in the wallet, he sends out the message, "I, BringsApples, transfer bitcoin #X to Uncle Sam." Furthermore, since the protocol is de-centralized, there is no mechanism for the participants to agree that bitcoin X really belongs to you. The only way to "get the bitcoin back" is for Uncle Sam to broadcast a message, "I, Uncle Sam, transfer bitcoin #X to BringsApples."
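The simplified ownership scheme described above can be sketched as a toy ledger. To be clear, this follows the comment's simplification; real Bitcoin uses signed transactions over transaction outputs rather than literal per-coin serial numbers, and the coin id, names, and `transfer` function here are all invented for illustration.

```javascript
// The network's shared record: coinId -> current owner.
// In the real protocol this consensus emerges from the block chain;
// here we just model the agreed-upon end state.
function makeLedger() {
  return new Map();
}

// A broadcast transfer message is only honored if the sender is the
// coin's current owner -- i.e., holds the keys that control it.
function transfer(ledger, coinId, from, to) {
  if (ledger.get(coinId) !== from) return false; // rejected by the network
  ledger.set(coinId, to);
  return true;
}

const ledger = makeLedger();
ledger.set(42, "BringsApples"); // initial assignment, e.g. from mining

// Whoever seizes the keys can broadcast the transfer message:
transfer(ledger, 42, "BringsApples", "UncleSam");    // "seizure"
// The original owner can no longer move the coin...
transfer(ledger, 42, "BringsApples", "SomeoneElse"); // rejected
// ...unless Uncle Sam broadcasts a transfer back.
```

The asymmetry in the comment falls out directly: once the transfer message is accepted, the network's consensus is the only authority, and no participant can unilaterally declare the coin "returned."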

Comment Re:Most don't understand the legal argument (Score 1) 243

The bitcoins don't reside anywhere, including in any particular "virtual wallet on a PC".

If you don't believe that, make 100 copies of that "virtual wallet" and ask yourself if you now have 100 times as many bitcoins.

Well, let's ask a far more pertinent question: if you make 100 copies of the wallet, do you have 100x the bitcoin purchasing power? The answer is no. It is theoretically possible to spend the same bitcoin twice, but the protocol is designed so that on average that costs more than obtaining a second bitcoin legitimately. You don't have to make it impossible to cheat; you only have to make it pointless.

A bitcoin is an abstraction, just as the dollars in your bank account are. If you have $100K in a bank account, that doesn't mean there's a stack of banknotes in the vault that corresponds to that. Your funds "in the bank" exist only on that bank's ledgers. They have no physical existence, they exist according to a set of rules which assign those dollars to you. When nearly everyone in the system agrees that according to the rules you have $100K in the First National Bank of Podunk, then you've got $100K there. Bitcoins work exactly the same way, except the rules are voluntarily agreed-upon conventions rather than national law. Once everyone in the Bitcoin universe agrees you own a certain bitcoin, you can exchange that bitcoin for goods and services.

Comment Re:Still an idiot (Score 1) 243

Even if he somehow could get out of the drug dealer and murder-for-hire charges, he would still have the problem of proving how he legitimately got the money and why he didn't pay taxes on it. Penalties for failing to report tens of millions of dollars in income could easily put him in prison for decades, and would still result in the loss of the bitcoins because he can't prove any legitimate means by which he got them.

I assume he will claim he mined them (which could very well be true -- an early adopter could've easily mined that many, although it will be interesting to see how the argument is countered with block chain analysis) and that they aren't taxable until the gain in value is realised (by selling them for dollars).

Comment Re:Yeah and there's no more North Pole (Score 3, Insightful) 94

These predictions are all 100% accurate, just like Ted Danson's prediction that all U.S. cities will be completely uninhabitable by 1980 because Reagan was president throughout the 1970's and are even MORE accurate than ALGORE's 100% accurate prediction that the entire polar ice cap has permanently melted and all polar bears are dead.

Which specific predictions are you talking about? Nobody brought up Ted Danson, so I fail to see his relevance unless you are making this argument: if Ted Danson's predictions aren't accurate, then nobody's can be.

In any case it seems to me that the summary and linked article is kind of sloppy, almost as if it were designed to provoke this kind of silly strawman response.

For example, the notion that the power grid might be vulnerable to a geomagnetic storm the size of the 1859 event isn't a prediction. It's merely an assessment of vulnerability to a rare but possible event (e.g., a 14+m tsunami hitting Japan). A large geomagnetic storm is a possibility that should be taken into account, not an event to put on the calendar.

Likewise, nobody is suggesting that there's anything special about the 400ppm CO2 figure, other than that it is a round figure. It's common sense to look for climate impact, not because we've hit some "magic" number, but that CO2 levels are higher than they've been at any time in the past three million years. This is especially so because the slope of the CO2 concentration graph shows no sign of topping out.

Keep in mind there's no such thing as a "natural disaster". There are only natural events that catch us unprepared. There's nothing inherently disastrous about flooding in Bangladesh, except that economics and politics force people to live in the flood plain. Likewise there's nothing about a world that's two or three degrees warmer that makes it uninhabitable by humans, but the changes involved along the way will be *perceived* as catastrophic. For example, models predict a drier western US. That doesn't mean that one day it will stop raining, it only means that rainy years will be less common and drought years more common. For all we know we'll have a rainy year in 2014, but it still makes sense to plan for the day when the Colorado River won't be able to supply all the water cities in the west require from it (Los Angeles, San Diego, Denver, Phoenix, Tucson, Salt Lake City, and others). The disaster won't be lack of water, it will be lack of preparedness.

Comment Re:110,000 year major glaciation Sun cycle (Score 3, Insightful) 552

You wrote (emphasis mine):

The Earth's orbital changes around the Sun varies from more circular to more elliptical and its axis wobble changes and the net effect is that the different solar inputs are what causes the major climate shift on about a 110,000 year cycle.

From the /. summary (emphasis mine):

They have concluded that the driving factor since 1900 has been greenhouse gases.

From the U of Edinburgh press release (emphasis mine):

Research examining the causes of climate change in the northern hemisphere over the past 1000 years has shown that until the year 1800, the key driver of periodic changes in climate was volcanic eruptions.

These tend to prevent sunlight reaching the Earth, causing cool, drier weather. Since 1900, greenhouse gases have been the primary cause of climate change.

Now let me tie it all together for you. Let's say we assume:

(1) Over the course of hundreds of thousands of years, variations in solar radiation are the strongest determinants of global temperature.

(2) Over the course of the last thousand years, volcano eruptions have been the strongest determinants of global temperature.

(3) Over the last hundred years, anthropogenic greenhouse gasses are the strongest determinants of global temperature.

Here's the important point: you can believe ALL THREE of these things without the least contradiction. Denialist arguments seem to assume that any dominant factor must be dominant in every past period and over every timescale. This is why people scratch their heads at the denialists' "gotchas!", e.g. "Gotcha! There were no SUVs in the medieval warm period." So what? It's a straw argument. Nobody ever claimed that *all* past climate variation was due to greenhouse gasses, much less *anthropogenic* greenhouse gasses.

Comment Re:Being fired was the correct response regardless (Score 3, Insightful) 399

Well, you are right of course, the behavior was unprofessional. That doesn't mean that the reaction isn't disturbing.

Just because the inciting behavior is unreasonable doesn't make the piling on reaction *rational*. It has more than a whiff of a mob turning on someone who is suddenly perceived as vulnerable.

The people reacting to this act like they know all about this person. But do they? All they have to go on is one foolish comment. Many years ago, in the early 70s, my older teen sister volunteered in a program for intellectually disabled children -- this was at a time before this kind of service was common, or required for high school graduation. One day she remarked to one of her friends that she had to leave because it was time to go see "her retards." Word got back to one of the parents and my sister was banned from the program. Now I can understand the position of the parent defending her child, but is it reasonable for her to deprive her child of the support and help of someone he loved just because that person said something stupid?

If there is one thing I've learned over the years it's that the fruits of self-righteousness are bitter. The instinct to become part of an avenging mob is no respecter of fact, context, circumstance or consequences. It is not to be trusted.

Firefox

Asm.js Gets Faster 289

mikejuk writes "Asm.js is a subset of standard JavaScript that is simple enough for JavaScript engines to optimize. Now Mozilla claims that with some new improvements it is at worst only 1.5 times slower than native code. How and why? The problem with JavaScript as an assembly language is that it doesn't support the range of datatypes that are needed for optimization. This is good for human programmers because they can simply use a numeric variable and not worry about the difference between int, int32, float, float32 or float64. JavaScript always uses float64 and this provides maximum precision, but not always maximum efficiency. The big single improvement that Mozilla has made to its SpiderMonkey engine is to add a float32 numeric type to asm.js. This allows the translation of float32 arithmetic in a C/C++ program directly into float32 arithmetic in asm.js. This is also backed up by an earlier float32 optimization introduced into Firefox that benefits JavaScript more generally. Benchmarks show that while Firefox f32, i.e. with the float32 type, is still nearly always slower than native code, it is now approaching the typical speed range of native code. Mozilla thinks this isn't the last speed improvement they can squeeze from JavaScript. So who needs native code now?"
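The mechanism behind the float32 type is `Math.fround`, which rounds its argument to the nearest 32-bit float; asm.js-style code wraps arithmetic in it so the engine can prove a value fits in float32 and use single-precision instructions. A minimal hand-written flavor of the pattern (the function name and example values are mine, not from the article):

```javascript
// Sum an array in float32 precision by rounding after every
// operation, exactly as single-precision hardware would.
function singlePrecisionSum(arr) {
  "use strict";
  const fround = Math.fround;
  let sum = fround(0);
  for (let i = 0; i < arr.length; i++) {
    sum = fround(sum + fround(arr[i]));
  }
  return sum;
}

// The two precisions are observably distinct: 0.5 is exactly
// representable in float32, but 0.1 is not, so
// Math.fround(0.5) === 0.5 while Math.fround(0.1) !== 0.1.
```

Because every intermediate result is explicitly rounded, the engine loses nothing by computing in float32 directly, which is precisely the optimization the SpiderMonkey change exploits.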
Government

FBI's Secret Interrogation Manual: Now At the Library of Congress 102

McGruber writes "The FBI Supervisory Special Agent who authored the FBI's interrogation manual submitted the document for copyright protection — in the process, making it available to anyone with a card for the Library of Congress to read. The story is particularly mind-boggling for two reasons. First, the American Civil Liberties Union fought a legal battle with the FBI over access to the document. When the FBI relented and released a copy to the ACLU, it was heavily redacted — unlike the 70-plus page version of the manual available from the Library of Congress. Second, the manual cannot even qualify for a copyright because it is a government work. Anything 'prepared by an officer or employee of the United States government as part of that person's official duties' is not subject to copyright in the United States."
