It should be relatively easy to bust the myth: what are the capacities of the USB drives? If they are all >4 GB I don't buy that they are from 3 years ago. Drive make and model would also give you a fairly firm bound on the earliest date at which they could have been purchased.
It's worth more than all the computers and related hardware in the office combined.
Debatable. It cost more than all the computers and related hardware in the office combined for sure.
The only trouble with the America Invents Act of 2011 is that it is not an Amendment to the Constitution, which it needs to be in order to change the original text of the Constitution, which (re: "Inventors") clearly specifies the "first to invent" system over the "first to file" system.
No it doesn't. It specifies that the rights over their discoveries shall go to the Inventors, but nowhere does it specify the process inventors must undergo to secure those rights.
Under first-to-file as it is practiced elsewhere it is still illegal for anyone other than the inventor to secure a patent on a discovery. It is really a very minor change which only makes a difference in the case of near-contemporaneous discoveries. It mostly benefits accidental inventors who are less likely to be able to provide any evidence for the date of invention than the industrial-scale patent-generators who receive most patents.
A secondary effect which may turn out to be rather more important is that the current US system requires that you file within 1 year of the discovery or never file for patent protection. An invention therefore cannot be kept a trade secret and used for 40 years before being patented by the same person/corporation when they feel there is a risk of someone else working it out, which would effectively secure them a 60-year monopoly (40 years of secrecy plus the 20-year patent term). First-to-file elsewhere generally allows this; I am not familiar with the US statute in question.
However, SHA-2 could be broken tomorrow, and this time we won't have a decade's wait while a suitable replacement is designed.
As all road cyclists should know:
Mountain bike shoes and pedals have their place.
On a mountain bike.
Not quite sure what that has to do with Islam, but it's always useful to be reminded of the sacred texts.
Most of these whippersnappers have never even heard of the unsigned keyword.
You did your testing on a pre-production test domain, and you have a managed code base that allows for easy transition to other domain names?
Why add in an unnecessary risk? If third-party code (unbeknownst to you) names its tables with the domain name you can test it all you like on a pre-production name but as soon as you flip to the real name it will stop working. And the first time you will know about it is after you flip the switch and your customers complain.
Or are you seriously suggesting you create an entire new server environment to make modifications to your existing web sites?
Pretty much everyone I know does this: you make a new VM in EC2 or Peer1 or Rackspace or whatever, install and configure everything under its final name, test it, then switch the DNS entries.
Once you are certain the DNS changes have propagated everywhere they're going to (wait TTL × 2 after the change), you stop and delete the previous VMs.
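The waiting can be automated rather than eyeballed. A minimal sketch in Python, with a hypothetical site name and new address; the resolver and clock are parameters so the "wait TTL × 2" rule of thumb is explicit (this only checks your local resolver, hence the generous margin):

```python
import socket
import time

def wait_for_dns(name, expected_ip, ttl_seconds,
                 resolve=socket.gethostbyname,
                 now=time.time, sleep=time.sleep):
    """Poll the resolver until `name` points at `expected_ip`.

    Gives up after 2 * TTL, a rule of thumb for how long stale
    answers can linger in caches after a DNS change.
    """
    deadline = now() + 2 * ttl_seconds
    while now() < deadline:
        if resolve(name) == expected_ip:
            return True
        sleep(60)  # re-check once a minute
    return False

# Hypothetical usage: block until the switch-over is visible locally,
# then it is safe to start tearing down the old VMs.
# wait_for_dns("www.example.com", "203.0.113.10", ttl_seconds=3600)
```

Remote caches may still hold the old answer for up to their own TTL after your local resolver flips, which is exactly why the deadline is 2 × TTL rather than TTL.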
If you are identifying people from a population of 30 million you need ceil(log2(30 000 000)) bits for your person identifier; which is 25 bits in this case. However you are likely to need to identify corporations as distinct from persons, which will probably take another bit or so. 26 bits per trading entity into a 24-byte (192-bit) TAC goes 7.4 times.
No matter how you put those IDs into the TAC you can never fit more than 7 complete IDs at a time. So if you are a criminal (or privacy nut) who wants to use this system, make sure there are at least 8 trades between you and any other party you interact with if you want deniability when someone has access only to the TAC used for the final transaction to you. This is not a very plausible tracking scheme, because for practical reasons you will need a timestamp and other gubbins to be encoded in the TAC as well.
Of course, if you have access to all the TACs you only need to fit two IDs in there at a time to build a chain. This is IMO very plausible.
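The back-of-the-envelope numbers above are easy to check in a few lines of Python (the extra bit for corporations and the 24-byte TAC size are assumptions carried over from the post):

```python
import math

population = 30_000_000
person_bits = math.ceil(math.log2(population))  # bits to identify one person
entity_bits = person_bits + 1                   # +1 bit to distinguish corporations (assumed)
tac_bits = 24 * 8                               # 24-byte TAC

print(person_bits)             # 25
print(entity_bits)             # 26
print(tac_bits / entity_bits)  # ~7.38 IDs per TAC
print(tac_bits // entity_bits) # 7 complete IDs
```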
You are probably right; however it does depend on how rapidly it is broken down and excreted by the body. If it is never eliminated, then this test is using a massive under-dose (from my point of view), since I have certainly swallowed 2 tubes' worth of Colgate Total over the course of my life.
If it is eliminated within 12 hours it is a huge over-dose, as you say. Somewhere between these extremes is an elimination rate which makes this number entirely appropriate.
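That trade-off is easy to make concrete if you assume simple first-order (exponential) elimination. A sketch with made-up numbers, purely for illustration and not real triclosan pharmacokinetics:

```python
def steady_state_burden(dose_per_day, half_life_hours):
    """Long-run amount in the body just after a daily dose,
    assuming first-order elimination (an illustrative model,
    not real pharmacokinetic data)."""
    fraction_left_per_day = 0.5 ** (24 / half_life_hours)
    # geometric series: D + D*f + D*f**2 + ... = D / (1 - f)
    return dose_per_day / (1 - fraction_left_per_day)

# Eliminated quickly (12 h half-life): barely more than one dose on board.
print(steady_state_burden(1.0, 12))             # ≈ 1.33 doses
# Effectively never eliminated (10-year half-life): a lifetime accumulates.
print(steady_state_burden(1.0, 24 * 365 * 10))  # thousands of doses
```

The slower the elimination, the larger the steady-state burden a fixed daily intake builds up to, which is exactly why the appropriate test dose depends on the elimination rate.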
I have a bunch of iOS devices and a Motorola XOOM 2. On paper the XOOM 2 should kick the arse of, say, an original iPad, but it does not. I have tried to browse developer.android.com on both, and only the iPad renders the API Reference correctly: the XOOM leaves the lower-right pane completely empty much of the time.
To be fair I should compare it to my iPad 2, in which case it looks even worse. Pages generally take an age to load, and the only way to get acceptable performance is to turn off the flash plugin. This means you don't get any videos on the majority of sites unless you enable debug mode in the browser and tell it to use the UAString of an iPad.
Now you can point out that the XOOM 2 is running Honeycomb 3.2.2 and not the latest and greatest Ice Cream Sandwich 4.0.x, but this is not my choice: Motorola haven't issued an ICS update yet, so I can't have it.
Bigger range? Yes.
Better OS? No. Who thought perpetuating a destination-information-free back button was a smart idea? I lose 80px at the bottom of the screen for it, and there are always 600px of blackness which could be used to hint where tapping it will take you, but no.
More Open? Maybe.
Lower Cost? If you don't value your time.
Programmable? Nowhere near as easily as iOS. On iOS I only need to care about the current release because everyone likely to purchase an app has 5.0.1. On Android I have to code for the archaic 2.2 to meet the same market share.
Basically if you buy an iOS device Apple can screw you over, but probably won't. If you buy an Android device Google, the device manufacturer and (if applicable) the carrier can screw you over, and 2 out of the 3 probably will.
I can't; don't have mod points today.
I did a similar exam in, I guess, 1999; you could get an A+ in about 20 seconds without looking at the screen. The last sentence advised you to compare what you had done to sample.doc, so I typed:
Ctrl+O sample.doc Enter Alt+F A
I then went through the paper to verify that there weren't any hidden extras or obvious flaws in the sample (there weren't) and to delete any metadata (there was none).
Don't know about Premiere but I do this all the time with FCP X/Compressor. I understand not all players will play them, but my PS3 does.
There is another consideration.
When generating a 1024-bit random number which you hope is prime, you take 1022 bits of random data and sandwich them between two 1 bits, forcing the top and bottom bits to 1 so the candidate is odd and has the full bit length. For a 512-bit random probable prime you use 510 bits of random data. Having generated the candidate you test whether it is very likely prime, and if not you repeatedly add (or subtract, as long as your algorithm is consistent) two and retest until the test says you've found a winner. Because prime numbers are not perfectly evenly distributed (if they were we could find them very easily), there are sometimes large gaps between primes, and random generation by this method will (over many attempts) pick primes following unusually large inter-prime gaps much more often than primes following unusually small ones.
So far as I know no one knows what the distribution of 512- and 1024-bit primes really looks like; but supposing there were only 2^30 very large gaps in the 1024-bit prime space, it would be very worthwhile looking for common factors (pairwise GCDs) in 2048-bit RSA public keys.
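The generate-then-increment method described above is easy to sketch, and at small bit sizes the gap bias is directly observable. A sketch in Python, using Miller-Rabin as the "very likely prime" test and 12-bit primes standing in for 512-bit ones:

```python
import random
from collections import Counter

def is_probable_prime(n, rounds=20):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def random_probable_prime(bits):
    # The "1-sandwich": bits-2 random bits with top and bottom bits forced to 1.
    n = (1 << (bits - 1)) | (random.getrandbits(bits - 2) << 1) | 1
    while not is_probable_prime(n):
        n += 2  # walk upward until the test passes
        # (a start near the top of the range can occasionally walk past
        # 2**bits; real implementations re-draw instead)
    return n

# At 12 bits the bias is visible: each prime is chosen with probability
# proportional to the gap below it, so primes after big gaps dominate.
counts = Counter(random_probable_prime(12) for _ in range(5000))
print(counts.most_common(5))
```

Tallying over many runs, the most frequently selected primes are precisely the ones sitting after the largest gaps in the range, which is the bias the argument above relies on.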
All Finagle Laws may be bypassed by learning the simple art of doing without thinking.