Ninite.com is the only place I go for software on a new Windows installation. Select what you want and it gives you one installer. And you get exactly what you asked for. No search bars or crapware. It has been working great for years now.
But that was not my question. I fully understand how to use lookup tables/Chebyshev expansions of exp(x) and ln(x) to implement pow(x,a)--I have implemented these many times. My question was specifically about your assertion that any differentiable function can be evaluated with a Newton-style iterative correction and thus provide arbitrarily precise results. I asked specifically to see how that is accomplished for pow(). There is no corrective mechanism in the algorithm you have stated above. The precision you get is a function of the precision you've baked into your lookup table--and then it becomes a space/accuracy trade-off. On desktop/server CPUs, that trade-off is more often than not won by Chebyshev expansions, especially in a world of SSE/AVX vector instructions.
So, if there is a technique that does allow me to start with an initial guess x for pow(a,b) and then generate corrections of the form x[i+1] = x[i] - f(x[i]), where f() uses only intrinsic math operations (+, -, *, /, etc.) but no transcendentals, then I am quite anxious to see it.
With a table of values in memory you can also narrow down the inputs to Newton's method and calculate any differentiable function very quickly to an arbitrary precision. With some functions the linear approximation is so close that you can reduce it in just a few cycles.
No, you can't. I know this was done in Quake3 fastInvSqrt(), but that is the exception, not the rule in my experience. x = pow(a,b) is a differentiable function. How can you assemble a root function/Newton iteration to successively correct an initial guess for x to arbitrary precision--without actually calling pow() or other transcendental function? I have built Newton (and Halley and Householder) iterations to successively correct estimates for pow(a,b) when b is a particular rational number. You can re-arrange the root function to only have integer powers of the input a and of the solution value x, and those can be computed through successive multiplication. These can be fast, but they are certainly not useful when b is something other than a constant rational number. And even if the exponent value has only a few significant digits, the multiplication cascade starts to get expensive (that was the reason to use Halley/Householder because once you have f' calculated, f'' and f''' are almost free.)
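To make the rational-exponent case above concrete, here is a minimal Python sketch (my own illustration, not anyone's production code) of the kind of Newton iteration described: to get x = a^(p/q) for fixed integers p and q, solve f(x) = x^q - a^p = 0, so the loop needs only multiplication, subtraction, and division:

```python
def pow_rational(a, p, q, x0, iters=50):
    """Approximate a**(p/q) by Newton's method on f(x) = x**q - a**p.

    The update x -= (x**q - a**p) / (q * x**(q-1)) involves only integer
    powers, which reduce to repeated multiplication -- no transcendentals.
    This only works because p and q are fixed integers; it does not
    generalize to an arbitrary real exponent b.
    """
    ap = a ** p                # integer power: multiplication alone
    x = x0
    for _ in range(iters):
        x -= (x ** q - ap) / (q * x ** (q - 1))
    return x

# Example: 2**(3/2) from a rough starting guess of 2.0
approx = pow_rational(2.0, 3, 2, 2.0)
```

As the comment notes, this is exactly the multiplication cascade described above, and it gets expensive as p and q grow.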
If you know otherwise, please let me know. My current fast pow() function leverages IEEE floating-point formats and Chebyshev polynomial expansions to get reasonable results. If there is a way to polish an approximate pow() result with a Newton (or higher-order) iteration, I would be happy to learn it.
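For readers following along, here is a toy pure-Python illustration of that general approach (my own sketch under assumed details, not the poster's actual code): compute pow(a,b) as 2^(b*log2(a)), use frexp/ldexp to handle the integer part of the exponent through the float representation, and approximate log2 on [1,2) and 2^f on [0,1) with Chebyshev interpolants. The tables are built once using the library log2/exp2; a real implementation would hard-code the coefficients.

```python
import math

def cheb_fit(f, n):
    """Chebyshev interpolation coefficients of f on [-1, 1] at n nodes."""
    nodes = [math.cos(math.pi * (k + 0.5) / n) for k in range(n)]
    fv = [f(u) for u in nodes]
    c = [2.0 / n * sum(fv[k] * math.cos(math.pi * j * (k + 0.5) / n)
                       for k in range(n)) for j in range(n)]
    c[0] /= 2.0
    return c

def cheb_eval(c, u):
    """Evaluate sum c[j]*T_j(u) with the Clenshaw recurrence."""
    b1 = b2 = 0.0
    for cj in reversed(c[1:]):
        b1, b2 = 2.0 * u * b1 - b2 + cj, b1
    return u * b1 - b2 + c[0]

# Precomputed tables (a real version hard-codes these coefficients).
LOG2_C = cheb_fit(lambda u: math.log2((u + 3.0) / 2.0), 16)  # log2 on [1,2]
EXP2_C = cheb_fit(lambda u: 2.0 ** ((u + 1.0) / 2.0), 16)    # 2**f on [0,1]

def fast_pow(a, b):
    """Approximate a**b for a > 0 as 2**(b * log2(a))."""
    m, e = math.frexp(a)          # a = m * 2**e with m in [0.5, 1)
    m, e = 2.0 * m, e - 1         # renormalize so m is in [1, 2)
    t = b * (e + cheb_eval(LOG2_C, 2.0 * m - 3.0))
    k = math.floor(t)             # integer part goes into the exponent bits
    f = t - k                     # fractional part in [0, 1)
    return math.ldexp(cheb_eval(EXP2_C, 2.0 * f - 1.0), k)
```

Note that there is no iterative correction anywhere in this: the accuracy is fixed by the degree of the interpolants, which is exactly the space/accuracy trade-off discussed above.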
I don't think the author got it all exactly right, but your points are wrong. Business is driven mindlessly by ONE THING--maximizing profitability. The time will come...and right soon...when machines will work well enough to eliminate many existing blue-collar jobs and a large fraction of low-skilled white-collar jobs too. There is and will be pushback, in the form of higher minimum wages and health-insurance requirements. And all that pushback does is accelerate the process. We can likely thank China, India, and Mexico for providing cheap labor and forestalling the onset of this mechanization by a decade or two. Had they not been there to take our manufacturing jobs, serious automation efforts would have started as early as the 90s. As greed is the only acceptable (and fiduciarily mandated) corporate ethos, we should expect corporations to follow its guiding light to its logical end. As soon as Walmart can stock its shelves with a robot, it will. As soon as McDonald's can reliably serve food without a single worker on staff, it will. As soon as FedEx can roll out a Google truck, it will. Human labor is viewed as a "commodity," and it reliably becomes more expensive with time.
I think the mistake the author makes is assuming humans will be part of the machine. They won't. There will be strikes and protests and maybe even legislation, but those will only slow the pace of change, not stop it. The US and Western European economies will look quite different in 50 years. I doubt anything but boutique items will be made even in part by humans. Before then, we will all have to ask ourselves what we do with our time and how we provide for our needs. The end of the story effectively contrasts two possible outcomes.
Replace the 100-million-key dictionary with a PRNG seeded with a secret key, some time information, and the source/destination ports and addresses. The NSA would have the PRNG, the secret key, and the seeding input from the packet itself, so they could regenerate any key without much effort, while the keys would appear truly random to anyone without the secret key, no matter the sample size.
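A hypothetical sketch of that scheme in Python, with HMAC-SHA256 standing in for the keyed PRNG (the function name and field layout are made up for illustration):

```python
import hmac
import hashlib

def packet_key(secret, timestamp, src_addr, src_port, dst_addr, dst_port):
    """Derive a per-connection key from a secret and packet fields.

    Anyone holding `secret` can recompute the key from the packet alone;
    to everyone else the outputs are computationally indistinguishable
    from random, no matter how many samples they collect.
    """
    msg = f"{timestamp}|{src_addr}:{src_port}|{dst_addr}:{dst_port}".encode()
    return hmac.new(secret, msg, hashlib.sha256).digest()
```

The same inputs always yield the same key, so no dictionary of keys ever needs to be stored or shipped around.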
It is easier to just drill wells. Google geothermal heat pump.
This idea is terrible. It is even worse than RFID credit cards. Since it has active electronics, I assume this is able to do challenge/response authentication, which is good. But how do you disable it? Somebody just "bumps" into you with a scanner and pays their dinner bill with your gut. At least RFID-chipped cards can be stored in a conductive pouch to prevent walk-by theft.
Whatever shape these new authentication methods take, they need to be at minimum:
(1) Challenge/Response based, and
(2) Momentary ON
Requirement 1 kills most biometrics systems. And Requirement 2 kills most implant/ingested systems.
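A hypothetical sketch of both requirements together, with HMAC-SHA256 standing in for the challenge/response function and a press() call modeling the momentary-ON switch (all names here are invented for illustration):

```python
import hmac
import hashlib
import secrets

class Token:
    """An authentication token that only answers while momentarily armed."""

    def __init__(self, key):
        self._key = key
        self._armed = False

    def press(self):
        # Momentary ON: a deliberate user action arms exactly one response.
        self._armed = True

    def respond(self, challenge):
        if not self._armed:
            return None            # walk-by scanners get nothing
        self._armed = False
        return hmac.new(self._key, challenge, hashlib.sha256).digest()

def verify(key, challenge, response):
    """Verifier side: recompute the expected response and compare."""
    if response is None:
        return False
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# The verifier issues a fresh random challenge each time, so a recorded
# response is worthless for replay.
challenge = secrets.token_bytes(16)
```

Because the response depends on a fresh challenge, a skimmer can't replay it later; because the token must be armed, a skimmer can't solicit one in the first place.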
Yeah...Timex Sinclair 1000, a hand-me-down B&W TV from my grandfather, and my old tape recorder. That was the beginning for me...back in 6th grade. I bought it from Hills for $50, I think. I spent many hours on it...until the C64 and floppy drive came a few Christmases later. I look back on those days and the oceans of free time I had...peeking and poking and disassembling code. I remember spending many hours typing in code from Compute!'s Gazette...that was all C64. And Transactor too.
As machines and computers do more and more of the work, will such a thing as work, as we know it, still exist in 50 years?
Fixed that for you. The answer is no. My child will live to a see a very different world and economic reality.
This. The whole series is very well done and deeply engaging. But it is dense. It might be best described as fictional mathematical physics, but it is not your typical SF...even hard SF.
I had the Timex Sinclair 1000 as well, but not the 16 KB module. Paid $60 for it at Hills--I was in 6th grade. I learned quickly to be careful with my precious 2 KB of RAM, but I coded a fairly accurate image of the Space Shuttle and figured out how to make it "fly" across the screen. Hard to believe I have been writing code for almost 30 years!
I think the key take-away is that there is another physical signal dimension to exploit--frequency, directionality, polarization, and now orbital angular momentum. They have demonstrated that they can distinguish between two channels on the same frequency using orbital angular momentum as the differentiator. So, OAM mode can be added to the tool kit. If they can distinguish among a few dozen modes and still allow beam forming, this could provide a huge benefit for cellular and other wireless networks. If they can distinguish among hundreds or thousands of modes, it could be truly transformative. It has been a long time since my EM class, but I wonder if similar mode discrimination could be applied to waveguides.
Sandy Bridge has fused multiply-add too.
Probably referring to efuses that can be burned out on the die. These are common and allow CPUs/GPUs to have unit-specific information (like serial numbers, crypto keys, etc.) coded into otherwise identical parts from the fab. Video game systems like the Xbox 360 use them as an anti-hacking measure...disallowing older versions of firmware from running on systems that have certain efuses "blown." Likely, there is an efuse for each core or group of cores. Those can be burned out if the cores are found to be defective, or simply to cripple a portion of the part for down-binning. That practice is at least as old as the Pentium II.
TrueCrypt-encrypted external USB/eSATA drives are by far the better option. We also use normal 3.5" drives with external USB/eSATA docks. There are NO cheap tape solutions anymore. I'd further argue that what tape solutions do exist are trumped by hard-drive backup for on-site use--tape is far slower and no more reliable. Tape is dead. Anyone still using it is either leveraging a 5-to-10-year-old investment in a tape robot or is being sold a bill of goods by a vendor.