
Comment: Re:Pinto (Score 1) 133

by mcrbids (#49566571) Attached to: The Engineer's Lament -- Prioritizing Car Safety Issues

Nope. Poor braking behaviour doesn't cause crashes; people not keeping a safe distance causes crashes.

Nope. What causes crashes is hunks of metal ramming into other hunks of metal. It would be complicated except that it's not. We choose to ascribe "cause" to other events that precede the ramming behavior, but it's really arbitrary.

For example, it's widely understood that driving cars is *dangerous* and yet we don't ascribe standard risk factors for *driving at all*.

Skiing is inherently dangerous. In order to use a ski slope, I have to acknowledge this risk. Why aren't car manufacturers covered by a similar legal contract?

Comment: Ungrounded Lightning (Rod) to Stop Using DietPepsi (Score 1) 533

by Ungrounded Lightning (#49564133) Attached to: Pepsi To Stop Using Aspartame

Aspartame has problems for some people (like my wife and brother-in-law) and not for others (like me).

Sucralose has problems for some people (like me) and not for others (like my wife).

Seems to me the thing for Pepsi to do is to bring out another formula - with a different name - using Sucralose, put them in the stores side-by-side (they get a LOT of shelf space to play with), and let the customers decide.

Changing the formula of an existing brand strikes me as a stupid move. I suspect Pepsi is about to have its "New Coke!" moment...

Comment: problems with making stuff invisible to drivers (Score 1) 98

The bit you're apparently not grasping is something called a spatial light modulator. ... Couple it with a microwave radar or ultrasound sonar, and you can track individual raindrops and then cast shadows on them.

Then construct an object that appears to the system to be raindrops and you can put an invisible obstacle in the road. B-b

Comment: Don't forget legacy BROWSERS. (Score 4, Insightful) 154

by Ungrounded Lightning (#49563663) Attached to: JavaScript Devs: Is It Still Worth Learning jQuery?

A site may wish to continue using jQuery because some of its clients are using older browsers that don't support the new features that allegedly obsolete jQuery code.

Drop the jQuery code and you drop those customers. Develop future code without it and the pages with the new features won't work for people using legacy browsers. And so on.

I've seen similar things happen over several generations of web technology. Use care, grasshopper!

Comment: Re:So, Microsoft is a social leech! (Score 1) 103

by mcrbids (#49553419) Attached to: Microsoft Increases Android Patent Licensing Reach

Except in this case, the patent is for the use of VFAT, which is a very specific file system format that even Microsoft doesn't use much anymore, but is commonly understood by their systems.

There is no reason, for example, why Microsoft couldn't implement an open file system like EXT4 or UFS and update all their operating systems to recognize it, except that it would undercut the value of their VFAT patents. So they don't bother.

I remember reading that they make more money on their patents from Android vendors than they make *gross* from their Windows Mobile sales.

Comment: Not just IOS (Score 2) 472

by mcrbids (#49553317) Attached to: Ask Slashdot: What Are the Most Stable Smartphones These Days?

I have a Moto Razr Maxx HD, now working on its 3rd year. It's been basically perfect. I reboot it perhaps once every few months, and half of those reboots are due to an OTA OS upgrade.

With its amazing battery life and durable, sturdy case, it's a phone that feels like a "partner": it doesn't leave me hanging, and even when I'm really putting the screws to it (e.g. on trips) it's "just there" for me.

It is no longer a "flagship" phone, it's not the fastest phone, and it doesn't have the biggest/brightest screen any more, but it's still a very, very good balance for a phone that I probably won't be replacing until it actually dies.

My only honest complaint is that its bluetooth reception seems weak. I use $20 wired headphones as a result.

Comment: Such hyperbole in TFS (Score 2) 33

by fyngyrz (#49544657) Attached to: MIT Developing AI To Better Diagnose Cancer

MIT Developing AI To Better Diagnose Cancer

FFS, it's not AI. It's a mindless program. Unthinking software. Data analysis software. Innovative to some degree perhaps, but AI? Hardly. No better than me stumbling in here and calling some DSP code I'd written "AI." Well, except I wouldn't do that. :/

When AI gets here, we'll have to call it something else what with all this crying wolf going on.

Comment: Re:Too bad (Score 2) 66

To paraphrase, you can't be too rich, too thin, or have too many bits of precision in a calculation. With single precision you have to be enormously careful not to drop digits even in comparatively modest loops; with double precision you can drop many digits before you run out. You can see it in almost any computation involving trig and pi -- single precision pi degrades in series much faster than double precision pi. It isn't just a matter of not using forward recursion to evaluate Bessel functions, which is unstable in any precision (or, for that matter, of using book definitions of e.g. spherical Bessel functions in terms of trig functions), or of reordering series to avoid subtracting big numbers and running small to big instead of big to small -- there is simply a big difference between accumulating a random walk with a random digit at the 16th place and one at the 8th place.
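A minimal sketch of that random-walk difference in Python, which only has doubles natively, so float32 is emulated here by round-tripping through `struct` (the helper name `f32` is my own):

```python
import struct

def f32(x):
    """Round a Python float (a double) to the nearest IEEE-754 single."""
    return struct.unpack('f', struct.pack('f', x))[0]

N = 100_000
s32 = s64 = 0.0
for _ in range(N):
    s32 = f32(s32 + f32(0.1))  # every add rounds at ~2^-24 relative error
    s64 = s64 + 0.1            # every add rounds at ~2^-53 relative error

exact = N / 10  # 10000, exactly representable in both formats
print(abs(s32 - exact))  # typically many orders of magnitude larger...
print(abs(s64 - exact))  # ...than the double-precision drift
```

Same number of additions, same values; the only difference is where the rounding digit sits.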

A second problem is the exponent. 10^38 just isn't "big" in a modern large scale computation. It is easy to overflow or underflow a single precision computation. 10^308 is a whole lot closer to big, even expressed in decibels. One can concentrate a lot more on writing simple code, and a lot less on handling exponent problems as they emerge.
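The range difference can be poked at the same way (float32 again emulated via `struct`; the `f32` helper is my own, extended to map packing overflow to inf, which is what round-to-nearest float32 hardware would produce):

```python
import math
import struct

def f32(x):
    """Round a double to the nearest IEEE-754 single; values beyond the
    single-precision range become inf, as on real float32 hardware."""
    try:
        return struct.unpack('f', struct.pack('f', x))[0]
    except OverflowError:
        return math.copysign(math.inf, x)

a = f32(1e20)
print(a * a)                      # ~1e40: unremarkable as a double...
print(math.isinf(f32(a * a)))     # True: ...but past the float32 max of ~3.4e38
print(math.isinf(1e155 * 1e155))  # True: doubles give out too, near 1.8e308
```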

A final problem is random numbers. This is actually a rather big problem, as lots of code (all Monte Carlo, for example) relies on a stream of algorithmically random numbers that (for example) do not have a period less than the duration of the computation and that do not have significant bunching on low dimensional hyperplanes or other occult correlations. It is much more difficult to build a good random number generator on fewer bits, because the periods of the discretized iterated maps scale (badly) with reduced numbers of bits and it is more difficult to find acceptable moduli for various classes of generators from the significantly smaller discretized space. You can watch this problem emerge quite trivially by building a Mandelbrot set generator in float and rubberbanding in -- oops, you hit bottom rather quickly! Rebuild it in double and you at least have to work to rubberband in to where it all goes flat. You have to build it in a dynamically rescalable precision to rubberband in "indefinitely" as the details you wish to resolve eventually become smaller than any given finite precision. This actually illustrates the overall problem with single precision quite nicely -- the emergent flat patches in a graphical representation of an iterated map are isomorphic to the establishment of unintended correlations in long runs of iterated maps in a random number generator, and the clipping of the graphical representation of small numbers illustrates the problems with mere underflow in real computations of interest.
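The "rubberband in until it goes flat" effect is just coordinate deltas falling below the float's resolution, which is easy to demonstrate directly (float32 emulated via a `struct` round-trip, as before):

```python
import struct

def f32(x):
    """Round a Python float (a double) to the nearest IEEE-754 single."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Zooming in means distinguishing points whose coordinates differ by ever
# smaller deltas. Near 1.0, float32 can only resolve steps of ~1.2e-7
# (its machine epsilon); float64 resolves down to ~2.2e-16.
delta = 1e-8
print(f32(1.0 + delta) == f32(1.0))  # True: the two pixels collapse in float32
print((1.0 + delta) == 1.0)          # False: float64 still tells them apart
```

Once every point in the viewport rounds to the same handful of representable values, the image necessarily goes flat.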

Personally, I dream of default quad precision and 128 bit processors. 34 decimal digits of precision means that a random walk with n unit steps (which accumulates like \sqrt{n}) requires (10^30)^2 = 10^60 steps to get to where I don't still have 4 significant digits. Even a rather large cluster running a rather long time would have a hard time generating 10^60 add operations. In contrast, with only (say) 8 decimal digits a mere 10^16 operations leaves you with no digits at all, assuming you haven't overflowed already. I've run computations with a lot more than this number of operations. I also like the idea of having overflow around 10^5000. It takes quite a while adding numbers at the overflow of double precision to hit overflow, and one basically could add overflow scale single precision floats forever and never reach it. That gives me comfort. It would also make it practical to write a Mandelbrot set explorer tool where one would be likely to give up before rubber banding all the way to the "bottom" -- there are a whole lot of halvings of scale in there to play with that still leave you with much more resolution than needed on the screen.

rgb

Comment: Re:Dell, HP, Panasonic (Score 5, Informative) 417

by mcrbids (#49540887) Attached to: We'll Be the Last PC Company Standing, Acer CEO Says

How is Dell a laugh?

I write this on a gorgeous Dell Precision M3800 that has it all: powerful i7 processor, space for lots of RAM (16 GB), dual SSD bays, gorgeous 4K screen, and all in a lightweight, svelte case that rivals a Macbook Air in appearance and feel.

Oh, did I mention Linux compatibility? Ubuntu is officially supported. (My fave distro, Fedora, runs without issue - literally load and forget.)

Not sure what you're looking for in a PC manufacturer, but for Slashbots, isn't this pretty much it?

Comment: Re:Back end (Score 1) 77

by pe1rxq (#49535317) Attached to: GCC 5.1 Released

Unless your build environment is really broken (or you have a seriously atypical code base), compile time should not matter nearly as much as the resulting code. Don't forget that the resulting code has a big impact on the test phase of your cycle.

Normally during the edit phase you only touch part of a codebase, and proper dependency tracking should result in only a small part of it being rebuilt and linked.
Proper dependency handling is not the job of the compiler.
Linking is also not the job of the compiler. (And until lld is mature enough, llvm and gcc use the same one anyway, and even then it still has to prove itself.)
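That split of responsibilities is easy to see in a toy sketch (Python; the helper name is my own): a make-style tool decides what to rebuild purely from file timestamps, before any compiler runs at all.

```python
import os

def needs_rebuild(target: str, sources: list[str]) -> bool:
    """Toy make-style check: rebuild iff the target is missing or any
    source file is newer than it. No compiler involved in the decision."""
    if not os.path.exists(target):
        return True
    target_mtime = os.path.getmtime(target)
    return any(os.path.getmtime(src) > target_mtime for src in sources)
```

A real build tool applies exactly this check per edge of the dependency graph, which is why touching one source file rebuilds one object file rather than the world.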

