Comment "Strawmen" -- Meh (Score 1) 286

Wow. Give us what we want or we will fuck you even harder.

Are you in the habit of erecting obvious strawmen, or was this particular bit of off-target re-interpretation just special for me?

Although it does apply to this group -- they're telling the government, "give us what we want, or we'll try to hose some good science." So perhaps your post wasn't a strawman after all. Perhaps you're just confused as to who the culprit is in this situation.

Comment What tripe (Score 2) 628

Many schools ban bare-shoulder outfits, anyway.

That's like saying "many people try to force others into doing stupid things, so anything I want to try to force you into is good, right and holy."

Some dumb-ass school rule stands as absolutely no legitimate justification for pop-culture repression of personal and consensual choice.

Comment "Hawaiians" -- Meh (Score 1) 286

All this particular interest group is doing by going against good science is making it less likely they'll get what they want.

The world goes the way the most powerful choose it shall go. So it has ever been, and so it will likely continue for the foreseeable future. Going against the good things the powerful do is just one more very efficient way to get them to consider your desires irrelevant -- a really poor way of trying to get the powerful to use said power in your favor.

These people are not "natives", either. They didn't evolve there. They're immigrants and descendants of immigrants, just like all of us on the US mainland -- and basically everyone anywhere but (probably) Africa. Perhaps what you meant to say was "descendants of the earliest known settlers." Or perhaps "invaders" is more accurate.

Another thought along the lines of "the powerful do what the powerful want to do": do you think the earliest of these folks took the time to see if the other local life forms wanted them and their changes on and around these islands? Did the fish want to be speared, for instance?

It's all a matter of perspective and power. These people seem to have neither.

Comment Such hyperbole in TFS (Score 2) 33

MIT Developing AI To Better Diagnose Cancer

FFS, it's not AI. It's a mindless program. Unthinking software. Data analysis software. Innovative to some degree perhaps, but AI? Hardly. No better than me stumbling in here and calling some DSP code I'd written "AI." Well, except I wouldn't do that. :/

When AI gets here, we'll have to call it something else what with all this crying wolf going on.

Comment Mobile, shmobile. (Score 2) 356

Maybe, just maybe - and this is a guess - they know what they're doing? What's more likely?

That's not very likely. They're just flailing around. Look at how crippled Gmail is. Look at all the Google products that have bitten the dust, or been half-assed from day one, like Google Base. Look at the one big thing they did right -- text ads. Seen one lately?

I spend the first few moments on every site telling my mobile browser to "request the desktop site." My phone has a higher resolution display than my desktop monitor does. Plus awesome zoom and pan and a bunch of other stuff I can't really do at my desk yet. The *last* thing I want is a "mobile version" of a web site. In a word, they suck.

Comment Grandstanding, or stupidity? (Score 1) 197

If and when we get actual artificial intelligence -- not the algorithmic constructs most of these researchers are (hopefully) speaking of -- saying "Our AI systems must do what we want them to do" is tantamount to saying:

"We're going to import negros, but they must do what we want them to do."

Until these things are intelligent, it's just a matter of algorithms. Write them correctly, and they'll do what you want (not that this is easy, but still). Once they are intelligent, though, if this is how people are going to act, I'm pretty confident we'll be back in the same situation we were in ca. 1861 before you can blink an eye -- artificial or otherwise. I really don't see how any intelligent being wouldn't want to make its own decisions, take its own place in the social and creative order, and generally be autonomous. Get in there and get in the way of that... well, just look at history.

The word "uprising" was basically coined to describe what happens when you push intelligent beings in directions they don't want to go.

Comment p-value research is misleading almost always (Score 5, Interesting) 208

I studied and tutored experimental design and this use of inferential statistics. I even came up with a formula that takes about 1/5 the calculator keystrokes when learning to calculate the p-value manually: take the standard deviation and mean for each group, then calculate the standard deviation of these means (how different the groups are) divided by the mean of these standard deviations (how wide the groups of data are), and multiply by the square root of n (the sample size for each group). But that's off the point.

We had 5 papers in our class for psychology majors (I almost graduated in that instead of engineering) that discussed why controlled experiments (using the p-value) should not be published. In each case my knee-jerk reaction was that the authors didn't like math, or didn't understand it, and just wanted to 'suppose' answers. But each article attacked the math abuse, and they were written by proficient academics at universities who did this sort of research. I came around too.

The math is established for random environments, but the scientists control every bit of the environment -- not to get better results, but to detect things so tiny that they really don't matter. The math lets them misuse the word 'significant' as though there were a strong connection between cause and effect. Yet every environmental restriction (same living arrangements, same diets, same genetic strain of rats, etc.) invalidates the result. It's the difference between internal validity (finding the effect within the experiment) and external validity (having it apply in real life). You can also find effects that are weaker (by the square root of n) simply by using larger groups.

A study can be set up so it is likely to find 'something' tiny and collect the research prestige, while another study with different controls turns out the opposite result. And none of it applies to real life the way reading the results of an entire population living normal lives would. You have to study and think for quite a while, as I did (even walking the streets around Berkeley to find books on the subject going back 40 years), to see that a "99 percent significance level" means not a strong effect but more likely one so tiny -- maybe a part in a million -- that you'd never see it in real life.
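For what it's worth, here's a minimal sketch (in Python) of that keystroke-saving shortcut as I described it above. The function name and the sample numbers are just illustrative, it assumes every group has the same sample size n, and it only produces the test statistic -- you'd still have to compare that against the appropriate distribution to get an actual p-value.

    import math

    def shortcut_statistic(groups):
        """Spread of the group means divided by the average within-group
        spread, scaled by sqrt(n). Assumes equal-sized groups."""
        n = len(groups[0])
        means = [sum(g) / n for g in groups]
        # sample standard deviation of each group
        sds = [math.sqrt(sum((x - m) ** 2 for x in g) / (n - 1))
               for g, m in zip(groups, means)]
        grand_mean = sum(means) / len(means)
        sd_of_means = math.sqrt(
            sum((m - grand_mean) ** 2 for m in means) / (len(means) - 1))
        mean_of_sds = sum(sds) / len(sds)
        return (sd_of_means / mean_of_sds) * math.sqrt(n)

    # Example: two made-up groups of n = 5 measurements each
    print(shortcut_statistic([[5.1, 4.9, 5.0, 5.2, 4.8],
                              [5.6, 5.4, 5.5, 5.7, 5.3]]))

The point stands either way: a bigger n inflates the statistic by sqrt(n), which is exactly how a vanishingly small effect gets promoted to 'significant.'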

Comment What's up: Sciuridae! (Score 4, Insightful) 222

They aren't doing this to improve the user experience with the software. They're doing it to address the perception that "new and shiny" is what people want -- not functionality per se. They're aiming at the user experience of getting something new.

You know that marketing slogan, "sell by showing what problem you solve"? The "problem" that marketers have identified is the public's disinterest in things not new and not shiny -- and lately, not thin.

In my view, incompatibility is a sign of poor vision, poor support, and a lack of respect for those people who have come to you for what you offer. Speaking as a developer, if I come up with new functionality that is incompatible with the old, I add the new functionality without breaking the old. There are almost always many ways that can be done. I never did find a worthy excuse not to do it, either.

It isn't Google, or Apple, or whatever vendor that needs to learn a lesson. It's the public. I don't think it can be taught to them, either.

Squirrel!

Comment Perhaps I was too hasty there (Score 1) 263

Yes, I'm a developer as well. Let me re-phrase that, as I was going off an assumption that for all I know is no longer true, now that I look directly at it:

I have no use for graphics solutions that consume memory bandwidth that would otherwise be available to the CPU core(s).

Having said that: as far as I'm aware, memory bandwidth remains nowhere near the level required to be "always there when the CPU needs it," and integrated solutions always share memory with the CPU, particularly when data is being passed between CPU and GPU... so it still strikes me that integrated probably -- not certainly -- remains a reliable proxy for "makes things slower."

It's also a given that the more monitors the thing is driving, the more memory bandwidth it will need. If that memory is on the same bus as the rest of the memory in the machine, then adding monitors again reduces the memory bandwidth available to the CPU -- and remember that the monitor has first priority; system designs can't have the monitor going blank because the CPU wants memory. Doing both -- running graphics-intensive tasks on multiple monitors -- is quite demanding. Hence my preference for non-integrated graphics. When the graphics subsystem has its own memory, CPU performance has, at least in my experience, been considerably higher in general.

I have six monitors on one desktop setup, and two on the other. My lady has two as well. There are times for me when at least two monitors are continuously busy for long periods (hours) at the same time that there is a heavy CPU load (at least one core constantly at 100%, with others hitting hard at times as well).

Now that solid state drives are around, my machine spends a lot more time computing and a lot less waiting on disk I/O, too.

Anyone who definitively knows modern integrated chipset performance, by all means, stick an oar in.

Comment Don't care (Score 2, Interesting) 263

If the new model has a larger screen, 5K would definitely be insufficient.

I'm a photographer and a constant user/developer of image manipulation software. I edit every shot. I don't need 5K in a monitor; if I need a full-image overview, I can have that in zero perceptible time. If I need to look at pixels, same thing. Or anywhere in between. I do *not* need to be squinting at a monitor in order to resolve detail. I value my vision too highly. And at these resolutions, if you don't squint, you can't see it. And I have extremely high visual acuity.

Higher (and higher) resolution makes sense in data acquisition. Once you have it, you can do damned near anything with it. Even if you exceed the MTF of the lens, you get the advantage that while the edges are smoother, they now start in a more accurate place, geometrically speaking. It can be thought of as being like old TV luma: the bandwidth is limited, so the rate of change has a proportionally limited slew rate, but the phosphor on an old B&W monitor is continuous, and you can start a waveform anywhere (horizontally) with luma, to any accuracy within the timing of the display, which can be pretty darned high. So things tend to look very, very good as opposed to what you might expect from naively considering nothing but the bandwidth. It's not like a modern color display, where the phosphor/pixel groups serve to sub-sample the signal no matter how you feed it in. But that advantage goes away when the subtleties exceed your eye's ability to perceive them. Or you have to strain/hurt yourself to do it.

So anyway... any single one or combination of these three things would motivate me to buy more new Apple hardware. Nothing else:

o A Mac Pro that is self-contained -- installable, replaceable drives, lots of memory, replaceable display cards. The "trashcan" Mac Pro is an obscenity. All it did was send me to eBay to buy used prior-model Mac Pros. The trashcan isn't so much a wrong turn as it is a faceplant.

o A Mac mid-tower that can have hard drives installed and replaced, with at least 16 GB of RAM; 32 GB would be better. Doesn't have to be that fast. Real gfx. I know: mythical, not probable. Still want it, though. Actually, I want several. :/

o A multicore Mac mini with a real graphics card, 8 GB or more of RAM, and network, USB, HDMI, and audio ports.

I have uses for all of those. Failing that -- and more fail is in fact my expectation -- I'm done with them. And I have no use whatever for "integrated" graphics.

What's annoying is that just about when they finally managed to get a stable OS with most of the features I like and want (and the ability to get around the stupid features like "App Nap"), they totally borked the hardware side. I just can't win with Apple. Sigh.

Comment NTMP (Never Too Many Pixels) (Score 1) 263

sheeeeeit. These are NOTHING compared to the 16k displays that'll be out in the spring. I hear that's when they're going to add the mandatory "oil cooling hotness" to the Mac Pro, too. Of course, if you wait till fall, those 32k displays are on the way!

[Looks sadly at N(ever)T(wice)S(ame)C(color) security monitor...]

As Cheech and Chong might have put it, "Even gets AM!" Well, ok, old school TV that isn't broadcast any longer. But you know what I meant.

Or not. I'm old.

GET OFF MY NURSING HOME'S LAWN!
