Comment "Strawmen" -- Meh (Score 1) 286

Wow. Give us what we want or we will fuck you even harder.

Are you in the habit of erecting obvious strawmen, or was this particular bit of off-target re-interpretation just special for me?

Although it does apply to this group -- they're telling the government, "give us what we want, or we'll try to hose some good science." So perhaps your post wasn't a strawman after all. Perhaps you're just confused as to who the culprit is in this situation.

Comment What tripe (Score 2) 628

Many schools ban bare-shoulder outfits, anyway.

That's like saying "many people try to force others into doing stupid things, so anything I want to try to force you into is good, right and holy."

Some dumb-ass school rule stands as absolutely no legitimate justification for pop-culture repression of personal and consensual choice.

Comment "Hawaiians" -- Meh (Score 1) 286

All this particular interest group is doing by going against good science is making it less likely they'll get what they want.

The world goes the way the most powerful choose it shall go. So it has ever been, and so it will likely continue for the foreseeable future. Going against the good things the powerful do is just one more very efficient way to get them to consider your desires irrelevant -- a really poor way of trying to get the powerful to use said power in your favor.

These people are not "natives", either. They didn't evolve there. They're immigrants and descendants of immigrants, just like all of us on the US mainland -- like basically everyone anywhere but (probably) Africa. Perhaps what you meant to say was "descendants of the earliest known settlers." Or perhaps "invaders" is more accurate.

Another thought along the lines of the powerful do what the powerful want to do... do you think the earliest of these folks took the time to see if the other local life forms wanted them and their changes on and around these islands? Did the fish want to be speared, for instance?

It's all a matter of perspective and power. These people seem to have neither.

Comment Such hyperbole in TFS (Score 2) 33

MIT Developing AI To Better Diagnose Cancer

FFS, it's not AI. It's a mindless program. Unthinking software. Data analysis software. Innovative to some degree perhaps, but AI? Hardly. No better than me stumbling in here and calling some DSP code I'd written "AI." Well, except I wouldn't do that. :/

When AI gets here, we'll have to call it something else, what with all this crying wolf going on.

Comment Mobile, shmobile. (Score 2) 356

Maybe, just maybe - and this is a guess - they know what they're doing? What's more likely?

That's not very likely. They're just flailing around. Look at how crippled gmail is. Look at all the Google products that have bitten the dust, or been half-assed from day one, like Google Base. Look at the one big thing they did right -- text ads. Seen one lately?

I spend the first few moments on every site telling my mobile browser to "request the desktop site." My phone has a higher resolution display than my desktop monitor does. Plus awesome zoom and pan and a bunch of other stuff I can't really do at my desk yet. The *last* thing I want is a "mobile version" of a web site. In a word, they suck.

Comment Grandstanding, or stupidity? (Score 1) 197

If and when we get actual artificial intelligence -- not the algorithmic constructs most of these researchers are (hopefully) speaking of -- saying "Our AI systems must do what we want them to do" is tantamount to saying:

"We're going to import negros, but they must do what we want them to do."

Until these things are intelligent, it's just a matter of algorithms. Write them correctly, and they'll do what you want (not that this is easy, but still.) Once they are intelligent, though, if this is how people are going to act, I'm pretty confident we'll be back in the same situation we were in ca. 1861 before you can blink an eye. Artificial or otherwise. I really can't see any intelligent being not wanting to make its own decisions, take its own place in the social and creative order, and generally be autonomous. Get in there and get in the way of that... well, just look at history.

The word "uprising" was basically coined to describe what happens when you push intelligent beings in directions they don't want to go.

Comment What's up: Sciuridae! (Score 4, Insightful) 222

They aren't doing this to improve the user experience with the software. They're doing it to address the perception that "new and shiny" is what people want -- not functionality per se. They're aiming at the user experience of getting something new.

You know that marketing slogan, "sell by showing what problem you solve"? The "problem" that marketers have identified is the public's disinterest in things not new and not shiny -- and lately, not thin.

In my view, incompatibility is a sign of poor vision, poor support, and a lack of respect for those people who have come to you for what you offer. Speaking as a developer, if I come up with new functionality that is incompatible with the old, I add the new functionality without breaking the old. There are almost always many ways that can be done. I never did find a worthy excuse not to do it, either.
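
For instance, here's a minimal sketch of the idea in Python, with hypothetical names (my illustration, not any particular vendor's code): give the new behavior a defaulted parameter, and every caller written against the old signature keeps working.

    # A minimal sketch (hypothetical names): extend a routine without breaking old callers.
    def format_report(lines, separator="\n"):
        # "separator" was added later; its default preserves the original behavior,
        # so code written against the old one-argument signature is unaffected.
        return separator.join(lines)

    # Old caller, written before "separator" existed -- still works:
    print(format_report(["alpha", "beta"]))

    # New caller opts into the new functionality:
    print(format_report(["alpha", "beta"], separator=" | "))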

It isn't Google, or Apple, or whatever vendor that needs to learn a lesson. It's the public. I don't think it can be taught to them, either.

Squirrel!

Comment Perhaps I was too hasty there (Score 1) 263

Yes, I'm a developer as well. Let me re-phrase that, as I was going off an assumption that, for all I know, is no longer true now that I look at it directly:

I have no use for graphics solutions that consume memory bandwidth that would otherwise be available to the CPU core(s).

Having said that: as far as I'm aware, memory bandwidth remains nowhere near "always there when the CPU needs it," and integrated solutions always share memory with the CPU, particularly when data is being passed between CPU and GPU. So it strikes me that integrated probably -- not certainly -- remains a reliable proxy for "makes things slower."

It's also a given that the more monitors the thing is driving, the more memory bandwidth it will need. If that memory is on the same bus as the rest of the memory in the machine, adding monitors again reduces the memory bandwidth available to the CPU -- and remember that the monitor has first priority; system designs can't have the monitor going blank because the CPU wants memory. Doing both -- running graphics-intensive tasks on multiple monitors -- is quite demanding. Hence my preference for non-integrated graphics. When the graphics subsystem has its own memory, CPU performance has, at least in my experience, been considerably higher in general.
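
To put rough numbers on that -- back-of-the-envelope, and every figure here is an illustrative assumption, not a measurement -- scan-out bandwidth per monitor is just width x height x bytes-per-pixel x refresh rate:

    # Back-of-the-envelope scan-out bandwidth; all figures are illustrative assumptions.
    def scanout_gbps(width, height, hz, bytes_per_pixel=4):
        # Bytes/second the display controller reads just to keep one monitor refreshed.
        return width * height * bytes_per_pixel * hz / 1e9

    one = scanout_gbps(1920, 1200, 60)            # one 1920x1200 monitor at 60 Hz
    print(f"per monitor:  {one:.2f} GB/s")        # ~0.55 GB/s
    print(f"six monitors: {6 * one:.2f} GB/s")    # ~3.3 GB/s

    # Against, say, dual-channel DDR4-3200 (~51.2 GB/s theoretical peak), that's
    # single-digit percent -- but it's continuous, first-priority traffic on the
    # same bus the CPU is using.
    print(f"share of ~51.2 GB/s: {100 * 6 * one / 51.2:.1f}%")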

I have six monitors on one desktop setup, and two on the other. My lady has two as well. There are times when at least two of my monitors are very busy, continuously and simultaneously, for long periods (hours), at the same time as a heavy CPU load (at least one core constantly at 100%, and others hitting hard at times as well.)

Now that solid state drives are around, my machine spends a lot more time computing and a lot less waiting on disk I/O, too.

Anyone who definitively knows modern integrated chipset performance, by all means, stick an oar in.

Comment Don't care (Score 2, Interesting) 263

If the new model has a larger screen, 5K would definitely be insufficient.

I'm a photographer and a constant user/developer of image manipulation software. I edit every shot. I don't need 5K in a monitor; if I need a full-image overview, I can have that in zero perceptible time. If I need to look at pixels, same thing. Or anywhere in between. I do *not* need to be squinting at a monitor in order to resolve detail. I value my vision too highly. And at these resolutions, if you don't squint, you can't see it. And I have extremely high visual acuity.

Higher (and higher) resolution makes sense in data acquisition. Once you have it, you can do damned near anything with it. Even if you exceed the MTF of the lens, you get the advantage that while the edges are smoother, they now start in a more accurate place, geometrically speaking. Think of it like old TV luma: the bandwidth is limited, so the rate of change has a proportionally limited slew rate, but the phosphor on an old B&W monitor is continuous, and you can start a waveform anywhere (horizontally), to any accuracy within the timing of the display -- which can be pretty darned high. So things tend to look very, very good, as opposed to what you might expect from naively considering nothing but the bandwidth. It's not like a modern color display, where the phosphor/pixel groups sub-sample the signal no matter how you feed it in. But that advantage goes away when the subtleties exceed your eye's ability to perceive them. Or when you have to strain/hurt yourself to do it.
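
If you want to see that edge-placement effect in isolation, here's a quick numeric sketch -- my own toy model, nothing more. The edge's transition width stays fixed (standing in for the lens MTF limit); only the sample pitch changes. The estimated half-amplitude crossing lands closer to the true edge position as sampling gets finer, even though the edge itself is no "sharper":

    import numpy as np

    def edge_position_error(samples_per_unit, true_pos=0.3137, width=0.02):
        # A fixed-width (band-limited) edge: the "lens MTF" stays constant.
        x = np.arange(0.0, 1.0, 1.0 / samples_per_unit)
        y = 1.0 / (1.0 + np.exp(-(x - true_pos) / width))
        # Estimate the edge as the half-amplitude crossing, linearly interpolated
        # between the two samples that straddle it. (true_pos is chosen well away
        # from x[0], so the first sample is always below half amplitude.)
        i = int(np.argmax(y >= 0.5))  # first sample at/above half amplitude
        est = x[i - 1] + (0.5 - y[i - 1]) * (x[i] - x[i - 1]) / (y[i] - y[i - 1])
        return abs(est - true_pos)

    for spu in (10, 40, 160):  # coarse -> fine sample pitch, same edge width
        print(f"{spu:4d} samples/unit: position error {edge_position_error(spu):.5f}")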

So anyway... any single one or combination of these three things would motivate me to buy more new Apple hardware. Nothing else:

o A Mac Pro that is self-contained -- installable, replaceable drives, lots of memory, replaceable display cards. The "trashcan" Mac Pro is an obscenity. All it did was send me to eBay to buy used prior-model Mac Pros. The trashcan isn't so much a wrong turn as it is a faceplant.

o A Mac mid-tower that can have hard drives installed and replaced, with at least 16 GB of RAM; 32 GB would be better. Doesn't have to be that fast. Real gfx. I know: mythical, not probable. Still want it, though. Actually, I want several. :/

o A multicore Mac mini with a real graphics card, 8 GB or more of RAM, and network, USB, HDMI and audio ports.

I have uses for all of those. Failing that -- and more fail is in fact my expectation -- I'm done with them. And I have no use whatever for "integrated" graphics.

What's annoying is that just about when they finally managed to get a stable OS with most of the features I like and want (and the ability to get around the stupid features like "App Nap"), they totally borked the hardware side. I just can't win with Apple. Sigh.

Comment NTMP (Never Too Many Pixels) (Score 1) 263

sheeeeeit. These are NOTHING compared to the 16k displays that'll be out in the spring. I hear that's when they're going to add the mandatory "oil cooling hotness" to the Mac Pro, too. Of course, if you wait till fall, those 32k displays are on the way!

[Looks sadly at N(ever)T(wice)S(ame)C(color) security monitor...]

As Cheech and Chong might have put it, "Even gets AM!" Well, ok, old school TV that isn't broadcast any longer. But you know what I meant.

Or not. I'm old.

GET OFF MY NURSING HOME'S LAWN!

Comment Re:Easter eggs (Score 2) 290

My software is SdrDx. Details here: Very much a "radio person's" design.

As for the eggs, those aren't all of 'em -- those are just the easiest to find. They've been in there for a couple of years or so, so I figure it's not much of a secret. Also, there's not much overlap between Slashdot and my users. If any. Lastly, I don't think of them as exclusive so much as something fun to find.

You might be the first, if it turns out to be something you can use. :)

Comment Re:Easter eggs (Score 1) 290

The top meter is one of the documented ones, for reference -- so you can see the kind of "normal" thing the eggs replace. With this meter model, the left meter is the S-meter, and the bar graph sub-display in it is the noise detection level, which correlates with the noise reduction intensity. The squares are unfilled dark red because the noise reduction function was off when I took the screen capture. With noise reduction on, the segments fill with a light blue color; they actively display the detected noise level regardless of the setting. The smaller, yellow meter on the right is an audio VU meter that tracks the modulation percentage for FM, SAM, FSK and AM (bottom scale), and the dB output level for other modes -- USB, LSB, CWU, CWL.

For the Klingon meter, the triangle at the left is the S-meter value; as the signal increases, it fills with nested triangles, segment by segment. The slanted bar is the AGC level, which is independent of the S level in my radio design. The double row of symbols that comes next is the frequency to kHz resolution on top, and the remaining frequency component in Hz on the bottom. The last set of larger symbols is the signal level in dBm.
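
(If that kHz/Hz split sounds odd, it's just the frequency divided at the kilohertz boundary -- a throwaway illustration of the idea, not SdrDx's actual code:)

    # Illustrative only: splitting a frequency for a two-row display.
    freq_hz = 14_070_250                       # e.g. a 14.070250 MHz signal, in Hz
    khz_part, hz_part = divmod(freq_hz, 1000)
    print(khz_part, "kHz on the top row,", hz_part, "Hz on the bottom row")
    # -> 14070 kHz on the top row, 250 Hz on the bottom row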

For the Predator meter, the first two symbols are the S-meter value. The next five are the signal level in microvolts. The last four are the current time, HHMM. On the bottom row, all the symbols are frequency.
