Comment: What's up: Sciuridae! (Score 4, Insightful) 206

by fyngyrz (#49486129) Attached to: Google Sunsetting Old Version of Google Maps

They aren't doing this to improve the user experience with the software. They're doing it to address the perception that "new and shiny" is what people want -- not functionality per se. They're aiming at the user experience of getting something new.

You know that marketing slogan, "sell by showing what problem you solve"? The "problem" that marketers have identified is the public's disinterest in things not new and not shiny -- and lately, not thin.

In my view, incompatibility is a sign of poor vision, poor support, and a lack of respect for those people who have come to you for what you offer. Speaking as a developer, if I come up with new functionality that is incompatible with the old, I add the new functionality without breaking the old. There are almost always many ways that can be done. I never did find a worthy excuse not to do it, either.

It isn't Google, or Apple, or whatever vendor that needs to learn a lesson. It's the public. I don't think it can be taught to them, either.

Squirrel!

Comment: Simulated annealing (Score 1) 140

by dhasenan (#49457579) Attached to: Finding an Optimal Keyboard Layout For Swype

In the T9 section we employed a random walk optimization. For the swipe optimization we use a similar approach but gradually reduce the number of random swaps over time so that the keyboard settles into a local minimum.

A random walk whose hops are shortened over time is called "simulated annealing". It's an alternative to genetic algorithms, and it tends to be easier to apply to problems whose solutions can't be chopped up and recombined coherently -- for instance, keyboard layouts, which require each key to appear exactly once.
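For the curious, here's a minimal sketch of the idea in Python. The cost function and the five-word corpus are toy assumptions, not anything from the article; a real swipe optimizer would score a layout against swipe-path ambiguity over a whole dictionary. This version uses the classic Metropolis acceptance rule with a falling temperature, which plays the same role as gradually reducing the number of random swaps:

import math
import random

ROWS, COLS = 3, 10                                    # a 3x10 key grid
SAMPLE_WORDS = ["the", "and", "you", "that", "this"]  # toy corpus

def key_pos(layout, ch):
    i = layout.index(ch)
    return divmod(i, COLS)            # (row, col)

def cost(layout):
    # Toy objective: total finger travel over the sample words.
    total = 0.0
    for word in SAMPLE_WORDS:
        for a, b in zip(word, word[1:]):
            (r1, c1), (r2, c2) = key_pos(layout, a), key_pos(layout, b)
            total += math.hypot(r1 - r2, c1 - c2)
    return total

def anneal(layout, t0=2.0, cooling=0.9995, steps=20000):
    cur, cur_cost, t = layout[:], cost(layout), t0
    for _ in range(steps):
        i, j = random.sample(range(len(cur)), 2)
        cur[i], cur[j] = cur[j], cur[i]            # propose a key swap
        new_cost = cost(cur)
        delta = new_cost - cur_cost
        if delta <= 0 or random.random() < math.exp(-delta / t):
            cur_cost = new_cost                    # accept the swap
        else:
            cur[i], cur[j] = cur[j], cur[i]        # revert it
        t *= cooling                               # cool: fewer uphill moves
    return cur, cur_cost

keys = list("abcdefghijklmnopqrstuvwxyz") + ["."] * 4  # pad to 30 slots
assert len(keys) == ROWS * COLS
best, score = anneal(keys)
print("".join(best), score)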

Comment: Misleading summary (Score 1) 183

by dhasenan (#49435035) Attached to: The Key To Interviewing At Google

Some interviewers might use "how would you move Mt Fuji" type questions, but, the Wired excerpt explains, these questions and their answers are removed from consideration when determining whether to extend an offer, and the official (and unofficial) policy is not to ask that sort of question.

Nice try, though. The error probably comes from summarizing a summary of an excerpt rather than going to the original source, or at least the full excerpt.

Comment: Re:I wonder (Score 1) 113

by hawk (#49427119) Attached to: Turning the Arduino Uno Into an Apple ][

Earlier than that.

The Mac IIfx had a pair of chips each of which effectively had such a creature. One ran the serial/network ports, and I forget the other.

Had Apple sold that chip, combined with the network that ran on the second (unused) pair of standard home wiring, they could have *owned* home automation years ahead . . .

hawk

Comment: Re:Interlacing? WTF? (Score 1) 113

by hawk (#49427085) Attached to: Turning the Arduino Uno Into an Apple ][

For hi-res, rather than reading the same 40 bytes eight times in a row and feeding them to a character generator, eight different sets of 40 bytes were read (and in each byte, seven bits set pixels while the eighth danced around the colorburst signal: the pixel rate was right at the colorburst signal, so shifting half a bit tickled it and gave a different set of colors. Not just clever, but fiendishly clever).
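For anyone who wants to poke at this, both the scanline interleave and the bit-7 half-dot shift are easy to sketch. The rough Python below uses the commonly documented hi-res layout -- seven pixel bits plus one phase bit per byte -- and a simplified color mapping; the exact phase-to-color pairing varies by monitor, and adjacent lit dots actually merge to white, which this skips:

def hires_addr(y, base=0x2000):
    # Base address of hi-res scanline y (0-191), page 1: the famous
    # three-way interleave that yields eight different 40-byte rows
    # per text-row-sized band.
    return (base
            + 0x400 * (y % 8)          # which of the 8 scanlines in a group
            + 0x80 * ((y // 8) % 8)    # which group within a 64-line band
            + 0x28 * (y // 64))        # which of the three bands

def byte_colors(b, col):
    # Approximate artifact colors for the hi-res byte at byte-column
    # col (0-39). Bit 7 delays the seven dots half a dot clock,
    # swapping the purple/green pair for blue/orange.
    palette = ("blue", "orange") if b & 0x80 else ("purple", "green")
    dots = []
    for bit in range(7):                    # bits 0-6, LSB is leftmost
        if b & (1 << bit):
            phase = (col * 7 + bit) & 1     # parity of absolute dot position
            dots.append(palette[phase])
        else:
            dots.append("black")
    return dots

print(hex(hires_addr(1)))   # 0x2400: one scanline down jumps 0x400 bytes
print(byte_colors(0x2A, 0)) # alternating dots -> a solid color stripe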

hawk

Comment: Perhaps I was too hasty there (Score 1) 263

by fyngyrz (#49422911) Attached to: LG Accidentally Leaks Apple iMac 8K Is Coming Later This Year

Yes, I'm a developer as well. Let me re-phrase that, as I was going off an assumption that for all I know is no longer true, now that I look directly at it:

I have no use for graphics solutions that consume memory bandwidth that would otherwise be available to the CPU core(s).

Having said that: as far as I was aware, memory bandwidth remains nowhere near "always there when the CPU needs it," and integrated solutions always share memory with the CPU, particularly when data is being passed between CPU and GPU... so it strikes me that integrated probably -- not certainly -- remains a reliable proxy for "makes things slower."

It's also a given that the more monitors the thing is driving, the more memory bandwidth it will need. If that memory is on the same bus as the rest of the memory in the machine, adding monitors again reduces the memory bandwidth available to the CPU, and remember that the monitor has first priority -- system designs can't have the monitor going blank because the CPU wants memory. Doing both -- running graphics-intensive tasks on multiple monitors -- is quite demanding. Hence my preference for non-integrated graphics. When the graphics subsystem has its own memory, CPU performance has, at least in my experience, been considerably higher in general.
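Some rough numbers to put scale on that. Back-of-envelope only: scanout is width x height x refresh x bytes per pixel, ignoring blanking overhead and any framebuffer compression, so treat these as rough lower bounds:

def scanout_gb_s(w, h, hz, bytes_per_px=4):
    # Raw bandwidth just to refresh the screen, before the GPU does
    # any actual drawing.
    return w * h * hz * bytes_per_px / 1e9

for name, (w, h, hz) in {"1080p60": (1920, 1080, 60),
                         "4K60":    (3840, 2160, 60),
                         "8K60":    (7680, 4320, 60)}.items():
    print(f"{name}: {scanout_gb_s(w, h, hz):.1f} GB/s")
# 1080p60: 0.5 GB/s -- six of them is ~3 GB/s off the shared bus
# 4K60:    2.0 GB/s
# 8K60:    8.0 GB/s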

I have six monitors on one desktop setup and two on the other. My lady has two as well. There are times when at least two of my monitors are very busy continuously for hours, at the same time that there is a heavy CPU load (with at least one core constantly at 100% and others hitting hard at times as well).

Now that solid state drives are around, my machine spends a lot more time computing and a lot less waiting on disk I/O, too.

Anyone who definitively knows modern integrated chipset performance, by all means, stick an oar in.

Comment: Don't care (Score 2, Interesting) 263

by fyngyrz (#49419647) Attached to: LG Accidentally Leaks Apple iMac 8K Is Coming Later This Year

If the new model has a larger screen, 5K would definitely be insufficient.

I'm a photographer and a constant user/developer of image manipulation software. I edit every shot. I don't need 5K in a monitor; if I need a full-image overview, I can have that in zero perceptible time. If I need to look at pixels, same thing. Or anywhere in between. I do *not* need to be squinting at a monitor in order to resolve detail. I value my vision too highly. And at these resolutions, if you don't squint, you can't see it. And I have extremely high visual acuity.
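The squinting point checks out arithmetically: 20/20 vision resolves roughly one arcminute, i.e. about 60 pixels per degree. A quick sketch -- the panel size and viewing distance here are illustrative assumptions, not anyone's measured setup:

import math

def ppi(w_px, h_px, diag_inches):
    return math.hypot(w_px, h_px) / diag_inches

def px_per_degree(ppi_value, viewing_inches):
    # Pixels subtended by one degree of visual angle at this distance.
    return ppi_value * viewing_inches * math.tan(math.radians(1))

p = ppi(5120, 2880, 27)                      # 27" 5K panel: ~218 ppi
print(f"{px_per_degree(p, 24):.0f} px/deg")  # ~91 at 24" -- well past ~60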

Higher (and higher) resolution makes sense in data acquisition. Once you have it, you can do damned near anything with it. Even if you exceed the MTF of the lens, you get the advantage that while the edges are smoother, they now start in a more accurate place, geometrically speaking. Think of it as being like old TV luma: the bandwidth is limited, so the signal has a proportionally limited slew rate, but the phosphor on an old B&W monitor is continuous, and you can start a waveform anywhere (horizontally) with luma, to any accuracy within the timing of the display, which can be pretty darned high. So things tend to look very, very good as opposed to what you might expect from naively considering nothing but the bandwidth. It's not like a modern color display, where the phosphor/pixel groups serve to sub-sample the signal no matter how you feed it in. But that advantage goes away when the subtleties exceed your eye's ability to perceive them. Or when you have to strain/hurt yourself to do it.

So anyway... any single one or combination of these three things would motivate me to buy more new Apple hardware. Nothing else:

o A Mac Pro that is self-contained -- installable, replaceable drives, lots of memory, replaceable display cards. The "trashcan" Mac Pro is an obscenity. All it did was send me to eBay to buy used prior-model Mac Pros. The trashcan isn't so much a wrong turn as it is a faceplant.

o A Mac mid-tower that can have hard drives installed and replaced, and at least 16 GB of RAM. 32 GB would be better. Doesn't have to be that fast. Real gfx. I know: mythical, not probable. Still want it, though. Actually, I want several. :/

o A multicore Mac mini with a real graphics card, 8 GB or more of RAM, and network, USB, HDMI and audio ports.

I have uses for all of those. Failing that -- and more fail is in fact my expectation -- I'm done with them. And I have no use whatever for "integrated" graphics.

What's annoying is that just about when they finally managed to get a stable OS with most of the features I like and want (and the ability to get around the stupid features like "App Nap"), they totally borked the hardware side. I just can't win with Apple. Sigh.

Comment: NTMP (Never Too Many Pixels) (Score 1) 263

by fyngyrz (#49419379) Attached to: LG Accidentally Leaks Apple iMac 8K Is Coming Later This Year

sheeeeeit. These are NOTHING compared to the 16k displays that'll be out in the spring. I hear that's when they're going to add the mandatory "oil cooling hotness" to the Mac Pro, too. Of course, if you wait till fall, those 32k displays are on the way!

[Looks sadly at N(ever)T(wice)S(ame)C(color) security monitor...]

As Cheech and Chong might have put it, "Even gets AM!" Well, ok, old school TV that isn't broadcast any longer. But you know what I meant.

Or not. I'm old.

GET OFF MY NURSING HOME'S LAWN!

Comment: Re:Easter eggs (Score 2) 290

by fyngyrz (#49412225) Attached to: Is This the Death of the Easter Egg?

My software is SdrDx. Very much a "radio person's" design.

As for the eggs, that's not all of 'em -- those are just the easiest to find. And they've been in there for a couple of years or so, so I figure it's not much of a secret. Also, there's not much overlap between Slashdot and my users. If any. Lastly, I don't think of them as exclusive so much as something fun to find.

You might be the first, if it turns out to be something you can use. :)

Comment: Re:Easter eggs (Score 1) 290

by fyngyrz (#49411115) Attached to: Is This the Death of the Easter Egg?

The top meter is one of the documented ones, for reference -- so you can see the kind of "normal" thing the eggs replace. With this meter model, the left meter is the S-meter, and the bar-graph sub-display in it is the noise detection level, which correlates with the noise reduction intensity. The squares are unfilled dark red because the noise reduction function was off when I took the screen capture. With noise reduction on, the segments are filled with a light blue color; they display the detected noise level actively regardless of the setting. The smaller, yellow meter on the right is an audio VU meter that tracks the modulation percentage for FM, SAM, FSK and AM (bottom scale), and the dB output level for other modes -- USB, LSB, CWU, CWL.

For the Klingon meter, the triangle at the left is the S-meter value; as the signal increases, it fills with nested triangles, segment by segment. The slanted bar is the AGC level, which is independent of the S level in my radio design. The double row of symbols that comes next is the frequency to kHz resolution on top, and the remaining frequency component in Hz on the bottom. The last set of larger symbols is the signal level in dBm.

For the Predator meter, the first two symbols are the S-meter value. The next five are the signal level in microvolts. The last four are the current time, HHmm. On the bottom row, all the symbols are frequency.
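Since the meters show the same signal three ways (S-units, dBm, microvolts), the usual HF conventions tie those together: S9 = -73 dBm into 50 ohms, about 50 uV, with 6 dB per S-unit. A quick sketch of that relationship -- this is just the common convention, not SdrDx's actual calibration code:

import math

def dbm_to_s_units(dbm):
    # S9 is -73 dBm at HF; each S-unit is 6 dB.
    return 9 + (dbm + 73) / 6

def dbm_to_microvolts(dbm, r_ohms=50):
    watts = 10 ** (dbm / 10) / 1000
    return math.sqrt(watts * r_ohms) * 1e6

for dbm in (-121, -97, -73, -53):
    print(f"{dbm} dBm = S{dbm_to_s_units(dbm):.1f}, "
          f"{dbm_to_microvolts(dbm):.2f} uV")
# -121 dBm = S1.0, 0.20 uV ... -73 dBm = S9.0, 50.00 uV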
