No, seriously: the radio is not connected to the computer system, the computer system is extremely conservative by many standards, and it is not connected to the in-flight wifi. You cannot have an air-gap attack without a microphone or similar device.
The GAO report is complete nonsense and was laughed off by all the technical people involved with airplane computer systems or in-flight entertainment.
"So, Mr cyber Expert and Pilot, other than saying "nuh uh", do you have anything to suggest there is no chance of this?"
Aside from complete network separation and the absence of a microphone? Really, guys, sometimes there is absolutely NOTHING to a threat report.
First they came for the power users...
They aren't doing this to improve the user experience with the software. They're doing it to address the perception that "new and shiny" is what people want -- not functionality per se. They're aiming at the user experience of getting something new.
You know that marketing slogan, "sell by showing what problem you solve"? The "problem" the marketers have identified is the public's lack of interest in things that aren't new and shiny -- and lately, aren't thin.
In my view, incompatibility is a sign of poor vision, poor support, and a lack of respect for those people who have come to you for what you offer. Speaking as a developer, if I come up with new functionality that is incompatible with the old, I add the new functionality without breaking the old. There are almost always many ways that can be done. I never did find a worthy excuse not to do it, either.
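To make that concrete, here's a minimal sketch in Python (all the names are invented for illustration): the old entry point keeps its exact behavior, and the new functionality hangs off a new one.

```python
# Hypothetical example: load_image() originally returned raw bytes.
# The new functionality (a structured result with colorspace tagging)
# goes in a new entry point; existing callers never notice.

def _read_bytes(path):
    with open(path, "rb") as f:
        return f.read()

def load_image(path):
    """Original entry point -- behavior unchanged for old callers."""
    return _read_bytes(path)

def load_image_ex(path, colorspace="sRGB"):
    """New entry point carrying the incompatible new functionality."""
    return {"pixels": _read_bytes(path), "colorspace": colorspace}
```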
It isn't Google, or Apple, or whatever vendor that needs to learn a lesson. It's the public. I don't think it can be taught to them, either.
I write my own.
"get a shop to bake you a gay mariage cake" => "you may not impose those poor sales peopel to sell stuff ! Religious rights ! Right to discriminate ! Eleventy !"
This is in a nutshell the big hyprocrisy. they are for the rights and freedoms that conservative likes, but those right and freedom they do not likes, suddenly are to be made illegal.
The funny things is that there is more religious interdiction that conservative breaks, which are cited far more than homosexuality, but conservative ignore them, because they would look stupid upholding them (think not eating shellfish or mixing cloth of different fiber, working on saturday/sunday or paying 30 shekels in silver and forcing a woman to marry her rapist). On the other hand homosexuality which is barely mentionned suddenly is a religious freedom issue. The sad reality is that it is not a religious aversion, it is a disgust they feel at homosexuality and they try to make up any excuse to impose their own disgust over the population". Well anyway I can't wait for this generation to "pass away". In 20 years all republican of today and tea party guys of today will be seen the same way as those who were for racial separation back in the 50ies / 60ies : they will be seen as incredible biggoted fucktard.
Yes, I'm a developer as well. Let me re-phrase that, as I was going off an assumption that for all I know is no longer true, now that I look directly at it:
I have no use for graphics solutions that consume memory bandwidth that would otherwise be available to the CPU core(s).
Having said that: as far as I'm aware, memory bandwidth remains nowhere near the level required for "always there when the CPU needs it", and integrated solutions always share memory with the CPU, particularly when data is being passed between CPU and GPU... so it strikes me that integrated probably -- not certainly -- remains a reliable proxy for "makes things slower."
It's also a given that the more monitors the thing is driving, the more memory bandwidth it will need. If that memory is on the same bus as the rest of the memory in the machine, again, adding monitors reduces memory bandwidth available to the CPU, and remember that the monitor has first priority -- system designs can't have the monitor going blank because the CPU wants memory. Doing both -- running graphics intensive tasks on multiple monitors... that's quite demanding. Hence, my preference for non-integrated graphics. When the graphics subsystem has its own memory, CPU performance has, at least in my experience, been considerably higher in general.
I have six monitors on one desktop setup, and two on the other. My lady has two as well. There are times when at least two of my monitors are very busy, continuously and simultaneously, for hours at a stretch, while there is also a heavy CPU load (at least one core constantly at 100%, and others variously hitting hard as well).
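For a rough sense of what those monitors cost in bandwidth -- my back-of-envelope numbers, not measurements -- here's the scanout arithmetic for display refresh alone:

```python
# Back-of-envelope: memory traffic for display scanout alone.
# Assumed figures (1440p monitors, 60 Hz, 32-bit pixels); real
# controllers and panels will differ.

def scanout_gb_per_s(width, height, hz, bytes_per_pixel=4):
    return width * height * hz * bytes_per_pixel / 1e9

one = scanout_gb_per_s(2560, 1440, 60)
print(f"one monitor:  {one:.2f} GB/s")      # ~0.88 GB/s
print(f"six monitors: {6 * one:.2f} GB/s")  # ~5.3 GB/s
# Against, say, dual-channel DDR4-2400 (~38 GB/s theoretical peak),
# refresh alone takes a steady ~14% slice of the shared bus -- before
# the GPU draws a single thing, and scanout always wins arbitration.
```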
Now that solid state drives are around, my machine spends a lot more time computing and a lot less waiting on disk I/O, too.
Anyone who definitively knows modern integrated chipset performance, by all means, stick an oar in.
It is simple: if the enemy can cut off your signal, then make the absence of signal the trigger. Dead-man switch.
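A minimal sketch of the idea in Python (the names and the timeout are invented; the point is that the trigger fires unless the signal keeps resetting it):

```python
import threading

# Dead-man switch: the *absence* of the signal is what fires the trigger.
class DeadManSwitch:
    def __init__(self, timeout_s, on_trigger):
        self.timeout_s = timeout_s
        self.on_trigger = on_trigger
        self._timer = None

    def heartbeat(self):
        """Call on every received signal; resets the countdown."""
        if self._timer is not None:
            self._timer.cancel()
        self._timer = threading.Timer(self.timeout_s, self.on_trigger)
        self._timer.daemon = True
        self._timer.start()

# Usage: switch = DeadManSwitch(5.0, sound_alarm)
# Call switch.heartbeat() on each packet. Jam the signal, heartbeats
# stop, the timer runs out, and sound_alarm() fires anyway.
```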
If the new model has a larger screen, 5K would definitely be insufficient.
I'm a photographer and a constant user/developer of image manipulation software. I edit every shot. I don't need 5K in a monitor; if I need a full-image overview, I can have that in zero perceptible time. If I need to look at pixels, same thing. Or anywhere in between. I do *not* need to be squinting at a monitor in order to resolve detail. I value my vision too highly. And at these resolutions, if you don't squint, you can't see it. And I have extremely high visual acuity.
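To put a number on that -- my arithmetic, assuming a 27-inch 5K panel and a typical viewing distance, since neither is specified above:

```python
import math

# Can a normal eye resolve individual pixels on a 27" 5K panel?
# 20/20 acuity resolves roughly 1 arcminute: ~60 pixels per degree.

ppi = math.hypot(5120, 2880) / 27        # ~218 pixels per inch
distance_in = 20                         # assumed ~50 cm viewing distance
inches_per_degree = distance_in * math.tan(math.radians(1))
ppd = ppi * inches_per_degree            # ~76 pixels per degree
print(f"{ppi:.0f} PPI -> {ppd:.0f} px/degree vs ~60 the eye can resolve")
# Above ~60 px/degree the pixel grid sits below the acuity limit: to see
# individual pixels at all, you'd have to lean in. Hence: no squinting.
```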
Higher (and higher) resolution makes sense in data acquisition. Once you have the data, you can do damned near anything with it. Even if you exceed the MTF of the lens, you get the advantage that while the edges are smoother, they now start in a more accurate place, geometrically speaking. Think of it like old TV luma: the bandwidth is limited, so the rate of change has a proportionally limited slew rate, but the phosphor on an old B&W monitor is continuous, and you can start a waveform anywhere (horizontally) with luma, to any accuracy within the timing of the display, which can be pretty darned high. So things tend to look very, very good compared to what you might expect from naively considering nothing but the bandwidth. It's not like a modern color display, where the phosphor/pixel groups sub-sample the signal no matter how you feed it in. But that advantage goes away when the subtleties exceed your eye's ability to perceive them. Or when you have to strain/hurt yourself to do it.
So anyway... any single one or combination of these three things would motivate me to buy more new Apple hardware. Nothing else:
o A Mac Pro that is self-contained -- installable, replaceable drives, lots of memory, replaceable display cards. The "trashcan" Mac Pro is an obscenity. All it did was send me to eBay to buy used prior-model Mac Pros. The trashcan isn't so much a wrong turn as it is a faceplant.
o A Mac mid-tower that can have hard drives installed+replaced and at least 16GB of RAM. 32GB would be better. Doesn't have to be that fast. Real gfx. I know, mythical, not probable. Still want it, though. Actually, I want several.
o A multicore Mac mini with a real graphics card, 8GB or better RAM, and network, USB, HDMI and audio ports.
I have uses for all of those. Failing that -- and in fact, that's my expectation: more fail -- I'm done with them. And I have no use whatever for "integrated" graphics.
What's annoying is that just about when they finally managed to get a stable OS with most of the features I like and want (and the ability to get around the stupid features like "App Nap"), they totally borked the hardware side. I just can't win with Apple. Sigh.
sheeeeeit. These are NOTHING compared to the 16k displays that'll be out in the spring. I hear that's when they're going to add the mandatory "oil cooling hotness" to the Mac Pro, too. Of course, if you wait till fall, those 32k displays are on the way!
[Looks sadly at N(ever)T(wice)S(ame)C(color) security monitor...]
As Cheech and Chong might have put it, "Even gets AM!" Well, ok, old school TV that isn't broadcast any longer. But you know what I meant.
Or not. I'm old.
GET OFF MY NURSING HOME'S LAWN!
My software is SdrDx. Details here: Very much a "radio person's" design.
As for the eggs, those aren't all of them -- just the easiest to find. And they've been in there for a couple of years or so, so I figure it's not much of a secret. Also, there's not much overlap between Slashdot and my users. If any. Lastly, I don't think of them as exclusive so much as something fun to find.
You might be the first, if it turns out to be something you can use.
The top meter is one of the documented ones for reference -- so you can see the kind of "normal" thing the eggs replace. With this meter model, the left meter is the s-meter, and the bar graph sub-display in it is the noise detection level, which correlates with the noise reduction intensity. The squares are unfilled dark red because the noise reduction function was off when I took the screen capture. The segments are filled with a light blue color with noise reduction on; they display the detected noise level actively regardless of the setting. The smaller, yellow meter on the right is an audio VU meter that tracks the modulation percentage for FM, SAM, FSK and AM (bottom scale), and the dB output level for other modes -- USB, LSB, CWU, CWL.
For the Klingon meter, the triangle at the left is the s-meter value; as the signal increases, it fills with nested triangles, segment by segment. The slanted bar is the AGC level, which is independent of the S level in my radio design. The double row of symbols that comes next is the frequency to kHz resolution on top, and the remaining frequency component in Hz on the bottom. The last set of larger symbols is the signal level in dBm.
For the Predator meter, the first two symbols are the s-meter value. The next five are the signal level in microvolts. The last four are the current time, HHMM. On the bottom row, all the symbols are frequency.
I *always* squeeze out one or more easter eggs.
My latest: the application, which is free, is a software defined radio. It's loaded with features, and everything is documented in detail. Radios have something called an "S meter", which in a "real" radio is often an actual meter. I offer, and fully document, quite a few different s-meter types, which you can switch between simply by clicking on the currently displayed meter. Left click gets you the next model, right click the previous. Some are classic-looking meters, some are digits, some are graphs, some have audio dB meters incorporated as well; some read S, some read S+AGC, some read S+noise reduction, some read S+microvolts at the antenna input; some graphs are vertical, some are horizontal... and there are various combinations of the foregoing. Quite a variety.
So, if you follow the directions, you get exactly what the docs tell you you'll get.
But if, when you reach the last s-meter model, you left click again, you get an s-meter with some of the above information packed into it... in Klingon.
If you click one more time, you get the same set of information again, but this time... Predator.
Both meter styles are quite dynamic. As they should be, since they're driven by actual data and displaying it. Albeit not in the usual fashion.
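If you're curious how an egg like that hangs off the documented behavior, here's a toy sketch (invented names, not SdrDx's actual internals): the click handler just cycles a list that happens to run a couple of models past the documented end.

```python
# Toy sketch of the egg mechanic (invented names, not SdrDx internals).
DOCUMENTED = ["classic", "digits", "bar_graph", "s_plus_agc"]
HIDDEN = ["klingon", "predator"]
MODELS = DOCUMENTED + HIDDEN

def on_meter_click(current, left_click=True):
    """Left click steps forward, right click steps back; wraps around."""
    step = 1 if left_click else -1
    return MODELS[(MODELS.index(current) + step) % len(MODELS)]
```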
My only regret is that Alien's aliens were not written language users. I suppose it was alien to them. And perhaps that's why they were so mean... because they were... alienated.
Ok, I'll stop now.