Businesses

Steve Ballmer: We Won't Be Out-Innovated By Apple Anymore 610

An anonymous reader tips an article about comments from Microsoft CEO Steve Ballmer regarding Microsoft's attitude toward Apple. It seems Microsoft is tired of being behind the curve in most areas of the tech market, and will be trying very hard to prevent Apple and other companies from beating them to the punch in the future. From the article: "In a recent interview, Ballmer explained that the company had ceded innovations in hardware and software to Apple, but that the-times-they-are-a-changin'. 'We are trying to make absolutely clear we are not going to leave any space uncovered to Apple,' Ballmer explained. 'Not the consumer cloud. Not hardware software innovation. We are not leaving any of that to Apple by itself. Not going to happen. Not on our watch.' ... An admirable goal, but it's fair to argue that attempting to innovate everywhere results in innovation nowhere. A big part of the reason Apple has been so successful is that they devote the bulk of their attention to only a few select market areas. By trying to innovate everywhere, so to speak, Microsoft runs the continued risk of spreading itself too thin and not really having a fundamental impact in any one market."

Comment Get Real! (Score 4, Interesting) 241

Alright, so let's say the example in the video took place today:

Company 1 in Europe has an idea for a part and contacts Company 2 in America to produce it:

1) Company 1 googles and finds the name of a company in America to produce the part. They call the American company and it takes two hours to wade through the phone system menus and leave several voice mails and wait for a reply.

2) Company 1 can't give any details without a signed NDA, and because of requirements from the company's lawyers, the NDA has to be faxed over, signed, and faxed back.

3) Once they agree to work together, company 1 wants to send company 2 a copy of the design.
3a) The email bounces because it was typed wrong due to international spelling differences
3b) Once the email stops bouncing, it is picked up by a spam filter and nobody ever sees it
3c) Since the email had a large attachment, Microsoft Exchange choked and the server admin had to come in on the weekend and rebuild the databases
3d) After that, Company 1 decides to just put the file on an internal FTP server.
3e) Company 2 isn't able to use FTP in Windows without downloading a program from the internet, which involves getting permission from the IT department, registering the program with the developer, convincing the anti-virus software to allow the FTP program to run, etc.
3f) The server at Company 1, an older machine not frequently used, sits behind an unintelligent Cisco firewall product that fails to correctly open the reverse data stream. The files never arrive, as the connections hang. (A passive-mode transfer, like the sketch after this list, would likely have dodged this.)
3g) Company 1 gives up and uses Dropbox.
3h) The files arrive at Company 2, but they are also intercepted by some Russian and Chinese hackers who easily eavesdropped on their Dropbox using a script inserted several months ago to look for interesting keywords.

4) Many months pass, and finally the prototypes are shipped over to Europe, where it is discovered that the Americans did not convert metric units to English units correctly for each portion of the project, and nothing screws together.

5) The hacked data is leaked to the highest paying competitor.
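
For the curious, 3f above is the classic active-mode FTP failure. Here's a minimal Python sketch of the passive-mode transfer that probably would have dodged it; the host, credentials, and filename are made up purely for illustration:

    from ftplib import FTP

    # Hypothetical host, credentials, and filename -- illustration only.
    HOST, USER, PASSWORD = "ftp.example.com", "design", "secret"

    with FTP(HOST) as ftp:
        ftp.login(USER, PASSWORD)
        # Passive mode: the client opens the data connection too, so nothing has to
        # punch back through a firewall the way active mode's "reverse" stream does.
        ftp.set_pasv(True)
        with open("part_design.zip", "wb") as fh:
            ftp.retrbinary("RETR part_design.zip", fh.write)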

The other futuristic situation, about the doctor, is equally obnoxious these days if you factor in HIPAA, incompatible data formats, and even lower IT standards.

Let's face it, this started off as a great idea and became something quite different.

Comment Re:The bit depth does matter (Score 1) 841

I am so glad you wrote this, Rimbo.

Check it out, yes, the digital time-axis is discrete. As is the "y-axis" which I assume to mean amplitude.

The fact that you would even mention stair steps shows how fundamentally off your concept of digital audio is. Here's the deal. Take a Fourier transform of one of those "steps". You'll note these are sharp rectangular shapes. The fundamental frequency of one stair step is that of the sine wave it roughly approximates, with a period of two samples. Gee, what frequency is that? Oh, right, Nyquist. You can't create a step with a longer (lower-frequency) period than this -- the steps exist only in the small differences between *adjacent* samples. Now, there are other harmonics. Recall your Fourier transform pairs: something rectangular has a sinc-like spectrum. But those harmonics all sit *above* the Nyquist.

So basically, the effects of these "stair steps" are all at or above the Nyquist, and are filtered out by anti-aliasing filters. The author is correct that the original wave is reconstructed.
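
If you want to check that claim yourself, here's a rough NumPy sketch (my own toy numbers, nothing from the article): it builds the explicit "stair step" version of a 1 kHz tone sampled at 44.1 kHz and shows that the extra spectral energy from the steps sits above the original Nyquist, exactly where the reconstruction filtering throws it away.

    import numpy as np

    fs = 44100       # original sample rate (Hz)
    f0 = 1000        # test tone (Hz)
    hold = 16        # repeat each sample 16x to draw the "stair steps" explicitly

    n = np.arange(4096)
    x = np.sin(2 * np.pi * f0 * n / fs)        # band-limited tone sampled at fs

    stairs = np.repeat(x, hold)                # zero-order-hold waveform at fs*hold
    spec = np.abs(np.fft.rfft(stairs * np.hanning(stairs.size)))
    freqs = np.fft.rfftfreq(stairs.size, d=1.0 / (fs * hold))

    peak_below = spec[freqs <= fs / 2].max()   # the tone itself
    peak_above = spec[freqs > fs / 2].max()    # the images created by the "steps"
    print(f"peak below Nyquist: {peak_below:.1f}, peak above Nyquist: {peak_above:.1f}")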

Now, your point about the FFT. Do you know what an FFT is? The ear does not perform a fast Fourier transform of order n log(n). As several have pointed out, the eardrum does not vibrate above 20 kHz or so. So, even if the ear implements a butterfly FT, I highly doubt it notices the anti-aliasing filtering taking place around 22 kHz. Especially with the up-sampling and digital-domain filtering that actually takes place.

The author's description of bit depth and dynamic range is correct. More bits per sample does give finer granularity and a greater span between the loudest and quietest sounds that can be recorded in a PCM stream.

I wouldn't say the author doesn't understand PCM. I'd say you don't.

Comment Re:The bit depth does matter (Score 1) 841

I am completely aware that 192 kHz/24-bit is alive and well. I saw many a studio "upgrade" to the latest Pro Tools HD. And I happen to use 88.2k/24 for a lot of the work I do.

I too cannot hear to 20k. My hearing tops out at 19 kHz, and that's just fine with me. It shouldn't be surprising that our test gear can "see" up past 20 kHz though. My Tektronix scope from the 80s will go way up to 100 MHz. But that isn't the point, is it? Point is, we can't hear up there. But the gear that can does sell, because people like bigger numbers.

I would totally advocate switching to 24 bits over 16. But higher sample rates really present diminishing returns. Not only do we not hear up there, we certainly don't hear much detail above 10 kHz.

Your points about processing are totally valid, and I do the same with my photos.

Comment Re:The bit depth does matter (Score 1) 841

What sort of "smearing" are you talking about? From analyzing your comment, I think you are saying that the time of arrival at the ears is quantized to, at best, one sample period, Ts = 1/Fs.

Here's the thing. BOTH left and right signals are quantized into the same chronological 'bins'. The error is no greater on one side than on the other. Secondly, the frequencies where this kind of error would be noticeable -- if it occurred -- are above 15 kHz. At this frequency, every millimeter of material in your surroundings has an effect on the perceived sound. Unless you have a perfect setup in every way, this is not a big deal.
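
For scale, the time grid in question is just the sample period, and left and right sit on the same grid; a trivial Python check with my own numbers:

    # Sample period Ts = 1/Fs at common audio rates; L and R share the same grid.
    for fs in (44_100, 96_000, 192_000):
        print(f"Fs = {fs / 1000:g} kHz -> Ts = {1e6 / fs:.1f} microseconds")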

I'll tell you where the sample rate will matter. In both ADCs and DACs, there are anti-aliasing filters designed to make sure frequencies greater than the Nyquist are not recorded, and are filtered out of the resulting reproduction signal. Filters can't just cut everything after some given frequency, though. They have to do this gradually. Too fast a cut and you create ripple in the pass-band and ripple after the cutoff. Too gentle and you allow for some aliasing, or you filter out some of the desirable high-frequency content that is just below the Nyquist. Solution? Sample at a ridiculous rate, like 100 kHz, and filter gently. Your ripples will be outside the audible range, and you'll be able to cut much more gently.
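
To put rough numbers on that, here's a back-of-the-envelope SciPy sketch, with specs I picked myself (pass everything up to 20 kHz, be about 80 dB down by the Nyquist), estimating how long an FIR anti-aliasing filter has to be at different sample rates. The wide transition band you get at a high sample rate is exactly what lets the filter be gentle and short.

    from scipy import signal

    def antialias_taps(fs, passband=20_000.0, atten_db=80.0):
        # Kaiser-window estimate of the FIR length needed to pass up to `passband`
        # and be `atten_db` down by the Nyquist frequency (fs/2).
        nyquist = fs / 2.0
        width = (nyquist - passband) / nyquist   # transition width, normalized to Nyquist
        numtaps, _beta = signal.kaiserord(atten_db, width)
        return numtaps

    for fs in (44_100.0, 96_000.0):
        print(f"Fs = {fs / 1000:.1f} kHz -> ~{antialias_taps(fs)} taps")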

Having said that, with good filtering the ripples are still pretty minimal, way above the 'content' of the program, and with today's distortion (aka mastering), and your home equipment or even so-called audiophile equipment, a few dB here and there around 19 kHz isn't going to be noticeable. This ripple almost certainly existed in the analog equipment (microphones, rooms, etc.) involved in the original recording, and nobody cried about that.

Comment The bit depth does matter (Score 4, Insightful) 841

As a former audio engineer with some ranking success, I can tell you that it's true -- delivering high-sample-rate audio as an end format is really pointless. It hardly makes sense in a studio, and definitely is illogical for the distribution of a final mix.

However, there is an increase in quality using 24 bits. Most people just assume increasing the bit depth is the same as increasing the sample rate, but this is incorrect and short-sighted. With higher bit depths, you can get your analog components operating a little further away from the noise floor. This also makes dither noise much less noticeable (the noise you hear when you crank the volume up as a song fades out). Why? There are more "levels" for each sample to be recorded into. It's like going from 16-bit to 24-bit color. You would notice this.
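
The usual rule of thumb is roughly 6 dB of dynamic range per bit (plus a little extra for a full-scale sine), and a quick Python sanity check makes the 16-vs-24 difference concrete:

    import math

    def pcm_dynamic_range_db(bits):
        # Theoretical SNR of an ideal N-bit PCM quantizer with a full-scale sine:
        # about 6.02*N + 1.76 dB.
        return 20 * math.log10(2 ** bits) + 1.76

    for bits in (16, 24):
        print(f"{bits}-bit PCM: ~{pcm_dynamic_range_db(bits):.0f} dB")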

For the 192 kHz fans out there, there is direct and proven mathematical reasoning for why 44.1 kHz audio is plenty. That, and your equipment probably can't produce it. Your converters probably suck at those frequencies, and your ears definitely can't vibrate that quickly. A higher sample rate doesn't "smooth out" the waveform.
Media

Your Next TV Interface Will Be a Tablet 210

waderoush writes "You can forget all the talk about 'smart' and 'connected' TVs: nobody, not even Apple, has come up with an interface that's easy to use from 10 feet away. And you can drastically curtail your hopes that Roku, Boxee, Netflix, and other providers of free or cheap 'over the top' Internet TV service will take over the world: the cable and satellite companies and the content owners have mounted savvy and effective counterstrikes. But there's another technology that really will disrupt the TV industry: tablet computing. The iPad, in particular, is the first 'second screen' device that's good enough to be the first screen. This Xconomy column argues that in the near future, the big-screen TV will turn into a dumb terminal, and your tablet — with its easy-to-use touch interface and its 'appified' approach to organizing content — will literally be running the show in your living room." Using a tablet as a giant remote seems like a good idea, and a natural extension of iPhone and Android apps that already provide media-center control. Maybe I'm too easily satisfied, but the 10-foot interface doesn't seem as hopeless as presented here; TiVo, Apple, and others have been doing a pretty good job of that for the past decade.

Submission + - Connectors to blame: Faster-Than-Light Neutrino Results Invalidated (sciencemag.org)

gnu-sucks writes: It appears the universal speed limit has actually not been broken. A bad connection on an optical network was found to be inserting an additional 60 ns of delay in the timing electronics at the sending site, making the neutrinos appear to arrive earlier than expected.

Comment Sounds great (Score 5, Funny) 647

I can't wait for a system where each application automatically takes up the entire screen!

Just imagine, reading facebook.com on my 30-inch screen, FULLY MAXIMIZED, so that no other applications can distract me. Or, if I decide to code, EACH terminal could span the entire desktop. No longer will I have to struggle with seeing two things at once -- from now on, it's peace of mind with GNOME 3.

Thankfully I can now give gvim the space it has always deserved -- a fully uncluttered 2560 x 1600 space. And when I decide to listen to music, my music app can take up the entire space too! Imagine, seeing nothing but whitespace. Thank goodness someone thought of this. I can finally relax and do what I've always wanted to do: use my computer, one app at a time, in FULL SCREEN!

If you think about it, this is almost as good as DOS. No more annoying window title bars and multi-app desktop usage. No more extra buttons and widgets. Just one thing and one thing only -- what you're going to work on. I can't wait to develop kernel drivers and work on my apps this way. The fact that when I currently work I can actually see (and be distracted by) about three to four windows at a time is just devastating. I have to (currently) *navigate* to each and every window, and precariously drag the window across my entire desktop to achieve this effect, only to remain haunted by menu bars, title bars, and application switchers.

If only they could put a stop to all those pesky background processes and really get it down to just one single process. Then all the processes on my computer wouldn't have to compete for computer resources. Just like DOS, I'm telling you, I can't wait, we're getting back to the single-purpose one-thing-at-a-time operating system!

Obligatory slashdot sayings:
I for one welcome our maximized-app overlords!
In Soviet-Russia, window manager maximizes YOU!
One app to rule them all!
It was as if millions of apps suddenly cried out in terror and were suddenly silenced, replaced with calming whitespace.

Comment Re:why iPads? (Score 2) 396

Why not a Kindle or Nook? Because they suck at reading PDFs. Yes, they can technically show a PDF on the screen. But unless that PDF is formatted for a small screen size, the experience is going to be awful. The more expensive devices (iPad, color-screen-glossy Nook-Kindle-whatever, $400+) have solved this with better multi-touch zoom and pan options, but then you're going to pick the iPad if they're in the same price arena. Just go on YouTube and look for "kindle touch pdf reading" and you'll see how awful it is.

For someone trying to study, you need the ability to quickly browse material and annotate, and the cheaper devices don't offer this in any reasonable way.

So why not a netbook? Sure, a netbook can display PDFs quickly. But if your input is limited to mouse and keyboard, you're ten times less likely to annotate, which is the thing you would normally do with a real paper textbook. So the iPad and other stylus-bearing devices come out on top, due to their size, advanced software, and input methods.

Comment Re:Not a surprise (Score 4, Interesting) 197

Here's the problem: You can block "Lightspeed" from deploying devices known to cause harmful interference to GPS signals. Big deal. What you can't do is make it "illegal" to jam GPS. Well, you can make it illegal, but it's a matter of enforcement. Expecting it to work 100%, especially on a battlefield, is stupid. Your enemy will build GPS jammers by the dozen and hide them all over the place once they realize this is how you guide your missiles.

All I'm saying is that this is a symptom of a larger problem: depending on easily jammed GPS.

I realize the military will just triangulate and find the jammers. But a jammer just has to hide the equipment in nearby hospitals and grocery stores and use intelligent timing and antenna arrangements to make triangulation a very difficult and time-consuming operation. And once the devices are found and destroyed, it's another $15 to deploy another one somewhere else.

I think it's a good idea to try and prevent what you can, such as by not certifying equipment that causes harmful interference. But let's not think this is the real problem with GPS...

Comment Not a surprise (Score -1, Flamebait) 197

It's not surprising that an RF signal can be interfered with remotely. Whether the signal is for a baby monitor, an emergency-room health computer system, or remote aircraft control, people will always be astonished that it was susceptible to interference.

But honestly, it's an RF signal. Blocking the signal is about the same for any given service. Some are a little more robust than others, but it's the same mathematical game.

Let's get over the sensationalism and realize the real problem: We had false expectations of GPS and therefore should not have depended on this technology in defense systems.
