Comment: Re:Lexmark is still around? (Score 1) 228

by atrizzah (#41155581) Attached to: Lexmark To Exit Inkjet Printer Market
I used to work for Lexmark as an intern throughout college and grad school, and while I am ASTONISHED that they have made it so long as a company given how horribly managed the place is, I also feel bad. There are a lot of fantastic and brilliant people who work for the company, in many cases because there just aren't too many other tech jobs in Lexington, KY. I hope my old friends land on their feet.

Comment: Re:kinda cheating (Score 3, Interesting) 389

by atrizzah (#40367831) Attached to: Chinese Firms Claims It Can Build World's Tallest Tower in 90 Days
In this case, I'm pretty sure that most of the pre-fabbed parts are specific to this particular building. So the manufacturing time of the pre-fabbed blocks should be considered in the total building time. A lot of software projects have components that are developed as libraries specifically for the project. These certainly would be counted in the development time. That being said, 90 days to assemble the tallest building in the world is still hugely impressive.

Comment: Re:It's a good thing the military is still funded. (Score 4, Insightful) 422

by atrizzah (#39032225) Attached to: White House Wants Devastating Cuts To NASA's Mars Exploration
There's only so much a person can expect Obama to do. The reality is that there is a massive movement in this country that is opposing social investment (taxes) for any purpose. If we're not willing to pay extra to balance the budget and increase our investment in our own future, then the real funds we can invest in ourselves decrease as more of our tax revenue is devoted to servicing the debt. I don't think Obama ramming any sort of increased spending down the GOP's throat is a winning strategy, and without tax increases or spending cuts on untouchable programs, there's really no other way. He could stare the GOP down, as you say, but the GOP's politicians have no incentive to back down. Their sole goal is to set the man up to lose the re-election bid, and failing that, they're at least going to stonewall everything he does to make him appear ineffective.

Comment: Re:Wish I could understand the details of FFTs (Score 2) 271

by atrizzah (#38761222) Attached to: Faster-Than-Fast Fourier Transform

Most of the time, when people talk about taking the FFT of a signal, they really mean the STFT, which breaks the signal into small blocks and then takes the FFT of each block independently; the results are known as the analysis frames of the signal. Either way, it fundamentally comes down to this: you have a discrete (sampled) signal of finite length N (N could be really large for the full FFT, or something like 1024 for an STFT), and the amount of information (bandwidth) of that signal is fundamentally limited by the time interval between the samples (see the Shannon-Nyquist sampling theorem).
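To make the block-by-block idea concrete, here's a rough Python/numpy sketch (the 8 kHz sample rate, 440 Hz tone, and 1024-sample frames are just illustrative numbers I'm picking, with no overlap or windowing):

import numpy as np

# Minimal STFT sketch: chop a sampled signal into fixed-length frames
# and take the FFT of each frame independently.
fs = 8000                                  # sample rate in Hz (illustrative)
t = np.arange(0, 1.0, 1.0 / fs)            # one second of samples
x = np.sin(2 * np.pi * 440 * t)            # a 440 Hz tone as a toy signal

N = 1024                                   # frame (block) length
n_frames = len(x) // N
frames = x[:n_frames * N].reshape(n_frames, N)   # the analysis frames

# One FFT per frame: rows are frames, columns are frequency bins.
spectra = np.fft.rfft(frames, axis=1)
print(spectra.shape)                       # (n_frames, N // 2 + 1)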

Forget Fourier for a second: linear algebra alone says that you are free to transform that signal into another representation by multiplying it with a set of basis functions, and it will always take just N coefficients, so long as your N basis functions are linearly independent. The best sets of linearly independent basis functions are also orthogonal, because that means each of your N coefficients represents an independent part of your original signal.
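As a quick sanity check of that change-of-basis view, here's a small numpy sketch using the N x N DFT matrix as the basis (N = 8 is an arbitrary choice); it just confirms the basis is orthogonal and that the N coefficients recover the signal exactly:

import numpy as np

# Change-of-basis view: with N linearly independent basis vectors stacked
# into an N x N matrix B, a length-N signal x has exactly N coefficients
# c = B @ x, and x is recovered from them when B is invertible.
N = 8
n = np.arange(N)
# DFT basis: row k is the complex sinusoid with k cycles over N samples.
B = np.exp(-2j * np.pi * np.outer(n, n) / N)

# Orthogonality: B @ B^H equals N * I, so the inverse is just B^H / N.
print(np.allclose(B @ B.conj().T, N * np.eye(N)))   # True

x = np.random.randn(N)
c = B @ x                        # the N transform coefficients
x_back = (B.conj().T @ c) / N
print(np.allclose(x_back, x))    # True: no information gained or lost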

As it turns out, the sinusoids that are harmonics of the sinusoid with period N (up to the sinusoid with a period of just two samples) are linearly independent and orthogonal, and are therefore completely sufficient to describe a signal of length N. Fourier was one of the first to use sinusoidal bases, hence his eponymous transform. Keep in mind, once again, that N could be the length of the original signal, or just the length of the STFT blocks of your signal. We typically use the STFT because for most real-world signals, meaningful ideas of frequency content are time-local. When you listen to music, for example, frequency only really has meaning over windows of the sound signal up to about 1/20 of a second long. So taking the FFT of an entire audio file all at once will give you the frequency content of the entire file, but that has little perceptual or musical meaning.

But there is a problem with the STFT. Using exactly N sinusoids to represent a block of data doesn't work cleanly if your data contains frequencies that don't complete a whole number of cycles in N samples. In other words, the FFT only produces meaningful measurements of frequency if the signal is periodic in the window length. In reality, this is almost never the case, because the STFT process of chopping the signal into blocks is not at all careful about making sure those blocks don't snip the signal at awkward points. The result of taking the FFT of a block with awkwardly clipped edges is known as "spectral leakage": frequency content shows up in sidebands close to the real frequency, so the output isn't the ideal spike at the measured frequency. To combat this, we use windowing functions, which don't totally fix the problem, but they do make it much, much better.
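Here's a rough numpy sketch of the leakage problem and how a window helps (the 100.5-cycles-per-frame tone and the Hann window are just illustrative choices on my part):

import numpy as np

# Spectral leakage sketch: a sinusoid that does not complete a whole
# number of cycles in the frame smears across many bins; a Hann window
# concentrates the energy much closer to the true frequency.
N = 1024
n = np.arange(N)
x = np.sin(2 * np.pi * 100.5 * n / N)      # 100.5 cycles per frame (non-integer)

rect_spec = np.abs(np.fft.rfft(x))                  # no window (rectangular)
hann_spec = np.abs(np.fft.rfft(x * np.hanning(N)))  # Hann-windowed

# Fraction of total energy outside the 5 bins around the peak: lower is better.
def leakage(spec, width=5):
    k = np.argmax(spec)
    e = spec ** 2
    keep = e[max(k - width, 0):k + width + 1].sum()
    return 1.0 - keep / e.sum()

print(leakage(rect_spec), leakage(hann_spec))   # the Hann version leaks far less

In practice you'd also overlap the windowed frames so the tapered frame edges don't throw information away.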

Hopefully that sheds a little light on the topic.

Comment: Re:Doesn't sound THAT useful (Score 1) 271

by atrizzah (#38760740) Attached to: Faster-Than-Fast Fourier Transform
Most compression algorithms don't use the FFT at all; they use the DCT. And while the signals themselves aren't usually sparse, the heart of the compression process is throwing out non-zero coefficients that are very small. Of course, you have to do the DCT first to find out which coefficients are small enough to be thrown out. But, at the very least, this could be useful in the decoding process.
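As a toy illustration of that throw-out-the-small-coefficients step, here's a rough sketch using scipy's DCT (the block length, the test signal, and keeping 8 coefficients are all arbitrary choices of mine, not any particular codec):

import numpy as np
from scipy.fft import dct, idct

# Toy transform-coding sketch: take the DCT of a block, zero out the
# smallest coefficients, and reconstruct. Most of the signal survives
# because its energy is concentrated in a few large coefficients.
np.random.seed(0)
n = np.arange(64)
block = np.cos(2 * np.pi * 3 * n / 64) + 0.1 * np.random.randn(64)

coeffs = dct(block, norm='ortho')
keep = 8                                     # keep only the 8 largest
small = np.argsort(np.abs(coeffs))[:-keep]   # indices of everything else
coeffs[small] = 0.0                          # "throw out" the small coefficients

approx = idct(coeffs, norm='ortho')
err = np.linalg.norm(block - approx) / np.linalg.norm(block)
print(f"relative error keeping {keep}/64 coefficients: {err:.3f}")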

Comment: Re:Security (Score 2) 271

by atrizzah (#38760688) Attached to: Faster-Than-Fast Fourier Transform
I might also add that the speedup in the paper above only applies when the FFT's output is sparse, i.e., when the signal has just a few significant frequency components. The digit sequences of typical large numbers don't have sparse spectra, so the algorithm doesn't help you here. Where it is useful is in data compression, where you intentionally throw out small coefficients to save space anyway. Although, most compression algorithms use variations of the discrete cosine transform (DCT) instead of the FFT. The DCT is very similar to the FFT, but I'm not sure whether the algorithm in the paper can also be used to speed up the DCT.

Comment: Re:Security (Score 3, Informative) 271

by atrizzah (#38760610) Attached to: Faster-Than-Fast Fourier Transform

For numbers that are small enough to fit in your processor's registers, multiplication is a constant-time operation. But when dealing with numbers much larger than your processor's word length (typically 4 or 8 bytes), you have to resort to a long-multiplication algorithm. As it turns out, multiplying polynomials is exactly the same as convolving their coefficient sequences, which is O(n^2) done directly, and long multiplication is just a special case of polynomial multiplication where the variable is fixed at the base:

525 * 343 = (5x^2 + 2x + 5)(3x^2 + 4x + 3), for x = 10

So for two large numbers, rather than going through the long process of convolving the digits (long-multiplying), you do the following process:

  1. take the FFT of the digit sequence of each number, O(n*log(n))
  2. multiply point by point, O(n)
  3. take the IFFT of that result, O(n*log(n))
  4. round to integers and propagate the carries to get the product's digits, O(n)
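Here's a rough numpy sketch of that process on the 525 * 343 example, checking the FFT route against plain digit convolution (the little helper functions are just mine, for illustration):

import numpy as np

# Digit-sequence multiplication sketch: treat each number's digits as
# polynomial coefficients (least significant first), multiply the
# polynomials by convolution, then propagate carries. The FFT route
# replaces the O(n^2) convolution with O(n log n) work.
def digits(x):                       # 525 -> [5, 2, 5], least significant first
    return [int(d) for d in str(x)][::-1]

def carry(coeffs):                   # turn a coefficient list back into an integer
    total, place = 0, 1
    for c in coeffs:
        total += int(round(c)) * place
        place *= 10
    return total

a, b = digits(525), digits(343)

# Direct convolution: this is long multiplication without the carries.
direct = np.convolve(a, b)

# FFT route: pad, transform, multiply point by point, inverse transform.
n = len(a) + len(b) - 1
A = np.fft.rfft(a, n)
B = np.fft.rfft(b, n)
via_fft = np.fft.irfft(A * B, n)

print(carry(direct), carry(via_fft), 525 * 343)   # all three agree: 180075

A real bignum library would work in a much larger base than 10 and worry about floating-point precision (or use a number-theoretic transform instead), but the shape of the algorithm is the same.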

Comment: Re:two suggestions (Score 1) 402

by atrizzah (#38611304) Attached to: Ask Slashdot: Mirrorless, Interchangeable Lens Camera Advice?

I agree 100%. To throw in my 2 cents, I was very much interested in upgrading from my Canon G10 to a MILC. What I wanted was a similarly compact camera, but with interchangeable lenses and a larger sensor. I had been holding my breath for Canon to jump into the MILC sector. I had it in my mind that they were completely missing the boat on the revolution. But when I started doing my research, I realized that I shouldn't be holding my breath.

MILCs right now are terribly expensive compared with more capable DSLRs. What you gain in portability, you lose in lens selection, sensor size, and upgradability. The biggest problem, though, is that manufacturers seem terribly confused about who the target audience for MILCs is, because the design decisions that would make a MILC a step up from a bridge camera and those that would make it an alternative to a DSLR are often contradictory. I think Nikon's 1-mount is clear evidence of this: it pairs a comically small sensor with a comically odd mix of features and limitations at a comically high price.

Some people are already writing the obituary for DSLRs, but they aren't going anywhere any time soon. Maybe in 10 years, MILCs will dominate the market, but if you want a camera today with great image quality, smooth upgrade paths, and low cost, you can't go wrong with the high-end consumer DSLRs. I ended up with the Nikon D5100, which is basically equivalent to the Canon T3i. I bought mine with an 18-55mm kit lens and a 55-200mm telephoto zoom for $760. I am absolutely loving it.
