## Comment Re: Maybe Kotlin (Score 1) 245

It sounds like you're using it as a Better Java, and if so, your conclusion sounds pretty reasonable. Programming Scala like a better Java would be frustrating. You *can* do it, of course, but most Scala libs aren't very Java-like.

My impression is that Scala is good for quickly prototyping simple tasks. Leaving aside the meaningless "mission-critical" bit, it looks like you set up a simple web server that could run some jobs in the background on a cluster.

Well, not really, since we want to know ASAP when these processes terminate. Sure, there's nothing about Scala that is uniquely capable of meeting our requirements. In fact, since Akka supports Java on an equal standing, we could have kept much of the same underlying technology. However, the actual domain logic would have been far more verbose and awkward to model in Java, in my experience.

On the other hand, if you have a large complex software project spanning many years where you need to use a lot of third party Java libraries and where you want to avoid obscure bugs (everything clean and simple and easy to understand) then Scala is really not the best choice.

You provide no argument for why the Scala option would have these obscure bugs the Java approach supposedly would lack, nor why Scala would be any less capable of supporting a "large complex software project". I would assert that Scala's direct support for functional programming makes it better for writing everyday domain code. This used to be a much more controversial proposition, but at this point, most of the people I know even in the Ruby and JS ecosystems are sold on this sort of "tactical FP" as a way of writing maintainable and resilient code.

And yet, read the other answer on that same Quora page.

I would also add that the opinion of an Engineering VP at Twitter might take into account a whole bunch of things that aren't applicable to the dev who is trying to decide whether learning a language is worthwhile to them. Those are two very different points of view, with very different considerations. I love working in Scala, but if I were building a team of probably over 100 engineers, I might think twice.

Also keep in mind that Twitter was one of the very early adopters of Scala. They lived through growing pains in the ecosystem that newcomers to the language will never have to encounter.

Scala has a learning curve, no doubt about it. Its design tends to emphasize features that are very general. You put in some extra effort upfront, but you get more mileage in the long run. Scala also has a very rich ecosystem of libraries that may approach the same problems with different philosophies. Some find the amount of choice paralyzing. It's a bit reminiscent of the React ecosystem. But I think I speak for many Scala fans when I say that it feels very rewarding and empowering as a programmer to work with.

Intellectual benefits aside, I've also found it to be a language for getting things done. My small team launched a mission-critical distributed system in 6 months, with me as the only veteran Scala programmer. I give a lot of credit to the extremely robust ecosystem of libraries around Scala -- especially Lightbend's Akka and Slick. A lot of ready-made tech mapped very nicely to our problem domain, and we were off to the races. The documentation is great and there's lots of support available.

My advice for the OP: take the Coursera course and/or give the Red Book a spin. You'll likely be challenged, but you may also find it very intellectually stimulating.

They make laser printers as well, but they mostly sell those directly to businesses so you don't really see them in stores.

I used to work for Lexmark as an intern throughout college and grad school, and while I am ASTONISHED that they have made it so long as a company given how horribly managed the place is, I also feel bad. There are a lot of fantastic and brilliant people who work for the company, in many cases because there just aren't too many other tech jobs in Lexington, KY. I hope my old friends land on their feet.

I love my HP L7580. The thing's a beast! It prints in duplex, scans, and has massive, off-carrier tanks.

In this case, I'm pretty sure that most of the pre-fabbed parts are specific to this particular building. So the manufacturing time of the pre-fabbed blocks should be considered in the total building time.
A lot of software projects have components that are developed as libraries specifically for the project. These certainly would be counted in the development time.
That being said, 90 days to assemble the tallest building in the world is still hugely impressive.

Racist much?

The same could be said for white folks. Or at least that blacks and hispanics are more likely to be railroaded or treated more harshly by the "justice" system.

Yes...cut the social safety net that millions of people rely on to fund pet projects for engineers. What could be bad about that?

There's only so much a person can expect Obama to do. The reality is that there is a massive movement in this country that is opposing social investment (taxes) for *any purpose*. If we're not willing to pay extra to balance the budget and increase our investment in our own future, then the real funds we can invest in ourselves decrease as more of our tax revenue is devoted to servicing the debt. I don't think Obama ramming any sort of increased spending down the GOP's throat is a winning strategy, and without tax increases or spending cuts on untouchable programs, there's really no other way. He could stare the GOP down, as you say, but the GOP's politicians have no incentive to back down. Their sole goal is to set the man up to lose the re-election bid, and failing that, they're at least going to stonewall everything he does to make him appear ineffective.

Most of the time, when people talk about taking the FFT of a signal, they really mean the STFT, which breaks the signal into small blocks and then takes the FFT of each block independently, the result being known as the analysis frames of the signal. Either way, it fundamentally comes down to this: you have a discrete (sampled) signal of finite length N (N could be really large for the full FFT, or something like 1024 for an STFT), and the amount of information (bandwidth) of that signal is fundamentally limited by the time interval between the samples (see the Shannon-Nyquist sampling theorem).
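To make the "blocks of FFTs" idea concrete, here's a minimal STFT sketch (the 1024-sample block, 512-sample hop, sample rate, and 440 Hz test tone are all illustrative choices, not anything from the discussion):

```python
import numpy as np

# Minimal STFT sketch: slice the signal into fixed-size blocks and take
# the FFT of each block independently. Each row of the result is one
# analysis frame. Block and hop sizes here are illustrative.
def stft(x, block=1024, hop=512):
    frames = []
    for start in range(0, len(x) - block + 1, hop):
        frames.append(np.fft.rfft(x[start:start + block]))
    return np.array(frames)

fs = 8000                              # sample rate in Hz (illustrative)
t = np.arange(2 * fs) / fs             # two seconds of samples
x = np.sin(2 * np.pi * 440 * t)        # a 440 Hz test tone
frames = stft(x)
print(frames.shape)                    # (30, 513): 30 frames, 513 rfft bins each
```

Real implementations overlap and window the blocks (more on windowing below), but the core loop really is just "FFT of each block."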

Forget Fourier for a second, linear algebra alone says that you are free to transform that signal into another representation by multiplying it with a set of basis functions, and it will always take just N coefficients, so long as your basis functions are linearly independent. The best sets of linearly independent basis functions are also orthogonal, because that means each of your N coefficients is representing an independent part of your original signal.
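That linear-algebra claim is easy to check numerically. Here's a quick sketch using a random orthogonal basis (built with a QR decomposition, purely for illustration) instead of sinusoids:

```python
import numpy as np

# Any orthogonal basis represents a length-N signal with exactly N
# coefficients, and the change of basis is perfectly invertible.
rng = np.random.default_rng(0)
N = 8
signal = rng.standard_normal(N)

# QR decomposition of a random matrix yields an orthogonal matrix Q whose
# columns are N linearly independent, mutually orthogonal basis vectors.
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))

coeffs = Q.T @ signal        # analysis: exactly N coefficients
recovered = Q @ coeffs       # synthesis: exact reconstruction

print(np.allclose(recovered, signal))  # True
```

Orthogonality is what makes the analysis step a simple transpose; a merely linearly independent basis would need a full matrix inverse instead.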

As it turns out, the set of sinusoids that are harmonics of the sinusoid with period N (up until the sinusoid with a period of just two samples) are linearly independent and orthogonal, and therefore, are completely sufficient to describe a signal of length N. Fourier was one of the first to use sinusoidal bases, hence his eponymous transform. Mind you, once again, that N could be the length of the original signal, or just the length of the STFT blocks of your signal fragments. We typically use STFT because for most real-world signals, meaningful ideas of frequency content are a time-local thing. When you listen to music, for example, frequency only really has meaning over windows of a sound signal that are up to 1/20 of a second. So taking the FFT of an entire audio file all at once *will* give you the frequency content of the entire file, but that has little perceptual or musical meaning.

But there is a problem with the STFT. Using exactly N sinusoids to represent a block of data doesn't work correctly if your data contains frequencies that don't complete a whole number of cycles in N samples. In other words, the FFT only produces meaningful measurements of frequency if the signal is periodic in the window length. In reality, this is almost never the case, because the STFT process of chopping the signal into blocks is not at all careful about making sure those blocks don't snip the signal at awkward points. The result of taking the FFT of a block with awkwardly clipped edges is known as "spectral leakage": the appearance of frequency content in sidebands close to the real frequency, which smears out what would ideally be a sharp spike at the measured frequency. To combat this, we use windowing functions, which don't totally fix the problem, but they do make it much, much better.
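A quick numerical sketch of the effect (the 10.5-cycle test tone and the Hann window are illustrative choices): a sinusoid that doesn't complete a whole number of cycles in the window smears across many bins, and windowing pulls most of that energy back toward the true frequency.

```python
import numpy as np

# A sine with 10.5 cycles per window is NOT periodic in the window
# length, so its plain (rectangular-window) FFT leaks into many bins.
N = 1024
n = np.arange(N)
x = np.sin(2 * np.pi * 10.5 * n / N)

rect_spectrum = np.abs(np.fft.rfft(x))                   # no window
hann_spectrum = np.abs(np.fft.rfft(x * np.hanning(N)))   # Hann window

# Fraction of spectral energy outside the bins nearest the true frequency.
def leakage(spec, center=10, width=2):
    total = np.sum(spec ** 2)
    in_band = np.sum(spec[center - width:center + width + 1] ** 2)
    return 1.0 - in_band / total

print(f"rectangular window leakage: {leakage(rect_spectrum):.4f}")
print(f"Hann window leakage:        {leakage(hann_spectrum):.6f}")
```

The Hann window's leakage figure comes out orders of magnitude lower, which is exactly the "much, much better, but not totally fixed" trade-off: the window suppresses the sidebands at the cost of slightly widening the main spike.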

Hopefully that sheds a little light on the topic.

Most compression algorithms don't use the FFT at all; they use the DCT. And while the signals aren't usually sparse, the heart of the compression process is throwing out coefficients that are non-zero but very small. Of course, you have to do the DCT first to find out which coefficients are small enough to be thrown out. But, at the very least, this could be useful in the decoding process.

I might also add that unless your numbers have a whole bunch of zeros, they aren't sparse, so the algorithm in the paper above doesn't help you. The algorithm in the paper basically says that the more zeros you have in the input to your FFT, the faster you can do the FFT. This is mostly useful in data compression, where you intentionally throw out coefficients to save space. Although, most compression algorithms use variations of the discrete cosine transform (DCT) instead of the FFT. The DCT is very similar to the FFT, but I'm not sure whether the algorithm in the paper can also be used to speed it up.
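Here's a sketch of that transform-coding idea using SciPy's DCT (the test signal, noise level, and 5% threshold are all illustrative choices): transform, zero out the small coefficients, and invert.

```python
import numpy as np
from scipy.fft import dct, idct

# Build a signal that is nearly sparse in the DCT domain: two DCT basis
# vectors plus a little noise. Indices 6 and 14 are arbitrary choices.
rng = np.random.default_rng(1)
N = 256
c_true = np.zeros(N)
c_true[6], c_true[14] = 1.0, 0.3
x = idct(c_true, norm='ortho') + 0.001 * rng.standard_normal(N)

# Transform coding in miniature: keep only coefficients above 5% of the
# peak magnitude, throw the rest away, then reconstruct.
c = dct(x, norm='ortho')
kept = np.where(np.abs(c) > 0.05 * np.abs(c).max(), c, 0.0)
x_hat = idct(kept, norm='ortho')

print("coefficients kept:", np.count_nonzero(kept), "of", N)
print("max reconstruction error:", np.abs(x - x_hat).max())
```

Only a handful of the 256 coefficients survive, yet the reconstruction error stays at the noise floor — that's the whole bet that lossy codecs like JPEG make with their DCT blocks.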

For numbers that are small enough to fit in your processor's registers, multiplication is a constant-time operation. But when dealing with numbers much larger than your processor's word length (more than 4 bytes), you have to resort to the long-multiplication algorithm. As it turns out, multiplying polynomials is exactly the same as convolution, which is O(n^2), and long multiplication is a special case of polynomial multiplication where the variable is fixed:

525 * 343 = (5x^2 + 2x + 5)(3x^2 + 4x + 3), evaluated at x = 10
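That identity is easy to verify: convolving the digit sequences gives the coefficients of the product polynomial, and evaluating at x = 10 recovers the product (np.convolve here is the plain O(n^2) approach):

```python
import numpy as np

# Digit convolution = polynomial multiplication. Most significant
# coefficient first, matching the polynomials above.
a = [5, 2, 5]   # 5x^2 + 2x + 5  (i.e. 525)
b = [3, 4, 3]   # 3x^2 + 4x + 3  (i.e. 343)
coeffs = np.convolve(a, b)
print(coeffs)   # [15 26 38 26 15]

# Evaluate the product polynomial at x = 10; this handles the digit
# carries implicitly, since 15*10^4 + 26*10^3 + ... sums them out.
value = sum(int(c) * 10 ** (len(coeffs) - 1 - i) for i, c in enumerate(coeffs))
print(value)    # 180075, which is indeed 525 * 343
```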

So for two large numbers, rather than going through the long process of convolving the digits (long-multiplying), you do the following process:

- take the FFT of the digit sequence of each number, O(n*log(n))
- multiply point by point, O(n)
- take the IFFT of that result, O(n*log(n))

In the realm of scientific observation, luck is granted only to those who are prepared. - Louis Pasteur