"The Circle" is typical Dave Eggers. I couldn't finish it.
But for a good many applications, the bottleneck is not the CPU itself.
That's true, but it's also not relevant. For most "apps", the main issues are battery life and responsiveness. Multi-core is increasingly seen as a tool to increase responsiveness rather than throughput: the app looks like it hasn't fallen asleep, even if it hasn't yet done the thing the user asked for.
If I ask a database to do a sort, it may use parallelism under the hood, [...]
Interesting example. I wrote the sort subsystem for a (non-SQL) DBMS in one of my previous jobs, so... I guess this illustrates that we come from different perspectives on this point. In case you're curious, it was single-threaded, but it was designed for a clustered database, so it was parallel in the sense that it sorted across multiple machines in a cluster (which is what we called it before we called it a "cloud").
That "root engine" may indeed use FP, but the model maker doesn't have to know or care.
Right, and that's the advantage: pure functional programming ensures that the client doesn't have to care, because the workers are pure functions and are therefore guaranteed not to modify anything they are not supposed to.
Map/reduce was all the rage a couple of years ago. I think the main advantage was not the map/reduce model, but the realisation that when you have "big data", you take the code to the data rather than taking the data to the code. But on top of that, forcing yourself into a pure functional style means that your code can run anywhere because it doesn't care about the context in which it runs.
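To make that concrete, here is a minimal word-count in the map/reduce style (an illustrative sketch with made-up names, not tied to any particular framework). Because both the mapper and the reducer are pure, any chunk could be processed on any machine, in any order, and the totals come out the same:

```python
from functools import reduce

def map_chunk(chunk):
    """Pure mapper: text chunk -> list of (word, 1) pairs."""
    return [(word, 1) for word in chunk.split()]

def reduce_counts(counts, pair):
    """Pure reducer: fold a (word, 1) pair into a dict of totals."""
    word, n = pair
    merged = dict(counts)  # no mutation of the input accumulator
    merged[word] = merged.get(word, 0) + n
    return merged

chunks = ["to be or", "not to be"]  # the "big data", already split up
mapped = [p for chunk in chunks for p in map_chunk(chunk)]
totals = reduce(reduce_counts, mapped, {})
# totals == {"to": 2, "be": 2, "or": 1, "not": 1}
```

Since neither function reads or writes anything outside its arguments, the mappers can be shipped to wherever the chunks live, which is exactly the "take the code to the data" point.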
I would add "Microserfs" and possibly "The Circle" to that list.
Freakonomics by Dubner and Levitt. Assuming you already know the mechanics of being in business, the most important lesson is that people respond to incentives, but they rarely respond in the way you anticipate.
I can't imagine why anyone would want to emulate Steve Jobs. He died because he believed in woo-woo quack cures. I realise that denying reality is valued in the entrepreneur business, but surely that's all the more reason to stand out.
To say that "there can be no free market in the absence of regulation" is equivalent to saying that there can be no free market, period.
For a fundamentalist definition of "free", that's accurate. There can be no free market. There is only "more free" or "less free". And even then, you're often talking about various freedoms traded off against each other.
The real world is a balancing act which requires constant, nimble adjustment. Neither Bloated Government nor The Mythical Hand of the Market can efficiently supply this by itself.
Strangers in the night...
Thus, it may not match or be part of the evolution pattern you outlined.
Of course. For every highbrow concept that became mainstream, there are three that only ever reached a niche and another twenty that died off.
I think that functional programming (and I include the actor model) is more important now because of the trend towards more and more cores on the one machine. Purely functional design scales to lots of cores in a way that sequential code does not. Whether or not the industry realises this is a separate question.
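A minimal sketch of why purity helps with scaling (illustrative only, using a thread pool to stand in for real cores): because the worker function touches no shared mutable state, the runtime is free to schedule it across as many workers as it likes, and the result is still deterministic.

```python
from concurrent.futures import ThreadPoolExecutor

def square(n):
    # Pure: no shared state, so any worker may run this, in any order.
    return n * n

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(square, range(8)))
# results == [0, 1, 4, 9, 16, 25, 36, 49] regardless of scheduling
```

The same property does not hold for sequential code that mutates shared state; that's the sense in which purely functional design scales where sequential code does not.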
I like to crumble up day old fecal matter from the kitty litter box to add a little bit of flavor to things.
Hey, that's how I named my startup, too!
That is some first-class word salad right there.
"High brow" programming has never proven itself in practice for multiple projects.
Sure it has. Assembler proved itself against writing binary, high-level languages proved themselves against assembler, managed languages proved themselves against languages compiled to machine code, regular expressions proved themselves against writing custom parsers... most new technologies were "high brow" once.
My impression is that functional programming comes from lambda calculus, which was introduced in the 1930s.
Mostly correct. The Lisp family of languages borrows the lambda notation, but those languages are not based on lambda calculus in Church's sense.
The differences (and the differences between functional programming and the theory of sets-and-functions that we teach to high school students) don't really matter at a basic level.
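For readers who haven't seen it, here is the borrowed lambda notation in action (an illustrative sketch in Python, which also took the `lambda` keyword from the Lisp tradition):

```python
# Lambda abstraction in the notation Lisp borrowed from Church.
identity = lambda x: x
compose = lambda f, g: lambda x: f(g(x))  # (f . g)(x) = f(g(x))

add_one = lambda n: n + 1
double = lambda n: n * 2
eight = compose(double, add_one)(3)  # double(add_one(3)) == 8
```

This is just the sets-and-functions picture from school: anonymous mappings, built up by composition.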
How are functional programs translated into machine code?
Here you go. This book is 30 years old, but the basic principles are the same.
That same complex algorithm that is simplified thanks to lambda calculus techniques might be so difficult to translate to machine code that the resulting program is less efficient.
For what it's worth, the same is true of any high-level language. CPUs don't understand virtual dispatch natively, either.
Oh, so that carrier group really was near North Korea after all!
So if regular programmers who form the bulk of the workforce can't grok them, the languages need to be fixed, not people.
I know what you're saying, but there's a real danger here that the industry will find itself caught in a local extremum. An engineer of 1880 could easily have said that if regular engineers who form the bulk of the workforce can't understand this "electricity", then it needs to be fixed to conform to the world of steam.
The worst thing we can do as an industry is think we know what we're doing. And in a sense, we're already there.
Asking whether you like functional programming is like asking whether you like Phillips-head screwdrivers.
I do not. Give me JIS screwdrivers any day.
If one is well versed in category theory, or has spent a significant amount of time working with functors, monoids, and monads, then it's much easier to understand a non-trivial application written in Haskell than the equivalent object hierarchy in an object-oriented language. The up-front cost is greater in terms of study and learning the semantics, but the end result is significantly more powerful.
I strongly suspect (but can't yet prove) that the supposed up-front cost in understanding Milner-esque functional languages is just the same as the up-front costs for Simula-style object oriented languages. The difference is that in the case of Simula-style object oriented languages, most of the up-front cost has already been largely paid by the time you come to them.
If it's any help, consider that there seems to be a significant learning cost in wrapping your brain around "real" object-oriented languages such as Smalltalk when coming from "broken Simula" object-oriented languages such as Python or C++.
We teach set/function theory and basic logic to high school students. It shouldn't be that much harder to make the very small amount of generalisation to explain the fundamentals of a modern logic-based type system.
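To illustrate the claimed small step from school set/function theory to a type system (an illustrative sketch; the names are mine): read a type as a set, and a typed function as a mapping between sets.

```python
from typing import Optional

# successor : int -> int, a mapping from the set of ints to itself.
def successor(n: int) -> int:
    return n + 1

# Optional[int] is the set int plus one extra element (None) -- the
# small generalisation mentioned above: a sum type.
def safe_div(a: int, b: int) -> Optional[int]:
    return None if b == 0 else a // b
```

Nothing here goes beyond "a function from set A to set B", which is exactly what high school students are already taught.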
Every young man should have a hobby: learning how to handle money is the best one. -- Jack Hurley