When moving to a new version of a language, the correct strategy is to come up with the highest-level language one can conceivably JIT-compile and rewrite the current language as syntactic sugar for that higher-level language. "Pragmas" may be part of the sugaring (especially since the lower-level language may carry important pragmatic information), but it is better if the so-called "pragmas" are, instead, assertions written in the higher-level language itself.

The answer here is not a functional language but a relational one, since functions are degenerate relations. Moreover, since one seeks assertions in place of pragmas, the formal basis of the relational language should be sentence-oriented. The sentence-oriented relational formalism most widely accepted across disciplines (including program specification), and with the longest history, is the predicate calculus.

The brain-dead zombies will now start chanting things about Prolog, even though it was never an implementation of the predicate calculus and tried to do things that probably should never have been attempted on a DEC-10 anyway. There are neo-zombies who will start chanting things about Erlang. Erlang is a bastardization of Prolog, which is a bastardization of the predicate calculus. The best thing I can say about Erlang is that Mozart/Oz is much worse, being a bastardization of Erlang that attempts to add relational constructs back in without undoing the damage Erlang did to Prolog -- when, in fact, they should have undone the whole mess, including Prolog, and gotten on with arranging a legitimate marriage of the predicate calculus with computers. If you are such a zombie, spare yourself the pain of reading further.
So, here is the high level idea (despite the danger of inviting Prolog zombies I'll be using its syntax for the Horn Clause):
The Idea
Parallelism spawns independent computations.
The Horn Clause:
m(A,B,C):-x(A),y(B),z(C).
expresses AND parallelism spawning 3 independent computations.
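A minimal sketch of that AND-parallel reading, using Python threads as a stand-in for an OS scheduler (the goal functions x, y, z are hypothetical stand-ins for the clause's body goals, which share no variables and so can run independently):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for the goals x(A), y(B), z(C).
def x(a): return a > 0
def y(b): return b % 2 == 0
def z(c): return c < 100

def m(a, b, c):
    # AND parallelism: the three goals share no variables,
    # so each is spawned as an independent computation.
    with ThreadPoolExecutor(max_workers=3) as pool:
        futures = [pool.submit(x, a), pool.submit(y, b), pool.submit(z, c)]
        # m succeeds only if every conjunct succeeds.
        return all(f.result() for f in futures)

print(m(1, 2, 3))   # True: all three goals succeed
```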
The Horn Clause document:
m(A):-x(A). m(A):-y(A). m(A):-z(A).
expresses OR parallelism spawning 3 independent computations.
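The OR-parallel reading can be sketched the same way (again with hypothetical clause bodies x, y, z and Python threads in place of a real scheduler): the three clauses for m/1 race, and m succeeds as soon as any one of them does.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Hypothetical stand-ins for the three clause bodies of m(A).
def x(a): return a > 0
def y(a): return a % 2 == 0
def z(a): return a < -100

def m(a):
    # OR parallelism: each clause for m/1 is an independent
    # computation; m succeeds as soon as any clause succeeds.
    with ThreadPoolExecutor(max_workers=3) as pool:
        futures = [pool.submit(g, a) for g in (x, y, z)]
        for f in as_completed(futures):
            if f.result():
                return True
    return False

print(m(7))    # True: the clause x(7) succeeds
print(m(-3))   # False: no clause succeeds
```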
In an operating system, parallel computations are scheduled for execution, allocating resources according to priorities.
There are also computations which cannot be scheduled until the computations upon which they depend complete. The Horn Clause document:
m(A,B,C):-m(A),m(B),m(C). m(A):-x(A). m(A):-y(A). m(A):-z(A).
expresses 3 AND parallel computations, each depending on 3 independent OR parallel computations.
This kind of data-dependency suspension of scheduling is also handled by operating systems.
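Combining the two sketches above (same assumptions: hypothetical goal functions, Python threads for the scheduler) shows the suspension: m/3 spawns three AND-parallel goals, each an OR-parallel m/1, and cannot deliver its own result until the computations it depends on complete -- here, blocking on the futures is the suspension.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Hypothetical clause bodies for m/1.
def x(a): return a > 0
def y(a): return a % 2 == 0
def z(a): return a > 1000

def m1(a):
    # OR parallelism: m/1 succeeds if any of its clauses succeeds.
    with ThreadPoolExecutor(max_workers=3) as pool:
        return any(f.result()
                   for f in as_completed(pool.submit(g, a) for g in (x, y, z)))

def m3(a, b, c):
    # AND parallelism over three goals, each itself an OR-parallel m/1.
    # m/3 is suspended until the computations it depends on complete:
    # the futures are the dependency, .result() is the suspension.
    with ThreadPoolExecutor(max_workers=3) as pool:
        deps = [pool.submit(m1, v) for v in (a, b, c)]
        return all(f.result() for f in deps)

print(m3(1, 2, 3))      # True: each m/1 has a succeeding clause
print(m3(-1, -3, -5))   # False: m1(-1) has no succeeding clause
```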
By focusing on these constructs:
- AND parallelism
- OR parallelism
- Scheduling
- Dependency suspension
a radical reduction in semantic complexity can be realized.
Tools
Seymour Cray once said that much of engineering creativity comes from using old tools in never-before intended ways. The same is true of anything. New understanding of a thing's use is a way to create a new tool. Indeed, even when creating a new thing-in-itself as a tool (the ordinary means of creating a new tool), what comes first is its desired use. It is harmful to think about the fact that your hammer can be used as a paper-weight when you are pounding a nail into a piece of wood with a rock.
With that in mind, let us properly use the Horn Clause:
- Branching is properly scheduled parallelism. CPUs already do this with speculative look-ahead instruction threads, which are aborted when the branch resolves the other way.
- Looping is either AND parallel recursion or it is properly scheduled OR parallelism.
- Class hierarchy is properly scheduled polymorphism.
- Polymorphism is OR parallelism.
- Name space determination is word-sense disambiguation: among the various clauses for the same predicate, it is embodied in the particular choice that enjoys logical success.
- Exception handling is properly scheduled OR parallelism.
- A database row is cached AND parallelism.
- Numbers are duplicate row counts, dimensioned by the conjunction of the dimensions of their columns (some of which may themselves be duplicates when, for example, a dimension is squared), where addition is OR (adding rows) and multiplication is AND (adding columns). Negative row and column counts are a result of the kind of negation in logic required for quantum algorithm specification.
- A database table is cached OR parallelism.
- Triggers void caches and originate with user input upon which all computations ultimately depend.
- User observation demands computations.
- Eager evaluation is driven by imputed (future) user inputs and observations.
- Lazy evaluation is a failure of eager evaluation.

And finally:
- While caches exist and are under user observation, a change in user input state does not stupidly void caches. It first compares the cached value with the recomputed value, and proceeds to void dependent caches -- thereby allowing dependent computation to proceed otherwise -- if and only if there is a change in value.
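This last point can be sketched as a dependency-tracked cache with early cutoff. The Cell class below and its fields are illustrative assumptions, not anything from the text: each cell recomputes when a dependency changes, but voids its own dependents only if its recomputed value actually differs from the cached one.

```python
class Cell:
    """An illustrative cached computation in a dependency graph."""
    def __init__(self, name, compute, deps=()):
        self.name = name
        self.compute = compute          # function of the dependency values
        self.deps = list(deps)          # cells this cell depends on
        self.dependents = []            # cells that depend on this one
        self.value = None               # the cache
        for d in self.deps:
            d.dependents.append(self)

    def refresh(self, log):
        log.append(self.name)           # record that this cell recomputed
        new = self.compute(*[d.value for d in self.deps])
        if new != self.value:           # compare cached vs recomputed value
            self.value = new
            for d in self.dependents:   # void dependents only on a change
                d.refresh(log)

source = Cell("source", lambda: 5)                      # user input
parity = Cell("parity", lambda v: v % 2, deps=[source])
report = Cell("report", lambda p: f"odd={p}", deps=[parity])

log = []
source.refresh(log)         # initial fill: all three cells compute
print(log)                  # ['source', 'parity', 'report']

source.compute = lambda: 7  # new user input with the same parity
log = []
source.refresh(log)
print(log)                  # ['source', 'parity'] -- parity recomputed
                            # but did not change, so report's cache survives
```

The cutoff is the `new != self.value` comparison: the downstream `report` cell is never voided because the recomputed `parity` value is unchanged.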