* memory management is explicit [merriam-webster.com] -- what does this mean?
Quantifying the Performance of Garbage Collection vs. Explicit Memory Management
Automatic vs. Explicit Memory Management: Settling the debate
* deterministic [merriam-webster.com] -- what does this mean?
I thought it was self-evident. Here is a discussion of the matter.
* endemic [merriam-webster.com] use of a garbage collector... -- what does this mean?
Pervasive would be a better word. It refers to languages that make garbage-collected allocations for most or all things. For example, in Java, aside from primitives, all allocations conceptually occur on a garbage-collected heap.
reference-counted heap objects
Reference counting: counting the number of references to an object.
Heap: an arena of memory maintained by a memory allocator. Also, CPUs typically have no knowledge of how software manages heaps. You may be thinking of virtual memory.
Objects: object in the generic sense of some amount of memory managed on a heap. These lecture notes show the same usage. The editors of this page also use the word 'object' in exactly the same manner when discussing pointers. It's not that hard to follow.
Putting it together we have objects on a heap for which reference counts are maintained; reference-counted heap objects.
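A minimal sketch in modern Rust terms, where `std::rc::Rc<T>` is precisely a reference-counted heap object (the function name is mine, for illustration):

```rust
use std::rc::Rc;

// Two handles to one reference-counted heap object.
// `Rc::strong_count` reads the current reference count.
fn rc_count_demo() -> usize {
    let first = Rc::new(vec![1, 2, 3]); // allocate on the heap; count = 1
    let second = Rc::clone(&first);     // count = 2; no deep copy happens
    let _ = &second;
    Rc::strong_count(&first)            // both handles alive: returns 2
}

fn main() {
    assert_eq!(rc_count_demo(), 2);
}
```

When the last handle goes out of scope, the count hits zero and the allocation is freed, with no garbage collector involved.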
"exchange" heap -- what does this mean?
* "local" heap -- what does this mean?
The link I provide to Patrick Walton's blog would get you there. Also, there is documentation. Sorry if discussing a new programming language involves terms you haven't heard. Computing can be like that sometimes.
(note: there is only one "heap" on most CPU architectures, so now we have added abstraction)
Now you are definitely confusing heaps and virtual memory. There are usually many, possibly thousands, of heaps on a system at any given time, with many distinct implementations of which the CPU is entirely ignorant. Memory allocators and virtual memory are different things.
* via an "owned" pointer -- what does this mean?
Similar to a C++ auto_ptr or unique_ptr. Again, the link I provided would get you there.
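For the curious, a sketch in modern Rust terms: the old `~T` owned-pointer syntax corresponds to today's `Box<T>`, which behaves much like C++ `unique_ptr`: exactly one owner, ownership transfers on move, and the heap memory is freed when the owner goes out of scope.

```rust
fn boxed_sum() -> i32 {
    // `Box<T>` is a uniquely owned heap allocation, analogous to
    // C++ `std::unique_ptr<T>` (old Rust spelled this `~T`).
    let owned: Box<[i32; 3]> = Box::new([1, 2, 3]);
    let moved = owned;      // ownership transfers to `moved`
    // owned[0];            // compile error: use of moved value `owned`
    moved.iter().sum()      // heap memory is freed when `moved` drops
}

fn main() {
    assert_eq!(boxed_sum(), 6);
}
```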
* wild pointers -- what does this mean?
Dangling pointer and wild pointer are synonymous.
Use of the exchange heap is exceptional and explicit yet immediately available when necessary -- what does this mean?
I provided a link directly to a discussion of this.
Memory "management" is reduced to efficient stack pointer manipulation -- uhh, what? the language sits around modifying content at %esp and %ebp along with some offsets? sounds far from efficient)
Incrementing and decrementing the stack pointer register is very efficient. Offsets are computed at compile time, and the instructions typically require one CPU cycle and no memory access, given a naive model of a CPU. These techniques are ancient and ubiquitous. Sorry you weren't familiar with them.
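To make that concrete, here is a sketch (function name mine): every local in this function lives on the stack, so "managing" its memory amounts to the compiler adjusting the stack pointer once by a compile-time-known amount on entry and restoring it on return.

```rust
// No allocator calls, no per-variable bookkeeping: each local sits at a
// fixed, compile-time-computed offset from the stack/frame pointer.
fn stack_only(a: i32, b: i32) -> i32 {
    let sum = a + b;        // stack slot at a fixed offset
    let product = a * b;    // likewise
    sum + product           // frame is released by resetting the stack pointer
}

fn main() {
    assert_eq!(stack_only(3, 4), 19);
}
```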
or simple, deterministic destruction -- what does this mean?
Others seem to have no difficulty with these terms. In particular, they are not compelled to link merriam-webster at each use, for some reason.
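Since we're here, a short illustration of deterministic destruction in modern Rust: a value's destructor runs at a point you can read directly off the source, not whenever a collector gets around to it. The `Noisy` type and its log are mine, purely for illustration.

```rust
use std::cell::RefCell;

// A type that records its own name when it is destroyed.
struct Noisy<'a> {
    name: &'static str,
    log: &'a RefCell<Vec<&'static str>>,
}

impl<'a> Drop for Noisy<'a> {
    fn drop(&mut self) {
        self.log.borrow_mut().push(self.name);
    }
}

fn drop_order() -> Vec<&'static str> {
    let log = RefCell::new(Vec::new());
    {
        let _inner = Noisy { name: "inner", log: &log };
    } // scope ends: `_inner` is destroyed right here, deterministically
    let outer = Noisy { name: "outer", log: &log };
    drop(outer); // explicit early destruction, equally deterministic
    log.into_inner()
}

fn main() {
    assert_eq!(drop_order(), vec!["inner", "outer"]);
}
```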
(note: 2nd use of word "deterministic")
Reusing words is an important feature of language.
Compile time checks preclude bad pointers and simple leaks so common with traditional systems languages -- what does this mean and how does this work, considering that the value stored at a pointer (or what it points to) can be manipulated at run-time, so how would the language "deterministically know" (see what I did there?) what's "bad" vs. "good"?
"bad", "wild" or "dangling" pointers are memory safely faults or violations. It is an feature inherent to Rust that they can't exist. Feel free to learn about it.
* ... that is productive, concise
Holy shit! An opinion on Slashdot? Say it isn't so.