It's unlikely our language processing is over-engineered, as we certainly lost some other neural processing in the trade-off (this episode of Mind Field covers it in some depth https://www.youtube.com/watch?...). Most animals already have some language and tone interpreters built in, so language interpretation need not evolve at exactly the same time as the ability to accidentally make sounds that more uniquely communicate your intention. However, one can imagine the survival benefit of being able to quickly learn the sound mannerisms of the ruling family and neighboring tribes, and thus those already having superior interpretation skills were likely selected for via evolution, in much the way we selected for similar genes in border collies (see https://www.youtube.com/watch?...). Moreover, it seems that the capacity for complex thought (and thus the ability to further extend language) is just an artifact of being taught to name more things during development (https://www.youtube.com/watch?v=gMqZR3pqMjg). Similarly, our ability to perceive things can be limited by whether we were taught a name for them (https://www.youtube.com/watch?v=mgxyfqHRPoE).
Most research I've read compels me to believe that everything in the brain is stored by association, which is both efficient and allows for some redundancy at the same time. There is no single node responsible for the perception of a "zebra" or "tiger" (as the "Jennifer Aniston Neuron" paper would lead you to believe); rather, provoking a single neuron can lead to recall of the major chords associated with that neuron, much like playing a 'C' on a piano might make one think of the C major chord, unless one is feeling sad, in which case you might first think of C minor. (Emotional processing is why there will likely be various resolutions of a brain back-up, the first one just being the map of default connectivity, and the others how the map changes with emotion.) The same redundancy exists in lower-level processing like retinal computation (see https://www.youtube.com/watch?... for a primer), likely because the interconnectivity allows for better denoising of the system. Thus, while a single neuronal node (like the Aniston Neuron) might be able to stimulate the recall/perception of a 'stripe' or 'loop', in undamaged HUMAN circuits perception never really works this way.
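To make the "one note recalls the whole chord" idea concrete, here's a toy associative memory in the Hopfield style. This is a sketch of the general principle, not a model of actual cortex, and the patterns and labels are invented for illustration: the stored patterns live in the connections between units rather than in any single unit, so a corrupted cue still pulls back the whole pattern.

```python
import numpy as np

# Two made-up 8-unit patterns; neither lives in any one unit.
patterns = np.array([
    [ 1, -1,  1, -1,  1, -1,  1, -1],  # "zebra"
    [ 1,  1, -1, -1,  1,  1, -1, -1],  # "tiger"
])

# Hebbian storage: each stored pattern strengthens the connections it uses.
n = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0)  # no self-connections

def recall(cue, steps=5):
    """Let the network settle from a (possibly corrupted) cue."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

# Corrupt one unit of the "zebra" pattern; the network restores the rest,
# much like one note evoking the whole chord.
cue = patterns[0].copy()
cue[0] = -cue[0]
print(np.array_equal(recall(cue), patterns[0]))  # True
```

Note also the redundancy: because the pattern is smeared across many connections, damaging any single unit degrades recall gracefully rather than deleting the memory.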
Reading is not always the best way to convey higher-level concepts. While reading does provide a more rapid way to convey information, because your brain isn't distracted by the overhead of interpreting body language and tone, one's ability to recall information is more directly correlated with having an actual or imagined experience (especially an emotional one), which is why we have lab classes in science and need to actually write some code in order to learn how to program.
Your final question seems to be based on the false premise that we don't use the auditory cortex when reading, and thus I cannot address it directly. Many of the same auditory processes are still involved in reading (https://www.sciencedirect.com/science/article/pii/S0960982213000055). Similarly, while the brain is very plastic, it tends to keep the processing of like things together in a similar way across humans; our visual cortex and navigational processing seem to happen in the same areas, but if vision is removed, some of this area can be repurposed for storage (most likely of where something was, or what it felt like). Moreover, there seems to be much more evidence that perception and memory storage happen at the network-group level rather than at the individual node. Memories of perceptions are stored/experienced like chords played on a piano; only when we try to recall do we run into a scenario where a single neuron/note can be linked to what one might describe as a memory, because the brain automatically tries to play the appropriate chord to harmonize with it. This is the basis of why it's so easy to implant false memories.
I'd begin with an introductory example of how a human makes a decision and how artificial neural nets were designed to mimic that sort of weighted decision tree.
I'd then go on to explain what inhibitory and excitatory nodes are and how changing the weights in a weighted tree can produce what we consider a stored memory.
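The weighted-decision idea above can be sketched with a single artificial "neuron": positive weights play the role of excitatory synapses, negative weights the role of inhibitory ones, and changing the weights changes what the node effectively "remembers". The scenario, inputs, weights, and threshold here are all invented for illustration.

```python
def neuron(inputs, weights, threshold=0.0):
    """Fire (return 1) when the weighted sum of inputs clears the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total > threshold else 0

# "Should I carry an umbrella?"
# Excitatory evidence: it's cloudy, rain is forecast. Inhibitory: it's sunny.
inputs  = [1, 1, 0]             # cloudy=yes, rain forecast=yes, sunny=no
weights = [0.6, 0.9, -1.2]      # learned weights; retuning them rewrites the "memory"
print(neuron(inputs, weights))  # 1 -> carry the umbrella
```

With sunshine on (`inputs = [0, 0, 1]`) the inhibitory weight dominates and the neuron stays silent, which is the whole trick: the same node encodes different behavior purely through its stored weights.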
I'd follow with some examples about how our conscious mind is really only designed to process the difference between the expected and the actual outcomes in our environment, and why this would make sense evolutionarily.
I'd convey the dogma in neuroscience around memories being encoded in our brain's synapses, then introduce the concept of a connectome as a graph of the entire set of those connections.
I'd talk about the interfaces being developed in DARPA's NESD project. I would then talk about the current destructive mapping technologies being used in the Lichtman lab and in the Blue Brain Project. And then I'd discuss that while DARPA is on a good path for brain restoration, none of these read technologies will help you in the event of a head injury or Alzheimer's.
I'd talk about the macro multi-network approaches to reading using MRI. I'd demonstrate their power by showing a short clip of auditory decodings of people listening to spoken words in an fMRI and discuss why there's an echo. I'd follow with a similar published example for visual data.
I'd finish the lecture by discussing how I envision solving the high-resolution read problem, and how a connectome backup will likely be sold at various levels of resolution (like VHS, DVD, and Blu-ray). Then I'd discuss my current projects to create synaptic-level non-destructive imaging technologies and how, once fully proven, they could give us an even more nuanced map.
Cough, unblocked.krd, cough.
@ThePirateProxy lists the latest domain.
Perhaps the reason for the huge drop in thankful users was their decision to stop supporting their second most desperate set of users, those running KDE?
While I understand Gnome 2 users are probably their bread and butter due to their fork into Mate, there are plenty of other stable Debian-based distros that provide stable Gnome 3 and Xfce based environments that 'just work'. But just as having an up-to-date Gnome 2 environment makes people EXTREMELY thankful, there really isn't another Debian-based KDE distro that 'just works'. So if you prefer a KDE desktop, sadly, your options are to either deal with broken Debian-based KDE distros (such as KDE's own Neon distro), abandon KDE, or move to the RPM-based openSUSE.
While it's obvious from the long delays in Mint's KDE rollouts compared to other environments that KDE has never been a real focus of Mint, I can personally attest to how thankful KDE users are to Mint for all their years of support. Thanks guys!
It's a Linux distro. Seriously. It's not as if you had the cure for cancer and then lost it or something.
I do cancer research, and many of us use Mint as a primary desktop because it is easier to install and maintain current versions of the tools we need than the Windows (or worse, Mac) software we'd otherwise use for our data analysis. Sure, my institution might have an expensive license for a very specific piece of software that is slightly easier to use once you learn it; the problem is I'll have to learn it, and when I leave in 2-3 years to work somewhere else with better resources, they aren't likely to have a license for that same package. So in effect, Mint is helping cure cancer, in the same way your water utility is helping your local brewery brew beer: sure, they could truck in the water, but it would make things A LOT more difficult.
I was working directly in the medical diagnostics space, and Theranos (and Calico) made it impossible to get grant funding for electrochemical bioassays for newly discovered targets, despite the path to development being extremely straightforward and unlikely to fail (even from our own institution, which had money allocated for this but never doled it out), let alone get a nibble of interest from an outside investor. It's sad, because by now there'd be an at-home test for cancer recurrence if it weren't for the inflated press that came with these guys getting their $.
AKA legalized bribery.
I work in vaccine research. It's not just the general public that's been tricked into thinking the flu vaccine is efficacious; it's our doctors. While in about 1 in 5 seasons they pick the correct strain, and during that season there can be herd immunity, even in those seasons the vaccine isn't effective for everyone who gets it, and thus true herd immunity would never be possible. Moreover, the side effects of flu shots are notoriously under-reported; many people often feel slightly sick afterwards as part of the intended immune response the shots provoke. Thus, the idea that millions of people should be MANDATED to get a shot every year that often makes them feel sick, so that they might not get very sick 1 year in 5 / so that they can protect old people and children who are too young to be out in public anyway, seems like a losing proposition to me.
The reason the government pushes you to get it despite its poor efficacy is that they are worried about the odd year where the vaccine does work, there's an outbreak, and a sudden unmet surge in demand causes supply to run out (as has happened in the past). Thus, in order to avoid a public health failure, they've agreed to pre-purchase enough vaccine to cover the majority of the public. As a government, once you already have the vaccine on your shelf, of course your policy changes to get doctors to push it on everyone, so that the patients with the worst anxiety disorders will already be vaccinated before there is an epidemic and a few unnecessary deaths might possibly be prevented, but it's mostly the former rather than the latter.
This is logical enough that I also always encourage someone to get a flu shot if they are considering it (and I get it myself). However, mandating it clearly hurts public trust; you should have a choice as to when and whether you get a shot with a 1 in 5 chance of working. Conversely, there should be close to no choice with respect to vaccines with a superb track record of efficacy, such as the rest of the vaccines (including the one for HPV) that are often required before entering school.
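As back-of-the-envelope arithmetic for the "1 in 5" argument, you can multiply the two probabilities involved. Both numbers here are illustrative assumptions for the sake of the calculation, not measured values: a strain match roughly 1 season in 5, and partial per-person effectiveness even in a matched season.

```python
# All figures below are assumed for illustration only.
p_strain_match = 0.2        # assumed: correct strain picked ~1 year in 5
p_protected_if_match = 0.6  # assumed: per-person effectiveness in a matched year

p_protected = p_strain_match * p_protected_if_match
print(f"Chance a given year's shot protects you: {p_protected:.0%}")  # 12%
```

The point of the exercise is that the two factors compound: even a generous per-person effectiveness figure gets diluted by the strain-match lottery, which is what makes the year-in, year-out mandate a hard sell.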
Systems programmers are the high priests of a low cult. -- R.S. Barton