D'oh - NSA, not NASA. Never mind - feels like an old SNL sketch there.
Suddenly those spent costs no longer seem like they should have cost as much.
And those lessons learned? We should have just known those!
It's why industry refuses to spend anything on basic research anymore. SOO inefficient, and with priorities that make no sense to some random consultant or investor.
Pff - NASA, I could do better than that! Here - I'll just make up an ideal, say, random number generation that I just happen to have a library of code on, and WOW - I do SO MUCH BETTER than them. Not impressed, NASA, not impressed.
I don't even have to bother understanding the ideals that their code was actually built towards!
Neurons work primarily in terms of communicating - I'd say they're basically communicating machines as much as muscles are movement machines. They store states, query other neurons, take external inputs, and work together to do virtually everything an animal can do, as a macroscopic being. As they grow, they have to figure out their particular role based on their inputs and outputs.
So, why can't we just query them for their contents? With stories like this, we're making artificial nerves - shouldn't there be some way we can signal the nerves, push some simple neurotransmitters, and experiment until we get enough signal+noise to figure out the 'language'? Even in simple creatures, it seems like we should be able to do this enough to ask a neuron its contents, then query neighbors, until we at least get a loose map of queryable resources.
Every once in a while I search Google Scholar and the like to see what folks are doing along these lines, and I never seem to see anyone take this approach, or even attempt to reach for mechanisms of this form. But if we can see, learn, and imagine in real time, there has to be at least some analogue of an informational query system we can use - static, fixed-purpose neuron maps just wouldn't make sense, even with the scale, even with specialization.
No - no, it could NOT be! Those zero-sum *whackos* got to Slashdot too! It's not true I tell you - everything is a positive-sum game, where the more you reward the rich and *deserving*, the more resources just *exist* to better serve the sheer excellence of the intentions of those in the market!
Entropy is a lie! Hope must win! If we only *trust* in the market enough, it WILL provide! Rational skepticism will only doom us all!
And with enough sarcasm, I might *just* be able to express how little a surprise this bit of news is!
I know, it DOES sound absurd, and in practice, it is. Now, it's "only" around 800 individual products actually delivered fresh each year, but because I'm having to touch and test older games as a part of that process, I'm in effect coding for thousands of shipped products per year, just to make it as sane as possible to continue each product line.
And yes, that means each day, I'm jumping between 15 minute mini-projects, reviewing and raising issues on design documents, throwing together project directories and rapidly configuring them, throwing those project into automated testing suites of tools (which I'm also cross-developing), testing the various inputs/outputs of other teams to make sure nothing will prevent delivery to spec.
I'm making active progress on around a dozen separate projects each day, contacting clients as needed to hammer out shared documents, then reacting to rare but important issues as they are raised.
I code for thousands of mostly-unique commercial software products a year, using 8 languages (mostly C#), for many dozens of major customers, and lots of smaller ones.
Because of this, I have a huge chain of demands I keep track of, and methods of automation in order to collectively manage a constant flow of data requirements, and of course tracking issues both shared and common between these scenarios.
When I'm coding, I've got to code in a way that communicates these details to myself, consistent between all the languages I might have to touch for coding, scripting, database, reporting, and specialized languages a client may suddenly require.
Because of that, my code has to be a loose framework, a late-binding train station of logic, where demands may switch at any moment, and limitations imposed from other teams may similarly pop up.
My code is littered with multi-paragraph discussions of a technology I once had to interact with (customers often switch back), large sections of functions commented out rather than deleted, and other 'bad' practices just to give me landmarks and a 'flavor' of what a customer is occasionally interested in, amidst a never-ending avalanche of context switching between products and customers.
I've redesigned these several systems from the ground up once (they used to handle only a small fraction of the work, using an antiquated language), and am working with a team to do a better design... but it's been very difficult for a team of perfectionists to understand how to react to an unlimited flow of changing requirements. Fortunately, the code itself has been quite usable, and they're using the same languages, but no system can really handle these demands truly consistently - I'd call it NP-ridiculous. It's basically the "mythical man-month" writ live, where I've got to do my work, and train a team whose work process may never really be able to do what I can do - definitely healthier long term, but it can't help but result in some amazing process failures.
I actually would have made most of these design changes myself, but at the time, I was forbidden by management from making those choices, since I was doing my work directly at the production level - so it's actually a bit of a relief to see someone at least allowed to make some of the better choices.
In short (and yes, for this scenario, this is short), because I'm doing alone, for years, what a team of almost any size would struggle to approximate, as many of us seem to be doing, I've got no choice but to code how I need to in order to have a system that I can sanely maintain under an insane set of requirements. There's not really a choice in the matter, if you're put in a position where "oh, we suddenly need this" exists as a live production task in a growing industry.
Milquetoast centrist put in the vice president slot. Courage required: 0
At least the guy is well educated and experienced.
Not a dramatic choice - but a solid guy all the same. Would be justly called pretty conservative most places outside the US.
I'd have much preferred an Al Franken or Elizabeth Warren emotionally - but see the virtue in a low-key centrist technocrat.
Perhaps he's exciting by virtue of being boring in this environment. Get the guy training with some comedians before the debate, and a few good lines with low expectations could have OK results.
In other news: No news is news, in this news cycle. Which isn't news, with 24 hour news.
A large portion of our genome (and that of virtually all other life) is composed of virus-inserted code.
To a virus, life isn't really a thing to begin with, only DNA interactions, with rare opportunities to copy.
From that perspective, death of the host body just means it's bacteria party time, and even if 99% of organelles used to copy are kaput, almost all viruses are bacteria-predators anyway. So, hiding away in human DNA for a few hundred generations or whatever is just a distraction from getting to the (ambiguous) goal of a bacteria to infect.
So, now that they're not suppressed, some random stretch of virus code passively sends a request to the organelles to write a copy of itself for the 83-billionth time, and this time the message doesn't get scrambled. All this happens trillions of times, infecting perhaps millions of bacteria that manage to escape, which spread off into the world to keep the messy process going.
Niches for DNA code are massively multidimensional, and even though the possibility space for success is outrageously sparse, the life that lives in the outer reaches of possibility doesn't have to be intelligent enough to know it's a bad idea, and so spreads where we can't imagine. Things like life that only has the chance to reproduce every few hundred years (using another life form's mechanisms to keep its DNA active in the meantime), or has to jump between 3 species in order to complete a full reproduction cycle.
Heck, the only reason we can move around and talk and stuff is because some odd other microlife got mixed in with an ancestor's cells to become mitochondria. With that, we can live away from immediate energy sources, and use sugars. To this day, bacteria are constantly mixing DNA with each other, getting into the oddest combinations, with some help from viruses, who get everyone else involved in the party.
And from a microscopic perspective, we're mostly mobile apartments for bacteria, that protect the bacteria/helpful viruses we like from the bacteria/viruses that tend to wreck the apartment. Fortunately, most bacteria are boring tenants, and most viruses only target bacteria.
Microsoft makes very good developer tools - but the decision on how to make them, unfortunately, has constantly been shaped by a LONG series of internal political decisions on what products they want to be promoting at that very second.
DirectX, Crystal Reports, 'Modern' (Metro to everyone else), the .NET framework, phones, MS-specific Java, etc., etc... Some of them ended up good ideas, some of them were just what they wanted to push to capture some market. It's a big part of why it's actually something of a bad idea to just read Microsoft documentation on their own products - they'll tend to color every piece of introductory material in light of what they want to promote at that moment, what they want YOU to do for THEM.
They also tend to stall on some technologies that they feel will shift resources in a way that won't help them at that very moment. Shifting to 64-bit developer tools might be a bit expensive to test binary interoperability with everything - and Microsoft also hasn't found a very good consistent method of hosting 32-bit DLLs into 64-bit executables that isn't just some piped communication between different process spaces. I can understand their reluctance to commit resources into something they're not sure they can make work seamlessly.
But really, I've seen a lot of my Visual Studio exe memory usage stats floating up to the large-address-aware 32-bit limit. Even if it doesn't meet 100% compatibility/interoperability, I think it would still be a good idea to start an experimental option of a set of 64-bit build environments. Perhaps with embedded 32-bit memory spaces that you can host stuff in without loading errors, if at all possible - but 64-bit-only restrictions if that's not possible would also be acceptable.
Seriously, this could all be parody, and at this point, no one would be able to tell.
It's kind of like recent decades of the political process: Take normal political lying, intersperse it with assurances that "Oh, now we're going to make EVERYTHING better - government is not the solution, WE ARE... when we're the government, that is."... then they get in, and it's like 10x more cynical rules being passed.
That said, pessimism is misleading too. PLENTY of scummy businessmen have dreamt of pushing these same models, but were rejected soundly by smaller customer bases - it just takes longer for Microsoft to fall the same way IBM and other scummy folks did.
Also, for politics, if you look back at the ages of yellow journalism, the populace was truly more deeply ignorant, and the politics even more cynical, with death a much more common side effect of that cynicism - things are genuinely better, which actually makes it relatively shocking to see some small degree of backsliding toward a less classically liberal path. Despite the 'Overton window' of recent decades and news, we're actually amazingly liberal in terms of actual policies, with no real sign of that stopping.
But yeah - this crap with Windows quadrupling down on their spyware-like 'upgrade' practices is in the same vein - an amazing throwback to scummy ideas I'd thought the 'marketplace of ideas' had rejected so soundly that everyone should still remember not to use them.
I guess we have to keep relearning those things.
First off, those engines will only run at full power at the very start of the journey, if even then, to get to, well, _cruising_ speed - which is around 22 knots, or about 25 miles per hour. It IS a lot of fuel to use in any case - but per person, it's not as bad as these blind numbers in headlines suggest.
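Just as a sanity check on that conversion (my own throwaway sketch, not from the article):

```python
# 1 knot = 1 nautical mile per hour; the standard conversion factor to mph.
KNOT_TO_MPH = 1.15078

def knots_to_mph(knots: float) -> float:
    """Convert a speed in knots to miles per hour."""
    return knots * KNOT_TO_MPH

# Typical cruise-ship cruising speed of ~22 knots:
print(round(knots_to_mph(22), 1))  # -> 25.3
```

So the "around 25 mph" figure checks out for a 22-knot cruising speed.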
Bulk shipping by large ship is actually a pretty efficient method of transporting our stuff. Yeah - they often use the nasty fuel when they can get away with it - but in terms of per-unit cost, it really isn't that bad at scale. The entire transportation industry DOES need to get off carbon fuels - but compared to the fuel used to give everyone groceries and trade, the impact of vacation resources isn't that large a cost. People always eat; the extra fuel to eat on this boat isn't a very large extra percentage.
I don't think it's terribly productive to label folks taking vacations as wasteful, when really, it's our entire current system that needs to get its resource usage into a sustainable state.
I think if you compared it to environmentally 'friendly' activities like touring Alaska's wildlife, it uses far less fuel per person.
Doesn't matter if it's got a wonderfully elegant underlying structure, it's still a FUNGIBLE, TRADEABLE RESOURCE, with a set of mechanisms controlled by PEOPLE.
If a company town somewhere offered the perfect elegant scrip system, where demand is balanced against resources to ensure complete fairness in the system... if people are in control of it, they're going to find some way to exploit their position over others with that system.
Even though it was getting popular, it's still a niche currency, and it was going through a mania stage, like tulips did 300 years ago. Its very instability is an important key to reaching adoption, but also why it would be very difficult to reach large scale in its current form.
The problem is that it's a cryptocurrency - and like town scrips, anyone can invent one. And people are motivated to create them, in order to live at the vortex of managing a currency. But you can't stabilize a monetary system like that, since they're both cryptic and competing.
Even if a government were to create one, its very cryptic nature would make it feel very much like a 'McBuck', a secretly controlled game that feels fed by advertising as much as any real features. All currency is a form of shared manipulation, but cryptocurrencies are a different kind of artifice, in the same way computerized voting is flawed compared to paper voting.
Ah, financial news - a place where you can openly make statements like: "Unless the US changes its laws to give me lots of money, I can't help but foresee DIRE, DIRE things happening. Financial catastrophe would be putting it lightly."
Well, WELL past the point of Poe's law.
Perfect song for the car.
"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year. Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years."
- Gordon E. Moore, April 19, 1965
It's both cost and density, and it continued to be so as it crystallized into the "transistors doubling every 18 months" figure. Doubling density only at exorbitant cost would not really be an increase in accessible technology. It's not just the technology being invented and monopolized, but being rolled out and usable by the entire field. Increasing computational complexity is still the most important part - but cost has always been part of the idea too.
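For what it's worth, the doubling rate Moore quotes above compounds fast - here's a throwaway sketch (my own numbers, just illustrating the math) of how his 1965 observation of roughly 64 components per chip projects forward at a doubling per year:

```python
def components(start: float, years: float, doubling_period_years: float) -> float:
    """Components per chip after `years`, doubling every `doubling_period_years`."""
    return start * 2 ** (years / doubling_period_years)

# ~64 components in 1965, doubling yearly, projected out 10 years:
print(round(components(64, 10, 1)))  # -> 65536
```

That ten-year projection to ~65,000 components by 1975 is essentially the famous extrapolation from his original paper; stretch the doubling period to 18 months and the same formula gives the slower curve the law later settled into.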