How could you forget Neverwinter Nights? It gave more than modding support; it was a complete toolkit for creating whole new games. Bioware published details of all the data file and model formats, and the community was awesome. Some modules were so good that Bioware provided a forum for developers to sell them.
That was completely destroyed with Neverwinter Nights II, where Atari decided to go with a game engine that used proprietary models and obscured data file formats so they could try to sell modules and add-ons themselves, which failed miserably.
Would love to have a toolkit like that again, with an enhanced 3D engine and an improved scripting language. I spent many hours playing on custom servers and designing my own modules.
This frustrates me to no end. We do not have to get CO2 levels down to 350 ppm to be "safe," as you put it. Rising CO2 levels are not a direct threat to your life; they change the distribution of flora and fauna across the planet and potentially alter coastlines and weather patterns.
CO2 becomes toxic at a concentration of about 1%; at 10% it can cause respiratory paralysis, which can lead to death. Current CO2 levels are around 387 ppmv, which equates to roughly 0.0387%. So suffocating in an open space due to atmospheric CO2 would require those levels to reach roughly 10,000 ppmv, about 26 times what they are now, and there is no science that says CO2 levels will increase that much.
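A quick way to sanity-check the conversion: 1% is 10,000 ppm by definition, so the required increase from today's level works out to about 26x rather than a round 30x. A throwaway check (the 387 ppmv figure is the post's own; the class name is invented):

```java
public class Co2Check {
    // ppm to percent: 1,000,000 ppm == 100%, so divide by 10,000
    static double ppmToPercent(double ppm) {
        return ppm / 10_000.0;
    }

    // factor by which today's level must rise to hit the 1% (10,000 ppm) mark
    static double toxicFactor(double currentPpm) {
        return 10_000.0 / currentPpm;
    }

    public static void main(String[] args) {
        System.out.printf("387 ppmv = %.4f%%%n", ppmToPercent(387));      // 0.0387%
        System.out.printf("increase needed: %.1fx%n", toxicFactor(387));  // 25.8x
    }
}
```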
Now, yes, a storm or a rise in sea level could change your coastline and wash away your house, but coastline changes and storms have been happening ever since weather patterns first formed on this planet.
So yes, maybe anthropogenic sources of CO2 are causing concentrations to rise faster than they otherwise would. But that is not a direct threat to you; indirect and inconvenient, perhaps, but it doesn't make you any less safe than you are now.
Colleges and Universities (at least in the US) exist to support colleges, universities and professors. And I have heard former professors say the same thing, not just people like me.
The university system does not prepare students for work in the real world; it simply teaches them some basic theory. It isn't until people get out of school and go into an apprenticeship model (depending on the career path) that they learn anything useful. The college system did a great job of convincing HR managers that they should require college degrees even when one often isn't needed. All the degree shows is that the candidate was willing to spend 4-5 years in a classroom.
I hit a glass ceiling 10 years ago. The company I worked at (where I was considered one of the top technical leaders, if not the top) said I could not get promoted without a degree, so I went and got a BS in Computer Science. I took classes with graduate students who literally did not know how to open a file stream in C++ and read individual words out of the file; I had to show them during labs. And these were the same people who would apply for jobs I had posted, claiming they had master's degrees and deserved higher salaries. The head of the Computer Science department asked if I would consider coming back and teaching after I graduated.
What we need in this country is to go back to the guild/apprenticeship model for people who plan to practice a trade. If you want to teach or do research, let the universities focus on that. But if a person wants to implement, let OJT be the way to go. Stop requiring four-year college degrees, and stop penalizing highly skilled practitioners who learned their trade on the job instead of sitting in a classroom.
Consensus governance will likely never work in diverse societies; there are just too many opposing views and opinions. The places where it does work tend to be very homogeneous (aboriginal tribes in Canada, for example).
Some governments have limited forms of consensus, but simply stated, there are too many people who are too lazy to get involved and become learned in the issues. They'd rather listen to platitudes and 30-second attack ads and pull a lever.
Actually, I think the United States started out correctly, where each State was supposed to be allowed to govern its citizens however those citizens decided. The union of states was only meant to define a single entity to the outside world; each State was supposed to be sovereign. Prior to the war between the States (which was completely un-Constitutional; there was not, and still is not, any prohibition in the Constitution barring a State from seceding, so by the 10th Amendment a State has that right, but I digress), we were referred to as These United States; afterwards we became The United States.
Looking at what is happening around the world, where there are riots and protests by those who have lived off government redistribution that can no longer be sustained, I wonder if the problem is simply that large, diverse populations cannot exist as a single nation, and whether small city-states where like-minded people can congregate might be a better long-term survival model.
Wait, I'm confused. You make the point that a basic income has no precondition of working, then you put currently available handouts at around $600 and say that a basic income in your view would be $1,000 (only about $400 more), but you also say that if that isn't enough, people could work for more.
So isn't this exactly how things are today? There is a group of people living off basic handouts from the government, and a group of people who have decided they want more than the basic, so they work for more. But now the first group is screaming that it is unfair that the working group has more and that the government should take it away and spread it around.
So maybe I'm missing something, but what you describe is exactly what is happening right now, except you left out the part where those on the basic level who don't want to work still want more.
As others have pointed out, the Preamble does not bestow any powers, it is merely an introduction.
Now, inside the body of the Constitution, that phrase also appears in the Taxing and Spending Clause (Article I, Section 8, Clause 1), but it is always taken out of context.
The text of the clause is:
The Congress shall have Power To lay and collect Taxes, Duties, Imposts and Excises, to pay the Debts and provide for the common Defence and general Welfare of the United States; but all Duties, Imposts and Excises shall be uniform throughout the United States;
Notice what follows "general Welfare": "of the United States" (and this means the collection of States, not a federal republic).
Our nation was founded as a union of independent, sovereign States. At the time of the writing, each State had its own laws and its own rules, and many had their own currency. The Constitution was written to recognize that those States were sovereign but would come together collectively for defense and for dealing with external nations. Prior to the Civil War we referred to ourselves as these United States; after that war (which may also have been un-Constitutional, since there is no prohibition against States leaving the union, so the 10th Amendment gives them the right to do so) we became known as the United States.
So the clause refers to the welfare of the States. There is absolutely no authority granted to the federal government to deal with the welfare of the people. None of the so-called entitlement programs is backed by Constitutional authority, nor are areas like education or EPA regulations that apply only to in-state resources. Congress claims all of these powers under the Commerce Clause.
So no, the Constitution does not grant the federal government any say in the welfare of the citizens of the States.
OK, so say Oracle does end up destroying the Java community and you get your wish that Java dies: what replaces it?
Java is huge in corporate development because it provides a complete ecosystem. It is a supported platform, there are large numbers of trained developers, and it has a huge pool of good-quality external components available, from the Apache projects for example. It works.
You can go from zero to a working web service, complete with connections to a database, in a couple of hours. With a few annotations you can change that service from XML-based to JSON-based.
You can build 3D games using OpenGL that perform remarkably well, so long as your target platform supports OpenGL (not a Java issue).
And you can do all of this in one language, with one development kit and some well-known, well-defined add-on libraries, and deploy it to multiple operating systems. Or you can use a number of other languages if you prefer a different style and don't like the wordiness of Java the language; Java the platform gives you this ability.
Call it the new Cobol if you like, be all smug. Doesn't matter to all the companies using it and developers making a living coding in it.
I would really like to know what could replace this. I have been concerned since the Oracle takeover and have been trying, for example, to find an alternative for simple web services.
Today I can download, unzip, and fire up Tomcat and be ready to write code, or I can use Jetty to embed an HTTP server and servlet engine in my jar file and make it a single-jar deployment. Yes, I know, I have to install a JVM, but that is a simple download and install, and you do the same with Ruby, Python, Perl, or PHP. With C/C++ you don't need a runtime, but you have to code for cross-platform usability.
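The embedded, single-jar style can be sketched without even pulling in Jetty: the JDK ships com.sun.net.httpserver, which is enough to show the idea (Jetty's actual servlet API looks different; GreetingService, the /greeting path, and the payloads here are all invented for illustration). The payload helper also shows the XML-versus-JSON switch in miniature, driven by the Accept header the way a JAX-RS annotation would drive it:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class GreetingService {
    // Pick a representation from the Accept header; a real service would
    // let a framework (JAX-RS with Jackson or JAXB) do this negotiation.
    static String payload(String accept) {
        if (accept != null && accept.contains("application/xml")) {
            return "<greeting><text>hello</text></greeting>";
        }
        return "{\"text\":\"hello\"}";
    }

    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/greeting", exchange -> {
            String body = payload(exchange.getRequestHeaders().getFirst("Accept"));
            String type = body.startsWith("<") ? "application/xml" : "application/json";
            exchange.getResponseHeaders().set("Content-Type", type);
            byte[] bytes = body.getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, bytes.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(bytes);
            }
        });
        server.start(); // one class, one jar, no external dependencies
    }
}
```

Run it with `java GreetingService`, then `curl -H "Accept: application/xml" http://localhost:8080/greeting` to flip the representation. Port 8080 is an arbitrary choice.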
But none of those gives you the complete environment that Java and the Java frameworks offer.
So, for all of you wishing Java would go away, please, what is a complete replacement?
The part that continues to annoy me (more than the stupid name change) is how they keep taking and repackaging BBC shows (Doctor Who, Primeval, Merlin, Being Human, etc.) and advertising them as "original series."
Now, they don't directly claim these to be SciFi channel (hate the new name) originals, but they make no mention of the fact that the shows originally aired on someone else's network. It almost seems like they are violating copyrights for profit.
I've actually started watching more BBC America, History, and Discovery than SciFi; they're more interesting.
I've asked this question before, many times, and have yet to receive a reasonable answer from the climate change crowd, so I'll try again:
So what if the climate is changing? It has never been static, and it never will be. Weather patterns have changed continuously throughout Earth's history, even when humans were not present. Hell, previous changes could have been caused by all the methane released in dinosaur farts for all we know. Some areas of the world that are barren may become lush, others may become uninhabitable, and coastlines may change. Outside of the political implications of which nation is on top or not, so what? That's life, and humans will adapt.
There's a saying I've heard that goes something like:
"Software that is 50% complete and ships provides 50% more functionality to users than software that strives for 100% completeness but never gets shipped."
Getting software into the hands of users, even if it doesn't provide all the functionality they want up front, can give you a first-mover advantage. Then, as you learn what your users like or request (which is almost guaranteed to differ from your original idea once they start using it), you do iterative releases with new capabilities. By the time a competitor gets a really polished, complete system out the door, you'll already have a growing user base.
If you had better tools, you could more effectively demonstrate your total incompetence.