It doesn't, really. Current IP laws do not protect new, aspiring authors or small companies; they protect established players. The same is true of most rules, actually. This is one of the reasons why, when you create a company, you have to grow big fast. The fastest to grow big wins.
It seems hard to believe that Africa will sustain four times as many people as it has now, given the state the continent is already in.
There are globally close to 2 billion people, and growing, with their own computer, power plug and working flushing toilet: 400 million in North America, 500 million in the EU and surrounding countries, plus Russia, plus Japan, plus Korea, probably at least 100 million in China, 100 million in India, and smatterings everywhere else. This works out to about 25-30% — nowhere near the 1% the GP is referring to.
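Doing the tally roughly (the Russia/Japan/Korea/"elsewhere" numbers below are my own rough guesses, the rest are the figures above, and a world population of ~7,000 million is assumed for the era of the comment):

```python
# Back-of-the-envelope check of the figures above, in millions.
named = 400 + 500 + 145 + 125 + 50 + 100 + 100 + 80  # roughly 1.5 billion
world = 7000
low = named / world   # lower bound from the named figures
high = 2000 / world   # "close to 2 billion and growing"
print(f"{low:.0%} to {high:.0%} of the world")  # roughly 21% to 29%
```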
To understand more advanced maths, one needs to build familiarity with basic algebra, arithmetic, geometry, logic and abstract reasoning, to name a few. For most people this is very alien and requires learning by rote, reinforcement, teaching attention to detail and so on. Most mathematics does not tolerate approximate reasoning. It is a very exacting science, and even bright mathematicians sometimes make spectacular mistakes (e.g. Poincaré).
I agree the way math is taught now is pretty lousy, but I don't have a really great way to teach it better handy. Mostly it is a question left to the teacher: a great math teacher can do wonders with the material, even in the early grades.
In Episode V we have consistently great dialogue, especially with Han Solo, excellent action, a sputtering Millennium Falcon, a really badass lightsaber and Force fight, and of course, "Obi-Wan never told you what happened to your father", etc. We also have much improved special effects compared with the wireframe rendering of the Death Star in Ep IV, and Darth Vader gets really angry with the military brass. Best of all, there are now Ewoks.
CF is not hot in her slave-girl costume? Come on!
At the time it was innovative in many ways.
It already does unless you've forcibly removed OSX.
Indeed a great upgrade at less than $100 these days.
Thanks, interesting document, found here. The audio is really bad at the beginning and fluctuates throughout the talk. The interesting bit you refer to is 21 minutes in.
I've tried to transcribe what he said directly from the audio:
The 16-bit design gave us a megabyte of memory. The 8086 has a 20-bit address; it is really a segmented 16-bit data path with segment registers that are really indexes. It is a 1-MB address space. And in this original design I took the upper 384K and dedicated it to video memory, the ROM and I/O. And that left 640K for general-purpose memory. And that leads to today's situation where people talk about the 640K barrier, the limit to how much memory you can put in these machines. I have to say that in 1981, while making those decisions, I felt like I was providing enough freedom for 10 years. That is, a move from 64K to 640K felt like something that would last a great deal of time. Well, it didn't. It took only 6 years before people started to see that as a real problem.
Fortunately, there is a reasonable solution. Intel has moved forward with its chip families; the 286 chip, introduced in 1984, moves us to a 24-bit address space (mumbles about segmented indirection being not that good). That is sort of an intermediate milestone. In 1986 we moved up to the 386, where we get a full 32-bit offset to these segments that have been designed into this architecture. So what we have is a machine that can address 4GB of RAM. And I have to say, with all honesty, I believe that it will take us more than 10 years to use up that address space.
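For anyone unfamiliar with the segment:offset arithmetic he's describing, it can be sketched in a few lines (Python just for illustration; the 640K/384K split is from the talk, and 0xB800 as an example segment is the standard PC text-video convention):

```python
def physical_address(segment: int, offset: int) -> int:
    """8086 real-mode addressing: the 16-bit segment is shifted left
    4 bits (i.e. multiplied by 16) and added to the 16-bit offset,
    yielding a 20-bit address -- a 1 MB physical address space."""
    return ((segment << 4) + offset) & 0xFFFFF  # wraps within 1 MB

assert physical_address(0xA000, 0x0000) == 640 * 1024  # the 640K boundary
assert (1 << 20) - 640 * 1024 == 384 * 1024            # upper 384K reserved
```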
So he never says that exact quote, but one can understand why people picked it up. Essentially, BG thought in 1981 that 640K would be enough for everybody for a long while. Note that he was reasonably prudent regarding using up the 32-bit address space (that ship has sailed now).
Later, regarding memory, he says that computers should have about 1MB of RAM per MIPS. Specifically, he goes on to say that machines with 30-60MB of RAM would be desirable soon (this was in 1989).
In this talk he covers many things, most of them pretty insightful in fact: OS design, multitasking, parallelization, multi-processor designs, dynamic linking, object-oriented design. Amusingly, he talks at length about OS/2 in a very positive way — this was before Windows 3, of course. He compares OS/2 and Unix, saying that OS/2 will take over the desktop and Unix the servers, and all other OSes will die out. He talks about the FSF, saying its goal of creating a free Unix-like OS is doomed.
Some interesting comments on that talk here.
Sorry, hit send too soon; replying to myself, bad form.
In my case, when I had long commute times, I sometimes solved bugs in my head that had eluded me all day while pounding on the keyboard trying to find out what was happening. When you are forced away from the screen and keyboard and must think for yourself, without any help from documentation or a debugger or anything else, you may realise that your assumptions were wrong from the start and your design was not optimal, for instance.
Now I have a more academic job where I sometimes have to come up with mathematical proofs, or at least things like novel algorithms that are not tied to producing lots of code in an editor. More often than not, they form by themselves in my head while I'm doing something mindless and apparently unproductive: taking a shower, driving, grocery shopping, etc. Mind you, this only happens after a lot of work (on paper, on the computer, or just thinking), but the final step often somehow clicks when I'm not thinking about it.
So unstructured, *boring* time is essential for many tasks. So is having fun: many people like driving.
Lots of ifs in your post, and assumptions too. While I'm not disputing that driving is largely non-productive, setting aside some non-productive time in your schedule is also good. Be it driving (if you enjoy it), staring at clouds, or walking — empty your mind in some way. Humans are not machines whose productivity is a linear function of the hours they work in a day. It is likely that when you were driving to
You are perhaps thinking of London, Singapore, Hong Kong, Beijing, Tokyo, Sydney or Paris. These cities are criss-crossed by effective public transport systems (including taxis). However, this mostly works for the downtown area; a bit further afield, you'll find that buses and trains exist but don't run very often, and taxis become expensive and don't want to go there. So it is possible to get around without owning or renting a car, but not always easy or convenient. My experience has been similar in the Bay Area, Los Angeles, New York, Boston, Chicago, and other North American cities with a non-ridiculous public transport system.
However, it is a question of mindset. My European friends tend to think of using public transport first when they come to visit, wherever I've been, and most of the time it is there and it works (mostly). In many Asian cities it would be a very bad idea indeed to drive a car yourself. In North America, visitors tend to rent a car when they arrive and drive themselves around when they can.
Public transport develops as demand increases. European and Asian cities are by and large not well designed for car traffic, so developing a public transport system was a natural response to population growth. It requires a lot of coordination between local governments, and a lot of investment over the years. In America, cities are relatively easy to navigate and spread out, so self-transport is a natural solution that does not require as much investment and coordination from local authorities.
If you read the Forbes article the parent linked to, you will see that in this case IBM behaved exactly like a patent troll.
Really? If I had enough money (say $10 million) to live on without requiring a salary, I would set myself up as an associate researcher at some famous university like Stanford or MIT — or probably a much smaller one, perhaps not in the US but still good, with less turf war and less admin — propose a research prize in CS/applied mathematics, and fund a few bright kids' PhDs. Taking on one new student every year works out to about $200k per year. Pretty cheap. No outcome pressure, no need for expensive equipment (a small number of fast computers, some offices). Bright and fun colleagues would be a plus. I could do that forever. The kids would have their own agendas and limited time, so research outcomes would probably just flow, in the right environment.
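Sanity-checking my own "$200k per year" figure (the per-student cost and PhD duration below are my assumptions; only the $10 million and the one-new-student-a-year cadence are from above):

```python
# One new student per year, each staying ~4 years, means ~4 concurrent
# students at any time.
endowment = 10_000_000          # the $10M above
phd_duration_years = 4          # assumed
cost_per_student_year = 50_000  # assumed all-in (stipend, fees, overhead)
concurrent_students = 1 * phd_duration_years
annual_outlay = concurrent_students * cost_per_student_year
years_without_returns = endowment / annual_outlay
print(annual_outlay, years_without_returns)  # 200000 50.0
```

Even ignoring investment returns, the capital lasts 50 years; with modest returns it could plausibly run indefinitely.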
In research it is possible to get funding of up to a few million, but you can't invest it, you can only spend it, and you have a limited time in which to do so. This is too bad.