OK, you proved the GP false. Given the huge imbalance (with ~30% of the world's Great Idiots born in the US), anything based on a statistically representative distribution is bound to be false.
The number of great programmers who don't want to live in the USA.
After posting my post (of course, I got to brag before reading your opinions), I started reading how valued "multitouch" seems to be among the commenters here.
It is, granted, a game-changer that enabled buttonless phones, for better and for worse. But in a car, you want to avoid as much as possible any interface that requires your visual attention. My body knows where most of the useful buttons in my car are (and in the strange event I need to, say, switch the airflow setting, I know I can do it while stopped at a red light or something like that). I do not want a car that invites me to do things I should only do with my full attention on them.
Nor do I want my neighbour driver's car to provide such abilities, of course.
This sounds exactly like the tech used by Hewlett-Packard in the mid-1980s (here in Mexico; maybe it was known earlier elsewhere) for their HP110 and HP150 lines. The HP110 had (25x80? Probably...) holes along the screen edge, with an LED and a receiver at opposite ends. IIRC, for the HP150 the "magic" was that the screen borders were now smooth, because the LEDs were higher-powered, and infrared instead of visible-spectrum.
I never used those machines; I remember seeing them and drooling at the finger-detecting magic.
I have been to Cuba three times; the first time (in 1999) I went to visit a friend at the Health Ministry, and they had quite a good dialup access point. Back then, dialup was still the main Internet access mode where I live (Mexico). What was lacking, of course, was computer access among the population.
There was no censorship I could find (using a regular student account). Of course, I didn't go testing everything, as I didn't want to get my host disconnected. The main issue was the limits derived from having a single satellite uplink for the whole nation. I was told the situation improved vastly after the fiber to Venezuela was laid, but I cannot comment on it first-hand.
Of course, I'd now expect a fat fiber to be laid to Florida.
If I were to get an H1-B visa, I might want to do the work you currently do for a much lower wage than yours (since I come from an allegedly poor country or something like that). So: hiring a newbie PHP developer who was born in the USA and charges US$100K a year, versus hiring a good, talented programmer who will do the same work for US$60K a year... are they on the same level only because they will fill the same job position?
(I live in Mexico, and am *not* interested in living in the USA. I have a ~US$25K yearly salary, and live quite well off it. But many colleagues have migrated to the USA, just because of that salary difference)
Right. You want to live in a free-market economy? Then people like you and me become part of the market. And it's not like getting an H1-B visa is that simple: for a non-USian, only being quite qualified and skilled can get you a work-enabling visa. Of course, were I to get a visa to work in the US, I would probably be a cheaper hire than you; so, for (supposedly) equal skills, I'd be more valuable.
So, if you push for a free market and reduced state, you'd be pushing for me to be hired over you.
Yes, we all tend to think that the things that happen to us are specific to the IT industry. However, nothing in "the H1-B debate" restricts the issue you mention to the IT sector.
This issue is not even related to immigration. If a company prefers to hire me to do $foobar because I'm better and cheaper for the job than the guy who did it before me, the company will do its best to avoid bad press. That might include paying him a bit extra so he leaves happy, or adding legal clauses to keep his mouth shut.
Of course, specific cases can be cited to say "hey, this is a specific issue for us techies, and it involves them non-USians!". But that's the way things have always worked.
Yes, but that is far from enough.
What about authors who have passed away? Whom should I write to?
What about enriching the globally available corpus of knowledge? For the things I have written, I often grab tens to hundreds of articles, read a couple of paragraphs, and casually filter most of them out. If including the knowledge vested in a paper requires me to beg a third person, it will not cross my mind, unless, of course, somebody strongly points me at it.
What about long-term archival? What if said author lost his files in a hard drive crash last year? Is that freely accessible knowledge then lost for good?
Of course, it's better than having the journal as the only source of the knowledge (and denying access to it), but it's not enough.
In Mexico City, at the end of primary school, ~1988, we did learn how to extract square roots (and covered the basics for "higher" roots). Of course, it was not something we used afterwards; in secondary school we went on to algebra, and didn't do much more pure arithmetic after that. But being able to at least estimate square roots without a computer is useful.
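For the curious, the pencil-and-paper method (extracting the root digit by digit, in pairs of decimal digits) can be sketched in code. This is my own illustrative sketch of the schoolbook procedure, not anything from the original course:

```python
def isqrt_digit_by_digit(n):
    """Integer square root via the schoolbook digit-by-digit method.

    Walk the number in pairs of decimal digits from the left; at each
    step, find the largest digit d such that (20*root + d) * d still
    fits in the running remainder, exactly as done on paper.
    """
    digits = str(n)
    if len(digits) % 2:              # pad to an even number of digits
        digits = "0" + digits
    root, remainder = 0, 0
    for i in range(0, len(digits), 2):
        remainder = remainder * 100 + int(digits[i:i + 2])
        for d in range(9, -1, -1):   # largest digit that still fits
            if (20 * root + d) * d <= remainder:
                remainder -= (20 * root + d) * d
                root = root * 10 + d
                break
    return root

print(isqrt_digit_by_digit(152399025))  # 12345, since 12345**2 == 152399025
```

The same pairing trick also gives you a quick mental estimate: the root has one digit per pair, so a 9-digit number has a 5-digit square root.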
I cannot understand why such a setup isn't more common. My workstation has two monitors: one in portrait (900x1440) and the other in landscape (1920x1080). I mostly use the portrait one to write texts and browse the Web; the landscape one is where I usually code or sysadmin from. And, of course, other stuff finds its place in different ways.
Of course. But in a course on data structures, kids are expected to develop the skills needed to write pieces of code that might seem trivial to you, but are in practice the result of decades of work. I quite enjoy reading 1960s computer science papers precisely because of that.
I teach Operating Systems. My course depends on Algorithms and Data Structures. Believe me, even though the students have just finished the course mentioned in this note (in a different university, a different country even), it is obvious from their assignments that they have not yet internalized many of the things they are supposed to have learnt. I could probably fill a book explaining the different implementations of lists or trees I have seen, or the myriad antipatterns I read on a regular basis. And that's what university is for.
In "real" work, of course, they can answer open-book in all exam^Wsituations. They can copy code from teh intarwebz. They can compare code. But first, they have to understand and internalize the concepts.
Legal issues make clear the splitting point of that hair.
Using that exact library means you include it from your project's source and acknowledge it as a complete piece of work. If your work is developed openly, you usually list it as a dependency (acknowledging the authors, and gaining the ability to link against updated versions: free updates, yay!) or hard-include it in your tree (still acknowledging authorship). If it is developed in a closed model, you can do either, but if $boss comes asking why you get shizzles every time a frobnicator is quuxed, you can point to the externally acquired code.
If you just copy-pasted a function as your own, there are many negative side effects. Besides, of course, opening yourself to lawsuits and whatnot.
You are completely right, I was over-optimistic in my numbers. So, thanks for pushing the point even further!
Even if your four 3GHz cores have a decent amount of cache, can they perform their computations without hitting the memory-bus bottleneck? Remember, the bottleneck would become even worse, because you didn't mention the memory being twice as fast as well. And, of course, the rest of the buses and peripherals would also be affected, so all waits for memory and for external I/O would effectively become twice as expensive, as seen by the processors.
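A back-of-the-envelope way to see why doubling only the clock doesn't double throughput (Amdahl's-law-style reasoning; the 40% stall fraction below is an illustrative number I made up, not a measurement):

```python
# Toy model: a workload spends some fraction of its time stalled on
# memory or I/O. Doubling the CPU clock halves only the compute time;
# the stall time stays fixed, so it dominates more and more.

def speedup(stall_fraction, clock_factor=2.0):
    """Overall speedup when only the CPU clock scales by clock_factor."""
    compute = 1.0 - stall_fraction
    return 1.0 / (compute / clock_factor + stall_fraction)

# With 40% of the time spent waiting on memory, a 2x clock gives
# nowhere near a 2x overall speedup:
print(round(speedup(0.4), 2))  # 1.43
```

Note how the formula behaves at the extremes: with a 0% stall fraction you get the full 2x, and with 100% stalls the faster clock buys you nothing.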
Of course, you could say it'd be nice to have all of the computer's components keep increasing in speed. Well, that brings another problem: motherboard sizes. Because at 6GHz, the speed of light becomes a limit as well: if, speaking in round numbers, light travels ~300,000,000 meters per second, then it takes 3.33x10^-9 seconds to travel one meter, so at 6GHz light covers only 5cm per clock cycle. I know I'm comparing apples and oranges here, as electrons don't "move" along the wire, but still: signals will only travel a fraction of that distance on an electronic circuit.
Yes, it would be easier to keep all cores happily going along without programmers having to learn to master concurrency. But we are hitting physical barriers, and they do not give way easily.