Neither the X station nor the dumb terminal "receive pre-rendered content".
You're not entirely right. They both receive partly pre-rendered content. A dumb terminal receives data about which character to display at a certain (relative) screen location, but rendering the character shapes is done by the terminal itself. An X terminal receives parts of the display content pre-rendered as bitmaps; it doesn't invent any of the content. It's true that neither receives the exact, full display contents every time (e.g. at the refresh rate); they have to re-assemble the full display from incremental updates received from the mainframe/X client.
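A minimal sketch of that "re-assemble from incremental updates" idea, using a toy framebuffer and a damage-rectangle update. The class and update format here are hypothetical illustrations, not the actual X protocol wire format:

```python
# Sketch of a display server assembling a full screen from incremental
# updates. The rectangle format is invented for illustration; it is not
# real X protocol code.

class Framebuffer:
    def __init__(self, width, height, fill=0):
        self.width, self.height = width, height
        # One integer per pixel; a real server would store RGB bitmaps.
        self.pixels = [[fill] * width for _ in range(height)]

    def apply_update(self, x, y, rows):
        """Copy a pre-rendered rectangle (a list of pixel rows) into place.

        Only the damaged region is transmitted and redrawn; the rest of
        the screen keeps whatever was assembled from earlier updates.
        """
        for dy, row in enumerate(rows):
            self.pixels[y + dy][x:x + len(row)] = row

fb = Framebuffer(8, 4)
fb.apply_update(2, 1, [[7, 7], [7, 7]])   # a 2x2 bitmap from the client
print(fb.pixels[1])  # [0, 0, 7, 7, 0, 0, 0, 0]
print(fb.pixels[0])  # untouched row: [0, 0, 0, 0, 0, 0, 0, 0]
```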
And for years, Linux was still the only mainstream OS with good 64-bit support. The only thing holding users back was a couple of proprietary desktop applications that are now finally becoming fully obsolete. System administrators have been able to run fully 64-bit Linux on their servers for, what, 8 years or so.
Back in 1999 (yup, a full 13 years ago) I was using my desktop Linux (Red Hat Linux 5.1) on a DEC Alpha, with a fully 64-bit kernel and user space. We ran Linux on some serious DEC Alpha machines instead of running DEC's own UNIX implementation (OSF/1) because it actually behaved better.
After I changed employers in 2002, I had to downgrade to the i386 version (it being much more stable than amd64 at the time).
A somewhat rough example: if you eat, say, two Big Macs per day, each costing you $1, why doesn't McDonald's offer you all-you-can-eat for $60 per month? That would make their income the same as it is today, right?
Because many users would over-consume and/or start throwing food away (even more than they already do), which means much higher costs for McDonald's.
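Back-of-the-envelope numbers for the analogy. The per-burger cost to McDonald's is an assumption I'm inventing just to make the point; the $1 price and $60 flat rate come from the comment above:

```python
# Why flat-rate pricing breaks if consumption rises: a toy model.
# At $1 per Big Mac and 2 per day, the $60/month flat rate matches
# metered revenue only while consumption actually stays at 2/day.

PRICE = 1.00        # $ per Big Mac, from the example above
COST = 0.60         # assumed cost to McDonald's per Big Mac (made up)
DAYS = 30
FLAT_RATE = 60.00   # $ per month, all-you-can-eat

def monthly_profit(burgers_per_day, flat=False):
    revenue = FLAT_RATE if flat else burgers_per_day * PRICE * DAYS
    return revenue - burgers_per_day * COST * DAYS

print(monthly_profit(2))             # metered: 60 - 36 = 24.0
print(monthly_profit(2, flat=True))  # flat, same consumption: 24.0
print(monthly_profit(5, flat=True))  # flat, heavy user: 60 - 90 = -30.0
```

The flat plan is only revenue-neutral if nobody changes their behaviour; one heavy user wipes out the margin of two normal ones.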
The fact is that many heavy users of unlimited (or nearly unlimited) plans abuse the bandwidth they've been given.
Another common misconception is that telcos still pocket absurdly high profits. They do fine, but their profits have dropped a lot. A decade ago, most profits in mobile telecommunications went to telcos and telecom equipment manufacturers; only a small share went to VAS providers and handset manufacturers. Recently things have turned upside down: most money goes to VAS providers (Google and co., through advertising money) and smart handset manufacturers (Apple, Google, Samsung, ...).
In short: as a telco industry slave (I work for them) I can tell you that things are not nearly the same as in general IT, so one cannot make direct comparisons.
It's not the actual transferred bits that cost the telco. It's the overall capacity that costs, and if overall consumption is kept under a certain threshold, the capacity of the infrastructure can be kept lower and thus cheaper.
Did you ever notice that 100Mb Ethernet switches generally cost somewhat less than 1Gb ones, and that 10Gb ones are significantly more expensive? An Ethernet switch is not a consumable; it's infrastructure (among other things).
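A toy illustration of "capacity costs, bits don't": what you pay for is the port tier that covers your peak demand, not the bytes that flow through it. The per-port prices below are made up for the sketch:

```python
# Infrastructure cost is driven by the peak capacity you must provision,
# not by the volume transferred. Prices per port are illustrative only.

SWITCH_PRICE = {100: 50.0, 1000: 80.0, 10000: 900.0}  # $ per port, by Mb/s

def port_cost(peak_mbps):
    """Return the price of the cheapest port tier covering the peak."""
    for tier in sorted(SWITCH_PRICE):
        if peak_mbps <= tier:
            return SWITCH_PRICE[tier]
    raise ValueError("no tier big enough")

# Two users can move the same monthly volume at very different cost:
# a steady trickle fits a cheap port, a bursty user forces the 10G tier.
print(port_cost(80))    # 50.0  -> steady 80 Mb/s fits a 100 Mb port
print(port_cost(2500))  # 900.0 -> bursts to 2.5 Gb/s need the 10G tier
```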
Those bastard telcos actually want to get their investment paid back in a decent time frame, and also to make some profit. The old (copper) infrastructure paid back its investment long ago, while the new infrastructure (fiber, wireless) has not yet.
If the global economy wasn't in such a precarious state, gas would be over $5/gallon *now*! In 2032, $10/gallon gas will be a fond memory.
That's US petrol prices you're talking about. In Europe, the petrol price is in the vicinity of 1.50€ per litre (likely even more), which is in the neighbourhood of $7/gallon.
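The conversion behind that figure, assuming a US gallon (3.785 litres) and a roughly 2012-era exchange rate of $1.25 per euro (the exchange rate is my assumption):

```python
LITRES_PER_US_GALLON = 3.78541
USD_PER_EUR = 1.25  # assumed 2012-era exchange rate

def eur_per_litre_to_usd_per_gallon(price_eur_per_litre):
    """Convert a fuel price in EUR/litre to USD per US gallon."""
    return price_eur_per_litre * LITRES_PER_US_GALLON * USD_PER_EUR

print(round(eur_per_litre_to_usd_per_gallon(1.50), 2))  # 7.1
```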
I'm not sure how many people face this situation, which leads to duplicated effort for the same updates. While I have set up a wiki (http://statusnow.wikispot.org), its usefulness depends on contributions from people with similar experiences at other organizations. Do you think this is a problem worth solving?"
Somewhat tangential, but are there dialects/programming languages in your locale that use semicolons to separate parameters in method invocations?
I wouldn't know. I'm not a huge fan of CS translations and localizations, at least of those that go beyond proper localized output of data (numbers, dates). I've seen examples of an actually translated programming language (Pascal, to be exact), which made me sick. Which means that whenever I program, I end up typing a comma as the parameter separator and a dot as the decimal separator. I hate using a localized version of an OS (Linux or Windows); it backfires on me when I use its calculator.
Also, the use of semicolons to separate function arguments is an annoying difference from Excel. Why not just use the same format? Was it patented? Most of the rest of the UI tries to be Excel-like... so why this difference?
Actually, my Excel uses semicolons to separate function arguments. I always wonder why all the examples in FAQs, tutorials etc. insist on commas. Really: everybody who uses a comma as the decimal sign (think Germany, and remember StarDivision) uses a semicolon as the separator sign. I guess the OOo devs just adopted it as the default/only option so as not to mess with different locales.
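This is also why semicolon-separated CSV exists: in locales where the comma is the decimal sign, it can't double as the field separator. A small sketch of parsing such a file (the file content is invented for the example):

```python
import csv
import io

# A German-locale spreadsheet export: decimal commas, semicolon fields.
data = "Posten;Preis\nKaffee;2,50\nKuchen;3,75\n"

reader = csv.reader(io.StringIO(data), delimiter=";")
header = next(reader)
total = 0.0
for name, price in reader:
    # Convert the locale's decimal comma to a dot before parsing.
    total += float(price.replace(",", "."))

print(header)  # ['Posten', 'Preis']
print(total)   # 6.25
```

Had the fields been comma-separated, `2,50` would have split into two columns; the semicolon keeps the decimal comma unambiguous.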