It was a much simpler era, when there was little need to worry about platforms other than Windows+IE.
You were part of the problem.
Microsoft SQL:
- select top 100 * from table instead of select * from table limit 100
- White space after values is ignored ('Bob' = 'Bob ')
- Command-line client sucks
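You can see the contrast without installing SQL Server at all: SQLite, available from Python's standard library, is in the LIMIT camp and, like PostgreSQL, compares strings byte for byte, so trailing whitespace matters. A quick sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name TEXT)")
conn.executemany("INSERT INTO t VALUES (?)", [("Bob",), ("Bob ",)])

# SQLite uses LIMIT, as PostgreSQL does; SQL Server wants TOP instead:
rows = conn.execute("SELECT * FROM t LIMIT 1").fetchall()

# Unlike SQL Server's padded comparisons, trailing whitespace is
# significant here, so only one of the two rows matches:
hit = conn.execute("SELECT count(*) FROM t WHERE name = 'Bob'").fetchone()[0]
print(hit)  # 1, not 2 -- 'Bob ' is a different string
```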
Oracle:
- A column of type date is actually a timestamp. There is no column type that stores just a date.
- Command-line client sucks
- expensive
MySQL:
- You can quote strings with single ticks, double quotes, or backticks
- The MyISAM engine
- The query cache is keyed on the literal text of the select statement, rather than its meaning, so slightly rewording your query will skip the cache, and updating a single row clears the cached results for the whole table. This is inferior to my understanding of PostgreSQL's shared buffer cache, which keeps frequently read data in memory, evicts only the pages that change, and is consulted after the query is parsed, so it doesn't depend on the query being written the same way twice.
It's no wonder so few web developers fully exploit the power of the database, reimplementing many of its features in PHP, poorly. I once went to a local PHP meeting. The leader gave a talk, mainly about object-oriented programming, which I never got into. He also recommended some kind of job-queue application, for tasks like emailing new users a welcome message. Don't use your database for that, he said, because keeping track of who you've emailed in the users table would upset MySQL's delicate query cache. At the end of the talk, I asked the group of 20 or 30 who among them had used PostgreSQL. Nobody.
Like others have said, most web developers should probably use SQLite. It's great not only as an embedded database but also as the backend for most of the little web apps out there. If you're writing business applications for a large company, use PostgreSQL. The rest can go to the dumpster.
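To make that concrete, here is roughly all the persistence code a tiny guestbook-style app needs with SQLite: no server process, no configuration, just Python's standard library. The schema is a made-up example, and this sketch uses an in-memory database; a real app would pass a file path instead.

```python
import sqlite3

# SQLite is just a library: the whole "database server" is this file
# (or, as here, a throwaway in-memory database for demonstration).
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE IF NOT EXISTS entries (
                  id     INTEGER PRIMARY KEY,
                  author TEXT NOT NULL,
                  posted TEXT DEFAULT CURRENT_TIMESTAMP,
                  body   TEXT NOT NULL)""")
db.execute("INSERT INTO entries (author, body) VALUES (?, ?)",
           ("alice", "First post"))
db.commit()

# The page handler would render the most recent entries:
latest = db.execute(
    "SELECT author, body FROM entries ORDER BY id DESC LIMIT 10").fetchall()
```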
In other news, Microsoft is also renaming Windows to something else, although they're not sure what. The version number will start at 20 or higher, though, to further distance it from Windows 10.
Microsoft is also seeking to ditch the names Bing and Microsoft.
I agree and disagree with the writer of the article.
On the one hand, there are a lot of silly rules floating around. The reason you shouldn't end a sentence with a preposition is that Latin doesn't. In fact Latin can't. The same goes for why you shouldn't split an infinitive. The infamous double negative was accepted English centuries ago, just as it is still acceptable in Spanish, French, and many other languages. I've come to think of it as a parity bit: since one simple word flips the meaning of the whole sentence, it's better to put it in twice.
But on the other hand, one of my favorite books is The Elements of Style. To its credit, it doesn't bother chiding writers over ending a sentence with a preposition. It doesn't even advertise itself as a standard-bearer of "proper" English. It is mainly a collection of common-sense tips for improving your craft. Its most famous advice is to "omit needless words." It goes on to show you how to write clearly rather than in a wishy-washy way. In short, how to serve the reader, and help him understand the information without wasting his time.
It's 14/3/2015 in the sane world.
14 has no meaning outside of the month. 14 literally means the 14th day of the month. The month is the context. The fourteenth day of what? March. So when people say dates, it is perfectly normal to begin with the context and then the day. "What's the date?" "It's the month of March, and we are in the 14th day of it."
Why not then begin with the year, since that is the even broader context? "The year is 2015, in the month of March, on the 14th day." Because when people make appointments with each other, they do not normally make them more than a year ahead. "When shall we meet again?" "April 20th." It would be so unusual to make an appointment for April 20 of some other year, that we leave off the year. When we have to give a year --- "When was the Battle of Antietam?" --- we are already in the habit of saying month and then day, so we stick with it, and append the year, "September 17, 1862."
It's 14/3/2015 in the sane world.
Nope, it's 2015-03-14 in the sane world.
How do you say it? 3/14/2015 goes along with how Americans say dates: "March fourteenth, Twenty Fifteen." I suppose the European way, 14/3/2015, with its nice descending specificity, corresponds to saying "The fourteenth of March, Twenty Fifteen," or even, as I've heard it said, "Fourteen March, Twenty Fifteen."
But in normal conversation do you say anything like 2015-03-14, like, "I will see you again on Twenty Fifteen, March 14"?
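Nobody says it out loud, but the machine-friendly argument for 2015-03-14 is that it sorts correctly as a plain string. A quick Python sketch of the three styles:

```python
from datetime import date

d = date(2015, 3, 14)

iso = d.isoformat()           # ISO 8601, big-endian: '2015-03-14'
us = d.strftime("%m/%d/%Y")   # American:             '03/14/2015'
eu = d.strftime("%d/%m/%Y")   # European:             '14/03/2015'

# The ISO form is the only one where lexicographic order is
# chronological order -- sorting the strings sorts the dates:
dates = ["2015-03-14", "2014-12-31", "2015-01-02"]
print(sorted(dates))  # ['2014-12-31', '2015-01-02', '2015-03-14']
```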
TV is dead anyways
Not only are people cutting the cord, but the new generation is not even buying the large screen to put in their living room. Instead, they're using their laptops and tablets.
If my work didn't give me a laptop for free, I would be tempted to snap up a new Chromebook Pixel.
The self-anointed tech pundits are all scratching their heads. "Why such a luxurious laptop to just browse the web?"
"Just browse the web." That's the first lie. Web browsers, especially Chrome, no longer just browse the Web. The browser is no less than a modern GUI toolkit and practically a whole operating system. HTML5 specifies that web browsers can run background processes, work offline, open and save local files, stream video, support instant chat, draw raster and vector artwork (<canvas> and SVG), and put up a large variety of widgets from just a little bit of code.
Chromebooks don't just browse the Web, and they aren't useless offline --- or rather, Windows and Macs are just as useless offline, the way we use them today. About the only thing I'm still waiting on in a Chromebook is an offline video editor. Everything else --- word processing, spreadsheets, drawing, photoshopping --- is now available and pretty good. In fact, I think those apps are better, maybe just because they're newer, made by programmers who are wiser.
And who wouldn't want all the nice things in a Chromebook Pixel: a solid build, a nice screen, a good keyboard, long battery life. The only point I agree on is that the processor is a waste, for most people. I would rather Google had gone with an ARM processor while keeping everything else the same, resulting in 24-hour battery life. I would rather get away with forgetting to charge my laptop one night than have that much speed.
Why not just run apps natively then, instead of in the crappy browser environment? Oh, right, Google lock-in.
Do apps written in JavaScript lock you into Google?
It makes sense that a device that requires you to use the Google office apps rather than native apps would require considerably more memory and power.
Yes, it's ridiculous, but think of it like this: how efficient can a Google spreadsheet be, implemented in JavaScript, the DOM, and XML, which in turn sit on various abstraction layers that eventually get down to C++ and some kind of linkage to the native widgets of the underlying OS, compared to a Microsoft or GNOME spreadsheet implemented directly in C++, with a little abstraction but not a lot between that C++ and the OS?
TL;DR: A device that forces you to run desktop apps inside a web browser will always need more power than a device that allows optimized apps to run.
Are you forgetting the other Chromebooks, all implanted with low-end processors? The Pixel is noteworthy because it's overkill. James Kendrick writes, "My old Acer C720 Chromebook had budget hardware when released, and still runs Chrome OS well." (Okay, his "old" Chromebook came out just a year ago. But still, it has a Celeron. Others have ARM processors.) The consensus is that Chromebooks are snappy no matter the hardware.
Just like the year of Networking, it will never happen. If it happens, it will just keep creeping up until you notice it's everywhere, and then you'll look back and wonder when the year of X was.
Will this be the decade of Linux gaming?
Life is a healthy respect for mother nature laced with greed.