My dream environment = perfect representation of data in flexible/dynamic objects in a programming language, disconnected or connected to databases with nearly identical, flexible and dynamic data model representation, with a powerful query language (SQL-like), the scalability of the new generation of shared-nothing architectures, simple connectivity options (simple sockets all the way up to REST) and the reliability of a relational database's ACID properties.
Amen. Your storage layer shouldn't dictate your usage patterns; quite the opposite, actually. But domain entities seldom conform to a single usage pattern -- there's one shape for OLTP, another for OLAP, another for realtime indices, and so on. Having to maintain myriad representations of an object just to accommodate different persistence patterns is wasteful.
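To make the "myriad representations" point concrete, here's a minimal sketch in Python, with a hypothetical `Order` entity invented purely for illustration. One domain object ends up hand-mapped into three different shapes, one per store, and every schema change has to be propagated to all of them:

```python
from dataclasses import dataclass

# Hypothetical domain entity (not from any real codebase),
# just to show how one object fans out into per-store shapes.
@dataclass
class Order:
    order_id: int
    customer: str
    total_cents: int

order = Order(order_id=42, customer="acme", total_cents=1999)

# OLTP: normalized row keyed for transactional updates
oltp_row = {
    "id": order.order_id,
    "customer": order.customer,
    "total_cents": order.total_cents,
}

# OLAP: denormalized fact record, units converted for aggregation
olap_fact = {
    "order_id": order.order_id,
    "total_usd": order.total_cents / 100,
}

# Realtime index: flattened text document for search
search_doc = {
    "id": str(order.order_id),
    "body": f"{order.customer} order {order.order_id}",
}

# Three hand-written mappings for a single entity -- the waste
# the comment above is complaining about.
print(oltp_row)
print(olap_fact)
print(search_doc)
```

In the "dream environment" described above, the database's data model would be close enough to the in-language object that these translation layers mostly disappear.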
Often when something is banned from the marketplace and its replacement is significantly more expensive, you will find the people who profit from the added cost were among those lobbying for the ban, if not drafting it.
I haven't dug into the details behind this particular case, but I wouldn't be surprised if utility or manufacturing patents are involved in the price increase.
Wait, who gave the EPA the authority to ban drugs?
I don't know the nuances of the limits to their authority, honestly. But if a bureaucratic agency 'bans' something they don't have the authority to take out of the marketplace, what can they do to manufacturers, distributors and retailers who continue to make and move the product?
It seems that producers think they have to launch lawsuits when a bureaucracy oversteps its authority. Why not put the onus back on the bureaucracy to stop them?
They imagined it, they were fully aware of the possibility and propensity for rulers to abuse their powers and collude against the best interests of the governed, and they tried to put two crucial things in place to prevent it: Checks and balances, and limitations of powers.
Once we demanded that politicians have the authority to fix things, we also gave them the power to rig things. There's no way around that. If your ability to remain employed depends on the generosity of donors, and the generosity of donors depends on how beneficial you are to them, the system you erect will naturally pull towards oligarchy.
Being in the business of owning patent portfolios and not doing anything with them should be 100% non-viable.
If you added an exception for the original inventor, you might be onto something. There's a well-established business model around inventing something worthwhile and monetizing it through licensing deals. However, if you're not business savvy, it can take an inordinately long time to navigate the myriad decisions needed to get an invention made.
If you could limit damages anyone else could collect for infringement -- by tying them to actual manufacturing under the patent, whether by the patentholder or by a licensee -- it could achieve the objective you're going for, without threatening the business model that fosters a lot of the innovation we see.
Unless key prediction gets *much* better than what I've seen on my phone, it seems that I'd quickly learn to ignore any hints given by the keyboard, since more often than not it would be wrong.
I shared your opinion until recently, so I was surprised to see how much better prediction has gotten with alternative keyboards on my Android device. SwiftKey is all about prediction, and it learns quite quickly. It has a decent training set right out of the box, but a week later it's night & day.
Swype isn't as sophisticated as SwiftKey with next-word prediction, but the idea of tracing in lieu of keystrokes is great. The first beta was almost unusable, but after trying beta 2, I switched and I'll probably never go back to key-tapping.
I think smarter keyboards will be a short-lived phase though; voice recognition has really come of age in the past few years, and when it works it's far more efficient than even the most accurate predictive keyboard. (Well, unless it predicts your whole next paragraph, I guess...)