I have a break program on my system at work. It tells me to take a break every hour. I get up and walk a few laps around the building. I find my energy levels are better, it gives me time to think about what I'm working on, and I feel a lot better overall.
I think Intel wanted to try to scale x86 down. That's where the Atom came from. Unfortunately, they could never get its power consumption competitive with ARM's before ARM hit 1 GHz. That seems to be the speed at which processors become good enough to do most anything useful. With the Cortex-A8, the Atom was in serious trouble. The Atom is now positioned as too much for a tablet or phone (and doesn't support most Android apps) and not enough for a desktop or laptop (it can't handle more resource-intensive x86 apps, like the Windows GUI or video). That's a pretty awkward spot.
The PC is the mainframe.
No, the PC is the refrigerator. Tablets are the beds. A home needs exactly one refrigerator (more are a luxury), but it needs about one bed per person. Now consider that people have been sleeping in refrigerators for the past 20 years. Thus, the market for refrigerators is highly over-saturated, and the market for beds is seeing explosive growth as millions of people have never had one before. In the end, though, everybody still needs a refrigerator. There may come a day when they don't, but everybody knows that a refrigerator isn't a bed.
Yes, the metaphor is a bit strained.
Point being that consumers are realizing that tablets do about 90% of what they want in a PC, so they just buy tablets. That doesn't mean they don't occasionally need something for that remaining 10%. We may see tablet docks that turn a tablet into a full desktop setup, but we're not there yet. I can browse the web, watch a movie, play a song, look up information, and type an email or text on a tablet or phone. I can probably do my online banking -- although it's a bit cumbersome. I wouldn't want to write a paper, or seriously manage my finances, or do photo editing, or do my taxes on a tablet (unless I was single, had no kids, had one job which withheld taxes, and did not own a home).
Besides, all Intel has to do is make a better ARM than ARM. They did that before when AMD introduced AMD64, and now that Intel fabs ARM, they can learn the ins and outs of that, since obviously there's something there that they missed. Intel still has the most advanced fabrication plants in the world. It would be foolish to write them off so quickly.
This is because, just like the student in this story, schools have been punished for doing the right thing in the past. Teachers and administrators that go out on a limb to protect students at the cost of the district get removed by the board of education because some parent will complain regardless of what is done.
The issue is that they need to fix the goddamn wiretapping laws, and police and prosecutors need to learn some goddamn discretion.
I agree. Using the term "Computer Science" for what most degree programs teach is purely the result of the growth of the industry. 70 years ago you couldn't get a Computer Science degree. 50 years ago, you could get a Computer Science degree without ever having used an actual computer. 30 years ago, the only degree in computing you could get was Computer Science, and it encompassed the whole of the field. 20 years ago, Computer Science began to mean "software" instead of Electrical Engineering's "hardware". 10 years ago, the field was so broad, so diverse, and encompassed so many disparate technologies requiring significant specialization that you could get a specialization certificate on your CS degree. Today, you can get a 4-year Bachelor's in any number of fields, including Information Technology (sysadmin, netadmin), Information Systems (DBA, systems analysis), Information Management (management for IT), and Software Engineering (web design, application programming). Computer Science is again a theoretical area of research and development on the theory of computers. All these other fields born from CS research once again free it to be what it once was: mathematicians and logicians playing with number machines.
The school district I work at uses a messaging system which is capable of sending phone calls (pre-recorded or computer generated voice), email, SMS, and twitter. We also contact the local news agencies if the emergency requires it (school closures, etc.). We also use it for attendance calls for students with unexcused absences or tardies. Parents are signed up for phone calls (required at time of registration) and email (if given) by default, but they have to opt in for SMS.
We still have parents who don't know about school emergencies.
On the flip side, spending six weeks fixing an issue on a single server running a non-critical, non-time-sensitive service which occurs once or twice a year and is 100% worked around by a reboot probably isn't an efficient use of your time.
Programmers are human. They'll make a ton of mistakes.
Doctors are human. We hold them accountable for their mistakes. Engineers are human. We hold them accountable for their mistakes. Indeed, we hold just about everybody accountable for their on-the-job mistakes and the consequences of their mistakes result in everything from terminations to criminal proceedings.
So, when should programmers be held accountable for their mistakes, and how should we respond as a society?
Not exactly. Oxygen is a prerequisite for the process known as combustion, since combustion is an oxidization reaction. "A rapid, exothermic oxidation of a substance, called the fuel," is a reasonable definition of combustion. Usually we say the fuel is combustible.
The language you copied, fucked with, and then claimed to have the definitive version of.
Isn't that how English came about in the first place?
God forbid someone make them think about their data structures and how the end user might need to query them with their own reports.
It's a little hard to call it 'valuable intellectual property' with a straight face when they refuse to derive any value from it.
No, it isn't. It's a revised and ongoing work. They own the old version 5.1 (Win XP) and the new version 6.3 (Win 8.1). Both have much of the same code in them. So by selling the new version, they're still deriving value from the old version. If you want to buy Windows, MS will sell you a recent version containing much of the same code that was available in Windows XP.
I'm not going back to GW-BASIC, thanks.
Yes, but those are proven to decrease ticket revenue. Why do you think red light cameras are so much more popular?
Personally, I'd love to see this system running in my town. Traffic control here is about impeding drivers: the lights are timed so that as many as possible are red by the time you reach them, which keeps you from ever sustaining the speed limit. Your choice on a clear road is either to speed or to drive 10 under. It's so pervasive that nobody outside of town or in the surrounding cities likes to drive here.
C/C++ don't claim to follow relational data rules the way MySQL does. Not only is SQL supposed to raise an error if it can't do *exactly* as the user describes, it's supposed to change nothing if any of the affected rows errors. It's not supposed to be allowed to guess when the user tells it to do something ambiguous or nonsensical; it's required to throw an error in that case. Indeed, many RDBMSs error on some tasks simply because the result would be non-deterministic.
An RDBMS is not just a fancy key-value store. It's not a series of JSON or XML strings. It's a data entity rule set. Used correctly, it will not allow you to store obviously invalid data, even if the underlying datatypes allow the data as valid types.
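To make the "data entity rule set" idea concrete, here's a minimal sketch using Python's bundled sqlite3 module as a stand-in for a full RDBMS. The table and column names are made up for illustration. A CHECK constraint rejects a value that is a perfectly valid integer but an invalid *balance*, and a failed multi-row insert leaves nothing behind once rolled back -- the all-or-nothing behavior described above.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE accounts (
        id INTEGER PRIMARY KEY,
        balance INTEGER NOT NULL CHECK (balance >= 0)
    )
""")

# A negative balance is a perfectly valid INTEGER, but the rule set forbids it.
try:
    conn.execute("INSERT INTO accounts (id, balance) VALUES (1, -50)")
except sqlite3.IntegrityError:
    print("rejected: balance must be non-negative")

# If any row of a batch violates the constraint, roll back so nothing is kept.
try:
    conn.executemany(
        "INSERT INTO accounts (id, balance) VALUES (?, ?)",
        [(1, 100), (2, -1)],
    )
except sqlite3.IntegrityError:
    conn.rollback()

count = conn.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]
print(count)  # 0 -- the valid row from the failed batch was not kept either
```

The point isn't the specific constraint; it's that the database, not the application, is the last line of defense against obviously invalid data.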
Determinism is the real issue here. Imagine a compiler that produced different programs from the same source code. That's not particularly useful, is it? Well, a lot of the behavior that MySQL has let slide indicates that they don't particularly care about being deterministic. It's sloppy, and the one thing DBAs hate is sloppy, unpredictable results.
MySQL is the IE 6 of the database world. It encourages poor developer practice. It allows and even encourages lazy or downright risky developer behavior, when the RDBMS should be the element requiring the developer to think about how he stores his data and consider the ramifications of getting useful data from his system that go beyond his own needs. It has more oddball syntax than any other RDBMS, and is less likely to complain about data integrity and more likely to perform silent truncation or silent modification than even SQLite.
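As a small illustration of silent modification (using SQLite via Python's standard library because it's easy to run, not because it's the worst offender -- MySQL's historical non-strict mode went further, silently truncating over-long strings without error): by default, SQLite will happily store a text value in a column declared INTEGER. The table name here is a made-up example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")

# No error, no warning: the string is stored as text despite the INTEGER declaration.
conn.execute("INSERT INTO t (n) VALUES (?)", ("not a number",))
stored = conn.execute("SELECT n, typeof(n) FROM t").fetchone()
print(stored)  # ('not a number', 'text')
```

This is exactly the class of behavior the comment criticizes: the data went in, the declared type was ignored, and nobody was told.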