ASUS will outlive Acer, but does it really matter?
Unless you're a gamer, you're wasting your money buying a desktop (whatever form factor). Before long reasonably priced laptops will run games well, too, and the desktop PC will be effectively dead. I've been building/maintaining towers since 1991, and I said goodbye to all but one machine this month. I don't know why I kept it. I turn it on once a month.
Maybe you want a desktop for storage. Laptops are shipping with more than 1 TB of storage, and you can replace a desktop with one or two USB 3.0 enclosures with 4 TB (or larger) 7200 RPM drives for a few hundred bucks.
Eventually laptops will be dead, too. A more interesting question might be who will be the last laptop vendor and when will nearly all people finish the switch to tablets, phones, watches, or perhaps nearly invisible computing.
Once your net worth is sufficient that you no longer have to earn a salary, you want to switch the majority of your earnings to investment income.
Investment income is taxed at the capital gains rate, whereas regular income is taxed at the much higher income tax rate.
This is the simplest explanation you can read, but it's one reason why the rich keep getting richer. We've set up a welfare state for them.
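To make the gap concrete, here's a back-of-the-envelope sketch. The 35% ordinary rate and 15% capital gains rate are illustrative assumptions; real brackets are progressive and change over time:

```python
# Illustrative flat rates only; real tax brackets are progressive.
ORDINARY_PCT = 35       # assumed ordinary income tax rate
CAPITAL_GAINS_PCT = 15  # assumed long-term capital gains rate

earnings = 200_000  # same dollars, two different labels

tax_as_salary = earnings * ORDINARY_PCT // 100
tax_as_gains = earnings * CAPITAL_GAINS_PCT // 100

print(tax_as_salary)                 # 70000
print(tax_as_gains)                  # 30000
print(tax_as_salary - tax_as_gains)  # 40000 kept by relabeling the income
```

Same money coming in, $40,000 less tax going out, purely because of how the income is classified.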
Fused Location is absurd. I have an unrooted phone that was upgraded to, but did not come with, Android 4.3.
Using the program CPU Memory Monitor, Fused Location right now is at 48 CPU minutes. Google+ is legitimately at 59 CPU minutes because it's backing up a bunch of photos and videos I took last night. The next highest process is at 5 CPU minutes. Except for this unusual workload, Fused Location almost always consumes many times the amount of CPU time as the next process.
I think Google has great ideas but their programmers are not very good at testing: Too much theory, not enough practice.
To my knowledge there will never be an update for my phone, a non-LTE Samsung Galaxy S III. Even with the massive battery drain of Fused Location, the phone works great. It would be an amazing device if Fused Location was fixed, or if there was some way I could disable it. Rooting my primary communications device is not something that I would do, but your comment seems to indicate that wouldn't do any good.
Yale professors' ideas of being knowledgeable in a subject come from their experience lecturing students.
I've been getting paid to do programming for almost 30 years. Google has changed programming such that you no longer have to memorize the useless trivia that college professors lecture about.
As a result I can focus on improving my ability to program as a generalist, and I'm very good at what I do. If you asked me to write a bit of non-trivial code in anything but pseudo-code, I would very likely not get the syntax exactly right (unless you asked me to write it in C, which I learned before the days of Google).
Google allows us to not be smart at things that are a waste of our time to learn in the first place. We can have a much broader knowledge of many subjects and use Google to drill down on specifics, rather than having the type of knowledge that professors crave, being completely pigeon-holed into one specialty where you have all of the trivial detail memorized.
Can I rattle off every type of tree structure, and tell you what tree is good for what problem? No. In the days of Google, that type of knowledge is useless. You ought to know when you need to use a tree structure of some sort, and you can spend an hour or two making that determination; if the decision is critical, you can spend a day on it. Effectively, those weeks or months we spent in computer science/computer engineering classes learning all of these very specific attributes of data structures were a waste.
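For what it's worth, here's a sketch of what that hour-or-two determination often looks like in practice. The question is rarely "which of the dozen tree variants?" but "do I need ordered queries at all?" (the data below is made up for illustration):

```python
import bisect

# Point lookups only? A plain hash map is the obvious choice.
prices = {"apple": 1.25, "banana": 0.50}

# Need ordered/range queries? Then some sorted structure earns its keep.
# A sorted list plus bisect often suffices before reaching for a real tree.
timestamps = [3, 7, 9, 14, 22, 31]  # kept sorted

def count_in_range(sorted_vals, lo, hi):
    """How many values fall in [lo, hi]? O(log n) per query."""
    left = bisect.bisect_left(sorted_vals, lo)
    right = bisect.bisect_right(sorted_vals, hi)
    return right - left

print(count_in_range(timestamps, 7, 22))  # 7, 9, 14, 22 -> prints 4
```

Only if profiling later shows this is the bottleneck do the finer distinctions between tree variants become worth an afternoon of Googling.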
To generalize, consider everything you can easily find with Google to be part of your knowledge. Memorizing it would be a complete waste of time. But that very waste of time seems to be what these professors were measuring (and valuing!).
I'm a fan of both libressl and this project. Now we will have two dedicated groups working on improving the security of SSL. There will no doubt be sharing of findings, and both products will improve as a result.
SSLeay and OpenSSL have been neglected for too long. It's boring to work on this software, but that doesn't mean the work is not important.
There is no substitute for meeting in person. We've evolved over millions of years to meet with each other in person. Every distributed meeting I've ever attended has had yelling, mumbling, and misheard things caused by technological failures.
If you're sketching out your next year's worth of work, spend the money and get together for it.
If you're just talking about a couple of minor issues, then by all means use a distributed whiteboard.
I've been programming for about 30 years and it took me at least 20 years to learn how to make good estimates. If you can't estimate how long something is going to take, then you probably haven't been doing it long enough. I don't care what your field is, or even if it's engineering or not.
Programming is just like any engineering endeavor. Be a professional and admit that not every job is a weekend hack. If you work somewhere where you have to lie and say work will take less time than it actually takes, then quit that job and go work somewhere else.
One difference between a so-so programmer and a great programmer is that the great one finishes work when he or she says it will be finished. There are other differences, such as communication skills or the lack thereof, but having people be able to trust you is pretty key.
Another way of looking at it is this. Programming cannot be rushed. That doesn't mean tell people it will be done when it's done! What it does mean is learn to estimate the worst-case scenario. You will not be punished for writing a great piece of software and finishing it a few weeks early. You will be punished for promising a date that assumes nothing will go wrong at all and that you are some kind of programming genius, better than every other programmer out there.
Consider this example. I think I might be able to finish something in a week. The estimate I give is two weeks. Another programmer typically says a job of roughly the same magnitude will be done "in a few days." I have given myself plenty of time to do the job right, with extra time to fix bugs and possibly deliver the project early. The other programmer screwed up, and the job takes him a week. After that week, his work has tons of bugs. It goes out to production, and every one of those bugs is embarrassing and takes man-days to resolve rather than a few man-hours. I've seen this scenario happen many times. Programmers fall into this trap of their own making. Maybe it has something to do with machismo. Stop pretending to be more manly or smarter than you are and be a professional.
I looked at my state and H1Bs have below-average salaries, somewhere around 10-25% below average, depending upon the exact position. Clearly, the purpose of H1Bs is to drive down the wages of people already here; otherwise, H1Bs would be getting paid about the same as everyone else, within let's say 5-10%.
I also looked at the numbers, and by far most H1Bs are going to California. Only 2,000 made their way to my state. Companies in California want you to live there, paying $3,000 or more per month in rent plus high taxes and everything else, but aren't willing to pay you enough to be able to afford it. Since they've run out of people to con into moving to California, they've turned to H1Bs.
I have nothing against the best and brightest coming to the United States. We have tons and tons of international students studying engineering in our universities, and these people are more than welcome to stay here and become citizens, joining our labor pool.
Owing to the vast distances between planets likely to have intelligent life, machines are likely the only alien things you'll ever see. Owing to the energy expense of traveling vast distances, they will likely be very tiny machines, like nanobots from science fiction. Due to the speed of light and hence radio communications, the machines won't be able to be in contact with their makers, at least without significant transmission delays; therefore, the machines will likely possess advanced intelligence on their own.
Having said all that, we could very easily have been invaded many times over by intelligent nanobots from space. They would have been completely undetectable until maybe a few decades ago, and perhaps they are clever enough to continue to escape detection. Our planet is largely unexplored. We've cataloged only a tiny percentage of life. We've barely glimpsed the deep oceans. We're also quite ignorant about the underground: we're drilling for oil right now and have no clue what even causes an earthquake. So we're ignorant not just about tiny things, but about features gigantic enough, possibly miles in diameter, to produce earthquakes.
In fact, the very cells in our bodies could be invaded by aliens disguised inside bacteria or viruses. How many of these bacteria and viruses have we even looked at under a microscope? We have trillions of these organisms. Imagine the power of one million compute nodes that could hide in less than 1% of our cells. And right under our noses is just one possibility among many.
Maybe the world is like the Matrix, except the robots are tiny instead of massive, with a little bit of Scientology thrown in: our bodies harbor an alien presence inside of them (but not the part about stealing people's money).
T-Mobile offers Wi-Fi calling, but it sucks.
Let's say that I want to download something for a few minutes. My call turns to garbage.
I could theoretically configure my AT&T hardware to prioritize Wi-Fi calling traffic, but neither T-Mobile nor AT&T offers any guidance on doing that. I'm not wasting my time to save gigantic multinational corporations a few pennies.
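For what it's worth, here's roughly what that prioritization might look like on a Linux-based (e.g. OpenWrt-style) router. This is a sketch, not carrier guidance: the interface name is hypothetical, and the assumption that Wi-Fi calling rides over IPsec on UDP ports 500/4500 should be verified for your own carrier and hardware before relying on it.

```shell
# Sketch for a Linux/OpenWrt-style router. WAN interface name and port
# assumptions are hypothetical -- verify for your carrier and hardware.
WAN=eth0  # assumed WAN interface name

# Mark IKE / IPsec NAT-Traversal traffic (a common Wi-Fi calling
# transport) with the Expedited Forwarding DSCP class so upstream
# QoS queues can prioritize it over bulk downloads.
iptables -t mangle -A POSTROUTING -o "$WAN" -p udp \
    --dport 500  -j DSCP --set-dscp-class EF
iptables -t mangle -A POSTROUTING -o "$WAN" -p udp \
    --dport 4500 -j DSCP --set-dscp-class EF
```

Even then, the DSCP marks only help if your router's traffic shaper actually honors them, which is exactly the kind of undocumented detail the carriers leave you to figure out.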
So the idea here is to take our most sensitive and least understood organ, a device with processing power greater than any hardware/software system we've been able to even conceive of, and do the equivalent of smashing it with a hammer?
Would you go to a data center and start zapping random computers with electric pulses, hoping that your buffoon-like behavior would randomly flip the right bits somewhere and make the machines work better? No, you would work to understand the software being used and improve it. Or you would replace the hardware with something that works better.
Likewise, there are no shortcuts with the brain. Until we can program neurons and neural networks directly, anything we do to the brain expecting to make it work much better is bound to do more harm than good.
Game reviews are good at generally identifying the best games, but I wouldn't rely on them for more than a very rough metric. You simply have to play a game to know if you like it.
The best game review I've ever experienced was a live stream of a well-known GTA V speed runner playing the game on next-generation hardware. Speed runners know games almost as intimately as the programmers do. The speed runner's conclusions were dead on. The game has beautiful graphics, but playability had declined in many areas. He immediately noticed very long load times. And what do you know, after a few months people are complaining on YouTube about load times for GTA V on next-generation hardware. One thing I notice on YouTube is that pop-in seems to be worse on the next generation than the previous generation. That's sad.
This roughly 8-hour run (I wasn't tuned in continuously) convinced me not to buy a PS4, a new television, or the next-generation version of GTA V. I may take the plunge when the Windows version is released, but I'm going to see how the speed runners like the game. They notice everything right off the bat.
I'm perfectly happy with the PS3 version of GTA V. It's the only game I play when I'm in the mood for video games.
Most programming is maintaining somebody else's, usually poor, code.
Do not use recursion if it makes you feel clever, or if you're showing off your mastery of some programming language's syntax. Don't use recursion for efficiency. Writing efficient code rarely matters. By rarely I'm not saying never!
Use recursion if it results in the most understandable and easy to grasp code for someone, probably currently in junior high, 10-15 years from now.
In 15 years processors, IO, and storage will be many times faster and the bottleneck costing your former employer millions in lost profits will be your tightly-coded routines buried deep within the infrastructure.
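For what it's worth, here's the kind of case where recursion earns its keep on readability alone: a minimal sketch of summing file sizes under a directory, a problem whose shape is itself recursive, so the code mirrors the structure a future maintainer already has in their head.

```python
import os

def total_size(path):
    """Total bytes of all files under path.

    A directory tree is a recursive structure, so the recursive
    solution reads almost like the problem statement itself.
    """
    if os.path.isfile(path):
        return os.path.getsize(path)
    total = 0
    for name in os.listdir(path):
        total += total_size(os.path.join(path, name))
    return total
```

Compare that to the iterative version with an explicit stack: it works just as well, but the reader has to reconstruct the tree-walk in their head instead of seeing it on the page.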
I'd like to see Cyanogen succeed because the more competition there is in the smartphone market, the more companies will be pressured to develop new, useful features.
I bought my first smartphone two years ago last month. It's a Samsung Galaxy S III. It still works great, despite some quirks. With the Galaxy S III, it felt like the smartphone was taking a quantum leap forward in features. For the last year, though, it seems like there hasn't been much to crow about except for some fingerprint functionality nobody uses. Phones are getting a bit more memory, somewhat faster CPUs, slightly better screens, and improved cameras, but you would expect all of these things. In terms of new and interesting features, it seems like we're in a mature market where we've all decided upon what it means for a device to be a smartphone.
Perhaps Cyanogen will bring some excitement back. At worst, they'll come up with some new ideas that Samsung can license or copy. I'm using Samsung as an example, but I could be talking about HTC or one of the Chinese startups. I don't see a whole lot to distinguish current smartphones (except that Samsung does not permanently glue batteries inside of its products).