Your suggestion, in a thread about the relative costs of systems, is to buy a custom piece of hardware from a vendor whose website doesn't actually list a price.
But it's not like Windows can backup to thin air. You have to have something on the other end of that CAT-5, so it's probably a wash hardware-wise.
Do you know what I think when I see a website selling a product but not listing a unit price?
"Huh, I wonder if Amazon has them?" would have been my first thought, but apparently it wasn't yours.
If things ever get too hairy for a Dell, your restore process is entirely automated in Windows or Linux. Restoring a Mac is nothing short of corporate witchcraft.
To back up: buy a Synology NAS. Enable the Time Machine service. Configure your Macs to back up to it. Voila, done.
To restore from scratch: hold down Command-R when booting a Mac. Tell it to restore from Time Machine. Wait an hour. Voila, done.
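The NAS setup described above can also be scripted with macOS's built-in `tmutil` tool instead of clicking through System Preferences. A minimal sketch, assuming a hypothetical share named `TimeMachine` on a NAS reachable at `nas.local` with a user `backupuser` (all three names are made up for illustration):

```
# Point Time Machine at the NAS share; -p prompts for the password
# instead of embedding it in the URL.
sudo tmutil setdestination -p "smb://backupuser@nas.local/TimeMachine"

# Kick off the first backup right away rather than waiting for the
# hourly schedule.
tmutil startbackup --auto
```

After the first full backup completes, subsequent runs are incremental and happen automatically on the normal Time Machine schedule.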
Because Macs are only about 10 percent of the world's PC sales, and viruses usually don't spread far when the percentage of ownership is that low.
That has zero to do with the relative dearth of malware on Macs. (Pausing for a moment for a pedant to point out the one or two Mac bugs they've read about. Yes, we know. It's still proportionally much less than Mac's market share, so move along.) Macs are initially more expensive, but that also means their owners tend to have more money, and therefore the machines are more valuable targets. There are also still tens of millions of Macs out there in the wild. Even if there are more PCs, there are still a hell of a lot of Macs to be owned for anyone interested and capable. The fact that they're not is an indicator that building a nice interface on top of a solid Unix platform is a good way to end up with a stable, secure desktop.
I've been doing end-user computing for quite a while, and we've gone through so many cycles of "where the client intelligence lives" or "where the virtual desktop is hosted," and everyone oscillates between two extremes. "PCs to zero clients" usually ends up being a mix of laptops and thin clients in the end. "All VDI" ends up being "some VDI" after some very expensive POCs in most cases. I guess the same debate of "host it yourself vs. rely on a cloud provider" is alive and well here. I see it every day where I work -- management is all about cloud, and the staff are fine with some cloud, but going all the way over to total dependence on a third party is not great in my mind.
Something as fundamental as DNS should probably at least have some footprint in your "locus of control." I didn't say "in your office," but fundamental stuff that could completely kill everything else if you lost it shouldn't be handed over to a third party you don't directly control. In this case, Dyn suffered a DDoS attack, and on-premises DNS could suffer one too. But having a way to run both off and on premises makes good sense: if one entity is having a bad day, the other could at least keep things alive. However, all this old-school DR stuff is lost in the world of the cloud and startups. It all comes down to how much the loss of a service costs the company in dollars or reputation; if you can quantify that and the number exceeds the cost of mitigation, businesses would be stupid not to put something in place to mitigate it.
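Concretely, running DNS both off and on premises usually just means delegating the zone to nameservers in both places, since resolvers will automatically fail over among the listed NS records. A sketch of what the delegation might look like (all hostnames here are invented placeholders, not real provider names):

```
; example.com served by both a hosted DNS provider and
; on-premises servers -- if one side is down, resolvers
; retry the remaining NS records.
example.com.  86400  IN  NS  ns1.hosted-provider.example.
example.com.  86400  IN  NS  ns2.hosted-provider.example.
example.com.  86400  IN  NS  ns1.corp-onprem.example.
example.com.  86400  IN  NS  ns2.corp-onprem.example.
```

The operational catch is keeping zone data in sync across both sets of servers, typically via standard zone transfers (AXFR/IXFR) or by pushing from a common source of truth.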
OMG yes. I bought my wife an MLB.tv season pass because she loves watching baseball. What do you get for $109.99? Every game on TV except the ones in your home market. You can watch the Twins suck any time you want, so long as you don't live in Minnesota. Oh, and no postseason: that's a separate subscription.
Who the fuck came up with those ideas? I'll be damned if MLB ever gets another penny from us.
This is not an Apple problem, it's an industry and maybe even a societal problem. I don't even think it's possible to get a good job, get an A+ rating for every performance review ever, and expect to stay at that job for 5+ years. After 10 years, you are too expensive to keep around.
Lol, I just left one job after 10 years, not because I was too expensive but because the new company had more resources to spend and could offer me significantly more. The average seniority for IT workers at the new company is 17 years, and not a month goes by that our office of ~700 people doesn't have an announcement for someone celebrating their 25- or 30-year anniversary. You just need to develop valuable skills, expertise, and a proven track record, and there WILL be someone willing to hire you. Any time I've gone looking for top-tier talent in a specific area of expertise, the number of qualified respondents has been very low, because the majority of people with the applicable skills are generally already gainfully employed. The unemployment rate in the last few IT-focused surveys I've seen results from was under 3%, which is an incredibly tight market. If you're in IT, not entry level, and having trouble finding employment, it's either something with your local market (and you're not willing to relocate) or you've done something very wrong with your career.
One thing I could see contributing to this difference is the amount of time _wasted_ at work. Now that I'm a dad whose wife also works, as soon as I'm in the office the proverbial clock starts ticking. If I don't want to be stuck doing stuff after the kids go to bed at night, I have to get my work done in that narrow window of time. Lots of tech employers, especially Silicon Valley type companies operate on the college campus model, where long hours in the office are encouraged and part of the culture. Google serves 3 meals a day to their employees, and expects you to be there long before/long after those meals to make up for it. Your workplace becomes your extended family, and you are expected to put in time accordingly. If you want to see an extreme of this, look at Japanese work culture, where salarymen work massive amounts of hours _and_ have to go drinking with the boss when they're done.
If more employers would adopt the "get your stuff done when you want, as long as it gets done" mentality, I think most people would choose to be at work fewer hours. This may not be true for recent college grads who have no commitments at home, but I think it's very true for anyone wanting to maintain some sort of home life. You could say that in the traditional family, the father was the one working all the time and that was all that mattered, but I think priorities and society are shifting away from that.
What I don't think people see when they complain about UBI is how vulnerable their jobs are. Techies in particular feel that they're always going to get high pay and have great jobs, but we keep seeing offshoring of even the good tech jobs in pursuit of lower labor costs. Removal of the rest of the white-collar low-to-mid level workforce is going to be even more disruptive than getting rid of labor. I work very hard to stay current with the technology I work with, and I'm still seeing companies doing everything they can to pay less: managing out older employees, removing the idea of a career track, etc. All I'm saying is that the people who are currently smug and looking down at the "unskilled" class of labor from on high are about to be shocked when it turns out their job can be automated or done cheaper somewhere else as well.
Why do I think this is a big deal? In corporate IT, I constantly see tons of jobs that could easily be automated if some MBA decided to do a cost-benefit analysis. In lots of companies these jobs are the majority of the workforce. Why do you think HP, Dell, etc. are announcing layoffs in the 5 digits as a response to the slump in PC sales? Lots of people in cube-dweller land are doing modern versions of paper-movement jobs that existed 40 years ago, having pointless project meetings, etc. Inefficient, right? Sure, but those same people pay taxes, have children, buy houses, buy the products their companies make, and generally keep things moving. People who are constantly worried about whether they'll be able to pay their bills reduce their spending and put off major purchases. Look at the Millennials as an example: most aren't jumping right into home ownership because they don't feel they have a stable footing.
I do see a lot of logistical problems, but I see way more problems if something like this isn't done. If companies suddenly don't have to pay their workforce as much, and wages are the highest cost in many companies, you suddenly have tons of money floating around. Businesses are going to scream socialism if we try something like forcing them to contribute to the UBI fund with all this extra money they have. I imagine that without UBI, the business owners will just keep squeezing employees and keep all the savings for themselves. The problem with this scenario is that money-for-labor economies can cope with 5% unemployment, are uncomfortable at 10+%, and get pretty screwed up much beyond that. Imagine an 80 or 90% unemployment rate across all walks of life...all the guards in the world won't protect the business owners from an angry population with nothing else to do and no way to earn a living.
Whether those paper publications are high quality or not, it definitely shows the difference between a government with full control over its economy and one that's given all the control over to businesses. In my opinion, there are 3 primary reasons for the decline of science education in the US:
1. Anti-intellectualism on a massive scale -- other countries shower their scientists with praise and research dollars, and we dismiss research as something those "leftist egghead liberals in the ivory tower" do. Other countries place education as the top priority, and we don't care anymore.
2. Lack of employment prospects -- everyone keeps saying "STEM is the future," and I believe this, but you have to have jobs for all these scientists and engineers you're graduating. Why would someone smart enough to be a research scientist, who had the photographic memory to get a 4.0 GPA and a perfect score on the MCAT, choose anything other than medicine, management consulting, or investment banking? Medicine, dentistry, and pharmacy are the last fully protected professions in the US, and banking/consulting are guaranteed success tickets. Any rational actor is going to choose the safe path.
3. Lack of funding -- most state university systems are cutting back on research funding, not filling as many tenured professor positions, etc. R&D is typically funded through private grants and demands only research that will produce a completely new product within a few months.
Contrast this with China, which has full control over where their money is spent. Nearly every private company still answers to the government in one way or another, and no one is going to say a word about building up universities at a faster pace. This is basically why I think they're going to be the long-term economic winners -- larger population and more central control.
The funny thing is that in this country, during the 50s through the 70s, there was a massive investment in universities and science. No one batted an eyelash about how much things cost and what the money was being spent on, as long as it kept us on par with or ahead of the Soviet Union. However, it seems that priorities have shifted away from that. I really don't want to be worried about being wiped out constantly, but maybe a new cold war with another powerful adversary is the way to stimulate growth in science again. All of the US conflicts since Vietnam have been very one-sided; based on numbers and sheer will, I imagine a war with China wouldn't be. When you have a situation like that, just as we had with the Soviet Union, a cold war seems like a good idea.
I'd never really given a thought as to _why_ Synaptics and friends required their touchpads to have a full-blown application and a driver installed to have full gesture functionality...but I do know it's another pain in the butt step that has to be automated when a laptop is deployed in order to have a common Windows image. I didn't even know there was a standard. Now I know why - you learn something new every day.
Microsoft does have info on how to implement their standard here but I wonder if Linux hardware drivers can implement the same spec, or if there will always be a "legacy fallback mode" so the touchpad can be used in situations like navigating the UEFI, which has gotten very GUI-like lately.
HP and all the other manufacturers are caught in the perfect storm of:
- PCs having enough power for almost anything an end user can throw at them (including games) for a much longer period of time
- Tablets cannibalizing the "information consumer" side of the market
- Nothing really exciting and new in the PC space for years
I do end user computing in the business world. Yes, Millennials are bringing their phones and tablets in, but outside of a few niches, most of the real work gets done on PCs or Surface-esque full PCs in a tablet form. I sure hope HP is using these 4,000 layoffs to refocus on providing a rock-solid line of business PCs and dumping all their consumer junk down the toilet. That's the side of the industry that is falling off a cliff...you can't make margin on $300 PCs, and consumers don't need them as much as businesses need $700 PCs with real warranties and support.
These crappy temp jobs are going to bubble up into the unemployment numbers and, seasonally adjusted or not, they'll show up as job growth. What I want to see is real full-time job growth, the kind of work that comes with real salaries, retirement, and health benefits. It's really sad to see people in their mid-50s driving for Uber because they can't find work after having their jobs offshored or eliminated. Uber will say they're doing people a favor, but I think they and companies like them are contributing to the perception that employees should be treated as disposable commodities.
There has to be a better safety net for these people than what unemployment insurance provides in the US these days. If people could be assured of at least their full salary being replaced for a reasonable amount of time, they might be willing to take more risks, look for a job that's a good fit rather than the first thing that comes along, etc. I know we're supposed to be living in a wondrous time of automation, innovation, etc. but the fact is that most people need something to do. They need full time employment, a sense of purpose, the ability to put down roots, etc. Almost no one can be a fabulously wealthy entrepreneur no matter how much the small business owners/cheerleaders want people to believe that. Very few people want to be nomadic and move from place to place chasing work every year or so.
I know one theory I have on how to solve this is not popular at all, but what about forcing businesses to pre-fund longer-term employee severance packages at a rate proportional to the employee's salary? Employees would be free to leave at will, and their pre-funding would go back into a general fund. But just dumping a worker because you feel like it, offshoring their job, etc. would require a payment out of the fund that would actually carry the employee until they could find new work. It's good for the businesses too, because it forces them to really think hard about who they hire rather than just taking the first guy who comes in the door. I know every business owner would scream socialism, evil regulations, etc. over this one. But the reality is that every single business, small or large, has huge advantages over regular workers. Business owners can funnel personal expenses through their companies, the really large ones can take advantage of loopholes to pay zero taxes, etc. A common-sense plan like this is just a bigger payment into the unemployment insurance fund, ensuring people aren't reduced to what amounts to minimum wage when they get thrown out of a job and still have bills to pay.
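To make the arithmetic of the pre-funding idea concrete, here's a minimal sketch. The function, its name, and the default rates and durations are all my own assumptions for illustration, not part of any real proposal:

```python
def required_prefund(annual_salary: float,
                     months_of_cover: int = 6,
                     replacement_rate: float = 1.0) -> float:
    """Amount an employer would need to escrow per employee so that
    an involuntary layoff pays out `replacement_rate` of salary for
    `months_of_cover` months while the person finds new work."""
    return annual_salary / 12 * months_of_cover * replacement_rate

# A $90k employee with six months of full-salary cover would require
# the employer to hold $45,000 in the fund for that position.
print(required_prefund(90_000))
```

Because the obligation scales with salary, it also acts as a natural brake on casual layoffs of senior staff: the more you pay someone, the more it costs to dump them.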
I think that in most cases, if employees felt safe in their jobs, they'd do better work. That Millennial who seems to be "slacking" because they won't put in 90 hour weeks for years on end just sees what's going on. SV startups live on fresh college kids who haven't experienced what it's like to work in an unstable environment or for a hostile employer. Older Millennials are more cynical, just like older people of other generations.
Restoring the balance of employer/employee loyalty would be a good start if employers want a more productive workforce. Smart people see employers who will replace them at the drop of a hat and don't put in the extra effort as a result. Previous generations had some employers who would employ you for life...IBM had a no-layoff policy for ages and there are legions of people who worked for large employers like this their entire careers. In return, their employees were loyal, worked hard, put in extra hours where needed, etc.
Unfortunately, I can't see this happening any time soon. Back in the 60s/70s, the US was quite different. Absolutely everything was manufactured domestically, there was very little foreign competition, only 3 car companies of note, etc. And, companies needed thousands and thousands of people just to move paperwork around the organization, all of whom had stable jobs. Now, we manufacture very little, offshore well-paid technical jobs, and companies just keep squeezing harder to get those pennies out of their operational processes.
Alexander Graham Bell is alive and well in New York, and still waiting for a dial tone.