
Submission + - Apple buys iFixit, declares repairable devices "antiquated". (ifixit.com) 2

ErichTheRed writes: Apparently, Apple is buying iFixit. iFixit is (was?) a website that posted teardown photos of gadgets and offered repair advice. According to the website: "Apple is working hard to make devices last long enough to be upgraded or irrelevant, making repairability an antiquated notion." It's all clear now — I can't replace the batteries, hard drives or RAM in new Macs because I'm expected to throw them in the landfill every 2 years!

It made it to CNN, so it has to be true, right?

Comment Re:Still don't know what everyone's complaining ab (Score 1) 168

As for being worried more about the corps than the government, why? Can corporations arrest and imprison you? If not, then you really have screwed-up threat assessment abilities.

This is kind of what I was getting at -- among those more concerned about privacy, everything is part of a vast government conspiracy, and they're lurking behind the next corner just waiting to imprison and torture you. I think the reality is a little different -- the US has become way too diverse even in the last 50 years to allow any one group to gain enough power to do anything major. There's 300+ million people, spread over a huge geographic area, all with different opinions on pretty much everything. Even if you did live in a mountaintop compound stockpiling ammunition for the revolution, no one would bother you unless you start using it on your neighbors. Look at how hard it is to get anything accomplished with a divided Congress...the entire country is polarized like that, and I doubt that will change anytime soon.

Companies having access to your personal data is a little different. There's an incentive to squeeze every last cent out of every single customer interaction now, and I think most people don't realize how much their data is being mined, for whatever reason. I find the increasingly focused ad targeting I've been noticing lately to be a little more invasive than an imagined threat. I'd love it if Google charged a subscription fee instead of using my data as payment for their services, but I guess they make way more from advertisers or they would have offered it as an option by now.

Submission + - "Clean up Github" -- A backlash against stereotypical nerd culture? (ethicalco.de)

ErichTheRed writes: The story on Monday about Julie Ann Horvath quitting GitHub because of harassment ties in nicely with this. A group called Ethical Code is starting a "Clean Up GitHub" campaign asking people to pull offensive comments out of their code. This brings up a very interesting question...is it still considered too PC to expect people to be somewhat professional in their public code submissions, or is this a sign that the industry might be "growing up" a little? I'd like to hope it's the latter....

Comment Why all the fuss about Common Core? (Score 1, Interesting) 273

Common Core is a big thing in NY where I live right now, because the state just voted to suspend its implementation for 2 years. NY already has pretty high standards for high school graduation and, if I'm any indication as a product of it, the curriculum is pretty good too. That doesn't mean that all other states have the same standards, and it seems to me that Common Core was designed to bring all states up to a higher level. As an example, my previous job wanted me to move to Florida, so I played along and did the whole relocation trip thing before telling them, "Sorry." Even the real estate agents who were pushing the place hard told me that my children, if they were smart, would have to be in private school to get a good education. Apparently FL, just like Texas, values high school football more than education.

It seems to me that all the people screaming about how bad this is brought it on themselves. Look at all the press about the evil teachers' unions, whose members have pensions and yearly raises, are protected on the job, and only work 180 days of the year. Also here in NY, there was a big fight to force teachers to be evaluated and ranked like corporate employees get their performance reviews. I'm not a teacher, and I'm totally against that. First off, getting stuck with a class of crappy students can cost you your job, especially early on in your career when you might have to work in a bad school district. Second, teachers are professionals. Once they receive tenure, they should no longer be subject to evaluation and should have a job for life, end of story. Doctors and lawyers aren't stack-ranked -- those of us in private sector jobs who don't like it should fight to get representation.

Regarding the SAT, I wound up doing much better on the ACT when I took both. The ACT was much closer to what the SAT is slated to become. I remember it focused a lot more on what you were learning in school rather than obscure vocabulary words. I have a horrible time with head-based arithmetic, and the math section of the SAT (when I took it) allowed no calculators and was basically two 30-minute tests of arithmetic and algebra tricks. I went on to make pretty decent grades at a state university in chemistry -- so much for the predictive power of SAT scores... :-)

Comment Definitely not for power users (Score 2) 103

Even though I do find myself using things like OneDrive and Dropbox to keep non-sensitive stuff I'm working on available at home or at work, I'm not totally convinced that most power users will be replacing their PC or laptop with what's essentially a thin client that Google has control over.

On the other hand, for people who truly don't know any better, live in a location with five-nines, super-fast broadband access and just don't have the savvy to understand that their data is being mined by a third party, this might take off. It's the same reason tablets are taking off among the "content consumer" set. Amazon is doing something very similar with the Kindle Fire -- basically give away the hardware with the knowledge that Amazon uses your browsing habits to improve their prediction engine.

Honestly, I wish Google and similar services would offer a "paid" version with no data mining or tracking. People forget that the awesome search engine, maps, etc. aren't a free resource, and their data is paying Google's bills.

Comment I've seen this before (Score 5, Insightful) 137

I remember CS enrollment shot way up in the late 90s as the dotcom bubble was inflating. Now that we're in the late stages of the social media/apps bubble, people are getting interested in computer science again, and I'm guessing that's the reason for the spike.

Bubble or no bubble, there's always going to be demand for good, talented people in software development and IT. The H-1B and offshoring trends have cut salaries significantly, and have made employment less stable, but there are still jobs out there. If students going into CS have a genuine interest in computers, that's good. Chasing the money like they were doing in the 90s without the desire will lead to the same problem we had when 2001 rolled around -- tons of "IT professionals" who had no aptitude for the work and were just employed because of the frothy market.

I've managed to stay employed for almost 20 years now and I still really enjoy what I do. It's not as wildly lucrative as it was in the 90s when you could get 20+% salary increases by changing jobs every six months. The only things I've done consistently over this time are:
- Keeping my skills current (and yes, it is a tough commitment, especially when your employer doesn't care.)
- Not begging for higher and higher raises every single time salary review time comes around (which requires saving and living within one's means...)
- Choosing employers who don't treat their employees like they're disposable.

I've heard lots of older IT people say that they're actively discouraging their kids from following in their footsteps. I don't think that's necessarily good advice. Sure, there are crappy employers out there, and it's not a guaranteed ticket to wealth anymore. But if you're flexible and want interesting work that lets you use your brain and get paid for it, it's still a good move IMO. Look at the legal profession right now -- the ABA sold out its members by allowing basic legal work to be offshored. Law degrees were previously an absolute guarantee of a respected, high-salary job, and now that profession is starting to see what we're seeing. My opinion is that as computers get more and more involved in our daily lives, a professional framework will eventually develop once things really start getting safety-sensitive and people stop treating computers like magic boxes and IT/developers like magicians.

Comment Interesting idea (Score 1) 231

I think that one of the problems with the way math is taught in schools is the fact that very little is done to explain how calculations students are doing can be applied to actual problems. Now that I'm older, went through a science education in college and work in a technical field, I understand this. However, one of my problems early on was that I never really felt comfortable doing math problems. It sounds really stupid, but I must have some sort of disability -- I can't do basic arithmetic in my head. The numbers just don't stick in my head the way they need to when you're doing multi-column addition or multiplication. My wife, a finance wizard, laughs at and pities me at the same time when I'm manually figuring out a tip. When I was learning math back in the Jurassic period, the students who were "good at math" were the ones who could easily do calculations in their head and just had a feel for numbers. Calculators in the early grades were unheard of back then. And this skill is still what a trader needs -- they need to be able to make a decision in 5 seconds based on a calculation they do in their head. It's also a skill you need to do well on the SATs, since they basically contain two 30-minute timed algebra and arithmetic tests.

What I'm saying is that math is more than basic arithmetic and algebraic manipulation. If you can get a student to understand what you mean when you say exponential growth, and how it relates to something they care about, then students will understand it more. I remember hating grade school math with the endless arithmetic drills, and later, the rote memorization of procedures for fractions, long division, etc. I also remember going through high school algebra just memorizing the exact steps to complete the crazy factoring/simplification problems and not understanding _anything_. It literally took me until about halfway through high school, when science classes actually got somewhat challenging and delivered meatier material, to make any sort of connection.

Calculus and other applied math should be at least touched on earlier in the school career. I think it would help students who don't necessarily have the skills that would make them "good at math" to at least understand some of it. People I know who understand math well say it's like a foreign language, so maybe we should be teaching useful phrases for travelers more than we teach verb conjugation and sentence structure...

Comment It all depends on where you work (Score 1) 427

Not all companies are crazy software development shops run by 20somethings who don't mind working 100 hour weeks. If you go to a company like that and ask about wage parity, you'll get all the excuses that were posted in this thread -- time off for kiddies, inability to travel, inability to work 100 hour weeks when needed, etc.

The reality is a little different. My wife and I both work, and we have 2 little kids. They take an insane amount of BOTH our time. Both of us have to share the responsibility of sick days, chores around the house and running errands, and especially this winter, snow days. We both have technical jobs, mine in IT, hers in finance. The difference is that we work for companies that don't expect 100 hour weeks. So far, it's worked out as long as one of us isn't taking all the time off work. It's not the 50s anymore -- most companies are mainly concerned with whether you get your work done and less obsessed with the butts-in-seats factor. The trade-off is that sometimes we end up having to do a little extra work to catch up after the kids go to bed, which sucks when we're dead tired from working AND taking care of the kids. But we're not (at least not publicly) referred to as the team members who can't get their stuff done.

So it's less of a female wage parity problem, and more of an "old guy/girl with kids" problem. That really bites as the two of us get older. We have to be smarter about the type of companies we choose to work for, and yes, both of us are leaving money on the table compared to the wages in our area. Single people who just graduated and have zero obligations will always have more choices. They can choose to work at an investment bank, or for a consulting firm that will fly them to clients' offices 300 days out of the year. They could go work for EA and fulfill their "lifelong dream to break into the exciting video game industry." These are choices only, not necessarily good or bad ones. It's just that as we get older, if we don't want to ignore our kids, we have to give up some of our options. Econ 101 -- opportunity costs.

For fathers who aren't totally disconnected from the responsibility of raising kids, the extra time off needed can be very close to what the mother needs. You need to be there for them. It's harder to understand if you're in your late 30s and still single, or married and childless, but in my experience good managers have been able to at least relate. If one of your team members is still doing great work and needs to work a weird schedule, you would be silly to dump them and replace them with a fresh grad who doesn't have the experience but is willing to work themselves to death for you.

Comment Pretty silly (Score 1) 146

I think this might work for _some_ millennials who are so used to this kind of reward system that it becomes the only way they can function in a work environment. If someone is raised on video games and collecting badges/trophies/points/whatever for doing a task, then it could become a good workplace motivator. This would be especially true for younger software developers -- grind out this module/finish this sprint/debug this feature and receive the "Chief Debugger" badge. It could also work for mundane tasks that younger workers might turn their noses up at if there wasn't some sort of bragging rights attached. I'm not that old, and I was raised on video games, but not on the whole "status collection" thing.

For someone who is already motivated to do a good job and doesn't need this, I can see it becoming a huge wedge issue. Not everyone works for companies that are arranged around being an extension of the college dorm lifestyle. Different people are motivated by different things. Money is nice for me, for example. Same goes for finishing something, seeing it go out to a customer or one of our internal guys, and having it work without coming back. I don't care if I have 16 badges and 20,000 points for doing that -- I care about the end result.

Comment Is the abuse problem really that big a deal? (Score 4, Insightful) 294

I know "drugs are evil" and all, but I genuinely don't understand why people are so panicked about people abusing prescription pain killers. The reality is that there's a huge demand for pain medication, both for legitimate and abuse purposes. Just like the other wars on drugs, it's impossible to stop. Therefore, I'm of the mind that we shouldn't do anything...and that's coming from a very left-wing, big-government type. We should focus on providing abusers safe drugs, and spend the money we save on enforcement on treatment for the people who really want to get off drugs. I've never touched drugs, but I can't blame someone who has a crappy life and no prospects of it getting better for doing so.

Providing pain medication addicts with a preparation that won't destroy their liver (due to the included acetaminophen in other meds) would be a start. There's no fix for the demand problem, and reducing supply just drives up the price.

The reality is that the future is looking pretty bleak -- unemployment is going to be incredibly high as even safe middle class jobs are automated. Unless we want a revolution, it might be time to start loosening the restrictions on controlled substances. When unemployment goes up past 30, 40% and higher, governments are going to have angry mobs on their hands unless they have something to keep them occupied...

Comment It's not the buses themselves... (Score 1) 606

The core problem that I think is being addressed is this -- if your urban area doesn't have a good mix of uses (work, leisure, living space, etc.) then it eventually starts decaying. San Francisco is the exception to this rule...the Google and Apple employees want to live the hipster city lifestyle and make enough money to do it. These companies save on insane SF rents by locating out in the suburbs where land is a little cheaper. The same is happening with the big investment banks in NYC -- there's no longer a physical reason to be right next to the stock exchange (though your data center still needs to be.) A lot of banks relocated further uptown, or to NJ or CT especially after 9/11. The difference is that there aren't "Goldman Sachs buses" or "UBS buses", but most people employed at these places have enough money to live wherever they want and commute on their own.

Other "less desirable" cities have the problem of people not wanting to live in the urban core, the reverse of what's going on in San Francisco. I've never actually been to San Jose/Cupertino/Mountain View/wherever in SV, but I imagine it's something like where I live (Long Island, suburban NYC.) We have some very nice places on LI and other communities surrounding NYC, but it's mostly the very expensive, sprawly development you find around most big cities. Tons of people use public transportation to get into the city every day, mainly because much of the area was at least somewhat designed around it. There are big employers on Long Island too, but not as many reverse commuters. The problem is, if businesses are downtown but _everyone_ goes home to their suburban towns after work, nothing is left to prop up the city center after the offices are done for the night. Google and Apple want to attract the hipsters, so they choose to ferry them from their hipster neighborhoods to the relatively boring suburbs. Most other employers in most other locations cater to the suburbanites. As a result, those cities' urban cores decay and become shells after 6 PM on weekdays. Fewer residents --> fewer businesses to cater to their needs --> crime and urban decay. Look at Buffalo and Detroit as extreme examples of this -- the suburbs surrounding the city have basically become the only sustainable parts of the area. Atlanta is basically a city of suburbs with no comprehensive public transportation and nightmare traffic as a result. Urban planning is really tricky to get right.

It's not an easy problem to solve. Everyone wants it both ways -- the 2 acre mansion PLUS the urban hipster bar/club scene. But the MTA is right in saying that Google buses are bad for (most) cities. The most sustainable development is a mix of uses in both city and suburban settings.

Comment Yay Social Media Advertising Bubble!! (Score 4, Interesting) 257

I think it's time to call the near top of the social media bubble. Maybe this one will be called the Web 2.0 Bubble.

It's funny, because I remember the last tech bubble in the 90s ending a few months after similar insane acquisitions. Remember when AOL bought Time Warner because old media was panicked about being left behind in the Web 1.0 future? How about all the IPOs of completely unprofitable companies based only on the fact that they sold stuff online or were funded by advertising?

I think whether this turns out to be a bubble or the "new normal" depends on how well these social media companies and device manufacturers can present themselves to the average Joe as "the internet." Remember that AOL used to be "the internet" for anyone non-technical. People keep predicting the death of PCs simply because anyone under 25 uses tablets and phones as their primary computers, considers email old fashioned, and lives on Facebook. The question is whether this is universally true or just some hipster marketing buzz. I know people who live on Facebook, people like me who use it to post family pictures, and people who actively hate it. I think it could go either way, but the market for this stuff is way too frothy now. Even my boring corner of IT is being bombarded by cloud this and cloud that, and it's touted as the solution for everything.

The strange thing is this -- during the 90s, I was a new grad riding out the dotcom boom in one of those "boring" corners of traditional IT (sysadmin for an insurance company). This time around, I'm in a different "boring" corner of IT (systems architect in air transport). The plus side of this is that I never got laid off during the bust cycle. Marketing flash may sell IPOs, but people who actually know their stuff get to keep working when most of the fluff gets thrown out. Oh well... At least the 90s tech boom sparked a huge Internet build-out, oh, and left a lot of Aeron chairs on eBay. :-)

Comment Definition of "computer geek" has changed. (Score 3, Interesting) 158

I think the study might have some merit, but only because the definition of geek has changed a lot.

I got into computers in the early 80s as a very young kid. By the time I really got involved with a "geek" social scene, there was a mix of people. Before that, computers were most definitely nerd toys -- there were very few "typical" folks who gravitated toward them. Even so, I've worked with people who want nothing to do with computers once they are off the clock, people who have a healthy level of hobby involvement with computers, hardcore gamers, and extremely hardcore "computer nerds" -- mom's basement types. The first group are the most likely to be in a stable relationship from my experience. I'm happily married with 2 kiddos, and I put myself in the "healthy level of hobby involvement" camp. It's surprisingly hard to find time to do anything these days with 2 young kids. You certainly won't see me playing video games for 10 hours at a clip anymore...I used to do that back in the day though.

I do have anecdotal evidence from my dealings with "tech workers" that divorces are very common. Lots of people I work with are on Wife #2 or more. I think a lot of that might be the crazy amount of time that work and computer hobbies can suck out of your life -- you really have to be matched up with someone who will either tolerate it or is a "geek" themselves and understands. And like I said, once kids come along, I can see huge problems if you decide to disappear for hours on end and expect your partner to just handle the kids. If you work an IT job for one of the crappier employers out there that demands on-call duty and tons of hours a week, only the shallowest of spouses will stick around and only if you make good money to make up for you not being there.

My other piece of strictly anecdotal evidence is the prevalence of...non-traditional...relationships among the geekier set. One US-born guy I worked with was divorced and constantly trying to bring his girlfriend from China to the US -- no clue how they met. Lots have girlfriends they met online. Others have had obvious mail-order brides. That could sound a little stereotypical, but I've seen LOTS of guys' wives who barely speak English and look like they're pretty much there to cook and clean for them. Maybe I'm just working with the wrong sorts, but that's a very common theme in my experience.

Non-traditionals aside, I think a lot of the evidence the study cites is just because computers are now a normal part of our lives. Anyone can be a Facebook user. Smartphones are designed to be used by non-techies. There are plenty of "IT" jobs that don't involve hardcore coding or systems/analysis work. My job borders on the nerdy side, but only because I make it that way.

I think that if you actually do find the right person, and that person is less of a geek than you are, it balances you out. My wife is incredibly smart, but not obsessed with computers and tinkering the way I am. (She's a finance geek.) If you find someone who's just there for the money or has absolutely no interest in what you do, that's where the divorces and bitterness creep in. I'm almost at 15 years married -- and she hasn't tossed me out yet!

Comment I don't get it. (Score 3, Interesting) 195

How are semiconductors not a core business for a company that still makes huge profits off mainframes and midranges?? Sure, you can keep design in house, but you'll lose the flexibility you have now. Imagine your research division came up with an amazing new chip design they wanted to work on right away, but were told "Nope, it'll take 6 months to ramp up GlobalFoundries, TSMC, or whatever. Sorry."

The thing I really don't get (in general) is the way businesses feel like they can have no assets on their books and just run everything with a massive tower of multi-layer outsourcing. It doesn't make sense -- outsourcing something is never cheaper than doing it yourself. As soon as you do that, you add in a layer of middlemen who need to get paid for doing a task which was previously cheap or "free with purchase of in-house labor." It never works out. I guess I'll never be an MBA, because I don't get the accounting tricks that make a company appear profitable when it's wasting money on things it could do cheaper and better itself.

In IBM's case, I do see what they're trying to do. Software is more profitable than hardware. But the problem is that IBM is/was a huge innovator in hardware and chips. They're one of the last US companies massive enough to support basic research that can improve those hardware innovations. IBM's software may be profitable, but I haven't seen anyone singing the praises of WebSphere or their Rational products lately. IBM also has a massive "services" division. I've had extremely good luck with the services people who service IBM hardware, but that's going away. So, we're left with the legendary crap outsourcing and offshoring stuff they do for large companies, and of course, "consulting." My experience with outsourced IT run by IBM is an ITIL nightmare of endless support tickets, revolving door engineers, meetings to plan meetings to plan the strategy for changes, etc.

It's kind of a shame if you ask me. I am just old enough to remember when IBM was as powerful as Microsoft was and as Apple is right now. They were able to command huge margins on everything they sold because it was backed up by a really good services team. People I know who worked for IBM "back in the day" tell me the corporate culture was weird, but employees never wanted for anything because they made so much money. (I also know people who worked for Sun and Digital who say the same thing.) In some ways, it would have been much nicer to work in the computer field during this "golden age of computing." I guess my main question is where the new hardware innovations will come from when you don't have a massive company and research group driving them.

Comment Congratulations (Score 1) 293

Now, can I please have Windows 9 with the Windows 7 and Windows Classic UI as options?? It's literally the only reason I'm not switching -- some of the Windows 8 UI is nice, but I can't stand the flat 2D desktop interface that looks like something out of Windows 2.0.

Seriously, the best thing that could be done for Windows right now is not to dump Metro, but to put it on tablets where it belongs and not force desktop users to buy into the whole touch-first thing.
