
Comment Also (Score 1) 94

Turns out most people aren't good self-motivated learners. If you have the "courseware" kind of model where people can just go watch lectures and do assignments at their own pace, the attrition and failure rates are very high. People just won't do what they need to do; they need a more structured environment to succeed. Now you can get all self-superior and say "Well, they should just work harder and not suck!" but we have to deal with the real world, and that means educating all types of people.

Comment That was a big part for sure (Score 2) 458

My boss got us smartphones back in the Windows CE days, because he's a huge geek like the rest of us. The problem was that while work was willing to pay for the phone part, the data was WAAAAY too expensive, so we didn't have that. Combine that with lackluster wifi availability and the fact that you had to manually turn wifi on and off because it drained the battery when out of range, and we didn't end up using the "smart" portion much. Not because it was too hard to use or any of that BS, but because there just wasn't the ability.

Now, data is cheap, my phone auto-roams on and off of wifi, and work has complete wifi coverage. So I use my smartphone often for its "smart" features. It is always on data of some kind and, like you, I never get near my cap, particularly because it is usually on wifi.

That is the biggest thing that changed and made smartphones useful to me, and to others I know. It became affordable and practical to use the smart features. Data is an included feature in most phone plans these days; $40/month can get you a line with some data.

Another thing that changed is just the progress of technology, mainly the processors. Before switching to Android I had a Blackberry, which I loved, except for its slow CPU. Due to the excessive amount of JavaScript and such shit on most websites, browsing with it was slow. Not so much waiting for data, but rendering. Now I can browse whatever I want; my phone has a very high-powered CPU that can deal with all that shit, so it isn't much slower to load a page than on my desktop.

Touchscreens and such weren't the thing that changed it for me. I still liked Blackberry's real keyboard + scrolly ball interface. It was having an affordable data plan plus a processor capable of handling the BS of the modern web.

Comment Which he needn't do (Score 1) 180

If you choose not to use the tools available, well, don't expect anyone to have sympathy for you or marvel at how hard you had it. You've only yourself to blame. When I wish to mount something in my house I get out a laser level, a cordless electric drill with titanium bits, and so on. As such, things get put up easily, quickly, and dead level. You could do the same with a rock and sharpened metal pieces, but don't expect me to be impressed with how long it took you or the problems with the results. You could use modern tools, if you chose.

Comment Is anyone surprised? (Score 5, Insightful) 180

I think some forget, or never knew, that his first book was published in 1996. This guy is not a fast writer.

Personally, it doesn't bother me, since I stopped reading after the third book because the quality tanked so hard. The original Game of Thrones is my all-time favourite fantasy novel and I will recommend it all the time. A Clash of Kings was good, but a major step down; I enjoyed it though. A Storm of Swords wasn't very good at all. When A Feast for Crows came out, I asked some people and the answer I universally got was "don't bother," so I didn't. It was also a bit harder to maintain the "givashit" with 5 years intervening instead of 2.

It seems like he more or less ran out of ideas and has bogged things down into a whole bunch of characters nobody cares about. OK, he can do as he pleases, but I'll keep my money, thanks.

Comment Re:physical access (Score 1) 375

"Of course, this comparison is also patently unfair -- Windows 7 was written in the 2000s, X11 was written in the 1980s. Expecting them to be comparable in terms of security is pretty ridiculous."

Which could be a good argument for replacing X. It is rather old technology; perhaps it is time to update it to something newer, rather than clinging to it and claiming it is all one needs.

Comment Consumers? No, just whiny fanboys (Score 3, Insightful) 113

Consumers are fine. The only benchmark that matters to a normal consumer is "How fast does it run my games?" and the answer for the 970 is "Extremely damn fast." It offers performance quite near the 980's, so fast that for most games your monitor's refresh rate is the limit, and it does so at half the cost. It is an extremely good buy, and I say this as someone who bought a 980 (because I always want the highest-end toy).

Some people on forums are trying to make hay about this because they like to whine, but if you STFU and load up a game the thing is just great. While I agree companies need to keep their specs correct, the idea that this is some massive consumer issue is silly. The spec heads on forums are being outraged because they like to do that, regular consumers are playing their games happily, amazed at how much power $340 gets you these days.

Comment Apple is almost that bad (Score 1) 579

They support the two prior versions of OS X and that's it. So OS X 10.7, released 3 years ago, is unsupported as of October 2014. I guess that works if you have the attitude of just always updating to the latest OS, but it can be an issue for various enterprise setups that prefer to version-freeze for longer, or for 3rd-party software/hardware that doesn't get updated. It can also screw you over if Apple decides to change hardware, like with the PPC-to-Intel change.

Comment And from talking to our researchers (Score 0) 110

Between somewhat better language design and superior support and tools, CUDA is way easier to do your work in. We have 4 labs that use CUDA in one fashion or another, and none that use OpenCL. A number have tried it (and also tried product lines like the Cell cards that IBM sold for a while) but settled on CUDA as being the easiest in terms of development. Open standards are nice and all, but they've got shit to do and never enough time to do it, so whatever works the easiest is a win for them.
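
To give a sense of what "easier" means, here is roughly what a complete CUDA program looks like. A minimal sketch of the style, not anything our labs actually run: the kernel and the host code share one source file, and the <<<blocks, threads>>> launch replaces the platform/device/context/queue/program-compilation boilerplate OpenCL makes you write before you can enqueue anything.

#include <cstdio>
#include <cuda_runtime.h>

// Each thread adds one element. Kernel and host code live in the
// same .cu file, which is a big part of why people find CUDA easier.
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);
    float *a, *b, *c;

    // Managed memory: the runtime migrates pages between host and
    // device, so this sketch needs no explicit cudaMemcpy calls.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]); // expect 3.000000

    cudaFree(a);
    cudaFree(b);
    cudaFree(c);
    return 0;
}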

On a different side of things, I've seen fewer issues out of nVidia on CUDA than AMD on OpenCL for video editing. Sony Vegas supports both for accelerating video effects and encoding. When I had an AMD card, it crashed all the time with acceleration on; Sony had to disable acceleration on a number of effects with it, and I had to turn it off to have a usable setup. With nVidia, I find problems are very infrequent.

Obviously this is only one data point and I don't know the details of the development. However, it is one of the few examples I know of a product that supports both APIs.

Comment It's also a load of shit (Score 1) 332

NTSC stuff is so bad when viewed on a large TV. It is amazing how blurry things look when you flip back and forth between the HD and SD channels. Part of what led to the rise of big-screen TVs was actually having content for them. With NTSC, a large TV just meant a big blurry image. With ATSC it can mean a nice large image.
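
Rough numbers, for scale: NTSC resolves about 480 visible lines, call it 640 x 480 = 307,200 pixels, while 1080-line ATSC is 1920 x 1080 = 2,073,600 pixels, roughly 6.75x the detail. Stretch 0.3 megapixels across a 50" screen and big-and-blurry is exactly what you get.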

Comment Also (Score 1) 332

Why shouldn't they continually improve their products? Even with NTSC sets this was done. New ones would be larger, have better focus, more clearly resolve the signal, have better phosphors, and so on. Why shouldn't this continue? They should keep trying to improve their products as technology allows.

None of that means you need to buy a new toy all the time though. You can stick with what you have until it breaks, or until the new stuff is a big enough leap that you wish to own it.

I think a lot of the whining from people comes down to simple jealousy. They'd like to own the new stuff, but cannot afford it, or do not wish to. So they try and hate on it and act like a luddite. You see it practically any time Slashdot has a story on new technology. People complain about it like it is somehow a bad thing that there might be something new.

Comment And they could probably handle 120fps (Score 1) 332

Most panels in higher-end screens are actually real 120fps panels. However, that is just used for 3D and for reduced motion blur. The only set I know of that advertises support for 120fps input is Vizio. Others could do it if they wanted to, though.

As you say, the issue with higher refresh rates isn't in the display technology.

Part of it is just getting people used to the idea, I think. We've seen shitty, jerky frame rates in movies for so long that people start to associate that with being "cinematic". People need to get used to the idea that that's bullshit, and maybe they'll start to like higher frame rates more.

Hopefully sports and such will get shot at 60fps some day and that may help.

Comment It also doesn't really matter (Score 4, Insightful) 145

The thing is, if you go and look at benchmarks of the cards in actual games, you find out the 970 wrecks shit, particularly given its price point. The 980 is an overpriced luxury (I say this as a 980 owner) because the 970 gets nearly the same performance for like half the price. The difference with its memory controller just doesn't seem to matter in the actual games out there on the market.

And that's the real thing here that the spec heads forget: you buy these to run actual software. If it does well on all actual software, then who gives a shit about the details?
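
For anyone who wants to see the memory controller difference rather than argue about it, a probe along these lines would show it (a rough sketch I'm making up for illustration, not nVidia's or any review site's actual tool): fill successive 256 MB chunks and time each one. On a 970 the last chunk or two before the card runs out should show noticeably lower bandwidth than the rest, and the point stands that games rarely push into that region.

#include <cstdio>
#include <cuda_runtime.h>

// Write every element of a buffer; bandwidth = bytes moved / time.
__global__ void fill(float *p, size_t n)
{
    size_t i = (size_t)blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) p[i] = 1.0f;
}

int main()
{
    const size_t chunkBytes  = 256ull << 20; // 256 MB per chunk
    const size_t chunkFloats = chunkBytes / sizeof(float);
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    // Keep allocating until the card is full. Chunks are deliberately
    // never freed, so later ones land in the upper part of the VRAM.
    for (int chunk = 0; ; ++chunk) {
        float *p;
        if (cudaMalloc(&p, chunkBytes) != cudaSuccess) break;

        const int threads = 256;
        const int blocks  = (int)((chunkFloats + threads - 1) / threads);

        fill<<<blocks, threads>>>(p, chunkFloats); // warm-up pass
        cudaEventRecord(start);
        fill<<<blocks, threads>>>(p, chunkFloats); // timed pass
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        printf("chunk %2d: %6.1f GB/s\n", chunk,
               (chunkBytes / 1e9) / (ms / 1e3));
    }

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return 0;
}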

Comment Not necessarily (Score 1) 180

He may well have been as smart as he thought (I'm not saying that is the case for sure, mind), but it turns out others were smart enough, and more knowledgeable in the ways that mattered.

Hans Reiser is a good example. Man is unquestionably very smart. However, he had the geek hubris that I call SMFU, Smartest Motherfucker in the Universe syndrome. He figured he was so much smarter than everyone else, he could easily get away with his crime. Turns out that the police have some smart people too, and those people know a lot more about criminal investigation than he did.
