Comment Re:Word up (Score 1) 95

Far be it from me to "defend" Word (a plain text editor and TeX are more my style), but do you really blame the programmers for the bulk of Word's shortcomings? I suspect it wasn't a programmer who said, "hey, let's have a talking paperclip!"

If it's buggy and crashing all the time, then it's poor programming, poor QA, or unrealistic timeframes set by the higher-ups. If it's the features that are completely useless and laughable, then I wouldn't be blaming the programmers. But that's just me...

Comment Re:Spying? (Score 1) 104

My sense -- perhaps completely off base -- is that even if "ISP Google" doesn't use so much as a modicum of the data it carries, it's still a Good Idea for Google to offer high-speed internet on the cheap. The more you use the internet, the more potential there is to make money on ad revenue. And if you have lousy internet, you're more liable to resort to other, less-monetizable forms of entertainment.

I like to think -- and this is again perhaps a bit naive -- that there's some overlap between what Google wants and what I want with regard to internet access (although for different reasons).

Comment Re:What is the appeal of these things? (Score 1) 129

as always - things that HAVE NOT EVER been a problem.

Sure, but much of what I use the internet for doesn't "solve a problem"; it's just convenience. When I wanted to look up a word, I used to grab a dictionary instead of googling it. When I wanted to learn about some event in history or similar, I'd grab the encyclopedia instead of Wikipedia. Once laptops became commonplace, it was about the same speed to look things up on the net (assuming I had to wake the laptop first); now that smartphones are ubiquitous, it's decidedly faster to just whip out your phone.

The only "problem" that was solved was some gain in efficiency. Now it seems we're at the point of diminishing returns, to be sure, but that doesn't mean that setting an alarm/adding items to a shopping list/etc. can't be streamlined a tiny bit more.

Personally, the killer app as I see it is a more robust silent alert (I sometimes don't feel the vibration from my phone) along with the ability to quickly see whether I should just ignore the alert or address it (it takes about 1s to glance at a watch and determine if something requires my attention, vs. several seconds to take my phone out of my pocket -- and the former can be done without significant hand movement).

My limited experience with this is based on a $15 "smart" watch that was extremely flaky, but when it worked I was certainly happy with the workflow.

Comment Re:When will VideoCards peak? (Score 1) 89

Nice. I was thinking more in terms of the computational power required for arbitrary photo-realistic graphics at this resolution. I'm not even sure that's a well-posed question, though. But perhaps one could settle on the minimum size of a polygon, the texture resolution required, etc., and come up with some heuristic argument for the theoretical GPU horsepower needed to drive an imperceptibly high frame rate at the "44.1kHz/16bit video" resolution/bitdepth, displaying an arbitrarily complex (up to the limit of human perception) scene.

I like and agree with your calculation, up to (perhaps?) a factor of two courtesy of ol' Nyquist–Shannon (e.g., human hearing is often quoted as good to 20kHz, but we have to sample at twice that, ~40kHz). Also, the human eye has a phenomenal dynamic range, so the bitdepth might need to be pretty high, although I think the full dynamic range is only realized over a very slow timescale (we can see just fine in sunlight at ~1kW/m^2 illumination, but we can also make out object outlines at night when an entire room is lit only by an LED pushing 100 mW).
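
Just to put rough numbers on it, here's the kind of back-of-the-envelope sketch I have in mind (Python; every figure -- the 60 pixel/degree acuity limit, the field of view, the 90Hz flicker limit, the 12-bit channels -- is my own guess, not any established spec):

    # Back-of-envelope "eye-limited" video spec, analogous to 44.1kHz/16-bit audio.
    # All numbers below are rough guesses for illustration only.
    acuity_ppd = 60                  # pixels per degree (~1 arcminute resolving limit)
    fov_h_deg, fov_v_deg = 150, 135  # approximate binocular field of view, degrees
    flicker_hz = 90                  # rough temporal perception limit
    nyquist = 2                      # sample at twice the highest frequency of interest
    bits_per_channel = 12            # generous instantaneous dynamic range
    channels = 3                     # RGB

    pixels = (acuity_ppd * fov_h_deg) * (acuity_ppd * fov_v_deg)
    frame_rate = flicker_hz * nyquist
    bandwidth_bps = pixels * frame_rate * bits_per_channel * channels

    print(f"{pixels / 1e6:.0f} Mpixel per frame")             # ~73 Mpixel
    print(f"{frame_rate} frames per second")                  # 180 fps
    print(f"{bandwidth_bps / 1e12:.1f} Tbit/s uncompressed")  # ~0.5 Tbit/s

Treat the output as an order-of-magnitude figure at best; halve the field of view or the bit depth and it moves around accordingly.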

Comment Re:When will VideoCards peak? (Score 2) 89

...44.1kHz 16 bit audio is relatively trivial...

So, is there an analogous specification for video cards? The 44.1kHz @ 16bit is pretty easily justified (Nyquist–Shannon + reasonable dynamic range). Can a visual equivalent be justified as easily? That is to say, at a sort of "eye-limited" (Retina, in Apple lingo) resolution and field of view, how many polygons can be said to make up the human perception of reality, and what sort of graphics processing muscle would be required to drive this?

I of course have no idea; I'm just wondering out loud and trying to approach the OP's question in a pseudo-scientific fashion.
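
For the polygon side of the question, a similarly crude sketch (using the same made-up figures as the bandwidth estimate further up the page, plus the hand-wavy assumption of roughly one triangle per eye-limited pixel) gives a ballpark for the geometry throughput:

    # Crude polygon-throughput estimate: assume ~1 visible triangle per
    # eye-limited pixel per frame. Purely illustrative; real renderers
    # cull, reuse geometry, and cheat heavily.
    pixels = (60 * 150) * (60 * 135)  # eye-limited pixel count, as before
    frames_per_second = 180           # Nyquist-ish temporal sampling
    triangles_per_pixel = 1           # geometry detail at the acuity limit

    triangles_per_second = pixels * frames_per_second * triangles_per_pixel
    print(f"~{triangles_per_second / 1e9:.0f} billion triangles/s")  # ~13 billion

Again, order-of-magnitude hand-waving at best; shading and lighting would presumably dwarf the raw geometry cost anyway.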

Comment Re:What's bad for the telcos (Score 1) 194

In the vast majority of cases, I certainly agree. However, there are some instances where -- at least in a very narrow sense -- "big business" has interests more-or-less aligned with the consumer's. As an example, Netflix and Google both have seemingly reasonable stances on internet openness (at least in the USA). Whether you want to look at this as big business doing The Right Thing or as big business looking to lower their costs to increase their bottom line is up to you.

I almost* feel sorry for the telcos; they're sort of a necessary evil in that they don't offer anything other than a means to an end -- a required but largely thankless service. Netflix has movies, I want to watch them, and the fact that they have to go over a series of tubes to get to me isn't something I really care about or even notice, unless it doesn't work flawlessly.

*Well...actually not at all.
