
Comment dunno (Score 1) 113

I had AT&T for years and generally had no issues with them in my area. My wife has used various services on the Sprint network and is now with Sprint, and it mostly seems OK.

My current phone is on Verizon, and wow, for as much as they talk themselves up, at least in my area it sucks donkey ass. It's constantly at 3G or lower, often with no mobile data at all, and it drops calls at a gnat's fart. Not impressed for what it would be costing me.... now to be fair, it's mostly fine when I travel to other large areas, but I only do that a handful of times a year.

So despite being smack dead center in their blood-red coverage map, it is by far the worst service I have ever had... and I've had cellphones since the brick Motorolas with the rubber-ducky antennas and gaping wide gaps in analog coverage, and this is less reliable than that.

Comment Re:Designed in the US, produced elsewhere (Score 1) 71

Ok, we design things in California.

Often that design process amounts to a US company contacting a design firm in Taiwan, which produces a bespoke design that the real designers never claim rights to. It's then "Designed in the USA" because someone in the US approved and paid for the design.

Comment Re:What's changed? (Score 1) 279

Until recently we all consumed information through channels whose content had to appeal to the Ketchup Company.

Until recently it was expensive to spread ideas. The ideas that did get spread were all highly produced to make the most of the limited bandwidth those channels had (and by channels I mean any medium, print included).

The expense was paid for by dish soap and ketchup and such, which meant talking about some celeb's wardrobe malfunction was OK, but controversial ideas, or even bare facts that ruffled too many feathers, were squelched.

Now it is free to seed the space of ideas with your own. Anyone who wants to can find your idea, take it for their own, and spread it further, mutating it with their own unique spin. We're in the age of evolutionary computation of worldviews, memes writ large.

So for the first time, instead of being channeled like cattle down a high-walled path to the slaughter, we can all see what is happening elsewhere. If it is happening to us, we can post about it and be seen.

The manufactured consensus is obliterated when the blinders come off.

And advertisers will eventually come around. The freshest ideas will be generated wherever there is the least regard for advertisers (or the production values they once paid for), then travel down the idea digestive tract of the global human centipad, repackaged over and over with better production values and reconsumed until they're completely stale.

Perhaps we'll find that more advertisers, in search of the eyeballs seeking fresh ideas, will give up on appealing to everyone (e.g. Starbucks) and cater only to their own patrons.

Endless channels of information in perfect competition now exist. The consensus can no longer be manufactured by the advertisers.

The ideas are now king. Production is ever cheaper, and communication is just about free.

We're getting the Carfax on lots of stuff. The new-car-smell spray was always an illusion.

Comment Re:The current system is stupid. (Score 1) 171

No, that won't work. Changes may have taken place between the two fetches of robots.txt.

An example: a newspaper.
At the time of the first robots.txt fetch, an article might not exist yet. Its first, unverified version is then published together with a new robots.txt that tells robots not to crawl it. Later the article is corrected and verified, and a new robots.txt is published that allows crawling it.
A spider may have fetched the first robots.txt (from before the article existed), then the article while it was still in bad shape, and then the second robots.txt (from after it was corrected). Both copies of robots.txt agree that the article may be crawled and cached, yet the version that was actually crawled was never meant for caching, and the robots.txt in force when it was published said so.

Comment Re:The current system is stupid. (Score 1) 171

The problem with robots.txt is that it doesn't contain a validity period.

Say I add mustnotbecrawled.html, a link to it in existingpage.html, and a change to /robots.txt that bans crawling of mustnotbecrawled.html. The problem is that a robot might have downloaded robots.txt right before I published, so it never sees that it shouldn't crawl the new page. So it does.

It could be argued that a crawler should re-load robots.txt whenever it encounters a document newer than its cached copy of robots.txt, but that adds a lot of extra requests.
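
Here's a rough Python sketch of that re-check heuristic, just to show the idea; the requests library, the example.com URLs and the "examplebot" name are all stand-ins, not anyone's actual crawler:

import time
from email.utils import parsedate_to_datetime
from urllib.robotparser import RobotFileParser

import requests  # assumed available; any HTTP client would do

ROBOTS_URL = "https://example.com/robots.txt"   # hypothetical site
AGENT = "examplebot"                            # hypothetical crawler name

robots = RobotFileParser(ROBOTS_URL)
robots.read()                      # cached rules
robots_fetched_at = time.time()    # when that snapshot was taken

def polite_fetch(url):
    global robots_fetched_at
    if not robots.can_fetch(AGENT, url):
        return None                # cached rules already forbid it
    resp = requests.get(url, headers={"User-Agent": AGENT})
    last_modified = resp.headers.get("Last-Modified")
    if last_modified and parsedate_to_datetime(last_modified).timestamp() > robots_fetched_at:
        # Document is newer than our robots.txt snapshot, so the rules may
        # have changed too: re-fetch them and re-check before keeping it.
        robots.read()
        robots_fetched_at = time.time()
        if not robots.can_fetch(AGENT, url):
            return None            # discard; the newer rules forbid it
    return resp.text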

Some propose using the robots meta tag to exclude crawlers instead, but that has its own problems: it only works for HTML/XML-type documents, and it is applied after the fact. If I have a multi-megabyte HTML page and want to exclude it to save bandwidth, the meta tag doesn't help; the crawler has to download the whole page before it even sees the tag, and the tag itself adds a little extra bandwidth.

I think this should be handled at the user-agent level: crawlers identify themselves as crawlers, and the web server decides whether to serve them based on that.
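
A bare-bones sketch of that server-side check, using Python's standard http.server; the bot names and the blanket 403 policy are made-up placeholders, not how any particular server actually behaves:

from http.server import BaseHTTPRequestHandler, HTTPServer

BLOCKED_CRAWLERS = ("examplebot", "somespider")   # hypothetical crawler names

class CrawlerAwareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        agent = self.headers.get("User-Agent", "").lower()
        if any(bot in agent for bot in BLOCKED_CRAWLERS):
            # Self-identified crawlers are refused before any content is sent,
            # so nothing uncrawlable ever leaves the server.
            self.send_response(403)
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>regular content</body></html>")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CrawlerAwareHandler).serve_forever()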

Comment Re:who knew (Score 1) 228

lol just move closer you say

listen I am not moving closer to an industrial wasteland city packed with assholes just so I can ride my bike to work, I can do that out in the open air where I live, which I do

but apparently you think I must uproot my entire life JUST to ride a dumbshit bike to work, why? so I can be one of the trendy people living in a 100 sq foot rabbit cage and showing up to work smelling of sweaty asshole and armpit, 20 min late?

there's a reason people hate cyclists, and you have just shown a brief glimpse of why

Comment Re:Pretty obvious (Score 0) 380

Let's take a far simpler example of a feature: when the tar utility added the xz compression flag, -J. It didn't ruin everyone's workflows.

By the time -J arrived, the damage had already been done. When tar gained the ability to compress/decompress with -z, it broke compatibility for a long time until the bugs were fixed, and it opened the door to a new generation of scripts that no longer piped through a multi-threaded compressor (like pigz) but instead sent everything through a single-threaded bottleneck, greatly increasing runtime on multicore systems and blocking on slow I/O (including, ironically, tapes, the very thing the tape archiver was made for). tar compressing internally might be the most common reason for processes sitting in the D state long enough to be observed.
It's a prime example of something that should have been left well enough alone. It did not solve any real problems.
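
For what it's worth, here's a rough sketch of the two approaches, driven through Python's subprocess module; "data/" and the archive names are placeholders, and pigz has to be installed for the second variant:

import subprocess

# Single-threaded: tar runs gzip internally, one core does all the compressing.
subprocess.run(["tar", "-czf", "backup-single.tar.gz", "data/"], check=True)

# Multi-threaded: tar only archives; pigz compresses the stream on all cores.
with open("backup-parallel.tar.gz", "wb") as out:
    tar = subprocess.Popen(["tar", "-cf", "-", "data/"], stdout=subprocess.PIPE)
    pigz = subprocess.Popen(["pigz", "-c"], stdin=tar.stdout, stdout=out)
    tar.stdout.close()   # let tar see a broken pipe if pigz exits early
    pigz.wait()
    tar.wait()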
