
Comment Re:Never look up (Score 1) 84

I personally tend to look at cameras with positive interest and sometimes wave at the (potential) guy at the other end. Silly, I know, and no doubt you will call me an idiot or worse, but I know what I am and I am self-assured enough to feel comfortable about it, so what do I care?

That would be "worse".
The three-letter agencies are not your friends, and it doesn't matter what you know you are; what matters to them is doing their job, which includes investigating matches. Looking at the numbers, the risk of becoming a false positive seems far higher than the chance of them picking the right person.
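
To put rough numbers on that base-rate problem, here's a back-of-the-envelope sketch in Python. Every figure in it is an assumption for illustration, not a real statistic:

    # Base-rate arithmetic with assumed numbers: even an accurate matcher
    # produces mostly false positives when the people sought are rare.
    population = 1_000_000       # faces the cameras see (assumed)
    wanted = 10                  # actual persons of interest (assumed)
    sensitivity = 0.99           # chance a wanted person is flagged (assumed)
    false_positive_rate = 0.001  # 0.1% of innocents flagged (assumed)

    true_hits = wanted * sensitivity
    false_hits = (population - wanted) * false_positive_rate
    print(f"true matches:  {true_hits:.0f}")    # ~10
    print(f"false matches: {false_hits:.0f}")   # ~1000
    print(f"P(wanted | flagged) = {true_hits / (true_hits + false_hits):.2%}")

With those assumptions, about 99% of the people flagged are innocent - the matcher is accurate, but the people sought are vanishingly rare.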

No matter how innocent you are, cops coming to fetch you at work or swatting your home with the neighbors watching is not going to make your future bright, however comfortable and carefree you are now.

And "worse" because what you do when deliberately offering your face to the cameras, is increasing the size of the haystack. And worse, smiling decreases the reliability of facial recognition, and increases the risk of false positives.
More hay to sift through, and more hay that can be mistaken for a needle. That's counterproductive.
Any cops who get your image popping up as a potential match are not going to be grateful for the extra work of investigating and clearing you.

Given that you don't seem to be an idiot, as evidenced by your clear writing, I call your actions "not even stupid".

Comment Re:Going Howard Hughes... (Score 1) 117

This could be used to carry large ungainly freight, like lifting a factory-built house onto a mountainside.

And what would that buy over using a helicopter to lift materials, other than extra risk and cost?

Human psyche being what it is, the world's biggest blimp will always primarily be a target. For ridicule and bullets.

Comment Re:I hope he wins his suit (Score 1) 692

An engineer is someone who has received an engineering degree from an academic institution.

Or someone who has passed as a journeyman in an engineering guild, such as the clockmakers. However, there are precious few engineering guilds left in the world.

(And, of course, those responsible for the engine on a train or boat, but that's a different kind of engineer.)

Comment Re:Sigh (Score 1) 177

God knows how much electricity 100,000 fast-charging stations pull. I doubt it's any more environmentally friendly than even 100,000 petrol cars.

Yes, supercharging is much worse for the environment than regular charging. The grid doesn't deliver enough juice for the stations at peak, so they have to store energy locally in battery buffers - another, quite lossy, conversion. And supercharging itself is less energy efficient, too: the heat loss is larger than with slower charging.
In countries that produce a good part of their electricity from coal and oil, that's not a good thing.
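
As a rough sketch of how the chained losses multiply (all efficiencies below are assumed round numbers, not measurements):

    # Chained conversion losses: grid -> buffer battery -> fast charge,
    # versus charging the car directly at lower power. Assumed values.
    grid_to_buffer = 0.95      # charging the on-site buffer (assumed)
    buffer_to_car_fast = 0.90  # fast charge, extra heat loss (assumed)
    grid_to_car_slow = 0.95    # direct slower AC charge (assumed)

    fast_path = grid_to_buffer * buffer_to_car_fast
    print(f"fast-charge path: {fast_path:.1%} efficient")   # ~85.5%
    print(f"slow-charge path: {grid_to_car_slow:.1%} efficient")

In this toy example the buffered fast-charge path wastes nearly three times as much energy as the direct path.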

Comment Re:Designed in the US, produced elsewhere (Score 1) 78

Ok, we design things in California.

Often the design process amounts to a US company contacting a design company in Taiwan, which produces a bespoke design to which the real designer claims no rights. It's then "Designed in the USA", because someone in the US approved and paid for the design.

Comment Re:The current system is stupid. (Score 1) 173

No, that won't work. Changes may have taken place in between the two fetches of robots.txt.

An example: A newspaper.
At the first fetch of robots.txt, an article might not exist yet. Its first version has not been verified, and is published together with a new robots.txt that tells robots not to crawl it. Then the article is corrected and verified, and a new robots.txt is published that allows crawling it again.
Yet a spider may have caught the first robots.txt from before the article existed, the article itself while it was in bad shape, and the second robots.txt from after the correction. Both robots.txt files agree that the article can be crawled and cached, yet the copy that was actually crawled was never meant for caching - and the robots.txt in force at the time it was published said so.
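
A small Python demonstration of the race, using the standard library's robotparser on three hypothetical robots.txt snapshots (the URL and rules are made up for illustration):

    from urllib import robotparser

    ARTICLE = "https://news.example/article.html"
    SNAPSHOTS = {
        "before publication": "User-agent: *\nDisallow:\n",
        "draft online":       "User-agent: *\nDisallow: /article.html\n",
        "after correction":   "User-agent: *\nDisallow:\n",
    }
    for when, body in SNAPSHOTS.items():
        rp = robotparser.RobotFileParser()
        rp.parse(body.splitlines())
        print(f"{when:20s} crawl allowed: {rp.can_fetch('*', ARTICLE)}")

The first and last snapshots both allow the crawl; only the middle one, which the spider never saw, forbade it.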

Comment Re:The current system is stupid. (Score 1) 173

The problem with robots.txt is that it doesn't contain a validity period.

Say I add mustnotbecrawled.html, a link to it in existingpage.html, and a modification to /robots.txt that bans crawling of mustnotbecrawled.html. The problem is that a robot might have downloaded robots.txt right before I published, and so never sees that it shouldn't crawl the new page. So it does.

It could be argued that a crawler should always re-fetch robots.txt when it encounters a document newer than its cached copy of robots.txt, but that adds a lot of extra requests.
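
A minimal sketch of that idea in Python, using the standard library's robotparser. The class name and bot name are invented for illustration, and it assumes the server sends a Last-Modified header:

    import urllib.request
    from email.utils import parsedate_to_datetime
    from urllib import robotparser

    class PoliteFetcher:
        # Re-fetches robots.txt whenever a page turns out to be newer
        # than the cached robots.txt snapshot (hypothetical mitigation).
        def __init__(self, robots_url, agent="examplebot"):
            self.agent = agent
            self.rp = robotparser.RobotFileParser(robots_url)
            self.rp.read()
            self.rp.modified()  # record when we fetched robots.txt

        def fetch(self, url):
            if not self.rp.can_fetch(self.agent, url):
                return None
            resp = urllib.request.urlopen(url)
            last_mod = resp.headers.get("Last-Modified")
            if last_mod:
                page_time = parsedate_to_datetime(last_mod).timestamp()
                if page_time > self.rp.mtime():
                    self.rp.read()      # page is newer: re-check the rules
                    self.rp.modified()
                    if not self.rp.can_fetch(self.agent, url):
                        return None     # rules changed; drop the page
            return resp.read()

The extra requests only happen when a page is newer than the robots.txt snapshot, which bounds the overhead somewhat.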

Some propose using the robots meta tag for excluding crawlers, but that has its own problems, like only working for HTML documents, and being applied after the fact: if I have a several-megabyte HTML file and want to exclude it to save bandwidth, the meta tag won't help. The crawler still has to download the document to read the tag, so it actually adds a little extra bandwidth.

I think this should be handled at the user-agent level, where crawlers identify themselves as crawlers, and the web server makes the decision on whether to serve them based on that.
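
A toy illustration of that server-side decision in Python (the bot names in the blocklist are invented):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    BLOCKED_CRAWLERS = ("examplebot", "somecrawler")  # hypothetical names

    class GatingHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            agent = self.headers.get("User-Agent", "").lower()
            if any(bot in agent for bot in BLOCKED_CRAWLERS):
                self.send_error(403, "Not served to crawlers")
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"Hello, human-driven browser!\n")

    HTTPServer(("", 8080), GatingHandler).serve_forever()

Of course this only works for crawlers honest enough to identify themselves, which is the same trust model robots.txt already relies on.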

Comment Re:Pretty obvious (Score 0) 388

Let's take a far simpler feature example, like when the tar utility added the xz compression flag -J. It didn't ruin everyone's workflows.

By the time -J arrived, the damage had already been done. When tar gained the ability to compress and decompress with -z, it broke compatibility for a long time until the bugs were fixed. It also opened the door to a new generation of scripts that no longer piped through a multi-threaded compressor (like pigz) but instead sent everything through a single bottleneck, greatly increasing runtime on multicore systems and blocking on slow I/O devices (including, ironically, tapes - the very thing the tape archiver was made for). tar using compression internally might be the most common reason for processes sitting in D state long enough to be observed.
It's a prime example of something that should have been left well enough alone. It did not solve any real problems.
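
For comparison, here is the pipeline style those internal flags displaced, sketched in Python (it assumes tar and pigz are installed and on PATH, and "mydata" is a placeholder directory):

    # Equivalent of `tar -cf - mydata | pigz > mydata.tar.gz`:
    # tar stays a plain archiver, and compression runs multi-threaded
    # in a separate process instead of inside tar.
    import subprocess

    def tar_with_pigz(src_dir, out_path):
        with open(out_path, "wb") as out:
            tar = subprocess.Popen(["tar", "-cf", "-", src_dir],
                                   stdout=subprocess.PIPE)
            pigz = subprocess.Popen(["pigz"], stdin=tar.stdout, stdout=out)
            tar.stdout.close()   # so pigz sees EOF when tar finishes
            pigz.wait()
            tar.wait()
            if tar.returncode or pigz.returncode:
                raise RuntimeError("tar | pigz pipeline failed")

    tar_with_pigz("mydata", "mydata.tar.gz")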
