
Comment Re:Designed in the US, produced elsewhere (Score 1) 74

Ok, we design things in California.

Often that design process amounts to a US company contacting a design house in Taiwan, which produces a bespoke design to which the actual designer claims no rights. It's then "Designed in the USA", because someone in the US approved and paid for the design.

Comment FORTRAN IV on a CDC 6600 (Score 1) 617

First language was FORTRAN IV on the CDC 6600 at UT Austin.

I learned BASIC a couple of years later, then various assembly languages (specifically including 6600 CP COMPASS), then Pascal on the 6600. A little LISP on the 6600, then more assembly languages (PDP-11 and DEC-10 in particular, plus Intel 8080, Motorola 6800 and 6809, and I don't know what all else).

Starting in about 1988, I was doing Ada and C. I started doing C++ during refresher work at UT Austin in 2003-2004, and still am.

Comment Re:Damage from BASIC (Score 1) 617

This.

I used BASIC because it was what was available on the machine I was paid to write for.

My BASIC, though, looked more like good FORTRAN than most BASIC, with thought-out calls, etc.

If the language you need to use doesn't have the control structure you need, just write it.

Although I don't miss worrying about which line numbers to put routines at for efficiency (MBASIC until version 5 or so would search through memory on a GOTO or GOSUB, making calls to low-numbered lines faster than calls to high-numbered ones).
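A toy sketch of that effect, in Python rather than MBASIC itself (the program layout is invented): if the interpreter scans the stored program from the top on every GOTO or GOSUB, targets near the start are found after a few comparisons, targets near the end only after thousands.

    program = list(range(10, 64000, 10))      # hypothetical stored line numbers

    def goto_cost(target):
        """Number of stored lines examined before `target` is found."""
        for steps, line in enumerate(program, start=1):
            if line == target:
                return steps
        raise ValueError("Undefined line number")

    print(goto_cost(20))       # routine near the top: found almost immediately
    print(goto_cost(63000))    # same routine near the end: ~6300 comparisons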

And it's amazing that no one has pointed out the adage that a sufficiently skilled programmer can write bad FORTRAN in any language . . .

hawk

Comment Re:The current system is stupid. (Score 1) 171

No, that won't work. Changes may have taken place between the two fetches of robots.txt.

An example: a newspaper.
At the first fetch of robots.txt, an article might not exist yet. A first version of the article, not yet verified, is then published along with a new robots.txt that tells robots not to crawl it. Later the article is corrected and verified, and a new robots.txt is published that allows crawling it again.
A spider may have fetched the first robots.txt from before the article existed, fetched the article while it was in bad shape, and fetched the second robots.txt from after it was corrected. Both copies of robots.txt agree that the article may be crawled and cached, yet the copy that was actually crawled was never meant for caching, and the robots.txt in force when it was published said so.
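A minimal sketch of that race, using Python's standard urllib.robotparser; the article URL and the three robots.txt snapshots are invented for illustration:

    from urllib.robotparser import RobotFileParser

    def parser_for(robots_txt):
        rp = RobotFileParser()
        rp.parse(robots_txt.splitlines())
        return rp

    ARTICLE = "/news/draft-article.html"          # hypothetical article URL

    # t0: before the article exists -- nothing is disallowed
    robots_t0 = "User-agent: *\nDisallow:\n"
    # t1: unverified draft published, crawling of it forbidden
    robots_t1 = "User-agent: *\nDisallow: " + ARTICLE + "\n"
    # t2: article corrected and verified, crawling allowed again
    robots_t2 = "User-agent: *\nDisallow:\n"

    # The crawler only ever compares the t0 and t2 copies of robots.txt,
    # but it fetched the article itself at t1.
    before, after = parser_for(robots_t0), parser_for(robots_t2)
    print(before.can_fetch("*", ARTICLE))                  # True
    print(after.can_fetch("*", ARTICLE))                   # True
    # The robots.txt actually in force when the draft was fetched said no:
    print(parser_for(robots_t1).can_fetch("*", ARTICLE))   # False

Both surviving copies match, so comparing them tells the crawler nothing about the window in which the draft was actually disallowed.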

Comment Re:The current system is stupid. (Score 1) 171

The problem with robots.txt is that it doesn't contain a validity period.

Say I add mustnotbecrawled.html, a link to it in existingpage.html, and a modification to /robots.txt that bans crawling of mustnotbecrawled.html. The problem is that a robot might have downloaded robots.txt just before I published, so it never sees that it shouldn't crawl the new page. So it does.

It could be argued that a crawler should always re-load robots.txt when it encounters a document newer than the time it last fetched robots.txt, but that adds a lot of extra requests.
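A rough sketch of that policy, assuming a Python crawler built on the standard urllib.robotparser; the site URL, bot name, and Last-Modified handling are assumptions for illustration, not a real crawler:

    from email.utils import parsedate_to_datetime
    from urllib.request import urlopen
    from urllib.robotparser import RobotFileParser

    SITE = "https://example.com"                  # hypothetical site

    rp = RobotFileParser(SITE + "/robots.txt")
    rp.read()        # fetch and parse robots.txt
    rp.modified()    # record the fetch time; rp.mtime() returns it later

    def polite_fetch(path, agent="ExampleBot"):
        url = SITE + path
        if not rp.can_fetch(agent, url):
            return None                           # excluded up front
        with urlopen(url) as resp:
            body = resp.read()
            last_mod = resp.headers.get("Last-Modified")
        # Document is newer than our copy of robots.txt, so the rules may
        # have changed since we fetched them: re-load and re-check.  These
        # re-loads are the "lot of extra requests" mentioned above.
        if last_mod and parsedate_to_datetime(last_mod).timestamp() > rp.mtime():
            rp.read()
            rp.modified()
            if not rp.can_fetch(agent, url):
                return None                       # now disallowed: discard
        return body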

Some propose using the robots meta tag for excluding crawlers, but that has its own problems: it only works for HTML/XML-type documents, and it is applied after the fact. If I have an HTML document of several megabytes and want to exclude it to save bandwidth, the meta tag won't help; it actually adds a little extra bandwidth.
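A small sketch of why the tag is "after the fact": a crawler only discovers the exclusion once it has already downloaded and parsed the page. Plain Python html.parser; the page content is made up:

    from html.parser import HTMLParser

    class RobotsMetaParser(HTMLParser):
        """Collects the content of any <meta name="robots" ...> tag."""
        def __init__(self):
            super().__init__()
            self.directives = []

        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "meta" and (a.get("name") or "").lower() == "robots":
                self.directives.append(a.get("content") or "")

    # By the time this check can run, the (possibly multi-megabyte)
    # document has already crossed the wire.
    page = ("<html><head><meta name='robots' content='noindex,nofollow'></head>"
            "<body>several megabytes of article text ...</body></html>")
    p = RobotsMetaParser()
    p.feed(page)
    print(p.directives)    # ['noindex,nofollow']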

I think this should be handled at the user-agent level: crawlers identify themselves as crawlers, and the web server decides whether to serve them on that basis.
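A minimal sketch of that server-side approach, as a plain Python WSGI app; the crawler tokens and blocked path are assumptions, not any real site's policy:

    from wsgiref.simple_server import make_server

    CRAWLER_TOKENS = ("bot", "crawler", "spider")   # assumed self-identification
    BLOCKED_FOR_CRAWLERS = ("/drafts/",)            # hypothetical no-crawl paths

    def app(environ, start_response):
        agent = environ.get("HTTP_USER_AGENT", "").lower()
        path = environ.get("PATH_INFO", "/")
        is_crawler = any(tok in agent for tok in CRAWLER_TOKENS)
        if is_crawler and path.startswith(BLOCKED_FOR_CRAWLERS):
            # The server itself makes the call, per request, per agent.
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Crawling of this resource is not permitted.\n"]
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"Hello!\n"]

    if __name__ == "__main__":
        make_server("", 8000, app).serve_forever()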

Comment Re:Follow the funding and experts (Score 1) 35

You have all the universities (UCB, Stanford, Caltech, UCLA) as well as a critical mass of tech companies, which allows staff to move between firms and new companies to be created overnight. Getting a new job is as easy as going out for lunch. This applies to the East Coast as well. Neither area is lumbered with a large unemployed population claiming benefits.

Having so many corporations means that a startup can remain in stealth mode and keep under the radar of politicians and quangocrats. I've known companies implode because the local government office instructed them "not to promote anyone any further but instead to have a fresh talent initiative".

Weather isn't that much of a factor. Even places close to the Arctic Circle can have a strong tech base, provided the quality of life is high.
