Danish Court Rules Deep Linking Illegal

Jstein writes "In a court ruling on Friday, the court in Copenhagen, Denmark ruled in favor of the Danish Newspaper Publishers' Association against the online news aggregator Newsbooster. This is the first time deep linking has been ruled illegal." Currently the story is only available in Danish (from Computerworld Denmark, Online). Update: 07/05 23:15 GMT by T : ttyp writes "Here is a link to an English language story about the Danish deep linking case."
  • Deep linking? (Score:5, Insightful)

    by User 956 ( 568564 ) on Friday July 05, 2002 @12:36PM (#3827993) Homepage
    Too bad. Next week Time Magazine will require you to read pages 1-36 before reading the article you want on page 37.
  • Also Illegal: (Score:5, Insightful)

    by tswinzig ( 210999 ) on Friday July 05, 2002 @12:38PM (#3828021) Journal
    - Sending specific URLs to your friends via email.

    - Citing specific pages in your footnotes.

    - Pointing at specific locations with your finger.
  • Moronic. (Score:5, Insightful)

    by wirefarm ( 18470 ) <[ten.cdmm] [ta] [mij]> on Friday July 05, 2002 @12:44PM (#3828056) Homepage
    If you put a document on the web and make it accessible through a URL, anyone can use that URL to access it.

    Of course you can use referrer checks to control how people reach your document, but these people seem to lack the ability to do things like that.

    What if I bookmark a 'deep link'? What about Google?

    Personally, I think "deep link" is a misleading term - each document is equally accessible from outside, except for a few bytes in the length of the URL.

    Jim in Tokyo
  • New Meta Tag? (Score:4, Insightful)

    by randomErr ( 172078 ) <ervin.kosch@gmail . c om> on Friday July 05, 2002 @12:46PM (#3828072) Journal
    Just a thought but how about a couple of new Meta Tags:
    <meta http-equiv="LinkStatus" content="NoLink">
    <meta http-equiv="LinkTo" content="False">
    If browsers and search engines were set up properly, they could read the tag and ignore the link(s) on the page, or show a "page is unavailable" security-zone warning.

    I Corinthians 6:1
    Dare any of you, having a matter against another, go to law before the unjust, and not before the saints?
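The "LinkStatus" and "LinkTo" meta tags proposed above are the commenter's invention, not part of any HTML standard. As a sketch only, here is how a well-behaved crawler could honor such a hypothetical tag, in the same opt-in spirit as robots.txt or `<meta name="robots">`:

```python
# Sketch: honoring the commenter's hypothetical "LinkStatus" meta tag.
# Neither tag exists in any standard; this merely illustrates the idea.
from html.parser import HTMLParser

class LinkPolicyParser(HTMLParser):
    """Collects http-equiv meta values so a crawler can check link policy."""
    def __init__(self):
        super().__init__()
        self.policies = {}

    def handle_starttag(self, tag, attrs):
        # html.parser lowercases tag and attribute names for us.
        if tag == "meta":
            attrs = dict(attrs)
            if "http-equiv" in attrs:
                self.policies[attrs["http-equiv"].lower()] = attrs.get("content", "")

def deep_linking_allowed(html):
    """Return False only if the page declares LinkStatus=NoLink."""
    parser = LinkPolicyParser()
    parser.feed(html)
    return parser.policies.get("linkstatus", "").lower() != "nolink"

page = '<html><head><meta http-equiv="LinkStatus" content="NoLink"></head></html>'
```

Like robots.txt, this would only restrain cooperative software; as BlowCat notes below, nothing stops a user from running an "improperly set up" browser that ignores the tag.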
  • by Stuart Gibson ( 544632 ) on Friday July 05, 2002 @12:46PM (#3828074) Homepage
    Let's hope that this doesn't mean that deep linking in itself becomes illegal. There may be cases where advertising-revenue pages are bypassed, or some other legitimate reason why the content publisher would rather users came via their front page.

    However, it is well known that deep linking is good linking [] as far as users go.

    I don't suppose there's any chance that publishers will come to a gentleman's agreement that it is improper to deep link if they explicitly ask not to (in the same way as it is considered "impolite" to provide direct links to files on others' servers).

    Finally, if DeCSS code can be considered "free speech", how can writing a URL not be subject to the same rationale?

  • Re:Hmmm. (Score:5, Insightful)

    by Fastolfe ( 1470 ) on Friday July 05, 2002 @12:48PM (#3828092)
    I disagree. If you're sticking something up on a web site, that something has a URL. Every entity on a web site has its own unique URL that should be retrievable anywhere.

    If you don't like this behavior, and you want "pages" on your site to only be accessible by people browsing through your site, you're going to need to stick a "document retrieval" application layer onto your site. Users start a session when they enter this application, and are only able to retrieve stories through this application front-end. This can be done through HTTP as simply as with a session ID, but the web was not meant to work like this.

    Again, we have a rather useful technology being twisted and warped by corporate interests instead of those corporate interests funding a proper technological solution, just like the intellectual property crap associated with DNS nowadays.
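The session-gated "document retrieval" layer described above can be sketched in a few lines. This is a hypothetical illustration (the in-memory session set and function names are invented); a real site would use cookies, expiry, and persistent storage:

```python
# Sketch of a session-gated retrieval layer: visitors get a session ID
# at the front page, and article requests are honored only for known
# sessions, so a bare deep link arrives with no session and gets nothing.
import secrets

_sessions = set()  # stand-in for a real session store

def enter_site():
    """Front page: start a session and hand back its ID."""
    sid = secrets.token_hex(16)
    _sessions.add(sid)
    return sid

def fetch_article(sid, article_id):
    """Article pages refuse requests that lack a live session."""
    if sid not in _sessions:
        return None  # deep link without a session: denied
    return f"article {article_id}"
```

As the comment says, HTTP can carry this (the session ID would ride in a cookie or URL parameter), but it works against the web's stateless, every-resource-has-a-URL design.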
  • Referer (Score:2, Insightful)

    by sirisak ( 590510 ) on Friday July 05, 2002 @12:49PM (#3828103)
    A number of large sites, both corporate and strictly informative, use an HTTP Referer mechanism to transport you to the top-level page if you just "ended up" in the middle of the site. Used properly, this is a good example of user-friendly interface engineering without being obnoxious. Just my $.02.
  • by silicon_synapse ( 145470 ) on Friday July 05, 2002 @12:56PM (#3828159)
    If I go rent a movie instead they'll lose business too (even more so). That doesn't mean it should be illegal. Life isn't fair. You can't legislate a profit although many seem to enjoy the challenge.
  • by alsta ( 9424 ) on Friday July 05, 2002 @01:06PM (#3828219)
    I can recall a lot of people putting their bookmark.htm file online and using it as a start page. Should bookmarks as we know them be made illegal too? Because that's "deep linking" too, if you think about it.
  • Re:New Meta Tag? (Score:4, Insightful)

    by BlowCat ( 216402 ) on Friday July 05, 2002 @01:08PM (#3828234)
    Then I'll use an "improperly setup" browser. Software should be on the side of the users, or the users will choose other software.
  • Re:Hmmm. (Score:5, Insightful)

    by JamesOfTheDesert ( 188356 ) on Friday July 05, 2002 @01:10PM (#3828245) Journal
    It's when you link to a second, third, fourth, etc level of a website.

    What's a "level"? If there is a specific, direct URL to an item, then it is already at the "top" level. That there are other ways to arrive at that URL is a conceptual design decision, not a feature of hyperlinking or the Web itself. There is no "top" of a web site, other than mental constructs people impose on it, unless the web server enforces a particular sequence of URLs.

    What's (almost) funny is that this is trivially easy to do, and it just has to be cheaper than suing people, unless you are collecting damages each time.

  • by jellomizer ( 103300 ) on Friday July 05, 2002 @01:13PM (#3828265)
    If you don't want your content deep linked, then hire a web developer who can write CGI scripts in ASP, PHP, JSP, or whatever to prevent unauthorized access. Once you set up minimal security so the data is only available through the webpage, you can treat any attempt to deep link as a form of cracking (or hacking, for people who want to be that way).
  • Re:Deep linking? (Score:3, Insightful)

    by Oculus Habent ( 562837 ) <oculus DOT habent AT gmail DOT com> on Friday July 05, 2002 @01:14PM (#3828272) Journal

    Companies could prevent deep linking in a heartbeat just by redirecting anything that wasn't referred by their domain. That way people couldn't even send "deep links" to friends in e-mail...

    It's a great way to have huge amounts of inaccessible information on a web page... Like phone trees, only more pathetic.

  • by jdavidb ( 449077 ) on Friday July 05, 2002 @01:15PM (#3828280) Homepage Journal

    The reason "deep" linking should not be illegal is because there is no fundamental difference between a deep link and a regular link. We should quit playing the game by using this term to distinguish "deep" links from others.

    You can't come up with a clear, unambiguous definition of deep links without having a special database or extension to the DNS database (!) to indicate what a site considers to be deep links on a case-by-case basis. In other words, the only clear and concise definition of a "deep" link is "a page on the website of Somebody Powerful that that Somebody doesn't want me to link to."

    You can't just say, "A deep link is a link that goes somewhere besides the top of a site." For example, this [] is a deep link (to a website that has tried to force people not to link to them, I might add), while this [] is not. Both are links to something other than a bare domain name, but the second is the top page of a site.

    The real problem is web newbies (big media companies) think every website should have one entry point, but the web wasn't designed that way. We should quit helping these people persist in their misunderstanding of reality by using the term "deep link."

  • by marhar ( 66825 ) on Friday July 05, 2002 @01:19PM (#3828299) Homepage
    The web has links, period. The term "deep link" was created by individuals who fundamentally don't understand the nature of the web. Using their terminology makes it much easier for them to stay on the offensive.

    Sorry to sound so RM-esque, but sometimes the words really *do* matter... :-/
  • by jhoffoss ( 73895 ) on Friday July 05, 2002 @01:40PM (#3828452) Journal
    Given that technology can prevent deep linking, it is most efficient if the law is in accordance with that technological reality.
    And if you want to allow people to deep link to your site? Is it then illegal to do so? I know it's an unreal example, but technology (knives, guns, blunt objects, whatever) can be utilized to injure, maim, murder, etc. someone... does that mean the law should also be in accordance with this technological reality?

    If the technology to prevent deep-linking is present, and a web-site that wishes to prevent deep-linking does not utilize (or attempt to utilize) this existing technology, they shouldn't be able to complain if someone deep-links to their site, IMNSHO.

    Note that I am making two assumptions: implementing anti-deep-linking technology does not require the time or resources it would take to build a wall around 10 acres; the second is that you are just as able to give permission to deep-link as you are able to give permission for someone to murder you.
  • by Lars Clausen ( 1208 ) on Friday July 05, 2002 @01:42PM (#3828475)
    Having read several of the actual documents involved in this case, let me say this: This case is not about deep linking at all. In no way. Whatsoever.

    What they're being sued over is having essentially copied the table-of-contents. They've taken the links and titles of all the newspaper articles directly from the webpage and presented them to users. Unlike /., they did not put their own title on the references or anything.

    Under Danish copyright law, an index can be copyrighted. This copyright was violated.

    This case sets no precedent for a site that collects links to articles about e.g. Linux, as such a site would have to put their own effort into making the index.


    Thank you.
  • by pbrammer ( 526214 ) on Friday July 05, 2002 @01:45PM (#3828495)
    Yes, but anti-deep-linking legislation is completely against the way the Internet works... The whole premise is that we can link to another page, thereby spinning a web of documents. If we couldn't "deep-link," then I presume that Google, et al., will have to shut down, considering the links they produce are "deep-linked." Wouldn't you say?

    And I disagree with you as to why we abide "No Trespassing" signs. It's because I don't want to get my a$$ shot off by some looney character with a shotgun in his lap.

    A better analogy might be the other way around... Take a look at research papers written MANY years ago. Take a look at their bibliography page. Is that not "deep-linking?" Thought so...

    That's all we need: more friggin' legislation to protect some ignorant people who are only comfortable if they're bitching about something. Face it, deep linking has been around for years. It makes me so sick that these people (who are new to the Internet) think they own the damn thing. Christ, perhaps the elders of the 'net need to speak up. I know I've been on since it was commercialized mainstream around 1993/94 and frankly, I'm appalled at where this is all heading.

  • Re:Deep linking? (Score:3, Insightful)

    by CaseyB ( 1105 ) on Friday July 05, 2002 @02:08PM (#3828643)
    Companies could prevent deep linking in a heartbeat just by redirecting anything that wasn't referred by their domain.

    I haven't been a webmaster in a while, but I think the spread of smart browsers and privacy firewalls that suppress "extraneous" information like Referer: headers would make this inadvisable.

  • A Conversation (Score:2, Insightful)

    by Phybersyk0 ( 513618 ) <phybersyko.stormdesign@org> on Friday July 05, 2002 @02:16PM (#3828690)
    wife: "you should really try out that new Thai restaurant, it's really good"
    husband: "oh, really? tell me more about it?"
    wife: "well, they've got great food, authentic atmosphere, native Thai cooks & waitstaff"
    husband: "wow, that sounds really great, can you tell me where it is?"
    wife: "no, i can't. you see, Kansas City has this rule that you can't tell anybody how to get to where you really want to go, they want you to first go to Kansas City, drive around for a few hours, until you happen to see a road sign for your particular destination, and then you'll find out the location"
    husband: "Wow, that's stupid!"
    wife: "I know." ----EOF
  • by dirk ( 87083 ) <> on Friday July 05, 2002 @02:58PM (#3828992) Homepage
    There are technological ways around deep linking, of course. Checking the Referer header in an HTTP request is one option, and dynamically creating unique URIs on the pages you allow people to visit from is another.

    It would be nice if technology was used to prevent this rather than court rulings, but hey, what can you do?

    I am all for deep linking in most cases, and feel it should be legal. But I hate the idea that "there's a way to prevent it, so it shouldn't be ruled illegal." To me, that is the same as saying "There are ways to make your house burglar proof, so we shouldn't have to make breaking and entering illegal." Just because someone can prevent something from happening doesn't means they should have to prevent it. If we refuse to rule on things because there are ways to prevent it, what happens when those ways to prevent it are circumvented? Can we rule then? Or do we just wait for more ways to prevent the circumvention? I think deep linking is legal in most cases, but I want to see it remain legal because it is the right thing to do, not because there's a way to prevent it.
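The "dynamically creating unique URIs" option mentioned above can be done without any session state by signing each link the site itself mints. A hypothetical sketch (the secret, URL scheme, and function names are invented for illustration; nothing here describes what any real newspaper site ran):

```python
# Sketch of signed article links: only URLs generated by the site
# carry a valid HMAC signature, so a hand-constructed deep link
# (wrong or missing "sig" parameter) is rejected.
import hashlib
import hmac

SECRET = b"server-side secret"  # placeholder; kept private on the server

def make_link(article_id):
    """Mint an article URL with an HMAC signature over the article ID."""
    sig = hmac.new(SECRET, str(article_id).encode(), hashlib.sha256).hexdigest()
    return f"/article/{article_id}?sig={sig}"

def check_link(article_id, sig):
    """Verify that the signature matches the requested article ID."""
    expected = hmac.new(SECRET, str(article_id).encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)
```

Of course, a signed link that leaks (emailed, bookmarked, indexed) remains valid until the secret rotates, so this controls *who can mint* links rather than who can follow them - which rather supports dirk's point that technology alone doesn't settle the question.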
  • by davew2040 ( 300953 ) on Friday July 05, 2002 @03:50PM (#3829295) Journal
    ... was created by the marketoids. As soon as people realized that this whole online advertising hack was a way to make some money, then they started creating these silly legal defenses that just end up circumventing the natural order of the technological system that permits the business to transpire in the first place.

    It's funny how people have a tendency to take the law into their own hands the moment they think they have a handle on technology. I guess it's even funnier when judges go through with it.
  • by roystgnr ( 4015 ) <<roystgnr> <at> <>> on Friday July 05, 2002 @05:28PM (#3829771) Homepage
    I am all for deep linking in most cases, and feel it should be legal. But I hate the idea that "there's a way to prevent it, so it shouldn't be ruled illegal." To me, that is the same as saying "There are ways to make your house burglar proof, so we shouldn't have to make breaking and entering illegal."

    No, it's not quite the same. If you want to make the analogy a little more reasonable, imagine that you installed a little electronic box on your door that, when someone walks up and says, "I want to go inside", unlocks and opens the door for them. This same box could be configured to only let in family members, but you decided that it would just be easier to sue your curious visitors for breaking and entering, then sue anyone who told them your address for aiding the crime.

    If you want to make the analogy even closer, imagine that you live in a world where people enter others' houses this way, welcomed, billions of times a day, that they are unable to do anything but look around once inside, and that your only real complaint is that you wanted all your visitors to go to your neighbor's house and watch commercials first!

    Finally, no, an HTTP request is not "circumvention" any more than saying "I want to go inside" is. If someone discovered that making the HTTP request 5 kilobytes long broke into the web server, or that shouting "MACKEREL!" at the top of your lungs broke the door opener, that would clearly be circumvention, even though in each case you're just sending data or making noises. One set of data constitutes an understandable request (in the HTTP case, conforming to internationally recognized protocols); the other set is an intentional attempt to get in without making that request or having it answered.
