ICANN Under Pressure Over Non-Latin Characters 471

RidcullyTheBrown writes "A story from the Sydney Morning Herald is reporting that ICANN is under pressure to introduce non-Latin characters into DNS names sooner rather than later. The effort is being spearheaded by nations in the Middle East and Asia. Currently there are only 37 characters usable in DNS entries, out of an estimated 50,000 that would be usable if ICANN changed naming restrictions. Given that some BIND implementations still barf on an underscore, is this really premature?" From the article: "Plans to fast-track the introduction of non-English characters in website domain names could 'break the whole internet', warns ICANN chief executive Paul Twomey ... Twomey refuses to rush the process, and is currently conducting 'laboratory testing' to ensure that nothing can go wrong. 'The internet is like a fifteen-story building, and with international domain names what we're trying to do is change the bricks in the basement,' he said. 'If we change the bricks there's all these layers of code above the DNS ... we have to make sure that if we change the system, the rest is all going to work.'" Given that some societies have used non-Latin characters for thousands of years, is this a bit late in coming?
This discussion has been archived. No new comments can be posted.

  • Changing a system (Score:5, Insightful)

    by Kamineko ( 851857 ) on Tuesday November 21, 2006 @12:06PM (#16931804)
    Changing a system which works is a very, very bad idea.

    Won't this open up the system to many more phishing attacks involving addresses which include non-Latin characters that look similar to Latin ones?
    • by Daniel_Staal ( 609844 ) <DStaal@usa.net> on Tuesday November 21, 2006 @12:14PM (#16931984)
      That's one possible problem. Then there are characters that are technically equivalent but have different representations. (Accented vowels, for instance: you can encode them directly, or you can encode the accent and the vowel separately.) You need some way to make sure they both go to the same place, whether in UTF-8, UTF-16, UTF-32 or whatever else people throw at it.

      And, of course, you need to make sure that when someone types this into a browser, some major DNS server someplace won't crash.

      I'm all for adding non-Latin characters. But I do recognize that it should be a slow process.
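The precomposed-versus-combining ambiguity described above is exactly what Unicode normalization forms address. A minimal Python sketch (illustrative only, not part of any DNS specification):

```python
import unicodedata

# U+00E9 (precomposed 'é') versus 'e' + U+0301 (combining acute accent):
# visually identical, but different code point sequences.
precomposed = "caf\u00e9"
decomposed = "cafe\u0301"

assert precomposed != decomposed  # raw code-point comparison fails

# Normalizing both strings to the same form (NFC here) makes them comparable.
nfc = unicodedata.normalize("NFC", decomposed)
assert nfc == precomposed
```

IDNA (RFCs 3490-3491) deals with this by running every label through a normalization step (nameprep, based on NFKC) before encoding, so both spellings resolve to the same name.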
      • And, of course, you need to make sure when someone types this into any application ever made that accesses the internet... Fixed.
      • by ericlondaits ( 32714 ) on Tuesday November 21, 2006 @12:33PM (#16932512) Homepage
        Accented vowels would be a problem, at least in Spanish. Though their use is "mandatory", people with mediocre spelling don't use them on the internet. Even people who use them don't always do so: even though accent placement is mostly regular, there are many (and very common) irregular placements.

        Let's say for instance we have an online shop for tea called "Sólo Té" (Tea Only). Both accents are due to irregular rules ("Sólo" = "Only" and "Solo" = "Alone", "Te" is a personal pronoun and "Té" = Tea). Some people would try the current www.solote.com, others would try the correct www.sóloté.com, some would try www.sólote.com and yet others www.soloté.com depending on their spelling capabilities.

        What this basically means is that in order to make sure everybody finds your domain and to avoid phishing you have to register four different domains.

        A solution to this problem could be what Google does right now with accents: map them to the unaccented vowel. Thus "Solo Te" and "Sólo Té" would both find the "Sólo Té" store.
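That accent-folding idea can be sketched in a few lines of Python: decompose to NFD, then drop the combining marks (Unicode category Mn). This illustrates the approach only; it is not Google's actual implementation:

```python
import unicodedata

def strip_accents(text: str) -> str:
    """Fold accented letters to their base letters (rough approximation)."""
    decomposed = unicodedata.normalize("NFD", text)
    return "".join(ch for ch in decomposed
                   if unicodedata.category(ch) != "Mn")

# All four spellings from the example collapse to the same search key.
assert strip_accents("Sólo Té") == "Solo Te"
assert strip_accents("soloté") == strip_accents("sólote")
```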
        • And what happens when the owners of "sol ote" (sol as in the sun), who already have a website, find people attempting to access it "helpfully" redirected to your example?
        • Re:Changing a system (Score:4, Interesting)

          by MrNougat ( 927651 ) <ckratsch@noSPAm.gmail.com> on Tuesday November 21, 2006 @01:56PM (#16934734)
          Though their use is "mandatory", people with mediocre spelling don't use them on the internet.


          I don't have mediocre English spelling, and I would use the correct accented characters in English words like "naive" - except I don't know how to type those characters. Like many people, I know how to type the characters that are on the keyboard. Additionally, because there's no need for me to type characters outside the ones printed on the keys on my keyboard to make the internets come down my tubes, I have no incentive to learn how to type any differently than I already do.

          It's not necessarily a matter of spelling ability.
          • Re: (Score:3, Informative)

            by ericlondaits ( 32714 )
            If you are Spanish-speaking (which was my example), not knowing how to place accents is not an excuse. They're a fundamental part of the language, unlike in English where they're only required for foreign words written in their original form.

            In Argentina some people have keyboards with a Spanish-language layout (that is, with extra letters) and some learn the ASCII codes and use the ALT key (along with the code typed on the numpad) to place accents and the letters Ñ and ñ (which are mandatory
      • by msobkow ( 48369 ) on Tuesday November 21, 2006 @05:17PM (#16939378) Homepage Journal

        We should not be changing the fundamental DNS, which is a programmer's and administrator's tool, not an advertising medium. It is founded, like programming languages, on a fundamental 7-bit ASCII character set, and is not intended to be used for NLS text.

        A far better solution is some form of VDNS that translates NLS text names into the proper domain name at the system level. That also allows the same domain to have multiple language translations to reflect localized product and service names.

        We seriously need to kick the general political community in the arse. They keep trying to impose technical decisions, and it fails as miserably as any corporate PHB's uninformed decisions. ASK the techies to propose solutions instead of shoving ill-conceived ideas down our throats.

        For example -- once you mandate multibyte domains, you implicitly mandate multibyte URL components. Goodbye direct mapping of names to the directories, file systems, and servers.

        Bad idea. Very bad idea.

        • The problem is that it was designed for natural language text in the US back when some computers could deal with the new fancy feature of lower-case letters and others couldn't, and when humans tended to get confused about that sort of thing even though they all spoke English, and some computers could deal with 8-bit bytes and punctuation while others were very limited. I don't know if the IBM 48-character character sets were still around, but 64-character was still widespread, and EBCDIC was certainly st
    • by KingJoshi ( 615691 ) <slashdot@joshi.tk> on Tuesday November 21, 2006 @12:25PM (#16932314) Homepage
      But it's not working, at least not for all those people who want non-Latin characters. It's been broken from the beginning. Sure, there are historical reasons why we have the system we do, but change is definitely needed. Twomey is right that a change can't be rushed and it needs to be done right (for reasons of security, compatibility, stability, etc). However, the change does need to occur and there needs to be some level of pressure to ensure that it happens.
    • by jmorris42 ( 1458 ) * <{jmorris} {at} {beau.org}> on Tuesday November 21, 2006 @12:28PM (#16932378)
      > Wont this open up the system to many more phishing attacks involving addresses which include non-latin characters which look similar to latin ones?

      Even worse, although your problem is reason enough to postpone this change. It will break the very idea of the Internet as a commons when URLs can't even be typed on all keyboards. There are good reasons why DNS didn't even include the whole ASCII set. Least common denominator is a good design decision. Every character currently allowed is easy to generate on ALL keyboards, can be printed in an unambiguous way by EVERY printing system, etc. Remember that a lot of wire services aren't even 7-bit ASCII clean; email addresses on a lot of news wires have to use (at) instead of @.

      More bluntly, of what use are the parts of the Internet I can't even type the domain name for? As things now stand I CAN, and have, snarfed firmware directly from .com.tw sites where I couldn't read any of the text. Learned things from sites where I couldn't read anything but the code text and command lines. Seen images and understood even when the captions were meaningless to me. I'm sure the reverse is equally true: those who do not speak English still benefit from the English majority of the Internet in the same way. All this because DNS is currently universal. Break that universal access feature and, frankly, they can just as easily ignore ICANN, get the hell off the Internet, and make their own walled-garden network based on IPv6 technology.

      At a minimum, unicode DNS should be restricted to IPv6 ONLY. No sense wasting scarce IPv4 resources on supporting walled off ghettos.
      • by krell ( 896769 )
        "More bluntly, of what use is the parts of the Internet I can't even type the domain name for?"

        There might be some places that would like to block going to sites that don't have certain character sets as their name.
      • by Sin Nombre ( 802229 ) on Tuesday November 21, 2006 @12:53PM (#16933116)
        'when URLs can't even be typed in on all keyboards'
        As far as Japanese goes, there are very usable technologies that allow you to type kanji using a standard Latin keyboard. It works pretty well, and I'm not sure what other languages have such options available, but since most of Asia uses the same kanji system I'm pretty sure that at least Asia has viable typing options.
        'of what use is the parts of the Internet I can't even type the domain name for?'
        It's of no use... to you. But then again, can you read Japanese, Korean, Arabic, Sanskrit or any other non-Latin language? No? Then your usability isn't in question here.
        • by Zaatxe ( 939368 ) on Tuesday November 21, 2006 @01:37PM (#16934276)
          As far as Japanese goes, there are very usable technologies that allow you to type kanji using a standard Latin keyboard. It works pretty well, and I'm not sure what other languages have such options available, but since most of Asia uses the same kanji system I'm pretty sure that at least Asia has viable typing options.

          I wonder how you got +4 mod points... this makes no sense at all!!

          Let's suppose you are Japanese and you travel to Brazil. Never mind whether you can speak Portuguese or not; you need to send an e-mail using your company's webmail server from a computer at the hotel. And suppose this webmail server has kanji characters in its URL. How are you going to type them? Believe me, Brazilian Portuguese Windows has no support for Asian languages (at least not by default, and actually I don't know if it's even possible with a regular Brazilian Windows XP). What now?
        • by dasunt ( 249686 ) on Tuesday November 21, 2006 @02:01PM (#16934870)
          As far as Japanese goes, there are very usable technologies that allow you to type kanji using a standard Latin keyboard. It works pretty well, and I'm not sure what other languages have such options available, but since most of Asia uses the same kanji system I'm pretty sure that at least Asia has viable typing options.

          I must have missed where Japan conquered 51%+ of the area east of the Ural mountains.

          AFAIK (and I'm not an expert), China, Japan, Korea and Vietnam used very similar writing systems descended from Chinese hanzi characters. Vietnam and Korea (South Korea at least) later adopted other alphabets. So really, only China and Japan commonly use hanzi/kanji, and even then, the CJK unification really annoyed a few purists when similar hanzi/hanja/kanji were merged in Unicode.

          So, other than hanzi/kanji, there is hangul (S. Korea), kana (Japan: yes, they have more than one writing system!), the Thai alphabet, the Cyrillic alphabet (former USSR), the Arabic alphabet (Middle East), Hebrew (Israel), the Brahmic scripts (India) and the Georgian alphabet. (And this is just off the top of my head; I wouldn't be surprised if there were a few more writing systems in use in Asia!)

          And then, just to confuse the problem, there are the various forms of encoding. Admittedly, unicode would probably be one of the better methods, but there are a lot of pre-unicode encodings in common use.

          When you expand the problem to be worldwide, there's also the Ethiopian and Greek alphabets that are used in their respective regions. There's also a ton of latin-based alphabets, which introduces many more characters than are currently used in the DNS system. (Including characters that look a lot like existing characters!)

          And then you have the problem of alphabets used only by very small groups, such as Cherokee (Oh, I'm going to get flamed!). There are very few people who can write in Cherokee, but does that mean that the Cherokee language shouldn't be part of the DNS system?

          Now, can you see why this is a mess?

      • by teh kurisu ( 701097 ) on Tuesday November 21, 2006 @12:57PM (#16933216) Homepage

        Just because the letters aren't printed on your keyboard doesn't mean it won't type them. Have a look at the list of keyboard layouts in your OS. Sure, it's an inconvenience for you, but less of an inconvenience than it is to the people for whom it is a barrier to entry. Or you could use Google - a lot of people don't even bother typing in domain names any more, they just search.

        The whole point about this is that it avoids walled gardens, because the DNS records are still held by ICANN. The alternative is that China decides it's had enough, and creates its own root servers, causing a very real split.

      • by Znork ( 31774 ) on Tuesday November 21, 2006 @02:13PM (#16935194)
        "It will break the very idea of the Internet as a common when URLs can't even be typed in on all keyboards"

        You know, when one sees comments like that, it's not strange that non-7-bit-ASCII countries find themselves rather exasperated with the rate of progress. If you take a few seconds to actually research the issue you'll find both a suggestive lack of multi-thousand-key keyboards, as well as a whole host of solutions to that problem.

        I mean, I can cut and paste Chinese and Japanese into vi, save the file with a Unicode filename, and it'll just work. The earlier valid technical reasons are gone; everyone else has solved this. Now the excuses start sounding really hollow.

        It's time to drag DNS kicking and screaming out of the dark ages.
    • Re: (Score:2, Insightful)

      by imbaczek ( 690596 )
      Except that it doesn't. Being allowed to use 37 characters as a domain name is not what many people consider "working".
    • Re: (Score:3, Insightful)

      by Tet ( 2721 )
      Wont this open up the system to many more phishing attacks involving addresses which include non-latin characters which look similar to latin ones?

      Potentially, yes. But I'm not too bothered about that. Protecting people from their own stupidity is rarely a good long term strategy. However, i18n for DNS is a particularly bad idea for purely pragmatic reasons. Currently, anyone anywhere in the world can go to any URL in the world in their web browser. If we allow the full range of unicode characters, that s

    • They are going to have to wait for the system to be capable of using more characters.

      It's just a fact of life that the encoding scheme implemented has a limited set of characters, readable by the technically adept people who built the thing.

      It's a great idea to enable lookup by character strings using alphabets from other languages, but if it takes time to implement a global standard, that's just too bad.

      If they are desperate to implement lookup in their own character sets then let them get on with it - b
    • Re:Changing a system (Score:5, Informative)

      by Anonymous Coward on Tuesday November 21, 2006 @01:39PM (#16934332)
      What's this? I've been able to use the Norwegian characters in domain names for a long time. There are screenshots over at http://en.wikipedia.org/wiki/Internationalized_domain_name [wikipedia.org]
  • What? (Score:5, Funny)

    by Aladrin ( 926209 ) on Tuesday November 21, 2006 @12:06PM (#16931808)
    Wait, so it's not tubes... It's a 15 story building?

    Anyone else getting more lost every day?
    • Re: (Score:3, Funny)

      by rubycodez ( 864176 )
      those that live in 15 story buildings made of glass tubes should not throw brick laptop power supplies.
    • Re: (Score:2, Funny)

      by jmyers ( 208878 )
      No, there are only 15 stories about the internet that are just retold with slight modifications. One is about tubes, one about bricks, etc, etc, etc...
    • Re: (Score:3, Funny)

      It's a 15-story building made of tubes and supported by a brick basement, on a flatbed truck headed down the information superhighway.
    • Well, it's not a truck, that's for damn sure.
    • 1. Physical
      2. DataLink
      ...
      6. Presentation
      7. Application
      8. Tubes
      9. Bricks
      10. Porn
      11. Google
      12. YouTube
      13. ??
      ...
      16. Profit

      It was hard enough remembering them all back when there were only 7.
  • by account_deleted ( 4530225 ) on Tuesday November 21, 2006 @12:07PM (#16931838)
    Comment removed based on user account deletion
    • Re: (Score:2, Interesting)

      by Aladrin ( 926209 )
      And mail. And ... Hmm, yeah, the whole thing.

      Seriously... How many mail servers are going to freak out because they can't handle unicode?
    • by mosel-saar-ruwer ( 732341 ) on Tuesday November 21, 2006 @12:30PM (#16932434)

      Now if you'll excuse me, I need to finish reading all the new posts on 66.35.250.150.

      Base-Ten CHAUVINIST!!!

      What about societies that use Base 2 [binary], or Base 8 [octal], or Base 16 [hexadecimal]?

      Or entire societies, like the British empire, which use no base at all?

      12 inches in a foot. 3 feet in a yard. 1760 yards in a mile...

      60 seconds in a minute. 60 minutes in an hour. 24 hours in a day. 7 days in a week. 52 weeks in a year [give or take]...

      Or how about base 12?

      12 keys in a chromatic scale: A 440, then, logarithmically [give or take a little well-tempering [amazon.com]]: A#, B, B# == C [kinda sorta], C#, D, D#, E, E# == F [kinda sorta], F#, G, G#, and finally A 880.

      Except that on the continent, things are often just a little sharper - say A 443/444/445 & A 886/888/890...

      And let's not even get into water freezing & boiling at 32 & 212 versus 0 & 100...

    • Philistine! I'm off to http://[2001:200:0:8002:203:47ff:fea5:3085]/ [2001200080...fffea53085].

      So nyeeer.

      (Unfortunately, SlashCode mangles IPv6 addresses, so don't bother clicking.)
  • Yes and No (Score:5, Insightful)

    by Aadain2001 ( 684036 ) on Tuesday November 21, 2006 @12:08PM (#16931842) Journal
    Yes, countries that use non-English characters should be able to interact with the rest of the world using their natural language. No, they shouldn't rush the change and risk a possible crash of a large portion of the Internet. Be patient, young Padawans: soon you will be able to have DNS names with any character you can think of, and it will be reliable and actually work.
    • Besides, think of how well prepared DNS will be to start supporting lookups in extra-terrestrial languages when the time comes if we do this now! We'll be completely compatible with Martian, Klingon, Minbari, and Vulcan networking systems the day we meet them! We should be able to view each other's pron almost immediately!
    • Um... why? (Score:3, Informative)

      by Colin Smith ( 2679 )
      "Yes, countries that use non-English characters should be able to interact with the rest of the world using their natural language."

      Why... No really. You speak as if this is a good thing. Why should they be able to use their natural language rather than English? Why shouldn't they be restricted to a limited area of local language speaking people?

      The reason the Internet is useful is because everyone speaks TCP/IP. Incompatible protocols are to be actively discouraged because they balkanise the network. Langu
  • Plans to fast-track the introduction of non-English characters in website domain names could 'break the whole internet', warns ICANN chief executive Paul Twomey

    Luckily for us, GWB knows that we have some redundancy with the Internets, so if one breaks we can just use another.
  • Given that some bind implementations still barf on an underscore, is this really premature?
    Maybe it's time to get rid of Bind? The Model-T of DNS implementations...
    • by igb ( 28052 )
      Actually, back in the day bind _did_ tolerate underscores. I remember the anguish we had flushing machines called things like fileserver_one out the day that Vixie et al decided to enforce the standards. The DNS standards say [-a-z0-9], with dot as a delimiter. It's not the place for implementations to play hooky with that. For years there were hacks in bind to allow you to choose between accepting underscores in master zones (bad idea), secondary zones (quite bad idea) and recursive queries (sometimes
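The letter-digit-hyphen rule the parent cites (RFC 952 as relaxed by RFC 1123) is easy to state as a per-label check; a sketch of why names like fileserver_one get rejected:

```python
import re

# One DNS label: 1-63 letters, digits or hyphens,
# not starting or ending with a hyphen (RFC 952 / RFC 1123).
LABEL = re.compile(r"^(?!-)[A-Za-z0-9-]{1,63}(?<!-)$")

def is_ldh_hostname(name: str) -> bool:
    """Check every dot-separated label against the LDH rule."""
    labels = name.rstrip(".").split(".")
    return all(LABEL.match(label) for label in labels)

assert is_ldh_hostname("slashdot.org")
assert not is_ldh_hostname("fileserver_one.example.com")  # underscore rejected
assert not is_ldh_hostname("-bad.example.com")            # leading hyphen rejected
```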
  • ICANN is trying to give a technical answer to a political problem; even if the technical reason is valid, it's not a good idea. With the UN involved, it will be handled by international committees, and we will all be long dead before they finally agree on which country will be on that committee.
  • Late in coming? (Score:3, Insightful)

    by grasshoppa ( 657393 ) on Tuesday November 21, 2006 @12:11PM (#16931924) Homepage
    Perhaps, but I can't fault ICANN for this one, as much as I might like to. Like it or not, most internet technologies have their roots in countries using the Latin alphabet, which means systems developed there may not be tweaked to work with outside language schemes.

    If the fault lies with anyone, it's with the individual contributors of the tech. Or better, with the non-Latin countries' apparent lack of interest in some of the core projects needed to push this through ICANN (specifically DNS, httpd).
    • by radja ( 58949 )
      Although you're right, many of the Latin-alphabet countries have at least some 'unique' letters, mostly ligatures, like the German Ringel-S (ß, a long s/short s ligature) and the Dutch 'long ij' (an i/j ligature, different from y). Many of these letters don't appear in the alphabet itself.
    • by 1u3hr ( 530656 )
      Like it or not, most internet technologies have their roots in latin speaking countries

      Yes, the Vatican State, back in the MCMLX's.

  • by Bonker ( 243350 ) on Tuesday November 21, 2006 @12:13PM (#16931966)
    - Don't be too surprised when people around you start building their own houses rather than choosing to pay rent.

    DNS upheaval has been a long time coming, and the current anti-American sentiment worldwide isn't exactly helping to stabilize it. We're already seeing all sorts of ad-hoc routing setups that deal with the shortcomings of an America-centric DNS. My guess is that within the next few years, ICANN's 'control' of the internet will be in name only, as everyone else in the world will have moved on to alternative routing and domain systems.
    • Cool... sounds like a good solution to me. If someone can develop a better system that works for people who WANT to use it great.
    • I think you are confusing anti-American-GOVERNMENT sentiment with anti-American-PEOPLE sentiment. Oh, and don't forget, we built the Internet. We were there first. We laid the groundwork and did the first R&D. It was only later that other countries started to get involved. And at any point in those phases, they could have suggested these changes. Instead, they wait until the house is built and everyone else is hanging their family pictures to complain about the choice of land and demand everyone s
      • Re: (Score:3, Insightful)

        by benoitg ( 302050 )
        Please, there have been complaints about DNS not supporting most languages' (even Latin-based) character sets since the birth of the web, so it's completely untrue that we waited till everything was built. After well over a decade of patient waiting, it seems that actual pressure was required to get this change through.
    • Re: (Score:3, Insightful)

      I think that might be jumping the gun. American or not, the internet plays a huge role in the functionality of the modern world. Just imagine the chaos if international office networks went from "I can't open this word document you sent me because it's in a different format" to "I can't get email from you because you're on a different internet". American DNS control or not, decentralizing the internet like you suggest might happen could be one of the worst things that could happen for global communications.
  • Stupid question (Score:4, Insightful)

    by VENONA ( 902751 ) on Tuesday November 21, 2006 @12:13PM (#16931974)
    "Given that some societies have used non-Latin characters for thousands of years, is this a bit late in coming?"

    No.

    Zonk either knows zero about the histories of the Internet or DNS, or is so enamored of finishing stories with questions that he'll tack on the truly ridiculous.
    • Re: (Score:2, Offtopic)

      by Ingolfke ( 515826 )
      Read the news. Is organized religion currently a net win, or a dead loss?

      I like your sig... it's just not accurate. You've focused too much on a particular component of the larger problem and have failed to recognize the actual whole of the issue. Here's a correct understanding of the problem.

      Read the news and some history. Is organized humanity currently a net win, or a dead loss?
  • by Agelmar ( 205181 ) * on Tuesday November 21, 2006 @12:16PM (#16932042)
    For all you people saying "There's no problem, just do it": I say watch out... there will be a rush of attacks and spoofs as soon as this is opened up. The letter "a" appears in the Unicode character set multiple times, and some of the variants are almost indistinguishable. I'm not just talking about someone registering släshdot.org, I'm talking about someone registering slashdot.org (the a is U+FF41 instead of the normal a). Good luck telling the attacks apart from the real sites.
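The confusable mentioned here is easy to reproduce: U+FF41 is FULLWIDTH LATIN SMALL LETTER A, and NFKC compatibility normalization folds it back to ASCII (which is one reason IDN processing mandates a normalization step). A quick Python demonstration:

```python
import unicodedata

spoofed = "sl\uff41shdot.org"   # the 'a' here is U+FF41, not U+0061
real = "slashdot.org"

assert spoofed != real
assert unicodedata.name("\uff41") == "FULLWIDTH LATIN SMALL LETTER A"

# NFKC folds fullwidth compatibility characters to their ASCII forms.
assert unicodedata.normalize("NFKC", spoofed) == real
```

Note that NFKC only catches compatibility variants like the fullwidth letters; a cross-script confusable such as Cyrillic 'а' (U+0430) survives normalization, which is why homograph attacks remain a concern.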
    • by gsasha ( 550394 ) on Tuesday November 21, 2006 @12:24PM (#16932280) Homepage
      It's called a "Homograph Attack". See http://en.wikipedia.org/wiki/IDN_homograph_attack [wikipedia.org]
    • I'm talking about someone registering slashdot.org (the a is U+FF41 instead of the normal a).
      Yikes! You almost tricked me into thinking it's U+0430!
    • Re: (Score:2, Insightful)

      As a human you might be fooled, but a well designed browser could tell the difference and alert you. So this shouldn't be a problem.
  • Sure, go 'head (Score:4, Insightful)

    by kahei ( 466208 ) on Tuesday November 21, 2006 @12:16PM (#16932044) Homepage

    I'd be in favor of the change just because anything that undermines the Unix Tower of Babel -- the dependency on ASCII which complicates text handling sooooo much even when Windows solved the problem soooo long ago -- is good. Even Java gets it. Even Apple (finally) gets it. Unix Is Teh Problem.

    And the ASCII problem isn't just bad because it forces people to use inefficient encodings like UTF-8 (THREE bytes per character?) It's bad because it allows people to write code like:

    if(string[index] == '.' || string[index] == '?' || string[index] == '!') sentenceEnd = true;

    (a line repeated, with subtle variations, several hundred times in the code of a certain ubiquitous editor).

    And, lo and behold, the above does not work, but once it appears in a few thousand places it's impossible to fix, and a vast towering structure of fixes made by people who don't really understand why it's an issue is built.

    So, even though the proposed change would be hugely inconvenient for a huge number of people, I'm in favor, because I want the world to grow the fork up and understand that text != byte array some time while I'm still alive.

     
    • by reed ( 19777 )
      ... the dependency on ASCII which complicates text handling sooooo much even when Windows solved the problem soooo long ago ... inefficient encodings like UTF-8 (THREE bytes per character?)


      What the hell are you talking about??

    • by Srin Tuar ( 147269 ) <zeroday26@yahoo.com> on Tuesday November 21, 2006 @01:02PM (#16933348)
      much even when Windows solved the problem soooo long ago

      i18n on Windows is far from "solved".
      I do admit that MS had a huge benefit when they started pushing Unicode.
      (It takes a company with Microsoft's level of clout to push around national governments.)


      And the ASCII problem isn't just bad because it forces people to use inefficient encodings like UTF-8 (THREE bytes per character?)


      Perhaps you don't realize that UTF-8 is on its way to becoming the dominant character encoding,
      and that legacy cruft such as UTF-16 (designed to deal with design flaws in Windows) is being phased out.

      Even languages that would end up as mostly three-byte characters tend to benefit from the savings on single-byte
      characters for control and formatting markup.

      I'm not going to harp on about it, but a few basic web searches could enlighten you here.

      if(string[index] == '.' || string[index] == '?' || string[index] == '!') sentenceEnd = true;

      Code like that *works* in UTF-8, which is one of the things that makes it beautiful (among many others).

      It allows you to deal with world character sets when it matters, and allows you to ignore them when it does not.
      (For example, a lexical analyzer that specifies its tokens does not want to support punctuation from every language ever conceived.)

      And if you think code like that doesn't exist in the Windows world, you are sadly quite naive.
      In my experience internationalizing applications, it's typically far easier to update Unix applications, which
      on occasion need nearly no changes at all, compared to the laborious grind and near-total rewrite often needed
      for MS Windows applications.
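The claim that byte-level scans for ASCII punctuation still work is a deliberate UTF-8 design property: every byte of a multi-byte sequence has the high bit set, so a literal ASCII byte like '.' can never appear inside an encoded non-ASCII character. A small sketch of that guarantee:

```python
text = "日本語です. Next sentence."
data = text.encode("utf-8")

# All bytes of multi-byte UTF-8 sequences are >= 0x80, so scanning the raw
# bytes for '.' (0x2E) cannot produce a false hit inside a CJK character.
assert all(b >= 0x80 for b in "日本語です".encode("utf-8"))

dot_positions = [i for i, b in enumerate(data) if b == 0x2E]
assert len(dot_positions) == 2  # exactly the two real full stops
```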
  • by Anonymous Coward

    Unicode has many characters that look almost exactly like characters in Latin-1.

    For example, if "www.microsoft.com" is shown in your browser's address bar, how would you know for sure that the "c" is not from the Cyrillic alphabet, or the "o" is not from the Greek alphabet?

    You simply won't be able to trust your browser's address bar anymore. The possibilities for phishing attacks are endless.
    • Why not have the browser fail to render them outside of the user's preferred alphabet?

      Cyrillic users would see www.**c******.com, latin users would see www.mi*rosoft.com?

      Or better yet, put up a big warning that it's using mixed alphabets?
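A crude version of the mixed-alphabet warning proposed above can be sketched with character names (real browsers use Unicode script properties; this name-prefix heuristic is only an illustration):

```python
import unicodedata

def scripts(label: str) -> set:
    """Very rough script tagging via the Unicode character-name prefix."""
    found = set()
    for ch in label:
        name = unicodedata.name(ch, "")
        if name.startswith("CYRILLIC"):
            found.add("Cyrillic")
        elif name.startswith("GREEK"):
            found.add("Greek")
        elif ch.isascii() and ch.isalpha():
            found.add("Latin")
    return found

assert scripts("microsoft") == {"Latin"}
assert len(scripts("mi\u0441rosoft")) > 1   # Cyrillic 'с' mixed into Latin
```

A browser could flag any label whose set contains more than one script, which is roughly the policy several registries later adopted.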
      • Re: (Score:3, Insightful)

        by reed ( 19777 )

        Or better yet, put up a big warning that it's using mixed alphabets?

        In general, browsers ought to make users more aware of the parts of their current URL, and maybe also of link destinations (also mail client).

        For example, separate the URL into its parts (scheme, host, path). Display some of the WHOIS info below the hostname, and some info from the SSL certificate if it has one.

        This would help people spot phishing scams or other suspicious activity.

        Reed

      • Re: (Score:3, Insightful)

        by Srin Tuar ( 147269 )
        That's a good start.

        Registrars shouldn't accept such names in the first place, though: is there a valid reason to ever have a domain name with stray characters mixed in from different languages?

        If a standard were to specify that a domain name must use a subset of Unicode that is self-consistent, and that browsers should turn the address bar red to warn whenever a domain uses characters outside the user's selected language subsets, that would go a long way towards minimizing the phishing problem.

        There would still
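A minimal sketch of that red-flag check, using the first word of each character's Unicode name as a rough stand-in for its script (a real implementation would use the Unicode Script property; the function names here are made up):

```python
import unicodedata

def scripts_used(label):
    """Collect the rough script of each character in a domain label."""
    scripts = set()
    for ch in label:
        if ch.isdigit() or ch == "-":
            continue  # digits and hyphen are script-neutral in DNS
        # First word of the Unicode name: LATIN, CYRILLIC, GREEK, CJK...
        scripts.add(unicodedata.name(ch).split()[0])
    return scripts

def looks_mixed(label):
    """True when a single label mixes more than one script."""
    return len(scripts_used(label)) > 1

print(looks_mixed("microsoft"))   # False: all Latin
print(looks_mixed("mіcrosoft"))   # True: the 'і' is Cyrillic
```

A browser could run a check like this per label and color the address bar accordingly; legitimate single-script names like café.com pass untouched.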
        • Re: (Score:3, Informative)

          by Bogtha ( 906264 )

          Is there a valid reason to ever have a domain name with stray characters mixed in from different languages?

          You're assuming that characters belong exclusively to one language. Try telling a French guy that he can't register café.com because 'c' 'a' and 'f' are English, not French.

  • Whatever happened to Punycode (Unicode in a special dns-characters-only encoding format)? There was some hoopla about the scheme, which would require browsers to show Punycode-encoded URLs in the appropriate characters on the screen, but some naysayers said that it was a phisher's dream since many glyphs throughout Unicode looked alike. I figure this issue has nothing to do with Unicode per se, but with phishing vs certified sites in general, but I haven't heard a peep from the Punycode camp for over a y
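For the record, Punycode (RFC 3492) did ship: it is the ASCII-compatible encoding underneath IDNA, and Python's built-in idna codec applies it (the domain below is only an illustration):

```python
# IDNA encodes each non-ASCII label with Punycode and prepends "xn--",
# so the result uses only today's legal DNS characters.
wire = "münchen.example".encode("idna")
print(wire)  # b'xn--mnchen-3ya.example'
```

The "xn--" form is what actually travels over the wire; showing the decoded form to the user is exactly where the look-alike glyph problem comes in.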
  • Prince no longer goes by that strange symbol as his name anymore.
  • URL goldmine. (Score:4, Insightful)

    by emmagsachs ( 1024119 ) on Tuesday November 21, 2006 @12:19PM (#16932132)
    Imagine the land rush that'll ensue if DNS allows non-Latin characters. Trademark transliteration? A haven for domain squatters and an upcoming surge of legal fees for trademark lawyers, if you ask me.

    Nice for localising, sure, but how usable will Japanese, Indian, or Arabic script URLs -- for example -- be for those who do not have access to the respective character sets or keyboard layouts?

  • by tverbeek ( 457094 ) * on Tuesday November 21, 2006 @12:19PM (#16932144) Homepage
    Of course it's late in coming.

    But that doesn't mean it should be done hastily and badly.
  • "One source of the pressure was Adama..."

    And he will not rest until the script of each of the 12 Colonies is properly represented with ICANN. I hear he's not too keen on Cyrillic, however.
  • by pubjames ( 468013 ) on Tuesday November 21, 2006 @12:23PM (#16932244)
    Given that some societies have used non-Latin characters for thousands of years, is this a bit late in coming?

    Let's be clear. The domain name system only uses English characters. There are lots of languages in Europe (Italian, Spanish, French...) which are closer to Latin than English (which isn't really a Latin language at all) and which are not currently represented, because you can't use accents in domain names, or other letters such as the Spanish eñe (n with a tilde, actually a distinct letter). English speakers often think accents aren't important, but they can completely change a word's meaning.
    • Re: (Score:3, Insightful)

      by brusk ( 135896 )
      True, but the English subset of the alphabet has another feature that matters in this regard: it's a lowest common denominator that all computers on the planet are capable of producing. I can type any letter easily on a computer in China, Israel, Jordan, Russia, Spain, India, etc. I can't necessarily input a given Chinese character, Arabic letter, or Cyrillic letter.

      Why does this matter? Well, one argument is that it doesn't, much: if I want to view a Chinese website I'm probably in China and can input Chin
    • English speakers often think accents aren't important but they can completely change a word's meaning.

      Yes, I am an English speaker, and throughout the day I often stumble across the recurring idea that accents are of no particular use to determining the meaning of a word. I would go so far as to say that I often think accents just aren't important. I'm glad Slashdot has you around to set things straight.

  • Not a trivial job (Score:4, Insightful)

    by turnipsatemybaby ( 648996 ) on Tuesday November 21, 2006 @12:24PM (#16932300)
    The internet was originally conceived, designed, and implemented in the USA at a time where hardware was at a premium, and corners were cut to conserve that limited resource. DNS was just one of the results of that era. However, it is the most visible because it is the front end means for people to find each other. That means there is now a very well established standard, used by people across the entire globe, that is very difficult to change.

    Changing all the DNS servers in the world to switch from ASCII to Unicode is NOT trivial. The fact that some societies have used non-latin characters for thousands of years is completely and utterly irrelevant. THEY didn't make the internet. They simply bolted themselves on to an existing infrastructure.

    I agree that progress needs to be made to accommodate non-Latin characters, but to have people whining about "how they want it, and want it now"... That's just ridiculous. It's like waltzing into a house that was built 40 years ago and having a tantrum because the stairs are too steep and the house is too squished. Major structural renovations take time, effort, and careful planning. And there is nothing you can do to avoid that, short of implementing cheap stop-gap measures that are virtually guaranteed to cause even bigger unintended headaches later on.
  • by sexyrexy ( 793497 ) on Tuesday November 21, 2006 @12:27PM (#16932344)
    Given that some societies have used non-Latin characters for thousands of years, is this a bit late in coming?

    Those societies did not build an entire economic and social infrastructure using all 50,000 of those characters in a few decades, though.
  • How 'bout we all just speak English and forget about all those weird letters.

    (It was a joke... well sort of)
  • Huh? (Score:5, Funny)

    by writermike ( 57327 ) on Tuesday November 21, 2006 @12:29PM (#16932426)
    ICANN Under Pressure Over Non-Latin Characters

    You mean white people?
  • by tempest69 ( 572798 ) on Tuesday November 21, 2006 @12:34PM (#16932544) Journal
    Set up a private Latin name prefix for the non-Latin names, i.e. NONLATINPREFIX, and then a UUEncode of the non-Latin name. I.e. (Arabic word for horse in Arabic script) = AER5ER8EDG, so you would have NONLATINPREFIX-AER5ER8EDG.com as a domain name, which would resolve correctly if someone typed in (Arabic word for horse in Arabic script).

    1. This allows for a simple web extension to serve non-Latin countries.

    2. Doesn't require any change to the DNS system (other than some name policy changes).

    3. Allows links to be embedded in normal web pages so that they can be cut and pasted by anyone with Latin functionality. So a Japanese person could cut and paste the link to some Arabic site that they don't have the font for.

    4. While this is a kludge, it has some major advantages over rebuilding the DNS system.

    Storm
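Interestingly, this is almost exactly what IDNA already standardizes: a fixed ASCII prefix ("xn--") plus a Punycode encoding of the non-Latin label, with no changes to DNS itself. A quick sketch (the Cyrillic name is hypothetical):

```python
# The NONLATINPREFIX scheme described above, as standardized by IDNA:
name = "пример.example"             # Cyrillic label; hypothetical domain
wire = name.encode("idna")          # ASCII-only form plain DNS can carry
assert wire.startswith(b"xn--")     # the standardized prefix
assert wire.decode("idna") == name  # lossless round trip for display
```

Because the wire form is plain ASCII, it can be cut and pasted by anyone, exactly as point 3 hopes.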

    • Oh, there is a funky problem... letters that look identical in different languages could allow for spoofing. So if the link had a dual-language character set it would need to multicolor the thing so that it would look pretty odd, so that microsoft.com and mîcrosoft.com would be clearly distinguishable (reverse the î character so that it's white on black). But I still think it's workable.

      Storm

  • DNS won't break (Score:3, Informative)

    by zdzichu ( 100333 ) on Tuesday November 21, 2006 @12:34PM (#16932552) Homepage Journal
    DNS won't break. In fact, it already works! The thing is called IDN [wikipedia.org] and is supported by all modern web browsers (including IE). Try for yourself - http://www.kozowski.pl [www.koz] (I hope Slashcode won't cannibalize letter "").

    So DNS and the Web are OK. Any breakage I can think of may appear in email systems or other domain-based forms of communication.
  • ... parallel multi-nets. I guess servers will have multiple domain names for same IP address, one for each culture they wish to address.

    No matter what, english-language net will continue to be *the* Internet, a global Forum, direct connection between common people from all parts of the world ( Hey there! :) ).

    All the other nets will have quite a marginal significance. Nations will try to boost them in order to keep their citizens indoctrinated with own traditional values, but things that do not fly by thems
  • I thought this was what Unicode was for. The only 3 sacred characters that I wouldn't want messed with are the ":", "/", and ".". How come we don't have a Unicode DNS solution so countries could use the entire Unicode address pool for domain names? I've read postings basically bashing the non-English world for not being involved with the original tech so being left out. So that's a valid reason to discriminate now? What used to get me excited about Slashdot was the unique solutions that you could find in the c
  • .cn (Score:3, Interesting)

    by hey ( 83763 ) on Tuesday November 21, 2006 @12:45PM (#16932862) Journal
    Does ICANN control .cn (China)? Or other national TLDs? Why don't they just start registering
    domains in their local language? Leave .com, .org, .mil (i.e. the USA TLDs) English.
  • by bugnuts ( 94678 ) on Tuesday November 21, 2006 @12:49PM (#16932970) Journal
    Tht ìs thê £äst thïñg wë ñèêd

    Dibs on ©óm
  • by rs232 ( 849320 ) on Tuesday November 21, 2006 @12:49PM (#16932978)
    What's this going to do for security? Didn't we have phishing attacks recently that consisted of Unicode characters being inserted into e+bay.com, for instance, that didn't get displayed, the domain e+bay.com being different from ebay.com?

    "A domain name is a unique address that allows people to access a website, for example, smh.com.au"

    No, a domain name is a sequence of characters mapped to an IP address. It was designed so that you won't have to remember 66.35.250.150 instead of slashdot.org. This wasn't a problem while the original Internet consisted of just four computers. DNS was never designed to provide identity. There was also the case of a stock trader hacking a DNS server and redirecting traffic from a legitimate financial site to his own, where he had duplicated the real site only with bogus information.

    "He said that this could create problems where, for example, a character in Urdu looks identical to one in Arabic"

    It sure could. How about totally replacing DNS with a system of online identities.
  • Horrible indeed (Score:3, Interesting)

    by unity100 ( 970058 ) on Tuesday November 21, 2006 @12:55PM (#16933158) Homepage Journal
    I'm in a country that sits between Europe and the Middle East; we have a few non-Latin characters in the alphabet, and it still creates problems with domain names.

    No wonder the Middle East (Arabic-speaking) countries especially want this: the majority of inexperienced internet users there will be more likely to use these domain names easily, hence the sites using those domains will be a greater incentive for controlling what they see, because these domains will be under their control nationally.

    Not only this, but we as IT people will be very unwilling to change all our software to adapt to the new situation because of the horrible development/testing/implementation involved, and hence won't be accepting these domains as valid in our network traffic, which will create a second internet which is, as described above, less free.

    This should not be allowed.
  • Bad for phishing (Score:3, Interesting)

    by AaronW ( 33736 ) on Tuesday November 21, 2006 @01:14PM (#16933666) Homepage
    Adding Unicode to DNS names would make phishing much more difficult to detect unless all the browsers, email clients and other tools are modified to indicate that a URL may not be what the user thinks it is. It is bad enough as it is, and remember, most Internet users are not as savvy as those of us on Slashdot. I foresee a lot of security implications in adding this.
