ICANN Plans Non-English Character Domain Testbed 138
Wanted writes: "This article reveals ICANN's plan to open registration of domain names
with national characters. Actually it's Network Solutions who are responsible for the technical side of implementing that project. Initially they want to support CJK (Chinese, Japanese, Korean), then Spanish and other European languages. I don't know why they singled out the Spaniards; I'd rather talk about supporting ISO-8859-1 than particular languages. Nevertheless, the Internationalized Domain Names IETF Working Group should be pretty happy about it. I wonder how you would type www.wong-kar-wai.org in Chinese with a classic keyboard :)"
Re:Questions (Score:1)
Well, for example, assume that there's a new brand called El Perro Rabioso selling, say, spiced cucumbers. They have a web page, www.el-perro-rabioso.com, which they are advertising all over the place. I don't understand Spanish and have simply memorized the link, so I'll just go there and see if they have an English section. That works fine since it's Spanish: I understand the alphabet.
Now substitute Korean for Spanish and maybe you'll see the problem.
Besides, not all information needs to be written. Some Chinese site by Mr. Wu might just contain nice images of sunsets in the Chinese countryside or something. But if I'm unable to type the URL of the web page, I cannot view the information, even though I would understand the images (though not necessarily the image captions).
Your argument about software is true and interesting. Maybe we'll soon have a Babelfish-kind function in our web browser, paint-n-translate! Now that would be cool.
Ýou people just don't get it (Score:2)
Think. (Score:2)
[ maur_at_technologist.com ] "For a sufficiently powerful message,
[ http://maur.litestep.com ] the medium is irrelevant."
Re:EXCELLENT POINT... (Score:1)
Re:Bad idea, NOT (Score:2)
First, 8-bit computers are still in use now, and their bus width does not prevent them from dealing with data of any format.
Second, DNS is already in use, and NO ONE BUT "UNICODERS" EVER COMPLAINED about the use of ASCII in it. There is no demand for this feature, only some people's desire to break all existing software to sell "updates".
A domain name is an address. An address should be reachable from everywhere and everything. This works even for postal addresses -- I can write them in English, and they will reach the intended destination in any country, be it the US, Spain, Russia or Japan. The same functionality is available now with DNS, but if this proposal is implemented, it won't be available from every computer unless everyone switches to Unicode -- and that won't happen until Hell freezes over (being Russian, I have every reason to be sure about that).
If someone is too concerned about "good-looking" addresses, they should implement some name-translation service like AOL keywords for people who don't like DNS, but the basic architecture of the Internet should not lose interoperability just because someone wants to add one useless feature to his shitty software.
Re:Bad idea, NOT (Score:2)
Not completely true. The domain name is an alias. The dotted quad is the address.
Tell me why I can't put a name server out there that supports more characters? Yes, compatibility will be a problem, but not an impossible one. The DNS request must inform the server which charset it is using. Default to UTF-8, of course.
So we have the following alternatives. (Let's accept "unicode" for "standardized extended charset" in the following OK?)
1) I run a web server with an ordinary ASCII domain name. No problem: ASCII is a subset of UTF-8, so a Unicode DNS will be able to handle it.
2) I run a web server with a Unicode domain name. I must register my domain with a Unicode DNS, and I'd be wise to also register a plain ASCII name as an alias if I think my domain name will cause trouble. (A rough sketch of the idea follows.)
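Purely as an illustration of the two cases above (my own sketch, not anything NSI or the IETF has specified), the charset-tagged request could be as small as a wrapper that normalizes the name before the server ever sees it; the field names here are made up:

    # Hypothetical sketch of the charset-tagged query proposed above; nothing
    # here is real DNS protocol, and the field names are invented.
    def make_query(name_bytes: bytes, charset: str = "utf-8") -> dict:
        """Bundle the raw name with the charset the client used, so the
        server can convert everything to one internal form (UTF-8 here)."""
        unified = name_bytes.decode(charset).encode("utf-8")
        return {"qname": unified, "client-charset": charset}

    print(make_query(b"example.com"))                                  # ASCII passes through untouched
    print(make_query("räksmörgåsar.se".encode("latin-1"), "latin-1"))  # normalized to UTF-8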
This works even for postal addresses -- I can write them in English, and they will reach the intended destination in any country, be it US, Spain, Russia or Japan
Well, but can you write an address in Russian and expect the letter to be delivered in the US? Or delivered in Russia, if it was mailed in the US?
If someone is too concerned about "good-looking" addresses, they should implement some name-translation service like AOL keywords for people who don't like DNS, but the basic architecture of the Internet should not lose interoperability just because someone wants to add one useless feature to his shitty software.
Someone sure is "too concerned", and I'd rather have ICANN setting a standard than wait until there is an AOL/MS proprietary name space.
I do see your point. Clueless fiddling with DNS is not a good thing. *My* point is that that is exactly what will happen unless it is done the proper way.
Chinese input is rather easy (Score:1)
Re:ugg (Score:1)
The purpose of a domain is global accessibility, not accessibility only for users with keyboard type X and input method Y.
Re:A split on the internet? (Score:1)
the target audience for a given domain can always have the input method to support accessing hosts in that domain
What if the target audience is on vacation in some other foreign country? Will they have to go to a cyber-ethnic-café to browse their local paper?
The DNS system is the same for all domains, so if one particular TLD can use a certain encoding, what's to stop the rest of the TLDs from using it? Nameservers don't know the difference; they just hold the data.
The fact that you can't make sense of the names is irrelevant. As long as someone stands to benefit (domain registrars, for example), it will eventually happen.
The problem is not that we have language-specific characters, it's that you don't. We don't consider them special, except for the fact that we can't use them normally (without kludges like i18n features) on a computer.
i18n should be about content, not about presentation.
Re:Why Spanish before other European languages. (Score:1)
Ummm... I thought there were a lot of people of European descent in the US. Wouldn't, for example, Irish Americans prefer to type fáilté.com?
Re:How about *dis*-allowing characters? (Score:1)
Why Spanish before other European languages. (Score:4)
CJK has a similar "numbers" vibe. Since the CJK character sets are generally handled by a single solution in software (especially since the written forms of Japanese and Korean include both native syllabic/alphabetic [respectively] scripts and Chinese ideographic script), you get Japan, Korea, and Greater China in one fell swoop. (Greater China here including not only the PRC and Taiwan, but also the Chinese-speaking groups in Malaysia, Singapore, and Indonesia.)
So why not Devanagari too? Because 1) there are a lot more CJK and Spanish language customers than Hindi/Bengali customers due to internet penetration and financial factors, and 2) the people who would buy the domains in India generally are of the educated classes that speak English. So there's less demand for Devanagari.
Steven E. Ehrbar
We won't be able to access half the domains then... (Score:1)
The future with this domain system looks pretty stupid to me.
The American Embassy, for instance, would need 24 sets of keyboards per computer to access the sites of the other countries.
Re:Unicode Limitations / BIND (Score:1)
example... (Score:2)
Interesting: BIND 8 (my nameserver) works with it, but when I enter it in nslookup it pukes (i.e. I can use a web browser [IE 5.5], but I can't type it into nslookup).
--
Re:Questions (Score:2)
If you sell a product internationally and use a lot of strange letters in the URL, you are just stupid; of course you should use a "classic" URL for that. But if you have a company that delivers shrimp sandwiches within a Swedish town, you should be able to use the domain räksmörgåsar.se instead of raksmorgasar.se.
Re:Ýou people just don't get it (Score:1)
Re:Conflict with existing names (Score:1)
No, not everyone. I don't agree that a company has that right, and I bet I'm not alone. As far as I'm concerned, in the chaos of the net you get to sink or swim on your own merits and companies don't get any special perks. The real irony here is that some of these so-called pornographers actually supply more content than the domains they ape, but you suggest we should take measures to prevent that.
(UNFAIR Term applied to advantages enjoyed by other people which we tried to cheat them out of and didn't manage. See also DISHONESTY, SNEAKY, UNDERHAND, and JUST LUCKY I GUESS.
--The Hipcrime Vocab by Chad C. Mulligan)
Re:Conflict with existing names (Score:1)
Possible solutions (Score:1)
It seems a cool idea. I think that many problems surrounding URL i18n can be solved.
great! (Score:3)
--
Re:Um, why would they have to register those? (Score:1)
Re:Not good (Score:1)
(3 * 7) * (1 * 2) * (2 * 5) * (1 * 5) * (1 * 2) == 4200.
How do you decide which of those forty-two hundred possible accented domains Docrates gets first refusal rights on?
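The multiplication is easy to play with; a throwaway sketch (the per-letter variant counts and the letters below are invented, only mirroring the arithmetic above):

    from itertools import product
    from math import prod

    # Per-letter accent-variant counts matching the arithmetic above:
    # (3*7)*(1*2)*(2*5)*(1*5)*(1*2) possible spellings of a ten-letter name.
    counts = [3, 7, 1, 2, 2, 5, 1, 5, 1, 2]
    print(prod(counts))                      # 4200

    # Same idea with concrete (made-up) variant sets for a short name:
    variants = ["eèé", "uúü", "nñ"]          # 3 * 3 * 2 = 18 spellings
    spellings = ["".join(s) for s in product(*variants)]
    print(len(spellings), spellings[:4])     # 18 ['eun', 'euñ', 'eún', 'eúñ']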
Re:Conflict with existing names (Score:2)
Conflicts with existing names are mostly dangerous when the user might type the name and make a typo. Evidently you would not type such a thing as "amazоn.com" (here the "o" before the "n" is replaced by the Cyrillic form of the same letter, which is supposedly indistinguishable from it). When you are following links, well, you are following links, and you are therefore trusting the site with the links to some extent. After all, non-power users rarely read the URL written at the bottom of the page, in any case: if someone writes a site which looks very much like a well-known site and links to it, whatever the URL, many users will be fooled. I don't think "internationalized URLs" will be a major change in this respect.
Slashdot's handling of accented characters in nicknames was completely grotesque, in any case. It was done naïvely by taking the 8-bit data as submitted and using it in the URL. But this is not how it works: the data should have been encoded in UTF-8 beforehand.
--
Here you should see an upper-case E with an acute accent: É. Here you should see an upper-case Y with two dots on it: Ÿ. Here you should see a capital Greek Gamma: Γ. Here you should see a Hebrew aleph and a Hebrew beth: אב; of course, the aleph should be on the right because it is first (unless there was a line split between the two). Here you should see the Devanagari "OM" sign: ॐ. Here you should see a smiling face: ☺. Here you should see the Chinese (or Japanese) character for "sun": 日. None of this should depend on your selected "document encoding". If you did not see all that, then your browser is broken and you should change it.
Questions (Score:1)
For example, I can't tell a Chinese symbol for dog from that of a cat, so how am I supposed to write the address from memory, supposing I want to visit the English section of Wang-Chan's Noodle Soup Test Drive or something?
And vice versa, how will some Chinese person write, for example, umlauted characters (å, ö, ä, ë and so on)?
Re:Unicode Limitations / BIND (Score:2)
That's interesting -- that had entirely escaped me. (This from the kid who spent years studying Egyptian and Mayan ideograms. *smack*) And I thought 63 characters was incredibly long in English. You could have several haikus in Japanese ideograms as a domain name!
I like ICANN but
NSI really pisses
me off a lot
-Waldo
-------------------
A split on the internet? (Score:2)
At the risk of sounding anglo-centric, isn't this a big blow against interoperability?
Re:Ýou people just don't get it (Score:1)
Again, your assumption is that people who do not agree with you do not understand those issues and therefore must be American. Furthermore, the issue is not whether you should be able to have a domain name with a Jönsson component, but whether you should be able to stick that in .com, .org, or .net. I do not see the problem with it for ISO 8859-1 characters, but I do understand the argument against it. Furthermore, I don't think .com, .org, or .net should be extended beyond the extended latin characters for reasons I have already enumerated.
And I really do not like your dismissal of any American point of view on the matter.
resume.com (Score:2)
{resume, résume, resumé, résumé}.{com, net, org, new TLDs}
Costly!
Re:Why Spanish before other European languages. (Score:2)
My impression is (I have no concrete figures) that the majority of people in South America (excluding Brazil, they don't speak Spanish) DO have access to the Net. Because of tourism, there are Internet stores all over the place. And that's not just in the big cities either. Even in the smaller cities near the jungle, there are stores providing access. Whether everyone can afford it is another issue (it costs about the equivalent of 1 US dollar an hour), but a lot of people have access...
Just so ya know...
Re:Questions (Score:1)
Re:Ýou people just don't get it (Score:1)
If they can do this, then why not? (Score:2)
Because:
1) I can't read a Japanese-language site whether or not I can get to it.
2) If I could read it, I'd use software that let me input it.
3) A rational web designer will register a non-accented Roman/ASCII character name if they intend to reach an audience that may include people who can't input other characters. The irrational deserve to have their sites fail anyway.
Steven E. Ehrbar
Re:Conflict with existing names (Score:1)
Romaji is the Roman-alphabet transliteration of hiragana and katakana. You've confused it with a mixture of katakana, the phonetic script used for imported words, and kanji, the ideographic script.
domain names should be like soundex (Score:3)
This way, you can write your domain name however you want, and there isn't so much of a potential for people registering something similar.
Hyphens have got to be the dumbest idea of all time. If you have a multi-word name, you almost have to register both with and without the hyphen or you will lose visitors.
Even better would be using something like soundex, which makes a "hash" of a name so that similar sounding words map to the same value. Memorizing exact spelling is not something people are used to doing.
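For reference, the classic American Soundex rule fits in a dozen lines; this is only a rough sketch, and real implementations differ on the corner cases:

    def soundex(word: str) -> str:
        """Rough sketch of American Soundex: keep the first letter, then
        encode the rest as digits, collapsing repeats and dropping vowels."""
        codes = {"bfpv": "1", "cgjkqsxz": "2", "dt": "3",
                 "l": "4", "mn": "5", "r": "6"}
        def digit(ch):
            for letters, d in codes.items():
                if ch in letters:
                    return d
            return ""                      # vowels, h, w, y get no digit
        word = word.lower()
        result = word[0].upper()
        prev = digit(word[0])
        for ch in word[1:]:
            d = digit(ch)
            if d and d != prev:
                result += d
            if ch not in "hw":             # h/w do not break a run of equal codes
                prev = d
        return (result + "000")[:4]

    # Similar-sounding spellings hash to the same code:
    print(soundex("smith"), soundex("smyth"))   # S530 S530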
They shouldn't do CJK domain names, they should just define a standard translation, which can then be incorporated into client software and possibly into DNS systems. What's next, I'll be unable to get to a site unless I also choose the correct encoding? Let's see, was that "cool-shit.org in 8859-1, or coolshit.org in Japanese encoding, or maybe cool-shit.net in Eastern European encoding. Or was it coolshít.org?"
Re:example... (Score:1)
Once internet protocols stop catering to whatever archaic systems out there that are limited to 7-bit transport, the scheme would work.
Re:Ýou people just don't get it (Score:1)
"If
Yeah, cause we all know everyone in the whole fuckin' world can read English! (Well, those who can't aren't worth shit anyway.)
ISO-8859-1 is not an English-specific character set. There is nothing English-centric about it. And again, my comments are directed at .com, .org, and .net, which are American TLDs in spite of the trend of foreign companies using them.
I suggest you get a fucking clue.
Re:Unicode Limitations / BIND (Score:2)
You are mistaken.
Re:Why Spanish before other European languages. (Score:2)
That's only part of the story. The only non-ASCII characters in Spanish are entirely contained in the ISO-8859-1 (aka Latin-1) character set. Since most programs are already configured to work correctly with Latin-1, supporting that (in browsers and such, that is) should be rather easy.
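A quick sanity check of that claim (my own, not something from the comment): every accented letter Spanish needs has a single-byte Latin-1 code point, while a CJK character has none at all.

    # Spanish's non-ASCII letters all fit in ISO-8859-1 (one byte each),
    # which is why supporting Spanish is the easy case.
    print("áéíóúüñ¿¡".encode("latin-1"))   # b'\xe1\xe9\xed\xf3\xfa\xfc\xf1\xbf\xa1'

    try:
        "中".encode("latin-1")             # a Chinese character has no Latin-1 slot
    except UnicodeEncodeError as err:
        print("not representable in Latin-1:", err)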
Chinese is moderately complicated. Yes, it does have a huge number of characters, but on the other hand they are fixed-width, and the difficulty of rendering Chinese is rather small once you have the appropriate font (I'm talking of rendering, e.g. in the URL bar); in fact, Chinese is simpler to render than the Latin script.
Devanagari (just like every other Indic script), on the other hand, is hugely complicated. The crazy ligature system means that we are going to have to wait a looong time before we see software that correctly handles the Nagari script in any non-trivial situation.
For example, consider the Unicode test page [eleves.ens.fr] I keep referring to: you have a sample of Russian, a sample of classical (polytonic) Greek, a sample of Sanskrit (written in Devanagari script) and a sample of Chinese. Many browsers will handle the Russian and Chinese correctly, and the non-polytonic Greek characters; very few will handle the full Greek text correctly; none is known that correctly displays the Sanskrit text.
Re:A split on the internet? (Score:2)
Why not for .com, .org and .edu? What makes you think non-US netizens are not entitled to use such TLD's?
Because those TLDs are American TLDs, a historical quirk of the fact that the USA started the Internet. Now, if you want to co-opt them for world use, then they should use the lowest-common-denominator character set usable by all users and input methods: ISO-8859-1. If they are to retain their US origins, then it should be ASCII. There is absolutely no use in forcing domains aimed at an American audience to deal with the inefficiencies of two-, three-, and multi-byte character sets or of conversions among character sets.
The problem is not that we have language-specific characters, it's that you don't. We don't consider them special, except for the fact that we can't use them normally (without kludges like i18n features) on a computer.
You are making an ass of yourself. My machine supports entering characters in several alphabets, including many accented European languages, Russian, Arabic, and Thai. The issue is not about considering certain characters "special". It is about understanding your audience. I spend a lot of time on I18N issues and am a huge advocate of I18N concerns, but opening every domain to every character set on earth is not I18N: it is reckless multiculturalism.
Re:example... (Score:2)
--
Re:There is Need for a Non-DNS URL System (Score:1)
See my comment about just such a system [slashdot.org].
Another attempt to screw us over. (Score:1)
1. America built the foundations of the internet as we know it. The core, back when it was DARPANET, was our government's baby.
2. Most languages can transliterate to Latin characters. (It may be messy, but it works.)
3. Other nations that have sites on the Internet have decided to play by the rules we (Americans) laid out: only one person per IP at a time, Latin-character domain names.
4. I don't care what you'll tell me; the Net is way too new to be fragmented by this sort of gross nationalization.
"And they said onto the Lord.. How the hell did you do THAT?!"
Re:Questions (Score:1)
Give people some credit (Score:3)
History of the "Something must be done to control the outbreak" syndrome.
Early 1990s: OMG! People are making up their own web sites in large numbers. Thousands of people will see them and be unable to distinguish fact from fiction.
Mid 1990s: OMG! People are now making up their own news sites. Millions of people are reading them and can't tell the difference between real and fake news.
Late 1990s: OMG! People are posting stock market tips which are causing market fluctuations. People will be unable to tell the difference between real and fake stock market news!
Early 2000s: OMG! People are allowed to use accented chars. Millions of people will be diverted to fake sites which use similar accented chars in their domain name, and thus be unable to tell the difference between real and fake sites!
Here, take a chill pill. Welcome to the internet, my friend.
w/m
Re:example... (Score:2)
--
Bad idea (Score:4)
This is a bad idea -- domain names must be interoperable on all systems, with or without Unicode or any other charset support, and with or without a keyboard capable of entering certain characters. The ASCII subset allowed in DNS now is the only subset supported by absolutely all computers (even ones that natively use EBCDIC), and no matter how much the use of other charsets (and/or Unicode) expands, this is not going to change. I see it as an attempt to just promote "unicodefication" of existing standards for no good reason.
And if anyone cares, my native language has nothing to do with ASCII.
Re:Why Spanish before other European languages. (Score:2)
No one is questioning that there are a lot of Spanish speakers out there. I would doubt, however, that there are many native Spanish speakers with access to the Internet who don't understand the English alphabet well enough to use domain names. (Couldn't Spanish ISPs patch BIND to resolve domains typed with accents to their low-ASCII counterparts?)
The Internet right now is dominated by rich people in rich countries, which, for better or worse, seems to disproportionately exclude more of those 417 million people than speakers of other languages. I think ICANN is expecting those 417 million people to 'wake up' to the Internet and wonder why it's not in Spanish. Frankly, I don't see this happening soon unless Spanish-speaking people have an overwhelming need for porn and illegal MP3s.
--
what a poorly written piece of trash (Score:1)
And to answer the
Q: why are today's laptops so damn slow?
A: 5400rpm IDE drives.
http://koax.org
Re:Ýou people just don't get it (Score:1)
--
But. Isn't English really the language of the net? (Score:1)
The internet has silently standardized on English. I always thought standards were a good thing. The English alphabet is relatively easy to learn, and no one forces you to actually spell English words.
(As a disclaimer: I'm not a native English speaker, but I don't have a problem with English being the language of the net.)
I think this would be a huge step backwards. People will start memorizing IP addresses again instead of names.
Probably the first thing I'd do is write an application that automatically aliases every funky-charactered domain name I see to an English name.
So the problems are that DNS servers won't support it for a while (if ever), and the people with such domain names will get a rude awakening when they can't use most of the existing tools with their funky new domain names.
Re:Conflict with existing names (Score:2)
Slashdot has had to ban accented characters to prevent this kind of abuse; ICANN should do the same lest a similar outbreak of mimicry infect the entire Web.
People should realize that it's a world wide web. It's not only american, and it should not only be in english -- diversity is important. And if you want to support other languages, you have to accept accented characters; they are not only "decorative", they make a whole difference.
Sure people will abuse it. But we already have slahsdot.org and other similar sites. It's already being abused.
Você não acha? (Don't you think?)
--
Re:A split on the internet? (Score:2)
.com/net/org are g(lobal)TLDs.
From the US DOC White paper [icann.org] at ICANN:
Could everyone please stop making this mistake now? It's really starting to bug me.
Re:example... (Score:1)
While trying to retrieve the URL: http://domän.nu/
The following error was encountered:
Invalid URL
Some aspect of the requested URL is incorrect. Possible problems:
Missing or incorrect access protocol (should be `http://'' or similar)
Missing hostname
Illegal double-escape in the URL-Path
Illegal character in hostname; underscores are not allowed
Samba Information HQ
Re:Unicode Limitations / BIND (Score:2)
In the UTF-8 encoding (defined by RFC2279), it takes between one and six octets (bytes) to encode one character, although no currently assigned character needs more than three. UTF-8 can address all the 2147483648 characters of ISO-10646-1.
In the UTF-16 encoding (RFC2781), it takes either two or four octets (bytes) to encode one character, although no currently assigned character needs four. UTF-16 can access only the first 1114112 characters of ISO-10646-1 (the first 17 planes), which form the Unicode range proper.
Both these encodings use characters outside the ASCII range (i.e. 8-bit characters), which are not supported by current BIND versions, but which are still permitted by the DNS standards (RFC1034&1035).
However, the proposed IDNS [idns.org] standard does not use either of these encodings (IMHO not using UTF-8 is a terrible mistake) but yet another one, called UTF-5 (see "draft-jseng-utf5-00" in Internet Drafts).
In the UTF-5 encoding (defined by the aforementioned draft), it takes between one and eight octets (bytes) to encode one character, although no currently assigned character needs more than four. UTF-5 can address all the 2147483648 characters of ISO-10646-1.
If UTF-5 is used on DNS labels, you can have up to 15 Chinese characters in such a label.
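If you want to see those octet counts for yourself, here is a quick check (UTF-5 has no stock codec, so only UTF-8 and UTF-16 are shown):

    # Octets per character in UTF-8 and UTF-16 (BOM-less), matching the
    # ranges quoted above: ASCII is 1 UTF-8 octet, Latin accents are 2,
    # CJK is 3; all of these fit in 2 UTF-16 octets.
    for ch in "aéΩ中":
        print(ch, len(ch.encode("utf-8")), "octets in UTF-8,",
                  len(ch.encode("utf-16-be")), "in UTF-16")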
Re:Conflict with existing names (Score:1)
Re:ugg (Score:1)
8-bit DNS software (Score:1)
ICAN'T (Score:1)
Network Solutions? (Score:1)
Evidently, no one bothers to check the links. (Score:2)
--
Sounds great, but... (Score:1)
---
this is basically just NSI trying to make more $$$ (Score:1)
if NSI starts accepting registration of internationalized domain names before these details are worked out, they are inviting trouble.
this is just another example of why .COM, .NET and .ORG need to be taken away from NSI and put in the hands of a group that will act responsibly.
Spam woes... (Score:1)
Now I can get spam from domains with oddball characters that will be just about impossible to trace using standard tools.
Re:Questions (Score:1)
Chinese/Japanese characters? (Score:1)
Chinese and Japanese characters can't be typed with Nordic/American keyboards, so that's just silly. For example, look at the
Re:A split on the internet? (Score:1)
Nevertheless, I think all domain names should be Unicode or UCS, with the registry restricted to some subset of the Unicode spectrum appropriate to the TLD in question: .com as 8859-1, .ru as 8859-1 plus ISO Cyrillic, etc. That way all DNS servers can easily resolve names, but the target audience for a given domain can always have the input method to support accessing hosts in that domain.
Well duh (Score:1)
Re:just one problem... (Score:1)
The fact is EVERYONE builds their systems for their needs. The fact also is that Americans lead the world in driving issues of I18N and L10N.
Good Idea! (Score:1)
It's not ICANN, and it's not non-English (Score:2)
And it's "non-ASCII", not "non-English". There are already plenty of domain names that are non-English, as others have pointed out already. ASCII is a character set (of sorts); English is a human language. The differences are defined in detail in the requirements document for the IETF's working group.
Full details on the working group can be found at http://www.i-d-n.net [i-d-n.net]. Maybe folks should consider reading the copious archives before declaring that it can't be done. It can be done, and hopefully it can be done right. We're quite sure that the Powers That Be in the IETF won't allow it to become a standard if it isn't right.
Reason behind it (Score:1)
Re:Not good (Score:1)
Rather than trying to shoehorn international functionality into a system with limited, English-oriented semantics, the standards folks should be working on an intuitive, international, squat-resistant layer above the current domain-name/URL conventions. It won't be easy to do, but there's got to be a better way to distinguish vaguely similar sites than panamahats.com versus panamá-se-expatria.com. (Apologies for my Spanish.)
By the way, does this mean that domain names will now be case sensitive? I've always thought that the resource part of a URL should never be considered case sensitive, even when it maps to files on case-sensitive file systems. This was actually considered, but rejected on the grounds that case-mapping is ambiguous in some languages. If this is a legitimate issue, then internationalizing domain names is even more problematic than Docrates suggests.
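That ambiguity is real; the usual illustration is Turkish dotted/dotless i (my example, not Docrates'), where two distinct lowercase letters fold onto the same capital:

    # Two different lowercase letters both uppercase to "I" under the
    # default (non-Turkish) rules, so naive case-insensitive matching
    # of names cannot be round-tripped.
    print("i".upper(), "ı".upper())   # I I
    print("İ".lower())                # 'i' plus a combining dot above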
Well, excuse me, I have to go register Amazon.com, aMazon.com, AMazon.com....
Re:Bad idea, NOT (Score:2)
Expensive, yes
A pain for hard core geeks to get used to, yes
Necessary, hell yes! Perhaps not today, but soon.
I do see your point, but the same argument could be used against 16-bit computers (8 bits is the current standard and programs and data must be interoperable...)
Do you know how much creative spelling there is, simply to force non a-z characters into the DNS? Simply removing dots, rings and accents is not good enough. (Oops, suddenly my domain name became equivalent to "www.faggot.com" or someone else's brand name.)
Ever tried enforcing an "8 characters, a-z only" file name policy on a network where *some* servers and programs could not handle other names? Forget it. It was cheaper to dump those and buy a new network and Microsoft products (as you can see this was after the DOS days :-), even if technically inferior, than to handle the constant hassle.
People *hate* modifying spelling to comply with stupid limits. There is no standard way to map non a-z chars onto the DNS.
ASCII is outdated, get rid of it!
Either it will be done in a standardized way, or it will be done by Microsoft. I prefer the former.
Very good (Score:1)
Re:Questions (Score:1)
YOU see umlauted characters here, but I see normal capital Cyrillic letters. This problem comes from the default code page being used. Currently it is necessary to place
    <meta http-equiv="Content-Type" content="text/html; charset=">
and
    <meta http-equiv="Content-Language" content="">
only into pages in languages other than English. The question is: if I have a web page in Russian and I want to provide a link to a page in Sweden, what language should I declare for my page? AFAIK there are no means in HTML 4.0 (except using Unicode) to put multilanguage content on one page (correct me if I'm wrong). At least all or most of the original English-language pages with the default character set would have to be updated. So the real problem is not how one will type the link, but just seeing it more or less correctly.
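For what it's worth, the Unicode escape hatch mentioned above would look roughly like this (a sketch only; the lang codes and the Swedish link are placeholders of my choosing):

    # One UTF-8 page can carry Russian text and a Swedish link side by side;
    # the charset is declared once and lang marks each fragment.
    page = (
        '<meta http-equiv="Content-Type" content="text/html; charset=utf-8">\n'
        '<p lang="ru">Русский текст о шведских бутербродах.</p>\n'
        '<p lang="sv"><a href="http://räksmörgåsar.se/">räksmörgåsar.se</a></p>\n'
    )
    with open("mixed.html", "w", encoding="utf-8") as f:
        f.write(page)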
Re:How about *dis*-allowing characters? (Score:1)
Re:example... (Score:1)
Re:Give people some credit (Score:2)
You're right -- people are way too smart to get taken in by that kind of thing. I mean, it's not like some 23 year-old could release fake stock market news that would cause the stock of some electronics company to plummet, knocking 2.5 billion from its market cap.
As for people being unable to tell fact from fiction on the Internet in general, Slashdot is more than enough to reassure us in that regard.
So yeah, give people some credit!
-
How you would type www.wong-kar-wai.org (Score:2)
It depends on which Chinese character set you use (either traditional for Hong Kong and Taiwan, or simplified for mainland China).
For each character set there's a choice between a couple of input methods to map keystrokes from a QWERTY keyboard to the actual Chinese characters. I normally use a method called traditional Cangjie, and here's how you type the URL:
twlb vfog vfbtv . mg   jmso hodqn . dvii dttb
w    w    w     . wong kar  wai   . (--org--)
Of course there are rules to generate the above if you know what the word looks like :-). However, as you can see, it's much more inconvenient that way, and anyone who thinks that the average person who doesn't know Chinese typing would be able to reach their Chinese domain is being silly at the least.
[Off-topic] Non-profit organizations (Score:1)
Uh? Who are you kidding? Looks like you have been smoking something fairly strong lately (such as political propaganda).
"Non-profit organization": this term is such an oxymoron, anyway! As if any organization could last without somehow benefitting its members!
The whole DNS monopoly is a huge racket on the internet, anyway.
-- Faré @ TUNES [tunes.org].org
Re:Ýou people just don't get it (Score:2)
There are many Americans who understand internationalization issues very thoroughly, and some of them disagree with this proposal. It is a bad proposal because, first off, it really does not seem to understand internationalization issues. You do not accomplish I18N by using national character sets. Using an NCS does not make your content supported in an international environment; it does exactly the opposite. In many cases, there are half a dozen NCSs that support the same damn alphabet. If you really want I18N, you need to use Unicode (preferably UTF-8) or UCS.
If you are going to support multilingual domain names, resolution must occur in either Unicode or UCS. Let DNS lookup libraries handle the conversion from KOI-8 to UTF-8. The user enters the domain in their NCS and the DNS server only has to handle one character set.
Beyond the issue of I18N, however, is the issue of who a TLD is targeted at. If .com is aimed at a global audience, then domains registered under that TLD should support a global audience: i.e. ASCII or ISO-8859-1. NOTHING ELSE. Let .ru use more of the Unicode spectrum. Or even allow for a .ро (those are the Cyrillic letters for the first two characters in the Russian word Rossiya, in case your browser cannot resolve them) for domains aimed specifically at Russian speakers.
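In miniature, "let the lookup library handle the conversion" is just a decode/encode pair; a sketch, with KOI8-R chosen only because it is the NCS named above:

    # The user types the name in their national character set (KOI8-R here);
    # the resolver library re-encodes it so the DNS server only ever sees UTF-8.
    def to_wire_form(raw: bytes, user_charset: str = "koi8-r") -> bytes:
        return raw.decode(user_charset).encode("utf-8")

    typed = "россия.ru".encode("koi8-r")   # bytes as a KOI8-R terminal would send them
    print(to_wire_form(typed))             # one canonical UTF-8 form for the server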
Wow, this is amazing... (Score:1)
---------///----------
All generalizations are false.
Re:example... (Score:1)
Though it shows it as dom%e4n.nu in IE's address bar.
Re:Ýou people just don't get it (Score:1)
I'm not sure the point is to see everything in the first place. There is no way you can view all the content of the Internet in a single lifetime.
The problem comes when somebody is picking out the good hits from a search. They remember they want to go to fahrvergnügen.com, but they have no way to type the ü. Then what?
The new domain name system should be designed with the following key points in mind:
As you said, web services and software might help, but how many clueless users are going to be able to find them? How many of those are going to bother? In the "Internet economy" everyone keeps blathering about, they'll go somewhere else. Then all your "the Internet is global" propaganda is worthless...
<<< CmdrTHAC0 >>>
How it's supposed to work (Score:4)
Recently I posted this comment [slashdot.org] mentioning the fact that there's really no reason why a domain such as www.中国.com [www.中国.com] (you should see two Chinese ideograms meaning "China" between the "www." and the ".com" parts; further, if you click on this link, your browser should open a window telling you that the domain "www.中国.com" does not exist, with the same two Chinese ideograms) doesn't exist.
Let us recall: first, as specified [w3.org] by the HTML specification [w3.org], every HTML document, no matter what character set it is "encoded" as, is written in the all-encompassing Unicode [unicode.org] character set. So when you write something like "&#20013;&#22269;" in HTML (which displays as 中国), it refers to the Unicode characters (decimal) 20013 and 22269, no matter what the current character encoding and font are. So that's how you write the link text. Second, as for the URL itself, well, although it is not (as far as I know) formally recommended by an Internet standard, it is widely recognized that URLs are written in the UTF-8 [isi.edu] encoding format (which is afterward %-encoded into ASCII).
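Concretely, the UTF-8-then-%-escape step for those two ideograms comes out like this (urllib is just a convenient way to show it; the draft itself does not mandate any particular library):

    from urllib.parse import quote, unquote

    # U+4E2D U+56FD ("China"), UTF-8 encoded and %-escaped into plain ASCII.
    print(quote("中国"))                  # %E4%B8%AD%E5%9B%BD
    print(unquote("%E4%B8%AD%E5%9B%BD"))  # back to the two ideograms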
The whole process is described in this Internet Draft [isi.edu] ("Internationalized Uniform Resource Identifiers"; WORK IN PROGRESS!) by Larry Masinter and Martin Duerst where the relationship between URIs and IURIs (Internationalized URIs) is discussed in detail.
The DNS is the toughest part of all. The DNS specification (RFC1034 [isi.edu]) states (section 3.1) that DNS data is to be taken as binary for possible upward compatibility (this was wonderful foresight on Mockapetris' part!). Consequently, there is nothing wrong, as far as the standards are concerned, with using (UTF-8 encoded Unicode) 8-bit data in DNS labels. Except, of course, that many "buggy" implementations will have to be corrected for broken assumptions, *sigh*. The IDNS [idns.org] working group suggests using a UTF-5 encoding to avoid going beyond the current domain name limits: I think this is not a good thing and we should stick to UTF-8 and repair broken software.
Oh, and incidentally, see this page [eleves.ens.fr] to know how broken your browser's Unicode support is.
I can see where this is going... (Score:2)
ugg (Score:2)
a domain name is supposed to be universally accessible. this is going to cause a great many pains in the ass.
old browsers won't work
english keyboards lack accented characters
it's not fun changing your charset, then punching in random alt+XXX codes until you match the CJK symbol you're looking at.
the internet is really becoming dumb.
just one problem... (Score:2)
wouldn't this choke most applications? I'm not entirely sure how CJK characters are handled... doesn't seem to me like it would be a pop-in transition.
mapping back to ASCII? (Score:2)
However, if you had some kind of translation software that automatically mapped the local character set back to ASCII (and of course disallowed name clashes for the mapped names when registration occurred), it could be a win/win situation, both making the DNS more useful for non-English speakers and keeping the net globally accessible.
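One cheap (and lossy) way to do that mapping is Unicode decomposition followed by stripping the combining marks; a sketch of the idea, and exactly the kind of clash the registration check would have to catch:

    import unicodedata

    def fold_to_ascii(name: str) -> str:
        """Lossy map of an accented name to plain ASCII: decompose each
        character, then drop the combining marks and anything non-ASCII."""
        decomposed = unicodedata.normalize("NFKD", name)
        return decomposed.encode("ascii", "ignore").decode("ascii")

    print(fold_to_ascii("räksmörgåsar.se"))   # raksmorgasar.se
    print(fold_to_ascii("résumé.com"))        # resume.com -- note the clash potential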
Re:Why Spanish before other European languages. (Score:2)
Re:Give people some credit (Score:2)
As you eloquently pointed out, the masses are confused by some young guy posting a stock market rumor. And they are confused by fake sites pretending to be something else. What would your solution be? Can you really think of any fix for this, other than letting the thing settle down and everyone chilling out and understanding how the net works?
Whenever something new happens on the net, there is always mass panic and hysteria, but eventually people settle down. The way to deal with issues of confusion is not to PREVENT abuses from happening, or to put in "safeguards" to keep the dumb masses from getting confused.
Things that need regulation should have to clear a high threshold before we slow down the anarchy and vibrancy of the net. Stuff like market rumors or fake sites or accented characters... geez, let millions of people panic and get confused and learn in the process. They will get over it and figure it out. There's no way to hold everyone's hand and regulate panic and confusion.
I also think there's a misunderstanding about accented chars. People who type something know what they want. For instance, someone who didn't know English at all might wonder if people get confused by some fake site registering yah00.com or m1cros0ft.com - they all look the same to him.
The o and the o with 2 dots above it may seem the same to us, but they're about as different to a Norwegian as a zero is to an 'o'. So don't worry, "they" won't get confused by similar-looking domains. Even if they do, they'll get over it.
w/m
There is Need for a Non-DNS URL System (Score:2)
I think it is about time we tossed out DNS when it comes to URLs. It is ridiculous that so many millions of non-technical users are expected to use DNS. The further absurdity of applying the DNS system to URLs shows in the endless "property" claims made by rich, litigious corporations.
Why not use some kind of distributed, non-exclusive labeling system that lets IBM have the name "IBM"? Maybe something LDAP based?
We are not going to get anywhere by patching up the DNS system one problem at a time. We need to engineer a new solution. I'm all for evolution, but I don't want to wait for it to come up with something that works.
Thought it would be a joke... (Score:2)
I have read about it on http://www.spiegel.de but as far as I remember, it was only a doubtful claim by some politicians who have a German special character like ä, ö or ü in their names and could not register it without using substitutions like ä --> ae and so on.
As far as I remember, the chairman of http://www.denic.de only laughed at this claim.
I think it would lead to more problems than it would solve! Or is it again just about making $$?
Michael
Re:Questions (Score:2)
Unicode Limitations / BIND (Score:3)
I didn't find these questions answered anywhere on ICANN or NSI's sites. Anybody have any ideas?
-Waldo
-------------------
Not good (Score:5)
my recommendation would be to leave it up to the countries' TLDs. So if I want to register cualquiercosa.com.pa, then OK, but the regular
Re:A split on the internet? (Score:2)
There's a problem with this... (Score:2)
One, people talk about accented characters as being harder to recognize when spoken. While this is true, there's another problem, and one that's a lot tougher: there is no standard way to type these characters. On a Mac it's done one way (fairly intuitive, based on the character over which a given mark most frequently appears), on Windows it's done another way (an unnecessarily difficult process involving a four-digit keycode), and on Linux/Unix it's still another (I don't even know how it's done there).
Part of the reason the keyboard works so well is that it's at least semi-standardized; for the basic Roman character set I can move across platforms effortlessly. But when you start throwing diacriticals into the mix, I'm lost when I move from platform to platform. We need to solve that problem before we can even think of putting such characters in URL's. Can it be done? I think so.
Now, there's the problem of CJK characters in URL's. First of all, most computers aren't even capable of recognizing these without special software. As a result, the characters come out as a sequence of ASCII chars which if you're really lucky might all be printable. If you're not so lucky, the characters won't even be printable, or they'll be indistinguishable from one another so you still don't know what to type.
The answer here? Unicode (specifically UTF-8) helps, but many computers still don't support Unicode. Even in the case of those that do, I doubt there are any fonts which support every single character in the CJK set yet (remember, the Chinese character set in particular is truly vast; a two-byte encoding system is still insufficient for encoding all the possible characters). While all current operating systems can manage Unicode, many people are unable or unwilling to upgrade to current technology, and that's going to be a huge barrier to overcome (it may even prove insurmountable).
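To put a number on the "two bytes is not enough" point: a character from CJK Extension B (my example; it sits outside the 65536 code points a fixed two-byte scheme can cover) costs a four-byte surrogate pair in UTF-16:

    # U+20000, the first CJK Extension B ideograph, lies beyond what any
    # fixed two-byte scheme can address, so UTF-16 spends 4 octets on it.
    rare = "\U00020000"
    print(hex(ord(rare)))                  # 0x20000
    print(len(rare.encode("utf-16-be")))   # 4
    print(len(rare.encode("utf-8")))       # 4 in UTF-8 as well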
Supporting all the world's languages in URL's is a Good Thing. However, we have more than a few problems that we have to get through before we can accomplish that goal. The resources currently being spent on this project would be better spent solving those problems first.
----------
.nu (Score:3)
And worldsnames.net [worldnames.net] also features Japanese, Chinese, Korean, Arabic, and Cyrillic characters.
Though one of the nice "features" of the internet is the fact that you have the opportunity to reach a global public. Which is rather hard when you have country/language-specific characters. My 0.02