Today, the Universal Acceptance Steering Group (UASG) released two reports aimed at measuring the state of popular browsers and websites and their ability to handle all domain names – including new, longer, or non-Latin top-level domain names (TLDs) or non-Latin email addresses.
The “Evaluation of Websites for Acceptance of a Variety of Email Addresses” report (UASG017) found that of the 749 websites tested, just 7 percent passed all the test cases, which included attempts to register seven diverse types of email addresses – both non-English addresses and those with top-level domains longer than the traditional two or three characters.
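A common reason sites fail test cases like these is a hard-coded validation regex that caps the TLD at a few ASCII letters. The sketch below is illustrative, not taken from the report: the legacy pattern rejects longer TLDs and non-Latin addresses outright, while the permissive pattern only checks the rough shape of an address and leaves real validation to a confirmation email.

```python
import re

# Legacy pattern seen in many old form validators: TLD capped at 2-4
# ASCII letters, local part limited to ASCII. Rejects .expert and IDN mail.
LEGACY_PATTERN = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,4}$")

# Permissive shape check: non-whitespace, exactly one "@", a dot in the
# domain. Detailed validation is left to an actual delivery attempt.
PERMISSIVE_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

for addr in ("user@example.com", "user@example.expert", "用户@例子.世界"):
    legacy_ok = bool(LEGACY_PATTERN.match(addr))
    permissive_ok = bool(PERMISSIVE_PATTERN.match(addr))
    print(f"{addr}: legacy={legacy_ok} permissive={permissive_ok}")
```

Only `user@example.com` passes the legacy pattern; the .expert address and the fully non-Latin address both fail it, while all three pass the permissive check.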
After testing 17 URLs in eight browsers on six different operating systems, “The Universal Acceptance of Popular Browsers” report (UASG016) found only one – Internet Explorer – was fully UA-compliant. The majority had challenges with non-English domain names. In these cases, the browser either displayed search results instead of loading the expected web page or did not render URLs properly in the tab title bar.
According to the reports, while there is some achievement of Universal Acceptance (or UA – making sure all email addresses and all domain names can be successfully used online), many problems still exist.
“Since 2010, the common infrastructure of the Internet has evolved, allowing for longer TLDs (and email addresses based on those TLDs) and for domains that represent geographically diverse local scripts (e.g., 普遍接受-测试.世界, ua-test.世界, etc.). These new domain names allow people to choose an online name that best reflects their own sense of identity. And yet, the software development community still has work to do to apply these Internet standards to their online systems.”
“The UASG’s goal with the reports is to educate the software development community on the state of Universal Acceptance, and highlight specific areas where companies can improve and become UA-compliant. Businesses and governments – or any organization with an online presence – have an interest in becoming UA-compliant, as it significantly opens access to worldwide users who have yet to meaningfully leverage Internet participation. UA opens a truly multilingual Internet, fosters competition, and provides new and meaningful options for online identities.”
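For context on the local-script domains the report quotes mention: applications hand IDN labels to the DNS in an ASCII-compatible (“Punycode”) form. A minimal sketch using Python’s built-in `idna` codec, which implements the older IDNA 2003 rules (the third-party `idna` package implements the IDNA 2008 rules registries actually use):

```python
# The Chinese-script TLD used in the report's example, ua-test.世界.
label = "世界"

# Encode to the ASCII-compatible (ACE/Punycode) form used on the wire.
ascii_form = label.encode("idna")
assert ascii_form.startswith(b"xn--")

# Decoding the ACE form yields the original Unicode label again.
assert ascii_form.decode("idna") == label
print(ascii_form.decode("ascii"))
```

Software that only expects the `xn--` form in one place and the Unicode form in another is a typical source of the rendering and lookup failures the browser report describes.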
It seems that dedicating the required manpower to tackle these difficult tasks is not a top priority for most companies. There is so little reward. The TLDs should help these companies by creating a framework that legacy software can use. You can’t expect companies to bear all these expenses only to satisfy a very small percentage of their users or clients.
And ICANN is the first to blame for all this mess. First you plan and prepare, and then you make the release. But all they care about is grabbing the money fast and letting others deal with it later.
The report does not deliver any clear, conclusive results, and its methodology is questionable.
Did they target the Alexa Top 1000 websites? It’s not clear: “evaluated more than 1000 websites (based on Alexa ranking)”
Here is why this approach is inaccurate in its assumptions.
A large portion of websites that actively use contact forms is built on WordPress, with constantly updated plug-ins. By default, the WP core and its plugins keep up with these developments, e.g., IDN characters in email addresses and the availability of new gTLDs.
The conclusion relates more to bad input-validation practices in contact forms than anything else, and it’s definitely not specific or conclusive about gTLD adoption.
Here is my biggest pet peeve when it comes to “recognizing” new gTLDs: Facebook.
In FB, when you just type Example.com, it is recognized and turned into a live link without having to use the “www.”; the only exception I have found is in messages, where you still need to use the “www.”
When you use Example.nTLD, however, you’re stuck with having to always use the “www.”
I suppose there may be other social media contexts I’m not aware of where the problem is the same. In fact, come to think of it, that may be the case with Reddit too (not sure).
A “.” doesn’t mean web address, but “.com” does. If Facebook started assuming dots between words were URLs, that would result in lots of unintended links. “Buy.today” could be a URL or just a marketing slogan.
People forget to put a space after the period. You can’t turn all of these into links. There would have to be software that checks against all the new gTLDs, and that would put a lot of weight on all applications. It won’t happen any time soon, because software can’t really know what the mistake is. 🙂
It’s not fair to people and businesses who are trying to make a go of some of the new gTLDs.
The error of failing to put a space after a period is not going to be abundant enough to be a major issue.
And yes, you could also have code running to check whether the string even ends in a known TLD.
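A linkifier along those lines could be sketched as follows. This is a hypothetical illustration, not Facebook’s actual logic, and `KNOWN_TLDS` here is a tiny sample of the roughly 1,500 delegated TLDs published by IANA (data.iana.org/TLD/tlds-alpha-by-domain.txt):

```python
# Tiny sample of delegated TLDs; a real implementation would load the
# full IANA list and refresh it periodically as new TLDs are delegated.
KNOWN_TLDS = {"com", "net", "org", "today", "expert"}

def linkify_candidate(token: str) -> bool:
    """Return True if a bare token looks like a domain worth linkifying."""
    token = token.rstrip(".,!?")          # trailing punctuation isn't part of a domain
    parts = token.lower().split(".")
    if len(parts) < 2 or not all(parts):  # need at least label.tld, no empty labels
        return False
    return parts[-1] in KNOWN_TLDS        # only linkify when the TLD actually exists

print(linkify_candidate("buy.today"))    # True: .today is a delegated TLD
print(linkify_candidate("the.You"))      # False: "you" is not a TLD here
```

A check like this avoids turning a missing space after a period into a live link, while still recognizing legitimate new-gTLD domains typed without “www.” or “http://”.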
Or there could at least be some method and mechanism provided to enable people to quickly and easily make something appear as a link without having to use “www.” or “http://” to level the playing field just a bit.
People used to say iPhones should have a .mobi button and default to .mobi when people type a keyword.
Only investors in new TLDs would think it is unfair. Everyday surfers would just get redirected to new TLDs by mistake.
No Snoopy. You missed my final point:
“at least be some method and mechanism provided to enable people to quickly and easily make something appear as a link without having to use “www.” or “http://” to level the playing field just a bit.”
There is no possible “mistake” when the person making something a live link intended to do that. Did you not understand that’s what I advocated there? It doesn’t get any simpler – give people the option through some little method in the FB interface, like any other little option they include. As it is, adding “www.” or “http” has to be done deliberately to begin with, so a mistake isn’t even possible.
And what kind of remark is this:
“Only investors in new tlds would think it is unfair.”
Only someone biased in the other direction could make a statement like that.
I’m as pro .com as anyone, but if anyone suggests that not even a small number of new TLDs are beautiful and desirable from an end-user perspective, then they are either lying or clueless. I have no unrealistic ideas about new TLDs in general, but I certainly think a small number of them are remarkably nice, with real potential.
So, yes, no matter how you slice it, it’s unfair and also not good for society or the economy that only .com has such an advantage in social media.
Happy to agree with Snoopy regarding the period “.” – I see many shops using a dot in signage these days. Just more confusion.
See my suggestion above.
The only people who care about this are registries. Normal people and developers don’t care. 🙂
I had to use a separate email address for my 1and1.com account because they could not support my use of a .expert domain. I was hampered in bidding on and registering domains because I was using that .expert domain as my main account email. They were somehow blocked from sending me required notices, and their support team could not contact me properly either. And this is a registrar that sells new gTLDs!! I cut off all business with them because they are hands down the absolute worst registrar there is, but that is a rant for another time.
As to the study, I agree with Acro that the inclusion of IDN characters hopelessly muddles the case for how well the ngTLDs are actually performing. Let us see how well they are doing without the complication of IDN characters. The UASG and ICANN have made a political decision to conflate the serious, immediate issue of ngTLD acceptance with the separate, less urgent issue of IDN character recognition, in order to force attention toward the latter.
The top priority for ICANN should be spending the millions it has raked in through ngTLD sales on fixing this issue. Or it might be more effective to pay its staff only a percentage of their salaries, bonuses, and travel budgets commensurate with the rate of ngTLD acceptance.