h3. Best Practice
bq. Use human readable URLs
This is human readable:
This is not:
The problem with complex URLs is three-fold:
# A human cannot “reverse engineer” a URL to figure out where they are in the site or what might be “one level higher”. Human readable URLs allow you to “cut off” the end of the URL and get to a higher level in the site. URLs that reflect the site’s page layout also act as a secondary way-finding tool.
# It is hard to share URLs that are not human readable. If you cut and paste a complex URL into an e-mail, it will often break across two lines because it is too long to fit on one. This creates a broken link for the recipient.
# Some search engines have a hard time with overly complex URLs and you may find that many of your pages are not accessible to search engine “bots” looking for your content.
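The "cut off the end" behavior described above can be sketched in a few lines. This is an illustrative snippet, not part of the original article; the example URL paths are hypothetical, built on the *allaboutbutlers.org* domain used later in this section.

```python
from urllib.parse import urlsplit, urlunsplit

def parent_url(url):
    """Return the URL one level higher by dropping the last path segment."""
    parts = urlsplit(url)
    segments = [s for s in parts.path.split("/") if s]
    parent_path = "/" + "/".join(segments[:-1])
    # Drop query string and fragment: the parent page stands on its own.
    return urlunsplit((parts.scheme, parts.netloc, parent_path, "", ""))

print(parent_url("https://allaboutbutlers.org/servingdinner/settings"))
# → https://allaboutbutlers.org/servingdinner
```

This only works as a way-finding tool when every truncated URL actually resolves to a real page, which is exactly what a directory-based layout provides.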
If your site is dynamically generated, discuss with your tech team or vendor whether it is possible to use human readable URLs. If it isn't, you'll have to think seriously about the cost-benefit of findability vs. technology upgrades.
If your site is static and created by humans, remind them to name pages and directories with the user in mind, not the builder.
Instead of giving pages descriptive file names, put them in well-named directories and save each page as index.html.
In this “best case” a visitor can simply type *allaboutbutlers.org/servingdinner* to get to the exact content they want.
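A minimal sketch of the directory layout this implies, assuming a web server that serves index.html as the default document for a directory. The second topic directory, *polishingsilver*, is a hypothetical example added for illustration.

```python
import os
import tempfile

# Build a toy site: one well-named directory per topic, content in index.html.
site = tempfile.mkdtemp()
for topic in ("servingdinner", "polishingsilver"):
    os.makedirs(os.path.join(site, topic))
    with open(os.path.join(site, topic, "index.html"), "w") as f:
        f.write(f"<h1>{topic}</h1>")

def resolve(request_path):
    """Map a request path like /servingdinner to its default document on disk."""
    return os.path.join(site, request_path.strip("/"), "index.html")

with open(resolve("/servingdinner")) as f:
    print(f.read())
# → <h1>servingdinner</h1>
```

The visitor types only the directory name; the server quietly supplies index.html, so the URL stays short and readable.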
h3. Best Practice In Action