15 Best Practices for SEO

#1: Whenever possible, use a single domain and subdomain

It's hard to argue with this given the preponderance of evidence and examples of people moving their content from a subdomain to a subfolder and seeing improved results (or, worse, moving content to a subdomain and losing traffic). Whatever metrics search engines use to judge whether content should inherit the ranking power of its parent domain, they seem to have trouble passing that power consistently across subdomains.

That doesn't mean it can't work, and if a subdomain is the only way you can create a blog or produce the content you need, it's better than nothing. But your blog is far more likely to perform well, and to help the rest of your site's content, if it lives together with that content on a single root domain and subdomain.

#2: The more human-readable, the better

It should come as no surprise that the easier a URL is for humans to read, the better it is for search engines. Accessibility has always been part of SEO, but never more so than today, when engines can leverage user and usage-data signals to determine what people are engaging with and what they aren't.

Readability can be a subjective issue, but hopefully this example can help:


The requirement is not that every part of the URL be absolutely clean and perfect, but that it at least be easy to understand and, ideally, compelling to those who see it.

#3: Keywords in URLs: still a good option

It remains a solid idea to use the keywords you're targeting for rankings in your URLs. This is true for several reasons.

First, keywords in the URL help reassure those who see it on social media, in an email, or when hovering over a link that clicking will get them what they want and expect, as in the example below from Metafilter (notice how hovering over the link shows the URL in the browser's bottom-left corner):


Second, URLs are regularly copied and pasted, and when a link has no anchor text, the URL itself serves as the anchor text (which is still a powerful ranking input), for example:



Third, and finally, keywords in the URL appear in search results, and research has shown that the URL is one of the most prominent elements searchers consider when deciding which result to click.


#4: Multiple URLs serving the same content? Canonicalize them!

If you have two URLs that serve very similar content, consider canonicalizing them with either a 301 redirect (if there's no real reason to keep the duplicate) or a rel=canonical (if you want to keep slightly different versions for some visitors, e.g., a printer-friendly page).
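As a minimal sketch of the rel=canonical option, the duplicate page points back at the primary version from its `<head>` (the URLs here are illustrative, not from the original article):

```html
<!-- In the <head> of the printer-friendly duplicate page: -->
<link rel="canonical" href="https://example.com/lagavulin-review" />
```

With this in place, engines consolidate the duplicate's ranking signals into the canonical URL while visitors can still load the print version directly.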

Duplicate content doesn't really incur a penalty (at least not until you duplicate at very large scale), but it can split ranking signals in a way that hurts your search traffic potential. If Page A earns some ranking signals and its duplicate Page A2 earns others, canonicalizing them consolidates those signals and gives Page A a better chance of ranking and earning visits.

#5: Exclude dynamic parameters when possible


If you can avoid URL parameters, do so. If a URL has more than two parameters, it's probably worth a serious investment to rewrite it as a static, readable, text-only URL.
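As a small sketch of the cleanup (the URL and parameter names below are hypothetical, purely for illustration), tracking and session parameters can be stripped, or a short whitelist kept, when generating the links you want engines and users to see:

```python
from urllib.parse import parse_qsl, urlsplit, urlunsplit

def strip_params(url, keep=()):
    """Drop query parameters from a URL, except those named in `keep`."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in keep]
    query = "&".join(f"{k}={v}" for k, v in kept)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, ""))

# Hypothetical example URL:
print(strip_params("https://example.com/whisky?id=42&utm_source=buffer"))
# -> https://example.com/whisky
print(strip_params("https://example.com/whisky?id=42&utm_source=buffer", keep=("id",)))
# -> https://example.com/whisky?id=42
```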

Most CMS platforms have become experts at this over the years, but a few stragglers remain.

Some dynamic parameters are used for click tracking (such as those inserted by popular social sharing apps like Buffer). In general, these don't cause a big problem, but they can make URLs somewhat unsightly and awkwardly long. Use your own judgment as to whether the benefits of tracking parameters outweigh the negatives.


A 2014 study by RadiumOne suggests that, in social sharing (which has positive but usually indirect effects on SEO), short URLs that clearly communicate the content outperform both unclear shortened links and long URLs.

#6: Short > Long

Shorter URLs are generally preferable. You don't need to take this to extremes, and if your URL is already under 50-60 characters, don't worry about it at all. But if you have URLs pushing past 100 characters, there's probably an opportunity to rewrite them and gain value.

This is not a direct ranking problem with Google or Bing; search engines can process long URLs without much trouble. The issue lies with usability and user experience: short URLs are easier to parse, copy and paste, and share on social media. These benefits may add up to only a fractional improvement in sharing and amplification, but every tweet, like, share, pin, email, and link counts (either directly or, often, indirectly).

#7: Match URLs to titles most of the time (when they make sense)

This doesn't mean that if the title is “My 7 Favorite Bottles of Islay Whiskey (And How One of Them Has Cost Me My Entire Lego Collection)” the URL must match it perfectly. Something shorter that carries the core of the title, along the lines of randswhisky.com/7-favorite-islay-whiskies, would be just fine, as would reasonable variations on it.

The pairing serves a primarily human-centered goal: the URL gives web users an accurate sense of what they'll find on the page, and the headline/title then fulfills that expectation.
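The title-to-URL conversion can be sketched in a few lines; this helper is illustrative (not from the original article) and simply lowercases, drops punctuation, and trims to the first few words:

```python
import re

def slugify(title, max_words=None):
    """Turn a post title into a short, readable, hyphenated URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9\s-]", "", slug)  # drop punctuation
    words = slug.split()
    if max_words:
        words = words[:max_words]             # keep only the leading words
    return "-".join(words)

title = ("My 7 Favorite Bottles of Islay Whiskey "
         "(And How One of Them Has Cost Me My Entire Lego Collection)")
print(slugify(title, max_words=6))
# -> my-7-favorite-bottles-of-islay
```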

For the same reason, it's highly recommended to keep the page title (which search engines display prominently in their results) and the visible headline closely aligned, so that one creates the expectation and the other fulfills it.


For example, above you'll see two URLs I shared on Facebook. In the first, it's not entirely clear what you'll find on the page; it's in the news section of the BBC website, but beyond that there's no way to know what might be there. In the second, however, Pacific Standard Magazine's URL gives a clear idea of the article's content, which the title then delivers on:


We should aim for a similar level of clarity in our own URLs and titles.

#8: Including prepositions and connecting words is optional

If your title includes stop words (and, or, but, of, the, a, an, etc.), it's not critical to include them in the URL. You don't have to leave them out, either, but dropping them can sometimes make a URL shorter and easier to read when shared. Use your best judgment, weighing readability against length.

#9: Remove/Control Unwieldy Punctuation Characters

A number of text characters turn into ugly, hard-to-read fragments when inserted into a URL string. In general, it's good practice to eliminate or control them. There are long reference lists of safe vs. unsafe URL characters:


It's not just the poor readability these characters cause, but also their potential to break certain browsers, crawlers, or parsers.
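A quick way to see which characters are problematic, as a sketch: RFC 3986 defines a small "unreserved" set that never needs escaping; anything outside it gets percent-encoded (or is better left out of the URL entirely):

```python
import string
from urllib.parse import quote

# RFC 3986 "unreserved" characters: letters, digits, hyphen, dot,
# underscore, and tilde. Everything else needs percent-encoding.
UNRESERVED = set(string.ascii_letters + string.digits + "-._~")

def unsafe_chars(segment):
    """Return the characters in a path segment that need percent-encoding."""
    return sorted(set(segment) - UNRESERVED)

print(unsafe_chars("whisky & soda?"))  # -> [' ', '&', '?']
print(quote("whisky & soda?"))         # -> whisky%20%26%20soda%3F
```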

#10: Limit redirects to two or fewer

If a user or a search engine requests URL A and it redirects to URL B, that's fine. It's even OK if URL B redirects to URL C (not great; pointing URL A directly at URL C would be more ideal, but it's not terrible). Beyond two hops in the redirect chain, however, you can get into trouble.

Generally speaking, search engines will follow longer redirect chains, but they have recommended against the practice in the past, and for URLs they consider "less important," they may not follow the chain at all or may not pass full ranking credit through it.

The bigger problem is browsers and users, who are slowed down and sometimes even frustrated by long redirect chains (mobile browsers, in particular, can occasionally struggle with them). Keep redirects to a minimum and you'll have fewer problems.
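A minimal sketch of collapsing a chain, in Apache config (the paths here are hypothetical): rather than letting a retired URL hop through an intermediate page, point every legacy path straight at the final destination with its own single 301:

```apache
# Bad: /old-page -> /interim-page -> /new-page (two hops).
# Good: each legacy path 301s directly to the final URL (one hop each).
Redirect 301 /old-page     /new-page
Redirect 301 /interim-page /new-page
```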

#11: Having fewer folders is generally better

Take a URL buried several folders deep, and consider restructuring it so the same page sits only one or two levels down.

It's not that slashes (i.e., folders) necessarily hurt performance, but extra depth can create a perception of site depth for both engines and users, and it makes changing URLs considerably more complex (at least in most CMS setups).
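As a purely hypothetical illustration (these URLs are not from the original article), the same post could live at either of the two addresses below, and the flatter one is usually the safer bet; a tiny helper makes the depth difference explicit:

```python
from urllib.parse import urlsplit

def folder_depth(url):
    """Count the folders (path segments before the final one) in a URL."""
    segments = [s for s in urlsplit(url).path.split("/") if s]
    return max(len(segments) - 1, 0)

# Hypothetical URLs for illustration:
print(folder_depth("https://randswhisky.com/blog/2015/03/islay/lagavulin-16"))  # -> 4
print(folder_depth("https://randswhisky.com/blog/lagavulin-16"))                # -> 1
```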

There's no urgent need to change existing URLs over this, but use your best judgment when structuring new ones.

#12: Avoid hashes in URLs that create standalone/unique content

The hash (or URL fragment identifier) has historically been a way to send a visitor to a specific location on a page (Moz blog posts, for example, use the hash to jump to a particular comment). Hashes can also be used as tracking parameters (for example, randswhisky.com/lagavulin#src=twitter). Using URL hashes for anything more than this, such as showing content that differs from the hash-free page or serving entirely separate pages, is generally a bad idea.
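The underlying reason, sketched briefly: the fragment is handled client-side and is not even sent to the server, so hash-only "pages" are invisible to anything that fetches the hash-free URL:

```python
from urllib.parse import urldefrag

# The fragment ("hash") is stripped client-side; a server or crawler
# requesting this URL only ever sees the part before the '#'.
url, fragment = urldefrag("https://randswhisky.com/lagavulin#src=twitter")
print(url)       # -> https://randswhisky.com/lagavulin
print(fragment)  # -> src=twitter
```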

There are exceptions, such as Google's hashbang format for developers of dynamic AJAX applications, but even these aren't as clean, visitor-friendly, or SEO-friendly as statically rewritten URLs. Sites from Amazon to Twitter have found great benefit in simplifying URLs that previously relied on hashes/hashbangs. If you can avoid it, do so.

#13: Be careful with uppercase and lowercase letters

A couple of years ago, Search Discovery's John Sherrod wrote an excellent article laying out the challenges and issues around capitalization in URLs. The short version: if you're using Microsoft servers/IIS, you're usually free of this problem. If you're on Linux/UNIX, you can get into trouble, because those servers can treat different cases as different resources, so randswhisky.com/AbC could serve different content than randswhisky.com/aBc. That's bad.


In an ideal world, URLs with the wrong case would automatically redirect or canonicalize to the correct version. There are rewrite rules to help with this, and they're highly recommended if you're facing the problem.
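One common Apache sketch of this fix (assumptions: mod_rewrite is enabled, and note that `RewriteMap` must be declared in the main server config rather than in .htaccess):

```apache
# In the main server config: map paths to lowercase via the built-in
# "tolower" internal function.
RewriteMap lowercase int:tolower

# In the virtual host: 301 any request whose path contains an
# uppercase letter to its all-lowercase equivalent.
RewriteEngine On
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lowercase:$1} [R=301,L]
```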

#14: Hyphens and underscores are preferred as separators

Notably missing (for the first time in my many years of updating this) is my recommendation to avoid underscores as word separators in URLs. In recent years, search engines have successfully overcome previous challenges with this issue and now treat hyphens and underscores similarly.

Spaces technically work, but they appear as "%20" in URLs, which hurts readability. Try to avoid them if possible (it's usually pretty easy in a modern CMS).

#15: Keyword stuffing and repeating are meaningless and make your site look spammy

Take a look at the search results below, and you'll see the same keyword repeated over and over in the URLs. That's probably not ideal, and it could even bias searchers against clicking.


Repetition like this doesn't help your rankings; Google and Bing have long since moved beyond algorithms that positively reward a keyword appearing multiple times in the URL string. Don't hurt your chances of earning clicks (which can affect your rankings) by overdoing keyword matching and repetition in your URLs.

Source: Moz