As we said in a previous article, "Technical SEO is a technique of creating a footpath for crawlers so they can understand the website." Creating that footpath can be tricky and usually involves some programming skills.
In this article we are going to explain the main pillars of technical SEO without going into the details of each, because those details will be the topics of the following articles.
My approach to this topic follows the way I usually analyse any website and determine the steps that need to be taken.
So let’s start!
The first thing I always check on any website is the language declared in its markup. If you are trying to rank your website globally, then the general lang="en" declaration is more than enough, provided of course that your website is in English. But if you are trying to rank a website only in the UK, it is better that the content is written in the British variant of English, with the proper declaration lang="en-GB". Properly declaring the language of the website is no guarantee that your rankings will be boosted, but it will help. Developers at Google are constantly trying to align their algorithm with global standards, so you should align your website with them too. A full list of language declarations according to the ISO 639-1 Language Codes is available here.
The first part of the declaration is the language ("en") and the second part is the country ("GB") for which the website is created. Language declarations must be written in the same pattern as above and aligned with the ISO Country Codes (note that the ISO code for the United Kingdom is "GB", not "UK").
The declaration lang="fr-CA", for example, would be written for a website in the French language aimed at Canada.
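Put together, the declaration sits on the opening html tag of every page. A minimal sketch (the page content is a placeholder):

```html
<!DOCTYPE html>
<!-- French-language website aimed at Canada: language "fr", country "CA" -->
<html lang="fr-CA">
  <head>
    <meta charset="utf-8">
    <title>Exemple</title>
  </head>
  <body>
    <!-- page content -->
  </body>
</html>
```

For a globally targeted English site, the same tag would simply read `<html lang="en">`.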
As CMS systems (WordPress, Joomla, Magento, Squarespace, etc.) became more and more popular, a new problem emerged. Those systems create a massive number of pages which are not visible to a normal user but exist in the background of the website, creating additional hassle for crawlers. Those pages are built for CMS functionality purposes. A similar problem appeared when many advertising platforms emerged that pass parameters inside the URL in order to understand and follow user behaviour. As a solution to these problems, canonical tags were invented, to show crawlers which version of a page is the original and that the parameterised versions should be disregarded.
The URL of the page you are viewing is https://www.prime07.ie/technical-seo?id=2358, which tells me that you arrived on this page through my advertising campaign 2358. For crawlers, however, the following canonical tag is injected into the HTML of the page:
<link rel="canonical" href="https://www.prime07.ie/technical-seo" />
which tells them that there is only one page about technical SEO and that the other versions should be disregarded.
The same principle can be used for variant product pages (different colours or sizes), or when you create duplicate products for certain promotions but do not want those product pages to rank in the SERPs.
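For example, a product that exists in several colour variants might be handled like this (the shop URLs below are hypothetical):

```html
<!-- Placed in the <head> of both
     https://www.example.com/t-shirt?colour=red and
     https://www.example.com/t-shirt?colour=blue,
     pointing crawlers at the single canonical product page: -->
<link rel="canonical" href="https://www.example.com/t-shirt" />
```

Every variant URL then consolidates its signals onto the one page you actually want ranked.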
When you are creating a website, you need to decide on which hostname it will be available (www or non-www) and whether it will be served over http or https. A common mistake of many developers is to create the website on one variant but forget to prevent it from being served on the other variants. Every website should be forced to display its content on only one variant of the URL, e.g. "https://www.prime07.ie/"; the other variants, with the http protocol or without www, should be redirected to the version you decided on. In the case of a WordPress website, the solution is a few lines of code in the .htaccess file which redirect users to that single version.
Having the same website active on multiple domains, or reachable over multiple protocols, sends a confusing message to crawlers and your website loses credibility. In this article we are not going to analyse which variant is better (www or non-www, http or https); we are only saying that the website should be accessible on one version only.
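On an Apache server (which most WordPress installs run on), a common .htaccess sketch for forcing the https://www. version looks like the following. This is a generic mod_rewrite pattern, not a one-size-fits-all rule: adjust the domain and test carefully, because redirect behaviour varies between hosting setups.

```apache
# Enable the rewrite engine (WordPress usually adds this already)
RewriteEngine On

# Redirect any http:// request to the https://www. version
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.prime07.ie/$1 [L,R=301]

# Redirect the non-www host to the www version
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.prime07.ie/$1 [L,R=301]
```

The 301 status tells crawlers the move is permanent, so ranking signals are passed to the chosen version.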
Using sitemaps is one of the essentials for every SEO practitioner. Why is that? Because sitemaps help crawlers find every page on your website in the shortest possible time. Some pages of a website have very few backlinks, so they are much harder for crawlers to discover, especially on big websites with thousands of pages.
The most common types of sitemaps are XML sitemaps, HTML sitemaps, video sitemaps and Google News sitemaps.
XML sitemaps are created mainly for crawlers, to speed up crawling and help them understand the structure of the website as a whole. Crawlers browse through them very easily because all pages are listed one below the other, without any styling that would interfere with crawling and slow them down.
If the website is small, it is fine to have one sitemap file with all pages listed one below the other. But if the website is huge, the sitemap should be split by categories and types: the first sitemap should contain links to the other sitemaps, organised in a logical way. From our experience, no single sitemap should contain more than 5,000 links.
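A minimal XML sitemap follows the sitemaps.org protocol; the URLs below are placeholders based on this site's domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap-posts.xml: one <url> entry per page -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.prime07.ie/technical-seo</loc>
  </url>
</urlset>
```

For a large split sitemap, the "first sitemap" mentioned above is a sitemap index that simply points at the others:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml: the index that links the split sitemaps -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.prime07.ie/sitemap-posts.xml</loc></sitemap>
  <sitemap><loc>https://www.prime07.ie/sitemap-products.xml</loc></sitemap>
</sitemapindex>
```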
HTML sitemaps are created mainly for users, to help them understand the content of big websites. Not many websites use them these days, but we still recommend them for big websites because they help crawlers reach older parts of the content which are quite often out of reach. Of course, these sitemaps do not need to be submitted to Search Console.
Video sitemaps should be used only if video content is a significant part of your website's content and attracts visitors.
Google News sitemaps should contain the few latest pages or posts that have a news character, and should be used only if the website has been accepted into the Google News section.
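A Google News sitemap is a regular XML sitemap with the news namespace added to each entry. A sketch with placeholder URL, name and headline:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://www.prime07.ie/some-news-post</loc>
    <news:news>
      <news:publication>
        <news:name>Prime07</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2021-01-15</news:publication_date>
      <news:title>Placeholder headline</news:title>
    </news:news>
  </url>
</urlset>
```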
There is a rule that a sitemap should contain only pages with content that you want visitors to see (pages, posts, product pages). Search result pages, 404 pages, cart pages, paginated pages, tag pages and media pages have no place in a sitemap, because you don't want anyone to land on them.