How to improve website technique

Pondering website technique

Building and maintaining a website requires a lot of work and technical knowledge. Even if you decide to outsource this process, it’s advisable to pick up some technical knowledge yourself: it will make communication between you and your website builder much easier. Below, we have set out the most important technical aspects that you should have on point for your website.

Website technique and SEO: what technical aspects are at play when launching a new website?

The technical side might well be the most important of the three pillars (next to content and conversion optimization) that make an SEO strategy successful. A website’s technique ensures that search engines can understand and index it. If you neglect the technical side, chances are you won’t rank high in the search results.

Web browser and HTTP

It all starts with your web browser: a web browser is a program that enables you to freely surf the internet. The most popular browsers are Google Chrome, Safari, Mozilla Firefox and Internet Explorer. Make sure your website is compatible with all – or at least the most popular – web browsers, so you won’t miss out on any visitors.

The communication language of the web browser and the web server is called Hypertext Transfer Protocol (HTTP). In short, it’s the technique used to send and receive information via the internet. Nowadays, Google attaches value to the use of HTTPS: HyperText Transfer Protocol Secure. HTTPS is an extension of the HTTP protocol whereby the data that is sent is encrypted. You can recognize websites using HTTPS by the URL that starts with https:// and by the small padlock pictured next to the URL, indicating that information exchanged with the website is encrypted.

HTTPS is of vital importance for websites where visitors have to fill in personal data, such as web shops, webmail and of course online banking. Should this information fall into the wrong hands, the HTTPS protocol and an SSL certificate ensure that it cannot be deciphered. In this way, the safety of online banking and online shopping is guaranteed.

To transfer a website from HTTP to HTTPS, you need to install an SSL certificate (Secure Sockets Layer). An SSL certificate is issued for a specific domain name and validated by a trusted certificate authority, so it cannot simply be copied or falsified. SSL is used by millions of websites to protect the internet connection, so that online purchases, financial transactions and the exchange of personal data can take place safely.
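
If your website runs on an Apache server, a minimal sketch of the switch to HTTPS could look like the lines below, placed in the .htaccess file in the root of your website. This assumes the mod_rewrite module is enabled; other servers, such as nginx, use their own syntax.

  # Forward all HTTP traffic to the HTTPS version of the same URL (301 = permanent)
  RewriteEngine On
  RewriteCond %{HTTPS} off
  RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]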

HTTP(S) response: error codes and redirects

Web servers and browsers communicate through HTTP status codes. You have probably seen these codes after typing in a URL or clicking on a Google search result. These status codes are part of what is called the HTTP response. A status code consists of three digits, of which the first tells you what kind of response it is.

The most common status codes are:

  • 200 OK – The request has succeeded
  • 304 Not Modified – The resource for the requested URL hasn’t changed since it was last accessed or cached
  • 400 Bad Request – The request was somehow incorrect or corrupted and the server couldn’t understand it
  • 403 Forbidden – The requested document is forbidden and/or you don’t have permission to access it
  • 404 Not Found – The requested document doesn’t exist
  • 410 Gone – The requested document did exist, but isn’t available anymore. Similar to status code 404
  • 500 Internal Server Error – The web server couldn’t perform the requested action
  • 503 Service Temporarily Unavailable – The web server is unavailable due to temporary overload or maintenance
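
If you want to check which status code a page returns and you have the curl command-line tool available, a quick sketch looks like this (www.example.com is only a placeholder domain):

  # Request only the response headers; the status code is on the first line
  curl -I https://www.example.com/

The first line of the output shows the status code, for example ‘HTTP/1.1 200 OK’.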


Link juice, PageRank, follow and no follow

We know that link building is essential for a thorough SEO strategy: a good ranking in the search results isn’t possible without any backlinks to your website (preferably backlinks from websites with a higher authority than yours). When placing and receiving backlinks, the terms link juice and PageRank (PR) often come up. PageRank is a sort of popularity score that Google awards to websites. Websites with a high PageRank (or authority) can pass this on to other websites by means of backlinks. This causes the authority of the other website to gradually increase, because of the link juice that is passed on from page to page.


It’s often inevitable that websites with a high PageRank have to deal with spam at some point: websites without any authority that place spammy links to a high PageRank website. Sometimes these links are placed by competitors to lower its authority, sometimes by individuals who hope to profit from a website’s authority by linking to their own page in the comments. At some point, this resulted in the creation of the nofollow attribute. By adding rel="nofollow" to a link, you tell search engine bots not to follow it, so these links are not awarded any PageRank. Using nofollow where it’s appropriate contributes to a healthy, organic link profile, just as leaving the links you do want crawled as normal (‘dofollow’) links does. Google approves!
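
In HTML, the difference is just one attribute on the link. A minimal sketch (the URL is only an example):

  <!-- Normal link: bots follow it and PageRank can be passed on -->
  <a href="https://www.example.com/">A followed link</a>

  <!-- Nofollow link: bots are told not to follow it, so no PageRank is passed -->
  <a href="https://www.example.com/" rel="nofollow">A nofollow link</a>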

Structured Data and Rich Snippets

SEO isn’t only about keyword research, backlinks and authority: your website’s technique is also an important component. For example, think of structured data. Structured data means making the content of your website understandable for search engines by marking up your content with a piece of code. You can see structured data as a more extensive way of adjusting your meta descriptions. For example, you can indicate that certain content represents an address, phone number or review, so it can be displayed clearly in the search results.

Rich Snippets are search results that offer more (visual) information, based on the structured data you have entered. Your page will stand out more, because you have marked up certain aspects of your website. You can, for example, mark the price and reviews of products or indicate whether an item is in stock. Search engine users can then see at a glance all the information they are probably looking for. It’s far more likely that someone will click your link instead of a link without any structured data behind it. In this way, you enrich the search results, leading to a higher click-through rate (CTR).
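
As a sketch of what structured data can look like, here is a small JSON-LD block (one of the formats schema.org supports) for a fictional product; the name, price and review figures are made up purely for illustration:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Oak laminate flooring",
    "offers": {
      "@type": "Offer",
      "price": "29.95",
      "priceCurrency": "EUR",
      "availability": "https://schema.org/InStock"
    },
    "aggregateRating": {
      "@type": "AggregateRating",
      "ratingValue": "4.6",
      "reviewCount": "87"
    }
  }
  </script>

Placed in the page’s source code, this tells search engines that the page describes a product with a price, stock status and reviews, which they can then use for a Rich Snippet.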

URL and search engine optimization

Lots of webmasters know how to adjust the meta descriptions of pages, but tend to forget to optimize the URLs of their website. You should optimize your website’s URLs as well as possible for search engines and visitors, to give them a proper idea of the content of your website.

A readable URL is easier for visitors to recognize and it’s also important to search engines. A search engine friendly URL is one of the many factors Google takes into account to determine whether a page is relevant to a specific keyword.

Nowadays, most content management systems (CMS) allow you to have the URLs of pages and subpages rewritten to a more readable version for visitors and search engines. Usually, you can find this option in your CMS under the heading SEF (Search Engine Friendly). Here you can rewrite the URL of your page(s) to a more readable one.

Subsequently, you should think about the structure of your URLs. Google prefers a semantic construction, in which you build up a URL step by step: you always start with your domain name, after which you go deeper into the page structure of your website. Maintain a logical order, so your visitors can see where they’re located. Avoid URLs with lots of parameters, like ‘index.php?Page_ID=3968’, and rather go for a clear URL like ‘wooden-floors-laminate’. Keep in mind that short URLs are usually better than long ones. In general, strive for a URL of 50-60 characters.

Duplicate content and canonical tag

Do you have multiple pages (with different URLs) that have more or less the same content? Then you should use a canonical tag. You do this by adding a link tag with rel="canonical" to the source code of the page. If you don’t, duplicate content can result in a lower ranking. By adding a canonical tag, you prevent pages such as a printable version from being penalized as ‘duplicate content’. Thanks to the canonical tag, Google can see which page is the main page.
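
A minimal sketch of such a tag, placed in the <head> section of the duplicate page (the URL is just an example of the preferred main page):

  <link rel="canonical" href="https://www.example.com/wooden-floors-laminate" />
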
Do you want to adjust the structure of your website or change it completely? Make sure you do this with caution. When a page has been indexed by Google before and ranks high, it’s not wise to suddenly change its URL. If Google can’t find the original URL anymore, the ranking of the page will be lost!

Are you sure you want to change a page’s URL? Then don’t forget to place a 301 redirect on the old page. A 301 redirect makes sure that visitors who use the former URL won’t receive an error, like a 404, but are automatically forwarded to the new URL. The old page will gradually disappear from the rankings and be replaced by the new one.
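
On an Apache server, a simple sketch of such a 301 redirect in the .htaccess file could look like this (both paths are only illustrative):

  # Permanently forward the old URL to the new one
  Redirect 301 /old-page https://www.example.com/new-page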

Robots.txt: operation and application

Adding a robots.txt file is another way to prevent Google from indexing pages with duplicate content. There will undoubtedly be pages that you don’t want to be displayed in the search results, like contact form pages or pages with duplicate content. After all, it’s rather pointless to display two practically identical web pages in the search results. Plus, you shouldn’t want this to happen anyway, since Google doesn’t like duplicate content.

Although issues with duplicate content are often easier to solve with a canonical URL, excluding web pages by means of a robots.txt file is preferable when a large number of URLs would otherwise have to be crawled. This happens a lot with web shops whose many filter and sorting options result in a large number of URLs that contain more or less the same content.

You always place the robots.txt file in the root folder of your website. In it, you can give several instructions. First, you point out the location of your sitemap by adding a line such as ‘Sitemap: https://www.website.com/sitemap.xml’ to the robots.txt file.

You can also determine for which search engines the crawling instructions should apply. Do you want the instructions to apply to all search engines? Then you simply put ‘User-agent: *’ at the beginning of your robots.txt. This indicates that all bots should follow the crawling instructions. With ‘Disallow’ you indicate which URLs you want to block, for example: ‘Disallow: /contact-form’. This allows you to keep certain pages out of the search results. The directives are briefly summarized below:

  • User-agent: * – Determines that all robots should follow the crawling instructions
  • Disallow: – Determines that specific pages or folders are not allowed to be crawled by bots
  • Allow: – Gives you the opportunity to have specific folders or files crawled anyway, even within a disallowed folder
  • Sitemap: – Indicates the location of your website’s sitemap
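
Put together, a small robots.txt sketch could look like this (the folder names are only examples):

  User-agent: *
  Disallow: /contact-form
  Disallow: /filter/
  Allow: /filter/popular/
  Sitemap: https://www.example.com/sitemap.xml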

The importance of a sitemap

Adding a sitemap to your website is a great help for Google when indexing your website. A sitemap is nothing more than a page on your website that contains the complete link structure of your website. Thanks to a sitemap, search engines can see at a glance how many pages your website has and which subpages belong to which main page.

You can easily make an XML sitemap yourself. An XML sitemap is an XML file that is specially designed for search engines and is only used by search engines. A sitemap is best compared with the table of contents of a book, where the URLs of the website are the chapters and the subpages resemble the paragraphs.
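
A minimal XML sitemap sketch with a single, fictional URL entry looks like this:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <!-- lastmod is the date the page was last changed; the value here is made up -->
      <loc>https://www.example.com/wooden-floors-laminate</loc>
      <lastmod>2021-01-15</lastmod>
    </url>
  </urlset>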

Sitemaps enable search engines to index the content of your website more easily and quickly. This positively influences your ranking, and it also results in search engines checking your website more often to index possible changes and additions. The faster new content is indexed, the better your SEO results. So, keep your sitemap up to date and clear!

User Experience and Responsive Web Design

A good website doesn’t only look awesome, it’s also as user friendly as can be. The intuitive click behavior of visitors and their perspective are especially important. When a website has a proper user experience (UX) design, every visitor can use the website optimally without running into uncertainties or getting frustrated whilst navigating. To realize an optimal UX design, developers and designers look at the design from the perspective of users. A good UX design has a clear menu structure, few annoying pop-ups and clearly clickable buttons.

More and more people use a smartphone or tablet to check out a website instead of using a desktop. That means that websites have to be developed for different screen sizes, where the content is adjusted to the available space per device. Responsive web design (RWD) is a way of designing with the goal of offering an optimum user experience to users of all types of devices and screen sizes.
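
In practice, responsive web design relies on a viewport meta tag and CSS media queries. A minimal sketch (the .sidebar and .content class names are made-up examples):

  <!-- Tell mobile browsers to use the device's real width -->
  <meta name="viewport" content="width=device-width, initial-scale=1">

  <style>
    /* On screens narrower than 600px, hide the sidebar and let the content use the full width */
    @media (max-width: 600px) {
      .sidebar { display: none; }
      .content { width: 100%; }
    }
  </style>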

Loading time and Accelerated Mobile Pages

The loading time of your website influences the ranking of your pages too: a page that loads slowly runs the risk of driving away visitors. A short loading time positively influences both the ranking and the conversion rate of your website. Generally, the loading time of a website shouldn’t be longer than 2 seconds.

The loading time of mobile websites is also taken into account. Google has come up with an initiative for this: Accelerated Mobile Pages (AMP). AMP provides a better mobile user experience through web pages that load extremely fast on your mobile phone. This fast loading speed is achieved by slimming down the pages: the vital content stays, but the header, big logos, footers, sidebars and sometimes even the menu bar are removed. In this way, smartphones don’t have to load any heavy files, resulting in a superfast loading time. In the mobile search results, a page that has an AMP version can be recognized by the word ‘AMP’ accompanied by a circle with a bolt of lightning.

Our thoughts on website technique

To conclude, lots of technical aspects are at play when launching a new website: web browsers, HTTP(S) response codes, robots.txt, link juice, PageRank, follow and nofollow, duplicate content, canonical tags, user experience, loading time and more. These are all aspects that you should take into consideration. Not always an easy task, but most definitely a fruitful one and well worth the effort. Understanding these concepts will help you fine-tune things with your website developer.

Not sure yet whether you want to spend much time on the technical stuff? Then consider building your website with InCMS, the innovative website building application of SwissMadeMarketing.

You can also give the Leadpages Landing Page Builder a go or try the intuitive MyThemeShop. Then you don’t have to worry (too much) about all the technical stuff!

Go to the overview of all the website building tools we have used ourselves and check out the information we've given.