Willemien is Content Team Lead at Yoast. She enjoys creating and organizing content in a way that helps people understand SEO.
An SEO basics post about technical SEO may sound like a contradiction in terms. However, some basic understanding of the more technical side of SEO can mean the difference between a high-ranking website and a site that doesn't rank at all. Technical SEO isn't easy, but here we'll explain, in plain language, which aspects you should (ask your developer to) pay attention to when working on the technical foundation of your website.
What is technical SEO?
Technical SEO refers to improving the technical aspects of a website in order to increase the ranking of its pages in the search engines. Making a website faster, easier to crawl, and understandable for search engines are the pillars of technical optimization. Technical SEO is part of on-page SEO, which focuses on improving elements on your website to get higher rankings. It's the opposite of off-page SEO, which is about generating exposure for a website through other channels.
Why should you optimize your website technically?
Google and other search engines want to present their users with the best possible results for their query. Therefore, Google's robots crawl and evaluate web pages on a multitude of factors. Some factors are based on the user's experience, like how fast a page loads. Other factors help search engine robots grasp what your pages are about. This is, amongst other things, what structured data does. So, by improving technical aspects, you help search engines crawl and understand your site. If you do this well, you might be rewarded with higher rankings or even rich results.
It also works the other way around: if you make serious technical mistakes on your site, they can cost you. You wouldn't be the first to block search engines entirely from crawling your site by accidentally adding a trailing slash in the wrong place in your robots.txt file.
However, it's a misconception that you should focus on the technical details of a website just to please search engines. A website should work well, being fast, clear, and easy to use, for your users in the first place. Fortunately, creating a strong technical foundation often coincides with a better experience for both users and search engines.
What are the qualities of a technically optimized website?
A technically sound website is fast for users and easy to crawl for search engine robots. A proper technical setup helps search engines understand what a website is about, and it prevents confusion caused by, for instance, duplicate content. Moreover, it doesn't send visitors, nor search engines, into dead-end streets through broken links. Here, we'll briefly go into some important characteristics of a technically optimized website.
1. It's fast
Nowadays, web pages need to load fast. People are impatient and don't want to wait for a page to open. Research from 2016 already showed that 53% of mobile site visitors will leave if a web page doesn't open within three seconds. So if your website is slow, people get frustrated and move on to another website, and you'll miss out on all that traffic.
Google knows slow web pages offer a less than optimal experience. Therefore, they prefer web pages that load faster. So, a slow web page also ends up further down the search results than its faster equivalent, resulting in even less traffic.
Wondering if your website is fast enough? Read how to easily test your site speed. Most tests will also give you pointers on what to improve. We'll guide you through common site speed optimization tips here.
2. It's crawlable for search engines
Search engines use robots to crawl, or spider, your website. The robots follow links to discover content on your site. A good internal linking structure will make sure that they understand what the most important content on your site is.
But there are more ways to guide robots. You can, for instance, block them from crawling certain content if you don't want them to go there. You can also let them crawl a page, but tell them not to show that page in the search results or not to follow the links on that page.
You can give robots directions on your site by using the robots.txt file. It's a powerful tool, which should be handled carefully. As we mentioned in the beginning, a small mistake might prevent robots from crawling (important parts of) your site. Sometimes, people unintentionally block their site's CSS and JS files in the robots.txt file. These files contain code that tells browsers what your site should look like and how it works. If those files are blocked, search engines can't find out if your site works properly.
All in all, we recommend really diving into robots.txt if you want to learn how it works. Or, perhaps even better, let a developer handle it for you!
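To give you an idea of what such a file looks like, here's a minimal sketch of a robots.txt file. The domain and paths are made up for illustration; what you should block depends entirely on your own site:

```
# Applies to all crawlers
User-agent: *

# Keep robots out of the WordPress admin area...
Disallow: /wp-admin/
# ...but allow the AJAX endpoint many plugins rely on
Allow: /wp-admin/admin-ajax.php

# Point crawlers to the sitemap (hypothetical URL)
Sitemap: https://www.example.com/sitemap_index.xml
```

Note that a Disallow rule covering your theme's CSS and JS folders would cause exactly the problem described above: search engines could no longer render your pages properly.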
The meta robots tag
The robots meta tag is a piece of code that you won't see on the page as a visitor. It's in the source code, in the so-called head section of a page. Robots read this section when finding a page. In it, they'll discover information about what they'll find on the page or what they need to do with it.
If you want search engine robots to crawl a page, but to keep it out of the search results for some reason, you can tell them with the robots meta tag. With the robots meta tag, you can also instruct them to crawl a page, but not to follow the links on that page. With Yoast SEO it's easy to noindex or nofollow a post or page. Learn for which pages you'd want to do that.
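In practice, the two instructions just mentioned look like this in the head section of a page (a minimal sketch; you'd normally use only one of these tags per page):

```html
<head>
  <!-- Keep this page out of the search results, but do follow its links -->
  <meta name="robots" content="noindex, follow">

  <!-- Alternative: allow indexing, but don't follow any links on the page -->
  <!-- <meta name="robots" content="index, nofollow"> -->
</head>
```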
Read more: https://yoast.com/what-is-crawlability/
3. It doesn't have (many) dead links
We've discussed that slow websites are frustrating. What might be even more frustrating for visitors than a slow page, is landing on a page that doesn't exist at all. If a link leads to a non-existing page on your site, people will encounter a 404 error page. There goes your carefully crafted user experience!
What's more, search engines don't like to find these error pages either. And they tend to find even more dead links than visitors encounter, because they follow every link they bump into, even if it's hidden.
Unfortunately, most sites have (at least) some dead links, because a website is a continuous work in progress: people make things and break things. Fortunately, there are tools that can help you retrieve dead links on your site. Read about those tools and how to solve 404 errors.
To prevent unnecessary dead links, you should always redirect the URL of a page when you delete or move it. Ideally, you'd redirect it to a page that replaces the old page. With Yoast SEO Premium, you can easily make redirects yourself. No need for a developer!
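If you (or your developer) do set up a redirect by hand, it usually comes down to a single server rule. Here's a sketch of a permanent (301) redirect in an Apache .htaccess file; the paths and domain are hypothetical, and the exact syntax depends on your server:

```apache
# Permanently redirect a deleted page to its replacement
Redirect 301 /old-page/ https://www.example.com/new-page/
```

A 301 tells both browsers and search engines that the move is permanent, so the new URL inherits the old one's place in the search results.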
Read more: https://yoast.com/what-is-a-redirect/
4. It doesn't confuse search engines with duplicate content
If you have the same content on multiple pages of your site, or even on other sites, search engines might get confused. Because, if these pages show the same content, which one should they rank highest? As a result, they might rank all pages with the same content lower.
Unfortunately, you might have a duplicate content issue without even knowing it. Because of technical reasons, different URLs can show the same content. For a visitor, this doesn't make any difference, but for a search engine it does; it'll see the same content on a different URL.
Luckily, there's a technical solution to this issue. With the so-called canonical link element, you can indicate what the original page, or the page you'd like to rank in the search engines, is. In Yoast SEO you can easily set a canonical URL for a page. And, to make it easy for you, Yoast SEO adds self-referencing canonical links to all your pages. This helps prevent duplicate content issues that you might not even be aware of.
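The canonical link element itself is just one line in the head section of a page. A minimal sketch, with a hypothetical URL:

```html
<head>
  <!-- Tell search engines which URL is the preferred version of this content -->
  <link rel="canonical" href="https://www.example.com/technical-seo/" />
</head>
```

Every variant of the page (for instance, the same content reached with tracking parameters in the URL) should point to this one preferred URL.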
5. It’s secure
A technically optimized website is a secure website. Making your website safe for users to guarantee their privacy is a basic requirement nowadays. There are many things you can do to make your (WordPress) website secure, and one of the most crucial things is implementing HTTPS.
HTTPS makes sure that no one can intercept the data that's sent between the browser and the site. So, for instance, if people log in to your site, their credentials are safe. You'll need a so-called SSL certificate to implement HTTPS on your site. Google acknowledges the importance of security and therefore made HTTPS a ranking signal: secure websites rank higher than their unsafe equivalents.
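Once the SSL certificate is in place, you also want all plain-HTTP visitors sent to the secure version. As a sketch, assuming an nginx server (the domain is hypothetical, and the certificate itself is configured separately in the HTTPS server block):

```nginx
# Redirect all plain-HTTP traffic to the HTTPS version of the site
server {
    listen 80;
    server_name www.example.com;
    return 301 https://$host$request_uri;
}
```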
You can easily check if your website runs on HTTPS in most browsers. On the left-hand side of the address bar of your browser, you'll see a lock if it's secure. If you see the words "not secure", you (or your developer) have some work to do!
Read more: SEO basics: What is HTTPS?
6. Plus: it has structured data
Structured data helps search engines understand your website, content or even your business better. With structured data you can tell search engines what kind of product you sell or which recipes you have on your site. Plus, it gives you the opportunity to provide all kinds of details about those products or recipes.
Because there's a fixed format (described on Schema.org) in which you should provide this information, search engines can easily find and understand it. It helps them place your content in a bigger picture. Here, you can read a story about how it works and how Yoast SEO helps you with that.
Implementing structured data can bring you more than just a better understanding by search engines. It also makes your content eligible for rich results: those shiny results with stars or details that stand out in the search results.
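Structured data is usually added as a small JSON-LD script in the page's source code, following the Schema.org vocabulary. A minimal sketch for a product page; the product name and values are made up for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Coffee Mug",
  "offers": {
    "@type": "Offer",
    "price": "12.99",
    "priceCurrency": "USD"
  }
}
</script>
```

Because the format is fixed, search engines can read the price and product type without having to guess at them from your page's text.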
7. Plus: it has an XML sitemap
Simply put, an XML sitemap is a list of all pages of your site. It serves as a roadmap for search engines on your site. With it, you'll make sure search engines won't miss any important content on your site. The XML sitemap is often categorized in posts, pages, tags or other custom post types and includes the number of images and the last modified date for every page.
Ideally, a website doesn't need an XML sitemap. If it has an internal linking structure which connects all content nicely, robots won't need it. However, not all sites have a great structure, and having an XML sitemap won't do any harm. So we'd always advise having an XML sitemap on your site.
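For reference, a stripped-down XML sitemap looks like this. The URLs and dates are hypothetical; plugins like Yoast SEO generate and update this file for you:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/technical-seo/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/what-is-seo/</loc>
    <lastmod>2024-02-03</lastmod>
  </url>
</urlset>
```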
8. Plus: international websites use hreflang
If your site targets more than one country, or countries where the same language is spoken, search engines need a little help to understand which countries or languages you're trying to reach. If you help them, they can show people the right website for their area in the search results.
Hreflang tags help you do just that. You can define for a page which country and language it is meant for. This also solves a possible duplicate content problem: even if your US and UK sites show the same content, Google will know it's written for a different region.
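As a sketch, the hreflang annotations for a page with a US and a UK version could look like this in the head section (the URLs are hypothetical; note that each version should list all alternates, including itself):

```html
<head>
  <link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
  <link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
  <!-- Fallback for visitors who match neither region -->
  <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
</head>
```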
Optimizing international websites is quite a specialism. If you'd like to learn how to make your international sites rank, we'd advise taking a look at our Multilingual SEO training.
Want to learn more about this?
So this is technical SEO in a nutshell. It's quite a lot already, while we've only scratched the surface here. There's so much more to tell about the technical side of SEO! If you want to take a deep dive into technical SEO, we'd advise our Technical SEO training or Structured data training. With these courses, you'll learn how to create a solid technical foundation for your own website.
PS Are you the ambitious type? Get both training courses together and save $59!
Read more: https://yoast.com/wordpress-seo/