Posted by googlelink on 2023-11-17 15:04:34

What's technical SEO? 8 technical aspects everyone should know

In this post, we'll go into the basics of technical SEO. Now, discussing the basics of technical SEO might seem like a contradiction in terms. Nevertheless, some basic knowledge about the more technical side of SEO can mean the difference between a high-ranking site and a site that doesn’t rank at all. Technical SEO isn’t easy, but we’ll explain – in layman’s language – which aspects you should (ask your developer to) pay attention to when working on the technical foundation of your website.
What is technical SEO?

Technical SEO is all about improving a website's technical aspects to increase its pages' ranking in the search engines. Making a website faster, easier to crawl, and more understandable for search engines are the pillars of technical optimization. Technical SEO is part of on-page SEO, which focuses on improving elements on your website to get higher rankings. It’s the opposite of off-page SEO, which is about generating exposure for a website through other channels.
Why should you optimize your site technically?

Google and other search engines want to present their users with the best possible results for their queries. Therefore, Google's robots crawl and evaluate web pages on many factors. Some factors are based on the user's experience, like how fast a page loads. Other factors help search engine robots grasp what your pages are about. Structured data, for instance, does exactly that. So, by improving technical aspects, you help search engines crawl and understand your site. If you do this well, you might be rewarded with higher rankings. Or even earn yourself some rich results!
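To give a rough idea of what that looks like, here is a minimal sketch of structured data in JSON-LD form, using schema.org's Article type. All values below are placeholders for illustration; a real snippet would describe your actual page and usually carry more properties:

    <!-- illustrative sketch only: placeholder values -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "What's technical SEO? 8 technical aspects everyone should know",
      "datePublished": "2023-11-17",
      "author": { "@type": "Organization", "name": "Example Publisher" }
    }
    </script>

A search engine that reads this can tell the page is an article, when it was published, and who published it, which is the kind of understanding that can lead to rich results.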
It also works the other way around: if you make serious technical mistakes on your site, they can cost you. You wouldn’t be the first to block search engines entirely from crawling your site by accidentally adding a trailing slash in the wrong place in your robots.txt file.
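As a hypothetical sketch of how small that mistake can be (the directory name below is made up), compare these two robots.txt rules:

    # Intended: keep robots out of one hypothetical directory
    User-agent: *
    Disallow: /private/

    # Accidental: a single stray slash blocks the entire site
    User-agent: *
    Disallow: /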
But don't think you should focus on the technical details of a website to please search engines. A website should work well – be fast, clear, and easy to use – for your users in the first place. Fortunately, creating a strong technical foundation often coincides with a better user and search engine experience.
What are the characteristics of a technically optimized website?

A technically sound website is fast for users and easy to crawl for search engine robots. A proper technical setup helps search engines understand what a site is about. It also prevents confusion caused by, for instance, duplicate content. Moreover, it doesn't send visitors or search engines down dead ends caused by broken links. Below, we'll briefly go over some important characteristics of a technically optimized website.
1. It’s fast

Nowadays, web pages need to load fast. People are impatient and don’t want to wait for a page to open. In 2016, research showed that 53% of mobile website visitors will leave if a webpage doesn’t open within three seconds. And the trend hasn't gone away – research from 2022 suggests ecommerce conversion rates drop by roughly 0.3% for every extra second it takes for a page to load. So, if your website is slow, people get frustrated and move on to another website, and you’ll miss out on all that traffic.
Google knows slow web pages offer a less-than-optimal experience. Therefore, they prefer web pages that load faster. So, a slow web page also ends up further down the search results than its faster equivalent, resulting in even less traffic. Since 2021, page experience (how users experience a web page, including how fast it loads) has officially been a Google ranking factor. So, having pages that load quickly enough is more important now than ever.
Wondering if your website is fast enough? Read how to easily test your site speed. Most tests will also give you pointers on what to improve. You can also look at the Core Web Vitals, which Google uses to measure page experience. And we'll guide you through common site speed optimization tips here.
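If you just want a very rough first impression from the command line, timing a single request already tells you something about server response and total download time. This is only a quick sketch (the URL is a placeholder), not a replacement for a proper Core Web Vitals report:

    # time one request: time to first byte and total transfer time (placeholder URL)
    curl -o /dev/null -s -w "TTFB: %{time_starttransfer}s, total: %{time_total}s\n" https://www.example.com/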
2. It’s crawlable for search engines

Search engines use robots to crawl, or spider, your website. The robots follow links to discover content on your site. A great internal linking structure will ensure they understand the most important content on your site.
But there are more ways to guide robots. You can, for instance, block them from crawling certain content if you don’t want them to go there. You can also let them crawl a page, but tell them not to show this page in the search results or not to follow the links on that page.
Robots.txt file
You can give robots directions on your site by using the robots.txt file. It's a powerful tool, which should be handled carefully. As we mentioned earlier, a small mistake might prevent robots from crawling (important parts of) your site. Sometimes, people unintentionally block their site's CSS and JavaScript files in the robots.txt file. These files contain code that tells browsers what your site should look like and how it works. If those files are blocked, search engines can't determine whether your site works properly.
All in all, we recommend diving into robots.txt if you want to learn how it works. Or, perhaps even better, let a developer handle it for you!
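To sketch what that mistake can look like in practice (the folder name is hypothetical, and note that Allow rules and wildcards are supported by major search engines such as Google, though not by every crawler):

    User-agent: *
    # hypothetical folder that also happens to contain stylesheets and scripts
    Disallow: /assets/
    # explicitly re-allow the CSS and JS so search engines can still render the pages
    Allow: /assets/*.css
    Allow: /assets/*.js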
The meta robots tag
The robots meta tag is a piece of code you won't see on the page as a visitor. It sits in the source code, in the so-called head section of a page. Robots read this section when they find a page; it tells them what they'll find on the page and what they should do with it.

If you want search engine robots to crawl a page but keep it out of the search results for some reason, you can tell them so with the robots meta tag. It also lets you instruct them to crawl a page but not follow the links on that page. With Yoast SEO, it's easy to noindex or nofollow a post or page. Learn for which pages you'd want to do that.
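For illustration, a robots meta tag that keeps a page out of the search results while still letting robots follow its links could look like the snippet below; plugins such as Yoast SEO add this kind of tag to the head section for you when you change a post's settings:

    <head>
      <!-- keep this page out of the search results, but follow the links on it -->
      <meta name="robots" content="noindex, follow" />
    </head>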