Appearing in the top organic listings of Google is increasingly like pouring a gallon of milk into a shot glass. And the shot glass is already full of adverts, a bunch of maps, a broadsheet newspaper, a lengthy opinion piece about Taylor Swift and an Argos catalogue. And oh look, now you've got milk all over the kitchen counter, but everyone's just ignoring it because they're too busy looking into the shot glass, where all the information they need is right there.

So what can you do, apart from fetching a mop and bucket, wasting less milk and learning to unpack bizarre opening metaphors? Well, it may surprise you to learn that many website owners are still struggling with basic SEO techniques. These are the technical, on-page habits that are easily addressable and take very little time to action, but can make a heck of a difference to your overall search ranking. Excelling at these basic SEO skills can therefore put you ahead of the competition.

But which on-page techniques are still not being implemented correctly, and what can you learn from this? SEMrush has collected anonymous data on 100,000 websites and 450 million pages to determine the top SEO issues, and it has published its results in a huge stat-filled infographic, which we'll also republish at the bottom of this article.

For now, here are the most common on-page, technical SEO issues that website owners are experiencing, along with links to our own guidance for addressing them.

Top 11 most common SEO issues

1. Duplicate content

According to SEMrush, 50% of analyzed web pages face duplicate content issues. Although there isn't a specific penalty against duplicate content, the problem arises when similar pages on your site begin cannibalising each other for the same search positions, and Google ends up filtering out one at the expense of another, which may not necessarily be the page you want to see ranking.

This is where the rel=canonical attribute can help, by letting Google know exactly which duplicate page to rank. For more information, check out: how and when to use canonical.

2. Missing alt tags and broken images

The research reveals that 45% of sites have images with missing alt tags and another 10% have broken images. Alt tags are a way to accurately describe your images to search engines, making sure they're indexed properly in image search and bringing some extra traffic to your site. Broken images can cause the same issues as broken links by providing a poor user experience. One way to avoid this is to make sure you're hosting images within your own media library, not on a third-party image host.

For more information, check out our guide on how to optimise your images for SEO.

3. Title tag issues

Title tags tell search engines and visitors what any given page on your site is about in the most concise and accurate way possible. SEMrush found that 35% of sites have duplicate title tags, 15% have too much text in the tag, 8% are missing them and 4% don't provide enough text. Here's how you can fix all these issues: how to write title tags for SEO.

5. Broken internal and external links

The research showed that 35% of sites had broken internal links that returned bad HTTP status codes (70% of which return a 4xx 'page not found' code). A further 25% of sites had broken external links, which can seriously impair your website's authority.
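To make the first half of this list concrete, here's a minimal audit sketch in Python covering issues 1, 2, 3 and 5. It assumes you've installed the requests and beautifulsoup4 packages, the URL is a placeholder for one of your own pages, and the link check is deliberately capped at a small sample; treat it as a starting point, not a production crawler.

```python
# A minimal on-page audit sketch: canonical tag, images missing alt
# text, title tag length, and broken links.
# Assumes: pip install requests beautifulsoup4
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

URL = "https://example.com/"  # placeholder: one of your own pages

html = requests.get(URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# 1. Duplicate content: does the page declare a canonical URL,
# i.e. <link rel="canonical" href="...">?
canonical = soup.find("link", rel="canonical")
if canonical:
    print("Canonical:", canonical.get("href"))
else:
    print("No rel=canonical found; Google will pick its own canonical.")

# 2. Images with a missing or empty alt attribute
missing_alt = [img.get("src") for img in soup.find_all("img") if not img.get("alt")]
print(f"{len(missing_alt)} image(s) missing alt text:", missing_alt)

# 3. Title tag present and a sensible length (roughly 50-60 characters
# displays fully in results; Google's actual cutoff is pixel-based)
title = soup.title.string.strip() if soup.title and soup.title.string else ""
if not title:
    print("Missing title tag.")
elif len(title) > 60:
    print(f"Title may be truncated in results ({len(title)} chars).")

# 5. Broken links: request each link and flag 4xx/5xx responses.
# Capped at 20 links to keep the sketch polite; a real audit would
# crawl everything, respect robots.txt and fall back to GET where
# servers reject HEAD.
for a in soup.find_all("a", href=True)[:20]:
    link = urljoin(URL, a["href"])
    if not link.startswith("http"):
        continue  # skip mailto:, javascript:, fragments, etc.
    try:
        status = requests.head(link, allow_redirects=True, timeout=5).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print("Broken link:", link, status)
```

Run it against a handful of key pages and you'll quickly see which of these basics need attention before reaching for a full crawling tool.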
You can learn how to check for crawl errors in our guide to Google Search Console, and you can also read our best practice guide to internal linking for help.

6. Low text-to-HTML ratio

The research showed a 'low text-to-HTML ratio' warning on 28% of the sites analyzed. According to SEMrush, this means these sites contain proportionally more back-end HTML code than text that people can actually read. SEMrush recommends 20% as an acceptable lower limit. Here's a thorough checklist of things to help improve your ratio, according to Woorank:
- Remove huge white spaces
- Avoid lots of tabs
- Remove comments in the code
- Avoid tables
- Use CSS for styling and formatting
- Resize your images
- Remove any unnecessary images
- Only use JavaScript if required
- Keep the size of your page under 300KB
- Remove any text that is hidden from readers
- Always include some plain, easily readable text with quality information for users
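If you want to know where a page currently stands before working through that checklist, here's a rough sketch. The calculation (visible text length divided by raw HTML length) is a simplification of what SEO tools measure, and the URL is a placeholder, so treat the output as indicative rather than definitive.

```python
# Rough text-to-HTML ratio check: visible text length vs raw HTML length.
# Assumes: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/"  # placeholder: one of your own pages

html = requests.get(URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Drop scripts and styles so only reader-visible text is counted
for tag in soup(["script", "style", "noscript"]):
    tag.decompose()

text = soup.get_text(separator=" ", strip=True)
ratio = len(text) / len(html) * 100 if html else 0

print(f"HTML: {len(html):,} chars, visible text: {len(text):,} chars")
print(f"Text-to-HTML ratio: {ratio:.1f}% (SEMrush suggests at least 20%)")
```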
7. H1 tag issues

It's important to know the difference between H1 tags and title tags. The title tag appears in search results, whereas the H1 tag (normally your headline) is what visitors see on the page. Of the sites analyzed, 20% had multiple H1 tags, 20% were missing H1 tags, and 15% had duplicate information in their title tag and H1. You should ordinarily use only one H1 tag per web page and break up articles with plenty of H2 tags.

8. Low word count

Google is increasingly ranking in-depth articles above what it considers thin content. Of the websites crawled, 18% had a low word count on some pages. Here's a guide to evergreen content that can help you create in-depth webpages that sit at the top of Google and stay there.

9. Too many on-page links

The research reveals that 15% of sites have too many on-page links on some pages. There's no hard maximum number of links a page can carry, but cramming a page with unnatural links is definitely a problem. After all, a cluttered page full of links can make for a bad user experience, especially on mobile. As SEMrush states, good SEO means having a natural link profile that includes relevant, high quality links. Carry out a link audit for every page and get rid of the links that don't provide any value to your readers or your SEO strategy.

10. Incorrect language declaration

SEMrush found that 12% of websites have failed to include a language declaration stating the default language of the text on the page. A language declaration is useful for translation and page display, and it ensures that people using text-to-speech converters hear your content read in the correct dialect of their native language. You can easily amend this in the International Targeting section of Search Console.

11. Temporary redirects

The research shows that 10% of the websites analyzed contain temporary redirects. According to SEMrush, a 302 redirect can cause search engines to continue indexing an outdated page while ignoring the page you are redirecting to. Google's John Mueller has stated that the algorithm does not penalize 302 redirects, and that the index will eventually treat a 302 as a 301 if it stays in place for long enough. But it is worth keeping in mind that a temporary 302 redirect won't pass any link authority on to your preferred page, whereas a permanent 301 redirect will, so 302s are best avoided.

And finally, here is the promised infographic from SEMrush…

This article was originally written for Search Engine Watch. Read the original article here.
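Postscript: to round things off, here's a companion sketch for issues 7 to 10. As before it assumes the requests and beautifulsoup4 packages and a placeholder URL; it counts H1 tags, approximates the visible word count, tallies on-page links and reads the lang attribute.

```python
# Structure checks for issues 7-10: H1 count, approximate word count,
# on-page link count, and the lang declaration on <html>.
# Assumes: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/"  # placeholder: one of your own pages

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")

# 7. Pages should ordinarily have exactly one H1
h1s = soup.find_all("h1")
if len(h1s) != 1:
    print(f"Found {len(h1s)} H1 tag(s); expected exactly one.")

# 8. Approximate visible word count, a rough proxy for thin content
# (strip scripts and styles first so they don't inflate the total)
for tag in soup(["script", "style", "noscript"]):
    tag.decompose()
words = len(soup.get_text(separator=" ", strip=True).split())
print(f"Approximate word count: {words}")

# 9. Total on-page links; there's no hard maximum, but very high
# counts are worth a manual audit
print(f"Links on page: {len(soup.find_all('a', href=True))}")

# 10. Language declaration, e.g. <html lang="en">
lang = soup.html.get("lang") if soup.html else None
print("Declared language:", lang or "none; add a lang attribute to <html>")
```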
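And for issue 11, here's a quick way to inspect a redirect chain with requests and flag temporary hops; the URL is a placeholder for a page you know redirects.

```python
# Redirect check for issue 11: follow a URL's redirect chain and flag
# temporary hops (302/303/307) that should probably be permanent (301/308).
# Assumes: pip install requests
import requests

URL = "http://example.com/old-page"  # placeholder: a redirecting URL

response = requests.get(URL, allow_redirects=True, timeout=10)
for hop in response.history:
    kind = "permanent" if hop.status_code in (301, 308) else "temporary"
    print(f"{hop.status_code} ({kind}): {hop.url} -> {hop.headers.get('Location')}")
print("Final:", response.status_code, response.url)
```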
ABOUT THE AUTHOR

Christopher Ratcliff is the editor of Search Engine Watch. He's also the editor of Methods Unsound. He was the Deputy Editor of Econsultancy. You may hound him on Twitter @Christophe_Rock, especially if it's beer/donut/record-voucher related.