Good SEO Starts With Technology

There is often a lot of “mystery” around the SEO work that happens behind the scenes. Most folks know that your UX (user experience) and your content need to be on point to support strong SEO. However, there is more to it than that, such as balancing HTML against content, making sure every internal link uses HTTPS rather than HTTP, and much more.

HTML To Content Ratio Matters

As mentioned above, UX certainly matters for your SEO. One part of the design and content creation process that can be overlooked is the balance of HTML to content. If there is more code than content on a page, your SEO can get dinged.

Why does that matter? Point blank, search engines, and Google in particular, prioritize web pages with more content that is relevant to your business. Likewise, less HTML and code generally means a faster page load, which helps decrease bounce rates. You can see how one aspect of your SEO can domino into another; it’s best to get those dominoes working in your favor. In this case, aim to keep your text-to-HTML ratio above 10%.
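
If you’re curious where a page stands, you can get a rough estimate by comparing the length of the visible text to the length of the full HTML source. Here is a minimal sketch in Python (using the requests and BeautifulSoup libraries; the URL is a placeholder), not a substitute for a proper audit tool:

```python
# Rough text-to-HTML ratio estimate (assumes: pip install requests beautifulsoup4)
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"  # placeholder URL
html = requests.get(url, timeout=10).text

soup = BeautifulSoup(html, "html.parser")
for tag in soup(["script", "style"]):  # strip code that never renders as visible text
    tag.decompose()

visible_text = soup.get_text(separator=" ", strip=True)
ratio = len(visible_text) / len(html) * 100

print(f"Text-to-HTML ratio: {ratio:.1f}%")  # aim for comfortably above 10%
```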

Fix Broken Links and Redirects

If there’s one thing search engines really don’t like, it’s broken links. This includes broken hyperlinks within your website, broken backlinks on other websites, and broken redirects.

Let’s start with backlinks. If you have a backlinking campaign, be sure to check up on those links fairly often. If one of your backlinks points to an old page that has since been taken down, that backlink should be updated or removed. You never want to send a potential customer to a broken or dead page. In some cases, backlinks hurt your SEO because the website linking to you is considered spammy and/or is full of broken links to other websites. This happens often with “listing” sites that are not regularly maintained. In other words, the health of the website your backlink lives on also matters.
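
Dedicated tools will crawl for broken links at scale, but a quick spot check is easy to script. The sketch below (Python with the requests library; the URLs are placeholders) simply flags anything that doesn’t return a healthy status code:

```python
# Quick broken-link spot check (assumes: pip install requests)
import requests

# Placeholder list: swap in pages you link to, redirect targets, or backlinked URLs
urls_to_check = [
    "https://www.example.com/",
    "https://www.example.com/old-landing-page/",
]

for url in urls_to_check:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as error:
        print(f"ERROR   {url} ({error})")
        continue
    label = "OK" if status < 400 else "BROKEN"
    print(f"{label:7} {status} {url}")
```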

It is common knowledge that having a secure URL, one that starts with HTTPS, is a best practice for many reasons. It’s best for the security of your website and your customers’ data, and as a result, Google prioritizes secure websites. So, if your website still serves any URLs over HTTP (non-secure), they should redirect to their HTTPS counterparts. Leaving HTTP pages of your website accessible to the public can be dangerous for security reasons. Plus, let’s face it, no one likes seeing the warning that says “This site is not secure”.

Having HTTP pages without redirects to HTTPS is also bad for SEO because your website then has duplicate public pages. That means your own content is competing against itself, which can split and dilute your individual page rankings.
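
How you set up the redirect depends on your server or host, but the goal is always the same: every HTTP request gets a permanent (301) redirect to its HTTPS counterpart. As one illustration, assuming an nginx server (Apache and most managed hosts have equivalent settings), the rule can be as simple as:

```nginx
# Send all HTTP traffic to HTTPS with a permanent (301) redirect
server {
    listen 80;
    listen [::]:80;
    server_name example.com www.example.com;  # placeholder domain
    return 301 https://$host$request_uri;
}
```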

Using Schema Markup Is Good for SEO

As defined by Schema.org, schema is a “…vocabulary…used with many different encodings, including RDFa, Microdata, and JSON-LD. These vocabularies cover entities, relationships between entities, and actions, and can easily be extended through a well-documented extension model. Over 10 million sites use Schema.org to markup their web pages and email messages. Many applications from Google, Microsoft, Pinterest, Yandex, and others already use these vocabularies to power rich, extensible experiences.”

So, what does this actually mean? Essentially, it’s a structured vocabulary added to the back end of your website that feeds bot crawlers efficient, machine-readable information about your business. It goes totally unseen by the front-end user. Schema is specifically meant to ensure the right potential customers are seeing your website in search results.

You can add as much relevant schema to the back end of your website as makes sense. As long as it is done accurately and implemented correctly, adding schema is highly recommended.
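
To make that concrete, here is a small illustrative JSON-LD snippet (the encoding Google generally recommends) describing a fictional local business; every detail below is a placeholder, and the types and properties you choose should match what your pages actually say:

```html
<!-- Illustrative JSON-LD schema for a fictional local business -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Roasters",
  "url": "https://www.example.com/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "openingHours": "Mo-Fr 07:00-17:00"
}
</script>
```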

Ensure Your Files Are Up to Date and Condensed

Are your sitemap and robots.txt files up to date and accurate? Is your page speed being dragged down by unminified script files? If any of this sounds unfamiliar, it may be worth running an audit of your website to find out.

Your sitemap.xml file should be current with all of the most important pages of your website that you want indexed. If you’ve added a new blog page that should be indexed, add it to your sitemap and resubmit the sitemap through Google Search Console. (Don’t worry, we’ll talk more about Google tools later in this blog.) In short, an accurate sitemap.xml file gives Google a quick reference to the most important pages of your site for indexing.
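
For reference, a sitemap.xml file is fairly simple under the hood. A stripped-down example (URLs and dates are placeholders) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal illustrative sitemap; URLs and dates are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/new-post/</loc>
    <lastmod>2024-02-01</lastmod>
  </url>
</urlset>
```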

Don’t forget, your robots.txt file should reference your sitemap so the bot crawlers know they’re welcome and where to go. The sitemap lists your important pages; the robots.txt file tells bots which parts of the site to crawl and points them to that sitemap.
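
A basic robots.txt that welcomes crawlers, blocks a section you don’t want crawled, and points to the sitemap might look like this (the domain and the disallowed path are placeholders):

```
# Illustrative robots.txt; domain and disallowed path are placeholders
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```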

Your page speed, which, as we covered earlier, affects bounce rate and therefore SEO, is directly affected by your script file sizes. Every time a page has to load unminified (uncondensed) files, the load time slows down. It’s like sending an email with a large attachment versus a compressed one: one arrives fast, the other doesn’t. Some of the most common large files are JavaScript and CSS files, so that’s a good place to start. All told, the point is to reduce your page load time while maintaining otherwise best SEO practices.
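
Most modern build tools and CMS plugins handle minification automatically, but as one hedged example, a single JavaScript file can be minified from the command line with a tool like terser (the file paths are placeholders):

```bash
# Minify a JavaScript file with terser (requires Node.js; paths are placeholders)
npx terser assets/main.js --compress --mangle --output assets/main.min.js
```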

Get Your Technology Right, Improve Website Rankings

What’s wild is that this only scratches the surface. To ensure your website’s SEO efforts are set up properly, working with an SEO specialist is a great idea. You can also check on the health of your SEO with tools such as Lighthouse, SEMRush, and Moz. However, to ensure these technical updates are done right, working with an SEO content specialist and a developer will be your most successful route.
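
Lighthouse, for example, is built into Chrome’s DevTools and can also be run from the command line; a one-off audit (placeholder URL) looks roughly like this:

```bash
# Run a one-off Lighthouse audit and open the report (requires Node.js; URL is a placeholder)
npx lighthouse https://www.example.com/ --view
```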

Check out our blog series about SEO to learn more about how to take control of your website’s SEO health!
