As the only B Corp-certified digital marketing agency in San Francisco, we strive to make a positive impact wherever and however we can. To that end, we’ve put together this three-part series on the different types of search engine optimization (SEO) for digital marketers. We’ll be covering on- and off-page SEO, local SEO, and, in this part, the fine points of technical SEO.
Technical SEO is the optimization of your site’s structure and coding, and it’s the first step in optimizing any website: every other kind of optimization or ranking factor depends on it. You can think of it as the backbone of your site’s optimization. If your site is a nicely decorated house, then technical SEO is the foundation, the studs in the walls, the power outlets, the insulation, and so on.
Technical SEO also impacts how your site is crawled and indexed, and, since it deals with how your site actually functions, it has a tremendous impact on user experience (usually abbreviated UX). Since technical SEO has an impact on just about every aspect of your site, we’ve broken it down by the specific issues it can cause when it’s done improperly, and we’ve added tips to diagnose and solve those issues in your own website. If you’d like some assistance diagnosing your site or implementing these fixes, don’t forget to schedule your own complimentary digital marketing strategy session with our CEO, Anna Colibri!
Crawling and Indexing
Before it will ever appear in the search engine results pages (SERPs), your site needs to be indexed by a “crawler,” a piece of software designed to systematically follow hyperlinks and record what it finds. You can help the process along with Google Search Console, but a few things still need to be in place for indexing to succeed.
First and foremost, your site will need to load. If it throws 404 errors or your web host can’t maintain a stable connection, your site won’t get indexed. Your site will also need an HTTPS address. While there are plenty of sites served over other protocols, and plenty of browsers to visit them, Google doesn’t index sites that aren’t using a standard HTTP address. As for the -S at the end, it confirms that a connection to your site will be capital-S Secure. Since August 2014, Google has counted that security as a ranking factor.
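As a quick sanity check, a short script like this (the URLs are hypothetical placeholders) can flag any pages in a list that aren’t served over HTTPS:

```python
from urllib.parse import urlparse

def insecure_urls(urls):
    """Return the URLs that are not served over HTTPS."""
    return [u for u in urls if urlparse(u).scheme != "https"]

pages = [
    "https://example.com/",
    "http://example.com/blog",   # plain HTTP: this one gets flagged
    "https://example.com/about",
]
print(insecure_urls(pages))  # → ['http://example.com/blog']
```

In practice you’d feed this a full crawl of your site rather than a hand-typed list, but the idea is the same: every indexed page should come back empty from a check like this.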
A common mistake business owners make is to split their site across subdomains, especially for their blogs. If you’re stitching together multiple domains or subdomains within one site, stop. You’ll seriously undercut your own Domain Authority (DA), harm UX, and generally miss out on traffic in the long run.
Google treats multiple domains as if they were separate websites. If your blog, which holds most of your fresh content, lives on a separate domain, then your main website and home page will get very little traction as you work to increase rankings and traffic.
If your links aren’t visible in the HTML of your site (if they’re generated by JavaScript, for example), you’ll need to publish a comprehensive and up-to-date sitemap, usually in XML, for any pages beyond your homepage to get indexed. For a more detailed explanation, take a look at our piece on XML sitemaps for SEO.
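For reference, a minimal XML sitemap follows the sitemaps.org protocol and looks something like this (the URLs here are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2017-03-20</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/technical-seo</loc>
    <lastmod>2017-03-20</lastmod>
  </url>
</urlset>
```

Each page you want indexed gets its own `<url>` entry, and you point Google at the file through Search Console or your robots.txt.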
When you’re planning your site architecture, it might be tempting, for your own organizational peace of mind, to meticulously sort all your pages into a deep, precise file tree. It’s a great instinct, but too much depth will actually harm your site in the long run.
Crawlers only follow a finite number of links before they stop (it varies, but for best results assume a crawler will go no more than three layers deep). And from a user perspective, it’s generally expected that you can get to the content you need in no more than three clicks. Sometimes it might take more (an older blog post, for example, might be deeper in the archives) but don’t push it.
Users have an unwavering expectation for immediacy. With everything on demand, even a few extra seconds or a few extra clicks will deter a large portion of your potential user base. Take a look at some of the data in our piece on Micro Moments and Local SEO if you want to get a clearer picture of the kinds of user behavior we’re talking about.
Redirects
Redirects are a mixed bag: used appropriately they solve real problems, but used carelessly they will seriously undermine your site in a number of ways. Basically, redirects exist to mediate between two competing pressures. SEO best practice says to choose a single domain and stick with it, since even infrequent changes will reset your Domain Authority and confuse your users. Domain Authority is basically a measure of reputability or trustworthiness assigned to your site. You earn an authoritative reputation by hosting strong content and linking to sites that users will generally spend a long time exploring, and you’ll lose that reputation if you change to a new domain (basically starting all over again at level 1, abandoning all your experience points). However, there are plenty of reasons why a URL that once made sense for your brand might no longer be appropriate, and you’d have a clear reason to change it. Thus comes the humble redirect.
There are two kinds of redirects, 301 and 302, which crawlers treat very differently. 302 redirects are temporary, usually used during site construction or renovation. Crawlers ignore them, since it’s expected that they’ll be removed fairly quickly. As such, it’s imperative that you not leave a 302 redirect in place too long, or the overall site will not be indexed properly and will gradually lose its standing, and getting everything back in order can be a long and tiresome process.
301 redirects are used for permanent URL changes. If you’re moving your business to a new domain, it’s generally good practice to put a 301 redirect on the old root URL. That way, anyone, human or crawler, who loads your old URL will be automatically sent to your new destination relatively seamlessly. The only time this would not be appropriate is if you had a reason to intentionally distance your site from your previous domain.
Permanent redirects generally let you transfer most, if not all, of your Domain Authority from your old site to your new one. However, even 301 redirects can increase page load times, which we’ll cover in the next part of our SEO series. Further, they can look a little slapdash to an attentive user who sees a sluggish redirect happen in real time (in the address bar) and might wonder why your site didn’t just load properly the first time. If your two sites are hosted in entirely different places, or if an old server isn’t pulling its weight, there’s often a perceptible hang during the redirect, so be sure to weigh the pros and cons before setting your site up this way. Never redirect to a redirect, and make sure the redirect happens before any other script on the page has run. Redirecting after part of the page has loaded is hugely off-putting to users.
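To make the two kinds concrete: on an Apache server, for example, both can be set up with the `Redirect` directive in an .htaccess file (the paths and domain here are hypothetical):

```apache
# Permanent move: tells crawlers to pass the old page's standing to the new URL.
Redirect 301 /old-page https://example.com/new-page

# Temporary move: crawlers keep the original URL; remove this once renovation is done.
Redirect 302 /pricing https://example.com/pricing-under-construction
```

Other servers and hosting platforms have their own equivalents, but the 301-versus-302 distinction works the same way everywhere.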
That’s it for Part 1, but we’ll have Part 2 of our series on SEO for digital marketers coming soon!
Colibri Digital Marketing
We’re a digital marketing agency that focuses on the triple bottom line of people, planet, and profit. Based in San Francisco, we’ve got our fingers on the pulse of Silicon Valley, we’ve got an insider perspective on the tech industry, and we get a sneak peek at the future of digital marketing. If you’re ready to work with San Francisco’s first and only full-service B Corp-certified digital marketing agency, drop us a line or click below to schedule a free digital marketing strategy session!
It’s fun 🙂
Originally published at colibridigitalmarketing.com on March 20, 2017.