
    5 Often-Overlooked SEO Essentials


    Search engine optimization (SEO) is a discipline that requires just that—discipline. Think Daniel of The Karate Kid (the prized ’80s version, please) painting the fence, again and again. Tedious though it may seem, maintaining an error-free and well-optimized site can lead to improved rankings on search engine results pages (SERPs) and long-term traffic success. But whether you’re a digital marketing novice or an SEO black belt, if you don’t have the basics right, the more advanced optimizations are useless. Here are some of the often-overlooked SEO essentials that are critical to success.

    1) tweak your titles

    When was the last time you reviewed your page titles? By this I mean the prominent links that appear on search engine results pages. Don’t set these titles and leave them to die—check and tweak them to ensure they still make sense and link to the right place. Are they putting your website’s best foot forward? Keep in mind they are the first thing users will see when they come across your listing.
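
    If you want a quick way to audit titles across a handful of pages, a short script can pull each <title> tag and report its length (titles much beyond roughly 60 characters tend to get truncated on results pages). Below is a minimal sketch in Python; the URLs are placeholders for your own pages.

        # Minimal title-audit sketch: fetch a few pages and report each <title>
        # and its length. The URLs below are placeholders, not real pages.
        import re
        import requests

        pages = [
            "https://www.example.com/",
            "https://www.example.com/services",
            "https://www.example.com/blog",
        ]

        for url in pages:
            html = requests.get(url, timeout=10).text
            match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
            title = match.group(1).strip() if match else "(no title tag found)"
            print(f"{len(title):>3} chars  {url}  ->  {title}")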

    2) consult the console

    Pay a visit to Google Search Console (formerly Google Webmaster Tools), a free service that helps you monitor and maintain your site’s presence in Google search results. Search Console gives you a bunch of great info about your site, including existing errors, who is linking to you, when Googlebot last paid your site a visit, how many pages you have indexed in Google, and how your site has trended over time. And, if your site is linked to Google Analytics, it will give you top queries and analysis. Most importantly, it will help you discover how Google—and the world—sees your site. (As a side note, keyword data will be limited because Google stopped showing top keywords from organic search, citing privacy concerns. Thankfully, Search Console gives you more to work with than the dreaded “keyword not provided”.)

    3) 86 your 404s

    Minimizing crawl errors and general accessibility issues can help get your content into search engine indexes more quickly and frequently. Improving and resolving 404 and timeout errors on your site can also help search engines minimize the bandwidth used to completely crawl your site.

    In Google Search Console, take a look at the “Crawl errors” in the diagnostics panel. Pay attention to the “Not found” and “Timed out” reports, and test each error, as well as the “linked from” tab. If you drill down into the report, you may find a common pattern that can be solved with only a few fixes. Focus first on 404 pages that have external links pointing to them to get maximum SEO value. Inbound links that lead to expired URLs aren’t great for user experience, and a preponderance of them may imply that, as a resource, your site is getting out of date. Instead, implement a permanent 301 redirect to the updated URL to help get site visitors to the correct page.
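
    If you’d rather spot-check a list of error URLs outside of Search Console, a quick script along these lines can flag 404s, timeouts, and temporary redirects that should probably be permanent. It’s a rough sketch: the URLs are placeholders, and some servers answer HEAD requests differently than GET.

        # Rough sketch for spot-checking URLs pulled from a crawl-error report:
        # flag 404s, timeouts, and temporary (302/307) redirects that should
        # probably be permanent 301s. The URLs below are placeholders.
        import requests

        urls_to_check = [
            "https://www.example.com/old-landing-page",
            "https://www.example.com/retired-product",
        ]

        for url in urls_to_check:
            try:
                r = requests.head(url, allow_redirects=False, timeout=10)
            except requests.exceptions.RequestException as exc:
                print(f"TIMEOUT/ERROR  {url}  ({exc})")
                continue
            if r.status_code == 404:
                print(f"404 NOT FOUND  {url}")
            elif r.status_code in (301, 308):
                print(f"301 PERMANENT  {url} -> {r.headers.get('Location')}")
            elif r.status_code in (302, 307):
                print(f"302 TEMPORARY  {url} (consider a 301 instead)")
            else:
                print(f"{r.status_code}  {url}")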

    Also be sure to inspect your XML sitemap and make sure it’s 100% up to date. Search engines have roughly a 1% tolerance for “junk” URLs (redirects, dead pages, or pages returning some sort of error message); exceed that and they may consider the whole sitemap junk and crawl your site less often.
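
    The same kind of spot-check works for the sitemap itself. The sketch below assumes a standard XML sitemap at /sitemap.xml, requests every URL it lists, and reports the share that don’t return a clean 200; the domain is a placeholder for your own.

        # Sitemap spot-check sketch: parse a standard XML sitemap and report
        # the share of listed URLs that are not clean 200s (redirects, errors,
        # dead pages). The sitemap URL below is a placeholder.
        import xml.etree.ElementTree as ET
        import requests

        SITEMAP_URL = "https://www.example.com/sitemap.xml"
        NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

        root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
        urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

        junk = []
        for url in urls:
            try:
                status = requests.head(url, allow_redirects=False, timeout=10).status_code
            except requests.exceptions.RequestException:
                status = None
            if status != 200:
                junk.append((url, status))

        share = len(junk) / len(urls) if urls else 0.0
        print(f"{len(junk)} of {len(urls)} sitemap URLs are not clean 200s ({share:.1%})")
        for url, status in junk:
            print(f"  {status}  {url}")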

    4) say canonicalization three times fast

    How well have you dealt with duplicate content (Google hates it)? When was the last time you checked your redirect rules? If redirect rules were accidentally left out of updated site releases, your canonicalization is back to square one. Of course, you should always be working towards reducing internal duplicate content as a best practice. Check your www or non-www redirects (choose either, but always use a 301) to see if you have a problem. Also check trailing slash and case redirects (tip: a 301 redirect to all-lowercase URLs can solve a lot of headaches).
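
    If you want to verify those redirect rules after a release, a check like the one below can confirm that common variants (non-www host, mixed-case path, trailing slash) all collapse to one canonical URL with a single 301. This is a sketch only; the canonical URL and variants are placeholders for your own site.

        # Canonicalization spot-check sketch: each common variant of a page's
        # URL should 301 straight to the single canonical form. The URLs below
        # are placeholders.
        import requests

        CANONICAL = "https://www.example.com/services"
        variants = [
            "https://example.com/services",       # non-www host
            "https://www.example.com/Services",   # mixed-case path
            "https://www.example.com/services/",  # trailing slash
        ]

        for url in variants:
            r = requests.head(url, allow_redirects=False, timeout=10)
            target = r.headers.get("Location", "")
            # rstrip("/") just normalizes a trailing slash on the redirect target
            ok = r.status_code == 301 and target.rstrip("/") == CANONICAL
            print(f"{'OK ' if ok else 'FIX'}  {url}  ->  {r.status_code} {target}")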

    5) make nice with the robots

    If there are files and directories you do not want indexed by search engines, you can use the “robots.txt” file to define where the robots should not go. But be sure to take a monthly look at your robots.txt file. Does it still make sense to block certain paths? Even the slightest error can cause a bot to disregard your directives and index pages that should not appear in the search engine index.
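
    Python’s standard library happens to ship a robots.txt parser, which makes this monthly check easy to script. The sketch below assumes a robots.txt at the root of your domain and tests a few placeholder paths against Googlebot’s rules; swap in the paths you actually intend to block.

        # robots.txt spot-check sketch: confirm that the paths you mean to block
        # are blocked and that everything else stays crawlable. The domain and
        # paths below are placeholders.
        from urllib.robotparser import RobotFileParser

        rp = RobotFileParser()
        rp.set_url("https://www.example.com/robots.txt")
        rp.read()

        paths_to_test = [
            ("/", True),                 # expect: allowed
            ("/staging/", False),        # expect: blocked
            ("/cart/checkout", False),   # expect: blocked
        ]

        for path, expect_allowed in paths_to_test:
            allowed = rp.can_fetch("Googlebot", "https://www.example.com" + path)
            status = "OK " if allowed == expect_allowed else "FIX"
            print(f"{status}  {'allowed' if allowed else 'blocked'}  {path}")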

    Keep in mind that the “robots.txt” file is a publicly available file, so anyone can see what sections of your server you don’t want robots to use. Know that search crawlers can also ignore your “robots.txt”, especially malware robots that scan the web for security vulnerabilities.

    Making these tips part of your essential spot-check routine will help your SEO kick butt on the SERPs. You can also learn more about our search engine marketing services here at Smart Panda Labs—our SEO experts can provide you with everything you need to achieve strong search results.

    Wax on, wax off, Daniel-san.


    AUTHOR

    Lisa Martino

    Lisa heads the firm’s search engine marketing practice and brings more than 15 years of experience to Smart Panda Labs, where she implements and manages a variety of paid search campaigns for clients, each with unique budgets and goals. Her areas of specialty include paid search and search engine optimization, internet marketing, display ads, email marketing, CRM, affiliate marketing, direct mail and web analytics. Prior to joining Smart Panda Labs, Lisa was a Search and Auction Media Manager for the Walt Disney Company for over eight years, where she managed extensive, multi-million dollar paid search campaigns for numerous Walt Disney Parks & Resorts websites and brands. She has also worked in marketing for Cox Target Media/Valpak and American Express. Lisa is a Microsoft Accredited Professional and is Google AdWords qualified in search, display, video, mobile and shopping.

