5 Technical SEO Issues to Fix in 2014


Many online marketers jump into a New Year with fresh ideas to guide their content strategy and focus on updating their target keywords to lay the foundation for a successful SEO campaign. However, before investing the time and effort into creating effective new content, we recommend you first review the five common technical SEO issues below and fix them on your website. Improving your ability to rank in search results will help ensure the content you create successfully drives organic visitors to your site.

By implementing these updates, you provide your website with the opportunity to increase its ranking in search results and prepare your website for a year of SEO success. 

1. HTML Markup Errors

HTML markup provides context for search engines by creating hierarchies of importance within content. Unfortunately, many online marketers use WYSIWYG editors to make visual changes without realizing the impact those changes have on HTML markup, and therefore on search engine optimization.

Several different types of HTML markup exist, creating even more confusion. We recommend leveraging Schema.org for markup direction and best practices. Review your website for incorrect header tags, image alt attributes and title attributes, and evaluate every page title and meta description. Every page of your website should have a unique title that doesn’t exceed 60 characters, and meta descriptions should include a call-to-action and stay within 165 characters. Check all of your pages for these details to complete a thorough technical SEO analysis.
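
As a quick reference, here is a minimal sketch of the markup elements worth auditing on each page; the business name, file names and Schema.org type are placeholder examples, not specific recommendations for your site:

    <head>
      <!-- Unique page title, 60 characters or fewer -->
      <title>Technical SEO Services | Example Company</title>
      <!-- Meta description with a call-to-action, 165 characters max -->
      <meta name="description" content="Find and fix the technical SEO issues holding your site back. Request a free audit today.">
    </head>
    <body itemscope itemtype="http://schema.org/LocalBusiness">
      <!-- One H1 per page, with H2s and H3s creating a clear hierarchy -->
      <h1 itemprop="name">Example Company</h1>
      <h2>Technical SEO Services</h2>
      <!-- Descriptive alt text on every meaningful image -->
      <img src="team-photo.jpg" alt="The Example Company consulting team">
    </body>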

2. Poor robots.txt file

Robots.txt files allow webmasters to provide crawling directions for search engine robots by indicating which pages should be crawled and which should be ignored. Online marketers run into the most problems with either a restrictive robots.txt file that blocks folders or pages within a website by accident, or an excessively permissive file that fails to block private areas, such as login pages and customer profiles, from search engines. These errors can lead webmasters to accidentally direct major search engines to ignore and not index content, essentially removing the potential for organic traffic to those pages.

Don’t forget to put this file in the correct place – the top-level directory of your web server. That’s where the search engine crawlers go to look for direction. Also, keep in mind that this file is publicly available – anyone can read it. For obvious reasons, don’t include proprietary information, or heaven forbid, password information. Yes, it’s been done before. If you plan on launching a new website or adding a new content area to your existing site this year, create a comprehensive robots.txt strategy first.
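
For reference, a basic robots.txt file might look like the sketch below; the blocked directories and sitemap URL are hypothetical, so adjust them to your own site’s structure:

    # Apply these rules to all crawlers
    User-agent: *
    # Block crawlers from private areas
    Disallow: /login/
    Disallow: /account/
    # Point crawlers to the XML sitemap
    Sitemap: http://www.example.com/sitemap.xml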

3. Dynamic URLs

From a web development standpoint, dynamic URLs provide an efficient way to reference a page through various routes, categories and filters on a website. However, from an SEO standpoint, dynamic URLs can cause problems, including the creation of duplicate content. URLs with unique parameters are essentially different URLs, each of which is separately indexed by search engines and treated as a different web page, even when they all return the exact same content. This can cause confusion for crawlers as they determine which version should rank in organic search results. Having multiple versions of the same content can also decrease the pages’ ranking potential, because visitors link to different versions of the same content, which ultimately dilutes any single page’s ability to accumulate backlinks.
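
For example, all of the following hypothetical URLs might return the exact same product listing, yet a search engine sees three separate pages:

    http://www.example.com/shoes?color=red&sort=price
    http://www.example.com/shoes?sort=price&color=red
    http://www.example.com/shoes?color=red&sort=price&ref=homepage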

To safeguard a website from the potential issues that result from dynamic URLs, online marketers can utilize URL rewriting strategies and identify canonical versions. URL rewriting allows webmasters to include keywords and other descriptive words within URLs, giving them a cleaner look; following these updates, website visitors can read URLs more easily and understand their route through the site. Canonical links, in turn, tell search engines to funnel all value to a single URL without requiring redirect setup.
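
As a sketch, a canonical tag placed in the <head> of each parameterized version points search engines to the preferred URL (the URLs below are hypothetical):

    <!-- Included on http://www.example.com/shoes?color=red&sort=price -->
    <link rel="canonical" href="http://www.example.com/shoes">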

If dynamic URLs are rampant across your website but don’t affect the content shown or impact your tracking strategy, webmasters can also leverage 301 redirects, which forward all URL variations to one preferred version. They can take time to go into effect once set up, however, and if configured incorrectly, can seriously harm a website’s standing in search results. It is also important to remember that a 301 is a permanent redirect. If you are not willing to make that commitment, try a 302, which in recent months has been noted to also pass PageRank. Take great care in building your redirect strategy.
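
As a minimal sketch, assuming an Apache server and hypothetical paths, a 301 redirect can be set up in the site’s .htaccess file like this:

    # Permanently (301) redirect an outdated URL to the preferred version
    Redirect 301 /old-product-page.html /products/running-shoes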

Webmasters can also employ Google Webmaster Tools to provide search engines with definitions for URL parameters. For example, one can direct Google to crawl URLs with a specific value, all URLs containing a certain parameter or no URLs containing a certain parameter.

Before you start utilizing any of the tools above, work with your development team to create a dynamic URL strategy that works best for your website and overall business goals. By outlining a comprehensive plan before taking action, you save yourself from many potential search result issues.


4. Basic site structure

Website architecture refers to the layout and navigational structure of your digital resources. A successful website focuses on target audience needs and includes navigational strategy, menu organization and content labeling. When organizing your website content, you must take into consideration search engines’ ability to crawl and index your website: they should be able to reach every important page through your primary navigation, which increases its ranking potential. Also focus on prominently displaying your service and product pages, and avoid burying them too deep in your website. Minimize the number of clicks it takes for visitors to reach your conversion pages, making it easier for both visitors and robots to find your most valuable content.
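
As a rough illustration, a shallow structure like the hypothetical one below keeps key service pages within one or two clicks of the homepage:

    Home
    ├── Services            (one click from home)
    │   ├── Technical SEO Audits
    │   └── Content Strategy
    ├── About
    └── Contact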

Making changes to your website architecture can be a long and arduous task, but small changes to your navigation, internal linking strategy and content distribution can be hugely beneficial. Identify the content areas on your website that can be minimized or removed to increase the prominence of your most valuable content.

5. Poor URL structure

Successful URL structure ensures all website pages are fully crawled and indexed by search engines. To improve this process and keep pages from getting lost, structure URLs with primarily lowercase text and always use dashes instead of underscores. When dashes are used, search engines read each word in the URL individually as well as together as a keyword phrase; when underscores are used, the words run together and tend to be treated as a single term. Use session IDs and dynamic parameters sparingly and deliberately. Of course, don’t forget to include your main keyword in the URL, and always follow a logical website structure when developing your URL strategy. As discussed above, mirror your navigation so that your website visitors don’t get lost.
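
For a hypothetical illustration, compare a URL that works against these guidelines with one that follows them:

    Harder to crawl and read:  http://www.example.com/Product_Pages/Detail.aspx?item=4872&sessionid=92af31
    Easier to crawl and rank:  http://www.example.com/services/technical-seo-audit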

By checking your website for these common SEO technical issues, you can put your website on track for success in 2014. Are you worried you’re in over your head? 

Contact our online marketing team for help on building a cohesive strategy focused on achieving your overall business goals. 

photo credit: Svadilfari via photopin cc