3 Factors That May Be Affecting Your Natural Search

Estimated reading time: 4 minutes

This ‘best practice’ approach should always be the starting point before conducting a deeper dive or audit.

Over the years it has always amazed me how often simple checklists are not adhered to, and how many SEO agencies never discuss them with a client, or worse, never apply this process before an audit.

So, here are 3 quick pointers to look into.

Analytics set-up

Are you counting the right numbers? Checking that Google Analytics has been set up correctly is key to understanding what is really happening on your site. Google Analytics can help with:

  • Advertising and Campaign Performance
  • Analysis and Testing
  • Audience Characteristics and Behaviour
  • Cross-device and Cross-platform Measurement
  • Data Collection and Management
  • Sales and Conversions

Make sure your webmaster or agency has tested your Analytics and set up goals and funnels within the dashboard.
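If you want to sanity-check the set-up yourself, the sketch below sends a test hit to the Measurement Protocol’s debug endpoint, which validates the hit without recording it. This is a minimal Python sketch, assuming a Universal Analytics property; the tracking ID is a placeholder for your own.

    # Minimal sketch: validate a test hit against the Universal Analytics
    # Measurement Protocol debug endpoint. The tracking ID is a placeholder;
    # substitute your own property's UA-XXXXXXX-Y.
    import requests

    payload = {
        "v": "1",                 # protocol version
        "tid": "UA-XXXXXXX-Y",    # placeholder tracking ID
        "cid": "555",             # throwaway client ID for the test
        "t": "pageview",          # hit type
        "dp": "/analytics-test",  # document path of the test pageview
    }

    resp = requests.post("https://www.google-analytics.com/debug/collect", data=payload)
    result = resp.json()["hitParsingResult"][0]

    # The debug endpoint parses and validates the hit without recording it.
    print("Hit valid:", result["valid"])
    for msg in result.get("parserMessage", []):
        print(msg["messageType"], msg["description"])

A valid response confirms the property can parse hits; it does not prove the tag is firing on every page, so still verify goals and funnels in the dashboard itself.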

URL structure

As websites change over the years and pages are added or removed, there is a likelihood that you have orphan pages out in the ether that are still being indexed. These changes are normally accompanied by a variety of URL changes that do not tie up with the main sitemap, making it harder for Google to index the site correctly. See the example below:

  • URL example – 2014

Companyname.com/whatwedo

  • URL example – 2015

Companyname.com/our-services

  • URL example – 2016

Companyname.com/services

The example above is commonplace among companies of all sizes and highlights the problem of Google trying to index more than one page of the same content under differing URL structures.
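As a quick first pass, you can check how the legacy variants respond over HTTP; old URLs should 301-redirect to the canonical page rather than serve their own copy of the content. Below is a minimal Python sketch assuming the example paths above, with companyname.com as a placeholder domain.

    # Minimal sketch: check whether legacy URL variants redirect to the
    # canonical page (domain and paths are placeholders).
    import requests

    CANONICAL = "https://companyname.com/services"
    LEGACY = [
        "https://companyname.com/whatwedo",
        "https://companyname.com/our-services",
    ]

    for url in LEGACY:
        # Don't follow redirects; we want the raw status code.
        r = requests.get(url, allow_redirects=False, timeout=10)
        if r.status_code in (301, 308) and r.headers.get("Location") == CANONICAL:
            print(f"OK         {url} -> {CANONICAL}")
        elif r.status_code == 200:
            print(f"DUPLICATE  {url} still serves its own copy")
        else:
            print(f"CHECK      {url} returned {r.status_code}")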

Please note that while this is a good simple check, URL issues run deep and require a professional site audit and backlink check.

Robot (bot) traffic

What is bot traffic? Bot traffic is non-human traffic generated by automated programs (robots) and normally constitutes spam. These programs are sometimes referred to as spambots.

Excluding all known bots in Google Analytics is imperative if you want to understand the level and quality of traffic arriving at your site.

Be suspicious of bot traffic when your analytics data shows unexplained spikes in traffic, or pages with high bounce rates and few or no conversions.

The best way to block spam referrers from accessing your site at all is to block them in the .htaccess file in the root directory of your domain. On top of this, you can set up filters in Google Analytics, and if you are using WordPress there are also spam-blocker plugins available.
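For illustration, a hypothetical .htaccess rule set blocking known spam referrers might look like this; the domain names are placeholders for the referrers you actually see in your reports.

    # Hypothetical .htaccess rules: return 403 Forbidden to requests
    # whose referrer matches known spam domains (placeholders below).
    RewriteEngine On
    RewriteCond %{HTTP_REFERER} spam-domain-one\.com [NC,OR]
    RewriteCond %{HTTP_REFERER} spam-domain-two\.com [NC]
    RewriteRule .* - [F]

Blocking at the server level stops the requests before they ever reach your pages, whereas Analytics filters only hide the hits from your reports.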

The above are just three examples from a larger number of simple checks that can be conducted to help ensure your site is correctly indexed and ranked by Google and your traffic is measured accurately.

To learn more about professional SEO and Analytics audits, please contact the team on the details below.

Reggie James
Having built and sold various technology businesses over the years, Reggie heads the consultancy and commercial side of the business. Approachable and pragmatic, Reggie previously ran the first dot-com to list on the Singapore Stock Exchange as well as developing business strategies for brands whilst at AltaVista and Yahoo! before launching Digital Clarity. As a passive investor, he is also involved in the US public OTC markets.
