Estimated Reading Time: 6 minutes
SEO: search engine optimisation, organic (free) search, Google listings, and so on. Whatever you call it, SEO best practices should always be applied. Google uses over 200 ranking signals to decide whether your website should appear on a search results page. The signals range from page speed, on-page content and links to the site, covering everything from the technical to the content-based.
Not-for-profit domains tend to enjoy the same higher domain authority and trust as education and government domains. Links from these authoritative sites are still a large factor in ranking high in search, as your site is seen as being associated with good causes.
So, your site is seen as highly trustworthy in Google's spiders' eyes (all eight of them), but you're still not ranking well in Google? We did a little digging into why this could be, looking at 25 different charity sites, from donation-based to events, and fundraising to trusts. We found it almost alarming that virtually every site had the same issues and errors, all of which will be having an impact on each site's keyword rankings. Let's go through some of the findings:
Duplicate Content:
Duplicate content is defined as "substantive blocks of content within or across domains that either completely match other content or are appreciably similar" (support.google.com). There are two types of duplicate content: internal and external. As the names suggest, internal duplicate content sits within the same domain, while external duplicate content is found on other domains.
Of the websites we reviewed, 63% contained high levels of internal duplicate content, and a huge 90% suffered from external duplicate content!
Duplicate content is seen as negative because it can mislead users and create a poor user experience. Google's algorithms work hard to ensure that a searcher is given the most relevant results; duplicate content muddies those results and can cost your website clicks. There are multiple ways of resolving these issues, but the best method depends on the situation, and the right solution varies from site to site.
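As a rough illustration (this is not the tool used in our audit, and the page texts are hypothetical), a minimal Python sketch of how internal duplicate content can be spotted: normalise each page's text and compare fingerprints, so trivially reformatted copies still match.

```python
import hashlib
import re

def fingerprint(text: str) -> str:
    """Normalise whitespace and case, then hash, so trivially
    reformatted copies of the same content still match."""
    normalised = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

def find_internal_duplicates(pages: dict) -> list:
    """Group URLs whose body-text fingerprints match."""
    groups = {}
    for url, text in pages.items():
        groups.setdefault(fingerprint(text), set()).add(url)
    return [urls for urls in groups.values() if len(urls) > 1]

# Hypothetical pages on one domain:
pages = {
    "/donate": "Support our cause today.",
    "/donate-now": "Support   our cause today.",  # same copy, extra spaces
    "/events": "Join our fundraising events.",
}
print(find_internal_duplicates(pages))  # the two /donate* URLs group together
```

A real audit would fetch and strip the HTML first, and near-duplicate detection needs fuzzier matching than an exact hash, but the grouping idea is the same.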
Slow Site Speeds:
Google has confirmed site speed as one of the many signals used by its algorithm to rank pages. It's unconfirmed exactly which element of speed it measures: page speed, whole-site speed, or just time to first byte. What is clear is that the slower your site, the fewer pages can be crawled within the search engine's crawl budget, which can negatively impact indexation.
A huge 95% of sites were affected by slow mobile page speeds (have you not heard how important mobile is these days?) and a slightly lower 22% by slow desktop speeds.
Page speed is also vital for user experience: long load times are a recipe for high bounce rates and less time spent on site.
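Since time to first byte is mentioned above as one candidate measure, here is a minimal, hypothetical Python sketch of measuring it with only the standard library (again, not the tooling behind our audit figures):

```python
import time
import urllib.request

def time_to_first_byte(url: str, timeout: float = 10.0) -> float:
    """Return seconds from issuing the request until the first
    byte of the response body is available."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read(1)  # blocks until the first byte arrives
    return time.monotonic() - start

# Example against any reachable URL:
# print(f"TTFB: {time_to_first_byte('https://example.org/'):.3f}s")
```

This folds DNS lookup, connection time and server think-time into one number; dedicated tools such as Google's PageSpeed reports break those stages apart.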
Incomplete Codes & Tracking:
I bet you didn't think this had an impact? Well, it depends which analytics you use. If you use Google Analytics and Google Search Console, they can in fact help your pages' indexing, as they feed Google data on other signals, such as more accurate bounce rates and referral traffic from the backlinks you have built.
Just over half of the sites analysed had broken or duplicate tracking codes. With this one, you are probably damaging your own reporting more than anything: broken or duplicate codes mean you are not tracking data correctly, which can lead to poorly informed decisions and conclusions.
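A quick, hypothetical sketch of how missing or duplicated analytics tags can be flagged by counting Google Analytics measurement IDs in a page's HTML (the ID formats are GA's public ones; a real check would also ignore the ID's legitimate appearance in the tag's loader URL):

```python
import re

# Google Analytics measurement IDs: "G-XXXXXXX" (GA4) or "UA-XXXXX-X" (Universal).
GA_ID = re.compile(r"\b(G-[A-Z0-9]{4,}|UA-\d{4,}-\d+)\b")

def audit_tracking(html: str) -> str:
    """Classify a page's analytics tagging as missing, ok, or duplicate.
    Simplification: any repeated ID is flagged, so strip the standard
    loader <script src=...> line before auditing real pages."""
    ids = GA_ID.findall(html)
    if not ids:
        return "missing"
    if len(ids) > len(set(ids)):
        return "duplicate"
    return "ok"

print(audit_tracking("<script>gtag('config', 'G-ABCD1234');</script>"))  # ok
print(audit_tracking("'G-ABCD1234' ... 'G-ABCD1234'"))                   # duplicate
print(audit_tracking("<p>No analytics here</p>"))                        # missing
```

Run across a site crawl, even a crude check like this surfaces the pages where reporting is silently broken.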
Don't forget there are over 200 ranking signals in Google's algorithms that can impact where you rank and appear in search engines. Some are more important than others, and some are easier to fix than others, but they should all be on your agenda when running a website. If you want to learn more about SEO, algorithms, spiders and bots, then get in contact with me or the team and let's start talking about why you're really not ranking on Google.