First Thoughts On Google’s Fresh Algorithm Update

Nov. 04, 2011 | by Modestos Siotos

Just yesterday Google announced a major algorithmic update which will serve searchers fresher, more recent results. According to Google, the change will impact approximately 35% of searches – a much higher percentage than this year’s Panda update, which initially affected roughly 12% of SERPs. Since Panda launched, Google has applied several updates to fine-tune its impact, as many site owners raised concerns that their sites had been negatively affected in a rather unfair way.

Taking together the incorporation of social signals (Google+, +1) into Google’s organic results, the multiple Panda updates and this recent announcement, it is evident that close to 50% of searches have been affected in less than 12 months. There is no doubt that Google’s intent is to provide users with the most relevant results. However, changing the organic search landscape so drastically in less than a year would not come without some great challenges.

Algorithmic Challenges

The main challenge of Google’s “freshness” update is relevancy: how well the served results satisfy user intent and expectations. It is still unclear how the algorithm works out which keywords fall into the affected 35% of searches and which don’t. The key question is whether a “fresh” or “more recent” result is a quality one. Google claim that freshness won’t be the only component, as the content itself, topicality and quality will also be taken into account.

Nevertheless, it is very interesting to see how Google have addressed all the above algorithmically. With the Panda update, Google tried to tackle the challenge of “quality content” with the aid of human search quality raters. They then fed the raters’ results into their machine-learning processes in order to measure quality in an algorithmic way. A similar process may have been followed for the new algorithm update, which, given that it doesn’t yet have an official name, we could call the “Fresh Fruit” update, inspired by Amit Singhal’s introduction.

But how will the recent change deal with gaming intent, and how will it detect efforts to manipulate the SERPs? That sounds like a great challenge, particularly for searches around topical news and stories. Will the new change favour the site that first published a news article, or the one that published it last and is therefore more recent? And what if a content author rewrites an existing news article so that it changes substantially? Would that be considered fresh content and outrank the original?

Freshness Update Impact

At the moment it seems that mainly searches for head/generic terms have been affected. For instance, searching for ‘football’ on Google.co.uk returns the BBC’s live football page in the top spot. Similarly, searching for ‘Olympic Games’ no longer returns Wikipedia’s entry as the first result but the official London 2012 homepage.

Make The Best Out Of It

Even though it is still early days, it seems that keeping the content of a site fresh will be more important than ever before. However, maintaining fresh content without Google knowing about it would be pointless, hence frequent crawling and rapid indexation will become critical.

Do note that simply updating your content every couple of days shouldn’t, on its own, allow a site to improve its visibility off the back of this algorithm update. The following actions, however, are likely to benefit websites and help their pages rank higher by improving their ‘freshness’.

RSS Feeds

Implementing RSS feeds for new pages, as well as those which are frequently updated, will result in quicker indexation. Submitting these feeds to all the major RSS aggregator sites will probably become more popular.
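
As an illustration, a bare-bones RSS 2.0 feed might look like the sketch below; the site name and URLs are hypothetical:

    <?xml version="1.0" encoding="UTF-8"?>
    <rss version="2.0">
      <channel>
        <title>Example Site News</title>
        <link>http://www.example.com/</link>
        <description>The latest articles from example.com</description>
        <!-- lastBuildDate tells aggregators when the feed itself last changed -->
        <lastBuildDate>Fri, 04 Nov 2011 09:00:00 GMT</lastBuildDate>
        <item>
          <title>A Freshly Published Article</title>
          <link>http://www.example.com/fresh-article</link>
          <!-- pubDate is the publication timestamp crawlers and aggregators pick up -->
          <pubDate>Fri, 04 Nov 2011 08:30:00 GMT</pubDate>
          <description>A short summary of the new article.</description>
        </item>
      </channel>
    </rss>

Arguably, keeping pubDate accurate matters more than the length of the feed, since that is the timestamp aggregators will republish.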

Dynamic XML Sitemap

Certain searches seem to return recent, date-specific results. Thus, the page timestamp seems to play a big part in a page’s ability to rank. One way to give Google a hint about this is by providing a dynamic XML sitemap, so that Google can work out what the most recent content is by processing the timestamps provided in the sitemap.
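
For example, each sitemap entry could carry a lastmod timestamp, as in the minimal sketch below (the URL and date are hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/fresh-article</loc>
        <!-- lastmod is the hint crawlers can use to spot recently changed content -->
        <lastmod>2011-11-04</lastmod>
        <changefreq>daily</changefreq>
      </url>
    </urlset>

The ‘dynamic’ part simply means the sitemap is regenerated, and lastmod refreshed, whenever a page changes, rather than being left as a static file.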

Optimal Crawl Budget

Being visited regularly by Google’s spiders is very important. Therefore, it is essential to make sure that spiders spend their time crawling valuable content rather than duplicate pages that will eventually get filtered out of the SERPs. Site architecture issues preventing spiders from making a deep crawl of the site would need to be addressed and resolved.
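
One straightforward lever here is robots.txt. The sketch below assumes hypothetical duplicate URL patterns (session IDs, sort parameters, print-friendly versions); adjust the paths to whatever actually generates duplicates on your site:

    # Steer spiders away from duplicate, parameter-driven versions of pages
    User-agent: *
    Disallow: /*?sessionid=
    Disallow: /*?sort=
    Disallow: /print/

    # Advertise the sitemap that carries the freshness timestamps
    Sitemap: http://www.example.com/sitemap.xml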

Social Signals

Fresh or recent content that goes viral via social media channels will get indexed more quickly, which could even result in higher rankings. Therefore, making social media share buttons available to users would be very beneficial. It is very likely that Google’s new algorithmic update takes social media trending topics into account in order to discover fresh topics and serve more “fresh” search results.

Fresh Links

It makes sense that when Google has to choose between two relevant pieces of content, fresh links would indicate which page should rank higher. Links shape a site’s domain authority; thus, sites receiving fresh links from authoritative and trusted sources will be more likely to rank for recency-sensitive key terms.


    Comments (2)


    • Modestos Siotos

      Interesting points Jeremy.

      There are a few things Google may be using in order to identify recency-sensitive queries:

      Microblogging and social media sites would be the most obvious ones, as most people use them for trending topics. Signals could include Facebook updates, Twitter hashtags and, obviously, Google+ streams, in order to identify where users are clicking.

      I think that if Google serves better results for recency-sensitive queries, fewer people may turn to Twitter or Facebook for bursty topics such as celebrity or natural-disaster news.

      For less time-dependent queries, links may play a big part, as pages with fresh incoming links will probably rank higher than those without, which could potentially be flagged as pages with stale content.
      Nov 6, 2011 12:45 pm


    • Jeremy Head

      Interesting post Modestos.
      It feels to me a bit like there are two types of search result here now:
      1) The immediate short-hit one (i.e. fresh), which will bubble up fast to the top of the page and then drop off quickly.
      2) The longer-term, less time-dependent one that has always ranked well because it has lots of authority.
      Google needs to provide both and I can't see how the algorithm can ever be smart enough to always know which of the two I am actually looking for. Sometimes it will but certainly not always.
      The bit we won't know for a while is: how long is the ‘sell-by date’ on something that Google sees as ‘fresh’?
      Nov 4, 2011 05:26 pm
