Last week it was reported that Facebook were angling to rival Google’s domination of the search market by creating a bespoke search engine. Rumours, fuelled by a blurry image of Mark Zuckerberg’s MacBook Pro screen showing an unfamiliar Facebook-branded search box, flew around the internet. We even joked about ‘ZuckerSearch’ in our traditional April Fools’ post, but was this the first glimpse of the much-speculated and long-awaited Google killer?
The sensationalists amongst us may have jumped on this bandwagon; however, the likelihood is that the team of Facebook engineers, headed by an ex-Google employee, is simply working on improving the fairly basic integrated search function that Facebook already offers.
With Facebook nearing 1 billion users, the service has long been awaiting an update that will allow users to easily navigate through shared media on the platform. Currently, although there are many ways of refining your search results, they can be counter-intuitive and do not fully utilise the wealth of personal and social information that is stored on the Facebook servers.
Indeed, we are now seeing a huge move towards social search – although the impact of Google’s “Search plus your world” is yet to be fully seen (and we have speculated that although initial uptake has been fast, Google+ engagement levels are low), the move away from semantic to social-signal-led results seems to be on all search professionals’ minds. If Facebook had the same resources and experience that are available to Google, could a real competitor be on the cards?
In short, that is the problem. The years that have gone into the Google algorithm, and its organic growth and change, mean that Google of course has a head start, and a very large one at that. While reports suggest a team of 24 will be working on Facebook search, the talent pool at Google is huge. Take a look at the video linked below for a sneak peek at one of the search quality team meetings:
Google Chrome is set to overtake Microsoft’s Internet Explorer as the world’s most popular browser in May, finally ending Microsoft’s dominance.
Sunday the 18th of March 2012 was an historic day in the browser wars, with Google Chrome finally overtaking Internet Explorer to take the top spot for the day in terms of global browser market share. Whilst this victory was short-lived, with Internet Explorer regaining its crown come Monday morning, it shows the impressive gains that Chrome has made since it was launched in December 2008.
The next question can only be: when will Chrome overtake IE for good?
Extrapolating the monthly market share for the two browsers shows that Google Chrome is set to continue its surge to permanently best Internet Explorer in May this year to become the world’s most popular browser.
But since the monthly averages will not show Chrome ahead of IE until the end of the month, we need to look at the daily figures to find the points where Chrome has already passed IE.
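The extrapolation above can be sketched in a few lines of Python. The monthly share figures below are hypothetical placeholders for illustration, not the real StatCounter numbers:

```python
# Hypothetical monthly global market-share figures (percent), oldest first.
# Real numbers would come from a source such as StatCounter.
ie_share = [38.6, 37.5, 36.5, 35.8]
chrome_share = [27.3, 28.4, 29.8, 30.9]

def monthly_trend(series):
    """Average month-on-month change across the series."""
    deltas = [b - a for a, b in zip(series, series[1:])]
    return sum(deltas) / len(deltas)

def months_until_crossover(leader, challenger):
    """Project both linear trends forward until the challenger leads."""
    gap = leader[-1] - challenger[-1]
    closing_rate = monthly_trend(challenger) - monthly_trend(leader)
    if closing_rate <= 0:
        return None  # on these trends the challenger never catches up
    months = 0
    while gap > 0:
        gap -= closing_rate
        months += 1
    return months

print(months_until_crossover(ie_share, chrome_share))
```

A straight-line projection like this is crude, of course, but it is enough to put a rough month on the crossover.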
Google has recently started taking action against blog networks in an attempt to remove low quality websites from its index. It is estimated that several thousand domains have already been removed from Google’s index and the number is likely to increase further in the forthcoming weeks or months. That means that hundreds of millions of links have been completely devalued, affecting the rankings of many websites directly or indirectly.
Carrying out a thorough backlinks audit for new clients is extremely important to us because it allows us to:
- Get a good understanding of the site’s link profile and the quality of its historical backlinks
- Work out the chances of losing some link equity in the foreseeable future
- Closely monitor link equity loss on a weekly/monthly basis, react quickly and modify our link strategy if necessary
- Forecast ranking improvements and traffic growth more accurately
Preparing The Data
First and foremost we need to collect as much backlink data as possible. Exporting data from the following sources would make the data-set quite reliable – the more data, the better.
Majestic SEO Data
Majestic SEO historic index offers invaluable data about a site’s backlinks and should almost definitely be the primary source of backlinks data.
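As a minimal sketch of the “more data, the better” approach, the snippet below combines several CSV exports and deduplicates them on the source/target URL pair. The column names (`source_url`, `target_url`) are assumptions and should be renamed to match whichever tools’ exports you actually use:

```python
import csv

def load_backlinks(csv_path):
    """Read one backlink export (e.g. from Majestic SEO) into a list of rows.
    Column names are assumed; adjust to the real export's header."""
    with open(csv_path, newline="") as fh:
        return list(csv.DictReader(fh))

def merge_exports(rows_per_source):
    """Combine several exports, deduplicating on (source URL, target URL)."""
    seen = set()
    merged = []
    for rows in rows_per_source:
        for row in rows:
            key = (row["source_url"], row["target_url"])
            if key not in seen:
                seen.add(key)
                merged.append(row)
    return merged
```

Keeping the first occurrence of each link means metrics from your primary source (Majestic, in this case) take precedence when the same link appears in more than one export.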
In a move sure to cause some consternation for U.S. financial regulators, Facebook have broken their SEC-mandated ‘quiet period’, a requirement for firms that are preparing to go public, by announcing the launch of their new search engine ‘ZuckerSearch’.
Following search engine giant Google’s biggest foray into social networking, Google+, which launched last September, leading social network Facebook are seemingly firing back by challenging Google in the market with which they have become synonymous – search.
Integrating this new service with the traditional layout and experience with which Facebook users are familiar will be the biggest challenge for the Facebook developers, and probably the subject of the greatest speculation that will no doubt result from the announcement.
“The idea came to us very organically,” Facebook CTO Bret Taylor told a stunned New York press conference on Saturday evening. “We’ve noticed that people sometimes use their Facebook status updates to ask questions of their friends in the same way that you might use a search engine – ‘what happened on American Idol last night?’ or ‘how tall is Jeremy Lin?’ or ‘who invented the toothbrush?’
“At first we thought, what the hell is wrong with these people? Haven’t they heard of a search engine? That’s really where the idea was born.
“We think there is a significant opportunity in marketing this service to users who see their laptops, netbooks and tablets as just $1000 Facebook-machines.”
Although it is too early to draw any conclusions, privacy watchdogs will likely be monitoring developments on Facebook’s ZuckerSearch closely, particularly after Facebook drew criticism earlier this year for their proposed ‘Recommended Relationships’ (a product which automatically sent out speculative invitations to other users for dates, based on your romantic history) and their perhaps ominously titled ‘We Watch You While You Sleep’ security service.
At time of press Google have yet to respond to the announcement, but this will no doubt escalate the notorious competition between the two internet mega-giants. Relations between the two companies have been tense since Facebook CEO Mark Zuckerberg suggested in February 2012 that Google+ is a service designed exclusively for ‘nerds’, and removed Google founders Larry Page and Sergey Brin from his ‘Important Industry People’ circle on the service.
No firm date was given for the company to begin rolling out ZuckerSearch, and it’s unlikely to be released before their initial public offering, rumoured for May 2012. However, limited access has been given to several top web publishers, and as long as you promise not to tell anyone, you can have an early look by clicking through the iCrossing portal here.
Firefox has recently made a decision that is set to continue a worrying trend for website owners by further restricting their knowledge of the keywords people searched for to find their site.
Whilst not yet live, Mozilla have committed to making the changes, which have started to land in the main code set for an imminent future release.
The changes build on Google’s decision to hide all information about the keyword used from a website owner if the user is signed in to their Google account. The search is made secure, using the https protocol instead of the standard http.
I’m going to address secure search from two viewpoints: the user’s and the webmaster’s.
Benefits of Secure Search to a User
Secure search benefits the end user because it improves the privacy of their internet browsing. They may navigate through to a site, but the keyword they used in Google search is hidden from the owner of the site they arrive on. Whilst most people may not perceive this as an issue, a problem arises in the way Google Instant (where results are shown as you type) works: as the search is modified, the original keyword is retained and then passed on to the end website. This is a concern because the first keyword could be completely unrelated to the final one used before selecting a site, and could amount to a serious breach of user privacy.
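To see the mechanics, the keyword travels in the referrer URL’s `q` parameter, which any destination site can read unless the search was secure. A minimal sketch using Python’s standard library (the example URL is illustrative):

```python
from urllib.parse import urlparse, parse_qs

def keyword_from_referrer(referrer):
    """Extract the search keyword from a Google referrer URL, if present.
    A secure (https) search omits the q parameter, so this returns None."""
    query = parse_qs(urlparse(referrer).query)
    values = query.get("q")
    return values[0] if values else None

# parse_qs decodes '+' back into a space for us
print(keyword_from_referrer("http://www.google.com/search?q=browser+wars&hl=en"))
```

This is exactly the data that analytics packages record as the referring keyword, and exactly what disappears once the search happens over the secure protocol.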
One additional advantage of secure search, which perhaps makes it more important for the end user, is that it prevents network sniffing from being used to copy the cookie data of a signed-in user. Google have consolidated the privacy policies across all their digital properties, and since being signed in to Google search is the same as being signed in to Google Docs or Gmail, having someone copy your cookie means they can log in as you to any of these other services. As most users tend to use one email address as the reset point for all their other passwords, this access means that a person’s entire digital life can be hijacked, and it can be extremely complicated to recover.
The hijacking issue obviously points to a bigger problem: cookies are a fundamentally insecure way of working. Whilst there have been calls to change this underlying system, the decreasing cost of SSL (Secure Sockets Layer) certificates means that, potentially, every site will go secure in the future.
Following on from my previous post showing the Australia/Melbourne race result, here is Round 2 of the Formula 1 season in Malaysia at the weekend. A somewhat extended race, red-flagged while the thunder, lightning and heavy rain passed, it was the drivers toward the back of the top 10 who went on to take 1st and 2nd.
Last week I spoke at a digital marketing conference on the topic of putting data, and more importantly the insights from that data, at the heart of your digital marketing strategy, and how doing so can help increase sales and ROI.
Research should form the foundation of any digital marketing strategy; without it you’re placing yourself on unstable ground, adopting a Mad Men guesswork approach to marketing rather than leveraging all the powerful insights that can be gleaned from the large amounts of data available to us.
Understand the audience: who are they, what do they want, why are they here? Creating personas around your typical user can help you understand your audience. For example, if you have an audience of middle-aged women from India, you’ll likely implement a different strategy than if you have an audience of young men in their 20s from North America.
Understand their behaviour: demographics and socio-economic data can only take you so far; what is of real interest is how people behave. What language do they use online, and does it match the language you’re speaking on your site? Categorising and performing cluster analysis on the user data allows us to delve deep into the data and pull out nuggets of insight, uncovering niche interests that competitors may not be targeting but that are of interest to your customers.
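As a toy illustration of the cluster-analysis idea, here is a minimal one-dimensional k-means sketch in pure Python, grouping a single user metric (say, visits per month) into behavioural clusters. In practice you would use a proper statistics package and far richer features; this only shows the shape of the technique:

```python
import random

def kmeans_1d(values, k, iterations=20, seed=0):
    """Tiny 1-D k-means: split numeric user metrics into k clusters."""
    random.seed(seed)
    centroids = random.sample(values, k)  # pick k starting centres
    for _ in range(iterations):
        # Assign each value to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        # Move each centroid to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters
```

Two well-separated groups of users (light and heavy visitors, for instance) fall out as two clusters, which you can then profile and target separately.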
Ever since I can remember I’ve been passionate about Formula 1 so, with the kick-off of the 2012 season, I thought about combining my passion for the sport with my love for design and coming up with some infographics around each of the races and their results, to show how creative can be applied to a multitude of content sources, not just research results. Check out the first, from this year’s Australian GP – I’ll be publishing one throughout the series so we can see the final results all together in a summary of the 2012 races – stay tuned!
Developments are afoot once more for Google, with a new format for the integration of Google+ searches making its way onto our screens. This new format means that brands on Google+ can now occupy a much larger proportion of the search results screen than they did previously. This appears to be a simple progression of the new Search Plus Your World update that we discussed back in January which primarily sees the Google+ page recommendations/latest posts appear at the top right of the screen as opposed to its previous location within the search listings.
What does this mean for your SEO strategy?
For businesses on Google+ this is going to become key for your natural search strategy, as appearing in the top right of search results (in a space usually reserved for paid search ads) will mean more visibility and an additional place for users to access your social content. It’s also important to note that, as with the previous Search Plus Your World changes, this format change is visible even if a user isn’t signed in to their Google+ account, meaning it will reach a far greater audience than was previously possible.
In October 2011, Google made an announcement regarding a change to encrypted search queries within Google.com. This involves an SSL encryption protocol which is automatically applied to all users logged in to Google (Gmail, Google+ etc.) and also searches made directly via https://www.google.com (notice the ‘s’ in the URL).
While the secure protocol was not a new feature in Google, the latest update meant that all searches via the secure server would no longer pass keyword referral data. While Google’s announcement initially suggested this was to protect users’ privacy, the SEO community speculated whether Google’s intent was otherwise.
What was particularly suspicious about the update was that secure keyword data would remain available for paid search referrals, suggesting that Google were intending to encourage paid search rather than protect users’ privacy. Other sources, however, suggested that privacy was a genuine concern, with the Google+ API allowing webmasters to track search queries down to specific individuals.
How does it affect us?
Since the update, analytics packages have reported encrypted keyword data as ‘not provided’, while other keywords appear to dip in visits. With the update only implemented on Google.com, US sites have taken the biggest hit, while UK sites have been affected on a smaller scale until now…
On 5th March 2012, Google announced that this feature will be pushed out across their localised domains as well, affecting referrals from Google UK as well as Google.com. The announcement states that this will be introduced “over the next few weeks”; therefore UK sites should see an impact by the end of March.
UK webmasters should expect to see a further increase in traffic filed under the keyword ‘not provided’. Google’s Matt Cutts estimated that “even at full roll-out, this would still be in the single-digit percentages”, although external research into the impact on US sites shows the average figure to be closer to 11%.
What to do?
With secure keyword referrals returning ‘not provided’, all websites will lose a fraction of their keyword data, which is unavoidable. However, there are ways to make use of the data that remains, such as sorting visits by landing page to infer which keywords may have contributed to the unknown data. This is particularly useful for websites where individual landing pages correlate closely with specific keywords. In cases where more than 20% of data is being lost and this is having a significantly negative impact, iCrossing suggests that further action should be taken to make the most of the lost keyword data.
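One rough way to carry out the landing-page approach, assuming the hidden keyword mix for a page resembles its visible one, can be sketched as:

```python
def estimate_hidden_keywords(known_visits, not_provided_visits):
    """Apportion one landing page's 'not provided' visits across its
    known keywords, proportionally to the visible keyword mix.

    known_visits: {keyword: visit count} for a single landing page.
    Returns an estimated {keyword: extra visits} breakdown.
    """
    total = sum(known_visits.values())
    return {kw: not_provided_visits * visits / total
            for kw, visits in known_visits.items()}

# Illustrative figures only: a page that visibly receives 80 visits for
# "red shoes" and 20 for "buy red shoes", plus 50 'not provided' visits.
print(estimate_hidden_keywords({"red shoes": 80, "buy red shoes": 20}, 50))
```

The estimate is only as good as the assumption behind it: the closer a landing page maps to one or two keywords, the more trustworthy the apportioned figures become.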