If you follow SEO news, you’ll no doubt know that Google are preparing a number of changes to their algorithm which are going to “level the playing field”.
What does this mean?
Well, it means that Google are preparing to “clear the spam” out of their search results by penalizing sites that have over-optimized their SEO.
This brings a number of thoughts to mind… just what are they up to, and why are they doing it?
What is SEO over-optimization?
Well, as we all know, a search engine ranks websites using an algorithm that weighs many signals, and SEO is the practice of optimizing for those signals.
It’s complex, but it’s not impossible to work out which signals Google considers when ranking a website: keyword density, backlink building, invisible text, doorway pages and many more.
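Keyword density, the first signal mentioned above, is usually described as the share of a page’s words taken up by a target keyword. As a rough illustration only (Google has never published how, or whether, it computes this internally), here is a minimal sketch; the function name and the sample text are invented for the example:

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of total words, as a percentage.

    Illustrative only -- this is the common folk definition of
    "keyword density", not anything Google has documented.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

sample = "SEO tips: good SEO is about content, not keyword stuffing."
print(keyword_density(sample, "seo"))  # 2 of 10 words -> 20.0
```

A keyword-stuffed page pushes this number far above what natural writing produces, which is exactly the kind of statistical fingerprint an algorithm can spot.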
Most of these techniques fall under what we call black-hat SEO. Still not sure? Read about how garbage ranks in search engines.
For normal bloggers, who just work on quality content, SEO can seem like a tough task, and Google is trying to level the playing field by penalizing websites that over-optimize.
Matt Cutts mentioned this in a comment at SXSW. In simple terms: in the coming days, Google will be rolling out a major search algorithm change in which sites engaged in aggressive SEO will be penalized. Bad times for black-hat SEOs.
As Amit Singhal wrote on Google+:
Let me just say that every day, we’re improving our ability to give you the best answers to your questions as quickly as possible. In doing so, we convert raw data into knowledge for millions of users around the world. But our ability to deliver this experience is a function of our understanding your question and also truly understanding all the data that’s out there. And right now, our understanding is pretty darn limited. Ask us for “the 10 deepest lakes in the U.S,” and we’ll give you decent results based on those keywords, but not necessarily because we understand what depth is or what a lake is.
In 2010, we acquired Freebase, an open-source knowledge graph, and in the time since we’ve grown it from 12 million interconnected entities and attributes to over 200 million. Our vision for this knowledge graph is as a tool to aid the creation of more knowledge — an endless cycle of creativity and insight.
Bad search results = more advertising revenue
Forgive my cynicism, but I’m only sharing what Larry Page and Sergey Brin said in their 1998 research paper: advertising income often provides an incentive to provide poor quality search results. So, in many ways, the spammers have been doing Google a favour. Instead of relying on those top 10 search results, users will hit the adverts at the top, contributing to the $20bn plus the firm makes from Google Adwords.
But good search results = more users
Here in the UK, Google dominates completely, but in the US it’s a slightly different picture with Google only dominating massively. There’s still something called Yahoo (remember them?) and Bing (yes, Bing, as in Chandler!) This competitive market sees Yahoo and Bing constantly improving their own search results, and in theory, people switching search engines because the Google results are so polluted by spam. Or, as Google now calls them, over-optimised sites.
But what is over-optimised?
We don’t know. We all have opinions about what is over-optimised, and they range from keyword stuffing to spammy link building tactics (forum signatures, blog comments, social bookmarks, etc.) But Google has already been acting on this, hasn’t it? What it now appears to be targeting is sites that overuse keywords, sites that build too many links, and sites whose content doesn’t answer user queries.
But how does a robot know what’s good and bad?
Again, we don’t know. And here’s the problem: the axe hangs over many sites, and that’s a good thing. The axe should fall on websites that rank because they have dropped links on splogs and forums, and on those that have broken Google’s Webmaster Guidelines. But an algorithmic change is a swinging, indiscriminate axe: how can a robot know whether a website is good for users or not?
Well, think like a robot for a second:
- What if the website has a low bounce rate? Surely that means that users are engaging with it.
- What if the website has been shared many times on Google Plus or Facebook? Surely that means that users like the content.
- What if the website has an ‘about us’ section with profiles of real people, with social profiles too? Surely that means that it’s legit.
- What if the website has an FAQ section? Surely that means they’re providing quality content that relates to users’ queries.
- What if the website has backlinks from ‘people in the know’ – industry experts, etc. Surely that means they’re good at what they do.
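Thinking like a robot really does come down to something this mechanical: turn each observable signal into a number, weight it, and sum. Here is a deliberately toy sketch of that idea — every signal name and weight below is invented for illustration and has nothing to do with Google’s actual (unpublished) algorithm:

```python
# Hypothetical quality signals and weights, invented for illustration.
# Google's real ranking signals and weights are not public.
WEIGHTS = {
    "low_bounce_rate": 2.0,      # users engage rather than bounce
    "social_shares": 1.5,        # shared on Google+ / Facebook
    "real_author_profiles": 1.0, # 'about us' with real people
    "faq_section": 0.5,          # content answering user queries
    "expert_backlinks": 3.0,     # links from people in the know
}

def quality_score(signals):
    """Sum the weights of whichever boolean signals are present."""
    return sum(WEIGHTS[name] for name, present in signals.items() if present)

site = {
    "low_bounce_rate": True,
    "social_shares": True,
    "real_author_profiles": False,
    "faq_section": True,
    "expert_backlinks": False,
}
print(quality_score(site))  # 2.0 + 1.5 + 0.5 = 4.0
```

The point of the sketch is the weakness it exposes: every one of these signals can be faked (bought shares, fabricated profiles), which is exactly why an indiscriminate algorithmic axe will cut down some good sites and miss some bad ones.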
Some good websites will fall in a few weeks’ time, and some bad ones will rise. The indiscriminate swinging axe of Google will take out many of the poor quality sites that pollute its SERPs, but there will be casualties.
So what can webmasters do?
First of all, if you’re providing your users with good quality, unique content, you should be OK. If not, get writing. Prove to Google that you’re real and that you have something interesting to say. Prove to Google that you’re part of a community and that the community is interested in what you say. Who’s to say that Google isn’t looking at Twitter information such as retweets? Who’s to say that Google isn’t giving you a few extra places up the rankings if you’re using Google+?
It’s not time to stop thinking about keywords, but it is time to start thinking differently. If you have a trophy keyword, put it to one side and start concentrating on associated keywords. Remember those long-tail phrases from your Google Analytics account? They’re worth their weight in gold now.
It’s time to stop paying that link-builder who comes back with forum links and blog comments, and most definitely time to stop paying that link-builder who won’t tell you where your links are because it would “harm the integrity of the blog network”. It’s probably time to hire an SEO who can write. Preferably one who can schmooze, too.
If Google is serious about improving the quality of its search results (and we have had reason to believe it isn’t in the past), then remember, it likes:
- Real people
- Popular people
- Something interesting to read
Quite simple, really.
Does this mean that the white-hat SEOs have won?
The white hats keep winning battles, but they never seem to win the war. After all, why is it that these content-packed websites sit at number 5 while the splogging spammer sits at number 2? Black hats will shift their focus quickly, and we need to watch what they do – because what they do will quickly become Google’s next target. Equally, they carry out some quite detailed analysis on what makes sites rank well – you can learn a lot from a black hatter.
They are already offering Google +1s and Facebook “likes” en masse, so what’s next? Some kind of advanced content scraping, perhaps, or better spam websites with fake profiles and fake communities. There are no limits to a black hat imagination, so join the forums and lurk!
One final thought: wouldn’t it be great if Google weren’t the dominant search engine? This “hanging axe” behavior is narcissistic at best, and at worst is power-crazy. Why announce that you’re going to hit people over the next few weeks when you can just go and punch them anyway? Google is acting like the school bully, seemingly gaining pleasure at the sight of the other kids cowering in fear whenever he walks in the room, promising to slap somebody – but not saying exactly who.
Also, check out this recent video posted by the Google Webmaster team on common mistakes in SEO:
The phrase “we pray at the altar of Google” has taken on an altogether more sinister angle.
I would love to know: what do you think about the SEO over-optimization penalty? And how is Google going to tell the difference between white-hat, grey-hat, and black-hat SEO?