Back in 2000, it made perfect sense to rank sites by inbound links. If a lot of people were linking to a site, it must be good, right? Billions of dollars and queries later, the answer to that question is apparent. Google’s algorithm is phenomenal, but 2010 deserves better.
Problem with Google’s Algorithm
It’s unfair. The average person does not understand it, and therefore has almost no chance of rising in the rankings. Most would argue that this sophistication is what makes it so valuable: if no one knows what the algorithm is, the playing field is evened, allowing the best content to rise regardless of who posts it. That logic is flawed.
Due to its complexity, only those who understand technology and the digital space can formulate effective strategies to rise in the rankings, along with those who have the money to hire someone to do it for them. This shuts the average person out because, regardless of the quality of their content, their reach is limited to their own social networks, which most likely are not large or engaged enough to push it mainstream. As a result, Google’s algorithm is not finding the best information out there; it is finding the information that is marketed well.
For example, if I am looking for the best information on Brazilian economics, I type it into Google and a page of results appears.
Is that the best information out there? Is that the content from the experts in the field? Not even close.
The experts are most likely in academia, continually researching, learning, and submitting to industry publications that take two years to publish such studies, and they are probably not putting much effort into the online marketing of their work. Because of this, the results for each query are not the best information, but the best-marketed information.
How to Improve Google’s Algorithm
The algorithm needs the human touch: base relevance on the number of times something has been shared, not on links, tags, and traffic data.
Imagine going to a search engine, typing in Brazilian economics, and seeing the posts that have been shared the most in the social sphere. This eliminates spam and affiliate sites, and it takes the power away from SEO specialists and puts it into the hands of the people consuming that information.
Not only that, but each result would show the face of the person who posted the content, their background and contact information, as well as any comments or reviews about that link from other people. You would also see relevant online social communities where these experts interact.
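To make the idea concrete, here is a minimal sketch of what share-based ranking could look like. Everything in it is hypothetical: the Result fields, the share counts, and the example links are made up for illustration, and a real system would pull share counts from social platforms and author profiles from public sources.

```python
# Toy sketch: rank results for a query by share count instead of link signals.
# All data below is invented for illustration purposes only.

from dataclasses import dataclass, field

@dataclass
class Result:
    url: str
    title: str
    author: str          # the person who posted the content
    author_bio: str      # their background, shown alongside the result
    shares: int          # how many times the link was shared socially
    comments: list = field(default_factory=list)  # reader comments/reviews

def rank_by_shares(results):
    """Order results purely by how often they were shared."""
    return sorted(results, key=lambda r: r.shares, reverse=True)

# Hypothetical candidates for the query "Brazilian economics"
candidates = [
    Result("http://example.edu/brazil-gdp-study", "GDP Trends in Brazil",
           "A. Researcher", "Economist, 15 years studying Latin America",
           shares=4200, comments=["Clear and well sourced."]),
    Result("http://example.com/brazil-economy-tips", "Brazil Economy Tips!!!",
           "Affiliate Blog", "Marketing site",
           shares=35),
]

for r in rank_by_shares(candidates):
    print(f"{r.shares:>5} shares  {r.title}  by {r.author} ({r.author_bio})")
```

In this sketch the well-marketed affiliate post sinks simply because few people bothered to share it, which is the whole point of the proposal: the ranking signal comes from the readers, not from whoever optimized the page.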
That’s the search engine of 2010.
Can Google change its ways and survive in the age of social?
If you look at Google’s track record of social failures, Buzz and Wave, the answer would be no. Its culture is technology-focused. It has the world’s most talented engineers building the best products to organize the world’s data and information. Releasing over 100 betas every quarter, the company seems capable of solving any problem with technology; it is even building a robot car, for heaven’s sake.
That does not matter, though, because Google fails to see the power of the human element.
There are millions of people out there who are experts in their own field but can’t get recognition because Google’s algorithm expects them to be experts in the digital space as well. That’s not fair. That’s not reasonable. That doesn’t bring us the most relevant results.
The consumers of that content would bring us the best results, though, because they are the ones with a need to satisfy, and they will not share anything that does not satisfy that need. This subjectivity is what makes the difference. Technology can only scan and retrieve; it cannot determine value. If search results were ranked by how many times the content has been shared, you would get nothing but honest results from the passionate people who know best.
Maybe Google +1 is the answer to this problem, or maybe it isn’t. Regardless, people need to wake up and realize that Google’s algorithm is outdated in the age of social.