
Google's search engine might not be telling the 'truth'


A Google exec, Danny Sullivan, recently spoke out about Google's search algorithm not being the mystical oracle that many of us want to believe it is. Danny is the new search engine liaison for Google, which is a strange title, but a much-needed one these days given the recent issues around user privacy and security.

"'We're not a truth engine. ... We can give you information, but we can't tell you the truth of a thing."

Dammit Danny, I wanted truths of things. Which means that all those times I took out my phone to prove to people how right I was might have been in vain. Proof of this isn't hard to come by. You can search for the benefits of more sleep and spend hours 'learning' about the magic of more sleep. And at the same time, you can spend hours reading through medical journals on why too much sleep is killing you. Both will probably contain many valid points and will mean different things depending on who is reading them. If I am an athlete, then sleep is super important for recovery, and if I'm a 21-year-old guy with no job, then it could probably be killing me.

So which is right? Or more relevantly, which is truth and which is less truth? Well, this is where things get subjective and go beyond logic. Because you can't both sleep more and less at the same time. That, or nobody really knows sleep well enough yet to truly have an answer, which is probably closer to the truth. And that's as close as we can get with most things really. The truthiness factor. And Google does not consider that factor when rendering search results. But should it?

truthiness - the quality of seeming or being felt to be true, even if not necessarily true.

Truth is very elusive

And the reason truth is so elusive is that it's usually so damn hard to prove. Everyone points out facts and makes correlations that sound convincing, and even our research experiments normally tilt to one side more than the other due to biases in the lab.

correlation does not imply causation

Google can only show you results relevant to you and your search history, and not much else. If you've been searching a lot about the benefits of coconut oil, then showing you results arguing the opposite wouldn't be very helpful to you. You can help Google out by searching for the 'pros' and 'cons' of coconut oil, for which you will probably have fewer results to go over. And hopefully some wonderful soul out there has done their research on these topics and written a concise article about them.

[Image: Google search results favoring coconut oil]

Compare those results to the following.

[Image: Google search results against coconut oil]

One of these is not like the other. You'd almost think coconut oil was out stealing candy from children and making fun of the elderly, the way some articles make it out to be. Obviously, both sets of results can't be 'true'. The truth here depends on the side that you are on. If you are already pro coconut oil, then you'll steer towards the first set of results. And if you are disgruntled with it, then more than likely you will use the second set to justify your views.

Google itself has no content to deal with or to validate. So Danny is right: Google can only give you results from what it deems to be a valid source. Google is just a collection of nodes on the internet that point to content that people like you or me write. And where do those people get their information? Well, many times it comes from Google too. And this is important to realize, because it is both confusing and pretty fascinating from a daily-living standpoint. The stuff that we read was read somewhere else by someone, who wrote it down for us in their own interpretation of it. And the chain can go on and on.

Our societies are opinion-based

Many things that we do daily are based on opinion, whether our own or that of others. We usually choose what to eat based on someone else's thoughts, just as we often choose where we work based on some review we see online. So we have to accept some level of uncertainty whenever we consume online content.

We also can't sit and wait for solid, 100% proof on any topic, or we'd be sitting for a long time. There is some personal responsibility that has to go into the process of determining truth. Is this true to YOU, right NOW, where you are in life? If so, then use it as you will. And if it rings of untruth, then give it time to marinate and come back to it.

So really, whenever we jump on Google we're just getting a subtle mirror of our own beliefs and current prejudices reflected back. They ring true to us, but not necessarily to anyone else. Even the way that we label planets as 'planets' is based on some form of opinion. We chose certain key criteria and called anything that met them a planet. It's a guideline that we can use to communicate more simply, but by no means is it anchored in any type of reality.

The truth police

Things that we thought were false are now turning out to be true, and vice versa. This happens all the time. And our technology can't really handle or keep up with that kind of information change. Currently there are more than likely millions upon millions of web pages out in the wild with outdated or invalid information. And no one is going to be updating that content anytime soon. But should they? Does anyone have any responsibility to do so?

Even on this blog, there have been some mistakes that readers have pointed out, which I fix and update immediately. But that's because I want relevant content for anyone who reads my site. Even sites like Wikipedia have invalid information on many topics. And while they do get corrected, it isn't an instant process and normally takes some time and cost.

Which of course raises the question of who will take charge of validating the content that is so freely distributed online, and who will be responsible for keeping said content updated.

Should 'truthiness' increase page ranking?

And now back to the heart of the matter: our reliance on Google and its moral obligation to provide 'truthful' content. And while the idea sounds somewhat far-fetched and futuristic, I feel we aren't too far from having something of the sort in place. Recently, Elon Musk suggested that journalists should be held accountable for the quality and accuracy of their content, proposing some type of application where journalists would be ranked based on their truthfulness factor. And while many in the media spoke out against it, some of us on the other side don't think it's such a bad route to take.
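To make the idea a bit more concrete, here's a minimal sketch of what blending a 'truthiness' signal into search ranking could look like. Everything in it is hypothetical: the field names, the weights, and the blend formula are made up for illustration and don't reflect how Google or anyone else actually ranks results.

# Hypothetical sketch only: blending a made-up 'truthiness' score into result ordering.
# The fields and weights below are illustrative, not from any real search engine.
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    relevance: float   # 0..1, how well the page matches the query
    truthiness: float  # 0..1, hypothetical score from fact-checking signals

def rank(results, truth_weight=0.3):
    # Order results by a weighted blend of relevance and truthiness.
    blend = lambda r: (1 - truth_weight) * r.relevance + truth_weight * r.truthiness
    return sorted(results, key=blend, reverse=True)

results = [
    Result("coconut-oil-miracle.example", relevance=0.9, truthiness=0.2),
    Result("coconut-oil-study.example", relevance=0.7, truthiness=0.9),
]

for r in rank(results):
    print(r.url)

Turn the truth_weight knob up and the 'miracle' result sinks; turn it off and relevance alone decides. The hard part, of course, isn't the blending math, it's where that truthiness number would come from in the first place, and who gets to assign it.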

