Alphabet's Eric Schmidt: It can be 'very difficult' for Google’s search algorithm to understand truth
By Catherine Clifford November 21, 2017
In the United States' current polarized political
environment, the constant publishing of articles with vehemently opposing
arguments has made it almost impossible for Google to rank information
properly.
So says billionaire Eric Schmidt, executive chairman of Google's
parent company, Alphabet, speaking at the Halifax International Security Forum
on Saturday.
"Let's say that this group believes Fact A and this
group believes Fact B and you passionately disagree with each other and you are
all publishing and writing about it and so forth and so on. It is very
difficult for us to understand truth," says Schmidt, referring to the
search engine's algorithmic capabilities.
"So when it gets to a contest of Group A versus
Group B — you can imagine what I am talking about — it is difficult for us to
sort out which rank, A or B, is higher," Schmidt says.
Ranking is the holy grail for Google. When there is broad consensus
on a topic, Schmidt is confident in the algorithm's ability to lower
the rank of information that is repetitive, exploitative or false. In those cases,
if a search turns up a piece of incorrect or unreliable
information, it is a problem Google should be able to address by tweaking
the algorithm, he says.
"I view those things as bugs as a computer
scientist, so if you are manipulating the information and then our system is
not doing a good enough job of properly ranking it ... as a computer scientist,
I can tell you, this stuff can be detected," says Schmidt.
The problem comes when diametrically opposed viewpoints
abound — the Google algorithm cannot identify which is misinformation and which
is truth.
That's the rub for the tech giant. "Now, there is a
line we can't really get across," says Schmidt.
Since the election of President Donald Trump, major tech
companies have been condemned for their role in spreading misinformation and
fake news. Representatives from Facebook, Twitter and Google were all hauled
before congressional lawmakers for hearings over the role their platforms
played in Russian operatives' efforts to influence the 2016 election.
However, platforms like Facebook and Twitter have a
different issue, sometimes referred to as the "Facebook bubble" or as
an echo chamber. Because those companies' algorithms rely, at least in part, on
things like "friends" and followers to determine what's displayed in
their news feeds, the users themselves are part of the problem.
"That is a core problem of humans that they tend to
learn from each other and their friends are like them. And so until we decide
collectively that occasionally somebody not like you should be inserted into
your database, which is sort of a social values thing, I think we are going to
have this problem," the Alphabet boss says.