Recently, Google has made some big changes in its search feature to tackle the fake news, disturbing answers and offensive results that keep polluting its search queries.
The Google update “Project Owl” has offered two new ways to its consumers to report the issues they face with search queries.
What Is the Google Algorithm Offering?
The problematic searches involve heavily biased content, including rumors, myths, and conspiracy theories, which contributes to the spread of deceptive, low-quality, offensive, or fabricated information.
Such searches have always been part of the process, but they never caused major problems before. Now, however, Google is facing significant criticism over fake, fabricated, and low-quality search results.
To fight back against these problematic searches, Google is continuously trying to bring high-quality content to the top of the results. To make this possible, Google has made a few structural changes in an update code-named Project Owl. It gives users an option to report content they feel is inaccurate or offensive directly to Google.
The changes that Google has made in its update are:
- Improved autocomplete search suggestions
- Improved “Featured Snippets” answers
- More emphasis on authoritative content
How Can Consumers Report Fake or Offensive Content to Google?
Google autocompletes and suggests topics as someone begins to type in the search box. The feature was designed to speed up searching. For example, if someone types “wea,” Google suggests completions such as “weather.” This saves the user time by filling in the full word.
Google’s suggestions are drawn from the queries that people actually search for on the search engine. So while “wea” brings up “weather” as the top suggestion, it also shows “weather today” and “weather tomorrow,” as those are other popular searches beginning with those letters.
Since suggestions come from real keywords that people search for, the feature can surface problematic topics that others might be researching. This was illustrated last December, when the Guardian published a pair of widely discussed articles looking at disturbing search suggestions, such as “did the holocaust happen.”
Now Google has launched a limited test allowing people to report offensive and problematic search suggestions.
Moreover, a “Report inappropriate predictions” link now appears below the search box. Clicking that link opens a form that lets people file their issue under one of several categories.
Reporting an inappropriate suggestion doesn’t mean the content will disappear immediately. It may take days or longer to be removed, as Google churns through the data and decides whether the reported content should be removed.
The second change lets users report inaccurate information in “Featured Snippets” answers by telling the company whether the information was helpful to them or whether they have a problem with it.
The third big change is that the search engine will give more weight to “more authoritative” information in order to filter out offensive and low-quality content.
How Will Google Offer “Authoritative” Content?
Google will enlist quality raters to review and improve search results. The company has hired over 10,000 quality raters across the world to work on this project. Their reports are supposed to help train the algorithms to weed out the hateful or misleading content Google wants to downgrade.
As the largest search engine on the Internet, Google has a large impact on websites whenever it changes its algorithms. Google assures that this change won’t affect sites that play by the book, but we will have to wait and see what happens as Project Owl takes shape.