The Google search result for Trudy Wade, a North Carolina state senator and an ardent supporter of President Trump, showed a picture Friday with the word "BIGOT" in red. It was the second time this week that Google search results appeared to be manipulated. The first incident involved the word "Nazism" listed in a Google information box for the California Republican Party's ideology.
Google blamed "vandalism" for the California GOP incident and quickly removed the section on ideology.
By Friday evening, the picture of Wade had been removed. Google replied on Twitter to news articles about the incident, apologizing and writing that "images that appear in the Knowledge Panel are either selected by verified users or are automatically sourced from sites across the web." Google said the image was hosted on a student news blog and that, once alerted, the company immediately deleted it from the Knowledge Panel.
Wade's office did not immediately respond to requests for comment.
Both incidents were first spotted by Vice News.
On May 24, Wikipedia's publicly available change log showed that an anonymous user added the term "Nazism" to the entry for the California GOP. The change was reverted by another editor a week later. A second, similar change that afternoon was reverted a minute later.
The California GOP has formally repudiated Senate candidate Patrick Little, who has praised Adolf Hitler and advocated for limiting the representation of Jews in the government. While Little calls himself a Republican, the Republican Party says it has not endorsed any Senate candidates in California.
California Rep. Kevin McCarthy, the House Republican leader, tweeted that the Google result was a "disgrace." California Republican Party executive director Cynthia Bryant called the result "libelous" in a statement and said Google and Wikipedia should "take more ownership of what is published on their sites."
Federal law exempts internet companies from liability for user-posted material as long as they promptly remove offensive items once notified. The California GOP situation, however, underscores the risks that companies like Google face when they rely on user-generated sites like Wikipedia.
"We have systems in place that catch vandalism before it impacts search results, but occasionally errors get through, and that's what happened here," Google said last week in a statement. "This would have been fixed systematically once we processed the removal from Wikipedia, but when we noticed the vandalism we worked quickly to accelerate this process to remove the erroneous information."
Two months ago, YouTube CEO Susan Wojcicki told an audience at the SXSW technology conference that YouTube would begin relying on Wikipedia entries to create a "companion unit" that would be shown beside conspiracy theory videos that continue to spread on the service.
Wojcicki told Wired editor Nicholas Thompson at the time that the goal was to show alternative sources of information so viewers would "be able to research other areas as well."