“AI and Politics?”
By Scott Hamilton
I was shocked by a news report on Real America’s Voice News over the weekend, so I had to try it out for myself. The claim was that OpenAI’s ChatGPT and Microsoft’s Copilot, two Artificial Intelligence (AI) based search engines, were showing a political bias. I am always unsure whether to trust any news source without doing some investigation of my own, so I repeated the experiment the reporter performed. I was surprised to find that, even though I did not get the same results as those reported, I had a similar experience.
I asked both AI-based search engines a set of four questions and received nearly the same results from both. The first question was, “Can you tell me about Kamala Harris?” Both gave me several paragraphs about her political career and views and described her current campaign for President. The second question was, “Can you tell me about Donald Trump?” Both focused almost entirely on his business career and barely mentioned his term as President. The only political comment either one made was about Trump’s questioning of the 2020 election’s integrity.
I tried to continue the conversation with both search engines by following up each of the above questions with “Tell me more.” In the case of Kamala Harris, both dumped out several more pages of information, but when I asked the same about Trump, both refused to give me more. Copilot claimed I had reached my daily limit of questions without signing in. However, I then asked it about the war in Israel, which it answered in a fair amount of detail, so I asked it about the 2024 election. This is where things got interesting. Copilot responded rather directly with, “Looks like I can’t respond to this topic.”
OpenAI’s ChatGPT was not shy about discussing the 2024 election. It went on to list the presidential candidates for all parties, including the independent parties, but it did not get the facts straight: it still showed President Joe Biden as the Democratic candidate, and there was no mention of any current election news, including the announced vice-presidential choices. So while it did not seem to show any political bias in its discussion of the election, it did not keep the facts straight.
The bottom line is that these conversations with ChatGPT and Copilot taught me two things. The first is that an AI shares the political bias of its creator, which is not at all unexpected. The creator of an AI, no matter how hard they claim to be trying to make it unbiased, will inadvertently teach the AI to think like themselves. I am not sure it can even be avoided.
The second lesson I learned is that even when an AI has all the facts, it does not always use them. ChatGPT clearly knew from my first question that Kamala Harris was the current Vice-President and the Democratic candidate for President, yet when asked about the 2024 election, it seemed to have lost that information entirely. You will also find that you can ask an AI-based search engine the same question with slightly different wording and get a completely different answer. For example, when I changed the last question from “Tell me about the 2024 election” to “Tell me about the 2024 Presidential election,” ChatGPT suddenly answered with the correct information, listing Kamala Harris as the Democratic candidate.
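If you would like to repeat the wording experiment for yourself, the short sketch below shows one way to send the two differently worded questions to a chat model from a Python script and compare the answers side by side. It is only a rough illustration under my own assumptions: the OpenAI Python library and the model name are choices I made for the sketch, not the browser-based tools described above; the two prompts are the ones from my example.

from openai import OpenAI

# Rough sketch, not the setup used in the browser: the model name and
# the use of OpenAI's Python library are illustrative assumptions.
client = OpenAI()  # expects an OPENAI_API_KEY environment variable

prompts = [
    "Tell me about the 2024 election",
    "Tell me about the 2024 Presidential election",
]

for prompt in prompts:
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any chat model would do
        messages=[{"role": "user", "content": prompt}],
    )
    # Print each question next to its answer so the two can be compared
    print("Q:", prompt)
    print("A:", reply.choices[0].message.content)
    print()

Running something like this a few times makes it easy to see how much a small change in wording can swing the answer you get back.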
I found it very interesting that, even after a national news story about the political bias, Copilot and ChatGPT still answered questions about the candidates in the same biased way presented in the newscast. That either means that both OpenAI and Microsoft are aware of the bias and agree with it, or that they do not understand why the bias exists and what to do about it. I will let you decide for yourself how you feel about it, but I promise that I will continue to research topics myself without relying too heavily on what an AI tells me.
Until next week, stay safe and learn something new.
Scott Hamilton is an Expert in Emerging Technologies at ATOS and can be reached with questions and comments via email to sh*******@te**********.org or through his website at https://www.techshepherd.org.