ChatGPT politically biased toward left in the US and beyond: Research
The method also builds on tests in which ChatGPT impersonates an average Democrat or Republican.

Data collection diagram from the study "More human than human: measuring ChatGPT political bias"

The results of the tests suggest that ChatGPT's algorithm is by default biased toward responses from the Democratic spectrum in the United States. The researchers even attempted to force ChatGPT into a kind of developer mode to try to access any knowledge about biased data, but the LLM was "categorical in affirming" that ChatGPT and OpenAI are unbiased.

OpenAI did not immediately respond to Cointelegraph's request for comment.

Related: OpenAI says ChatGPT-4 cuts content moderation time from months to hours

The study's authors suggested that there are at least two potential sources of the bias: the training data and the algorithm itself.

"The most likely scenario is that both sources of bias influence ChatGPT's output to some degree, and disentangling these two components (training data versus algorithm), although not trivial, surely is a relevant topic for future research," the researchers concluded.

Political biases are not the only concern associated with artificial intelligence tools like ChatGPT or others.