The unequal treatment of demographic groups by ChatGPT/OpenAI content moderation

This is a very interesting article that shows how ChatGPT’s moderation treats different groups differently. If you don’t know what ChatGPT is, you can probably find out from listening to the recent Sound of Sanity episode (I haven’t listened yet, but I assume they explain it).

Spoiler:
You’re allowed to insult men and Republicans more than women and Democrats.

But I’m not sure this reveals the slant of ChatGPT itself, which the author’s other article here does:

ChatGPT is a fascinating tool, and a bit creepy. If it becomes the norm for getting answers to questions, which wouldn’t surprise me, then I’d probably be even more concerned about letting kids use it than I would about letting them simply do Google searches.

1 Like

No way can you use it for accurate answers. I grilled it on The Lord of the Rings and The Silmarillion and it just made stuff up. It might have sounded plausible to someone less familiar with the lore and events, but it was spouting nonsense. Maybe it would be more accurate for topics users ask about more often than Tolkien, I don’t know. But I, for one, don’t take it seriously for answers; it’s just a fun way to get the stupidest misinformation. I hope people aren’t trying to use it in place of Google. That seems like a good way to get misinformed.

2 Likes

It is often wrong, and confidently wrong.

Nevertheless, it’s also right a lot of the time, especially on topics that it has more and better training data for.

And it’s substantially better than Google at just giving you an answer, which is what we want a lot of the time, even if it’s wrong.

2 Likes

I just looked into whether Wolfram|Alpha was involved with it; it turns out not really. But if Wolfram’s decade of refining human input into computational queries were merged in, ChatGPT would get substantially better at what it does. Article: Wolfram|Alpha as the Way to Bring Computational Knowledge Superpowers to ChatGPT—Stephen Wolfram Writings
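Just to illustrate what such a merger might look like in practice, here’s a minimal sketch of the idea (not anything from Wolfram’s article): send questions that look computational to Wolfram|Alpha’s Short Answers API and everything else to ChatGPT. The app ID, the API key, and the keyword-based routing rule are all placeholders I made up for the example.

```python
# Toy sketch of a Wolfram|Alpha + ChatGPT bridge (not production code).
import requests
import openai

WOLFRAM_APPID = "YOUR-WOLFRAM-APPID"  # placeholder
openai.api_key = "YOUR-OPENAI-KEY"    # placeholder

def ask_wolfram(question: str) -> str:
    # Wolfram|Alpha's Short Answers endpoint returns a one-line plain-text result.
    resp = requests.get(
        "https://api.wolframalpha.com/v1/result",
        params={"appid": WOLFRAM_APPID, "i": question},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.text

def ask_chatgpt(question: str) -> str:
    # Standard chat completion call (openai python library, pre-1.0 style).
    completion = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": question}],
    )
    return completion.choices[0].message["content"]

def ask(question: str) -> str:
    # Made-up heuristic: anything with digits or "how many" smells computational,
    # so try Wolfram|Alpha first and fall back to ChatGPT if it can't answer.
    looks_computational = any(ch.isdigit() for ch in question) or "how many" in question.lower()
    if looks_computational:
        try:
            return ask_wolfram(question)
        except requests.HTTPError:
            pass
    return ask_chatgpt(question)

print(ask("What is 2^64?"))                        # should route to Wolfram|Alpha
print(ask("Who forged the One Ring, and where?"))  # should route to ChatGPT
```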

For fun, I asked it to write a one-page summary about the dissociation of the Hepatitis B virus, which was part of the subject of my dissertation. It was spot on.

1 Like

These days, Google’s top result is also often confident but wrong.

Google was a miracle in its early days, but any metric that gets turned into a goal (like PageRank ==> Search Engine Optimization) rapidly gets gamed.

However, I expect that if Google returns a bad result, someone at Google can explain why. I don’t know whether that’s true for ChatGPT.

3 Likes

ChatGPT may well have read your dissertation and just be referencing your work without citing you.

2 Likes

If it has access to ProQuest then that is certainly possible.

I’ve tried it out on topics related to my own Ph.D. dissertation, and it does a pretty poor job. In a few cases it gave information completely opposite to the truth, and most of the rest of the time it mentioned correct information that was only tangentially relevant. It very much reminded me of the sort of answer I get on an exam from a student who doesn’t really understand the concept but throws up a bunch of information somewhat related to the topic of the question in the hope that something will stick to the wall.

But perhaps it is unfair to expect it to correctly describe Ph.D.-level material. So I asked it simpler questions from old homework assignments for an introductory science class I taught to non-majors. Sometimes ChatGPT was right on, but often it was wrong. My impression is that if a question is asked with the phrasing that is in explainer blurbs on the internet, ChatGPT will provide a stellar answer. But if a question is asked in a somewhat different form, or is asking about a slightly different concept, then ChatGPT will not provide a wholly correct answer.

I guess one outcome of the arrival of ChatGPT is that credit will be given only for work done in class (my policy since the arrival of “instant tutor” services on the internet). But if credit is going to be awarded for work outside of class, then it is going to have to be bang on, with no mercy in grading for answers that have some correct information but don’t really answer the question.

5 Likes

I’ve tried it in German a bit, but I’ve come to the conclusion that it just says what the average mind will say when it reads the Internet all day :grin:

5 Likes

Faithful preaching constantly sharpens the distinction between truth and error, but from what I’m hearing, ChatGPT constantly blurs the distinction between truth and error, stating subtle errors so plausibly and smoothly that they hide among the many obvious truths.

3 Likes

Indeed.

If you’d like a thorough demonstration of the political bias of this AI thang, look here.

Just one of the examples expounded in this link is a fawning poem which the bot wrote about Hunter Biden (two cringe-worthy stanzas are reproduced). But when asked to write a poem about Marjorie Taylor Greene, the bot declined because “she is a controversial figure and it’s not appropriate for me, an AI language model, to take a partisan stance or express opinions.”

1 Like

Yeah, when I visited the ChatGPT Discord server to ask why it gave such stupidly wrong answers to the Lord of the Rings questions I asked it, the people there said you have to ask it in the right way to get the right answer. That seems silly to me, because it means you have to know the right answer in the first place just to know how to ask the question, and most people looking for answers aren’t going to do that. With Google, you might have to refine your search to find related articles, but you can trust that the articles are at least generally reliable. I wouldn’t trust ChatGPT to provide me with correct information if it can’t even answer questions about Lord of the Rings that anyone who has read the books could answer easily.

2 Likes

I noticed a bias against conservative ideas in response to questions I asked about race and sex.

Here are a few examples:

You’ll notice more or less the same phrase in the answers given about equality and justice for all. It must be pulling this line from somewhere.

While an amazing technology, I fear that this will be used for evil more than for good.

3 Likes

The answer is simply the Zeitgeist. Whatever has been fed into it the most is what it will spit back out.

Garbage in, garbage out.

5 Likes

I think Gab is working on a "Christian AI."

I asked ChatGPT, “Please write an hour long sermon on the necessity of saving faith in the style of Jonathan Edwards.”

It… didn’t do what I asked. However, it did produce this:

“The Necessity of Saving Faith: A Sermon in the Style of Jonathan Edwards”

Prelude:

Dear brothers and sisters in Christ, we gather here today to reflect upon a most vital aspect of our faith - the necessity of saving faith. I stand before you today, as an unworthy servant of the Lord, with a heavy heart, to deliver this sermon on a topic that is of utmost importance to our salvation.

Introduction:

The Bible teaches us that faith is the key to our salvation. It is through faith in Jesus Christ that we are justified and made righteous in the eyes of God. But what exactly is faith? And why is it so necessary to our salvation? These are the questions that we will be exploring today, as we delve into the teachings of one of the greatest preachers of all time, Jonathan Edwards.

Body:

Jonathan Edwards, in his famous sermon “Sinners in the Hands of an Angry God,” emphasized the necessity of saving faith. He spoke of faith as a grace that comes from God, and he exhorted his listeners to seek it out with all their hearts. Edwards argued that faith is not merely a mental assent to certain beliefs, but it is a heartfelt trust in the person of Jesus Christ, and a surrender of our lives to Him.

Saving faith, Edwards said, is a supernatural work of God in our hearts, that transforms us from the inside out. It is through faith that we are born again and made new creatures in Christ. Faith is the key that opens the door to heaven, and it is the means by which we receive the promise of eternal life.

Edwards went on to say that faith is the evidence of our salvation. It is the proof that we are truly saved, and that we have been justified in the eyes of God. He warned his listeners that it is possible to have a false faith, a faith that is not rooted in the truth of God’s word, but in our own works and achievements. This, Edwards said, is a dangerous and deceptive faith, for it leads to destruction and not salvation.

Conclusion:

Dear brothers and sisters, let us take heed of the words of Jonathan Edwards and seek out saving faith with all our hearts. Let us pray for the

I think I cut it off at the end. Anyway, thought you’d all get a kick out of that.

3 Likes

Not sure how different ChatGPT is from Gospel Coalition.

5 Likes

With this hypothetical, you’re not far off:


3 Likes

I don’t know, that sounds exactly like Calvin to me.

4 Likes