They're so afraid of reading books in Iowa that they used AI to ban them

How can you tell if you need to ban a book if you're too afraid to read it in the first place? Make a computer read it instead.

That’s what is happening in Iowa. The state government is apparently so terrified of the books it wants to ban that officials won’t read them for themselves. So they are making AI do it instead.

 




 

A little background: in May, Iowa Governor Kim Reynolds signed Senate File 496, which enacted major changes to the state’s school curriculum. It amended Iowa’s existing code 702.17 and further restricted which books are allowed in schools and classrooms. Apparently, however, officials wanted to ban so many books that they realized they could not possibly read every single one of them.

So Iowa is getting ChatGPT to do it instead.

The AI being used to scan books is a generative model that the state has prompted to find “age-inappropriate” content and filter those books out of libraries and classrooms. What constitutes “age-inappropriate”? They are not very clear on this. They just trust the computer to figure it out for them.

 





 

Never mind that, in their current iterations, generative AI models are notoriously unreliable and often make mistakes.

Imagine, for a moment, feeling like you had to protect students from SO MANY books that you could not be bothered to read them first. That you had to make a computer do it instead.

It raises the question of what officials in the state are actually afraid of. Learning something? Becoming more empathetic?

Maybe members of the state government are just too lazy. 

It’s important to remember that AI learns from its input. The more it “scrapes” from across the internet, the more it “learns.” But the problem is that it isn’t actually creating new ideas: it’s just repeating the same things it has found in other places.

And that is on full display in what the AI decided to ban in its initial scan. The books are not surprising; they routinely appear on banned-book lists around the country every single year. Beloved by Toni Morrison. I Know Why the Caged Bird Sings by Maya Angelou. The Color Purple by Alice Walker. The Kite Runner by Khaled Hosseini.

However, remember that part about ChatGPT not always being reliable and making mistakes? State officials are finding that when it suggests a book to be banned, it cannot identify the actual passages it has deemed inappropriate, nor the page numbers where they occur. The computer is literally saying that it is pretty sure there is something inappropriate, but it cannot tell you where it is, what it is, or recite it back for review.

 





 

So: is the system working, or is the AI Iowa hired to do its reading just finding books it recognizes as being banned elsewhere and kicking them out of libraries around the state? Is it making real decisions, or is it just repeating what it has already seen?

And that is the larger problem with this system. Decisions are not being made based on the content of the books but potentially on what a computer program recognizes as patterns of bans from elsewhere. All it knows is what it is told, and it is told that these are good books to ban.

Normal people can recognize this as frankly ridiculous. It is one thing to want to ban books; it is an entirely different level of absurdity to say that you want to ban books but don’t have the time to actually read and review them yourself. This is strange, considering Iowa has a literacy rate of 85%, the tenth highest in the country.

You would think the politicians there would be able to do their own reading.