
erronis

(20,133 posts)
Mon Jun 9, 2025, 01:16 PM

Ohio university says all students will be required to train and 'be fluent' in AI -- The Guardian

https://www.theguardian.com/us-news/2025/jun/09/ohio-university-ai-training

Ohio State to embed curriculum teaching undergraduates how artificial intelligence ‘can be responsibly applied’

I actually think this is the right approach for dealing with these new tools - use them, teach them, understand their strengths and weaknesses.

Ohio State University has announced that all of its students will be using artificial intelligence later this year, requiring them to become fluent in combining conventional learning with AI.

“Ohio State has an opportunity and responsibility to prepare students to not just keep up, but lead in this workforce of the future,” said the university’s president, Walter “Ted” Carter Jr.

He added: “Artificial intelligence is transforming the way we live, work, teach and learn. In the not-so-distant future, every job, in every industry, is going to be [affected] in some way by AI.”

Ohio State’s provost, Ravi Bellamkonda, added that its AI fluency initiative will embed education about the technology throughout the undergraduate curriculum.

“Through AI Fluency, Ohio State students will be ‘bilingual’ – fluent in both their major field of study and the application of AI in that area,” he said.

. . .
12 replies

anciano

(1,840 posts)
1. Excellent OP....
Mon Jun 9, 2025, 01:26 PM

IMO, AI is a game-changing innovation whose practical applications will keep improving as it evolves, and it will play an increasingly important role in our lives as we transition into a futuristic cybernetic era. I have no doubt that it will eventually become as ubiquitous a part of our daily lives as the internet has.

highplainsdem

(56,674 posts)
2. Those AI tools are all illegally trained, and it's ALWAYS unethical to use them. I feel sorry for anyone
Mon Jun 9, 2025, 02:53 PM

forced to use them for school or work. And anyone using them voluntarily is showing zero concern for all the creators of the intellectual property stolen to train the AI. IMO it's pretty similar to deciding you're just fine with benefiting from slave labor, or picking up cheap merchandise that you know was stolen.

What you're seeing happen with universities is due to tremendous pressure on them by the AI companies, with the weakest links - usually administrators - caving first.

Teachers are much more aware of how harmful AI is in education.

erronis

(20,133 posts)
5. The data that the "AI" tools use does not have to be illegally obtained.
Mon Jun 9, 2025, 04:15 PM

It can be public domain information only. Or, in closed environments such as corporations and universities, it can consist of works to which the entity holds the legal rights.

No different from any other search bot since the early days of the internet.

highplainsdem

(56,674 posts)
6. It isn't primarily a search bot. Its primary use is fraud, so the users can pretend to have knowledge and
Mon Jun 9, 2025, 04:27 PM

skills they don't have, whether with writing, coding, art or music.

I'm not aware of any popular AI tool, including those used at universities, that was legally trained only on public-domain and/or properly licensed work. That's a fantasy. The theft of the world's intellectual property by robber barons is the ugly reality that AI fans don't want to talk about. OpenAI even admitted in court filings that training only on what's in the public domain wouldn't be adequate, and the AI companies have also made it clear that it would have been too much trouble and taken too long to get permission to use copyrighted work. And they've made it clear that they don't intend to pay for the training data they stole.

It's blatant theft. One of the greatest thefts in history. It's unethical to try to ignore it.

erronis

(20,133 posts)
7. We differ in our opinions. Your experience may have shown you more fraud.
Mon Jun 9, 2025, 04:36 PM

In mine, these tools are a good way to dig through mountains of information that the old search algorithms are failing at.

Have you ever used a generative "AI" to build you a full skeleton of a Flask app based on descriptions of the problem space, the database and web layers?
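For anyone who hasn't tried it, here's roughly what I mean - a minimal sketch, not the output of any particular model or prompt, assuming Flask with Flask-SQLAlchemy and a single illustrative "Item" table:

    # Minimal Flask skeleton of the kind a generative model can rough out
    # from a plain-language description: one database model, one web layer.
    from flask import Flask, jsonify, request
    from flask_sqlalchemy import SQLAlchemy

    app = Flask(__name__)
    app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///app.db"
    db = SQLAlchemy(app)

    class Item(db.Model):
        # Illustrative table; a real prompt would describe the actual schema.
        id = db.Column(db.Integer, primary_key=True)
        name = db.Column(db.String(80), nullable=False)

    @app.route("/items", methods=["GET"])
    def list_items():
        return jsonify([{"id": i.id, "name": i.name} for i in Item.query.all()])

    @app.route("/items", methods=["POST"])
    def create_item():
        item = Item(name=request.json["name"])
        db.session.add(item)
        db.session.commit()
        return jsonify({"id": item.id, "name": item.name}), 201

    if __name__ == "__main__":
        with app.app_context():
            db.create_all()  # create the tables on first run
        app.run(debug=True)

The value isn't that the code is sophisticated - it's that the boilerplate across the database and web layers gets roughed out in seconds, and you spend your time reviewing and correcting it instead of typing it.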

highplainsdem

(56,674 posts)
8. Why would I? You're choosing to use an unethical tool controlled by robber barons because you find
Mon Jun 9, 2025, 04:45 PM

it convenient, and you're willing to overlook the intellectual property theft that was necessary for that genAI tool to work even as well as it does - all of them are so flawed it's ridiculous that people find them acceptable - and you're also willing to overlook all the harm done by AI.

To me, that's a deal with a devil. Very much like deciding you'll be okay with slave labor if there's even a small benefit for you.

You think your convenience outweighs the rights of everyone whose work was stolen to train the AI.

erronis

(20,133 posts)
9. Generative "AI" and the LLMs underlying it are not all controlled by "robber barons"
Mon Jun 9, 2025, 07:11 PM

You can run your own "AI" programs on your own hardware, built from publicly available open-source models - or you can train your own models on the data you supply. Sure, the big players are trying to scoop up all the eyeballs, but that doesn't mean there aren't a lot of applications using open-source technology and non-pirated content.
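As a concrete example - and this is only a minimal sketch, assuming the open-source Hugging Face transformers library and a small, publicly downloadable model such as distilgpt2 - local generation without any cloud service looks like this:

    # Minimal sketch of local text generation: the model weights are downloaded
    # once, then everything runs on your own hardware, no cloud API involved.
    from transformers import pipeline

    # distilgpt2 is just a small example model; swap in whatever model
    # you have the rights (and the hardware) to run.
    generator = pipeline("text-generation", model="distilgpt2")

    result = generator("Universities teaching AI literacy should", max_new_tokens=40)
    print(result[0]["generated_text"])

Whether the training data behind any given model was ethically sourced is a fair question, but the point stands: the technology itself isn't locked up by a handful of companies.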

highplainsdem

(56,674 posts)
10. Those open-source models were still pre-trained on stolen IP. They're not found under a cabbage leaf,
Mon Jun 9, 2025, 07:53 PM

innocent of the vast datasets needed to train genAI. They're still unethical tools.

And we're not talking about open source in this thread.

The AI tools Ohio State currently uses include Microsoft Copilot.

And Ohio State is now in a partnership with OpenAI and will receive funding from OpenAI:
https://news.osu.edu/ohio-state-joins-nextgenai-consortium-for-breakthrough-ai-research/

Raven123

(6,743 posts)
11. I'm sure there is more to it but AI reminds me of social media
Sat Jun 14, 2025, 04:05 PM

A bit of an oxymoron. I can’t think of anything that has made us less able to socialize than “social media.” Many just don’t know how to have a decent conversation.

erronis

(20,133 posts)
12. And telephones were the death of the art of writing nice letters....
Sat Jun 14, 2025, 05:36 PM

It is interesting how varied the responses to these types of articles are.

I love tools and tool making. But every new version makes something that came before obsolete, and that thing's history and meaning are lost.

A short story --

I took a group of French professionals on a private tour of the Smithsonian in DC. We stopped at the Old Castle, where there was an exhibit of artifacts from 200 years earlier (it was the Bicentennial), and we all exclaimed over how wonderfully made the surgical instruments and other items were -- fine woods, brass fittings, velvet cases.

We then went to the Air and Space Museum, which at that time had various models of space capsules and a replica of the first moon rover. From the perspective of a couple of centuries earlier, it was made of plastic, flimsy metal parts, even tin foil (Mylar). Not up to the quality of the 1700s!
