Ohio
Ohio university says all students will be required to train and 'be fluent' in AI -- The Guardian
https://www.theguardian.com/us-news/2025/jun/09/ohio-university-ai-training
Ohio State to embed curriculum teaching undergraduates how artificial intelligence can be responsibly applied
I actually think this is the right approach for dealing with these new tools - use them, teach them, understand their strengths and weaknesses.
"Ohio State has an opportunity and responsibility to prepare students to not just keep up, but lead in this workforce of the future," said the university's president, Walter "Ted" Carter Jr.
He added: "Artificial intelligence is transforming the way we live, work, teach and learn. In the not-so-distant future, every job, in every industry, is going to be [affected] in some way by AI."
Ohio State's provost, Ravi Bellamkonda, added that its AI fluency initiative will embed education about the technology throughout the undergraduate curriculum.
"Through AI Fluency, Ohio State students will be 'bilingual': fluent in both their major field of study and the application of AI in that area," he said.
. . .

anciano
(1,840 posts)
IMO, AI is a game-changing innovation that will continue to improve in its practical applications as it evolves and will indeed play an increasingly important role in our lives as we continue the transition into a futuristic cybernetic era. I have no doubt that it will eventually become a ubiquitous part of our daily lives, just like the internet has.
SheltieLover
(69,824 posts)
highplainsdem
(56,674 posts)
forced to use them for school or work. And anyone using them voluntarily is showing zero concern for all the creators of the intellectual property stolen to train the AI. IMO it's pretty similar to deciding you're just fine with benefiting from slave labor, or picking up cheap merchandise that you know was stolen.
What you're seeing happen with universities is due to tremendous pressure on them by the AI companies, with the weakest links - usually administrators - caving first.
Teachers are much more aware of how harmful AI is in education.
SheltieLover
(69,824 posts)

erronis
(20,133 posts)
It can be public domain information only. Or, in closed environments such as corporations and universities, it can consist of works to which the entity has a legal right.
No different from any other search bot since the early days of the internet.
highplainsdem
(56,674 posts)
skills they don't have, whether with writing, coding, art or music.
I'm not aware of any popular AI tool, including those used at universities, that was legally trained only on public domain and/or properly licensed work. That's a fantasy. The theft of the world's intellectual property by robber barons is the ugly reality that AI fans don't want to talk about. OpenAI even admitted in court filings that training on what's in the public domain wouldn't be adequate, and the AI companies have also made it clear it would have been too much trouble and would have taken too long to get permission to use copyrighted work. And they've also made it clear that they don't intend to pay for the training data they stole.
It's blatant theft. One of the greatest thefts in history. It's unethical to try to ignore it.
erronis
(20,133 posts)
Mine is a good way to dig through mountains of information that the old search algorithms are failing at.
Have you ever used a generative "AI" to build you a full skeleton of a Flask app based on descriptions of the problem space, the database and web layers?
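For illustration, here is roughly the kind of skeleton I mean, written out by hand: a minimal sketch of a web layer plus a database layer, assuming Flask and Flask-SQLAlchemy are installed, with "Note" as a placeholder model name I made up for the example:

```python
# Minimal Flask app skeleton: a small web layer (two JSON routes)
# over a trivial database layer (SQLite via Flask-SQLAlchemy).
# "Note" is a placeholder model name chosen for illustration.
from flask import Flask, jsonify, request
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///notes.db"
db = SQLAlchemy(app)


class Note(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    text = db.Column(db.String(500), nullable=False)


@app.route("/notes", methods=["GET"])
def list_notes():
    # Return every stored note as JSON.
    return jsonify([{"id": n.id, "text": n.text} for n in Note.query.all()])


@app.route("/notes", methods=["POST"])
def add_note():
    # Create a note from the posted JSON body.
    note = Note(text=request.get_json()["text"])
    db.session.add(note)
    db.session.commit()
    return jsonify({"id": note.id}), 201


if __name__ == "__main__":
    with app.app_context():
        db.create_all()  # create the tables on first run
    app.run(debug=True)
```

A generative tool can draft something of this shape from a plain-English description of the problem, and you then review and iterate on it yourself.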
highplainsdem
(56,674 posts)
it convenient, and you're willing to overlook the intellectual property theft that was necessary for that genAI tool to work even as well as it does - all of them are so flawed it's ridiculous that people find them acceptable - and you're also willing to overlook all the harm done by AI.
To me, that's a deal with a devil. Very much like deciding you'll be okay with slave labor if there's even a small benefit for you.
You think your convenience outweighs the rights of everyone whose work was stolen to train the AI.
erronis
(20,133 posts)
You can run your own "AI" programs on your own hardware, built using open-source models - or you can train your own models based on the data you supply. Sure, the big players are trying to scoop up all the eyeballs, but that doesn't mean there aren't a lot of applications using open-source technology and non-pirated content.
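As a minimal sketch of what running on your own hardware looks like, assuming the Hugging Face transformers library is installed and using "gpt2" purely as a small example checkpoint (any locally stored open model works the same way):

```python
# Minimal sketch of running a text-generation model entirely on local
# hardware with the Hugging Face transformers library. "gpt2" is just
# a small example checkpoint; substitute any locally stored model.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# The prompt and the weights stay on this machine; no hosted API is called.
result = generator(
    "Ohio State's AI fluency initiative will",
    max_new_tokens=40,
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```

Nothing in that loop leaves the box, which is the point: the big hosted services are one way to use this technology, not the only way.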
highplainsdem
(56,674 posts)
innocent of the vast datasets needed to train genAI. They're still unethical tools.
And we're not talking about open source in this thread.
The AI tools Ohio State currently uses include Microsoft Copilot.
And Ohio State is now in a partnership with OpenAI and will receive funding from OpenAI:
https://news.osu.edu/ohio-state-joins-nextgenai-consortium-for-breakthrough-ai-research/
Raven123
(6,743 posts)
A bit of an oxymoron. I can't think of anything that has made us less able to socialize than social media. Many just don't know how to have a decent conversation.
erronis
(20,133 posts)
It is interesting how varied the responses to these types of articles are.
I love tools and tool making. But every new version obsoletes something from before and loses its history and meanings.
A short story --
I took a group of French professionals on a private tour of the Smithsonian in DC. We stopped at the Old Castle where they had an exhibit of artifacts from 200 years earlier (the Bicentennial) and we all exclaimed how wonderfully made the surgical instruments, etc. were -- fine woods, brass fittings, velvet cases, etc.
We then went to the Air and Space Museum, which at that time had various models of space capsules, etc., and a replica of the first moon rover. From the perspective of a couple of centuries earlier, it was made of plastic, flimsy metal parts, even tin foil (mylar). Not up to the quality of the 1700s!