
Ms. Toad

(37,498 posts)
14. Yes and no.
Wed Jul 23, 2025, 11:04 AM
Jul 23

It is not disobeying in the sense that it is willfully refusing to follow directions.

But it was built not only on words, but also on facts culled from the Internet, among other sources. So in the sense that it was told to give me only factual information - i.e., only information it has been provided - it is not following directions (being disobedient) when it creates things from scratch. To be clear, it was asked for specific information that is readily found on the Internet, within the time frame it claimed to be trained on. It provided some of that information but made up the rest - so it gave accurate date information for part of its answer (i.e., it wasn't just predicting the next word; it was using the facts on which it was trained), and made up other dates. It provided the correct office held by an individual (without being asked specifically about the office), but identified the wrong geographic location for that office (which both wasn't asked for - and which would have been included in the same data that identified the office itself).

So, while I agree that it is doing what it does, it is doing more than using probabilities to predict words - otherwise it would not have included some correct factual information.
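The "using probabilities to predict words" idea can be illustrated with a toy sketch. This is not how a real LLM works (real models condition on long contexts with neural networks, not one preceding word), and the vocabulary and probabilities below are invented for illustration - but it shows why a pure next-word sampler has no built-in notion of a date or place being *true*, only of it being likely:

```python
import random

# Invented toy probabilities: P(next word | previous word).
# A sampler like this can emit "in 1992" or "in 2008" with no way to
# know which year is factually correct - likelihood is all it has.
bigram_probs = {
    "elected": {"in": 0.6, "mayor": 0.3, "governor": 0.1},
    "in": {"1992": 0.5, "2008": 0.3, "Ohio": 0.2},
}

def next_word(prev, probs=bigram_probs):
    """Sample the next word from the learned distribution for `prev`."""
    words = list(probs[prev])
    weights = [probs[prev][w] for w in words]
    return random.choices(words, weights=weights, k=1)[0]
```

On this view, correct dates come out when the training data made the true continuation the most probable one, and fabricated dates come out when it didn't - which is consistent with the mix of right and wrong answers described above.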

