They train on sneer-problems now:
Here’s the “ferry‑shuttle” strategy, exactly analogous to the classic two‑ferryman/many‑boats puzzle, but with planes and pilots
And lo and behold, singularity - it can solve variants that no human can solve:
https://chatgpt.com/share/68813f81-1e6c-8004-ab95-5bafc531a969
Two ferrymen and three boats are on the left bank of a river. Each boat holds exactly one man. How can they get both men and all three boats to the right bank?
If there’s only one ferryman, the puzzle becomes trivial (he just shuttles each boat one by one).
Does he now?
The rules did not mention the ferryman couldn’t use the stargate on each bank. The rules exclude teleporting, but a stargate works via a different method.
no-one said he couldn’t be yoda
Hmm, maybe that was premature - chatgpt has history on by default now, so that could be where it got the idea it was a classic puzzle?
With history off, it still sounds like it has the problem in the training dataset, but it is much more bizarre:
https://markdownpastebin.com/?id=68b58bd1c4154789a493df964b3618f1
Could also be randomness.
Select snippet:
Example 1: N = 2 boats
Both ferrymen row their two boats across (time = D/v = 1/3 h). One ferryman (say A) swims back alone to the west bank (time = D/u = 1 h). That same ferryman (A) now rows the second boat back across (time = 1/3 h). Meanwhile, the other ferryman (B) has just been waiting on the east bank—but now both are on the east side, and both boats are there.
Total time
$$ T_2 = \frac{1}{3} + 1 + \frac{1}{3} = \frac{5}{3}\ \mathrm{hours} \approx 1\,\mathrm{h}\,40\,\mathrm{min}. $$
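For what it’s worth, the sums in that snippet do check out, even if the logistics don’t. A throwaway Python check using the bot’s own stated times (1/3 h per rowing crossing, 1 h per swim):

```python
from fractions import Fraction

# The bot's numbers: rowing takes D/v = 1/3 h, swimming back D/u = 1 h.
row = Fraction(1, 3)
swim = Fraction(1, 1)

# Its claimed sequence: cross, swim back, cross again.
total = row + swim + row
print(total)             # 5/3 hours
print(total * 60)        # 100 minutes, i.e. 1 h 40 min
```

So the arithmetic is the one part it got right; the teleporting ferryman is the other part.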
I have to say with history off it sounds like an even more ambitious moron. I think their history thing may be sort of freezing bot behavior in time, because the bot sees a lot of past outputs by itself, and in the past it was a lot less into shitting LaTeX all over the place when doing a puzzle.
Can someone explain the origin of this problem for someone ootl?
River crossing puzzles are a genre of logic problems that go back to the olden days. AI slop bots can act like they can solve them, because many solutions appear in their training data. But push the bot a little harder, and funny things happen.
That makes sense. So it’s just an obviously bullshit river crossing puzzle that the chatbot is calling a classic problem.
Pretty much. Our friend up top (diz/OP) has made a slight hobby of poking the latest and greatest LLM releases with variants of these puzzles to try and explore the limitations of LLM “cognition”.
This is obviously a math olympiad gold medal performance, Fields medal worthy even!
Gonna take a wild guess that enough people have tried this joke to the point where they’ve added custom code or lines in the system prompt about it:
prompt and outpoo
Now we need to make a logic puzzle involving two people and one cup. Perhaps they are trying to share a drink equitably. Each time they drink one third of remaining cup’s volume.
Step one: Drink two-thirds of the cup’s volume
Step two: Piss one sixth of the cup’s volume
Problem solved
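Incidentally, if you take the stated rule literally (alternating sips, one third of whatever’s left each turn), the split is never equitable anyway - a quick sketch, with the hypothetical assumption that drinker A sips first:

```python
from fractions import Fraction

cup = Fraction(1)               # full cup
a = b = Fraction(0)             # each drinker's running total
for turn in range(60):          # 60 turns is plenty for convergence
    sip = cup / 3               # one third of the remaining volume
    cup -= sip
    if turn % 2 == 0:
        a += sip                # A takes the even turns
    else:
        b += sip                # B takes the odd turns
print(float(a), float(b))       # A converges to 3/5, B to 2/5
```

The geometric series gives the first drinker 3/5 of the cup, so “equitably” was doomed before the pissing started.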
Two ferrymen and three boats are on the left bank of a river. Each boat holds exactly one man. How can they get both men and all three boats to the right bank?
Officially, you can’t. Unofficially, just have one of the ferrymen tow a boat.
There’s an “I am no man” joke in here somewhere that I am too tired to figure out.
Ferryman 1 calls to Gwaihir, the Lord of Eagles for aid, and The Windlord answers to fly him back across.
ferrymen ⊄ men?
Yeah that’s the version of the problem that chatgpt itself produced, with no towing etc.
I just find it funny that they would train on some sneer problem like this, to the point of making their chatbot look even more stupid. A “300 billion dollar” business, reacting to being made fun of by a very small number of people.
Some insufferable dipshit that gets paid $10,000 a week spent one of those weeks working on this.
they had a meeting about it, the meeting lasted an hour and had 8 people, and they collectively earned $1 million in that time
Gonna be really fucked if we finally discover that the people doing the actual work were true believers who didn’t even get paid well. And all the money went to altman/infrastructure costs. (I could look in financial statements to figure stuff like this out, but ugh, effort.)
Officially, you can’t. Unofficially, just have one of the ferrymen tow a boat.
Or swim back. However, the bot itself appears to have ruled out all of these options.
At first glance it seems impossible once N≥2, because as soon as you bring a boat across to the right bank, one of you must pilot a boat back—leaving a boat behind on the wrong side.
In this sentence, the bot appears to sort of “get” it (not entirely, though, the wording is weird). However, from there, it definitely goes downhill…
The downhill is honestly glorious because it seems so proud of itself when the real magic is that the boatmen can magically teleport back to the right bank under certain arcane circumstances.
Somehow, the “smug” tone really rubs me the wrong way. It is of great comedic value here, but it always reminds me of that one person who is consistently wrong yet is somehow the boss’s or the teacher’s favorite.