Free Republic
General/Chat

To: RoosterRedux

This one had no problem with it: https://www.quipqiup.com/

I test all my posts there first. Sometimes, like yesterday, I failed to catch problems when proofing the solutions. What happened yesterday was the quote was too long for: http://www.rinkworks.com/brainfood/p/cryptmaker1.shtml so I used my DIY Octave script that puts in line breaks without keeping words (delimited by spaces) together, and you get what happened yesterday. It’s a bug I should fix. (Actually, output to a text file with line feeds replaced by < p>< lf>< cr>)


8 posted on 04/07/2024 5:15:52 AM PDT by Lonesome in Massachussets (Perdicaris alive or Raisuli dead!)


To: Lonesome in Massachussets
As someone who follows AI very carefully, I was shocked that both Claude and ChatGPT got it wrong today. I carried on quite a long conversation with both of them about it.

I will try quipqiup, but I am thinking of contacting the developers of Claude (Anthropic) and ChatGPT (OpenAI) to warn them of the problem. Good AI systems should make easy work of cryptograms.
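
For context, these puzzles are monoalphabetic substitution ciphers, so solving one amounts to finding the right 26-letter key. Here is a minimal Octave sketch of applying a candidate key, the kind of mapping quipqiup searches over (apply_key and its argument names are illustrative):

    function plain = apply_key(cipher, key)
      % key is 26 chars: key(1) is the plaintext for ciphertext 'A', etc.
      plain = cipher;                  % spaces and punctuation pass through
      idx = isletter(cipher);
      up  = upper(cipher(idx));
      plain(idx) = key(up - 'A' + 1);
    end

    % e.g. apply_key('URYYB', ['N':'Z' 'A':'M']) returns HELLO (a ROT13 key)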

I didn't mention it, but I had the same problem with Copilot, Microsoft's AI. It was completely confused by today's cryptogram and tried to fool me by providing the answer to the MLK quote from earlier in the week.

16 posted on 04/07/2024 6:27:13 AM PDT by RoosterRedux (A person who seeks the truth with a closed mind will never find it. He will only confirm his bias.)
