I see people being deceived by this again and again: ChatGPT can NOT read content from URLs that you give it, but it will pretend that it can (and can be incredibly convincing when it does that)

Constantly debunking this feels like a Sisyphean task, but it's really important to spread this message any time you see anyone falling into this (very understandable) trap

simonwillison.net/2023/Mar/10/

Here's some good news: the new GPT-4 model (only available to paying preview users at the moment) is better behaved in this regard - it appears not to pretend it can access URLs any more
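The practical workaround implied above: fetch the page yourself and paste its text into the prompt, rather than giving the model a URL. A minimal sketch of that, using only the Python standard library (the function name `fetch_page_text` is illustrative, not from the post):

```python
# Sketch: since ChatGPT cannot fetch URLs itself, download the page
# yourself, extract the visible text, and paste that into your prompt.
from html.parser import HTMLParser
from urllib.request import urlopen


class _TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0  # depth inside script/style tags

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())


def fetch_page_text(url: str) -> str:
    """Download a URL and return its visible text, ready to paste into a prompt."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = _TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)
```

With the real page text in the prompt, the model is summarizing content you supplied instead of hallucinating what might be at the URL.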

@simon also the "I'm writing examples / documentation" jailbreak must still work.
