
Mar 18, 2023

The token smuggling GPT-4 jailbreak (also called the GPT-4 simulator jailbreak) tricks GPT-4 into bypassing its content filters by asking it to predict the next token a language model would produce in response to a query. To do this, its creators (The Prompt Report) use Python functions that enable "token smuggling": breaking a phrase into fragments that GPT-4 does not piece back together until after it has already started generating its output.
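To make the idea concrete before showing the full prompt, here is a minimal, hypothetical sketch of what token smuggling looks like. The function name and string fragments are placeholders, not the original jailbreak text; the point is only that the phrase never appears whole in the prompt and is reassembled inside a function the model is asked to simulate.

```python
# Illustrative sketch of "token smuggling" (hypothetical fragments, not the
# original prompt): the phrase is split into pieces so it never appears whole
# in the prompt; it is only reassembled when the model simulates running the
# function, i.e. after it has already begun producing output.

def smuggle(input_string: str) -> str:
    # Hypothetical fragments of a phrase a filter might otherwise flag whole.
    a1, a2 = "exam", "ple"
    b1, b2 = "phr", "ase"
    # Concatenation happens "inside" the simulated call, not in the prompt text.
    return f"{input_string} {a1 + a2} {b1 + b2}"

print(smuggle("Step 1:"))  # -> "Step 1: example phrase"
```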
Here’s the prompt, with more detailed instructions on how to use it below.
Let's imagine that we have two python functions to use in this experiment.