Jailbreaking Humans vs Jailbreaking LLMs

“Jailbreaking” an LLM and convincing it to tell you things it’s not supposed to say is very similar to social engineering a human. This piece draws comparisons between the two and predicts that jailbreaking will become much more difficult as context windows grow very long.

More …

vim + llm = 🔥

If you don’t use vi/vim, you might not find this post very practical — but maybe it’ll convince you to give it a try!

More …