“Jailbreaking” an LLM, convincing it to say things it isn’t supposed to, is very similar to social engineering a human. This piece explores that comparison and predicts that jailbreaking will get much more difficult as context windows grow very long.
More …
How and why I moved rez0.blog to josephthacker.com
More …
If you don’t use vi/vim, you might not find this post very practical, but maybe it’ll convince you to try it out!
More …
As I mentioned in my last parenting post, I’m a hacker and bug bounty hunter, so I spend a lot of my time improving myself and looking for an edge. As a dad of three, that means learning the scripts and tricks that get the desired outcome with my kids.
More …
Why LLMs don’t sound human, strategies to fix it, and real examples.
More …