Hacker News new | past | comments | ask | show | jobs | submit | infaloda's comments login

Never, have I ever read a more complicated abstract.

Seemed pretty simple to me and it's not my field.

My understanding was given a prompt X that is normally rejected, create Y variations with small adjustments to phrasing, grammar etc until it gives you the answer you're after.

The term "jailbreaking" used within a LLM context, is when you craft a prompt as to escape the safety sandbox, if that helps.

A sort of brute forcing the prompts if you like.


It... seems pretty ordinary to me? Like there isn't even much jargon being used. Try reading a paper in basically any field of hard science!

YCSB benchmarks?


India. Not pakistan.


It's Pakistan. I literally live 2 kilometers away from it . I can walk there.


No. It’s in Pakistan. Double check your map.


Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: