Current State of the Art in LLM Prompt Injections and Jailbreaks