Microsoft has tricked several generative AI models into providing forbidden information using a jailbreak technique it calls Skeleton Key.