Published: 28 December 2022
Tags: ai, machine learning, reverse engineering
Shawn Wang attempts to reverse engineer the prompts for Notion AI features.
- Prompt injection is a type of attack in which malicious text is injected into a trusted system, with the goal of compromising the system's trustworthiness
- The article discusses the different types of prompt injection outcomes, and argues that the vast majority of prompt injection examples are harmless
- I can't believe prompt engineering is an actual thing