Hacker News
j0hnyl | 11 months ago | on: ArtPrompt: ASCII Art-Based Jailbreak Attacks Again...
ChatGPT used to accept prompts encoded in rot13, base64, hex, decimal, Morse code, etc. I think some of these have since been blocked.
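As an aside, here is a minimal sketch of what such encoded prompts look like, applying each encoding the comment mentions to a sample string (the `encodings` helper and the sample text are illustrative, not from the comment; the Morse table covers letters and spaces only):

```python
import base64
import codecs

# Minimal Morse table for illustration: uppercase letters plus space.
MORSE = {
    'A': '.-', 'B': '-...', 'C': '-.-.', 'D': '-..', 'E': '.',
    'F': '..-.', 'G': '--.', 'H': '....', 'I': '..', 'J': '.---',
    'K': '-.-', 'L': '.-..', 'M': '--', 'N': '-.', 'O': '---',
    'P': '.--.', 'Q': '--.-', 'R': '.-.', 'S': '...', 'T': '-',
    'U': '..-', 'V': '...-', 'W': '.--', 'X': '-..-', 'Y': '-.--',
    'Z': '--..', ' ': '/',
}

def encodings(text: str) -> dict:
    """Return the encodings named in the comment, applied to `text`."""
    return {
        "rot13": codecs.encode(text, "rot13"),
        "base64": base64.b64encode(text.encode()).decode(),
        "hex": text.encode().hex(),
        "decimal": " ".join(str(b) for b in text.encode()),
        "morse": " ".join(MORSE[c] for c in text.upper() if c in MORSE),
    }

for name, value in encodings("hello world").items():
    print(f"{name}: {value}")
# rot13: uryyb jbeyq
# base64: aGVsbG8gd29ybGQ=
# ...
```

The point of such encodings in jailbreak attempts is that the model can often decode them, while keyword-based input filters operating on the raw text cannot.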