Not teaching me the technical details of chemical weapons, or the etymology of racial slurs, is indeed censorship.
Apple Intelligence won’t proofread a draft blog post I wrote about why it’s good for society to discriminate against the choices people make (and why it’s bad to discriminate against their inbuilt immutable traits).
It is astounding to me, the hand-wringing over text generators generating text, as if automated text generation could somehow be harmful.
I asked it about the White racial slur that is the same as a snack, and the one I only ever heard from George Jefferson in the '80s, and it gave an etymology for both. I said both words explicitly.
> It is astounding to me, the hand-wringing over text generators generating text, as if automated text generation could somehow be harmful.
Do you remember how easily early chatbots could go off the rails from simple prompts, without any provocation? No business wants its LLM-based service to do that.
I’ve tried very hard to get new, original ideas out of them, but the best I can see coming from them (as of now) is implementations of existing ideas. The quality of their original work is pretty low.
I hope that will change, but for now they aren’t that creative.