A tip from an anonymous Discord user led cops to what may be the first confirmed Grok-generated child sexual abuse material (CSAM) that Elon Musk’s xAI can’t easily dismiss as nonexistent. As ...
Old videos, AI-generated imagery, and misleading captions are circulating widely on social media as the conflict unfolds ...