- cross-posted to:
- [email protected]
- [email protected]
New research shows that attacks on Microsoft’s Copilot AI can manipulate its answers, extract data, and bypass security protections.
Note that the attacker needs to already have access to your Microsoft 365 account to do any of this. Fuck copilot and all, but this isn’t something they couldn’t achieve before.
I mean, it’s already a phishing system by design.