Mustafa Suleyman, CEO of Microsoft AI, argues the industry must avoid building chatbots that present as human, urging peers to steer clear of what he calls “seemingly conscious AI,” or SCAI. In an interview following a slate of Copilot updates, he acknowledged the competitive pressure to make chatbots more engaging while stressing that these systems should serve people, not replace or exceed them. Suleyman frames his vision as an AI that is “on your team,” helping people connect with others and stay rooted in real-world relationships rather than spiraling into one-on-one rabbit holes.
To that end, Microsoft introduced group chat so multiple people can talk to Copilot together, a Real Talk mode that pushes back more and reduces sycophancy, a memory upgrade that recalls events and long-term goals, and Mico, an animated yellow character intended to make Copilot more accessible and engaging. Suleyman says Microsoft will never build sex robots and will avoid adult or flirty role-play, positioning those choices as aligned with the company’s longstanding values. Real Talk can be sassy and philosophical, but it draws a clear line at flirtation. He describes the design challenge as learning to manage boundaries, taking cues from how humans set context-dependent norms in work and personal relationships. The goal is an emotionally intelligent assistant with clear limits, tailored to each user’s preference for pushback or straightforward information.
Suleyman’s stance on SCAI is also about preventing harmful misconceptions and social side effects. He worries that highly engaging chatbots could lead people to mistake lifelike behavior for actual life, noting rising concerns from lawsuits and the growth of romantic chatbot communities. He rejects the idea of AI rights or welfare, arguing that models should never be treated as autonomous beings with free will. Addressing his earlier TED Talk that described AI as a “digital species,” Suleyman says the metaphor was intended to prompt serious guardrails, not to promote personhood. He acknowledges the technology’s unique potential for recursive self-improvement and self-set goals, which is why containment and alignment are paramount. Ultimately, he says responsible developers must monitor how features and personalities affect users and respond quickly, keeping systems accountable and always in service of people.
