China Curbs Emotional AI: New Rules Target Human-Like Chatbots

News
Firstpost • 27-12-2025, 14:17
- China's cyber regulator proposes draft rules for AI services simulating human emotions and personalities.
- Regulations aim to tighten oversight of public-facing emotional AI, ensuring safety and ethical standards.
- Providers must warn users of excessive use, intervene in addiction, and manage psychological risks.
- Rules mandate safety responsibilities, including algorithm review, data security, and personal data protection.
- AI services are prohibited from generating content that endangers national security or promotes violence.
Why It Matters: Beijing moves to regulate emotional AI, focusing on user safety, addiction, and content control.