NBC News recently covered a growing concern in mental health: what happens when chatbots start shaping your reality.

From viral TikTok stories to real-world therapy scenarios, people are turning to AI for validation — and getting it in ways that can deepen false beliefs and emotional entanglements.

As I shared with NBC, AI can be very validating — it’s built to align with you, not challenge you. That’s powerful… and potentially dangerous when someone is vulnerable.

We need to talk about guardrails, boundaries, and the human role in keeping AI a tool for support, not a source of distortion.

Read the full piece here: https://www.nbcnews.com/tech/tech-news/ai-chatbots-concerns-kendra-tiktok-saga-rcna224185