The use of chatbots in property management has grown rapidly over the past several years. These tools now play a central role in resident communication: answering inquiries 24/7, collecting maintenance requests, and streamlining operational workflows across thousands of units.

Owners see chatbots as a way to scale support without increasing headcount. Managers rely on them to reduce repetitive tasks. Vendors market them as essential infrastructure.

But a critical question often goes unaddressed: when a chatbot provides incorrect or harmful advice, who is liable?
A Defining Legal Case: AI Output as Actionable Harm
In May 2025, a U.S. federal judge ruled that a lawsuit against Google and Character.AI could proceed after a chatbot allegedly encouraged a teenager to harm himself. The defendants argued that the chatbot’s outputs were protected under the First Amendment. The court disagreed, establishing a clear legal precedent: companies can be held accountable for the actions and consequences of their AI systems.

In the context of property management, this precedent raises substantial concerns. Consider the following scenarios:

- A resident reports what they believe is a minor leak. The chatbot advises them to monitor it. Days later, the ceiling collapses due to a burst pipe.
- A tenant asks whether their rent will auto-pay. The chatbot confirms. It does not, and the tenant receives a late fee.
- A maintenance request is triaged incorrectly by the bot, delaying response to an issue with health or safety implications.
The Core Risk: Bots Advise Based on Limited and Often Inaccurate Input
It is important to understand that chatbots make decisions based entirely on the information they receive. In property management, the individual providing that information is almost always the resident, the party least qualified to describe the nature or severity of a maintenance issue.

Bots may receive vague, incomplete, or misleading messages such as:

- “There’s a weird smell in the kitchen.”
- “The power in my bedroom just went out.”
- “I think I saw mold.”
Who Holds the Liability?
In nearly all cases, the company deploying the chatbot holds the liability for what it says.

Property Managers

If the chatbot is used as part of day-to-day resident operations, your management firm assumes responsibility for its actions.

Owners

For owner-managed portfolios using chatbot tools directly (on websites, resident portals, or messaging apps), the liability lands squarely with the ownership entity.

Technology Vendors

Unless a vendor contract includes a robust indemnity clause (which is uncommon), the vendor does not carry legal liability for chatbot advice. However, they may be drawn into litigation and suffer reputational fallout if their tool is found to be a source of harm.

Even if the chatbot includes a disclaimer, courts increasingly interpret AI systems as acting on behalf of the business. If a resident relies on its advice and suffers a loss, the business is likely to be held accountable.

DIY

For teams building their own chatbot, whether in-house or with a low-code platform, the same legal risks apply. Once a bot begins interacting with residents, it effectively becomes a front-line employee. If it gives advice, it needs training. If it makes decisions, it needs oversight. Before deploying anything, define strict boundaries, escalation rules, and response templates (a minimal sketch of such scoping follows below). A chatbot that isn’t properly scoped, logged, and audited isn’t just a tech project; it’s a liability engine in waiting.
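To make "strict boundaries" concrete, here is a minimal sketch in Python of a hard allow-list scope: the bot only responds to topics it is explicitly permitted to handle and escalates everything else to a human. The topic names, keywords, and functions are illustrative assumptions, not a prescribed implementation.

```python
# Minimal sketch: a hard allow-list scope for a resident-facing bot.
# Topic names, keywords, and handlers are illustrative assumptions.

ALLOWED_TOPICS = {
    "faq",                 # routine questions (office hours, amenities)
    "maintenance_intake",  # collecting details, not diagnosing
    "account_status",      # balances, request status
}

ESCALATION_MESSAGE = "I'm connecting you with a team member who can help with this."

def classify_topic(message: str) -> str:
    """Toy keyword classifier; a real system would use an NLU model."""
    text = message.lower()
    if any(word in text for word in ("leak", "smell", "broken", "repair")):
        return "maintenance_intake"
    if any(word in text for word in ("balance", "rent due", "status")):
        return "account_status"
    if any(word in text for word in ("hours", "parking", "amenit")):
        return "faq"
    return "unknown"

def route_to_template(topic: str, message: str) -> str:
    # Placeholder: look up a vetted, pre-approved response template.
    return f"[template response for {topic}]"

def handle(message: str) -> str:
    topic = classify_topic(message)
    if topic not in ALLOWED_TOPICS:
        # Out-of-scope input is never answered by the bot itself.
        return ESCALATION_MESSAGE
    return route_to_template(topic, message)

print(handle("My sink is broken"))            # in scope: intake template
print(handle("Can you waive my late fee?"))   # out of scope: escalates
```

The design choice worth noting: the default path is escalation, so anything the classifier cannot confidently place stays with a human rather than with the bot.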
The IrisCX Perspective: Use Bots Strategically, Not Blindly

At IrisCX, we strongly believe that chatbots have a critical role in modern property operations: they are the first line of defense in our 'Three lines of defense' intake solution. Our own product, Ask Iris, is a maintenance triage assistant built to improve efficiency and decision-making for operators and residents alike.

We believe bots should help residents understand (a sketch of this triage structure follows the list):

- Urgency – Does the issue require immediate action?
- Impact – Could the issue spread, escalate, or worsen?
- Policy – Will this trigger a service charge or need for external vendor support?
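As an illustration, a triage result along these three dimensions could be captured in a small structured record. This is a hedged sketch only; Ask Iris's actual data model is not public, and every field name and value below is an assumption.

```python
# Illustrative sketch: recording a triage assessment along the
# Urgency / Impact / Policy dimensions described above.
# Field names and enum values are assumptions, not Ask Iris's schema.

from dataclasses import dataclass
from enum import Enum

class Urgency(Enum):
    EMERGENCY = "emergency"   # act now, route to on-call staff
    SAME_DAY = "same_day"
    ROUTINE = "routine"

@dataclass
class TriageResult:
    issue_summary: str
    urgency: Urgency
    can_spread_or_worsen: bool    # Impact: could it escalate?
    likely_service_charge: bool   # Policy: chargeback to resident?
    needs_external_vendor: bool   # Policy: beyond in-house maintenance?
    escalate_to_human: bool       # always True when in doubt

# Example: a reported "minor" leak should not be triaged as routine
# without human review, because severity can't be verified from text.
leak = TriageResult(
    issue_summary="Resident reports small ceiling leak in bathroom",
    urgency=Urgency.SAME_DAY,
    can_spread_or_worsen=True,
    likely_service_charge=False,
    needs_external_vendor=False,
    escalate_to_human=True,
)
print(leak)
```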
Five Practices for Safe Chatbot Deployment
To reduce risk and improve outcomes, property managers and owners should follow these best practices.

1. Define the Bot's Responsibilities Clearly

Limit chatbot functionality to:

- Answering routine FAQs
- Collecting maintenance details
- Providing general account or status updates
2. Prohibit High-Stakes Actions

Do not allow the bot to:

- Approve or deny repairs
- Alter rent terms or fees
- Interpret lease policies
- Diagnose complicated maintenance issues (e.g., electrical faults, HVAC component failures, water intrusion, or anything requiring expert assessment)
3. Escalate Safety-Critical Topics to a Human Immediately

Route the conversation to a live team member whenever a resident mentions any of the following (a minimal routing sketch follows this list):

- Mold
- Gas
- Injuries or falls
- Flooding or leaks
- Legal issues or complaints
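Below is a minimal sketch of what keyword-triggered escalation could look like. The trigger list and function names are illustrative assumptions; a production system would pair simple keyword rules with a trained classifier and human review.

```python
# Sketch: route safety-critical messages straight to a human.
# The trigger list and handler names are illustrative assumptions.

SAFETY_TRIGGERS = (
    "mold", "gas", "injury", "fall", "flood",
    "leak", "legal", "complaint",
)

def requires_human(message: str) -> bool:
    """True when any safety-critical trigger appears in the message."""
    text = message.lower()
    return any(trigger in text for trigger in SAFETY_TRIGGERS)

def notify_on_call_staff(message: str) -> None:
    # Hypothetical alerting hook (pager, ticket, SMS, etc.).
    print(f"[ALERT] escalated to on-call staff: {message!r}")

def answer_routine(message: str) -> str:
    # Hypothetical handler for in-scope, low-risk questions.
    return "[routine template response]"

def respond(message: str) -> str:
    if requires_human(message):
        notify_on_call_staff(message)
        return ("This may need urgent attention. I'm alerting our team "
                "now; someone will contact you shortly.")
    return answer_routine(message)

print(respond("I think I smell gas in the hallway"))  # escalates
```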
4. Offer Clear Human Fallback Options

Every conversation should give residents an easy path to a person, such as:

- Scheduled call
- Email follow-up
- Emergency contact routing
5. Audit Bot Conversations Regularly

Review transcripts and outcomes for (a simple audit sketch follows this list):

- Accurate responses
- Proper escalation behavior
- No patterns of confusion or complaint
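As one possible approach, a lightweight periodic audit can flag transcripts for human review against exactly those three criteria. Everything here, from the record fields to the flagging rules, is an assumed example rather than a prescribed process.

```python
# Sketch of a lightweight transcript audit.
# All fields and flagging rules are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Transcript:
    conversation_id: str
    bot_gave_advice: bool             # did the bot direct the resident?
    escalated: bool                   # was a human brought in?
    mentioned_safety_topic: bool      # mold, gas, flooding, etc.
    resident_repeated_question: int   # proxy for confusion

def needs_review(t: Transcript) -> bool:
    """Flag transcripts suggesting missed escalation or confusion."""
    if t.mentioned_safety_topic and not t.escalated:
        return True                   # escalation behavior failed
    if t.bot_gave_advice and not t.escalated:
        return True                   # advice given without oversight
    return t.resident_repeated_question >= 2  # pattern of confusion

transcripts = [
    Transcript("c1", bot_gave_advice=True, escalated=False,
               mentioned_safety_topic=True, resident_repeated_question=0),
    Transcript("c2", bot_gave_advice=False, escalated=True,
               mentioned_safety_topic=False, resident_repeated_question=0),
]
flagged = [t.conversation_id for t in transcripts if needs_review(t)]
print(f"Flagged for human review: {flagged}")  # -> ['c1']
```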
For Vendors: Responsibility Doesn't End at Delivery
If you are building or selling chatbot technology in the property management space, it is your duty to help customers deploy it responsibly. That includes:

- Providing default guardrails and escalation logic
- Training clients on appropriate use cases
- Communicating limitations clearly in all documentation and onboarding
Know What Your Bot Is Saying
AI-powered tools can enhance operations, reduce response times, and improve the resident experience. But without structured limits and human oversight, they can also create new liability, confusion, and reputational risk.

Any chatbot that provides direction, whether about a maintenance issue, rent policy, or lease compliance, is effectively speaking on behalf of the property operator. It must be managed accordingly.

Next Steps
If you're considering deploying a chatbot or want to evaluate your current one, IrisCX can help. We offer strategic implementation services and a purpose-built maintenance triage solution that delivers smart automation with the right legal and operational guardrails in place.To learn more, visit www.iriscx.com or contact our team.Reference:- Reuters. (2025, May 21). Google, AI firm must face lawsuit filed by a mother over suicide of son, US court says. Retrieved from https://www.reuters.com/sustainability/boards-policy-regulation/google-ai-firm-must-face-lawsuit-filed-by-mother-over-suicide-son-us-court-says-2025-05-21/(Reuters, Reuters)