Amid the Covid-19 pandemic, negotiators are increasingly making deals and resolving disputes online. But a trend toward online dispute resolution (ODR) was already in the making before we all began to quarantine. On July 15, in a roundtable discussion titled “AI Agents Negotiating Deals and Resolving Disputes,” hosted by the International Association for Conflict Management and moderated by Northwestern University professor emerita Jeanne Brett, experts discussed how technology can help us resolve disputes effectively and efficiently.
Online dispute resolution (ODR) has “grown by leaps and bounds” in recent years, according to panelist Colin Rule, co-founder of ODR provider Modria.com. Private companies, courts, and individuals all rely on ODR to resolve many millions of disputes per year.
Among the advanced negotiation techniques used in ODR are artificial intelligence (AI) agents—computer programs that can negotiate and help resolve disputes. AI agents can mediate disputes between two or more humans, negotiate on behalf of a human negotiator, or negotiate with each other.
Modria, for example, is an online collaborative workspace where two parties can come together to resolve their dispute. The AI agent asks each party questions about their interests and preferences, then works to facilitate an efficient agreement.
Suppose that Party A tells the AI agent privately that it would like to settle a claim with Party B for $500 but would accept as little as $300. The agent then might ask Party B if it would be willing to pay $500. Party B says it will pay no more than $350. The agent can then propose a settlement of $350, a figure acceptable to both parties. The process eliminates haggling and should generate efficient outcomes, according to Rule.
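The comparison the agent makes in this example can be sketched in a few lines of code. This is a minimal illustration of the idea, not Modria’s actual algorithm; the function name and parameters are hypothetical:

```python
def propose_settlement(claimant_ask, claimant_reserve, respondent_max):
    """Suggest a settlement from privately disclosed figures.

    If the respondent's maximum offer meets or exceeds the claimant's
    reservation value, there is a zone of possible agreement, and the
    agent can propose the respondent's maximum as a figure acceptable
    to both sides. Otherwise, no settlement is proposed.
    """
    if respondent_max >= claimant_reserve:
        return respondent_max
    return None

# Party A asks $500 but would settle for $300; Party B will pay at most $350.
print(propose_settlement(500, 300, 350))  # proposes 350

# If Party A's bottom line were $400, the ranges would not overlap.
print(propose_settlement(500, 400, 350))  # proposes nothing (None)
```

Because each side discloses its limits only to the agent, neither party learns the other’s bottom line, which is what lets the process skip the haggling Rule describes.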
Disputes are often marked by power imbalances between negotiators, as in the case of a customer negotiating a dispute with a corporation, noted University of Washington professor Zoe Barsness. If one party is much more powerful, how does the AI agent manage questions of injustice and structural bias, and conduct an ethical negotiation?
“Technology can be used to reduce injustice and structural bias or to exacerbate them,” responded Rule. “We’re in a transformative moment where people are paying attention to systemic white supremacy. We have to take a step back and take a look at the dynamics of these disputes to make sure we don’t automate inequality. In many ODR formats, when we think systemically about those imbalances, we can design to overcome them.”
Negotiation training may help level the playing field—and here AI can play a role as well. In his presentation, researcher Emmanuel Johnson of the Institute for Creative Technologies at the University of Southern California described an “intelligent tutoring system” he and his colleagues created to teach negotiation skills. Johnson has launched a project focused on AI negotiation skills training that will help medical students negotiate their first salary.
A Matter of Trust
Trust can be an obstacle to the adoption of ODR, noted MIT Sloan professor Jared Curhan. Drawing on research by Roger Mayer, James Davis, and F. David Schoorman, Curhan described three main trust concerns that people are likely to have about AI agents:
- Ability-based trust: “Does the AI agent have the skill to help me with this problem, or should I seek help from a human being?” “Algorithm aversion” describes the common tendency to distrust information provided by an algorithm. At the same time, people may be more willing to trust an AI agent than a human to handle sensitive issues, such as divorce, debt, or housing disputes, noted Curhan.
- Benevolence-based trust: “Do I think this agent is trying to help me or is working against me?” People may fear that the AI agent will use the information they provide against them, especially if the ODR platform is run by their opponent.
- Integrity-based trust: “Will the agent keep its promises or defect on me?” When divulging our preferences to the other side, we might fear we can’t trust the algorithm to accurately assess what we prefer.
Using platforms created and managed by third parties—rather than by one of the parties themselves, such as a corporation—can help build trust in ODR. As the field advances, ODR designers will need to draw on advanced negotiation techniques to build useful, effective, and unbiased tools that win users’ trust.