Responsible AI in Legal Services (RAILS)


Raise your hand if you have encountered a story in the last 24 hours about AI’s potential for the legal industry.

With all those opportunities have come some ethical questions about the rules of the road and the need for guardrails. “[AI] is not without its challenges. We must understand the need to address bias. We need to protect data privacy and security, and we must understand the need for transparency and explainability in many contexts,” said Jeff Ward, director of the Duke Center on Law & Tech. “The focus should be on usable guardrails that balance innovation with accountability across contexts like client engagement, courtroom proceedings, direct-to-consumer legal services, and more.”


The Duke Center on Law & Tech has launched a new initiative called Responsible AI in Legal Services (RAILS), a network of representatives from the judiciary, corporations, law firms, tech providers, and access to justice organizations. RAILS participants are developing resources through several cross-industry working groups. According to the RAILS website, the goal is “to support the responsible, ethical, and safe use of AI to advance the practice of law and delivery of legal services to all.”

Access to Justice

While RAILS attends to AI’s use in many contexts, a top priority for the initiative is expanding access to justice for people who qualify for legal aid, those who need legal services but cannot afford them, and small businesses.

“[We’re] taking an analytic approach to risk versus reality of legal AI tools that facilitate direct-to-consumer legal services to help close the access to justice gap through lower cost and increased efficiency compared to seeking traditional legal help,” said Maya Markovich, executive director of the Justice Technology Association and a member of the RAILS Steering Committee.


Changes in Laws and AI’s Role

Ward emphasized the importance of adaptability in AI tools. “We have to think about building tools in a way that accounts for changes in laws, changes in regulatory structure, changes in cultural understanding and technology,” he said.

While some consumer-facing justice tech companies have relied solely on AI for client interaction, Ward cautions that for many legal matters, AI should only be a starting point.

“Legal expertise is still needed to understand not only where the law is going but to predict the risks for all sorts of different clients,” explained Ward. “So, AI probably would never be held responsible for understanding what the law is in that context. It might help us define the information and allow us as human beings to research it. It might allow us to craft documents. But it’s not the ground truth. It is still going to come down to well-trained legal experts checking the information and providing feedback.”

Collaboration and Feedback

As RAILS continues to develop, the initiative is committed to open communication and collaboration across industries involved in AI for the legal sector.

“As things develop, we’ll post them to our website, rails.legal. It will be an open website where people can come to find out what’s going on and see what tools are being developed,” said Ward.

“This is not a proprietary endeavor. We’re trying to forge these answers through discourse. It’s also a place where people can come and just add their two cents. With almost everything we’ve put out, there is a ‘please give us your comments’ and ‘please give us your feedback’ message.”

Bob Friedman

Robert "Bob" Friedman is the publisher of Attorney at Law Magazine North Carolina Triangle. He contributes articles and interviews to each issue.
