Alaska's AI Probate Assistant Reveals Limits of Chatbots in Law
In an effort to streamline probate, the legal process of transferring a person's belongings after death, Alaska's court system set out to build a pioneering generative AI chatbot to help residents navigate a maze of legal forms. The result, a tool named the Alaska Virtual Assistant, or AVA, predictably became less a turnkey solution than a case study in the limits of applying artificial intelligence to high-stakes legal work where accuracy is critical.
Conceived as a digital counterpart to in-person legal facilitators, AVA was designed to guide users through the forms and steps required to settle an estate. For many Alaskans who cannot afford legal counsel, the chatbot promised clear and timely assistance. But what was initially planned as a three-month project stretched to more than a year as developers confronted a core weakness of large language models: hallucinations.
Even when restricted to a fixed knowledge base, AVA occasionally fabricated information or overstated its authority. In one test, it directed users to seek help from an Alaskan law school that does not exist. In a legal context, such falsehoods can mislead people who are already vulnerable and searching for clarity.
Stacey Marz, administrative director of the Alaska Court System, said the tool must be flawless to be useful. "If people are going to take the information they get from their prompt and they're going to act on it, and it's not accurate or complete, they really could suffer harm," she said. "It could be incredibly damaging to that person, family, or estate." Errors in probate can delay inheritances, spark disputes, or lock in legal mistakes that are difficult and expensive to correct.
Developers also had to rethink the chatbot's tone. Testing revealed that users found AVA's repeated expressions of sympathy impersonal and exhausting. "Everyone said, 'I'm tired of everybody in my life telling me that they're sorry for my loss,'" said Aubrie Souza of the National Center for State Courts. As a result, many scripted condolences were removed.
After months of revisions, AVA received a limited public release in late January 2026 with a narrower mission. It now serves as a basic informational guide rather than a substitute for legal professionals, and it directs users to human assistance when questions fall outside its scope.
The cautious rollout reflects broader government hesitation around AI. A recent Deloitte survey found fewer than 6% of local government officials consider AI a priority for service delivery, underscoring ongoing technological, ethical, and operational challenges.
In law, where nuance, accountability, and judgment remain essential, Alaska's experiment delivers a clear lesson: AI can assist, but it cannot yet replace people.
Sean Jung, R&D Division Director
1. Who set out to build a pioneering generative AI chatbot in Alaska?
2. What is the name of the Virtual Assistant created for Alaska residents?
3. Why did developers need to rethink the chatbot's tone during testing periods?
4. How might using an inaccurate legal chatbot cause serious harm to families?