US Chief Justice: AI won’t replace judges but will ‘transform our work’

03.01.2024


In the annual report of the Federal Judiciary, Chief Justice John Roberts addressed the potential impact of artificial intelligence (AI) on the judicial system. As 2023 drew to a close, Roberts sought to allay concerns that technological advances would render judges obsolete.

Amid fervent predictions about the future of AI, Chief Justice Roberts asserted that judges are not on the brink of becoming obsolete, while acknowledging that technological change is transforming their work. Emphasizing the intrinsic value of human judgment, Roberts argued that machines cannot fully replace the nuanced decisions made by people.

The Chief Justice highlighted the significance of subtle human factors, such as a trembling hand, momentary hesitation, or a fleeting break in eye contact—elements that machines may struggle to accurately discern. He underscored the public’s inherent trust in human judgment over AI when evaluating such nuances.

Roberts also expressed concerns about potential drawbacks of AI in the legal domain, including the risk of AI-generated fabricated answers, or "hallucinations," citing examples of lawyers who used AI-powered applications to submit briefs referencing imaginary cases. He further pointed to risks that AI poses to privacy and to the introduction of bias in discretionary decisions such as assessing flight risk and recidivism.

Despite these apprehensions, Chief Justice Roberts recognized the positive aspects of incorporating AI into the legal system. He acknowledged AI’s potential to democratize access to legal advice and tools, particularly benefiting those who cannot afford legal representation.

In conclusion, as the legal world adapts to the integration of AI, Chief Justice Roberts' reflections underscore the importance of striking a balance between harnessing its substantial benefits and managing its potentially serious risks.
