Future of Law Speaker, Harry Surden

November 13, 2018

Harry Surden, a software engineer before attending law school and now a professor of law at the University of Colorado Law School and affiliated faculty at the Stanford CodeX Center, spoke to law students and prospective law students at the Future of Law Lecture Series on March 6, 2017 at BYU Law School.

He addressed the values and biases embedded in legal artificial intelligence systems, distinguishing "legal" systems from other technological systems because of their impact on our system of justice. Examples of such systems already in legal use include criminal sentencing tools, border entry control, government benefits administration, and tax preparation software.

Although many biases in a system are not "purposeful," engineers make choices about how a technology will operate, and their design choices may promote certain outcomes over others. Surden framed these biases in a more positive light, calling them "values" that are embedded in systems through engineering design choices. He first examined machine learning, the pattern-based approach used in artificial intelligence.

He gave the example of criminal sentencing systems that use data to find patterns predicting risk of reoffending, but pointed out that if the data itself is biased or skewed, the output of the system will be biased as well. If police disproportionately stop some groups over others, the data is distorted, the patterns the system learns from that data are distorted, and so are the system's recommendations.
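The point can be illustrated with a minimal simulation. In this invented sketch (the group names, stop rates, and reoffense rate are all hypothetical, not drawn from Surden's talk), two groups reoffend at exactly the same true rate, but one is stopped by police three times as often, so it dominates the arrest records a system would learn from:

```python
import random

random.seed(0)

# Hypothetical illustration: groups "A" and "B" have the SAME true
# reoffense rate, but group A is stopped three times as often, so it
# is over-represented in the recorded data.
TRUE_REOFFENSE_RATE = 0.2
STOP_RATE = {"A": 0.30, "B": 0.10}

arrests = []
for _ in range(100_000):
    group = random.choice(["A", "B"])        # equal-sized populations
    reoffends = random.random() < TRUE_REOFFENSE_RATE
    if random.random() < STOP_RATE[group]:   # only stops enter the data set
        arrests.append((group, reoffends))

# A naive "risk" measure: each group's share of recorded arrests.
# It reflects policing intensity, not any difference in behavior.
total = len(arrests)
for g in ["A", "B"]:
    share = sum(1 for grp, _ in arrests if grp == g) / total
    print(f"group {g}: {share:.0%} of arrest records")
```

Despite identical underlying behavior, group A accounts for roughly three quarters of the records, and any pattern-finding system trained on this data would rate group A as higher risk.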

Second, he addressed artificial intelligence built on a logic- and rules-based approach. Such a system can distort the meaning and interpretation of laws by translating them into computer code that is not easily subject to review. This kind of bias might be inherent in income tax software that is presented as accurately reflecting income tax law, when it is in fact one fixed interpretation programmed into code.
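A toy example makes the concern concrete. In this sketch (the deduction amount, tax rate, and rule are invented for illustration and are not actual tax law), a statute's open interpretive questions are silently resolved one way inside the code, and the user sees only the final number:

```python
# Hypothetical sketch: a rules-based encoding of a "tax law" in which
# every number and rule below is invented for illustration.
def tax_owed(income: float) -> float:
    STANDARD_DEDUCTION = 12_000.0  # one fixed reading of "allowable deduction"
    RATE = 0.10                    # flat rate chosen purely for illustration
    taxable = max(0.0, income - STANDARD_DEDUCTION)
    # Whether other expenses should also reduce `taxable` is an
    # interpretive legal question; this code silently answers it one way,
    # and the user has no way to see or contest that choice.
    return taxable * RATE

print(tax_owed(50_000))  # the user sees only the result, not the reasoning
```

The design choice, which reading of the rule to hard-code, is exactly the kind of embedded "value" Surden described: invisible to the user, yet decisive for the outcome.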

Surden concluded with a warning: creators of legal technological systems must remain vigilant about the value choices implicit in their designs.