By Chris Adams
Artificial intelligence is everywhere, but few people understand how it is actually constructed. There are, however, ways reporters can get inside the “black box” that most AI systems represent.
In a session with National Press Foundation fellows, Miranda Bogen of the civil rights and technology organization Upturn gave a tutorial on understanding AI systems. Bogen (bio, Twitter) does research on the equity implications of machine learning and the role these systems play in society.
She started by defining AI terms and asking whether a system should be considered “artificial intelligence,” “machine learning” or “deep learning.” Within the industry, those terms often mean different things to different people, but Bogen said she focuses more on the output of a system than on what to call it.
The questions journalists need to consider:
- What is the automated decision intended to accomplish – and are there enough data that it can do so?
- Where do the humans fit into the system? How much of the system is really automated? Many systems are a combination of people and machines, and so you need to determine where in the process the people play a role.
- What are the observable consequences of the system? In other words, why should people care? She talked about work from the investigative news outlet ProPublica that detailed how a criminal justice algorithm was conducting risk assessments on criminal defendants in ways that favored white people over African-Americans.
- What are the policies and rules that govern the system?
- What data does the system take in and what are the results it produces?
- When the system was being “trained,” were the data correct, complete and representative? For example, facial recognition systems have been shown to do a worse job of identifying African-Americans than white people. That was partly a function of not having enough African-American faces in the data; when more were added to the system, identification rates went up.
- Is it possible to learn anything from the source code? Maybe not: “As the technology gets more and more complex, it’s less and less likely this will be helpful,” she said.
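The point about representative training data can be made concrete with a small script. The sketch below, a hypothetical illustration (the function name `audit_representation` and the group shares are invented for this example, not part of any system Bogen described), compares each demographic group’s share of a dataset against its share of the relevant population and flags groups that are underrepresented:

```python
from collections import Counter

def audit_representation(records, group_key, population_shares, tolerance=0.05):
    """Flag groups whose share of the dataset falls more than `tolerance`
    below their share of the population. Illustrative sketch only."""
    counts = Counter(r[group_key] for r in records)
    total = sum(counts.values())
    flags = {}
    for group, expected in population_shares.items():
        observed = counts.get(group, 0) / total
        flags[group] = {
            "observed_share": round(observed, 3),
            "expected_share": expected,
            "underrepresented": observed < expected - tolerance,
        }
    return flags

# Hypothetical training set: 90 images from group A, only 10 from group B,
# audited against assumed population shares of 60% and 40%.
records = [{"group": "A"}] * 90 + [{"group": "B"}] * 10
print(audit_representation(records, "group", {"A": 0.6, "B": 0.4}))
```

Run against the hypothetical data above, the audit flags group B as underrepresented (10% of the data versus an expected 40%), which is exactly the kind of skew that produced the facial recognition disparities Bogen mentioned.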