Learning to think critically about machine learning | MIT News

Students in MIT course 6.036 (Introduction to Machine Learning) study the principles behind powerful models that help physicians diagnose disease or help employers screen job candidates.

Now, thanks to the Social and Ethical Responsibilities of Computing (SERC) initiative, these students will also stop to reflect on the impact of these artificial intelligence tools, which sometimes come with unintended consequences.

Last winter, a team of SERC scholars worked with instructor Leslie Kaelbling, the Panasonic Professor of Computer Science and Engineering, and the 6.036 teaching assistants to infuse the weekly labs with material covering ethical computing, data and model bias, and fairness in machine learning. The process was initiated in fall 2019 by Jacob Andreas, the X Consortium Assistant Professor in the Department of Electrical Engineering and Computer Science. SERC scholars collaborate in multidisciplinary teams to help postdocs and faculty develop new course material.

Because 6.036 is such a large course, with more than 500 students enrolled in the spring 2021 semester, the students grappled with these ethical questions alongside their efforts to learn new computational techniques. For some, it may have been their first experience thinking critically in an academic setting about the potential negative impacts of machine learning.

The SERC scholars evaluated each lab to develop concrete examples and ethics-related questions to fit that week's material. Each brought a different set of tools. Serena Booth is a graduate student in the Interactive Robotics Group of the Computer Science and Artificial Intelligence Laboratory (CSAIL). Marion Boulicault was a graduate student in the Department of Linguistics and Philosophy, and is now a postdoc at the MIT Schwarzman College of Computing, which houses SERC. And Rodrigo Ochigame was a graduate student in the History, Anthropology, and Science, Technology, and Society (HASTS) program and is now an assistant professor at Leiden University in the Netherlands. They worked closely with teaching assistant Dheekshita Kumar, MEng '21, who played a key role in developing the course material.

They brainstormed and iterated on each lab, working closely with the teaching assistants to ensure the content fit seamlessly and would advance the core learning objectives of the course. At the same time, they helped the teaching assistants determine the best way to present the material and lead conversations on topics with social implications, such as race, gender, and surveillance.

“In a class like 6.036, we are dealing with 500 people who are not there to learn about ethics. They think they are there to learn the nuts and bolts of machine learning, like loss functions, activation functions, and things like that. We have this challenge of getting those students to really participate in these discussions in a very active and engaged way. We did that by tying the social questions very intimately to the technical content,” says Booth.

For instance, in one lab on how to represent the input features of a machine learning model, they introduced different definitions of fairness, asked students to consider the pros and cons of each definition, and then challenged them to think about which features should be input to a model to make it fair.
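The kind of trade-off students weigh in such a lab can be made concrete in a few lines of code. The sketch below is purely illustrative and not taken from the course labs: it compares two common formal definitions of fairness, demographic parity (equal positive-prediction rates across groups) and equal opportunity (equal true-positive rates across groups), on hypothetical predictions for two groups, and shows that a model can satisfy one definition while violating the other.

```python
# Illustrative sketch (not from the 6.036 labs): two fairness definitions
# can disagree about the same set of predictions.

def positive_rate(preds):
    """Fraction of individuals the model predicts positively."""
    return sum(preds) / len(preds)

def true_positive_rate(preds, labels):
    """Fraction of truly positive individuals the model predicts positively."""
    positives = [(p, y) for p, y in zip(preds, labels) if y == 1]
    return sum(p for p, _ in positives) / len(positives)

# Hypothetical predictions and ground-truth labels for two groups.
group_a_preds, group_a_labels = [1, 1, 0, 0], [1, 1, 0, 0]
group_b_preds, group_b_labels = [1, 0, 1, 0], [1, 1, 0, 0]

# Demographic parity gap: difference in positive-prediction rates.
dp_gap = abs(positive_rate(group_a_preds) - positive_rate(group_b_preds))

# Equal opportunity gap: difference in true-positive rates.
tpr_gap = abs(true_positive_rate(group_a_preds, group_a_labels)
              - true_positive_rate(group_b_preds, group_b_labels))

print(f"demographic parity gap: {dp_gap:.2f}")  # 0.00: parity is satisfied
print(f"equal opportunity gap:  {tpr_gap:.2f}")  # 0.50: opportunity is not
```

Here both groups receive positive predictions at the same rate, yet qualified members of group B are approved only half as often as qualified members of group A, which is exactly the sort of tension between fairness definitions the lab asks students to reason about.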

Four of the labs have now been published on MIT OpenCourseWare. A new team of SERC scholars is revising the other eight based on feedback from the instructors and students, focusing on learning objectives, filling in gaps, and highlighting important concepts.

An intentional approach

The students' efforts in 6.036 show how SERC aims to work with faculty in ways that suit them, says Julie Shah, associate dean of SERC and professor of aeronautics and astronautics. The team adapted the SERC process to fit the unique nature of this large course and its tight time constraints.

SERC was established more than two years ago at the MIT Schwarzman College of Computing as an intentional approach to bringing faculty from diverse disciplines together in a collaborative setting to jointly develop and publish new course material focused on social and responsible computing.

Each semester, the SERC team invites about a dozen faculty members to join an Action Group dedicated to developing new curricular material (there are several SERC Action Groups, each with a different mission). They are purposeful about whom they invite, aiming to include faculty members who are likely to form fruitful partnerships in smaller subgroups, says David Kaiser, associate dean of SERC, the Germeshausen Professor of the History of Science, and professor of physics.

These subgroups of two or three faculty members hone their shared interest over the course of the semester to develop new ethics-related material. But rather than one discipline serving another, the process is a two-way street; each faculty member brings new material back to their own course, Shah explains. Faculty are drawn into the Action Groups from all five MIT schools.

“Part of this is going outside your normal disciplinary boundaries and building a language, and then trusting and collaborating with someone new outside of your normal circles. That's why I think our intentional approach has been so successful. It's good to pilot materials and bring new things back to your course, but building relationships is the core. That makes this valuable to everybody,” she says.

Making an impact

Over the past two years, Shah and Kaiser have been struck by the energy and enthusiasm surrounding these efforts.

Since the program began, they have worked with about 80 faculty members, and more than 2,100 students took courses that included new SERC content in the past year alone. Those students aren't all engineers: about 500 were exposed to SERC content through courses offered in the School of Humanities, Arts, and Social Sciences, the Sloan School of Management, and the School of Architecture and Planning.

Central to SERC is the principle that ethics and social responsibility in computing should be integrated into all areas of teaching at MIT, becoming just as relevant as the technical parts of the curriculum, Shah says. Technology, and artificial intelligence in particular, now touches nearly every industry, so students in all disciplines should have training that helps them understand these tools and think deeply about their power and pitfalls.

“It isn't someone else's job to figure out the why, or what happens when things go wrong. It is all of our responsibility, and we can all be equipped to do it. Let's get used to that. Let's build up the muscle to be able to pause and ask those tough questions, even if we can't identify a single answer at the end of a problem set,” says Kaiser.

Carefully crafting ethics-related questions that have no answer key was an exceptional challenge for the three SERC scholars. But thinking deeply about such thorny problems also helped Booth, Boulicault, and Ochigame learn, grow, and see the world through the lenses of other disciplines.

They hope the 6.036 students and teaching assistants carry these important lessons with them, and perhaps into their future careers.

“This process inspired and energized me, and I learned so much — not just the technical material, but also what you can achieve when you collaborate across disciplines. Just the scale of this effort is exciting. If we have this cohort of 500 students going out into the world with a better understanding of how to think about these sorts of problems, I think we could really make a difference,” says Boulicault.
