21 Jan: Machines Learn To Cooperate With Human Partners, Who Often Cheat or Become Disloyal
MedicalResearch.com Interview with:
Jacob Crandall PhD
Associate Professor, Computer Science
Brigham Young University
MedicalResearch.com: What is the background for this study?
Response: As autonomous machines become increasingly prevalent in society, they must have the ability to forge cooperative relationships with people who do not share all of their preferences. Unlike the zero-sum scenarios (e.g., Checkers, Chess, Go) often addressed by artificial intelligence, cooperation does not require sheer computational power. Instead, it is facilitated by intuition, emotions, signals, cultural norms, and pre-evolved dispositions. To understand how to create machines that cooperate with people, we developed an algorithm (called S#) that combines a state-of-the-art reinforcement learning algorithm with signaling mechanisms.
We then compared the performance of S# with that of people across a variety of repeated games.
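To make the setting concrete, here is a minimal sketch of a repeated game of the kind described above: an iterated prisoner's dilemma in which a conditionally cooperative strategy can sustain mutual cooperation while unconditional defection cannot. This is an illustration of the experimental setting only, not the S# algorithm; the tit-for-tat partner and the payoff values are standard textbook choices, not taken from the paper.

```python
# Illustrative sketch of the repeated-game setting (NOT the S# algorithm).
# Payoffs are the standard prisoner's dilemma values: (my_move, their_move) -> my payoff.
PAYOFFS = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(history):
    """Cooperate first, then mirror the partner's previous move."""
    return "C" if not history else history[-1][1]

def always_defect(history):
    """Defect unconditionally."""
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    """Run a repeated game and return each player's total payoff."""
    hist_a, hist_b = [], []  # each entry: (own_move, partner_move)
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strategy_a(hist_a), strategy_b(hist_b)
        score_a += PAYOFFS[(a, b)]
        score_b += PAYOFFS[(b, a)]
        hist_a.append((a, b))
        hist_b.append((b, a))
    return score_a, score_b

# Two conditional cooperators sustain mutual cooperation for all 10 rounds...
print(play(tit_for_tat, tit_for_tat))    # (30, 30)
# ...while defecting against a reciprocator pays off only in round one.
print(play(always_defect, tit_for_tat))  # (14, 9)
```

In repeated games like this, a partner's willingness to reciprocate (and, in the study, to exchange cheap-talk signals about intended play) is what makes sustained cooperation profitable.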
MedicalResearch.com: What are the main findings?
Response: Our studies showed that, when machines followed the algorithm S#, human-machine partnerships and human-human partnerships produced similar levels of mutual cooperation on average, while machine-machine partnerships had much higher levels of mutual cooperation. Additionally, S#’s ability to generate and respond to signals (cheap talk) at levels conducive to human understanding was critical to developing cooperative relationships with people.
MedicalResearch.com: What should readers take away from your report?
Response: Our studies revealed important differences between humans and S# (the machine). For example, more than half of the human participants in our studies failed, at least once, to follow through on a verbal commitment they had made to their partner. The machine, on the other hand, was programmed to be honest. Furthermore, while the machine typically learned to be loyal to its partner once a pattern of mutual cooperation emerged, people often defected against their partner after establishing such a pattern. Our analysis indicates that had our human participants followed the machine’s example with respect to honesty and loyalty, human-human partnerships would have been as successful as machine-machine partnerships.
MedicalResearch.com: What recommendations do you have for future research as a result of this work?
Response: The success of the algorithm in forging cooperative relationships with people suggests that artificial intelligence may be able to help improve our ability to cooperate with each other. While humans are often good at cooperating, human relationships still frequently break down. People who were friends for years suddenly become enemies. Additionally, many potential human relationships never develop because of our inability to resolve perceived differences. We hope that future work can continue to address how artificial intelligence can help people get along with each other.
Jacob W. Crandall, Mayada Oudah, Tennom, Fatimah Ishowo-Oloko, Sherief Abdallah, Jean-François Bonnefon, Manuel Cebrian, Azim Shariff, Michael A. Goodrich, Iyad Rahwan; last revised 16 Jan 2018 (this version, v4)