Researchers at Washington State University claim to have programmed artificial intelligences that can teach one another how to play certain video games. Matthew E. Taylor, a professor specializing in AI and robot learning—and who is not at all interested in winning the hearts of our future overlords and securing his place amid their ascendance—led the team. The researchers created “teacher” and “student” artificial intelligences that purportedly interacted like humans would. A student AI would start out being terrible at Pac-Man or StarCraft, the two games used in the study, and the teacher would provide it with advice at the optimal moments. Eventually, according to Taylor, the student learned enough to become even better at the games than the teacher and could then teach a new student.
In the Washington State University statement regarding the study, Taylor says the research was more than just a cool thing to do and is a small step toward one of the major goals for the future of robotics: If a robot can teach a robot to play Pac-Man, then surely it can teach a human. According to Taylor, the trick, for teaching robots and humans alike, is figuring out the right times to give advice and finding a sweet spot for the amount offered. Give too little advice, and there's not enough learning; give too much, and learning is impeded.
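The "right times, right amount" idea Taylor describes matches a common teacher–student setup in reinforcement learning: the teacher holds a limited advice budget and spends it only in states where its own action-values disagree sharply (the "important" states). Here is a minimal sketch of that gating logic in Python; the toy corridor world, the expert values, the threshold, and every name in it are my own illustrative assumptions, not the study's actual code.

```python
import random

random.seed(0)  # make this sketch reproducible

N_STATES = 6            # toy "corridor" world: states 0..5, goal at 5
ACTIONS = (-1, +1)      # step left / step right
ADVICE_BUDGET = 20      # teacher may intervene at most this many times
IMPORTANCE_THRESHOLD = 0.1

def teacher_q(state):
    """Hypothetical expert action-values: moving right is always better,
    and the gap widens near the goal, marking those states 'important'."""
    return {-1: 0.0, +1: (state + 1) / N_STATES}

def importance(state):
    """A state is important when the expert's actions differ a lot in value."""
    q = teacher_q(state)
    return max(q.values()) - min(q.values())

def run_student(episodes=200):
    """Q-learning student; the teacher advises only in important states,
    and only while its advice budget lasts."""
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    budget = ADVICE_BUDGET
    for _ in range(episodes):
        s = 0
        for _ in range(50):
            if budget > 0 and importance(s) > IMPORTANCE_THRESHOLD:
                a = max(ACTIONS, key=lambda x: teacher_q(s)[x])  # take advice
                budget -= 1
            elif random.random() < 0.2:
                a = random.choice(ACTIONS)                       # explore
            else:
                a = max(ACTIONS, key=lambda x: q[(s, x)])        # exploit
            s2 = min(max(s + a, 0), N_STATES - 1)
            reward = 1.0 if s2 == N_STATES - 1 else 0.0
            # standard Q-learning update (alpha = 0.5, gamma = 0.9)
            target = reward + 0.9 * max(q[(s2, b)] for b in ACTIONS)
            q[(s, a)] += 0.5 * (target - q[(s, a)])
            s = s2
            if reward > 0:
                break
    return q

q = run_student()
```

After training, the student prefers moving right in every state even though the teacher intervened only twenty times early on, which is the point of the budget: a handful of well-timed hints, then the student is on its own.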
As for the need to have AIs that can teach other AIs, Taylor gives the example of a future where we all have robot servants. We’ll need a way to teach their technologically superior replacements how we like to be served. You could just transfer your crappy old robot’s memories, but what if they aren’t compatible with the brain of your new robot butler? Teaching the new one about your pampering preferences is the next best option. But then your former robot servant has to stick around where it isn’t wanted and teach its replacement. Hopefully robots won’t have learned to feel by then; otherwise, that’ll just be awkward for everybody. [via ScienceDaily]