An artificial intelligence (AI) called NooK has beaten eight world champions at bridge. Bridge is a card game in which players work with incomplete information and must react to the behaviour of other players, which calls for human-like decision making. According to the article, NooK represents a "new generation" of AI in that it explains its decisions as it goes along.
'Explainability' is a hot topic in the AI community and was also discussed at this year's Scottish AI Summit, held this week in Edinburgh, which I attended virtually. An explainable AI is one whose results can be understood by humans. This contrasts with so-called "black box" machine learning approaches, where even the system's creator cannot explain why it arrived at a particular decision.
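To make the contrast concrete, here is a minimal Python sketch (my own illustration, not something from the article, using scikit-learn): a shallow decision tree whose learned rules can be printed and read by a human, next to a small neural network that reaches similar accuracy but offers no comparable account of any individual prediction.

```python
# Minimal sketch contrasting an explainable model with a "black box".
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)

# White box: a shallow decision tree whose decision rules can be
# printed and read directly by a human.
tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(tree, feature_names=load_iris().feature_names))

# Black box: a neural network may score just as well, but its learned
# weights give no human-readable reason for any individual prediction.
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X, y)
print("Tree accuracy:", tree.score(X, y))
print("Net accuracy: ", net.score(X, y))
```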
NooK represents a "white box" machine learning approach: it first learns the game's rules and then improves its play through practice, making it a hybrid between rules-based and deep learning systems. This contrasts with, for example, AlphaGo, a "black box" system that acquires its knowledge through extensive deep-learning training on both human and computer play.
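NooK's internals have not been published in detail, so the following is only a toy Python sketch of the general hybrid pattern described here; the card game, the names, and the trivial value function are all my own assumptions. Explicit, hand-coded rules constrain which moves are legal, and a learned evaluation function (here a placeholder) ranks the legal options, so the choice can be explained at both levels.

```python
# Toy sketch of a hybrid agent: hard-coded rules constrain what is
# legal, and a "learned" value function chooses among the legal options.
# A generic illustration only, not NooK's actual architecture.
import random
from dataclasses import dataclass

@dataclass(frozen=True)
class Card:
    suit: str   # e.g. "spades"
    rank: int   # 2..14, where 14 is the ace

def legal_moves(hand, led_suit):
    """Rules-based component: encode the explicit rule that a player
    must follow the led suit if they are able to."""
    follows = [c for c in hand if c.suit == led_suit]
    return follows if follows else list(hand)

def learned_value(card):
    """Placeholder for a trained evaluation function. In a real hybrid
    system this would be a model improved through practice/self-play."""
    return card.rank + random.random() * 0.1

def choose(hand, led_suit):
    candidates = legal_moves(hand, led_suit)
    best = max(candidates, key=learned_value)
    # The decision is explainable at both levels: which rule narrowed
    # the candidates, and how the value function ranked them.
    print(f"Legal (follow-suit rule): {candidates}")
    print(f"Chosen by value function: {best}")
    return best

hand = [Card("spades", 10), Card("hearts", 4), Card("spades", 3)]
choose(hand, led_suit="spades")
```

The point of the split is that the rule layer stays fully inspectable even as the learned component improves through practice, which is what makes a hybrid of this kind easier to explain than an end-to-end black box.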
As was also discussed at the Scottish AI Summit, different AI approaches lend themselves to different domains. Hybrid systems are thought to be particularly suitable in a wide range of fields, such as health and engineering. For example, as the article mentions, self-driving cars will need to read each other's behaviour when negotiating a junction.