【Abstract】Deep Learning and Artificial Intelligence have attracted enormous attention recently. The race to design and manufacture “brain-like” computers is on, and several companies have produced chips of this kind. Yet the current state of affairs is unsatisfactory and ad hoc. We describe a mathematical framework we have developed that provides a hierarchical architecture for learning and cognition. The architecture combines a wavelet preprocessor, a group-invariant feature extractor, and a hierarchical (layered) learning algorithm. There are two global feedback loops: one from the learning output back to the feature extractor, and one all the way back to the wavelet preprocessor. We show that the scheme can incorporate not only all typical metric distances but also non-metric dissimilarity measures such as Bregman divergences. The learning module incorporates two universal learning algorithms in their hierarchical tree-structured form, both due to Kohonen: Learning Vector Quantization (LVQ) for supervised learning and the Self-Organizing Map (SOM) for unsupervised learning. We describe our most recent work on convergence properties of stochastic vector quantization (VQ) and its supervised counterpart, LVQ, based on Bregman divergences. We demonstrate the superior performance of the resulting algorithms and architecture on a variety of practical problems, including speaker and sound identification; simultaneous determination of sound direction of arrival, speaker, and vowel ID; and face recognition. We next describe three concrete foundational challenges for Artificial Intelligence. We describe current work and plans on micro-electronic implementations that mimic architectural abstractions of the cortex of higher-level animals and humans, for sound and vision perception and cognition. The resulting architecture is non-von Neumann (i.e., computing and memory are not separated in the hardware) and neuromorphic.
We call the resulting chip class “Cortex-on-a-Chip”.
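To make the stochastic VQ component concrete, the following is a minimal sketch of online vector quantization under a Bregman divergence. It is an illustration, not the authors' implementation: the function names and parameters are hypothetical, and it uses the squared Euclidean distance, which is the Bregman divergence generated by φ(z) = ‖z‖²; a key property of Bregman divergences is that the divergence-minimizing centroid of a cluster is its arithmetic mean, so the familiar prototype update w ← w + η(x − w) applies for any generator φ.

```python
import numpy as np

def squared_euclidean(x, prototypes):
    # Squared Euclidean distance: the Bregman divergence generated by
    # phi(z) = ||z||^2. Other generators yield e.g. the KL divergence.
    return np.sum((x - prototypes) ** 2, axis=-1)

def stochastic_vq(data, n_prototypes=4, n_epochs=20, eta0=0.5, seed=0):
    """Online (stochastic) vector quantization, Bregman-divergence style.

    Hypothetical illustrative sketch: for any Bregman divergence the
    expected-divergence-minimizing centroid is the mean, so the winner
    prototype is simply moved a fraction eta toward each sample.
    """
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    # Initialize prototypes on randomly chosen data points.
    idx = rng.choice(len(data), size=n_prototypes, replace=False)
    prototypes = data[idx].copy()
    step = 0
    for _ in range(n_epochs):
        for x in rng.permutation(data):
            step += 1
            eta = eta0 / (1.0 + 0.01 * step)          # decaying learning rate
            j = int(np.argmin(squared_euclidean(x, prototypes)))  # winner
            prototypes[j] += eta * (x - prototypes[j])            # move toward x
    return prototypes
```

On data drawn from well-separated clusters, the returned prototypes settle near the cluster means; LVQ adds class labels and pushes the winner away from samples of the wrong class.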
【CV】John S. Baras is a Distinguished University Professor and holds the endowed Lockheed Martin Chair in Systems Engineering at the Institute for Systems Research (ISR) and the Department of Electrical and Computer Engineering of the University of Maryland, College Park. He received his Ph.D. degree in Applied Mathematics from Harvard University in 1973. He is also a Professor in the Applied Mathematics, Statistics and Scientific Computation Program, and an Affiliate Professor in the Fischell Department of Bioengineering; the Department of Mechanical Engineering; the Department of Decision, Operations and Information Technologies, Robert H. Smith School of Business; and the Department of Computer Science. From 1985 to 1991, he was the Founding Director of the ISR, and since 1992 he has been the Director of the Maryland Center for Hybrid Networks, which he co-founded. He is an IEEE Life Fellow, SIAM Fellow, AAAS Fellow, IFAC Fellow, AMS Fellow, AIAA Associate Fellow, Fellow of the National Academy of Inventors (NAI), and a Foreign Member of the Royal Swedish Academy of Engineering Sciences (IVA). Major honors and awards include the 1980 George Axelby Award from the IEEE Control Systems Society, the 2006 Leonard Abraham Prize from the IEEE Communications Society, the 2017 IEEE Simon Ramo Medal, the 2017 AACC Richard E. Bellman Control Heritage Award, and the 2018 AIAA Aerospace Communications Award. In 2016 he was inducted into the University of Maryland A. J. Clark School of Engineering Innovation Hall of Fame. In 2018 he was awarded a Doctorate Honoris Causa by his alma mater, the National Technical University of Athens, Greece. His research interests include systems and control, optimization, communication networks, signal processing and understanding, robotics, computing systems, network security and trust, systems biology, healthcare management systems, and model-based systems engineering.
He has been awarded eighteen patents and has been honored worldwide with many awards as an innovator and a leader of economic development.