India’s healthcare scenario presents a set of unique challenges to ensuring effective delivery of care to a large population suffering from various communicable and non-communicable diseases. The medical devices market in India is largely served by imports, most of which were not designed to handle the constraints and requirements of the country’s care delivery system and market. While this presents a significant challenge to established players, it is an exciting opportunity for innovators and entrepreneurs to create and scale indigenous technology solutions tailored to local needs. However, developing affordable technology solutions that create large impact and can achieve scale in India requires a deep understanding of the care delivery system and strong partnerships with stakeholders across the ecosystem.
The Healthcare Technology Innovation Centre (HTIC) of IIT Madras focuses on developing and commercializing affordable healthcare technologies through its team of over 200 engineers, doctors, researchers, students, and entrepreneurs working with more than 30 medical institutions, industries, and government agencies. The talk will highlight some of its technology successes and the use of AI and machine learning in tackling these healthcare challenges. The potential of these technologies beyond the Indian market will also be discussed.
Mohanasankar Sivaprakasam is a faculty member in Electrical Engineering at IIT Madras and Director of the Healthcare Technology Innovation Centre (HTIC), an R&D centre of IIT Madras. After eight years of PhD and postdoctoral research in the US on implantable medical devices, he returned to India with the goal of developing affordable medical technologies in the country. Since 2009, he has built an ecosystem of technologists, clinicians, and industry, culminating in the establishment of HTIC in 2011. Over the years, HTIC has grown into a unique and leading med-tech innovation ecosystem in the country, bringing together more than 20 medical institutions, industry partners, and government agencies that collaborate with HTIC in developing affordable medical technologies for unmet healthcare needs. He has more than 70 peer-reviewed publications in journals and conferences.
This talk explores a trio of related takes on how to investigate the nature of human-like intelligence. The first concerns cognitive architectures – implemented models of the fixed structure and processes that yield natural and artificial minds – with a drill down to Sigma, an attempt at a deep synthesis across what has been learned over the past four decades on (what started as) high-level symbolic cognitive architectures versus the low-level graphical/network technologies of probabilistic graphical models (such as Bayesian networks) and neural networks. The second concerns a more abstract attempt at specifying a Common Model of Cognition that yields an evolving community consensus over what must be part of any cognitive architecture for human-like intelligence. The final take concerns an even more abstract (and speculative) attempt at understanding more deeply the space of approaches to intelligence – framed as maps resulting from cross products among core cognitive dichotomies – along with how such maps may help to understand and structure the capabilities required for (human-like) intelligence.
Paul Rosenbloom is a professor of computer science in the Viterbi School of Engineering at the University of Southern California (USC) and director for cognitive architecture research at USC’s Institute for Creative Technologies (ICT). He was a member of USC’s Information Sciences Institute for two decades, ending as its deputy director, and earlier was on the faculty at Carnegie Mellon University and Stanford University (where he had a joint appointment in Computer Science and Psychology). His research concentrates on cognitive architectures (models of the fixed structures and processes that together yield a mind), the Common Model of Cognition (an attempt at developing a community consensus concerning what must be part of a human-like mind), and on computing as a scientific domain (understanding the computing sciences as akin to the physical, life, and social sciences). He is a fellow of the Association for the Advancement of Artificial Intelligence (AAAI), the American Association for the Advancement of Science (AAAS), and the Cognitive Science Society; and, with J. Laird, was recently awarded the Herbert A. Simon Prize for Advances in Cognitive Systems. He has served as councilor and conference chair for AAAI; chair of the Association for Computing Machinery Special Interest Group on Artificial Intelligence; and president of the faculty at USC.
The principal investigator of a UMBC-led “massively collaborative” project published in Science will describe how archaeologists, geographers, and information scientists came together to show that human societies began transforming the Earth thousands of years earlier than earth scientists had recognized, providing evidence for an earlier Anthropocene.
Joint work with Alan Sherman, Richard Chang, Enis Golaszewski, Ryan Wnuk-Fink, Cyrus Bonyadi, Mario Costa, Moses Liskov, and Edward Zieglar
Secure Remote Password (SRP) is a widely deployed password-authenticated key exchange (PAKE) protocol used in products such as 1Password and iCloud Keychain. As with other PAKE protocols, the two participants in SRP use knowledge of a pre-shared password to authenticate each other and establish a session key. I will explain the SRP protocol and the security goals it seeks to achieve. I will demonstrate how to model the protocol using the Cryptographic Protocol Shapes Analyzer (CPSA) tool and present my analysis of the shapes produced by CPSA.
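As background for readers unfamiliar with SRP, the minimal Python sketch below illustrates the SRP-6a shared-secret arithmetic: the server stores a password verifier, the two sides exchange ephemeral values, and both arrive at the same session secret. The group, hash, and encodings here are illustrative only, and this sketch is not the CPSA model analyzed in the talk.

```python
# Illustrative sketch of the SRP-6a shared-secret arithmetic (not the CPSA model
# analyzed in the talk). The modulus is a small Mersenne prime chosen only so
# the example runs quickly; real SRP uses standardized 1024-bit+ safe-prime
# groups (e.g., RFC 5054) and precise byte-level encodings.
import hashlib
import secrets

N = 2**127 - 1        # small prime modulus, for illustration only
g = 3                 # generator used by this toy example

def H(*args):
    """Hash helper: concatenate arguments and reduce SHA-256 to an integer."""
    h = hashlib.sha256()
    for a in args:
        h.update(a if isinstance(a, bytes) else str(a).encode())
    return int(h.hexdigest(), 16)

k = H(N, g)                                   # SRP-6a multiplier parameter

# Registration: client derives x from salt and password; server stores only v.
salt = secrets.token_bytes(16)
x = H(salt, b"alice:correct horse battery staple")
v = pow(g, x, N)                              # password verifier

# Authentication: both sides exchange ephemeral public values A and B.
a = secrets.randbelow(N); A = pow(g, a, N)                 # client
b = secrets.randbelow(N); B = (k * v + pow(g, b, N)) % N   # server
u = H(A, B)                                                # scrambling parameter

# Each side computes the same secret S without transmitting the password.
S_client = pow((B - k * pow(g, x, N)) % N, a + u * x, N)
S_server = pow((A * pow(v, u, N)) % N, b, N)
assert S_client == S_server
K = H(S_client)                               # session key shared by both parties
print(hex(K))
```

Whether this arithmetic actually delivers the intended authentication and secrecy guarantees against an active network attacker is precisely the kind of question a symbolic tool such as CPSA is used to probe.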
Erin Lanus earned her Ph.D. in computer science in May 2019 from Arizona State University. Dr. Lanus is currently conducting research with Professor Sherman’s Protocol Analysis Lab at UMBC. Her previous results include using state to enable CPSA to reason about time in forced-latency protocols. Her research has also explored algorithmic approaches to constructing combinatorial arrays employed in interaction testing and the creation of a new type of array for attribute distribution to achieve anonymous authorization in attribute-based systems. In October she will begin as a research assistant professor at Virginia Tech’s Hume Center in Northern Virginia.
Support for this research was provided in part by grants to CISA from the Department of Defense, CySP grants H98230-17-1-0387 and H98230-18-0321.
Most common techniques for correlation analysis (e.g., canonical correlation analysis) require sufficiently large sample support, but in many applications only a limited number of samples are available. Correlation analysis with small sample sizes poses some unique challenges. In this talk, I will focus on the problem of determining the correlated components between two or more data sets when the number of samples from these data sets is extremely small. Applications are plentiful, and among them I will discuss the identification of weather patterns in climate science and analyzing the effects of extensive physical exercise on the autonomic nervous system.
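For reference, the NumPy sketch below computes classical canonical correlations by whitening the sample covariances and taking the singular values of the coherence matrix. This standard estimator is exactly what becomes unreliable when only a handful of samples are available, which is the regime the talk addresses; it is not the small-sample methodology presented in the talk.

```python
# Classical CCA via SVD of the (regularized) sample coherence matrix.
import numpy as np

def cca(X, Y, reg=1e-6):
    """Return canonical correlations between data sets X (n x p) and Y (n x q)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx = X.T @ X / (n - 1) + reg * np.eye(X.shape[1])   # regularized covariances
    Cyy = Y.T @ Y / (n - 1) + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / (n - 1)
    # Whiten both data sets and take singular values of the coherence matrix.
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx))
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy))
    K = Wx @ Cxy @ Wy.T
    return np.linalg.svd(K, compute_uv=False)

# Two 5-dimensional data sets sharing one latent component, only 12 samples.
rng = np.random.default_rng(0)
z = rng.standard_normal((12, 1))
X = z @ rng.standard_normal((1, 5)) + 0.5 * rng.standard_normal((12, 5))
Y = z @ rng.standard_normal((1, 5)) + 0.5 * rng.standard_normal((12, 5))
print(cca(X, Y))  # sample correlations are biased upward at this sample size
```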
Peter Schreier was born in Munich, Germany, in 1975. He received a Master of Science from the University of Notre Dame, IN, USA, in 1999, and a Ph.D. from the University of Colorado at Boulder, CO, USA, in 2003, both in electrical engineering. From 2004 until 2011, he was on the faculty of the School of Electrical Engineering and Computer Science at the University of Newcastle, NSW, Australia. Since 2011, he has been Chaired Professor of Signal and System Theory at Paderborn University, Germany. He has spent sabbatical semesters at the University of Hawaii at Manoa, Honolulu, HI, and Colorado State University, Ft. Collins, CO.
From 2008 until 2012, he was an Associate Editor of the IEEE Transactions on Signal Processing, from 2010 until 2014 a Senior Area Editor for the same Transactions, and from 2015 to 2018 an Associate Editor for the IEEE Signal Processing Letters. From 2009 until 2014, he was a member of the IEEE Technical Committee on Machine Learning for Signal Processing, and he currently serves on the IEEE Technical Committee on Signal Processing Theory and Methods. He is the Chair of the Steering Committee of the IEEE Signal Processing Society’s Data Science Initiative, and he serves on the IEEE SPS Regional Committee for Region 8. He was the General Chair of the 2018 IEEE Statistical Signal Processing Workshop in Freiburg, Germany.
The goal of this talk is to give an introduction to tensor decompositions for the analysis of multidimensional data. First, we recall some basic notions and operations on tensors. Then two tensor decompositions are presented: the Tucker decomposition (TD) and the Candecomp/Parafac decomposition (CPD). A particular focus is placed on the identifiability conditions of the CPD. Finally, various applications in biology are presented.
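To make the CPD concrete, the sketch below fits a rank-2 CP decomposition of a synthetic 3-way tensor with a plain NumPy alternating-least-squares loop. It is only an illustration of the idea; libraries such as TensorLy provide production implementations of both the CPD and the Tucker decomposition.

```python
# Minimal CP decomposition (CPD) of a 3-way tensor via alternating least squares.
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product of U (m x R) and V (n x R) -> (m*n x R)."""
    return np.stack([np.kron(U[:, r], V[:, r]) for r in range(U.shape[1])], axis=1)

def unfold(T, mode):
    """Mode-n unfolding: move axis `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def cp_als(T, rank, n_iter=200, seed=0):
    """Rank-`rank` CPD of a 3-way tensor T by alternating least squares."""
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((dim, rank)) for dim in T.shape]
    for _ in range(n_iter):
        for mode in range(3):
            others = [factors[m] for m in range(3) if m != mode]
            M = khatri_rao(others[0], others[1])            # matches unfold ordering
            V = (others[0].T @ others[0]) * (others[1].T @ others[1])
            factors[mode] = unfold(T, mode) @ M @ np.linalg.pinv(V)
        # (A per-iteration normalization step is customary but omitted for brevity.)
    return factors

# Build an exactly rank-2 tensor and check that CP-ALS recovers it.
rng = np.random.default_rng(1)
A, B, C = (rng.standard_normal((d, 2)) for d in (4, 5, 6))
T = np.einsum('ir,jr,kr->ijk', A, B, C)
Ah, Bh, Ch = cp_als(T, rank=2)
T_hat = np.einsum('ir,jr,kr->ijk', Ah, Bh, Ch)
print(np.linalg.norm(T - T_hat) / np.linalg.norm(T))        # should be near zero
```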
David Brie received the Ph.D. degree in 1992 and the Habilitation à Diriger des Recherches degree in 2000, both from Université de Lorraine, France. He is currently a full professor at the Department of Telecommunications and Networking of the Institut Universitaire de Technologie, Université de Lorraine, France. He has been editor-in-chief of the French journal Traitement du Signal since 2013 and will be co-general chair of IEEE CAMSAP 2019. His current research interests include vector-sensor-array processing, spectroscopy and hyperspectral image processing, non-negative matrix factorization, multidimensional signal processing, and tensor decompositions.
Recent technological advances have enabled the deployment of pervasive sensing and actuation in our physical world, leading to the emergence of cyber-physical systems in which computing and sensing interact with the physical world and humans in unique and exciting ways. Such systems are increasingly being deployed in smart city domains such as energy, transportation, health, grids, and agriculture.
In this talk, I will argue that the rich and vast amounts of data generated by smart city applications necessitate a data-driven approach in which AI and systems techniques are employed symbiotically to tackle smart city challenges. I will present two smart city applications from the energy domain as examples of such a symbiotic approach. I will first present WattHome, a city-scale, machine-learning-based approach that can determine the least efficient buildings within a large city or region. I will present the results of a city-scale evaluation performed in collaboration with a local utility, in which WattHome successfully identified causes of energy inefficiency for thousands of buildings. Second, I will present SolarClique, a sensor-less, data-driven approach designed to detect anomalies in the power generation of a large number of existing solar sites without requiring any additional sensor instrumentation. I will conclude my talk by describing a number of open challenges in designing data-driven approaches for smart cities.
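To give a flavor of this sensor-less, data-driven style of analysis (though not SolarClique’s actual algorithm), the hypothetical sketch below predicts one site’s daily generation from nearby peer sites and flags days with unusually large residuals, using only the generation data the sites already report.

```python
# Hypothetical sketch of peer-comparison anomaly detection for solar sites:
# predict a site's daily generation from nearby sites and flag large residuals.
# This illustrates the general data-driven idea, not the SolarClique algorithm.
import numpy as np

def flag_anomalous_days(target, neighbors, z_thresh=3.0):
    """target: (days,) generation of the site under test.
    neighbors: (days, k) generation of nearby sites sharing its weather.
    Returns indices of days whose residual z-score exceeds z_thresh."""
    X = np.column_stack([np.ones(len(target)), neighbors])    # add intercept
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)          # fit linear predictor
    residuals = target - X @ coef
    z = (residuals - residuals.mean()) / residuals.std()
    return np.flatnonzero(np.abs(z) > z_thresh)

# Synthetic example: 200 days, 4 neighbor sites, one injected fault period.
rng = np.random.default_rng(0)
weather = rng.uniform(0.3, 1.0, 200)                 # shared irradiance factor
neighbors = weather[:, None] * rng.uniform(4, 6, 4) + rng.normal(0, 0.1, (200, 4))
target = weather * 5.0 + rng.normal(0, 0.1, 200)
target[150:160] *= 0.5                               # e.g., soiling or inverter fault
print(flag_anomalous_days(target, neighbors))        # the 150-159 window should dominate
```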
Prashant Shenoy is currently a Professor and Associate Dean in the College of Information and Computer Sciences at the University of Massachusetts Amherst. He received the B.Tech. degree in Computer Science and Engineering from the Indian Institute of Technology, Bombay, and the M.S. and Ph.D. degrees in Computer Science from the University of Texas at Austin. His research interests lie in distributed systems and networking, with a recent emphasis on cloud and green computing. He has been the recipient of several best paper awards at leading conferences, including a Sigmetrics Test of Time Award. He serves on the editorial boards of several journals and has served as the program chair of over a dozen ACM and IEEE conferences. He is a fellow of the IEEE and the AAAS and a distinguished member of the ACM.