talk: Energy Efficient and High Performance Architectures for DSP and Communication Applications

EE Graduate Seminar

Energy Efficient and High Performance Architectures
for DSP and Communication Applications

Tinoosh Mohsenin, PhD
Assistant Professor of Computer Engineering

CSEE Dept/UMBC

11:30am-12:45pm, 9 March 2012, ITE 231

Many emerging and future communication applications require a significant amount of high-throughput data processing while operating under shrinking power budgets. This need for greater energy efficiency and improved performance in electronic devices demands co-optimization of algorithms, architectures, and implementations. This talk presents several design projects that illustrate such cross-domain optimization.

The design of System-on-Chip (SoC) blocks becomes increasingly sophisticated with emerging communication standards that impose large real-time computational requirements. Two such algorithms, Low Density Parity Check (LDPC) decoding and Compressive Sensing (CS), have received significant attention. LDPC decoding is an error correction technique that has shown superior error correction performance and has been adopted by several recent communication standards. Compressive sensing is a revolutionary technique that reduces the amount of data collected during acquisition, allowing sparse signals and images to be recovered from far fewer samples than traditional Nyquist sampling requires. While both LDPC decoding and compressive sensing offer substantial advantages, they rely on computationally intensive algorithms whose implementations typically suffer from high power consumption and low clock rates. This talk presents novel algorithms and architectures to address these challenges.
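
As a rough illustration of the compressive sensing idea (not material from the talk itself), the sketch below recovers a sparse vector from far fewer random measurements than its length using orthogonal matching pursuit, one standard recovery algorithm; the problem sizes and the choice of algorithm are assumptions made for this toy example.

```python
# Illustrative sketch only: recovering a sparse signal from a small number of
# random linear measurements via orthogonal matching pursuit (OMP). The sizes
# and the choice of OMP (rather than the decoders discussed in the talk) are
# assumptions made for this toy example.
import numpy as np

def omp(A, y, k):
    """Greedily recover a k-sparse x such that y is approximately A @ x."""
    m, n = A.shape
    residual = y.copy()
    support = []
    x = np.zeros(n)
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the chosen support, then update the residual.
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x[support] = coeffs
    return x

rng = np.random.default_rng(0)
n, m, k = 128, 32, 3                           # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement matrix
y = A @ x_true                                 # only 32 measurements of a length-128 signal
print(np.allclose(omp(A, y, k), x_true, atol=1e-6))
```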

As future communication systems demand increasing flexibility and performance within a limited power budget, multi-core and many-core chip architectures have become a promising solution. The design and implementation of a many-core platform capable of performing DSP applications is presented. The low-power, low-area core processors are connected through a hierarchical network structure, and the network protocol includes contention resolution for high data traffic between cores. The result is a platform with higher performance and lower power consumption than a traditional DSP, along with the ease of programmability that an ASIC lacks. Early post place-and-route results from a standard-cell design give a processor area of 0.078 mm2 per core in TSMC's 65 nm technology.

Dr. Mohsenin received the B.S. degree in electrical engineering from Sharif University of Technology, Iran, and the M.S. and Ph.D. degrees in electrical and computer engineering from Rice University and the University of California, Davis, in 2004 and 2010, respectively. In 2011, she joined the Department of Computer Science and Electrical Engineering at the University of Maryland, Baltimore County, where she is currently an Assistant Professor. Dr. Mohsenin's research interests lie in the areas of high performance and energy efficiency in programmable and special-purpose processors. She is the director of the Energy Efficient High Performance Computing (EEHPC) Lab, where she leads projects in architecture, hardware, software tools, and applications for VLSI computation with an emphasis on DSP workloads. Dr. Mohsenin has been a consultant to early-stage technology companies and currently serves as a member of the Technical Program Committees of the IEEE Biomedical Circuits & Systems Conference (BioCAS), the Life Science Systems and Applications Workshop (LiSSA), and IEEE Women in Circuits and Systems (WiCAS).

Host: Prof. Joel M. Morris

Need Homework Help? Ask Dan…

A Sophomore Computer Science major, Dan Maselko has been a tutor in the Computer Science Help Center since last fall.

Dan Maselko got hooked on computer science in high school. “When I took my first computer programming course in tenth grade,” he says, “I realized how easy and fun it was for me to get computers to solve problems.” Since then, Dan, a Sophomore, has been working towards his Computer Science degree while helping those who struggle with the subject.

Last Fall, Dan applied to be a tutor in the Computer Science Help Center. “The best thing about tutoring is getting the chance to help other students learn,” he says. “Every time someone walks out of the door of the Help Center with a better understanding of the material they had questions about, I just feel good knowing I could help them learn something.” Though Dan mainly helps students in CMSC 104, 201, and 202, the center provides help for students in most lower-level Computer Science courses including CMSC 100, 203, 313, 331, and 341, he explains.

The Computer Science Help Center—located in ITE 201-E—offers tutoring on a walk-in basis. “Anyone enrolled in a Computer Science course at UMBC can be tutored by the Help Center,” says Dan, “and it’s completely free.” Dan compares the challenges of tutoring to those faced by computer scientists.

“The good challenge is trying to figure out how to make the computer science topics make sense to different people with different ways of thinking,” explains Dan. “Trying to understand so many diverse strategies is a lot like solving a problem in computer science.”

Dan plans to continue tutoring until he pursues a Master’s degree in Computer Science. Once in graduate school, his teaching aspirations will not cease: “I do hope to eventually become a TA.” Though Dan enjoys helping others, he isn’t set on a career in teaching, although he is still considering it. “I…want to work at a job that’s exciting and requires collaboration,” he says. “Right now the thing that excites me most is cyber security.”

Niels Kasch PhD Defense: Mining Commonsense Knowledge from the Web

Ph.D. Dissertation Defense

Mining Commonsense Knowledge from the Web:
Towards Inducing Script-like Structures From Large-scale Text Sources

Niels Kasch

10:00am Friday, March 9th, 2012, ITE 325B

Knowing the sequence of events in situations such as eating at a restaurant is an example of the commonsense knowledge needed for a broad range of cognitive tasks (e.g., language understanding). This thesis outlines an approach to mine information about sequential, everyday situations in a topic-driven fashion to produce declarative, script-like representations (cf. Schank's scripts). Given a topic such as eating at a restaurant, we produce graphs of temporally ordered events involved with the activity referenced by the topic. Our work utilizes large-scale data sources (e.g., the Web) to avoid the data sparseness issues of narrow corpora.

We describe steps that address the scale and noisiness of the Web to make it accessible for script extraction. Boilerplate elements (e.g., navigation bars and advertising) on web pages skew distributional statistics of words and obstruct information retrieval tasks. To make the web usable as a corpus, we introduce a machine learning technique to separate boilerplate elements from content in arbitrary web pages.
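
The abstract does not spell out the features or classifier used, but the flavor of a boilerplate detector can be sketched as a supervised classifier over simple per-block features such as block length and link density; everything below, including the feature set and the logistic-regression model, is an illustrative assumption rather than the thesis' method.

```python
# Toy sketch of boilerplate-vs-content classification for text blocks on a web
# page. The features (block length, link density, tag density) and the model
# are common choices used here for illustration, not the method from the thesis.
from sklearn.linear_model import LogisticRegression

def block_features(text, num_links, num_tags):
    words = text.split()
    n = max(len(words), 1)
    return [len(words), num_links / n, num_tags / n]

# Tiny hand-labeled training set: 1 = real content, 0 = boilerplate.
train_blocks = [
    ("The seminar presents new results on script extraction from web text.", 0, 1, 1),
    ("Researchers described a corpus built from several million blog posts.", 0, 2, 1),
    ("Home | About | Courses | Contact | Login", 5, 10, 0),
    ("Share on Facebook Share on Twitter Subscribe to RSS", 3, 6, 0),
]
X = [block_features(text, links, tags) for text, links, tags, _ in train_blocks]
y = [label for *_, label in train_blocks]

clf = LogisticRegression().fit(X, y)
print(clf.predict([block_features("The talk describes a new event ordering method.", 0, 1)]))
```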

A key element for commonsense knowledge extraction is the generation of a topic-specific corpus that facilitates script extraction in a topic-driven manner. We introduce Concept Modeling for Scripts as an efficient method to induce concepts containing script elements (e.g., events, people, and objects) from topic-specific corpora. Our experiments and user studies conducted on the 2011 ICWSM Spinn3r dataset show that our method outperforms state-of-the-art topic-modeling approaches such as Latent Dirichlet Allocation (LDA) on this task when applied to unbalanced (topic-specific) corpora.

Concept Modeling serves as a starting point for automated methods to discover events relevant to a script. We demonstrate event detection methods in topic-specific corpora based on (1) learned dependency paths indicative of individual event structures, (2) the semantic cohesiveness of event pairs, and (3) surface structures indicative of golden sentences containing sequential information. Events extracted for a given topic can be arranged in a graph. The detection methods use graph analysis to identify strongly connected components and prune the event set so that related and central events predominate in the structure. User studies demonstrate that (1) the Web is suitable for mining script-like knowledge and (2) the resulting graph structures portray events strongly related to a given topic.

Script-like structures, by definition, impose a temporal ordering on the events they contain. This work also presents a novel method to induce ordering information from topic-specific corpora based on a counting framework that judges the presence and strength of a temporal happens-before relation. The framework is extensible to several counting methods, where a counting method provides co-occurrence and ordering statistics. We present, among others, a novel naive counting method that uses a simple sentence-position assumption to infer temporal order. Comparisons to existing temporal resources show that our naive method, in conjunction with connected-components analysis, induces temporal relationships with accuracy similar to that of more sophisticated methods, yet with a smaller computational footprint.
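
As a rough sketch of that naive counting idea, the snippet below records, per document, which of two events is first mentioned in an earlier sentence and treats a consistent skew in those counts as evidence of a happens-before relation; the keyword-based event spotting is a stand-in for the thesis' extraction pipeline, not part of it.

```python
# Sketch of sentence-position counting for a happens-before relation.
# counts[(a, b)] tallies how often event a is first mentioned in an earlier
# sentence than event b within the same document.
from collections import defaultdict
from itertools import combinations

def happens_before_counts(documents, events):
    """documents: list of lists of sentences; events: event keywords (a stand-in)."""
    counts = defaultdict(int)
    for sentences in documents:
        first_pos = {}
        for i, sent in enumerate(sentences):
            for e in events:
                if e in sent and e not in first_pos:
                    first_pos[e] = i          # sentence index of first mention
        for a, b in combinations(first_pos, 2):
            if first_pos[a] < first_pos[b]:
                counts[(a, b)] += 1
            elif first_pos[b] < first_pos[a]:
                counts[(b, a)] += 1
    return counts

docs = [["We entered the restaurant.", "We ordered pasta.", "We paid the bill."],
        ["They ordered quickly.", "They paid and left."]]
print(happens_before_counts(docs, ["entered", "ordered", "paid"]))
```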

Committee

  • Dr. Tim Oates (chair)
  • Dr. Ronnie W. Smith
  • Dr. Matt Schmill
  • Dr. Tim Finin
  • Dr. Charles Nicholas

Next Century Corporation Comes to the Classroom

This semester, the students in Susan Mitchell’s Software Design and Development course were hand-picked. After applying and being interviewed, ten students were chosen based on their “go-getter” attitude.

Why the selectivity? Mitchell’s CMSC 345 course this semester is a trial course being taught in collaboration with Next Century Corporation, a Maryland-based technology company. Though Mitchell has been teaching CMSC 345 for ten years, this is a first.

Designed around the completion of one software-design project, the course provides students with a “customer” (normally a faculty member) who gives them specific guidelines for the “product” they need to complete. In years prior, students were given the task of developing a program that plans a student’s UMBC course career. Mitchell explains that the product for this semester will be especially real-world focused.

In fact, essentially everything about the course is meant to simulate working in the software industry. Because it is a writing-intensive course, students are asked to write formal documents, and at the end of the semester they must give a formal presentation.

Mitchell explains that the course isn’t so much about coding as it is about understanding the “software development lifecycle.” It’s the process that’s important, she explains, from conception to follow-through. Understanding what the customer wants and then turning out a product that fits those guidelines is the goal.

Chris Stepnitz, a software engineer at Next Century, is the “customer” of this semester’s pilot course. Stepnitz, who graduated from UMBC in 2006 with a degree in Computer Science, took the very same course with Mitchell years ago. “We wrote an accounting system,” remembers Stepnitz, who admits she was considering changing majors before taking the course. She credits it with opening her eyes to the reality of a career in software development and the rewarding experience of programming with a team.

So, when Stepnitz heard that Next Century, which has been reaching out to the community through local colleges, was about to partner with her alma mater, she jumped at the chance to participate. “I’m very excited,” says Stepnitz. “For the students, I really want to make sure that they both enjoy [the class] and get the taste of what it’s like to really be in the development world.”

The arrangement is meant to be mutually beneficial. Students in the course learn how to succeed in an industry setting, while Next Century builds bonds with universities that may provide it with future staff members (in fact, roughly 20% of its staff are UMBC alumni). If all goes well, Mitchell hopes to collaborate again and perhaps even branch out to other local businesses.

talk: Interactive visual computing for knowledge discovery in science, engineering and training

Interactive visual computing for knowledge discovery
in science, engineering and training

Dr. Jian Chen
University of Southern Mississippi

1:00pm Wednesday 7 March 2012, ITE 325b UMBC

Advances in simulations and lab experiments are producing huge datasets at unprecedented rates, and deriving meanings from these data will have far-reaching impacts on our lives in many areas of science, engineering, and medicine. Visualization and interactive computing provide great tools for exploiting these data in scientific discovery and engineering innovations. A limiting factor in the scientific use of visualization tools is the lack of guiding principles to identify and assess visualization methods that are helpful in scientific tasks. In this talk, I present research designed to advance knowledge discovery through the design and evaluation of interactive visualizations. Experiments on image illumination and density are described that successfully address this limitation in brain imaging for medical diagnoses. I also present the theoretical foundations that have led to the various choices in visualization design. In the second part of the talk, I argue that most existing tools designed for scientific discovery fail to address the dynamic nature of the discovery workflow. I present a new visualization tool, VisBubbles, that integrates programming, visualization, and interaction in one environment to create fluid workflows in which new hypotheses can be tested efficiently. VisBubbles augments interactive computing and analysis of time-varying motion data of bat flights by enabling dynamic displays, thus facilitating scientists' quest for new knowledge. I present the design methods we have followed in our long-term collaboration with biologists and engineering scientists on motion analysis. Finally, I present future work I envision in interactive visualization that will be critical in developing future visualization tools for science, engineering, and training.

Jian Chen is an assistant professor in the School of Computing at the University of Southern Mississippi. She is the founder and director of the Interactive Visual Computing Lab. Her research is in the broad area of interaction and visualization, with a current focus on the emerging field of scientific visualization theory and workflow analysis. She has published numerous articles in top journals and international conferences. Her panel on combining human-centered computing and scientific visualization received an honorable mention at the 2007 IEEE Visualization Conference. She was a postdoc at Brown University with Drs. David H. Laidlaw (CS) and Sharon Swartz (BioMed) from 2006 to 2009. She has a Ph.D. in Computer Science from Virginia Tech and Master’s degrees in both Computer Science and Mechanical Engineering. Her research has been funded by DHS and NSF.

Host: Penny Rheingans

See here for more information

talk: Using Static Analysis to Diagnose Misconfigured Open Source Systems Software

Using Static Analysis to Diagnose
Misconfigured Open Source Systems Software

Ariel Rabkin, UC Berkeley

1:00pm Monday 5 March 2012, ITE 325b UMBC

Ten years ago, few software developers worked on distributed systems. Today, developers often run code on clusters, relying on large open-source software stacks to manage resources. These systems are challenging to configure and debug. Fortunately, developments in program analysis have given us new tools for managing the complexity of modern software. This talk will show how static analysis can help users configure their systems. I present a technique that builds an explicit table mapping a program's possible error messages to the options that might cause them. As a result, users can get immediate feedback on how to resolve configuration errors.
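
To make that concrete, the end product of such an analysis can be pictured as a lookup table from error-message templates to the configuration options that can trigger them; the messages and option names below are invented for illustration, and in the real system the table is derived automatically by static analysis rather than written by hand.

```python
# Illustrative sketch of an error-message-to-option lookup table. The entries
# here are made up; the actual table described in the talk is extracted
# automatically from the program source by static analysis.
import re

ERROR_TABLE = [
    (re.compile(r"Cannot connect to .* on port \d+"), ["namenode.address", "namenode.port"]),
    (re.compile(r"Permission denied: .*"),            ["data.dir", "user.name"]),
    (re.compile(r"Unknown compression codec"),        ["io.compression.codec"]),
]

def diagnose(error_message):
    """Return the configuration options that might explain an observed error."""
    for pattern, options in ERROR_TABLE:
        if pattern.search(error_message):
            return options
    return []

print(diagnose("Cannot connect to node17 on port 9000"))
# e.g. ['namenode.address', 'namenode.port']
```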

Ari Rabkin is a PhD student in Computer Science at UC Berkeley working in the AMP Lab. His current research interest is the software engineering and administration challenges of big-data systems. He is particularly interested in applying program analysis techniques to tasks like log analysis and configuration debugging. His broader interests center on systems and security, including improving system usability by making systems easier to understand, exploring the connections between computer science research and technology policy, and developing program analysis techniques that work acceptably well on large, complex, messy software systems.

Host: Anupam Joshi
See http://www.csee.umbc.edu/talks for more information

The Princeton Review recognizes UMBC's Video Game Design program

UMBC has earned an honorable mention on The Princeton Review’s recently released list, “Top Schools to Study Video Game Design for 2012.” The recognition places UMBC among schools like Georgia Institute of Technology, UC Santa Cruz, and Northeastern University.

Released annually, the list features 50 schools from around the country, including their “top ten” undergraduate and graduate schools in this category. This year, the University of Southern California secured first place for their undergraduate game design program. The rankings were based on a survey administered by The Princeton Review during the 2011-2012 academic year that consulted administrators at 150 schools and universities. The winning schools were judged on the quality of their curriculum, faculty, facilities, and infrastructure, as well as their scholarship, financial aid, and career opportunities.

“We salute the schools on our list this year for their commitment to this burgeoning field and the innovative programs they offer. For students aspiring to work in this more than $10.5 billion industry and for the companies that will need their creative talents and skills, we hope this project will serve as a catalyst for many rewarding connections,” says Robert Franek, The Princeton Review’s Senior VP/Publisher in a press release.

Both UMBC’s Computer Science and Electrical Engineering Department and its Visual Arts Department offer programs for students interested in pursuing a career in video game development. Artists can concentrate on “Animation and Interactive Media,” while computer programmers can pursue the “Game Development” track within the Computer Science major. In addition, UMBC has a Game Development club and has been a host site of the International Game Developers Association’s Global Game Jam for the past four years.

To see a comprehensive list of winners, visit The Princeton Review’s website.

In the News: driverless cars and digital intersections

The buzz about driverless cars erupted after Google received a patent for the technology in December of last year. Since then, the project has been steadily moving forward. A few days ago, Google started testing its vehicles on Nevada roads, following approval by the state’s Legislative Commission. With autonomous transportation creeping closer to reality, Google is not the only organization working to make it happen.

Peter Stone, a professor of computer science at the University of Texas at Austin, and his team are working on developing a “smart” intersection that would manage the flow of driverless cars, reports techworld.com.

Under Stone’s proposal, intersections would be equipped with a manager that coordinates traffic in an efficient, mathematically grounded way. Techworld reports:

‘"When a car gets close to the intersection, it calls ahead and says, 'I want to go through the intersection.' The intersection manager says either yes or no. It keeps track of the reservations it grants and makes sure it doesn't give permission to other cars that would conflict with them," Stone said.’

To learn more, check out the full article: “Scientists develop computer-controlled intersections for self-driving cars.”

He dances, he climbs, he teaches Computer Science: Meet Max

Meet Max, a Teaching Assistant who loves climbing mountains, swing dancing, and Artificial Intelligence.

“I’ve never been bored in my life,” says Maksym Morawski (call him Max), a Computer Science graduate student who spends most of his free time scaling mountains.

Originally from Silver Spring, Max moved to Baltimore in 2006 to study Computer Science as an undergraduate. In the 4th grade, while other kids were busy building volcanoes for their science projects, Max and his computer-scientist dad were putting together a computer that compared different algorithms for computing prime numbers. So choosing his major in college, explains Max, was a no-brainer.

Now a second-year graduate student pursuing a Master’s in Computer Science, Max is working on a thesis that looks at predicting connections in social networks like Facebook. A computer scientist with a sociological streak, Max uses computers to understand how people interact with one another, drawing on e-mail data sets taken from corporations.

Max’s foray into teaching began in 2010 when he became a Teaching Assistant for CMSC 202. He says his favorite part of being a TA is the discussion sections, where he actually gets to stand up, teach, and get his students excited about Computer Science. His dose of teacherly advice is as follows: “Program for fun.” If you don’t practice and enjoy programming, he explains, you will never be as good as someone who lives and breathes it.

Throughout his years at UMBC, Max’s on-campus involvement has extended past teaching. An avid dancer (he frequents Mobtown Ballroom in Baltimore City), he founded UMBC’s Swing Dancing club. He also helped conceive Project X, the club that sponsored a campus-wide scavenger hunt in 2008 and 2009 that included tasks like jumping into the Inner Harbor and high-fiving Freeman Hrabowski (which prompted a not-so-enthusiastic e-mail from the UMBC president). The prize for the hunt was an amalgamation of candy that was procured from the “Spot” using late-night meals over a series of weeks, explains Max.

But Max’s favorite thing to do is the hobby he took up in high school: exploring mountains. A regular at Earth Treks, a climbing center in Columbia, Max had plans to climb frozen waterfalls in New York State this winter. His dream job, he says half-jokingly, is to be a mountaineering guide. Still, he may also consider a job in academia: “I would love to be a teacher,” he says.

talk: Building and Testing Distributed Systems

Building and Testing Distributed Systems

Dr. Charles Killian
Purdue University, Computer Science

1:00pm Friday, 2 March 2012, ITE325 UMBC

Building distributed systems is particularly difficult because of the asynchronous, heterogeneous, and failure-prone environment in which these systems must run. This asynchrony makes verifying the correctness of system implementations even more challenging. Tools for building distributed systems must often strike a compromise between reducing programmer effort and increasing system efficiency. In my research, we strive to introduce a limited amount of structure into implementations to enable a wide range of analysis and development assistance. Most prominently, we have built the Mace language and runtime, which translates a concise, expressive distributed system specification into a C++ implementation. Importantly, the Mace specification exposes three key pieces of structure: atomic events, explicit state, and explicit messaging.
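
The snippet below is not Mace syntax, just a toy Python rendering of those three structural elements: each node keeps explicit state, communicates only through explicit messages, and handles one queued event at a time so that every handler runs atomically with respect to that node's state.

```python
# Toy illustration of atomic events, explicit state, and explicit messaging.
# Not Mace code; names and message types are invented for this sketch.
from collections import deque

class Node:
    def __init__(self, name, network):
        self.name = name
        self.network = network
        self.state = {"peers_seen": set()}   # explicit state
        self.inbox = deque()                 # pending atomic events

    def send(self, dest, msg):               # explicit messaging
        self.network[dest].inbox.append((self.name, msg))

    def step(self):
        """Process a single event atomically; no other handler interleaves."""
        if not self.inbox:
            return
        sender, msg = self.inbox.popleft()
        if msg == "HELLO":
            self.state["peers_seen"].add(sender)
            self.send(sender, "ACK")

network = {}
network["a"] = Node("a", network)
network["b"] = Node("b", network)
network["a"].send("b", "HELLO")
network["b"].step()   # b handles HELLO, records peer a, replies with ACK
network["a"].step()   # a handles the ACK (no state change in this sketch)
print(network["b"].state)
```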

With a few additional contextual annotations, we show how we can support intra-node parallel event processing of these atomic events while still preserving sequential event consistency, even when using variably available computing resources distributed across a cluster. By leveraging these three structural elements, we have further built tools such as a model checker capable of detecting liveness violations in systems code, a performance tester, and an automated malicious protocol tester. Recent research has also explored applications of these key structures in legacy software, producing a log analysis tool that can detect performance problems and a malicious fault injector that can discover successful performance attacks. Mace has been in development since 2004 and has been used to build a wide variety of Internet-ready distributed systems, both by myself and by researchers at places such as Cornell University, Microsoft Research (Redmond, Silicon Valley, and Beijing), HP Labs, UCLA, EPFL, and UCSD. This talk will give an overview of my research, presenting the execution model and its checker, support for event parallelization, and our more recent testing tools.

Charles Killian is an Assistant Professor in the Department of Computer Science at Purdue University. He received an NSF CAREER award in 2011, as well as an HP Open Innovation award. In 2008 he completed his Ph.D. in Computer Science at the University of California, San Diego, under the supervision of Amin Vahdat. Before transferring to UCSD in August 2004, he completed his Master's in Computer Science at Duke University with Amin Vahdat. His systems and networking research focuses on building and testing distributed systems, and bridges this research with software engineering, security, data mining, and programming languages. Since 2004 he has implemented the Mace programming language and runtime, built numerous distributed systems, and designed MaceMC, the first model checker capable of finding liveness violations in unmodified systems code, which received a best paper award at NSDI 2007. Chip has built many additional tools and enhancements since then, including performance testing, work on parallel event processing, automated attack discovery, and data mining of logs to discover performance problems.
