Everyone is invited to see presentations and demonstrations of six class projects done by the 17 students in CMSC 491/691, Virtual Reality Design for Science, taught by CSEE Professor Jian Chen this spring. The demonstrations and presentations will take place 12:00-1:30pm Wednesday, 10 May 2017 in the π² Immersive Hybrid Reality Lab, located in room 201b of the ITE building. Join us in this new adventure to explore ideas and foster interdisciplinary interaction and science. Pizza will be provided.

  • Utilizing VR simulations to study the effect of food labeling on college students' meal choices, by Elsie, Kristina, and Michael
  • Integrating spatial and non-spatial approaches for interactive quantum physics data analyses, by Henan, John, and Nick
  • Analyzing the benefits of immersion for environmental research, by Caroline, James, and Peter
  • CPR training effectiveness, by Joey, Justin, and Zach
  • Quantitative measurement of cosmological pollution visualization, by Kyle, Pratik, and Vineet
  • Memorable mobile-VR-based campus tour, by Abhinav and Vincent

Support for this new course was provided by an award from the UMBC Hrabowski Fund for Innovation to CSEE Professors Jian Chen, Marc Olano, and Adam Bargteil. The project-oriented class introduces students to the use of hybrid reality displays, 3D modeling, visualization, and fabrication to conduct and analyze scientific research. The new course embraces the university's goal of advancing interdisciplinary and multidisciplinary research activity.

The UMBC π² Immersive Hybrid Reality Lab is funded by a $360,000 NSF award, with additional support from Next Century Corporation. In the lab, users wear 3D glasses fitted with tracking sensors and operate handheld controllers that immerse them in data displayed on dozens of high-resolution screens precisely aligned to work together. Users control the data by manipulating it in the space around them. The user's body remains largely stationary, but the brain perceives movement within the virtual world. The lab brings together tools "that will allow humans and the computer to augment each other," notes Dr. Chen.