
 

Session I (9:30am–10:40am) – 20-minute talks

  • Predicting Patient Outcomes from a Few Hours of High Resolution Vital Signs Data
    Tim Oates, Embedded Systems and Networks Lab

    Monitoring of non-invasive, continuous, high-resolution patient vital signs (VS) such as heart rate and oxygen saturation is becoming increasingly common in hospital settings. These data are a potential boon for health informatics as a source of predictive information about a variety of patient outcomes. Yet the volume, noisiness, and per-patient idiosyncrasies of these data make their use extremely challenging. In this talk I will discuss the utility of representing VS data as unordered collections (bags) of local discrete patterns for the purpose of training classifiers to predict outcomes for traumatic brain injury patients, including mortality and level of cognitive function months after hospital discharge. The Symbolic Aggregate approXimation (SAX) algorithm is used for discretization, producing a bag of SAX "words" (local patterns) for each time series. Experiments with a dataset of sixty traumatic brain injury patients demonstrate that this approach is promising both in terms of predictive accuracy and patterns that it can reveal in the underlying VS data.
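
    For readers new to the representation, here is a minimal Python sketch of the bag-of-SAX-words idea (illustrative only; the window length, word length, alphabet size, and breakpoints below are assumed values, not the parameters used in this work):

        import math
        from collections import Counter

        # Breakpoints that split a standard normal into 4 equiprobable bins (alphabet "abcd").
        BREAKPOINTS = [-0.67, 0.0, 0.67]

        def sax_word(window, word_len=4):
            """Z-normalize a window, reduce it with PAA, and map each segment to a symbol."""
            mean = sum(window) / len(window)
            std = math.sqrt(sum((x - mean) ** 2 for x in window) / len(window)) or 1.0
            z = [(x - mean) / std for x in window]
            seg = len(z) // word_len                              # samples per PAA segment
            word = ""
            for i in range(word_len):
                avg = sum(z[i * seg:(i + 1) * seg]) / seg         # PAA value of segment i
                word += "abcd"[sum(avg > b for b in BREAKPOINTS)]
            return word

        def bag_of_sax_words(series, window_len=16, word_len=4):
            """Slide a window over the series and count the SAX words it produces."""
            return Counter(sax_word(series[i:i + window_len], word_len)
                           for i in range(len(series) - window_len + 1))

        # Toy heart-rate trace; the resulting Counter is the kind of bag-of-words
        # feature vector a standard classifier could consume.
        hr = [72, 73, 75, 80, 90, 95, 92, 85, 78, 74, 72, 71, 70, 72, 75, 79, 84, 88, 86, 80]
        print(bag_of_sax_words(hr))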

  • Increasing Base-Station Anonymity in Wireless Ad-hoc and Sensor Networks
    Dr. Mohamed Younis, Embedded Systems and Networks Lab

    In many applications of ad-hoc networks, the bulk of the traffic is targeted to a few nodes. For example, in wireless sensor networks, the base-station (BS) collects data from a large number of sensor nodes. Another example is a surveillance network in which the gathered intelligence data about criminal activities flow towards field commanders and/or an in-situ BS. Such a network operation model makes the BS a critical asset for these applications. An adversary can nullify the value of a network by simply disrupting or physically damaging the BS, without targeting individual data sources. The failure of the BS can also cause a loss of important data that may not have been processed and can have a major negative impact if the BS represents a commanding authority for the network. Therefore, concealing the location and role of the BS is of utmost importance for maintaining robust network operation. Packet encryption does not achieve BS anonymity since an adversary can intercept the individual wireless transmissions and employ traffic analysis techniques to follow the data paths without knowing the content of intercepted traffic. Since all active routes end at the BS, the adversary may be able to determine the BS’s location and launch targeted attacks. Similarly, camouflaging or hiding the BS does not provide protection when its location is unveiled via traffic analysis. Employing spread spectrum signaling methods is not a sufficient BS anonymity countermeasure as adversaries are becoming more advanced and equipped with sophisticated intercept technologies. In addition, signal spreading reduces rather than eliminates the prospect of transmission detection. This talk will highlight the traffic analysis threat, present anonymity assessment metrics, provide an overview of effective cross-layer techniques developed in the ESNet Lab for increasing BS anonymity, and outline open research problems.
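
    As a generic illustration of what an anonymity assessment metric can look like, the sketch below uses an entropy-based measure (an assumed example for exposition; it is not necessarily one of the metrics presented in this talk):

        import math

        def anonymity_entropy(belief):
            """Entropy (in bits) of the adversary's probability distribution over
            candidate base-station nodes; higher entropy means better anonymity."""
            return -sum(p * math.log2(p) for p in belief.values() if p > 0)

        # Before traffic analysis, the adversary is uninformed over 8 candidate nodes...
        uniform = {n: 1 / 8 for n in range(8)}
        # ...after observing traffic converge on node 3, its belief sharpens.
        sharpened = {0: 0.02, 1: 0.02, 2: 0.04, 3: 0.80, 4: 0.04, 5: 0.04, 6: 0.02, 7: 0.02}
        print(anonymity_entropy(uniform), anonymity_entropy(sharpened))  # 3.0 vs. ~1.3 bits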

  • Fostering Student Interest in Cybersecurity and STEM Through Interscholastic Competition: The Maryland Cyber Challenge
    Dr. Richard Forno, Graduate Cybersecurity Program

    Using the annual Maryland Cyber Challenge and Competition (MDC3) as an example, this talk discusses student cybersecurity competitions as an innovative, viable and meaningful method of fostering educational aspirations in cybersecurity and STEM-related disciplines. By preparing for and participating in these competitions, experienced and curious students alike are exposed to the technical aspects, conceptual principles, and functional processes required of cybersecurity and IT practitioners as they navigate an increasingly complex series of refereed intellectual and practical exercises designed to simulate real-world cybersecurity situations. Additionally, the most successful teams are those best able to cooperate, communicate, delegate, and otherwise embrace the various qualities necessary for operational success within a contemporary collaborative environment or workplace. Ultimately, the goal of cyber competitions like MDC3 is to help satisfy the ongoing need for knowledgeable, competent individuals motivated to pursue education and/or careers in cybersecurity or other STEM-related fields in order to meet critical national and commercial requirements – specifically, individuals who demonstrate both technical proficiency and an ability to function effectively as part of a team.

 

Session II (1:00pm–2:10pm) – 20-minute talks

  • An Efficient & Reconfigurable FPGA and ASIC Implementation of a Spectral Doppler Ultrasound Imaging System
    Adam Page, Energy Efficient High Performance Computing Lab
    BS/MS Award Winner

    Pulsed Wave (PW) Doppler ultrasound is an important technique commonly used for making non-invasive velocity measurements of blood flow in the human body. The technique makes use of what is known as the Doppler effect, a phenomenon in which there is a change in frequency of a wave for an observer moving relative to its source. Using the Doppler effect relationship between velocity and frequency, it is possible to determine the velocity of an object by measuring the shift between the emitted frequency and the frequency of the wave reflected from the object. In order for PW Doppler ultrasound systems to measure blood velocity, they must be able to analyze the change in the observed frequency relative to the emitted frequency while filtering out noise. Therefore, these systems rely heavily on the use of digital signal processing (DSP) techniques. Most common PW Doppler ultrasound imaging systems use fixed DSP hardware to accomplish this. As a consequence, these systems have limited target frequency ranges. In this paper, we propose an FPGA-based PW spectral Doppler ultrasound imaging system that is both highly efficient and versatile. The design is implemented in a Virtex-5 FPGA using the Xilinx ISE design suite. There are currently only a few studies available for the implementation of an efficient, reconfigurable FPGA-based PW Doppler ultrasound system. These studies mainly discuss a few variations of the system but fail to discuss reconfigurability, accuracy, and performance details of the system. The proposed design addresses all of these issues. Each of the main components constituting the proposed design is discussed in detail, including the reconfigurability aspect. The accuracy of the system was determined by constructing a similar design in MATLAB using 64-bit floating-point precision. For performance comparisons, the design was also implemented as a 65 nm CMOS ASIC. The Virtex-5 design requires 1,159 of 17,280 slice resources and consumes 1.128 W of power when running at its maximum clock speed of 333 MHz. The ASIC design has an area of 0.573 mm² and consumes 41 mW of power at a maximum clock speed of 1 GHz.
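
    For reference, the velocity estimate at the heart of such a system follows the standard PW Doppler relation v = (f_d * c) / (2 * f_0 * cos(theta)); the sketch below uses assumed parameter values and is not taken from the proposed design:

        import math

        def doppler_velocity(f_shift_hz, f_emit_hz, angle_deg, c_tissue=1540.0):
            """v = (f_d * c) / (2 * f_0 * cos(theta)), with c_tissue the assumed
            speed of sound in soft tissue (m/s)."""
            return (f_shift_hz * c_tissue) / (2.0 * f_emit_hz * math.cos(math.radians(angle_deg)))

        # A 5 MHz transducer at a 60-degree insonation angle measuring a 1.3 kHz shift:
        print(doppler_velocity(1300.0, 5e6, 60.0))  # ~0.40 m/s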

  • The DARPA Shredder Challenge
    Dr. Don Engel and Dr. Marianne Engel

    In November 2011, the Defense Advanced Research Projects Agency (DARPA) challenged civilians to come up with novel ways to reconstruct shredded documents. This talk will examine the strategy our team of two pursued in our spare time, enabling us to place second out of about 9,000 teams. Our approach involved signal/noise analysis, light computational linguistics, and an experimental user interface. DARPA holds open competitions regularly, and future competitions may be of interest to UMBC students, faculty, and staff.

  • Experience with Normalized Compression Distance
    Dr. Charles Nicholas and Kevin Stout

    The Normalized Compression Distance, or NCD, is a versatile and intuitively appealing measure of object similarity. The idea behind NCD is simple: concatenate the two objects to be compared, and compute the length of the resulting object when compressed. If the length of this compressed object is less than what would be expected from the lengths of the individual objects when compressed, the objects share substrings in a way that the compression algorithm can exploit. However, the overhead imposed by compression often causes NCD to violate the reflexivity and symmetry properties of metrics. Furthermore, a naive implementation of NCD does not scale well, since compression is a relatively expensive operation. We propose the dzd measure, which appeals to the same intuition as NCD but also satisfies the reflexivity, symmetry, and triangle inequality properties of metrics. The dzd measure is also easier to compute than NCD, in that partial results can be computed once and saved for future use. We tested both measures against a private malware collection of many thousands of specimens. The dzd measure consistently agrees with NCD when tested using random pairs of objects drawn from this collection, but an unoptimized implementation of dzd seems to be faster by a factor of 2.5.
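
    For readers unfamiliar with the measure, a minimal NCD sketch in Python follows (zlib is used only as a convenient stand-in compressor; the talk does not commit to a particular one). Note how compression overhead keeps NCD(x, x) slightly above zero, which is the violation of reflexivity mentioned above:

        import zlib

        def clen(data: bytes) -> int:
            """Length of the data after compression."""
            return len(zlib.compress(data, 9))

        def ncd(x: bytes, y: bytes) -> float:
            """NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
            cx, cy, cxy = clen(x), clen(y), clen(x + y)
            return (cxy - min(cx, cy)) / max(cx, cy)

        a = b"the quick brown fox jumps over the lazy dog " * 20
        b = b"the quick brown fox jumps over the lazy cat " * 20
        c = bytes(range(256)) * 4
        print(ncd(a, b))  # near 0: the inputs share most of their substrings
        print(ncd(a, c))  # near 1: little shared structure to exploit
        print(ncd(a, a))  # not exactly 0: compression overhead breaks reflexivity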

 

Session III (2:30pm–3:40pm) – 20-minute talks

  • Beyond Davinci: Interdisciplinary visualization and interactive computing for knowledge discoveries
    Dr. Jian Chen

    The core research of my DAVINCI lab is to investigate effective and efficient interaction and visualization techniques that help researchers in cutting-edge science and engineering explore vast amounts of data. In this talk, I will present several recent results from NSF-funded projects that have advanced interactive and visual communication in interdisciplinary studies among neurologists, biologists, and computer scientists. The first project addresses humans' limited working memory and the lack of an integrated data-construction and visualization-construction workflow in knowledge discovery from multifaceted bat flight datasets. I will demonstrate the VisBubbles interface, which successfully integrates a human in the loop of the computing process to facilitate knowledge discovery. The second project studies how to maximize the use of our complex visual system to infer observable properties from visualizations, which can, for example, help neurologists understand massive amounts of tube renderings. I will present results from evaluating several visual encoding techniques to quantify their effectiveness from various perspectives. To summarize, our general contribution is to place discovery in the lens of design, optimizing the use of interactive visualization to facilitate scientists' quest for new knowledge.

  • Design and Implementation of FROST: Digital Forensic Tools for the OpenStack Cloud Computing Platform
    Josiah Dykstra, Cyber Defense Lab
    PhD Honorable Mention

    We describe the design, implementation, and evaluation of FROST—three new forensic tools for the OpenStack cloud platform. Operated through the management plane, FROST provides the first dedicated forensics capabilities for OpenStack, an open-source cloud platform for private and public clouds. Our implementation supports an Infrastructure-as-a-Service (IaaS) cloud and provides trustworthy forensic acquisition of virtual disks, API logs, and guest firewall logs. Unlike traditional acquisition tools, FROST works at the cloud management plane rather than interacting with the operating system inside the guest virtual machines, thereby requiring no trust in the guest machine. We assume trust in the cloud provider but FROST overcomes non-trivial challenges of remote evidence integrity by storing log data in hash trees and returning evidence with cryptographic hashes. Our tools are user-driven, allowing customers, forensic examiners, and law enforcement to conduct investigations without necessitating interaction with the cloud provider. We demonstrate through examples how forensic investigators can independently use our new features to obtain forensically-sound data. Our evaluation demonstrates the effectiveness of our approach to scale in a dynamic cloud environment. The design supports an extensible set of forensic objectives, including the future addition of other data preservation, discovery, real-time monitoring, metrics, auditing, and acquisition capabilities.
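
    The evidence-integrity idea can be illustrated with a generic hash tree over log entries (a sketch under assumed structure; this is not FROST's actual data layout or API, and the log entries are hypothetical):

        import hashlib

        def h(data: bytes) -> bytes:
            return hashlib.sha256(data).digest()

        def merkle_root(entries):
            """Build a binary hash tree over the log entries and return its root hash."""
            level = [h(e) for e in entries]
            while len(level) > 1:
                if len(level) % 2:                      # duplicate the last node on odd levels
                    level.append(level[-1])
                level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
            return level[0]

        # Hypothetical API log entries; the provider stores/attests only the root hash.
        api_log = [b"GET /servers 200", b"POST /volumes 202", b"DELETE /servers/42 204"]
        root = merkle_root(api_log)

        # Later, an examiner recomputes the root over the acquired copy; tampering with
        # any single entry changes the root and is therefore detectable.
        assert merkle_root(api_log) == root
        print(root.hex())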

  • A Semantic Message Passing Approach for Generating Linked Data from Tables
    Varish Mulwad, Ebiquity Lab
    PhD Award Winner

    Large amounts of information are stored in tables, spreadsheets, CSV files and databases across a number of domains, including the Web, healthcare, e-science and public policy. The tables' structure facilitates human understanding, yet this very structure makes machine understanding difficult. We describe work on making the intended meaning of tabular data explicit by representing it as RDF linked data, potentially making large amounts of scientific and medical data in important application domains understandable by machines, improving search, interoperability and integration. Our domain-independent framework uses background knowledge from the LOD cloud to jointly infer the semantics of column headers, table cell values (e.g., strings and numbers) and relations between columns, and to represent the inferred meaning in RDF. A table's meaning is thus captured by mapping column headers to classes in an appropriate ontology, linking table cell values to literal constants, implied measurements, or LOD entities (existing or new), and discovering or identifying relations between columns. At the core of our framework is a probabilistic graphical model that exploits existing LOD knowledge to improve a message passing scheme during the joint inference process. We have evaluated our framework on tables from the Web and Wikipedia with promising results.
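
    To make the end product concrete, the toy sketch below emits N-Triples once column headers and cell values have been mapped; the table contents and mappings are hand-written assumptions here, whereas the framework described above infers the mappings jointly with its probabilistic graphical model:

        # Toy table with example values; DBpedia is used as a stand-in LOD source.
        RDF_TYPE = "http://www.w3.org/1999/02/22-rdf-syntax-ns#type"
        DBO = "http://dbpedia.org/ontology/"
        DBR = "http://dbpedia.org/resource/"

        table = {"City": ["Baltimore", "Seattle"], "Population": [621000, 635000]}
        column_class = {"City": DBO + "City"}                       # inferred header -> class
        column_relation = {"Population": DBO + "populationTotal"}   # inferred column -> property

        triples = []
        for city, pop in zip(table["City"], table["Population"]):
            subject = f"<{DBR}{city}>"
            triples.append(f"{subject} <{RDF_TYPE}> <{column_class['City']}> .")
            triples.append(f"{subject} <{column_relation['Population']}> \"{pop}\" .")

        print("\n".join(triples))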