NSF CAREER Awards

The prestigious Faculty Early Career Development (CAREER) Program supports junior faculty who exemplify the role of teacher-scholars through outstanding research, excellent education and the integration of education and research within the context of the mission of their organizations.

The following faculty members in the College of Engineering are CAREER Award recipients:

Unveiling the Governing Mechanisms of Fatigue Failure in Additively Manufactured Aluminum

Award date: September 1, 2018 to August 31, 2023 (Estimated)

Additive manufacturing (AM), often referred to as 3D printing, is an exciting technology that can offer more flexibility and efficiency in the production of complex metal parts compared to conventional manufacturing. However, the path to AM as a viable and safe alternative in applications where structural components must carry loads (in some cases, where components must sustain repetitive loading over long periods of time) is at a critical juncture. The widespread incorporation of this transformative manufacturing technology is hampered by the fact that it is currently not possible to predict when and why an additively manufactured metal component might fail, and to design the component accordingly to mitigate the risk of failure. This presents a major problem for many industries that are looking to use AM to produce metal load-bearing components. This Faculty Early Career Development Program (CAREER) award supports fundamental research to address this pressing need and to enable the expanded, yet safe, use of metal AM in many industries, including aerospace, automotive, biomedical, manufacturing, and national defense. The research is closely integrated with a unique outreach program that will engage students across different age levels and backgrounds, including middle-school students from rural locations in Utah.

The research supported by this CAREER award is a fundamental step toward expanding the use of AM to fatigue-critical applications through the discovery of 3D, microstructure-sensitive, fatigue-crack driving mechanisms in additively manufactured aluminum. Two parallel research thrusts will be carried out. One thrust will focus on experimentally characterizing the microstructural features in 3D neighborhoods of fatigue cracks observed in aluminum specimens produced by laser powder bed fusion. The second thrust will focus on numerically characterizing the local micromechanical fields that evolve in 3D as a function of underlying, manufacturing-induced microstructure and defect distribution, with particular focus on residual-stress incompatibility, porosity, and surface roughness. Data-driven approaches will be leveraged across the experimental and numerical data sets to provide new insights into the mechanisms responsible for fatigue failure among the specimens. While the research focuses on aluminum alloys, it is anticipated that the findings regarding the relative importance of geometrical defects, like pores and surface roughness, versus intrinsic material defects on fatigue failure of additively manufactured parts could be broadly applicable to other metals as well.

Safe and Efficient Extensions for Low-Latency Multitenant Storage

Award date: May 1, 2018 to April 30, 2023 (Estimated)

Businesses are moving data and applications into the cloud, consolidating many applications and their data efficiently onto fewer servers. Cloud storage services must keep the data of thousands of customers separated while also allowing customers to operate on that data efficiently. Safely intermixing customer-provided operations over data is problematic: historically, processor hardware has been used to isolate programs from one another, but increasing data access rates make hardware isolation costly. This project develops a new approach to storage that allows safe operation on data without hardware protection, using recent advances in programming languages.

The approach combats data movement between disaggregated storage and compute nodes by pushing untrusted tenant extensions into Sandstorm, a new cloud storage system. Sandstorm's insight is that storage extensions can use language-level isolation to eliminate hardware isolation overheads that cannot be avoided today with virtual machines, containers, or serverless Lambdas. Sandstorm also eliminates copying data for safety, so extensions benefit from low-level hardware functionality like zero-copy network transmission. The project will develop multitenant benchmarks, low-cost performance-isolated concurrency mechanisms for multicores, techniques to minimize data movement within servers, storage extensions that demonstrate the benefits, and distributed extensions over clusters.
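
To make the data-movement argument concrete, the toy sketch below contrasts shipping every record to a client with pushing a small, tenant-supplied filter to the storage node so that only matching records cross the network. It illustrates the general pushdown idea only; it is not Sandstorm's actual mechanism (which relies on language-level isolation), and all names and data are hypothetical.

```python
# Toy pushdown illustration (hypothetical; not Sandstorm itself).
# Running the tenant's filter at the storage node means only matching
# records, rather than the whole table, move between storage and compute.

RECORDS = [{"tenant": "a", "temp": t} for t in range(100_000)]  # placeholder data

def fetch_all(records):
    # Baseline: ship every record to the client and filter client-side.
    return list(records)

def run_extension(records, predicate):
    # "Pushdown": the storage node evaluates the tenant-supplied predicate
    # and returns only the records that satisfy it.
    return [r for r in records if predicate(r)]

if __name__ == "__main__":
    hot = run_extension(RECORDS, lambda r: r["temp"] > 99_990)
    print(len(fetch_all(RECORDS)), "records moved without pushdown")
    print(len(hot), "records moved with pushdown")
```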

As power limits data center scale, minimizing data movement out of storage becomes crucial. Sandstorm enables any cloud developer to accelerate data-intensive applications such as real-time analysis of social networks and natural graphs and fine-grained coordination of hundreds of thousands of autonomous vehicles. All artifacts will be developed openly under a permissive MIT license for academic and industrial use. The project includes development of a new education platform for teaching students about distributed systems and cloud computing at the graduate, undergraduate, and high school levels, with a set of serverless computing labs targeted toward University of Utah students and summer camp attendees.

All data, code, experiments, and benchmarks will be open and made publicly available through http://github.com/utah-scs/ and at http://utah.systems/ and retained for a minimum of three years beyond the project award period. All data, code, benchmarks, and experiments associated with all published results will also be hosted at http://dataverse.harvard.edu/dataverse/utah-scs as part of the Harvard Dataverse for long-term retention.

Vibration-Assisted Laser Keyhole Welding to Improve Joint Properties

Award date: April 1, 2018 to March 31, 2023 (Estimated)

Welding-critical industries represent about one-third of the Gross Domestic Product (GDP) of the United States, so this project on vibration-assisted laser keyhole welding advances national prosperity and welfare; the process also has numerous defense applications, so the project helps secure the national defense. Any improvement in welding productivity and performance can potentially save billions of dollars for the country. Laser welding is a promising process that can improve welding speed by factors of tens or even hundreds compared with conventional welding processes such as arc welding and electric resistance welding. However, laser welding can suffer from several problems, including high porosity, coarse grains, and brittle intermetallic compounds in the joints, all of which can reduce performance. This Faculty Early Career Development Program (CAREER) project focuses on a novel process called Vibration-assisted Laser Keyhole Welding (V-LKW) and aims to study how vibrating the laser beam can help solve the aforementioned problems in laser welding. Experimentation and numerical modeling will be used in tandem to investigate the fundamental physics of V-LKW. The research, if successful, will significantly advance scientific understanding of the complex phenomena in V-LKW and promote the application of this novel process in the manufacturing industry (especially the automotive and aerospace sectors) to improve welding productivity and performance. The educational objective of this project is to engage and prepare students for the intellectual challenges of modern manufacturing technologies. Outreach activities for high school students will be conducted to cultivate their interest in manufacturing technology and encourage them to explore engineering careers.

This CAREER project will focus on a novel Vibration-assisted Laser Keyhole Welding (V-LKW) process. The research objective is to test the hypothesis that the properties of V-LKW joints can be significantly improved through the mechanisms of porosity reduction, grain refinement, and intermetallic reduction, all of which are functions of the vibration direction, amplitude, and frequency of the laser. In-process time-resolved observation techniques, post-process microstructural/mechanical characterization methods, and a multi-physics numerical model will be used to identify the process-structure-property relationship of V-LKW and to understand its fundamental physics. If successful, this project will advance our understanding of laser-matter interaction, keyhole dynamic behavior, vibration generation/transmission, multi-phase thermo-fluid flow, and solidification phenomena in V-LKW. The research will enable improvements in welding productivity and joint properties in the manufacturing of many products, including but not limited to automobiles, thermal management devices, electronic devices, and medical devices. The outreach and education activities will improve manufacturing science and engineering education at the University of Utah, attract more high school, college, and minority students to careers in engineering and manufacturing, and strengthen the university's connection to local communities. Research results will be disseminated in a timely manner through journal and conference publications and interactions with industry.

Enabling Reproducibility of Interactive Visual Data Analysis

Award date: April 1, 2018 to March 31, 2023 (Estimated)

Reproducibility and justifiability are widely recognized as critical aspects of data-driven decision making in fields as varied as scientific research, business, healthcare, and intelligence analysis. This project is concerned with enabling reproducibility and justifiability of decisions in the data analysis process, specifically as it relates to visual data analysis. Visualization is an important tool for discovery, yet decisions made by humans based on visualizations of data are difficult to capture and to justify. This project will develop methods to justify, communicate, and audit decisions made based on visual analysis. This, in turn, will lead to better outcomes, achieved with less effort and cost. The increasing use of visual analysis tools for decision making will make data analysis accessible to a broad variety of people, as visual analysis tools are generally easier to use than scripting languages and do not require extensive computational and statistical training. This research and its related activities increase accessibility and enhance the data analysis infrastructure for research and education.

To achieve these goals, this research will develop a framework for making visual analysis sessions not only reproducible but also reusable. The approach is based on tracking semantically meaningful provenance data during an interactive visual analysis session. Once a discovery is made, analysts can use this history to curate a succinct analysis story, adding justifications and explanations to make their analysis reproducible by others. Using a semi-automatic process, analysts will be able to make their actions data-aware, so that their analysis processes become robust to changes, such as updates in the data. A second contribution of the proposed work is the integration of visual analysis into computational analysis processes. While visualization is commonly used to present computational analysis results, the results of a visual analysis session are rarely fed into further computational processes. The techniques developed in this project will allow analysts to feed analysis results (selections, aggregations, filters, etc.) back into a computational environment. This will make it possible to use interactive visualization at any point in the data analysis process while maintaining reproducibility and enabling reuse. The expected results include new methods to capture user intent, create data stories from analysis processes, and integrate computational and visual data analysis, leveraging the strengths of both human abilities and computational power. The results will be disseminated in publications and in the form of open source software, and will be accessible via the project website (http://vdl.sci.utah.edu/projects/2018-nsf-reproducibility/).
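
As a rough illustration of the provenance idea described above (a hypothetical sketch, not the project's software), the snippet below records analysis actions as a small, semantically meaningful log and replays them after the underlying data is updated, so the derived result can be handed back to a computational environment.

```python
# Hypothetical sketch: provenance-tracked analysis actions stored as data
# (not opaque UI events), so the same analysis can be replayed on updated data.

provenance = []  # ordered log of semantically meaningful actions

def record(action, **params):
    provenance.append({"action": action, **params})

def replay(rows, log):
    # Re-apply the logged actions to (possibly updated) rows.
    for step in log:
        if step["action"] == "filter":
            rows = [r for r in rows if r[step["column"]] >= step["threshold"]]
        elif step["action"] == "select":
            rows = [r for r in rows if r["id"] in step["ids"]]
    return rows

if __name__ == "__main__":
    data = [{"id": i, "score": i * 0.1} for i in range(10)]   # placeholder data
    record("filter", column="score", threshold=0.5)
    record("select", ids={6, 7, 10})
    print(len(replay(data, provenance)))     # 2 rows selected in the original session
    updated = data + [{"id": 10, "score": 0.9}]
    print(len(replay(updated, provenance)))  # 3 rows when the same analysis is replayed on updated data
```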

Functionality-Enhanced Devices for Extending Moore’s Law

Award date: January 15, 2018 to December 31, 2022 (Estimated)

This CAREER project intends to develop high-energy-efficiency computing systems by making a more "useful" elementary device rather than focusing only on its performance. This addresses the critical scientific question of how we can keep pushing the limits of computing performance. For more than four decades, the semiconductor ecosystem answered the demand for higher levels of performance by manufacturing devices with increasingly small dimensions. Nevertheless, there is still the largely unexplored route of enriching the basic switching primitive of the elementary transistors, i.e., enhancing their functionality rather than focusing only on reducing their size and/or improving their performance. This project is also relevant from an industrial perspective, as it proposes novel solutions to push device and system performance without overextending investments to reach an ever-larger degree of integration. More importantly, this CAREER project (i) will involve graduate and undergraduate students in research on problems that are directly relevant to industry, ultimately boosting their employability, and (ii) will develop a science-popularization YouTube channel as a mechanism to increase interest and broaden the participation of K-12 students by capitalizing on their online curiosity.

The proposed research aims to develop novel computing systems exploiting Three-Independent-Gate Field Effect Transistors (TIGFETs), a novel device technology capable of device-level functionalities typically not achievable with standard CMOS and leading to major benefits at the gate and circuit levels. A TIGFET can, depending on its usage, achieve three unique modes of operation: (i) dynamic reconfiguration of the device polarity; (ii) dynamic control of the threshold voltage; and (iii) dynamic control of the subthreshold slope beyond the thermal limit. In order to fully assess the potential of this technology, this CAREER project will (1) fabricate, characterize, and model TIGFETs using advanced semiconductor materials, (2) develop a complete design framework for TIGFETs, including a design kit, novel circuit primitives, and dedicated design tools, and (3) evaluate TIGFET-enabled low-energy, high-precision neural network computing kernels.

Data Mining to Reduce the Risk in Discovering New Sustainable Thermoelectric Materials

Award date: July 1, 2017 to June 30, 2022

Humanity faces a number of grand challenges in engineering in the 21st century, ranging from making solar energy affordable to inventing new tools for scientific discovery to preventing nuclear terror. A requirement common to many of these challenges is the need to discover new materials, but traditional materials discovery is slow, inefficient, and expensive. Clearly, a new tool is required to develop new materials faster and at a fraction of the cost. One possibility is to rely on big data to accelerate materials discovery. This project serves the national interest by using data mining tools to create a materials recommendation engine for new sustainable thermoelectric materials. This engine will provide recommendations for new materials based on the statistical probability of achieving the desired performance. Scientists will be able to use this tool to guide experimental efforts to explore totally new compounds that would otherwise be too risky to investigate. Since thermoelectrics are devices that can convert waste heat to electricity, the potential for this project to benefit the United States is significant. Currently, close to two-thirds of energy is lost as waste heat, and recovering even a small fraction of this with new thermoelectric materials would amount to enormous energy savings. The PI will also leverage this research opportunity to supplement his teaching and outreach efforts. Students will construct novel thermoelectric devices and use these devices to perform bilingual Spanish/English outreach to minority-majority high school and junior high students in Salt Lake City.

Discovering new materials is slow, inefficient, and expensive. These factors make searching for new materials in chemical white space very high risk. Instead, most new developments occur incrementally in already known or established structure types, chemistries, and systems. However, the risk associated with exploring chemical white space for new compounds can be mitigated by utilizing the emergent field of materials informatics. In this project, novel, sustainable thermoelectric compositions will be suggested using a materials recommendation engine for thermoelectrics. The engine uses composition alone to make probabilistic estimates of performance, rather than computationally expensive calculations that generally require knowledge of the crystal structure a priori. Avoiding crystal structure as an initial input means entirely new compounds can be discovered with this tool. The engine output is a probability that a composition lies within a desired performance range. This project will therefore combine these predictions with existing predictions of where compounds should form to experimentally explore novel thermoelectric materials. The training data set for algorithm and descriptor development will be improved by including the performance of poor and mediocre as well as good materials, to overcome the bias in the literature toward high-performing materials. Experimental synthesis and characterization will be carried out on suggested compounds and for algorithm validation.
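
The sketch below is a heavily simplified, hypothetical illustration of such a composition-only recommendation step: featurize each composition with a few element-averaged descriptors, train a classifier on labeled examples, and rank unlabeled candidates by the predicted probability of good performance. The element values are rough approximations and the training labels are made-up placeholders, not data from this project.

```python
# Hypothetical sketch of a composition-only recommendation step (not the
# project's engine). Element properties are approximate; labels are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Approximate element descriptors: (atomic mass, Pauling electronegativity).
ELEMENTS = {"Mg": (24.3, 1.31), "Si": (28.1, 1.90), "Pb": (207.2, 2.33),
            "Te": (127.6, 2.10), "Bi": (209.0, 2.02), "Se": (79.0, 2.55)}

def featurize(composition):
    # composition: dict of element -> atomic fraction. Features are the
    # fraction-weighted mean and spread of each element property.
    props = np.array([ELEMENTS[el] for el in composition])
    fracs = np.array([composition[el] for el in composition])[:, None]
    mean = (fracs * props).sum(axis=0)
    spread = (fracs * np.abs(props - mean)).sum(axis=0)
    return np.concatenate([mean, spread])

# Placeholder training set: compositions labeled good (1) / not good (0).
train = [({"Pb": 0.5, "Te": 0.5}, 1), ({"Bi": 0.4, "Se": 0.6}, 1),
         ({"Mg": 0.67, "Si": 0.33}, 0), ({"Mg": 0.5, "Se": 0.5}, 0)]
X = np.array([featurize(c) for c, _ in train])
y = np.array([label for _, label in train])
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Rank unlabeled candidate compositions by predicted probability of good performance.
candidates = [{"Bi": 0.5, "Te": 0.5}, {"Mg": 0.4, "Si": 0.6}]
probs = model.predict_proba(np.array([featurize(c) for c in candidates]))[:, 1]
for comp, p in sorted(zip(candidates, probs), key=lambda t: -t[1]):
    print(comp, round(float(p), 2))
```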

Powering Micro Scale Biomedical Implants through Controlled Low Frequency Magnetic Fields and Multiferroic Transducers

Award date: February 1, 2017 to January 31, 2022

Biomedical implants hold the promise of dramatically improving health and well-being by, for example, enabling people to proactively monitor health through real-time tracking of internal body chemistry (e.g., pH, glucose, lactate, tissue oxygen), treat diseases through targeted and tailored drug delivery, and treat neural disorders through neural prostheses. However, this vision is only possible if implants become much smaller with longer lifetimes. The current state of the art in integrated circuit and micro-sensor design and manufacturing could enable cubic-millimeter-sized implants that would greatly reduce trauma to the patient and improve continuous health monitoring. However, power systems have lagged behind and have become a barrier to implant miniaturization. Very small batteries would quickly become depleted, and then the entire implant would have to be surgically replaced. The goal of this project is to overcome this power problem by wirelessly transmitting power to the biomedical implants using low frequency magnetic fields that easily penetrate the human body. These magnetic fields will excite a magnetoelectric power receiver that will be part of the implant. The magnetoelectric receiver will convert the magnetic fields to electricity, which will then be properly conditioned to power the implant. The Principal Investigator (PI) and affiliated researchers will explore two competing types of magnetoelectric devices and characterize them, especially in terms of uncertainties related to the position and alignment of the implant and associated power receiver. New fabrication processes will be developed that enable micro-scale magnetoelectric devices to generate more power, thus enabling further miniaturization of biomedical implants. In addition to enabling the miniaturization of implants, the work to be accomplished by this project could have broader benefits for the state of the art in both sensing and wireless power transfer.

The goal of this project is to explore the use of low frequency magnetic fields coupled with magnetoelectric power receivers to transmit power to biomedical implants. The target is to safely supply 100 microwatts per cubic millimeter, which would enable a wide range of implanted sensors and therapeutic devices. Low frequency magnetic fields are attractive because of their very low absorption in human tissue and encapsulating structures. Two classes of magnetoelectric devices will be investigated: laminates of magnetostrictive and piezoelectric material, and jointly fabricated permanent magnet / piezoelectric structures. The two approaches will be compared given alignment and orientation uncertainties and issues associated with human tissue interaction. Specifically, researchers will characterize the surrounding tissue's role in degrading the quality factor of the resonant magnetoelectric power receivers. The key relationships for power generation by this method, as well as its performance limits, will be elucidated and experimentally validated, which will serve as a basis for system design. A new microfabrication process will be developed to enable high power transducers through the use of much thicker active materials (i.e., piezoelectric and magnetostrictive). Finally, a system to control the DC voltage used by the implant from the external transmitter will be developed and validated to remove the need for large onboard passive components associated with traditional power electronics. The efficacy of the external control method will be fully characterized with respect to stability of the DC voltage and robustness to system uncertainties. The results of this research will establish the basis for much smaller, more ubiquitous biomedical implants by overcoming the issue of delivering power at sufficient densities.

Probing Stem Cell Differentiation With Synthetic Biology

Award date: May 1, 2016 – April 30, 2021

The demand for stem cells is anticipated to continue to rise due to their expected ability to treat disease, their use in novel diagnostic technologies, and their role in pharmaceutical screening. However, before these goals can be realized, a better understanding of the mechanisms regulating stem cell fate is required. Stem cells integrate the many signals that surround them and execute cellular behaviors based on these inputs. These attributes can be harnessed and manipulated using synthetic biology to tightly control gene expression in dynamic patterns, in addition to programming cells to sense and record changes in their microenvironment. Upon successful completion, this work will enable a better understanding of the mechanisms involved in stem cell fate decisions that may eventually be exploited to direct differentiation.

Both intrinsic (transcription factor expression) and extrinsic (environmental) mechanisms are thought to be involved in the regulation of stem cell self-renewal and in their commitment to differentiate into more specialized cell types. This interplay between intrinsic and extrinsic cues poses challenges to studying the mechanisms involved in stem cell proliferation and terminal differentiation. These challenges can be addressed using genetic circuits. The completion of this research will produce tools to query the different timings, identities, and interactions of the internal and external signals of cell fate. This will be accomplished by developing strategies for probing stem cell differentiation using genetic circuits with the following objectives: (1) to expand the genetic toolbox for dynamically controlling gene expression in mammalian cells, (2) to assess changes in the extracellular microenvironment as stem cells differentiate by connecting genetic circuits to important receptors known to be involved in cell fate choices, and (3) to use these tools to probe stem cell differentiation. All results of this research will be made accessible to the research community. An additional important broader-impact area of this research program is education and outreach in STEM fields at a variety of levels, including elementary school, high school, and undergraduate students.

Formal Methods for Approximate Computing

Award date: May 1, 2016 to April 30, 2021

Today, power and energy constraints are the main driving force behind improvements in computing capabilities, and approximate computing has emerged as a promising technique for enabling these improvements. The main idea behind approximate computing is to trade the accuracy of computations for novel optimizations that improve performance or energy efficiency. This leverages the fact that many applications are resilient to errors to some extent and hence do not require absolute correctness of computations. Clearly, developing software for future approximate computing platforms will not be an easy task: a developer must introduce as many approximations as possible while at the same time ensuring that the correctness and result-quality requirements of an application are met.

This project explores automated techniques and tools to assist developers by allowing them to explore approximate computing trade-offs. The techniques are based on a rigorous, automated, and precise analysis of program approximations. The main novelty is to leverage recent advances in automated software verification in the context of approximate computing. The project will develop an open platform for the rigorous analysis of approximate programs. By focusing on improving the developer's experience when writing code for future approximate computing platforms, this work has the potential to achieve significant impact across a wide spectrum of industries and applications.
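
For readers unfamiliar with what a program approximation looks like, the snippet below shows one classic, widely studied example, loop perforation, in which a fraction of loop iterations is skipped to save work at the cost of accuracy. It is a generic illustration of the accuracy/efficiency trade-off, not a technique attributed to this particular project.

```python
# Generic illustration of an approximation: loop perforation.
# Visiting only every `stride`-th element trades accuracy for less work.

def mean_exact(xs):
    return sum(xs) / len(xs)

def mean_perforated(xs, stride=4):
    sample = xs[::stride]          # roughly 1/stride of the work
    return sum(sample) / len(sample)

if __name__ == "__main__":
    data = [float(i % 97) for i in range(100_000)]
    exact = mean_exact(data)
    approx = mean_perforated(data)
    print(exact, approx, abs(exact - approx) / exact)  # small relative error
```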

An Integrated Research and Education Program to Advance Pathogen Detection and Quantitation

Award date: July 15, 2016 – February 28, 2021

This research will improve environmental stewardship efforts and public health protection by developing and optimizing a rapid method for tracking hundreds of waterborne pathogens and developing statistical tools for identifying the source of water impairment. Several basic research questions surrounding the interface of molecular biology, environmental microbiology, and environmental engineering are proposed. Additionally, this research and education program is expected to show that student understanding and retention of microbiology concepts for environmental engineers can be measurably improved using technology-enabled, interactive learning modules.

The objective of this integrated research and education program is to develop and validate quantitative microarrays for simultaneously detecting hundreds of pathogens, fecal indicator bacteria, and microbial source tracking markers in environmental samples. The combination of the comprehensive microarray data with statistical methods for fecal source apportionment will advance the science of water protection. Rapid, multi-target tools for water quality monitoring that can simultaneously indicate fecal contamination sources can enhance our ability to provide high quality water supplies and water for reuse. In developing and testing this tool, the transformative research proposed will: 1) refocus water microbiological safety monitoring on pathogens, 2) develop rigorous methods to overcome whole genome amplification bias, 3) develop quantitative microarrays, and 4) develop statistical methods for fecal contamination source apportionment in water. The proposed quantitative microarray will facilitate rapid and sensitive determination of the presence of infectious agents in a variety of media (e.g., water, treated wastewater, and food). The research agenda will: 1) establish upstream environmental sample processing for reproducible and unbiased concentration, extraction, and amplification of nucleic acids, 2) develop quantitative microarrays and rigorous quality control methods for detection of nucleic acids from environmental samples, and 3) develop statistical methods for identification of the source of fecal pollution in environmental samples. The educational agenda will develop, test, and disseminate technology-enabled, active-learning modules covering fundamental microbiology concepts for environmental engineers. The development of these modules allows for broad dissemination of the new tools and the concepts behind them to practicing engineers, educators, and students at multiple levels. The principal investigator has a plan to assess the effectiveness of the teaching modules and pipelines in place, especially for underrepresented groups. With the NSF-REU in place and departmental funds available, the feasibility of the educational components is high.

Geometric Shape Deformation with Applications in Medicine

Award date: July 15, 2014 – June 30, 2019

In spite of significant recent advances, 3D computer graphics is still humbled when confronted with medical-grade requirements, so medical illustrators often continue to rely on 2D hand drawing. A fundamental challenge is that detailed geometric models and advanced nonlinear materials increase computational complexity, making them difficult to apply in real-time interactive applications. In this research, the PI will investigate an alternative approach based on geometric shape deformations rather than the processes which created them. He argues that intuitive shape deformation can be facilitated by guarantees of basic geometric properties such as smoothness and injectivity (no self-intersection). The key is to design algorithms that can do this quickly while providing the user with a small yet expressive set of adjustable controls to ensure an efficient interactive experience; the task of shape deformation techniques is to extrapolate this parsimonious, human-manageable set of input controls into a full-scale 3D deformation field in a natural and predictable way. The PI's hypothesis is that this requirement can be formally expressed in terms of basic geometric properties. To this end, the PI will explore both direct (closed-form) and variational methods, because while direct methods excel in speed, variational methods offer stronger guarantees and advanced geometric properties. In terms of direct methods, the PI will develop new ways to quickly blend certain groups of 3D transformations (e.g., with the help of new geometric algebraic structures). Transformation blending will be complemented by advanced influence weights that allow the user to explicitly control the resulting sparseness. In terms of variational methods, the PI will study deformation energies satisfying traditional properties such as rotation invariance but augmented with higher-order continuity and injectivity; here, the main challenge will be to find efficient numerical solutions for the underlying optimization problems. The PI believes it will prove possible to mitigate the inherent computational complexity of variational methods by suitably combining them with direct methods so as to cast some of the variational problems as convex optimizations, thereby opening the door to highly efficient convex solvers.
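
For context, the standard linear blend skinning formula (a textbook baseline used here only for illustration, not the PI's proposed method) shows how a parsimonious set of control transformations T_i and influence weights w_i extrapolates to a full deformation field, mapping each surface point v to

v' = \sum_{i=1}^{m} w_i(v)\, T_i\, v, \qquad \sum_{i=1}^{m} w_i(v) = 1, \quad w_i(v) \ge 0.

The direct methods described above aim to blend the transformations T_i in more sophisticated ways than this simple linear combination, while the influence weights w_i are where the user's control over sparseness enters.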

Shape deformation is relevant to architecture, computer aided design (CAD), and many areas of science and engineering, as well as to the entertainment industry. But this project has primarily been motivated by medical applications, inspired by requests from the PI's collaborators at The Children's Hospital of Philadelphia. Given the right tools, the classical field of hand-drawn medical illustration will evolve into 3D animated medical atlases, setting new standards in medical education. Shape deformation techniques could ultimately contribute to clinical practice by facilitating diagnosis and pre-operative planning when treating conditions such as pathological skull deformities (craniosynostosis). And shape modeling tools in expert hands could help lower the radiation dose required in CT scanning, by applying new reconstruction methods that combine user input with template models and accurate surface scans (obtained with radiation-free methods such as laser scanning). The PI also will organize seminars and courses that bring together medical and engineering students, including members of underrepresented groups, thereby promoting interdisciplinary collaboration in both research and education.

Design Decision Patterns for Visualizing Multivariate Graphs

Award date: July 15, 2014 – June 30, 2019

Multivariate graphs, or datasets that link together entities that are associated with multiple different variables, occur in a broad range of problems. For example, the dataset could be geospatial locations that include socio-economic statistics, linked together through a public transportation system. These multivariate graphs are notoriously difficult to visualize because the number of data variables exceeds the number of available visual cues – these cues include color, size, position, etc. The goal of this project is to establish a set of validated and generalizable techniques for visualizing and interacting with multivariate graphs. Three target application areas will drive the investigations: one in cancer biology, a second in urban transportation, and a third in particle physics. These areas were chosen to represent a wide spectrum of possible applications in which multivariate graphs play a central role, thus fostering generalizable results. The multidisciplinary nature of the research and the close collaboration with domain experts in our target application areas will provide a unique educational environment for undergraduate and graduate students, while also broadening the participation in computer science beyond traditional boundaries.

This is the first systematic, problem-driven effort to consider the visualization of multivariate graphs using a diverse set of application areas, with the goal of developing a generalizable set of techniques and principles for supporting a broad range of visualization and data analysis tasks. The research will be conducted with domain experts using a design study methodology, which is a deeply collaborative and user-centered approach to visualization research. The primary impact of this work will be validated visualization design decision patterns for effective visual representation and user-driven exploration of complex multivariate graphs, resulting in a more comprehensive foundation of techniques for visualizing this increasingly important data type. The resulting design decision patterns will support ongoing research and discovery in the target application areas, as well as generalize to a broad class of real-world problems. Furthermore, these patterns will form the foundation of software tools for visualizing multivariate graphs that effectively support exploration and sense-making of these complex data types by taking into account the varied relationships embedded within. Results and software will be disseminated not only to the research communities of the target application areas but also more broadly through the project website at mvgraphs.sci.utah.edu.

Manipulating Neural Plasticity to Enable Brain-Computer Interface Learning

Award date: May 1, 2014 – April 30, 2019

The overall goal of this project is to develop computational systems that can teach the brain how to interface with them in both directions. Machinery in the brain has evolved to learn complex motor skills that endure for a lifetime, like tying a shoelace or riding a bike. This project will examine the feasibility of systems designed to co-opt this latent learning machinery, teaching neural tissue to manage a brain-computer interface that controls external devices such as a motorized wheelchair or a robotic arm. If neurons in a small area could be taught to generate a coordinated signal, wearable devices could use those robust signals to reliably control external devices. To do this, electrical stimulation will be paired between the learning-control and motor-control centers of the rodent brain, teaching neurons to synchronize activation with their neighbors. Once neighboring neurons can generate synchronous activity reliably, the rodents will be allowed to use those signals to control external devices. Co-opting neural learning mechanisms will eventually enable long-term, reliable, and robust brain-computer interfaces for neural rehabilitation of people with disabilities. To expand the pool of individuals contributing to this nascent field of neural engineering, this project will teach relevant experimentation to undergraduates. In a standard BioSystems Engineering course, the PI will add a laboratory component in which students learn to manipulate the nervous systems of insects and other invertebrates to control their behaviors. These student experiences should be engaging and memorable and should translate to a broader understanding of, and an interest in, careers in neural engineering.

Motor skill acquisition is mediated via brain processes that engage both motor control (M1) and motor learning (basal ganglia) brain centers. The efferent basal ganglia are a preferred surgical target for deep brain stimulating electrodes to alleviate symptoms of parkinsonism and dystonia, and tens of thousands of multi-contact leads have been safely implanted in humans. Given their role in motor learning, the efferent basal ganglia are ideal sites from which to modify cortical connectivity. The research objective of this proposal is to determine whether plasticity in the basal ganglia-thalamo-cortical loop can be co-opted to teach the brain to generate large, easily detected, neural-field activity that could then be used to drive brain-computer-interface devices. This objective will be achieved through three tasks using a rodent model. First, the ability of coupled electrical stimulation to modify the connection strengths between the motor learning (basal ganglia) and motor control (motor cortex) centers of the brain will be tested. In parallel, a closed-loop system will be built to record selective activity from, and provide arbitrary stimulation to, 64 channels simultaneously and independently with sub-millisecond latency. Finally, with real-time feedback and unit-activity detection in the motor cortex, the brain will be taught to amplify the unit activity that correlates with natural behaviors into large field events that can be recorded reliably for years and thus drive body-external devices.

Foundations for Geometric Analysis of Noisy Data

Award date: May 15, 2014 – April 30, 2019

An important role of computational geometry is to understand and formalize the structure of data. As data becomes a central currency of modern science, this role is growing in importance. However, much of classical computational geometry inherently assumes that all aspects of data are known and precise. This is rarely the case in practice. This project focuses on building the foundations for two extensions to classic geometric settings pertinent to noisy data.

1. The PI will study locational uncertainty in point sets, where the location of each data point is described by a probability distribution. Given such an input, the goal is to formalize how to construct, approximate, and concisely represent the distribution of geometric queries on this uncertain data.

2. The PI will study the geometric consequences of applying a statistical kernel (e.g., a Gaussian kernel) to a data set. He will investigate how this process can smooth data, remove degeneracies, and implicitly simplify and regularize algorithms. Moreover, he will explore the geometric structure of the resulting kernel density estimate (see the sketch below), and how it relates to algorithms for the data and approximate representations of the data.
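
As a concrete illustration of the object studied in the second extension (a minimal sketch over placeholder data, not the PI's algorithms), a Gaussian kernel density estimate smooths a discrete point set into a continuous function whose value at a query point is the average of Gaussian kernels centered at the data points:

```python
# Minimal sketch: evaluating a Gaussian kernel density estimate (KDE).
# kde(x) = (1/|P|) * sum over p in P of exp(-||x - p||^2 / (2 * sigma^2))
import numpy as np

def kde(query, points, sigma=0.1):
    d2 = np.sum((query[None, :] - points) ** 2, axis=1)   # squared distances to each data point
    return float(np.mean(np.exp(-d2 / (2.0 * sigma ** 2))))

# Placeholder data: the smoothed function kde(.) is the geometric object
# whose structure and level sets this thrust studies.
pts = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0]])
print(kde(np.array([0.1, 0.05]), pts))
```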

The PI will lead the development of a data-focused educational program around the themes of data analysis, algorithmics, and visualization. The PI is developing a model course for this program on data mining; it focuses on the geometric, statistical, and algorithmic properties of data. An extensive set of course notes is being compiled, accompanied by videotaped lectures freely available online. This class and program attract many interdisciplinary and diverse students and observers. This program is part of a larger effort to make relevant data analysis techniques from computational geometry available to a broader data-rich audience.

Static-Analysis-Driven Engineering of Modern Software Systems

Award date: February 1, 2014 – January 31, 2019

Users of software are all too familiar with its shortcomings: software is slow, software is buggy, and software is insecure. When a complex software system fails, it is unhelpfully simplistic to blame the implementors of the system as incompetent. The truth is that software engineers are uniquely disadvantaged among the traditional engineering disciplines because they lack a viable predictive model for the systems they design and build. That is, a software engineer cannot predict the behavior of software in practice in the same way that a civil engineer can predict the behavior of a bridge under load. The primary intellectual merit of this research is that it lays the critical, systematic foundations for the science of prediction for software. The broader impacts are to enable engineers to build better software with the aid of such predictions. Moreover, this research also seeks to develop courses and educational material to train the next generation of software engineers in the art of constructing fast, safe, reliable, and secure software in this fashion. As this research transfers into practice and engineers adopt this methodology, it will significantly strengthen the foundation of the national cyberinfrastructure.

The core technical thrust of this research is the development of a systematic method for the synthesis of static analyzers for complex, modern programming languages. It also explores whether or not this methodology can be automated in whole or in part. To motivate the development of this method, this research applies it to the synthesis of intensional static analyzers for popular scripting languages such as JavaScript, Perl, PHP, Ruby, and Python, many of which happen to be the languages powering modern, web-based software. The foundational technical concept of this research is the systematic transformation of small-step interpreters into static analyzers. Small-step analyzers promise unique advantages over traditional techniques, including more opportunities for optimizing speed and precision, and clearer, easier reasoning about the soundness of the results of the analysis.
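
To illustrate the core idea of deriving a static analyzer from a small-step interpreter, the sketch below interprets a tiny one-variable counter language concretely, then reuses the same step-by-step structure over an abstract sign domain and explores all reachable abstract states with a worklist. It is a deliberately minimal, hypothetical illustration of the general technique, not the project's system; the language and example program are made up.

```python
# Toy illustration: from a small-step interpreter to a static analyzer.
# Program: a list of instructions over a single integer variable x.
PROG = ["inc", "dec", "inc"]  # hypothetical example program

# --- concrete small-step interpreter ---
def step_concrete(state):
    pc, x = state
    if pc >= len(PROG):
        return None                       # halted
    return (pc + 1, x + 1 if PROG[pc] == "inc" else x - 1)

def run_concrete(x0=0):
    state = (0, x0)
    while (nxt := step_concrete(state)) is not None:
        state = nxt
    return state

# --- abstract domain: the sign of x ---
def abstract(n):
    return "+" if n > 0 else "-" if n < 0 else "0"

def add1(sign):   # abstract effect of "inc"
    return {"+": {"+"}, "0": {"+"}, "-": {"-", "0"}}[sign]

def sub1(sign):   # abstract effect of "dec"
    return {"+": {"+", "0"}, "0": {"-"}, "-": {"-"}}[sign]

# --- abstract small-step relation: one state may step to many ---
def step_abstract(state):
    pc, sign = state
    if pc >= len(PROG):
        return set()
    succ = add1(sign) if PROG[pc] == "inc" else sub1(sign)
    return {(pc + 1, s) for s in succ}

def analyze(sign0):
    # Worklist exploration of the finite abstract state space gives a
    # sound over-approximation of all reachable concrete states.
    seen, work = set(), [(0, sign0)]
    while work:
        st = work.pop()
        if st in seen:
            continue
        seen.add(st)
        work.extend(step_abstract(st))
    return seen

if __name__ == "__main__":
    print(run_concrete(0))                 # one concrete execution: (3, 1)
    print(sorted(analyze(abstract(0))))    # every abstract state reachable from x = 0
```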

THz Active Metamaterials Employing Thin-Film Semiconductors

Award date: January 1, 2014 – December 31, 2018

Objective: The objective of this program is to provide a solid foundation for the PI's long-term research goal of developing active solid-state terahertz devices that can be employed in compact, low cost communication and imaging systems. Based on the enhanced light-matter interaction in thin-film semiconductor-based metamaterials, this proposal aims to develop devices for: a) terahertz beam steering, based on the linear properties of these materials, and b) terahertz generation via mixing and/or higher harmonic generation, based on the materials' nonlinearities, thereby targeting two of the central limitations of existing terahertz technology.

Intellectual Merit: The intellectual merit is to foster the development of novel terahertz optoelectronic devices, which can form the basis for a wide range of applications. Moreover, these studies are expected to improve the current understanding of the terahertz properties of two thin-film semiconductors, vanadium dioxide and graphene, especially in terms of their nonlinear properties. Since both materials can be built on arbitrary substrates, their low cost is expected to lead to market applications.

Broader Impact: The broader impacts are to create some of the components necessary to make complex terahertz systems and to make these systems more readily available. It is expected that the developed technologies will be transformative for a broad range of scientific and application-oriented communities. Moreover, undergraduate and graduate students will be trained in the emergent fields of terahertz technology and nanomaterials, and they will be an integral element of the planned outreach activities to disseminate the new discoveries to the general public, school teachers, and students.

Enhanced Power Generation in a Nanoscale-Gap Thermophotovoltaic Device due to Radiative Heat

Award date: July 1, 2013 – June 30, 2018

Approximately 58% of the energy consumed annually in the US is lost as heat. Direct thermal-to-electrical energy conversion via thermophotovoltaic power generators can contribute significantly to recycling this large amount of waste heat in various systems such as combustion chambers, photovoltaic cells, and personal computers. Conventional thermophotovoltaic systems are limited by the blackbody spectrum. When the heat source and the cells converting heat into electricity are separated by a nanosize gap, radiative heat transfer exceeds blackbody predictions by a few orders of magnitude due to energy transport by evanescent waves. Enhanced energy transfer by evanescent wave tunneling can thus lead to a significant increase in thermophotovoltaic power generation.

This research will demonstrate experimentally that power generation in a nanoscale-gap thermophotovoltaic (nano-TPV) device can be enhanced by a factor of 20 to 30 compared to conventional thermophotovoltaic systems. This will be accomplished by measuring radiative heat flux and nano-TPV electrical power output and conversion efficiency in a device involving planar surfaces separated by a gap as small as 20 nm. The nanosize gap will be maintained via spring-like spacers. The application of an electrostatic force between the surfaces, combined with knowledge of the spring constant of the spacers, will allow precise control and measurement of the gap thickness. This research will provide for the first time well-controlled radiative flux measurements between planar surfaces separated by a nanosize gap. This will allow verification of the d^-2 and d^-3 near-field thermal radiation regimes, where d is the gap thickness, predicted respectively for optically thick and optically thin materials supporting resonant surface waves. The research will provide the first quantitative experimental nano-TPV performance analysis at nanosize gaps. Spectral effects will be considered by testing various materials for the radiator, such as indium tin oxide supporting surface plasmon-polaritons in the near infrared band. Nano-TPV performance will be systematically quantified as a function of the gap thickness and the temperatures of the radiator and the cells, and will be compared against predictions based on a coupled near-field thermal radiation, charge, and heat transport model. The impacts of heat dissipation within the cells due to lattice and free carrier absorption, thermalization, and non-radiative recombination of electron-hole pairs will also be analyzed in great detail.
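
Restating the gap-size regimes named above in formula form (an illustrative summary, with q''_nf the net near-field radiative heat flux, d the gap thickness, and T_r and T_c the radiator and cell temperatures):

q''_{\mathrm{nf}} \propto d^{-2} \quad \text{(optically thick media)}, \qquad q''_{\mathrm{nf}} \propto d^{-3} \quad \text{(optically thin films)},

both of which can exceed the far-field blackbody limit q''_{\mathrm{bb}} = \sigma \left(T_r^4 - T_c^4\right) by orders of magnitude once d shrinks well below the thermal wavelength.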

This project will enhance fundamental knowledge in near-field thermal radiation by measuring fluxes between surfaces separated by sub-wavelength gaps, and in evanescent wave-based energy conversion by experimentally analyzing nano-TPV performance. The research is an important step toward the establishment of miniature waste heat recovery devices that could be used in personal computers and in systems harvesting heat from the human body. The project will also promote training and learning through the involvement of high school, undergraduate, and graduate students in the activities. Fundamentals of near-field thermal radiation and its application to power generation will be disseminated via the development of a new elective course at the University of Utah open to both undergraduate and graduate students, with the class content made freely available to the general public. K-12 outreach will be performed via the Utah Science Olympiad. Departmental scholarships and research fellowships will be offered to high school students participating in this event. Direct thermal-to-electrical energy conversion will be promoted via an interactive, portable demo kit that will be presented at the Utah Science Olympiad and in high schools. These activities will assist the efforts of the College of Engineering in recruiting high-quality students into science and engineering programs.

Microorganisms Swimming Around Microstructural Heterogeneity

Award date: June 1, 2013 – May 31, 2018

The goal of the proposed work is to obtain a fundamental understanding of how motile microorganisms, such as bacteria, are able to move through complex biological environments, such as bodily tissues or mucus, which often have microstructural features of similar size to the microorganisms themselves. To properly study the effect of these microstructural features on swimming microorganisms, numerical computations capable of calculating the flows around complex geometries will be employed. These computations will also be able to study the flexibility of the microstructures and of the swimming microorganisms, and the effects of this flexibility on swimming. In the past, most theoretical investigations into swimming in such complex environments have treated the microstructure only in an average sense; these studies have not been able to determine which environmental properties control swimming behavior. It is expected that treating the microstructure explicitly as moving objects will give insight into what features promote or hinder movement through these environments. The results obtained through the numerical computations will expose the physical mechanisms of microstructural influence on swimming and will be compared to results from investigations that do not treat the microstructure explicitly.

Microorganism locomotion and propulsion in environments with microstructure affects infection, healthcare, and ecology. As examples, the microstructure of mucus can act as a barrier to infection; the research may lead to ways to alter the microstructure to hinder infection. Understanding tissue penetration is needed to design microrobots capable of precise delivery of drugs to targeted tissues such as cancer tumors, avoiding side effects in other tissues. Ecologically, the secretions of marine organisms can form a microstructural network; understanding the effects of this network on swimming microorganisms may alter estimates of microbial foraging efficiency, which affects carbon recycling rates and hence the global climate. Environmentally, bacterial dispersion in porous soils can be hindered to prevent water contamination or enhanced to promote bioremediation of pollutants. Finally, the proposed work also contains an educational component that aims to create "Move Like a Microbe," a force-feedback simulation of microscale microbial locomotion. It will bring the proposed research to life for the public and K-12 students by providing a hands-on demonstration of how microorganisms are able to swim, and will explain the consequences of microbial locomotion for everyday life.

A Multiscale Study of Heavy Particle Transport in Sparse Canopies

Award date: March 1, 2013 – February 28, 2018

Particle Dispersion in Plant Canopies is an innovative and comprehensive project designed to integrate research and education at the university level and to extend the learning to outreach at local area high schools in Salt Lake City. It is designed to pique student interest in scientific research while students participate in lab activities focused on key hypotheses regarding atmospheric transport of particles through and above plant canopies, with applications to atmospheric scalar transport and plant pathology in agricultural and natural ecosystems. The project is well founded in learning theory and integrates the disciplines of meteorology, biology, physics, engineering, and mathematics. It is part of the long-range career plan of the PI, is aligned with the University's mission, and meets College and Departmental goals.

Particle transport through the atmosphere plays an important role in many ecosystems. Significant portions of these ecosystems interface with the atmosphere through plant canopies. Understanding the transport through these canopies, between canopies and the overlying atmosphere, and between disconnected canopies is critical to understanding how these ecosystems function, and how to manage any positive or negative effects. The research objectives of the project are designed to investigate these issues and are critical towards developing improved models for net ecosystem fluxes and to prevent the spread of airborne pathogens that cause plant diseases.

The educational objectives address the NSF priority of encouraging more women and minorities to enroll in academic courses preparatory to STEM-related careers. An extensive evaluation plan is included that will track participating students’ attitude toward STEM careers in high school and follow course enrollment in STEM courses for three years. Likewise, student tracking will occur at the undergraduate level, as enrollment data is available from each of the Utah high schools that feed into the University of Utah.

The intellectual merit of the project comes from the use of different approaches to develop a new comprehensive understanding of the impact of canopy geometry on flow dynamics and particle transport across a wide range of spatial scales. This understanding will lead to new models that can account for momentum transport and particle dispersion in and above plant canopies, facilitating the asking and answering of questions related to how canopy geometry affects ecosystem functioning and services, how ecosystems are connected across landscapes, food production, the economic security of agricultural producers, and general heavy particle dispersion. Existing models for the dispersion and deposition of heavy particles to and from plant canopies fail to include the effect of horizontal heterogeneity (e.g., non-vegetated space between plants and land-cover transitions) on flow dynamics. The PI's goal is to develop improved models that can capture the effect of horizontal heterogeneity over a wide range of spatial scales to benefit natural and agricultural systems.

Through direct collaboration with biologists, this project will contain a strong interdisciplinary component, giving graduate students unique experience working outside their core discipline and promoting interaction between engineering, atmospheric science, and biology. These interdisciplinary benefits will extend to undergraduate students and to the high school level in Utah. The active (hands-on) laboratory experiences will be designed to promote teaching, training, and learning at all of these levels, and to broaden the participation of underrepresented groups. While the goal is to engage the students, a critical component of this program will be the involvement and training of high school teachers and their effect upon STEM enrollment. Results of the project will be disseminated broadly through the project's web site and through presentations at high school and college science education conferences. Society in general will benefit from research leading to better management of natural resources and crop production.

Deep Sparse Dictionary Context Models and Their Application to Image Parsing and Neuron Tracking for Connectomics

Award date: September 1, 2012 – August 31, 2017

The research objective of this proposal is to create novel computational algorithms and image processing tools that will make it possible for biologists to reconstruct large-scale neural circuits from electron microscopy volumes. Electron microscopy is a key technology in the reconstruction of neural circuits at the level of individual neurons and synapses, also known as connectomics. While an important motivation of connectomics is providing anatomical ground truth for neural circuit models, the ability to decipher neural wiring maps at the individual cell level is also important in studies of many neurodegenerative diseases. State-of-the-art image analysis solutions are still far from the accuracy and robustness of human vision, and biologists remain limited to studying small neural circuits using mostly manual analysis. The proposed computational models will provide biologists with a tool for segmenting individual neurons and detecting other structures, such as synapses, in very large electron microscopy volumes, and for proofreading these automatically produced results in a time-efficient manner.

Reconstruction of a neural circuit from an electron microscopy volume involves pixel-by-pixel annotation of these images into classes such as cell membrane, mitochondria, and synaptic vesicles, and the segmentation of individual neurons in three dimensions. This task demands extremely high accuracy. Even with 99% pixel accuracy, an acceptable accuracy for many other applications, it is virtually certain that almost every neuron in a volume will be incorrectly segmented due to their global, tree-like structure and correspondingly large surface area. Therefore, the lack of reliable automated solutions is a critical bottleneck in the field of connectomics. In this project, a novel hierarchical model will be created by combining the representation power of sparse dictionaries and their ease of learning with an inference and proofreading capability. Human experts will contribute to the process by providing ground truth for supervised learning and by proofreading automatically produced results. The combination of deep sparse dictionaries with inference using connection weights from conditional probabilities can provide a very fast way to learn hierarchical models. Several variants of the model will be studied to understand the relative importance of feature representation, inference, symmetric connections, and deep and lateral connections. The model will be applied to general object classification and image parsing problems in computer vision as well as to connectomics datasets. Success will be evaluated on real datasets annotated by experts.
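
To make this accuracy requirement concrete, consider a back-of-the-envelope calculation (purely illustrative; the figure of one million boundary pixels per neuron is an assumed round number, not a value from the project):

    # Illustrative only: probability that a neuron is segmented without a single
    # pixel error, assuming independent errors and an assumed round number of
    # one million boundary pixels per neuron.
    p_pixel = 0.99                 # per-pixel classification accuracy
    n_pixels = 1_000_000           # assumed boundary pixels for one neuron
    p_neuron_ok = p_pixel ** n_pixels
    print(p_neuron_ok)             # underflows to 0.0; the true value is ~1e-4365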

Next-Generation Micro Gas Chromatography System Toward Ultra-High Capacity, Selectivity, and Portability For Distributed Environmental Awareness

Award date: August 1, 2012 – July 31, 2017

Some unknown physical and chemical phenomena can be precisely observed and engineered by manipulating fluids at the micro and nano scales. The PI's long-term research agenda lies in the development of integrated microsystems that interface with and reverse-engineer non-electric ambient phenomena by utilizing precision electro-mechanical transduction via fluidic movement. This 5-year CAREER proposal focuses on developing a wearable micro gas chromatography (µGC) system to enable real-time, on-the-spot, personal monitoring of a class of airborne pollutants (volatile organic compounds, or VOCs) and provide early warning for individuals. Specifically, the PI proposes to investigate the fundamental science of an entirely novel gas chromatography configuration that is expected to overcome the major barrier to miniaturization and enable ultra-high capacity, selectivity, and portability beyond current state-of-the-art technology.

To overcome the miniaturization barrier in scaled-down gas chromatography devices, a fundamental scientific conflict must be resolved between chromatographic separation capacity and fluidic pumping capacity under size restrictions. Gas chromatography systems identify targets by racing them along a column, resulting in spatial separation. Ideally, a longer column provides greater separation among more targets and thus higher detection capacity. However, it also imposes rapidly increasing fluidic resistance that requires over-sized pumps, preventing true portability of the whole system. Therefore, to enable both high capacity and wearable portability in GCs, both sufficient column length and sufficient fluidic head pressure must be attained in a miniaturized package. Currently there are no viable options to achieve both. This project proposes to address these barriers by: (1) investigating the fundamental science and establishing a prediction model of the proposed novel gas chromatography configuration, (2) examining and maximizing performance capacity and limitations under scaling, and (3) experimentally demonstrating functioning GC operation utilizing the novel configuration for environmental monitoring: detection of volatile organic compounds (VOCs).
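
The scaling conflict can be sketched with a rough, hedged estimate based on the incompressible Hagen-Poiseuille relation (real carrier gas is compressible, and the column dimensions and flow rate below are assumed values, not the proposed design):

    import math

    # Illustrative scaling estimate only (incompressible Hagen-Poiseuille flow);
    # real GC carrier gas is compressible, and these dimensions are assumed,
    # not taken from the proposed design.
    mu = 2.0e-5        # Pa*s, approximate viscosity of a helium carrier gas
    d  = 100e-6        # m, assumed column inner diameter
    Q  = 1e-6 / 60.0   # m^3/s, assumed volumetric flow of ~1 mL/min

    for L in (1.0, 10.0, 30.0):                      # column lengths in meters
        dP = 128.0 * mu * L * Q / (math.pi * d**4)   # pressure drop, Pa
        print(f"L = {L:5.1f} m  ->  dP = {dP/101325:6.1f} atm")

Required head pressure grows linearly with column length and quickly exceeds what a miniaturized pump can supply; this is the conflict the proposed configuration aims to resolve.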

Intellectual Merit: Although they hold great promise as enabling tools, recent micro-scale gas sensors still require bulky pumping systems, barring true portability of the whole integrated system. This project obviates this dilemma by providing a novel, paradigm-shifting concept in gas-chromatography-based sensors. Scientific establishment of the proposed concept is expected to lead to a revolutionary advancement in generic chemical and biological detection technology and instrumentation at all scales. Additionally, the experimental demonstration will provide a new design guideline for multiple-component gas chromatography systems built on the new configuration.

Broader Impact: Recent literature and government policy have increasingly reported the emerging demand for knowing environmental conditions in real time in workplaces, public spaces, and homes. The proposed project is expected to bring microsystems technology, analytical chemistry, and environmental education together, creating synergistic impacts that increase environmental awareness in both academia and the public. Specifically, the education objective of this project is to enhance the awareness of underrepresented high school and K-12 students of the importance of environmental monitoring and the roles of science and technology. The project will educate the next generation of students about the impacts of micro/nano sensor technology in this context. It will train multiple graduate and undergraduate students through a new course and hands-on modules, and will introduce many K-12 students to basic sensor concepts through an ongoing collaboration with a local science museum. The project is highly interdisciplinary, spanning engineering and science, and will also expose students to important social issues for a balanced education.

3-D Global Full Maxwell’s Equations Modeling of the Effects of a Coronal Mass Ejection on the Earth

Award date: July 1, 2012 – August 31, 2015

This project will develop a global numerical model of the electromagnetic fields produced by ionospheric electromagnetic currents, with the research goal of enabling better prediction of the impact of space weather on the electric power grid. The methodology involves finite-difference time-domain (FDTD) computational solutions of the full-vector, three-dimensional, time-dependent Maxwell's equations for electromagnetic wave propagation within the global Earth-ionosphere system. In particular, the atmosphere-lithosphere volume from 400 km above the Earth's surface to 400 km below will be studied in unprecedented detail at high spatial resolutions. The research will build on previous and current work in this area by using Active Magnetosphere and Planetary Electrodynamics Response Experiment (AMPERE) data and by establishing collaborations with the Electric Power Research Institute SUNBURST monitoring network project. The model will simulate the effects of ionospheric currents that develop as the result of coronal mass ejections in order to improve our understanding of the resultant electrodynamics. The fine spatial resolution of the newly computed FDTD solutions will provide improved information for assessing and mitigating potential hazards specific to the operations of inland and coastal power grids and oil pipelines. One education goal is to develop a sequence of three computational electromagnetics courses at the University of New Mexico. A second goal is to initiate and establish a Residential College system at the University of New Mexico.
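
The global model is of course far more elaborate, but the core FDTD idea, leapfrog updates of interleaved electric and magnetic fields on a staggered grid, can be sketched in one dimension (grid size, time steps, and source below are arbitrary illustrative choices):

    import numpy as np

    # Minimal 1D FDTD sketch (vacuum, normalized units, Courant number = 1).
    # Illustrative only: the project's solver is 3D, global, and includes a
    # realistic ionosphere/lithosphere; none of that is modeled here.
    nx, nt = 200, 400
    ez = np.zeros(nx)          # electric field samples
    hy = np.zeros(nx - 1)      # magnetic field, staggered half a cell

    for n in range(nt):
        hy += ez[1:] - ez[:-1]              # update H from the curl of E
        ez[1:-1] += hy[1:] - hy[:-1]        # update E from the curl of H
        ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source

    print(ez[:5])  # field samples after the pulse has propagated outward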

Novel Query Processing Techniques for Distributed Probabilistic Data

Award date: September 29, 2011 – January 31, 2016

Data are increasingly generated, stored, and processed in a distributed fashion. Meanwhile, when large amounts of data are generated, ambiguity, uncertainty, and errors are inherently introduced, especially in a distributed setup. Such data are best represented in a distributed probabilistic database. In distributed data management, summary queries are useful tools for obtaining the most important answers from massive quantities of data effectively and efficiently, e.g., top-k queries, heavy hitters (also known as frequent items), histograms and wavelets, and threshold monitoring queries. This project investigates novel query processing techniques for a variety of important summary queries over distributed probabilistic data.
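
As one concrete, single-site example of the kind of summary involved (the classic deterministic Misra-Gries heavy-hitters summary, shown for illustration only; it is not one of the distributed probabilistic algorithms developed in this project):

    def misra_gries(stream, k):
        """Classic Misra-Gries summary: any item occurring more than
        len(stream)/k times is guaranteed to survive among the k-1 counters.
        Single-site, deterministic version for illustration only."""
        counters = {}
        for x in stream:
            if x in counters:
                counters[x] += 1
            elif len(counters) < k - 1:
                counters[x] = 1
            else:  # decrement every counter and drop those that reach zero
                counters = {y: c - 1 for y, c in counters.items() if c > 1}
        return counters

    print(misra_gries(list("abracadabra"), k=3))   # 'a' survives as a heavy hitter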

Broadly classified, this project examines both snapshot summary queries in static (i.e., no updates) distributed probabilistic databases, and continuous summary queries in dynamic (i.e., with updates) distributed probabilistic databases. A number of techniques are explored to design novel, communication and computation efficient algorithms for processing these queries.

A distributed probabilistic data management system (DPDMS) prototype is implemented based on the query processing techniques developed in this project. This DPDMS is released to and used in practice by scientists and engineers from other science disciplines as well as industry.

Graduate and undergraduate students, including those from minority groups, are actively involved in this project. Findings from the project have been integrated into different courses, demos, and educational projects. For further information, such as publications, data sets, source code, and education initiatives, please visit the project website at http://www.cs.fsu.edu/~lifeifei/dpdm.

Scalable Nanopatterning to Enable High Efficiency Photovoltaics

Award date: July 11, 2011 – June 30, 2016

This Faculty Early Career Development (CAREER) award provides funding to enable the cost-effective manufacture of nanostructures in a manner that is scalable to areas large enough for photovoltaic applications. The high cost of solar energy is the main impediment to its widespread adoption. Nanostructures hold great promise to lower this cost via enhanced light trapping and increased efficiencies. However, there exists no viable approach for the manufacture of complex nanostructures over areas large enough (~1 m²) to be of interest in solar applications. This project aims to overcome these limitations via a massively parallel, optical approach that will enable fast nanopatterning with near-molecular resolution. When combined with inexpensive replication technologies, a new framework for scalable nanomanufacturing becomes feasible. Another intellectually compelling core of this approach is the demonstration that the 150-year-old far-field diffraction limit can be overcome at low light intensities. This project aims to: (1) study optimized molecule systems for optical nanopatterning, (2) assemble an optical system that is capable of fast nanopatterning, and (3) apply this system to build a solar cell whose efficiency is enhanced using nanophotonic structures.

The ability to sculpt nanostructures over large areas with exquisite fidelity will advance fields beyond solar cells, especially in nanoelectronics and nanophotonics. This system will be made available to users in academia and industry, enabling early and widespread adoption. This project should lead to the creation of intellectual property and to commercialization with concomitant generation of high-value jobs. In order to integrate education with research, this project includes two novel demonstration modules in solar energy specifically designed to appeal to high school students and undergraduates. A primary component of the education effort is the integration of under-represented students in solar-energy research.

Statistical Models and Classification of Time-Varying Shape

Award date: June 1, 2011 – May 31, 2016

This project develops nonlinear statistical models and classification procedures for time-varying shape and investigates their application to biomedical image analysis problems. In biology and medicine it is often critical to understand processes that change the shape of anatomy. For example, a neuroscientist studying the development of the infant brain would be interested in how neurodevelopment is different in healthy children versus those with Autism. An evolutionary biologist studying how a species has evolved to adapt to its environment would be interested in studying changes in the shape of bones found in the fossil record. The challenge in this modeling problem is that shape and shape variations are highly nonlinear and high-dimensional, and standard linear statistics cannot be applied. Therefore, the ability to model and understand changes in shape depends on the development of new regression models for data in nonlinear spaces. The research activities of this project include: (1) developing statistical models for dealing with time-varying shape using least-squares principles in shape manifolds, (2) investigating new classification methods for shape sequences, and (3) validating the methodology using synthetic data and testing its efficacy for neuroimaging applications in Alzheimer’s disease and Autism. In addition to the significant impact to computer vision, biology, and medicine, this project is combining differential geometry, statistics, and computing within the undergraduate and graduate computer science curriculum.
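
A small illustration of why standard linear statistics break down on nonlinear spaces is the intrinsic (Fréchet) mean on the unit sphere, computed by iterating exponential and log maps rather than averaging coordinates; this is a generic manifold-statistics toy, not the shape-regression methodology proposed here, and the data points are arbitrary:

    import numpy as np

    # Toy intrinsic mean on the unit sphere S^2, a minimal example of doing
    # statistics in a nonlinear space. Not the project's shape models; shapes
    # live in far higher-dimensional manifolds.
    def log_map(p, q):
        """Tangent vector at p pointing toward q along the geodesic."""
        cos_t = np.clip(np.dot(p, q), -1.0, 1.0)
        theta = np.arccos(cos_t)
        if theta < 1e-12:
            return np.zeros_like(p)
        v = q - cos_t * p
        return theta * v / np.linalg.norm(v)

    def exp_map(p, v):
        """Point reached from p by following tangent vector v along a geodesic."""
        t = np.linalg.norm(v)
        if t < 1e-12:
            return p
        return np.cos(t) * p + np.sin(t) * v / t

    points = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
    mean = points[0]
    for _ in range(20):                       # fixed-point iteration
        v = np.mean([log_map(mean, q) for q in points], axis=0)
        mean = exp_map(mean, v)

    print(mean)   # approx. [0.577, 0.577, 0.577], a point on the sphere itself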

Bacteriophages in Activated Sludge Bioreactors: Effect on Process Performance and Sustainability, and a Tool for Educational Outreach to Underrepresented Students

Award date: January 1, 2011 – May 31, 2016

This CAREER proposal aims to integrate the PI's research on bacteriophages with education and outreach. The motivations behind this CAREER proposal are three-fold: engineering, scientific, and educational. The principal engineering goal of this CAREER proposal is to demonstrate the application of bacteriophage (viruses that infect bacteria) mediated biocontrol in activated sludge systems, using filamentous bulking and membrane biofouling in submerged bioreactors as the two model applications. The broad scientific goals are: (1) to better understand phage-mediated lysogeny of key activated sludge bacteria under the influence of various environmental and operational parameters, and (2) to study the diversity of phages isolated from full-scale activated sludge plants using 454 pyrosequencing. The educational goal is to enhance the participation of minority undergraduate and K-12 students in environmental engineering using novel computer-animation and internet-based techniques, to provide minority U.S. undergraduate students with international research experience, and to integrate the proposed research into education at various levels and at multiple institutions (the University of Utah and the University of Texas Pan American, a Hispanic-serving institution at the U.S.-Mexico border). The specific research tasks are to: (1) isolate bacteria from the biofilm formed on the membrane of a laboratory-scale membrane bioreactor and characterize them using molecular tools, (2) isolate lytic phages from full-scale wastewater treatment plants against selected model filamentous bacteria and the bacteria isolated from the lab-scale membrane bioreactor, (3) demonstrate phage-mediated biocontrol of filamentous bulking caused by selected model bacteria and biofilm-forming bacteria, and evaluate the cross-infectivity of the isolated phages in bioreactors and their environmental impact on receiving waters, (4) evaluate whether lysogeny is important in key activated sludge bacteria under variations in pH, temperature, organic loading, and toxic loadings, and in the presence of heavy metals, using cultured model bacteria, and (5) obtain the genomes of the phages isolated in tasks 1 and 2 using 454 pyrosequencing and study the genomes using established bioinformatic tools.

The PI's educational and outreach aims are to: (1) involve undergraduate and high school students in laboratory research to stimulate their interest in environmental engineering, (2) integrate bacterial ecology and virology concepts into teaching to develop an effective learning module for undergraduates, (3) develop computer-animation and web-based tools for improving public understanding of science, wastewater treatment, and phage-related issues, and (4) integrate the research into graduate education and disseminate the knowledge and research findings.

The proposed research will improve our understanding of the role of bacteriophages in activated sludge systems, the world's most widely used engineered bioreactors. The research will develop phage-mediated control strategies for filamentous bulking and membrane biofouling in membrane bioreactors (MBRs), the two most common operational problems in activated sludge bioreactors. Hence, the research promises to provide long-term energy sustainability (i.e., reduced air sparging in MBRs) and cost effectiveness to the operation of activated sludge systems, including membrane bioreactors. The research on operational and environmental factors that trigger phage-mediated bacterial lysis, proposed in this CAREER proposal, is fundamental in nature and may shed light on currently unexplained process upsets in activated sludge bioreactors. The proposed research will, for the first time, use fundamental concepts of virology to provide operational sustainability to activated sludge systems. This is the first systematic study to explore phage-bacteria interactions in activated sludge processes.

The research outcomes will directly affect many other related areas, such as biofilms in drinking water distribution systems, understanding phage-mediated horizontal gene transfer, elimination of hydrogenotrophic bacteria in biohydrogen-producing reactors, and elimination of nitrite oxidizers in anaerobic ammonia oxidation. The PI will focus on training Native American and Hispanic students to encourage them to pursue careers in environmental engineering. Computer-based animation and web-based tools will help reach a broader audience. The PI's effort to develop and teach an undergraduate course at the University of Texas Pan American will stimulate Hispanic students' interest in pursuing graduate studies in environmental engineering. Graduate students will be trained to be leaders in their field as well as engineering ambassadors. Laboratory exercises in the K-12 curriculum will expose K-12 students to the exciting fields of wastewater engineering and microbiology. Presentations at leading conferences and publications in highly regarded peer-reviewed journals will demonstrate the success of the research.

Nonuniform-Magnetic-Field Control of Medical Microrobots

Award date: April 1, 2010 – March 31, 2015

Magnetic microrobots that navigate the natural pathways of the body have the potential to revolutionize minimally invasive medicine and biomedical research. Current magnetic manipulation systems utilize massive magnets to produce a uniform magnetic field over a relatively small area. Uniform magnetic fields are used to simplify control, but this simplified control comes at a huge cost, and it is difficult to scale up most laboratory field-generation systems to the size required for clinical use. The use of nonuniform magnetic fields makes it possible to place magnets nearer to the patient, which permits the use of smaller, less expensive magnets, while simultaneously improving the actuatable degrees of freedom and force levels that systems can render. The hypothesis being tested is that using nonuniform magnetic fields to wirelessly control medical microrobots results in superior systems, in terms of size, cost, and performance, compared to using uniform fields. This research consists of two thrusts: control of magnetically tipped continuum microrobots, which provide distal dexterity in hard-to-reach locations, and control of fully untethered magnetic helical microrobots, which swim and crawl through fluids, lumens, and soft tissue using a method inspired by bacterial flagella. Understanding how to use nonuniform magnetic fields for wireless control may be the key to translating nearly every previously developed method for microrobot propulsion into clinical practice. Magnetic microrobots may be the ideal platform from which to deploy the numerous BioMEMS devices and magnetic sensors and actuators that have been designed in recent years.
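
The physical intuition behind the role of field gradients can be sketched numerically: the force on a magnetic dipole is the gradient of m·B, which vanishes in a uniform field (uniform fields only produce torque). The field model and magnitudes below are arbitrary illustrative values, not a model of the proposed hardware:

    import numpy as np

    # Force on a magnetic dipole: F = grad(m . B). Uniform fields exert torque
    # but no force; field gradients are what pull on a microrobot. The field
    # model below is simplified (divergence-free constraints ignored) and the
    # magnitudes are assumed values, chosen only for illustration.
    def B_uniform(p):
        return np.array([0.0, 0.0, 0.05])               # 50 mT everywhere

    def B_gradient(p):
        return np.array([0.0, 0.0, 0.05 + 2.0 * p[2]])  # 2 T/m gradient along z

    def dipole_force(B_func, p, m, h=1e-6):
        """Numerical gradient of m.B at point p."""
        f = np.zeros(3)
        for i in range(3):
            dp = np.zeros(3)
            dp[i] = h
            f[i] = (m @ B_func(p + dp) - m @ B_func(p - dp)) / (2 * h)
        return f

    m = np.array([0.0, 0.0, 1e-6])   # dipole moment, A*m^2 (assumed)
    p = np.zeros(3)
    print(dipole_force(B_uniform,  p, m))   # ~[0, 0, 0]    no pulling force
    print(dipole_force(B_gradient, p, m))   # ~[0, 0, 2e-6] newtons along z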

Geometric Algorithms For Data Analysis In Spaces Of Distributions

Award date: February 1, 2010 – January 31, 2015

Collections of distributions arise naturally when analyzing large data sets. Since it is impractical to store all but a small fraction of such data, distributional representations are typically used to summarize the data in compact form. For example, a document in a corpus is typically represented by a normalized vector of frequencies of occurrence of keywords, an image is represented by a histogram over gradient features and speech signals are represented by spectral densities over a frequency domain.

Representing data sets as collections of distributions enables analysis via powerful concepts from statistics, learning theory and information theory. Concepts like strength of belief, information content, and pattern likelihood are used to extract meaning and structure from the data and are quantified using information measures like the Kullback-Leibler distance and its parent class, the Bregman divergences.

These measures capture meaning in data in a manner that traditional metrics cannot, by connecting abstract notions of information loss and transfer with concrete geometric notions like distances. However, they lack properties like symmetry and the triangle inequality that are essential requirements for the application of traditional geometric algorithms for data analysis.
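
A tiny example of the asymmetry issue, using arbitrary example distributions: the Kullback-Leibler divergence depends on the order of its arguments, so it is not a metric and cannot be dropped directly into metric-based geometric algorithms:

    import numpy as np

    # KL(p || q) = sum_i p_i * log(p_i / q_i); asymmetric, hence not a metric.
    # The example distributions are arbitrary.
    def kl(p, q):
        p, q = np.asarray(p, float), np.asarray(q, float)
        return float(np.sum(p * np.log(p / q)))

    p = [0.7, 0.2, 0.1]
    q = [0.4, 0.4, 0.2]
    print(kl(p, q), kl(q, p))   # two different numbers: KL is not symmetric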

In this project, the PI will develop a systematic, rigorous and global algorithmic framework for manipulating these distances. This framework will provide the foundation for efficient and accurate data analysis of spaces of distributions, and will lead to deeper insights into analysis problems across a wide range of applications.

High-Rate Wireless Data Links for Biomedical Implants

Award date: May 1, 2009 – September 30, 2009

The objective of this research is to develop high-rate data links (>20 Mb/s) for implanted biomedical devices that can operate in the presence of narrowband interference from an inductive power link. The approach is to employ ultra-wideband signaling with transmitted reference synchronization to realize low-power, high-rate data transfer over the short distances required by biomedical implants. A comprehensive approach to system design is employed, with substantial effort focused on the design and modeling of the antenna and channel so that their effects can be accounted for in the circuit design.

The novel ultra-wideband transceiver architectures being explored in this work will bring about an order of magnitude increase in data rates for biomedical implants, as compared to the narrowband transceivers that are currently prevalent. This research will advance the state of the art in low-power, short-range wireless communications, and is expected to prove beneficial for a range of applications beyond implantable devices.

The high-rate data links being explored in this research have the potential to be of tremendous benefit to society, by enabling biomedical devices that can improve the quality of life for individuals (e.g., visual prostheses, neural control systems for prosthetic limbs) and by enabling extensive, long-term neural recordings that will further our understanding of the brain's physiology. The educational initiatives integrated with this research target every stage in the development of the young engineers who will solve tomorrow's technology challenges, from high school outreach initiatives, to undergraduate research involvement, to graduate course curriculum development.

Optoelectronic Sensing with Single Organic Nanowires

Award date: August 15, 2008 – April 30, 2012

The Analytical and Surface Chemistry (ASC) program of the Division of Chemistry will support the CAREER development plan of Prof. Ling Zang of the Department of Chemistry and Biochemistry at Southern Illinois University. Prof. Zang and his students will synthesize organic nanowires and characterize their optoelectronic properties using near-field scanning optical microscopy (NSOM) and conductivity measurements. The study will result in an increased understanding of the behavior of adsorbates on organic nanowires and of the electronic response of organic nanowires to the adsorption of molecules on their surfaces. The study will facilitate the development of improved nanowire-based sensors that can find use in many applications, including environmental analysis, bioanalytical measurements, and defense and national security applications. The project will provide excellent educational training opportunities to students in the area of nanomaterials synthesis and will address growing workforce needs. Particular emphasis will be placed on recruiting high school students to this new field.

HCC: Haptic Guidance Systems

Award date: July 1, 2008 – June 30, 2013

The PI’s goal in this project is to advance the state-of-the-art of haptics research, which has to date centered primarily on the use of point-based force-feedback devices, by exploring and comparing two novel approaches to providing haptic guidance for path following and fine motor tasks. These two approaches are: (1) using tactile shear guidance to provide directional information through the grip of a stylus; and (2) augmenting a traditional stylus-based haptic interface with an active handrest. In the first approach (providing directional information through tactile shear feedback), the PI will investigate using a specialized stylus interface with shear devices embedded in its grip. These devices will transmit shear feedback to the user’s thumb and index finger. The second approach (using an active handrest) for executing path following and fine fingertip motions was inspired by the way artists use a baton-like handrest to support fine hand motions during detailed painting. The active handrest will be explored as a supplement or substitute for traditional force feedback and other haptic guidance techniques, such as virtual fixtures. Through modeling and human subjects testing, the PI will investigate two modes of supporting the user’s wrist and/or forearm while gripping a traditional stylus haptic interface. One mode will have the handrest impart forces or motions to the user’s wrist/forearm, providing corrective task intervention. The second control mode will infer the user’s optimal handrest position and preemptively move itself to provide continued support based on measured reaction forces. The PI will evaluate and compare the impact tactile shear guidance and the active handrest have on task performance (e.g., accuracy and execution time), versus established approaches. The research will also produce theoretical characterizations of the passive dynamics between the forearm and hand that will form the foundation for controlling active handrest systems. Algorithms for controlling the handrest under multiple modes of operation will be established.

Broader Impact: This research will lead to dramatic improvements in the realism of simulations and virtual environments of all kinds. Project outcomes will be applicable across a broad cross-section of domains including neuro- and tele-surgery, hand rehabilitation, guidance systems for the blind, and consumer applications like automotive GPS navigation systems. Imagine if, rather than having to look at your GPS navigation map or listen to its instructions, you received a shearing tactile cue from the steering wheel that told you a turn was coming up; this could significantly reduce driver cognitive load and thereby lead to improved driver safety. A major objective of the PI is to attract women and underrepresented students, especially Native Americans, into the fields of science and engineering. To this end, he will develop haptic learning modules based on his research interests, which can be presented in conjunction with established college-wide outreach activities aimed at junior high and high school students, and that can also be used by the University of Utah Robotics Group (with which the PI is affiliated) as part of its established relationship with Montana State University (which has a large Native American representation in its undergraduate programs). The PI will also develop a course in haptics with innovations such as a Haptics Concept Inventory and hands-on demos.

Storing, Querying and Re-Using Provenance of Computational Tasks

Award date: April 15, 2008 – July 31, 2011

Workflow-based systems have emerged as an alternative to the ad-hoc approaches to data exploration that are widely used in the scientific community. Workflows can capture computational tasks at various levels of detail and systematically record the provenance (history) information necessary for reproducibility, result publication, and sharing. Although the benefits of using scientific workflow systems are well known, the fact that workflows are hard to create and maintain has been a major barrier to wider adoption of the technology in the scientific domain. The goal of this project is to produce new algorithms and techniques for exploring and re-using the useful knowledge embedded in workflow specifications and in the provenance of the data they manipulate. This project addresses key limitations in existing workflow systems. First, it develops a set of usable tools that enable casual users (who do not necessarily have programming expertise) to perform exploratory tasks and solve problems through workflows. These include intuitive user interfaces to manipulate collections of workflows and to query workflows by example. Second, it builds a scalable provenance management infrastructure to support the efficient execution of these operations. The research results of this project advance the state of the art and build fundamental knowledge in storing, querying, and re-using the provenance of computational tasks. This project has the potential to impact a variety of applications where the creation and maintenance of workflows is currently a major bottleneck, including large computational science projects and portals. Furthermore, it makes workflows and workflow technology more accessible to casual users. Through interdisciplinary collaborations, this project will have an immediate impact in helping improve the scientific discovery process. The involvement of graduate and undergraduate students in the project will provide mentoring opportunities. The PI is committed to recruiting minority students. The results of this project will be disseminated as research papers and as freely available tools at the project website: http://www.cs.utah.edu/~juliana/projects/NSF-IIS-0746500

Rare Earth Oxide-based Diluted Magnetic Dielectrics

Award date: March 1, 2008 – February 28, 2013

****NON-TECHNICAL ABSTRACT****

Recent advances in semiconductor technology have facilitated the realization of a host of new electronic devices with ever-decreasing dimensions. Handheld pocket computers (palm-tops), ultra-thin cell phones with internet access, iPods, iPhones, and micro-cameras are a few examples that exploit these technological advances. However, as typical component dimensions approach the nanometer scale, further miniaturization becomes increasingly difficult. It is believed that any further improvement in device functionality will require a transition from conventional electronics to an altogether new regime known as "spintronics." While electronic devices utilize only the charge of electrons, spintronic devices exploit both the charge and the spin (a magnetic attribute) of an electron. Because of this additional attribute, spintronic devices are expected to be faster, smaller, and consume less power than conventional charge-based electronic devices. However, spintronic devices cannot be fabricated simply by using conventional semiconductors. The practical realization of spintronic devices relies heavily on the development of two new classes of materials, namely Dilute Magnetic Semiconductors (DMS) and Dilute Magnetic Dielectrics (DMD). These materials make it possible to utilize the electron's spin in addition to its charge. Though a significant amount of work has been performed on DMS materials, very little has been done on DMD materials. This CAREER project will focus on discovering new families of DMD materials that can potentially lead to innovation in spintronics. The educational program will develop numerous opportunities for graduate, undergraduate, and K-12 students and teachers. A summer program will give K-12 teachers more reasons to teach science with contagious enthusiasm in the classroom. The proposed work on introducing science and engineering to minority students will have a meaningful societal impact.

**** TECHNICAL ABSTRACT****

The integrated research and education goal of this Faculty Early Career Development (CAREER) project at the University of Utah is to discover new families of Dilute Magnetic Dielectrics (DMD) that will lead to innovation in spintronics and to communicate materials science and engineering to a wider audience through science exhibits, lab-integrated courses, and hands-on activities. The most critical step in the functioning of a spintronic device is the injection of spin-polarized carriers at the ferromagnet-semiconductor interface. Recent studies have shown that dilute doping of semiconductors or dielectrics with magnetic atoms can provide an enabling breakthrough in achieving high spin-injection efficiency. This has led to an extensive effort exploring the possibility of inducing room-temperature ferromagnetism in several systems. Most of the work in this field is still focused on dilute magnetic semiconductors; little work has been performed on DMDs. This project will start an extensive research program to explore the possibility of inducing room-temperature ferromagnetism in rare earth oxide-based high-k dielectrics by dilute doping of transition metal elements. The educational component of this project will disseminate the fundamentals of materials science and engineering to a wider audience. The following specific tasks will be performed: (i) developing interactive materials science exhibits for the Utah Science Center Museum, (ii) initiating a summer research program for K-12 teachers and students, and (iii) creating a collaborative and interdisciplinary environment for undergraduate and graduate research.

RF-Sensing Networks for Radio Tomographic Environmental Imaging

Award date: February 1, 2008 – January 31, 2013

Intellectual Merit: This research focuses on the development of new technologies to "see" through walls into buildings, showing interior structures and the motion of people within the structure. Rather than relying on a single self-contained short-range radar, this method uses a large-scale network of low-cost sensors as multi-static radio frequency (RF) radars whose pair-wise and spectral measurements can be used to image the environment. This research lies at the intersection of statistical signal processing and radio propagation and addresses the necessary key advances related to dense networks of RF sensors and accurate statistical channel models. The proposed research (1) uses extensive measurements to develop valid statistical channel models that depend on the attenuation field, (2) develops and tests estimation algorithms for radio tomographic imaging, and (3) analyzes their estimation performance.
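
A minimal sketch of the imaging formulation (illustrative only: the weight matrix, noise level, and regularization below are assumed stand-ins, not the measurement-derived channel models this project develops) treats each pair-wise link measurement as a weighted sum of voxel attenuations and recovers the attenuation field by regularized least squares:

    import numpy as np

    # Toy radio tomographic imaging: y = W @ x + noise, where x is the attenuation
    # in each voxel and each row of W weights the voxels crossed by one link.
    # W, the true field, and the regularization weight are all assumed values.
    rng = np.random.default_rng(0)
    n_links, n_voxels = 120, 64

    W = rng.random((n_links, n_voxels))
    W *= rng.random((n_links, n_voxels)) < 0.2      # each link "sees" few voxels
    x_true = np.zeros(n_voxels)
    x_true[27] = 1.0                                # one attenuating object
    y = W @ x_true + 0.01 * rng.standard_normal(n_links)

    lam = 0.1                                       # Tikhonov regularization
    x_hat = np.linalg.solve(W.T @ W + lam * np.eye(n_voxels), W.T @ y)
    print(int(np.argmax(x_hat)))                    # should recover voxel 27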

Broader Impact: If successful in leading to new tomographic environmental imaging systems, the proposed research has the potential to significantly benefit firefighters, other first responders, and building occupants in emergency situations. In addition, the research has the potential to benefit other types of communication networks by advancing cooperative spectrum sensing in dynamic spectrum access radio networks and improving channel simulation in multi-hop networks. The project is a crucial part of the principal investigator's goal of integrating research and education in signal processing and wireless networks. This project will lead to a new wireless communication system laboratory, a key part of a departmental curricular initiative to provide students with integrative lab experiences. Further, the project will develop and disseminate new interactive modules to be used with students in grades 10 through 12, in particular in programs targeted toward students from under-represented groups, both part of department goals to increase the diversity and total enrollment of students in electrical and computer engineering. Undergraduate student research will also be integrated into this project.

Exploring Heterogeneity Within Chip Multiprocessors

Award date: May 1, 2006 – April 30, 2011

Future microprocessor chips will contain numerous computational cores, large cache hierarchies, and complex on-chip networks between cores and cache banks. For an application to exploit a processor's peak throughput, it will necessarily have to be decomposed into many threads. As part of this project, the curriculum at Utah will be revised so that graduating students have the skills to write efficient multi-threaded programs that can harness the compute power of future processors. When multi-threaded applications execute on a chip, different threads and data transfers make varied demands on the hardware in terms of speed, bandwidth, power, reliability, etc. By optimizing specific cores and networks on the chip for different metrics, the hardware can meet the diverse needs of software. A processor that packs in heterogeneous functionalities and device characteristics will likely allow processor throughput to continue its steady rise while not compromising reliability or power efficiency. This project explores the effect of optimizing on-chip networks for speed, bandwidth, or power. It also explores the effect of customizing cores to execute the operating system, redundant threads, or speculative threads.

Efficient Utilization of Multiple Antennas for High-rate Communications in Wireless Networks

Award date: May 1, 2006 – April 30, 2012

In recent years, the use of multiple antennas at both transmitter and receiver ends of a communication link has been identified and widely studied as the most practical method of increasing channel capacity. For the next generation cellular and wireless local area networks, multiple-input multiple-output (MIMO) technology that employs multiple antennas is envisioned to be the core technology to achieve higher data rates.

While MIMO technology promises significant information-theoretic capacity gains for wireless links, there are still many unknowns as to how to efficiently realize such gains in practical communication systems and networks. A number of key issues from the physical layer to the network layer need to be addressed. This proposal takes a cross-layer approach to address these issues and facilitate efficient utilization of multiple antennas in wireless networks. MIMO detection is the most fundamental issue in MIMO networking, and it is the complexity bottleneck that limits the deployment of MIMO technology. A main objective of this proposal is to develop low-complexity MIMO detectors that scale well with antenna number and modulation size so that they are applicable to practical network settings. We propose a novel MIMO detector based on the Markov chain Monte Carlo (MCMC) approach, which shows performance superior to other existing MIMO detectors at a complexity that is more than one order of magnitude lower. Performance analysis of the MCMC detector and its impact on code design will be investigated. One primary focus is the joint optimization of channel codes with MIMO detection for large antenna systems. We plan to develop joint coding design and detection strategies to find capacity-approaching channel codes at high spectral efficiencies. Code design criteria will be derived for short channel codes suitable for delay-sensitive applications.
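
A heavily simplified sketch of an MCMC-style detector for y = Hx + n with binary (BPSK) symbols is given below, using a bitwise Gibbs sampler that keeps the best candidate visited; the channel size, noise level, and iteration counts are assumed values, and the detector and analysis proposed here go well beyond this toy:

    import numpy as np

    # Toy Gibbs-sampling detector for y = H x + n with BPSK symbols x_i in {-1,+1}.
    # Dimensions, SNR, and iteration counts are assumed illustrative values.
    rng = np.random.default_rng(1)
    n_tx = n_rx = 8
    sigma = 0.3

    H = rng.standard_normal((n_rx, n_tx)) / np.sqrt(n_tx)
    x_true = rng.choice([-1.0, 1.0], size=n_tx)
    y = H @ x_true + sigma * rng.standard_normal(n_rx)

    x = rng.choice([-1.0, 1.0], size=n_tx)              # random starting point
    best, best_cost = x.copy(), np.linalg.norm(y - H @ x) ** 2

    for _ in range(200):                                 # Gibbs sweeps
        for i in range(n_tx):
            costs = []
            for s in (-1.0, 1.0):                        # cost for each bit value
                x[i] = s
                costs.append(np.linalg.norm(y - H @ x) ** 2)
            delta = (costs[1] - costs[0]) / (2 * sigma ** 2)
            p_plus = 1.0 / (1.0 + np.exp(np.clip(delta, -50, 50)))
            x[i] = 1.0 if rng.random() < p_plus else -1.0
            cost = costs[1] if x[i] > 0 else costs[0]
            if cost < best_cost:                         # remember best candidate
                best, best_cost = x.copy(), cost

    print(int(np.sum(best != x_true)), "bit errors vs. transmitted vector")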

From the network layer this proposal addresses the issues of multiple access and resource allocation for MIMO wireless networks. A central issue in these designs lies in the amount of channel state information (CSI) available at the transmitter. We propose to study practical power control and scheduling algorithms that are robust to channel variations and have the capability of supporting limited CSI. We will address fairness and quality of service for users with heterogeneous channel conditions. Optimal signaling design for practical MIMO detectors will be investigated in order to maximize network throughput and minimize multi-user interference. These are closely tied with our study on the physical layer issues of MIMO detection and coding.

Broader Impact:

The educational plan of this project offers diverse opportunities to students at all levels. The proposed research will generate a cluster of undergraduate research projects, which in particular will focus on developing a multiple-antenna test bed for wireless local area networks. The PI plans to encourage students from under-represented groups to participate in such projects. The proposed research may generate industrial interest that can result in undergraduate industry sponsored projects. The proposed research will also attract graduate students to explore modern communication theory and encourage their future careers in this exciting field. The PI plans to develop a new course on software-defined radio (SDR) for wireless communications, and a more advanced course on cross-layer design for wireless networks at the graduate level.

Intellectual merit:

The intellectual merit of this proposal lies in the development of new techniques and theories in MIMO networking. The broader impact is in the interdisciplinary dimensions of this research, as well as in the educational program and the exposure of students at all levels to the proposed areas. This plan offers strong integration of the research with education and industry. The impact of this research is expected to extend to a variety of fields, including coding and information theory, signal detection and estimation, algorithm design and complexity, network protocol design, and more.

Exploring Symbolic Algebra for RTL Verification of Arithmetic Datapaths

Award date: February 1, 2006 – January 31, 2012

Digital designs that implement polynomial arithmetic computations are found in many practical applications, such as Digital Signal Processing (DSP) for audio, video, and multimedia applications. The growing market for such designs requires sophisticated CAD support for analysis and verification. Contemporary verification technology, mostly geared toward control-dominated applications, is unable to efficiently model and validate designs with large arithmetic datapath components. Such designs, described at the register-transfer level (RTL), perform polynomial computations over bit-vector variables that have pre-determined word lengths. Conventional Boolean models do not scale well with respect to increasing word lengths. To overcome this knowledge and technology gap, this research explores an altogether new paradigm for RTL datapath verification by incorporating symbolic computer algebra within a CAD-based verification methodology.

A bit-vector of size m represents integer values reduced modulo 2^m. Therefore, bit-vector arithmetic can be modeled as algebra over finite rings, where the bit-vector size dictates the cardinality of the ring. The verification problem then reduces to that of proving polynomial equivalence over finite rings of residue classes Z_{2^m}. In this project, the investigator: (1) models RTL datapaths as polynomial functions over finite integer rings of the type Z_{2^m}; (2) studies the properties of this class of rings for polynomial equivalence using number theory and ideal theory; (3) derives algorithmic solutions to RTL datapath verification using symbolic and algebraic manipulation; (4) investigates the impact of polynomial manipulation over Z_{2^m} on RTL datapath synthesis; and (5) investigates how to model arithmetic with imprecision (e.g., error rounding and saturation arithmetic) as polynomial functions. The intellectual merit of this research lies in its mathematical challenge and in its engineering application to digital design verification. Successful completion of this project would broadly impact datapath verification theory and practice and would also enhance the understanding of some classical mathematical problems. Both graduate and undergraduate students will be involved in this research. The results will be disseminated not only to the digital design and CAD community, but also to the symbolic algebra community.
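
A tiny concrete instance of the equivalence problem (the polynomial and word length are illustrative choices only): over Z_{2^3}, the expression 4x(x+1) computes the zero function even though it is not the zero polynomial over the integers, because x(x+1) is always even:

    # Brute-force check of polynomial equivalence over the finite ring Z_{2^m}.
    # The specific polynomial and word length are illustrative choices only.
    m = 3                       # bit-vector width
    mod = 1 << m                # 2^m = 8

    f = lambda x: (4 * x * (x + 1)) % mod    # a nonzero polynomial over Z
    g = lambda x: 0                          # the zero function

    print(all(f(x) == g(x) for x in range(mod)))   # True: f vanishes on all of Z_8

Deciding such vanishing and equivalence symbolically, without enumerating all 2^m values, is precisely where the number-theoretic and ideal-theoretic machinery enters.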

Vertically Integrated Program Analysis for Embedded Software

Award date: May 1, 2005 – April 30, 2010

In recent years, a great deal of progress has been made toward tool support for developing embedded software. Tools solve a variety of difficult problems, for example by automating error-prone implementation tasks, by eliminating redundant and inefficient constructs, and by guaranteeing the absence of certain classes of errors, such as race conditions or out-of-memory exceptions. This NSF CAREER research is about Vertically Integrated Program Analysis and Transformation (VIPAT), a new way to look at embedded software tools: as a collection of building blocks that can be connected together in different ways to support novel analyses and transformations. The existing tools become mechanisms that are controlled by a high-level policy. VIPAT is based on two main ideas. The first is the vertical integration of tools that operate at various levels of abstraction, which permits high-level transformations to be precisely targeted at the parts of a system whose low-level resource usage is worst. The second is a clean separation between mechanism and policy, enabling effective reuse of existing tools in new situations. This research is a step toward a world where meaningful static guarantees about program behavior can be made, and where software can be automatically specialized to meet platform- and application-specific requirements such as time and energy constraints. The high-level vision is "fearless reuse": developers should spend less time worrying about the resource usage and potential failure modes of the components that they reuse.

Quantifying and Controlling Error and Uncertainty in Computational Inverse Problems

Award date: January 1, 2004 – December 31, 2009

Heart disorders affect many Americans each year. Although techniques such as electrocardiography allow physicians to postulate as to the probable cause of patient discomfort, cardiac source localization cannot currently provide the physician with the precise location within the heart of a bioelectric abnormality, nor can current techniques provide the physician with confidence measures based upon the numerical (discretization) errors, modeling errors, and variability/uncertainty errors that exist in the inverse problem. This research involves the development of methods for quantifying and controlling error and uncertainty in computational inverse problems. The specific driving application is to make the computational source localization procedure a viable tool for diagnosing cardiac bioelectric field problems. This research is valuable both for its multi-disciplinary influence and for its expansion of computational science and engineering (CS&E) techniques beyond the original applications for which they were designed. The academic merit of this research is its fundamental contribution to the solution of computational inverse problems and its practical contribution to the bioengineering problem of cardiac source localization. In the true spirit of CS&E, this research is the synergy of a domain-specific task and computational science tools. The broader impact of this research results from its extensibility to a much larger class of computational inverse problems. The educational objectives of this proposal are focused on training young scientists to properly view simulation science as a tool in the validation and extension of scientific inquiry.

Specifically, this research is partitioned into two aims: (1) to quantify and minimize the effects of numerical modeling errors in the ECG source localization computation through the judicious use of high-order methods and the discontinuous Galerkin method; and (2) to quantify and minimize the effects of uncertainty and variability in the source localization process through the exploration of the polynomial chaos methodology for uncertainty quantification.

Advances in Universal Data Compression with Applications to Joint Source and Channel Coding

Award date: December 15, 2003 – November 30, 2010

The purpose of this research is to develop several unexplored areas in data compression, as well as to utilize universal data compression techniques in other applications, including biological modeling and a novel direction in joint source-channel coding. The research focuses on four topics: (a) design of joint source-channel codes based on universal source codes, (b) study of universal compression for large and unknown source alphabets, (c) design of advanced universal coding techniques for non-traditional, yet more realistic, data models with practical implementations, and (d) study of random-access lossless compression. Common techniques in joint source-channel coding suffer from severe synchronization problems in bad channel conditions and do not address universality issues when the source statistics are unknown. This research develops techniques to combat these problems, and even to attain "free" gains in channel decoding performance for redundant channel information streams. Common compression schemes assume that the data come from a known alphabet, have a "standard" stationary or constantly changing statistical model, and consist of a long sequence. However, (a) there exist compression applications with large unknown alphabets, such as text compression where the words constitute the alphabet, (b) most real data sequences are neither stationary nor of constantly varying statistics, and (c) random access is necessary in large compressed databases. The investigator studies these three non-traditional problems. The research work combines the development of rigorous theoretical results, including redundancy and description length bounds, with empirical testing, algorithm design focused on practical low-complexity techniques, and implementation of the proposed techniques. Finally, the research also investigates the use of universal compression techniques for the segmentation and modeling of biological sequences.
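
For context, a standard textbook building block of universal coding (not one of the new techniques proposed here) is the Krichevsky-Trofimov estimator, which assigns probabilities to a binary sequence sequentially without knowing the source bias; its ideal code length approaches the sequence's empirical entropy plus a small redundancy:

    import math

    # Krichevsky-Trofimov sequential probability assignment for a binary source.
    # A classic universal-coding building block, shown only for illustration.
    def kt_code_length(bits):
        zeros = ones = 0
        total = 0.0
        for b in bits:
            p_one = (ones + 0.5) / (zeros + ones + 1.0)   # KT estimate of P(next=1)
            p = p_one if b == 1 else 1.0 - p_one
            total += -math.log2(p)                        # ideal code length, bits
            ones += b
            zeros += 1 - b
        return total

    seq = [1, 1, 0, 1, 1, 1, 0, 1] * 100          # an 800-bit biased sequence
    print(round(kt_code_length(seq), 1), "bits for", len(seq), "input bits")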

In Vivo Quantification of Tissue Deformation and Growth from Medical Image Data

Award date: August 1, 2002 – November 30, 2007

Under this CAREER Award, methods from theoretical and computational mechanics will be combined with pattern theory to directly incorporate medical image data into the analysis of deforming biological tissues. Results will provide new finite-element-based tools that use image data to track three-dimensional kinematics and nonlinear strain in deforming tissues and to register anatomical structures appearing in image data sets. Two applications will be targeted: in vivo measurement of strains in the beating human heart and in situ measurement of strains in ligaments. The techniques and resulting software, which will be made available to the public, are expected to be applicable in numerous other areas, including geophysics, manufacturing, engineering design, biology, and computational medicine.

The educational component focuses on three areas: bioengineering outreach to Utah high school students through a summer institute; developing an integrated biomechanics curriculum built around three core courses, and training undergraduate and graduate students in conjunction with research objectives.

Low-Power VLSI Circuits for Large-Scale Neuronal Recording

Award date: March 1, 2002 – February 29, 2008

There is a great demand for technologies that enable neuroscientists and clinicians to observe the simultaneous activity of large numbers of neurons in the brain. The monitoring of these groups or “neural ensembles” allows researchers to begin to understand the cooperative mechanisms used by neurons to encode and process information. Recent advances in MEMS technology have produced small arrays of microelectrodes containing as many as 100 recording sites. “Next generation” neural recording systems must be capable of observing 100-1000 neurons simultaneously, in a fully-implanted unit.

While integrated electronics have been developed for small-scale amplification of the weak extracellular neural signals (<100 electrodes), existing circuits have high levels of noise and consume too much power to be fully implanted in larger quantities. We propose to develop low-power, low-noise analog and mixed-signal VLSI systems allowing fully implantable recording of 100-1000 neurons.

A fully implanted multichannel neural recording system must use an RF or inductive-link transmitter for transcutaneous telemetry. We will investigate techniques for on-chip data reduction (e.g., spike thresholding, feature detection) to assist in spike sorting and reduce the required bandwidth (and hence power) of such a transmitter.
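
The bandwidth argument for on-chip spike thresholding can be illustrated with a toy calculation (the signal model, threshold rule, spike rate, and snippet length below are all assumed values): keeping only short snippets around threshold crossings reduces the transmitted data by orders of magnitude relative to streaming raw samples:

    import numpy as np

    # Toy illustration of on-chip data reduction by spike thresholding.
    # Sampling rate, spike rate, and snippet length are assumed values.
    rng = np.random.default_rng(2)
    fs, duration = 30_000, 1.0                        # 30 kS/s per channel, 1 second
    x = rng.standard_normal(int(fs * duration))       # background noise
    for t in rng.integers(0, x.size - 30, size=20):   # roughly 20 spikes/s
        x[t:t + 30] += 6.0 * np.hanning(30)           # crude spike waveform

    thresh = 4.0 * np.median(np.abs(x)) / 0.6745      # robust noise-based threshold
    crossings = np.flatnonzero((x[1:] > thresh) & (x[:-1] <= thresh))
    snippet = 48                                      # samples kept per event
    kept = len(crossings) * snippet
    print(f"{kept} of {x.size} samples kept (~{x.size / max(kept, 1):.0f}x reduction)")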

The educational component of the proposed work involves the improvement of the VLSI curriculum in the PI’s department. This improvement will consist of three main thrusts: (1) Development of a laboratory component of a course in analog integrated circuit design taught by the PI. The construction of “class chips” will allow students to measure VLSI circuits in modern submicron fabrication technologies. (2) Development of a new advanced analog VLSI course. (3) Enlisting industrial partners to evaluate our VLSI curriculum. In addition to this curriculum development, the PI will also mentor a graduate student who will perform research related to the proposal.

Integrated-Optic Nanoparticle Biosensor Arrays

Award date: February 15, 2002 – January 31, 2008

Research world-wide on biosensing is motivated by numerous applications in environmental and food testing and clinical diagnostics, for example. However, the important problem of detecting in parallel a large number of molecular species from the very small samples typical of most collection procedures remains an elusive goal. This CAREER research plan focuses on solving this problem by merging the science of nanophotonics with waveguide biosensors and microfluidics for the development of a new class of molecular detection array.

The immobilization of metallic nanoparticles onto discrete zones of an optical waveguide surface makes the parallel detection of a large number of molecular species feasible. In each zone, capture molecules tethered to the nanoparticles preferentially bind to a particular molecular species through an affinity interaction. Strong localization of light about each nanoparticle allows for dramatic improvement in optical signal transduction, thereby facilitating the detection of small numbers of molecules bound within each zone.

Microfluidics will be used to deliver small sample volumes to each sensing zone and passive mixing structures will be studied in order to increase the molecular binding probability within each zone.

The education plan focuses on the creation of a summer optics workshop for secondary school physics and science teachers. As more demands are placed on teachers, and as technology continues to advance at a rapid pace, teachers need a way in which to further their knowledge of science and hands-on teaching methods. Detailed lesson plans and laboratory exercises will be developed for deployment in the classroom, with the goal of improving student understanding of and instruction in optics and the sciences, and encouraging students to pursue careers in engineering and science. Participation of teachers from Hispanic and Native American schools will be strongly encouraged.

A Statistical Framework For Reconstructing 3D Manifolds From Range Data

Award date: October 1, 2000 – September 30, 2005

This project addresses the question of how to automatically generate 3D computer models of objects and scenes using data from a range-finding device, such as a laser range scanner, sonar, ultrasound, or radar. Such 3D computer models are important in a wide range of applications, including defense surveillance, forensics, teaching, and medicine. Range-measuring devices typically sweep a beam of energy to gather many millions of 3D measurements from the surfaces of objects, but they have some limitations. First, because not all objects are visible from a single point of view, a single sweep is incomplete. Second, each individual range measurement is not necessarily accurate, because the measurement process is inherently noisy. The strategy is to systematically fuse together many measurements from different points of view in order to create accurate, complete 3D models. This project examines some of the fundamental mathematical questions pertaining to this process and then studies how to implement and demonstrate this theory on real data. Range-finding devices measure distances to objects by reflecting energy off of the interfaces between different types of materials, but they provide a noisy, mathematically complex, and highly nonlinear transformation from a collection of surfaces to a set of 2D depth maps. This project will develop statistical methods for estimating manifolds from this kind of data, thereby generalizing the current state of the art in estimation theory, which is primarily concerned with estimating functions or fields. Thus, the goal is to provide a general, complete, and practical foundation for 3D surface reconstruction. The strategy is to find the surface that maximizes the posterior probability conditioned on a collection of range measurements taken from different points of view. The reconstruction framework is Bayesian; it includes a sensor model as well as prior knowledge about the characteristics of the objects or scenes being modeled. This work will address a number of important issues pertaining to this statistical methodology for building 3D models, including better sensor models, high-order priors, fast and robust algorithms, and broader applications. These developments will comprise a fundamental scientific result: the generalization of the basic principles of estimation theory to the challenging and timely problem of 3D surface reconstruction.
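
The estimation principle can be illustrated in one dimension (a loose toy analogue only; the project's unknowns are surfaces rather than 1D signals, and its sensor and prior models are far richer): maximizing the posterior amounts to minimizing a data-fidelity term from the sensor model plus a smoothness penalty from the prior:

    import numpy as np

    # Toy 1D MAP estimate: noisy measurements y of a signal x, Gaussian sensor
    # model plus a smoothness prior, minimized by gradient descent. A loose
    # analogue of surface reconstruction; all model choices here are assumed.
    rng = np.random.default_rng(3)
    n = 200
    x_true = np.sin(np.linspace(0, 3 * np.pi, n))
    y = x_true + 0.3 * rng.standard_normal(n)           # noisy "range" data

    lam, step = 5.0, 0.05
    x = y.copy()
    for _ in range(500):
        smooth_grad = np.zeros(n)
        smooth_grad[1:-1] = 2 * x[1:-1] - x[:-2] - x[2:]   # discrete Laplacian prior
        grad = (x - y) + lam * smooth_grad                  # negative log-posterior gradient
        x -= step * grad

    print(f"RMSE noisy: {np.sqrt(np.mean((y - x_true)**2)):.3f}, "
          f"MAP: {np.sqrt(np.mean((x - x_true)**2)):.3f}")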

Online Estimation of MWD of Polymer Melts Using Broad-Band Dielectric Measurements: Sensor Development and Control

Award date: April 1, 1999 – March 31, 2004

The focus of the research program is the development of sensing technology and model-inversion-based identification methods to infer, online, the molecular weight distribution (MWD) of polymer melts from broad-band dielectric measurements. The ultimate goal of the research is to develop new polymer processing technology for online customization of the properties of the final product by varying the processing conditions during extrusion and molding. Inverted fringe-effect microdielectric sensing with controlled field-penetration depth is a novel approach for measuring spatially resolved properties of a material in the direction normal to the interface. In the proposed approach, methods for inferring the MWD online from dielectric measurements will be based on theoretical modeling of polymer relaxation in electromagnetic fields and on statistical correlation methods. Theoretical and experimental studies will be followed by the development of advanced process control systems, which will enable online polymer customization by varying conditions during processing. The educational program is aimed at creating a balanced undergraduate process control curriculum that includes project-based, hands-on experience in implementing practical control systems in laboratory settings.
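
As a hedged illustration of model-inversion identification, and not the dielectric relaxation model developed in the project, the sketch below assumes a simple linear forward model that maps discretized MWD bins to a measured spectrum, then recovers the distribution with Tikhonov-regularized least squares.

```python
# Hedged sketch of model-inversion identification. The real forward model of
# polymer relaxation in electromagnetic fields is nonlinear and far richer;
# here an assumed linear response matrix K stands in for it.
import numpy as np

rng = np.random.default_rng(1)

n_bins = 30        # molecular-weight bins (the MWD is discretized)
n_freqs = 60       # dielectric measurement frequencies

# Assumed forward model K: each MW bin contributes a broad response across
# frequencies (a stand-in for a relaxation-spectrum kernel).
freqs = np.linspace(0, 1, n_freqs)[:, None]
centers = np.linspace(0, 1, n_bins)[None, :]
K = np.exp(-((freqs - centers) ** 2) / 0.02)

# Synthetic "true" MWD and the noisy spectrum it would produce.
mw_axis = np.linspace(0, 1, n_bins)
true_mwd = np.exp(-((mw_axis - 0.4) ** 2) / 0.02)
true_mwd /= true_mwd.sum()
spectrum = K @ true_mwd + 0.01 * rng.normal(size=n_freqs)

# Tikhonov-regularized inversion: minimize ||K w - spectrum||^2 + alpha * ||w||^2.
alpha = 1e-3
w = np.linalg.solve(K.T @ K + alpha * np.eye(n_bins), K.T @ spectrum)
w = np.clip(w, 0, None)        # a distribution cannot be negative
w /= w.sum()                   # renormalize

print("max abs error in recovered MWD:", np.abs(w - true_mwd).max())
```

The regularization term is what makes the ill-posed inversion stable against measurement noise; choosing it (and the forward model) well is exactly where the project's theoretical modeling and statistical correlation work would enter.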

Building Conceptual Natural Language Processing Systems for Practical Applications

Award date: May 15, 1997 – April 30, 2003

This research aims to develop a conceptual natural language processing (NLP) system with adaptable components that can be easily tailored for different domains and applications. The architecture of the system consists of fine-grained layers to support various depths of text processing. The shallow layers support syntactic processing, which may be sufficient for some information retrieval tasks, while the deeper layers support semantic and conceptual processing for in-depth language understanding. The system includes components for part-of-speech tagging, prepositional phrase attachment, semantic feature identification, and concept extraction. Each component can be tailored for new domains with minimal manual effort. The layered architecture also allows students to develop individual components and plug them into the larger system for experimentation. The education goals are to use the system as the basis for a hands-on science workshop for young girls, for summer lectures to high school students, for class projects in natural language processing and machine learning, and for graduate and undergraduate research projects. The purpose of the research is to develop techniques for building conceptual natural language understanding systems automatically or semi-automatically for new domains. Generating conceptual sentence analyzers quickly and efficiently is an important step toward many practical applications, including conceptual information retrieval, text categorization, and information extraction.
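
The pluggable, layered design can be sketched in a few lines of Python. The components below (a tokenizer, a toy part-of-speech tagger, and a toy concept extractor) are hypothetical stand-ins rather than the project's actual modules; the point is only that each layer consumes and returns a shared document object and can be swapped independently.

```python
# Minimal sketch (not the project's system) of a layered NLP pipeline in which
# each component can be replaced independently. The tagger and extractor are
# deliberately naive stand-ins for real components.
from dataclasses import dataclass, field


@dataclass
class Document:
    text: str
    tokens: list = field(default_factory=list)
    pos_tags: list = field(default_factory=list)
    concepts: list = field(default_factory=list)


def tokenizer(doc: Document) -> Document:
    doc.tokens = doc.text.split()
    return doc


def toy_pos_tagger(doc: Document) -> Document:
    # Shallow layer: crude part-of-speech guesses, enough for some retrieval tasks.
    doc.pos_tags = ["NOUN" if t[0].isupper() else "OTHER" for t in doc.tokens]
    return doc


def toy_concept_extractor(doc: Document) -> Document:
    # Deeper layer: extract "concepts" (here, just capitalized tokens) for
    # downstream tasks such as information extraction.
    doc.concepts = [t for t, tag in zip(doc.tokens, doc.pos_tags) if tag == "NOUN"]
    return doc


class Pipeline:
    """Runs a sequence of components; any layer can be swapped for another
    implementation that accepts and returns a Document."""

    def __init__(self, components):
        self.components = list(components)

    def run(self, text: str) -> Document:
        doc = Document(text=text)
        for component in self.components:
            doc = component(doc)
        return doc


if __name__ == "__main__":
    nlp = Pipeline([tokenizer, toy_pos_tagger, toy_concept_extractor])
    print(nlp.run("students attend the Science Workshop in Utah").concepts)
```

A student replacing the toy tagger with a statistical one, or adding a prepositional-phrase-attachment layer between tagging and concept extraction, would only need to supply another function with the same Document-in, Document-out signature.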

A Biomimetic Active Separation Device Based on the Microtubule Motor Protein, Kinesin

Award date: July 15, 1996 – June 30, 2001

The project focuses on the fabrication of a microanalytical separation device that incorporates the microtubule motor protein kinesin into nanofabricated machines. A prototype active chromatography device will be built to recognize, separate, and detect specific DNA fragments on a single micromachined chip. A fluorescently labeled DNA fragment in solution will attach selectively to the motor protein and move along aligned microtubules in a microchannel to a detector. The project has strong potential impact in bioseparations and chemical process monitoring. If successful, it may be possible to mimic intracellular transport in fabricated machines. The educational plan includes the development of a new lecture course and two laboratory courses on protein engineering and microfabrication. Students will be trained to apply biological principles to conventional engineering problems.
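
For a rough sense of scale, the back-of-the-envelope calculation below estimates how long motor-driven transport to the detector might take. The channel length and transport speed are illustrative assumptions rather than project measurements; kinesin speeds of roughly 0.5–1 µm/s are typical in motility assays.

```python
# Back-of-the-envelope sketch (assumed values, not project data): time for a
# kinesin-propelled DNA fragment to traverse a microchannel to the detector.

channel_length_um = 500.0       # assumed microchannel length (micrometers)
kinesin_speed_um_per_s = 0.8    # assumed transport speed (micrometers/second)

transit_time_s = channel_length_um / kinesin_speed_um_per_s
print(f"Estimated transit time: {transit_time_s:.0f} s "
      f"(~{transit_time_s / 60:.1f} min)")
```

Under these assumptions a fragment would take on the order of ten minutes to reach the detector, which suggests why channel geometry and microtubule alignment matter for a practical device.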