2005 — 2010
Popovic, Zoya (co-PI); Meyer, Francois
Submillimeter-Wave Imaging @ University of Colorado At Boulder
ABSTRACT
This integrative systems proposal focuses on the optimized design of focal-plane arrays (FPAs) in the 100-300 GHz range, with an integrated design methodology for the hardware and the image processing. Intellectual merit of the proposal: in many applications of millimeter- and sub-millimeter-wave imaging it is critical to acquire high-resolution images in real time and to remove noise and distortions. New nonuniform sampling schemes for the focal surface, coupled with efficient image reconstruction algorithms, will be investigated. Dual-polarization antennas coupled with optimized image processing algorithms will be designed. Noise removal and deconvolution algorithms tailored to the limitations of FPAs will be studied, as will methods for designing low-loss quasi-optical components for FPAs. Antenna aperture efficiency will be improved using microlens-coupled antennas. Dual-polarized antenna arrays with low polarization coupling, and multi-band nested antenna arrays with nonuniform periodicities, will be designed, along with the associated fabrication methods.
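The deconvolution step mentioned above can be illustrated with a standard frequency-domain baseline. This is only a sketch, not the proposal's algorithm: it assumes the imaging blur is modeled as circular convolution with a known point-spread function, and the `noise_to_signal` regularization parameter is a hypothetical name introduced here.

```python
import numpy as np

def wiener_deconvolve(image, psf, noise_to_signal=0.01):
    """Frequency-domain Wiener deconvolution: a classical baseline for
    removing blur introduced by an imaging system's point-spread function."""
    H = np.fft.fft2(psf, s=image.shape)   # transfer function of the optics
    G = np.fft.fft2(image)                # blurred, noisy observation
    # Wiener filter: conj(H) / (|H|^2 + NSR) regularizes division where |H| is small
    W = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)
    return np.real(np.fft.ifft2(W * G))
```

With a clean observation and a small noise-to-signal ratio, the filter approaches the exact inverse; raising `noise_to_signal` trades sharpness for noise suppression.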
Broader impacts of the proposed work: this research will lead to fast imaging techniques for the detection of non-metallic concealed weapons, with a direct impact on public safety. The technology developed will also have applications in astronomy and medical imaging. The project will foster multidisciplinary collaborative research spanning high-frequency analog quasi-optical techniques and image processing. The proposed research will be complemented by educational activities that transfer the findings into a course. The investigators will disseminate their results in journals, through a regularly updated interactive website, and through a workshop that brings together all the areas needed for millimeter-wave imaging.
2009 — 2014
Meyer, Francois; Martinsson, Per-Gunnar (co-PI)
CDI-Type I: Geometrical Image Processing With Fast Randomized Algorithms @ University of Colorado At Boulder
The objective of this research is to develop general tools for reducing the effective size of massive datasets, and to demonstrate the capabilities of these tools by applying them to efficiently represent images and streaming video. The approach is based on a new class of randomized algorithms that have proven capable of accelerating key scientific computations while simultaneously being easier to implement on multi-core computers.
With respect to intellectual merit, this project addresses the fundamental scientific question of how to extract information from large and noisy datasets. These datasets are created by many branches of society in quantities and at a rate that make the analysis of the data extremely difficult with state-of-the-art methods. This research is based on the belief that the scale of the problem can be reduced with new algorithms that discover the underlying structure of a dataset by randomly examining chunks of data. Formally, mathematical results on random projections in high-dimensional spaces will be turned into fast algorithms that build low-dimensional representations amenable to knowledge discovery.
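As a sketch of how random projections can reduce the scale of a problem, the following randomized range finder (in the spirit of the Halko-Martinsson-Tropp framework, not the project's actual code) builds a low-dimensional basis for a large matrix by examining only random linear combinations of its columns; the `oversample` parameter is an illustrative choice.

```python
import numpy as np

def randomized_low_rank(A, k, oversample=10, rng=None):
    """Randomized range finder: compute an orthonormal basis Q such that
    A ~= Q @ (Q.T @ A), by sampling the range of A with random projections."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    Omega = rng.standard_normal((n, k + oversample))  # random test matrix
    Y = A @ Omega                                     # random samples of the range of A
    Q, _ = np.linalg.qr(Y)                            # orthonormalize the samples
    return Q
```

The cost is dominated by one pass over `A` (the product `A @ Omega`), which is what makes this family of algorithms attractive for massive datasets and easy to parallelize.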
With respect to broader impacts, this project has the potential to create a new generation of algorithms for organizing and searching huge databases of images and videos. It could pave the way for novel algorithms to manage the massive datasets created by web searches, biomedical research, cyber security, and other domains. The investigators are mentoring students from the existing "SMART" program, which attracts outstanding students from underrepresented groups into graduate school. Undergraduate students are involved via the existing "Mentoring Through Critical Transition Points" program.
2014 — 2017
Sain, Stephan; Dougherty, Anne; Anderson, Kenneth (co-PI); Meyer, Francois; Martinsson, Per-Gunnar (co-PI)
EXTREEMS-QED: Directions in Data Discovery (Data Cubed) in Undergraduate Education @ University of Colorado At Boulder
The Data Cubed project will prepare students for the challenges posed by the analysis of large datasets. As governmental, scientific, and business enterprises collect, store, and process ever more data, many technological challenges arise. The analysis of big datasets requires a collaborative effort among mathematicians, statisticians, computer scientists, and domain experts: computation (including algorithmic development), modeling (including dimensionality/complexity reduction), and visualization are all needed. The Data Cubed project will identify talented students early in their academic careers and give them appropriate mentoring and increasingly advanced statistical and computational coursework. The students will then proceed to data discovery research under the guidance of faculty members and partner scientists. The ultimate goal of the project is to increase the number of highly qualified undergraduate students able to apply their skills as they enter the scientific workforce and data analytics careers, and to share the results of the project with the broader community.
Students will learn mathematical and statistical techniques and software systems to collect, generate, store, analyze and visualize large amounts of data. In the Data Cubed project, several new courses will be created to train students in the core computational and statistical areas that underpin the analysis of large datasets; the students will be provided with significant research opportunities in the areas of geophysical modeling, analysis of unstructured social media data, and dimensional reduction techniques and modeling. One of the research projects will use large geophysical datasets from the National Center for Atmospheric Research (NCAR) and will involve modeling heat stress in urban environments and its relationship to public health. Another will examine the role of oceans as a primary reservoir of heat for our planet, which plays a significant role in the dynamics of climate change. Yet another project will combine heuristics of social media data (e.g., tweets) during times of mass emergency with a user's social graph to develop a more comprehensive picture of the situation. These and other projects all require fundamental knowledge and understanding of how to analyze large datasets.
2018 — 2021
Meyer, Francois |
CIF: Small: Learning On Graphs @ University of Colorado At Boulder
As with road networks, significant disruptions in the pattern of connections between brain regions have a profound effect on brain function. International projects, such as the Human Connectome Project and the Autism Brain Imaging Data Exchange initiative, provide open access to massive datasets that can be used to learn the association between brain connectivity and psychiatric or neurological diseases. This award responds to the current lack of analytical and computational methods to quantify changes in the organization of brain functional networks. The project proposes to design novel machine learning algorithms that will lead the way toward precision medicine in psychiatry and neurology. The award will train several graduate students to work on big-data challenges in precision medicine, and the source code implementing the algorithms will be made publicly available as open-source toolboxes.
The availability of large datasets composed of graphs creates an unprecedented need for new tools in statistical learning to study "graph-valued random variables". The first goal of this project is to develop theory and algorithms to estimate the mean and variance of a set of graphs; this basic problem is at the core of several statistical inference problems about populations of graphs. The second aim is to develop statistical methods that can detect significant structural changes (e.g., alterations of topology and connectivity) in a time series of graphs. The third aim is concerned with learning functions defined on a graph ensemble. The proposed approach relies on the ability to equip a graph ensemble with a metric, effectively turning the learning problem into the question of extending functions defined on a metric space.
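As a minimal illustration of the first goal, assume (hypothetically) that the graphs share a vertex set and that distance is measured by the Frobenius metric on adjacency matrices; for this particular metric the sample Fréchet mean has a closed form, the entrywise average. This is only a toy instance of the problem, since the project targets more general graph metrics.

```python
import numpy as np

def frechet_mean_adjacency(adjs):
    """Sample Frechet mean of graphs on a common vertex set, under the
    Frobenius metric on adjacency matrices: the minimizer of the sum of
    squared distances is the entrywise average (a weighted graph)."""
    A = np.stack(adjs)       # shape: (num_graphs, n, n)
    return A.mean(axis=0)

def frechet_variance(adjs):
    """Frechet variance: mean squared Frobenius distance to the mean graph."""
    mean = frechet_mean_adjacency(adjs)
    return np.mean([np.linalg.norm(A - mean, 'fro') ** 2 for A in adjs])
```

For metrics without a closed-form minimizer, the mean must instead be found by numerically minimizing the sum of squared distances over candidate graphs.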
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.