1997 — 1999
Schorr, Herbert; Kesselman, Carl
Workshop: Workshop On Building a Computational Grid, September 8-10, 1997, Chicago, Illinois @ University of Southern California
Developments in networking and software technologies are enabling the integration of geographically distributed compute, storage, network, and other resources into high-performance distributed computing systems or computational grids. These computational grids promise to revolutionize high-performance computing by enabling both the construction of tools providing qualitatively new capabilities, and wider access to unique resources such as supercomputers, scientific databases, and instruments. However, major technical advances are still required if they are to be broadly deployed and used. This workshop will bring together leading researchers and practitioners to survey the state of the art in metacomputing, discuss outstanding problems, and identify future directions for research and development. The final product of the workshop will be an edited book discussing the state of the art and research challenges in building a computational grid.
2000 — 2005
Prudhomme, Thomas; Reed, Daniel; Spencer, Billie [⬀]; Bardet, Jean Pierre; Parsons, Ian (co-PI) [⬀]; Finholt, Thomas; Kesselman, Carl; Foster, Ian
Neesgrid: a Distributed Virtual Laboratory For Advanced Earthquake Experimentation and Simulation @ University of Illinois At Urbana-Champaign
The George E. Brown, Jr. Network for Earthquake Engineering Simulation (NEES) is a project funded under the National Science Foundation (NSF) Major Research Equipment appropriation. The goal of NEES is to provide a national, networked simulation resource of geographically distributed, shared-use, next-generation experimental research equipment sites, with teleobservation and teleoperation capabilities, which will transform the environment for earthquake engineering research and education through collaborative and integrated experimentation, computation, theory, databases, and model-based simulation to improve the seismic design and performance of U.S. civil and mechanical infrastructure systems. NEES will be constructed through September 30, 2004, and then enter its operational period from October 1, 2004 through September 30, 2014, during which time it will be operated by a NEES Consortium (to be established). This 39-month, $10,000,000 cooperative agreement with The Board of Trustees of the University of Illinois, Champaign, Illinois, for the project entitled "NEESgrid: A Distributed Virtual Laboratory for Advanced Earthquake Experimentation and Simulation" is the outcome of the peer review of proposals submitted to program solicitation NSF 00-7, "NEES: System Integration." The NEESgrid project will be led by the University of Illinois at Urbana-Champaign (UIUC), Champaign, Illinois, in partnership with the University of Chicago, Chicago, Illinois; University of Michigan, Ann Arbor, Michigan; University of Southern California, Los Angeles, California; and TeraScale, LLC, Cedar Crest, New Mexico. This multi-organizational team will design, develop, implement, test, and make operational the Internet-based, national-scale high performance network system for NEES, called "NEESgrid" in accordance with the NEESgrid Project Execution Plan. NEESgrid will build on proven, existing Grid technology.
Its design will integrate existing best practice tools from the commercial and research sectors. NEESgrid will be a layered, modular architecture that is flexible and highly extensible and will enable rapid development of new end-user applications, the introduction of new services, and the integration of new simulation software and experimental facilities as they are developed. NEESgrid will connect, through a high performance Internet network, distributed major earthquake engineering research equipment that includes shake tables, geotechnical centrifuges, a tsunami wave basin, large-scale laboratory experimentation systems, and field experimentation and monitoring installations funded through separate NEES equipment awards made by the NSF NEES program. A dedicated computer system at each experimental facility, called a NEES Point-of-Presence (NEESpop), will enable teleobservation, teleoperation, and network monitoring. Collaborative technologies will enable planning for and teleobservation of experiments as they occur. In addition to providing access for telepresence at the NEES equipment sites, NEESgrid will use cutting-edge tools to link leading edge computational resources and data storage facilities, including a curated repository that will archive experimental and analytical earthquake engineering and related data. NEESgrid also will provide distributed physical and numerical simulation capabilities and resources for visualization of experimental and computed data. NEESgrid support nodes will maintain online knowledge bases that contain tutorials and operate help desks for using NEESgrid. UIUC and its partners will work closely with NEES equipment sites, NEES Consortium Development awardee, NEES Consortium, and the earthquake engineering community to develop the user requirements and performance specifications for NEESgrid design. NEESgrid must be operational by September 30, 2004. 
During 2004, UIUC will transition NEESgrid to the NEES Consortium, which will operate it starting on October 1, 2004.
2001 — 2002
Kesselman, Carl
Us/Uk Workshop On Grid Computing @ University of Southern California
This award from the Partnerships for Advanced Computational Infrastructure Program provides support for 25 American scientists to participate in a US-UK Workshop on Grid Computing. The purpose of the workshop is to: 1) identify Grid research, development, and deployment activities that could be carried out jointly by researchers from the United States and the United Kingdom; 2) explore common interests in both applications and infrastructure (middleware, networking) that would warrant, and benefit from, the creation of a high-speed transatlantic testbed, and 3) if such a testbed is desirable, begin to develop the technical agenda for a high-speed transatlantic testbed. The workshop will be held on August 4-5, 2001 in San Francisco just prior to the tenth High-Performance Distributed Computing (HPDC-10) Conference. This workshop award will pay travel expenses for US participants. The US co-organizers are Carl Kesselman (USC-ISI) and Paul Messina (Caltech).
2001 — 2007
Minster, Jean-Bernard (co-PI) [⬀]; Jordan, Thomas [⬀]; Kesselman, Carl; Moore, Reagan
Itr/Ap: the Scec Community Modeling Environment: An Information Infrastructure For System-Level Earthquake Research @ University of Southern California
0122464 Jordan
It is now possible, because of recent advances in geophysics, to create, for the first time, fully three-dimensional simulations of earthquake fault-rupture and fault-system dynamics. Such physics-based simulations are crucial to gaining a fundamental understanding of earthquake phenomena, and they can potentially provide enormous practical benefits for assessing and mitigating earthquake risks through improvements in seismic hazard analysis. The Southern California Earthquake Center (SCEC) has embarked on an ambitious program to develop physics-based models of earthquake processes and integrate these models into a new scientific framework for seismic hazard analysis and risk management. This project involves a collaboration among SCEC, the Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey (USGS) to develop a "Community Modeling Environment", which will function as a virtual collaboratory for the purposes of knowledge quantification and synthesis, hypothesis formulation and testing, data assimilation and conciliation, and prediction.
To achieve its objectives, the environment must provide a means for describing, configuring, initiating, and executing complex computational pathways that result from the composition of various earthquake simulation models. This entails solving a number of challenging problems in information technology. To solve these problems, the principal investigators will draw on several distinct computer science disciplines: 1) Knowledge representation and reasoning techniques; 2) Grid technologies; 3) Digital library technology; and 4) Interactive knowledge acquisition techniques.
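The idea of composing earthquake simulation models into executable computational pathways can be illustrated with a minimal sketch. The stage names and numbers below are invented for illustration and are not part of the SCEC environment:

```python
# Hypothetical sketch of composing simulation models into a computational
# pathway: each stage consumes the previous stage's outputs.
from typing import Callable, Dict, List

Stage = Callable[[Dict[str, float]], Dict[str, float]]

def compose_pathway(stages: List[Stage]) -> Stage:
    """Chain model stages into a single executable pathway."""
    def pathway(state: Dict[str, float]) -> Dict[str, float]:
        for stage in stages:
            state = stage(state)
        return state
    return pathway

def rupture_model(state):
    # Toy stand-in for a fault-rupture simulation.
    return {**state, "slip": state["stress"] * 0.5}

def ground_motion_model(state):
    # Toy stand-in for a wave-propagation simulation.
    return {**state, "pga": state["slip"] * 2.0}

pathway = compose_pathway([rupture_model, ground_motion_model])
result = pathway({"stress": 1.0})
# result["pga"] == 1.0
```

The point of the composition is that the environment can describe, configure, and execute such a chain without the individual models knowing about each other.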
A central element of this project will be a Knowledge Transfer, Education and Outreach program with four primary goals: 1) to transfer the technology developed under this project to the end users of earthquake information, including engineers, emergency managers, decision makers, and the general public; 2) to cross-educate advanced students in the fields of geoscience and computer-science; 3) to make the general public aware of the benefits of applying advanced information technology to the problems of earthquake risk; and 4) to use public interest in earthquake information to attract beginning students into geoscience and computer science. A specific objective will be to engage young Hispanic Americans in the intellectual challenges of earthquake information technology. ***
2001 — 2005
Kesselman, Carl; Reed, Daniel; Foster, Ian; Butler, Randal; Livny, Miron (co-PI) [⬀]
Deploying and Supporting a National Middleware Infrastructure: Toward a National Grids Center @ University of Illinois At Urbana-Champaign
The proposal focuses on the deployment and support of the NSF Middleware Initiative (NMI) and is designed to serve as the Service Provider for the NMI program. The NMI program will expedite the development of middleware functions and services to support the national research and scientific community, and will also accelerate the deployment and use of new and emerging applications across the Internet. The purpose of the NMI program is to enable members of the advanced networking community to collaborate in developing and assembling common network services and resources (middleware) to optimize the network for applications and to share limited resources for the common good of all users.
This proposal has three primary goals:
- Provide production-quality software releases, support, training, and outreach, with the goals of ensuring that early adopter user communities are successful and that new user communities are trained in the use of the infrastructure.
- Leverage the PI's long history in running production facilities to develop tools and procedures to ensure that the resulting software is acceptable, deployable and supportable on a wide variety of end systems, including production environments at campus and laboratories.
- Provide a dedicated operations capability offering 24x7 support for infrastructure elements, monitoring of grid infrastructure, and capture and analysis of grid usage data. In doing so, NMI will enable the construction of a national-scale infrastructure that application communities such as NEES and GriPhyN, discipline-specific communities, and research communities can use to explore the middleware infrastructure on full-scale, meaningful applications.
The program will build on widely used research middleware, such as the Globus Toolkit, the de facto standard for Grid environments, Condor, and Storage Resource Broker, heavily leveraging open (e.g., IETF and W3C standard) protocols. The results of the project will rely on open protocols and open source implementations. In undertaking these activities, NCSA will work closely with major U.S. projects including the NSF PACIs, which will help reach out to other user communities, and with UCAID, which has committed to working to address the challenging and important problem of integration with campus infrastructures and the larger education and research communities. NCSA will work aggressively to support interoperability, transition relevant technologies to standards, and develop strong industrial relationships.
2001 — 2005
Butler, Randal; Kesselman, Carl; Foster, Ian; Papadopoulos, Philip
Designing and Building a National Middleware Infrastructure: Towards a National Grids Center @ University of Southern California
The proposal focuses on designing and building the NSF Middleware Initiative (NMI), and the project will serve as one of two System Integrators. The NMI program will expedite the development of middleware functions and services to support the national research and scientific community, and will also accelerate the deployment and use of new and emerging applications across the Internet. The purpose of the NMI program is to enable members of the advanced networking community to collaborate in developing and assembling common network services and resources (middleware) to optimize the network for applications and to share limited resources for the common good of all users. The proposal will be based on an open, extensible architecture, building on the success of the Grid activities (primarily Globus) and other middleware technologies. The project would enhance and extend this architecture to the larger research community and has three primary goals:
- Define an integrated, modular, and extensible Grid architecture addressing a large fraction of current and projected future middleware requirements for the scientific and engineering communities, as well as the larger academic communities.
- Instantiate the NMI architecture, creating robust, tested, packaged, and documented middleware solutions that address a significant fraction of these requirements, and that involve community extensions over the next several years to address other requirements.
- Work with the middleware research community to evolve the architecture and to integrate selected components into the NMI middleware infrastructure.
In constructing the NMI, ISI will build on widely used research middleware, such as the Globus Toolkit, the de facto standard for Grid environments, Condor, and Storage Resource Broker, heavily leveraging open (e.g., IETF and W3C standard) protocols, as well as middleware which is being developed by independent researchers and communities.
The results of the project will rely on open protocols, open source implementations, and interoperability. In undertaking these activities, ISI will work closely with major U.S. projects including the NSF PACIs, which will help us to reach out to other user communities, and UCAID, which has committed to working with ISI to address the challenging and important problem of integration with campus infrastructures and the broader academic science and research communities. ISI will also work with other international user communities as identified by NSF, will help transition relevant technologies to standards and products, and will develop industrial relationships.
2003 — 2007
Livny, Miron (co-PI) [⬀]; Butler, Randal; Kesselman, Carl; Foster, Ian; Papadopoulos, Philip
Nmi: Disseminating and Supporting Middleware Infrastructure: Engaging and Expanding Scientific Grid Communities @ University of Illinois At Urbana-Champaign
In 2001, the Grid Research, Integration, Development and Support (GRIDS) Center was established to define, develop, deploy, and support an integrated national middleware infrastructure. This proposal seeks to extend this work to (1) broaden the scope and capabilities of the middleware delivered; (2) embrace and extend emerging Open Grid Services Architecture (OGSA) standards; (3) expand the range of communities supported; and (4) establish international consensus in middleware. GRIDS Center 2 (GC2), like GC1, will rely heavily on open protocols, and open source implementations, and will work closely with major U.S. projects including the NSF Partnerships for Advanced Computational Infrastructure (PACI), the National Center for Atmospheric Research (NCAR), the Pittsburgh Supercomputer Center (PSC), and Internet2 to reach out to other user communities and to address integration with campus infrastructures. The proposal details a strategy to aggressively transition relevant technologies to standards and industry, building on strong corporate relationships.
This proposal stresses dissemination and support of this national middleware, with emphasis on outreach to scientific communities. Specific goals of this proposal are to: 1) Develop and deploy scalable federated support, bug tracking, and software delivery services. 2) Improve the quality of the software releases, support, and training. 3) Conduct outreach to the community, with the goals of ensuring that early adopter user communities are successful and that new user communities are trained in the use of the infrastructure. 4) Leverage the PIs' long history in running production facilities to develop tools and procedures to ensure that the resulting software is acceptable, deployable, and supportable in a wide variety of environments.
The proposed activities have significant intellectual merit and will bring broad benefits to society at large. The GRIDS leadership and staff are well qualified, having established leadership in the development and deployment of Grid middleware. These activities are on the cutting edge, yet they have real and immediate implications for technology users in the research/education domain, and beyond.
The broader impact of GRIDS Center has already been considerable, with more than a dozen leading companies such as IBM, Hewlett-Packard, and Oracle having committed to building on our technology for their Grid products. GC2 will see even greater emphasis on these outreach efforts, centering on the emergent OGSA standards that are being driven by the GRIDS leadership.
2003 — 2007
Livny, Miron (co-PI) [⬀]; Butler, Randal; Kesselman, Carl; Foster, Ian
Nmi: Designing and Building a National Middleware Infrastructure (Nmi-2) @ University of Southern California
In 2001, the Grid Research, Integration, Development and Support (GRIDS) Center was established to define, develop, deploy, and support an integrated national middleware infrastructure. This proposal seeks to extend this work to (1) broaden the scope and capabilities of the middleware delivered; (2) embrace and extend emerging Open Grid Services Architecture (OGSA) standards; (3) expand the range of communities supported; and (4) establish international consensus in middleware. GRIDS Center 2 (GC2), like GC1, will rely heavily on open protocols, and open source implementations, and will work closely with major U.S. projects including the NSF Partnerships for Advanced Computational Infrastructure (PACI), the National Center for Atmospheric Research (NCAR), the Pittsburgh Supercomputer Center (PSC), and Internet2 to reach out to other user communities and to address integration with campus infrastructures. The proposal details a strategy to aggressively transition relevant technologies to standards and industry, building on strong corporate relationships.
This proposal stresses the definition of the NMI architecture and the integration of extant middleware solutions into this architecture. Specific goals of this proposal are to:
* Define a scalable, robust middleware architecture leveraging the standards-based Open Grid Services Architecture.
* Create mechanisms and infrastructure for combining middleware components from a range of sources into integrated, supportable middleware software releases at least twice a year.
* Based on the requirements of the NMI user communities, create integrated OGSA-based services packages that provide focused functionality in areas such as data, computation management, and teleinstrumentation. These packages will integrate software from many NMI development groups.
Regarding NSF merit review criteria, our proposal has intellectual significance and broad benefits for society. GRIDS leadership and staff have established reputations in designing and deploying middleware. These R&D activities have real, immediate implications in research, education, and beyond. GRIDS impact has already been considerable, with over a dozen companies such as IBM, Hewlett-Packard, and Oracle committed to building products on GRIDS technology. GC2 will see even greater emphasis on these outreach efforts, centering on emergent OGSA standards that are being driven by the GRIDS leadership.
2003 — 2008
Livny, Miron [⬀]; Butler, Randal; Kesselman, Carl; Foster, Ian; Barford, Paul
Nmi: An Integrative Testing Framework For Grid Middleware and Grid Environments @ University of Wisconsin-Madison
In 2001, the Grid Research, Integration, Development and Support (GRIDS) Center was established to define, develop, deploy, and support an integrated national middleware infrastructure. This proposal seeks to extend this work to (1) broaden the scope and capabilities of the middleware delivered; (2) embrace and extend emerging Open Grid Services Architecture (OGSA) standards; (3) expand the range of communities supported; and (4) establish international consensus in middleware. GRIDS Center 2 (GC2), like GC1, will rely heavily on open protocols, and open source implementations, and will work closely with major U.S. projects including the NSF Partnerships for Advanced Computational Infrastructure (PACI), the National Center for Atmospheric Research (NCAR), the Pittsburgh Supercomputer Center (PSC), and Internet2 to reach out to other user communities and to address integration with campus infrastructures. The proposal details a strategy to aggressively transition relevant technologies to standards and industry, building on strong corporate relationships.
Early deployment of the NMI software releases demonstrated the benefits of such an integrated middleware package but also exposed the limitations of current testing capabilities. The goal of the effort outlined by this proposal is to develop, implement, and deploy a comprehensive testing framework and to offer its services to the Grid community.
The proposal outlines the design, construction, and dissemination of a transparent and transferable testing framework to address increasing demand for timely distribution of robust Grid middleware. The approach will be validated against the GC2 reference distribution, and the process and tools will be open and easy to install so that custom middleware can achieve quality, reliability, and stability. The approach should: 1) dramatically shorten the period from component update to release (speed of integration), 2) enhance integration of test modules (quality of integration), 3) validate middleware installations (certification), and 4) determine whether an application failure is to be blamed on the application, middleware, or fabric (fault isolation). GRIDS has already improved the quality and accessibility of an integrated Grid stack, but significant investment is needed to extend this capability and transfer it to community-specific software.
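As a rough illustration of the fault-isolation goal described above, one can probe each layer independently and blame the lowest failing layer. This sketch is hypothetical, not part of the GRIDS tooling:

```python
# Hypothetical sketch of layered fault isolation: probe the fabric,
# middleware, and application layers independently and report the lowest
# failing layer, since a lower-layer failure explains the ones above it.
def isolate_fault(fabric_ok: bool, middleware_ok: bool, application_ok: bool) -> str:
    if not fabric_ok:
        return "fabric"
    if not middleware_ok:
        return "middleware"
    if not application_ok:
        return "application"
    return "none"

# A middleware failure typically makes the application fail too, but the
# blame still lies with the middleware layer.
print(isolate_fault(True, False, False))  # middleware
```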
The proposal has intellectual significance and broad benefits for society. GRIDS leadership and staff have established reputations in designing and deploying middleware. These R&D activities have real, immediate implications in research, education, and beyond. GRIDS impact has already been considerable, with over a dozen companies such as IBM, Hewlett-Packard, and Oracle committed to building products on the technology. GC2 will see even greater emphasis on these outreach efforts, centering on emergent OGSA standards that are being driven by the GRIDS leadership.
2004 — 2007
Kesselman, Carl; Schopf, Jennifer
Sci: Nmi Development: Performance Inside: Performance Monitoring and Diagnosis For Nmi Software and Applications
The proposal seeks to develop software and tools that automate the collection, analysis, and application of diverse performance information, and in so doing lead to an understanding of the baseline performance behavior and failure conditions of distributed systems and applications. The tools will support the development of extended monitoring frameworks that capture performance data across the environment using a set of widely deployed sensors, or probes, that gather basic data. The work will enable sensors to be managed and configured easily, so that the amount of data collected can be varied according to the situation. Sensors will be able to register into a common registry so that users can easily discover and select the sensors relevant to a specific application. Additional tools, such as archivers and prediction services, will be provided to store performance data and estimate future behaviors. Trigger services will be provided for early fault detection, and higher-level analysis tools for behavioral diagnosis. The project will produce: (1) an NMI Sensor Toolkit, software that implements various simple sensor templates so that sensors can easily be incorporated into tools, applications, and other services to provide performance data to users; (2) performance-enabled NMI software, sensor-enhanced versions of various NMI components (e.g., Globus Toolkit's GRAM, GridFTP, RFT, MDS; MPICH-G2; OpenSSH; MyProxy); (3) sensor management tools, services and tools for archiving, analyzing, and controlling performance monitoring in a distributed environment; and (4) client tools (APIs, libraries, command line tools, utilities) that provide users and system administrators with easy and understandable access to performance data.
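The sensor-and-registry pattern described above (sensors register into a common registry, users discover them, and trigger services watch readings for early fault detection) can be sketched minimally as follows. All names here (SensorRegistry, Sensor, the "gridftp-latency" probe) are illustrative, not part of the NMI toolkit:

```python
# Hypothetical sketch of the sensor/registry/trigger monitoring pattern.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Sensor:
    name: str
    metric: str
    read: Callable[[], float]  # gathers one basic data point

class SensorRegistry:
    """Common registry so users can discover sensors relevant to an application."""
    def __init__(self):
        self._sensors: Dict[str, Sensor] = {}
        self._triggers: List[Callable[[str, float], None]] = []

    def register(self, sensor: Sensor) -> None:
        self._sensors[sensor.name] = sensor

    def discover(self, metric: str) -> List[Sensor]:
        return [s for s in self._sensors.values() if s.metric == metric]

    def add_trigger(self, fn: Callable[[str, float], None]) -> None:
        # Trigger services inspect every reading for early fault detection.
        self._triggers.append(fn)

    def poll(self) -> Dict[str, float]:
        readings = {}
        for s in self._sensors.values():
            value = s.read()
            readings[s.name] = value
            for fn in self._triggers:
                fn(s.name, value)
        return readings

registry = SensorRegistry()
registry.register(Sensor("gridftp-latency", "latency", lambda: 0.12))
alerts: List[str] = []
registry.add_trigger(lambda name, v: alerts.append(name) if v > 1.0 else None)
print(registry.poll())  # {'gridftp-latency': 0.12}
```

An archiver or prediction service in this sketch would simply be another consumer of `poll()` output, storing readings or extrapolating future behavior from them.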
2005 — 2011
Kesselman, Carl
Oci: Collaborative Research: Community Driven Improvement of Globus Software @ University of Southern California
This collaborative proposal describes development work on the software system known as the Globus Toolkit spanning a 5-year period. This development work, labeled as "Community Driven Improvement of Globus Software", or CDIGS, promises generally to respond to the needs of the scientific user community in adding enhanced functionality and performance to an existing and well-used software base exceeding 2.5 million lines of code. The proposal guides the reader first through a primer on the Globus user community and the Globus technology and components. The detailed project plan focuses on the first 24 months and cites specific work in the areas of security, data management, execution management, information services, and common runtime. Their methodology on software engineering practices describes how they plan to develop, integrate, and make available new capabilities across major GT code releases. The proposal also describes in several sections their approach to user engagement and community support, leveraging multiple channels of engagement ranging from meetings and community events to IRC and email list forums. One section covers the project team and management plan, where the roles of well known and veteran GT developers and designers are defined. The project team consists of about 14 FTE covering full-time software engineering, coordination, management, and documentation. The management plan includes the creation of a Technology board that oversees software engineering activities, an External Advisory board comprised of industry/academic/federal experts and stakeholders, and a Globus Alliance board charged with defining technical direction and providing a direct tie with the Globus Alliance organization. The proposal included 102 letters of support, including letters from most of the significant grid-related projects worldwide.
Intellectual Merit: Despite the term "research" appearing in the title, the proposal is clear about being a development project. The proposal claims that new knowledge will be gained, spanning areas of software engineering, in the pursuit of delivering Globus technology to the scientific users in grid environments. Broader Impact: The proposal identifies throughout the dependence on GT by a slew of major CI initiatives. Continued support and new development of Globus will thus support and further enable these existing distributed environments, as well as smooth the way for new grid environments to form in the support of distributed science, engineering, and education. The proposal makes a direct case for CDIGS supporting production science.
2006 — 2009
Minster, Jean-Bernard (co-PI) [⬀]; Jordan, Thomas [⬀]; Kesselman, Carl; Moore, Reagan
A Petascale Cyberfacility For Physics-Based Seismic Hazard Analysis @ University of Southern California
This CME Collaboration proposes to transform probabilistic seismic hazard analysis (PSHA) into a physics-based science by deploying a new cyberfacility "PetaSHA" that can execute PSHA computational pathways and manage data volumes using the nation's petascale computing resources. The objectives of the project have been formulated in terms of three science thrusts: (1) Extend deterministic simulations of strong ground motions to 3 Hz for investigating the upper frequency limit of deterministic ground-motion prediction. (2) Improve the resolution of dynamic rupture simulations by an order of magnitude for investigating the effects of realistic friction laws, geologic heterogeneity, and near-fault stress states on seismic radiation. (3) Compute physics-based PSHA maps and validate them using seismic and paleoseismic data. The investigators will assemble and operate a cyberfacility comprising four computational platforms, each designed to execute and manage the results from specific PSHA computational pathways. Two of these platforms, OpenSHA and TeraShake, have been developed by the CME Project. PetaShake will be an advanced research platform for petascale simulations of dynamic ruptures and ground motions with petascale capability computing and will be able to archive complete simulation data volumes (petascale data-intensive computing). CyberShake will be a new production platform that will employ advanced workflow management tools to compute and store the large suites of ground motion simulations needed for physics-based PSHA mapping. They will move these computational platforms forward in complexity and scale through a graduated series of calculations tied to a timeline with clear measures of success and will also develop a set of science gateways for the broader community, in particular researchers involved in the EarthScope and NEES Projects, to provide access to PSHA simulation capabilities and data products. 
The objectives include the cross-training of diverse groups of undergraduate interns and early-career scientists that will enable them to solve fundamental problems.
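The core computation that such suites of ground-motion simulations feed, a probability-of-exceedance hazard curve, can be sketched in a few lines. The scenario rates and ground-motion values below are invented toy numbers, not SCEC results:

```python
# Hypothetical sketch of the aggregation step in physics-based PSHA:
# combine simulated ground motions over rupture scenarios, weighted by
# annual occurrence rate, into a hazard curve.

def exceedance_rate(scenarios, threshold):
    """Annual rate at which simulated ground motion exceeds a threshold."""
    return sum(rate for rate, ground_motion in scenarios
               if ground_motion > threshold)

# (annual rate, peak ground acceleration in g) for three toy scenarios
scenarios = [(0.01, 0.5), (0.002, 0.9), (0.0005, 1.4)]

hazard_curve = {x: exceedance_rate(scenarios, x) for x in (0.4, 0.8, 1.2)}
# hazard_curve[0.4] == 0.0125
```

The physics-based part of the program lies in producing the per-scenario ground motions by simulation rather than by empirical attenuation relations; the aggregation itself stays this simple.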
2007 — 2010
Kesselman, Carl
Nsf Workshops On Virtual Organizations (Nsf-Vo) @ University of Southern California
OCI 0751539 | Carl Kesselman | U. of Southern California | Virtual Organizations Workshop
We propose to organize a series of two workshops on the design, construction, evaluation, and operation of virtual organizations. The first workshop will focus on identifying a cross-cutting research agenda in the area of virtual organizations. The second will focus on the pragmatic issues of building and operating functional virtual organizations within the context of the deployed NSF cyberinfrastructure.
Intellectual Merit. The invitees of these workshops will consist of leading researchers from the diverse disciplines that have been studying aspects of virtual organizations and distributed collaborative work, as well as leading researchers within the communities that represent both current and potential members of virtual communities. By bringing together these researchers, who typically do not have a forum for interacting, the workshops will help identify key research challenges, evaluation criteria, and end-user requirements for virtual organizations. These results will be captured in a workshop report and disseminated to the National Science Foundation and the various constituent communities. In addition, these workshops will provide an opportunity to identify new opportunities for collaboration and cross-cutting research activities in the important area of virtual organizations.
Broader Impacts. The Office of Cyberinfrastructure has identified virtual organizations as a fundamental element of its infrastructure plan. While there have been a variety of collaborations organized around the general concept of virtual organizations, to date no comprehensive understanding of fundamental mechanisms, infrastructure requirements, operational principles, or evaluation criteria has emerged. The proposed workshops will take a critical first step in creating this understanding. As a result, the workshop reports should have a significant impact on the research and development agendas of the disciplines represented in the workshops. Furthermore, we anticipate that both the collaborations initiated by the workshops and the resulting report can have a significant impact on shaping the way virtual organizations are supported in cyberinfrastructure, as well as on the effectiveness of the various user communities of a national-scale cyberinfrastructure. Finally, in addition to the direct participation of women and under-represented minorities in the workshops, a significant result of virtual organizations is the removal of geographic and organizational barriers to participation. To that end, the contributions these workshops make to advancing the understanding and practice of virtual organizations can only serve to level the playing field and open new opportunities for participation for all citizens.
|
1 |
2009 — 2013 |
Kesselman, Carl |
U24 | Activity Code Description: To support research projects contributing to improvement of the capability of resources to serve biomedical research. |
Bio-Informatics Research Network Coordinating Center (Birn-Cc) @ University of Southern California
This subproject is one of many research subprojects utilizing the resources provided by a Center grant funded by NIH/NCRR. The subproject and investigator (PI) may have received primary funding from another NIH source, and thus could be represented in other CRISP entries. The institution listed is for the Center, which is not necessarily the institution for the investigator. DESCRIPTION (provided by applicant): To respond to the need for advanced infrastructure for collaborative biomedicine, the NCRR Biomedical Informatics Research Network (BIRN) was established to develop a federated and distributed infrastructure for the storage, retrieval, analysis, and documentation of biomedical imaging data. The network of BIRN participants initially involved three testbeds with a neuroscience emphasis. The BIRN Coordinating Center (BIRN CC) is an important piece of the overall BIRN network, serving to develop, implement, and support the information technology infrastructure necessary to achieve distributed collaborations and data sharing among BIRN users. The current BIRN CC consists of a partnership between computer scientists, neuroscientists, and engineers based at the San Diego Supercomputer Center (SDSC). In this proposal, we address issues that have hindered collaborative research efforts.
We will evolve BIRN with new capabilities, using proven techniques already in use in other disciplines to automate the data publication process and to form "virtual data warehouses" that provide unified query interfaces over disparate and distributed data sources. Specifically, we will: 1) transition the BIRN CC to a new site (ISI at USC); 2) migrate the core infrastructure to a virtual machine-based deployment strategy; 3) leverage cyberinfrastructure tools to provide stability, scalability, and interoperability; 4) replace the Storage Resource Broker-based data solution with a service-based federated data management infrastructure; 5) move the BIRN Data Repository to storage hosted at ISI/USC; 6) migrate and support the most mature and extensible tools currently used by the BIRN user base to the new services framework; 7) create an infrastructure allowing other developers to integrate tools into BIRN; and 8) implement pragmatic data curation procedures focusing on creating a framework for using multiple ontologies. Finally, we propose a new governance structure that will be more responsive to the needs of the growing BIRN community. This new model will allow expansion of the BIRN user base and domains and will position BIRN at the forefront of new efforts to store primary data linked to peer-reviewed publications, taking advantage of new NIH open-access policies. Public Health Relevance: Biomedical research benefits greatly from the integration of biomedical imaging, clinical, and behavioral data from multiple independent institutions. This project will provide a data-sharing infrastructure that will enable researchers to investigate complex data-driven problems that would otherwise be impossible. The impact will be dramatic, as multi-modal data analysis has become fundamental to the development of evidence-based medicine.
|
1 |
2009 — 2013 |
Kesselman, Carl |
U54 | Activity Code Description: To support any part of the full range of research and development from very basic to clinical; may involve ancillary supportive activities such as protracted patient care necessary to the primary research or R&D effort. The spectrum of activities comprises a multidisciplinary attack on a specific disease entity or biomedical problem area. These differ from program project in that they are usually developed in response to an announcement of the programmatic needs of an Institute or Division and subsequently receive continuous attention from its staff. Centers may also serve as regional or national resources for special research purposes, with funding component staff helping to identify appropriate priority needs. |
Data and Computational Models Dissemination @ University of Southern California
The multi-scale models envisioned by MC-START are predicated on the ability to share the latest measurements and data produced by the various research projects and integrate them into multi-scale models. To achieve this objective, publishing, discovering, and retrieving data from the research projects and integrating them into multi-scale models must become a regular occurrence, not an exceptional event. For the results of these models to be scientifically valid, it is not enough simply to access the underlying data; we must be able to account for the details of its production, to record the provenance of both in vivo and model-based data production so as to enable repeatability, and to explore alternative formulations of the models in a methodical and structured way. Furthermore, we must have a means by which computational models can be published and made available both across the participants in MC-START and, eventually, to the research community at large. It must be possible for members of one research project to execute models produced by another research project, combining them to form the multi-scale models that are the underpinning of MC-START.
|
1 |
2012 — 2013 |
Kesselman, Carl |
N/A |
Collaborative Research: Conceptualizing An Institute For Empowering Long Tail Research @ University of Southern California
The majority of NSF-funded research occurs in small and medium-sized laboratories (SMLs) that often comprise a single PI and a few students and postdocs. For these small teams, the growing importance of cyberinfrastructure and its applications in discovery and innovation is as much a problem as an opportunity. With limited resources and expertise, even simple data discovery, collection, analysis, management, and sharing tasks are difficult. An unfortunate consequence is that in this "long tail" of science, modern computational methods often are not exploited, much valuable data goes unshared, and too much time is consumed by routine tasks. To date, research investments in science cyberinfrastructure have disproportionately emphasized big science projects, providing tools for use by IT staff and technology-savvy researchers rather than complete applications consumable by end users. This project's goal is to lay a foundation for a more balanced research agenda by focusing exclusively on the needs of SMLs.
This Software Institute Conceptualization project aims to determine whether these obstacles to discovery and innovation can be overcome via the use of software as a service (SaaS) methods. Such methods have proven immensely effective for small and medium businesses due to their ability to deliver advanced capabilities while streamlining the user experience and achieving economies of scale. To determine whether similar benefits can apply for SMLs, the project team will engage with multiple science communities to identify science practices, match science practices against candidate SaaS offerings, and evaluate business models that could permit sustainable development of those offerings. The outcome of this process is intended to be a compelling and competitive strategic plan for an NSF Software Innovation and Sustainability Institute that both meets immediate needs of the initial science communities and provides a basis for a new, more cost-effective method of addressing cyberinfrastructure needs across all NSF directorates.
|
1 |
2014 — 2016 |
Arnold, Donald B [⬀] Fraser, Scott E (co-PI) [⬀] Kesselman, Carl |
R01 | Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies. |
Dynamic Mapping of the Complete Synaptome Using Recombinant Probes @ University of Southern California
DESCRIPTION (provided by applicant): Here we propose to develop an experimental paradigm to allow dynamic monitoring of the strength and location of every glutamatergic and GABA/glycinergic synapse within the brain of a living organism. In combination with behavioral manipulation of the organism, this paradigm will allow for study of how the brain encodes information in synaptic structure. The paradigm will combine three technologies: 1. Recombinant probes with which postsynaptic excitatory and inhibitory sites can be labeled in vivo, allowing the location and strength of synaptic connections to be monitored in parallel. 2. 2P-SPIM microscopy, which can image large volumes very quickly, without bleaching and, potentially, with isotropic resolution. 3. Software to calculate and store the location and strength of each synapse in such a manner that it can be easily manipulated and analyzed. Experiments will be performed in zebrafish, as they have semi-transparent brains that are relatively small, yet they are capable of relatively complex behaviors. Experiments will be used both to establish the viability of the paradigm and to answer fundamental questions about how synapses are modulated during sleep, as well as how they are changed in learning paradigms such as sound habituation and place preference/aversion conditioning.
|
1 |
2014 — 2018 |
Kesselman, Carl |
U01 | Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies. |
Facebase 2 Coordinating Center @ University of Southern California
DESCRIPTION (provided by applicant): The FaceBase consortium is a distributed network of researchers investigating craniofacial development and dysmorphology. The consortium includes research projects directly participating via funded spoke projects as well as members of the craniofacial research community at large. The collection, sharing, and integration of heterogeneous data, including genetic, imaging, and anatomical data from human and animal models, are essential for advancing craniofacial research. We will develop and maintain a FaceBase 2 Data Management and Integration Hub infrastructure that will properly store, represent, and serve these data to the research community, and in addition provide access to tools for visualizing, integrating, annotating, linking, and analyzing the data. Our three broad aims will provide not only the needed data migration and management but also a set of user-centered tools for data visualization, comparison, and annotation. These are: Aim 1: Create an infrastructure, driven by principles of ease of use and user-centered design, to manage data throughout its lifecycle. By taking a user-centric approach and supporting the use of data throughout its lifecycle, the rate of discovery and the utility of the hub can be maximized. Aim 2: Provide tools within the Hub that accelerate craniofacial research by enabling data annotation, integration, and analysis. To maximize the utility of the Hub for both spoke projects and outside users, our proposed infrastructure will allow any user to define a personal workspace and organize datasets, visualization tools, and analysis tools into specific user-driven workflows. Aim 3: Promote use and collaboration across the FaceBase research network. The Hub will provide a wide range of collaboration tools and services to the consortium, as well as organize face-to-face meetings to enhance collaboration.
We have assembled an experienced team of experts in bioinformatics, imaging, genetics, and mouse and human studies, including FaceBase participants who bring firsthand knowledge of the needs of the craniofacial research community. The FaceBase team is intentionally diverse, providing each spoke group with at least one informed contact who will play an integral role in the development and implementation of the Hub, thereby ensuring that the data of each spoke project will have maximum impact on the community of researchers.
|
1 |
2015 — 2019 |
Kesselman, Carl Mcmahon, Andrew P. (co-PI) [⬀] |
U01 | Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies. |
The Usc (Re)Building the Kidney Coordinating Center @ University of Southern California
DESCRIPTION (provided by applicant): The functional unit of the mammalian kidney is the nephron: a complex, composite structure comprising an epithelial renal tubule with a closely associated specialized vascular network. Nephrons arise from a nephrogenic stem/progenitor population that exists only during fetal life. Consequently, the entire complement of nephrons is formed prior to birth. Thereafter, restoration of existing nephron structures is the organ's intrinsic strategy for kidney repair following acute, and likely chronic, injury episodes. From studies in non-human model systems, we have a general framework for the molecular and cellular processes at play during kidney development, and an understanding is emerging of intrinsic repair processes that maintain normal kidney function. Unfortunately, normal repair processes are inadequate: 1,000,000 patients are currently diagnosed with end-stage renal disease (ESRD), and approximately 400,000 of these are on dialysis, with a mean 5-year survival rate of 38%. The only option beyond dialysis for the ESRD patient is kidney transplant, but the number of transplants, approximately 20,000/year, does not match patient need. The long-term goal of the (Re)Building the Kidney Consortium is to harness our knowledge of nephron development and repair to forge innovative new approaches to treat injury and disease of the kidney. The consortium is being developed with the underlying thesis that a coordinated, focused, adaptive, and interactive program of research by several groups has the potential to yield results in a timeframe that would not be possible through a static collection of independent single-investigator research projects. Coordination across the research consortium demands the ability to rapidly integrate new research projects, communicate results, and share knowledge and data across the consortium.
It is not enough to publish results at the end of a study; rather, it must be possible to share results incrementally at all stages of the research process. To address these issues, we propose to create a (Re)Building the Kidney Coordinating Center (RBKC) to ensure the consortium operates effectively as a coordinated research network, maximizes the value of individual projects, and communicates consortium activities to the broad research community. Specifically, the RBKC aims to: 1) create an organizational structure and processes that will enable seamless and frequent communication and collaboration among RBKC researchers, the NIDDK, and the broader research community; 2) create a data portal that will empower changing collections of researchers to rapidly create, analyze, share, and discover diverse types of data; 3) broaden the research community and create a framework for targeted shorter-term research results by establishing and managing an opportunity pool program; and 4) disseminate consortium results to the broader research community. In total, these elements of the RBKC will accelerate the rate of discovery within the consortium and by the broader community.
|
1 |
2015 — 2017 |
Schuler, Robert Kesselman, Carl |
N/A |
Collaborative Research: Personalizing Recommendations in a Large-Scale Education Analytics Pipeline @ University of Southern California
This pilot project aims to begin organizing the world's digital learning resources in order to make personalized recommendations that are engaging and effective in improving mathematics learning outcomes. The project accomplishes this goal by developing crowdsourcing techniques to organize learning resources and by analyzing the online learning activities of the student. Teachers are an integral part of this project. The target audience for this pilot is 7th grade mathematics students and teachers.
This project builds upon the Gooru platform, which serves a community of over 500,000 users in 140 countries and all 50 US states. The platform uses crowdsourcing by its community to curate over 70,000 collections of free web resources, consisting of over 16,000,000 education resources. This project builds upon the Gooru resources by applying learning analytics to user interactions within Gooru to discover the resources that most benefit students. Thus, resources can be tailored to the individual student to maximally engage the student and improve the student's learning. Since the Gooru user owns his or her data, explicit opt-in is required for the sharing of data, thus protecting the privacy of students who do not wish to share their data. Gooru is open source and free, so there are no economic barriers (besides internet access) to using the platform.
|
1 |
2016 — 2020 |
Kesselman, Carl |
U24 | Activity Code Description: To support research projects contributing to improvement of the capability of resources to serve biomedical research. |
Usc Gudmap Coordinating Center @ University of Southern California
Project Summary Congenital anomalies of the kidney and urinary tract are common birth defects. These defects are a significant public health problem affecting Americans; further, they may go undetected yet increase the risk of chronic diseases later in life. Since its inception, the GenitoUrinary Development Molecular Anatomy Project (GUDMAP) has provided a framework and repository for extensive expression data to provide a molecular anatomy of genitourinary development in the mouse, with the long-term goal of harnessing and sharing our knowledge of urogenital development to forge innovative new approaches to treat disease. This work has advanced our understanding of subpopulations and patterning events in the developing nephron, lower urinary tract, and gonads. A deep understanding of the molecular drivers of kidney formation serves to inform current efforts in the field to create kidney organoids for therapeutic and disease studies. We propose to operate a GUDMAP Database that will facilitate a coordinated, focused, adaptive, and interactive program of research by several groups, yielding results faster than a static collection of independent research projects. Coordination across the research consortium demands the ability to rapidly integrate new research projects, communicate results, and share knowledge and data across the consortium. Our work will generate connections between mouse and human data to link function to disease and thus identify potential causal and therapeutic targets. By providing tools to coalesce and link disparate datasets, we will provide efficient and easy methods to pose questions and further research. The GUDMAP Database team will conduct meta-analyses to produce both informative ways to use the tools we build and new testable hypotheses regarding urogenital development and disease.
Our proposed GenitoUrinary Development Molecular Anatomy Project (GUDMAP) Database will: 1) create an organizational structure and processes that will enable seamless and frequent communication and collaboration among GUDMAP researchers, the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), and the broader research community; 2) create a highly usable database that will empower changing collections of researchers to rapidly upload, organize, and search heterogeneous data types; 3) build a comprehensive toolkit for data annotation, curation, analysis, visualization, and cross-validation; 4) broaden the research community and create a framework for targeted shorter-term research results by establishing an opportunity pool program; and 5) disseminate consortium resources to the wider research community. In total, these elements of GUDMAP will accelerate the rate of discovery within the consortium and the broader community.
|
1 |
2017 — 2018 |
Arnold, Donald B [⬀] Fraser, Scott E (co-PI) [⬀] Kesselman, Carl |
R01 | Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies. |
Dynamic Mapping of the Complete Synaptome Using Recombinant Probes @ University of Southern California
DESCRIPTION (provided by applicant): Here we propose to develop an experimental paradigm to allow dynamic monitoring of the strength and location of every glutamatergic and GABA/glycinergic synapse within the brain of a living organism. In combination with behavioral manipulation of the organism, this paradigm will allow for study of how the brain encodes information in synaptic structure. The paradigm will combine three technologies: 1. Recombinant probes with which postsynaptic excitatory and inhibitory sites can be labeled in vivo, allowing the location and strength of synaptic connections to be monitored in parallel. 2. 2P-SPIM microscopy, which can image large volumes very quickly, without bleaching and, potentially, with isotropic resolution. 3. Software to calculate and store the location and strength of each synapse in such a manner that it can be easily manipulated and analyzed. Experiments will be performed in zebrafish, as they have semi-transparent brains that are relatively small, yet they are capable of relatively complex behaviors. Experiments will be used both to establish the viability of the paradigm and to answer fundamental questions about how synapses are modulated during sleep, as well as how they are changed in learning paradigms such as sound habituation and place preference/aversion conditioning.
|
1 |
2019 — 2021 |
Chai, Yang (co-PI) [⬀] Kesselman, Carl |
U01 | Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies. |
Usc Facebase Iii Craniofacial Development and Dysmorphology Data Management and Integration Hub @ University of Southern California
PROJECT SUMMARY The major goal of the FaceBase Consortium is to advance research by creating comprehensive datasets of craniofacial development and dysmorphologies, and to disseminate these datasets to the wider craniofacial research community. The FaceBase 3 Data Management and Integration Hub builds on the existing and successful scientific and technical team that has led the development, deployment, operation, and community engagement of the FaceBase 2 data hub. Looking forward to the future impact of the FaceBase Consortium, we face major challenges that include: (1) how to annotate large datasets to empower the biomedical research community; (2) how to improve data integration and facilitate data search and retrieval from the hub; (3) how to use the data from FaceBase to design studies and otherwise inform our future research; and (4) how to translate our knowledge from animal model studies to improve human craniofacial health. Importantly, the projects we propose here are designed to address these issues as we develop anatomy-based comprehensive data search and display, incorporate additional datasets from the community, and promote the usage of FaceBase data. In pursuit of these goals, we propose the following three Specific Aims: Specific Aim 1: Create a FAIR data repository and resource hub for the craniofacial research community. Specific Aim 2: Promote and enable data contributions to FaceBase by NIDCR-supported researchers. Specific Aim 3: Foster a community of active FaceBase users through outreach activities and dissemination of new features and available datasets. We will create educational materials and provide training opportunities for the next generation of craniofacial researchers.
Ultimately, our proposed projects will not only contribute to our comprehensive understanding of the molecular regulatory mechanisms of craniofacial development but will also demonstrate how others can use the datasets from the FaceBase Hub to develop innovative approaches to improve human health.
|
1 |
2020 |
Chai, Yang (co-PI) [⬀] Kesselman, Carl |
U01 | Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies. |
Creating Scalable, Reliable, Sustainable Infrastructure For Fair Data @ University of Southern California
PROJECT SUMMARY The goal of the FaceBase III Hub is to create a FAIR data repository to serve the entire community of dental and craniofacial researchers by sharing diverse data related to craniofacial development and dysmorphology. To meet this goal, FaceBase is built on Deriva, an open-source data management system designed with FAIR data principles in mind. This platform has allowed FaceBase to evolve with changing requirements for data on new experimental methodologies and instruments, additional model organisms, cell characterization, integration of computational pipelines, and visualization interfaces. Currently, we implement Deriva on private and public clouds using a "data center in the cloud" format; i.e., treating the cloud like a traditional remote computer, to run virtual machine images and conventional data storage. However, cloud platforms such as Amazon Web Services (AWS) offer a wide range of cloud-native services beyond virtual machines which, if fully leveraged, would drastically improve important aspects of Deriva that would directly benefit FaceBase. Hence, we propose to enhance Deriva for cloud-based operations to address three key aspects of Deriva in support of FaceBase and its other NIH communities: scalability, reliability, and sustainability. Specifically, we plan to use AWS native services to improve its scalability (Aim 1), decouple Deriva services to run in containerized execution environments to ensure its reliability (Aim 2), and develop cost management dashboards to monitor and predict costs of operating in the cloud to achieve sustainability (Aim 3). The AWS native services are fully managed and highly scalable, and offload much of the overhead of system operations and maintenance. Improvements in Deriva scalability, reliability, and sustainability achieved by these Aims will allow the FaceBase Hub to provide the growing community of data contributors and users with better service.
In addition, many other user communities such as GUDMAP, (Re)Building a Kidney, the Kidney Precision Medicine Project (NIDDK) and the Common Fund Data Environment (OD) rely on Deriva, and all of the improvements resulting from these Aims would yield a direct and immediate benefit to thousands of additional users.
|
1 |
2020 — 2022 |
Kesselman, Carl Kim, Byoung-Do (co-PI) [⬀] |
N/A |
Cc* Compute: a Customizable, Reproducible, and Secure Cloud Infrastructure as a Service For Scientific Research in Southern California @ University of Southern California
This project creates a hybrid cloud infrastructure as a scientific computing gateway that promotes and supports interdisciplinary, multi-institutional research in science, engineering, biomedicine, and the social sciences. The hybrid cloud platform also promotes regional and national research collaboration, as a portion of the resources is integrated into the Open Science Grid (OSG). The many multi-institutional research projects headquartered at the University of Southern California (USC), along with their regional and national collaborators, benefit from use of the OSG. The system also extends the impact of their research outcomes and of the projects themselves, as it offers various ways to share research outputs and knowledge with external collaborators. The planned support for regional universities and integration with OSG increase opportunities to serve a broader community.
A broad research community is supported by this system by providing access to public and private cloud services as well as local high performance computing (HPC) and data resources. The design of the hybrid cloud system facilitates the creation of customizable, virtualized platforms and reproducible, container-based application services that enable multi-dimensional computing and data solutions. Researchers are able to pick and choose from a standard service catalogue to build pre-defined virtual machines and containerized applications, or, if necessary, create their own specialized environments. Along with built-in security, reproducible service modules, and the capability of creating and sharing customized environments, the hybrid cloud system bridges multi-disciplinary research domains and enhances the usability of advanced cyberinfrastructure for improved research productivity.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
|
1 |
2020 |
Chai, Yang (co-PI) [⬀] Kesselman, Carl |
U01 Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies. |
Bioethics of Syndrome Diagnosis Using 3d Image Analysis @ University of Southern California
Project Summary/Abstract

This supplement will address the unintended consequences and collateral damage that arise when facial recognition software is used for medical purposes, such as syndrome diagnosis in our FaceBase-funded project.

In Aim 1, we will determine whether the accuracy of this technology varies with self-reported race, sex, and age. We will examine our existing database for evidence of bias based on self-reported race, sex, or age and determine the extent to which these variables influence classification performance. To the extent sample sizes allow, we will carry this analysis to the level of specific syndromes. Finally, we will use anonymized reference datasets of non-syndromic faces to compare false positive rates based on NIH race definitions, sex, and age. The outcome of this aim is to objectively establish bias and estimate the effects of under-representation across race, age, and sex categories within our data.

In Aim 2, we will determine how reports of race-, sex-, and age-based bias in facial recognition technology may influence researchers' and clinicians' views of the technology and its application. This aim will establish the extent to which storing large databases of facial images, and applying machine learning to them for diagnostic purposes, may raise privacy concerns. The concerns investigated will include potential hacks into protected health information; fear relating to bias in some facial recognition software (and, potentially, in the FaceBase database); and fear of discrimination in the application of the technology, such as by insurers. The outcome will be a white paper targeting a high-profile journal, summarizing the findings and defining crucial issues that should guide the development of facial imaging for disease diagnosis and clinical use.
|
1 |
2021 — 2024 |
Kesselman, Carl |
N/A |
Collaborative Research: Creating Mechanisms to Make Integrative Structures of Large Macromolecular Assemblies Available From the Protein Data Bank @ University of Southern California
The goal of this project is to develop the necessary infrastructure to make the integrative structures of biomolecular systems available from the Protein Data Bank (PDB). PDB is the single global repository for three-dimensional structures of biological macromolecules and their complexes. It is used by millions of scientists and educators worldwide. This project will provide open access to integrative structures through the PDB, which will greatly enhance the understanding of cellular and molecular biology and speed the progress of biomedical and biotechnology research. The project is central to the creation of an interoperating network of structural biology resources that will provide seamless access to research data and promote efficient data sharing. This work will significantly advance biological research by making structures of complex macromolecular assemblies easily accessible to everyone through the PDB.
Increasingly, structures of many complex biological assemblies are determined using integrative approaches, in which data from multiple experimental methods are combined. The PDB archives structures determined using single methods and needs to be expanded to handle these integrative structures. In the last three years, a standalone prototype system, called PDB-Dev, has been developed for archiving integrative structures and making them publicly available. The current project will integrate the PDB-Dev system with the PDB data processing pipeline and ensure that integrative structures are made available from the PDB according to the FAIR (Findable, Accessible, Interoperable, and Reusable) principles. Integration of PDB-Dev with the PDB will give the broad and growing community of PDB users access to an even richer and more diverse set of curated structural models. The project will create a rigorous, standardized process for the deposition, curation, validation, archival, and dissemination of integrative structures through the PDB. This functionality will facilitate tackling increasingly challenging structure determination problems, such as elucidating the three-dimensional structures of complete genomes and even creating a spatiotemporal model of an entire cell. The results of the project can be accessed from the PDB-Dev website: https://pdb-dev.wwpdb.org.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
|
1 |
2021 |
Arnold, Donald B [⬀] Fraser, Scott E (co-PI) [⬀] Kesselman, Carl Kopell, Nancy (co-PI) [⬀] |
U01 |
Circuits Underlying Threat and Safety @ University of Southern California
Classical conditioning has been studied in many different animal models, and even in humans. However, the larval zebrafish, with its transparent brain, offers a unique opportunity to observe large-scale changes in synaptic structure that accompany this form of learning. Accordingly, we have developed a novel paradigm for visualizing the synaptic changes that occur during classical conditioning in larval zebrafish. Using this paradigm, we have observed striking region-specific changes in the distributions of synapses that drive the rewiring of the neural circuits mediating threat responses. In this grant we will expand this paradigm by monitoring neuronal activity, through imaging of genetically encoded calcium indicators, throughout the pallium (the homolog of the amygdala) before, during, and after classical conditioning and extinction. This will allow us to identify the cells that comprise the circuits controlling threat and safety and to explore their connectivity using optogenetics. We will investigate how different sensory inputs can change the activity of those cells, leading to synaptic changes and the formation or extinction of associative memories. A crucial component of these studies will be recording field potentials to capture rhythmic activity throughout the pallium, together with high-speed SPIM imaging of genetically encoded voltage indicators to record rhythms in individual cells. By understanding the precise timing of the signals that impinge on individual cells, we will uncover mechanisms that underlie synaptic plasticity. Our goal is to develop a theoretical model describing the neural circuits that underlie threat detection and how they change as a result of associative memory formation and extinction.
|
1 |
2021 — 2023 |
Kesselman, Carl Kim, Byoung-Do (co-PI) [⬀] Pyun, Yul |
N/A |
CC* Regional: A Purpose-Built SoCal Science DMZ for Catalyzing Scientific Research Collaborations @ University of Southern California
The Los Nettos Regional Network is a long-standing regional research and education (R&E) network with a history of supporting science and engineering research for its more than 30 members and associates in the greater Los Angeles area. This project builds a friction-free regional Science DMZ network across multiple Southern California college campuses, catalyzing collaborative research capabilities at the institutions. The project establishes the network infrastructure and software necessary to facilitate high speed transfers of large-scale research data for regional and national scientific collaborations. The campuses included in the network are Loyola Marymount University, Occidental College, and The Claremont Colleges consortium, which consists of Claremont Graduate University, Claremont McKenna College, Harvey Mudd College, Keck Graduate Institute, Pitzer College, Pomona College, and Scripps College.
This purpose-built science network is customized to each institution's needs and follows the well-known Science DMZ guidelines established by ESnet. The new network interconnects with state, national, and international networks such as CENIC's California Research and Education Network (CalREN), Internet2, and Pacific Wave. Projects across many science domains benefit from the significant increase in network capacity that this project supports. Coordinated activities at the regional level, including technical training for administrators and researchers at each campus, ensure that uniform standards are maintained. This scalable R&E network can be expanded in the future to serve researchers and students at other, smaller regional institutions (e.g., Charles R. Drew University of Medicine and Science, ArtCenter College of Design) as their need for collaboration widens.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
|
1 |
2021 |
Chai, Yang (co-PI) [⬀] Kesselman, Carl |
U01 |
Improving AI/ML-Readiness of FaceBase Research Datasets @ University of Southern California
PROJECT SUMMARY

The FaceBase III Hub was created by the National Institute of Dental and Craniofacial Research (NIDCR) to provide a data repository serving the entire community of dental and craniofacial researchers by sharing diverse data related to craniofacial development and dysmorphia, as well as other research communities that can leverage the data in the FaceBase repository. One particularly unique and important element of FaceBase III is its collection of over 22,000 facial images from over 11,000 human subjects, many of which are labeled with syndromes based on clinical and genomic diagnoses. Facial images are a critical resource for studying the correlation between genotype and phenotype and have received intense interest within the Artificial Intelligence (AI) and Machine Learning (ML) research field, with notable advances in automated phenotyping. While FaceBase embraces the FAIR (Findable, Accessible, Interoperable, and Reusable) principles, there are unique concerns specific to AI/ML research, including the presence of noise, uncertainty of labels, and bias within datasets. It is imperative that we remedy any limitations in the utility of FaceBase's facial imaging data for AI/ML research. In this project, we propose to unlock the tremendous potential of FaceBase facial scans by identifying gaps in how data are characterized, formatted, and preprocessed from the perspective of their use in AI/ML research and algorithm development. To accomplish this, we propose to initiate a pilot application that applies existing deep learning algorithms developed by investigators in this proposal to existing FaceBase data (Aim 1). The goal of the pilot is to identify how the curation, organization, and preparation of FaceBase data might be improved to streamline their use in ML/AI-based investigations. Based on what we learn from the pilot, we will modify the current FaceBase self-curation processes, specifically around facial scans (Aim 2).
This will require us to streamline our process for curating human subject data so that we capture the necessary rich descriptive elements while maintaining the required restrictions on data handling. Ultimately, the goal is to position the FaceBase Hub so that its existing facial scan resources become more broadly useful to AI/ML researchers. More significantly, we expect to see increased availability of facial scan data and other associated data types, such as genotyping and neurofunctional data. By making the proposed improvements to our data ingest procedures, we anticipate that FaceBase will scale to significantly larger dataset sizes, cementing and expanding its position as a unique resource for the broader NIH community of ML and AI researchers.
|
1 |
2021 |
Chai, Yang (co-PI) [⬀] Kesselman, Carl |
U01 |
Enhancing FAIRness of the FaceBase Research Data Hub @ University of Southern California
PROJECT SUMMARY

The goal of the FaceBase III Hub is to create a data repository serving the entire community of dental and craniofacial researchers by sharing diverse data related to craniofacial development and dysmorphia, as well as other research communities that can leverage the data in the FaceBase repository. FaceBase was designed to follow the FAIR data principles: that data be Findable, Accessible, Interoperable, and Reusable. To date, we have had significant success getting new research teams to submit data to FaceBase in a manner that adheres to FAIR. However, since the start of FaceBase III, the digital repository community has developed a new set of principles designed to promote trust in the data a repository contains, with the goal of promoting broader reuse of those data. The TRUST principles of Transparency, Responsibility, User Focus, Sustainability, and Technology have been identified as core to increasing users' confidence in a data repository and hence to promoting data use and reuse. Surveys and interviews of FaceBase users indicate that FaceBase is perceived within the community as a high-quality, trustworthy repository. However, a deliberate application of the TRUST guidelines has the advantage of making FaceBase's commitment to these principles more explicit, increasing our ability to promote the use of FaceBase data outside the core craniofacial research community and better serving our users across the NIH research community. In this proposal, we seek to achieve these goals by explicitly identifying, defining, and adopting operational principles in FaceBase to meet community-agreed standards for TRUST (Aim 1). A critical part of TRUST is defining specific metrics that enable evaluation of the data repository's impact.
To this end, we will leverage the extensive built-in analytics that FaceBase collects to define key performance metrics of impact and produce a reporting structure to disseminate those metrics to the FaceBase user community (Aim 2). Ultimately, the goal is for the FaceBase Hub to become a certified trusted data repository. Certification will establish greater credibility for the FaceBase Hub with our user community and with users less familiar with FaceBase. Long term, the adoption of TRUST principles will ensure that datasets deposited with us and entrusted to our care will be maintained at the highest level of quality.
|
1 |
2023 — 2024 |
Kesselman, Carl Kim, Byoung-Do (co-PI) [⬀] |
N/A |
CC* Regional Computing: Building Cyberinfrastructure to Forge a Regional Research Computing Alliance in Southern California @ University of Southern California
Educational institutions in Southern California, particularly those focused on undergraduate studies, face challenges in accessing essential high-performance computing (HPC) resources and cyberinfrastructure. This lack of access leads to decreased knowledge of and experience with HPC among students and faculty and inhibits advances in data-intensive research. The University of Southern California's (USC) Center for Advanced Research Computing (CARC) houses one of the largest HPC facilities in the region and is deploying new cyberinfrastructure to extend its services and expertise to under-resourced universities in the area. The implementation of this new HPC system, with corresponding user support services, further solidifies the existing collaboration between USC and the Los Nettos consortium, comprising six member universities and 30 associate networks.

The new system offers a computing environment similar to that of national HPC centers, with advanced computing capacity, a high-speed network, and abundant research applications and tools in its software stack. Additionally, the system leverages existing cyberinfrastructure at USC, such as central file systems, HPC cluster interconnection, and federated authentication. This project significantly increases the accessibility of advanced cyberinfrastructure to students and researchers at regional institutions while catalyzing multi-institutional research collaborations, thereby forming a computational research and education hub in the region: the Southern California Research Computing Alliance.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
|
1 |