1988 — 1991
Mehrotra, Sanjay
Research Initiation: Interior Point Algorithms For Linear and Convex Programming @ Northwestern University
Since the development of Karmarkar's algorithm for linear programming, interest in interior point algorithms for solving linear and convex programming problems has been rejuvenated. Several variants of interior point algorithms have shown considerable promise of achieving substantial improvements in efficiency over the simplex method on linear programming problems. This proposal requests funds to continue our study of the theoretical properties and the computational effectiveness of these algorithms. We intend to integrate the theory developed in our previous work with our experimental experience to develop a general purpose code for solving these problems.
1991 — 1995
Mehrotra, Sanjay
Interior Point Methods in Optimization @ Northwestern University
Methods developed by using higher order Taylor polynomial, as well as those obtained by taking a generalized predictor-corrector approach, will be studied for solving linear programs. Efficient methods for solving structured linear programs will be developed. Methods for finding vertex solution within the framework of interior point methods will be investigated. Approaches will be pursued to extend the research to parallel methods and to convex problems.
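The generalized predictor-corrector idea described above can be made concrete with a small sketch. The following is a bare-bones primal-dual predictor-corrector iteration for min c'x subject to Ax = b, x >= 0, using a naive starting point and dense linear algebra; it illustrates the general scheme only, not the code developed under this grant:

```python
import numpy as np

def predictor_corrector_lp(A, b, c, tol=1e-8, max_iter=60):
    """Simplified primal-dual predictor-corrector method for
    min c'x  s.t.  Ax = b, x >= 0  (illustration only)."""
    m, n = A.shape
    x, s, y = np.ones(n), np.ones(n), np.zeros(m)

    def step_len(v, dv):
        # largest alpha in (0, 1] keeping v + alpha*dv > 0
        mask = dv < 0
        return min(1.0, 0.99 * np.min(-v[mask] / dv[mask])) if mask.any() else 1.0

    for _ in range(max_iter):
        rb, rc = A @ x - b, A.T @ y + s - c       # primal/dual residuals
        mu = x @ s / n                            # duality measure
        if max(np.linalg.norm(rb), np.linalg.norm(rc), mu) < tol:
            break
        d = x / s
        M = A @ (d[:, None] * A.T)                # normal-equations matrix

        def solve(r_xs):
            # Newton system reduced to M dy = rhs
            rhs = -rb - A @ ((r_xs + x * rc) / s)
            dy = np.linalg.solve(M, rhs)
            ds = -rc - A.T @ dy
            dx = (r_xs - x * ds) / s
            return dx, dy, ds

        # predictor (affine-scaling) direction
        dx_a, dy_a, ds_a = solve(-x * s)
        a_p, a_d = step_len(x, dx_a), step_len(s, ds_a)
        mu_aff = (x + a_p * dx_a) @ (s + a_d * ds_a) / n
        sigma = (mu_aff / mu) ** 3                # adaptive centering weight
        # corrector direction, built from the predictor's second-order term
        dx, dy, ds = solve(-x * s + sigma * mu - dx_a * ds_a)
        a_p, a_d = step_len(x, dx), step_len(s, ds)
        x += a_p * dx
        y += a_d * dy
        s += a_d * ds
    return x, y, s
```

A production implementation would factor the normal-equations matrix M once per iteration and reuse the factorization for the corrector solve; this sketch simply calls a dense solver twice.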
1995 — 1999
Defanti, Thomas (co-PI); Banerjee, Pat; Mehrotra, Sanjay
Remote Manufacturing Decision Support Using Virtual Reality @ University of Illinois At Chicago
9500396 Banerjee The use of virtual environments could provide a major advance in manufacturing information management and decision support technologies. In this research, virtual reality will be used in facilities and product layout information management and optimization systems. In the layout domain, the research will address three important problems: (1) three-dimensional (3D) general layout design, (2) layout of manufacturing cells and material flow paths in factories, and (3) layout of multi-module assemblies. The short-term objective is to demonstrate the feasibility of using virtual reality to reduce manufacturing cycle time. The long-term objective is to develop the raw technology for a real-time, virtual-reality-based management control and decision support tool. There is also a major software development component. In addition, the research involves collaborations with Motorola and five other industrial firms, the Chicago Manufacturing Center, and the National Institute of Standards and Technology, and industry collaborators will co-fund the work. The research aims to enable shorter lead times in manufacturing. It will enhance the fundamental understanding of manufacturing decision support in light of new technological developments in virtual reality. It will provide a new generation of tools for training, develop alternatives to physical prototyping, and provide avenues for sustaining employee interest in better design of manufacturing systems through continuous improvement. The research will be introduced through lectures and laboratory experiments to graduate and undergraduate students. Concepts will be incorporated into courses in computer-aided design, virtual environments, production control, intelligent manufacturing systems, and facilities layout. Seminars will also be conducted for industrial partners and other companies.
1999 — 2003
Mehrotra, Sanjay
Methods For Solving Linear and Nonlinear Mixed Integer Programs @ Northwestern University
Several inventory, production and process planning, layout, location, logistics, budgeting, financial planning, and investment problems are modeled as linear or nonlinear mixed integer programs. These models arise when optimizing operations of medium- to large-size enterprises and are considered very hard. Although models for practical problems involve thousands of variables, only problems of very small size (a few hundred variables) have been solved in practice. Previous work introduced several new ideas towards solving these problems: (1) the development of a new branch-and-cut method for mixed convex 0-1 problems; (2) the use of nonlinear cuts for nonlinear and linear mixed 0-1 problems; and (3) generating cuts at near optimal vertex solutions for mixed linear 0-1 and integer problems. These ideas need further refinement to convert them into a practical computational methodology for solving general-purpose linear and nonlinear mixed integer programs. This project will study the effectiveness of cuts generated using disjunctive programs for convex mixed 0-1 problems, and will investigate how these cuts can be generated efficiently. The use of semidefinite relaxations in this context will be investigated, as well as their value in solving nonlinear mixed 0-1 programs. The use of nonlinear cuts for solving these problems will also be investigated. If successful, this research will increase, by an order of magnitude, the size of difficult nonlinear integer programs that can be successfully solved. The research has an experimental and software development (computational) component in which extensive testing will be performed to validate the practicality of the researched ideas. Since the research develops methods for solving linear and nonlinear integer models with general structure, a successful outcome is expected, from a practical point of view, to have the widest impact in solving real-world models.
2002 — 2006
Mehrotra, Sanjay
Generalized Branch-and-Cut Method For Mixed Integer Programming @ Northwestern University
Models involving general integer variables with nonlinear constraints are very hard to solve. There has been some, but limited, progress towards solving large scale models of this type. Computational results on practical test problems suggest that the gap between the optimum objective value and the bound from the solution of the relaxed problem at the root node is significantly larger than the bounds obtained in the linear case, and the size of the enumeration tree explodes quickly. To overcome the difficulties in solving large scale problems of this type, advances in several directions are needed. Methods for generating good cutting planes need to be developed further, as do more advanced heuristics for quickly identifying a feasible solution. However, we think that the most fundamental advance may come from the development of advanced branching schemes in a branch-and-cut method. Standard branching schemes branch on a single variable disjunction at a time. Lenstra showed that by branching on general hyperplanes a mixed integer linear programming problem can be solved in polynomial time when the number of integer variables is fixed. Lenstra's algorithm has found limited practical use because the computational cost of finding the branching hyperplane in his algorithm is very high. However, there is a possibility of finding branching hyperplanes that are of good quality but less expensive to compute. This research will focus on this problem as well as related issues in solving nonlinear mixed integer programs.
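The standard single-variable branching scheme is easiest to see on a toy problem. The sketch below applies it, with an LP-relaxation (fractional) bound, to a 0-1 knapsack; the instance and bounding rule are illustrative choices, not part of the proposed research:

```python
def knapsack_branch_and_bound(values, weights, capacity):
    """Single-variable branching illustrated on a 0-1 knapsack.
    The bound at each node is the greedy LP (fractional) relaxation."""
    n = len(values)
    # order items by value/weight so the greedy fractional bound is valid
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]
    best = 0

    def bound(k, cap, val):
        # greedy fractional relaxation over the still-free items k..n-1
        for i in range(k, n):
            if w[i] <= cap:
                cap -= w[i]
                val += v[i]
            else:
                return val + v[i] * cap / w[i]   # fractional piece
        return val

    def branch(k, cap, val):
        nonlocal best
        if k == n:
            best = max(best, val)
            return
        if bound(k, cap, val) <= best:
            return                                # prune by LP bound
        if w[k] <= cap:                           # branch on x_k = 1
            branch(k + 1, cap - w[k], val + v[k])
        branch(k + 1, cap, val)                   # branch on x_k = 0

    branch(0, capacity, 0)
    return best
```

Branching on a general hyperplane would replace the two subproblems x_k = 0 and x_k = 1 with disjunctions over an integer combination of several variables, which is exactly where the computational cost question discussed above arises.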
Several engineering and management problems lead to models that are nonlinear, possibly stochastic, integer programs. These problem areas include inventory, production and chemical process planning, layout, location, logistics, and financial optimization. For example, nonlinearity arises naturally when modeling uncertainty using the second moment. This proposal requests funds to support our ongoing research on efficient techniques for solving such models. The development of a general purpose solution methodology for nonlinear integer programming problems will have a wide-ranging impact, as it would facilitate solving such problems arising in many different areas.
2002 — 2004
Mckenna, Ann; Norman, Donald; Colgate, J. Edward; Mehrotra, Sanjay; Olson, Gregory (co-PI)
Institute For Design Engineering and Applications: Fostering Creative Synthesis Across the Curriculum @ Northwestern University
PROPOSAL NO.: 0230547 PRINCIPAL INVESTIGATOR: Colgate, J. Edward INSTITUTION NAME: Northwestern University TITLE: Institute for Design Engineering and Applications: Fostering Creative Synthesis Across the Curriculum
Abstract
The McCormick School of Engineering and Applied Science of Northwestern University plans to build a curriculum centered on engineering design. It will engage students in a series of meaningful and increasingly sophisticated design experiences throughout their undergraduate programs.
The design curriculum draws faculty and students together from a variety of engineering disciplines, as well as the fine arts and social sciences. To facilitate this sort of cross-disciplinary effort, and to provide the necessary infrastructure for a large-scale collaborative design program, the school plans to create a new entity, the Institute for Design Engineering and Applications (IDEA).
IDEA will offer a variety of programs and courses in engineering design. These will include a Manufacturing Engineering degree program; large-scale, cross-disciplinary courses in new, more advanced areas; and smaller, more focused offerings, such as courses based on national design competitions. It will also actively pursue a research agenda focused on product and process innovation. All research projects will be tied to the design curriculum.
2005 — 2010
Mehrotra, Sanjay
Methods For Solving Mixed Integer Programs Using Adjoint Lattices @ Northwestern University
This grant provides funding for developing advanced solution methodology for mixed integer programming problems using the adjoint lattice basis approach recently developed by the PI. Mixed integer problems are optimization problems that involve both integer and continuous variables, and the models may have nonlinear constraints in addition to linear constraints. Models involving general integer variables with nonlinear constraints are very hard to solve. These models arise in engineering and management problem areas such as inventory, production and chemical process planning, layout, logistics, and financial optimization. For example, nonlinearity arises naturally when modeling uncertainty using the second moment. There has been some, but limited, progress towards solving large scale models of this type. This proposal on the development of integer programming methodology will improve our ability to solve problems from this wide range of application areas.
The adjoint lattice concept allows these algorithms to be described in the original space, without the problem dimension reductions required in earlier developments of branching-on-hyperplane algorithms. As a result, several steps of these algorithms can be performed in the space of the original variables. The adjoint lattice framework opens up new possibilities for "more intelligent" computation of branching hyperplanes, for example computing them heuristically, and it also allows alternative ways of generating cutting planes and computing feasible integer solutions.
This research will further develop the adjoint lattice methodology. In particular, we will (i) develop methods for using approximate adjoint lattices when generating branching hyperplanes; (ii) develop branch-and-cut algorithms for mixed integer nonlinear programming problems using adjoint lattices; (iii) develop methods for generating feasible solutions in the original space without computing kernel lattices; (iv) study the use of segment lattice basis reduction methods in our context; (v) study more efficient methods for finding analytic or volumetric centers of continuous relaxations; and (vi) develop branch-and-cut algorithms for mixed integer nonlinear programming problems when the constraint functions are not differentiable. This development will allow the use of adjoint-lattice-based methodology for solving large sparse mixed integer programs.
2007 — 2010
Homem-De-Mello, Tito; Mehrotra, Sanjay
Optimization Algorithms For Problems With Stochastic Dominance Constraints @ Northwestern University
This award provides funding for the study of optimization problems where the uncertainty intrinsic to the constraints is modeled using a concept known as stochastic dominance. Two optimization problems that will receive early focus are the uncertain linear and uncertain semidefinite programs under a second-order linear stochastic dominance concept, which constitutes a particular way to model multi-dimensional stochastic orders. Efficient algorithms for such problems will be constructed. In addition, a duality theory that allows explicit construction of dual functions associated with the solution of such problems will be developed. More general stochastic orders and the solvability of corresponding optimization problems will be analyzed. The research will also address the situation where the support of the random entities in the stochastic dominance constraints is not finite or is very large, so that sampling based approaches are required. Finally, a study of stochastic entities with random parameters and their applications may be conducted within the context of stochastic dominance.
If successful, the proposed research will address the fundamental problem of optimizing a system where some components are not known with certainty, which has applications in many areas, including operations research, statistics and finance. The work will help to develop a better understanding of the benefits and drawbacks of using the concept of stochastic dominance --- which has proven to be of capital importance in many areas, ranging from economics to epidemiology --- in an optimization problem. One goal of this research is to develop algorithms for such problems, the availability of which will result in better modeling of parameter uncertainty in stochastic models. The proposed research builds upon two unrelated areas (optimization and stochastic dominance) and it is expected to promote a cross-fertilization of ideas that can potentially lead to further advances in both areas, while allowing for improved modeling abilities of application problems. This combination of different areas will also lead to the development of new graduate courses and the dissemination of ideas through a set of lecture notes on the topic.
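For empirical (equally likely scenario) distributions, second-order stochastic dominance can be checked with the expected-shortfall characterization: X dominates Y if and only if E[(t − X)+] ≤ E[(t − Y)+] for every threshold t, and for finite supports it suffices to check t at the pooled outcome points, since both sides are piecewise linear with kinks only there. A minimal sketch, not tied to the algorithms proposed here:

```python
import numpy as np

def ssd_dominates(x, y):
    """True if the empirical distribution of x dominates that of y in
    second-order stochastic dominance (larger outcomes preferred).
    Checks E[(t - X)+] <= E[(t - Y)+] at all pooled support points."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    ts = np.union1d(x, y)                       # pooled support points
    short_x = np.mean(np.maximum(ts[:, None] - x[None, :], 0.0), axis=1)
    short_y = np.mean(np.maximum(ts[:, None] - y[None, :], 0.0), axis=1)
    return bool(np.all(short_x <= short_y + 1e-12))
```

For example, outcomes {2, 3} dominate {1, 4} in second order (same mean, less risk), while the converse fails; this is the kind of pairwise comparison that a dominance constraint imposes between a decision's outcome and a benchmark.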
2009 — 2014
Darosa, Debra; Rodriguez, Heron; Mehrotra, Sanjay; Daskin, Mark (co-PI)
Multi-Objective Robust Stochastic Planning and Scheduling of Healthcare Service Providers @ Northwestern University
This grant provides funding for developing novel models and methods for planning, scheduling, staffing, and assignment (PSSA) problems that have multiple objectives and are unpredictable because of future uncertainty. Such problems arise frequently in the healthcare industry, which would be the focus application area. Examples of multiple, often competing, objectives are: patient safety and satisfaction, cost of service, continuity of care, educational goals of the residents, and staff satisfaction. Examples of uncertainty are patient workload and service times. Realistic mathematical models will be developed for such problems for healthcare professionals incorporating the unique features of this setting. A particular focus will be on medical resident scheduling. Computational methods will be developed for solving these models. In particular, heuristic methods will be developed for quickly identifying good candidate solutions for such problems. These methods will be tested on realistic data from the healthcare setting.
If successful, the results of this research will enable healthcare providers to improve work flow and to make better use of the skills of their medical staff. Improved training schedules for medical school residents would be possible, while still meeting residency requirements. Patient satisfaction will improve as continuity of care is enhanced. The computational methods developed for solving problems arising in healthcare will also be useful for stochastic multi-objective integer programming models arising in a wide variety of application areas, including production planning, supply chain management, and vehicle routing.
2011 — 2016
Friedewald, John; Ladner, Daniela; Mehrotra, Sanjay
Addressing Geographical Disparities in Transplant Organ Accessibility Across United States @ Northwestern University
The objective of this research is to develop advanced mathematical models with sufficient granularity to guide United Network for Organ Sharing decision makers in crafting an implementable policy for reducing the geographic disparity of kidney allocation over time. Nearly 700,000 patients currently receive kidney replacement therapy, either in the form of hemodialysis or kidney transplantation, consuming $24 billion (6.4 percent) in Medicare expenditures annually. Kidney transplantation, where a patient receives a healthy kidney from either a deceased or living donor, has a significantly better five-year patient survival of 81.2 percent, an improved quality of life, and leads to significant cost savings. However, kidney transplantation is marred by the shortage of available donor kidneys. The US Department of Health and Human Services in its "Final Rule" states, "Organs and tissues ought to be distributed on the basis of objective priority criteria, and not on the basis of accidents of geography." The current national kidney allocation system is plagued by geographic disparities in the waiting time associated with kidney transplantation: the median waiting time during 2000-2009 varied from 0.93 years to 4.14 years depending on a patient's local area of listing. The model will incorporate simulation based input and output in an optimization framework that consistently considers multiple equity objectives within a planning-type framework. Such modeling is necessary to incorporate detailed patient population characteristics, transplant center behavior, the quality of available organs, acceptability of the model generated solution by key stakeholders, and the ease of implementing the suggested policy.
The solution techniques and algorithms developed for this research would be potentially applicable to other distribution problems where equity needs to be achieved. The project will involve and train graduate students and research outcomes will be used to inform the transplant community through presentations at national conferences.
2011 — 2016
Mehrotra, Sanjay
Distribution and Moment-Robust Optimization Models and Algorithms @ Northwestern University
Optimization models and methods are widely used in decision problems. Incorporating parameter uncertainty is important in constructing representative optimization models. This parameter uncertainty may be described by a probability distribution, the moments of a probability distribution, or using bounds and confidence intervals on moments. When estimating uncertain parameters, information is also available on the error distribution of the estimated parameter. The research objective of this award is to study properties and develop algorithmic methods for solving multivariate optimization models that incorporate parameter uncertainty using limited knowledge on their distribution, and the parameter estimation errors. The optimization models will be based on using parameter distribution moment estimates. The models will incorporate moment estimation errors using a suitable penalty approach.
If successful, the results of this research will lead to the development of a new class of decision optimization modeling tools with associated algorithmic techniques for solving these models. This award will contribute to better handling of parameter uncertainty in single and two-stage inventory planning models, the classical least squares model, and models involving objectives described by quadratic functions. A better understanding of the algorithmic implications of parameter uncertainty in these problems will provide a foundation for handling parameter uncertainty in other application models that use the planning, least squares, and quadratic objectives as basic building blocks. The award will also contribute to the development of computational tools implementing the algorithmic solution techniques. Experiments will be performed to validate the algorithms, and to compare the properties of the solutions generated from the models incorporating distributional knowledge with those that ignore this information.
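A classical example of a moment-based robust model of the kind described above is Scarf's min-max newsvendor, which orders against the worst-case demand distribution having a given mean and standard deviation. The closed form below covers the interior-solution case only and is shown purely as an illustration of moment-robust decision rules, not as a result of this award:

```python
import math

def scarf_order_quantity(mu, sigma, cu, co):
    """Scarf's moment-robust newsvendor quantity: the order that is optimal
    against the worst-case demand distribution with mean mu and standard
    deviation sigma (interior-solution case; cu = unit underage cost,
    co = unit overage cost)."""
    r = cu / co
    return mu + 0.5 * sigma * (math.sqrt(r) - 1.0 / math.sqrt(r))
```

With cu = co the rule orders exactly the mean, and the order shifts up or down as the underage/overage cost ratio departs from 1; only the first two moments of demand enter the decision, which is precisely the flavor of moment-based ambiguity the abstract describes.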
2011 — 2015
Mehrotra, Sanjay
Models and Algorithms For Risk Adjusted Optimization With Robust Utilities @ Northwestern University
The research objective of this award is to study concepts of relaxed univariate and multivariate stochastic dominance motivated by the specification of utility functions. A major challenge in risk-adjusted decision making is the assessment of a decision maker's risk/utility function associated with uncertain outcomes. The stochastic dominance concept enables the definition of preferences of one random entity (variable, vector, matrix, process, etc.) over another and therefore can be used to compare the different risks between decision alternatives. The concept of stochastic dominance is related to utility theory, which works with a class of utility functions under only limited prior knowledge of a decision maker's utility function; this unfortunately results in very conservative models. Developing models, and the corresponding algorithms, that can systematically circumvent the challenges of traditional utility theory, especially when the decision is based on conflicting objectives and multiple random outcomes, is thus critically needed.
If successful, the results of this research will lead to the development of a new class of risk-adjusted decision models that incorporate risk in a more realistic way than currently existing techniques. Computationally efficient algorithms will be developed to solve these models.
2013 — 2017
Mehrotra, Sanjay; Iravani, Seyed M. R. (co-PI)
Managing Downstream Patient Flow Processes Using Improved Coordination and Staffing @ Northwestern University
The objective of this award is to develop models and study the downstream patient flow processes from the perspective of in-hospital staffing decisions for the escort staff. Queuing models will be developed using real-life operational data. These models will be used to estimate the properties of the time a patient waits for transfer to an in-hospital unit from the emergency department. The waiting time properties will be used in an optimization model that will determine the staffing level. The optimization model will incorporate quality of service constraints. Solution techniques for solving this optimization model will be developed.
If successful, the results of this research will lead to improvements in modeling and solution techniques for making shift staffing decisions in the emergency department patient transfer process. The primary goal of this work is to determine the right number of escort staff at the emergency department during a shift. Determining the right number of escorts will reduce patient admission delays resulting from the unavailability of an escort when one is required for patient transport. Reducing patient admission delays will free up capacity in the emergency department, which will improve patient safety in emergency departments that run at near capacity. This work will also help reduce the overall patient care cost through more effective utilization of resources, and contribute to developing new modeling methods and computational tools for staffing problems.
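A standard textbook starting point for shift-staffing calculations of this kind is the M/M/c (Erlang-C) queue: given an arrival rate and a per-escort service rate, find the smallest staff count whose mean queueing delay meets a target. This sketch assumes exponential arrivals and service; it is an illustration of the staffing question, not the queueing model developed in the project:

```python
import math

def erlang_c(offered_load, c):
    """P(an arrival must wait) in an M/M/c queue, offered load a = lambda/mu."""
    a = offered_load
    if c <= a:
        return 1.0                       # unstable system: everyone waits
    rho = a / c
    top = a**c / math.factorial(c) / (1 - rho)
    bottom = sum(a**k / math.factorial(k) for k in range(c)) + top
    return top / bottom

def min_staff(lam, mu, max_wait):
    """Smallest server count c with mean queueing wait Wq <= max_wait,
    where Wq = ErlangC / (c*mu - lambda)."""
    a = lam / mu
    c = max(1, math.ceil(a))
    while True:
        if c > a:
            wq = erlang_c(a, c) / (c * mu - lam)
            if wq <= max_wait:
                return c
        c += 1
```

For example, with 8 transfer requests per hour and one escort completing one transfer per hour, `min_staff(8, 1, 0.05)` returns the smallest shift size keeping the average wait under 3 minutes.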
2014 — 2017
Apley, Daniel; Mehrotra, Sanjay
A Methodology For Reliable Risk Assessment With Error-Prone Electronic Medical Records Using Optimal Design of Experiments Concepts @ Northwestern University At Chicago
Enormous healthcare resources are devoted to compiling electronic medical record (EMR) databases that are increasingly integrated and rich in patient population, and that offer potential for identifying disease risk factors via statistical analyses that predict disease risk as a function of various factors (e.g., clinical and demographic) for each patient. Unfortunately, the disease event data may have high miscoding error rates, because clerical personnel with limited training are employed to enter the codes. For example, in one EMR database of patients with cardiac workup, a review of a random sample of cases recorded as sudden cardiac arrest events found an error rate of 75 percent. To take such errors into account and avoid developing unreliable risk assessment models, it is imperative that a doctor perform chart reviews to validate a sample of cases and determine whether the events were true events. However, the number of chart reviews is limited by the high cost of doctors' time. The objective of this research is to develop a methodology for judiciously and efficiently selecting validation cases for maximum information content, which will allow reliable disease risk assessment even with highly error-prone EMR data. The anticipated benefits to the health and well-being of society are substantial, as this research will allow the enormous untapped potential of large EMR databases to be more fully utilized for discovering new disease risk factors. It is also anticipated that this research can be extended to other big-data application domains for extracting reliable information from large quantities of data of questionable quality.
Large electronic medical record (EMR) databases offer potential for developing clinical hypotheses and identifying disease risk associations by fitting statistical models that predict the likelihood that a patient develops a particular condition as a function of various predictor variables (e.g., clinical, phenotypical, and demographic data) for that patient. Although the predictor variable data are often recorded reliably, the event data may have high error rates due to ICD-9 disease miscoding. To avoid developing unreliable risk assessment models, previous research used random validation sampling to estimate error probabilities for correcting biases in logistic regression models fit to the entire data, which is both inefficient and unreliable with high error rates. In contrast, this research will develop a validation sampling and reliable risk assessment (VSRRA) methodology for judiciously designing a validation sample. The intellectual underpinning is the observed analogy between VSRRA and traditional design of experiments (DOE), whereby validating the response for one error-prone case in VSRRA corresponds to conducting one experimental run in DOE. In light of this analogy, this research will develop (i) suitable VSRRA design criteria based on the Fisher information matrix for the model parameters and Bayesian counterparts such as posterior and preposterior parameter covariance matrices, applicable to a broad class of generalized linear models commonly used in medical risk studies; (ii) heuristic and more exact hybrid algorithms for selecting the validation sample to optimize the design criteria; (iii) multistage, sequential versions of the VSRRA sampling strategies that refine the designs based on information that is learned along the way, as new cases are validated; and (iv) methods that determine whether and how the full set of unvalidated data can be reliably included, along with the validated data, in the final model fitting. 
A fundamental tenet of data analysis is that carefully designed experimental studies produce far more reliable statistical conclusions than observational studies. Likewise, it is anticipated that the DOE-based VSRRA methodology will allow far more reliable disease risk assessment and hypotheses generation.
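The DOE analogy can be illustrated with a greedy D-optimal selection heuristic: validate the error-prone cases that most increase the determinant of the logistic-regression Fisher information. The function below is a generic sketch; the weights, ridge term, and greedy exchange rule are illustrative assumptions, not the VSRRA criteria developed in this research:

```python
import numpy as np

def greedy_d_optimal(X, probs, k, ridge=1e-6):
    """Greedily pick k cases to validate, maximizing det of the
    logistic-regression Fisher information sum_i w_i x_i x_i'
    with w_i = p_i (1 - p_i).  Sketch of the D-optimality idea only."""
    n, d = X.shape
    w = probs * (1 - probs)
    info = ridge * np.eye(d)          # ridge keeps early determinants nonzero
    chosen = []
    for _ in range(k):
        best_i, best_det = None, -np.inf
        for i in range(n):
            if i in chosen:
                continue
            cand = info + w[i] * np.outer(X[i], X[i])
            det = np.linalg.det(cand)
            if det > best_det:
                best_i, best_det = i, det
        chosen.append(best_i)
        info = info + w[best_i] * np.outer(X[best_i], X[best_i])
    return chosen, info
```

Note how the weights w_i = p_i(1 − p_i) favor cases whose predicted probabilities are near 0.5: exactly the cases about which the fitted model is least certain, mirroring the intuition that a validation "run" is most informative where the model is ambivalent.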
2014 — 2015
Nohadani, Omid; Mehrotra, Sanjay
Health Systems Optimization Workshop; Chicago, Illinois; 12-13 September 2014 @ Northwestern University At Chicago
The goal of this workshop is to bring together experts from the fields of medicine and operations research in order to identify and address important issues pertinent to improving the cost, quality, efficiency, and efficacy of health systems. Events where experts in both fields can interact are scarce, and this workshop is unique in its expected ability to initiate and foster research collaborations among leading researchers in the two fields that can, through the synergy of their respective expertise, better address the next generation of health systems challenges.
A large number of problems in medicine and health systems can benefit from insights and capabilities developed in the field of mathematical modeling and optimization. At the same time, the difficulty of many of these problems is expected to require new methodological contributions in the field of operations research: accommodating data and model uncertainties, the large-scale nature of real-world problems, the dynamics of changing human anatomy, the interplay of systems across scales, and competing policies and criteria. Therefore, this workshop has the potential to advance knowledge in both fields. In addition to supporting key participants from both fields, this award will also allow a large number of graduate students, postdoctoral fellows, and medical interns to take part in the workshop.
2014 — 2017
Nishi, Michael; Chan, Jennifer; Chiampas, George; Mehrotra, Sanjay; Smilowitz, Karen
Goali: Improving Medical Preparedness, Public Safety and Security At Mass Events @ Northwestern University At Chicago
The objective of this Grant Opportunity for Academic Liaison with Industry (GOALI) award is to improve medical preparedness, public safety, and security at mass gathering events through the use of optimization methodologies. The nation hosts over 500 mass gathering events, such as marathons, each year. Such events are subject to medical emergencies for the participants and to other security-related incidents. This GOALI project brings together engineering and medical faculty at Northwestern University and the organizers of the Bank of America Chicago Marathon to study approaches for the mitigation of hazards and risks through course design. In the case of a marathon, course design decisions concern the route to be followed and the locations of aid stations, medical tents, and volunteers on the course. Multi-objective models and solution approaches will be developed for course design, coupled with data analytics and field observations, to identify a safe and medically accessible course. The research plan is based on active integration across the areas of modeling, algorithms, data analysis, and decision maker engagement.
If successful, the results of this research will lead to advances in medical preparedness and response for a variety of mass gathering events. This project provides a test bed to advance the science and practice of mass event planning and preparedness, through repeated field observations at the Chicago Marathon, conducted by faculty, students, and practitioners. Given connections across city of Chicago and within the marathon community, the research team will host seminars to disseminate best practices from this research to other mass events in the Chicago region and worldwide. This GOALI project represents a unique application of operations research that will expose students to a new type of planning problems. The operations research modeling will lead to new developments in multi-objective arc routing models, by introducing new classes of arc routing problems and creating solution methods for these problems. Further, this work will lead to new approaches to multi-objective optimization based on the iterative generation of promising solutions with input from decision makers.
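The multi-objective course design problem described above can be illustrated in miniature: trade off the number of aid stations against the worst-case distance from any point on the course to the nearest station. Everything below (a 26-mile course, the candidate sites, the two objectives) is a hypothetical sketch, not the project's actual model:

```python
from itertools import combinations

# Hypothetical mile markers of a course and candidate aid-station sites.
COURSE_MILES = list(range(27))               # mile markers 0..26
CANDIDATE_SITES = [2, 5, 8, 11, 14, 17, 20, 23, 25]

def worst_gap(sites):
    """Largest distance from any mile marker to its nearest aid station
    (a simple proxy for medical accessibility)."""
    return max(min(abs(m - s) for s in sites) for m in COURSE_MILES)

def pareto_front(k_max=4):
    """Enumerate placements of up to k_max stations and keep the
    non-dominated (station count, worst gap) trade-offs."""
    points = []
    for k in range(1, k_max + 1):
        best = min(worst_gap(c) for c in combinations(CANDIDATE_SITES, k))
        points.append((k, best))
    # Adding stations never worsens the gap, so keep only strict improvements.
    front = []
    for k, gap in points:
        if not front or gap < front[-1][1]:
            front.append((k, gap))
    return front
```

A planner would present such a front to decision makers and let them pick the acceptable cost/accessibility trade-off, rather than committing to fixed objective weights in advance.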
|
1 |
2014 — 2017 |
Mehrotra, Sanjay |
N/AActivity Code Description: No activity code was retrieved: click on the grant title for more information |
Collaborative Research: Analysis and Solution Methods For Function Robust Optimization Models @ Northwestern University
The objective of this award is to study a new class of optimization models in which general shape constraints specify the function form and a maximin criterion is used to resolve the function ambiguity. For many data-driven decision problems, the functions specifying the problem are obtained from the data through model fitting, which is done based on a presumed form of the function. Decisions are subsequently made by optimizing the fitted functions. In this modeling framework, the function set is specified using properties of the function and non-parametric model fitting. Such problems are called function robust optimization problems. Different types of function robust models will be analyzed, and algorithms will be developed for solving them.
If successful, the results of this research will lead to the development of a new class of optimization modeling techniques and algorithms for solving such models. The solutions obtained from such models are expected to be more robust and efficient under data uncertainty than those obtained from classical approaches. A general methodological framework that allows ambiguity in the function form will present a significant conceptual advancement in optimization-based decision making. Applications of such problems span management, intelligent control, and engineering design. Experiments will be performed to validate the algorithms and to compare the properties of the solutions generated by the new modeling technique.
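As a toy illustration of the maximin idea described above, one can form a finite set of functions that are all consistent with the same data under a shape constraint (here, convex quadratics with different curvatures) and then minimize the worst case over that set. The data, the quadratic family, and the grid search are all invented for this sketch; the project's models are far more general:

```python
import numpy as np

# Hypothetical cost observations at a few decision points; the true cost
# function is unknown and only assumed convex.
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
costs = np.array([4.1, 2.0, 1.2, 1.9, 4.3])

def fit_quadratic(curvature):
    """Least-squares fit of a*x^2 + b*x + c with the leading coefficient
    fixed to a given positive curvature (a crude shape constraint)."""
    resid = costs - curvature * xs**2
    A = np.vstack([xs, np.ones_like(xs)]).T
    b, c = np.linalg.lstsq(A, resid, rcond=None)[0]
    return lambda x: curvature * x**2 + b * x + c

# Ambiguity set: three plausible fits with different assumed curvatures.
family = [fit_quadratic(a) for a in (0.4, 0.6, 0.8)]

# Function robust decision: minimize the worst case over the family.
grid = np.linspace(0.0, 4.0, 401)
worst = np.max([[f(x) for x in grid] for f in family], axis=0)
x_robust = float(grid[np.argmin(worst)])
```

The decision `x_robust` hedges against not knowing which fitted function is correct, which is the maximin criterion the abstract describes (in a drastically simplified, discretized form).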
|
1 |
2015 — 2016 |
Mehrotra, Sanjay |
U01Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies. |
Unassisted Blood Pressure Monitoring Using Arterial Tonometry and Photoplethysmography @ Northwestern University At Chicago
DESCRIPTION (provided by applicant): We propose a tonometry-based solution to the problem of inexpensive, unencumbering, and non-invasive measurement of blood pressure. The proposed solution will be useful in clinical as well as ambulatory blood pressure measurement. Specifically, we propose to develop a large (1cm x 5cm) two-dimensional array of pressure sensors with 0.2mm spacing (24x128 pressure-sensing elements), together with robust signal processing algorithms and a feedback-controlled band-tightening mechanism, in order to measure blood pressure at the wrist. The pressure sensor array, together with a photoplethysmogram (PPG) sensor, will be embedded in a band and will provide signals through a purpose-designed multiplexed circuit. Signal conditioning techniques will be used to bring the large amount of data into a form that can be efficiently processed on a microcontroller. The proposed two-dimensional array of pressure sensors will be built using flexible plastic and micro-fabrication techniques. The increased size of the sensor array will ensure proper contact and pressure application over the arteries in the wrist for tonometric measurement of BP. The signals generated by the sensor array will be processed through advanced signal processing and optimization techniques to handle the large volume of data and to mitigate noise. The PPG signals at the wrist will be used to improve the efficiency and accuracy of the system. The sensor array system will be embodied in the form of a band, which will be integrated into a wearable device such as a smartwatch. The capabilities of the smartwatch will be used for data processing and analytics.
The specific aims are to: a) design, develop, and test a two-dimensional pressure sensor array on a polyimide flexible substrate; b) process the large amount of analog data from the sensor array using signal conditioning and processing techniques to generate blood pressure estimates; and c) design a user-friendly prototype form factor and perform a clinical trial to validate device performance. The engineering members of our multidisciplinary team have specialized in cardiac instrumentation, signal and image processing for medical applications, pressure and touch sensors, signal processing, and robust optimization. The team members with clinical expertise have been engaged in blood pressure trials in the US and have been working with leading institutions in India engaged in cardiovascular research. This proposal brings together their unique expertise and experience to address this challenging problem through a joint effort.
|
1 |
2016 — 2017 |
Mehrotra, Sanjay |
R21Activity Code Description: To encourage the development of new research activities in categorical program areas. (Support generally is restricted in level of support and in time.) |
Promoting Utilization of Kidneys by Improving Patient Level Decision Making @ Northwestern University
DESCRIPTION (provided by applicant): Promoting Utilization of Kidneys by Improving Organ Acceptance Decision Making. Only 15,652 of the nearly 109,000 patients waitlisted for a kidney transplant (KT) received one in 2014. More than six thousand died while waiting for a KT, and more than two thousand became too sick to be transplanted. Patient survival, quality of life, and morbidity are significantly worse for those who remain on dialysis. The majority of transplanted kidneys are recovered from deceased donors. In 2014, while 12,664 kidneys were recovered for KT, 2,272 were discarded, most frequently for being of marginal or low quality. Many of the discarded kidneys are believed to confer smaller survival gains than other procured kidneys. Previous studies show that marginal kidneys offer survival benefits relative to remaining on dialysis, but neglect to quantify these benefits for an individual patient. These studies ignore two major issues: (i) the low c-statistics of one- and three-year post-transplant patient and graft survival models; and (ii) the dynamics of organ offers and the current point-based priority system for waitlisted patients. Given these facts, from a decision maker's perspective, accepting or rejecting a low-quality organ is not necessarily a well-informed choice. The proposed research takes a comprehensive approach to developing a survival model incorporating time on dialysis, recipient characteristics, transplant center characteristics, and donated kidney characteristics. It proposes to apply techniques from biostatistics and machine learning to improve model accuracy. The research further aims to quantify the dynamics of kidney availability across organ quality levels and to model future kidney offers as a stochastic decision tree. The proposed computational engine for estimating the survival benefits of accepting or rejecting an offer will be transformative in clinical decision making.
Moreover, the successful completion of the proposed research will provide a highly reliable approach to support kidney acceptance/rejection decisions. The research will be carried out by a trans-disciplinary team with expert knowledge of the transplant allocation system, decision methodologies, and clinical decision making. The research team will incorporate input from an external advisory board consisting of leaders and key stakeholders in kidney and pancreas transplant programs.
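The stochastic decision tree mentioned above can be sketched in miniature: with a finite number of remaining offers, backward induction gives the expected value of continuing to wait, and hence a quality threshold for accepting the current offer. All quantities here (quality levels, offer probabilities, survival values) are illustrative assumptions, not clinical estimates:

```python
# Toy stochastic decision tree for the accept/reject choice.
OFFER_QUALITIES = [0.2, 0.5, 0.8]   # hypothetical kidney quality levels
OFFER_PROBS     = [0.5, 0.3, 0.2]   # chance each quality is offered
SURVIVE_WAIT    = 0.9               # chance of surviving to the next offer
DIALYSIS_VALUE  = 1.0               # assumed expected QALYs if never transplanted

def transplant_value(quality):
    """Assumed post-transplant expected QALYs as a function of quality."""
    return 3.0 + 7.0 * quality

def wait_value(offers_left):
    """Backward induction: expected value with a given number of offers
    remaining, choosing optimally at each offer."""
    if offers_left == 0:
        return DIALYSIS_VALUE
    future = SURVIVE_WAIT * wait_value(offers_left - 1)
    return sum(p * max(transplant_value(q), future)
               for q, p in zip(OFFER_QUALITIES, OFFER_PROBS))

def accept_threshold(offers_left):
    """Lowest offered quality worth accepting now, or None."""
    future = SURVIVE_WAIT * wait_value(offers_left - 1)
    return min((q for q in OFFER_QUALITIES
                if transplant_value(q) >= future), default=None)
```

In this toy model a patient with many offers remaining rationally declines the lowest-quality kidney, while a patient near the end of the horizon accepts it, which is exactly the individualized trade-off the abstract argues current point systems fail to capture.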
|
1 |
2017 — 2018 |
Mehrotra, Sanjay |
N/AActivity Code Description: No activity code was retrieved: click on the grant title for more information |
I-Corps: Clinical Workforce Schedule Optimization Technology @ Northwestern University
The broader impact/commercial potential of this I-Corps project is that a novel clinical schedule optimization technology, developed in a prior NSF-funded research project, can be used to reduce staffing costs in health systems. The technology will also help improve patient safety. Results from prior analysis suggest that significant joint operational and safety cost savings are possible using the system. Since staffing costs, specifically clinical workforce costs, constitute a significant portion of healthcare costs, successful implementation of the schedule optimization technology could have a large impact on overall healthcare costs. Additional safety benefits offer potential further savings.
This I-Corps project will help explore the processes used in making staffing and scheduling decisions for clinical staff, and how the developed clinical schedule optimization technology can be leveraged to improve these decisions in real operational settings. The findings from this exploration will help the investigator team understand the market readiness of this technology and the viability of potential products and services for improving hospital systems. This will inform the technology's further development by determining the target market and building an understanding of current staffing products and solutions.
|
1 |
2017 — 2018 |
Mehrotra, Sanjay |
N/AActivity Code Description: No activity code was retrieved: click on the grant title for more information |
Rapid: Addressing Geographic Disparities in the National Organ Transplant Network @ Northwestern University At Chicago
Liver transplantation is currently the only known medical treatment for End-stage Liver Disease with established survival benefits. In the United States, current allocation policies continue to result in geographical disparities for donor organs on several metrics. Disparities include the fraction of waitlisted patients dying on the waitlist, and the sickness level at which patients are transplanted in different areas of the country. This Rapid Response Research (RAPID) award supports refinements and enhancements to a novel neighborhood-based organ allocation approach currently being considered for adoption by policy makers on the United Network for Organ Sharing (UNOS) Liver and Intestine Committee. The refinements specifically address reducing geographic disparities by better matching donor organs with transplant beneficiaries. This award will provide decision support to UNOS and the broader transplantation community in the U.S.
This RAPID award will support activities to scale up the previously developed optimization model using national datasets. Because the model involves highly complex multi-objective optimization, refining the existing model is challenging: identifying acceptable solutions requires lengthy simulations and the manipulation of several parameters and structural linkages between organ procurement organizations. The simulation tools will be publicly available, and the data collected from an improved model is expected to have an impact on a broad population segment. Researchers in the operations research, management science, transplant surgery, and hepatology communities will be reached through presentations at national conferences. A graduate student will be trained as part of the research, and clinicians will be educated about the role of scientific modeling when considering public health challenges.
|
1 |
2019 — 2020 |
Mehrotra, Sanjay |
R01Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies. |
Rescuing Kidneys At Risk of Discard @ Northwestern University At Chicago
Project Summary. Kidney transplantation (KT) is a superior alternative to dialysis for many patients with respect to longevity, quality of life, morbidity, and the cost of End-Stage Renal Disease (ESRD) care. However, of the 95,000 patients waitlisted for a KT in 2017, fewer than 20,000 received one. In the same year, about 4,000 waitlisted patients died, and another 4,700 were removed from the waitlist because they were considered too sick to transplant. Despite the long waitlist, more than 3,500 (19%) of the deceased donor kidneys were discarded. "No Recipient Found" was the stated reason for discarding 1,325 of the donated kidneys. The discard rate increases significantly for low-quality kidneys, reaching as high as 60% for kidneys in the lowest quality decile. This occurs even though many discarded kidneys, even those in the lowest quality decile, would have conferred significant survival benefits on some recipients relative to remaining on dialysis. The current kidney allocation system does not efficiently allocate kidneys at high risk of discard: the patient prioritization algorithm does not quickly identify the patients who would benefit most from such kidneys, and the donated kidneys are wasted. Improvements to the kidney allocation system are needed to reduce the number of discarded kidneys while allocating them equitably to the patients who would most benefit. However, revising allocation policy is challenging because patients and transplant professionals must weigh the risks and benefits of accepting a low-quality kidney against the risks of waiting for a much higher-quality kidney. Transplant community and patient preferences in accepting low-quality kidneys are unknown. The proposed study aims to develop kidney allocation algorithms that rescue viable deceased donor kidneys from discard.
The developed algorithms will (i) identify kidneys at risk of discard using measures such as the Kidney Donor Profile Index or a novel Kidney Discard Risk Score; and (ii) identify patient populations, such as those with longer waiting times to transplant or low functional status, who will benefit the most from viable kidneys. Preferences of transplant clinicians (surgeons and nephrologists) and patients (on dialysis and transplant recipients) will be elicited using discrete-choice experiments that quantify the efficiency and fairness tradeoffs of allocating kidneys at risk of discard. A discrete event simulation software engine will be developed. The simulation engine will incorporate transplant center and patient-specific kidney acceptance behaviors as well as the developed allocation algorithms, and the benefits of the different allocation algorithms will be evaluated in simulation. Representatives from the United Network for Organ Sharing, the Association of Organ Procurement Organizations, and the American Association of Kidney Patients will serve on the scientific advisory board. The developed allocation algorithms will help save approximately 1,000 lives yearly by reducing kidney discards. The research outcomes will be widely disseminated to the transplant community to foster rapid implementation.
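A stripped-down version of the discrete event simulation described above might look as follows. The arrival rate, the quality-dependent acceptance model, and the match-run length are all invented for illustration; the actual engine incorporates center- and patient-specific acceptance behavior and the candidate allocation algorithms:

```python
import random

random.seed(0)

def simulate(days=365, arrival_rate=3.0, max_offers=10):
    """Donor kidneys arrive as a Poisson process; each is offered down a
    match run and discarded if every candidate declines."""
    t = 0.0
    transplanted = discarded = 0
    while True:
        t += random.expovariate(arrival_rate)   # time to next donor arrival
        if t >= days:
            break
        quality = random.random()               # kidney quality in [0, 1)
        # Each of the first max_offers candidates accepts with a
        # probability that grows with quality (a toy behavioral model).
        if any(random.random() < quality for _ in range(max_offers)):
            transplanted += 1
        else:
            discarded += 1
    return transplanted, discarded
```

Even this toy model reproduces the qualitative pattern in the abstract: almost all discards come from the low-quality end of the distribution, so allocation rules that route such kidneys quickly to willing candidates are where the recoverable transplants are.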
|
1 |
2021 |
Ladner, Daniela P Mehrotra, Sanjay |
R01Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies. |
Livopt -- Liver Cirrhosis - Optimizing Prediction of Patient Outcomes @ Northwestern University At Chicago
PROJECT ABSTRACT Cirrhosis is a leading cause of mortality in the United States (US), diagnosed in millions of people and resulting in over 40,000 deaths each year, predominantly in older people. These patients are at higher risk of serious complications (e.g., infection, ascites, hepatic encephalopathy), hospitalization, and death. Little is known about the epidemiology and disease progression, and older patients are more vulnerable and may have different risk profiles. The clinical challenge is to predict, accurately and early, who is at highest risk of requiring hospitalization and/or dying, and to understand how this might differ between older and younger adults. One barrier to date has been the lack of an epidemiologically representative patient sample that captures those with cirrhosis and goes beyond one health system, as many data repositories are skewed or limited: CMS captures only the elderly and those with debility, NIS does not allow longitudinal observation, and UNOS represents less than one percent of those with cirrhosis. Hence, we will build on our previous work to use a unique dataset, the Chicago Area Patient-Centered Outcomes Research Network (CAPriCORN). CAPriCORN captures the electronic health records (EHR) of nine health systems from 2011 to the present (including academic, county, VA, and private centers), covering most of the diverse patient population of the greater Chicago metropolitan area, namely ~160,000 patients with liver cirrhosis. The dataset is extensive, allowing the use of traditional (e.g., regression) and novel analytical techniques (e.g., deep learning) to pursue the aims of the study. We aim to model cirrhosis progression in the elderly and to predict the risk of hospitalization and death in older patients with cirrhosis. We also aim to perform sub-group analyses focusing on race and socioeconomic factors in the elderly.
Accurate and early prediction of mortality and hospitalization risk in elderly patients with cirrhosis will allow targeted interventions for the most vulnerable patients to improve outcomes.
|
1 |
2022 — 2025 |
Mehrotra, Sanjay |
N/AActivity Code Description: No activity code was retrieved: click on the grant title for more information |
Collaborative Research: Amps: Robust Failure Probability Minimization For Grid Operational Planning With Non-Gaussian Uncertainties @ Northwestern University
The electric power industry accounted for the second-largest portion of all carbon emissions across economic sectors in 2020. Renewable energy resources, particularly wind and solar, are critical to decarbonizing the grid and ensuring the nation's future prosperity and welfare. However, because of their inherent and unavoidable intermittency and variability, successful integration of renewable energy resources in the nation's energy mix poses fundamental challenges for day-to-day grid operations. Failure to account for this uncertainty during planning can result in loss of service and grid de-stabilization, thus jeopardizing not only the achievement of decarbonization targets but also system reliability. This project develops the next generation of mathematical methods, computer models, and algorithms for grid operational planning, which accurately and systematically take into account the non-normal and multi-modal nature of renewable uncertainty, as well as the nonlinear and often counter-intuitive physical laws that govern electric power networks. The project's methods and computer implementations shall benefit and inform diverse planning tools, both within the electric power sector as well as the broader energy sector, including those of private companies and vendors who specialize in power systems software. The project further impacts education and the broader society by training undergraduate and graduate STEM students in energy systems optimization and the foundations of electric power grid operations, thereby enabling them to apply their analytical skills to design more environmentally- and economically-efficient future energy systems.
The project contributes a general methodology, including new mathematical models, theory, and algorithms, to systematically account for non-Gaussian error distributions of renewable energy forecasts, in one of the most fundamental power system planning problems called AC Optimal Power Flow.
A general treatment of non-Gaussian errors in electric load and renewable energy forecasts has not been considered before in grid planning, despite being exhibited in data. The project rigorously integrates risk and uncertainty in this context by developing a novel methodology for optimization under non-Gaussian probabilistic constraints. This is achieved by exploiting the representability and analyticity of Gaussian mixture models and by designing algorithms that are modular enough to allow current methods which are proven to work well for Gaussian errors to be reusable with only minor modifications. The generality of the approach is expected to spur new algorithms in the broader field of chance-constrained optimization, including nonlinear nonconvex problems whose constraints are affected by Gaussian mixture uncertainties. The project also rigorously accounts for misspecification of the mixture model parameters by designing novel non-Gaussian ambiguity sets, which have not been studied before but have the potential to enable the discovery of robust network operating points with improved out-of-sample performance and reliability. The project uses real utility data to guide model validation and experimentation and also provides a set of practical recommendations for system operators to facilitate the adoption of the developed methods.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
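The analyticity of Gaussian mixtures mentioned above can be sketched concretely: under a mixture forecast-error model, the violation probability of an operating limit is an exact weighted sum of component tail probabilities, so the tightest limit meeting a risk target can be found by bisection. The two-regime mixture and the 5% risk level below are illustrative assumptions, not the project's models:

```python
import math

# Hypothetical forecast-error model: (weight, mean, std) per component.
MIXTURE = [
    (0.7, 0.0, 5.0),     # calm regime
    (0.3, 20.0, 15.0),   # high-wind regime (multi-modal, non-Gaussian overall)
]

def cdf(x, mu, sigma):
    """Standard normal CDF, shifted and scaled."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def violation_prob(limit):
    """P(flow > limit): exact under the mixture, via component CDFs."""
    return sum(w * (1.0 - cdf(limit, mu, s)) for w, mu, s in MIXTURE)

def smallest_safe_limit(epsilon=0.05, lo=0.0, hi=200.0):
    """Bisection for the tightest limit with violation prob <= epsilon
    (valid because violation_prob is decreasing in the limit)."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if violation_prob(mid) <= epsilon:
            hi = mid
        else:
            lo = mid
    return hi
```

A single Gaussian fit to the same mixture would understate the heavy high-wind tail and certify a lower, unsafe limit, which is the practical motivation for treating the mixture exactly rather than collapsing it to one normal distribution.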
|
1 |