
We are testing a new system for linking grants to scientists.
The funding information displayed below comes from the
NIH Research Portfolio Online Reporting Tools and the
NSF Award Database.
The grant data on this page is limited to grants awarded in the United States and is therefore incomplete. It can nonetheless be used to understand how funding patterns influence mentorship networks, and vice versa, which has deep implications for how research is done.
You can help! If you notice any inaccuracies, please
sign in and mark grants as correct or incorrect matches.
Sign in to see low-probability grants and to correct any errors in the links between grants and researchers.
High-probability grants
According to our matching algorithm, Joshua D. Greene is the likely recipient of the following grants.
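The site's actual matching algorithm is not documented on this page. Purely as an illustration of how a recipient string might be compared against a researcher's name to yield a score between 0 and 1, here is a minimal sketch using Python's standard-library `difflib`; the function name and example inputs are hypothetical:

```python
from difflib import SequenceMatcher

def name_match_score(grant_recipient: str, researcher: str) -> float:
    """Toy similarity score between a grant's recipient field and a
    researcher's name, based on matching character blocks.
    Illustration only -- not the site's actual matching algorithm."""
    a = grant_recipient.lower().strip()
    b = researcher.lower().strip()
    # ratio() returns 2*M/T, where M is the number of matched characters
    # and T is the total length of both strings; 1.0 means identical.
    return SequenceMatcher(None, a, b).ratio()

# Hypothetical example: comparing a listed recipient to a profile name.
score = name_match_score("Greene, Joshua", "Joshua D. Greene")
```

A production matcher would likely also normalize name order ("Last, First" vs "First Last"), handle initials, and weigh institutional metadata, which this sketch does not attempt.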
Years | Recipients | Code | Title / Keywords | Matching score
2008 — 2012 |
Greene, Joshua |
N/A (activity code description: no activity code was retrieved; click the grant title for more information) |
Cognitive and Affective Neuroscience of Moral Judgment
Consider the following moral dilemma: It's wartime. You and your fellow villagers are hiding from nearby enemy soldiers in a basement. Your baby starts to cry, and you cover your baby's mouth to block the sound. If you remove your hand, your baby will cry loudly, and the soldiers will hear. They will find you, your baby, and the others, and they will kill all of you. If you do not remove your hand, your baby will smother to death. Is it morally acceptable to smother your baby to death in order to save yourself and the other villagers?
Dilemmas such as this one, in addition to being troubling and puzzling, are a window into the psychology and neuroscience of moral decision-making. People often speak of a "moral faculty" or a "moral sense," suggesting that moral judgment is a unified phenomenon, but recent research strongly suggests that this is not so. Rather, it seems that moral judgment depends on a complex interplay between intuitive emotional responses and more deliberative controlled responses that depend on different parts of the brain. When we confront dilemmas like the one above, these neural systems compete for control of behavior.
The goal of this project is to use behavioral testing and cutting-edge neuroscientific techniques to help us understand how different kinds of moral decisions are influenced by emotional responses and controlled cognitive processes. Some experiments aim to illuminate how the brain represents moral costs and benefits. Other experiments are focused on the role of social obligations in moral judgment. Some experiments use hypothetical moral dilemmas such as the one above, while others involve real decisions with real consequences for real people.
Understanding how the human brain makes moral judgments is important from a basic science perspective, but it is also a matter of broader social concern. Our capacity for moral judgment is central to our identities as humans, and understanding this capacity in biological terms may change the way we approach controversial moral issues and may also, in the long run, dramatically change the way we see ourselves and each other.
|
0.915 |
2009 — 2011 |
Greene, Joshua |
N/A (activity code description: no activity code was retrieved; click the grant title for more information) |
Collaborative Research: Genetics of Moral Cognition
Abstract
"This award is funded under the American Recovery and Reinvestment Act of 2009 (Public Law 111-5)."
Moral cognition is the study of moral judgment from the perspective of cognitive science, which views the brain as an information processor. Cognitive genetics is the study of how specific genetic variations affect how the brain processes information. The goal of this project is to begin the integration of these two young and rapidly progressing fields.
Recent research on moral cognition indicates that moral judgment is not the product of a unified "moral sense" or "moral faculty." Rather, moral judgments are produced through a combination of automatic emotional responses and more controlled cognitive processes, where these two types of processes sometimes compete with one another. This kind of competition is elicited by classic moral dilemmas devised by philosophers: Is it morally acceptable to push one person in front of a trolley, killing that person, in order to prevent the trolley from running over and killing five people? People disagree about this case, but research shows that the tendency to say "no" is preferentially supported by automatic responses that depend on emotion-related brain regions such as the amygdala. In contrast, the tendency to say "yes" is preferentially supported by more controlled processes that depend on a part of the brain called the dorsolateral prefrontal cortex. Recent research in cognitive genetics has identified variants of genes that influence neural activity in these brain regions. For example, there is a genetic variant that is known to affect emotional processing in the amygdala by regulating levels of the neurotransmitter serotonin. Likewise, there is a different genetic variant that is known to affect controlled cognitive processing in the dorsolateral prefrontal cortex by regulating levels of the neurotransmitter dopamine. This new research builds on earlier studies to explore whether and how specific genes influence moral judgment by examining moral judgments made by people with different genotypes and by imaging the brains of people with different genotypes while they make moral judgments.
Moral judgment is a matter of widespread concern, with relevance to such practically oriented fields as law, medicine, public health, and politics. More philosophically, our capacity for moral judgment is widely considered an essential feature of our humanity. Thus, a better understanding of moral judgment and its biological basis may inform important moral decisions and may teach us important lessons about ourselves.
|
0.915 |