Group Creativity Requires Knowledge, Leadership
Our creativity as a species stems in part from our ability to use knowledge passed from older generations and to receive guidance from leaders in how to use it in new ways. The shared mission of neurologists within their own groups, departments, and institutions, and within the specialty, is no exception. But our ability to work together and accept the direction of leaders is relatively new in Homo sapiens' roughly 200,000-year-old existence. In that time, it took 195,000 years to invent a wheel, 199,500 years to create a printing press, and 199,900 years to develop an automobile. Given that time frame, how can we account for this unprecedented leap in creativity if there was not enough time for natural selection's incremental physiological, structural, and genetic “improvements”?
Alfred Russel Wallace was a contemporary of Charles Darwin, and both proposed a theory of natural selection as the basis for the evolution of species. However, Wallace felt that the human mind was an exception to this theory, and he posited a more spiritual explanation. Many regarded this scientific “softness” with derision, but his observation that natural selection poorly explains man's unprecedented creative leap may have been more scientifically astute than Darwin's unquestioning acceptance of it. Many anthropologists currently agree with Wallace that incremental improvements alone fail to explain this behavioral leap. They instead explain it by human cultural evolution, which, in a nutshell, is the sharing of information within and across generations. The emergence of language probably made this sharing possible.
Homo sapiens' success in developing a cumulative culture is based on cooperation with both kin and nonkin, and on an exceptional reliance on cultural transmission within and across generations. This is rare or absent in other apes, whose cooperative behaviors are much more closely kin focused. Kinship is an important organizing principle in primate social groups. In macaques, for example, as the genetic relatedness of members decreases within a group, the social instability of the group increases, resulting in more fighting and wounding (PLoS One 2011;6:e16365).
In contrast, primitive human hunter-gatherer societies are 25% genetically unrelated, 50% distantly related, and only 25% closely related. This nonrelatedness fosters intergroup interactions that may lead to the spread of cooperative institutions. When people reside together, they have frequent opportunities to observe innovations and imitate successful traits. The change in ancestral human residential structure, compared with that of our evolutionary ancestors, may therefore have led to greater exposure to more ideas of value and may explain why humans, and no other animals, developed the costly social learning mechanisms that have resulted in cultural evolution (Science 2011;331:1286-9). This increasingly complex social behavior is correlated with brain size, especially in the frontal neocortex.
The wheel and the space shuttle are both products of creativity, but among their many obvious differences is one we can call the “creative unit.” The wheel's creative unit could have been a single person with all the tools needed to generate the first prototype, whereas the space shuttle clearly required many teams of people working together. Coordinating a team requires leadership. Effective leaders maintain high mutual cooperation among their group's members by ensuring that the penalty for noncooperation is fair and outweighs any possible reward for noncooperation. Leaders must enforce social norms, rules, or laws. If mutual cooperation with a social norm is perceived by the membership to drop, then individual defection rates will rise and the previously defined social norm will break down (Trends Cogn. Sci. 2004;8:185-90). Saying something is so will work only as long as it usually is so, and it is the leader's role to maintain that consistency. One caveat is that leaders should also be perceived as tolerant. Few people have perfect track records of cooperation, and occasional minor missteps must be accommodated. In a study that looked at the reaction of leadership to such noncooperative behavior, it was shown that cooperative behavior in a social grouping is enhanced by the perceived mercy of those in charge (Nature 2003;422:137-40). Consistency, fairness, and temperance in holding members accountable all matter in a leader's ability to foster cooperation.
Effective leaders create a culture of identity and mission, and foster belief in the group's competitive superiority so that the group believes it can win. The culture must distinguish the group's creative unit from others (“Myth and Meaning” [New York: Schocken Books, 1979, p. 20]). Within such a unit, teamwork will flourish and space shuttles will fly. Cooperation is enhanced by perceived similarity among a group's members. While this can apply to physical appearance, in a business setting, research lab, or neurology department, similarity is defined more by a sense of shared mission. Just as the role of every member of NASA, from astronomer to janitor, is to put us into space, the mission of everyone in a health care organization, from the doctors to the secretaries, is to heal patients.
Jonathan Haidt in his book, “The Happiness Hypothesis” (New York: Basic Books, 2006), makes the compelling argument, drawing from the school of positive psychology, that virtue enhances happiness. Virtue, in this case, is defined broadly as excellence and involves morality. A leader who can cast the actions of the group as serving a noble cause can increase the group's level of happiness, and in this virtue-inspired happy state the group will be further motivated to work toward the virtuous goal. The shared sense of a virtuous mission creates a shared identity, and the competitive, proud sense that they will excel in achieving that mission.
We in the medical world have little problem believing that we have a virtuous mission. Let us continue to work as a team within our groups, institutions, specialty, and in the broader role we have in society to use our talents creatively and cooperatively so as to continue advancing our mission for neurologic health.
Seeing Morality Through the Lens of Creativity
If a person suffers from a fatal illness for which there is inadequate treatment, what degree of risk would be considered ethically acceptable in a clinical trial? This is a question faced by physicians, scientists, institutional review boards, courts, and of course our patients every day as we seek to advance our therapeutic armamentaria for glioblastoma, amyotrophic lateral sclerosis, and many other illnesses. In the absence of an objective anchor on which to base our decisions, we are driven by our moral sense. Morality is critical for the practice of medicine and for guiding research. Perhaps surprisingly, we can begin to understand the origins of morality within the construct of human creativity that I have discussed in each edition of this column this year.
Motivation
Fairness and the punishment of unfairness activate reward centers in the brain (Science 2004;305:1254-8). Evolutionary psychologists believe this behavior is an instinct that supports the survival of social groupings (“Evolutionary Psychology: A Primer,” 1997 [www.psych.ucsb.edu/research/cep/primer.html]).
Perception
In any social grouping, we occasionally perceive situations involving inequality or unfairness, especially within our social niche. If I am at a restaurant with my neighbor, and we both order the same chocolate ice cream dessert, I will feel unfairly treated if he gets twice as much as I do, or if his comes with sprinkles and a cherry and mine does not. I perceive this inequality, but I envisioned (expected) equality. The difference between what I envisioned and what I perceive provides the motivational voltage that leads me to act.
Action
I now formulate a plan. I will call the waiter over, point out this obvious difference, and ask that I receive a serving equal to my neighbor's. Enactment of this plan requires me to account for and conform to the context. Though internally my motivation is clear, externally, I am among polite company in a public place and the injustice is of rather small import, even if it is unfair in principle. So, in acting out the formulated plan, I do not yell obscenities or threaten the waiter's life, but rather tactfully wave him over when I catch his eye and then politely point out the difference.
Temperament
Although this is not a long-term issue, the restaurant is busy, and it is more than a few minutes before I am able to signal my waiter, so I am waiting. In waiting, my impatience palpably grows, fueled by the indignation of the unfairness in front of me and the gradual melting of my ice cream, which I'd rather be enjoying instead of waiting to replace. But I must be patient for my plan to be effective. I miss the waiter once or twice, so I must persevere. It would be a mistake to lose my temper and yell or jump up impatiently, as it would only make me look bad.
Social Context
Infused throughout this situation is the social context that governs what is right and wrong on both sides. No one in the restaurant would likely say that two patrons ordering the same bowl of ice cream should receive such different servings. Similarly, in this restaurant it is expected that we behave in a well-mannered fashion, and to do otherwise would be a violation of social mores that would itself exceed the injustice of the unequal ice cream. This may be too trivial an example to merit the term “morality,” but it does at least merit the term “social conduct.” Note how different our social mores are in a car accident or a tsunami. How we behave depends on the situation, and the situation in turn shapes how we are expected to behave.
At any given point in history, there are social mores associated with various situations (Biol. Philos. 2010;25:361–78). Some of these practices are today considered wrong, yet in their time they were part of the social landscape and had to be navigated just as I had to navigate our hypothetical restaurant scenario. There was a right and wrong way to treat slaves and a right and wrong way to mete out medieval torture, and it was the social context that determined this “cultural morality.” Applying the cultural expectations of social behavior in 21st-century urban America across time to the Middle Ages, or across space to an isolated tribe in a tropical rain forest, is a mistake that missionaries have made, sometimes resulting in death.
Philosophers may debate whether there are universal moral truths about right and wrong and whether science can inform us about them, but, as with aesthetics, any moral conclusion is eventually applied within a social context, and it is society's behavior that operationally defines whether the moral creation is acceptable.
As we judge the moral failings of our predecessors and those of other cultures, so too will we be judged by our descendants and those of future dominant cultures. Accepting that may help us avoid the atrocities that can arise from any unidirectional belief in the absolute correctness of an existing position. Morality and its misapplication underscore the importance of understanding the model of human creative thought and the creative origin of morality so that we avoid tyranny by a would-be dictator, regardless of whether he or she is a king, clergyman, or scientist. Note that it is not politics, religion, or science, per se, that necessitates tyranny. It is the individual person using the mantle of politics, religion, or science to justify what may be his own inner turmoil, or, as others have explained (Psychol. Rev. 2001;108:814–34; Neuron 2004;44:389–400), to rationalize his emotional impulse toward a self-serving goal.
Society has created an extensive system of judgment that defines the limits for what can be tolerated within whatever bounds it may consider moral behavior. This is our legal system. Our legal system sets out the rules of social behavior and the punishments for violations. But even our legal system evolves with the times and differs across countries, each with its own national culture (and set of subcultures). If society perceives what exists (ban on gay marriage) and envisions what it believes to be something better (legalizing gay marriage), then an action plan will be formulated and enacted in an attempt to overturn the law.
Temperament is crucial. If Martin Luther King Jr. and the civil rights movement had retreated after their first encounter with police resistance and illegal violence against them, they might not have succeeded. But once the prevailing paradigm starts coughing up blood, minds start to change, society's mores evolve, and the paradigm eventually shifts. Perhaps one day in the future when our personal genomes become as standard a part of our medical record as our date of birth, we will not look upon genetic disclosure to research participants as such a great risk, but rather take the opposite approach of ensuring full genetic disclosure regardless of uncertainties or implications. What is moral today in America differs from 200 years ago even though our biology has not changed in that time, nor have the philosophical anchors of western civilization. What changes is the attitude of the people who live here and now, and that is what defines morality here and now.
If a person suffers from a fatal illness for which there is inadequate treatment, what degree of risk would be considered ethically acceptable in a clinical trial? This is a question faced by physicians, scientists, institutional review boards, courts, and of course our patients every day as we seek to advance our therapeutic armamentaria for glioblastoma, amyotrophic lateral sclerosis, and many other illnesses. In the absence of an objective anchor on which to base our decisions, we are driven by our moral sense. Morality is critical for the practice of medicine and for guiding research. Perhaps surprisingly, we can begin to understand the origins of morality within the construct of human creativity that I have discussed in each edition of this column this year.
Motivation
Fairness and the punishment of unfairness activate reward centers in the brain (Science 2004;305:1254-8). Evolutionary psychologists believe this behavior is an instinct that supports the survival of social groupings (“Evolutionary Psychology: A Primer” 1997 [
www.psych.ucsb.edu/research/cep/primer.html
Perception
In any social grouping, we occasionally perceive situations involving inequality or unfairness, especially within our social niche. If I am at a restaurant with my neighbor, and we both order the same chocolate ice cream dessert, I will feel unfairly treated if he gets twice as much as I do, or if his comes with sprinkles and a cherry and mine does not. I perceive this inequality, but I envisioned (expected) equality. The difference between what I envisioned and what I perceive provides the motivational voltage that leads me to act.
Action
I now formulate a plan. I will call the waiter over, point out this obvious difference, and ask that I receive a serving equal to my neighbor's. Enactment of this plan requires me to account for and conform to the context. Though internally my motivation is clear, externally, I am among polite company in a public place and the injustice is of rather small import, even if it is unfair in principle. So, in acting out the formulated plan, I do not yell obscenities or threaten the waiter's life, but rather tactfully wave him over when I catch his eye and then politely point out the difference.
Temperament
Although this is not a long-term issue, the restaurant is busy, and it is more than a few minutes before I am able to signal my waiter, so I am waiting. In waiting, my impatience palpably grows, fueled by the indignation of the unfairness in front of me and the gradual melting of my ice cream, which I'd rather be enjoying instead of waiting to replace. But I must be patient for my plan to be effective. I miss the waiter once or twice, so I must persevere. It would be a mistake to lose my temper and yell or jump up impatiently, as it would only make me look bad.
Social Context
Infused throughout this situation is the social context that governs what is right and wrong on both sides. No one in the restaurant would likely say that two patrons ordering the same bowl of ice cream should receive such different servings. Similarly, in this restaurant it is expected that we behave in a well-mannered fashion, and to do otherwise would be a violation of the social mores that itself would exceed the injustice of the unequal ice cream. This may be too trivial of an example to merit the term “morality,” but it does at least merit the term “social conduct.” Note how different our social mores are in a car accident or a tsunami. How we behave depends on the situation, and that in turn is reflected in how we are expected to behave.
If a person suffers from a fatal illness for which there is inadequate treatment, what degree of risk would be considered ethically acceptable in a clinical trial? This is a question faced by physicians, scientists, institutional review boards, courts, and of course our patients every day as we seek to advance our therapeutic armamentaria for glioblastoma, amyotrophic lateral sclerosis, and many other illnesses. In the absence of an objective anchor on which to base our decisions, we are driven by our moral sense. Morality is critical for the practice of medicine and for guiding research. Perhaps surprisingly, we can begin to understand the origins of morality within the construct of human creativity that I have discussed in each edition of this column this year.
Motivation
Fairness and the punishment of unfairness activate reward centers in the brain (Science 2004;305:1254-8). Evolutionary psychologists believe this behavior is an instinct that supports the survival of social groupings (“Evolutionary Psychology: A Primer,” 1997 [www.psych.ucsb.edu/research/cep/primer.html]).
Perception
In any social grouping, we occasionally perceive situations involving inequality or unfairness, especially within our social niche. If I am at a restaurant with my neighbor, and we both order the same chocolate ice cream dessert, I will feel unfairly treated if he gets twice as much as I do, or if his comes with sprinkles and a cherry and mine does not. I perceive this inequality, but I envisioned (expected) equality. The difference between what I envisioned and what I perceive provides the motivational voltage that leads me to act.
Action
I now formulate a plan. I will call the waiter over, point out this obvious difference, and ask that I receive a serving equal to my neighbor's. Enactment of this plan requires me to account for and conform to the context. Though internally my motivation is clear, externally, I am among polite company in a public place and the injustice is of rather small import, even if it is unfair in principle. So, in acting out the formulated plan, I do not yell obscenities or threaten the waiter's life, but rather tactfully wave him over when I catch his eye and then politely point out the difference.
Temperament
Although this is not a long-term issue, the restaurant is busy, and it is more than a few minutes before I am able to signal my waiter, so I am waiting. In waiting, my impatience palpably grows, fueled by the indignation of the unfairness in front of me and the gradual melting of my ice cream, which I'd rather be enjoying instead of waiting to replace. But I must be patient for my plan to be effective. I miss the waiter once or twice, so I must persevere. It would be a mistake to lose my temper and yell or jump up impatiently, as it would only make me look bad.
Social Context
Infused throughout this situation is the social context that governs what is right and wrong on both sides. No one in the restaurant would likely say that two patrons ordering the same bowl of ice cream should receive such different servings. Similarly, in this restaurant it is expected that we behave in a well-mannered fashion, and to do otherwise would be a violation of the social mores that would itself exceed the injustice of the unequal ice cream. This may be too trivial an example to merit the term “morality,” but it does at least merit the term “social conduct.” Note how different our social mores are in a car accident or a tsunami. How we behave depends on the situation, and that in turn is reflected in how we are expected to behave.
At any given point in history, there are social mores associated with various situations (Biol. Philos. 2010;25:361–78). Some of these situations are today considered wrong, yet in their time were part of the social landscape and had to be navigated just as I had to navigate our hypothetical restaurant scenario. There was a right and wrong way to treat slaves and a right and wrong way to mete out medieval torture, and it was the social context that determined this “cultural morality.” Applying the cultural expectations of social behavior in 21st-century urban America across time to the Middle Ages or across space to an isolated tribe in a tropical rain forest is a mistake that missionaries have made, sometimes resulting in death.
Philosophers may debate whether there are some universal moral truths about right and wrong and whether or not science may inform us about them, but like aesthetics, eventually any moral conclusion is applied within a social context and it is society's behavior that operationally defines if the moral creation is acceptable or not.
As we judge the moral failing of our predecessors and those of other cultures, so too will we be judged by our descendants and those of future dominant cultures. Our acceptance of that may help us to avoid the atrocities that may arise from any unidirectional belief in the absolute correctness of an existing position. Morality and its misapplication underscore the importance of understanding the model of human creative thought and the creative origin of morality so that we avoid tyranny by a would-be dictator, regardless of whether he or she is a king, clergyman, or scientist. Note that it is not politics, religion, or science, per se, that necessitates tyranny. It is the individual person using the mantle of politics, religion, or science to justify what may be his own inner turmoil, or, as others have explained (Psychol. Rev. 2001;108:814–34; Neuron 2004;44:389–400), to rationalize his emotional impulse toward a self-serving goal.
Society has created an extensive system of judgment that defines the limits for what can be tolerated within whatever bounds it may consider moral behavior. This is our legal system. Our legal system sets out the rules of social behavior and the punishments for violations. But even our legal system evolves with the times and differs across countries, each with its own national culture (and set of subcultures). If society perceives what exists (ban on gay marriage) and envisions what it believes to be something better (legalizing gay marriage), then an action plan will be formulated and enacted in an attempt to overturn the law.
Temperament is crucial. If Martin Luther King Jr. and the civil rights movement had retreated after their first encounter with police resistance and illegal violence against them, they might not have succeeded. But once the prevailing paradigm starts coughing up blood, minds start to change, society's mores evolve, and the paradigm eventually shifts. Perhaps one day in the future, when our personal genomes become as standard a part of our medical record as our date of birth, we will not look upon genetic disclosure to research participants as such a great risk, but rather take the opposite approach of ensuring full genetic disclosure regardless of uncertainties or implications. What is moral today in America differs from 200 years ago even though our biology has not changed in that time, nor have the philosophical anchors of Western civilization. What changes is the attitude of the people who live here and now, and that is what defines morality here and now.
Social Context Influences Creative Success
The success of a creation should, in theory, be determined by its creator, who is in the best position to determine how closely the creation matches the original vision. But in science, as in other creative endeavors, this is not the case. Success in science requires funding and publication, which does not arise from scientists' opinions of their own work, but rather from the judgment rendered by a peer group comprising reviewers and editors.
This socially determined valuation of a creative effort helps to determine what society (or any social grouping) deems to be important. How much value we place on a new creation influences its creator's drive to bridge the perceived gap between what is and what should be. The satiation of that creative drive is a biologically and psychologically relevant measure of creative success because it influences the likelihood that the creator will react again in the future to such perceived gaps, thus perpetuating creative behavior. Other factors may influence the degree of such satisfaction, including the reward received; the value that the creator's culture places on individual attainment (Annu. Rev. Psychol. 2003;54:403-25); enjoyment of the creative effort itself, as expressed in Mihály Csíkszentmihályi's “Finding Flow: The Psychology of Engagement with Everyday Life” (New York: Basic Books, 1997); and the nature of the creation itself, in that creators who serve the greater good may get a greater sense of happiness, as discussed in Jonathan Haidt's “The Happiness Hypothesis” (New York: Basic Books, 2006).
Although the creator's opinion is important, Dr. Csíkszentmihályi's “systems model” of creativity highlights the role of society, in which a gatekeeper determines what creative work will be admitted to the existing domain (“The Nature of Creativity: Current Psychological Perspectives” [Cambridge: Cambridge University Press, 1988, p. 325–39]). Because society cannot know the creator's vision and so cannot match the creation to the vision, an external set of aesthetic rules is needed to judge creative achievement.
Aesthetics are the cooperatively determined hierarchical categorization and quantification of quality, expressed as rules or principles. Aesthetics reflect the opinions and values of the social grouping in which creativity arises. For example, the aesthetic value of a painting lies in the artist's choice of color and form, and the aesthetic value of a scientific experiment lies in its methodological rigor, but the general principle of judging excellence is similar for both art and science.
How do we arrive at a set of aesthetic rules? Arguably, neurophysiology might lend some degree of objectivity. For example, neuronal receptive fields and firing patterns reflecting tonal quality, timbre, pitch, temporal structure, complexity, and familiarity of music can be measured (Nat. Neurosci. 2005;8:1241-7), but even so, there must be some determination of which responses or qualities are best. As a society, therefore, we must agree to a set of principles that define a work as being good or bad. Just as social norms define what conduct is expected and tolerated within a given society, aesthetics define what is desirable and undesirable within artistic, scientific, and other creative communities.
Leaders influence such norms, and within the social or professional grouping promote cooperation among its members to conform to the set standards (Nature 2003;422:137-40). Within large social groupings, cooperation can be and usually is enforced by the membership, either through designated experts or simply in the form of peer pressure.
Social norms are necessary because one person's actions affect other members of the group. Evolutionary psychologists have provided evidence that our minds have evolved a social contract algorithm specialized for detecting liars, cheaters, and rule-breakers – those individuals who violate social law. Neuroeconomists suggest that social norms are based on “conditional cooperation,” in which the level of cooperation of each group member is based on the level of mutual cooperation of all the members. If mutual cooperation is high, then individual cooperation is high. On the other hand, if I see many people breaking the law, benefiting as a result, and getting away with it, then I will be more likely to take a chance by breaking the law, too. Looting during times of social upheaval is a familiar manifestation of this principle.
For a paradigm, law, or any social norm to prevail, it must be enforced (Nature 2002;415:137-40). And for the aesthetic principle to endure, social (aesthetic) norms must be enforced, and noncooperators (those who fail to comply with accepted aesthetic principles) punished, leaving their papers unpublished or grant applications unfunded.
As I mentioned in the February issue's discussion about motivation, we like justice and we dislike injustice. Exacting social justice activates striatal and orbitofrontal reward substrates (Science 2004;305:1254-8), so we have powerful neurobiological drivers that serve to maintain social order.
However, social norms, aesthetic principles, and scientific paradigms can change. When the cost of cooperation with such a principle rises, due perhaps to mounting evidence that the scientific paradigm is wrong, the level of mutual cooperation will drop. Recall that if the reward value of an ongoing action drops, the reduced reward is the signal that drives the formation of a new action plan. When mutual cooperation with a social norm drops and defection rates rise, the social norm is destined to break down. In science, this is termed a “paradigm shift,” as described by Thomas S. Kuhn in “The Structure of Scientific Revolutions” (Chicago: University of Chicago Press, 1970).
Aesthetic laws, as practiced at the peer-to-peer and leadership levels, define and validate the merit of a creation. Aesthetic rules, when they are enforced by credible authorities, become accepted fact. We may even extend this principle to another human creation – morality – and we shall do so next month.
Creativity's Links to Time and Temperament
Great achievements often take time, so a creator must have the ability to persevere until the plan is completed. But not all plans prove to be practical, and a creator must have a sense of when insufficient progress has been made for the time invested. Most relevant for our understanding of the role time plays in creativity is our perception and mental image of time, or what we might term “cognitive time.”
Cognitive time is the perceived duration of an event. Perceptions and mental images often contain a sequence of events. Each event, the time between events, and the entire sequence of events have a specific duration or chronological distance that is inherent in the mental image. We compare the chronological distance of our perceived “what is” with our imagined “what should be.” I envision that it takes 10 minutes to walk the dog, so when my son fails to return after an hour, I am worried.
The perceived passage of time is a quantity, and quantity is a property shared by all of our sensory modalities. Whether something is brighter or darker, louder or quieter, heavier or lighter, faster or slower are examples of quantified perceptions. Quantity for any sense is abstracted by our multimodal parietal lobe (Curr. Opin. Neurobiol. 2004;14:218-24). Our conscious estimation of time is influenced by our circadian rhythms. When people are isolated in a chamber without any time cues and allowed to wake and sleep as they desire, their bodies unconsciously maintain biological rhythms, for example in body temperature fluctuation. When asked to judge the passage of time, we overestimate during periods of higher body temperature and maximal wakefulness, and underestimate during periods of lower body temperature and greater sleepiness (Physiol. Behav. 2001;72:589-93). The hour my son spent walking the dog would have seemed longer to me at 4 a.m. than at 4 p.m.
All creative plans have a time frame. We decide whether the progress gained over the time spent on a creative effort matches the chronology embedded within our envisioned action plan. How we react to the progress of our creative effort within the perceived time frame is influenced by our temperament. Dr. C. Robert Cloninger defines temperament as an unconscious property based on our automatic responses to perceived stimuli. Such responses determine whether we are driven more by the search for reward vs. the avoidance of punishment, and how well we tolerate and persist in the face of “frustrative nonreward.”
Dr. Cloninger defines four dimensions of temperament as novelty seeking (motivated by the possibility of unexpected reward), harm avoidance (happy simply to avoid punishment), reward dependence (needing praise), and persistence (“perseverance despite frustration and fatigue”). Character, he says, is driven by three dimensions: self-directedness (willpower to achieve one's own goals), cooperativeness with other individuals, and self-transcendence (the acceptance that the self is part of a universal whole). Individual differences in temperament and character define our individual personalities (Arch. Gen. Psychiatry 1987;44:573-88; Arch. Gen. Psychiatry 1993;50:975-90).
Those who are more highly motivated by the search for novelty are more likely to envision and pursue the realization of something new (what should be) than are individuals who prefer the avoidance of harm (leave well enough alone). We may be temperamentally biased to envision something better than we perceive, thus generating the motivational voltage that initiates creative behavior. Temperament determines our reactive set point; our tolerance for the status quo and for unrewarded action; how patient and perseverant we tend to be; and when it is time to alter our plan.
Of course, not every new idea is good. Character (which is based upon conscious, insight-oriented learning) allows us to regulate ourselves, our interactions with others, and our integration with more universal themes of nature and spirituality. Our ability to learn consciously that a temperament-driven gut response can be maladaptive allows us to modify our reaction consciously.
To achieve a creative goal usually requires persistence over an extended period of time, even in the absence of external encouragement. During a task in which subjects were asked to rate facial expressions, individuals with higher persistence scores performed with greater accuracy. The task had some periods that most subjects found boring. During the boring test periods, those individuals with higher persistence scores had better overall task performance and maintained activation of brain reward centers on fMRI, whereas those with lower persistence scores had poorer overall performance and deactivated those same reward regions (Proc. Natl. Acad. Sci. U.S.A. 2003;100:3479-84). If the goal of the task is itself assumed to be rewarding to the participant, reward activation during this period of boredom or frustrative nonreward may imply that the goal is more effectively maintained in the minds of those who are more persevering.
Individual differences within each of the seven dimensions of personality correlate with different patterns of regional brain activity (J. Neurosci. 2005;25:6460-6), providing some biological validity to these personality constructs. Personality and temperament are qualities of the individual creator, and they influence how the individual creator behaves and interacts with others. How society in turn reacts to the creator and the creative product ultimately defines creative success, as we shall consider in the next edition.
Great achievements often take time, so a creator must have the ability to persevere until the plan is completed. But not all plans prove to be practical, and a creator must have a sense of when insufficient progress has been made for the time invested. Most relevant for our understanding of the role time plays in creativity is our perception and mental image of time, or what we might term “cognitive time.”
Cognitive time is the duration of an event. Perceived and mental images often contain a sequence of events. Each event, the time between events, and the entire sequence of events has a specific duration or chronological distance that is inherent in the mental image. We compare the chronological distance of our perceived “what is” with our imagined “what should be.” I envision that it takes 10 minutes to walk the dog, so when my son fails to return after an hour, I am worried.
The perceived passage of time is a quantity, and quantity is a property shared by all of our sensory modalities. Whether something is brighter or darker, louder or quieter, heavier or lighter, faster or slower are examples of quantified perceptions. Quantity for any sense is abstracted by our multimodal parietal lobe (Curr. Opin. Neurobiol. 2004;14:218-24). Our conscious estimation of time is influenced by our circadian rhythms. When people are isolated in a chamber without any time cues and allowed to wake and sleep as they desire, their bodies unconsciously maintain biological rhythms, for example in body temperature fluctuation. When asked to judge the passage of time, we overestimate during periods of higher body temperature and maximal wakefulness, and underestimate during periods of lower body temperature and greater sleepiness (Physiol. Behav. 2001;72:589-93). The hour my son spent walking the dog would have seemed longer to me at 4 p.m. than at 4 a.m.
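The direction of these over- and underestimates can be sketched with a toy model. Everything quantitative here is hypothetical – the ±20% swing and the cosine shape are chosen only to make the cited direction of effect concrete, not fitted to the study's data:

```python
import math

def subjective_minutes(actual_minutes: float, hour_of_day: float) -> float:
    """Toy model of circadian modulation of perceived duration.

    A hypothetical scaling factor peaks in late afternoon (maximal
    wakefulness and body temperature, so elapsed time is overestimated)
    and troughs around 4 a.m. (so elapsed time is underestimated).
    """
    # Cosine with peak at 4 p.m. (hour 16) and trough at 4 a.m. (hour 4),
    # swinging the estimate by an illustrative +/-20%.
    factor = 1.0 + 0.2 * math.cos(2 * math.pi * (hour_of_day - 16) / 24)
    return actual_minutes * factor

# The same 60 real minutes are judged longer at peak wakefulness
# than at the early-morning temperature trough.
afternoon = subjective_minutes(60, 16)  # overestimate
predawn = subjective_minutes(60, 4)     # underestimate
```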
All creative plans have a time frame. We decide whether the progress gained over the time spent on a creative effort matches the chronology embedded within our envisioned action plan. How we react to the progress of our creative effort within the perceived time frame is influenced by our temperament. Dr. C. Robert Cloninger defines temperament as an unconscious property based on our automatic responses to perceived stimuli. Such responses determine whether we are driven more by the search for reward vs. the avoidance of punishment, and how well we tolerate and persist in the face of “frustrative nonreward.”
Dr. Cloninger defines four dimensions of temperament as novelty seeking (motivated by the possibility of unexpected reward), harm avoidance (happy simply to avoid punishment), reward dependence (needing praise), and persistence (“perseverance despite frustration and fatigue”). Character, he says, is driven by three dimensions: self-directedness (willpower to achieve one's own goals), cooperativeness with other individuals, and self-transcendence (the acceptance that the self is part of a universal whole). Individual differences in temperament and character define our individual personalities (Arch. Gen. Psychiatry 1987;44:573-88; Arch. Gen. Psychiatry 1993;50:975-90).
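Cloninger's scheme – four temperament dimensions plus three character dimensions – can be sketched as a simple data structure. The field names paraphrase the dimensions above; the numeric scores are purely illustrative and are not the actual Temperament and Character Inventory scoring:

```python
from dataclasses import dataclass

@dataclass
class Temperament:
    # Unconscious, automatic responses to perceived stimuli
    novelty_seeking: float    # motivated by possible unexpected reward
    harm_avoidance: float     # happy simply to avoid punishment
    reward_dependence: float  # needing praise
    persistence: float        # perseverance despite frustration and fatigue

@dataclass
class Character:
    # Conscious, insight-oriented learning
    self_directedness: float  # willpower to achieve one's own goals
    cooperativeness: float    # cooperation with other individuals
    self_transcendence: float # the self as part of a universal whole

@dataclass
class Personality:
    temperament: Temperament
    character: Character

    def dimensions(self) -> dict:
        """The seven dimensions whose individual differences define personality."""
        t, c = self.temperament, self.character
        return {
            "novelty_seeking": t.novelty_seeking,
            "harm_avoidance": t.harm_avoidance,
            "reward_dependence": t.reward_dependence,
            "persistence": t.persistence,
            "self_directedness": c.self_directedness,
            "cooperativeness": c.cooperativeness,
            "self_transcendence": c.self_transcendence,
        }
```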
Those who are more highly motivated by the search for novelty are more likely to envision and pursue the realization of something new (what should be) than are individuals who prefer the avoidance of harm (leave well enough alone). We may be temperamentally biased to envision something better than we perceive, thus generating the motivational voltage that initiates creative behavior. Temperament determines our reactive set point; our tolerance for the status quo and for unrewarded action; how patient and perseverant we tend to be; and when it is time to alter our plan.
Of course, not every new idea is good. Character (which is based upon conscious, insight-oriented learning) allows us to regulate ourselves, our interactions with others, and our integration with more universal themes of nature and spirituality. Our ability to learn consciously that a temperament-driven gut response can be maladaptive allows us to modify our reaction consciously.
To achieve a creative goal usually requires persistence over an extended period of time, even in the absence of external encouragement. During a task in which subjects were asked to rate facial expressions, individuals with higher persistence scores performed with greater accuracy. The task had some periods that most subjects found boring. During the boring test periods, those individuals with higher persistence scores had better overall task performance and maintained activation of brain reward centers on fMRI, whereas those with lower persistence scores had poorer overall performance and deactivated those same reward regions (Proc. Natl. Acad. Sci. U.S.A. 2003;100:3479-84). If the goal of the task is itself assumed to be rewarding to the participant, reward activation during this period of boredom or frustrative nonreward may imply that the goal is more effectively maintained in the minds of those who are more persevering.
Individual differences within each of the seven dimensions of personality correlate with different patterns of regional brain activity (J. Neurosci. 2005;25:6460-6), providing some biological validity to these personality constructs. Personality and temperament are qualities of the individual creator, and they influence how the individual creator behaves and interacts with others. How society in turn reacts to the creator and the creative product ultimately defines creative success, as we shall consider in the next edition.
Biological Differences Bring Action to Creativity
I can think of no better example of how strategic formulation must translate into dexterously executed action to effect change than health care reform, a topic we extensively cover in this month's issue of
Just as some reform measures and proposals seek to recognize the intrinsic nature of neurologists, so too must we consider the intrinsic biological differences between people that render some better equipped than their neighbors to execute a particular plan successfully. Biology matters. As recent sports scandals have suggested, athletic performance can be enhanced by drugs such as anabolic steroids and amphetamines. Some neuroscientists have even advocated the cosmetic use of therapeutic drugs as “cognitive enhancers.” Drugs are exogenous biological influences, but there are endogenous sources, too.
Some of the more readily visible biological differences thought to explain enhanced performance involve brain structure. Regional differences in brain function are reflected to some extent by differences in structure. For example, the planum temporale, an auditory region of the temporal lobe, is larger in the language-dominant (typically left) hemisphere (Science 1968;161:186–7), a trait shared by nearly all people.
Some regional alterations reflect individual differences in use. The lateral aspect of Heschl's gyrus is larger in the left hemisphere among musicians whose pitch perception strategy favors fundamental frequency or rapid temporal processing, but larger instead in the right hemisphere among musicians whose pitch perception strategy favors spectral pitch processing. This region also is physically larger in accomplished musicians compared to nonmusicians (Nat. Neurosci. 2005;8:1241–7).
Studies of Albert Einstein's brain revealed a greater density of neurons in the cerebral cortex than normal (Neurosci. Lett. 1996;210:161–4), and an aberrant Sylvian fissure with disproportionately larger and more symmetric parietal lobes (Lancet 1999;353:2149–53). The significance of these differences has prompted speculation that the greater neuronal density reduced the time delay for one neuron to communicate with another, and the enlarged parietal lobes enhanced his inherent math and spatial skills, arguments that have some parallels in comparisons between low and high IQ individuals (Trends Neurosci. 1997;20:365–71).
Genetic variations have been considered another source of individual differences. The performances of identical twins on a variety of cognitive and physiologic tests are far more similar than the comparative performances of genetically unrelated people (Behav. Genet. 2004;34:41–50). The search for genetic variations that enhance cognitive performance has revealed several that influence memory, including the serotonin 5-HT2a receptor (Nat. Neurosci. 2003;6:1141–2), brain-derived neurotrophic factor (J. Neurosci. 2003;23:6690–4), KIBRA (found in kidney [KI] and brain [BRA]) (Science 2006;314:475–8), and the dopamine D2 receptor (Science 2007;318:1642–5). Variations of genes related to serotonin are also thought to affect our reaction to novelty and anxiety-provoking situations (J. Neurosci. 2005;25:6460–6) that in turn might influence our drive for seeking creative change. Allelic variations of the gene for catechol-O-methyl transferase (involved in dopamine metabolism, the neurotransmitter of the mesolimbic reward pathway) correlate with performance on a problem-solving task (Am. J. Psychiatry 2002;159:652–4). Interactions between genes and environmental factors may result in unexpected or “emergent” behaviors that may also affect creativity, such as the difference in emotion processing between men and women (Curr. Opin. Neurobiol. 2004;14:233–8).
Less obvious sources of enhanced performance are suspected to reflect individual physiological differences. A functional MRI study comparing the calculation skills of Rüdiger Gamm, a mathematical calculation prodigy, with nonexpert calculators showed that both activated brain regions serving arithmetic, quantity, and visual imagery, but only Gamm additionally activated memory regions (Nat. Neurosci. 2001;4:103–7). In a related study, expert abacus calculators activated the same areas for mental calculation as nonexperts, but additionally activated visuospatial cortices, congruent with the greater visuospatial demands of an abacus-based strategy (NeuroImage 2003;19:296–307). These studies suggest that the neural networks underlying prodigy-level skill may differ from those underlying ordinary-level skill. The regions required for the basic function are active in both, but the prodigies have an additional functional system in their skill-related network that seems to reflect their training background. It is unclear whether the extra system is inherently available to anyone with sufficient practice – and if so, to what degree – or is instead a form of biologically conferred “performance synesthesia.”
Disease-mediated biological alterations of brain structure and function seem an unlikely source of heightened ability, yet autistic savants are a well-known group of individuals whose extraordinary talent resides in a circumscribed area that is grossly disproportionate to their general intellect. Savant skills have included memory, mathematics, music, calendrical calculations, and, less consistently, mechanical or artistic skill.
Biological substrates of savantism are unclear, but some correlates have included a larger amygdala (in children) and hippocampus (J. Neurosci. 2004;24:6392–401). Perseverative fixation on a single activity that is their sole avenue of socialization and reward, coinciding with their area of savant-level talent, suggests that savantism may derive from the extreme focus of reward on a single activity and structurally altered paralimbic reward substrates, but this is currently speculative.
Another group whose disease can sometimes enhance creativity comprises patients with frontotemporal dementia (FTD), possibly reflecting the reduced behavioral inhibition that characterizes the disease (Arch. Neurol. 2004;61:842–4). Some FTD patients have developed newly expressed artistic skills, reflected in greater volumes of less constrained art. But contrary to popular belief, psychiatric disease is not a pathway to enhanced creativity. A large study of eminent men concluded that depression and personality disorders were common, especially among writers, and that their prevalence among the gifted exceeded that in the general population. But those disorders were generally a hindrance to creative ability, and psychosis was a frank handicap (Br. J. Psychiatry 1994;165:22–34).
Some individuals have increased dexterity to carry out creative plans for reasons that range from environmental influences on normally structured nervous systems to altered “wiring diagrams.” But regardless of how we have acquired our talents, the ways we choose to use them depend in part on our personality and temperament, which will be our focus in the next issue.
Practice Pays When Undertaking Creative Action
In last month's issue of
Once the decision to act is made, the success of the creative endeavor will depend on the dexterity of its execution. With 3 seconds left to play, the fate of a team down by one point will differ drastically if the ball falls into the hands of a Michael Jordan versus a basketball wannabe like me. The research reported in our pages represents the dexterous execution of well-formulated experiments that further our knowledge and ultimately lead to treatments and cures of diseases such as Parkinson's disease, epilepsy, and diabetic neuropathy, as highlighted at last month's annual meeting of the American Academy of Neurology. Yet not all research successfully illuminates the questions it was designed to answer, and not all last-second shots result in victory.
We differ greatly in our levels of dexterity in the performance of any given task. How can we explain such differences and, in particular, how can we explain the extraordinary dexterity of virtuoso musicians, elite athletes, and Nobel laureate scientists? Do these differences reflect the impact of nurture on nature? Might it simply be that some individuals are more practiced than others (nurture)? If so, this raises the question of whether any one of us could practice to the point of perfection. Or is it that we are built differently (nature)? Might biological differences between us facilitate greater dexterity in the fortunate few, and if so, would this translate to all abilities or only to certain domains of skill (such as one of the multiple intelligences proposed by Howard Gardner that were described last month)?
Nurture, nature, and their interplay all contribute. Epigenetic alterations of genetic expression – the influence of nurture on nature – can occur at all levels of our physiology from DNA transcription to behavior: the social structure in which a child is reared, the expression of trigger-sensitive phenotypes, and the plasticity of hard-wired neuronal circuits are just a few examples of their interplay (Ann. N.Y. Acad. Sci. 2003;999:451-60).
But, even in the case of a biologically influenced skill, environmental factors must play a role. For example, although just knowing how to play the piano is not sufficient to achieve virtuoso status, it is still a basic requirement even for a biologically determined musical prodigy. Therefore, let us examine nurture more closely.
Practice is part of everything we call learning and education: school, music lessons, rehearsal for a play, and so on. Learning to read requires a transition from an effortful, letter-by-letter phonetic strategy to a much less effortful whole-word semantic recognition strategy, and we find a similar pattern in learning a new skill. When we first begin to practice a new skill, many details are unfamiliar to us. Before a video gamer can reach the competitive level of the game itself, he must first learn how the controller works. Button A controlled jumps in the last game, but in the new game it controls gunfire. Even the layout of controls differs between PlayStation, Xbox, and Nintendo game systems. Learning how the controller works takes time and, until the controls are mastered, a player cannot be at his potential best. Acquiring any new skill requires mastering such unfamiliar details, so early in our practice trials we pay close attention to them. This is the attentional stage of skill learning. Attention and organization of the different steps of the skill are mediated by the prefrontal cortex and other regions that comprise the attentional network.
With repetition, these details become increasingly familiar. Later in our practice trials, these details and the skill itself become so familiar that the practiced action is nearly automatic. The transition from effortful attention to each unfamiliar detail and the stitching together of a series of skill fragments into a complete, seamless action marks the beginning of the automaticity stage, and it is not until then that we can start down the road to virtuoso levels of skill. Functional brain imaging studies show that activation of prefrontal cortices during the early attentional practice stage diminishes as the skill becomes automatic. With increasing task familiarity comes greater task automaticity and increasing performance dexterity (Proc. Natl. Acad. Sci. U.S.A. 1998;95:853-60). By the time we reach the stage of performance automaticity, our performance level plateaus. There are individual differences in how long it takes for a skill to reach the automaticity stage and the level of dexterity achieved by that time (although external rewards can influence this), but most people can reach this stage for most tasks.
Cerebral activation patterns for subsequent practice stages differ between sensorimotor and cognitive tasks (Cereb. Cortex 2005;15:1089-102). Sensorimotor tasks are defined as those that involve repetitive movements of a specific body part, for example, the left fingers of a violinist. Ongoing repeated fingering movements enhance horizontal synaptic connectivity within the finger homunculus. Consequently, there is enhanced cortical activation of that region with the fingering movement because of the greater number of neurons recruited for that task's performance (Science 1995;270:305-7). Cognitive tasks, in contrast, rely upon the integration of multiple brain regions that are geographically distant and serve different functions. With practice, the relative activation of all these different areas diminishes, perhaps because they become physiologically integrated into a functional network that requires less effort expenditure from each component region.
Practice effects powerfully influence the level of dexterity any normal human brain can attain. However, biological differences do exist among us and also influence dexterity levels, as we shall consider next month.
The Jump From Creative Vision to Strategic Plan
The essence of science is the formulation and execution of strategies designed to answer questions, and in this month's “Neuroscience Today, Neurology Tomorrow” (page 8), we are given considerable insight into the sequential neurophysiological nature of the intention to act (formulation) and the act itself (execution). Whether the question is how to prevent Alzheimer's disease, use genetic tests effectively, or make Medicare sustainable, we first formulate a plan that we will then carry out. Therefore, our April issue provides a wonderful context in which to continue our discussion of creativity by turning to the step of strategic formulation, the first step in action.
Howard Gardner, Ph.D., in his 1983 book, “Frames of Mind,” first proposed the theory of multiple intelligences. He derived a set of six (at that time) noncoinciding intellectual competencies from rigorous and overlapping neurological, psychological, and developmental observations. The original six intelligences included linguistic, musical, logical-mathematical, spatial, bodily-kinesthetic, and personal intelligence. According to Gardner, any form of intelligence has a creative aspect in which a desired goal is envisioned and a plan for attaining it is formulated and then executed (What is intelligence? in “Frames of Mind” [New York: Harper Collins, 1983, pp. 59-70]).
Translating a creative vision, an idea, into a creative achievement requires a strategic plan, as illustrated by the scientist writing a grant or the architect drafting blueprints. I have an idea that age-related memory decline may be a manifestation of subclinical Alzheimer's disease, but formulating an experiment to test that idea is a nontrivial subsequent step. The generation of ideas and the formulation of a strategic action plan to realize those ideas are not wholly separate processes because both require mental imagery, but they might be likened to Mihály Csíkszentmihályi's model of creativity where the idea represents a low-threshold, low-effort thought while the plan is a more effortful and deliberate formulation to give the idea “legs” (“Creativity: Flow and the Psychology of Discovery and Invention” [New York: Harper Collins, 1996]).
Strategic planning requires the ability to perceive and weigh the importance of multiple pieces of information that accommodate the needs of the individual to the circumstances of the situation. Working memory, the cognitive process that underlies multitasking, involves holding all relevant pieces of information in our conscious thought while we generate an action plan. Physiologically, dorsolateral prefrontal neurons become active during the period when information must be retained before a response can be enacted (Curr. Opin. Neurobiol. 2004;14:163-8). The activation of neurons in this region may reflect holding online the relevant information itself, or another aspect such as its intended future use in the action plan. We perceive various stimuli around us, and learn to associate certain ones with specific outcomes (for example, the sound of a dinner bell means dinner is served). Such stimulus-outcome relationships may be simple (single and immediately paired) or complex and removed from the salient outcome by multiple steps, but all act through activation of the same reward systems we have discussed.
But what about when a solution arrives in an “aha moment,” a sudden burst of insight that does not seem to derive from strategic planning? These moments, which are possibly analogous to the unintended, sudden spontaneous generation of a mental image, are unexpected, and generally do not occur during a time of deliberate conscious (or externally provoked) analysis (Trends Cogn. Sci. 2005;9:322-8). Mind wandering provides fertile ground for aha moments. It is a state of spontaneous thought that is not deliberate and that occurs when there is a lull in external demands, such as in the shower. Sudden insights that arise during states of mind wandering can be quite relevant solutions for unfinished problems, just as if the working memory state of systematic analysis had been temporarily paused and then suddenly restarted, bringing an important missing piece of the puzzle into conscious awareness.
Functional MRI studies of insight-oriented problem solving and mind wandering have revealed that both involve enhanced activity of the anterior cingulate cortex, the same general region that is first activated when our behavior changes in response to altered reward (Proc. Natl. Acad. Sci. U.S.A. 2009;106:8719-24). The anterior cingulate is part of a “default network,” a series of brain structures that, paradoxically, become maximally active when we are at rest. When individuals are unaware of their mind wandering (and so least aware of internal and external distractions), working memory regions (including the dorsolateral prefrontal cortices) increase their activity together with the default network. This possibly provides a neural substrate for spontaneous “aha” insights that arise during periods of mind wandering.
The prefrontal cortex is a multimodal region where information about bodily states, surrounding circumstances, and semantic knowledge converge. All this information is used to develop an appropriate plan of action. Prefrontal neurons project to the locus ceruleus, the origin of cortical noradrenergic projections that influence our level of alertness and attention. Norepinephrine released from the locus ceruleus facilitates the transmission of incoming sensory signals, making it more likely that we will detect, attend to, and be influenced by environmental sensory stimuli. Parietal sensory association cortices also contain neurons with working memory type properties, so that multiple sources of perceptual information can be held “online” while a plan is being formulated (J. Neurosci. 2002;22:8720-5).
These sensory association cortices perhaps contribute perception and mental imagery to the formulated plan and constitute a working memory network designed for strategic thought. The potential rewarding and aversive values of a stimulus influence prefrontal neuronal activity, as well as other stages of the perceptual and planning networks, and thereby affect what we perceptually notice and choose to contemplate in the strategic planning process (Curr. Opin. Neurobiol. 2004;14:139-47). We are more likely to attend to higher-reward and higher-risk stimuli than to those with little potential consequence.
Next month we will consider what happens when we execute a plan of action.
The essence of science is the formulation and execution of strategies designed to answer questions, and in this month's “Neuroscience Today, Neurology Tomorrow” (page 8), we are given considerable insight into the sequential neurophysiological nature of the intention to act (formulation) and the act itself (execution). Whether the question is how to prevent Alzheimer's disease, use genetic tests effectively, or make Medicare sustainable, we first formulate a plan that we will then carry out. Therefore, our April issue provides a wonderful context in which to continue our discussion of creativity by turning to the step of strategic formulation, the first step in action.
Howard Gardner, Ph.D., in his 1983 book, “Frames of Mind,” first proposed the theory of multiple intelligences. He derived a set of six (at that time) noncoinciding intellectual competencies from rigorous and overlapping neurological, psychological, and developmental observations. The original six intelligences included linguistic, musical, logical-mathematical, spatial, bodily-kinesthetic, and personal intelligence. According to Gardner, any form of intelligence has a creative aspect in which a desired goal is envisioned and a plan for attaining it is formulated and then executed (What is intelligence? in “Frames of Mind” [New York: Harper Collins, 1983, pp. 59-70]).
Translating a creative vision, an idea, into a creative achievement requires a strategic plan, as illustrated by the scientist writing a grant or the architect drafting blueprints. I have an idea that age-related memory decline may be a manifestation of subclinical Alzheimer's disease, but formulating an experiment to test that idea is a nontrivial subsequent step. The generation of ideas and the formulation of a strategic action plan to realize those ideas are not wholly separate processes because both require mental imagery, but they might be likened to Mihály Csíkszentmihályi's model of creativity where the idea represents a low-threshold, low-effort thought while the plan is a more effortful and deliberate formulation to give the idea “legs” (“Creativity: Flow and the Psychology of Discovery and Invention” [New York: Harper Collins, 1996]).
Strategic planning requires the ability to perceive and weigh the importance of multiple pieces of information that accommodate the needs of the individual to the circumstances of the situation. Working memory, the cognitive process that underlies multitasking, involves holding all relevant pieces of information in our conscious thought while we generate an action plan. Physiologically, dorsolateral prefrontal neurons become active during the period when information must be retained before a response can be enacted (Curr. Opin. Neurobiol. 2004;14:163-8). The activation of neurons in this region may reflect holding online the relevant information itself, or another aspect such as its intended future use in the action plan. We perceive various stimuli around us, and learn to associate certain ones with specific outcomes (for example, the sound of a dinner bell means dinner is served). Such stimulus-outcome relationships may be simple (single and immediately paired) or complex and removed from the salient outcome by multiple steps, but all act through activation of the same reward systems we have discussed.
But what about when a solution arrives in an “aha moment,” a sudden burst of insight that does not seem to derive from strategic planning? These moments, which are possibly analogous to the unintended, sudden spontaneous generation of a mental image, are unexpected, and generally do not occur during a time of deliberate conscious (or externally provoked) analysis (Trends Cog. Sci. 2005;9:322-8). Mind wandering provides fertile ground for aha moments. It is a state of spontaneous thought that is not deliberate and that occurs when there is a lull in external demands, such as in the shower. Sudden insights that arise during states of mind wandering can be quite relevant solutions for unfinished problems, just as if the working memory state of systematic analysis had been temporarily paused and then suddenly restarted, bringing an important missing piece of the puzzle into conscious awareness.
Functional MRI studies of insight-oriented problem solving and mind wandering have revealed that both involve enhanced activity of the anterior cingulate cortex, the same general region that is first activated when our behavior changes in response to altered reward (Proc. Natl. Acad. Sci. U.S.A. 2009;106:8719-24). The anterior cingulate is part of a “default network,” a series of brain structures that, paradoxically, become maximally active when we are at rest. When individuals are unaware of their mind wandering (and so least aware of internal and external distractions), working memory regions (including the dorsolateral prefrontal cortices) increase their activity together with the default network. This possibly provides a neural substrate for spontaneous “aha” insights that arise during periods of mind wandering.
The prefrontal cortex is a multimodal region where information about bodily states, surrounding circumstances, and semantic knowledge converge. All this information is used to develop an appropriate plan of action. Prefrontal neurons project to the locus ceruleus, the origin of cortical noradrenergic projections that influence our level of alertness and attention. Norepinephrine released from the locus ceruleus facilitates the transmission of incoming sensory signals, making it more likely that we will detect, attend to, and be influenced by environmental sensory stimuli. Parietal sensory association cortices also contain neurons with working memory type properties, so that multiple sources of perceptual information can be held “online” while a plan is being formulated (J. Neurosci. 2002;22:8720-5).
These sensory association cortices perhaps contribute perception and mental imagery to the formulated plan and constitute a working memory network designed for strategic thought. The potential rewarding and aversive values of a stimulus influence prefrontal neuronal activity, as well as other stages of the perceptual and planning networks, and thereby affect what we perceptually notice and choose to contemplate in the strategic planning process (Curr. Opin. Neurobiol. 2004;14:139-47). We are more likely to attend to higher-reward and higher-risk stimuli than to those with little potential consequence.
Next month we will consider what happens when we execute a plan of action.
The essence of science is the formulation and execution of strategies designed to answer questions, and in this month's “Neuroscience Today, Neurology Tomorrow” (page 8), we are given considerable insight into the sequential neurophysiological nature of the intention to act (formulation) and the act itself (execution). Whether the question is how to prevent Alzheimer's disease, use genetic tests effectively, or make Medicare sustainable, we first formulate a plan that we will then carry out. Therefore, our April issue provides a wonderful context in which to continue our discussion of creativity by turning to the step of strategic formulation, the first step in action.
Howard Gardner, Ph.D., in his 1983 book, “Frames of Mind,” first proposed the theory of multiple intelligences. He derived a set of six (at that time) noncoinciding intellectual competencies from rigorous and overlapping neurological, psychological, and developmental observations. The original six intelligences included linguistic, musical, logical-mathematical, spatial, bodily-kinesthetic, and personal intelligence. According to Gardner, any form of intelligence has a creative aspect in which a desired goal is envisioned and a plan for attaining it is formulated and then executed (What is intelligence? in “Frames of Mind” [New York: Harper Collins, 1983, pp. 59-70]).
Translating a creative vision, an idea, into a creative achievement requires a strategic plan, as illustrated by the scientist writing a grant or the architect drafting blueprints. I have an idea that age-related memory decline may be a manifestation of subclinical Alzheimer's disease, but formulating an experiment to test that idea is a nontrivial subsequent step. The generation of ideas and the formulation of a strategic action plan to realize those ideas are not wholly separate processes because both require mental imagery, but they might be likened to Mihály Csíkszentmihályi's model of creativity where the idea represents a low-threshold, low-effort thought while the plan is a more effortful and deliberate formulation to give the idea “legs” (“Creativity: Flow and the Psychology of Discovery and Invention” [New York: Harper Collins, 1996]).
Strategic planning requires the ability to perceive and weigh the importance of multiple pieces of information that accommodate the needs of the individual to the circumstances of the situation. Working memory, the cognitive process that underlies multitasking, involves holding all relevant pieces of information in our conscious thought while we generate an action plan. Physiologically, dorsolateral prefrontal neurons become active during the period when information must be retained before a response can be enacted (Curr. Opin. Neurobiol. 2004;14:163-8). The activation of neurons in this region may reflect holding online the relevant information itself, or another aspect such as its intended future use in the action plan. We perceive various stimuli around us, and learn to associate certain ones with specific outcomes (for example, the sound of a dinner bell means dinner is served). Such stimulus-outcome relationships may be simple (single and immediately paired) or complex and removed from the salient outcome by multiple steps, but all act through activation of the same reward systems we have discussed.
But what about when a solution arrives in an “aha moment,” a sudden burst of insight that does not seem to derive from strategic planning? These moments, possibly analogous to the unintended, spontaneous generation of a mental image, are unexpected and generally do not occur during a time of deliberate conscious (or externally provoked) analysis (Trends Cog. Sci. 2005;9:322-8). Mind wandering, a state of spontaneous thought that is not deliberate and that occurs when there is a lull in external demands, such as in the shower, provides fertile ground for aha moments. Sudden insights that arise during states of mind wandering can be quite relevant solutions for unfinished problems, as if the working memory state of systematic analysis had been temporarily paused and then suddenly restarted, bringing an important missing piece of the puzzle into conscious awareness.
Functional MRI studies of insight-oriented problem solving and mind wandering have revealed that both involve enhanced activity of the anterior cingulate cortex, the same general region that is first activated when our behavior changes in response to altered reward (Proc. Natl. Acad. Sci. U.S.A. 2009;106:8719-24). The anterior cingulate is part of a “default network,” a series of brain structures that, paradoxically, become maximally active when we are at rest. When individuals are unaware of their mind wandering (and so least aware of internal and external distractions), working memory regions (including the dorsolateral prefrontal cortices) increase their activity together with the default network. This possibly provides a neural substrate for spontaneous “aha” insights that arise during periods of mind wandering.
The prefrontal cortex is a multimodal region where information about bodily states, surrounding circumstances, and semantic knowledge converge. All this information is used to develop an appropriate plan of action. Prefrontal neurons project to the locus ceruleus, the origin of cortical noradrenergic projections that influence our level of alertness and attention. Norepinephrine released from the locus ceruleus facilitates the transmission of incoming sensory signals, making it more likely that we will detect, attend to, and be influenced by environmental sensory stimuli. Parietal sensory association cortices also contain neurons with working memory type properties, so that multiple sources of perceptual information can be held “online” while a plan is being formulated (J. Neurosci. 2002;22:8720-5).
These sensory association cortices perhaps contribute perception and mental imagery to the formulated plan and constitute a working memory network designed for strategic thought. The potential rewarding and aversive values of a stimulus influence prefrontal neuronal activity, as well as other stages of the perceptual and planning networks, and thereby affect what we perceptually notice and choose to contemplate in the strategic planning process (Curr. Opin. Neurobiol. 2004;14:139-47). We are more likely to attend to higher-reward and higher-risk stimuli than to those with little potential consequence.
Next month we will consider what happens when we execute a plan of action.
Turning Perceptions and Mental Images Into Ideas
This is the third chapter in our creativity story. Creativity is an intentional process in which we try to change what is into what should be (as is well illustrated by the research described in each of our issues). In February, we considered the importance of motivation, and this month we turn to perception.
We perceive what is before us, but much of what we perceive is only the part that our mind is prepared to perceive. In 1851, Henry David Thoreau noted that astronomers were better served in their quest to define planets, galaxies, and other heavenly phenomena by insightful and experienced perception than by the power of their telescope. Perception is the critical bidirectional interface between external and internal reality, between the world around us and our mental image of it.
Motivationally relevant stimuli drive our behavior. We perceive an existing state and imagine a desired state. Some desired states are conceptually simple and based upon the restoration of a biological set point such as hunger and thirst. Those we generally assign to the realm of creative behavior are less directly linked to a biological set point, yet are still motivated by the same reward systems and achieve their own form of satiety. Creative behavior requires that we perceive what is and imagine a more rewarding what should be.
Each perceptual experience is a unique neurophysiological event generated by the activation of specific neuronal pathways distributed across primary sensory, association, and paralimbic cortices. These collectively constitute our mental image of the outside world (Brain 1998;121:1013–52) and our synchronously experienced inner bodily state (“The Feeling of What Happens: Body, Emotion and the Making of Consciousness” [New York: Harcourt Brace, 1999]).
Each experience activates a specific group of neuronal pathways, resulting in a unique pattern of synaptic facilitation. Synaptic facilitation leaves a lasting physiological trace of the image in our mind, digitized and distributed across all the cortical regions that contributed to the synthesis of the percept. The next time we see the “same thing,” whether it is a familiar face or a shaded contour, synaptic pathways previously facilitated by prior experience of that “same thing” process the new real-time input more quickly, more automatically, and with embellishments of coactivated details from previous similar experiences. By tapping into one piece of a facilitated pathway that constitutes a past perceptual experience, we may be reminded of parts of that past experience, thus contributing to our current experience.
Mental imagery arises from these retained, synaptically facilitated patterns of past perceptions. Conjured mental images may approximate an original percept, or be abstracted by combining select details from multiple experiences, as the philosopher David Hume observed nearly 300 years ago. Mental imagery tasks activate some of the same sensory regions as actual perception (Brain Res. Cogn. Brain Res. 2004;20:226–41), and damage to these brain regions results in both perceptual and mental imagery deficits (Brain 1997;120:217–28). Yet, as Hume observed, “these faculties may mimic or copy the perceptions of the senses; but they never entirely reach the force and vivacity of the original sentiment” (Great Books of the Western World, Vol. 35, “An Enquiry Concerning Human Understanding” [Chicago: Encyclopedia Britannica, 1952, pp. 445–509]).
PET and functional MRI activation patterns are similar for perception and mental imagery, but they are not identical. The reduced clarity and vivacity of mental images compared with perception may reflect a reduced role for primary visual regions in mental imagery, different neuronal subpopulations for each, or another explanation (Psychol. Bull. 2003;129:723–46).
Mirror neurons, initially described in monkeys, encode a form of motor imagery reflecting intention rather than an actual movement. Evidence for mirror neurons in humans comes from several sources, including PET scans of people observing other people imitating their actions, electrical recordings of brain activity in epilepsy patients, and the effects of imagined and observed imitated behavior on the magnetic excitability of the brain (Exp. Brain Res. 1996;111:246–52). An implication of the ability to recognize movement patterns is our ability to generate signals that are understood by the sender and the receiver. If I wave my hand in a way that you recognize, then we both understand I am waving hello.
In a similar way, a shared symbol system based upon sound may have contributed to the evolution of language (Nat. Rev. Neurosci. 2001;2:661–70), which in turn has allowed humans to pass on knowledge from generation to generation.
Imagining what another person is thinking or feeling is another type of mental imagery, called theory of mind. The role of sensory and motor imagery substrates (including mirror neurons) in theory of mind is debated (Trends Cogn. Sci. 1998;2:493–501), but theory of mind is nonetheless important for creative behavior, because if I imagine a course of action that will impact others, it will benefit me to know how it might make them feel.
We can even combine different modalities: a dragon that meows or a fish named Nemo that talks. What we envision draws from the repository of what we have stored, but what we choose to imagine depends on prefrontally mediated attentional systems that, in turn, are motivated by our internal state, our perceived needs, the state of the world around us, our own abilities, and other factors. The relative reward of different contingencies depends on our state of need so that conjured images have a reward value within the context of present circumstances. Our prefrontal attentional network directs our sensory regions to conjure images relevant to our needs (Cereb. Cortex 2001;11:260–6), which allows us to plan a course of action to create what should be. How we formulate and execute that plan will be our next consideration.
Motivation: The Driving Force in All We Do
In our February issue of
Creativity requires motivation; it does not happen passively. Our lives begin with biologic appetitive and aversive drives, such as the need to feed or avoid the cold. They are the roots of motivation. In the 1950s, James Olds, Ph.D., showed that appetitive and aversive behaviors were controlled by distinct brain regions (J. Comp. Physiol. Psychol. 1954;47:419-27). He implanted electrodes into rat brains and placed the rats in a cage containing a foot switch that, when pressed, delivered an electrical shock to the brain region in which the electrode was implanted. By varying the location of the electrodes and the conditions under which rats were tested, Dr. Olds found that some regions and situations led to self-stimulation rates as high as 7,000 shocks per hour, and others led the rats to avoid self-stimulation. The size of the shock, fatigue, hunger, pain, hormonal levels, and drugs all influenced response rates.
Three brain regions, or systems, involved in motivation are the hypothalamus; the mesolimbic dopaminergic system (comprising the ventral tegmental area [VTA], the nucleus accumbens/ventral striatum, and the orbitofrontal cortex [OFC], all linked together by the medial forebrain bundle); and the amygdala. The hypothalamus maintains set points for different aspects of the “internal milieu,” such as body weight and fluid balance. As our body strays from a set point, we are driven by hunger or thirst to alter our behavior and restore the set point. Returning our body to an established set point is powerfully rewarding. Within the mesolimbic system, VTA neurons generate a reward signal by comparing what occurs with what was expected (J. Neurophysiol. 1998;80:1-27). VTA dopaminergic reward neurons are most strongly activated by rewarding events that are better than expected.
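The comparison between expected and received reward that drives these neurons can be sketched with the standard Rescorla-Wagner prediction-error update from the learning literature; the learning rate and reward values below are illustrative assumptions, not figures from the article or the cited study:

```python
# Illustrative sketch of a reward-prediction-error signal (standard
# Rescorla-Wagner update; parameter values are assumptions, not data).

def update_expectation(expected, received, learning_rate=0.1):
    """Return (prediction_error, new_expectation) after one trial."""
    prediction_error = received - expected   # > 0: better than expected
    new_expectation = expected + learning_rate * prediction_error
    return prediction_error, new_expectation

expected = 0.0
for trial in range(20):                      # the same reward, repeatedly
    error, expected = update_expectation(expected, received=1.0)
# Once the reward is fully predicted, the error shrinks toward zero,
# mirroring the observation that dopaminergic reward neurons respond
# most strongly to rewards that are better than expected.
```

As the reward becomes predictable, the error term (and with it the modeled dopaminergic response) decays toward zero even though the reward itself is unchanged.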
The basolateral amygdala forms associations between sensory cues and rewarding or aversive stimuli, and acts as a “fear center” (J. Neurosci. 1995;15:5879-91). It is interconnected with sensory cortices and the hippocampus, forming associations with emotionally salient aspects of a stimulus that influence our perception and memory encoding of the stimulus (Curr. Opin. Neurobiol. 2004;14:198-202). Reward centers also modulate activity of the hypothalamus and locus ceruleus, thereby influencing endocrine and noradrenergic feedback to cortical regions.
The interplay of appetitive and aversive signals defines a predicted, most rewarding (or least punishing) goal. Neurologists typically awaken early and perform a variety of duties over a long day (plus hospital call). Appetitive signals include helping patients, research discoveries, educating students, pay, and benefits. Aversive signals include the stresses of sick or otherwise difficult patients, research failures, underperforming students, and long hours. On balance, however, the net result is a greater feeling of reward than punishment, so we keep doing it. But our behavior will change if discrepancies arise between the predicted and realized reward. If my health coverage were discontinued or my pay cut in half, I would seek a different position. The activity of anterior cingulate neurons – the earliest anatomical stage of action planning and movement – is influenced by reward signals from the orbitofrontal cortex. If a goal is made less rewarding, OFC neuronal activity declines, and with it OFC stimulation of anterior cingulate neurons. The less rewarding activity stops and is replaced by a more rewarding one. Immediately preceding the change in behavior, specific neurons in the anterior cingulate fire, marking the first step in the altered response to the reduced reward (Science 1998;282:1335-8; Proc. Natl. Acad. Sci. U.S.A. 2002;99:523-8).
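The behavioral switch described above (stay with an activity while its net reward is highest, abandon it when the realized reward drops) can be caricatured as a simple comparison; the activities and reward values below are hypothetical:

```python
# Hypothetical sketch: behavior follows whichever option carries the
# greatest net reward, and switches when a reward value is reduced.

def chosen_activity(net_rewards):
    """Pick the option with the highest net reward (reward minus cost)."""
    return max(net_rewards, key=net_rewards.get)

net_rewards = {"current position": 5.0, "different position": 3.0}
print(chosen_activity(net_rewards))       # current position

net_rewards["current position"] -= 4.0    # pay cut: realized reward falls
print(chosen_activity(net_rewards))       # different position
```

Nothing about the alternative changed; only the realized reward of the current choice fell below the predicted value, which is enough to flip the comparison.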
Our reward system has many targets defining our wants. These include biologic stimuli such as food when we are hungry; aesthetic stimuli such as humor, paintings, music, and sports cars; and money (Neuron 2001;30:619-39). Reward centers also are activated by socially relevant behaviors, such as the decision to enact justice-related punishment and social comparisons in which we may perceive ourselves as better off than our neighbor. The developing relationship between two people learning the degree to which they can trust one another also causes changes in reward center activity (detected by fMRI) in an interpersonally synchronized fashion (Science 2005;308:78-83).
Aversive stimuli, such as pain or the loss of money, activate similar brain regions, although the specific areas differ from those activated by reward (Nat. Neurosci. 2001;4:95-102). Motivation is also attenuated by diminished reward and by nonescalating, static reward. We quickly accommodate to any improvement in our life circumstances (for example, a higher income) so that initially heightened satisfaction rapidly recalibrates to baseline (the hedonic treadmill) (Science 2004;306:1776-80).
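The hedonic treadmill lends itself to a similar toy model: a step improvement produces a satisfaction spike that then decays back toward baseline. The exponential decay form and its rate are illustrative assumptions, not data from the cited study:

```python
# Toy model of hedonic adaptation: a one-time improvement boosts
# satisfaction, which then recalibrates toward baseline each period.
# The decay form and rate are assumptions for illustration only.

def satisfaction_over_time(periods, boost=1.0, decay=0.5, baseline=0.0):
    """Satisfaction trajectory after a single improvement at period 0."""
    level = baseline + boost
    history = []
    for _ in range(periods):
        history.append(level)
        level = baseline + (level - baseline) * (1 - decay)
    return history

levels = satisfaction_over_time(10)
# levels[0] is the initial spike; later values approach baseline again.
```

The trajectory captures the qualitative claim only: the circumstance (the higher income) persists, but the motivational signal it generates does not.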
These examples illustrate that there is a final common reward pathway. All appetitive and aversive stimuli are translated into a common biologically relevant motivational signal that tells us whether something will enhance or diminish our survival or quality of life. The perceived difference in reward value between what is and what should be generates the motivational voltage that drives creativity. In our next issue, we will consider perception and mental imagery as the steps that create, in our mind's eye and imagination, what is and what should be, or the generation of ideas.
In our February issue of
Creativity requires motivation; it does not happen passively. Our lives begin with biologic appetitive and aversive drives, such as the need to feed or avoid the cold. They are the roots of motivation. In the 1950s, James Olds, Ph.D., showed that appetitive and aversive behaviors were controlled by distinct brain regions (J. Comp. Physiol. Psychol. 1954;47:419-27). He implanted electrodes into rat brains and placed the rats in a cage containing a foot switch that, when pressed, delivered an electrical shock to the brain region in which the electrode was implanted. By varying the location of the electrodes and the conditions under which rats were tested, Dr. Olds found that some regions and situations led to self-stimulation rates as high as 7,000 shocks per hour, and others led the rats to avoid self-stimulation. The size of the shock, fatigue, hunger, pain, hormonal levels, and drugs all influenced response rates.
Creativity requires motivation; it does not happen passively. Our lives begin with biologic appetitive and aversive drives, such as the need to feed or avoid the cold. They are the roots of motivation. In the 1950s, James Olds, Ph.D., showed that appetitive and aversive behaviors were controlled by distinct brain regions (J. Comp. Physiol. Psychol. 1954;47:419-27). He implanted electrodes into rat brains and placed the rats in a cage containing a foot switch that, when pressed, delivered an electrical shock to the brain region in which the electrode was implanted. By varying the location of the electrodes and the conditions under which rats were tested, Dr. Olds found that some regions and situations led to self-stimulation rates as high as 7,000 shocks per hour, and others led the rats to avoid self-stimulation. The size of the shock, fatigue, hunger, pain, hormonal levels, and drugs all influenced response rates.
Three brain regions, or systems, involved in motivation are the hypothalamus; the mesolimbic dopaminergic system (comprising the ventral tegmental area [VTA], the nucleus accumbens/ventral striatum, and the orbitofrontal cortex [OFC], all linked together by the median forebrain bundle); and the amygdala. The hypothalamus maintains set points for different aspects of the “internal milieu,” such as body weight and fluid balance. As our body strays from a set point, we are driven by hunger or thirst to alter our behavior and restore the set point. Returning our body to an established set point is powerfully rewarding. Within the mesolimbic system, VTA neurons generate a reward signal by comparing what occurs with what was expected (J. Neurophysiol. 1998;80:1-27). VTA dopaminergic reward neurons are most strongly activated by rewarding events that are better than expected.
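The comparison of realized versus expected reward that VTA neurons are thought to compute can be sketched as a simple delta-rule update, in which the prediction error shrinks as a reward becomes fully expected. This is an illustrative model only; the learning rate and reward values below are arbitrary and are not taken from the cited study.

```python
# Illustrative delta-rule sketch of a reward-prediction-error signal,
# loosely modeling the VTA comparison of realized vs. expected reward.
# The learning rate and reward values are arbitrary, for illustration only.

def update_expectation(expected, received, learning_rate=0.3):
    """Return (prediction_error, new_expectation) after one reward delivery."""
    prediction_error = received - expected  # better than expected -> positive
    new_expectation = expected + learning_rate * prediction_error
    return prediction_error, new_expectation

expected = 0.0
errors = []
for reward in [1.0, 1.0, 1.0, 1.0]:  # a repeated, fully predictable reward
    error, expected = update_expectation(expected, reward)
    errors.append(error)

# With repetition the prediction error decays toward zero: the "dopamine
# signal" fades once the reward is fully expected, consistent with the
# strongest activation occurring for better-than-expected events.
```

Under this kind of model, an unexpected reward produces a large positive error, while a fully predicted reward produces almost none, which is one way to read the finding that VTA reward neurons respond most strongly to events that are better than expected.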
The basolateral amygdala forms associations between sensory cues and rewarding or aversive stimuli, and acts as a “fear center” (J. Neurosci. 1995;15:5879-91). It is interconnected with sensory cortices and the hippocampus, forming associations with emotionally salient aspects of a stimulus that influence our perception and memory encoding of the stimulus (Curr. Opin. Neurobiol. 2004;14:198-202). Reward centers also modulate activity of the hypothalamus and locus ceruleus, thereby influencing endocrine and noradrenergic feedback to cortical regions.
The interplay of appetitive and aversive signals defines a predicted, most rewarding (or least punishing) goal. Neurologists typically awaken early and perform a variety of duties over a long day (plus hospital call). Some appetitive signals include helping patients, research discoveries, educating students, pay, and benefits. Some aversive signals are the stresses of sick or otherwise difficult patients, research failures, underperforming students, and long hours. On balance, however, the net result is a greater feeling of reward than punishment, so we keep doing it. But our behavior will change if discrepancies arise between the predicted and realized reward. If my health coverage were discontinued or my pay cut in half, I would seek a different position. The activity of anterior cingulate neurons – the earliest anatomical stage of action planning and movement – is influenced by reward signals from the orbitofrontal cortex. If a goal is made less rewarding, OFC neuronal activity declines, as does OFC stimulation of anterior cingulate neurons. The less rewarding activity stops and is replaced by a more rewarding one. Immediately preceding the change in behavior, specific neurons in the anterior cingulate fire, marking the first step that results in the altered response to the reduced reward (Science 1998;282:1335-8; Proc. Natl. Acad. Sci. U.S.A. 2002;99:523-8).
Our reward system has many targets defining our wants. These include biologic stimuli such as food when we are hungry; aesthetic stimuli such as humor, paintings, music, and sports cars; and money (Neuron 2001;30:619-39). Reward centers also are activated by socially relevant behaviors, such as the decision to enact justice-related punishment and social comparisons in which we may perceive ourselves as better off than our neighbor. The developing relationship between two people learning the degree to which they can trust one another also causes changes in reward center activity (detected by fMRI) in an interpersonally synchronized fashion (Science 2005;308:78-83).
Aversive stimuli, such as pain or the loss of money, activate similar brain regions, although specific areas differ from those activated by reward (Nat. Neurosci. 2001;4:95-102). Motivation is also attenuated by diminished reward, and by nonescalating, static reward. We quickly accommodate to any improvement in our life circumstances (for example, a higher income) so that initially heightened satisfaction rapidly recalibrates to baseline (the hedonic treadmill) (Science 2004;306:1776-80).
These examples illustrate that there is a final common reward pathway. All appetitive and aversive stimuli are translated into a common biologically relevant motivational signal that tells us whether something will enhance or diminish our survival or quality of life. The perceived difference in reward value between what is and what should be generates the motivational voltage that drives creativity. In our next issue, we will consider perception and mental imagery as the steps that create, in our mind's eye and imagination, what is and what should be, or the generation of ideas.
Neuroscience Today, Neurology Tomorrow
Neurology is the center of what might more properly be labeled “the clinical neurosciences,” a term that emphasizes the importance of technological innovation and scientific discovery, and recognizes that the application of neuroscientific discoveries occurs not only in neurology, but in neurosurgery, neuroradiology, and other fields as well. Neurologists at the bedside and at the bench are at the forefront of innovation and discovery, translating neuroscience into clinical practice. Our patients require this. In our new column, “Neuroscience Today, Neurology Tomorrow,” we highlight discoveries that promise to advance the clinical frontier. We invite you to send your reactions/responses to this column to clinicalneurologynews@elsevier.com.
Granulocyte-Colony-Stimulating Factor in Repair of Ischemic Stroke
Granulocyte-colony-stimulating factor and epidermal growth factor plus erythropoietin show promise for treating ischemic stroke, according to two separate studies presented at the annual meeting of the Society for Neuroscience.
In experiments with rats that underwent a transient occlusion of the middle cerebral artery, the volume of infarct significantly decreased when granulocyte-colony-stimulating factor (G-CSF) was administered either 2 or 4 hours after the onset of ischemia, reported Rico Laage, Ph.D., of Axaron Bioscience AG, Heidelberg, Germany. Similar results occurred when G-CSF was administered 1 hour after the onset of ischemia caused by transient occlusion of the common carotid artery and distal middle cerebral artery. G-CSF is normally used to treat neutropenic conditions.
In vitro studies have shown that G-CSF reduces apoptotic activity in human neurons and that the G-CSF receptor is expressed in the human brain and is upregulated in infarcted areas shortly after stroke in humans.
In Germany, Axaron is conducting a multicenter, randomized, double-blind, placebo-controlled phase II trial of G-CSF in patients who suffered an acute ischemic stroke in the region of the middle cerebral artery within the last 12 hours and are not receiving tissue plasminogen activator. Following a 3-day intravenous infusion of G-CSF, investigators measure thromboembolic complications up to discharge or day 4 and infection or other serious adverse events after 4 weeks. Neurological outcome is measured at 4 and 12 weeks after treatment while the growth of the ischemic lesion is measured from baseline to 3 months after treatment with MRI.
In another study, Trudi Stickland, at the University of Calgary, and her associates found that epidermal growth factor (EGF) and erythropoietin (EPO) stimulated the proliferation and differentiation of endogenous neural stem cells and repaired infarcted stroke regions in rats. The researchers induced ischemic strokes in the primary motor cortex, affecting the forepaw contralateral to the lesioned hemisphere.
Intracerebroventricular infusions of EGF and EPO significantly increased gross motor functioning of the affected forepaw when rats spontaneously explored a cylindrical cage and fine motor functioning in a trained task that involved reaching through a narrow slot to grasp a food pellet.
Rats that received only serum albumin recovered very little function. After these behavioral tests (50 days after the stroke), the researchers found that stem cells had migrated to the lesion site in the motor cortex of rats that received EGF and EPO and initiated regrowth of cortical tissue. “We're not sure that these new neurons in the lesion site are necessarily functional,” Ms. Stickland said. Future experiments will determine if the new tissue is establishing new neural connections or is secreting agents that help the surrounding tissue take on new functions.
Dr. Caselli's comment: Neurologists have floundered for decades with anticoagulant therapies and have emphasized primary prevention in populations that already have advanced risk. The advent of IV recombinant tissue plasminogen activator (rTPA) ushered in the era of immediate, definitive intervention. For the first time, minutes mattered. The notion of “brain attack,” if not exactly born, finally let out a long-awaited cry that the world heard. Neurology has a long-standing reputation of being more of a diagnostic than a therapeutic specialty, leading many doctors and patients to have a nihilistic view of neurologic therapeutics. Invasive cerebrovascular techniques such as intraarterial TPA, stent coils, and other devices are changing that perception. In the two studies described above, different forms of growth factors and apoptosis inhibitors reduced stroke size when given immediately in rats.
While an intracerebroventricular avenue will not suffice clinically, especially for patients who may be eligible for thrombolysis, the notion of treating a patient with substances that immediately inhibit cell death and promote regeneration is a logical and potentially powerful next step. While more work is clearly needed, it is encouraging that some substances, such as G-CSF, have reached the stage of human trials.
Parkinson's Immunotherapy Targets α-Synuclein Aggregates
Immunotherapy that targets aggregates of α-synuclein protein in dopamine neurons points toward a potential pathway for treating Parkinson's disease, Dr. Eliezer Masliah reported at the annual meeting of the Society for Neuroscience.
Aggregates of oligomeric protofibrils of α-synuclein, which is highly concentrated in presynaptic boutons and plays an important role in neurotransmitter release, may contribute to the synaptic damage and degeneration found in disorders with Lewy bodies, such as Parkinson's disease. “There is a clear relationship between the distribution and aggregation of α-synuclein in cortical and subcortical regions and the patterns of clinical manifestations,” said Dr. Masliah of the University of California, San Diego.
He and his associates used human recombinant α-synuclein to vaccinate transgenic mice that express human α-synuclein and show the characteristic accumulation of abnormal α-synuclein oligomers in plasma and synaptic membranes and motor deficits of Lewy body disease and Parkinson's disease. The animals developed relatively high titers of antibodies to α-synuclein, which showed high affinity to α-synuclein aggregates in immunoblot assays and tissue sections. The mice that produced high-affinity antibodies accumulated fewer α-synuclein aggregates, which was associated with reduced neurodegeneration. The α-synuclein aggregates appear to be degraded via lysosomal pathways. The results suggest that α-synuclein antibodies might directly interact with oligomers of α-synuclein at the synaptic membrane or they might bind to receptors, resulting in the endocytosis of the antibodies with the α-synuclein complex, Dr. Masliah said.
No effects were seen on endogenous murine α-synuclein or on other α-synuclein-related markers that are present in the synapse, such as β-synuclein. No severe inflammatory effects were observed in the mice during the experiments, but because of the risk of using active immunization in humans, Dr. Masliah and his colleagues have been developing a passive immunization protocol for future experiments.
Dr. Caselli's comment: The strategy employed by Masliah and colleagues is reminiscent of the amyloid vaccination strategy also initially tested in a transgenic mouse model, and then extended into human trials. Masliah et al. are wisely taking the next step based on the Alzheimer's vaccination experience in which 5% of vaccine recipients developed an autoimmune meningoencephalitis and cerebral vasculitis that resulted in clinical deterioration and prompted premature termination of the trial. A passive vaccination strategy is now underway for Alzheimer's disease, and given the groundbreaking work of Masliah and colleagues, we may anticipate a similar trial for Parkinson's disease in the future. This is a powerful new approach, but clinical efficacy and safety still await further experience.
Protein-Stimulated Neural Stem Cell Repair in Parkinson's Model
The protein compound sNN0031 restored nearly all function and normalized dopamine transporter levels in a rat model of Parkinson's disease for at least 10 weeks by activating endogenous stem cell repair in the striatum, Olof Zachrisson, Ph.D., reported at the annual meeting of the Society for Neuroscience.
To mimic the effects of Parkinson's disease, Dr. Zachrisson of NeuroNova, Stockholm, and his associates injected 6-hydroxydopamine unilaterally into the right median forebrain bundle of rats, which reduced dopamine transporter binding by 75% in the striatum. When the lesioned rats were given amphetamine, they rotated toward their lesioned side, a sign of an imbalanced dopamine system.
Five weeks later, lesioned rats and those that were given a sham injection received an intracerebroventricular infusion of sNN0031 or a vehicle for 2 weeks. Afterward, lesioned rats on sNN0031 showed significantly more improvement in rotational behavior than rats given vehicle; improvement continued up to 10 weeks after the drug had been administered. Lesioned rats on sNN0031 had a significantly improved level of striatal dopamine transporter binding 10 weeks after treatment, Dr. Zachrisson said.
sNN0031 induced endogenous stem cell proliferation in the ventricular wall adjacent to the medial striatum and neurogenesis in the striatum; the investigators have not yet determined if the new neurons are producing dopamine. At a press conference, one of the investigators, Dr. Anders Haegerstrand, also of NeuroNova, said that sNN0031 is already approved by the Food and Drug Administration for a non-CNS indication, albeit in a different form and dose. For proprietary reasons, he would not say what disease the drug normally treats or what its composition is.
Dr. Caselli's comment: That an existing compound has demonstrated neuroregenerative potential in this rat model is encouraging and warrants further exploration, although there is also some reason for caution. First, the lesioned rats in this study are not a model for progressive degenerative parkinsonism, but rather for a static basal ganglia lesion. Second, intracerebroventricular delivery of the agent is not an ideal access point for clinical therapy, especially after the failed adrenocortical transplantation trials. Third, much more work needs to be done to elucidate how the behavioral changes in the rats came about. Fourth, the long-term effects of enhancing CNS stem cell activity (or potentially encouraging other cellular populations as well) need to be determined. For example, is there a long-term risk of brain tumors? Fifth, given that this compound is FDA approved, much more information must be available about its clinical properties, side effects, and appropriateness for the intended patient population. These and other caveats aside, the results of this study are exciting for patients and researchers alike.
Neurology is the center of what might more properly be labeled “the clinical neurosciences,” a term that emphasizes the importance of technological innovation and scientific discovery, and recognizes that the application of neuroscientific discoveries occurs not only in neurology, but in neurosurgery, neuroradiology, and other fields as well. Neurologists at the bedside and at the bench are at the forefront of innovation and discovery, translating neuroscience into clinical practice. Our patients require this. In our new column, “Neuroscience Today, Neurology Tomorrow” we highlight discoveries that promise to advance the clinical frontier. We invite you to send your reactions/responses to this column to www.clinicalneurologynews@elsevier.com
Granulocyte-Colony-Stimulating Factor in Repair of Ischemic Stroke
Granulocyte-colony-stimulating factor and epidermal growth factor plus erythropoietin show promise for treating ischemic stroke, according to two separate studies presented at the annual meeting of the Society for Neuroscience.
In experiments with rats that underwent a transient occlusion of the middle cerebral artery, the volume of infarct significantly decreased when granulocyte-colony-stimulating factor (G-CSF) was administered either 2 or 4 hours after the onset of ischemia, reported Rico Laage, Ph.D., of Axaron Bioscience AG, Heidelburg, Germany. Similar results occurred when G-CSF was administered 1 hour after the onset of ischemia caused by transient occlusion of the common carotid artery and distal middle cerebral artery. G-CSF is normally used to treat neutropenic conditions.
In vitro studies have shown that G-CSF reduces apoptotic activity in human neurons and that the G-CSF receptor is expressed in the human brain and is upregulated in infarcted areas shortly after stroke in humans.
In Germany, Axaron is conducting a multicenter, randomized, double-blind, placebo-controlled phase II trial of G-CSF in patients who suffered an acute ischemic stroke in the region of the middle cerebral artery within the last 12 hours and are not receiving tissue plasminogen activator. Following a 3-day intravenous infusion of G-CSF, investigators measure thromboembolic complications up to discharge or day 4 and infection or other serious adverse events after 4 weeks. Neurological outcome is measured at 4 and 12 weeks after treatment while the growth of the ischemic lesion is measured from baseline to 3 months after treatment with MRI.
In another study, Trudi Stickland, at the University of Calgary, and her associates found that epidermal growth factor (EGF) and erythropoietin (EPO) stimulated the proliferation and differentiation of endogenous neural stem cells and repaired infarcted stroke regions in rats. The researchers induced ischemic strokes in the primary motor cortex that affected the forepaw contralateral to the side of the brain that was lesioned.
Intracerebroventricular infusions of EGF and EPO significantly increased gross motor functioning of the affected forepaw when rats spontaneously explored a cylindrical cage and fine motor functioning in a trained task that involved reaching through a narrow slot to grasp a food pellet.
Rats who received only serum albumin recovered very little function. After these behavioral tests (50 days after the stroke), the researchers found that stem cells had migrated to the lesion site in the motor cortex of rats that received EGF and EPO and initiated regrowth of cortical tissue. “We're not sure that these new neurons in the lesion site are necessarily functional,” Ms. Stickland said. Future experiments will determine if the new tissue is establishing new neural connections or is secreting agents that help the surrounding tissue take on new functions.
Dr. Caselli's comment: Neurologists have floundered for decades with anticoagulant therapies and have emphasized primary prevention in populations that already have advanced risk. The advent of IV recombinant tissue plasminogen activator (rTPA) ushered in the era of immediate, definitive intervention. For the first time, minutes mattered. The notion of “brain attack,” if not exactly born, finally let out a long-awaited cry that the world heard. Neurology has a long-standing reputation of being more of a diagnostic than a therapeutic specialty, leading many doctors and patients to have a nihilistic view of neurologic therapeutics. Invasive cerebrovascular techniques such as intraarterial TPA, stent coils, and other devices are changing that perception. In the two studies described above, different forms of growth factors and apoptosis inhibitors reduced stroke size when given immediately in rats.
While an intracerebroventricular avenue will not suffice clinically, especially for patients who may be eligible for thrombolysis, the notion of treating a patient with substances that immediately inhibit cell death and promote regeneration is a logical and potentially powerful next step. While more work is clearly needed, it is encouraging that some substances, such as G-CSF have reached the stage of human trials.
Parkinson's Immunotherapy Targets α-Synuclein Aggregates
Immunotherapy that targets aggregates of α-synuclein protein in dopamine neurons points toward a potential pathway for treating Parkinson's disease, Dr. Eliezer Masliah reported at the annual meeting of the Society for Neuroscience.
Aggregates of oligomeric protofibrils of α-synuclein, which is highly concentrated in presynaptic boutons and plays an important role in neurotransmitter release, may contribute to the synaptic damage and degeneration found in disorders with Lewy bodies, such as Parkinson's disease. “There is a clear relationship between the distribution and aggregation of α-synuclein in cortical and subcortical regions and the patterns of clinical manifestations,” said Dr. Masliah of the University of California, San Diego.
He and his associates used human recombinant α-synuclein to vaccinate transgenic mice that express human α-synuclein and show the characteristic accumulation of abnormal α-synuclein oligomers in plasma and synaptic membranes and motor deficits of Lewy body disease and Parkinson's disease. The animals developed relatively high titers of antibodies to α-synuclein, which showed high affinity to α-synuclein aggregates in immunoblot assays and tissue sections. The mice that produced high-affinity antibodies accumulated fewer α-synuclein aggregates, which was associated with reduced neurodegeneration. The α-synuclein aggregates appear to be degraded via lysosomal pathways. The results suggest that α-synuclein antibodies might directly interact with oligomers of α-synuclein at the synaptic membrane or they might bind to receptors, resulting in the endocytosis of the antibodies with the α-synuclein complex, Dr. Masliah said.
No effects were seen on endogenous murine α-synuclein or on other α-synuclein-related markers that are present in the synapse, such as β-synuclein. No severe inflammatory effects were observed in the mice during the experiments, but because of the risk of using active immunization in humans, Dr. Masliah and his colleagues have been developing a passive immunization protocol for future experiments.
Dr. Caselli's comment: The strategy employed by Masliah and colleagues is reminiscent of the amyloid vaccination strategy also initially tested in a transgenic mouse model, and then extended into human trials. Masliah et al. are wisely taking the next step based on the Alzheimer's vaccination experience in which 5% of vaccine recipients developed an autoimmune meningoencephalitis and cerebral vasculitis that resulted in clinical deterioration and prompted premature termination of the trial. A passive vaccination strategy is now underway for Alzheimer's disease, and given the groundbreaking work of Masliah and colleagues, we may anticipate a similar trial for Parkinson's disease in the future. This is a powerful new approach, but clinical efficacy and safety still await further experience.
Protein-Stimulated Neural Stem Cell Repair in Parkinson's Model
The protein compound sNN0031 restored nearly all function and normalized dopamine transporter levels in a rat model of Parkinson's disease for at least 10 weeks by activating endogenous stem cell repair in the striatum, Olof Zachrisson, Ph.D., reported at the annual meeting of the Society for Neuroscience.
To mimic the effects of Parkinson's disease, Dr. Zachrisson of NeuroNova, Stockholm, and his associates injected 6-hydroxy dopamine unilaterally into the right median forebrain bundle of rats, which reduced dopamine transporter binding by 75% in the striatum. When the lesioned rats were given amphetamine, they rotated toward their lesioned side, a sign of an imbalanced dopamine system.
Five weeks later, lesioned rats and those that were given a sham injection received an intracerebroventricular infusion of sNN0031 or a vehicle for 2 weeks. Afterward, lesioned rats on sNN0031 showed significantly more improvement in rotational behavior than rats given vehicle; improvement continued up to 10 weeks after the drug had been administered. Lesioned rats on sNN0031 had a significantly improved level of striatal dopamine transporter binding 10 weeks after treatment, Dr. Zachrisson said.
sNN0031 induced endogenous stem cell proliferation in the ventricular wall adjacent to the medial striatum and neurogenesis in the striatum; the investigators have not yet determined if the new neurons are producing dopamine. At a press conference, one of the investigators, Dr. Anders Haegerstrand, also of NeuroNova, said that sNN0031 is already approved by the Food and Drug Administration for a non-CNS indication, albeit in a different form and dose. He would not say what disease the drug is normally used to treat or what its composition is for proprietary reasons.
Dr. Caselli's comment: That an existing compound has demonstrated neuroregenerative potential in this rat model is encouraging and warrants further exploration, although there is also some reason for caution. First, the lesioned rats in this study are not a model for progressive degenerative parkinsonism, but rather for a static basal ganglia lesion. Second, intracerebroventricular delivery of the agent is not an ideal access point for clinical therapy, especially after the failed adrenocortical transplantation trials. Third, much more work needs to be done to elucidate how the behavioral changes in the rats came about. Fourth, the long-term effects of enhancing CNS stem cell activity (or potentially encouraging other cellular populations as well) need to be determined. For example, is there a long-term risk of brain tumors? Fifth, given that this compound is FDA approved, much more information must be available about its clinical properties, side effects, and appropriateness for the intended patient population. These and other caveats aside, the results of this study are exciting for patients and researchers alike.
Neurology is the center of what might more properly be labeled “the clinical neurosciences,” a term that emphasizes the importance of technological innovation and scientific discovery, and recognizes that the application of neuroscientific discoveries occurs not only in neurology, but in neurosurgery, neuroradiology, and other fields as well. Neurologists at the bedside and at the bench are at the forefront of innovation and discovery, translating neuroscience into clinical practice. Our patients require this. In our new column, “Neuroscience Today, Neurology Tomorrow” we highlight discoveries that promise to advance the clinical frontier. We invite you to send your reactions/responses to this column to www.clinicalneurologynews@elsevier.com
Granulocyte-Colony-Stimulating Factor in Repair of Ischemic Stroke
Granulocyte-colony-stimulating factor and epidermal growth factor plus erythropoietin show promise for treating ischemic stroke, according to two separate studies presented at the annual meeting of the Society for Neuroscience.
In experiments with rats that underwent a transient occlusion of the middle cerebral artery, the volume of infarct significantly decreased when granulocyte-colony-stimulating factor (G-CSF) was administered either 2 or 4 hours after the onset of ischemia, reported Rico Laage, Ph.D., of Axaron Bioscience AG, Heidelburg, Germany. Similar results occurred when G-CSF was administered 1 hour after the onset of ischemia caused by transient occlusion of the common carotid artery and distal middle cerebral artery. G-CSF is normally used to treat neutropenic conditions.
In vitro studies have shown that G-CSF reduces apoptotic activity in human neurons and that the G-CSF receptor is expressed in the human brain and is upregulated in infarcted areas shortly after stroke in humans.
In Germany, Axaron is conducting a multicenter, randomized, double-blind, placebo-controlled phase II trial of G-CSF in patients who suffered an acute ischemic stroke in the region of the middle cerebral artery within the last 12 hours and are not receiving tissue plasminogen activator. Following a 3-day intravenous infusion of G-CSF, investigators measure thromboembolic complications up to discharge or day 4 and infection or other serious adverse events after 4 weeks. Neurological outcome is measured at 4 and 12 weeks after treatment while the growth of the ischemic lesion is measured from baseline to 3 months after treatment with MRI.
In another study, Trudi Stickland, at the University of Calgary, and her associates found that epidermal growth factor (EGF) and erythropoietin (EPO) stimulated the proliferation and differentiation of endogenous neural stem cells and repaired infarcted stroke regions in rats. The researchers induced ischemic strokes in the primary motor cortex that affected the forepaw contralateral to the side of the brain that was lesioned.
Intracerebroventricular infusions of EGF and EPO significantly increased gross motor functioning of the affected forepaw when rats spontaneously explored a cylindrical cage and fine motor functioning in a trained task that involved reaching through a narrow slot to grasp a food pellet.
Rats that received only serum albumin recovered very little function. After these behavioral tests (50 days after the stroke), the researchers found that stem cells had migrated to the lesion site in the motor cortex of rats that received EGF and EPO and had initiated regrowth of cortical tissue. “We're not sure that these new neurons in the lesion site are necessarily functional,” Ms. Stickland said. Future experiments will determine whether the new tissue is establishing new neural connections or is secreting agents that help the surrounding tissue take on new functions.
Dr. Caselli's comment: Neurologists have floundered for decades with anticoagulant therapies and have emphasized primary prevention in populations that already carry advanced risk. The advent of IV recombinant tissue plasminogen activator (rTPA) ushered in the era of immediate, definitive intervention. For the first time, minutes mattered. The notion of “brain attack,” if not exactly born, finally let out a long-awaited cry that the world heard. Neurology has a long-standing reputation of being more a diagnostic than a therapeutic specialty, leading many doctors and patients to take a nihilistic view of neurologic therapeutics. Invasive cerebrovascular techniques such as intraarterial TPA, stents, coils, and other devices are changing that perception. In the two studies described above, different forms of growth factors and apoptosis inhibitors reduced stroke size when given shortly after ischemia onset in rats.
While an intracerebroventricular route will not suffice clinically, especially for patients who may be eligible for thrombolysis, the notion of treating a patient with substances that immediately inhibit cell death and promote regeneration is a logical and potentially powerful next step. Although more work is clearly needed, it is encouraging that some substances, such as G-CSF, have reached the stage of human trials.
Parkinson's Immunotherapy Targets α-Synuclein Aggregates
Immunotherapy that targets aggregates of α-synuclein protein in dopamine neurons points toward a potential pathway for treating Parkinson's disease, Dr. Eliezer Masliah reported at the annual meeting of the Society for Neuroscience.
Aggregates of oligomeric protofibrils of α-synuclein, which is highly concentrated in presynaptic boutons and plays an important role in neurotransmitter release, may contribute to the synaptic damage and degeneration found in disorders with Lewy bodies, such as Parkinson's disease. “There is a clear relationship between the distribution and aggregation of α-synuclein in cortical and subcortical regions and the patterns of clinical manifestations,” said Dr. Masliah of the University of California, San Diego.
He and his associates used human recombinant α-synuclein to vaccinate transgenic mice that express human α-synuclein and show the characteristic accumulation of abnormal α-synuclein oligomers in plasma and synaptic membranes, as well as the motor deficits, of Lewy body disease and Parkinson's disease. The animals developed relatively high titers of antibodies to α-synuclein, which showed high affinity for α-synuclein aggregates in immunoblot assays and tissue sections. The mice that produced high-affinity antibodies accumulated fewer α-synuclein aggregates, which was associated with reduced neurodegeneration. The α-synuclein aggregates appear to be degraded via lysosomal pathways. The results suggest that α-synuclein antibodies might directly interact with oligomers of α-synuclein at the synaptic membrane, or they might bind to receptors, resulting in the endocytosis of the antibodies along with the α-synuclein complex, Dr. Masliah said.
No effects were seen on endogenous murine α-synuclein or on other α-synuclein-related markers that are present in the synapse, such as β-synuclein. No severe inflammatory effects were observed in the mice during the experiments, but because of the risk of using active immunization in humans, Dr. Masliah and his colleagues have been developing a passive immunization protocol for future experiments.
Dr. Caselli's comment: The strategy employed by Masliah and colleagues is reminiscent of the amyloid vaccination strategy, which was also initially tested in a transgenic mouse model and then extended into human trials. Masliah et al. are wisely building on the Alzheimer's vaccination experience, in which 5% of vaccine recipients developed an autoimmune meningoencephalitis and cerebral vasculitis that resulted in clinical deterioration and prompted premature termination of the trial. A passive vaccination strategy is now underway for Alzheimer's disease, and given the groundbreaking work of Masliah and colleagues, we may anticipate a similar trial for Parkinson's disease in the future. This is a powerful new approach, but clinical efficacy and safety still await further experience.
Protein-Stimulated Neural Stem Cell Repair in Parkinson's Model
The protein compound sNN0031 restored nearly all function and normalized dopamine transporter levels in a rat model of Parkinson's disease for at least 10 weeks by activating endogenous stem cell repair in the striatum, Olof Zachrisson, Ph.D., reported at the annual meeting of the Society for Neuroscience.
To mimic the effects of Parkinson's disease, Dr. Zachrisson of NeuroNova, Stockholm, and his associates injected 6-hydroxydopamine unilaterally into the right medial forebrain bundle of rats, which reduced dopamine transporter binding by 75% in the striatum. When the lesioned rats were given amphetamine, they rotated toward their lesioned side, a sign of an imbalanced dopamine system.
Five weeks later, lesioned rats and those that were given a sham injection received an intracerebroventricular infusion of sNN0031 or a vehicle for 2 weeks. Afterward, lesioned rats on sNN0031 showed significantly more improvement in rotational behavior than rats given vehicle; improvement continued up to 10 weeks after the drug had been administered. Lesioned rats on sNN0031 had a significantly improved level of striatal dopamine transporter binding 10 weeks after treatment, Dr. Zachrisson said.
sNN0031 induced endogenous stem cell proliferation in the ventricular wall adjacent to the medial striatum and neurogenesis in the striatum; the investigators have not yet determined whether the new neurons are producing dopamine. At a press conference, one of the investigators, Dr. Anders Haegerstrand, also of NeuroNova, said that sNN0031 is already approved by the Food and Drug Administration for a non-CNS indication, albeit in a different form and dose. For proprietary reasons, he would not say what disease the drug is normally used to treat or what its composition is.
Dr. Caselli's comment: That an existing compound has demonstrated neuroregenerative potential in this rat model is encouraging and warrants further exploration, although there is also some reason for caution. First, the lesioned rats in this study are a model not for progressive degenerative parkinsonism but for a static basal ganglia lesion. Second, intracerebroventricular delivery is not an ideal access point for clinical therapy, especially after the failed adrenocortical transplantation trials. Third, much more work needs to be done to elucidate how the behavioral changes in the rats came about. Fourth, the long-term effects of enhancing CNS stem cell activity (and potentially stimulating other cellular populations as well) need to be determined; for example, is there a long-term risk of brain tumors? Fifth, given that this compound is FDA approved, much more information must be available about its clinical properties, side effects, and appropriateness for the intended patient population. These and other caveats aside, the results of this study are exciting for patients and researchers alike.