Thu, 09/09/2021 - 09:15

A genetically induced shift toward higher lymphocyte counts was found to increase susceptibility to childhood acute lymphoblastic leukemia (ALL), according to the results of a large genome-wide association study comparing 2,666 children with ALL with 60,272 control individuals.

The development of ALL is thought to follow a two-hit model of leukemogenesis: in utero formation of a preleukemic clone, followed by postnatal acquisition of secondary somatic mutations that lead to overt leukemia, according to Linda Kachuri, PhD, of the Department of Epidemiology and Biostatistics, University of California, San Francisco, and colleagues.

Previous research has shown that several childhood ALL risk regions are also associated with variation in blood-cell traits, and a recent phenome-wide association study of childhood ALL identified platelet count as the most enriched trait among known ALL risk loci. To explore this further, the researchers conducted a comprehensive study of the role of blood-cell-trait variation in the etiology of childhood ALL.

The researchers identified 3,000 blood-cell-trait–associated variants, which accounted for 4.0% to 23.9% of trait variation and included 115 loci associated with blood-cell ratios: the lymphocyte-to-monocyte ratio (LMR), the neutrophil-to-lymphocyte ratio (NLR), and the platelet-to-lymphocyte ratio (PLR), according to a report published online in The American Journal of Human Genetics.

Lymphocyte risk

The researchers found that ALL susceptibility was genetically correlated with lymphocyte counts (rg = 0.088, P = .0004) and PLR (rg = 0.072, P = .0017).

In Mendelian randomization analyses, a genetically predicted increase in lymphocyte counts was associated with increased ALL risk (odds ratio [OR] = 1.16, P = .031). This association strengthened after the researchers accounted for other cell types (OR = 1.43, P = .0009).

The researchers observed positive associations with increasing LMR (OR = 1.22, P = .0017) as well as inverse effects for NLR (OR = 0.67, P = .0003) and PLR (OR = 0.80, P = .002).

“We identified the cell-type ratios LMR, NLR, and PLR as independent risk factors for ALL and found evidence that these ratios have distinct genetic mechanisms that are not captured by their component traits. In multivariable MR analyses that concurrently modeled the effects of lymphocyte, monocyte, neutrophil, and platelet counts on ALL, lymphocytes remained as the only independent risk factor and this association with ALL strengthened compared to univariate analyses,” the researchers stated.

The researchers reported that they had no competing interests.



FROM THE AMERICAN JOURNAL OF HUMAN GENETICS
