Dietary niacin and the risk of incident Alzheimer’s disease and of cognitive decline

  1. M C Morris1,2,3,
  2. D A Evans1,2,4,
  3. J L Bienias1,2,
  4. P A Scherr5,
  5. C C Tangney6,
  6. L E Hebert1,2,
  7. D A Bennett1,4,7,
  8. R S Wilson1,4,7,8,
  9. N Aggarwal1,4,7
  1. 1Rush Institute for Healthy Aging, Rush University Medical Center, Chicago, IL, USA
  2. 2Department of Internal Medicine, Rush University Medical Center, Chicago, IL, USA
  3. 3Department of Preventive Medicine, Rush University Medical Center, Chicago, IL, USA
  4. 4Rush Alzheimer’s Disease Center, Rush University Medical Center, Chicago, IL, USA
  5. 5Division of Adult and Community Health, Centers for Disease Control and Prevention, Atlanta, GA, USA
  6. 6Department of Clinical Nutrition, Rush University Medical Center, Chicago, IL, USA
  7. 7Department of Neurological Sciences, Rush University Medical Center, Chicago, IL, USA
  8. 8Department of Psychology, Rush University Medical Center, Chicago, IL, USA
  1. Correspondence to:
 Dr M C Morris
 Rush Institute for Healthy Aging, 1645 W. Jackson, Ste. 675, Chicago, IL 60612; martha_c_morris@rush.edu

Abstract

Background: Dementia can be caused by severe niacin insufficiency, but it is unknown whether variation in intake of niacin in the usual diet is linked to neurodegenerative decline. We examined whether dietary intake of niacin was associated with incident Alzheimer’s disease (AD) and cognitive decline in a large, prospective study.

Methods: This study was conducted in 1993–2002 in a geographically defined Chicago community of 6158 residents aged 65 years and older. Nutrient intake was determined by food frequency questionnaire. Four cognitive tests were administered to all study participants at 3 year intervals in a 6 year follow up. A total of 3718 participants had dietary data and at least two cognitive assessments for analyses of cognitive change over a median 5.5 years. Clinical evaluations were performed on a stratified random sample of 815 participants initially unaffected by AD, and 131 participants were diagnosed with 4 year incident AD by standardised criteria.

Results: Energy adjusted niacin intake had a protective effect against the development of AD and cognitive decline. In a logistic regression model, the relative risks (95% confidence intervals) for incident AD from the lowest to highest quintiles of total niacin intake were: 1.0 (referent), 0.3 (0.1 to 0.6), 0.3 (0.1 to 0.7), 0.6 (0.3 to 1.3), and 0.3 (0.1 to 0.7), adjusted for age, sex, race, education, and ApoE e4 status. Niacin intake from foods was also inversely associated with AD (p for linear trend = 0.002 in the adjusted model). In an adjusted random effects model, higher food intake of niacin was associated with a slower annual rate of cognitive decline, by 0.019 standardised units (SU) per natural log increase in intake (mg) (p = 0.05). Stronger associations were observed in analyses that excluded participants with a history of cardiovascular disease (β = 0.028 SU/year; p = 0.008), those with low baseline cognitive scores (β = 0.023 SU/year; p = 0.02), or those with fewer than 12 years’ education (β = 0.035 SU/year; p = 0.002).

Conclusion: Dietary niacin may protect against AD and age related cognitive decline.

  • AD, Alzheimer’s disease
  • CHAP, Chicago Health and Aging Project
  • FFQ, food frequency questionnaire
  • MMSE, Mini Mental State Examination
  • RDA, recommended dietary allowance/recommended daily amount
  • niacin
  • Alzheimer’s disease
  • dementia
  • cognitive function
  • epidemiological study
  • aged
  • population-based study

Severe deficiency of niacin (nicotinic acid) and tryptophan, its dietary precursor, is known to cause pellagra, a condition characterised by dementia, diarrhoea, and dermatitis. Once endemic in populations where the main dietary component was maize or sorghum, pellagra can also occur with severe alcoholism and other conditions that cause nicotinic acid deficiency.1 Although a direct cause of the dementia has not been fully determined, a number of studies have found that niacin plays important roles in DNA synthesis and repair,2–4 myelination and dendritic growth,5,6 and cellular calcium signalling,7 and acts as a potent antioxidant in brain mitochondria.8 Improvements in cognitive test scores9,10 and overall function11 have been reported in European trials of pharmacological preparations that include nicotinic acid.

There has been limited study of dietary niacin and the development of Alzheimer’s disease (AD). Two case−control studies reported lower blood levels of a nicotinic acid metabolite among demented patients than among age and sex matched controls.12,13 We examined the associations of dietary intakes of niacin and tryptophan with incident AD and overall cognitive decline in a large prospective study of a biracial Chicago community.

SUBJECTS AND METHODS

Population

Participants are from the Chicago Health and Aging Project (CHAP), a biracial study of three geographically defined contiguous neighbourhoods on the south side of Chicago. A census identified 8501 residents aged 65 years and older, of whom 439 had died and 249 had moved before participation could be secured. In total, 6158 participated (79% participation overall; 81% among blacks, 75% among whites). The study population was 62% black, 38% white, and 39% male, 61% female, with a mean educational level of 11.8 years. Data were collected from 1993 to 2002 in cycles of approximately 3 years, each consisting of at home interviews of all participants (including the administration of four cognitive tests) and clinical neurological evaluations of a stratified random sample.

Baseline data provided risk factor information on the entire population, identified prevalent cases of AD in a sample of 729 people, and identified a cohort of 3838 unaffected participants to follow for incident disease. The disease free cohort consisted of 3369 people who had good performance on two14,15 of the baseline cognitive tests, and 469 whose cognitive performance was intermediate or poor but who were unaffected by AD at the baseline clinical evaluation. Subsequent to the 3 year population interview, a second stratified random sample of 1249 participants was drawn from the disease free cohort, and 842 of these (73.9% of survivors) were clinically evaluated for incident AD. Sample participants were randomly selected within strata defined by age, sex, race, and change in cognitive performance from baseline to the 3 year follow up (stable or improved, small decline, or large decline). The sample design allowed for efficient identification of AD cases and a representative sample of the disease free cohort.16 A total of 815 participants had complete data for the analysis of diet and incident AD.

A total of 4324 participants (87% of surviving residents from the original study cohort) completed a dietary questionnaire and at least two cognitive assessments in the 6 year follow up for the analysis of change in cognitive function. We excluded 147 people with incomplete or potentially invalid dietary data (see below), and 459 who completed the dietary questionnaire more than 2.5 years after the initial cognitive assessment, leaving 3718 participants for analysis of cognitive change, with a median follow up of 5.5 years. Descriptions of the population interviews17 and clinical evaluations16,18 have been published previously. The study was approved by the institutional review board of Rush-Presbyterian-St Luke’s Medical Center.

Dietary assessment

Diet was assessed using a modified 152 item Harvard self administered food frequency questionnaire (FFQ),19,20 on average (SD) 1.7 (0.9) years after the baseline in the clinically evaluated sample, and 1.1 (0.9) years after the initial cognitive test for the analyses of change in cognitive function. The FFQ was distributed to participants along with a self addressed envelope for its return, although some participants were interviewed at their request. The FFQ included questions about usual intake of 139 food items and individual vitamin supplements. Nutrient intake was obtained by multiplying the nutrient content of individual food items by the frequency of consumption and then summing over all items. Nutrient content of individual food items was based on the Harvard University Nutrient Database, which is continually updated using data from the US Department of Agriculture,21 and from selected individual publications.22 Participants were prompted for specific brand names of multivitamins, cereals, and margarines, type of cooking oil, and fat preferences for milk and meat products, and this information was used in the computation of nutrient intake. The computation of total niacin equivalents was based on the sum of niacin intake from foods and supplements, plus the tryptophan contribution (1 mg niacin per 60 mg of tryptophan intake). FFQs were considered potentially invalid (n = 147) and eliminated from the analyses if entire food sections or more than half the items were left blank, total energy intake (kcal) was <500 or >3800 for females or <700 or >4000 for males, or the baseline Mini Mental State Examination (MMSE) score was <10 (out of a possible score of 30). For analysis, dietary intake levels were adjusted for total energy intake using the regression residual method23 separately for men and women.
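
As an illustration of the two computations just described, niacin equivalents and energy adjustment by the regression residual method, the following minimal Python sketch shows one plausible implementation. It is not the CHAP analysis code; the DataFrame layout and column names are assumptions.

```python
# Minimal sketch of the nutrient computations described above.
# Hypothetical data layout: one row per participant with columns for
# preformed niacin (mg), tryptophan (mg), total energy (kcal), and sex.
import pandas as pd
import statsmodels.api as sm

def niacin_equivalents(niacin_mg: pd.Series, tryptophan_mg: pd.Series) -> pd.Series:
    """Niacin equivalents: preformed niacin plus 1 mg per 60 mg tryptophan."""
    return niacin_mg + tryptophan_mg / 60.0

def energy_adjust(intake: pd.Series, kcal: pd.Series) -> pd.Series:
    """Regression residual method: regress intake on total energy, then add
    each residual to the expected intake at the mean energy level."""
    fit = sm.OLS(intake, sm.add_constant(kcal)).fit()
    expected_at_mean_energy = fit.predict([1.0, kcal.mean()])[0]
    return fit.resid + expected_at_mean_energy

# Energy adjustment applied separately for men and women, as in the text:
# df["niacin_adj"] = df.groupby("sex", group_keys=False).apply(
#     lambda g: energy_adjust(g["niacin_mg"], g["kcal"]))
```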

The CHAP FFQ has been shown to be a valid and reliable measure of dietary intake in the CHAP population.24 Pearson’s correlations between niacin intake levels measured by the FFQ and by repeated 24 hour dietary recall interviews were 0.52 for total niacin and 0.47 for niacin excluding supplements. Intra-class correlations for the reproducibility of intake levels from two FFQs administered 1 year apart were 0.62 for both total niacin and niacin excluding supplements.
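
For readers wishing to reproduce such validation statistics on their own data, a small sketch follows; the input arrays are hypothetical paired intake estimates, and the one-way random effects ICC shown here is only one of several ICC variants that could reasonably be used.

```python
# Sketch of the two validation statistics reported above, on hypothetical data.
import numpy as np

def pearson_r(ffq: np.ndarray, recall: np.ndarray) -> float:
    """Pearson correlation between FFQ intake and mean of 24 hour recalls."""
    return np.corrcoef(ffq, recall)[0, 1]

def icc_oneway(ratings: np.ndarray) -> float:
    """One-way random effects ICC for repeated administrations;
    `ratings` has shape (n_subjects, k_administrations)."""
    r = np.asarray(ratings, dtype=float)
    n, k = r.shape
    subject_means = r.mean(axis=1)
    ms_between = k * ((subject_means - r.mean()) ** 2).sum() / (n - 1)
    ms_within = ((r - subject_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```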

Clinical evaluation for incident AD

AD was diagnosed based on structured clinical evaluations that were conducted in participants’ homes. A board certified neurologist, who was blinded to participant information on dietary intake, examined every participant. The evaluations included neuropsychological testing (using tests of the Consortium to Establish a Registry for Alzheimer’s Disease25), a complete medical history, medication use, laboratory testing, neurological examination, and informant interviews for cognitively impaired participants. The diagnosis of probable AD was based on criteria of the National Institute of Neurological and Communicative Disorders and Stroke and the Alzheimer’s Disease and Related Disorders Association,26 with the exception that the definition of AD included all cases that met the criteria, thereby including 14 participants with a co-existing dementing condition. Demented participants without AD (n = 11) were analysed as non-cases. MRI was performed when dementia was evident and clinical stroke was uncertain.

Change in cognitive function

Cognitive function was assessed on the entire study population during in-home interviews at baseline and 3 year and 6 year follow ups using four cognitive tests: the East Boston Tests of Immediate and Delayed Recall,27,28 the MMSE,29 and the Symbol Digit Modalities test.15 Raw scores on each test were converted to z scores, using the baseline mean and standard deviation of the study population, and averaged to form a composite measure.30
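
A minimal sketch of the composite score computation follows, assuming a DataFrame with one column per raw test; the column names are hypothetical.

```python
# Composite cognitive score: per-test z scores averaged across four tests.
import pandas as pd

TESTS = ["eb_immediate_recall", "eb_delayed_recall", "mmse", "symbol_digit"]

def composite_score(scores: pd.DataFrame, baseline: pd.DataFrame) -> pd.Series:
    """Standardise each test against the baseline mean and SD of the study
    population, then average the four z scores into one composite (in SU)."""
    z = (scores[TESTS] - baseline[TESTS].mean()) / baseline[TESTS].std()
    return z.mean(axis=1)
```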

Covariates

Information for all non-dietary variables except clinical stroke and ApoE e4 was obtained at participants’ baseline population interview. Alcohol consumption (g/day) was based on three separate FFQ questions about usual consumption of beer, wine, and spirits. Education (years) was computed from self reported highest grade or years of formal education. Smoking was quantified in pack years (packs per day multiplied by the number of years smoked). History of diabetes was defined as use of anti-diabetic medication or participant report of clinically diagnosed diabetes. Hypertension was defined as participant report of high blood pressure, anti-hypertensive medication use, or measured systolic pressure >160 mmHg or diastolic pressure >95 mmHg. Heart disease was defined as self reported history of myocardial infarction, use of digitalis, or evidence of angina pectoris based on participant responses to a standardised questionnaire.31 Medication information was obtained through interviewer inspection of all medications taken within the 2 week period prior to the baseline interview. For the analyses of AD, clinical stroke was defined as probable or possible stroke as diagnosed by the examining neurologist based on a uniform, structured examination, medical history, and MRI diagnostic testing if indicated. ApoE genotyping was conducted on blood samples collected at the clinical evaluations using the methods of Hixson and Vernier32 and the primers described by Wenham et al.33 ApoE genotype was not obtained on the total study population and was therefore not available for the analyses of cognitive change.

Statistical methods

We used logistic regression analysis programmed in SAS34 to generate odds ratios as estimates of the relative risk of incident AD, weighted for the stratified random sample design. Variance estimation was computed by jackknife repeated replication.35,36 We used random effects models37,38 to estimate the effect of niacin intake on the intra-person rate of change in cognitive score while controlling for initial cognitive score and other covariates. Before examining associations between the outcomes and niacin, we first determined the best basic models of the most important confounders, including non-linear and interactive associations. The basic model for incident AD was based on a previous report of the CHAP study.18 For the analysis of cognitive function, we considered higher order terms of age and education because of earlier studies indicating non-linear associations with cognitive function39,40 and exponential associations with AD.18,41 We also examined interactions among the demographic variables in models of baseline cognitive score to ensure optimal adjustment of the initial score level when modelling change (all terms were statistically significant at p<0.05 in a model of the total study population).
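
The random effects model can be sketched as follows. This is an illustrative re-expression in Python (the original analyses were programmed in SAS); it omits the survey weighting and jackknife variance estimation, and all column names are assumptions.

```python
# Hypothetical sketch of the random effects model for cognitive change,
# with long-format data: one row per participant per assessment.
import pandas as pd
import statsmodels.formula.api as smf

def fit_decline_model(long_df: pd.DataFrame):
    """long_df columns (assumed): id, cog_z, years since first test,
    ln_niacin, age, educ, male, black."""
    model = smf.mixedlm(
        # The ln_niacin:years coefficient estimates the change in the annual
        # rate of cognitive decline per ln(mg) increase in niacin intake.
        "cog_z ~ years * ln_niacin + age + I(age**2) + educ + I(educ**2)"
        " + male + black",
        data=long_df,
        groups="id",           # random effects grouped by participant
        re_formula="~years",   # random intercept and random slope over time
    )
    return model.fit()
```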

Energy adjusted niacin and tryptophan intakes were each modelled as continuous log transformed variables and also in quintiles with the lowest quintile as the reference category. Dietary covariates were energy adjusted and modelled as continuous log transformed variables. Effect modification was examined in separate multiple adjusted models that included terms for nutrient intake (as continuous variables), the potential effect modifier (age, sex, race, education, ApoE-e4), and interaction terms between these variables.
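
Similarly, the quintile coding and the incident AD model could be sketched as below. Note that the published analysis additionally weighted for the stratified sample design and used jackknife variance estimation, which this simplified sketch omits; the column names are again assumptions.

```python
# Hypothetical sketch of the categorical (quintile) model for incident AD.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_ad_model(df: pd.DataFrame):
    """df: one row per clinically evaluated participant, with incident_ad
    (0/1), energy-adjusted niacin_adj, and the basic-model covariates."""
    # Quintiles of energy-adjusted intake; lowest quintile (1) is referent.
    df = df.assign(niacin_q=pd.qcut(df["niacin_adj"], 5, labels=False) + 1)
    fit = smf.logit(
        "incident_ad ~ C(niacin_q, Treatment(1)) + age + male + black"
        " + educ + apoe4",
        data=df,
    ).fit()
    return np.exp(fit.params)  # odds ratios as estimates of relative risk
```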

RESULTS

A total of 815 people, all of whom were free of AD at baseline, were clinically evaluated on average 3.9 years later, and 131 were diagnosed with incident AD. The annual incidence rate was 2.6% after appropriate weighting for the stratified random sampling design. Supplemental niacin came mainly from multivitamins, including B complex vitamins, rather than from individual niacin supplements; only 9 of the 815 participants were taking a prescribed form of niacin at baseline. The lowest quintile groups of total niacin intake and niacin food intake had higher percentages of females, blacks, people with clinical stroke, and people with low food intake of vitamin E than did the higher quintile groups (table 1). Participants in the lowest quintile of total niacin intake, but not of food intake, were more likely to have an ApoE e4 allele than were those in the highest quintile. There was a higher prevalence of diabetes among participants in the highest quintiles of both total and food intake of niacin.

Table 1

 Baseline characteristics by quintiles of total niacin intake including supplements, and niacin from food among 815 randomly selected participants of the Chicago Health and Aging Project

Total niacin intake, including intake from food and supplements, was inversely associated with incident AD after adjustment for age, sex, race, education, ApoE e4, and time period of observation in both continuous and categorical models. Compared with the risk of disease among participants in the lowest fifth of intake (median of 14.1 mg/day), those in the second, third, and fifth quintiles had significantly lower risk by 70% (table 2). Participants in the fourth quintile of intake had a 40% non-significant reduction in risk compared with the lowest quintile group. The protective association with higher niacin intake became stronger (p for trend = 0.04) after further adjustment for multivitamin use and intake of the antioxidant nutrients (vitamin C, beta-carotene, or vitamin E from food sources) that were found in previous reports42,43 to be possibly protective against AD.

Table 2

 Relative risks (95% confidence intervals) of incident AD by quintile of intake of niacin, tryptophan, and niacin equivalents among 815 participants initially free of AD and followed a median 3.8 years, Chicago Health and Aging Project, 1993–2000

Intake of niacin from foods had an inverse association with AD in the basic adjusted model (p for trend = 0.002) (table 2). Participants in intake quintiles 2−4 had 70% reductions in risk compared with those in the lowest quintile (median intake 12.6 mg/day), whereas participants in the highest fifth of intake (median 22.4 mg/day) had an 80% reduction in risk; all were statistically significant. The relative risks were only slightly less protective in the multiple adjusted model and remained statistically significant.

We considered whether the observed protective association of niacin could be entirely due to a greater risk of AD among participants in the lowest quintile of niacin intake. When we excluded these participants from the analyses and modelled niacin intake from food as a continuous log transformed variable, we observed a statistically significant inverse log linear association among participants in the upper quintiles of intake (>14 mg/day). The basic adjusted relative risk was 0.4 (p = 0.04) per 7.2 mg/day increase in dietary niacin, which represents the difference in median intakes between the second and fifth quintiles.

Tryptophan intake from food was also inversely associated with incident AD. The basic adjusted risk decreased with increased level of intake (p for trend = 0.03) (table 2). There was no appreciable change in the relative risks with further adjustment for intake of the antioxidant nutrients and multivitamin use. Because many of the same foods that contain tryptophan also contain niacin, we added control for the effect of niacin intake from foods in the multiple adjusted model; the relative risks for tryptophan intake became less protective and were no longer statistically significant (for quintiles 2−5, the relative risks (95% confidence intervals) were 0.6 (0.2 to 1.6), 0.8 (0.3 to 2.1), 0.5 (0.2 to 1.5), and 0.6 (0.2 to 1.4), respectively).

The recommended dietary allowance (recommended daily amount; RDA) for niacin is described in terms of niacin equivalents, which includes intake from both pre-formed niacin and tryptophan. The basic adjusted relative risks for intake of niacin equivalents were of the same magnitude as for total niacin intake and all but the fourth quintile were statistically significant (table 2).
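
As a purely illustrative calculation (the intake figures below are invented, not participant data), niacin equivalents (NE) combine preformed niacin with the tryptophan conversion described in the Methods:

```latex
\[
\mathrm{NE\ (mg)} = \text{preformed niacin (mg)} + \frac{\text{tryptophan (mg)}}{60},
\qquad\text{e.g.}\quad 14 + \frac{900}{60} = 29\ \mathrm{mg\ NE}.
\]
```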

Because some studies44–46 suggest that dietary intakes of folate and other B vitamins (vitamins B6, B12, B1, and B2) may be involved in the development of AD, and dietary intakes of the B vitamins are inter-correlated, we adjusted for these individually in separate basic adjusted models, but there were no material changes in the relative risks for niacin (total and from foods) or for tryptophan. Because cardiovascular related conditions may be associated with the development of AD, we investigated whether the observed associations for niacin and tryptophan could be attributed to the presence of clinical stroke, heart disease, hypertension, or diabetes. There were no appreciable differences in any of the relative risks when we simultaneously adjusted for these conditions in the basic adjusted models. We also found no evidence of confounding in separate basic adjusted models that controlled for alcohol consumption and pack years of smoking. In a previous report, we found evidence that different types of fat were associated with the incidence of AD.47 We therefore examined whether intake of saturated, trans, monounsaturated, n-6 polyunsaturated, or n-3 polyunsaturated fats could have confounded the observed associations, but additional control for these produced no material changes in the relative risks.

We found no statistically significant modification of the protective niacin effect by age or within categories of ApoE e4 status, race, education, sex, heart disease, or clinical stroke. There was no appreciable change in the protective relative risks when we controlled for the time between the dietary assessment and the clinical evaluation for incident AD. We also re-analysed the data after excluding participants (n = 51) who had poor cognitive performance at baseline, but there was little change in the multivariable relative risks (for quintiles 2−5 of niacin food intake: 0.2, 0.4, 0.3, and 0.2, respectively; p for trend = 0.02). When we restricted the analysis to participants with good cognitive performance at baseline, the multivariable relative risks were virtually unchanged and remained statistically significant.

Niacin intake and cognitive change in the entire study population

We also examined whether dietary intake of niacin was associated with 6 year cognitive change among 3718 people in the larger study population. Although this type of analysis is not specific to AD, it provides an objective and sensitive measure of gradual decline, the central characteristic of this disease. Much of the cognitive decline in the population is probably due to disease processes associated with AD, which is the leading cause of dementia, followed by vascular dementia.48

The mean cognitive score at the initial assessment (average z score of four cognitive tests) was 0.18 (range: −3.50 to 1.58), and the average annual decline was 0.042 standardised units (SU) per year. Food intake of niacin had a linear protective association in both continuous and categorical models. In the continuous model adjusted for demographic confounders, the rate of cognitive decline decreased by 0.019 SU/year (p = 0.05) per ln increase in intake (mg) (table 3). The effect was attenuated slightly (β = 0.017 SU/year; p = 0.12) after additional control for dietary intakes of antioxidant nutrients and folate, multivitamin use, smoking and alcohol use, stroke, heart disease, diabetes, and hypertension. Substitution of each of the other B vitamins for folate produced similar results.
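
To put the continuous estimate in context, an approximate back-of-envelope contrast can be made using the quintile medians of food niacin reported elsewhere in this paper (about 12.6 and 22 mg/day):

```latex
\[
\Delta(\text{rate of decline}) \approx 0.019 \times \bigl[\ln(22) - \ln(12.6)\bigr]
\approx 0.019 \times 0.56 \approx 0.011\ \text{SU/year}.
\]
```

That contrast corresponds to roughly a quarter of the average annual decline of 0.042 SU/year reported above.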

Table 3

 Adjusted effects of niacin intake from food (per ln increase in intake (mg)) on the rate of cognitive change over 6 years, among the total cohort of 3718 participants, and among 2824 participants with no history of stroke or myocardial infarction at baseline or first follow up, Chicago Health and Aging Project, 1993–2002

Because of the likelihood of dietary changes among people who experience major cardiovascular events, we next repeated the analyses after excluding the 894 participants who reported a history of stroke or myocardial infarction at the baseline or first follow up interviews. Food intake of niacin had a linear protective association with cognitive decline in the basic adjusted model (β = 0.028 SU/year; p = 0.004). In the categorical model, the rate of cognitive decline was significantly reduced by 44% among participants in the top fifth of niacin food intake (median 22.1 mg/day) compared with those in the lowest fifth (median 12.6 mg/day), a difference of 0.021 SU/year (p = 0.003) (fig 1). Adjustment for other dietary and cardiovascular-related risk factors resulted in an even greater reduction in the rate (table 3).

Figure 1

Change in cognitive score (z score) for participants in quintiles 1 and 5 of niacin intake from foods based on the basic adjusted model (table 3) among 2824 participants with no history of stroke or myocardial infarction. The figure data were computed for a 75 year old female with 12 years of education. The annual rate of change in cognitive score was −0.048 SU/year in the lowest quintile of niacin intake (median = 12.6 mg/day) and −0.027 SU/year in the highest quintile of intake (median = 22.1 mg/day). The difference in rates of 0.021 SU/year was statistically significant at p = 0.003. The rate differences for quintiles 2−4 were 0.003, 0.002, and 0.009, respectively, and were not statistically significant.

To investigate whether the observed protective association of niacin might be due to unreliable reporting among people with poor cognition, we repeated the analyses after excluding 551 participants from the total cohort who had baseline scores in the lowest 15% of the distribution. In this cognitively restricted group, high intake of dietary niacin was associated with a greater protective effect than that observed for the total cohort in both the basic and multiple adjusted models (table 3). In other analyses, we examined whether low socioeconomic status could account for the findings by excluding participants from the total cohort who had fewer than 12 years of formal schooling. Even among the higher educated (n = 2495), the rate of cognitive decline was significantly reduced with higher food intake of niacin (β = 0.035; p = 0.002 in the basic adjusted model).

Total niacin intake (including intake from vitamin supplements) had no association with cognitive change; the effect estimates fluctuated around 0 in both the basic and multiple adjusted models in the total cohort as well as in the cohort restricted to those with no history of stroke or myocardial infarction.

DISCUSSION

In this prospective population based study, we observed inverse associations between AD and dietary intakes of total niacin (foods and supplements), niacin from foods only, and tryptophan. Although participants in the lowest fifth of intake had the greatest risk of AD, a statistically significant log linear inverse association remained when we restricted the analyses to participants with higher intake levels. Higher intake of niacin from food sources was also linearly associated with slower cognitive decline in the study population.

The protective association of niacin against AD was observed after controlling for the important risk factors for dementia (age, education, race, ApoE e4) as well as many other dietary and non-dietary factors that could potentially account for the results, including cardiovascular conditions, and dietary intake of antioxidant nutrients, fats, folate, and vitamins B6, B12, B1, and B2. It is possible that residual confounding may have influenced the magnitude of the protective effect; however, there is good evidence in support of an association. Firstly, protective associations were observed after adjustment for race and education, and there was no evidence of modification in the effect by these factors. Secondly, the protective association was specific to niacin intake as opposed to other related B vitamins. Finally, we also found a specific protective effect of niacin intake from food against 6 year cognitive decline among 3718 participants in the larger cohort that was only strengthened in sensitivity analyses excluding participants with low initial cognitive scores or with less than a high school education, and with control for dietary and other potential confounders. We did not observe an association between total niacin intake and cognitive change. It is difficult to test for associations with supplemental niacin because it is obtained through multivitamins that contain many other nutrients that may confound observed effects.

A major strength of the study is the unbiased selection of clinically evaluated participants through a random sample of a community population, and the unbiased detection of AD cases through uniform, structured neurological examination using standardised criteria. For a number of clinically evaluated participants, the dietary assessments occurred after baseline, and this could have biased the results if dietary behaviours or responses to the dietary questionnaire were affected by the onset of disease. However, the protective association remained when we controlled for the timing of the dietary assessment, when we eliminated participants with the poorest memory at baseline, and when we further restricted the analyses by also eliminating those with intermediate memory performance. Further, in a validity study of 232 randomly selected CHAP participants, we found no marked differences in the correlations between nutrient intake on the FFQ and repeated 24 hour recall interviews by cognitive ability, age, race, or educational level.24

Niacin rich foods include meats, legumes, nuts, enriched grains/cereals, coffee, and tea. In addition, niacin is synthesised endogenously through the conversion of tryptophan, an amino acid that constitutes about 1% of the protein in foods. The association with tryptophan was lessened when niacin was included in the model, suggesting the protective benefit may be due to the niacin rather than the tryptophan.

It has been known since the 1930s that pellagra is a result of niacin deficiency and is responsive to synthetic niacin. Confusion and psychosis are well recognised symptoms of pellagra and of the encephalopathy associated with niacin deficiency in severe alcoholism. The level of dietary insufficiency associated with these conditions (8.8 mg niacin equivalents per 2000 kcal49) is lower than the range of intake for the lowest quintile (13.2 to 27 mg per day). The current RDA for niacin equivalents is 16 mg per day for men and 14 mg per day for women.50

Much attention has been focused on the relation between dementia and other B vitamins, particularly vitamin B12, vitamin B6 and folate. There has been little previous examination of dietary niacin and AD, although niacin has been administered to older people to prevent confusional states, and there have been several published clinical trials of medications for this indication.9–11 In this prospective study, we observed a protective association of niacin against the development of AD and cognitive decline within normal levels of dietary intake, which could have substantial public health implications for disease prevention if confirmed by further research.

Acknowledgments

This study was supported by grants (AG11101 and AG13170) from the National Institute on Aging. The authors thank the communities of Washington Heights, Morgan Park and Beverly for their support, and gratefully acknowledge the work of study coordinators, C Bibbs, M Bos, J Tarpey, H Hadden, and F Lamorticella, their staffs, and the analytical programmer, W Bang.

REFERENCES

Footnotes

  • Competing interests: none declared