Abstract: Background Culinary nutrition education programs are increasingly used as a public health intervention for older adults. These programs often combine nutrition education with interactive cooking workshops or demonstrations to suit older adults’ needs, abilities and behaviour change. Synthesising the existing literature on nutrition education and interactive cooking programs for older adults is important to guide future program development to support healthy ageing. Objectives To determine the extent of published literature and report the characteristics and outcomes of interactive culinary nutrition education programs for older adults (≥ 51 years). Design This scoping review followed the PRISMA-ScR guidelines for conducting and reporting scoping reviews. Methods Five databases were searched for relevant papers published to May 2022 using a structured search strategy. Inclusion criteria were: older adults (≥ 51 years); an intervention with both an interactive culinary element and nutrition education; and a reported dietary outcome. Titles and abstracts were screened by two reviewers, followed by full-text retrieval. Data were charted on program characteristics and the outcomes assessed. Results A total of 39 articles met the full inclusion criteria. The majority of these studies (n = 23) included a range of age groups in which older adults were the majority but did not target older adults exclusively. Programs varied widely in design, including the number of classes (1 to 20), program duration (2 weeks to 2 years), session topics, and whether (and which) theoretical model was used. All programs were delivered face-to-face (n = 39), with only two offering alternative or additional delivery approaches.
Conclusion Culinary nutrition education programs provide an environment to improve the dietary habits and health literacy of older adults. However, our review found that only a small number of programs were intentionally designed for older adults. This review summarises current culinary nutrition education programs for older adults to inform researchers and policy makers. It also recommends providing alternatives to face-to-face delivery so that programs are accessible to a wider group of older adults with fewer restrictions. PubDate: 2023-01-24
Abstract: Objectives Dietary restriction of methionine (Met) and cysteine (Cys) delays the aging process and aging-related diseases, improves glucose and fat metabolism, and reduces oxidative stress in numerous laboratory animal models. Little is known regarding the effects of sulfur amino acid restriction in humans. Thus, our objective was to determine the impact of feeding healthy adults diets restricted in Met alone (MetR) or in both Met and Cys (total sulfur amino acid restriction, SAAR) on relevant biomarkers of cardiometabolic disease risk. Design A controlled feeding study. Setting and Participants We included 20 healthy adults (11 females/9 males) assigned to MetR or SAAR diet groups consisting of three 4-wk feeding periods: a control period; a low-level restriction period (70% MetR or 50% SAAR); and a high-level restriction period (90% MetR or 65% SAAR), separated by 3–4-wk washout periods. Results No adverse effects were associated with either diet at either level of restriction, and compliance was high in all subjects. SAAR was associated with significant reductions in body weight and plasma levels of total cholesterol, LDL cholesterol, uric acid, leptin, insulin, BUN, and IGF-1, and increases in body temperature and plasma FGF-21 after 4 weeks (P<0.05). Fewer changes occurred with MetR, including significant reductions in BUN, uric acid and 8-isoprostane and an increase in FGF-21 after 4 weeks (P<0.05). In the 65% SAAR group, plasma Met and Cys levels were significantly reduced by 15% and 13%, respectively (P<0.05). Conclusion These results suggest that many of the short-term beneficial effects of SAAR observed in animal models are translatable to humans and support further clinical development of this intervention. PubDate: 2023-01-23
Abstract: Objectives This study aimed to examine whether a decrease in ultrasound-measured muscular echo-intensity of the quadriceps in older inpatients is more strongly related to improvement in gait independence than an increase in muscle thickness. Design Longitudinal study. Setting Hospital-based study. Participants This study included 171 inpatients aged ≥ 65 years (median age: 84.0 [77.0–88.0], 56.1% female). Patients who were able to walk independently at hospital admission were excluded. Measurements Improvement in gait independence during the hospital stay was assessed using the change in Functional Independence Measure (FIM) gait score (i.e., FIM gait score at hospital discharge minus FIM gait score at hospital admission) and the FIM gait score at hospital discharge. Muscular echo-intensity and muscle thickness of the quadriceps were assessed at hospital admission and discharge using ultrasound images. Muscular echo-intensity has been shown to be mainly related to intramuscular adipose tissue. Multiple linear regression analysis was performed to identify factors independently associated with the change in FIM gait score and the FIM gait score at discharge. Results Change in quadriceps echo-intensity was independently and significantly associated with the change in FIM gait score (β = −0.22, p = 0.017) and FIM gait score at hospital discharge (β = −0.21, p = 0.017). In contrast, change in quadriceps thickness was not independently associated with the change in FIM gait score (β = 0.16, p = 0.050) or FIM gait score at hospital discharge (β = 0.15, p = 0.050). Conclusions Our study indicates that a decrease in ultrasound-measured muscular echo-intensity of the quadriceps is more strongly related to improvement in gait independence than an increase in muscle thickness in older inpatients. Intervention targeting intramuscular adipose tissue of the quadriceps may be important for improving gait independence in older inpatients. PubDate: 2023-01-16
Abstract: Objectives To summarize the existing evidence regarding the prevalence and risk factors of frailty in stroke patients. Design A systematic review and meta-analysis. Participants Stroke patients in hospitals or communities. Methods We undertook a systematic review and meta-analysis of articles available in 8 databases, including PubMed, The Cochrane Library, Web of Science, Embase, Chinese Biomedical Database (CBM), China National Knowledge Infrastructure Database (CNKI), Wanfang Database, and Weipu Database (VIP), from January 1990 to April 2022. Study quality was rated using the Newcastle-Ottawa Scale and the Agency for Healthcare Research and Quality tool. Results A total of 24 studies involving 30,423 participants were identified. The pooled prevalence of frailty and pre-frailty in stroke patients was 27% (95% CI: 23–31%) and 47.9% (95% CI: 43–53%), respectively. Female sex (OR = 1.76, 95% CI: 1.63–1.91), advanced age (MD = 6.73, 95% CI: 3.55–9.91), diabetes (OR = 1.34, 95% CI: 1.06–1.69), hyperlipidemia (OR = 1.46, 95% CI: 1.04–2.04), atrial fibrillation (OR = 1.36, 95% CI: 1.01–1.82), and higher National Institutes of Health Stroke Scale (NIHSS) admission scores (MD = 2.27, 95% CI: 1.72–2.81) were risk factors for frailty in stroke patients. Conclusions Frailty was highly prevalent in stroke patients. Female sex, advanced age, diabetes, hyperlipidemia, atrial fibrillation, and higher NIHSS admission scores were identified as risk factors for frailty in stroke patients. In the future, medical staff should pay attention to early screening for frailty in high-risk groups and provide information on its prevention. PubDate: 2023-01-16
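The pooled prevalence figures above come from a random-effects meta-analysis of per-study proportions. As a minimal sketch of how such pooling works (using hypothetical study data, not the authors' actual extracted estimates), the DerSimonian–Laird estimator weights each study by the inverse of its within-study variance plus an estimated between-study variance:

```python
import math

def pooled_prevalence(props, ns):
    """DerSimonian-Laird random-effects pooling of proportions.

    props: per-study prevalence estimates (0..1)
    ns:    per-study sample sizes
    Returns (pooled estimate, tau^2 between-study variance).
    """
    # Within-study variance of a proportion: p(1-p)/n
    v = [p * (1 - p) / n for p, n in zip(props, ns)]
    w = [1 / vi for vi in v]                      # fixed-effect weights
    fixed = sum(wi * p for wi, p in zip(w, props)) / sum(w)
    # Cochran's Q heterogeneity statistic
    q = sum(wi * (p - fixed) ** 2 for wi, p in zip(w, props))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(props) - 1)) / c)   # between-study variance
    # Random-effects weights add tau^2 to each study's variance
    w_re = [1 / (vi + tau2) for vi in v]
    pooled = sum(wi * p for wi, p in zip(w_re, props)) / sum(w_re)
    return pooled, tau2

# Hypothetical frailty prevalences from four studies
est, tau2 = pooled_prevalence([0.21, 0.25, 0.30, 0.33], [150, 300, 220, 500])
```

The pooled estimate always lies between the smallest and largest study estimates; a non-zero tau² widens the confidence interval relative to a fixed-effect analysis.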
Abstract: Objectives Intrinsic capacity (IC) declines progressively with age, thereby increasing the risk of disability. However, less is known about whether IC trajectories are associated with disability. This study aims to identify distinct patterns of IC trajectories in older people, and to examine their determinants and associations with Instrumental Activities of Daily Living (IADL). Design Cohort study. Setting Community centres in different regions of Hong Kong. Participants and Measurements Longitudinal data from community-dwelling older people aged 60 years or above (n = 1371), collected between 2016 and 2021, were analysed. Their mean age was 74.5 years, and 78.7% were female. Repeated measurements of a set of 14 self-reported items were used to generate IC scores at four time points using a bi-factor model. Latent class growth analysis was performed to identify classes with distinct IC trajectories. The association between class membership and IADL disability was then examined using logistic regression. Results Three distinct IC trajectories were identified. The 1st class comprised those with the highest level of baseline IC and the least declining trajectory, whereas the 3rd class comprised those with the lowest level of baseline IC and the most declining trajectory. Older age, female gender, lower perceived financial adequacy, living in public or subsidized housing, and chronic diseases were associated with the 3rd class. After adjusting for demographic factors, socioeconomic status, and the number of chronic diseases, the 1st class was more likely to preserve IADL than the 2nd class (OR 3.179, 95% CI: 2.152–4.793), whereas for the 3rd class the OR was 0.253 (95% CI: 0.178–0.359). Conclusion Monitoring IC trajectories is of relevance to clinical practice, as it helps shift the focus from treating acute episodes of illness to preserving the functional ability of older people. PubDate: 2023-01-16
Abstract: Objectives Loss of appetite (anorexia) in older adults can lead to malnutrition, weight loss, frailty, and death, but little is known about its epidemiology in the United States (US). The objective of this study was to estimate the annual prevalence and incidence of anorexia in older adults with Medicare fee-for-service (FFS) health insurance. Design Retrospective, observational analysis of administrative health insurance claims data. Setting This study included Medicare FFS claims from all settings (e.g., hospital inpatient/outpatient, office, assisted living facility, skilled nursing facility, hospice, rehabilitation facility, home). Participants This study included all individuals aged 65 to 115 years with continuous Medicare FFS medical coverage (Parts A and/or B) for at least one 12-month period from October 1, 2015, to September 30, 2021 (i.e., approximately 30 million individuals each year). Intervention Not applicable. Measurements Anorexia was identified using medical claims with the ICD-10 diagnosis code “R63.0: Anorexia”. Individuals with anorexia were compared to a control group without anorexia with respect to demographics, comorbidities measured by the Charlson Comorbidity Index (CCI), the Claims-based Frailty Index (CFI), and annual mortality. The annual prevalence and incidence of anorexia were estimated for each 12-month period from October 1, 2015, to September 30, 2021. Results The number of individuals with anorexia ranged from 317,964 to 328,977 per year, a mean annual prevalence rate of 1.1%. The number of individuals newly diagnosed with anorexia ranged from 243,391 to 281,071 per year, a mean annual incidence rate of 0.9%. Individuals with anorexia had a mean (± standard deviation) age of 80.5±8.7 years (vs 74.9±7.5 years without anorexia; p<.001), 64.4% were female (vs 53.8%; p<.001), and 78.4% were White (vs 83.2%; p<.001).
The most common CCI comorbidities among those with anorexia were chronic pulmonary disease (39.4%), dementia (38.3%), and peripheral vascular disease (38.0%). Median (interquartile range [IQR]) CCI was 4 [5] with anorexia (vs 1 [3] without anorexia; p<.001). The annual mortality rate among those with anorexia was 22.3% (vs 4.1% without anorexia; relative risk 5.49 [95% confidence interval, 5.45–5.53]). Conclusion Approximately 1% of all adults aged 65–115 years with Medicare FFS insurance are diagnosed with anorexia each year based on ICD-10 codes reported in claims. These individuals have a higher comorbidity burden and an increased risk of annual mortality compared to those without a diagnosis of anorexia. Further analyses are needed to better understand the relationship between anorexia, comorbidities, frailty, mortality, and other health outcomes. PubDate: 2023-01-16
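The headline rates above are simple ratios over the eligible population, and the mortality comparison is a relative risk with a log-scale Wald interval. A minimal sketch of that arithmetic follows; the counts are approximated from the abstract (the exact per-year denominators are not reported), so the resulting figures are illustrative rather than a reproduction of the published estimates:

```python
import math

def relative_risk(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk of group A vs group B with a log-scale Wald 95% CI."""
    ra, rb = events_a / n_a, events_b / n_b
    rr = ra / rb
    # Standard error of log(RR)
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Annual prevalence: ~320,000 diagnosed among ~30 million eligible (illustrative)
prevalence = 320_000 / 30_000_000   # roughly 0.011, i.e. ~1.1%

# Mortality: 22.3% of ~320,000 with anorexia vs 4.1% of ~29.68 million without
rr, lo, hi = relative_risk(71_360, 320_000, 1_216_880, 29_680_000)
```

With denominators this large the Wald interval is extremely tight, which is why the published CI (5.45–5.53) spans less than a tenth of a unit.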
Abstract: Objectives Malnutrition, particularly protein insufficiency, is common in institutionalised older adults and increases morbidity, mortality, and costs. We aimed to determine whether 12 months of supplementation using high-protein foods (milk, cheese, yoghurt) prevents malnutrition in older adults. Design Cluster randomised controlled study. Setting Sixty Australian aged care facilities. Participants Older adults living in aged care homes (n=654, mean age 86.7±7.2 years, 72% female). Intervention Facilities were randomly allocated to a high-protein (n=30, intervention) or regular (n=30, control) menu. Measurements Nutritional status was assessed using the Mini Nutritional Assessment (MNA) tool, and fasting morning blood samples (n=302) were assayed for haemoglobin (Hb) and albumin. Food intake was monitored 3-monthly using visual plate-waste assessment. Measurements at baseline and month 12 were analysed using a random effects model accounting for clustering (facility), repeated measures and confounders. Results The addition of 11 g of protein daily, as 1.5 servings of high-protein foods, preserved nutritional status, which deteriorated in controls [MNA screen (−0.68, 95% CI: −1.03, −0.32, p<0.001) and total (−0.90, 95% CI: −1.45, −0.35, p=0.001) scores]. This resulted in group differences in MNA screen (0.62, 95% CI: 0.17, 1.06, p=0.007) and total (0.81, 95% CI: 0.11, 1.51, p=0.023) scores, and a group difference in Hb (3.60 g/L, 95% CI: 0.18, 7.03, p=0.039), the net result of preservation with the intervention (0.19 g/L, 95% CI: −2.04, 2.42, p=0.896) and a decline in controls (−3.41 g/L, 95% CI: −6.01, −0.82, p=0.010). No group differences were observed for serum albumin. Conclusion Consumption of high-protein foods is a pragmatic approach to maintaining nutritional status in older adults in aged care. PubDate: 2023-01-10
Abstract: Background In recent years, a potential beneficial role of Vitamin K in neuromuscular function has been recognised. However, the optimal dietary intake of Vitamin K to support muscle function in the context of falls prevention remains unknown. Objective To examine the relationship of dietary Vitamin K1 and K2 intake with muscle function and long-term injurious fall-related hospitalisations in older women. Design Cohort study. Participants 1347 community-dwelling Australian women aged ≥70 years. Measurements A new Australian Vitamin K nutrient database, supplemented with published data, was used to calculate Vitamin K1 and K2 intake from a validated food frequency questionnaire at baseline (1998). Muscle function (grip strength and timed-up-and-go; TUG) as well as plasma Vitamin D status (25OHD) were also assessed at baseline. Fall-related hospitalisations over 14.5 years were obtained from linked health records. Multivariable-adjusted logistic regression and Cox proportional hazards models were used to analyse the data. Results Over 14.5 years of follow-up (14,774 person-years), 535 (39.7%) women experienced a fall-related hospitalisation. Compared to women with the lowest Vitamin K1 intake (Quartile 1, median 49 µg/d), those with the highest intake (Quartile 4, median 120 µg/d) had 29% lower odds (OR 0.71, 95% CI 0.52–0.97) of slow TUG performance (>10.2 s) and 26% lower relative hazards of a fall-related hospitalisation (HR 0.74, 95% CI 0.59–0.93) after multivariable adjustment. These associations were non-linear and plateaued at moderate intakes of ∼70–100 µg/d. There was no association with grip strength. Vitamin K2 intake was not associated with muscle function or falls. Conclusion A higher habitual Vitamin K1 intake was associated with better physical function and a lower long-term risk of injurious falls in community-dwelling older women. In the context of musculoskeletal health, Vitamin K1, found abundantly in green leafy vegetables, should be promoted.
PubDate: 2023-01-10
Abstract: Objectives To investigate associations of nutrition risk (determined by SCREEN-II) and malnutrition (diagnosed by the GLIM criteria) with five-year mortality in Māori and non-Māori of advanced age. Design A longitudinal cohort study. Setting Bay of Plenty and Lakes regions of New Zealand. Participants 255 Māori and 400 non-Māori octogenarians. Measurements All participants were screened for nutrition risk using the Seniors in the Community: Risk Evaluation for Eating and Nutrition (SCREEN-II) tool. For those at high nutrition risk (SCREEN-II score <49), the Global Leadership Initiative on Malnutrition (GLIM) criteria were applied to diagnose malnutrition. Demographic, physical and health characteristics were obtained by trained research nurses using a standardised questionnaire. Five-year mortality was calculated from Government data. The association of nutrition risk (SCREEN-II) and a malnutrition diagnosis (GLIM) with five-year mortality was examined using logistic regression and Cox proportional hazards models of increasing complexity. Results 56% of Māori and 46% of non-Māori participants had low SCREEN-II scores indicative of nutrition risk. The prevalence of GLIM-diagnosed malnutrition was lower for both Māori and non-Māori (15% and 19% of all participants, respectively). Approximately one-third of participants (37% of Māori and 32% of non-Māori) died within the five-year follow-up period. The odds of death for both Māori and non-Māori were significantly lower with greater SCREEN-II scores (better nutrition status) (OR (95% CI): 0.58 (0.38, 0.88), P < 0.05 and 0.53 (0.38, 0.75), P < 0.001, respectively). GLIM-diagnosed malnutrition was not significantly associated with five-year mortality for Māori (OR (95% CI): 0.88 (0.41, 1.91), P > 0.05) but was for non-Māori; this association remained significant after adjustment for other predictors of death (OR (95% CI): 0.50 (0.29, 0.86), P < 0.05).
Reduced food intake was the only GLIM criterion predictive of five-year mortality for Māori (HR (95% CI): 10.77 (4.76, 24.38), P < 0.001). For non-Māori, both aetiologic and phenotypic GLIM criteria were associated with five-year mortality. Conclusion Nutrition risk, but not GLIM-diagnosed malnutrition, was significantly associated with mortality for Māori. Conversely, both nutrition risk and malnutrition were significantly associated with mortality for non-Māori. Appropriate phenotypic criteria for diverse populations are needed within the GLIM framework. PubDate: 2023-01-10
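The mortality associations above are reported as odds ratios from logistic regression. For a single binary exposure without covariate adjustment, an OR reduces to the cross-product of a 2×2 table; the sketch below (with made-up counts, not the study's data, and ignoring the adjusted models the authors actually fitted) shows that computation and its Wald confidence interval:

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 table with a Wald 95% CI.

    a: exposed & died      b: exposed & survived
    c: unexposed & died    d: unexposed & survived
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) via the cell counts
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical: 40/100 high-nutrition-risk participants died vs 60/300 low-risk
or_, lo, hi = odds_ratio(40, 60, 60, 240)
```

A CI excluding 1 corresponds to the P < 0.05 results quoted in the abstract; the adjusted ORs in the paper come from regression models rather than this cross-product.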
Abstract: Objectives Dietary intake information is key to understanding nutrition-related outcomes. Intake changes with age, and some older people are at increased risk of malnutrition. The application, difficulties, and advantages of the 24-hour multiple pass recall (24hr-MPR) dietary assessment method in three cohorts of advanced age in the United Kingdom (UK) and New Zealand (NZ) are described. Participants The Newcastle 85+ study (UK) recruited a single-year birth cohort of people aged 85 years during 2006–7. LiLACS NZ recruited a 10-year birth cohort of Māori (indigenous New Zealanders) aged 80–90 years and a single-year birth cohort of non-Māori aged 85 years in 2010. Measurements Two 24hr-MPR were conducted on non-consecutive days by trained assessors. Pictorial resources and language were adapted for the New Zealand and Māori contexts. Detailed methods are described. Results In the Newcastle 85+ study, 805 (93%) participants consented to the 24hr-MPR, 95% of whom completed two 24hr-MPR; in LiLACS NZ, 218 (82%) consented, and 203 (76%) Māori and 353 (90%) non-Māori completed two 24hr-MPR. Mean time to complete each 24hr-MPR was 22 minutes in the Newcastle 85+ study, and 45 minutes for Māori and 39 minutes for non-Māori in LiLACS NZ. Participants residing in residential care and those requiring proxy respondents were successfully assessed in both studies. Most participants (83–94%) felt that the data captured by the 24hr-MPR reflected their usual dietary intake. Conclusions Dietary assessment using 24hr-MPR successfully captured detailed dietary data, including portion size and time of eating, for over 1300 octogenarians in the UK and New Zealand (Māori and non-Māori). The 24hr-MPR is an acceptable method of dietary assessment in this age group. PubDate: 2023-01-09
Abstract: Objectives To examine the association between metabolic syndrome (MetS) and frailty, and to determine whether co-existent MetS and frailty affect disability-free survival (DFS), assessed through a composite of death, dementia or physical disability. Design Longitudinal study. Setting and Participants Community-dwelling older adults from Australia and the United States (n=18,264) from the “ASPirin in Reducing Events in the Elderly” (ASPREE) study. Measurements MetS was defined according to the American College of Cardiology/American Heart Association Task Force on Clinical Practice Guidelines (2018). A modified Fried phenotype and a deficit-accumulation Frailty Index (FI) were used to assess frailty. The association between MetS and frailty was examined using multinomial logistic regression. Cox regression was used to analyze the association between MetS, frailty and DFS over a median follow-up of 4.7 years. Results Among 18,264 participants, 49.9% met the criteria for MetS at baseline. Participants with MetS were more likely to be pre-frail [Relative Risk Ratio (RRR): 1.22; 95% Confidence Interval (CI): 1.14, 1.30] or frail (RRR: 1.66; 95% CI: 1.32, 2.08) than those without MetS. MetS alone did not shorten DFS, while pre-frailty or frailty alone did [Hazard Ratio (HR): 1.68; 95% CI: 1.45, 1.94 and HR: 2.65; 95% CI: 1.92, 3.66, respectively]. Co-existent MetS with pre-frailty/frailty did not change the risk of shortened DFS. Conclusions MetS was associated with pre-frailty and frailty in community-dwelling older individuals. Pre-frailty or frailty increased the risk of reduced DFS, but the presence of MetS did not change this risk. Assessment of frailty may be more important than MetS in predicting survival free of dementia or physical disability. PubDate: 2023-01-01
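Frailty status above was assessed with a modified Fried phenotype, which counts five deficits (weight loss, exhaustion, weakness, slowness, low activity) and classifies by the total. The exact cut-points ASPREE used for each criterion are not given in the abstract, so the sketch below assumes each criterion has already been dichotomised and only illustrates the standard counting rule:

```python
def fried_phenotype(weight_loss, exhaustion, weakness, slowness, low_activity):
    """Classify frailty from five Fried criteria (each a bool).

    Standard rule: 0 deficits -> robust, 1-2 -> pre-frail, 3+ -> frail.
    """
    deficits = sum([weight_loss, exhaustion, weakness, slowness, low_activity])
    if deficits == 0:
        return "robust"
    return "pre-frail" if deficits <= 2 else "frail"

# Example: exhaustion plus slowness gives two deficits -> pre-frail
status = fried_phenotype(False, True, False, True, False)
```

The multinomial regression in the study then models membership in these three categories as the outcome.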
Abstract: Objectives To determine which social network, demographic, and health-indicator variables are associated with SCREEN-8 (nutrition risk) scores at two time points, three years apart, using data from the Canadian Longitudinal Study on Aging. Design A retrospective cross-sectional study. Setting and Participants 17,051 Canadians aged 45 years and older with data from baseline and the first follow-up of the Canadian Longitudinal Study on Aging. Measurements Nutrition risk was measured using SCREEN-8. Social network factors included social network size, frequency of contact with social network members, social participation, social support, self-rated social standing, and household income. Demographic variables included age, sex assigned at birth, marital status, educational attainment, and living situation (alone or with others). Health-indicator variables included depression, disability, and self-rated general health, mental health, healthy aging, and oral health. Multivariable linear regression was used to analyze the relationship between these variables and SCREEN-8 scores at the two time points. Results Among the social network variables, individuals with higher social participation, self-rated social standing, and social support had higher SCREEN-8 scores at baseline and follow-up. Among the demographic variables, individuals who were single or widowed, compared to married or partnered, had lower SCREEN-8 scores at both time points. For the health-indicator variables, individuals who screened negative for depression and those with higher self-rated general health, healthy aging, and oral health had higher SCREEN-8 scores at both time points. At baseline, SCREEN-8 scores increased with age. Conclusion Individuals with low social participation, low social standing, and low social support may be at increased nutrition risk and should be proactively screened by healthcare professionals.
Interventions and community programs designed to increase levels of social participation and foster social support may help to reduce the prevalence of nutrition risk. PubDate: 2022-12-28
Abstract: Objectives Changes in the oral cavity can reflect changes elsewhere in the body. This study aimed to investigate the association of dental caries with muscle mass, muscle strength, and sarcopenia, and to describe the microbial diversity, composition, and community structure associated with severe dental caries and sarcopenia. Design Cross-sectional study of a Chinese population aged 50 to 85 years. Setting Communities in Lanxi City, Zhejiang Province, China. Participants A total of 1,442 participants aged 50 to 85 years from a general community (62.8% women; median age 61.0 [interquartile range: 55.0, 68.0]). Measurements Dental caries was assessed by the decayed, missing, and filled teeth (DMFT) index. Sarcopenia was defined as the presence of both low muscle mass (assessed by dual-energy X-ray absorptiometry) and low muscle strength (assessed by handgrip strength). Multivariate logistic regression models were used to analyze the association of dental caries with muscle mass, muscle strength, and sarcopenia. Fecal samples underwent 16S rRNA profiling to evaluate the diversity and composition of the gut microbiota in participants with severe dental caries and/or sarcopenia. Results In the fully adjusted logistic models, dental caries was positively associated with low muscle strength (DMFT ≥ 7: OR, 1.61; 95% CI, 1.25–2.06) and sarcopenia (DMFT ≥ 7: OR, 1.51; 95% CI, 1.01–2.26), but not with low muscle mass. Severe dental caries was associated with higher alpha-diversity indices (richness, Chao1, and ACE, all p < 0.05) and with beta-diversity based on Bray-Curtis distance (p = 0.006). The severe dental caries group and the sarcopenia group shared 11 depleted and 13 enriched genera. Conclusion Dental caries was positively associated with low muscle strength and sarcopenia but not with low muscle mass, and this association was more pronounced in males.
Significant differences in gut microbiota composition were observed for both severe dental caries and sarcopenia, with overlapping genus-level features. Future longitudinal studies are needed to clarify causal relationships. PubDate: 2022-12-21
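The alpha-diversity indices named above (richness, Chao1, ACE) are standard estimators computed over 16S amplicon count tables. As one example, the classic Chao1 estimator extrapolates unseen taxa from the numbers of singletons and doubletons; a minimal sketch with illustrative counts (not the study's data):

```python
def chao1(counts):
    """Classic Chao1 richness estimate from per-taxon read counts.

    S_chao1 = S_obs + F1^2 / (2 * F2), where F1 and F2 are the numbers
    of taxa seen exactly once and exactly twice. Falls back to the
    bias-corrected form when no doubletons are present.
    """
    observed = [c for c in counts if c > 0]
    s_obs = len(observed)
    f1 = sum(1 for c in observed if c == 1)   # singletons
    f2 = sum(1 for c in observed if c == 2)   # doubletons
    if f2 > 0:
        return s_obs + f1 * f1 / (2 * f2)
    return s_obs + f1 * (f1 - 1) / 2          # bias-corrected variant (F2 == 0)

# Six observed taxa, three singletons, one doubleton: 6 + 3**2/(2*1) = 10.5
richness = chao1([5, 3, 1, 1, 2, 1])
```

Higher Chao1 values in the severe-caries group mean more estimated total taxa, including those too rare to have been sampled.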
Abstract: Background There is conflicting evidence regarding the association between vitamin D status and cognitive function in population studies. A one-time vitamin D measurement in cognitive health studies may not reflect long-term vitamin D status in the body. Objective We aimed to examine the relationship of vitamin D status measured over time with the risk of neurocognitive disorders (NCDs) in Norwegian older adults. Design Prospective cohort study. Setting Regional; the Trøndelag Health Study (HUNT). Participants This study followed a random cohort of 717 participants from HUNT2 (1995–97) and HUNT3 (2006–08) to HUNT4 70+ (2017–19). The mean age at HUNT4 70+ was 77.7 years. Methods Season-standardized serum 25-hydroxyvitamin D [25(OH)D] levels in HUNT2 and HUNT3 were averaged and used as either a categorical variable (<50 and ≥50 nmol/L) or a continuous variable (per 25 nmol/L decrease). In the cohort aged 70 years or over (HUNT4 70+), NCDs, consisting of mild cognitive impairment (MCI) and dementia, were diagnosed by clinical experts according to the DSM-5 criteria. Logistic and linear regression models were used to estimate odds ratios (ORs) and regression coefficients (beta) with 95% confidence intervals (CIs) for the relationship between 25(OH)D levels and the risk of NCDs or the Montreal Cognitive Assessment (MoCA) score. Results In total, 347 (48.4%) participants had NCDs at HUNT4, with 33.3% having MCI and 15.1% having dementia. Compared with participants with serum 25(OH)D ≥50 nmol/L, those with 25(OH)D <50 nmol/L had a similar risk of NCDs (OR 1.05, 95% CI 0.76 to 1.46). No association was observed with the risk of MCI (OR 1.01, 95% CI 0.71 to 1.44) or dementia (OR 1.16, 95% CI 0.70 to 1.92). In a subsample of participants evaluated with the MoCA (n=662), a 25 nmol/L decrease in serum 25(OH)D was not associated with a change in MoCA score (beta 0.33, 95% CI −0.17 to 0.85).
Conclusion Vitamin D insufficiency, defined by two measurements of serum 25(OH)D taken 10 years apart, was not associated with the risk of NCDs in a cohort of older Norwegian adults. Future studies utilizing multiple vitamin D measurements, longer follow-up, and larger sample sizes are warranted. PubDate: 2022-12-07
Abstract: Objectives Oxidative stress and systemic inflammation are the main pathways by which air pollutants cause hypertension (HTN). Vitamin C intake may reduce the risk of HTN caused by air pollutants. This study aimed to investigate the association between air pollutants and pre-HTN and HTN in Korean adults and whether these associations were modified by vitamin C intake, using data from the 2013–2016 Korean National Health and Nutrition Examination Survey (KNHANES). Design Cross-sectional study. Setting This study used data from the KNHANES VI (2013–2015) and VII (2016) along with the data from the annual air pollution report of the Ministry of Environment. Participants We included 11,866 adults who had responded to a semi-food frequency questionnaire. Measurements We used survey logistic regression models to evaluate the association of ambient PM10, SO2, NO2, CO, and O3 with pre-HTN and HTN according to vitamin C intake. Results After adjusting for potential covariates, exposure to ambient PM10, SO2, NO2, and CO was significantly associated with a high prevalence of pre-HTN and HTN, whereas exposure to O3 was significantly associated with a low prevalence of pre-HTN and HTN. In particular, as the air pollutant scores increased (severe air pollution), the prevalence of pre-HTN and HTN increased in a dose-dependent manner (highest score vs. lowest score, OR=1.85, 95% CI=1.39–2.46, p for trend <.0001). However, these associations were found to be pronounced in adults with low vitamin C intake (highest score vs. lowest score, OR=2.30, 95% CI=1.50–3.54, p for trend <.0001), whereas the statistical significance disappeared for adults with high vitamin C intake (highest score vs. lowest score, OR=1.40, 95% CI=0.93–2.12, p for trend=0.007). Conclusion Exposure to air pollutants such as PM10, SO2, NO2, and CO may increase the prevalence of pre-HTN and HTN among Korean adults. 
In addition, a high intake of vitamin C may help prevent pre-HTN and HTN caused by air pollutants. PubDate: 2022-12-05
Abstract: In the Conflict of interest section of this article, the author did not declare any conflict of interest in the sentence cited below: PubDate: 2022-12-01
Abstract: Objectives The study aimed to evaluate the brief F3ALLS assessment’s validity in screening for fall risk. Design This is a cross-sectional and longitudinal study. Setting Participants were recruited from outpatient primary care clinics. Participants Older ambulatory adults aged 65–90 years volunteered for this study. Measurements Falls risk was measured with the TGBA and F3ALLS questionnaires. A 6-month follow-up period assessed falls using falls diaries and chart review. Results Participants (n=97) were older adults aged 73.91±6.4 years, 68% (n=66) female. 31% of participants reported at least one fall at 6 months. F3ALLS scores were higher in participants who reported 1 or more falls at 6-month follow-up (3.23±1.5). Higher F3ALLS scores were associated with 6-month fall risk (OR=1.463, 95% CI=1.098–1.949). A score > 3 stratified patients as at risk of falling (AUC=0.77, P<.001; Sensitivity=0.65, Specificity=0.71). Conclusion The F3ALLS questionnaire adequately classifies persons at risk versus not at risk for falls, and higher (worse) F3ALLS scores are associated with falls over 6 months. PubDate: 2022-12-01
Abstract: Background Polypharmacy, frailty and malnutrition are known predictors of adverse outcomes in dialysis patients. Little has been reported about their interaction and composite prognostic value. We aimed to describe the interaction between polypharmacy, frailty, nutrition, hospitalization, and survival in peritoneal dialysis patients. Methods In this prospective cohort study, we recruited 573 peritoneal dialysis patients. Drug burden was measured by medication number and daily pill load. Frailty and nutrition were assessed by the validated Frailty Score (FQ) and Subjective Global Assessment (SGA), respectively. All patients were followed for two years. The primary outcome was all-cause mortality. Secondary outcomes were fall and fracture episodes, hospitalization, and change in FQ and SGA. Results At baseline, each patient took 7.5 ± 2.6 medications with 15.5 ± 8.5 tablets per day. Medication number, but not daily pill load, predicted baseline FQ (p = 0.004) and SGA (p = 0.03). Over 2 years, there were 69 fall and 1,606 hospitalization episodes. In addition, 148 (25.8%) patients died, while FQ and SGA changed by 0.73 ± 4.23 and −0.07 ± 1.06, respectively, in survivors. Medication number (hospitalization: p = 0.02; survival: p = 0.005) and FQ (hospitalization: p < 0.001; survival: p = 0.01) predicted hospitalization and survival. Medication number also predicted fall episodes (p = 0.02) and frailty progression (p = 0.002). Daily pill load did not predict any of these outcomes. Conclusions Drug burden is high in peritoneal dialysis patients, and it carries important prognostic implications. Medication number, but not pill load, significantly predicted onset and progression of frailty, malnutrition, falls, hospitalization, and mortality. PubDate: 2022-11-14 DOI: 10.1007/s12603-022-1859-8
Abstract: Objectives The colorectal cancer (CRC) burden is increasingly high. The aim of this study was to investigate temporal and geographical trends in CRC deaths and disability-adjusted life-years (DALYs) attributable to a diet low in fiber globally from 1990 to 2019. Design Cross-sectional study. Setting The study was based on the Global Burden of Disease Study (GBD) 2019. Participants The population comprised individuals from 204 countries and territories who were diagnosed with CRC attributable to a diet low in fiber from 1990 to 2019. Measurements Deaths, DALYs, age-standardized mortality rates (ASMR), and age-standardized DALY rates (ASDR) for CRC attributable to a diet low in fiber were described, and the estimated annual percentage change (EAPC) was further calculated to assess the burden in different regions, countries, sexes, and age groups. Additionally, we explored the association between EAPC and ASMR/ASDR (in 1990) and the Human Development Index (HDI, in 2019). Results From 1990 to 2019, global ASMR and ASDR for CRC attributable to a diet low in fiber decreased slightly, but the corresponding deaths and DALYs increased by 63.37% and 51.36%, respectively. This burden varied considerably between regions and countries. The burden was higher in high, high-middle and middle SDI regions, especially in Asia and Western Europe, but when HDI > 0.7, an increasingly rapid decline in ASMR and ASDR was revealed. Unexpectedly, many less well-developed countries within the traditionally low-deaths and low-DALYs regions of Africa, Central Latin America, and the Middle East showed gradual increases in ASMR and ASDR. Conclusion The global burden of CRC attributable to a diet low in fiber has decreased over the last 30 years, but remains at a high level. It is essential for decision-makers to take targeted measures to improve population awareness and intake of dietary fiber. PubDate: 2022-11-14 DOI: 10.1007/s12603-022-1865-x