Oxygen free radicals [reactive oxygen species (ROS)] and nitrogen free radicals [reactive nitrogen species (RNS)] are generated by mitochondria during adenosine triphosphate synthesis, and by the catalytic activities of cytochrome P450, nicotinamide adenine dinucleotide phosphate oxidases (NOXs), cyclooxygenases, and nitric oxide synthases during drug catabolism, phagocytosis, and acute inflammation. Under normal circumstances, low levels of ROS and RNS provide redox signaling that controls many essential physiological processes. As age progresses, ROS and RNS increase excessively due to dysfunctional mitochondria, dysregulated NOXs, and other free-radical-generating sources, leading to oxidative stress. Oxidative stress causes oxidation and denaturation of key cellular components, including DNA, proteins, and lipids; these abnormal molecules constitute damage-associated molecular patterns (DAMPs), which are recognized as ‘non-self’ by immune cells and trigger inflammation mediated by the nuclear factor kappa B-inflammasome, p38-c-Jun N-terminal kinase, and Janus kinase-signal transducer and activator of transcription pathways. DAMPs are continuously released from damaged and senescent cells, turning an otherwise transient inflammation into systemic chronic inflammation, the root cause of aging and age-associated diseases (AADs). Cells restore redox balance by activating the nuclear factor erythroid 2-related factor 2 (Nrf2) pathway, which induces the synthesis and release of antioxidant molecules and enzymes, including haem oxygenase-1, which also inhibits the three inflammatory pathways. Furthermore, upregulation of autophagy (AP) can remove abnormal molecules, prevent the generation of DAMPs, and attenuate inflammation. Both AP and Nrf2 signaling decrease with age.
Upregulation of Nrf2 and AP and downregulation of inflammation are controlled by sensors of energy and stress levels, i.e., adenosine monophosphate-activated protein kinase, silent information regulator 1, and Sestrins, as well as by the extracellular matrix, whereas mammalian target of rapamycin complex 1, a nutrient sensor, acts in the opposite direction. If the balance of these sensor systems becomes dysregulated, the aging process accelerates and the risk of AADs increases.
The gastrointestinal (GI) microbiome remains an emerging topic of study, and its characterization and impact on human health and disease continue to be areas of great interest. Similarly, the coronavirus disease 2019 (COVID-19) pandemic has significantly impacted the healthcare system with active disease, lasting effects, and complications, and its full impact is yet to be determined. The most current evidence on the interaction between COVID-19 and the GI microbiome is reviewed here, with a focus on key mediators and the microbiome changes associated with acute disease and post-acute COVID-19 syndrome (PACS).
Forward head posture (FHP) is a very common pathological neck posture among people who frequently use multimedia devices, and it could be related to some musculoskeletal disorders. However, its role in influencing lung function and its relationship with neck disability are still debated in the literature. Therefore, the aim of the present study was to investigate the influence of FHP on respiratory function, and to explore a possible relationship between FHP and neck discomfort.
A cross-sectional study was conducted on a sample of 83 subjects (aged 35.7 ± 8.4 years) enrolled in the Ferrari corporate wellness program “Formula Benessere”. The craniovertebral angle (CVA) was measured with a digital goniometer to assess head posture: FHP was defined as a CVA < 50° in an upright position. Spirometry was conducted according to European Respiratory Society/American Thoracic Society (ERS/ATS) criteria. Finally, enrolled subjects were evaluated through a self-administered neck disability index (NDI) questionnaire.
Among the 60 participants for whom CVA measurements were in agreement, 45 had FHP (11 females and 34 males), with lower CVA values. No significant differences in spirometric parameters were found between subjects with FHP (n = 45) and subjects without FHP (n = 15). Furthermore, the two groups did not differ in NDI scores (P = 0.148).
There is no clear relationship between FHP and respiratory function indices. Moreover, no differences were found in NDI values between subjects with and without FHP. Respiratory rehabilitation strategies should therefore focus on parameters other than FHP itself.
DNA paralogs that are at least 1 kilobase (kb) in length and duplicated with a sequence identity of over 90% are classified as low copy repeats (LCRs) or segmental duplications (SDs). They constitute 6.6% of the genome and cluster in specific genomic loci. Due to the high sequence homology between these duplicated regions, they can misalign during meiosis, resulting in non-allelic homologous recombination (NAHR) and leading to structural variation such as deletions, duplications, inversions, and translocations. When such rearrangements result in a clinical phenotype, they are categorized as a genomic disorder. The presence of multiple copies of larger genomic segments offers opportunities for evolution. First, the creation of new genes in the human lineage can lead to human-specific traits and adaptation. Second, LCR variation between human populations can give rise to phenotypic variability. Hence, the rearrangement predisposition associated with LCRs should be interpreted in the context of these evolutionary advantages.
The main objective of the study was to formulate, evaluate, and optimize a chaulmoogra oil-loaded solid lipid nanoparticle (SLN)-based gel.
The study involved isolation, identification, and quantification of hydnocarpic acid (HA) using high-performance thin-layer chromatography (HPTLC), with characterization by ultraviolet (UV) spectroscopy, nuclear magnetic resonance (NMR), mass spectrometry (MS), and differential scanning calorimetry (DSC). Different concentrations of assorted solid lipids and surfactants were used for the preparation of an SLN gel with improved transdermal application. Size distribution, entrapment efficiency, transmission electron microscopy (TEM), and percent yield were assessed for the prepared SLNs, and the SLN gel was characterized on the basis of an in vitro diffusion study, stability studies, homogeneity, and a skin irritancy test.
The amount of HA quantified in the oil sample was found to be 54.84% w/w. The percent yield and entrapment efficiency (EE) of HA SLNs were 96.176 ± 1.338% and 90.2 ± 0.5%, respectively. The in vitro percent cumulative drug release was 80.89% for the developed SLNs, the homogeneity test showed no grittiness, and the prepared gel was found to be effective, showing no signs of erythema after 10 days of treatment. The in vitro dissolution studies showed better results for the SLN gel compared to the SLN suspension.
The nano-gel could be a better option for the topical delivery of herbal drugs, offering improved bioavailability and several benefits over conventional formulations.
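The entrapment efficiency and percent yield reported above are conventionally calculated from simple mass ratios. A minimal sketch follows; the input masses are illustrative placeholders, not values from this study:

```python
# Conventional mass-ratio definitions used in SLN characterization.
# All numeric inputs below are hypothetical, chosen only for illustration.

def entrapment_efficiency(total_drug_mg: float, free_drug_mg: float) -> float:
    """EE% = (drug added - unentrapped drug) / drug added x 100."""
    return (total_drug_mg - free_drug_mg) / total_drug_mg * 100


def percent_yield(recovered_mass_mg: float, total_input_mass_mg: float) -> float:
    """Yield% = mass of SLNs recovered / total mass of ingredients x 100."""
    return recovered_mass_mg / total_input_mass_mg * 100


print(round(entrapment_efficiency(100.0, 9.8), 1))   # 90.2
print(round(percent_yield(480.9, 500.0), 2))         # 96.18
```

With 100 mg of drug added and 9.8 mg left unentrapped in the supernatant, EE works out to 90.2%, matching the order of magnitude reported above.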
The novel coronavirus disease 2019 (COVID-19) has created a major public health crisis. Various dietary factors may enhance immunological activity against COVID-19 and serve as a method to combat severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). The dietary factors that are responsible for boosting immunity may provide a therapeutic advantage in patients with COVID-19. Investigators have demonstrated that vitamins B6, B12, C, D, E, and K, and trace elements such as zinc, copper, selenium, and iron, may serve as important tools for immunomodulation. Herein, the peer-reviewed literature pertaining to dietary immunomodulation strategies against COVID-19 is reviewed. This review is intended to better define the evidence that dietary modifications and supplementation could positively influence the proinflammatory state in patients with COVID-19 and improve clinical outcomes. With appropriate insight, therapeutic interventions are discussed and directed at potentially modulating host immunity to mitigate the disease mechanisms of COVID-19.
The periodontium is an appropriate target for regeneration, as it cannot restore its function following disease. Significantly, the periodontium’s limited regenerative capacity could be enhanced through the development of novel biomaterials and therapeutic approaches. Notably, the regenerative potential of the periodontium depends not only on its tissue-specific architecture and function but also on its ability to reconstruct distinct tissues and tissue interfaces, implying that the development of tissue engineering techniques can offer new perspectives for the organized reconstruction of soft and hard periodontal tissues. With their biocompatible structure and one-of-a-kind stimulus-responsive property, hydrogels have been utilized as an excellent drug delivery system for the treatment of several oral diseases. Furthermore, bioceramics and three-dimensional (3D) printed scaffolds are also appropriate scaffolding materials for the regeneration of periodontal tissue, bone, and cartilage. This work aims to examine and update material-based, biologically active cues and the deployment of breakthrough bio-fabrication technologies to regenerate the numerous tissues that comprise the periodontium for clinical and scientific applications.
Childhood obesity is accompanied by an increased prevalence of abnormal glucose tolerance (AGT), including prediabetes states. This study aims to evaluate the use of the oral glucose tolerance test (OGTT) for detecting AGT among overweight and obese children.
A retrospective study was conducted on 895 overweight and obese Chinese children (6–18 years), with obesity assessment and analysis of demographic, anthropometric, and biochemical parameters, between January 2006 and December 2015 at Tseung Kwan O Hospital, Hong Kong Special Administrative Region.
The proportions of males and of the older age group were 63.7% and 55.9%, respectively. Girls were more frequently in the older age group (62.7% vs. 52.0%, P = 0.002). AGT occurred in 17.1% of the cohort [impaired glucose tolerance (IGT) was the most frequent morbidity (11.3%)]. After regression analysis, female sex, low-density lipoprotein (LDL), triglyceride (TG), older age group, and homeostasis model assessment of insulin resistance (HOMA-IR) ≥ 4.1 were significantly associated with AGT.
AGT is common in overweight and obese Chinese children. Female sex, older age, higher LDL and TG, and HOMA-IR ≥ 4.1 showed significant associations with AGT. The OGTT is essential and fit for purpose for detecting AGT in overweight and obese children.
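The HOMA-IR cut-off of ≥ 4.1 applied above refers to the standard homeostasis model assessment index, conventionally computed as fasting glucose (mmol/L) × fasting insulin (µU/mL) / 22.5. A minimal sketch with illustrative inputs (not study data):

```python
# Standard HOMA-IR formula; the 4.1 cut-off is the one used in this study.
# The glucose/insulin values in the example are hypothetical.

def homa_ir(fasting_glucose_mmol_l: float, fasting_insulin_uu_ml: float) -> float:
    """Homeostasis model assessment of insulin resistance."""
    return fasting_glucose_mmol_l * fasting_insulin_uu_ml / 22.5


def insulin_resistant(glucose_mmol_l: float, insulin_uu_ml: float,
                      cutoff: float = 4.1) -> bool:
    """Flag insulin resistance against the study's HOMA-IR >= 4.1 threshold."""
    return homa_ir(glucose_mmol_l, insulin_uu_ml) >= cutoff


print(round(homa_ir(5.6, 18.0), 2))   # 4.48
print(insulin_resistant(5.6, 18.0))   # True
print(insulin_resistant(4.5, 10.0))   # False
```

For example, a fasting glucose of 5.6 mmol/L with insulin at 18 µU/mL yields an index of about 4.48, which exceeds the study threshold.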
To evaluate angiotensin II (Ang II) and Ang-(1-7) levels and the cytokine profile in patients hospitalized with mild coronavirus disease 2019 (COVID-19) and contrast them with patients with identical clinical conditions but treated with high doses of vitamin D (vitD).
Of the 218 patients recruited (ClinicalTrials.gov identifier: NCT04411446), 16 participated in this sub-study and were randomized to a single oral dose of 500,000 IU vitD (n = 10) or placebo (n = 6). Plasma Ang II and Ang-(1-7) levels were determined by radioimmunoassay, and interleukins (ILs) 1, 6, 8, and 10 and tumor necrosis factor alpha (TNF-α) by enzyme-linked immunosorbent assay, before and after treatment. In parallel, serum 25-hydroxyvitamin D3 (25-OH vitD) concentration, as a measure of vitD status, was determined by chemiluminescence immunoassay.
A trend towards an increase in Ang-(1-7) and a decrease in Ang II levels was observed in both placebo- and vitD-treated COVID-19 patients compared to baseline values. There was no difference in Ang II and Ang-(1-7) levels between placebo- and vitD-treated COVID-19 patients. Similar results were obtained for the IL profile. COVID-19 patients showed an increase in the protective component of the renin-angiotensin system (RAS), which was not improved by vitD treatment.
VitD did not improve the RAS imbalance in COVID-19. Notwithstanding, the authors suggest that acute treatment with high doses of vitD may show a trend towards a decline in inflammatory ILs and an increase in protective markers. Finally, the authors would like to highlight the limitations of this preliminary study, namely the small number of patients and the use of a large single bolus dose of vitD rather than lower daily doses for extended periods with prolonged follow-up times. All these factors need special consideration in the design of new vitD supplementation trials (ClinicalTrials.gov identifier: NCT04411446).
Given the myriad of negative sequelae associated with cancer and its treatment, the palliative use of cannabis by cancer patients is of increasing interest. This research sought to explore associations of acute and sustained use of legal market edible cannabis products with pain, cognition, and quality of life in a group of cancer patients.
In this observational study, cancer patients completed a baseline appointment, a two-week ad libitum cannabis use period, and an acute administration appointment that included assessments before cannabis use, one-hour post-use, and two-hour post-use. Participants completed self-report questionnaires related to the primary outcomes and the Stroop task as a measure of objective cognitive function.
Twenty-five participants [mean (standard deviation, SD) age = 54.3 years (15.6); 13 females (52.0%)] completed all study appointments and were included in the analysis. Sustained cannabis use was associated with improvements in pain intensity, pain interference, sleep quality, subjective cognitive function, and reaction times in the Stroop task, but no change in general quality of life was observed. High levels of cannabidiol (CBD) use during the two-week ad libitum use period were associated with steeper improvements in pain intensity and sleep quality. Participants reported improvements in pain intensity and increased feelings of subjective high after acute use. High levels of Δ9-tetrahydrocannabinol (THC) use during the acute administration appointment were associated with steeper increases in feelings of subjective high. Improvements in pain were associated with improvements in subjective cognitive function.
This observational study is among the first of its kind to examine associations between legal market palliative cannabis use and subjective and objective outcomes among cancer patients. These early findings concerning pain intensity, sleep quality, and cognitive function can help to inform future, fully powered studies of this important topic (ClinicalTrials.gov identifier: NCT03617692).
To investigate the causal impact of diet and sedentary behavior on Brazilian schoolchildren’s overweight/obesity using the data from observational studies.
Annual cross-sectional nutritional surveys over the 2013–2015 period, covering 26,712 children aged 7–12 years in Florianópolis, Brazil, provided the data for this analysis. The surveys applied an online previous-day recall questionnaire on food intake and physical/sedentary activities. Outcome measures were overweight/obesity, whereas exposure variables were daily frequencies of consuming sugary drinks and ultra-processed foods, the total number of dietary items consumed and the total number of sedentary activities per day, and consumption of breakfast, mid-morning snack, lunch, afternoon snack, dinner, and evening snack. Control variables included child age, sex, family income, school shift, survey year, the day of the week to which the questionnaire referred, metabolic equivalents (METs) of physical activities (PAs), and the quality of dietary and PA reports. Causal effects were estimated by augmented inverse probability weighting.
Daily consumption of sugary drinks, eating ten or more foods, and engaging in three or more sedentary behaviors per day significantly increased the odds of being overweight/obese, with odds ratios (ORs) corresponding to increases of 3–24% over the reference and 95% confidence intervals in the range of 1–32%. Among the 19 ORs with P-value ≤ 0.05, only 3 corresponded to increases exceeding 10%.
Under certain conditions, not uncommon in large-scale monitoring and surveillance studies, it is possible to evaluate the causal effects of diet and sedentary activities on overweight/obesity. Daily consumption of sugar-sweetened beverages, eating ten or more foods, skipping breakfast, and engaging in three or more sedentary behaviors per day significantly increased the odds of being overweight/obese.
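Augmented inverse probability weighting (AIPW), the estimator named above, combines an outcome-regression model with inverse-probability-weighted residuals and is doubly robust: it remains consistent if either the propensity model or the outcome model is correct. A minimal sketch on simulated data follows; every variable and the data-generating process are hypothetical, not drawn from the study:

```python
# Minimal AIPW sketch on simulated data with a binary covariate, binary
# exposure, and binary outcome. The true average treatment effect is 0.3
# by construction; nothing here reproduces the study's actual models.
import random

random.seed(0)
n = 20000
data = []
for _ in range(n):
    x = 1 if random.random() < 0.5 else 0                       # covariate
    t = 1 if random.random() < 0.3 + 0.4 * x else 0             # exposure
    y = 1 if random.random() < 0.2 + 0.3 * t + 0.1 * x else 0   # outcome
    data.append((x, t, y))


def group_mean(values):
    return sum(values) / len(values) if values else 0.0


# Nuisance models, estimated nonparametrically within strata of x:
#   e(x)   = P(T=1 | X=x)        (propensity score)
#   m_t(x) = E[Y | T=t, X=x]     (outcome regression)
e = {x: group_mean([ti for xi, ti, _ in data if xi == x]) for x in (0, 1)}
m = {(t, x): group_mean([yi for xi, ti, yi in data if xi == x and ti == t])
     for t in (0, 1) for x in (0, 1)}

# AIPW estimate of the average treatment effect: outcome-model contrast
# plus inverse-probability-weighted residual corrections.
ate = sum(
    m[(1, x)] - m[(0, x)]
    + t * (y - m[(1, x)]) / e[x]
    - (1 - t) * (y - m[(0, x)]) / (1 - e[x])
    for x, t, y in data
) / n
print(round(ate, 3))  # close to the true effect of 0.3
```

In the study, the same machinery is applied with richer covariate and exposure sets, and effects are reported on the OR scale rather than as risk differences.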
To investigate the causal impact of diet and sedentary behavior on Brazilian schoolchildren’s overweight/obesity using the data from observational studies.
Annual cross-sectional nutritional surveys over the 2013–2015 period, with 26,712 children old 7–12 years in Florianópolis, Brazil, provided the data for this analysis. The surveys applied an online previous-day recall questionnaire on food intake and physical/sedentary activities. Outcome measures were overweight/obesity, whereas exposure variables were daily frequencies of consuming sugary drinks and ultra-processed foods, the total number of dietary items consumed and the total number of sedentary activities per day, and consuming breakfast, mid-morning snacks, lunch, afternoon snack, dinner, and evening snack. Control variables included child age, sex, family income, school shift, survey year, day of the week the questionnaire refers to, metabolic equivalents (METs) of physical activities (PAs), and the quality of dietary and PA reports. Causal effects were estimated by augmented inverse probability weighting.
Daily consumption of sugary drinks, eating ten or more foods, and engaging in three or more sedentary behaviors per day significantly increased the odds of being overweight/obese, with odds ratios (ORs) 3–24% above the reference and 95% confidence intervals in the range of 1–32%. Among the 19 ORs with P ≤ 0.05, only 3 exceeded 10%.
Under certain conditions, not uncommon in large-scale monitoring and surveillance studies, it is possible to evaluate the causal effects of diet and sedentary activities on overweight/obesity. Daily consumption of sugar-sweetened beverages, eating ten or more foods, skipping breakfast, and engaging in three or more sedentary behaviors per day significantly increased the odds of being overweight/obese.
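Augmented inverse probability weighting, the estimator named above, combines an outcome-regression model with a propensity-score model and is doubly robust: the causal estimate remains consistent if either model is specified correctly. A minimal numpy sketch on synthetic data, using the true nuisance functions for clarity (all variables and the simulated effect size of 2 are illustrative assumptions, not study data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
x = rng.normal(size=n)                 # a single confounder
e = 1.0 / (1.0 + np.exp(-x))           # true propensity score P(A=1 | X)
a = rng.binomial(1, e)                 # binary exposure
y = 2.0 * a + x + rng.normal(size=n)   # outcome; true causal effect = 2

# Nuisance models (here the true ones; in practice these are fitted):
m1 = 2.0 + x   # E[Y | A=1, X]
m0 = x         # E[Y | A=0, X]

# AIPW (doubly robust) estimates of the two potential-outcome means
mu1 = np.mean(m1 + a * (y - m1) / e)
mu0 = np.mean(m0 + (1 - a) * (y - m0) / (1 - e))
ate = mu1 - mu0   # average treatment effect, close to 2.0
```

The study applied the same idea with fitted nuisance models and a binary overweight/obesity outcome, reporting effects as odds ratios.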
Polycystic ovarian syndrome (PCOS) is the most common endocrine condition, affecting 5–7% of reproductive-age women worldwide. It is associated with low-grade chronic inflammation, insulin resistance, and metabolic syndrome. Studies have identified ceruloplasmin (Cp) as an independent risk factor for metabolic syndrome, while magnesium (Mg) is required for proper glucose utilization. This study aimed to compare serum Mg and Cp levels in PCOS and healthy women and to correlate their levels with changes in biochemical, hormonal, and gynaecological aspects of PCOS.
The study comprised 98 women diagnosed with PCOS using the Rotterdam criteria and 75 age-matched healthy control subjects. The levels of serum Cp and Mg were determined using the Somani-Ambade colorimetric method and the methylthymol blue method, respectively.
Serum Cp was significantly higher and Mg significantly lower in PCOS patients than in controls. Mg was inversely correlated with fasting blood glucose and directly correlated with follicle-stimulating hormone (FSH). Cp was inversely correlated with prolactin and thyroid-stimulating hormone. Multiple regression analysis revealed that Cp correlated with both the level of luteinizing hormone (LH) and the LH/FSH ratio, whereas serum Mg did not correlate significantly with any of the clinical variables. Logistic regression analysis revealed that elevated Cp, antral follicle count (AFC), body mass index (BMI), weight, and irregular menses increased the risk of developing PCOS, whereas Mg was not a risk factor. However, high LH and LH/FSH ratios were risk factors for hypomagnesemia. In conclusion, serum Cp levels in PCOS may be evaluated as an additional risk factor in association with AFC, BMI, weight, and irregular menses.
Mg deficiency and high Cp play an important etiological role in PCOS pathogenesis. Thus, research evaluating dietary interventions and supplementation is warranted.
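The risk factors above come from logistic regression, where each reported effect is an odds ratio obtained by exponentiating a model coefficient. A toy sketch (the coefficient value is invented, purely for illustration):

```python
import math

def odds_ratio(beta: float) -> float:
    """A logistic-regression coefficient beta corresponds to OR = exp(beta)."""
    return math.exp(beta)

# A hypothetical coefficient of 0.693 would mean the odds of PCOS roughly
# double per unit increase in the predictor.
print(round(odds_ratio(0.693), 2))  # 2.0
```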
In patients with cancer, ischemic heart disease, and peripheral vascular disease, the neutrophil-lymphocyte ratio (NLR), a measure of systemic inflammation, has been demonstrated to predict mortality. This study aimed to evaluate inflammatory status and to examine the impact of NLR on renal outcomes (mortality and composite endpoints) in non-dialysis chronic kidney disease (CKD) patients.
This prospective cohort study was conducted at a tertiary care public teaching hospital. An NLR greater than 3.53 was taken as an indication of systemic inflammation. The outcome measures included composite endpoints (end-stage renal disease, dialysis commencement, doubling of serum creatinine from baseline) and mortality. Kaplan-Meier plots and a multivariate Cox proportional hazards model were employed to analyze the outcomes.
A cohort of 360 patients (mean age 53.7 years ± 13.9 years; median follow-up 14 months ± 4.24 months) was evaluated for inflammatory status and renal outcomes. Inflammation was present in 101 patients (28.7%). Higher NLR levels were associated with an increased incidence of mortality (5.3%) and composite endpoints (12.3%). Relative to the lowest NLR quartile (Q1), the highest quartile (Q4) showed a 3-fold increase in the hazard of mortality [hazard ratio (HR) 3.09; 95% confidence interval (CI) 1.38–6.91; P = 0.006] and a 93% increase in the hazard of composite endpoints (HR 1.93; 95% CI 1.22–3.08; P = 0.005). Higher NLR was positively associated with urea, creatinine, alkaline phosphatase, and the Pt-Global web tool©/Patient-Generated Subjective Global Assessment score, and negatively correlated with estimated glomerular filtration rate, albumin, and hemoglobin.
NLR is a potential predictor of mortality and composite endpoints in CKD patients even before they undergo dialysis. Additionally, inflammation should be regarded as a common comorbid condition in CKD patients due to its high prevalence.
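The NLR itself is simple arithmetic on a differential blood count. A small sketch applying the study's 3.53 cutoff (the cell counts below are invented for illustration, not patient data):

```python
def neutrophil_lymphocyte_ratio(neutrophils: float, lymphocytes: float) -> float:
    """NLR = absolute neutrophil count / absolute lymphocyte count."""
    if lymphocytes <= 0:
        raise ValueError("lymphocyte count must be positive")
    return neutrophils / lymphocytes

def indicates_systemic_inflammation(nlr: float, cutoff: float = 3.53) -> bool:
    """Apply the study's threshold: NLR > 3.53 suggests systemic inflammation."""
    return nlr > cutoff

# Example counts in 10^9 cells/L (hypothetical):
nlr = neutrophil_lymphocyte_ratio(7.4, 1.8)    # about 4.11
print(indicates_systemic_inflammation(nlr))     # True
```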
There is a correlation between the number of resected lymph nodes (LNs) and survival as well as staging in patients with colorectal cancer (CRC). This cohort study examined the effect of the number of dissected LNs on prognosis [survival and disease-free survival (DFS)] in patients with stage II and III CRC.
In this historical prospective cohort study, the records of 946 patients with CRC operated on at the Seyyed-Al-Shohada hospital in Isfahan from 1998 to 2014 were enrolled. The impact of the number of removed LNs on overall survival (OS) and DFS was then analyzed.
The number of removed LNs was higher among males [mean difference = 1.38, t(944) = 2.232, P = 0.02]. Median DFS for patients with 1 to 20 LNs removed was 104 months [95% confidence interval (CI): 90.97–117.03], whereas for patients with more than 20 nodes removed it was 166 months (95% CI: 140.41–191.58); DFS thus differed between the two groups (1–20 vs. more than 20 LNs removed). Age and the number of LNs removed were significant predictors of DFS. There was a strong and statistically significant correlation between DFS and OS among CRC patients.
This study shows that resection of more than 20 LNs in patients with CRC is associated with increased DFS and OS.
Overweight and obesity negatively modify the course of bronchial asthma (BA). The mechanisms of the aggravating effect of obesity on the course of BA have not yet been fully determined, but they include changes in external respiration. The aim of the study was to investigate the effect of overweight/obesity on spirometric parameters and on the occurrence of dysanapsis in children and adolescents with BA.
This was a cross-sectional, open, single-center study. Data were obtained from 428 patients with atopic BA aged 7 years to 17 years (median 12.0 [9.0; 14.0]), of whom 72.9% (312/428) were boys. The children were divided into 3 groups: group 1, normal body weight; group 2, overweight; and group 3, obesity. All participants underwent spirometry, the ratio of forced expiratory volume in 1 second (FEV1) to forced vital capacity (FVC) was calculated, and dysanapsis was diagnosed.
As body weight increased, a progressive decrease in FEV1/FVC was revealed: group 1, 79.55% [71.37; 85.43]; group 2, 76.82% [70.12; 82.03]; and group 3, 76.28% [67.04; 79.89] (P = 0.004); as well as a decrease in Z FEV1/FVC: group 1, –1.23 [–2.18; –0.28]; group 2, –1.54 [–2.19; –0.68]; and group 3, –1.75 [–2.63; –0.90] (P = 0.02). Dysanapsis was detected in 37.7% (159/428) of patients. The incidence of dysanapsis increased statistically significantly with increasing body mass index (BMI): 31.7% (77/243) with normal body weight, 42.0% (55/131) with overweight, and 50% (27/54) with obesity (P = 0.016).
In children and adolescents with BA, as BMI increases, there is a statistically significant decrease in the FEV1/FVC ratio and, consequently, bronchial patency; the incidence of dysanapsis also increases statistically significantly. Taken together, this indicates the formation of an obstructive pattern of external respiration under the influence of overweight and obesity in children and adolescents with BA.
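The obstruction index used above is the FEV1/FVC ratio expressed as a percentage. A minimal sketch (the spirometry values are hypothetical, not patient data from the study):

```python
def fev1_fvc_percent(fev1: float, fvc: float) -> float:
    """FEV1/FVC expressed as a percentage."""
    if fvc <= 0:
        raise ValueError("FVC must be positive")
    return 100.0 * fev1 / fvc

# Hypothetical spirometry values in litres:
print(round(fev1_fvc_percent(2.10, 2.75), 2))  # 76.36, within the range reported above
```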
In the context of in-hospital care management, the need for infusion therapies involves the choice of appropriate devices. Historically, there has been no consensus about the preference among vascular accesses, although the data in the literature would seem to favor peripheral ones, given the feared complications and non-negligible rate of bloodstream infections associated with central lines. It is also true that the decision for central routes is sometimes dictated by the patient's general clinical condition (especially as a result of surgery) or by the need to establish continuous short- or long-term support therapies. It would therefore seem anachronistic to favor one strategy over another. The data should probably be reviewed, considering and evaluating the correct application of indications and guidelines for both the positioning and the management of venous accesses, while avoiding methodological biases that could lead to scarce and inconclusive results; although it is undeniable that some conditions promote the onset of complications.
Pseudoneurological complaints (PNCs) are highly prevalent among the general population. Coronavirus disease 2019 (COVID-19) adversely influences such complaints in individuals who have recovered from the infection. This study determined the prevalence and identified the predictors of PNCs among individuals who had previously experienced COVID-19 and their healthy counterparts.
This case-control study analyzed the data of 878 Bangladeshi adults (439 patients). Laboratory-confirmed COVID-19 individuals were considered cases, and the controls were those who had never tested positive for COVID-19. Controls were matched to cases by sex and age. The seven-item pseudoneurological sub-scale of the subjective health complaints scale developed by Eriksen et al. was used to evaluate PNCs. Descriptive analysis estimated the prevalence of PNCs among the subgroups, whereas multiple logistic regression models were used to determine the predictors of PNCs.
Overall, the prevalence of PNCs was 40%; however, patients who recovered from COVID-19 reported a PNC rate of 67.4%. The regression analysis identified COVID-19 as a robust independent predictor of PNCs. Furthermore, occupation, monthly household income, current living location, hypertension, and recovery period from acute COVID-19 were independently associated with PNCs.
This study revealed a significant association between COVID-19 and PNCs. The results of this study will be helpful when discussing, planning, and implementing strategies to alleviate the overburden of PNCs among COVID-19 survivors.
Cardiovascular diseases (CVD) are the leading cause of death globally. In type 2 diabetes mellitus (T2DM), the prevalence of CVD increases in parallel with the rise in metabolic complications and the higher incidence of coronary artery stenosis. The aim of this study was to compare the degree of percent stenosis in the coronary arteries of patients with coronary artery disease (CAD) with and without T2DM, and to measure the severity of CVD using the Gensini score (GS) derived from angiographic data.
The current study was conducted in a tertiary care specialized hospital in Delhi, India. The degree of percent stenosis in the coronary arteries was compared between patients with CAD with and without T2DM. The patients were divided into two groups: group I included 100 patients with T2DM, and group II included 100 non-diabetic CAD patients; all underwent coronary angiography by Judkin's technique. The severity of CVD was measured by GS derived from angiographic data. Patients with serum glycated haemoglobin (HbA1c) levels ≥ 6.5% were considered diabetic.
Significant differences in serum HbA1c and random blood sugar levels were observed between group I and group II (P ≤ 0.001). Serum HbA1c showed a significant positive association with GS (r = 0.36, P = 0.007).
The study shows a significant level of stenosis in coronary arteries of CAD diabetic patients. However, further prospective analysis of a larger population size will be needed to strengthen the findings and the significant association.
Gastrointestinal (GI) cancer is one of the leading causes of death worldwide. The coronavirus disease 2019 (COVID-19) pandemic significantly impacted the healthcare system at large, such that the diagnosis and management of GI cancer have suffered from a reduction in cancer screening. This review describes current practices of cancer screening during the COVID-19 pandemic and summarizes how each GI cancer (esophageal, gastric, colorectal, and hepatocellular cancers) has been affected by COVID-19. Worldwide, there has been a decreasing trend in the screening, diagnosis, and management of GI cancers during the COVID-19 pandemic. Many healthcare institutions are now observing the effect of this change and implementing practice variations to adapt to the pandemic.
Among treatments for chronic non-cancer pain (CNCP), cannabinoid-based medicines (CBMs) have become extremely popular. Evidence remains modest and limited primarily to delta-9-tetrahydrocannabinol (THC) for neuropathic pain; nevertheless, the use of various CBMs, including cannabidiol (CBD), to treat neuropathic, nociceptive, and mixed pain has increased globally. This observational case series assessed the impact of CBMs as a complementary treatment, by pain mechanism and cannabinoid profile, over three months.
An analysis was performed of patients with CNCP treated with CBMs who had consented to an ongoing registry. Outcomes were patient-reported measures such as the Edmonton Symptom Assessment System-Revised, the Brief Pain Inventory-Short Form, and the 36-Item Short Form Health Survey. Data from patients with complete outcomes at baseline and 3-month follow-up were extracted. Characteristics of adverse drug reactions (ADRs), including a description of the suspected product, were also assessed.
A total of 495 patients were included in this analysis (mean age = 56 years; 67% women). At 3 months, the proportional use of THC:CBD-balanced and THC-dominant products had increased. Patients with neuropathic pain had higher pain-severity scores than those with nociceptive pain. In addition to patients with neuropathic pain, patients with nociceptive and mixed pain also reported improvements in pain severity and in secondary symptoms such as anxiety, depression, drowsiness, fatigue, sleep disturbances, and overall health-related quality of life. THC-dominant treatment was more likely to be recommended when pain was severe, whereas CBD-dominant treatment was favored for less severe cases. ADRs were more frequent among cannabis-naive patients and included dizziness, headache, and somnolence, among others.
Findings suggest that CBMs can be effective for neuropathic as well as nociceptive and mixed pain. THC is more frequently recommended for neuropathic and severe pain. Future research on CBMs in pain management must include details of CBM composition and pain mechanism, and must consider potential ADRs.