Treatment regimens for post-kala-azar dermal leishmaniasis (PKDL) are usually extrapolated from those for visceral leishmaniasis (VL), but drug pharmacokinetics (PK) can differ due to disease-specific variations in absorption, distribution, and elimination. This study characterized PK differences in paromomycin and miltefosine between 109 PKDL and 264 VL patients from eastern Africa. VL patients showed a 0.55-fold (95% CI: 0.41-0.74) lower capacity for saturable paromomycin reabsorption in the renal tubules, and required a 1.44-fold (1.23-1.71) adjustment when relating renal clearance to creatinine-based eGFR. Miltefosine bioavailability in VL patients was lowered by 69% (62-76) at the start of treatment. Comparing PKDL with VL patients on the same regimen, paromomycin plasma exposures in PKDL patients were 0.74-0.87-fold those in VL patients, while miltefosine exposure up to the end of treatment was 1.4-fold higher. These pronounced PK differences between PKDL and VL patients in eastern Africa highlight the challenges of directly extrapolating dosing regimens from one leishmaniasis presentation to another.
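To make the fold-change arithmetic concrete, the minimal sketch below (ours, not the study's population-PK model) assumes simple dose-proportional kinetics in which exposure scales as AUC = F × dose / CL; the dose and clearance values are hypothetical placeholders.

    # Illustrative sketch: how fold-changes in bioavailability (F) and clearance
    # (CL) propagate to exposure, assuming dose-proportional kinetics
    # (AUC = F * dose / CL). NOT the study's population-PK model.

    def auc(dose_mg: float, f: float, cl_l_per_h: float) -> float:
        """Exposure (mg*h/L) under a simple dose-proportional model."""
        return f * dose_mg / cl_l_per_h

    dose, cl = 100.0, 5.0  # hypothetical dose (mg) and clearance (L/h)
    auc_pkdl = auc(dose, f=1.0, cl_l_per_h=cl)
    auc_vl = auc(dose, f=1.0 - 0.69, cl_l_per_h=cl)  # F lowered by 69% in VL

    # At treatment start the implied PKDL:VL exposure ratio is 1 / 0.31,
    # larger than the 1.4-fold overall difference reported above, plausibly
    # because the bioavailability deficit applies at treatment start rather
    # than across the whole course.
    print(auc_pkdl / auc_vl)  # ~3.2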
BACKGROUND
Targeted preventive strategies in persons living with HIV (PLWH) require markers to predict visceral leishmaniasis (VL). We conducted a longitudinal study in an HIV cohort in VL-endemic North-West Ethiopia to 1) describe the pattern of Leishmania markers preceding VL; 2) identify Leishmania markers predictive of VL; and 3) develop a clinical management algorithm according to predicted VL risk levels.
METHODS
The PreLeisH study followed 490 adult PLWH free of VL at enrolment for up to two years (2017-2021). Blood RT-PCR targeting Leishmania kDNA, Leishmania serology, and the KAtex Leishmania urine antigen test were performed every 3-6 months. We calculated the sensitivity/specificity of the Leishmania markers for predicting VL and developed an algorithm for distinct clinical management strategies, with VL risk categories defined based on VL history, CD4 count, and Leishmania markers (rK39 RDT & RT-PCR).
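As a purely schematic illustration of how such an algorithm could be expressed, the sketch below encodes the three risk tiers as a decision function; the CD4 cutoff and the exact marker combinations are hypothetical placeholders, since the abstract does not specify the published rules.

    # Schematic sketch of a VL risk-stratification rule. The inputs (VL history,
    # CD4 count, rK39 RDT, RT-PCR) follow the abstract; the CD4 cutoff of
    # 200 cells/uL and the marker combinations are HYPOTHETICAL placeholders.

    def vl_risk(prior_vl: bool, cd4: int, rk39_pos: bool, pcr_pos: bool) -> str:
        marker_pos = rk39_pos or pcr_pos
        if prior_vl and pcr_pos and cd4 < 200:      # placeholder rule
            return "high risk: early treatment"
        if marker_pos and (prior_vl or cd4 < 200):  # placeholder rule
            return "moderate risk: secondary prophylaxis"
        return "low risk: routine follow-up"

    print(vl_risk(prior_vl=True, cd4=150, rk39_pos=False, pcr_pos=True))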
FINDINGS
At enrolment, 485 (99%) study participants were on antiretroviral treatment; 360/490 (73.5%) were male; the median baseline CD4 count was 392 (IQR 259-586) cells/μL; and 135 (27.5%) had previous VL. Incident VL was diagnosed in 34 (6.9%), with 32 (94%) displaying positive Leishmania markers before VL. In those without a VL history, baseline rK39 RDT had 60% sensitivity and 84% specificity for predicting VL; in patients with previous VL, RT-PCR had 71% sensitivity and 95% specificity. The algorithm classified 442 (92.3%) individuals as low VL risk (routine follow-up), 31 (6.5%) as moderate risk (secondary prophylaxis), and six (1.2%) as high risk (early treatment).
INTERPRETATION
Leishmania infection markers can predict VL risk in PLWH. Interventional studies targeting those at high risk are needed.
FUNDING
The PreLeisH study was supported by grants from the Department of Economy, Science and Innovation of the Flemish Government, Belgium (757013) and the Directorate-General for Development Cooperation and Humanitarian Aid (DGD), Belgium (BE-BCE_KBO-0410057701-prg2022-5-ET).
Critical failings in humanitarian response: a cholera outbreak in Kumer Refugee Camp, Ethiopia, 2023
BACKGROUND
Visceral leishmaniasis (VL) is an important public health problem that mainly affects poor, rural-dwelling communities in low- and middle-income countries. However, little is known about the health and economic burdens of this disease in East Africa, including Ethiopia. The aim of this study was to assess the household-level economic burden of VL among affected communities in Tigray, Northern Ethiopia.
METHODS
Between April and August 2020, a cross-sectional household survey was conducted in six districts of Tigray among 96 patients who had been treated for VL within the 12 months prior to the survey. Data on households’ health-seeking behavior, direct and indirect costs, and coping strategies were collected using a structured questionnaire, and the responses were analyzed using SPSS software.
RESULTS
Most (82%) of the patients surveyed were male, and the majority (74%) were between 16 and 30 years of age. The education level of participants was very low: over 33% had not received any form of education. Forty-eight percent of patients were farmers dependent on subsistence agriculture and about 32% were daily laborers. Just under half of the households (46%) resided in “poor houses” with structures made entirely from local materials. Forty-one percent of patients from the surveyed households had traveled 48 to 72 kilometers to reach a VL treatment hospital. The median total household cost for one VL episode was estimated at US$ 214, equating to 18% of the mean total annual household income, or 72.5% of the annual per capita income, of the study population. More than 80% of the households surveyed incurred catastrophic costs, defined as costs exceeding 10% of annual household income. The median delay between the onset of symptoms and arrival at a care-provider hospital was 37 days; once the patient arrived at hospital, the median diagnostic delay was 3 days. Direct and indirect costs represented 44% and 56% of total costs, respectively. To cope with VL treatment costs, 43% of the households used more than one coping strategy: 48% took out loans, 43% sold livestock, and 31% mobilized cash savings.
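The catastrophic-cost classification is simple arithmetic; the minimal sketch below plugs the abstract's median cost into a generic classifier, with the income figure back-calculated from the reported 18% ratio (an approximation for illustration only).

    # Catastrophic health expenditure: episode cost exceeding 10% of annual
    # household income. The income figure is back-calculated from the
    # abstract's ratio (US$214 = 18% of mean annual income), so it is an
    # approximation for illustration, not a reported survey value.

    def is_catastrophic(episode_cost: float, annual_income: float,
                        threshold: float = 0.10) -> bool:
        return episode_cost > threshold * annual_income

    median_cost = 214.0               # US$, median total cost per VL episode
    mean_income = median_cost / 0.18  # ~US$1189 implied mean annual income
    print(is_catastrophic(median_cost, mean_income))  # True: 18% > 10%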
CONCLUSIONS
VL in Tigray is concentrated among young males with low educational backgrounds who are mostly engaged in subsistence economic activities. Although diagnosis and treatment were provided free of charge at public hospitals at the time of the study, our work shows that the disease imposed a significant economic burden on VL-affected households in Tigray. We strongly recommend community awareness initiatives promoting prevention and early treatment-seeking, as well as decentralization of VL treatment centers. In addition, we recommend efforts to reduce household treatment costs, through transport and food provisions for patients (and their accompanying carers where possible) or through cash reimbursement for patients who complete treatment at public hospitals, in order to lower the barriers to seeking treatment for this life-threatening disease.
Human immunodeficiency virus (HIV) co-infection is a major challenge for visceral leishmaniasis (VL) control, particularly in Ethiopia, where the incidence of both infections is high. VL-HIV co-infection often leads to high rates of antileishmanial treatment failure and recurrent VL relapses. Given the high prevalence of HIV and Leishmania in the Ethiopian population, preventing the progression of asymptomatic Leishmania infection to disease would be a valuable asset to VL control and to the clinical management of people living with HIV (PLWH). However, such a strategy requires a good understanding of the risk factors for VL development. In immunocompetent individuals living in Brazil, India, or Iran, the Human Leukocyte Antigen (HLA) gene region has been associated with VL development. We used NanoTYPE, an Oxford Nanopore Technologies sequencing-based HLA genotyping method, to detect associations between HLA genotype and VL development by comparing 78 PLWH with a VL history and 46 PLWH who had controlled a Leishmania infection, all living in a VL-endemic region of North-West Ethiopia. We identified an association between HLA-A*03:01 and increased risk of VL development (OR = 3.89). These data provide candidate HLA alleles that can be further explored for inclusion in a potential Leishmania screen-and-treat strategy in VL-endemic regions.
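For readers unfamiliar with the measure, an odds ratio of this kind comes from a standard 2×2 comparison of allele carriage between cases and controls; the sketch below is generic, and the counts in it are invented for illustration because the abstract does not report allele frequencies.

    # Generic odds-ratio calculation for allele carriage in a case-control
    # design. Counts are INVENTED for illustration; the abstract reports only
    # the resulting OR (3.89) for HLA-A*03:01, not the underlying counts.

    def odds_ratio(cases_pos: int, cases_neg: int,
                   controls_pos: int, controls_neg: int) -> float:
        return (cases_pos * controls_neg) / (cases_neg * controls_pos)

    # Hypothetical: 25/78 VL cases vs 5/46 controls carrying the allele.
    print(odds_ratio(25, 53, 5, 41))  # ~3.87; close to 3.89 by construction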
INTRODUCTION
The severe consequences of acute kidney injury (AKI) have been well documented in high-risk patient populations. However, the effects of milder forms in non-critically ill patients remain understudied, particularly in resource-limited settings. While the risk of mortality associated with these cases is considered low, they can still lead to various complications, including prolonged hospitalization, which may influence long-term renal and patient survival. Hence, the objective of this study was to assess the impact of non-dialysis-requiring AKI (NDR-AKI) on survival outcomes of non-critically ill medical patients admitted to St. Paul’s Hospital Millennium Medical College in Ethiopia between July 2019 and January 2022.
METHODS
A retrospective cohort study was conducted among 300 non-critically ill medical patients: 93 with NDR-AKI and 207 without AKI. Descriptive statistics, including frequency distributions and median survival times, were employed to summarize the data. Kaplan-Meier curves and the log-rank test were used to compare the survival experience of the groups. A Cox proportional hazards model was fitted to estimate the impact of NDR-AKI on time to recovery. Adjusted hazard ratios (AHRs) with 95% confidence intervals (CIs) were used to report findings.
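As an illustration of this analysis pattern, here is a minimal sketch using the Python lifelines package as a stand-in (the abstract does not name the analysis software); the data and column names are hypothetical.

    # Minimal sketch of the survival workflow described above; data and
    # column names are hypothetical, and lifelines is a stand-in choice.
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter
    from lifelines.statistics import logrank_test

    df = pd.DataFrame({
        "days":      [10, 16, 25, 30, 12, 40, 18, 22],  # time to event/censor
        "recovered": [1, 1, 0, 1, 1, 0, 1, 1],          # 1 = discharged improved
        "ndr_aki":   [0, 0, 1, 1, 0, 1, 0, 1],          # exposure of interest
    })
    aki, no_aki = df[df.ndr_aki == 1], df[df.ndr_aki == 0]

    # Kaplan-Meier estimate per group and the log-rank comparison
    kmf = KaplanMeierFitter()
    kmf.fit(aki["days"], event_observed=aki["recovered"], label="NDR-AKI")
    result = logrank_test(aki["days"], no_aki["days"],
                          event_observed_A=aki["recovered"],
                          event_observed_B=no_aki["recovered"])
    print(result.p_value)

    # Cox proportional hazards model for time to recovery; an AHR of 0.57
    # for ndr_aki reads as a 43% lower rate of achieving recovery.
    cph = CoxPHFitter().fit(df, duration_col="days", event_col="recovered")
    cph.print_summary()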
RESULTS
Two hundred four patients (68.0%) were discharged after improvement, and the median recovery time was 16 days (95% CI: 13.5-18.5 days). NDR-AKI was associated with a 43% lower rate of achieving recovery (AHR = 0.57, 95% CI: 0.38-0.84, p = 0.004). Females had a 1.41 times higher rate of recovery (AHR = 1.41, 95% CI: 1.03-1.94, p = 0.033). Additionally, having tuberculosis (AHR = 0.41, 95% CI: 0.23-0.72, p = 0.002) and being on an anticoagulant (AHR = 0.67, 95% CI: 0.47-0.95, p = 0.027) were associated with 59% and 33% lower rates of recovery, respectively.
CONCLUSION
Patients with NDR-AKI recovered significantly more slowly than those without AKI, suggesting that even milder forms of AKI in non-critically ill patients can negatively affect outcomes. Early identification, prompt management, and addressing underlying causes are key to improving recovery and reducing long-term morbidity and mortality. Strict screening and monitoring of high-risk groups, such as men, patients with tuberculosis, and those on anticoagulants, are also crucial.
BACKGROUND
Antimicrobial resistance is a major global public health concern. To address the paucity of antibiotic consumption data and antimicrobial resistance surveillance systems in hospitals in humanitarian settings, we estimated antibiotic consumption in six hospitals with the aim of developing recommendations for improving antimicrobial stewardship programs.
METHODS
Six hospitals supported by Médecins Sans Frontières were included in the study: Boost (Afghanistan), Kutupalong (Bangladesh), Baraka and Mweso (Democratic Republic of Congo), Kule (Ethiopia), and Bentiu (South Sudan). Data on 36,984 inpatients and on antibiotic consumption were collected from 2018 to 2020. Antibiotics were categorized according to the World Health Organization AWaRe (Access, Watch, Reserve) classification. Total antibiotic consumption was measured in defined daily doses (DDDs) per 1000 bed-days.
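The DDD metric is a simple ratio; the worked sketch below illustrates it, assuming the current WHO ATC/DDD reference value of 1.5 g for oral amoxicillin (this should be verified against the index year used in the analysis), with hypothetical consumption figures.

    # Worked sketch: antibiotic consumption as DDDs per 1000 bed-days.
    # DDDs = total grams consumed / WHO-defined daily dose (DDD) in grams.
    # Oral amoxicillin is assumed at 1.5 g per the current ATC/DDD index;
    # verify against the index year used in the analysis.

    def ddd_per_1000_bed_days(grams_consumed: float, ddd_grams: float,
                              bed_days: float) -> float:
        return (grams_consumed / ddd_grams) / bed_days * 1000

    # Hypothetical: 600 kg of oral amoxicillin over 150,000 bed-days
    print(ddd_per_1000_bed_days(600_000, 1.5, 150_000))  # ~2666.7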
RESULTS
Average antibiotic consumption across all hospitals was 2745 DDDs/1000 bed-days. Boost hospital had the highest antibiotic consumption (4157 DDDs/1000 bed-days) and Bentiu the lowest (1598 DDDs/1000 bed-days). In all hospitals, Access antibiotics were the most used (69.7%), followed by Watch antibiotics (30.1%). The most consumed antibiotics were amoxicillin (23.5%), amoxicillin-clavulanic acid (14%), and metronidazole (13.2%). Across all projects, mean annual antibiotic consumption decreased by 22.3% during the study period, driven mainly by the reduction at Boost hospital in Afghanistan.
CONCLUSIONS
This was the first study to assess antibiotic consumption using the DDD metric in hospitals in humanitarian settings. Antibiotic consumption in the project hospitals was higher than levels reported from non-humanitarian settings. Routine, systematic antibiotic consumption monitoring systems should be implemented in hospitals, accompanied by prescribing audits and point-prevalence surveys, to document the volume and appropriateness of antibiotic use and to support antimicrobial stewardship efforts in humanitarian settings.
Background
People with human immunodeficiency virus (PWH) who have recurrent visceral leishmaniasis (VL) could potentially drive Leishmania transmission in areas with anthroponotic transmission, such as East Africa, but studies are lacking. Leishmania parasitemia has been used as a proxy for infectiousness.
Methods
This study is nested within the Predicting Visceral Leishmaniasis in HIV-Infected Patients (PreLeisH) prospective cohort study, which followed 490 PWH free of VL at enrollment for 24–37 months in northwest Ethiopia. Blood Leishmania polymerase chain reaction (PCR) was performed systematically. This case series reports on 10 PWH with chronic VL (≥3 VL episodes during follow-up) followed for up to 37 months, and 3 individuals with asymptomatic Leishmania infection followed for up to 24 months.
Results
All 10 chronic VL cases were male, on antiretroviral treatment, with 0–11 relapses before enrollment. The median baseline CD4 count was 82 cells/µL. They displayed 3–6 VL treatment episodes over a period of up to 37 months. Leishmania blood PCR levels were strongly positive for almost the entire follow-up (median cycle threshold value, 26 [interquartile range, 23–30]), including during the periods between VL treatment episodes. Additionally, we describe 3 PWH with asymptomatic Leishmania infection and no VL history who showed equally strong Leishmania parasitemia over a period of up to 24 months without developing VL. All were on antiretroviral treatment at enrollment, with baseline CD4 counts ranging from 78 to 350 cells/µL.
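As context for the cycle threshold (Ct) values: under the generic assumption of near-100% qPCR amplification efficiency, each Ct unit lower corresponds to roughly a doubling of target DNA, so a median Ct of 26 indicates a high parasite load. A small sketch of this approximation (not a calibrated quantification of Leishmania parasitemia) follows.

    # Rough interpretation of qPCR Ct values: assuming ~100% amplification
    # efficiency, target abundance scales as 2**(reference_ct - ct).
    # Generic approximation only; not calibrated to parasites/mL.

    def relative_load(ct: float, reference_ct: float) -> float:
        """Target abundance relative to a sample with reference_ct."""
        return 2 ** (reference_ct - ct)

    # A median Ct of 26 vs a weakly positive sample at a hypothetical Ct of 35:
    print(relative_load(26, reference_ct=35))  # ~512-fold more target DNA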
Conclusions
These are the first data on chronic parasitemia in PWH from Leishmania donovani–endemic areas. PWH with asymptomatic and symptomatic Leishmania infection could potentially be highly infectious and constitute Leishmania superspreaders. Xenodiagnosis studies are required to confirm infectiousness.