BACKGROUND
In low-resource settings, limited laboratory capacity adds to the burden of central nervous system (CNS) infections in children and spurs overuse of antibiotics. The commercially available BioFire® FilmArray® Meningitis/Encephalitis Panel (FA-ME), with its capability to simultaneously detect 14 pathogens in cerebrospinal fluid (CSF), could potentially narrow this diagnostic gap.
METHODS
In Mbarara, Uganda, we compared the clinical utility (clinical turnaround time [cTAT], microbial yield, and influence on patient outcome and antibiotic exposure) of FA-ME with that of bacterial culture in children aged 0–12 years with suspected CNS infection.
RESULTS
Of 212 enrolled children, CSF was sampled from 194. All samples underwent bacterial culture, and 193 also underwent FA-ME analysis. FA-ME results prospectively influenced care for 169 of the 193 patients, who constituted an ‘Index group’; the remaining 43/212 patients constituted a ‘Reference group’. Of all 194 CSF-sampled patients, 87% (168) had received antibiotics before lumbar puncture. Median cTAT was 4.2 h for FA-ME vs. two days for culture. Bacterial yield was 12% (24/193) for FA-ME and 1.5% (3/194) for culture; FA-ME viral yield was 12% (23/193). The fatality rate was 14% in the Index group vs. 19% in the Reference group (P = 0.20). From clinicians’ receipt of FA-ME results, median antibiotic exposure was 6 days for bacteria-negative vs. 13 days for bacteria-positive patients (P = 0.03). Median hospitalization duration was 7 vs. 12 days for FA-ME-negative and FA-ME-positive patients, respectively (P < 0.01).
CONCLUSIONS
In this setting, the clinical utility of FA-ME lay in a higher and faster microbial yield and in shortened hospitalization and antibiotic exposure for patients without CSF pathology. Pathogen panels customized to local epidemiology may increase FA-ME utility, although its use in similar settings would require major cost reductions.
BACKGROUND
Yellow fever vaccine is highly effective with a single dose, but vaccine supply is limited. The minimum dose requirements for seroconversion remain unknown.
METHODS
In this double-blind, randomized, noninferiority trial in Uganda and Kenya, we assigned adults with no history of yellow fever vaccination or infection to receive vaccination with the Institut Pasteur de Dakar 17D-204 yellow fever vaccine at a standard dose (13,803 IU) or at a fractional dose of 1000 IU, 500 IU, or 250 IU. The primary outcome was seroconversion at 28 days after vaccination with each fractional dose as compared with the standard dose, evaluated in a noninferiority analysis. Seroconversion was defined as an antibody titer at day 28 that was at least four times as high as the antibody titer before vaccination, as measured by a plaque reduction neutralization test. We conducted noninferiority analyses in the per-protocol and intention-to-treat populations. Noninferiority was shown if the lower boundary of the 95% confidence interval for the difference in the incidence of seroconversion between the fractional dose and the standard dose was higher than -10 percentage points.
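As an illustration of this decision rule, the sketch below computes the difference in seroconversion incidence and a Wald 95% confidence interval, and checks the lower bound against the −10-percentage-point margin. The Wald interval is a simplifying assumption; the trial's exact interval method is not specified here and may differ (e.g., a Miettinen–Nurminen interval).

```python
import math

def noninferiority_check(x_frac, n_frac, x_std, n_std, margin_pp=-10.0):
    """Difference in seroconversion incidence (fractional minus standard
    dose) with a Wald 95% CI, in percentage points. Noninferiority holds
    if the CI lower bound exceeds the margin. Illustrative sketch only."""
    p_frac, p_std = x_frac / n_frac, x_std / n_std
    diff = p_frac - p_std
    se = math.sqrt(p_frac * (1 - p_frac) / n_frac
                   + p_std * (1 - p_std) / n_std)
    lower, upper = diff - 1.96 * se, diff + 1.96 * se
    return diff * 100, lower * 100, upper * 100, lower * 100 > margin_pp

# Hypothetical counts (not trial data): 116/120 vs. 118/120 seroconverted.
print(noninferiority_check(116, 120, 118, 120))
```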
RESULTS
A total of 480 participants underwent randomization (120 participants in each group). The incidence of seroconversion was 98% (95% confidence interval [CI], 94 to 100) with the standard dose. The difference in the incidence of seroconversion between the 1000-IU dose and the standard dose was 0.01 percentage points (95% CI, -5.0 to 5.1) in the intention-to-treat population and -1.9 percentage points (95% CI, -7.0 to 3.2) in the per-protocol population; the corresponding differences between the 500-IU dose and the standard dose were 0.01 percentage points (95% CI, -5.0 to 5.1) and -1.8 percentage points (95% CI, -6.7 to 3.2), and those between the 250-IU dose and the standard dose were -4.4 percentage points (95% CI, -9.4 to 0.7) and -6.7 percentage points (95% CI, -11.7 to -1.6). A total of 111 vaccine-related adverse events were reported: 103 were mild in severity, 7 were moderate, and 1 was severe. The incidence of adverse events was similar in the four groups.
CONCLUSIONS
A yellow fever vaccination dose as low as 500 IU was noninferior to the standard dose of 13,803 IU for producing seroconversion within 28 days.
While standard methods for chlorine taste and odor (T&O) detection and rejection thresholds exist, little rigorous research has been conducted on T&O thresholds in humanitarian settings. To fill this gap, we estimated chlorine T&O detection and rejection thresholds using the Forced-Choice Triangle Test (FCT) and Flavor Rating Assessment (FRA) standard methods in a Ugandan refugee settlement. We conducted these tests with 410 male and female participants, aged 5–72 years, using piped and trucked surface water and bottled water. We also conducted 30 focus group discussions and 37 surveys with data collectors. Median chlorine detection thresholds were 0.56, 1.40, and 1.67 mg/L for piped, trucked, and bottled water, respectively. Rejection thresholds were calculated using ratings (as per the method) and five previously used threshold definitions, and were 1.6, 2.0, and 1.6 mg/L (ratings) and 2.19, 1.85, and 1.67 mg/L (using the FCT threshold method with FRA data) for piped, trucked, and bottled water, respectively. Detection and rejection thresholds were significantly associated with water quality (including turbidity, pH, electrical conductivity, and temperature), participant age, and education. We observed high intra- and inter-individual variability, which decreased with participant experience. We found that the method used to calculate rejection thresholds influenced results, highlighting the need for a standard method to analyze FRA data. Data collectors and participants recommended shortening protocols and evaluating fewer concentrations, and highlighted difficulties in creating accurate free residual chlorine (FRC) concentrations for testing. This study provides insights into using standard methods to assess T&O thresholds in untrained populations, and results are being used to develop rapid field T&O protocols for humanitarian settings.
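For context, ascending forced-choice series such as the FCT are commonly summarized with an ASTM E679-style best-estimate threshold: the geometric mean of the highest concentration missed and the next concentration in the series, averaged geometrically across participants. The sketch below, with hypothetical inputs, assumes this is what the FCT threshold method refers to; the study's exact computation may differ.

```python
import math

def individual_bet(concs, correct):
    """Best-estimate threshold for one participant over an ascending
    concentration series (mg/L). `correct[i]` is True if the spiked
    sample was identified at concs[i]. ASTM E679-style sketch."""
    last_miss = max((i for i, ok in enumerate(correct) if not ok), default=-1)
    if last_miss == len(concs) - 1:
        return None       # never reliably detected within the tested range
    if last_miss == -1:
        return concs[0]   # detected at every step; threshold at/below lowest
    return math.sqrt(concs[last_miss] * concs[last_miss + 1])

def group_threshold(bets):
    """Geometric mean of individual best-estimate thresholds."""
    bets = [b for b in bets if b is not None]
    return math.exp(sum(math.log(b) for b in bets) / len(bets))

# Hypothetical example: three participants, five concentrations.
concs = [0.2, 0.5, 1.0, 2.0, 4.0]
responses = [[False, False, True, True, True],
             [False, True, True, True, True],
             [False, False, False, True, True]]
print(group_threshold([individual_bet(concs, r) for r in responses]))
```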
Treatment regimens for post-kala-azar dermal leishmaniasis (PKDL) are usually extrapolated from those for visceral leishmaniasis (VL), but drug pharmacokinetics (PK) can differ due to disease-specific variations in absorption, distribution, and elimination. This study characterized PK differences in paromomycin and miltefosine between 109 PKDL and 264 VL patients from eastern Africa. VL patients showed a 0.55-fold (95% CI: 0.41–0.74) lower capacity for paromomycin saturable reabsorption in renal tubules and required a 1.44-fold (1.23–1.71) adjustment when relating renal clearance to creatinine-based eGFR. Miltefosine bioavailability in VL patients was lowered by 69% (62–76) at treatment start. Comparing PKDL with VL patients on the same regimen, paromomycin plasma exposures in PKDL were 0.74–0.87-fold those in VL, while miltefosine exposure up to the end of treatment was 1.4-fold higher. These pronounced PK differences between PKDL and VL patients in eastern Africa highlight the challenges of directly extrapolating dosing regimens from one leishmaniasis presentation to another.
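Schematically, these covariate effects can be restated as multiplicative adjustments on population-PK parameters. The parameterization below is an assumption for illustration only; the symbols are not taken from the study:

$$
T_{\max,\mathrm{reabs}}^{\mathrm{VL}} = 0.55\, T_{\max,\mathrm{reabs}}^{\mathrm{PKDL}}, \qquad
\mathrm{CL}_{R}^{\mathrm{VL}} = 1.44\, \theta_{\mathrm{CL}}\, \mathrm{eGFR}, \qquad
F_{\mathrm{VL}}(t_{0}) = (1 - 0.69)\, F_{\mathrm{full}}
$$

where $T_{\max,\mathrm{reabs}}$ denotes the maximum saturable tubular reabsorption capacity for paromomycin, $\theta_{\mathrm{CL}}$ a proportionality constant relating renal clearance to creatinine-based eGFR, and $F_{\mathrm{VL}}(t_0)$ miltefosine bioavailability at treatment start relative to full bioavailability $F_{\mathrm{full}}$.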
OBJECTIVES
Chest x-ray (CXR) plays an important role in childhood tuberculosis (TB) diagnosis, but access to quality CXR remains a major challenge in resource-limited settings. Digital CXR (d-CXR) can solve some image quality issues and facilitate image transfer for quality control. We assessed the implementation of d-CXR in 12 district hospitals (DHs) in 2021–2022 across Cambodia, Cameroon, Ivory Coast, Mozambique, Sierra Leone and Uganda as part of the TB-speed decentralisation study on childhood TB diagnosis.
METHODS
To digitise CXR, digital radiography (DR) plates were set up on existing analogue radiography devices. d-CXRs were transferred to an international server at Bordeaux University and downloaded by sites' clinicians for interpretation. We assessed the uptake and performance of CXR services and healthcare workers' (HCWs') perceptions of d-CXR implementation, using a convergent mixed-methods approach drawing on process data, individual interviews with 113 HCWs involved in performing or interpreting d-CXRs, and site support supervision reports.
RESULTS
Of 3104 children with presumptive TB, 1642 (52.9%) had at least one d-CXR, including 1505, 136 and 1 children with one, two and three d-CXRs, respectively, for a total of 1780 d-CXRs. Of these, 1773 (99.6%) were of good quality, and 1772/1773 (99.9%) were interpreted by sites' clinicians. One hundred and sixty-four children had no d-CXR performed despite attending the radiography department: 126, 37 and 1 with one, two and three attempts, respectively. d-CXRs were not performed in 21.6% (44/203) of these attempts due to connectivity problems between the DR plate sensor and the computer. HCWs reported good perceptions of d-CXR and of the DR plates provided. The main challenge was uploading d-CXRs to and downloading them from the server due to limited internet access.
CONCLUSION
d-CXR using DR plates was feasible at the DH level and provided good-quality images, but required overcoming operational challenges.
The Safe Water Optimization Tool (SWOT) generates evidence-based point-of-distribution free residual chlorine (FRC) targets so that operators can adjust chlorine dosing and ensure water quality at the point of consumption. To investigate SWOT effectiveness in surface waters, we conducted two before-and-after mixed-methods evaluations in a Ugandan refugee settlement served by piped and trucked surface-water systems. We surveyed 888 users on water knowledge, attitudes, and practices; collected 2768 water samples to evaluate FRC, Escherichia coli, and disinfection by-product (DBP) concentrations; and conducted nine key-informant interviews with system operators about SWOT implementation. After baseline data collection, SWOT chlorination targets were generated, increasing point-of-distribution FRC targets from 0.2 to 0.7–0.8 mg/L and from 0.3 to 0.9 mg/L for the piped and trucked systems, respectively. At endline, household point-of-consumption FRC ≥ 0.2 mg/L increased from 23% to 35% and from 8% to 42% in the two systems. With these increases, we did not observe increased rejection of chlorinated water or DBP concentrations exceeding international guidelines. Informants reported that SWOT implementation increased knowledge and capacity and improved operations. Overall, SWOT-generated chlorination targets increased chlorine dosage, which improved household water quality for surface waters, although less than previously documented with groundwater sources. Additional operator support on prechlorination water treatment processes is needed to ensure maximally effective SWOT implementation for surface-water sources.
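The logic of a point-of-distribution FRC target can be illustrated with a first-order decay assumption: dose so that enough chlorine survives household storage. The sketch below is illustrative only; the SWOT itself derives targets empirically from paired distribution/household samples rather than assuming a fixed decay constant.

```python
import math

def distribution_frc_target(c_required=0.2, k_per_hr=0.05, storage_hr=24):
    """Back-calculate the FRC (mg/L) needed at distribution so that at
    least `c_required` mg/L remains after `storage_hr` hours of household
    storage, assuming first-order decay C(t) = C0 * exp(-k * t).
    `k_per_hr` is a hypothetical decay constant, not a SWOT output."""
    return c_required * math.exp(k_per_hr * storage_hr)

# With these illustrative values the target is ~0.66 mg/L, of the same
# order as the 0.7-0.8 mg/L piped-system targets reported above.
print(round(distribution_frc_target(), 2))
```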
BACKGROUND
Deaths occurring during the neonatal period contribute close to half of the under-five mortality rate (U5MR); over 80% of these deaths occur in low- and middle-income countries (LMICs). Poor maternal antepartum and perinatal health predisposes newborns to low birth weight (LBW), birth asphyxia, and infections, which increase the newborn's risk of death.
METHODS
The objective of the study was to assess the association between abnormal postpartum maternal temperature and early infant outcomes, specifically illness requiring hospitalisation or leading to death between birth and six weeks of age. We prospectively studied a cohort of neonates born at Mbarara Regional Referral Hospital in Uganda to mothers with abnormal postpartum temperature and followed them longitudinally through early infancy. We used logistic regression to model the relationship between abnormal maternal temperature and six-week infant hospitalisation, adjusting for gestational age and the 10-minute APGAR score at birth.
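A minimal sketch of such an adjusted model follows, assuming hypothetical variable names (`adverse_6wk`, `abnormal_temp`, `gest_age`, `apgar_10min`) and a one-row-per-infant dataset; the study's actual data coding is not given in the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file: one row per mother-infant pair, binary outcome column.
df = pd.read_csv("cohort.csv")

model = smf.logit("adverse_6wk ~ abnormal_temp + gest_age + apgar_10min",
                  data=df).fit()

# Adjusted odds ratios with 95% confidence intervals
odds_ratios = np.exp(model.params)
ci = np.exp(model.conf_int().rename(columns={0: "2.5%", 1: "97.5%"}))
print(pd.concat([odds_ratios.rename("aOR"), ci], axis=1))
```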
RESULTS
Of the 648 postpartum participants from the parent study who agreed to enroll their neonates in the sub-study, 100 (15%) mothers had abnormal temperature. The mean maternal age was 24.6 (SD 5.3) years, and the mean parity was 2.3 (SD 1.5). Preterm birth was more common among mothers with abnormal temperature (10%) than among those with normal temperature (1.1%) (P < 0.001). While the majority of newborns (92%) had a 10-minute APGAR score > 7, 14% of newborns whose mothers had abnormal temperatures had an APGAR score < 7, compared with 7% of those born to mothers with normal postpartum temperatures (P = 0.02). Six-week outcome data were available for 545 women and their infants. In the logistic regression model adjusted for gestational age at birth and 10-minute APGAR score, abnormal maternal temperature was not significantly associated with the composite adverse infant health outcome (being unwell or dead) between birth and six weeks of age (aOR = 0.35, 95% CI 0.07-1.79, P = 0.21). The 10-minute APGAR score was significantly associated with the adverse six-week outcome (P < 0.01).
CONCLUSIONS
While our results do not demonstrate an association between abnormal maternal temperature and adverse newborn and early infant outcomes, good routine neonatal care should be emphasized, and infants should be observed for any abnormal findings that may warrant further assessment.
Accumulating evidence on the long-term immunogenicity of fractional dosing for yellow fever vaccines
BACKGROUND
Mobility of people living with HIV (PWH) among the urban population in Goma and the fisherfolk community in western Uganda can be a barrier to retention in care. To address this challenge, MSF supported the MoH in deploying WHO-recommended differentiated service delivery models (DSDM), especially community ART groups (CAG), in which clients form groups and rotate drug pick-up. In these studies, we aimed to explore retention in care, viral load coverage and suppression among PWH enrolled in DSDM, and to describe the acceptability of and satisfaction with these models in Goma, DRC and Kasese, Uganda.
METHODS
In both contexts, we carried out a retrospective cohort analysis, complemented by a cross-sectional survey in Goma and a qualitative survey in Kasese. For the cohort analysis, we examined the characteristics of PWH enrolled in DSDM. Using Kaplan-Meier survival analysis, we estimated retention in care, and we calculated viral load coverage and suppression rates at 12 months after model initiation. In Goma, we administered a satisfaction questionnaire to a subset of the active cohort, while in Kasese we conducted interviews and facilitated focus group discussions to document the acceptability and relevance of DSDM.
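A minimal sketch of such a retention estimate follows, assuming a hypothetical per-client dataset with follow-up time in months and an attrition indicator (1 = lost to follow-up or died, 0 = censored); the studies' actual data structure is not described in the abstract.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical file: one row per client enrolled in a DSDM.
df = pd.read_csv("dsdm_cohort.csv")

kmf = KaplanMeierFitter()
kmf.fit(durations=df["months_in_model"], event_observed=df["attrition"])

# Kaplan-Meier estimate of the proportion retained in care at 12 months
print(kmf.survival_function_at_times(12))
```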
RESULTS
In total, 1950 PWH in Goma and 1773 PWH in Kasese were included in the cohort analyses. One year after model initiation, more than 90% of PWH enrolled in MSF-supported DSDM were retained in care (94.1% in Goma and 97.6% in Kasese). Among PWH retained in care at one year, the proportion virally suppressed was high in both contexts (96.4% in Goma and 97.0% in Kasese). PWH and healthcare providers expressed positive sentiments towards DSDM, acknowledging their utility in enhancing convenience and reducing transport expenses for ART access. They also noted benefits such as decreased waiting times, alleviation of overcrowding and workload at healthcare facilities, and the role of DSDM in mitigating stigma and fostering responsibility sharing among group members.
CONCLUSION
Although great progress has been made in the fight against the HIV epidemic in recent years, a one-size-fits-all approach to caring for people living with HIV is no longer appropriate. The findings from these evaluations underscore the effectiveness of tailored, differentiated services, which maintain high retention rates in care, even within mobile communities, while also garnering strong acceptability. It is imperative to consider integrating DSDM into routine programming for chronic illnesses. By adapting clinical care to suit the lifestyles of PWH, such models can offer enhanced support to patients, ultimately improving health outcomes.