The majority of Plasmodium falciparum malaria cases in Africa are treated with the artemisinin combination therapies artemether-lumefantrine (AL) and artesunate-amodiaquine (AS-AQ), with amodiaquine also widely used, combined with sulfadoxine-pyrimethamine, in seasonal malaria chemoprevention programs. While the artemisinin derivatives have a short half-life, the longer-acting partner drugs lumefantrine and amodiaquine may provide differing durations of post-treatment prophylaxis, an important additional benefit to patients in higher-transmission areas.
METHODS
We analyzed individual patient data from 8 clinical trials of AL versus AS-AQ in 12 sites in Africa (n = 4214 individuals). The time to PCR-confirmed reinfection after treatment was used to estimate the duration of post-treatment protection, accounting for variation in transmission intensity between settings using hidden semi-Markov models. Accelerated failure-time models were used to identify potential effects of covariates on the time to reinfection. The estimated duration of chemoprophylaxis was then used in a mathematical model of malaria transmission to determine the potential public health impact of each drug when used for first-line treatment.
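The estimation problem can be pictured with a simple generative sketch: reinfection cannot establish while the partner drug remains protective, and afterwards arrives at a rate set by local transmission intensity. The stdlib-only Python illustration below uses made-up parameter values and a constant force of infection; it is not the hidden semi-Markov model fitted in the study:

```python
import random

def simulate_reinfection_time(protect_days, force_of_infection):
    """Time to PCR-confirmed reinfection (days): no reinfection can establish
    during the drug's protective period; afterwards reinfection arrives at a
    constant per-day hazard set by local transmission intensity."""
    return protect_days + random.expovariate(force_of_infection)

random.seed(1)
# Illustrative values only, not the trial estimates: mean protection of
# 13.0 days (AL-like) vs 15.2 days (AS-AQ-like) under the same hazard.
foi = 0.05  # reinfections per person per day once protection has waned
al = [simulate_reinfection_time(13.0, foi) for _ in range(10_000)]
asaq = [simulate_reinfection_time(15.2, foi) for _ in range(10_000)]
mean_al = sum(al) / len(al)        # expectation: 13.0 + 1/0.05 = 33 days
mean_asaq = sum(asaq) / len(asaq)  # expectation: 15.2 + 1/0.05 = 35.2 days
```

Comparing the simulated means shows how a longer protective period shifts the entire reinfection-time distribution to the right, which is the signal the fitted models exploit while separating protection from between-site differences in transmission.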
RESULTS
We estimated a mean duration of post-treatment protection of 13.0 days (95% CI 10.7–15.7) for AL and 15.2 days (95% CI 12.8–18.4) for AS-AQ overall. However, the duration varied significantly between trial sites, from 8.7 to 18.6 days for AL and from 10.2 to 18.7 days for AS-AQ. Significant predictors of time to reinfection in multivariable models were transmission intensity, age, drug, and parasite genotype. Where wild-type pfmdr1 and pfcrt parasite genotypes predominated (≤ 20% 86Y and 76T mutants, respectively), AS-AQ provided approximately 2-fold longer protection than AL. Conversely, at a higher prevalence of 86Y and 76T mutant parasites (> 80%), AL provided up to 1.5-fold longer protection than AS-AQ. Our simulations found that these differences in the duration of protection could alter the population-level clinical incidence of malaria by up to 14% in children under 5 years of age when the drugs were used as first-line treatments in areas with high, seasonal transmission.
CONCLUSION
Choosing a first-line treatment which provides optimal post-treatment prophylaxis given the local prevalence of resistance-associated markers could make a significant contribution to reducing malaria morbidity.
In 2012, the World Health Organization recommended the addition of single low-dose primaquine (SLDPQ, 0.25 mg base/kg body weight) to artemisinin combination therapies to block the transmission of Plasmodium falciparum without testing for glucose-6-phosphate dehydrogenase deficiency. The targeted group was non-pregnant patients aged ≥ 1 year (later changed to ≥ 6 months) with acute uncomplicated falciparum malaria, primarily in countries with artemisinin-resistant P. falciparum (ARPf). No dosing regimen was suggested, leaving malaria control programmes and clinicians in limbo. Therefore, we designed a user-friendly, age-based SLDPQ regimen for Cambodia, the country most affected by ARPf.
METHODS
By reviewing primaquine's pharmacology, we defined a therapeutic dose range of 0.15–0.38 mg base/kg (9–22.5 mg in a 60-kg adult), corresponding to a therapeutic index of 2.5. Primaquine doses (1–20 mg) were tested using a modelled, anthropometric database of 28,138 Cambodian individuals (22,772 healthy, 4119 with malaria and 1247 with other infections); the age distribution was: 0.5–4 years (20.0 %, n = 5640), 5–12 years (9.1 %, n = 2559), 13–17 years (9.1 %, n = 2550), and ≥ 18 years (61.8 %, n = 17,389). Optimal age-dosing groups were selected according to calculated mg base/kg doses and the proportions of individuals receiving a therapeutic dose.
RESULTS
Four age-dosing bands were defined: (1) 0.5-4 years, (2) 5-9 years, (3) 10-14 years, and (4) ≥15 years to receive 2.5, 5, 7.5, and 15 mg of primaquine base, resulting in therapeutic doses in 97.4 % (5494/5640), 90.5 % (1511/1669), 97.7 % (1473/1508), and 95.7 % (18,489/19,321) of individuals, respectively. Corresponding median (1st-99th centiles) mg base/kg doses of primaquine were (1) 0.23 (0.15-0.38), (2) 0.29 (0.18-0.45), (3) 0.27 (0.15-0.39), and (4) 0.29 (0.20-0.42).
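The age-banded regimen reduces to a simple lookup from age to dose, with the resulting mg base/kg value checked against the 0.15–0.38 mg base/kg therapeutic range defined in the methods. A minimal Python sketch (band boundaries and doses follow the text; weights are supplied by the caller):

```python
THERAPEUTIC_RANGE = (0.15, 0.38)  # mg base/kg, as defined in the methods

def primaquine_mg_per_kg(age_years, weight_kg):
    """mg base/kg implied by the age-banded regimen (2.5/5/7.5/15 mg of
    primaquine base for 0.5-4, 5-9, 10-14, and >=15 years); None below
    the youngest band."""
    if age_years < 0.5:
        return None
    if age_years <= 4:
        dose_mg = 2.5
    elif age_years <= 9:
        dose_mg = 5.0
    elif age_years <= 14:
        dose_mg = 7.5
    else:
        dose_mg = 15.0
    return dose_mg / weight_kg

def is_therapeutic(mg_per_kg):
    lo, hi = THERAPEUTIC_RANGE
    return lo <= mg_per_kg <= hi

# e.g. a 60-kg adult receives 15 mg -> 0.25 mg base/kg, inside the range
```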
CONCLUSIONS
This age-based SLDPQ regimen could contribute substantially to malaria elimination and requires urgent evaluation in Cambodia and other countries with similar anthropometric characteristics. It guides primaquine manufacturers on suitable tablet strengths and doses for paediatric-friendly formulations. Development of similar age-based dosing recommendations for Africa is needed.
Since 2015, Europe has been facing an unprecedented arrival of refugees and migrants: more than one million people entered via land and sea routes. During their travels, refugees and migrants often face harsh conditions, forced detention, and violence in transit countries. However, there is a lack of epidemiological quantitative evidence on their experiences and the mental health problems they face during their displacement. We aimed to document the types of violence experienced by migrants and refugees during their journey and while settled in Greece, and to measure the prevalence of anxiety disorders and access to legal information and procedures.
METHODS
We conducted a cross-sectional population-based quantitative survey combined with an explanatory qualitative study in eight sites (representing the range of settlements) in Greece during winter 2016/17. The survey consisted of a structured questionnaire on experience of violence and an interviewer-administered anxiety disorder screening tool (Refugee Health Screener).
RESULTS
In total, 1293 refugees were included, of whom 728 were Syrians (41.3% female) of median age 18 years (interquartile range 7-30). Depending on the site, between 31% and 77.5% reported having experienced at least one violent event in Syria, 24.8-57.5% during the journey to Greece, and 5-8% in their Greek settlement. Over 75% (up to 92%) of respondents aged ≥ 15 years screened positive for anxiety disorder, warranting referral for mental health evaluation; referral was accepted by only 69-82% of participants. Access to legal information and assistance about asylum procedures was considered poor to non-existent by the majority, and the uncertainty of their status exacerbated their anxiety.
CONCLUSIONS
This survey, conducted during a mass refugee crisis in a European Community country, provides important data on experiences in different refugee settings and reports the high levels of violence experienced by Syrian refugees during their journeys, the high prevalence of anxiety disorders, and the shortcomings of the international protective response.
Multidrug-resistant tuberculosis (MDR-TB) is a major threat to global TB control. MDR-TB treatment regimens typically have a high pill burden, last 20 months or more and often lead to unsatisfactory outcomes. A 9–11 month regimen with seven antibiotics has shown high success rates among selected MDR-TB patients in different settings and is conditionally recommended by the World Health Organization.
METHODS
We constructed a transmission-dynamic model of TB to estimate the likely impact of a shorter MDR-TB regimen when applied in a low-HIV-prevalence region of Uzbekistan (Karakalpakstan) with high rates of drug resistance, good access to diagnostics, and a well-established community-based MDR-TB treatment programme providing treatment to around 400 patients. The model incorporates acquisition of additional drug resistance and incorrect regimen assignment. It was calibrated to local epidemiology and used to compare the impact of shorter treatment against four alternative programmatic interventions.
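As an illustration of why a shorter, more effective regimen can reduce transmission, a minimal SIS-style model (deliberately far simpler than the study's calibrated model, with invented rates) shows prevalence declining once the effective removal rate from the infectious pool exceeds the transmission rate:

```python
def mdr_tb_trajectory(beta, removal_rate, years, i0=15.2e-5, dt=0.01):
    """Minimal SIS sketch (not the study's model): infectious proportion i
    under a per-year transmission rate beta; removal_rate bundles cure and
    death. A shorter regimen is represented purely as faster removal."""
    i = i0
    for _ in range(int(years / dt)):
        i += (beta * i * (1 - i) - removal_rate * i) * dt  # Euler step
    return i

# Invented rates: with beta = 1.2/yr, raising removal from 1.0 to 1.3/yr
# flips the epidemic from slow growth to decline.
growing = mdr_tb_trajectory(beta=1.2, removal_rate=1.0, years=10)
declining = mdr_tb_trajectory(beta=1.2, removal_rate=1.3, years=10)
```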
RESULTS
Based on empirical outcomes among MDR-TB patients and assuming no improvement in treatment success rates, the shorter regimen reduced MDR-TB incidence from 15.2 to 9.7 cases per 100,000 population per year and MDR-TB mortality from 3.0 to 1.7 deaths per 100,000 per year, achieving comparable or greater gains than the alternative interventions. No significant increase in the burden of higher levels of resistance was predicted. These effects are probably conservative, given that the regimen is likely to improve treatment success rates.
CONCLUSIONS
In addition to benefits to individual patients, we find that shorter MDR-TB treatment regimens have the potential to reduce transmission of resistant strains. These findings apply to an epidemiological setting in which treatment availability is an important bottleneck, owing to the high number of patients eligible for treatment, and may differ in other contexts. The simulated high proportion of MDR-TB with additional antibiotic resistance was not exacerbated by programmatic responses, and greater gains may be possible in contexts where the regimen is more widely applicable.
Methods
We first adjusted for delays between symptom onset and case presentation using the observed distribution of reporting delays from previously reported cases. We then fit a compartmental transmission model to the adjusted incidence stratified by age group and location. Model forecasts with a lead time of 2 weeks were issued on 12, 20, 26 and 30 December and communicated to decision-makers.
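The first step, adjusting observed counts for reporting delays, can be sketched as a standard right-truncation (nowcasting-style) correction: recent onset dates are scaled up by the probability that a case with that onset would already have been reported by the last observed day. A stdlib-only Python sketch with a hypothetical delay distribution:

```python
def adjust_for_reporting_delay(onset_counts, delay_pmf):
    """Inflate recent onset counts by the probability that a case with that
    onset date would already have been reported by the last observed day.
    onset_counts: observed cases by onset day, oldest first.
    delay_pmf: P(reporting delay = d days) for d = 0, 1, 2, ...
    """
    n = len(onset_counts)
    cdf, total = [], 0.0
    for p in delay_pmf:
        total += p
        cdf.append(total)
    adjusted = []
    for t, count in enumerate(onset_counts):
        max_delay = n - 1 - t  # days this onset date has had to be reported
        p_reported = cdf[min(max_delay, len(cdf) - 1)]
        adjusted.append(count / p_reported if p_reported > 0 else float("nan"))
    return adjusted

# Hypothetical delay distribution: 60% of cases reported the same day,
# 30% after one day, 10% after two; the last two days are scaled up.
adjusted = adjust_for_reporting_delay([10, 12, 9, 6], [0.6, 0.3, 0.1])
```

The corrected series then serves as the fitting target for the compartmental model, so that the most recent days are not mistaken for a downturn in transmission.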
Results
The first forecast estimated that the outbreak would peak on 19 December in Balukhali camp with 303 (95% posterior predictive interval 122–599) cases and would continue to grow in Kutupalong camp, requiring a bed capacity of 316 (95% posterior predictive interval (PPI) 197–499). On 19 December, a total of 54 cases were reported, lower than forecast. Subsequent forecasts were more accurate: on 20 December, we predicted a total of 912 cases (95% PPI 367–2183) and 136 (95% PPI 55–327) hospitalizations until the end of the year, with 616 cases actually reported during this period.
Conclusions
Real-time modelling enabled feedback of key information about the potential scale of the epidemic, resource needs and mechanisms of transmission to decision-makers at a time when this information was largely unknown. By 20 December, the model generated reliable forecasts and helped support decision-making on operational aspects of the outbreak response, such as hospital bed and staff needs, and with advocacy for control measures. Although modelling is only one component of the evidence base for decision-making in outbreak situations, suitable analysis and forecasting techniques can be used to gain insights into an ongoing outbreak.
BACKGROUND
Zaire Ebolavirus disease (EVD) outbreaks can be controlled using rVSV-ZEBOV vaccination and other public health measures. People in high-risk areas may have pre-existing antibodies from asymptomatic Ebolavirus exposure that might affect the response to rVSV-ZEBOV. We therefore assessed the impact of pre-existing immunity on post-vaccination IgG titre, virus neutralisation, and reactogenicity.
METHODS
In this prospective cohort study, 2115 consenting close contacts (“proches”) of EVD survivors were recruited. Proches were vaccinated with rVSV-ZEBOV and followed up for 28 days for safety and immunogenicity. Anti-GP IgG titre at baseline and day 28 was assessed by ELISA. Samples from a representative subset were evaluated using live virus neutralisation.
RESULTS
Ten percent of participants were seropositive at baseline. At day 28, IgG titres had increased significantly from baseline in both baseline-seronegative (GMT 0.106 IU/ml, 95% CI 0.100–0.113) and baseline-seropositive (GMT 0.237 IU/ml, 95% CI 0.210–0.267) participants (both p < 0.0001). Antibody titres correlated strongly with virus neutralisation in day 28 samples (Spearman's rho 0.75). Vaccinees with baseline IgG antibodies against Zaire Ebolavirus had safety profiles similar to those without detectable antibodies (any adverse event: 63.6% vs 66.1% of adults; 49.1% vs 60.9% of children), with almost all adverse events graded as mild. No serious adverse events were attributed to vaccination. No EVD survivors tested positive for Ebolavirus by RT-PCR.
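The geometric mean titre (GMT) reported above is the exponentiated mean of log titres, the standard summary for antibody concentrations because titres are roughly log-normally distributed. A minimal Python sketch with hypothetical titre values, not study data:

```python
import math

def geometric_mean_titre(titres):
    """Geometric mean: exponentiate the arithmetic mean of the log titres
    (here in IU/ml); less sensitive to high outliers than the plain mean."""
    return math.exp(sum(math.log(t) for t in titres) / len(titres))

# Hypothetical titres, not study data:
gmt = geometric_mean_titre([0.08, 0.10, 0.12, 0.15])  # ~0.11 IU/ml
```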
CONCLUSIONS
These data add further evidence of rVSV-ZEBOV safety and immunogenicity, including in people with pre-existing antibodies from suspected natural ZEBOV infection; this pre-existing immunity does not blunt the immune response to rVSV-ZEBOV. Pre-vaccination serological screening is not required.
Cholera epidemics continue to challenge disease control, particularly in fragile and conflict-affected states. Rapid detection of and response to small cholera clusters are key to efficient control before an epidemic propagates. To understand the capacity for early response in fragile states, we investigated delays in outbreak detection, investigation, response, and laboratory confirmation, and we estimated epidemic sizes. We assessed predictors of delays and annual changes in response time.
METHODS
We compiled a list of cholera outbreaks in fragile and conflict-affected states from 2008 to 2019. We searched for peer-reviewed articles and epidemiological reports. We evaluated delays from the dates of symptom onset of the primary case, and the earliest dates of outbreak detection, investigation, response, and confirmation. Information on how the outbreak was alerted was summarized. A branching process model was used to estimate epidemic size at each delay. Regression models were used to investigate the association between predictors and delays to response.
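The branching-process calculation can be sketched in a few lines: each case produces a Poisson-distributed number of secondary cases one serial interval later, and cases are tallied up to the day the response begins. The parameter values below (serial interval, reproduction number, seed cases) are illustrative, not the paper's calibration:

```python
import math
import random

def poisson(lam):
    """Knuth's method for sampling Poisson(lam); adequate for small lam."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def outbreak_size_by_day(seed_cases, r_eff, serial_interval, horizon):
    """Cumulative cases by `horizon` days, with generations spaced one
    serial interval apart and Poisson(r_eff) offspring per case."""
    total = current = seed_cases
    day = 0
    while day + serial_interval <= horizon and current > 0:
        nxt = sum(poisson(r_eff) for _ in range(current))
        total += nxt
        current = nxt
        day += serial_interval
    return total

random.seed(7)
# Illustrative only: 5-day serial interval, effective reproduction number
# 1.5, 3 seed cases, response mounted 10 days after the primary case's onset.
sizes = [outbreak_size_by_day(3, 1.5, 5, 10) for _ in range(2000)]
median_size = sorted(sizes)[len(sizes) // 2]
```

Repeating the simulation over many stochastic runs, as above, yields the distribution of epidemic sizes at each candidate response delay, which is how the thresholds reported in the results can be read.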
RESULTS
Seventy-six outbreaks from 34 countries were included. Median delays spanned 1–2 weeks: from symptom onset of the primary case to presentation at the health facility (5 days, IQR 5–5), detection (5 days, IQR 5–6), investigation (7 days, IQR 5.8–13.3), response (10 days, IQR 7–18), and confirmation (11 days, IQR 7–16). In the model simulation, the median delay to response (10 days) with 3 seed cases led to a median epidemic size of 12 cases (upper range 47), with 8% of outbreaks reaching ≥ 20 cases (increasing to 32% with a 30-day delay to response). A larger outbreak at detection (10 seed cases) with the same 10-day median delay to response resulted in a median epidemic size of 34 cases (upper range 67), with fewer than 1% of outbreaks remaining below 20 cases. We estimated an annual global decrease in the delay to response of 5.2% (95% CI 0.5–9.6, p = 0.03). Outbreaks signaled by immediate alerts were associated with a 39.3% reduction in the delay to response (95% CI 5.7–61.0, p = 0.03).
CONCLUSIONS
From 2008 to 2019, median delays from symptom onset of the primary case to case presentation and to response were 5 and 10 days, respectively. Our model simulations suggest that, depending on the outbreak size at detection (3 versus 10 seed cases), a 10-day delay to response would result in large, hard-to-contain clusters in 8 to 99% of scenarios. Reducing the delay to response involves rethinking the local-level integration of event-based detection, rapid diagnostic testing for cluster validation, and integrated alert, investigation, and response.