Journal Article > Research | Full Text
Trans R Soc Trop Med Hyg. 2007 August 1; Volume 101 (Issue 8); DOI:10.1016/j.trstmh.2007.02.020
van Griensven J, De Naeyer L, Mushi T, Ubarijoro S, Gashumba D, et al.
This study was conducted among individuals placed on WHO-recommended first-line antiretroviral therapy (ART) at two urban health centres in Kigali, Rwanda, in order to determine (a) the overall prevalence of lipodystrophy and (b) the risk factors for lipoatrophy. Consecutive individuals on ART for >1 year were systematically subjected to a standardised case definition-based questionnaire and clinical assessment. Of a total of 409 individuals, 370 (90%) were on an ART regimen containing stavudine (d4T), whilst the rest were receiving a zidovudine (AZT)-containing regimen. Lipodystrophy was apparent in 140 individuals (34%), of whom 40 (9.8%) had isolated lipoatrophy, 20 (4.9%) had isolated lipohypertrophy and 80 (19.6%) had mixed patterns. Fifty-six percent of patients reported the effects as disturbing. The prevalence of lipoatrophy was more than three times higher with d4T-containing than with AZT-containing regimens (31.4% vs. 10.3%). Female sex, d4T-based ART, baseline body mass index ≥25 kg/m², baseline CD4 count ≥150 cells/µl and increasing duration of ART were all significantly associated with lipoatrophy. Lipoatrophy appears to be an important long-term complication of WHO-recommended first-line ART regimens. These data highlight the urgent need for access to more affordable and less toxic ART regimens in resource-limited settings.
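The risk-factor findings above are the sort of result typically produced by a multivariable logistic regression of lipoatrophy status on patient characteristics. As an illustration only, here is a minimal sketch assuming a hypothetical patient-level file and illustrative column names; it is not the study's actual analysis code.

```python
# Hypothetical sketch of a multivariable logistic regression for lipoatrophy risk factors.
# The file name and column names (sex, regimen, baseline_bmi, baseline_cd4, art_years,
# lipoatrophy) are illustrative assumptions, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("art_cohort.csv")  # hypothetical file

# Code covariates as in the abstract: female sex, d4T-based regimen,
# baseline BMI >= 25 kg/m2, baseline CD4 >= 150 cells/uL, years on ART.
df["female"] = (df["sex"] == "F").astype(int)
df["d4t"] = (df["regimen"] == "d4T").astype(int)
df["bmi_ge25"] = (df["baseline_bmi"] >= 25).astype(int)
df["cd4_ge150"] = (df["baseline_cd4"] >= 150).astype(int)

model = smf.logit("lipoatrophy ~ female + d4t + bmi_ge25 + cd4_ge150 + art_years", data=df)
result = model.fit()
print(result.summary())                # coefficients on the log-odds scale
print(np.exp(result.params).round(2))  # odds ratios for each risk factor
```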
Journal Article > Research | Full Text
Advances in Medical Education and Practice. 2022 June 6; Volume 13; 595-607; DOI:10.2147/AMEP.S358702
Owolabi JO, Ojiambo R, Seifu D, Nishimwe A, Masimbi O, et al.
BACKGROUND
This article presents a qualitative study of African anatomists and anatomy teachers on the Anatomage Table, a modern medical education technology and innovation, as an indicator of African medical and anatomy educators' acceptance of EdTech. The Anatomage Table is used for digital dissection, prosection, functional anatomy demonstration, virtual simulation of certain functions, and as an interactive digital teaching aid.
MATERIALS AND METHODS
Anatomy teachers [n=79] from 11 representative African countries: Ghana, Nigeria [West Africa], Ethiopia, Kenya, Rwanda [East Africa], Namibia, Zambia [Southern Africa], Egypt [North Africa], and Sudan [Central Africa], participated in this study. Focus group discussions [FGDs] were set up to obtain qualitative information from stakeholders from representative institutions. In addition, based on the set criteria, selected education leaders and stakeholders in representative institutions participated in in-depth interviews [IDIs]. The interviews explored critical issues concerning their perceptions of the acceptance, adoption, and integration of educational technology, specifically the Anatomage Table, into the teaching of anatomy and related medical sciences on the African continent. Recorded interviews were transcribed and analyzed using the Dedoose software.
RESULTS
African anatomists are generally technology-inclined and in favor of EdTech. The most recurring opinion was that the Anatomage Table could only be a "complementary teaching tool to cadavers" and that it "can't replace the real-life experience of cadavers." In particular, respondents from user institutions opined that it "complements the traditional cadaver-based approaches" to anatomy learning and inquiry, including being a good "complement for cadaveric skill lab" sessions. Compared with traditional cadaveric dissection, a majority also considered it less problematic with regard to cultural acceptability and health and safety-related concerns. The lifelikeness of the 3D representation is a major factor driving acceptability.
Protocol
BMJ Open. 2017 February 1; Volume 7 (Issue 2); e014067; DOI:10.1136/bmjopen-2016-014067
Smith SL, Misago CN, Osrow RA, Franke MF, Iyamuremye JD, et al.
Integrating mental healthcare into primary care can reduce the global burden of mental disorders. Yet data on the effective implementation of real-world task-shared mental health programmes are limited. In 2012, the Rwandan Ministry of Health and the international healthcare organisation Partners In Health collaboratively adapted the Mentoring and Enhanced Supervision at Health Centers (MESH) programme, a successful programme of supported supervision based on task-sharing for HIV/AIDS care, to include care of neuropsychiatric disorders within primary care settings (MESH Mental Health). We propose one of the first studies in a rural low-income country to assess the implementation and clinical outcomes of a programme integrating neuropsychiatric care into a public primary care system.
METHODS AND ANALYSIS
A mixed-methods evaluation will be conducted. First, we will conduct a quantitative outcomes evaluation using a pretest and post-test design at 4 purposively selected MESH MH participating health centres. At least 112 consecutive adults with schizophrenia, bipolar disorder, depression or epilepsy will be enrolled. Primary outcomes are symptoms and functioning measured at baseline, 8 weeks and 6 months using clinician-administered scales: the General Health Questionnaire and the brief WHO Disability Assessment Scale. We hypothesise that service users will experience at least a 25% improvement in symptoms and functioning from baseline after MESH MH programme participation. To understand any outcome improvements under the intervention, we will evaluate programme processes using (1) quantitative analyses of routine service utilisation data and supervision checklist data and (2) qualitative semistructured interviews with primary care nurses, service users and family members.
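For illustration, the stated hypothesis (at least a 25% improvement in symptoms from baseline) could be checked with a simple paired comparison once follow-up data exist. The sketch below assumes a hypothetical table of baseline and 6-month GHQ scores and uses a paired t-test as one plausible choice; it is not the protocol's prespecified analysis.

```python
# Illustrative check of the ">=25% improvement from baseline" hypothesis.
# File name, column names (ghq_baseline, ghq_6mo) and the paired t-test are assumptions.
import pandas as pd
from scipy import stats

df = pd.read_csv("mesh_mh_outcomes.csv")  # hypothetical file

# Per-participant percent improvement (a lower GHQ score means fewer symptoms).
pct_improvement = (df["ghq_baseline"] - df["ghq_6mo"]) / df["ghq_baseline"] * 100

# Paired comparison of baseline vs 6-month scores.
t_stat, p_value = stats.ttest_rel(df["ghq_baseline"], df["ghq_6mo"])

print(f"Mean improvement from baseline: {pct_improvement.mean():.1f}%")
print(f"Meets the 25% target: {pct_improvement.mean() >= 25}")
print(f"Paired t-test: t={t_stat:.2f}, p={p_value:.4f}")
```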
ETHICS AND DISSEMINATION
This evaluation was approved by the Rwanda National Ethics Committee (Protocol #736/RNEC/2016) and deemed exempt by the Harvard University Institutional Review Board. Results will be submitted for peer-reviewed journal publication, presented at conferences and disseminated to communities served by the programme.
Journal Article > Meta-Analysis | Full Text
Malar J. 2009 August 23; Volume 8 (Issue 1); 203; DOI:10.1186/1475-2875-8-203
Zwang J, Olliaro PL, Barennes H, Bonnet MMB, Brasseur P, et al.
BACKGROUND: Artesunate and amodiaquine (AS&AQ) is at present the world's second most widely used artemisinin-based combination therapy (ACT). It was necessary to evaluate the efficacy of this ACT, recently adopted by the World Health Organization (WHO) and deployed in over 80 countries, in order to make an evidence-based drug policy.
METHODS: An individual patient data (IPD) analysis was conducted on efficacy outcomes in 26 clinical studies in sub-Saharan Africa using the WHO protocol with similar primary and secondary endpoints.
RESULTS: A total of 11,700 patients (75% under 5 years old), from 33 different sites in 16 countries, were followed for 28 days. Loss to follow-up was 4.9% (575/11,700). AS&AQ was given to 5,897 patients. Of these, 82% (4,826/5,897) were included in randomized comparative trials with polymerase chain reaction (PCR) genotyping results and compared to 5,413 patients (half receiving an ACT). AS&AQ and other ACT comparators resulted in rapid clearance of fever and parasitaemia, superior to non-ACT. Using survival analysis on a modified intent-to-treat population, the Day 28 PCR-adjusted efficacy of AS&AQ was greater than 90% (the WHO cut-off) in 11/16 countries. In randomized comparative trials (n = 22), the crude efficacy of AS&AQ was 75.9% (95% CI 74.6-77.1) and the PCR-adjusted efficacy was 93.9% (95% CI 93.2-94.5). The site-weighted risk of PCR-adjusted failure with AS&AQ was significantly lower than with non-ACT, higher than with dihydroartemisinin-piperaquine (DP; in one Ugandan site), and not different from artesunate plus sulfadoxine-pyrimethamine (AS+SP) or artemether-lumefantrine (AL). The risk of gametocyte appearance and the gametocyte carriage rate with AS&AQ were greater than with AL and DP in only one Ugandan site, and lower than with non-ACT (p = 0.001 for all comparisons). Anaemia recovery did not differ from comparator groups, except at one site in Rwanda where patients in the DP group recovered more slowly.
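The crude efficacy estimate quoted above is a proportion with a 95% confidence interval. Purely as an illustration of the arithmetic, the sketch below reproduces an interval of that kind with a Wilson score method and a success count back-calculated from the abstract; the paper itself derived its estimates from survival analysis, so this is not its actual method.

```python
# Illustrative 95% CI for a cure proportion, echoing the crude efficacy figure above.
# The success count is back-calculated from the abstract and approximate; the Wilson
# interval is one reasonable choice, not necessarily the method used in the paper.
from statsmodels.stats.proportion import proportion_confint

n_treated = 4826                       # AS&AQ patients in randomized comparisons
n_success = round(0.759 * n_treated)   # implied crude successes (approximation)

low, high = proportion_confint(n_success, n_treated, alpha=0.05, method="wilson")
print(f"Crude efficacy: {n_success / n_treated:.1%} (95% CI {low:.1%}-{high:.1%})")
```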
CONCLUSION: AS&AQ compares well with other treatments and meets the WHO efficacy criteria for use against falciparum malaria in many, but not all, of the sub-Saharan African countries where it was studied. Efficacy varies between and within countries. An IPD analysis can inform general and local treatment policies. Ongoing monitoring and evaluation is required.
Journal Article > Research | Full Text
Water Res. 2020 November 16; Volume 189; 116642; DOI:10.1016/j.watres.2020.116642
Ali SI, Ali SS, Fesselet JF
The current Sphere guideline for water chlorination in humanitarian emergencies fails to reliably ensure household water safety in refugee camps. We investigated post-distribution chlorine decay and household water safety in refugee camps in South Sudan, Jordan, and Rwanda between 2013 and 2015, with the goal of demonstrating an approach for generating site-specific, evidence-based chlorination targets that better ensure household water safety than the status quo Sphere guideline. In each of the four field studies we conducted, we observed how water quality changed between distribution and point of consumption. We implemented a nonlinear optimization approach to the novel technical challenge of modelling post-distribution chlorine decay, in order to estimate what free residual chlorine (FRC) levels must be at water distribution points to provide adequate FRC protection up to the point of consumption in households many hours later at each site. The site-specific FRC targets developed through this modelling approach improved the proportion of households having sufficient chlorine residual (i.e., ≥0.2 mg/L FRC) at the point of consumption in three of the four field studies (South Sudan 2013, Jordan 2014, and Rwanda 2015). These sites tended to be hotter (i.e., average mid-afternoon air temperatures >30°C) and/or had poorer water, sanitation, and hygiene (WASH) conditions, contributing to considerable chlorine decay between distribution and consumption. Our modelling approach did not work as well where chlorine decay was small in absolute terms (Jordan 2015). In such settings, which were cooler (20 to 30°C) and had better WASH conditions, we found that the upper range of the current Sphere chlorination guideline (i.e., 0.5 mg/L FRC) provided sufficient residual chlorine for ensuring household water safety up to 24 hours post-distribution. Site-specific, evidence-based chlorination targets generated from post-distribution chlorine decay modelling could help improve household water safety and public health outcomes in refugee camp settings where the current Sphere chlorination guideline does not provide adequate residual protection. Water quality monitoring in refugee/IDP camps should shift focus from distribution points to household points of consumption in order to monitor whether the intended public health goal of safe water at the point of consumption is being achieved.
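As a simplified illustration of the post-distribution decay modelling described above, the sketch below fits a first-order chlorine decay curve to hypothetical paired distribution/household measurements and then inverts it to obtain the tap-stand FRC needed to keep at least 0.2 mg/L after a chosen storage time. The first-order assumption, the file and all column names are stand-ins; the study's actual nonlinear optimization approach may differ.

```python
# Simplified sketch of post-distribution chlorine decay modelling.
# Assumes first-order decay and a hypothetical CSV with columns
# frc_distribution, frc_household, hours_elapsed; not the study's actual model.
import numpy as np
import pandas as pd
from scipy.optimize import curve_fit

df = pd.read_csv("frc_followup.csv")  # hypothetical follow-up measurements

def first_order_decay(t, c0, k):
    """FRC (mg/L) remaining after t hours, starting from c0 with decay constant k (1/h)."""
    return c0 * np.exp(-k * t)

# Fit the decay constant from household FRC vs elapsed time, anchoring the curve
# at the mean FRC measured at the distribution point.
c0_mean = df["frc_distribution"].mean()
(k_hat,), _ = curve_fit(
    lambda t, k: first_order_decay(t, c0_mean, k),
    df["hours_elapsed"],
    df["frc_household"],
    p0=[0.05],
)

# Invert the model: FRC needed at distribution so that >=0.2 mg/L remains after
# a target household storage time (24 h used here as an example).
target_hours, frc_floor = 24, 0.2
frc_needed = frc_floor * np.exp(k_hat * target_hours)
print(f"Estimated decay constant: {k_hat:.3f} per hour")
print(f"FRC target at distribution for {target_hours} h of protection: {frc_needed:.2f} mg/L")
```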
Journal Article > Research | Full Text
Public Health Nutr. 2015 August 6; Volume 19 (Issue 7); 1296-304; DOI:10.1017/S1368980015002207
Nsabuwera V, Hedt-Gauthier BL, Khogali MA, Edginton ME, Hinderaker SG, et al.
OBJECTIVE
Determining interventions to address food insecurity and poverty, as well as setting targets to be achieved in a specific time period, has been a persistent challenge for development practitioners and decision makers. The present study aimed to assess the changes in food access and consumption at the household level after one year of implementation of an integrated food security intervention in three rural districts of Rwanda.
DESIGN
A before-and-after intervention study comparing Household Food Insecurity Access Scale (HFIAS) scores and household Food Consumption Scores (FCS) at baseline and after one year of programme implementation.
SETTING
Three rural districts of Rwanda (Kayonza, Kirehe and Burera) where the Partners In Health Food Security and Livelihoods Program (FSLP) has been implemented since July 2013.
SUBJECTS
All 600 households enrolled in the FSLP were included in the study.
RESULTS
There were significant improvements (P<0·001) in HFIAS and FCS. The median decrease in HFIAS was 8 units (interquartile range (IQR) -13·0, -3·0) and the median increase for FCS was 4·5 units (IQR -6·0, 18·0). Severe food insecurity decreased from 78% to 49%, while acceptable food consumption improved from 48% to 64%. The change in HFIAS was significantly higher (P=0·019) for the poorest households.
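The before-and-after comparison summarized by medians and interquartile ranges above can be illustrated with a paired non-parametric test. The sketch below assumes a hypothetical household-level table and uses a Wilcoxon signed-rank test as one plausible choice; it does not reproduce the study's documented analysis.

```python
# Illustrative paired before/after comparison of HFIAS and FCS.
# File name, column names and the Wilcoxon signed-rank test are assumptions.
import pandas as pd
from scipy import stats

df = pd.read_csv("fslp_households.csv")  # hypothetical file

for score in ("hfias", "fcs"):
    change = df[f"{score}_endline"] - df[f"{score}_baseline"]
    stat, p = stats.wilcoxon(df[f"{score}_baseline"], df[f"{score}_endline"])
    print(
        f"{score.upper()}: median change {change.median():.1f} "
        f"(IQR {change.quantile(0.25):.1f} to {change.quantile(0.75):.1f}), p={p:.4f}"
    )
```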
CONCLUSIONS
Our study demonstrated that an integrated programme, implemented in a setting of extreme poverty, was associated with considerable improvements towards household food security. Other government and non-government organizations' projects should consider a similar holistic approach when designing structural interventions to address food insecurity and extreme poverty.
Journal Article > Meta-Analysis | Full Text
Lancet. 2010 November 8; Volume 376 (Issue 9753); DOI:10.1016/S0140-6736(10)61924-1
Dondorp AM, Fanello CI, Hendriksen IC, Gomes E, Seni A, et al.
Severe malaria is a major cause of childhood death and often the main reason for paediatric hospital admission in sub-Saharan Africa. Quinine is still the established treatment of choice, although evidence from Asia suggests that artesunate is associated with a lower mortality. We compared parenteral treatment with either artesunate or quinine in African children with severe malaria.
Journal Article > Research | Abstract
Trans R Soc Trop Med Hyg. 2010 December 1; Volume 104 (Issue 12); DOI:10.1016/j.trstmh.2010.08.016
van Griensven J, Zachariah R, Mugabo J, Reid AJ
This study was conducted among 609 adults on stavudine-based antiretroviral treatment (ART) for at least one year at health centre level in Kigali, Rwanda, to (a) determine the proportion who manifested weight loss after one year of ART, (b) examine the association between such weight loss and a number of variables, namely lipoatrophy, virological failure, adherence and on-treatment CD4 count, and (c) assess the validity and predictive values of weight loss for identifying patients with lipoatrophy. Weight loss after the first year of ART was seen in 62% of all patients (median weight loss 3.1 kg/year). In multivariate analysis, weight loss was significantly associated with treatment-limiting lipoatrophy (adjusted effect -2.0 kg/year, 95% confidence interval -3.4 to -0.6; P<0.01). No significant association was found with virological failure or adherence. Higher on-treatment CD4 cell counts were protective against weight loss. Weight loss that was persistent, progressive and/or chronic was predictive of lipoatrophy, with a sensitivity and specificity of 72% and 77%, and positive and negative predictive values of 30% and 95%, respectively. In low-income countries, measuring weight is a routine clinical procedure that could be used to screen for lipoatrophy in individuals on stavudine-based ART, after alternative causes of weight loss have been ruled out.
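The validity figures quoted for weight loss as a marker of lipoatrophy follow directly from a 2x2 classification table. The sketch below shows that arithmetic using approximate cell counts back-calculated from the 609 patients and the reported sensitivity, specificity and predictive values; the counts are illustrative, not the study's published table.

```python
# Screening-test metrics from a 2x2 table: weight-loss criterion vs lipoatrophy.
# Cell counts are approximate back-calculations for illustration, not study data.
tp, fn = 53, 20     # lipoatrophy present: flagged / missed by the weight-loss criterion
fp, tn = 123, 413   # lipoatrophy absent: flagged / correctly not flagged

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)   # strongly dependent on lipoatrophy prevalence in the cohort
npv = tn / (tn + fn)

print(f"Sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
print(f"PPV {ppv:.1%}, NPV {npv:.1%}")
```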
Journal Article > Research | Full Text
Malar J. 2012 April 30; Volume 11; 139; DOI:10.1186/1475-2875-11-139
Van Malderen C, Van Geertruyden JP, Machevo S, Gonzalez R, Bassat Q, et al.
BACKGROUND
Malaria is a leading cause of mortality, particularly in sub-Saharan African children. Prompt and efficacious treatment is important as patients may progress within a few hours to severe and possibly fatal disease. Chlorproguanil-dapsone-artesunate (CDA) was a promising artemisinin-based combination therapy (ACT), but its development was prematurely stopped because of safety concerns secondary to its associated risk of haemolytic anaemia in glucose-6-phosphate dehydrogenase (G6PD)-deficient individuals. The objective of the study was to assess whether CDA treatment and G6PD deficiency are risk factors for a post-treatment haemoglobin drop in African children <5 years of age with uncomplicated malaria.
METHODS
This case-control study was performed in the context of a larger multicentre randomized clinical trial comparing the safety and efficacy of four different ACT in children with uncomplicated malaria. Children who experienced a haemoglobin (Hb) drop ≥2 g/dl after treatment within the first four days (days 0, 1, 2, and 3) were classified as cases and compared with those without an Hb drop (controls). Cases and controls were matched for study site, sex, age and baseline haemoglobin measurements. Data were analysed using a conditional logistic regression model.
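The matched analysis described above uses conditional logistic regression. A minimal sketch of fitting such a model with statsmodels is shown below, assuming a hypothetical one-row-per-child table with a matched-set identifier; variable names are illustrative and this is not the trial's analysis code.

```python
# Illustrative conditional logistic regression for matched case-control data.
# File name and column names (hb_drop_case, g6pd_deficient, cda_treated, matched_set)
# are assumptions for illustration.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

df = pd.read_csv("act_case_control.csv")  # hypothetical file

y = df["hb_drop_case"]                     # 1 = case (Hb drop >= 2 g/dl), 0 = matched control
X = df[["g6pd_deficient", "cda_treated"]]  # exposures of interest (0/1 indicators)

model = ConditionalLogit(y, X, groups=df["matched_set"])  # condition on matched sets
result = model.fit()

print(result.summary())
print(np.exp(result.params).round(2))      # conditional odds ratios
```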
RESULTS
G6PD deficiency prevalence, homo- or hemizygous, was 8.5% (10/117) in cases and 6.8% (16/234) in controls (p = 0.56). The risk of an Hb drop ≥2 g/dl was not associated with either G6PD deficiency (adjusted odds ratio (AOR): 0.81; p = 0.76) or CDA treatment (AOR: 1.28; p = 0.37) alone. However, patients with both risk factors tended to have higher odds (AOR: 11.13; p = 0.25) of experiencing an Hb drop ≥2 g/dl within the first four days after treatment, although this finding was not statistically significant, mainly because very few G6PD-deficient patients were treated with CDA. In non-G6PD-deficient individuals, the proportion of cases was similar between treatment groups, while in G6PD-deficient individuals haemolytic anaemia occurred more frequently in children treated with CDA (56%) than in those treated with other ACT (29%), though the difference was not significant (p = 0.49).
CONCLUSION
The use of CDA for treating uncomplicated malaria may increase the risk of haemolytic anaemia in G6PD-deficient children.
Journal Article > Research | Abstract
Trans R Soc Trop Med Hyg. 2010 February 1; Volume 104 (Issue 2); DOI:10.1016/j.trstmh.2009.07.009
van Griensven J, Zachariah R, Rasschaert F, Mugabo J, Atté EF, et al.
This cohort study was conducted to report on the incidence, timing and risk factors for stavudine (d4T)- and nevirapine (NVP)-related severe drug toxicity (requiring substitution) with a generic fixed-dose combination under program conditions in Kigali, Rwanda. The probability of 'time to first toxicity-related drug substitution' was estimated using the Kaplan-Meier method, and Cox proportional hazards modeling was used to identify risk factors. Out of 2190 adults (median follow-up: 1.5 years), d4T was replaced in 175 patients (8.0%) for neuropathy, 69 (3.1%) for lactic acidosis and 157 (7.2%) for lipoatrophy, which was the most frequent toxicity by 3 years of antiretroviral treatment (ART). NVP was substituted in 4.9% and 1.3% of patients for skin rash and hepatotoxicity, respectively. Use of d4T 40 mg was associated with an increased risk of lipoatrophy and early (<6 months) neuropathy. Significant risk factors associated with lactic acidosis and late neuropathy included higher baseline body weight. Older age and advanced HIV disease increased the risk of neuropathy. Elevated baseline liver tests and older age were identified as risk factors for NVP-related hepatotoxicity. d4T is associated with significant long-term toxicity. d4T dose reduction, increased access to safer ART in low-income countries and close monitoring of those at risk are all relevant strategies.
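The abstract names the two survival-analysis tools used: Kaplan-Meier estimation of time to first toxicity-related substitution and Cox proportional hazards modeling of risk factors. A minimal sketch with the lifelines library is shown below, assuming a hypothetical patient-level table; the column names are illustrative, not the cohort's actual variables.

```python
# Illustrative time-to-event analysis of toxicity-related drug substitution.
# File name and column names (years_to_substitution, substituted, d4t_40mg, weight,
# age, advanced_hiv) are assumptions for illustration.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("kigali_art_cohort.csv")  # hypothetical file

# Kaplan-Meier: probability of remaining free of a toxicity-driven substitution over time.
kmf = KaplanMeierFitter()
kmf.fit(df["years_to_substitution"], event_observed=df["substituted"])
print(kmf.survival_function_.tail())

# Cox proportional hazards: candidate risk factors such as the d4T 40 mg dose,
# baseline body weight, age and advanced HIV disease.
cph = CoxPHFitter()
cph.fit(
    df[["years_to_substitution", "substituted", "d4t_40mg", "weight", "age", "advanced_hiv"]],
    duration_col="years_to_substitution",
    event_col="substituted",
)
cph.print_summary()  # hazard ratios with 95% confidence intervals
```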