Last month, the world received some encouraging news: Liberia was declared Ebola-free. After a 14-month battle with the virus, which claimed nearly 5,000 Liberian lives and brought the country to its knees, the World Health Organization announced that the devastating epidemic there was over (Guinea and Sierra Leone, however, are still reporting new cases).
As Liberia recovers from the outbreak and begins the long, uphill process of rebuilding its health system to meet ongoing and future health challenges, some of its leaders have reflected on what could have been done to prevent the Ebola outbreak. In a New York Times editorial written about a month before the epidemic's conclusion, Bernice Dahn, Vera Mussah and Cameron Nutt discuss a troubling reality: European researchers knew as long ago as 30 years that Liberian blood samples carried Ebola antibodies, evidence of past exposure that placed Liberia within the Ebola endemic zone. Yet, like the findings of many studies conducted by Western researchers, this knowledge sat atop the proverbial ivory tower, out of reach of the Liberian doctors and policymakers who could have acted to prevent the eventual outbreak.
This disconnect between development research and the communities it studies is an all-too-common trend in an international development community that hosts a Healthcare in Africa Summit in London and discusses poverty reduction strategies fresh off private jets.