1. The web interface makes it hard to go back. The first thing I want to do when I see a surprising answer is reread the question. I understand this is likely just bad engineering, but it makes the whole site feel less trustworthy.
2. Some questions get summarized poorly. For example, the mindfulness question asks "What effect does mindfulness based stress reduction have on self-reported mental health (rates of anxiety, stress, and depression)?" but then summarizes the choices as "reduction rates of mental health issues." You can't drop a phrase like "self-reported" from a question; after all, physically disabled people self-report being happier after their disability (e.g. http://www.ncbi.nlm.nih.gov/pubmed/21870935).
Also, in the "Drug Substitution Programs" question, the text indicates that the research is based on cases where "Addicts were given heroin or substitutes such as methadone or buprenorphine, based on their needs," yet the choices read "Positive effect - Prescribing heroin to addicts reduces crime rates" [note that it dropped the "or substitutes"]. This feels like going for shock value.
3. At the end, the website pushes a newsletter very hard. The site apparently centers on a career guide? If you truly have no axe to grind, then present high-quality information and I'll naturally explore the site more.
4. If your objective is really to raise awareness of how often media publicity for social interventions doesn't reflect efficacy as measured in journals, then I would expect you to propose a plan of action at the end, such as "When hearing about a social program, you can use Google Scholar to tap into research findings..."