- Justin Sullivan/Getty Images
- A symptom-checking tool called Ada Health is launching a new partnership with the Bill and Melinda Gates Foundation.
- On Wednesday, the startup will begin working with the Gates Foundation to study how the tool could support healthcare workers in rural parts of the world.
- Ada Health is already one of the most popular medical apps in over 130 countries.
Getting to the doctor when you’re not feeling well is no easy task no matter where you live. But in many parts of the world, there are bigger problems than high costs and long wait times.
For roughly half the globe’s population, basic healthcare is a luxury that’s too expensive to get. So Ada Health, a tool that lets you type in your symptoms to learn what’s causing them, is launching a new initiative with the Bill and Melinda Gates Foundation to extend the reach of its services.
The Ada app is designed to tell you what’s causing your symptoms with more accurate results than you’d get from a Google search. Users open the app, enter their age and gender, and type in a symptom like pain or a cough. Then an AI-powered bot asks several questions, like what makes the symptom worse, and tells you the most likely culprit.
Starting today, Ada is working with the Bill and Melinda Gates Foundation to study how the platform can be used to support healthcare workers in rural parts of several countries in East and Sub-Saharan Africa, Southeast Asia, South America, and India.
The project is part of Ada’s new Global Health Initiative, a series of projects focused on improving access to primary care in underserved populations across the world. The effort will involve work with local governments, NGOs and other partners as well.
“The reason we’re doing this is the same reason why we started Ada in the first place: it’s about giving people better access to quality healthcare,” Daniel Nathrath, CEO and co-founder of Ada Health, told Business Insider. “While it’s a noble goal to pursue it in the US or Germany, it’s even more important in countries where so many people don’t have access to a doctor.”
Currently, the app is available in roughly 130 countries including Germany (where it started), the US, and Canada. Already, roughly a third of Ada’s customers hail from countries outside of Germany, according to the company.
To Google or not to Google
- The Ada Health team. From left to right: pediatrician Claire Novorol, Daniel Nathrath, and neuroscientist Martin Hirsch.
- Ada Health
To Google or not to Google – that’s often the question when it comes to an ailment like a cough or stomach pain.
But researching your symptoms online can send you down a rabbit hole that leads you to think you have a life-threatening condition. A trip to the doctor, on the other hand, can be time-consuming and expensive.
Nathrath and his co-founder, Claire Novorol, created Ada Health to give people a third option.
Unlike the results that come from sites like WebMD, Ada’s results are based on a growing database of hundreds of thousands of people that match your age and gender. The idea is that by homing in on a population sample you fit into, Ada can give more accurate results.
Say you’re a 31-year-old woman experiencing stomach pain, for example. Once you type in your symptoms and answer Ada’s questions, it might tell you that most of the other 31-year-old women in the database who reported your symptoms were diagnosed with Irritable Bowel Syndrome. Then Ada may advise visiting a healthcare provider. Or if the likely cause of your symptoms is not a serious issue, Ada may suggest that you simply rest.
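The cohort-matching idea described above can be illustrated with a toy sketch. Everything here is invented for illustration – the condition names, the counts, and the triage rule – since Ada's actual model and data are proprietary.

```python
# Toy illustration of cohort-based symptom assessment.
# All condition names and counts below are hypothetical; Ada's
# real model and data are proprietary.

# Invented outcome counts for 31-year-old women who reported
# "stomach pain" plus the same follow-up answers.
COHORT_DIAGNOSES = {
    "irritable bowel syndrome": 412,
    "gastritis": 180,
    "appendicitis": 8,
}

# Conditions that warrant seeing a provider (illustrative rule).
URGENT = {"appendicitis"}

def assess(cohort_counts, urgent=URGENT):
    """Rank likely causes by cohort frequency and suggest a next step."""
    total = sum(cohort_counts.values())
    ranked = sorted(cohort_counts.items(), key=lambda kv: kv[1], reverse=True)
    top, count = ranked[0]
    advice = ("see a healthcare provider" if top in urgent
              else "rest and monitor symptoms")
    return top, count / total, advice

cause, probability, advice = assess(COHORT_DIAGNOSES)
print(f"Most likely: {cause} ({probability:.0%}) -> {advice}")
```

The real system layers adaptive questioning on top of this kind of cohort statistic, narrowing the candidate list with each answer; the sketch only shows the final ranking step.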
- Ada Health
Putting Ada into the hands of healthcare workers
As part of the new partnership with the Gates Foundation, Ada researchers will look at the data the app gathers in several rural, low-income parts of the world to better understand patients’ needs and learn how to improve healthcare delivery to these regions.
In the future, Nathrath said he hopes such insights could be used to do things like help stop a deadly outbreak.
Hila Azadzoy, Ada’s managing director of the Global Health Initiative, told Business Insider that her team is now working to equip Ada with more relevant data on tropical diseases like Chagas and dengue. They’re also analyzing what kinds of physical diagnostic tests they could give people – along with Ada – to confirm some of its assessments.
“Most healthcare workers work door-to-door and can track patient symptoms,” Azadzoy said. “The vision we have is we can put Ada into their hands and even connect Ada with diagnostic tests so that – at the home of the patient – they can pull it out and say, ‘OK, this is confirmed,’” she said.
Are symptom checkers the next big thing in primary care?
Since it was founded in Berlin in 2011, Ada has raised $69.3 million with the help of several big-name backers including William Tunstall-Pedoe, the AI entrepreneur behind Amazon’s Alexa, and Google’s chief business officer Philipp Schindler. The company says Ada has already been used by 6 million people in the US and Europe, where it is one of the highest ranked medical apps.
Ada is not the only tool that lets users input and track their symptoms. Another so-called “symptom checker” is primary-care app K Health, which launched in 2016.
If these services can get the science and AI right, they offer a long list of potential benefits, including reducing healthcare costs, saving time for patients and doctors, slashing unnecessary worry – and even, one day perhaps, helping to prevent an outbreak like Ebola.
But more data is needed on the effectiveness of these services. The last comprehensive assessment of symptom checkers was published by Harvard Medical School researchers in 2015, before Ada’s app or K Health launched. Since then, at least half a dozen other services have emerged.
Until better data becomes available on these apps, they can at least offer users an educated assessment about what’s causing a symptom like a sore throat. And in rural areas where people don’t have access to a healthcare provider, that could be a huge source of support.
“The first step towards getting the right treatment is understanding what’s ailing you,” Nathrath said.
- Alice Zhang, co-founder and CEO of drug discovery company Verge Genomics
- Verge Genomics
- Alice Zhang started Verge Genomics in 2015 with Jason Chen to combine innovation in neuroscience, machine learning and genomics and apply it to the drug discovery process.
- The vision for Verge was to become the first pharmaceutical company that automated its drug discovery engine, helping to rapidly develop multiple lifesaving treatments in diseases like Alzheimer’s disease, ALS, and Parkinson’s disease where no cure exists today.
- On Monday, the San Francisco-based company announced it had raised $32 million in series A funding, led by Draper Fisher Jurvetson, bringing its total amount raised to $36.5 million.
The drug development process is laden with problems that make it lengthy and expensive. Right now, it takes 12 years and $2.6 billion to get a single drug to market, with the drug discovery and development process costing $1.4 billion.
Verge Genomics, run by 29-year-old Alice Zhang, is trying to address these problems by making drug discovery faster and cheaper.
On Monday, the San Francisco-based company announced it had raised $32 million in series A funding, led by Draper Fisher Jurvetson, bringing its total amount raised to $36.5 million.
Zhang was three months shy of earning her MD and PhD from the University of California, Los Angeles when she left school to start Verge Genomics in 2015 with Jason Chen, whom she met during the program.
“I just became very frustrated with the drug discovery process,” she said. “It’s largely a guessing game where companies are essentially brute force screening millions of drugs just to stumble across a single new drug that works.”
At the time, Zhang also recognized the advancements in neuroscience, machine learning and genomics occurring all around her. Genome sequencing had become more and more affordable, and breakthroughs in understanding how function connects with genes opened a new field of possibilities for exploring disease and health. She saw an opportunity to take the guesswork out of drug discovery. The vision for Verge was to become the first pharmaceutical company to automate its drug discovery engine, helping to rapidly develop multiple lifesaving treatments for diseases like Alzheimer’s disease, ALS, and Parkinson’s disease, where no cure exists today.
Other big pharmaceutical companies like Novartis are also starting to follow suit, applying similar technology to different steps of the clinical trial process.
Verge’s 14-person team runs the full pipeline: computer scientists and mathematicians manage the machine learning side, while neurobiologists and drug-development veterans work in the company’s own in-house drug discovery and animal lab.
Verge is using data and software to tackle three main problems in drug discovery. First, many diseases like Alzheimer’s are driven by hundreds of genes; Verge’s algorithms map these genes out from human genomic data. Second, drugs that work in mice often fail in humans, because mice serve only as an approximate mammalian model; instead of relying on animal data for preclinical work, Verge uses human data from day one, which may give greater insight into how a drug actually behaves in human cells. Third, instead of tediously screening millions of compounds, the algorithm computationally predicts which drugs are likely to work.
Verge obtains its human data from brain samples of patients who died of Alzheimer’s disease or Parkinson’s disease, through partnerships with more than a dozen universities, hospitals and brain banks. The company then RNA-sequences the samples in-house, which lets it measure how all of the genes in the genome are behaving simultaneously. This data helps scientists figure out what’s actually causing disease in these patients and see whether there are connections between genes and disease.
Verge’s scientists can make predictions about what drugs they think will work. They can take a patient’s own skin cell and turn it directly into their own brain cells in a dish. Then the predictions can be tested on these brain cells to see if they can rescue them from dysfunction or death – a basic test of drug efficacy. That validation data can feed back into the platform and continuously improve predictions over time, even across different diseases.
The Verge algorithm identifies druggable targets for treatments, then designs drugs accordingly. It does this by mining human samples to identify groups of genes implicated in the disease, and the crucial hubs in these gene networks that can turn them on or off.
The latest investment in Verge will serve to advance its ALS and Parkinson’s disease drugs. Six drugs are in development; those closest to the clinic are being tested to make sure they’re safe and non-toxic. The funding will also be used to expand the number of diseases in Verge’s portfolio.
Emily Melton, a partner at DFJ, told Business Insider that investment in early stage startups is largely about the team, the uniqueness of the idea and the capability and expertise of the research team. But what drew her in most was Zhang. “She was this brilliant founder, with a very organic desire to create an impact,” said Melton. “She felt like it was her calling.”
Using machine learning to recognize patterns that would otherwise go undetected by the human eye can speed up the process while creating a bigger and better feedback loop, said Melton. “We’re rethinking how drug discovery is done, and we’re rethinking how therapeutics are developed.”
- Mario Tama/Getty
- Google has a new algorithm that can quickly sift through thousands of digital documents in a patient’s health record to find important information, Bloomberg reported.
- In some cases, this enables the technology to help doctors make better predictions about how long a patient may stay in a hospital, or the likelihood that the patient may die.
- Google wants to take the tech into clinics and use ‘a slew of AI tools to predict symptoms and disease.’
Google’s artificial intelligence systems can cull through and analyze a person’s medical history to help doctors make more accurate predictions about a patient’s health, and even provide estimates of when a patient may die, according to a Bloomberg report.
In one situation, doctors estimated that a woman with cancer who had arrived at a city hospital with fluid in her lungs had a 9.3 percent chance of dying during her stay, Bloomberg reported. A new type of algorithm from Google put her risk of death higher, at 19.9 percent. The woman died a few days later.
Google is one of many companies trying to apply AI technology to solve some of the problems faced by the health-care sector. AI has shown enormous promise at analyzing vast amounts of data and performing tasks that typically require many hours of human work.
In this case, Google’s AI uses neural networks, which have proven effective at gathering data and then using it to learn and improve analysis. According to Bloomberg, Google’s tech can “forecast a host of patient outcomes, including how long people may stay in hospitals, their odds of re-admission and chances they will soon die.”
AI can help doctors make better diagnoses
Google’s algorithm can retrieve “notes buried in PDFs or scribbled on old charts” to make its predictions, Bloomberg wrote. All of this could help doctors make better diagnoses.
Predicting death is likely to stoke fears among those who worry that AI may some day hold too much control over humans.
And Google’s technology raises a variety of ethical concerns about how it is used and who gets access to it. Decisions about insurance coverage for patients seeking certain medical treatment, or hospitals trying to allocate scarce beds for patients, are obvious examples of potentially problematic scenarios where such AI predictions could come into play.
According to the report, the findings so far are that Google’s system is faster and more accurate than other techniques at evaluating a patient’s medical history. Eventually, Google would like to take “a slew of new AI tools” that can accurately predict symptoms and disease into clinics.
- Shutterstock/Blend Images
- A new system created by Verily and Google AI researchers can use photographs of the retina to predict risk factors for cardiovascular disease.
- The system works about as well as presently used predictive methods and is far less invasive.
- In a recent study, researchers could see what the artificial intelligence software was paying attention to as it studied the eye.
Your eyes might be the perfect windows into your heart.
At least, they’re windows that Google-created artificial intelligence software can use to calculate your risk factors for heart disease.
According to a study recently published in the Nature Biomedical Engineering journal, an AI algorithm created by Google AI and Verily Life Sciences (an Alphabet subsidiary that spun off from Google) can predict whether a patient is likely to suffer a major cardiovascular event like a heart attack or stroke within five years, based on a photo of their retina.
So far, the predictions work about as well as presently accepted methods that are more invasive, according to the study.
Learning to predict heart disease
The fact that disease can be spotted in the retina isn’t a surprise. Doctors often spot medical conditions including diabetes, extreme high blood pressure, high cholesterol, and some cancers during eye exams.
To mimic that ability, the Verily and Google researchers trained AI software to identify cardiovascular risks by having the system analyze retina photos and health data from 284,335 patients. Specifically, it looked at retinal fundus images – photos that show blood vessels in the eye.
- The retinal fundus photographs that the AI software uses to assess cardiovascular disease risk.
- Poplin et al., Biomedical Engineering, 2018
Known risk factors for cardiovascular disease include age, blood pressure, and gender, among other things. Based on an eye scan, the algorithm was able to predict a person’s age to within 3.26 years, their smoking status with 71% accuracy, and their systolic blood pressure (the upper number in a reading) to within 11 mmHg.
Because the algorithm was so effective at assessing these factors, the researchers decided to see how well it could predict actual strokes and heart attacks.
They used data from a set of 150 patients who had suffered major cardiovascular events within five years of their eye scan. (That data set included 12,026 people, but only several hundred experienced a major cardiac health event, with clinical data available for 150 of those patients.) When the researchers presented the algorithm with two retina images and asked it to predict which patient would go on to suffer a major cardiac event or stroke, it picked the correct scan 70% of the time.
By comparison, the European SCORE risk calculator, which requires a blood test, is currently used to predict risk for cardiovascular disease. That calculator predicted the correct scan in 72% of the cases from the same dataset, which is not much better than the AI performance – and the AI did just as well when it had access to demographic information like age, gender, and BMI.
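The 70% figure above describes a standard pairwise evaluation: given one patient who went on to have a cardiac event and one who did not, how often does the model assign the higher risk score to the right one? That fraction is the concordance statistic, equivalent to the area under the ROC curve. A minimal sketch, with invented risk scores:

```python
# Sketch of the pairwise evaluation described above. The risk
# scores below are invented for illustration only.
from itertools import product

def pairwise_accuracy(event_scores, no_event_scores):
    """Fraction of (event, no-event) patient pairs where the model
    assigns the higher risk score to the patient who had the event.
    Ties count as half correct, the usual AUC convention."""
    correct = 0.0
    pairs = 0
    for e, n in product(event_scores, no_event_scores):
        pairs += 1
        if e > n:
            correct += 1
        elif e == n:
            correct += 0.5
    return correct / pairs

# Hypothetical model outputs (predicted five-year event risk).
had_event = [0.9, 0.6, 0.4]
no_event = [0.5, 0.3, 0.2]

print(f"{pairwise_accuracy(had_event, no_event):.0%}")  # prints 89%
```

On this metric a coin flip scores 50%, which is why the reported 70% (versus SCORE's 72%) counts as roughly matching an established, more invasive method.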
A powerful demonstration
- From images like this, the researchers were able to identify what exactly the AI was paying attention to while assessing risk.
- Poplin et al., Biomedical Engineering, 2018
Cardiovascular disease is the leading cause of death in the world. Because of that, the idea that a routine retina scan could provide an early warning of heightened risk – hopefully in time to change behavior – is exciting.
The new study suggests there is more information available in the retina than scientists previously realized. The AI system is particularly exciting because it takes medical images that might already exist and gets new and potentially important data from them. And that information can be gathered and used without invasive tests.
Researchers involved in the study were also able to track which factors the algorithm was relying on to make its predictions, since the system created heat-maps of areas it focused on. In this case, the researchers know the system was paying particular attention to blood vessels to calculate blood pressure, for example.
Such information isn’t always available in machine learning processes. But in this case, it can help scientists better understand the wealth of data that’s available in retina images in the first place.
Overall, this new study highlights the ways that deep learning is transforming how scientists study the body. Machine learning can even take scans that we already have and use them to generate a far more complete picture of human health.
Still, as promising as these results seem, they are preliminary, according to a blog post by Dr. Michael McConnell, the Head of Cardiovascular Health Innovations at Verily.
“[M]ore work must be done to develop and validate these findings on larger patient cohorts before this can arrive in a clinical setting,” McConnell wrote.