Getting a sneak peek at the healthcare ecosystem of the future may be as easy as looking to Netflix or Amazon.

Apple’s ongoing encroachment on the healthcare industry – marked most recently by news that it will set up its own independent clinics for employees – leaves little doubt that the smartphone giant sees itself playing a significant role in the medical ecosystem. The company already possesses a suite of health-focused solutions in its ever-evolving technical inventory that continually collects fitness data from users who allow it.

Paul Clark, director of healthcare research at AI and analytics firm Digital Reasoning, believes that this presents ripe pickings for future, data-driven healthcare initiatives.

“You have to view it within the context of other Apple initiatives that are going on, and things like machine learning,” Clark tells Digital Health News.

“Apple and Amazon’s move to take advantage of the FHIR standard will ensure healthcare providers make patient records available through the standard. [Users] can then download them through an app on their smartphone.”

By combining medical histories with tools like Apple HealthKit and wearables that collect vitals and fitness data, you end up with a holistic, integrated healthcare ecosystem, Clark says.
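The FHIR standard Clark mentions represents patient records as plain JSON “resources”, which is what makes them easy for a smartphone app to download and read. As a rough sketch – all names and values below are invented for illustration – an app reading a downloaded FHIR Patient resource might look like this:

```python
import json

# Hypothetical patient record as it might arrive via the HL7 FHIR standard:
# every FHIR resource is a JSON document with a declared resourceType.
# All values here are invented for illustration.
fhir_patient = json.loads("""
{
  "resourceType": "Patient",
  "id": "example-123",
  "name": [{"family": "Smith", "given": ["Jane"]}],
  "birthDate": "1980-04-01"
}
""")

# An app on the patient's phone could read basic demographics like this:
name = fhir_patient["name"][0]
full_name = " ".join(name["given"]) + " " + name["family"]
print(full_name)                   # Jane Smith
print(fhir_patient["birthDate"])   # 1980-04-01
```

Because the format is standardised, the same parsing logic works regardless of which hospital system produced the record – which is precisely the interoperability Clark is describing.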

When you consider that every iPhone user is carrying a camera in their pocket, it doesn’t take a wild leap of imagination to predict that Apple could also provide telemedicine services to patients at some point in the future.

“It won’t take long to connect the dots between the infrastructure of Apple FaceTime with their own employed doctors, and the model of ‘tele-doctors’,” explains Clark.

He sees this as a long-term play spanning the next 10 or so years, by which time machine learning is expected to play a larger role in healthcare delivery.

Even so, Clark reckons this new, patient-centric healthcare model won’t be completely unrecognisable and could mirror the myriad digital ecosystems we currently engage with on a daily basis, such as Netflix, Spotify and Amazon.

“Similar to how Amazon reminds you of purchases or makes recommendations to you, an AI that understands meaning and context can provide appropriate recommendations based on your body weight, your age, your medical history and so on,” he says.

Clark points out that this degree of contextual information, based on constant data gathering and user input, could deliver better contextualisation than a patient might get from a one-time meeting with a clinician.

“It’s a whole different game when you’re talking about interacting and engaging with people on a regular basis through their preferred method of communication, their mobile device.

“People interact with their smartphones 50, 80 times per day. You don’t get that level of engagement with a face-to-face meeting with a clinician.

“A one-time meeting is not going to have the same effect as being integrated with your paired device that is always with you.”

While patients will still need face-to-face meetings with their doctors, Clark points out it will be possible for a higher proportion of these to be done via video consultation, freeing up precious time for overstretched medical staff.

Of course, it’s not just Apple that finds itself in an ideal position to take advantage of the patient revolution. Google and – as recent, rather unsavoury events have shown, Facebook – are also ingesting unfathomable amounts of data on consumers.

The integration of social and location data collected by these firms could open up new behavioural insights on populations, Clark suggests, allowing clinicians to provide a range of digital interventions for physical, social, psychological and mental health.

“This will change the dynamic of the healthcare interaction completely,” says Clark. “This is real information that we have right now – all we need to do now is integrate and act upon it with existing technology that Apple, Amazon and Google already possess.”