A Computerized, Artificial Intelligence-Based Global Health Care System: A Pandemic Perspective

This article presents a robust, computerized, Artificial Intelligence-based health care system that works at a global level and can efficiently handle global issues such as a pandemic through a unified, co-operative international approach, enabled only by mutual trust and understanding.

Most health care systems today are efficient and well equipped. But as the pandemic has shown us, they fall short when it comes to co-operating and efficiently dealing with the situation from a global perspective. We need a health care system that co-operates among towns and cities, not just among individual care facilities, and among countries, not just among cities. Most health care institutions in today's world work on a stand-alone model, where networking between even two hospitals across the street relies on emails, messenger services and telephone conversations, landline or mobile. In this article I shed light on an approach for better inter-operability among these stand-alone systems, both within towns (cities or villages) and at a global level, across countries.

Figure 1. The flowchart describes how information can be processed and how decision making using Artificial Intelligence can be applied to future medical tasks and leadership.

It was around a decade ago that global-level work on the Clinical Document Architecture (CDA) was proposed. We are all familiar with medical records, which may be written by a human or a computer. When records are written on a computer, they need a format, typically called a medical description language, or MDL for short. To briefly explain what CDA was all about, so that you don't have to look it up on a search engine: CDA, in layman's language, provided a standard for storing a patient's medical information. The details are stored in a format finalized by experts, and they range from the patient's medications, disease severity and the doctor's specifications for medications, to the diet restrictions followed, and may even include comments from the nurse in charge of the patient.

There are several forms in which CDA can be written, including PHMR (Personal Healthcare Monitoring Report) and CCR (Continuity of Care Record), to mention a few; these are specialized forms of the CDA model. They may be understood as different ways to input and view information depending on who is accessing the data. The document is created when a person enters the hospital and is initialized with basic details, right down to the person's height, to illustrate. XML fragments, specified by XML tags, are then updated and/or added once a doctor has comments or medications to prescribe for the patient. Each disease has a code; codes were added to the specification to make it easy for a computer to understand the condition of a patient. Natural language could have been processed instead (using NLP), but then, and even now, resolving the ambiguity of natural language (about a disease name, symptoms, side effects and so on) is paramount in a critical application such as health care, and accuracy could not be compromised by even 0.9%.
In this way the XML is updated, adding blocks of tags for each disease, symptom, medication and so on. The initial idea of the proposal was that the XML for a patient ID would go to a centralized server, where cumulative files for a whole health institution could be stored. This makes it easy to process hospital-level information. But nobody predicted a pandemic, as far as I know. So, to the best of my knowledge, even where these models have been implemented in some care systems, there seems to be an absence of global records and of any cumulative understanding of those records.
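To make the idea concrete, here is a minimal sketch of what such an XML fragment and its processing might look like. The element and attribute names below are simplified, illustrative stand-ins, not the actual HL7 CDA schema, and the disease code is an ICD-10-style example:

```python
import xml.etree.ElementTree as ET

# Illustrative only: simplified stand-in elements, not real CDA markup.
record = """
<clinicalDocument patientId="P-1001">
  <observation code="J18.9" codeSystem="ICD-10" displayName="Pneumonia"/>
  <medication name="amoxicillin" dose="500mg" frequency="three times daily"/>
  <note author="nurse">Patient stable, mild fever in the evening.</note>
</clinicalDocument>
"""

# A computer can read the coded condition directly, with no NLP ambiguity.
root = ET.fromstring(record)
print(root.get("patientId"))  # P-1001
for obs in root.findall("observation"):
    print(obs.get("code"), obs.get("displayName"))  # J18.9 Pneumonia
```

Because the condition is a code rather than free text, any server that understands the code system can process the record without ambiguity.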

Why can global accumulation of such records be considered necessary? Yes, that is the question of why you are reading this article! Computerized documentation is provided by many health care facilities, including individual clinics. They go by names such as medical records and medical descriptions, or are simple Excel-file-based databases used at the backend (either on a web server or in a desktop application); some may use XML files themselves. But they lack a unified approach and standards for storing information, which makes it difficult to relate data on one server to data on another. Why do we need to connect data across servers? Well, there was not much need for it before this: a pandemic never hit us between the Computer Revolution, as we may call it, and specifically the Artificial Intelligence Revolution, as I would like to call it, and what is going on now! This may be why centralized approaches to storing such content were never pursued. You may ask: how many supercomputers can process scattered data, spread across countries, with each small unit in a different form? I say: none, at present! Why? You would need a great deal of effort from human specialists to write rules for inter-compatibility between the forms in which data is kept in each clinic, each hospital and each care house, to mention a few. Basically, you need standardized data, which no one has time to create, given the load already on our front-line workers, the health care providers! Now you may ask me why doctors and specialists are needed for this task. Well, all of this is medical terminology: the name of a disease in a particular format, the medicine names. Check out LOINC and SNOMED codes, for instance, which some have recommended for a standardized approach. How many medical description languages actually use these standardized codes? And do we need more standardized codes, or are these enough?
I think a lot of work has gone into coding medical standards, and re-usability is a popular software, if not hardware, principle we all follow. The answer would be yes, if we can gather some medical coding specialists to translate the formats of each MDL into a unified coding format that can be fed into supercomputers.
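The translation step such specialists would perform can be sketched as a lookup from a clinic's local, free-text labels to a standardized code. The table below is purely hypothetical; real mappings would come from LOINC or SNOMED CT terminology services, and the ICD-10-style codes shown are illustrative only:

```python
# Hypothetical local-label -> standardized-code table, built and verified
# by medical coding specialists (codes here are illustrative only).
LOCAL_TO_STANDARD = {
    "pneumonia": "J18.9",
    "viral pneumonia": "J12.9",
    "fever of unknown origin": "R50.9",
}

def translate(local_label):
    """Map a clinic's free-text label to a unified code, if one is known."""
    return LOCAL_TO_STANDARD.get(local_label.strip().lower())

print(translate("Pneumonia"))  # J18.9
print(translate("headache"))   # None -> needs a human medical coder to map
```

The point of the sketch is the fallback: whenever `translate` returns nothing, a specialist must extend the table, which is exactly the human effort the paragraph above describes.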

The proposal was well studied and several guidelines were framed. However, it seems that not much has been implemented with regard to centralizing medical information. Individual units can maintain their medical records, process them and even apply Artificial Intelligence to the data to generate the analytical, graphical visualizations many AI and data scientists are used to viewing for decision making. But here we stand amidst, and in some places at the onset of, the second and third waves of the pandemic. What is needed now is to use medical coders to translate the MDL files from individual units, institutions, villages, clinics and so on into the unified form proposed years back.

The medical information may range from

  1. group information: such as a group of clinics operating in an area
  2. institution information: such as an autonomous or a government hospital
  3. local area information: such as a locality
  4. metropolitan information: as the name suggests, each metropolitan area
  5. country information, to
  6. global information: the ultimate aim—to end a pandemic from its roots—if it ever hits back!
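Once records share one coding standard, rolling them up through these levels becomes a mechanical aggregation. Here is a minimal sketch; the clinic names, places and case counts are invented for illustration, and the hierarchy (clinic, city, country) mirrors the levels listed above:

```python
from collections import Counter

# Hypothetical per-clinic case counts keyed by standardized disease code.
clinics = {
    ("India", "Delhi", "clinic_a"): Counter({"J12.9": 40, "R50.9": 12}),
    ("India", "Delhi", "clinic_b"): Counter({"J12.9": 25}),
    ("France", "Paris", "clinic_c"): Counter({"J12.9": 18, "R50.9": 3}),
}

def aggregate(level):
    """Roll clinic-level counts up to a coarser level (1=country, 2=city)."""
    totals = {}
    for key, counts in clinics.items():
        group = key[:level]  # truncate the key to the requested level
        totals.setdefault(group, Counter()).update(counts)
    return totals

print(aggregate(1)[("India",)]["J12.9"])  # 40 + 25 = 65 country-level cases
```

Because every clinic uses the same codes, the same few lines serve any level of the hierarchy, which is precisely what today's incompatible formats prevent.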

Once this information is on servers, we can apply a lot of Artificial Intelligence to it and help people, not only doctors and policy makers, with authentic information from reliable sources. I now propose the use of Artificial Intelligence in

  1. information extraction,
  2. decision making,
  3. better forecasting of the next outbreak,
  4. medical suggestions to help doctors,
  5. care suggestions at home, and
  6. automated email updates on places to avoid, to mention a few.
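As a taste of item 3, here is a deliberately tiny forecasting sketch: predicting the next day's case count from a moving average of recent days. The daily figures are invented, and a real system would use epidemiological models (e.g. SEIR) or learned time-series models rather than this toy rule:

```python
def moving_average_forecast(cases, window=3):
    """Predict the next day's count as the mean of the last `window` days."""
    recent = cases[-window:]
    return sum(recent) / len(recent)

# Hypothetical daily case counts aggregated from the unified records.
daily_cases = [120, 135, 150, 160, 175, 190]
print(moving_average_forecast(daily_cases))  # (160 + 175 + 190) / 3 = 175.0
```

Even this crude forecast only works because the counts feeding it are comparable across units; the modelling can be as sophisticated as we like once the data is unified.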

Once this mission is accomplished, we will have better tracking, and who knows, we may even be able to trace a virus back to its roots, with the help of the tremendous development in Artificial Intelligence that can already beat humans at tough games like chess!

More on this in my upcoming blog. Stay tuned! Take care! And Start Implementing It!

Published by nidhk

I take an eager, research-based approach to solving problems in the domain of Artificial Intelligence and Computer Applications. I find solutions based on my strong knowledge of and foundations in subjects like Artificial Intelligence, Machine Learning, Data Mining, Optimization Techniques and Linear Algebra, to mention a few. This is augmented by a high standard of coding skills ranging from C++, Java and Perl to data science languages such as Python, R and MATLAB. To further establish this, many of my works have already been published online as research papers in well-reputed journals. I have extensive experience in Natural Language Processing applications such as summarization, search, retrieval, sentiment analysis, WordNet and deep learning. I have completed a PhD specializing in Artificial Intelligence and have worked on real-time implementations of various applications of Computer Science. The domains I have worked in include Health Care Systems, Electronic Document Management Systems, Natural Text Mining, EDA and Web Development. Apart from my profession, I have an inherent interest in writing, especially poems and stories, as well as painting, cooking, photography and music, to mention a few!
