The world of healthcare has become one of the most demanding and rapidly evolving industries. This shift has been driven largely by a fast-changing technology landscape, where new capabilities are enabling faster and better research, development, and overall outcomes. The catalyst for this transformation has been access to large troves of data and the ways organizations have improved both data quality and overall data utilization.
The Challenges and Apparent Solutions
Improvements in data quality and utilization have sparked better research and development in healthcare, driven by the pairing of decision-making with the ability to leverage all types of data in the analytics process. Organizations today understand the ecosystem of their data, and the problem of accessibility has largely been solved through the Internet of Things (IoT), ETL pipelines, and scalable cloud data stores. This was made even more impactful by the massive adoption of the dashboard.
The image above details the before and after workflow of my time as an analyst. I joined an analytics team prior to the dashboard adoption and spent even more time thereafter helping build different views for different departments in my organization. My time as an analyst consisted of working closely with both Data Infrastructure (DI) and Information Technology (IT) to ensure I would have access to the right data. If that data wasn’t being collected, that would kick off an ancillary project for the DI team to start collecting it. Once we confirmed the right data was being collected, the DI team would provide me access to a copy of it, typically in an Amazon S3 bucket or an Amazon Redshift cluster.
Analyzing the Data
At this point access had been attained, and it was my responsibility to start exploring the data. A few team members used a query service like Amazon Athena to start querying the databases. Not all of us as analysts had basic SQL knowledge, so we spent time training and learning with a few of the DI team members. We ended up building a library of queries, curated by the DI team, that helped us get the basic pieces of data we wanted. Anything more sophisticated required more time with the DI team or a lot of time figuring it out on our own. Getting the right data was just half the battle. Once we had it, it went straight into an Excel spreadsheet, where the hardest work began. We would spend days and weeks combing through the data and often never fully realize insights. The biggest challenge was that we needed faster computations and some way to view the data other than spreadsheets and data tables.
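The “curated query library” idea above can be sketched in a few lines of Python. This is an illustrative toy, not our actual setup: it uses a local SQLite database with made-up visit data in place of the Athena/Redshift warehouse, and the query names and table schema are hypothetical.

```python
import sqlite3

# Stand-in for the warehouse the DI team exposed (Athena/Redshift in practice);
# here, an in-memory SQLite database with illustrative, made-up visit data.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE visits (
        visit_id INTEGER PRIMARY KEY,
        department TEXT,
        duration_minutes INTEGER
    )
""")
conn.executemany(
    "INSERT INTO visits (department, duration_minutes) VALUES (?, ?)",
    [("cardiology", 45), ("cardiology", 30), ("oncology", 60), ("oncology", 90)],
)

# A small "curated query library": named, parameterized queries that analysts
# can reuse without writing SQL from scratch each time.
QUERY_LIBRARY = {
    "visits_by_department": """
        SELECT department, COUNT(*) AS visit_count
        FROM visits
        GROUP BY department
        ORDER BY department
    """,
    "avg_duration_for_department": """
        SELECT AVG(duration_minutes)
        FROM visits
        WHERE department = ?
    """,
}

def run_query(name, params=()):
    """Look up a curated query by name and execute it with the given parameters."""
    return conn.execute(QUERY_LIBRARY[name], params).fetchall()

print(run_query("visits_by_department"))                      # counts per department
print(run_query("avg_duration_for_department", ("oncology",)))  # average minutes
```

The key design point is that the SQL lives in one reviewed place (the DI team’s library), while analysts only supply a query name and parameters.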
Visualizing the Data
The dashboard arrived in the form of Looker. The team shifted its time to building out an environment that democratized data by providing access through an interface that was more visual and easier to navigate. My skills and my team’s skills now shifted to learning a new querying language (LookML) in order to build the right dashboards and views for our organization. The dashboard was an extremely interesting solution to the problem of accessibility. It solved a very real and tangible problem: not being able to see the data. The intuitive, highly visual UI solved this. But a deep-rooted problem remained. It didn’t matter that I had access to the data; I didn’t know what to do with it. This challenge created the need for deeper knowledge and specialties in working with complex data. We quickly reached this point at my organization, which led to the massive growth of our data science team.
Traversing this data requires certain specialties so that time spent exploring is guided and maximized. Organizations and teams require rich data science skills to reach meaningful results in a timely fashion. Oftentimes this leads to bottlenecks and unmanageable backlogs of projects because teams are unequipped to tackle them effectively on the spot. With the landscape of data growing and organizations pressuring analytics teams to uncover more, the lack of timely insights will unfortunately cause organizations to fall behind in their work. With data-driven quality and utilization improvements at the core of healthcare transformation, it is important that analytics teams are empowered to do this outside of traditional, heavy-duty analytics workflows.
The image above details a specific hypothesis around solving problems on the spot, or at the very least, learning something meaningful on the spot that helps kick off a further investigation that is not starting from square one. The hypothesis is that teams solving more problems on the spot will reach meaningful, business-impacting insights faster than teams that consistently outsource pieces of work to data science, data infrastructure, or information technology teams.
Virtualitics Immersive Platform Helps Analysts Solve Problems Right Now
Virtualitics is an augmented analytics company whose AI Platform enables analysts and domain experts, enhancing the value generated by existing teams without requiring extensive IT or data science support. The embedded no-code AI equips analysts with the skills and techniques to quickly explore data and develop dynamic storytelling that targets decision makers directly. The Virtualitics AI Platform focuses on the most critical components of analysts’ core workflows, letting them work at their best.
1. Querying data
One of the most critical things for an analyst is to build and maintain their database knowledge. This knowledge is typically gained through Entity-Relationship Diagrams (ERDs) or hours of querying. The visual aspect of this is extremely limited, and if the data isn’t organized cleanly, it can become almost impossible. IBM estimates that 80% of data is unstructured, which heavily impacts database knowledge and the overall effectiveness of data utilization. VIP is a platform rooted in visualizations, helping analysts understand relationships in their data within seconds. Analysts quickly gain database knowledge and are also given a mechanism for communicating that knowledge to stakeholders who are not as familiar with the data or equipped with the analyst’s skills.
2. Merging domain expertise with data science for deeper understanding
It is not one or the other. One of the most critical assets of an analyst is their command of the domain and the business. They spend much of their time getting as close to the business problems as possible. You can think of them as the front line for exploring business problems, but merging that knowledge with analytics usually requires other resources. After researching and speaking with analysts in the healthcare industry, I quickly learned how critical it was for them to have an intimate understanding of the data they would be using. The vast nature and complexity of the data made that a daily challenge for analysts. The Virtualitics AI Platform eliminates the initial need for other resources, empowering analysts to learn more and identify insights that can meaningfully impact the problem or guide subsequent analysis done by other teams. The platform provides no-code AI routines that enable an analyst to deploy ML models on their data and quickly identify relationships or characterize the prime drivers in any of their existing models.
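To make the idea of “identifying relationships and prime drivers” concrete, here is a minimal sketch of one common underlying technique: ranking candidate drivers by the strength of their correlation with an outcome. This is an illustration of the general concept, not Virtualitics’ actual routines; the feature names and numbers are invented for the example.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative, made-up data: candidate drivers vs. an outcome of interest.
features = {
    "age":             [34, 51, 29, 63, 47, 55],
    "visits_per_year": [2, 6, 1, 8, 5, 7],
    "zip_digit_sum":   [5, 9, 4, 8, 3, 10],  # deliberately weaker, noisier signal
}
outcome = [120, 180, 110, 220, 170, 200]  # e.g., annual cost of care

# Rank candidate drivers by the strength of their linear association
# with the outcome (largest absolute correlation first).
ranked = sorted(
    ((name, pearson(vals, outcome)) for name, vals in features.items()),
    key=lambda item: abs(item[1]),
    reverse=True,
)
for name, r in ranked:
    print(f"{name}: r = {r:+.2f}")
```

Real driver analysis goes well beyond pairwise correlation (it must handle non-linear effects and interactions between features), but this captures the basic question a no-code routine answers on an analyst’s behalf: which columns move with the outcome, and how strongly.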
3. Improving communication and insights presentation
My research and interviews also showed that most analysts spend their upskilling time on data analytics capabilities, typically focused on learning at least one coding language. Almost none of the analysts highlighted presentations as something they immediately thought of improving. One of the biggest challenges in the industry is finding a common language between highly technical and non-technical audiences. Part of the decision-making process is setting a foundational understanding and reaching alignment. Analytics teams today spend more time trying to convince stakeholders of one thing than implementing and steering actionable change. Part of the reason for this is the knowledge gap and how “the why” behind insights is communicated. VIP addresses this through core pillars of accessibility, visualization, and explainability. The concept is that VIP can sit at the center of an organization, offering multiple access points based on your ecosystem, whether that is naturally using Python in a Jupyter notebook or simply using a web application in your browser. Using these core pillars, organizations can collaborate in the environments where they feel most comfortable, using no-code AI to guide the analytics experience and multi-dimensional visualizations as a common language for explaining “the why” behind the analytics.