SciForce Blog

From Insights to Action: The Role of Predictive Analytics in Business Transformation

Predictive analytics uses historical data and techniques such as statistical modeling and machine learning to predict future outcomes. It provides accurate forecasts, helping organizations anticipate trends and behaviors on horizons from milliseconds to years ahead. The global predictive analytics market was valued at 14.71 billion U.S. dollars in 2023 and is projected to grow from 18.02 billion U.S. dollars in 2024 to 95.30 billion U.S. dollars by 2032, a compound annual growth rate (CAGR) of 23.1% over the forecast period (2024-2032). 80% of business leaders recognize that data is crucial for understanding operations, customers, and market dynamics. By combining historical data with predictive models, businesses gain a comprehensive view of their data, enabling real-time predictions and proactive responses to changing conditions. This article covers the basics of predictive analytics: how it works, its benefits, the main types of models, and key use cases across industries.

Predictive analytics enables organizations to identify patterns in data to detect risks and opportunities. By designing models that reveal relationships between various factors, organizations can assess the potential benefits or risks of specific conditions, supporting informed decision-making. Key benefits include:

Improved Decision-Making: Provides detailed, data-driven insights, such as customer buying patterns and market trends.
Increased Efficiency: Streamlines operations by identifying bottlenecks in production lines and optimizing supply chain logistics.
Cost Reduction: Identifies specific areas, such as energy usage and inventory management, where costs can be cut without compromising quality.
Risk Management: Detects potential risks, such as fraud in financial transactions or equipment failures in manufacturing.
Enhanced Customer Experience: Uses predictive insights to tailor marketing campaigns, recommend products, and customize services.
How does predictive analytics work? The process typically follows these steps:

Data Collection: Historical data is collected from sources such as transaction records, customer interactions, and sensor data.
Data Cleaning and Preparation: The data is cleaned to remove errors, fill in missing values, and standardize formats, ensuring it is accurate and ready for analysis.
Model Selection: Based on the specific problem, an appropriate model, such as linear regression, decision trees, or neural networks, is selected.
Model Training: The chosen model is trained on historical data, learning to identify patterns and relationships within it.
Model Testing: The model is tested on a separate subset of data to evaluate its accuracy and performance, ensuring it can make reliable predictions.
Deployment: The trained model is deployed into the production environment to start making predictions on new incoming data.
Monitoring and Refinement: The model's performance is continuously monitored in real time, and adjustments are made to improve its accuracy and adapt to new data trends.

Predictive models come in several main types:

1. Regression Models
Regression models predict continuous outcomes from historical data by identifying and quantifying relationships between variables. Walmart, for example, uses regression models to analyze past sales data, factoring in variables such as seasonal trends, holiday effects, pricing changes, and promotional campaigns.

2. Classification Models
Classification models categorize data into predefined classes, making them useful for distinguishing between different types of data points. Gmail uses classification models to analyze incoming emails, considering sender address, email content, and user behavior to categorize messages as spam or legitimate mail. The model is trained on a large dataset of labeled emails to recognize patterns typical of spam.

3. Clustering Models
Clustering models group similar data points together without predefined labels, helping to identify natural groupings within the data.
Amazon uses clustering models to segment customers by purchasing behavior, analyzing purchase history, browsing patterns, and product reviews. This allows Amazon to create targeted marketing campaigns with personalized recommendations and promotions for each customer group, such as frequent electronics buyers or regular book purchasers.

4. Time Series Models
Time series models analyze data points collected at specific time intervals, making them useful for trend analysis and forecasting. Financial analysts at Goldman Sachs use time series models to analyze historical stock price data, including daily closing prices, trading volumes, and economic indicators, to predict future movements and make informed investment decisions and recommendations.

5. Neural Networks
Neural networks use layers of interconnected nodes to model complex relationships in data and are particularly effective for pattern recognition and classification tasks. Google's DeepMind uses neural networks in its image recognition software to identify and classify objects within photos. In wildlife conservation projects, for instance, this software can analyze thousands of camera trap images, distinguishing between species such as lions, zebras, and elephants.

6. Decision Trees
Decision trees use a tree-like model of decisions and their possible consequences, making them effective for classification and regression tasks. Netflix uses decision trees to recommend movies and TV shows by analyzing user data such as viewing history, ratings, and preferences. If a user likes action movies, for instance, the model recommends similar action titles or related genres.

Predictive analytics is transforming industries by enabling organizations to make data-driven decisions and anticipate future trends. Here are some high-level examples of how it is applied across different sectors.
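Before turning to the industry examples, the workflow described above (collect, clean, train, test, deploy) can be sketched in a few lines. This is a toy illustration only: it fits a simple linear regression on invented monthly sales figures, evaluates it on a holdout, and forecasts one step ahead. The data and numbers are made up, not drawn from any real deployment.

```python
# Toy end-to-end predictive workflow: fit a simple linear regression on
# historical data, test it on a holdout, then forecast the next period.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def predict(model, x):
    a, b = model
    return a * x + b

# 1-2. Collected and cleaned history: (month index, units sold), invented.
history = [(1, 120), (2, 135), (3, 148), (4, 160), (5, 176), (6, 189)]

# 3-4. Model selection and training on the first four months.
train, test = history[:4], history[4:]
model = fit_linear([x for x, _ in train], [y for _, y in train])

# 5. Testing: mean absolute error on the held-out months.
mae = sum(abs(predict(model, x) - y) for x, y in test) / len(test)

# 6. "Deployment": forecast month 7.
forecast = predict(model, 7)
print(round(mae, 2), round(forecast, 1))  # MAE ~ 1.85, forecast ~ 200.6
```

In production the same loop would run with a richer model and step 7 (monitoring) would retrain as new data arrives; the structure, however, stays the same.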
The Mayo Clinic uses predictive analytics to identify patients at high risk for chronic diseases such as diabetes and heart disease. By analyzing EHR data, genetic information, and lifestyle factors, the clinic can offer early interventions and personalized treatment plans. Other possible applications of predictive analytics in healthcare include:

1. Disease Prediction: Identifies individuals at high risk for diseases such as diabetes and cancer by analyzing patient history, genetics, and lifestyle, enabling early intervention and reducing treatment costs.
2. Patient Readmission: Estimates readmission likelihood, allowing targeted interventions such as enhanced discharge planning and follow-up care.
3. Resource Management: Optimizes patient admissions, staff schedules, and medical supplies.
4. Personalized Medicine: Enables personalized treatments and better outcomes by analyzing genetic data and treatment responses.
5. Clinical Decision Support: Enhances diagnosis and treatment by providing evidence-based recommendations.
6. Population Health Management: Identifies health trends, helping public health organizations develop targeted interventions and plan for disease outbreaks.

In finance, as noted above, Goldman Sachs analysts apply time series models to historical stock prices, trading volumes, and economic indicators to guide investment decisions. Further possibilities for predictive analytics in finance include:

1. Credit Scoring: Assesses creditworthiness by analyzing credit history, transaction patterns, and financial behavior.
2. Fraud Detection: Identifies suspicious transactions and patterns, enabling institutions to detect and prevent fraud in real time.
3. Investment Strategies: Helps forecast market movements and optimize asset allocation by analyzing market trends, economic indicators, and historical data.
4. Risk Management: Forecasts potential market, credit, and operational risks, helping to develop mitigation strategies, ensure regulatory compliance, and maintain stability.
5. Loan Default Prediction: Estimates loan default likelihood by analyzing borrower profiles and economic conditions.
6. Market Trend Analysis: Provides insights into market trends by analyzing historical data and economic indicators, helping to anticipate market shifts.

In marketing, Spotify applies predictive models to identify users who are likely to cancel their subscriptions. By analyzing listening habits, subscription history, and engagement metrics, Spotify can implement retention strategies to reduce churn. Its music recommendation system is also widely known. Other opportunities include:

1. Customer Segmentation: Groups customers by behavior and preferences, enabling tailored marketing campaigns that increase engagement and conversion rates.
2. Churn Prediction: Identifies customers likely to leave, allowing companies to implement retention strategies and improve customer loyalty.
3. Sales Forecasting: Provides accurate sales predictions, helping businesses manage inventory effectively and optimize marketing strategies.
4. Lead Scoring: Evaluates and ranks leads by their likelihood to convert, enabling sales teams to prioritize high-potential prospects and improve conversion rates.
5. Customer Lifetime Value (CLV) Prediction: Estimates the future value of customers by analyzing purchase history and behavior, helping businesses focus on high-value customers and tailor long-term engagement strategies.
6. Campaign Optimization: Assesses the effectiveness of marketing campaigns by analyzing response data and consumer interactions.

In retail, Walmart uses predictive analytics on purchasing data to identify products likely to be popular in different seasons, such as summer apparel or winter holiday decorations.
This allows Walmart to optimize inventory levels, ensuring that high-demand items are well stocked and reducing the risk of stockouts or excess inventory. Further opportunities for predictive analytics in retail include:

1. Demand Forecasting: Forecasts future product demand, optimizing inventory levels to reduce stockouts and overstock situations.
2. Personalized Marketing: Analyzes customer data to create tailored marketing campaigns, targeting customers with relevant offers and recommendations.
3. Price Optimization: Determines optimal pricing strategies by analyzing market trends, competitor prices, and customer behavior.
4. Customer Segmentation: Groups customers by purchasing behavior and preferences for targeted marketing strategies and personalized shopping experiences.
5. Inventory Management: Optimizes inventory by forecasting demand and analyzing supply chain data.
6. Store Layout Optimization: Analyzes shopping patterns and customer flow to optimize store layouts.

In manufacturing, Toyota applies predictive analytics to ensure product quality by analyzing real-time data from sensors on the production line, including temperature, pressure, and machinery vibrations. By monitoring these parameters, Toyota can detect early signs of equipment malfunctions or deviations from quality standards, allowing for immediate corrective action. More opportunities for predictive analytics in manufacturing:

1. Predictive Maintenance: Identifies potential equipment failures before they occur, enabling timely maintenance and reducing downtime.
2. Quality Control: Monitors production processes to detect anomalies in real time, ensuring consistent product quality.
3. Supply Chain Optimization: Enhances supply chain efficiency by predicting demand, optimizing inventory levels, and reducing lead times.
4. Production Planning: Forecasts production requirements and schedules, optimizing resource allocation and minimizing waste by aligning output with market demand.
5. Energy Management: Analyzes energy consumption patterns to optimize usage, reduce costs, and improve sustainability.
6. Workforce Management: Forecasts labor needs based on production schedules and demand fluctuations.

Our predictive analytics solutions have been applied across industries, showing how powerful and flexible machine learning can be at solving complex problems. Here are some examples that highlight the impact of our work.

We developed a COVID-19 prediction tracker to calculate the risk of infection and the potential number of patients in specific locations within Israel. Our client aimed to help flatten the COVID-19 curve in Israel, a leader in vaccination efforts. We were tasked with predicting the spread and infection risk of COVID-19, facing challenges such as rapid disease spread, environmental changes, and the need for precise predictions at the city-district level. With neural networks and deep learning techniques in our arsenal, we took on the challenge:

Recurrent Neural Networks (RNN): We used an artificial RNN, specifically long short-term memory (LSTM), to handle the dynamic nature of the pandemic and preserve long-term memory for time-series data on infection rates.
Data Normalization: We normalized data both for the beginning of the epidemic and for real-time predictions, addressing statistical errors at different epidemic stages.
Embedding Layers: Added to the model to compress and represent city-specific data accurately, enabling the model to understand and predict interactions within the data.
Risk Scale Development: Created a risk scale (from 1 to 8) to estimate the chance of infection in specific locations, using confirmed COVID-19 case data and social behavior data.
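The production system described above used an LSTM, which requires a deep learning framework. As a dependency-free stand-in, the sketch below captures the shape of the task: fit a one-step forecaster on a district's case series, then map the forecast onto a 1-to-8 risk scale. The case series and the risk thresholds are invented for illustration; they are not project data, and a simple autoregressive fit stands in for the LSTM.

```python
# Stand-in for the LSTM forecaster: a one-step autoregressive model
# (y[t+1] ~ phi * y[t], phi fit by least squares) plus a 1-8 risk scale.

def ar1_forecast(series):
    """Fit phi minimizing sum (y[t+1] - phi*y[t])^2, forecast one step."""
    pairs = list(zip(series[:-1], series[1:]))
    phi = sum(a * b for a, b in pairs) / sum(a * a for a, _ in pairs)
    return phi * series[-1]

def risk_level(cases_per_100k, thresholds=(1, 2, 5, 10, 20, 50, 100)):
    """Map a forecast to a 1-8 risk score; cutoffs here are hypothetical."""
    return 1 + sum(cases_per_100k >= t for t in thresholds)

# Toy daily cases per 100k residents for one district.
district_cases = [4, 5, 7, 9, 12, 16, 21]
forecast = ar1_forecast(district_cases)
print(round(forecast, 1), risk_level(forecast))
```

The real model replaced the one-step fit with an LSTM over many districts (plus embeddings for city-specific features), but the output contract is the same: a per-location forecast converted into a discrete risk rating.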
The solution provided precise predictions of epidemic development across Israel, offering accurate forecasts for around 300 towns and city districts. The model predicted infection rates with an error margin of less than 5%, and the improved accuracy strengthened public health responses, reducing infection rates by 20% in highly targeted areas.

We also created a marketing forecasting solution for real estate businesses, increasing house sales 16.5 times per month. One of our American real estate clients faced low sales and decided to attract more buyers through ML-driven targeted advertising. We used historical sales data on transactions, loans, and estimated property values to build an ML model for highly targeted advertising:

Data Usage: Used ATTOM datasets (US nationwide property data) on ownership status and seasonality to create a prediction model that accounted for sales fluctuations.
Model Parameters: Considered period of ownership, equity position, and actual residence for precise ad targeting, leading to significant sales growth.
Enhanced Targeting: Improved targeting with actual-residence data, achieving remarkable increases in house sales.
Robust Model Development: Ensured the model's robustness and traceability using a decision tree classifier.

The predictive model greatly improved ad targeting, increasing sales conversion 16.5 times.

To enhance personalized care, another client aimed to develop a treatment prediction solution using patient data from electronic health records (EHR) and electronic medical records (EMR), including detailed medical histories, genetic information, and lifestyle factors. The traditional "one-size-fits-all" treatment approach ignores crucial factors such as age, gender, lifestyle, previous diseases, comorbidities, and genetics, making it hard to select optimal treatment plans.
We sought to create a method to predict treatment outcomes using personalized data and machine learning (ML):

Data Transformation: Patient data, including medical histories, genetic information, and lifestyle factors, was standardized into a machine-readable format.
Cohort Definition: We categorized treatment outcomes into "positive," "negative," and "no progress" classes.
Model Development: We developed and trained a machine learning algorithm on the processed patient data, including age, gender, medical history, genetic markers, and lifestyle habits.
Implementation: Integrated the trained model into the clinical workflow for ongoing predictions, providing real-time insights into potential treatment outcomes for individual patients.

By leveraging detailed patient data, we increased treatment success rates by 25%, decreased adverse reactions by 30%, and improved patient satisfaction scores from 80 to 96.

Our ML service has two main purposes: forecasting and determining the factors that influence target data. As a forecasting tool, our autoML solution is versatile enough for tasks such as predicting sales or expenses. As a driver service, it lets users test the external and internal factors that influence their target data. The solution applies a pool of diverse models to the input data and selects the best one based on performance metrics, ensuring broad applicability and high accuracy.
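The "pool of diverse models, pick the best by performance metrics" selection loop can be sketched as follows. The three candidate forecasters and the sales series below are simplified stand-ins for the production pool, invented for illustration: each candidate is trained on the same history and the one with the lowest holdout error wins.

```python
# Toy model-pool selection: train several candidate forecasters and keep
# the one with the lowest mean absolute error on a holdout split.

def naive_last(train):          # always predict the last observed value
    return lambda h: train[-1]

def mean_model(train):          # always predict the historical mean
    m = sum(train) / len(train)
    return lambda h: m

def drift_model(train):         # extrapolate the average step per period
    step = (train[-1] - train[0]) / (len(train) - 1)
    return lambda h: train[-1] + step * h

def select_best(series, holdout=3, pool=(naive_last, mean_model, drift_model)):
    train, test = series[:-holdout], series[-holdout:]
    def mae(build):
        model = build(train)
        return sum(abs(model(h + 1) - y) for h, y in enumerate(test)) / holdout
    return min(pool, key=mae)

sales = [100, 104, 109, 113, 118, 122, 127, 131]   # invented monthly sales
best = select_best(sales)
print(best.__name__)
```

On this steadily rising series the drift model wins; on flat or noisy data one of the simpler candidates would. That is the point of the pool approach: no single model is assumed best in advance.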

SciForce Medical Team Attended the 2023 OHDSI Global Symposium

Since 2015, SciForce has been an active contributor to the OHDSI scientific community. Our medical team is consistently at the forefront of OHDSI events, sharing research and driving advances in health data harmonization that empower better health decisions and elevate care standards. The fall event was no exception: from October 20 to 22, Polina Talapova, Denys Kaduk, and Lucy Kadets represented the company at the Global OHDSI Symposium in East Brunswick, New Jersey, USA, alongside more than 440 global collaborators. The symposium offers unique opportunities to dive deeper into the OMOP Common Data Model standards and tools, multinational and multi-center collaborative research strategies, and insights from completed large-scale multinational research projects. Our Medical Team Lead, Polina Talapova, presented the topic "Mapping of Critical Care EHR Flowsheet Data to the OMOP CDM via SSSOM" during a lightning talk session, emphasizing the significance of mapping metadata generation and storage for producing trustworthy evidence. In turn, Denys and Lucy participated in the OHDSI Collaborator Showcase, where they presented a prototype and poster detailing Jackalope Plus, an AI-enhanced solution that streamlines the creation, visualization, and management of mappings, reducing manual effort and ensuring precision in capturing details from real-world health data. Our colleagues also had the opportunity to meet in person with leading OHDSI researchers such as George Hripcsak, Patrick Ryan, Marc Suchard, Andrew Willams, Rimma Belenkaya, Paul Nagi, Mui Van Zandt, Christian Reich, Anna Ostropolets, Martijn Schuemie, Dmytro Dymshyts, Dani Prieto-Alhambra, Juan M. Banda, Seng Chan You, Kimmo Porkka, Alexander Davydov, and Aleh Zhuk, among other distinguished individuals. The event was truly transformative and rewarding, expanding participants' minds and horizons.
The SciForce team is profoundly grateful to the OHDSI community for the opportunity to be a part of this fantastic journey!

The Launch of the Toxin Vocabulary: Our Solution to the Medical Research Complexity

In a rapidly evolving medical field, the synthesis of reliable and insightful data is the basis that makes innovation and progress possible. With its global adoption, the Observational Medical Outcomes Partnership Common Data Model (OMOP CDM) has become an important tool for advancing drug safety monitoring and healthcare outcome prediction. However, it has a gap in the representation of toxic substances and environmental exposures, an aspect central to deepening our understanding of their impacts on human health. The Toxin Vocabulary, our solution to this gap, is designed to improve the representation of toxic substances within the OMOP CDM, enabling better analysis of the complex interplay between environmental factors and health outcomes. In this article, we describe the approaches and collaborative efforts that powered the creation of the Toxin Vocabulary, which aims to empower researchers, healthcare professionals, and organizations to gain deeper insights into environmental exposures and human health outcomes.

First, let's look at the current approach and the problems researchers face. The OMOP CDM was established as an open community data standard, created to harmonize the structure and content of observational data and to enable efficient analyses that can produce reliable evidence. It has been widely adopted by researchers and healthcare organizations around the globe, simplifying drug safety monitoring, comparative effectiveness research, clinical trial design, and healthcare outcome prediction. However, the representation of toxic substances and environmental exposures within the OMOP CDM has remained a crucial unmet need in environmental epidemiology, the field that investigates the impacts of exposure to toxic substances on human health, considering both short-term and long-term effects.
To support such studies, Geographic Information Systems (GIS) have been used to analyze the spatial distribution of exposures and assess their potential health consequences. While recent efforts have aimed to integrate GIS data with the OMOP CDM, the lack of sufficient standards has hindered comprehensive evaluation of environmental exposures and their associated health risks. This is how we arrived at the idea of developing a hierarchical Toxin Vocabulary to improve the representation of environmental exposomes within the OMOP CDM. This standardized terminology was developed through a systematic review of toxicological literature, analysis of open toxin databases, and consultation with experts in the field. By synthesizing the most relevant and up-to-date toxin terminology, the Vocabulary aims to facilitate environmental exposure assessment, support toxicological and epidemiological research, and enable the integration of GIS-related data into the OMOP CDM.

Development of the Toxin Vocabulary required a systematic approach combining all three of these steps. First, we conducted the literature review to identify relevant terms and classifications associated with various toxins and their impact on human health. By examining research papers, regulatory documents, and other authoritative sources, we reached a comprehensive understanding of the diverse range of toxins and their associated semantic attributes. In parallel with the literature review, we analyzed open-source toxin databases.
A primary resource that stood out for comprehensiveness and reliability was the Toxin and Toxin Target Database (T3DB). T3DB provided a vast repository of toxin terminology, describing over 3,000 toxins with 41,602 synonyms. It covers a wide range of toxins, including pollutants, pesticides, drugs, and food toxins, and provides extensive metadata fields for each toxin record (ToxCard), such as chemical properties, toxicity values, molecular and cellular interactions, and medical details. Integration combined the information from the literature review and T3DB to build the Toxin Vocabulary: the source data was automatically loaded into a PostgreSQL database using Python, after which we extracted essential metadata, established cross-term connections, and performed a semi-automated mapping of selected terms to the OMOP Vocabulary standards. To ensure compatibility and seamless integration with the existing OMOP CDM standard vocabularies, the Toxin Vocabulary was mapped to relevant terminologies, including SNOMED CT, RxNorm, and RxNorm Extension. During the mapping process, we associated concepts from the Toxin Vocabulary with the appropriate standard concepts within the OMOP CDM, linking toxin terms to established clinical concepts and enabling comprehensive analysis and integration of environmental exposures with other healthcare-related data. As unique vocabulary identifiers, we used CAS codes, owing to their alignment with GIS data and the CAS Registry, one of the largest registries, encompassing around 204 million organic substances. For toxins without CAS codes, unique T3DB codes were assigned, ensuring proper identification and classification.
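The identifier rule just described (CAS code where available, T3DB accession otherwise) can be illustrated with a small sketch. The records, field names, and codes below are invented examples for illustration, not entries from the actual vocabulary or the real ingestion pipeline.

```python
# Illustrative identifier assignment: prefer the CAS registry number as the
# vocabulary concept code, fall back to the T3DB accession when absent.

def concept_code(record):
    """Return the CAS code if present, otherwise the T3DB accession."""
    return record["cas"] if record.get("cas") else record["t3db_id"]

# Hypothetical toxin records (names, codes, and fields are made up).
toxins = [
    {"name": "Example solvent",  "cas": "123-45-6", "t3db_id": "T3D0001"},
    {"name": "Example biotoxin", "cas": None,       "t3db_id": "T3D0042"},
]

codes = [concept_code(t) for t in toxins]
print(codes)  # CAS code for the first record, T3DB accession for the second
```

In the real pipeline this choice happens while populating staging tables in PostgreSQL, but the fallback logic is the same per record.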
We incorporated the Toxin Vocabulary into an OMOP instance by methodically organizing the information in preliminary staging tables, following the standard OHDSI contribution process and ensuring each piece of data is accurately placed and interconnected for optimal use. These staging tables were instrumental in capturing the Vocabulary's semantic and syntactic aspects, ensuring compatibility with the existing OMOP CDM framework. The picture below shows how the vocabulary works, the decisions we made, and their subsequent impact on the OMOP CDM structure.

The Toxin Vocabulary offers a hierarchical and expansive representation of toxic substances within the OMOP CDM. It contains 79,377 internal relationships and maps the complex interconnections between toxins, cellular structures, relevant diseases, biological processes, and more, offering researchers an unprecedented level of detail in their analyses. The Vocabulary's strength doesn't end there: integration with standardized vocabularies such as SNOMED CT and RxNorm strengthens its capabilities, creating a symbiotic relationship. This synergy is the foundation for a deeper and more detailed exploration of the exposome, offering better insights and a richer understanding of toxin-health outcome dynamics. Furthermore, it opens new avenues in drug safety monitoring, clinical trial design, and health outcome prediction, empowering researchers and healthcare professionals to harness rich, GIS-related data for advancing toxicoepidemiological research. The Toxin Vocabulary is not just a tool; it is a gateway to a more insightful understanding of environmental impacts on health. In the world of open science, innovative approaches and tools play a key role, and the SciForce Medical Team is truly proud to contribute to this development, focusing on OHDSI vocabularies.
Our Vocabulary was first presented at the OHDSI GIS Working Group. Once its validation is complete, we will be happy to present it publicly at the Global OHDSI Symposium (New Jersey, USA) on October 20, 2023, and officially integrate it into the OHDSI ecosystem. This opens up new opportunities in the fields of geographic epidemiology and toxicoepidemiology. We are truly happy to introduce our enhanced integration of the Toxin Vocabulary, setting a new standard for healthcare analytics and research.

Digital Health Landscape 2023: Trends and Insights Shaping the Future

In a fast-developing world full of innovative solutions, many sectors shine with potential, and digital health is one of them. Members of our medical team attended events such as the MedInfo 2023 congress and the European OHDSI and APAC symposiums, and they are happy to share their insights from those conferences in this article. So, welcome to the future of healthcare: vibrant, digital, and full of possibility. In this transformative era, technology and healthcare are uniting to change the way healthcare services are delivered and how patients experience them, creating new ways to treat patients, care for them, and handle information. These fresh concepts show how far digital health has progressed. Though the trends we see in 2023 were already present in previous years, this year they are more advanced and refined. Let's take a closer look at a few of the important digital healthcare trends in 2023.

The healthcare industry is shifting from generic solutions toward personalized ones. Digital health products now focus on addressing the specific needs, issues, preferences, and histories of individual patients, ensuring care that is both personal and efficient. The medical industry is also departing from merely diagnosing and prescribing treatment, focusing more on preventing diseases and their consequences. Using AI and ML, we can process data and make predictions about the health of a patient or a population. Technologies such as digital therapeutics and real-time patient monitoring are likewise enhancing patient outcomes and engagement. With many different health solutions available to the medical field, it is important for them to work together, and data standardization is key here.
Patient data can come from various sources: insurance systems, electronic health records, surveys, clinical trials, and more. This data needs to be put into a common, standardized format to make sense of it all. Organizations leading in standardization are working on structuring data better, which allows decision-makers to obtain evidence-based insights. This knowledge is crucial for data-driven decisions, not just in medical treatment but also in managing resources such as finances and personnel. Remote healthcare solutions, from video communications to telemedicine consultations, are changing what accessibility in healthcare means: regardless of geographic location, patients can connect with healthcare professionals and get consultations while the quality of care stays high. This also allows doctors to gather for counseling sessions and create recommendations based on laboratory test results. In short, current trends in digital health products show how the industry's approaches are changing: mobile apps for remote vitals monitoring are being implemented, and telehealth platforms are gaining more insightful features, for example, matching patients to doctors, analytics, health assistant bots, and e-prescriptions. These trends not only show the present state of digital health but also provide insight into the industry's future. Right now we are on the edge of new opportunities, with some genuinely exciting ideas leading the way and changing how we think about healthcare. Innovative solutions now aim not only to respond to current needs but also to address challenges that may arise in the future. Let's explore them: beyond simplistic step counters and basic heart rate monitors, the development of wearables is a major advance in medical technology.
These devices continually track physiological parameters, such as heart and respiratory rate, urinary bladder fullness, blood oxygen level, glucose level, and even stress markers, painting a real-time picture of a patient's health. With a constant flow of health data from wearables, AI-powered platforms step in to analyze the information: such predictive analytics can detect irregularities and send early alerts to users or medical professionals, preventing potential health crises. Leveraging no-code ecosystems, innovators are designing digital health tools that address specific patient needs, especially in mental and behavioral health. These tools do not offer generic advice; they create practical, evidence-based guidance. By integrating AI, digital therapeutic platforms can now provide personalized regimens that adjust in real time based on patient feedback and account for a wide variety of factors (e.g., COVID-19, epilepsy, schizophrenia, child age). Ensuring that health data remains in the right hands is crucial: many platforms now grant access based on robust digital identity verification, bolstering security and ensuring data privacy compliance with regulations such as HIPAA and GDPR. Secure digital platforms are built to recognize participants such as doctors, managers, and patients, a recognition made possible through blockchain technology, an established but still effective approach. In essence, the latest ideas in digital health innovation are not just about advanced tech but also about precision and trust. These innovations show a promising trajectory in which technology elevates patient care, data security, and personalized treatment to a new level, simplifying doctors' work and saving their time. The digital health landscape, fueled by rapid technological advancements and a surge in patient-centric initiatives, has presented a slew of novel opportunities and pathways.
As healthcare continues to integrate with technology, several potential trajectories are crystallizing. The integration of AI-powered analytics into Electronic Health Record (EHR) software is a game changer: it can assist clinicians in making informed decisions based on patterns and previous case studies. Tools such as AI-operated symptom checkers and AI-enhanced medical alarms/alerts are designed to supplement healthcare professionals: they can quickly assess initial symptoms, promptly address the needs of critical patients, and send alerts for urgent conditions such as seizures or hypo-/hyperglycemia in patients at home. This ensures that patients receive timely and appropriate care, thereby optimizing the healthcare process. More than ever, there's a pressing need for platforms enabling collaboration and system management, including a community-driven approach to setting standards. They're designed for seamless patient care, interoperability, real-time data analytics, and personalized treatment protocols. Such platforms would not only enable efficient communication and data sharing among healthcare providers but also empower patients and healthcare decision-makers with data-driven insights and remote monitoring capabilities. Implementing these opportunities will not only level up patient care but also create a progressive path for the global healthcare industry. While the future may be challenging, it's equally rich with solutions, and digitalization is playing a crucial role. At conferences and events, we gained firsthand insights into how leading organizations within different communities are managing the challenges and realizing the opportunities in digital health: Synthetic Health Data Creation: To address concerns of privacy and data availability, the generation of synthetic health data is becoming a trend.
This enables more extensive testing and research without compromising patient privacy. Using NLP: Many companies use Natural Language Processing (NLP) to convert, process, and structure unstructured data. This is especially helpful in mapping, ETL operations, phenotyping, and more. Community Management: Communities are constantly growing, and managing member interactions and contributions is crucial. Many organizations rely on metrics to track and analyze contributors' activities, creating a more engaged and productive ecosystem. Cost-Effectiveness: Significant interest within the community arose around evaluating the cost-effectiveness of healthcare solutions. This ensures that medical interventions are beneficial and optimally utilized, be it in terms of resources or a doctor's time. Data Standardization: This is highly important due to the variety of medical data types. Leading organizations prioritize the alignment of these diverse datasets to enable more comprehensive analytics and well-informed decision-making. For example, the OMOP Common Data Model structures the data into specialized tables, fostering efficient big-data operations and analytics. Observational Studies: The extraction of evidence-based knowledge is highly important for research purposes. One approach is to run observational studies on vast datasets; to support this, there's a rising trend in developing tools specifically tailored for surveys and research. Ontology Platforms: There is a need for web resources that aggregate different mappings and terminological standards, making them more interoperable for other research participants. This simplifies data sharing and collaboration. We feel truly excited and hopeful about the progress in digital health! Our Medical team is happy to contribute to the expansive digital health community and to share the insights gleaned from the events we've visited. Our shared journey across different experiences and bodies of knowledge shows the transformative phase we're in. Patient-centered approaches, innovations in remote healthcare solutions, and the importance of data standardization form the 2023 digital health landscape, and it's vast and dynamic. Let's lead the way in digital health's future together!
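To make the data-standardization idea concrete, here is a minimal, hypothetical sketch of mapping one raw EHR record into simplified OMOP CDM-style `person` and `condition_occurrence` rows. The field names follow the OMOP CDM, but the raw record, the inline lookup dictionaries, and the person-ID scheme are toy stand-ins for a real vocabulary-driven ETL pipeline.

```python
from datetime import date

# Hypothetical raw record as it might arrive from a source EHR export
raw_record = {
    "patient_id": "A-1027",
    "sex": "F",
    "birth_year": 1984,
    "diagnosis": "Essential hypertension",
    "diagnosis_code": "I10",        # ICD-10-CM source code
    "diagnosis_date": "2023-05-14",
}

# Toy lookups standing in for the OMOP Standardized Vocabularies;
# real ETLs resolve source codes to standard concept IDs via vocabulary tables.
CONCEPT_LOOKUP = {"I10": 320128}    # SNOMED 'Essential hypertension' concept
GENDER_LOOKUP = {"F": 8532, "M": 8507}

def to_omop(record):
    """Map one raw record into simplified OMOP CDM person and
    condition_occurrence rows (a sketch, not a full ETL)."""
    person = {
        # Toy ID scheme: reuse the numeric part of the source identifier
        "person_id": int(record["patient_id"].split("-")[1]),
        "gender_concept_id": GENDER_LOOKUP[record["sex"]],
        "year_of_birth": record["birth_year"],
    }
    condition = {
        "person_id": person["person_id"],
        "condition_concept_id": CONCEPT_LOOKUP[record["diagnosis_code"]],
        "condition_start_date": date.fromisoformat(record["diagnosis_date"]),
        "condition_source_value": record["diagnosis_code"],
    }
    return person, condition

person, condition = to_omop(raw_record)
print(condition["condition_concept_id"])  # 320128
```

Once every source is reduced to the same tables and concept IDs like this, the cross-dataset analytics described above become straightforward, which is exactly what the OMOP CDM is designed for.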

OHDSI Global Symposium 2022

For seven years, the SciForce team has been an active member of the OHDSI scientific community, whose mission is to improve people’s health by empowering the community to obtain evidence-based knowledge that contributes to better decisions in healthcare. In this regard, representatives of our medical team, namely Polina Talapova, Eduard Korchmar, and Denis Kaduk, attended the three-day Global OHDSI Symposium held in North Bethesda, Maryland, USA, on October 14–16, 2022. They heard many fascinating reports from the community’s leads and experts, successfully presented a poster on the topic “Jackalope: A software tool for meaningful post-coordination for ETL purposes” in the Open-Source Analytics Development section, watched a software demonstration, and participated in working groups such as Oncology, Vocabulary, FHIR-OMOP, Natural Language Processing, CDM and Data Quality. Our colleagues also met in person with leading OHDSI researchers (George Hripcsak, Patrick Ryan, Christian Reich, Clair Blacketer, Andrew Williams, Rimma Belenkaya, Asieh Golozar, Karthik Natarajan), developers (Christopher Knoll, Paul Nagy, Michael Gurley) and old friends (Dmitry Dymshits, Anna Ostropolets, Alexander Davydov), as well as made new valuable acquaintances. It was a great event and a rewarding experience, expanding minds and horizons. The SciForce team is profoundly grateful to the OHDSI community for the opportunity to be a part of this fantastic journey.

How Sciforce steals the show at the 3rd European Observational Health Data Sciences and Informatics…

From June 24 to 26, Mariia Kolesnyk, Eduard Korchmar, and Polina Talapova were delegated as representatives of the Ukrainian IT company SciForce to the third European Observational Health Data Sciences and Informatics (OHDSI) Symposium held in Rotterdam, the Netherlands. During a panel discussion, Mariia presented a project on the integration prospects of the Ukrainian healthcare system with the OMOP CDM to generate reliable evidence for decision-making during and after the war. Moreover, she shared information on the current life and workflow of the team under russia’s military aggression against Ukraine. Her touching speech was warmly welcomed by the international OHDSI community and received a standing ovation. Mariia and Eduard also took part in the OHDSI Collaborator Showcase, presenting two posters: one on progress in health data conversion in Ukraine, and one on challenges and possible solutions for maintaining the OMOP CDM Standardized Vocabularies. In turn, Polina participated in Early Investigators Mentor Meetings, where she had the honor of talking in person with leading OHDSI researchers, such as Patrick Ryan, Peter Rijnbeek, Erica Voss, and Martijn Schuemie, about prospects for cooperation. Our teammates also attended a great workshop dedicated to “Designing and implementing a network characterization study” and several workgroups: Oncology (Asieh Golozar), Vocabulary (Michael Kallfelz), and OMOP-FHIR (Christian Reich). It was a great time to meet old and new friends. As the only EHDEN-certified SME in Ukraine, SciForce was privileged to take part in this amazing scientific event and to support our country at the international level during these very difficult times. We are very grateful for this opportunity and global support! Slava Ukraini, Glory to Ukraine!

SciForce is now the European Health Data and Evidence Network (EHDEN) certified partner

We are pleased to announce that SciForce is now the first and only enterprise in Ukraine to become a certified partner of the European Health Data and Evidence Network (EHDEN). Thus, we contribute to the wider Observational Health Data Sciences and Informatics (OHDSI) initiative to take advantage of large-scale health data analytics. This step brings us into a community of 47 European SMEs delivering their expertise in health data, data standardization, and interoperability. Read on for details. EHDEN, with support from the EU’s Horizon 2020 program and the European Federation of Pharmaceutical Industries and Associations (EFPIA), aims to take advantage of so-called Big Health Data. In the case of EHDEN, this data may include any information related to patients in Europe, facilitating research, supporting the conduct of healthcare studies, or managing payment or HR data. As the official website states, EHDEN was launched “to support patients, clinicians, payers, regulators, governments, and the industry in understanding wellbeing, disease, treatments, outcomes, and new therapeutics and devices.” Starting from August 2020, EHDEN has certified European SMEs on standardizing health data to the Observational Medical Outcomes Partnership (OMOP) common data model (CDM) and setting up the needed technical ecosystem. At the moment, EHDEN has built a community of 47 enterprises taking part in this initiative. Before joining the community of certified SMEs cooperating with the European data partners, our professionals completed training within the EHDEN Academy, grasping the technical ecosystem and work standards. See the scheme of the training process required to become a certified Data Partner. Thus, within EHDEN, we can now provide services related to the OHDSI ecosystem: OHDSI Software and Tools, technical infrastructure services, OHDSI training, OMOP CDM ETL, and the OMOP Standardized Vocabularies.
When it comes to healthcare expertise, we do not limit ourselves to observational research, data mapping, and ETL. Our healthcare professionals also focus on prediction, digitization, telehealth services, and impaired speech recognition. Finally, we are excited to share our expertise with our European partners, moving medical science forward.

Top 5 Medical Specialties Most Interested in Telehealth

Telehealth has skyrocketed worldwide under the pandemic. Currently, 46% of visits are delivered online, and 76% of consumers are interested in telehealth, as McKinsey experts state; compare this with 2019, when, burdened with legal regulations, telehealth reached only about 11% usage among US patients. We present a detailed analysis of the most promising medical specialties integrating telehealth in the US. See what healthcare providers have implemented already and what changes are coming. Telehealth is likely to remain a growing trend in the post-pandemic era. Providers were reluctant to adopt telemedicine before 2020, but now an estimated $250 billion of US healthcare spending could potentially be virtualized. To begin with, let us define how much “health” is in telehealth. Per the WHO (World Health Organization), telehealth is the “delivery of health care services, where patients and providers are separated by distance.” Telehealth leverages ICT networks to transmit information on diagnosis, treatment, research, evaluation, diseases, injuries, and education. Meanwhile, one often sees the narrower term telemedicine. It refers to medical services provided via electronic technologies, focusing primarily on monitoring and diagnosis. Telemedicine is thus narrower than telehealth, which breaks down into long-distance patient care, admission, monitoring, advice, reminders, and even intervention. Practically, telehealth encompasses a broad range of such services.

AI and ML in the European Pharmaceutical Industry: Recent Applications, Challenges, and Response…

Innovation in the pharmaceutical industry has proved to be a tall order, as drug development’s success rate has traditionally been tremendously low. However, the COVID-19 pandemic pushed R&D centers, markets, and leaders to develop a response to the crisis faster than ever. AI and ML bring to the table a toolbox for overcoming new challenges across the whole industry. Significantly, the Healthcare Artificial Intelligence market is expected to reach $51.3 billion by 2027, at a CAGR (compound annual growth rate) of 41.4% starting from 2020. Check out this fresh guide on the latest updates that AI and machine learning have brought to the pharmaceutical market in the EU, including recent applications and the most significant challenges. Recent AI and ML applications in healthcare and pharma prove that these technologies are reaching the ‘Slope of Enlightenment’ in the Gartner Hype Cycle. It seems the industry has no choice at the moment. Per Subroto Mukherjee, Head of Innovation and Emerging Technology at GlaxoSmithKline, developing new vaccines against a coronavirus would previously have taken eight to ten years; in contrast, the ones available now took 300 days from the start of development to the first testing. The rationale for using ML in the industry is to lower attrition and costs while increasing the success rate of new drug development. Hence the power of ML algorithms that parse significant amounts of data, learn from it, and make determinations or predictions about new data sets. The algorithms show higher performance as the quantity and quality of data increase, which raises the question of data regulation, covered further below. Data collected as images, texts, biometrics, assays, and other information from wearables forms the field for developing new models or formulas that are still unknown but could bring crucial changes.
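To ground the idea of "learning from data to make predictions about new data sets," here is a deliberately tiny, purely illustrative sketch: a one-nearest-neighbour classifier over made-up compound descriptors. Every number, feature, and label here is invented; real drug-discovery models use far richer representations and algorithms, but the learn-then-predict loop is the same.

```python
from math import dist

# Hypothetical compound descriptors (e.g. molecular weight / 100, logP)
# paired with a known outcome: 1 = active against a target, 0 = inactive.
training_data = [
    ((1.8, 2.1), 1),
    ((1.9, 2.4), 1),
    ((3.5, 0.6), 0),
    ((3.8, 0.9), 0),
]

def predict(features, data=training_data):
    """Classify a new compound by its single nearest labelled neighbour."""
    _, label = min(data, key=lambda item: dist(item[0], features))
    return label

print(predict((2.0, 2.0)))  # near the 'active' cluster -> 1
print(predict((3.6, 0.7)))  # near the 'inactive' cluster -> 0
```

As the text notes, such models improve with the quantity and quality of labelled data, which is precisely why data availability and regulation become central questions.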
In practice, several applications of AI and ML have already been proven in the market.