Foreshore Data

Explainable AI (XAI)

Explainable AI encompasses a range of tools that provide insight into, and accountability for, machine learning models. It focuses on the "x" variables that drive a prediction.


Use of explainable AI

Much effort in machine learning is invested in improving accuracy – by hyperparameter search, feature engineering and model family selection, as well as "auto-ML" techniques that automate these processes. Nevertheless, once a model is developed, however accurate it may be, the main interest is often in understanding which features the model considers most important. Techniques for doing this are referred to as "Explainable AI" (or sometimes "model variable importance") and fall into two classes: global and local. Global methods are concerned with the question: "Which factors have the greatest impact on our outcome of interest?" Local methods, by contrast, are concerned with the impact of features on the prediction for a specific instance, and can produce very specific insights.
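
The sketch below illustrates the distinction, assuming a standard scikit-learn tabular classification setup: permutation importance gives a global ranking of features, while the third-party shap package explains a single prediction. The dataset, model and feature names are illustrative assumptions, not part of any Foreshore Data product.

```python
# A minimal sketch of global vs. local explanations, assuming a tabular
# classification problem. The dataset, model and the third-party `shap`
# package are illustrative assumptions, not a specific Foreshore Data stack.
import shap
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Global: which features matter most across the whole test set?
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
ranked = sorted(zip(X.columns, result.importances_mean), key=lambda t: -t[1])
for name, score in ranked[:5]:
    print(f"{name}: {score:.4f}")

# Local: why did the model score this particular instance the way it did?
explainer = shap.TreeExplainer(model)
contributions = explainer.shap_values(X_test.iloc[[0]])
print(contributions)  # per-feature contributions; exact shape varies by shap version
```

The global ranking answers "what matters overall", while the per-instance contributions answer "why this prediction", which is the kind of output the loan and hospital examples below depend on.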

Explainable AI is desirable when you want to do a deep dive on what is driving predictions, and in some domains, such as finance, it is mandatory for compliance reasons to back up decision-making processes. For example, if customer X did not get a loan and the decision was model based, an important question is "why didn't they get the loan?", which is where explainable AI comes in.

Explainable AI is also excellent at determining the typical boundaries of features and how they interact with other features, as sketched below.
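
One common way to probe this is with partial dependence plots; the following sketch uses scikit-learn on an illustrative dataset, so the feature names are assumptions rather than a real client example.

```python
# A minimal sketch of feature-boundary and interaction analysis using partial
# dependence; the dataset and feature names are illustrative assumptions.
import matplotlib.pyplot as plt
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import PartialDependenceDisplay

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# A single feature shows the range over which the prediction starts to change;
# a feature pair shows how the two interact across their joint range.
PartialDependenceDisplay.from_estimator(
    model,
    X,
    features=["mean radius", ("mean radius", "mean texture")],
)
plt.tight_layout()
plt.show()
```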

At Foreshore Data we are experts in using AI. We have worked extensively in health, finance, insurance, and road safety/transport using our proprietary explainable AI technology.

Road safety use case

Determining which road characteristics are driving major injuries and fatalities using explainable AI, and using the output to optimise road expenditure to reduce them.

Hospital use case

Explainable AI was used to not only rank the probability of an ICU patient dying or suffering an adverse event, but also to explain what the elevated risk factors were.

Clinical trial use case

The Foreshore Data team used explainable AI not only to predict the probability of a clinical trial succeeding or failing, but also to determine the factors behind that outcome, such as patient enrolment numbers.

Contact us and one of our data scientists will be in touch to discuss how we can turn your data…

Our Promise

We will help you navigate through the ocean of data in all its forms to enhance the capabilities of your organization and improve your outcomes.