An optimized Belief-Rule-Based (BRB) approach to ensure the trustworthiness of interpreted time-series decisions

Publisher:
Elsevier
Publication Type:
Journal Article
Citation:
Knowledge-Based Systems, 2023, 271
Issue Date:
2023-07-08
The accuracy and reliability of eXplainable Artificial Intelligence (XAI) methods are important for establishing their credibility and their use in complex decision-making tasks. Existing XAI methods provide little information about the correctness and reliability of their outputs. Furthermore, post-hoc explanation approaches explain outcomes only after producing them, rather than in a step-by-step, glass-box manner that shows how an output is reached. Our proposed approach addresses these drawbacks with a Belief-Rule-Based (BRB) framework that interprets, in a glass-box manner, why a particular decision has been reached. It does so by determining the chance of each output class occurring over a specific time period, considering the different possible permutations of the inputs along with their influence. This also assists the user in determining whether the given input dataset is incomplete, vague, imprecise or inconsistent before trusting the analysis derived from it. We compare the performance of the proposed BRB approach against different XAI methods, such as SHAP, LIME and LINDA-BN, to assure users of the trustworthiness of its analysis. This also enables users to determine the extent to which each XAI technique meets the requirements of XAI and the gaps that remain to be addressed.
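To make the abstract's description of BRB inference concrete, the following Python sketch illustrates the general idea of a belief-rule base: each rule carries antecedent referential values, a rule weight, and belief degrees over the output classes, and an input activates rules in proportion to how well it matches their antecedents. This is a minimal illustration only, not the paper's implementation; the rules, weights, and the simple weighted aggregation (a stand-in for the full evidential-reasoning combination used in BRB systems) are hypothetical.

    import numpy as np

    # Hypothetical rule base: antecedent referential values (one per input
    # attribute), a rule weight, and belief degrees over two output classes.
    rules = [
        {"antecedent": [0.0, 0.0], "weight": 1.0, "beliefs": [0.9, 0.1]},
        {"antecedent": [0.0, 1.0], "weight": 1.0, "beliefs": [0.6, 0.4]},
        {"antecedent": [1.0, 0.0], "weight": 1.0, "beliefs": [0.3, 0.7]},
        {"antecedent": [1.0, 1.0], "weight": 1.0, "beliefs": [0.1, 0.9]},
    ]
    attribute_weights = np.array([1.0, 1.0])  # relative importance of each input

    def matching_degree(x, referential):
        # Simple distance-based matching of an input value to a referential value.
        return max(0.0, 1.0 - abs(x - referential))

    def brb_infer(x):
        # 1. Activation weight of each rule from its antecedent matching degrees.
        activations = []
        for rule in rules:
            degrees = [matching_degree(xi, ri) ** w
                       for xi, ri, w in zip(x, rule["antecedent"], attribute_weights)]
            activations.append(rule["weight"] * np.prod(degrees))
        activations = np.array(activations)
        if activations.sum() == 0:
            raise ValueError("Input activates no rule: evidence may be incomplete.")
        activations /= activations.sum()

        # 2. Aggregate the belief degrees of the activated rules (simplified
        #    weighted combination) into a belief distribution over classes.
        beliefs = np.zeros(len(rules[0]["beliefs"]))
        for w, rule in zip(activations, rules):
            beliefs += w * np.array(rule["beliefs"])
        return beliefs

    print(brb_infer([0.2, 0.8]))  # e.g. a belief distribution over the two classes

Because every rule, its activation weight, and its contribution to the final belief distribution are explicit, a user can trace in a glass-box manner how the output was reached, which is the property the abstract contrasts with post-hoc explanation methods.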