
Publication by IRIS Co-Speaker Prof. Dr. Staab on Fairness in AI

June 10, 2024 / IRIS3D

Prof. Dr. Steffen Staab (IRIS Co-Speaker)
[Picture: Dalle 2024]

Prof. Dr. Staab, Co-Speaker of IRIS, together with Qusai Ramadan, Marco Konersmann, Amir Shayan Ahmadian, and Jan Jürjens, has authored "MBFair: A Model-Based Verification Methodology for Detecting Violations of Individual Fairness," which is now published and available on SpringerLink.

The article addresses discrimination in decision-making software, particularly with respect to protected characteristics such as gender and ethnicity. Prof. Dr. Staab and his co-authors introduce MBFair, a methodology that operates on UML-based software designs so that individual fairness can be addressed from the early stages of software development. MBFair generates temporal logic clauses whose verification detects and reports potentially discriminatory behavior in the modeled software. The article demonstrates the applicability of MBFair in three real-world case studies and shows that it helps analysts identify discrimination more reliably than manual analysis.
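To illustrate the underlying notion of individual fairness that MBFair targets, the following is a minimal, hypothetical Python sketch (not the paper's UML- and temporal-logic-based tooling): two applicants who differ only in a protected attribute should receive the same decision, and a violation is flagged when flipping that attribute changes the outcome. The applicant fields and the loan_decision logic are illustrative assumptions.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Applicant:
    income: int
    credit_score: int
    gender: str  # protected attribute (illustrative)

def loan_decision(a: Applicant) -> bool:
    """Hypothetical decision logic of a bank's software (assumed for illustration)."""
    return a.income > 30_000 and a.credit_score > 600

def violates_individual_fairness(a: Applicant) -> bool:
    """Flag a violation if changing only the protected attribute changes the decision."""
    counterpart = replace(a, gender="female" if a.gender == "male" else "male")
    return loan_decision(a) != loan_decision(counterpart)

if __name__ == "__main__":
    applicant = Applicant(income=45_000, credit_score=700, gender="male")
    # Prints False here, because the illustrative decision logic ignores gender.
    print(violates_individual_fairness(applicant))

In contrast to such after-the-fact checks on an implemented system, MBFair aims to make this kind of fairness requirement verifiable already at the design stage, on the software models themselves.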

Excerpt from the class and the state machine diagrams of the bank’s software

LINK TO THE ARTICLE

Contact

Prof. Dr. Steffen Staab
