Paper ID: 2301.06141
Max-min Learning of Approximate Weight Matrices from Fuzzy Data
Ismaïl Baaj
In this article, we study the approximate solutions set $\Lambda_b$ of an inconsistent system of $\max$-$\min$ fuzzy relational equations $(S): A \Box_{\min}^{\max} x = b$. Using the $L_\infty$ norm, we compute, by an explicit analytical formula, the Chebyshev distance $\Delta = \inf_{c \in \mathcal{C}} \Vert b - c \Vert$, where $\mathcal{C}$ is the set of second members of consistent systems defined with the same matrix $A$. We study the set $\mathcal{C}_b$ of Chebyshev approximations of the second member $b$, i.e., the vectors $c \in \mathcal{C}$ such that $\Vert b - c \Vert = \Delta$, which is associated with the approximate solutions set $\Lambda_b$ in the following sense: an element of the set $\Lambda_b$ is a solution vector $x^\ast$ of a system $A \Box_{\min}^{\max} x = c$, where $c \in \mathcal{C}_b$. As main results, we describe the structure of both the set $\Lambda_b$ and the set $\mathcal{C}_b$. We then introduce a paradigm for the $\max$-$\min$ learning of weight matrices relating input and output data taken from training data. The learning error is expressed in terms of the $L_\infty$ norm. We compute, by an explicit formula, the minimal value of the learning error according to the training data, and we give a method to construct weight matrices whose learning error is minimal, which we call approximate weight matrices. Finally, as an application of our results, we show how to approximately learn the rule parameters of a possibilistic rule-based system from multiple training data.
Submitted: Jan 15, 2023
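
To make the setting concrete, here is a minimal NumPy sketch, not the paper's method: it implements the standard $\max$-$\min$ composition, Sanchez's classical greatest-solution consistency test, and the $L_\infty$ residual, which upper-bounds the Chebyshev distance $\Delta$ because the composed vector lies in $\mathcal{C}$. The learning part uses the standard Gödel-implication candidate matrix, which reproduces the data exactly only when the training data are consistent. All numerical values ($A$, $b$, $X$, $Y$) are hypothetical toy data; the paper's explicit formulas for $\Delta$ and the minimal learning error are not reproduced here.

```python
import numpy as np

def maxmin(A, x):
    # Max-min composition: (A box x)_i = max_j min(A_ij, x_j).
    return np.minimum(A, x).max(axis=1)

def godel(a, b):
    # Goedel implication a -> b: equals 1 if a <= b, else b.
    return np.where(a <= b, 1.0, b)

def greatest_candidate(A, b):
    # Sanchez's candidate x*_j = min_i (A_ij -> b_i); the system
    # A box x = b is consistent iff maxmin(A, x*) equals b.
    return godel(A, b[:, None]).min(axis=0)

# Toy system (hypothetical values).
A = np.array([[0.7, 0.2],
              [0.4, 0.9]])
b = np.array([0.5, 0.3])

x_star = greatest_candidate(A, b)   # -> [0.3, 0.3]
c = maxmin(A, x_star)               # -> [0.3, 0.3], so the system is inconsistent
print("consistent:", np.allclose(c, b))              # False
# c belongs to the set C of attainable second members, so the
# L_inf residual is an upper bound on the Chebyshev distance Delta.
print("upper bound on Delta:", np.abs(b - c).max())  # 0.2

# Learning side: training pairs (x^k, y^k) given as rows of X and Y.
# The greatest candidate weight matrix is W_ij = min_k (x^k_j -> y^k_i).
X = np.array([[0.6, 0.1],
              [0.3, 0.8]])
Y = np.array([[0.4, 0.2],
              [0.3, 0.5]])
W = godel(X[:, None, :], Y[:, :, None]).min(axis=0)
err = max(np.abs(y - maxmin(W, x)).max() for x, y in zip(X, Y))
print("L_inf learning error of W:", err)  # 0.0 here: this toy data is consistent
```

For inconsistent training data this candidate matrix is in general not error-minimal; constructing weight matrices that attain the minimal $L_\infty$ learning error is precisely what the paper's results address.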