TY - CHAP
T1 - Deep Neural Network for Constraint Acquisition Through Tailored Loss Function
AU - Vyhmeister, Eduardo
AU - Paez, Rocio
AU - Gonzalez-Castane, Gabriel
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2024.
PY - 2024
Y1 - 2024
N2 - The importance of extracting constraints from data is emphasized by its potential practical applications in solving real-world problems. While constraints are commonly used for modeling and problem-solving, methods for learning constraints from data are still relatively scarce. Moreover, the complex nature of modeling requires expertise and is susceptible to errors, making constraint acquisition methods valuable for automating this process through learning constraints from examples or behaviours of solutions and non-solutions. This study introduces a novel approach grounded in Deep Neural Networks (DNN) based on Symbolic Regression, where suitable loss functions are used to extract constraints directly from datasets. With this approach, constraints can be directly formulated. Additionally, given the wide range of pre-developed architectures and functionalities of DNNs, potential connections and extensions with other frameworks are foreseeable.
KW - Constraint Acquisition
KW - Deep Neural Network
KW - Symbolic Regression
UR - https://www.scopus.com/pages/publications/85199510292
U2 - 10.1007/978-3-031-63775-9_4
DO - 10.1007/978-3-031-63775-9_4
M3 - Chapter
AN - SCOPUS:85199510292
SN - 9783031637742
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 43
EP - 57
BT - Computational Science – ICCS 2024 - 24th International Conference, 2024, Proceedings
A2 - Franco, Leonardo
A2 - de Mulatier, Clélia
A2 - Paszynski, Maciej
A2 - Krzhizhanovskaya, Valeria V.
A2 - Dongarra, Jack J.
A2 - Sloot, Peter M. A.
PB - Springer Science and Business Media Deutschland GmbH
T2 - 24th International Conference on Computational Science, ICCS 2024
Y2 - 2 July 2024 through 4 July 2024
ER -