TY - GEN
T1 - Determining the Most Promising Selective Backbone Size for Partial Knowledge Compilation
AU - Balogh, Andrea
AU - Escamocher, Guillaume
AU - O’Sullivan, Barry
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.
PY - 2025
Y1 - 2025
N2 - Knowledge compilation allows for quick retrieval of a wide variety of information about an instance of a constraint satisfaction problem or Boolean formula. Unfortunately, the compiled representation can be so large that it cannot be used in practical settings, especially when there are restrictions on the amount of available memory. It has recently been observed that assigning some carefully chosen variables can significantly reduce the size of the representation, with only a small loss of data about the instance. However, determining how many, and which, variables should be assigned to create a compact compiled instance requires a number of expensive computations, including completely compiling the full initial instance, which might not be achievable for large instances. Our work in this paper involves removing large compilations from the process of constructing selective backbones. First, we look at how statistical properties of literals can be used as fast heuristics to pick which variable to assign next. Then, we show how the behaviour of selective backbones for small instances can help determine the size at which selective backbones for large instances are expected to offer the best compromise between the quantity of knowledge lost and the amount of space saved. We also extend previous work to take into account weighted preferences on models. This is beneficial since various probabilistic models can be expressed in such terms.
UR - https://www.scopus.com/pages/publications/105010232304
U2 - 10.1007/978-3-031-95973-8_3
DO - 10.1007/978-3-031-95973-8_3
M3 - Conference proceeding
AN - SCOPUS:105010232304
T3 - Lecture Notes in Computer Science (LNCS, volume 15762)
SP - 34
EP - 51
BT - International Conference on the Integration of Constraint Programming, Artificial Intelligence, and Operations Research
T2 - 22nd International Conference on the Integration of Constraint Programming, Artificial Intelligence, and Operations Research, CPAIOR 2025
Y2 - 10 November 2025 through 13 November 2025
ER -