
RCRA 2012 accepted papers

Marco Gavanelli, Maddalena Nonato, Andrea Peano, Stefano Alvisi and Marco Franchini.
Genetic Algorithms for Scheduling Devices Operation in a Water Distribution System in Response to Contamination Events
(Already published paper)
Abstract: This paper heuristically tackles a challenging scheduling problem arising in the field of hydraulic distribution systems in the case of a contamination event, that is, optimizing the scheduling of a set of tasks so that the consumed volume of contaminated water is minimized. Each task consists of manually activating a given device, located on the hydraulic network of the water distribution system. In practice, once contamination has been detected, a given number of response teams move along the network to operate each device on site. The consumed volume of contaminated water depends on the time at which each device is operated, according to complex hydraulic laws, so the value associated with each schedule must be evaluated by a hydraulic simulation. We explore the potential of Genetic Algorithms as a viable tool for tackling this optimization-simulation problem. We compare different encodings and propose ad hoc crossover operators that exploit the combinatorial structure of the feasible region, featuring hybridization with Mixed Integer Linear Programming. Computational results are provided for a real-life hydraulic network of average size, showing the effectiveness of the approach. Indeed, we greatly improve upon the common-sense solutions typically adopted in practice.
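
As a rough illustration of the kind of encoding and crossover the abstract mentions, the sketch below runs a permutation-encoded genetic algorithm with classic order crossover, where a schedule is a single permutation of device indices (one team) and the hydraulic simulator is replaced by a stub. All names (simulate_contaminated_volume, the mutation rate, etc.) are illustrative and not taken from the paper, which uses richer encodings and a MILP hybridization.

# Minimal sketch of a permutation-encoded GA for device-operation scheduling.
# The hydraulic simulator is stubbed out; in the paper the fitness of a schedule
# is obtained by a full hydraulic simulation.
import random

def simulate_contaminated_volume(schedule):
    """Hypothetical stand-in for the hydraulic simulation: lower is better."""
    return sum(pos * device for pos, device in enumerate(schedule))

def order_crossover(p1, p2, rng):
    """Classic OX: copy a slice from p1, fill the remaining positions in p2's order."""
    n = len(p1)
    a, b = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    fill = [g for g in p2 if g not in child[a:b]]
    for i in list(range(b, n)) + list(range(a)):
        child[i] = fill.pop(0)
    return child

def genetic_algorithm(num_devices, pop_size=30, generations=100, seed=0):
    rng = random.Random(seed)
    pop = [rng.sample(range(num_devices), num_devices) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=simulate_contaminated_volume)      # elitist survivor selection
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            child = order_crossover(p1, p2, rng)
            if rng.random() < 0.2:                       # swap mutation
                i, j = rng.sample(range(num_devices), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=simulate_contaminated_volume)

print(genetic_algorithm(8))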

Richard Wallace.
SAC and Neighbourhood SAC
(Original full paper)
Abstract: This paper presents some new algorithms built around the idea of singleton arc consistency (SAC). The main focus of the paper is a new form of SAC that only considers neighbourhoods of a variable with a singleton domain. Although this algorithm is dominated by the neighbourhood inverse consistency (NIC) algorithm, in practice it has almost always produced the same amount of filtering at significantly lower cost. In addition, a new SAC algorithm is presented that utilizes some of the features of one of the neighbourhood SAC algorithms and is more efficient than the classical SAC-1 algorithm. Finally, some heuristic methods and approximations to the new SAC algorithm are discussed and evaluated, some of which afford further improvements in efficiency with little or no decrement in filtering effectiveness.
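
For background, here is a plain SAC-1 sketch on a binary CSP: each value is tested by making it a singleton and running AC-3, and values whose singleton test causes a domain wipeout are removed. This is a generic illustration of singleton arc consistency, not the neighbourhood variants introduced in the paper; the constraint representation is an assumption.

# Plain SAC-1 on binary CSPs. Constraints map ordered variable pairs to the
# set of allowed value pairs (both directions listed explicitly).
def revise(domains, constraints, x, y):
    """Remove values of x with no support in y. Returns True if something was pruned."""
    allowed = constraints.get((x, y))
    if allowed is None:
        return False
    pruned = {a for a in domains[x] if not any((a, b) in allowed for b in domains[y])}
    domains[x] -= pruned
    return bool(pruned)

def ac3(domains, constraints):
    """Standard AC-3; returns False on a domain wipeout."""
    queue = list(constraints)
    while queue:
        x, y = queue.pop()
        if revise(domains, constraints, x, y):
            if not domains[x]:
                return False
            queue.extend((z, w) for (z, w) in constraints if w == x and z != y)
    return True

def sac1(domains, constraints):
    """SAC-1: repeat singleton tests until no value is removed."""
    changed = True
    while changed:
        changed = False
        for var in domains:
            for value in sorted(domains[var]):
                test = {v: set(domains[v]) for v in domains}
                test[var] = {value}
                if not ac3(test, constraints):
                    domains[var].discard(value)
                    changed = True
    return all(domains.values())

# Toy usage: three variables over {0, 1} that must be pairwise different.
# Plain AC prunes nothing, but every singleton test fails, so SAC detects inconsistency.
doms = {"x": {0, 1}, "y": {0, 1}, "z": {0, 1}}
neq = {(0, 1), (1, 0)}
cons = {(a, b): neq for a in "xyz" for b in "xyz" if a != b}
print(sac1(doms, cons), doms)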

Richard Wallace.
Experimental study of constraint weights obtained by random probing and other techniques
(Original full paper)
Abstract: This paper explores various issues associated with the use of constraint weighting strategies. It has two main goals. The first is to consider different strategies for obtaining information related to failure, in particular to conflict sets associated with local minima in hill-climbing and to failures due to domain wipeout. This part of the work also evaluates the relative importance of blame assignment and contention identification in crafting strategies based on search failures. The second goal is to consider whether failure information (from domain wipeouts) can be used to characterise an individual problem. Here, the technique of random probing is used to assess problem difficulty. It is found that hard and easy problems (both with the same problem parameters and in the same part of a critical complexity region) can be distinguished (most strikingly) by the way weights are distributed among variables. In addition, there are differences in the pattern of connections among highly-weighted variables that to some degree distinguish hard and easy problems. These results hold out the promise of rapid assessment of problem difficulty that can allow better allocation of problem solving resources, as well as possible heuristics based on patterns of weights among variables.
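
As a rough illustration of the probing idea, the sketch below runs short random-assignment probes with forward checking on a binary CSP and increments the weight of the constraint responsible for each domain wipeout. The CSP representation, the blame rule and all names are illustrative simplifications, not the paper's exact procedure.

# Rough sketch of random probing for constraint weighting on a binary CSP.
# When forward checking from a fresh assignment wipes out a domain, the
# responsible constraint gets its weight incremented and the probe restarts.
import random
from collections import defaultdict

def random_probe(domains, constraints, num_probes=100, seed=0):
    rng = random.Random(seed)
    weights = defaultdict(int)
    variables = list(domains)
    for _ in range(num_probes):
        current = {v: set(domains[v]) for v in variables}
        for var in rng.sample(variables, len(variables)):
            value = rng.choice(sorted(current[var]))
            current[var] = {value}
            wipeout = None
            for other in variables:
                if other == var or (var, other) not in constraints:
                    continue
                allowed = constraints[(var, other)]
                current[other] = {w for w in current[other] if (value, w) in allowed}
                if not current[other]:
                    wipeout = (var, other)
                    break
            if wipeout:
                weights[wipeout] += 1    # blame the constraint that caused the wipeout
                break                    # restart with the next probe
    return dict(weights)

# Toy usage: an insoluble "pairwise different over {0, 1}" core, whose constraints
# accumulate weight quickly under probing.
doms = {"x": {0, 1}, "y": {0, 1}, "z": {0, 1}}
neq = {(0, 1), (1, 0)}
cons = {(a, b): neq for a in "xyz" for b in "xyz" if a != b}
print(random_probe(doms, cons))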

Luca Pulina.
A Swarm Intelligence Approach for Biometrics Verification and Identification
(Original short paper)
Abstract: In this paper we investigate a swarm intelligence classification approach for both biometrics verification and identification problems. We model the problem by representing biometric templates as ants, grouped in colonies representing the clients of a biometrics authentication system. The biometric template classification process is modeled as the aggregation of ants to colonies. When test input data is captured – a new ant in our representation – it is influenced by the deposited pheromones related to the population of the colonies. We experiment with the Aggregation Pheromone density based Classifier (APC), and our results show that APC outperforms “traditional” techniques – like 1-nearest-neighbour and Support Vector Machines – and that the performance of APC is comparable to several state-of-the-art face verification algorithms. The results presented here let us conclude that swarm intelligence approaches represent a very promising direction for further investigation in biometrics verification and identification.
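
To make the pheromone metaphor concrete, here is a rough sketch in the spirit of an aggregation-pheromone classifier: each training template deposits a Gaussian-shaped pheromone field, a test template is attracted to the colony with the highest aggregated density, and verification thresholds that density. The feature representation, the delta and threshold parameters, and all names are invented for illustration and do not reproduce APC exactly.

# Rough sketch of an aggregation-pheromone-style classifier. Each training
# template deposits a Gaussian pheromone field; a test template joins the
# colony (client) with the highest aggregate pheromone density at its location.
import math

def pheromone_density(test, colony, delta=1.0):
    """Average of Gaussian contributions deposited by the colony's templates."""
    total = 0.0
    for template in colony:
        dist2 = sum((a - b) ** 2 for a, b in zip(test, template))
        total += math.exp(-dist2 / (2.0 * delta ** 2))
    return total / len(colony)

def identify(test, colonies):
    """Identification: return the client whose colony attracts the test ant most."""
    return max(colonies, key=lambda c: pheromone_density(test, colonies[c]))

def verify(test, claimed, colonies, threshold=0.5):
    """Verification: accept the claimed identity if its density is high enough."""
    return pheromone_density(test, colonies[claimed]) >= threshold

# Toy usage with 2-D "templates" for two clients.
colonies = {"alice": [(0.0, 0.1), (0.2, 0.0)], "bob": [(3.0, 3.1), (2.9, 3.0)]}
print(identify((0.1, 0.1), colonies), verify((0.1, 0.1), "bob", colonies))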

Silvana Badaloni, Francesco Sambo and Francesco Venco.
On Hybridizing Complete Bayesian Network Structure Search With Independence Tests
(Original full paper)
Abstract: Bayesian Networks (BN) are probabilistic graphical models used to encode in a compact way a joint probability distribution over a set of random variables. The NP-complete problem of finding the most probable BN structure given the observed data has been widely studied in recent years. In the literature, several complete algorithms have been proposed for the problem; in parallel, several tests for statistical independence between the random variables have also been proposed, in order to reduce the size of the search space. In this work, we propose to hybridize the algorithm representing the state-of-the-art in complete search with two types of independence tests, and assess the performance of the hybrid algorithms in terms of both solution quality and computational time. Experimental results show that hybridization with both independence tests results in a substantial gain in computational time, against a limited loss in solution quality, and that neither of the two independence tests consistently dominates the other in terms of computational time gain.
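
The sketch below illustrates the kind of pre-pruning involved: a pairwise independence test (here a simple G statistic on binary data) discards variable pairs before a complete structure search is run, so that only the surviving pairs are considered as candidate edges. The statistic, the 0.05 critical value for one degree of freedom, and the use of a purely pairwise filter are illustrative assumptions; the paper evaluates specific tests combined with a state-of-the-art complete algorithm.

# Sketch: prune candidate edges with a pairwise G-test for independence on
# binary data before running a complete Bayesian-network structure search.
import math
from itertools import combinations

def g_test(data, i, j):
    """G statistic for independence of binary columns i and j."""
    n = len(data)
    counts = {(a, b): 0 for a in (0, 1) for b in (0, 1)}
    for row in data:
        counts[(row[i], row[j])] += 1
    margin_i = {a: sum(counts[(a, b)] for b in (0, 1)) for a in (0, 1)}
    margin_j = {b: sum(counts[(a, b)] for a in (0, 1)) for b in (0, 1)}
    g = 0.0
    for (a, b), observed in counts.items():
        expected = margin_i[a] * margin_j[b] / n
        if observed > 0 and expected > 0:
            g += 2.0 * observed * math.log(observed / expected)
    return g

def candidate_edges(data, num_vars, critical=3.84):
    """Keep pair (i, j) only if the test rejects independence at the 0.05 level."""
    return {pair for pair in combinations(range(num_vars), 2)
            if g_test(data, *pair) > critical}

# Toy usage: column 1 copies column 0, column 2 is unrelated.
data = [(0, 0, 1), (1, 1, 0), (0, 0, 0), (1, 1, 1), (0, 0, 1), (1, 1, 0)]
print(candidate_edges(data, 3))   # only the dependent pair (0, 1) survives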

Christian Miller, Paolo Marin and Bernd Becker.
A Dynamic QBF Preprocessing Approach for the Verification of Incomplete Designs
(Original full paper)
Abstract: Bounded Model Checking (BMC) is a major verification technique for finding errors in sequential circuits by unfolding a circuit iteratively and converting the BMC instances into Boolean satisfiability (SAT) formulas. When considering incomplete designs (i.e. those containing so-called blackboxes), we rather need the logic of Quantified Boolean Formulas (QBF) to obtain a precise modeling of the unknown behavior of the blackbox. The incremental nature of the BMC problem can be exploited by means of incremental preprocessing of the QBF formulas. It was shown that applying incremental preprocessing produces smaller QBF formulas, but at the price of moving computation time from the solver to the preprocessor, which can rise significantly. In this paper we show how to take advantage of incremental preprocessing without incurring this overhead. We start by applying incremental preprocessing, and switch to conventional BMC when the preprocessing effectiveness does not compensate for the time spent. We present multiple switching heuristics, and evaluate the overall performance by running several state-of-the-art QBF solvers as back-ends. Comparing these results to those obtained by standard and incremental BMC procedures, it turns out that dynamic preprocessing leads to the best overall performance of the whole verification process.
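
The decision being studied can be pictured as a simple online rule. The sketch below shows one plausible switching heuristic, not necessarily any of the paper's: keep incremental preprocessing while its cost stays below a fixed fraction of the per-bound runtime, then fall back to the conventional flow. bmc_unfold, preprocess, solve_qbf and the ratio threshold are hypothetical stand-ins.

# Sketch of a switching heuristic for incremental QBF preprocessing in BMC:
# keep preprocessing as long as its share of the per-bound runtime stays small.
import time

def dynamic_bmc(bmc_unfold, preprocess, solve_qbf, max_bound, ratio=0.5):
    use_preprocessing = True
    for bound in range(max_bound + 1):
        formula = bmc_unfold(bound)
        t0 = time.perf_counter()
        if use_preprocessing:
            formula = preprocess(formula)
        t1 = time.perf_counter()
        result = solve_qbf(formula)
        t2 = time.perf_counter()
        # Switch permanently once preprocessing stops paying for itself.
        if use_preprocessing and (t1 - t0) > ratio * (t2 - t0):
            use_preprocessing = False
        if result == "SAT":          # counterexample found at this bound
            return bound
    return None

# Trivial usage with stub functions, just to show the control flow.
print(dynamic_bmc(lambda b: b, lambda f: f, lambda f: "UNSAT" if f < 3 else "SAT", 5))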

Juan Jose Palacios, Camino R. Vela, Ines Gonzalez-Rodriguez and Jorge Puente Peinador.
A Particle Swarm Solution based on Lexicographical Goal Programming for a Multiobjective Fuzzy Open Shop Problem
(Original full paper)
Abstract: We consider a multiobjective open shop scheduling problem with uncertain durations modelled as fuzzy numbers. Given crisp due dates, the objective is to minimise both the makespan and the maximum tardiness. We formulate the multiobjective problem as a fuzzy goal programming model based on lexicographical minimisation of expected values. The resulting problem is solved using a particle swarm optimization approach searching in the space of possibly active schedules. Experimental results are presented on several problem instances to evaluate the proposed method, illustrating its potential.
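
As a small illustration of lexicographic comparison over fuzzy objectives, the sketch below ranks triangular fuzzy numbers by one standard expected-value definition, (a + 2b + c)/4, and compares candidate schedules objective by objective, most important first. The ranking formula and the toy data are assumptions; the paper's goal programming model is more involved.

# Sketch: lexicographic minimisation of expected values of triangular fuzzy objectives.
def expected_value(tfn):
    a, b, c = tfn            # one common expected value of a triangular fuzzy number
    return (a + 2 * b + c) / 4.0

def lex_key(objectives):
    """objectives: tuple of triangular fuzzy numbers, most important first."""
    return tuple(expected_value(o) for o in objectives)

# Toy usage: (fuzzy makespan, fuzzy maximum tardiness) for two candidate schedules.
s1 = ((10, 12, 15), (0, 1, 3))
s2 = ((10, 12, 15), (0, 2, 4))
print(min([s1, s2], key=lex_key))   # same expected makespan, s1 wins on tardiness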

Miguel A. González, Camino R. Vela, Ramiro Varela and Inés González-Rodríguez.
Scatter Search and Path Relinking for the Job Shop Scheduling Problem with Sequence Dependent and non-Anticipatory Setup Times
(Original full paper)
Abstract: We face the job shop scheduling problem with sequence-dependent non-anticipatory setup times and makespan minimization by means of a scatter search algorithm that uses path relinking and tabu search at its core. We establish a graph model that allows us to study some properties of the problem, and we show that some of them change when non-anticipatory setup times are considered. We define a new local search neighborhood structure which is incorporated into the proposed path relinking and tabu search algorithms. To assess the performance of this algorithm, we present the results of an extensive experimental study, including an analysis of the algorithm under different running conditions and a comparison with state-of-the-art methods, illustrating its good behavior. In particular, our algorithm establishes new best solutions for every instance considered.
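
Below is a small generic sketch of path relinking between two permutation solutions: the initiating solution is moved toward the guiding one by swaps, and every intermediate solution is evaluated. The evaluation function is a stub rather than the makespan computation with non-anticipatory setups used in the paper, and the swap-based path is just one common relinking scheme.

# Generic path-relinking sketch on permutations: walk from an initiating
# solution to a guiding one by swaps and keep the best intermediate solution.
def path_relinking(initiating, guiding, evaluate):
    current = list(initiating)
    best, best_cost = list(current), evaluate(current)
    for pos in range(len(current)):
        if current[pos] == guiding[pos]:
            continue
        j = current.index(guiding[pos])          # bring the guiding element to this position
        current[pos], current[j] = current[j], current[pos]
        cost = evaluate(current)
        if cost < best_cost:
            best, best_cost = list(current), cost
    return best, best_cost

# Toy usage with a dummy cost (sum of position * value).
def cost(p):
    return sum(i * v for i, v in enumerate(p))

print(path_relinking([3, 2, 1, 0], [0, 1, 2, 3], cost))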

Alfonso Emilio Gerevini, Alessandro Saetti and Mauro Vallati.
SAT-based Planning with Macro-actions and Learned Horizons
(Already published paper)
Abstract: The use of automatically learned knowledge for a planning domain can significantly improve the performance of a generic planner when solving a problem in this domain. In this work, we focus on the well-known SAT-based approach to planning and investigate two types of learned knowledge that have not been studied in this planning framework before: macro-actions and planning horizon. Macro-actions are sequences of actions that typically occur in the solution plans, while a planning horizon of a problem is the length of a (possibly optimal) plan solving it. We propose a method that uses a machine learning tool for building a predictive model of the optimal planning horizon, and variants of the well-known planner SatPlan and solver MiniSat that can exploit macro-actions and learned planning horizons to improve their performance. An experimental analysis illustrates the effectiveness of the proposed techniques.
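
As a toy picture of macro-action learning, the sketch below counts consecutive action pairs in example plans and promotes the most frequent pairs to macros. This is only an illustration of the idea; the paper's macro extraction and the machine-learned horizon predictor are considerably richer, and the names here are invented.

# Tiny sketch of macro-action extraction: count consecutive action pairs in
# example plans and keep the most frequent ones as candidate macro-actions.
from collections import Counter

def extract_macros(plans, top_k=2):
    pairs = Counter()
    for plan in plans:
        pairs.update(zip(plan, plan[1:]))
    return [pair for pair, _ in pairs.most_common(top_k)]

# Toy usage over two sample plans.
plans = [["pick", "move", "drop", "move"], ["pick", "move", "drop", "pick", "move"]]
print(extract_macros(plans))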

Lorenzo Foti, Marco Maratea, Simona Sacone and Silvia Siri.
Solving Train Load Planning Problems with Boolean Optimization
(Original short paper)
Abstract: The train load planning problem is very important in the context of transportation and logistics in seaport container terminals. It consists in optimally assigning containers to the wagons of a train in order to satisfy capacity constraints while optimizing re-handling operations and train utilization. In this paper, we first present a basic mathematical model of the problem, inspired by a real case of an Italian port. Then, we extend the basic model in order to represent the re-handling operations in the storage area more realistically. We finally present the results obtained by running several Boolean optimization solvers, i.e. LP, PB, and SMT solvers, on a set of benchmarks coming from the two models. A preliminary experimental analysis shows that (i) challenging problems can be generated even with a relatively low number of containers and wagons, which in our models correspond to relatively small formulas; (ii) CPLEX shows the best results; and (iii) the only other solver evaluated that performs well in this domain is SCIP.
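
To fix ideas, the following toy sketch enumerates container-to-wagon assignments under weight and length capacities, maximizing the loaded weight; a Boolean optimization encoding would instead introduce one 0-1 variable per container/wagon pair subject to the same constraints. All data and names are invented, and the re-handling aspects of the paper's models are not represented here.

# Toy brute-force version of the container-to-wagon assignment core of train
# load planning. One decision per container: a wagon index, or None (stays in the yard).
from itertools import product

def load_plan(containers, wagons):
    """containers: list of (weight, length); wagons: list of (max_weight, max_length)."""
    best, best_loaded = None, -1
    options = list(range(len(wagons))) + [None]
    for assignment in product(options, repeat=len(containers)):
        used_w = [0] * len(wagons)
        used_l = [0] * len(wagons)
        for (weight, length), wagon in zip(containers, assignment):
            if wagon is not None:
                used_w[wagon] += weight
                used_l[wagon] += length
        if any(used_w[i] > wagons[i][0] or used_l[i] > wagons[i][1]
               for i in range(len(wagons))):
            continue
        loaded = sum(w for (w, _), wag in zip(containers, assignment) if wag is not None)
        if loaded > best_loaded:
            best, best_loaded = assignment, loaded
    return best, best_loaded

containers = [(20, 20), (25, 40), (10, 20), (30, 40)]   # (tons, feet)
wagons = [(40, 60), (35, 60)]                           # (max tons, max feet)
print(load_plan(containers, wagons))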

Alessandro Dal Palù, Agostino Dovier, Andrea Formisano and Enrico Pontelli.
CUD@SAT:  GPU parallelism for SAT Solving
(Original full paper)
Abstract: Every new desktop or laptop comes equipped with a multicore, programmable graphics processing unit (GPU). The computational power of this kind of unit is fully exploited by some graphical software, but the GPU is mostly idle during everyday PC use. Moreover, extra GPUs can be added at a very low price per processor. This kind of parallelism has recently been exploited by applications able to split their work into small parallel threads. In this paper we investigate the capabilities of this hardware for search problems, where large threads sometimes need to be delegated to each parallel processor. In particular we deal with the well-known SAT problem and focus on the NVIDIA CUDA architecture, one of the most popular platforms for GPUs. We experimented with several implementation choices and evaluated the results. The most interesting outcome is that with the built-in GPU of a typical office desktop configuration we can reach speed-ups of one or even two orders of magnitude.
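
One core data-parallel step in GPU SAT solving can be pictured as "one logical thread per clause" checking the clause status under the current partial assignment. The sketch below emulates that mapping sequentially in Python; the formula representation and function names are illustrative, and the paper implements this kind of step on NVIDIA CUDA rather than in Python.

# Sequential emulation of the "one thread per clause" pattern used on GPUs:
# each clause is checked independently against the current partial assignment.
def clause_status(clause, assignment):
    """assignment maps variable -> bool; literals are signed variable indices."""
    unassigned = 0
    for lit in clause:
        var, want = abs(lit), lit > 0
        if var not in assignment:
            unassigned += 1
        elif assignment[var] == want:
            return "satisfied"
    if unassigned == 0:
        return "conflict"
    return "unit" if unassigned == 1 else "open"

def check_all_clauses(clauses, assignment):
    # On a GPU, the loop body below would run as one thread per clause.
    return [clause_status(c, assignment) for c in clauses]

clauses = [[1, 2], [-1, 3], [-2, -3], [2, 3]]
print(check_all_clauses(clauses, {1: True, 2: False}))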

Alessandro Armando, Angela Contento, Daniele Costa and Marco Maratea.
Minimum Disclosure as Boolean Optimization: New Results
(Original short paper)
Abstract: In advanced credential-based, distributed applications, clients gain access to sensitive resources by proving possession of some properties to the server. Yet, a user's properties (or combinations thereof) may be sensitive, and the user may want to minimize their disclosure whenever possible. Given a quantitative estimate of the sensitivity of the properties contained in the credentials, the Minimum Disclosure (MIN-DISCL) problem is the problem of determining a disclosure of information that allows the user to obtain the service while minimizing the overall sensitivity of the exposed disclosure. Previous work provided a reduction of MIN-DISCL to the Weighted Max-SAT problem and showed the practical viability of the approach through experimentation with the Yices SMT solver. In this paper we present a simplified and optimized problem formulation that leads to much smaller encodings and shorter solving times than the original formulation on randomly generated MIN-DISCL problems.
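
As a concrete picture of this kind of reduction, the sketch below emits a Weighted Partial MaxSAT instance in standard WCNF (DIMACS-style) format: policy clauses are hard, and each property gets a soft clause preferring non-disclosure with a weight equal to its sensitivity. The toy policy and sensitivities are invented, and the paper's actual formulation is more refined than this bare encoding.

# Sketch of encoding a Minimum Disclosure instance as Weighted Partial MaxSAT
# in WCNF format. Variable i means "property i is disclosed"; the policy is a
# hard CNF over these variables, and each soft unit clause (-i) says "prefer not
# to disclose property i", weighted by its sensitivity.
def to_wcnf(num_props, policy_clauses, sensitivity):
    top = sum(sensitivity.values()) + 1            # weight meaning "hard"
    lines = []
    for clause in policy_clauses:                  # hard: the server's policy
        lines.append(" ".join([str(top)] + [str(l) for l in clause] + ["0"]))
    for prop, weight in sorted(sensitivity.items()):   # soft: minimise disclosure
        lines.append(f"{weight} {-prop} 0")
    header = f"p wcnf {num_props} {len(lines)} {top}"
    return "\n".join([header] + lines)

# Toy policy: disclose (property 1 or 2) and (property 3), sensitivities 5, 2 and 1;
# the optimum discloses properties 2 and 3 at total cost 3.
print(to_wcnf(3, [[1, 2], [3]], {1: 5, 2: 2, 3: 1}))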

Carlos Mencía, María R. Sierra, Miguel A. Salido, Joan Escamilla and Ramiro Varela. 
Combining Global Pruning Rules with Depth-First Search for the Job Shop Scheduling Problem with Operators
(Original full paper)
Abstract: We propose an enhanced depth-first heuristic search algorithm to face the job shop scheduling problem with operators. This problem extends the classical job shop scheduling problem by considering a limited number of human operators that assist the processing of the operations. We consider total flow time minimization as the objective function, which makes the problem harder to solve and more interesting from a practical point of view than minimizing the makespan. The proposed method exploits a schedule generation scheme termed OG&T, two admissible heuristics and some powerful pruning rules that require recording expanded states. We have conducted an experimental study across several benchmarks to evaluate our algorithm. The results show that the global pruning method is really effective and that the proposed approach is quite competitive at solving this problem.
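
The recorded-states idea can be illustrated generically: depth-first search keeps a table of state signatures with the best cost seen so far, and prunes any node that reaches a recorded state with an equal or worse cost. The toy single-machine flow-time problem, the state signature and the branching below are stand-ins; the paper's OG&T generation scheme and admissible heuristics are not reproduced here.

# Generic sketch of depth-first search with a recorded-expanded-states pruning rule.
def dfs(state, cost, expand, signature, best, seen):
    sig = signature(state)
    if sig in seen and seen[sig] <= cost:
        return best                      # global pruning: dominated by a recorded state
    seen[sig] = cost
    children = expand(state)
    if not children:                     # complete schedule reached
        return min(best, cost)
    for child, step_cost in children:
        best = dfs(child, cost + step_cost, expand, signature, best, seen)
    return best

# Toy usage: sequence three jobs on one machine to minimise total flow time.
durations = {"a": 3, "b": 1, "c": 2}

def expand(state):                       # state = (scheduled jobs tuple, elapsed time)
    scheduled, elapsed = state
    remaining = [j for j in durations if j not in scheduled]
    return [((scheduled + (j,), elapsed + durations[j]), elapsed + durations[j])
            for j in remaining]

def signature(state):
    return (frozenset(state[0]), state[1])

print(dfs((tuple(), 0), 0, expand, signature, float("inf"), {}))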

Saida Hammoujan, Imade Benelallam and El Houssine Bouyakhf.
On the effect of DisCSP Dynamic Variable-Ordering in Constraints Propagation
(Original full paper)
Abstract: Recently, the distributed constraint satisfaction community has developed powerful variable ordering heuristics for use during backtracking search. As far as the authors are aware, there is no work on the effect of DisCSP dynamic variable ordering on constraint propagation techniques. In this paper, we study the impact of DVO within an optimized version of the Asynchronous Maintenance of Arc Consistency (AMAC) algorithm. During the propagation process, an agent that receives a current partial assignment (CPA) message assigns its local variables, performs arc consistency in the network and proposes a better reordering of the lower-priority agents. We show that, despite the fact that dynamic reordering invalidates some nogoods held by future agents, constraint propagation combined with relevant dynamic agent-ordering heuristics proves robust.

Federico Heras, Antonio Morgado and Joao Marques-Silva.
An Empirical Study of Encodings for Group MaxSAT
(Already published paper)
Abstract: Weighted Partial MaxSAT (WPMS) is a well-known optimization variant of Boolean Satisfiability (SAT) that finds a wide range of practical applications. WPMS divides the formula into two sets of clauses: the hard clauses that must be satisfied and the soft clauses that can be left unsatisfied at a penalty given by their associated weight. However, some applications may require each constraint to be modeled as a set or group of clauses. The resulting formalism is referred to as Group MaxSAT. This paper overviews Group MaxSAT, and shows how several optimization problems can be modeled as Group MaxSAT. Several encodings from Group MaxSAT to standard MaxSAT are formalized and refined. A comprehensive empirical study compares the performance of several MaxSAT solvers with the proposed encodings. The results indicate that, depending on the underlying MaxSAT solver and problem domain, the solver may perform better with a given encoding than with the others.
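
One standard encoding of this kind can be sketched as follows: each soft group gets a fresh relaxation variable added to all of its clauses (which then become hard), plus one soft unit clause on the negated relaxation variable carrying the group's weight. The clause representation below is generic; the paper formalizes and refines several such encodings, of which this is only one.

# Sketch of a standard Group MaxSAT -> Weighted Partial MaxSAT encoding:
# every clause of a soft group gets the same fresh relaxation variable r, the
# relaxed clauses become hard, and a soft unit clause (-r) carries the group weight.
def encode_group_maxsat(num_vars, hard_clauses, soft_groups):
    """soft_groups: list of (weight, list_of_clauses). Returns (hard, soft, num_vars)."""
    hard = [list(c) for c in hard_clauses]
    soft = []
    next_var = num_vars
    for weight, group in soft_groups:
        next_var += 1                      # fresh relaxation variable for this group
        for clause in group:
            hard.append(list(clause) + [next_var])
        soft.append((weight, [-next_var])) # pay 'weight' iff the group is relaxed
    return hard, soft, next_var

# Toy usage: one hard clause and two weighted groups over variables 1..3.
hard, soft, n = encode_group_maxsat(3, [[1, 2]], [(4, [[-1], [3]]), (2, [[-2, -3]])])
print(hard, soft, n)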

Roman Barták and Vladimír Rovenský.
Automated Verification of Nested Workflows with Extra Constraints: An Experimental Study
(Original full paper)
Abstract: Workflows are used to formally describe various processes. Because this description is usually done manually, it is important to verify the workflows before they are used further, for example for optimization. It is known that the problem of verifying workflows is NP-complete, and hence search techniques are employed in the verification process. In this paper we experimentally study the properties of a constraint-based search algorithm for verifying nested workflows with extra constraints.

Ruben Martins, Vasco Manquinho and Inês Lynce.
Clause Sharing in Deterministic Parallel Maximum Satisfiability
(Original full paper)
Abstract: Multicore processors have become the dominant computing platform. As a result, parallel Maximum Satisfiability (MaxSAT) solvers have been developed to exploit this architecture. Sharing learned clauses in parallel MaxSAT is expected to further prune the search space and boost the performance of a parallel solver. Yet, so far it has not been made clear which learned clauses should be shared among the different threads. This paper studies the impact of clause sharing heuristics. Evaluating these heuristics can be a hard task in parallel MaxSAT because existing solvers exhibit non-deterministic behavior, i.e. several runs of the same solver can lead to different solutions. This is a clear downside for applications that require solving the same problem instance more than once, such as the evaluation of heuristics. For a fair evaluation, this paper presents the first deterministic parallel MaxSAT solver that ensures reproducibility of results. By using a deterministic solver one can independently evaluate the gains coming from the use of different heuristics rather than from the non-determinism of the solver. Experimental results show that sharing learned clauses improves the overall performance of parallel MaxSAT solvers. Moreover, the performance of the new deterministic solver is comparable to the corresponding non-deterministic version.
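
The kind of heuristic under study can be pictured as a filter applied at fixed synchronization points: each thread exports only learned clauses below a size (or quality) limit, and all threads import the shared clauses in a fixed thread order, which is one simple way to keep parallel runs reproducible. Everything below is an illustrative skeleton with invented names, not the solver's code or its exact sharing criterion.

# Skeleton of size-bounded clause sharing at deterministic synchronization points.
def sharing_round(per_thread_learned, max_size=4):
    shared = []
    for thread_id in sorted(per_thread_learned):          # fixed, deterministic order
        for clause in per_thread_learned[thread_id]:
            if len(clause) <= max_size:                    # size-based sharing heuristic
                shared.append((thread_id, tuple(sorted(clause))))
    # Every thread imports the same list, minus its own clauses.
    return {t: [c for (src, c) in shared if src != t] for t in per_thread_learned}

# Toy usage: thread 0 keeps its long clause private, the short clauses get shared.
learned = {0: [[1, -2], [3, 4, 5, -6, 7]], 1: [[-1, 2, 3]]}
print(sharing_round(learned))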

Mikolas Janota, Ines Lynce and Joao Marques-Silva.
Experimental Analysis of Backbone Computation Algorithms
(Original full paper)
Abstract: In a number of applications, it is not sufficient to decide whether a given propositional formula is satisfiable or not. Often, one needs to infer specific properties of a formula. This paper focuses on the computation of backbones, which are variables that take the same value in all models of a Boolean formula. Backbones find theoretical applications in the characterization of SAT problems, but they also find practical applications in product configuration and fault localization. This paper presents an extensive evaluation of existing backbone computation algorithms on a large set of benchmarks. This evaluation shows that it is possible to compute the set of backbones for a large number of instances, and it also demonstrates that a large number of backbones appear in practical instances.
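
The simplest backbone algorithm of this kind is easy to state: after one satisfying assignment is found, each literal of that assignment is tested by asking whether its complement is also satisfiable; literals whose complement is unsatisfiable are backbones. The sketch below implements this iterative scheme on top of a tiny DPLL solver; it is a didactic version, not one of the optimized implementations evaluated in the paper.

# Didactic backbone enumeration on top of a tiny DPLL SAT solver.
# A literal l is a backbone iff the formula is satisfiable and formula + {-l} is not.
def dpll(clauses, assignment):
    clauses = [c for c in clauses if not any(assignment.get(abs(l)) == (l > 0) for l in c)]
    clauses = [[l for l in c if abs(l) not in assignment] for c in clauses]
    if any(not c for c in clauses):
        return None                                   # conflict
    if not clauses:
        return dict(assignment)                       # all clauses satisfied
    unit = next((c[0] for c in clauses if len(c) == 1), None)
    lit = unit if unit is not None else clauses[0][0]
    branches = [lit > 0] if unit is not None else [lit > 0, not lit > 0]
    for value in branches:
        model = dpll(clauses, {**assignment, abs(lit): value})
        if model is not None:
            return model
    return None

def backbones(clauses, variables):
    model = dpll(clauses, {})
    if model is None:
        return None                                   # unsatisfiable: no backbones
    result = []
    for var in variables:
        if var not in model:
            continue                                  # left free by the model: not a backbone
        lit = var if model[var] else -var
        if dpll(clauses + [[-lit]], {}) is None:      # complement is unsatisfiable
            result.append(lit)
    return result

# Toy usage: (x1) and (x1 or x2) and (-x2 or x3); only x1 is a backbone.
print(backbones([[1], [1, 2], [-2, 3]], [1, 2, 3]))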

Mikolas Janota and Joao Marques-Silva.
On Checking of Skolem-based Models of QBF
(Original short paper)
Abstract: Following the success story of SAT, we have recently seen more research focusing on quantified Boolean formulas (QBFs). This focus is motivated by the fact that QBFs are structurally similar to SAT, yet enable us to express a larger set of problems. This increase in expressivity also brings an increase in complexity. Namely, while SAT is NP-complete, QBF is PSPACE-complete. This complexity increase also appears in checking the correctness of a model of a true formula. In SAT, checking whether a given model is correct can be decided in polynomial time (for CNF it means checking that all clauses are true). In QBF, checking the correctness of a Skolem-based model is NP-complete. This article proposes a novel algorithm for checking Skolem-based QBF models and shows that the existing, commonly used algorithm is a special case of the proposed one. The presented algorithm is parameterized by the number of clauses that are checked at a time; an experimental evaluation shows that significant speedups can be achieved by a good choice of this parameter.
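
To make the checking task concrete, the brute-force sketch below validates a Skolem-based model of a two-level QBF in chunks of k clauses: for each chunk, it looks for a universal assignment that falsifies some clause after substituting the Skolem functions. The enumeration here stands in for the SAT calls used in practice, and k is the kind of chunk-size parameter the paper studies; the formula representation is an assumption.

# Brute-force sketch of chunked checking of a Skolem-based model for a 2-level
# QBF (forall U exists E . CNF). skolem maps each existential variable to a
# function of the universal assignment.
from itertools import product

def check_model(universals, clauses, skolem, k=2):
    for start in range(0, len(clauses), k):
        chunk = clauses[start:start + k]
        for values in product([False, True], repeat=len(universals)):
            assignment = dict(zip(universals, values))
            assignment.update({e: f(assignment) for e, f in skolem.items()})
            for clause in chunk:
                if not any(assignment[abs(l)] == (l > 0) for l in clause):
                    return False, clause, dict(assignment)   # falsifying assignment found
    return True, None, None

# Toy QBF: forall x1 exists x2 . (x1 or x2) and (-x1 or -x2); Skolem function: x2 = not x1.
universals = [1]
clauses = [[1, 2], [-1, -2]]
skolem = {2: lambda a: not a[1]}
print(check_model(universals, clauses, skolem))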

Massimiliano Cattafi, Rosa Herrero, Marco Gavanelli, Maddalena Nonato, Federico Malucelli and Juan José Ramos.
An Application of Constraint Solving for Home Health Care
(Already published paper)
Abstract: Although sometimes it is necessary, no one likes to stay in a hospital, and patients who need to stay in bed but do not require constant medical surveillance prefer their own bed at home. At the same time, a patient in a hospital has a high cost for the community, which is not acceptable if the patient needs service for only a few minutes a day. For these reasons, the current trend in Europe and North America is to send nurses to visit patients at their homes: this choice reduces costs for the community and gives patients a better quality of life. On the other hand, it introduces the combinatorial problem of assigning patients to the available nurses in order to maximize the quality of service, without having nurses travel overly long distances. This problem is known in the literature as the Home Health Care (HHC) problem. In this paper, we address the HHC problem in the municipality of Ferrara, a mid-sized city in the north of Italy. The problem is currently solved by hand, starting from a partitioning of patients based on predefined zones. We describe a Constraint Programming model that solves the HHC problem, and show significant improvements with respect to the current manual solution.
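
For illustration only, the toy sketch below enumerates patient-to-nurse assignments, minimizing total travel from a small distance matrix under a simple per-nurse workload cap. The paper's Constraint Programming model is considerably richer (service times, routing, continuity of care), and all data and names here are invented.

# Toy brute-force version of the assignment core of a Home Health Care problem:
# assign each patient to a nurse, respect a per-nurse workload cap, and minimise
# the total nurse-to-patient travel.
from itertools import product

def assign_patients(travel, max_patients_per_nurse):
    """travel[n][p] = distance from nurse n's base to patient p."""
    num_nurses, num_patients = len(travel), len(travel[0])
    best, best_cost = None, float("inf")
    for assignment in product(range(num_nurses), repeat=num_patients):
        if any(assignment.count(n) > max_patients_per_nurse for n in range(num_nurses)):
            continue
        cost = sum(travel[n][p] for p, n in enumerate(assignment))
        if cost < best_cost:
            best, best_cost = assignment, cost
    return best, best_cost

travel = [[2, 9, 4, 7],    # nurse 0
          [8, 1, 6, 3]]    # nurse 1
print(assign_patients(travel, max_patients_per_nurse=2))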