Analytical investigation of the process planning problem
Article code | Publication year | English paper page count |
---|---|---|
27236 | 2000 | 17-page PDF |
Publisher : Elsevier - Science Direct
Journal : Computers & Chemical Engineering, Volume 23, Issues 11–12, 5 January 2000, Pages 1605–1621
English Abstract
The combinatorial nature of problems in process systems engineering has led to the establishment of mixed-integer optimization techniques as a major paradigm in this area. Despite the success of these methods in tackling practical-sized problems, the issue of exponential increase of the computational effort with the problem size has been of great concern. A major open question has been whether there is any hope of ever designing ‘efficient’ exact algorithms for this class of problems. Further, if such algorithms are not possible, one would like to know whether provably good approximation schemes can be devised. In this paper, we pursue analytical investigations to provide answers to these two questions in the context of the process planning problem. By means of a computational complexity analysis, we first prove that the general process planning problem is NP-hard, and thus efficient exact algorithms for this class of problems are unlikely to exist. Subsequently, we develop an approximation scheme for this problem and prove, via probabilistic analysis, that the error of the heuristic vanishes asymptotically with probability one as the problem size increases. Finally, we present computational results to substantiate the theoretical findings.
English Introduction
The process planning problem consists of the selection of capacity expansion and operational decisions for a chemical processing complex over a long-range planning horizon. Under the assumption of fixed-charge cost models and linear mass balances, the standard formulation of the problem is a mixed-integer linear program (Sahinidis, Grossmann, Fornari & Chathrathi, 1989). Quite a few specialized algorithms have been developed for the exact solution of these problems. Most of these methods involve branch-and-bound schemes coupled with strong cutting planes and bounding techniques (Sahinidis et al., 1989; Sahinidis & Grossmann, 1992; Liu & Sahinidis, 1995b). Despite the success of these methods in tackling practical-sized problems, the issue of exponential increase of the computational effort with the problem size has been of great concern. A major open question has been whether there is any hope of ever designing ‘efficient’ exact algorithms for this class of problems. Further, if such algorithms are not possible, one would like to know whether provably good approximation schemes can be devised. In this paper, we pursue analytical investigations of the process planning problem to provide answers to these two questions.

In the first part of the paper, we carry out a computational complexity analysis of the process planning problem to address the first question. The exponential explosion of solution effort suffered by exact methods seems to suggest that this class of problems is inherently difficult. However, proving that a particular problem is inherently difficult can be just as hard as finding an efficient exact algorithm for it. Instead, computational complexity theory provides a framework for proving that a particular problem is ‘just as difficult’ as a large number of other problems that are widely recognized as being difficult. Recognizing that a particular problem is as difficult as the hardest known problems provides valuable guidance as to which avenues have the greatest potential of being productive in solving the problem. While the search for efficient exact methods for such a problem should still be pursued, it is also relevant to concentrate on designing algorithms that are not guaranteed to always give an optimal solution but do so most of the time. In short, as pointed out by Garey and Johnson (1979), the primary purpose of computational complexity theory ‘…is to assist algorithm designers in directing their problem-solving efforts towards those approaches that have the greatest likelihood of leading to useful algorithms.’ Our complexity analysis of the process planning problem reveals that even very simple cases of this problem are difficult, and thus efficient exact algorithms for these problems are unlikely to exist.

In the second part of the paper, we pursue the second question of whether it is possible to design an approximation scheme for the process planning problem with some kind of theoretical guarantee of its performance. Although most current industrial-level practice in process design and planning is based on approximation or heuristic strategies, these methods typically lack the rigor associated with exact approaches. Most heuristic performance analyses rest on empirical evidence rather than the theoretical guarantees that are available for exact methods.
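To make the problem structure referred to above concrete, the following is a minimal sketch of a fixed-charge capacity-expansion MILP in the spirit of the formulation cited (Sahinidis et al., 1989). The notation (y, E, Q, W) and the simplified constraints are ours and serve only as an illustration, not as the paper's exact model.

```latex
% Illustrative fixed-charge process planning MILP (notation is ours, not the paper's).
% y_{it}: binary decision to expand process i in period t;  E_{it}: expansion size;
% Q_{it}: installed capacity;  W_{it}: operating level;  sales/purchase terms are omitted.
\begin{align}
\min\quad & \sum_{t=1}^{T}\sum_{i=1}^{P}\bigl(\alpha_{it}\,y_{it} + \beta_{it}\,E_{it} + \gamma_{it}\,W_{it}\bigr)
            \;-\; \text{(net revenue terms)} \\
\text{s.t.}\quad
& Q_{it} = Q_{i,t-1} + E_{it}, && i=1,\dots,P,\ t=1,\dots,T, \\
& L_{it}\,y_{it} \;\le\; E_{it} \;\le\; U_{it}\,y_{it}, && \text{(fixed-charge expansion)} \\
& W_{it} \;\le\; Q_{it}, && \text{(operate within installed capacity)} \\
& \text{linear mass balances linking the } W_{it} \text{ across the network,} \\
& y_{it}\in\{0,1\},\qquad E_{it},\,Q_{it},\,W_{it} \ge 0 .
\end{align}
```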
Recognizing this gap between heuristic and exact strategies, Liu and Sahinidis (1997) proposed theoretical analysis of heuristics for process planning problems to introduce the rigor that has long been absent. Although theoretical analyses of heuristics have received considerable attention in the context of classical combinatorial optimization problems, this kind of analysis was absent from the process systems engineering literature prior to the paper of Liu and Sahinidis (1997). Worst-case analysis of a heuristic serves to provide a bound on the deviation of the heuristic solution from the optimum for any instance of the problem. Although useful in understanding when and why the heuristic will perform worst, this kind of analysis is usually not predictive of the average performance of the heuristic. On the other hand, probabilistic analysis serves to characterize the probable performance of the heuristic on typical problem instances. Liu and Sahinidis (1997) developed a linear-programming-based heuristic for a special case of the process planning problem and characterized its performance through worst-case and probabilistic analysis. They provided theoretical bounds on the worst-case error of the heuristic solution and proved, under some assumptions on the problem parameters, that the heuristic error vanishes with probability one as the problem size grows. In this paper, we extend the heuristic of Liu and Sahinidis (1997) to a broader class of process planning problems than those originally considered. We carry out a probabilistic analysis under less restrictive assumptions than those in Liu and Sahinidis (1997), and prove that the modified heuristic is also asymptotically optimal. The analysis demonstrates that, for the process planning problem, it is possible to design approximation schemes with provably good solutions.

The remainder of this paper is organized as follows. In the next section, we provide a brief description of the process planning problem. Section 3 carries out complexity analyses of several cases of the problem; static as well as dynamic versions of the problem are shown to be NP-hard. An optimization-based heuristic for constructing feasible solutions to the process planning problem is developed in Section 4. The motivation for the development of the proposed heuristic and the subsequent analysis comes from the empirical observations made by Liu and Sahinidis (1995a). The authors conducted computational experiments to study the effect of the length of the planning horizon on the quality of the relative LP relaxation gap for process planning problems, and made the curious observation that the gap decreased as the number of planning periods grew. This observation appears counterintuitive, since a larger number of time periods implies a larger number of binary variables in the integer program and hence increased complexity. In Section 5, we provide a mathematical explanation of this phenomenon via a probabilistic analysis of the proposed heuristic strategy. Specifically, we prove that the proposed heuristic is asymptotically optimal as the number of planning periods increases. Finally, Section 6 presents computational results. The largest planning problems solved involved 38 processes and 100 time periods. The corresponding MILP formulation requires 3800 binary variables, 16,200 continuous variables, and 18,638 constraints. Such problems are solved by the proposed heuristic to within 1% of optimality after solving a single LP.
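The exact heuristic of Liu and Sahinidis (1997) and its extension in this paper are not reproduced above; the sketch below, written with the PuLP modeling library on a hypothetical two-process, multi-period toy instance, only illustrates the general LP-relaxation-based idea discussed in the text: solve the LP relaxation, round the fractional expansion decisions up, fix them, and re-solve for the continuous variables to obtain a feasible plan and an optimality gap against the LP bound.

```python
# Illustrative LP-rounding heuristic for a toy fixed-charge planning model.
# This is NOT the heuristic of Liu and Sahinidis (1997); the data and model
# below are hypothetical and only sketch the "relax, round, fix, re-solve" idea.
import pulp

T = 6                                   # planning periods
procs = ["A", "B"]                      # hypothetical processes
fixed_cost = {"A": 10.0, "B": 14.0}     # fixed charge per expansion
var_cost = {"A": 1.0, "B": 0.8}         # cost per unit of expansion capacity
price = {"A": 2.5, "B": 2.2}            # revenue per unit operated
demand = {t: 5.0 + 2.0 * t for t in range(T)}   # market limit per period


def build_model(relaxed):
    """Build the toy planning model; binaries become [0,1] variables when relaxed."""
    cat = pulp.LpContinuous if relaxed else pulp.LpBinary
    m = pulp.LpProblem("toy_planning", pulp.LpMaximize)
    y = pulp.LpVariable.dicts("y", (procs, range(T)), lowBound=0, upBound=1, cat=cat)
    e = pulp.LpVariable.dicts("e", (procs, range(T)), lowBound=0)   # expansion size
    w = pulp.LpVariable.dicts("w", (procs, range(T)), lowBound=0)   # operating level
    # Objective: operating revenue minus fixed and variable expansion costs.
    m += pulp.lpSum(price[i] * w[i][t] - fixed_cost[i] * y[i][t] - var_cost[i] * e[i][t]
                    for i in procs for t in range(T))
    for i in procs:
        for t in range(T):
            m += e[i][t] <= 20.0 * y[i][t]                               # fixed-charge link
            m += w[i][t] <= pulp.lpSum(e[i][s] for s in range(t + 1))    # capacity built so far
    for t in range(T):
        m += pulp.lpSum(w[j][t] for j in procs) <= demand[t]             # market limit
    return m, y, e, w


# Step 1: solve the LP relaxation (an upper bound, since we maximize profit).
lp, y, _, _ = build_model(relaxed=True)
lp.solve(pulp.PULP_CBC_CMD(msg=0))
lp_bound = pulp.value(lp.objective)

# Step 2: round every strictly positive relaxed expansion decision up to 1 and fix it.
fixed = {(i, t): (1 if y[i][t].value() > 1e-6 else 0) for i in procs for t in range(T)}
heur, yh, _, _ = build_model(relaxed=True)
for (i, t), v in fixed.items():
    yh[i][t].lowBound = yh[i][t].upBound = v

# Step 3: re-solve the remaining LP to get a feasible plan and report the gap.
heur.solve(pulp.PULP_CBC_CMD(msg=0))
heur_val = pulp.value(heur.objective)
print(f"LP bound = {lp_bound:.2f}, heuristic = {heur_val:.2f}, "
      f"gap = {100 * (lp_bound - heur_val) / abs(lp_bound):.2f}%")
```

In this toy model, rounding the expansion decisions up can only enlarge the feasible region of the continuous variables, so the re-solved plan is always integer-feasible and the reported gap against the LP bound is a valid optimality certificate; the paper's analysis concerns how such a gap behaves as the number of periods grows.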
English Conclusion
We have established that the general process planning problem is NP-hard, and thus exact polynomial-time algorithms for this class of problems are unlikely to exist. This characterization motivates the need for approximation schemes that can provide provably good solutions within reasonable time. We developed one such scheme based upon perturbing the solution of the LP relaxation of the problem to construct feasible solutions. Via a probabilistic analysis, we proved that the proposed heuristic provides solutions for which the normalized error vanishes with probability one as the problem size increases. This theoretical result is substantiated for practical problems by computational experiments with example processing networks from the literature involving up to 38 processes, 28 chemicals, and 100 time periods. The corresponding MILP formulation requires 3800 binary variables, 16,200 continuous variables, and 18,638 constraints. Such problems are solved by the proposed heuristic to within 1% of optimality after solving a single LP.
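Stated formally, with notation that is ours rather than the paper's, the asymptotic optimality result summarized above says that the relative gap between the heuristic value and the optimal value of a T-period instance vanishes almost surely as the horizon grows:

```latex
% Z^{H}_{T}: heuristic objective value;  Z^{*}_{T}: optimal MILP value for a T-period instance.
% Illustrative restatement of "the normalized error vanishes with probability one".
\Pr\!\left[\ \lim_{T \to \infty} \frac{\bigl|Z^{H}_{T} - Z^{*}_{T}\bigr|}{\bigl|Z^{*}_{T}\bigr|} = 0 \ \right] = 1 .
```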