Download ISI English Article No. 78860
Article Title

Automated generation of computationally hard feature models using evolutionary algorithms
Article Code: 78860
Publication Year: 2014
Length: 18-page English PDF
Source

Publisher: Elsevier - Science Direct

Journal: Expert Systems with Applications, Volume 41, Issue 8, 15 June 2014, Pages 3975–3992

Keywords

Search-based testing; Software product lines; Evolutionary algorithms; Feature models; Performance testing; Automated analysis
Article Preview

Abstract

A feature model is a compact representation of the products of a software product line. The automated extraction of information from feature models is a thriving topic involving numerous analysis operations, techniques and tools. Performance evaluations in this domain mainly rely on the use of random feature models. However, these only provide a rough idea of the behaviour of the tools with average problems and are not sufficient to reveal their real strengths and weaknesses. In this article, we model the problem of finding computationally hard feature models as an optimization problem and solve it using a novel evolutionary algorithm for optimized feature models (ETHOM). Given a tool and an analysis operation, ETHOM generates input models of a predefined size that maximize aspects such as the execution time or memory consumption of the tool when performing the operation over the model. This lets users and developers gauge the performance of tools in pessimistic cases, providing a better idea of their real power and revealing performance bugs. Experiments using ETHOM on a number of analyses and tools have successfully identified models producing much longer execution times and higher memory consumption than those obtained with random models of identical or even larger size.
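
To make the approach concrete, below is a minimal Python sketch of the kind of evolutionary loop the abstract describes: a population of candidate feature models of a fixed size is evolved, and the fitness of each candidate is the measured execution time of the analysis tool on it. This is not the actual ETHOM implementation; the callables random_model, crossover, mutate and analysis_tool are hypothetical placeholders standing in for whatever model encoding and tool under test are used.

```python
import random
import time

def evolve_hard_models(random_model, crossover, mutate, analysis_tool,
                       pop_size=50, generations=100, mutation_rate=0.05):
    """Evolve feature models that maximize the execution time of an analysis.

    random_model() returns a random feature model of the desired size,
    analysis_tool(model) runs the analysis operation under test, and
    crossover/mutate are standard variation operators. All four are
    hypothetical placeholders, not part of the ETHOM tool itself.
    """
    def fitness(model):
        # Fitness = wall-clock time of the analysis on this model.
        # (Memory consumption could be measured analogously.)
        start = time.perf_counter()
        analysis_tool(model)
        return time.perf_counter() - start

    population = [random_model() for _ in range(pop_size)]
    for _ in range(generations):
        # Rank by measured hardness; note this re-runs the tool on every
        # candidate each generation, which a real implementation would cache.
        ranked = sorted(population, key=fitness, reverse=True)
        survivors = ranked[: pop_size // 2]  # elitist selection
        offspring = []
        while len(survivors) + len(offspring) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            child = crossover(p1, p2)
            if random.random() < mutation_rate:
                child = mutate(child)
            offspring.append(child)
        population = survivors + offspring
    return max(population, key=fitness)  # hardest model found
```

Using measured wall-clock time as fitness is noisy, so repeated measurements or median timings would be advisable in practice; the essential point is that the search is guided by the observed cost of the analysis rather than by any structural property of the model.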