Download ISI English Article No. 372
Persian Article Title

Benchmarking CVB Website Performance: Spatial and Structural Patterns

Article Code: 372 | Publication Year: 2010 | English Article: 10-page PDF | Persian Translation: available on order | Word Count: 8,240 words
English Title
Benchmarking CVB website performance: Spatial and structural patterns
Source

Publisher: Elsevier - Science Direct

Journal: Tourism Management, Volume 31, Issue 5, October 2010, Pages 611–620

Keywords
Benchmarking - Balanced Scorecard - Website performance - Convention and Visitors Bureau - Destination marketing - Spatial analysis - Tourism management - Tourism
Article Preview
Preview of Benchmarking CVB Website Performance: Spatial and Structural Patterns

Abstract

This study evaluated 967 U.S. CVB websites using a modified Balanced Scorecard (mBSC) approach which assesses website performance with respect to overall technical functionality, customer friendliness and usability, effectiveness of marketing the destination, and information content. Spatial maps were constructed for these four dimensions and overall CVB website performance using ArcMap v.9.2 GIS software. A structural pattern of CVB website performance was obtained using Structural Equation Modeling (SEM). It was concluded that CVB websites primarily need improvement in marketing the destination product. The analysis revealed significant differences in website performance between members and non-members of Destination Marketing Association International (DMAI) as well as regional differences. Study implications for destination marketing organizations and CVB website designers are discussed.

Introduction

Marketing cities and other locations as tourist destinations has been one of the major trends in U.S. tourism over the past three decades. Convention and Visitor Bureaus (CVBs) have acted as destination marketing organizations and have been responsible for attracting pleasure visitors and meeting groups to their respective locations (Yuan, Gretzel, & Fesenmaier, 2003). Most CVBs operate as independent, not-for-profit organizations, and their organizational structures vary depending on the character of the destination, i.e., a single city, a metropolitan area, a county, or a regional destination comprised of several counties (Gartrell, 1988). CVBs provide opportunities for local businesses to promote their products in communications with their travel markets and, at the same time, help tourists and potential visitors to form expectations, create destination images, and assist in purchasing tourism products (Buhalis, 2003, Pike, 2004 and Shanshan et al., 2007). CVBs enhance the economic growth and development of their respective regions, making them desirable locations for meetings, conventions, and tours, and increase the environmental well-being of destinations through promotion and comprehensive marketing. Thus, CVBs play several important roles at once: an economic driver creating new income, employment, and a more diversified local economy; a community marketer communicating the most appropriate destination image and positioning the destination appropriately in the marketplace; and an industry coordinator encouraging less industry fragmentation (Morrison et al., 1998, Presenza et al., 2005 and Wang, 2008). With the development of tourism offers and services, there is intense competition for visitors between destinations, and an appealing presentation of a destination to potential markets is a crucial success factor.
As part of their marketing strategy, CVBs widely use Internet technology, especially destination websites, as this approach offers new distribution channels for the destination product, reaches people in faraway locations, and provides richer information and less expensive means of communication. It also allows establishing and developing new relationships with members and other cooperative partners. Therefore, regular evaluations of CVB website performance are needed to effectively facilitate continuous improvements, i.e., customer retention and return on investment, and to judge site performance against competitors and industry peers (Fesenmaier et al., 1999 and Tierney, 2000). Xiang, Kothari, Hu, and Fesenmaier (2007) argue that benchmarking the performance of destination marketing organizations' websites should be considered a regular and continuous instrument that contributes to the development of CVBs' winning management strategies. However, the multi-dimensional nature of website performance, the variety of approaches to performance evaluation, and the time-consuming evaluation process often prevent regular and comprehensive monitoring of CVB website performance. The purpose of this exploratory study was to evaluate the performance of all U.S. CVB websites using a standardized measure, a modified Balanced Scorecard (mBSC) approach, thus obtaining a picture of CVB website performance across the whole country. Such a comprehensive survey allowed aggregated comparisons of CVB websites by different variables, such as Destination Marketing Association International (DMAI) membership, state, or geographic location (using the Geographical Information System ESRI ArcMap GIS), thus casting light on leadership within the competition. In addition, the motivation for the study was to understand how different CVBs approach the website design task and which website features receive most of their attention, i.e., to obtain the current model of CVB website design.
Such information was thought to be helpful in understanding whether CVBs in practice follow academic research recommendations and marketing theory. The data collection was conducted in 2006 as a benchmarking project for the Certified Destination Management Executive (CDME) program of the DMAI. The contribution of this study lies in the completeness of the evaluation data, which can be considered nearly a census of U.S. CVB websites, as well as in applying spatial statistics and spatial maps to visualize the wealth of evaluation data and compare performance on a large scale. The practical relevance of the study has been attested to by participants of the CDME training program over the past 10 years.

Conclusion

This study evaluated the status of website performance for nearly all U.S. CVB websites as of summer 2006, which allowed their ranking based on the overall performance (PERFORM) variable, as well as on each of the four performance perspectives: Customer, Marketing Effectiveness, Destination Information, and Technical. The results of the evaluation project exposed spatial and structural patterns of the performance variable and its dimensions. As far as the authors know, this type of project has not been done before on a national scale. As can be seen from Fig. 1A, the CVBs whose websites were evaluated are concentrated on the West and East Coasts and in the Midwest, i.e., in the most populated areas of the U.S. One of the study's conclusions is that website performance is not randomly distributed: performance tends to be higher in major metropolitan areas, and websites with higher performance cluster together (Fig. 1B). DMAI members have websites of higher performance than non-members. This result holds across the Customer, Marketing Effectiveness, Destination Information, and Technical perspectives. As follows from Fig. 1C, websites generally do well on the Technical perspective. However, it should be kept in mind that this result is attributable to a somewhat simplistic evaluation made by NetMechanic and LinkPopularity, which checked performance on just five criteria: the number of broken links, HTML code, browser compatibility, load time, and number of links. Therefore, the striking technical prowess and sophisticated technical features displayed by a number of websites were not registered by the WebEVAL® instrument. With respect to the Customer perspective, CVB websites obtained very good results, the highest among all perspectives, as indicated by Fig. 1D. It was concluded (see Table 3) that CVBs universally pay attention to visual appeal (APPEAL) and navigation structure (NAVIGATE) and to providing such basics as CVB contact information (CONTACT) and currency of information (CURRENT).
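The clustering finding (websites with higher performance clustering together) is the kind of pattern usually quantified with a spatial autocorrelation statistic such as global Moran's I; the paper itself used ArcMap for its spatial analysis. As an illustration only, and not the authors' actual procedure, a minimal pure-Python sketch with toy scores and a hypothetical binary adjacency matrix:

```python
def morans_i(values, weights):
    """Global Moran's I: positive values indicate spatial clustering.

    values  -- performance scores, one per location
    weights -- n x n spatial weight matrix (weights[i][i] == 0)
    """
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)

# Toy example: four locations on a line, neighbors are adjacent pairs.
# Low scores cluster on one side, high scores on the other.
scores = [1.0, 1.0, 5.0, 5.0]
adjacency = [
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
]
print(round(morans_i(scores, adjacency), 4))  # 0.3333 (positive: clustering)
```

A value near zero would indicate spatially random performance; the clearly positive result here mirrors the clustered toy data.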
While t-tests indicate significant differences between DMAI members and non-members on the Technical and Customer perspectives, the differences in means and variances are not large, and the statistical results can be partially attributed to the fairly large sample sizes. Fig. 1E reflects a somewhat less than adequate CVB website performance on the Marketing Effectiveness perspective, with a better situation on the East and West Coasts and in the Southwest. The Marketing Effectiveness perspective displayed the lowest mean score and the second largest differences between DMAI members and non-members, as shown in Table 2. As can be seen from Table 3, the means of the variables reflecting relationship marketing (RELATION), assurance (ASSURE), and level of personalization (PERSONAL) are low, which indicates that only a small number of websites pay attention to these important aspects of effective marketing. Websites of DMAI members and non-members differ significantly on the Destination Information perspective as well (Table 2). Evaluation results indicated that CVB websites almost universally received high scores on information about destination activities (ACTIVITY) and facilities (FACILITY). The situation is rather different with such aspects as information for meeting planners and travel professionals (MPLANNER), as can be seen from Table 3. This may be explained, at least in part, by small destinations having chosen not to compete as places for meetings and conventions. Thus, further data analysis is needed to decide whether the differences between DMAI members and non-members in the Destination Information perspective can be attributed to a "meeting planner" set of evaluative criteria. The structural pattern of CVB website performance depicted in Fig. 3 reflects how the construct is understood by CVB website designers.
The three components of CVB website performance that clearly emerged are an adequate description of the destination product (Destination Product), effective marketing (Marketing), and ease of website use (Ease of Use). It should be noted that these three components have many aspects in common with the original mBSC perspectives: Destination Information, Marketing Effectiveness, and Customer. They do not coincide completely, which is understandable: the CVB website evaluation was conducted from the theoretical/researchers' point of view, while the SEM model shows how various performance features are grouped within the websites, i.e., from the CVBs' point of view. The Technical perspective was not incorporated into the structural pattern due to the extremely different nature of its measurement compared to the other three perspectives. The Destination Product factor explains the largest amount of variance (see Table 4). The performance variables comprising this factor, such as FACILITY, LINKING, ACTIVITY, DESTINFO, and SEGMENT, have relatively high means (see Table 3), which reflects acceptance of these dimensions by CVBs and website designers as important components of overall CVB website performance. In contrast, the means of the variables ASSURE and RELATION included in the Marketing factor are low, meaning that websites incorporating these dimensions have a larger overall performance score and a better chance of outperforming the competition, as indicated by the largest path coefficient (.91). The exclusion from the model of the variables BRANDING, PERSONAL, and FEEDBACK, which primarily reflect the Marketing dimension of overall performance, signals that overall website performance is less dependent on these variables. It is suggested that the pattern shown in Fig. 3 be viewed as a benchmarking level at which any new entity entering the field needs to perform.
To stand out and perform above the average level, newcomer websites should perform more strongly in the area of destination marketing and incorporate such features as interactivity, personalization, and feedback into their websites. The version of WebEVAL® used in this study was designed several years ago. Since innovations in website design have been developing rapidly, the instrument is not capable of effectively measuring some of these advancements due to the lack of corresponding survey items. The 2006 benchmarking evaluation discovered that a growing number of CVB websites incorporate mileage calculation, interactive maps, hotel availability reports, etc. – features that could be classified into the interactivity and personalization performance dimensions, but were not measured. The scope of this study, which can be considered nearly a census of all U.S. CVB websites, is also a source of research limitations. A study on such a scale needed a highly standardized evaluation tool, and the only instrument the researchers were aware of was WebEVAL®, based on the mBSC theoretical approach and designed specifically for CVB website performance assessment. WebEVAL® assesses the presence of a particular feature in the CVB website. It needs to be made clear, however, that while the presence of certain website features can be equated with performance (for example, the raters gave "1" if they thought that "the pages were clear and uncluttered," or "there was a standardized logo on all pages," or "suggested tour itineraries were provided"), the presence of more complicated features cannot automatically be equated with performance. To give an example, consider the items "Is a sign-up for visitors provided?" and "Can site visitors opt to send an e-postcard?" The presence of each of these features signals that website designers thought about presenting visitors with personalization opportunities, and websites with these features received "1" on the corresponding survey items.
The raters, however, did not evaluate whether or not these features were in a workable condition. There were three reasons for that. First, evaluation of whether a particular feature works would inevitably lead to the question "works how well?" This would negatively affect the reliability of evaluations. Second, it would delay evaluation tremendously, and third, evaluation of the technical perspective by NetMechanic was thought to be able to capture, at least partially, faulty HTML code, broken links, or browser incompatibility associated with malfunctioning design features. The applicability of the WebEVAL® instrument for assessing the performance of CVB websites belonging to destinations that differ dramatically in size and travel offerings needs to be discussed as well. While all destinations are unique, their presentation and promotion in the form of a CVB website have many things in common. For example, websites should be aesthetically appealing, have convenient navigation, present visitors with basic destination information, highlight various aspects of the destination to different groups of visitors, tangibilize the destination product, establish long-term relationships with visitors and partners, etc. For instance, with respect to segmentation, a large destination can target various groups of leisure and business travelers, while a small destination can target only local families and wine lovers. However, as long as there is a clear organization of information for various groups of visitors, the number of these groups does not affect the website score on the respective survey items. Therefore, websites of small destinations can outperform those of larger destinations, and many of them did. At the same time, a few survey items have different degrees of applicability for different kinds of destinations, most notably items related to convention and meeting planners. While this is a very common feature of modern CVB websites, it is not universal.
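The presence-based scoring described above (a rater records "1" when a feature is present, "0" otherwise, and items roll up into dimension scores) can be sketched in a few lines. The item and dimension names below are hypothetical illustrations, not the actual WebEVAL® item set; the dimension score is simply the share of its items that are present:

```python
# Hypothetical survey items grouped into performance dimensions;
# a rater records 1 if the feature is present on the website, else 0.
DIMENSIONS = {
    "PERSONAL": ["visitor_signup", "e_postcard"],
    "CONTACT":  ["phone_listed", "email_listed", "street_address"],
}

def dimension_scores(item_scores):
    """Average the 0/1 item scores within each dimension."""
    return {
        dim: sum(item_scores[item] for item in items) / len(items)
        for dim, items in DIMENSIONS.items()
    }

site = {"visitor_signup": 1, "e_postcard": 0,
        "phone_listed": 1, "email_listed": 1, "street_address": 1}
print(dimension_scores(site))  # {'PERSONAL': 0.5, 'CONTACT': 1.0}
```

As the surrounding text notes, this kind of score only records that a feature exists, not whether it actually works.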
Further research is recommended to estimate how the survey items that reflect the "meeting planner" aspect of CVB website performance influence the results of this study. With respect to the "human factor" involved in the evaluation process, while evaluation of all websites by only two researchers provides better consistency of evaluations compared to that made by a larger number of people, the results in lengthy evaluation studies are prone to measurement errors due to researchers' fatigue (Krippendorff, 2004). To combat this issue, the study was completed in an 11-week period during which the researchers did not have any other engagements. Reliability standards in the form of evaluation guidelines were developed and strictly adhered to for the whole length of the project. Reliability checks were conducted four times, spaced evenly along the whole duration of the project, to ensure consistent application of the evaluative criteria by the researchers. The applicability of the WebEVAL® instrument to the purposes of Structural Equation Modeling is also an issue in this study. The researchers operationalized the performance dimension variables based on findings by Park and Gretzel (2007) using relevant survey items contained in the WebEVAL® instrument. While some dimensions had enough items for operationalization, others, like NAVIGATE or ACTIVITY, were based on only four items, which weakened the measures. Operationalization of semi-continuous variables out of a few dichotomous ones might have led to a scale that was not particularly well accommodated by the SEM procedure. Different operationalization approaches are also possible and should be tested; the problem of weighting the various website performance dimensions is one of the directions that research should explore. In addition, the evaluation process can and should be improved in terms of its effectiveness in assessing the technical perspective.
Relatively simple computer programs can be written to automatically feed website addresses to NetMechanic and LinkPopularity for evaluation and to collect the results. Moreover, there is an emerging stream of research that deals with automatic performance evaluations (e.g., Chan & Law, 2006). While this solution for higher speed and reliability is entirely feasible, more thought should be given to how to assess the technical perspective of website performance (and any other perspective, for that matter), and what criteria should be included for this purpose. The nature of the Internet allows collection of hit statistics, number of visitors, and other types of clickstream data, and consideration should be given to the feasibility of methods incorporating these numbers into the website evaluation matrix. With rapid technological advancements in CVB website development, establishing adequate performance evaluation instruments is an ongoing challenge for marketing and tourism researchers.
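The automation suggested here can be sketched for the simplest of the five technical criteria, the number of links on a page. The sketch below uses only Python's standard library and a hard-coded HTML snippet in place of a live page fetch; in a real batch run, each CVB homepage would be downloaded over HTTP, and external services would cover the remaining criteria (broken links, load time, browser compatibility):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Collect href targets from <a> tags: one simple technical criterion."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A static snippet stands in for a fetched CVB homepage.
page = '<a href="/visit">Visit</a><a href="/meet">Meetings</a><a>no href</a>'
counter = LinkCounter()
counter.feed(page)
print(len(counter.links))  # 2
```

Looping such a counter over a list of website addresses and writing the counts to a file is the kind of "relatively simple program" the text has in mind; validating each collected link would then approximate a broken-link check.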
