On the epistemology and ethics of communicating a Cartesian consciousness
Publisher: Elsevier - Science Direct
Journal: Safety Science, Available online 14 July 2012
Researchers have made situation awareness into a researchable, scientific concept and generated practical progress mainly by modeling it on a natural-scientific ideal of empiricism and positivism. Crucially, in the manner of Cartesian dualism, this assumes that the world is objectively available and apprehensible, and can be compared against a corresponding internal mirror of it (the SA). This has involved epistemological and ethical sacrifices. Most importantly, people now get blamed for losing SA. This happens in research, investigations, media and judicial contexts, where in hindsight it is pointed out that their “mind” did not get the crucial bits of “matter” that were supposedly available to them.
Peering wearily into the cradle of the concept in the mid-nineties, aviation safety veteran Charlie Billings wondered aloud whether the intercession of “situation awareness” was necessary to explain what causes people to see, or miss, or remember seeing, something. “It’s a construct!” he said in a keynote at a foundational conference on situation awareness. “Constructs cannot cause anything!” (Billings, 1996). Charlie Billings passed away in 2010. By then, loss of situation awareness (i.e. loss of a construct) had become the favored cause for an epidemic of automation/human performance-related mishaps in aviation and in other settings. A full 85% of reports produced by the Australian Transportation Safety Bureau in 1996 (ATSB, 1996) contained references to a “loss of situation awareness.” A meta-analysis of over 300 civil aviation incident reports conducted by human factors researchers indicated that loss of situation awareness causes more incidents when the captain is at the controls than when the first officer is, and that the pilot flying is more likely to lose situation awareness than the pilot not flying (Jentsch et al., 1999).

Of course, Giddens would argue that the findings reported in a study like that are only as stable as the two subsequent interpretations that brought them forth. Human factors, after all, is an activity in which humans study humans. Humans are self-reflecting actors, not objects in the natural world that do not answer back. For Giddens, this involves a double hermeneutic. First, there are the interpretations among those people who are studied in human factors research. The second hermeneutic applies to the human researchers themselves, who are, of course, constituted in a particular context that offers a particular set of constructs, methods, and techniques.
In an earlier Jones and Endsley (1996) study, like the one above, interpretation governed both the reporters’ reflections on their own performance (the description of the errors they committed and reported in their choice of words in an incident reporting system) and the researchers’ subsequent classification of those errors. None of this holds any trans-historic truth values that can move human factors or safety science toward a greater accumulation of facts or “science.” But there are more important concerns than that. Clearly, neither lay nor researcher communities had much time for Charlie Billings’ cautions. Fifteen years on, loss of a construct causes a majority of aviation accidents (Jentsch et al., 1999; Jones and Endsley, 1996). On the face of it, the popular, intuitive appeal and subsequent swift adoption have given situation awareness (and the causal power of its loss) a kind of consensus authority: everybody uses it, so everybody uses it. This in spite of consistent calls for caution in addition to Billings’ (Dekker, 2000; Dekker et al., 2010; Flach, 1995; Sarter and Woods, 1991; Smith and Hancock, 1995). That a construct is popular, however, should not be confused with it being “scientific.” Consensus authority will eventually fail to prop up an otherwise bad idea. So in order to keep using situation awareness as a cause—and apparently credibly so—human factors researchers have to see situation awareness as a good idea, a researchable one, a scientific one. This has involved a number of tacit philosophical operations that cast situation awareness research as a natural sciences project. Some would argue that SA amounts to a project that has generated practical progress. The question considered in this paper is whether the necessary epistemological and ethical sacrifices have been worth it.