Investigating student motivation in the context of a learning analytics intervention during a summer bridge program
|Article code||Publication year||English article||Persian translation||Word count|
|30134||2015||8-page PDF||available to order||6560 words|
Publisher : Elsevier - Science Direct
Journal : Computers in Human Behavior, Volume 47, June 2015, Pages 90–97
Summer bridge programs are designed to improve retention and academic success among at-risk populations in postsecondary education by focusing on successful skills, behaviors, and high impact practices that promote academic performance. Recent research on these programs has focused primarily on how students’ incoming demographics and prior academic performance predict academic performance at the university level. This study investigated changes in students’ academic motivation orientations over the course of one bridge program, and how a learning analytics-based intervention was employed by academic advisors to inform their face-to-face meetings with students. The results of our study show that students’ mastery orientation decreased over the course of the bridge program, and indicate that students’ exposure to displays of their academic performance negatively predicts this change. The findings suggest that student perceptions of their goals and formative performance need to be carefully considered in the design of learning analytics interventions since the resulting tools can affect students’ interpretations of their own data as well as their subsequent academic success.
Retention has been seen as a critical issue in higher education for decades. Admitting students who fail to graduate is devastating for the students, and has ramifications for institutional accountability and related revenue models. Indeed, there is a newfound sense of urgency to address this issue because of the new institutional rating model proposed by President Barack Obama to tie U.S. federal financial aid to graduation rates, tuition, and the percentage of lower-income student enrollment (Lewin, 2013). Consequently, postsecondary institutions are ever more interested in investing in viable and successful models to increase retention, particularly for groups of students with historically lower graduation rates, such as first-generation college students (Dennis, Phinney, & Chuateco, 2005) and students from low-SES families (Adelman, 2006; Walpole, 2003).

Students’ academic persistence has been a well-researched topic. Tinto’s (1987, 1993) longitudinal work has demonstrated that six areas, or components, contribute to students’ decision to depart college before earning a degree: (1) pre-entry attributes, such as academic preparation, cultural background, and first-generation status; (2) students’ goals, such as academic major and career choice, and their level of commitment to achieving those goals; (3) students’ institutional experiences, both formal and informal, with peers, faculty, and staff; (4) integration and balance between academic and social interactions; (5) re-examination and updating of goals and commitments; and (6) decision finalization based on students’ cumulative experiences. Related to Tinto’s model, Astin (1975) asserted that students who physically and psychologically involve themselves in the academic and social opportunities of the college environment are more likely to persist. Further, Astin (1984) argued that student involvement is a behavioral manifestation of the psychological construct of motivation.
In order to better assess and monitor the metrics related to student persistence, many postsecondary institutions have turned to learning analytics tools utilizing models driven by pre-entry attributes to address their retention concerns. These attributes are generally paired with students’ formal institutional experiences (i.e., grades), and this approach has dominated the higher education intervention landscape (e.g., Arnold & Pistilli, 2012). To combat declining retention rates for at-risk student populations, many institutions have developed summer bridge programs (Gandara, 2001; Myers & Schirm, 1999; Terenzini & Wright, 1993). Historically, these programs have been designed to provide academic support and information regarding college campus life, orient students to the institutional culture, and develop at-risk students’ self-esteem and self-efficacy (Ackermann, 1991; Fitts, 1989; Garcia & Paz, 2009; Kezar, 2001; Pascarella & Terenzini, 2005). Put simply, “summer bridge programs are intended to address important preparation and achievement gaps that are evident in the research [on retention and persistence]” (Colyar, 2011, p. 123). Although these programs vary widely in programmatic content and implementation, most aim to develop students’ study and time-management skills and their ability to utilize university services (e.g., tutoring centers, libraries), and to provide meaningful exposure to college course work and faculty (Cabrera, Miner, & Milem, 2013). Early evaluations and assessments of summer bridge programs were largely descriptive, relying on student satisfaction surveys (e.g., Ackermann, 1991), while more recent research has utilized comparison samples of non-participants, longitudinal analyses, and other empirical techniques (e.g., Allen & Bir, 2012; Cabrera et al., 2013; Strayhorn, 2011).
Students’ intrinsic motivation to achieve their stated goals, and their capacity to plan and utilize available resources, are fundamentally linked to their retention in higher education (Allen & Bir, 2012). However, the learning analytics tools that are currently available for deployment at a large scale do not include measures of student motivation, as reported either by the students themselves or from assessments by their instructors or academic advisors. These measures are difficult to include in large-scale analytics tools since such information is not typically captured by institutional student information systems. Whereas the assessment and evaluation of retention programs once relied primarily on student satisfaction surveys (Astin, 1993; Garcia & Paz, 2009; Strayhorn, 2011; Walpole et al., 2008), today institutional records of high school preparation, standardized test scores, and student aid serve as the “basis for models that evaluate the effects of interventions on attainment outcomes” (St. John, 2006, p. 100). A range of data sources representing a broader characterization of the student experience is needed if technological tools are to successfully produce models that represent all of the components of Tinto’s original model. In response, this study investigates students’ motivational orientations and how assessment of those orientations can inform a learning analytics-based intervention employed during a summer bridge program to support data-driven decisions and actions of the academic advisors.

1.1. Research questions

The overarching research questions guiding this study are:

• RQ1: To what extent, if any, do students’ motivational orientations change throughout the course of a summer bridge program?

• RQ2: What factors predict the changes in motivation, if any, that occur over the course of a summer bridge program?
• RQ3: What is the relationship between advisors’ use of a learning analytics-powered Early Warning System and their students’ academic performance during a summer bridge program?

This study is the result of working in partnership with summer bridge staff; we believe that such partnerships are necessary in order to better understand the different factors that affect students’ motivational orientations within the context of the summer transition program. These factors can, in turn, inform the future designs of learning analytics tools so that new tools include non-cognitive as well as academic performance measures to ultimately improve student learning and retention.
Conclusion (English)
4. Results

We conducted paired-sample t-tests to investigate whether students’ goal orientations changed throughout the course of Bridge (RQ1) by comparing pre and post measures of students’ mastery, performance-approach, and performance-avoid orientations. There were no significant differences between pre-bridge performance-approach scores (M = 2.8, SD = .93, α = .85) and post-bridge performance-approach scores (M = 2.7, SD = .99, α = .90); t(208) = .792, p = .43. There were also no significant differences between pre-bridge performance-avoid scores (M = 2.9, SD = .92, α = .77) and post-bridge performance-avoid scores (M = 2.8, SD = .93, α = .82); t(208) = .772, p = .44. There was, however, a statistically significant decrease from students’ pre-bridge mastery scores (M = 4.7, SD = .55, α = .89) to their post-bridge mastery scores (M = 4.3, SD = .84, α = .93); t(208) = 6.53, p < .001.

To better understand what might contribute to students’ decrease in mastery orientation during Bridge (RQ2), we specified a multiple regression model of students’ change in mastery (Table 1). We controlled for relevant demographic characteristics (e.g., athletic status, gender), previous academic expectations and support (e.g., support from family and friends), incoming mastery, as well as previous achievement (ACT scores). We were also interested in the relationship between students’ understanding of their formative academic achievement data and their mastery perceptions. To measure this relationship, we included variables that measured students’ self-reports of how often their academic advisor showed them Student Explorer data, as well as whether advisors used the Student Explorer EWS before, during, or after their meetings with students.

Table 1. Change in mastery orientation over the course of the bridge program.
|Variable|β|SE β|
|---|---|---|
|Intercept|1.890|.974|
|Demographic variables|||
|Mastery orientation before summer bridge|.358⁎⁎|.116|
|Student athlete|−.294|.248|
|Female|−.045|.127|
|Achievement variables|||
|ACT composite score|−.042|.031|
|1-on-1 advisor-student discussions (a)|||
|Graphs within Student Explorer shown to student|−.112⁎|.050|
|Studying for bridge courses|.058|.043|
|Advisor’s use of Student Explorer|||
|Before meeting (count)|.025|.049|
|During meeting (count)|−.009|.030|
|After meeting (count)|−.015|.137|
|Pre-enrollment support|||
|Family encouraged academic success|.252⁎⁎|.082|
|Friends encouraged academic success|.013|.062|
|Excellent HS teachers|.145⁎|.066|

Dependent variable: change in mastery orientation; model R² = .30.
⁎⁎ p < .01. ⁎ p < .05.
(a) Represents students’ reports of the number of times each issue was discussed during 1-on-1 meetings.

Students’ incoming mastery was positively associated with their outgoing mastery. However, students’ self-reports of how often their advisors showed them their Student Explorer data negatively predicted the change in mastery over the course of Bridge. Advisors’ logged (actual) use of the Student Explorer EWS before, during, or after meetings with each student was not a significant predictor of the change in mastery.

Since Bridge programs are meant to ensure that students are prepared for college-level coursework, we wanted to understand what factors would predict students’ final grade outcomes in three of the courses offered during the seven-week term (RQ3): an English course, a remedial intermediate algebra course (Math A), and a college-level intermediate algebra course (Math B). We did not model the Math C, Writing, or freshman seminar courses because there was little grade variation.
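A change-in-mastery model of this kind is an ordinary multiple regression. As a minimal sketch, the snippet below fits such a model with least squares on synthetic data whose effect signs loosely echo Table 1; the variable names, sample values, and effect sizes are illustrative placeholders, not the study’s data, and most of the paper’s covariates are omitted:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 209  # sample size matching the reported df of 208

# Synthetic predictors (placeholders, not the study's data)
incoming = rng.normal(4.7, 0.55, n)        # mastery before bridge
shown = rng.poisson(2.0, n).astype(float)  # times advisor showed Student Explorer
family = rng.normal(4.0, 1.0, n)           # family encouragement

# Simulated outcome: signs chosen to mirror Table 1 (positive incoming mastery
# and family support effects, negative effect of data displays)
change = (-3.0 + 0.36 * incoming - 0.11 * shown
          + 0.25 * family + rng.normal(0.0, 0.5, n))

# Ordinary least squares fit, intercept included as a column of ones
X = np.column_stack([np.ones(n), incoming, shown, family])
beta, *_ = np.linalg.lstsq(X, change, rcond=None)

# R-squared: proportion of outcome variance explained by the model
resid = change - X @ beta
r2 = 1.0 - resid.var() / change.var()
print("coefficients:", np.round(beta, 3))
print("R^2:", round(r2, 3))
```

With real survey data one would also report standard errors and p-values per coefficient, as Table 1 does; a statistics package rather than raw least squares would normally be used for that.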
We used multiple regression to model which factors would predict student outcomes in each of these courses, and controlled for demographic characteristics (e.g., gender and athletic status), academic achievement measures (e.g., ACT scores, math pre-tests, and math midterms), high school experiences (e.g., perceived quality of high school teachers in general, and math/English preparation specifically), and encouragement by family and friends. This allowed us to focus on the relationships between student motivation, advisors’ use of Student Explorer, and course grade outcomes (Table 2).

Table 2. Predicting summer bridge course grade outcomes.

|Variable|English β (SE)|Math A β (SE)|Math B β (SE)|
|---|---|---|---|
|R²|.31|.63|.80|
|Mastery|−.06 (.07)|−5.48⁎ (2.58)|3.33 (4.81)|
|Performance-approach|−.09 (.01)|2.00 (3.29)|−11.85 (8.93)|
|Performance-avoid|.10 (.09)|3.63 (2.82)|4.11 (8.77)|
|Student athlete|−.66⁎⁎ (.26)|–|−1.92 (15.15)|
|Female|.11 (.11)|2.22 (3.81)|15.23 (15.23)|
|Student Explorer data shown to student|−.06 (.05)|.65 (1.62)|3.72 (4.29)|
|Studying for bridge courses|.03 (.04)|1.85 (1.37)|−.76 (3.73)|
|Student Explorer before meeting (count)|−.04 (.04)|.63 (1.62)|−1.85 (4.48)|
|Student Explorer during meeting (count)|.01 (.03)|−1.62 (.98)|−1.96 (2.176)|
|Student Explorer after meeting (count)|−.36 (.21)|−13.42⁎⁎ (4.55)|.44 (7.62)|
|Family encouragement|.05 (.07)|1.09 (2.53)|−15.59⁎ (7.24)|
|Friend encouragement|−.04 (.06)|−.54 (1.90)|8.00 (5.73)|
|Excellent HS teachers|.12⁎ (.06)|3.76 (2.39)|7.24 (8.19)|
|ACT English score|−.01 (.02)|−.88 (1.67)|−7.46 (4.15)|
|HS English preparation|.07 (.05)|–|–|
|ACT math|–|.82 (.86)|1.42 (2.71)|
|Math pre-test|–|.61 (.16)|.43 (.81)|
|Math midterm|–|.78⁎⁎⁎ (.17)|.96⁎ (.41)|

⁎⁎⁎ p < .001. ⁎⁎ p < .01. ⁎ p < .05.

Results of the multiple regression models indicated that athletic status negatively predicted English course grades, while students’ reports that they had excellent high school teachers positively predicted English course grades. These variables, however, were not predictive of math course grades.
For students in the remedial Math course (A), mastery orientation negatively predicted course grade. The extent to which Bridge advisors viewed students’ data via Student Explorer after meeting with students also negatively predicted students’ Math A course grades. For students in the college-level Math course (B), perceived family encouragement negatively predicted course grades. The midterm score positively predicted the final course grade for both Math courses. Finally, students’ self-reports of how often their advisors showed them their Student Explorer data did not predict any course grade, suggesting that students’ perceptions of their data affected their mastery orientation but not their course grades.
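As a closing illustration of the RQ1 analysis reported above, a paired-sample t-test reduces to the mean of the pre/post difference scores divided by its standard error. The sketch below uses simulated mastery scores (n = 209 as in the paper, with means chosen to echo the reported pre/post values; the scores themselves are synthetic placeholders, not the study’s data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 209  # paper's sample size (df = 208)

# Simulated mastery scores: pre-bridge around M = 4.7,
# with an average decline of roughly 0.4 points
pre = rng.normal(4.7, 0.55, n)
post = pre - rng.normal(0.4, 0.80, n)

# Paired-sample t statistic: mean difference over its standard error
diff = pre - post
t = diff.mean() / (diff.std(ddof=1) / np.sqrt(n))
print(f"t({n - 1}) = {t:.2f}")
```

With a decline of this size the statistic lands well past conventional significance thresholds, consistent with the reported t(208) = 6.53, p < .001 for the mastery scale.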