Does in‐service professional learning for high school economics teachers improve student achievement?


• This article was downloaded by: [University of Waterloo] on: 28 November 2014, at: 08:59. Publisher: Routledge. Informa Ltd, registered in England and Wales, registered number: 1072954; registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

Education Economics. Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/cede20

Does in-service professional learning for high school economics teachers improve student achievement? John R. Swinton (a), Thomas De Berry (b), Benjamin Scafidi (a) & Howard C. Woodard (a). (a) Georgia College & State University Center for Economic Education, Milledgeville, Georgia, USA; (b) Freed-Hardeman University, Henderson, Tennessee, USA. Published online: 16 Dec 2008.

To cite this article: John R. Swinton, Thomas De Berry, Benjamin Scafidi & Howard C. Woodard (2010) Does in-service professional learning for high school economics teachers improve student achievement?, Education Economics, 18:4, 395-405, DOI: 10.1080/09645290802470434

    To link to this article: http://dx.doi.org/10.1080/09645290802470434


  • Education Economics, Vol. 18, No. 4, December 2010, 395–405

ISSN 0964-5292 print/ISSN 1469-5782 online. © 2010 Taylor & Francis. DOI: 10.1080/09645290802470434. http://www.informaworld.com

    Does in-service professional learning for high school economics teachers improve student achievement?

    John R. Swintona*, Thomas De Berryb, Benjamin Scafidia and Howard C. Woodarda

    aGeorgia College & State University Center for Economic Education, Milledgeville, Georgia, USA; bFreed-Hardeman University, Henderson, Tennessee, USA

(Received 29 October 2007; final version received 19 March 2008)

Education policy analysts and professional educators have called for more and better professional learning opportunities for in-service teachers, and for at least 30 years economists have called for more content training for high school economics teachers. Using new data from all Georgia high school economics students, we assess the impact of in-service teacher workshops on the performance of students on a high-stakes end-of-course economics exam. Controlling for student characteristics and teacher fixed effects, we find a positive and significant impact of teacher workshop attendance on student test scores once teachers have attended three workshops. Furthermore, the results suggest that in-service workshops for economics teachers offer a cost-effective way to provide content training.

    Keywords: in-service teacher training; student achievement

    1. Introduction

Teacher quality has been shown to be a large determinant of student achievement in primary and secondary education (Rivkin, Hanushek, and Kain 2005). While many studies have found very limited effects of teacher credentials on student achievement, Clotfelter, Ladd, and Vigdor (2007) find positive and relatively large impacts of various teacher credentials on student performance. Professional learning for in-service teachers is a more modest financial investment in teacher quality than credentials such as an advanced degree or National Board Certification.

Education policy analysts and professional educators have long called for more and better professional learning opportunities for in-service teachers (see, for example, National Commission on Teaching and America's Future 1996). What is true throughout K–12 education may be an acute problem in high school economics. As early as 1977, researchers called for more content training for high school economics teachers (Mackey, Glenn, and Lewis 1977). More recently, Walstad (2001) found that the typical teacher teaching economics has completed no more than one college course in economics. Mackey, Glenn, and Lewis (1977) suggested that teachers at least minor in economics to be qualified to teach at the high school level. However, few high school economics teachers do so.

Currently, 17 US states include at least one high-school-level course in economics as a requirement for graduation. Georgia is one of the few states to also require students to take a high-stakes end-of-course test to demonstrate an understanding of economic concepts. In

    *Corresponding author. Email: John.swinton@gcsu.edu


  • 396 J.R. Swinton et al.

the 2005/06 school year, 41% of Georgia's high school economics students failed this statewide examination, which counted for at least 15% of their final class grade (Georgia Department of Education [GaDOE] 2007). Many attribute this poor performance in high school economics to a lack of teachers with sufficient content knowledge or training in economics.

To fill the training gap, the National Council on Economic Education (NCEE), its state Councils, and college- and university-based centers offer in-service training and workshops to high school economics teachers. The NCEE programs intend to help teachers develop an understanding of economic concepts and to introduce them to teaching material that will aid them in preparing class lessons. Since its inception in 1972, the Georgia Council on Economic Education (GCEE) has offered hundreds of workshops to thousands of Georgia's teachers. Workshops have introduced teachers to materials that address all areas of economics, from personal finance in kindergarten to Advanced Placement economics in high school.

There is some evidence that workshops similar to those offered by the GCEE help teachers improve students' test scores (for examples, see Highsmith 1974; Thornton and Vredeveld 1977; Walstad 1979; Schober 1984; Weaver, Deaton, and Reach 1987; Bosshardt and Watts 1990, 1994; Watts and Bosshardt 1991). Workshops have changed dramatically since the videotape and conference-call approaches of the 1970s. Today's face-to-face workshops introduce teachers to materials custom-made for state and national testing goals. Workshops provide a vast amount of material that teachers can take directly into the classroom. Materials are available in the form of specific lesson plans and sophisticated DVDs. Often, the materials provide links to up-to-date web-based resources. Workshops not only introduce teachers to material but help them take the material for a test drive and come to a better understanding of the economic content in each lesson. The GCEE spends about $700,000 each year to provide workshops and materials for teachers. Yet there is little current information concerning the effectiveness of these workshops.

In this paper, we examine the test scores of Georgia students who take the mandatory end-of-course test (EOCT) in economics. All Georgia students must take a comprehensive economics examination as part of their required economics course. Since spring 2004, the test has counted for at least 15% of each student's grade, and therefore it qualifies as a high-stakes test. We compare the scores of students whose teachers have undergone training at GCEE workshops with those of students whose teachers have not. We control for student factors that previous studies have shown to affect student performance. Because there are many potentially important influences that we cannot observe, such as how teachers decide whether to attend workshops, we estimate our models allowing for teacher fixed effects. We also adjust test scores to control for time trends within the data. We find that taking three GCEE workshops has a positive, statistically significant impact on student test scores, and the magnitude of the impact is similar to the magnitudes of the credential effects considered in Clotfelter, Ladd, and Vigdor (2007). These results suggest that professional learning for in-service teachers can have a positive impact on student achievement. Given the extremely low cost of the in-service workshops ($700,000 per year), this treatment appears to be a cost-effective way to improve student achievement in economics. Given that the average economics teacher has had little economics content training prior to teaching the subject, we cannot say whether similar workshops in other academic subjects would be as beneficial.

In Section 2 we discuss the background literature, and we describe the data in Section 3. We present the empirical model and results in Section 4, and provide concluding remarks in Section 5.



    2. Background literature on in-service training for teachers

There has been little empirical assessment of the efficacy of in-service instruction for elementary and secondary teachers and its role in improving student performance. Kennedy (1998) offers a good summary of the earlier empirical work. In general, past studies are inconclusive concerning the effectiveness of in-service education. Two somewhat recent studies stand out as indicative of this observation. Angrist and Lavy (2001) show that in-service programs coupled with other school-wide reforms can aid teachers in their efforts to teach both mathematics and language skills in Jerusalem public schools. They found that some groups of students experienced as much as a 0.25 standard deviation increase in test scores after their teacher participated in the training programs. Their treatment is not, however, limited just to the impact of in-service training for teachers: the program in the Jerusalem schools that they study was a comprehensive attempt to improve educational outcomes. Jacob and Lefgren (2004), in contrast, found no evidence that in-service programs help Chicago teachers improve the performance of students. The treatment they study costs up to $90,000 per school, which is paid by the central office, though individual schools may supplement this amount. All of the schools in their study, however, were initially low-performing schools. They studied programs that cut across academic disciplines and target general measures of learning such as reading and mathematics scores.

Even with the evidence that teacher education matters to student outcomes, most economics teachers lack sufficient content training to be as effective as they could be. In Georgia, teachers are required to demonstrate knowledge of a content area before being certified to teach that subject. From 1978 until 1997, a teacher demonstrated competency by passing the Teacher Certification Test. From 1997 until 2006, a teacher had to have passed the Praxis II social studies test to demonstrate sufficient content knowledge to teach economics in high school. Since 1 September 2006, teachers must pass the Georgia Assessment for the Certification of Teachers test specifically in economics to teach in the field. However, any teacher who had previously passed the Praxis II social studies test or the Teacher Certification Test maintained certification to teach economics. A teacher could have passed any of these tests with no specific training in economics whatsoever. Within the economics profession, researchers have generally concluded there is a need to improve the training of K–12 economic educators.

Much of the effort to offer in-service content training in economics comes from the NCEE. Consequently, much of the research focuses on its programs. As early as 1977, Thornton and Vredeveld noted a common assumption that students of teachers who had received in-service education benefit from their teachers' increased understanding of economics. Walstad and Watts (1985) found that most organizations involved in economic education make the general assumption that teacher training in economics has a direct impact on student output measured, for example, by test scores. By 1988, Baumol and Highsmith recognized that the issue is much more complicated than others had previously stated. There has been a considerable need to determine the major influences on the effectiveness of teaching programs in high school economics. Like Angrist and Lavy (2001) and Jacob and Lefgren (2004), researchers in economic education have struggled with the appropriate design of experiments to isolate the effect of in-service training.

Highsmith (1974), one of the earliest empirical researchers to examine in-service economic education for teachers, designed his experiment by comparing results on the Test of Economic Understanding between students of teachers who had attended training workshops and students of teachers who had not. Thornton and



Vredeveld (1977) used a similar experimental design. Schober (1984) refined the previous approaches by adding a pre-test to his experiment. This allowed him to control for pre-existing knowledge, which other studies had ignored. Walstad (1979) introduced the notion that workshops and other in-service training programs may do more than provide information for teachers: they may change teachers' attitudes toward economics. Therefore, he addressed a simultaneity issue of both workshops and teacher attitudes affecting student achievement. This innovation became important as other researchers started to look at simultaneity and selection issues within educational production functions. For example, Grimes (1995) addressed the problem of student attrition in pre-test and post-test experiments as a self-selection issue. Bosshardt and Watts (1990, 1994) and Watts and Bosshardt (1991) introduced the use of fixed-effects models to address the problem of unobserved characteristics in either schools or teachers.

With each innovation in the literature, researchers have come to a better understanding of what affects student learning. While Angrist and Lavy (2001) found a very large impact of in-service training (25% of a standard deviation), within the context of economics in-service education for teachers, other researchers find less dramatic results. Much of the research has found positive impacts of in-service training. While it is difficult to directly compare many of the results, findings are generally in the range of 1–10% of a standard deviation of the measure of student achievement.

A few critical issues, however, remain. There is no indication that any of the studies to date involve a high-stakes test. Therefore, students may have insufficient incentive to demonstrate their level of learning. Furthermore, most of the studies are fairly small scale. One of the aspects of Jacob and Lefgren's (2004) research that makes it stand out is its use of a large sample size.1 Their focus on Chicago's low-performing schools, however, lacks variability in many of the characteristics that interest researchers.

In our study, we examine the test scores of students who took the Economics EOCT in the State of Georgia. We compare the scores of students whose teachers have undergone training at GCEE workshops with those of students whose teachers have not. One advantage of our approach is that we observe all Georgia students who take the EOCT. This provides a large and diverse sample. A second advantage of our data is that, because the EOCT is a high-stakes test, students have an incentive to demonstrate their true understanding of the material.

Each test score is linked to the characteristics of the student taking the test, which allows us to control for student characteristics thought to affect test performance, such as gender, ethnicity, and economic status. We employ a fixed-effects technique to control for unobservable differences in teacher characteristics while we examine the impact of teachers' attendance at GCEE workshops. We also use normal curve equivalent (NCE) transformations of our dependent variable to allow us to compare test scores across schools and over time. Finally, we observe the timing of teacher attendance at GCEE workshops. Since each student test score is linked to a teacher of record, we can identify our treatment group: teachers who attended GCEE workshops before their students took the EOCT.

    3. Data

Our data-set consists of all Georgia students who took the Economics EOCT over a five-semester period: spring 2004, fall 2004, spring 2005, fall 2005, and spring 2006.2 All Georgia students must take a comprehensive economics examination as part of their required economics course. Since spring 2004, the test has counted for at least 15% of each student's grade, and therefore it qualifies as a high-stakes test. The universe consists of 174,601



student test scores within this period. Each student test score is matched to a classroom teacher. Data come from two primary sources: test data and student data come from the GaDOE; teacher workshop attendance data come from the GCEE.

The Georgia EOCT in economics is a 90-question multiple-choice standardized test of economic knowledge.3 The GaDOE standardizes the test scores on a 200- to 750-point scale. The GaDOE also converts the test scores into three categories: P1, does not meet standard (scores lower than 400); P2, meets standard (scores from 400 to 450); and P3, exceeds standard (scores above 450). The EOCT covers five content domains: Fundamental Economic Concepts; Producers and Consumers; Microeconomics: Elements in the Marketplace; Macroeconomics: the National Economy; and the International Economy. Georgia does not use the economics EOCT for accountability purposes under the US No Child Left Behind law.
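The three performance bands map directly onto a simple scoring rule; a minimal sketch in Python (the function name and the range check are ours, not the GaDOE's):

```python
def eoct_band(scaled_score: int) -> str:
    """Map a Georgia EOCT scaled score (200-750) to its GaDOE
    performance band, as described in the text."""
    if not 200 <= scaled_score <= 750:
        raise ValueError("EOCT scaled scores range from 200 to 750")
    if scaled_score < 400:
        return "P1: does not meet standard"
    if scaled_score <= 450:
        return "P2: meets standard"
    return "P3: exceeds standard"
```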

The data allow us to include demographic information for each student, including gender, race, economic status, and disability status, as defined by the GaDOE. Gender is represented by an indicator variable that equals one for male and zero for female. Race is a vector of four indicator variables that represent Black, Asian, Hispanic, and Other, respectively; White, non-Hispanic is the omitted comparison group. Economic status is an indicator variable that equals one if the student is categorized by the GaDOE as Economically Disadvantaged (defined as eligible for a free or reduced-price lunch). Similarly, disability status is an indicator variable that equals one if the GaDOE defines the student as a Student with Disabilities, which covers a broad range of disabilities. To address the observation that average test scores rose over the three years the test has been offered, we normalize the scores using the normal curve equivalent (NCE) technique devised by the US Department of Education to compare scores over time (and between different tests).4 Each student is matched to a teacher of record. We then match each teacher to a history of GCEE workshops. Our control group comprises students whose teachers have not attended a GCEE workshop in the past five years. Our initial treatment group comprises students whose teachers have attended at least one GCEE workshop.
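The paper does not reproduce the exact NCE procedure, but the standard construction maps each score's percentile rank through the inverse normal CDF and rescales to mean 50 and standard deviation 21.06; a sketch under that assumption (stdlib only):

```python
from bisect import bisect_left, bisect_right
from statistics import NormalDist

def to_nce(scores):
    """Sketch of a normal curve equivalent (NCE) transformation:
    each score's mid-rank percentile is pushed through the inverse
    normal CDF and rescaled to mean 50, standard deviation 21.06."""
    n = len(scores)
    ordered = sorted(scores)
    nce = []
    for x in scores:
        below = bisect_left(ordered, x)
        at_or_below = bisect_right(ordered, x)
        pct = (below + at_or_below) / (2 * n)   # mid-rank percentile in (0, 1)
        pct = min(max(pct, 0.01), 0.99)         # NCE is bounded to roughly 1-99
        nce.append(50 + 21.06 * NormalDist().inv_cdf(pct))
    return nce
```

Because the percentile rank is computed within each administration, the transformation removes the upward drift in raw scores across test dates, which is the property the authors rely on.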

We examine 37 GCEE workshops that occurred from 2000 to the end of 2005. Most of the workshops are offered multiple times per year. In each year, about 19% of all teachers of economics in Georgia's high schools had attended at least one of the workshops under study. Of the teachers who had attended workshops, many had attended more than one. The GCEE raises funds from private sources, typically foundations and the business community, to provide these workshops at no cost to schools and teachers. Not only does the GCEE pay for the one or two days of instruction and materials provided for the teachers, it also reimburses travel expenditures and pays for substitutes so that teachers can attend workshops during the school year. The GCEE budgets roughly $700,000 per year to provide all of its workshops. Georgia, however, does not require its teachers to attend GCEE workshops.

Figure 1 summarizes attendance at GCEE workshops with economic content and the percentage of Georgia students directly affected by the workshops. (For a list of workshops, see Appendix 1.) In addition, the raw correlation between teacher workshop attendance and student EOCT scores is 0.093.

We examined the performance of students who take Advanced Placement (AP) classes in economics and compared their performance with that of all other students. We find, as expected, that the AP students do much better than their counterparts. Almost all of the AP teachers, however, take the AP economics workshops offered by the GCEE. Therefore, we limit our sample to non-AP students to address the potential upward bias that the inclusion of AP students might have on determining the effectiveness of GCEE workshops. The elimination of AP students leaves us with 166,836 student test observations.



One shortcoming of our data is that we do not pre-test students on their knowledge of economics before they take their mandatory economics class. Most studies of teacher effects on learning examine the change in student test scores over a defined length of time. Nevertheless, because we are looking at such a large population with large variation in characteristics, and because the EOCT is a high-stakes test, we believe that the EOCT score will provide useful information about teacher effects on student outcomes. The nature and coverage of the EOCT are our greatest advantage. Summary statistics of the data we use appear in Appendix 2.

    4. Model and results

Although we observe student characteristics, we cannot observe all of the teacher characteristics that might affect student performance. Therefore, we use a fixed-effects model to allow the intercept term to shift for different teachers. Similarly, we know that the trend in test scores over the three-year period we examine is slightly upward. Therefore, we use NCE transformations of EOCT scores. The transformation allows us to compare test scores across time while controlling for any underlying time trend.

In our model (Equation (1)), we specify a fixed-effects linear regression of student NCE EOCT scores conditioned on student characteristics, including gender, race, economic status, and disability status, and the teacher's workshop experience over the previous five years:

Students are indexed by a subscript i and teachers by a subscript t. Our observations of test scores are for the years 2004–2006. We have suppressed a time subscript.

The intercept term, α, has the standard interpretation, and τ_t is the teacher fixed-effects parameter that captures the difference in unobserved teacher characteristics. The variables that are the central focus of this paper are the workshop variables: numwork,

EOCTScore_it = α + τ_t + Σ_{j=1}^{7} β_j Student_jit + β_8 numwork_t + β_9 numwork2_t + ε_it        (1)

Figure 1. Workshop reach. (Bar chart: percent of teachers and percent of students, by number of GCEE workshops attended: none, 1 or more, 2 or more, 3 or more, 4 or more, 5 or more.)



represents the number of workshops a teacher has attended since the year 2000, while numwork2 is the square of numwork, allowing for a non-linear return to workshops. The variables numwork and numwork2 count only workshops taken before the academic year began. That is, if a teacher takes a workshop during academic year 2005/06, for example, or in a later year, then that workshop is not counted in these variables for tests taken in 2005/06.
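The timing rule can be made concrete; a minimal sketch (all names are ours), assuming workshop dates are available at calendar-year resolution:

```python
def workshop_vars(workshop_years, test_start_year):
    """Build numwork and numwork2 for one teacher-year, counting only
    workshops taken before the academic year of the test began.

    workshop_years: calendar years of the teacher's GCEE workshops.
    test_start_year: e.g. 2005 for tests in academic year 2005/06.
    (Approximation: a workshop in the test's start calendar year is
    treated as 'during' that academic year, so it is not counted.)"""
    numwork = sum(1 for y in workshop_years if y < test_start_year)
    return numwork, numwork ** 2
```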

The results of the regression represented in Equation (1) are presented in Table 1. All of the regressors representing student characteristics have strongly significant effects on student outcomes. Furthermore, all of the coefficients on student characteristics have the expected sign. The model explains roughly 36% of the variation in the dependent variable. Finally, the F-statistic allows us to reject the null hypothesis of equal intercept terms (no teacher fixed effects).

Coefficients can be interpreted relative to the standard deviation of the dependent variable. For example, being economically disadvantaged (Poor) reduces the expected scaled score by 20% of a standard deviation of the NCE score.5 Of central interest is the effect teacher attendance at workshops has on student test scores. The linear return to workshops (numwork) is negative, small, and statistically insignificant. The square of the number of workshops taken is positive and statistically significant. The results indicate that taking one or two workshops has essentially no effect on test scores (less than 1% of a standard deviation). However, taking three or more workshops has substantial effects. For example, the results suggest that a teacher who has taken three workshops prior to the academic year will produce student test score gains of 1.9% of a standard deviation, all else equal.6
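The threshold at three workshops follows from the quadratic specification: the implied net effect of k prior workshops is the numwork coefficient times k plus the numwork2 coefficient times k². A sketch using the Table 1 point estimates (values are in NCE points; converting them to percentages of a standard deviation depends on the in-sample standard deviation, which we do not reproduce here):

```python
def workshop_effect(k, b_lin=-0.07, b_sq=0.04):
    """Implied change in a student's NCE score when the teacher has
    attended k prior workshops, from the Table 1 point estimates."""
    return b_lin * k + b_sq * k ** 2

# The quadratic crosses zero at k = 0.07 / 0.04 = 1.75, so one or two
# workshops have essentially no effect while three or more do.
effects = {k: round(workshop_effect(k), 2) for k in range(6)}
```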

One potential shortcoming of our data selection process is that we do not know what motivates teachers to attend workshops. This may introduce a selection bias into our model. A priori, we do not know whether any bias would push the coefficients that measure the impact of workshop attendance (numwork and numwork2) upward or downward. On the one hand, conscientious teachers may seek out programs that offer material for the classroom and support teaching efforts. On the other hand, principals may assign underperforming teachers to attend workshops in an attempt to improve their performance. One way to help mitigate this issue is to estimate our model with teacher fixed effects. By controlling for time-invariant characteristics of teachers we can in part control for the characteristics

Table 1. Equation (1) regression results (unrestricted sample, observations = 166,836).

                          Coefficient    t-Statistic
    Constant                 54.80         384.32
    Poor                     −4.20         −39.28
    Male                      3.75          44.79
    Black                   −10.70         −91.07
    Asian                     2.25           8.56
    Hispanic (non-white)     −9.02         −40.90
    Other                    −3.84         −11.04
    Disabled                −11.95         −58.50
    Numwork                  −0.07          −0.37
    Numwork2                  0.04           2.66

    Adjusted R2 = 0.3629
    F(8, 164,699) = 2154.55



    that lead teachers to attend or not attend workshops, assuming these characteristics do notchange during the observed time frame.7
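The fixed-effects strategy described above is typically implemented with the within transformation: demean the outcome and regressors inside each teacher before running pooled OLS, which removes any time-invariant teacher characteristic. A self-contained sketch on synthetic data (NumPy only; the function, variable names, and simulated numbers are ours, not the paper's):

```python
import numpy as np

def within_ols(y, X, teacher_ids):
    """Teacher fixed-effects OLS via the within transformation:
    subtract each teacher's mean from y and X, then solve pooled OLS."""
    y = np.asarray(y, dtype=float)
    X = np.asarray(X, dtype=float)
    yd, Xd = y.copy(), X.copy()
    for t in np.unique(teacher_ids):
        rows = teacher_ids == t
        yd[rows] -= y[rows].mean()
        Xd[rows] -= X[rows].mean(axis=0)
    beta, *_ = np.linalg.lstsq(Xd, yd, rcond=None)
    return beta

# Synthetic check: teacher intercepts differ, true slope is 2.0.
rng = np.random.default_rng(0)
teacher_ids = np.repeat(np.arange(5), 20)       # 5 teachers, 20 students each
x = rng.normal(size=100)
y = 2.0 * x + 3.0 * teacher_ids + rng.normal(scale=0.1, size=100)
slope = within_ols(y, x[:, None], teacher_ids)[0]
```

The slope estimate recovers the true value even though each teacher has a different intercept, illustrating why time-invariant selection into workshops is absorbed by the fixed effects.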

Our results suggest that, all else equal, these in-service workshops for high school economics teachers improve student achievement by almost 2% of a standard deviation in test scores once the teacher has attended three workshops. Other studies of the impact of in-service workshops for economics teachers suggest larger effects. However, as discussed in our literature review, those studies tend to be small-scale analyses that do not attempt to control for any selection bias. The Angrist and Lavy (2001) study of in-service training for Jerusalem teachers finds much larger effects (25% of a standard deviation). However, the treatment given to the teachers in their study was much more intensive and costly than the workshops analyzed in our paper. Specifically, the in-service training provided to the Jerusalem teachers included a semester-long content course and weekly in-house training on teaching methods. In addition, the students of these teachers were provided extra tutoring outside the school day. Angrist and Lavy (2001) estimate that this treatment cost $12,000 per classroom.

In contrast, GCEE workshops cost about $80 per workshop day per teacher. That is, a one-day workshop costs the GCEE about $80 per teacher, and a two-day workshop costs about $160 per teacher. These costs do not include the travel, meals, and lodging costs for teachers that are also paid by the GCEE. Getting an increase of almost 2% of a standard deviation in test scores for such a low cost compares extremely favorably with the cost-effectiveness of other treatments such as reducing class sizes. Krueger (2002) performs a cost-benefit analysis of the class-size reductions that were part of the Tennessee Student/Teacher Achievement Ratio experiment. He reports that, depending on how one views the results of this experiment, reducing class sizes from 22 to 15 students in grades K–3 increases student achievement by 0.2 or 0.1 standard deviations. Krueger also reports that reducing class sizes by seven students costs approximately $3500 per student per year in 1997/98. Given these large costs to reduce class sizes for a benefit of 0.1 or 0.2 standard deviations, our results suggest that the economics workshops considered in this paper are cost effective relative to reducing class sizes by this magnitude.
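The comparison can be made explicit with back-of-envelope arithmetic. The per-teacher and per-student dollar figures come from the text; the annual class load is our own illustrative assumption, and travel, substitute, and lodging costs are excluded, so this is a rough sketch rather than the paper's calculation:

```python
# Figures stated in the text.
cost_per_workshop_day = 80        # dollars per teacher per workshop day (GCEE)
class_size_cost = 3500            # dollars per student per year (Krueger 2002)

# Assumption (ours, for illustration only): a Georgia economics
# teacher teaches roughly 100 students per year.
students_per_teacher = 100

three_workshops = 3 * cost_per_workshop_day            # $240 per teacher
per_student = three_workshops / students_per_teacher   # $2.40 per student

# Even before weighting by effect size, the per-student outlays differ
# by roughly three orders of magnitude.
cost_ratio = class_size_cost / per_student
```

The effect sizes differ too (about 2% versus 10-20% of a standard deviation), so the ratio overstates the advantage per unit of achievement; even so, the gap is large enough to support the cost-effectiveness claim.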

    5. Conclusions

Many researchers have documented that high school economics teachers do not have significant content knowledge from college-level economics courses. To meet this gap, the NCEE and its state affiliates, such as the Georgia Council on Economic Education, and university-based centers have attempted to fill the void by providing content workshops to in-service economics teachers. In this study, we use new data for all Georgia high school economics students to analyze whether student performance on a high-stakes economics examination is higher when teachers have attended a GCEE-sponsored workshop.

    The primary finding of our research is that GCEE workshops have a positive impact on students' performance on the Georgia economics EOCT once the teacher has attended at least three workshops. With this finding in mind, it is notable that more than two-thirds of the non-AP students in Georgia have economics teachers who have not taken any GCEE workshops within the past five years. Further, the GCEE spends only about $700,000 on this teacher training. Our results suggest that these workshops improve student achievement at an extremely low cost. We remind the reader, however, that these results might not extrapolate to other disciplines because of the general lack of pre-service economics training for teachers.


  • Education Economics 403

    Acknowledgements

    This research was partially funded by an Excellence in Economic Education subgrant from the National Council on Economic Education through funding from the United States Department of Education, Office of Innovation. The authors thank the National Council on Economic Education and the Georgia Council on Economic Education for their financial support. They also thank the Georgia Department of Education for its willingness to share its data and human resources. Finally, the authors appreciate the insightful comments of two anonymous referees.

    Notes

    1. Their sample consists of 100,288 third-grade through sixth-grade students.

    2. Prior to spring 2004, the economics EOCT was administered to students for purposes of standard setting and was not a high-stakes examination. Georgia also administers EOCTs in seven other high school subjects.

    3. Of the 90 questions, 75 count toward the student's score. Each test also field tests 15 questions that do not count toward the student's score. After norming, 99% of the test scores fall within the range of 1–99. One percent of the scores fall in the extreme tails of the distribution.

    4. NCE score = 21.06 × z-score + 50. This positive monotonic transformation of the scale scores of the EOCT allows comparison of examinations across time. Our results are not impacted in any meaningful way by this transformation. The transformation, however, aids in the interpretation of the results.

    5. As shown in Table 1, the estimated coefficient on Poor is −4.20. The standard deviation of the EOCT score is 20.66, as seen in Appendix 2. Since 4.20 divided by 20.66 is equal to 0.20, we report that being economically disadvantaged reduces expected test scores by 20% of a standard deviation.

    6. Since so few of the teachers in our data attend more than three workshops, we do not feel comfortable making strong statements about the effects of attending four or more workshops. In addition, we estimated our equation with a series of dummy variables indicating how many workshops were attended. All of these variables had positive estimated coefficients, but some were not statistically significant, in particular the dummy variables for two and four workshops, respectively. This lack of significance could be due to small cell sizes.

    7. In an earlier version of this paper we reported results that excluded teachers who had workshop experience prior to appearing in our data. In those results, we found an effect of workshops about 50% larger than what is reported here. We believe the results reported here do a better job of controlling for unobserved selection into workshops, and we thank an anonymous referee for the suggestion.
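The calculations in notes 4 and 5 can be sketched in a few lines. This is a minimal illustration using only the figures quoted in the notes; the function names are ours, not the authors':

```python
# Sketch of the calculations in notes 4 and 5, using the figures quoted there.

def nce(z_score):
    """Normal curve equivalent transformation (note 4): NCE = 21.06 * z + 50."""
    return 21.06 * z_score + 50

def effect_in_sd(coefficient, score_sd):
    """Express a regression coefficient as a fraction of the score SD (note 5)."""
    return coefficient / score_sd

# A student at the mean (z = 0) receives an NCE score of 50.
print(nce(0.0))  # 50.0

# The coefficient on Poor (-4.20) against the EOCT SD (20.66) is ~-20% of an SD.
print(round(effect_in_sd(-4.20, 20.66), 2))  # -0.2
```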

    References

    Angrist, J.D., and V. Lavy. 2001. Does teacher training affect pupil learning? Evidence from matched comparisons in Jerusalem public schools. Journal of Labor Economics 19, no. 2: 343–69.

    Baumol, W.J., and R.J. Highsmith. 1988. Variables affecting success in economic education: Preliminary findings from a new data base. American Economic Review 78, no. 2: 257–62.

    Bosshardt, W., and M. Watts. 1990. Instructor effects and their determinants in precollege economic education. Journal of Economic Education 21, no. 3: 265–76.

    Bosshardt, W., and M. Watts. 1994. Instructor effects in economics in elementary and junior high schools. Journal of Economic Education 25, no. 3: 195–211.

    Clotfelter, C.T., H.F. Ladd, and J.L. Vigdor. 2007. Teacher credentials and student achievement in high school: A cross-subject analysis with student fixed effects. Duke University Working Paper, Sanford Institute of Public Policy, Durham, NC.

    Georgia Department of Education. 2007. Standards, instruction and assessment testing: End of course tests (EOCT). http://public.doe.k12.ga.us/ci_testing.aspx?PageReq=CI_TESTING_EOCT (accessed August 21, 2007).

    Grimes, P.W. 1995. Economic education for at-risk students: An evaluation of Choices & Changes. The American Economist 39, no. 1: 71–83.

    Highsmith, R.J. 1974. A study to measure the impact of in-service institutes on the students of teachers who have participated. Journal of Economic Education 5, no. 2: 77–81.



    Jacob, B.A., and L. Lefgren. 2004. The impact of teacher training on student achievement: Quasi-experimental evidence from school reform efforts in Chicago. Journal of Human Resources 39, no. 1: 50–79.

    Kennedy, M.M. 1998. Form and substance in in-service teacher education. Research report from the National Institute for Science Education, University of Wisconsin, Madison, WI.

    Krueger, A.B. 2002. Understanding the magnitude and effect of class size on student achievement. In The class size debate, ed. L. Mishel and R. Rothstein, 7–35. Washington, DC: Economic Policy Institute.

    Mackey, J.A., A.D. Glenn, and D.R. Lewis. 1977. Improving teacher training for precollege economic education. Journal of Economic Education 8, no. 2: 118–23.

    National Commission on Teaching and America's Future. 1996. What matters most: Teaching for America's future. www.nctaf.org

    Rivkin, S.G., E.A. Hanushek, and J.F. Kain. 2005. Teachers, schools, and academic achievement. Econometrica 73, no. 2: 417–58.

    Schober, H.M. 1984. The effects of inservice training on participating teachers and students in their economics classes. Journal of Economic Education 15, no. 4: 282–95.

    Thorton, D.L., and G.M. Vredeveld. 1977. In-service education and its effects on secondary students: A new approach. Journal of Economic Education 8, no. 2: 93–9.

    Walstad, W.B. 1979. Effectiveness of a USMES in-service economic education program of elementary school teachers. Journal of Economic Education 11, no. 1: 1–12.

    Walstad, W.B. 2001. Economic education in U.S. high schools. Journal of Economic Perspectives 15, no. 3: 195–210.

    Walstad, W.B., and M. Watts. 1985. Teaching economics in the schools: A review of survey findings. Journal of Economic Education 16, no. 2: 135–46.

    Watts, M., and W. Bosshardt. 1991. How instructors make a difference: Panel data estimates from principles of economics courses. Review of Economics and Statistics 73, no. 2: 336–40.

    Weaver, A.M., W.L. Deaton, and S.A. Reach. 1987. The effects of economic education summer institutes for teachers on the achievement of their students. Journal of Educational Research 80, no. 5: 296–300.



    Appendix 1. Workshop names

    AP Economics
    American History
    Atlanta City Workshop
    Choices & Changes
    Commodity Challenge
    Consumer Economics
    Demise of the Soviet Union
    East Meets West
    Economic Education for Teachers
    Economic Systems Workshop
    Economics at Work
    Economics and the Environment
    Economics in American History
    Economics in Social Studies
    Economics in Transition
    Economics of Food Supply
    Economics U$A
    European Union
    Fed/GSU High School Workshop
    Fed ICA Conference
    Fed Workshop
    Financial Fitness
    Give & Take
    Great Economic Mysteries
    Heart of Georgia RESA Workshop
    High School Economics In-Service
    International Trade
    International Workshop
    Introduction to GCEE
    Monetary Policy
    Online Course
    School to Work
    Stock Market Game
    Tax Whys
    Virtual Economics

    Appendix 2. Data summary statistics

    Obs = 166,836

    Variable         Mean     Standard deviation   Minimum   Maximum
    EOCT NCE score   49.92    20.66                0         100
    Numwork          0.8115   1.8050               0         20
    Numwork2         3.9167   19.9734              0         400
    Poor             0.3229   0.4676               0         1
    Disabled         0.0712   0.2572               0         1
    Male             0.4797   0.4996               0         1
    Black            0.3740   0.4839               0         1
    Asian            0.0280   0.1649               0         1
    Hispanic         0.0433   0.2036               0         1
    Other            0.0148   0.1207               0         1
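The quadratic term in Appendix 2 appears to be the square of Numwork (its maximum of 400 equals 20², and its summary statistics are consistent with squaring). As a minimal sketch of how such a regressor would be constructed (the data values here are illustrative, not from the study):

```python
# Numwork2 appears to be Numwork squared, allowing a nonlinear (quadratic)
# effect of workshop attendance in the regression. Values below are made up
# for illustration; only the variable names follow Appendix 2.

numwork = [0, 1, 3, 20]              # workshops attended, up to the observed max of 20
numwork2 = [n ** 2 for n in numwork]  # squared term for the quadratic specification

print(numwork2)  # [0, 1, 9, 400]; 400 matches the appendix maximum for Numwork2
```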

    Dow

    nloa

    ded

    by [

    Uni

    vers

    ity o

    f W

    ater

    loo]

    at 0

    8:59

    28

    Nov

    embe

    r 20

    14
