Did firms cut nominal wages in a deflationary environment? Micro-level evidence from the late 19th and early 20th century banking industry
|Article code|Publication year|English article|Persian translation|Word count|
|---|---|---|---|---|
|17885|2010|14-page PDF|Available to order|Not calculated|
Publisher : Elsevier - Science Direct
Journal : Explorations in Economic History, Volume 47, Issue 1, January 2010, Pages 112–125
This paper examines wage adjustment in the late 19th and early 20th centuries using personnel records from the Union Bank of Australia and Williams Deacon’s Bank (England). During the period of this study there was steep and prolonged deflation. Firm-specific and industry-specific demand shocks also put downward pressure on wages. Although it was common for individual wages at the banks to remain unchanged from year to year, wage cuts were very rare, even for senior workers. Turnover at both banks was extremely low; thus, despite flexibility in the wages of incoming workers, new hiring did not offset the effects of individual-level wage rigidity. Consequently, real wages moved counter-cyclically.
One of the most fundamental principles of neoclassical economics is that markets adjust swiftly and fully to shocks in supply and demand. The behavior of wages in post-World War II labor markets thus presents a puzzle, as studies across a range of countries have consistently found downward nominal wage rigidity.1 These studies show that wages frequently remain unchanged from year to year, and that relatively few workers receive pay cuts even in years with low or negative inflation.2 However, there is far less evidence on the prevalence of wage cuts in the late 19th and early 20th centuries, when prolonged periods of deflation created stronger downward pressure on nominal wages than has occurred in more recent years.

New Keynesian economics has offered several theoretical models to explain the absence of wage cuts. These models assume that workers have deep-rooted preferences concerning wage cuts, which they believe to be “unfair”.3 The insider–outsider model posits that it is costly for firms to replace their incumbent employees, for example because of firm-specific knowledge (Lindbeck and Snower, 1988). The existence of replacement costs enables incumbent workers to extract rents and to bargain for protection from wage cuts and for employment rules such as “last hired, first fired”. Firms adjust to shocks by laying off younger workers or by hiring new workers at lower salaries than incumbents. Efficiency wage models posit that workers react to wage cuts that they believe to be unfair by leaving or withholding effort (Akerlof and Yellen, 1990). This may reduce output by more than wages and thus increase unit costs, so profit-maximizing firms choose to leave nominal wages unchanged following shocks. Summers (1988) argues that even if workers have preferences only over real wages, nominal rigidities may still result if workers have imperfect information about prices or outside labor markets.
Recent work by Gordon (1982) and Hanes and James (2003) argues that rather than stemming from fundamental preferences, the infrequency of wage cuts since the Second World War may simply be a social norm that has evolved in a period of relatively high inflation. According to this view, employers are willing to leave nominal wages unchanged following productivity decreases because inflation enables real wages to adjust. However, unlike fundamental preferences, social norms are fairly flexible. A norm of no wage cuts would not develop or survive in a deflationary environment, where it has considerable costs. Hall (1980) offers another explanation for the absence of wage cuts in an inflationary environment based on implicit contracts. In the presence of long-term employment relationships, wages may simply be “instalment payments on a long-term debt”, rather than reflecting productivity in the current period. Thus with a single deflationary year, one might expect relatively little change in wages; however, after an extended period of deflation, one would expect the value of the long-term debt and thus wages to decrease. Menu cost models, which assume that there are small transactions costs to adjusting wages, offer an explanation for the infrequency of wage adjustments and the clustering of increments at exactly zero (Hanes, 2000). If a change in nominal productivity is smaller than the cost of wage adjustment, a profit-maximizing firm will leave wages unchanged. There will be pressure to cut nominal wages only if there is a decrease in real productivity that is larger than the sum of the inflation rate and the adjustment cost. 
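The menu-cost decision rule described above can be expressed as a small numerical sketch. All variable names and magnitudes below are illustrative assumptions, not figures from the paper:

```python
def wage_decision(real_productivity_change, inflation_rate, adjustment_cost):
    """Stylized menu-cost rule (illustrative; names and values are assumptions).

    The change in nominal productivity is the real change plus inflation.
    The firm adjusts the wage only when the size of the nominal change
    exceeds the menu (adjustment) cost; otherwise it leaves wages unchanged.
    """
    nominal_change = real_productivity_change + inflation_rate
    if abs(nominal_change) <= adjustment_cost:
        return "leave unchanged"  # change too small to justify the menu cost
    return "raise" if nominal_change > 0 else "cut"

# A real productivity fall smaller than inflation plus the menu cost
# leaves the nominal wage unchanged:
print(wage_decision(-0.01, 0.02, 0.015))  # -> "leave unchanged"
# Only a real fall larger than inflation plus the cost produces a cut:
print(wage_decision(-0.05, 0.02, 0.015))  # -> "cut"
```

The second call matches the condition in the text: a nominal cut occurs only when the decrease in real productivity exceeds the sum of the inflation rate and the adjustment cost.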
This paper examines wage adjustment in the banking industry in the late 19th and early 20th centuries using records from the Union Bank of Australia (UBA) and Williams Deacon’s Bank, England (WDB).4 The records cover annual salaries of all male staff at the UBA between 1887 and 1900 and at WDB between 1890 and 1936.5 The late 19th and early 20th century banking industry provides in many ways an ideal laboratory for testing hypotheses concerning wage adjustment. Banks possessed many characteristics of modern employers, but during the period of this study they faced fewer institutional barriers to cutting wages and much stronger downward pressure on wages than has been the case for the post-Second World War firms that have been the subject of previous studies. Both the UBA and WDB had well-developed internal labor markets with internal promotion, long-term relationships with their employees, and implicit rules governing wages (Seltzer and Merrett, 2000; Seltzer and Simons, 2001; Seltzer and Frank, 2007). Both had wage scales covering the early years of tenure, although these were not enforceable in court and contained provisions allowing wages to be off scale at the discretion of senior management. Neither the UBA nor WDB faced much government regulation of employment or a strong union presence over the period of this study. Both faced prolonged deflation: the UBA in the 1890s and WDB in the 1890s and between 1921 and 1933. Finally, industry-specific shocks in both countries created pressure to reduce wages. In the early 1890s the Australian banking system experienced one of the worst crises ever to occur in a branch banking country. After the First World War the British banking industry experienced a sharp increase in the supply of clerical labor, as barriers to hiring women began to disappear and the banks began hiring large numbers of women for the first time. The main issue addressed in this paper is the nature of wage adjustment in the banking industry.
The combination of strong downward pressure on wages and a lack of formal barriers to wage cuts suggests that wages should have been fairly flexible. However, this was mitigated by strong internal labor markets and the high skill level of the workforce.6 I address the question of the extent to which wages in the banking industry were rigid downward, beginning with some simple descriptive statistics. I then use multinomial logit regressions to analyze the characteristics of workers who took pay cuts and of those whose wages were left unchanged. I then examine the role of turnover in wage adjustment, focusing on whether new entrants were hired at lower wages than incumbent workers. Finally, I examine whether real wages were rigid downward at the individual and firm levels and how average wages responded to changes in the inflation rate. The results show that wage cuts were rare at both banks and that wage adjustment occurred in a manner very similar to what has frequently been described for the period since the Second World War. A very high proportion of nominal increments were exactly zero, and negative increments were uncommon, with the exception of an across-the-board 10 percent cut at the UBA in 1895. This lack of downward wage adjustment cannot be explained by underlying supply and demand factors, as the two banks faced much stronger pressure to cut wages than has been the case for firms in the post-war period. The probabilities of both zero increments and wage cuts increased with workers’ age and seniority, with industry-specific downturns, and (to a much lesser extent) with deflation. However, with the exception of the UBA in 1895, the probability of workers taking a pay cut following either an industry-specific or macroeconomic downturn was very low at both firms. Although wages of existing staff were rarely cut, the wages of new entrants were much more flexible, and the average annual entry wage for junior staff decreased about as frequently as it increased.
However, low rates of turnover in the industry meant that the flexibility of wages for new staff did not fully offset rigidity in the wages of continuing staff, and real wages at both firms moved counter-cyclically.
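The counter-cyclical real-wage mechanism described above is simple arithmetic: with nominal wages held fixed while the price level falls, real wages rise exactly when output is falling. A minimal sketch with made-up numbers (the salary and price path below are illustrative, not data from the banks):

```python
# Illustrative numbers only: a rigid nominal wage during a deflation.
nominal_wage = 200.0                        # hypothetical annual salary, held fixed
price_levels = [100.0, 96.0, 92.0, 90.0]    # a stylized falling price index

# Real wage = nominal wage deflated by the price index (base = 100).
real_wages = [round(nominal_wage / p * 100, 1) for p in price_levels]
print(real_wages)  # -> [200.0, 208.3, 217.4, 222.2]
```

Each step of deflation raises the real wage even though the nominal wage never changes, which is the counter-cyclical pattern the paper documents.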
Conclusion
This paper has examined wage adjustment and downward nominal wage rigidity in the late 19th and early 20th century banking industry using personnel data from the Union Bank of Australia and Williams Deacon’s Bank. The data have an important advantage over data used in previous studies of wage rigidity during this period, namely that observations are at the individual level. Both the UBA and WDB faced considerable macroeconomic, industry-specific, and firm-specific downward pressure on wages and few institutional impediments to wage cuts. The period of this study was characterized by strong and persistent deflation in both Australia and England. The UBA also faced a serious banking crisis in the early 1890s, which resulted in the temporary or permanent closure of most Australian banks. Williams Deacon’s faced a severe recession in its Lancashire manufacturing base in the 1920s. An increase in the labor supply due to the expansion of female employment beginning during the First World War put further pressure on male wages at the Bank. The main finding of this paper is that, despite strong downward pressure on wages, nominal wage cuts for existing staff were very rare at both banks. Even during years with severe deflation and for groups of workers who were unlikely to be accumulating human capital, nominal increments clustered at zero. Wage cuts were almost completely absent at WDB and rare at the UBA, except for an across-the-board cut in 1895. The determinants of unchanged wages were very similar to those of wage cuts, suggesting that leaving wages unchanged was a substitute for cutting wages. Even though wages of incumbent staff were rigid downward, new workers were often paid less than previous cohorts. However, low turnover rates at both banks meant that wage flexibility for new entrants did not offset wage rigidity for incumbents, and overall real wages moved counter-cyclically.