Real Wages of Individuals Under Obama, Trump, and Biden

There have been repeated assertions by Trump during the presidential campaign (as well as by Vance in the October 1 debate between the vice presidential candidates) that people’s wages were higher under Trump than they now are under Biden.  What has in fact happened?

The chart above shows how indices of the real wages of individuals have moved during the last two years of Obama’s presidency, the four years of Trump’s presidency, and Biden’s presidency through to August 2024 (the most recent data available as I write this).  There is much to note, but first a few words on the methodology.

The primary data comes from the “Wage Growth Tracker” website provided by staff at the Atlanta Fed.  It makes use of data generated as part of the Current Population Survey (CPS) of the Bureau of Labor Statistics.  From the way the survey is designed, they can obtain data on the wages earned by each household member at a point in time, and again for those same individuals twelve months later.  From this raw data, staff at the Atlanta Fed calculate for the individuals in the matched households how much their wages changed over those twelve months.  Since the CPS also collects information on the individuals themselves, they can also then determine what the average (as well as median) changes in wages were for individuals grouped by various characteristics, such as age, gender, race, education, occupation, and more.  The chart above shows both how real wages changed for workers as a whole, as well as the changes for wage-earners grouped by quartile of wage income, from the lowest to the highest.  The figures shown here are for the medians in each category.

The Atlanta Fed wage data goes back to December 1997, is presented in terms of the 12-month percentage changes, and is in nominal terms.  I converted the data to real terms based on the change over the same 12-month periods in the overall CPI (formally the CPI-U, produced by the BLS), converted this to an index number, and then rebased this to set January 2021 equal to 100.  The result is the chart at the top of this post.
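The conversion just described can be sketched in code.  This is a minimal illustration using made-up numbers, not the actual Atlanta Fed or BLS series; note also that building a monthly index from overlapping 12-month changes requires an approximation, and here each 12-month real growth factor is simply spread over one month via its twelfth root:

```python
def real_wage_index(nominal_growth_12m, cpi_inflation_12m, base_month):
    """Convert 12-month nominal wage growth and 12-month CPI inflation
    (both in percent, keyed by month in chronological order) into a real
    wage index rebased so that base_month = 100.

    Each month's overlapping 12-month real growth factor is spread over
    one month by taking its twelfth root -- an approximation."""
    index = {}
    level = 1.0
    for month, nominal in nominal_growth_12m.items():
        inflation = cpi_inflation_12m[month]
        # Real 12-month growth factor = nominal factor / inflation factor
        real_factor_12m = (1 + nominal / 100) / (1 + inflation / 100)
        level *= real_factor_12m ** (1 / 12)
        index[month] = level
    # Rebase so that the chosen base month equals 100
    base = index[base_month]
    return {m: 100 * v / base for m, v in index.items()}
```

When nominal wage growth exactly matches CPI inflation, the index stays flat at 100, as it should; when nominal growth exceeds inflation, the index rises.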

In interpreting these figures, it is critically important to recognize that they reflect what households actually experience in terms of the changes in their individual wages.  This differs from what one will normally see when reference is made to changes in mean (i.e. average) or median wages.  The figures in the chart track the experience of individuals, and individuals will normally see their wages start relatively low – when they are young and inexperienced – and then grow over time as they gain skill and experience.  That is the normal life cycle.

Statistics on wages as normally presented, in contrast, measure not the experience of individuals, but rather the movement in the overall mean or median wages of all those in the labor force at the time.  Changes in such wages will normally be less than what one observes for individual wages, as the labor force is dynamic, with young people entering (normally at relatively low wages) while older people retire and leave the labor force (normally at relatively high wages).  This will reduce the measured growth in average wages, as higher-wage workers have left while lower-wage workers have entered.  While this change in the average wage of all those employed at each point in time is a useful statistic to know, it does not reflect the lived experience of individuals, who normally see their wages grow over time (at least in nominal terms) as they gain experience and hence ability.

One sees a consequence of this in the chart above.  Those in the lowest quartile of the distribution of wage earnings have seen growth in the wages they earn as individuals that is greater than the percentage increases of those in the higher quartiles.  This is because those starting out in the labor force – and entering at relatively low wages – generally see a relatively fast rate of wage growth as they gain skills and are promoted.  This slows down over time, with older workers still receiving annual wage increases (in at least nominal terms) but not as large in percentage terms as young workers do.

Tracking the real wages of individuals is therefore of interest, but cannot then be used to track over long periods of time what has happened to average (or median) wages.  But for periods of several years, as well as for a comparison of growth in some early period to growth in a similar later period, tracking as in the chart above is of greater interest than what has happened to average or median wages of an always changing labor force with young workers entering and older workers leaving.  It is useful in comparisons of the growth in wages between presidential terms.

With this understanding, a number of points may be noted on individual wage growth in recent years:

a)  Individual wages in real terms were rising at a reasonable rate in the last few years of the Obama administration.  They then grew at a similar rate (not a faster rate) during the first three years of the Trump administration prior to the disruptions due to Covid.  In fact, the growth rate of overall individual wages (as measured at the medians) was 1.4% per annum in real terms during the final two years of the Obama administration (January 2015 to January 2017), and then the exact same 1.4% per annum in real terms during the first three years of the Trump administration (January 2017 to January 2020).

Trump has repeatedly claimed that wage growth (as well as many other things) was the highest ever during his administration, but that is not the case.  The most that Trump can rightfully claim is that he did not mess up the growth path that Obama had put the economy on after reversing the economic and financial collapse that began in 2008, in the last year of the Bush administration.

b)  With the onset of the Covid crisis in early 2020, individual real wages in fact rose despite the chaos of the lockdowns.  This might appear perverse, but in fact makes sense.  First of all, the rate of unemployment shot up to 14.8% – the highest it has been since the Great Depression (so Trump now owns this record).  But 85.2% remained employed, and were employed under often difficult personal circumstances given the easy spread of Covid and a lack of preparation by the Trump administration for the approaching pandemic.  (Trump instead repeatedly stated that all would be fine; that the virus would quickly disappear; and that banning flights from China had been a great success in stopping the virus.)

Those who remained in their jobs during this difficult period were often compensated well for their willingness to do so.  They received significant increases in their wages and/or bonuses.  The alternative of unemployment was also not as bad as it normally would be.  Aside from the safety aspect of protecting yourself from exposure to Covid, programs for the unemployed at the time were more generous and more easily available than they normally are, due to special legislation passed to address the exceptional circumstances of Covid.  Workers had this alternative, and firms had to respond.  Firms also received often generous support through various special programs during this period, that enabled them to pay higher wages to the employees who remained on the job.

Thus one sees in the chart above that individual real wages in fact rose in 2020, despite (or perhaps one should say because of) the Covid disruptions.

c)  The Covid disruptions continued into 2021 and the first half of 2022, while the special support programs for firms and the unemployed were scaled back to normal.  But supply chains had been radically disrupted globally due to the crisis, did not start to recover until vaccines became widely available, and then required time to catch up and normalize.  And while supply was constrained, demand rose more quickly starting in 2021 as shoppers returned.  This demand was especially high because of pent-up needs or desires for items not purchased in 2020 (whether due to the lockdowns or to caution given the easy spread of the disease), while personal savings were exceptionally high and could now be spent.  Savings (and bank accounts) were high due both to the lack of spending in 2020 and to the extremely generous financial support packages passed under both Trump and Biden.

Global supply chains then worked themselves out by mid-2022.  The rate of inflation had been relatively high before then due to the high demand confronting limited supply, but inflation as measured by the CPI index for all items other than shelter then fell dramatically from July 2022 once supply was no longer constraining.

This inflation was then reflected in the decline in real wages from early 2021 to the trough in June 2022, as seen in the chart above.  From January 2021 to June 2022 the overall individual real wage fell at a rate of 3.5% per annum.  But probably a more appropriate measure would be for the period from January 2020 (immediately before the Covid crisis) to June 2022.  Over this period, the overall individual real wage fell at a rate of 1.2% per annum.
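The per-annum rates quoted throughout this post follow from the index levels; a small sketch of the arithmetic (the trough index level used below is made up for illustration, not the actual series value):

```python
def annualized_rate(start_level, end_level, n_months):
    """Annualized percentage growth rate implied by two index levels
    observed n_months apart."""
    return 100 * ((end_level / start_level) ** (12 / n_months) - 1)

# An index falling from 100.0 to roughly 95.1 over the 17 months from
# January 2021 to June 2022 (hypothetical trough level) implies a
# decline of roughly 3.5% per annum:
rate = annualized_rate(100.0, 95.1, 17)
```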

d)  Once Covid and its related impacts were largely over in mid-2022, real wages immediately began to grow again.  And indeed, they have grown since then (at least through August 2024 – the most recent data available as I write this) at a rate of 2.5% per annum.  This is substantially faster than the pace at which they had grown under Trump (as well as under Obama before him), although this can be attributed in part to a recovery from the decline in the period ending in June 2022.

With this recovery, the overall individual real wage is now back on average to where it was in January 2021.  And the real wages of those in the lowest quartile and in the second quartile of the wage income distribution are now significantly higher than they have ever been.  But the levels as of August 2024 should not be seen as especially significant in themselves.  August is simply the most recent data available.  Rather, what is significant is the strong growth seen in real wages since June 2022, with no sign yet that that strong growth is abating.  Eventually that growth will likely return to the longer-term growth seen under Obama and then in the first three years of Trump, but it is not there yet.

e)  One should also note that all these figures are for the medians over a diverse population.  While the overall figures (whether measured at the means or the medians) have gone up and down, the actual real wages of any given individual can be quite different.  While the median individual real wage is now back to where it was in January 2021, this will not be true for everyone.  That diversity in experience needs to be recognized and acknowledged.

 

Biden inherited an economy that had suffered the sharpest downturn and highest unemployment since the Great Depression.  Managing the onset of the Covid pandemic in 2020 would have been difficult for even the most competent of administrations, but the Trump administration was far from the most competent.  The impacts of that crisis – on supply chains among other effects – continued into 2021 and the first half of 2022, and they led to falling real wages over this period.  But as supply chains normalized, real wages began to recover.  As of August 2024, overall individual real wages are back to where they were in January 2021.  But more importantly, those real wages have been growing at a rapid pace since mid-2022 and as yet show no sign of slowing down.

Incomes by Field of Study for College Graduates: The Distribution Matters More Than the Average

There are numerous articles and studies on the average earnings of college graduates broken down by what they majored in when in college.  Some provide estimates for a comprehensive list of college majors, while some focus on the earnings of the top 10 or bottom 10 college majors.  Some focus on earnings soon after college graduation, some on earnings at mid-career, and some on earnings over a lifetime.  And some use private sources of data while others use publicly available data provided by the Census Bureau or some other government agency.  See, as examples, here, here, here, here, here, here, here, here, here, here, here, here, here, here, here, here, here, or here.

Also, some report average earnings while some report median earnings (where 50% of the individuals earn more and 50% earn less).  But all focus on such a single measure of the earnings a graduate might expect had they chosen to major in a given field of study.

This misses a lot.  No one earns exactly the average or the median.  There is, rather, a range of earnings above and below those average (or median) figures, and as we see in the charts at the top of this post, it is a very wide range.  The distributions overlap each other a lot, and the peak share is not all that high.

In the examples presented above, the average wage and salary earnings of those who majored in Business ($97,100) are more than a third higher than the average earnings of those who majored in one of the fields in the Liberal Arts & Humanities ($71,600).  (The source of the data for these estimates is discussed in a note at the end of this post.)  But despite the average for Business majors being more than a third higher, the earnings at the top end (the top third, say) of those who majored in one of the Liberal Arts & Humanities fields were far higher than the earnings of those who were at the bottom end (the bottom third, say) for the Business majors.

That is, it is not just the averages that matter.  It is much more important whether your earnings will be towards the top end of a given field rather than closer to the bottom end of some alternative field, even if average earnings in that alternative field may be higher.  The choices students make on what field to study in college can and should take this into account.

Instead of deciding to major in some field where the numerous articles available advise you may earn a higher income on average, it is far more worthwhile and consequential to decide on a field of study based on where you believe you can do well:  a field you can fully master and where you perform exceptionally well, and thus where there is good reason to expect you will be able to do better than others in that area.  Indeed, you should only major in a field where you have a sound basis for the belief you can be at or close to the very top in that field.

This will likely also be a field that you personally enjoy.  Expertise in a field and personal enjoyment in it usually go hand-in-hand.  You enjoy working in areas that you are good at.  And due to the broad overlap in the distributions of earnings across the different fields, this will likely also lead to far higher earnings than majoring in a field that one is not terribly good at, even if the average earnings in that field are higher.

There is, or at least there should be, of course much more than earnings to consider.  But this is a further reason to choose a field based on what one is good at, as that will likely lead to a more fulfilling career as well.

The most difficult part of this is determining what field of study will, in fact, be one where you are relatively good compared to others in that field.  The only way to discover this is to try a range of possible fields and see where you do well.  The first two years of college should normally be such a time of exploration, where you take courses in a range of subjects and see where you excel and where you struggle.  You will also see what you enjoy and what you do not.  And one should not be surprised if you end up choosing a field you had not expected.  None of my friends in college ended up majoring in the field they thought they would when starting out as freshmen, nor did I.  There is absolutely nothing wrong with this.

The difficult part is recognizing where you can in fact perform near the top of a field and where you are fooling yourself.  Numerous studies have shown that people typically overestimate their skills relative to others.  For example, a now classic study from 1981 found that 93% of Americans in a sample believed they were more skillful car drivers than the median driver, and 88% believed they were safer drivers.  Of course, only 50% can be more skillful, or safer, than the median.  The study has more recently (in 2023) been replicated, while a meta-study from 2019 confirmed the general problem:  Most of us feel we are better than average in many domains.

So how can we keep from fooling ourselves?  While not easy, the grades received in the range of courses taken in the exploration stage of college should be an indication.  But for this, one needs to take challenging courses and not simply those where one can get a high grade without much in the way of effort or performance.  It also requires professors willing and able to give grades that are true indicators of performance, and not simply uniformly high grades to everyone as the easy path to follow.  While it may not feel like it at the time, the most valuable gift one can receive is a poor grade in a field you are considering majoring in but in which you are in fact not performing all that well.  It is far better to discover this while in college, rather than years later in a career in a field that you are not all that good at.

That is also why one should not pay much attention to an overall grade point average.  While grades in your senior year can demonstrate whether you have mastered the field, grades as a freshman or sophomore should reflect experimentation over a range of subject areas in those years, where some of those grades may be good and some not-so-good.

College should be a time of discovery – a time when you can explore many different fields and find out where you can in fact do well and where not.  It should be a time when you receive what I would term a true “education” by mastering in depth some particular field of study.  By the time of graduation one should have learned “how to think like a _____ ” (with the blank filled in based on the specific field).  That is, have you developed a deep and thorough understanding of the field?  Or have you basically only memorized some of its findings or conclusions, without a good understanding of how they were reached?  A career can be built on the former, but the latter soon dissipates.

 

=======================

Note on the Data:

The data for the charts come ultimately from the American Community Survey (ACS) of the US Census Bureau.  The ACS is an annual survey (with the most recently released data from 2022) based on a very large sample of about 2 million households interviewed each year.   While the Census Bureau provides easy access to tables based on the ACS for average figures, to obtain the full distribution of earnings – such as in the charts above – one needs to extract data from the Public Use Microdata Sample.  This provides, for a representative sample of the households surveyed, the full set of responses to the questionnaire at the level of the individual household.  One can then extract the wage and salary earnings of those individuals with a Bachelor’s degree who majored in a given field, and from this determine the full distribution of those earnings, not just the average.  I included all adults with a Bachelor’s degree at all age levels.
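The tabulation described above can be sketched as follows.  The field names and data here are placeholders, not actual PUMS records, and for simplicity the sketch ignores the person weights that a proper PUMS tabulation must apply:

```python
import statistics
from collections import defaultdict

def earnings_distribution(records, n_quantiles=10):
    """Given (field_of_degree, annual_wage_income) records for individuals
    with a Bachelor's degree, return the quantile cut points of earnings
    within each field -- the full distribution, not just the average."""
    by_field = defaultdict(list)
    for field, wage in records:
        by_field[field].append(wage)
    # statistics.quantiles returns n_quantiles - 1 cut points per field
    return {field: statistics.quantiles(wages, n=n_quantiles)
            for field, wages in by_field.items()}
```

With the cut points for each field in hand, one can compare, say, the 70th percentile of one field against the 30th percentile of another, which is exactly the overlap that a single average obscures.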

Operationally, I extracted the data via the IPUMS USA website (an institute based at the University of Minnesota – IPUMS is an acronym for “Integrated Public Use Microdata Series”).  IPUMS provides easy access to the ACS data through an online data analysis tool so no special statistical analysis software (such as SPSS or SAS) is required (although those are supported as well at the IPUMS site).  The IPUMS USA site includes data not only from the most recent ACS survey, but from each of the ACS surveys going back to 2000, and to the decennial US census going all the way back to 1790.  In addition, there are separate IPUMS sites for a range of other microdata files, both for the US and internationally.

Raising the Minimum Wage Has Not Led to Higher Unemployment: Evidence from California

A.  Introduction

California has aggressively increased its minimum wage since 2014, starting on July 1 of that year and then with increases on January 1 of each year from 2016 through to 2024.  Critics have argued that this would increase unemployment, saying that firms would no longer be willing to employ minimum-wage workers at the new, higher, minimum wage rates.  They argued that the productivity of these workers was simply too low.  If they were right, then one would have seen increases in the unemployment rate in the months following each of the steps up in the minimum wage.  But there is absolutely no evidence that this happened.

The chart at the top of this post shows this lack of a response graphically.  It may be a bit difficult to see, as showing the lack of a response is harder than showing the presence of one.  The chart will be discussed in more detail below, but briefly, it shows the averages, in each of the 12 months following the increases in the California minimum wage (including or excluding 2020 to 2022, as the Covid disruptions dominated in those years), of the change in the unemployment rate in California versus the change in the unemployment rate in the US as a whole.  The changes are defined relative to what the unemployment rates were in the month before the increase in the minimum wage – i.e. the comparison is normally to the rate in December, when the new minimum wage became effective on January 1.  The unemployment rate of course goes up and down depending on macro conditions (and was normally going down for most of this period), so to control for this the changes in the unemployment rate in California are measured relative to the changes in the US as a whole.

What was the result?  The chart shows that basically nothing happened.  If anything, what was most common was that the unemployment rate fell slightly in California relative to the rate in the US in the months following increases in the California minimum wage.  These changes were small, however, and are not really significant.  But what is clear and significant is that aggressive increases in the minimum wage in California have not led to increases in unemployment in the state.  The assertion that they would is simply wrong.

As noted above, this chart will be discussed in more detail below.  But the post will first look at the changes in the minimum wage in California since 2014, and how the minimum wage in California compared to the federal minimum wage for the US as a whole as well as to several measures of wages in the US and to the federal poverty line.  Following a look at the (non)-impact on unemployment, we will for completeness also examine what happened to labor force participation rates.  Some might argue that minimum-wage workers who would have lost their jobs might then have left the labor force (in which case they would not have been counted as unemployed).  But we will see that labor force participation rates in California also did not change following increases in the minimum wage.  Finally, the post will discuss possible reasons for why increases in the minimum wage in California did not lead to a rise in unemployment there.  Standard economics under the standard assumptions would have predicted that it would have.  But those standard assumptions do not reflect well what is happening in the real world in labor markets.

B.  The Minimum Wage Rate in California

The federal government sets a minimum wage that applies to the US as a whole.  But due to gridlock in Congress (and opposition by Republicans), the last time the federal minimum wage was raised was in July 2009, when it was set at $7.25 per hour.  As was discussed in a post on this blog from 2013, when adjusted for inflation this minimum wage was below what we had in the Truman administration in 1950, despite labor productivity now being more than three times higher than then.  And from July 2009 to now, inflation has effectively reduced the value of the $7.25 wage of July 2009 to just $4.97 (based on the CPI).  The federal minimum wage has simply become irrelevant.
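The arithmetic of that erosion is straightforward; a sketch using approximate CPI-U index levels (roughly 215.4 in July 2009 and roughly 314.1 in August 2024, rounded figures used here for illustration only):

```python
def real_value(nominal, cpi_then, cpi_now):
    """Purchasing power today of a dollar amount from an earlier date,
    deflated by the change in the CPI."""
    return nominal * cpi_then / cpi_now

# $7.25 of July 2009, expressed in August 2024 purchasing power:
print(round(real_value(7.25, 215.4, 314.1), 2))  # -> 4.97
```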

Due to this lack of action at the federal level, many states have legislated their own minimum wage rules for their respective jurisdictions.  California is one, and has been particularly aggressive.  Over the past decade, the minimum wage in California has been increased to $16 per hour generally, and most recently to $20 per hour for fast-food restaurant workers:

California Minimum Wage Recent History

Effective date     25 employees or less     26 employees or more
Jan 1, 2008        $8.00                    $8.00
July 1, 2014       $9.00                    $9.00
Jan 1, 2016        $10.00                   $10.00
Jan 1, 2017        $10.00                   $10.50
Jan 1, 2018        $10.50                   $11.00
Jan 1, 2019        $11.00                   $12.00
Jan 1, 2020        $12.00                   $13.00
Jan 1, 2021        $13.00                   $14.00
Jan 1, 2022        $14.00                   $15.00
Jan 1, 2023        $15.50                   $15.50
Jan 1, 2024        $16.00                   $16.00

Fast food restaurant employees:
Apr 1, 2024        $20.00                   $20.00

Sources:  California Department of Industrial Relations.  See here and here.

The focus of this post is on the series of increases that began on July 1, 2014, with the prior minimum wage (set as of January 1, 2008) shown for reference.  That 2008 rate was $8.00 per hour, and was raised effective July 1, 2014, to $9.00 per hour.  California then began to increase the minimum wage annually starting January 1, 2016, with this continuing up to and including January 1 of this year (2024).  Furthermore, effective January 1, 2017, California began to set separate minimum wage rates for workers employed in businesses with 25 employees or less and those with 26 employees or more.  These could differ, although recently they have not.

Finally and most recently, California set a new minimum wage effective on April 1, 2024, of $20 per hour for employees of fast food restaurants (in restaurant chains with 60 or more locations nationwide).  I include this here for completeness, but it is still too early to say whether this has had an impact on unemployment.  So far it has not, but as I write this state-level unemployment data is available only for the months of April and May.  But those figures do not provide any support for the critics:  The unemployment rate in California in fact fell in those two months compared to that in the US.  This will be discussed below.

The general California minimum wage has now doubled – to $16 per hour – from the $8 per hour it was prior to July 1, 2014.  But for a sense of what this means, it is useful to put this in terms of various comparators:

California Minimum Wage:  Selected Comparisons

California minimum wage in firms with 26 employees or more

Year         Minimum      Ratio to US       Ratio to US average   Ratio to poverty   Ratio to upper limit of
             wage per     median wage of    hourly earnings of    line for family    earnings of first decile
             hour         hourly workers    all private sector    of four            of US wage & salary
                                            workers                                  workers
2008         $8.00        65%               38%                   76%                93%
2014         $9.00        68%               37%                   76%                94%
2016         $10.00       71%               39%                   83%                102%
2017         $10.50       72%               40%                   86%                103%
2018         $11.00       73%               41%                   89%                104%
2019         $12.00       78%               43%                   94%                109%
2020         $13.00       79%               46%                   100%               111%
2021         $14.00       82%               47%                   107%               115%
2022         $15.00       83%               47%                   109%               113%
2023         $15.50       81%               47%                   104%               108%
2024         $16.00       –                 46%                   104%               108%
Fast food:
April 2024   $20.00       –                 58%                   129%               –

(– : not available)

The comparisons here are based on the California minimum wage for employees in businesses with 26 or more employees.

The wage measures come from various reports produced by the Bureau of Labor Statistics (BLS).  The first column (following the column with the California minimum wage) shows the ratio of that minimum wage to the BLS estimate of the US median hourly earnings of wage and salary workers paid an hourly wage.  The ultimate source for this is the Current Population Survey (CPS) of the BLS, and this particular series is only provided annually (with 2023 the most recent year).  The California minimum wage rose from 65% of this median wage of hourly workers in 2008 to 83% in 2022 and 81% in 2023.  By this measure of wages – of wage and salary workers paid an hourly wage – the California minimum wage rose significantly in comparison to what a median hourly worker was being paid nationally.

A broader measure of wages is provided in the next column.  The ratios here are for a worker being paid the California minimum wage relative to the average hourly earnings of all private sector workers – not just workers paid at an hourly rate.  This is also provided by the BLS, but comes from its Current Employment Statistics monthly survey – a survey of business establishments that asks firms how many workers they employ and what they are paying those workers.  These average wages are higher as they cover all workers and not only those paid at an hourly rate, plus the average will be higher than the median in cases such as this (as the distribution of wages paid is skewed to the right).  By this measure, the California minimum wage rose from 38% of what US private sector workers were being paid on average in 2008 (and 37% in 2014) to 46-47% since 2020.

In terms of the federal poverty line, even full-time workers (40 hours per week for 52 weeks each year) paid the minimum wage in California in 2008 or even 2014 would have been able to earn only 76% of the poverty line income for a family of four.  But with the increases in the minimum wage in the past decade, they would have finally been able to reach that poverty line in 2020, and then 109% of it in 2022.  In 2023 and again in 2024, it would have been 104%.
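The full-time comparison works as follows.  The poverty threshold used in the example is an approximate figure for a family of four in 2008 (around $22,000), included for illustration only:

```python
def fulltime_annual_earnings(hourly_wage, hours_per_week=40, weeks_per_year=52):
    """Annual earnings of a full-time worker at the given hourly wage."""
    return hourly_wage * hours_per_week * weeks_per_year

def pct_of_poverty_line(hourly_wage, poverty_line):
    """Full-time earnings at the given wage, as a percent of a poverty line."""
    return 100 * fulltime_annual_earnings(hourly_wage) / poverty_line

# At the 2008 California minimum wage of $8.00, full-time earnings were
# $16,640 per year; against a poverty threshold of roughly $22,000 for a
# family of four that year, this comes to about 76%:
print(round(pct_of_poverty_line(8.00, 22025)))  # -> 76
```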

The final column shows earnings at the California minimum wage compared to the earnings that would place a worker in the first decile (the bottom 10%) of the distribution of earnings of full-time wage and salary workers.  These are also estimates from the BLS, are expressed in terms of usual weekly earnings, and are issued quarterly based on results from the CPS surveys.

With the increases in the California minimum wage over the past decade, full-time workers earning the minimum wage in California had incomes that exceeded the upper limit of the earnings of wage and salary workers in the US as a whole who were in the first decile of the earnings distribution – ranging from 102% of what the bottom 10% earned in 2016 to 115% in 2021 and 108% currently.  Assuming the distribution of earnings in California would be similar to that in the US in the absence of the special California minimum wage laws, this can provide a rough estimate of how many workers were being affected by the California minimum wage laws.

If earnings at the California minimum wage would have matched the earnings at the upper limit of the first decile (i.e. a 100% ratio), the implication would be that the share of workers for which the California minimum wage was applicable would be 10%.  With the ratio above 100% (by varying ratios up to 115%) the share affected would have been somewhat more than 10% – perhaps 11 or 12% of workers as a rough guess.  But the BLS data is not for the entire labor force.  Rather, it is only for wage and salary workers employed full-time.  One has, in addition, part-time workers and those who are self-employed.  The distribution of hourly earnings among those workers is not available, but if it is similar to the hourly earnings of full-time workers, the share affected would be the same 10% (or more).

The purpose here is just to provide a general feel for how many minimum wage workers were being affected by the changes enacted in the California minimum wage over the past decade.  Various factors cannot be accounted for, but they are at least in part offsetting.   For the purposes here, a reasonable estimate would be that at least 10% of the labor force had wages so low that the increases in the minimum wage in California over the last decade had an impact on what they would then be paid.  That is a not insignificant share.

C.  The Impact of Increases in the Minimum Wage on Unemployment

What impact did those increases in the California minimum wage then have on the employment of workers who were being paid the minimum wage? Critics of the minimum wage argue that workers are paid a wage based on their productivity, and if they are being paid at or close to the minimum wage this is only because their productivity is low.  In this view, if the minimum wage that has to be paid is then raised, those workers will be let go and will become unemployed.  Did we see this?

No, we did not.  The evidence from the ten different increases in the minimum wage in California over the past decade (from July 2014 to January 2024) does not show any impact at all on unemployment.  The chart at the top of this post summarizes the results.

The chart is based on calculations using data on the unemployment rate in California and on the unemployment rate in the US as a whole, where I calculated the unemployment rates from underlying data on the number unemployed and the number in the labor force (as published unemployment rates themselves are shown only to the nearest 0.1% point – anything less is not considered significant).
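The calculation is just the ratio of the number unemployed to the labor force, kept at full precision rather than rounded to 0.1% point as in the published series (the counts below are invented for illustration):

```python
# Unemployment rate computed from the underlying counts, rather than
# using the published rates (which are rounded to the nearest 0.1 point).
# The counts in the example are invented for illustration.

def unemployment_rate(unemployed, labor_force):
    """Rate in percent, kept at full precision."""
    return 100.0 * unemployed / labor_force

# Hypothetical counts, in thousands of persons:
print(round(unemployment_rate(1_150, 19_500), 3))  # prints 5.897
```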

For numerous structural reasons, the unemployment rate in any particular state (including California) will differ from the rate in the nation as a whole.  These structural reasons include the age structure of the population (middle-aged workers are less likely to be unemployed than young workers), the education structure (college-educated workers are less likely to be unemployed than workers with only a high school education), the industrial structure, the racial and ethnic mix of the population, and much more.

But while these structural factors affect the level of the unemployment rate in California relative to the national average, such structural factors change only slowly over time and hence do not have a significant impact on the month-to-month changes in that rate.  The rate of unemployment itself can, however, change significantly from month to month at the national (as well as state) levels due to macroeconomic factors.  In a recession the rate of unemployment goes up, and in a recovery or during periods of rapid growth, the rate of unemployment goes down.  It is just that in the absence of some state-specific event (such as – possibly – a change in its mandated minimum wage), the month-to-month changes in the unemployment rate at the state level will generally be similar to the changes seen at the national level.  They move together, as affected by macroeconomic factors.  The question being examined is thus whether the increases in the minimum wage in California over the past decade led to an increase in the unemployment rate in California in the months following those changes in the minimum wage, as compared to what was observed for the unemployment rate nationally.

This is a simple form of what is called the “difference-in-difference” method.  What is significant is not whether unemployment in California went up or down during the period, but whether it went up or down by more than what was seen at the national level in the same period.  For example, define the changes as relative to the month prior to a change in the minimum wage law (i.e. normally relative to what the rate was in December, as all but one of the changes were effective on January 1 of each year).  The employment and unemployment statistics are gathered by the BLS as part of the CPS household surveys conducted in the middle week of each month, so the mid-January unemployment rate will be treated as month one following the change in the minimum wage.  The mid-February unemployment figures will then be month two, and so on, with mid-December of that year as month twelve.  The minimum wage was then increased again on the following January 1, and the annual cycle was repeated for a second set of observed impacts (or non-impacts).  The changes in the unemployment rate are thus defined as the difference between the changes in the California rate for the given number of months following the change in its minimum wage (i.e. in month one, or in month two, and so on to month twelve) and the changes over the same period for the US as a whole.

As a concrete example using made-up numbers, suppose that in some December the unemployment rate in California was 6.0% while the unemployment rate in the US as a whole was 5.0%.  Suppose then that in, say, month three (March) the observed unemployment rate in the US was 4.5% – a fall of 0.5% point over the period.  If the unemployment rate in California fell to 5.5% in the same period (to March), then the change in California was the same as the change in the US as a whole, and the increase in the minimum wage on January 1 did not appear to have any differential effect.  If, however, the unemployment rate in California fell only by, say, 0.3% points to 5.7%, while the US rate fell by 0.5% in the same period, one would say that it appears the increase in the minimum wage in California led to an increase in its unemployment rate by 0.2% points.  And if the rate in California fell by 0.7% points to 5.3% while the US rate fell by 0.5%, then there was a 0.2% point reduction in the unemployment rate in California following the change in its minimum wage rate.
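This calculation can be sketched in a few lines of code, using the made-up December and March rates from the example above (not actual data):

```python
# Difference-in-difference sketch: the change in the California
# unemployment rate minus the change in the US rate over the same
# months.  The rates used below are the made-up numbers from the text.

def diff_in_diff(ca_before, ca_after, us_before, us_after):
    """Change in CA (percentage points) net of the change in the US."""
    return (ca_after - ca_before) - (us_after - us_before)

# December baseline: CA at 6.0%, US at 5.0%.  By March the US rate
# fell to 4.5% (a 0.5 point drop).
print(round(diff_in_diff(6.0, 5.5, 5.0, 4.5), 2))  # CA matched the US: 0.0
print(round(diff_in_diff(6.0, 5.7, 5.0, 4.5), 2))  # CA lagged: 0.2
print(round(diff_in_diff(6.0, 5.3, 5.0, 4.5), 2))  # CA outpaced: -0.2
```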

There will of course be statistical noise, as all the figures are based on household surveys.  And importantly, in any given year there will also be special factors particular to that year that could affect the results.  More is always happening than just a change in the minimum wage law.  But to address this, we can use the fact that California changed its minimum wage law on ten separate occasions over this ten-year period.  We therefore have ten separate instances, and we can work out the average over those ten separate episodes.  While special factors may have arisen in any given year, the only common factor in all ten was that California raised its minimum wage.

(The exception in the averages is for the January 1, 2024, increase in the minimum wage.  As I write this, we only have data for the five months through May.  Thus the averages over the full ten instances can be calculated only for the first five months, while the averages for months six through twelve cover only the nine cases through 2023.  Also, note that for the July 1, 2014, increase in the minimum wage, the changes were defined relative to the California and US unemployment rates in June, with the subsequent twelve months then covering July 2014 to June 2015.)
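The averaging over episodes of unequal length (the most recent episode being truncated) can be sketched as follows; the monthly figures in the usage example are invented, not the actual results:

```python
# Average the month-by-month diff-in-diff changes over all episodes
# that have data for that month.  Later months simply average over
# fewer episodes when the most recent episode is truncated.

def average_by_month(episodes):
    """episodes: one list of monthly changes per minimum-wage increase,
    possibly truncated for the most recent increase."""
    n_months = max(len(e) for e in episodes)
    return [
        sum(e[m] for e in episodes if len(e) > m)
        / sum(1 for e in episodes if len(e) > m)
        for m in range(n_months)
    ]

# Invented example: two full episodes and one truncated one.
result = average_by_month([[0.1, 0.2, 0.3], [0.3, 0.0, -0.1], [0.2]])
print([round(x, 3) for x in result])  # prints [0.2, 0.1, 0.1]
```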

Those average impacts were then remarkably small:

Average Changes in the California Unemployment Rate less Changes in the US Unemployment Rate, in the Months Following an Increase in the California Minimum Wage (in percentage points)

Months from Minimum Wage Change    July 2014 – May 2024    July 2014–2019, and 2023 – May 2024
 0                                   0.00%                   0.00%
 1                                  -0.02%                   0.01%
 2                                  -0.06%                  -0.05%
 3                                  -0.05%                  -0.04%
 4                                  -0.07%                  -0.05%
 5                                   0.00%                  -0.10%
 6                                  -0.00%                  -0.09%
 7                                   0.03%                  -0.08%
 8                                   0.04%                  -0.10%
 9                                  -0.04%                  -0.05%
10                                  -0.02%                  -0.05%
11                                  -0.02%                  -0.05%
12                                   0.02%                  -0.03%
Overall average                     -0.02%                  -0.06%

The chart at the top of this post shows this table graphically.  The two columns are for averages over the full period and with the years 2020 to 2022 excluded.  The Covid disruptions dominated in those years, but the results are basically the same whether those years are included or excluded.

The changes were all essentially zero.  It is not possible to see any increase in the California unemployment rate at all resulting from the increases in the minimum wage in the state over the past decade.  If anything, the increases in the minimum wage were associated in most cases with a small reduction in the unemployment rates.  But these are all small, and are probably simply statistical noise and not significant.

To put this in perspective, recall the discussion above that arrived at the rough estimate that the share of the labor force being paid at or close to the minimum wage might be around 10%, and possibly more.  If – as the critics argue – such workers can be paid only those low wages because their productivity is so low, then they would all lose their jobs if their employers were required to pay them a higher wage.  If true, the unemployment rate would then shoot up by 10% points.  One obviously does not see that.

If we had over-estimated the share employed at the minimum wage by a factor of two, so that it was in fact 5% rather than 10% of the labor force, then the unemployment rate would have shot up by 5% points.  One does not see that either.  One does not even see an increase of 1% point, nor, for that matter, even 0.1% point.  The overall average change is in fact generally a small decrease in the rate of unemployment in California relative to the US rate in the months following an increase in the minimum wage, although I suspect this is just statistical noise.

Most recently, California raised the minimum wage for workers at fast food restaurants (at chains with 60 or more locations nationally) to $20 per hour effective April 1, 2024.  We so far only have data for April and May as I write this, but that data provides no support for the belief that this has led to an increase in the unemployment rate.  Fast-food workers are of course only a small share of the labor force:  about 2.2% in California in 2023 based on BLS data for fast-food and counter workers (where fast-food workers make up about 80% of this total in national data).  But in the two months since the April 1 increase to $20 per hour for fast food workers, the California unemployment rate relative to that in the US in fact fell by 0.06% points in April compared to March, and by 0.25% points in May compared to March.  It did not go up but rather went down.

Finally, critics of the minimum wage may argue that low-wage workers laid off following an increase in the minimum wage will then leave the labor force entirely.  If they did this, they would then not show up in the unemployment statistics and one would not see an increase in the observed unemployment rates.  To be counted as unemployed in the BLS surveys, the unemployed person must have taken some positive action in the prior four weeks to try to find a job (e.g. sending out applications or visiting an employment center) and yet was not employed at the time of the survey.  If they did not take such an action to try to find a job, they would not be counted as “unemployed”.  Rather, they would be counted as not participating in the labor force.

Therefore, for completeness, I calculated what happened to the Labor Force Participation Rate in California compared to the US rate in the months following the increases in the California minimum wage.  The data comes from the BLS (but is most conveniently accessed via FRED, for the US and the California rates respectively):

Average Changes in the California Labor Force Participation Rate less Changes in the US Labor Force Participation Rate, in the Months Following an Increase in the California Minimum Wage (in percentage points)

Months from Minimum Wage Change    July 2014 – May 2024    July 2014–19, and 2023 – May 2024
 0                                   0.00%                   0.00%
 1                                  -0.01%                  -0.06%
 2                                  -0.02%                  -0.10%
 3                                  -0.07%                  -0.11%
 4                                   0.04%                  -0.10%
 5                                   0.01%                   0.01%
 6                                   0.11%                   0.00%
 7                                   0.06%                  -0.08%
 8                                  -0.02%                  -0.03%
 9                                  -0.11%                  -0.10%
10                                  -0.11%                  -0.05%
11                                  -0.04%                  -0.05%
12                                   0.02%                   0.02%
Overall average                     -0.01%                  -0.05%

As with the unemployment rates, there was no significant impact.  Had the 10% of the workers being paid at or close to the minimum wage dropped out of the labor force following the increases in the minimum wage, the figures would have shown a 10% point reduction in the California labor force participation rate.  One does not see anything remotely close to that.  One does not see an impact of even 1.0% point.  There was simply no significant impact on labor force participation rates.

Thus, the data indicates the minimum-wage workers remained in the labor force and did not become unemployed.

D.  The Economics of How Wages are Determined:  In Theory and in the Real World

Economic analysis, when done well, will be clear on what conditions are necessary for certain propositions to hold.  Under those conditions, one might be able to arrive at interesting conclusions.  But a good analyst will examine whether there is reason to believe that those conditions reflect what we should expect in the real world.  Often they do not.  That is, what is of interest is not simply some proposition in isolation, but rather also under what conditions one can expect that proposition to hold.

The economics of how wages are determined is a good example of this approach.  One can show that, under certain conditions, the wages paid to a worker would reflect the value of the marginal product of that worker – that is, the value of the increase in output that was made possible by hiring that worker.  But one should then look at the conditions that are necessary for this to follow.  And in the case of wage determination, they are not at all realistic, particularly for low-wage workers.  The implication is that one should not expect the wages of these workers necessarily to reflect the value of the marginal product of such a worker.

A problem, however, is that some commentators do not follow through and examine the conditions necessary for the theoretical conclusion to hold.  That is, they stop at the proposition that workers will be paid the value of their marginal product, and fail to look at whether the conditions under which that proposition would hold are realistic.  They thus conclude, for example, that increases in the minimum wage will lead to the layoff of all the workers who were being paid the prior minimum wage.  In their world, those workers are being paid a low wage because their productivity is low, and if firms are then required to pay a higher wage then those workers – these analysts conclude – will be laid off and indeed not be employable anywhere.  They assert that their productivity is too low.

Yet as we saw above, we see nothing at all close to this in the data.  California raised its minimum wage repeatedly in the last decade, and in a significant and meaningful way.  We saw that it led to a significant increase in the wages of such workers compared to the overall wage structure in the US.  Yet the unemployment rate in California did not increase at all in the months following those increases.

What, then, are the conditions that are necessary for this theoretical model of wage determination to hold?  And how realistic are they?  This section will provide a brief discussion of that theoretical model, and will then examine some of the conditions necessary for it to hold.  It will not be a comprehensive discussion of all the issues that could arise.  There are others as well.  Rather, the purpose is to show, for one set of reasons (there could be others as well), that the simple notion that wages will equal the value of the marginal product of the worker does not reflect the reality of how wages are determined.

a.  The Standard Neoclassical Model of Wage Determination

In the standard model of neoclassical economics, it can be shown that the wages of a worker will equal the value of the marginal product of the worker.  This can be shown to hold under the assumption of “perfectly competitive markets” for both labor (hired as an input) and for firms (hiring the labor).  But for such perfectly competitive markets to exist, one needs:

1.  On the side of the firms, there are many firms within a small geographic zone (small enough that commuting costs to the firms will not differ significantly) that are all competing with each other to hire labor with any given skill set.  That is, the markets are “dense”, with many firms competing for that labor.

2.  On the side of labor, there are many workers with each given skill set who are competing with each other and are seeking to be employed within that geographic zone.

3.  There are no lumpy fixed costs incurred by the firms in hiring or firing a worker, nor are there any lumpy fixed costs for a worker in finding and being hired into a new job.  Economists refer to this as no transaction costs.  That is, there are no costs incurred (on the part of either the firm or the worker) when a worker is fired and replaced with another.

4.  There is full information freely available to all parties on what skills are required for a job, what skills each worker has, and how any worker will perform in any job.  Both the firms and the workers know all this, with no cost to obtain such information.

5.  Production is a smooth, upwardly rising (up to some limit), and always concave function of the hours any individual laborer provides for a job.  Concave means that while the curve is rising, it is rising by less and less as the hours provided by the laborer increase.  That is, there are no “bumps” in the curve.  The slope of that curve at any given number of hours of labor is the marginal product of the laborer at that number of hours.  That is, the slope indicates how much additional output there will be with one additional unit of labor being provided.

If all of the above holds, then one should expect that firms will pay in wages, and workers will receive, the value of the marginal product of what the workers produce.  If workers were paid less than this, they would know the value of what they produce is in fact more and they would immediately move to a nearby competing firm that is willing to pay them up to the value of their marginal product.  And if firms paid more than this, then competing firms could take away business from the firms paying the higher wages.

In this system, workers will thus be paid the value of their marginal product – no more and no less.  And if this were true in the real world, then a mandate from the government to pay a higher minimum wage would mean that all those workers whose productivity was below the new minimum wage rate would be let go.  They would become unemployed and indeed unemployable, as this set of assumptions implies that the productivity of such workers is simply too low for any firm to be willing to pay them the new minimum wage.

b.  But the real world differs

Laying out the assumptions necessary for the neoclassical theory of wage determination allows us then to see whether those assumptions correspond to what we know about the world.  They do not:

1. Markets are rarely dense.  There are usually only a few firms – and often even no other firms hiring workers with similar skills – within a geographic zone so small that a worker is indifferent as to whom they would go to work for.  There may be few or even no firms nearby that a worker could threaten to move to if they are being underpaid.  And the few firms that are there may well follow what they consider to be informal “norms” on what such workers should be paid, rather than compete with each other and bid up the local wages.

2.  There are transaction costs both for a firm considering firing a worker and hiring a new worker as a replacement, and for a worker considering a move to a new employer.  There are major costs incurred by both.  Switching between employers is far from cost-free, so it is rarely done.

3.  There can also be more overt constraints imposed on labor mobility and hence the ability of a worker to threaten to leave for a better-paying job.  Noncompete clauses in many labor contracts – including for low-wage workers – may legally block workers from switching to a new employer in the industry where that worker has the particular skills to do well.  The FTC has estimated that 18% of all US workers are covered by noncompete clauses.  The FTC thus approved on April 23, 2024, new regulations banning their use.  While the rule is scheduled to enter into effect on September 4, 2024, it will undoubtedly be challenged in court, with this leading to delays before it can enter into effect (if it ever does).

There is also the separate practice of antipoaching clauses.  These are common in the fast-food industry as well as in other national chains of franchises.  The antipoaching clauses are not in the labor contracts themselves, but rather in the franchise agreements between the franchise owner and the national firm.  They require that the franchise owner not employ any individual who had worked at another franchisee’s establishment sometime before – typically at some point in the prior six months.  McDonald’s claims it ended requiring those clauses in its franchisee contracts in 2017, and several states have banned the practices within their borders.  But McDonald’s is still being sued in court, and it appears the practice remains common.  The new FTC rule – if upheld in court – may apply to these practices as well.

4.  Information is also far from complete, nor is it cost-free.  A firm can never know for sure how a particular worker will perform in a job until they are already on the job (and it is then costly to fire and replace them if the performance is not good).  Nor will the worker easily know what all the job opportunities are out there, and what he or she would be paid at some alternative firm.

The relevant information may also be more readily available to one side of the transaction than to the other – what economists call “asymmetric information”.  The worker may know well his or her skills and abilities, but the prospective hiring firm will not.  Similarly, the hiring firm may know well what is needed to do well in a job, but the prospective worker will not.  Also, doing well in a particular job is more than simply a skill set.  It also requires an ability to work well with colleagues and a willingness to take the work seriously.

Firms will thus be cautious in hiring and may only be willing to pay a relatively low wage to new workers to start.  Alternative firms will act similarly, as those firms are also unsure how well a new employee might work out (information is not complete).  Thus they too will only offer a relatively low wage to start.  Plus there are significant costs in the hiring and firing process itself.  All this serves to lock in workers at the firms where they are now, without a credible threat to move elsewhere if their wages are not raised to reflect their full productivity.

5.  Workers also gain firm-specific skills simply by the time they spend at the job.  This spans the range from skills for the specific tasks that the job entails, to understanding better how the firm approaches what they want from those in these jobs, to getting to know colleagues better and their specific likes, dislikes, and how they do things.  These skills are helpful, and lead to the worker becoming more productive at that particular firm.

But while a worker may see his or her productivity rise over time at some particular firm, they will not necessarily see their wage rise by the same amount.  That is, the workers would be paid less than the value of their marginal product.  While the firm might pay the worker somewhat more simply to help lock them in, this would not necessarily reflect the full amount of their higher productivity at that firm.  The worker would not have a credible threat to leave to go to a competing firm where he or she would be paid more.  Their productivity at an alternative firm – where they would once again be starting out – would not be as high and those firms would not be willing to offer a higher wage.

6.  There is also a more fundamental problem in the ability (or rather inability) to ascertain what the productivity is of an individual worker.  One of the assumptions of the neoclassical economic analysis noted above is that the relationship between the input of individual workers and the output of the firm is strictly concave.  That is, as the input of the worker goes up (more hours) there will be a smooth decline in the extra output of the firm as a result of the increased labor input, with no “bumps” in that curve.

Economists call this diminishing marginal returns.  If one increased labor input by a unit, one would see some increase in output.  Increase the labor input by another unit, and one would see an increase in output again, but by less than in the first step.  And when the relationship is strictly concave, the increase in output will be less and less for each unit increase in labor input, up to a point where there would be no further increase in output (and after which it might even decline).
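A minimal numerical illustration of diminishing marginal returns, using a hypothetical square-root production function chosen purely for illustration (nothing here pins down this particular functional form):

```python
import math

# A hypothetical concave production function, chosen purely for
# illustration: output = 100 * sqrt(labor input).
def output(labor):
    return 100 * math.sqrt(labor)

# Each additional unit of labor adds less output than the one before;
# this shrinking increment is the marginal product.
for units in range(1, 5):
    marginal_product = output(units) - output(units - 1)
    print(f"unit {units}: marginal product = {marginal_product:.1f}")
```

Each successive unit of labor contributes less than the one before it, which is exactly the concavity assumption described above.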

Reality is more complex.  Those working in firms are not working simply as individuals but as part of teams.  Adam Smith in the first few pages of The Wealth of Nations in 1776 already noted how far more productive workers can be when working in teams than when trying to do it all individually – the famous pin factory.  It still applies today, and not simply in factories.  Take, for example, a team working a shift at a fast food restaurant.  There may normally be a team of, say, ten for a particular shift.  Each worker has different responsibilities, but most of the workers have the skills to do most or perhaps all of the individual tasks.

In this made-up example, they arrived at a team of ten as normally best to handle a particular shift based on how the tasks can be divided up and given the number of customers they normally expect.  It would be difficult to do with just nine, and not much gained with an extra worker and thus eleven on that shift.

What then is the marginal product of each of the workers?  The firm would need to know this to determine what wages it could pay in the standard neoclassical theory, but it is not well defined.  Starting with any grouping of nine workers, the marginal product from hiring a tenth worker would be relatively high as they then could organize into the optimal team of ten.  But any one of the workers could be considered to be the tenth one added to the team, and hence responsible for the jump in output in going from what is possible with just nine workers to the more productive team of ten.  And if all of the workers were paid a wage corresponding to that jump in output that is possible when going to a full team of ten, they would together be paid more than the overall value of what is being produced with a team of ten.

While the workers would likely welcome such higher wages, the reality is that fast-food restaurants do not aim to operate at a loss.  And they don’t.  Their workers are simply not paid that much.  There are fundamental conceptual problems in trying to define the marginal product of a worker when work takes place in teams (as it normally does).
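The conceptual problem can be made concrete with a toy calculation.  The output values are invented for illustration; only the structure (a jump in output at the optimal team size) matters:

```python
# Toy illustration of why "the marginal product of a worker" is not
# well defined with team production.  The output values are invented.

output_by_team_size = {9: 900, 10: 1500, 11: 1520}  # value produced per shift

# The jump from nine workers to the optimal team of ten is large:
marginal_product_of_tenth = output_by_team_size[10] - output_by_team_size[9]
print(marginal_product_of_tenth)  # prints 600

# But any of the ten could be viewed as "the tenth worker".  Paying all
# ten that marginal product would cost far more than the team produces:
print(10 * marginal_product_of_tenth > output_by_team_size[10])  # prints True
```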

E.  Final Points and Conclusion 

California has raised its minimum wage repeatedly in the past decade, but there is no indication in the data that this has led to an increase in unemployment.  While economic theory predicts that in “perfectly competitive markets” the workers being paid below the new minimum wage would be laid off (as wages under these assumptions are set based on productivity, and the productivity of such workers is asserted to be simply too low), this holds only under unrealistic assumptions.  Wage determination is more complex.  Under the real-world conditions in which wages are in fact set, it is not a surprise to find that unemployment did not in fact go up.

This does not mean, however, that any increase in the minimum wage would not lead to higher unemployment.  If the minimum wage was set next year at, say, $100 per hour, one should of course expect issues.  What we see in the data is not that there can be any increase in the minimum wage with then no consequences for unemployment, but rather that the increases in the minimum wage that were mandated in California in the last decade did not lead to an increase in the rate of unemployment.

Increases in the minimum wage may also lead to increases in the prices of certain goods.  If the production of those goods were heavily reliant on minimum wage workers, and the firms would now have to pay a higher wage for those workers, it may well be the case that such goods will now only be available at a higher price.  Fast-food hamburgers may go up in price, but don’t view this as simply affecting “junk food”.  The prices of blueberries and strawberries might go up as well.

Does this mean that the critics of the minimum wage are in fact right?  No, it does not.  First, it remains the case that unemployment did not go up following the major increases in the minimum wage in California over the past decade.  The critics asserted that it would.

Second, while prices of fast-food hamburgers may have gone up following the increases in the minimum wage, those prices did not go up by as much as the minimum wage did.  If wages in fact reflected the value of the marginal product of the worker, the wages of the minimum wage workers would still have gone up relative to that value – just not by as much.  Under this theory of wage determination, they would still have been laid off.  But there is no evidence of this in the data.

Labor markets operate far from what economists would call “perfectly”.  In this reality, minimum wage laws can play a valuable and indeed important role.