Economists talk about the world of work as the ‘labor market’: the supply of and demand for work in the economy. As soon as we start looking for work or hiring, we ‘enter the labor market’, either as individuals offering skills or as employers seeking them. The balance between the number of job seekers and the amount of available work (the supply of labor and the demand for labor) determines how much people get paid for their work: their wages.
So why do we ‘enter the labor market’ in the first place? A big assumption in a lot of simple labor market models is that individuals are motivated to work by money. In theory, we’d love to sit around all day and have fun, but because we need money to buy the things we need to live, we spend some of our time working to earn it. By this logic, we decide how much to work by weighing the value of an hour of time off against the wage we would earn in that hour.
If that’s true, then the more money we earn, the more money we give up for every hour not spent working. People with lower wages aren’t missing out on as much when they take time off, so the model predicts they’ll choose to work less. If your wage goes up, this model predicts you’ll want to work more, because your leisure time just got more expensive (economists call this the substitution effect of a wage change).
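The tradeoff described above can be sketched with a toy model. The quasilinear utility function, the leisure-value constant B, and the 16-hour day below are all illustrative assumptions, not anything from the article; they are chosen so that only the substitution effect operates, matching the prediction that a higher wage means more hours worked.

```python
import math

# Illustrative assumptions (not from the article):
T = 16.0   # waking hours available per day
B = 40.0   # hypothetical constant for how much the worker values leisure

def best_hours(wage, steps=16000):
    """Grid-search the hours of work that maximize a simple
    quasilinear utility: wage*hours + B*log(leisure)."""
    best_h, best_u = 0.0, float("-inf")
    for i in range(1, steps):
        h = T * i / steps                 # hours worked
        leisure = T - h                   # remaining hours are leisure
        u = wage * h + B * math.log(leisure)
        if u > best_u:
            best_h, best_u = h, u
    return best_h

# Higher wages make each hour of leisure more expensive,
# so this model predicts more hours of work:
for wage in (5, 10, 20):
    print(f"wage ${wage}/hr -> works about {best_hours(wage):.1f} hours")
```

With this particular utility function the optimal choice works out to T − B/wage, so doubling the wage from $5 to $10 raises hours worked from 8 to 12: exactly the ‘leisure got more expensive’ logic in numbers.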
Because the labor market is governed by supply and demand, in this model everyone should always be able to work as many hours as they want: anyone who can’t find work at the going wage can always offer to work for a lower one. In this sense, by picking how much they work, workers get to determine their income, within the limits of the kinds of jobs they’re able to do and the number of hours they can physically work.
Some parts of this story clearly don’t match up to the real world. A lot of people want to work but can’t. And a lot of people go to work for reasons other than money. So while these simplified models help make economists’ jobs easier, they’re not always the most useful way to think about the real ‘labor market’ we live in.