Recently, xkcd asserted that "If you spend nine minutes of your time to save a dollar, you're working for less than minimum wage." The math here is pretty simple:
$1.00 * (60 minutes / 9 minutes) = $6.67/h
As it turns out, $6.67 per hour is below the minimum wage of most states and the United States federal rate; there is no flaw in Randall's math. There is, however, more to think about:
If Adam holds a minimum wage full-time job, he earns $7.25 per hour for forty hours per week:
$7.25 * 40 hours = $290 per week
Adam wouldn't do much worse to spend nine minutes making $6.67 per hour -- just $0.58 less per hour than his day job (and less of a gap still, once income tax comes out of his wages). How much lower an equivalent wage would still be worth Adam's time?
That is, how does the value of our time relate to money, or an hourly wage? My first thought was to compare the cost-per-hour (or savings-per-hour) of something against my pay rate. It seemed reasonable that if I can earn more in an hour than a thing would cost me for an hour, I would come out ahead spending that hour earning another hour's wage instead of saving the cost. The fallacy in this idea is the assumption that I could spend that hour earning at my usual rate. In fact, I am a salaried employee, and working an extra hour would not net me another hour's wage. (Others may work a job that prohibits overtime, and thus also be unable to work hour forty-one.) I would have to find a second job to earn additional money, and there is no guarantee of the wage I would earn.
I wanted to resolve this full-time earning limit, that is, to value my time in a way that accounts for not working maximum overtime, and for how my wage might vary (up or down) in hour forty-one. I needed a new, adaptive measurement, and my intuition told me it would be a lower number. Thinking a bit more, I settled on the average-waking-hourly-wage. This is the total wages earned in a given time period (say, a week) divided by the total number of hours one could have worked (whether at a job or domestically), that is, hours awake. This average better represents how one uses their earning potential, because it reflects the time one isn't working.
If Adam sleeps (or tries to sleep) eight hours per night, he holds sixteen hours per day, or 112 hours per week, awake. Adam has 112 hours during which he could produce income, but only forty during which he does produce income. This brings Adam's average-waking-hourly-wage down to:
$290 weekly income / 112 waking hours = $2.59 per waking hour
On average, Adam produces (earns) a mere $2.59 per hour. Given that insight, during the hours when Adam is not producing income, but instead consuming income, he would not be foolish to spend nine minutes working for $6.67 per hour, or, one dollar. In fact, Adam could rationally spend twenty-three minutes worth of his off-work-hours earning or saving one dollar, and still come out ahead of his average-waking-hourly-wage.
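Adam's figures can be sketched as a small calculation. This is just a restatement of the arithmetic above; the function names are mine, not anything from the comic:

```python
# Average-waking-hourly-wage: weekly income divided by waking hours per week.
# 112 waking hours assumes the eight-hour night described above.
def average_waking_hourly_wage(weekly_income, waking_hours=112):
    return weekly_income / waking_hours

# Minutes one can spend earning (or saving) `amount` and still
# come out ahead of a given hourly rate.
def break_even_minutes(amount, hourly_rate):
    return 60 * amount / hourly_rate

adam = average_waking_hourly_wage(7.25 * 40)   # $290 per week
print(round(adam, 2))                          # 2.59
print(round(break_even_minutes(1.00, adam)))   # 23
```

The same two functions cover every example in this post: only the weekly income changes.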
Bruce holds three minimum wage jobs and works a full 112 hours per week. I've met Bruce: he opens McDonald's five days a week, then walks across the parking lot to bag groceries eight hours later. On the weekends, he's doing chicken and biscuits across the street. Bruce's average-waking-hourly-wage is a full $7.25 per waking hour.
Would Bruce spend nine minutes to save a dollar? I mean, if he weren't such a driven bread-winner. No, of course not; Bruce knows he can't spare the nine minutes. His time is too scarce, and he'll be late for work, that workic[1].
$7.25 * (112 hours / 40 hours) = $20.30 per hour
If Carl earns just over twenty dollars per hour at his full-time job, then he produces (on average) minimum wage for each hour that he is awake, assuming our previous eight-hour night. Carl has an advantage over Bruce: he earns the same average-waking-hourly-wage while keeping seventy-two waking hours away from work.
During these seventy-two leisurely hours, Carl may find himself pumping fuel and having to weigh the worth of spending nine minutes to save a dollar. Carl knows that in those nine minutes he's already earning $1.09, just for breathing. Perhaps Carl finds his time is worth more than a dollar (9% more, actually) and decides he'd rather have his time than save that dollar.
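Carl's numbers work out the same way. A quick check of the two figures above (the variable names are mine):

```python
FEDERAL_MIN_WAGE = 7.25
WAKING_HOURS = 112   # sixteen waking hours a day, seven days a week
WORKING_HOURS = 40

# Hourly wage needed so the average over all waking hours equals minimum wage.
required_rate = FEDERAL_MIN_WAGE * WAKING_HOURS / WORKING_HOURS  # $20.30

# What nine minutes of Carl's waking time is worth at that average:
# about $1.09, roughly 9% more than the dollar he'd save.
nine_minutes = FEDERAL_MIN_WAGE * 9 / 60
```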
If Carl hires the neighbor's kid to cut his grass for thirty dollars, and the boy uses a wide, quick riding mower to finish in one hour, the boy has earned $30 per hour. Carl may feel he has made a poor decision, knowing he earns much less per hour than he just paid someone else to do what he could have done himself. But then Carl remembers that it takes him a full five hours to mow his yard with his old push-reel mower, and those five hours are worth $36.25 to him. He decides he made a good choice after all.
Adam, though, will be cutting his own grass, knowing that at his $2.59 average-waking-hourly-wage it would take nearly twelve hours of his time to justify spending $30. He may even offer to cut Bruce's grass for that price, because Bruce clearly has no time left to do his own yard work.
I think the average-waking-hourly-wage makes a good baseline for measuring the monetary value of one's time. Certainly other factors should be considered in each decision: sometimes one has other priorities to attend to, or derives a certain pleasure or pain from a given chore; some people like mowing their yard. Each decision is different, but a simple heuristic that settles the easy decisions and informs the rest can help us make the most of our time without stressing over the cost.
For the time it took me to write this, I should have hired someone else.
[1]: Workics are addicted to work like alcoholics are addicted to alcohol. Workics are not to be confused with workaholics, who are addicted to workahol.