Originally posted by: blackangst1
Originally posted by: ahurtt
Originally posted by: nobodyknows
Originally posted by: Patranus
This has nothing to do with what group 'earned' their income, it has to do with...well...'paying their fair share' of the total tax burden. 45% of Americans pay NOTHING.
Then for God's sake give them a raise. Or are you now going to claim they didn't earn it?
Not that I disagree with you, as I've gotten squat for a raise the past 2 years in a row, about 2% the year before that, and squat again the 2 years before that. But I will point out that in order to get a raise there are 2 presumptions you have to make: 1) they have a job, or 2) they don't have a job but want one.
Now here's another problem I see which could partly explain why inflation is outpacing wage/salary growth for working class folks. And by working class folks I'm going to lump together everybody, salaried and hourly, with a gross annual income of $100,000 or less. Now granted, I am no economist and I don't hold any degrees in economics or anything like that, so this is purely just me "thinking aloud," so to speak. But the thing I see is that there is only so much that, let's say, a burger flipper is worth. It seems to me that at some point you have to say in absolute terms that no, this job cooking hamburgers is not worth more than $X per hour, and I don't care what the rate of inflation is. Inflation, on the other hand, knows no bounds. It can and will steadily increase.
Here's some perspective. I used this calculator (because it's what I found by googling for the annualized rate of inflation from 1973 to present) to get the annualized rate of inflation from January 1973 to December 31, 2008, and it tells me it is 4.5%. That means $1 today has roughly the same purchasing power that $0.20 had in 1973; or, vice versa, what cost you $1 in 1973 would cost you on average about $4.95 today. I picked 1973 only because that was the year somebody else quoted earlier, saying that real wages had decreased every year since.
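Just to show the compounding behind those numbers, here's a quick script. Note that 4.5% is a rounded figure, so compounding it lands a little under the calculator's $4.95:

```python
# Compound a (rounded) 4.5% annualized inflation rate over the 36 years
# from the start of 1973 through the end of 2008.
annual_rate = 0.045
years = 36

factor = (1 + annual_rate) ** years
print(f"$1.00 in 1973 ~= ${factor:.2f} today")      # ~4.88, vs the calculator's 4.95
print(f"$1.00 today ~= ${1 / factor:.2f} in 1973")  # ~0.21
```

The small gap between 4.88 and 4.95 is just rounding: the calculator works from the actual CPI series, and 4.5% is its annualized rate rounded to one decimal place.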
Now let's see what the minimum wage was in 1973 and assume a burger flipper in 1973 was making that amount per hour: $1.60 (taken from the US Dept. of Labor).
Minimum wage today: $7.25.
That means today's minimum wage is only 453% of its 1973 level ($7.25 / $1.60), while you would need 495% of what you had in 1973 to have kept up with inflation. There's a 42-percentage-point gap there. . .
I suppose you can do this exercise for any time period you choose using the calculator and the Labor Department's historical minimum wage charts. But the thing is, minimum wage workers are probably faring better than salaried exempt workers in one regard: they are generally guaranteed a cost of living increase whenever the minimum wage rises, because it is dictated by law. Salaried exempt workers, on the other hand, have slowly seen their standard of living erode because they have no such guarantee and, if my own job is any indication of the larger trend, they are just getting poor slowly. The minimum wage workers are struggling to keep up but will never quite make it on one job, while those of us who were a little better off are being gradually brought down.
Feel free to pick apart and shred my logic and thinking here, because I know you guys can and will do so anyway. As I said, I'm not really making any firm assertions here, as I'm not professing to be an economist; I'm just an average working guy playing with some numbers and commenting on what I see going on in the world around me. I'm sure there are any number of things that I may have failed to take into account, which I'm sure you guys will be all too happy to point out.
erm...the difference between 453% and 495% is only 9%....
Well, maybe I stated it unclearly. What I was saying is that the absolute (not annualized) inflation growth was 42 percentage points more than the minimum wage growth over the same time period, simply by subtracting 453 from 495. You're calculating the relative difference, (495-453)/453. I get it. But regardless. . .either way you can see there's a non-trivial disparity, and that's mainly what I was trying to show. If the minimum wage had kept pace with inflation it would be more in the neighborhood of $8/hour today ($1.60 * 4.95 = $7.92), but the current minimum wage lags that by roughly $.75. Assuming somebody works a standard 5-day, 40-hour-per-week job at minimum wage and takes no vacation or sick days (ideal circumstances), that is a difference of $6 per day, or $1560 per year they are getting shortchanged (5 days per week * 52 weeks per year = 260 days; $.75 per hour * 8 hours per day = $6 per day; 260 * $6 = $1560). Not gonna get you rich very fast, but it could mean the difference between having a little something to invest for retirement or not.
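For anyone who wants to play with the numbers themselves, the back-of-the-envelope shortfall works out like this (using the round-number $0.75/hour gap from above):

```python
# Annual pay gap for a full-time minimum-wage worker, assuming a
# ~$0.75/hour shortfall versus an inflation-adjusted wage.
hourly_gap = 0.75
hours_per_day = 8
days_per_year = 5 * 52  # 5-day weeks, no vacation or sick days

daily_gap = hourly_gap * hours_per_day   # $6.00 per day
annual_gap = daily_gap * days_per_year   # $1,560.00 per year
print(f"{days_per_year} days * ${daily_gap:.2f}/day = ${annual_gap:,.2f}/year")
```

Swap in a different hourly gap or a different work schedule and the same two multiplications give you the annual figure.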
