This isn't simply a matter for the young. The young are definitely hit harder, but the real issue is that "work" is not valued as it once was. The 'economy' is doing great. Stocks climb ever higher despite pandemics and monetary turmoil. Taxes are low. Luxury spending is at an all-time high. Investors, be they wealthy independents or pensioned former workers, are generally happy. Young people own nothing, so the bulk of their income must come through wages rather than investments. It is those workers, those who don't own their sources of income, that are struggling to make rent.
Good work is still highly valued, it's just that the value is not paid to the worker, but to the company owning the work. If you increase a company's profit by $500k annually through setting up a new automation system, you'll still only get your $40k internship salary. And the company will pocket the remaining $460k, to be paid out to the owners as profits.
So yes, I fully agree with you. This hits young people the hardest because they are the ones who own the least, on average. But more generally, everyone who is not part of the elite that owns those companies and their preferred stocks is suffering.
It's almost a cliche "owners vs. workers" class war at this point. But for some reason, everyone in the US expects to enter the owning elite within years - no matter how ridiculously improbable that is - so nobody is willing to stand up for the workers.
In one of my first jobs, I came up with a novel idea to automate a big chunk of work, during the early days of ML. We A/B tested it and it increased the company’s revenue by like a third while decreasing costs (and this wasn’t a tiny company). It took a whole team a few months to implement the idea, as it involved a few different skill sets.
To your point, who deserved to capture the value here? Well there was an A/B test to clearly show what caused the boost, and a simple trail from my research (done on my own initiative) to the team's implementation to the revenue, so we at least knew that this group of people was collectively responsible. Nobody got raises, bonuses, or anything. If we had captured that value it would have had to multiply our comp many times over. So why should I have ever gone above and beyond in the first place? Meanwhile my manager repeatedly complained to me that I only worked 9-5.
This early experience was formative in my workplace opinions. Go figure.
[I'm adding thoughts here instead of editing the above]
Speaking more broadly, these kinds of misaligned incentives are probably doing massive damage to the economy, in terms of lost productivity and innovation.
A huge chunk of the people who should be productive and be innovating literally don't have an incentive to do anything above and beyond keeping their job. It's all adversarial and extractive instead of cooperative. On both sides. Employers won't pay workers for the value the workers provide, and in return workers surf HN and reddit all day.
Meanwhile, apparently there's some kind of epidemic of "bullshit jobs." Again, go figure.
Arguably the incentives aren't aligned, innovation isn't happening, and "bullshit jobs" all exist because we've hit some sort of local maximum of productivity where the free market doesn't feel a need to incentivize innovation/productivity any further.
This is often what a lot of people mean when they claim we've hit "late stage capitalism": it is possible the paperclip maximizers have already "eaten the world", turned everything to paperclips, and all that is left is "bullshit jobs". There's no reason to incentivize innovation because everything is already paperclips.
I'm in the same boat as you. I've been working 2 to 4 extra hours each night for my job. I needed to come in an hour late twice, two weeks in a row, and got called into my boss's office. He said, "We are an 8 to 5 office." I told him about my extra work and he just repeated the quote above.
I had a similar experience early in my career working in a call center. We received a lot of calls about one recurring problem with customer modems. I wrote a script that detected SNMP traps and automatically issued bounce commands to the remote modems.
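Something like this minimal sketch, assuming snmptrapd was already logging traps to a file; the log format and the `modemctl` bounce command are hypothetical stand-ins for the vendor-specific pieces:

```python
# Watch the snmptrapd log and bounce any modem that raises a matching trap.
# The log path, trap line format, and "modemctl" CLI are all placeholders.
import re
import subprocess
import time

TRAP_LOG = "/var/log/snmptrapd.log"              # assumed snmptrapd output file
PATTERN = re.compile(r"linkDown.*modem=(\S+)")   # hypothetical trap line format

def bounce(modem_id):
    # Stand-in for whatever vendor-specific command resets a remote modem.
    subprocess.run(["modemctl", "bounce", modem_id], check=False)

def follow(path):
    """Yield lines as they are appended to a file, like `tail -f`."""
    with open(path) as f:
        f.seek(0, 2)                 # start at the end of the file
        while True:
            line = f.readline()
            if not line:
                time.sleep(1.0)
                continue
            yield line

for line in follow(TRAP_LOG):
    match = PATTERN.search(line)
    if match:
        bounce(match.group(1))
```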
Our call volume dropped noticeably. I told my manager why. He was pissed, and asked how we were going to justify headcount if our call volume was dropping.
Lesson learned. [edit: the ambiguous lesson is intentional. I stopped trying, but he kept everyone employed. So... lesson learned :)]
Weird. I tell my folks to get their work done and I expect them to schedule their own time, because I trust them as professionals. If they need to work 6pm to 2am or 4am to 10am because that's when they're most productive, or because they're dealing with something in life, and they're still getting the work done, I trust them. I feel folks get jaded because everyone sets up all this rigidity, and there's a huge loss of mutual trust and respect because of it.
Coming in an hour late twice and being ridiculed for it is ridiculous. This is professional life, not some strict grade school.
In all likelihood, those people are under pressure from their bosses. They need the appearance of everyone being there and working hard. For some people in managerial positions, any kind of data showing the employees still working won't be enough... They want to see asses in seats and all at the same time.
I've had to work under many executives like this and it's pretty annoying. They can't be reasoned with.
Oh, and the faster your people work, the worse it is for people who rely on headcount for their job. They can't justify a larger headcount if everyone is working efficiently...
If you made a significant contribution (and it certainly sounds like you did) then you should probably have gotten some reward for doing a good job, maybe a raise or promotion.
However, just because you increased revenue with your test in no way means you deserve to capture the value of it. You didn't do it in isolation. At the very least, the reason you were able to increase revenue by a third was that there was revenue and a customer base to start with. You didn't create that; it was given to you to leverage off of. I assume you had infrastructure and tooling support as well. So at the very least that increased revenue should be distributed among the other employees who built the foundation in the first place.
Further, you can't expect to capture all or most of the upside unless you're also willing to risk the downside.
Maybe the A/B test would show no change. Maybe it would show a slight loss. Do you still expect to be paid your salary for the work that went into that? If you do, you shouldn't expect to capture all the upside.
Imagine 5 teams set out to run $100K experiments; 4 teams fail and 1 team creates a $1M gross win. On net, $500K of value was created. The company can't weather the losses from the 4 teams and pay out 100% of the upside ($900K) of the 5th experiment to the 5th team.
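To make the arithmetic explicit (just a sanity check of the numbers above):

```python
# Five teams each spend $100K; one produces a $1M gross win.
cost_per_experiment = 100_000
gross_win = 1_000_000

total_cost = 5 * cost_per_experiment            # $500K spent across all teams
net_value = gross_win - total_cost              # $500K net value created
team5_upside = gross_win - cost_per_experiment  # the 5th team's $900K "upside"

print(net_value, team5_upside)                  # 500000 900000
# Paying out the full $900K upside would leave the company at
# $1M - $500K - $900K = -$400K: it could never fund the four failures.
```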
Funny, but I think finance does this right: give me X dollars to manage and I get some percentage of the upside, if any. In that case you didn't earn the capital, but your effectiveness in leveraging it is tied directly to your comp.
In a just world, you would've gotten a slice of the revenue. Especially in tech, where your performance can be a revenue multiplier and continue to pay dividends for years, the people who make the change should be incentivised to do so.
I'd say this is why companies give bonuses. Stock is fungible for cash unless the company is private, in which case a stock bonus can be meaningfully different from a cash bonus.
I (mostly) disagree. If I receive an RSU grant that vests over 4 years, it's not just because the company hopes I'll stay that long but primarily because the company hopes that will align my incentives with the shareholders' incentives. If I work to make the company better over a multi-year period, we all win together. If they instead just give me the equivalent value in cash salary, my incentive is to extend my term of employment, not the value of the company.
Do you have an idea of why the shareholders permit you to stay employed then?
I believe that they, via management, believe that the aggregate of what you and your colleagues do is contributing to the overall enterprise value, even if your individual contribution is much smaller than a penny per share.
He specifically said what he does won't change the stock price by a cent, so he is presumably talking about his individual contribution, not the aggregate one. Which is his point: giving him RSUs won't allow him to capture the value of going above and beyond, because, as you acknowledge yourself, his going above and beyond won't move the share price by even a penny.
If every software engineer can lift the share price by a hundredth of a penny every year, should the company give shares as part of comp to their 5,000 engineers or not? Should they even employ engineers at all?
The fact that some engineers can shirk and ride on the efforts of their peers does create a tragedy of the commons possibility, but the most appropriate solution there involves strong local leadership, rather than the conclusion that RSU grants are ineffective.
The point is that they are ineffective for the stated purpose of incentivizing the engineers to work harder. To that end, bonuses are much more effective.
The difference is that anyone can buy stock in their publicly held employer (with a bonus or otherwise) and have the same incentive. RSUs force the incentive to exist (instead of directly rewarding performance), but the link between RSU growth as a reward and one's ability to influence that growth may be negligible except at VP levels and above. If I am reasonably certain that keeping a backend service running within SLA is unlikely to meaningfully move the stock price, I certainly won't be putting in extra hours or weekends to achieve it. Additionally, RSUs force a minimum risk into the risk-reward calculation made by an employee.
Banks and sales teams have long given bonuses because it's tractable there to assign value to individual contributions; I think bonuses are generally the preferred incentive whenever that's tractable. Making partner is roughly the VP-level equivalent of RSU grants, where company performance starts being tied to individual performance.
Where this becomes a bit more evident is that RSU refreshes are almost always based on performance review, like bonuses.
I see RSUs as a cheaper/more flexible compensation for employers and a tax reduction strategy for employees (both worthwhile in their own right).
There are a few people who are capable of such remarkable profit-increasing feats. Most companies rarely encounter them, and have never bothered setting up systems to reward that kind of work.
Many of these people eventually end up working at companies that _do_ have such systems. Anecdata: a friend of mine recently found a clever hack that increased ad revenue on a major product by some fraction of a percent for one of the world's largest internet companies. His bonus that year was equivalent to a down payment on a nice house.
This sort of thing is the ideal item to put on your resume when looking for a higher paying job. Anything concrete about value provided in your previous job is worth gold to your next employer.
I often see employers whining that their employees don't go the extra mile... without giving the employees any incentive to do so! What I tell the folks I mentor is that a job isn't serious if there's no stock or bonus in the compensation package.
For that specific example, it's because there was no risk on your end in developing the product. To capture its value, the IP would have needed to be developed outside the company and licensed to it. But that's way more risky.
> So why should I have ever gone above and beyond in the first place? Meanwhile my manager repeatedly complained to me that I only worked 9-5.
You shouldn't have, at least not there. Clearly they had no clue how innovation happens or how to create an environment where the type of people who innovate end up.
I've observed a correlation between successful high-complexity, high-risk, high-unknowns, high-payoff efforts and high-trust work environments, where that trust extends down to the lowest rungs of the organization's totem pole.
I suspect it works this way because, to put it in software architecture terms, trust is a prerequisite of a microservice delivery model: people can loosely couple as needed to efficiently work on blockers. I find it ironic that many laypeople see the US military as a rigidly stratified, top-down command-and-control organization, but speaking to many military members, I find the amount of trust (once earned) placed in startlingly young and junior members far exceeds most commercial organizations. Despite all the military war stories of ineptitude and incompetence, I don't think it is a coincidence that, compared to the average commercial organization, the military is still perceived by many to "get 'er done".
This is fantastically difficult to achieve because it requires a from-the-top commitment to developing human, empathic leadership on top of management (which is already a difficult skill set to master by itself). It takes enormous emotional guts and vulnerability to practice that kind of leadership; I know it would challenge my personal limits for sure, possibly even break them. And it takes decades of continuous work, uninterrupted across successive changes in top leadership.
Was this a job at such a small startup that you were being paid partially in stock? Or was this a larger company where you didn't have to take that risk?
My point is that there are places where your contribution to the overall success is more directly related to what you are paid, however many people don't want to work at those places because it involves significantly more risk.
Let's say you have a nearly 100% chance of making $50k a year for the next 10 years, for a total of $500k. Alternatively, you can take a 10% chance at making $5 million total over the same 10 years. The reason there is potential for a larger payout in the second job is that you are taking a greater risk. If you were working at a company where someone else was taking the risk, then the rewards likely flowed to those people.
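Worked out, the two options have exactly the same expected value; the only difference is the risk:

```python
# Expected value of the safe salary vs. the risky bet, per the numbers above.
safe  = 1.00 * 50_000 * 10   # ~100% chance of $50k/year for 10 years
risky = 0.10 * 5_000_000     # 10% chance of $5M over the same 10 years

print(safe, risky)           # 500000.0 500000.0 -- same EV, very different risk
```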
For many young people, their first job gives them some data points about what they are actually capable of delivering so they can choose to work at places that trend toward rewarding contribution (startup, self-employed, etc.) or at places that don't (like your company) based on what type of performance they are capable of providing.
I believe this is how many companies get started. You consult a lawyer, build things in your free time, then rent it back to your old employer as a SaaS and maybe give them 10% off for being your first customer.
So companies are effectively driving their employees towards creating future competition, as that seems to be the only way left for inventors to capture the value they create.
>So companies are effectively driving their employees towards creating future competition,
Or a future complementary product/service. A company that runs call centers doesn't wanna make call center software, but if they can buy call center SaaS (or whatever) that makes them more efficient, they're happy and the guy they're buying from is happy.
Of course the company deserves to capture the value; they are already paying you and expect some return on that investment. Your contribution would've been zero without the company and your teammates, who are also only there because the company built the team.
Yeah, but when they go ahead and capture 100% of the value by never giving bonuses or raises, they're incentivizing mediocrity: brain drain, as the most innovative employees seek greener pastures once their resumes are updated, and employees who might otherwise have had the potential to go above and beyond never feeling any external motivation to do so.
It's tough to say. I was young and didn't have the kind of capital to do that myself. It was in the depths of a shitty economy, and this hypothetical startup wouldn't have had much of a moat, so I don't think it would've been easy to get outside capital either. And what would the pitch be? We can do this non-unicorn company's job better and still not be a unicorn? This was right around the beginning of the whole 'everything has to be a unicorn' thing, or at least my awareness of it.
It's unfortunate that the only way to capture the value of my innovation might have been "well, just be an owner instead of a worker."
When early socialists/communists were researching cybernetics, they latched onto the idea of predictive production management to "guide" the collective workforce. I believe they should have used their new computational capabilities to capture the bits of workforce innovation and value addition, to create positive feedback loops rewarding innovative and value-adding behavior.
The Soviets certainly tried to create these positive feedback loops, although it's arguable how successful they were. The standard example is the Stakhanovite Movement [1], which rewarded workers for more productive output.
The Soviet Union had a planned economy. The problem was that it required constant manual intervention. So the "state projects" worked reasonably well. The defense industry worked well. Whatever was left to develop on its own quickly lost quality and only stayed competitive because of locked borders and a total deficit of goods.
Socialists and communists have always fixated on capital accumulation as the reason for productivity increases; thus it would be somewhat unnatural for them to reward individuals for what they see as a collective effort.
Marxism, at least, centers on the idea that the worker should be compensated fairly for the labor/value that he or she produces. It’s in the first chapter of Capital, after all. Communism I think is about how the workers can bring about this fair compensation through politics (by the proletariat banding together, for example).
That's Marxism's initial redistributive aspect, not the core principle.
Capital was the foundation for Marxism's development, not the final stage. It's more a critique of laissez-faire imperialist capitalism than anything else.
What do you make of the Hero of Labour award then? It went not only to particularly productive labourers but also to engineers and scientists (Sakharov is probably the best known). Engineering and science are individual efforts, although of course you belong to a greater collective that supports you.
Hero of Labour was a rare award, only given out 20,812 times (according to Wikipedia); I'd be surprised if fewer than 20k Americans received a bonus in any given week under the current status quo.
Please remember that I was responding to a comment, not making an abstract statement that communists never give out awards. The grandparent had said that it would be good if communists and socialists were more generous with awards than capitalists, and I was making the case that it would be antithetical to the typical communist ideology.
The fact that the award was rare is precisely the point, it's a highly politicised decision to give it, comparable to a military award for exceptional valour. At a lower level, 5-year plan medals were handed out like candy. So, yes, individuals of all kinds were recognized at all levels.
In Soviet Russia you worked for people. At Amazon you work for the Machine.
I am not really sure what point you're trying to make. There was an old saying in the Soviet Union: "we pretended to work, and they pretended to pay us". Everything that you're saying is compatible with the notion that the Soviets didn't actually reward innovation and productivity, they just pretended to.
Your offhand comment is pithy, but ridiculous on its face.
First, in most companies it's very hard to say who exactly managed to increase the profit. Is it you who came up with the idea? Is it your boss who picked this specific idea out of fifty other ideas you came up with last week? Is it the QA team who managed to catch the bug which would've erased all the profits? Is it HR who hired you among other candidates?
Second, no, the remainder is not $460k.
In the Big Greedy Corporation where I work, the average annual cost of an employee in a "primary" location is around $200k (pre-COVID). That includes office space, technology, infrastructure, salary, taxes, fees, insurance, and so on.
Third, that's the premium the company gets for the risk. Intern will get his salary no matter what. The company makes a bet though. You want to own part of that risk? Build your own company.
OK so the company pockets the remaining $300k, while the intern still gets only $40k. Does that make it categorically better? I don't think so.
Also, the company probably only has to pay those $40k to $200k in salary costs for a few years, but the benefits derived from the work might outlive it. There's nothing stopping anyone from firing an employee just because they previously made a massive contribution to the company's financial success. Why keep the cow if you can get the milk for free?
So it might even be the full $500k in pure profits after the 2nd year.
The only way you can accurately evaluate and attribute the work of this hypothetical intern is to answer the question "what would happen if they took their idea and implemented it independently?"
* Does it require a big capital investment? The company provides that.
* Is it risky? The company gets a risk premium.
* Does it need a bunch of specialized resources? The company needs to provide those.
* More subtly, is it net-new or incremental value? Providing a company with something that boosts what they already do isn't evaluated on the incremental increase because you can't realize them without the previous baseline.
There are countless factors measuring value-add, making it extremely hard to allocate back to a single cog in the machine - unless you're a startup with very few cogs, in which case those few are responsible for all the value (or costs).
If you want to be evaluated based solely on your individual contribution, put yourself into a situation where this is possible, i.e. do it all yourself. Otherwise join the broader team and take the benefits and costs that comes with that.
What if the company loses $1mil? Does the intern get no pay?
What if the advancement is in the wrong direction for the market, like Windows Mobile was? You could have built a great system that delivered a few hundred million in sales, but two years down the road it cost the company a few billion. Do you get to pay back your share?
I'm not advocating here for the work to be uncompensated - it's just when you get a risk shield, you also lose the payout shield.
Maybe a stronger culture of co-ops would be a better solution.
> There's nothing stopping anyone from firing an employee just because they previously made a massive contribution to the company's financial success. Why keep the cow if you can get the milk for free?
If someone who can consistently deliver great value to a company is let go, they can go elsewhere and provide their value as a consultant or employee. You seem to think the company has no competition and doesn't need to compete with other businesses in the marketplace. I think you are also overlooking the fact that most interns are hired to see if they will be good to bring on full time. Some of them turn out not to provide substantial value. If an internship is how you are demonstrating your value to an employer, then your contribution is being averaged across all the interns who don't contribute anything, or much of anything.
I couldn't have said it better. Critics of the status quo often cite strawperson numbers to villainize omnicorp, but it isn't that cut and dry. Every decision regarding capital is risky. Risk management and proper reward sometimes aren't easily quantifiable.
>First, in most companies it's very hard to say who exactly managed to increase the profit.
It isn't that we need to decide exactly who contributed proportionally and how much of the profits we get. We're really just reaching for something that increases wages faster than raises.
And really it highlights the issue that not only is it hard to get fair profit-sharing, it's also hard to get market-competitive raises.
But of course a company is going to say it's hard to decide who gets how much, because they have an incentive to pay their employees as little as possible, regardless of whether it's through base wages, raises, or profit sharing. There always seems to be an excuse.
>> something that increases wages faster than raises.
not sure if you misspoke or I don't get your point
>> fair profit-sharing
"Fair" is just a trigger that one side uses to put the other on the defensive. Companies by their very nature a designed to share profits with their owners, i.e. legal shareholders. If you want profit sharing guaranteed join an employee-owned cooperative or alternative structured org.
>not sure if you misspoke or I don't get your point
If I can't get raises that keep up with inflation or competitive market rates, I'm going to try and argue that I should seek other options for making more money. One of those options would be profit-sharing. Another option is job hopping, which tech enjoys as a luxury.
> You want to own part of that risk? Build your own company.
I was with you until your last line. You don't have to build your own company - you can simply go after a job at a company that understands how beneficial it is to align the incentives of its owners and employees.
There are plenty of ways to give workers a piece of the pie: there are full-on co-ops (at one end of the spectrum), and there are more watered-down implementations like stock options, ESPP plans, etc., all of which allow the worker to take on some of the risk and share some of the gains.
Plenty of companies also distribute simple profit-sharing bonuses, which (arguably) don't require the employee to take on any risk at all.
How common are moves like that, really? Sure, it must happen since people keep bringing it up, but how many businesses actually avoid rewarding individual contributions that have had a measurable, positive impact?
In medium sized companies, I would say it is quite common.
They are too large to give new employees significant stock options, yet too small to pay FAANG-competitive rates. So you're starting with less salary and no stock. If you then have an egomaniac boss who thinks his own personal brilliance is always the cause of the company's good fortune, there's a good chance he will just take credit for your invention, and your bonus is gone.
And in slightly larger companies, chances are someone in the management chain will take credit for it and get the bonus. There's a reason why in most companies, you can never talk directly to the upper management: Middle management has strong financial incentives to prevent that.
Why? The tone is offensive, but large and connected companies have received substantial bailouts, and nearly every executive has a cushy golden parachute.
"Be kind. Don't be snarky. Have curious conversation; don't cross-examine. Please don't fulminate. Please don't sneer, including at the rest of the community.
Comments should get more thoughtful and substantive, not less, as a topic gets more divisive.
When disagreeing, please reply to the argument instead of calling names. "That is idiotic; 1 + 1 is 2, not 3" can be shortened to "1 + 1 is 2, not 3."
Yeah, there's absolutely zero risk when the government bails you out every time you fail and CEOs and board members have multi-million-dollar golden parachutes.
>> Good work is still highly valued ... If you increase a company's profit by $500k annually
Not every job has the potential for such things. "Good work" is different than profitability. A factory line worker can be the hardest working and best worker on the planet, and not increase company profits more than a few dollars an hour. That's the point about how we now value work differently. We place value on work that has direct and immediate impact on the bottom line. We don't value the hard work that goes into just keeping the doors open from day to day.
I chose to argue the way I did, because it is much easier to start at that end of the spectrum. If highly skilled highly productive employees who contributed massively to the company's bottom line cannot extract a fair share of the value in exchange for their work, then it's easy to convince someone that less skilled labor has these problems, too.
> If highly skilled highly productive employees who contributed massively to the company's bottom line cannot extract a fair share of the value in exchange for their work, then it's easy to convince someone that less skilled labor has these problems, too.
What is the definition of fair share?
When you are shopping for something, how do you determine what the fair price is to pay?
If I buy a car that enables me to earn $500k per year, do I owe the seller of the car more than someone who buys the same car to earn $50k per year?
If you work hard every day, one job should suffice for a home to live in, never worrying about food or how to pay rent, quality education for your children, a nice vacation once a year, and not having a medical event bankrupt you.
If a significant part of the population works more than one job, actively tries to reach this sort of life, and hardly achieves it, then something is broken.
I don't know which policy will bring us closer to this sort of fairness, but that's the only bottom line that interests me.
> I don't know which policy will bring us closer to this sort of fairness, but that's the only bottom line that interests me.
It's government policies that impact all businesses that get us closer to that fairness. Any single business that chooses to pay outsized amounts for its COGS, including labor, will get crushed in the larger market. There might be some, like Costco and Apple, that can afford to pay more, but they serve upper-middle-class customers and have a unique product.
For everyone else, people will choose to shop at the place that sells bread for 50 cents less.
Let's not forget that the economy has transitioned to having both partners in a marriage work (dual income earning). That's had a few negative effects on the market and on social issues (less parental involvement with kids, and possibly a contribution to a higher divorce rate).
Two-thirds of people who file for bankruptcy cite medical issues as a key contributor to their financial downfall.
While the high cost of health care has historically been a trigger for bankruptcy filings, the research shows that the implementation of the Affordable Care Act has not improved things.
What most people do not realize, according to one researcher, is that their health insurance may not be enough to protect them.
I would define "fair" here as roughly 3x the local rent, so that your salary is sufficient to comfortably feed a middle-class family. Let's cap that to be at max 25% of the value created for the company, and it's a strong incentive for the inventor while remaining ridiculously profitable for the company.
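As a rough sketch of that rule (the rent figure below is hypothetical):

```python
# The proposed "fair" reward: 3x local rent, annualized, capped at 25%
# of the value created for the company. Monthly rent here is hypothetical.
def fair_reward(monthly_rent, value_created):
    target = 3 * monthly_rent * 12   # enough to comfortably support a family
    cap = 0.25 * value_created       # the company always keeps at least 75%
    return min(target, cap)

print(fair_reward(1_500, 500_000))   # 54000 -- the company keeps the other $446k
```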
It is pretty much impossible to assign numerical values to "the local rent" or "value created for the company". Hence the whole purpose of markets, where sellers can bet on the perceived value of their goods and services, buyers can bet on theirs, and the two come to an agreement.
> Good work is still highly valued, it's just that the value is not paid to the worker, but to the company owning the work. If you increase a company's profit by $500k annually through setting up a new automation system, you'll still only get your $40k internship salary. And the company will pocket the remaining $460k, to be paid out to the owners as profits.
People are paid what it costs to replace them, not the profit they generate. The less replaceable you are, the more of the profit you get.
Price (pay) is the intersection of supply and demand, and nothing else.
That is why factory workers in the US got outsourced. That is why when deciding what you are going to sell for the rest of your life, you should go for something that few others are able to sell.
And everyone does this, it’s not some evil business plan. No one pays $5 more for the same loaf of bread. Everyone I know pays the least amount they have to. Creating a floor for that is the government’s job, not an individual business (and it can’t anyway, since it has to compete).
>That is why factory workers in the US got outsourced.
Nothing makes the CEO happier than knowing you believe it is the employees' fault their jobs got outsourced, and not the fault of the CEO who wanted to maximize profit at the cost of millions of good-paying American union jobs that actually helped families and communities and created the 'American Dream', all so he could afford the status that comes with showing off his new multi-million-dollar sailboat at the regatta. The job that was outsourced still needed to be done; someone still had to do it.
There's no such thing as an "American job", just like there's no such thing as a New York job or an LA job. Jobs are a product of a market, which can be as local as a building or as global as information services.
The American Dream cannot exist in a world with limited resources. It only existed in a rapidly expanding resource pool: taking lands from natives, creating new tech, freeing up resources by killing millions of people, or digging stuff up from the ground.
As such, it's not the CEO's job to maintain your job, especially when those CEOs are pushed by your own pension fund managers to move your job somewhere more efficient.
And finally, global poverty has dropped significantly because of that outsourcing. Ignoring that is literally being the "they toork R jerbs" South Park character.
Except your premise that returns to capital are going up is false: https://www.vox.com/2015/4/1/8320937/this-26-year-old-grad-s... ("Rognlie finds that when you account for depreciation properly, the capital share's increase has been entirely about housing.")
Returns to owning land and housing (rents) have gone up. That has come at the expense of both returns to labor, and returns to all other kinds of capital (stock ownership, etc.). This is obvious. Stock market returns aren't what they used to be. Banks are making 4% on mortgages, versus 8% back in the day. The country is awash in 0% financing for cars, refrigerators, etc.
There is increasing inequality in how labor income is distributed. CEO salaries have gone up, programmer salaries have gone up, but secretary salaries have stagnated. But that's a separate issue than returns to capital ownership going up.
Comparing corporate profits to GDP doesn't make much sense. The BEA corporate profits number includes profits earned overseas, which aren't included in GDP. Corporate profits divided by GDP could go up without anything changing in the economic relationship between workers and corporations with respect to work done in the US.
The problem with this is: if the intern doesn't add any value to the company, does the intern not get paid, or do they still get paid the $40k? What if the intern only added $10k worth of value (however that's measured)? Does the intern return $30k?
You can't have your cake and eat it too; get all the reward without any of the risk.
If it blew up in a few months due to some unforeseen problem, the intern would receive some of that risk in the form of a pink slip and a bad reference.
Some jobs may not have a directly attributable monetary contribution but are still needed.
But let's assume we can attach a net dollar amount to each job and the company lets go of people who don't contribute as much as they cost. Even in this case, the company still loses out on the intern. That money still needs to come from somewhere, most likely from profit derived from other employees.
The point is “the intern increased the company’s profit by 500k annually but only gets paid 40k” is not the problem it’s made out to be. That 500k wasn’t created out of thin air. There were other employees in supporting functions that made it possible. That extra 460k goes to pay for them too, as well as for the employees that aren’t as productive. Those are the risks the company takes on.
The reality is that the labor is only worth 500k to the company. If the individual could extract the same 500K from their labor without the company, they would (hopefully). They don't because it's not purely the labor generating that value, but the capital invested along with that labor. Being a great programmer is worthless without a computer.
I think a really good example is a story from Google. An intern changed the compression of the Google logo on the search page and shaved off a few bytes. Since Google serves that page billions of times, that saved Google tens of thousands of dollars a year, and it was literally minutes of work. If the same intern did the same work anywhere else, it would have been almost worthless.
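Back-of-the-envelope, with every figure hypothetical (the story doesn't give the real numbers), the leverage looks something like this:

```python
# Why a few bytes matter at Google scale and almost nowhere else.
# All inputs are hypothetical illustrations, not the real figures.
bytes_saved = 1_000                  # shaved off the logo
daily_requests = 5_000_000_000       # "billions of times" a day
cost_per_gb = 0.01                   # assumed bandwidth cost, $/GB

gb_per_year = bytes_saved * daily_requests * 365 / 1e9
print(f"${gb_per_year * cost_per_gb:,.0f}")   # ~$18,250/year from minutes of work
```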
This is not far from the true numbers. I've known several management consultants - the young, educated hustlers who are on their first job - who discovered from their clients what the bill actually was. Ratios of 6+ are common.
> you'll still only get your $40k internship salary. And the company will pocket the remaining $460k, to be paid out to the owners as profits.
It's because of this that I try to work as little as possible, preferably remotely. I claim to the company that I work 8 hours; in reality I work between 4 and 6 hours, and if I'm really quick, just 2 hours (happens rarely).
That's how I keep my "profit", and if I meet expectations, that's good enough for me. I don't think they'd be happy if they found out.
Any company would immediately hire someone who adds $500k annually in profit, and would steal him away from someone only paying him $40k, by offering more money.
The idea there's some large gap between value provided and value paid is unstable and completely unsustainable.
As for owning stocks, pretty much everyone can. Robinhood has $0 commissions and sells fractional shares. There's no reason why people can't buy stocks.
> Good work is still highly valued, it's just that
What about the people who don't or can't do good work, though?
Your comment is all about better compensation for what amount to elite software developers. And elite software developers are already well compensated. Maybe not as well as they should be by your calculus, but still "well" in comparison to the people the grandparent comment was talking about.
An aspect of this, in much of the West, is political demography. The voting population is getting older all the time, and politics increasingly serves their interests. Often these interests are directly opposed: labour vs. capital, real estate values vs. housing costs, opportunity vs. stability.
Even this pandemic, sadly, creates opposing interests. Younger people need workplaces to operate. Older people are more susceptible to the virus. Since the stock markets are fine, retired people don't generally have financial interests at stake.
The vast majority of income for everyone apart from billionaires/the top 0.01% comes from work rather than investments... Yes, even for the top 1% (apart from the top 0.1%), the share of income from investments is negligible.
The total value of the U.S. stock market is $35T; given the traditional 4% "safe" withdrawal formula, that comes to about 7% of GDP, or 10% of overall income, as profits, with about half of it going to the top 0.01%.
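The arithmetic behind that, assuming roughly $21T for US GDP (a figure not stated above):

```python
# Sustainable profit stream implied by applying the 4% rule to market cap.
market_cap = 35e12    # total US stock market value, per above
withdrawal = 0.04     # traditional "safe" withdrawal rate
gdp = 21e12           # assumed (roughly pre-pandemic) US GDP

profits = market_cap * withdrawal   # $1.4T per year
print(profits / gdp)                # ~0.067, i.e. about 7% of GDP
```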
Maybe the problem is that there are no highly paid jobs available for unqualified young people anymore.
I don’t disagree with you about UBI, but I have to say it makes me feel sad. In my experience most people really want meaningful work, and the fact that it puts food on the table makes a lot of otherwise meh work more satisfying and meaningful.
At the end of the day it's more important that people can meet their needs, and if UBI does that, a lot of people will be able to improvise some interesting things to apply themselves to, which I think will be great for them. But a lot of people will also sit on the couch and watch TV all day, feeling there's no point to life. We'll have all kinds of new and unforeseeable second-order effects.
I also worry that the fundamental problem is not so much income but cost of living. The way things work today, any amount of UBI you pay will just drive up the rent. If we don't significantly increase housing supply to meet demand, I don't see how we can break that loop.
> I don’t disagree with you about UBI, but I have to say it makes me feel sad. In my experience most people really want meaningful work, and the fact that it puts food on the table makes a lot of otherwise meh work more satisfying and meaningful.
I think the remaining work will be so much more meaningful that it will be a net de-alienation. I think less stress and more free time beget curiosity, so it's fair to say people could get a lot more out of work than they do from their current drudgery.
In the long term, think large amounts of leisure time, but where one does "tours of duty" in each of the core industries that power modern society, as part of a liberal education in how society functions.
> I also worry that the fundamental problem is not so much income but cost of living. The way things work today, any amount of UBI you pay will just drive up the rent. If we don't significantly increase housing supply to meet demand, I don't see how we can break that loop.
And it lumps them in with those who are likely still in school and have no real job yet thereby making it basically impossible to draw (useful) conclusions about independent young people.
>I would argue that as long as you are working, you have a real job.
If you're working nights in a warehouse while you finish a degree that will get you into an industry with nearly six-figure starting salaries, then I would argue that you're not "working yet"; you're just a student with income.
There is the idea of having an emergency fund, which, given its nature, should be stored in an instant-access savings account.
This is money for when you lose your job, for instance.
Your appetite for risk and your circumstances determine how big it should be (a good rule of thumb is six months' worth of salary), but there are countless stories of people trying to be clever with their emergency funds and storing them in ETFs or riskier investments, only to find that when the need comes, their fund is not enough, and they end up in financial distress.
If you're a US citizen, "I Series Savings Bonds" are fantastic emergency fund vehicles. You can cash them out instantly after holding them only one year, mine are paying just under 2% right now.
Sure, you will lose a little, like 2 or 3% a year. But if you have nothing saved, then what will you do in case of an emergency, such as losing a job?
The solution is to tax the unproductive part of society, which only extracts wealth from everyone else: landowners. This allows us to shift the burden away from labour.
> It is those workers, those who don't own their sources of income, that are struggling to make rent.
That is exactly how social classes are distinguished in Marxism: "[T]he class of modern wage labourers who, having no means of production of their own, are reduced to selling their labour power in order to live"
Agreed. And this is a good part of the "real" grievance of both young people and working class minorities, especially in big cities in my opinion.
It's not that working-class wages haven't risen; they have. But the costs of housing, healthcare, education, and many other necessities have risen at a much faster pace, particularly in cities. These rising costs are of course profits to someone, so the economy is "booming". But it's become so distorted that it's increasingly hard for lower wage earners to keep up, with no end in sight. Of course the owners of assets don't see the problem. Their properties and stocks are appreciating nicely. And the Fed has been encouraging this.
Now, one argument might be that if this weren't occurring, low wage earners would be even worse off, with no jobs, and there may be some validity to that point of view. But the inequality of this isn't just an abstract ethical problem at this point; it's a system-threatening problem. I don't know that there is an easy fix either. It often seems that trying to fix one problem with rent controls, wage floors, progressive taxation, etc. results in other problems. But costs of housing, medical care, etc. rising while wages grow at a much slower rate isn't sustainable imo.
It's not just that 'work' is not valued. The problem is that 'value creation' is not valued (particularly in the tech sector). There are plenty of highly paid bullshit jobs.
Tech corporations are essentially empty shells; their market caps are not based on the value that they create for society but on the rate of money printing by the reserve banks of the world.
Because tech corporations are not subjected to free market forces like everybody else, they eventually lose their ability to create new value (and the kinds of people who join these companies and stay there do not have a value creator mindset). Indeed, if you compare the price of S&P500 companies relative to most stable-value commodities such as gold, it would seem that the S&P500 hasn't grown at all in the last 5 years. However, S&P500 correlates strongly with the growth in M1 and M2 money supply. This indicates that the growth in nominal price of stocks in the past 5 years is entirely artificial.
Young tech workers cannot compete with corporations because the playing field is not even. If your competitor is getting free money from someone, how can you possibly compete with them in any market?
This is why all young people in tech these days are eventually absorbed into corporations. Whether through employment or phony acqui-hire. It's also why big corporations like Google used to buy up startups and then kill the project immediately afterwards. It's mostly phony money transfer with hidden motives and it's not related to value creation.
Could you elaborate on your statements about tech market caps being dependent on reserve bank printing money?
Also, what does it mean that tech companies are not subject to the same market forces as everyone else? They are in direct competition with the next software firm - they seem to want to overcome this by building a pervasive tech ecosystem that forces users to be dependent on their software.
>> Could you elaborate on your statements about tech market caps being dependent on reserve bank printing money?
All currencies of the world are free-floating and not backed by anything and reserve banks keep constantly printing new currency and injecting it into the economy either via 'loans' through regular banks or through government contracts (since the Fed gives the government free money to pay for employees and big corporate contracts like the $10 billion one they gave to Microsoft recently).
For the past 10 years, we've had a situation such that interest rates have been at 0% and reserve requirements by banks have kept getting lower (since COVID19, they're 0%). This allows regular banks to print as much money as they like; whenever a bank makes a loan, they just type new currency into existence into someone's account. Also, the Fed has been buying toxic assets (such as bad corporate and mortgage debt) from the market (again using money that they just 'printed' into existence and backed by nothing).
This has created an economic situation with a lot of hidden inflation; most of it ends up inflating stock prices, because corporations and their shareholders use bank loans to buy back their shares from the market. This causes artificially high demand for the stock and drives up the price (this is why corporations today have such high P/E ratios and pay low dividends relative to share price).
A lot of the buy orders on the market today are backed by debt (either which the corporation took out themselves to do buybacks or that of insiders or third-parties who have an interest in seeing the stock price go up).
Insiders understand that if interest rates can stay at 0% forever (which is what MMT advocates are promoting), then they will always be able to keep borrowing more money later to pay off their old debts because the stock price will always go up... And if the next generation comes along and starts doing the same thing (buying corporate stocks using debt), then each new generation ends up paying off the debts of the previous generation.
Note that this scheme works even if the financial instrument has nothing of economic value behind it. It's also why cryptocurrencies have been able to hold up such good valuations in spite of people paying for electricity to mine Bitcoin. They just restrict the supply, borrow money from banks, buy the asset, then in the future, someone else will take a loan to buy it and further drive up the price (and you'll be able to pay off your debt).
Banks do require collateral to give loans, but the problem is that a lot of the collateral these days is intangible and so people can just claim that the market price is the true value of the collateral... Which creates a vicious cycle and allows people to take bigger and bigger loans.
And the reason why corporations are not on an even playing field is because their executives and employees become wealthy from constant stock appreciation (fueled by collective debt) and then they invest in each other's businesses and award each other big contracts which keeps the currency circulating between themselves.
Echo chambers created by social media today help to prevent money from flowing between different classes of people. The big money tends to stay within the corporate sphere and is secured by a large network of venture capitalists and investors who only invest in people within that corporate sphere.
Any flow of money out of the corporate sphere can pose a major threat because it could pop the bubble when people realize how much more value they can get on the outside for the same money.
Don't know why people are down-voting you, but a simple case would be WeWork's founder. Adam simply borrowed money from the bank using WeWork shares as collateral, then used the borrowed money to buy more buildings and lease the buildings back to WeWork. WeWork's valuation goes up; then he borrows more money from the bank. That's the modern economy these days. And yeah, if you ain't moneyed, you can't compete. Selling widgets ain't enough.
It's funny, I was just thinking about the gold standard, but of course it couldn't be returned to as that would destroy the valuation of the holdings of the various financial elite.
(As an aside, the IRA and 401(k) were introduced in 1974 and 1978, respectively - a great way to trick the public into thinking that runaway cash valuation was a valuable move for America.)
Many people don't think it will continue for long, which is why everyone's buying guns, I guess.
Political intervention will come sooner. Keep your eye on the 2024 election, as that's predicted to be the first election where Millennials and younger will be the majority of the electorate.
Something will be done about it regardless of whether it is planned or unplanned.
The government can't just keep giving corporations a free ride using free fiat credit and expect the currency to retain its value and reputation. The only thing holding the value of the world's currencies right now is coercion by governments.
But the system always seems to open up opportunities for contrarians and dissidents to profit by going against the flow. The stronger the coercive force, the higher the risk, and the bigger the opportunity. There will always be people willing to take risks, no matter how big, because the rewards are always proportional.
I wouldn't say that. I am 24 and I have some pretty incredible tools in my pocket right now that someone in 1991 couldn't have even dreamed of. The problem is, instead of owning a good car or having an apartment to live in, I have the pinnacle-of-technology swiss-army knife of information processors (smartphone, MacBook Pro, desktop computer) within reach. That is where I think the value of things has changed over time. It went from the opportunity to _have_ to the opportunity to _learn_.
Finding the information was harder. There were libraries; my town library was in the basement of the city hall building, and my local mechanic had a larger building.
If the library didn’t have it, you could hit a small book store, whose specialty was romance novels for local farm wives.
Basically, anything you would actually want would be special order, hoping the book dealer knew what you were talking about in the first place.
The rewards for finding the information needed to learn were geometrically higher relative to the difficulty. It's easier to learn today; the (monetary) value of that learning is lower.
Yeah, that's the problem. I'm definitely not happy with this trade-off. I'd rather have an apartment than a smartphone. But growing up I unfortunately spent too much on technology: video games, laptops, cell phones that I didn't take care of and broke too quickly, the list goes on. I wish I hadn't valued these things so much growing up and had kept the money for other uses.
In more Marxist terms, we have traded variety of consumer goods for precarity in our ability to live our lives. You can play video games, but your boss can fire you on a whim and your landlord can kick you onto the street. You can't become class-mobile without taking on huge debts and possibly not making it.
I would love to know the proportion of people at, say, 25 who can afford to buy a house (in the area they grew up in, and of some reasonable size, e.g. 3 bedrooms) over time. Sure, income has gone up, but so have house prices, and so that proportion has surely collapsed. That has a much bigger effect on young people's lives than the number of luxury items they can buy and stuff into the property they don't have.
Edit: I found this interesting graph for the UK (where I live) [1]. It shows the ratio of average house price to average annual income. Not a perfect measure, as the distribution of income over age brackets is probably more imbalanced now due to starting work later (18 - 25 rather than 16 - 21), but still interesting. Brief textual summary:
* 1920 - 1945: 5 ish
* 1945 - 1955: 6 - 7 (brief spike)
* 1960 - 2000: 4 - 5 (with occasional spike to 6)
* 2000 - 2020: 8 - 9 (with dip in the middle but still above 7)
Edit 2: Here in the UK the problem is particularly acute in the suburbs of the big cities, especially London, because the ability to build new housing is artificially restricted by the green belt legislation [2].
Going by that guy's numbers, someone in 1986 had to work 73 hours per week to afford a semester of college. Either those numbers aren't accurate enough to reflect people who "put themselves through college", or people back in the '80s didn't actually put themselves through college.
Edit: To clarify, he says "net tuition", then throws out a $15,000 number for modern colleges. That's on the low-to-medium end of the per-semester cost of a modern public institution, so I'm assuming he's also listing the per-semester cost for the 1986 tuition.
> Edit: To clarify, he says "net tuition", then throws out a $15,000 number for modern colleges. That's on the low-to-medium end of the per-semester cost of a modern public institution, so I'm assuming he's also listing the per-semester cost for the 1986 tuition.
not sure where you're getting those numbers. the average annual cost (tuition + fees + room + board) of a four-year public institution is around $20k. [0] $15k per semester is far from the low-to-medium end.
If one is talking exclusively about in-state tuition, my numbers are high, but my understanding is that enough students pay out-of-state tuition that the discrepancy becomes relevant.
Child commenter to your comment may be right about sticker prices being paid by the rich, which I don't know as much about.
To be clear, I think college is an expensive, unnecessary scam that does very little to serve either individuals or society; I just think that it's trended more expensive and worse over the past few decades.
ah, that probably accounts for the discrepancy. I think the in-state figures may be more relevant to the discussion at hand though. why would someone paying their own way through school choose to forgo the in-state discount?
Strength of the state universities in a desired field. Some state universities offer much better quality and reputation of their programs than the ones in other states. The way that college is marketed as a social vehicle is that you choose the best one that you can get into, and it's worth paying for because of the connections and job opportunities you'll get there. (Scam! Holding people hostage for runaway credentialism!)
For instance, several state universities are listed as top-tier mechanical and aerospace engineering schools (consider Purdue vs. Mizzou aerospace engineering), so if you want to seriously compete in the field, it could be worth your money to join the "better" school from out-of-state, assuming you're a competitive student. The recruiting, research, and internship opportunities at top state schools are way better than at low/mid state schools.
However, from a pragmatic standpoint, you are correct: it is unwise to select these schools if the in-state discount is what you can actually afford. You just have to accept the narrower selection of available fields and the lower prestige of your chosen degree. (That's why I think it's a scam; as any honest manager will tell you, degree/curriculum content means very little relative to job performance; the top performers come from top schools not because the top schools made them top performers, but because they were already top performers in high school and just got chosen by top schools.)
It's worth mentioning that if you pay out-of-state tuition to attend a highly ranked State school, you are almost certainly not going to become a median earner. You will comfortably find yourself spending most of your career in the top 2 quintiles, if not the top quintile altogether (~$100,000 salary puts you in the top quintile).
If you're below median income and would like to attend, you're going to receive federal aid via FAFSA and Pell Grants, on top of university-specific discriminatory pricing that favors the poor.
There's no reason one should pay out-of-state tuition outside their own State unless it's at a "top-tier engineering school", in which case the point is basically moot.
The numbers don't necessarily represent how people put themselves through college, on average — it's just an apples-to-apples comparison of what it would take to "put yourself through college" in 1986 vs today.
Right, but I'm saying I don't think his numbers are correct. Either it was also not possible on average to put yourself through college in 1986, similar to how it's not possible on average to put yourself through college today, or his numbers are wrong.
In fact, I bet it wasn't actually possible to put yourself through college in 1986. I bet the people who put themselves through college went between 1950 and 1975.
No, the whole point is that "not much has changed". The narrative that one could pay their way through all of college is one that likely never really existed for a broad enough population, outside of a brief unreproducible period after WW2 when the entire world was in ruins and the US was the sole supplier of all goods & services globally.
Agreed, and here in the UK the trend is even more pronounced: there were no tuition fees at all twenty years ago. In fact the government used to pay students a grant to go to university (although in its final years the grant had shrunk past the point of being much use).
I'd love to combine this with the previous idea. What's the ratio of median home price to first full-time full-year income minus student loan payments? That would probably put the difference between then and now in even starker relief.
Would 'median disposable income' be a good proxy for that?
(in other words, a metric to determine whether this increased median income ends up in the pockets of and usable by the recipients, or whether it is being used to support day-to-day life)
If by "that" you mean the ability to buy a decent house for yourself then the answer is that it would be an extremely poor proxy ... almost uncorrelated. It's possible to live without buying a house or even renting by yourself (you can live with parents or rent a shared house with other young people). People in that situation would still have (and be recorded as having) a disposable income even though accomodation is an essential expense because owning your own personal house is not technically an essential expense.
Therefore, if wages are decent but housing is extremely expensive (which I believe is the case here in the UK, especially city suburbs) then median disposable income would continue to increase while the ability of young people to buy their own homes would continue to decrease.
I'll explain why I was wondering about disposable income in a little more detail:
In some dense cities in particular, renting (typical for young people prior to property purchase) can be very expensive.
My sense is that this creates a large transfer of wealth from professionals to property owners on a monthly basis.
Across San Francisco or New York, for example, what percentage of the paycheck sums paid by employers is redirected to their landlords?
If that percentage were to rise it would become more difficult for young people to become property owners since it'd be challenging for them to put savings aside.
With that context provided: disposable income is an attempt to quantify how effectively people could save towards property purchase - regardless of whether they decide to take that route.
(and to some extent it's also a 'guard metric' to detect communities that rent-seek to unsustainable levels)
Edit: 'tech companies' -> 'employers' in the third paragraph, and removed the assumption of a high salary. This doesn't just affect tech.
>if by "that" you mean the ability to buy a decent house for yourself then the answer is that it would be an extremely poor proxy ... almost uncorrelated.
Um, what?
If you want to buy a house "disposable income" directly translates into ability to save up for down-payment.
The ability of more people to get housing earlier in their lives is only correlated with the growth in housing stock. If the housing stock is growing more slowly than the population then it's irrelevant if people manage to save more money towards it; that just increases the house prices for everyone. Conversely, if the housing stock is growing faster than the population growth then even a decreasing amount of disposable income wouldn't stop people moving into the vacant properties... the prices would all come down to match.
Of course this sounds ridiculous because in a normal market the supply of a good or service would increase if prices go up (positive price elasticity of supply, as the economists would say). But housing markets are often not like that, and they're certainly not like that in the UK suburbs (as I said, that's partly because of the green belt).
The income has been adjusted for inflation and inflation includes housing.
I think the difference between now and 50 years ago is that everyone wants to live in a few select dense cities where back then most of the middle class didn’t want to live in cities (especially true in the US).
(Edit: I'm not an economist by training and I've realised I'm not really sure about any of this, so maybe ignore everything after this paragraph. But certainly, here in the UK, house prices have more than doubled in the last 30 years after adjusting for inflation, so inflation is nowhere near the whole picture.)
It is included but at a very different proportion from its actual impact on consumers. For example, the UK standard inflation index, CPI, includes the following items in its measure (as of 2017 [1]):
04.1 Actual rentals for housing: 5.6%
04.2 Owner occupiers' housing costs: 17.4%
(Also, I'm not sure whether "Owner occupiers' housing costs" only includes mortgage repayments, but that's the closest thing I could see on the list. And the phrasing "actual rentals" makes it sound like price increases won't necessarily be reflected in the index so long as everyone downsizes correspondingly - but I'm not clear on that one.)
It's like that because inflation is meant to reflect a very different concept than the one we're discussing here. We're talking about something becoming intrinsically more valuable (in this case, because of scarcity). Inflation is meant to judge when a currency has less intrinsic value. So if a house price goes up by 100% but that's because it's twice as valuable, then in some sense the inflation there is 0%.
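To make the weighting point concrete, here's a minimal sketch of how a CPI-style weighted index dilutes a housing price shock. The housing weights are the 2017 UK ones listed above; the residual weight and all the inflation rates are invented for illustration:

    # How component weights dilute a price shock in a CPI-style index:
    # headline inflation is the weighted sum of component inflation rates.
    # Housing weights are the 2017 UK CPI figures quoted above; the
    # "everything else" weight is just the remainder, and the inflation
    # rates are invented.
    components = {
        "actual rentals": (0.056, 0.08),                   # (weight, assumed inflation)
        "owner occupiers' housing costs": (0.174, 0.08),
        "everything else": (0.770, 0.01),
    }

    headline = sum(weight * rate for weight, rate in components.values())
    print(f"headline inflation: {headline:.2%}")  # ~2.6%, despite 8% housing inflation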
> I'm not sure whether "Owner occupiers' housing costs" only includes mortgage repayments
If it's similar to the situation in NL, those housing costs include things like municipal taxes and local services (water, sewage, electricity), parts of which are billable to the registered owner of the property, not the inhabitant. And since the central government has been outsourcing more and more responsibilities to local government, these local taxes have increased massively to pay for things that were previously paid out of the national tax budget.
> So if a house price goes up by 100% but that's because it's twice as valuable
Not likely. Many studies have shown that in scarcity, house prices don't follow the value of the property, but the budget of the buyers. The increase in housing prices is more due to the ease of access to mortgages (low interest rates, tax bonuses, etc) than due to value increases of the property.
For you. Culturally, land ownership in the US is a big deal. Yeah, some cultures are broken - I don't know if the security provided by having somewhere to call your own until the day you die is one of them.
And I really mean that - I don't know. Some cultures thrive on women being subjugated to men. Some cultures thrive on authoritarian levels of control that are inconceivable in the west.
By comparison, a desire for land ownership seems pretty mundane.
I actually think we shouldn’t be surprised that the proportion of 25-year-olds that can afford to buy a house in the area they grew up in of some reasonable size is about the same as it was for boomers. But the emphasis should be on in the area they grew up in, because in the 21st century we’re seeing a clustering of the economy in a few (predominately coastal) metros. A big part of the problem is that it’s difficult for a 25-year-old to afford to buy a house near their job in San Francisco, not that it’s difficult for them to buy a house near the middle-America suburb they likely grew up in.
It’s also worth noting that the median 25-year-old doesn’t live in San Francisco or New York. If that’s you, you’re probably in the top 2 quintiles for your age group.
> A big part of the problem is that it’s difficult for a 25-year-old to afford to buy a house near their job in San Francisco, not that it’s difficult for them to buy a house near the middle-America suburb they likely grew up in.
I think both are problems, because if they remain in the middle-America suburb, salaries are a lot lower. A 25 year old living where they grew up often doesn't earn enough to buy the local house either.
Salaries are indeed a lot lower, but the multiple of the median salary required to buy a home is a lot higher in the coastal metros. In other words, the salaries are nominally lower, but in real terms they’re actually higher.
But volatility of income is higher, or at least it’s perceived to be. If you think the future is volatile, you need to be in a place where you have multiple options for income as backup, and that is why people will put up with cities even if they prefer other places.
Well, Chicago, Dallas, and Houston are the third, fourth, and fifth largest metropolitan areas in the US so you should be able to find other jobs there and they all are in the affordable housing areas.
I do not. I imagine it's pretty difficult to quantify, but given the economic stagnation of many rural areas that depended on one or a handful of manufacturing or natural resource extraction facilities, I imagine many people can see and feel it.
It sort of makes sense: with cheaper labor abroad and cheaper transportation and communications, money can move very quickly to take advantage of arbitrage. Unfortunately, for people with families, moving isn't as easy, so the next best solution is to live somewhere that has lots of employers and is experiencing growth.
Then I'm not sure that you can definitively claim that income "volatility" is necessarily higher outside of {list of expensive coastal metros}. It could be higher, but there's no information to suggest that to be the case.
> given the economic stagnation of many rural areas that depended on one or a handful of manufacturing or natural resource extraction facilities, I imagine many people can see and feel it.
Rural areas have almost never been economically dynamic, in any society (outside of agriculture, at least). Nobody is arguing that one ought to live in a rural town — but there are a ton of tier-2 and tier-3 cities, and suburbs of tier-1 cities, where gainful employment is within reach, especially for wage earners in the bottom 3 quintiles. This includes Pittsburgh, Atlanta, Omaha, Little Rock, St. Louis, Milwaukee, Madison, Nashville, Salt Lake City, Minneapolis, Philadelphia, etc etc etc. Even Chicago, a tier-1 city, went from an income-to-home-price multiple of 2.45x in 1984 to 3.55x in 2017.
> Outside of a few notorious coastal metros, the multiplier hasn’t changed all that much.
Interesting that you put it like that. I looked at the same graphic and thought "ah, apart from the sparsely-populated central areas, the multiplier has changed radically, similar to the pattern in the UK".
At the end of the day, the conclusion is that the pattern is commonly reflected in (sub)urban areas and not in rural areas - and neither of those facts is more important than the other.
> the proportion of 25-year-olds that can afford to buy a house in the area they grew up in of some reasonable size is about the same as it was for boomers
OK, first, boomers aren't relevant here. The people buying houses in anything like OP's timeframe were GenX, not boomers, and I'm going to stick with that timeframe.
Second, I just don't think your claim is true. For example, in Massachusetts (where I live) median income in 1990 was $62K and median home price was $162K for a ratio of 2.6:1. Today those numbers are $77K and $408K for a ratio of 5.3:1. That's all of Massachusetts, not just the high-demand areas. Even your source admits that Seattle, Portland, and Ft. Lauderdale show much larger ratio increases, but casually dismisses them as anomalies instead of considering that they might be the tip of the real iceberg. I believe that any scientific sample would show a very different picture than that derived from cherry-picked extremes.
It's practically all because of wage stagnation BTW. That's a 3.1% annual increase in home prices, vs. a paltry 0.7% for wages.
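Those annual rates fall straight out of the compound-growth formula, using the Massachusetts figures above over roughly 30 years:

    # Compound annual growth rate: (end / start) ** (1 / years) - 1.
    # Massachusetts medians from the comment above, 1990 to today (~30 years).
    def cagr(start, end, years):
        return (end / start) ** (1 / years) - 1

    print(f"home prices: {cagr(162_000, 408_000, 30):.1%}")  # ~3.1%
    print(f"wages:       {cagr(62_000, 77_000, 30):.1%}")    # ~0.7%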
Houses have gotten much larger than they were 30 and 40 years ago. They've almost doubled in size since 1980 but, adjusted for inflation, are now cheaper per sq ft.
That's true, and interesting, but I'm not sure it's relevant. People have to buy the houses that are on the market, and smaller houses tend to get torn down to make way for larger ones. Why tweak the numbers that way and not others (e.g. to account for student-loan debt)?
>I would love to know the proportion of people at, say, 25 that can afford to buy a house (in the area they grew up and of some reasonable size e.g. 3 bedroom) over time. Sure, income has gone up, but so have house prices
That question is more about how government treats young people than about the economy. High housing costs are mostly caused by local government policy. In places where developers can respond to demand, housing isn't any more expensive today than it was at any point in the last 50 years.
Another interesting way to look at it is to consider the other basic of life: transportation.
Can the average 25-year-old in any given era afford to purchase their own vehicle? It is arguably more affordable to own your own car now than it has ever been in the past.
In 1997, it took 1,768 hours of work at the median wage to purchase a Dodge Caravan. In 2018, it took just 1,396 hours of work at the current median wage.
In 1997, it took 1,565 hours of work at the median wage to purchase a Toyota Camry. In 2018, it took just 1,258 hours of work at the current median wage.
Also, not only do you work fewer hours today than 20 years ago, the car you get is of higher quality / safety.
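The metric behind those figures is just sticker price divided by the median hourly wage. A quick sketch; the prices and wages here are hypothetical round numbers, not the parent's actual sources:

    # Hours of median-wage work needed to buy a car: price / hourly wage.
    # Both inputs below are hypothetical round numbers for illustration.
    def hours_to_afford(price, median_hourly_wage):
        return price / median_hourly_wage

    print(round(hours_to_afford(21_000, 13.00)))  # ~1,615 hours (1997-ish)
    print(round(hours_to_afford(28_000, 22.00)))  # ~1,273 hours (2018-ish)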
In the '90s, the increase in income was partially attributable to employees putting in more hours every year, whereas in the recent boom working hours per employee were effectively constant.
Your stats are basically gonna look the same as the housing map, but they're gonna be skewed by a handful of factors.
Vehicles are much more liquid than housing. You can't load freight trains full of land and ship them to an auction across the country where used land dealers buy them for resale.
Since the 1950s, many states have adopted (and some have adopted, repealed, and re-adopted) laws and practices with the effect of raising the price point of the "minimum viable shitbox". The bottom of the market is just a ton higher in some states than others. It's the automotive equivalent of not being able to put a mobile home in Manhattan.
Used car dealers compete with exporters for cars. If Brazil (or wherever) has an economic boom, exporters remove used cars from markets like the US and Canada, increasing costs.
The qualities of a new car and the life-cycle of a car have changed dramatically since the 1950s, whereas houses are fundamentally just land prices plus some (with the "some" being a reflection of developers saying "well, if I'm gonna buy this expensive-ass land, I might as well put an expensive-ass McMansion on it").
Alternative (mostly public) transit options in various places have waxed and waned since the 1950s.
So yes, you could use cars for your point of comparison but you'd basically get the same thing as with housing but with the added downside of more variables to control for any time you want to make a cross-time or cross-location comparison with the data.
> Your stats are basically gonna look the same as the housing map
If I move to the big city, and I go from spending 30% of $50,000 on rent to spending 50% of $100,000 on rent that's going to look bad whatever measure of housing affordability you use.
But after rent, I've got $50,000 left instead of $35,000 - so maybe I'll get a BMW instead of a Camry.
The median might have done well, but we know for a fact that inequality has risen dramatically in the past 40 years, mostly as an effect of big tax cuts for the wealthy. I'd be interested in seeing an analysis dividing the population into the bottom 50%, the middle 40%, and the top 10%, post-tax.
This is the most similar thing I could quickly find
and it seems to show that low-income salaries have largely stagnated for the last 50 years. I don't think it's debatable that places like the Greater Boston area have seen increases in rent that have gone faster than inflation. Combine the two phenomena and you'll see how many people are priced out of large attractive cities.
> Low income salaries actually experienced a higher increase than higher income salaries
In percentage points. If someone gets a pay rise from $1/hr to $2/hr, they will experience a dizzying 100% wage growth, more than any CEO could ever hope for.
This is simply a mathematical property of small numbers; it doesn't contain any meaningful information about economic development or policy.
That's correct, but also not particularly relevant, as there's no society on the planet where low income salaries experience a higher absolute increase than higher income salaries.
The fundamental point is: there's a narrative that lower income salaries 1) don't increase at all, and 2) increase at a slower rate than higher income salaries, and both are false.
It's also important to remember that the increases in salaries at the lower income levels do NOT necessarily represent a single earner's pay history. Those numbers show us the snapshot-in-time income for everyone currently at that salary level, but do not account for people actually moving between income levels as they progress in their careers. The data isn't "here's the life of Joe over the last 5 years"; it's "here's the life of people without college degrees who are X years old with Y years of experience, with X & Y held constant over 5 years".
If low-end wages double and home prices double, the ratio remains the same but the poor person is still falling behind due to fixed expenses. You'd need disposable income to double for the effect to be neutral (probably a bit more because even poor people deserve a bit of indulgence now and then). As it is, poor people are more sensitive to housing inflation. Even if wages and home prices move up in lockstep (which they haven't) they'll take more years to save up enough for a down payment. That's why so many remain permanent renters, denied the equity and tax advantages of home ownership and putting them at even further disadvantage.
So yes, the point about inequality damn well is relevant. Anyone who wants to pretend they know about statistics should know that distributions matter as much as averages and medians.
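The down-payment arithmetic above is easy to make concrete. A toy sketch, with all figures invented for illustration:

    # Years to save a 20% down payment: 0.20 * home_price / annual savings.
    # All figures invented for illustration.
    def years_to_down_payment(annual_disposable, home_price):
        return 0.20 * home_price / annual_disposable

    print(years_to_down_payment(10_000, 200_000))  # before: 4.0 years
    # Wages and home prices both double (so the ratio is unchanged), but
    # suppose disposable income only rises 60% because fixed expenses ate
    # the rest of the raise:
    print(years_to_down_payment(16_000, 400_000))  # after: 5.0 years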
Even if housing costs have gone up more than overall inflation, other things have fallen, and OVERALL inflation rate is the best rough measure of "cost of living". Focusing just on home prices or healthcare is incomplete way of understanding purchasing power / cost of living. When we say that the median income has increased adjusted for inflation, it means that the real salary has increased even accounting for the increased prices of all of the stuff that needs to be bought.
And finally, re-iterating the point: these numbers don’t represent a single earner’s pay history. It doesn’t really make sense to compare a single individual’s fixed expense in the context of these distributions, because they represent different individuals at different snapshots in time. While the shape of the distribution may be (moderately) fixed, the composition of it changes dramatically.
Taking some vague numbers from that article, a low-income salary might be $20,000, which gets an increase of 5%, so $1000. Meanwhile a high-income salary of say $100,000 gets an increase of 2.5%, so $2500. Which one is the bigger increase?
Measuring these things in percentages hides the fact that the underlying numbers are immensely different. Perhaps if we reported numbers differently, we would say that the pay increase for the high-paid worker was 2.5x the pay increase for the low-paid worker.
Alright, let's actually follow up on two of your links for once.
Regarding "The IMF was unable to reproduce":
"Piketty's response[54] noted, however, that Góes used measures of income inequality rather than wealth inequality, and inappropriately took the interest rate on sovereign debt as his index of the rate of return on capital, which makes his results not commensurate with those of Piketty's study."
Regarding "Auten & Splinter further refuted":
"There’s plenty of other literature, mentioned in the Auten-Splinter paper, the Economist articles and elsewhere, that challenges the methods used by Piketty and his collaborators. Such challenges are mainly possible because tax data draw an incomplete and often difficult-to-interpret picture of the income distribution. Tax systems are complex, different governments choose not to tax different kinds of income, and of course people, rich and poor, cheat on their tax returns. Measuring wealth inequality rises to a whole new level of difficulty: Since so few countries have wealth taxes these days, only guesstimates of the wealth distribution are possible.
So on Tuesday, Piketty, Saez, Zucman and a long list of other signatories issued a sweeping response to critics of the popular inequality narrative, pointing out that all the data-based arguments can be settled by making solid data available."
Unsurprisingly, these two don't seem to be very convincing "refutations".
I'm not convinced you actually read through either of those sources, especially considering that none of your quotes actually ever appear in them.
First, and foremost, Piketty's findings are based on theoretical models, not empirical observations. The two fatal flaws Piketty makes are 1) he models a zero-sum world where wealth is fixed, and 2) doesn't take into account taxes & transfers.
From the IMF research:
> Using a sample of 19 advanced economies spanning over 30 years, I find no empirical evidence that dynamics move in the way Piketty suggests. Results are robust to several alternative estimates of r-g.
> I find no empirical evidence that the dynamics move in the way Piketty suggests. In fact, for at least 75% of the countries examined, inequality responds negatively to r − g shocks, which is in line with previous single-equation estimates by Acemoglu and Robinson (2015). The results also suggest that changes in the savings rate, which Piketty takes as relatively stable over time, are likely to offset most of the impact of r − g shocks on the capital share of the national income. Thus, it provides empirical evidence to the model developed by Krusell and A. Smith (2015), who say Piketty relies on flawed theory of savings. The conclusions are robust to alternative estimates of r − g and to the exclusion or inclusion of tax rates in the calculation of the real return on capital.
From Auten & Splinter:
> Our estimates also differ for the bottom half of the distribution. PSZ estimated that the bottom 50 percent share of pre-tax and after-tax shares decreased by about 7.5 and 6.2 percentage points since 1979. In contrast, we estimate that pre-tax and after-tax income shares decreased by 5.6 and 2.2 percentage points over this period. Our estimates suggest that taxes and transfers offset most of the recent changes in both bottom and top income shares. In contrast with the PSZ estimate that average real pre-tax incomes of the bottom 50 percent remained virtually unchanged, we estimate that they increased by nearly one-third.
> For pre-tax/after-transfer income (which includes Social Security benefits) and after-tax income, we estimate a real increase for the bottom half of the distribution of nearly two-thirds.
> mostly as an effect of big tax cuts to the wealthy
Maybe partially so, particularly capital gains taxes. But I'd wager globalization likely had a much more pronounced impact on American inequality by pushing labor out to Asia.
Thomas Sowell makes an interesting point that very few people stay in the same categories of rich or poor in their lifetimes. Young people make up the majority of the poor, but after working and gaining experience the vast majority move out of that category.
I am pretty sure I am an example of that. Graduated in the dot com bust in 2001 and was unemployed for a year and a half. Apparently my partner and I are now in the top 10% of household earnings for our country. (Household earnings being another very misleading statistic - as people become paid better they move out of shared houses, making things look worse when they are actually getting better).
Based on the empirical tax analysis, 12% of the population found themselves in the top 1% of the income distribution for at least one year. 39% of Americans spent a year in the top 5% of the income distribution, 56% found themselves in the top 10%, and a whopping 73% spent a year in the top 20% of the income distribution.
I don't see how so many people can be in the top 5% unless selling your house counts as income and you are counting the economic activity when someone sells the family home to downsize in their senior years. At any rate, without an illustration of how the claims about income work in practice, I don't know what to make of this article. Most people aren't going to spend a year as high-level executives.
If you earn $170,000, you're in the top 5%. That's absolutely attainable within 5 years of graduating into a high-skill job.
That's also attainable within one's lifetime if they end up doing any sort of management or decision-making (as opposed to just manual labor all their lives).
It's my understanding that $170,000 per year is more than most managers will ever make in most parts of the country. However, I can maybe see $170,000 making sense if it's based on household income and a household has two manager-class employees making $90,000 a year.
That said, I'm still having a hard time believing the claim that a large number of households will have top-5% earnings. I'd love to see a blog post or something along those lines contextualizing that journal article.
Thanks. That first link makes more modest claims than the second one. They also appear to use different methodology (surveys vs tax returns). I appreciate the different links.
Aren't household earnings calculated using the tax definition of a household? If I live with three other engineers as roommates, that's not a $400k household; it's four $100k "households", despite the fact that we share the same roof.
I don't think that accurately describes the economy. Socioeconomic mobility is declining too: people who were born into rich families become rich, and those who were born into poor families become poor.
Not all young people will go on to earn a lot of money later.
I get what you're saying, but I think that inflation does not adequately represent:
1) housing (usually one's biggest expense)
2) college (nowadays the biggest debt problem)
I think the highest stress is among young people who DID get a college degree, and who DO live in the highest-rent cities in the country, and their situation is not captured by the data you cite.
BUT, I'm not saying your data isn't relevant. Just that housing and college are pretty big things to leave out, but IIUC they are not part of the standard inflation measure.
> their situation is not captured by the data you cite
Exactly. They're people too. Their circumstances matter. Also, let's not forget that not everyone even in a high-demand area is also high-income. Somebody has to do the "lesser" jobs, and for them housing is even more of a problem. Arbitrarily excluding certain metro areas because "you're a techie and can live anywhere" is just counter to reality, and just seems like a way to skew the numbers.
I think it's extremely important not to discount the circumstances of individuals. Like you said, they're people too.
But the value in this data is in helping us understand what the root causes are, and what the solutions should be.
> Arbitrarily excluding certain metro areas because "you're a techie and can live anywhere" is just counter to reality, and just seems like a way to skew the numbers.
The goal behind "arbitrarily" analyzing the data per-metro is not to skew the numbers, but to help us understand that the root cause is largely local, rather than a global systemic one. Everyone knows that housing is way more expensive than a median income can afford in San Francisco, but that's a San Francisco problem, not an America problem. It's also one that's generally well understood: they're not building enough housing due to exogenic limits.
I completely agree that we should understand the root cause, but I completely disagree that excluding certain cities helps us do that. Rather, it obfuscates. If you want to illuminate, you should aggregate by population density, then slice and dice by age, income quintile (not just median), parental income quintile, etc.
> that's a San Francisco problem
What's your excuse for the hundreds of places that aren't San Francisco but still have ridiculous ratios between income and housing prices? Your own sources identify some. The absolute last thing this site needs is more SF-centric tunnel vision.
Of those that have experienced a dramatic increase, the root causes are common: restrictive land use policy that prevents us from building more housing. One may call that "tunnel vision", but others may call that "systematically isolating root causes".
Yet again, why the arbitrary limitation? Your own source shows about a dozen areas going from the 5-8 band to the over-8 band since 1990, and more than two dozen going from the 4-5 band or lower to the 5-8 band - when each multiple represents most of $100K, and when the entire graph should ideally remain about the same color overall but is clearly getting worse. But hey, let's look only at the top 10 out of 392 and ignore the fifth of the population that's not in an MSA. That's totally the way to "systematically isolate root causes" ... so long as the root causes are the ones we find convenient. Otherwise not.
It's not arbitrary, I'm just sampling the most populous because that's typically where the increases have allegedly been the most dramatic. The intention behind choosing only the top 10 is to be as minimally charitable to my own point as possible.
In the top 20 most populous MSAs, only 7 experienced a substantial multiplier increase.
In the top 50, it's even less.
The further down the list you go, the less likely there has been a dramatic increase in home-price-to-income ratio.
> The absolute last thing this site needs is more SF-centric tunnel vision.
Indeed. Canadian here -- you Americans have it good! In Canada, there aren't so many metros to choose from. Moving away from the city you grew up in generally means moving much, much farther away, and adapting to a very different climate. For any given profession, the number of places with good opportunities is also a much more limited selection.
> BUT, I'm not saying your data isn't relevant. Just that housing and college are pretty big things to leave out, but IIUC they are not part of the standard inflation measure.
There's also not one single standard inflation measure, there are a number of different ones with different weights. The real median income is up regardless of which measure you choose to use.
These income gains are inflation adjusted, and about half the entire weight of most inflation indexes is just housing, education, and health care costs:
Even if housing costs have gone up more than overall inflation, other things have fallen, and OVERALL inflation rate is the best rough measure of "cost of living".
There's some disagreement around whether we should use CPI-U, CPI-W, or PCE — CPI-W underweights housing, for example — but importantly, the median income has increased since 1981 regardless of the choice of index. The median total compensation has increased since WW2 regardless of the choice of index.
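For the mechanics: deflating a nominal series just means dividing by the chosen index relative to a base year, which is why different indexes can shift the size of the increase without changing its direction. A sketch with invented index values, not actual CPI or PCE data:

    # Real income = nominal income / (index_now / index_base).
    # Index values are invented to show the mechanics only.
    nominal_income = {1981: 20_000, 2020: 60_000}
    indexes = {
        "CPI-U-like": {1981: 90.0, 2020: 258.0},
        "PCE-like": {1981: 45.0, 2020: 110.0},
    }

    for name, idx in indexes.items():
        real_2020 = nominal_income[2020] / (idx[2020] / idx[1981])
        print(f"{name}: 2020 income in 1981 dollars = {real_2020:,.0f}")
    # Both deflators leave 2020 real income above the 1981 baseline of
    # 20,000, just by different amounts.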
You're forgetting about the vast majority of jobs: the mundane economy. If you're in the mundane economy, you compete with a dozen other people for full-time jobs that pay somewhat above minimum wage, and no employer in the right mind would actively look for you in particular. You obsess about your CV and worry if anyone will hire you for anything if you lose the job you currently have.
A tech worker is in the talent economy. The receptionist at their company's office is in the mundane economy.
Really what you want is for gig / delivery style jobs to be unionized, or for it not to be legal or easy for sole operators to work with a bicycle and a backpack. I think we want to limit the ability of companies like Uber to employ people as contractors. If you force them to have a staff and to manage their staff, then their workers' talent is important.
But at that price point, I'll probably make the trip to get my own pizza / take out.
The gig economy isn't really about gigs; it's more about the externalization of laziness, and about companies breaking the social contract, IMHO. They are creating serious long-term problems for the workforces that lose effective labor to these crap jobs.
I'm in Australia, and I live a fair distance from an Indian restaurant I like. I'll still do the drive to pick up my food. If I use some gig delivery app, then I'm pretty sure I'm asking someone to do work for me that is to some extent unpaid. I know almost to 100% that they don't have superannuation or anything near the benefits that I had when I worked crappy jobs when I was younger.
This also irks me from an employment-participation standpoint. In the tech world, people bemoan "the best minds of my generation working out how to make people click ads", etc. But no one's talking about what happens to a few generations of people who are forced into paycheque-to-paycheque living, or about the downstream social welfare and retirement costs of these shit jobs.
We're letting a lot of companies make money off the future well-being of their workforce, who happen to also be our citizens.
>Really what you want is for gig / delivery style jobs to be unionized, or for it to not be legal or easy for sole operators to work with a bicycle and a backpack
So basically do what some states have done for plumbers, electricians or other trades with a licensing body and professional cartel?
Bars to entry and a strong incentive to keep the status quo once you're in will certainly jack up prices and pay and businesses who have to follow the law will have to get their gig services from the cartel members but everyone else will just consume less of the services. Considering that these gig economy services are mostly consumer facing I think you'd just see demand fall off a cliff if prices were raised by regulation. The survivors would have high wages but unless some other employment option is available labor force participation will suffer.
I think if it was as easy as cranking the regulation knob to 11 someone would have already done it.
I am really flabbergasted at how people with low incomes don't look very hard for good deals, or pay for things that they shouldn't be paying for at that stage of their life.
I saw an article saying there is no place in America where people can afford to live on (I forget the amount). Someone shared it on social media, and I did the calculation, and I knew many places that were affordable by those standards, even in a large city.
They retorted that there isn't a place in America where you can find a place under $800. I'm like ... did you look very hard? When I was in Philadelphia from 2007-2009, I shared a 3-bedroom apartment with 3-4 other people for ~$200. When I wanted my own 'place', from 2009-2013 I rented a micro apartment in downtown Philadelphia for $400/month.
I'm sure prices are higher now, but there are affordable places to live; you just have to be willing to give up something: commute time, amenities, size, privacy, etc.
With income, other industries have room to complain, but not tech. My $28k TC starting salary isn't even what people are getting as signing bonuses these days.
While I doubt that you have a reliable source for your assumption that poor people don’t seek out deals or limit unnecessary purchases, I’ll note that being poor literally reduces your capacity to make good decisions - https://www.reuters.com/article/us-poverty-brain/study-finds....
I don't know. If we're going down the route of what counts as "inflation-adjusted": do housing, education, and medical care count there, or only [or overwhelmingly] consumer goods? If it's the latter - which I would expect - then the stats are cherry-picked and the money ain't worth as much as it used to be.
well, yes. My point was more or less that I haven't seen an adequate inflation index whenever I read about this topic (which is quite often). So just saying "no matter which index you use" isn't worth much if every usual index is going for the wrong measurements.
Nonetheless, there are articles which focus on the real inflation and it doesn't look good: https://www.theatlantic.com/ideas/archive/2020/02/great-affo...
Choice Quote:
"The price tags for tuition and fees at colleges and universities have risen twice as fast as wages, if not more, in recent years. Rental costs are outpacing wage gains by a percentage point or more a year. Health-care costs have grown twice as fast as workers’ wages. And child-care costs have exploded."
> Even if housing costs have gone up more than overall inflation, other things have fallen, and OVERALL inflation rate is the best rough measure of "cost of living".
Keep in mind that this still means we ought to solve the rising house price problem (by liberalizing zoning restrictions) and reduce healthcare prices (either by having the State subsidize all of it or by improving price transparency). Doing so just improves the inflation-adjusted wage even more than it has already improved.
Thanks for the discussion and your sources. I guess your numbers add up. Nonetheless, I don't think using inflation is enough as a measuring stick; upward mobility and inequality are not really captured by it. Also, it appears public services in the US have suffered a lot in recent years. So my full picture of the US is not as bright as yours, although your sources and arguments are fine in themselves.
If you’re interested, I provided some empirical data around upward mobility during one’s life elsewhere. I’ll just copy-paste here for convenience:
“Based on the empirical tax analysis, 12% of the population found themselves in the top 1% of the income distribution for at least one year. 39% of Americans spent a year in the top 5% of the income distribution, 56% found themselves in the top 10%, and a whopping 73% spent a year in the top 20% of the income distribution.
I’m not sure whether this oft-cited research demonstrates much regarding upward mobility, per se, but it does provide some compelling insight into the volatility of income throughout one’s life. Typically, when speaking of upward mobility, some degree of stability is assumed to follow an upward income movement, but this study doesn’t preclude the possibility of someone going from, say, the 20th income percentile to the 80th for a year, then subsequently falling back to the 20th income percentile for the remainder of their life. That’s not a fault of the study, to be sure, but it does limit the scope of its application with regard to the topic of upward mobility.
You’re absolutely correct that, while one may spend time in a higher percentile, they may then fall back to a lower percentile for the remainder of their life.
Nevertheless, it’s an important analysis because if the majority of people really do spend a few years in affluence, it suggests the uncomfortable reality that there is a higher degree of personal responsibility involved than we may be letting on.
> it suggests the uncomfortable reality that there is a higher degree of personal responsibility involved than we may be letting on.
How did you reach this conclusion? I don't see any link between data demonstrating that most people spend at least one year in relative affluence and even the general concept of "personal responsibility," let alone some sort of specific causal relationship between the two.
Because if the majority of people spend some time in the top quintile (54% spend at least 3 years in the top 20%), it means that the majority of people experience the same (or similar) set of opportunities available to the top quintile income earners. Either they are able to capitalize on those opportunities and stay in the top quintile, or they are unable to do so and fall down to a lower quintile.
Also, nobody is concluding that it’s 100% personal responsibility. Rather, the implication is that personal responsibility might play a greater role in outcomes than some of us are willing to accept.
> Because if the majority of people spend some time in the top quintile (54% spend at least 3 years in the top 20%), it means that the majority of people experience the same (or similar) set of opportunities available to the top quintile income earners.
This is an assumption that you're making, not a logical conclusion which naturally flows from the data. All we know is that 54% of people have been in households with annual income landing in the top quintile three times. The article says nothing about opportunity - would you consider a household where two adults each work two full-time jobs, all four jobs paying $30,000, to have similar levels of "opportunity" to a new CS grad making the same $120,000 total working at FANG?
With regard to social mobility, I found the article's most compelling data to be that representing the percentage of individuals who spent 10 or more years in one of the provided income categories, as that seems more conducive to measuring sustained improvement in economic outcomes. Those numbers are as follows: Top 20%: 31.4%, Top 10%: 14.2%, Top 5%: 6.6%, & Top 1%: 1.1%. These seem to illustrate a far stabler income distribution than that suggested by the 1-5 year figures. I'd be interested in seeing comparable figures for 15 and 20 years, as well, but they aren't readily available.
>Rather, the implication is that personal responsibility might play a greater role in outcomes than some of us are willing to accept.
In all candor, while I could certainly be misreading you, it sounds to me as if you're so "willing to accept [personal responsibility playing] a greater role in outcomes" that you're practically evangelizing for the notion.
This is tangential to the discussion at hand, but, if you feel that the role of personal responsibility is under-emphasized in the United States, my assumption would be that you must spend a lot of time in relatively homogeneous, counternormative circles. American culture has long been characterized by what many other cultures might consider a glaring over-reliance on the ideal of personal responsibility as a means of attaining prosperity (perhaps that cultural inclination is unsurprising, given the unprecedented growth historically experienced by Americans). Personal responsibility is a bipartisan value for most Americans, though what it actually means often remains elusive in practice.
In Deaths of Despair [0] the authors explain how the economy in the US basically bifurcated. Nearly all gains go to those with post-secondary education. The large population of Americans without a college degree is not experiencing the boom that the rest of us have experienced since the bottom of the 2008 recession.
This doesn't change the facts you shared. The median is clearly rising. But it can also help explain the reality we all see on the ground, where those who are near the bottom of the income distribution are more frustrated than ever and see the system as rigged against them.
I think it’s exactly correct that a post-secondary education is a reliable guarantee for an upper-middle-class way of life (depending on the degree, of course).
That being said, it isn’t clear that those that go without a post-secondary education are necessarily less able to afford one today than they were in 1991.
And while there has been a modest increase in the inflation-adjusted price, the ROI of college degrees has increased more than the price. The net result is that the median number of years to recoup the cost of a bachelor's degree, adjusted for inflation, has gone down since the 1980s, from about 22 years to about 10 years -> https://libertystreeteconomics.newyorkfed.org/2014/09/the-va...
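The payback-period calculation behind that trend is simple: total cost of the degree divided by the annual wage premium it confers. The inputs below are invented to show the mechanics; the linked NY Fed piece has the real ones:

    # Years to recoup a degree: total cost (tuition + foregone wages)
    # divided by the annual wage premium over non-graduates.
    # All figures invented for illustration.
    def years_to_recoup(total_cost, grad_wage, nongrad_wage):
        return total_cost / (grad_wage - nongrad_wage)

    print(years_to_recoup(66_000, 38_000, 35_000))   # small premium: 22.0 years
    print(years_to_recoup(150_000, 65_000, 50_000))  # large premium: 10.0 years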
> The net tuition (not the sticker price, which only the rich really pay)
What exactly does "net tuition" exclude? Financial aid comes in many forms - grants, loans (multiple kinds), work study. Excluding some of these might be fair. Excluding others would be chicanery.
> the majority of Americans attend public universities (myself included
Also, the majority of students attend big universities. Want to guess how tuition has increased at Texas A&M or Ohio State vs. a legion of tiny schools that barely achieve accreditation? How many of the sources you found on Twitter are improperly taking an average across schools instead of across students?
> What exactly does "net tuition" exclude? Financial aid comes in many forms - grants, loans (multiple kinds), work study. Excluding some of these might be fair. Excluding others would be chicanery.
The source explains it better than I can:
> The amount that students and their families have to pay for enrollment, “net price,” is generally much less than the published price (“sticker price”), for two primary reasons. First, students often receive grant aid from federal and state government programs. The Pell grant program allocates up to $6,345 per year to students enrolled fulltime in postsecondary education. Students from the poorest families receive that full amount, and the award size declines as family income and wealth increase.
> Second, private colleges and universities sometimes give discounts to students based on their perceived ability to pay. This practice has the benefit of allowing students to enroll who otherwise would not be able to afford to do so. But it can also be viewed more pessimistically as price discrimination, which is generally perceived as a predatory behavior when exercised by for-profit companies. Essentially, colleges charge students the highest price that they would individually be willing to pay, up to the sticker price, in order to maximize revenue.
> Want to guess how tuition has increased at Texas A&M or Ohio State vs. a legion of tiny schools that barely achieve accreditation?
The source includes that information as well:
> students at public four-year colleges have seen their net costs go up significantly because of increases in tuition that were not offset to the same extent by aid and scholarships. The average annual net cost at a four-year public college has grown by almost 81% beyond inflation over the past decade, to $3,870.
On its own, that looks really bad, but once you account for the change in ROI, the net outcome looks a little different. The upfront cost is higher today, but the long run benefit is even higher.
That doesn't completely answer the question. It addresses discounts and grants, but there's more to it than that.
> On its own, that looks really bad
Yes, it does, and that's still no accounting for differences in size. Even state schools vary widely. Eastern Michigan University is not University of Michigan by a long shot.
> once you account for the change in ROI
That matters little to someone trying to figure out how they're going to afford a house while they're still paying off student loans. It mostly kicks in much later, and money later is not the same as money earlier. If you really want to consider that increased ROI, you also have to consider how it's offset by lost opportunities for saving and compounding interest because people are buried when they first get out of school.
Also, if college is no less affordable, why are they so buried? Graduating with more student-loan debt than income was practically unthinkable when I went to school. It's commonplace now. Why? This whole empathy-lacking "it's no harder than it used to be" argument just doesn't pass the smell test.
> That doesn't completely answer the question. It addresses discounts and grants, but there's more to it than that.
Discounts are huge. "Discounts" is just another way of saying that the university charges different prices based on how rich you are, and the sticker price is basically a fiction. It's essentially total price discrimination, except we only learn the true prices after we reveal our income level. It's certainly a problem worth solving.
> Yes, it does, and that's still no accounting for differences in size. Even state schools vary widely. Eastern Michigan University is not University of Michigan by a long shot.
In both instances, earnings exceed debt. Also keep in mind, those numbers are total debt, not tuition. You can play around with the tool and select different universities and different majors. It's behind a paywall, but if you don't have access, I can pull the numbers for whichever university you'd like to compare.
> That matters little to someone trying to figure out how they're going to afford a house while they're still paying off student loans.
No, the implication is that the salary for college graduates is almost always above the median wage. The calculus is whether the wage increase is higher or lower than the monthly cost to pay off the degree. If it's lower, then the degree is not worth it.
> Also, if college is no less affordable, why are they so buried? Graduating with more student-loan debt than income was practically unthinkable when I went to school. It's commonplace now. Why?
Because it's one thing to say that student loan debt is commonplace, and another thing entirely to suggest that unpayable student-loan debt is commonplace. Consider that almost all homebuyers take on mortgages (and hence are in debt), but most of them pay them off and end up with an asset at the end of it. College degrees are (for the most part) exactly like that. The median student loan debt is $17,000 (https://www.forbes.com/sites/zackfriedman/2020/02/03/student...) — but paying that off is like paying off a car loan for a Toyota Corolla, except college degrees aren't a depreciating asset (depending on the major).
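To put that $17,000 median in car-loan terms, here's the standard fixed-payment amortization formula. The 10-year term and 5% rate are my assumptions, not figures from the Forbes piece:

    # Fixed-payment amortization: P * r / (1 - (1 + r) ** -n), where r is
    # the monthly rate and n the number of monthly payments.
    # The 10-year term and 5% rate are assumptions, not sourced.
    def monthly_payment(principal, annual_rate, years):
        r = annual_rate / 12
        n = years * 12
        return principal * r / (1 - (1 + r) ** -n)

    print(f"${monthly_payment(17_000, 0.05, 10):.0f}/month")  # ~$180/month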
> This whole empathy-lacking "it's no harder than it used to be" argument just doesn't pass the smell test.
I wouldn't ascribe intent, but data like this is important for understanding what the reality is for the majority of people. While there absolutely exist adverse situations where someone is saddled with hundreds of thousands of dollars in student debt and no employable skill to help pay it off, they don't represent the majority of college graduates. The news also has a bias toward reporting on the most exceptional cases; you never really hear about the successes. Importantly, this doesn't mean we shouldn't help those who are struggling. It just means we don't need to upend the entire system to do so.
I was comparing UM to EMU in terms of size, not cost, but this is still illustrative. Treating the two schools as equal understates cost: UM enrolls far more students, so the enrollment-weighted per-student average is higher than a simple average across the two schools.
Also, did you "forget" about people without degrees again? They live in the same places, they're also affected by the high cost living, but the ROI of a college degree doesn't help them at all. It's a red herring. As with so much in this discussion, it obscures reality instead of illuminating it.
> Treating the two schools equal understates cost, as the per student average is higher.
Maybe I didn’t make myself clear: the data pulled from the WSJ tool is per student. It’s what the middle individual (per university) expects to owe vs earn. The size of the university doesn’t matter in the context of that data.
> Also, did you “forget” about people without degrees again?
No I didn’t, but people without college degrees earn the median wage and do not have to worry about paying off student loans, which was the earlier contention.
People with college degrees earn above the median wage, at which point the median numbers here don’t really apply to them.
You have to pick your grievance: either you earn the median wage OR you have to worry about paying student loans, but not both. The (wage - expenses) amount is, on average, positive in either case.
The ROI number represents the true value of college, because it not only incorporates the price, it incorporates the net expected returns. Not including the latter obscures more than it illuminates. Simply comparing the price now vs the price in year X is not apples-to-apples, because you were essentially buying a different thing in the 1980’s that was worth less than the unique thing you’re buying today. They just happen to both be named “college degree”.
You're missing the point again. This isn't about the original data. It's about how you combine data from multiple schools to calculate an overall average. If that process doesn't include enrollment, you get the wrong answer.
>> Also, did you “forget” about people without degrees again?
>No I didn’t,
> [four paragraphs about degrees, ignoring non-degreed people again]
I'm not allowed to make the comment that that deserves.
To recap this conversation, I cited net tuition data that calculates per student tuition, and pegs public 4 year in-state tuition average at ~$11,000 per student.
You responded by suggesting that university sizes vary, and therefore that data is incomplete (though you still haven’t explained exactly how). You then cited EMU vs UM.
I pulled up the actual per-student data at EMU vs UM, and showed that the data for both 1) corroborate the net tuition calculation, and 2) don’t wildly vary from each other. Again, these are per-student calculations. The spot-check samples agree with the aggregate calculation.
> [four paragraphs about degrees, ignoring non-degreed people again]
I didn’t ignore non-degreed people. When analyzing people who don’t have degrees, we have to try to understand the root cause: whether it’s because they don’t want to attend, because they couldn’t get admission, or because they couldn’t afford it. I’m simply making the argument that reason #3 is no more likely today than it was 40 years ago. I gave you maybe 4 paragraphs arguing why that is the case (a degree confers a higher salary that more than pays for the up-front cost + everyone has access to student loans + the median loan amount is roughly == the price of an entry-level car).
In the context of a conversation about the costs of college, I’m also pointing out that (by definition), non-degreed people don’t have to worry about those costs. Non-degreed people are also overwhelmingly the people that earn the median wage, and if you’ve paid attention this entire thread, the numbers for median wage earners are positive — even if not as positive as their degree-holding counterparts.
If I remember my r/badeconomics crash course, the issue is that jobs are being replaced, and replaced with jobs like burger flipping rather than manufacturing jobs that would allow you to skill up.
I generally agree with you. I suppose the argument--depending upon what the overall distribution looks like exactly--is that you can have a sizable minority clustered somewhere below what would have been considered a reasonable middle-class lifestyle while the median still looks reasonable.
Also, CPI doesn't really account for housing prices in many of the metros where the best jobs are, or for higher education prices.
By definition, the “cluster” below the median is exactly half, regardless of the shape of the distribution.
> Also CPI doesn’t really account for housing prices
Per my source, the median income is the highest it’s been since 1981, and total compensation is the highest it’s been since WW2, regardless of the inflation index you use.
> By definition, the “cluster” below the median is exactly half, regardless of the shape of the distribution.
Right, but that doesn't measure how far away from the median the upper and lower clusters are.
> Also, the median number of years to recoup the cost of a bachelors degree, adjusted for inflation, has gone down since the 1980’s, from about 22 years to about 10 years
That would be consistent with a hypothesis that real wages have increased for higher earners and decreased for lower earners, right?
> Right, but that doesn’t measure how far away from the median the upper and lower clusters are.
Well, it doesn’t particularly matter how far the lower clusters are from the upper ones if their absolute standard of living has improved — and we measure that by accounting for inflation. Regardless of which inflation metric you use (some include housing prices, some don’t), median income has risen.
You can also look at the historic income trends divided up by quintile -> https://www.census.gov/data/tables/time-series/demo/income-p... (Table H-1, All Races; it's an Excel spreadsheet). Inflation-adjusted income for the bottom two quintiles has also increased substantially.
> That would be consistent with a hypothesis that real wages have increased for higher earners and decreased for lower earners, right?
No, it would be consistent with a hypothesis that real wages have increased faster for higher earners than for lower earners, but it demonstrates no evidence that they have decreased for lower earners.
The problem with using the median for income is that it tells you nothing about inequality. If the top 1% get their income doubled (or the bottom 30% get their income halved, or both at the same time) with all else being equal, the median does not change. You really need a high percentile as well. Low percentiles are not that interesting, because the distribution is much more compressed there.
In essence, a median that keeps climbing nicely is no indication that the situation is not becoming catastrophic. Granted, using the average is arguably worse.
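A minimal sketch of that property, with invented incomes: doubling the top income moves the mean and the upper tail, but leaves the median untouched.

    # Invented incomes ($k). Doubling the top earner's income changes the
    # mean and the upper tail, but not the median.
    import statistics

    incomes = [20, 30, 40, 50, 60, 70, 200]
    print(statistics.median(incomes), statistics.mean(incomes))  # 50  67.1...

    incomes[-1] *= 2  # the top earner's income doubles
    print(statistics.median(incomes), statistics.mean(incomes))  # 50  95.7...
    print(max(incomes))  # 400 -- only a high percentile registers the change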
If the top 1% get their income doubled, and the median doesn't change (even after accounting for inflation / purchasing power), then it's debatable if we even have a problem.
If the bottom 30% income halved, you're correct that we have a problem, but that hasn't actually happened.
You can view the inflation-adjusted income data split up by quintile here -> https://www.census.gov/data/tables/time-series/demo/income-p... (Table H-1, All Races, it's an Excel spreadsheet). Even among the bottom quintile, inflation-adjusted income has increased.
> If the top 1% get their income doubled, and the median doesn't change (even after accounting for inflation / purchasing power), then it's debatable if we even have a problem.
I think we do, as these people have disproportionate economic and political power, which results in unbalanced economic policies that favour one specific socio-economic class and breeds discontent. I don’t have links handy, as I am on my phone, but there are works showing a correlation between political instability and economic inequality.
By definition, the top 1% are a minority of voters, so their influence is only as strong as the votes that they themselves are able to cast, or that they are able to convince others to cast. Their impact on elections is overstated -> https://fivethirtyeight.com/features/money-and-elections-a-c...
I don’t mean any disrespect, but history, ancient and recent, has shown that this is not true. On several occasions an upper class minority has been able to convince parts of lower classes to behave against their best interests. A very relevant example is the capture of the GOP and the enduring support it gets from poor, rural whites even though its policies systematically favour wealthy elites. Witness how Republican senators turned a minority of the votes from this limited support base into a weapon to completely remake the judiciary.
What is spent during campaigns is just the tip of the iceberg, and a tiny drop in the daily deluge of propaganda. Look at the media landscape in the US. Private interests such as the Murdochs, Robert Herring, and whoever controls the Sinclair Broadcast Group drive the narrative and the constant brainwashing. I mean, they keep convincing a significant part of the population that accessible health care is bad. You seem to assume that people are rational and not easily influenced. They are not, and they are.
> On several occasions an upper class minority has been able to convince parts of lower classes to behave against their best interests.
No, on several occasions, an upper class minority has been able to convince parts of lower classes to behave against what you think their best interests should be.
> I mean, they keep convincing a significant part of the population that accessible health care is bad.
Look, I understand that a lot of people on this forum think that there's one and only one way to achieve accessible healthcare, but the reality is that there are SO many different ways to get affordable / high quality care. Singapore, Switzerland, the UK, Denmark, Germany, and the Netherlands all have wildly different healthcare systems, but all produce fairly good outcomes (especially Singapore).
I'm happy to talk more about all of them: I write software for claims processing systems for a living and can tell you all about how healthcare pricing works.
What you're witnessing is that there's broad agreement around what the problem is, but violent disagreement on what the solution ought to be.
That's a separate argument, because at that point we're no longer talking about the standard of living of the "average American".
That being said, across all quintiles (the bottom 2 quintiles include the clusters below the median), real income has also grown substantially -> https://www.census.gov/data/tables/time-series/demo/income-p... (click Income Limits for Each Fifth and Top 5 Percent, it's an Excel spreadsheet)
I'm not sure what use it is to consider the "average American". I'm concerned about the cohort of people who are barely scraping by; you seem to paint a picture that these people's lives have improved because their "real income" has grown substantially. OK, so they were barely making rent at $15k/year, and their real wages have increased by $5k/year since 2010. Has their rent remained stable over the last ten years? Perhaps these quotable numbers aren't all that useful, except to further marginalize?
Correct, which is why we adjust for inflation, and there are a ton of different inflation indexes one can use. Per the links in the source comment, regardless of which index you use, we have seen an increase in income.
> Has their rent remained stable over the last ten years?
From the data, outside of expensive coastal metros, yes. If you don't have a college education, you're more likely to enjoy a higher standard of living in Little Rock, Arkansas or Minneapolis, MN than in San Francisco or Manhattan.
> Perhaps these quotable numbers aren't all that useful except to further marginalize?
No, these quotable numbers tell us that we're generally trending in the right direction. Therefore, any solution we employ to help those that are the poorest among us (which is important) doesn't necessarily need to upend the entire system.
> From the data, outside of expensive coastal metros, yes. If you don't have a college education, you're more likely to enjoy a higher standard of living in Little Rock, Arkansas or Minneapolis, MN than in San Francisco or Manhattan.
Are we comparing a person in one location to another? I thought we were talking about improvements year over year.
> Are we comparing a person in one location to another? I thought we were talking about improvements year over year.
We're doing both. And on both metrics, the bottom are better off today than ever before. Unfortunately, the bottom were really poor in the 20th century, so being better than that (while good) doesn't mean our work as a society is complete. But that's a welfare problem, not a "what is the system like for the average American" problem.
It's also worth noting that, in America, the bottom 3 quintiles are extremely underrepresented in the highest COL metros. That's just not where they're concentrated. Given the purchasing power disparity between US states, the variance in the US-level data is probably too wide to be useful, and you kind of have to compare from state to state -> https://www.advisorperspectives.com/dshort/updates/2019/12/1.... Once you account for that, the root causes also become easier to identify — predominantly local land use policy.
Say you calculate the average income for all spectators at a football game at $40,000. Suddenly Bill Gates walks in and the mean skyrockets, creating a false picture of what everyone else actually makes (the median, notably, barely moves). Single summary statistics can often oversimplify and create erroneous impressions.
I know! That bhupy person just won't stop addressing replies to their own comment(s) with relevant links to metrics and sources. The audacity, am I right?
this is correct. Conventional statistical functions rely on a whole host of assumptions about the shape and type of data: typically that it's a bell-curve distribution & ratio-style data. The more you move away from that parametric style of data, the less meaning conventional statistical functions have.
To use an example: I worked at a shop with a freemium model. About, let's say, 50% of our customers paid $0.00. About 2% paid over, say, $10K. Between those two points we had a reasonably clean exponential curve, clustering at key product price points. (I've goosed the numbers for confidentiality purposes, but this is close enough for discussion.) In this scenario, what was the average ROI per account? Well, truth is, that's not a useful answer, as it was roughly $0.01.
I wound up building a small set of non-parametric data tools to assess the customer base. It was a fun journey, but a critical lesson is: the kind of data you have and its shape determine which statistical functions have meaning, and they didn't teach that kind of data analysis in your basic calc-based stats class in college. :)
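To make that concrete, here is a minimal sketch of the kind of shape-aware summary being described, with fabricated freemium-style revenue data (not the commenter's actual numbers):

    # Fabricated data: ~half the accounts pay nothing, the rest follow a
    # long-tailed (exponential) curve. The mean hides the shape; percentiles
    # and the share of payers describe the base far better.
    import random
    import statistics

    random.seed(0)
    accounts = [0.0] * 5_000 + [random.expovariate(1 / 200) for _ in range(5_000)]

    print(statistics.mean(accounts))                     # dragged up by the tail
    print(statistics.median(accounts))                   # ~0: half pay nothing
    print(statistics.quantiles(accounts, n=100)[94])     # 95th percentile
    print(sum(a > 0 for a in accounts) / len(accounts))  # share of paying accounts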
That's essentially only true in SF, NYC, Seattle, and maybe DC — and that can largely be attributed to local land use restrictions more than anything else.
There's an entire zone of "Fine, American Cities", all affordable to both Boomers and Millennials. There are dozens of metro areas you can move to, buy a home (if you like), and start saving your income.
It's important to note that these charts apply to the nation, not individual cities. Rent has increased 5x since 1990 in Boston, while wages are up 2.7x.
The same is true for almost every major (growing) city beside Houston & Dallas.
Combine that with the fact that income taxes are higher at these "higher wages" and that student debt payments are up 4x just since 2007 - who knows how much since the 1990s - and you can start to see why things aren't so rosy for young professionals.
If you look hard enough, you can find any chart to support almost any opinion. The point of the "anecdata" was to get people to look at a certain side of things. Particularly for people living in HCOL cities that aren't "highly skilled".
Major metros where the multiplier has largely stayed the same:
Columbus: 2.47x (1984) -> 2.97x (2017)
Omaha: 2.3x (1984) -> 2.77x (2017)
Little Rock: 2.85x (1984) -> 2.75x (2017)
Kansas City: 2.64x (1984) -> 3.01x (2017)
Pittsburgh: 2.16x (1984) -> 2.42x (2017)
Minneapolis: 2.64x (1984) -> 3.34x (2017)
Houston: 2.66x (1984) -> 3.68x (2017)
Chicago: 2.45x (1984) -> 3.49x (2017)
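The thread never spells out how this multiplier is computed; assuming it's something like median home price divided by median household income, here's a minimal Python sketch with invented dollar figures:

    # Assumption: "multiplier" = median home price / median household income.
    # The dollar figures below are invented placeholders, not real metro data.

    def price_to_income(median_home_price: float, median_income: float) -> float:
        return median_home_price / median_income

    print(round(price_to_income(60_000, 24_300), 2))   # 2.47, a "1984"-like ratio
    print(round(price_to_income(180_000, 60_600), 2))  # 2.97, a "2017"-like ratio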
Also, inflation adjustments take into account the increase in prices of all goods & services, including housing -> https://www.bls.gov/cpi/tables/relative-importance/2019.txt. So even if housing costs have gone up more than overall inflation, other costs have fallen, and the OVERALL inflation rate is the best rough measure of "cost of living". There's disagreement over which indexes are better (based on how they weight different expenses); CPI-U vs CPI-W give different importance to different goods & services. Importantly, inflation-adjusted incomes have gone up regardless of which index you choose to use.
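For readers unfamiliar with the mechanics, the inflation adjustment being argued about is just a rescaling of nominal dollars by a price index; a minimal sketch, with invented index values:

    # Express a nominal income in "base-year dollars" using any price index
    # (CPI-U, CPI-W, PCE, ...). The index values here are invented.

    def to_real(nominal: float, index_then: float, index_base: float) -> float:
        return nominal * (index_base / index_then)

    # $30,000 earned when the index stood at 100, restated in dollars of a
    # year when the index stands at 250:
    print(to_real(30_000, index_then=100, index_base=250))  # 75000.0

Which index you pick changes index_then and index_base, and hence the adjusted number, which is why the comment stresses that the trend holds across indexes.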
I agree that it's important to compare anecdotes to available data, but we should be careful not to assume that the available data is itself "the forest." In other words, the map is not the territory.
While it's certainly possible that the sort of anecdotes commonly found in threads like this (regarding the challenges facing today's youth) are unfounded, I think it's more likely that they reflect qualitative changes not represented by the data (or, alternatively, changes which simply aren't represented by the particular data typically used in conjunction with analysis of these topics).
Data showing that the median person earns more working fewer hours today than their parents did at the same age, for example, can't reflect (hypothetical, let's say) circumstances such as:
* An increasingly competitive job market: Unemployment figures don't account for increasing education and experience requirements job-seekers face, nor do hours worked reflect time spent on career preparation in an era where high school students study for the SATs, college students are taught that "networking" is an essential skill, and college grads are more likely to have had multiple internships than their parents are to have had one between them.
* Increasing career instability: Not only are single-employer careers long gone, but, for many, so are single-career lifetimes. Additionally, the increasingly bimodal distribution of incomes means that, while the median income may have increased, the consequences of falling out of a "good" career track can seem greater than they used to be. [1] For reference, 44% of Americans work in 'low-wage jobs' with a median income of ~ $18,000. [2]
* Increases in the cost of living that counteract income gains: Rent, housing prices, (more-important-than-ever) higher education, healthcare & childcare have all increased significantly over recent decades, which likely makes it difficult for many to feel as wealthy as one would suspect.
Considering circumstances like these helps me understand how many may feel much worse than real median income might suggest and makes it seem less strange that young people appear to be poorer than their parents were at the same age - living with their parents longer into adulthood, buying fewer homes, getting married later, having fewer children, etc...
I would also speculate that, in terms of perception alone, the rise of social media and increasing economic inequality (with specific regard to the research suggesting that individual well-being is more closely linked to low inequality than absolute levels of economic welfare) could negatively impact the average American's perception of their quality of life.
Lies, damned lies, and statistics. The first 3 links are for household income. They are comparing two-income GenX/Millennial households with one-income Boomer households.
Yes, household income is typically used because you only pay rent once per household, not twice. You typically only have one set of expensive appliances, furniture, etc per household, not 2.
Some quotes are really timeless and applicable to many contexts. In this case, the first part (pulling the knife out a bit) could be about the progressive center-right, and the latter part (denying that it exists altogether) is more the status-quo apologists like the above post: heavily cherry-picked statistics (like "household" here) even though the facts on the ground are so obviously different (house prices, student debt, health care costs, child care costs, etc.).
"If you stick a knife in my back 9 inches and pull it out 6 inches, that's not progress. If you pull it all the way out, that's not progress. Progress is healing the wound that the blow made. They haven't pulled the knife out; they won't even admit that it's there."
We typically use "household" numbers because you only pay rent once per household, not twice. You typically have only one set of expensive appliances per household, not 2 (or N), etc etc.
That being said, this is a common (worthy) criticism, and there are 2 data points one can respond with:
Your war-of-attrition style of argumentation isn't that impressive. The amount of time I would need to spend to counter your links is like 10x the time you spent producing them - which is pretty obvious given how prolific you've been in this thread.
You're right, dispassionately citing relevant metrics and sources to make a set of points is far less impressive than complaining about being out-argued. We should all aspire to whatever we call that style of argumentation!
Out-argued? Don't flatter yourself. You're only spamming links in the hope that no one will bother looking them up. It's the same MO that your fellow status-quo apologists raynier and refurb use, in every. Single. Thread. Suggesting. Inequality. Look it up, I'm not exaggerating.
Got it, so you’re not only incapable of actually engaging with arguments that are clearly cited with verifiable links, you also appear to have an axe to grind attacking other people totally unrelated to the discussion at hand. This is gold standard argumentation, I clearly have so much to learn from you!
I asked you to google a quote, which takes a minute to find at most. In this topic you consistently require folks to spend much more time to even come close to interpreting your links.
Axe to grind? Yes. I do notice patterns that I don't really fancy, reactionary ones in particular. Aristocrats defending monarchy get little sympathy from me.
The vast majority of my links are literally pictures of charts and 280 character tweets.
In general, when you make a claim, the burden is on you to prove it, and you should hence provide links to your citations. Imploring others to Google around to corroborate your own arguments (especially when — in your case — the arguments are incoherent) imposes that burden on the recipient. It’s far higher a burden than reading links that have already been supplied, especially when those links are tweets and charts.
On only one occasion in this discussion, my links included research papers — but that was in response to claims about Piketty, a researcher. I generally expect people that bring him up to be able to parse research papers — a skill that you typically learn as an undergraduate.
Both articles can be found within seconds by selecting the text, right-clicking and clicking "Search Google".
The tweets you are linking are summarizing cherry-picked data that itself would require much more effort to figure out.
Both quotes I pasted show significant problems with those "refutations". The first is simply using the wrong data, and the latter seems to just be complaining that there's too little data to make an accurate estimate (and the obvious solution is to lobby for states to provide that data, which status-quo apologists seem very unwilling to do).
Well, I draw informed, qualitative conclusions which take quantitative data into account.
I think everyone who read my comment understood what I left unsaid, even if they didn't agree with it. And, as the saying goes, there are lies, damned lies, and statistics.
Just because an argument is logical and has facts doesn't mean it's correct. There are multiple ways to shape, work with, or interpret data.
But, speaking of data, is the guy whose tweets you referenced - Ernie Tedeschi - a reliable source? Better or worse than, say, Alan Greenspan? What are his general biases? Same with the Bloomberg article. That article itself said "A rising but uneven tide - the late '10s weren't equally beneficial to all Americans."
Plus, let's use Occam's razor. Which way does it cut?
- Rent prices in major cities / economic hubs for non-tech workers.
- Cost of healthcare in the US
- Cost of higher education in the US
- Cost of home ownership by major cities in the US.
- Job security and availability of well-paying work for non-tech workers?
- Is the economy benefiting those who need it most?
- How equitably have the fruits of economic growth been distributed?
> There are multiple ways to shape, work with, or interpret data.
This is correct, but no matter how you look at it, the data tells largely the same story.
Maybe you think that the median paints an insufficient picture or you want to consider all quintiles including the bottom 2 -> https://www2.census.gov/programs-surveys/cps/tables/time-ser... — there's been a real increase across all 5 quintiles, adjusted for inflation.
Maybe you think that this doesn't account for housing/healthcare/education. But no matter which inflation index you use (each put different weights to housing, medical costs, education, etc), the adjusted median personal income is up. (And YES, this all accounts for all those things -> https://twitter.com/jmhorp/status/1308799186301747200)
Maybe you think this is attributable to dual-income households. But when you ignore those and focus on single-income households, it's still up, and higher than boomer counterparts -> https://twitter.com/ernietedeschi/status/1308775197311467520
After a certain point, it's worth reflecting on whether you're imposing new constraints / explanations on the data for the sake of intellectual curiosity, or just because you really don't want to revisit prior conclusions.
> But, speaking of data, is the guy who's tweets you referenced - Ernie Tedeschi, a reliable source? What are his general biases?
He's a policy economist https://www.linkedin.com/in/ernie-tedeschi-74ba48b/ who has worked for the US Treasury and whose focus is labor markets. His biases are no different from those of Piketty, Saez, or Zucman, who also have biases. What's more important is what to make of the data they cite.
I appreciate your response. To be honest, it's a bit of chart/graph overload for me. If you've got a moment, I'd appreciate and find helpful your take on this -
Is the article talking about something different? Would you say they're incorrect? Is their data old? Etc.
To me it seems they're arguing real wages haven't increased that much in the last 40 years.
Also, I just want to point out that the very end of your article, about wages increasing for low-wage workers, says:
"That's not to say that low-paid workers have it easy. According to the Tax Policy Center, the bottom 20% of households are earning typical annual income of about $13,300. By comparison, the top 20% of U.S. households earn about $221,000 annually, or nearly 17 times more than the lowest quintile."
My interpretation is, who cares if there's a 50% wage increase from $10/hr to $15/hr if neither of those is a decent living wage to begin with, and the folks at the top of the pyramid are just making staggering amounts of money? The top 20% averages $220K, the top 1% ... much much more.
Which is a bit off-topic, but, at the end of the day economics itself is about people, otherwise there'd be no point! : )
> Is the article talking about something different? Would you say they're incorrect? Is their data old? Etc.
It's probably a combination of a couple things.
1) the article's editorializing of the data doesn't match up with the actual raw data -> https://fred.stlouisfed.org/series/MEPAINUSA672N. This raw FRED data is basically as close to the "primary source" as you can get with this sort of thing.
2) the article looks at total benefits costs and compares them with total wages and salaries, which should set off a couple of alarm bells: in the US, total compensation is what we need to look at. Basically, they're double counting healthcare costs by ignoring the fact that employers pay the majority of premium costs (https://www.kff.org/report-section/ehbs-2019-section-6-worke...), while also counting the full cost of those benefits against the after-wage dollar amounts. In other words, we've all been receiving huge phantom raises, and we just don't know it because they don't show up in our bank accounts. (See the arithmetic sketch after this list.)
3) They don't take into account taxes and transfers, which median earners broadly enjoy. To give you a sense, here's what the median disposable income looks like after taxes & transfers, by country: https://en.wikipedia.org/wiki/Disposable_household_and_per_c...
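To make point 2 concrete, here is a minimal arithmetic sketch of the double-counting, with invented figures (the actual employer share varies; the KFF link above has the real numbers):

    # Invented figures. If the employer pays most of a health premium, that
    # payment is compensation that never shows up in "wages and salaries";
    # weighing the FULL premium against wages alone counts it twice.

    wages = 50_000.0
    premium = 20_000.0
    employer_share = 0.7  # employers pay the majority of premium costs

    total_compensation = wages + employer_share * premium
    print(total_compensation)  # 64000.0 -- the "phantom raise" lives here

    print(premium / wages)     # 0.40 -- the flawed comparison: premium vs. wages alone
    print((1 - employer_share) * premium / total_compensation)  # ~0.09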
> My interpretation is, who cares if there's a 50% wage increase from $10/hr to $15/hr if neither of those is a decent living wage to begin with, and the folks at the top of the pyramid are just making staggering amounts of money?
Because that 50% wage increase from $10/hr to $15/hr tells us very little about a single earner's pay history — it presupposes that a single person stays at that level for their entire life (which we know not to be true). Those numbers show us the snapshot-in-time incomes for everyone currently at that salary level, but do not account for people actually moving income levels as they progress in their careers. The data isn't "here's the life of Joe over the last 5 years", it's "here's the life of people without college degrees that are X years old with Y years of experience, with X & Y held constant over 5 years". I suppose "the lowest quintile wage has modestly increased over time" has a different ring to it than "the entry-level starting salary has modestly increased over time".
Also, this is a bit more debatable, but one can also argue whether it really matters that the people at the top are making staggering amounts of money, if people in the middle tier (again, not a static group) are also doing better, despite being of (predominantly) lower skill — especially since that staggering increase at the top doesn't happen in a vacuum; it's because our system rewards those who play an outsized role in providing goods & services that increase our collective standard of living.
> The data isn't "here's the life of Joe over the last 5 years", it's "here's the life of people without college degrees that are X years old with Y years of experience with X & Y held constant over 5 years"
That's an excellent point. I think that, for various reasons, I've been generalizing more in our discussion and you have been making much more specific and limited claims. One reason I've been generalizing is that I don't know a lot about this stuff : ).
I guess my natural next question is, why do young people (to generalize) feel screwed over by our current economy?
No problem! You've been a lot of fun to talk with, so thanks for that.
> I guess my natural next question is, why do young people (to generalize) feel screwed over by our current economy?
If I were to guess, it's probably because young people have now experienced not 1 but 2 major crises in their lifetimes (2008, 2020).
Young people have also congregated in some of the most expensive metros in the US, and those localities have done a terrible job of building enough housing to keep up with demand.
I think the collective angst is justified, but needs to be contextualized so that we don't prescribe the wrong solutions.
The comment you responded to above poses a valid question - what should we do if empirical data seems to contradict our own lived experience?
"The data" is not homogeneous or infallible, nor can it fully describe all phenomena it's applied to. At best, data can provide us with a rough approximation of the landscape it attempts to describe. In this particular case, I think that landscape is smaller than the territory established by the OP's linked blog post (which seems to transcend the easily-measurable world of simple economic indicators).
Would you consider it prudent to dismiss one's lived experience offhand, at the mere sight of a few quickly-pasted links to tweets containing simple, decontextualized data points? What, then, if another commenter subsequently provides contradictory data? Scrolling through a typical HN thread would turn into a nauseating ride on some sort of intellectual see-saw.
If the commenter you responded to is "not convinced" by some tweeted factoids claiming that Americans are working less and making more than ever, does that imply that they're ideologically motivated?
I don't think so, no more than I think your reply to them implies that your own motivation is ideological. Without additional context, those "facts" don't significantly imperil any specific worldview.
Lived experience, while important, should be treated like anecdotes.
While anecdotes are valuable, the plural of anecdote != data.
Should we help people whose lived experience is worse than ours? Absolutely.
Does the mere existence of people whose lived experience contradicts that of the aggregate majority suggest that we ought to upend our systems? No, and that's the point.
In general, when we see (overwhelming) empirical outcomes that don't confirm our priors, we ought to shift our priors. We do this in the context of climate change, there's no reason we shouldn't also do this in the context of the economy and labor markets.
Lived experience is certainly anecdotal, and I agree that anecdotes don’t simply transform into data when aggregated. That’s not the point I was trying to make.
You ask, “Does the mere existence of people whose lived experience contradicts that of the aggregate majority suggest that we ought to upend our systems?” First of all, I’ve said nothing regarding “upending our systems.” More importantly — just as anecdotes != data, neither can data replace the role of lived experience. If many people feel like they’re struggling today, that doesn’t contradict data saying that they should be making more money in less time - to me, it suggests that there may be more to the “equation” of positive lived experience than income.
In other words, while I don’t think that it’s feasible to precisely compare the “difficulty” of living across different eras, it’s certainly possible that people can be making more money than ever before and also struggling en masse. The whole of life can’t be explained by simple economic data. Scratch that - the whole of economic life can’t even be explained by economic data... I expanded on this concept in a reply to another comment of yours, but you may not have seen it.
For sure. I think the question is what does the empirical evidence say in this case. And, for complex issues where drawing conclusions from data is nontrivial, using judgement.
And the "working fewer hours" figure is also skewed by the rise of millions of gig jobs that don't have set hours.
The top 25% of the population is doing great by any measure. But when you discard the top 25%, the rest doesn't look so good. Income inequality in the US is higher than in almost every other industrialized nation. It is as high as it was in the 1930s, when massive social unrest gave rise to unions and worker protections.
When you factor in student loan debt and a minimum wage degraded by inflation and gig jobs, the bottom 50% are poorer than ever. As of this year, student loans have eclipsed credit card and auto loans as the second largest form of consumer debt, behind mortgages.
Like any other price, wages are very much proportional to the supply/demand balance. And increased centralization and standardization is directly and considerably reducing demand. If you have 10 companies with roughly equivalent market share, you need 10 CEOs, 10 independent chains of command, 10 accountants, 10 lawyers, 10 IT guys, and so on. Each of those people possesses some domain knowledge, is harder to replace, and can negotiate a relatively high salary.
Now if you let them merge into 1 megacorp, you get 1 CEO with 2-3 VPs, a couple of accountants/lawyers/IT guys/middle managers, and the rest become corporate drones following predefined instructions. They are much easier to replace, so they get lower wages.
In aggregate, you have increased efficiency in terms of lower absolute prices, but decreased everybody's living standard, because the median wage dropped even more dramatically.
If you want to gauge the scale of the problem, just think how much of your own monthly spending goes to owner-operated businesses vs. 1000+-employee corporations vs. rent-like transactions (e.g. mortgage interest).
There are also a bunch of very specific policies making this worse:
* Low interest rates make capital cheap, enabling leveraged acquisitions.
* Bailouts for inefficient corporations prevent smaller businesses from taking their place.
* Non-existent enforcement of anti-trust laws is another huge thing. Good luck selling physical items online without going on Amazon. Good luck surviving after Amazon deems your niche worthy of their time and knocks off your product with "Amazon Basics". Ditto for mobile apps, ditto for any niche captured by a Google product that is often terrible, but free to the end user due to ads.
A separate thing is government-backed college loans, which drive tuition prices to the point where a degree doesn't pay off, but is still required by most employers.
These are big long-term problems that will shape the economic future for most of the currently economically active population, but to my surprise there is very little public discussion about them.
> The stock market has no bearing on the economy right now.
nor should you expect it to. people buy securities when they expect them to be worth more when they sell (or are forced to sell) them. they sell securities when they no longer believe that. either way, it doesn't have much to do with what is happening this instant. right now the market is saying that, in aggregate, people expect the economy to bounce back before they are forced to liquidate their holdings.
Buy-and-hold investors aren't usually spooked until bankruptcies start rolling in. Think about how long the GFC was brewing before the stock market took a hit. You could see the problems for years, but it wasn't until the fall of Lehman, AIG, etc that people really began to panic sell.
Bankruptcy is the one situation where buy-and-hold doesn't work. So until we start seeing large bankruptcies, I doubt the stock market will really panic. And let's be honest, the likelihood of any of the top 10 companies in the S&P 500 going bankrupt right now is nil. And that's >25% of the total value of the index right now.
sure, I'm not saying the market is a perfect indicator of the economy's trajectory. I'm pushing back on the idea that it's completely disconnected from the economy; at the very least, pointing to current unemployment figures and an S&P all-time high is not enough to prove that point.
The S&P 500 is absolutely disconnected from "the economy." Out of the top 5 stocks on the S&P, 4 are tech companies employing fewer than 1 million Americans combined. Expanding that list to the top 10 companies, P&G and Visa are kind of representative of the economy, but they are highly consumer focused.
Leaving Berkshire Hathaway as the sole stock in the top 10 that is representative of the American economy as a whole by virtue of doing business in such a diverse collection of markets and owning many small, private businesses.
Another problem with the S&P500 is that it is capitalization-weighted. So even if it contains a diverse set of companies, technology stocks are over-represented because of how well they scale. Walmart has to do a hell of a lot more work per dollar of profits than Apple does.
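To illustrate what capitalization-weighting means in practice, a minimal sketch with made-up market caps:

    # Made-up market caps ($B). In a cap-weighted index each stock's weight
    # is its market cap over the total, so a few giants dominate index moves.

    caps = {"MegaTech A": 2_000, "MegaTech B": 1_800, "Retailer": 400, "Bank": 300}
    total = sum(caps.values())

    for name, cap in caps.items():
        print(f"{name}: {cap / total:.1%}")
    # MegaTech A: 44.4%, MegaTech B: 40.0%, Retailer: 8.9%, Bank: 6.7%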
> people buy securities when they expect them to be worth more when they sell (or are forced to sell) them. they sell securities when they no longer believe that.
If anything, the shocking overperformance of retail investors has nothing to do with that trite characterization. It could be as simple as options having a built-in exit strategy, which is actually what retail lacked. People were selling and buying at the wrong times, and now they aren’t, because the thing they are trading forces them to exit at a “good time.” With an unemotional exit strategy, retail traders are taking greater risk for greater reward, and obtaining it.
Is this good for the economy? Would you rather have people sit on their stock portfolios until they die, passing millions of dollars tax free to their heirs, who don’t spend it either or spend it on overpriced housing? Or would you rather people take profits (and losses - to other winners) every two weeks, occasionally, you know, buying a variety of somethings?
The comment above hit the nail on the head, it's actually pretty simple.
Old people = asset holders (aka holders of financial obligations, bonds and stocks)
Young people = usually in debt (student, auto, credit card, mortgage) and hoping to earn their way out with income.
Aka the young are the very debtors that the old creditors are lending to via their retirement funds of bonds and stocks.
And the Fed, well, really their only tool is to print money. To do what? Buy financial assets.
So in times of stress they buy those assets/liabilities, pushing up their prices, first by decreasing long-term rates (hence buying government bonds) and then, with QE infinity, eventually buying the very same assets/liabilities that make up rich people's portfolios.
Hence we respond to a drastic income shock by... bailing out rich people.
Ideally the Fed would just monetize government spending directly to those poor people, but as you can see with Pelosi-McConnell 2020, that stuff needs to go through Congress, which is a partisan mess.
"Old people = asset holders (aka holders of financial obligations, bonds and stocks)
Young people = usually in debt (student, auto, credit card, mortgage) and hoping to earn their way out with income."
Rich young people are being financially supported by their rich parents. Poor young people aren't, as their parents aren't 'asset holders' either.
The whole 'generational conflict' is a ruse, like so many of the divisive narratives being propped up, especially in the US but also in many other places, to further divide and distract.
The 'trickle down' economic fairy tale is one propagated by both parties, ever since the Reagan years imho.
I think the ponzi scheme refers to the amount of money flowing around in financial schemes that isn't actually "money". By creating numerous financial products and valuations that aren't actually based on assets, the banking system has been able to "print" more money than the central bank would ever allow. And that "money" is what's propping up the stock market. That's why the stock market is no longer representative of the economy: "money" has become an economy in itself.
All of the aspects of generational inequality stem from a single root cause: the rising dependency ratio after the peak of the demographic dividend. US labor force participation peaked in the 90s as female entrance into the workforce peaked. After that happened, the countervailing force of lower birthrates began to take over and the dependency ratio started to rise.
We're all fighting for a piece of a pie that is, relative to the population, shrinking. And it's shrinking because of demographic trends that began decades ago and would take decades to reverse.
Every developed country is going through the same thing, and some are in much worse positions than the US. Older people were lucky enough to be born into a world with demographics favorable to growth; younger people are unlucky to have been born into a shrinking world.
Are you mainly talking about things like consumer goods being bought more and more simply because there are more people? Or about systems where N people pay for M people, like Social Security (where N = number of workers and M = number of retirees), in which the N:M ratio needs to stay roughly the same?
Or something else?
(More generally - I find your comment intriguing, please do go on!)
The latter: there are fewer workers per capita than in the past. This means supply falls relative to demand, since the growing share of the population is seniors with savings. That leads to higher prices.
When demographics are in your favor, a youth bulge places fewer demands on society.
Now we can tweak things to make them better. However the demographic megatrend is mostly set in place and there's not much we can do to reverse it.
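For reference, the dependency ratio being described is just non-workers per working-age person; a minimal sketch with invented cohort sizes:

    # Invented cohort sizes (millions). The dependency ratio is the
    # non-working population (young + retired) per working-age person; as it
    # rises, each worker's output has to support more people.

    def dependency_ratio(young: float, working: float, retired: float) -> float:
        return (young + retired) / working

    print(round(dependency_ratio(young=60, working=200, retired=40), 2))  # 0.5
    print(round(dependency_ratio(young=55, working=190, retired=75), 2))  # 0.68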
She is very right that the tech job market changed in 2001. I have a former coworker who lost his job in 2017. He had not been in the job market since 1995. He was shocked how things had changed. His direct quote: "It used to be easy to find a tech job. Now it is hard. I couldn't understand why in the last few years I was able to hire people who I thought were over-qualified. But I now see that I'm under-qualified".
It was eerie how much of this post reminded me of my experiences in the 90s, and I lived over a thousand miles away from the author. I was relaxed & confident because I could always find an easy job that paid the bills.
I had no frame of reference whatsoever to even understand that things were changing at all, let alone how or why. I thought I would have time to wait until the 'perfect fit' came along. I was wrong.
The scariest part is that those of us who are able-bodied, who think that if times got rough we could always go back and flip burgers or be a cashier, are starting to get a rude awakening: even those jobs have lots of competition and, worse, some of them are getting automated.
Never be complacent. Always be learning. We are expendable. Let's not give employers a reason to think this. I used to think that I just need to learn a couple of technologies, some back end scripting language, some shell scripting, some SQL and JavaScript and I'll be set. That was 10 years ago. How wrong I was.
Lots of good comments on this but two factors I haven't seen considered:
- Rise of programming bootcamps
- More intrusive workplaces
In the 90's and early 00's programming was not cool. There was a lot of money to be made, sure, but the average developer was far more likely to be in it for the software, not the money. Companies were more willing to bring people on then because there were far fewer bullshitters. Post '08 programming became cool - remember brogrammers? - and a whole cottage industry of bootcamps popped up. Companies had to pull back and put up a lot more barriers to hiring because so many people started applying with dubious resumes.
It also seems like there has been a slow but steady increase in how intrusive your workplace is to your life. Hiring and firing people has slowly become more and more fraught for employers so they want to do less of it. Insurance has gotten incredibly expensive and is linked to your employment so both sides are wary to take chances and want a sure thing.
I don't think this is a phenomenon just in tech either. It's a US wide problem with all the incentives making it difficult to take chances.
This is focused on jobs, but I’d like to see more conversation on debt, NIMBYism, and interest rates. There are a lot of jobs, maybe not easy well-paying ones, but lots of jobs that would have you earning $30k or $40k a year without particularly specialized skills. The problem is that those wages are not nearly enough to cover costs, specifically housing.
And the reason housing is so insane is that the homeowners cartel has manufactured a supply shortage AND, with more or less permanently low interest rates (which, given the massive amount of government debt, have to be kept low), the real estate market prices out people who don’t want to take on insane amounts of debt.
For anyone who does end up reading this, I highly recommend reading r/badeconomics for most economic questions you come up with.
I find that understanding an economic issue usually requires me to read whole papers just to penetrate one paragraph of someone's explanation of a phenomenon.
Badecon, instead, is a place where you see bad ideas taken apart to show what was wrong, and in the process many sacred cows and talking points get skewered. That makes the process a whole lot easier to understand from a topical-issue perspective (rather than doing a part-time econ degree).
Just make sure you don’t go to the actual /r/economics, which is really a politics-about-the-state-of-the-economy subreddit where most posters have seemingly never even taken an economics class.
If I were starting out now, in exactly the same situation as I had been in 1991, a smart, hard-working, self-motivated, well-organized, fast-talking, interviewer-impressing young person with no college degree, nor any credentials beyond a high school diploma, but the ability to learn to do hard things fast, and independently, if just given a chance, and the chutzpah to talk oneself into jobs one doesn't actually know how to do in the confidence one will figure it out when one gets there: what options would I have to turn that promising constellation into cash in my pocket?
This statement is exactly my lived experience, except that I started in 1994 or '95. My question for others here: what is the answer to his question? Is it possible to do now what we did back then? Why or why not?
As a recent college grad who has conducted many junior-developer interviews while working at a startup, I would say that the answer is most definitely no. The current interview process is so bogged down by the sheer number of applicants, and focuses so much on objective accomplishments, that any shortcut to trim applicants is taken. It often seemed that only by removing applicants with a lower GPA or a mismatching degree could the interview pool reach a manageable size. This is a real shame, because so many good people are definitely ignored, but it is so much harder to prove character traits than to simply check for a degree.
The fact that Lambda School has a business model suggests you’re missing something. People manage to get jobs paying $50,000 a year and much more based off of nine months of full time instruction.
> Time to placement: How long does it take students to be placed into jobs after graduation?
> Of the 201 students who graduated in H1 2019 and were placed in jobs, 116 (58%) accepted jobs within 90 days of graduation, 68 (34%) accepted jobs between 90 and 180 days after graduation, and 17 (8%) accepted jobs 180 days or more after graduation.
> Salary expectations: What is the median annualized salary of placed graduates?
> The median annualized salary for H1 2019 placed graduates was $70,000.
The online application process has made it so easy to apply for jobs that employers get flooded with applications. So they really have no choice but to use arbitrary criteria to filter candidates down to a manageable set. (The same thing has also happened to college applications.)
Possible? Sure. Probable? Certainly not. You're already filtered from 99+% of career style jobs because if you don't have a bachelor's the HR software chucks you.
A combination of factors have completely ravaged Millennials' finances:
* Technology is capturing more and more of low-wage/low-skill entry level work.
* The combination of technology, free-market capitalism and globalization means that American youth have to compete with non-Americans for jobs that were solely done by Americans during most of the 20th century.
* Incompetent government policies have created a system where college tuition has outstripped core inflation by 500-600%; all of this in a system where a college education has gone from nice-to-have to an absolute necessity.
* Globalization and urbanization have led to congregation of more and more people in urban cores, causing housing and other fixed living costs to increase.
* Untimely financial crises have both set back Millennials' career progress and hampered their ability to save and invest in the stock market.
None of these processes or events is inherently bad, but their combination has created so many negative externalities for a generation of Americans that these people will never be able to recover.
There is one other weird aspect if you compare Millennials with Gen X.
Millennials spend a lot more money on stuff that didn't even exist in the early-1990s period the blog author was talking about.
In the early 1990s if you were young & working you:
* Possibly didn't have a TV
* Were extremely likely to not own a computer - they were REALLY expensive; most people who had one had it because they were in tech and their employer gave them one to take home
* You didn't have a $900 smartphone and pay $100 or more a month to use it
* Definitely didn't pay recurring software subscription fees, very likely didn't buy any software
* Probably went out to eat far less than today
* Were less likely to work/live in the urban core... you might have needed a car, but everything else was cheaper
* You weren't going to Starbucks or other cafes and buying $3-4 coffee drinks, because most places didn't even have cafes like that
* There was less pressure to keep up with the joneses since Social Media didn't even exist
There was a lot of stuff in Millennial culture in the 2010s that was downright expensive.
The author sort of alludes to it at the end of the article when they mention that Gen-Xers' expectations are even more tempered than Boomers' expectations about what it takes to survive today.
Is there another group/time period you're interested in specifically comparing to?
This seems to me to be a result of the training young people get diverging more and more from the actual work the economy needs, because of government meddling. People made it impossible to apprentice at lower wages in exchange for higher wages later, so now we have huge batches of 18-25 year olds who can't compete fairly for a job. Then we also have young people getting degrees in their passion rather than in anything people actually need.
What makes you say that? I have seen absolutely no reason to believe any of that. Especially considering this article is explicitly referencing mostly unskilled labor.
Because how do these kids even get skills if the amount required to pay them prices them out of the market? You'd probably hire a dropout if you could pay much less in wages, but I see a lot of people avoiding that apprenticeship path.
They didn't have this issue in the early '90s because inflation had so quickly outpaced the minimum wage that there effectively wasn't one. So in 1979-1985, when he was paid somewhere near $7-11 an hour (I'd guess) and paying $270 a month, I'm guessing it was worth it to teach him. Now in some cities it's not worth it: for example, hiring a kid in SF to do basic SDR work or to help around the office.
I'm a similar age... the talk of hourly wages rings true to me.
In the early 90s I was still in HS.
I made $8/hr as a lifeguard in HS and $12/hr as a Red Cross Water Safety Instructor. I'd clock out of one and clock in at the higher rate depending on what I was doing. That's about $14-21/hr based on inflation. These were actually jobs that were much harder to qualify for compared to most jobs high school kids get. They required my parents having the foresight to sign me up for many years of swim instruction and then lifeguarding & teaching classes. WSI classes were actually quite difficult back then, and I think they still are.
I actually know a lot of people working lifeguard and swim instructor jobs who make less than $8/hr today. That's just one example, but I think these jobs are similar to many others. Most places my son has taken swim lessons, the instructors are not Red Cross certified, and it really shows: the programs are nowhere near as well designed, and the instructors don't even swim well enough to make it through the WSI course. Meanwhile the business owner charges WAY more for the swim lessons. The employees make 50% of what they did 25-30 years ago, but the business owners are likely making WAY more money. Heck, swim lessons weren't even really a for-profit market back in the 1990s.

As soon as I had one year of college I always had summer software internships. The first one started at $16/hr, and the hourly wages went up every year.
As far as the recollections of the .com crash in 2001/2002. It's impossible to know how good the blog author was.
In the first .com boom there were a lot of companies that did 0 productive work, burned through lots of VC money, and never even shipped a product. They hired lots of people without college degrees, sometimes they hired people who dropped out of HS. "We don't do any work, we just play Quake all day in the office" was definitely a thing in the late 90s. There were a lot of people who worked places like that who had "Sr. Software Engineer" on their Resume and had very little ability to do the job. The .com crash was a correction for a lot of that. There were other people who had the same lack of credentials but did fine because they were legit, had the talent, worked hard, learned on the job, and worked at companies that were actually doing real work.
I think today's startup market is different. Code quality standards are a lot higher. There are weird parts of the software economy around social media, advertising, etc., but there seem to be fewer companies that are just there to outright defraud VCs. Investors are a lot smarter now.
I worked as a Perma-Temp for 2 years after the .com crash. It was actually very lucrative at that time. 2002-2005 were pretty scary. There was tons of talk of Outsourcing just replacing everything in the US. It didn't work.
Something I see glossed over in a lot of conversations about inflation is that inflation doesn't affect all prices equally. Sure, housing has gone up significantly, but things like electronics and clothing have gotten cheaper, and a lot of products like household goods are flat.
You can't make apples-to-apples comparisons by just looking at one slice of the economy.
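One way to see this: a household's effective inflation rate is a spending-weighted average of category price changes, so the same price moves hit different budgets differently. A minimal sketch with invented rates and spending profiles:

    # Invented category inflation rates and two invented spending profiles.
    # The same price changes yield different effective inflation depending on
    # where a household's money actually goes.

    inflation = {"housing": 0.05, "electronics": -0.03, "clothing": -0.01,
                 "household_goods": 0.00, "food": 0.02}

    profiles = {
        "renter_in_hot_metro": {"housing": 0.50, "electronics": 0.05,
                                "clothing": 0.05, "household_goods": 0.15,
                                "food": 0.25},
        "homeowner_elsewhere": {"housing": 0.20, "electronics": 0.10,
                                "clothing": 0.10, "household_goods": 0.25,
                                "food": 0.35},
    }

    for name, weights in profiles.items():
        rate = sum(weights[k] * inflation[k] for k in inflation)
        print(f"{name}: {rate:.2%}")  # renter: 2.80%, homeowner: 1.30%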
These graphs appear to show that real, inflation-adjusted median income is higher for Gen Z than it was for Millennials, higher for Millennials than it was for the 'Oregon Trail' cohort, higher for the 'Oregon Trail' kids than it was for Gen X, higher for Gen X than for the Baby Boomers, and higher for the Baby Boomers than for the Silent Generation.
Are his figures false, or is it possible that there is another factor at work? Maybe we look back at our youth with rose-coloured glasses, and maybe we look up to those older than us and fail to see how hard they worked to get where they are. I remember that my first apartment was a dive; I just now looked it up and see that it has been completely renovated, is appointed like a luxury flat and is only 36% higher in real dollars. That is not terrible. A quick check on Zillow shows units available in my original price range not that far away, in a town which is far more desirable than it was decades ago.
These cross-sectional comparisons are really difficult. For example, the author mentions rent in Boston. The homicide rate in Boston in 1990 was three times higher than today, comparable to where Newark is today. There were good jobs downtown, but the tech boom hadn't yet begun. In 1990, the city was coming off of three straight decades of population decline. Since 1990, the population has grown 20%, but the city has built almost no new housing: https://assets.bwbx.io/images/users/iqjWHBFdfxIU/iY_crGvqlCv....
The author also talks about working in "word processing." That was a much rarer and more valuable skill in 1990 than today. I'm a child of the 1980s and early 1990s, and despite coming from an upper middle class family, I didn't learn how to touch type until 8th grade. My daughter has a Macbook Pro and is on pace to be a touch typist by the end of second grade.
I'm not saying there is "nothing to see here." But we can't have a serious conversation about this unless we agree on the facts: https://www.pewresearch.org/fact-tank/2018/12/11/young-adult.... The fact is that millennial households make more money today than previous generations did at the same age. Households headed by people ages 22-37, adjusted for changes in household size, make 60% more money, adjusted for inflation, than in 1967 (a sketch of what that household-size adjustment involves follows at the end of this comment).
Another thing is acknowledging that, like other positional goods, housing in cities is a different product today: Medford or Astoria or Mountain View are not what they were in 1990. It's like how the Camry and Accord keep moving upmarket while new models are slotted underneath. The equivalent of Medford back then is some suburb in Kansas City.
Third, we have to acknowledge that credentials such as college degrees are not as valuable today as in the past: https://www.the74million.org/wp-content/uploads/2019/03/Pell.... In 1970, 40% of young people from the top quarter of households (by income) had a college degree by age 24. Today, it's 58%. In the next quartile down, that figure nearly tripled, from 15% to 41%. And in the quartile below that (people from families making below the median income), it roughly doubled, from 11% to 20%. Being a college degree holder just doesn't mean as much today as it did 50 years ago.
Fourth, we have to acknowledge changing demographics. We have to acknowledge that economic immigrants tend to come into the country with little in the way of financial resources. And even a country that's doing a great job of providing inter-generational mobility for those immigrants will show more poverty in aggregate statistics than one that doesn't have any economic immigrants. That's true in the U.S. as well. Data shows that all major immigrant-heavy ethnic groups, including Hispanics, have incomes that are converging with those of the white majority over time: https://academic.oup.com/qje/article/135/2/711/5687353#20151.... But the percentage of the population that's a first generation immigrant--and thus disproportionately likely to have come here with limited means and education--has tripled since 1970.
It's easy to take a couple of data points from two time periods and draw conclusions. But such analyses assume everything else is staying the same, and the conclusions aren't worth much unless you account for the changes. Doing that accounting properly is very hard.
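For what it's worth, here is a minimal sketch of the kind of household-size adjustment mentioned above, using the common square-root equivalence scale (an assumption on my part, not necessarily Pew's exact method, and the numbers are illustrative rather than theirs):

    from math import sqrt

    def equivalized(income, household_size):
        # Square-root scale: a household twice the size needs about
        # 1.41x the income for the same standard of living.
        return income / sqrt(household_size)

    # Illustrative numbers only.
    income_1967 = 8_000 * (260 / 33)  # crude CPI deflator into today's dollars
    income_now = 70_000

    print(equivalized(income_1967, 3.3))  # households were bigger in 1967
    print(equivalized(income_now, 2.5))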
Capital owners would much prefer running stuff locally: it's more expensive, so the profit margin is higher. No headaches with logistics. Predictable laws. Friendly courts.
It's customers who are not willing to pay the premium for locally made goods.
As soon as one factory moves to China, the others have no choice but to follow; otherwise they lose the price war.
> Capital owners would much prefer running stuff locally
Empirically that's not the case. There are plenty of tales of outsourcing in pharma going wrong, yet the stock market rewards contracting out to CROs over doing stuff in-house.
Director of Outsourcing in Discovery. In Discovery! That's not even a cost center; the real costs are incurred later in the pipeline. At that stage you want to keep the life scientists, the structure guys, and the synthetic chemists all in one building. But Genentech is now part of Roche, and a public company needs to keep an eye on the stock market.
(Presuming you're in the US, and generalizing.) Being on HN, you probably earn 2x the median US income. Do you own an American-made luxury vehicle, which you could probably afford to lease or buy on credit? Probably not. And I bet it's because you don't want to.
This is a really badly considered example because Tesla exists and I'm willing to bet that a large portion of HN wants to own a Tesla vehicle. A better example might be US-made clothing, which often comes with a premium price.
It’s more complicated than just that, though. The boomer generation has been viciously anti-development. Factories make noise and usually pollute, at least a little bit. Good luck building one against the political will of a generation that doesn’t even want housing constructed.
Yes, it makes sense for them. For most boomers, their biggest asset is the value of their house. They might even have borrowed against it. So for them, preserving the value of their house becomes the most important task. As such, it kind of makes sense for them to prevent cheap competition from newly built houses and also to prevent increases in traffic, noise, or pollution in their area.
That said, some of 'em are also clearly overdoing it, like resisting the construction of a children's playground which might be within hearing distance.
I disagree with this; the profession is becoming more like a real profession instead of a whole bunch of people just slinging code left and right.
The reason programmers make good money is that we're at the beginning of a great profession, just as lawyers and finance guys were when their professions began.
We will come to a time when a programming career requires certification, kind of like nursing, law, or finance. The question is how strict the certification process will be: will it be like CPAs, where you can work in finance and eventually certify, or like lawyers, where you can't write code professionally without the license?
> The day you need a license to write code is the day I start a coding-as-civil-disobedience program.
It's the same as being a lawyer, a doctor, a nurse or a CPA. Those are careers that can cause a lot of harm to other people, yet programmers think they're above it all.
Wait until someone takes control of a smart power grid or a self-driving car/truck/airplane because of outsourcing, or someone writes some shit software that causes an airplane to fall out of the sky.
The Boeing 737 MAX is already an example of cost-cutting in software leading to measurable deaths. But Boeing took the flak, not the software engineering profession as a whole.
You don't need a license to tell your friend how you were able to beat a ticket in traffic court (that is, give free legal advice as a non-professional). You do need a license to charge somebody to represent them in court.
IMHO, chasing a "real profession" in software is an illusion to begin with, but I suppose some mental models about a career can be useful.
Comparing software to the legal or medical professions is a farce. The law doesn't change quickly, and medicine hardly changes at all from year to year. Our mental models of civilization don't really change much from year to year. And the human body doesn't visibly evolve even from decade to decade. Of course it's easier to have a meaningful certification in those fields.
Software still changes incredibly quickly. Not as quickly as in the 1990's -- but not slowly enough to warrant blanket certification bureaucracy that will just have to change in 5-10 years.
> Software still changes incredibly quickly. Not as quickly as in the 1990's -- but not slowly enough to warrant blanket certification bureaucracy that will just have to change in 5-10 years.
Software changes, fundamentals do not.
The people who complain about credentialism are the same people whose answer to "my code is slow" is "just add more hardware." We've built abstraction layers to make software easier to write and reduce the human cost, but in turn we end up with less qualified people.
Right, but on the flip side: how many companies need to have publicly exposed Elasticsearch clusters before we start thinking about certifying for security?
The "lite" version of certification in tech could be a requirement for a programmer equivalent of a Professional Engineer to sign off on any piece of software that becomes used in a product or service offered to people (whether commercial, public or by non-profit orgs).
This way, I'd be free to develop my Open Source library and engage in exploratory coding, and you'd be free to play with it internally, but if you want to use it in a product you sell to people, you'll have to get a licensed software person to sign off on your project. You probably won't want to waste their (and your) time on untrustworthy projects, so I'll have to get a licensed software person to sign off on particular releases of my library to certify quality, to make it more likely someone will want to use it.
This would introduce some inertia in software - not enough to be stifling (it won't pry Turing-complete languages from the hands of curious children), but perhaps enough to introduce some sanity into the industry.
> could be a requirement for a programmer equivalent of a Professional Engineer to sign off on any piece of software that becomes used in a product or service offered to people
To be honest, it's funny to graduate from a four-year college with an engineering degree, be offered induction into the Order of the Engineer, get the ring, and be told not to sign off on designs that would harm people, all while being offered thousands of dollars to get people addicted to software.
I wonder if we should call it "programmer's guilt"?
Yeah. As long as the industry is run by ad people, you can forget about ethics in it. The best move is to steer clear of all that, but it's getting increasingly difficult.
It seems to me that all kinds of existing certifications in software have introduced insanity. It would just create another layer of bureaucrats who are after certifications but cannot program.
It would further kill "open" source software, which is already being destroyed by corporations.
I think programming is too open ended for certifications.
Existing certifications in software are just attempts at making money off gullible people (EDIT: at least in my experience, most are). I'm talking about real, government-backed certification: you can't sell serious software products without the sign-off of someone licensed, and that licensed person then carries career liability (and in extreme cases monetary penalties or even jail time) for failures of that software.
As far as I know, software is the only field that can get away with something like this:
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS “AS IS” AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
Imagine finding something like this on a pack of aspirin.
I found something pretty similar from a chemical supply company that sells acetylsalicylic acid (aspirin) for educational purposes. The purpose for which you advertise your wares matters; I don't think such a disclaimer is unreasonable on certain software projects.
> IN NO EVENT SHALL DISTRIBUTOR, DISTRIBUTOR'S SUBSIDIARIES OR AFFILIATES, OR ANY SUPPLIER BE LIABLE UNDER ANY LEGAL THEORY (INCLUDING BUT NOT LIMITED TO CONTRACT, NEGLIGENCE, STRICT LIABILITY IN TORT OR WARRANTY OF ANY KIND) FOR ANY INDIRECT, SPECIAL, INCIDENTAL, CONSEQUENTIAL OR EXEMPLARY DAMAGES (INCLUDING BUT NOT LIMITED TO LOST PROFITS), EVEN IF WE OR ANY OF OUR SUBSIDIARIES, AFFILIATES OR SUPPLIERS HAD NOTICE OF THE POSSIBILITY OF SUCH DAMAGES.
Fair enough. I shouldn't have generalized; some are valuable in certain niches. MS and Cisco certs come to mind (back when I was studying at university, the learning resources for Cisco certificates were our go-to sources for understanding computer networks). But in my experience they're the exception to the rule.
> The "lite" version of certification in tech could be a requirement for a programmer equivalent of a Professional Engineer to sign off on any piece of software that becomes used in a product or service offered to people (whether commercial, public or by non-profit orgs).
this makes sense for software that handles PII and/or where bugs could cause loss of life or limb, but it seems like overkill for a lot of commercial software. suppose I'm selling some desktop application with neither of those qualities; wouldn't the certification requirement be excessive in that situation?
You may be right, and perhaps it would be better to relax the requirements, letting them flow from the critical software (PII, risk to life and limb) downwards.
So if I want to make a new radiation therapy machine, I won't be able to pull in random crap from NPM: my software will need a PE sign-off, and so will my dependencies. This will hopefully lead to some OSS libraries getting certified (perhaps with corporate interest funding the certification), creating an excuse/incentive for other companies. Then some bank will notice that a third-party app it uses internally has eaten its data several times, causing monetary loss, and will decide that going forward, PE-certified software is prioritized during procurement, creating an incentive for non-critical software vendors to get their software certified too. Etc.
I'm not sure if the dynamics would play out this way, but it seems plausible. The goal here is to retain free experimentation - I'd hate to end up in a world where you aren't allowed to use a Turing-complete language without a license - while at the same time forcing responsible software practices where they matter.
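As a thought experiment, the tooling for this wouldn't need to be exotic. A hypothetical sketch (the file names and format here are invented, not any real standard): a build gate that refuses any dependency not on a signed-off allowlist, the sort of mechanism a PE-style regime might require for critical software.

    import sys

    def load_pins(path):
        # Expects lines like "requests==2.31.0"; skips blanks and comments.
        with open(path) as f:
            return {line.strip() for line in f
                    if line.strip() and not line.startswith("#")}

    pinned = load_pins("requirements.txt")            # what the build pulls in
    certified = load_pins("certified-allowlist.txt")  # what a licensed engineer signed off on

    uncertified = sorted(pinned - certified)
    if uncertified:
        print("Build blocked; uncertified dependencies:")
        for dep in uncertified:
            print("  " + dep)
        sys.exit(1)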
> You may be right, and perhaps it would be better to relax the requirements, letting them flow from the critical software (PII, risk to life and limb) downwards.
agreed, I think this would achieve a lot of the positive downstream effects without placing an undue burden on developers of less critical commercial software. to go back to the desktop app example, it probably needs to authenticate serials for authorized users. you could write a small self-contained auth server and have it certified, while continuing to ship your behemoth desktop app that can't be fully evaluated.
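To illustrate how small that certifiable surface could be, here's a minimal sketch of such a self-contained serial checker (everything here is hypothetical: the endpoint, the HMAC scheme, and the hardcoded key are stand-ins, not a real product's design):

    import hmac, hashlib
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs

    SECRET = b"replace-with-a-real-key"  # key management is out of scope here

    def serial_is_valid(user: str, tag: str) -> bool:
        # A serial is an HMAC of the username under the vendor's key.
        expected = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, tag)

    class AuthHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Expected path form: /check?user=<name>&tag=<hex>
            q = parse_qs(urlparse(self.path).query)
            ok = serial_is_valid(q.get("user", [""])[0], q.get("tag", [""])[0])
            self.send_response(200 if ok else 403)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8000), AuthHandler).serve_forever()

The point is that the thing an auditor has to evaluate is a page of code, not the whole app.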
The FDA already has an extensive validation process for medical devices with the potential to cause harm. They look at the complete system, not just software. There have been a few notable failures but overall the process works as intended. No need for additional programmers certifications.
The new gigs of today's era are making YouTube videos, game streaming, Instagram modeling, etc. It's mind-boggling how young people these days can make a considerable amount of money from these.
Those kinds of "influencer" gigs are a lottery. Most people do poorly in them; a very small number are very successful. It's like being in a band: a few are stars, and the rest can't afford to do it as their main job. The amount of money the most successful can make is not a good indicator of likely income.
They're not what is generally meant by "gig economy" by the way. What is generally meant are precarious jobs with variable shifts, often low paid. Things like Uber driving, minimum-wage jobs delivering food when there's demand, and zero-hour contracts working in shops.
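To put a number on the lottery point: in a heavy-tailed market, the average is dragged up by a handful of stars while the typical participant earns almost nothing, so the mean is the wrong statistic to look at. A toy simulation (the distribution and its parameters are invented for illustration):

    import random, statistics

    random.seed(0)
    # Assume Pareto-like payouts, scaled so most participants earn a few dollars.
    incomes = [random.paretovariate(1.2) * 5 for _ in range(100_000)]

    print(f"mean:   ${statistics.mean(incomes):,.0f}")
    print(f"median: ${statistics.median(incomes):,.0f}")
    print(f"top 1%: ${sorted(incomes)[int(0.99 * len(incomes))]:,.0f}")

The median lands far below the mean, and the visible stars are further out still.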
Sure, and I'm not even talking about the stars; even the fact that young people can make any money out of these is incredible. I wouldn't have imagined it in 1991.
What I'm saying is that for young people, the gig opportunities are about the same as in 1991, just in a different form.
Not to mention the opportunities to acquire skills or learn almost anything for free from the internet.
> What I'm saying is that for young people, the gig opportunities are about the same as in 1991, just in a different form.
You've explicitly named lotteries and acted as if they're the same as actual gig employment. I have several Instagram influencer friends, and a few who make YouTube videos on the side. It's their dream to go pro. Want to know the collective income of that pool of about 10 people, who have been working at this for over a year? It's under $100. That's not a gig; that's a lottery that costs time and energy.
If you want to talk gigs, talk about Wag or Uber. Which are also definitely not the same gigs the article was talking about.
>"You too can become an NBA player!". You have the opportunity if you work hard enough
I didn't say that.
Rather, that you should find the opportunity that suits you.
Opportunities are there for everyone, but one particular gig may not be for everyone. Some people are suited to being YouTubers, some to being Instagram models, some to being game streamers, etc. These gigs didn't exist back in 1990; people shouldn't compare them to the gigs of 1990, because the economy is always evolving.
The opportunities you're talking about are much like the opportunities to be an actor on television: few and far between compared to the number of people who want those jobs. If the market supports 6,000 YouTube stars and 100,000 people want to be YouTube stars, your pep talk is naive.