Hacker News
Nintendo Will Pay Its Workers 10% More (gamespot.com)
523 points by ossusermivami on Feb 7, 2023 | 359 comments



Nintendo has been historically notorious for making more money per employee than most companies [0]. 10% is a step in the right direction, but it should definitely be more.

[0] https://www.ft.com/content/9d9624a4-8341-11dd-907e-000077b07...


Maybe this operating model is why they haven’t had to have layoffs and can weather the storm.

In Japan, people value job security over maximizing expected value of compensation.


Most of the tech companies having layoffs don't "have to" have layoffs. They're doing it to depress worker compensation across the whole sector. It's a shameless money grab at the expense of their employees.


This doesn’t really make sense to me from a game theory POV. As a company, laying off productive employees of my own has an insignificant impact on the total sector’s compensation, but reduces my output and therefore profitability. Why would I do it? If I think there’s some big sector-wide conspiracy to unnecessarily cause layoffs to depress wages, my dominant strategy is to not participate - I still gain the full benefits (since my own participation has only a trivial impact on the outcome), but without hurting my own output.
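The dominant-strategy argument above can be sketched as a toy payoff model (all numbers below are made-up illustrations, not data): wage suppression is a sector-wide effect set by everyone else's behavior, while lost output from your own layoffs is borne individually.

```python
# Toy model of the commenter's argument. All payoff numbers are
# illustrative assumptions, chosen only to show the structure.

def firm_payoff(i_lay_off: bool, others_lay_off: bool) -> float:
    # Sector-wide wage savings depend on what everyone else does...
    wage_savings = 10.0 if others_lay_off else 0.0
    # ...but lost output from my own layoffs is borne by me alone.
    lost_output = 5.0 if i_lay_off else 0.0
    return wage_savings - lost_output

# Whatever the others do, NOT laying off dominates for an individual firm:
assert firm_payoff(False, True) > firm_payoff(True, True)    # 10 > 5
assert firm_payoff(False, False) > firm_payoff(True, False)  # 0 > -5
```

Under these assumptions, free-riding on the sector effect while keeping your own output is strictly better, which is the commenter's point.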


> Why would I do it?

Because the investors want you to [1]. Companies are using this as an excuse to cut unprofitable projects (e.g., Alexa). They over-hired before interest rates went up, when borrowing money was essentially free.

> This doesn’t really make sense to me from a game theory POV

Assume company leadership only cares about short-term stock price, and recompute your game theory POV. The outcome is exactly what's happening right now.

[1] https://www.ndtv.com/world-news/google-layoffs-investor-told...


First, the article you linked is talking about an investor who called for layoffs, but the company didn't respond with layoffs. So it completely undermines your point.

Most importantly, the investors have minority stakes in many of these cases: Facebook, Google, etc. And how do you explain non-publicly-traded companies like Stripe?

Cheap debt and high margins fueled waste. It's as simple as that.


Stripe still has investors. And those investors have leverage over the corp-officers.

Cheap money did fuel waste. Investors want to reduce the burn.


> Because the investors want you to [1]. Companies are using this as an excuse to cut unprofitable projects

So it’s not a conspiracy to drive the wages down? Who would have thought.


Nnnno, it just means the investors also want to drive wages down. Because to far too many of them, labor is just a cost center.


You wrote:

    They over-hired before interest rates went up, when borrowing money was essentially free.
I'm confused. As I understand it, the wealthiest tech companies do not borrow money. Yes, they all have highly advanced treasuries to manage cashflows (different currencies, etc.), but they do not need to create liabilities (new debt) to run their businesses. They are cashflow positive and highly profitable.

Are you trying to say that as interest rates rise, the consumption part of the economy has slowed, and thus profit growth has slowed at Big Tech? If yes, hmm, I half agree that this contributes to layoffs. Mostly, I think they are cleaning house. A lot of people are working on projects that have little or no revenue potential. During weak economic periods, it is normal to close those projects.


The big companies do have debt, see https://www.microsoft.com/en-us/Investor/earnings/FY-2022-Q1... . In the past it was often cheap and quick for them to raise it.


There is also another aspect in addition to sister comment: When interest rates are low, institutions with contractually agreed returns (pension funds and such) have to invest in riskier assets (stock, private equity, corporate bonds, riskier state bonds) to reach those targets.

When interest rates are where they should be they can just buy G8 government bonds. This leads to pretty big outflows from the stock and corporate bond markets.

Incidentally, I think there is a huge blind spot (intentional or not) for the amount of economic pain this interest rate normalization will cause. After 10 years of negative real interest rates (central bank rate minus inflation), the economy and all its participants have become junkies. The withdrawal from the free-money drug will be painful but necessary.


> Because the investors want you to [1]

Investors don't actually have any market power unless the company in question is doing a stock issue. For example, AMZN doesn't get any extra operating capital no matter how much of their stock any investor buys or sells on the open market. The only way "investors" can control a publicly traded company that isn't dependent on issuing new stock is by depressing the executive comp package's value. From that follows an obvious lesson on corporate governance. The fact that that obvious lesson isn't followed is proof enough that the system doesn't work exactly as advertised.


> The only way "investors" can control a publicly traded company that isn't dependent on issuing new stock is by depressing the executive comp package's value.

Or, y'know, if the Board — which is always composed of shareholders, who are, in public companies, expected to hold the stock value as their primary interest over the internal interests of the company, whether or not they're also officers of the company — operates by firing-and-replacing the CEO whenever the CEO does something the market responds sufficiently-negatively to. Or just makes it clear to the CEO that that's what will happen. (This is, in principle, where the infamous hypothetical "duty to shareholders" is supposed to come from. It's not a law; it's the market's ability to transitively fire the CEO through a share-price-incentivized Board.)


At least in the case of Google and Facebook, the owners have a different class of stock that has enough voting power that they cannot be fired by outside shareholders.


In public companies, majority (rather than plurality) shareholders who are also the officers of the company (i.e. where the CEO doesn't have to listen to the Board, because the CEO "is" the Board for all intents and purposes) are the exception, not the norm. Most companies have sold off an internal controlling interest by the time they go public. And, if they survive in the market, most other companies that kept an internal controlling interest initially, end up selling it off once all the founders leave or retire. (In fact, insofar as you can think of a retired founder as now having the interests of an external shareholder, the process is almost a tautology.)

For the relevant example, Nintendo's shareholdership — https://www.nintendo.co.jp/ir/en/stock/information/index.htm... — is composed of "47.43% Foreign Institutions and Individuals" (this category usually meaning "foreign investors"), and "31.07% Japanese Financial Institutions" (i.e. domestic investors.) So, (more than) 78.5% of Nintendo is externally owned. These are not voting shares, but that doesn't matter; they're a majority of shares, so they're controlling shares in practice: even if you can't vote, you can still do a coordinated dump of the company's stock to signal your displeasure. Thus, the institutional shareholders with these shares drive board allocation in a game-theoretic sense, rather than a legal sense.

That's why you see some very mysterious people (https://www.nintendo.co.jp/corporate/en/officer/index.html) on Nintendo's Board, specifically on an "Audit and Supervisory Committee" (a.k.a. "the actual Board; composed of people we didn't pick and don't really want here, but were strong-armed into taking by threats to do things to our share price; who can veto any decision made by the rest of the Board.") These are either large individual shareholders, or are "ambassadors" for the interests of institutional shareholders, or both.

Takuya Yoshimura is "Vice President at Mizuho Securities" — pretty clear why he's there. Asa Shinkawa works for an M&A company. Masao Yamazaki, retired Japanese railway tycoon, is likely there as a representative of his friends' institutional interests (and maybe the interests of some, er, "groups" in Japan); while Katsuhiro Umeyama, accounting-firm CEO, is there in a more formal (but not formalized) capacity to put a literal external auditor / comptroller lens on Nintendo's spending on behalf of whichever companies think that's needed.

Make no mistake — these people on this "Audit and Supervisory Committee" can get a Nintendo CEO or "President" fired, if they don't like their last-quarter decisions and resulting stock performance. That's pretty much all they can do; but in theory it's enough to steer the company, as they can just keep rolling the dice until they get a President whose policies happen to already align with theirs.


For companies with high levels of share-based compensation, share price is the denominator for converting grant values to shares, meaning the share price does affect the share pool consumption from any given grant.
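That conversion can be sketched with hypothetical numbers (the grant values and prices below are illustrative, not from any real company): a grant is denominated in dollars, so the share count, and hence pool consumption, is inversely proportional to the share price.

```python
# Sketch of fixed-value equity grants: the dollar value is fixed,
# the share count is derived from the price at grant time.
# All figures below are hypothetical.

def shares_for_grant(grant_value_usd: float, share_price_usd: float) -> float:
    return grant_value_usd / share_price_usd

# A $200k grant at $100/share, versus the same grant after a 50% price drop:
print(shares_for_grant(200_000, 100))  # 2000.0 shares
print(shares_for_grant(200_000, 50))   # 4000.0 shares: twice the pool consumed
```

So a falling share price doesn't just hurt employees holding vested stock; it also makes every new retention grant twice as expensive in shares for the company.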


Plus if others don't follow suit then you're going to have to rehire for more. Being early would be incredibly risky.

The reality is tech had lots of bloat that was sustained by a mix of high profit margins and cheap debt.


There is no big sector-wide conspiracy. A sector-wide conspiracy would be massively illegal. If, however, several companies individually decided a layoff is in their best interest for legitimate reasons (a looming recession rather than suppressing wages), then there can be the same practical effect without any criminal act.

Groups of people that are coordinating in ways and times others aren't can have a competitive advantage. The game is how to coordinate, and how to establish yourself as a member of the coordinating group, without actually coordinating because that's illegal.


Just saying, it has happened before [1]. Except this time they’d have the perfect cover of “no collusion, the economy is doing bad”. The CEOs of these large companies talk all the time. As do board members. Heck, some even share board members. It’s not that hard at this level to generate collusion that’s hard to prove.

Now I’m not saying this is what’s happened. I’m a big fan of Occam’s razor. But at the same time I wouldn’t dismiss it. Oh and the “massively illegal coordination”? Why don’t you look at what the outcome was of the DOJ prosecution of that case.

[1] https://en.m.wikipedia.org/wiki/High-Tech_Employee_Antitrust...


> There is no big sector-wide conspiracy. A sector-wide conspiracy would be massively illegal

The logic of this sentence is broken. As a thought exercise:

- There is no stealing, that would be illegal.

- There is no murdering, that would be illegal.

Clearly, something being illegal does not prevent it from existing. Otherwise it would not need to be made illegal in the first place.

Fact is, this has happened before with collusion, so there is no need to hypothesize. It is definitely possible.


It is possible to kill without it legally being murder. It is possible to have your property taken without someone legally stealing it. This is the point I was attempting to make with my rhetoric.

Put in game-theory language: the group's optimal outcome requires a set of players cooperating, even though defecting amongst a population of defectors would be the individually optimal strategy.

From a communications point of view, players have limited communication channels, and must both explain the game and encourage cooperation without directly doing so. The players have similar cultural backgrounds, having read the same books and having overlapping social circles. When a player announces their agreement to cooperate, they use clearly false reasons, implying there is much left unsaid. Maybe the stock bump that happens when a player cooperates is a positive reinforcement signal from people concerned about rising labor costs.


Your logic is somewhat fallacious: specifically, you create a false dichotomy between "there is an illegal sector-wide conspiracy" and "these things could happen across the sector if there were fully legitimate reasons for them to".

You are missing the middle, which is that there is no conspiracy, but that many companies in the sector saw, at around the same time (though possibly influenced by some bellwether I'm unaware of), the opportunity to juice their stock prices with layoffs even though they're making record profits and don't actually expect to be hit by a looming recession.


You don't need a formal conspiracy when interests converge.

https://www.youtube.com/watch?v=VAFd4FdbJxs


This has been done before


Employees in large companies have limited output.

You kill a few useless projects and you keep shipping the important stuff.

This is done mainly to please the market. Hire when the market is good, fire when the market is bad.

+ factor in the interest rates


You are assuming those workers were productive (in terms of contribution to the bottom line) to begin with, which a lot of insiders tend to disagree with.


I don’t think dropping the lowest 10% of developers is hurting your output. It might even raise it.


This has baked into it the incredibly naïve (and generally accepted to be incorrect) assumption that layoffs are performance-based. There are plenty of examples of FAANG and other companies laying off folks with 15, 20, 30 years of incredibly specialized and presumably valuable experience but with a compensation price tag to match.


If anything it's a great time to shed well-paid fat. Hire a noob out of MIT for 1/2 price and hope that the Sr. gave the project enough legs and enough documentation to allow the NUG to figure it out.

aka keep wages down


I mean that's certainly part of it. If (and it's a big if) you can run a project for less money, you probably should.

However I'm not sure it's as large-scale a thought process as "let's keep industry wages as low as we can" and more "I will look good if I can trim my budget, so I'm going to try to do that." The effect is the same, generally, but not quite as nefarious.


"I don’t think dropping the lowest 10% of developers is hurting your output. It might even raise it."

I have seen a few mass layoffs while I was a contractor at different companies. Most of these big layoffs weren't performance-based; they axed whole departments. Another group in danger was managers who didn't actually manage anybody; they were either demoted or let go. But in general I didn't get the impression that the layoffs were about performance. It was more about luck, being at the right place at the right time.


This does not work in practice. You don’t have a perfect metric to evaluate performance, and managers at all levels have some margin of decision and they can interpret different people’s productivity differently. The effects on morale are also usually terrible and can reduce the productivity of the remaining employees while the best ones polish their CV and look elsewhere.

This idea is rooted in a philosophy where people are interchangeable cogs that can be evaluated purely on a one-dimensional axis, don’t have emotions, and where you don’t have competitors taking advantage of this by getting good engineers for cheap you fired for ideological reasons.


Isn't that his point? The companies don't need the conspiracy to instigate layoffs, they already had a legitimate reason to do so.


This generally isn't how layoffs work because if there is any correlation between performance and protected indicators (age / gender / race / etc.), then a mass layoff may appear to be biased and open up a legal liability.

Instead, layoffs are shaped by specific reductions in funding for product areas, and a degree of randomization and rebalancing is mixed in to eliminate bias.


If you're Amazon it would... this wouldn't be the first time companies colluded to suppress wages in SV. It might not even be explicit collusion. These companies do have other motivations for these changes.


FAANG pays 5-10x what Nintendo pays, and every decade there's a 10% layoff? Seems like an OK trade to me. The whinging is reaching stratospheric levels.

Plenty of stable sub-$90k dev jobs are available if you want, but then you can't live in NYC/SanFran, buy 2 Teslas, and take trips to Hawaii/Europe.


How is this idea suddenly so widespread? Big Tech is currently at the edge of the abyss, to varying degrees. For most of these companies it has now been 4 quarters of revenues growing at a lower rate than expenses (and the gap is accelerating). At this rate it's obvious they have only a few quarters left before they go into the red. So why exactly should they keep employees around that they hired during the largest bubble in at least 40 years? Should they keep these people on and only start doing something once they are billions of dollars in debt?

Besides that, this class warfare rhetoric on HN is a cruel joke. You have the global 0.3% complaining that the 0.03% is stealing their $300,000 salaries. Why don't they accept a 75% pay cut so that the marketing assistant and cleaning personnel can get a salary more in line with theirs? If companies have to make HR decisions based on social factors this will be the result.


Then why did these companies hire massively just a couple of years ago, at the highest possible salaries in the world?

Not everything is a class-struggle conspiracy.

Insisting highly paid but unproductive employees not be laid off is insisting on a form of rent seeking.


Seizing the means of un-production.


They paid people more and then laid them off to ... pay them less?

This seems an unlikely strategy. More like a conspiracy theory.


That’s an interesting theory, how do you reconcile it with the fact that several companies which have had layoffs have also given out large stock refreshes to some of the employees who weren’t laid off?


Easy. You said it yourself. “Some”.


I suspect it's less about depressing worker compensation than increasing managerial compensation. Average CEO tenure is quite short these days, so the incentive is for execs to maximize short-term gains. Layoffs fit in with what I think of as "market sadism". In economically troubled times, CEOs counter fear of weakness by looking tough. (Tough on other people, of course, never themselves.) That lifts the stock price, putting money in their pockets, and hopefully giving them longer at the trough.


>In Japan, people value job security over maximizing expected value of compensation.

I don’t think that is it. In Japan, the way of maximizing compensation is by staying in your job as long as possible, because the pay is strongly tied to seniority.


On the whole, this is a rather more "equal" process. The top managers don't get paid the huge amounts they do in the US. However this also leads to all the unproductive situations you can imagine, with seniors running the company into the ground and the juniors not being able to do anything about it.


It always fascinates me how Japan is full of anachronistic, counter-productive traditions and is also one of the most advanced countries in the world at the same time.

The whole place is full of contradictions. If everyone works absolutely insane hours and has no life, one would imagine the life expectancy would be shot as well, just due to the stress.

Someone explain this to me.


Japan was one of the most advanced countries in the world.

The country has been stagnating for quite a while. South Korea and Taiwan have since eclipsed it by PPP-adjusted metrics per person. (Japanese make higher paper incomes than South Koreans and Taiwanese, but they also generally pay more for imports due to tariffs.)


My off-the-cuff theory is that perhaps their diet is very good, so it offsets some of the shortened life expectancy from stress? On the graph of stress vs. diet, where being in the top right is the worst for life expectancy, being high on only one axis might help.

Having said that, I don't know if the average Japanese diet is good. Just postulating a possibility.


On the whole it is probably better than the US diet, but it's not like they don't have unhealthy habits either. Smoking is far more common than in most Western countries. Drinking I'm not so sure about, but they do have the custom of forced work drinking parties, and many East Asians carry a gene variant that produces extra toxins when metabolizing alcohol, so the harm from drinking can't be negligible. I would also posit that the life expectancy is in part due to under-documentation of deaths: there are many cases in Tokyo where old people living alone are found dead only weeks later, when neighbors complain about the smell.


It seems Nintendo profits were 4B+ last year. It has ~7000 employees, working out to 500k+ per head in profit (not revenue). I can't readily find compensation data but given the low pay in Japan I'd be surprised if the average annual compensation is substantially higher than 100k. They definitely have a lot of room to increase pay despite whatever risk you mean. And it's not like past profits are necessarily going to help "weather the storm".
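The back-of-the-envelope arithmetic in that comment checks out (using the comment's own approximate figures of $4B profit and ~7,000 employees):

```python
# Rough profit-per-employee check, using the approximate figures
# from the comment above ($4B+ profit, ~7,000 employees).
profit_usd = 4e9
employees = 7_000

profit_per_head = profit_usd / employees
print(f"${profit_per_head:,.0f} per employee")  # $571,429 per employee
```

Which supports the "500k+ per head" figure, and that is with profit taken as exactly $4B rather than "$4B+".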


> 500k+ per head in profit

I expect Nintendo to have a different workforce composition than your typical SaaS.

They will proportionally have more artists and translators, who have less leverage/alternatives in the labor market than engineers. That would explain a lower expense line in the profit formula.

Obviously, this is not a value judgement. Diamonds cost more than water, at the margin, but that doesn't mean one is more important than the other.


> They will proportionally have more artists and translators, who have less leverage/alternatives in the labor market than engineers. That would explain a lower expense line in the profit formula.

And not just artists, but artists at a name-brand company who get to shape its games. People from all over the world would line up around the block for days, Buddhist-monk style, to get an artist job there. They can pay at or below market rate just because folks would kill to design the next Metroid game.


If OpenWork snippets [0] are to be believed, it sounds like people are making 7M JPY after a couple (5?) years. People cite entry-level pay as 5M, managers at 10M. General "salary based on years of service" stuff.

Now... I think in Kyoto that's a pretty decent number! Tech company salaries in Kansai were _so much worse_ than in Kanto when I looked (5 years ago, though). But if you're in Tokyo and making that... honestly you are gonna be better off elsewhere.

I have seen worse numbers at other game companies, but hearing 500k+ per head in profit, it's kind of disgusting they don't pay a bit more.

[0] https://www.vorkers.com/company_answer.php?m_id=a0910000000F...


To add insult to injury, I'm guessing the lawyers that help them enforce their IPs are paid better than their top engineers and artists, also because of a disparity in typical outside options.

My experience with Japanese people, and some anecdotes from a friend who worked at Nintendo, is that bureaucracy, cultural expectations that you shouldn't change jobs at all, and English ability are some of the strong limiting factors for Japanese rank-and-file employees to achieve high pay. It's a shame, as Japan has so many smart people who are both down to earth and innovative.


"And it's not like past profits are necessarily going to help "weather the storm"."

It is if they're not paying out (all) the profits to shareholders.

Nothing stopping them building a war chest for tough times.

Maybe that's why they're increasing pay and apparently not laying off staff, when competitors are doing the inverse?


The storm of wanting to keep their obscenely large profits at the same scale? I mean, that's what the other big tech companies who are laying off employees are doing.


What storm? Things are going great in employment numbers and GDP growth.


Couldn’t get archive.org to retrieve the content of your article, unfortunately.

Anyways, I’m not sure that this is the right metric to criticize a company, at least not without corroborating metrics. Two main questions I’d be interested before casting judgement are:

What did they do with the excess monies they had from their large profit:compensation ratio?

Were they paying above market rate for their industry?

I do see how my response betrays my assumption that profits do not always need to be shared with the labor pool that generated/oversaw the period of profits. That’s something I’m chewing on, personally.


It's Nintendo. They're taking advantage of the fact that people like the brand and want to be able to say that they helped make Mario or Zelda.

You can pay Disneyland ride designers less because there's status involved with designing Disneyland rides.


I believe Nintendo has about 20 years (not sure of the exact amount, but a ridiculous number anyway) of operating costs in savings.

If they stopped selling anything tomorrow they’d still be good for a few decades.


That sounds suspiciously like something a 134 year-old company would do to ensure its survival over the long term.


It’s a shame GE hasn’t managed to follow suit after 130 years.


Japan believes in lifetime employment and is thinking about what you'll be doing there in 20 years.

GE focuses on the quarter and DGAF if you're there in 2 years; ditto for technical initiatives, leadership, legal compliance, you name it.


Isn't that their normal cycle though? Produce a console that prints money, produce a console that bleeds money, repeat. Seems like their cash reserves keep them nimble enough to experiment and remain independent.


I don't think "bleeding money" is accurate for Nintendo. They are in the habit of iterating on their previous consoles instead of throwing everything out the window each time like some other companies do, so even with their consoles that flopped, it's not like the R&D money went straight down the drain, and I'm sure labor-and-materials-wise they're making some kind of margin on the hardware itself.


I don’t think they’ve had a console that bleeds money in the sense that PlayStations/Xboxes have, though they’ve had quite a few flops over the past 20 years.


Their only flop in the past 20 years was the Wii U. (The Gamecube is now over 20 years old.)


Previous poster was probably assuming, like I did, that the Virtual Boy was less than 20 years ago…


It wasn't a flop in my childhood. :'(


Given the cycles for Nintendo consoles this sounds pretty reasonable particularly now that they have a single line of consoles with the Switch instead of console + handheld.

It's not like the Wii U sold a lot. Who knows if the Switch replacement won't be a Wii U. Who knows if there won't be two Wii Us in a row.


20 years in savings is astonishing.


A human with 20 years of savings would just be considered financially sound and (barely) safe to retire.

Yet a business with over 6,000 employees with the same financial security is exceptional.

Perhaps there’s something to learn here.


Businesses are not people who get old, can't keep working, and retire. Businesses can borrow significant sums of money, issue stock, come up with a new product that can transform their revenues overnight, etc.

I know what you're trying to say but this isn't a great comparison.


That a huge company can be more confident in its future cash flow than a single human.


Companies don't retire so this whole comparison doesn't make sense.



> Nintendo has been historically notorious for making more money per employee than most companies

That is in no way a bad thing.


Then you sure would like this new concept, called slavery.


Agree with this.

The Marxist* notion that workers should get salaries commensurate to net profit of the company is weird and purely subjective.

Why should they?

There is an argument that wages should be tied to productivity, but that would lead to internal inequalities and would be very subjective as many worker outputs are hard to measure.

The current employer-employee "bargain" is for the worker to get a steady fixed income, independent of output or consistency, in exchange for the security that provides.

Self-employment is a good way to take on more risk for (potentially) higher pay.

Before anyone counters with "but what about unskilled workers who can't self-employ", I would return the question - yes, what about them?

I see no particular reason why profit of a company must be shared with its workers, most of whom did not outlay the initial capital (or subsequent funding rounds) that built the company.

*Maybe there's a better word for this or an earlier progenitor of this idea but I think this captures it well enough for this comment.


I can't tell if this is a parody of American Psycho or not.


> The Marxist* notion that workers should get salaries commensurate to net profit of the company is weird and purely subjective.

Is that actually a thing? I read Capital quite a long time ago, but I don’t think Marx advocated for any of that. He wrote about workers’ ownership of the means of production and how they should be the ones deciding what to do with the results of their labour.

In a market economy, that would lead to worker-owned companies, so the people deciding salaries and compensation would be the employees. How revenue is split is still a strategic decision: you still need war chests for difficult times, investments, etc. You’d hope that the pay would be fairer, but it’s a consequence of the employees making decisions, rather than something postulated a priori.

> Self-employment is a good way to take on more risk for (potentially) higher pay.

Indeed. And some people really like it, which is great. But you obviously cannot do the same things as a one-man shop that you can as a large company. A working economy needs both; otherwise the plucky independents quickly get pushed out of things like manufactured goods and food. Also, for some people stability is more important, and sometimes both sets of people are the same person at different points in their lives. It is not really a good argument against fair wages.

> Before anyone counters with "but what about unskilled workers who can't self-employ", I would return the question - yes, what about them?

This sounds fine and dandy until they take a rifle and take from you what they need to survive. The simple truth is that people are not going to just starve because you think they don’t deserve a living.

> I see no particular reason why profit of a company must be shared with its workers, most of whom did not outlay the initial capital (or subsequent funding rounds) that built the company.

OTOH, capital itself does not build anything or produce anything. You need people to actually do the work at some point, and they are at least as necessary and important as the rich dude who signed a check at the beginning. Never mind the fact that “investors” who get rich off the secondary market do not bring anything to the company of which they trade the shares. I do it as well at my (very modest) level, but I don’t pretend I am doing anything useful by buying and selling stock.


> In a market economy, that would lead to workers-owned companies, so the people deciding salaries and compensations would be the employees.

This seems to make no sense - managers and C suite are employees already, and they decide salaries.

But more generally: what is the actual way they should reward work?

I.e. if you just pay everyone the same, what happens to your company if another company that pays based on value decides to hire all your top employees at 20% more?


> managers and C suite are employees already

They are mandated by the board, and ultimately by the shareholders (who, yes, can be employees in principle but most of the time are not).

> What is the actual way you should reward work?

Pretty much the same way it works usually, in that respect. This structure actually exists already: employee-owned companies are common in some countries, and it is not particularly problematic. These companies do not pay everyone the same salary, but they tend to be less unequal.

Also, I am not really advocating for or against this, just noting that this would be how you steer towards de-alienating the workers in a market economy, based on the theory in Capital. Though yes, that falls short of a communist utopia, it is still a step in the right direction from that point of view.


> This structure actually exists already, employee-owned companies are a thing that is common in some countries

I'm not sure about common, but yes, it does happen. I'm asking how the commenter expects this to work in practice. What's the difference between my manager telling me my new wage and my comrade telling me my new wage?


It's not how it happens.

You usually democratically determine the process by which you arrive at individual compensation. I've seen an organization use a mix of measured billable output blended with time spent on internal company workings, tooling, management, and so on. There is usually a way to redistribute a couple of percent ad hoc, e.g. when someone has financial difficulties or a baby or something.

There is always a financials dashboard for everyone to see and complete openness and even discussion about compensation.

But yeah, I have only seen a handful of companies like that. Some were not even worker-owned, but had owners who said "OK, here is a fixed share of revenue that goes to us before you distribute the rest among yourselves" and then let the workers do the rest.


But what did the workers doing the rest actually look like? To me, almost all companies are like that: owners own a chunk of the company and authorise salary budgets, the board decides top executive pay and employees such as executives and managers decide how to distribute the pay budget.


> The Marxist* notion that workers should get salaries commensurate to net profit of the company is weird and purely subjective.

The opposite is rampant inequality, where workers get paid pennies and owners keep most of the profits.

> I see no particular reason why profit of a company must be shared with its workers, most of whom did not outlay the initial capital (or subsequent funding rounds) that built the company.

You are OK with the people who create the actual thing a company sells not getting their share. Only people who provide capital should.

Where do you think wealth comes from?


Can you cite the section of Capital you're arguing against? I want to understand both sides.


> I see no particular reason why profit of a company must be shared with its workers

I am a devout capitalist and I largely agree with your argument, but the Marxist argument isn't rooted in the desire for every man's labour to be as valuable as its demand. It's rooted in the belief that every man's labour should be treated as equally valuable. This is a moral argument, not an argument about real value or evidence thereof.


Japanese firms also have lower salaries.


Even compared to tech giants?


Yes.

In 2022, Google was at $282 billion / 190,234 employees = $1.48 million per employee.

In 2022, Nintendo was at $13.923 billion / 7,136 employees = $1.95 million per employee.

In theory Google might be more consistent, but the article was written in 2008, when Nintendo was at $1.6M per employee, beating Goldman Sachs at $1.2M and Google at $600k.
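The arithmetic above can be sketched as a quick check (figures as quoted in this comment, in USD; not independently verified, and fiscal years and exchange rates differ between the companies):

```python
# Revenue per employee, using the 2022 figures quoted in this thread.
companies = {
    "Google":   {"revenue": 282e9,    "employees": 190_234},
    "Nintendo": {"revenue": 13.923e9, "employees": 7_136},
}

for name, c in companies.items():
    per_head = c["revenue"] / c["employees"]
    print(f"{name}: ${per_head / 1e6:.2f}M per employee")
```

Running this reproduces the ratios discussed: roughly $1.48M per head for Google and $1.95M for Nintendo on these numbers.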


(2008). It was the year


I’ve always heard that Japanese software developers were very poorly paid compared to their US counterparts. This 10% might be pretty terrible if your peers received large raises for years beforehand.

I do feel like something is off with Nintendo; their games have seemed very scarce recently. Either they’re launching a new system at E3 this year, or Covid really messed up their dev practices.


Japanese employment law is a bit of a double-edged sword. It's very difficult to fire employees, so if you hire someone on a high salary and it turns out they weren't worth it, you are now stuck with that employee for years or decades. You basically have to show that the employee is malicious to get rid of them. Low wages are one way to soften this risk.

The other side of this is many people are not full time employees. Dispatch (派遣) employees are employed by one company and then sent to work alongside regular employees at other companies, with the dispatch company taking a good slice. The employer is now able to shrink their workforce more easily and the employee gets a smaller piece of the pie.

Simply writing a temporary contract might not even be enough. Recent changes to the law say that if you have been on a temporary contract that has been renewed for 5 years, then you have the right to full-time employment with the same conditions. Now you see plenty of people getting let go at 4 years, just to be on the safe side.


The traditional way Japanese companies get rid of unwanted employees is to assign them tasks so meaningless and mind-numbing, or just no tasks at all, that they quit by themselves.

Random link: https://www.theregister.com/2013/08/21/boredom_rooms_japan_e...

This might be less effective nowadays with ubiquitous internet access.


It's difficult to fire someone, but Japanese businesses do reductions in salary for poor performance. There's no more risk to giving pay increases than there is to hiring someone.

Part of the reason that Japanese businesses don't give pay increases is because there's effectively no inflation in Japan, and growth in general has been mostly stagnant (though in reality that's not really true, but rather an excuse); but, the other reason is because for the most part, workers work in a company for life and don't switch companies for higher wages. There's no real competition on salaries, so no reason to be competitive on pay increases either.


I was wondering why the software industry didn't take off in Japan like it did in the USA.

Now I know: it's the employment rules that make firing people hard once they're hired.


It's much more complex than that.

Japan has the lowest TOEFL score in Asia amongst 30+ countries. English proficiency is much higher in both advanced and developing Asian economies. This is not inherently a blocker for software development, but it reflects a trend counter to globalization and being receptive to American practices.

Japan has fewer billionaires than Hong Kong, South Korea, or Taiwan, each of which has a fraction of Japan's population. Tax policy is one part of this, but there is also a cultural aversion to risk-taking. The typical apartment lease is 2+ years, which reflects the sense of long-term commitment. There's also more restrictive regulation, and policies are actually followed to the letter. Even human-powered bicycles require registration, and not-too-powerful e-bikes require license plates, registration, and insurance. All of this means there are fewer startups, less of a startup ecosystem, fewer hackers. Heck, Indonesia has since grown a more vibrant startup ecosystem and even deca-unicorns.

Also carrying over from Japan's manufacturing era is a prioritization of reliability and craft. Japan is still building appliances and electronics that last decades, which is at odds with an SDLC chasing shiny new smartphone features every year. Unfortunately, consumers have spoken with their wallets: they value updates more than reliability. Ironically, Japan's software is pretty crap and not reliable. My bank is one of the more vibrant ones, going through a DX ("digital transformation," as it is called), but all of the technology, hardware and software, is atrocious. It's the second time within a year that my ATM card has stopped working because it got demagnetized. It's a lot of security theater.


There's always more than one reason for such a complex question. It would be just as fair to say that the Japanese economy was more focused on manufacturing than services when the software industry really started to boom, and by the time it transitioned to a service-based economy, its population was skewed too old to be a major player in the new industry.


The software industry here didn’t take off because employers could just fire anyone anytime.


It's not a choice between "never able to fire someone" and "fire anyone anytime".


Yes, but what does that have to do with my comment? The ease with which an employer can fire someone doesn't have much to do with why the software industry in the US took off versus other countries.


> what does that have to do with my comment?

Let's say hypothetically that your second sentence is wrong, and ability to fire people more easily than Japan is why US software took off. That doesn't mean it needed to be "fire anyone anytime". Just that it needed to be less locked-in than Japan.

In other words, someone made a claim for why the Japan software industry failed, which was employment law being at this extreme. You then claimed the other extreme was not necessary for success. And it looked like your post was intended as a counterargument. But it's not a counterargument unless those are the only two options for employment law. So I pointed out those aren't the only two options for employment law.

> The ease with which an employer can fire someone doesn't have much to do with why the software industry in the US took off versus other countries.

Okay, I think that's what your original comment should have said. Or it should have said "The software industry here didn’t take off because employers could fire people more easily than in Japan."

And then ideally supported that argument with more data.

Instead you made a weaker argument that doesn't help narrow down the truth very much.


I think it has more to do with a large but not internationally transferable domestic market. There is demand in the domestic market, but selling products outside it would require changes, making it slower to expand.


Eh, the Chinese firewall creates the most insular, oddball large market, and yet some Chinese companies, like TikTok, do compete internationally.


TikTok is state sponsored spyware, though. They're getting state money hand over fist. Probably less so now that they've been bootstrapped.


It’s not like the Japanese are known for laissez-faire industrial policy either.


True in France as well, though my perception is that the software industry in France is better developed?


The software industry took off in the US because of government funding into Silicon Valley more than anything else.

Get the best people in one place and good stuff will happen.


No, Silicon Valley happened because software took off, not the other way around.


This isn't accurate: as the name suggests, Silicon Valley happened because hardware took off...

...which happened because, after WW2, the federal government was very interested in establishing a decisive lead in computing and similar technologies [1] [2] [3]. The US military was a key early customer of Silicon Valley, and industry research labs like Xerox PARC received heavy support from publicly funded grants. Projects like ARPANET [4] paved the way for the Internet as we know it now.

To this day, programs like the Silicon Valley Innovation Program [5] continue to subsidise tech firms. This is a feature, not a bug: transitioning new technologies from early basic research to proof-of-concept to commercial viability takes time, more time than many companies and VC firms are willing to wait. ARPA, NSF, etc. exist largely to support this transition, which is why they deploy funding in both industry and academic settings.

[1] https://onezero.medium.com/the-hidden-history-of-how-the-gov... [2] https://marianamazzucato.com/books/the-entrepreneurial-state [3] https://computerhistory.org/blog/the-valley-and-the-swamp-bi... [4] https://en.wikipedia.org/wiki/ARPANET [5] https://www.dhs.gov/science-and-technology/svip


You're both right and wrong at the same time.

Tech ecosystems are based on self-reinforcing feedback loops.


The law should add: the sum of dispatch employees over the last 5 years, divided by 5, is the number of dispatch employees that have to be converted to full-time employees this year.
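Interpreting that proposal literally (a rolling five-year average of dispatch headcount sets the conversion quota; this is my reading of the comment, not any actual law), a sketch:

```python
# Hypothetical conversion quota from the proposal above: the average
# dispatch headcount over the trailing 5 years must be converted to
# full-time employment this year.
def conversion_quota(dispatch_headcounts: list[int]) -> int:
    """dispatch_headcounts: yearly dispatch-employee counts, most recent last."""
    last_five = dispatch_headcounts[-5:]
    return sum(last_five) // 5  # integer quota, rounding down

# A firm that kept ~100 dispatch workers on the books every year would
# owe ~100 conversions annually, removing the incentive to churn people
# out just before the 5-year mark.
print(conversion_quota([100, 100, 100, 100, 100]))  # -> 100
```

Note that under this rule, letting a worker go at year 4 no longer helps: their headcount still counts toward the average.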


Surely after 4 years you'd know if they were worth it, and would convert them to an employee if that was your only concern.


> Surely after 4 years you'd know if they were worth it and convert to an employee

I think in many/most cases, the goal is not to have an additional permanent employee.

Most large, reputable companies completely gutted their hiring numbers for permanent positions after the 1989 crash. They still needed warm bodies to do stuff, so this “dispatch” system became much more common.

An unfortunate side effect of the growth of the dispatch system is that young people rightfully feel much less secure in their long term employment, so they ended up delaying or never making long-term commitments like marriage and having kids.

Imho, the growth of the “dispatch” system has been one of the leading reasons that Japan had the “lost decade” that seems to be working on its fourth decade. The federal government and the large corporations of Japan really dropped the ball on this, and they need to look in the mirror when they wonder out loud about the current societal and economic malaise.


While everything you say is true, this isn't really unique to Japan though? Shoving more and more work off to temps and vendors with much worse benefits than full-timers has become hugely common at large companies in the US and Europe as well.


> While everything you say is true, this isn't really unique to Japan though?

I can’t speak to many other countries, as I haven’t really looked as deeply into them as I have into Japan.

That said, the differences in Japan are significant in social ways that they are not in other places (like the US, where I currently live).

Want a decent/nice apartment? Some places won’t rent to you unless you have a permanent job (the dispatch jobs often/typically don’t count). It doesn’t matter if your income is much higher than it would be while working at the permanent job.

This exact experience happened to patio11 in a provincial part of Japan where it would probably have been prudent to try to be more accommodating to financially successful residents. For reference, he quit his salaryman job because bingo card creator was making a multiple of his salaryman income. When it came time to renew his lease, the landlord was not accommodating even though patio11 had proof of his drastically increased income.

Want to get married as a male? If you don’t have a permanent job, then your marriage stock takes a massive hit. The dispatch jobs are considered much lower on the prestige scale.

The list of detractions goes on.

Note that this probably wouldn’t be a big deal if the changes had been gradual and/or society had changed just as rapidly, but those things didn’t happen. The hiring was completely gutted, and it never returned to anything close to prior form.

I will add that these changes happened during a time of rapid change in regulations (mostly deregulation) in Japan in the 90s. I personally think that these changes were probably good from a corporate-level economic perspective. That said, these changes also led to a complete and irreversible shattering of the social contract in Japan. It was not a change that most people wanted. It is a change that the government and the large corporations, with much goading from the US, foisted upon the broader Japanese population with complete disregard for the social fabric of society.

What could have been done differently? The changes could have been slower. Many of the changes just could have not happened, imho. Social support systems could have been developed before or while these changes were implemented. Pro-consumer laws could have been enacted.

Instead, the Japanese bureaucrats showed how deep in over their heads they were at the time by just acquiescing to western pressure to deregulate without really understanding why the existing system in Japan actually worked (short version: a lot of “non-competitive” features of the broader Japanese economy functioned as a de facto social safety/support system that never got replaced).

Note that patio11 has discussed this topic in more detail with more references than I currently have, so I would search for his stuff if you want to dig deeper.


Here is a patio11 blog post that has some relevant content:

https://www.kalzumeus.com/2014/11/07/doing-business-in-japan...

Relevant HN thread:

https://news.ycombinator.com/item?id=8573992



See my same-level reply for exact posts and threads.


This is a practice happening here in the Netherlands, due to employee contract law.


So why would someone show up to work after being hired?

Why not just surf the internet and play games all day at your desk?


"Failing to obey company orders" is a terminable offence. Any instruction that's not explicitly illegal counts as a company order.

But with the Japanese social preference for conformity[1][2][3], usually there is some mutually beneficial solution rather than firing.

[1] "The nail that sticks up gets hammered down."

[2] "Do not cause disturbance or inconvenience for others."

[3] "Everything should be frictionless."


I have a friend at Nvidia who worked on the GPU for the Switch, and so worked closely with Nintendo employees, and he said he was appalled at how much more he made than people at equivalent seniority there. It's pretty bad.


It is pretty likely that Nvidia engineers are also outliers compared to the market. It's FAANG-level compensation, no?


Devs in all countries are poorly paid compared to their US counterparts.


It makes sense for other countries, since they don't have global giants paying absurd amounts. But Japan has its fair share of these global entities; their salaries in theory should be competing with those of the US. Maybe it's the language barrier that gives their companies some leverage in negotiations?


The last time they launched a big Zelda game, it was pushed to two consoles (Switch and the previous gen Wii U).

So maybe you're right. The next Zelda is coming in several months.


I’m not so sure. The new Zelda appears to use the exact same engine as the previous one.


Doesn't really tell us anything. It's a lot easier to port an engine vs writing a new one from scratch, especially if the new system is also just an ARM64 processor with an NVIDIA GPU.


Does Nvidia still make mobile GPUs? Tegra was a flop afaik, which is why Nintendo picked it, since they could buy whole container ships of them for cheap.

What’s a relatively modern SoC that sold poorly and was manufactured in large volumes? That’s probably what the next Nintendo console will use.


They make desktop-class mobile GPUs, but nothing like the Switch's. However, they do still make embedded dev kits (https://www.nvidia.com/en-us/autonomous-machines/embedded-sy...), with their newest ones running on Ampere.

The Jetson Nano uses the same chip as the Nintendo Switch, with half the CUDA cores, IIRC.


Nvidia is believed to be developing a custom mobile GPU for Nintendo's next console.

They posted a job ad in 2022 saying "looking to hire a deeply technical, creative and hands-on software engineer to pioneer the next generation of Graphics Developer Tools for Game Consoles"

Nvidia's only game console client is Nintendo.


My ideal new Switch 2 would take advantage of AMD's mobile APUs, but I think they're hamstrung by backwards ARM compatibility, given how good the Switch's library is.


If everything’s built against an abstracted SDK, they could have games ported over with just a recompile potentially. If the new system is much higher performance then any optimizations against ARM would be a non issue. If they allow lower level access to the hardware, it would be a lot harder, but you could conceivably implement a GPU emulation at the hardware layer. Or depending on costs, just include the entire SOC for the compatibility mode, like PS3 did for PS2.


I think if they want to keep the form factor they need the higher performance per watt that arm cores can offer.


"Exact same"? That's a colossal assumption. A company the size of Nintendo will absolutely make improvements to the new iteration of the engine - some clearly noticeable to players, some not. Thinking back historically, many game series used the "same" engine. Quake 1/2/3, Halo 1/2/3, Command & Conquer / Red Alert, Witcher 2/3/Cyberpunk2077, Elder Scrolls 3/4/5/Fallout3/NV/4 and so on (too lazy to think of more). Further, think of how many games are made with Unity, Unreal Engine, etc. Yet further, check the credits list of Breath of the Wild -- they absolutely have the workforce to make serious changes to the engine and work on a compelling new game in concert: https://www.mobygames.com/game/switch/legend-of-zelda-breath...


Do not underestimate how incredibly cheap Nintendo is. The latest Pokemon games were riddled with bugs that you would not expect from the biggest media franchise in the world by revenue.


Nintendo has very little say in what happens to Pokemon


Wait Nintendo makes Pokémon games?


A Nintendo-owned company does.

Game Freak develops them.


This is also true of Zelda Twilight Princess. It released on the Gamecube and the Wii.

Nintendo has a history of launching consoles with the newest Zelda game.


Depends on the company. Nintendo pays much better than pretty much every game company in Japan.

In the last 5-10 years, the gap between Japan and Western Europe in top-tier company salaries has decreased a lot: you can now get $100k or more.

We'll see how long the $500k+ total comp in the US will continue for the top tier. My prediction is that it will decrease a lot in the next 5 to 10 years.


Game devs in the US are paid shit too. The average is around $90k... I can't imagine it's that much worse in Japan.


90k isn't terrible for a creative position. The crunch makes it not worth it though.


If you think 90k is shit, you should really consider not living in NYC, LA, or SF. 90k in most non-metropolis cities is upper middle class.


Most people that live in a HCOL city cannot just move to a LCOL city and get paid the same - you think they've never considered that?

The point of using the national average is that we can compare it to other careers. If we compare game devs to other software development positions they get paid significantly less on average. The BLS puts software engineers at an average of $109k for 2021 (couldn't find 2022 stats, not sure if they exist yet?)[1]. That means game devs make 20+% less on average compared to their non-game counterparts.

1: https://www.bls.gov/ooh/Computer-and-Information-Technology/...


Well what would you expect? The work is significantly simpler, and the conditions are much better. So… wait. That’s not quite right. Hmmm.

Try again. What would you expect? The value to the employer is significantly lower because each developers contribution is… No wait.

Tell me again how this capitalism thing works because I get confused?


Supply impacts wages. More people want to be game devs per open position than want to be non-game devs per open position.


Right, right. So it’s just supply and demand. So here in Australia where the number of surgeon positions are constrained by a centralised body, supply outstrips demand quite a bit. Let me google the old market at work!

Googling… Hmmm…

Are you sure this thing is working?


Surgeons and game devs are not comparable. There's no free market in medicine in most countries as the doctors have formed their own cartel/union which decides who gets to be let in and under which conditions and without membership to that tribe you cannot practice.


So why doesn’t every group with valuable skills form similar cartels? Why isn’t there some “game dev” certification body to achieve the same?

Now I understand what this Chartered Professional Engineer thing is all about! It’s a license to print money by controlling supply! I want in!


Because SW devs never wanted to unionize in any way; they say they'd get held back by unions and make more compensation without them.

And no, most chartered engineers do not really print money; some make even less than SW devs.

In my insignificant EU country, nearly every profession has its own labor union, and guess what: the IT union is one of the weakest, with some of the worst contract terms and working conditions for a skilled profession, while the metalworkers' union is one of the strongest, with some of the best perks, rights, working conditions, and mandatory wage increases in the country.


It’s not a dirty leftist union, it’s a certification body to ensure the high ethical and professional standards of our members!

We can have cool ceremonies! Stuff to hang on your wall! A club house!


Well, it's hard to get in, which is the whole point.


> are constrained by a centralised body

> supply outstrips demand quite a bit

How does that follow? It is constrained therefore supply is artifically lowered. No matter the demand, no one can supply more doctors than the body permits, therefore the price for each is quite high.


Not demand for the services, that’s high. Supply of talented humans for the role is much much greater than the positions available. So based on the aforementioned logic; lots of competition for limited spots should reduce the wages paid right?


The medical licensing body is making the market in surgeons. Their interest is in keeping wages high and supply low. In fact, “making the market” is not even the right analogy - the licensing body is the sole supplier of surgeons.

Nobody is making the market in game devs, so market wages prevail. Gaming happens to be the world’s most popular form of entertainment so it’s not surprising that more people want to be game devs than say write ERP software.

There are also way fewer game dev jobs than there are general dev jobs. This isn’t rocket science.


Yes, that is why residents are paid pennies. But once you are a fully qualified surgeon, you are in the artificially limited pool of people who can operate on patients. Which leads to high salaries for surgeons.


Your pay is defined by your colleagues and potential colleagues, and unrelated to the value you provide


Kyoto isn't really a LCOL area either.


yeah but you get to live in Kyoto


And deal with one of the most tourist-packed cities in the world with some awful commutes.

Buses just aren’t worth riding when tourism is high (you’ll be waiting forever with dozens in line ahead of you and then stuck in traffic) and the train routes aren’t nearly as convenient as a lot of other Japanese cities.


$90k is shit even in low cost of living cities [in the US]. A nice car and insurance is $15k/year. Groceries have doubled or tripled in cost.

If you mean, like, Bangkok, then sure.


"Shit" is an overstatement. If a nice car and insurance is 15k, rent is 1500 bucks a month (so 18k a year), and groceries are 10k a year (and that's double the average), that's a total of 43k a year. Add in health insurance for what, 5k a year (if you're making 90k, insurance shouldn't be more than 400 bucks a month)? 90k yearly is ~67k a year after federal taxes. After all of the necessary expenses, you're left with 17k left over. Student loan payments probably cut into that, but that's still a significant sum of money. This is especially true if you can cut that rent down, or don't drive a super nice car, or have a smaller insurance payment through your workplace, etc.


That $17k surplus assumes that this person will be doing nothing remotely fun with their lives. Add even a modest amount of travel, movies, restaurants, bars, etc., and that money disappears quickly.


I'm not saying it's exceptional pay, but if you're even considering leisure travel you're not getting "shit pay."


That's only 17k net a year left over before savings, travel, or restaurants.

Going out to eat almost anywhere decent is a hundred bucks these days.

$1500 a month is not a reasonable assumption for monthly housing overhead even in a low CoL location. Also mortgage interest rates are much higher now.


If 1,400 a month for non-necessities isn't considered good, I struggle to see what a reasonable definition of "good" is.

1,500 is perfectly reasonable for a low cost of living area. I currently pay 1,800 a month (and that includes electric, which also powers my heat) in a state that most consider to be one of the highest costs of living in the US. For that amount of money, I could be renting a standalone house in much of the country.

I also think that your assertion of going out to eat costing 100 bucks is inaccurate. Many decent restaurants have entrées in the 18-25 dollar range. Add in a beer and an appetizer, plus a 30% tip, and you still only hit 65 dollars (assuming the app/entree cost 40 bucks combined and the beer costs 10, which is a high estimate).

All of this also assumes that you drive a nice car, a luxury that most wouldn't consider a necessity. The 15k number assumes 1250 combined for car payments and insurance, a number which many would consider much too high.

Is it the most glamorous life? No. But calling it "shit pay" is a massive overreach. If you can afford a solid car, your own place, all the necessities, and still have 1400 bucks to play with, I would consider that a pretty solid life. Certainly above "shit."


$100 for one person eating out?? I feel like even at a nice sit down place in San Francisco when I visited late last year I was paying much less than that.

In low cost of living / mid cost of living (LCOL/MCOL) areas you can definitely eat out for two at a decent place for less than that. Pricing out for two at my favorite local Italian place in San Antonio, here's the bill:

$15 - spaghetti with homemade meat sauce

$16 - penne arrabiatta

$20 - wine

Pad 28% for tax and tip and you come out to ~$65.

On San Antonio Zillow, I'm finding almost 200 results for 2 bedrooms at $1200 or less per month.

I don't think San Antonio is even the lowest LCOL location you could find, by far. Even in Texas, not looking outside the state, El Paso's way cheaper, maybe Corpus Christi.

I think people often overestimate how much it costs to live quite well in a LCOL area.


The median income in the US is ~$35k. $90k in a low cost of living city is a ton. Nationally, $90k puts you almost into the top 20%.


Median household income in the US is 70k. 90k income for an individual isn't shit for any city let alone lower cost ones.


An entry level game developer (1-3 years of experience) earns an average salary of ¥8,386,388. On the other end, a senior level game developer (8+ years of experience) earns an average salary of ¥14,449,841.

Which is about 90k to 160k Canadian.
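For what it's worth, the CAD figures above work out roughly like this (assuming ~¥90 per Canadian dollar, a round-number approximation I'm supplying here; actual rates fluctuate):

```python
# Converting the quoted JPY salaries to CAD at an assumed rate of
# ~¥90 per CAD (illustrative only; not a live exchange rate).
JPY_PER_CAD = 90

for label, yen in [("entry", 8_386_388), ("senior", 14_449_841)]:
    cad = yen / JPY_PER_CAD
    print(f"{label}: ~C${cad / 1000:.0f}k")
```

On that assumed rate the quoted salaries land in the low-90k to ~160k CAD range, matching the rough figures above.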


This seems absurdly high. I barely see people making those kinds of salaries in large foreign companies.

Anecdotally, my friend earned something like 5M yen and was working 10-12 hour days including Saturday at entry/mid level.


¥8M is common now, and I know plenty of people with salaries over ¥10M, but very few go over ¥20M and even fewer over ¥30M.

I would suggest you re-evaluate your market opportunities if you are an engineer getting paid ¥5M in Japan.


You must be living in a different world. I know exactly zero people with a salary over 20M, other than maybe the CIO, and probably because he migrated from the US and kept whatever compensation they gave him there (plus the expat bonus).

My starting salary was 2.1M yen in Japan, and until my current position never breached 5.5M


You might be grossly underpaid depending on your current skillset and geographic location. What kind of work are you doing now? Also, how many years have you been working? If you are in Tokyo and you are senior, you should be above ¥8M at least.

Happy to discuss more over chat or something similar. Let me know if you want to compare notes.


Not at Japanese companies, which is nearly all game developers. The high end of the market at Japanese companies for staff-level engineers with 8+ years of experience is 12-14M. 4-5M for entry level is not unusual.



I think you've prematurely dropped-the-mic. Linking a URL doesn't legitimize your claim.

I've worked extensively with the best salary datasets available. Even then, you shouldn't believe everything you read; even when the data is good, it's not reality. I fail to believe SalaryExperts are anywhere near experts in this case when there are so many odd points on that page.

For starters, it's kind of weird to fixate game dev salaries on Tokyo when the Kyoto Triangle is a thing. I have on-the-ground context on Japan; I can see the Nintendo office from my backyard. People who aren't familiar with Japan don't realize that Tokyo City doesn't even exist, or at least stopped existing over a century ago. That alone is a litmus test for why you should trust the anecdotal evidence of people who actually live and work in Japan over a cherry-picked website from a Google search for "game dev salary Japan".


That data is pure fantasy.


What's off with Nintendo is that they're getting old. Unlike US software companies with high turnover, a lot of Nintendo people are lifers (Japanese business culture).

Nintendo's well-known for taking risks with gameplay (Mario, Metroid, and Zelda were all new genres). But, as people get older, they get more conservative. The company is very heavily invested in their franchises and they seem to be taking a lot fewer risks in terms of gameplay innovation and new IP.


Isn't it [still] rather useless to compare US wages against other countries'? I mean, I'd probably be on a huge income if it didn't just cost me €90 (ninety euro) for a 6-day stint in hospital. Let alone the drugs, sleep meds, replacement for my regular chems, etc. Plus a six-week, twice-a-week physio course after: €35 for 12 x 50-minute sessions. At this rate, with just a couple of car crashes and an accident at work, I'd be a millionaire in no time. In America.


UK real wages have been declining for years. I promise Americans are actually significantly wealthier and higher paid than British people, and FAANG workers have good healthcare.


Even accounting for healthcare, US software developers are far better paid than anywhere else. I think the UK is maybe the next best paid and it's still only 50-70% of the US.


Last I heard they're skipping e3 this year (along with MS and Sony).


They've been doing that for many years now and hosting Nintendo Directs instead.


The economist has an article this week that discusses this on a broader Japanese economic level:

https://www.economist.com/finance-and-economics/2023/01/31/i...

>Japanese firms have long been reluctant to raise pay. But in the face of protracted inflation, leaders have begun to change their tune. Keidanren, Japan’s business federation, has urged members to give consideration to rising prices. Some multinationals and big regional firms promise hefty pay increases. Fast Retailing, parent company of Uniqlo, a clothing giant, announced raises as high as 40%; Higo Bank, a lender in Japan’s south, plans to lift base salaries by 3%, the first such rise in 28 years. The question is whether the smaller firms that employ 70% of Japanese workers will follow suit.


This is the pendulum swinging back after cost-of-living increases due to inflation. Expect lots of other companies to quietly do raises. I'm already seeing that recruiter spam in my inbox is quoting 30-40%(!) more for senior roles than they were 4 years ago.


Speaking of keeping up with inflation: AAA video games have been $59.99 for like fifteen years. Sony has been flirting with finally inflation correcting this and a lot of people freaked out for some reason. I wonder if 2023 will be the year this takes hold.


I'm convinced that's more that games were very overpriced for a long time. Or at least they were expensive because sales were generally lower. Games used to be ridiculously expensive.

Big publishers like EA, Activision, Take Two, etc. All still made/make killer profits on games. Development costs more today, but games are also selling a lot more plus there's other revenue streams like microtransactions and subscriptions. They really aren't struggling for cash.

Nintendo has the right idea. They just basically never drop the price of their games, whereas most other publishers reduce prices by 50% within just a month or two. Big games often have long tails.


>I'm convinced that's more that games were very overpriced for a long time.

Dunkey (famous YouTuber) made a pretty interesting short video essay on the topic:

https://www.youtube.com/watch?v=zvPkAYT6B1Q

His takeaway is that video game pricing doesn't make any sense, and never has.

$60 for a beautiful work of art that you'll remember for the rest of your life is the deal of a lifetime, whereas $60 for a forgettable, shameless cash grab is a complete rip-off.

I run a retro games business now, and the pricing honestly makes a lot more sense from an economic point of view. $300 for EarthBound is still a better deal than $8 for Final Fantasy XIII-2.


> $60 for a beautiful work of art that you'll remember for the rest of your life is the deal of a lifetime, whereas $60 for a forgettable, shameless cash grab is a complete rip-off.

This exactly. I can't remember what I paid for Factorio or Subnautica or Valheim but 100x what I paid still would be amazing value vs 1/100th of what I paid for Fallout 4 would still be too much.


>$60 for a beautiful work of art that you'll remember for the rest of your life is the deal of a lifetime, whereas $60 for a forgettable, shameless cash grab is a complete rip-off.

I don't think this is how experiences work. Because you haven't had the experience yet, there's no way to know whether you'll like it or not. Second: what's the point of the pricing? Covering the expenses of the creator and publisher? Giving them a buffer for future work? Extracting as much from people as they can? These yield very different outcomes.

In the end, pricing making sense boils down to one's economic view. If you think about it, it's about the fairness of the compensation, and so the thinking centers around what's fair, how to ensure fairness, and who should be included in fairness and to what degree. And so you get the "let the market decide" people, the "manage it centrally" people, and many other people too, of course. And pricing might make sense only in a framework like that. With no framework, it will never make sense.


A used games business has the luxury of hindsight. Earthbound is a great example: Nintendo didn’t know it would be a “$300 title.” They weren’t even confident it would sell in the West.

Lots of titles we know today as timeless classics were like this. They were incredibly anxious about Super Mario Kart and Mario 64. They were so unsure of the entire N64 console that they made a controller with a fallback D-pad for 2D gaming.


Also some fraction of sales have moved digital. Meaning the whole retail chain that used to exist is not there to take their cut, moving it entirely to publishers.

Manufacturing, warehousing, shipping, warehousing again with distributors, and shipping to stores, which then held inventory and sold it: none of that was cheap.


I don't think the games were overpriced until you factor in overmonetization through season passes, DLC, and microtransactions in many games that release at full price. The full price would have been fair; the rest of the anti-consumer business model makes the full price usurious.


People forget that games for the N64 retailed for $70+.


Game cartridge ROM cost was very expensive and had very long lead time.


Supply of games is a big factor here as well.

Way more quality game makers


> AAA video games have been $59.99 for like fifteen years.

AAA video games haven't actually been $59.99 for over a decade. AAA games for the last decade or so have had an "entry-level" / beginners plan (at that $59 price point), but content that previously would have been in the base game has been sectioned off into upsells / DLC / add-ons for an extra $15 to $40/ea.

I don't personally care much (indie games have gotten so good, I barely play AAA stuff much anymore) -- but in the interest of honesty, inflation correction has already been happening in gaming this entire time. However, the "add on another extra 15% just for the CEO alone" increase that other industries did in 2021/2022 hadn't happened in gaming yet, and that's what the higher base price is for here.


Man, some of my sibling commenters are out of touch.

Tens of millions of dollars funding for log analysis / picture / messaging b2b startups, selling $100/host/month is fine, using mostly off the shelf tools (every part of the stack, even the UI components, nice).

But wanting $60 for a game that took $60m+ and 4 years to make is "corporate greed" and sends people into passionate YouTube-video-style opposition. This despite needing to sell over a million copies to break even in a hit-driven business (how many b2b or even b2c businesses here do that?).

I would love to see people try to work with those rough economics (and no, it's not "reduce the price by half and sell 2x as much" when you want to make something AAA with a niche (apparently 1 million is niche) audience. This is why, when the price is not allowed to rise, the quality drops: it has to appeal to more people, lowest common denominator). Instead, they are making $300k a year tweaking load balancers for picture, chat, advertisement, and API connector apps, when they aren't trying to automate away artists with AI (games being the only decently paying job artists had).

It is no wonder there are so many "live service" games now. Publishers simply saw how much, how predictably, and how risk-free people pay into SaaS tech, and got inspired.


I would have no problem with that if you paid a sum and got the full game. However the situation we have today is that you pay $60 for what is essentially 10% of the actual game and is constantly milked for the rest of it.

The scenario you've presented is how this worked in the past, but today the industry has changed into something else entirely.


Yeah, I get the frustrations gamers have around buggy "we'll patch it later" releases. And sure, some level of things like lootboxes can become predatory (I don't believe they are in and of themselves: they are like crane games for toys), especially for kids, for whom dollars are like 10x-100x more expensive than for a working adult.

But DLC->IAP/ads->live service is tempting to de-risk. Sure, some behemoth publishers get greedy with it and shove it into games that have no business having those features. I just don't see those models in and of themselves as greedy.

To be fair to the other side of the argument, Hollywood has a similar problem and has kept prices flat, though big-budget things have to be de-risked in the form of predictably earning superhero movies (which can be used to fund riskier projects, but don't have to be). I just don't like games becoming too much like Hollywood, dependent on huge brands and celebrities to guarantee enough sales. The budgets and expected quality moat are already approaching it.


Okay, which of the top 10 best-selling games of 2022 are 10% of an actual game?

https://www.statista.com/statistics/1285658/top-ranked-video...


> Keeping up with inflation

> AAA video games have been $59.99 for like fifteen years

Be warned, the author/presenter of the following linked video is quite a character, but they do actually make sense:

https://www.youtube.com/watch?v=N7kaK2-725w (2020) deals with these BS excuses, the very same ones trotted out for years, debunked by quotes from top gaming execs, corporate financial filings, and exposés on the industry.

- $60 has for years just been the shell price with predatory monetization practices raking in more money than ever

- the same excuses have been used a few years ago to justify cramming "micro"-transactions down the collective throat of customers, nickel and diming them for the complete experience after having already paid for the shell price; now it's still being said that games' prices haven't increased in decades, while only the entry prices on the box haven't changed

- the audience size has only gotten larger

- distribution has largely gone digital, saving costs of physical distribution

- wages have been stagnant while inflation rose, even some employees of these very same games companies aren't making enough to eat in that very same company's cafeteria

- tax evasion by the games companies, through pretending their IP is registered in empty offices/basements the world over, means they gift themselves hundreds of millions of US taxpayers' money every year.


> AAA video games have been $59.99 for like fifteen years

SNES games were $50-60 and that was thirty years ago!


Pretty sure I paid $80 for some RPG or another (possibly Chrono Trigger) right after release back in the day.


RPGs were more expensive than other games due to a combination of high development cost, high cartridge cost (RPGs took a lot of ROM storage), and relatively small market (action games were more popular).


Factored in with translation costs (or god forbid an included manual) and it's crazy these things even made it to the US in the first place. No wonder NES/SNES RPGs are in such short supply.


Phantasy Star for the Sega Master System was $70 - in 1988. Adjusted for inflation that’s about $180.
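That inflation adjustment can be sketched with the standard CPI ratio; the CPI-U index values below are approximate assumptions (1988 annual average vs. early 2023):

```python
# Rough CPI adjustment for the $70 Phantasy Star price.
# Index values are approximate CPI-U figures, not official quotes.
CPI_1988 = 118.3
CPI_2023 = 299.2

price_1988 = 70
price_today = price_1988 * CPI_2023 / CPI_1988
print(round(price_today))   # ~177, in the ballpark of the quoted ~$180
```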


I know these companies keep posting crazy numbers and make wild pre-sale money, but I do wonder how many people are like myself who:

(1) play video games with friends most weeks

(2) buy new games regularly

(3) have not purchased a game over 40 CAD since I was in high school, with the average being closer to 20 CAD.

My friends and I just don't have the appetite to drop 80 CAD on new releases, and just play other games until the big releases go on sale, which usually happens relatively quickly.

I feel like after Cyberpunk and No Man's Sky and Fallout 76 that the "do not buy pre-sales" wisdom is quite well known and regarded. I do wonder if there is some chance that people will increasingly just wait-and-see when it comes to new releases.

Although that's probably just wishful thinking. I just know that I'm happy waiting for sales to play a year old game with all the patch fixes.


You’re not alone, but there’s a specific experience to participating in the launch buzz of a new AAA game, just like there is with a new blockbuster film.

Even if the buzz for the game or film turns sour, or you personally hate it, there's something to enjoy in being part of a big cultural event in your community. Each launch/opening is as much about the festival as it is about the product.

There will always be people willing to pay to attend that festival, and always companies willing to compete for the opportunity to host it.


The best recent example of this I have is the Elden Ring release. The first couple months of playing that alongside a global community of people parsing the story and discovering obscenely fleshed out secrets (to entire different maps no less!) was one of my favorite videogame experiences.

I totally understand the slow gamer philosophy though, I think it's important that there's people out there doing that. The slow gamer that picks up Elden Ring on sale in 5 years and runs it at 4k60 on equipment half the price of mine will have their own uniquely special experience, not the least because the game will have been changed through patches and the community's understanding of the game will be more fleshed out... And we'll all get to enjoy their excited first time play through posts on the subreddit or wherever, which will be a delight after the game has faded into memory over years.


I have 100+ games sitting in my library that were either free, or purchased for less than $20.

When you think of all the effort and man-hours that went into making them, and the number of hours of entertainment they'll provide, that pricing is insanely cheap. It's not even that I can't afford to pay more for games - I bought Zelda + Mario Kart at full Nintendo price - but with so many options available for less than $20, does anyone really ever need to?

I've been pricing my own (non-game) software product and I can't see any way to make money unless the sticker price is at least $100, or with cosmetic micro-transactions. I would never in a million years build a game atop MTX, but I'm more open to it for other types of software.

The games industry is just brutal.


Speak of the devil, leaks are suggesting Nintendo is pricing the new Zelda at $69.99 https://twitter.com/Wario64/status/1623119561846579200


It’s already listed on Australian retailer JB Hi-Fi for $79 AUD [1]. Funnily enough, they also sell Breath of the Wild for the same price.

Should I expect it to go up in price before release?

[1] https://www.jbhifi.com.au/products/nintendo-switch-the-legen...


AAA video games have been jumping on the microtransactions bandwagon. Raising the entry price may not make financial sense anymore.


Most AAA games don't actually include microtransactions. The ones that do are mostly free for that very reason.


Many AAA games have microtransactions. Look at FIFA - a 60 dollar game, and for FUT, more money on packs. Apex Legends is a free example of a AAA game with microtransactions. Overwatch initially had cost and dropped it. COD MW2 is another example.

In any case, that makes the argument even stronger: the fact that games are offered free shows that raising the price isn't as profitable as building games around microtransactions and simply making them free. The revenues are simply in another stratosphere.


Far Cry and Shadow of Mordor immediately come to mind as full priced games with microtransactions.


AAA games increased pretty much across the board to 70€£$ last year (adopted by most publishers, you can Google it since there's a lot of different articles for each publisher). But the reality is that they've been increasing in price for two decades now. People like to forget all the special editions that include all the removed content from the base game (some are so bad, you need a spreadsheet to know what is in what version of the game), all the DLC, microtransactions, expansion packs, season pass, lootbox, pre-order bonuses, subscriptions... (the list of monetization goes on). The reality is that games stopped being 60 a long time ago, the 60$€£ is what you pay for the base game.

And let's not even get into the fact that we're glorified beta testers these days, requiring an online connection to get the constant patches that make games playable.


Way more than fifteen years. They’ve been $59.99 at least since the nineties.


I remember super NES games being 450F (french francs) back in the 90s, which would be around ~100€ nowadays. Ain't no games selling for that much.

A PS2 cost 3000F, which would be 640€ nowadays, so a lot more than the PS5's 550€.


Space Quest III had this screenshot:

https://2.bp.blogspot.com/-6llg6z-mw9o/UfMX2SJk8XI/AAAAAAAAA...

From this story outline with spoilers (don't listen to this person, Astro Chicken was the best part):

https://playedbypanthro.blogspot.com/2013/07/space-quest-iii...

Caption: "$59.99?? in 1989?? I surely hope not!"

It was, though :(. There are some "I am rich" editions of games that sell for around $100 now but the base game is still $59.99 (and sometimes all the extras have little to no effect on the game). And sales are generally quicker and deeper now (games are often 50% off or more by the time the most obvious bugs are fixed) and wait a few years and they are often $5 or less.


Probably, market growth has sustained it. I do expect it will saturate at some point, but it's not clear to me that has happened yet. Though the growth of the market has been non-constant—probably sigmoid-ish, and it is certainly on the decline now.


According to the US BLS inflation calculator [1], $60 in 1990 is $140 in 2023. I'd love to see what a game company could do with a $140/unit price point. Then again, it would probably be equivalent to what we have now: a base game ($60) and several DLCs ($40 x 3).

[1] - https://www.bls.gov/data/inflation_calculator.htm
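The calculator's result can be approximated with the usual CPI ratio; the CPI-U index values below are approximate assumptions (January 1990 vs. January 2023):

```python
# Approximate the BLS inflation-calculator result cited above.
# Index values are rough CPI-U figures, not official quotes.
CPI_JAN_1990 = 127.4
CPI_JAN_2023 = 299.2

adjusted = 60 * CPI_JAN_2023 / CPI_JAN_1990
print(round(adjusted))   # ~141, matching the quoted "$60 in 1990 is $140 in 2023"
```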


Yeah, I hate price hikes as much as anybody, but the fact that gamers pitch a fit if anyone contemplates moving the magic $60 price point is not sustainable. In real terms, games have been getting 2-3% cheaper every year for like two decades, with a huge cut this year due to inflation. It has to move eventually.


Why? Games are easier to develop today than they’ve ever been. The only class of game that can push the $69.99 envelope are those making AAA games.

Games and movies are the two cheapest forms of entertainment. They’re not going to mess with the formula, especially since both these industries do really well in a depression


> Why? Games are easier to develop today than they’ve ever been. The only class of game that can push the $69.99 envelope are those making AAA games.

Nobody is selling indie games for $60/$69.99 (except maybe physical releases to cover manufacturing costs, and collectors edition type stuff).

The barrier for entry may be lower, but AAA game development costs are massively higher now than they were ~10+ years ago.


Indeed it may be a shrewd business decision to effectively cut prices. But no other industry is expected to cut prices due to economies of scale.


They’re cheaper to distribute but not necessarily cheaper to make at the highest levels.


Making games is easier, but they've grown in scope. Also, wouldn't books be cheaper entertainment?


I've paid 20 bucks for Minecraft, and got over 4000 hours of entertainment out of it. If there's a book with a similar cost-benefit ratio, I'm not aware of it (except for public-domain or open-source e-books obviously, can't beat free).


The dictionary is the Minecraft of the literary world. Buy one and knock yourself out!


D&D manuals?


For what it's worth, it's commonplace in Japan for new release domestic-made games to sell for upwards of 20,000 to 30,000 JPY (roughly $200 to $300 USD) depending on how much premium flair is tacked on, with even base games going upwards of 15,000 JPY (roughly $150 USD).

On the one hand they are expensive compared to standard western fare, but on the other they do look more representative of the kind of money game devs should be getting.


I think the reason video games have been able to stay at $60 for so long is that the industry has been growing. Even though the unit price stayed at $60, the number of customers has grown year after year, and software has its notoriously low marginal cost.


Where I live, a new AAA game costs 80 CHF/USD and sometimes even over 90 CHF/USD. I really wonder who actually pays those prices for games. You could buy a whole spare console after only 4–5 games.


It may cost more to make games, but people don't have more money to pay for games, so raising the price might end up a net loss. I bet there are a lot of people analyzing scenarios.


They were $59.99 (and more) in the late 1980’s. Of course the media they are on has gone to $0 but the cost to bring to market is much higher. Much larger audience today of course.


Does that include day-1 DLC?


To be fair to games today, things like Doom 2 would probably just be DLC now.


My Uncle is over the moon after hearing this news


Wish there were a way for customers to opt in to paying a higher price for a product or service, such that the employees directly tied to that product or service received the extra money. The only catch is that if the company had access to the data and the amounts paid, it would simply increase the price of the product or service accordingly.


No, we don't need even more tip culture. It's just a way for capital to avoid paying wages.


It’s the opposite. Customers avoid paying employees by outsourcing the work to companies. Customers, not companies, are the ultimate beneficiaries of their labor. Corporations are by definition designed to extract value from employees and customers; pretending otherwise is nothing more than wishful thinking.


> Customers, not companies, are the ultimate beneficiaries of their labor.

This is plain wrong. Capital is the one making a profit off of labor, not consumers. And capital frequently tries to extract further benefit at the expense of labor and consumers (like shifting wages to tips).


Short of consumers having no choice but to consume services and products via monopolies, it's obvious consumers control the means of profit, not the companies that provide them. Whether consumers (or employees), and not companies, are empowered to leverage that control is the subject of this thread. The assumption that companies must exist in capitalism is just that, an assumption.


There’s been a bigger push towards the whole indie/Kickstarter thing; several developers in the past year slipped out from under their publishers to take control and let fans directly fund their games.

Otherwise, that was a selling point of NFTs: having a digital supply chain that could trace assets back to a creator for compensation. Before the market imploded, many game companies were jumping in to try to rush out NFT projects to squeeze people.


Wasn't it only Ubisoft? The entire idea doesn't work, which most real game companies would've noticed. You can't just "add your items to a different game".


I don't think any NFTs actually did that, though? It was only promised as a potential feature, without anything built beyond the part where the NFT creator gets paid.


Purchase from worker-owned co-ops; then you wouldn't have to worry about where the money went.

https://en.wikipedia.org/wiki/Motion_Twin


I always tell people to buy Factorio directly from their site so they don't have to pay the 30% Steam tax.


The videogame market is absolutely astonishing right now. When I was a kid there was lots of discussion about whether there was even room for two major consoles on the market. The combined units moved by the SNES, Genesis/Mega Drive, and TurboGrafx-16 are fewer than what the Wii moved by itself (90m combined vs 101m for the Wii).

Here's the rundown today:

1. Nintendo:

A playing card company that became a toy company that now makes video games. They have both the most popular dedicated handheld on the market (the 3DS) and the most popular hybrid console (the Switch). Of the top-10 highest-selling consoles of all time, Nintendo alone has half (if you count the Game Boy and GBC as a single console). They've managed to have a top-selling system in every decade since the 1980s and have overcome misses like the Wii U and the GameCube without missing a beat.

Their current console (Switch) is fantastically underpowered for the generation, basically a cheap Android tablet. Though not yet the all-time most sold console, it's expected to overtake the PS2 before retiring as sales are still ridiculously strong despite competing against systems many times as powerful.

Between the 3DS, Switch, Games, and licensing, Nintendo's revenue for 2022 was ~$14 billion dollars, net at ~$4b

Nintendo has an e-store, but zero cloud gaming strategy.

2. Sony:

A classic Japanese electronics giant that makes every kind of consumer electronics imaginable. Entered the console market after being insulted by Nintendo during the 90s; now arguably the industry's second hegemon after Nintendo. Capable of producing incredibly exotic but powerful hardware: every home console Sony has put out, from the original PlayStation to the current PS5, is in the top-10 most sold. The PS2 is the all-time winner.

Sony's handhelds haven't fared as well: while the PlayStation Portable was a relative hit (and still popular among homebrew enthusiasts today), the follow-up Vita was a relative failure, prompting Sony to pull out of the portable market. Sony is the only major console maker to get into VR.

Sony proper revenued ~$81b; Sony Interactive (the division that does games) made $20.75b in revenue, netting $2.6b in 2021.

Sony has an e-store, and you can play a decent library of recent games over the cloud.

3. Microsoft:

Another absolute monster of a company. The history of the Xbox is fascinating (there's an interesting quasi-advertisement documentary on YouTube about it), but it's basically a rogue operation that's allowed to survive because it makes money. Totally against prevailing corporate DNA, the systems have performed very well, though not industry-leading, since launch. The Xbox 360 is the only system to crack the top-10, but the other systems have performed respectably.

Microsoft has focused on M&A to build up a gaming lineup, but their systems have been treated like an infectious disease in some markets (notably Japan), limiting overall performance. Severe missteps in the launch and marketing of the Xbox One probably sapped a not-insignificant percentage of their possible growth. It's safe to say that Microsoft's Xbox division focuses heavily on the North American market first, while the Japanese console makers treat that market as secondary.

Microsoft revenued $198.3b in 2022, Xbox made $15.56b of that. Microsoft does not seem to share net for the Xbox division.

Microsoft is all in on the cloud. all in.

4. Valve:

The massive dark horse that appears in the room when the power cuts out, covered in black smoke. A "pseudo" console platform company with Steam as their console. They focus mostly on being a channel/publisher for other games -- with various hardware and gaming software side businesses. They target existing PCs as the runtime engine, but try to enable users to access their games in various ways. Steam has a catalog of around 50,000 games.

Here's the real astonishing numbers, if the PS2 is #1 selling console of all time with 155 million units sold, and the Switch has moved 122 million units so far...Steam sees around 200 million monthly active users as of 2023. The PS5 has sold around 32 million units, Steam sees around 75-80 million daily active users per day.

Valve has recently entered the portable console space with their Steam Deck, which runs a custom Linux variant and a compatibility layer, capable of running AAA PC titles. It's estimated to have moved around a million units so far. Valve is also a major driver in the higher-end consumer VR market - one of the only true competitors to Meta.

Valve does this with fewer than 1200 employees; yearly revenue is estimated to be around $13-15b. It's unknown what their net is, but they are expected to be very profitable.

Despite being fundamentally a cloud-distributed platform, Valve does not have a cloud gaming offering where you can just sign in to some remote server with your account and start playing. Instead, they've relied on external parties like GeForce Now or Shadow to provide those services.

5. Mobile:

I won't spend time here other than to say that most people who are interested in console games consider this a different segment in the way that people interested in Movies think of TV and Radio as different segments. Mobile gaming is huge, can make massive money, but is very hit driven and home to a huge predatory "free to play" industry.

6. Meta

I'll spend even less time here. But basically Zuckerberg traded a dominant, highly profitable, but fading, $116b company for a company that focused on moving fewer platform units than the PlayStation Vita. Great consumer tech, some great games and experiences, but kind of a user mess.

Here's some things that are clear at this point in the generation:

- Nintendo could probably release a Switch II with considerably better specs at any time, but has no reason to as the Switch is selling like hotcakes, and making unreal profits for the company.

- Nintendo appears to make around 30% profit, the highest of the 3 major hardware makers (Sony about 10-13%, Microsoft is unknown, but their M&A rate is crazy). It's guessed that Valve makes more than any of the above.

- The supply chain is choking the high end systems from Sony and Microsoft, and this was before the pandemic. The PS5 was released in 2020 and is still hard to find on store shelves. If you can't move systems, you can't move games.

- Valve has had an incredible year. The Steam Deck is a hit in early gaming circles, largely delivers on promises, is a relatively open platform, and open to customization. Steam dominates its perceived market (game launchers), but also may secretly be one of the largest gaming platforms on the planet. Cheap (<$300) PCs are appearing that can run an almost impossibly large library of games and Valve has been an incredible steward of individual gamers' purchased libraries.

- Nintendo has bizarrely embraced the indie scene, something only Valve has really done. It's a perfect fit for the lower-powered Switch.

- Mobile gaming didn't eat console games -- in fact some pundits are saying the mobile gaming industry has "collapsed". And gaming is now a much larger industry than movies and TV combined.

- Meta isn't even in the top-20 of platform units sold.

Bottom line, Nintendo is one of the smaller players in the industry, but makes lots of money, the giants are struggling to fill channel demand with their platform, and Valve may eventually just overtake them all in the end. Basically Nintendo employees deserve some extra spending money because they're among the best on the planet.


Notice to Google, this is how it's done.


No, the Japanese employees make almost nothing. Even 10% raise doesn't get them close to a base Google salary.


Google (Larry and Sergey) nuked morale, half-decimated the workforce by choice, screwed over a ton of projects, and saved negative money.

Google could have pulled some big-search energy, given everyone a raise, and not noticed. They lost 5% of the workforce, and the remaining workforce is easily working 10% less (or something). Easily 15% down on throughput since Jan 19th.

Regardless of how little Nintendo employees may make, giving a 10% raise while other tech titans are punching their employees in the shorts is a good move.


I don't disagree, but I wonder why everyone is fixated on Google this much? For the past few weeks, almost every day has brought news of a big tech company letting go of 10+% of its workforce. Most recently Zoom fired 15%.

My theory is that people (me included) still have this idealised view of Google as the "don't be evil" company that is somehow better morally than the others. So Google firing engineers is a shock, while Amazon firing even more people is just Tuesday. Curious what you think.


Why did you single out Google?


For a good moment I thought that it was "gamestop" not "gamespot".


well, somebody must be making money at nintendo, and i hope it is the devs. the price of games on the switch is just silly.


You get what you pay for at least with the first party titles. The game is in a playable state from day one, they don't nickel and dime players and their games are actually fun.


nintendon't are only doing this because they have to. not because they want to.


I am at a loss about the tech landscape right now. A year ago, everyone was talking about how hot the market was and how you could easily switch jobs for a 40% raise. Now all the megacorps are doing layoffs, and here comes Nintendo with the opposite approach. Why are the business decisions between Nintendo and the tech giants so different? Would anyone have been surprised to see "Nintendo will lay off 10% of its workers"?


The explanation is in the original article (not Gamespot):

> The hefty pay hike comes amid calls by Prime Minister Fumio Kishida for Japanese companies to pay workers more as inflation takes hold in an economy used to years of deflation and stagnant wages, and as Japan prepares for its annual spring round of labour negotiations.

> "It's important for our long-term growth to secure our workforce," Nintendo President Shuntaro Furukawa told an earnings briefing.

> For companies that can afford to do so, higher salaries may also help them attract talent as a falling birth rate and low immigration leave Japan with serious labour shortages.

https://www.reuters.com/technology/nintendo-trims-annual-pro...

Basically, hiring is harder in Japan, so companies are adjusting in order to attract talent. In the US, companies made the mistake of hiring too much and don't suffer from the same problem, so a different solution is needed.


> calls by Prime Minister Fumio Kishida for Japanese companies to pay workers more as inflation takes hold

So the prime minister wants to fight a phenomenon caused (in large part) by a wage price spiral by calling for accelerating the wage price spiral... we're in for a fascinating macro-economic/political landscape these next couple years...


Japan’s current inflated prices are more visible on gas and electricity which increased by 25~30% in just a few months, and are planned to increase again in the summer.

The PM is basically asking companies to foot the bill for energy costs, which have little to do with wages.


Food has been increasing too. It’s actually more visible to me that my ¥600 cereal is now ¥750.


Imported goods have been directly affected by the usd/eur/jpy exchange rate, so yes they rose too. Exchange rate has been slightly going down, so there’s a modicum of hope in this regard.


If you’re not increasing the money supply, do you still get inflation or is this just shifting money from owners to workers?


So, a 4% increase in average prices, caused by soaring gas and commodity prices, means employees caused the inflation? Or maybe what he wants is for the cost to be shared between companies and employees.


The inflation Japan is experiencing has not been caused in large part by a wage price spiral. It has largely been caused by US and Japanese monetary policy being out of sync.


Japan has desperately wanted inflation for decades. It's not obvious this is enough to get people to spend more and save less.


> (in large part)

Cost of raw material going up (largely due to moving resource extraction away from slave labor) is a much bigger driver of inflation than wages. If 10% of a product's price is to pay for the overhead of personnel costs, a 10% wage increase only increases the price of the product 1%.
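The pass-through arithmetic above can be sketched quickly (the 10% labor share is the commenter's hypothetical, not a real figure):

```python
# Hypothetical cost breakdown: labor is 10% of the product's price.
labor_share = 0.10
wage_increase = 0.10  # a 10% across-the-board raise

# Only the labor portion of the price grows with wages,
# so the overall price rises by labor_share * wage_increase.
price_increase = labor_share * wage_increase
print(f"Price rises by {price_increase:.1%}")  # 1.0%
```

The general point: the smaller labor's share of a product's cost, the weaker the link between wage growth and price growth.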


american tech leaders are just herding, basically responding to investor sentiment which regards the tight employment market as the cause of inflation, which is killing the real value of their wealth in equities. is the tight labor market the cause of the current inflation? maybe. will layoffs curb inflation and boost equity prices? maybe. maybe not! but the fever has taken hold and if you are in senior leadership of a big tech company you are looking around and wondering why aren't you laying people off too. japanese companies just don't have the same relationships, the same board membership cross-pollinations, the same social circles etc. it's easier to recognize that it's better for the health of your company to have happy employees with loyalty and motivation than distrusting workers who are all wondering when the axe is coming for them


> Would anyone have been surprised to see "Nintendo will lay of 10% of its workers" ?

It would be highly illegal in Japan. Nintendo has boatloads of money, so they can’t claim downsizing due to business pressures.


Could it be the same mentality as a copycat crime?

When a corporate executive sees a competitor decimate themselves without facing immediate consequences, they start to think that they could probably get away with it, too.


We've actually reached the tech singularity:

- ux is perfect,

- usability is perfect,

- structure is perfect,

- performance is amazing,

- legal compliance and regulations are to every human's benefit,

- the devices do everything imaginable, through first-party accessories

- ... well

- ... even for corporate use-cases

- ... dealing with confidential, private, privileged information

- there's a smorgasbord of perfect nearly-identical services to further support your usage of the device,

- no one is ever failed catastrophically by the software as far as we know,

Welcome to the everything rally, what you have been so graciously given access to in your hand there is all you'll ever need, in perpetuity.


I think you have to look at the converse for why they're stable. Even when things are going really well, Nintendo grows very slowly. They don't take large risks outside their core business. Sounds like they believe they're in a position of having the right employees and working on right things, but at the wrong time for market conditions. A lot of the companies doing layoffs have large risky divisions that they don't believe in.


I believe labor costs are generally much lower in Japan.


Yup and the PM of Japan is asking companies to pay more. Nintendo is answering the call.

> As inflation continues to hit the global market, Japan Prime Minister Fumio Kishida urged companies to pay more to offset the rising cost of living. Nintendo President Shuntaro Furukawa said, "It's important for our long-term growth to secure our workforce."


> Yup and the PM of Japan is asking companies to pay more. Nintendo is answering the call

It's never as simple as that. Toeing the party's line to help re-elect some politician isn't what has made Nintendo a lasting enterprise spanning 133 years of existence.


Like, by a huge factor, at least according to Patio11. Think like 2-6x lower. A 10% raise on that is relatively significant, but does not get you close to US dev compensation.


US, Lead Dev position, I just received a 6.5% pay increase this last month. Folks massively overhired last year; they're mostly still doing fine, however, and there's a correction going around for that. It may end up that things get a bit worse this year, necessitating what I would consider a _real_ round of layoffs, but we're not quite there yet (Zoom and the like being the exception, given their reliance on remote/at-home focused clients).


I think there's a confusing narrative here, where people think layoffs is the same as not hiring or paying less. The market is still hot for roles that are competitive. It's still easy for many good engineers to switch jobs for a 40% pay increase, and in ML people are making more than they ever have at large US tech companies.


This tracks with my experience (WARNING: Anecdotal evidence incoming). I'm currently aggressively hiring for development roles (in particular around data). Despite the layoffs our candidate pipelines are only modestly bigger and the quality level is similar to before. Layoffs haven't really moved the market down very much except for some pretty major outliers where someone was being paid pretty far above market.


> It's still easy for many good engineers to switch jobs for a 40% pay increase, and in ML people are making more than they ever have at large US tech companies.

Uh... I'm not sure that's the case here; otherwise those folks would have been reassigned instead of laid off.


Layoffs have to be across the board / department, otherwise that would open the company up to even more lawsuits. They can’t special-case ML engineers out. This does not mean the company does not need or is aware it needs ML engineers elsewhere in the company.


> They can’t special-case ML engineers out. This does not mean the company does not need or is aware it needs ML engineers elsewhere in the company.

In the case of Meta and Amazon, some of the layoff targets were shutting down a whole Org/Div/Product Group. In those cases, shouldn't they be able to re-assign a few folks to another team?

But like I said, they can move these ML engineers if they actually, truly need them.


Reassigning folks would defeat the point of having a layoff, which is to be seen as having a layoff (taking difficult actions in the light of new macroeconomic conditions, etc, etc).


There’s a whole high end “tech market” that I think is actually very small relative to the rest of the tech market.


> I am at loss towards the tech-landscape right now.

There isn’t a ‘tech’ landscape. Tech is just an overly simplistic view of the world that gets clicks. Nintendo and Google are worlds apart.


The layoffs in the US market are collusion to bring down salaries plus copycat effects from smaller companies. The hedge funds are driving this because they're greedy fucks.


Higher worker pay, lower profits, and no mass layoffs. The Welch acolytes are yelling in horror in NYC.


Average annual salary across all full-time roles is $75k (about $80k with bonuses). Entry-level roles pay $1,600-$2,000/month. Welch acolytes calm down quickly, then seethe with envy. https://www.nintendo.co.jp/jobs/recruit/requirements/index.h...


- Consumer Prices in San Francisco, CA are 37.4% higher than in Tokyo (without rent)

- Consumer Prices Including Rent in San Francisco, CA are 79.3% higher than in Tokyo

- Rent Prices in San Francisco, CA are 178.4% higher than in Tokyo

- Average pay for a Sr. Software Developer is around $58k in Tokyo

$75k would probably be comfortable to live on.

https://www.numbeo.com/cost-of-living/compare_cities.jsp?cou...

https://www.payscale.com/research/JP/Location=Tokyo/Salary
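As a rough sketch of what those index numbers imply (back-of-envelope only, using the 79.3% figure quoted above; real cost-of-living comparisons are messier than a single multiplier):

```python
# Tokyo salary and the quoted cost-of-living gap vs. San Francisco.
tokyo_salary = 75_000          # average Nintendo salary with bonuses, USD
sf_premium_incl_rent = 0.793   # SF prices incl. rent are 79.3% higher

# What an SF salary would need to be to match Tokyo purchasing power.
sf_equivalent = tokyo_salary * (1 + sf_premium_incl_rent)
print(f"${sf_equivalent:,.0f}")  # $134,475
```

By that crude adjustment, $75k in Tokyo buys roughly what a $134k salary buys in SF, which supports the "probably comfortable" read.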


Nintendo is in Kyoto (though they have a few Tokyo satellites). And many employees commute from neighbouring towns or prefectures, so it actually strengthens your point. Expenses will be massively cheaper.


$75k might. $25k as an entry-level with a doctorate wouldn't.


Game devs are the worst paid devs of any industry.

Japanese salarymen are worse paid than FTEs in almost any highly developed economy.

I wouldn't really expect that after this raise, Nintendo employees end up being paid particularly well.


> Game devs are the worst paid devs of any industry.

Not really. If you work for a major developer at a US/Canadian company you're OK; it's not FAANG level, but it's higher than your regular web dev.


Any data on this? In the very beginning of my dev career I made more than a game dev who had been in the industry for years. He was clearly a more experienced and wise developer than me but stubbornly only wanted to work in game dev. He described a lot of really bad work conditions, too.


For example, an SE2 at EA, which is a dev with ~3-8 years of experience: https://www.glassdoor.com/Salary/Electronic-Arts-Software-En...

Median is at $191k.


>Japanese salarymen are worse paid than FTEs in almost any highly developed economy.

I find this obsession with instantaneous salary to be odd. In my mind, you're best off working toward total lifetime earnings, optimizing for stability. Sure it's nice to be making six figures at some startup. But that means nothing over a 10 year period interspersed with layoffs and periods of unemployment. As an American, I'd take the salaryman deal any day of the week over making a bit more and waking up every single day wondering if I'll be fired.


You mean like that employee Justin Moore who was fired by Google after 16 years through an automated e-mail? [0] Or like that guy on Reddit who lost 20 years of life savings in just 5 months because his spouse got cancer? [1] Also, current generations saw the retirement age go from 60 in 1995 to 67, and if Republicans get their way it will be 70 or higher, less than a decade shy of the average death age. Please, people would optimize for stability if it looked like a feasible goal, but at this point it's just a fantasy, like Santa Claus. I'm glad that it has worked this far for you, but don't assume other people could be so lucky.

[0] https://finance.yahoo.com/news/engineer-laid-off-over-16-104...

[1] https://i.imgur.com/GbNcdKc.png


> You mean like that employee Justin Moore that was fired by Google after 16 years throuhg an automated e-mail? [0]

He was laid off with six months severance and I expect they'll ask him to come back after that's over.


What is instantaneous salary? Seems like a contradiction. Saying that making a decent wage "means nothing" in the long run is just plain wrong. Higher earnings, especially early in one's career, have a massive financial impact down the road. White collar jobs are actually really stable, and during recessions you see a lot of blue collar work tank (such as jobs in the service industry). Obviously this could change, and we are currently going through some of that, but if you've been waking up every morning for the past 10 years wondering if you'll be fired, that seems a bit irrational (unless you're actively doing something that will get you fired).

You romanticize the salaryman title, but I am not sure you know what the actual lifestyle is... for most people in developed countries outside of Japan, working 12-14+ hours a day for low pay is not a good "deal".


1. I think you're overestimating the amount of time the average US worker will spend laid off.

2. If you're worried about being laid off the best thing you can do is build an emergency savings account. It seems perfectly reasonable to have 3-6 months of expenses in a "break in case of emergency" type of account. What is your plan if you are fired, or if you car breaks down, or if you have unexpected medical expenses? Savings keeps you safe.

3. The total lifetime earnings of a US office worker are higher than those of a Japanese office worker for the same type of work, in most professions. We are optimizing for lifetime earnings.

4. The term "salaryman" is associated with long work hours and an unhealthy lifestyle, with even worse work-life balance than is offered in the US.


For those talking about cartridge games being 60-70 bucks back in the '90s, don't forget that half the cost was simply manufacturing: literally printing the game cartridges and the chips that went into them accounted for half the price tag. Let's not forget royalty fees and publisher costs on top of that. It's much cheaper nowadays, and even cheaper as more and more games are sold digitally; discs cost pennies on the dollar to print.


Nintendo only sells one device now, the switch, and it still uses cartridges. Granted the carts are literally just a flash chip.


Some say XtraROM is used for Switch game cartridges, but what is this chip? PROM or just high endurance write-once Flash?


For the Switch they are using non-volatile flash storage. Even though "ROM" is in the name, it's still flash-based storage (dubbed HybridFlash by Macronix). I read somewhere that their stated lifetime is ~20 years, which means they won't last nearly as long as their NES/SNES forebears.


Wouldn't be a HN thread about pay increases without someone saying it's not enough.


I'm not going to cite the 'sneer' guideline at you because I don't think this was a sneer, but it was definitely a generic tangent, and that's also in the site guidelines: https://news.ycombinator.com/newsguidelines.html. Please try to avoid those—it makes discussions more predictable and then nastier.

We detached this subthread from https://news.ycombinator.com/item?id=34701705.


That 10% sure looks nice, but like everything, it needs context.

If I'm not mistaken, Nintendo is currently making nearly two million dollars per employee. In an industry littered with claims of overwork and burnout, no less. And in Japan, where white collar workers still strive to stay in the same company until they retire.


[flagged]


"Please don't sneer, including at the rest of the community." It's reliably a marker of bad comments and worse threads, and your comment would have been fine with just the second bit.

https://news.ycombinator.com/newsguidelines.html


There is no amount Nintendo could have raised salaries that would have avoided the GP comment, is the point.

>> In any event, based on increases in productivity since the 80s all workers should be making at least double their salary.

This is happening. It's just happening to the people who are the most productive. There is no reason productivity gains would naturally be equally distributed amongst all workers, nor the wage gains.

If you are a developer, you are partially responsible for this inequality in both pay and productivity. They tend to go hand in hand. Now, if you want to argue that software developers are underpaid, I certainly agree. But we're also part of the inequality problem, and pretending otherwise would be silly.


> This is happening. It's just happening to the people who are the most productive.

Oftentimes, not even that - just look at CEO compensation explosion over the last decade [1]. The real money ends up at shareholders and uppermost echelons of management, not at any level of the worker class.

> There is no reason productivity gains would naturally be equally distributed amongst all workers, nor the wage gains.

There is a reason, it's called unions - and it's no secret that the explosion of C-level compensation came together with union-busting laws and the general decline of unions.

[1] https://www.statista.com/statistics/261463/ceo-to-worker-com...


> There is no amount Nintendo could have raised salaries that would have avoided the GP comment, is the point.

Correct.

Which is exactly as it should be, as there also is no amount of profit increase that would satisfy company owners.


For an obvious counterexample, they could have raised everyone's salary by a million bucks, or 10 million.

Productivity remains unrelated to pay; labour power is the real decider. If you're irreplaceable, you can demand high wages. If somebody is willing to do your job for a thousandth the price, you're screwed, even if they suck.


This statement:

>> Productivity remains unrelated to pay, labour power is the real decider.

Is in stark contrast to your next one:

>> If you're irreplaceable, you can demand high wages.

Extremely productive people carry high replacement costs.

Very productive individuals have labor power. Whether they choose to band together and use it like a guild or a union is a separate matter entirely.


Being irreplaceable is very different from high productivity.

Being more productive than your peers barely gets you anything in most jobs.


No, it is productivity. You're talking about how hard it is to measure productivity in some jobs. Not whether or not productivity is valuable.


Being replaceable is mostly about what special things you can do.

If you do lots of quality work, but most potential hires can do the same thing slower, then you are productive but easy to replace. (A reasonable amount slower, not a ridiculous factor.)


Can you give an example of being irrepleacable?


At the extreme end, being the only one that knows how some machine works, or how some code works. Or this guy: https://www.reddit.com/r/BestofRedditorUpdates/comments/nrnf...

At the less extreme end it's having a bunch of internal knowledge that isn't documented and is important to keeping things going. Or having business skills that whoever is in charge of hiring doesn't know how to hire for. Or knowing a programming language or framework that's fussy to learn and only 2% of your normal hiring pool knows.

Very different from just being fast.


You are describing productivity in each case. I think you could reduce each example you give to speed, as things will slow down until a replacement is eventually found, but it's simpler to call it productivity.


Let me put it this way.

The guy I linked has a very niche skill. And he talked about bringing in a Bulgarian with a similar but different very niche skill.

Neither one of those is more productive than the other.

But neither one can be replaced with the other.

That's an irreplaceability advantage, without a productivity advantage.

And you might find that a good bricklayer is actually more productive, but he can be replaced pretty easily with a different bricklayer.

Those niche guys would be in extreme demand even if they slowed down 3x, because there is no way to replace them.

Here's a hypothetical: You're in charge of purchasing supplies for making a machine. Then 9 other people build it. The whole team is pretty productive, but you think it's fair to say you do 10% of the work.

Then it turns out some critical part can only be bought by you. You've become irreplaceable, but it's hard to argue that your productivity has changed.

Then you decide to demand an assistant since you're irreplaceable. You start pushing more and more of your purchasing work onto your assistant, until you're only handling a couple products and you come in one day a week and leave at lunch. You're still irreplaceable, but your productivity has fallen off a cliff.


This is simply not true in technical fields, specifically software development. The most productive developers have high labor mobility and high rates of pay. If you don't think so, try employing the industry's most productive developers and see how cheap (or not) it is.


If you're 50% more productive than someone else, that's a lot of productivity, but it's very hard to measure in a software development situation.

I'm not really talking about the top handful of people.


Productivity isn't unrelated to pay. Your pay ceiling is your value to the company - your productivity. Your pay floor is max(your salary, what a competing company will offer you).
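The commenter's floor/ceiling model can be written out directly (all numbers below are hypothetical, purely for illustration):

```python
def pay_band(value_to_company, current_salary, best_outside_offer):
    """Ceiling = your value to the company; floor = your best alternative."""
    ceiling = value_to_company
    floor = max(current_salary, best_outside_offer)
    return floor, ceiling

# Hypothetical: worth $300k to the firm, paid $180k, with a $210k offer in hand.
print(pay_band(300_000, 180_000, 210_000))  # (210000, 300000)
```

In this model your actual salary lands somewhere in the band, and the gap between floor and ceiling is exactly where bargaining power (individual or collective) operates.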


a sibling comment mentioned that Ninty is making 500k USD per head in profit. Salary postings sound like engineers at 10 years of experience are not making more than 100k USD; entry at 50k. This is like... not super lowball in Japan, but a hell of a lot of companies with a lot less profit pay more.

We aren't talking about companies as a whole, we're looking at a specific company. The mental gymnastics of "well more productive employees are getting paid more" with salaries of ICs not being _that_ much higher... I dunno.


> This is happening. It's just happening to the people who are the most productive

Then why is the increased average productivity not reflected in increased average income?


Because they're eliminating the jobs of the least productive people and not paying the most productive people equivalent to the productivity gains, i.e. software developers and the technical folk. But those people are still getting paid heaps, so there aren't a lot of complaints.


> not paying the most productive people equivalent to the productivity gains, i.e. software developers and the technical folk.

So now you are making hawthornio's point for him! Software developers are underpaid.


If you read up a few more comments, you'll note that I'm the one that made that point.

> If you are a developer, you are partially responsible for this inequality in both pay and productivity. They tend to go hand in hand. Now, if you want to argue that software developers are underpaid, I certainly agree. But we're also part of the inequality problem, and pretending otherwise would be silly.


How does being underpaid make engineers part of the inequality problem.

You are talking about 2 different things.

1. Productivity benefits not being passed down to the productive people

2. Inequality

I am not talking about 2, only 1.


If I replace one staff member out of two with a machine that does half their work for them, that doesn't mean the other staff member gets a doubled salary.

This is why we don't say a software engineer writing an website should get paid all the money that it would've cost to build and run a data centre for the website, but didn't have to because you used the cloud.


> If I replace one staff member out of two with a machine that does half their work for them, that doesn't mean the other staff member gets a doubled salary.

So who keeps the surplus amount? It has to go somewhere, why not to the group that increased their productivity?


If the group bought the machine out of their own money then they would automatically keep the surplus in the form of less work. If the company bought it, then the company gets the saving, and it can choose what to do with that saving.


Basically, capital with zero effort has the right to compound. Labor does not.

The idle rich living off index funds deserve it.


> In any event, based on increases in productivity since the 80s all workers should be making at least double their salary.

I don't understand - why should this be?


Of course, the benefits of increased productivity should go to the shareholders, not the workers who increased their productivity.

It is the shareholders who are responsible for keeping the economic engine going through their smart trading and index funds.


Try to picture long ago: picture a man with a scythe, picture a millstone, picture a baker hammering out bread by hand, imagine the effort involved....

...picture an enormous John Deere tractor or a Claas Lexion, picture cargo trains, road trains and cargo ships, our engineered roads and waterways, picture our giant factories cranking out bread and whatnot...

Then we also did a complex point system to divide and allocate resources, the fruits of innovation and the fruits of labor. Ideally those who contribute the most valuable things get the most points. Create organizations, take risks, invest where you think growth will occur.

But fixing exploits is poorly rewarded compared to exploiting them. People continue to find ways to be elaborately rewarded for great (and not so great) accomplishments that happened generations ago.

One could argue there are two industries. One is an industry of miraculous production, progress, collective happiness and quality of life; squeezing wine out of a rock is not easy, but we are apparently doing it in many ways. The other industry is also looking to improve quality of life, but only for itself, and it does so at the expense of others. It's not a black and white division; many of us play a role in both. Picture people at a discount store excited to buy the fruits of slave labor, but also employers collectively driving down salaries.

The systemic problem we run into is that in order for further progress to take place (and I do mean durable progress in quality of life) people have to be able to afford it first. Enough of the resources and the other fruits have to be allocated to average Joe so that we can build a new and improved iphone, better cars, new and improved hospitals, better tractors, boats, factories etc etc On top of the bill for maintaining infrastructure, roads, houses, dykes, etc

You can see things went wrong in food subsidies. Food production is tens of thousands of times more efficient than before but hard working Joe needs subsidies to buy food(!?)

Take the language models, lots of people are very excited about the development but only few can spend [say] 100 per month just to play with it. Making a product is still possible but it will have to be a watered down shadow of what it could have been. Finding investors is still possible but without the river of easy money it is going to be much less obvious to bet on. No doubt the technology is going to do miraculous things in countless sectors and [like before] replace millions of jobs. We will have to find other things to do for them which isn't hard really, there is plenty to do.

But do you think, if the pool of resources to be divided by the system grows by [in this case] an amount that used to take millions of hands... do you think this should also mean the amount of resources allocated to working people should dramatically decline? Like a game of musical chairs?

The opportunity to cut salaries is certainly there every time. I could even picture an end game where we don't need people anymore. I mean, what do you and I have to contribute if our hands and our brains are too slow and too clumsy?

I like to think we've achieved the current state of the art in order to improve life for humans. If that is what we want, we should design the system to enforce that idea. It seems kind of important for you to agree :P


> The opportunity to cut salaries is certainly there every time. I could even picture an end game where we don't need people anymore.

And this happens. We no longer ship ice from the Arctic for cooling; that industry has zero people in it. We do most of our communication via electronic means instead of people carrying letters. We do most of our agriculture with machines, which is why, for example, 30% of Europe's workforce is no longer agricultural labourers.

In each case we do things better, they cost less, and we move on to the next problem, and jobs open up there.

> Food production is tens of thousands of times more efficient than before

You still need to distribute and sell food, and lowering those costs has run into wage floors and fuel price floors. Maybe one day you could send food directly to people via drones, and then you could scale it all better, but that's the issue.


> In each case we do things better, they cost less, and we move on to the next problem and jobs open up there.

The tricky part is the mechanism by which we choose the next problem.

If everything that sustains people gets cheaper, we can lower all the salaries. Working people can buy more than before but get a smaller slice of the total pie. All things combined, there is more stuff to divide, which allows us to create more wealth elsewhere.

The pie as a whole determines what should be the next problem to attack.

Imagine for laughs a utopia where no one works, and we all get a nice house, a car, free food, energy/fuel, a phone, a computer, a TV, nice clothes, and so on — everything you want, plus an allowance of 10 USD per week.

Whatever problem the market attacks next, there is no incentive to do something useful for humans.


It's easy to lick boots when the system works well for you, which is the case for most of the Gen X / millennial tech crowd heavily represented here on HN.

It was a big win once I jumped into tech. But when I was bartending, America seemed like a pretty rough place.


People aren't paid for their productivity.


You hit the nail on the head about the boot-licking neoliberals here.

Watching people rush to assume that every laid off employee is simply "unproductive" to make sense of a big layoff is... well, it's something.

People rush to scorn workers because they themselves didn't get the axe this time, or last time, not realizing it will eventually be them — as if their superpower of valuable work will clearly protect them forever, right?

It's so... weird, I don't get it. It's like they're so convinced the system loves them and will take care of them that they'll defend it! But it won't remember them when it's their time on the block.


Decades of CNN and Fox news will do that to the brain.


I thought raising wages was the opposite of what you want to do to fight inflation?


It's just another basic economic effect of having too much money supply in the system. Governments printing too much money too fast is the core reason inflation happens.


Raising wages doesn't increase inflation; raising prices to customers does.


Raising wages _can_ increase inflation, because the workers will use the extra money to buy more goods, which might be supply-limited.

But Japan's inflation problem is that they don't have enough of it, so that's fine…


Right, but limited supply doesn't have to mean raising prices. That decision is made by management and business owners. It's not an automatic outcome, but a deliberate one.


And what about the open source projects (that they use for free)? https://www.nintendo.co.jp/support/oss/


They will also get 10% more.

