Thoughts about the difference between European and American H/W and S/W design (liam-on-linux.livejournal.com)
109 points by lproven on July 29, 2020 | hide | past | favorite | 157 comments



This article has a huge cultural blind spot. What about Asia? Probably most of the hardware in your phone was designed in Asia. Arm, originally from the UK, is now Japanese-owned. ATI is Canadian, and Canada doesn't get a mention either.

When it comes to programming languages, C++ and C# were both created by Danes, although they were working in the US at the time. Presumably because that's where the money was. Does that mean the US gets to claim these languages?

Personally, I think it just invalidates the whole argument of what is American and what is European. Creation of modern tech is a global effort. If you look inside any modern laptop or smartphone, you'll find that most of it was manufactured in Asia, that the software and hardware design was split across many different countries and continents, and that the actual people doing the work are even more diverse.


Blogpost author/submitter here.

It's a fair point. It is not a blind spot -- I did think about it, but I don't think I have ever owned an Asian-designed computer or used Asian-designed software. Ditto African or Australian, FWIW.

I once owned a 2nd-hand Sony Ericsson P910, an early smartphone with the UIQ GUI -- but this was a Symbian device with a Swedish-designed UI. I'm honestly not sure how much input Sony had. And I have a Playstation 2. My tablet is Chinese, by a Chinese company (Chuwi), but it runs an American OS: Android. These are about as close as I can claim, sadly, so I am really not fit to comment or judge.

Whereas most of the world's electronics are partly or totally manufactured in Asia, it's mostly by or for American- or European-owned companies.

I am aware of many fascinating Japanese computers and similar devices, from the FM Towns to the Pomera e-ink portable word processors, to the MSX2 and MSX Turbo R computers, which interest me greatly and which I'd like to have -- but I don't.

The Julia programming language (AIUI, an Indian project) looks fascinating, but I am a poor programmer and old enough to find learning new languages extremely hard -- not something I would do as a leisure activity. So I have never tried it myself.

There is no lack of creativity there, certainly. There is a huge amount. It is just that little of it seems to directly influence the world market elsewhere.

Why that is the case is a fascinating question that I am ill-equipped to address.


> Whereas most of the world's electronics are partly or totally manufactured in Asia, it's mostly by or for American- or European-owned companies.

Do you have a source for this claim? I suspect it's incorrect, especially once you stop focusing quite so strongly on CPUs and programming languages and start considering everything that goes into a computer or mobile device such as RAM and networking, not to mention displays, which are just as important as the CPU.

> It is just that little of it seems to directly influence the world market elsewhere.

In the country where you live, maybe. Aren't the Chinese, Japanese, Indian, Korean, and so on, markets part of the world market?

Maybe I have a different perspective on things since I live in Vietnam. But your article comes across as kind of myopic to me.


By all means please do educate me on what I am not seeing.

I once saw a British OS, Symbian, used around the world. I saw a Finnish company's phones used around the world. A little British computer maker designed what's still the most popular CPU architecture, ARM, but the company is long dead.

I understand many of the world's hard disks come from Thailand and Malaysia -- but from factories owned and run by Western Digital, Seagate, etc.

Yes, every country is part of the world market, but the salient point is who manages to break out of their own national market.

I am trying to think of Asian companies who make and sell computers or computer-related tech internationally. Samsung, Toshiba, Hitachi... But they run American OSes and use American designs.

Sony and Nintendo don't, but then they don't make CPUs or GPUs, do they? Sony's PCs run Windows, like the Japanese laptop vendors' do. When Sony made PDAs, they ran PalmOS; when Sanyo did, they ran Windows CE.

American dominance was not so universal just 25 years ago. Can we identify what happened?

Is this all "worse is better"?


> I understand many of the world's hard disks come from Thailand and Malaysia -- but from factories owned and run by Western Digital, Seagate, etc.

And the largest manufacturer of screens, ram, and SSDs is Samsung. When it comes to smartphones, again, Samsung is top, then Huawei, and Apple is third. Laptops? It's Lenovo, a Hong Kong company. HP is second. Game consoles? Sony and Nintendo are top there, MS is third.

(Edit: and CPUs, top is Arm, Japanese.)

Saying these companies don't impact the world market is just plain wrong.


I think his point is that they're all clones, or running an American OS, or selling excellent-quality commodities (in the sense that, as a designer or engineer, I don't care much about the part beyond its specs).

Lenovo -> IBM

Samsung + Huawei -> Google

SSD, screens, ram -> commodity

You might balk at SSDs being mere commodities (or screens for that matter). Surely Samsung SSDs and screens are better! Sure, and as a designer I might care. But I probably wouldn't, past the promised MTBF, viewing angle and contrast ratio.

Put the SSD + display + CPU together and you get, for better or worse,[1] a largely American computing device. You don't get a BBC Micro or an Acorn. Or a Setun.

Nintendo is the only one who obviously stands out as different, making a unique computing device (albeit with an American CPU). Although even the N64 was largely an American design (SGI).

[1] I think worse, in the sense that everything is a Unix monoculture.


If all those things are just commodities, then surely the OS, CPU, and really every single low-level aspect of a computer could be considered a commodity as well? Nobody's buying a PS4 because it runs a FreeBSD-derived OS, nor are people buying the Nintendo Switch because of its ARM architecture or because it has a Tegra X1 from Nvidia inside.

Components like the CPU and OS might be designed in America, but it's the higher-level software development and hardware design that takes these largely off-the-shelf parts and assembles them into an actual product, and in the case of Nintendo that happens in Kyoto, not Silicon Valley. Is that really any different from, say, assembling a piece of furniture out of materials and components sourced from around the world? Without the product design and assembly, all you'll have is a collection of wood and nails lying about.

Same with the N64: it might have been an SGI-designed system at its core, but it was Nintendo that developed the higher-level industrial design of the console, the controller, the choice of carts over discs, some of the lower-level OS functions, and of course the user-facing software like Mario 64, etc. All the things that actually make the N64 the N64. SGI didn't decide to make a games console all on their own.


> You might balk at SSDs being mere commodities

Why would I balk at that? SSDs, screens, RAM, CPUs, GPUs, laptops, and phones are all commodities. I don't care about any of them past their specs and build quality.

> Samsung + Huawei -> Google

Oh come on. That's some serious reaching to prove your point right there.


Really? I own a Huawei and AFAICT, the OS / services (eg the kernel, android, Google play services) are basically the same as what would run on eg a Google pixel.


>> > Samsung + Huawei -> Google

> Oh come on. That's some serious reaching to prove your point right there.

On the software front it's right on the mark, though. Those devices wouldn't be much without Google. I know that Samsung in particular has had some other platforms (Bada? Tizen?) to try to break that dependency, but correct me if I'm wrong, I don't think it's been successful. AFAIK Google also imposes some kind of hardware requirements on the OEMs, so they influence hardware design that way.


Not in the slightest. Samsung tried to launch products that were rejected by Google via OEM licensing. Their commercial offerings are subject to external rules.


Every company's products are subject to external rules. All these huge companies constantly fight in court over trademarks, intellectual property, and so on. The products were not "rejected by Google", they were rejected by a court somewhere. And Google are at the receiving end of many such cases themselves, in many jurisdictions:

https://en.m.wikipedia.org/wiki/European_Union_vs._Google

https://en.m.wikipedia.org/wiki/Google_v._Oracle_America


Some products don't reach the stage of a public lawsuit, i.e. their fate is determined during private negotiations.


Interestingly, he lists DOS clones made in the UK by a US-based company as an achievement in his blog post, but anything derivative from Asia, or made by a multinational in Asia, is not notable enough.


Certainly right now someone would be a fool to overlook Asian hardware companies, including examples you mentioned.

But a bunch of this is more recent than the time period the article is discussing, as the comment you're replying to made some mention of.


Symbian was a terrible OS that was impossible to program for and needed to die. iOS rightly changed the entire paradigm, and they didn’t have the vision to catch up.


> Whereas most of the world's electronics are partly or totally manufactured in Asia, it's mostly by or for American- or European-owned companies.

The top personal computer vendor in 2019 by market share is Lenovo [0]

The top mobile phone vendors by market share in 2020 Q1 are Samsung, followed by Huawei [1]

The top games console vendor is Sony [2]

[0] https://en.wikipedia.org/wiki/Market_share_of_personal_compu...

[1] https://www.idc.com/promo/smartphone-market-share/vendor

[2] https://www.businesswire.com/news/home/20181129005477/en/Nin...


PlayStation is pretty much American. Headquartered in San Mateo, CA, it's a division of Sony Interactive Entertainment, a subsidiary of Sony Corporation of America. Both the PS4/Pro and the PS5 were American designs led by Mark Cerny.


The PS1, PS2, and PS3 were insanely successful consoles that were completely designed in Japan as far as I understand, so it’s still 3/5 for the Japanese teams.


Followed by Nintendo, running a Japanese OS on Japanese Hardware to play Japanese developed games. Have you not noticed the popularity of the new Animal Crossing during quarantine, or the outsized impact Nintendo has had on Western pop culture?


The Switch hardware is pretty much an Nvidia Tegra X1 reference platform.

The last mainline console they designed was the SNES, though they've designed almost all of their mobile consoles.


Mobile consoles such as the DS, which came very close to being the best-selling console of all time? 154 million units to the PS2's 155 million. The Game Boy is number 3 at 118 million.

https://en.m.wikipedia.org/wiki/List_of_best-selling_game_co...


Sure, but the example given by the OP only listed a Switch game.

And the Switch represents them getting out of the hardware market entirely by converging their mobile and home console lines and going with an Nvidia designed machine for both.

Like, the Switch literally has the same RAM chips as the Tegra X1 dev boards. That's normally something that you'd swap out just based on your purchasing department's relationships with the chip vendors more than anything else and you almost never see those carried over.


Did you know that Animal Crossing is a series of games that run on more Nintendo consoles than just the Switch?


> the new Animal Crossing during quarantine

Which, last time I checked, is a Switch-exclusive game.

And my point, anyway, is that Nintendo hardware isn't Japanese-designed anymore. Which is objectively true.

Their software is great though, and is Japanese-designed. Their OS reminds me a lot of Fuchsia, but actually completed and shipped.


Or maybe in 2020 the actual SoC is a piece of commodity hardware and claiming that the SoC + support components is the totality of hardware design is how you get constantly overheating iPhones that bend when you put them in your back pocket.

https://www.ifixit.com/Teardown/Nintendo+Switch+Lite+Teardow...

I can't find any public information on what the Tegra X1 reference design is without registering an account with Nvidia, but even if you're telling the truth, you conveniently ignore that said RAM chips are made by Samsung.


> Or maybe in 2020 the actual SoC is a piece of commodity hardware and claiming that the SoC + support components is the totality of hardware design is how you get constantly overheating iPhones that bend when you put them in your back pocket.

I mean, it's not a commodity. Another SoC means it's not a switch anymore as the software is incredibly tied to the specifics of that SoC.

> I can't find any public information on what the Tegra X1 reference design is without registering an account with Nvidia, but even if you're telling the truth, you conveniently ignore that said RAM chips are made by Samsung.

Here you can see the Samsung K4F6E304HB-MGCH chips being referenced for the Jetson Tegra X1 developer board.

https://developer.download.nvidia.com/embedded/L4T/r24_Relea...

Also, Samsung isn't a Japanese company, so their chips being in there doesn't make it any closer to Japanese hardware.


> I mean, it's not a commodity. Another SoC means it's not a switch anymore as the software is incredibly tied to the specifics of that SoC.

It seems that most of the actual OS layer of the switch is designed to be hardware independent. I mean they managed to run Skyrim on ARM so it's not a stretch to say it wouldn't take much effort to move to a different SoC. Maybe using a different SoC would be difficult with Linux, but it doesn't seem the case with the OS they're using.

> eSOL, a leading developer of real-time embedded software solutions, today announced that its µITRON 4.0-compliant real-time operating system (RTOS) and exFAT file system have been adopted in the Nintendo Switch™ game console developed by Nintendo Co., Ltd

https://www.esol.com/news/news_57.html

> Supported CPUs are numerous. ARM, MIPS, x86, FR-V and many others including CPUs supported by open source RTOS eCos and RTEMS, both of which include the support for µITRON compatible APIs.

https://en.wikipedia.org/wiki/ITRON_project

> Also, Samsung isn't a Japanese company,

This thread started because of a statement on "Asian companies"


> It seems that most of the actual OS layer of the switch is designed to be hardware independent. I mean they managed to run Skyrim on ARM so it's not a stretch to say it wouldn't take much effort to move to a different SoC. Maybe using a different SoC would be difficult with Linux, but it doesn't seem the case with the OS they're using.

There's no GPU shader compiler on the system, all of the games' shaders are compiled for this exact GPU micro revision. There's a lot of other ways that the games are tied specifically to this SoC. The closest you'd be able to get would be running the games in an emulator.

> uITRON stuff

Their OS really isn't uITRON derived, but I can believe they took the exFAT stack from there.

You can see the shape of the OS here, and it's pretty different from uITRON. https://switchbrew.org/wiki/SVC There might be a compat layer in the driver process for the exFAT driver though. They threw in a FreeBSD kernel space compat layer for their IP stack process for instance.

> This thread started because of a statement on "Asian companies"

That's cool and all. I was correcting your specific assertion that the switch is Japanese hardware.


> There's no GPU shader compiler on the system, all of the games' shaders are compiled for this exact GPU micro revision. There's a lot of other ways that the games are tied specifically to this SoC.

You're missing the point. If they can port a third-party x86 game using a notoriously janky, decades-old game engine to ARM, then they can easily move to a different SoC if need be.

> That's cool and all. I was correcting your specific assertion that the switch is Japanese hardware.

You were being pedantic about what is and isn't hardware engineering. You can't just sell a bare PCB with a SoC on it.


Lenovo barely existed until they bought IBM's PC division. The brand is (was?) entirely American.


https://www.lenovo.com/us/en/lenovo/locations/

7 research centers in China, one in the US, one in Japan. That's from their US facing site too.


Just a couple more random examples of international sources of influential technology: the Ruby language comes from Japan, and has had a pretty wide influence. Lua is from Brazil.


As is Elixir (Brazil).


Note that for non-Pixel Android phones, the Android they run is only part of the picture, and is permeated with non-Google code: the hardware driver code (the BSP, or Board Support Package) comes from outside Google; custom frameworks or libraries support product-specific features such as special camera configurations or AI accelerators; custom apps such as the photo app, dialer, and gallery; themes such as Oxygen OS or One UI; and cloud services that connect to the custom apps. Your Chinese tablet might be from a smaller company which doesn't provide many of the above, but the leading Android phone companies have large software teams that develop all of this for their products.


> The Julia programming language (AIUI, an Indian project)

Nope, Julia is MIT.

Jeff Bezanson, Stefan Karpinski, and Viral Shah were all at MIT, under the patronage of Alan Edelman.


You don't need to apologize. Not every blog post needs to include every culture on earth in equal representation.


Not in general, no. However, a post claiming that the US monopolizes all of worldwide software and hardware is incomplete if it ignores the output of an entire continent. Especially when that continent is the main manufacturer of tech products.


I am speculating here, but I think the US was among the first to realise that access to state-of-the-art computing is a national security issue. The US government therefore essentially subsidized IT R&D (e.g. Silicon Valley; this is confirmed by Wikipedia), giving the US a huge head start over the fragmented EU granting/capital landscape (and likely the Asian one, but I don't know). We therefore now live with a legacy in which, although tech is a global phenomenon, it is largely controlled by US capital, with other players (EU, CN, JP) playing catch-up.


In fairness, while C++ was created by a Dane in America, anyone anywhere could request a feature addition to the language and it would be implemented, so I think C++ is truly an international language.


And the Ada example from the article describes a language adopted by and then heavily funded/legitimized by the US DoD (e.g. they funded GNU Ada, made it a requirement...)

Still, the article makes a useful point.


And if you take UNIX to mean Linux, that was created by a Finn.


And if you don't take UNIX to mean (only) Linux, then it was created at Bell Labs, and not by a Finn.


Can UNIX even mean Linux? I usually hear Linux described as UNIX-like.


Doesn't that apply to most modern languages? I know it does for JavaScript.


I think it was (mostly) just a joke about how many language features C++ has by implying they've never turned down a feature request.


ARM is still mostly the same company; SoftBank just holds it. That said, most of the recent uarchs were designed in Austin, TX, and the next-gen A78 and X1 will be too. And ATI hasn't been Canadian in over a decade. The current graphics division of AMD, which used to be ATI, is headquartered in Shanghai, in AMD's Asia HQ.


Liam was referring to Acorn.


> Probably most of the hardware in your phone was designed in Asia

It doesn’t appear to be true at all for Apple phones: https://www.ifixit.com/Teardown/iPhone+11+Pro+Max+Teardown/1...


Apple devices are a tiny minority on the broader scale.

https://www.counterpointresearch.com/global-smartphone-share...


Is it even true for the Asian designed phones that the majority of the hardware is designed in Asia?

https://www.ifixit.com/Teardown/Samsung+Galaxy+S20+Ultra+Tea...

LOTS of Qualcomm ICs as well as Skyworks, Qorvo and Maxim which are all US companies.


If you consider display/battery/cameras/mechanicals, then by weight/volume/cost, most of the phone is designed in Asia.


Most iPhones are made in China with parts manufactured by FoxConn. Maybe it's different for US iPhones?


They are assembled in China. FoxConn doesn’t even do the majority of the manufacture of the parts. Their primary role is assembly.


"Made", i.e. manufactured, is not the same as designed. And I don't think they mean the ICs themselves but rather the whole phone package. Some are designed in South Korea, some in Japan, some in China, some in the US, some in Scandinavia.


Look at the link I posted. The vast majority of ICs are US designed.


I agree with you. So when the poster I'm replying to says "made with parts" in China, I think they mean the design of the PCB and the BOM. Which isn't true, as many of these are designed elsewhere but assembled in China (though some are designed there too). But it's also not about the assembly or design of the ICs themselves.


> Presumably because that's where the money was. Does that mean the US gets to claim these languages?

I mean, I don't think it really matters, but if they moved to the US because the US economy was valuing those things, then it's a product of the American economy from an economic point of view.


I agree. If we invert the argument: just because C++ and C# were designed by Danes, does that make them Danish languages? That sounds absurd! It's almost like saying the iPhone is a Syrian product because Steve Jobs' biological father was Syrian.


I was not trying to say they are Danish or American. My point is that they are neither.


If they were created in the US, then they're American, either in part or in whole.

You might as well pretend SpaceX isn't American either, because Elon is from South Africa. And Nvidia isn't American, because Jensen Huang is from Taiwan. Google is half Russian because Sergey Brin was born in Moscow.


> If you look inside any modern laptop or smartphone you'll find that most it was manufactured in Asia, and that the software and hardware design was split between many different countries and continents, and that the actual people doing the work are even more diverse.

Indeed. It takes a civilisation to make an iPhone. Many US multinationals have EU development divisions of one sort or another. Dublin is doing very well out of FAANG at the moment.

Where are the owners? Well, if it's a publicly quoted company, they're all over the world.

Where are the corporate taxes paid? Staff salaries and sales taxes are paid locally. But profits can be routed to the cheapest jurisdiction, so Apple is technically an Irish company when it comes to profit accounting, on which they pay almost no tax (see Ireland v European Commission (Apple Sales International) (Cases T-778/16 and T-892/16) (15 July 2020))

Perhaps the real question is, "When the nationalists come for you for displaying insufficient loyalty, which jurisdiction do you end up in?" The barrier between US and China is gradually being raised again.


Perhaps. But the scope of the article is in the title. And of course there are exceptions. While it's not always wise to generalize, it does make sense to look for patterns and follow arcs. That seems to be the author's intent.

As for C++ and C#, maybe it wasn't the money. Maybe it was the culture that attracted them? Maybe they were Danes by DNA but identified as American? If that's the case, then it's fair to say that those languages have deep American roots.


Yep. Ruby came from Japan, Ruby on Rails came from Denmark. Guido van Rossum is from the Netherlands. Rasmus Lerdorf is Danish-Canadian. What is it about Danes and languages? :)


The Netherlands is Dutch, not Danish. And he's Dutch-American now (and Python was designed in the US).


I thought Python was designed at the CWI in Amsterdam. At least that's what Guido's linkedin says.


I guess there's a little bit of both. Through Python 1.2 it was with CWI; US after that [1].

1. https://en.wikipedia.org/wiki/History_of_Python


Since we're making lists:

Java is Canadian. APL and J are Canadian. R is from New Zealand. Go is partly Canadian. Pascal and Modula-2 are Swiss. Scala is Swiss too. Lua is Brazilian. Ocaml is French.


(Blogpost author/submitter here.)

A small objection: Java came out of Sun, a Californian company. It would seem to me to be as American as Solaris, NeWS, SMF or vi. No?


Honestly, I'm a Canadian. We're much like Americans when it comes to software. A little bit less of an offensive-security community, but for almost everything else it's pretty even.


Prolog and Ada are also French.


Little known fact: Motorola's Dragonball series of CPUs, used in the original Palm, was designed by Motorola's lab in Hong Kong.


Why would you assume that TFA's author might know about, and therefore should have written about, things they did not mention?


This seems to be a list of things that Europe used to have. Here are some counterpoints that you might be thinking of:

The most popular form of UNIX (Linux) got started in Finland.

The best selling CPUs (ARM) are ultimately of British origin.

The Web was created by a Brit at CERN (Switzerland).

Raspberry Pis are quite popular, and British. (Also has an ARM CPU, and often runs Linux, sometimes with a web server. Nifty!)


And the other small hobbyist computing platform, Arduino, is Italian;

Python was created by a Dutch programmer;

PHP, the much maligned bastard language that still runs most of the web, was created by a Danish Canadian.

Even from his own examples, Ada and Pascal (as in Delphi) are still widely used, though probably in limited circles.


To add a few:

C# / TypeScript / Turbo Pascal / Delphi: Anders Hejlsberg had a lot of influence even though C# /TS are MSFT langs, and he is Danish

Haskell's GHC: _glasgow_ haskell compiler, Scotland

EDIT to clarify


Skype: founded by a Swede and a Dane, coded by Estonians. MySQL: a Finn and a Swede.


The web? We had Minitel in France.


And the first microcomputer:

https://en.wikipedia.org/wiki/Micral

I only found out about this because its inventor, André Truong, studied at EFREI at around the same time as Pol Pot. Fortunately, Truong was a much better engineer, and did something more constructive with his life.


But not the first personal computer, which was American:

https://en.wikipedia.org/wiki/Kenbak-1


Minitel wasn't really the same kind of thing as the Internet or the web.


However, arguably it was closer to where the Web seems to be going now.


Why?


Because Minitel was a non-extensible (effectively) service run on centralized systems, whereas the Internet is defined by a heterogeneous mix of networks all collaborating to share traffic, any one of which can provide a completely new service at will, without coordinating with anyone else and, more importantly, without asking permission.


That is a totally non-helpful reply. A reply that actually gave some information would be more useful.


Hmm. Either I replied to the wrong post, or msla edited it. As it stands now, it is quite informative.


He edited it, but it's still false. In the Minitel era, everyone could host a service; you had to get specific equipment, but it was easy to get. What was centralized was the indexation: people had to know your phone number to dial in to your service.


If you replace Minitel with the internet, the description fits just as well.


I just described how Minitel was completely different from the Internet.

I guess I don't understand your post.


You said Minitel was centralized, which is false. Everyone could host their service at home. Service discovery was centralized, but anyone could dial in to your service if they knew your phone number.


> The most popular form of UNIX (Linux) got started in Finland.

By a Finnish guy that moved to the US (and is a US citizen)

> The best selling CPUs (ARM) are ultimately of British origin.

The instruction set for the CPUs originated in Britain (the company is now Japanese-owned); the CPU design is almost all based in the US (Qualcomm, Broadcom, Apple).

> The Web was created by a Brit at CERN (Switzerland).

The web was started by a Brit, yes; the internet started long before, in the US.

> Raspberry Pis are quite popular, and British.

True


> > The most popular form of UNIX (Linux) got started in Finland.

> By a Finnish guy that moved to the US (and is a US citizen)

When he moved to the USA Linux was already a popular OS, and he didn't move here in order to develop the kernel but to work for Transmeta.


> and he didn't move here in order to develop the kernel

He moved there and started the Linux Foundation... what are you talking about?


Transmeta hired him, he moved, he liked it and stayed.


That's... not what happened. LF was formed by merging the FSG and OSDL. Only then did the combined entity have sufficient funding to hire Linus.


Nothing I said is in conflict with that.


He did not start the Linux Foundation.


The CPU implementation is done by Qualcomm, Apple, and the like. The CPU design is done by ARM, predominantly in the UK. The fact that Softbank (Japan) owns ARM has no bearing on where and who is doing the actual technical work.


That's incorrect; Apple designs its own microarchitecture that implements the ARM ISA. Qualcomm, HiSilicon, and Samsung use ARM's reference microarchitectures, which are now mostly designed in Austin, TX, although some have come from ARM's Sophia, France office. Samsung also made custom uarchs at their Austin office for their Exynos chips, but recently shut the operation down.


Not true - just look at the latest chip from Apple.

https://en.wikipedia.org/wiki/Apple_A13

Yes, it uses the ARM instruction set, but it was completely designed by Apple. This is identical to Intel using the same 64-bit instruction set as AMD... you wouldn't say AMD designs Intel's chips, though.


Apple is largely an exception among ARM SoC manufacturers, though. A lot of companies just implement ARM cores, with their additional magic in other bits of the SoC. The fact that Apple does it does not mean that ARM does not do any design.


Qualcomm, Broadcom, and Apple all license ARM designs. They're not the ones designing new ARM cores, they're licensing them. They're attaching all kinds of interesting other bits and bobs directly to those ARM cores, which is where their special sauce comes in. You can say the SOC design is all based in the US, but the CPU design is based in...wherever ARM does its CPU design. Which to be honest I don't know. :)

Maybe it's just pedantry, since the SoCs are what has resulted in ARM taking over the mobile space (and also a big chunk of the embedded space: your SSDs and HDDs likely have multiple ARM cores).


> Qualcomm, Broadcom, and Apple all license ARM designs.

Not broadly true, especially in the Apple case. Apple uses ARM in the exact same way Intel uses AMD: AMD created the 64-bit instruction set, but it would be ridiculous to say Intel isn't designing its own processors. Similarly, Apple designs all its own chips but uses the ARM instruction set.


I grew up in Germany during the home-computer boom in the 80s, and there was a definite anti-computer vibe at the time; my classmates thought I was contributing to the destruction of the high culture that was so superior to the US's. Even though there were many cool platforms, I think ultimately the pro-tech and pro-business atmosphere in the US drove the smaller competitors in Europe out of business.

Which isn't to say that European companies aren't strong in some areas. Here at hp we love STM32 micros (French/Italian), German robotics and automation equipment, optical equipment and of course ARM, which is a direct outgrowth of the pioneering European micro culture.


I'm surprised and disappointed that was the prevailing culture in Germany, which is famous for its engineering culture; I guess that's limited to mechanical engineering?

The British microcomputer boom had a huge boost from a classic "dirigiste" state direction, involving the BBC producing a TV series to educate the public and a matching computer to work along at home with. That led to Sophie Wilson being hired straight out of Cambridge to produce it, and ultimately to ARM.


100% - if you look at the landscape now, everything digital is still pretty much new and not working ("Das Internet ist für uns Neuland" in 2013).

Broadband is worse than in other countries, government interaction is pretty much non-digital, and so it goes on and on (there was recently a big thread about a government game-development grant so startup-adverse that the person telling the story was lucky not to have been bankrupted). Also remember all the pixelated houses in Google Maps? I wanted to look something up yesterday; the image was made in 2008 and everything is pixelated.

Despite working in tech I don't see Germany in any sort of leading role in anything digital or IT and this won't change.


> Ada, the fast type-safe compiled language (French)? Largely dead in the market.

Well, Ada was the language of choice of the US DoD from 1983 to about 1994. And it's only dead if you ignore the millions of code lines (old and new) produced in aerospace and defence.

> The Pascal/Modula-2/Oberon family, ... Largely dead.

Well, Pascal and its descendants (Delphi, etc.) are still in wide use with good tool support; obviously not in the author's field of view here either.

> Nokia's elegant, long-life, feature-rich devices ...

Sunk by an American u-boat in the prime of its life.

Asians or Europeans are also behind many so-called American inventions. Anyway, I don't understand what the author is trying to get at.


> Anyway, I don't understand what the author is trying to get at.

It's about paradigms in technology deployment.


When focus is quality, quality rises and costs fall. When focus is cost, costs rise and quality falls. William Edwards Deming, an American, said that.

Whilst a lot of what Liam wrote rings very very true (I'm not American either) I think it's slightly more nuanced. Windows Phone is a great example. Americans hate on Microsoft and so deprived themselves of the best mobile phone OS of ever. Even today Windows Phone is more usable, faster, more secure and better in almost every metric than Android. But Windows Phone is dead. Same with the Compaq iPAQ. It too was American and amazing.

I went through the whole tech cycle Liam did (started programming in '84), and felt crushed every time another obviously superior product died. Apart from the iPAQ, some highlights in the mobile space alone are the Nokia 9000 Communicator, the Nokia E90, the Nokia N900, and the best device (my opinion) I've ever experienced, the Nokia 1520.

Interesting that we as humans so often base our decisions on emotion rather than logic. I now use an Android phone and no longer bother with mobile apps. That sense of awe and wonder has been eviscerated, eaten up and spat back out.

Today's tech is throw-away. Watches, thermostats, tablets, light bulbs... Delivered by Amazon this morning, in the trash this afternoon, making space for yet another smart but shitty light bulb tomorrow.

I miss that child-like sense of excitement and wonder tech used to instill in me...


> Americans hate on Microsoft and so deprived themselves of the best mobile phone OS of ever.

I don't think that's a correct assessment for most Americans' opinions towards Microsoft.¹ I also don't think that's the reason Windows Phone never took off. When compared to iOS and Android, Windows Phone just didn't have the app ecosystem to compel people to buy them.

1 https://www.theverge.com/2020/3/2/21144680/verge-tech-survey...


I believe the assessment to be true. Windows Phone was doing really well in Europe. It was Americans who refused to write those WP apps that were so desperately wanted.


Not even refused to write, refused to let them operate. Microsoft would have made apps for the major online services themselves, if they were allowed to. For example, they created a fairly decent Youtube app themselves but Google, owner of both Youtube and Android, simply refused it access to Youtube.


> Windows Phone just didn't have the app ecosystem

Which is also a shame, because currently we have a torrent of crap in both app stores and nobody really bothers. All WP would need today is Instagram, TikTok, WhatsApp, Telegram and Maps to succeed.


I haven't used an iPhone since the iPhone 6 (2014), and I just got an iPad a few months ago. I was absolutely blown away that the App Store is still as bad as it was in 2014. I still have to go to 3rd-party websites to actually filter by tags and find the GOOD apps/games that don't make it to the front page of the App Store.

It's wild. Maybe I have the same problem on Android and I'm just so used to only using the apps I've known for the last 10+ years that I haven't realized it. I don't really game on Android like I do on my iPad; that's where I started to realize I was just not finding things unless I went to a 3rd-party site or found an app's name and searched for it directly.


Windows phone was great, but simply far too late to market. By that point it didn't matter whether it was better or not, the silos were built.

Deming is amazing as well. A real case of a prophet not honored in his own country, he's a big part of the success of post-war Japanese manufacturing.


> Windows Phone is a great example. Americans hate on Microsoft and so deprived themselves of the best mobile phone OS of ever. Even today Windows Phone is more usable, faster, more secure and better in almost every metric than Android. But Windows Phone is dead.

That's because Microsoft poisoned the well a long time ago. They might've spent the years since Ballmer's departure cleaning up that Superfund site and building a beautiful resort on it, but memories persist.


You proved the point (see my 4th paragraph).


When Android started getting big, a huge community of enthusiasts was making cool proofs of concept: porting busybox and VNC, running Ubuntu on your phone, ROM customization.

Beginners started using accelerometers and multitouch to make innovative games. I remember tutorials on how to make algorithms for good gesture detection (gestures were a big thing).
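As an aside for readers who never wrote one, here is a minimal sketch of the sort of swipe detection those tutorials covered: classify a touch trace by its dominant axis of movement. The function name and distance threshold are invented for illustration, not taken from any real tutorial.

```python
def classify_swipe(points, min_dist=50):
    """Classify a list of (x, y) touch samples as a swipe direction.

    Returns 'left', 'right', 'up', or 'down', or None when the trace
    is too short to count as a swipe.
    """
    if len(points) < 2:
        return None
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if max(abs(dx), abs(dy)) < min_dist:
        # Movement too small: treat it as a tap, not a swipe.
        return None
    if abs(dx) > abs(dy):
        # Horizontal movement dominates.
        return 'right' if dx > 0 else 'left'
    # Screen coordinates grow downward, so positive dy means 'down'.
    return 'down' if dy > 0 else 'up'


print(classify_swipe([(0, 0), (30, 5), (120, 10)]))  # right
```

Real gesture code on Android added debouncing, velocity tracking, and multitouch handling on top of this kind of core idea (e.g. via the platform's GestureDetector), but the dominant-axis trick was the usual starting point.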

Now app stores are worth almost a trillion dollars. You can still find niche communities, but just like with the Internet, it got too commercialized.


> "I went through the whole tech cycle Liam did (started programming in '84), and felt crushed every time another obviously superior product died."

I myself feel like a walking death sentence. Every tech I've ever liked and considered superior has died or is getting there. Shit wins.

As for the smartphone/app store combo: I consider it a crime. People are carrying around powerful computers, but instead of being able to use them in a creative way, they're actively isolated from that type of usage.


This.

I sometimes turned to my girlfriend while taking off in a plane (back in 2019 :P) and told her: "OMFG! We're realising humanity's dream of flying!"

More examples from the comedian Louis CK: https://youtu.be/kBLkX2VaQs4


> And I think a big reason is that Europe was poorer, so product development was all about efficiency, cost-reduction, high performance and sparing use of resources. The result was very fast, efficient products.

While Europe is poorer, the issue is not a focus on fast/efficient products... rather, it's that US companies have access to incredible amounts of "dumb money" via pension funds that they can use to simply flatten (or buy up) any European competition. Either via outright price dumping (Uber), via lobbying efforts (Microsoft), by marketing (Apple) or by having access to way more R&D resources than European companies (Apple again).


A professor in my grad school told me once "In US if you try to create a business and you fail, people will think you're entrepreneurial. In Europe they'll think you're incompetent".


Ocaml - France? Erlang?

A heartbreaker for me is Mozart/Oz (Germany?) .. wish it was developed/used/taught more .. "pickling" (in python) came from M/Oz.
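Whatever the exact lineage of the idea, the pickle API in Python is tiny; a minimal sketch of a serialize/deserialize round trip (the sample data here is invented):

```python
import pickle

# "Pickling" serializes an object graph to bytes; "unpickling" rebuilds
# an equivalent object from those bytes.
data = {'platform': 'Mozart/Oz', 'year': 1995}
blob = pickle.dumps(data)      # object -> bytes
restored = pickle.loads(blob)  # bytes -> equivalent object
assert restored == data
```

The same pair of calls handles nested structures, custom classes, and shared references, which is what made the concept notable when it appeared.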


His examples have so many mistakes, it's hard to take anything in this seriously.

- Ada (French). Ada is not french, it was developed for the US DOD. I also wouldn't call it dead.

- Pascal/Modula-2/Oberon: ... garbage collected ...: Neither Pascal nor Modula-2 is garbage collected. Modula-2 and Oberon are probably dead, but Pascal in its modern iterations is very alive and still widely used, apart from having a significant legacy.


> The Pascal/Modula-2/Oberon family, a fast garbage-collected compiled family

Pascal wasn't garbage-collected, was it? At least not in the original form.


> Ada, the fast type-safe compiled language (French)? Largely dead in the market.

Isn't this language used in the Eurofighter Typhoon?


I thought Ada was still fairly widely used in aviation software in general. Perhaps not in new designs?


From what I've seen in job descriptions for various defense related positions Ada seems to be written in as a small "nice to have" and makes me imagine most of it is maintenance or being replaced (probably by C++ too lol)


Not much Ada was written in the first place. The reason you see it in job requirements is inertia: it's easier to slap those three letters in there than to explain to someone that there is no reason to require it on the job posting. Imagine if a bank listed Cobol on every single tech-related job posting; that's basically what defense contractors are doing.


and some missile guidance system too


I do have anecdotal evidence for this (probably incendiary, but certainly not intended that way) point of view: Europeans tend to love creating and mastering complexity; Americans at their best despise complexity and produce simple approaches that cover most of the use cases. (Also, the author doesn't mention the HP 200LX PDA: a full DOS box with CGA, on which I have a C compiler, scheme48, RF design s/w and much more running. Still works great. https://en.wikipedia.org/wiki/HP_200LX)


Europe seems to me to have well-intentioned laws and norms that stifle computer-related companies: 1) very strong privacy laws and culture that make it difficult for startups to thrive, 2) business and bankruptcy structures that aren't as friendly to investors, 3) more sensible valuation of companies and higher risk aversion.


This thing of running a company into debt while growing the userbase until someone figures out how to make money out of it is certainly not part of the culture.

That's what gave YouTube an edge over DailyMotion, they were pretty 50/50 in popularity in Europe before Google stepped in to monetize the thing. I certainly remember uploading videos to both platforms.


> That's what gave YouTube an edge over DailyMotion, they were pretty 50/50 in popularity in Europe before Google stepped in to monetize the thing.

YouTube was flaming out until the boardroom backscratch between Sequoia and Google bailed them out.

YouTube was burning several million dollars a month on bandwidth and was facing down billions in legal fees--both of which were rapidly increasing. It was going to be bankrupt before the buyout. The buyout meant that Google could subsidize YouTube for a very long time.

It's also not at all clear that we wouldn't be better off if YouTube had actually gone bankrupt.


I'd suggest that in Europe innovation is stifled by a strange mix of very old vested interests, the class system (certainly here in the UK), socialistic cultural scepticism around capitalism/profit, and a cynical zero-sum mentality. Sadly I can only see the US going the same way what with the combination of ever-concentrated wealth for the lucky few, plus the faux-socialist destruction of the middle class which the Democrats seem to be working extremely hard at bringing about.


I think the oversight in this article is the survivor bias regarding American designs. There were plenty of those fast, efficient designs from American developers but they too died in the avalanche of big-team giants.


I also think it severely underestimates the advantage of having such a large internal market.

Having access to a large internal market allows you to grow to a fairly large size while still only serving that internal market. Then, when you want to expand abroad, you're likely in a much better financial position to take on the expenses of dealing with lots of different local laws and regulations.


> having such a large internal market.

... (I would add to your statement) with a large amount of median disposable income. Adding to this, a large number of consumers who have access to consumer credit and are far less reluctant to purchase on credit than those in other nations.

A lot of countries have a large internal market, but not all of them actually have the purchasing power, or the inclination to spend on novel non-established products as much as the US market does.


> or the inclination to spend on novel non-established products as much as the US market.

Moving from the west coast to Germany was definitely a fair amount of culture shock there. Americans love trying out shiny new things that are weird or different, my experience has been that Germans are much more cool to that kind of thing. They're simply less interested in novelty for novelty's sake.


"the inclination to spend on novel non-established products" is the key competitive advantage of the american market. Everything else are just compounding factors.


Very good point, thanks. I had that as an implicit assumption.


I don't doubt you, I just don't know of many, so some examples please?


Well, one that springs to mind for me is Palm. Their devices punched well above their weight in performance versus their Windows Mobile PDA competitors, with better battery life, but couldn't stand up to the corporate demand for buying things with the Microsoft sticker on them. Their later smartphone products were in the same position when Apple debuted the iPhone and was able to bring boatloads of cash from its iPod successes to bear on pushing both Palm and Windows Mobile out of the smartphone market entirely.


> Use the most minimal, close-to-the-metal language that will work.

What? Javascript/Electron is super popular, way more popular than close-to-the-metal languages for desktop applications.


Not sure. I saw a lot of industrial applications and they're all native. No Electron there.


There seem to be a lot of supporting arguments (look at all these dead projects he deems to have been fast, elegant, and efficient; also, hey, my current toys are not made for me to play with like I used to be able to, and the battery life sucks). But what is the actual thesis?

It seems like he wants hacker toys and is unhappy that everything is built for consumers now. I can't get anything deeper out of it.

The EU vs. US/Asia thing just seems like a weird proxy for saying real techies don't like extruded technology product but it's what the market asked for and now we have lots of it.


What makes you think the products he mentions were “hacker toys”? They were consumer products just the same.


Some of them were consumer products. But they lost out (probably) because they were more like something a technical person would want and less like something made for a consumer.

Would you frame the reasons they failed differently? If TikTok users and gamers needed long battery life and elegant scripting tools we would have them. But they don't.


Psion in particular had a huge business following. Mind you, so did Blackberry.


There is much truth told but

> Nokia's elegant, long-life, feature-rich devices, the company who popularised first the cellphone and then the smartphone? Now rebadges Chinese/American kit.

As someone who paid a lot for these early smartphones (N80, N800), I found them highly inconsistent. Maybe ahead of their time, but when the iPhone hit the market Nokia quickly lost its share for good reasons.


I think there is an interesting question here but I'm not sure I follow the train of thought.

First of all, I doubt that whatever happened relatively recently with regard to certain design choices in hardware and software is related in any way to how different implementations of BASIC fared against one another in the 1980s. Regardless of the conclusions, there are simply too few people involved in making these choices now that also know what the situation was then.

As a matter of fact, one of the perpetual problems in the field seems to be that so many already identified problems with certain design choices and solutions to them are being forgotten, and have to be rediscovered again by every new generation of programmers.

And I'm not sure how all this can be framed into the Europe vs America debate: most of the big things happen in the US. Arguably, having a huge single market and less regulatory overhead helps, but it's also because this is where the financing is. However, there are people from all around the world working on these things, so to what extent the final outcome can be called "American" is debatable. The founder of Commodore himself was a European Holocaust survivor. And we really have to be careful lavishing praise on "elegant" and "efficient" European products lest we forget whom to hold responsible for monstrosities like Systemd. Could it be just that we tend to remember the most successful products from the past, and those originating outside the US had to meet a higher bar to become so?

Perhaps there is an argument to be made that at some point while stuff in Europe was still being "built to last," American businesses have already been discovering the allure of planned obsolescence. This would pertain to the hardware side of course. As far as the software is concerned, I think everybody just got lazy the moment they could get away with it due to the rapid advancement in hardware performance. So from the era where you could have Jet Set Willy or Dynamite Dan (platform games with dozens of levels) fit in 48 kB, we've now "developed" ourselves into a corner where even winver.exe (a program whose sole purpose is to display the Windows version and an "OK" button) takes more than this (56 kB on my system).

While I concur this outcome is not desirable, I can't really agree it happened because of "moving close to the metal." On the contrary, I think this was only possible because of moving away from it, one layer of abstraction away at a time. To paraphrase the quote attributed to Churchill, BASICs of the future will no longer call themselves BASICs but I think if a BASIC was being invented today, it would look like another JS and it would be part of the problem, not the solution.


(Blogpost author/submitter here.)

It's a fair cop. It was only a blog post, dashed off on the basis of a FB comment that grew overlong. :-)

I do not have a coherent thesis that I was expounding; it was merely a passing thought.

I find it interesting, though, how the computer market has developed in my lifetime -- I first touched a computer keyboard in about 1981 and owned my own a year later.

In the early 1980s there were dozens of different manufacturers of almost-totally different and incompatible systems, and it was quite normal for the larger developed countries to have one or more indigenous computer makers, often with their own architectures. Most used one of a small number of CPUs, yes, and those CPUs were mostly American, yes, but I knew of English, Welsh, French, Spanish, German, Norwegian, and Swedish makes of computer, to pick the first few to come to mind. I'm sure there were many more.

Now I live in the former Communist Bloc, I've also learned of hordes of Sinclair clones and other copies of Western home computers that thrived over here, often after they were dead and gone from their homelands. I covet a Didaktik Kompakt. :-)

By the 1990s things were starting to consolidate but there was still a lot of variety, and major innovations came from other countries -- as other commenters note, from Denmark, from a Brit at CERN, from a Swedish-speaking Finn, etc. As I said in the piece, I used a Finnish cellphone with a British OS, and so on.

But when the world market consolidated down to a handful of companies and technologies, the winners were pretty much all American. The European software luminaries relocated there and often became American.

And yes, I think the overall design ethos has changed as a result, and I find that interesting. I have rarely seen it discussed. I often see people bemoaning modern bloatware, but I rarely see analysis of why and how it happened. This was my 2¢ worth on the subject.


Also not to forget: American business culture is focused on achieving your yearly bonus, which means achieving your goals, which means delivering the product you promised.

Many European places are focused on quality and feature richness. That produces delays and budget overruns.

As a consequence, American products are often first to market, get the attention and the money. And often, they just buy the competition who has not made it (European or not).

(and this is not only true for companies but also for managers within the same multi-national company).


I’ve never seen such an incorrect statement in my life — have you ever been to Europe?


You mean products in Europe are not focused on quality and just garbage both in sales and time to market?

(I don’t necessarily think that)


"Products in Europe" is ridiculously broad.


I agree to that. It is too broad what I stated.



