Yes, Japan didn't create the iPod, but that doesn't mean they're behind.
I remember showing my black-and-white iPod to some people in Japan in 2005 and they had never seen one before. However, if you looked in the shops you could find hundreds of better-spec'ed MP3 players, much better than the ones in the U.K. at the time and much cheaper. I guess the iPod just looked like an overpriced fashion accessory to them.
Then there were the phones people had in 2005 when I was there: people were able to do all sorts of things, like look up train times with complete ease. At the time that was very difficult on European phones.
The author of the article seems to think that it's progress to plug your MP3 player or phone into a computer, but it's not. Look at iTunes, for example: I can only put music on from one machine and can't get it off again (using iTunes at least). It's needlessly tied to a single machine, when I don't want it tied to any machine at all. People also don't want to plug their digital camera into a computer to upload pictures; they just want to send them straight to the internet from their phone. The Japanese were doing this long before the iPhone+Twitpic.
I think not requiring a computer as the digital hub is the future, not the past.
Your friends in Japan had never seen an iPod in 2005? Apple claimed the iPod had 60 percent of Japanese market share as early as November of that year.
It kept the lead until September of 2009, when Sony crowed that the Walkman had outsold the iPod the previous week... because the iPod's share had been cannibalized by the iPhone. Nice work, Sony!
"Better spec'ed" devices that didn't involve a computer necessarily cram more user interface onto a limited surface, which is why almost by definition, those devices were harder to use.
The iPod won not just because its interface was better, but because there was so much less of it. Offloading complex functionality to a general-purpose computing device made the iPod a better music device. There's really no getting around this point, even if the iPod's successors are becoming general-purpose computing devices themselves.
Well, they weren't my friends as such, just ordinary non-tech Japanese people I met in various hostels around March 2005. It seemed like the iPod was just getting established at that time: a few had heard of it, but the people I met had never used one. I'd note that even in the U.K. in 2003 very few of my non-tech friends were aware of the iPod.
The big things in Japan in March 2005 seemed to be the PSP (just released), high-definition TVs (there was one in a shop, but they were impossible to find in the U.K. at the time) and 100Mbps broadband. Oh yeah, and I saw 3D cinema at the World EXPO whilst I was there. Most of these things are still being touted as new in 2010.
I'd agree that the iPod won because they made a nice device, but I think that's because that is what the American/European markets demand. However, there are lots of markets in the world where functionality and price are a lot more important than usability, like the massive emerging ones in India and China.
Most of Apple's products are mainstream versions of technology that was available before but not suited to normal consumers.
For example, Personal Computers, MP3 players, Smart Phones and tablet PCs all existed long before Apple made products in those areas.
I wouldn't be surprised if Apple execs go to Akihabara every 3 months and ask themselves how they can make what they see there, but for Americans and Europeans.
I think that's the point. The Japanese are great at making a product better, but not very good at reinventing an existing product. They may have made a music player with the same specs as the iPod years earlier, but they never would have made the mainstream version.
The year is 1978. Microsoft is selling BASIC interpreters. Apple exists in a garage. A Japanese company creates this quirky little tape deck with headphones which -- get this -- you're supposed to actually carry with you, so you can listen to music while not at home. Hilariously stupid idea, right? The Walkman will be lucky to sell a thousand units, to say nothing of crossing over to the mainstream. The Japanese should just stick with what they're good at -- perfecting things made by white people.
Other fun Japanese inventions (culled from Wikipedia):
The Biplane
Graphing Calculator
Solar powered calculator
Camcorder
Digital Camera
Blue lasers
VCRs
Anything made by Nintendo
Flash memory
Various types of steel
Quartz wristwatch
The continuously variable transmission
I think the point is that the Japanese electronics manufacturers were (and are) very good at producing standalone devices, but missed the boat on devices (like the iPod) which get their power from their tight integration with the PC.
The standalone-device model works really well for some applications (e.g. digital cameras; you wouldn't want to have an iPod-style camera that you had to tether to a specific computer to get your photos out -- you want one where you can just swap cards like disks and keep shooting), but less well for others.
Most of the cheap iPod-lookalikes don't have anything like the level of integration that the iPod has with iTunes. And that's what set the iPod apart from the beginning, and has probably been the key to its success.
The iPod wasn't the first MP3 player, or even the first hard-disk-based player. But its integration with iTunes, along with the scroll-wheel interface (which was pretty impressive when it first came out), meant that it was the first one many people ever considered buying.
As a sidenote, iTunes did actually work with some other brands of MP3 player early on (including, IIRC, the Nomad), but Apple killed this off pretty quickly once they had their own out there. Which sucked if you owned one of those devices (they learned a valuable lesson for dealing with Apple, and iTunes specifically: beware the feature-stealing "upgrade"), but it's pretty clear why Apple did it.
There have always been MP3 players with better spec sheets as standalone devices, but none really that have anything like the software/hardware (and accessory) ecosystem. They're fine if you want to load music onto the player manually via the file browser, but the whole idea of the (HDD-based) iPod was that it would just contain your entire music library, constantly mirrored from your computer, and you'd never have to do anything.
It's always struck me as an example of how a product that's inferior (or at best "minimum viable") but really well-integrated with software can beat out more feature-rich standalone devices that cut corners on integration.
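If it helps to picture the model, here's a toy Python sketch (my own illustration, not how iTunes actually works) of the "entire library, constantly mirrored" idea described above: copy anything new or changed from the computer to the device and delete anything that's gone from the library, so the device never needs to be managed by hand.

    import shutil
    from pathlib import Path

    def mirror(library: Path, device: Path) -> None:
        """One-way mirror: make the device's contents match the library."""
        device.mkdir(parents=True, exist_ok=True)
        wanted = {p.relative_to(library) for p in library.rglob("*") if p.is_file()}
        present = {p.relative_to(device) for p in device.rglob("*") if p.is_file()}

        # Copy anything missing on the device, or newer in the library.
        for rel in wanted:
            src, dst = library / rel, device / rel
            if rel not in present or src.stat().st_mtime > dst.stat().st_mtime:
                dst.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dst)

        # Remove anything that's no longer in the library.
        for rel in present - wanted:
            (device / rel).unlink()

    # Hypothetical paths, purely for illustration:
    # mirror(Path.home() / "Music", Path("/Volumes/IPOD/Music"))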
I spent some time living in Japan, and I recognize a lot in this article. When I came to Japan I was surprised by how people are really not computer savvy in general. Most families have a computer, but only one; it's in the kitchen and nobody is very good at using it. This is strange, because people are quite tech-savvy in general and not at all techno-phobic like many non-geeks in the West. Japanese mobile phones have been awesome for years and there are plenty of really cool mobile web services. Non-mobile websites, on the other hand, frequently look like a mid-90s abomination, only with more colors and animated gifs. Perhaps that is just a different preference in layout style though.
Computer gaming really doesn't exist in Japan. It's all on consoles, handhelds and phones. Few Japanese have even heard of World of Warcraft, Starcraft, Quake, Counterstrike or Civilization. I think there are some erotic games for PC because the console makers will not give them licenses. So the few who know about computer gaming think it's for perverts.
The author's explanation seems reasonable to me. Japan simply took a different direction towards appliances rather than general-purpose computers, and has continued on that path. Appliances are easy to use, but are far less suitable for hacking.
This means you really don't find many "hackers" in Japan. There are plenty of nerds, but few of them are into programming. Similar to my impression of China, software is seen as a branch of engineering like any other. Software companies will actually hire developers with no programming experience, as long as they have a good engineering degree and seem like a person who would fit into the company. They didn't lie on the CV either; the company fully expects to put them in front of computers with "Perl for beginners".
On a side note, I really don't like the western HR buzzword bingo either, where having exactly the right acronyms on your CV means more than your aptitude and actual ability to program. Is there any country in the world that strikes the right balance in general?
I am here on HN to contribute a different view about this allegedly exotic country, again.
> Computer gaming really doesn't exist in Japan. It's all on consoles, handhelds and phones. Few Japanese have even heard of World of Warcraft, Starcraft, Quake, Counterstrike or Civilization.
Computer gaming does exist. They just aren't aware of the titles you mentioned because they aren't localized. Many Japanese have difficulty with English despite the amount of education thrown at it, but that's another story.
> This means you really don't find many "hackers" in Japan.
Hackers do exist. Again, just because... well, they don't speak English and are often not visible to the hacker community at large. I admit English is the lingua franca of programming, though, and often I have to bridge them.
Re CVs: it's not the same CV as in the West; in fact it's almost wrong to call it a CV. Conversely, many Japanese job applicants assume their CV works the same as their Western counterparts' and fail miserably.
As a nitpick, this isn't actually true: "Before the advent of computers, katakana was never used to write entire sentences." There were quite a few books written entirely in katakana in the late 19th and early 20th centuries, since it was seen as more accessible to the uneducated poor. Japanese socialists in particular tended to use katakana as an anti-elitist gesture. For a while there was even an active reform movement aiming to abolish kanji entirely, similar to the one in Korea that successfully abolished Chinese characters there in favor of the phonetic Hangul (the Korean switch happened in the 1890s, a time when the Japanese debate was also particularly heated).
the one in Korea that successfully abolished Chinese characters there in favor of the phonetic Hangul
That's actually not true. Hanja (the Korean name for Chinese characters) are still used extensively by newspapers, in academia, and in legal professions. As in Japanese, it provides a method of distinguishing between homonyms, which can be particularly important when writing contracts and the like, where precision of meaning is essential.
Just like with Japanese, Korean has a large number of homonyms in its Sino-Korean vocabulary because while Chinese is tonal, Korean is not. Moreover, the more formal Sino-Korean vocabulary is in prominent usage in the aforementioned situations.
While hanja use has certainly declined over the years, and many Korean youth these days are incapable of recognizing even the most basic of hanja (largely because of fluctuating education policy that has at times entirely eliminated hanja education), that was certainly not the case prior to the last 40 or so years.
Ironically, the place where hanja would be the easiest to read/write - on the computer - is the place where it is almost entirely absent. Take a look at the Korean language Wikipedia for an example of this; Hanja are almost always used merely for the purpose of disambiguating a word written in hangul.
Thanks for the correction! For some reason I had thought the change to Hangul was earlier and more complete than it actually appears to be. The fact that I mostly encounter Korean online, where as you note it's almost exclusively Hangul, might be part of the reason for that perception.
I thought conventional wisdom on this subject was that Japanese culture prizes conformity, so their equivalents of Jobs, Woz, Gates, et al would have been strongly discouraged from doing their own thing. The other side of that coin was that they are better at commoditizing technologies first invented in the U.S., like industrial robots, and RAM production.
Does this article mean that my assumptions are out of date?
My sense is that the conditions at small and mid-sized Japanese companies are reminiscent of the early 20th century, and are not compatible with producing innovation at a level necessary to sustain the economy.
Some things that I feel contribute (take them for what they are worth)
a) I haven't met a single female Japanese employee who hasn't been the victim of power harassment or other forms of emotional abuse. I can honestly say that 80% of women I have met at 5 different companies have been on some kind of psychological medication to cope with the stress at work. (Goodbye many useful innovations from female employees.)
b) I have met people who work 15 hour days; albeit where work consists of 6 hours of doing nothing, 4 hours of work, and then another 5 hours of waiting for the boss to leave. (Reasons include saving face - and poor housing conditions.)
c) The best university graduates tend to go directly to big companies; they don't create their own companies, because of the number of practical difficulties. (being accepted for & paying for rental properties, getting loans, etc..)
d) Everything has a big city perspective, since practically everyone has to come to Tokyo to do something important.
I can go through a hundred reasons; but the idea that there is some simple cultural issue that could be chopped away like a Gordian knot to solve the problem is a bit too simplistic.
We should talk about hacking Japan some time. Everything insane about large Japanese corporations just spells market opportunity. (In addition to the issues in the above post -- which I partially agree with -- you could pick "Engineers are grossly underpaid relative to skill", "No socially productive work is expected by anyone below the age of about 35", "Some of the best educations in the world are wasted on undifferentiated 'office work'", etc etc.)
Dude, you're seriously welcome to stop by sometime and eat with me. I'm sure we'd have a million ideas. Mail me sometime (jawaad.mahmood at!_? i do.t softbank d.ot j.p)
What gets to me is how there is absolutely no respect for younger workers; the idea is that they need to get calloused or be "hungry" before they have a shot. As if a clean-cut young man has never had a good idea in his life.
As for engineers being underpaid; there was an article in Asahi about how engineers being paid 5 million yen in Japan go to Silicon Valley and make about 10 mil. It's nonsensical.
(I'm not sure how up I am on the education system of Japan, but that probably belongs in another conversation)
You Tokyo engineers and your insanely high wages! Five million yen ($55k), hah. In Nagoya, the Employer Who Must Not Be Named exercises essentially monopsony on engineering labor, and they pay roughly according to the traditional yardstick: 2.X million yen for 20-somethings, 3.X million yen for 30-somethings, etc.
I have a friend who went from Montreal to Nagoya with Ubisoft; he was telling me about the terrible conditions and the utter inability of companies to re-invest into the company itself. They wouldn't even pay for a proper screen; they were using CRTs or something like that.
It's truly bullshit, because you can take 100% of any money you spend on the company off your taxable income (as long as you have proper receipts). It makes no sense for a company NOT to upgrade its facilities in Japan.
Your friend may be finding a company- or industry-specific issue there. My company has occasional frugality attacks but Nagoya is cough no stranger to capital-intensive industry cough.
Ha, a friend of mine who worked at Squaresoft was paid like this (his salary was his age multiplied by ten, plus some bonuses, and of course some extra money because he was married, if I remember correctly).
Since he was a foreigner, and foreigners are supposed and allowed to be strange, he doubled his salary by telling them that he would quit otherwise :-)
"Everything insane about large Japanese corporations just spells market opportunity."
Exactly right. To take an example from mobile, open innovation was literally strangled for years by the carriers until the iPhone came and just started owning. Android is going to do the same thing.
There is a difference between the hardware innovation that Japanese companies pride themselves on and the true, social innovation that can only exist when third-parties have a chance at changing the rules of the game. I will be shocked if most large Japanese corporations ever get this.
How to explain this... when you're in your twenties, you're expected to do a lot of work to prove your dedication to the company, but it isn't really expected that that work will be productive work. For example, I am 27. We have other 27 year old engineers who cannot, in fact, program Java above the level of a CS101 student. They majored in e.g. library sciences, not engineering. This is not considered bad, because they're Japanese salarymen and as such will be at this company for the next forty years. If it takes us most of a decade to get them up to speed on the whole Big Freaking Enterprise Web Apps thing, well, that's just an investment in their future worth to the company. We'll have employees senior to them review everything they do and rip out much of it.
In the engineering track, we'd assume that by 30 ~ 35 you've demonstrated whether you're going to be a net-productive software engineer or whether you should be moved into e.g. requirements analysis or producing documentation. If you're net-productive, you will start making stuff which directly affects the shipping products without getting exhaustively rewritten -- hence, socially productive. But nobody calls a 27 year old flailing away at his first for-loop non-productive. Of course he is being productive: he is training up an asset that the company will be able to tap in a decade. (And if he doesn't have a for-loop to flail away on he better find something to flail away on because insufficient utilization of him would reflect poorly on his team.)
I have a quirky relationship to these norms because I'm a foreigner, so if e.g. I can be reliably tasked with building a system and getting it to production without needing line-by-line guidance by my superiors, well, everyone knows foreigners are quirky.
That's insane. How much are you exaggerating for effect? I have a hard time believing software engineers in the mid-20s can be so unproductive, or that any culture could be so accepting of such a thing.
It doesn't mesh with my foreigner's take on Japan, but then I have never lived or worked there.
It isn't exaggerated, at least not for traditionally managed companies.
Here's something that would shock you. Talk to a Japanese businessman who likes you and does business in Japan. Ask him what he thinks about Japan.
100% of answers (no exaggeration) have been some version of "Japan is finished.". It's hard for anyone to imagine how utterly and completely negative the Japanese self-image is, at least among the business men I've talked to.
Much of their attitude has got to be a reflection of Japan being in its 2nd lost decade ... but such a thing can be self-fulfilling. I.e., it sounds like there isn't much in the way of Keynes' "animal spirits" there.
And this is reflected in a "lowest low" birthrate ... Mark Steyn says no society has ever recovered from that....
Are you familiar with the significance of the word "FizzBuzz" as it relates to programming skill levels? Tolerating non-productive programmers for long stretches of time is not a trait unique to Japanese corporations. We expect ours to eventually grow out of it, though.
This sounds like institutionalizing non-productive programmers on a company- or sector-wide scale for something like a decade, instead of just some programming teams being dysfunctional.
Japan is very short on young people; how easily can it afford to waste a decade of their career? Especially when other countries don't do this?
It's also got to be ... suboptimal for the spirit to work for a decade knowing that you're not accomplishing anything that affects the real world, at least compared to other countries where young people accomplish real things and get positive self-reinforcing feedback starting at the beginning of their career.
(Even if your company doesn't succeed, you've still produced something, or at least tried to and hopefully learned some lessons along the way.)
Japan is very short on young people; how easily can it afford to waste a decade of their career?
Productivity growth. Which literally means "every year, fewer people are able to produce more stuff".
There's also the problem of the global economic slump. Which literally means "we have lots of things that could be done, like fixing hundred-year-old pipes and buildings. And we have lots of extra workers who could be doing things. But they are compelled to sit around twiddling their thumbs in their Japanese corporate cubicles, or their American parents' basements, or whatever, watching stuff crumble, because the economic equilibrium is stuck in low gear".
It has some benefits. Some companies invest like crazy to get their engineers up to standard. That pays off, as they have good employee loyalty, and their employees are really immersed in that company's culture. That only works if you hold onto employees, though.
Seriously, if your core business is producing great employees to work on your niche of products, then it doesn't hurt to invest in training.
* Note, this doesn't work great if the niche moves.
This made me laugh. I didn't completely experience it first hand because I was working in a small 12-person company (and my boss was 31, so rather atypical for Japan). But from what I heard from my friends working in big multinational companies, it's exactly like this. Of course, as foreigners, they could get away with a lot because of their inherent quirkiness (I'm definitely sure it's much nicer to be a young foreigner working at a Japanese company than a young Japanese employee...)
What also surprised me is the low level of computer science education given by Japanese universities (compared to the level in electronics, for example...)
Does this article mean that my assumptions are out of date?
That sort of implies that Japanese were once lockstep conformists. I could take issue with that, at length, but instead I'll refer you to Sugimoto's textbook "An Introduction To Japanese Society".
P.S. One of my favorite Japanese engineers swears blind that the iPod is just a Walkman with a Game Boy in it. I think that is almost as unfair as "Japan just improves on technologies that were invented by white people."
I have always felt English works as a language to interact with computers pretty well, perhaps one of the best or the best. It could be argued this is just because I speak it as a primary language, but this article illuminates exactly why I believe it is a good candidate. Our character set is very small and very simple, which is important. There are sets with fewer (the indigenous Hawaiian language IIRC has 13 letters), but English is also very widespread, and words don't wind up crazy long (like in Hawaiian).
Other European languages with the Latin alphabet could be fair contenders as well because of this, although the ones with accents are a bit more complex, and none of them are as widespread that I am aware of.
So actually really what I'm getting at is not 'go English!' but 'go Latin alphabet!'
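To put a rough number on the "small character set" point, here's a quick Python sketch (my own, not from the post) of what different scripts cost per character in UTF-8: basic Latin letters are one byte each, Hebrew letters two, and kana or kanji three.

    samples = {
        "Latin": "ipod",
        "Hebrew": "אבגד",
        "Katakana": "アイポッド",
        "Kanji": "日本語",
    }

    for name, text in samples.items():
        data = text.encode("utf-8")
        print(f"{name}: {len(text)} chars -> {len(data)} bytes "
              f"({len(data) / len(text):.1f} bytes/char)")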
I believe Hebrew would work pretty well too. There are 22 letters, and it's phonetically very similar to Latin characters. Most words are also shorter in Hebrew but phonetically equivalent to an English-sounding word. For example, "comment" could be "כמנת": 4 characters instead of 7, and phonetically equivalent.
Suddenly another interesting thing springs to mind. Punctuation has become all kinds of useful for computers in the shell, in programming, etc. Things like / for folders are intuitive and easy to see. Do many other languages, particularly the better candidates with lower letter counts such as Hebrew, have punctuation beyond the period?
Edit: good for them, Hebrew does have punctuation.
Hebrew does, but only really because it imported it from modern European languages recently. Classical Hebrew and the religious Hebrew of the middle ages didn't have much in the way of punctuation, but when it was revived in the 19th century as a living language by European Zionists, it borrowed (adapted versions of) the punctuation common to European languages of the time.
I'm a Hebrew speaker, and I really doubt it. Whenever I get some technological appliance that can speak Hebrew, I turn it to English.
One of the big reasons is that Hebrew is right-to-left, which is a nightmare. You can't even type Facebook comments in Hebrew and have them correctly aligned.
If we'd started with Hebrew, it would be left-to-right that would be the nightmare. That's just because we've got left-to-right ground deeply into our software, not because of a fundamental physical reason (in the way that vertical orientation is arguably less useful to humans, since our field of view is a horizontal elliptical shape, not a sphere or a vertical shape).
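For the curious, here's a tiny Python sketch (my own illustration) of why mixing Hebrew and Latin text is painful for software: every character carries a bidirectional class, and a renderer has to find and reorder the right-to-left runs instead of just drawing characters in storage order.

    import unicodedata

    mixed = "Facebook comment בעברית"

    for ch in mixed:
        if not ch.isspace():
            # "L" = left-to-right (Latin), "R" = right-to-left (Hebrew)
            print(ch, unicodedata.bidirectional(ch))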
There should be companion pieces: "Why China didn't create the iPod" and "Why Europe (Germany, Finland, Sweden, etc.) didn't create the iPod". Japan and these places are all very tech-savvy, but their cultures are very different from that of the US; you can almost think of it as different solutions to the same equation.
Simplifying and summing up, I think the arguments go like this:
* In Japan, the conformity culture, bureaucracy in big companies, and lack of foreign brain power and venture capital are the problem (The Economist was saying that Japan has the lowest foreign technical investment among developed countries).
* In China, the lack of an entrenched copyright culture is the problem; in fact, taking a product and hacking it to add new features is prized. Also, good product design and customer feedback are unknown.
* Europe does have excellent designers; however, it lacks a start-up-worshipping culture; in fact start-ups are given the cold shoulder in government-heavy countries like France. The modus operandi there is to proceed by huge government-funded projects, e.g. Galileo, Airbus. And the foreign brain power is there, too.
So, you see, it's not Japan's fault: the US has a unique combination of cultural and other factors that produces things like the iPod. I don't think this is unrepeatable, though.
I agree with the points in general, except for one detail:
> Europe does have excellent designers; however, it lacks a start-up worshipping culture...
So? The iPod wasn't made by a start-up. The technology wasn't built from an acquisition of a start-up. I dare say it had absolutely nothing to do with start-ups. Yes, Apple came from a start-up. But to my mind, the distance between Apple the start-up in a garage and the Apple that envisioned the iPod and DRMed iTunes is sufficient to stop assuming only a start-up could've made the iPod.
"Why Germany didn't create the iPod?" is a very good question I've been asking myself, too. Remember that MP3 was even development by Germans (Fraunhofer Institut). But it required an US company to start the digital revolution...
I still remember the end of the 90s, when I was standing at the Schneider CeBIT booth talking to the booth staff about their new niche product: one of the world's first MP3 players (I think it even had a parallel port). It was shown in a corner of the booth and nobody was taking notice of the technology; even the Schneider people didn't seem to be interested. I talked to them about the huge potential of this technology and made them aware of the fact that MP3 encoding was about to take off at the universities (everybody in the CS department had started ripping & encoding their CD collections). But they simply didn't see it; they didn't believe that MP3, or that they, could change the world with such a product. What a wasted opportunity!
Fascinating. Makes one wonder how many trends in technology were motivated by seemingly insignificant details of our cultures. Imagine if people had four fingers on each hand: would binary floating point arithmetic be more intuitive?
Before you dream of 4-fingered hands, stop and consider that with fewer fingers counting would have been more difficult in the old, old days, and things would probably have progressed a bit slower! :)
Not necessarily ... we use base 10 because we have 10 fingers, but that's not the only choice humans could've made.
Geometry could've played a role ... angle degrees aren't base 10, which makes measurements of angles / time awkward.
The Maya used base 20.
The Babylonians used base 60, which is very convenient for angles since it's also divisible by 3 and 4.
Of course, I don't think there were number systems that had a base not divisible by 5 (except those that were baseless), and this probably does have something to do with the number of our fingers.
But I think we would've been just fine with 4 fingers ... we would probably have counted our toes too ... and base 16 would've been the norm.
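A quick Python sketch (mine, just to make the bases above concrete): the same number written out in base 10, base 16 (eight fingers plus eight toes), base 20 (Maya) and base 60 (Babylonian).

    def to_base(n, base):
        """Digits of n in the given base, most significant first."""
        digits = []
        while n:
            n, remainder = divmod(n, base)
            digits.append(remainder)
        return list(reversed(digits)) or [0]

    for base in (10, 16, 20, 60):
        print(base, to_base(365, base))
    # 10 -> [3, 6, 5]
    # 16 -> [1, 6, 13]   (0x16D)
    # 20 -> [18, 5]      (18*20 + 5)
    # 60 -> [6, 5]       (6*60 + 5)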
I couldn't imagine what a baseless numbering system could be. A quick Google showed that Roman numerals are baseless. I always assumed they were a form of base 10 and crazy shorthand.
I can't imagine what it would be like to live with such a baseless numbering system ... life had to be much simpler for it to work. Like, you didn't pay your taxes this month? Your possessions are confiscated, or it's off with your head, because calculating interest is too much work :)
They probably used counting boards, and specific units for measurement ... like they knew the size of a legion in their army and said "we have 3 legions, with 2 other joining, that's 5 legions" instead of ~ "we have 12600 men, 8400 more are joining, that's 21000 men".
Roman numerals are best thought of less as a numbering system (in the sense of our modern system) and more as a semi-overgrown counting system. If you start out with hash marks and sort of let it evolve, you can see pretty easily how they got to what they had. It's obviously limiting but not entirely stupid, when put into context.
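To see the "overgrown counting system" idea in code (my own sketch, same Python as the other examples): converting a number is just tallying symbols from the largest value down, with the subtractive pairs (IV, IX, ...) bolted on as a later refinement.

    ROMAN = [
        (1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
        (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
        (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I"),
    ]

    def to_roman(n):
        out = []
        for value, symbol in ROMAN:
            count, n = divmod(n, value)
            out.append(symbol * count)
        return "".join(out)

    print(to_roman(1978))  # MCMLXXVIII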
I'm kind of surprised hardware and software interface wasn't really addressed in the discussion about the iPod. It compares very favorably with the engineer-designed UIs that many local manufacturers release for consumer electronics in Japan and elsewhere in East Asia, and I think has helped drive its success there, as well as the success of the iPhone/iTouch.
This reminds me of the recent article talking about one of the major obstacles to China spreading its culture is the complexity of its written language.
Many people think that this fact alone will stop Chinese culture from dominating and spreading in the same way Western/English culture has spread.
Good point. I know tons of nissei (well, the Chinese word for nissei, whatever it is) who can speak Chinese but completely gave up on writing it, and it looks to me sorta like the language is going to die out in the US as the generations go on, partly because of that (unless immigration from China stays high).
Not slow with a server solution and next-generation upload speed. (Most of the OCR could be implemented on a hardware chip, so a full image wouldn't have to be sent.)
Another question - why did the industrial revolution happen in Europe, not Asia? My guess: movable type works better with Latin characters than thousands of kanji. Cheap (and more varied) books = cheap knowledge for the masses.
I've read it. GGandS claims Africa and the Americas were slowed down by their geography (vertically oriented continents make sharing crops and hunting technology difficult), native flora (Eurasian wheat and rice are good, African crops aren't so great, and corn is poor) and fauna (African animals are too dangerous to domesticate, and the good American animals were wiped out at roughly the time that humans arrived).
He may mention that the Arab world was delayed by climate change (deforestation, desertification and other reasons?), but I think he assumes that Europe only beat Asia to the industrial revolution through dumb luck.
Thus my suggestion that Latin script was far superior from 1400 to about now (when laser printers and LCD screens are replacing green screens, movable type and dot matrix printers).
I think the real problem is that the necessary books weren't being written in the first place (or were suppressed more ruthlessly), not that the cost of printing them was too high. The massive character set meant that becoming literate required a huge investment of time/money. And so, at least in China, all the literati went into civil service jobs, and had little incentive to publish books which would shake up the status quo.
You may be interested in the book Asia's Orthographic Dilemma by William Hannas.
No. Japanese has a LOT of homonyms. Kanji are necessary to tell the difference between all the possible meanings of a given set of sounds^. IANAL^^ but from my experience it seems Japanese is much better for speaking and reading than it is for writing. In speech the homonyms are more useful because it's easier to tell what something means from context. I find that I'm often much terser in Japanese than in English in speaking and vice-versa in writing.
^The number of kanji in use is actually INCREASING because of auto-completion software.
^^(I Am Not A Linguist), actually readability of Japanese could go either way, I'm not sure. On the one hand, the characters are more complex, requiring more space and processing; on the other, they stand for larger chunks of meaning. Has anybody here seen any studies on the reading-comprehension speed of Japanese vs. English?
^^^Also, on readability: Kanji help show the grammatical structure of a sentence somewhat. The same way you read English words by recognizing their shape, you can recognize the structure of a Japanese sentence by its shape.
Plenty of non-English speakers find English transliteration good enough (or Hindi, or Arabic transliteration). It takes getting used to, but once it becomes standard, it works.
I suspect the only reason I had trouble reading your post is because I'm just not used to the spelling system you used. Is there a fundamental reason why romanji can't be used, or are people simply not used to it?
I agree, I see some type of backwards "deRomanization" used all the time by foreign students in the West. I've noticed it most prominently among the Arab and Persian students who end up on University computers without the ability to type in their native script. So they flub it with Roman characters and get by pretty much okay.
For Army linguists who speak Korean, there's a similar system they use to write Hangul on an English keyboard. But it isn't phonetic.
It's extremely difficult to read Romanji, especially with the sheer number of words that would be represented exactly the same way. It's the same as representing everything in Hiragana or Katakana; one simply cannot read it properly.
1. Did the large number of homophones emerge through use of the written language?
No. Modern Japanese is the standardization of 3 different scripts: kanji, which are Chinese characters; katakana, which is a phonetic script made by simplifying Chinese characters; and hiragana. Katakana and kanji were used for official writing, and hiragana was used for informal writing. The Japanese government standardized the three scripts around 1915 and now they're all used together for different things.
2. How do they manage to talk to each other, then?
Context, context, context. You can tell which homophone means what depending on the context and where in the sentence the word falls. Sentences frequently consist of just the verb: where we might say "turn it on", a Japanese speaker would say "on". Also, Japanese people rarely interact with people outside their circle of friends and business acquaintances, and when they do it's in extremely formal circumstances where everybody knows what to say.
Why is it that context is insufficient for written Japanese, but sufficient for spoken Japanese, especially given the terseness of spoken Japanese? How is it that many Japanese books (such as The Tale of Genji) were originally written using phonetic writing systems? Is modern Japanese so different?
Because context is more than just what words are being used. Facial expressions, who's speaking, gestures -- everything feeds into the context of the spoken language.
My favorite example of this was when I was watching "Bushi no ichibun," a samurai movie that came out a while ago. A man and a woman were in a house, and my wife and I could not figure out their relation. The woman says "Hey, you." And we both realized that the characters were married.
"Is modern Japanese so different?"
Yes. Yes it is. Japanese has gone through so many reforms that it's almost a completely different language. Also take into account that the Tale of Genji was the first written novel and you'll see why modern speakers would not be able to read it in its original form.
What you are talking about in the married scene is the emotional context of the way she said "you". I guess in the movie she probably said "anata"? But this is not a very good example: regardless of emotional context, simply using the word "anata" instead of his name is a solid contextual signal that they are married. Otherwise she would say his name, or his "role" (father, older brother, younger brother, etc).
I'm sure there are some sentences that require clarification, but this is true in English as well. Generally speaking, though, it is not an issue. Only very rarely do Japanese people end up needing to disambiguate by pantomiming the strokes of kanji in the air or saying "you know, the first character from such-and-such, not the character from that other word" (except when discussing the writing of names, which is totally insane and is the real-life version of the old Monty Python Throatwarbler Mangrove sketch).
I don't think blintson gave you a good answer for your first question. In fact,
(i) words derived from Chinese (kango) are a large part of written Japanese vocabulary;
(ii) kango came to Japanese largely through the writing system (not through massive Chinese immigration);
(iii) and most of the problematic homophones are kango. There aren't that many homophones in "yamato kotoba", the "native" Japanese words.
So, yes, a large number of homophones emerged through use of the written language. Of course they're part of the spoken language now, but the written language is significantly more formal and so still contains more kango, and therefore more homophones.
There are at least two reasons for (iii): spoken language naturally leads to different words having different sounds; on the other hand, many words that sound different in old Chinese end up sounding the same when adapted to the Japanese phonology. This is because Chinese has tones and consonants and vowels that don't exist in Japanese. So Chinese "kang1", "kang2", "kan1", "kan2", etc, all become "kan" in Japanese.
As blintson said, context is enough most of the time in conversation. Example for "yamato kotoba": hana can be "nose" or "flower", hashi can mean "bridge", "edge" or "chopsticks". As you can imagine, most of the time the intended meaning will be clear.
Off the top of my head, I can only come up with 2 pairs of homophones that often elicit clarification in conversation: shiritsu can mean both "municipal" (市立) and "private" (私立), which is annoying when you're talking about schools. Another is kagaku, which can mean either "science" (科学) or "chemistry" (化学).
>Example for "yamato kotoba": hana can be "nose" or "flower", hashi can mean "bridge", "edge" or "chopsticks". As you can imagine, most of the time the intended meaning will be clear.
Those words are actually spoken differently, even though they share the same spelling in romanji (or hiragana). The stress accent is in a different place. This is similar to how in English, the noun "record" and the verb "record" are differently stressed despite their identical spelling.
Possibly a phonetic orthography that included pitch accent would maintain greater readability once people were used to it. Unfortunately, people in Kansai put the accent on different syllables than the people of the Tokyo metropolitan area.