One interesting point is he brought up Einstein's invention of relativity. Notice that in the scientific community, humanity has been building upon previous work for centuries with no copyright or patent protection, and nothing more than honor, citation, and shame against plagiarists.
The fundamental defense made by many is that without patent protection for software, people would not put much effort into innovating.
1) If you look at science, open source, fashion, food, and other areas where humans continually build on culture, you see plenty of continued innovation without insane legal protection.
2) The amount of effort pales in comparison to the monopoly granted. You could make the argument for, say, pharmaceuticals, that if it takes 10 years from lab through human trials and hundreds of millions of dollars, then a two-decade protection period might be needed. But there is no FDA for software, and Apple actually spends far less on R&D than other companies, and with $100 billion in the bank, you can't claim they haven't gotten an incredibly good return on their investment.
Therefore, it is insane to grant 20-year protection to Apple for stuff like pinch gestures. Two years, maybe. But 20? It's absurd.
Just to keep things fact-based, the recent court case against Samsung did not cite the "pinch-to-zoom-after-finger-contact-has-been-broken-and-restarted" patent, 7,812,826.
And, even though that patent was _not_ referenced in the case, you will see it's a particular multi-touch behavior _associated with pinch-to-zoom_ that Apple is claiming a process invention against, not pinch-to-zoom itself. It's actually quite clever.
You don't have to contest the patent for it to be suppressing others' design. I once worked for a company that made a humidifying air pump. We had to have the water tank held in with a removable flap rather than a door because our main competitor had patented 'holding the tank in with a hinged flap/door'. Sure, you could fight it in court, down the track, after hardware design (which is much more expensive than people realise). Or you could move onto the next problem and have a slightly inferior product. Patents on trivial things stifle innovation.
The point I was trying to make (and clearly failed to) is that Apple does not have a patent on pinch-to-zoom, so they couldn't contest the patent on it, because it doesn't belong to them.
Something funny is going on - I don't recall doing my reply against this particular comment, I think it was meant to be against a different comment about patents being trivial to work around or somesuch. Probably human error on my part, my apologies.
Thank you for linking those articles, that was very helpful.
That said, Google was lucky that there was a loophole in the patent definition. Imagine a world where Apple had actually obtained an airtight patent on a system that listens for single- or multi-touch gestures and switches between scrolling and pinch-to-zoom appropriately (which is essentially what the patent in question does). So, you'd be free to implement pinch-to-zoom as long as it doesn't infringe on that. Good luck.
(the point here being that, de facto, Apple would essentially have a patent on pinch-to-zoom. In real life [not in the proposed alternate universe] this is almost true -- there are loopholes, but you have to hire a patent lawyer to find and/or confirm them).
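To make that concrete, here is a rough sketch, in toy Python of my own, of the kind of behavior being described: a handler that watches touch events and switches between one-finger scrolling and two-finger pinch-to-zoom. The `view` object and its `scroll_by`/`zoom_by` methods are made up for illustration; this is not a reading of the patent's actual claim language.

    import math

    def handle_touch_move(prev_touches, touches, view):
        """One finger scrolls the view; two fingers pinch-zoom it.

        prev_touches / touches are lists of (x, y) points from the previous
        and current touch events. `view` is assumed to expose scroll_by(dx, dy)
        and zoom_by(factor), both hypothetical. Illustrates only the general
        behavior discussed above, not the patent's claims.
        """
        if len(touches) == 1 and len(prev_touches) == 1:
            # Single finger: translate the content (scrolling).
            dx = touches[0][0] - prev_touches[0][0]
            dy = touches[0][1] - prev_touches[0][1]
            view.scroll_by(dx, dy)
        elif len(touches) >= 2 and len(prev_touches) >= 2:
            # Two fingers: scale the content by the change in finger spread.
            old_spread = math.dist(prev_touches[0], prev_touches[1])
            new_spread = math.dist(touches[0], touches[1])
            if old_spread > 0:
                view.zoom_by(new_spread / old_spread)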
Imagine something completely different had happened, and based on that, something bad could then have come out of it?
I'd rather not; let's just stick to the facts and not build strawmen. It's bad enough to even discuss pinch-to-zoom or rounded rectangles when that's not what the trial was even about.
If we're discussing the Apple trial, let's figure out what really happened, not just repeat what we read in the press.
Apple's original claim didn't have that "loophole", but added it after the patent office rejected the original version as being anticipated by prior art (specifically, a patent [1] and a patent application [2]).
It's not a 20 year monopoly. Many Android manufacturers are paying Microsoft for their patents. What prevents them from licensing Apple's patents?
I don't care which phone I use, because they're (mostly) all pretty good now. But wow, I remember the Microsoft WM5 heap that I had just before my iPhone. It was a testament to terrible UI. It had more widgets than my old Palm5 but was somehow far less usable. Not some rubbish knock-off, the premier phone from HTC running the software from the world's software monopoly, and it was terrible. I made vows never to use another Windows phone again after that. Version 5 and it was still crap. With the iPhone, Apple's first version knocked it out of the park. First try. And changed the industry, yet everyone thinks it was all obvious or people should be able to use their ideas freely.
There is obviously a balance between Apple-owns-all and everything-goes. Apple should benefit from their patents like every other company does, but others should be able to license them under reasonable terms. Until we finally dump all software patents, this seems like a reasonable compromise.
PS. - pharmaceuticals usually get a full-on monopoly for the full period. Software makers usually cross-license, so there is a difference.
PPS. - Apple's 100bn in the bank just shows how far behind everyone else was at the time. Took ages to catch up.
Eep. Scrolling was through the tiny scrollbar on the right, and I'm left-handed to make it even more irritating to use. Closing a window with your finger required a fingernail in the corner (or stylus, add 2 seconds). It insisted on stuffing a Start Menu in there, with careful clickery required to not make mistakes. And a Windows Explorer, with tiny expand-contract [+] thingies. I forget most of the horror.
For others who forget, these were examples of the best apps at the time. Behold the menu bar for some apps, the scroll bar, tree menus (and tree folders, requiring styli or fingernails), [x]-to-close, the button on the bottom that says "Up". You can see the iPhone was out, and some apps were already influenced by it.
My hardware had a slide-out keyboard. Sometimes the flip didn't flip, though, so you'd open and close it multiple times until it woke up. The phone was capable of connecting to the web, but that just generally brought misery. It could do video calls, which was theoretically pretty cool. We tried that once.
Version 5, that was. I think I'm going to put it in a fire, right now.
My goal was to add some value to the conversation, not convince you or others that you didn't like the phone. It's not about your opinion, it's about Apple and the state of the art at the time, and what they did to change it. If you have some relevant info about why it was great, paint that picture for other HN'ers who might not have used it.
Part of the price difference will be the cost of shipping and handling, customer service, warranties, etc, but I think it is reasonable to claim that at least $200 will remain after factoring in those. People do buy these phones, so there must be some $200 of value in an iPhone that is not attributable to its hardware. People will have different views on where that value is, but IMO, it is not unreasonable to assume half of it is in the iPhone UI.
Einstein didn't invent relativity - it was already there. That's why people in the scientific community rarely get miffed at people. They're not innovating or inventing - they are discovering.
There need be no software patents on the basis that software takes time to reproduce and that is your guarantee. The implementation is where the cost is. If the idea is obvious and trivial (like pinch gestures), then you probably don't deserve a head start.
> One interesting point is he brought up Einstein's invention of relativity.
He didn't invent, he discovered. I'm not just being pedantic; I think the approaches signified by these two words go to the root of the differences you're describing.
Do you invent, or do you merely discover an optimum in solution space? This is really just semantics that changes nothing: regardless of what you want to call it, there is a history of people spending decades of effort to describe new math, new models, and new mechanisms, all prior to patent protection.
That reforming copyright and patent laws will reduce innovation is unproven scaremongering. There are other industries that don't litigate like this, but nonetheless experience a wide diversity of innovation. With the exception of Nathan Myhrvold, most innovations in the field of cooking have not been patented. You don't go to a restaurant and get a patented entree; the chef cooks up new and original recipes by copying and tweaking the recipes of those who came before.
In a classical sense, science is an act of discovery, of learning and understanding that which already exists, that which is the fundamental nature of the universe. These are things done out of a quest for knowledge and understanding. The entire concept of a patent is foreign to such an approach.
Invention on the other hand is a concept rooted in taking a knowledge base and using it to create something new. Something that isn't just the fundamental nature of things but rather using such fundamental rules to create a novelty. The concept of a patent is founded on this ideal.
Ultimately what I'm saying is that science marching along just fine without patents is a poor comparison to use; it has a completely different basis than what patents were created to protect (theoretically; I think they are just a mess at this point).
"You can patent pretty much anything under the sun that is made by man except laws of nature, physical phenomena, and abstract ideas. These categories are excluded subject matter from the scope of patents."
No. One can discover mathematics, but mathematics cannot be patented. One can discover an ancient civilization ... and so forth. There are many more examples, and they're not patentable.
They sure can, but that means they would be forced to share their "discovery". Discovery and invention are two different domains when it comes to patent law.
If pinch and zoom is such a trivial thing then invent some other thing to use instead.
The patent system by-and-large doesn't pick and choose patent durations based on the kind of patent.
Theoretical physics and patentable technology are quite different things and the comparison is not helpful. Before patents we had a guild system where people who made almost anything -- pottery, telescopes, pots and pans -- would obfuscate their techniques and if possible their products. Knowledge of key manufacturing processes was a trade secret. This is the world that we'd be in without patent protections.
Trivial doesn't mean valueless. A "1 click buy" patent is trivial, but forcing everyone to "invent something else instead" (e.g. 2 clicks) is stupid. And what if 2-click is patented? Then the next guy to come along has to have a 3-click system, or some obfuscation that convinces a jury.
Pretty soon, consumers are confused, because every website they go to has a radically different user interface brought on by patents on ideas that should be basic commodities.
When we had the guild system, we also didn't have universal public education, and instantaneous near zero cost publication of ideas.
Plus, your analogy makes pinch-zoom patents look even worse. You can keep how it is implemented a complete trade secret, and from looking at it for a few seconds, I can produce an equivalent implementation.
Apple has already obfuscated their techniques anyway. Why not release iOS as open source then, if it's protected by patents? The reality is, these patents are read by no one except lawyers, and in many cases they underspecify the implementation by being very abstract.
I really don't see how anyone who writes software for a living can defend these things and defend the status quo of not even supporting reform. Are you asserting that something like an XOR cursor (another dumb patent) is equivalent to a manufacturing process deserving 20-year protection?
> I really don't see how anyone who writes software for a living can defend these things and defend the status quo of not even supporting reform.
I agree that it seems like things are a bit screwy right now, but I don't claim to know what the solution is, and I'm not sure that things haven't always been a bit screwy (Alexander Graham Bell got the telephone patent because he was ahead of some other guy in line, Farnsworth died poor having invented TV).
Abolishing software patents is -- in my opinion -- not the solution since -- given the direction technology is headed -- this is going to be disturbingly similar to abolishing patents altogether. Most suggested "reforms" of the patent system seem to satisfy Mencken's criteria ("neat, simple, and wrong")
Note that genetic and chemical-engineering patents are pretty close to software too.
1-click is one of my least favorite patents, but it really comes down to an argument about obviousness -- an argument it appears to have lost in Europe (where the patent was never granted). The patent has been challenged, most of its claims thrown out, and its remaining claims narrowed:
I'd suggest that in the end it's a question of application, not theory. In this case the Europeans have arguably done a better job of applying patent law than the US.
The solution already exists, in parallel with patents. If the whole goal is to "incentivize innovation", then there are many ways of doing this. And we've been doing it already. If all you wanna do is incentivize innovation, then just give innovators what they need to innovate. That's what angel investors, angel groups, startup incubators, accelerators and even Kickstarter backers do.
There's absolutely nothing about the concept of "incentivizing innovation" that says you must punish other innovators by giving monopolies to each individual. Of all the ideas for promoting innovation you can think of, granting monopolies is among the worst. If, before the invention of Intellectual Property, you had asked people to come up with new ideas to incentivize innovation, no one would have come up with "let's promote innovation by forcing innovators to pay fees to a select few". And in fact, no one did; that's not how IP was invented, it was the other way around. It started with UK monarchy monopolies with the explicit goal of making money for the monarchy. The excuse that IP protects innovation was made up later by those who were profiting from it when the monarchy fell.
Solutions to replace patents have always existed, still exist and are working great. YCombinator is a great example of that. If you think it's the government that should grant some kind of incentives: I don't know about the US, but in my country we have many government programs for innovative startups. Many high-tech and biotech startups only exist because of government-granted funding, incubation and mentorship.
Humans will always innovate. Solutions already exist and are working. Patents just need to stay out of our way and the rest will keep working fine.
> Knowledge of key manufacturing processes was a trade secret.
So, without patent protection, pinch-to-zoom would have been a secret and the rest of us would never have been able to figure it out? Or, without patent protection, no one would have put in the R&D effort to come up with pinch-to-zoom?
Obviously not.
The benefits that patents are supposed to provide to society just aren't benefits at all in this case. So, as a society, we ought to figure out a way to change the law so that it provides benefits where needed without doing more harm than good. It's clearly doing more harm than good in software right now.
I think you're arguing against a straw man. The idea of pinch to zoom isn't patented, it's specific tricks to the implementation, which the patent explains, that are patented.
The world that you were describing, without patent protection, would be perfectly fine for software.
Do you think Samsung needed to refer to Apple's patent to figure out those tricks? Or that those tricks needed significant R&D to come up with, which requires patent protection? I do not.
I've implemented pinch-to-zoom in a bunch of different contexts, and I agree that it's easy to do a half-assed version. AFAIK that's not what Apple patented, and you're free to implement half-assed pinch-to-zoom to your heart's content (although you may be infringing an earlier patent).
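For the curious, here is roughly what the half-assed version looks like: a few lines of toy Python (my own sketch, not anything Apple shipped or patented) that just scales by the ratio of the current finger distance to the initial one.

    import math

    def naive_pinch_scale(start_touches, current_touches, start_scale=1.0):
        # Naive pinch-to-zoom: scale by how far apart the two fingers are
        # now versus where they started. start_touches / current_touches
        # are [(x, y), (x, y)] pairs.
        d0 = math.dist(start_touches[0], start_touches[1])
        d1 = math.dist(current_touches[0], current_touches[1])
        if d0 == 0:
            return start_scale
        return start_scale * (d1 / d0)

    # Fingers start 100px apart and end 150px apart -> 1.5x zoom.
    print(naive_pinch_scale([(0, 0), (100, 0)], [(0, 0), (150, 0)]))  # 1.5

Everything hard (anchoring the zoom around the pinch midpoint, fingers lifting and re-touching, interleaving with scrolling) is exactly the stuff this leaves out.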
Pinch-and-zoom is a good idea. But it is just that: an idea.
Its invention cost no money, and society as a whole is not served by granting an exclusive monopoly over it to any entity.
The entire discussion is becoming absurd. Is pinch-and-zoom (and friends) all that separates Apple from the competition?
In other industries, manufacturers learn from each other (seatbelts, suspension systems, transmissions, etc.), but somehow when it comes to software and electronic devices we want to stop progress and grant monopolies to every oh-so-little idea.
In so far as this is true -- that Android devices' pinch-and-zoom doesn't behave at all the same as iPads', and that this makes them much worse to use -- those Android devices have not copied what Apple did.
(I have no opinion about how much Samsung actually copied from Apple or whether what they did was legal; I haven't looked carefully enough at the case. I'm not passing comment on that. Just saying that you can't reasonably say both (1) that what Apple have rights over is a very refined implementation of pinch-and-zoom that differs from anything you'll see on an Android device, and (2) that a maker of Android devices copied it. At least, not without a further explanation of how they screwed it up so badly.)
Regarding the expense of clinical trials, it's important to note that the scientific results of a clinical trial are not protected by either patent or copyright. Pure research data cannot be restricted under any form of IP. Factual data without a creative element is explicitly exempted from both the patent and copyright system.
A published paper on the results of a trial may fall under copyright, but the factual observations and conclusions cannot be.
The "clinical trials are expensive" argument is incredibly disingenuous.
It's possible that innovation, specifically scientific progress that leads to innovation, actually relies on scientists sharing stuff with collaborators that patent lawyers would not want them to share. The patent lawyers might often be "the last to know". It's possible that many scientists often put their own scientific goals, which stand to enrich the scientific community as a whole, before the business goals of the employers they work for. Something tells me that Apple's patent lawyers do not have this problem with Apple engineers.
Not that I consider software developers to be scientists but I wonder if there are any Apple engineers contributing to open source projects or even just publishing the occasional research paper.
Doesn't Apple have rules that keep all their employees from talking to anyone outside of Apple under threat of immediate termination?
No FDA for software. You can say that again. There's also no Medicaid, Medicare or health insurance that pays for it.
I don't even think medicines should be patented. I think government should fund the R&D and then let all companies manufacture and sell it at a low price. Then it is even better than medicare.
The government can sell the production method to other countries for a profit though.
That's not government's role unless you're in Soviet Russia. The free market drives more innovation than government ever has. The Lada and the Yugo are great examples of what happens when central planners try to run things. Besides, if people aren't going to profit from their hard work, what's the incentive for hard work? I don't write code every day for the good of humanity, I do it to try and build something someone wants to buy so I can improve my lot in life. Besides, if companies don't profit, they won't keep at it, and then you have a downward economic spiral of decreasing tax revenues and ever-increasing tax rates to fund the government's work, which then drives even less economic activity because the incentive is reduced.
The role of government is to protect life, freedom and property. As soon as the government gets involved in microeconomic transactions, efficiency declines. Comparing the US Postal Service and Amtrak to Southwest Airlines and FedEx illustrates this perfectly. The government should regulate markets only as much as necessary to ensure that the markets function. They shouldn't be running the markets themselves.
The lack of tort reform is one of the key drivers to the insane cost of drugs and medical care. When a drug company has multi-million dollar exposure for every drug they produce, it's going to raise the cost. OB-GYN docs pay several hundred thousand dollars per year in malpractice insurance because one mistake has them on the hook for multi-million dollar punitive judgements.
Government should be a referee, not a player. It should provide a safety net from complete destitution, but not serve as an insurance policy against failure. The GM bailout is a great example of what government should not be doing.
When the government starts meddling in private business (i.e., pouring money into it), it distorts the market and makes it much harder for innovative new players to enter the market. The Tucker car from the late 1940s is a perfect illustration. The big three in Detroit threw around political muscle to effectively shut down a car company that built safer cars. If government had stayed out, we would have had seat belts and pop-out windshields years earlier.
"I'm from the government and I'm here to help" are still the scariest words one can hear in America.
The US government spends an enormous amount of money on medical research, and many medical treatments would simply not exist if it weren't for such Evil Communist behavior.
The market is a tool useful for some problems. It is not the alpha and the omega of society.
About $26.4 billion, or 28% of total biomedical research funding in the US, is money from the government. It's the single largest player in medical research. In areas of core principles, cancer research, and mobility-affecting illnesses, it provides 80-95% of total funding.
Government is not and has never been a referee in medical research. The drug companies would never allow it. Try removing $26.4 billion of free R&D and watch lobbyists do everything they can to stop you. Additionally, anyone who cares about sick people would join in stopping that bill. One does not simply remove $26.4 billion of medical research in the name of the free market.
Yeah, I am sure government is the right place to fund innovation! (See how productive NASA has been vs. SpaceX, for example.) And why don't you let the government run the production of cars as well? Oh wait, we tried that before. That gave us the Skoda and the Trabant in Eastern Europe. Clear examples of how productive government-sponsored innovation was.
I've sold software with a crazy markup too. Something like 100,000,000%! Because it's very cheap (far less than a penny) to duplicate the software, but I sold it online for $99.
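(For concreteness, the rough arithmetic, using an assumed per-copy duplication cost of a hundredth of a cent, since the real figure isn't published:)

    # Back-of-the-envelope markup: assumed duplication cost of $0.0001
    # per copy ("far less than a penny") against a $99 sale price.
    cost_per_copy = 0.0001
    price = 99.0
    markup_percent = (price - cost_per_copy) / cost_per_copy * 100
    print(f"{markup_percent:,.0f}%")  # ~99,000,000%, i.e. on the order of 100,000,000%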
Pharmaceutical companies do the same thing. It's not the pill that's expensive, it's designing it. Same with software.
The difference being that your software probably isn't going to save anyone's life. While a specific chemical compound designed for that purpose probably can. There is a clear moral difference.
I think the article is right in what it says but is wrong because of what it doesn't say. Apple succeeded where others had failed, and that is certainly commendable, but now we have a problem: Apple doesn't have a patent on "an iPad" meaning a device with all the individual characteristics that make an iPad an iPad and make it successful, instead they have individual patents on all the individual features.
But the individual features are the things that Apple didn't do. Yet that's what they sue over because that's how patent law is set up.
So now we see Samsung lose big in court, and popular reaction is split. Here's why: people looking at the actual facts of the case are outraged that Apple could win that way, because the actual grounds of the win had nothing to do with copying or with Samsung's actions, and everything to do with the fact that anyone with a million over-broad patents on obvious "inventions", laws of nature, and mathematics can win in court against anyone who produces a computing device; any actual copying on Samsung's part is irrelevant to that. On the other hand, we have the people who look at the result, and at the fact that Samsung's devices really do look entirely too much like Apple's, and think Samsung got what was coming to them. They ignore that, in order to do it, Apple had to adopt a long list of bully tactics which, as they've now demonstrated, they or anyone else with a sufficient patent arsenal can successfully use against their competitors (including those whose devices aren't intentional copies, because there are too many patents to possibly even attempt to avoid them all).
Nobody seems willing to say that Apple should potentially have some remedy against Samsung for actual copying, but that what they got is the wrong remedy obtained in the wrong way, not least because the same tactics can be used against anyone, whether they've done anything wrong or not.
I didn't come away from the article with the feeling that JLG is arguing software/UX patents are all right. I read it as an argument against those imbeciles who believe "Apple Never Invented Anything" and that they "merely" took what others had invented before, smashed it together and got themselves an iPad. Go to #boycottapple on Google+ and you'll meet them.
If no one else could make mayonnaise successfully and a chef came up with the process for making it, that chef should damn well get some kind of protection from the IP system for his innovation, invention, or whatever the hell you wanna call it.
That is, after all, what will make him want to share his methods instead of keeping them a secret. That's the fundamental principle of the patent system: you share, and we give you a temporary monopoly defended by law.
This patent system is obviously broken, but in the mayonnaise case it wouldn't be.
The law of protection of confidential information effectively allows a perpetual monopoly in secret information - it does not expire as would a patent. The lack of formal protection, however, means that a third party is not prevented from independently duplicating and using the secret information once it is discovered.
Yes, and that's the bargain that patents give. Either you keep it a trade secret and live in constant fear that someone will leak it someday, or you get a patent, publish all the details, and get a government-supported monopoly so that you know that for 20 years no one else can make mayonnaise like you do.
"Apple products are like a good classy restaurant or hotel chain. They take ingredients everyone has and put a lot of work into fit and finish, they make the customer feel special for a slightly higher price."[1]
And I stand by it. I'll go even further actually, I think Apple is one of the least innovative big companies. Look at all the big research labs at Microsoft, Yahoo, IBM or Google. Anyone who seriously follows this stuff knows a) Apple doesn't have a profile in the academic world and b) knows enough computer history to know Apple is claiming things invented decades ago.
UI and UX are innovations. The silly notion that only hard, technical inventions with academic papers attached are innovative is the reason why Apple has eaten everyone's lunch up till now.
"Fit and finish" is as innovative as a new algorithm, it's shocking how much of the industry still treats it as a footnote and a detail, despite the entire history of the tech world since iPhone 1 would indicate.
> The silly notion that only hard, technical inventions with academic papers attached are innovative is the reason why Apple has eaten everyone's lunch up till now.
No, the reason why Apple has eaten everyone's lunch up till now is because they're interested in being profitable, not innovative. The GP is correct in stating that Apple isn't innovative - they just take other people's innovations and monetize them by polishing things up and running effective ad campaigns to gain marketshare among the masses. And there's nothing at all wrong with that - if your primary interest is making money.
> "take other people's innovations and monetize them by polishing things up"
The devil is in the polishing up, evidently.
Your post is almost scarily indicative of the industry attitude that has allowed Apple to take over to the degree they have. Only hard, technical inventions are given any respect, and when we talk about UX we call it "polishing up", almost spitting those words out of our mouths in condescension.
Are you seriously going to hold up a clickwheel and say it wasn't innovative? Or the iPhone? Or the iPad? The fact that these products look and behave almost nothing like their progenitor technologies doesn't indicate innovation to you?
It really disturbs me how little respect we geeks have for the people who consume our products. When the general public votes with their wallet in a landslide victory for Apple, we blame them for being easily manipulable by slick ad campaigns and shiny baubles. The notion that Apple has actually satisfied a long-standing demand is somehow not allowed to enter this discourse.
I do think what Apple does is important, but I don't think profits are really distributed in proportion to contribution, because the final product is an accumulation of work done by different people and companies, but there's no real accounting mechanism to distribute the profits accordingly (patents are a largely failed attempt at one).
The best place to be in business for profitability is to do that last 10-20% that produces a finished product, and Apple is great at that. The worst place is to do the first 50%, basic science which may enable great stuff in 20 or 40 years, but won't do much for your profits today. Hence why much of Silicon Valley is based around mining uncommercialized academic and research-lab work for raw material that can be turned, with additional work, into successful products. I don't think that means the raw material wasn't necessary or important (sometimes even key) to those products, though, so just looking at profits doesn't tell you the story.
Monetization does seem to be maximized at a bottleneck or stumbling block. Like bridge tolls. Or that final thing that makes the whole worth more than the sum of its parts. Apple does seem to do both.
I would say that closeness in time to commercialization is more important than bottlenecks per se: doing research that will enable great products in 20 years is rarely lucrative, because it's difficult to capture any of that future value (especially if it's further out than the length of patents, and often even if it's within that length).
So it's smarter (if you want to make a profit) to let someone else do that, e.g. someone who's paid as a researcher and isn't trying to turn a profit, and instead look for things that are 1-3 years out. You even see it within academia; applied math pays a lot more than pure math, for example, even though both are quite important to mathematical progress.
An interesting example of your case: the art dealer. Despite doing none of the work of the art, they collect 50% of the sale. I think the missing link here is that the effort involved in innovation is S-shaped. [1] Like in your idea of original research: first the original idea or creative spark, overcoming inertia, etc. Next, there is a lot of stuff to do to get it roughed out, but many competent people who would not have the genius to originate the idea can and do help move it along. But then there is finishing and integration, and again for this you need a master (e.g., Steve Jobs). Likewise with a business. In business, the last step sometimes looks like the art dealer. A bottleneck? Yes. But being an art dealer is still a bit of work. They have to make it marketable. They have to market it. There is a needle-in-a-haystack problem. There is knowing about the haystack (the rolodex), doing the legwork to meet clients. Then running the shows, exhibiting, writing about the "value" and context of the work. Etc. The pay (like you say) is higher than what seems reasonable. (Although perhaps not as easy as it seems.)
And that wheel isn't the "click wheel", which is the touchpad surface for scrolling plus the physical buttons for clicking. The first-gen iPod had a mechanical scroll wheel and separate mechanical buttons.
I wholeheartedly agree that Apple is innovative in getting technology to actually work in a user-friendly way and combining it with excellent design, but whether that process is patent-worthy is the real question at hand.
Another problem is that many people attribute things to being invented by Apple because they first heard of them from Apple (and because they don't use non-Apple products). For example, I remember when Apple introduced hybrid graphics with a way to switch between the integrated Intel GPU and a discrete Nvidia/ATI GPU. Sony had that working before Apple, but a LOT of folks thought it was Apple that innovated it. Perhaps Apple added more polish to it, but those folks were certainly wrong.
Polishing and going the last mile is very tough (see OEMs with half-baked software and hardware), but does it deserve patent protection? Apple innovated and got rewarded by becoming the most valuable company in the world, with more than 100 billion dollars in the bank with which they can invest further in innovation instead of indulging in petty patent extortion over petty things like the bounceback effect or linking phone numbers in emails to the dialer.
Design is both how it looks and how it works. If you take the desktop WIMP UI and try to come up with a way to redesign it to work with only touch instead of mouse and keyboard, you'd already be roughly halfway to the iOS homescreen in terms of design. Instead of clicking on an icon, you touch it and the app opens; swipe to see multiple homescreens, since desktop space is limited on a phone; add in a dock at the bottom. Contrast that with Metro. So that screenshot is NOT irrelevant.
> No, the reason why Apple has eaten everyone's lunch up till now is because they're interested in being profitable, not innovative. The GP is correct in stating that Apple isn't innovative
This is decidedly backwards. From the start, Apple's focus was creating great products (by their definition of great product), and only later did they learn how to maximize their profit from them. There's a pretty good and influential taxonomy of "value disciplines": product leadership, operational excellence, and customer intimacy. Historically, Apple was good at the first and not so great at the other two. Since Jobs' return, they've really ramped up on all three.
A classic way in which companies undermine themselves is by pursuing profit through cost-cutting rather than improving product/service quality. I don't have any hard data on this, but I suspect that's what happens to initially innovative tech companies when non-engineer/designer business types take over.
So other companies are more interested in being innovative than profitable and have decided to not spend as much as Apple on marketing (and polish), and that's the reason why Apple is succeeding?
Ugh, by this logic the first person to use drop-down menus should be the only person to use it (without licensing) for 20 years. Or a hover-event on a link. Or a hyperlink. Or real-time form validation. Or neverending scrolling instead of pagination. Or facebook style side menus on mobile devices.
Certainly there is UI innovation, and certainly it can move mountains -- I think arguing against that is attacking a straw man -- no one is saying that you can't be innovative with UI. We're just saying that most/all of it isn't worthy of patent protection.
I think much of our community actually does argue that you can't be innovative with UI (see wintermute's post above).
In any case, the notion that UX is important and innovative in no way justifies the gross abuse of IP law we've seen up till now from everyone involved.
Which leads to...
> "We're just saying that most/all of it isn't worthy of patent protection."
There's a false separation here. IMO almost nothing we do in the software industry is worthy of patent protection, UI-related or otherwise. To point at UI patents and scream foul, while giving a free pass to "real" patents (ones with academic papers behind them) seems misguided.
No, or at least I believe it's more complicated than that. A logo's trademark may encompass the colors used, but that's not a legal maneuver which prevents those colors from being used anywhere else. The point of the trademark is to be able to stop competitors from making "confusingly similar" trademarks for their products or companies. I can make a cola, and I'm certainly not enjoined from using red and white in the colors on my can. If, however, I make a red can and put "Boba Cola" on it in white letters with roughly the same typography as Coca-Cola uses, I'm cruisin' for a bruisin', legally speaking.
in·no·va·tive/ˈinəˌvātiv/
Adjective:
(of a product, idea, etc.) Featuring new methods; advanced and original.
(of a person) Introducing new ideas; original and creative in thinking: "an innovative thinker".
This definition does not tie innovation to invention. I think it would be hard to argue that Apple's work is not original relative to their competitors. Apple out-innovates their competitors in marketing, in supply chain management, in product lifecycle management and in design. It's impossible to invent every (or even most) components of a general purpose computer. But to select the right pieces, assemble them in a way that maximizes user experience and market them in a way that makes them stand apart from competitors' products made with almost the same components - that's innovation, just the same way that Netflix' model for mailing DVDs (they didn't invent the mailbox, the postal service or the DVD) was extremely innovative.
Perhaps another example of original innovation: the Swiss Army knife. Look at a Victorinox. Then look at a knife, a corkscrew, scissors, a toothpick, and tweezers sitting on the desk. Sometimes an act of integration is enough to be transformative.
iOS is a great example of radical originality, even if not a breakthrough technically. The concept of the app - a lightweight, bandwidth-efficient, modular, reconfigurable element integrated into the OS - was certainly original. It was thus also highly innovative. It was reductive: smaller, lighter, less complex.
Note the features here: ARM CPU, Touch screen, Contextual media app, "Apps" button down the bottom, self contained apps which were modular, integrated into the OS. I know the OS well (RISC OS) and I've had my hands on an actual device.
It was smaller and lighter and less complex than anything else technologically possible at the time.
So I guess with your insight Apple could have saved lots of money by not buying FingerWorks, PA Semi etc and just launching a Newton with a color display and make it look pretty. Right?
If you don't see any software innovations in iOS, you're either blinded by Apple hate or unable to see further than checkboxes on feature lists.
I don't see how you managed to draw that conclusion. I didn't mention anything about money etc. I see it more like...well:
Sony have been doing this sort of shit for years. Someone invents something, Sony adds turd polish and a decent supply chain and manufacturing capacity, then takes the market share.
Apple just got better at it than Sony. There is no more story.
For ref, I neither hate nor like iOS devices - I've owned a couple and they've been pretty ok but nothing special. I can't see a single feature or innovation that didn't exist already somewhere else. The same applies for my current Windows Phone (the only innovation there is abysmal battery life - no wait my Treo 180G pioneered that in 2002).
Not to argue with your counterexample, but another example re: integration. The Ricochet from the 1990s was a 28.8 version of a 3G/4G USB modem.
But that's not a tethered iPhone, in terms of its overall ambition and functionality. Similarly, the Acorn with 8MB RAM was not an integrated multimedia device (iPod, phone, etc.), limited as it was. Let us not forget the power of the software (the YouTube app, for example).[1]
Lastly, the innovation (in part on the business side) of the App Store and ecosystem should not be completely overlooked. There is seamless delivery/monetization etc. (not just collections, but outbound to developers).
In short, there is a lot of originality in how the puzzle is put together. Some of it is like the Swiss Army example. Some of it is in the conceptualization of the user experience. Some of it, quite frankly, is execution of the physical product (manufacturing details, etc.), as I have argued before. (e.g. http://news.ycombinator.com/item?id=4435490)
I look at an Acorn and a BlackBerry plus a phone on the desk. And I look at the iPhone. The latter looks like the Victorinox, the other items like tools on the table.
EDIT:
[1] I'm not familiar with the internet integration of the Acorn apps; e.g., it is not connected online full-time unless it had a built-in Ricochet or whatever. Clearly iOS is meant to fully integrate with live information without compromising its mobility.
Actually the connectivity problem was 100% unsolved, which is what led to the end of the device. The supporting technology wasn't available, unfortunately.
The basic idea of the NewsPAD was exactly the same though.
They decided to move into the cable/STB market after this as they could deliver the same experience with the connectivity that was already there. They did this successfully for a few years in the late 90's before they marketed themselves into a hole and gave up.
Apple's innovation wasn't in "the basic idea", it was in lots of implementation details that turn that basic idea into a usable product. If you don't see this, then there is no point in arguing further.
I could just as well say that there has happened exactly zero innovation in the cell phone industry in the last 30-40 years, since "the basic idea" was shown in a science fiction series in the 60s. By doing this, I would be ignoring all the inventions necessary to make that "basic idea" into a real product.
You are ignoring the inventions needed to make a finger-based multi-touch interface possible on a small device, and accurate enough to leave out a physical keyboard.
Sure, wireless makes them much more usable, but claiming this is the reason for its success is just as silly as saying laptops weren't successful before Wifi (hint: they were!).
Without cellular data the iPhone would still be a killer device just as a cell phone with a music and video player, and the iPad with a complete office suite and other apps.
Cellular data was widely available in 1996, at least in Europe. I know because I had a PCMCIA card connected to my cell phone at that time.
But of course, you can keep twisting facts to fit your theory all day long if you want, I know nothing I can say will change your opinion.
You said: "Apple has invented or innovated precisely bugger all there.". I was simply trying to see how unreasonable such a statement is considering the companies that Apple bought and the work those companies had done in getting stuff like multi-touch to work right.
If all Apple did was turd polishing, then they could have saved themselves a lot of effort by polishing a turd called Newton (which was also quite advanced for its time) instead of inventing an entirely new user interface.
What I was saying is that listing stuff like you did: "ARM CPU, Touch screen, Contextual media app, "Apps" button down the bottom, self contained apps" and using that as an argument to why iOS has been done before, is to be unable to see further than a check list of features.
Lots of tablets had the same checklist before the iPad, and most of them were completely unusable.
I suspect that you would be just as happy with a WM6 or Symbian device as with an iOS device, since there were no innovations in iOS?
I also don't see how this relates to Sony. They had a lot of innovations, including the Walkman, co-creating the CD, 3.5" diskettes, Video 8, DAT, MiniDisc and lots of other stuff.
Apple didn't innovate stuff - they bought it in and stuck it together.
I would be happy with anything, but not necessarily impressed with it. A paradigm shift would be innovation but there isn't one.
I'm using a Windows Mobile 6.5 kernel based device to write this on ironically (WP7.5 Lumia 710).
Sony's ability was to take poor grade American products and package them up with Japanese reliability and quality. I'm considering their television range from the 1970-2000ish primarily. The rest of their "innovations" were turd polish over existing products: Stereobelt, 5.25" floppy disks, Panasonic U-Matic, Mitsubishi ProDigi, Canon Ion Disks...
Or are you saying that Sony copied the basic idea of a television, so that makes it impossible for them to have contributed any innovation to the space of TV sets? If you read the history, the invention of something like Trinitron required a lot of work, a lot of trial and failure to make the basic idea of a single-gun color TV feasible.
That work isn't about "polishing the turd called color television", that's called innovation.
I did. It wasn't that special when you've had several devices like the Treo 600, O2 XDA 2, Psion 5MX etc. beforehand and spent several years developing software for such devices.
My wife had it and I went back to an S60 device (Nokia E51).
The iPhone was just prettier and substantially less functional.
So why, if the iPhone wasn't disruptive and just pretty as you state, has practically every manufacturer followed Apple's lead? I know that haters gonna hate, but credit where it's due.
The point that keeps being made here is that innovation != invention. Inventions tend to be innovative, but innovation does not require the creation of new inventions. Otherwise one would have to argue that a chef cannot be innovative without inventing new ingredients.
I think it's better to compare Apple to a high-end restaurant. You don't get three Michelin stars just by making the customer feel special and having lots of fit and finish, the food also has to be something special.
Yet still, of course there are lots of people who will claim that those restaurants can't possibly be worth the price, and that it _has_ to be mostly about making the customers feel special; after all, they use the same basic ingredients as the restaurant around the corner.
"those restaurants can't possibly be worth the price"
First off, there is more to a dining experience than the quality of the food, so even if the food is the same a price increase can be justified. You could serve me a McDonald's hamburger for 20USD and leave me satisfied with the transaction provided the burger was not all that I was getting... That burger would not be worth 20USD, but that would not necessarily say anything about the worth of the establishment itself.
Regardless, the question is not if the expensive food at high quality restaurants is particularly good, but rather "What is the relationship between quality and price?"
In the case of high class establishments, the food is certainly good and the price is certainly high. Is that a linear relationship though? The 50USD burger is undoubtedly better than many 5USD burgers, but is it 10x better?
Furthermore, does higher-quality food always cause the same sort of price inflation? Or is it possible that similarly superb food, sold at a restaurant without the other things that high-class establishments offer, would likely be cheaper?
I would even dare state that, to some extent, high price can actually be one of the desirable services that a restaurant can offer. If you happen to be more concerned with appearances than (in the grand scheme of things) a small amount of money, then being expensive for expense's sake can be a feature.
But I think it's difficult to quantify quality differences like that. For instance, when watching the movie Jiro Dreams of Sushi, I had a hard time thinking that a sushi meal could be worth more than $400 (starting price). It would certainly be wasted money on me, since I'm not a big sushi fan. But to the reviewers, apparently it was worth a dedicated trip to just eat there, so they would probably say yes if asked if it was 10x better than a $40 sushi meal.
I think you're referring more to invention than innovation. I think innovation is more about bringing something new and good to the market. The last part "to the market" is the most important part in the whole "innovation" process.
There are a lot of inventions, especially in those R&D labs you mention, but they rarely come to market, or are good enough for the regular consumer to use.
The way I use the word, I usually intend it to mean covering a completely new idea. For me, execution, no matter how superior can never be innovative. It can of course be important, world changing, profitable, etc.
If I say that the ipad is not innovative, I mean precisely that there was no element of it that was an implementation of a completely new idea, but I don't mean to suggest that what Apple did with it is not amazing and transformative.
So yes, for me, innovative is usually a very high bar.
Jobs saw the mouse at Xerox and knew it was an idea whose time had come. The first mouse was innovative. The first optical mouse was innovative. The first mouse that reduced the number of buttons to one or reduced the cost to $30 was pragmatic and clever but not innovative.
It's a heck of a lot more than "fit and finish". This is just another way of dismissing what Apple does as frippery.
A Mont Blanc pen is prettier than a Bic biro, but it doesn't really write any better. A fabulous meal is probably not as good for you as a plate of boiled vegetables.
Yes, but perhaps _making things easy to use_ and _sexy hardware design_ might have something to do with their success?
The problem with the research labs at those example companies you cited is that those businesses have little incentive to introduce innovation that would compete with "yesterday's ideas" that are driving their profits.
This is why Bell Labs was so unique. They could basically do whatever they wanted (you might try to make the same claim with your example companies, perhaps) _but_ ... they also managed to release these ideas into the market. And not always to the satisfaction of AT&T. People once had to pay for UNIX. Not anymore.
Xerox PARC is another well-known case where people were "set free" to work on whatever they wanted. But their ideas did not manage to trickle out to the market very well. Instead, Microsoft got one of their key people, Excel was born and the rest is history.
Apple is _not_ an idea factory. If someone called them two-timing thieves and told us to watch our backs, I would be inclined to take it seriously. (The fact that Apple is not the idea factory is why the lawsuits are so offensive to anyone who knows anything about the history of computers. If these sort of broad patents should go to anyone, it should be people like the ones who worked at Bell Labs and Xerox PARC. But maybe patents were not their priority. Maybe they were more interested in research, or playing computer games, than money. [How many UNIX patents? 1?] Go figure.)
But Apple is a design house. And within IT, they do not have lots of competition in that area: e.g. design of hardware casings. In addition, they go to great lengths to make the great ideas (namely the flexibility and stability of UNIX-like systems) easy to use. Another area that is lacking in IT: making the good stuff (like UNIX) easy to use.
Unfortunately Apple feels the need to abuse the patent system to stay on top. It makes me think that if they didn't, they might be in for a big fall. Maybe they are surprised at their own success? And nervous about losing the top spot?
Incidentally you could argue IBM started all this software patent nonsense. Not sure many programmers would agree with you, but the number of filings and issued patents by IBM, most of them before Microsoft even had a patent department, tells the story quite clearly.
You are not going to see much innovation released from "research labs" at the likes of Microsoft or those other companies. They will not keep their patent department in the dark. Those guys want to keep their jobs, not take risks. "Microsoft Research" or "Google Labs" are not Bell Labs or Xerox PARC. It's a wonder that something like Kinect was even made into a product. And you could see how nervous they were about it.
Today, the "labs" and the idea factory is the world wild web.
The funny part, considering how everyone complains about not really getting to "own" their device, is bringing up AT&T as a good guy. The company who literally would not let anyone own a telephone.
They owned the network. And they wanted to control devices that could be used on it. (There may have once been legitimate reasons for this.)
Apple wants to control your devices. How you use them after your purchase. The network you use to obtain content. And even the content you download: you don't own it, they license it to you. There have never been any legitimate reasons for all this and there never will be.
Whenever I'm using "enterprise" software or some equally-awful tool that's sold using a feature checklist, I can't help but think about Steve Jobs's line about how innovation is saying no to 1000 things.
edit: Well, a feature checklist and $50,000 worth of sales dinners, games of golf and "gifts that do not violate the professional ethics rules of the company buying said tool."
"Enterprise software" is about selling a $10,000,000 piece of software to one entity which requires specific features (and usually complex logic).
Apple's mantra is selling a $1 piece of software to 10,000,000 people who require a simple piece of software.
The two are perfectly valid. In fact the latter would not exist if it wasn't for enterprise software (such as CAD systems, inventory, supply chain management etc).
Have you purchased any "enterprise" software lately? The idea that it's customized, bespoke software addressing specific requirements of the customer couldn't be farther from the truth. Here's how it works at my company. Some "architect" decides we need to have an Identity Management System and consults his latest Gartner report. He looks for the packages in the Magic Quadrant, and contacts the vendors. Then we buy what's the safest, least threatening application we can, for an exorbitant price. Hire consultants to help deploy it, discovering that it's a pile of dung that doesn't do what it advertises, much less what we actually need to do.
This happens all the time in the "enterprise" world. Whether it's sold by IBM, BMC, etc etc.
Don't conflate "enterprise" software with anything complex or intelligent. It's typically some of the worst software you can find.
Actually I build enterprise software and am an enterprise architect at a large company :) Please don't tar us all with the same brush.
It's not all like that. You've been stung by a shitty purchasing and architecture team, probably put together from people who've ascended the ranks to the point they no longer understand the technology and want to suck up to management and wheel around the country in a company rollerskate with a ThinkPad and a caffeine problem. These are called ivory tower architects or, as we call them, asshat-itects.
Enterprise should mean certain guarantees about scalability and reliability and ability to adapt to the organisation.
If it doesn't, you've bought a lemon, not a piece of enterprise software.
Note: there are more lemons than not so you have to be careful.
> Are you saying that Apple is succeeding because their customers are their users?
Yes, and they have different priorities than the CTO or IT manager, who may select enterprise software by feature checklist (or other priorities) rather than end-user experience.
Its an old idea to remove everything until just the essence is left. Apple succeed in identifying markets that was making over complex products.
The silly part is that complexity goes in cycles. New products need to differential themselves with old ones, so they add new features. After a while, you end up with a car stereo with 20 buttons, and suddenly a "new" competitor comes out with a clean design with just 3 buttons and the 20 button design looks ill-designed in comparison.
Stereos are one of the earlier examples, but you can see the same phenomenon in web design, with today's white, clean designs versus the old dark, complex ones.
I think simplicity isn't the only story, it's usable simplicity. If simple was the only game in town, Apple would only need the iPod shuffle and something like John's Phone (http://johnsphones.org/) to rule the world.
The iPod and iPhone have added lots and lots of functionality over the years, yet the interface has stayed simple. It's possible to add lots of new features in ways that don't make the interface more complex, but it's a big challenge.
You can add new features without adding interface complexity, but it's harder to differentiate products if the user can't see any visible difference between one's old product now going for $15, the competitor's new $30-99 products, and one's own new product going for $100. Abstract arguments like "it's faster! more features than before!" are a much harder sell when the two devices look and feel the same.
One way is to sue all competitors and have the only store available present the new product exclusively, but that only goes so far.
As for John's Phone, it has buttons. As interfaces go, it still looks complex compared to a smartphone. The simplest phone design is one with only one or zero buttons, like the third-generation iPod Shuffle, if it had been a phone.
The iPaq was/is a fantastic device. You could attach peripherals. Try doing that with an iPad. I could put an entire OS on a CF Card and assuming I could boot from the card, the expanded functionality is limited only by the hardware specs. They are durable. I've seen consumer electronics businesses still using them to track inventory.
I wish HP would revive the iPaq.
My only imagined use for an iPad is as a portable display. I want the Retina quality, but I have more powerful hardware to attach and I need a real keyboard. There's nothing the iPad can do that my open, unlocked hardware cannot do.
I believe it was even possible to attach a real keyboard to an iPaq. That's the kind of flexibility I want. I can get data into and out of the device in any number of ways, without hassle.
I understand the comment. Of course the thought has crossed my mind. But I'm not so sure there's any evidence to support it.
I actually test some of my ideas with people like your mother, and surprisingly (why should I be surprised?) they have little trouble catching on.
What's really amusing is that these things that I have them doing are things that many nerds cannot themselves do. I've got them using systems and techniques that many nerds won't touch because they think it's too "hard core". It's hilarious.
There are lots and lots of unfounded assumptions about what users can and cannot do.
There are facts, supported by evidence. And then there are assumptions. One requires a bit of work. The other is effortless: you just hit "Submit".
> I understand the comment. Of course the thought has crossed my mind. But I'm not so sure there's any evidence to support it.
Your post is vague enough that I have a hard time parsing it. No evidence for what? That there is such a thing as a "non techy" user? The evidence is overwhelming, including anecdotal evidence from practically every "techy" person here who has ever had to help their family/friends with a computer issue.
And it's not a question of ability, it's a question of ability + caring enough. The non-tech people might not care to boot an OS from a CF Card -- certainly not enough to seek out how to learn it.
To mom-type:
Here's an iPad, you can use it to easily email your friends, check facebook, and check the web.
or...
Here's an iPaq 2012, you can do all the above, but it's a little harder to use, HOWEVER you can boot any linux distribution you want, add a USB device, attach ANY keyboard -- like one with mechanical switches! I personally prefer the Cherry Blues, but you might want to try the Topre ones. People who use those never go back. They cost a bit more (~$250), and you have to get it imported from Asia or find a U.S. distributor... but it's worth it.
anecdotal evidence indeed. that's precisely the point. there is no empirical evidence behind the vast majority of comments like these. conclusions without evidence or any indication of the methods used to arrive at them.
linux? are you kidding? this is exactly what i was referring to: assumptions. how did you conclude that by booting an os i meant linux?
Actually that's not anecdotal evidence. Anecdotal evidence looks like this:
"It has been my experience that XXX"
The parent has made a different formulation, specifically:
"Every person belonging to group Y has had experience XXX"
This difference is significant, because the argument is basically stating that it is not only the experience of the person making the argument, but that the person making the argument is expecting that the readers of the argument are going to be able to confirm the experience for themselves. This is a much stronger argument than mere anecdote.
And for those that are already starting to lean on their keyboards to type "the plural of anecdote is not data", that platitude is a recognition that data is supposed to repose on a generalisable sample of reality, and if you are just going on anecdote, even multiple anecdotes, you are leaving yourself wide open to claims of cherry-picking. But this claim does not cherry pick; it says that a vast majority of "techy" people should be able to confirm the claim from their own experience.
the original comment referred to a market for "people like [me]" being "not large". my response was that i have not seen any evidence to support that sort of claim. but i'm not even sure i know what he meant by "people like [me]". i had to assume i knew. the problem with assumptions is they can be wrong.
and i'm not sure i understand the reference to "techy people". i never mentioned such a group. i mentioned "people like the [commenter's] mom". presumably (another assumption), she's not a "techy person", whatever that is. but maybe i'm not a "techy person" either. what is the definition of "techy person" anyway? would the definition differ based on the person defining it? maybe i see no distinction between "techy" and "not techy". maybe i only see differences in how much a given person understands about what computers can do, and how to make computers do those things.
> that's precisely the point. there is no empirical evidence behind the vast majority of comments like these.
Are you being purposefully vague? No empirical evidence of what? I'll counter that there is overwhelming evidence, but I'll share specifics once I know what your claim is. :)
no empirical evidence to support the original statement he made: "the market for people like [me] is not very large"
for one, what does "people like me" mean? people who can make use of non-apple hardware? what sort of uses? i don't know what he meant. i could take a guess. but then i would be making an _assumption_. and i might be dead wrong.
and that's what you did in your comment. you made some assumptions. what were they?
i already told you one: you assumed the os a "mom-type" would run would be various linux distros. what if it's not linux?
here's my guess: we're debating whether mdonahue's statement "people like you are not a large market, unfortunately" is true.
however as i pointed out, we haven't agreed on what "people like me" means. we cannot debate this statement until we have agreed on a definition for that. then we have to consider what is meant by a "large market". what is a "large market"? then we have to decide whether what is asserted in his statement, if true, is "unfortunate" or not. or maybe we can skip that since it seems like just a mdonahue opinion.
is this explanation still too vague for you? i'm not sure how much more specific i can get.
Did they fully understand the reasoning behind the techniques you taught, or was it just memorization?
I have taught my mom how to do certain things, but she has no intuition. As soon as something is slightly wrong or different, she gets stuck and can't move on. The solution is always something simple, like relaunching the app, installing an update, power cycling the computer, jiggling the usb cord, modifying the permissions on a file... but there are only so many contingency plans I can teach her.
Maybe I just suck at teaching, but I think that technical people have an incredible curiosity and comfort with troubleshooting that people like my mom don't. We are basically playing on our computers.
With the iPad, my mom is finally playing too. She is really adept at it. Sending photos, checking facebook, downloading new apps, she was never comfortable doing any of this on the computer. Too many choices and settings and things to potentially screw up that she would be paralyzed, unable to explore and try things.
Ultimately I think the home button is the most important thing in the iOS ecosystem. If all else fails, go home and everything will be fine*.
(Unless your battery dies, or there is lint in the charging port, or your screen shatters, or you muted it, or you turned on airplane mode... it's not perfect...)
I think reasoning, in addition to basic instructions, is important even though some people might not care about it.
By leaving it out you deny those who do care an opportunity to learn.
And to me it just seems more respectful when someone asks you to do something and tells you why you are doing it than if they just give you bare instructions. (That said, the bare instructions should be able to stand on their own. They had better work, every time.)
Moreover, providing reasoning forces you to demonstrate you know the subject matter well enough to be able to explain it.
This article is unintentionally an excellent argument against patent protection for Apple's products.
If Apple is a company that uniquely has the talent and taste of a good chef, then patent protection is unnecessary. They will be able to continually outdo other companies that don't have the same talent.
Arguing that a company has so much talent and is so successful that it needs legal protection seems absurd to me.
I think the point he was making was: once you see a master chef make mayonnaise you realize the reason you failed. You tried too hard. And now you can make mayonnaise.
Tablet makers (iPaq, MS, etc.) tried too hard. Apple showed them not to try too hard. And now they can make tablets too.
So yes, the excellent chef can be the first to make mayonnaise; however, without patents on mayonnaise, so can everyone else now. The idea of a patent is that the master chef spent years learning not to mix the ingredients too hard, and now that he has shown the world the way to make mayonnaise, the only thing stopping the world from stealing the fruits of his work is patents.
The point being, if Apple never created the iPhone and iPad, we would still have phones like the iPaq and the clunky MS tablets in the year 2012. I wholeheartedly agree with this.
If it wasn't for Apple, every other phone and tablet company today would not be making anything of the flavor of the iPhone or iPad. Android and Win Mobile Phone are of the flavor of Apple.
So why is it right for them to taste like "L'Atelier de Joel Robuchon" (Apple) in the year 2012 when they would still taste like "Taco Bell" (MS, iPaq, etc) if it wasn't for Jobs' iPhone and iPad? Obviously, without a parallel universe for us to visit together, I can't prove this to you. However, I do feel the overwhelming facts of 20 years of failure by iPaq, MS, etc show the trajectory they were heading for, and we can guess where they would be in the year 2012:
- Stylus pen, or clunky touch screen that you have to press very hard.
- Lots of ram, lots of CPU power.
- Very heavy
- Very large and thick
- Poor quality materials
- Mediocre software that does not come anywhere close to the current Android software.
- Buggy software
- No app store, lots of viruses and other security issues.
- Expensive and running full Windows 8, no RT version.
So if the products Samsung and MS were selling today fit the above recipe, I agree Apple shouldn't be suing them. But this isn't what is happening. Apple is being robbed blind. Every technique and recipe Apple created is being meticulously stolen and engineered into Android and windows phones. These are recipes that Jobs, Ive and the whole Apple company put years of effort into, they poured their heart and soul into these recipes.
The reason so much of the tech media doesn't see it this way is that they have no taste buds. Most people have terrible taste buds. Well, they have great unconscious taste buds, but their conscious taste buds are almost worthless. So they see things like the iPaq and those old MS Tablets and think, "Well gee golly, that food sure was tasty; sure, Android is tastier, but not much tastier. Apple doesn't really deserve much credit for Android's improved taste."
As someone with excellent taste buds (I am a UX/Product designer; I have predicted the success and failure of almost every major tech product that has come out in the past 10 years; you can read the comments in my HN history where I defended the iPad when almost 99% of HN thought it was stupid at launch; I predicted the failure of the Zune, the success of Apple as a whole back in 2004, and MS' current decline), I can tell you with a fair amount of certainty how differently the iPaq and Android taste.
What android and Win Phone are doing is sneaky. It is so sneaky you don't realize how many subtle but important details they have stolen. They can do this because you don't have the conscious taste buds necessary to notice it.
These details may be subtle, but they are far from small. If you ever watch a grandmaster play chess, every time he makes a great move you think to yourself, "Gee golly, that was an obvious move." No. No, it was not. Once you see a chess move, you can no longer look at it objectively. The way to objectively judge a chess move is to try to figure it out on your own before someone shows it to you. After spending hours and hours and more hours looking for this move before finding it, you then fully appreciate the move.
The tech media is watching a chess game from the sidelines: Apple is the grandmaster, and Android and Windows Phone are the people building a database of the grandmaster's moves to beat people at chess.
Most people won't be able to comprehend or believe the next sentence: a non Expert Chef or food critic (Expert Product Designer) will never be able to appreciate the gigantic chasm between the taste of Apple's products and any of their competitors; however, on an unconscious level everyone will be drawn to Apple's taste so strongly that if the competitors don't copy it, soon Apple will have no competitors still in business. This is the natural monopoly the iPod almost had for a few years. In 2010, "The latest research by NPD Group claims that iPod had a 76 percent share of the MP3 player market in US in May this year."[0] This monopoly would have held if Samsung, Google and Microsoft had not spent the past 5 years perfecting their ability to hire Expert Product Designers to steal Apple's recipes. From 2001, when the first iPod came out, until about 2010, MS, Google and Samsung went through a phase of denial, trial (trying to steal ideas) and then finally some success (actually stealing ideas). It took them about a decade just to get good at stealing ideas from Apple. The Zune was an example of how hard it is to steal ideas from Apple. A non Expert Product Designer would think the Zune was good thievery; however, it was a sloppy job. It wasn't until Windows Phone and Android that these companies started to be good at stealing ideas.
You are going to hate me for saying this, and you won't agree with me, but the truth is that you don't, on a conscious level, know what is going on. You are Unconsciously Incompetent. And Android and Win Phone are using this to their advantage. It's the same way Europeans stole land from Native Americans. The Native Americans were Unconsciously Incompetent when it came to the idea of "owning land". They saw the land as owning them. Thus they signed documents that seemed to have little importance. This is what Android and Win Phone are doing. They are using you to steal from Apple.
That said, Jiro [1] doesn't patent his sushi; he simply makes the best sushi. And he does quite well for himself. If I were Apple, I would spend a lot more of my time in the kitchen making the best sushi in the world and a lot less time in the courtroom.
Note: I am sorry if I come off as arrogant by calling myself an "Expert Product Designer" and claim most people won't understand what I understand, but I don't say this out of arrogance, rather out of fact.
I have spent more than a decade becoming as competent as I am in Product Design. This is no small feat. In college I was a talented student when it came to Physics, in particular my introduction to Quantum Physics and special relativity class. If I had chosen to pursue the path of Quantum Physics, I am quite confident I would be an Expert Quantum Physicist today.
And if that were so, it would not be arrogant for me to say such things as: "light is both a particle and a wave; most people will never appreciate how amazing this is; only expert Quantum Physicists, such as myself, will come close to appreciating this statement's full glory." This would not be arrogant; rather, it would be a fact.
The sad fact is that "Product/UX Design" doesn't get the same respect as the hard sciences. From my point of view it should. I use just as much, if not more, of my brain power to wrap my head around design solutions when I am designing a product as I did when I studied the particle-wave duality of light and the intricate details of space and time learned through special relativity.
So what's the takeaway here? Are you saying that Apple deserves to own the concept of, essentially, not being shitty, just because they were the first to make a good tablet? Yes, Steve Jobs' vision inspired his competitors to make better products. That is the single most fundamental tenet of capitalism: competition drives all the players in a market to do better. Apple does not have a legal right to remain the market leader: they've shown the competition what customers want, and if they want to remain the leader it is their responsibility to continue to produce better products. If every market leader could sue their competition for making decent products, things would get pretty stagnant pretty fast.
> That said, Jiro [1] doesn't patent his sushi; he simply makes the best sushi. And he does quite well for himself. If I were Apple, I would spend a lot more of my time in the kitchen making the best sushi in the world and a lot less time in the courtroom.
What I don't agree with is people saying there was prior art. No. Sorry, there was no prior art for most of Apple's inventions, if you take the subtle details into account.
So how can I agree with you and the above statement? I think we need patent reform. I believe in capitalism. Our current patent laws are anti-capitalism; they are pro-corporatism.
If you take the subtle details into account, no one is copying Apple's inventions either (just listen to any iOS enthusiast on Android scrolling, for example).
Anyone can come out tomorrow and write their own search engine. Like Bing. Hell, Bing has even been caught red handed copying. Yet still Google makes money. Anyone remember Google suing Microsoft over that? Nope.
Maybe Apple needs to find a better business model whereby they can thrive and out-innovate their competitors? That's what Google has to do; is Apple the special kid needing special treatment? What if Google started getting mad and suing every search engine competitor for infringing on their instant search patents and other search patents? And without Google, you'd be searching and hoping like you did in the 90's. Imagine that: a company that innovated, innovates, and doesn't try to sue their competition silly!!
If I hear one more primadonna talk about how Apple is getting ripped off, I'm going to explode. Please, get over yourself.
I suggest that you follow your own advice. Your post is laden with cognitive bias and fallacies, and is actually a little bit offensive. Here's the thing: no matter how much you whine that Samsung/Google were innovating by copying, you are wrong. It's plagiarism, plain and simple. This does not foster "fair" competition. It is not innovation. It does not offer consumers reasonable choice; it just makes the plagiarisers rich off the back of doing little intellectual work. The irony of course is that copying is hard and copying well is extremely hard. It's easier to come up with your own solutions.
Now go off and explode in a sealed room. That much bile won't be a pleasant sight.
And you talk as if iOS didn't copy from Android (notifications). That's right, Apple stole Google's innovation. I believe your ignoring of this elephant in the room is a logical fallacy.
>Software is all zeroes and ones, after all. The quantity and order may vary, but that’s about it. Hardware is just protons, neutrons, electrons and photons buzzing around, nothing original. Apple didn’t “invent” anything, the iPad is simply their variation, their interpretation of the well-known tablet recipe.
You can make cookies and cakes from the same raw ingredients, doesn't mean that the person who invented cookies also invented cake.
Also the 'Software is all zeroes and ones' argument irks me. It's the bridge between an expensive paperweight and a practical device.
There was a clear before and after when they launched the iPhone.
By NAILING it, Apple managed to popularize existing tech like the modern touch-screen interface, the app developer economy & handheld computing.
So enough with this "Apple hasn't done anything" BS. The reason they print money is that they keep making future tech accessible. I hope the iPhone 5 (6?) lives up to their record.
I think this article makes some good points but overlooks the crucial question: what inventions are patentable? Like a great chef combining known ingredients, Apple has made excellent products, but Apple has not produced new innovation that is worthy of patent protection. Apple's path to success should be winning customers, just like the chefs he describes, not patent litigation.
The examples given by the OP actually support this point. It's true that Einstein didn't discover relativity first; both Lorentz and Poincaré had worked out the mathematics, but Einstein articulated the concepts best. I often use this case study as an example of how innovations often arise independently from different inventors simultaneously because the conditions are right. The other famous and illustrative example is Newton and Leibniz inventing calculus independently.
As a VC, I see this all the time - the market conditions are right for a new idea, and suddenly 4 or 5 companies appear doing variations on the same thing, none aware of the others. Let good execution and the market decide which one is best, not the date on a patent filing for something each came up with on their own.
Einstein understood the meaning. Poincaré and Lorentz just understood the machinery. That is a fundamental difference that let Einstein go beyond special relativity to general relativity. Big, big difference. The difference between Friendster and Facebook.
I am of the same opinion as the author. While the Apple vs. Samsung trial was going on, many people said every iPhone feature existed before. But the iPhone has the right mix of them in the right proportions, and that has made it tick. With respect to patenting rectangles with rounded corners etc., I too disagree with Apple. But Samsung violating trade dress is not acceptable either.
Got to love HN: if someone copies a startup pixel by pixel, business model and all, hulla, the lynch mob readies its pitchforks. If someone copies a freelancer's design, website or logo, whoa, people run amok. But if someone copies Apple's UI, hardware, apps and app market, it's "Apple Never Invented Anything."
You are right, but Apple's definition of marketing doesn't end with the billboards or the TV commercials.
Every time you see someone wearing those white earbuds, that's marketing. When your mom is always on her iPad, despite 20 years of lessons from you on how to use a computer, that's marketing. When you are sitting in an airport and you notice the glow of the Apple logo on the back of someone's MacBook Air, that's marketing. When the same dude is still using his MacBook, 6 hours into a flight, without a power cord, that's marketing. When you walk into an Apple store and you feel the aluminum unibody... when you unwrap the sturdy yet smooth packaging and the bottom slides out to reveal the device... when the computer boots directly to a gray Apple logo on a white screen, instead of a DOS prompt for a few seconds, that's marketing.
It takes a lot of work to do Apple style marketing.
I completely agree. Apple's marketing is genius. But, the comment I was replying to seems to think that Apple invented effective marketing, and that's just not true. They do it really, really well, but you gotta be chugging the kool-aid to think that they invented the concept of using their products to advertise their brand.
There is no other company that can market itself like Apple.
No other company can just show their product without having to say anything about it. That makes Apple unique in such a fundamental way, which is exactly why they are the most valuable (or one of the most valuable) companies in the world.
Apple didn't even invent the idea - Bose has been doing it for years.
Bose products sound good, yes, but not as good as their price tag. Bose also severely restricts the way you can use the product: no bass/treble knobs, for example. Very similar to Apple.
Most people here seem to argue about whether or not Apple has the right to hold patents on what they did (and whether that should grant them protection). But where this argument is flawed is that protection from being copied is not only centered around the patent system. There are multiple ways you can protect your inventions: through trade secrets, through supply chain efficiency that nobody else can reproduce, or through a software environment which is worth more as a whole than the sum of its parts.
We know Apple is already using all of the above techniques to protect their business. They have an excellent supply chain, they obviously have preferential trade partners which enable them to access new technologies before anyone else, and they have an iOS system which is well rounded for its purpose.
Net, they really do not need to leverage patent protection. That could be an indication that they do not think they are going to be very innovative down the road, and therefore they just want to keep competitors as far away as possible until they pull their act together again... or that they have too many lawyers with too much time on their hands to investigate how much harm they could do with claims based on air and smoke.
When a company resorts to such practices, it is usually not a good sign.
God, this is depressing, lots of whiny fandroids who can't even f---ing read.
These morons have taken the article title as a literal synopsis; it is being ironic (and yes, we know that Americans famously don't get irony).
The authors (one of whom is a former Apple exec) are arguing that what Apple did was invention and does indeed deserve protection from the legion of lazy wannabes who would just otherwise simply copy from the smartest kid in class.
Right, because once that chef figured out just the right twist on those common ingredients no other chefs in the world were allowed to make mayonnaise any more and we would have no fine cuisine without extensive IP protection enforced by law.
It's a metaphor. You don't take metaphors seriously; they're just there to tell a cute little story, and then you move on to the real thing. Certainly the iPad involved a little more than a twist.
A better article on the Counternotions site, "Why Apple doesn’t do 'Concept Products'"
"Concept products are like essays, musings in 3D. They are incomplete promises. Shipping products, by contrast, are brutally honest deliveries. You get what’s delivered. They live and die by their own design constraints. To the extent they are successful, they do advance the art and science of design and manufacturing by exposing the balance between fantasy and capability."
Note this: Apple spends substantially less (as a fraction of revenue) on R&D, and gets substantially more, in terms of outstanding products, compared to every other tech company. What is it that they're doing so well?
Nothing besides polished systems and good hardware. But I do not like the signal this sends to tech companies around the world: that their patents on new tech that is life-changing for everyone are worth less than a pinch-to-zoom on a touch screen. Now, I don't think it was an obvious gesture, but I do not think it should be covered by a patent. There should be a better way to protect new UI elements.
Do you have some links referring to this? Specifically the R&D expenditures for Apple's direct competitors? Say, Asus, Acer, Lenovo, Samsung's smartphone division? You can't lump all of "tech" into one basket and compare it to Apple. You have to keep it aligned with companies that are in the same industry, and then exclude extraneous R&D like Samsung's washing machine or refrigerator business.
It doesn't feel very inflammatory, just reiterating a point that seems very contentious to some: that the composition of separate parts that Apple does is innovation.
Anyway, I very much like JLG, and I liked and agree with the article's push.