The whole reason the HTML world took off while the existing technologies lingered is that amateurs could grab whatever they could and run with it - incompetence be damned! Innovation often happens when amateurs find accessible technology and bend it in strange ways that make professionals cringe. Professionals are then needed for security, scaling, cost reduction and all the things that require maturity - but had they been in charge of innovating, they would instead have refined the state of the art to new and still-stagnating levels of perfection. Hurray for amateur enthusiasts who do interesting new things because no one told them it was the wrong way to do it!
I agree. Many underestimate how much the lack of strictness helped early users. You could start with plain text and add a few <br> tags, or maybe a <p> (without end tags), to create paragraphs.
Anyone with minimal knowledge could start sharing content. If they had to pass any kind of validation, many would not have bothered. Yes, it was a mess, but it helped us get enough information online to reach critical mass. I would rather have a web full of messy HTML than FTP servers filled with Word documents or a proprietary AOL/MSN.
The thing about the web is that it was for amateurs. Things like separation of content and style are great for engineers creating larger-scale maintainable sites, but they're absolutely not helpful for new users. Being able to take a text file and add a couple of tags around some text you want to emphasise is a positive advantage for such individuals. It rewards small experiments. Understanding CSS is, by contrast, quite the undertaking.
Hypertext had been around for quite some time before the web, and there are still some academics I know who seem to scratch their heads at the success of such a simplistic, impure form. To me, its simplicity is a reason for its success, while projects like Xanadu made a fundamental error in not considering the importance of the barrier to entry.
That last sentence reminds me of a quote from Ivan Sutherland:
When asked, "How could you possibly have done the first interactive graphics program, the first non-procedural programming language, the first object oriented software system, all in one year?" Ivan replied: "Well, I didn't know it was hard."
"[N]one of this answers the original question: why do we have an <img> element? Why not an <icon> element? Or an <include> element? Why not a hyperlink with an include attribute, or some combination of rel values? Why an <img> element? Quite simply, because Marc Andreessen shipped one, and shipping code wins."
I think a large number of these problems came about because instead of issuing an RFC for proposed solutions, they just did it. You had a number of renegade developers focused on shipping instead of developing consensus.
Given the scale of the project at the time, which had a very small user base, maybe this was a reasonable approach. It's just that as the project grew in scope, the process never seemed to change, until it became what is now the W3C in all of its absurdist glory.
The IETF seems intimately familiar with the technical problems they are solving, but the W3C seems oblivious to even the most superficial implications of their decisions. I honestly doubt that even one person on the W3C committee has designed a modern, high-exposure web site.
The alternatives are not much better (though overloading <a> tags is more elegant and would probably have eliminated the later mess of needing <embed> tags and plugins for additional types of media).
Even now we have <img>, <audio>, <video>, etc. instead of, say, a single well-behaved and extensible media tag that could also be used to include content from another page.
I wonder if this was a concession to the limited browser code and slow networks of the time. With the <img> tag you know it's going to be a JPEG, GIF, or PNG. If it were a generic media tag, you'd have no idea what it is. Imagine downloading a 10 MB QuickTime file or Flash object on dial-up just so the MIME header could be read while you wait for the page to render. It's only recently that anyone has bothered to handle video in the browser.
Could web servers back then just send the client the header of a file? I think these guys were working with a lot of ugly limitations that we've only recently overcome. Andreessen recently said that he expected the back and forward buttons to be temporary, but no one thought of a good replacement for them. We still haven't.
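On the first question, the protocol actually has long had an answer: HTTP/1.0 already defined a HEAD method, which returns only the status line and headers (Content-Type, Content-Length, and so on) with no body, so a client can learn what a resource is before committing to a download. A minimal sketch of that exchange - the response here is canned rather than fetched from a real server:

```javascript
// Build a minimal HTTP/1.0 HEAD request for a resource.
function buildHeadRequest(host, path) {
  return `HEAD ${path} HTTP/1.0\r\nHost: ${host}\r\n\r\n`;
}

// Parse the status line and headers out of a raw HTTP response.
function parseHeaders(rawResponse) {
  const head = rawResponse.split("\r\n\r\n")[0];
  const lines = head.split("\r\n");
  const headers = {};
  for (const line of lines.slice(1)) { // skip the status line
    const i = line.indexOf(":");
    headers[line.slice(0, i).trim().toLowerCase()] = line.slice(i + 1).trim();
  }
  return headers;
}

// What a server might answer for a 10 MB QuickTime file:
const sample =
  "HTTP/1.0 200 OK\r\n" +
  "Content-Type: video/quicktime\r\n" +
  "Content-Length: 10485760\r\n" +
  "\r\n";

const headers = parseHeaders(sample);
console.log(headers["content-type"]); // → video/quicktime
console.log(Number(headers["content-length"]) / (1024 * 1024), "MB"); // → 10 MB
```

So a mid-90s browser could in principle have asked first and skipped the download on dial-up; the real cost was an extra round trip per resource, which mattered a lot on those links.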
> Actually, maybe we should think about a general-purpose procedural
> graphics language within which we can embed arbitrary hyperlinks
> attached to icons, images, or text, or anything. Has anyone else seen
> Intermedia's capabilities wrt this? It's one of their most impressive
> capabilities, actually.
>
> Something like a cross between PostScript and CGM might work...
> actually, maybe we should just use one or the other, and add the
> extensions we need for the links. Also we'd want to make sure that
> it's completely editable.
I didn't read that as Berners-Lee saying he thought it was a bad idea. I read it as him not wanting to alter HTML any more at that point until sometime later, i.e., "I don't want to change HTML now if I can help it, until it has gone to RFC track".
Then Andreessen's response wasn't trying to force the issue; he was heavily suggesting it at that point so that maybe all future instances of the IMG tag would work the same way when HTML2 began. He was pushing for consistency from the beginning because, as he says, essentially everyone was doing their own version of IMG somehow anyway. Plus he was agreeing with the Berners-Lee statement I quoted above.
As a final bonus, if I'm reading it right, Berners-Lee basically suggested image maps in the message that Andreessen was agreeing with. If Berners-Lee was expanding on the idea of images in HTML, then maybe he didn't necessarily see the suggested implementation as a bad idea.
He actually said HTML2, but given that we are at revision 5 going on 6 of HTML (with the addition of CSS and ECMAScript) while we have held pretty steadfast at HTTP/1.1, I think that, when talking about the web, HTTP got more right out of the chute than HTML did.
If Kay really was talking about the presentation layer of the web, this seems like a reasonable argument for it.
The technologies that could be called "the internet" solve more well defined problems than the technologies that could be called "the web".
TCP/IP for example is a description of an electronic locomotive network. It ferries packets around. What is in the packets? Who knows? Who cares? What about privacy, authentication, authenticity? Not our problem!
The web, on the other hand, is a mishmash of ad hoc parts: HTTP, HTTPS, HTML, WebSockets, the DOM, etc. Are we a client/server document protocol? Are we a distributed runtime? Do we guarantee secure communication? Perhaps. Are we stateless? Technically, but state tracking is what people use us for! We're shipping a Turing-complete language around everywhere - is this an integral part of the web, or just something we slapped on top for good measure?
It's interesting to note that choosing not to worry about those things ("privacy, authentication, authenticity") was very much a design choice - perhaps the most consequential design choice ever?
...and it's a shame that the WWW is good enough for people to attempt to kludge something on top of it, and continually fail, while not being bad enough for anyone to succeed in launching alternatives that fix those problems.
See also SMTP which has flooded the Internet with huge amounts of garbage traffic but which is simple and works well enough that any alternatives fail.
I agree with your main point, but I wanted to add that there are plenty of alternatives to sending e-mails using SMTP these days. They mostly go by names like "Facebook" or "Google+" and run over protocols like HTTPS talking to a centralised service provider, and they solve the fundamental problem -- letting people send text and picture messages to friends/colleagues for viewing at their own convenience -- in a different and mostly incompatible way.
Of course anyone who needs reliability and privacy will question the use of the specific public social networks I mentioned as examples, but the same principle applies when businesses deploy centralised in-house systems. They still handle messaging (and often a lot more besides) for their key functions using dedicated tools, so general purpose e-mail is no longer necessary for those functions. Increasingly these organisations also use VPNs to allow employees and other associates who are not physically present to hook into the same centralised systems remotely.
Sure, but I think you would be hard pressed to find a company that didn't have some form of email system for dealing with people outside the company, even if internal communication is done by some other tool. It's a network-effects problem: you could build a better email system, but you can't get rid of email completely until the replacement is as commonly used as email is.
Such a common tool needs to be highly resilient to catastrophic failure in the same way email is.
For example, the DDoS on SendGrid affected SendGrid and its customers, but the rest of us were unaffected.
The implications of a system that allowed groups like Anonymous to cripple worldwide trade at will via DDOS would be quite scary.
I agree with this too. I'm not suggesting that e-mail is dead or anything, just that a lot of use cases aren't so much being replaced by an alternative to SMTP as being replaced by another communication channel entirely.
"The technologies that could be called "the internet" solve more well defined problems than the technologies that could be called "the web"."
I'm pretty sure the original problem the web was meant to solve was the problem of enabling hyperlinks between documents that are distributed across many servers. The original concept solved this very well, scaled very well, and was then hijacked by people who thought that "document hyperlink system" means "application delivery platform."
(OK, there were a few things that happened that I glossed over, but the web's original goals were not as poorly defined as you make it seem.)
The critique is a bit unfair, because the Internet had clear design goals and criteria against which to apply rigorous engineering practice. The web, by contrast, was essentially a document sharing and linking protocol plus a document format. It looks bad mostly because it outgrew its design, but at the same time, if it had been designed correctly for its eventual use it probably never would have taken off to begin with. Granted, it has fundamental warts as well, but I ascribe that to the relative uncertainties of designing a higher-level application that has at once more constraints and more unanticipated usage.
I don't know, perhaps Kay is right, being that he is 1000 times the engineer I am, but the web just seems such an amorphous thing compared to TCP/IP.
In Europe in the 80s/early 90s, X.25 comms, full seven-layer OSI stack and X.400 email was the "real thing", standardised by OSI, mandated by government procurement and implemented by telcos. This was "commercial grade" networking and email, with TCP/IP and SMTP being derided as "academic".
The IETF were the scrappy, pragmatic upstart. "We believe in rough consensus and working code".
I've not followed many later IETF standardisations, but I did take a passing interest in CAP (Calendar Access Protocol) a few years back, since there wasn't a good open-source calendaring stack (not sure there is now, either, tbh).
It seemed to me, as a distant observer/potential implementor, that CAP would be very hard to get started on. It mandated the use of the otherwise unused-at-the-time BEEP [http://tools.ietf.org/html/rfc3080] (for good reasons - the same good reasons OSI mandated their full seven-layer stack...) and embedded unspecified parts of SQL92 (without a grammar!), and it seemed a daunting prospect. It looked to me as though they were really saying "everyone doing CAP has to really have an SQL backing store with a particular schema, and you pretty much have to pass these queries through to it".
So I see OSI->IETF->W3C as a progression over time. Not sure where to now...
[I'm sorry if I'm harsh on the people who put a lot of effort into CAP. I didn't contribute, and so I don't really have the right to criticise, but from a potential implementor's point of view it seemed daunting. I'm also not tracking any current IETF work, so I know I have a narrow view and am likely wrong in many ways.
update: ...and I've just skimmed the final RFC and they do have a grammar for the SQL subset. I recall that that wasn't present in the drafts available at the time.]
A standard should reference working code. Preferably more than one implementation by different parties. Ideally at least one implementation is open source.
This doesn't mean that standards based on working code are automatically superior, but it's less likely you'll get absolute turkeys.
I read this as Kay being unfamiliar enough with the lower level protocols to assume they're significantly cleaner than the higher level web. The “designed by professionals” era he's talking about still had major problems with security (spoofing is still too easy), reliability and performance which is why there's still new work being done tuning everything for high speed or high packet loss links. Go back just a little further and hostnames were resolved by searching a text file which people had to distribute!
Both are complex heterogeneous systems with significant backwards-compatibility challenges any time you want to fix a wart. It's easy to spot problems, hard to fix them, and as the array of failed competitors to either shows, it's surprisingly hard to design something equivalent without going through the same learning curve.
As a biologist might tell an intelligent design proponent, if you look at one and see genius design you're not looking closely enough.
Interestingly enough, Alan Kay, who has an undergraduate degree in the subject, was inspired by biology (particularly the way that field discovers and organizes knowledge) when he designed Smalltalk.
Having been involved in computers since the '70s, the internet before the WWW, and the web since, I have come to the conclusion that the Web succeeded because it was barely good enough to work for amateurs, but no better. As hypertext it sucks. As an application platform it sucks too. But for allowing the free exchange of information, it's been pretty damn good for such a crufty piece of tech.
HTML5 is being designed by "professionals" and I have never seen such a mess. Functionality is being introduced that shouldn't even be part of HTML, just look at the amount of new input types.
HTML should be a markup language, CSS should be the styling language and JavaScript should be used to handle matters such as input, processing and output. The principle of separation of concerns is very simple and elegant. In the long run it makes sure you don't get a mess of a framework in which things can be done in 50 ways, of which 25 work only in browser X and the other 25 behave differently in each browser.
You'd rather have the current situation of thousands of badly implemented datepickers and people loading the complete jQuery UI suite just for the datepicker instead of a simple <input type="date" /> ?
It doesn't solve anything. When you move such functionality into HTML, you just get "thousands" of browser-specific datepicker implementations - one in each version of each browser.
If there is a badly implemented datepicker written in JavaScript you can use a different library. A good JavaScript library will make sure it works with consistency and as expected in different browsers.
> When you move such functionality into HTML you will just get "thousands" of browser specific datepicker implementations, in each version of said browser.
My phone has one phone-optimised date picker; my PC has one PC-optimised date picker; my screen-reader has one screen-reader-optimised date picker. Sure there might be other versions for other platforms, but I as a user would only interact with three, and those three are all consistent and optimal for their situation.
Seems much better than our current variety of scripts which all target the desktop PC and fail at it, and don't even attempt to work on any other platform...
> If there is a badly implemented datepicker written in JavaScript you can use a different library.
Not if it's on somebody else's site I can't.
> A good JavaScript library will make sure it works with consistency and as expected in different browsers.
A library written and deployed today will work flawlessly with the browsers of tomorrow, including things like screen readers which the library author hasn't even thought about?
Taking your logic further, why bother having HTML parsing, layout, or rendering in the browser? In fact, if you object to the idea of semantic markup and would rather manually specify every detail of every element for every situation, why are you using HTML and not giving users a .exe? :P
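The middle ground both sides are circling is progressive enhancement: use the native control where the browser provides one, and load a script-based picker only as a fallback. A minimal sketch of the usual detection logic - the input factories below are hypothetical stand-ins so the logic can run outside a browser; in a real page you would pass something like `() => document.createElement('input')`:

```javascript
// Browsers that support <input type="date"> keep the assigned type;
// browsers that don't silently fall back to type="text". Detecting that
// difference tells you whether a script-based datepicker is needed.
function needsDatepickerFallback(makeInput) {
  const input = makeInput();
  input.setAttribute("type", "date");
  return input.type !== "date";
}

// Hypothetical stand-ins for a supporting and a non-supporting browser.
function modernInput() {
  let type = "text";
  return {
    setAttribute(name, value) { if (name === "type") type = value; },
    get type() { return type; },
  };
}
function legacyInput() {
  // Unknown input types are ignored; the control stays a text field.
  return { setAttribute() {}, type: "text" };
}

console.log(needsDatepickerFallback(modernInput)); // → false
console.log(needsDatepickerFallback(legacyInput)); // → true
```

With that check, a page can ship plain `<input type="date">` and only pull in a script library on browsers where the fallback test returns true, which is roughly the compromise the thread is arguing about.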
I do somewhat agree, but how would JavaScript be made to create native-looking controls (i.e. Unity date controls on Ubuntu, Windows-style ones on Windows 7, etc.) without adding some extensions to the DOM?
I personally dislike that CSS is turning into a miniature programming language, but I assume this is the way forward until something way better comes along and simply replaces HTML, CSS and JS. I mean, it's better to say "width - 10 pixels" than to write JS for each silly snippet like that.
Why shouldn't HTML be used for defining input forms? You are just stating that it shouldn't. CSS should be used for styling the input form and widgets because it is a styling language, but HTML is perfectly fine for defining the semantics of input forms.
"...The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs..."
The top answer attributes the difference to the web pioneers being "not standards people" - wtf? Professional standards-committee attendees aren't exactly known for technical excellence.
Actually, the underlying foundations of the WWW - primarily HTTP and hypertext - were excellently designed from the beginning. The problem is in fact that they were designed too well: it was too easy to build good-enough content that was easily accessible to a lot of users, and the real harm was done not by the amateurs but by the "wrong" kind of professionals - graphic designers and developers from other corners of the field (databases, systems, financials) who failed to understand the capabilities and specifics of this new platform.
EDIT: Removed assertion that Web was implemented excellently from the beginning; I understand that HTTP 1.0 was inefficient, and there were other early issues. But the design was sound, and the implementation improved quickly.
I disagree with the first bit. HTTP is an ambiguous, overcomplicated mess. I regularly have to wedge myself into the stack and deal with things like cookie directives, pipelining, persistent connections and utterly broken caching semantics etc. It's just horrible from end to end.
Hypertext itself is fundamentally a well-engineered concept (I mean, it worked fine for HyperCard etc.), but basing the public-facing WWW on SGML was just a plain horrible idea.
For a number of years I actually preferred Gopher, WAIS and Usenet. I still do, now that I think about it, for the sheer simplicity and the fact that they were designed to push indexes and information to you rather than to ooze marketoid vomit.
Agree with the "wrong kind of professionals" statement though.
I see where both of you are coming from, but I can't help but notice the number of competing network protocols and formats that HTTP/HTML defeated in the market. Did the web succeed despite its technical failings or because of them?
Possibly a mix. The fact that the protocols were free and unencumbered, and that reference implementations of both server and client software were provided, helped immensely as well. Any idiot could come along and either pick up working pieces or modify them. And they did.
Tim O'Reilly has had a few things to say about watching the Web take off. Its primary competition was closed systems: either entirely closed networks, or proprietary protocols, or both. He wasn't willing to bet his company on any such thing. When the Web emerged, it was clear to him that it was the solution he'd been looking for.
I used to appreciate the initial web and happily changed from desktop based application development to web based ones.
Nowadays, after so many scars from web application projects trying to bend the technology to mimic the desktop as closely as possible, with broken abstractions and browser issues, I gladly take desktop consulting projects over web ones.
It's high on the list, certainly... I might be inclined to hate more on the SOAP designers these days, but FTP does very well on the "idiotic complexity in few pages" metric.
From a readability or operational perspective? Yes, it was done much too early to take advantage of common ways of doing things that eventually emerged (I could never guess my way around FTP; there's no rhyme and little reason to it), but files seem to go back and forth pretty reliably once you've properly asked for what you want.
The Internet was not conceived to be what it has become. Having had a front row seat to watch it go from academia to what it has become today one can only be in awe of what --I'm gonna say it-- private enterprise, capitalism and the drive to succeed can produce. Yes, it is not perfect. What project at this scale is?
The explosion of the Internet into every nook and cranny was 100% due to massive private investment at all levels. It was a gold rush. Fortunes were made and lost.
EDIT: Strange words replaced. Damn iPad auto-text!
The Internet is different from the Web. It is off topic, but if you want to go into that, the Internet as distinct from the Web had an even bigger public kick-start.
Of course the two are interdependent. And both were grown to what we have today by private enterprise not government. Knowing what I know I'll venture to say that the public sector contribution to building either the web or the Internet is just-about microscopic when compared to private investment in both time and money.
Yes, there was a kick-start phase, but it was nothing compared to what came later.
Furthermore, it was done without intent, purpose or even an idea of what it could become. It was simply a communications network for the military at first and for a select few universities later.
In the TED video I linked the presenter is holding a small book that contained every single person using the Internet at the time.
What's interesting is that he registered the third Internet domain, so he was there from the very start.
I was peripherally involved with a couple of interesting private efforts in the early days of the gold rush. One of them was a group led by Phil Anschutz. He owned the right-of-way along major railway routes. They trenched the heck out of those routes to lay in massive amounts of dark fiber. A huge investment in the future that I am sure paid off handsomely.
If you are looking for the public sector to somehow be a hero in this story, well, in my opinion it simply isn't going to happen.
The public sector investment in any conceivable category is almost beyond insignificant at this stage of the game in the context of what it has taken to get here.
I have yet to see a transcript of a single Senator, Representative, President or any policy-maker with influence pre-dating the Internet and WWW that describes how they are going to take steps A, B, C and D and invest in E, F, G and H because of this thing called the "Internet" that will be a big deal.
In other words, nobody had a clue. It just happened and private enterprise made it happen. Not the public sector, regulation, or government in almost any way one could imagine.
In fact, the inter-university network was largely a product of scientists trying to come up with a way to communicate with each other. There was no government mandate or higher vision that delivered a document to these guys and said: "Lease these communications lines, connect your PDP-11/34s to them and build something that will do A, B, C, D".
I know that in pro-government or pro-public sector circles it is quite romantic to get behind the idea that we owe the Internet and the WWW to our government, but that is very far from the truth. Few things of real worth happen due to master planning. This is particularly true of things that actually work well. Yes, we are great at killing people in massive numbers.
I mean, name one Senator today who is able to articulate --and has been driving-- the next major idea in science and technology. Please don't name anyone behind solar or wind power. Private enterprise has been bleeding money for decades in those industries to get them to the point where now a Senator might actually recognize them as something interesting to pursue. I'm talking about things that do not exist today.
And, BTW, it is perfectly OK to disagree. If you are invested in your belief system it isn't my intention, to any degree imaginable, to try to change how you think. Only you can do that. I am merely stating my position here. Chances are you think I am completely wrong, and that's OK.
It just happened and private enterprise made it happen.
No actually. The total amount of private sector activity (measured in dollars) in this space may dwarf anything the public sector put in, but that's businesses setting up their own websites etc etc etc.
The amount of business involvement in the web which represents invention is tiny.
It was CompuServe, AOL and Lotus Notes that "just happened", because the private sector was completely incapable of making anything to rival the Web or the Internet.
The Internet was nurtured in the public sector and commerce was specifically excluded. Similarly, the Web was nurtured at CERN. The private sector contributed nothing.
Of course, after the Internet had been developed already, it was decided to let the private sector in.
Let's just disagree. I really don't see it your way. Businesses didn't just setup websites. The entirety of the modern internet (and web) in most countries was built privately. You would not enjoy a DSL connection today without quite literally fortunes having been put on the line to develop, install, run and sell the infrastructure.
My key point is that, if we are talking about CERN and the WWW, this was not in any way a program that originated with any one government. It was scientists trying to solve a problem. No politician devised, specified, directed, funded or pushed for the development of the WWW at CERN. It just happened, because they were trying to improve the way they communicated with each other.
To me, when someone says something akin to "the public sector created the WWW" or "the/a government created the WWW", it means something very different from scientists at a research organization trying to fix some problems who happened to create something that went big. For me to attribute anything to government, it must have a clear genesis in government.
A perfect example of this is Obama-care. There is no imaginable way to claim that Obama-care was created by the private sector. It is a clear case of something that was created by a government and handed down to us.
Claiming that the WWW was a public sector project, a government project or something we ought to hold-up high as an example of what central planning can create is --again, my opinion-- absurd. It's like saying Linux is a centrally-planned public sector/government project because so many of the early contributors were in government-funded academic institutions. I could not envision anyone saying we owe Linux to our or any government.
In Tim Berners-Lee's own words:
"The WorldWideWeb (WWW) project aims to allow all links to be made to any information anywhere. [...] The WWW project was started to allow high energy physicists to share data, news, and documentation. We are very interested in spreading the web to other areas, and having gateway servers for other data. Collaborators welcome!"
Yes, the European Commission and other public and private entities (MIT) would join forces later on to nail down protocols and such in order to ensure interoperability. Again, a natural effect. In fact, you could have taken government entirely out of the picture and this standardization would probably have happened just the same, as universities and research labs got together to make the system work more universally. It's the story of computing. There are many standards in a myriad of industries that were developed ad hoc at first and later refined and ratified by key commercial and non-commercial players in the industry.
The history of the Internet --as distinct from the WWW which one could think of as an application that runs on top of the Internet-- is a little more convoluted. Depending on how you want to interpret things it took thirty to forty years for the Internet to surface from inception in 1960/1970 in a form that enabled the WWW to be developed.
The primary motivator and driver for ARPANET had nothing whatsoever to do with us being able to buy books online. It was a purely military project from the start that mutated into civilian use at one point. No consumer-side foresight existed in its development at all. That's why it is a bit disingenuous to credit the US government with the creation of what we call the Internet today.
Much like civilian aviation had its roots in military aviation, the roots of what we call the Internet today are in military technology.
It would be insane to say that the government, in a centrally-planned sort of way, invented civilian aviation. Just the same, I don't think it is fair to credit government or the public sector with the creation and development of the modern civilian Internet. These are vastly different things.
The scary proposition here is that we ought to support every military initiative because they might develop into amazingly useful civilian projects in the future. I fundamentally reject the idea that we have to continually develop better killing tools in order to derive peace-time benefits.
I could be wrong.
The Wikipedia page on the history of the Internet is well worth reading:
Let's just disagree. I really don't see it your way
No, let's just admit you are wrong. You are entitled to your own opinion but not your own facts, however inconvenient the actual facts may be to you. Sorry.
The facts are that it was the public sector that provided the seed funding for the Internet. The private sector was useless for that. We actually have a controlled experiment, because we saw what the private sector was capable of producing, and that was crapware like AOL and Lotus Notes.
The entirety of the modern internet (and web) in most countries was built privately.
which is not only wrong but also entirely irrelevant, because that only happened after the Internet had already been created - entirely with US government funding, as it happens. That investment only happened "naturally" for the Internet, the public-sector contender that the government had built, because it was the only one good enough to invest further in.
We know that that infrastructure build-out would not have happened "naturally" for a purely private-sector-seeded system, because we saw it not happen for Lotus Notes, the private-sector contender, because it was so unpromising. Had there been only the private-sector efforts, you would not be able to buy books online at all today.
With the exception of a few places like the old AT&T labs, the private sector simply does not do really early stage projects, but the public sector, or more precisely the non-profit sector, can. This is because non-profits (almost by definition) can provide some funding for projects that don't need to be justified for a specific payoff.
In other words, the reason that the non-profit sector is better at seed funding is precisely because "The primary motivator and driver for ARPANET had nothing whatsoever to do with us being able to buy books online."
We are buying books online roughly 50 years after Paul Baran's packet switching. What's the net present value of a 50-year payoff? Which venture capitalist is going to invest in your project for that? Answer: nobody.
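The discounting argument above can be made concrete with a back-of-the-envelope calculation. All the numbers here are illustrative assumptions, not from the source: a billion-dollar payoff and a 25% annual required return are just plausible stand-ins for "a big win" and "a typical VC hurdle rate".

```python
# Illustrative only: discounting a payoff 50 years out at a
# VC-style required rate of return. Both numbers are assumptions.
payoff = 1_000_000_000   # a billion-dollar payoff, 50 years from now
rate = 0.25              # 25% annual required return
years = 50

# Standard present-value formula: PV = FV / (1 + r)^n
npv = payoff / (1 + rate) ** years
print(f"${npv:,.2f}")  # roughly $14,000 -- effectively nothing today
```

Even a billion-dollar outcome is worth only a few thousand dollars in present-value terms at that horizon, which is the point: no rational profit-seeking investor funds it.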
every military initiative because they might develop into amazingly useful civilian projects
This is not true. It's just very difficult to convince Americans that their taxes should pay for seed funding for healthcare or for buying books online, but very easy to convince them to pay up for whizzy new bombs for dropping on people who look like me. So the US military budget ends up being a large part of the US R&D budget. Given different politics, you could instead just fund R&D directly.
These are examples of economic war (Sun Tzu) tactics rather than "things of real worth".
For example, I can't think of many notable inventions that came out of any of the associations you mentioned. Your examples are simply examples of amassing a centrally controlled industrial and economic army in order to capture territory. It's war without bullets.
What I am talking about are things like finding the cure for cancer. That is not going to be centrally planned. You can throw all the money you want at the problem, government or private. There might very well be a kid who is still pooping in his diapers today who will stumble upon the solution in another twenty years. No amount of money is going to dig-up that solution until it, well, just happens. Will it happen? Sure. Eventually. But you are not going to centrally plan it. Even if you set up a government laboratory to hire every single cancer researcher coming out of every university on the planet, central planning might very well fail. For all we know it will be a marine biologist with a unique view of cell biology who will discover the solution.
If you've read a few of my posts over time it should not come as a surprise that I am a huge proponent of the private free enterprise approach. I think governments, with the passing of time, are becoming less and less relevant and are horrible blunt instruments that try to micro-manage our lives and do a really bad job of it. In the past few decades, regardless of political party affiliation, the US government has given us a perfect living example of how badly things can degenerate when politicians are solely driven by their need and desire to remain in power. Years go by and nothing happens. Or, even worse, the things that do happen are driven purely by the need to manipulate the population for votes.
Humankind has moved from feudal slavery to the kind of freedom we have today for a reason. That did not work. This works better. I think our future has far less government in the context of an educated, responsible and self-reliant citizenry using private resources to make everyone's lives better.
Of course, these things happen slowly, and they should. Too many corner cases to deal with.
Like your other interlocutor, I find your stance here too rigid.
Here's the example that struck me:
"... finding the cure for cancer. That is not going to be centrally planned. You can throw all the money you want at the problem, government or private. ... No amount of money is going to dig-up that solution until it, well, just happens."
You're setting up a strawman here which indicates you don't know how large scientific problems are solved. In particular -- it's not like there is some grand plan that starts now and ends with "the" cure for cancer. But there is a method, that can be successful in solving hard problems.
There will be one or several advisory boards of people who have basically dedicated their life to aspects of the problem (e.g., a subcommittee of the National Academy of Science). Their expertise will be staggering.
They will produce a roadmap. It will be followed (mostly) for a while, with government or private sponsors. Over time, ideas will be funded and pan out or not, the board(s) will change, roadmaps are redrawn, and directions shift.
It's not some fixed plan. There are revolutions in disciplines, when people decide that prior approaches didn't work, and just abandon them. Staying at the forefront is highly competitive, because you're basically competing with the best people in the world.
Maybe (as has been the case with cancer) people will decide that it's a harder problem than first supposed, that there are a multitude of causes, some with easier solutions and some whose solution is still unknown.
Maybe (as with numerical weather prediction) the progress will be rather amazing, and in a couple of decades you will have a robust system that people take for granted, but that was just a crazy dream, originally. Hey, that's progress for you.
Private enterprise didn't develop and test NWP. And it certainly couldn't have been solved by cottage entrepreneurs.
My overall points:
(1) There is a large class of important technical problems that are not solvable by small entrepreneurs, or even by corporations.
(2) The "big science" solutions that have been developed do not use some kind of state-planning-from-1950 paradigm -- they are much more adaptive and competitive than you seem to believe.
I agree that my perspective above is influenced by the problems I have worked on in my life, and observed others (more gifted than I) work on.
These are examples of economic war (Sun Tzu) tactics rather than "things of real worth".
So shooting up all the way from war-ravaged societies to developed first world economies in just 40 years (from the 40s and 50s to the 80s) doesn't count as "real worth"?
You are just discounting or dismissing what doesn't fit within your ideology rather than considering the evidence.
No. You simply see things through "government and the public sector are great" glasses and I don't. Perhaps you are a government worker. I don't know. I have been an entrepreneur my entire life. And my parents have been entrepreneurs their entire lives. And, you guessed it, my grandparents have been entrepreneurs their entire lives. So my family has a thread of self-reliance that spans multiple generations, cultures and continents. It is very likely that my world view is vastly different than yours across a wide range of areas. And, as I have said many times, that's OK.
No, your stubbornness is not OK. It is called a lack of intellectual integrity.
You aren't supposed to go around wearing prejudiced "glasses" based on what your grandparents did, you are supposed to try and see things as they are - and be open to revising your opinions as necessary.
I suggest you stop pretending to be Burt from Tremors long enough to try being objective. You might like it.
As it happens, I have never been a government employee and basically all my professional experience has been in the private sector. That shouldn't dictate my opinions.
The same way that the fact that I have worked in data cleaning before doesn't make me an advocate for data cleaning.
Now if I try to advocate that a public sector hospital cafeteria is just as nice as a commercial cappuccino bar, I'd look silly, because that is an example where the free market does better.
When you dismiss all that has been achieved by the tiger economies because of your sheer prejudice, you too look very silly.
It is also silly to assume that private sector = entrepreneurship and government = master planning, which is simply not true. Counter-examples:
1) A lot of the master planning that the Japanese and Koreans do is implemented by their private sector, which is clear from the links that I provided.
2) A lot of the private sector even in the US consists of large corporations who do master planning internally using ERP systems.
3) The people working in these companies are also not showing any kind of entrepreneurship or cowboy self-reliance; they are private sector bureaucrats churning out TPS reports.
4) The DARPA funding for the Internet and other CS research was government funded - but by throwing money at professors, not by politicians micromanaging the thing.
Maybe looking into Zen might help you with your world view problems.
I did the interview with Alan. Let me give the larger context. The interview was done shortly after Kay had shown a demonstration, made 50 years earlier, of one of the very first GUI programs. He wondered then, and later in this interview, why GUIs had essentially not evolved much in the intervening period, despite the huge jumps in technology in other areas.
With me, he discussed how UIs were uninviting and mystifying to people who don't know how to use them (children and seniors). In addition, he felt the Web provided very little that would foster the individual's learning on his/her own terms.
So while I suspect he'd agree with many of the comments posted here about the Web, he was not primarily speaking about protocols and bits and bytes, but about the end result: a presentation quality that only serves up what a TV can do, rather than deeply engaging individuals.
Alan Kay was talking about the program model. There's no isolated module below the document namespace, leading to monolithic web apps and a lack of user-configurability. The browser needs to run more than one program at a time to be a serious OS.
I imagine he compared the browser pretty directly to Smalltalk.
There seems to be rather a lot of consensus that the web is a mess but it is what we have.
Have there been any serious attempts at producing a better-designed solution that does not simply add new things to the spec? An argument can be made that if a better system existed it would not be able to compete because of inertia, but surely it would be worth at least making the attempt.
I have scribbled notes and designs for years now exploring ideas of what could be done with a clean slate and hindsight that was unavailable to early browser designers. I can't be alone in this. Who else has explored the possibilities of what browsers could be?
That does indeed look interesting. I shall take a look. I certainly hope there are more projects like this out there. There are many approaches that might be better or worse which might not be apparent until someone attempts it.
XHTML2 (now abandoned) was a serious effort in that direction (while sticking quite close to existing HTML) - consistent syntax and rules for handling it.
Honestly I don't think we'd see much change to what browsers can do - like Turing-complete languages, everything is possible on the web, just not necessarily as easy as it should be. A lot of parts of the web I view as "assembly languages" - it would be possible to use better alternatives to HTML, CSS, Javascript and perhaps even HTTP, but it's not like we handle any of those directly. So no one should care except a few library authors - and the libraries have mostly been written now.
It seems like people are talking about a bunch of stuff like the ability of amateurs to create a web page easily and that sort of thing.
Maybe some of that stuff helped. But I don't think those were the big reasons. Look at how the web works:
1) Web pages work on all relevant platforms. If someone creates a new browser, most existing web pages will render on it, and failures are in the nature of bug fixes rather than complete rewrites. Compare this with e.g. new operating systems and C programs.
2) If you change the web page, the update is available immediately to everyone. Compare this to the morass of outdated, incompatible and vulnerable software on your typical Windows box.
3) All the code effectively runs in a sandbox. Users generally don't have to worry about clicking on a link and having it send virus spam to everyone in their contact list.
Those are some huge advantages. Advantages that more than make up for the fact that web development is a huge pile of kludges built against each other, one that hasn't been discarded and replaced entirely only because of the massive amount of mutual interdependence.
The primary competitor to the web was Java. But Java was hampered by horrible mismanagement on the part of Sun and repeated malicious attempts to destroy it on the part of Microsoft, so the web won. Which is probably unfortunate. Imagine a web where Java (or Go) rather than Javascript was the primary language.
Agree. Beyond that, I'd say that it is more of an expression of culture than it is an expression of technology. (Carefully noting that these cannot be easily separated, anyway.) "The Web" as currently practiced is more of a craft (and, at best, an art) than it is a technical exercise.
Certainly, it is inelegant and even ugly in places: much like the tool we humans (maybe should) value most: natural language. Natural language is fraught with inconsistencies and WTFs; we have kind of made it up as we've gone along (standards academies notwithstanding).
Think of how lovely an efflorescence it is of the workings of the mind, and how well it serves one of our deepest aspirations: to connect with another thinking being and share our thoughts.
I think the most interesting question with regards to the web is how it compares to other sets of protocols/formats which try to do the same thing, not how it compares to the underlying transport (the internet).
Why did the web (i.e. HTML over HTTP) succeed where so many other document markup formats and protocols had failed in the past? Before the web we had hypertext systems, SGML, Gopher, FTP etc., but the web has overtaken them all (while indebted to all of them, of course).
HTML was the simplest solution we've yet found to a potentially very complex problem - how to share and read documents in a common format. It's certainly not perfect, but part of the reason for its popularity is this simplicity.
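That simplicity includes forgiveness: an HTML consumer accepts sloppy, early-web-style markup rather than rejecting it. A small sketch using Python's standard-library `html.parser` (the markup string is made up for illustration) shows that a page with no `<html>` wrapper, bare `<br>` tags, and unclosed `<p>` tags still parses without a single error:

```python
from html.parser import HTMLParser

# A toy event-based parser that just records the start tags it sees.
# The point: early-web-style tag soup is accepted without complaint.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

parser = TagCollector()
# No <html>, no <body>, unclosed <p> tags, a bare <br>:
parser.feed("Hello<br>world<p>New paragraph<p>Another one")
print(parser.tags)  # ['br', 'p', 'p'] -- no validation errors raised
```

Contrast this with a strict XML parser, which would reject the same input at the first unclosed tag. That tolerance is exactly what let amateurs publish without ever reading a spec.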
The allegory is nonetheless useful. There are other more concrete examples. In the technical world, there's the contrast, say of Linux with other non-free OSes, ranging from BeOS to VMS to WinNT.
There's Wikipedia vs. Brittanica or Encarta. There's free markets vs. command economies (or both ends vs. a mixed social market economy). Many 17th, 18th, and 19th century advances and institutions in both science and letters were founded significantly on the efforts of amateurs.
The deeper point is that the root of "amateur" isn't some connotation of naivety, inexperience, or lesser quality, but the Latin "amare": literally, to love. And love-of-task is a strong motivator for producing high-quality work.
Pufff, there is a tremendous bias towards 'raw functionality' in that statement: the Internet was so well done BUT FOR SECURITY issues, which were never taken into account.
This detail has hindered development to a huge extent.
One could say that the Internet was so well done for 'utopian citizens'. Not that it is bad, but we have learnt a lot for the future from those mistakes. I hope we have learnt.
The routing protocols, DNS etc (which are essential for the Internet) have so many gaping security holes that today we wonder how naive one can be when designing communication protocols...
> One could say that the Internet was so well done for 'utopian citizens'.
Even for utopian societies, some of the Internet's security problems would still be problems. The main issue is that, for the most part, the protocols assume that information provided by another source is correct. This could clearly be exploited maliciously, but it also allows small, innocent mistakes to take down the network.
For example, back in 2008, Pakistan decided to block YouTube within the country. While this specific example would not happen in a utopian society, there could be valid reasons for a sub-network to want to disable or redirect requests going to a different server.
What happened was that one of the ISPs, instead of simply blocking outgoing requests to YouTube, advertised itself as the route to YouTube. This change propagated through the network, and YouTube went down globally.
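The mechanism behind this is BGP's longest-prefix-match rule: routers prefer the most specific route they know, no matter who announced it. In the 2008 incident, Pakistan Telecom (AS17557) announced 208.65.153.0/24, a more specific prefix than YouTube's legitimate 208.65.152.0/22, so the hijack won everywhere it propagated. A toy sketch of that selection rule (the routing table here is a two-entry simplification, not real BGP):

```python
import ipaddress

# Simplified routing table: YouTube's legitimate /22 announcement
# plus the more-specific /24 that Pakistan Telecom announced in 2008.
routes = {
    ipaddress.ip_network("208.65.152.0/22"): "AS36561 (YouTube)",
    ipaddress.ip_network("208.65.153.0/24"): "AS17557 (Pakistan Telecom)",
}

def best_route(addr: str) -> str:
    """Longest-prefix match: among all routes covering the address,
    the one with the longest (most specific) prefix wins."""
    ip = ipaddress.ip_address(addr)
    matches = [net for net in routes if ip in net]
    return routes[max(matches, key=lambda n: n.prefixlen)]

# An address inside YouTube's range lands on the hijacker's route:
print(best_route("208.65.153.238"))  # AS17557 (Pakistan Telecom)
```

Nothing in the protocol checks whether AS17557 was *authorized* to originate that prefix; the "information from another source is correct" assumption is baked into route selection itself.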
Still, when these issues do come up without any malice, they should be relatively easy and painless to fix.
I saw an interesting talk recently by internet visionary Scott Shenker and I wondered if he was suggesting the opposite: that the internet is a mess and they need to learn from software people. http://www.youtube.com/watch?v=YHeyuD89n1Y
The Web and all the technologies that come with it is a special kind of hell: a Rube Goldberg machine whose level of complexity and craziness is hard to match.
The associated mediocrity is kinda jaw-dropping too: it's 2013 and the main "engine" used to run all these client-side apps in the browser is still... single-threaded. Seriously. I'm not saying "language" because JavaScript in itself has so many warts that there are several technologies out there that try to dodge the language altogether and directly generate JavaScript source code... Even Douglas Crockford himself criticizes JavaScript a lot.
But it's not just JS... The madness simply never stops: HTML, (X)HTML, CSS (oh, that one...), JS. And all the incomplete and overly complicated "standards" and, of course, all the competing implementations.
I'm always amazed that we're in 2013 and a lot of the demo pages posted on HN still cannot serve videos correctly to people who don't have Flash in their browsers. That's the current state of affairs: most web devs still aren't able to serve various video formats to please everyone. Why isn't this something that comes for free with the webapp server? Is it rocket science to encode in different formats and serve the correct one? Apparently it is. So imagine for more complicated things...
To me the most fascinating is that hardly anyone steps back and wonders "Why do we have such an incredible mess?".
But that's what we have, and a big part of software development is now for the Web, and the trend ain't stopping anytime soon: be it server-side or client-side, it's a browser's world nowadays.
So let's stop bitching, let's learn the warts and ins and outs of these uber-shitty technologies and let's get back to work ; )
Video is not a technical problem, but a legal/political/economic one (technically good codecs are covered by patents, and corporations are unwilling to switch to other codecs).
Because this wasn't (and couldn't be) centrally planned. It has very much evolved in a fashion analogous to the theory of biological evolution. Lots of experiments of different sizes and with different levels of success.
We have yet to identify a more powerful problem solving paradigm. Given enough time evolution by natural selection can produce systems of incredible complexity that actually work pretty well. Not without warts, and, yes, you have to be patient, but it works.