Well, for what it's worth, it's true at least in my case.
I honestly don't have the experience to build file system indexing, low level crypto modules, etc. I could never build a database, period.
I consider web programming to be the way of the future, so I've never bothered to learn the things I would need to fill those gaps, but at the same time, I do understand that you often have to write filesystem things to be served over the web.
Simply put, if you can do it in Python, I'm your guy -- if it needs to be C or (gasp) Assembly, I am absolutely NOT your guy (though I have dabbled, for the goofs).
While it is an overly broad, sweeping statement that simply doesn't work for a lot of developers, it is at least true in plenty of cases. I might not be representative of the average, but I'm guessing most web devs do consider web development an easier way out. I certainly did.
Yeah, because StackOverflow.com is a shining example of Jeff's ineptitude.
Jeff makes bold, out-there statements probably more than his fair share, but he sparks beneficial conversations that get people talking (even if it's about how wrong he is). You might think his conclusions are brash and useless, but he's contributing to the programming community more than he's hindering it. Personal putdowns aren't helping anybody, no matter how witty they are.
I don't think "sparking conversation" in and of itself is a good thing. Most conversation is mundane, boring, and a waste of time.
The problem with Atwood is that his popularity on the web is an example of the stupidification of web programming. People who build web apps flock to him because he has become successful on the web, not because what he says is true, enlightening, or even beneficial to their success as a hacker.
The web is popular with wannabes because they get exposure. They think they are hackers, but really they are just factory line workers assembling someone else's code.
In most forums I frequent, the majority of questions are from people asking how to get some FOSS project to work, not how to code some particular algorithm or do something hard. They don't do any research. One of the questions on Stack Overflow right now asks whether it's possible to find where SQL Server database files are stored. Seriously? If you can't do something like that, why are you trying to build a web app?? It's one of the most basic things in SQL Server.
I think at the moment, we're in a web programming bubble, fueled not by the love of money, or the love of programming, but the love of popularity. This love is shared by many because we all want to be loved.
> I don't think "sparking conversation" in and of itself is a good thing. Most conversation is mundane, boring, and a waste of time.
Agreed, but some is productive, and while I disagree with much of what he says, he often sparks productive conversation.
> In most forums I frequent, the majority of questions are from people asking how to get some FOSS project to work, not how to code some particular algorithm or do something hard.
True, but that is where most people need help. People working on the hard problems probably both know how to generate the answers themselves and know that they will find little help on those on the forums.
Ah, but it's your job to write your comments so that what you're saying is reasonably clear - and yes, I did indeed read the article before I read your original post.
I wholeheartedly disagree, mostly because there are a lot of people that think he's the next Fred Brooks. He's not. Far from it. He's teaching a new generation of developers these stupid axioms that make no sense. I didn't realize this until I had a few of the more nonsense ones parroted back at me during an interview once.
I think Jeff's post lacks humility and perspective. Generally you try not to name laws after yourself, but let's say it was some sort of humour.
I truly think he should put web programming aside for a while, and realize someone has got to write the operating systems, web servers, middleware, compilers, and tools that make everything possible.
If things are simple, it's because other people solved the hard problems for him. And sorry, but that work is more challenging and requires some "traits" not found in every engineer.
Of course writing a complex, working and reliable web application is far from trivial, but please, realize that the world is rich and vast and there is always a bigger mountain out there.
"I truly think he should put web programming aside for a while, and realize someone has got to write the operating systems, web servers, middleware, compilers, tools that make everything possible."
Sometimes on the StackOverflow podcast Joel refers to this whole area of programming as "systems programming," which I think is the correct term. Basically, it is building the things that other programmers use to build applications that normal people use. So, programming where the end user of your work is another programmer.
It would be nice if Joel reminded him of this whole area of programming again in the next podcast. It's clear in their podcasts that Joel has been around the block a few more times, and sometimes needs to fill in the larger perspective that Jeff lacks.
> It's clear in their podcasts that Joel has been around the block a few more times, and sometimes needs to fill in the larger perspective that Jeff lacks.
I think I subconsciously had the same opinion as Michael, but reading it out loud makes you realize how dumb the idea really is. Do I think Python developers are stupid just because it's easier to program in than C? No, of course not. Elitism is rarely justified based on the tools or platforms people use.
In fact, this is a pretty curious article. Both the quote it reacts to and the conclusions drawn from it are absolutely wrong (and I do think this strong wording is justified in this context).
Web programming is difficult in a different way than, say, compiler construction or kernel programming. Web programmers have to deal with malicious users, a potentially vulnerable gigantic stack of applications, sessions, databases, communication costs, and so on. One might say: web programming is maddening due to integration issues and malicious users.
By contrast, take compiler construction. Compiler construction is hard because language design is hard and because mathematical reasoning about programs and program transformations is hard. Generating code for a language is hard, too. Thus, one might say that compiler construction is hard due to mathematical issues and hard with respect to algorithms.
I do not think I would be able to build a secure, good web application anytime soon, despite having touched and worked on two large, quite complicated compilers. Thus, I think the difficulties are different.
On the other hand, 'all programming is web programming' is just ridiculous. I mean, one can easily drop a gigantic list of counterexamples: kernel programming (without kernels, NOTHING works), compiler construction (without compilers, we'd be in the stone age), games (c'mon, totally disregarding first-person shooters?), media codec implementations (video, music, ...). None of these directly interface with the web, but all of them are important (at least kernel programmers and compiler writers are crucial for ordinary boxes to somehow work). So, not all development is done for the web.
I just wonder when, and especially why, Atwood's posts went (rapidly) downhill like this.
"Web programming is difficult in a different way than, say, compiler construction or kernel programming... Web programming is maddening due to integration issues and malicious users"
To me, web programming isn't as well-defined as most programming topics, and requires competency in too many domain-specific areas for one person to master all of it.
Assuming you know what you want your language/web site to look like, compare:
Writing a parser generator, parsing the tree, optimizing the instructions, generating the instructions, regression testing, benchmarking
with:
Designing a database schema, maintaining HTML/CSS, making a pretty design, writing Javascript, safely deploying, configuring web servers, generating test environments for all browser/OS combinations, doing usability testing, and making sure you're not giving away any of your users' data.
A lot of the compiler-writing phases are yes/no (language design aside), and a lot of the web phases can't be answered easily. E.g., "Is our login system secure?" "Well, probably... Dave's pretty good with this stuff and he thinks it is."
I think that both remarkably miss the point that the true measure of software is actualized value to the end user.
Who cares about platform or depth of technical knowledge? If some PHP hobbyist who knows nothing about mmap(3) delivers a site that works for people then he is a successful programmer.
But assertions like "You hope everything doesn't "move to the web"? Wake the hell up! It's already happened!" irk me.
That is disingenuous at best. Even the most ardent believer of "everything on the web" will fire up Photoshop for even the simplest of graphic requirements. There will always be a requirement for people to know the ins and outs of desktop programming.
The way I see it, programming for the web is, for simple apps, the easiest way to build anything in any language that provides an interface and does something useful. It's also, in my opinion, the most difficult platform to develop more complex software for.
The statelessness of HTTP, and the long list of requisite skills (JS/HTML/CSS/SQL) needed to build something application-like are pretty serious. And that's before you even get into messaging, caching, and optimization. In the blog post the buffoon spends a paragraph decreeing the nightmares of web development, and the next calling those who want to develop for it "dumb." I'd say the opposite is true.
Alas, there is a stretch in the middle of the spectrum that's pretty boring for someone who enjoys programming. But I think any technology can get boring when the job becomes more a task of memorizing an API and using it the way you're supposed to than actually hacking. While C is "hard core," doing something in C with X11 is an exercise in rote memorization.
He goes on about his IDEs and debuggers. What's with the debuggers guy, too dumb to not make mistakes? What's he working with that he's so happy using, Squeak?
So yes - this guy is an idiot. If you're result oriented, the web is a great place to build an app because there are fantastic frameworks to create things quickly that anyone with a browser can use. And if you're hacking oriented, you can write your own framework.
> Who cares about platform or depth of technical knowledge?
The programmer does. There are a lot of really smart programmers that just want to work on interesting challenges. And if you're the kind of programmer that enjoys solving difficult problems, you're not going to find many when building web 2.0 apps.
Really, you believe that? The two challenges that immediately come to mind are scaling and data mining, but there are a number of others. The use cases presented by the web have pushed distributed computing further than any other single thing in computing history. The web is just the interface the user interacts with... the stuff that's going on behind the scenes is still really interesting and very challenging. Building a web app requires both sides of the equation, it's not just writing some html.
Not really, most web apps are just CRUD (and MVC if you're lucky). What percentage of all web apps (remember, this includes apps that are only ever used on a corporate intranet) ever get to tackle scaling or data mining?
There are many examples of boring corporate non-webapp programs. Visual Basic was the most popular language once and it has nothing to do with web programming.
When basic things are hard, you have fun challenges addressing basic things. When basic things are easy, you are free to do more interesting things at a higher level. Web apps are easy. Making interesting web apps that do really cool things and solve serious problems - it's as hard as any kind of app.
Your problem is that you're focusing on the same challenges and finding them easy, instead of thinking about what new things this simplicity enables. Step up a level, use the higher level tools to build better things - and you'll be challenged again. And you'll make better things.
After seeing what Google Docs is capable of, it is only a matter of time before even things like Photoshop will be web oriented.
Just the other day I was looking for a Visio replacement; on a hunch I asked here if somebody knew of a web-based version of it, and I got two viable suggestions. Now, 5 years ago that would have been laughed at with 'that will never happen'. Now it is there.
Give it another 5 years and who knows. Maybe the next iteration of the GIMP is going to be a service, not a local program.
I hold absolutely nothing for impossible, I've been wrong too often in the past :)
note to my fanclub: instead of downmodding try to explain why you think this is wrong.
That may all be true in the future. In fact, browsers as a rule seem to be evolving closer to operating systems with time.
For the time being, I find that when there is a web version and a nonweb version, the nonweb version is virtually always more feature rich, more robust, and more polished than the web version. That could easily change in the future.
One thing that is unlikely to change is that I will have more control over the nonweb version, including, amongst other things, the ability to refuse/ignore certain upgrades and to know that my data is staying on my local machine and not being sent to third parties at all. This is normally trivial for most things, but it can be very important for mission-critical items where consistency, stability, privacy, and customization are more important than being on the bleeding edge.
Of course, it is possible good solutions to even those last problems will be found, but I do not think that one is on the horizon yet.
In short, I think that there is a place for both and that for the near future more things will (and should) move onto the web. But not all of it and in particular not the most important parts.
I just have to ask... is making Photoshop online any easier than making Photoshop on the desktop?
The original comment had to do with amateur programmers who can't do anything other than write simple PHP, with complete disregard for the complexity of computer science. But when you're building a scalable commercial web service, you need to know quite a bit about compilers, concurrency, inheritance, and hardware. Probably the only major difference is that you learn about HTML + CSS + Ajax instead of MFC (or, gasp, Visual Basic).
Software is about the value to the end user, but it's also about cost. When you want to be efficient, you can't just use the simplest tool to implement the features, you need to know much more about the specifics. It just so happens that web development is easier to do poorly than desktop programming.
Just because a language is easier doesn't make it actually easier to program in. When you make it easier to build a larger structure, you naturally end up taking on larger tasks. Poor programmers can get sucked into building a gold-plated hacked up version of something they wouldn't have gold-plated in a "harder" language. (And you have not seen the depths to which a program can sink until you've seen someone create a total hack of a Python program!) Good programmers take on tougher tasks to bring more value to the user. But either way, problems in the real world are and will remain complicated enough to suck up any "easy" we can bring to a task.
Where a C programmer may be stuck in a codebase debugging memory leaks, a Python programmer may get stuck walking through a 20-level stack trace to figure out which things are blowing away some attribute on some value that he thought he set and causing the web page to fault.
When we make things "easier", we just take on bigger tasks. Programmer skill will still be the dominant factor in quality of the resulting system. In some ways the "easier" tasks are harder, as there is a lot more opportunity to make bad cost/benefit analysis due to the larger scope of the world of possibilities.
"All Programming is Web Programming" only where "All Programming" is short-hand for "all programming that Jeff Atwood comes in direct contact with," and even then it's not really true.
Does he really think that the hospitals and banks and oil companies, etc., are run largely off web apps?
It's clearly nonsense, and part of me is beginning to suspect that this "Jeff Atwood" persona is just an excuse to troll...
I worked on a contract for Petronas in Malaysia (Retail Oil Division). We built a pay-at-pump system, with a centralized processing server connecting to the different major credit card providers (Amex, Visa, MasterCard).
The server was built using Java on a BEA WebLogic app server with an Oracle backend. Each station had a mini station controller (to manage the different pumps and handle queuing and messaging to the central server). This was built with Tomcat & SQLite.
While not connected to the internet, the system was almost entirely built with web programming and hardware hacking.
When these discussions come up, focus drifts towards the "all apps should be browser apps" contention. That's silly, but it's true that more and more apps can benefit from networked components. Would I want to run Photoshop in my browser? Hell no, not for a few tech generations. But, would I like Photoshop to automagically back up my assets and edit history onto some highly available, redundantly stored file server? Definitely.
And sometimes it's IE6. No, it won't work on other browsers. No, they won't update or re-write the software - it was a nightmare to get developed the first time around.
By "Nightmare", I was thinking more of the dysfunctional tender + waterfall process that some large contracts (especially public or government) are prone to. If the development is so very painful, requests for change are not going to be looked kindly upon.
I don't know that it is nonsense, but it is an overstatement.
Personally, the vast majority of programs I use on a regular basis are desktop based, and I cannot find a good web app to replace them (some of them have web app versions, but sorely lacking in features I need):
Python,
PyDee (soon to be renamed Spyder),
MS Outlook,
MS Word,
SQL Server Management Studio (and associated suite like SSIS...),
MS Excel,
GIMP
And when I have a few free minutes for a game, it is on a disc in a PS3.
Also, when I do my own personal programming (I'm a DBA, not developer by trade, but I do little custom things for myself and friends at night), I can do them both more easily and in a more polished fashion for a desktop gui than by running them through a browser. This may change as browsers mature (and I learn more about browser based programming) but for right now I find desktop apps are almost always higher quality than the web based versions.
(The AppState is stuff like your config files, or database connections, etc., etc. You have some state in your function, you change it, and then the next invocation of your function uses the new state. You might not always write that explicitly, but it's how you think about it.)
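That mental model of threading state through each invocation can be sketched in a few lines of Python (all names here, such as AppState and handle, are hypothetical illustrations, not anything from the original post):

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class AppState:
    """Illustrative stand-in for config, connections, counters, etc."""
    hit_count: int = 0

def handle(state: AppState, request: str) -> tuple[AppState, str]:
    """Take the current AppState and a request; return the response
    together with the *new* state the next invocation will see."""
    new_state = replace(state, hit_count=state.hit_count + 1)
    response = f"hello {request}, you are visitor #{new_state.hit_count}"
    return new_state, response

state = AppState()
state, body = handle(state, "alice")  # state.hit_count is now 1
state, body = handle(state, "bob")    # state.hit_count is now 2
```

Most frameworks hide this plumbing behind globals or request objects, but as the parent comment says, it's still how you end up thinking about it.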
The only difference here is in the format of the "input" and "output". Inside, we use the same programming techniques. We solve the same complex problems, and produce the same valuable results.
(But no, he's right: I don't use UML diagrams and multiple inheritance. There are much sharper tools available to me.)
Here's a funny little tidbit about Michael Braude's post: there's more to it than what Jeff Atwood quoted and focused on in Blogging Horror. What's more, in the context of the whole post, it makes a lot of sense.
If the HN audience has one blind spot, it's the fact that not everyone is an entrepreneur, working on their brand new startup, boldly going where (hopefully) no one has gone before. Believe it or not, there really are a whole lot of people out there who choose web programming because it's easy enough. There's no shortage of companies whose primary business is not software, where you can find a job that involves hammering out JEE applications using Struts (or just plain old servlets and JSPs), EJBs, and one or more databases. I recently interviewed a candidate for a Java developer position who had spent 9 years in that same comfort zone, never venturing out of it at all.
Yes, Jeff, you did a great job with Stack Overflow. I use it frequently and I really love it. No, it's not trivial. Nothing worth doing is trivial. And no, not all programming will be web programming. Not without redefining "web" to mean something a lot different from what it means now. Case in point: game programming is (mostly) not web programming. Now can we please move on to a post that doesn't revolve around an exaggerated claim for shock value?
People like to say that the web is replacing the desktop, but is this really the case? When I look at the web applications I use on a regular basis (Amazon, craigslist, LinkedIn...) I cannot think of a desktop app they have displaced. I cannot say, for example, that "I stopped using desktop application X because Amazon is so much better". On the other hand, my trips to the book store are less frequent because of Amazon. So I would argue that web applications are popular today because they are a convenient way for people to use a range of services traditionally offered by brick and mortar businesses.
Edit: I also don't understand why people bother building Photoshop "clones" for the web. If the goal is to clone an existing application and just move it to the web, what actual problem is being solved?
When a new computing platform arrives, it usually doesn't really replace the existing applications one for one. It makes the old applications less relevant.
A blog is not a replacement for a word processor. But many of the things you might have written, printed out, copied, and distributed to people you wanted to see it, you just type into your blog and you're done with it. And blogs make new kinds of writing possible that wasn't before, meaning even less time spent in front of a word processor.
When several people needed to work on the same document before, they put a word processing document on a network share, or just emailed it around to the next person who had to work on it. Now, a wiki makes a lot more sense for editing and sharing that kind of data.
A desktop money management application can't compete with a web application that can keep your transactions up to date for all of your online financial accounts. I suppose you could implement this as a desktop application, but most of the power comes from the data being available from the web, so what's the point?
People used to use a spreadsheet or desktop database to store and process their financial data, but it makes more sense for the logic that handles your financial data to live close to that data, in a way that a lot of people can benefit from without typing in all the formulas themselves. So the use cases for the average person to use a spreadsheet diminish.
The end result is that people don't really care whether they have a direct replacement for their old apps. As long as they have a convenient way to meet the end result (communicate, manage finances), that's all that matters.
> If the goal is to clone an existing application and just move it to the web, what actual problem is being solved?
As soon as you sit down at a different computer, neither the application nor the work you did with it are available to you. That's a problem solved by hosting it on the web, though there are other ways to do it.
So, you don't bank online or pay bills online? I know I do. This is taking X desktop application away from a CSR and providing this functionality directly to customers over the web. And pretty much any ordering system done on the web has replaced the traditional desktop application that a rep would have had to use previously as you called/faxed/mailed in your order.
Also, there's mail and Office-type apps that are solely web-based as well.
Further (and this is just in my company, so take it FWIW), our ticketing system has moved to a web-based system. Same with our leave/absence tracking system (with an SAP backend). I could go on and on.
There are a ton of people who can't afford Photoshop, or don't want to. Also, downloading/installing it could actually block them from using it. Now they can just blast over to a "website" and get the basic functionality... that's a problem being solved in my mind. And a worthy one.
I, for one, get really annoyed with our "lol web programmers are teh suxxors" overlords.
As it turns out, Jeff says exactly what I'm thinking. I _am_ into compilers. I'm writing an OS as a side project. I code in Haskell for fun. But I love web programming, and these two quotes are the most succinct way I've seen it written:
> The web is the most efficient, most pervasive, most immediate distribution network for software ever created.
> As a software developer, I am happiest writing software that gets used.
As Jeff says, yeah, there's a lot of bad web apps. But there are also a lot of bad desktop apps. The "web programmers suck" crowd seems to ignore that fun little fact.
Even by Atwood standards this article is really incoherent. It's also a bit of a troll.
The original claim (that web developers do web development because they're too dumb to do real programming) is of course silly. However, building web apps is to a large extent about memorizing quirks, testing code in a dozen different browsers, and dealing with broken libraries and lousy debuggers. Maybe 2% of the time you deal with interesting problems; the other 98% of the time you're dealing with mundane ones. Not exactly intellectually challenging.
I think that to a large extent web applications are all about interface, and interface programming is almost inherently mundane. After you've worked out how everything is supposed to work on a whiteboard, the rest is "just the implementation". If you're working on desktop applications, it's very different, because the interface is mostly composed of standard controls. Back in the '80s every console application designed its own ASCII interface. Arrow key navigation? Let's build that from scratch! Essentially, web development is in the same stage. Most of the JS code is mundane boilerplate. jQuery/Prototype is a leap in the right direction, but we're still in the middle ages here.
On the desktop it's different. MS Paint? Algorithms, and lots of them! Memory management. Flicker-free drawing. Bézier approximations for line drawings. MS Paint, which is a relatively trivial desktop app, has more interesting problems than blog engines, CMS engines, Hacker News (the app), photo sharing sites, and so on.
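For instance, the Bézier approximation mentioned above is classically done with De Casteljau's algorithm: repeatedly interpolate between control points, then flatten the curve into short line segments. A rough Python sketch (an illustration of the technique, not actual MS Paint code):

```python
def de_casteljau(points, t):
    """Evaluate a Bézier curve at parameter t by repeated linear
    interpolation between adjacent control points."""
    pts = list(points)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

def approximate(points, segments=16):
    """Flatten the curve into a polyline of `segments` line segments,
    which a paint program can then draw with its plain line routine."""
    return [de_casteljau(points, i / segments) for i in range(segments + 1)]

curve = [(0, 0), (0, 1), (1, 1), (1, 0)]  # cubic control polygon
polyline = approximate(curve)             # endpoints land on (0, 0) and (1, 0)
```

More segments give a smoother curve at the cost of more line-drawing calls; a real app would pick the segment count adaptively from the curve's flatness.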
Other apps I have open right now? VMWare. PDF reader. Text editors. IM client. Browsers. Putty. All these apps had to solve a lot of interesting problems.
With web development you don't get to design your application (your web framework connects a method to an url), you don't get to pick an architecture (MVC is always the right solution), there are no algorithms to develop, there is no memory to manage, there is no state space to design.
Web development is about giving people what they want. People want inane things, and the web is the best medium to deliver this inanity. Good business? Occasionally. Difficult computer science? Hardly.
You will always need a desktop application, if only for the browser. :) I find hacking on Arora, Qt, QtWebKit and WebKit fun. And a browser will always need a desktop (of some sort, even if it is just a framebuffer).
"The reason most people want to program for the web is that they're not smart enough to do anything else"
The reason most people want to program for the web is that the web is such an interesting and huge thing happening right now; many people want to be a part of it, and many others dream of making loads of money off it.
Web programming is not inherently bad compared to desktop programming. Of course there are many bad programmers who develop websites just by reading a few ASP or PHP tutorials. But that cannot be generalized to say that all web programmers are bad. There were a lot of bad programmers in the desktop world too. (Remember VB?)
"The reason most people want to program for the web is that they're not smart enough to do anything else"
Most people are not smart (at least when we look at them from our narrow perspectives). They are everywhere - on the desktop as well as on the web.
At the same time you will see that the ratio of good programmers to bad ones is orders of magnitude higher in some specialized fields like compiler design or kernel design. Just remember that this is not a sign that all good programmers are on the desktop. It just means that smart people tend to cluster in places where other smart people are. There are places on the web where the ratio of smart people to dumb ones is higher. Think about the engineers at Google/YouTube. Cloud computing and distributed computing are hard problems too.
Once technologies like Google's Native Client take off, the web browser becomes merely the installer and updater for my native app.
And it'll be none too soon -- then we can abandon the doomed effort to somehow cobble together huge, powerful apps like Maya and Photoshop out of a mess of JavaScript and XML.
>> "then we can abandon the doomed effort to somehow cobble together huge, powerful apps like Maya and Photoshop out of a mess of JavaScript and XML."
Or, you know, we could develop better tools for the web.
I'm amazed that this discussion always takes the stance that we're stuck with what we're using for web development now, as if nobody could possibly develop better languages and tools for web use.
That's exactly my point -- NaCl would make languages and tools available to web app developers which have already proven their worth in developing complex, graphically intense application interfaces on the desktop.
That makes no sense to me. Google Native Client is just a desperate attempt to hide expensive code from other companies. How many nuclear power plants, switches, routers, undersea pipelines, servers, and UPS systems are needed to run a Facebook turd-app connected to Google DRM anyway? Talking about the slowness of JavaScript is amusing when 856 subsystems need to be operational for a Web 2.0 "Hello World".
Give me a break. Americans have no cash and live on credit, thereby creating a market for AT&T, McDonald's and Google.
Perhaps you're replying to a different comment than the one I made? I was talking about the difficulty of creating complex, responsive user interfaces in JavaScript. NaCl (if it works out) seems to open another path for developing interface-heavy apps for the web.
I love the "I like using tools like this because they are hard and it makes me feel smarter than everyone else" attitude. I understand everything on the little laundry list, including things I wish I'd never had to do (UML diagrams, for instance), but choose to work on the web. Why? Because I prefer my "hard problems" to be customer problems, not technical ones. I'm happy to work on hard technical problems when products require it, but there's plenty of difficulty just figuring out what people want and how to get it to them quickly. There's no glory in doing hard things for the sake of it. I don't need to purposely handicap myself just to prove that I'm smarter than other people.
I'm fully confident in my intelligence without playing stupid games.
Doing hard things for the sake of it, or for no other reason than to find out if you can do it, is the very essence of hacking. It is also precisely that attitude which fuels research, discovery and invention. So while I can respect your preference for solving customer issues rather than technical ones, I don't think you should diss people who prefer working on technical ones either. Are you saying there's no glory for the people who wrote the JS engines running in your customers' browsers? What about the people who wrote the operating systems on which the web server pushing out your code runs? What about the people who wrote that web server? What about the people who wrote interpreters for whichever language you're using to code your web apps?
Those people deserve more respect than being dismissed as playing "stupid games", I'm sure. Lastly, I think if you're not feeling stupid 90% of the time, you're probably not working hard enough.
You are right - doing things just because is a big part of the essence of hacking.
I fully respect all the work of people who work on "hard problems" - I even believe that I have and continue to work on "hard problems" every day. The list of things the OP used were basically "skills" not problems (virtual destruction semantics, references, pointers, etc). It's the attitude I dislike, not the desire to do hard things. The attitude that "these things are hard to understand and therefore more interesting and more worthy" is ridiculous, and shows that the writer doesn't actually understand any of the things he hasn't worked on.
I don't diss people working on hard things - I diss people who call everyone who doesn't work on what they work on weenies.
also, don't get me wrong - I'm a technical guy. 90% of each of my days is spent messing around with complicated and complex problems in code. That's what I do, and I respect everyone else who does that. That's WHY I believe the original sentiment is so flawed. Just because you use C and have to deal with tricky memory issues doesn't actually make the problems you solve any more interesting or useful. There's plenty of useless code in any language.
> "The reason most people want to program for the web is that they're not smart enough to do anything else."
The web is so deep: HTML, CSS, JavaScript, PHP, MySQL are all required to build the next web app.
In any field there are bad and good engineers. HTML and CSS are easy, but also complicated (browser support, for example). JavaScript is simple, but challenging, even with jQuery. PHP is an object-oriented language and can have class inheritance...
Web developers need to know FTP, HTTP, requests, debugging, Firebug.
Conclusion: web development is so big that you can be a stupid developer on the web or on the desktop!!!
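The "browser support" complication is real even for simple client-side code; the usual defensive move is feature detection rather than browser sniffing. A minimal sketch (the function name is mine, not from any library):

```javascript
// Detect a capability instead of guessing from the user-agent string.
// Running outside a browser (or in an old browser), this just returns false.
function supportsLocalStorage() {
  try {
    return typeof window !== "undefined" && "localStorage" in window;
  } catch (e) {
    return false; // some browsers throw on localStorage access in private mode
  }
}
```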
Jeff failed to point out the most absurd aspect of the blog post he quotes: balking at new technologies because they make life easier. Isn't that the entire point of technology? How can you be in a tech field and maintain that attitude?
I think that Braude feeds his ego by assuring himself that his ability to program a computer must prove that he's smarter than everyone else. Now that programming has become easier and anyone can do it (which is not even close to the reality of the situation), he feels less special. It's so ironic when technologists resent technological advances for this reason.
That all programmers don't need to know about "virtual methods, pointers, references, garbage collection, finalizers, pass-by-reference vs. pass-by-value, virtual C++ destructors, or the differences between C# structs and classes" is a very good thing. Lots of that stuff is ugly and overcomplicated and poorly designed and error-prone. The reason that large classes of programmers needn't pay any attention to any of it is that the state of the art has advanced and newer, better designed technologies are at our disposal now. As technologists, how can we do anything but celebrate that?
Both of these posts, Atwood's and Braude's, are complete messes.
Braude tries to compare the development of entire desktop applications with the UI development/document presentation of web applications.
Web UI development should be compared to UI development on the desktop. I wish I could use Swing/GTK/Qt for my web UIs, or write my blog with Word, but instead we have to use HTML and DOM scripting. The web is a crappy UI framework with an excellent distribution method. The internal differences of desktop applications and web applications are typically a matter of scale.
"Atwood's Law: any application that can be written in JavaScript, will eventually be written in JavaScript." Dear god, no! The browser is not the web. The future is not photoshop in a browser, it's a photoshop UI (hopefully developed with a rational desktop GUI framework) with a network connection (HTTP?) to a processing server. Of course, that's not particularly new, it's X11.
Hanging around the hacker dojo and spending a little bit of time at open source world, there are definitely people who think this is true -- that all programming and applications are web apps and will all move "into the cloud." I find the lack of acknowledgment amusing: your web app requires an operating system, a web browser, network equipment, etc. to run. Unless all of these "just work" and there is no more development left?
You mean, aside from the programming done on the 4 billion cell phones out there, all over the world. Not to mention the hundreds of millions of game consoles in developed countries. Over 10 billion ARM processors have been shipped, and you can bet that most of them won't be running web apps. How about the firmware and the server software and the operating systems which make your web programs possible?
I, for one, welcome our new Javascript-kernel overlords.
Seriously, what is he talking about? You could perhaps make the argument that GUI apps are by-and-large web-based at this point, but his argument is just as bad as the one he's arguing against: frontend development is not the only development.
2) Web hosted applications will be more reliable than anything you have at home, simply because it will be easy enough to use your home machine as a cache. The only good reasons to keep data off the net will sooner or later be transfer speeds (instant access) and privacy concerns. We're not there yet in this respect (witness the recent GAE outage) but we're moving in the right direction.
3) Constant internet connectivity is not 100% commonplace yet, but in a short while (less than a decade) an internet outage will have a similar impact to a regular power outage.
4) On the contrary, web apps are almost always already more powerful than a desktop application. Webapps build on the power of the resources embodied in the web, those resources are vast, MUCH larger than anything you or I will ever have on our desktop.
> Webapps can never be as powerful as a desktop application
You're confusing technology with user interface. The primary difference between web apps and desktop apps in the future will be the idioms they use in their user interfaces.
Offline storage and various other HTML 5 APIs mean that there will be nothing a web app can do that a desktop app can't. Web apps might never be as efficient as a desktop app -- compiled versus interpreted languages being what they are -- but efficiency is much less of a concern these days. Web apps are often quicker to write, update and iterate upon, so for any given new application, the web version may have a structural advantage over the desktop app.
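For instance, here is a minimal sketch of the kind of client-side persistence the HTML5 storage APIs allow -- the key name and functions are invented for illustration, not from any real app:

```javascript
// Save a draft locally so the app can keep working without the network.
// "draft-note" is an arbitrary key chosen for this sketch.
function saveDraft(text) {
  localStorage.setItem("draft-note", JSON.stringify({
    text: text,
    savedAt: Date.now()
  }));
}

// Returns the saved draft object, or null if nothing was stored.
function loadDraft() {
  var raw = localStorage.getItem("draft-note");
  return raw ? JSON.parse(raw) : null;
}
```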
Good point. But say you have a Photoshop webapp (with all the features the desktop version has). When you want to apply several filters, you depend on the capabilities of the server and also on its current stress load. Things might take a long time to complete or cost too much.
With a desktop app, you're in control of your own machine.
> Good point. But say you have a Photoshop webapp (with all the features the desktop version has). When you want to apply several filters, you are depending on the capabilities of the server and also on the current stress load of the servers.
Why? You can push pixels with flash and JavaScript right in the browser. I'm not saying that doing it is right (neither is the article), but you can do it.
No with a Photoshop-like web app the server is simply sending javascript to your computer where it runs in your browser. There may be some interesting things like identifying the location of your picture that happen server-side, but things like filter will usually run client side.
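To make the client-side filter point concrete: the pixel-pushing itself is plain array math, so it runs fine in the browser. A toy invert filter as a sketch -- in a real page the byte array would come from a canvas context's getImageData():

```javascript
// Invert every pixel of a flat RGBA byte array (4 bytes per pixel).
// In a browser you'd pass ctx.getImageData(...).data and putImageData it back.
function invertPixels(data) {
  for (var i = 0; i < data.length; i += 4) {
    data[i]     = 255 - data[i];     // red
    data[i + 1] = 255 - data[i + 1]; // green
    data[i + 2] = 255 - data[i + 2]; // blue
    // data[i + 3] is alpha -- leave it untouched
  }
  return data;
}
```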
Web programming, shweb programming. You have a client and a server and a way to communicate. JSON is a great way to communicate between client and server. But does it matter if the client is Java or JavaScript or a browser plugin? No. And the server side? It's whatever you want.
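A sketch of that point: the only contract the two sides share is the JSON on the wire, so either end can be anything that can produce and parse it. The method name and message shape here are invented for illustration:

```javascript
// What the client puts on the wire...
function encodeRequest(method, params) {
  return JSON.stringify({ method: method, params: params });
}

// ...and what the server (in whatever language) reconstructs from it.
function decodeRequest(raw) {
  return JSON.parse(raw);
}
```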
I remember the stacks of documentation I used to lug along from house to house (or office to office), and I wonder how the 'real' programmers today would get by without the web. Google and many other websites make for a greatly reduced effort in finding relevant information. The web has fostered communities of 'true' hackers all over the globe, working on niches so obscure that 20 years ago you'd have never met someone that even understood what it was that you were trying to achieve. It's the democratization of hacking, the barrier to entry has been lowered.
And understandably that has some folks pining for old times.
Sure, some web programming is not as exciting as finding a really elegant solution to some problem with a two page algorithm that will have everybody think ('why didn't I come up with that?').
But on that level almost all of us became 'application programmers' long ago.
The toolbox is now so filled with great instruments that we can go and make some music. There is real programming to be found on all levels of the game, more on some than on others but still. And the tools are so easy to use that plenty of people that would never dream of a programming career a decade ago are getting in on the game, and they find that they can do useful stuff.
The 'real' hackers of today build the tools that the rest of us use, and I don't think it was ever any different, just some illusion. Mostly due to the fact that you used to be able to understand everything about some branch of computing.
The big issue here is that the 'browser' really is an intelligent terminal, and sooner or later you'll see the analogy between the way we work today and the way we worked in the early 70's to 80's, before the advent of the PC.
The PC revolution is slowly being reversed by the web, more and more data is migrating 'online', more and more code is server side. The only thing that works against that trend is 'ajax', where some code migrates the other way.
So, hacking is far from dead, web programmers are 'real' programmers, there is a lot of work to do, and if you want to hack to your heart's content without feeling frustrated by all the layers, I have a stack of 68000 manuals here that are looking for a new home. Then you can unplug your internet connection and pretend the revolution never happened.
Programming will never be 'just web programming' or any other form of it, software is so pervasive it is scary. If you are on the frontiers of computer vision, AI research or some other field that seems to have no impact on the web today then I'm sure that your technology + the web some day in the future will be a more powerful combination than without.
Technology is a means to an end. To some it is a means unto itself but that's a small minority.
Well, one could lean on the anti-web/cloud-apps case a bit heavier, taking into account the bits and pieces of paranoia that easily stem from a centrally-operated system, where the admin is god, a big corp has your files, etc. Capitalism goes BOFH, I guess. Then again, centralization also evokes images of Brazil and the French Minitel system, which, all things considered, were quite cool (and the Minitel still holds some ground).
I suppose by talking of 'web programming' one immediately thinks of PHP, and thus the horror, but there are quite a few other things that are considered web programming and do not promote bad practices. Then again, if the Cloud is the wave of the future, we can but ride it out.
A really great Coding Horror post (what's going on? TC has been pretty good this week, Jeff is ranting about good stuff... ;)). Anyway:
> Writing Photoshop, Word, or Excel in JavaScript makes zero engineering sense, but it's inevitable. It will happen. In fact, it's already happening. Just look around you.
With that (very astute) comment in mind: is it time to look at JavaScript as a client-side engine?
By which I mean, if the future is going to be web based (and with Office Online and Chrome OS it is hard to argue that is not the overall "plan"), then is JS a sufficient engine to work with? Do we need something a little more powerful, with tighter bindings to the browser, the filesystem (sandboxed, obviously) and the web?
Am I the only here who is interested in subject but is uncomfortable about both sides in the argument? You know, sort of like seeing Mussolini and Pol Pot discussing human rights.
While I get and can easily relate to the OP's point about web programming, neither UML, sequence diagrams (yes, I have actually done them), the .NET sandbox nor the three-languages-clumped-into-one ugly mess (C++) gives me any excitement. The bare mention of "business logic" gives me nausea, and I want to unlearn all I knew about CIM.
I guess some stuff is just boring crap to work on and some isn't. Not a terribly useful or novel conclusion, and it would probably never make for topics as controversial and hot as those two blog posts...
Because we're all afflicted with terrible ADD after years of internet usage. We need short paragraphs with key sentences bolded and big colorful images in-between to keep our attention.
Is web programming really that much easier than desktop application programming?
Most desktop applications can use masses of memory, CPU and disk before the average user will complain. In most cases this is not possible in a web app, since the same hardware must serve 1000s of people.
I think both of these articles are wrong. This theme has occurred in several posts here in the last few days, so I'll state my "grand theory". There are several types of programming: applications development, systems programming and algorithm/data structure development.
I'm a systems programmer who is trying to pick up a greater deal of algorithm and data structure knowledge. I view applications as moving to the web but for web application to become efficient, scalable and allow for the same rich UI as a desktop application a great deal of difficult systems engineering and algorithm development had to happen first. More so than for a desktop application.
Desktop applications typically store individual user data: you could easily use an O(N) lookup structure to store it and then just write() it out to disk as a C struct. Web applications are multi-tenant. For the largest web applications, even a relational database (an incredibly complex system: transaction protocols, B+trees for storage) can't scale.
Before any reputable company puts their name on an application, the application has to be load balanced. Yet this is transparent to the user. Do typical desktop applications require a shared-nothing distributed architecture?
Now, on the other hand, web developers don't think about these issues and write in powerful, high-level languages using frameworks and DSLs which often don't even require writing SQL code. Yet they aren't doing this in a vacuum.
So really the case here is that web application development appears, on the surface, to be less challenging than desktop application development. Yet the amount of systems programming that had to happen to allow that is immense.
Am I now saying that systems programming is more challenging than applications development? No, the challenges are different. I don't have nearly the sort of patience and attention to detail that UI development requires. The process (gathering user requirements, presenting mockups) is actually more complex.
If you're looking to do lower-level work, or more algorithm/data structure development because that's what you find challenging, then don't be an applications developer (whether desktop or web). On the other hand, be aware that the demand for systems programmers isn't always as high (and getting the job is a lot more difficult in terms of the complexity of even the interview and the experience/education level required).
On the other hand, before web developers bash statically typed languages (especially C/C++), bash CS curricula that stress algorithms and data structures, or wonder why interviewers at Google/Yahoo/Facebook ask about binary trees, they should be aware just how much C/C++ code was written for them to be able to do even the simplest of SELECT statements (and if you don't know what data structures might have been traversed when you issued that statement, I won't hire you).
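For the curious: a keyed SELECT typically walks a B+tree index. A plain binary search tree shows the same idea in a few lines -- this is a toy sketch, not how any real RDBMS stores rows:

```javascript
// Toy binary search tree: a drastically simplified stand-in for the
// B+tree an RDBMS walks when you SELECT by an indexed key.
function insert(node, key, value) {
  if (node === null) return { key: key, value: value, left: null, right: null };
  if (key < node.key) node.left = insert(node.left, key, value);
  else if (key > node.key) node.right = insert(node.right, key, value);
  else node.value = value; // overwrite on duplicate key
  return node;
}

// Walk down the tree comparing keys, just like an index scan narrows pages.
function lookup(node, key) {
  while (node !== null) {
    if (key === node.key) return node.value;
    node = key < node.key ? node.left : node.right;
  }
  return undefined; // no matching row
}
```

The real thing differs mainly in fan-out: B+tree nodes hold hundreds of keys each, so a lookup touches only a handful of disk pages.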
That guy was right: a browser is a platform nowadays. That means, yes, JavaScript gains ground on Delphi, Java and .NET.
But he missed one thing: all browsers, the V8 engine, most RDBMSes and even the JVM were written in C++, and almost everything on the server side is written in C. =)
I still stand by my theory: it takes intelligence to do the simplest thing possible. If anything, the thought that web programmers are doing something easier than others is an indicator that they're smarter than "lower level" programmers.
I've only been a programmer by trade for a couple years so I don't know very much yet, but could someone please enlighten me: what does the platform have to do with how difficult a problem is? I thought platforms, languages, etc. were just tools, and the web or desktop are just user environments; how do these tools and environments determine how complex the problems are? Apps can be as trivial as you want them to be or as difficult as you want them to be, regardless of how they are accessed. At least that's what I thought, or am I making some naive newbie mistake here?
Also, it just seems like a lot of these anti-web "real" programmers don't ever discuss much about user interface design, as if that's trivial and backend programming is "real" programming because it's ugly and normal people get confused by it. Kind of reminds me of jr. high school, when our school had Apple computers, but me and the other kids who owned PCs thought we were way smarter than the Apple users, because somehow we thought that typing in a black shell window makes you smarter than someone who draws pictures with a mouse.
The reason most people want to blog is that they're not smart enough to program for the web.