
I really do think that, as time wears on, we will see more of these incidents of people switching from Node to another, more familiar development environment. It seems to me that a lot of people started using Node simply because of the amount of hype that surrounded it, and many of these decisions were not as well researched as they should've been.

Reading through this gentleman's blog post, it seems that, originally, Node was chosen for just one reason – he wanted to develop the entire application in JavaScript. Other than this, it does not seem that he took the time to research the Node development environment before deciding to develop his application in it.

With respect to the author of this post, the testing environment and the ease of developing a CRUD web application are things that should have been looked into before a single line of code intended for production had been written.

In addition to this, as several other commentators have already noted, Node is not directly comparable to Rails. Node is basically an event framework and standard library that bolts onto JavaScript. Node would be more comparable to something like Python's Twisted or Ruby's EventMachine. The author was probably using one of the frameworks available in Node such as Express.js, Railway.js, or Geddy.
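
For anyone unfamiliar with the distinction: Node itself only gives you the event loop and low-level modules like http; routing and the other web conveniences come from a framework layered on top. A minimal Express-style sketch (assuming a recent Express release; the author never says which framework, if any, he actually used) looks something like this:

    // Hypothetical minimal Express app, for illustration only.
    var express = require('express');
    var app = express();

    // One CRUD-ish route; the ORM, validations, and generators that
    // Rails bundles are all things you assemble yourself on Node.
    app.get('/posts/:id', function (req, res) {
      res.send('post ' + req.params.id);
    });

    app.listen(3000);

So the fair comparison to Rails is Express (or Railway.js, Geddy, etc.), not Node by itself.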


It seems to me a lot of this starts with "I have a general idea, and I want to get it off the ground ASAP" and transitions to "I have a focused idea, and can now revise the project accordingly"

I see nothing wrong with that, as long as everyone involved recognizes the cost of rewriting/replacing large chunks later. If you're on a super tight schedule and don't want to write anything but JS, and you can afford to make revisions later (assuming the thing even catches on!), it strikes me as pragmatic.


I think you nailed it. Different solutions are valid at different times.

Also in many cases, a website rewrite is really nothing complicated.


Actually, since the app was developed at Startup Weekend, which gives you 54 hours to make a presentable demo, I'd say testing was not the first thing on their mind.


And it's generally nowhere near 54 hours... more like 42. We nominally started at 6pm on Friday night, but teams weren't formed until 9-10pm. Pitches were demoed Sunday afternoon, and by Sunday at 6pm, 48 hours from the start on Friday, it was all over. I've no clue why SW fixates on the "54 hours" number. 42 would be far closer to the truth, but perhaps more intimidating for people?


Plus, if they already knew Rails. I use Node mostly now and wonder why people would use anything else. However, if I had a lot of expertise in Rails, I would probably use it for almost everything as well, until I was comfortable enough with Node.


Indeed, planning for 3 months to make sure you've done no wrong is much better than coding for a weekend, and determining then if you've made the right decision. Funding engineers for 3 months isn't all that expensive and everyone always makes the correct decision after 3 months of planning...


From my personal experience, Node is a really fun platform to quickly whip up a working web site/app/service over the weekend. But I keep getting this nagging feeling that it's not sustainable for a long term project with a large codebase. Other than socket.io, I really don't see Node as a general purpose web stack.


Actually we have around 20,000 lines of node in production spread over about 20 services. It's perfect for SOA.


"With respect to the author of this post, the testing environment and the ease of developing a CRUD web application are things that should have been looked into before a single line of code intended for production had been written."

The former: Golden. The latter: Eh, not so much.


"I understand that sheltering them from the real world is not your motivation for this opinion, but it is the reality of what will happen. Your kids will miss out on a huge period of personal growth and a large expansion of their experience dealing with other people which they will never be able to get back later."

It is not necessarily the case that being homeschooled automatically means a child has been sheltered from the real world. In reality, whether or not a child has missed out due to homeschooling depends heavily on the parents who are doing the homeschooling. A parent can homeschool their child and simulate a private school by integrating classes (with groups of children) from museums, libraries, and independent instructors. A parent can also homeschool their child by putting on a DVD and walking away.

I was homeschooled from pre-kindergarten through 12th grade. My parents relied on a lot of external classes to ensure that I was still able to develop the ability to interact with others. The reality for me was that I was able to have a far richer social environment as a result of my being homeschooled. By the time that I entered college, I had interacted with far more cultures and personality types than most of those around me. I went on to successfully complete my undergraduate and graduate degrees.

When I was homeschooled, I kept in contact with a small handful of friends that were also homeschooled. They all went on to do just fine in both college and in life. I realize that my experiences and those of my peers are anecdotal, but a study performed by the Discovery Institute in 2000 also provided evidence that there is no "sheltering" involved with most homeschools: http://www.discovery.org/a/3479

(Please note, I actually take issue with a few things in the Discovery Institute study, but I think that it does provide some degree of evidence that homeschooling does not automatically result in a maladjusted child.)

In my economics courses, we would sometimes take a look at the economic returns of education. I always found it strangely comical when someone would speak harshly about how homeschoolers are "sheltered" and then turn to me for moral support. It was always such a shock to everyone to hear that I had been homeschooled for my entire life. Some people even called me a liar outright. (I have always wondered: what is it about homeschooling that seems to rub people the wrong way?)

I am well-spoken, outgoing, and have an easy time dealing with people. I do not say this (or anything else in this post) to brag, but to give credit to my parents for what their sacrifices were able to accomplish for me. You say that "sheltering" is the "reality" of what will happen, but how are you so sure of this?

I believe that the results of homeschooling, much like those of other forms of schooling, are a product of the teacher, the student, and the approach. I do not disagree that sometimes, with homeschooling, some children are not able to develop social skills. What I disagree with, strongly, is your insinuation that "sheltering is the reality of homeschool". Homeschooling a child does not automatically mean that the child will struggle socially.


You seriously just cited something from the Discovery Institute, a group that believes a book trumps scientific evidence? They have a strong agenda that includes homeschooling. The article you cited is not research, it's an internally created, non-peer reviewed meta review. Not exactly high quality material.

I'm not saying your argument is necessarily wrong, just that you found some of the worst possible material to support it.


My apologies.

I never intended anyone to take the Discovery Institute paper as anything more than just a casual study of the subject (by a biased party). I certainly did not want to present it as academically rigorous on any level or as an unbiased look at the subject.

I thought that I had made this clear in my original post by using the word "study" rather than "research" or "peer-reviewed study". I really meant for it to be taken as more of a narrative account than an actual research paper. Unfortunately, it is now far too late for me to amend my original post to make this more clear.

In my defense, I typed up my post in a blaze of coffee-fueled self-righteous indignation. Mistakes tend to happen a lot when you shoot from the hip like that.


Were you homeschooled for religious reasons, though? Because especially in certain areas, many homeschooled kids are in that situation only because their parents believe the public school system will force them to believe unacceptable things: the gay agenda, evolution, Jesus wasn't an American, etc.

It's that kind of homeschooling that rubs people the wrong way, because it is specifically motivated by a desire to shelter rather than educate (it's also precisely the reason the Discovery Institute cares about this issue at all; they don't want kids growing up in a system that normalizes homosexuality, religious diversity, or an evidence-based worldview). Most of the bizarrely maladjusted homeschooled kids that I knew came out of these types of situations (though that's not to say that all kids coming out of religiously motivated homeschooling were socially inept).

When parents homeschool because they actually think their kids will learn more effectively, that's a much more positive situation that I have no problem with.


Every day, I start up a computer running OS X. Then, I proceed to fire up virtual machines for two Linux distributions, Windows XP, and Windows 7.

I occasionally play around with a headless system that I have running FreeBSD. Then I may power up my iPod Touch to test recent changes to my mobile site. I also do a significant quantity of work while I am logged into a remote Debian server via ssh.

I doubt that I am the only developer with this type of daily routine. I actually do think that this poll is pretty nifty (and I checked off OS X), but I think that it is still worth observing that the term "primary operating system" just doesn't quite carry the same weight as it used to.


I use something like: "OS X is my host operating system. I develop on Linux."


Depending on how you use it, OS X can feel a lot like Linux. If you're spending most of your time in Terminal and Emacs, usually ssh'ed into a server, then your technical answer will diverge pretty far from your "real" or metaphysical answer.


I work for multiple clients. After messing around for a few years with "I need a Postgres for this one, a MySQL for that one, and sometimes things just break on the live server because Mac OS X is only almost Linux-like", I switched to a fully virtualized development stack and haven't looked back since. So the answer is not metaphysical, but true.


I have tried something similar to this, but I always find the experience of running desktop software in a VM to be lacking.

Just things like resizing windows or minimising things become clunky enough that I prefer to just boot natively into whatever I plan to use.


I can't comment on that. I don't develop anything that requires a GUI on the machine. I just ssh into the machine and do my thing there. All editing/browser viewing is done on the host.


This is my case - whether it's a Linux VM or on the cloud, the server OS is Linux.

For my desktop, OSX provides the best combination of spatial window management (you can drag-drop nearly anything - esp. text without wiping the clipboard buffer), terminal friendliness, and MS Office (yes, Excel is still better than alternatives). Lots of hidden gems and a sustainable indie dev market.

I don't play much more than the occasional Nethack, so my need for Windows is pretty limited.


OT question for you: I'm a developer who does a lot of web development, from Perl to RoR. When I end up starting more than one VM on 4 GB of RAM, my computer literally dies with Mac OS X. Now this is all with VirtualBox, which I suspect isn't the best all around VM application.

For some reason I feel that it is the crappy memory management in OS X that is killing me.

Other than getting more RAM, do you do anything special? What VM software are you using, and how much RAM are you running?


First, even if it's the answer you ruled out: get a machine with lots of RAM. Don't worry about the rest; even an SSD is not that important. Just cram as much RAM into the machine as possible. 8GB would be good, 16 is better. Make sure that the OS has enough space to shuffle memory.

Use tiny VMs. Most development stacks do actually fit in 512MB, as long as there is nothing else running. Pay attention to which parts of your dev stack actually consume the memory. In my case, it's mostly in-memory databases. Sample those down to smaller datasets; it's good practice anyway. If it's still not enough, use odd values like 700MB. Rule out memory leaks (this is one of the big advantages of small VMs: memory leaks are easy to find).

Also, use one VM per project. Unless projects are tiny, putting two in one VM only replicates the problems of your host system.

Finally, I also use VirtualBox with Vagrant and am quite okay with it.


This is what I'm thinking; I'm kinda spoiled at home, with both my main box and my VM server each having 8 GB.

You do make a good point about the extra cruft that isn't needed for a VM. I should actually know this, as I have several LEBs on the web and optimize them heavily for low memory usage. I guess personal time < work time.


More RAM, period.

Going from 4GB to 8GB on my MBP made a huge difference running VMs. Also, get Fusion. For working in VMs all day, Fusion has worked better for me than VirtualBox.


You should try out Parallels then, as I find it much snappier than Fusion. After having tried VirtualBox, Fusion, and Parallels, I kept the last one, as it provides the best experience of the three IMO.


Why such intensely large VMs? I've gotten by with 256MB appliances that then get their source deployed into big iron on-site, or into the cloud.


> my computer literally dies with Mac OS X. Now this is all with VirtualBox, which I suspect isn't the best all around VM application.

Asked and answered. I like VirtualBox - the price is right for when I just need to run an app or two on rare occasion - but it certainly isn't the most stable or least OS-crashing VM I've ever used.


What's your machine?

More RAM would certainly help, more CPU cores (real cores, not hyper-threaded "virtual" cores) will too.


I have a quad-core i5 in my iMac with 4 GB of RAM; the boss doesn't wanna spend the money to upgrade :( Need to convince him otherwise.


What is this "crappy" memory management of OS X that you speak of? When it comes to memory management, OS X is definitely among the best in my 14 years of experience with modern operating systems. For what it's worth, I regularly run two VMs in VirtualBox totalling just over 2GB of guest RAM allowance, on a 4GB machine running 10.6.8, and I don't suffer problems with this. Users of 10.7 claim that it's a wee bit hungrier than 10.6, though I still can't recall the last time I saw anything other than "Swap used: 0 bytes" in Activity Monitor.


I can't seem to find the blog post about the memory architecture in Mac OS X, but it was rather recent; less than 8 months ago, some guy blogged about the crappy memory management used in OS X and why he ended up switching platforms.

I have yet to try 10.7; I'm stuck on 10.6 until the boss allows us to upgrade. This will probably be my next big upgrade before anything else.


Yes.

I use OS X for browsing the web, web-development, graphic design, video editing, and writing.

I use Windows 7 for music production and playing video games. At some point I want to start dabbling in game development, in which case I'll probably go with Windows for that, too.

I ssh into various Linux servers privately and at work. In my spare time I sometimes play around with Linux distros on my desktop just to learn about the current state of affairs. I usually can't see any advantages in it over OS X other than the fact that it's free software and that it runs on cheap hardware. I keep being curious though.

I don't use any virtual machines because I dislike the sluggishness and I don't really need them.

Sometimes I wish I could get by with only one OS without feeling crippled in some respect. My dream setup would probably be an OSS system that's great with multimedia stuff and has about a 99% adoption so hardware would be supported really well.

But that thought depresses me because it reminds me of the state reality is in, so I try not to have it.


> I use Windows 7 for music production

I am fascinated by this claim. The last time I looked at Windows for music production was a long time ago, but back then CoreAudio beat the pants off of ASIO for real-time work. Is that not still the case?

What software do you use? Two packages that I use heavily, Logic and DP7, are Mac-only.


I use Reaper by Cockos. I make rock music. I have previously used Cubase and Sonar.

Regarding ASIO vs. CoreAudio performance: http://www.dawbench.com/win7-v-osx-1.htm

There are a few freeware plug-ins I use that are Windows only. If I could afford to buy Altiverb 7 right now, which is OS X only, I'd probably switch to OS X (at least till Altiverb came out for Windows).


I use Reaper on Win7 as well. It is available on OSX though.

Mac is a better platform for audio but the difference is getting smaller.

It's true that there are a lot more freeware (VST) plugins on Windows. Great ones like http://varietyofsound.wordpress.com for example.


How is Mac a better platform for audio? This is not a rhetorical question. I keep hearing this, and some say it's because of CoreAudio. But so far I haven't been able to find a thorough explanation that's not based on biased assumptions.

The Variety Of Sound stuff is what I'm missing on OS X. I use those a lot.


Many musicians say this because Windows systems can, for a variety of reasons from hardware drivers (certain Firewire chipsets and motherboards) to bloatware, become very glitchy and finicky when it comes to low-latency recording. It's very difficult to predict if new hardware will work or not and it can be very time intensive to troubleshoot when the problems arise. Unless you buy from a music PC specialist, you're unlikely to encounter sympathy from support desks.

Alternatively, every Mac comes with Garageband and is built from the ground up for reliable recording - if you buy a system and you hear glitching in recordings (which I've never heard of), you can take it to the Apple store and have a technician troubleshoot the problem.

I don't think most PCs face this problem (although prevalent hardware like HDMI ports is often problematic for smooth audio recording), but musicians tend to recommend Macs because the certainty that it will work out of the box has a lot of value.


Okay, I agree.

Although I think this has a lot more weight when you talk exclusively about laptops, with which I indeed have had so many problems in the past that I would recommend a MacBook to any fellow musician asking me for advice, especially if he is going to go on tour with it.

The two DAWs I've assembled myself in the last 12 years were both super stable and performed really well. I don't think I would have gained anything by using a Mac.

In fact I'm on a machine that I recently built which dual boots into Windows 7 and Snow Leopard. Maybe I will benchmark Reaper in both of them and come to a surprising conclusion. If I do, I'll post it on Hacker News.


Out of curiosity, since you don't sound like a fanboy of any particular OS and just want to use one: what keeps you using OS X over Windows for browsing the web, webdev, and graphic design? Are there any special advantages, or is it just software that is not available for Windows?


Graphic design: not many. Just little things, like the great desktop zoom, the nifty screenshot shortcuts, simple access to special characters. Nothing that Windows couldn't do with some modifications, just maybe not as nicely. Also, I spend most of my time in OS X anyway, and I don't want to boot into Windows just for quickly creating or editing a file in Illustrator or Photoshop.

Browsing the web: font rendering. I don't like Windows' aggressive hinting and the one-dimensional anti-aliasing. Some non-standard fonts I find not only ugly, but unreadable on Windows.

Webdev: Unix underpinnings, Rails. Text rendering, again. And I feel like I'm a lot faster at switching apps and searching for stuff on OS X. But that's probably just habits I've built over time.

Overall, OS X gets in my way the least. That's why I would choose it if I had to choose just one OS. But I don't, so …


Same here. As a developer of a cross-platform library, this is the only configuration that works for us. By restricting virtualization of OS X, Apple has effectively forced most developers to use this configuration.


That used to be my routine at my old job: fire up WinXP and Win7 VMs to do .NET development and other client work.

My new job doesn't require interacting with Windows at all so I work in OSX/sshed linux sessions all day.


This graph is completely useless. It is looking at relative Google search query volume.

Of course Node.js search queries are rising. There is a lot of noise being made about the platform, the platform is relatively recent, and people are curious about it.

Of course Ruby on Rails search queries are falling. It has been out for a while, and people have become familiar with it at this point.

In other words, they don't need to google for Ruby on Rails when they can just go to one of the sites that they have bookmarked.

This is not a milestone by any stretch of the imagination.


I have to apologize. I fat-fingered the down vote when I originally meant to up vote your original comment. I think that my mistaken down vote was the one that initially sent your comment into the gray. This was unfortunate as yours was a remarkably civil post on a subject that can be somewhat controversial at times.

I am currently working on two projects written predominantly in CoffeeScript, and I completely agree with your original post. It is a mistake to think of CoffeeScript as simply another JavaScript, and it is a mistake to assume that people who are comfortable with JavaScript will be comfortable within a CoffeeScript project.

CoffeeScript is more than just JavaScript with a small sprinkling of syntactic sugar. There is a definite learning curve between JavaScript and CoffeeScript. List comprehensions, extended regular expressions, and splats - these are just a few of the things that make up that learning curve. The changes in variable scoping that CoffeeScript makes are another.
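
To make that gap concrete, here is roughly the JavaScript that two common CoffeeScript idioms expand to (hand-written approximations for illustration, not the compiler's exact output):

    // CoffeeScript: doubled = (x * 2 for x in xs)
    // expands to something like:
    var xs = [1, 2, 3];
    var doubled = (function () {
      var results = [];
      for (var i = 0; i < xs.length; i++) {
        results.push(xs[i] * 2);
      }
      return results;
    })();

    // CoffeeScript: tail = (first, rest...) -> rest
    // the splat expands to something like:
    var tail = function (first) {
      return Array.prototype.slice.call(arguments, 1);
    };

None of it is hard, but a JavaScript developer dropped into a CoffeeScript codebase still has to learn which of these expansions the compiler is doing for them.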

I think that perhaps the best practice for publishing a CoffeeScript project should probably be to append a .coffee to the name rather than the usual .js. This would go a long way towards eliminating the kind of misunderstanding that you refer to in your OP.


+1 for the idea of calling it tower.coffee instead of tower.js


Maybe I'm just weird, but all of this drama around a database does not make much sense to me.

To me, it is simple. Do your research:

* Go to https://jira.mongodb.org/ and look at the issues

* Read the documentation at http://www.mongodb.org/display/DOCS/Home

More than likely, if you have to ask the question "should I use a NoSQL db" then the answer is no - just stick with SQL. MongoDB (and most other NoSQL dbs) is a specialized tool that is fit for specific use cases only.

There is no need for all of this ridiculous hyperventilating drama.

I have heard it said that the marketing department at 10gen was not good at "managing expectations". If you are working with a database, I would hope that you do not allow your expectations to be set by the marketing department. If you don't do your due diligence then you deserve to be bitten.

As to the original "Don't Use MongoDB" post: whether it was a hoax or not was completely beside the point. Every single section in there was completely unsupported by any evidence other than the author's experience.

If you are going to talk about data loss then link to a bug report or a Google query pointing to a bug report or something. Anecdotes are not data.


"If you are working with a database, I would hope that you do not allow your expectations to be set by the marketing department. If you don't do your due diligence then you deserve to be bitten."

That's the real definition of "hard to use": you have to research everything yourself and send the product through QA just to use it.

There's a very high value in products where you don't have to do a lot of research on the implementation quality and caveats. If you start using it, and it appears to work for your needs, you won't be bitten too badly later. In my opinion, PostgreSQL is an example of such a product.

Of course there is always some opportunity to do the wrong thing. It's a question of degree.

Following your advice would essentially mean "only big companies can ever release anything" because you'd need a team of full-time people to sit around doing research and QA for libc and the kernel and everything else you depend on.


"Maybe I'm just weird, but all of this drama around a database does not make much sense to me."

Because it's the most painful point of tech. If you need to change programming languages to meet a latency requirement, that sucks. If you find your servers go down and they need rebooting, that sucks too. If you lose your data, it's gone and you can never get it back.

The troll did well to zero in on this. You don't really know how stable a database is under certain conditions until you start hitting the roadblocks. With NoSQL databases, you have less time in the wild vs. RDBMSs, so anything that indicates there are hidden gotchas is going to set potential adopters' teeth on edge.

Even better, with database issues, you don't know about them until you have them, so you often do have to rely on folk knowledge about how well they turn out in practice after months/years of deployment.

It was a well-targeted troll.


And it is an easily trollable target as well. Had he tried to attack durability on just about any other database, it probably would not have worked as well as it did.

Whether it's a troll attempt or not remains to be seen, but I completely agree with this specific, albeit very generic, part of the text.

> Databases must be right, or as-right-as-possible, b/c database mistakes are so much more severe than almost every other variation of mistake. Not only does it have the largest impact on uptime, performance, expense, and value (the inherit value of the data), but data has inertia. Migrating TBs of data on-the-fly is a massive undertaking compared to changing drcses or fixing the average logic error in your code. Recovering TBs of data while down, limited by what spindles can do for you, is a helpless feeling.


Couldn't agree more about trolling the issue tracker of a project you plan on adopting. It does wonders as far as learning the ins and outs of a project and how the team works/what they prioritize.

Also, +1 on the rest of what you said.


At the risk of losing karma, I have to say that I think this entire thread has really gone way too far.

Guys, this is HACKER NEWS. I thought we were supposed to hold ourselves to a higher standard of conversation over here?

Could I just post one thing?

http://www.paulgraham.com/randomness.html

If you disagree with what someone writes, then you should reply simply and without embellishment. To conclude that someone has a "sick mind" or has "killed their company" is a bit much to conclude from a simple online debate at a tech/startup discussion site.

If Hacker News offered the ability to lock discussions, then I am quite sure that this one would have been locked.

Can't we all just calm down and get back to coding?


I generally believe very much in giving people the benefit of the doubt, and in not overreacting, and so on. The world benefits greatly from not taking offense at the drop of a hat.

However, there are times when one's blood boils for a reason. I reread this a few hours after my initial comments and... yep, I'm still angered when I read that those who call out someone for being racist are "backwards" and the equivalent of racists.

If someone wants to argue that racism isn't that big a deal these days... fine, whatever, that's debatable, and people deserve to be able to air their opinion without being shouted at. However, to stand up and say that those who denounce racism should shut up is despicable.


The insistence on misrepresenting other people's comments here is simply stunning.


The guy wrote what he wrote: "people who get bent out of shape about it [racism] are just as backwards as the few people who are still actually racist."


He has posted a clarification of his comment numerous times, but you refuse to acknowledge it.


Just so you know, I really don't care what you think, and I don't need your approval.

Sorry you don't value free speech, or the rules here at HN.


The post said that people who get bent out of shape about racism are backwards.

Imagine the company not hiring a non-white interviewee, and that person doing thorough research.

That comment is findable in search engines.

k33n has left himself open to accusations of racism, and put his company at risk of lawsuits.


Imagine yourself picking up a law book and actually learning something. You'd delete your comments out of shame.


Overpopulation is a hard problem. There are complex socioeconomic reasons for why some countries have higher fertility rates than others. This is a heavily studied topic in health economics, but I am unwell right now and do not feel like digging through research papers. Instead, I will point you to a wikipedia article that talks about one (popular with economists) take on overpopulation:

http://en.wikipedia.org/wiki/Demographic_transition

Sorry for the wikipedia link, but it is not a half bad article on the subject and it was easy to find. Basically, this theory suggests that increased income per capita is correlated with a decrease in a country's fertility rate.

There are arguments on both sides regarding the flow of causality, but if you wanted to know what was being done to address the challenge of overpopulation then that article above is a good starting point for your own inquiries.

(Also, just as a side note, it is probably the case that you were down-voted for being off-topic and not due to some overwhelming sentimentality. Maybe if you wanted to talk about overpopulation you should have written a blog post and submitted it; I bet that would probably have gotten a much better reception.)


You can subtract death rate from birth rate, and the world looks screwed. For China, this is (13.1 - 7.1) per 1000. For India, it's (21.76 - 6.23) per 1000. So for the world's middle class, that's about 2% population growth per year. Yikes.

But then you realize that China and India have very few really old people. Now that they have decent medical treatment, the older Chinese and Indians live longer, pushing the population up. But they aren't having any babies, so in the long term we might not be growing like lemmings.

So you look at the fertility rate:

http://en.wikipedia.org/wiki/List_of_countries_and_territori...

For China, it's 1.54, and it's 2.6 for India. Anything under 2 (plus a bit for the ones who don't reach fertility) means a declining population, in the long term.


And then you leap from the frying pan into the fire: Too many old people relative to the number of young people.

As far as I can tell, "health" isn't keeping up with "life", so we are screwed, just the other way around.

But now we also have to divine the state of robotics and AI, say 100 years out, and my brain begins to strangle itself.


Thanks.

I didn't think the topic of overpopulation was that far removed from finding a vaccine for a top disease.

I thoroughly enjoyed the replies I got so far so the few karma points I lost were definitely worth it.


It's gotta be a slow day on Hacker News when this is on the front page. (Or is this being upvoted because it is Cringely?)

An entire article about web development past, present, and future without a single mention of PHP....

All snark aside, I strongly disagree with several things in this article.

Java never left, so it can hardly arrive again. Anyone who has ever worked in a corporate environment probably knows what I am talking about. Java dominates the enterprise landscape.

Disk speed limitations on database access times can be, and already have been, overcome by in-memory caching. This is not new. Advancing SSD tech will not suddenly lead to Java's total ascendance as a web development platform.
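
The caching in question is nothing exotic. A deliberately naive cache-aside sketch in Node (with fetchFromDb as a made-up stand-in for a slow, disk-bound query) is only a few lines:

    // Naive cache-aside sketch; fetchFromDb is a fake stand-in that
    // simulates a slow, disk-bound query.
    var cache = {};

    function fetchFromDb(id, callback) {
      setTimeout(function () {                 // pretend this hits the disk
        callback(null, { id: id, name: 'user-' + id });
      }, 50);
    }

    function getUser(id, callback) {
      if (cache[id]) {
        return callback(null, cache[id]);      // served from memory
      }
      fetchFromDb(id, function (err, user) {
        if (err) return callback(err);
        cache[id] = user;                      // populate on miss
        callback(null, user);
      });
    }

(Real deployments reach for memcached or similar rather than an in-process object, but the principle is the same.)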

The characterization of dynamic languages as "easy to program for a broader, younger, and maybe less experienced crowd of developers" is a rather unfortunate blanket generalization. This is especially the case because most people I know who use dynamic languages have some experience in languages like Java, C, and C++ that Cringely seems to hold in high regard.

And finally:

http://www.paulgraham.com/avg.html

The real problem that the vast majority of web developers face is not trying to cope with overwhelming amounts of daily traffic. The real problem is how you build a product that is compelling enough to get signups, and how you continue to develop that product to attract new signups.

Java is fast, and that is lovely. However, speed of execution does not matter when your development speed drags. When you are developing a product, you need to be able to move fast. If you get substantial traffic, then you can always rewrite backend services in Java (or whatever floats your boat) at that time.

Edit: When I say Java, I refer to the language - just as Cringely does in this article.

Of course, a number of excellent languages have emerged that combine the advantages of the JVM with the benefits of a more powerful language. (My personal favorite being Clojure.)


What's odd is that here in the UK it's C# dominating the enterprise landscape. Or at least that's the impression I've always got from the jobs mentioned to me and the developers I've met. I've met PHP, C#, Python, Rails devs but never a Java developer.

As for .NET not being awesome compared to Java: it wasn't, but that's not true any more; .NET is now awesome compared to Java. There's a hell of a team behind the C# language at the moment, and they're about 3-4 years ahead of Java in terms of new language features.

Not that I have any delusions about .NET coming to dominate the web space, but it's more likely than Java, IMO, which looks old now when I read it, though admittedly I don't keep up on it that well.


C# is in demand, because C# developers are rarer. Also it's a more interesting language to work in and has features that attract smart developers.

Java developers are abundant, and mostly employed in the dark corners of big enterprise software shops you've never heard of.


I have seen some very dark corners reserved for C# programmers around SharePoint deployments...


I worked on a major financial application that had a consumer front end written in SharePoint... it was horrific... easily the worst thing I had seen in 15 years of developing.

The guys who put it together created a situation where there were multiple front ends to one database. Each front end generated GUIDs and rammed them into the shared database. My work was to fix the syncing problems that derived from this.

Most people I know refer to it as "scare point" now, as it scares most developers ;)


That depends on the country. In the Balkans for example, there are more .NET than Java developers, though both of them are in high numbers. The thing is, a lot of students after graduating are trying to get a job as a .NET developer thinking that it's easier, and after that they stick to that job.

As for your second sentence, I have to disagree with you. I can't find the link right now (when I find it, I'll give you another reply), but there was a report on the USA's most wanted ICT jobs for 2010, sorted per region in a top-5 format, and the common 3 out of 5 everywhere were Java, SAP, and Oracle. .NET was 4th or 5th on almost all of them. Second, there are more big companies with Java departments than with .NET departments (and some of them have both).


Down here in Brighton a shitload of Python/Django is happening too :)


In my experience C# the language is much better than Java the language, but the .NET ecosystem doesn't even come close to Java's ecosystem of both open source and proprietary libraries and frameworks. There have been too many times when I had to write my own stuff for C#, where in Java it would already be mature, fully tested, and free to use right away.



I think the actual reason is not .NET awesomeness; companies in the UK have always invested in Microsoft technologies. I remember back in the early 2000s, the UK was the only place with a great demand for VB developers, and the majority of the current work being advertised involves rewriting existing VB applications in C#.


What you said is increasingly true in the US too, from what I have seen.


Not from what I've seen. There are tons of Java jobs. There's good reason for that. I can run Java apps on any OS. That's pretty important. It runs fast these days. Eclipse totally rocks and most if not all of what you need is totally free in the Eclipse world.


"However, speed of execution does not matter when your development speed drags. When you are developing a product, you need to be able to move fast."

However, the JVM doesn't necessarily mean 'Java'. If you get free performance and more efficient development, there's no advantage to using a poorly architected inefficient runtime.


You've pretty much nailed why JRuby catches on so well. It's still not fast for a JVM language, but it's blazing for Ruby - plus you can call through to Java where you need/want it.


> Java is fast ...

God, what a flashback. Seems like yesterday people were saying left and right that Java has no chance because it's an interpreted language.


The link in the parent is https, try this regular http link if you can't read the article:

http://lkml.org/lkml/2011/10/6/317




Thank you SIR!

