
Unless you're planning to deploy to a production server running OS X, I don't see why you'd care about Rails running natively on your Mac.

Why not develop against a VM running the same OS as your server?




Why on earth should you have to run a VM in order to run a programming language?

This hasn't solved the problem of Ruby being difficult to install; it's just added ANOTHER problem of having to learn to install a VM manager and a whole new OS, plus it uses up far more system resources. Seems to me like that's just making things worse.


Why on earth should you have to recreate your server stack on a desktop OS? Seems like you're just begging to be bitten by some OS-specific bug or quirk.

It solves the problem of Rails being hard to install on OS X... because it obviates the need to install Rails on OS X. And there's no new OS to learn -- the VM runs the same thing as your server. If you don't know how to use the OS that's running on your production server, that's a bigger problem.

I've been developing this way for years. Additional benefits include: you can easily have multiple VMs with different configurations, you can snapshot/backup/restore a VM very easily, crashing a VM doesn't crash your computer, and it's relatively easy to share a fully configured VM with another developer (even if their primary OS is something totally different).
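For example, with a tool like Vagrant the whole setup fits in one short file in the project repo. This is just a sketch, and the box name, port, and provision script are placeholders for whatever your server actually runs:

    # Vagrantfile (minimal sketch, assuming Vagrant with the VirtualBox provider)
    Vagrant.configure("2") do |config|
      # Same OS family as the production server (Ubuntu here is only an example)
      config.vm.box = "ubuntu/trusty64"

      # Reach the Rails dev server from the host browser at localhost:3000
      config.vm.network "forwarded_port", guest: 3000, host: 3000

      # Keep editing with your usual Mac tools; the project dir is shared into the VM
      config.vm.synced_folder ".", "/home/vagrant/app"

      # Install Ruby, the app's gems, etc. with a plain shell script
      config.vm.provision "shell", path: "provision.sh"
    end

Then "vagrant up" builds the environment, "vagrant destroy" throws it away, and a teammate gets the same VM from the same file regardless of their host OS.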


I see it as being a bit like moving to a service-oriented architecture. When you're just getting started it adds overhead, but by the time you get to your 5th developer or platform-dependent bug (hello HFS case-insensitivity bugs!), maintaining a VM image starts to make a ton of sense.
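To make the case-insensitivity bite concrete, here's a hypothetical example (the file name is made up): the model on disk is app/models/user_profile.rb, but somewhere a require gets the case wrong.

    # Works on OS X, where HFS+ is case-insensitive by default,
    # because the filesystem happily resolves the wrong-case path:
    require_relative "app/models/User_Profile"
    # On a Linux server's case-sensitive filesystem the same line
    # raises LoadError, so the bug only shows up after deploy.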

Similarly, when you start a new web app, a monolithic framework like Rails is of course great and flexible enough to roll in all your business logic. Somewhere around 50k lines, though, you start getting that ball-of-mud feeling, and managing changes across the entire code base becomes onerous. At that point you realize that the sysadmin overhead and the cost of defining and maintaining slow-moving service interfaces are worth it: they let you turn your ten-person team into ten ten-person teams and get a reasonable productivity multiple out of the deal.


It's not about running a programming language. It's about ensuring that the code you write will run properly on the target it's deployed to. Assumptions based on the development platform (ones external to the programming language itself) won't necessarily hold in the deployment environment, and can (and often do) result in bugs that can't easily be replicated between the two environments and are often difficult to track down.


What if you'd like to develop a gem not for any specific platform? What if you want to develop desktop apps, or make CLI apps to automate little things on your usual platform?


Well, yeah, that's almost a truism: if your app needs to run on OS X, then it needs to run on OS X.

I think very few Rails apps need to run on OS X, however.


I guess it just depends on the target. For serious development I'd certainly agree with everything you say, although I personally prefer developing locally and testing on staging, since it means I know the code runs on at least two systems (at least where that's possible). It might be nicer for beginners, though, to be able to dive in quickly.



