Heroku DX: The New Heroku Developer Experience (heroku.com)
119 points by Spiritus on Sept 23, 2014 | 56 comments



I like the new Ember.js based UI! Nice.

I have used Heroku a lot on and off, but I do have one gripe about them:

I tend to run a lot of low-traffic web apps, apps that run well on Heroku's free tier (and I don't mind the occasional loading-request delay). However, I just don't feel comfortable making heavy use of a service that is free: being a freeloader. (Compare to Google: they make money off of me because I click on interesting-looking ads.)

If Heroku had a cheap $10/month 1 dyno tier (that might have a perk, like staying resident, not swapped out) that was restricted to a lower number of requests per day, then they would get a lot of my business running my experimental projects.

I have considered paying for a second dyno that I don't really need, but $30/month each for a lot of experimental web apps does add up. Not too off topic: I just wrote about how I create a personal Heroku-like experience on a VPS: http://blog.markwatson.com/2014/09/setting-up-heroku-like-gi...

In any case, I think Heroku provides a great developer experience.


I've sent Heroku tens (if not hundreds) of thousands of dollars in business by referring clients over the years. The fact that they've spent 78¢ in AWS fees for my hobby apps causes me to lose approximately 0 seconds of sleep at night.


What's with the entitlement? How about being thankful that they have a free plan you can use, that's useful, and continues to be available?

A referral is - unless an arrangement has been made - a service to the person you're referring, in that you're making a recommendation built on trust and aren't misleading someone.


I don't think nthj was being ungrateful. Referral fees are a service to both parties, and companies often pay referral fees as a thank-you. nthj was merely pointing out that he takes his referral fee in the form of free hosting for tiny projects, so he doesn't need to feel any guilt.

You are probably being downvoted for being unnecessarily caustic. Your point wasn't a bad one, even if I think you're wrong; consider rephrasing with an edit?


I use and have used Heroku for personal projects for quite a while. I don't feel too bad, because a) they only run free dyno apps upon request (idle most of the time) and b) I'm a huge advocate for their products, which has led to my work and several other companies using and recommending Heroku. At work we now have a number of clients typically running 2-5 dynos and other services, hopefully easily covering my freeloading!


+1 Thanks, that does make sense.


So, I don't mean to say that freeloading is not OK; I did it myself without losing sleep. However, do consider renting a more capable VPS for personal needs - on DigitalOcean the cheapest instance with 512 MB of RAM and 20 GB of SSD is $5. Or $10 if you want 1 GB of RAM / 30 GB of SSD.

And there's a lot you can do with a personal instance, like hosting several websites, hosting your own MySQL, having your own email server just for the kicks, using it as a VPN and so on and so forth.
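For instance, hosting several low-traffic sites on one box is mostly a matter of a few nginx server blocks. A minimal sketch, with made-up hostnames, paths and ports:

  # /etc/nginx/sites-available/sites (hypothetical hostnames and paths)
  server {
      listen 80;
      server_name blog.example.com;            # first site: plain static files
      root /var/www/blog;
  }

  server {
      listen 80;
      server_name app.example.com;             # second site: proxy to a local app server
      location / {
          proxy_set_header Host $host;
          proxy_pass http://127.0.0.1:5000;    # e.g. a small Rails/Django app bound to localhost
      }
  }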

I also signed up for AWS's Free Tier, so now I have a free micro instance plus CDN traffic included for one year, and I'm in the process of configuring my blog's assets to be loaded from their CDN, because I think it is important for my blog to load in under 500ms :-)


True, but on DigitalOcean you still have to set up a bunch of things yourself (deploying Rails/Django etc. isn't just a matter of a git push like it is with Heroku; you have to worry about scheduling backups, etc.) that you get even on Heroku's free tier.

Yes, I know these things are trivial, but even 30 minutes of work is infinitely more than Heroku's 0.
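For the curious, the git-push piece of that setup looks roughly like this (paths, app name and restart command are made up; it assumes git and your app server are already installed):

  # on the server (one-time setup): a bare repo to push to, and a work tree to deploy into
  git init --bare /srv/git/myapp.git
  mkdir -p /srv/www/myapp

  # /srv/git/myapp.git/hooks/post-receive  (shell script, make it executable with chmod +x)
  # check out whatever was pushed into the app directory, then restart the app
  GIT_WORK_TREE=/srv/www/myapp git checkout -f master
  sudo service myapp restart

  # on your laptop: add the server as a remote, then every deploy is just a push
  git remote add vps deploy@my-vps.example.com:/srv/git/myapp.git
  git push vps master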


I totally agree for experiments, but for long-term stuff, like your website or anything else you want to stay up, once you configure it, it stays configured. It's actually not 30 minutes, more like several hours the first time you do it, but after that you don't have to touch it again for months. The only thing that has to happen periodically is security updates, but you can configure the server to update itself; then you configure Pingdom to send you alerts in case the server goes down, and you're worry-free.
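The "update itself" part is just a package plus two lines of config on Debian/Ubuntu (other distros have their own equivalents); roughly:

  # one-time setup on Debian/Ubuntu
  sudo apt-get install unattended-upgrades

  # /etc/apt/apt.conf.d/20auto-upgrades
  APT::Periodic::Update-Package-Lists "1";
  APT::Periodic::Unattended-Upgrade "1";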

Granted, Heroku's ease of use is totally kick-ass.



You could, but the maintainer - progrium - isn't maintaining the repo these days. There are ~47 PRs in the queue - some quite old - and he is actively working on Flynn.


He doesn't have much to do with Flynn either these days...

http://progrium.com/blog/2014/07/01/beyond-flynn-or-flynn-as...


Would love to hear a little more on this; do you mind reaching out to craig at heroku.com?


So basically you are complaining that they aren't charging you? That seems ... odd.


It's pretty common around here, actually.

"If you aren't paying for the product, you _are_ the product."


If you are paying, you might still be the product


Though I used Heroku in the past and enjoyed the (mostly) seamless experience, I am in the process of moving my infrastructure to Docker containers.

I dislike the proprietary nature of the Heroku platform and I am betting on Docker + Flynn/Deis as the future gold standard of PaaS. My confidence is based on the momentum of Docker and Docker hosting services (Orchardup, Tutum, Stackdock), as evidenced by their funding/acquisitions and tech-news hype. Also, DigitalOcean, Linode and other VPS providers have easy-to-install Docker images.

I am enjoying the (probably false) confidence that by investing now in Docker expertise, I will have my development and deployment strategies mostly set for the next 3-5 years. RVM, virtualenv, the Haskell Platform, etc. - everything becomes easier with Docker, both to develop AND to deploy. Heroku doesn't solve that. New services and products will be adopted by the decentralized Docker ecosystem faster than by Heroku.

This feels really nice.

Am I wrong to bet on Docker? Should I return to the poisoned fruits of the Heroku walled garden or venture forth into the great ocean with Docker? Is there a consensus in the HN community?


> I dislike the proprietary nature of Heroku platform

Most everything we do, we open source. For example, I help maintain the Heroku Ruby Buildpack, and it is fully open source. This is the technology that allows us to detect and install dependencies and precompile your assets. We open source a ton of other things, such as our CLI, our log aggregation infrastructure, and our continuous Postgres backup service (which backs up Postgres to S3). We also invest heavily in open source: we support PostgreSQL development, and we hired the three most prominent Ruby core developers so they can work on the language full time.
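(For the curious, a buildpack is basically just three small executables run against your app's source directory. A toy sketch - nothing like the real Ruby buildpack - would look roughly like this:)

  # bin/detect <build-dir> -- exit 0 (and print a name) if this buildpack applies to the app
  [ -f "$1/Gemfile" ] && echo "Ruby" && exit 0
  exit 1

  # bin/compile <build-dir> <cache-dir> -- install dependencies into the build directory
  cd "$1" && bundle install --path vendor/bundle   # a real buildpack would also reuse <cache-dir>

  # bin/release <build-dir> -- print release metadata (default process types) as YAML
  echo "---"
  echo "default_process_types:"
  echo "  web: bundle exec ruby web.rb"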

Under the hood, we use LXC, a standard Linux container technology. We generally don't talk about this level much because it's orthogonal to the running of your application. Most people don't know what system libraries come pre-installed on our servers, or the versions of the operating systems, or what virtualization/containment systems we use. They know their app runs, and stays running.

I'm happy you're comparing and contrasting. I think Docker, like LXC, is a great technology and it's very useful. I wanted to chime in on the "proprietary" comment because if you feel our service ever "locks you in" in some way, it's likely a bug that needs to be fixed. I consider my work 100% open source time.


> Am I wrong to bet on Docker? Should I return to the poisoned fruits of the Heroku walled garden or venture forth into the great ocean with Docker?

I'm a huge Docker fan, and I think Docker is probably a decent medium-term bet at this point (with very real short-term issues). But I also think it's a bit unfair to call Heroku a "walled garden":

- Heroku buildpacks rarely require modifying your application much. Typically, you just add a two-line Procfile (a minimal example is sketched after this list) and maybe a list of libraries you depend on.

- When Heroku does invent something proprietary like a Procfile, they usually release an open source library to allow other people to use it.

- Heroku uses standard storage infrastructure: PostgreSQL, memcached, and so on. Unlike Google and Amazon, they don't actually provide many proprietary cloud APIs.
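For example, a two-line Procfile for a typical Rails app is just the following (the exact commands vary by app):

  web: bundle exec unicorn -p $PORT -c ./config/unicorn.rb
  worker: bundle exec rake jobs:work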

As somebody who's worked on Heroku-based projects ranging from throwaway hacks to huge production systems, I've never had the impression that Heroku was trying to lock me in.


Okay, but what is Heroku's endgame here? Their dynos are way overpriced, and migrating from Heroku to a Docker infrastructure (or another PaaS/IaaS combination) at the scaling stage of the startup lifecycle could save tens or hundreds of thousands of dollars per month for resource-intensive apps. Do they want startups migrating as soon as they start scaling? I don't think so.

They definitely have some lock-in strategies in the works. The lock-in may not be technological per se, but a behavioural one, like the network effect GitHub layers on top of git. That "Deploy" button is one clear path to lock-in.

But I agree, the Docker ecosystem is not mature enough and doesn't have its own killer app, platform or collaboration process. Though with the directed acyclic graph model of building containers, they may have stumbled onto something really big.
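(To make that concrete: every Dockerfile instruction produces a cached layer, and layers can be shared between images, which is what gives you that graph. A minimal, hypothetical example:)

  # Each instruction below becomes its own cached layer.

  # Base layer, shared by every image built FROM it.
  FROM ubuntu:14.04

  # Dependency layer: rebuilt only when this line (or one above it) changes.
  RUN apt-get update && apt-get install -y ruby

  # App-code layers: rebuilt on every code change, while the layers above come straight from cache.
  ADD . /app
  WORKDIR /app
  CMD ["ruby", "app.rb"]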

I can say with 100% confidence that it's really exciting to observe the evolution of the hosting industry and the growing overlap between production and development environments.


"Do they want startups migrating as soon as they start scaling?"

Actually, I do think they want most startups to migrate as they scale up. Heroku does not want to compete with AWS; their story is value-add. Their pricing will always be non-competitive for larger apps and larger apps will be migrating away anyway.

So rather than fight it, they embrace it. By making migration easy they make their service much more attractive to the little guys.

After all, everybody expects that their little app is going to grow huge one day. Heroku makes their money from the large majority that haven't gotten huge yet and probably never will.


That's an interesting thought for sure.

But consider that with the advent of Go, Node.js and other faster (compared to Python and Ruby) languages/frameworks, coupled with microservices architectures, most (maybe 80%) startups will not exceed 10 or 15 free dynos. Those who need vertical scaling may pay for 2 or 3 dyno workers.

Where's the scalable business model for Heroku in this scenario? So I am not sure that Heroku wants startups to migrate.


"They definitely have some lock-in strategies in the works."

You are aware that the "P" in PaaS doesn't stand for "paranoia", right?


> My confidence is based on the momentum of Docker and Docker hosting services (orchardup, tutum, stackdock)

Orchardup was acquihired because of an open source tool they hacked and they are discontinuing their hosting service.

Tutum has no SLA at all and explicitly recommends only using their service for toy products. Basically, they're not a real hosting provider right now, just a very expensive toy.

Stackdock has been "currently upgrading our platform and sign-up has been temporarily disabled" for months now. I'm wondering what's going on, but I wouldn't be surprised if blocking new signups for such a long time means they're about to pull the plug, for whatever weird reason. So long without new customers (while I'm sure some are leaving) can't be healthy. I really want to give them money but I can't.

Finally, the people who made Docker itself once ran a PaaS service, dotCloud. It has been sold off and does not seem to be getting much new investment - definitely not any support for directly hosting Docker containers.

In short, there are currently 0 real Docker container hosting providers (and by 'real' I mean that they will exist past October and have at least some uptime guarantees).

I really want you to be right, but currently the evidence looks very bleak IMO.

We made this same bet last January and we're currently spending way too much effort DIY'ing it all on a VPS.


I wouldn't say it's bleak. The next version of Red Hat's PaaS is based on docker containers.

"The OpenShift v3 Cartridge format will adopt the Docker packaging model and enable users to leverage any application component packaged as a Docker image. This will enable developers to tap into the Docker Hub community to both access and share container images to use in OpenShift."

https://www.openshift.com/blogs/openshift-v3-platform-combin...

https://github.com/openshift/openshift-pep/blob/master/opens...


Yes, your arguments are convincing.

However, Orchardup was acquihired by Docker itself and I am sure they are planning to provide container hosting.

Tutum has just raised about $2M and is in the process of going gold with their service.

Yeah, stackdock is MIA.

However, the main point is that Docker is quite young and already has a developed ecosystem. Also all of these are specialized docker container providers, while I have no problems deploying a container on most VPS providers, IaaS platforms or even bare metal.

The Docker ecosystem is just starting to develop, Docker PaaS platforms are still in beta and while there is a lot of uncertainty, the vitals of the ecosystem look good. I may be wrong, but I am taking the plunge and my efforts will contribute a small part to the Docker ecosystem.


We have no interest in providing container hosting. We acquired Orchard for the great people doing amazing things in the ecosystem so they could continue their work on Fig in the context of Docker.


Why doesn't running Docker on Elastic Beanstalk (on AWS) count as a real hosting provider?


AWS Elastic Beanstalk for Docker does work, but it has some weak points by comparison:

- Inefficient: one Docker image per AWS instance, which is expensive to run scaled up compared to running many containers per host.

- Slow: it requires starting a VM, which typically takes several minutes.

- Feature-bare: compared with PaaS platforms like Heroku, Cloud Foundry, OpenShift, etc. For example, both Heroku and Cloud Foundry have built-in aggregated logging and health management.
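(For concreteness, a single-container Beanstalk deploy is driven by a Dockerrun.aws.json that describes exactly one image per instance, which is where the inefficiency comes from. Roughly, with a made-up image name and port:)

  {
    "AWSEBDockerrunVersion": "1",
    "Image": {
      "Name": "myorg/myapp:latest",
      "Update": "true"
    },
    "Ports": [
      { "ContainerPort": "8080" }
    ]
  }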

There is going to be a large spectrum of ways to run Docker images, and you'll be able to choose anything from bare-bones to full-service.

Here is a 4-minute video of Diego, Cloud Foundry's next-generation runtime, spinning up 300 Docker images in under a minute while providing load balancing, health management and aggregated logging: http://www.youtube.com/watch?v=e76a50ZgzxM

James - Cloud Foundry team


> I dislike the proprietary nature of Heroku platform

What about Heroku is proprietary?

Who'll look after your docker infrastructure?


Heroku PaaS itself is proprietary. Their dashboard, monitoring and other tools are proprietary.

PaaS services based on open source PaaS solutions (Deis + Docker, Flynn + Docker) and proprietary PaaS solutions (Tutum + Docker, for example) will look after my infrastructure. Or I will host it myself. In any case, I'll be able to migrate my containers to any other infrastructure easily, quickly and with nearly zero additional configuration.
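(That portability is basically just Docker's standard image plumbing. A sketch of moving the same image to another box, with hypothetical names and ports:)

  # build once, then ship the exact same image anywhere that runs Docker
  docker build -t myapp .
  docker save myapp | ssh user@other-host 'docker load'
  ssh user@other-host 'docker run -d -p 80:5000 myapp'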


Anytime I've investigated Docker, it has seemed more like something that someone who wanted to build a Heroku competitor would use as an underlying piece of their infrastructure than something I would use myself (as a developer who wants to minimize dealing with sysadmin tasks).


I think Docker is really awesome for deploying apps. However, I am not sure how to deploy a DB in Docker so that the data persists to disk. Currently I am not really impressed by Heroku's app deployment, but I like their PostgreSQL offering (which is quite expensive).


You can either use an external service or migrate your data when updating a container: http://wiredcraft.com/posts/2014/06/25/data_migration_of_nam...
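A third option, and the usual answer for persistence, is a volume: mount a host directory into the container so the data outlives the container itself. A sketch using the official postgres image (the host path is arbitrary):

  # keep the data in /srv/pgdata on the host so the container is disposable
  docker run -d --name db \
    -v /srv/pgdata:/var/lib/postgresql/data \
    postgres:9.3

  # recreating the container later leaves the data directory untouched
  docker rm -f db
  docker run -d --name db -v /srv/pgdata:/var/lib/postgresql/data postgres:9.3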


I love the developer experience with Heroku, just please give some of the gains of Moore's law back to your customers. While AWS and Google drop prices and increase their offerings, the 1x dyno stays exactly where it's always been. I am completely fine with paying a premium over other services for your awesome toolset, but please give us some of the improvements other cloud services are seeing instead of taking them all as profit.


I love the developer experience that Heroku provides; however, I wish they would lower their prices a little bit and/or introduce additional pricing tiers.

## The dynos are quite expensive by current standards

AWS has been continually dropping its prices, yet those savings haven't been carried over to Heroku. Either:

1) lower pricing, or

2) improve the resource allocation of the current dynos at the same price point, e.g. 1x dyno - 1x CPU + 1 GB RAM, and 2x dyno - 2x CPU + 2 GB RAM.

## There is a big gap between 2x and Performance dynos

Java apps are recommended to run with 1 GB of RAM, so practically my only option is to continue to scale horizontally [1]. Maybe there is an opportunity for a 2x CPU and 2 GB dyno at $100/mo?

Notes: [1] - I would have loved to have a bit more memory at my disposal. 1.5 / 2GB would be ideal, but upgrading from 2x to Performance is too big of a jump ($70/mo to $570/mo).


I like Heroku's PostgreSQL. So far only AWS RDS can compete. Or is there any other recommendation for PostgreSQL as a service?

  Heroku Standard 0 (1 GB RAM): $50.00 per month
  RDS On-Demand (1 GB RAM): $12.96 per month
  RDS Reserved, 1 year (1 GB RAM): $9.29 per month


There are a lot of extra costs in RDS that this calculation excludes: bandwidth, disk I/O, storage and backups. All that stuff is included in Heroku Postgres. Also, you're likely to get much better perf out of the $50 Heroku plan than the cheapest RDS plan because it's a slice of a bigger instance, rather than an underpowered VM.

So the real total cost comparison is a lot closer than it seems.


I've been considering using CartoDB to replace Heroku. I only use basic geo stuff and their maps would be handy. The specs don't really tell you what you get for your money; there's a $29 plan which might be OK. http://cartodb.com/pricing/


A recommendation in terms of what? Pricing? Performance? Scalability? Availability? Tech support?

I would go with Heroku or AWS RDS. Both truly deliver and make devs / devops sleep better.


Price/performance. Most of my apps are low traffic, so scalability and availability are not that critical.


So far I love the new DX, I've been using the beta for a while now. It's a significantly improved UX overall.

On an unrelated note, I think it's time to retire the 1X tier, or just make that the free tier and bring the price of 2X in line with 1X. You just can't do all that much with 512 MB of RAM, and given the progression of technology this doesn't seem like a huge ask.

Also, in the interest of security, I can't think of any reason why encryption-at-rest shouldn't be available on the standard database tier.

But then maybe I'm just cheap.


My start up has been running just the free tier for almost two years. Simple Rails app, running Unicorn and 4-5 workers.
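(For reference, the knob that makes that fit in 512 MB is basically the Unicorn worker count. A typical config/unicorn.rb, roughly along the lines Heroku's docs suggest, with illustrative numbers:)

  # config/unicorn.rb
  worker_processes Integer(ENV['WEB_CONCURRENCY'] || 4)  # 4-5 workers, sized to fit the dyno's RAM
  timeout 30
  preload_app true  # load the app once, share memory with workers via copy-on-write

  before_fork do |server, worker|
    # close the master's DB connection so each worker opens its own after forking
    defined?(ActiveRecord::Base) && ActiveRecord::Base.connection.disconnect!
  end

  after_fork do |server, worker|
    defined?(ActiveRecord::Base) && ActiveRecord::Base.establish_connection
  end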


How does the new Dashboard + Metrics compare to using the New Relic plugin?


I would say that, depending on what you're looking at, they're complementary. NR can give you some really in-depth timing information, such as how much time is spent in the DB versus garbage collection on a certain request. The metrics here are very high level: response time, memory usage, CPU load, etc. New Relic doesn't have the ability to be as precise with its system resource measurements inside the dyno, so they're both very useful.


I'd say D+M lets you know you need to do something and New Relic helps you identify what that actually is.


They complement each other. Heroku's metrics show you dyno and traffic info, New Relic shows more detailed app data.


No RAM for Postgres until you reach $50/m? Talk about profit margins!


Yeah, it's a bit of a jump between the $9 and $50 tiers.


  Application Metrics are currently only available for apps with two or more running dynos of any size
Of course, I can't be a freeloader and complain about this I guess.


I always hated the all dark design of their dashboard. I'm glad they are using some lighter colors now!!


A bit of an unfortunate choice of name if you read it as a smiley: DX


You could imagine it as either an angry face, or a large grin + bow tie, I suppose.


I love that Heroku has redesigned the UI, but jeeze. Ever hear the expression, "too much of a good thing"? Their trademark purple is neat, in small doses.

For instance, I REALLY REALLY love the ocean. I love SCUBA diving, and I love the color blue. However, you won't find me living in this house: http://cl.ly/3I3R2z0q1A2S. This, on the other hand, looks very appealing to me: http://cl.ly/0R3H3k2s3Z3x. Catch my drift?

It's as if Heroku fired all of their designers and put developers in charge of the entire UI design process. I've seen that happen, so I don't say that to be a troll. I'm seriously disgusted by the monotonous nature of the new UI.

That said, it appears to have very many useful new features, which surely outweighs most of my aesthetic concerns.


Well, I like it. It's pretty and it's not just black text on white background.


Who puts a shaggy carpet like that next to a shower? That thing must be reeking of mold and mildew.




