
I'm in the 'sooner rather than later' camp, and have one more key driver.

Insurance. As soon as human drivers are provably poorer than automated drivers, the cost to insure a meat driver goes up. It will not take long to build a very strong case for the lower risk profile once the early adopter vehicles take the road.


Why does it go up? The price isn't high simply because nothing better exists; it's based on the expected cost to the insurer, and that cost should drop as the predictability of the other vehicles on the road increases.


You just literally described trains and buses. Don't assume the form of the vehicle remains the intimate 3-across sedan style.


The best way to improve email delivery is to understand that email addresses represent humans. Address validation and long-term deliverability are primarily problems of social engineering, not technical ones.

Ordinarily I'm in favour of things that can improve data quality with minimal user friction, but in this case while it looks like an attractive solution, it's both dangerous _and_ broken.

It's dangerous because if you repeatedly open empty SMTP sessions with major ISPs (and some neckbeard boxen) to validate addresses, you will rapidly fall onto blacklists. Furthermore, the existence of an address says nothing of the end user's ownership of that address.

It's broken because of the myriad crazy responses that mailservers return: 5XX errors for soft bounces, 4XX errors for permanent failures, deliberately dead primary MX servers... The web's email infrastructure is so massively fragmented and quirkily non-RFC-compliant that you just cannot rely on technical solutions to these problems except at the scale of an ESP. (Disclaimer: I work at PostageApp.com, a transactional ESP, and we tackle this problem at scale.)

Finally, it fails my 'Spammer Sniff Test': if you think of a clever trick to improve email delivery/opens/responses etc., it was thought up ten years ago by spammers and has long since been added to the blocked behaviours in email protection infrastructure.

Check for '@', and craft your email verification process to incentivize following through. For long term delivery (to bypass the mailinator issue) provide value, pure and simple.
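
Something like this is all the up-front checking I'd do (a minimal sketch; the callback names and URL are purely illustrative):

    import secrets

    def looks_like_email(address):
        # The advice above taken literally: one '@' with text on both sides.
        local, at, domain = address.partition("@")
        return bool(local and at and domain)

    def start_verification(address, send_email, store_token):
        # send_email/store_token are hypothetical callbacks; the real proof of
        # ownership is the user clicking the link, not anything done up front.
        token = secrets.token_urlsafe(32)
        store_token(address, token)
        send_email(address, "Please confirm your address",
                   "https://example.org/verify?token=" + token)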


This is definitely good email-sender behaviour, but I would hesitate to put it down purely to the altruistic notion of 'keeping your inbox tidy' - Inbox Zero is a problem fairly isolated to the Newserati and similarly thin slices of the population.

Fact is, email deliverability is increasingly engagement-driven these days, especially with the major ISPs, and on top of that, sending email costs money.

--

At its most basic level, a sender's 'spamminess' is determined by the percentage of spam reports against overall deliveries from that IP. Levels over 1% put your reputation in the 'severe' category and risk loss of inbox delivery, blacklisting and more. Having more engaged users leads to a better ratio - for this reason alone, keeping your recipients 'fresh' is valuable.
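
To put numbers on that (illustrative figures, threshold as described above):

    deliveries = 1_000_000
    spam_reports = 12_000                     # illustrative numbers
    rate = spam_reports / deliveries          # 0.012, i.e. 1.2%
    print("complaint rate: %.2f%%" % (rate * 100))
    print("severe" if rate > 0.01 else "ok")  # the 1% threshold above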

Additionally, a common pattern when an email (or more correctly, a sender:template combination) falls into the 'spam' category at an ISP is a delivery drop of a few percentage points, followed by a complete /dev/null-ing. When the initial drop happens, whether or not your recipients rescue those messages as false positives ('not spam') determines whether you get the Full Monty. Naturally, therefore, removing the least engaged users has a significant beneficial effect on overall deliverability.

These days though, it's getting more complex, nuanced and ultimately more individual.

Gmail moved some time back from a centralized concept of 'spam' to a much more personal view, using your positive and negative engagement signals: opens, clicks, replies, 'delete without reading', 'report spam', etc. They explicitly modify the visibility of email in your personal inbox through the 'important' flag (http://support.google.com/mail/bin/answer.py?hl=en&answe...), but there is good evidence that negative engagement can carry an email all the way to the spam box for a given user and consequently affect overall deliverability.

This has a strong benefit for Gmail in that they become much harder to 'game' - something the Google Search team also has plenty of experience in preventing. They essentially eschew the classic SMTP 5xx return codes in favour of 'accept all, ask questions later' in all but the most egregious cases, and provide little to no feedback for senders to troubleshoot delivery problems, on the basis that if your users want your mail, it will get through.

--

The second primary motivator here (still with me?) is that sending email also has a non-zero cost, which is almost entirely driven by sheer subscriber count and delivery attempts.

Consider a typical mass-marketing email with a 10-15% open rate, delivered multiple times a month. Even assuming a varied engagement profile, that mailer is engaging with at most 50% of their list over the month. For a simple list of 1MM recipients, that's a couple of thousand dollars a month spent sending into the vacuum of disinterest.
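
Back-of-envelope, assuming an illustrative bulk rate of $1 per thousand emails (real ESP pricing varies widely):

    recipients = 1_000_000
    sends_per_month = 4
    cost_per_thousand = 1.00      # assumed bulk rate; not a quoted price
    engaged_fraction = 0.50       # at most half the list engages in a month

    total = recipients * sends_per_month / 1000 * cost_per_thousand
    wasted = total * (1 - engaged_fraction)
    print("$%.0f/month total, ~$%.0f of it into the vacuum" % (total, wasted))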

There is, in certain circumstances, a benefit to be gained from 'eyeballs on subjects' for brand awareness, but that metric is near impossible to track, and as mentioned above unopened emails can be deleterious to your overall delivery to the more engaged segments.

For both of the reasons highlighted above, mass-market emailers have been using the 're-engagement' method (breathlessly described in the OP as a customer-driven action) to keep their lists fresh and their costs down.

I do applaud the application of metrics to provide intelligent subscription management. At PostageApp we see the best delivery rates come from clients who take an active interest in the humans at the end of the SMTP pipe. The growing provision of engagement data through APIs is helping drive solutions like FAB's, and the end result is a better experience for the user. That said, this particular innovation came not from the consumer-friendly, high-visibility consumer and SaaS markets; it has been around for many years in the risk-heavy, line-treading bulk marketing industry.


Cool story bro


I always like Eric Ries' take on this. You can test the theory:

If you have one 'great' idea, chances are you have more. Take your _second_ best idea and do all you can to get someone to steal it. Target individuals; shout from the rooftops; describe it in detail to everyone you meet. Nobody will run with it.

To execute on an idea you have to be personally invested, have domain knowledge and be deeply passionate. Ideas in isolation rarely have the qualities that would make them require 'stealth mode'. In fact, the best ideas come from a series of iterations on great execution.

Create success and you will have copycats and parasites, no question (Groupon, iPad, Paypal... the other examples from this thread), but at this point you should be in a position to leverage first-mover advantage, funding, and your more developed long-term roadmap.

People may copy your MVP, but Uber is a classic case where the first and most prominent product (UberCab) is clearly not the actual long-term strategy. Cloning UberCab is really creating a cargo-cult startup.

The exception here would be markets - the Samwer brothers have a particular niche in cloning successful North American startups for European and African markets, but by the time you hit their radar, the 'stealth mode' boat has sailed anyway.


I would add one additional point of caution:

The Rails community seems unusually keen on the 'curl example.org/script | sh' installation method (see Pow.cx etc.).

I'd usually recommend reading these scripts before execution, but especially so right now, as they would seem an obvious target if people are looking to leverage this exploit to acquire more boxen.


While I fully agree with you that the practice of curling a script and piping it into `sh` is, to say the least, risky, notice that this risk was widely accepted a long time ago. Each time you download an executable file (an exe for Windows, an apk for Android, a Linux binary, an OS X executable) you're doing the same thing. I'll go one step further: each time you download a free/open-source tarball, you don't read the code before typing `make`. You make your machine run code of unknown functionality and only plausible origin.

Arguably, HTTPS is one step forward; however, vulnerabilities like the one discussed here leave us defenceless. To make matters worse, the line of defence based on reading the script works only for relatively short, unobfuscated, unminified scripts written in plain text. It also requires the person downloading to have skills which, despite being common among this community's audience, are not widespread across the population.

Sure, many projects sign their releases or announce cryptographic hashes of published files. But let's be honest: how many of us actually run `gpg` or `sha256sum -c` to verify them?
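
For what it's worth, the check itself is cheap; a minimal sketch in Python, with hypothetical file and digest values:

    import hashlib

    def sha256_of(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical values; the digest would come from the project's release notes.
    expected = "copy-the-published-digest-here"
    if sha256_of("release.tar.gz") != expected:
        raise SystemExit("Checksum mismatch - do not install.")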

Spreading paranoia is not my goal here; I simply hope this comment ends up being thought-provoking.


I think the point he was making is more that these Rails-centric sites are going to get nailed and, as a result, one should be extra wary of using this sort of installation method over the next few weeks.

One should be generally quite wary of it in the first place, given the ease with which someone could swap out a single file and wreak havoc.


At PostageApp.com we focus on transactional emails, which these days are moving heavily towards HTML email as a standard now that deliverability is much more focused on IP reputation than content.

We provide a templating system which allows you to build and preview with separate HTML and CSS, and we compile to inline CSS at runtime to ensure compatibility with the majority of email clients.
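
Our pipeline is proprietary, but the inlining step can be sketched with the open-source premailer library (a sketch, not our actual implementation):

    from premailer import transform

    html = """
    <html><head><style>p.lead { color: #555; font-size: 16px; }</style></head>
    <body><p class="lead">Welcome aboard!</p></body></html>
    """

    # transform() moves the <style> rules onto style="" attributes,
    # which is what most email clients actually honour.
    print(transform(html))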

The painful thing to remember is that HTML/CSS for email is a weird mix of HTML3 and HTML5 - some major providers only allow basic functionality, while other platforms support the bleeding edge yet omit some of the simpler features. In the end, your demographics will strongly influence where you focus your efforts.

The templates from Zurb are a great basis - as a start on a responsive email boilerplate, this will be of great value to the community, and individuals using it for customization will hopefully push these bases further.


Just as a reference point, this brought to mind an old app (in the desktop sense) from 2001 - http://www.16color.com/

The site is still up, and they claim 40,000 movies were created between 1999 and 2004 (timeline from the DVD: http://www.amazon.com/Best-16-Color-DVD/dp/B000BD98AO/)

I don't think content volume is an issue if the right channel can be found, but you're right that the Instagram comparison is not the correct order of magnitude.

disclaimer: I have no relation to 16color, apart from enjoying it over a decade ago, and just discovered one of my animations was actually on their DVD.


My personal favourite quick fix (which doesn't stand up to targeted attacks, but is a very effective band-aid) is to put the following in the form: <input type='text' name='website' style='display:none'>

Then disallow any form submissions server-side which contain a value for 'website'. Automated bots can't resist filling out that field.
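
Server-side, the check is a one-liner; a framework-agnostic sketch, where `form` is whatever dict-like object your stack hands you:

    def is_probably_bot(form):
        # Humans never see the hidden 'website' field; any value in it means
        # an automated submitter filled in every input it found.
        return bool(form.get("website", "").strip())

    # Usage: reject the submission whenever is_probably_bot(form) is True.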


This happened to me recently with a WP blog. It happened quite by accident, however, since the client just didn't want the website field. When comments still came in with a URL, the client was concerned that I had screwed up - but it clicked right away for me that these must be bots. It might have been a little disheartening for the client, since a number of these spam messages were along the lines of "I have never read such a great article. I have bookmarked your blog and will come back every day to read more of your insightful posts." What unaware blog owner wouldn't want that on their comments? Crafty spammers.


Mine is the reverse of this idea. I have a hidden field that, when you click submit, gets filled in with a token via JavaScript. If the correct token isn't present on submission, I reject the comment.
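
The server side of that scheme might look like this (a sketch with assumed names; the client-side JavaScript just copies the token into the hidden field on submit):

    import hashlib
    import hmac

    SECRET = b"change-me"  # assumed server-side secret, never sent to the client

    def token_for(session_id):
        # Rendered into the page; the JS writes it into the hidden field.
        return hmac.new(SECRET, session_id.encode(), hashlib.sha256).hexdigest()

    def token_valid(session_id, submitted):
        return hmac.compare_digest(token_for(session_id), submitted)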


These sorts of issues (primarily related to auxiliary functions in grep, sed, etc.) can cause portability problems when developing locally for Linux systems, but no worse than, say, writing for Fedora on an Ubuntu box. Brew is okay but a bit sparse. For me, it's GNOME I mostly miss.


I think it's clear it's going to get a lot worse with Apple purging GNU utils.


It is worse. Fedora is Linux. OS X is BSD.

