
A nice visible reason why the Rails/Node/OS X FOSS communities really need to stop doing the following sort of thing for their installations (seen most recently on yeoman.io, but common to get.pow.cx, npm...):

curl get.totallytrustworthyapp.io | bash

The above examples are obviously legit, but encouraging this kind of lazy granting of even local privileges to arbitrary remote scripts (and Yeoman even asks for sudo in a super-friendly way) is the modern equivalent of padlock.gif on your payment page - it trains poor security practices.
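If nothing else, split the fetch from the execution so there's at least a chance to look at the script first - a minimal sketch, reusing the joke URL from above:

    # download, inspect, then execute - rather than piping straight to bash
    curl -fsSL -o install.sh get.totallytrustworthyapp.io
    less install.sh    # actually read what you're about to run
    bash install.sh    # only once you're satisfied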


You're still ultimately going to be running some code without reading all of it first, aren't you?


This is a great overview for people looking to run their own servers, with one of the clearest explanations of DKIM and SPF I've seen. Awesome. As a counterpoint, I'd like to add that from the point of view of an email sender (our company PostageApp is a transactional email service), individual Postfix (Exim, qmail, Exchange...) setups receiving email for small organizations are one of the largest headaches we face.

Large ISPs - Gmail, Hotmail, even Yahoo and AOL to an extent - are predictable. If you play nicely, tick the technical boxes and listen to feedback (SMTP return codes, FBLs, bounces etc.) you get great deliverability. Even mid-size ISPs and larger companies usually have some reasonable visibility - responding to postmaster@, internal blacklist checkers, etc.

There is still, though, a nontrivial percentage of organizations and individuals running their own setups on anything from 1998 'standards' through to modern configurations, combined with other filters like custom SpamAssassin rules, an out-of-date Barracuda appliance, or quirky ASSP installs. These often exhibit unpredictable behaviours: sending permanent hard-bounce codes for simple inbox-full errors; marking completely innocuous email as spam; requiring three attempts for every email to block spambots (delaying delivery by hours); publishing broken MX records...
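To make the hard-bounce confusion concrete: an inbox-full rejection should be a temporary 4xx response, not a permanent 5xx (enhanced status codes per RFC 3463; the address is hypothetical):

    452 4.2.2 <user@example.com>: Mailbox full    (transient - the sender retries later)
    552 5.2.2 <user@example.com>: Mailbox full    (permanent - the sender writes the address off as dead)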

Dealing with these can be tricky - even finding the correct admin to contact is often an exercise in futility, and all the while users are not receiving their critical emails. I guess my message is: when running your own email setup, Caveat Hack0r. If you're not in it for the long haul - updates, testing, and responding to inquiries included - you should really consider going with a third-party provider.


How weird is it that something as basic as email still needs mad hacker skills to set up and dedication to maintain?

Why can't I plug a Linux box into the Internet, have a wizard auto-configure most of it (including the DNS stuff, etc.), let me toggle some "auto-update" thingy and be on my merry way?


There's an interesting disconnect between the public perception of email transmission and the reality, even from technically savvy observers.

The view you're espousing of the 'basic' nature of email can usually be summed up as: "It's just sending a bunch of ASCII from machineA to machineB."

The reality is that email transmission is an ad-hoc communications network built around an evolving set of RFC standards, amended to accommodate i18n and combined with myriad third-party solutions and walled-garden 'standards' designed to combat real and perceived threats: spam, DDoS, backscatter, spoofing, joe-jobbing, image-encoded spam...

The main questions around your proposed auto-updated system are: which combination of these solutions is it using, how much are you paying someone to maintain it, and do you care about the ability to customize around the inevitable false positives caused by the filters that have to be in place?

For large organizations there are solutions - Exchange being one - but they still require large amounts of custom work. Business A's needs still differ greatly from Business B's, even though we're essentially just talking about sending 150 KB of ASCII from machineA to machineB.
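To be fair to the 'basic' view, the core protocol really is that simple. A sketch of a raw SMTP session, typed by hand against a hypothetical receiving host:

    $ nc mx.example.com 25
    220 mx.example.com ESMTP
    HELO machinea.example.org
    250 mx.example.com
    MAIL FROM:<alice@machinea.example.org>
    250 2.1.0 Ok
    RCPT TO:<bob@example.com>
    250 2.1.5 Ok
    DATA
    354 End data with <CR><LF>.<CR><LF>
    Subject: hello

    just a bunch of ascii
    .
    250 2.0.0 Ok: queued
    QUIT

Everything else - SPF, DKIM, reputation, filtering - is scar tissue layered on top of that exchange.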


Basically, I want to see the Google Chrome of email servers.


Setting up an email server is mad simple.

Setting up an email server that can send and receive mail from other email servers reliably ... not so much.

Most of that is a result of spam and viruses, and the various ad hoc methods which have been institutionalized to deal with both. More often than I'd care to admit, it's the collision between two sites' countermeasures and defenses: PGP-signed mail rejected for having attachments, VERP mail rejected for failing to provide a single consistent envelope sender (but who even looks at the envelope sender?), your random residential DSL / server-farm server winding up on a DNSBL, Yahoo pretty much insisting on properly-configured DKIM to receive mail (hey, it's their standard), dictionary spam attacks on your server, remote sites which scan for viruses after accepting your mail (and then blame you when the messages don't arrive).

It's a constant game of whack-a-mole. For basic communications, if you get yourself hosted in valid IP space, it's possible, but if you want high-volume reliable comms, it's getting pretty tough.
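The DNSBL case at least is easy to check for yourself - a sketch using Spamhaus ZEN as the example zone (127.0.0.2 is the standard test entry that every DNSBL lists by convention):

    # reverse the IP's octets and query them against the blocklist zone
    dig +short 2.0.0.127.zen.spamhaus.org
    # any 127.0.0.x answer means the address is listed; no answer means clear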


How would a wizard know what MX to use or how many, their priorities, etc. in DNS? That's just one example. Email requires knowledgeable systems and operations staff. Many small to mid-sized companies do very well selling email services to others. You can purchase such a service, but even then, you or someone working for you has to understand MX records and DNS and why they are important, how to set them up, etc.
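For the curious, this is the kind of data a wizard can't guess for you - a hypothetical domain's MX records, where the lowest preference number is tried first:

    $ dig +short MX example.com
    10 mx1.example.com.
    20 fallback-mx.example.com.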


I believe the GP's point is that email should be designed such that it doesn't require all that crap.


Because of spam, phishing, and malicious emails.


Web browsers seem to handle malicious web pages just fine. Why is email so special?


Primarily because the Web is a Pull medium, whereas email is Push. To get my eyeballs on the web you first have to trick me into going to your location. To get my eyeballs on Email you just have to know my easy-to-guess personal address.


OTOH, those "critical" automated emails turn out to not really be that critical most of the time.


My alert mails -- pretty damned critical. That's another place where services such as PagerDuty really shine. They get the mail (or pages, or calls) through.


I completely agree. These days there's very little wrong with sending HTML-based transactional email provided you've covered your bases on all other potential flags and send as multipart with a plaintext component. At PostageApp many clients have amazing inbox delivery with HTML transactional, and the benefits - improved tracking, better funnel direction etc. - outweigh the minor delivery drawbacks in many cases.

Frankly, many who send 'plaintext' transactional mail in fact end up sending it as HTML these days in order to facilitate open tracking from embedded images, so they aren't gaining much beyond avoiding minor flags like 'HTML_IMAGE_RATIO'.

Edit: apparently the MailRox invite email _is_ an HTML email, for the exact reasons outlined above. Given that, I can see no reason why you wouldn't want to add some visual sparkle to it. Incidentally, you're sending without a plaintext component, which can cause delivery problems at some ISPs.
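For reference, the dual-format structure is just a multipart/alternative body with the plaintext part first - a minimal sketch piped to a local sendmail (addresses and boundary are placeholders):

    sendmail -t <<'EOF'
    To: user@example.com
    From: app@example.com
    Subject: Your invite
    MIME-Version: 1.0
    Content-Type: multipart/alternative; boundary="b1"

    --b1
    Content-Type: text/plain; charset=utf-8

    The plaintext fallback that keeps the stricter filters happy.
    --b1
    Content-Type: text/html; charset=utf-8

    <p>The HTML version, visual sparkle included.</p>
    --b1--
    EOF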


It was my first association as well, but I assumed it was used in a jokey linkbait fashion, rather than trying to trade on the reputation of McColo ;)

Always great to see new tools in the email space - would love to get an invite to check out the system and potential integrations.


Tom, you can sign up for an invite on the homepage:

https://www.mailrox.com

Your invite should come straight through :)


Some stream of consciousness thoughts on the history of internet communities, particularly those centred around tech.

Usenet had immense value in well-defined subgroups prior to the Eternal September (and for some time after, regardless of what people may say). IRC ha(s|d) similar value, and remains a force within niche communities on the tech side. Slashdot was an early mover in the moderated community space, which had to arise from the newfound populist web.

I still think /.'s comment moderation was superior to the HN system (both pre- and post-visible comment scores), but the firehose was too late and too poorly implemented to solve editorial issues.

In the middle of this, Kuro5hin rose and fell; MetaFilter grabbed some component of the serious moderated discussion, which it still retains. Fark came and went. Boing Boing, SA, b3ta - all significant for a time, but not names on people's lips today.

HN cannibalised a significant portion of /., but failed to convert the greybeards - the discussion here is noticeably different because of it (and sometimes lacking in perspective).

Digg suffered greatly from demagogues (as does HN to an extent), descended too rapidly into linkbait and celebpop trash, and fell to Reddit. The redesign was just the nail in the coffin of an already dead community.

Reddit grew from its initial tech focus into a very granular experience, with a current frontpage of dubious intellectual interest, but its popularity speaks volumes for the ad-hoc community created by diverse interest groups sharing a common central park. It struggles with discovery for new members, and an apparently declining average user age.

Communities come and go. Small herds migrate towards the latest point of interest and some stick. Groupthink is a large driver of community malaise, certainly within the tech discussion arena. Individuals dominate submissions and discussions and evolve to minor demagogue status. Some communities evolve to tackle a smaller arena than just the general topical discussion field, but topicality remains critical.

Quora has tackled 'big answers'; StackOverflow, 'correct answers'. These are minor elements of the value of the larger communities, much in the same way that Hipmunk, Airbnb etc. have abstracted value away from Craigslist. Hyperlocal is the next big thing, with FrontPorchForum and NextDoor tackling non-technical local discussion.

I still view the approaches to these problems as relatively unsolved and ripe for disruption, in particular the algorithms related to subject and comment popularity, user 'karma' (for better or worse), and approachable comment threading when a userbase grows beyond the 'scan a single page' scale. I'm not convinced that a one-size-fits-all approach will ever work, but even within niche tribes there remains a problem with staying 'current' while avoiding alienating the 1-2% who drive much of the discussion.

I fully expect a new dominant discussion forum to arise in the tech scene in the next couple of years, but Lobsters seems to be a kneejerk reimplementation of HN that, even if it claws out some traction, would have to evolve rapidly to solve these problems rather than dangling the 2013 model of a 2012 carrot.


> Communities come and go. Small herds migrate towards the latest point of interest and some stick. Groupthink is a large driver of community malaise, certainly within the tech discussion arena. Individuals dominate submissions and discussions and evolve to minor demagogue status. Some communities evolve to tackle a smaller arena than just the general topical discussion field, but topicality remains critical.

I agree with this, but this is also why I have now come to seriously question why anyone would be interested in investing in any community-based site. Let's not mistake those for social networks (which in and of themselves have their own potential issues). But regarding community-based sites, I can't ever see any as having long-term value, not after all the carcasses I see lying in decay on the web. :(


I'm curious - are there any active newsgroups still worth visiting and discussing things in?

I used to roam some groups until about 2003, but most of them are pretty much abandoned now, and I haven't been able to find the same level of discussion elsewhere on the Internet.


You forgot Advogato.


Speaking of weirdly complex, I've always been a fan of the totally deadpan delivery of the 'Straight Line Tutorial' [GIMP lacks a 'line' or 'shape' tool]: http://www.gimp.org/tutorials/Straight_Line/


I've been lambasted for not knowing how to do this in gimp... all I wanted was an icon somewhere!


Sending HTML-only email is still a large spam flag, even at major ISPs, independent of the concerns about client compatibility.

At PostageApp our solution to the problem is a templating engine where you write your HTML/CSS separately and we inline it at run time to ensure client compatibility. We then provide a separate Plaintext tab with an 'import from HTML' function which does most of the work of managing dual formats for you.
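Outside a templating engine you can approximate that import step yourself - a rough sketch, with lynx's text dump standing in for a proper HTML-to-text converter:

    # derive the plaintext part from the finished HTML template
    lynx -dump template.html > template.txt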


Email deliverability these days is heavily based on sending IP reputation, to a lesser extent domain reputation, and finally content.

Once you have ticked the core boxes of compliance - clean HTML in your templates, never sending HTML-only email but multipart with a plaintext part, your MTA's RFC compliance, signing with DKIM and publishing SPF - reputation hinges strongly on recipient behaviour at the largest ISPs. Marking an email as spam is an obvious flag, but additional heuristics such as open rate, clicks, deletes etc. are used to determine your relationship with that recipient, and your reputation over time.
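Both DKIM and SPF are published in DNS, so they're verifiable from the outside - a quick sketch (the domain and the DKIM selector 's1' are placeholders; the selector is whatever your signing setup uses):

    dig +short TXT example.com | grep spf        # the SPF policy record
    dig +short TXT s1._domainkey.example.com     # the published DKIM public key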

The most common issues these days with content-based filters come from smaller independently run installs - SpamAssassin, ASSP etc. As a legitimate sender you can usually cover inbox delivery to 90% of recipients by building a solid reputation on your IP with the large ISPs. The final 10% will be related to content, linked domains etc. at the smaller services.
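If you want to pre-empt the SpamAssassin class of filters, the tool itself will score a saved raw message for you - assuming a local install:

    # -t (test mode) appends a report of which rules fired and the total score
    spamassassin -t < message.eml | less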

A large reason for the growth in email-as-a-service platforms is that the reputation component is already taken care of, so you can focus on content and engagement instead of the finicky aspects of core email delivery.


This strikes me as somewhat of an edge case - to add a full rules engine for what is essentially a very app-specific set of behaviours.

One main reason the application is the best place to manage these rules is that it's rarely just the email content that dictates the engagement pipeline. Whether a user has interacted with a component, logged in, etc., are all app-side variables which may affect whether or not you wish to send a given email.

That said, one method some of our customers use at PostageApp to simplify some of the logic is to pass a UID to the API based on the unique parameter sets - e.g. recipient + template + hour - and rely on the engine to discard the duplicate API calls.
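A sketch of one such UID scheme (field names are made up; hashing on an hour bucket means at most one send per recipient/template/hour, with duplicate calls discarded server-side):

    # hash the parameter set into a stable idempotency key
    uid=$(printf '%s' "${recipient}:${template}:$(date +%Y%m%d%H)" | sha256sum | cut -d' ' -f1)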


I guess my concern is that you end up coupling components with marketing rules where there would otherwise be little or no connection.


Can you comment on the status of Priceonomics (YC) in this context? They're building a rich interface on top of CL listings data, presumably using 3Taps to get the volume of data without drawing attention.

Is the endgame for these companies to replace the content source, or to hope for (or fight for) a legal precedent that 'opens' CL data to third-party usage?

