Ask HN: What tasks do you automate?
389 points by flaque on July 16, 2017 | 337 comments



I take enormous pleasure in automating every part of my research pipelines (comp sci).

As in, I like to get my experiment setup (usually distributed, with many different components interacting with each other) to a point where one command resets all components, starts them in screen sessions on all of the machines with the appropriate timing and setup commands, runs the experiment(s), moves the results between machines, generates intermediate results and exports publication-ready plots to the right folder.

Upside: once it's ready, iterating on the research part of the experiment is great. No need to focus on anything else any more, just the actual research problem, not a single unnecessary click to start something (even 2 clicks become irritating when you do them hundreds of times). Need another ablation study/explore another parameter/idea? Just change a flag/line/function, kick off once, and have the plots the next day. No fiddling around.

Downside: full orchestration takes a very long time to set up initially, but a bit into my research career I now have tons of utilities for all of this. It has also made me much better at the command line and general setup nonsense.
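
For concreteness, here's a minimal sketch of what such a one-command driver can look like. It assumes passwordless SSH and GNU screen on the machines; the hostnames, script names and flags are made up for illustration:

    import subprocess

    MACHINES = ["node1", "node2", "node3"]  # hypothetical hosts

    def remote(host, cmd):
        # run cmd on host inside a detached screen session so it outlives the SSH call
        subprocess.run(["ssh", host, "screen -dmS exp bash -c '%s'" % cmd], check=True)

    for host in MACHINES:
        remote(host, "pkill -f worker.py; true")      # reset old runs
    for host in MACHINES:
        remote(host, "python worker.py --run-id 42")  # start components
    subprocess.run(["python", "run_experiment.py", "--run-id", "42"], check=True)
    subprocess.run(["python", "make_plots.py", "--run-id", "42", "--out", "paper/figures/"], check=True)

The real version adds per-host timing/setup commands and a result-collection step (e.g. rsync), but the shape is the same: one entry point, zero clicks.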


Another nice thing about setups like yours is reproducibility. So long as you've got your setup in git and you've stored the flags/lines/functions, you can instantly redo the same experiment.


I agree, and I have been working to do this with some of my pipelines as well. One challenge I have been facing is that my compute environment may be quite different from others'. This is mainly the case with respect to distributed computing, which seems to be an essential part of these pipelines: I often wish to experiment with multiple hyperparameter settings, which creates a lot of processes to run.

Do you or the parent or others take steps to abstract away the distributed computing steps so that others may run the pipelines in their distributed computing environments? More specifically, I use Condor (http://research.cs.wisc.edu/htcondor/) but other batch systems like PBS are also popular. Ideally my pipeline would support both systems (and many others).


I ended up writing a simple distributed build system (https://github.com/azag0/caf, undocumented) that abstracts this away. (I use it on a couple of clusters with different batch systems.) You define tasks as directories with some input files and a command, which get hashed and define a unique identity. The build system then helps with distributing these tasks to other machines, executing them, and collecting the processed tasks.
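
The identity idea is just content hashing. This isn't caf's actual code, but the gist is something like:

    import hashlib, pathlib

    def task_hash(task_dir):
        # hash the task's input files (names + contents) into a stable identity
        h = hashlib.sha256()
        for f in sorted(pathlib.Path(task_dir).rglob("*")):
            if f.is_file():
                h.update(f.name.encode())
                h.update(f.read_bytes())
        return h.hexdigest()

    # identical inputs => identical hash, so a finished task is never recomputed
    print(task_hash("tasks/some-calculation"))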

Ultimately, though, I rely on some common environment (such as a particular binary existing in PATH) that lives outside the build system and would need to be recreated by whoever would like to reproduce the calculations. I never looked into abstracting that away with something like Docker.

(I plan to document the build system once it's at least roughly stable and I finish my PhD.)


Maybe containers like Docker are useful for your use case?


Docker can do a good job of distributing the software needed to run the job, which is definitely part of the issue and something I should use more.

However, I also have in mind a pipeline of scripts where one script may be a prerequisite to the other. Condor has some nice abstractions for this by organizing the scripts/jobs as a directed acyclic graph according to their dependencies. I was thinking other batch systems might support this as well. But some of my challenge comes in learning how each batch system would run these DAGs. Each one will have some commands to launch jobs, to wait for a job to finish before running some other job, to check if jobs failed, to rerun failed jobs in the case of hardware failure, etc.

It seems like the DAG representation would contain enough detail for any batch system but there may be some nuances. For example, I tend to think of these jobs as a command to run, the arguments to give that command, and somewhere to put stdout and stderr. But Condor also will report information about the job execution in some other log files. Cases like this illustrate where my DAG representation (or at least the data tracked in nodes) might break down, but I haven't used these other systems like PBS enough to know for sure.
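To make that representation concrete, here's roughly the shape I have in mind (field names are illustrative, not taken from Condor or PBS); a backend for a given batch system would translate this into its own submit files and dependency flags:

    from collections import namedtuple

    Job = namedtuple("Job", ["name", "command", "args", "stdout", "stderr", "deps"])

    jobs = [
        Job("prep",  "python", ["prep.py"],  "prep.out",  "prep.err",  []),
        Job("train", "python", ["train.py"], "train.out", "train.err", ["prep"]),
    ]

    def topo_order(jobs):
        # Kahn-style ordering: repeatedly emit jobs whose dependencies are done
        done, order = set(), []
        while len(order) < len(jobs):
            ready = [j for j in jobs if j.name not in done and all(d in done for d in j.deps)]
            if not ready:
                raise ValueError("cycle in job graph")
            for j in ready:
                order.append(j)
                done.add(j.name)
        return order

    print([j.name for j in topo_order(jobs)])  # ['prep', 'train']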


Apache Airflow defines and runs DAGs in a sane way, IMO. Takes some configuration, but worth it for more complicated projects.


Luigi (https://github.com/spotify/luigi), out of Spotify, sounds exactly like what you're looking for. It allows you to specify dependent tasks, pipe input/output between them, and more.
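
A minimal sketch of the style (the task bodies are placeholders):

    import luigi

    class Preprocess(luigi.Task):
        def output(self):
            return luigi.LocalTarget("data/clean.csv")

        def run(self):
            with self.output().open("w") as out:
                out.write("col1,col2\n")  # real work goes here

    class Train(luigi.Task):
        def requires(self):
            # Luigi runs Preprocess first, and skips it if its output already exists
            return Preprocess()

        def output(self):
            return luigi.LocalTarget("models/model.txt")

        def run(self):
            with self.input().open() as data, self.output().open("w") as model:
                model.write("trained on %d bytes" % len(data.read()))

    if __name__ == "__main__":
        luigi.build([Train()], local_scheduler=True)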


You should check out pachyderm [1] for setting up automated data pipelines. Also great for reproducibility.

[1]: http://www.pachyderm.io


Yes, this comes in especially handy when reviewers ask for additional experiments.


This is really interesting, and the shape I hope all research takes in the future.

Classic HN followup (at least I hope): What's currently getting in your way or annoying you? What problem would you like to just disappear?


I actually had to think about this for a few minutes. Having written a number of custom orchestration tools for both local clusters and the cloud, I suppose debugging distributed services is still incredibly tedious.


It's great that you do that, thank you. Do you also publish your data, programs and setup to allow others to reproduce/build on your research?

When talking about reproducibility in science, it's usually about the availability of the original data to verify that the original conclusions were correct, but one level higher there's also computational reproducibility, with its own challenges (freezing the original environment of the experiment).

So-called orphan repositories (i.e. for content that doesn't fit into any other bucket) like Zenodo welcome your articles, datasets and code.


I do this as well (research in computational chemistry). Incredibly useful once set up; plus, forcing oneself to think about a project in abstract terms in order to automate it can give valuable perspective.


This. I do the same, although these days I've narrowed down my scope somewhat to stuff I _must_ use rather than my old Ansible/Hadoop stack ensemble.


How much time does a full reset take?


A few minutes - killing processes on one machine, restarting a database on another machine/cluster, reloading a schema, importing data (for some experiments), deploying a new service, loading TensorFlow models, warming up benchmark clients. I found that the key to good research pipelines is having a really consistent way of passing configurations around between components.

My components aren't strictly microservices (a mixture of open-source components and handwritten tools) and they interact with each other over all sorts of protocols (importing JSON, CSV, gRPC, HTTP), but I essentially treat the configuration flags as their API, so there are no implicit configurations that I could forget about. The rest is just naming things well, e.g. descriptive names for experiment result files etc.
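
In miniature, the "flags as API" idea is just that every knob is an explicit, named flag, so a run is completely described by its command line (the flag names here are only an example):

    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument("--db-host", required=True)
    parser.add_argument("--batch-size", type=int, default=64)
    parser.add_argument("--result-file", required=True)  # descriptive names double as documentation
    args = parser.parse_args()

    print("connecting to", args.db_host)  # component behavior driven only by flags

No hidden config files means the exact invocation can be logged next to the results it produced.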

Initially I thought everyone was doing this, but from talking to PhDs in other domains I noticed a strong bias: it's people working in complex distributed settings who have these pipelines.

My friends who devise ML models and just test them on datasets on their laptop never had a real need to get a pipeline in place because they never felt the pain points of setting up large distributed experiments.


All of my thesis project in immunology was automated, which involved several hours of blood processing repeated several thousand times (with some parallelization) by a team of a dozen robots. There are pics, schematics and vids here: http://www.zachbjornson.com/projects/robotics/.

I also like to say that the final analysis was automated. It was done entirely in Mathematica notebooks that talk to a data-processing API, and it can be re-run whenever. The notebooks are being released along with the journal article for the sake of transparency and reproducibility.

(Also, I automated my SSL cert renewal ;))


Awesome. You should write a paper on your paper writing / research / automation / publication process, to help advance the way scientific publishing is done.


This is awesome. I work in a research lab that does a lot of bench work and imaging under microscopy. I think that automating some of these processes would greatly speed up the research being done!


The "automation" I'm most proud of was writing a bot that would play Farmville for me.

I was at university, and Farmville was all the rage on Facebook. My girlfriend wanted me to play because it'd mean she'd be able to trade stuff with me or something (I forget why exactly), and I eventually caved in.

After ten minutes of playing it, I was bored. I couldn't really judge people that would click plants hundreds of times, several times a day, though, because I played World of Warcraft. It was just a more interesting type of grinding...

I figured out that in order to grind through the game most efficiently, I'd need to plant Tomatoes every two hours, so I wrote a bot that would:

1. Spin up a VM.

2. Open the browser to Farmville.

3. Open up an automated clicking application I had written that worked on Flash.

4. Find the outermost vegetable patch.

5. Click in a 20x20 grid (or however big the whole area was).

6. Replant, and close.

I didn't tell my girlfriend about the bot, and I'd turn it off when I went to visit her, so she was shocked when she went on my farm to see that I was a higher level than her. I'd jokingly feign ignorance, saying that I was just playing it like her, until one day when I had left the script running and she saw my farm picking itself while I was studying.


Fun story: with an extremely slow internet connection (satellite), you could send valid requests to plow fields in ways that allowed otherwise-impossible vertical farming.


The only time ever my shitty Australian connection is a benefit.


I'm the kind of nerd who greatly prefers writing automation code to doing anything remotely repetitive. (I'm afraid to work out the actual timings because I'm pretty sure that I often spend more time coming up with the automation than just doing the task would take).

I've got a script that automatically rips, converts and stitches together audiobooks from the library so that I can play them on my phone. It just beeps periodically to tell me to put the next CD in.

I also had a batch job that downloaded Doonesbury cartoons (including some delay logic so I wasn't hammering the server) and built a linked series of html pages by year and month. I've ported it to a couple of other webcomics so that I can binge read.

I also write a lot of LaTeX macros, doing things like automatically importing and formatting code from a GitHub gist into lecture notes (something like \includegist{C,<path/to/gist>}), or autogenerating pretty PDF'd marks summaries for students from my home-rolled marks database.

Another thing I like is building little toys to demonstrate things for students, like a Mathematica page that calculated the convergence rate and error for the trapezoidal rule (numerical integration) with some pretty diagrams.

I once wrote a bunch of lisp code to help with crypto puzzles (the ones that use a substitution code, and you try to figure out the original text). The code did things like identifying letter, digraph and trigraph frequencies, allowed you to test substitutions, etc.

As developers, we tend to focus on these big integrated projects. But one of the biggest advantages that people who can code have is the ability to quickly get a general purpose computer to assist with individual tasks. I write an awful lot of code that only gets run a handful of times, yet some of those projects were the most pleasure I've ever had writing code.


I go about automation in an even less efficient way.

I spend many months doing repetitive tasks. And then I realize I should automate them, and proceed to spend hours coming up with scripts/tools to automate them.

Happens way too often...


There are some really clever people here, but as a general rule, you can't truly automate something until you can do it manually to the point where you're fully aware of all the snags and exceptions that may occur.

Once you reach that point, it then becomes a matter of trading-off how much time/money/effort it will take to automate the task against what benefit you get in return.


Agreed, but it's important to include one criterion in the trade-off calculation: I'd much rather be writing automation code than doing most automatable tasks (i.e. repetitive, simple decision tree). Even if I don't save any time, or even if it actually costs me a little time, I count it as a win. Especially since I often discover useful tools and techniques (holy smokes! Someone already wrote a parser for this weird thing I'm playing with!) that end up being valuable later in a completely unrelated project.

True story: some colleagues wanted to integrate a departmental Moodle server with some bespoke scheduling software we were running. Turned out I already had most of what we needed, because a year earlier I'd gotten irritated at hand-loading class lists into Moodle and hacked together a bunch of code to directly translate entities from one database to the other. I'd even generalized it into a bunch of types and tables that I didn't really need, because OCD. All that 'hobby' code ended up being really valuable later.


I believe there is also merit in spending a lot of time repeatedly doing something before proceeding to automate that workflow. Because only through that do you gain a deep understanding of the edge/exception cases, which you can then code directly into your script to manage.


Check out the ski rental problem, it will explain how much manual work to do before automation is right.


I tried googling for this and didn't find anything - do you have a link?


Sure thing.

https://en.wikipedia.org/wiki/Ski_rental_problem

[...] The ski rental problem is the name given to a class of problems in which there is a choice between continuing to pay a repeating cost or paying a one-time cost which eliminates or reduces the repeating cost. [...]


> I've ported it to a couple of other webcomics so that I can binge read.

Maybe this can save you some trouble:

https://www.comic-rocket.com/help/archive-binge/


Thanks!


Cool! Maybe you'll enjoy something like Sikuli. It basically lets you automate actions on your desktop or laptop visually. It's based on OpenCV and has a simple interface.


At least in the US, the Overdrive app for Android lets you borrow audiobooks from the library so you can play them on your phone. No need to rip or convert.


Yes, and I use it quite a lot. My local library system, however, has a huge collection of CD audiobooks, many of which aren't available (at least in Canada) on Overdrive or Hoopla. I run my script in a window when I'm working on other things, so it's trivial to just feed the disk monster. The audiobook then just shows up in my Dropbox, ready to play.


Just a PSA for anyone who happens across this: There are some apps for Android that understand and play multi-file audiobooks just fine. I use Smart Audiobook Player and it works great.


I'd be really interested in that audiobook script. I have some that are ripped but not stitched together. I want to get to the point where all my audiobooks are one file.


I usually do one file per CD. You can usually just cat .mp3 files together, although some apps that expect non-standard metadata can get confused (mp3wrap is a more robust solution). 'cat *.mp3 > disk.mp3' (assuming that ls order is the order you want. Or 'cat first.mp3 second.mp3 last.mp3 > all.mp3') Other formats (like m4a) can be stitched together using utilities like ffmpeg.
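
If you'd rather avoid the raw-cat trick entirely, ffmpeg's concat demuxer re-muxes without re-encoding; a rough Python wrapper (the paths are examples):

    import pathlib, subprocess

    files = sorted(pathlib.Path("discs").glob("*.mp3"))  # assumes ls order is play order
    pathlib.Path("concat.txt").write_text("".join("file '%s'\n" % f for f in files))
    # -c copy keeps the original audio streams, so this is fast and lossless
    subprocess.run(["ffmpeg", "-f", "concat", "-safe", "0",
                    "-i", "concat.txt", "-c", "copy", "book.mp3"], check=True)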


Please share them macros, that would be gold


>(I'm afraid to work out the actual timings because I'm pretty sure that I often spend more time coming up with the automation than just doing the task would take).

You can use this calculator to estimate how much time is worth investing in automating a particular task =)

https://c.albert-thompson.com/xkcd/


Since I have a toddler, I'm longing for a house with a garden, which starts at 800k EUR in pleasant neighborhoods in Amsterdam now - above my pay grade. So I wrote a script that compares surrounding towns on a number of metrics (4+ rated restaurants per citizen, for instance) and lets me know when there are houses for sale with a garden facing south (or north, but only if it's sufficiently long (10m+) that we are likely to enjoy some sun), etc.

So far this has not resulted in us buying a house and the hours that went into the project would have probably long paid for a good real estate agent :)


At the moment, you will not successfully find a house without paying a realtor. They can view entries in the Funda(1) database before they're publicized. Source: a photographer acquaintance who works exclusively for realtors.

(1) Funda.nl is a real estate website that more or less has a monopoly in The Netherlands.


Interesting! Any chance you might make the script public?


Great idea. I never thought about it, but why not use it to find the perfect place? The time will come and you'll find your space.


Is it general purpose enough to be used outside Amsterdam with minor modifications?


This is really cool.


- Downloading a song off YouTube, adding metadata via beets and moving it to my music library

- Adding tasks to my todo-list client from every app I use (including my bookmarking service, when I bookmark with specific tags)

- Changing terminal colours based on the time of day (lower brightness in the evenings and hence dark colours; too much sunlight in the mornings and hence solarized themes)

- Automatically messaging back people who message me, based on priority (parents immediately, girlfriend after a longer buffer)

- Filters on said messages in case a few require my intervention

- Phone alerts on specific emails

- Waiting for a server you were working with to recover from a 503 (happens often in dev environments) when you're tired of checking every 5 seconds: ping scripts which message my phone while I go play in the rec area (see the sketch below)

- Disabling my phone's charging when it nears 95% (I'm an Android dev and hate that my phone is always charging)

- Scraping websites for specific information and making my laptop ping when the scenario succeeds (I don't like continuously refreshing a page)
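
That 503 waiter is nothing fancy; stripped down, it's basically this (the URLs are placeholders, and the push endpoint could be Twilio, Pushover, whatever you use):

    import time
    import requests

    CHECK = "http://dev-server.local/health"   # hypothetical dev server
    NOTIFY = "https://example.com/push"        # hypothetical phone-notification webhook

    while True:
        try:
            if requests.get(CHECK, timeout=5).status_code == 200:
                requests.post(NOTIFY, data={"message": "server is back up"})
                break
        except requests.RequestException:
            pass  # still down or unreachable
        time.sleep(30)  # poll every 30s instead of mashing refresh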

I don't think several of these count as automation as opposed to just some script work. But I prefer reducing keystrokes as much as possible for things which are fixed.

Relevant to this discussion, an excerpt from the GitHub page:

>OK, so, our build engineer has left for another company. The dude was literally living inside the terminal. You know, that type of a guy who loves Vim, creates diagrams in Dot and writes wiki-posts in Markdown... If something - anything - requires more than 90 seconds of his time, he writes a script to automate that.

https://github.com/NARKOZ/hacker-scripts


> - Automatically messaging back people who message me, based on priority (parents immediately, girlfriend after a longer buffer)

I'm curious what these automatic messages say. Are you talking about something like an answering machine message? "I'm at home but my phone hasn't moved in 20 minutes so I'm probably in the shower"?


Imagine if Archer could make elaborate text message pranks. Leave it!


I wonder if stopping phone charging at 95% actually gives better battery lifetime. It leads to your battery cycling between 95% and whatever state of charge you start charging at again. If you charge up to 100%, your phone will stop charging but use power from the charger, bypassing the battery, therefore keeping the battery in better shape.

Charging to 80% and keeping it there by just letting the charger supply the operating current would probably be best if phones supported it.


Can you please share the code for your scripts? Seems interesting!


I'm curious as to how you did the "phone alerts on specific emails". I'm currently setting my Raspberry Pi up as a motion detector (we were burgled 2 weeks ago)... and while I can do the motion detection & send out the email, I really want my phone to go nuts when it gets the alert email (mostly I have notifications turned off).

Any hints appreciated! I'm also experimenting with Telegram, which I think will also solve my issue.


Twilio and Amazon (SNS, I think) are dirt cheap and easy to use (at that non-scale).


Seconded. Whenever I need an extremely visible notification, I have Twilio literally give me a phone call.


> - Downloading a song off YouTube, adding metadata via beets and moving it to my music library

I don't want to be that guy, but a music collection made of probably repeatedly lossy-encoded YouTube rips sounds as if it doesn't deserve the name. Not to mention the whole piracy angle - I'm not trying to claim moral authority, but music is DRM-free and reasonably priced these days. There are hardly any arguments for piracy left.


> I don't want to be that guy

Definitely sounds like you want to be that guy.


My day-to-day decisions are mostly automated - what to eat for breakfast? What clothes to wear any given day of the week? When to walk my dog and for how long? When to leave work and which back-roads route to take home? Lunch options? When to call the folks? Exercise schedule? All automated.

It gets a little repetitive and boring at times but I'm able to save so much time and energy this way to focus on what's important to me.


This reminds me of a quote from "Surely You're Joking, Mr. Feynman!":

When you're young, you have all these things to worry about - should you go there, what about your mother. And you worry, and try to decide, but then something else comes up. It's much easier to just plain decide. Never mind - nothing is going to change your mind. I did that once when I was a student at MIT. I got sick and tired of having to decide what kind of dessert I was going to have at the restaurant, so I decided it would always be chocolate ice cream, and never worried about it again - I had the solution to that problem.


I've started to adopt a similar practice, although a somewhat more flexible one.

Whenever I'm at a restaurant, or ordering food for whatever reason, I will check the menu for an item that prominently contains mushrooms. If such an item exists (and it doesn't sound awful or ridiculously expensive), then that's what I'm getting. Decision made.

I started doing this mainly because I like mushrooms, enough that it felt like a really easy way to hand off decision making. It's also good in that, unlike the above, I don't get just the exact same thing every time (although mushroom burgers are starting to seem pretty same-y).


One thing I like to do, especially when I'm too tired or fuzzy to spend my limited energy on a large number of choices, is to wait and see what others are ordering and then pick from those options which you like best.

Also reduces the odds that you'll be looking at someone else's plate thinking "I wish I'd have ordered that instead" :)


What about trying new things, experimenting with life?


I think the idea is to remove decision making from things you personally don't care too much for to give you more time to focus on those decisions you do care a lot about.


Decide that you will always try something different from last time.


I always try to identify items I wouldn't be able to get elsewhere. For instance, if I want to order a beer I go for those that are in limited stock or from a local microbrewery. Spice of life and what not.


Um yes, go for it? :)


Nice change of focus - automating mental tasks, rather than just computer tasks.

When you don't have to spend mental energy on it, it's automated. I think that's the real benefit of Steve Jobs's standard outfit: don't have to choose, don't have to shop, just wear clothes and get on with the interesting stuff.


It would be very interesting to know how you've gone about automating those things.


How do you automate these?


For breakfast, I decided to stop trying different cereals and stick to oatmeal. Been a horse for about 2 years now. Same with lunch - find 5 reasonable choices in the office caf, repeat.

For work clothes, I have 5 decent combos which I repeat M-F. Doesn't bother me to repeat. Helps that I'm a dev so I'm always Feynman-ing myself away from unnecessary meetings and face time with business side clients.

For commute routes, I watched my phone's GPS for a few weeks, tried different routes, and seeded new ideas for routes and detours by brute-forcing options into the backend. Now I can consistently get home within 50 minutes of leaving work, which includes driving down a few floors of parking at 10-15 mph; I used to take anywhere between 45 and 90 minutes before. Plus my fuel economy is usually at a respectable 26-28 mpg for a midsize sedan.

Hope this helps.


Habit probably.


For the clothes it's as simple as creating a FIFO rotation of some sort... for example, putting the clean laundry at the bottom of the drawer (underneath what's in there), or at one end of the closet rack, and pulling the day's clothes from the top of the stack or the other end of the rack.


And of course, there's the "buy a full set of interchangeable socks" approach. I did this a few years ago and it's great. Pairing socks seems trivial but saving 10-20 minutes a week, every week, for minimal investment (I needed socks anyway) is a huge win.

One caveat, though - if you do this, commit to it. Don't do what I did and keep one or two pairs of non-conforming socks just because I liked them. Because now they just recirculate through my sock drawer, gumming up my otherwise perfect system. :P


Sounds like you need a second container. I went for a full set of black socks and black underwear, and all anomalous socks/underwear that survived the QC process sit on the other side of a piece of cardboard I Blu-Tacked to the drawer. It works, and I find myself a fruitful object of comedy.


It can be more complicated than that. The next shirt in line is on the ironing board, not in the closet. I have to wear a tie today! I'm pulling a shirt from down the line. Can I also have a reminder of which pants go best with that shirt? I didn't do laundry yet so half my pants aren't available. Now I did do laundry, so half my pants have only one wear left in them and half have two.

Here's a screenshot of the thing I made in Flask to keep track: http://i.imgur.com/VNJ08Ra.png. Orange needs to be washed next time I wear it, red needs to be washed now.


Maybe you're overcomplicating it? Pulling the next shirt in line is a suggestion, not a requirement. Maybe you want the second or third shirt in the queue. A small amount of common sense is required, but less than if you were looking at your entire wardrobe at once with no other information about it, and trying to make a decision.


I just wear the same thing each day, or close enough. Being a married, middle-aged male dev, nobody cares as long as I don't smell and I'm reasonably neat. I certainly don't care; nobody to impress.

Having five sets of the same clothes certainly makes life easier. And when I finally do change my attire the office gets a laugh out of the fact I actually went into a clothes shop.


Just wear one outfit. I measured 3 different shirts against the same jeans and shoes for six weeks, counted which got the most compliments, and wear just that now.


I never thought about automating those things. That's a cool idea! I could use the Yelp API for lunch, some RFID chips for my clothes, ...


How?


please share!


I am not a programmer, but I've automated a few things in my life.

I self publish graphic novels. I have a script that runs on a directory full of page files and outputs a CSV in the format InDesign expects. I wrote it after manually editing a CSV and leaving a page out, and not noticing that until I had an advance copy in my hands and 400 more waiting to be shipped from the printer. That was an expensive learning experience.

I like to rotate my monitor to portrait mode sometimes, but hate trying to rotate the Wacom tablet's settings as well. So I have a script that does it all in one go. It used to also keep track of separate desktop backgrounds for landscape and portrait mode, but this stopped working right, so I took that part out.

I have a bunch of LIFX bulbs in my apartment. The one near the foyer changes color based on the rain forecast and the current temperature, to give me an idea of how to dress when going out, thanks to a little Python script I keep running on my computer. Someday I'll move it to the Raspberry Pi sitting in a drawer.
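
The bulb script is only a few lines; roughly like this, with the weather lookup and token details elided (the color mapping and endpoint parameters here are illustrative; check LIFX's HTTP API docs for the exact fields):

    import requests

    LIFX_TOKEN = "..."  # personal API token

    def forecast_says_rain():
        # placeholder; the real script queries a weather API for today's forecast
        return True

    color = "blue" if forecast_says_rain() else "yellow"
    requests.put(
        "https://api.lifx.com/v1/lights/label:Foyer/state",
        headers={"Authorization": "Bearer " + LIFX_TOKEN},
        data={"color": color},
    )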

I recently built a Twitter bot that tweets a random card from the Tarot deck I drew. I've been trying to extend it to talk to Mastodon as well but have been getting "request too large" errors from the API when trying to send the images. Someday I'll spin up a private Mastodon instance and figure out what's going on. Maybe. Until then it sits on a free Heroku account, tweeting a card and an image of its text about once a day.

And does building a custom Wordpress theme that lets me post individual pages of my comics, and show them a whole chapter at a time, count as "automation"? It sure has saved me a lot of hassle.


I have some news for you. You are a programmer.


If doing that stuff over about 4 or 5 years or so makes me a programmer, then I'm also a carpenter, because I've put together a few pieces of furniture over the same span.


You're a programmer and carpenter. It doesn't have to be either/or.

I must say your coding achievements are impressive (great utility). Any other skills you haven't mentioned yet? :)


Well mostly I draw comics. In Adobe Illustrator. I spent four and a half years drawing a story about a robot lady dragged outside of reality by her ex-boyfriend: http://egypt.urnash.com/rita

There's other art stuff lurking around my site, too: http://egypt.urnash.com


Difference is, you're not a programmer professionally. "Carpenter" strictly implies a profession--I think. Native English speakers feel free to correct me :)

In my head, "carpenter" just has a different connotation. But it's not very clear-cut. I used to DJ a lot at parties during my university years, was I "a DJ"? I prepare dinner for 8-10 guests regularly, am I "a cook"?

It kind of depends, I guess. I've drawn a large number of cartoons in my life, but (with few exceptions) never sold them for money. I wouldn't call myself a cartoonist (mainly because I don't do it as much recently), but I would definitely not call myself "not a cartoonist" :) That would be selling myself short :)

I'm not a programmer professionally either. But then, I do have a CS degree, and I still write code almost daily. I'm a programmer.

I also generally don't feel that the fact if you do something professionally (for money) should really count as "what you do" (identity, in a sense), as this comic illustrates: http://i.imgur.com/MNJzpqL.jpg (I didn't draw this, btw)


My personal definition is that a programmer is someone who can write a computer program, and a developer is someone who writes computer programs for money. So you're a programmer, but not a developer. In the same way that you might be called a woodworker, but not a carpenter.


I like to think that a 'developer' or 'software engineer' is someone who can program not to just make it work, but addresses other concerns as well.


A programmer is to a woodworker what a software engineer is to a carpenter.

You are a programmer, and a woodworker.


I was wondering who he thinks a programmer is?


She.

A programmer is someone who can program a computer, and chooses to call themselves a programmer. If absolutely nobody is willing to pay them to do so then maybe they're lying when they call themselves one.

If you ask me what I do, I'm not gonna say "I program computers", I'm gonna say "I draw comics and stuff". Because that's what I spend my work life doing. And occasionally I need to do some tedious task, and find a way to automate it, because I spent enough time in my youth fooling around with programming that I'm not afraid to get my hands dirty now and then.


I would call that a professional programmer.

Like, you can be a swimmer, or you can be a professional swimmer. You have to be a painter a long time before you sell your stuff.

Money is as much a corrupting influence as it is a validator of talent and a correlate to time spent.


- Data pipelines (as seen elsewhere here)

- Anything related to infra (I do Azure, so I write Azure templates to deploy everything, even PaaS/FaaS stuff)

- Linux provisioning (cloud-init, Ansible, and a Makefile to tailor/deploy my dotfiles on new systems)

- Mail filing (I have the usual sets of rules, plus a few extra to bundle together related e-mails on a topic and re-file as needed)

- Posting links to my blog (with screenshots) using Workflow on iOS

- Sending SMS from my Watch to the local public transport info number to get up-to-the-minute bus schedules for some pre-defined locations (also using Workflow)

- Deploying my apps on Linux (I wrote a mini Heroku-like PaaS for that - https://github.com/rcarmo/piku)

- Searching for papers/PDFs on specific topics (built a Python wrapper for arxiv/Google/others that goes and fetches the top 5 matches across them and files them on Dropbox; rough sketch below)

- Converting conference videos to podcasts (typically youtube-dl and a Python loop with ffmpeg, plus a private RSS feed for Overcast)
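
The paper-search wrapper mentioned above boils down to hitting each source's query endpoint. For arXiv alone, a bare-bones version looks something like this (the Google/other backends and the Dropbox filing are left out):

    import urllib.request
    import xml.dom.minidom

    # arXiv exposes a public Atom API
    url = ("http://export.arxiv.org/api/query"
           "?search_query=all:distributed+systems&max_results=5")
    with urllib.request.urlopen(url) as resp:
        doc = xml.dom.minidom.parseString(resp.read())

    for entry in doc.getElementsByTagName("entry"):
        title = entry.getElementsByTagName("title")[0].firstChild.data.strip()
        print(title)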

Every day/week I add something new.

(edit: line breaks)


> Searching for papers/PDFs on specific topics (built a Python wrapper for arxiv/Google/others that goes and fetches the top 5 matches across them and files them on Dropbox)

Any chance you would post your code? This looks interesting, especially if it integrates sci-hub/libgen


Maybe you should automate your line breaks :)


> private RSS feed for Overcast

I for one would be interested in reading more about how you set this up!


I'm aggregating flash sales and sending POST requests to Azure ML using Huginn. It's a work in progress, but Huginn seems to be working well. Also considering giving NiFi a go, but the setup seems a bit over my head.

https://github.com/huginn/huginn

http://nifi.apache.org


What are you doing with the data in Azure ML? What is your experience with Azure ML? How does it compare to the ML offerings from AWS?


I'm ranking the sales based on their titles. Azure makes it dead simple to use your trained model as a web service. I'm not familiar with AWS; I just sort of fell into incorporating ML into my project, and since I'm in BizSpark, everything that I'm doing is free. Another solution I'm considering is using PredictionIO for the web service.

http://predictionio.incubator.apache.org/index.html


In 2003 I had a Perl script to query the job boards for keywords, scrape the results and send out an application email with my CV attached (I took care to send only one application to a single email address). I think this was a legitimate form of spamming - at that moment the local job market was very bad.


Funny. I think Amazon does that on LinkedIn when trying to find applicants :)

Except they don't take care to send one email per person.


That's cool, did you manage to get a job that way back then?


Positive. I also think I mentioned it at the interview; it seems to have made a good impression on at least one interviewer.


Tool-assisted job hunting!


I need to upload invoices every month from all ~20 SaaS products I subscribe to into accounting software. Most of the invoices can just be redirected from email to another SaaS that lets me download a zip file containing all invoices from a date range. Other products require me to log in, navigate to a page and download a PDF or print an HTML page. I have browser-automated all of these laborious ones as well, so everything ends up in that zip file. Saves me 30 minutes monthly and especially saves me from the boring work.
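
For the stragglers that need a real login, a browser-automation script does the clicking. A hedged sketch with Selenium (every selector and URL below is invented; each SaaS needs its own):

    from selenium import webdriver

    driver = webdriver.Firefox()
    driver.get("https://app.example-saas.com/login")
    driver.find_element_by_name("email").send_keys("me@example.com")
    driver.find_element_by_name("password").send_keys("...")
    driver.find_element_by_css_selector("button[type=submit]").click()
    # then navigate to the billing page and grab the PDF
    driver.get("https://app.example-saas.com/billing/invoices/latest.pdf")
    driver.quit()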


I've been thinking about automating this myself. Which SaaS is it? ("another SaaS that lets me download a zip file containing all invoices from a date range")



A bot for reserving hotel rooms.

I wrote a bot to reserve hotel rooms a year in advance for a national park in the US.

It was so difficult to book. After a couple of days of failed attempts to reserve my desired dates, and after staying up late into the night one day, I went ahead and wrote a bot to automate checking for availability and then completing the checkout process once something opened up.

And... it worked.


How does the bot reserve hotel rooms - through an API, or does it interact with the website? And how do you validate that your code is still able to reserve a hotel room, say, 3 years after writing it? I can imagine that system changes will break your code.

Not trying to be a smart ass, just interested :)


Nice! I've never been able to get a date I want for a national park. So much easier when single and/or in college. I see other days, or a room here and there... but not enough for my family, and for the dates to coincide with everyone's schedule.

I also don't have the foresight to book that far in advance. I've settled for staying offsite and just driving in. Not the same experience, but a good enough compromise, I guess.


Any chance you'd be willing to share it with us?


What language did you write it in? Curious. . .


Yosemite?


A month before July 4th, I tried booking a camp site in Yosemite and got none. I should have known the camp sites get completely booked 6-8 months ahead. When I called one of the camp sites, the automated machine replied:

press 1 to book for this year

press 2 to book for next year.


>A month before July 4th

Ha. For a top-ten most-visited park in the world? Walk-up permits are your friend. No experience at Yosemite, but I got some backcountry permits at Grand Canyon; it helps if it's not peak season, and I think people line up at 4am for those. Worst case, you are posting up in a random field or a Walmart parking lot for the night.


Carving up marble with industrial robots.

https://vimeo.com/94076571

The CAD -> robot-code compiler is built on top of pythonocc.


Wow, that's fascinating. On the pulley cutting mechanism, what kind of material is being used to cut the marble? Never seen anything like that before!


Interesting, do you have more information about what you're doing there?


Coolest answer here IMO.


Very nice. :)


Support tickets integrated with service monitoring.

Around 3 years ago, we started to get a lot of customers for our VoIP tunneling solution, mostly from the UAE. Most of these were unfriendly customers abusing our support, so I started to implement a CRM to track "support points". I spent half a year developing this solution (with lots of other functionality such as service monitoring), and when I finished, there was no longer any demand for the VoIP tunneling solution :)

This is how I wasted half a year instead of focusing on solutions relevant to our business.

Thank goodness, we started to get new customers again last year, and my CRM/support-point tracking software is actually very useful now, but I still don't think it was worth a 6-month time investment.

Conclusion: focus on your main business and don't spend too much time on automation and other helper software (or hire somebody to do it if your business is big enough).


In Lausanne, Switzerland, it's very difficult to find an apartment because there are too few apartments for too many people, and it mostly follows "first come, first served".

So I created scrapers for 3 websites + 1 Facebook group. They simply look for apartments matching my specifications and notify me when a new one comes up.

I can say I successfully found an apartment. The whole process usually takes at least 3 months; I did it in 1.


I had the same problem finding an apartment in Zurich and started scraping several websites to automate the process. The tool is at https://www.immomapper.ch - perhaps you'll find it useful.


Sounds like you've got a business in hand


I do. I'm planning on going forward with it.


Paying all of my bills. All of them. My bank (Fidelity) can connect to most bigger companies to have the bills automatically sent to them and then they automatically pay it (with an optional upper limit on each biller).

For other bills, I got all but one to put me on "budget billing" (same amount each month, so Fidelity just sends them a check for that amount without seeing the bill). For Windstream, which varies by a dollar or two each month, I just send them an amount on the upper end and then let a credit accrue. Both of these require an update maybe once a year or so.

Windstream is a bit funny - I don't know why they can't pick a number and stick to it. Also, they apparently raised my "guaranteed price for life" a couple of times and didn't notify me until ~8 months later when they were threatening to disconnect my service for being more than a month behind. (They had turned off paper billing on my account but didn't actually enable e-billing - service still worked so I didn't even think about it. We eventually got it straightened out, but Windstream is ... special.)

Beyond that, I made a bot that automatically withdrew Elance earnings to my bank account (that got me banned for a week or so when I posted it to their forum).

I made another bot that bought and sold bitcoins and litecoins and such. It was moderately profitable until my exchange (Cryptsy) got hacked and lost all of my float (worth ~$60 USD at the time).

I connected an Arduino IR blaster to my TV to make it automatically turn on my sound bar (the TV would turn it off, but not on?!) - http://www.nfriedly.com/techblog/2015/01/samsung-tv-turn-on-...

Oh, and of course, code tests and deployment. Nearly every git commit I make gets a ton of tests, and for most projects, each tag gets an automated deployment to npm or Bluemix or wherever.


As a European, it seems amazing that you even need to automate half of this stuff / that it is a perk of a single bank.

Over here nearly everything recurring is done with direct debit, in which you authorize corporations to directly and automatically withdraw funds from your bank account, and it's a required feature if you want to connect to the European payment system. (don't worry, you can charge it back in case of errors or malice)


> in which you authorize corporations to directly and automatically withdraw funds from your bank account

The corporations over there must be more trustworthy than the ones we have over here. I'd never agree to such a thing.


They're probably as trustworthy on the European end as on the US end (hell, quite often they are the same corporations) - it's more that in Europe, customer protection rights make it less likely companies will make a... "mistake".


If a company makes a withdrawal that you disagree with, you can log into your bank's website/app and get the money back, up to 57 days after the withdrawal.

If it's >57 days but <13 months, you can call the bank and they do it for you.

In Dutch we have a verb for it: storneren. I think it's best translated as 'to write back', but I don't know a direct translation.


Yea, here in the States we have a crappy, insecure version of direct debit that I've seen abused more than once, so I don't trust it anymore. For one thing, you can't do chargebacks. Once the money is gone, it's gone.


> Paying all of my bills. All of them.

What kind of bills? Like grocery bills? or recurring bills?

Aren't you using direct debits/standing orders/credit cards for those?


Everything that's recurring: mortgage, electricity, insurance, internet, iphone payment plan, etc.

My wife handles our groceries and such. We actually have two separate accounts: "mine" gets the paycheck, pays the bills, and transfers money to "hers" twice a month.

We also have a third "giving" account that gets a bit of money set aside to give to or buy things for other people. This can be anything from a car repair to just buying someone lunch.

Direct debit is fine if you trust the other side not to screw up and charge too much, leak your credentials, etc., but I've seen exactly that happen several times. You're basically giving them permission to withdraw as much as they want from your bank account.


This confuses me a little -- the UK has the direct debit guarantee which is incredibly ironclad. If I think something is funny, I call the bank and the money is back in my account immediately -- without argument or any reason required.

The DD is then cancelled and the company can argue with me, sure, but I get my money back.


Yea, our banking system in the states is pretty behind.

I've had it happen to me once, and to a family member once, where we each had to close a bank account to make someone stop taking out money. Both were a while ago, so it may have gotten better in recent years, but it burned my trust in the system.


> Direct debit is fine if you trust the other side not to ...leak your credentials

Woah - just for your information, there is no credential sharing required here.


FWIW, some services do in fact use your bank username and password to instantly verify your account, rather than the usual "we made two small deposits, tell us the amounts when they come through in a few days".

But I was mainly referring to the account and routing numbers. Anyone with those two numbers and access to the ACH system can drain your account. I know one person who wrote a check to a bill collector; the bill collector then entered those numbers into their ACH system and started taking regular withdrawals from the account without the owner's permission. The owner had to close the account to make it stop.

That one was like 15 years ago - I think the laws around debt collections are better today, but the ACH system is still fundamentally the same system.


My expense reports and timesheets.

The three shittiest parts of my job every week are:

- Approving timesheets

- Entering in my timesheets

- Entering in my expense reports

I've written a script, using PhantomJS, that goes in and automates the submission of my timesheet on Friday afternoon at 3:00 +/- 15 minutes. It now takes into account travel time, holidays, and approving time if I have time approvals due.

Same holds true for submitting expense reports in Oracle. I upload the receipt to Expensify, and as long as it's tagged properly in Expensify, it'll automatically generate the correct expense report in Oracle for the proper project based on the receipts in Expensify. This saves me, on average, about 6 hours a month.


17 years into the 20th century, and we are still filling timesheets.


As incredible as it sounds, it seems to be common. 2 out of my last 3 (tech, software) jobs featured mandatory timesheet reporting for salaried employees. One of them even had a fingerprint scanner at each door you were required to use so that you didn't "cheat".


21st century ;).


107 years then:D


117 years :P


Timesheets are important to businesses for reasons I neither know nor care about. The problem is that they're obviously a management responsibility, but management always delegates them to every employee. I don't have a clue what is and isn't capitalizable and I don't want to know; I just want to do my work with the minimum paperwork possible.


In my city, there are many stadiums which cause traffic congestion during rush hours. I made a scraping bot which tells me if there's going to be traffic on my designated routes the next day. Going to try making it an app and see if it's of any use to others.


Relevant for people working/living in SF:

http://www.isthereagiantsgametoday.com/


Haha, I guess my isthereapadreshomegametoday.com wasn't as uniquely clever as I thought.


Would like to know if Sonoma Raceway is having an event, for Highway 37, too.


For sure. I live in Atlanta, and the stadiums are basically smack in the middle of high-traffic interstates. I believe Waze does this already, but even viewing it for the work week would be nice. For example, those who have a few days to work from home can pick the days there are games and major events.

Luckily I do not pass by those stadiums on my way to work.


I love scraping websites, it's a stupid hobby at times, but I dig it (https://github.com/asimpson/nodejs-web-scraper-cookbook).

A couple of successful scraper projects: 1. I automated buying an iPhone 6 when it came out, with a scraper + Twilio for SMS notifications. 2. I set up a scraper to alert me when certain Blu-ray titles were available to reserve at the library. 3. I found a site that posted full NBA replays the day after a game; I set up a scraper to watch their RSS feed for a couple of teams and uploaded the videos to S3.

I've also automated notifications for when my sump pump gets flipped off. My sump pump is connected to a ground-fault outlet which can, rarely, get flipped. I plugged a spare Raspberry Pi into the same outlet and have my Synology ping its IP periodically. If the Synology can't find the Pi, it sends me an SMS via Twilio.
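
The check itself is trivial; as a standalone Python version (the IP, numbers and credentials are placeholders), it amounts to:

    import subprocess
    import time
    from twilio.rest import Client  # the twilio pip package

    client = Client("ACCOUNT_SID", "AUTH_TOKEN")

    def pi_is_up():
        # one ICMP ping to the Pi that shares the sump pump's outlet
        return subprocess.call(["ping", "-c", "1", "192.168.1.42"]) == 0

    while True:
        if not pi_is_up():
            client.messages.create(to="+15551230000", from_="+15551230001",
                                   body="Sump pump outlet may be flipped off!")
        time.sleep(300)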


A little different than what other people are doing, but I have tried to automate my savings. I use Mint to figure out what my budgets for things should be, then I use Qapital to automatically save the money I didn't spend but was budgeted.


This is a great question. The PC has been around for a long time now. For the most part, users/developers have been sitting around, twiddling their thumbs and waiting for the tool and app gods to rain down their blessings. This question highlights the need to be proactively involved in designing how you use your PC.


Same sentiments. In the beginning, the user and the developer of the PC were the same person. I just feel that there is so much untapped computing power waiting for people to tell the computers to go do intelligent things.


Right. Early PC apps were developed by developers, mostly for the developer community, and PCs were used mostly by developers themselves as a cheap replacement for workstations. When they started proliferating in the end-user space and experienced exponential growth in volumes, innovation took a back seat and commercialisation drove the chariot. Now that everyone's become rich, we need to get serious again about making the PC a lot more useful. Automation would probably be the first area I would address in that endeavor, since the capabilities of the PC are now quite robust.


We definitely use a lot of computer power - to run Electron apps :-)


It definitely is an automation of tasks - it automates away your having to be concerned with multiple platforms, at the small cost of wasting the computing power of all of your users.


Just to think, one of the big breakthrough ideas for how to use a 64K home computer was recipe indexing.


A PBX that only lets you record a voicemail greeting by dialing in and listening to the whole greeting before it can be saved. So... recording a greeting could take a good 15 minutes if people mess up and have to start over.

I wrote a simple Lua script for FreeSWITCH that dials the line, follows the prompts, and plays the person's greeting to the PBX. Of course, one day, the damn PBX will be replaced by FreeSWITCH.


Downloading fan fiction from fanfiction.net

I have written a Python script that builds an HTML file out of all the chapters of a given fan fiction and then calls Calibre to convert it to MOBI for my Kindle.
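
Stripped of the site-specific scraping, the skeleton is: fetch chapters, glue them into one HTML file, shell out to Calibre's ebook-convert. (The story URL and chapter extraction below are placeholders; the real script parses out just the story text.)

    import subprocess
    import urllib.request

    chapter_urls = ["https://www.fanfiction.net/s/12345/%d" % i for i in range(1, 4)]

    parts = ["<html><body>"]
    for url in chapter_urls:
        page = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
        parts.append(page)  # real script extracts only the chapter div here
    parts.append("</body></html>")

    with open("story.html", "w") as f:
        f.write("\n".join(parts))

    # Calibre ships a command-line converter
    subprocess.run(["ebook-convert", "story.html", "story.mobi"], check=True)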

Unfortunately, my life doesn't have too many automatable aspects... (I am a math researcher.)


Easy - anything boring. "Boring" usually means repetitive and not mentally challenging, which to my mind is exactly what computers are for.

Even if the task happens infrequently and the script takes longer than the task, automating it is worth the investment:

- It prevents having to remember or re-discover how to handle the task next time.

- It ensures the task is handled consistently.

- It prevents potential manual errors.

For example, on the financial side, my company runs bank accounts in five countries, each with different GST/VAT taxes. Over time, I've developed scripts that grab the mid-month exchange rates that our Internal Revenue service requires to be used; crunch downloaded bank transaction data into categories (including tax inclusion or not); and export it all into a huge Google spreadsheet. This provides global and per-country balance sheets and profit and loss, and when tax reporting time comes for each country, a tab on the spreadsheet provides all the figures, so filing returns is a five-minute process. Occasionally the scripts will flag an unrecognised transaction, and rather than manually correcting this in the spreadsheet, I'll add a rule to the script so it is recognised next time.

Cumulatively this probably took several tens of hours to code, but it means we don't need to employ an accounts clerk. It takes about fifteen minutes a month to download the bank data (manually - oh how I wish banks had APIs) and run the scripts. Our accountant loves this - the spreadsheet is shared with him, he can check our formulae or add other metrics, and he prepares our annual report an order of magnitude faster than any of his other clients.


Sometimes I see a lengthy text article that I tell myself I'll bookmark and read later, but I know I'm never going to read it. I much prefer audiobooks and podcasts. So I automated scraping the text from the article, piping it through text-to-speech, turning it into an MP3, and moving it to my phone so it shows up in my audiobook library. Next step is to make it an RSS feed so I can treat it like a podcast.
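
A condensed version of that pipeline, assuming the readability-lxml and gTTS packages (the URL is a placeholder, and a real script would do a less crude job of stripping tags):

    import re
    import urllib.request
    from gtts import gTTS             # Google text-to-speech wrapper
    from readability import Document  # extracts the main article text

    raw = urllib.request.urlopen("https://example.com/long-article").read()
    text = re.sub(r"<[^>]+>", " ", Document(raw).summary())  # crude tag strip
    gTTS(text).save("article.mp3")    # then sync article.mp3 to the phone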


Most of my side projects have been about automating the little things that end up taking me a lot of time.

At my first job, part of my work (next to junior dev work) was to deploy EARs on WebSphere. I automated it so that people just had to drop one in a shared folder, and I'd only take a look if it failed to install automatically.

I wrote a command-line tool to search and download subtitles https://github.com/patrickdessalle/periscope

I made a browser plugin to compare prices across the European Amazons and a few other websites (it grew to more countries and websites): http://www.shoptimate.com

And now I'm working on a tool that regularly checks if some of my content is getting adblocked because it's something I periodically do by hand http://www.blockedby.com

In the end, automating things can take more time than actually doing it. But if it's used by others and saves them time as well, it's gratifying.


Definitely going to try periscope. Thanks!


What are EARs?


It is a file format for Java applications designed to run on J2EE servers like JBoss, WebSphere, WebLogic, etc. It is a zip file with conventions for certain files and folders in the package.

https://en.wikipedia.org/wiki/EAR_(file_format)


Sorting my mail with imapfilter. I have a YAML file where I write down which mails go into which folder, depending on sender, recipient or another header field. It runs on a Raspberry Pi every ten minutes between 8 and 8.


What does that do for you? For me, mails have three states:

1. Need action right now. I answer them and archive.

2. Need action later. I open a TODO and archive. If I need them later, I can search for them.

3. Do not need action. These are already archived. I can search through them, if I need anything.


Well, it especially helps with work emails, sorting mails from our Confluence and Jira into their respective folders. These are around 20 mails per day. I look at them once or twice per week, grouped by subject.

Another thing is newsletters. Every two to four weeks I take the time to read them. Invoices are put away (Amazon, eBay, electricity, etc.).

I think the biggest win for me is that mails in my inbox are typically your category 1, and I can react right away. I don't have to spare another thought for category 2 or 3. I can check my mail twice a day and it's enough. Really helps me focus on my other work.

Back when I used GMail I had some of those rules as filters, but the big advantage of imapfilter on the Pi is that I can add rules pretty easily with a git commit, and add new inboxes if needed.

EDIT: As I mentioned, my email filter runs only between 8 and 8, which is really nice, because I do not see any new mail in the evening. This really frees my mind. If the house is burning down, don't write me a mail; give me a call.


Tweeting. I suck at it. I started with a txt file, which became a spreadsheet, which is becoming distrosheet.com.

Sooo slooowly that the homepage still has stock cats & dogs images. The most upsetting thing is that I've had more than one person tell me "I like the homepage". My mental reaction was "wtf!?". </rant>

Anyway, I still don't tweet much, but I'm getting there.


Designing and developing UIs. I want to develop web UIs the way you develop UIs with Visual Studio or Xcode. I cannot believe how much effort we need to build and modify web experiences.


It's funny, because I can't believe how much effort some people spend to build GUIs in graphical editors. Guess it's a preference thing.


Downloading porn and culling the old stuff. Currently automated management of over 100TB and growing!


I had tons of startup ideas that I'd always wanted to try. After a point, it became frustrating to test them out one by one, either by writing custom applications in Rails or by using WordPress. Both cost me a significant amount of time.

For example, I had this idea for a travel startup for a very, very long time, and I decided to build it on WordPress. The monetization model was selling some e-commerce items, so I naturally tried out some of the plugins and was shocked at how long it took to get a simple task done. I had such a terrible experience that I'd never recommend it to anyone. WordPress by itself is fine, but when you try to extend it, you face so many hiccups.

That's when I realized there's no use blaming the tool; it comes down to the differences in philosophy between me and the core WordPress team. So I naturally spent another four months writing a Rails app for this travel startup and still wasn't satisfied with my time to market. Clearly, there had to be a better, faster way.

In essence, I realized every online startup requires these components:

1. Authentication / Authorization

2. CMS - To manage content on the site, including home page, landing pages, blog, etc.

3. Analytics - To help track pageviews, campaigns, etc

4. CRM - To manage a sales pipeline and sell to customers. Also to know very well who your customers really are.

So I went ahead and wrote this mammoth of an application in Phoenix (using DDD architectural patterns) that has all the modules above. Now, every time I have an idea, I just log in to my interface, set up the content and the theme/design, and launch a campaign... bam! My idea is now live and I can test it out there on the market.

You can think of it like a complete combination of all the startups out there:

1. Mailchimp - I can send unlimited emails, track opens, analyse them. Handled by my marketing module. I can customize the emails too, of course.

2. Unbounce - I can design my own landing pages. Handled by my CMS.

3. Buffer - I can schedule shares from within my interface based on best times by engagement. Handled by my marketing module.

4. Hubspot - My system has a full HubSpot/Zoho-style clone of a CRM.

Here are some of the key highlights:

1. All my data is collected on BigQuery and I own it instead of sending to third parties.

2. There is no forced limitation on my marketing - for example, if you've used Mailchimp, you know you're limited to just 2000 recipients, and anything more quickly gets expensive. But my system is my own, no limitations whatsoever.

3. I can spend less time developing my idea and more time executing it.

4. I have my own custom business dashboard for each of my ideas that tells me how well or badly it's performing, so that I can turn it off when needed.

Probably not the kind of automation you were expecting, but yeah.

EDIT: Added more details.


Wow, you should definitely sell this as a service. Lots of people in the market validation stage would pay for this (I know I would).

> I can send unlimited emails, track opens, analyse them. Handled by my marketing module. I can customize the emails too, of course.

I'm curious - what do you use for email sending? How do you handle bounces and ensure deliverability?


Thank you for the feedback :)

For tracking emails, I use Mailgun. For deliverability, I use some standard practices - a dedicated IP, sending in batches and analyzing, etc.

I don't need complex systems to ensure my campaigns aren't flagged as spammy (guys like MailChimp need that because they have multiple senders), because the sender of these campaigns is just me.


Have you sold your "automated start a startup" app as a service to others in your situation? There must be plenty of other people in the same boat as you.


That's a nice idea! Thanks for sharing.


Incredible. This is something I've been looking at / planning for a long time now. I have a bunch of ideas (a couple of spiral notebooks full of 'em) which need setup and execution, but due to my consulting practice, I haven't initiated anything yet.

If you ever write a blog post on the process / checklist of "automating your startup", do share it here.


Interesting approach (and a good idea). Have you found any negatives with operating this kind of system?


Just need to watch out for security updates. Fortunately AppEngine provides security scans. But at least I don't get the weekly notifications I used to get with a WordPress-based system.


Kind of what I was referring to here .... https://news.ycombinator.com/item?id=14779517


Yep, that's correct :)


:) reminds me of startup generator http://tiffzhang.com/startup


Interesting. Mine is more than just a landing page generator though.


You just described a multimillion dollar idea.


Thank you :)


Stock market trading systems, so I don't have to watch screens. Also backups, and constantly improving monitoring for smooth operations.


I used to write trading systems for Amibroker. This one signaled the March 2009 bull market in equities and oil. But it is a swing/long-term trading system; I was a daytrader, so it was not really useful for that.

https://github.com/tebelorg/Tump/blob/master/trading_z.js


Reading into this, it seems pretty dense. Any suggestions for beginners who are software engineers but need more explanation in this domain?


The trading domain knowledge can be found in various online resources. I once made a 200+ slide PowerPoint on the technical-analysis aspect of trading, but it is probably irrelevant now, as markets have changed a lot since the introduction of algorithmic and high-frequency trading.

I was writing that script for Amibroker platform (https://www.amibroker.com) but I believe the market leader is probably something like MetaTrader (https://www.metatrader4.com). Also, for developers or software engineers familiar with Python, Quantopian might be something you'll enjoy using (https://www.quantopian.com).


Thanks!


You can check my older posts for books I have recommended to get you started


Thanks


Thanks, looks very interesting. My hobby is backtesting different trading systems, so I will take a deeper look into it next week...


Have fun! This one is a trend-following system for swing/position trading. It hasn't been backtested against market behavior from the last 6-7 years, so it'll probably do badly now. My failure was not being able to convert it to a daytrading timeframe.

PS: anyway, sorry for digressing away from the topic of the HN thread lol


How can I get started with this? Did you create machine learning algorithms too?


First you need data; 20 years' worth will do. Then you start backtesting every trading system you can find, with the main focus on limiting losses. When you figure this part out, you just need to find a slight edge in the market, and the compounding will make you rich in the coming years...


Google "Trend Following Systems". The simple systems follow the moving average and triggers a call if the share price goes up or down the average value.


Many things. A trivial one: I recently wrote a script to electronically sign six documents from my divorce and related tax paperwork using ImageMagick, just to avoid having to do it with Gimp or Preview or some other GUI tool, and then redo it when there are revisions. Yes, there are online tools, but I'm working with people who don't use them, nor do I want to upload these documents anywhere I don't have to.
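The core of such a script can be a loop over ImageMagick's composite command, overlaying a scanned signature at a fixed offset (a sketch, not the actual script; filenames and offsets are placeholders):

  import subprocess

  # overlay a scanned signature PNG onto page images exported from the PDFs
  for page in ["settlement-p4.png", "tax-form-p2.png"]:
      subprocess.run(
          ["composite", "-geometry", "+120+640",  # x/y offset of the signature
           "signature.png", page, "signed-" + page],
          check=True,
      )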

Often I'll spend as much time writing an automated solution as it would take to do the task manually, even if I'm only going to run the automated solution once. The work is way more fulfilling, I can fix mistakes more easily, and I can learn and develop new techniques.


I have a script that downloads bank and credit card transaction data, then applies rules to create a journal in GNU Ledger format.
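The rule-matching core of such a script can be tiny (a minimal sketch, not the actual script; column names and accounts are hypothetical):

  import csv

  RULES = {"GROCER": "Expenses:Food", "ACME PAYROLL": "Income:Salary"}

  with open("transactions.csv") as f:  # assumes date,payee,amount columns
      for row in csv.DictReader(f):
          account = next((acct for key, acct in RULES.items()
                          if key in row["payee"].upper()),
                         "Expenses:Uncategorized")
          # emit a Ledger journal entry; the second posting balances implicitly
          print(f'{row["date"]} {row["payee"]}')
          print(f'    {account}  ${row["amount"]}')
          print('    Assets:Checking')
          print()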


Do you use an API? Or just scraping? I've done the same, but stopped using it. Afraid my bank, if they detect it, might not like it.


I did the same lol (via scraping); it runs every midnight against multiple bank accounts, then syncs to a Google spreadsheet. I was writing web automation scripts the hard way for a while, until I made a tool to simplify the process of writing them - https://github.com/tebelorg/TagUI (frankly I am surprised my banks never banned me or sent me a letter)


I clicked it expecting a basic script, but this is pretty comprehensive. Nice work!


I use Selenium to drive a Firefox browser under the assumption that banks don't care enough to detect it: https://github.com/tmerr/bank_wrangler . I am also slightly worried about banks not liking it, but if they do care I guess I will receive a scary cease-and-desist in the mail, and I will cease and desist.
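The general shape of driving a bank site this way (a sketch with placeholder URL and selectors; every login form differs):

  from selenium import webdriver
  from selenium.webdriver.common.by import By

  driver = webdriver.Firefox()
  driver.get("https://bank.example.com/login")
  driver.find_element(By.ID, "username").send_keys("alice")
  driver.find_element(By.ID, "password").send_keys("correct horse")
  driver.find_element(By.ID, "submit").click()
  # scrape the rendered transactions table
  for row in driver.find_elements(By.CSS_SELECTOR, "table.transactions tr"):
      print(row.text)
  driver.quit()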


My concern is what happens if there is fraudulent activity, such as my account getting hacked somehow. Then they look into my account, discover I've been doing this, tell me I've broken their TOS, and I'm on my own.


On the upside, you are more likely to notice fraudulent activity if your script combs through your transactions daily.


I want to do this too. Details would be handy, even if it's extracting data from PDF statements or saved HTML pages.


I used CasperJS to download a specific CSV file, providing the Chrome user agent. Then I used a simple Node.js script to parse it.


Basically, use a web automation tool to simulate the login process and then grab the data you want, saving it to a .csv file or something. Web automation tools generally can work with saved HTML pages too. For saved PDFs I'm not sure what is good.


I used to use pyofx and a lot of custom scraping to get this, but now I use Plaid. I work at Plaid now, but I used it before I got this position.


Your bank would object to you accessing your own transaction data?


In theory they shouldn't, though I believe many financial websites explicitly declare in their terms of use that automated access or web scraping is not allowed.


Do a careful reading of your bank's terms of service and you will likely find a clause either barring you from doing this completely, or voiding fraud protection.


Whole businesses like Yodlee and Plaid do this.


This sounds incredibly rude, but why? Maybe my income is too small for me to see the benefit (student) :p


It can be eye-opening to track and categorize every dollar spent, even if just for a few months. Especially for younger people who are earning serious money for the first time.


I automate as much as possible the tasks involved in coding web automation scripts - https://github.com/tebelorg/TagUI


I wrote a little script [1] to automate a lot of the steps associated with publishing a podcast. There's still manual work but this takes care of a lot of the fiddly repetitive detail work that's both time-consuming and error-prone. Especially if I do a batch of podcasts at an event, this is a lifesaver.

[1] https://opensource.com/article/17/4/automate-podcast-publish...


I automated my dehumidifier.

I wrote about it here: https://red.to/blog/2016/9/15/automatically-controlling-a-de...

and open-sourced the Rails app: https://github.com/reddavis/Nest-Dehumidifier


I frequently wipe my Linux desktop and laptops and reinstall from scratch. I've been spending more time recently working on setup scripts that automate as much of this as possible: things like installing packages, setting up the firewall, checking out code projects, and installing dependencies. Currently this is mostly a bash script plus my dotfiles, but I'm always looking for ways to improve the process.


Ever since I completely trashed my last Linux install last fall, I've been trying to adopt habits to fail-safe my data and setup (mostly bash, and choosing cloud storage for my data). I'd love to see those bash scripts if they are not too sensitive privacy-wise.

Here's how far I've come: https://github.com/GustavHenning/usefulBash


I personally do this using SaltStack. The biggest gain over bash scripts is that Salt states are idempotent, meaning you can apply them several times and Salt will figure out by itself which states need changes.

Chef, Puppet, or Ansible are also good options, I hear.


Transferring lead data to Salesforce from Intercom and Slack by sending simple messages like "SQL" or "email@example.com to sf"

Receiving and sending documents to proofreading

I described them in detail here: https://www.netguru.co/blog/automating-myself-out-of-the-job...


I set up crawlers to make specific queries on various websites. I've used them in the past with: used-car dealer websites, job posting boards (found a job a few years ago that way), Craigslist-like websites, coupon websites (looking for sushi restaurant deals), etc.

Also, not sure if that counts, but I have monit plus scripts monitoring backup timestamps and DB replication.


Wrote a program that tracks Australian movie release dates for movies I'm interested in. It sends a daily email if a release date moves, or if there are new movies for me to flag my interest in.

Interfaces with themoviedb.org for plot summary, cast and crew info and such. Interfaces with Google Calendar for writing entries for each movie I'm tracking.


Okay, seeing as you asked.

https://github.com/evmcl/movieschedule

Let me know (or put in a pull request) if any of the instructions in the readme file are unclear.

Hope you find it useful.


Would you be able to share or open source? I would be interested in something like this and have friends that would also.


Yes please share!


leksak: It is now available at https://github.com/evmcl/movieschedule


Thanks!


Buying crypto weekly using Kraken's API.


How do you decide what to buy?


Hardcoded, based on how I think the market will evolve in the long term; currently holding BTC, ETH, XMR, and XRP among a couple of others.


Been a pretty rough week; reminds me why I should never put more than 10% of my net worth here.


There's a high correlation between the BTC price and the others. Lots of chaos in Bitcoin-land (the August 1/SegWit drama) means most of the others are now falling; large investors just mass-dump their crypto holdings, even though logically the prices shouldn't move in lockstep.


Chamath Palihapitiya says you should keep 1% of your net assets in Cryptocurrencies.


Wishing my friends Happy Birthday on Facebook, with Birthday Buddy : https://chrome.google.com/webstore/detail/birthday-buddy/cil...


If someone did the same to me, I would write a bot to ask them to stop.


I automated pretty much all the groceries and goods I use through a combination of Shipt and Amazon Subscribe and Save. It took a few hours one Saturday to compile a list of everything I use and estimate when I'd need more, but I genuinely enjoy not having to think about whether I need toothpaste or have food for dinner.


How much of a premium does this run you vs shopping for yourself?


I read a lot of articles by saving them to Pocket and reading via my ereader. I wrote a little PHP browser based application that interfaces with the Pocket and hn.algolia.com APIs that helps me to follow up on articles in related forums such as Hacker News and track my reading habits.

Naturally I called it Pocket Lint.


A commit hook that aborts commits if the project's code style is violated by one of the changed/added files.
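The hook itself can be a few lines (a sketch of a .git/hooks/pre-commit script; flake8 stands in for whatever style checker a project actually uses):

  #!/usr/bin/env python3
  import subprocess, sys

  # list files staged for this commit (added/copied/modified)
  staged = subprocess.run(
      ["git", "diff", "--cached", "--name-only", "--diff-filter=ACM"],
      capture_output=True, text=True, check=True,
  ).stdout.split()
  py_files = [f for f in staged if f.endswith(".py")]
  # a nonzero checker exit status aborts the commit
  if py_files and subprocess.run(["flake8", *py_files]).returncode != 0:
      print("Style check failed; commit aborted.")
      sys.exit(1)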


I can't say this often enough: http://pre-commit.com (minor disclaimer: I've contributed a hook, but not to the main source)


Last year I automated a bit of my dating by sending Tinder messages via their API. It worked, and this is how I met the woman I now live with :D http://jazzytomato.com/hacking-tinder/


Some of my own projects that I've ended up using frequently - you can see what they do from the command structure:

  mkgithub ~/dev/new-project

  fgit pull -- ~/*/.git/.. ~/dev/*/.git/..

  ~/dev/tilde/.screenlayout/right-tack.sh
And some less frequently used tools:

  mount-image ./*.iso

  vcard ~/contacts/*.vcf

  ~/dev/vcard/sort-lines.sh ~/dev/vcard/sorts/Gmail.re ~/contacts/*.vcf

  img2scad < example.png > example.scad

  indentect < "$(which indentect)"

  qr2scad < ~/dev/qr2scad/tests/example.png > example.scad

  schemaspy2svg ~/db
So yeah, automate all the things.


Scraping and compilation of various annoying web content formats, with varying levels of efficacy -- e.g. https://github.com/paultopia/scrape-ebook for open source PDF chapters and https://github.com/paultopia/spideyscrape for readthedocs-esque formats.

iCloud documents edited on iOS -> versioning and shoving in a private github repo -- https://paultopia.github.io/posts-output/backup-to-git/

CV updates via template to HTML, LaTeX, and docx


I consult the relevant XKCD to decide: https://xkcd.com/1205/


I love XKCD but that chart is missing a few important dimensions.

- How likely is it that you will make a mistake when performing the task manually, and how easy is it to fix the various mistakes that can be made?

- How likely is it that you'll overlook the task when the time comes? (e.g. SSL cert expiration)

- What are the costs of mistakes or failure to complete the task?

- What skills can you learn while developing the automated solution?

- How can techniques developed for the automated solution be reused for other tasks?

Of course it's fair to ask questions on the other side of the equation, too, like: if the automated solution goes bad, or is run in the wrong environment, what are the risks?


Yep, love that chart. I recently showed it to a colleague when we were discussing whether it was worth it to automate our domain creation process.


Every time I look at that chart, my eyes glaze over and I decide not to automate. The title text even alludes to this.

I just ... don't get it. I fear I may be a bear of small brain.

As someone else mentioned, there are other valuable considerations the chart ignores, such as likelihood and impact of mistakes.


OTOH, automating is an acquired skill, so I learned to say "screw it": unless you feel you're about to spend a month to save 10 minutes total, go ahead and automate it - that's how you get better at automating more stuff faster.


Automation also gets rid of context-switching, which has a cost of its own. It may only take you half an hour to do -foo-, but to switch into the mental space for it and switch out again are costs that most people don't factor in.


1. Code formatting

- gofmt for Go, Google Java Format for Java

2. Code Style Enforcement

- golint, govet for Go, CheckStyle with Google Style for Java


I was downloading Beatport songs by finding them on YouTube. Then I decided to automate this: I wrote code that finds them on YouTube and downloads them automatically. Finally I decided to make it a website so that everyone can use it. www.beatportube.com


Library book renewals. I have an AWS Lambda function that runs daily, scrapes HTML from my public library (they have no API), and if a book is due within the next day, renews it. If I've reached max renewals, it sends me a notification.
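The scrape-and-renew loop is conceptually simple (a sketch with placeholder URLs and selectors; every catalog's HTML differs):

  import datetime
  import requests
  from bs4 import BeautifulSoup

  session = requests.Session()
  session.post("https://library.example.org/login",
               data={"card": "12345", "pin": "0000"})
  page = session.get("https://library.example.org/checkouts")
  soup = BeautifulSoup(page.text, "html.parser")
  for item in soup.select("tr.checkout"):
      due = datetime.datetime.strptime(item.select_one(".due").text, "%m/%d/%Y")
      if due - datetime.datetime.now() <= datetime.timedelta(days=1):
          # renew anything due within the next day
          session.post("https://library.example.org/renew",
                       data={"item": item["data-id"]})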


Do you have this project hosted anywhere?


https://github.com/pisomojadogrande/fcpl-api. Was a little weekend project, so not at all polished or documented, and only does a single account, but has a template where you could deploy it in your own AWS account. Also, the HTML scraping works only if your library is Fairfax Co., Virginia. But hopefully the pattern will be helpful.


In the past I've always automated exporting from Maya, 3DSMax, and Photoshop, meaning I don't require artists to export from any of them. The artist saves the source file in the project, and tools build from that to the final format for the app/game.

The more typical workflow is that artists export .JPGs or .PNGs manually from Photoshop and save their .PSD files somewhere else. Similarly, with 3DSMax or Maya they'd manually export using some plugin. That seems wasteful and error-prone to me. Source files get lost. Artists have to maintain multiple versions and do the export manually, which seems like a huge waste of time. So, I automate it.


I learned this one the hard way. It makes sense though: submit the source, not the output (as we do for source code).


I wrote a browser extension so I don't have to click or type a lot on some websites. Firefox: https://addons.mozilla.org/en-US/firefox/addon/clickr/ Chrome: https://chrome.google.com/webstore/detail/clickr/kbegiheknic...

Also very useful as a web developer for testing some JavaScript on a website.


Preparing purchase forms for the university library and letting me know when books I order become available.

https://github.com/ehud/Library


Disclaimer: I work for Uptime Robot as a freelancer. Hi, if it fits, Uptime Robot[1] has a keyword monitor type. I think you could use it to track book pages (for example, with "available" as the keyword).

It is free for 50 monitors.

[1] https://uptimerobot.com


I automated my wedding seating cards and plan.

I managed invitations as a CSV (who had been invited, who responded yes and no, addresses and dietary requirements).

I designed the placecards and seating plan as SVG in Inkscape, with special text I used as {templating parameters}.

I could then produce all my place cards and the seating plan from a simple script. This was handy when guests changed their RSVP a week out from the wedding, when I had little free time, and I could make a change instantly. (Although admittedly I spent more time getting the layout right for the seating chart than if I had done it by hand.)
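The templating step is little more than string substitution over the SVG (a sketch; the column names are hypothetical):

  import csv

  template = open("placecard-template.svg").read()  # contains {name}, {table}
  with open("guests.csv") as f:
      for guest in csv.DictReader(f):
          card = (template.replace("{name}", guest["name"])
                          .replace("{table}", guest["table"]))
          with open(f'placecard-{guest["name"]}.svg', "w") as out:
              out.write(card)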


It's really simple; I automate creating builds for the game www.QuantumPilot.me

  rm -rf ./QuantumPilot*
  rm -rf ./QuantumPilot*
  electron-packager ~/ele/electron-quick-start/ QuantumPilot --platform=all --icon=/Users/quantum/Desktop/QuantumPilot.icns
  open .

For some reason, OSX has trouble deleting the Linux folder the first time (hence the double rm). I've heard Itch.io has a CLI for this but I haven't tried it yet. https://github.com/itchio/butler


Download media. I have Sonarr+Radarr+Plex. I don't spend much time looking for media.

Code reviews. Using something like CodeClimate to automatically check code quality before anyone actually reads the code.


How do you think Radarr compares to Couchpotato? Considering making the switch, since Sonarr has been fantastic.


It shows promise but it's still a bit in its infancy.

It lacks (or lacked, a couple of months ago when I last tried it) functionality that I deem indispensable, like importing all the movies from the same directory (it wants - or wanted - every movie to be in its own directory).

Also, it uses a lot more memory. Which is probably fine for most people, but since my little server only has 1.5 GB RAM, it really makes a difference for me.


Great thread, thanks OP.


I just figured out how to use Ansible and Python to script changing the passwords for all the network gear in the office. It uses a random password generator API, https://passwordwolf.com, to fetch a new password, changes it on everything, then sends me the new passwords. I'm changing passwords monthly now, but it works so well that I might set it to weekly.


Using a password generator API sounds like a terrible idea from a security perspective.


Backups.


Yes! Everything else in this thread is interesting but without automated backups it's all just temporary.

I would add that it's a good idea to automate backups using more than one method as well. For example, use a popular software package but also copy important stuff to an extra drive. A cheap setup with a Raspberry Pi + USB drive plugged in to your router will do just fine.


I do this. I have an always-on RPi at my parents' house for blocking ads and tracking. I just added a 2TB USB hard drive and set it up to sync everything at night, when the line is idle anyway.


I was kind of a backup freak for my PC, but once I started using cloud drives like Google Drive, that took care of the problem of hard-disk failures (which are definitely bound to happen every couple of years).


Untested backups are a problem, though...

Automating random file restores and checksumming is a great idea...


I automate filtering my RSS feeds; creating a weekly digest of emails that are not a priority (bank statements, receipts, etc.); crawling certain pages I need to monitor and creating new RSS feed items on updates; weekly digests of top Reddit posts for specific subreddits; and monitoring flight deals that originate from my airport.

I find that converting a lot of unimportant emails into RSS feed items has been a huge win for me.


I do contract work for a few clients. I always automate the boring tasks of VPN'ing in, firing up remote desktop, connecting to database servers, their email system, etc.

Automating that is fiddly and tedious, but it's worth it because I can just click a button and get a menu of clients. I choose one, and in about 10 seconds my machine is ready to go on their work.


I do this, more or less, with tmux & tmuxinator.


I have a small bash script which keeps checking for Internet access. If my machine does not have a live working connection, it sends an alert notification (text + sound), "You are offline, you may read some books :)", and then launches iBooks so I can do some reading while offline.

PS: Also, when the Internet is back, it alerts again so I can resume online work if I have to.
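The original is bash, but the idea fits in a few lines of Python too (a sketch for macOS; the probe target and polling interval are arbitrary choices):

  import socket, subprocess, time

  def online(host="8.8.8.8", port=53, timeout=3):
      # cheap probe: can we open a TCP socket to a public DNS server?
      try:
          socket.create_connection((host, port), timeout=timeout).close()
          return True
      except OSError:
          return False

  was_online = True
  while True:
      now = online()
      if was_online and not now:
          subprocess.run(["osascript", "-e",
              'display notification "You are offline, you may read some books :)"'])
          subprocess.run(["open", "-a", "iBooks"])
      elif not was_online and now:
          subprocess.run(["osascript", "-e", 'display notification "Back online"'])
      was_online = now
      time.sleep(30)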


I liked writing an internal command line utility for our Go codebase. It automates common dev commands like deployments (including installing dependencies, migrations, etc), sending test emails (eg to check formatting), and running smoke tests. Pretty minor, but it makes my life a lot easier. I plan on expanding it more for accessing prod and dev APIs.


I recently had to frequently create private git repos for job candidates (containing a coding challenge). I built a simple web app that does it all in one click (as a bonus, my non-technical co-founder can also use it). https://i.imgur.com/HhQP4lX.png


I get a weekly newsletter with a bunch of music recommendations in it, which I had been manually adding to a Spotify playlist.

So I recently wrote a CLI in Node that takes a URL and a CSS-style query selector (e.g., '.album-title'), then scrapes the page, searches for each found instance, and adds them all to a Spotify playlist.

github.com/markreid/scrapify


That's awesome! I'm currently working on a similar thing using Pitchfork best new music, Gorilla vs Bear reviews and Needle Drop best tracks of the week + favorite albums of the month. Very clean code btw o/


Cheers dude. I use the pitchfork best albums as an example in the readme because it's quite a simple one.

Feel free to fork or borrow or contribute!


I automate my time tracking using https://wakatime.com


I automate product stats from Google Analytics using a Google spreadsheet. Using Apps Script, I extract all the key metrics, such as activation rate and retention rate, from the raw data.

Then, when I need to report the stats of multiple products, another automated script aggregates them.

This has saved me hours of context switching and copy-pasting.


Clicking! I wrote a PowerShell script for Windows which mimics the autoclick functionality that Ubuntu has in its accessibility options. I also added double/triple clicking by twitching the mouse a bit.

It takes some getting used to but I feel it helps avoid forearm soreness.


I use Slack a lot for communication.

I have automated notifications for significant events in our app. It's simple to implement: configure a webhook.

I also set things up to get notified whenever there is a commit, pull request, or push in our source control.


BillGuard stopped operating in the UK and we don't have Mint.com, so I decided to write my own personal finance tracker. It's not scalable, as I only have scrapers for the banks I use, but it was pretty simple to get to a basic setup.


Shopping list via Oscar, a barcode scanner, and Open Food Facts

Aircon via temp sensors and node-samsung-airconditioner

Still working on Owntracks/MQTT for useful automations on arriving home

Lights plus a motion sensor, with light color by time of day (red late at night to preserve night vision)


@gottlos : Would be curious to see your code/a demo of the stuff you built using Open Food Facts :-)


I've written a script which helps me copy files from their folders in the Material Design image library to my Android project. This saves me at least four copy-paste and renaming operations each time.


Every robotic task tangentially related to auditing. I work with robotic task automation at one of the Big 4, and it's really amazing how much trivial work is being done by humans.


Tracking packages so I can batch my trips to the post office.

A simple web interface where I have a list of packages I've ordered, with the last status update from the postal service's web tracking.


Cool, what API are you using for getting the tracking information?


Haha, API :)

I'm glad my national post office provides that data at all via their website.


I've automated deployment of my side project. When I merge a PR to master in GitHub, it pulls the new build and restarts any process that's changed.


Time logging. I use one piece of software to track my time, then fan those time logs out into the various pieces of software that need to know about them.


What software do you use to time log in the first place?


If I have to update a file programmatically when I make certain modifications to a codebase I'll write a script that automates the update.


I made a bot which tells me what I should wear today depending on the weather and the clothes that I have. It messages me every morning.


Anything I have to do more than once. If I have to do it a second time, I'll probably have to do it a third...


My MacBook Pro is set to German, and the keyboard at the office when I plug it into the dock is English, so I created a script to detect when the Mac is connected to or disconnected from the dock and switch the language settings. Small, but I'm proud of it.


All conversations.

In the case of f2f (face to face) I just let my phone run me like a peripheral.


A colleague is doing JIRA exports to Excel / MS Project.


MS Flow may help.


Saying "no" to meetings and interruptions. I have a box with a big "NO" written on top of it. Whenever someone comes by to ask me "how are you doing?" I tap the box.


My build process.


I wrote a little sync script on my server. It saves my MySQL backups to Google Drive.
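Such a script can be as small as a dump piped through gzip plus an rclone upload (a sketch, not the actual script; assumes an rclone remote named "gdrive" is already configured):

  import datetime, subprocess

  stamp = datetime.date.today().isoformat()
  dump = f"backup-{stamp}.sql.gz"
  with open(dump, "wb") as out:
      # stream mysqldump through gzip straight into the backup file
      p = subprocess.Popen(["mysqldump", "--all-databases"], stdout=subprocess.PIPE)
      subprocess.run(["gzip"], stdin=p.stdout, stdout=out, check=True)
      p.wait()
  subprocess.run(["rclone", "copy", dump, "gdrive:mysql-backups"], check=True)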


Restarting everything every night.


What do you need to restart? I tend to leave things running...I used to reboot my servers once a year, but now I'm not sure I'll even bother with that.


Many apps in this shop have some sort of error handling that puts them into a weird state. Restarting works while developers spend months debugging what's going on.

I'd rather propagate errors and let a supervisor handle failing apps. But it is the way it is for now.


In the long run? All of them.


I wrote a bot that automatically comments on HN when certain topics appear.

--

This post has been automatically generated and may not reflect the opinion of the poster.


Find yourself a configuration management server such as Puppet, Chef, CFEngine, etc., and learn to automate system deployment and management with it. I use Puppet CE as my main automation tool.

Use Python/Shell for tasks that are not well suited for a configuration management server. Usually, this is when procedural code makes more sense than the declarative style of Puppet manifests. Interactive "wizards" (i.e. add domain users accounts to a samba server, and create home directories for them) and database/file backups are my usual uses for these types of scripts.

Fabric is a useful tool to use with Python. It lets you wrap SSH commands in functions and send them to groups of servers in bulk.
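For instance, with Fabric 2's group API, running one command across several hosts looks roughly like this (hostnames are placeholders):

  from fabric import SerialGroup

  # run the same command on each host in turn and collect the results
  results = SerialGroup("web1.example.com", "web2.example.com").run("uname -a")
  for connection, result in results.items():
      print(connection.host, result.stdout.strip())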

I also use Python for troubleshooting network issues. It has libraries to interact with all manner of network services and protocols, as well as for crafting packets and creating raw sockets.

Look into PowerShell if you work in a Windows environment. Everything from Microsoft is hooked into PowerShell in their newer versions.


I evaluated Puppet, Chef, and Ansible and Ansible was by far the easiest for me to learn. My favorite thing I use it for is my personal laptop setup. I can go from clean Ubuntu install to having all the software I use along with system settings all perfect within an hour, without me being involved.


The real world tends to have a diversity of hosts and an automation system that only partially applies is in many ways worse than none at all.

I ended up with Ansible because it seems to require nothing on the far side but Python 2 and SSH. Everything else needed a specific version of a specific client software. CFEngine was particularly awful in a diverse setup, and fighting SSL to make Puppet work was never fun. I haven't found anything Ansible won't talk to, or can't be made to talk to eventually. I used Puppet for a bit less than a decade before switching to Ansible maybe two years ago, and I don't miss fighting its SSL at all. Nor do I miss the puppetmaster randomly crashing (which I'd find out about when a deployment failed, or worse, failed partially), or the puppetmaster process taking 5 minutes to start because, well, I dunno why, but it sure was annoying.

One interesting anti-pattern I saw over the years was people using Puppet (or Ansible or whatever) as the world's weirdest version control system. You can git pull a release branch, or you can write 10K custom lines (no kidding) of automation software to replace one line of 'git pull'. Depends on whether you're paid by the hour/LoC or not. The borderline between "this is managed with ansible/puppet/wtf" and "this is managed by living in a git repo" can be annoyingly fuzzy, and going to either extreme doesn't turn out well...


The funny thing is that with Docker builds, that anti-pattern is cool again. Why bother telling the remote to pull from Git when you can just push it an entire disk image that knows nothing about source control? Of course, those 10K lines are still there, they're just hidden by Docker abstractions :)



It's like choosing a programming language by selecting the one that's easiest for you to learn. You haven't learned anything besides new punctuation and keywords.

Ansible is basically a glorified shell script executed semi-interactively through SSH. It doesn't exhibit a new paradigm in server management by any stretch.

CFEngine, on the other hand, shows a whole other way to approach managing systems, and thus it's quite hard to learn (akin to learning Prolog or Coq after writing in Java). Puppet stems from that, but then slaps many weird and possibly unnecessary concepts on top, without much of an apparent plan or strategy, so it ended up a mess. Chef is a Ruby framework and some tooling around it, so no wonder it's more difficult to deploy than shell scripts over SSH.


> It's like choosing programming language by selecting the one that's easiest for you to learn.

It is. Is there anything wrong with that if the most important feature to you based on your requirements is that it be the quickest to learn?

I needed something that was very quick to learn and I had most of my setup automated in Ansible before I had even finished trying to learn the other options. Ansible won hands down on the "easy to learn" factor. So far it meets all my needs, and thanks to Ansible Galaxy, I've had to do very little config myself. 95% of the time someone has already written what I need.

> thus it's quite hard to learn

Then it's definitely the wrong product based on my requirements at the time.


>> It's like choosing programming language by selecting the one that's easiest for you to learn.

> It is. Is there anything wrong with that if the most important feature to you based on your requirements is that it be the quickest to learn?

On its own it's not a wrong thing, but as I said, you didn't learn anything substantially different, so you're stuck with an approach that doesn't scale. You only added a shell-script SSH executor to your toolbelt, while you still haven't seen a proper configuration management system.


If I had spent the time to learn a "proper" configuration management system, it would have been a loss. I had a very limited amount of time to make it worth automating what amounts to a few hours of manual work.

> you didn't learn anything substantially different

So what? Learning something substantially different wasn't one of my requirements. My goal was to automate a few simple deployments, including my personal laptop setup, which previously took at least a day. Ansible got the job done and I haven't looked back.

> You only added shell script SSH executor to your toolbelt

You've said this enough that I'll push back. That's definitely wrong. I can write shell scripts. I would have never attempted this in a shell script. Ansible was far easier than writing a shell script, and it does a lot more than just executing scripts via SSH. For one, it checks the current state of the system it is configuring to determine whether or not software needs to be installed.

> you're stuck with the approach that doesn't scale.

That wasn't one of my requirements either. Although I'm not even sure your claim is true considering that there are some pretty big companies using Ansible, and it was recommended to me by someone who is one of the better programmers I have met in my long career.

The whole point of my original post was that I actually tried products instead of reading about pros and cons. Based on my needs, Ansible was far superior to the other products I tried. Because my requirements aren't your requirements. I'm not so sure why it seems that it's so important to convince me and / or others that Ansible isn't a real tool. People should spend some time trying a few of the most popular tools and decide for themselves which tool best meets their needs. Which may include constraints like how much time they can budget towards learning a tool. "Proper" tool or not.


>> You only added shell script SSH executor to your toolbelt

> You've said this enough that I'll push back. That's definitely wrong. [...] For one, [Ansible] checks the current state of the system it is configuring to determine whether or not software needs to be installed.

So you claim that Ansible learns the state of the system through some magic that cannot be invoked from a shell? Because I could do pretty much the same in a shell script just fine.

I've seen a different paradigm for managing servers; I thought about it for a long time, figuring out the differences from a shell script, and this is not where they are.

> I'm not even sure your claim [the approach doesn't scale] is true considering that there are some pretty big companies using Ansible

There are many big companies that throw money at dumb processes just to keep them running, so this argument is pretty weak.

> [Ansible] was recommended to me by someone who is one of the better programmers I have met in my long career.

You see, precisely this is the issue here. It was recommended to you by a programmer. You should have asked a sysadmin, because that is where these tools come from and whose tasks they automate. For some strange reason, programmers tend to avoid tools available to sysadmins, which is quite a big blind spot (even more so since it's usually an unrealized one).


> So you claim that Ansible learns the state of the system through some magic that is not possible to be called from shell?

I never claimed that.

> Because I could do pretty much the same in a shell script just fine.

So could I. But I didn't have to, because Ansible does it for me. So right there, just by comparing my declarative configuration with the current system state, it's clearly adding more value than a "shell script executor".

> There are many big companies that throw money at dumb processes only to make them running

You said it can't scale. You're now moving the goalposts to "well, it can scale, but it takes more money".

> You should have asked a sysadmin

Fortunately I didn't ask this sysadmin because you keep confusing your own requirements with mine. I was never looking for a product that is best of class but takes weeks or months to learn. I've already made it dead clear several times that my most important requirement was "easy and quick to learn". Ansible is the best product of those I tried when that's the primary requirement.

> You see, precisely this is the issue here.

The only issue here is that you can't accept that Ansible was the best fit for my requirements.

> For some strange reason, programmers tend to avoid tools available to sysadmins, which is a quite big blind spot

Some programmers have enough common sense not to spend months learning a tool to automate a few hours of work that happens a few times a year. I asked my friend which would be the quickest to learn; he said Ansible. I wasn't hot on it because Chef and Puppet were more popular, so I tried learning Chef and Puppet, and when I realized how long that was going to take, I tried Ansible. Within a few hours I had everything automated with Ansible. It was hands-down the best solution based on my requirements. So I do agree there is quite a big blind spot here. But I wouldn't say it's on my side, or on the side of the friend whose advice turned out to be spot on. ;-)


> So right there, just by comparing my declarative configuration with the current system state, it's clearly adding more value than "shell script executer".

Very little value in this regard. As I said, I could just as easily write declarative configuration for machine deployment in a shell script.

>> There are many big companies that throw money at dumb processes only to make them running

> You said it can't scale. You're now moving the goalposts to "well, it can scale, but it takes more money".

I'm not moving any goalposts. If a process takes disproportionately more money or effort for more input, it doesn't scale. Heck, in this case it doesn't scale even if it takes resources proportional to the input!

> [...] you keep confusing your own requirements with mine.

I don't. I'm saying that you didn't learn anything significant, and as such, you would have been just as well off if you had put some thought into how to do exactly the same thing with the tools you had (shell scripts). But now you're probably worse off, because you most probably got misled by Ansible's marketing into believing it is also a configuration management solution.


> if you put some thought into how

While you were over there thinking about it, I had already completed the job. Which was my #1 requirement. It doesn't seem like you'll ever understand that different requirements often result in different tool choices, but that's ok.

> But now you're probably worse off

Probably not. I now have several things automated that I once had to do manually. I may not be the smartest person in tech, but I'm smart enough that if I need a better solution than Ansible at some point in my career then at that point I'll simply learn the better solution. Different requirements = different tool choices.


Ok, I'll bite: If shell scripts, Ansible, Chef, and Puppet all suck, according to you, what solution would a competent Sysadmin use?


> you still haven't seen a proper configuration management system.

No true Scotsman...


I started with Puppet and moved to Ansible. Ansible is much easier to debug, because everything happens in sequence and it doesn't freak out and break if you have the same identical command in two modules. Puppet builds this rat's-nest dependency graph before it runs, and that's a pain in the arse to debug. Ansible's limitation is SSH bandwidth - something like Puppet seems to me better for more heavyweight environments where you have a server count of three figures or more.


There are many kinds of "difficult" - sometimes the difficulties may be just conceptual, other times it's because the tool is not good for the job you have in mind.

For instance, I once took a serious look at Angular and then decided it's too difficult. I'm sure there is time and place for Enterprise JavaScript, but none of the things I'm doing right now would benefit from it (and frankly, neither do most of the smaller sites I've seen using it).


I automate things that a computer can do


very informative


I automate legal documents using Advobot (advobot.co), a messenger-based chatbot that walks you through drafting legal documents. It makes drafting easy and conversational, and it is much faster than traditional methods. I can also use it from my phone, which makes drafting documents on the go much easier.

advobot.co/web


My team and I run a Reddit/HN-like community platform called Snapzu, and we automate most (90%) of our social media channels.

We have 15 main categories, each with their own Twitter, Medium, WP, Blogger, etc. Here's an example of our science Twitter account: http://twitter.com/@Snapzu_Science


Oh boy, sigh, I wish I could share something I just automated; it's insane. Like, everyone who sees it tells me it's pure genius.

Problem is that it isn't ready for the public. I'll do a Show HN next week, but by GOD it is a brilliant piece of automation and scaling :P

Soon. (This is more for me than anyone else; I'm literally bursting with pride right now.)



