Hacker News

For mass-scripting I'd prefer to use a native HTTP library with connection pooling.



Sure, I can agree with that somewhat.

Still, it was a one-off thing and I didn't want to turn it into a project. A for loop, `curl`, `wkhtmltopdf` and `parallel` eventually did the job.
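For flavor, a serial dry-run sketch of that kind of job — the URL list and output names are hypothetical, and `echo` stands in for the real `wkhtmltopdf` call (which the original fanned out with `curl` and GNU `parallel`):

```shell
# Dry run: print the commands a one-off page-to-PDF job would execute.
# Swap echo for the real tools once curl/wkhtmltopdf are available.
i=0
while IFS= read -r url; do
  i=$((i + 1))
  echo "wkhtmltopdf $url page-$i.pdf"
done <<'EOF'
https://example.com/a
https://example.com/b
EOF
```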


I for one am totally on board with writing atrocious grep/sed/awk pipeline mashups, doing whatever it takes to get a one-off job done with as little programming as possible. Sometimes the important parameter to optimize is brain cycles spent, or time to result, not CPU time or optimal usage of network resources.
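In that spirit, a minimal example of the genre — the inline sample text is made up, but the shape (glue standard tools together, read the answer, move on) is the whole point:

```shell
# Three most frequent words in some sample text; prints "3 foo",
# "2 bar", "1 baz" (modulo uniq -c's leading padding).
printf 'foo bar foo\nbaz foo bar\n' \
  | tr ' ' '\n' \
  | sort \
  | uniq -c \
  | sort -rn \
  | head -3
```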


Absolutely. I'd even argue that the brain cycles spent should be the first and most important metric for one-off tasks.

That's why I recently asked people here about what to use if not shell scripting. I think I'll either use a less confusing scripting language that is almost like bash/zsh -- namely `rc` -- or I'll brush off my OCaml skills.

That need was born of my frustration that I routinely spent non-trivial amounts of time on one-off scripts; at one point I got fed up and started looking for alternatives.


I currently have the dynamic language blues – Python to me just doesn't feel like as much fun as it used to. Maybe it's getting too big and/or too crufty. At the moment I'm having great fun with Rust instead, which feels as magical as Python and C did when I was twenty years younger.

That said, bash/sh are really good at setting up pipelines to pass around lines of text between Unix tools. That is a really cool thing, and most languages don't treat it as a first-class concept. Because bash does, and I have twenty years of experience with it, I mostly put up with it. It's not a great language but it gets the job done, literally.
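To illustrate the point (the log lines here are made up): in bash the pipeline is the syntax itself, whereas most general-purpose languages need a subprocess API and explicit plumbing for the same one-liner.

```shell
# Count successful requests: every stage is just another process on
# the pipe, no setup code required.
printf 'GET /a 200\nGET /b 500\nGET /c 200\n' \
  | awk '$3 == 200 { n++ } END { print n }'   # prints 2
```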

I recently read some stuff about Julia which made me interested in it as a potentially better Bash (and hopefully more fun than Python) –

https://julialang.org/blog/2012/03/shelling-out-sucks/

https://julialang.org/blog/2013/04/put-this-in-your-pipe/

https://docs.julialang.org/en/v1/manual/running-external-pro...


> I currently have the dynamic language blues...

Same; I used to be a Ruby (and Rails) fanatic a long time ago, but that has subsided. Nowadays I love Elixir and I reach for it in many scenarios where I don't expect CPU-bound workloads, but I am acutely aware that its startup times (when scripting) are atrocious, and working with it in a business context for 4.5 years made me painfully aware of the problems caused by a lack of static typing.

> At the moment I'm having great fun with Rust instead

Same, I love it more than almost any other tool I've tried in my life. I'm just not sure how economical it is to reach for it for throwaway one-off tasks. I suppose that with more experience it would become feasible.

> It's not a great language but it gets the job done, literally.

Agreed, I have achieved a lot with bash and `parallel` alone. But at one point it became death by a thousand paper cuts: I know I can do the task, but I can never quite remember all the little annoying details — iterating over numbers, iterating over a file list from a previous command, whether there is anything special about piping inside scripts (there isn't, but you have to be mindful of settings like `set -euo pipefail`), etc.
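For concreteness, a sketch of the little details in question (bash-specific, and only one reasonable set of choices among several):

```shell
#!/usr/bin/env bash
set -euo pipefail        # abort on errors, unset variables, pipe failures

# Iterating over numbers:
for i in $(seq 1 3); do
  echo "n=$i"
done

# Iterating over a file list produced by a previous command,
# safely with respect to spaces in names (note: the while body runs
# in a subshell because of the pipe -- another classic paper cut):
printf '%s\n' 'a b.txt' 'c.txt' | while IFS= read -r f; do
  echo "file: $f"
done
```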

It might be me getting stupider and less tolerant with age but you know what? I am okay with that. These tools / languages should be intuitive and easy to remember!

I am not abandoning bash and zsh scripting, ever. But my tolerance for banging my head against the wall has severely diminished lately. And, as mentioned in my other comments here, I'll either use `rc` or re-learn OCaml -- it's extremely terse, compiles to very fast binaries (faster than Go's and only slightly slower than Rust's) and has strong static typing.

But time will tell.

---

Thank you for the fantastic links. Opened them on my phone, they'll definitely be researched.


aiohttp for Python is relatively performant, and Python itself is a fairly easy language to write.


Isn't that a library and not a CLI tool?


Yes, in the context of my comment higher in the thread about using libraries. This gives you a lot more control of the connection pool and queues, which can get a lot more performance than simple cURL usage. From the cURL manpage:

  curl will attempt to re-use connections for multiple file transfers, so that
  getting many files from the same server will not do multiple connects /
  handshakes. This improves speed. Of course this is only done on files
  specified on a single command line and cannot be used between separate
  curl invokes.

So using a library gives you more flexibility in reusing connections (and avoiding expensive TCP/TLS handshakes) than the simple patterns available with the cURL CLI.

See also: https://docs.aiohttp.org/en/latest/http_request_lifecycle.ht...


Oh, believe me, I am aware. I just didn't want to write a program for my task back then (even though, as it turned out, it would likely have taken about the same amount of time).

That's why I am searching for an alternative to the classic shell scripting and I have almost settled on OCaml.

Python is alright as well, but I never really liked it: just when you need a tool that's quick to launch, you hit the Python interpreter's startup overhead. The same goes for many other interpreted languages.
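A quick way to see that overhead in isolation — both programs below are empty, so any measured wall time is pure launch cost (timings vary by machine; a `python3` launch typically costs tens of milliseconds, an `sh` launch around one):

```shell
# Nothing but startup and teardown; wrap these in `time` (or a tool
# like hyperfine) to see the per-launch cost that dominates tiny scripts.
python3 -c 'pass'
sh -c ':'
```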



