
Looks really good. I've felt the need for such an app on more than one occasion, but I'll have to wait for the iOS version, I guess (based on the other comment).

Aside: following your GitHub link led me to discover another one of your music-related projects, harmony-explorer, which seems quite similar to something I've wanted to build for myself.


Thanks - it's good to know there really is demand for iOS!

Parts of Harmony Explorer are kind of still in the app to this very day, namely the modulo-12 arithmetic for notes. And once support for sounding chords is added, that will also be heavily based on how things were done in Harmony Explorer.


Org-mode[1] has support for embedded LaTeX math fragments[2], which might be worth trying.

[1]: Org-mode is a major mode for the Emacs editor - http://orgmode.org/

[2]: http://orgmode.org/manual/LaTeX-fragments.html#LaTeX-fragmen...
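For example, standard TeX-style fragments written directly in an org buffer are recognised (a minimal illustration; org can preview these inline and renders them on export):

$a^2 + b^2 = c^2$

\begin{equation}
e^{i\pi} + 1 = 0
\end{equation}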


There's also Clint by Kenneth Reitz (https://github.com/kennethreitz/clint)


Clint is a command-line toolkit (colours, progress bars, etc.), not an argument-parsing tool.


Nice. I learnt about inotifywait only a couple of days back and chose it over the bazillion other watch modules in various languages. In fact, I think it could be called the "poor man's auto-anything for any language" :-) For example, I use it to auto-recompile LESS files to CSS as follows:

#!/bin/sh

while inotifywait -e modify resources/public/css/less/*.less; do
  lessc resources/public/css/less/*.less resources/public/css/*.css
done


When I was in college, my project teammates and I used to go to this fabrication workshop for project work once a week. I was kind of a Counter-Strike addict in those days. The workshop was inside an old building, and at one particular spot I used to feel like hiding behind the wall!


> It could just be me, but I prefer more control, and using the requests library gave me that

I had the exact same opinion until a month back, when I started using Scrapy + scrapyd[1] for a serious scraping task. Yes, it's a long-running spider, which is why I decided to use Scrapy in the first place. But so far I have been really impressed with it in general and might consider it even for one-off crawling in the future. IMO, pipelines etc. are worth learning. I would also recommend using the httpcache middleware during development, which makes testing and debugging easier without having to download content again and again.
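For instance, assuming a project with a spider named myspider (just a placeholder name), the cache can be turned on for a single run from the command line instead of editing settings.py:

$ scrapy crawl myspider -s HTTPCACHE_ENABLED=True

After the first run, responses come from the local cache (under .scrapy/httpcache by default), so repeated runs while debugging don't hit the site again.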

On a separate note, for one-off scraping it's also worth checking whether a Python script is required at all. For example, if you just need to download all images, wget with its recursive download option should work pretty well in most cases. A while back I wanted to download all Game of Life configs (plain text files with a .cells extension) from a certain site[2]. Initially I wrote a Python script, but when I learnt about wget's -r option, I replaced it with the following one-liner:

$ wget -P ./cells -nd -r -l 2 -A cells http://www.bitstorm.org/gameoflife/lexicon/

[1]: http://scrapyd.readthedocs.org/en/latest/

[2]: http://www.bitstorm.org/gameoflife/lexicon/


> Everyone should absolutely be running from a virtualenv. Never touch the system python.

Why exactly? I find it convenient to have some packages installed system-wide so that they can be used by quickly loading up a Python shell without having to activate a virtualenv first, or when they are needed outside any specific project, e.g. requests, nose, jedi, pyflakes, sphinx, etc.


For most systems, if you have a "system python" you'll want the system package manager to manage that Python (and its packages), because breaking Python can mean breaking the package manager.

Personally I enjoy having a few bits installed under ~/opt/py-venv and simply adding ~/opt/py-venv/bin to my path. There's usually no need to activate a venv to use it -- just call that venv/bin/{python|pip|hg|ipython|<whatever>}.

In other words, whenever I "pip install something", that something is installed in my "default" virtualenv. And if/when things get out of hand, or I need to upgrade to a new Python, I can just recreate the virtualenv and install whatever is needed.
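Concretely, the setup is roughly as follows (the path and packages are just an example):

$ virtualenv ~/opt/py-venv
$ ~/opt/py-venv/bin/pip install requests ipython

and then in ~/.profile:

export PATH="$HOME/opt/py-venv/bin:$PATH"

After that, a bare pip/python/ipython resolves to the venv, and the system Python is left alone.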


I have a default.env that is activated in my .profile so I can always just do a `pip install <package>` w/o having to touch the system python. The system python is for the system, not me.


+1. Also, being written in uppercase, it seems to be an abbreviation, but I couldn't find a possible expansion on the site (assuming the intended meaning is something other than Read-Eval-Print Loop).


I can somewhat relate to the conversations in the article. While I admit I probably wasn't using OOP correctly, since I started using more functions instead of classes (in Python, my primary language), I've found it more convenient to reuse and refactor existing code.

Another observation is that it's far easier to read someone else's code if there is no mutation. For example, I have enrolled in the proglang course[1] on Coursera, and only yesterday I completed this week's homework, which involves enhancing an already-written game of Tetris in Ruby. A major part of the assignment was reading and understanding the provided code, which uses OOP and makes heavy use of mutation.

It was quite difficult to understand what one method does without having a picture of the current state of the object, especially with side-effecting methods that call other side-effecting methods. A few times I had to add print statements here and there to understand which one is being called when, and how many times. While I scored full marks in the end, I am still not confident that I completely understand the code, and I have a feeling that if I ever need to work with it again, loading everything back into my head will take about as long as it did the first time.

Of course, one could only make a true comparison by studying an implementation of Tetris written in an FP style. But from my experience reading and writing code in Erlang (in production) and Racket and Clojure (as a hobby), I find it relatively easier to study functional code written by others.

[1]: https://www.coursera.org/course/proglang


Heads up: every blog post seems to show the same figure for "x other readers are here", which is misleading.


Yeah, we implemented it naively in a few hours one night this week; we plan to go back and do a deeper sidebar listing our URLs.

