
Are you seriously suggesting that you can awk a makefile and get anything useful out?



Why would I need to AWK a Makefile, when make will take macro definitions as arguments on the command line?
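
For instance, something like this (CC, CFLAGS and the "all" target are just illustrative names):

  # override make macros for this one invocation; no need to edit or
  # text-process the Makefile itself
  make CC=clang CFLAGS='-O2 -g' all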


You were the one complaining that you couldn't awk an xml file in the context of "xml versus makefile".


No, I wrote that XML for use in applications is bad, as it cannot be easily processed with standard UNIX tools. And it's most definitely bad for building software, as it is limited by what the programmer of the build software thought should be supported. A really good example of that is ANT/NANT. make, on the other hand, doesn't limit one to what the author(s) thought should be supported. Need to run programs in order to get from A to B? No problem, put whatever you need in, and have it build you the desired target.
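
A minimal sketch of what I mean (target and tool names are made up; recipe lines in a real Makefile start with a literal tab):

  # any chain of programs can produce the target; make doesn't care what they are
  data.out: data.in
          ./preprocess data.in | sort > data.out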


Yes. Don't use XML as an exchange format. Use JSON or DSV instead.

Yes, I said JSON. JSON is very easy to parse, and you can grab unique key/values, which are most of them, with this regex:

  /(,|\{)\s*"<key>"\s*:\s*(.*?)\s*(,|\})/
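
Dropped into a one-liner, for instance (the file name and the "name" key are made up):

  # print the value of the "name" key from each input line
  perl -ne 'print "$2\n" if /(,|\{)\s*"name"\s*:\s*(.*?)\s*(,|\})/' data.json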


PCRE. So now you have to use Perl? And what happens when your single JSON record spans multiple lines, and has recursive structures?


First off, I simply used PCRE syntax because it's what I'm familiar with. The character-class escapes could easily be replaced with POSIX classes, and non-greedy matching is a relatively common extension.

As for records that span multiple lines, with recursive structures: the previous regex is for extracting simple atomic data from a JSON file, which is usually what you want in these cases anyway. If not, the json(1) utility can, I believe, extract arbitrary fields, and it composes well with awk, grep, etc.
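
Something along these lines, for example (this assumes the node-based json(1), a.k.a. jsontool; other JSON command-line tools spell the lookup differently):

  # pull one nested field out of a JSON document and hand it to the usual tools
  echo '{"user": {"name": "smith", "uid": 1001}}' | json user.name | grep -c smith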


Yes, the json utility can process a JSON file into key:value pairs. Now ask yourself: if you end up with key:value pairs on stdout, why couldn't that have been the format in the first place? Why artificially impose not one, but two layers of complications (one as JSON, the other as the specialized json tool to process JSON)? Why not just keep it simple and go directly to a flat file ASCII format to begin with?


Well, it means not rolling your own parser, though for a flat format that's not hard anyway. The real advantage comes when you actually ARE dealing with structured data, with nested objects. Most standard UNIX formats are bad at that, and sometimes you genuinely need it.

Also, because JSON is so common, you get really good tooling for handling structured data by default, instead of kinda-okay tooling for 50 different slightly-incompatible formats. 10 operations on 10 data structures vs. 100 operations on 1, and all that.

But for unstructured data, or for one-level key/value data, JSON is overkill. You can use DSV, like this:

  key1:value1
  key2:value2
  and so:on
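
And that format already composes with the standard tools; for example (file name made up):

  # look up a single key in a colon-delimited file
  awk -F: '$1 == "key2" { print $2 }' config.dsv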



