> We also recommend to save the command used to generate a figure in the LaTeX file
An approach I have adopted recently is Knitr[1], so this layer of indirection goes away. With knitr, my data goes directly into the paper repository, and then my Makefile has something like this:
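A minimal sketch of such a rule, with hypothetical file names (`paper.Rnw`, `data/results.csv` are made up for illustration):

```makefile
# Re-knit the paper whenever the knitr source or the data changes.
paper.pdf: paper.Rnw data/results.csv
	Rscript -e 'knitr::knit("paper.Rnw")'
	pdflatex paper.tex
```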
The nice thing is exactly what the authors recommend: it's much easier to enforce a standard appearance across all the figures, and automatically incorporate more recent data into the paper as part of the compilation process.
I'd also add that for figures, Inkscape is invaluable [1]. Save as SVG once, and export to whatever format you need later. I typically export to PDF (from within Inkscape) for pdflatex.
While it's typically indispensable for schematics, I often run into the use case of combining previously generated plots or figures, or adding a label or some text. Since Inkscape can import PNGs, this is a breeze: I don't have to go back to the original code to regenerate plots, or fiddle around with LaTeX to make minor adjustments.
For stuff generated via matplotlib, I'd strongly recommend seaborn as an additional library [2]. It's a wrapper over matplotlib that can prettify plots with just an import and a 'set' command. You can, of course, use it to plot too, and for anything doable in matplotlib, the seaborn alternative is usually easier and looks better with little or no extra work. It also supports pandas DataFrames.
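A minimal sketch of that import-and-set workflow (seaborn and matplotlib assumed installed; the Agg backend is only here to keep the example display-free):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, so no display is needed
import matplotlib.pyplot as plt
import seaborn as sns

sns.set()  # one call restyles every subsequent matplotlib figure

fig, ax = plt.subplots()
ax.plot([0, 1, 2, 3], [0, 1, 4, 9])
fig.savefig("styled.pdf")
```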
The problem with Inkscape is that any slight change to a figure forces you back deep into the workflow pipeline. Using LaTeX packages like TikZ or PSTricks instead simplifies the workflow and makes the document more maintainable.
Having just completed a dissertation in LaTeX, with figures online in Overleaf and Dropbox (some of them screenshots), scripts and data spread across two computers and an external hard drive, and desperate last-minute plot text changes made right in the PDF, I just have to ask: WHY DIDN'T YOU POST ALL THIS SOONER?
If you are serious about making beautiful figures in LaTeX, I would seriously recommend using TikZ and pgfplots. It is quite easy to generate TikZ code automatically from Python (after all, it is supposed to be read and written by humans), and all aspects of the figure can easily be customized. I have been quite successful in generating automated reports with pretty, easily readable figures using TikZ and pgf.
If anyone is interested, I have uploaded a sample script for generating XY-plots from two numpy lists to GitHub. The code is by no means very good, but I just wanted to share it in case anyone wants to try this approach.
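The core of the idea fits in a few lines. A hedged sketch (not the actual script; `xy_plot_tikz` is a made-up name): serialize two arrays into a pgfplots `coordinates` list.

```python
import numpy as np

def xy_plot_tikz(x, y, xlabel="x", ylabel="y"):
    """Return TikZ/pgfplots source for a simple XY line plot."""
    coords = "\n".join(f"      ({a:g}, {b:g})" for a, b in zip(x, y))
    return "\n".join([
        r"\begin{tikzpicture}",
        rf"  \begin{{axis}}[xlabel={{{xlabel}}}, ylabel={{{ylabel}}}]",
        r"    \addplot coordinates {",
        coords,
        r"    };",
        r"  \end{axis}",
        r"\end{tikzpicture}",
    ])

x = np.linspace(0, 2, 5)
print(xy_plot_tikz(x, x**2))
```

The output can be written to a `.tikz` file and pulled into the paper with `\input`.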
And it's also possible to load a CSV file with all the data directly in LaTeX and plot it with pgf, which makes it possible to keep all the plotting options in the LaTeX file:
\addplot table [col sep=comma, x={Column1}, y={Column2}] {myData.csv};
The issue is that pgf can take some time to load the data and do computations on it, but you can use TikZ's external library so that it does not recompute the plot every time (it saves the plot as a PDF for later use).
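A sketch of that externalization setup in the preamble (the cache directory name is arbitrary and must exist before compiling):

```latex
\usepackage{pgfplots}
\usetikzlibrary{external}
% Each tikzpicture is compiled once and cached as a PDF;
% it is only rebuilt when its code changes.
\tikzexternalize[prefix=tikz-cache/]
```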
Indeed, TikZ is great for creating beautiful-looking plots! A friend of mine is quite good at it, e.g. look at Figure 2 in [1] and Figure 2.1 (p. 18) in [2]. For complex figures, though, I find that TikZ can be a bit hard to master, and it sometimes results in longer compilation times.
> When writing LaTeX documents, put one sentence per line in your source file.
An interesting tip, I'd never thought of that! It changes the way you write a bit, but it does make finding changes and errors easier, and forces you to think more about each sentence since you have to hit "enter" at the end of each one.
It also works much better with version control software (git). Not only does it help with diffs, as the article mentions, but it makes merging way easier when you and a coauthor change two adjacent sentences at the same time.
Interestingly enough, I made a very similar comment[1] about code line length and only using one statement per line in another HN thread several days ago.
I also recommend separating the repetitive parts of plot-generating code into template files, e.g. with mako or jinja2, and then programmatically generating sequences of plots by first piping the data into the jinja2 template and then using \input commands to pull the result into the bigger TeX document.
I found this helpful when writing a paper where the appendix needed over 35 different tables of regression results, all with the same format but populated with data from different subpopulations, which would need to be regenerated (including updated captions, etc.) any time data cleaning or methodology was changed.
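A minimal sketch of that pattern (the names and numbers below are invented for illustration):

```python
import jinja2

# One template, rendered once per subpopulation. Jinja's default
# {{ }} / {% %} delimiters don't clash with LaTeX's single braces.
template = jinja2.Template(r"""
\begin{table}
  \caption{Regression results: {{ subpop }}}
  \begin{tabular}{lr}
    \hline
    {% for name, coef in rows %}{{ name }} & {{ "%.3f"|format(coef) }} \\
    {% endfor %}\hline
  \end{tabular}
\end{table}
""")

tex = template.render(subpop="under 30",
                      rows=[("age", 0.1234), ("income", -0.56)])
print(tex)  # write to e.g. tables/under30.tex and \input it from the paper
```

When the data cleaning changes, regenerating all the tables is then just a loop over subpopulations.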
That's a great point! Templates are a great tool for generating big tables from results; I usually do that for most of the results in my papers, and it cuts down on the odd copy/paste error. I might add this to the tips and tricks, thanks!
Re: figures in EPS. I think SVG is the way to go. It can be generated with matplotlib or even a simpler script (it's just XML, after all). It can be hand-edited, it's viewable in a browser, and it can be converted to PDF with rsvg-convert.
I personally find matplotlib a bit unintuitive to use, so I made a 100-line script for generating SVG. It's great.
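The core of such a script really is small. A hedged sketch (not the actual script; `svg_polyline` is a made-up name):

```python
def svg_polyline(points, width=400, height=300):
    """Return a minimal SVG line plot as a string."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    # Scale data into the viewport, flipping y (SVG's origin is top-left).
    def sx(x): return width * (x - min(xs)) / (max(xs) - min(xs))
    def sy(y): return height * (1 - (y - min(ys)) / (max(ys) - min(ys)))
    pts = " ".join(f"{sx(x):.1f},{sy(y):.1f}" for x, y in points)
    return (f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'width="{width}" height="{height}">'
            f'<polyline points="{pts}" fill="none" stroke="black"/></svg>')

print(svg_polyline([(0, 0), (1, 1), (2, 4), (3, 9)]))
```

Axes, ticks, and labels are more string assembly of the same kind, which is how such a script stays around a hundred lines.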
This is probably most useful for postgrad students getting started with writing in TeX.
It’s worth pointing out that the figures are made using the matplotlib library, whose plotting interface is modeled on Matlab’s. That is perhaps just as useful for new researchers, as many of them are taught Matlab exclusively throughout their undergraduate courses.
A minor plug: I found I was generating graphs and tables in Jupyter notebooks, so I wrote ipynb-tex to let you reference cells from a notebook directly in your LaTeX documents. It supports both tables and figures.
One itch which (curiously) I can't seem to quite scratch in LaTeX is that it should be possible to say "plot equation \ref{eq:smth} for X in (-4,4)" and just get the bloody graph. Why should I need to define the equation again in a separate place, perhaps even in a separate file?
This is not what you asked for, since it still requires a separate file. However it might be close enough to what you want, and -- for complicated expressions -- possibly even better.
You can write (or derive) the expression using sympy, then have sympy generate a numpy expression that can be evaluated. Sympy can also generate the LaTeX code for any expression. So while that isn't an in-LaTeX solution, it may be close to what you want.
Johansson's "Numerical Python" shows several examples of this. I will scavenge one of his examples below (trusting it falls under "fair use", and hoping I transcribe it correctly -- note I have left out the imports). The example uses sympy to generate and plot Taylor series expansions of sin(x).
The key bit to look for in the example is `sympy.lambdify()`.
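A reconstruction in the same spirit (not Johansson's verbatim code): sympy builds the Taylor expansions, `lambdify()` turns each one into a numpy-evaluable function, and `sympy.latex()` gives back the LaTeX source for the same expression.

```python
import numpy as np
import sympy

x = sympy.symbols("x")
expr = sympy.sin(x)
xs = np.linspace(-4, 4, 200)

for order in (2, 4, 6, 8):
    # series() gives sin(x) + O(x**order); removeO() keeps the polynomial
    approx = expr.series(x, 0, order).removeO()
    f = sympy.lambdify(x, approx, "numpy")  # numpy-evaluable function
    ys = f(np.asarray(xs))                  # values ready for plotting
    print(f"order {order}: {sympy.latex(approx)}")
```

So the expression is defined once, and both the plot data and the LaTeX typesetting are derived from it.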
LaTeX doesn't have enough information about what your notations mean. You can very well write nonsensical formulas that look pretty in LaTeX but are absolutely meaningless.
I wish I had read The TeXbook or something similar sooner to gain knowledge like this. I used LaTeX for years without knowing the basics, and I regret that a lot.
Also, \phantom/\vphantom and \smash are something I really should have learned before all those fancy packages. Nowadays I mostly use ConTeXt anyway.
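For reference, a small illustration of what those primitives do:

```latex
% \vphantom{X} takes up X's height without printing it;
% \smash{X} prints X but pretends it has zero height/depth.
% (\smash's optional argument needs amsmath.)
\[
  \sqrt{a} \quad
  \sqrt{\vphantom{b^2} a} \quad  % radical as tall as one over b^2
  \sqrt{\smash[b]{b_j}}          % depth of b_j ignored, tighter radical
\]
```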
I’ve written documents in org mode and converted to pdf via LaTeX, but I find that if the document gets sufficiently complicated with formatting, I have so many LaTeX blocks in my org file I might as well be writing LaTeX directly.
I already implement most of the points mentioned there. The most useful (and new) tip for me, however, was the rasterization part. I normally like to have PDF figures for my LaTeX papers, but last time I had graphics with some thousands of plotted points, which took too long to print from Windows (on Linux there was no problem, which is why I didn't catch it earlier). In the end I decided to save the plot as a PNG, but I wasn't happy about it, haha. It would have been good to know the rasterization trick earlier.
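In matplotlib the trick is per-artist. A sketch (file names made up) that keeps text and axes as vectors in the PDF while flattening only the dense point cloud to a bitmap:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.normal(size=(2, 100_000))

fig, ax = plt.subplots()
# rasterized=True flattens just this artist inside the output PDF
ax.scatter(x, y, s=1, rasterized=True)
ax.set_xlabel("x")
ax.set_ylabel("y")
fig.savefig("dense.pdf", dpi=200)  # dpi controls the rasterized layer
```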
Thanks for the write-up! Two notes from my experience: PGF output works well with LaTeX too (although it will slow down compilation), and I recommend not using the pyplot submodule, especially if you'll be running things remotely over SSH and don't have a display.
I had problems using pyplot over SSH because it can assume there's a display and fail when it can't find one.
Maybe this has changed. I use the OO interface. For example https://matplotlib.org/gallery/api/agg_oo_sgskip.html
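In brief, the display-free OO pattern from that gallery example: construct the Figure directly on an Agg canvas, never touching pyplot.

```python
from matplotlib.backends.backend_agg import FigureCanvasAgg
from matplotlib.figure import Figure

fig = Figure()
canvas = FigureCanvasAgg(fig)  # attach a raster (display-free) canvas
ax = fig.add_subplot(111)
ax.plot([0, 1, 2], [0, 1, 4])
fig.savefig("remote_plot.png")
```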
Indeed, I remember having to switch to Agg for this reason but can't remember why I switched back. Maybe some rendering issues I had with Agg, not sure...
Is there a better, more comprehensive plotting library than matplotlib? Its 3D plots lack polish. It is also kind of verbose and requires a lot of boilerplate, and its API is sprawling and hard to remember.
It shows that you took an extra year to write your dissertation. ;-)
Just kidding, but I got my degree 25 years ago, and at the time, the students who tried to have a perfectly typeset document spent a lot of time on that pursuit.
I used Bitstream-Charter [1] for my dissertation. It looks much better than Computer Modern.
My resume is typeset in Linux Libertine [2] which is used in this superlatively beautiful and elegant CV template by Dario Taraborelli [3]. Requires xelatex.
[1] https://yihui.name/knitr/