Have you tried porting the problem into postgres? Not all big data problems can be solved this way but I was surprised what a postgres database could do with 40 million rows of data.
I didn't; I don't think a DB really makes sense for this problem. The program was simulating a physical process to get two streams of timestamps from simulated single-photon detectors, then running a somewhat expensive analysis on the data (primarily a cross-correlation).
There's nothing here for a DB to really help with: the data access patterns are both trivial and already optimal. IIRC it was also more like a billion rows, so I'd have some scaling questions (a big enough instance could certainly handle it, but the hardware actually being used was a cheap laptop).
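For a sense of what that analysis looks like: this isn't the original code, just a minimal sketch of cross-correlating two detector timestamp streams by binning them onto a common time grid. All the numbers (rates, bin width, delay) are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the simulated detector output: stream 2 is a
# delayed, jittered copy of part of stream 1, plus uncorrelated background.
T = 0.1                      # simulated time window, seconds
true_delay = 2e-6            # delay between the two detectors, seconds
t1 = np.sort(rng.uniform(0, T, 50_000))
mask = rng.random(t1.size) < 0.3
t2 = np.sort(np.concatenate([
    t1[mask] + true_delay + rng.normal(0, 1e-8, mask.sum()),
    rng.uniform(0, T, 20_000),   # background counts
]))

# Bin both streams onto the same grid and cross-correlate the count series.
bin_w = 1e-7
edges = np.arange(0, T + bin_w, bin_w)
c1, _ = np.histogram(t1, edges)
c2, _ = np.histogram(t2, edges)

max_lag = 100  # lags to scan, in bins
lags = np.arange(-max_lag, max_lag + 1)
xc = np.array([
    np.dot(c1[:c1.size - lag], c2[lag:]) if lag >= 0
    else np.dot(c1[-lag:], c2[:c2.size + lag])
    for lag in lags
])

# The correlation peak recovers the inter-detector delay.
est_delay = lags[np.argmax(xc)] * bin_w
```

The point is that the access pattern is a single sequential pass over two sorted arrays, which is exactly what a flat in-memory layout is good at and where a DB adds nothing.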
Even if there were, though, I would have been very hesitant to do so. The not-a-fulltime-programmer PhD student whose project this was really needed to be able to understand and modify the code; I was hesitant to introduce even a second programming language.