You should just stop with this line of thinking. You have to understand the scientific context and the limits of the data to learn anything. (Source: I know several contributors to the IPCC AR5 report, and I try to have the proper respect for their expertise.)
Great, so you're capable of writing highly parallel, cluster-scale code in C that does precision-intensive numerical analysis? Most such codebases use vectorized Fortran or a combination of C++ and CUDA, but hey, knock yourself out.
CERN has a 3700-core supercomputer to crunch through this kind of data. You can rent the equivalent on Amazon for about $800 an hour, so I guess you're good to go.
Sorry to be so harsh here. While there's always a desirable amount of "constructive naivety" needed to attempt the impossible, you have to recognize that a considerable amount of expertise is required to process and analyze data of this complexity at scale.
This is not like a movie where six minutes of furious typing can solve any problem.