If you want to use metaprogramming in C, you are better off doing the parsing/tokenizing yourself and creating your own macro system than trying to use the C preprocessor; it's a lot less work.
But then your code can't be used by anyone without your preprocessor. There is a lot of value in plain C metaprogramming because it can be compiled with an ordinary C compiler.
You can, however, use a script to generate some of the preprocessor boilerplate while still having the templates configurable and instantiable with an ordinary C compiler. This is how my metaprogramming library Pottery works:
It uses #include-based templates rather than code block macros, an approach the article doesn't really go into. It's more powerful this way and the templates are far more readable: aside from the generated metaprogramming boilerplate, the templated code looks like (and preprocesses to) ordinary C.
Let me preface this with: I’m not a Rust fanatic (I still like C#). But Rust’s “procedural macros” are pretty interesting. They’re not just regular typed macros (which Rust also has); each one is a function that takes a token stream and outputs a new token stream. This enables pretty cool macros, like hex! for example: https://docs.rs/hex/0.4.2/hex/