I mostly don't bother writing separate Operators. The only parts I write are the so-called Hooks (which are basically just Airflow's standard way of grabbing credentials and instantiating a session object).
After that you just write a short Python function that grabs the data from one hook and pushes it to another, which is basically the (M + N) solution you mention (I think the factor of 2 is unnecessary if you've already split sources and sinks).
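As a rough illustration of that pattern, here is a minimal sketch of such a hook-to-hook task (Airflow 2.x TaskFlow style, using the stock provider hooks; the connection IDs, table names, and query are placeholders I made up):

```python
from airflow.decorators import dag, task
from airflow.providers.postgres.hooks.postgres import PostgresHook
from airflow.providers.google.cloud.hooks.bigquery import BigQueryHook
import pendulum


@dag(schedule="@daily", start_date=pendulum.datetime(2024, 1, 1), catchup=False)
def pg_to_bq():
    @task
    def transfer():
        # Hooks resolve credentials from Airflow connections by conn_id
        # ("my_postgres" / "my_gcp" are assumed names).
        src = PostgresHook(postgres_conn_id="my_postgres")
        dst = BigQueryHook(gcp_conn_id="my_gcp")

        # Fine for small result sets; see below for a chunked variant.
        rows = src.get_records("SELECT id, name FROM users")

        # Push the rows into BigQuery via the hook's underlying client.
        client = dst.get_client()
        errors = client.insert_rows_json(
            "my_project.my_dataset.users",  # assumed destination table
            [{"id": r[0], "name": r[1]} for r in rows],
        )
        if errors:
            raise RuntimeError(f"BigQuery insert errors: {errors}")

    transfer()


pg_to_bq()
```

Swapping either end for a different source or sink is just a matter of swapping the hook, which is where the M + N scaling comes from.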
This approach works with anything you can connect to from Python, though for particularly large datasets you want to be careful not to accidentally hold all the data in memory at once. And sure, you can sometimes specialize an operation for a particular use case (e.g. in your example you can instruct BigQuery to connect to the Postgres database natively), but usually a Python script in between works just fine.
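For the memory point, one way to handle it is to stream the rows in batches instead of materializing them all. A sketch of that, assuming the same placeholder connections and table, and a psycopg2 server-side cursor underneath PostgresHook:

```python
from airflow.providers.postgres.hooks.postgres import PostgresHook
from airflow.providers.google.cloud.hooks.bigquery import BigQueryHook


def transfer_in_chunks(batch_size: int = 10_000) -> None:
    src = PostgresHook(postgres_conn_id="my_postgres")
    dst = BigQueryHook(gcp_conn_id="my_gcp")
    client = dst.get_client()

    conn = src.get_conn()
    try:
        # Named cursor = server-side cursor: rows are fetched lazily.
        with conn.cursor(name="users_cursor") as cur:
            cur.itersize = batch_size
            cur.execute("SELECT id, name FROM users")
            while True:
                rows = cur.fetchmany(batch_size)
                if not rows:
                    break
                errors = client.insert_rows_json(
                    "my_project.my_dataset.users",
                    [{"id": r[0], "name": r[1]} for r in rows],
                )
                if errors:
                    raise RuntimeError(f"BigQuery insert errors: {errors}")
    finally:
        conn.close()
```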