From what I know, you will need a sync check relay, i.e. a relay that only closes when you're synced up. Those are generally available for 1 MW and upwards (one manual notes a minimum constant load on the internal grid of 500 kW or it won't work), are about the size of half a car battery, and are priced at "contact our sales team".
Could it be made cheaper? Probably. If you get it wrong, the grid probably doesn't care, but you'll briefly pump about 500 W into the device that is supposed to have 500 W going out of it. The reason these are big and expensive is that they require significant safety gear so nothing explodes even in the worst case. And that safety gear is expensive. So you sell it to people who not only can afford it but also really, really need it (i.e. 1 MW and upwards, where you enter the domain of "can fry a small section of grid").
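For what it's worth, the decision such a synchronism-check relay makes is conceptually simple; a minimal sketch follows, where the voltage/frequency/phase windows are illustrative values I've picked, not numbers from any particular device:

```python
import math

def sync_check(v_grid, v_gen, f_grid, f_gen, phase_deg,
               max_dv_frac=0.10, max_df_hz=0.2, max_dphase_deg=10.0):
    """Return True only when both sources are close enough to parallel safely.

    Thresholds are illustrative, not taken from a real relay's manual.
    """
    dv_ok = abs(v_grid - v_gen) <= max_dv_frac * v_grid
    df_ok = abs(f_grid - f_gen) <= max_df_hz
    # Wrap the phase difference into [-180, 180) before comparing
    dphase = (phase_deg + 180.0) % 360.0 - 180.0
    dp_ok = abs(dphase) <= max_dphase_deg
    return dv_ok and df_ok and dp_ok

print(sync_check(230.0, 229.0, 50.00, 50.05, 2.0))   # in sync: relay may close
print(sync_check(230.0, 229.0, 50.00, 50.05, 90.0))  # 90 deg out: stays open
```

The expensive part of a real device isn't this logic, it's the contactor and protection hardware sized so a bad closure doesn't destroy anything.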
Any old IGBT or even just a highly spec’d MOSFET paired with an optoisolator (a couple of dollars of parts) could do that for a small system, based on input from the inverter’s existing controller.
The specialist devices for large installations you’re talking about are only expensive because you need more expensive parts for the far larger amounts of current you’re handling (and probably because they’re made in lower volumes than commodity inverters), not because they’re doing anything particularly difficult.
Grid-tie inverters are capable of syncing to a present signal; what they can't do is provide their own waveform and sync that to the grid when it comes back online.
Providing a 60 Hz waveform and syncing it to the grid is the easy part. You could make a standalone device that does it out of a twenty cent microcontroller.
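The usual cheap-microcontroller trick for this is direct digital synthesis: step through a precomputed sine lookup table at a fixed sample rate and feed the values to a DAC or PWM. A rough sketch, where the table size and sample rate are arbitrary choices of mine:

```python
import math

TABLE_SIZE = 256
SAMPLE_RATE = 15360  # 256 samples/cycle * 60 Hz; an arbitrary example rate

# Precomputed 8-bit sine table, the kind you'd bake into flash
sine_table = [int(127.5 + 127.5 * math.sin(2 * math.pi * i / TABLE_SIZE))
              for i in range(TABLE_SIZE)]

def dds_samples(n, freq_hz=60.0):
    """Yield n 8-bit DAC codes for a freq_hz sine at SAMPLE_RATE."""
    phase = 0.0
    step = TABLE_SIZE * freq_hz / SAMPLE_RATE  # table entries per sample
    for _ in range(n):
        yield sine_table[int(phase) % TABLE_SIZE]
        phase += step

samples = list(dds_samples(SAMPLE_RATE))  # one second of 60 Hz output
```

Changing `freq_hz` slightly is how such a device would nudge its frequency to chase the grid, which is the part that needs care.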
And then you'd still have to switch over and you'd need a sync check relay for safety (if you don't and the microcontroller is off because you forgot a comma somewhere, your inverter explodes).
Additionally, producing the clean sine wave you'd need for this is not that easy, at least not at the quality level you want here (if the DAC that produces the wave is off by 1%, then at a 2 kW load you're going to burn up 20 W somewhere that doesn't like 20 W being burned up).
Your mains voltage cannot be 1% off its own phase, since it's the primary phase here, and in terms of voltage a 1% difference doesn't matter.
However, if you have your own generator it matters a lot.
If your phase is off by 1 degree, then that 1 degree will burn roughly 0.2% of the incoming grid power at the inverter (which is unlikely to be designed to handle this). If you're off by 1%, you burn 20 W on a device not designed for it.
If your voltage is off relative to the grid by 1%, you burn the difference; at 2 kW that's about 20 W, and it scales with the size of the mismatch. If the grid happens to be at the high end of its tolerance and your output at the low end, you'd be burning somewhere around 300 W.
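To make the arithmetic concrete, here's a crude numeric check: treat the mismatch as the relative RMS difference between the two waveforms and scale it against the 2 kW load. This model is deliberately simplistic (what actually circulates depends on the coupling impedance), but it reproduces the 1% → 20 W figure:

```python
import math

def rel_rms_mismatch(amp_ratio, phase_deg, n=10000):
    """Relative RMS of sin(x) - amp_ratio*sin(x+phase) over one full cycle."""
    ph = math.radians(phase_deg)
    num = den = 0.0
    for i in range(n):
        x = 2 * math.pi * i / n
        d = math.sin(x) - amp_ratio * math.sin(x + ph)
        num += d * d
        den += math.sin(x) ** 2
    return math.sqrt(num / den)

load_w = 2000.0
frac = rel_rms_mismatch(0.99, 0.0)  # 1% amplitude error, perfect phase
print(round(frac * load_w))         # prints 20
```

A 1% amplitude error gives exactly a 1% relative mismatch here, hence 20 W at 2 kW, matching the rough figure above.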
A grid-tie inverter gets around this by simply following along the grid's sine wave. This can be done relatively cheaply and safely with analog components, so the error can be much smaller than 1% and deep into random-noise territory.
If you generate your own sine wave and compare it to an existing one it's much more difficult since you have to match amplitude and phase almost perfectly.
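The "follow along" approach is essentially a phase-locked loop: instead of free-running, you nudge your own phase toward the measured grid phase a little each cycle. A toy first-order version, where the gain and cycle count are arbitrary illustrative values:

```python
def track(grid_phase_deg, own_phase_deg, gain=0.2, cycles=50):
    """Toy first-order phase lock: each cycle, close a fraction of the
    remaining phase error. Gain and cycle count are illustrative."""
    for _ in range(cycles):
        # Wrap the error into [-180, 180) so we always turn the short way
        err = (grid_phase_deg - own_phase_deg + 180.0) % 360.0 - 180.0
        own_phase_deg += gain * err
    return own_phase_deg

print(round(track(30.0, 0.0), 3))  # converges to ~30
```

The hard part in hardware isn't this loop, it's measuring the grid phase cleanly and keeping amplitude matched at the same time.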
So with grid-tie nothing will explode. With an automatic transfer switch nothing explodes either. Wanting to seamlessly couple back in is what requires a lot of care and expensive components.
I'm no electronics engineer, but don't regular home UPS units already do all of this? I'm talking about the type of units sold by computer and office supply shops to provide backup power for home / SOHO IT devices.
Generally there are three types of UPS on the market.
The cheapest is VFD, which is basically a battery in parallel to the mains that, in case of a power failure, interrupts mains and inserts its own voltage. Usually labelled as an "offline" or "standby" UPS, since it's not active most of the time. The output frequency and voltage are simply the mains frequency and voltage until it switches over, something to keep in mind if devices are sensitive to that.
These can simply switch back to mains when it's back since they usually use a simple transfer relay.
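The transfer decision in such a standby UPS amounts to a voltage threshold with hysteresis, so the relay doesn't chatter when mains hovers near the limit. A sketch with thresholds that are my own illustrative picks:

```python
def ups_source(mains_v, nominal=230.0, low_frac=0.8, high_frac=0.9,
               on_battery=False):
    """Pick the output source for a standby (VFD) UPS.

    Hysteresis: drop to battery below 80% of nominal, return to mains
    only above 90%. The fractions are illustrative, not from a real unit.
    """
    if not on_battery and mains_v < low_frac * nominal:
        return "battery"
    if on_battery and mains_v > high_frac * nominal:
        return "mains"
    return "battery" if on_battery else "mains"

print(ups_source(230.0))                    # mains
print(ups_source(150.0))                    # battery
print(ups_source(200.0, on_battery=True))   # still battery (200 V < 207 V)
```

The gap between the two thresholds is what keeps a sagging line from toggling the relay back and forth.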
VI (line-interactive, delta conversion) uses the mains frequency as orientation. They don't have a transfer switch and can basically just compensate whatever the mains is doing to output a 230 V signal. Internally they have an inverter with AC input and AC output, which means they measure from there whether mains is coming back and adapt the signal on the internal battery inverter.
If mains comes back on a VI, they usually change frequency very abruptly, which is not ideal for some devices.
VFI (double conversion, usually sold as "online") is completely independent of both voltage and frequency, as it first converts mains AC to internal DC, simply connects the battery to the DC bus, and then converts DC back to AC. They don't need to synchronize at all and are the more common type for datacenters, since they isolate the input fairly well from the output and don't have to switch anything to go from mains to backup; a DC intermediate stage is fairly good for dealing with this. They are also the most expensive.
If mains comes back on a VFI, they don't do anything of notable interest other than switching the battery charger back on.