The same was said back when AMD started making APUs (CPUs with on-board GPUs) and I was similarly hopeful, but we saw how that went. Most attempts at heterogeneous computing have gone only as far as sticking two previously separate chips in one package - no significant integration or co-operation. Since FPGAs are even more different from CPUs than GPUs are, even more programming effort will be needed to take advantage of them, so unless AMD make some kind of revolutionary framework for programming them, I fear FPGA acceleration will see even less use than GPUs do (excluding 3D work, of course).
APUs have come a long way since the initial ones. AMD's own are now actually decent for 1080p gaming. I built my wife a gaming PC and didn't buy a dedicated GPU, just an AMD APU, and it plays CS:GO maxed out at a decent framerate.
I think we can all agree that APUs have come a long way since they first came out. The poster you replied to was more lamenting that the inclusion of a GPU in the CPU didn't create something bigger than the sum of its parts.
What we have is still a CPU and a GPU. Separate. Discrete. The inclusion changed nothing about what kind of software we can write or what problems we can solve. Back when APUs were still just a slide in a PowerPoint deck, there was a lot of talk about how they could "change everything" because of the radically different type of compute you could do on a GPU.
The closest thing to this is the coin mining craze, but everyone is gravitating towards the fastest hardware for that. And that's not your APU.
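For the curious, the integration that was promised does exist in some form: OpenCL 2.0's shared virtual memory, which AMD shipped for its HSA-capable APUs, lets the CPU and GPU work on the same allocation through the same pointer, with no staging copies. Here's a minimal sketch of the idea - assuming a driver that actually supports fine-grained SVM, and with all error handling omitted:

    /* CPU and GPU share one allocation through one pointer.
     * Requires an OpenCL 2.0 driver with fine-grained SVM support
     * (coarse-grained-only devices would need clEnqueueSVMMap/Unmap). */
    #define CL_TARGET_OPENCL_VERSION 200
    #include <CL/cl.h>
    #include <stdio.h>

    static const char *src =
        "kernel void inc(global int *p) { p[get_global_id(0)] += 1; }";

    int main(void) {
        cl_platform_id plat; cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueueWithProperties(ctx, dev, NULL, NULL);
        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "inc", NULL);

        size_t n = 1024;
        /* One allocation, directly visible to both CPU and GPU. */
        int *data = clSVMAlloc(ctx, CL_MEM_READ_WRITE | CL_MEM_SVM_FINE_GRAIN_BUFFER,
                               n * sizeof(int), 0);
        for (size_t i = 0; i < n; i++) data[i] = (int)i;     /* CPU writes... */

        clSetKernelArgSVMPointer(k, 0, data);                /* pass the raw pointer */
        clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
        clFinish(q);                                         /* ...GPU increments... */

        printf("data[42] = %d\n", data[42]);                 /* ...CPU reads 43 back. */
        clSVMFree(ctx, data);
        return 0;
    }

On a discrete card the same round trip would mean explicit buffer copies across the bus in both directions; the APU's shared physical memory is what could have made this pattern the default, but almost no mainstream software was ever written this way.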
Yeah, as the first sibling comment said, I was more talking about the new generation of computing APUs were supposed to bring us; instead, we got GPUs that happen to be on the same chip as the CPU. AMD's APUs are definitely awesome, especially the Ryzen+Vega ones, and if they made a Ryzen 7 + Vega 10 chip for the desktop, I'd buy it immediately, but it's still just a GPU that does standard GPU things - the fact that it's integrated is little more than an implementation detail.