Consciousness has no technical meaning. Even for other humans, it is a (good and morally justified) leap of faith to assume that others have thought processes roughly resembling your own. It's a matter philosophers debate and science cannot address: science cannot disprove the p-zombie hypothesis because nobody has devised an empirical test for consciousness.
I think I basically agree. Unless somebody can come up with an empirical test for consciousness, I think consciousness is irrelevant. What matters are the technical capabilities of the system. What tasks is it able to perform? AGI will be able to perform any reasonable task you throw at it. Whether or not it's a p-zombie won't matter to engineers, only to philosophers and theologians (or engineers moonlighting as those).
I'm pretty sure this was the entire point of the paperclip maximizer parable: that a generalized intelligence doesn't have to look like humans or share any of their motivations.
Human behavior is highly optimized around keeping a meat-based shell alive. The vast majority of our behaviors have little to nothing to do with our intelligence. Any non-organic intelligence is going to follow a highly divergent trajectory.