This is exactly how I feel about it too. "The singularity" has always sounded like a meaningless phrase to me. AI has seen a lot of interesting progress lately, but nothing remotely fundamental enough that it would lead to any kind of superhuman AI. In fact, I think the 20 years before that saw more fundamental progress than the past 20 years. Most AI today is advanced statistics. I don't think we're any closer to any sort of independent reasoning in a computer.
Back when I studied AI in the 1990s, I felt like Strong AI was a red herring, and the real value of AI is not in replacing humans, but assisting them.