I don't think that perspective holds up to practical scrutiny. In the real world, the most common source of tech debt is new features being tacked onto existing software without being fully or smoothly integrated with it.
As a relatable example, think about how the Control Panel in Windows still uses a 2000s-era UI, even as the Settings app, and most of the rest of the OS, has moved to a new one. The likely reason is that it was faster to tack on a new UI and leave the Control Panel as-is; refactoring the Control Panel from the ground up to accommodate the new UI would have taken more time.
The end result is that there are now two separate settings interfaces, and presumably some ongoing engineering effort goes into maintaining both and keeping them coherent with one another. That's a classic example of tech debt: save time now, but it may cost you later.
But by your definition, the old UI and the new UI are separate pieces of software, so there is no debt. How does that track?