The International Organization for Standardization has determined otherwise. See ISO 8601, section 3.1.2.22, which defines "decade" to mean "time scale unit (3.1.1.7) of 10 calendar years (3.1.2.21), beginning with a year whose year number is divisible without remainder by ten".
As the link you provided acknowledges, "decade" and "century" describe a duration, not a specific start time. For the purposes of the ISO 8601 document, it is convenient to align the start of decades and centuries with the way dates are represented. Outside the context of that document, these definitions can cause problems. For example, no first century exists by that definition, since there was no year 0.
Of course you are right regarding the first century. But when we get to later centuries, say the 18th century, we are just as likely to call it "the 1700s", in which case it seems it should start in 1700, not 1701. And we certainly do that with decades (notwithstanding the hard-to-name last two decades). It is now the 20s. That's a decade.
There are always going to be weird cases. If it is February 10th, and you say "one month from today," what is meant by that? 28 (or 29) days later, March 10? Or 30.4375 days later, plus or minus a day or two?
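The "one month from today" ambiguity is easy to make concrete. A minimal Python sketch of the two readings (the dates chosen here are just an illustrative assumption, using a non-leap February):

```python
from datetime import date, datetime, timedelta

start = date(2023, 2, 10)

# Reading 1: the same day number in the next month.
# (A naive month bump; a real implementation must handle December
# rollover and short months like Jan 31 -> Feb 31.)
same_day_next_month = start.replace(month=start.month + 1)
print(same_day_next_month)  # 2023-03-10

# Reading 2: add the average Gregorian month length, 30.4375 days.
average_month_later = datetime(2023, 2, 10) + timedelta(days=30.4375)
print(average_month_later)  # 2023-03-12 10:30:00
```

The two answers differ by more than two days, and neither is "wrong" until you pin down which convention you mean.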
And then you have days which are 23 or 25 hours long, because of Daylight Saving Time. (See the Wikipedia article on "day" for all the potential ambiguity in that word: https://en.wikipedia.org/wiki/Day)
I personally think it makes the most sense to think of decades, centuries, and millennia as starting when the number rolls over. When you find yourself in ambiguous territory (such as discussing the first century in a context where it matters whether it is 99 or 100 years long), just take the extra effort to clarify. Most likely this will never come up except for a very small subset of people.
https://www.iso.org/obp/ui#iso:std:iso:8601:-1:ed-1:v1:en