Because you have to be human and mortal to understand what that means, and to credibly contribute to and share the story of it. You can't superficially understand someone's situation and then take ownership of it. You can get a glimpse and really try to empathize, but you can't become the bearer of that experience, only a consumer of it.
>Because you have to be human and mortal to understand what that means, and to credibly contribute to and share the story of it.
Aside from directors, authors, artists, etc., who have already demonstrated this to be false, an AI could conceivably synthesize the experiences of every author who has written on what it means to be human or to face mortality, and create a story that captures the essence of the experience better than any one person ever could. Having the first-person experience doesn't confer a superior ability to communicate the features of that experience.
Movie directors have never experienced most of what they film, yet they often convey those experiences far better than the people who actually lived them. I see no reason to doubt that the same could be true of artificial storytellers.