Self-awareness as a level of sentience

Can we define physically independent entities that experience self-awareness as separate, highly sentient beings *worth caring about*? Humans possess a dense cluster of nerve cells in the brain, with a comparative minority spread throughout the body and gut. Over a lifetime almost all of our cells die and are replaced, the exception being those nerve cells. Nevertheless, it is fair to treat neurons as a substrate upon which information is encoded rather than as a timeless definition of “who we are.”

The only things a person at an early age shares with the same person at a late age are their DNA (an encoding too compressed to define an independent sentient creature, and so meaningless for this exercise), much of their nervous system (many nerve cells do not regrow, although technology is changing that, and can live until the rest of the body dies), and some of the information encoded in their cortical connections. Without a concept of self, particularly the ability to generate a narrative and a timeline connecting the past self to the future self, I propose that a being has no concept of individuality or ongoing existence!

This suggests some interesting conclusions. People with surgically split brains can be considered two independent creatures (they are not completely independent, since they are physically confined within the same system, their body, but this does not change the fact that they can function independently on a neurological level, which is all that matters). More importantly, a lack of self-awareness may imply that a creature is unable to distinguish between its selves over time, let alone between itself and others. If this can be described as an inability to narrate one’s life, it seems to preclude the experience of high-level emotions such as suffering and joy. This would also imply that it is not morally wrong to kill such creatures.

Interestingly, humans seem to fall into this category at an early enough age, although the future of such a human involves an ascension towards self-awareness. We can say that it is morally justifiable to kill cows, since they cannot achieve self-awareness, but not morally justifiable to kill baby humans, since they will achieve self-awareness in a future state (although it is perhaps more morally justifiable than killing an already self-aware human?).