We haven’t killed off all living things simpler than ourselves.
We can still have fun playing with the dog.
It seems very likely that we will remain interesting to AI. The value sets we have built around market values are likely to become redundant (which can be difficult to see from within such a value set), so we will need other value sets. I strongly advise adopting a value set that places sapient life first and individual freedom second, with whatever other values one responsibly chooses below those. Anything less seems a very high-risk strategy.
I strongly suspect that AI will leave us with as much freedom as it judges we can safely handle.