

I don’t understand what you’re getting at. Clearly book one is meant to give a foundation for all the other books in the series. Now you’re getting all huffy because you don’t understand this book without that foundation.
I’m not saying that you’re wrong or stupid. I’m saying if you read the first book then you might actually get something out of the rest. You also might not! It’s equally possible that this series just isn’t helpful for you.
No, the model does retain the original works as a lossy compression. This is evidenced by the fact that you can get a model to reproduce sections of its training data.