You find out that the author of your favorite books is sexist and objectifying. You also learn that what they write is essentially meant only to make money. Do you continue to read their books because the story is really all that matters to you, or do you stop reading them because the author is what gives the stories fire?
I would stop reading them.