@taylorlorenz not sure if this is what’s playing a role, but WaPo’s robots.txt file (the file webmasters use to tell crawlers what they may access on a domain) has this:
User-agent: Google-Extended
Disallow: /
That asks any crawler identifying itself with the user-agent “Google-Extended” not to access anything at all on that domain. Worth noting: Google-Extended is Google’s token for AI training (Gemini / Vertex AI), so blocking it opts the site out of AI training use rather than out of Search indexing.
Odd thing to have in a robots.txt file, but it is there at https://www.washingtonpost.com/robots.txt
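If you want to check how that directive behaves programmatically, here’s a quick sketch using Python’s stdlib robotparser (parsing the two lines above directly rather than fetching the live file):

```python
from urllib.robotparser import RobotFileParser

# The two relevant lines from WaPo's robots.txt
rules = """\
User-agent: Google-Extended
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Google-Extended is disallowed everywhere on the domain
print(rp.can_fetch("Google-Extended", "https://www.washingtonpost.com/"))
# -> False

# Other agents (e.g. Googlebot) have no matching rule, so they default to allowed
print(rp.can_fetch("Googlebot", "https://www.washingtonpost.com/"))
# -> True
```

So the block is scoped to that one agent; everything else can still crawl normally as far as this rule is concerned.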