Jason Kottke notes the new robots.txt file at whitehouse.gov — down to a single “disallow” from more than 2,400 yesterday.
Posted in: Free Speech, Issues, Techniques.
Oh, that Kottke, never giving his readers their money’s worth of links! A reader, pointing to Philipp Lenssen’s blog post, counters: “It’s understandable that the robots.txt of an 8-year-old site is longer than that of a 1-day-old site, and it’s not as if ‘/secrets/top’ or ‘/katrina/response/’ were put in the robots file.”
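For readers wondering what a robots.txt file actually does, here is a minimal sketch using Python’s standard urllib.robotparser. The one-line policy below is illustrative only, not the actual whitehouse.gov file:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical pared-down robots.txt with a single Disallow rule,
# in the spirit of the new whitehouse.gov file (contents assumed).
robots_txt = """\
User-agent: *
Disallow: /includes/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Crawlers honoring the file skip the disallowed path and fetch the rest.
print(parser.can_fetch("*", "http://example.gov/includes/header.html"))  # False
print(parser.can_fetch("*", "http://example.gov/blog/"))                 # True
```

The point of the comparison above: each `Disallow` line is one rule, so a file with 2,400 of them is hiding (or at least de-indexing) a lot more of the site than a file with one.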
Still, there is a story here: UIUC researchers did find some funny changes (starting with the shifting count of “coalition of the willing” countries), documented in a November 2008 research report titled Airbrushing History, American Style.
Also, according to Ben Smith’s reporting in the Politico, the White House will still be doing content filtering. I hate to be a killjoy, but I’m curious why. That at least deserves *some* echoes.
© 2016 Center for Citizen Media | Powered by WordPress