It seems trac is being hit by a high level of automated crawling, to the extent that use by humans is often slow or the site is entirely unresponsive. To address this, I've removed from anonymous access the permissions which cause trac to fetch data from git (or at least those I think do).

All the data is still available anonymously; it's just no longer all available anonymously through trac like it was before. If you want to continue to browse files from git on the web through trac, you can create and verify an account to get access, clone the repo and browse locally, or browse anonymously on the web using one of the other options listed at:

https://xapian.org/bleeding

I considered creating a guest account with all the viewing permissions and advertising the username+password on the front page, but trac doesn't seem to support locking down an account, so someone could log into it and mess with settings, change the password, or even delete it.

Let me know if you know of a better mitigation, or if there are particular permissions I've removed which are particularly useful and don't result in high overhead.

Cheers,
Olly