I've just completed quite an extensive project of finding and blocking bots from querying our search, which has been costing us money as we make use of Google Programmable Search behind the scenes. I've spent a long time going through the significant server logs from across Canonical's websites.
I've been using user-agents to automatically block bots (using is_bot), which I now realise is all based on the regex in https://github.com/ua-parser/uap-core/blob/master/regexes.yaml. This does a great job, but I've also found the following user-agents which I needed to block manually. We've had a significant amount of traffic from each of these agents. I'm wondering if you may want to add any of them to your list:
Of all of these, the biggest in terms of requests to our sites were: Python*, go-http-client, Assetnote, HeadlessChrome, Tiny Tiny RSS, check_http.
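For context, a manual block for agents like these can be sketched as a simple regex check alongside whatever `is_bot` already catches. This is only an illustrative sketch, not Canonical's actual configuration; the function name and pattern list are hypothetical, using the agents named above:

```python
import re

# Illustrative manual block list for user agents that slipped past
# uap-core's regexes.yaml (patterns are assumptions, not a real config).
MANUAL_BOT_PATTERNS = [
    r"^Python",        # Python*, e.g. Python-urllib, python-requests
    r"go-http-client",
    r"Assetnote",
    r"HeadlessChrome",
    r"Tiny Tiny RSS",
    r"check_http",
]
MANUAL_BOT_RE = re.compile("|".join(MANUAL_BOT_PATTERNS), re.IGNORECASE)


def is_manually_blocked(user_agent: str) -> bool:
    """Return True if the user agent matches one of the manual block patterns."""
    return bool(MANUAL_BOT_RE.search(user_agent))
```

In practice this would run as a fallback after the ua-parser check, so anything the upstream regexes learn to catch can be dropped from the manual list.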