| author | mbays <[email protected]> | 2021-08-25 12:08:56 +0200 |
|---|---|---|
| committer | Drew DeVault <[email protected]> | 2021-08-25 13:02:55 +0200 |
| commit | a8c54c1a32764d38727fb7c9f02ed9bc298e3174 (patch) | |
| tree | 2631862b0d81cc74fd02f85d50f29f0698262ac4 | |
| parent | Add -e flag to place a stylesheet externally rather than loading it inline (diff) | |
| download | capybara-a8c54c1a32764d38727fb7c9f02ed9bc298e3174.tar.xz / capybara-a8c54c1a32764d38727fb7c9f02ed9bc298e3174.zip | |
Serve robots.txt disallowing all robots
This overrides any robots.txt file in the proxied gemini capsule, on the
basis that this is intended for gemini robots (which can be expected to
follow the robots.txt companion spec) rather than web robots.
The main purpose of disallowing web robots, though, is to prevent them
from crawling the proxied cross-site geminispace under /x/, since web
robots won't even know to read the robots.txt files of other capsules
proxied this way.
| -rw-r--r-- | main.go | 6 |
1 file changed, 6 insertions, 0 deletions
```diff
@@ -583,6 +583,12 @@ func main() {
 		return
 	}
+	if r.URL.Path == "/robots.txt" {
+		w.WriteHeader(http.StatusOK)
+		w.Write([]byte("User-agent: *\nDisallow: /\n"))
+		return
+	}
+
 	req := gemini.Request{}
 	req.URL = &url.URL{}
 	req.URL.Scheme = root.Scheme
```
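The patch inlines this check directly in `main`; the same behavior can be sketched as a standalone handler wrapper, which makes it easy to exercise with `net/http/httptest`. The names `withRobots` and `serve` below are illustrative, not from the capybara source, and the inner handler is a stand-in for the real gemini proxy logic:

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
)

// robotsBody is the blanket disallow-all policy served to web robots.
const robotsBody = "User-agent: *\nDisallow: /\n"

// withRobots answers /robots.txt locally, before any request reaches
// the wrapped proxy handler. (Hypothetical wrapper; the actual patch
// performs this check inline in main.)
func withRobots(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.URL.Path == "/robots.txt" {
			w.WriteHeader(http.StatusOK)
			w.Write([]byte(robotsBody))
			return
		}
		next.ServeHTTP(w, r)
	})
}

// serve runs one request through the wrapped handler in memory and
// returns the status code and response body.
func serve(path string) (int, string) {
	// Stand-in for the real gemini-proxying handler.
	proxy := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprint(w, "proxied: "+r.URL.Path)
	})
	req := httptest.NewRequest("GET", path, nil)
	rec := httptest.NewRecorder()
	withRobots(proxy).ServeHTTP(rec, req)
	return rec.Code, rec.Body.String()
}

func main() {
	code, body := serve("/robots.txt")
	fmt.Printf("%d %q\n", code, body)
	code, body = serve("/x/example.org/")
	fmt.Printf("%d %q\n", code, body)
}
```

Note that the interception happens before path dispatch, so even a proxied capsule that publishes its own robots.txt under /x/ is shadowed by the blanket disallow, which is exactly the point of the commit.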