* Add a `robots.txt` file to prevent crawlers from scraping the site
* Add an `ASSET_RIGHTS` entry to `config.yaml` to control whether `/robots.txt` is served
* Always import `robots.py` and determine the config in the route function (see the sketch below)
* Finish writing a comment
* Remove a redundant import and config entry
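As a rough illustration of how these changes might fit together (the framework is not named in these notes, so Flask, the blueprint layout, and the way `config.yaml` is loaded into `app.config` are all assumptions), `robots.py` could stay importable unconditionally and defer the `ASSET_RIGHTS` check to the route function. The `robots.txt` it serves would typically be the standard disallow-all form (`User-agent: *` / `Disallow: /`) to keep crawlers off the site.

```python
# robots.py -- a minimal sketch, not the repository's actual module.
# Flask and the config-loading details are assumptions made for illustration.
from flask import Blueprint, abort, current_app, send_from_directory

bp = Blueprint("robots", __name__)


@bp.route("/robots.txt")
def robots_txt():
    # The config is read inside the route function rather than at import time,
    # so robots.py can always be imported regardless of configuration.
    if not current_app.config.get("ASSET_RIGHTS", False):
        abort(404)
    # Serve the static robots.txt file when ASSET_RIGHTS enables it.
    return send_from_directory(current_app.static_folder, "robots.txt")
```

The blueprint would then be registered on the app at startup, with `ASSET_RIGHTS` pulled from `config.yaml` into the app config before the first request.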