Would anyone like to explain the rules in the following robots.txt file?

    User-Agent: *
    Allow: /
    Disallow: /*?*

    User-agent: ia_archiver
    Disallow: /
The Allow: / directive is redundant: crawling is permitted by default, so it only restates the default for any URL not matched by a Disallow rule.

Disallow: /*?* blocks all user agents from crawling parameterised URLs, i.e. any URL containing a ? query string. The * wildcard is an extension honoured by major crawlers such as Googlebot and Bingbot, not part of the original robots.txt specification (see the sketch below for how the pattern matches).

The ia_archiver block tells the Internet Archive's crawler not to crawl anything, so the site won't be snapshotted in the Wayback Machine.
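To see what Disallow: /*?* actually matches, here is a minimal sketch of the wildcard matching that crawlers supporting the * extension apply: the pattern is translated into an anchored regular expression. The function name and sample paths are illustrative, not from the original post:

    import re

    def robots_pattern_to_regex(pattern: str) -> "re.Pattern[str]":
        """Translate a robots.txt path pattern into an anchored regex.

        Handles the '*' wildcard extension used by major crawlers;
        the original robots.txt spec only did prefix matching.
        """
        regex = re.escape(pattern).replace(r"\*", ".*")
        return re.compile("^" + regex)

    rule = robots_pattern_to_regex("/*?*")

    for path in ["/", "/about", "/search?q=robots", "/products?id=7&sort=asc"]:
        verdict = "disallowed" if rule.match(path) else "allowed"
        print(f"{path:30} {verdict}")

Running this shows "/" and "/about" as allowed while "/search?q=robots" and "/products?id=7&sort=asc" are disallowed: every URL with a query string is blocked, plain paths stay crawlable. Note that Python's own urllib.robotparser follows the original specification and does not understand * wildcards, so it is not a reliable way to test this particular rule.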