Hi everyone,
I came across this robots.txt file and was wondering whether the Allow is necessary and whether it affects the disallowed pages.
User-agent: *
Sitemap: https://www.this-is-an-example.co.uk/sitemap.xml
Allow: /
User-agent: *
Disallow: /*sortBy*
Disallow: /*v_attributes_*
Disallow: /checkout*
Disallow: /search$
Disallow: /search?*
Disallow: /my-pages*
Disallow: /epiui*
Disallow: /punchout-order*
Disallow: /resolvedynamicdata*
Thoughts?
Hey Lorenzo,
Google's official documentation gives this example:
User-agent: Googlebot
Disallow: /nogooglebot/

User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml

https://developers.google.com/search/docs/crawling-indexing/robots/create-robots-txt
Actually, Allow is not required: everything is allowed by default unless it is blocked by a Disallow rule. As for your second question, since both groups in your file use User-agent: *, Google merges them into a single set of rules for that user agent, and the most specific (longest) matching rule wins. Allow: / is only one character long, so it never overrides the longer Disallow patterns. Your disallowed pages stay blocked either way.
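If you want to sanity-check how that longest-match precedence plays out, here's a minimal sketch in Python. It's my own simplified implementation of the matching described in Google's robots.txt spec (wildcard '*', end anchor '$', longest rule wins, ties go to Allow), not an official parser, and it skips details like percent-encoding that real crawlers handle:

import re

def pattern_to_regex(pattern):
    # robots.txt path patterns: '*' matches any character sequence,
    # and a trailing '$' anchors the pattern to the end of the path
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile("^" + regex + ("$" if anchored else ""))

def is_allowed(rules, path):
    # rules: list of (directive, pattern) tuples for one user agent.
    # Google-style precedence: the longest matching pattern wins;
    # on a length tie, the less restrictive rule (Allow) wins.
    best = None
    for directive, pattern in rules:
        if pattern_to_regex(pattern).match(path):
            key = (len(pattern), directive == "allow")
            if best is None or key > best:
                best = key
    # No rule matched: allowed by default
    return True if best is None else best[1]

# A few of the rules from the file above (merged into one group)
rules = [
    ("allow", "/"),
    ("disallow", "/*sortBy*"),
    ("disallow", "/checkout*"),
    ("disallow", "/search$"),
    ("disallow", "/search?*"),
]

print(is_allowed(rules, "/products/shoes"))    # True: only "/" matches
print(is_allowed(rules, "/search?q=boots"))    # False: "/search?*" beats "/"
print(is_allowed(rules, "/checkout/payment"))  # False: "/checkout*" beats "/"

Running it shows the point: Allow: / matches every URL, but any Disallow pattern that also matches is longer, so it takes precedence. The Allow line is harmless but redundant.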