HTTP Status Tester

rasanika.com

Blocked by robots.txt
filibot and googlebot are not allowed to crawl this URL because of a blocking robots.txt rule. Are you sure the URLs are correct?
robots.txt
Status code:
Response size: 527 bytes
Reason: OK
Valid robots.txt file: Yes
URL pattern checked: /
With user-agent: filibot (follows the same rules as Googlebot)
response headers
request headers
User-Agent: Mozilla/5.0 (compatible; filibot/1.0; +https://filibot.com/)
Accept-Encoding: gzip, deflate, br
httpdev.2c6d32ab-5030-46e7-b8ea-17ab41777a7c
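The request-header values above can be reproduced with a plain urllib request. A minimal sketch, assuming the header names are User-Agent and Accept-Encoding (the page lists only the values); the request is built but not sent, so the example stays offline:

```python
import urllib.request

# Header names here are assumptions inferred from the values in the report.
req = urllib.request.Request(
    "https://rasanika.com/robots.txt",
    headers={
        "User-Agent": "Mozilla/5.0 (compatible; filibot/1.0; +https://filibot.com/)",
        "Accept-Encoding": "gzip, deflate, br",
    },
)

# urllib stores header names capitalized ("User-agent", "Accept-encoding").
print(req.full_url)
print(req.get_header("User-agent"))
# To actually fetch the file: urllib.request.urlopen(req)
```

Sending this request and reading the response would yield the status code, reason phrase, and body size reported above.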
robots.txt content
User-agent: *
Allow: /
Disallow: /i/1*/
Disallow: /i/2*/
Disallow: /v/1*/
Disallow: /v/2*/
Disallow: /a/1*/
Disallow: /a/2*/
Disallow: /d/1*/
Disallow: /d/2*/
Allow: /*/*/$
Allow: /*/*/*/*
Disallow: /i/1*/*/*.*
Disallow: /i/2*/*/*.*
Disallow: /v/1*/*/*.*
Disallow: /v/2*/*/*.*
Disallow: /a/1*/*/*.*
Disallow: /a/2*/*/*.*
Disallow: /d/1*/*/*.*
Disallow: /d/2*/*/*.*

Sitemap: https://rasanika.com/sitemaps/static.xml
Sitemap: https://rasanika.com/sitemaps/dynamic/index.xml
Sitemap: https://rasanika.com/sitemaps/google-news.xml
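The Allow/Disallow patterns above use robots.txt wildcard syntax: `*` matches any run of characters and a trailing `$` anchors the pattern to the end of the URL path. A minimal sketch of how a crawler might evaluate such rules, assuming Google's documented precedence (the longest matching pattern wins, and Allow beats Disallow on a tie); the rule list is a hand-picked subset of the file above, and real evaluators differ in details:

```python
import re

def to_regex(pattern: str) -> "re.Pattern[str]":
    """Translate a robots.txt path pattern ('*' wildcard, '$' end anchor)."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.compile("^" + body + ("$" if anchored else ""))

def is_allowed(rules, path: str) -> bool:
    """Longest matching pattern wins; on a length tie, Allow wins."""
    best_len, allowed = -1, True  # no matching rule means the path is allowed
    for kind, pat in rules:
        if to_regex(pat).match(path):
            if len(pat) > best_len or (len(pat) == best_len and kind == "allow"):
                best_len, allowed = len(pat), kind == "allow"
    return allowed

# A subset of the rules from the robots.txt above.
rules = [
    ("allow", "/"),
    ("disallow", "/i/1*/"),
    ("allow", "/*/*/$"),
    ("allow", "/*/*/*/*"),
    ("disallow", "/i/1*/*/*.*"),
]

print(is_allowed(rules, "/"))               # the URL pattern the report checked
print(is_allowed(rules, "/i/145/photo"))    # caught by Disallow: /i/1*/
print(is_allowed(rules, "/i/123/x/y.jpg"))  # caught by Disallow: /i/1*/*/*.*
```

Note that under this longest-match reading, `Allow: /*/*/*/*` (8 characters) overrides `Disallow: /i/1*/` (6 characters) for deep paths without a file extension, which appears to be the intent of the file's layered Allow/Disallow pairs.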

If you think the results above are wrong, contact Fili with the report URL and a description of what is wrong.