Goal:
Check if the site blocks or filters bots or crawlers using the User-Agent header.
Steps:
Open your terminal.
Run this command to simulate a Google crawler:
    curl -A "Googlebot" http://www.drishya.fun

Now try simulating a fake bot:

    curl -A "BadBot" http://www.drishya.fun

Compare the results:
Are both responses the same?
Does the "BadBot" request return a block page, CAPTCHA, or 403 Forbidden?
If "BadBot" is blocked or treated differently: bot detection is working.
If both responses are the same: bot protection is likely missing.
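The comparison above can be automated. Below is a minimal Python sketch: the `fetch_status` and `classify` helper names are assumptions for illustration, and the target URL is the one used in the steps (swap in any site you are testing).

```python
import urllib.error
import urllib.request

def fetch_status(url, user_agent):
    """Return the HTTP status code for a request sent with the given User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # A bot-blocking rule often surfaces as 403 Forbidden or 429.
        return e.code

def classify(crawler_status, fake_bot_status):
    """Interpret the two status codes per the criteria above."""
    if crawler_status == fake_bot_status:
        return "bot protection is likely missing"
    return "bot detection is working"

if __name__ == "__main__":
    url = "http://www.drishya.fun"
    good = fetch_status(url, "Googlebot")
    bad = fetch_status(url, "BadBot")
    print(f"Googlebot: {good}, BadBot: {bad} -> {classify(good, bad)}")
```

Note that comparing status codes alone can miss softer defenses (a CAPTCHA page may still return 200), so also eyeball the response bodies as the steps describe.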