About Our Monitoring Bot
If you're seeing requests from watch4.me in your server logs, this page explains what our bot does and how to contact us.
What is this bot?
The watch4.me bot is an uptime monitoring service, not a web crawler or scraper. We make periodic HTTP requests to specific URLs to check if they are online and responding correctly.
If you see our bot in your logs, it means someone has configured our service to monitor a URL on your server. This is typically done by the website owner, a developer, or an operations team to receive alerts when the site goes down.
What we do
- Check specific URLs configured by our users
- Make lightweight HEAD or GET requests
- Check at regular intervals (1-5 minutes typically)
- Identify ourselves clearly in the User-Agent
- Respect rate limits and server responses
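In practice a single check is just one request with a short timeout. Here is a minimal sketch in Python of what such a check could look like (the function name and defaults are illustrative, not our actual implementation):

```python
import urllib.request
import urllib.error

USER_AGENT = "Mozilla/5.0 (compatible; watch4.me/1.0; +https://watch4.me/bot/)"

def check_url(url, timeout=30):
    """Send one lightweight HEAD request and report whether the URL is up."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": USER_AGENT})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except urllib.error.HTTPError:
        return False  # server responded, but with an error status
    except (urllib.error.URLError, TimeoutError):
        return False  # unreachable or timed out
```

Note that no page resources (images, scripts, stylesheets) are fetched: the check is a single request.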
What we don't do
- Crawl or index your entire website
- Follow links or discover new pages
- Scrape content or collect data
- Store page content beyond health checks
- Share or sell any information
How to identify our bot
User-Agent String
Our monitoring bot identifies itself with the following User-Agent:
Mozilla/5.0 (compatible; watch4.me/1.0; +https://watch4.me/bot/)
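To confirm which entries in your access logs come from our monitor, matching on the watch4.me token in the User-Agent field is enough. A quick sketch, assuming the common Apache/nginx combined log format where the User-Agent is the last quoted field:

```python
import re

# Stable token to match; the version number after it may change over time.
BOT_TOKEN = "watch4.me"

def is_monitor_hit(log_line):
    """True if an access-log line's User-Agent contains our bot token."""
    # In combined log format the User-Agent is the last quoted field.
    quoted = re.findall(r'"([^"]*)"', log_line)
    return bool(quoted) and BOT_TOKEN in quoted[-1]
```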
IP Addresses
Once published, our monitoring requests will originate from the IP ranges listed below. Even so, we recommend filtering by User-Agent rather than IP, as our infrastructure may scale and IPs may change.
| Region | IP Addresses |
|---|---|
| US East | Coming soon |
| US West | Coming soon |
| Europe | Coming soon |
| Asia Pacific | Coming soon |
IP addresses will be published when our monitoring infrastructure is deployed.
Robots.txt Policy
We respect robots.txt rules that specifically target our bot, but we ignore generic crawler rules.
To block our monitoring bot via robots.txt:
User-agent: watch4.me
Disallow: /
Why we ignore generic rules: if your User-agent: * rule blocks all bots, we wouldn't be able to check if your site is up. Since someone specifically configured monitoring for your URL, they need accurate uptime data regardless of generic crawler policies.
However, if you explicitly add a User-agent: watch4.me rule, we will respect it. The monitor owner will be notified that their check is being blocked by robots.txt.
We only access the specific URLs configured for monitoring. We never crawl, discover, or access other pages on your site.
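To illustrate the policy above, here is a simplified sketch of the decision in Python. It deliberately ignores User-agent: * groups, and for brevity it also skips multi-agent groups and Allow lines, so treat it as an illustration of the rule rather than our actual parser:

```python
def monitor_allowed(robots_txt, path="/"):
    """Apply the policy above: only a 'User-agent: watch4.me' group can
    disallow a check; generic 'User-agent: *' groups are ignored."""
    in_our_group = False
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            # Only a group addressed to us by name can block the check.
            in_our_group = (value.lower() == "watch4.me")
        elif field == "disallow" and in_our_group:
            if value and path.startswith(value):
                return False
    return True
```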
How to whitelist our bot
If your firewall or security service is blocking our monitoring requests, you can whitelist us using the User-Agent string.
Cloudflare
Create a WAF rule to allow requests with our User-Agent:
(http.user_agent contains "watch4.me")
Apache (.htaccess)
# Let watch4.me requests pass before any rewrite-based bot blocks below
RewriteCond %{HTTP_USER_AGENT} watch4\.me [NC]
RewriteRule .* - [L]
Nginx
# nginx has no standalone "allow by User-Agent" directive; use a flag:
set $block_ua 1;
if ($http_user_agent ~* "watch4\.me") {
    set $block_ua 0;  # our monitor: skip the block below
}
if ($block_ua) { return 403; }
How to block our bot
Option 1: Use robots.txt (recommended)
Add a rule targeting our bot specifically (see robots.txt section above). This is the standard way to control bot access.
Option 2: Contact the monitor owner
If someone in your organization set up the monitoring, check with your IT or DevOps team first.
If you don't know who configured the monitoring, you can send them a message through us. We'll forward it to the account owner without revealing your email address, protecting both parties' privacy.
Option 3: Block via firewall
You can block requests using your firewall or web server configuration by filtering the User-Agent string shown above.
Server impact
We design our monitoring to have minimal impact on your server:
- Single request per check (no page resources loaded)
- Typical check interval: 1-5 minutes (configurable by user)
- Short timeout (10-30 seconds)
- No retry floods (we wait before rechecking)
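The "no retry floods" behavior in the list above can be sketched as a delay function that backs off after consecutive failures. The exponential backoff and the caps below are illustrative, not our exact schedule:

```python
def next_check_delay(base_interval, consecutive_failures, max_delay=900):
    """Seconds to wait before the next check.

    Always waits at least the configured interval; after repeated
    failures it backs off further instead of retrying immediately.
    """
    # Double the wait per failure, capped at 2**4 and at max_delay.
    delay = base_interval * (2 ** min(consecutive_failures, 4))
    return min(delay, max_delay)
```

For example, with a 60-second interval, two consecutive failures would push the next check out to four minutes rather than triggering immediate retries.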
Contact us
Have questions or concerns about our monitoring bot? We're here to help.
Abuse Reports
abuse@watch4.me
Please include the URL being monitored and any relevant log excerpts in your message.