| Version | Supported |
|---|---|
| 0.1.0 | ✅ |
If you discover a security vulnerability, please report it by emailing the maintainers directly at totallynotitlol@gmail.com rather than opening a public issue.
Please include:
- Description of the vulnerability
- Steps to reproduce
- Potential impact
- Suggested fix (if any)
We will respond within 48 hours and work with you to resolve the issue.
EasyScrape includes several built-in security protections:
All URLs are validated by default to prevent Server-Side Request Forgery attacks:
- Blocks requests to private IP ranges (10.x, 172.16-31.x, 192.168.x)
- Blocks localhost and loopback addresses
- Blocks link-local addresses (169.254.x)
- Can be disabled with `allow_localhost=True` for development
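A minimal sketch of this kind of validation, using only Python's standard `ipaddress` and `socket` modules, looks roughly like the following; the `is_safe_url` helper is illustrative, not EasyScrape's actual implementation:

```python
import ipaddress
import socket
from urllib.parse import urlparse

def is_safe_url(url: str) -> bool:
    """Illustrative SSRF check: resolve the host and reject
    private, loopback, and link-local addresses."""
    host = urlparse(url).hostname
    if host is None:
        return False
    try:
        # Resolve the hostname to an IPv4 address (simplified; a full
        # implementation would consider every resolved address)
        addr = ipaddress.ip_address(socket.gethostbyname(host))
    except (socket.gaierror, ValueError):
        return False
    # is_private covers 10.x, 172.16-31.x, and 192.168.x;
    # the other checks cover 127.x and 169.254.x
    return not (addr.is_private or addr.is_loopback or addr.is_link_local)
```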
The `safe_links()` method filters out potentially dangerous URLs:
- Blocks `javascript:` URLs
- Blocks `data:` URLs
- Returns only HTTP/HTTPS links
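Scheme-based filtering of this sort can be sketched with `urllib.parse`; the function below is a hypothetical stand-in for `safe_links()`, not its actual implementation:

```python
from urllib.parse import urlparse

ALLOWED_SCHEMES = {"http", "https"}

def filter_safe_links(links):
    """Illustrative filter: keep only http/https links, dropping
    javascript:, data:, and every other scheme."""
    return [
        link for link in links
        if urlparse(link).scheme.lower() in ALLOWED_SCHEMES
    ]
```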
SSL certificate verification is enabled by default. Disable only in development:
```python
config = Config(verify_ssl=False)  # Not recommended for production
```

- Never disable SSRF protection in production
- Always validate user-provided URLs before scraping
- Use rate limiting to avoid being blocked
- Respect robots.txt when scraping (see the sketch below)
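As a rough illustration of the last two points, the hypothetical `polite_fetch` helper below checks robots.txt with the standard `urllib.robotparser` module and sleeps between requests as a simple rate limit; the name, the one-second delay, and the use of `requests` are all assumptions, not part of EasyScrape:

```python
import time
import urllib.robotparser
from urllib.parse import urlparse

import requests  # assumed available; EasyScrape's own fetch API may differ

def polite_fetch(url, user_agent="MyScraperBot", delay=1.0):
    """Illustrative helper: consult robots.txt before fetching and
    pause between requests as a crude rate limit."""
    parts = urlparse(url)
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()
    if not rp.can_fetch(user_agent, url):
        raise PermissionError(f"robots.txt disallows fetching {url}")
    time.sleep(delay)  # crude rate limit between requests
    return requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
```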