Returns a securer::securer_tool() that fetches content from URLs
via HTTP GET or HEAD, subject to a domain allow-list and rate limiting.
Usage
fetch_url_tool(
  allowed_domains,
  max_response_size = "1MB",
  timeout_secs = 30,
  max_calls = NULL,
  max_calls_per_minute = 10
)

Arguments
- allowed_domains
  Character vector of allowed domains (required). Use *.example.com for
  wildcard subdomains, matching any subdomain but not the bare domain
  itself.
- max_response_size
  Maximum response body size. Default "1MB".
- timeout_secs
  Request timeout in seconds. Default 30.
- max_calls
  Maximum lifetime invocations. NULL means unlimited.
- max_calls_per_minute
  Maximum invocations per 60-second window. Default 10.
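The interplay of max_calls and max_calls_per_minute can be sketched as a
sliding-window limiter. This is an illustrative sketch only; the function
and variable names (make_rate_limiter, timestamps) are hypothetical and
not the package's internals.

```r
# Hypothetical sliding-window rate limiter: allows at most
# max_calls_per_minute invocations in any rolling 60-second window.
make_rate_limiter <- function(max_calls_per_minute) {
  timestamps <- numeric(0)
  function(now = as.numeric(Sys.time())) {
    # Drop calls that have aged out of the 60-second window.
    timestamps <<- timestamps[now - timestamps < 60]
    if (length(timestamps) >= max_calls_per_minute) {
      return(FALSE)  # over the limit: reject this call
    }
    timestamps <<- c(timestamps, now)
    TRUE
  }
}

allow <- make_rate_limiter(2)
allow(0)   # TRUE  (1st call in window)
allow(1)   # TRUE  (2nd call in window)
allow(2)   # FALSE (window is full)
allow(61)  # TRUE  (earlier calls have expired)
```

A lifetime max_calls cap would be a simpler counter layered on top of the
same closure.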
Details
The tool enforces several layers of security:
- Protocol restriction: Only http and https schemes are accepted. Other
  protocols (e.g. file://, ftp://) are rejected.
- Private IP blocking: Hostnames that resolve to private or reserved IP
  ranges (10.x, 172.16-31.x, 192.168.x, 127.x, 169.254.x, 0.0.0.0) are
  blocked to prevent SSRF attacks.
- No redirect following: HTTP redirects are not followed, preventing
  redirect-based SSRF bypasses.
- Domain allow-list: Every request is checked against the allowed_domains
  list. Wildcard entries like *.example.com match any subdomain (e.g.
  api.example.com, deep.sub.example.com) but not the bare example.com.
- Curl-level size limit: A maxfilesize curl option caps the download at
  max_response_size bytes, with an additional post-download nchar check
  as a backup.
- Rate limiting: Both per-minute and lifetime invocation limits are
  enforced.
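The wildcard matching rule described above can be sketched as follows.
This is a minimal illustration, not the package's actual implementation;
the helper name domain_allowed is hypothetical.

```r
# Hypothetical allow-list check: exact entries match only themselves;
# "*.example.com" matches any subdomain but not the bare "example.com".
domain_allowed <- function(host, allowed_domains) {
  for (pattern in allowed_domains) {
    if (startsWith(pattern, "*.")) {
      suffix <- substring(pattern, 2)  # e.g. ".example.com"
      # Requiring the leading dot excludes both the bare domain and
      # lookalikes such as "badexample.com".
      if (endsWith(host, suffix)) return(TRUE)
    } else if (identical(host, pattern)) {
      return(TRUE)
    }
  }
  FALSE
}

domain_allowed("api.example.com", "*.example.com")       # TRUE
domain_allowed("deep.sub.example.com", "*.example.com")  # TRUE
domain_allowed("example.com", "*.example.com")           # FALSE
domain_allowed("badexample.com", "*.example.com")        # FALSE
```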
See also
Other tool factories:
calculator_tool(),
data_profile_tool(),
plot_tool(),
query_sql_tool(),
r_help_tool(),
read_file_tool(),
write_file_tool()
Examples
# \donttest{
tool <- fetch_url_tool(
  allowed_domains = c("api.example.com", "*.cdn.example.com"),
  max_response_size = "512KB",
  timeout_secs = 10
)
# }