Name | Type | Description | Notes |
---|---|---|---|
http_daemons_to_skip | list[str] | The names of HTTP Daemons (HTTPd) to skip when spidering. For example, `"CUPS"`. | [optional] |
maximum_directory_levels | int | The directory depth limit for web spidering. Limiting directory depth can save significant time, especially with large sites. A value of `0` signifies unlimited directory traversal. Defaults to `6`. | [optional] |
maximum_foreign_hosts | int | The maximum number of unique host names that the spider may resolve. Resolving foreign hosts adds substantial time to the spidering process, especially with large web sites, because of the frequent cross-link checking involved. Defaults to `100`. | [optional] |
maximum_link_depth | int | The maximum depth of links to traverse when spidering. Defaults to `6`. | [optional] |
maximum_pages | int | The maximum number of pages to spider. This is a time-saving measure for large sites. Defaults to `3000`. | [optional] |
maximum_retries | int | The maximum number of times to retry a request after a failure. A value of `0` means no retry attempts are made. Defaults to `2`. | [optional] |
maximum_time | str | The maximum length of time to spend web spidering, specified as an ISO 8601 duration. This limit prevents scans from running longer than the allotted scan schedule. A value of `PT0S` means no limit is applied. The acceptable range is `PT1M` to `PT16666.6667H`. | [optional] |
response_timeout | str | The duration to wait for a response from a target web server. The value is specified as an ISO 8601 duration and can range from `PT0S` (0ms) to `PT1H` (1 hour). Defaults to `PT2M`. | [optional] |
threads_per_server | int | The number of threads to use per web server being spidered. Defaults to `3`. | [optional] |
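
A minimal sketch of assembling these settings, using the field names and defaults from the table above. The plain dictionary is an assumption for illustration; the actual SDK model class that wraps these fields is not shown here, and the duration strings follow ISO 8601 (e.g. `PT2M` is two minutes):

```python
# Hypothetical web-spider performance configuration built from the
# fields documented above; defaults shown where the table states them.
web_spider_performance = {
    "http_daemons_to_skip": ["CUPS"],   # HTTP daemons to skip when spidering
    "maximum_directory_levels": 6,      # 0 = unlimited directory traversal
    "maximum_foreign_hosts": 100,       # unique host names the spider may resolve
    "maximum_link_depth": 6,            # maximum depth of links to traverse
    "maximum_pages": 3000,              # page cap; time saver for large sites
    "maximum_retries": 2,               # 0 = no retries after a failed request
    "maximum_time": "PT0S",             # PT0S = no overall time limit
    "response_timeout": "PT2M",         # wait up to 2 minutes per response
    "threads_per_server": 3,            # spidering threads per web server
}

# Only integer fields take numeric values; the two time limits are
# ISO 8601 duration strings, not numbers.
duration_fields = {k for k, v in web_spider_performance.items()
                   if isinstance(v, str)}
print(sorted(duration_fields))
```

Keeping the durations as strings matches the `str` type in the table; converting them to seconds (if needed) would be up to the caller.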