started documentation, fixed scanner option/result

This commit is contained in:
epi052
2020-12-26 19:11:58 -06:00
parent ac3c029bff
commit 0726376955
4 changed files with 8 additions and 2 deletions


@@ -5,7 +5,7 @@ on: [push]
jobs:
build-nix:
runs-on: ${{ matrix.os }}
-if: github.ref == 'refs/heads/master'
+# if: github.ref == 'refs/heads/master'
strategy:
matrix:
type: [ubuntu-x64, ubuntu-x86]


@@ -89,6 +89,8 @@ This attack is also known as Predictable Resource Location, File Enumeration, Di
- [Filter Response Using a Regular Expression (new in `v1.8.0`)](#filter-response-using-a-regular-expression-new-in-v180)
- [Stop and Resume Scans (save scan's state to disk) (new in `v1.9.0`)](#stop-and-resume-scans---resume-from-file-new-in-v190)
- [Enforce a Time Limit on Your Scan (new in `v1.10.0`)](#enforce-a-time-limit-on-your-scan-new-in-v1100)
- [Extract Links from robots.txt (New in `v1.10.2`)](#extract-links-from-robotstxt-new-in-v1102)
+- [Filter Response by Similarity to A Given Page (new in `v1.11.0`)](#filter-response-by-similarity-to-a-given-page-new-in-v1110)
- [Comparison w/ Similar Tools](#-comparison-w-similar-tools)
- [Common Problems/Issues (FAQ)](#-common-problemsissues-faq)
- [No file descriptors available](#no-file-descriptors-available)
@@ -352,6 +354,7 @@ A pre-made configuration file with examples of all available settings can be fou
# depth = 1
# filter_size = [5174]
# filter_regex = ["^ignore me$"]
+# filter_similar = ["https://somesite.com/soft404"]
# filter_word_count = [993]
# filter_line_count = [35, 36]
# queries = [["name","value"], ["rick", "astley"]]
@@ -658,6 +661,8 @@ In addition to [extracting links from the response body](#extract-links-from-res
`--extract-links` makes a request to `/robots.txt` and examines all `Allow` and `Disallow` entries. Directory entries
are added to the scan queue, while file entries are requested and then reported if appropriate.
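The entry handling described above can be sketched roughly as follows. This is a simplified, hypothetical illustration, not feroxbuster's actual implementation; in particular, classifying an entry as a directory by its trailing slash is an assumption made for this sketch.

```rust
/// Simplified sketch (not feroxbuster's code): pull paths out of
/// robots.txt, examining both Allow and Disallow entries. Entries
/// ending in '/' are treated as directories to queue for scanning;
/// anything else is treated as a file to request and report.
fn extract_robots_entries(body: &str) -> (Vec<String>, Vec<String>) {
    let mut dirs = Vec::new();
    let mut files = Vec::new();

    for line in body.lines() {
        let line = line.trim();

        // keep the path portion of Allow/Disallow lines, skip everything else
        let path = if let Some(p) = line.strip_prefix("Allow:") {
            p.trim()
        } else if let Some(p) = line.strip_prefix("Disallow:") {
            p.trim()
        } else {
            continue;
        };

        if path.is_empty() {
            continue;
        }

        if path.ends_with('/') {
            dirs.push(path.to_string()); // directory: added to the scan queue
        } else {
            files.push(path.to_string()); // file: requested, then reported
        }
    }

    (dirs, files)
}

fn main() {
    let robots = "User-agent: *\nDisallow: /admin/\nAllow: /public/index.html\n";
    let (dirs, files) = extract_robots_entries(robots);
    println!("dirs: {:?}, files: {:?}", dirs, files);
}
```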
+### Filter Response by Similarity to A Given Page (new in `v1.11.0`)
## 🧐 Comparison w/ Similar Tools
There are quite a few similar tools for forced browsing/content discovery: Burp Suite Pro, Dirb, Dirbuster, etc.


@@ -33,6 +33,7 @@
# depth = 1
# filter_size = [5174]
# filter_regex = ["^ignore me$"]
+# filter_similar = ["https://somesite.com/soft404"]
# filter_word_count = [993]
# filter_line_count = [35, 36]
# queries = [["name","value"], ["rick", "astley"]]


@@ -676,7 +676,7 @@ pub async fn initialize(num_words: usize, config: &Configuration) {
// if successful, create a filter based on the response's body
let fr = FeroxResponse::from(resp, true).await;
-if let Ok(hash) = ssdeep::hash(fr.text().as_bytes()) {
+if let Some(hash) = ssdeep::hash(fr.text().as_bytes()) {
// hash the response body and store the resulting hash in the filter object
let filter = SimilarityFilter {
text: hash,
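The fix above corrects a pattern-match mismatch: per the diff, `ssdeep::hash` yields an `Option` rather than a `Result`, so matching with `if let Ok(...)` does not compile and `Some` is the correct pattern. A minimal sketch of the bug class, using a hypothetical stand-in hasher (the real `ssdeep` crate computes a fuzzy hash and needs a native library):

```rust
// Hypothetical stand-in for ssdeep::hash; like the real function per the
// diff, it returns an Option<String>, not a Result.
fn mock_hash(bytes: &[u8]) -> Option<String> {
    if bytes.is_empty() {
        None // nothing to hash
    } else {
        Some(format!("hash:{}", bytes.len()))
    }
}

fn main() {
    let body = b"some response body";

    // `if let Ok(hash) = mock_hash(body)` would be a type error here,
    // because the return type is Option<String>; `Some` is the right pattern
    if let Some(hash) = mock_hash(body) {
        println!("{}", hash);
    }
}
```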