Information Gathering

Active Subdomains

Notes and commands for active subdomain enumeration.

2024-03-01
Tags: recon, information-gathering, active-subdomains

BEST TOOL

  • sublist3r

AXFR

  • https://hackertarget.com/zone-transfer/

    1. Identify NS:
  • nslookup -type=NS zonetransfer.me

    2. Testing for AXFR:
  • nslookup -type=any -query=AXFR zonetransfer.me nsztm1.digi.ninja

  • For dig commands, always append the target IP at the end, e.g.:

  • dig AXFR inlanefreight.htb @10.130.33.6

  • If a record name has an extra label at the front, it is a new (sub)zone -> run AXFR against that zone again.
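
The subzone note above can be sketched as a small parsing step (the zone and nameserver names below are illustrative, and real AXFR output may differ):

```shell
# Hypothetical AXFR output; names are examples, not from a real transfer.
axfr_output='inlanefreight.htb.          3600 IN SOA ns.inlanefreight.htb. hostmaster.inlanefreight.htb. 1 900 300 604800 3600
inlanefreight.htb.          3600 IN NS  ns.inlanefreight.htb.
internal.inlanefreight.htb. 3600 IN NS  ns.inlanefreight.htb.
www.inlanefreight.htb.      3600 IN A   10.130.33.10'

# NS records whose owner differs from the parent zone indicate a
# delegated subzone -> candidates for another AXFR.
subzones=$(printf '%s\n' "$axfr_output" | awk '$4 == "NS" && $1 != "inlanefreight.htb." { print $1 }')
echo "$subzones"

# Next step against each discovered zone, e.g.:
#   dig AXFR internal.inlanefreight.htb @10.130.33.6
```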

Gobuster Patterns

  • vim patterns.txt

  • lert-api-shv-{GOBUSTER}-sin6

  • atlas-pp-shv-{GOBUSTER}-sin6

  • export TARGET="facebook.com"

  • export NS="d.ns.facebook.com"

  • export WORDLIST="numbers.txt"

  • gobuster dns -q -r "${NS}" -d "${TARGET}" -w "${WORDLIST}" -p ./patterns.txt -o "gobuster_${TARGET}.txt"
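
To see what the -p pattern file does, here is a quick illustration of how gobuster substitutes each wordlist entry into the {GOBUSTER} placeholder (tiny sample inputs; bash string replacement as a stand-in for gobuster's own expansion):

```shell
printf '%s\n' 1 2 > numbers.txt
printf '%s\n' 'lert-api-shv-{GOBUSTER}-sin6' 'atlas-pp-shv-{GOBUSTER}-sin6' > patterns.txt

# Every wordlist entry is substituted into every pattern (bash syntax).
candidates=$(while read -r word; do
  while read -r pattern; do
    echo "${pattern/\{GOBUSTER\}/$word}"
  done < patterns.txt
done < numbers.txt)
echo "$candidates"
```

Each resulting candidate is then resolved as a subdomain of the target against the given nameserver.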

vHosts

  • Changing the HOST HTTP header to request a specific domain.

  • curl -s http://192.168.10.10 -H "Host: randomtarget.com"

  • Bruteforcing for possible virtual hosts on the target domain.

  • cat ./vhosts.list | while read vhost; do echo -e "\n********\nFUZZING: ${vhost}\n********"; curl -s -I http://<IP address> -H "HOST: ${vhost}.target.domain" | grep "Content-Length: "; done

  • Bruteforcing for possible virtual hosts on the target domain using ffuf (-fs filters out the size of the default response, taken from the previous brute force).

  • ffuf -w ./vhosts -u http://<IP address> -H "HOST: FUZZ.target.domain" -fs 612

ZAP

  • open -> add to scope -> spider

ffuf

  • ffuf -recursion -recursion-depth 1 -u http://192.168.10.10/FUZZ -w /opt/useful/SecLists/Discovery/Web-Content/raft-small-directories-lowercase.txt

  • -recursion: Activates the recursive scan.

  • -recursion-depth: Specifies the maximum depth to scan.

  • -u: Our target URL, and FUZZ will be the injection point.

  • -w: Path to our wordlist.

  • -> Now save the discovered folders as folders.txt:

  • awk '/\[INFO\]/ { gsub(/http:\/\/[^\/]+\//,""); print $NF }' output.txt | sed 's/FUZZ//' > folders.txt
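
A quick sanity check of that extraction on hypothetical output.txt lines (the [INFO] line format is what ffuf prints when -recursion queues a new job; it may vary between versions):

```shell
cat > output.txt <<'EOF'
[INFO] Adding a new job to the queue: http://192.168.10.10/admin/FUZZ
[INFO] Adding a new job to the queue: http://192.168.10.10/images/FUZZ
EOF

# Strip the scheme/host, keep the last field, drop the FUZZ keyword.
awk '/\[INFO\]/ { gsub(/http:\/\/[^\/]+\//,""); print $NF }' output.txt | sed 's/FUZZ//' > folders.txt
cat folders.txt
```

This leaves admin/ and images/ in folders.txt.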

  • -> Use cewl to collect all words from a homepage and save them to wordlist.txt (here, e.g., minimum length 5); --lowercase converts everything to lowercase.

  • cewl -m5 --lowercase -w wordlist.txt http://192.168.10.10

  • -> Now pass all wordlists to ffuf:

  • ffuf -w ./folders.txt:FOLDERS,./wordlist.txt:WORDLIST,./extensions.txt:EXTENSIONS -u http://192.168.10.10/FOLDERS/WORDLISTEXTENSIONS
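
For intuition, ffuf's default clusterbomb mode tries every combination of the three lists; with tiny sample inputs (the filenames and values below are examples only), the request space mirrors the URL template above:

```shell
printf '%s\n' admin > folders.txt
printf '%s\n' login index > wordlist.txt
printf '%s\n' .php .html > extensions.txt

# Enumerate every FOLDERS/WORDLIST+EXTENSIONS combination.
urls=$(while read -r f; do
  while read -r w; do
    while read -r e; do
      echo "http://192.168.10.10/${f}/${w}${e}"
    done < extensions.txt
  done < wordlist.txt
done < folders.txt)
echo "$urls"
```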