Tracking Newly Registered Domains

Published: 2017-12-13
Last Updated: 2017-12-13 07:16:05 UTC
by Xavier Mertens (Version: 1)
4 comment(s)

Here is the next step in my series of diaries related to domain names. After tracking suspicious domains with a dashboard[1] and proactively searching for malicious domains[2], let’s focus on newly registered domains. There are a huge number of domain registrations performed every day (on average, a few thousand per day across all TLDs). Why focus on new domains? With the many DGAs (“Domain Generation Algorithms”) used by malware families, it is useful to track newly created domains and correlate them with your local resolvers’ logs. You could detect emerging threats or suspicious activities.

The challenge is to find a list of all those domains. There are plenty of online services that provide this kind of data. Some of them allow you to browse the new domains online[3]; others sell this kind of database, usually linked with the corresponding whois data, for a monthly fee (usually around $65)[4]. Some registries offer a list for their own TLDs (like the AFNIC in France[5]), but those are limited.

I was looking for a global list that covers all TLDs and, if possible, is free. I found whoisds.com[6], which offers this service. They provide a complete database (domains + whois data) for a monthly fee, but the “simple” list (domains only) is available for free and without any registration.

I’m fetching the file via a simple shell script and a cron job:

#!/bin/bash
TODAY=$(date --date="-2 day" +"%Y-%m-%d")
DESTDIR="/home/domains"
URL="https://whoisds.com/whois-database/newly-registered-domains/$TODAY.zip/nrd"
USERAGENT="XmeBot/1.0 (https://blog.rootshell.be/bot/)"
TEMPFILE=$(mktemp /tmp/wget_XXXXXX.zip)
LOGFILE=$(mktemp /tmp/wget_XXXXXX.log)
CSVFILE="/opt/splunk/etc/apps/search/lookups/newdomains.csv"

# Check if the destination directory exists
[ -d "$DESTDIR" ] || mkdir -p "$DESTDIR"
# Ensure that the file does not exist already
[ -r "$DESTDIR/$TODAY.txt" ] && rm "$DESTDIR/$TODAY.txt"

wget -o "$LOGFILE" -O "$TEMPFILE" --user-agent="$USERAGENT" "$URL"
RC=$?
if [ "$RC" != "0" ]; then
        echo "[ERROR] Cannot fetch $URL"
        cat "$LOGFILE"
else
        unzip -d "$DESTDIR" "$TEMPFILE" >"$LOGFILE" 2>&1
        RC=$?
        if [ "$RC" != "0" ]; then
                echo "[ERROR] Cannot unzip $TEMPFILE"
                cat "$LOGFILE"
        else
                echo "newdomain" >"$CSVFILE"
                cat "$DESTDIR/$TODAY.txt" >>"$CSVFILE"
                rm "$LOGFILE" "$TEMPFILE"
        fi
fi

This script is executed once a day by a cron job to store the daily file in the specified directory. A CSV file is also created in the Splunk application. Note that the script fetches the file from two days ago (--date="-2 day") because I noticed that the previous day's list is sometimes published with some delay!
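For reference, the corresponding crontab entry could look like this (a sketch only — the script path and schedule are assumptions, not taken from the diary):

```shell
# m  h  dom mon dow  command
# Run once a day at 06:30; the script itself applies the 2-day offset
30 6 * * * /usr/local/bin/fetch_nrd.sh >>/var/log/fetch_nrd.log 2>&1
```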

With the CSV file created in Splunk, I can now search for newly created domains in my Bro DNS logs:

index=securityonion sourcetype=bro_dns (rcode="A" OR rcode="AAAA")
|rex field=qclass ".*\.(?<newdomain>\w+\.\w+)"
|search [|inputlookup newdomains.csv]
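If you don't run Splunk, the same correlation can be sketched with standard shell tools. The snippet below is a minimal sketch, assuming a plain-text file of queried names extracted from your resolver logs (the sample data and file names are hypothetical; the two-label reduction is naive and ignores multi-label public suffixes like .co.uk):

```shell
#!/bin/bash
# Sample local DNS queries (stand-in for names extracted from resolver logs)
cat >dns_queries.txt <<'EOF'
www.example.com
mail.badnrd.xyz
api.example.com
EOF

# Sample newly-registered-domains list (stand-in for the daily $TODAY.txt)
cat >newdomains.txt <<'EOF'
badnrd.xyz
otherdomain.top
EOF

# Reduce each queried name to its registered domain (naive two-label match)
awk -F. 'NF>=2 {print $(NF-1)"."$NF}' dns_queries.txt | sort -u >seen_domains.txt

# Domains present in both lists = newly registered domains seen locally
sort -u newdomains.txt | comm -12 - seen_domains.txt >hits.txt
cat hits.txt
```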

You can also search for specific keywords, like brands or terms related to your business:

# cat domains_keyword.csv
keyword
*bank*
*paypal*
*apple*
*ec2*

Here is an interesting Splunk query:

|inputlookup newdomains.csv
|rex field=newdomain "(?<keyword>\w+)\.\w+"
|search [|inputlookup domains_keyword.csv]

This search returned for yesterday:

halk-bankbireysel.com
storybankmaine.org
summitbank.org 
towercommunitybankmortgage.org

Happy hunting! 

[1] https://isc.sans.edu/forums/diary/Suspicious+Domains+Tracking+Dashboard/23046/
[2] https://isc.sans.edu/forums/diary/Proactive+Malicious+Domain+Search/23065/
[3] https://domainpunch.com/tlds/daily.php
[4] https://www.whoisxmlapi.com/newly-registered-domains.php
[5] https://www.afnic.fr/en/products-and-services/services/daily-list-of-registered-domain-names/#
[6] https://whoisds.com/newly-registered-domains

Xavier Mertens (@xme)
ISC Handler - Freelance Security Consultant
PGP Key
