First off, Happy New Year! I hope you have a productive and successful 2018. I thought I’d kick off the new year with another exploration of OSINT. In addition to my work as an information security leader and practitioner at Microsoft, I am privileged to serve in Washington’s military as a J-2, which means I’m part of the intelligence directorate of a joint staff. Intelligence duties in a guard unit context are commonly focused on situational awareness for mission readiness. Additionally, in my unit we combine part of J-6 (command, control, communications, and computer systems directorate of a joint staff) with J-2, making Cyber Network Operations a J-2/6 function. Open source intelligence (OSINT) gathering is quite useful in developing indicators specific to adversaries, as well as in identifying targets of opportunity for red team and vulnerability assessments.

We’ve discussed numerous OSINT offerings in toolsmiths past, so there’s no better time than our 130th edition to discuss an OSINT platform inclusive of previous topics such as Recon-ng, Spiderfoot, Maltego, and Datasploit. Buscador is just such a platform and comes from genuine OSINT experts Michael Bazzell and David Wescott. Buscador is “a Linux Virtual Machine that is pre-configured for online investigators.” Michael is the author of Open Source Intelligence Techniques (5th edition) and Hiding from the Internet (3rd edition). I had a quick conversation with him and learned that they will have a new release in January (1.2), which will address many issues and add new features. It will also revamp Firefox, given the release of version 57. You can download Buscador as an OVA bundle for a variety of virtualization options, or as an ISO for USB boot devices or host operating systems. I had Buscador 1.1 up and running on Hyper-V in a matter of minutes after pulling the VMDK out of the OVA and converting it with QEMU.
Buscador 1.1 includes numerous tools; in addition to the above-mentioned standard bearers, you can expect the following, among others:
- Wayback Exporter
- HTTrack Cloner
- Web Snapper
- Knock Pages
- Twitter Exporter
To put Buscador through its paces, using myself as a target of opportunity, I tested a few of the tools I had not previously used. Starting with Creepy, the geolocation OSINT tool, I configured the Twitter plugin, one of the four available in Creepy (Flickr, Google+, Instagram, Twitter), and searched holisticinfosec, as seen in Figure 1.
|Figure 1: Creepy configuration|
The results, as seen in Figure 2, include some good details, but no immediate location data.
|Figure 2: Creepy results|
Had I configured the other plugins, or were I even a user of Flickr or Google+, better results would have been likely. I have location turned off for my Tweets, but my profile does include Seattle. Creepy is quite good for assessing targets who utilize social media heavily, but if you wish to dig more deeply into Twitter usage, check out Tinfoleak, which also uses geo information available in Tweets and uploaded images. The report for holisticinfosec is seen in Figure 3.
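The geo extraction that tools like Creepy and Tinfoleak perform can be sketched in a few lines. This is a minimal illustration, not Tinfoleak’s actual code; the record layout below is a simplified assumption about tweet JSON, not the real Twitter API response.

```python
# Sketch: pull coordinates out of tweet-like records, the way geo OSINT
# tools mine location data. The "geo"/"coordinates" shape here is an
# assumed, simplified layout for illustration only.

def extract_geo(tweets):
    """Return (tweet_id, lat, lon) for records carrying coordinates."""
    points = []
    for t in tweets:
        coords = (t.get("geo") or {}).get("coordinates")
        if coords:
            points.append((t["id"], coords[0], coords[1]))
    return points

sample = [
    {"id": 1, "text": "at the office", "geo": {"coordinates": [47.61, -122.33]}},
    {"id": 2, "text": "no location", "geo": None},
]
print(extract_geo(sample))  # [(1, 47.61, -122.33)]
```

Even a handful of such points, plotted over time, can reveal a target’s home, workplace, and routine, which is exactly why I keep location off for my own Tweets.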
|Figure 3: Tinfoleak|
If you’re looking for domain enumeration options, you can start with Knock. It’s as easy as handing it a domain; I did so with holisticinfosec.org, as seen in Figure 4, with results in Figure 5.
|Figure 4: Knock run|
|Figure 5: Knock results|
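Under the hood, wordlist-driven subdomain enumeration of the kind Knock performs boils down to prepending candidate labels to the target domain and seeing which ones resolve. Here is a minimal sketch of that idea; the three-entry wordlist is an illustrative sample, not Knock’s own list.

```python
import socket

# Sketch of Knock-style subdomain enumeration: build candidate hostnames
# from a wordlist, then attempt to resolve each one. Wordlist is a tiny
# illustrative sample, not the list Knock actually ships with.

def candidates(domain, words):
    """Prepend each wordlist entry to the target domain."""
    return [f"{w}.{domain}" for w in words]

def resolve(host):
    """Return the host's A record, or None if it does not resolve."""
    try:
        return socket.gethostbyname(host)
    except socket.gaierror:
        return None

wordlist = ["www", "mail", "dev"]
for host in candidates("holisticinfosec.org", wordlist):
    addr = resolve(host)
    if addr:
        print(f"{host} -> {addr}")
```

Real tools add large wordlists, wildcard detection, and zone transfer checks on top of this core loop.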
Other classics include HTTrack for website cloning, and ExifTool for pulling all available metadata from images. HTTrack worked instantly as expected for holisticinfosec.org. As a simple experiment, I used Instalooter, “a program that can download any picture or video associated from an Instagram profile, without any API access”, against the infosec.memes Instagram account, then ran pyExifToolGui against the downloaded images and exported the Exif metadata to HTML. If I were analyzing images for associated hashtags, the export capability might be useful for building an artifacts list.
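The export step is conceptually simple: flatten each image’s metadata into a row of an HTML table. A hedged sketch follows; pyExifToolGui actually wraps ExifTool against real files, so here the metadata is represented as plain dicts with invented sample values.

```python
import html

# Sketch of a metadata-to-HTML export, in the spirit of pyExifToolGui's
# export feature. Records are hand-built dicts standing in for ExifTool
# output; file names and tag values are invented for illustration.

def exif_to_html(records, fields):
    """Render selected metadata fields from each record as an HTML table."""
    rows = ["<table>", "<tr>" + "".join(f"<th>{f}</th>" for f in fields) + "</tr>"]
    for r in records:
        cells = "".join(f"<td>{html.escape(str(r.get(f, '')))}</td>" for f in fields)
        rows.append(f"<tr>{cells}</tr>")
    rows.append("</table>")
    return "\n".join(rows)

meta = [
    {"SourceFile": "meme01.jpg", "Software": "Instagram"},
    {"SourceFile": "meme02.jpg", "Software": "Photoshop"},
]
print(exif_to_html(meta, ["SourceFile", "Software"]))
```

Escaping the cell values matters: metadata fields are attacker-controlled input, and an unescaped export could itself become an injection vector in whatever views the report.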
Finally, one of my absolute favorites is Metagoofil, “an information gathering tool designed for extracting metadata of public documents.” I did a quick run against my domain, with the doc retrieval parameter set at 50, then reviewed full.txt results (Figure 6), included in the output directory (home/Metagoofil) along with authors.csv, companies.csv, and modified.csv.
|Figure 6: Metagoofil results|
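The value of an output file like authors.csv is the aggregation: tallying the Author field across every harvested document quickly surfaces usernames and naming conventions. A minimal sketch of that tally, with invented sample records rather than real Metagoofil output:

```python
import csv
import io
from collections import Counter

# Sketch of the aggregation behind an authors.csv-style report: count
# the Author field across harvested document metadata. Sample records
# below are invented for illustration, not real Metagoofil output.

def author_counts(docs):
    """Tally authors across document metadata, skipping empty values."""
    return Counter(d["Author"] for d in docs if d.get("Author"))

def to_csv(counts):
    """Emit the tally as CSV, most prolific author first."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Author", "Documents"])
    for author, n in counts.most_common():
        writer.writerow([author, n])
    return buf.getvalue()

docs = [
    {"File": "report.pdf", "Author": "rmcree"},
    {"File": "slides.ppt", "Author": "rmcree"},
    {"File": "notes.doc", "Author": None},
]
print(to_csv(author_counts(docs)))
```

A recurring author string is often a domain username, which is exactly the kind of seed a red team wants for password spraying or phishing pretexts.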
Metagoofil is extremely useful for gathering target data; I consider it a red team recon requirement. It’s a faster, currently maintained offering that shares some capabilities with FOCA. It should also serve as a reminder of just how much information is available in public-facing documents; consider stripping the metadata before publishing.
Cheers…until next time.
This is a Security Bloggers Network syndicated blog post authored by Russ McRee. Read the original post at: HolisticInfoSec™