Tuesday, September 1, 2020

Advanced Web Attacks & Exploitation (AWAE) To OSWE Certification

Right, so I had been doing bug bounties for the past year. I managed to make some decent cash from my side hustle and thought I should bring it to the 'next' level by improving my white box/code review skills. Unfortunately for me, I suck big time at coding. Then came the difficult question: do I pamper myself with a new MacBook Pro 16", or do the right thing and spend my hard-earned bounties on upskilling myself?? Thankfully, I came to my senses and decided on the latter. I signed up for the Offsec AWAE course with 3 months of lab time.

The course material started off with XSS, which wasn't that difficult to understand, but then came the extra mile exercises that required some XHR JavaScript coding. This part was a challenge for me as I had not written much code for some time. Anyway, I managed to finish all the exercises and extra miles, completing all the materials in about a month. Then came the announcement from Offsec that new material had been added to the course and that existing students would get the upgrade plus 1 month of lab time for free. I downloaded it and found 3 extra topics. I didn't focus much on them since I planned to take the exam ASAP, before they refreshed the questions!

I sat for the exam shortly after completing the old course materials but failed miserably. The difficulty is not exploiting the bugs, but finding them! This is where I feel Offsec fell short: the course focuses on exploitation and automation but not on bug hunting techniques. In my opinion, there should be more emphasis on how to find the bugs. Once you find one, it's usually not difficult to exploit, provided you haven't already been down countless rabbit holes that zap your energy. That's what happened to me during my first attempt: by the time I found the bug, I was too lethargic to proceed further, let alone write the necessary exploit code. I tried in vain to complete the 1st host within the first 24 hours and failed to find even one flag, but I didn't give up. The next day, I attempted the 2nd host but still fell short of finding all the flags. :( This was very demoralizing, but I wasn't prepared to quit so soon. Remember, to pass this exam you need to do 3 things: find the bug, exploit it, and write decent code to automate the exploitation. Fall short in any of these and you will not make it through the exam.

There was a cooling-off period of 4 weeks, which gave me ample time to adjust and think about my next strategy. Since I had already completed all the extra miles in the course materials, there was no point in redoing them, so I searched for some real-life targets on different bug bounty platforms for practice. Using the techniques taught directly in the AWAE course materials, I scored my first CVE, CVE-2020-15160, with a USD 250 bounty! How's that for real-life application!

While I can't talk much about the exam, all I can say is never give up. On my second attempt, the 1st box was still difficult and I nearly felt like quitting, but I was very confident that I had identified the bug and just needed to get the exploit automated; this was where my coding skill was really tested. Thankfully, within 12 hours I had scored the first and second flags, and within 22 hours I had enough points to pass the exam. I took my time writing the final report while the exam was still in progress. Finally, after about 24 hours I had finished documenting all my findings along with the necessary proof.txt and local.txt flags.

My advice for peeps planning to sit for this exam: complete all the extra miles and take note of the PoC scripts you wrote for the course materials. The exam is not exactly straightforward; it tests your understanding of exploiting common web vulnerabilities. I highly recommend anyone interested in learning white box testing to undertake this course and work towards the certification. I, for one, never wish to sit for such a gruelling 48-hour exam again! If I compare it to the other Offsec exams, OSWE is more difficult than OSCE and OSCP, primarily because you need to master all 3 skills (bug identification, exploitation & exploit development) to clear it. In OSCE, the coding skill required was nothing compared to this one. In OSCP, there was hardly any coding required, with the exception of the BoF exploit, which is considered child's play. I think people with good coding skills will find it easier than those without, just my opinion.

Finally, get used to the idea of someone constantly watching you during the exam via webcam and desktop sharing, LOL! I tried to look decent at all times, but Malaysia is a hot and humid country. All the best!

Friday, August 14, 2020

Bug Bounty For Fun & Profit


Here is the slide pack from my last presentation for team EG (Elite Ghost). Thanks for inviting me, guys! It was a pleasure to speak at your event. The video recording of my talk can be found here.


Saturday, May 16, 2020

How Do I Get Started in Bug Bounties FAQ:

Okay, I have garnered loads of questions/interest from my connections about how to get started in Bug Bounties.

Here is a short FAQ:


1. How do I get started in Ethical Hacking?
- Go read Web Hacking 101 by Peter Yaworski.

2. Is it legal?
- Yes, perfectly legal. There are many bug bounty platforms that companies participate in; they invite hackers from around the world to test their public-facing websites and report any issues found. It is only illegal if you extort, hack without consent or violate the terms and conditions. Read the terms and stay in scope, and you will be fine.

3. How much can I make?
- It depends how good/skilled you are. Top hackers on HackerOne have made USD 1 million in a few years. I know a very talented guy in Singapore who makes USD 10K per month doing it part time, but this is very rare. Most people will be lucky to make USD 100-500 per month, and that's if you are good.

4. Do I need to attend any course/training?
- No. Self-learn, read and practice. Don't waste your time on certifications.

5. Where do I get more info on ethical hacking for money?
- Google Bug bounties and how to get started. hacker101.com, pentesterlab.com, pentesteracademy.com, hackerone.com, bugcrowd.com, etc.

6. How soon can I start making $$$?
- It took me over a year to earn my first paid bounty, and I am a full-time cyber security advisor specializing in penetration testing; I only do bug bounties for extra cash, learning and fun. So be warned, it is NOT easy. You are competing with thousands of talented hackers around the world. Bug bounties are basically crowdsourcing, the same concept as Uber, Grab, etc. There is nothing underground about hacking; Hollywood made up loads of sh** about it.

7. Do I need to be a skilled programmer/coder?
- It helps, but it's not necessary. You must know how to read and modify code in PHP, bash, Perl and Python, and you must be competent in Linux & Windows.

8. Besides making extra $$, what other benefits of participating in BB?
- Supernatural powers and the ability to leap over buildings. With great power comes great responsibility: if you are a budding security enthusiast or a security pro, BB will help you truly understand how web breaches happen. You will understand how hackers really operate and gain the hands-on skills to prevent, mitigate and reproduce web security breaches. Too many IT security pros in the market today have only theoretical knowledge of how a breach really happens; only a handful of people dedicate their time and effort to learning the skills of the dark side.

Hit me up on twitter if you have more questions @r00tpgp



Asset Recon Tools

A list of tools that I use to discover a target's assets: subdomains and endpoints.

1. Nahamsec crt.sh


#!/bin/bash
curl -s https://crt.sh\?q\=%.$1\&output\=json | jq -r '.[].name_value' | sed 's/\*\.//g' | sed '/@/d' | sort -u

Save it as crt.sh and make it executable (chmod +x crt.sh).

Usage:

$ crt.sh domain.com

2. Tomnomnom assetfinder


$ assetfinder -subs-only disney.com
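
If you want a single deduplicated subdomain list from both sources, the two can be combined. A minimal sketch, assuming the crt.sh script above is in your PATH (subs.txt is just an example name):

$ { crt.sh disney.com; assetfinder -subs-only disney.com; } | sort -u > subs.txt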

3. Tomnomnom waybackurls


$ echo disney.com | waybackurls
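
The wayback output can get huge, so it helps to grep for the file types you care about. A rough sketch (the extension list is only an example):

$ echo disney.com | waybackurls | grep -iE '\.(php|aspx?|jsp|json)(\?|$)' | sort -u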

4. Maaaaz webscreenshot


$ python webscreenshot.py -i in.txt -o out.txt

OR

Michenriksen aquatone


Download the binary version here and copy it into /usr/local/bin.

Usage:

$ cat subs.txt | aquatone


5. Tomnomnom meg


E.g. to look for Citrix ADC endpoints:

$ meg '/*/vpns/cfg/smb.conf' meg-targets-in.txt meg-targets-out.txt


Installation


$ GO111MODULE=on go get -u github.com/tomnomnom/httprobe

$ cd ~/go/bin/

$ ls
assetfinder  gf  httprobe  meg  waybackurls

$ sudo cp * /usr/local/bin
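
The go get line above only pulls httprobe. The other binaries listed come from the same author's repos, so assuming they all install the same way, a loop like this should fetch the lot:

$ for t in assetfinder httprobe meg waybackurls gf; do GO111MODULE=on go get -u github.com/tomnomnom/$t; done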





Saturday, March 14, 2020

Cross Domain Referer Leaks

For those of you who are into Bug Bounties, you will know that Cross Domain Referer leaks are commonly reported bugs. I have had much luck reporting such bugs on HackerOne. However, don't you just hate it when the triagers close your bugs out as "Informative"?



For the life of me, I couldn't understand what that meant until I examined the output closely in Burp. When you have a Cross Domain Referer leak, it means the target is leaking URLs containing sensitive information such as password reset tokens, private invites, email addresses, etc. to a third-party domain via the Referer header.



Since most leaks go to analytics domains, always check the output of adjacent requests to ensure that the following are not present:

1. JavaScript/DOM output that deliberately sends data to the analytics domain.
2. CSP (Content Security Policy) trust between the target and the analytics domain.

If neither of those is present, it's safe to say you have a valid bug. If the leak goes to Google Analytics or DoubleClick (also owned by Google), there is a high probability the bug will be closed as Informative, because the victim site has already established trust with those domains, so any leaks to them are considered an acceptable risk.
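
One quick sanity check before reporting is to look at the response headers of the leaking page; if a strict Referrer-Policy or a CSP that trusts the analytics domain is already in place, expect an Informative close. A minimal sketch (target.example and the path are placeholders):

$ curl -sI https://target.example/password-reset | grep -iE 'referrer-policy|content-security-policy'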

Good luck in bug hunting boys!


Wednesday, January 15, 2020

Subdomain Recon Using Certificate Search Technique

Introduction

This enumeration technique was taken from Nahamsec's recon video. It involves using the following tools:

1. crt.sh
2. jq
3. httprobe
4. meg
5. aquatone or webscreenshot. I used the latter.

These techniques involve using the tools above in a particular order. First, crt.sh can be queried with a script or directly from its website; the result comes back as JSON output. That's where jq comes in, to parse the output into a readable format. You can combine everything into a one-liner script, which I have included in the steps below.

Meg is used to force browse for the specific paths you are looking for, while aquatone or webscreenshot is used to index and screenshot the websites that were enumerated.

Steps

1. Save the following curl statement into an executable script, e.g. crt.sh:

#!/bin/bash

curl -s https://crt.sh\?q\=%.$1\&output\=json | jq -r '.[].name_value' | sed 's/\*\.//g' | sed '/@/d' | sort -u


2. Run the script against the target:

$ crt.sh yahoo.com | tee out.txt | more

2013-en-imagenes.es.yahoo.com
3arrebni.yahoo.com
7-eleven.yahoo.com
a10.go.yahoo.com
a1.go.yahoo.com
a.analytics.yahoo.com
aat.answers.yahoo.com
ab.login.cn.yahoo.com
ab.login.yahoo.com
absnd.login.yahoo.com
ac4-as-cas01.ds.corp.yahoo.com
ac4-as-cas02.ds.corp.yahoo.com
ac4-as-isa01.ds.corp.yahoo.com
ac4-as-isa02.ds.corp.yahoo.com
academy.cc.corp.yahoo.com
academy-delivery.cc.corp.yahoo.com
academy-delivery-stage.cc.corp.yahoo.com
academy-rm.cc.corp.yahoo.com
academy-stage.cc.corp.yahoo.com
academy-stage-rm.cc.corp.yahoo.com
academy-stg.cc.corp.yahoo.com
accountlink.www.yahoo.com
accountlink.yahoo.com
--More--


3. Now check which hosts actually have HTTP or HTTPS running, for example with a quick Nmap scan of ports 80 and 443:

$ nmap -iL out.txt -p80,443 -oG out-nmap.txt
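
httprobe (used in the next step) expects a plain list of hostnames, so the grepable Nmap output needs trimming first. A rough sketch, assuming the standard -oG field layout (out-hosts.txt is just an example name; adjust the awk field if your output differs):

$ grep '/open/' out-nmap.txt | awk '{print $3}' | tr -d '()' | sed '/^$/d' | sort -u > out-hosts.txt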

4. Run the cleaned-up host list through httprobe, which probes each host and prepends http:// or https:// to the ones that respond, and save the result for the meg step:

$ cat out-hosts.txt | httprobe | tee out-format.txt

http://2013-en-imagenes.es.yahoo.com
https://2013-en-imagenes.es.yahoo.com
http://accountlink.www.yahoo.com
http://accountlink.yahoo.com
https://accountlink.www.yahoo.com
https://accountlink.yahoo.com
--More--

5. Use meg to force browse to the endpoint path you are looking for; in this case we are hunting for the infamous Citrix NetScaler bug CVE-2019-19781. Don't forget the leading '/' in the search pattern:

$ meg '/*/vpns/cfg/smb.conf' out-format.txt out-dir

6. A directory called out-dir will be created, and inside it is an index file listing each saved response along with its URL and status code. Here is a sample from a run where I searched for /index.html:

/tmp/out-dir$ cat index 
out-dir/media-router-fp2.prod1.media.vip.sg3.yahoo.com/3994ddc095f04708e6a476c7dbda6808cbcdbb8b https://media-router-fp2.prod1.media.vip.sg3.yahoo.com/index.html (301 Moved Permanently)
out-dir/o2.ycpi.vip.sg3.yahoo.com/a1426f50f9c785e28e51de89ba4d52d3e5ff014f https://o2.ycpi.vip.sg3.yahoo.com/index.html (404 Not Found on Accelerator)
out-dir/w2.src1.vip.sg3.yahoo.com/f6f9173b70df41d4ac99e337fdfd137fe4a21826 https://w2.src1.vip.sg3.yahoo.com/index.html (200 OK)
out-dir/e1.ycpi.vip.sgb.yahoo.com/b8eefa16ab7eaa1a3fa6099a5a8988dad15f2c91 https://e1.ycpi.vip.sgb.yahoo.com/index.html (404 Not Found on Accelerator)
out-dir/ats1.l7.search.vip.sg3.yahoo.com/a900373dc6d4aee1f0b10fbd26d5969287737970 https://ats1.l7.search.vip.sg3.yahoo.com/index.html (404 Not Found)
out-dir/media-router-omega1.prod.media.vip.gq1.yahoo.com/c6aef34813955ecc248a8b2401b60f1e30b7e766 https://media-router-omega1.prod.media.vip.gq1.yahoo.com/index.html (404 Not Found)
out-dir/w2.src1.vip.sg3.yahoo.com/a831e26d93084d96171fce67ed4aa3f5119c990d https://w2.src1.vip.sg3.yahoo.com/index.html (200 OK)
out-dir/w2.src1.vip.sg3.yahoo.com/0ed90c45f0ae3107247eaf312b35906dc2b946a0 https://w2.src1.vip.sg3.yahoo.com/index.html (200 OK)
out-dir/w2.src1.vip.sg3.yahoo.com/1eddab1986d57d5464cf60695d5efae6cf482400 https://w2.src1.vip.sg3.yahoo.com/index.html (200 OK)
out-dir/w2.src1.vip.sg3.yahoo.com/94edbc7e3226e5496b04039e3f26329eb5fb67d5 https://w2.src1.vip.sg3.yahoo.com/index.html (200 OK)


7. Lastly, examine the meg output for interesting responses, grep for the '200 OK' responses, and use webscreenshot or aquatone to create a curated snapshot of each result. I won't document this step in detail since it is rather straightforward.
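
That said, for reference, the grep-and-screenshot part can look something like this (the awk field matches the index format shown above; aquatone is just one option):

$ grep '200 OK' out-dir/index | awk '{print $2}' | sort -u | aquatone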

Summary

The technique above can be used to enumerate HTTP or HTTPS endpoints for specific paths or files. You could also force browse with a dictionary using gobuster or dirb, but that takes more time; if you know the exact files or directories, this technique is faster. Either way, you can still point your favourite tools at the host list you build from the crt.sh output. The point here is to scrape subdomains from certificate search.
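
If you do go the dictionary route, the probed URL list from httprobe feeds straight into gobuster or dirb. A quick sketch (wordlist.txt is a placeholder, and gobuster v3 syntax is assumed):

$ for u in $(cat out-format.txt); do gobuster dir -u $u -w wordlist.txt; done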