From the course: Offensive Penetration Testing

Web application enumeration

Web application penetration testing: web application enumeration. Our learning objectives are to understand what tools are available to enumerate web servers, describe the limitations of these scanners, know what directory brute-forcing is and how it can aid in website enumeration, and explain why manual enumeration of web applications is important. Do you recognize that picture? It's of our friend WordPress, which we'll get a lot more familiar with in the upcoming labs. In full disclosure, when I saw a web server in PWK or OSCP, I got nervous, because our attack surface just grew exponentially. There could be various pages in there, there could be admin accounts, there could be user accounts. Maybe I have to register for something to gain access to it. It introduces a whole new level of complexity and a larger attack surface. Now this is my bread and butter, and really it's just a matter of breaking things down into smaller, more digestible pieces. And again, the more reps you do and the more experience you have with web applications, the less scary it becomes.

You'll also notice there are all these content management systems: WordPress, Drupal, Joomla. What makes them interesting is that they have a large attack surface as well, just because so many people use them. You'll notice in the news it will say this plugin for WordPress is vulnerable, and now five million sites are vulnerable to an unrestricted file upload bypass vulnerability. Because so many people use these applications, so many people are looking for zero-days in them to exploit. That makes for a larger attack surface there. Also, if you're doing bug bounties now, or if you're a web application security engineer or pen tester, your mindset needs to change. It doesn't matter if the secure flag isn't set on a cookie. It doesn't matter if there's a CSRF vulnerability, because it's a client-side attack, if no one's there to be exploited.
As I said, we should focus on server-side attacks. I know there are client-side attacks like cross-site scripting and CSRF, but really we want to focus on the server-side attacks. So get out of the bug bounty mindset of finding specific vulnerabilities in a web application. If you want to do that, I recommend the eWPT exam, because that is very specific to web applications.

So, enumeration with our friend Nmap yet again. Keep in mind that your Nmap scan is probably the first time you're going to observe a web server, and don't assume it's on port 80 or 443. It can also be on ports like 8080 or 8443. Also understand whether it's HTTP or HTTPS. I've seen a lot of junior pen testers try a port using HTTP when they should be using HTTPS, a different protocol, and they can't connect to the site. So play around with both protocols on those different ports and see if the response changes depending on whether it's HTTP or HTTPS. We can also refine our Nmap scan with the vuln script. I talked about the vuln script with Nmap in the very beginning. It has a very robust output. It's not as good as something like Burp Suite's active scan (and of course you can't use Burp Suite Pro in OSCP), but it's a fast vulnerability scanner for web applications. Also, we should get output in Nmap of the version of Apache or Nginx or IIS. Is that version vulnerable? So check that first when you start enumerating these web servers. You'll see here the output from an Nmap scan with the vuln script in it. It already gives us a lot of good information. I was making fun of the HttpOnly flag not being set on cookies, but here you go, right here. Or CSRF, here you go again. Also, let's look at the bottom: we can see a possible SQL injection. That's great for us, and maybe a way of enumerating usernames and passwords, or maybe even getting a shell on the box.
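As a rough sketch of what those Nmap passes might look like, here are the two commands described above; the target IP is a placeholder I've made up for your lab host, not one from the course:

```shell
# Placeholder lab target (assumption: substitute your own host).
TARGET=10.11.1.71

# Version detection on the common web ports, so we learn whether the
# server is Apache/Nginx/IIS and which ports answer HTTP vs. HTTPS:
nmap -sV -p 80,443,8080,8443 "$TARGET"

# Refine with the vuln NSE script category for a quick vulnerability pass:
nmap -sV --script vuln -p 80,443,8080,8443 "$TARGET"
```

The second scan takes noticeably longer; running plain version detection first gives you something to read while the vuln scripts work.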
We're also trying to look for different folders and directories on the web server, and Nikto does that kind of thing too. This is a great scanner. Chris Sullo is a friend, and I've spoken to him a lot about Nikto and how it saved me in OSCP. There's a lot of output with Nikto, and there are also a lot of false positives, but test everything, because the one thing buried in that output might be what gets you a shell in OSCP. I'm just saying. Again, I think Nikto is a great tool to use on web servers. If the server is not on port 80, you have to specify the port, whether it's 443 or 8080. So keep that in mind.

Directory brute-forcing. Why do we do directory brute-forcing? Well, a tool like Burp Suite, which we'll talk about later, is not brute-forcing directories for you; it's not figuring out what directories exist. We have to use other tools for that. And if you're not brute-forcing directories, you may miss an admin directory or some juicy directory with information that helps us enumerate the server or its vulnerabilities. So I always suggest running a directory brute-force against a web server to see what is on it. It finds things through response codes, which we'll also talk about. It will look at a response code like 200, meaning you can reach that web page, or 404, meaning the page doesn't exist, and it will tell you whether the page exists or not: on a 200 you'll get output, on a 404 you won't. It doesn't find vulnerabilities. A directory may have a vulnerability in it, but a directory brute-forcer is not going to say, hey, look at this, this is what's vulnerable. Dirb is a fast and dirty directory brute-forcer. I really like it. It goes really, really fast, and as you can see here, it gives you the response codes. It tells you whether there's a 200 or a 403.
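To make those two tools concrete, here is a minimal sketch of typical Nikto and Dirb invocations; the host and port are placeholders of mine, not values from the course:

```shell
# Nikto against a web server; specify the port if it isn't 80
# (assumption: 10.11.1.71 on 8080 stands in for your lab target).
nikto -h http://10.11.1.71 -p 8080

# Dirb with its built-in default wordlist; it prints response codes
# and flags listable directories as it runs:
dirb http://10.11.1.71/
```

Both tools write a lot of output, so redirecting to a file with `tee` is worth the habit when you need to grep through results later.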
It also tells you if a directory is listable, meaning you can look at every single file in that directory. So I really like running this first, because the other two brute-forcers take a lot more time. It's installed by default in Kali. You can specify a wordlist, and it has a default wordlist built in, but there's a wordlist I like using, which we'll see with Dirbuster. You can see here I like using directory-list-2.3-medium.txt. I always like using that one, whether it's with Dirbuster or Gobuster, which we'll look at next. Dirbuster, OWASP's directory brute-forcer, is written in Java. You can specify which wordlist you want to use, and you can specify the file extension. That's why it's important to enumerate the technology: whether the server is using PHP or ASP, whether it has text files on it, whether it's using Perl, maybe there's CGI in there. So I would definitely recommend knowing what file extensions to specify, or trying a few different ones. It's also already installed in Kali. The one that's not is Gobuster, and Gobuster is probably my favorite. Gobuster is also a command-line tool, and it goes very fast. As you can see here, I'm specifying that wordlist again (actually, I'm specifying a different one, but my favorite is that medium wordlist). You can also specify extensions; I have php, txt, and html, and it's giving me the response codes from the server as well. It's written in Go. I really, really like this one. Download it and give it a try. I also like to use Wappalyzer, so get that extension, because it enumerates website technology very quickly. From this, you can already tell it's WordPress, and you can already tell there's a MySQL database there. It doesn't show the version of Apache, but we know we're working with Apache. So I really like using Wappalyzer. Also, don't discount manual enumeration. Tools are great, but know how to use manual enumeration to figure out what's going on.
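A sketch of the Gobuster run described above, with the medium wordlist and the php, txt, and html extensions; the target URL is a placeholder of mine:

```shell
# Gobuster in directory mode (assumption: 10.11.1.71 is your lab host).
# -w points at the medium wordlist shipped with Kali's dirbuster package,
# -x appends the extensions to every word in the list:
gobuster dir -u http://10.11.1.71/ \
  -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt \
  -x php,txt,html
```

Note that adding extensions multiplies the number of requests (one per extension per word), which is the trade-off for catching files like admin.php or notes.txt that a bare directory list would miss.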
Always look at the robots.txt file and what it says is disallowed. That's the juicy information to us as hackers, right? It's disallowed, so we want to look at it. Also, view the source of the page: right-click, View Source. Maybe there are meta tags, maybe there are comments in there. Maybe the developer left a comment like "still have to build out this directory," and yet we can surf to that directory. Or maybe something is grayed out on the page but it's still in the source, like a certain directory, and we can still surf to it. So always look at the source of the page, which you can also do using cURL. With the cURL command, you're just looking at the source of the page, and you may see things you'd miss when looking at the beautifully rendered page in a browser. Also, look at cookies. If there's a PHPSESSID, you can be pretty sure it's running PHP. If it's ASPSESSID, you can be pretty sure an ASP shell upload like we saw will work for us too. Also, look at the network tab, refresh the page, and look at all the connections the website makes. Maybe it will give us other directories and other files we should be looking at.

Here's your hands-on quiz. In an environment where you're allowed to do this, run Dirb, run Dirbuster, run Gobuster, and see which one you like and which one you don't, but try to compare them. And also use Nmap with that vuln script and see what you find.

So I'm going to give a quick demo. I'd like to show you all the tools I was just talking about. Here's this website, my awesome photoblog. This is great. I ran Nmap with -sV and -sC, and I ran the vuln script. As you can see here, it runs a whole lot of different scripts. Cookie flags: like I said, don't worry about that one. CSRF is client-side, so I'm not really worried about that either. This is enumerating directories; this might be interesting.
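The manual checks above can all be done from the command line with cURL; this is a sketch with a placeholder target, not commands from the course:

```shell
# See what robots.txt asks crawlers not to index; disallowed paths
# are exactly where we want to look (placeholder target):
curl http://10.11.1.71/robots.txt

# Fetch the raw page source: comments, meta tags, and grayed-out
# links all show up here even if the browser hides them:
curl http://10.11.1.71/

# -I fetches headers only; a Set-Cookie of PHPSESSID is a strong
# hint the site runs PHP, ASPSESSID points at ASP:
curl -I http://10.11.1.71/
```

Piping the first two through `grep -i` for strings like "disallow" or "<!--" is a quick way to pull the interesting lines out of a long page.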
A possible admin folder, robots.txt, and all these other interesting folders I should be looking at. Also an internal IP, which might be interesting, but it's localhost. A version of Apache, maybe that's vulnerable, and all these possible SQL injections. That's interesting too. Also, I want to run Nikto, and you can see a whole lot of output. I can tell you off the bat that this OSVDB-5034 is a false positive. But it gave us a little more information about this localhost IP. It says, "The server may reveal its internal or real IP in the Location header via a request to /images over HTTP/1.0." If we cURL /images, it doesn't give us that; it gives us 192.168.1.52. That's because cURL is using HTTP/1.1. So how do we downgrade to 1.0? We can use Netcat: -v for verbose, port 80. You'll press Enter and you won't see anything. What you want to do is type a GET request, GET / HTTP/1.0, then press Enter, and Enter again. You know what I forgot? I forgot to say /images. But you'll see the output of the page, and you'll also see the super secret hidden directory. Well, that's interesting. Let me go back and try that again: GET /images HTTP/1.0, Enter, Enter. Now we see the 127.0.0.1 localhost on port 80, because we downgraded our version of HTTP to 1.0 there. What we did see using the cURL command was this super hidden directory at /secret. But like I told you, I always like to look at robots.txt, so here we are: we find flag number 1. We go here and, you see, we found flag number 1. Now let's go to this super secret directory: copy that, add it here, and we can see flag 2. Also, I talked about Dirb, the quick and dirty one. You'll see how fast this thing is. Really, really fast. It tells you the response code. This is interesting: bash history. Maybe I can see the bash history of the server, which would be information disclosure.
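The Netcat downgrade above can be scripted instead of typed interactively. This sketch just builds and prints the exact raw request; the pipe into nc is left commented out because it needs the lab host (192.168.1.52 comes from the demo):

```shell
# Print the raw HTTP/1.0 request bytes; pressing Enter twice in the
# interactive session is the same as the blank line after the request:
printf 'GET /images HTTP/1.0\r\n\r\n'

# To actually send it, pipe the same bytes into Netcat (lab host):
# printf 'GET /images HTTP/1.0\r\n\r\n' | nc -v 192.168.1.52 80
```

Unlike HTTP/1.1, a 1.0 request needs no Host header, which is exactly why it coaxes the server into revealing the internal IP in its Location header.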
So I do see, in fact, that I'm able to read the bash history of what might be the www-data user, the process the server runs as. I don't see anything sensitive in here, but again, that should not be accessible. And it is here, and I only found it by using Dirb. Also, here are the listable directories if we go to them. Admin uploads is always interesting, right? I can see all these different images here. So just running that fast and dirty scan with Dirb, I got a whole lot of good information. Try Gobuster and try Dirbuster, and see if you like those as well.

So in summary, we should now understand what tools are available to enumerate web servers. We've described the limitations of scanners. We know what directory brute-forcing is and how it can aid in website enumeration. And we've explained why manual enumeration of web applications is important.
