The Joys of Waiting for Tools Part II

Sometimes the joy of waiting for tools is that you simply can't wait. No matter how happily the tools will take care of the grunt work without getting tired or bored, sometimes you can't let them.

Time is rarely on your side when it comes to work, and sometimes the schedule just won't allow waiting for the tools. With testing like dynamic analysis of a website, the schedule can be restricted even further so you do not impact other dev or testing work. You probably do not have exclusive access to the testing environment, after all, and rapid, automated testing of a website can have a significant impact. You might be restricted to after-hours scanning. You will probably have to finish the testing by a certain date, and maybe the tool just can't meet that deadline given the other constraints.

As good as the software security tools can be, they still lack a lot of smarts and judgement. We humans know when we are approaching a deadline and adjust our approach accordingly. We can change priorities. We can say we've seen enough of this and decide to concentrate on that. We can decide that if a certain type of testing is producing a lot of false positives, we should focus on testing that produces true positives. We can decide to call an issue systemic if we see the same result over and over and over again, so there is little reason to keep testing for it when we need to spend the time we have elsewhere.

The tools can’t do that. They just chug merrily along doing the drudge work. The same thing over and over and over no matter how long it takes. The very thing that makes them great at doing that tedious stuff can make them very bad at meeting a deadline.

So we have to do it for them. We have to provide that judgement for the tools.

We know the schedule and need to judge how the tool is progressing against that schedule. If it isn't going to make it, we need to make the changes the tool can't make on its own. Lots of false positives on one test? Turn it off. Lots of true positives on a particular test? Call it a systemic problem and stop testing for it so there is time for other tests. Have a web page with a whole lot of fields on it to be tested? Exclude it from the main test and do it separately so that one page doesn't consume too much of the testing time. Maybe the website is on an under-powered server and a slow connection, and reducing the number of testing threads may actually speed things up. If the tool seems to think its session has expired and keeps logging on, maybe its in-session detection relies on a part of the GUI that the testing bypasses, and you need to switch to out-of-session detection so it stops wasting time logging in when it doesn't have to.
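None of that requires anything exotic from the scanner; it mostly comes down to a handful of knobs. Here is a rough sketch of the kinds of adjustments I mean, written as a made-up Python settings dictionary rather than any particular tool's actual options:

    # Hypothetical scan settings -- the option names are illustrative,
    # not any real scanner's API. The point is which knobs to turn when
    # the clock is running out.
    scan_config = {
        "threads": 2,                            # fewer threads for a slow server or link
        "excluded_pages": ["/giant_form_page"],  # test the huge form separately
        "disabled_tests": ["noisy_test"],        # too many false positives to be worth it
        "skipped_tests": ["systemic_issue"],     # confirmed everywhere already, call it systemic
        "session_detection": "out_of_session",   # stop re-logging-in when the session is fine
    }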

Sometimes we do not have the luxury of waiting for the tools to do their jobs. The tools don't get bored, so they don't watch the clock. We can and we must. We know how to take shortcuts and can make the judgement call when we know it is necessary. We poor, easily bored humans need to provide the judgement the tools can't.

Dual Monitor Linux Virtual Machine Strangeness

VirtualBox's seamless mode is a pretty neat way to work with a virtual machine and a great way of working with two different operating systems at the same time in an almost seamless way. Windows from both the host machine and the virtual machines can exist side by side almost as if they were on the same machine. No more working with the virtual machine in its own isolated window. It does require having the VirtualBox Guest Additions installed in the virtual machine, but once they are installed you are ready for seamless mode. To get to seamless mode you press Host+L, which is Right Ctrl+L if you have not set a different host key.

On a single monitor things are pretty simple and seamless mode mostly just works. On a dual monitor system you can get some strangeness.

First, I am working with a Windows 7 host with Kali Linux and Samurai WTF virtual machines for security testing. I have not tried other Linux distributions this way yet but if there are issues there the fixes may be similar.

Setting up a VM to use dual monitors is simple using the VM's display settings. Set the monitor count to 2 and increase the video memory to at least an acceptable amount, if not more like I have. VirtualBox will happily let you set the memory too low, so watch this!
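For what it's worth, the same settings can be applied from the command line with VBoxManage instead of the GUI. A minimal sketch, wrapped in Python since that is where I tend to script things; the VM name and the 128 MB figure are just my assumptions, substitute your own:

    import subprocess

    vm_name = "Kali"  # assumed VM name; use whatever yours is called

    # Give the guest a second virtual monitor and enough video memory for it.
    # VirtualBox will accept a value that is too low, so err on the high side.
    subprocess.run(["VBoxManage", "modifyvm", vm_name,
                    "--monitorcount", "2",
                    "--vram", "128"],
                   check=True)

The VM needs to be powered off when you change these settings.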

Burp Suite Tutorials

I don't recall PortSwigger's Burp Suite being around the last time I did much web application testing. It may have been, but I do not recall it and I did not use it. I am using it now, and of course that means getting to know a new tool. While I'm waiting for some of Burp's tests to run, I figured I'd give a shout-out to the best of the Burp Suite tutorials I've found out there. Security Ninja has some excellent ones.

Burp Suite Tutorial – Sequencer Tool

Burp Suite Tutorial – Intruder Tool

Burp Suite Tutorial – Repeater and Comparer Tools

Results Triage – Finding a Rhythm

I just got done triaging the results of an AppScan website scan all the way through. I've done it before, but never on a production run; those were always partial triages on training runs. After seven years of secure code review, I have triaged a lot of static analysis results from a variety of tools. Once you get to know the tool and how it spits out its results, which rules are almost always reliable, which are sometimes reliable and sometimes wrong, and which are often wrong, you can work pretty quickly. Work some hard issues, switch to some easy ones for a break, back to hard. Adjust that based on the schedule if you need to focus on the most important stuff and do not have time to do it all. For source code scanning, I have gotten pretty good at it.

This was quite a bit different. The results of a dynamic, black-box scan of a website are nothing like a static analysis scan of source code where everything is laid out before you. It's automated pen testing followed by verification, still against that black box. As I've said before, my pen testing skills are rusty from disuse, and it has been a few years since I have evaluated a web application outside of training sessions, so that is a bit rusty too. No well-established rhythm in my triaging muscle memory.

We have a technical oversight/sanity check person working with us, which is really nice in general for providing fresh eyes on things like this, but it is especially nice when you are in a new job at a new company. You are good at the way you do things, but not necessarily the way the new company does things, so having somebody around who is good at the company way helps. Since this was my first time through with the new guys, I made sure to ask questions and made sure I wasn't wandering off down the rabbit hole. I was shocked at how quickly these experienced folks said they could triage things on these web scans. As I looked at the results before me, I thought there was no way I could get close to that speed.

And I didn’t.

But between the triaging cheat sheets, with their experienced "here's where the rules are good, here's where they are prone to false positives" guidance, and just diving in, I started to find that rhythm. Figure out the tool's display and what to look at to verify a finding or dismiss it. Find the pieces to replicate the result manually. Get to know the HTML in front of you, and with the search feature set right you can weed out false positives fast in some cases, like when a custom error page is mistaken for a result. I didn't come close to the speed the old pros can manage, but I can see how they do it as I find that rhythm and scrape the rust off the lesser-used skills.
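The quick manual check is usually nothing fancy. A rough sketch of the sort of thing I mean, with a made-up URL, parameter, and error-page text standing in for whatever the scanner actually reported:

    import requests

    # Hypothetical example: pull the URL, parameter, and payload from the
    # scanner's reported finding; the marker is text from the site's own
    # custom error page.
    url = "https://test.example.com/search"
    payload = {"q": "' OR '1'='1"}
    error_marker = "We're sorry, something went wrong"

    resp = requests.get(url, params=payload, timeout=30)

    # If the "evidence" the scanner flagged is really just the friendly
    # error page, the finding is most likely a false positive.
    if error_marker in resp.text:
        print("Custom error page -- likely a false positive")
    else:
        print("Worth a closer manual look")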

Increasing Disk Size of my Kali Linux VM

I love the Internet.

I have been putting together a pen testing virtual machine lab to work through Georgia Weidman’s Penetration Testing: A Hands-On Introduction to Hacking book that just came out. It’s been years since I’ve done much of this and there are all kinds of new tools to use doing it so I want to scrape the rust off my skills. Part of the lab setup is a Kali Linux virtual machine on VirtualBox. I am used to setting up fairly lean Linux VMs to play with so I started off with just a 15 GB virtual disk and installed Kali to that disk.

Georgia has you install a number of additional tools on the Kali Linux system, and as I was installing the Android SDK and components for the mobile pen testing, I ran out of room on that virtual drive. Ugh!

I didn't want to create a second drive and I didn't want to start over, so I hit Google. It turns out VirtualBox includes a way to expand the size of its virtual disks, and Jonathan Mifsud has an excellent how-to on his blog. He is working on a Linux host and I am on a Windows host, but there wasn't much difference. Now I had a larger virtual disk.
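In case it helps anyone else, the actual resize is a one-liner with VBoxManage. A minimal sketch in Python; the disk path and the 30 GB target are just placeholders for my setup:

    import subprocess

    # Placeholder path -- use the one shown in VirtualBox's Virtual Media Manager.
    disk = r"C:\Users\me\VirtualBox VMs\Kali\Kali.vdi"

    # Grow the virtual disk to 30 GB; VBoxManage takes the new size in megabytes.
    # The VM must be powered off, and this works on VDI (and VHD) images.
    subprocess.run(["VBoxManage", "modifyhd", disk, "--resize", "30720"],
                   check=True)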

Unfortunately that new free space was not right next to the main partition, so I could not just expand the partition into the unused space. I had to move the swap partition to the end of the drive and open up the free space right next to the partition I wanted to expand. Luckily there's a good blog post by Eugene at trivialproof.blogspot.com. Since I'd already increased the size of my virtual disk, I skipped down to step 4. Booting with the Kali Linux live ISO image, I used GParted just as described and restructured my virtual disk.

Now I’ve got plenty of space and can keep on setting up the system to follow along with the book.

I’m sure there are other ways to have skinned this cat but those were the ones I found fast. Hard not to solve problems with the Internet these days if you at least know something about what you want to do.

The Joys of Waiting for Tools

Ah, the joys of waiting for tools to do their job. Set up the scan, either a static scan of an application's source code or a dynamic scan of a website, click go, and wait and wait and wait and…

If you're lucky, the progress indicator s l o w l y creeps along. And you wait and wait and wait and…

Of course you can go off and do other stuff while the computer chugs, like attending a North Alabama ISSA lunchtime meeting or writing a blog post, but you still end up coming back, looking at the progress, hoping it has moved since the last time you looked, and you wait and wait and wait and…

As tedious as that is, it's far better than the alternative. It is far more tedious to look at code line by line by line for thousands or hundreds of thousands of lines of code. Far more tedious to try to hand-jam parameter manipulation and send it all to a website over and over again. It's far less tedious to periodically check that progress bar, fingers crossed, to see if it has advanced. As much as you might be eager to get to triaging the results, letting the tool compile those results for you to look at is far less tedious than doing it all yourself. The computer doesn't get tired or bored, doesn't need coffee, and it's pretty good at grinding its way through finding things that would take us weeks or months to do. It doesn't care that the work day is over and can happily chug along overnight. (If you're lucky and it doesn't hang!) It can tirelessly keep track of a data flow from source to sink, call after call after call, through complicated call stacks. Try doing that manually for each and every input and not grow old while doing it!
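If you have never watched a data-flow trace, a toy example makes it obvious why you would not want to do it by hand. Here is a little made-up Python snippet where attacker-controlled input (the source) passes through a helper before reaching a database query (the sink); a real finding can cross dozens of calls like this:

    # Toy illustration, not real application code. `db` is assumed to be an
    # open database connection (e.g. sqlite3) and `request` a parsed request.

    def get_request_param(request):
        # Source: attacker-controlled input enters the program here.
        return request["params"]["username"]

    def build_query(name):
        # The tainted value flows through this helper unchanged.
        return "SELECT * FROM users WHERE name = '" + name + "'"

    def lookup_user(db, request):
        # Sink: the tainted string reaches the database unescaped --
        # exactly the source-to-sink path a data-flow analysis tracks.
        query = build_query(get_request_param(request))
        return db.execute(query).fetchall()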

And when the tool is done, you get to spend your time looking at interesting things and diving deep on something rather than spending your time and your customer’s money on tediously finding everything the hard way. We get to have fun, the computer gets to do the drudge work.

As fun as waiting can be, it beats the alternative.

Sigh, still scanning but at least the bar is moving. Back to waiting and waiting and…