Red Teaming Guide | The Ministry of Defense (UK) (PDF)

This is a very good introduction to Red Teaming and its uses. Worth reading.

I like their definition of Red Teaming:

Red teaming is the independent application of a range of structured, creative and critical thinking techniques to assist the end user make a better informed decision or produce a more robust product.


Not for social

You can now follow the Red Teams Blog on Twitter.

Now, please note that this account will mostly be unmonitored, so if you send messages they might go unanswered for several days.

Regardless, news and other things that don't make it to the blog will be posted there.


Fundraising and a little something

So, we are meeting in Boston as part of the fundraising.

Now, to make this more interesting and maybe raise a little more, I decided to put my Kobold Phantom Tactical watch up as a prize for the fundraising. Since we can't use PayPal for this, I opened a donation page. This is an informal sort-of-raffle: you can donate whatever you want, from $1 up. On the day of the meetup I'll pick the winner, and if that person comes to the meetup I'll give them my watch and a patch. Otherwise I'll ship the watch and the patch (sorry, I cannot ship to certain countries; please contact me if you are outside the US).
The catch: I wrote a little script that takes the amount each person donates and converts it into a simulated number of tickets. The more you donate, the more tickets you get and the bigger your chance to win.
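For the curious, here is a minimal sketch of how a script like that might work. This is not the actual script; the donor names and amounts are hypothetical, and it simply assumes one simulated ticket per whole dollar donated.

```python
import random

def pick_winner(donations, seed=None):
    """Pick a raffle winner weighted by donation amount.

    Each whole dollar donated counts as one simulated ticket,
    so a $50 donor is 50x as likely to win as a $1 donor.
    """
    rng = random.Random(seed)
    names = list(donations)
    tickets = [int(donations[name]) for name in names]  # $1 = 1 ticket
    return rng.choices(names, weights=tickets, k=1)[0]

# Hypothetical donors and amounts, for illustration only.
donations = {"alice": 50, "bob": 5, "carol": 120}
print(pick_winner(donations))
```

Same idea as drawing one ticket out of a hat that holds as many tickets per person as dollars donated.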

The target is set at $4,000 because that's the approximate amount needed to cover the school costs for the two families.

So, please feel free to visit the donation page.

Thank you and see you in Boston.

Oh, and don't forget the challenge (at the end of the OPORD).


Quote of the day

"Proactive defense and offense can help deter all but the most focused adversaries. And those that still insist on attacking will need to modify their plans due to our proactive approach. Red Teaming can help the defenders learn how to misguide an attacker, causing her to unknowingly provide information to the organization and the security staff. An attacker might be forced to utilize tools or techniques she wasn't planning to use and, in doing so, she might be careless, providing crucial information to the security staff. This will provide a deeper understanding of a hacker's TTPs."

-- Red Teams Blog

(Yes, quoting myself... I know... But it is a good quote)

Red Team Presentation

Here's a presentation (PDF) I gave a few weeks back at a closed forum. This presentation is based on the book we are writing and provides an introduction to Red Teaming and the Team.

I hope you'll find it useful.

By the way, here are some other presentations, all PDFs.

The Devil Does Not Exist - The Role Of Deception In Cyber | Mark Mateski - Black Hat (PDF)

Here's Red Team Journal's Mark Mateski and his presentation from the 2014 Black Hat conference.

While it might be convenient to think of cyberadversaries as ones and zeros, the reality is that systems are attacked and defended by human beings. As a result, it is important to understand the role deception plays in network operations. This presentation draws upon traditional and emerging research on deception and associated game theories to help the audience understand how attackers might deceive them, how to recognize that deception, and how defenders can also deceive their attackers.

This presentation contains a wealth of Red Teaming information, adversarial ideas and misdirection techniques. It is a must-read for everyone in the field.

As always, Mark provides the best Red Teaming information out there.

Almost ready

We are almost ready with the book. We simplified everything in the current draft, from the title (it's now simply RED TEAM) to the content (we're down to only 6 chapters, from 11 in the draft of a couple of months ago).

Based on the votes from a few weeks ago, we'll publish both a hard copy (a small paperback to keep the price down) and an eBook.

I'll post more when we are ready to publish.

Here's the TOC in the meantime.



Some idiots out there have forced our hand. So, please read the terms of use for the blog.

Personally I never thought it would come down to this.


SOFREP: Red Teaming a Possible Attack on America

In this article written for SOFREP, Loren Schofield analyzes, from a Red Team point of view, the different possible scenarios in which terrorists might attack America again.

As an 18Fox, it is part of the job to understand the enemy and learn everything you can about them so you can try to predict what they will do. Part of this is playing Red Team during Course of Action (COA) development, and some of it comes into play during the planning for the mission.

In preparing the CONOP and planning, one of the 18F's duties is to come up with the enemy's Most Likely COA (MLCOA) and Most Dangerous COA (MDCOA). It could focus on a specific target that you are going after, the TTPs that have been historically utilized, location, personal knowledge and numerous other factors. The MLCOA is usually not as dangerous as the MDCOA, which, by its nature, usually takes the "worst case" option.

He called me when he had the first draft of the article and we had a very interesting discussion. He wanted to know my opinion about his conclusion, and see if I could add some more given my experience in the Middle East, with counter-terrorism and as a Red Teamer.
I agree with his assessment, and I mentioned other scenarios as well, which he factored into his article. He wrote:

I would like to consider these two COAs on what an attack in the United States by Islamic Militants would look like. The point of this article is not to say they are coming right now, or that any of these attacks will certainly take place. This is simply an analysis of what a potential attack would look like, period.

My own assessment runs more along the lines of his MDCOA given the nature of the current terrorists. These are not old school, give-me-a-rifle-I-will-die-for-religion terrorists. No. We are seeing a new breed. Current groups are more organized, better funded, technologically savvy and trained. They have a different, more extreme agenda.

Yes, Loren's MDCOA is scary, but not as scary as my own MDCOA.

Go read the article; it is very interesting.