If you are looking for the presentation slides, here are some of them:
And here are the ones from the TAD Red Team Mindset course in February 2015.
Here are some of the projects and operations the Team did in the past. It's a good sample of the different things the Team engages in.
- Chasing the Ghost in the Machine
- Rappelling off a Roof
- Don't get caught
- When you do it right
- The Hole in the Wall
- Moving inside
- Red Team Support in the War on Terrorism Part 1, and Part 2
And some about Gear too
This is a collection of articles from the blog that touch on the mindset. The most recent posts appear first in the list.
The first thing that becomes clear once you begin adding Red Teaming to your security planning is that a good and capable defense can only be established once you know how it will be attacked. In other words, rely only on the standards or on the checklists of certifications and you'll cover some basics. Actively test those standards and checklists and you'll be able to identify what actually works and what needs to be strengthened.
One of the most important things you can do when you have a plan is to make sure, to the best of your ability, that it will survive Mr. Murphy. We've talked about this many times on the blog, but here's a small brain dump of what Red Teaming the plans would look like. Your mileage may vary, though, depending on the plan.
Three steps that can be adapted to any situation: digital, physical, personal, you name it.
Planning can be overwhelming. A lot can go wrong during this critical stage of an assessment or operation. I've written about this before, but I thought a simple, deeper post was needed.
In Aikido you are always training, always discovering new things about yourself and about your possible opponents. Over the years, the different Sensei (plural) I've had the privilege of training under mentioned different Aikido Principles. Some resonated with me, and I can see how you could also apply them to Red Teaming and to security in general. Bear with me, please, while I try to make sense of this.
Earlier this morning I had another very interesting conversation with the same person I spoke with a month ago. This time the conversation centered around the topic of hackers and their motivations.
Yesterday I had a very interesting discussion with the security director of a large corporation. He began making changes to the way they handle corporate security after having two Red Team assessments done in the past year. The conversation centered around the way I think about security and the way the majority of today's organizations handle their IT and Security departments. He was really interested in knowing our (the Team's) opinion about his new approach.
The real world is more complex than your testing lab. The real world doesn't obey the rules you impose. The real world is not a vacuum, like most security certifications will have you believe. The real world behaves following its own chaotic rules, or lack thereof. If you try to plan for it, set your defenses only once and call it a day, the real world will eat you alive.
If you become static and stop checking, moving, developing, updating and performing the next round of security assessments (yes, this includes Red Teaming), then your adversaries will exploit this.
Red Teaming has to happen constantly; it has to be part of the security planning of any organization. Think about one of the Rules of SOF: competent Special Operations Forces cannot be created after emergencies occur.
Most of you are familiar with the Rules. As you know, I originally had 12 rules. It all started with a joke: when in doubt, red team it.
Several readers asked me about the idea behind them and how I chose those specific rules, so here it is: the original 12 explained.
It is important to understand that in the majority of cases, organizations are not prepared to handle a Red Team assessment, either as a whole or at a product/planning level. They might request a Red Team assessment, sure, but they will fight it - especially if you find vulnerabilities and exploit them. That's basic human nature.