The Red Team Mindset
Adversaries don’t play by any rules. Attackers adapt and learn from their failures. A good Red Team must therefore adapt and play by the adversary’s rules, in other words: no rules at all. Red Teams solve problems through an indirect and creative approach, using reasoning that is not immediately obvious and involving ideas that cannot be reached through traditional step-by-step logic alone.
The unique talents of Red Team members, combined with an adversarial mindset - the Red Team Mindset - make this team a unique and useful asset to organizations looking to test their security, as well as to military and law enforcement units seeking a better understanding of the enemy.
Red Team members think outside the box; they are not bound by rules. They look at a problem from multiple perspectives at once, often probing sides of a problem - or of a solution - that were never considered. Red Teams recognize contingencies and bring them to the forefront of analysis by asking the right questions and challenging underlying assumptions.
The Red Team Mindset is one that (1):
- Thinks what no one else is thinking
- Thinks and acts disruptively
- Thinks the unthinkable (Ridiculous Thinking)
- Provides unexpected solutions
- Does what no one else is doing
- Is the disruptive change
- Is the game changer
Using “Ridiculous Thinking,” the red teamer plays with ideas and extracts the pieces that have the potential to work when combined with each other. This, coupled with other alternative analysis techniques, produces a comprehensive set of potential adversarial attacks on a given concept. Possible attacks on the classic three fronts - digital, physical, and human - exploit each front’s weaknesses separately or in combination to generate scenarios where the adversaries win.
Red Teams play with situational awareness, or the lack thereof. Like an adversary, they identify patterns that link individuals to systems, systems to networks, and networks to the full target. They often expose alternative ways to probe the breaking point of policies and plans by creating false trails: they develop noisy attacks and let the target chase them, while a secondary, stealthy attack stands ready to do the real work. This disrupts the orderly way organizations plan and react to events.
Overall, understanding who the adversary is and how it might exploit weaknesses and security vulnerabilities will make any organization better prepared. Rather than merely reacting to a security event, organizations should adopt a new posture, one based on the Red Team Mindset: be proactive, think about what an attacker can exploit, and stay two or three moves ahead. Prepare and establish detection and deception measures. Make a future attack harder.
With enough visibility into what an adversary might do - their TTPs (tactics, techniques, and procedures) and motives - a much better overall defense posture can be set and a better plan, with various degrees of contingency, can be prepared. When a new challenge presents itself, the whole organization is better suited to deal with it.
Through a Red Team exercise, we can learn where the entry points are, what the weak links are, where we can improve, and where the system is lacking. We can identify the problem areas, and when the next attack happens - and it will happen - we can be better prepared for it. We can start seeing the signs and markers earlier and place better deceptions in the system (whether virtual or real). We can make it harder for the attackers. If an adversary spends time, money, and resources only to find out that we led them down a false trail, then the Red Team exercise was worth it.
Proactive defense and offense can help deter all but the most focused adversaries, and those that still insist on attacking will need to modify their plans because of our proactive approach. Red Teaming can help the defenders learn how to misguide an attacker, causing her to unknowingly provide information to the organization and its security staff. An attacker might be forced to use tools or techniques she wasn’t planning to use and, in doing so, she might be careless, handing crucial information to the security staff. This provides a deeper understanding of a hacker’s TTPs.