This website focuses on raising awareness of Red Teaming and the adversarial mindset. It presents examples, concepts, ideas, and tips. All information is based on experience, past projects, and lessons learned by doing. Nothing here is theory; we write about what works for us.
You will NOT find here full techniques, exploits, "hacking" tips, or anything else that might aid an attacker.
A real Red Team is a group of highly skilled professionals who mimic and simulate a real adversary. This team will research, investigate, learn, and become the adversaries they are trying to portray.
By consciously working to assume the adversary's role, that of a "thinking enemy", a Red Team can provide a realistic view of how a new idea, plan, or approach will perform in the real world. It helps leaders understand and address risk in every aspect of the business by providing unorthodox views on problems and their solutions.
Adversaries use a broad spectrum of tools and tactics to compromise security. A Red Team does the same. Red Teams test the entire security posture of the organization: physical, digital, and social, using any and all techniques. When used effectively, a Red Team doesn't just help security organizations find vulnerabilities in their environments; it can also help organizations prove the need for changes in plans and strategies. Red Teaming mimics the tactics, techniques, and procedures of real attackers: the organization or company as a whole is studied and analyzed, not just the area scoped for testing, as in a typical pentesting engagement.
One of the biggest benefits of understanding how the adversary plans is that it helps you not only to be prepared, but also to look inward and see whether the chosen solutions and controls would work. Attacks don’t simply come out of nowhere, because attackers don’t simply attack. Adversarial actions are the result of planning. Understand this planning, this mindset, and you'll be able to understand yourself better as well.
A Red Team also provides alternative and adversarial analysis of plans, operational orders, and tactical decisions. Like an adversary, the team identifies patterns that lead to vulnerabilities in the strategy, and often exposes alternative ways to examine the breaking point of policies and plans.
Adversaries don’t play by any rules. Attackers adapt and learn from their failures. A good Red Team therefore has to adapt and play by the same rules as the adversary; in other words, no rules. Red Teams can solve problems through an indirect and creative approach, using reasoning that is not immediately obvious and involving ideas that may not be reachable through traditional step-by-step logic alone.
Red Team members think outside the box. They look at a problem from multiple perspectives at the same time, often probing the different sides of a problem - or solution - that were never considered. Red Teams recognize contingencies and bring them to the forefront of analysis by asking the right questions and challenging underlying assumptions.