On Red Teaming
One of Advanced Capabilities Group’s primary offerings is Red Teaming. It’s a service not everyone is familiar with, even among those focused on security, and there’s a wide variety of perspectives on what constitutes it. Our perspective is deeply rooted in the adversarial mindset and shaped by over 30 years of hands-on experience. We thought it was important to give you an idea of where we stand on Red Teaming and its invaluable role in building a holistic security program.
Today’s adversaries don’t play by any rules. They constantly adapt and learn from failure, and the complexity of their tactics and thinking is ever increasing. Whether nation-sponsored, criminal, or simply opportunistic, this new breed of attacker isn’t bogged down trying to exploit the usual suspects (firewalls, web servers, email servers, etc.). They’re not wasting time thinking about the security checklists, policies, and procedures you’ve painstakingly developed to thwart them. They’re happy to go around, under, or over them, uncovering weak links wherever possible.
One of the most often exploited weak links is the human one. That human risk can come from both outsider and insider threats, including your supply chain. The question then becomes not only whether you know your adversary, but whether your partners, suppliers, and vendors know them as well. Do they know theirs? How frequently are they doing security assessments? It’s a situation that demands frequent testing.
Red Teaming is a necessary component of an effective security strategy facing today’s realities and the modern adversary. A Red Team is a friendly force that plays the role of an advanced adversary to uncover those weaknesses before a real attacker does. The broad and dynamic experience of our Red Team members makes it a unique and useful asset for organizations looking to embrace the value and security benefits of an adversarial mindset.
Many security professionals and penetration testers rely on standards, checklists, and certifications, and in doing so cover many of the basics. A Red Team goes well beyond this, looking at the problem from different sides, though still constrained by the rules of engagement the hiring organization puts in place.
Our team takes this one step further. We adopt and play by the same rules as the adversary; in effect, we play with no rules at all. We learn the organization and its weaknesses, identify its worst-case scenario, and bring that scenario to reality. We identify what works and the vulnerable areas that need to be strengthened. Think of it like a house. Checking the standards and checklists is like making sure your windows and doors are locked (the basics); a good Red Team says, “you know, someone could just cut right through that wall, maybe we should reinforce that too.” That’s how today’s adversary thinks. They aren’t restricted to defeating locks, deadbolts, and alarm systems; they’ll come in however they can. They only need to succeed once to win.
The reality is that today’s adversary never takes a break from trying to thwart your security. To be most effective, a holistic security program can’t take a break either. Red Teaming has to be a constant tool, factored into any organization’s security planning. Think of a Red Team as the special forces of your security team: they’re trained to think non-traditionally, given greater autonomy, and have capabilities superior to those of regular forces. In this context, it’s worth remembering one of the five Special Operations Forces “Truths”: “Competent Special Operations Forces cannot be created after emergencies occur.” A Red Team is no different. You can’t hire or create one once your security is already compromised. If you become static and stop checking, moving, developing, updating, and performing the next round of security assessments, including Red Teaming, your adversaries will exploit your complacency.
The real world follows its own chaotic rules, or lack thereof. If you plan for it once, set your defenses, and call it a day, the real world can be incredibly unforgiving. We believe in Red Teaming that acts accordingly.