Question from a Reader: Building a Red Team

Question:

How would you build a red team? What positions would you create?

Dan and I talked about this on Episode 3 of the Red Team Podcast, but maybe this question warrants going a little deeper.

The Red Team

A good Red Team, as we think of it, is usually composed of two very distinct sub-teams: the Operational Team and the Support Team.

The Operational Team is usually forward deployed: performing physical reconnaissance or open-source intelligence gathering, actively trying to get into things, or working the social engineering angle on the phone. They are the people who learn the target, research the possible adversaries, help identify the vulnerabilities, and define the plan of action.
The Support Team, on the other hand, usually stays back at the office: listening for shells coming back, monitoring the radio, providing access and intelligence to the Operational Team, and coordinating with the customer if needed.
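
For readers wondering what "listening for shells coming back" looks like in practice, here is a minimal, illustrative sketch of a catch-all TCP listener the Support Team might keep running. The callback port (4444) and the log file name are assumptions made for the example, not details from any real engagement:

```python
# Minimal catch-all listener sketch: accept reverse-shell callbacks and
# log each connection so the Support Team knows a foothold phoned home.
# The port and the log file name below are illustrative assumptions.
import datetime
import socket

LISTEN_PORT = 4444  # assumed callback port

def main():
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", LISTEN_PORT))
    srv.listen(5)
    print(f"[*] Waiting for callbacks on :{LISTEN_PORT}")
    while True:
        conn, (host, port) = srv.accept()
        stamp = datetime.datetime.now().isoformat(timespec="seconds")
        print(f"[+] {stamp} callback from {host}:{port}")
        with open("callbacks.log", "a") as log:
            log.write(f"{stamp} {host}:{port}\n")
        conn.close()  # a real handler would pass the session to an operator

if __name__ == "__main__":
    main()
```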

One thing to note is that the Team Leader moves between the two sub-teams; most often, in our case at least, he or she is on the Operational Team.

The Operational Team

As we mentioned, the Operational Team is in charge of recon, identifying the weaknesses, and executing the plan. Members of this sub-team take different roles based on their strengths. Though the team composition might vary with each engagement, it is a good idea to cross-train each person with another, thus building in redundancy.

Usually the Operational Team members include:

  • Physical security expert
  • Digital security expert
  • Surveillance and recon expert
  • OSINT expert
  • Security generalist (someone who can fill any of the above positions)
  • Team Leader

The Support Team

This sub-team takes care of all the needs of the team while things are happening. They provide an extra set of eyes when needed, perform the initial recon once a foothold on the network is gained, execute further exploits and gain persistence on other systems, and identify more targets. Generally speaking, they are in charge of connecting the dots: the Find, Fix, and Analyze phases of F3EAD.
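
As a rough illustration of that initial post-foothold recon, here is a minimal sketch of a TCP connect() sweep across an internal range. The subnet (10.0.0.0/24) and the short port list are assumptions for the example; a real engagement would tailor both:

```python
# Sketch of a first-pass internal sweep: try a TCP connect() to a few
# common service ports on each host in an assumed /24. The subnet and
# port list are illustrative assumptions, not recommendations.
import socket

SUBNET = "10.0.0"                  # assumed internal /24
PORTS = [22, 80, 443, 445, 3389]   # a few common services

def is_open(host: str, port: int, timeout: float = 0.3) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for i in range(1, 255):
    host = f"{SUBNET}.{i}"
    open_ports = [p for p in PORTS if is_open(host, p)]
    if open_ports:
        print(f"{host}: {open_ports}")
```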

Usually the members are:

  • Digital security expert
  • Exploitation and code writing expert
  • System and networks expert
  • Physical security countermeasure expert
  • Main planner

Again, in both cases individual team members have to cross-train in multiple areas of responsibility, covering for each other and often rotating between the two sub-teams.

The Red Team Manifesto | Reciprocal Strategies

A great addition to the Red Teaming world by Mark Mateski at Reciprocal Strategies.

I’m a red teamer:

  • I ask questions even when the answer seems obvious.
  • I speak the truth as I understand it.
  • I protect my clients from their adversaries and from themselves.

Go read the entire post. It blends nicely with our own Rules of Red Teaming:

  • 1: The purpose of a Red Team is to become the adversary, to be the worst case scenario.
  • 2: People lacking imagination, skepticism, and a perverse sense of humor should not work as Red Teamers.
  • 3: Red Teaming is mostly about paying attention.
  • 4: Understand the thing you are Red Teaming. If you don't, the results will be poor. Spend time learning.
  • 5: Don't play by the rules. Make your own and adapt.
  • 6: If you’re happy with your plan, you are not doing it right.
  • 7: The efficacy of security is determined more by what is done wrong than by what is done right.
  • 7a: Build on this. The bad guys typically attack deliberately and intelligently, not randomly. Mimic that.
  • 8: A Red Team is most vulnerable to detection and disruption just prior to an attack. Don't make mistakes.
  • 9: If you're not failing when you're training, you're not learning anything.
  • 10: There are an unlimited number of security vulnerabilities for a given system, program, or plan, most of which will never be discovered. Tap into that.
  • 11: When in doubt, Red Team it.
  • 12: We are never prepared for what we expect.
  • 12a: During a stressful moment, take a step back and look at the whole system. Analyze whether this is real stress or a deception by the defenders.
  • 12b: Act, don't react. Plan 2-3 steps ahead.
  • 13: The solution is in the problem. “When in doubt, develop the situation.”
  • 14: The more sophisticated the technology, the more vulnerable it is to primitive attacks. People often overlook the obvious.
  • 14a: Most organizations will ignore or seriously underestimate the threat from insiders. That's your in.
  • 15: Make it asymmetrical. Advantage-stacking is your friend.
  • 16: Remember PACE: Primary, Alternate, Contingency, and Emergency. Always have a PACE for everything.
  • 17: Use ACTE: Assess the situation; Create a simple plan; Take action; Evaluate your progress.
  • 18: If there’s a question about whether it’s necessary, remove it. KISS.
  • 18a: Stay small. Stay light.
  • 19: Don’t become predictable.
  • 20: Prioritize and execute.

Quote of the day

"It often takes a crisis for red teaming to be considered and building an ark when it has been raining for 39 days won’t protect you against the flood. For example, the FAA created their red team in response to the bombing of Pan Am 103 over Lockerbie in 1988. For the next 10 years, this group conducted undercover simulated threats to enhance aviation security systems. Then complacency crept in. Red team warnings were ignored in the late ‘90s and early 2000s and were ultimately considered a contributing factor to 9/11. This in turn gave rise to red teaming programmes, including the CIA and NYPD, in the fight against terrorism. Failure sparks change, and sport is no different."

--Best-Laid Plans of Mice and Men

Best-Laid Plans of Mice and Men | Leaders in Sport

How red teaming can transform your stumbling blocks into stepping stones.

In an exclusive feature for Performance, Potts reflects on his tenure and delves into Scottish Rugby’s use of red teaming – a common training practice in the military, intelligence, aviation and politics – to explain why it may prove a valuable tool for others in the world of elite sport.

Updates on the T-shirts

Since we didn't meet the minimum, the orders will not be fulfilled. Maybe next time.

Inductive Observation | Protection Circle

One of the things that makes the security field so interesting is that it’s mostly about people. Security efforts (even if assisted by security systems) are usually directed at people, and largely executed by people for the protection of people. The most important assets are usually people, most of the highest risks we try to mitigate have to do with people and most screening and assessment efforts are attempts to distinguish between people who pose a security risk and those who do not.

If you can’t understand people, you can’t fully understand security.

Live Podcast Recording in NYC

This coming April 5th, we'll be recording episode 21 of the Red Team Podcast live in NYC. Starting after 1700 hours (5 PM), we'll be at Harding's, 32 E 21st St, New York, NY 10010.

Come and see what's up.

A Change of Mindset | Advanced Capabilities Group

This approach and way of doing things was good; however, it presented a challenge. Most organizations are not ready for this kind of security assessment. Their security programs and people are not mature enough to really understand the need for Red Teaming, and they are not ready for the assessment. This often results in wasted effort, and in the Team penetrating them using techniques they had never considered.