Achieving Leader Development through Strategic Broadening Seminars: The Red Team NCO Education Experience | NCO Journal

Throughout the Red Team course, participants are taught essential tools that can assist the command during the decision making process. Red Teaming is a skill that must continuously grow and expand in the individual. Self-awareness, applied critical thinking, groupthink mitigation, and cultural empathy are the four pillars of Red Teaming; these topics lay the foundation of understanding how and why an individual makes decisions and opens the mind to see different possible solutions to a given problem.

/

Quote of the day

"Security is nigh near impossible. It’s extremely difficult to stop a determined adversary. Often the best you can do is discourage him, and maybe minimize the consequences when he does attack, and/or maximize your organization’s ability to bounce back (resiliency)."

So, you want to be a Red Teamer? Again?

We wrote about this already, but times change and things get more complicated. So, here's the 2016 version of the answer to life, the universe and everything... I mean, about being a Red Teamer.

First of all, let's clarify what a Red Teamer is. A Red Teamer is simply a person who can think like the adversary, find ways around things, and test/push the limits of security, plans, policies and assumptions. Simple.
A lot of different people can fit in here: hackers, physical security experts, physicists, psychologists, law enforcement professionals, military personnel, teachers... Anyone can fit here. It's not what you do, but your mindset.

Being a penetration tester doesn't make you a Red Teamer. Being a programmer doesn't make you a Red Teamer. Those things help if you have the mindset, but even then, it's all about experience: how you look at problems, how you learn new things, how you adapt and how you stop playing by the rules when you need to. You need to think like a bad guy. You can probably learn this, but if you don't already have it, chances are it will be hard.

Before I continue, please, please discard terms like "ethical hacking"; they're just stupid. A hacker is a hacker. They find ways around things, good or bad. But a hacker is NOT necessarily a Red Teamer, and Red Teaming is not ONLY about hacking.

1st rule of Red Teaming - the purpose of a Red Team is to become the adversary, to be the worst case scenario.

This means digital, physical and human. This means real world. This means looking at the whole picture. Got it?

So, do you want to be a Red Teamer?

Operational Code Analysis for the Real-World Red Team, Part I | Red Team Journal

Congress is calling on Pentagon red teams to model potential adversaries more accurately. It’s a mandate akin to Sun Tzu’s age-old maxim, “Know thy enemy.” Unfortunately, for every 100 persons who remind us to know our enemy, perhaps five know how to practice it effectively.

To be fair, it’s a hard problem. Maxims help, but the real world is much more complex than we usually care to admit. Our adversaries are rarely unitary and completely rational. Nearly every adversary sees the world differently. Few adversaries tell the truth, and fewer still perceive the truth. Many are deceived by their own hubris. Some will uncover a short cut we haven’t anticipated. And all of this applies to us in reverse. It’s why we red team, but it’s also why red teaming is so difficult. If we’re honest, understanding reciprocal perceptions in conflict is more akin to a wild scrum of Hungry, Hungry Hippos than an artful game of chess. Know thy enemy? Good luck with that!

/

A Few Thoughts

I had a conversation with another Red Teamer recently. He and I disagree on many things, but we agree on the basics.
The points below are things we talked about and both agreed on.

  • Stop thinking about the "perimeter". It's too late, the attackers are already inside.
  • Stop thinking about stopping attacks. You can't. They will continue to come. Think about making it harder for them to move inside and get that data. If you focus on stopping them, you will miss their moves.
  • Stop thinking about being able to monitor everything. Red Team this and try to think 2-3 steps ahead and prepare detection and misdirection.
  • Stop thinking about different attackers. An attacker is an attacker.
  • Don't stop learning, preparing, getting stronger and smarter.
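The "prepare detection and misdirection" point above can be made concrete with honeytokens: plant fake credentials or paths that no legitimate user would ever touch, and alert the moment an attacker does. This is a minimal sketch; the token names and event fields are hypothetical examples, not from any real deployment.

```python
# Hedged sketch: honeytoken-based detection and misdirection.
# The token names below are fabricated for illustration only.
HONEYTOKENS = {"svc_backup_legacy", "share_finance_old"}

def check_event(event):
    """Return an alert dict if an auth/access event touches a honeytoken,
    else None. `event` is a dict with optional 'account' and 'path' keys."""
    if event.get("account") in HONEYTOKENS or event.get("path") in HONEYTOKENS:
        return {"alert": "honeytoken-touched", "event": event}
    return None
```

Because the tokens are decoys, any hit is high-signal: there are no false positives to tune away, and the attacker wastes time on assets that lead nowhere.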
/

Quote of the day

"1st rule of Red Teaming - The purpose of a Red Team is to become the adversary, to be the worst case scenario."

-- "On Red Teaming", a soon-to-be-finished book.

/

Project "Mantis" - Intro

Last September we mentioned a large project that lasted 19 months, informally known as Project Mantis. We give projects names like that to maintain OPSEC, especially around customers.
This is the intro to a series of posts that will recount some of the things we did and the lessons we learned.

Meetings

The project started with a series of meetings with the stakeholders: the company's CEO, CIO and VP of Technology. Also present were their head of IT, VP of Security and the CIRT manager (CIRT: Computer Incident Response Team).
It took five meetings to reach a final scope for the operation. Each meeting built on the previous one, with different members of the Team there to present the case for a specific part of the project. At the end of the last meeting, the CEO and the company's lead lawyer signed off on the project and provided a written record giving us legal cover should we get caught, and agreeing that if any system went down we wouldn't be held liable. We try our utmost NOT to cause any denial of service or crash any systems, but sometimes these things happen (during this project they didn't!).

After the initial meetings, the head of IT, VP of Security and CIRT manager came to the office to discuss the actual flow of events. We divided the project into two parts: Mantis Alpha - a complete black box, unannounced red teaming of their digital, physical and social security posture; and Mantis Bravo - Red vs. Blue random engagements with 2-3 days' notice so they could get ready.
The idea behind Alpha was to find vulnerabilities and exploit them, exfiltrate data and/or gain control of the main corporate assets over time. We could use any means necessary, short of kidnapping people or the physical destruction of property/servers/other assets. This phase lasted 14 months due to the size of the company and how spread out it was across 10 countries, with even more supporting contractors and suppliers. It was a big recon!
With Bravo we focused on the more technical aspects and tried to provide a realistic scenario for their CIRT to test their knowledge, ROE, tools and contingency plans. We used the knowledge of the network gained in Alpha to write custom persistent malware and attack code with no known signature for them to catch. The best of the exercises during this period was when we had a man inside with a laptop and would alternate command and control from an external controller to an internal one and back to external. It was fun to watch people trying to figure that one out. More on that when we get to that part.
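The C2 alternation trick above can be sketched in a few lines: a beacon that rotates its controller between external and internal endpoints, so defenders who baseline or block one path keep losing the trail. This is a hypothetical illustration, not the actual Mantis tooling; the endpoint names are made up.

```python
# Hedged sketch of alternating command and control between controllers.
# Endpoint names are illustrative assumptions, not real infrastructure.
from itertools import cycle

class BeaconRouter:
    """Rotate a beacon through a list of C2 endpoints in order."""
    def __init__(self, endpoints):
        self._cycle = cycle(endpoints)
        self.current = next(self._cycle)

    def rotate(self):
        # Switching controllers mid-operation defeats blocklists and
        # traffic baselines built against a single C2 address.
        self.current = next(self._cycle)
        return self.current

router = BeaconRouter(["external-c2", "internal-laptop"])
# external-c2 -> internal-laptop -> external-c2 -> ...
```

In the real engagement the "internal" hop was the insider's laptop, which is what made the traffic so confusing for the defenders: the same implant answered sometimes from outside the perimeter, sometimes from inside it.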

Recon Planning

As always, after the initial meetings and project kickoff, we moved to the recon phase. We needed to learn about the target: who they were, what they were, their main role players, their technology, their physical footprint, etc.
This was not easy. With 10 countries in play, a massive network to recon with potentially thousands of internal systems and a large number with direct access to the internet, not to mention the contractors and suppliers supporting the company... it was mind numbing. It took us almost a month just to scope the recon. Divide and conquer. Team work. It was one of our best team building exercises ever. Lots of stress, lots of friction, but we prevailed and came out of it with a better team and, more importantly, with a plan.
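The "divide and conquer" scoping above amounts to partitioning a huge target inventory into balanced per-team work queues. A minimal sketch, assuming a simple round-robin split (the hostnames and team count here are invented for illustration):

```python
# Hedged sketch: split a large recon target list across teams.
# Hostnames and team count are hypothetical examples.
from collections import defaultdict

def partition_targets(targets, n_teams):
    """Round-robin sorted targets across n_teams so the workload stays
    balanced even when the inventory is heavily skewed by region."""
    queues = defaultdict(list)
    for i, target in enumerate(sorted(targets)):
        queues[i % n_teams].append(target)
    return dict(queues)

inventory = [f"host-{i:04d}.example.corp" for i in range(10)]
work = partition_targets(inventory, 3)
# 3 queues of sizes 4, 3 and 3 - no team owns one giant region
```

In practice the split would also account for country, network segment and legal constraints per jurisdiction, but the principle is the same: make the scope tractable before anyone touches a scanner.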

End result

At the end of the 19-month-long engagement, both the company and we were drained. We kept them guessing and they kept us busy. The project was successful across all fronts. This means two things: we succeeded in penetrating them and achieving the goals, providing the company with a realistic picture of an adversary; and the company got to test their emergency response methods, their tools, their policies, their security measures and, most importantly, their people. It was a good exercise.

Stay tuned for Part 1, where we'll talk about the recon proper and the initial way in.

Red Teams: When you can’t find the bad guys, make some up | Ryan McGeehan

You've spent money on security products that escalate nothing. You have a 24/7 SOC that hardly pays attention to their tools, or knows how to use them. You have intelligence feeds but have no idea what consumes them. Logs are inaccessible, slow to query, or non-existent. Defenders have stopped hunting and lost a sense of purpose.

That means it’s time for a Red Team to come in and fuck shit up.

This is an example of a red team exercise done right: realistic.

During the response, we had to “cancel” vacation plans, calm panicked employees, and negotiate the tough decisions around production system shutdowns. Defenders were tasked with building a recovery plan which considered how much, if not all, we’d have to rebuild. This is a significant amount of the headache you deal with when you’re really in the thick of a worst case intrusion, and it has little to do with forensics or IR.

Next morning, we broke the news to the team that this was a staged (albeit real) good-guy intrusion. There was long silence, red faces, and we were doing a lot of explaining. I wasn’t sure how the team would react. A former FBI agent we hired to do malware research broke the silence with a “…Then that’s fucking awesome!” and the room lit up.

/