- Security TWENTY
Remember children’s games like Where’s Wally? Or those 3D Magic Eye pictures? The reason we all found them so intriguing is that they helped us take a different perspective, to look at something in a different way from before. This aligns with exactly what a Red Team does: challenging your existing thinking to find a new perspective on cybersecurity, writes David Higgins, Senior Director, Field Technology Office at the cyber firm CyberArk.
We each have our own cognitive biases, shaped by our personal experiences, which make us see the world in a certain light. This becomes particularly apparent in problem solving: we each process and solve challenges differently, even though our own approach seems the ‘obvious’ one to us. In their studies, Israeli psychologists Amos Tversky and Daniel Kahneman observed that this happens in a few different ways. The first is attentional bias, which is when we prioritise some things while ignoring others – like fixating on the appearance of some new clothes while managing to sweep their rather hefty price tag out of your mind.
Alternatively, functional fixedness stunts our ability to think outside the box. Take the paperclip: because you’ve always used one to clip paper together, you might never think of alternative uses for it, such as fixing a zipper or picking a lock. Lastly, there’s optimism bias – the naïve belief in our own invulnerability – that keeps you thinking that bad things ‘couldn’t happen to me.’
While we are all biased, we are all biased differently, which is what makes having a second brain to think over an idea infinitely valuable, helping to avoid mistakes and come to innovative solutions. Red Teams exist for this same reason: to rid us of our assumptions about threat actors and assess plans critically to make them more resilient.
Introducing the art of ‘alternative analysis’
Red Teaming was first introduced by the military to better prepare teams for combat, using war games to try to penetrate real-life defences. The U.S. University of Foreign Military and Cultural Studies (UFMCS) defines Red Teaming as “a function executed by trained, educated and practiced team members that provides commanders an independent capability to fully explore alternatives in plans, operations, concepts, organizations and capabilities in the context of the operational environment and from the perspectives of our partners, adversaries and others.” As a concept, it’s built on four principles: self-awareness and reflection; fostering cultural empathy; groupthink mitigation and decision support; and applied critical thinking.
Applications of Red Team methodologies now stretch well beyond the military: law firms use them to prepare for court, journalists to pressure-test investigative stories and, of course, cybersecurity teams to protect their organisations. Many now conduct independent Red Team exercises to channel the mindset of attackers and put their cybersecurity defences to the test.
Get ahead of an attacker
The beauty of a Red Team adversary simulation is that it provides a safe, controlled way for security operations teams to uncover vulnerabilities, test response capabilities and identify areas of improvement. Red Teamers use any means necessary to mimic a real-world attack without introducing risk to the business. Sometimes Red Teams are hired independently to be a truly fresh set of eyes on the problem.
Each Red Team will have a different goal, depending on need – perhaps testing against known threats, simulating a recent attack or building a custom angle of attack to find new flaws. Against a growing threat landscape, Red Teams offer particular protection against ransomware, alongside the ‘fire drill’ experience of coping with an attack once it has begun. Red Teams ‘fight’ Blue Teams – the in-house defenders mounting their best response. The Blue Team benefits too: experiencing a realistic attack scenario leaves it far better prepared for the real thing.
At the end of the exercise, organisations often receive a two-part takeaway. The first report is a top-level overview of the organisation’s security status, with key findings and risk-prioritised recommendations for the executive team to consider. The second is a technical analysis for security teams, detailing the vulnerabilities uncovered and recommended remediation steps to reduce future exposure. With deeper insight into their security strengths and weaknesses, organisations can then bolster defences and establish a baseline from which future security improvements can be measured.
Winston Churchill once said, “Criticism may not be agreeable, but it is necessary. It fulfils the same function as pain in the human body. It calls attention to an unhealthy state of things.” Many cyber-attacks begin as a twinge so minor they often go unnoticed until the detrimental impact becomes apparent. Red Teams’ ability to exploit weaknesses in systems and processes and in human nature itself pushes cybersecurity teams to think differently and see things sooner, no matter how uncomfortable the process may be. That empowers organisations with the prescience to anticipate future failures, and work to stop them before they begin.