Collaboration for battling security incidents
The keynote for Sun Security Con 2026 (SunSecCon) was given by Farzan Karimi on how incident handling can go awry because of a lack of collaboration between the "good guys"—which stands in contrast to how attackers collaboratively operate. He provided some "war stories" where security incident handling had benefited from collaboration and others where it was hampered by its lack. SunSecCon was held in conjunction with SCALE 23x in Pasadena in early March.
He began with the premise that attackers, whom he sometimes referred to as
"hackers", collaborate: "they share tools, they share knowledge
";
beyond that, they may share access to some of the members of their teams
with others. On the other side, for defenders, things are different:
"As effective as we are at the individual or team level, we're often
victims of these organizational silos that trap us into not being able to
collaborate well with each other.
" Specifically, "think of security
versus software teams, product versus enterprise, blue
team versus red
team
". The boundaries can be rigid: "if you're on the
enterprise team and touch the scope of the product team, watch out
".
He was once on a red team doing an exercise that simulated an attack by an adversary where he found a way to evade detection by disabling the endpoint detection and response (EDR) sensor system. His counterparts on the blue team were unhappy with that methodology because they wanted to use that information to see how he was attempting to evade detection; he pointed out that if attackers can disable EDR, they will. The exercise did not end particularly well, as he was called a "cheater" and a "one-trick pony", which hurt at the time and had the effect of isolating him from working with the blue team.
The goal for the talk was to share some stories that demonstrated how
successful
collaboration on the defensive side "can transcend the attacker and lead
to arrests
"; on the flip side, failure to collaborate well can lead to
"the attacker getting the upper hand
". Karimi briefly introduced
himself as the deputy chief information security officer (CISO) at Moderna,
which is a Boston-based biotechnology company. He has led offensive
security teams for around 15 years, including the red team at Google for
Android for a few years, and before that was the founding member of a red
team at the game company Electronic Arts (EA). He has spoken at the
Black Hat and DEFCON security conferences over the years as well.
His theory is that "the real 0-day in companies isn't necessarily a
technical flaw
", it is, instead, isolation between humans in the
organization. There are social pressures, such as the "fear of looking
stupid
" or individuals having the "hero mentality
", that work
against collaboration. While his stories were about security incidents,
the details of those are not what is important in the talk. "It's the
human behavior after the exploit, these tense situations where they
escalated or de-escalated based on how the conversations went
".
Story time
A talk he gave at DEFCON 33 in 2025 (YouTube video, slides) provided another example of collaboration gone awry. He presented a new technique, recursive request exploit (RRE), in the talk, but 17 hours before it began he was contacted by the legal department of a company that was affected by the exploit. The company felt that he had not done responsible disclosure, because it expected the problem to be reported to its HackerOne account, but Karimi had used email—to an address that he did not realize was not monitored.
The company had heard about his upcoming talk from a journalist and was
threatening legal action if he gave the presentation. So, less than a day
before the talk that he had spent months preparing for, he participated in
a phone call where he explained what had happened; "it was really just a
big miscommunication
". He and the company had a different view of
things, however. "I thought I was in the right and ethically I thought I
took all the right steps, but the optics were different.
" When that
kind of thing happens "you find yourself in a very difficult
conversation and you have to find ways to de-escalate in those tense
situations
". He did not elaborate further, but he must have found some
way to de-escalate since he gave the talk shortly thereafter.
When he was at EA, he worked on an incident regarding the virtual currency that is used in the FIFA football (soccer) video games. The currency can be used in-game to create a better team, but it can also be used to buy game merchandise; the FIFA coins have value outside of the game itself, so they are attractive to attackers. A Europe-based attacker found a vulnerability in an API for FIFA that allowed them to generate FIFA coins.
The attacker was able to create $324,000 in the virtual currency, which
they immediately distributed across 25,000 seemingly random user accounts.
That might seem like they were a "digital Robin Hood just giving FIFA
coin back to the people
", but a closer analysis showed that some of
those accounts were actually controlled by the attacker. They were simply
trying to obfuscate where the money was going. That really stacked the
deck against EA being able to do something about the crime because it was a
foreign actor stealing money and hiding where it was going.
As it turned out, though,
"good collaboration across multiple teams led to this person's
arrest
". When the theft was happening, the EA blue team noticed that
something had gone wrong and analyzed the logs to discover which APIs had
been attacked. Sometimes, a blue team will just continue doing that kind
of analysis, but for this event, it did something different: "they
looped in my red team
" and asked if the attack could be reproduced from
the logs as a form of evidence that it was what the attacker had done.
The red team was able to do that and provided the information to EA's legal
team, which took it to court. The judge agreed that a crime had been
committed, but the attacker was based in Italy at the time. Karimi said
that he was not participating at that point, but that because it was a large
enough sum of money, the FBI got involved and the attacker was somehow
invited to come to the US. When he did, he was met by the FBI and arrested,
eventually going to prison and having to pay restitution. So that was an
example of a "really good collaboration, between IR [incident response],
red team, and legal
".
Unplanned collaboration
His next story was an example where the collaboration was not planned, but was forced on him. He was running a red team and compromising systems, which was exactly what he should be doing, but he unknowingly "compromised" a honeypot that advanced-persistent-threat (APT) attackers had left on the company's systems. The attackers behind the honeypot realized that he was part of the red team, so they used his credentials to launch other attacks on the network—effectively disguising their activities as coming from his account and systems.
One day, an investigator showed up in his office asking some rather pointed
questions about what he had been up to, since the attackers had been targeting
executives in the company. She was aware that he was not
actually the attacker, but "she could have talked to me as if I was the
victim or if I was an idiot red-teamer that didn't care about hygiene and
left my credentials everywhere
". She took a different approach that
has stuck with him throughout his career since: she enlisted his aid in
monitoring the attackers behind the APT.
"Instead of making me feel stupid, she brought me into the broader IR
[incident response]
initiative, which was really magic in a way
". She probably does not
even remember him, he said, but her actions made a big impact on him. She
asked him to keep doing his normal things, which was hard given that he knew
there were two different groups of people watching his every move. He had
to keep that up for around two weeks, while the forensics team would
periodically ask him to access various systems so they could watch the APT
team follow along shortly thereafter. That allowed the team to collect a
bunch of indicators of
compromise (IOCs) and it shows, he said, that incidents can also be
opportunities.
Over the line
The next tale was about "one of the most humbling moments of my
career
". While on a red team at Microsoft, he was doing an authorized
penetration
test (pentest) of an HR application that stored sensitive information
about employees, including their salaries. He found a vulnerability that
allowed him to get access to salary information; at that point, he should
have stopped and reported the bug.
But, instead of stopping after compromising one record, he made a mistake
and wrote a script that pulled out more than 1,000 employee salaries,
"like an idiot ... in retrospect
". At the time, he thought he was
demonstrating the scale of impact. He was "feeling really full of
myself
" after finding this critical flaw, so he made a second mistake.
He joked with his office-mate that he was the lowest-paid security engineer
at Microsoft.
His office-mate laughed, but someone down the hall who overheard the joke did not. Instead, they escalated it, and Karimi was called into his manager's office about an hour later. There, Karimi found out not only that he would not be getting the promotion that he had hoped the find would solidify, but that he might lose his job over the incident, which had already gone to legal as an ethics violation. In a ten-minute span, he had gone from the high of finding a critical flaw to realizing that his whole career might be in jeopardy.
That kind of situation goes well beyond red-team activities, he said.
Anyone with administrative access should be extremely careful in
determining whether to use it; "just
because you can, doesn't mean you should, and, if you have to think about it,
you probably shouldn't
". The scope of the work may provide legal
permission, "but trust is the social permission and you really need to
have both in order to be successful
". He did manage to keep his job,
which was a positive outcome.
Entertainment conference
The final story was about the web application for ticket sales to a
prominent southern California entertainment conference, which he could not
name due to an agreement with the conference's legal team. There are about
70,000 attendees and some of them were losing access to their legitimately
purchased tickets at the same time there was a spike in ticket sales on
reseller sites. He did not work for the conference and was not hired to
work on the investigation; he just happened to be attending the conference;
"it's a great story of surprise collaboration
".
The impact of the problem was potentially quite large: roughly $35 million
in total ticket sales, and much more if the conference had to be canceled
because of the problems. Ticket buyers would receive an email that
contained a link to directly take them to their page in the web
portal—without having to log in. He asked if attendees had ideas about
what kind of vulnerability to look out for; someone correctly guessed
"IDOR
", which is an insecure
direct object reference.
He showed a redacted version of his portal page and the URL, which had two
parameters of interest: login=FKnnnnn and pwd=mmmmm where
nnnnn and mmmmm are two different numbers. He asked the
audience for ideas on what to change in the URL; someone noted that "FK"
are his initials and that the pwd number corresponded to the
registration number on the page. He agreed and noted that the registration
number was an incrementing integer per registration. The number after his
initials turned out to be his phone number. "All of these are guessable
parameters.
"
It gets worse, though. The phone number was a throwaway field in the
login, he said, so the primary key was just the initials. In five
minutes or so he had written a proof-of-concept script to loop through all of the initials "AA" to
"ZZ" trying each pwd number from one to 900,000, which showed
"hundreds and hundreds of tickets
" that could be compromised. He
took that to the organization, which immediately stepped up to fix the
vulnerability and to handle the fraudulent ticket reselling. He asked
attendees what they thought he got paid for that work, which was not zero,
as guessed, but slightly more—a T-shirt, he said to laughter.
"It doesn't matter whether you're a defender or you're on the offensive
side, a software engineer or a sysadmin, when we stop treating each other
as opponents, we win.
" Once that happens, "we start trusting each
other more as a result
". That is the thread running through all of the
stories he related, he said.
The talk video is not yet available, but the YouTube livestream recording is, for anyone interested in seeing it.
[Thanks to LWN's travel sponsor, the Linux Foundation, for its travel funding to attend SCALE in Pasadena.]
| Index entries for this article | |
|---|---|
| Security | Incident response |
| Security | Keynotes |
| Conference | Southern California Linux Expo/2026 |
