
A Conflict of Interest?

Years ago,
we faced something odd in a project:
a customer put pressure on us
while we were performing a “one-shot hacking.”
The manager who hired us demanded preliminary results
and made comments about
how we should frame some of the findings,
what to keep,
and what to remove from the final report.

We decided not to make all the requested adjustments
because those changes would have misrepresented reality.
After we presented the results,
the manager shouted at us
and never hired us again.
A conflict of interest was at play,
but we made the right decision.

I have some questions for you,
if you are a cybersecurity manager:
who do you report to?
What does your boss expect from you and your team?
I bet IT is a likely answer to the first.
The relationship seems sensible,
except for the auditing activities cybersecurity teams undertake.
If a performance indicator for these departments
is the number of findings in audits,
there’s a strong incentive to downplay cybersecurity conclusions.
That’s a conflict of interest.
Penetration tests should not be conducted to praise your good defenses.
Instead,
they should help you reveal your blind spots
to improve decision-making,
e.g., by improving those defenses.
Be skeptical when an auditor finds nothing relevant.

Should cybersecurity units report to a group other than IT?
There are operational reasons
for cybersecurity operations to run smoothly alongside IT.
Even so,
conflicts of interest are not always evident,
as we will see next.

What we get wrong about conflicts of interest

We usually see conflicts of interest as manifest fraud.
Whenever one is detected,
we tend to think it is the product of a deliberate choice.
Besides that,
the usual practice is to disclose conflicts of interest.
In some scenarios,
a disclosed conflict forbids certain actions or the holding of positions
(e.g., in public institutions).
In others,
disclosure seems a reasonable step
that signals trustworthiness.

The first thing to point out is that
not every conflict of interest is deliberate or conscious.
Empirical evidence (Moore & Loewenstein, 2004;
Moore, Tetlock, Tanlu & Bazerman, 2006)
supports this;
people often engage in conflicts of interest
without perceiving any wrongdoing.
Additionally,
we are very prone to self-serving thoughts.
We are hardwired for that,
just like we are prone to mental shortcuts and cognitive biases.
Interim takeaway:
as security professionals,
we should be more cautious in our jobs;
we might be harming the business without noticing.

A second misconception is that
disclosure is an adequate countermeasure.
Research has shown that disclosures,
in some cases,
make situations worse
(Cain, Loewenstein & Moore, 2005;
2010;
Loewenstein, Sah & Cain, 2012),
for example,
by acting as a license to misbehave
(moral licensing)
or by introducing larger biases into advisory work
(i.e., exaggerations).

Who performs penetration testing
for your organization?
It is a critical question
in light of previous insights.
If those who handle the defenses
(i.e., the Security Operations Center, or SOC)
are the same people in charge of offensive operations,
there is a conflict of interest you should address,
and not by disclosure.
That setting creates incentives
that are likely to influence audit results.
Think about it:
if you are responsible for defenses,
it is in your interest to show few holes
or none at all.
How do you ensure penetration testing is not biased
in your favor?

“…the NSA’s dual mission of providing security
and conducting surveillance
means it has an inherent conflict of interest in cybersecurity,”

wrote Bruce Schneier
about this years ago.

Consequences

Conflicts of interest pose at least two threats to businesses:

First,
these conflicts undermine the validity of audit results.
It’s tough to support findings
when someone acts as both judge and interested party.
This translates into wasting valuable resources like time,
effort, and money.
Eventually,
having non-credible results
could harm one of the assets companies care about the most:
reputation.

Second,
in the long run,
organizations could also face other drawbacks.
In the short term,
an illusion of control might emerge from audits involving conflicts.
Furthermore,
a troublesome reality could come to light:
your defenses aren’t as strong as you think they are.
That’s a huge blind spot.
Once you get challenged,
whether by a third party or a real attacker,
chances are you will struggle to face the consequences.
Even worse:
the business case for the cybersecurity audits you were leading
might become a significant weakness.

Fluid Attacks has,
of course,
confronted conflicts of interest.
Our management team has always stressed the importance of our principles
as cybersecurity experts
and of preserving independence in our work
while delivering value to our customers.
We have been successful in helping them
as independent and skilled auditors,
typically performing Continuous Hacking,
a service to continuously check and improve the robustness of defenses.

Our customers rely on our Attack Resistance Management platform,
the place to keep track of weaknesses.
They don’t have to worry much about managing findings
on their IT assets;
we make it easy for them
by providing a tool to centralize everything.

As concluding remarks,
we want to share some advice
on avoiding conflicts of interest:

Take a long-term perspective.
We see conflicts of interest as a game
you can play for the short or the long term.
We strongly suggest you play the long, strategic one.
Your success never lies in the short term;
cybersecurity is an infinite game
(in game-theory jargon).
We encourage you to trust independent auditors like us.
We are genuinely independent auditors
because we do not design and operate defenses for the market.
We do, however, offer expert advice and curated resources
on how to better protect information assets
(check Criteria;
it’s FREE).

Identify conflicts of interest
and design environments to avoid them.

“…ethical systems designers should be ruthless
in identifying conflicts of interest
and finding ways to create or restructure rules,
procedures, other controls, and incentives to minimize them.”
(Source: Ethical Systems.)

We suggest you implement a policy forbidding “goalkeepers”
from also playing as “forwards.”
Additionally,
don’t rely on disclosure,
as it could backfire.
A stronger stance is a must.
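
To make this concrete,
here is a minimal sketch of how such a separation-of-duties check could work;
the rosters and names are hypothetical,
not taken from any real system:

    # Hypothetical rosters; in practice, pull these from your HR or IAM system.
    defenders = {"alice", "bob", "carol"}  # the "goalkeepers" (e.g., SOC staff)
    attackers = {"carol", "dave"}          # the "forwards" (e.g., pen testers)

    # Anyone on both rosters is exactly the conflict the policy should block.
    conflicted = defenders & attackers
    if conflicted:
        print(f"Policy violation: {sorted(conflicted)} hold both roles.")

A periodic check like this turns the policy into something auditable
rather than a statement of intent.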

“Consider the opposite.”
This is a strategy to improve decision-making
discussed by Milkman, Chugh & Bazerman (2009).
In short,
putting yourself in the exact opposite perspective of what you believe
could change your judgment,
eventually affecting your decisions.

“Ask yourself what you want to be true
(i.e., what is in your personal interest)
or what you are inclined to believe.
Then consider several possible reasons to go against it.
Do this early in your decision process,
especially when the decision is important.”
(Source: Ethical Systems.)

Lastly,
we encourage you to analyze how you frame cybersecurity
within your organization.
Avoid describing it as an operational effort;
frame it as strategic.
When you play long-term,
cybersecurity can be seen as a continuous process
by which risks can be better managed,
and for that,
you need to test yourself permanently,
to put pressure on your cyber walls,
and to learn from unseen weaknesses.
Otherwise (framed operationally),
cybersecurity will easily succumb to
short-termism and vanity metrics
that might eventually screw you up.

