Dear People Who Prepare Others for Uncertainty,

We’re kicking off 2026 with a clear focus for ‘Ello ‘Ello.

This is a newsletter you chose to receive. It stays in its lane and gives you space to reflect on regulatory changes and news events that raise compliance training questions.

This month, the focus is behavioural risk: the moments teams face when rules exist, facts are partial, and judgement does the heavy lifting.

One quick signal that this matters: last year, the average time spent per Compliance Online training session was over 30 minutes. People are engaging. If you’re curious, here’s a short video with a few other surprises from 2025.

To make this easier, start with the piece that matters most to you.

Read if ...

… you’re in financial services
Data Protection + Cybersecurity

Cyber incidents and AI scams escalate the same way: hesitation beats judgement

Cyber methods change rapidly, but incidents tend to turn on the same human decision point: the moment someone feels uncertain about whether escalation is required.

Why it matters:

When threats keep changing, training that focuses only on known scenarios will always lag. The more resilient questions are simpler:

  • Would people recognise uncertainty as a signal?

  • Would they feel permitted to escalate before they were sure?

Training that helps people exercise judgement under pressure does not prevent every incident. But it does reduce the chance that hesitation becomes the reason an incident escalates.

Ask whether current cyber training prepares people for ambiguity, not just for recognised threats.

When tactics change, judgement is what people fall back on, whether you trained for it or not.
… you sell items of R100,000 or more
FICA

If you sell one item for R100,000+, you may become an “accountable institution” overnight

High-value sales that feel like routine commercial wins are increasingly triggering scrutiny under FICA.

Recent legal and regulatory commentary has warned that South African businesses selling individual items at or above R100,000 may fall within the high-value goods dealer (HVGD) category, even if this is not their core business model.

Why it matters:

The risk doesn’t show up in policies. It shows up at the point of sale, and that is where things go wrong:

  • A sale is treated as “business as usual”.

  • Staff focus on closing the deal, not classification.

  • No one realises FICA duties may have been triggered.

  • Records, checks, or reporting steps are missed, unintentionally.

At that point, the legal and reputational risk sits squarely with the business, not the salesperson. This is not about bad actors. It’s about organisations being caught out by a threshold they didn’t realise they crossed.

Consider whether your current approach would reasonably help sales-facing staff recognise when a single transaction changes the rules.

You need people to recognise the moment when “just a sale” becomes something more.
… you’re adopting AI tools
Ethics and AI

Chasing the “new and shiny” AI while your data and people aren’t ready

Many organisations are adopting AI while still dealing with fragile data, unfinished system migrations, and uneven skills. That gap is where informal use creeps in, before governance, training, or accountability are in place.

Why it matters:

When systems are slow and pressure is high, people don’t wait; they find workarounds.

That’s how AI risk usually appears:

  • real customer or employee data is pasted into external tools “just to test”,

  • AI outputs are treated as answers, not inputs,

  • no one is sure who must stop things when something feels off.

By the time concerns surface, data may already be exposed, decisions already made, or explanations already given.

Ask whether people would know what they may not upload, what needs verification, and when to stop and escalate.

AI can assist work, but it doesn’t replace judgement, approval, or accountability.
… you’re an exporter
Competition Law

When “collaboration” is allowed, price talk still gets you into trouble

Government has allowed limited cooperation between exporters, but only within tight boundaries.

In December 2025, the Minister of Trade, Industry and Competition published a block exemption under the Competition Act. It permits cooperation on specific activities such as joint marketing, logistics coordination, and limited information-sharing.

What it does not permit remains unchanged: price fixing, market allocation, and cartel conduct. Exporters relying on the exemption must also notify the Competition Commission so it can confirm the activity falls within the exemption.

Why it matters:

This is a classic “permission creates confusion” moment. Under pressure to make collaboration work, conversations drift, and that’s how risk appears:

  • pricing or margins come up “just for context”,

  • customers or territories are discussed in the name of coordination,

  • informal understandings form and are never written down.

The danger zone is rarely the formal meeting. It’s trade fairs, industry dinners, WhatsApp groups, and side conversations, where language gets loose and assumptions replace rules.

Ask whether export-facing teams would recognise when a conversation needs to stop.

The risk is that teams hear “we’re allowed to work together now” and drift into conversations that remain strictly prohibited under competition law.
… you report under King
Corporate Governance

If you can’t explain it, King V assumes you don’t control it

King V was launched on 31 October 2025 and applies to financial years beginning on or after 1 January 2026. It raises the bar on how governance is explained and defended: saying the right things is no longer enough.

Why it matters:

Governance failures now surface through explanations, not policies. Risk arises when (for example):

  • disclosures rely on boilerplate language no one can comfortably explain,

  • independence or relationship issues are treated as “technical” rather than perceptual,

  • remuneration decisions make sense only to a small inner circle,

  • AI- or data-driven decisions affect people without clear ownership or rationale.

These are not board-only problems; they are everyday decisions that later need to be explained.

Raise King V awareness beyond the board pack: directors, executives, and senior managers need a shared understanding of what has changed.

Under King V, explainability is the audit trail people carry in their heads.

Email us for this one at [email protected]. The course is currently with our SME and will be ready for enrolments shortly.

If something here made you pause, that’s probably the point.

Reply and tell us what question it raised. We read every response because they help shape what we share next.

Until then,
The Compliance Online team
