detection-engineering security-operations soc

The Overfitting Problem in Detection Engineering

Every exception added to a detection rule trades a false positive for a blind spot. Attackers live in the exclusions.

In data science, “overfitting” is when a model learns its training data so precisely that it fails to generalize to new data. It becomes too specific.
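To make the analogy concrete, here is a minimal sketch of overfitting with NumPy (the data and degrees are illustrative): a degree-9 polynomial fit to 10 noisy points reproduces the training data almost exactly, yet performs worse on held-out points than its near-perfect training error suggests.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ten noisy samples of a smooth underlying signal.
x_train = np.linspace(0.0, 1.0, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.3, 10)

# Held-out points from the same signal.
x_test = np.linspace(0.05, 0.95, 10)
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.3, 10)

# Degree 9 with 10 points: enough freedom to memorize the noise.
overfit = np.polynomial.Polynomial.fit(x_train, y_train, deg=9)
simple = np.polynomial.Polynomial.fit(x_train, y_train, deg=3)

def mse(model, x, y):
    return float(np.mean((model(x) - y) ** 2))

# The overfit model's training error is essentially zero, but its
# test error is not: it learned the noise, not the signal.
train_err = mse(overfit, x_train, y_train)
test_err = mse(overfit, x_test, y_test)
```

The parallel to detection tuning: the “model” that fits your last month of alerts perfectly is the one most likely to miss next month's attack.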

In SecOps, we do the exact same thing. We just call it “Good Hygiene.”

Every time you tune a generic detection rule to silence a noisy alert, you are effectively overfitting your defense. You are hard-coding a blind spot.

  • AND NOT (User == 'ServiceAccount')
  • AND NOT (Process == 'BackupExec')
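The exclusions above can be sketched as code to show the blind spot they create. This is a hypothetical rule, not any real SIEM's syntax; field names and values are illustrative. Note that the user exclusion matches on the account alone, so anything running under that account is invisible:

```python
# Hypothetical detection rule tuned with hard-coded exclusions.
EXCLUDED_USERS = {"ServiceAccount"}
EXCLUDED_PROCESSES = {"BackupExec"}

def rule_fires(event: dict) -> bool:
    """A generic 'suspicious activity' rule after tuning."""
    if event.get("user") in EXCLUDED_USERS:
        return False  # tuned out: noisy service account
    if event.get("process") in EXCLUDED_PROCESSES:
        return False  # tuned out: trusted backup software
    return event.get("action") == "shadow_copy_delete"

# The benign noise is silenced, as intended...
benign = {"user": "ServiceAccount", "process": "backup.exe",
          "action": "shadow_copy_delete"}

# ...but so is an attacker who simply runs under the excluded account.
evasion = {"user": "ServiceAccount", "process": "ransom.exe",
           "action": "shadow_copy_delete"}
```

Both events return `False`. The exclusion was written against one benign pattern, but it suppresses every pattern that shares the excluded field.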

This feels like progress because the queue goes down. The team breathes easier. Sanity is restored.

But you are buying that sanity with the currency of visibility.

Attackers love our “hygiene.” They thrive in the exceptions. They live in the “trusted” processes and the “noisy” accounts that we tuned out because they were annoying.

We are trading false positives for false negatives.

The goal shouldn’t be to silence the sensor. It should be to automate the investigation of the signal so that volume doesn’t dictate our risk posture.
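One way to act on that: keep the rule broad, and move the benign-context knowledge into triage instead of into the query. A minimal sketch, with an assumed allowlist of expected (user, process) pairs; unlike an `AND NOT` clause, a known-benign hit is auto-closed with an audit trail rather than never seen, and the match is on the full context, not a single field:

```python
# Hypothetical auto-triage: every hit is kept and given a verdict,
# instead of being excluded at the rule level.
EXPECTED_CONTEXT = {("ServiceAccount", "BackupExec")}  # assumed allowlist

def triage(event: dict) -> dict:
    """Return a verdict for a rule hit instead of dropping it."""
    key = (event.get("user"), event.get("process"))
    if key in EXPECTED_CONTEXT:
        return {"event": event, "priority": "low",
                "status": "auto-closed",
                "reason": "matches expected backup activity"}
    return {"event": event, "priority": "high",
            "status": "open",
            "reason": "no benign explanation on file"}

# Expected backup activity: closed automatically, queue stays quiet.
routine = triage({"user": "ServiceAccount", "process": "BackupExec"})

# Same account, unexpected process: this time it surfaces.
abuse = triage({"user": "ServiceAccount", "process": "ransom.exe"})
```

The queue volume drops for the same reason as before, but the visibility survives: the excluded account abusing a different process still produces an open, high-priority alert.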

Don’t tune for silence. Tune for signal.