Code-quality enhancing tools like compiler warnings, static code analyzers, or sanitizers provide invaluable feedback. They work well when applied from the beginning. Retrofitting them onto legacy code typically yields a long list of issues, and fixing all of them could stall your team for a while.
You will certainly want to update those tools from time to time: new compilers come with additional warnings, static code analysis rules evolve, and so on. Suddenly your code is no longer rated as highest quality. The length of the issue list depends on how often you update those tools. The underlying problem is that every successful software product inevitably turns into a legacy product.
That’s why SonarQube’s "fix the leak" metaphor is so useful. It means that code written or changed after a baseline date must meet the highest standards and must not generate any new issues. Code older than the baseline date is deliberately ignored.
This allows you to update code-quality enhancing tools & rules without stalling development teams, because every tool & rules update merely re-declares the existing code base as "legacy". When combined with a strategy to gradually clean up the mess[1], this takes the pain out of updating tools & rules.
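SonarQube derives the baseline from SCM data; a minimal do-it-yourself sketch of the same idea could use `git blame` to keep only the findings that sit on lines changed after the baseline date. Everything here is an assumption for illustration: the `(file, line, message)` finding shape, the `BASELINE` constant, and the helper names are hypothetical, not SonarQube's actual implementation.

```python
#!/usr/bin/env python3
"""Sketch: filter analyzer findings down to "new code" after a baseline."""
import subprocess
from datetime import datetime, timezone

BASELINE = datetime(2020, 1, 1, tzinfo=timezone.utc)  # assumed baseline date

def line_commit_dates(path):
    """Map each line number of `path` to the date it was last changed."""
    out = subprocess.run(
        ["git", "blame", "--line-porcelain", path],
        capture_output=True, text=True, check=True,
    ).stdout
    dates, current, lineno = {}, None, 0
    for raw in out.splitlines():
        if raw.startswith("committer-time "):
            current = datetime.fromtimestamp(int(raw.split()[1]), tz=timezone.utc)
        elif raw.startswith("\t"):  # actual source line follows the headers
            lineno += 1
            dates[lineno] = current
    return dates

def new_issues(findings):
    """Keep only findings on lines touched after the baseline date."""
    cache = {}
    for path, line, message in findings:
        if path not in cache:
            cache[path] = line_commit_dates(path)
        if cache[path].get(line, BASELINE) > BASELINE:
            yield path, line, message
```

Feeding it findings parsed from compiler or clang-tidy output would then report only issues in code your team actually touched since the baseline.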
One drawback is that most tools need to process entire files and cannot be applied to change deltas. Changing a single character in a big file can lead to many new issues. In his CppCon 2019 talk, Fred Tingaud sketches a simple algorithm to improve granularity: apply the code-quality checks before & after a change and report only the delta as new issues.
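A minimal sketch of that before/after diffing, under assumptions of my own: the checker is clang-tidy (any line-oriented checker would do), and matching issues by normalized message text alone is good enough. The hard part Tingaud addresses, matching old and new issues when line numbers shift, is glossed over here by stripping locations entirely.

```python
#!/usr/bin/env python3
"""Sketch of the before/after delta idea: report only issues the change added."""
import re
import subprocess
import sys
from collections import Counter

CHECKER = ["clang-tidy", "--quiet"]  # assumed checker; swap in your own

def run_checker(path):
    """Run the checker on one file and return its findings, locations stripped."""
    proc = subprocess.run(CHECKER + [path], capture_output=True, text=True)
    findings = []
    for line in proc.stdout.splitlines():
        # Normalize "file:line:col: warning: ..." to "warning: ..." so issues
        # still match when unrelated edits shift line numbers.
        m = re.match(r".*?:\d+:\d+: (.*)", line)
        if m:
            findings.append(m.group(1))
    return Counter(findings)

def delta(path_before, path_after):
    """Issues present after the change but not before: the 'new' issues."""
    return run_checker(path_after) - run_checker(path_before)

if __name__ == "__main__":
    for issue, count in delta(sys.argv[1], sys.argv[2]).items():
        print(f"{count}x NEW: {issue}")
```

Using `Counter` subtraction keeps only positive counts, so an issue that existed twice before and three times after is reported once as new, while pre-existing issues disappear from the report.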