If a company creates a backdoor for a government in a way that makes it look like an accidental "oopsie" vulnerability, it gains plausible deniability. But if someone then reports the vulnerability and the company doesn't fix it, that deniability largely falls apart.
Operation Triangulation https://securelist.com/operation-triangulation-the-last-hard... revealed the use of four different vulnerabilities, hidden code and undocumented features to take over iPhones. Its sophistication points to an APT. Apple did not deny helping the attacker, to my knowledge.
I can imagine many non-malicious alternative explanations for Apple not commenting on that particular vulnerability. For example, doing that here opens the door to a future in which every non-denial is seen as implicit admission of collaboration.
It's also possible that Apple itself was compromised: it's a large company, and other kinds of leaks do happen.
I'd focus much more on the things Apple very publicly does not do, such as in this case not using private set intersection for AirDrop.
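To illustrate why the absence of PSI matters: a protocol that broadcasts a plain hash of a low-entropy identifier such as a phone number is trivially brute-forceable by anyone who observes the hash. This is a hypothetical sketch, not AirDrop's actual wire format; the function names and the toy number space are assumptions for the demo.

```python
import hashlib

def naive_discovery_token(phone: str) -> str:
    # What a non-PSI contact-discovery scheme might expose:
    # a plain SHA-256 hash of the phone number.
    return hashlib.sha256(phone.encode()).hexdigest()

def brute_force(token: str, prefix: str = "+1555", digits: int = 4):
    # Real ten-digit numbers are only ~10^10 possibilities; here we
    # search a toy 10^4 space so the demo runs instantly. The attacker
    # just hashes every candidate and compares.
    for i in range(10 ** digits):
        candidate = prefix + str(i).zfill(digits)
        if naive_discovery_token(candidate) == token:
            return candidate
    return None

observed = naive_discovery_token("+15551234")
print(brute_force(observed))  # recovers "+15551234"
```

A private set intersection protocol avoids this by never revealing either party's set members outside the intersection, at the cost of a more complicated cryptographic exchange.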
The best vulnerability is the one you don't even have to defend, because it's just the absence of a more secure (but also more complicated) alternative. There are countless historical examples of that: Unencrypted instant messaging, non-end-to-end encrypted cloud storage etc.
The idea that Apple needs to create backdoors for governments seems absurd when you consider that any competent government will have essentially unfettered access to Radar, Apple's internal bug tracker, i.e. a firehose of almost every security bug discovered by Apple or reported to it.