This article originally appeared on Medianama. An excerpt is reproduced here:
The presence of a political context surrounding these cases also raises the question of how Facebook is responding to the possible weaponisation of its community reporting. We know from Facebook’s August 2020 CIB report that it took action against a network engaged in mass reporting. What principles does it use to define thresholds for action? How is coordinated activity that falls below its self-defined threshold of Coordinated Inauthentic Behaviour handled? Knowledge about the specifics of these thresholds becomes essential when they make the difference between publicly disclosed and internal actions, as the Sophie Zhang – Guardian series demonstrated in the Indian context.
Facebook — this applies to other networks too, but Facebook is by far the largest in India — needs to put forward more meaningful explanations in such cases. Ones that amount to more than ‘Oops!’ or ‘Look! We fixed it!’. There are, after all, no secret blocking rules stopping it from explaining its own mistakes. These explanations don’t have to be immediate. Issues can be complex, requiring detailed analysis. Set a definite timeline, and deliver. No doubt, this already happens for internal purposes. Then, actually show progress. Reduce the trust deficit, don’t feed it.
This does raise the concern of being drawn into narrow content-specific conversations or being distracted by ‘transparency theater’, thereby missing the forest for the trees. These are legitimate risks and need to be navigated carefully. The micro-level focus can be on specific types of content or actions on a particular platform. At the macro-level, it is about impact on public discourse and society. They don’t have to be mutually exclusive, and what we learn at one level should inform the other, in pursuit of greater accountability.