Research communities have different perspectives, priorities, and practices. In the context of studying misinformation interventions, taking stock of what generally sets research constituencies apart can illuminate the untapped opportunities for collaboration between them, or at the very least suggest ways of overcoming existing barriers to collaboration. To generalize, academics tend to be motivated by a basic scientific understanding of a phenomenon (in this case, the consuming, engaging with, and sharing of misinformation) and secondarily interested in the design of platform features to affect those phenomena. Platform researchers, on the other hand, are primarily motivated by addressing problems on their own product surfaces and secondarily interested in advancing science. Similarly, research communities prioritize different contexts for impact: academics are inclined to optimize for advancing scientific understanding with high-quality and thoroughly cited publications, while platform researchers aim to improve or inform new feature design. Unsurprisingly, research communities lack a common framework for conceptualizing human behavior and for evaluating the efficacy of the online tools and strategies that aim to affect that behavior; we also miss opportunities to collaborate on developing such frameworks. Compounding this challenge, most research conducted internally by the platforms will only make its way into the public domain if the platforms choose to release it publicly. This is similar to the "file drawer" problem in academia, where less interesting (and often null) results fail to be published, biasing the overall accumulation of knowledge.
Users of social media continue to be confronted with misinformation despite the progress made by major social media companies in scaling enforcement and ranking approaches to addressing the harms caused by misleading or inaccurate information online.1 Research suggests both that informed users can slow the spread of misinformation2 and that users want the tools to make these judgment calls for themselves,3 but we lack a robust foundational understanding of how to achieve this goal. In this paper, we explore how greater collaboration between research communities, in particular those inside technology platforms, the academy, and civil society, can accelerate progress toward empowering users to be safe and informed online.4 Bridge-building is necessary to unlock what types of interventions are best suited to address threats within the information environment, particularly in the context of democracies. One way forward to fostering evidence-based decision-making to tackle misinformation online is to establish a shared understanding of the aims of interventions and the metrics for assessing them. With advances in platform data access for external parties emerging through the European Union's Digital Services Act and the European Digital Media Observatory's outline guiding such a regime, this paper invites both academics and platform researchers to engage in a dialogue about how measurement research can evolve through collaboration. Finally, we introduce a measurement attributes framework to aid the development of feasible, meaningful, and replicable metrics for researchers and platform practitioners to consider when developing, testing, and deploying misinformation interventions.
The lingering coronavirus pandemic has only underscored the need to find effective interventions to help internet users evaluate the credibility of the information before them. Yet a divide remains between researchers within digital platforms and those in academia and other research professions who are analyzing interventions. Beyond issues related to data access, a challenge deserving papers of its own, opportunities exist to clarify the core competencies of each research community and to build bridges between them in pursuit of the shared goal of improving user-facing interventions that address misinformation online. This paper attempts to contribute to such bridge-building by posing questions for discussion: How do different incentive structures determine the selection of outcome metrics and the design of research studies by academics and platform researchers, given the values and objectives of their respective institutions? What factors affect the evaluation of intervention feasibility for platforms that are not present for academics (for example, platform users' perceptions, measurability at scale, and interaction and longitudinal effects on metrics that are introduced in real-world deployments)? And what are the mutually beneficial opportunities for collaboration (such as increased insight-sharing from platforms to researchers about user feedback on a diversity of intervention designs)?