Recently, Microsoft discussed in a blog post how it will address internet-related terrorist content. While I did not find anything too surprising here, I did like reading about Microsoft's partnership efforts. Here is a sample of how Microsoft is partnering with others.
Leveraging new technologies: One challenge is that once a technology firm removes terrorist content, it is often quickly posted again. It is a game of “whack-a-mole,” but with serious consequences. We want to see if technology that has worked well in other circumstances can be used to good effect here. That’s why we are providing funding and technical support to Professor Hany Farid of Dartmouth College to develop a technology to help stakeholders identify copies of patently terrorist content. The goal is to help curb the spread of known terrorist material with a technology that can accurately and proactively scan and flag public content that contains known terrorist images, video and audio.
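The idea of flagging copies of already-identified material can be sketched as a hash lookup against a database of known items. This is only an illustrative simplification: the technology Professor Farid is developing uses robust (perceptual) hashing that survives re-encoding and minor edits, whereas the plain cryptographic hash below matches only byte-identical copies. All names and sample values are hypothetical.

```python
import hashlib

# Hypothetical database of digests for previously identified content.
# (Placeholder value; a real system would be populated by trusted reviewers.)
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-sample").hexdigest(),
}

def flag_known_content(upload: bytes) -> bool:
    """Return True if this upload is a copy of a previously flagged item."""
    return hashlib.sha256(upload).hexdigest() in KNOWN_HASHES

print(flag_known_content(b"known-bad-sample"))  # exact copy: flagged
print(flag_known_content(b"fresh upload"))      # unseen content: not flagged
```

The key design point, which this toy version shares with the real approach, is that matching is proactive and automatic once an item is in the database, which is what makes the "whack-a-mole" re-posting problem tractable.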
Investing in public-private partnerships: We know that tackling these difficult issues will require new and innovative partnerships that bring together experts and leaders from different backgrounds and perspectives. To help with this, we're a founding member and a financial sponsor of a new public-private partnership to develop or enhance activities to help combat terrorist abuse of Internet platforms. Launched in April in Geneva, the initiative brings together the United Nations Counter-Terrorism Committee Executive Directorate, civil society, academics, and government and industry representatives to address terrorist content.
Providing additional information and resources: We appreciate that we can also work to enhance education and understanding, especially among young people. To help, we're adding new resources to the online safety program pages of our YouthSpark Hub, an important component of Microsoft's YouthSpark initiative, which provides access to educational and economic information and opportunities for young people around the world. The YouthSpark Hub provides resources for safer online socializing and tools to identify the risks and responsibilities of being good digital citizens. The new resources include material designed to help young people distinguish factual and credible content from misinformation and hate speech, as well as tools for reporting and countering negative content. Experts say youth with more fully developed analytical and critical-thinking skills are less likely to start down questionable paths, including those toward radicalization.