Microsoft launches new open-source tools for auditing AI-powered content moderation

Microsoft has announced the launch of open-source tools and datasets designed to audit AI-powered content moderation systems and automatically write tests that highlight potential bugs in AI models.

The projects are called AdaTest and (De)ToxiGen.

“We recognize that any content moderation system will have gaps, and these models are going to need to improve constantly. The goal with (De)ToxiGen is to enable developers of AI systems to find risks or problems in any existing content moderation technology more efficiently,” said Ece Kamar, a partner research area manager at Microsoft Research and a project lead on AdaTest and (De)ToxiGen.

“Our experiments demonstrate that the tool can be applied to test many existing systems, and we are looking forward to learning from the community about new environments that would benefit from this tool.” 
