Modulate is an AI company focused on combating toxic behavior in online spaces. Its voice-native moderation system, ToxMod, goes beyond user reports to proactively identify harmful speech and protect vulnerable players. ToxMod is designed to distinguish friendly banter from genuinely harmful interactions, and it can be tuned to prioritize specific categories of harm based on a community's needs. Modulate also emphasizes player privacy and compliance with regulations such as GDPR and COPPA. The tool is trusted by companies including Riot Games and has been used to help build positive player communities. Beyond online gaming, it has potential applications in social media platforms and other online spaces where toxic behavior is prevalent.