Resolver is a high-growth SaaS company whose intuitive, no-code platform gives our customers a clear picture of their risks so they can make quick and effective decisions. As a part of the Resolver team, your work will help transform risk management to risk intelligence so organizations can protect people and assets and deliver on their purpose.
We are ambitious in both our mission and our culture. As a business within Kroll, we offer an innovative, non-hierarchical work environment blended with the stability and financial security of an enterprise. Resolver has also been named one of Canada’s Great Places to Work six years in a row!
By combining Artificial and Human Intelligence, Resolver's Intelligence delivers 24/7/365 safety by continually fighting the weaponization of communications, whatever the source, the language, or the online harm.
The Role
We are looking for English-speaking Content Moderators with a keen interest in Trust & Safety and risk-based content moderation, and an excellent understanding of harmful content across social media platforms.
The role will include the identification and classification of content containing core risks, such as hateful and abusive chatter, bad actor profiles and sector-specific risk, as well as reviewing imagery, video and audio to assess content type and narrative.
Accuracy is essential in this role, and the ability to process large data volumes whilst maintaining high-quality outputs is key.
Criteria
- Excellent communication skills, with a high standard of both written and spoken English
- Ability to work under time pressure to maintain high-quality outputs whilst meeting Service Level Agreements
- Passionate about quality delivery
- Flexible, enthusiastic and confident working approach
- Methodical approach to work
- Self-motivated and able to work independently without close supervision
- Great organisational skills with the ability to prioritise
- Proactive and willing member of a team
- Additional languages are desirable but not essential.