12 March 2024

Transcom Chats about Trust and Safety - with Iulian Bacain


‘Transcom Chats’ is a series of interviews with Transcom staff about the topics we’re passionate about. The aim is to bring forward and highlight the people who make Transcom the brand you know and love.

With Trust and Safety fast becoming one of the most important aspects of any modern business, we thought it important to talk to our leading people in the area. Today, we’re joined by Iulian Bacain, our VP of Trust and Safety Growth.


  • Introduce yourself and tell us a bit about what you do at Transcom.

I’m Iulian, VP of Trust and Safety Growth at Transcom. In my role, I oversee the design of outsourced Trust and Safety solutions: creating safer online environments through content moderation, enhancing user experiences with personalized and relevant advertising, and developing custom large language models (LLMs) to stay at the forefront of technology.

  • Why is trust and safety an important topic to you personally?

Trust and Safety operations are fundamental to fostering inclusive, supportive communities where individuals of all ages, especially vulnerable groups like children, can interact safely. For me, it's important to contribute to a digital ecosystem that prioritizes the well-being of its users, promoting a space where everyone can learn, engage, and connect without fear of encountering harmful content. It's a matter of personal values; creating a secure online space is as much a personal mission as a professional one.

  • Why is trust and safety an important topic to Transcom?

Trust and Safety are foundational elements that support the integrity of digital platforms and build user trust. These practices are vital for our clients' growth and success, directly impacting user retention and satisfaction. Our efforts in this area are crucial for maintaining a reputable and secure online presence for the brands we serve.

  • How does content moderation play into it?

Content moderation is the cornerstone of Trust and Safety, acting as a filter that maintains the digital environment's health by removing harmful content and ensuring compliance with platform standards and regulations. It's a dynamic and essential process to safeguard the user experience.

  • In your opinion, what are the main challenges when it comes to content moderation?

The main challenges include scaling moderation efforts to manage the sheer volume of user-generated content, understanding cultural context, adapting to the evolving nature of online threats, navigating regulatory landscapes, and maintaining a delicate balance between safeguarding users and upholding freedom of expression. Furthermore, protecting content moderators from the psychological impact of exposure to harmful content is a significant concern.

  • Seeing as moderators are the cornerstone of content moderation, what do we do to prepare them for the specifics of the job?

Our approach integrates a comprehensive wellness framework starting from recruitment, emphasizing mental resilience. We support our staff with a nurturing environment that includes both proactive and reactive wellness measures. Tools for mood monitoring, wellness programs, and access to professional mental health support are some of the ways we prepare and sustain our moderators for their critical role.

  • Why do you feel wellness and wellbeing are so important?

Given the nature of their work, moderators are routinely exposed to distressing content, making their mental health and overall well-being paramount. It's essential to provide them with the necessary support to navigate these challenges, ensuring they remain effective in their roles while also safeguarding their psychological health.

  • What does Transcom do in that regard?

We offer a holistic support system that includes 24/7 access to certified psychological support, comprehensive wellness programs, counseling services, and resilience training. We've designed our work environment to include proactive and reactive support measures, empowering staff to manage their well-being with tools and facilities tailored to their needs. This includes relaxation rooms and game areas, which offer various ways for staff to decompress. Daily pulse checks are conducted using tools like T:Buddy to monitor moods, with the results reviewed at all levels of the business, ensuring a pervasive culture of well-being.

  • Being a tech-forward company, does AI help with content moderation?

Yes, artificial intelligence plays a critical role in streamlining content moderation by automating the detection and flagging of harmful content. AI technologies complement human moderators, enhancing efficiency and accuracy across our operations.

  • Any parting words?

The collaboration between technological advancements and human expertise is fundamental to advancing Trust and Safety efforts. At Transcom, we are committed to leveraging this synergy to create secure digital spaces, support the well-being of our teams, and ensure that digital platforms remain safe and welcoming for all users.

Thanks for reading.

If you want to know more about Transcom's views on Trust and Safety, or want to talk to our experts in person, you can find out more below.