Facebook will create 1,000 new roles in London over the course of this year, including adding to its team tackling harmful online content.
More than half of the new jobs will be technology-focused, with roles in software engineering, product design and data science, the company said.
It will take Facebook’s UK workforce to more than 4,000.
Pressure has been growing on social media firms to remove posts promoting self-harm and political extremism.
Facebook’s chief operating officer Sheryl Sandberg will announce the new jobs in London later on Tuesday, before travelling to the World Economic Forum in Davos.
“Many of these high-skilled jobs will help us address the challenges of an open internet and develop artificial intelligence to find and remove harmful content more quickly,” she is expected to say.
Those roles will be in Facebook’s “community integrity” team, which designs tools to police posts on Facebook’s platforms including Messenger, Instagram and WhatsApp.
The firm had decided to invest more in policing online content following the suicide of teenager Molly Russell in 2017, said Steve Hatch, the firm’s vice-president for northern Europe.
“The tragic death of Molly Russell made us really stop in our tracks as a company and acknowledge that there was an issue that we need to do more on.
“We’ve been putting those changes in place steadily over the last 12 months,” he said.
Molly Russell was 14 when she took her own life. Her father, Ian, told the BBC that he believed Instagram was partly responsible for his daughter’s death.
“I have no doubt that Instagram helped kill my daughter,” he said.
Facebook aimed to build on the progress it had made in tackling terrorist content in order to remove other harmful material, such as posts promoting self-harm, Mr Hatch said.
The firm had detected and removed two million such posts from Facebook and 800,000 from Instagram, he added.
“As systems develop, they get better and more effective,” he said. “Our aspiration is to remove every single piece of [harmful] content.”