Ardern on Sunday said she would be seeking answers from social media firms about how the mosque attack, which killed 50 people on Friday, was livestreamed on their platforms.
The distressing 17-minute livestream was available to watch on social media for hours after the attack that also left 34 people wounded.
Ardern said there were “further questions to be answered” by the social media sites.
“We did as much as we could to remove, or seek to have removed, some of the footage that was being circulated in the aftermath of this terrorist attack,” said Ardern.
“But ultimately it has been up to those platforms to facilitate their removal. I do think that there are further questions to be answered.
“I have had contact from Sheryl Sandberg [Facebook COO]. I haven’t spoken to her directly but she has reached out, an acknowledgment of what has occurred here in New Zealand,” Ardern said at a media conference when asked if Facebook should stop livestreaming.
On Sunday, Facebook said it removed 1.5 million videos of the Christchurch shootings “in the first 24 hours”.
“We continue to work around the clock to remove violating content using a combination of technology and people,” Mia Garlick, who works for Facebook in New Zealand, said on Twitter, adding that of the removed videos, 1.2 million were “blocked at upload”.
“Out of respect for the people affected by this tragedy and the concerns of local authorities, we’re also removing all edited versions of the video that do not show graphic content,” she said.
Hours after the attack, New Zealand police said they were working to have the footage removed while urging people not to share it.
Tech companies “have a content-moderation problem that is fundamentally beyond the scale that they know how to deal with,” Becca Lewis, a researcher at Stanford University and the think-tank Data & Society, was quoted as saying by the Washington Post.
“The financial incentives are in play to keep content first and monetization first.”
On Friday, YouTube tweeted it was “working vigilantly to remove any violent footage” while Twitter said it suspended the account of one of the suspects.