In March, Instagram will introduce its first parental controls, intended to shield young users from harmful content and keep them from overusing the app.
Adam Mosseri, the head of Instagram at Meta, the parent company of Facebook, said in a post that parents would be able to see how much time their teenage children spend on the app, and that teenagers would be able to notify their parents when they report someone for a violation.
He said in the post that the company would add more options over time.
Mr. Mosseri is expected to field questions about the harms of social media to children and teenagers when he appears before a Senate committee on Wednesday. Leaked internal documents have shown that the company was aware the app made teenage girls feel worse about themselves.
Mr. Mosseri also said in the post that more child safety improvements were on the way. Users will no longer be able to tag teenagers who don't follow them, and in January the app will release a feature that allows users to remove their posts, comments, and likes in bulk.
It was not clear whether Mr. Mosseri's announcement would appease lawmakers. Critics of the company have said that "Meta is attempting to shift attention from their mistakes by rolling out parental guides, use timers and content control features that consumers should have had all along," adding, "We see what they are doing."
New child safety guidelines in Britain have led major tech platforms to change how children can use their products. Instagram is considering asking some young users to go through a stricter process to verify their age, but it has not yet added those features.
Cecilia contributed to the reporting.