Meta decides to protect young users from unwanted messages on Instagram and Facebook
Meta has announced enhanced safeguards to protect young users from unwanted direct messages on Instagram and Facebook.
Meta
1/29/2024 · 1 min read

Meta has announced enhanced safeguards to protect young users from unwanted direct messages on Instagram and Facebook. The move follows recent commitments by Meta, which also owns WhatsApp, to tighten content restrictions for young users amid regulatory pressure to shield children from harmful content across its platforms.
In response to growing regulatory scrutiny, heightened by a former Meta employee's testimony before the U.S. Senate, Meta has introduced specific measures. By default, young users on Instagram will no longer receive direct messages from people they do not follow or are not connected to. In addition, parental approval will be required to change certain app settings.
On Messenger, users under 16 (and under 18 in select countries) will only receive messages from Facebook friends or people in their phone contacts. Meta has also restricted adults over the age of 19 from sending messages to young users who do not follow them.
These protective measures aim to address concerns about harassment and potential harm faced by young people on Meta's platforms, signaling the company's commitment to creating a safer online environment for its younger users.