Instagram is the latest platform to launch new tools aimed at enforcing its age policies, as kids’ screen time rises and regulators threaten greater scrutiny of social media’s effects on children.
Why it matters: Social media giants are grappling with how to determine with certainty the age of their visitors to both evade regulatory crosshairs and create safer online experiences for children.
Driving the news: Instagram said Thursday that, starting in the U.S., users who try to edit their birthdate from under 18 to over 18 will have to prove their age in one of three ways:
- Upload their IDs, which are deleted within 30 days.
- Record a video selfie that will be reviewed by Yoti, a company that uses facial analysis technology to estimate age and then deletes the video.
- Ask three mutual followers who are at least 18 to confirm the user’s age, an option Instagram calls social vouching.
The big picture: Apps are experimenting with methods of verifying ages beyond simply asking users to plug in a birthdate, as well as designing separate versions of their services for younger users.
- Youth gaming platform Roblox in September announced a new age verification system requiring users to upload a photo ID and a selfie to prove they’re real.
- YouTube created a standalone app, YouTube Kids, for users under 13 while Facebook launched Messenger Kids for younger users.
- TikTok created a view-only version of its video-sharing app for kids under 13, and removed 15 million underage accounts between October and December last year. The company removed 3.5 million suspected underage accounts during the same period in 2020.
What they’re saying: Erica Finkle, director of data governance at Instagram parent Meta, told Axios the new age verification system for now applies only to users attempting to change their birthdate from under 18 to over 18, but could be expanded in the future.
- “We’re focused on really providing age-appropriate experiences and so that’s why we’re starting with this as a test,” Finkle said. “We will consider where across Meta technologies to use it going forward, but we want to test right now for effectiveness and how people are using it.”
- Instagram rolled out safety features for teenage users last year, including making the accounts of users under 16 private by default.
The intrigue: Finkle said in a blog post the company still believes the best solution is for devices or app stores to provide apps with users’ ages, so they can offer age-appropriate experiences across all apps.
- Earlier this month, Apple said with iOS 16, users will be able to use their digital ID in their Apple Wallet for apps requiring identity and age verification.
- Advocates of decentralization also see an opportunity for identities to be stored on a blockchain, with users able to provide credentials to various applications and services.
- Apple’s iOS and Google’s Android system allow parents to block or limit specific apps on their child’s device.
Between the lines: Instagram requires users to be at least 13 years old, but has faced outcry from lawmakers over children’s use of the app.
- Meta in September said it was pausing development of a version of Instagram for kids under 12 after a Wall Street Journal investigation revealed internal research on Instagram’s negative impact on teens.
- Instagram head Adam Mosseri said at the time that a kid-friendly version of the app is better for parents than relying on an app’s ability to verify the age of users who are too young to have an ID.
Our thought bubble: Even though a smartphone has more than enough info to verify who people are (and therefore how old they are), privacy and logistical concerns have made this a thorny issue.
- Knowing the age of users is all the more important since some of the strictest rules on the internet are designed to regulate how websites treat children’s data.
By Margaret Harding McGill