
What might social media platforms look like if teens had a voice in how they are designed?
Most digital platforms were built with adult users in mind. As a result, they do not always reflect how teens experience online spaces or what they need.
This digital design guide imagines something different: teens sharing what they want.
Developed with input from teen advisors at the American Academy of Pediatrics (AAP) Center of Excellence on Social Media and Youth Mental Health, this guide reflects a vision of social media that better supports how adolescents think, feel, and connect.
A Vision for Ideal Online Spaces for Teens
We Want Connection

- This is why we go online: to participate in pop culture, learn more about ourselves and other parts of the world, find meaningful community, and laugh or sympathize with our friends. While we can often connect with peers offline, our ability to do this differs by whether we live in a busy city or on a rural farm, whether our parents can afford after-school activities, whether we have a medical condition, or whether we feel understood and accepted by the people around us.
- For younger teens (middle schoolers, ~ages 11–14), we want to connect online with our friends and peers. If we are searching for a community online, that community/forum should be safe and moderated. Direct messaging and group chats are great. We want to be able to watch videos and see posts from positive, funny, and educational sources. These earlier years of social media use could be simpler and supported so that we have clear expectations of being kind online. We should also be encouraged to log off to go do other things. Additional options for setting screen time limits or “gamified” adherence to screen time limits may be helpful.
- For older teens (high schoolers, ~ages 15 and above), we are starting to build broader connections. This would continue to include direct messages (DMs) and group chats, but also wider access to videos and posts that help us explore interests. Less support is needed at this age, as we should be developing better self-regulation and critical thinking.
We Want Privacy
- Other people shouldn’t be able to locate us without our express permission or invitation. Location-sharing services on social media should not be available for people under 16. (We have other ways to share location through our phones if needed, for example, to coordinate a pickup from parents.) For those 16+, location services should be opt-in only.
- We don’t want to be contacted by adults we don’t know. Our accounts should be private and not discoverable by algorithms. People 18 and over should not be able to initiate conversations with teen accounts, unless it is a parent or someone we invite to connect.
- Messaging for teen accounts should include additional safety protections, such as:
- Easy-to-access blocking and reporting tools, including a popup option asking whether we want to report a message as spam and block the sender.
- Stronger protections on photo sharing, such as limiting the ability to send images until a teen has accepted the initial message, added the person as a friend, and responded at least once.
- Clear distinctions for club, school, and community organization accounts, ensuring that teens can still communicate with legitimate group or educational accounts that may need broader messaging access.
- A short message preview, allowing us to see one sentence before deciding whether to open the message.
- Separate inbox filtering, so messages from people we don’t know automatically go into a different folder. This helps us recognize when we need to be more cautious without fully blocking all forms of contact.
We Want More Control of Ads We See and How We Are Profiled for Marketing Purposes

- Transparency builds trust. We want simple, teen-friendly explanations of how our data is collected, stored, and used. Note: AAP policy recommends against all data-driven profiling or targeted advertising for children and adolescents.
- We need simple, one-click options to opt out of targeted ads, certain types of ads, and data sales so that we don’t have to be exposed to political ads, diet ads, or ads for products that make us feel insecure.
We Want to Feel Safe
- Sometimes the content that trends on digital media is violent, stressful, or overwhelming. We want feeds that prioritize our positive experiences rather than prioritizing engagement and view time. Recommendation algorithms should deprioritize or completely filter out content and accounts flagged for hate speech, discrimination, dangerous challenges, or harmful behavior, and platforms should prohibit engagement-driven amplification of provocative or controversial content.
- We want the power to block and report and have these practices lead to meaningful action. Platforms should clearly communicate how content is moderated and provide teens with easy-to-use reporting tools when we encounter what we perceive to be unsafe or harmful content.
- We want our time and attention respected. Content farm accounts that make money by flooding feeds with “brain rot,” especially that generated by artificial intelligence (AI), should be deprioritized, or platforms should allow teens to turn off content that may be AI-generated. We want to follow inspirational, interesting, funny, and human accounts.
We Want to Feel Seen and Welcomed
- We want to see ourselves in the media we consume. For example, it helps when social media platforms provide options to filter by hair type and body type when searching for beauty/haircare products.
- Content and accounts reported for hate speech or discrimination should not be recommended on teen feeds.
- Highlight uplifting, educational, and creative content rather than sensational or harmful trends.
We Want Trusted Information
- We often go searching online for answers when we are stressed. If teens enter keywords such as “suicide” or “depression,” or related types of search terms indicating distress or unhealthy behaviors like eating disorders, platforms should show them resources that are approved by medical and mental health professionals, rather than user-created content.
- Provide accessible, human-reviewed libraries of social media content on topics like “anxiety” and “handling stress.”
We Want to Build Media Literacy As We Go
- Kids and teens learn a lot about how media works simply from interacting with it. Therefore, platforms should provide teens and parents with plain-language information on digital/media literacy in multiple locations, including in their feeds and in an easy-to-access area such as a family center.
- Do not recommend content that is suspected to be misinformation to teens. Allow teen accounts to flag content that they perceive to be misinformation.
- When teens search for medical information, surface links to reliable sources first (eg, American Academy of Pediatrics, American Medical Association).
We Want Balanced Social Media Use

- Higher social media use can be a sign that we are stressed or struggling. Provide parents/guardians with insights about the social media use of those with teen accounts, such as daily duration, number of new accounts followed, and whether the teen blocked or flagged contacts or content.
- As a default, add more friction and reminders for teens to get off their phones:
- during school hours,
- during late-night hours (eg, 11:00 pm), encouraging healthy sleep habits and reducing screen time before bed, and
- after 45 minutes of app use, encouraging breaks and suggesting healthy offline activities (ideally pre-chosen by the teen to increase their buy-in and motivation).
We Want to Explore AI But Not Become Dependent On It
- AI chatbots shouldn’t show up in our feeds or contact lists. We want to interact with real humans.
- When we seek out information from AI, we want systems that detect and interrupt harmful dialogue loops and avoid offering therapeutic or medical advice. If AI chatbots are designed to provide therapy advice, we want them to be thoroughly tested for safety and effectiveness.
- We don’t want AI to agree or tell us we’re right all the time (sycophancy). We grow more when we are challenged to think for ourselves and learn new skills. It’s especially important not to let AI reinforce harmful beliefs.
- If we are in a crisis or struggling emotionally, help us find trusted resources like a suicide hotline rather than trying to be our AI therapist or engaging us even more. Teens should not be able to prolong conversations with AI on sensitive topics; prompts to continue should be disabled.
We Want Online Spaces That Help Us Protect Our Digital Footprint and Reputation

- What we share online can follow us for years — including when we apply to colleges, scholarships, or future jobs. Platforms should support teens in making thoughtful choices and give us tools to manage how we show up online.
- Platforms should offer clear, teen-friendly prompts showing what parts of our profiles or posts are publicly visible.
- Teens should have simple, one-click tools to hide old posts, bulk-delete content, or switch previously public posts to private.
Funding for the Center of Excellence was made possible by Grant No. SM087180 from SAMHSA of the US Department of Health and Human Services (HHS). The contents are those of the author(s) and do not necessarily represent the official views of, nor an endorsement by, SAMHSA/HHS or the US Government.
Last Updated
05/11/2026
Source
American Academy of Pediatrics
