Credentialed media at the University of Colorado are once again free to share simulations of game footage. The university has removed a provision from its media policy that barred outlets from sharing “[s]imulated video or slideshows mimicking game action.”
As FIRE wrote in our letter to CU, the policy impermissibly restrained journalists from choosing to use “‘simulated game action’ or a slideshow to display game data.” That kind of choice is exactly the sort of editorial decision the First Amendment requires be left to journalists, not the government.
Conditioning credentials on this unconstitutional requirement restricted the First Amendment freedoms of journalists miles away from the field, court, or swimming pool. While universities can sell exclusive broadcasting rights to their sporting events, they can't dictate how media members report on what happened in an athletic competition.
Our letter called on CU to repeal the policy. Thankfully, it did.
This welcome change comes as a direct result of that letter. In his reply, Athletic Director Rick George acknowledged that the university’s policy was far broader than administrators had first realized. He also affirmed CU’s strong commitment to free expression and committed to repealing the policy, which went well beyond the university’s obligations as a member of the Big 12 Conference and signatory to the conference’s media rights deal.
CU’s response here is exactly what universities should do when their policies fall short of their First Amendment obligations: acknowledge the problem, commit to protecting expression, and promptly fix the issue. And it’s surely part of the reason the university is ranked fifth in FIRE’s College Free Speech Rankings, with a majority of students saying CU is at least somewhat clear that the administration protects free speech on campus.
FIRE’s Student Press Freedom Initiative is pleased to see CU put the free press over profits. Other universities should take a page from the Buffaloes’ book.
FIRE defends the rights of students and faculty members — no matter their views — at public and private universities and colleges in the United States. If you are a student or a faculty member facing investigation or punishment for your speech, submit your case to FIRE today. If you’re a college journalist facing censorship or a media law question, call the Student Press Freedom Initiative 24-hour hotline at 717-734-SPFI (7734). If you’re a faculty member at a public college or university, call the Faculty Legal Defense Fund 24-hour hotline at 254-500-FLDF (3533).
Faculty members in the Los Angeles Valley College (LAVC) Media Arts Department implicated in decades of fraud and misconduct have been removed from administrative positions, though they remain in teaching roles.
Over the summer, longtime department head Eric Swelstad, who had led Media Arts since 2008, was replaced as chair by Chad Sustin, a full-time professor of Cinema and Media Arts. The change followed a notification from the LACCD Whistleblower Movement to new Chancellor Alberto J. Roman, alleging that Swelstad falsely claimed membership in the Writers Guild of America – West for more than 20 years and used this misrepresentation in official LACCD promotional materials.
Sustin, a tenured faculty member since 2016 and a former Technicolor post-producer, now leads the department.
The reshuffling comes amid years of internal turmoil. In 2022, full-time cinema professor Arantxa Rodriguez resigned and was replaced by Jonathan Burnett as assistant professor. Rodriguez had previously been implicated in department infighting and, alongside Swelstad, was named as a co-defendant in a 2008 case alleging failure to provide advertised technical training and education. Burnett’s hiring bypassed longtime adjunct and former grant director Dan Watanabe.
Watanabe previously administered several Media Arts training grants, the last of which—ICT & Digital Media, LA RDSN—was reported as fraudulent in 2016. The 2013 grant proposal promised courses such as The Business of Entertainment, Advanced Digital Editing, Photoshop, and After Effects. Yet once funding was approved, The Business of Entertainment and Advanced Digital Editing were archived by LAVC’s Academic Curriculum Committee and Senate. Photoshop and After Effects were offered only minimally, with After Effects disappearing after 2015 and Photoshop shifting to online-only by 2017.
Students reported the suspected fraud to the State of California in 2016, prompting a review of the grant. Renewal applications submitted by Watanabe in 2018 and 2021 were both denied.
Grant Record (Denied Renewal, 2018):
Project Title: ICT & Digital Media – LA RDSN (Renewal)
Funding Agency: CCCCO EWD
Grant Amount: $165,000
Funding Period: Oct. 1, 2018 – June 30, 2019
Project Director: Dan Watanabe
Description: Proposed renewal of the Deputy Sector Navigator grant under the California Community Colleges Chancellor’s Office, focused on curriculum development and alignment with universities and K–12 schools.
Despite this, Watanabe (who was also passed over for a full-time position at Los Angeles Pierce College) remains an adjunct faculty member slated to teach Cinema 111, Developing Movies, a field in which he reportedly last worked twenty years ago. Arantxa Rodriguez and Eric Swelstad have both been scheduled to teach in Fall 2025. Despite falsifying his credentials as a member of the Writers Guild of America – West and implying he was a Primetime Emmy winner (he was in fact the director of a movie that received a local Los Angeles Emmy in the 1990s), Swelstad is slated to teach Cinema 101 and the screenwriting core class Media Arts 129. Rodriguez will teach a remote History of Film class.
Reportedly, the department’s new full-time faculty have begun working to reverse the damage. The Fall 2025 schedule includes Media Arts 112, Creative Sound Design Workshop.
We are bombarded with information from news sites, friends, entertainment platforms and companies — through articles, messages, images, videos, ads and graphics. It can be difficult to know or even think about where the information is coming from.
Whether you realize it or not, this tsunami of information affects you: It shapes what you know about events, how you see the world and how you feel about people, issues and products.
To better understand big global issues such as climate change and how they affect communities all over the world, it helps to understand how the media functions, what journalists do, and how you can communicate about the climate crisis yourself.
The media can and, ideally, should perform a variety of critical functions in any society. It should:
• Keep the public informed of current events and issues;
• Foster informed debate and discussion on matters of public importance;
• Hold powerful governmental and private actors to account.
Where media falls short
In reality, the media often fails to do that in part because some foundational “building blocks” that keep media strong and independent have eroded. A World Bank How-To Guide on media development identifies five building blocks for a robust and independent media sector:
1. Infrastructure: Everything from transmission towers and cables, to news disseminators, to cell phone ownership should be publicly owned or exist in a competitive landscape of corporate owners.
2. Professional Skills and Editorial Independence: A country must have enough journalism professionals trained to gather, produce and publish information according to ethical standards, and who are protected by law and policies from interference by governmental or business actors.
3. Financial Sustainability: Media organizations must have financially sustainable business models that enable them to employ journalism professionals and fund the gathering, production and dissemination of news content.
4. Policy and Regulatory Environment: A country’s legal and policy framework must support and protect the gathering and disclosure of information, uphold editorial independence and protect journalists and their sources.
5. Civil Society and the Public: There must be a media-literate public, journalists’ unions and free press watchdogs to both protect the journalists doing their jobs and hold them to account for transgressions of ethical codes.
How healthy is your media ecosystem?
Many countries around the world lack some or all of the core building blocks of a robust media sector. As a result, the media content available in these countries is often poor, and the media fails to perform its good governance functions.
You can evaluate the state of the media sector in different countries by referring to a variety of online resources, including Reporters Without Borders’ World Press Freedom Index and the Media Ownership Monitor.
But even in places where the press seems to have a great deal of freedom, the media most people consume might be controlled by a very few corporate owners, and some of those corporations are privately held by a single person or family.
Can you think of some reasons why governments and families might have an interest in controlling the media?
The short answer is that owning media enables you to control the message. You can influence:
• What information is supplied;
• How much information is provided on any particular person, issue or topic; and
• How the information is presented.
A sustainable media ecosystem
In a sustainable media ecosystem both government and private media owners would fulfill the “good governance” functions discussed above: keeping the public informed, fostering debate and holding the powerful to account.
Media owners do this when they put institutional safeguards in place to ensure that the people they employ can report on issues without restraint or fear of repercussion.
This is essential because a journalist is the eyes and ears of the public. Few people have the time or energy or attention to keep an eye on all the things their government does or all the decisions corporations make that affect their lives.
That’s why historically people subscribed to newspapers and why people now follow news sites and journalists on social media. We rely on journalists to go out into the world to ask questions, observe what is happening and gather factual information to report it all back to us.
In practice, many media outlets fall short of this goal.
Profits and the press
One way reputable media organizations protect editorial independence is by separating the editorial aims of the organization from its profit-making function: the organization’s business operations don’t interact with the employees who produce its media content. That leaves journalists free to pursue important news stories, even if doing so could hurt the media outlet’s ability to sell ads or risks losing subscribers.
By doing this, the media organization builds and maintains credibility: it becomes a place where people come for information they can rely on. This information helps them make important decisions about their lives. Is it a good time to buy a house? Can they feel safe where they live? Will they be able to keep their jobs or find new ones?
Unfortunately, many media owners have found that it might be more profitable in the short term to focus news coverage in a way that pleases core audiences and advertisers. That happens when media consumers decide they will pay only for information that aligns with their beliefs and reject media that contradicts what they wish to believe.
Ultimately, we have to think of the media ecosystem as a buffet you can go to for your meals. If too many people choose only the foods that satisfy their cravings for the sweet and salty, not only will their own health suffer, but the people who stock the buffet will start eliminating healthy foods altogether. What seems like a lot of choice in what you consume will end up as a lot of the same and none of it healthy.
So what can you do to support a healthier and sustainable media ecosystem?
Understand who owns the media you consume. Diversify the sources from where you get your information and seek out contrasting perspectives. If you can afford it, pay for subscriptions to outlets that have a record of independence. Support organizations that fight for a free and robust press.
As a consumer of media, you have power you can exercise. Media producers rely on you to read or listen to or watch what they produce. If you choose to do so, you support what they are doing. If you don’t, you tell them a different message altogether.
Questions to consider:
1. What is a media ecosystem?
2. How many information sites have you visited in the last three days? Can you list them?
3. Pick one of those sites. Can you figure out who owns it? Is that company based in your country?
For schools, colleges, and universities, social media has become more than just a communications tool. It’s now a primary stage for community engagement, student recruitment, and institutional storytelling. It’s where prospects discover programs, parents check updates, and alumni stay connected. But here’s the challenge: opportunity without clear guidelines can quickly lead to risk. Without a social media policy, schools leave themselves vulnerable to privacy breaches, inconsistent messaging, blurred boundaries between staff and students, misinformation, accessibility oversights, and even regulatory non-compliance.
That’s why a strong, modern school social media policy is essential. It empowers your team with a clear mandate, sets guardrails for professional and ethical use, and establishes workflows that make social platforms a strategic advantage rather than a liability. Done right, a policy doesn’t stifle creativity; it gives staff, faculty, and student ambassadors the confidence to represent your institution authentically, safely, and effectively.
This guide will walk you through a step-by-step, practical framework for building a school social media policy from the ground up. Drawing on Canadian legal requirements like PIPEDA, MFIPPA, and FOIP/FOIPPA, as well as accessibility standards such as AODA and WCAG, we’ll highlight best practices you can adapt to your own institutional context. We’ll also pull in examples from reputable policies and toolkits already in use across the education sector, so you can see how schools of all sizes, from K-12 districts to large universities, are tackling this challenge.
The goal? To help you design a policy that protects your institution, builds trust with your community, and unlocks the full potential of social media as a driver of engagement and recruitment.
Struggling with enrollment?
Our expert digital marketing services can help you attract and enroll more students!
Step 1: Scope and Objectives (Set the Mandate)
The first step in building a school social media policy is setting its scope and objectives. In other words, define exactly what the policy will and won’t cover, and establish its purpose. Without a clear mandate, policies can easily become either too vague to be useful or so broad they’re unenforceable.
Start with the scope. Your policy should outline the types of accounts and activities it governs. This typically includes:
Official institutional accounts (the main school, college, or university channels).
Department, program, and athletics accounts managed under the institutional brand.
Professional use of social media by staff when tied to their role at the institution.
Personal accounts only when they intersect with professional responsibilities, for example, when an employee references their school role in a bio or shares institutional content.
It’s equally important to clarify who the policy applies to. Most schools extend it beyond full-time employees to include contractors, volunteers, trustees or board members, and student workers. That ensures consistency across every voice representing the institution.
Next, define platforms in scope. Policies usually include public-facing social networks (Facebook, Instagram, TikTok, YouTube, X/Twitter, LinkedIn) and messaging apps when used for school business (e.g., WhatsApp, Slack, or Teams). Learning management systems (LMS) or academic collaboration tools like Brightspace or Google Classroom may be excluded if they’re already governed by separate policies.
Finally, tie the scope to objectives. A strong policy should:
Support institutional values and brand consistency.
Protect privacy and data security.
Ensure compliance with laws and regulations.
Safeguard professional boundaries between staff, students, and the public.
Promote accessibility and inclusivity.
Provide clear guidance for staff and students so they can engage with confidence.
Example: Arcadia University’s social media policy explicitly applies to “all faculty, staff, students, trustees, volunteers, and third-party vendors” who manage accounts on behalf of the university. In other words, anyone handling an official or work-related social media presence is within the policy’s scope, not just employees. This breadth ensures a consistent standard across all channels and individuals associated with the school’s online presence.
Step 2: Risk and Needs Assessment (Ground It in Reality)
Before drafting rules, you need a clear picture of how social media is currently used across your institution. Start with an audit: which accounts exist, who manages them, what devices they use, and what level of access is granted? This mapping exercise not only shows how sprawling your social presence may be but also reveals immediate risks.
Categorize those risks clearly:
Privacy: posting student names, images, or personal data without consent.
Reputational: off-brand messaging, unmoderated comments, or negative publicity.
Operational: lost passwords, shadow accounts, or inactive pages damaging credibility.
Compliance: failures in records retention, accessibility (AODA/WCAG), or anti-spam legislation.
Example: University of Waterloo (Renison University College) – The School of Social Work’s social media policy begins with a frank acknowledgment of the rapidly changing social media landscape and the challenges it poses (e.g. blurred boundaries between students and professionals). It emphasizes the need for guidelines to protect everyone involved from “potential negative consequences,” directly addressing the risks and needs that prompted the policy. This reality-grounded preamble shows the policy was built in response to actual issues observed in practice.
Go further by interviewing principals, faculty, coaches, and IT/security staff. These conversations often uncover grey areas, like student leaders running unofficial team accounts or staff using messaging apps for school business.
For inspiration, review policies like the Toronto District School Board’s Procedure PR735, which provides clear guidance on professional use and compliance (TDSB PR735 PDF).
Finally, create a simple risk register (spreadsheet) listing each risk, its likelihood, potential impact, current controls, and planned mitigations. Revisit this quarterly to keep your policy grounded in reality, not theory.
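The risk register described above can be sketched in a few lines of code. This is an illustrative example only: the column names follow the fields listed in this step (risk, likelihood, impact, current controls, planned mitigations), and the sample rows are hypothetical placeholders, not data from any real institution.

```python
import csv
import io

# Columns follow the fields named in this step of the guide; adjust to taste.
FIELDS = ["risk", "category", "likelihood", "impact",
          "current_controls", "planned_mitigations"]

# Illustrative placeholder entries, not real institutional data.
sample_risks = [
    {
        "risk": "Student photo posted without consent",
        "category": "Privacy",
        "likelihood": "Medium",
        "impact": "High",
        "current_controls": "Annual consent forms",
        "planned_mitigations": "Pre-post consent check in workflow",
    },
    {
        "risk": "Orphaned departmental account after staff departure",
        "category": "Operational",
        "likelihood": "High",
        "impact": "Medium",
        "current_controls": "None",
        "planned_mitigations": "Central registry with backup owner",
    },
]

def build_register(rows):
    """Serialize the risk register as CSV text, ready to paste into a spreadsheet."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(build_register(sample_risks))
```

Whether you keep the register in a shared spreadsheet or a small script like this, the point is the same: each risk gets an owner, a likelihood, an impact, and a planned mitigation that can be reviewed quarterly.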
Step 3: Core Legal and Policy Foundations (Canada-Specific)
A school’s social media policy must be anchored in the laws and standards that govern privacy, access to information, and accessibility. In Canada, the framework varies depending on the type of institution.
For universities, colleges, and many independent schools in the private sector, PIPEDA applies. Its consent principles require that personal information be collected and shared only with meaningful consent that is specific, informed, and easy to withdraw (Office of the Privacy Commissioner of Canada).
Public institutions must look to provincial laws. In Ontario, MFIPPA governs how student information is collected, used, and disclosed (IPC Guide for Schools). In British Columbia, FOIPPA applies to boards, colleges, and universities, supported by practical guidance like the province’s social media tip sheet (BC FOIPPA Social Media Guide). In Alberta, FOIP covers public school authorities, with resources from the OIPC and universities.
Accessibility is equally critical. In Ontario, the AODA requires that all digital communications be accessible, aligned with WCAG 2.0 Level A/AA standards (Ontario Accessibility Guidance). Federally, the Treasury Board recommends adoption of WCAG 2.1 AA and EN 301 549 (Government of Canada Digital Accessibility Toolkit).
Anchoring your policy in these laws ensures your institution not only reduces risk but also demonstrates accountability and inclusivity from the outset.
Example: Nova Scotia Community College (NSCC): NSCC’s Social Media Policy explicitly lists the Canadian laws and regulations that underpin acceptable social media use. It requires adherence to legislation such as Canada’s Anti-Spam Law (CASL), privacy laws like FOIPOP (provincial Freedom of Information and Protection of Privacy) and PIPEDA, the Human Rights Act, the Intimate Images and Cyber-protection Act, the Copyright Act, etc., as well as relevant college policies. By doing so, NSCC ensures its policy is grounded in national and provincial legal frameworks, providing a clear legal context for users.
Step 4: Governance, Account Ownership, and Records Management
Strong governance is the backbone of any school’s social media policy. Start by maintaining a central registry of all official accounts, whether institutional, departmental, or program-specific. For each, assign three roles: an accountable owner, a backup owner, and a communications/marketing lead. This ensures continuity when staff change roles. Require two-factor authentication across platforms, prohibit credential sharing, and centralize credential storage where possible.
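A central account registry like the one described above can be as simple as a structured list that your communications team audits on a schedule. The sketch below is a minimal, hypothetical example: the field names mirror the three roles named in this guide (accountable owner, backup owner, communications lead), and the handles and names are invented placeholders.

```python
from dataclasses import dataclass

# Fields mirror the three roles this guide recommends per official account.
@dataclass
class OfficialAccount:
    platform: str
    handle: str
    accountable_owner: str
    backup_owner: str
    comms_lead: str
    two_factor_enabled: bool = False

# Invented placeholder entries for illustration.
registry = [
    OfficialAccount("Instagram", "@example_athletics", "A. Coach",
                    "B. Assistant", "C. CommsLead", two_factor_enabled=True),
    OfficialAccount("X", "@example_admissions", "D. Officer",
                    "E. Backup", "C. CommsLead", two_factor_enabled=False),
]

def audit_two_factor(accounts):
    """Flag accounts that still need two-factor authentication enabled."""
    return [a.handle for a in accounts if not a.two_factor_enabled]

print(audit_two_factor(registry))  # lists handles missing 2FA
```

Even if the registry ultimately lives in a spreadsheet, enforcing the same shape (every account has an owner, a backup, and a 2FA flag) makes gaps easy to audit when staff change roles.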
Visual consistency matters, too. Borrow from UBC Brand’s social media guidelines on avatars, logos, and naming conventions to maintain a unified institutional identity (UBC Brand Guidelines).
Before any new account launches, establish an approval workflow. Require an application form documenting the account’s purpose, audience, staffing plan, and moderation strategy. This prevents “shadow accounts” and ensures new initiatives align with institutional priorities.
Finally, don’t overlook records management. Communications conducted through official accounts may constitute institutional records under provincial law. Align your policy with your school’s records retention framework, clarifying who is responsible for archiving social content.
Example: McGill’s guidelines require each institutional account to have at least two staff administrators plus a “central communications” administrator, and that accounts be tied to a departmental email (not an individual’s email) for password recovery. These practices ensure accounts are not “personal fiefdoms”: they belong to the institution, and records (including login info and content archives) are managed responsibly.
For inspiration, look at NYC Public Schools’ staff social media guidance, which requires registration of official accounts and outlines monitoring expectations (NYCPS Guidelines). While U.S.-based, the governance structures translate well to Canadian contexts.
Step 5: Privacy, Consent, and Student–Staff Boundaries
Protecting personal information is one of the most important functions of a school’s social media policy. Define clearly what counts as personal data: names, images, video, voice recordings, and any identifiable details. As the Office of the Privacy Commissioner of Canada advises, consent should always be obtained before posting content involving others online (OPC Guidance).
In Ontario, boards must ensure alignment with MFIPPA. In British Columbia, Abbotsford School District’s AP 324 media consent policy demonstrates best practices, including clear parental consent forms and proper recordkeeping (Abbotsford AP 324 PDF). Such models can guide how to design workflows that balance opportunity with privacy protection.
Equally critical are staff–student boundaries. Your policy should mandate the use of approved channels only, no personal phone numbers, no personal accounts, and no “friend” connections with students online. Communication must remain professional and transparent. NYC Public Schools provide a helpful benchmark, with explicit staff guidance and even age-specific student social media guidelines (NYCPS Staff Guidelines).
Example: Toronto Catholic District School Board (TCDSB): TCDSB’s social media guidelines draw very clear lines to protect privacy and maintain professional boundaries. Staff are forbidden from “friending” or privately messaging students on personal social media – all communication with students must occur through official, school-sanctioned accounts and only for educational purposes. The policy also enforces strict consent rules: no student’s name, photo, or any identifying information may be posted on social media without written parental consent, and the use of student images on official accounts must follow the board’s annual consent process in compliance with Ontario privacy law (MFIPPA).
✅ Do use only approved institutional channels for all communication.
✅ Do secure and store consent forms before posting student content.
✅ Do respect privacy by default. When in doubt, leave it out.
❌ Don’t use personal accounts, texts, or private messaging apps with students.
❌ Don’t post identifiable student content without explicit, recorded consent.
❌ Don’t blur professional boundaries (e.g., friending or following students on personal profiles).
By embedding privacy safeguards and clear boundary rules, schools protect students, staff, and their reputation while still enabling authentic digital engagement.
Step 6: Content Rules, Moderation, Accessibility, and Contests
A strong social media policy must tell people what to post, how to post it, and how to manage responses. Start with standards for tone, accuracy, and brand alignment. Require respectful, inclusive language and clear disclosures (e.g., partnerships, sponsorships).
Next, define moderation. Borrow from BC’s corporate moderation policy (BC Gov Guidelines): state what comments are removed (hate speech, spam, off-topic promotions), how warnings are issued, and when accounts are blocked. Make moderation workflows transparent to staff and users.
Example: Queen’s University underscores the importance of moderation rights: they reserve the right to delete disruptive or defamatory posts, and to remove or block users who repeatedly violate guidelines. Like other schools, they want to allow dialogue but will intervene if someone is, for instance, spamming the page or attacking others. The guidelines mention that collaborators (i.e., those who contribute to Queen’s social media) must “obtain explicit permission to publish or report on conversations intended to be private or internal”. In other words, don’t take a private email or a closed meeting discussion and post it publicly without consent – doing so could breach confidentiality. Similarly, no confidential or proprietary info about the university or its partners should be shared on social media.
Accessibility is non-negotiable. Every post should follow WCAG 2.0 Level A/AA and AODA requirements: alt text for images, captions for videos, no text-only graphics, and accessible hashtags (#CapitalizeEachWord). See Ontario’s accessibility guide and Canada’s Digital Accessibility Toolkit.
Contests or giveaways add another layer. Schools must comply with Canada’s Anti-Spam Legislation (CASL) when running promotions involving commercial electronic messages or online entries. The CRTC’s CASL guide and FAQs explain consent and identification requirements. For drafting contest rules, see legal overviews by BLG (2025) and Gowling WLG (2023).
Checklist for Staff:
✅ Post accurate, respectful, branded content
✅ Add alt text, captions, and accessible formatting
✅ Moderate comments against clear rules
✅ Secure consent before promotions/contests
❌ Don’t post text-in-images without alternatives
❌ Don’t run contests without legal review
Step 7: Training, Launch, Metrics, and Continuous Improvement
Even the strongest policy fails without training. Translate your guidelines into practice by building role-specific training modules for account owners, moderators, coaches, and student ambassadors. Incorporate Canadian digital literacy resources like MediaSmarts’ Digital Literacy Framework (overview; full PDF) to reinforce safe, ethical, and effective online engagement. Support staff with PD sessions, publish an internal FAQ, and run scenario-based exercises, such as managing a doxxing attempt or handling a viral misinformation post.
When launching, stagger the rollout: pilot in one department, gather feedback, and expand with adjustments. Communicate the policy widely so every stakeholder understands their role. Schedule quarterly refreshers to ensure compliance as platforms, tools, and threats evolve.
Example: University of British Columbia (UBC): UBC provides a detailed Social Media Playbook and Project Planning Tips to guide training and content planning for account managers. They recommend auditing capacity before launch, building content calendars, and using analytics for continuous improvement. UBC also sets platform-specific tips (e.g., mobile-first design, proper hashtag use) to elevate training beyond policy to practice.
Success requires measurement. Track metrics that matter: audience reach, engagement quality, average response time, accessibility compliance (captioning/alt-text rates), harmful content removal time, and incident frequency. Pair this with annual policy reviews against your risk register and evolving legal obligations. Document revisions and circulate them across the institution so no one is left behind.
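Two of the metrics named above, average response time and captioning rate, are easy to compute once you log the underlying events. The sketch below is a hypothetical illustration: the timestamps and post records are invented placeholders, and the field names are assumptions, not a prescribed logging format.

```python
from datetime import datetime

# Hypothetical log of (message received, first staff response) pairs.
interactions = [
    (datetime(2025, 9, 1, 9, 0), datetime(2025, 9, 1, 9, 45)),
    (datetime(2025, 9, 1, 13, 0), datetime(2025, 9, 1, 14, 30)),
]

# Hypothetical post records used to measure accessibility compliance.
posts = [
    {"has_video": True, "captioned": True},
    {"has_video": True, "captioned": False},
    {"has_video": False, "captioned": False},
]

def avg_response_minutes(pairs):
    """Average minutes between an inbound message and the first reply."""
    total = sum((reply - received).total_seconds() / 60
                for received, reply in pairs)
    return total / len(pairs)

def captioning_rate(items):
    """Share of video posts that include captions (an accessibility metric)."""
    videos = [p for p in items if p["has_video"]]
    return sum(p["captioned"] for p in videos) / len(videos)

print(avg_response_minutes(interactions))  # 67.5 minutes
print(captioning_rate(posts))              # 0.5
```

Tracked quarter over quarter, numbers like these turn the annual policy review from a paperwork exercise into a check against concrete targets.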
Checklist for Staff:
Complete mandatory training before account access
Use MediaSmarts or similar frameworks for student modules
Run tabletop exercises annually
Measure engagement, accessibility, and incident response
Review/update policy yearly
How to Use This Checklist
Policies can sometimes feel abstract, but implementation lives in the details. To make your school or institution’s social media policy actionable, translate the principles into operational steps your teams can follow every day.
The following checklist is designed as a drop-in appendix: administrators can copy it directly into their policy, while communications teams and account owners can use it as a quick reference. It consolidates the essentials, governance, privacy, accessibility, moderation, and security into a single, practical tool. Review it regularly, update it as laws and platforms evolve, and use it as both a compliance safeguard and a training guide.
Operational Checklist (Copy-Paste into Your Policy)
Action | Reference / Example
Maintain a central registry of all official accounts, owners, and backups; enforce two-factor authentication on every account. |
Creating a modern, compliant, and effective school social media policy isn’t just about managing risk. It’s also about empowering your institution to communicate with confidence. The right framework balances opportunity and responsibility, ensuring your teams can build authentic connections with students and families while safeguarding privacy, accessibility, and professionalism.
At Higher Education Marketing (HEM), we help schools, colleges, and universities do exactly that. From developing policies rooted in Canadian legal standards to training staff and student ambassadors on best practices, our team specializes in building digital strategies that drive engagement and enrollment. Whether you need support crafting your first policy, auditing existing processes, or integrating governance into a broader digital marketing strategy, HEM provides the expertise to make it happen.
In a digital-first world, trust and clarity are everything. By partnering with HEM, your institution can move forward with a social media policy that not only protects your community but also amplifies your brand in the right way.
Struggling with enrollment?
Our expert digital marketing services can help you attract and enroll more students!
Frequently Asked Questions
Question: What is an example of a social media policy? Answer: In higher education, Mohawk College’s Social Media Policy ties online activity directly to Canadian privacy laws, accessibility requirements, and internal codes of conduct, while also setting expectations for official accounts. For K–12, Greater Victoria School District Policy 1305 offers a concise framework rooted in district values and professionalism.
Question: Are teachers allowed to post their students on social media? Answer: Yes, but only with appropriate consent and in full compliance with privacy legislation. In the private sector, PIPEDA requires meaningful consent. Ontario’s public boards must follow MFIPPA, with guidance from the IPC’s education resources.
Question: Do social media contests require special rules? Answer: Yes. Schools must comply with Canada’s Anti-Spam Legislation (CASL) when running promotions involving commercial electronic messages or online entries. The CRTC’s CASL guide and FAQs explain consent and identification requirements. For drafting contest rules, see legal overviews by BLG (2025) and Gowling WLG (2023).
Student journalists have their fingerprints on more than 282 public radio or television stations across the country, providing behind-the-scenes support, working as on-screen talent or reporting in their local communities for broadcast content. But over $1 billion in federal budget cuts could reduce their opportunities for work-based learning, mentorship and paid internships.
About 13 percent of the 319 NPR or PBS affiliates analyzed in a report from the Center for Community News at the University of Vermont operate similarly to teaching hospitals in that a core goal of the organization is to train college students. Nearly 60 percent of the stations “provide intensive, regular and ongoing opportunities for college students” to intern or engage with the station.
Scott Finn, news adviser and instructor at the Center for Community News and author of the report, worries that the cuts to public media and higher education more broadly could hinder experiential learning for college students, prompting a need for additional investment or new forms of partnerships between the two groups.
In July, Congress rescinded $1.1 billion in federal funding for the Corporation for Public Broadcasting, which funds public media stations including NPR and PBS. The cuts threaten the financial stability of many stations, some of which are directly affiliated with colleges and universities.
Working at a public media station provides a variety of benefits for students, Finn said. In his courses, Finn partners with community outlets that will publish students’ stories, depending on the quality and content, which he says motivates students to submit better work.
“Being published, being broadcast is important. The whole focus of the exercise changes,” Finn said. “It’s not just trying to please me as the instructor or a tick box for a grade. They have real-world consequences. Their story will have an impact. It will move people, it will change policy, and that knowledge then inspires them to work harder.”
Most students want internship opportunities; a recent study by Strada found students rate paid internships as the most valuable experience for improving their standing as a candidate for future jobs. But nationally, there’s a shortage of available, high-quality internships compared to the number of students interested in participating, according to a 2024 report from the Business–Higher Education Forum.
A Handshake survey from earlier this year found 12 percent of students in the survey didn’t have an internship before finishing their degree, largely because they lacked the time or weren’t selected for one.
For interns or students working directly in the studio, partnering alongside career journalists also gives them access to a professional network and a career field they may not otherwise engage in.
But student journalists aren’t the only ones who lose out when internship programs are cut.
Emily Reddy serves as news director at WPSU, a PBS/NPR member station in central Pennsylvania associated with Penn State University. Reddy hosts a handful of student reporting interns throughout the calendar year, training them to write, record and broadcast stories relevant to the community.
“[Interns] bring an energy to the newsroom,” Reddy shared. “They’re enthusiastic. They are excited to go out to some board meeting that no one else wants to go to. They bring us stories that we wouldn’t know about otherwise.”
WPSU uses a variety of funding sources to pay student interns, including endowed scholarships at the university and donated funds. But like many other stations, WPSU is facing its own cuts. Earlier this year, Penn State reduced funding to the station by $800,000, or around 9 percent of the station’s total budget. That resulted in a cut of $400,000 from CPB.
In response, WPSU shrank its full-time head count, laying off five staff members and cutting hours for three. Roles vacated by retirements were left unfilled. In October, the station will lose around $1.3 million as a result of the federal cuts, though Reddy doesn’t know what the full impact will be on staffing.
WPSU had planned to increase its internship offerings, and Reddy is still hopeful that will happen. However, the laid-off personnel were among those responsible for managing learners.
“The big thing that I’m concerned about working with students is that you can’t just have the students; somebody has to train them, somebody has to edit them, somebody has to voice coach them and clean up their productions,” Reddy said.
About 12 percent of the stations in the Center for Community News’s report don’t sponsor interns, and those stations pointed to budget cuts as a key reason why. For stations experiencing financial pressures, Finn hopes newsrooms find creative ways to keep students involved in creating stories, including classroom partnerships or faculty editors who trim and refine stories. Universities are uniquely positioned to assist in this work, Finn said, because they have more resources than public stations and have a strong motivation to place students in successful internship programs.
“This is a really important time for universities to double down on their relationship with public media stations and not walk away from it,” Finn said. “A lot of [stations] are these underutilized resources, in terms of student engagement and student learning.”
Finn also says alumni and other supporters of student learning and public media can help to fill in gaps in funding, whether that’s supporting a paid full-time faculty role to serve as a liaison between students and stations or to endow internship dollars.
“If public media stations are important to student success, then university advancement has to embrace the public media station as a part of its mission and help raise money for it,” Finn said.
What’s it like to be an artist and scientist? Meg Mindlin studies octopuses and shares videos on Instagram Reels and TikTok. She’s also a talented artist who helps people communicate science in an engaging way. I felt lucky to attend her thesis defense live on YouTube.
In this conversation, we talk about her research, dealing with the political spectrum when speaking up on social media, and sharing her art online.
Bio
Meg Mindlin (@invertebabe) is a molecular biologist and science communicator. She combines her background in art with an ability to communicate complex science in an engaging manner. She received her master’s in biology studying octopuses and how ocean acidification affects a molecular process known as RNA editing.
When Sarah Layden shared her satire piece, ‘Unfriend Me Now’ on her LinkedIn profile, I reached out right away about her appearing on The Social Academic interview series. She wrote ‘Unfriend Me Now’ after reading research from Floyd, Matheny, Dinsmore, Custer, and Woo, “If You Disagree, Unfriend Me Now”: Exploring the Phenomenon of Invited Unfriending, published in American Journal of Applied Psychology.
Bio
Sarah Layden is the author of Imagine Your Life Like This, stories; Trip Through Your Wires, a novel; and The Story I Tell Myself About Myself, winner of the Sonder Press Chapbook Competition.
Sarah Layden
Her short fiction appears in Boston Review, Blackbird, McSweeney’s Internet Tendency, Best Microfiction 2020, and elsewhere. Her nonfiction writing appears in The Washington Post, Poets & Writers, Salon, The Millions, and River Teeth, and she is co-author with Bryan Furuness of The Invisible Art of Literary Editing.
She is an Associate Professor of English at Indiana University Indianapolis.
Whenever a bill aimed at policing online speech is accused of censorship, its supporters often reframe the conversation around subjects like child safety or consumer protection. Such framing helps obscure government attempts to shape or limit lawful speech, yet no matter how artfully labeled such measures happen to be, they inevitably run headlong into the First Amendment.
Consider the headline-grabbing Kids Online Safety Act (KOSA). Re-introduced this year by Sens. Marsha Blackburn (R-Tennessee) and Richard Blumenthal (D-Connecticut) as a measure to protect minors, KOSA has been repeatedly characterized by its sponsors as merely providing tools, safeguards, and transparency. But in practice, it would empower the federal government to put enormous pressure on platforms to censor constitutionally protected content. This risk of government censorship led KOSA to stall in the House last year after passing the Senate.
Child safety arguments have increasingly surfaced in states pursuing platform regulation, but closer inspection reveals that many such laws control how speech flows online, including for adults. Take Mississippi’s 2024 social media law (HB 1126), which was described as a child-safety measure but compelled platforms to verify every user’s age. Beneath that rhetoric, however, is the fact that age verification affects everyone, not just children. By forcing every user — adult or minor alike — to show personal identification or risk losing access, the law turned a child-safety gate into a universal speech checkpoint. That’s because identity checks function like a license: if you don’t clear the government’s screening, you can’t speak or listen.
A judge blocked HB 1126 last month, rejecting the attorney general’s argument that it only regulated actions, not speech, and finding that age verification gravely burdens how people communicate online. In other words, despite the bill’s intentions or rationales, the First Amendment was very much at stake.
Utah’s 2023 Social Media Regulation Act demanded similar age checks that acted as a broad mandate that chilled lawful speech. FIRE sued, the legislature repealed the statute, and its 2024 replacement — the Minor Protection in Social Media Act — met the same fate when a federal judge blocked it. Finding there was likely “no constitutionally permissible application,” the judge underscored the clear conflict between such regulations and the First Amendment.
Speech regulations often show up with different rationales, not just child safety. In Texas, HB 20 was marketed in 2021 as a way to stop “censorship” by large social media companies. By trying to paint the largest platforms as public utilities and treating content moderation decisions as “service features,” the legislature flipped the script on free expression by recasting a private actor’s editorial judgment as “conduct” the state could police. When the U.S. Court of Appeals for the Fifth Circuit upheld the law, in a decision that was later excoriated by the Supreme Court, the court repeated this inversion of the First Amendment: “The Platforms are not newspapers. Their censorship is not speech.”
Florida tried a similar strategy with a consumer-protection gloss. SB 7072 amended the state’s Deceptive and Unfair Trade Practices Act to include certain content moderation decisions, such as political de-platforming or shadow banning, exposing platforms to enforcement and penalties for their speech. Unlike the Fifth Circuit, the Eleventh Circuit blocked this law, calling platform curation “unquestionably” expressive and, therefore, protected by the First Amendment.
In July 2024, the Supreme Court took up the question when considering challenges to these two state laws in Moody v. NetChoice. Cutting through the branding, the Court rejected the idea that these laws merely regulated conduct or trade practices. Instead, it said content moderation decisions do have First Amendment protection and that the laws in Texas and Florida did, in fact, regulate speech.
The Court clarified in no uncertain terms that “a State may not interfere with private actors’ speech to advance its own vision of ideological balance.” And it added that “[o]n the spectrum of dangers to free expression, there are few greater than allowing the government to change the speech of private actors in order to achieve its own conception of speech nirvana.”
California tried the dual framing of both child safety and consumer protection. AB 2273, the California Age Appropriate Design Code Act, was described as a child-safety bill that just regulated how apps and websites are built and structured, not their content. The bill classified digital product design features, such as autoplaying videos or default public settings, as a “material detriment” to minors as well as an unfair or deceptive act under state consumer-protection statutes. But this too failed and is now blocked because, the court noted, “the State’s intentions in enacting the CAADCA cannot insulate the Act from the requirements of the First Amendment.”
Multiple nationwide lawsuits now claim social media feeds are defective products, using product-liability law to attack the design of platforms themselves. But calling speech a “product,” or forcing it into a product-liability claim, recharacterizes editorial decisions about lawful content as a product flaw, an attempt to shift the legal analysis from speech protections to consumer protection. State attorneys general, however, cannot erase the First Amendment protections that still apply.
A sound policy approach to online speech looks not at branding, but impact. Even when packaged in terms of child safety, consumer protection, or platform accountability, it is essential to ask whether the rule forces platforms to host, suppress, or reshape lawful content. Regardless of the policy goal or rhetorical framing, if a requirement ultimately pressures platforms to host or suppress lawful speech, expect judges to treat it as a speech regulation.
Unfortunately, re-branding speech regulations can obfuscate their censorial ends and make them politically attractive. That’s what’s happening with KOSA: its obvious appeal of protecting children, combined with the less obvious censorship threat of targeting “design features,” has made it popular in the Senate.
Giving the government power to censor online speech puts everyone’s liberty at risk. Just as Americans enjoy the right to read, watch, and talk about whatever we want offline, those protections extend to our speech online as well. Protecting free expression now keeps the marketplace of ideas open and guards against sacrificing that right for everyone.
Common Sense Media has released its first AI Toolkit for School Districts, which gives districts of all sizes a structured, action-oriented guide for implementing AI safely, responsibly, and effectively.
Common Sense Media research shows that 7 in 10 teens have used AI. As kids and teens increasingly use the technology for schoolwork, teachers and school district leaders have made it clear that they need practical, easy-to-use tools that support thoughtful AI planning, decision-making, and implementation.
Common Sense Media developed the AI Toolkit, which is available to educators free of charge, in direct response to district needs.
“As more and more kids use AI for everything from math homework to essays, they’re often doing so without clear expectations, safeguards, or support from educators,” said Yvette Renteria, Chief Program Officer of Common Sense Media.
“Our research shows that schools are struggling to keep up with the rise of AI: 6 in 10 kids say their schools either lack clear AI rules or are unsure what those rules are. But schools shouldn’t have to navigate the AI paradigm shift on their own. Our AI Toolkit for School Districts will make sure every district has the guidance it needs to implement AI in a way that works best for its schools.”
The toolkit emphasizes practical tools, including templates, implementation guides, and customizable resources to support districts at various stages of AI exploration and adoption. These resources are designed to be flexible to ensure that each district can develop AI strategies that align with their unique missions, visions, and priorities.
In addition, the toolkit stresses the importance of a community-driven approach, recognizing that AI exploration and decision-making require input from all of the stakeholders in a school community.
By encouraging districts to give teachers, students, parents, and more a seat at the table, Common Sense Media’s new resources ensure that schools’ AI plans meet the needs of families and educators alike.
eSchool Media staff cover education technology in all its aspects–from legislation and litigation, to best practices, to lessons learned and new products. First published in March of 1998 as a monthly print and digital newspaper, eSchool Media provides the news and information necessary to help K-20 decision-makers successfully use technology and innovation to transform schools and colleges and achieve their educational goals.