Each section below summarizes one student’s manifesto: thesis, audience, call-to-action, course concepts used, research themes, visual strategy, and key evidence.

| Thesis | Social media filters, AI-generated content, and curated personas harm users’ sense of self and mental well-being; platforms should encourage healthier forms of identity and expression. |
|---|---|
| Audience | Social media users, especially young adults. |
| Call-to-Action | Users should embrace authenticity, and platforms should design features that reduce harmful comparison culture. |
| Course Concepts Used | Mediation; Affordances & Constraints; Ideology; Identity/Performance; Invisible Technologies. |
| Research Themes | Mental health impacts; identity distortion; authenticity loss; psychological effects of mediated self-presentation. |
| Visual Strategy | Clean, minimalist slides; contrasting curated versus authentic imagery; examples of social media posts. |
| Key Evidence Used | NAMI mental health data; UC Davis Health research; Avci on self-concept clarity; Brüns on authenticity loss. |

| Thesis | AI tools weaken student interaction, critical thinking, and authentic academic engagement; institutions should reconsider how AI is implemented in learning contexts. |
|---|---|
| Audience | College students, educators, administrators. |
| Call-to-Action | Promote meaningful human connection and rethink when and how AI is used in classrooms. |
| Course Concepts Used | Mediation; Algorithmic Influence; Technological Ideology; Affordances & Constraints. |
| Research Themes | Social-emotional development; teacher–student relationships; AI-driven learning habits; early exposure to AI tools. |
| Visual Strategy | Social-media-style informational slides; simple icons; text-forward educational visuals. |
| Key Evidence Used | CDT emotional-attachment studies; Education Week reporting; Frumin’s cautionary letter and related commentary. |

| Thesis | Social media–driven overconsumption harms individuals and the environment; users and platforms must resist algorithmic pressure to constantly buy. |
|---|---|
| Audience | Gen Z consumers; social media users. |
| Call-to-Action | Prioritize sustainability and resist consumerist pressures shaped by platforms. |
| Course Concepts Used | Ideology (consumerism); Mediation; Algorithmic Influence; Affordances & Constraints. |
| Research Themes | Fast fashion; influencer marketing; environmental impact; consumer psychology. |
| Visual Strategy | Before/after comparisons; environmental data visuals; muted palette with environmental imagery. |
| Key Evidence Used | Becker on fast fashion and influencers; Nonprofit Quarterly on platform incentives; Net Impact environmental data. |

| Thesis | AI-driven and opaque digital advertising systems mislead users and erode trust; transparency and regulation are needed to protect consumers. |
|---|---|
| Audience | General digital users; regulators; brands. |
| Call-to-Action | Advocate for clearer disclosures, ethical advertising practices, and stronger oversight. |
| Course Concepts Used | Invisible Technologies; Mediation; Ideology; Datafication; Algorithmic Manipulation. |
| Research Themes | Deepfakes; deceptive interfaces; influencer sponsorship issues; regulatory challenges. |
| Visual Strategy | Polished UI mockups; contrasting examples of deceptive ads; accessible visual hierarchy. |
| Key Evidence Used | Deepfake advertising research; legal scholarship on manipulation; analyses of online ad infrastructure; influencer transparency reporting. |

| Thesis | Influencer culture commodifies “authenticity,” shaping online identity in harmful and manipulative ways. |
|---|---|
| Audience | Social media users; creators; young adults. |
| Call-to-Action | Encourage critical awareness of influencer strategies and support more transparent, ethical content. |
| Course Concepts Used | Identity/Performance; Mediation; Ideology (self-commodification); Algorithmic Influence. |
| Research Themes | Influencer culture; emotional labor; branding psychology; marketing persuasion strategies. |
| Visual Strategy | Feed-style layouts; contrasting “authentic” vs. staged visuals; clean, structured aesthetic. |
| Key Evidence Used | Harvard Business Review on influencer transparency; Arnesson & Reinikainen on influencer professionalism; Grau on authenticity and branding. |

| Thesis | Librarians are essential ethical guides in an AI-saturated information landscape and cannot be replaced by automated systems. |
|---|---|
| Audience | Students; faculty; librarians; educational leaders. |
| Call-to-Action | Advocate for funding, training, and recognition for librarians as AI-literacy leaders. |
| Course Concepts Used | Mediation (information); Invisible Technologies; Ideology (automation); Ethics & Equity; Critical Literacy. |
| Research Themes | AI literacy; information trust; ethical stewardship; ALA Core Values; librarians’ evolving roles. |
| Visual Strategy | High-contrast instructional slides; clear sectional dividers; strong accessibility practices. |
| Key Evidence Used | Lo on AI literacy; Lapierre on human–AI context; Martin & Armstrong on AI challenges; ALA Core Values documents. |

| Thesis | AI writing tools reduce cognitive activity, memory formation, and engagement; students should build foundational skills before relying on AI. |
|---|---|
| Audience | Students; teachers; parents. |
| Call-to-Action | Encourage foundational skill-building and reduce reliance on AI writing tools in educational settings. |
| Course Concepts Used | Mediation (cognition); Affordances & Constraints; Human–AI Interaction; Learning & Memory. |
| Research Themes | Cognitive activity; neural engagement; confidence and critical thinking; academic development with AI. |
| Visual Strategy | Traditional essay/PDF layout with emphasis on clear organization and explanation of research findings. |
| Key Evidence Used | Kosmyna et al.’s brain-activity study; Lee et al. on critical thinking and confidence; Quirk’s synthesis of cognitive research. |

| Thesis | Spotify should adopt user-centered transparency around how listener data is collected, used, and monetized. |
|---|---|
| Audience | Spotify users; tech companies; UX and product designers. |
| Call-to-Action | Implement in-app transparency dashboards and user-friendly explanations of data use and controls. |
| Course Concepts Used | Invisible Technologies; Ideology (surveillance capitalism); Mediation; Datafication; UX Transparency. |
| Research Themes | Data extraction; opaque recommendation systems; privacy policy complexity; user autonomy. |
| Visual Strategy | Spotify-inspired dashboard mockups; clear information hierarchy; accessible interface design. |
| Key Evidence Used | *Spotify Teardown*; Shoshana Zuboff’s *The Age of Surveillance Capitalism*; Spotify Privacy Policy; Newlands et al. on recommendation systems. |

| Thesis | AI cannot replicate the emotional depth and human connection of true creative work; it should inspire ideas rather than replace artists. |
|---|---|
| Audience | Artists; creative workers; general audiences interested in AI and art. |
| Call-to-Action | Preserve human-centered creativity and use AI as a collaborative tool, not a replacement. |
| Course Concepts Used | Mediation (creative process); Ideology (efficiency/cost-cutting); Affordances & Constraints; Human Authenticity. |
| Research Themes | AI’s effect on art; psychological preference for human-made work; ethical concerns surrounding training data and artistic integrity. |
| Visual Strategy | Side-by-side comparisons of human vs. AI art; textured backgrounds; grayscale theme to emphasize the artwork itself. |
| Key Evidence Used | Bellaiche et al. on human preference for human-created art; Kelly LeBlanc on authenticity and integrity; Eric Reinhart’s critique of AI art. |