
Unboxing Creators’ Algorithmic Trust in Kids’ YouTube

Published on May 24, 2019

This paper critically examines YouTube creators who make toy unboxing videos for children and offers insights into the intermingled algorithmic, commercial, and personal incentives that undergird their creator culture in the USA. Top toy unboxing creators have noted how YouTube’s algorithms and platform affordances shape how they make their videos, and that this algorithmic influence has grown over time. It now outweighs creative choices, such as following genre conventions, engaging in sponsored or branded partnerships, iterating on individual production practices, and interacting with viewers and other creators, in determining which videos go viral and how the discourse around toy unboxing is constructed. This lack of creator agency over how the algorithm positions their content is further complicated by the burden placed on creators by families, the YouTube platform, and child advocates to act as mediators and regulators of children’s access to quality media and entertainment content, product information, and networked para-sociality. While popular press discourse holds that YouTube has a responsibility to consider the needs of its child viewers, that discourse also treats toy unboxing creators and their content as if they were part of the YouTube corporation itself (Rubin, 2018). This finger-pointing at creators has persisted even though they have little control over their status in the suggested video algorithm and even less over the content alongside which their videos are suggested.

Scholarly literature has established that these creators’ popularity and considerable commercial success (via YouTube AdSense revenue) have positioned them as more authentic (read: trustworthy) review systems, by and for children, than some of the branded commercials and entertainment properties featuring the same characters, toys, and products (Craig and Cunningham, 2017; Marsh, 2016). However, content creators are also in need of algorithmic trust. Creators need to trust that the algorithm will prompt families to watch their content together so that it earns parent and kid approval alike. Many creators have no choice but to trust that the algorithm will provide platform stability so that their vertical survives the loss of advertising revenue (as evidenced by the AdPocalypse in Cunningham and Craig, 2019) or association with manipulated satires of children’s content (as seen with ElsaGate, discussed by Bridle, 2017). Similarly, they must trust that algorithmic privileging won’t demonetize their channels (blacklisting) or algorithmically deprivilege them (greylisting) despite market saturation in the kids and family vertical. These concerns, along with broader moral panics about kids’ YouTube content regarding commerciality, consumerism, and screen time, add to the complexity creators face in navigating child audiences and black-box algorithms.

Drawing from 24 interviews (McCracken, 1988) with top-ranking toy unboxers, textual analyses of toy unboxing videos (McKee, 2003), application walkthroughs of YouTube and YouTube Kids (Light, Burgess, and Duguay, 2018), and digital mapping of the YouTube suggested video algorithm using digital methods (Rieder, 2015), this research unpacks how creators have tried to create media ethically and manage their online presence for child audiences while concurrently optimizing their content for greater visibility in the algorithm. These insights suggest not only that creators struggle to impress the algorithm and suffer its ill effects, but that their ability to make content for children without considering the algorithm is wholly stunted. In effect, many creators feel that the quality of the content they could make for children is hindered by the algorithm, leaving children to suffer its effects secondhand.

With this in mind, this research situates toy unboxing within a larger circuit of culture (Du Gay et al., 2013; Buckingham, 2008) and actor-network dynamics (Latour, 2005). Doing so allows this paper to posit that content creators, viewers (children and adults alike), and the platform must re-negotiate trust with one another by examining the algorithms that bring them together. Reflecting on the interview data from the aforementioned creators, critical algorithmic literacy (in conjunction with media and platform literacy) is needed to improve content creation, viewing practices, and YouTube platform policies for children and their families moving forward. By bringing well-intended creators’ voices into discussions about algorithms, data, and platform policies, this research offers industry-led insights intended to provide children with great content and experiences and to deflect some of the criticism levelled at creators on YouTube. This research breaks down barriers between children’s media academics and industry professionals and offers new strategies for intervention and content production for the algorithmic digital childhoods of today.


Bridle, J. (2017). ‘Something is wrong on the Internet,’ Medium. 06 November 2017. Accessed from:

Buckingham, D. (2008). Children and Media: A Cultural Studies Approach. In Drotner, K., & Livingstone, S. (Eds.) Handbook of Children, Media, and Culture. London, UK: Sage.

Craig, D., and Cunningham, S. (2017). “Toy unboxing: living in a(n unregulated) material world,” Media International Australia, 163(1), pp. 77–86.

Cunningham, S., and Craig, D. (2019). Social Media Entertainment: The New Intersection of Hollywood and Silicon Valley, New York, NY: New York University Press.

Du Gay, P., Hall, S., Janes, L., Madsen, A.K., Mackay, H., and Negus, K. (2013). Doing Cultural Studies: The Story of the Sony Walkman, 2nd Edition, London, UK: The Open University Press.

Latour, B. (2005). Reassembling the Social: An Introduction to Actor-Network-Theory, Oxford, UK: Oxford University Press.

Light, B., Burgess, J., and Duguay, S. (2018). “The walkthrough method: An approach to the study of apps,” New Media & Society, 20(3), pp. 1–20.

Marsh, J. (2016). “‘Unboxing’ videos: co-construction of the child as cyberflaneur,” Discourse: Studies in the Cultural Politics of Education, 37(3), pp. 369–380.

McCracken, G. (1988). The Long Interview, London, UK: SAGE.

McKee, A. (2003). Textual Analysis: A Beginner’s Guide, London, UK: SAGE.

Rieder, B. (2015). “Introducing the YouTube Data Tools,” The Politics of Systems. Accessed on February 11, 2019.

Rubin, M. (2018). ‘The world of kids on YouTube is wild, weird, and almost entirely unregulated,’ Quartz News. Accessed from:


Jarrod Walczer is a media and communications doctoral candidate who fuses cultural studies with digital methods to explore the impact toy unboxing creator culture has had on YouTube and the children’s media industry in the USA. His walkthrough research on the YouTube Kids application has been published in The Spectator, with other works forthcoming from Routledge and New York University Press that explore toy unboxing’s circuit of culture and creator communities. He received his M.A. from The University of Southern California’s Annenberg School for Communication and Journalism, his M.Sc. from The London School of Economics and Political Science, and his B.S. from The Roy H. Park School of Communications at Ithaca College. He has also earned certificates in visual digital methods from The University of Amsterdam and archival digital humanities from The University of Pennsylvania. He is currently pursuing his Ph.D. in Media and Communications at the Queensland University of Technology (QUT)’s Digital Media Research Centre.
