
Safeguarding Child Viewers: Legal Strategies for Commercial Sharenting on Social Media in China

Published online by Cambridge University Press:  25 March 2025

Bing Shui
Affiliation:
Faculty of Law, University of Macau. E-mail: [email protected]
Yingying Jiang
Affiliation:
Faculty of Law, University of Macau. E-mail: [email protected]

Abstract

In the digital age, “commercial sharenting” refers to parents excessively sharing their children’s images and data on social media for profit. Initially motivated by parental pride, this practice is now driven by child-to-child marketing, where young influencers shape their peers’ consumption habits. While regulations protect child influencers’ privacy, a significant gap remains regarding the rights of child viewers. We argue that commercial sharenting threatens children’s right to health under Article 24(1) of the UNCRC, potentially leading to harmful consumer behaviors and identity confusion. In response, China has adopted a fragmented regulatory approach to platform liability. This article advocates for a comprehensive legal framework incorporating content filtering, moderation, and reviewal to regulate commercial sharenting and safeguard children’s rights and interests in China.

Copyright
© The Author(s), 2025. Published by International Association of Law Libraries

Introduction

The term “commercial sharenting” denotes the practice wherein parents excessively share photos and other information about their children on social media platforms with the primary aim of seeking financial profit.Footnote 1 The earnings may come in the form of immediate payment, the establishment of future business interests for compensation, or through other avenues of current or potential revenue generation.Footnote 2 Once a certain threshold of followers is reached, parents managing the account can monetize their social media presence and secure sponsorships from well-known brands, such as Walmart, Hasbro, and others.Footnote 3 The infringement of child influencers’ rights in commercial sharenting scenarios, such as violations of privacy and economic exploitation, has been extensively discussed in previous scholarship, prompting lawmakers in various jurisdictions to address this issue by safeguarding the right to privacy and preventing the digital exploitation of child influencers.Footnote 4 However, apart from work on digital advertising,Footnote 5 the existing literature rarely offers a systematic and comprehensive account of how sharing images or information about child influencers affects the other children behind the screens, namely the child viewers.

Children today are both creators and audiences of social media.Footnote 6 Studies indicate that children spend considerable time watching videos of their favorite influencers, during which they are also exposed to influencer marketing practices.Footnote 7 Excessive and unhealthy engagement with videos featuring child influencers can have significant adverse effects on child viewers’ physical and mental health, including increased risks of myopia, spinal issues, and psychological distress, which are explored in detail in the third section of this article. Previous literature also indicates that children are easily affected by the content posted by influencers,Footnote 8 especially peer influencers.Footnote 9 The fast-paced development of social media opens up fresh avenues for brands to interact with children, integrating embedded advertising formats that seamlessly weave brand messages into captivating media content. This method makes promotional aspects less obvious, thus increasing the difficulty of detection by the target audience.Footnote 10

This article endeavors to provide a systematic analysis of child viewers within the legal framework of commercial sharenting. Given that content shared by child influencers or their parents falls under the category of user-generated content, platform governance is considered a highly effective strategy for addressing associated issues.Footnote 11 We argue for the necessity of compelling platforms to assume additional content moderation responsibilities in the context of commercial sharenting. This can be achieved through the establishment of a content governance mechanism, which would include detailed rules for the proper disclosure of sponsored products and strict content classification. A three-step mechanism could be employed to safeguard both child influencers and child viewers in the realm of commercial sharenting. Firstly, any content in commercial sharenting that risks encroaching on children’s privacy should be prohibited on social media, as it constitutes unlawful behavior. Secondly, social media platforms should be obligated to moderate content to protect child viewers, particularly through advertising laws or any other specific regulations. Finally, the third mechanism could leverage the “Notice and Action” provisions of the European Union’s Digital Services Act (DSA), requiring social media platforms to promptly remove commercial sharenting content upon notification by public authorities or third parties. The first and second steps represent ex-ante regulations, while the third step serves as an ex-post mechanism to prevent potential harm to child viewers.

The Irrefutable Influence of Child Influencers on Child Viewers

Child-to-child marketing exerts a pervasive influence on the children it targets, with child influencers regularly promoting products like toys to their peers on platforms such as YouTube. This trend underscores that child-driven marketing has a significant influence on, and economic implications for, those children’s peers.Footnote 12 Researchers have found that children will copy their peers’ behavior, including their consumption patterns,Footnote 13 a phenomenon referred to as peer modeling or social learning.Footnote 14 Gaining peer acceptance and approval is important to children.Footnote 15 In particular, from the ages of 6 to 7, peers become important agents of consumer socialization, and children begin to model their peers’ behavior.Footnote 16 Even preschool children’s behavior and consumption may already be influenced by their friends.Footnote 17 This has been shown to be particularly true when the peer is the same age or slightly older.Footnote 18

The rise of social media introduced a new type of peer endorsement, namely social media influencers, who are now widely acknowledged as a new form of celebrity.Footnote 19 This new form of peer endorsement can reach a large number of child viewers due to the proliferation of social media, resulting in a wide-ranging impact.Footnote 20 Due to their relatable and approachable nature, child influencers offer viewers a window into their private lives through social media posts. This fosters a pronounced sense of shared experience between the creator and the audience, positioning child influencers as akin to their peers in the viewers’ eyes.Footnote 21 According to source attractiveness theory, the driving factors of source attractiveness consist of the source’s similarity, familiarity, and likability. Specifically, similarity refers to the resemblance between the source and the receiver; in this context, the child influencer and the child viewer share a common identity as children. Familiarity denotes the degree to which the receiver knows the source through repeated exposure; in this scenario, the content created by child influencers, which is often centered around their daily lives and filmed in their homes, fosters a sense of familiarity with their audiences. Likability is defined as the affection felt towards the source, which can result from the source’s physical appearance and behavior.Footnote 22 Similarity, familiarity, and likability contribute to the receivers’ identification with the source, thus increasing the likelihood of adopting the source’s beliefs, attitudes, and behaviors.Footnote 23 In this case, the tangible appeal that child influencers have for their young audiences amplifies the likelihood of impressionable children emulating behaviors or making consumer choices based on the marketed content.

Brands hope that the positive associations linked to celebrities will transfer to the brands they endorse.Footnote 24 The fast-paced development of social media opens up fresh avenues for brands to interact with children, integrating embedded advertising formats that seamlessly weave brand messages into captivating media content. Child viewers and their parents are often less resistant to endorsements from child influencers because peers are considered authentic information sources with no commercial interest,Footnote 25 which makes the promotional aspects of this method even less obvious and harder for the target audience to detect.Footnote 26

Detrimental Impacts on Child Viewers

The detrimental effect that social media can have on child viewers emanates from various sources. Considering the gravity of this issue, our attention is directed towards the deleterious impact engendered by child-to-child marketing facilitated by child influencers on social media.

Misguided consumer behavior with undisclosed sponsorship

Based on the reviewed literature and theories, children are easily influenced by their peer models and are likely to imitate their behavior, including the use of products they endorse.Footnote 27 As child influencers usually demonstrate the usefulness of the toys, clothes, or food they endorse, their content may accordingly prompt observational learning by their viewers.Footnote 28 Thus, child viewers are particularly vulnerable to exploitation through targeted advertising, especially when ads are designed to exploit their developmental vulnerabilities, such as the varying levels of advertising literacy across early childhood (under age 5), middle childhood (6–9 years), and late childhood (10–12 years).Footnote 29

In 2019, the watchdog group Truth in Advertising (based in Connecticut) filed a complaint with the United States (US) Federal Trade Commission (FTC) against “Ryan Toys Review,” YouTube child influencer Ryan Kaji’s top toy review channel,Footnote 30 accusing Kaji of deceiving children through “sponsored videos that often have the look and feel of organic content.” Approximately 90% of his videos have featured at least one paid product recommendation targeting preschoolers, a demographic too young to discern between commercial content and genuine reviews.Footnote 31 As Duivenvoorde and Goanta have stated, “embedding sponsored content without any disclosure is the biggest harm faced by consumers in this industry.”Footnote 32 The purposeful directing of advertising content towards children through videos created by child influencers necessitates specific regulatory oversight to ensure ethical and responsible marketing practices.

Right to health

The right to health is recognized as a fundamental right under the United Nations (UN) Convention on the Rights of the Child (UNCRC). Article 24(1) of the UNCRC specifically outlines the right of the child to enjoy the highest attainable standard of health. The content of children’s right to health in the digital age is specifically interpreted in General Comment No. 25 (2021) on children’s rights in relation to the digital environment, which calls on States parties to implement regulatory measures against materials and services that have the potential to adversely affect children’s mental or physical well-being. These provisions specifically address targeted or age-inappropriate advertising, marketing strategies, and other pertinent digital services, intending to shield children from exposure to the promotion of unhealthy engagement in social media.Footnote 33 In the visual world of commercial sharenting, child viewers’ excessive and unhealthy engagement with videos featuring child influencers can have significant adverse effects on their physical and mental health.

Previous research on the impact of celebrity endorsements on children has mainly focused on food marketing. Specifically, research has found that brands use celebrities to endorse energy-dense and nutrient-poor products,Footnote 34 increasing children’s intake of these less-healthy foods.Footnote 35 Researchers have examined the impact of social media influencers’ marketing on children’s food intake.Footnote 36 Results have shown that influencer marketing of unhealthy foods increases children’s immediate food intake, which is direct evidence of the detrimental impact on children’s physical health.

Researchers use health-related quality of life (HRQoL),Footnote 37 an important multidimensional concept for measuring risk precursors of disease, to indicate the health status of the next generation.Footnote 38 Previous empirical studies indicate that children with less screen time and moderate physical activity had the greatest HRQoL levels.Footnote 39 According to a 2021 study by the Center for Research on Chinese Youth, 70.8% of respondents often used Douyin (Chinese TikTok), Kuaishou, Huoshan, and other platforms to watch short videos.Footnote 40 Addiction to social media poses great challenges to the physical and mental health of young people.Footnote 41 Continuous viewing of screens, often for extended periods, leads to an increased risk of myopia and other vision-related issues among children.Footnote 42 Previous studies indicate a statistically significant correlation between screen time (high vs. low) and myopia.Footnote 43 Additionally, prolonged sedentary behavior, such as maintaining improper posture while engrossed in video content, can contribute to spinal problems and musculoskeletal pain in young viewers.Footnote 44 In addition to physical health concerns, the mental health implications of excessive screen time for children are equally concerning. Researchers have highlighted the adverse effects of internet addiction, characterized by an inability to regulate internet usage, which can lead to significant distress and functional impairment.Footnote 45 Studies have shown a notable correlation between excessive use of short video applications and addictive behavior,Footnote 46 as well as a strong association between excessive social media use and depressive symptoms.Footnote 47

These physical and mental problems of children have escalated to the point of becoming a significant public health concern. Per a report by the World Health Organization (WHO) titled “Public Health Implications of Excessive Use of the Internet, Computers, Smartphones, and Similar Electronic Devices,” adolescents and young adults have emerged as the primary users of the internet and contemporary technologies. Behavioral addictions, characterized by an irresistible urge to repeatedly engage in online activities such as social media, have become a prevalent phenomenon across various jurisdictions.Footnote 48 The negative impact on child viewers cannot be overlooked and warrants serious consideration due to the potential long-term consequences on their development and well-being.

Identification confusion

Previous studies indicate that children are highly inclined to identify with popular child influencers and adopt their attitudes, beliefs, and behaviors, including those related to brands and products.Footnote 49 Sources used in advertising can serve as role models that children emulate in forming their identities.Footnote 50 The likelihood of this process increases when consumers develop a parasocial relationship (PSR) with the source.Footnote 51 PSR, as introduced by Horton and Wohl,Footnote 52 refers to the relationships consumers develop with media figures, making them influential sources of information.Footnote 53 The need for companionship, which is a fundamental driver for relationship formation, begins to emerge during childhood.Footnote 54 As a result, children are particularly susceptible to developing PSRs with media figures.Footnote 55 The one-sided emotional connections that individuals form with media personalities, such as popular child influencers, can significantly influence children’s attitudes, beliefs, and behaviors, including their consumption choices.

The proliferation of inappropriate content within commercial sharenting, including the flaunting of luxury items, has the potential to inadvertently endorse materialistic values to child viewers. The promotion of consumer culture by child influencers to their child viewers fosters a mindset centered around consumerism and material possessions, which can have profound adverse effects on both influencers’ and viewers’ identity development and well-being.Footnote 56 This emphasis on material wealth can contribute to feelings of inadequacy, social comparison, and dissatisfaction with one’s own circumstances, potentially undermining young audiences’ identity clarity.Footnote 57 It is essential to recognize and explore the implications of such content on commercial sharenting platforms to promote healthier lifestyles and values among child viewers.

Basic Framework for Protecting the Child Viewer in China

In the digital era, the underlying issue regarding the effective regulation of commercial sharenting or other user-generated content falls on social media platforms’ governance abilities. According to the gatekeeping theory, formally identified by Kurt Lewin,Footnote 58 the gatekeeper plays a crucial role in determining what information should be allowed to pass to groups or individuals and what should be restricted. Indeed, through the gatekeeping process, the gatekeeper filters and removes unwanted, sensitive, and controversial information. This function serves to exert control over society or a group, guiding it along a path perceived as appropriate or beneficial.Footnote 59

Moreover, in recent years, platforms have increasingly employed pre-designed algorithms for the organization and recommendation of content.Footnote 60 This transformation has significantly altered how users engage with online content.Footnote 61 Take TikTok as an illustrative case: the platform predominantly relies on content interactions within personalized video feeds, with its performance depending heavily on the effectiveness of the recommendation algorithm.Footnote 62 It is noteworthy that the potential of recommendation algorithms is rooted in the abundant availability of user data, rendering TikTok capable of augmenting the efficiency of content distribution and enhancing the adaptability of the personalized video feed.Footnote 63 Because recommendation algorithms are the primary driving force behind the content delivered to users, particularly minors, their legal regulation holds significant importance within the content moderation process.
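To make the interaction between personalization and minor protection concrete, the following sketch shows one way a feed-ranking step could suppress behavioral personalization and age-inappropriate content for accounts flagged as belonging to minors. The data structures, field names, and the simple tag-overlap score are illustrative assumptions, not a description of any platform’s actual recommendation system.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    user_id: str
    is_minor: bool
    interest_labels: set = field(default_factory=set)  # labels inferred from behavior

@dataclass
class Video:
    video_id: str
    tags: set
    age_rating: str = "general"  # hypothetical rating, e.g., "general" or "adult"

def rank_feed(user: User, candidates: list) -> list:
    """Rank candidate videos, suppressing personalization for minors."""
    scored = []
    for video in candidates:
        if user.is_minor:
            # Minors receive only age-appropriate content, and their behavioral
            # interest labels are ignored so that personal data does not drive
            # what children are shown.
            if video.age_rating != "general":
                continue
            scored.append((0, video))
        else:
            # Adults receive a simple personalized score: the overlap between
            # the user's interest labels and the video's tags.
            scored.append((len(user.interest_labels & video.tags), video))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [video for _, video in scored]

feed = rank_feed(
    User("u1", is_minor=True, interest_labels={"toys", "unboxing"}),
    [Video("v1", {"toys"}), Video("v2", {"gambling"}, age_rating="adult")],
)
print([v.video_id for v in feed])  # ['v1']: the age-inappropriate video is dropped
```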

Legal Regulations Surrounding Platform Governance

To enhance content governance on platforms, China has implemented a series of laws, including the Cybersecurity Law of the People’s Republic of China and the Data Security Law of the People’s Republic of China, among others. Several related administrative regulations and departmental regulatory documents have also been issued in recent years. The Provisions on Ecological Governance of Network Information Content, deliberated and adopted at the executive meeting of the Cyberspace Administration of China (CAC), came into force on March 1, 2020.Footnote 64 The Opinions of the Cyberspace Administration of China on Further Pushing Websites and Platforms to Fulfill Their Primary Responsibility for Information Content Management, issued on September 15, 2021, pursue the same aim of prompting websites and platforms to take primary responsibility for managing information content.Footnote 65

To regulate algorithm-generated recommendations for internet information services, the departmental Provisions on the Administration of Algorithm-generated Recommendations for Internet Information Services were issued, which explicitly denote the user’s right to know the algorithm,Footnote 66 the user’s right of choice,Footnote 67 and the special protection of minors.Footnote 68 Further, on August 12, 2022, the CAC issued an initial set of announcements obligating internet service providers to submit information concerning the algorithms of domestic internet information services. This mandate aligns with the Provisions on the Administration of Algorithm-generated Recommendations for Internet Information Services,Footnote 69 which highlight the responsibility of internet service providers to ensure algorithm security and to construct an algorithm accountability mechanism.Footnote 70 These provisions encompass algorithm-powered recommendation services (ARS) that utilize technologies such as generation and synthesis, personalized push, sorting and selection, search filtering, scheduling decisions, and other algorithmic technologies to deliver information to users.Footnote 71 As a result, the majority of online platforms that Chinese internet users utilize fall within the regulatory scope that these provisions define.Footnote 72 Platform governance in China also covers other aspects of protecting children in the digital sphere, such as the anti-addiction system and the special protection of personal information contained in the Law of the People’s Republic of China on the Protection of Minors. In addition, China’s effort to provide special protection for the rights of minors in cyberspace can be seen in the newly issued Regulation on the Protection of Minors in Cyberspace, which came into force on January 1, 2024.Footnote 73

Broadly speaking, China has adopted a fragmented regulatory strategy to govern platform liability, and there is still no dedicated law that comprehensively addresses content moderation. The primary responsibilities of platforms are governed by departmental regulatory documents that have a relatively low legal status.Footnote 74 Additionally, the rigorous administrative supervision system, which relies on the approach of “further pushing platforms to fulfill responsibility,” may not provide strong incentives for platforms to proactively engage in content moderation. In reality, content governance primarily occurs through “special campaigns” rather than through normalized governance processes.Footnote 75 For example, the CAC implemented a nationwide campaign to purify the online environment, which was intended to “crack down on wrongdoing and malpractice in live-streaming and short videos” and to resolutely curb the manipulation of minors for profit through live broadcasts and short videos that turn them into child influencers.Footnote 76 Despite its immediate effect, this governance approach may face challenges in terms of long-term sustainability.

Specific Law

The previous analysis of children’s right to health reveals that long-term viewing of online videos or live broadcasts, as well as other forms of online social networking, poses inevitable health harms for child viewers. Some special laws have been enacted to address this issue in China. Specifically, Article 74 of the Law of the People’s Republic of China on the Protection of Minors stipulates that network product and service providers shall not provide minors with products and services that induce addiction. Providers of cyber games, online live broadcasts, online audio and video services, online social networking platforms, and other network services shall set corresponding time limits, permission controls, and consumption management functions, along with other rules for minors’ use of their services.
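As a rough illustration of how the time-limit requirement in Article 74 might be operationalized on the platform side, the sketch below tracks a flagged minor account’s daily watch time and stops serving further video once a configurable threshold is reached. The 40-minute figure and all class and method names are assumptions made for illustration; the statute itself only requires “corresponding” time management functions.

```python
from datetime import date

# Hypothetical daily limit in minutes; Article 74 requires "corresponding"
# time management functions but does not prescribe a specific figure.
DAILY_LIMIT_MINUTES = 40

class UsageTracker:
    """Tracks per-day watch time for accounts flagged as belonging to minors."""

    def __init__(self):
        self._minutes = {}  # maps (user_id, date) to minutes watched that day

    def record(self, user_id: str, minutes: float) -> None:
        key = (user_id, date.today())
        self._minutes[key] = self._minutes.get(key, 0.0) + minutes

    def may_continue(self, user_id: str, is_minor: bool) -> bool:
        # Adults are unaffected; minors are cut off once the daily limit is reached.
        if not is_minor:
            return True
        watched = self._minutes.get((user_id, date.today()), 0.0)
        return watched < DAILY_LIMIT_MINUTES

tracker = UsageTracker()
tracker.record("minor-001", 35)
print(tracker.may_continue("minor-001", is_minor=True))  # True: still under the limit
tracker.record("minor-001", 10)
print(tracker.may_continue("minor-001", is_minor=True))  # False: limit reached for today
```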

Furthermore, social media platforms in China face significant liability if they fail to handle children’s personal information in accordance with the Law of the People’s Republic of China on the Protection of Minors. Under this law, network product and service providers, including social media platforms, that violate these requirements may receive warnings, have their illegal gains confiscated, and be fined between 100,000 yuan and one million yuan.Footnote 77 The most severe legal consequences include the suspension of their business licenses and the closure of their websites by public authorities.Footnote 78

Previous empirical studies assessing the impact of an advertising disclosure on minors’ recognition of influencer marketing have demonstrated that the inclusion of an advertising disclosure aids children in recognizing advertising.Footnote 79 Therefore, the disclosure of sponsorship in influencer endorsements should be required to protect child viewers from misleading advertising. According to Article 9 of China’s Measures for the Administration of Internet Advertising, internet advertisements must be clearly identifiable, ensuring that consumers can recognize them as advertisements when goods or services are being promoted. This means that all internet advertisements, regardless of format, must be explicitly labeled as advertisements to comply with legal requirements.Footnote 80 If an internet advertisement is not identifiable, the market regulatory department shall order the violator to take corrective action and impose a fine of not more than 100,000 yuan on the publisher of the advertisement.Footnote 81

Response and Measures for Protecting Child Viewers in Commercial Sharenting

Regulating Platform Liability

Analysis of China’s fragmented regulatory strategy makes it apparent that the existing framework is insufficient for the commercial sharenting regime. To effectively safeguard the rights of child influencers and viewers alike, there is a pressing need for an integrated and optimized regulatory scheme. This scheme should encompass a comprehensive framework that addresses the challenges inherent in commercial sharenting while ensuring the protection of children’s rights and interests. As described above, a three-step mechanism could be employed to safeguard both child influencers and child viewers in the realm of commercial sharenting.

Primarily, any content within the realm of commercial sharenting that jeopardizes the privacy of children ought to be strictly prohibited on social media platforms, given its potential to infringe legal boundaries. Secondly, it is imperative that social media platforms bear the responsibility of actively moderating content associated with commercial sharenting to safeguard child viewers’ interests, in compliance with advertising laws and other special laws protecting minors. Finally, a regulatory mechanism could draw on the “Notice and Action” provisions outlined in the EU’s DSA. This would entail social media platforms being mandated to promptly remove inappropriate commercial sharenting content upon notification by either public authorities or third parties. The first and second steps represent ex-ante regulations, while the third step serves as an ex-post mechanism to prevent potential harm to child viewers.

Content filtering

To protect child influencers’ rights, social media platforms should prohibit any content that poses a risk of invading children’s privacy or personal information, as it constitutes unlawful behavior. The content that child influencers disseminate on social media frequently transcends superficial portrayals of their domestic environments, delving into the disclosure of intimate personal details, including specific identifying information such as the child’s name, birth date, and school. Moreover, intimate information, as termed by Professor Leah Plunkett, encompasses geographic locations, daily routines, preferences, and other private details that viewers can glean from the posted content.Footnote 82 As Fishbein notes, this implies that a child influencer’s private life becomes openly accessible to the public.Footnote 83 The personal lives of child influencers are laid bare for anyone on the internet to observe, thereby exposing them to significant risks of potential misconduct by individuals with malicious intent, such as predators, pedophiles, or identity thieves. When a platform detects content that endangers children’s privacy or personal information, such as nude pictures of children, the content should be blocked outright.
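A minimal sketch of how this filtering step might screen commercial sharenting uploads is given below, assuming a caption field and an upstream image classifier that flags explicit imagery. The regular-expression patterns and function names are hypothetical; a real system would combine trained classifiers, human review, and richer metadata rather than simple pattern matching.

```python
import re

# Illustrative patterns for identifying details that the filtering step would
# block outright; these are assumptions for demonstration, not a real rule set.
IDENTIFIER_PATTERNS = {
    "school name": re.compile(r"\b(primary|elementary|middle) school\b", re.IGNORECASE),
    "birth date": re.compile(r"\b(19|20)\d{2}[-/.](0?[1-9]|1[0-2])[-/.](0?[1-9]|[12]\d|3[01])\b"),
    "phone number": re.compile(r"\b\d{11}\b"),
}

def flags_child_privacy_risk(caption: str, image_is_explicit: bool) -> list:
    """Return the reasons, if any, for refusing a commercial-sharenting upload."""
    reasons = [label for label, pattern in IDENTIFIER_PATTERNS.items() if pattern.search(caption)]
    if image_is_explicit:
        # Nude or otherwise exploitative imagery of children is rejected unconditionally.
        reasons.append("explicit imagery of a minor")
    return reasons

reasons = flags_child_privacy_risk(
    "Morning routine before Sunshine Primary School, born 2016-05-03!",
    image_is_explicit=False,
)
print(reasons)  # ['school name', 'birth date'] -> upload refused at the filtering step
```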

One case in which a social media platform was charged with violating child data protection laws occurred in Ireland. On September 1, 2023, the Data Protection Commission (DPC), Ireland’s supervisory authority, finalized its decision to impose a fine of €345 millionFootnote 84 on TikTok for infringements of the General Data Protection Regulation (GDPR) provisions concerning children’s data protection.Footnote 85

Content moderation

As the second step, social media platforms should be obligated to moderate content related to commercial sharenting to protect child viewers, through laws on the protection of minors, advertising laws, or other specific regulations. Various legal frameworks impose obligations on platforms and other network service providers to manage the amount of time users spend online, as required by instruments such as the Law of the People’s Republic of China on the Protection of Minors,Footnote 86 while the non-binding Initiative for Preventing Juveniles from Short Video Addiction calls on platforms and providers to disclose product endorsements.

To safeguard children against manipulative covert advertising in child-to-child marketing conducted by child influencers, advertising laws that include video-sharing platforms within their scope should be promulgated, mandating that sponsored content be appropriately disclosed. In Chinese law, the relevant provision can be found in the first paragraph of Article 9: “An [i]nternet advertisement shall be identifiable so that it can be clearly identified by consumers as an advertisement.”Footnote 87 However, an examination of regulations in the United Kingdom (UK) and the US makes it evident that China lacks specific legislation concerning social media influencer advertising.

The UK’s Communications Act 2003 is relevant to the child influencer industry, serving to shield children from exposure to harmful content and to ensure that viewers of influencer content are safeguarded against encountering advertisements without adequate warning.Footnote 88 Section 319(2)(l) of the Communications Act 2003 prohibits the “use of techniques which exploit the possibility of conveying a message to viewers or listeners or of otherwise influencing their minds, without their being aware or fully aware of what has occurred.”Footnote 89 This legislation prohibits the dissemination of sponsored content without appropriate disclosure, thereby offering a preventive measure against potential manipulation of uninformed children.

In the US, in November 2019, the FTC issued comprehensive guidelines for endorsements by social media influencers and brands, which also encompass child influencers. The guidelines require that sponsored content be clearly labeled and that endorsements, which include featuring a product or service in a post, tagging or liking brands, “pinning” brands, or commenting on or providing reviews of brands,Footnote 90 be truthful and not misleading.Footnote 91 This brief comparative study does not aim to provide a comprehensive analysis of the UK and US approaches to regulating child influencer advertising, but it does seek to highlight the need for more specific legislation on social media advertising in China.
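To illustrate how a platform might enforce the identifiability requirement of Article 9 at the point of publication, the following sketch gates distribution of declared sponsored posts on the presence of a recognizable advertisement label. The accepted label set, the Post fields, and the notion of a declared paid partnership are assumptions made for illustration, not requirements drawn from the Measures or the FTC guidelines.

```python
from dataclasses import dataclass
from typing import Optional

# Labels treated as satisfying the identifiability requirement; this set is an
# illustrative assumption, not a list taken from the Measures themselves.
ACCEPTED_AD_LABELS = {"广告", "advertisement", "sponsored", "#ad"}

@dataclass
class Post:
    author_id: str
    text: str
    has_paid_partnership: bool           # declared by the uploader or inferred from brand contracts
    disclosure_label: Optional[str] = None

def may_publish(post: Post) -> tuple:
    """Gate publication of influencer content on proper advertising disclosure."""
    if not post.has_paid_partnership:
        return True, "no commercial relationship declared or detected"
    accepted = {label.lower() for label in ACCEPTED_AD_LABELS}
    if post.disclosure_label and post.disclosure_label.lower() in accepted:
        return True, "sponsored content carries a recognizable advertisement label"
    return False, "sponsored content must be labeled as an advertisement before distribution"

print(may_publish(Post("kid-influencer-01", "Unboxing my favorite blocks!", True)))
print(may_publish(Post("kid-influencer-01", "Unboxing my favorite blocks! #ad", True, "#ad")))
```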

Content reviewal

Finally, the third mechanism could leverage the “Notice and Action” provisions of the EU’s DSA, requiring social media platforms to promptly remove commercial sharenting content upon notification by public authorities or third parties.Footnote 92 Thus, platforms should establish an ecological content governance mechanism, complete with detailed rules for the strict classification and grading of content. If platforms utilize personalized algorithmic recommendation technology to push information, the content recommended by those algorithms should be transparently communicated to consumers through prominent labels. Platforms should conduct manual reviews or interventions based on their own transparency guidelines, removing unlawful and inappropriate content generated by sharenting. Audiences should also be empowered to disable recommendation services or delete their personal labels, with digital content platforms offering the option to refrain from pushing content based on personal data. To safeguard minors, digital content platforms should be prohibited from making personalized recommendations to minors or from utilizing sensitive personal data for content recommendations.Footnote 93

Convenient channels for filing complaints and reports should be prominently displayed, accompanied by clear instructions on how to utilize these channels as articulated in China’s Measures for the Administration of Internet Advertising. Article 16 of this 2023 departmental rule requires internet platform operators to monitor and examine their advertising content, and if illegal advertisements are discovered, the operator shall take necessary measures, such as giving notice to request a correction, deleting, blocking, or disconnecting the link(s) to the advertisement(s) along with maintaining relevant records. Internet platform operators should also establish an effective mechanism for receiving and processing complaints and reports. This includes setting up convenient channels for submitting complaints and making public the methods for lodging complaints and reports, ensuring they are handled promptly.Footnote 94 If Article 16 is violated, the market regulatory authority at or above the county level shall order internet platform operators to take corrective action and impose a fine of not less than 10,000 yuan and not more than 50,000 yuan.Footnote 95 However, the legal status of this regulation remains relatively low in terms of China’s normative hierarchy, and the penalties are mild, making it difficult to achieve a deterrent effect.
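The notice-handling duties described above, namely prompt action on notices, record-keeping, and accessible complaint channels, can be pictured with the following minimal sketch. The distinction between notices from public authorities (acted on immediately) and other reports (queued for manual review), as well as all class and field names, are illustrative assumptions rather than a restatement of Article 16 or the DSA.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Notice:
    notice_id: str
    content_id: str
    reporter: str   # e.g., "public_authority", "guardian", "other_user"
    reason: str

@dataclass
class ModerationRecord:
    notice_id: str
    content_id: str
    action: str
    received_at: str

class NoticeAndActionHandler:
    """Minimal notice handling loop: act on the notice, then keep a record."""

    def __init__(self):
        self.records = []          # retained to evidence compliance
        self.blocked_content = set()

    def handle(self, notice: Notice) -> ModerationRecord:
        # Assumed triage rule: notices from public authorities trigger immediate
        # removal; other notices are queued for manual review before any removal.
        if notice.reporter == "public_authority":
            action = "removed"
            self.blocked_content.add(notice.content_id)
        else:
            action = "queued_for_manual_review"
        record = ModerationRecord(
            notice_id=notice.notice_id,
            content_id=notice.content_id,
            action=action,
            received_at=datetime.now(timezone.utc).isoformat(),
        )
        self.records.append(record)
        return record

handler = NoticeAndActionHandler()
print(handler.handle(Notice("n-1", "video-42", "public_authority", "undisclosed advertising to minors")))
```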

Further, online platforms should be encouraged to develop models tailored for minors and to provide online products and services suitable for this demographic. Public authorities, functioning as external supervisors, need not engage in case-by-case supervision. Instead, their role involves examining whether platforms have established relevant mechanisms for content moderation and are taking responsible measures in response to complaints or notices.

The first and second steps, constituting ex-ante regulations, involve proactive measures implemented before potential harm occurs to both child influencers and child viewers. In contrast, the third step serves as an ex-post mechanism, which functions retrospectively to address the harm that has already occurred to child viewers. The ex-post mechanism typically involves responsive measures, such as complaint mechanisms, reporting tools, and content removal procedures, to address harmful content or activities that have been identified after the fact. This mechanism may also include measures to provide support for affected individuals, including child viewers and their families.

Web Literacy Intervention

In addition to establishing comprehensive platform liability, enhancing children’s web literacy is also crucial. Take online advertising literacy as an example: researchers have explored the effectiveness of an educational vlog in assisting children aged 11–14 to cope with advertising. Their findings indicate that advertising disclosures enhanced children’s recognition of advertising.Footnote 96 Studies testing the impact of advertising disclosures on minors’ recognition of influencer marketing have consistently demonstrated that the inclusion of such disclosures aids children in recognizing commercial promotions.Footnote 97

Recently, China enacted administrative regulations aimed at promoting web literacy and morality education. According to the Regulation on the Protection of Minors in Cyberspace, the Ministry of Education shall incorporate web literacy and morality education into quality-oriented education and, in conjunction with the State’s cyberspace affairs department, formulate indicators for assessing minors’ web literacy and morality. Education departments shall guide and support schools in carrying out web literacy and morality education for minors and foster minors’ cybersecurity awareness, digital literacy, behavioral habits, and protection skills, focusing on the formation of moral awareness in cyberspace and the cultivation of the concept of the rule of law in cyberspace.

Conclusion

Commercial sharenting is a global phenomenon in the digital era. The practice of sharenting is not inherently bad; after all, it represents a manifestation of parental love and concern. Like a coin, however, it has two sides: sharenting fueled by child-to-child marketing adversely affects both child influencers and viewers. Conventional legal paradigms predominantly center on child influencers. Nevertheless, considering the escalating prevalence of child-to-child marketing via social media platforms, the irrefutable influence of child influencers on child viewers cannot be overlooked, especially with respect to child viewers’ right to health. To regulate oversharing effectively, it is imperative to safeguard the interests of both child influencers and child viewers. Chinese lawmakers have implemented a series of legal regulations concerning platform liability for user-generated content, although these regulations represent a fragmented approach.

To systematically address the issues related to commercial sharenting involving both child influencers and child viewers, as well as other forms of user-generated content, we propose that China take a three-step approach: 1) content filtering, 2) content moderation, and 3) content reviewal. This three-step approach seeks to establish a comprehensive regulatory framework that safeguards child influencers from privacy violations and economic exploitation while ensuring child viewers’ rights are protected through legal compliance and transparent advertising.

References

1 Fineman, Melanie N., “Honey, I Monetized the Kids: Commercial Sharenting and Protecting the Rights of Consumers and the Internet’s Child Stars,” Georgetown Law Journal 111, no. 4 (2023): 847–90.

2 Plunkett, Leah A., Sharenthood: Why We Should Think before We Talk about Our Kids Online (MIT Press, 2019), https://doi.org/10.7551/mitpress/11756.001.0001.

3 Edney, Amber, “‘I Don’t Work for Free’: The Unpaid Labor of Child Social Media Stars,” University of Florida Journal of Law and Public Policy 32, no. 3 (2021–2022): 547–72.

4 Steinberg, Stacey B., “Sharenting: Children’s Privacy in the Age of Social Media,” Emory Law Journal 66 (2016): 839; Fineman, “Honey, I Monetized the Kids” (n 1); Fishbein, Rachel, “Growing up Viral: ‘Kidfluencers’ as the New Face of Child Labor and the Need for Protective Legislation in the United Kingdom,” Note, George Washington International Law Review 54, no. 1 (2022–23): 127–56.

5 De Veirman, Marijke, Hudders, Liselot, and Nelson, Michelle R., “What Is Influencer Marketing and How Does It Target Children? A Review and Direction for Future Research,” Frontiers in Psychology 10 (3 Dec. 2019), https://doi.org/10.3389/fpsyg.2019.02685.

6 Ibid.

7 Folkvord, Frans et al., “Children’s Bonding with Popular YouTube Vloggers and Their Attitudes toward Brand and Product Endorsements in Vlogs: An Explorative Study,” Young Consumers 20, no. 2 (2019): 77–90, https://doi.org/10.1108/YC-12-2018-0896.

8 Martínez, Carolina and Olsson, Tobias, “Making Sense of YouTubers: How Swedish Children Construct and Negotiate the YouTuber Misslisibell as a Girl Celebrity,” Journal of Children and Media 13, no. 1 (2 Jan. 2019): 36–52, https://doi.org/10.1080/17482798.2018.1517656.

9 Bandura, Albert, “Social Cognitive Theory in Cultural Context,” Applied Psychology 51, no. 2 (2002): 269–90, https://doi.org/10.1111/1464-0597.00092.

10 Hudders, Liselot et al., “Shedding New Light on How Advertising Literacy Can Affect Children’s Processing of Embedded Advertising Formats: A Future Research Agenda,” Journal of Advertising 46, no. 2 (2017): 333–49.

11 Saad Khan and Mia Lucas, “Platform Governance and Content Moderation: Examining the Role of Social Media Platforms in Content Moderation, Including Policies, Guidelines, and Challenges Related to Regulating News Content,” 8 July 2023 (available via ResearchGate).

12 See Catherine Jane Archer and Kate Delmo, “‘Kidinfluencer’ culture is harming kids in several ways – and there’s no meaningful regulation of it,” The Conversation, May 1, 2023, https://www.the-conversation.com/kidfluencer-culture-is-harming-kids-in-several-ways-and-theres-no-meaningful-regulation-of-it-204277.

13 Bandura, “Social Cognitive Theory in Cultural Context” (n 9).

14 Bandura, Albert, “Self-Efficacy: Toward a Unifying Theory of Behavioral Change,” Psychological Review 84, no. 2 (Mar. 1977): 191–215, https://doi.org/10.1037/0033-295X.84.2.191.

15 Mangleburg, Tamara F., Doney, Patricia M., and Bristol, Terry, “Shopping with Friends and Teens’ Susceptibility to Peer Influence,” Journal of Retailing 80, no. 2 (1 Jan. 2004): 101–16, https://doi.org/10.1016/j.jretai.2004.04.005.

16 John, Deborah Roedder, “Consumer Socialization of Children: A Retrospective Look at Twenty-Five Years of Research,” Journal of Consumer Research 26, no. 3 (Dec. 1999): 183–213, https://doi.org/10.1086/209559.

17 Atkinson, Lucy, Nelson, Michelle R., and Rademacher, Mark A., “A Humanistic Approach to Understanding Child Consumer Socialization in US Homes,” Journal of Children and Media 9, no. 1 (2 Jan. 2015): 95–112, https://doi.org/10.1080/17482798.2015.997106.

18 Brody, Gene H. and Stoneman, Zolinda, “Selective Imitation of Same-Age, Older, and Younger Peer Models,” Child Development 52, no. 2 (1981): 717–20, https://doi.org/10.2307/1129197.

19 De Veirman, Marijke, Cauberghe, Veroline, and Hudders, Liselot, “Marketing through Instagram Influencers: The Impact of Number of Followers and Product Divergence on Brand Attitude,” International Journal of Advertising 36, no. 5 (3 Sept. 2017): 798–828, https://doi.org/10.1080/02650487.2017.1348035.

20 Knoll, Johannes, “Advertising in Social Media: A Review of Empirical Evidence,” International Journal of Advertising 35, no. 2 (3 Mar. 2016): 266–300, https://doi.org/10.1080/02650487.2015.1021898.

21 Schouten, Alexander P., Janssen, Loes, and Verspaget, Maegan, “Celebrity vs. Influencer Endorsements in Advertising: The Role of Identification, Credibility, and Product-Endorser Fit,” in Leveraged Marketing Communications, eds. Yoon, Sukki, Choi, Yung Kyun, and Taylor, Charles R. (Routledge, 2021).

22 McGuire, W.J., “Attitudes and Attitude Change,” in Handbook of Social Psychology, eds. Lindzey, Gardner and Aronson, Elliot (Random House, 1985), 233–346.

23 Basil, Michael D., “Identification as a Mediator of Celebrity Effects,” Journal of Broadcasting & Electronic Media 40, no. 4 (1996): 478–95.

24 McCracken, Grant, “Who Is the Celebrity Endorser? Cultural Foundations of the Endorsement Process,” Journal of Consumer Research 16, no. 3 (1989): 310–21.

25 de Vries, Lisette, Gensler, Sonja, and Leeflang, Peter S.H., “Popularity of Brand Posts on Brand Fan Pages: An Investigation of the Effects of Social Media Marketing,” Journal of Interactive Marketing 26, no. 2 (1 May 2012): 83–91, https://doi.org/10.1016/j.intmar.2012.01.003.

26 Hudders et al., “Shedding New Light” (n 10).

27 Suedfeld, Peter et al., “Processes of Opinion Change,” in Attitude Change (Routledge, 1968): 29.

28 Bandura, Albert, Grusec, Joan E., and Menlove, Frances L., “Observational Learning as a Function of Symbolization and Incentive Set,” Child Development 37, no. 3 (1966): 499–506, https://doi.org/10.2307/1126674.

29 Rozendaal, Esther, Buijzen, Moniek, and Valkenburg, Patti, “Children’s Understanding of Advertisers’ Persuasive Tactics,” International Journal of Advertising 30, no. 2 (Jan. 2011): 329–50, https://doi.org/10.2501/IJA-30-2-329-350.

30 “9-year-old boy named highest-paid YouTube star,” CNN Business, Dec. 23, 2020, https://edition.cnn.com/videos/business/2020/12/23/youtube-2020-highest-paid-ryan-kaji-sot-vpx.hln.

31 Tiffany Hsu, “Popular YouTube Toy Review Channel Accused of Blurring Lines for Ads,” New York Times, Sept. 4, 2019, https://www.nytimes.com/2019/09/04/business/media/ryan-toysreview-youtube-ad-income.html.

32 Duivenvoorde, Bram and Goanta, Catalina, “The Regulation of Digital Advertising under the DSA: A Critical Assessment,” Computer Law & Security Review 51 (1 Nov. 2023): 105870, https://doi.org/10.1016/j.clsr.2023.105870.

33 Committee on the Rights of the Child, General Comment No. 25, Children’s Rights in Relation to the Digital Environment, U.N. Doc. CRC/C/GC/25 (Nov. 19, 2021), paras. 96–97.

34 Bragg, Marie A. et al., “Popular Music Celebrity Endorsements in Food and Nonalcoholic Beverage Marketing,” Pediatrics 138, no. 1 (July 2016): e20153977, https://doi.org/10.1542/peds.2015-3977.

35 Boyland, Emma J. et al., “Food Choice and Overconsumption: Effect of a Premium Sports Celebrity Endorser,” Journal of Pediatrics 163, no. 2 (1 Aug. 2013): 339–43, https://doi.org/10.1016/j.jpeds.2013.01.059.

36 Coates, Anna E. et al., “Social Media Influencer Marketing and Children’s Food Intake: A Randomized Trial,” Pediatrics 143, no. 4 (1 Apr. 2019): e20182554, https://doi.org/10.1542/peds.2018-2554; Coates, Anna Elizabeth et al., “The Effect of Influencer Marketing of Food and a ‘Protective’ Advertising Disclosure on Children’s Food Intake,” Pediatric Obesity 14, no. 10 (2019): e12540, https://doi.org/10.1111/ijpo.12540.

37 Marker, Arwen M., Steele, Ric G., and Noser, Amy E., “Physical Activity and Health-Related Quality of Life in Children and Adolescents: A Systematic Review and Meta-Analysis,” Health Psychology 37, no. 10 (Oct. 2018): 893–903, https://doi.org/10.1037/hea0000653.

38 Wong, Monica et al., “Time-Use Patterns and Health-Related Quality of Life in Adolescents,” Pediatrics 140, no. 1 (1 July 2017): e20163656, https://doi.org/10.1542/peds.2016-3656.

39 Dumuid, Dorothea et al., “Health-Related Quality of Life and Lifestyle Behavior Clusters in School-Aged Children from 12 Countries,” Journal of Pediatrics 183 (1 Apr. 2017): 178–83.e2, https://doi.org/10.1016/j.jpeds.2016.12.048.

40 Center for Research on Chinese Youth, “Report on Preventing the Juvenile from Short Video Addiction Model” (2021), http://www.cycrc.org.cn/kycg/seyj/202105/P020210526576438296951.pdf. Chinese language.

41 Lu, Lihong et al., “Adolescent Addiction to Short Video Applications in the Mobile Internet Era,” Frontiers in Psychology 13 (10 May 2022), https://doi.org/10.3389/fpsyg.2022.893599.

42 Lanca, Carla and Saw, Seang-Mei, “The Association between Digital Screen Time and Myopia: A Systematic Review,” Ophthalmic and Physiological Optics 40, no. 2 (2020): 216–29, https://doi.org/10.1111/opo.12657.

43 Zong, Zhiqiang et al., “The Association between Screen Time Exposure and Myopia in Children and Adolescents: A Meta-Analysis,” BMC Public Health 24, no. 1 (18 June 2024): 1625, https://doi.org/10.1186/s12889-024-19113-5.

44 da Costa, Lucas et al., “Sedentary Behavior Is Associated with Musculoskeletal Pain in Adolescents: A Cross Sectional Study,” Brazilian Journal of Physical Therapy 26, no. 5 (1 Sept. 2022): 100452, https://doi.org/10.1016/j.bjpt.2022.100452.

45 Burnay, Jonathan et al., “Which Psychological Factors Influence Internet Addiction? Evidence through an Integrative Model,” Computers in Human Behavior 43 (1 Feb. 2015): 28–34, https://doi.org/10.1016/j.chb.2014.10.039.

46 Zhang, Xing, Wu, You, and Liu, Shan, “Exploring Short-Form Video Application Addiction: Socio-Technical and Attachment Perspectives,” Telematics and Informatics 42 (1 Sept. 2019): 101243, https://doi.org/10.1016/j.tele.2019.101243.

47 Ivie, Elizabeth J. et al., “A Meta-Analysis of the Association between Adolescent Social Media Use and Depressive Symptoms,” Journal of Affective Disorders 275 (1 Oct. 2020): 165–74, https://doi.org/10.1016/j.jad.2020.06.014.

48 World Health Organization, “Public Health Implications of Excessive Use of the Internet, Computers, Smartphones and Similar Electronic Devices,” WHO meeting report, Tokyo, Japan (27–29 August 2014), Sept. 9, 2015, https://www.who.int/publications/i/item/9789241509367.

49 Schouten, Janssen, and Verspaget, “Celebrity vs. Influencer Endorsements” (n 21); Pilgrim, Katharina and Bohnet-Joschko, Sabine, “Selling Health and Happiness How Influencers Communicate on Instagram about Dieting and Exercise: Mixed Methods Research,” BMC Public Health 19, no. 1 (6 Aug. 2019): 1054, https://doi.org/10.1186/s12889-019-7387-8.

50 Hoffner, Cynthia and Buchanan, Martha, “Young Adults’ Wishful Identification with Television Characters: The Role of Perceived Similarity and Character Attributes,” Media Psychology 7, no. 4 (1 Nov. 2005): 325–51, https://doi.org/10.1207/S1532785XMEP0704_2.

51 Tsay-Vogel, Mina and Schwartz, Mitchael L., “Theorizing Parasocial Interactions Based on Authenticity: The Development of a Media Figure Classification Scheme,” Psychology of Popular Media Culture 3, no. 2 (2014): 66–78, https://doi.org/10.1037/a0034615.

52 Donald Horton and R. Richard Wohl, “Mass Communication and Para-Social Interaction,” Psychiatry (1 Aug. 1956), https://www.tandfonline.com/doi/abs/10.1080/00332747.1956.11023049.

53 Rubin, Alan M., Perse, Elizabeth M., and Powell, Robert A., “Loneliness, Parasocial Interaction, and Local Television News Viewing,” Human Communication Research 12, no. 2 (1985): 155–80, https://doi.org/10.1111/j.1468-2958.1985.tb00071.x.

54 The Handbook of Children, Media, and Development, Wiley Online Library (2008), accessed 11 Oct. 2024, https://onlinelibrary.wiley.com/doi/10.1002/9781444302752.

55 de Droog, Simone M., Buijzen, Moniek, and Valkenburg, Patti M., “Use a Rabbit or a Rhino to Sell a Carrot? The Effect of Character–Product Congruence on Children’s Liking of Healthy Foods,” Journal of Health Communication 17, no. 9 (1 Oct. 2012): 1068–80, https://doi.org/10.1080/10810730.2011.650833.

56 Dittmar, Helga, “The Costs of Consumer Culture and the ‘Cage Within’: The Impact of the Material ‘Good Life’ and ‘Body Perfect’ Ideals on Individuals’ Identity and Well-Being,” Psychological Inquiry 18, no. 1 (2007): 23–31.

57 Hill, Jennifer Ann, “Endangered Childhoods: How Consumerism Is Impacting Child and Youth Identity,” Media, Culture & Society 33, no. 3 (Apr. 2011): 347–62, https://doi.org/10.1177/0163443710393387.

58 Lewin, Kurt, “Frontiers in Group Dynamics: Concept, Method and Reality in Social Science; Social Equilibria and Social Change,” Human Relations 1, no. 1 (1 June 1947): 5–41, https://doi.org/10.1177/001872674700100103.

59 Ibid.

60 Eric N. Holmes, Liability for Algorithmic Recommendations, Congressional Research Service, R47753 (Oct. 12, 2023), https://crsreports.congress.gov/product/pdf/download/R/R47753/R47753.pdf.

61 Lu, Xing, Lu, Zhicong, and Liu, Changqing, “Exploring TikTok Use and Non-Use Practices and Experiences in China,” in Social Computing and Social Media. Participation, User Experience, Consumer Experience, and Applications of Social Computing, ed. Meiselwitz, Gabriele (Springer, 2020), 57–70, https://doi.org/10.1007/978-3-030-49576-3_5.

62 Daniel Klug et al., “Trick and Please. A Mixed-Method Study on User Assumptions About the TikTok Algorithm,” in Proceedings of the 13th ACM Web Science Conference 2021, WebSci ‘21 (Association for Computing Machinery, 2021), 84–92, https://doi.org/10.1145/3447535.3462512.

63 Wang, Pengda, “Recommendation Algorithm in TikTok: Strengths, Dilemmas, and Possible Directions,” International Journal of Social Science Studies 10, no. 5 (2022): 60–66.

64 Provisions on Ecological Governance of Network Information Content (promulgated by the Cyberspace Administration of China, Dec. 15, 2019, effective Mar. 1, 2020), ChinaLawInfo (last visited Nov. 3, 2023) (P.R.C.).

65 Opinions of the Cyberspace Administration of China on Further Pushing Websites and Platforms to Fulfill Their Primary Responsibility for Information Content Management (promulgated by the Cyberspace Administration of China, Sep. 15, 2021, effective Sep. 15, 2021), LawInfoChina (last visited Nov. 3, 2023) (P.R.C.).

66 Provisions on the Administration of Algorithm-generated Recommendations for Internet Information Services (promulgated by the Cyberspace Administration of China, Dec. 31, 2021, effective Mar. 1, 2022), ChinaLawInfo (last visited Nov. 3, 2023) (P.R.C.).

67 Ibid.

68 Ibid.

69 Fei Yang and Yu Yao, “A New Regulatory Framework for Algorithm-Powered Recommendation Services in China,” Nature Machine Intelligence 4, no. 10 (Oct. 2022): 802–03, https://doi.org/10.1038/s42256-022-00546-9.

70 Ibid.

71 Provisions on the Administration of Algorithm-generated Recommendations for Internet Information Services (n 66).

72 Yang and Yao, “A New Regulatory Framework for Algorithm-Powered Recommendation Services in China” (n 69).

73 Regulation on the Protection of Minors in Cyberspace (promulgated by the State Council, Oct. 16, 2023, effective Jan. 1, 2024), ChinaLawInfo (last visited Nov. 3, 2023) (P.R.C.).

74 Yaojia Tang and Chunhui Tang, “Research on the Allocation of Platform Liability for Network Information Content Governance,” Research on Financial and Economic Issues 6, no. 59 (2023): 72.

75 Yang and Yao, “A New Regulatory Framework for Algorithm-Powered Recommendation Services in China” (n 69).

76 “Fighting rumors, governance algorithms […] The ‘nationwide campaign to purify the online environment’ in 2022 will focus on correcting this network chaos,” China Daily, Mar. 17, 2022, https://cn.chinadaily.com.cn/a/202203/17/WS623348a5a3101c3ee7acc315.html. Chinese language.

77 One Chinese yuan is equal to about US$0.137 (as of Jan. 17, 2025).

78 Regulation on the Protection of Minors in Cyberspace (n 73).

79 De Jans, Steffi, Cauberghe, Veroline, and Hudders, Liselot, “How an Advertising Disclosure Alerts Young Adolescents to Sponsored Vlogs: The Moderating Role of a Peer-Based Advertising Literacy Intervention through an Informational Vlog,” Journal of Advertising 47, no. 4 (2 Oct. 2018): 309–25, https://doi.org/10.1080/00913367.2018.1539363; Coates et al., “The Effect of Influencer Marketing of Food and a ‘Protective’ Advertising Disclosure on Children’s Food Intake” (n 36).

80 Measures for the Administration of Internet Advertising (promulgated by the State Administration for Market Regulation, Feb. 25, 2023, effective May 1, 2023), ChinaLawInfo (last visited Nov. 8, 2023) (P.R.C.).

81 Ibid.

82 E.g., “Tekkerz kid, A Very Real Morning Routine! ft Tekkerz Kid JR,” YouTube, 10 July 2021, https://www.youtube.com/watch?v=DBrKc8HBwY [https://perma.cc/3E6V-3W64], showing one of Tekkerz Kid’s YouTube videos that depicts details of the inside of his bedroom.

83 Fishbein, “Growing up Viral” (n 4).

84 One euro equals US$1.027 (as of Jan. 17, 2025).

85 Data Protection Commission, “Irish Data Protection Commission announces €345 million fine of TikTok” (Sept. 15, 2023), https://www.dataprotection.ie/index.php/en/news-media/press-releases/DPC-announces-345-million-euro-fine-of-TikTok.

86 Wei Cheng Nian Ren Bao Hu Fa [Law of the People’s Republic of China on the Protection of Minors] (promulgated by the Standing Committee of the National People’s Congress, Oct. 17, 2020, effective June 1, 2021) Standing Comm. Nat’l People’s Cong. Gaz. (P.R.C.). Chinese language.

87 Measures for the Administration of Internet Advertising (n 80).

88 Communications Act 2003, c. 21 (UK).

89 Communications Act 2003, c. 21 (UK), s. 319(2)(l).

91 Federal Trade Commission, Advertising and Marketing, https://www.ftc.gov/business-guidance/advertising-marketing (last visited Oct. 10, 2023).

92 Digital Services Act (EU Regulation 2022/2065) (2022), art. 22.

93 Tang and Tang, “Research on the Allocation of Platform Liability for Network Information Content Governance” (n 74).

94 Measures for the Administration of Internet Advertising (n 80).

95 Ibid.

96 De Jans, Cauberghe, and Hudders, “How an Advertising Disclosure Alerts Young Adolescents to Sponsored Vlogs” (n 79).

97 Ibid.