Introduction
The term “commercial sharenting” denotes the practice wherein parents excessively share photos and other information about their children on social media platforms with the primary aim of seeking financial profit.Footnote 1 The earnings may come in the form of immediate payment, the establishment of future business interests for compensation, or through other avenues of current or potential revenue generation.Footnote 2 Once a certain threshold of followers is reached, parents managing the account can monetize their social media presence and secure sponsorships from well-known brands, such as Walmart, Hasbro, and others.Footnote 3 Infringements of child influencers’ rights in commercial sharenting scenarios, such as privacy violations and economic exploitation, have been extensively discussed in previous scholarship, prompting lawmakers in various jurisdictions to address the issue by safeguarding the right to privacy and preventing the digital exploitation of child influencers.Footnote 4 However, apart from digital advertising,Footnote 5 existing literature rarely provides a systematic and comprehensive focus on the impact that sharing images or information about child influencers has on the children on the other side of the screen.
Children today are both creators and audiences of social media.Footnote 6 Studies indicate that children spend considerable time watching videos of their favorite influencers, during which they are also exposed to influencer marketing practices.Footnote 7 Excessive and unhealthy engagement with videos featuring child influencers can have significant adverse effects on child viewers’ physical and mental health, including increased risks of myopia, spinal issues, and psychological distress, which are explored in detail in the third section of this article. Previous literature also indicates that children are easily affected by the content posted by influencers,Footnote 8 especially peer influencers.Footnote 9 The fast-paced development of social media opens up fresh avenues for brands to interact with children, integrating embedded advertising formats that seamlessly weave brand messages into captivating media content. This method makes promotional aspects less obvious, thus increasing the difficulty of detection by the target audience.Footnote 10
This article endeavors to provide a systematic analysis of child viewers within the legal framework of commercial sharenting. Given that content shared by child influencers or their parents falls under the category of user-generated content, platform governance is considered a highly effective strategy for addressing associated issues.Footnote 11 We argue for the necessity of compelling platforms to assume additional content moderation responsibilities in the context of commercial sharenting. This can be achieved through the establishment of a content governance mechanism, which would include detailed rules for the proper disclosure of sponsored products and strict content classification. A three-step mechanism could be employed to safeguard both child influencers and child viewers in the realm of commercial sharenting. Firstly, any content in commercial sharenting that risks encroaching on children’s privacy should be prohibited on social media, as it constitutes unlawful behavior. Secondly, social media platforms should be obligated to moderate content to protect child viewers, particularly through advertising laws or any other specific regulations. Finally, the third mechanism could leverage the “Notice and Action” provisions of the European Union’s Digital Services Act (DSA), requiring social media platforms to promptly remove commercial sharenting content upon notification by public authorities or third parties. The first and second steps represent ex-ante regulations, while the third step serves as an ex-post mechanism to prevent potential harm to child viewers.
The Irrefutable Influence of Child Influencers on Child Viewers
The manifestation of child-to-child marketing exerts a pervasive influence on the targeted children, with child influencers regularly promoting products like toys to their peers on platforms such as YouTube. This trend underscores that child-driven marketing has a significant influence on and economic implications for those children’s peers.Footnote 12 Researchers have found that children will copy their peers’ behavior, including their consumption patterns,Footnote 13 a phenomenon referred to as peer modeling or social learning.Footnote 14 Gaining peer acceptance and approval is important to children.Footnote 15 Particularly, from the ages of 6 to 7, peers become important agents for consumer socialization, and children begin to model their peers’ behavior.Footnote 16 Even preschool children’s behavior and consumption may already be influenced by their friends.Footnote 17 This has been shown to be particularly true when the peer is the same age or slightly older.Footnote 18
The rise of social media introduced a new type of peer endorsement, namely social media influencers, who are now widely acknowledged as a new form of celebrity.Footnote 19 This new form of peer endorsement can reach a large number of child viewers due to the proliferation of social media, resulting in a wide-ranging impact.Footnote 20 Due to their relatable and approachable nature, child influencers offer viewers a window into their private lives through social media posts. This fosters a pronounced sense of shared experience between the creator and the audience, positioning child influencers as akin to their peers in the viewers’ eyes.Footnote 21 According to source attractiveness theory, the driving factors of source attractiveness consist of the source’s similarity, familiarity, and likability. Specifically, similarity refers to the resemblance between the source and the receiver; in this context, the child influencer and the child viewer share a common identity as children. Familiarity denotes the degree to which the receiver knows the source through repeated exposure; in this scenario, the content created by child influencers, which is often centered around their daily lives and filmed in their homes, fosters a sense of familiarity with their audiences. Likability is defined as the affection felt towards the source, which can result from the source’s physical appearance and behavior.Footnote 22 Similarity, familiarity, and likability contribute to the receivers’ identification with the source, thus increasing the likelihood of adopting the source’s beliefs, attitudes, and behaviors.Footnote 23 In this case, the tangible appeal that child influencers have for their young audiences amplifies the likelihood of impressionable children emulating behaviors or making consumer choices based on the marketed content.
Brands hope that the positive associations linked to celebrities will transfer to the brands they endorse.Footnote 24 As noted above, social media gives brands fresh avenues to reach children through embedded advertising formats that seamlessly weave brand messages into captivating media content. Since child viewers and their parents are often less resistant towards endorsements from child influencers, as peers are considered authentic information sources with no commercial interest,Footnote 25 this method makes promotional aspects less obvious, thus increasing the difficulty of detection by the target audience.Footnote 26
Detrimental Impacts on Child Viewers
The detrimental effect that social media can have on child viewers emanates from various sources. Considering the gravity of this issue, our attention is directed towards the deleterious impact engendered by child-to-child marketing facilitated by child influencers on social media.
Misguided consumer behavior with undisclosed sponsorship
Based on the reviewed literature and theories, children are easily influenced by their peer models and are likely to imitate their behavior, including the use of products they endorse.Footnote 27 As child influencers usually show the usefulness of the toys, clothes, or food they endorse, this action may accordingly lead to observational learning from their viewers.Footnote 28 Thus, child viewers are particularly vulnerable to exploitation through targeted advertising, especially when ads are designed to exploit their developmental vulnerabilities, such as the varying levels of advertising literacy across early childhood (under age 5), middle childhood (6–9 years), and late childhood (10–12 years).Footnote 29
In 2019, the watchdog group Truth in Advertising (based in Connecticut) filed a complaint with the United States (US) Federal Trade Commission (FTC) against “Ryan Toys Review,” YouTube child influencer Ryan Kaji’s top toy review channel,Footnote 30 accusing Kaji of deceiving children through “sponsored videos that often have the look and feel of organic content.” Approximately 90% of his videos have featured at least one paid product recommendation targeting preschoolers, a demographic too young to discern between commercial content and genuine reviews.Footnote 31 As Bram and Catalina have stated, “embedding sponsored content without any disclosure is the biggest harm faced by consumers in this industry.”Footnote 32 The purposeful directing of advertising content towards children through videos created by child influencers necessitates specific regulatory oversight to ensure ethical and responsible marketing practices.
Right to health
The right to health is recognized as a fundamental right under the United Nations (UN) Convention on the Rights of the Child (UNCRC). Article 24(1) of the UNCRC specifically outlines the right of the child to enjoy the highest attainable standard of health. The content of children’s right to health in the digital age is specifically interpreted in General Comment No. 25 (2021) on children’s rights in relation to the digital environment, which compels Member States to implement regulatory measures against materials and services that have the potential to adversely affect children’s mental or physical well-being. These provisions specifically address targeted or age-inappropriate advertising, marketing strategies, and other pertinent digital services, intending to shield children from exposure to the promotion of unhealthy engagement in social media.Footnote 33 In the visual world of commercial sharenting, excessive and unhealthy engagement with child influencers’ videos can have significant adverse effects on child viewers’ physical and mental health.
Previous research on the impact of celebrity endorsements on children has mainly focused on food marketing. Specifically, research has found that brands use celebrities to endorse energy-dense and nutrient-poor products,Footnote 34 increasing children’s intake of these less-healthy foods.Footnote 35 Researchers have examined the impact of social media influencers’ marketing on children’s food intake.Footnote 36 Results have shown that influencer marketing of unhealthy foods increases children’s immediate food intake, which is direct evidence of the detrimental impact on children’s physical health.
Researchers use “health-related quality of life” (HRQoL),Footnote 37 an important multidimensional concept that measures the risk of disease precursors, to indicate the health status of the next generation.Footnote 38 Previous empirical studies indicate that children with less screen time and moderate physical activity have the greatest HRQoL levels.Footnote 39 According to a 2021 study by the Center for Research on Chinese Youth, 70.8% of respondents often used Douyin (Chinese TikTok), Kuaishou, Huoshan, and other platforms to watch short videos.Footnote 40 Addiction to social media poses great challenges to the physical and mental health of young people.Footnote 41 The continuous viewing of screens, often for extended periods, increases the risk of myopia and other vision-related issues among children.Footnote 42 Previous studies indicate a statistically significant correlation between screen time (high vs. low) and myopia.Footnote 43 Additionally, prolonged sedentary behavior, such as maintaining improper posture while engrossed in video content, can contribute to spinal problems and musculoskeletal pain in young viewers.Footnote 44 In addition to physical health concerns, the mental health implications of excessive screen time for children are equally concerning. Researchers have highlighted the adverse effects of internet addiction, characterized by an inability to regulate internet usage, which can lead to significant distress and functional impairment.Footnote 45 Studies have shown a notable correlation between excessive use of short video applications and addictive behavior,Footnote 46 as well as a strong association between excessive social media use and depressive symptoms.Footnote 47
These physical and mental problems of children have escalated to the point of becoming a significant public health concern. Per a report by the World Health Organization (WHO) titled “Public Health Implications of Excessive Use of the Internet, Computers, Smartphones, and Similar Electronic Devices,” adolescents and young adults have emerged as the primary users of the internet and contemporary technologies. Behavioral addictions, characterized by an irresistible urge to repeatedly engage in online activities such as social media, have become a prevalent phenomenon across various jurisdictions.Footnote 48 The negative impact on child viewers cannot be overlooked and warrants serious consideration due to the potential long-term consequences on their development and well-being.
Identification confusion
Previous studies indicate that children are highly inclined to identify with popular child influencers and adopt their attitudes, beliefs, and behaviors, including those related to brands and products.Footnote 49 Sources used in advertising can serve as role models that children emulate in forming their identities.Footnote 50 The likelihood of this process increases when consumers develop a parasocial relationship (PSR) with the source.Footnote 51 PSR, as introduced by Horton and Wohl,Footnote 52 refers to the relationships consumers develop with media figures, making them influential sources of information.Footnote 53 The need for companionship, which is a fundamental driver for relationship formation, begins to emerge during childhood.Footnote 54 As a result, children are particularly susceptible to developing PSRs with media figures.Footnote 55 The one-sided emotional connections that individuals form with media personalities, such as popular child influencers, can significantly influence children’s attitudes, beliefs, and behaviors, including their consumption choices.
The proliferation of inappropriate content within commercial sharenting, including the flaunting of luxury items, has the potential to inadvertently endorse materialistic values to child viewers. The promotion of consumer culture by child influencers to their child viewers fosters a mindset centered around consumerism and material possessions, which can have profound adverse effects on influencers’ and viewers’ identity development and well-being.Footnote 56 This emphasis on material wealth can contribute to feelings of inadequacy as well as comparison to and dissatisfaction with one’s own circumstances, potentially leading to adverse effects on young audiences’ identification clarity.Footnote 57 It is essential to recognize and explore the implications of such content within commercial sharenting platforms to promote healthier lifestyles and values among child viewers.
Basic Framework for Protecting the Child Viewer in China
In the digital era, the underlying issue regarding the effective regulation of commercial sharenting or other user-generated content falls on social media platforms’ governance abilities. According to gatekeeping theory, first formulated by Kurt Lewin,Footnote 58 the gatekeeper plays a crucial role in determining what information should be allowed to pass to groups or individuals and what should be restricted. Indeed, through the gatekeeping process, the gatekeeper filters and removes unwanted, sensitive, and controversial information. This function serves to exert control over society or a group, guiding it along a path perceived as appropriate or beneficial.Footnote 59
Moreover, in recent years, platforms have increasingly employed pre-designed algorithms for the organization and recommendation of content.Footnote 60 This transformation has significantly altered how users engage with online content.Footnote 61 Take TikTok as an illustrative case: the platform predominantly relies on content interactions within personalized video feeds, with the performance relying heavily on the effectiveness of the recommendation algorithm.Footnote 62 It is noteworthy that the potential of recommendation algorithms is rooted in the abundant availability of user data, rendering TikTok capable of augmenting the efficiency of content distribution and enhancing the adaptability of the personalized video feed.Footnote 63 As the primary driving force behind content delivered to users, particularly minors, the legal regulation of recommendation algorithms holds significant importance within the content moderation process.
Legal Regulations Surrounding Platform Governance
To enhance content governance on platforms, China has implemented a series of laws, including the Cybersecurity Law of the People’s Republic of China and the Data Security Law of the People’s Republic of China, among others. Several related administrative regulations and departmental regulatory documents have also been issued in recent years. The Provisions on Ecological Governance of Network Information Content, as deliberated and adopted at the executive meeting of the Cyberspace Administration of China (CAC), came into force on March 1, 2020.Footnote 64 The Opinions of the Cyberspace Administration of China on Further Pushing Websites and Platforms to Fulfill Their Primary Responsibility for Information Content Management were issued on September 15, 2021, to further prompt websites and platforms to fulfill their primary responsibility for information content management.Footnote 65
To regulate algorithm-generated recommendations for internet information services, the departmental Provisions on the Administration of Algorithm-generated Recommendations for Internet Information Services were issued, which explicitly denote the user’s right to know the algorithm,Footnote 66 the user’s right of choice,Footnote 67 and the special protection of minors.Footnote 68 Further, on August 12, 2022, the CAC issued an initial set of announcements obligating internet service providers to submit information concerning the algorithms of domestic internet information services. This mandate aligns with the Provisions on the Administration of Algorithm-generated Recommendations for Internet Information Services, which pertain to algorithm-generated recommendationsFootnote 69 and highlight the responsibility of internet service providers to ensure algorithm security and construct an algorithm accountability mechanism.Footnote 70 These provisions encompass algorithm-powered recommendation services (ARS) utilizing technologies such as generation and synthesis, personalized push, selection sort, search filtering, scheduling decisions, and other algorithmic technologies to deliver information to users.Footnote 71 As a result, the majority of online platforms that Chinese internet users utilize fall within the regulatory scope that these provisions define.Footnote 72 Platform governance in China also covers other aspects related to the protection of children in the digital sphere, such as the anti-addiction system and the special protection of personal information contained in the Law of the People’s Republic of China on the Protection of Minors. In addition, China’s effort to specially protect the rights of minors in cyberspace can be seen in the newly issued Regulation on the Protection of Minors in Cyberspace, which came into force on January 1, 2024.Footnote 73
Broadly speaking, China has adopted a fragmented regulatory strategy to govern platform liability, and there is still an absence of a dedicated law that comprehensively addresses content moderation. The primary responsibilities of platforms are governed by departmental regulatory documents that have a relatively low legal status.Footnote 74 Additionally, the rigorous administrative supervision system, which relies on the approach of “further pushing platforms to fulfill responsibility,” may not provide strong incentives for platforms to proactively engage in content moderation. In reality, content governance primarily occurs through “special campaigns” rather than through normalized governance processes.Footnote 75 For example, the CAC implemented a nationwide campaign to purify the online environment, which was intended to “crack down on wrongdoing and malpractice in live-streaming and short videos” and resolutely curb the manipulation of minors to make profits through live broadcasting and short videos to create child influencers.Footnote 76 Despite its immediate effects, this governance approach may face challenges in terms of long-term sustainability.
Specific Law
The previous analysis of children’s right to health reveals that long-term viewing of online videos or live broadcasts, as well as other forms of online social networking, poses inevitable health harms for child viewers. Some special laws have been enacted to address this issue in China. Specifically, Article 74 of the Law of the People’s Republic of China on the Protection of Minors stipulates that network product and service providers shall not provide minors with products and services that induce addiction. Cyber games, online live broadcasts, online audio and video services, online social networking platforms, and other types of network service providers shall set corresponding time limits, authority, and consumption management parameters along with other rules for minors’ use of their services.
Furthermore, social media platforms in China face significant liability if they fail to handle children’s personal information in accordance with the Law of the People’s Republic of China on the Protection of Minors. Under this law, network product and service providers, including social media platforms, that violate these requirements may receive warnings, have their illegal gains confiscated, and be fined between 100,000 yuan and one million yuan.Footnote 77 The most severe legal consequences include the suspension of their business licenses and the closure of their websites by public authorities.Footnote 78
Previous empirical studies assessing the impact of an advertising disclosure on minors’ recognition of influencer marketing have demonstrated that the inclusion of an advertising disclosure aids children in recognizing advertising.Footnote 79 Therefore, the disclosure of sponsorship in influencer endorsements should be required to protect child viewers from misleading advertising. According to Article 9 of China’s Measures for the Administration of Internet Advertising, internet advertisements must be clearly identifiable, ensuring that consumers can recognize them as advertisements when they promote goods or services. This means that all internet advertisements must be explicitly labeled as advertisements to comply with legal requirements.Footnote 80 If an internet advertisement is not identifiable, the market regulatory department shall order the violator to take corrective action and impose a fine of not more than 100,000 yuan on the publisher of the advertisement.Footnote 81
Response and Measures for Protecting Child Viewers in Commercial Sharenting
Regulating Platform Liability
In analyzing China’s fragmented regulatory strategy, it becomes apparent that the existing approach is insufficient for the commercial sharenting regime. To effectively safeguard the rights of child influencers and viewers alike, there is a pressing need for an integrated and optimized regulatory scheme. This scheme should encompass a comprehensive framework that addresses the challenges inherent in commercial sharenting while ensuring the protection of children’s rights and interests. As described above, a three-step mechanism could be employed to safeguard both child influencers and child viewers in the realm of commercial sharenting.
Primarily, any content within the realm of commercial sharenting that jeopardizes the privacy of children ought to be strictly prohibited on social media platforms, given its potential to cross legal boundaries. Secondly, it is imperative that social media platforms bear the responsibility of actively moderating content associated with commercial sharenting to safeguard child viewers’ interests in compliance with the protection of minors via advertising laws and other special laws. Finally, a regulatory mechanism could draw on the “Notice and Action” provisions outlined in the EU’s DSA. This would entail social media platforms being mandated to promptly remove inappropriate commercial sharenting content upon notification by either public authorities or third parties. The first and second steps represent ex-ante regulations, while the third step serves as an ex-post mechanism to prevent potential harm to child viewers.
Content filtering
To protect child influencers’ rights, social media platforms should prohibit any content that poses a risk of invading children’s privacy or personal information, as it constitutes unlawful behavior. The content disseminated on child influencers’ social media accounts frequently transcends superficial portrayals of their domestic environments, delving into the disclosure of intimate personal details, including specific identifying information such as the child’s name, birth date, and school. Moreover, intimate information, as termed by Professor Leah Plunkett, encompasses geographic locations, daily routines, preferences, and other private details that viewers can glean from the posted content.Footnote 82 As Fishbein notes, this implies that a child influencer’s private life becomes openly accessible to the public.Footnote 83 The personal lives of child influencers are laid bare for anyone on the internet to observe, exposing them to significant risks of misconduct by individuals with malicious intent, such as predators, pedophiles, or identity thieves. When a platform detects content that endangers children’s privacy or personal information, such as nude pictures of children, it should prohibit that content outright.
One case involving a social media platform being charged with violating child data protection laws occurred in Ireland. On September 1, 2023, the Data Protection Commission (DPC), Ireland’s supervisory authority, finalized its decision to impose a fine of 345 million eurosFootnote 84 on TikTok for alleged violations of the General Data Protection Regulation (GDPR) provisions concerning children’s data protection.Footnote 85
Content moderation
As the second step, social media platforms should be obligated to moderate content related to commercial sharenting to protect child viewers through laws on the protection of minors, advertising laws, or any other specific regulations. Various legal frameworks impose obligations on platforms and other network service providers to manage the amount of time users spend online, as required by instruments such as the Law of the People’s Republic of China on the Protection of Minors,Footnote 86 while the non-binding Initiative for Preventing Juveniles from Short Video Addiction encourages platforms and providers to disclose product endorsements.
To safeguard children against manipulative covert advertisements in child-to-child marketing conducted by child influencers, advertising laws that include video-sharing platforms in their scope should be promulgated, mandating that sponsored content be appropriately disclosed. In Chinese law, the relevant provision can be found in the first paragraph of Article 9: “An [i]nternet advertisement shall be identifiable so that it can be clearly identified by consumers as an advertisement.”Footnote 87 However, from examining regulations in the United Kingdom (UK) and the US, it is evident that China lacks specific legislation concerning social media influencer advertising.
The UK’s Communications Act 2003 pertains specifically to the child influencer industry, serving to shield children from exposure to harmful content and ensuring that viewers of influencer content are safeguarded against encountering advertisements without adequate warning.Footnote 88 Section 319(2)(l) of the Communications Act 2003 prohibits the “use of techniques which exploit the possibility of conveying a message to viewers or listeners or of otherwise influencing their minds, without their being aware or fully aware of what has occurred.”Footnote 89 This legislation prohibits child influencers from disseminating sponsored content without appropriate disclosure, thereby offering a preventive measure against potential manipulations of uninformed children.
In the US, in November 2019, the FTC issued comprehensive guidelines for endorsements by social media influencers and brands, which also encompass child influencers. The guidelines require that sponsored content be clearly labeled and that endorsements, which include featuring a product or service in a post, tagging or liking brands, “pinning” brands, or commenting on or providing reviews of brands,Footnote 90 be truthful and not misleading.Footnote 91 This brief comparative study does not aim to provide a comprehensive analysis of the UK and US approaches to regulating child influencer advertising, but it does seek to highlight the need for more specific legislation on social media advertising in China.
Content review
Finally, the third mechanism could leverage the “Notice and Action” provisions of the EU’s DSA, requiring social media platforms to promptly remove commercial sharenting content upon notification by public authorities or third parties.Footnote 92 Thus, platforms should establish an ecological content governance mechanism, complete with detailed rules for the strict classification and grading of content. If platforms utilize personalized algorithmic recommendation technology to push information, the content recommended by those algorithms should be transparently communicated to consumers through prominent labels. Platforms should conduct manual reviews or interventions based on their own transparency guidelines, removing unlawful and inappropriate content generated by sharenting. Audiences should also be empowered to disable recommendation services or delete their personal labels, with digital content platforms offering the option to refrain from pushing services based on personal data. To safeguard minors, digital content platforms are prohibited from recommending content to minors or utilizing personally sensitive data for content recommendations.Footnote 93
Convenient channels for filing complaints and reports should be prominently displayed, accompanied by clear instructions on how to utilize these channels as articulated in China’s Measures for the Administration of Internet Advertising. Article 16 of this 2023 departmental rule requires internet platform operators to monitor and examine their advertising content, and if illegal advertisements are discovered, the operator shall take necessary measures, such as giving notice to request a correction, deleting, blocking, or disconnecting the link(s) to the advertisement(s) along with maintaining relevant records. Internet platform operators should also establish an effective mechanism for receiving and processing complaints and reports. This includes setting up convenient channels for submitting complaints and making public the methods for lodging complaints and reports, ensuring they are handled promptly.Footnote 94 If Article 16 is violated, the market regulatory authority at or above the county level shall order internet platform operators to take corrective action and impose a fine of not less than 10,000 yuan and not more than 50,000 yuan.Footnote 95 However, the legal status of this regulation remains relatively low in terms of China’s normative hierarchy, and the penalties are mild, making it difficult to achieve a deterrent effect.
Further, online platforms should be encouraged to develop models tailored for minors and provide online products and services suitable for this demographic. Public authorities, functioning as external supervisors, should not engage in case-by-case supervision. Instead, their role is to examine whether platforms have established relevant content moderation mechanisms and are taking responsible measures in response to complaints or notices.
The first and second steps, constituting ex-ante regulations, involve proactive measures implemented before potential harm occurs to both child influencers and child viewers. In contrast, the third step serves as an ex-post mechanism, which functions retrospectively to address the harm that has already occurred to child viewers. The ex-post mechanism typically involves responsive measures, such as complaint mechanisms, reporting tools, and content removal procedures, to address harmful content or activities that have been identified after the fact. This mechanism may also include measures to provide support for affected individuals, including child viewers and their families.
Web Literacy Intervention
In addition to establishing comprehensive platform liability, enhancing children’s web literacy is also crucial. Take online advertising literacy as an example: researchers have explored the effectiveness of an educational vlog in assisting children aged 11–14 to cope with advertising. Their findings indicate that advertising disclosures enhanced children’s recognition of advertising.Footnote 96 Studies testing the impact of advertising disclosures on minors’ recognition of influencer marketing have consistently demonstrated that the inclusion of such disclosures aids children in recognizing commercial promotions.Footnote 97
Recently, China enacted administrative regulations aimed at promoting web literacy and morality. According to the Regulation on the Protection of Minors in Cyberspace, the Ministry of Education shall incorporate web literacy and morality education into quality-oriented education and, in conjunction with the State’s cyberspace affairs department, formulate indicators for assessing minors’ web literacy and morality. Education departments shall guide and support schools in carrying out web literacy and morality education for minors and foster minors’ cybersecurity awareness, digital literacy, behavioral habits, and protection skills, focusing on forming moral awareness in cyberspace and cultivating the concept of the rule of law in cyberspace.
Conclusion
Commercial sharenting is a global phenomenon in the digital era. The practice of sharenting is not inherently bad; after all, it represents a manifestation of parental love and concern. Like a coin, however, sharenting has two sides: fueled by child-to-child marketing, it adversely affects both child influencers and child viewers. Conventional legal paradigms predominantly center on child influencers. Nevertheless, given the escalating prevalence of child-to-child marketing via social media platforms, the influence of child influencers on child viewers cannot be overlooked, particularly its implications for child viewers’ right to health. To regulate oversharing effectively, it is imperative to safeguard the interests of both child influencers and child viewers. Chinese lawmakers have implemented a series of legal regulations concerning platform liability for user-generated content, although these regulations represent a fragmented approach.
To systematically address the issues related to commercial sharenting involving both child influencers and child viewers, as well as other forms of user-generated content, we propose that China take a three-step approach: 1) content filtering, 2) content moderation, and 3) content review. This three-step approach seeks to establish a comprehensive regulatory framework that safeguards child influencers from privacy violations and economic exploitation while ensuring child viewers’ rights are protected through legal compliance and transparent advertising.