Here’s how sweeping legal changes will affect content creators in 2023

By Franklin Graves | 12/29/2022

The future of the creator economy ecosystem is difficult to predict, especially following the whirlwind year of events throughout 2022. But we do know legal developments are poised to bring drastic changes for platforms, creators, and advertisers.

The past year brought about changes to internet laws and regulations in the European Union, as well as challenges to laws in the United States that will begin to take shape throughout 2023. Additionally, regulatory authorities have continued exercising their control in new and emerging areas, dropping penalties and warnings on both creators and brands.

Meanwhile, technological advancements in artificial intelligence (AI) were made available to the masses throughout 2022, raising questions about opportunities and risks for creators and brands alike.

From creators racking up liabilities to platforms facing global regulatory control, the legal landscape of the creator economy holds both promise and challenges for the year ahead. Let’s explore the possibilities!

Content liability for creators: music, the “fediverse,” subscriptions, sponsorships, and more

It might appear that creators are set up nicely going into 2023, with robust music catalogs to tap into as they create content on platforms, but they should think twice before using those tracks.

Numerous platforms secured music licensing deals to expand their music library catalogs to include a mix of music from the major record labels (Sony Music Group, Warner Music Group, and Universal Music Group) as well as independent label collective Merlin. One exception is video-sharing app Triller, which Sony Music has sued for failing to pay royalties. As the platform begins removing music from the other major labels, it has even admitted in court that it’s been unable to pay those royalties.

If creators rely on a platform’s licenses, those music tracks present a few pitfalls:

  • These deals usually do not cover commercial use of the tracks, such as an influencer mixing one into a TikTok or a brand account dropping one into a YouTube video. The licensing rights typically only cover personal, non-commercial use by regular (for lack of a better term) users generating content. Creators need to make sure they understand the licensing rights for the assets they use when creating content for distribution across platforms. Otherwise, improper use could make them, and their brand partners, subject to copyright infringement lawsuits.
  • The licenses usually don’t transfer to other platforms. That means if a creator uses music from the YouTube Music Library or the TikTok Audio Library, they likely need to confirm licensing anywhere else they upload that content.
  • The licenses may expire, since most platforms license music for only a couple of years at a time. This leaves creators potentially exposed unless the platform’s license covers works that incorporate the music beyond the term for which the music is available in the platform’s library. Even then, a creator may be left years later trying to prove they obtained the music under a platform’s license agreement, which isn’t always easy.

Creators exploring non-traditional platform options such as Mastodon and similar “fediverse” technologies should also be mindful of taking on platform liability issues, such as copyright or trademark infringement and content moderation, which are usually handled by well-established and well-supported departments within technology companies.

A key component of the interoperability of the fediverse is the social protocol ActivityPub, for which Tumblr recently announced support. 2023 may present opportunities for additional partnerships and for creators to explore how (if at all) decentralized platforms will play a role in their business strategies.
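
For a sense of what that interoperability looks like under the hood, below is a minimal sketch, written in Python for illustration, of the kind of “actor” document an ActivityPub-compatible server publishes for each account. The handle and domain are hypothetical, and the fields shown are only a small subset of the full spec.

    # A minimal sketch of an ActivityPub "actor" document: the JSON record
    # a fediverse server publishes for each account so that other servers
    # know where to deliver (inbox) and fetch (outbox) that account's posts.
    # The handle and domain below are hypothetical.
    import json

    actor = {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Person",
        "id": "https://example.social/users/somecreator",
        "preferredUsername": "somecreator",
        "inbox": "https://example.social/users/somecreator/inbox",
        "outbox": "https://example.social/users/somecreator/outbox",
    }

    print(json.dumps(actor, indent=2))

Because every participating server speaks this shared vocabulary, a creator’s followers on one fediverse server can receive posts published from another, without either platform’s permission.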

Creators offering subscriptions and other gated-access mechanisms in connection with their communities need to understand the potential liability they are taking on, too. Operating a community powered by platforms such as Discord or Slack places a burden on creators to establish rules, moderation policies, and dispute systems within their micro-communities. It isn’t a stretch to imagine the kinds of lawsuits currently being filed against platforms starting to name creators themselves as the operators.

Additionally, as the Federal Trade Commission (FTC) noted in a 2019 statement regarding YouTube, “individual channels on a general audience platform are ‘websites or online services’ under COPPA. This framing puts content creators and channel owners on notice that we consider them to be standalone ‘operators’ under COPPA, subject to strict liability for COPPA violations.”

Creators should expect to comply with COPPA and any other laws that are relevant to the activities they undertake, much in the same way the EU’s robust privacy laws (and more recently those out of California) impacted privacy practices across the creator economy.

The FTC is expected to continue work on updating and expanding its enforcement options and efforts throughout 2023, focusing heavily on paid endorsements and reviews.

“The FTC is poised to update its Endorsement Guides, and the proposed changes focus on disclosure and transparency,” says Robert Freund, an attorney who works with online advertising and e-commerce companies, brands, and creators.

The announced rulemaking proceeding will address the need for more robust civil penalties and for more clearly defined prohibited activities. The public comment period closes in January 2023, after which the FTC will host several public workshops throughout the year.

“Specifically, the FTC will expand the definition of ‘endorsement’ to clarify that all ‘marketing’ and ‘promotional’ messages—even social media tags—are endorsements subject to FTC’s disclosure rules and truth-in-advertising principles,” Freund adds. “The revisions also clarify that influencers and other creators may be liable for failing to disclose their relationships with brands—while this has long been the case, the clarification may signal the FTC’s intent to pursue more enforcement against individual creators, not just brands and agencies.”

The FTC is seeking ways to reclaim some of its authority to enforce civil penalties following the U.S. Supreme Court’s 2021 decision in AMG Capital Management, LLC v. FTC, which held that Section 13(b) of the FTC Act does not authorize the FTC to seek monetary relief such as restitution or disgorgement.

As the FTC explained in an October press release, “A potential rule that clearly spells out prohibited practices may strengthen deterrence by allowing the agency to impose civil penalties, while simplifying FTC enforcement.”

“Because of this additional regulatory attention and the increase in consumer litigation related to misleading endorsements,” Freund says, “creators need to understand the rules to insulate themselves from liability.”

Expect more focus on content platforms and distribution channels

Established creators understand the importance of diversification when it comes to the platforms on which they build a community. This was made more evident by the whirlwind transformation of Twitter into a privately held company during the last half of the year. Looking into 2023, there are potentially more changes in store for how platforms are regulated that could impact creators.

In the United States, the Supreme Court is taking on a pair of cases that tackle important legal issues surrounding Section 230 of the Communications Decency Act. These cases are also notable because it’s the first time Section 230 has come under scrutiny by the Supreme Court; the Court has denied review in previous cases, apparently waiting for the right set of facts.

Section 230 became law in 1996 and provides a liability shield that prevents providers of interactive computer services, including platforms such as YouTube, Twitter, and Meta, from being treated as the publishers of user-generated content.

“A lot has changed since 1995, which is when a NY court ruling declared that platforms who policed any user generated content on their sites should be considered publishers of–and therefore legally liable for–all of the user-generated content posted to their site,” noted Vivek Jayaram, founder of Jayaram Law. “This ruling, in part, gave birth to the enactment of Section 230.”

The first case before the Supreme Court is Gonzalez v. Google. YouTube is considered an “interactive computer service” and is a subsidiary of Google LLC. YouTube, as with most (if not all) other interactive computer services, relies on recommendation algorithms to serve content to end users. Courts have routinely held that this use of recommendation systems remains covered by Section 230’s liability protections.

The content at issue in Gonzalez involved automated recommendations of ISIS extremism and recruitment content. Twenty-three-year-old Nohemi Gonzalez, an American studying in Paris, was killed in the November 2015 terrorist attacks carried out by ISIS gunmen, in which 129 people died. Gonzalez’s relatives and estate brought a lawsuit against Google, as explained in lower court filings, alleging “that Google, through YouTube, had provided material assistance to, and had aided and abetted, ISIS, conduct forbidden and made actionable by the Anti-Terrorism Act.”

The family says Google’s recommendation system helped ISIS grow. They also argue “the protections of Section 230 are limited to a publisher’s traditional editorial functions, such as whether to publish, withdraw, postpone or alter content provided by another, and do not additionally include recommending writings or videos to others.”

The second case before the Supreme Court is Twitter v. Taamneh. It goes beyond Section 230 and implicates Section 2333 of the Anti-Terrorism Act, which allows U.S. citizens to sue a person or company found to have aided, assisted, or conspired with someone who commits an act of international terrorism. The dispute in Twitter v. Taamneh concerns what role, if any, platforms including Twitter, Meta, and Google played in the January 2017 attack that killed 39 people at the Reina nightclub in Istanbul, Turkey.

The Supreme Court is presented with a narrow question, or sometimes a set of narrow questions, in dispute at the lower courts. This is called the “question presented,” and it is what the Court will focus on answering. For Gonzalez, it’s a singular, two-part question: “Does section 230(c)(1) immunize interactive computer services when they make targeted recommendations of the information provided by another information content provider, or only limit the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information?” Put more simply, the question is whether Section 230 protects recommendations or is limited to traditional editorial functions.

For Twitter v. Taamneh, the Court is presented with two questions:

  1. Whether a defendant that provides generic, widely available services to all its numerous users and “regularly” works to detect and prevent terrorists from using those services “knowingly” provided substantial assistance under Section 2333 merely because it allegedly could have taken more “meaningful” or “aggressive” action to prevent such use.
  2. Whether a defendant whose generic, widely available services were not used in connection with the specific “act of international terrorism” that injured the plaintiff may be liable for aiding and abetting under Section 2333.

Twitter filed its brief in November. Gonzalez’s relatives and estate filed their brief in December. More filings and proceedings should come in early 2023.

Looking back to the early rulings involving content-sharing platforms, Jayaram argues: “Amidst substantial uncertainty, Courts and practitioners alike need legislation that acknowledges the importance of social media platforms while also balancing the need to hold platforms responsible for certain kinds of conduct. A duty of care approach seems most reasonable.”

In the European Union, the European Parliament approved the final versions of the Digital Markets Act (DMA) and the Digital Services Act (DSA), in March and July respectively. Much in the same way the General Data Protection Regulation, or GDPR, and the California Consumer Privacy Act, or CCPA, changed privacy practices for businesses around the world, the DMA and DSA will impact creators and their businesses regardless of whether they are based in or operating out of the EU.

The DMA seeks to regulate large tech platforms, such as Google and Meta, and the control those platforms have as “gatekeepers” through a set of affirmative obligations and prohibitions. The DMA broadly covers topics including the access, control, portability, and interoperability of data (personal, advertising, and more) and systems (such as app stores and communication tools).

Regardless of whether platforms are based in the EU or elsewhere in the world, the DSA creates additional, more specific regulations over the content that is uploaded to and distributed by online platforms, and over how products and services are marketed and sold within online marketplaces. The DSA also overhauls transparency requirements, such as those governing the use of algorithms in content moderation and IP rights protection.

For creators, the combined implementation and enforcement of the DMA and DSA, which together make up the Digital Services Package, could open up the once-siloed walled gardens in which content distribution has taken place, enabling more control over communities built across platforms. However, the laws could also impact the revenue models and opportunities of technology platforms, which would have a downstream impact on creators.

Platforms forced to take on more liability for the actions of their users may implement stricter content standards and enforcement mechanisms, potentially impacting creators’ ability to create and share content in the same manner afforded under previous laws. However, the new rules will require more transparency from platforms, as well as the establishment of safeguards intended to protect fundamental rights, including freedom of speech and expression. Ultimately, it remains to be seen how much impact regulatory compliance efforts will have on the overall operating budgets and revenue generation of historically highly profitable technology platforms.

Beyond the hype: Web3, blockchain, and NFT technologies get practical

During 2022, the world witnessed the highs and lows of the web3, blockchain, cryptocurrency, and non-fungible token (NFT) ecosystem.

The promise of 2023 is that creators will begin to witness the evolution of web3 technologies that move past the hype and into the practical, industry-disrupting opportunities. Jellysmack is a prime example of an arguably web3-focused company through its use of AI technology to guide decisions and automation, as well as the interoperable service it offers to connect disparate content distribution platforms.

Alongside the opportunities available, the risks of web3 continue to make splashy headlines as regulators enforce regulations and laws against creators and influencers.

In June 2021, Kim Kardashian posted on Instagram to promote the EthereumMax cryptocurrency token (EMAX) in exchange for $250,000. As a result of the posts, she agreed to pay a $1.26 million settlement to the U.S. Securities and Exchange Commission (SEC). Kardashian had failed to disclose the $250,000 payment to her 255 million Instagram followers, violating Section 17(b) of the Securities Act, which makes it “unlawful for any person to promote a security without fully disclosing the receipt and amount of such consideration from an issuer.”

Ripple v. SEC presents an opportunity for a court to weigh in on the burning question of whether a crypto asset is a security. How cryptocurrency, NFTs, and related blockchain technologies are classified will significantly impact creators as the technologies become integrated into the creative process from start to finish, such as proof of authenticity or ownership, and opportunities for crowdfunding, revenue generation, and profit-sharing. For example, in Kardashian’s case, the SEC determined the EMAX tokens were offered and sold as investment contracts, and therefore securities, under Section 2(a)(1) of the Securities Act.

Kardashian, along with fellow celebrity endorsers Floyd Mayweather and Paul Pierce, also faced lawsuits filed by EthereumMax investors claiming they fraudulently misled people into investing in EMAX tokens. Although the case has been dismissed, similar lawsuits may pop up in 2023 over similar endorsement activities.

Creators should be mindful of getting involved with any heavily regulated industry that triggers regulatory oversight, such as promoting or selling products under the watch of the Food & Drug Administration (FDA) or advertising to children. This is in addition to the more commonly known Federal Trade Commission Act requirements to disclose a compensation arrangement in exchange for social media posts, as detailed in the FTC’s social media influencer guidelines. It remains to be seen whether any additional action is taken by the FTC.

The U.S. Copyright Office (USCO) and U.S. Patent and Trademark Office (USPTO) have launched a joint study to examine the impact of NFTs on existing IP law and policy, as well as opportunities for future development in the space. The offices have started collecting written input from across the industry as part of a public comment period that closes in January 2023. The information-gathering phase of the study also features three public roundtables in January 2023 where a select group of representatives will discuss the issues live.

Where does the future of the creator economy lead, especially during a pivotal moment for Web 2.0 technologies and platforms and the emergence of web3? It remains to be seen, as laws and regulations continue to heavily influence the direction of the web.

An uncertain future for artificial intelligence

2022 saw the largest explosion yet of tools powered by artificial intelligence and machine learning (ML) technologies. Previously private, proprietary technologies, such as OpenAI’s DALL-E, Midjourney, and Stability AI’s Stable Diffusion, entered public betas and full-blown commercial offerings, unlocking a previously untapped opportunity for creators to experiment with a new way to create content.

Adobe already incorporates its AI and ML technology, named Adobe Sensei, across most of its Creative Cloud software offerings. Microsoft recently announced Microsoft Designer, its AI-powered tool that appears to rival similar tools available in Adobe Express.

But where does this development leave creators legally?

That question is best examined through two buckets: (1) the input; and (2) the output. The development of AI tools has caused concern among creators who fear that their works have been scraped, copied, and fed into software to train an AI model without their authorization, knowledge, or compensation. There are not currently any exceptions that would permit this activity in the U.S., European Union, and other parts of the world. The United Kingdom is an outlier in that it offers what’s called a text and data mining (TDM) exception for researchers and academics who are developing AI tools, with plans to expand the exception beyond academic and research purposes.

As for the output of an AI tool, copyright laws in the U.S., the European Union, and other parts of the world do not recognize non-human creators. This means works that are entirely AI-generated may not be eligible for copyright protection, potentially impacting ownership, licensing, and related monetization options for creators. The problem for creators relying heavily on AI tools that either entirely or significantly supplement the creative process is that there is no clear guidance on where the line is drawn in determining whether copyright protection will be available. (The U.K. is again an outlier in that it recognizes non-human authorship and extends copyright protection to generative works.)

Kristina Kashtanova, an AI researcher and creator, used Midjourney to generate artwork she compiled into a graphic novel she wrote and designed, Zarya of the Dawn, documenting the process on her Instagram. She received media attention touting her work as the first-ever successful registration of generative art.

However, shortly afterward, the USCO threatened to cancel her registration unless she provided sufficient evidence and documentation of her involvement in the creation process. The outcome likely won’t be known until sometime in 2023, and it could be argued further if disputes arise over how the USCO interprets the Copyright Act, and its authority to register creative works, in light of works that incorporate, or are entirely comprised of, AI-generated material.

Creators should make the decisions that are best for their business when it comes to the use of AI tools. It could be riskier to incorporate AI into paid endorsements or projects licensed under terms that would put the creator on the hook for damages if issues arise, such as claims of copyright infringement or name, image, and likeness violations, due to the use of an AI tool.

Creators and their business teams should think carefully through the indemnification and limitation of liability sections of contracts, as those sections are what would end up dictating the extent of a creator’s obligation to make things right for licensees or third parties making claims.

It’s also worth noting that Shutterstock this year announced a partnership with OpenAI to offer text-to-image generation services while simultaneously prohibiting the sale of generative works created with third-party AI tools, given the inability to validate the models, and the underlying training data sets, that were used. Getty Images also announced a ban on generative works being sold on its platforms. Meanwhile, Adobe Stock announced its own set of standards that, if met, allow generative works to be submitted as stock content.

Creators can review, and potentially implement, their own set of standards based on the announcement from Adobe Stock. The proposed standards offer insight into how creators may appropriately disclose the use of AI tools to partners, including brands and agencies hiring them for specific content creation work.

The impending recession is already tightening advertiser budgets

Will we, won’t we, or are we? It seems like there’s a weekly, if not daily, debate over the existence of a current or looming global economic recession. Coming off the post-pandemic economic boom, creators may be facing the first lengthy economic downturn since the Great Recession of 2008. Advertiser budgets are bending under pressure, and the world’s largest ad-buying firms are projecting a slowdown.

“We have seen a significant ebb and flow for the creator community over the past few years,” Michael Ransom, an entertainment and IP attorney in Nashville, says. “In the early days of COVID-19, many creators were forced to innovate in order to stay afloat, with many finding significant success in creating new markets for their craft. The resulting landscape is particularly favorable for creators due to the large number of creative outlets and a readily available audience.”

Where does a recession leave creators and ad spend, one of the main sources of revenue for the creator economy? If the 2% decline in YouTube’s ad revenue for Q3 2022 is any indication, advertising revenue pools may dry up, shrinking the revenue split between creators and platforms, and thus the payouts to creators.
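
To see how directly a shrinking pool hits creators, here’s a back-of-the-envelope sketch. It assumes YouTube’s publicly stated 55% creator share for long-form ad revenue; the pool figures themselves are hypothetical.

    # Back-of-the-envelope math: with a fixed revenue split, any decline in
    # the overall ad pool passes straight through to creator payouts.
    # The 55% creator share reflects YouTube's published long-form ads split;
    # the dollar figures are hypothetical.
    CREATOR_SHARE = 0.55

    def creator_payout(ad_pool: float) -> float:
        """Creator earnings from a given advertising revenue pool."""
        return ad_pool * CREATOR_SHARE

    before = creator_payout(100_000)  # hypothetical quarterly ad pool
    after = creator_payout(98_000)    # the same pool after a 2% decline

    print(f"Before: ${before:,.0f}  After: ${after:,.0f}")
    print(f"Payout decline: {(before - after) / before:.1%}")  # 2.0%

The percentage decline lands on creators unchanged; the split only determines how many dollars that percentage represents.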

In addition to potential revenue decreases, the state of the economy impacts staff and workers at companies that operate within the creator economy. Layoffs have already impacted tens of thousands of workers, including people at Patreon, Snap, Meta, Twitter, and Jellysmack. TikTok remains an outlier by expanding its workforce.

Ransom adds that “2023 could, however, see some challenges for the creative community given the direct correlation between consumer spending and recession-like conditions, which could force the market for creativity to contract as consumers allocate resources elsewhere.”

Additionally, the use of tools that facilitate creator and brand efforts to link audiences across platforms, ranging from simple tools like Linktree to robust partnerships like Jellysmack, will become even more vital to the continued success of creators looking to expand their businesses throughout 2023.

Creators can start by focusing on diversifying their revenue channels, if they haven’t done so already. Ensuring proper protection of a creator’s brand assets, both from an IP standpoint and from a contractual standpoint, should also be a point of focus during 2023. Creators should think carefully about exclusivity deals tied to revenue splits or royalty payments, as opposed to upfront payments that guarantee a creator sees the income. On the flip side, brands may focus on structuring sponsorship and other advertising deals in ways that limit their upfront financial exposure for campaigns.


Franklin Graves is an in-house technology, IP, and media law attorney based in Nashville, Tenn. He runs Creator Economy Law, a weekly newsletter and blog focused on breaking down all things legal across the creator economy space. He also enjoys providing volunteer legal assistance to creators. He can be reached at franklin.graves@gmail.com.
