Policymakers increasingly ask Internet platforms like Facebook to “take responsibility” for material posted by their users. Mark Zuckerberg and other tech leaders seem willing to do so. That is in part a good development. Platforms are uniquely positioned to reduce harmful content online. But deputizing them to police users’ speech in the modern public square can also have serious unintended consequences. This piece reviews existing laws and current pressures to expand intermediaries’ liability for user-generated content. It discusses three ways that poorly designed laws can do damage — to First Amendment-protected online speech, national security, and the economy.
Atty-Judge Facebook Friendship
The Florida Supreme Court, in a 4-3 decision, ruled that a judge is not automatically disqualified from a case merely because of a Facebook friendship with one of the attorneys appearing before her. In Law Offices of Herssein and Herssein v. USAA, the Florida high court stated that
it is commonly understood that Facebook “friendship” exists on an even broader spectrum than traditional “friendship.” Traditional “friendship” varies in degree from greatest intimacy to casual acquaintance; Facebook “friendship” varies in degree from greatest intimacy to “virtual stranger” or “complete stranger.”
The court went on to find that
the mere existence of a Facebook “friendship” between a judge and an attorney appearing before the judge, without more, does not reasonably convey to others the impression of an inherently close or intimate relationship. No reasonably prudent person would fear that she could not receive a fair and impartial trial based solely on the fact that a judge and an attorney appearing before the judge are Facebook “friends” with a relationship of an indeterminate nature.
In contrast to the majority view, the dissent recommended adopting
a strict rule requiring judges to recuse themselves whenever an attorney with whom they are Facebook “friends” appears before them.
Selfie as Alibi
One benefit of living your life online is that social media can provide you with a ready-made alibi when law enforcement comes to your job and arrests you for a crime that carries a potential 99-year prison sentence. That is exactly what happened to 21-year-old Cristopher “CJ” Precopia.
Independent Body to Review Content on Facebook
Here is a New York Times opinion piece discussing Facebook’s plans to create an independent body to review content posted on the platform. The authors, Kate Klonick and Thomas Kadri, liken the idea to a Supreme Court for Facebook.
NYTimes.com: How to Make Facebook’s ‘Supreme Court’ Work
This review starts with a historical overview of trial by jury and then moves to a discussion of media and communication. This is followed by an examination of the advantages and disadvantages associated with jurors and digital technology. The heart of the article is a review of six scholarly studies that attempt to explain why jurors use the Internet, as well as methods for combating such use. The article concludes with recommendations for future areas of research.
Facebook seeks a highly motivated and experienced team player to serve as Product Counsel on Facebook’s product legal team. The position will focus on providing strategic and tactical legal counseling to the product and business teams that build tools for Facebook’s business integrity efforts. This is a great opportunity to join a growing legal team and to work on novel issues in an exciting, fast-paced environment.
Responsibilities:
Review products, features and initiatives to assess legal compliance across multiple jurisdictions
Counsel product, marketing, engineering, and other business teams on legal issues related to the provision of online advertising products to help ensure compliance with consumer protection laws, privacy laws and regulations, and other legal requirements
Coordinate with legal, public policy and communications colleagues on multi-disciplinary issues
Coordinate with content policy, operations, legal and product teams to implement and enforce content standards
Support escalations on advertising content assessment and takedown requests
Qualifications:
J.D. degree and membership in at least one U.S. state bar
4+ years of legal experience at a law firm or in-house (including applicable litigation, regulatory or product counseling experience)
Experience working on multiple different projects simultaneously and building consensus across cross-functional stakeholders
Experience advising clients on risk mitigation across technology product platforms
Content liability counseling experience related to CDA Sec. 230 or related international frameworks
Experience with online advertising industry, including from the perspective of social media or other internet companies, ad agencies, advertisers or publishers, or political advertising
Experience with international legal requirements, including those related to data privacy, and how to work effectively across multiple jurisdictions