Gossip can be fun, but also unlawful?

Recently, New York Times reporter Mr. Jacob Bernstein was overheard at a party calling Mrs. Melania Trump a “hooker.” Although he subsequently apologized, the question is what legal liability, if any, he or The New York Times has for his statement. In these times of fast and loose media stories, the question is timely for media professionals and consumers of news alike.

The First Amendment does not protect all speech. One category of unprotected speech is defamation, an actionable tort. To prove defamation, a plaintiff must generally show that a false statement of fact was communicated about him or her and that the statement injured his or her reputation in the community. Proof of damage can include, but is not limited to, lost sales for a business. In the case of defamation per se, however, a plaintiff need not show damage because the statement in question is considered harmful on its face. Classic examples of defamation per se include calling someone a “bank robber” or a “prostitute.”

Of course, context matters. Where the statements are made under the guise of parody, or the words, when read in context, do not mean what they would otherwise mean in isolation, there may be a potential defense against liability. Barring such context, however, legal liability generally exists. This is true even if you republish the defamatory statement or if the statement was made by one of your employees during the course and scope of their duties to you.

That being said, it is harder to prove defamation if you are a public figure. In such a case, you must show that the allegedly false statement was made with actual malice, which means that the person knew the statement to be untrue, or made the statement with reckless disregard of whether it was true or false.

Under these guidelines, Mr. Bernstein’s recent statement would be considered defamatory per se. Needless to say, truth is an absolute defense to a defamation claim, so if he could proffer admissible evidence showing his statement to be true, then there would be no liability. Whether his apology absolves him of liability is another question. Defamation law varies by state. In all likelihood, the apology wouldn’t absolve him of liability, but it would be an issue for the jury to consider in determining the amount of compensatory or punitive damages.

Whether The New York Times could be held liable for Mr. Bernstein’s statement is unclear. To hold the paper liable, Mrs. Trump would need to show that Mr. Bernstein made the statement within the course and scope of his employment. Courts use various factors to decide this question. One factual issue would be whether Mr. Bernstein was attending the party on behalf of The New York Times or in his personal capacity. If the former, liability for The New York Times is more likely; if the latter, less so.

Media professionals are under immense pressure to attract views of their content, and the quick way to do that is to run salacious, eye-grabbing headlines. At the same time, the First Amendment’s protections are not infinite for media professionals. Finding the right balance between offering tantalizing news and respecting the lines of defamation is a prudent course, but one that may be at risk of attack in today’s fast-food news environment.

Here come the robot lawyers! Is that a good thing?

What do you call 5,000,001 lawyers at the bottom of the ocean? A good start! Hah! While there may be some truth to the joke given the unprofessional behavior of many lawyers, the question is whether we want robots replacing human lawyers as decision makers for legal — and policy — decisions with global reach. I think not.

Robot lawyers aren’t pure fantasy. As reported in Who Will Own the Robots, an article in MIT Technology Review, Narrative Science is a Chicago-based company that is “able to take data — say, the box score of a baseball game or a company’s annual report — and not only summarize the content but extract a ‘narrative’ from it.” Imagine that Google — or Microsoft! — creates an app called “Robot Lawyer.” You download the app, choose your accent (Chinese, Russian, or, um, So Cal surfer), input your facts, and then ask your legal questions. The input-output algorithm of the Robot Lawyer system resembles the means-end reasoning of the human mind.

But does it? Have you ever seen War Games? If you haven’t, in the movie the Department of Defense (“DOD”) replaces human fingers on the nuclear launch buttons with a computer. With everything governed by that artificial intelligence, the chances of error decrease, right?

Wrong. In the movie, Matthew Broderick unintentionally — and easily (surprising, eh?) — hacks the government’s server. He ends up playing a virtual game of thermonuclear war with Joshua, the computer the DOD created to play war games without actually waging nuclear war. When Broderick intentionally launches a virtual nuclear attack from the U.S.S.R. on the U.S. through Joshua, the DOD generals, lacking computer knowledge, think it is a real nuclear attack.

Why? Because those generals were living in their virtual tech cave — a system closed off from objective reality — and were tempted to trust that system rather than wait for reports from humans in the field to see if bombs were actually dropping. (Not only that, but the generals didn’t understand the workings of Joshua. Only Joshua’s father and Broderick did, which is why these outsiders saved the world from catastrophe.)

No matter how rational lawyers make the legal system appear, it is not. External political factors can change the outcomes of cases that, without these unpredictable intervening influences, would be more susceptible to prediction by simple law-applied-to-facts reasoning. In this respect, law practice is more akin to a humanistic art form merged with soft science than to a pure mathematical system. While artificial intelligence can supplement the art of human decision making, it cannot replace it.

That’s why I’d prefer a creative, principled, and savvy Atticus Finch from To Kill a Mockingbird as my lawyer over his Robot Lawyer counterpart any day.

The emperor still has no clothes!

Recently, a jury found Mr. Ross Ulbricht guilty of running the black market website Silk Road. Many observers claim that the government’s theory expanded liability online for third parties like Mr. Ulbricht. As I mentioned in a recent GizMoto interview, the government’s theory of liability wasn’t new, but “whether the government obtained the evidence that they wish to use to prove this narrative . . . in a lawful way consistent with the Fourth Amendment” is still up for debate.

On Silk Road, you could buy everything from cyanide to marijuana to, yes, some say, hit men! The site was dubbed the Amazon of the black market. While diary entries from Mr. Ulbricht showed that he initially intended to launch the site so that he could sell mushrooms, the factual issue at trial was whether he was the infamous Dread Pirate Roberts who continued to captain the site after it was up and running — and after Mr. Ulbricht supposedly bailed out.

Some have claimed that the government’s theory of liability “would expand legal liability for commerce in contraband online,” and that the outcome of the trial shows that “anonymity is dead.” Under this view, it is a slippery slope to hold Mr. Ulbricht liable for the conduct of people on Silk Road. That means all folks running websites have to be nannies who oversee all that is done on the site or risk criminal prosecution.

Maybe so. The Silk Road verdict makes it tougher to be a libertarian provider of a virtual platform where people can freely — and anonymously — transact. The freewheeling atmosphere on Silk Road was facilitated via the use of Bitcoin as the medium of payment. Some in the financial industry have sought similar anonymity with their “dark pool” methods of trading, where “the trading volume created by institutional orders . . . are unavailable to the public.” Dark pools, too, have come under legal scrutiny.

Contributory liability under copyright makes a third party — here Mr. Ulbricht — liable for infringements that occur under his control and of which he is aware, or should be aware. There is no intentional ostrich defense — “I chose not to see or hear criminality!” — to such liability, nor is there such a defense to aiding and abetting violations of federal law. If Mr. Ulbricht was, in fact, Dread Pirate Roberts, then he intentionally facilitated the illegal transactions. In this respect, the case did not “expand legal liability for commerce in contraband online,” and so the emperor still has no clothes, contrary to what others say.

However, Silk Road did suggest new methods of potential government overreaching in the digital age. According to some pundits, the F.B.I. was mysteriously able to uncover the Silk Road servers via a software flaw on the site’s login page that, in turn, revealed an IP address. That IP address supposedly led the feds to Iceland, where the server for Silk Road was located. Whether the feds’ pursuit of this cookie-crumb trail violated the Fourth Amendment is an issue that will likely be raised on appeal.

Regardless of the outcome of that appeal, Silk Road illustrates the tension between being able to conduct business in private online without the government unlawfully snooping, and society’s interest in regulating virtual transactions that have negative externalities — nasty effects — on all but the transacting parties.