Rumor #1: Walrus sues for speech rights.

We wish you a bright, refreshing, and spirited jump into 2026. 

Technology keeps advancing. From banking to marketing, more companies are using AI tools like ChatGPT. But with these uses come growing risks. Among them are deepfakes and AI hallucinations.

Rumor #1: Newsweek reports that a walrus is suing for speech rights. The real story is that the First Amendment can be invoked by animal rights advocates, among others. Rumor #2: A Time article says the moon is a deepfake and that all its images, along with stories about it, were manufactured by AI. The real story is that AI can be used to better understand the moon. 

How should you deal with these legal issues in the new year? One way is to understand the nature of the problem. In a recent posting, the cybersecurity firm DeepStrike shows that these fakes increased from 500,000 in 2023 to 8 million in 2025. The consequences of a deepfake can be dire: by some estimates, including from OECD.AI, the costs of deepfake scams were slated to soar to $10.5 trillion in 2025. Precise figures are hard to pin down, as some deepfake use is knowing. The same is true of hallucinations, which cost companies approximately $67 billion in 2024. And those are only the cases that have been reported.

In the meantime, please don’t hesitate to reach out to us with any questions within our expertise.

False Advertising: Undisclosed Chatbot Use?

Classes at institutions like Stanford are examining what it means to live in a “post-truth world.” Who cares about accuracy? ChatGPT may not. It could say: “only your dog’s seller will be legally liable if it bites the newspaper delivery boy, not you.” 

However, businesses should care.

Legal liability for using bots, among other technologies, can be affected by various regulatory regimes. In September, the FTC opened an investigation into the ways companies use chatbots to interact with their customers, particularly minors. The concern is that the interactions can be deceptive, much as when public figures wear certain fashion garments without disclosing that they are paid sponsors. The concern is particularly apt given a recent suicide allegedly caused by a chatbot.

More recently, proposed California legislation could affect your business, too. Under the proposed rule, companies would be required to disclose to some users that they are interacting with a chatbot. Whether and how this rule — which may be adopted by other states — will be implemented remains to be seen.

That being said, analogies to existing bodies of law exist. Fraud, negligent misrepresentation, and various unfair competition laws cover much of this behavior, albeit in non-bot cases. Nonetheless, using a bot is akin to teaming with a dog or a human independent contractor. While ChatGPT may say truth is dead, it still matters under the law.

ChatGPT–Original or Copied?

ChatGPT and originality. Some say you can use ChatGPT, which trains on copyrighted material, without fear of being sued. Others disagree. Which direction should you take when deciding whether to use this powerful technology? 

As you may have heard, the Authors Guild recently obtained a settlement over the infringement of copyrighted works by AI. This case is relevant to your company’s use of the technology. Does AI enable more copyright infringement? What is its effect on creativity? These are some of the questions to consider, from a legal standpoint, when using AI.

Recently, the Council of Fashion Designers of America published a piece I penned. It touches upon these legal issues — albeit in the design process. 

Liable? Unknowing deep fake use.

Deepfakes are everywhere. From images to speech, it’s hard to avoid these digital counterfeits. They can make it look like you are in a desert — even when you aren’t. The problem has even appeared in scientific publishing, where fake papers are proliferating. But what is your company’s liability for unknowingly incorporating such content? And what are some ways to navigate this new digital mountain range?

I recently delved into this issue in an article penned for The National CIO Review. Do you have a CIO? If so, what is his or her liability to the company for deepfake use? Does Mr. Elon Musk have legal recourse against you if a fake version of his voice suddenly appears in your video game or his image seeps into your news story?

Cookies: They Can Crumble You

Cookies. They aren’t only the things you eat after lunch. They are also electronic means of tracking visitors to your website. A good definition is here. There are various types — including ones that stick to your web-surfing activities, known as “persistent” cookies.
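To make the distinction concrete, here is a minimal sketch, using Python’s standard http.cookies module, of how a website marks a cookie as persistent (the cookie names and values are hypothetical, for illustration only):

```python
# Sketch: session vs. persistent cookies via the Set-Cookie header,
# built with Python's standard http.cookies module.
from http.cookies import SimpleCookie

cookie = SimpleCookie()

# A "session" cookie: no Max-Age or Expires attribute, so the browser
# discards it when the visitor closes the browser.
cookie["session_id"] = "abc123"

# A "persistent" cookie: Max-Age keeps it on the visitor's device
# (here, 30 days), letting the site recognize a returning visitor.
cookie["visitor_id"] = "xyz789"
cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 30

# Render the Set-Cookie header lines the server would send.
headers = cookie.output()
print(headers)
```

The only difference on the wire is that one header line carries a Max-Age (or Expires) attribute — that attribute is what makes a cookie “stick.”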

Why does this matter? The cookies your website uses can violate state and federal privacy laws. Recently, for instance, a judge permitted a class action lawsuit against CNN to proceed over its use of cookies — even though the company had a pop-up banner. 

So the contours of these legalities are quite tricky. Due care should be taken in your cookie strategy — especially as your business scales into different states, particularly California, where the California Invasion of Privacy Act can get your business fined or worse.

Quicksand ahead? Foresight can help.

Software has become a part of our daily lives. Some of you may have invested in software companies — or even social media like Meta. But can software companies be liable for the health issues that afflict their users? It’s a relevant question whether you use social media or invest in it. So having some foresight into these legal issues is invaluable to avoid legal quicksand.

New York City recently sued Meta, among others, in California. I penned an article for the U.K.-based Society for Computers & Law that delves into the case. In short, N.Y.C. seeks to hold companies like Meta liable for the negative health effects that the city alleges afflict its residents. Where the lawsuit ends up will have ramifications for the AI marketplace, among other technologies.

Content and Tech — or vs. Tech?

As we start the year, I thought it would be good to raise for your attention this question: is your business content strategy helped by your technologies — or is it hindered? Food for thought during your next company outing.

Think about this. Former Google CEO Mr. Eric Schmidt co-authored the book “The Age of AI,” which is about the benefits — and costs — of AI. The book has received a lot of favorable press from the likes of the Wall Street Journal.

What if Mr. Schmidt had written the whole book using ChatGPT? And what if the technology drew on data — writings — from your company’s stash of information about AI? That raises copyright considerations. What is more, there is a slew of marketing material out there offering artificial reviews of your content, as related in this FTC warning.

So when you are thinking about using a bright, shiny technology to prepare your content, due care should be taken in 2025. Outsourcing — automating — content creation is quite popular.

But there remain legal issues to consider.

In the meantime, please reach out to me with any legal questions within my expertise. My office is here to navigate your business through this thicket of tech legalities.

AI and finance: friends, foes, or somewhere in between?

Artificial intelligence (“AI”) has been all the craze — particularly in fintech. From banks to apps, AI is increasingly being used for data analytics, among other things. It is also being used in commercial banking. But what are some of the issues in using AI for the financial aspects of your business? The concern is all the more apt for large data sets — when you receive complex information from different sources. How can you be sure that the data you are using is reliable? Find out more in this podcast I launched with the London-based AI Accelerator Institute: an interview with Mr. Daniel Wu, an AI expert from JP Morgan.

No music license needed! AI scraped it.

Music licensing. Whether it’s a Willie Nelson sample or something else you want to use in your company’s advertisement, permission is generally needed. But does artificial intelligence (“AI”) change this? Some say AI can crawl samples and mash them up — making licensing unnecessary.

That’s until your company gets sued for copyright infringement.

To find out more, an understanding of music permissions is needed. Please click here for a short presentation from the Ella Project in New Orleans. As you will hear, music licensing is tricky.

Walmart has your number: why?

Your consumer data: how does Walmart use it? From your computer’s location to your buying habits, the commercial behemoth stores this information. Walmart isn’t alone — others you buy from, online and off, use your data for predictive analytics.

How can you use consumer data to help with the growth of your business?

Find out more. Please click here for an interview I did with Mr. Abuchi Okeke, a software engineer from Walmart. It’s part of a series I launched with the AI Accelerator Institute — titled “AI Keyhole” — about AI’s evolution, applications, and policy.