By Jon Polenberg, JD.
Artificial Intelligence is no longer the future; it’s here, chewing through industries, spitting out data, and rewriting rules faster than accountants and other financial services professionals can keep up. From automating processes to analyzing massive datasets, AI is reshaping the accounting landscape while lawyers and judges scramble to decide who’s responsible when it all goes wrong. From intellectual property spats to bias allegations, AI is breaking down courtroom doors, and the legal system isn’t ready.
Who Owns What AI Makes?
AI-generated music, art, inventions—who owns the rights to these? Are they even rights at all? In Thaler v. Perlmutter, a U.S. court ruled that works without human authorship can’t be copyrighted. If a machine spits out something brilliant, tough luck—the law doesn’t want to protect it. AI-generated outputs like financial reports, projections, or even decision-making insights can raise thorny questions about ownership and intellectual property.
But what about the other side of the coin—the data AI ingests to become brilliant? Lawsuits against OpenAI and Stability AI claim their models train on mountains of copyrighted books, art, and music without permission. If the courts agree, it could spell disaster for the AI economy and disrupt the availability of AI tools relied upon in financial modeling and auditing.
So, here’s the tension: AI tools need massive datasets to learn, but handing them free rein could gut creative incentives. It’s a messy battle between the tech and creative worlds, and no one’s sure where the law will land. Accountants must stay informed, as these decisions could reshape not only the tools used daily in their workflows but also how they evaluate client balance sheets.
When AI Fails, Who Bears the Costs?
Imagine this: an AI-powered accounting system hallucinates expenses, leading to financial restatements or tax penalties. Who is liable? The software vendor, the developers, or the end user? Courts must decide whose neck to stretch. In healthcare, an AI misdiagnosis can be a matter of life and death. In finance, AI loan algorithms deny mortgages to qualified applicants. Someone must answer for these failures—but who?
This is where the law starts to buckle. Courts are testing old-school theories like negligence and strict liability, asking: Was the failure foreseeable? Should developers bear that burden, or do human operators get the blame? The answer hinges on a bigger question: Who made the critical decision—a human or the machine? If the line is blurry, get ready for fireworks in courtrooms nationwide.
The Bias Problem: AI, Fairness, and Discrimination
They say AI doesn’t discriminate. They’re wrong. AI is only as fair as the data it chews on, and biased data means biased results. Plaintiffs are already lining up with lawsuits against AI tools that disadvantage people based on race, gender, or other protected traits—particularly in hiring, lending, and housing.
Employers using AI to screen candidates are now staring down claims under Title VII of the Civil Rights Act. In finance, companies deploying biased underwriting tools are getting hit with lawsuits invoking the Equal Credit Opportunity Act. The takeaway? AI bias is no theoretical problem. It’s real, it’s costly, and plaintiffs’ lawyers are sharpening their knives. Auditors poking around these systems must make sure they’ve got the tools and safeguards to sniff out bias and squash it before it causes trouble for clients.
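One concrete screen auditors sometimes start with is the EEOC’s “four-fifths rule”: a selection rate for any protected group below 80% of the most-favored group’s rate is treated as evidence of adverse impact. The sketch below shows the arithmetic only; the group labels and counts are invented for illustration, and passing this check is not, by itself, proof of compliance.

```python
# Minimal sketch of a four-fifths (80%) adverse-impact check.
# Group names and applicant counts are hypothetical audit data.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants the automated tool approved."""
    return selected / applicants

def four_fifths_check(rates: dict) -> dict:
    """Flag each group: True if its rate is at least 80% of the best group's."""
    best = max(rates.values())
    return {group: rate / best >= 0.8 for group, rate in rates.items()}

# Invented example: approvals per group from an AI screening tool.
rates = {
    "group_a": selection_rate(60, 100),  # 0.60
    "group_b": selection_rate(40, 100),  # 0.40
}

print(four_fifths_check(rates))
# group_b fails: 0.40 / 0.60 ≈ 0.67, below the 0.8 threshold
```

A check like this is cheap to run against any tool that outputs approve/deny decisions, which is why it often appears early in an audit workstream, ahead of deeper statistical testing.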
Privacy: The Data-Training Feeding Frenzy
AI is a data glutton, and its appetite is getting companies sued. To train generative AI models, developers scrape the internet, vacuuming up everything from copyrighted texts to personal information. But people are fighting back.
Take Illinois’ Biometric Information Privacy Act (BIPA)—a loaded gun for plaintiffs. It’s aimed at AI systems processing biometric data like facial recognition tools. Companies that overstep face massive penalties and class-action lawsuits.
Meanwhile, AI firms are testing the limits of publicly available data. Just because something’s online, does that mean it’s fair game? Courts are having to reconcile privacy doctrines crafted in a pre-AI world with the reality of machine learning.
Accountants need to grasp how their organizations and clients handle and shield data, especially when AI is in the mix for things like spotting fraud or running payroll. The stakes for slipping up—financially and reputationally—could be massive.
The Litigation Boom and Missing Rulebook
There’s no playbook for AI—yet. While the Federal Trade Commission and other agencies are starting to scrutinize AI claims, they’re barely keeping up. Plaintiffs’ lawyers, sensing the vacuum, are rushing in, targeting misleading marketing, unsafe applications, and regulatory gaps.
The European Union’s AI Act is setting the global pace, imposing strict standards on “high-risk” AI systems. But in the U.S., Congress remains paralyzed, more interested in hearings and grandstanding than crafting meaningful rules. Until lawmakers act, the courts will draw AI’s boundaries—one lawsuit at a time. These shifts are bound to shake up risk assessments, disclosures, and compliance audits.
Accountants’ Role and AI Litigation’s Future: Balancing Innovation and Accountability
AI is stretching the law in ways no one predicted. From data theft to copyright to bias claims, courts are scrambling to adapt, and companies are learning fast: transparency, fairness, and compliance are no longer optional. Get it wrong, and you’ll find yourself in front of a judge.
The cases playing out now will decide AI’s future. Will the law stifle innovation, or will it hold reckless developers accountable? Accountants are in a prime spot to help organizations navigate this tangled web, keeping things transparent, compliant, and ethically sound when it comes to AI. By staying sharp and ahead of the curve, they’ll be the ones steering how AI gets used in their organizations and for their clients.
====
Jon Polenberg serves as Vice Chair of Becker & Poliakoff’s Business Litigation Practice. He can be reached at jpolenberg@beckerlawyers.com.