If you are an author, or know an author, something huge just happened in the first of more than a dozen lawsuits against one of the largest AI companies in the world, and it directly affects anyone who makes a living as a creative professional. Anthropic just agreed to pay a $1.5 billion settlement for pirating, literally stealing, just about every copyrighted book in the world to train its AI. It is offering authors just $3,000 per book to drop their case.
A lot of authors are overjoyed by the prospect of getting a little cash without doing any work. We, after all, are historically beaten down by publishers and media conglomerates, so it feels pretty good to cash a check. With six of my books in the database, I stand to make a cool $18,000 from this settlement.
But I have to tell you that the legal maneuvering between Anthropic and the Authors Guild to arrive at this number is a complete and total sham.
That’s because even though this settlement is valued in the billions, it’s almost comically, even insultingly, low. It sets a price precedent that every AI company that comes after can use if it’s caught stealing our work. And, most importantly, copyright law clearly states that obvious and admitted theft at this scale requires a penalty severe enough to deter future crimes. Congress set that number not at $3,000 per work, but at up to $150,000.
To be clear: Anthropic literally pirated authors’ books by downloading them from pirate sites like Anna’s Archive in order to train an AI program that is designed to make human-created writing obsolete. In sworn testimony, its people admitted that they downloaded the books illegally both to save money and to train their AI.
To get up to $150,000 in damages, the plaintiff needs to prove “willful infringement,” which the law defines as knowing that the “acts infringed the copyright,” or acting with “reckless disregard for, or willful blindness to, the copyright holder’s rights.”
They literally stole millions of books without one single thought as to how those millions of authors might feel about their work being used to train their replacements. If that doesn’t constitute “reckless disregard” then I don’t know what does.
Don’t just take it from me. Take it from my lawyer.
“If your book is registered within three months of first publication, or prior to infringement, you have the option to elect what’s known as statutory damages, which is up to $150,000 per work. So, for instance, if you’ve got 10 works, you’d be able to seek $150,000 for each of those books,” says James Bartolomei of the Duncan Law Firm.
With six books meeting the criteria, that would mean Anthropic should owe me up to $900,000, which is a helluva lot more than $18,000.
Remember: while this is the largest proposed settlement in copyright history, it’s ALSO the largest copyright THEFT in history. Anthropic used our books to train its program, and the company is now worth $183 billion. In fact, at the beginning of September it raised an additional $13 billion from investors.
Meanwhile, the entire book publishing industry, meaning every mainstream publishing house, academic journal, self-publishing group, and trade publisher combined, was worth only $129 billion in 2025.
That’s right: a single company with just over 1,000 employees, one of several doing the same thing, that stole all of our combined work, is worth more than all human-derived publishing.
$1.5 billion is not even a speed bump. It is permission. It is simply an expense that they will write off on a single quarterly report.
If you take this settlement, you have given up on the intrinsic value of your work. You have decided that a $3,000 payment is all your career was worth anyway.
On the other hand, if you want to fight the unfolding injustice, there is something you can do about it. I have a game plan.
The first thing you need to do is go to this website and register the books they might have used to train their AI so that you get onto their list. Put in your contact information and fill out all the necessary details. Even if you want to TAKE the settlement, you need to do this step.
(FLASH EDIT: They just updated this page so you can now check if your book is specifically in their database. Look it up here.)
Then, if you want to opt out, you have two options. The first is to hire a lawyer and continue the case with your own legal team, at your own expense. Lawyers cost between $300 and $700 an hour, and expenses add up quickly.
The other option is to join with other authors and opt out together, in hopes of either taking this case all the way to the Supreme Court or settling down the road for a larger sum of money.
Personally, I’ve retained James Bartolomei at the Duncan Law Firm, who has been fighting for authors’ rights for more than 20 years. James is working on contingency, which means you won’t pay anything to retain his services; he will recoup his expenses when we win. And I have no doubt that we will win more than the pittance Anthropic has offered.
I’d love to have you join the case, and I’m looking for between 50 and 500 authors to combine forces, because I believe that we’re stronger together.
If you want to join on, or know someone who might, please contact James at James@duncanfirm.com and you can cc me as well at scott@foxotpus.ink.
I don’t know what the final outcome of this fight will be. But I do know that if we don’t rise up and fight right now, the AI companies will sell us right down the river in the future.