Minnesota solar company sues Google over false information in AI summary

Wolf River Electric’s lawsuit raises larger questions about who is responsible when AI gets it wrong, and whether existing internet laws can keep up.

The Minnesota Star Tribune
June 13, 2025 at 11:00AM
Wolf River is suing Google over a since-removed AI Overview that falsely said the company is part of a lawsuit when it is not. (Alex Kormann/The Minnesota Star Tribune)

A Minnesota solar company is suing Google for defamation, claiming the tech company’s “AI Overview” falsely stated that the solar firm faced a lawsuit from Minnesota Attorney General Keith Ellison.

Ellison did sue four solar-lending companies last year, accusing them of concealing and improperly charging $35 million in fees to Minnesotans since 2017.

However, Isanti-based Wolf River Electric was not a defendant in that suit.

Wolf River filed its lawsuit against Google in March in state court. On Monday, Google filed to move the case to federal court, and the case was assigned to Judge Jeffrey Bryan of the U.S. District Court in Minnesota.

Wolf River is seeking damages of $110 million to $210 million.

The lawsuit provided examples of business losses, including a March 5 incident in which a customer, citing Google’s claim that Wolf River was being sued, terminated a $150,000 contract with the company.

“Members of the public, including potential employees and customers, have viewed the defamatory publications made by Google and have relied upon these false statements in regurgitating, spreading, and further defaming Wolf River, all because of Google’s false and unsupported statements,” the complaint said.

AI Overviews are Google’s generative artificial intelligence summaries that appear prominently in search results, often before listings and even above ads. Written in a conversational tone, the AI’s answer to a search query synthesizes a summary from multiple sources.

Wolf River originally intended to litigate privately via pocket service, in which a complaint is served but not filed with the court, because the situation “was never about publicity,” Nick Kasprowicz, general counsel for Wolf River, said in a statement.

“Google’s decision to remove the case to federal court, thereby making the lawsuit public, was entirely its own,” he said. “While we were not surprised by the move, it made clear that Google preferred to litigate this matter in a public forum rather than resolve it quietly and responsibly.”

According to the complaint, when a user searched “Wolf River Electric lawsuit,” the AI Overview stated the company was facing a lawsuit from Ellison for alleged deceptive sales practices, including “misleading customers about cost savings, using high-pressure tactics, and tricking homeowners into signing binding contracts with hidden fees.”

Wolf River employees discovered the issue in September. The AI Overview has since been removed.

The complaint says Google’s AI cited four sources for the claim, including a Minnesota Star Tribune article about Ellison’s case against the four other solar companies. The story mentioned Wolf River Electric at the end, but did not say the company was included in the lawsuit.

Other cited sources included Angie’s List and a news release from the Minnesota Attorney General’s Office. None of the sources support the claim that Wolf River was sued, according to the complaint.

The AI Overview continued to claim Wolf River was a defendant in the attorney general’s lawsuit for deceptive marketing, high-pressure tactics, hidden fees and installation issues.

Google filed a response to the complaint, denying the allegations.

The AI Overview describes “documents, publications, and internet postings, the contents of which speak for themselves,” the response said. Neither Google’s counsel for the case nor the company responded to a request for further comment.

The broader context

Wolf River’s lawsuit raises larger questions about who is responsible when AI gets it wrong, and whether existing internet laws can keep up.

Its complaint is among the earliest legal tests of false claims made by AI. Just a few weeks ago, OpenAI won a similar case in Georgia against a radio host who accused the company of defamation via ChatGPT.

William McGeveran, dean of the University of Minnesota Law School, said Wolf River faces an uphill battle, and the move to federal court likely favors Google.

“Assuming that they’re going to assert a Section 230 defense, federal courts have been very receptive to those,” he said.

Under Section 230 of the Communications Decency Act of 1996, providers and users of an “interactive computer service” cannot be held liable as the publisher or speaker of content provided by another “information content provider.”

The law generally protects platforms such as Facebook and Google from being sued over content that users or third parties post on their services. In other words, they aren’t typically held responsible for content they didn’t write.

However, the sources the AI Overview cited in Wolf River’s case did not say the solar company was part of the lawsuit — that was an AI hallucination.

Google’s AI Overview challenges Section 230 “because it sounds a lot more like something that Google wrote,” McGeveran said. It sounds like it was written by a human, even though it wasn’t.

McGeveran described the generative AI product as a “souped-up” version of Google’s search algorithm and said a judge is likely to rule that it is protected by Section 230.

“The question in this case is, ‘Who wrote the AI Overview?’” McGeveran said. “Under current law, I think the answer, pretty clearly, is: ‘Not Google.’”

The lack of litigation involving generative AI and Section 230 leaves open the question of whether the law applies in these cases.

Kasprowicz said the inaccurate information about Wolf River in the AI Overview is a warning that, if left unchecked, the technology poses a “profound risk to the legal and reputational stability every business depends on.”

“This lawsuit is not just about defending our company’s reputation, it’s about standing up for fairness, truth, and accountability in the age of artificial intelligence,” he said in a statement on behalf of the company.

The company’s main objective is to recover damages it has suffered as a result of the AI hallucination, Kasprowicz said.

But Wolf River also hopes it might establish a legal precedent that holds corporations “accountable for the outputs of their AI systems, ensuring that these tools are developed and deployed with ethical, responsible, and reliable safeguards.”

about the writer

Emmy Martin

Business Intern

Emmy Martin is a business reporting intern at the Minnesota Star Tribune.
