
Brian Boland spent over a decade figuring out how to build a system that would make Meta money. On Thursday, he told a California jury that the system drew more and more users, including teens, to Facebook and Instagram, despite the risks.
Boland’s testimony came one day after Meta CEO Mark Zuckerberg took the stand in a trial over whether Meta and YouTube are liable for alleged harm to the mental health of a young woman. Zuckerberg framed Meta’s mission as balancing safety with freedom of expression, not revenue. Boland’s role was to counter this by explaining how Meta makes money, and how that shapes the design of its platforms. Boland testified that Zuckerberg fostered a culture that prioritized growth and profits over the well-being of users from the top down. He said he had been described as a whistleblower, a term Meta had sought to limit for fear it would prejudice jurors, but which the judge generally allowed. Over the course of his 11 years at Meta, Boland said, he went from “deep, blind faith” in the company to “the firm belief that competition, power, and growth are the things Mark Zuckerberg cares about most.”
Boland last served as vice president of partnerships at Meta before leaving in 2020, working to bring content to the platform that the company could monetize, and previously worked in a variety of advertising roles starting in 2009. He saw Facebook’s infamous early mantra of “move fast and break things” as representing “the cultural ethos of the company.” The idea behind the slogan, he said, was “don’t really think about what can go wrong with a product, but just put it out there, learn and watch.” At the height of its internal popularity, employees would find a piece of paper at their desks that read: “What are you going to break today?” Boland testified.
“The priorities were growth and engagement.”
Zuckerberg made his priorities for the company crystal clear, according to Boland. He announced them at all-hands meetings and left no doubt about what the company should focus on, whether that was building its product to be mobile-first or getting ahead of the competition. When Zuckerberg decided Facebook had to prepare to compete with a rival social network from Google (which he didn’t name, but which appeared to be Google+), Boland recalled a digital countdown clock in the office that marked the time left to achieve their goals during what the company called a “lockdown.” During his time at the company, Boland testified, there was never a lockdown for user safety, and Zuckerberg allegedly instilled in engineers that “the priorities were about growth and engagement.”
Meta has repeatedly denied that it is trying to maximize user engagement on its platforms at the expense of keeping them safe. In recent weeks, both Zuckerberg and Instagram CEO Adam Mosseri have testified that building platforms that users enjoy and feel good about is in their long-term interest, and that’s what drives their decisions.
Boland disputed this. “My experience is that when there were opportunities to try to understand what products could do harmfully in the world, that was not the priority,” he testified. Harm was treated as more of a problem to manage than an opportunity to improve, he said.
When safety issues came up through press reports or regulatory questions, “the basic response was to figure out how to manage through the press cycle, and what the media was saying, rather than saying, ‘Let’s take a step back and really understand deeply.’” Although Boland said he told his advertising-focused team that they should be the ones to discover “the broken pieces,” not those outside the company, he said that philosophy did not extend to the rest of the company.
A day earlier, Zuckerberg pointed to documents from around 2019 showing his employees disagreed with his decisions, saying they showed a culture that encouraged diversity of opinions. However, Boland testified that while this may have been the case earlier in his tenure, it later became a “very closed culture.”
“There is no ethical algorithm, this is not a thing… It doesn’t eat, it doesn’t sleep, it doesn’t care.”
Because the jury can only consider decisions and products that Meta itself made, rather than content it hosted from users, lead plaintiff attorney Mark Lanier also asked Boland to describe how Meta’s algorithms worked and the decisions he helped make and test. Algorithms have “an enormous amount of power,” Boland said, and are “absolutely relentless” in pursuing their programmed goals — and in many cases at Meta, that goal was allegedly engagement. “There is no ethical algorithm, this is not a thing,” Boland said. “It doesn’t eat, it doesn’t sleep, it doesn’t care.”
During his testimony on Wednesday, Zuckerberg commented that Boland “developed some strong political views” toward the end of his tenure at the company. (Neither Zuckerberg nor Boland provided specific details, but in a 2025 blog post, Boland indicated that he was deleting his Facebook account in part over disagreements about how Meta handled events like January 6, writing that he believed “Facebook contributed to spreading ‘Stop the Steal’ propaganda and enabling this coup attempt.”) Lanier spent time establishing that Boland was respected by his peers, and showed a CNBC article about his departure that quoted a glowing statement from his then-boss, along with an unnamed source who reportedly described Boland as a person of strong moral character.
During cross-examination, Meta’s attorney Phyllis Jones established that Boland did not work on the teams charged with understanding youth safety at the company. Boland agreed that advertising business models aren’t inherently bad, and neither are algorithms. He also acknowledged that many of his concerns relate to content posted by users, which is not at issue in the current case.
During direct questioning, Lanier asked whether Boland had expressed his concerns to Zuckerberg directly. Boland said he told the CEO he had seen data showing “harmful results” from the company’s algorithms and suggested they investigate further. He recalled Zuckerberg’s response as something like: “I hope there are still things to be proud of.” He said he resigned shortly after.
Boland said he left more than $10 million in unvested Meta shares on the table when he left, though he acknowledged he had made more than that over the years. He said he still gets “nervous” every time he talks about the company. “This is an incredibly powerful company,” he said.