Why can Meta run fake ads of my prime minister or the CBC to front scams with no due process, but for this they can use their judgement to block?
I know they’re an American company and my complaints are Canadian, but the double standard stinks.
I wonder if that is what will happen next.
> "We're actively defending ourselves against these lawsuits and are removing ads that attempt to recruit plaintiffs for them," a Meta spokesperson tells Axios.
Communication is highly regulated, for good reasons, and advertising is not. This is as if telecommunications companies disconnected calls whenever what was being said didn't fit their agenda.
This should be illegal for advertising companies as well.
Wow... that is quite a statement. Am I right in saying that in order to claim under the class action lawsuit, in which Facebook has been found negligent, the victims need to take action collectively in order to claim? I.e., they need to be reached somehow to be informed of the possibility?
Seems the most obvious place to advertise would be Meta.
I understand Meta can basically do whatever they like with their ToS but the statement from the Meta spokesperson seems like an extremely bad idea.
Annually poll all the students to get rankings of how they perceive the ethics of well-known companies/brands.
Then publish the results to students, in a timely fashion, before they're deciding job offers and internships.
I speculate that effects of this could include:
1. Good hiring candidates modifying what offers they pursue and accept -- influenced by awareness, self-reflection, and/or peer-pressure.
2. Students thinking and talking about ethics, when they didn't before. Then some of them carry this influence with them, as part of their character and intellect, going forward (which is one of the ideals of a college education).
Also, maybe the second year of the poll, the sentiments are better-informed, because a lot more people have started paying more attention to the question of ethics of a company.
The perception breakdowns by college major would also be interesting, but maybe don't publish those, to reduce internal incentives to game the results. (Everyone knows some majors tend a bit more towards the sociopathic than others, but some would rather that not be official.)
Individuals bringing their own lawsuits seems like it would effect better change, as 1) the award money would be better distributed instead of concentrated, and 2) the amounts levied against the companies would be higher and more of a concern than the class-action slap on the wrist they currently get.
It would certainly be interesting if we wanted legislation to force private companies who provide paid ad space to publish whichever ads paid the most regardless of content, but that opens up a whole other can of worms. What if the ad offering the most money is racist and horrible, or disgustingly obscene? At that point you start needing the government to decide what is allowed to be banned and what isn't, and then it's meddling in speech, which is prohibited by the First Amendment.
So this just seems like an obvious non-story to me. Of course Meta is removing these ads, because pretty much any advertising platform would do the same about ads that criticized it.
Seems like they couldn't write even three lines without a LLM.
It isn't reasonable to ask a platform to host content that is literally about suing them, not because of "freedom" concerns or whether or not Facebook is being hypocritical, but more because in the end there isn't a "fair" way for them to host that. The constraints people want to put on how Facebook would handle that ends up solving down to the null set by the time we account for them all. Open, public rejection is actually a fairly reasonable response and means the lawyers at least know what is up and can respond to a clear stimulus.
"MZ Is A Punk-Ass B
Paid for by Person & Guy LLP"
I do not have any sympathy for Meta.
All corporate CYA ideas sound that way, but they ultimately end up benefiting the company. Meta is right to do this. That's not to say it's the right thing to do, but it's right for the company.
Its own TOS states that they won’t allow that.
It's not just a Meta issue either.
https://cdn.mos.cms.futurecdn.net/yUEJgQzunhbnYYtsckup7i.jpg
https://www.reuters.com/investigations/meta-is-earning-fortu...
Mine is that it could then well be required to do so by law. Companies are not individuals, so I don't think they are owed any freedoms beyond what is best for utility they can provide.
Zuckerberg was told that adding gay people to groups outed them by posting to their walls, and he ignored it: https://www.youtube.com/watch?v=nRYnocZFuc4
And obviously https://news.ycombinator.com/item?id=1692122 (guess we don't get access to his other messages, though https://news.ycombinator.com/item?id=16770818)
His stare isn't the only thing about him that's sociopathic
Edit: oh yeah and https://news.ycombinator.com/item?id=42651178
You don't even have to invoke the idea that Meta is big enough to be regulated as a public utility for this to have broad precedent in favor of forcing a malicious actor to inform its victims that they might be entitled to a small fraction of their losses in compensation.
The $20 people get is nothing but a guise that the trial lawyers are helping people.
Wild stuff
The article at least mentions that one of the suits is private-equity funded, which generally means the partners and/or investors of the private equity firm and the attorneys suing (often all one and the same, in what is just a financial and legal shell game) net tens of millions of dollars, while the supposed victims end up with nothing but pennies on the dollar of harm and injury.
I get the impulse to also “cheer” for the lawsuits, but if you thought Meta, etc. were bad, you really don’t want to look into the vile pestilence that is these law firms, which are basically organized crime too, by the core definition of crime as an offense and harm upon society.
I don’t really know a solution for this problem because it is so rooted in the core foundation of this rotten system we still call America for some reason, but for the time being I guess, the only moderately effective remedy for harm and injury is to combat it with more harm and injury.
It's not a hard thing to implement on their end and should be mandated by a judge as you said.
Filing this away for later use.
Your examples pale in comparison.
The entire point, of course, is to encourage such suits by incentivizing those able to bring them.
If they went back to operating as “friends and family feed providers” then letting them keep their 230 immunity would be easier to justify.
It's to allow companies not to have to deal with individual claims from each person. I see that the ranges can be substantial though, several thousand dollars, but there seem to be criteria.
> Nearly nine months later, Mark received a notification that his claim had been approved. Two weeks after that, $186 was deposited into his bank account. While the amount wasn’t substantial, it covered a grocery run and a phone bill—and more importantly, it reminded him that companies can be held accountable, even in small ways. [0]
[0] https://peopleforlaw.com/blog/how-much-do-people-typically-g...
If the fines don't dissuade companies from bad practices, class actions with theoretically no upper limit might be a better option to enforce proper behaviour.
It is "the business", not an imagined side revenue stream.
"We will allow more speech by lifting restrictions on some topics that are part of mainstream discourse and focusing our enforcement on illegal and high-severity violations."
Also why we need far fewer megacorps than there are now.
It often comes up in (anti) free-speech trials, where the government compels the perpetrator to issue a public apology to the victim. Forcing them to buy an ad in a newspaper for example is not unheard of.
As far as I understand, Americans consider this to be "compelled speech" and hence prohibited, but I might be wrong on this.
They analyze the video posts on instagram. If they detect the video has even a small amount of commercial value, they classify it as branded content and you need to pay for it to get promoted.
No
The ad sellers and the journalists are normally separate and will not interfere much with one another's work. It also helps that they never say no to money. I don't know about the New York Times specifically, but similar things have happened many times at other newspapers, and there is such a thing as a paid editorial. Those are usually clearly marked as such, but it's basically the same thing.
(However, there may be other reasons why you might want to go with a competitor instead, and not pay the newspaper you hate $100k.)
Only if you don't opt out. Individuals who opt out of being part of the class can still file their own suits. (Although it's not clear how successful you will be if your situation/harm is not substantially different from the other members of the class.)
the flaw with class actions is not that they don't pay enough (or too much, to the wrong people) money. it's that they're reactive, which is to say, it's the same tradeoff with nearly all US commercial policy.
One of the strangest things I find is that people will spend serious money to buy a house somewhere they don't have to be exposed to the great unwashed yet will happily expose their brains to the worst of the worst via social media.
Kafkaism is natural and organic.
Explicit rejection is better than opacity but better still is public accountability. Meta’s properties have a combined userbase that amounts to just over 1 in 4 people on earth; these platforms should have been regulated as utilities a long time ago. Suppose I wanted to run ad campaigns advocating for antitrust legislation targeting social media companies and ended up getting booted off of all of the major platforms; what feasible method is there for me to advance these ideas that could possibly compete with the platforms’ own abilities to influence public opinion?
But why must we limit ourselves to simplistic, false dichotomies such as "Good vs Evil", "Education vs Ignorance", "Community Well-Being vs Disinformation and Arrant Nonsense", "Democracy and Social Confidence vs Propaganda and Conspiratorial Mayhem", and "Mental Health vs Despair and Self-Harm" ? We really are focused on building apps that people love.
I get that the distinction matters a bit from time to time (court cases keep blurring the line in the US though), but:
1. With all the other shit that makes it through the filter, this was pretty clearly a targeted, strategic takedown rather than some sort of broad "we don't allow bad ads on the platform." Allowing "all ads" isn't the thing being argued; it's allowing "this ad."
2. The non-offensive idea that "abusers shouldn't be allowed to deceive and gaslight their victims" strongly suggests this was a bad move on Meta's part if it was an intentional act. Maybe it shakes out fine for them legally in this particular instance, but the fact that as a society we routinely require companies and individuals to behave with more appearance of moral standing than this suggests that blocking this particular ad is over the line, and it's neither naive nor utopian to think so. Even if it's legally in the light-grey, it's an abuse of power worth talking about, and hopefully it inspires more people to leave their platform.
What it does say is you aren't liable for something someone else wrote.
It doesn't create liability for things not covered by it.
Guess who decides the order and contents of Facebook feeds? Facebook does. So they wouldn't be liable for someone writing a post saying "gas the jews", but they would still be liable for choosing to show it at the top of everyone's front page, if the front page was curation-based rather than chronological.
When they are making editorial decisions about what content to promote to you and what content to hide from you, then they should lose it.
It’s a lawsuit, with the users of the platform as the damaged party, against the platform. Removing the possibility to reach the users should result in a default judgement with maximum damages immediately.
Does Zuckerberg have some kind of clinical condition where he just can't imagine how other people might see him?
Sure this will slow down the personal injury lawyers finding clients but it won't stop them, meantime it is more ammunition for Facebook's enemies to use against it.
It is one thing to do shady business, it is another thing to incriminate yourself. If you were involved with weed and somebody sent you an email asking if they could come around and pick up a Q.P. next Saturday I'd expect you to give the person a correction in person that they shouldn't do that again.
Not to say you should be like Epstein, but I mean he and the people he corresponded with had some sense, so there is very little evidence of criminal activity in millions of emails.
At Facebook, on the other hand, people constantly sent emails about things that could just as easily have been left as "dark matter": unexplained and minimally documented decisions. But no, it's like that MF DOOM song "Rapp Snitch Knishes", like a bunch of children with no common sense at all.
This has happened.
> the government to decide what is allowed to be banned and what isn't,
This is a civil lawsuit where people are trying to a) prove they were harmed and b) be compensated for that harm. The government is just the referee.
> Meta is removing these ads, because pretty much any advertising platform would do the same about ads that criticized it.
Which was not very smart, because the next step will be a court order requiring them to put a banner on every page with a link to sign up to join the lawsuit, for free.
Social media is a perfect storm for the elites in this system. It’s a CIA wet dream. It’s literally a globalized and hyper personalized propaganda distribution platform. This is the inevitable outcome of capitalism and human behavior. Meta’s whole purpose is to create the most optimized pipeline for accepting money from 3rd parties in exchange for convincing as many people as possible of what they want those people to believe.
Social media is evil but it’s also the natural course of what happens with current technology and the incentives of capitalism.
This is not how it works when you're found guilty of committing harm. Tobacco companies are a good example of this.
You own a restaurant, where you sell poisoned (intentionally and knowingly) food. A group of people band up for class action lawsuit for poisoning them, and have the lawyers post a sign at your restaurant, that everyone poisoned there should reach out and get some compensation.
Should you be allowed to take the sign down?
An interesting variant I’ve seen on anti-smoking banners at convenience stores is “A federal court has ordered a Philip Morris USA to say: …”
I'm actually more at odds with HN than many people might be, because I think the lies surrounding covid and the censorship were absolutely wrong, and after things like that platforms could genuinely lay claim to being unfairly targeted. But you can tell Zuck doesn't actually care, because he immediately started doing the same thing.
This is humanity vs Mark Zuckerberg.
Yeah, it's called having-too-much-money-to-careitis.
Not sure he cares. He's literally got hundreds of billions of dollars to his name, and the corporation he founded is worth trillions.
I'm talking about a poll that raises awareness across the entire student population, in a low-cost way, with no incentive to cheat nor to do anything other than give honest answers.
Regarding whether awareness of ethics would penalize those with any ethical tendency:
We can already see the last 2 decades of rampant unethical behavior throughout tech companies. Virtually every tech company knowingly sells out its users to data brokers, for example. Sociopaths already dominate, and have conditioned everyone below them to behave in an at least compliant way even when doing something unethical.
It's too bad we're starting late on that absolutely pervasive ethics problem, but we can't make the problem any worse by trying an education intervention now. And it's the traditional job of colleges to do this -- not to pump out oblivious worker drones.
And of course there are scammers on all sides - not just legitimately bad stores trying to whitewash their online presence, but also entire scammer rackets that extort legitimately good stores by flooding them with BS reviews [2].
[1] https://www.linkedin.com/pulse/how-shady-companies-remove-ba...
[2] https://9now.nine.com.au/a-current-affair/inside-the-extorti...
Source?
Wall Street makes those demands. Those demands are backed up by court cases and precedent. Nothing about this is synonymous with "capitalism."
> It’s a CIA wet dream.
And they spend a significant amount of money. Is this "capitalism" still? Or are there more specific terms that would apply more directly to this arrangement?
> Social media is evil
The US is the largest manufacturer and seller of weapons in the world.
Are people in the CIA incompetent?
I know this answer doesn't pass the vibes test, but it's how the law actually works. If you post a sign on someone's property without permission, you'll get in trouble for trespassing, vandalism, or both.
So get a judge to issue an order. In a serious situation, they very well might.
It doesn’t. You can almost[1] always opt out of a class action lawsuit to pursue your own suit. This would be expensive and unwise for most people, but you have the right.
[1] There are rare exceptions.
I suspect that the emotional maturity of most billionaires is at the toddler level or below, and I mean that quite seriously and literally.
Cory Doctorow describes Mark Zuckerberg's and Elon Musk's attitude toward other people as billionaire solipsism [1].
[1] https://pluralistic.net/2026/01/05/fisher-price-steering-whe...
Nah, he just doesn't care. Nothing he does will ever get people (en masse, onesie, twosies don't matter) to stop using Meta products.
People can/will complain about him forever, but shitty people will continue to help him build things, and shitty people will continue to use them.
Courts aren't lacking in budget to hire more people. They're lacking in people available to hire, with the specific expertise that they need to fill any gaps. The legal profession, at least in the US, consistently has some of the lowest unemployment rates across the board. Unlike over here in the tech sector, the scarcity is in available talent, rather than available jobs.
https://trends.google.com/explore?q=facebook&date=all&geo=US
and maybe Zuck doesn't think he can do anything about it. There are different theories, but I like this one:
-- originally you would put some imagination and elbow grease into using Facebook and get some attention, which made it very attractive and interesting to people around 2010
-- then it found a business model which was dependent on your not being able to use imagination and elbow grease to get attention which made it less interesting in general but still somewhat interesting because now you could put cash into the slot machine and get cash out
-- over time they lowered the payout of the slot machine which made the game less interesting and more dependent on 100% profitable scams which could function no matter how bad the payout was; people lose trust in the platform and stop engaging with ads, real advertisers don't want to be seen next to scam ads (lest they be seen as scams) which further lowers the payout and makes the game less interesting over time
-- and now they won't even take your money... so who cares?
The game is rigged. Same with Instagram and WhatsApp (yeah, companies get acquired, but WA's Acton was very explicit: "delete Facebook". He was pissed off at what happened. Also, ever tried deleting FB? Almost impossible; more network effects.)
You have one thing on your mind - "social media bad" - and it is poisoning your ability to see the complexity in the world. There is rarely a single cause for anything, and there is never a single cause for everything.
Once I started using the social features on my MQ3 I found it really was Zuck's worst nightmare. I met all these nice retirees who were fun to play Beat Saber with and who would go on cruises and post YouTube links to pano videos they take of the ship.
I think it is more like radioactive decay than say, cheese going bad, but maybe I'm wrong. You can't smell the radioactive decay!
Meta on Thursday began removing advertisements from attorneys who were seeking clients who claim to have been harmed by social media while under the age of 18.
Why it matters: This comes just two weeks after Meta and YouTube were found negligent in a landmark California case about social media addiction.
Driving the news: Axios has identified more than a dozen such ads that were deactivated today, some of which came from large national firms like Morgan & Morgan and Sokolove Law.
Zoom in: Meta appears to be relying on part of its terms of service that say:
What they're saying: "We're actively defending ourselves against these lawsuits and are removing ads that attempt to recruit plaintiffs for them," a Meta spokesperson tells Axios. "We will not allow trial lawyers to profit from our platforms while simultaneously claiming they are harmful."