It's becoming increasingly apparent that if you don't use something truly free and open source and host it yourself, you're just setting yourself up for more of this sort of thing.
You can't trust anyone to properly handle the problem of "how the hell do we keep creeps the f*ck away from kids?" with any amount of common sense.
And even if I were able to register, that "automated system" still randomly bans people whenever it feels like it. Search the r/discordapp subreddit or just google "discord random ban": it's a widespread problem with no solution. I have no idea how so many other people seem to have no issues, yet you can find plenty of people just as frustrated as me.
Their solution is for everyone to pay for Matrix with a credit card to verify age. I assume that means there must be a way to force only paid, registered accounts to join one's instance? What percentage of the accounts on Discord are paid for with a credit or debit card? Or boosted? I don't keep up with the terminology.
[0] - https://en.wikipedia.org/wiki/Online_age_verification_in_the...
> Practically speaking, that means that people and organisations running a Matrix server with open registration must verify the ages of users in countries which require it. Last summer we announced a series of changes to the terms and conditions of the Matrix.org homeserver instance, to ensure UK-based users are handled in alignment with the UK’s Online Safety Act (OSA).
At least you can self-host Matrix, and messages are end-to-end encrypted, unlike IRC.
https://telegra.ph/why-not-matrix-08-07
There are even custom message/media types that people use to upload hidden content you can't see even if you're joined to the same channel using a typical client.
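To make that concrete, here is a rough sketch of the kind of event being described: a standard m.room.message event carrying a custom msgtype that typical clients don't recognise and therefore won't render, even though the media is sitting right there on the homeserver. The homeserver URL, access token, room ID, msgtype value and mxc URI below are all made up for illustration; only the /send endpoint itself is standard Matrix.

    # Sketch: send an m.room.message event with a custom msgtype that most
    # clients will not render. All identifiers here are hypothetical.
    from urllib.parse import quote
    import requests

    HOMESERVER = "https://example.org"          # hypothetical homeserver
    ACCESS_TOKEN = "syt_example_token"          # hypothetical access token
    ROOM_ID = "!abc123:example.org"             # hypothetical room ID

    event_content = {
        "msgtype": "org.example.hidden_media",  # custom type most clients skip
        "body": "(unsupported message)",        # fallback text some clients show
        "url": "mxc://example.org/someMediaId", # media is still stored server-side
    }

    resp = requests.put(
        f"{HOMESERVER}/_matrix/client/v3/rooms/{quote(ROOM_ID, safe='')}"
        "/send/m.room.message/txn1",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json=event_content,
        timeout=10,
    )
    print(resp.json())  # returns the event ID if the homeserver accepts it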
A bug blocking functionality is an annoyance, but a Scarlet Letter branded onto a secret dossier is terrifying.
https://news.ycombinator.com/item?id=46982421
https://tech.yahoo.com/social-media/articles/now-bypass-disc...
So… choose your poison? I’m sure Matrix/Element works for someone or they would be out of business, but it does not work for me.
> isn't Matrix based out of the UK and primary hosted instances on AWS in the UK?
It doesn't matter what country you run your server in or where your company is based; if you're providing public signup to a chat server then the countries (UK, AU, NZ etc) which require age verification will object if you don't age verify the users from those countries. (This is why Discord is doing it, despite being US HQ'd). In other words, the fact that The Matrix.org Foundation happens to be UK HQ'd doesn't affect the situation particularly.
(Edit: also, as others have pointed out, Matrix is a protocol, not a service or a product. The Matrix Foundation is effectively a standards body which happens to run the matrix.org server instance, but the jurisdiction that the standards body is incorporated in makes little difference - just like IETF being US-based doesn't mean the Internet is actually controlled by the US govt).
> Their solution is for everyone to pay for Matrix with a credit card to verify age.
Verifying users in affected countries based on owning a credit card is one solution we're proposing; suspect there will be other ways to do so too. However: this would only apply on the matrix.org server instance. Meanwhile, there are 23,306 other servers currently federating with matrix.org (out of a total of 156,055) - and those other servers, if they provide public signup, can figure out how to solve the problem in their own way.
Also, the current plan on the matrix.org server is to only verify users who are in affected countries (as opposed to trying to verify the whole userbase, as Discord is doing).
https://piunikaweb.com/2026/02/12/discord-uk-age-verificatio...
Moderation and centralization, while typically not independent, aren't necessarily dependent on each other. One can imagine one person viewing content with one set of moderation actions applied and another person viewing the same content with a different set of moderation actions.
We sort of have this on HN already with the option to view flagged content. It's essentially applying an empty set of mod actions.
I believe it's technically viable to syndicate mod actions, and it might even solve the moderation labor problem, but whether it's a socially viable way to build a network is another question.
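To illustrate the idea (this is a toy sketch, not any real API; all names are hypothetical): one shared content stream, several independently published moderation feeds, and each reader choosing which feeds to apply.

    # Toy sketch of "syndicated mod actions": the same posts, different
    # moderation feeds applied per reader. Everything here is hypothetical.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Post:
        post_id: str
        text: str

    # Shared, unmoderated content.
    posts = [
        Post("p1", "useful comment"),
        Post("p2", "spam spam spam"),
        Post("p3", "heated but on-topic"),
    ]

    # Independently published moderation feeds: each is a set of hidden post IDs.
    mod_feeds = {
        "strict-mods": {"p2", "p3"},
        "spam-only": {"p2"},
    }

    def visible_posts(subscribed_feeds: list[str]) -> list[Post]:
        """Return the posts left visible after applying the chosen mod feeds."""
        hidden = set().union(*(mod_feeds[f] for f in subscribed_feeds)) if subscribed_feeds else set()
        return [p for p in posts if p.post_id not in hidden]

    print([p.post_id for p in visible_posts(["strict-mods"])])  # ['p1']
    print([p.post_id for p in visible_posts(["spam-only"])])    # ['p1', 'p3']
    print([p.post_id for p in visible_posts([])])                # ['p1', 'p2', 'p3'] - the "showdead"-style empty set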
The 3D model method might work on Persona, but that demo only shows it fooling K-ID's classifier.
There should be no reason for a phone number, nor do I want to waste my time trying to bypass it with internet-provided single-use numbers.
If it is a service I must use, then I will provide a phone number. If it is a service I get to choose to use, then I will never provide one.
Whether it matters depends very much on what sort of organization you are.
Discord is a multinational for-profit corporation planning an IPO. It takes payments from users in those countries, likely partners with companies in those countries, and likely wants to sell stock to investors in those countries. Every one of those countries has the ability to punish Discord if it does not obey their laws, even if it does not have a physical presence there.
The situation is likely quite different for most of the 23,306 Matrix servers that federate widely. The worst thing Australia, for example, could do to one of their operators is make it legally hazardous for them to visit Australia.
Matrix is basically labeled "adults only" everywhere, so restricting certain servers/rooms due to possible innocent eyes is likely out of scope.
Practically speaking, I would just ignore this requirement. The UK government has no jurisdiction on this side of the pond.
There are a few IRC clients that support OTR: irssi-otr is one [1], weechat-otr is another [2]. Pidgin supports it too, though I have not used it in a very long time, and Hexchat has an always-work-in-progress plugin. There may be others.
OTR could use some updates to include modern ciphers, similar to the recent work on OpenSSH, but it's probably good enough for most people.
E2EE aside, having chat split up into gazillions of self-hosted instances makes it much harder for chat to be hoovered up all in one place. It takes more effort to target each person, and that becomes a scalability problem for governments. Example of such effort: [3]
[1] - https://github.com/cryptodotis/irssi-otr
[2] - https://github.com/mmb/weechat-otr
[3] - https://archive.ph/4wi5t
In that kind of environment, end-to-end encryption really doesn't add value.
This Matrix discussion here is missing the point: many people don't want ubiquitous tracking of everything we do on the internet. You and Matrix are seemingly not honestly addressing that point, because Matrix doesn't seem different from Discord in its requirements.
It seems far too risky to sign up on a service meant for intercommunication when that service is able (or even likely) to burn bridges with another one for any reason at any time. In the end people will just accumulate on 2 or 3 big providers, and then you have pseudo-federation anyway.
Apparently my monopoly ISP rotates IPs fairly often and I am sharing them with people who have been doing bad things with them, so not only are many Matrix channels blocked, but even large regular websites like Etsy or Locals are completely blocked for me as well. Anything with a Cloudflare captcha is also an infinite loop.
Whether or not authorities with jurisdiction over you would notice your instance (homeserver) or bother you about age verification is an issue you'd have to consider for yourself.
I honestly have no idea. As much as they love money, I am not paying my lawyers to research this one. I would probably wait for others to be made an example of.
Clueless lawmakers will see this app called Element full of kids chatting without restrictions and tell it to add a filter. When the app says "we can't", the government says "sucks to be you, figure it out" and either hands out a fine or blocks the app.
There are distinctions between the community vibe Discord is going for (with things like forums and massive chat rooms with thousands of people) and Matrix (which has a few chatrooms but mostly contains small groups of people). No in-app purchases, hype generation, or other predatory designs, just the bare basics needed for a functional chat app (and even less than that if you go for some clients).
I'd say being based in the UK will put matrix.org and Element users at risk, but with Matrix development being funded mostly by the people behind matrix.org, that implies an impact on the larger decentralized network.
I thought it was both, and their hosted service is in the UK. Is it not? I know people can host their own, but I have had very little success in getting people to host their own things. Most here on HN will not do anything that requires more than their cell phone. Who knows, maybe Discord's actions will incentivize more people to self-host.
However all the LGBT+ friendly servers federate with each other and that's good enough for me. I like not having to see toxicity, there's too much of it in the world already.
19. "media downloads are unauthenticated by default" -> fixed in Jun 2024: https://matrix.org/blog/2024/06/26/sunsetting-unauthenticate...
20. "ask someone else’s homeserver to replicate media" -> also fixed by authenticated media
21. "media uploads are unverified by default" - for E2EE this is very much a feature; running file transfers through an antivirus scanner would break E2EE. (Some enterprisey clients like Element Pro do offer scanning at download, but you typically wouldn't want to do it at upload given by the time people download the AV defs might be stale). For non-encrypted media, content can and is scanned on upload - e.g. by https://github.com/matrix-org/synapse-spamcheck-badlist
22. "all it takes is for one of your users to request media from an undesirable room for your homeserver to also serve up copies of it" - yes, this is true. similarly, if you host an IMAP server for your friends, and one of them gets spammed with illegal content, it unfortunately becomes your problem.
In terms of "invisible events in rooms can somehow download abusive content onto servers and clients" - I'm not aware of how that would work. Clients obviously download media when users try to view it; if the event is invisible then the client won't try to render it and won't try to download the media.
Nowadays many clients hide media in public rooms, so you have to manually click on the blurhash to download the file to your server anyway.
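For anyone curious what "authenticated media" (points 19 and 20 above) means in practice, here's a minimal sketch: the newer client-server endpoint requires an access token, while the legacy endpoint could be fetched by anyone who knew the mxc URI. The homeserver, token and media ID are hypothetical; the two endpoint paths come from the Matrix client-server spec.

    # Sketch: authenticated vs legacy media download. Identifiers are hypothetical.
    import requests

    HOMESERVER = "https://matrix.example.org"
    ACCESS_TOKEN = "syt_example_token"
    SERVER_NAME, MEDIA_ID = "example.org", "someMediaId"

    # Authenticated endpoint (Matrix v1.11+): rejected without a valid access token.
    authed = requests.get(
        f"{HOMESERVER}/_matrix/client/v1/media/download/{SERVER_NAME}/{MEDIA_ID}",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )

    # Legacy endpoint, frozen/sunset on servers that enforce authenticated media.
    legacy = requests.get(
        f"{HOMESERVER}/_matrix/media/v3/download/{SERVER_NAME}/{MEDIA_ID}",
        timeout=10,
    )

    print(authed.status_code, legacy.status_code)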
"People will just accumulate on 2 or 3 big providers" is far from an inevitable circumstance, but there are conditions that make it more likely. That, too, is largely down to negligence or malice (but less so than the abusive communications problem).
Additionally, OTRv3 does not allow multiple clients per account, which makes it unusable for anyone who wants to chat from two devices.
Even without registering my nick, I would expect a modern protocol to keep my pm communication private by default.
More likely, it just won't become popular enough for lawmakers to notice because the UX is a little rough, and people have very little patience for such things anymore.
That might happen here, but I don’t think that principle holds generally. If that were true, wouldn’t every component of the service provider chain be sued for people e.g. downloading pirated or illegal stuff? The government cracks down on e.g. torrent trackers and ISPs, but they haven’t seriously attacked torrent clients or the app stores/OSes that allow users to run those clients. Why not?
You're just talking to the wrong ones :-)
Is that still true? As the admin of a small instance, I find the abuse coming from mastodon.social has been really low for a few years. There is the occasional spammer, but they often deal with it as quickly as I do.
> It's neutral to this topic, it's about tech.
this thread began by xe bringing up failures in moderation affecting trans people
Asking trans people to ignore this is like asking Jews to be comfortable in a bar where only ten percent of the patrons are Nazis. Arguing that "well not everyone is a Nazi" doesn't help, an attitude of "we're neutral about Nazis, we serve drinks to anyone" still makes it a Nazi bar, just implicitly rather than explicitly.
We do discuss all kinds of different topics here. Despite what many people here want to believe, Hacker News isn't exclusively for tech and tech-related subjects.
>and pretty sure anything openly transphobic would be flagged or deleted pretty soon.
But not banned, that's the problem. The guidelines are extremely pedantic, but nowhere is bigotry, racism, antisemitism or transphobia mentioned as being against those guidelines. You might say that shouldn't be necessary, but it's weird that so much effort is put into tone policing specific edge cases while the closest the guidelines come to defending marginalized groups is "Please don't use Hacker News for political or ideological battle. It tramples curiosity." Transphobia is treated as a mere faux pas, on a par with being too snarky or tediously repetitive; the real transgression is not the bigotry but "trampling curiosity." Any trans person who posts here knows that bigots who hate them and want to do them harm aren't going to suffer meaningful consequences (especially if they just spin up a green account) and that the culture here isn't that concerned about their safety.
Read the green account just below me. That sort of thing happens all the time. Yes, the comment is [dead] but why should a trans person be comfortable here, or consider themselves welcome, knowing that this is the kind of thing they'll encounter?
Hi all,
We’ve seen a huge spike of signups on the matrix.org homeserver over the last few days due to Discord announcing its plans to age-verify all users as of next month. We’d like to give a warm welcome to the massive influx of users currently trying Matrix as an open decentralised alternative to centralised platforms like Discord. We wish we had more time and resources to develop all the features needed for mainstream adoption (see The Road To Mainstream Matrix from last year’s FOSDEM), but we're happy to welcome you anyway!
The biggest difference between Matrix and Discord is that Matrix is an open standard, like email or the Web. There’s a wide range of both clients and servers, and anyone can run their own server on their own terms while participating in the global Matrix network.
However, it’s important to note that server admins are still subject to the law in the jurisdiction where they operate.
Practically speaking, that means that people and organisations running a Matrix server with open registration must verify the ages of users in countries which require it. Last summer we announced a series of changes to the terms and conditions of the Matrix.org homeserver instance, to ensure UK-based users are handled in alignment with the UK’s Online Safety Act (OSA). Since then Australia, New Zealand and the EU have introduced similar legislation, with movement in the US and Canada too. If you’ve been around for a while, you will have seen that we started raising the alarm about the dangers and potential risks of the OSA back in 2021 - but the reality is that these laws already apply, and the consequences of getting it wrong are serious.
From our perspective, the matrix.org homeserver instance has never been a service aimed at children, which our terms of use reflect by making it clear that users need to be at least 18 years old to use the server. However, the various age-verification laws require stricter age-verification measures than a self-declaration. Our Safety team and DPO are evaluating options that preserve your privacy while satisfying the age verification requirements in the jurisdictions where we have users. As a free service, we also have to be mindful of the cost of age-verification compliance. Paying for a matrix.org Premium account with a credit card is one approach which would verify your account and support our work. Premium accounts are currently going through a phased rollout, so if you’re on an older account you might not see the option to convert your account yet; you can mail [email protected] if you wish to be upgraded.
We also want to make it easy for users to move their account to another server with a feature called account portability. Account portability would give users more freedom to choose a server that matches their needs, and it would reduce the load on our matrix.org server. This takes significant work, but there should be some new Matrix Spec Change proposals (MSCs) in the coming weeks showing the direction of travel.
Finally: we’re painfully aware that none of the Matrix clients available today provide a full drop-in replacement for Discord yet. All the ingredients are there, and the initial goal for the project was always to provide a decentralised, secure, open platform where communities and organisations could communicate together. However, the reality is that the team at Element who originally created Matrix have had to focus on providing deployments for the public sector (see here or here) to be able to pay developers working on Matrix. Some of the key features expected by Discord users have yet to be prioritised (game streaming, push-to-talk, voice channels, custom emoji, extensible presence, richer hierarchical moderation, etc.). Meanwhile, no other organisation has stepped up to focus on the “communication tool for communities” use case and provide a production-ready Discord alternative, though clients like Cinny or Commet may feel much closer to Discord. On the other hand, Matrix goes far beyond Discord in other areas: messages, files and calls are all end-to-end encrypted; we have read receipts; Matrix is an open protocol everyone can extend, and most Matrix clients are open source; there is nothing stopping developers from starting their own project based on existing ones and adding the missing features themselves. They may even eventually get accepted into the original projects!
Anyway, TL;DR: Welcome to everyone trying Matrix for the first time; please understand that public Matrix servers will also have to uphold age verification laws, as misguided as they might be. However, at least in Matrix you have the opportunity to run your own servers as you wish: we actively encourage you to make your own assessments and seek legal advice where needed.
The Matrix.org Foundation is a non-profit and only relies on donations to operate. Its core mission is to maintain the Matrix Specification, but it does much more than that.
It maintains the matrix.org homeserver and hosts several bridges for free. It fights for our collective rights to digital privacy and dignity.