This is why having decent standards in politics and opposition matters so much.
We all got together to vote out the last wankers, only to find that the current lot are of the same quality but in different ways.
And to think... the 'heads up their arses while trying to suppress a Hitler salute' brigade (Reform) are waiting in the wings to prove my point.
This principle applies directly to search engines and data processors. When Courtsdesk was deleted before reaching AI companies, the government recognized the constraint: AI systems can't apply this "completeness" standard. They preserve raw archives without contextual updates. They can't distinguish between "investigato" (under investigation) and "assolto" (acquitted). They can't enforce the court's requirement that information must be "correttamente aggiornata con la notizia dell'esito favorevole" (properly updated to reflect the favorable resolution of the proceedings).
The UK government prevented the structural violation the Cassazione just identified: permanent archival of incomplete truth. This isn't about suppressing information; it's about refusing to embed factually false contexts into systems designed for eternity.
The Ministry of Justice in the UK has always struck me as very savvy, from my work in the UK civic tech scene. They're quite self-aware, and I assume this is more pro-social than it might seem.
It's not easy to see who to believe. The MP introducing it, claiming there is a "cover-up", is just doing what MPs do. Of course it makes him look bad: a service whose introduction he oversaw is being withdrawn. The rebuttal by the company essentially denies everything. At the same time, it's important to notice that the government is working on a replacement system of its own.
I think this is a non-event. If you really want to read the court lists you already can, and without paying a company for the privilege. It sounds like HMCTS want to internalise this behaviour by providing a better centralised service themselves, and meanwhile all the fuss appears to come from a company, operated by an ex-newspaper editor, that just had its only income stream (built around preferential access to court data) cut off.
As for the openness of court data itself, wake me in another 800 years when present day norms have permeated the courts. Complaining about this aspect just shows a misunderstanding of the (arguably necessary) realities of the legal system.
Or it should be sealed for X years and then public record. Where X might be 1 in cases where you don't want to hurt an ongoing investigation, or 100 if it's someone's private affairs.
Nothing that goes through the courts should be sealed forever.
We should give up with the idea of databases which are 'open' to the public, but you have to pay to access, reproduction isn't allowed, records cost pounds per page, and bulk scraping is denied. That isn't open.
https://www.tremark.co.uk/moj-orders-deletion-of-courtsdesk-...
They raise the interesting point that "publicly available" doesn't necessarily mean it's free to store/process etc:
> One important distinction is that “publicly available” does not automatically mean “free to collect, combine, republish and retain indefinitely” in a searchable archive. Court lists and registers can include personal data, and compliance concerns often turn on how that information is processed at scale: who can access it, how long it is kept, whether it is shared onward, and what safeguards exist to reduce the risk of harm, especially in sensitive matters.
> ... the agreement restricts the supply of court data to news agencies and journalists only.
> However, a cursory review of the Courtsdesk website indicates that this same data is also being supplied to other third parties — including members of @InvestigatorsUK — who pay a fee for access.
> Those users could, in turn, deploy the information in live or prospective legal proceedings, something the agreement expressly prohibits.
The counter claim by the government is that this isn't "the source of truth" being deleted but rather a subset presented more accessibly by a third party (CourtsDesk) which has allegedly breached privacy rules and the service agreement by passing sensitive info to an AI service.
Coverage of the "urgent question" in parliament on the subject here:
House of Commons, Courtsdesk Data Platform Urgent Question
Relatedly, there's an extremely good online archive of important past cases, but because it disallows crawlers in robots.txt (https://www.bailii.org/robots.txt) not many people know about it. Personally I would prefer if all reporting on legal cases linked to the official transcript, but seemingly none of the parties involved finds it in their interest to make that work.
https://x.com/CPhilpOfficial/status/2021295301017923762
https://xcancel.com/CPhilpOfficial/status/202129530101792376...
Then they start jailing people for posts.
Then they get rid of juries.
Then they get rid of public records.
What are they trying to hide?
you _really_ shouldn't be allowed to train on information without having a copyright license explicitly allowing it
"publicly available" isn't the same as "anyone can do whatever they want with it", just anyone can read it/use it for research
"The government has cited a significant data protection breach as the reason for its decision - an issue it clearly has a duty to take seriously."
https://www.nuj.org.uk/resource/nuj-responds-to-order-for-th...
> HMCTS acted to protect sensitive data after CourtsDesk sent information to a third-party AI company.
(statement from the UK Ministry of Justice on Twitter; CourtsDesk had run the database)
but it's unclear how much this was an excuse to remove transparency and how much it actually reflects worry about how AI could misuse this information
I haven't confirmed it, but I've seen journalist friends of mine complain that the deletion period for much data is about 7 years, so victims are genuinely shocked that they can't access their own transcripts even from 2018... And if they can, the transcripts are often extremely expensive.
In the United States our Constitution requires that the government conduct all trials in public (with the exception of some Family Court matters). Secret trials are forbidden. This is a critical element of the operation of any democracy.
Shutting down the only working database is the proof point that perfect is the enemy of good.
Of course, this gives cover to the "well, we're working on something better" and "AI companies are evil"
Fine, shut it down AFTER that better thing is finally in prod (as if).
And if the data was already leaked because you didn't do your due diligence and didn't have a good contract, that's on you. It's out there now anyway.
And, really, what's the issue with AI finally having good citations? Are we really going to try to pretend that AI isn't now permanently embedded in the legal system and that lawyers won't use AI to write extremely formulaic filings?
This is either bureaucracy doing what it does, or an actual conspiracy to shut down external access to public court records. It doesn't actually matter which: what matters is that this needs to be overturned immediately.
Eeehh
I agree in principle. It's just that a lot of people would want those databases heavily redacted if this were the case, which would ruin their utility.
The financial disincentive that a 25 dollar access fee creates reduces the amount of spam that can be directed at people listed in these databases by several orders of magnitude. Land title searches are already difficult (in my locality), because only a few people can afford to have their solicitors or agents receive and read all their mail. Open that right up, and everyone will either hide behind an agent selling a trash-all service, or try to sneak the wrong data in there, poisoning it.
> We should give up with the idea of databases which are 'open' to the public, but you have to pay to access, reproduction isn't allowed, records cost pounds per page, and bulk scraping is denied. That isn't open.
I really don't see why. Adding friction to how available information is may be a way to preserve the ability for the public to access information, while also avoiding the pitfalls of unrestricted information access and processing.
I disagree.
Even if you simply made the database no cost but such that an actual human has to show up at an office with a signed request, that is fine. That's still open.
The problem isn't the openness; it's the aggregation.
OR it should be allowed for humans to access the public record but charge fees for scrapers
Eg if you create a business then that email address/phone number is going to get phished and spammed to hell and back again. It's all because the government makes that info freely accessible online. You could be a one man self-employed business and the moment you register you get inundated with spam.
Quoting from an urgent question in the House of Lords a few days ago:
> HMCTS was working to expand and improve the service by creating a new data licence agreement with Courtsdesk and others to expand access to justice. It was in the course of making that arrangement with Courtsdesk that data protection issues came to light. What has arisen is that this private company has been sharing private, personal and legally sensitive information with a third-party AI company, including potentially the addresses and dates of birth of defendants and victims. That is a direct breach of our agreement with Courtsdesk, which the Conservatives negotiated.
https://hansard.parliament.uk/Commons/2026-02-10/debates/037...
An analogy would be Hansard and theyworkforyou.com. The government always made Hansard (record of parliamentary debates) available. But theyworkforyou cleaned the data, and made it searchable with useful APIs so you could find how your MP voted. This work was very important for making parliament accessible; IIRC, the guys behind it were impressive enough that they eventually were brought in to improve gov.uk.
Free to ingest and make someone's crimes a permanent part of AI datasets, resulting in forever-convictions? No thanks.
AI firms have shown themselves to be playing fast and loose with copyrighted works, a teenager shouldn't have their permanent AI profile become "shoplifter" because they did a crime at 15 yo that would otherwise have been expunged after a few years.
It's not about any post-case information.
How about rate limited?
England has a genuinely independent judiciary. Judges and court staff do not usually attempt to hide from journalists stuff that journalists ought to be investigating. On the other hand, if it's something like an inquest into the death of a well-known person which would only attract the worst kind of journalist they sometimes do quite a good job of scheduling the "public" hearing in such a way that only family members find out about it in time.
A world government could perhaps make lots of legal records public while making it illegal for journalists to use that material for entertainment purpose but we don't have a world government: if the authorities in one country were to provide easy access to all the details of every rape and murder in that country then so-called "tech" companies in another country would use that data for entertainment purposes. I'm not sure what to do about that, apart, obviously, from establishing a world government (which arguably we need anyway in order to handle pollution and other things that are a "tragedy of the commons" but I don't see it happening any time soon).
The way AI companies could "severely/negligently mishandle the data, potentially repeatedly destroying innocent people's lives" is about historic data, though.
> “We are also working on providing a new licensing arrangement which will allow third parties to apply to use our data. We will provide more information on this in the coming weeks.
Alternatively consider that you are assuming the worst behavior of the public and the best behavior of the government if you support this and it should be obvious the dangerous position this creates.
I robbed a drug dealer some 15-odd years ago while strung out. No excuses, but I paid my debt (4-11 years in state max, did the minimum) yet I still feel like I have this weight I can’t shake.
I have worked for almost the whole time, am no longer on parole or probation. Paid all fines. I honestly felt terrible for what I did.
At the time I had a promising career and a secret clearance. I still work in tech as a 1099 making way less than I should. But that is life.
What does a background check matter when the first 20 links on Google are about me committing a robbery with a gun?
Edit: mine is an extreme and violent case. But I humbly believe, to my benefit surely, that once I paid all debts it should be done. That is what the 8+ years of parole/probation/counseling was for.
But the courts are allowed to do it conditionally, so a common condition, if you ask for a lot of cases, is that you redact any PII before making the data searchable. The effect is that people who actually care and know what to look for can find information, but you can't just randomly search for someone and see what you get.
There is also a second registry, separate from the courts, that keeps track of people convicted during the last n years and is used for background checks etc.
If it's not supposed to be public then don't publish it. If it's supposed to be public then stop trying to restrict it.
Any tool like this that can help important stories be told, by improving journalist access to data and making the process more efficient, must be a good thing.
It's like being made to search through sand: it's bad enough when you can use a sieve, but then they tell you that you can only use your bare hands, and your search efforts become useless.
This is not a new tactic btw and pretty relevant to recent events...
In other countries, interference with the right to a fair trial would have led to widespread protest. We don't hold our government to account, and we reap the consequences of that.
Obviously the Ministry of Justice cannot make other parts of government more popular in a way that appeases political opponents, so the logical solution is to clamp down on open justice.
You're conflating two distinct issues - access to information, and making bad decisions based on that information. Blocking access to the information is the wrong way to deal with this problem.
> a teenager shouldn't have their permanent AI profile become "shoplifter" because they did a crime at 15 yo that would otherwise have been expunged after a few years.
This would be a perfect example of something which should be made open after a delay. If the information is expunged before the delay, there's nothing to make open.
1000x this. It’s one thing to have a felony for manslaughter. It’s another to have a felony for drug possession. In either case, if enough time has passed, and they have shown that they are reformed (long employment, life events, etc) then I think it should be removed from consideration. Not expunged or removed from record, just removed from any decision making. The timeline for this can be based on severity with things like rape and murder never expiring from consideration.
There needs to be a statute of limitations just like there is for reporting the crimes.
What I’m saying is, if you were stupid after your 18th birthday and caught a charge peeing on a cop car while publicly intoxicated, I don’t think that should be a factor when you're 45 and applying for a job after going to college, having a family, having a 20-year career, etc.
Instead, we should make it illegal to discriminate based on criminal conviction history. Just like it is currently illegal to discriminate based on race or religion. That data should not be illegal to know, but illegal to use to make most decisions relating to that person.
But why shouldn't a 19 year old shoplifter have that on their public record? Would you prevent newspapers from reporting on it, or stop users posting about it on public forums?
If load on the server is a concern, make the whole database available as a torrent. People who run scrapers tend to prefer that anyway.
This isn't someone's hobby project run from a $5 VPS - they can afford to serve 10k qps of readonly data if needed, and it would cost far less than the salary of 1 staff member.
I don't think all information should be easily accessible.
Some information should be in libraries, held for the public to access, but have that access recorded.
If a group of people (citizens of a country) have data stored, they ought to be able to access it, but others maybe should pay a fee.
There is data in "public records" that should be very hard to access, such as evidence of a court case involving the abuse of minors that really shouldn't be public, but we also need to ensure that secrets are not kept to protect wrongdoing by those in government or in power.
Absolutely fucking crazy that you typed this out as a legitimate defense of allowing extremely sensitive personal information to be scraped.
> only system that could tell journalists what was actually happening in the criminal courts
Who cares? Journalism is a dead profession and the people who have inherited the title only care about how they can mislead the public in order to maximize profit to themselves. Famously, "journalists" drove a world-renowned musician to his death by overdose with their self-interest-motivated coverage of his trial[1]. It seems to me that cutting the media circus out of criminal trials would actually be massively beneficial to society, not detrimental.
[1] https://www.huffpost.com/entry/one-of-the-most-shameful_b_61...
https://hansard.parliament.uk/Commons/2026-02-10/debates/037...
Though I'm not sure stopping this service achieves that.
Also, even in the case that somebody is found guilty, there is a fundamental principle that such convictions have a lifetime, after which they stop showing up on police searches etc.
If some third party (not applicable in this case) holds all court cases forever in a searchable format, it fundamentally breaches this right to be forgotten.
Blow for open justice

A digital archive that helped journalists track criminal court cases is being shut down by the Ministry of Justice.
Courtsdesk will reportedly be deleted within days after HM Courts & Tribunals Service ordered every record wiped. The platform had been used by more than 1,500 reporters from 39 media outlets to search magistrates’ court lists and registers, but the move has triggered warnings that important cases could now go unreported.
Courtsdesk says it repeatedly found the media wasn’t being told about hearings, with two-thirds of courts regularly hearing cases without notifying journalists.
The platform was launched in 2020 following an agreement with HMCTS and approval by the Lord Chancellor and former Justice Minister Chris Philp, but HMCTS issued a cessation notice in November citing “unauthorised sharing” of court information.
Courtsdesk founder Enda Leahy said the company wrote to government agencies 16 times trying to save the service. It asked for the matter to be referred to the Information Commissioner’s Office but says that request went nowhere, and Philp himself approached current courts minister Sarah Sackman asking for the archive not to be deleted. The government refused last week.
Leahy told The Times that HMCTS couldn’t do what Courtsdesk did. She pointed to figures showing the court service’s own records were accurate just 4.2% of the time and that 1.6 million criminal hearings went ahead without any advance notice to the press.
“We built the only system that could tell journalists what was actually happening in the criminal courts,” she said.
An HMCTS spokesperson said the press would continue to have full access to court information to support accurate reporting.
HMCTS acted to protect sensitive data after CourtsDesk sent information to a third-party AI company.
Journalists’ access to court information has not been affected: listings and records remain available.
— Ministry of Justice (@MoJGovUK) February 10, 2026
If you don't "know about them from another source" you can't effectively find/access the information and you might not even know that there is something you really should know about.
The service bridged the gap by providing a feed about what is potentially relevant for you depending on your filters etc.
This means that, with the change:
- a lot of research/statistics is impossible to do
- journalists are prone to learning about potentially very relevant cases only when they are already over and no one was there to cover them
That's an assertion, but what's your reasoning?
> This would be a perfect example of something which should be made open after a delay. If the information is expunged before the delay, there's nothing to make open.
All across the EU, that information would be available immediately to journalists under exemptions for the Processing of Personal Data Solely for Journalistic Purposes, but would be simultaneously unlawful for any AI company to process for any other purposes (unless they had another legal basis like a Government contract).
Exactly. One option is for the person themselves, and no one else, to be able to ask for a LIMITED copy of their criminal history, which is otherwise kept private.
This way it remains private, the HR cannot force the applicant to provide a detailed copy of their criminal history and discriminate based on it, they can only get a generic document from the court via Mr Doe that says, "Mr Doe is currently eligible to be employed as a financial advisor" or "Mr Doe is currently ineligible to be employed as a school teacher".
Ideally it should also be encrypted by the company's public key and then digitally signed by the court. This way, if it gets leaked, there's no way to prove its authenticity to a third party without at least outing the company as the source.
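The scheme described above can be sketched in a few lines. This is a toy illustration only: HMAC with shared secrets stands in for the court's real public-key signature and the company's real key pair, the XOR "cipher" stands in for real encryption, and all names (`issue_certificate`, `employer_open`, the key values) are hypothetical, not any actual government API.

```python
import hashlib
import hmac
import json

# Toy stand-ins; a real system would use asymmetric crypto: the court
# signs with its private key, and the certificate is encrypted to the
# employer's public key so only that employer can read it.
COURT_SIGNING_KEY = b"court-secret"    # stand-in for the court's private key
EMPLOYER_KEY = b"employer-secret"      # stand-in for the employer's key pair

def _xor_stream(key: bytes, data: bytes) -> bytes:
    """Toy cipher: XOR the data with a SHA-256-derived keystream."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def issue_certificate(subject: str, verdict: str) -> dict:
    """Court encrypts the verdict to the employer, then signs the ciphertext."""
    payload = json.dumps({"subject": subject, "verdict": verdict}).encode()
    ciphertext = _xor_stream(EMPLOYER_KEY, payload)
    signature = hmac.new(COURT_SIGNING_KEY, ciphertext, hashlib.sha256).hexdigest()
    return {"ciphertext": ciphertext.hex(), "signature": signature}

def employer_open(cert: dict) -> dict:
    """Employer verifies the court's signature, then decrypts the verdict."""
    ciphertext = bytes.fromhex(cert["ciphertext"])
    expected = hmac.new(COURT_SIGNING_KEY, ciphertext, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, cert["signature"]):
        raise ValueError("certificate not signed by the court")
    return json.loads(_xor_stream(EMPLOYER_KEY, ciphertext))

cert = issue_certificate("Mr Doe", "eligible to be employed as a financial advisor")
print(employer_open(cert)["verdict"])
```

Note how the design gives the leak-resistance described: a leaked certificate is unreadable without the employer's key, and anyone who can read it has thereby outed the employer as the source.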
Absolutely not. I'm not saying every crime should disqualify you from every job, but convictions are really a government-officialized account of your behavior. Knowing a person has trouble controlling their impulses, leading to aggravated assault or something, very much tells you they won't be good for certain roles. As a business you are liable for what your employees do; it's in both your interests and your customers' interests not to create dangerous situations.
Blocking (or more accurately: restricting) access works pretty well for many other things that we know will be used in ways that are harmful. Historically, just having to go in person to a court house and request to view records was enough to keep most people from abusing the public information they had. It's perfectly valid to say that we want information accessible, but not accessible over the internet or in AI datasets. What do you think the "right way" to deal with the problem is, because we already know that "hope that people choose to be better/smarter/more respectful" isn't going to work.
That's up to the person for the particular role. Imagine hiring a nanny and some bureaucrat telling you what prior arrest is "relevant". No thanks. I'll make that call myself.
This made me pause. It seems to me that if something is not meant to inform decision making, then why does a record of it need to persist?
At some point, personal information becomes history, and we stop caring about protecting the owner's privacy. The only thing we can disagree on is how long that takes.
The UK does not have a statute of limitations
I'd go further and say a lot of charges and convictions shouldn't be a matter of public record that everyone can look up in the first place, at least not with a trivial index. File the court judgement and other documentation under a case number, ban reindexing by third parties (AI scrapers, "background check" services) entirely. That way, anyone interested can still go and review court judgements for glaring issues, but a "pissed on a patrol car" conviction won't hinder that person's employment perspectives forever.
In Germany for example, we have something called the Führungszeugnis - a certificate by the government showing that you haven't been convicted of a crime that warranted more than three months of imprisonment or the equivalent in monthly earning as a financial fine. Most employers don't even request that, only employers in security-sensitive environments, public service or anything to do with children (the latter get a certificate also including a bunch of sex pest crimes in the query).
Then there are cases like Japan, where not only companies, but also landlords, will make people answer a question like: "have you ever been part of an anti-social organization or committed a crime?" If you don't answer truthfully, that is a legal reason to reject you. If you answer truthfully, then you will never get a job (or housing) again.
Of course, there is a whole world outside of the United States and Japan. But these are the two countries I have experience dealing with.
However, there's also jobs which legally require enhanced vetting checks.
And only released if it's in the public interest. I'd be very very strict here.
I'm a bit weird here though. I basically think the criminal justice system is very harsh.
Except when it comes to driving. With driving, at least in America, our laws are a joke. You can have multiple at fault accidents and keep your license.
DUI, keep your license.
Run into someone because watching football is more important than operating a giant vehicle? Whatever, you might get a ticket.
I'd be quick to strip licenses over accidents and if you drive without a license and hit someone it's mandatory jail time. No exceptions.
By far the most dangerous thing in most American cities is driving. One clown on fan duel while he should be focusing on driving can instantly ruin dozens of lives.
But we treat driving as this sacred right. Why are car immobilizers even a thing?
No, you can not safely operate a vehicle. Go buy a bike.
ETA: They didn't ship data off to e.g. ChatGPT. They hired a subcontractor to build them a secure AI service.
Details in this comment:
https://news.ycombinator.com/item?id=47035141
leading to this:
https://endaleahy.substack.com/p/what-the-minister-said
The government is behaving disgracefully.
I’d then ask OpenAI to be open too since open is open.
I could have my mind changed if the public policy is that any public data ingested into an AI system makes that AI system permanently free to use at any degree of load. If a company thinks that they should be able to put any load they want on public services for free, they should be willing to provide public services at any load for free.
If it is public, it will be scraped, AI companies are irrelevant here.
If information is truly sensitive, do not make it public, and that's completely fine. This might have been the case here.
Yes, license plates are public, and yes, a police officer could have watched to see whether or not a suspect vehicle went past. No, that does not mean that it's the same thing to put up ALPRs and monitor the travel activity of every car in the country. Yes, court records should be public, no, that doesn't mean an automatic process is the same as a human process.
I don't want to just default to the idea that the way society was organized when I was a young person is the way it should be organized forever, but the capacity for access and analysis when various laws were passed and rights were agreed upon are completely different from the capacity for access and analysis with a computer.
They don't have a budget for that. And besides, it might be an externalized service, because self hosting is so 90s.
Traced what? Innuendo is not a substitute for information.
If a conviction is something minor enough that might be expungable, it should be private until that time comes. If the convicted person hasn't met the conditions for expungement, make it part of the public record, otherwise delete all history of it.
Why are we protecting criminals, just because they are minors? Protect victims, not criminals.
Unfortunately reputational damage is part of the punishment (I have a criminal record), but maybe it's moronic to create a class of people who can avoid meaningful punishment for crimes?
Yes
>Oh no, some musician died, PASS THE NATIONAL SECURITY ACT, LOCK DOWN ALL INFORMATION ABOUT CRIMINALS, JAIL JOURNALISTS!!!!
That said I don't know why the hell the service concerned isn't provided by the government itself.
This is why this needs to be regulated.
Well, no ; that's not up to you. While you may be interested in this information, the government also has a responsibility to protect the subject of that information.
The tradeoff was maintained by making the information available, but not without friction. That tradeoff is being shattered by third parties changing the amount of friction required to get the information. Logically, the government reacts by removing the information. It's not as good as it used to be, but it's better than the alternative.
I also think someone who has suffered a false accusation of that magnitude and fought to be exonerated shouldn’t be forced to suffer further.
https://en.wikipedia.org/wiki/Limitation_Act_1980
Applies to England and Wales, I believe there are similar ones for Scotland and NI
Is the position that everyone who experienced that coverage, wrote about it in any forum, or attended, must wipe all trace of it clean, for “reasons”? The defendant has sole ownership of public facts? Really!? Would the ends of justice have been better served by sealed records and a closed courtroom? Would have been a very different event.
Courts are accustomed to balancing interests, but since the public usually is not a direct participant they get short shrift. Judges may find it inconvenient to be scrutinized, but that’s the ultimate and only true source of their legitimacy in a democratic system.
In the local data that the audit examined from three police forces, they identified clear evidence of “over-representation among suspects of Asian and Pakistani-heritage men”.
It’s unfortunate to watch people and entire countries twist themselves into logic pretzels to avoid ever suggesting that immigration has any ills, and we’re all just being polite here about it.
https://www.aljazeera.com/news/2025/6/17/what-is-the-casey-r...
Sometimes you can't prove B was more qualified, but you can always claim some BS like "B was a better fit for our company culture"
[1] https://en.wikipedia.org/wiki/Disclosure_and_Barring_Service
Do you regard the justice system as a method of rehabilitating offenders and returning them to try to be productive members of society, or do you consider it to be a system for punishment? If the latter, is it Just for society to punish somebody for the rest of their life for a crime, even if the criminal justice system considers them safe to release into society?
Is there anything but a negative consequence for allowing a spent conviction to limit people's ability to work, or to own/rent a home? We have carve-outs for sensitive positions (e.g. working with children/vulnerable adults)
Consider what you would do in that position if you had genuinely turned a corner but were denied access to jobs you're qualified for?
If all you care about is preventing the information from being abused, preventing it from being used is a great option. This has significant negative side effects though. For court cases it means a lack of accountability for the justice system, excessive speculation in the court of public opinion, social stigma and innuendo, and the use of inappropriate proxies in lieu of good data.
The fact that the access speedbump which supposedly worked in the past is no longer good enough is proof that an access speedbump is not a good way to do it. Let's say we block internet access but keep in person records access in place. What's to stop Google or anyone else from hiring a person to go visit the brick and mortar repositories to get the data exactly the same way they sent cars to map all the streets? Anything that makes it hard for giant companies is going to make it hard for the common person. And why are we making the assumption that AI training on this data is a net social ill? While we can certainly imagine abuses, it's not hard to imagine real benefits today, nonetheless unforeseen benefits someone more clever than us will come up with in the future.
> What do you think the "right way" to deal with the problem is, because we already know that "hope that people choose to be better/smarter/more respectful" isn't going to work.
We've been dealing with people making bad decisions from data forever. As an example, there was redlining, where institutions would refuse to sell homes to or guarantee loans for minorities. Sometimes they would use computer models which didn't track skin color but had some proxy for it. At the end of the day you can't stop this problem by trying to hide what race people are. You need to explicitly ban that behavior. And we did. Institutions that attempt it are vulnerable both to investigation by government agencies and to liability in civil suits from their victims. It's not perfect, there are still abuses, but it's so much better than if we all just closed our eyes and pretended that if the data were harder to get the discrimination wouldn't happen.
If you don't want algorithms to come to spurious and discriminatory conclusions, you must make algorithms auditable, and give the public reasonable access to interrogate these algorithms that impact them. If an AI rejects my loan application, you had better be able to prove that the AI isn't doing so based on my skin color. If you can do that, you should also be able to prove it's not doing so based on an expunged record. If evidence comes out that the AI has been using such data to come to such decisions, those who made it and those who employ it should be liable for damages and, depending on factors like intent, adherence to best practices, and severity, potentially face criminal prosecution. Basically, AI should be treated exactly the same as a human using the same data to come to the same conclusion.
They are only available for vulnerable sectors; you can't ask for one as a convenience store owner vetting a cashier. But if you are employing child-care workers in a daycare, you can get them.
This approach balances the need for public safety against the ex-con's need to integrate back into society.
[1] https://rcmp.ca/en/criminal-records/criminal-record-checks/v...
So, likewise, I require to know everything about you, including things that are none of my business but that I've decided are my business, and that's what matters. I'll make that call myself.
Thanks, but I don't want to have violent people working as taxi drivers, pdf files in childcare and fraudsters in the banking system. Especially if somebody decided to not take this information into account.
Good conduct certificates are there for a reason -- you ask the faceless bureaucrat to give you one for the narrow purpose and it's a binary result that you bring back to the employer.
Sometimes people are unfairly ostracized for their past, but I think a policy of deleting records will do more harm than good.
That is a great recipe for systematic discrimination.
As if "character" was some kind of immutable attribute you are born with.
The question a people must ask themselves: we are a nation of laws, but are we a nation of justice?
I think the solution there is to restrict access and limit application to only what's relevant to the job. If someone wants to be a daycare worker, the employer should be able to submit a background check to the justice system who could decide that the drug possession arrest 20 years ago shouldn't reasonably have an impact on the candidate's ability to perform the job, while a history of child sex offenses would. Employers would only get a pass/fail back.
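The pass/fail scheme above can be sketched in a few lines. This is purely illustrative: the role names, offence categories, and relevance windows are assumptions, not any real jurisdiction's rules. The point is only that the employer receives a binary answer while the relevance judgment stays with the justice system.

```python
from datetime import date

# Hypothetical relevance rules: which offence categories matter for which
# roles, and for how many years. None = permanently relevant.
RELEVANCE_RULES = {
    "daycare_worker": {"child_sex_offence": None,  # never expires
                       "violent_offence": 10},     # relevant for 10 years
    "cashier":        {"fraud": 7},
}

def background_check(role: str, convictions: list, today: date) -> str:
    """Return only 'pass' or 'fail' -- the employer never sees details."""
    rules = RELEVANCE_RULES.get(role, {})
    for conviction in convictions:
        window = rules.get(conviction["category"], 0)  # 0 = not relevant
        if window is None:                             # permanently relevant
            return "fail"
        if window and (today - conviction["date"]).days <= window * 365:
            return "fail"
    return "pass"

# A 20-year-old drug possession conviction is irrelevant to daycare work...
old_drug = [{"category": "drug_possession", "date": date(2005, 6, 1)}]
print(background_check("daycare_worker", old_drug, date(2025, 6, 1)))  # pass

# ...but a child sex offence fails the check regardless of its age.
sex_off = [{"category": "child_sex_offence", "date": date(1990, 1, 1)}]
print(background_check("daycare_worker", sex_off, date(2025, 6, 1)))   # fail
```

In a real system the rules table would be statutory and the check would run on the justice system's side of an API boundary, so the conviction details never leave it.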
But the Internet's memory means that something being public at time t1 means it will also be public at all times after t1.
Coincidentally these same countries tend to have a much much lower recidivism rate than other countries.
This kind of logic does more disservice than people realize. You can combat bigotry towards immigrants (issue #1) without covering up for criminal immigrants (issue #2) out of fear of an increase in issue #1 among the natives. It only breeds more resentment and bigotry.
Note you are free to advertise hiring prior offenders.
You can also look up business ownership details and see if they have criminal records as well.
At the heart of Western criminal law is the principle: You are presumed innocent unless proven guilty.
Western systems do not formally declare someone "innocent".
A trial can result in two outcomes: Guilty or Not Guilty (acquittal). Note that the latter does not mean the person was proven innocent.
You are either found Guilty, or it is confirmed that you continue to be Not Guilty.
In Scotland there was also the verdict "not proven", but that's no longer the case for new trials.
Couldn't they just point to the court system's computer showing zero convictions? If the system records guilty verdicts, then showing none is already proof there are none.
The appellate court records would contain information from the trial court records, but most of the identifying information of the parties could be redacted.
You can do something very simple like having a system that just lists if a person is - at that moment - in government custody. After release, there need not be an open record since the need to show if that person is currently in custody is over.
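As a minimal sketch of the idea above (all names are illustrative, not any real agency's system): a registry that can only answer "is this person in custody right now?", where release deletes the entry entirely, so no historical record survives to be scraped later.

```python
class CustodyRegistry:
    """Answers only the question: is person X in government custody right now?"""

    def __init__(self):
        self._in_custody = set()  # IDs of people currently held; nothing else

    def take_into_custody(self, person_id: str):
        self._in_custody.add(person_id)

    def release(self, person_id: str):
        # Dropping the entry is the whole point: once released,
        # no trace remains that the person was ever listed.
        self._in_custody.discard(person_id)

    def is_in_custody(self, person_id: str) -> bool:
        return person_id in self._in_custody


registry = CustodyRegistry()
registry.take_into_custody("A123")
print(registry.is_in_custody("A123"))  # True while held
registry.release("A123")
print(registry.is_in_custody("A123"))  # False -- and no record it was ever True
```

The design choice is that deletion on release is structural, not a policy layer: there is simply no field in which a history could accumulate.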
As an aside, the past few months have proven that the US government very much does not respect that reasoning. There are countless stories of people being taken and driven around for hours and questioned with no public paper trail at all.
Welcome to the world of certificates of the good conduct and criminal record extracts:
Democrats love it too.
They call 'em "jump outs". Historically, the so-called constitution has been worth less than craft paper. From FDR's Executive Order 9066 to today, you have no rights.
I'm an employer and I want to make sure you haven't committed any serious crimes, so I ask for a certificate saying you haven't committed violent crimes. I get a certificate saying you have. It was a fistfight from a couple of decades ago when you were 20, but I don't know if it's that or if you tortured someone to death. Gotta take a pass on hiring you, sorry.
Seems like the people this benefits relative to a system in which a company can find out the specific charges you were convicted of would be the people who have committed the most heinous crimes in a given category.
To me this brings in another question, when the discussion should be focused on the extent to which general records should be open.
It worked well enough for a pretty long time. No solution can be expected to work forever, we just need to modify the restrictions on criminal histories to keep up with the times. It's perfectly normal to have to reassess and make adjustments to access controls over time, not only because of technology changes, but also to take into account new problems with the use/misuse of the data being restricted and our changing values and expectations for how that data should be used and accessed.
> If you don't want algorithms to come to spurious and discriminatory conclusions, you must make algorithms auditable, and give the public reasonable access to interrogate these algorithms that impact them.
I think we'd have much better success restricting access to the data than handing it out freely and trying to regulate what everyone everywhere does with that data after they already have it. AI in particular will be very hard to regulate (as much as I agree that transparent/auditable systems are what we want), and I don't expect we'd have much success regulating what companies do behind closed doors or force them to be transparent about their use of AI
We both agree that companies should be held liable for the discriminatory outcomes of their hiring practices no matter if they use AI or not. The responsibility should always fall on the company and humans running the show no matter what their tools/processes are since they decide which to use and how to use them.
We also agree that discrimination itself should be outlawed, but that remains an unsolved problem since detection and enforcement are extremely difficult. It's easier to limit the opportunity to discriminate than try to catch companies in the act. You mention that hiding people's race doesn't work, but that's actually being explored as a means to avoid bias in hiring. For example, stripping names and addresses (which can hint at race) before passing resumes to algorithms seems like it could help reduce unintentional discrimination.
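The stripping idea mentioned above can be sketched simply. This is an illustrative fragment, not a claim about any real hiring pipeline; the field names are assumptions. The idea is to remove fields that identify or proxy for protected attributes before a screening algorithm ever sees the resume.

```python
# Fields that identify a candidate or can proxy for race, age, etc.
# (names and the list itself are illustrative assumptions)
DROP_FIELDS = {"name", "address", "date_of_birth", "photo_url"}

def anonymise(resume: dict) -> dict:
    """Keep only fields a screening model has a legitimate reason to see."""
    return {k: v for k, v in resume.items() if k not in DROP_FIELDS}

resume = {
    "name": "Jane Doe",
    "address": "12 Example Street",
    "skills": ["COBOL", "SQL"],
    "years_experience": 8,
}
print(anonymise(resume))  # {'skills': ['COBOL', 'SQL'], 'years_experience': 8}
```

Of course, proxies can hide in free text too (school names, club memberships), so in practice this blocklist approach is only a first layer, not a complete fix.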
Ultimately, there'll always be opportunities for a bigot to discriminate in the hiring process but I think we can use a multifaceted approach to limit those opportunities and hopefully force them to act more explicitly making deliberate discrimination a little easier to catch.
You're over-thinking it, trying to solve for a problem that doesn't exist. No one has a "right" to work for me. There's plenty of roles that accept ex-cons and orgs that actively hire them.
Please don't unnecessarily censor yourself for the benefit of large social media companies.
We can say pedophile here. We should be able to say pedophile anywhere. Pre-compliance to censorship is far worse than speaking plainly about these things, especially if you are using a homophone to mean the same thing.
Also, when applying for a loan, being a sex offender shouldn’t matter. When applying for a mortgage across the street from an elementary school, it should.
The only way to have a system like that is to keep records, permanently, but decision making is limited.
So anyone interested in determining whether a specific behavior runs afoul of the law has to not only read through the law itself (which, "thanks" to being a centuries-old tradition, is very hard to read) but also wade through court cases from, in the worst case (very old laws dating to before the founding of the US), two countries.
Frankly, that system is braindead. It worked back when it was designed, when the body of law was very small, but today it's infeasible for any single human without the aid of sophisticated research tools.
Sure, there is still some leeway between only letting a judge decide the punishment and full-on mob rule, but it's not a slippery slope fallacy when the slope is actually slippery.
It's fairly easy to abuse the leeway to discriminate to exclude political dissidents for instance.
Protect victims and criminals. Protect victims from the harm done to them by criminals, but also protect criminals from excessive, or, as one might say, cruel and unusual punishment. Just because someone has a criminal record doesn't mean that anything that is done to them is fair game. Society can, and should, decide on an appropriate extent of punishment, and not exceed that.
This - nearly all drug deliveries in my town are done by 15-year-olds on overpowered electric bikes. Same with most shoplifting. The real criminals just recruit the schoolchildren to do the work, because they know schoolchildren rarely get punished.
But when the conspiracy involves lack of prosecution or inconsistent sentencing at scale, and then the Ministry of Justice issues a blanket order to delete one of the best resources for looking into those claims...? That significantly increases the legitimacy of the claims.
I assumed it was the usual conspiracy stuff up until this order.
"We hired a specialist firm to build, in a secure sandbox, a safety tool for journalists. They are experts in building privacy-preserving AI solutions - for people like law firms or anyone deeply concerned with how data is held, processed, and protected. That’s why we chose them. Their founders are not only respected academics in addition to being professionals, they have passed government security clearance and DBS checks in the past, and have worked on data systems for the National Archives, the Treasury, and other public agencies. They’ve published academic papers on data compliance for machine learning.
"The Minister says we ‘shared data with an AI company”... as if we were pouring this critically sensitive information into OpenAI or some evil aggregator of data. This is simply ridiculous when you look at what we do and how we did it.
"We didn’t “share” data with them. We hired them as our technical contractor to build a secure sandbox to test an idea, like any company using a cloud provider or an email service. They worked under a formal sub-processor agreement, which means under data protection law they’re not even classified as a “third party.” That’s not our interpretation. It’s the legal definition in the UK GDPR itself. ... "And “for commercial purposes”? The opposite is true. We paid them £45,000 a year. They didn’t pay us a penny. The money flowed from us to them. They were prohibited, in writing, from selling, sharing, licensing, or doing anything at all with the data other than providing the service we hired them for.. and they operated under our supervision at all times. They didn’t care what was in the data - we reviewed, with journalists, the outputs to make sure it worked."
If this is true, it does seem that the government has mischaracterized what happened.
At a certain point, we say someone is an adult and fully responsible for their actions, because “that’s who they are”.
It’s not entirely nuanced—and in the US, at least, we charge children as adults all the time—but it’s understandable.
"Free speech" is some kind of terminal brain worm that begs itself to be invoked to browbeat most anybody into submission, I suppose. Now we're apparently extending "free speech" to mean "the government must publish sensitive private information about every citizen in an easily-scraped manner". Well, I don't buy your cheap rhetorical trick. I support freedom of speech in exactly the scope it was originally intended, that is, the freedom to express ideas without facing government censorship or retaliation. Trying to associate your completely unrelated argument with something that everybody is expected to agree with by default is weak.
Nor is it "an attack on journalism". If a real journalist still exists in the current year, they can do investigative work to obtain information that is relevant to the public's interests and then publish it freely. Nobody is stopping them.
> LOCK DOWN ALL INFORMATION ABOUT CRIMINALS
Notably, people who aren't criminals may find themselves in court to determine whether or not they are. Unfortunately, people like yourself and the entirety of the broadcast and print media then go on to presume every person who goes to court is a criminal and do everything in their power to ruin their lives. Far from just "some musician", most people who are arrested on serious charges get a black mark on their record that effectively destroys their careers and denies them the ability to rent property outside of a ghetto, as employers and landlords discriminate against them baselessly even after they are acquitted of all charges against them.
Prosecution of sexual assault is often handled extremely badly. It needs to be done better, without fear or favor, including for people who are friends with the police or in positions of power, as we're seeing with the fallout of the Epstein files.
I'm not sure we can write that much more COBOL.
The premise here is, during an investigation, a suspect might have priors, might have digital evidence, might have edge connections to the case. Use the platform and AI to find them, if they exist.
What it doesn’t do: “Check this video and see if this person is breaking the law”.
What it does do: "Analyze this person's photos and track their movements; see if they intersect with Suspect B, or if Suspect B shows up in any photos or video."
It does a lot more than that but you get the idea…
The interpretation of the law is up to the courts. The enforcement of it is up to the executive. The concept of the law is up to Congress. That’s how this is supposed to work.
Or, you might just be doing the meme: https://x.com/MillennialWoes/status/1893134391322308918?s=20
Should it though? You can buy a piece of real estate without living there, e.g. because it's a rental property, or maybe the school is announced to be shutting down even though it hasn't yet. And in general this should have nothing to do with the bank; why should they care that somebody wants to buy a house they're not allowed to be in?
Stop trying to get corporations to be the police. They're stupendously bad at it and it deprives people of the recourse they would have if the government was making the same mistake directly.
Indeed. And as far as I know, "courts" is not an alternative spelling of "AI".
> Our understanding is that some 700 individual cases, at least, were shared with the AI company. We have sought to understand what more may have been shared and who else may have been put at risk, but the mere fact that the agreement was breached in that way is incredibly serious.
> ... the original agreement that was reached between Courtsdesk and the previous Government made it clear that there should not be further sharing of the data with additional parties. It is one thing to share the data with accredited journalists who are subject to their own codes and who are expected to adhere to reporting restrictions, but Courtsdesk breached that agreement by sharing the information with an AI company.
(from https://hansard.parliament.uk/Commons/2026-02-10/debates/037...)
it's a clear-cut free speech issue; you just don't want to admit it, since certain data being available and spread to other people doesn't suit your political ideology
That’s why the government should be transparent.
Great. How does it change the substance of my comment?
Perhaps, instead of arguing about whether “immigrants” is always a group as a collective, or a certain number of individuals acting together, you would focus on the high level implications of government’s action or inaction?
What do Epstein files have to do with anything right now? Stop shifting the goal posts.
- Policing language to distract from the topic.
- Trying to claim things are just a series of isolated incidents with absolutely nothing in common
- Claiming there are wider problems (that should be addressed in a manner that would take years and isn't even defined well enough to be measured as "better")
At a certain point, poorly thought-out "protections" turn into a system that protects organized crime, because criminals aren't as stupid as lawmakers, and they exploit the system.
There is a big difference between making a mistake as a kid that lands you in trouble, and working as an underling for organized crime, committing robberies, drug deals, and violent crime, and not having to face responsibility for those actions.
The legal system has so many loopholes for youth, for certain groups, that the law is no longer fair, and that is its own problem, contributing to the decline of public trust.
Whilst we're on Rotherham:
"...by men predominantly of Pakistani heritage" [0]
https://www.bbc.com/news/uk-england-south-yorkshire-61868863
Their parents or grandparents were immigrants...
great moral system you have there
To me any other viewpoint inevitably leads to abuse of one group or class or subset of society or another. If they are legally allowed to discriminate in some ways, they will seek to discriminate in others, both in trying to influence law changes to their benefit and in skirting the law when it is convenient and profitable.
If you can know the character of individual people, you have less reason to discriminate against those from statistically higher criminal communities.
Now for a serious answer: what happens in practice in Europe is not secret trials, because trials are very much public. Since there are only so many billionaires, their nephews, actual mafiosi, and politically exposed people under prosecution, the journalists will monitor those closely, but will not be at a hearing about your co-worker's (alleged) wife-beating activities.
It's all reported, surname redacted (or not, it depends), but we all know who this is about anyway. "Court records say that a head of department at a government institution REDACTED1 was detained Monday, according to the public information; the arrests happened at the Fiscal Service, and the position of the department head is occupied by Evhen Sraka."
What matters when this happens is not the exact PII of the person anyway. I don't care which exact nephew of which billionaire managed to bribe the cops in the end, but the fact that it happened or not.
Rank-and-file cops aren't that interesting, by the way, unless it's a systemic issue, because the violence threshold is tuned down anyway -- nobody does a routine traffic stop geared up like an occupying army.
Like everything, privacy is not an absolute right and is balanced against all other rights and what you describe fits the definition of a legitimate public interest, which reduces the privacy of certain people (due to their position) by default and can be applied ad-hoc as well.
Have you ever considered that these children are victims of organized crime? That they aren't capable of understanding the consequences of what they're doing and that they're being manipulated by adult criminals?
The problem here isn't the lack of long term consequences for kids.
Just because exceptions are exploitable, doesn't mean we should just scrap all the exceptions. Why not improve the wording and try to work around the exceptions?
So it would seem that you're the one straying from the topic.
In one comment you managed to violate a whole bunch of the HN commenting guidelines.
12 year olds know it’s not right to sell crack.
The problem is the gap between lack of legal opportunities for youth and the allure of easy money, status and power from being a criminal. Doesn’t help that the media makes it look so fun and cool to be a gangster.
please, show me your good faith interpretation and i will take back my comment
The problem happening in most Western countries is that criminal organizations take advantage of the fact that minors get reduced sentences and that their criminal records are usually kept sealed (unless they are tried as adults) -- whether it be having them steal cars, partake in organized shoplifting operations, muggings, gang activity, drug dealing, etc.
Your reasoning for why this information shouldn't be public record seems to boil down to the fact that you don't agree with other people's judgment of someone's past crimes. You'd like to see more forgiveness, and you don't think others will show the same forgiveness, so you want to take away all access to the information because of that. To me that seems like a view from a point of moral superiority.
I'd rather people get access to this information and be able to use their own brains to determine whether they want that person working there. If you were involved in shoplifting at 17 years old, and turn 18, I think it would be very fair for a store owner to be able to use that information to judge you when making a hiring decision. To me it doesn't make sense that you turn a magical age of 18 and suddenly your past poor decisions vanish into a void.