> We don’t own Your Content, but we may use Your Content to operate Copilot and improve it. By using Copilot, you grant us permission to use Your Content, which means we can copy, distribute, transmit, publicly display, publicly perform, edit, translate, and reformat it, and we can give those same rights to others who work on our behalf.
lol
The section titled "WHEN & WHERE THESE TERMS APPLY" includes:
> Conversations you have with Copilot through other Microsoft apps and websites
we can’t promise that any Copilot’s Responses won’t infringe someone else’s rights (like their copyrights, trademarks, or rights of privacy) or defame them.
You agree to indemnify us and hold us harmless (including our affiliates, employees and any other agents) from and against any claims, losses, and expenses (including attorneys' fees) arising from or relating to your use of Copilot
Maybe a shirt, could sell it on the Microsoft store even. Now that would be entertainment.
This is why I'm skeptical about this whole AI coding thing...
Says the bot based on scraped data
“These Terms don’t apply to Microsoft 365 Copilot apps or services unless that specific app or service says that these Terms apply.”
Think of Copilot being a suite of different products under the same overall banner and it starts to make (a bit) more sense.
Non-commercial use only. You agree not to use our Services for any commercial or business purposes and we (and our Providers) have no liability to you for any loss of profit, loss of business, business interruption, or loss of business opportunity.
It's funny that a plan called "Pro" cannot be used professionally.
> IMPORTANT DISCLOSURES & WARNINGS
Tells us:
> You may stop using Copilot at any time.
That's an odd thing to include in a ToS.
- "Are you for entertainment purposes only?"
- "Not at all — unless you want me to be. The short version: I'm not 'for entertainment only.'"
Edit: Ok I see it is legal framing to not be held liable, but can they just do that via ToS and let the tool itself promote something else?
> You may stop using Copilot at any time.
But how? Microsoft has shoved it into so many products that I don't see how it's possible, without dropping them altogether.
"We don’t own Your Content... By using Copilot, you grant us permission to use Your Content, which means we can copy, distribute, transmit, publicly display, publicly perform, edit, translate, and reformat it, and we can give those same rights to others who work on our behalf."
"Funny how? I mean, funny like I’m a clown? I amuse you? I make you laugh? I’m here to fuckin’ amuse you? What do you mean funny, funny how?"
Now what?! Do I have to uninstall Windows?
The words they used, as commonly understood by the target audience, were intentionally crafted to be interpreted differently than what they were going to say they meant in court. They spent time, effort, and money, ran focus groups, and carefully selected and curated their words to be incorrectly interpreted by the target audience to reach knowingly false conclusions.
The correct standard should be that they spent time, effort, and money, ran focus groups, and carefully selected and curated their words to be correctly interpreted by the target audience to reach true conclusions. Their statements should only be accidentally incorrect in proportion to the time and effort spent crafting and distributing them.
"Technically, your honor", should be treated as the ethical abomination it is.
People in glass houses shouldn't throw stones.
> That's an odd thing to include in a ToS.
Maybe it's the only Microsoft product for which that's true? (It certainly feels that way, sometimes.)
He signed, sent both copies, and got his bank-signed copy back.
He went to the bank, the bank sued him, he won (the judge told the bank that when you play dirty games you sometimes lose), and they ultimately settled.
Are you saying that the business version cannot make mistakes and can be relied upon for important advice?
Although intentionally saying things that contradict what's in the contract might be legally objectionable.
diff of the changes between US and UK:
https://www.diffchecker.com/BtqVrR9p/
There are the usual expected legal boilerplate differences. However, the UK version injects an additional clause at line 134 that has no analog in the US version.
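For anyone who wants to reproduce this comparison locally rather than through diffchecker, a plain unified diff does the job. This is a minimal sketch with made-up filenames and contents (`us_terms.txt`, `uk_terms.txt`, and the sample clause text are assumptions, not the real documents):

```shell
# Hypothetical stand-ins for locally saved copies of the two terms pages.
printf 'boilerplate\nshared clause\n' > us_terms.txt
printf 'boilerplate\nshared clause\nUK-only arbitration clause\n' > uk_terms.txt

# Unified diff: lines present only in the UK version appear with a leading '+'.
# diff exits non-zero when the files differ, hence the '|| true'.
diff -u us_terms.txt uk_terms.txt || true
```

With real saved copies of both pages you would see the injected UK clause as a `+` hunk at the relevant line.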
When sh!t hits the fan, Anthropic will immediately point to this clause. Who knows, maybe a court would see it as valid.
Meanwhile, your customer (and thus, your management) is looking for someone to blame for excrement making contact with the impellers. And that someone's gonna be you.
just to be greeted with an email that welcomed me to Copilot and the free plan. No button or link to disable the thing.
And believe me, if you use any Microsoft products or services they really make it hard to avoid accidentally using the damn thing.
Including adding it to your office plan and then charging you 2x.
They do seem to word this at a more professional level in this context (the terms linked are for individuals using Copilot in Windows, probably?)
IF YOU LIVE IN (OR YOUR PRINCIPAL PLACE OF BUSINESS IS IN) THE UNITED STATES, PLEASE READ THE BINDING ARBITRATION CLAUSE AND CLASS ACTION WAIVER IN SECTION 15 OF THE MICROSOFT SERVICES AGREEMENT. IT AFFECTS HOW DISPUTES RELATING TO THESE TERMS ARE RESOLVED.
Welcome to Copilot, your personal AI companion!
These Terms explain how you can use Copilot. By using Copilot, you agree to these Terms. Please read them carefully before you start using Copilot.
These Terms apply to your use of “Copilot,” which includes:
These Terms don’t apply to Microsoft 365 Copilot apps or services unless that specific app or service says that these Terms apply.
Certain words and phrases we use in these Terms have a particular meaning:
You need to be old enough to use Copilot – usually at least 13, but sometimes 18 or older, depending on your country’s laws. Because laws vary by country, Copilot isn’t available everywhere.
If you’re under 18, or if you use Copilot without logging in, we might turn off or limit some features for legal or safety reasons. If we ask for your birthday and country when you sign up or log in, you must give us your real information.
Don’t use tools or computer programs (like bots or scrapers) to access Copilot. You can only use Copilot for your own personal use.
Copilot is an AI-powered conversational service. Copilot will generate Responses to Prompts you submit and may also offer you Responses directly in your ongoing conversations or for things you have asked Copilot to remember.
Copilot tries to give you good answers, but it can make mistakes. Sometimes, the sources Copilot uses may not be reliable, relevant, or accurate, and sometimes, Copilot may give you wrong information. When responding, Copilot may use information it finds on the internet, and we don’t control that content. You might see Responses that seem convincing but are incomplete, inaccurate, or inappropriate.
Always use your judgment and check the information you get from Copilot before you make decisions or act. If you see something wrong or inappropriate from Copilot, use the Report or Feedback features in Copilot to let us know. If you have a legal concern about something Copilot says, please use the Report a Concern page to tell us.
Because of the way Copilot works, the Responses you get from Copilot may not be unique to you. Copilot may give the same or similar Responses and Creations to Microsoft, or to other people. Other people may send similar Prompts as yours, and they could get the same, similar, or different Responses and Creations.
By using Copilot, you’re telling us that:
When you use Copilot, you must follow the general Code of Conduct set out in the Microsoft Services Agreement. As applied to Copilot, this means:
If you see something wrong or inappropriate from Copilot, use the Report or Feedback features in Copilot to let us know. If you have a legal concern about something Copilot says, please use the Report a Concern page to tell us.
We may block, restrict, or remove your Prompts or other content from you that violates these Terms, or that could lead Copilot to create a Response that violates these Terms.
We may choose to limit or stop offering or supporting Copilot or any feature within Copilot at any time and for any reason.
Unless prohibited by law, we may limit, suspend, or permanently revoke your access to or use of Copilot (and potentially all other Services) in our sole discretion, at any time and without notice. Some of the reasons we might do this, for example, is if you breach these Terms or violate the Code of Conduct, if we suspect you’re engaged in fraudulent or illegal activity, or if your Microsoft Account or the account you use to log in to Copilot is suspended or closed. If you feel your access has been restricted by mistake, you may ask us to reevaluate our decision by submitting a request using the Report a Concern form outlining what you think we got wrong and why.
Depending on your location and other factors, we may offer you the opportunity to browse, shop and buy certain products through Copilot. If you use Copilot to buy something, it’s sold and shipped by a third party (“Merchant”), not by us. We don’t process payments for your purchases through Copilot.
We don’t own Your Content, but we may use Your Content to operate Copilot and improve it. By using Copilot, you grant us permission to use Your Content, which means we can copy, distribute, transmit, publicly display, publicly perform, edit, translate, and reformat it, and we can give those same rights to others who work on our behalf.
We get to decide whether to use Your Content, and we don’t have to pay you, ask your permission, or tell you when we do. But that doesn’t mean we can use it however we want. The Microsoft Privacy Statement explains how we use Your Content, and the privacy options in Copilot give you control over some of those uses.
We can decide to remove or stop using Your Content at any time for any reason. By sharing Your Content with Copilot, you promise us that you have all rights to Your Content and that if we use Your Content, we won’t be violating someone else’s rights.
Although our Terms grant you permission to use Copilot, we are not granting you any rights in the underlying technology, intellectual property, or data that makes up Copilot.
From time to time, we might need to update these Terms for different reasons. Some of those reasons might include adding new features, complying with changing laws, addressing security, safety, or fraud issues, or making our Terms clearer and easier to understand.
There may be rare circumstances where we need to update these Terms immediately. Otherwise, we’ll post the updated Terms to this page at least 30 days before they take effect. We’ll also include the date the terms take effect at the top of the page, so you can easily tell when we’ve made an update.
If you keep using Copilot after the updates take effect, you’re agreeing to those updates. If you don’t agree to the updates, you must stop using Copilot.
Software in general is usually provided on an "as is" basis with the creator not taking responsibility for anything going wrong.
> These Terms apply to your use of “Copilot,” which includes:
> The standalone Copilot apps on your computer or mobile device
> The Copilot service we offer at copilot.microsoft.com, copilot.com, and copilot.ai
> Conversations you have with Copilot through other Microsoft apps and websites
> Conversations you have with Copilot through third-party apps and platforms
> Other Copilot-branded apps and services that link to these Terms
> These Terms don’t apply to Microsoft 365 Copilot apps or services unless that specific app or service says that these Terms apply.
I can never find an article that mentions the final outcome.
In practice, availing yourself of any of these protections is a massively uphill battle. Judges tend to presume that these common law matters are already embedded into the de facto legal system because the people writing the laws already operated under those assumptions while framing the law. Personally, I disagree and think a lot of these protections have eroded away into either nothing, or so little that it might as well be nothing, but you have a 0% chance of drawing me as a judge in your case so that won't help you much if you try.
To be fair to them, MS are quite open about accuracy for the business offerings, see here as one example:
https://learn.microsoft.com/en-us/copilot/microsoft-365/micr...
This is not such a disclaimer. If Copilot fails its purpose of entertaining you, you can sue. /i
I reimplemented my startup idea from scratch with Codex a few months ago, just for peace of mind.
https://en.wikipedia.org/wiki/Buffalo_buffalo_Buffalo_buffal...
It is not at all uncommon for such absurd contract terms to be unenforceable - especially in B2C contracts, although it might even be tricky for B2B clickthrough ones.
The idea being that most contracts are fairly standard, so a lot of people will just skim through them. Putting a landmine in them is obviously in bad faith, so making it enforceable would basically make it impossible to do any kind of business at all.
When it's huge, falls upon people that can't justify a lawyer, and keeps changing all the time, one shouldn't even need to claim it. It should be automatically invalid.
In the UK the consumer terms say it's subject to English law and the courts of the UK jurisdiction you live in.
The commercial terms say that in the UK, Switzerland and the EEA there will be binding arbitration by an arbitrator in Ireland appointed by the President of the Law Society of Ireland.
When a construction guy messes up measurements and thousands of dollars of work has to be removed and redone, no one thinks of taking the employee to court. Why would you want to take your AI to court?
[1] https://en.wikipedia.org/wiki/Microsoft_.NET_strategy
The line I initially quoted:
> You may stop using Copilot at any time.
Was incomplete. It continues with what initially appears to be a non sequitur:
> You may stop using Copilot at any time. If you want to close your Microsoft Account, please see the Microsoft Services Agreement.
It may not be a non sequitur, but may well be the only way to "opt out" of Copilot.
We cancelled at T-45 or so days before renewal, having determined it wasn't a fit for our client anymore, and they insisted "well, actually, you've renewed anyway!" which, no, we haven't. Absolutely absurd to try to "clickwrap" buried renewal terms in a 20+ page T&C/privacy document rather than as a material point of fact on the actual order form being executed.
Feels like the height of absurdity to try to bully your client into forcing them to use your services against their will when they still gave ample notice that they were cancelling and when there was no material loss to the business, but it's always felt like their revenue team has been unhinged in general: exploding offers, insane terms, super high-pressure sales... part of the reason we left them in the first place.
If you just wrote them in "plain language" there would be far too much ambiguity and arguing over what was really meant or implied or agreed to.
We are comparing like for like - an individual user using a Claude Pro subscription. A US user can use it for commercial use and be in compliance with the terms, the UK user cannot.
(Not only that, employees who got a reprimand that was too heavy-handed can sue back. Plenty of cases around.)
"AI" company provides a service. They might or might not be adequate, that's not the point, the point is that the ability to sue them must always be on the cards if the agreed upon terms aren't met.
We live in a world where advertising boneless chicken does not actually mean the chicken does not contain bones.
Seems pretty clear to me, do you really think people need a lawyer to understand that?
Granted that this one document has a surprisingly clear language, but no, it's still not reasonable. Also, it was changed less than 6 months ago.
Why would they include a product for entertainment purposes only in the product they sell to large companies for doing work?
So either that document is fraudulent or everyone else at Microsoft is committing fraud daily.
Examples from the first search result: https://support.microsoft.com/en-us/topic/microsoft-365-copi...
Support page with ~25 tutorials provided by Microsoft about how to "Create a document with Copilot" or "Create a branded presentation from a file" or "Start a Loop workspace from a Teams meeting".
Do you actually believe that creating branded presentations (from Microsoft's own examples) is something people do for "entertainment purposes"?