https://www.goodboydigital.com/pixijs/bunnymark/
I'd assume most bots don't have a GPU attached :)
But, a few things could be more straightforward. Cloudflare makes the whole static site and DNS zone piece feel far more seamless. With Bunny you will still need to stitch records between different parts of their dashboard.
They recently upgraded the player for streaming media, which we use in one instance for tutorial videos, and the upgrade adds some previously missing accessibility features. All we needed to do was adjust the embed URL structure we were using, and we were all set.
But I hit a real issue recently: CDN edge caching served stale HTML after a deploy, and the service worker cached the bad response. It took a CDN purge from the dashboard to fix. The debugging experience when things go wrong at the edge is painful; you're always guessing which cache layer is the problem.
That being said, the free tier is hard to beat for getting started. Workers, Pages, KV, R2 — you can run a full production app at near-zero cost until you hit scale. Not sure if Bunny offers that.
If i see something horrific like:
import * as BunnySDK from "@bunny.net/edgescript-sdk"

BunnySDK.net.http.serve(async (request: Request) =>
Thats a proprietary lock-in worse than what it tries to replace!
- Easy upload of BIND zone files
- Flattened CNAME to support naked domains
- Robust free role-based permissions to add other ppl
Anyone have suggestions for moving a stack of domains, many being little community and hobby projects away from cloudflare for a small overall price. Agency pricing like migadu offers for email on custom domains is what I have in mind.
Not many DNS management providers (that I'm aware of, please correct me) support CNAME flattening. That is, resolving a CNAME at the zone apex into A records, so your naked domain can effectively point at a hostname.
Every time I purge the pull zone cache, I do it twice, because once from my CI isn't enough. My CI does individual page cache invalidation during deployment, but there seems to be some delay (with no feedback) while the purge propagates across the edge.
Only using edge storage, DNS, and CDN so far but very happy with Bunny.
Highly recommend their Edge Containers product, super simple and has nice primitives to deploy globally for low-latency workloads.
We connect all containers to one redis pubsub server to push important events like user billing overages, top-ups etc. Super simple, very fast, one config to manage all locations.
I’ve now been with Bunny.net for over a year and have been very happy with the service.
edit: I'm thinking of the use case where the CDN acts as a proxy for APIs and uncachable content as well, i.e. where it's used as a reverse proxy for transit/DDoS protection.
See https://bunny.net/gdpr/. Also noticed this:
> While uncommon, bunny.net also provides a way to block users from the EU from accessing your content altogether by using our traffic manager tools if you do not wish to serve users from the European Union.

Which I assume can be reversed, only serving to users from the EU.
That said, the edge caching being how it is, it's possible to run into race conditions where the cache has been purged but not yet propagated to the edge network; if a node is visited too soon, the stale version might end up back in the cache.
[0]:
curl --fail --output "/dev/null" --silent --show-error -X POST "https://api.cloudflare.com/client/v4/zones/{our_zone_id}/purge_cache" \
-H "Authorization: Bearer $CLOUDFLARE_CACHE_PURGE_API_TOKEN" -H "Content-Type: application/json" \
--data '{"hosts": ["instorier.com", "www.instorier.com"]}'

But that changes things
That being said, I had enough issues with Bunny and CF debugging across regions that I made this free tool to do both remote HTTP and TCP traceroutes to keep my sanity: https://dnsisbeautiful.com/global-http-availability
I don't like free offerings, because what if they decide to charge someday? What if someone decides "free is not feasible, we start charging $20 per instance now".
I'd rather have a low fee now, a change from $2 to $3 is more likely and that's fine for me. But from free to not free is risky for me.
I also like smaller, independent-ish companies that actually care about developers. That's why I use bunny.net, transistor.fm, Plausible Analytics.
I hear this argument all the time, but I think it's more complicated.
Firstly, if people used more diverse / smaller services, the distribution of outages would change. While smaller outages would likely become more frequent and asynchronous, many platforms can still break when only one of their dependencies breaks. So you might well face even more frequent outages overall, just not synchronized ones.
Secondly, we are not sure if these smaller services are on par with the reliability of Cloudflare and other big players.
Thirdly, not all Cloudflare infrastructure is fully centralized. There is definitely some degree of distribution and independence in/between different Cloudflare services. Some Cloudflare outages can still be non global (limited by region or customers that use certain feature set, etc).
https://social.mikutter.hachune.net/@mok/116208294430782702
BunnyCDN intentionally mangles Mastodon request signing, so as to make it incompatible with Mastodon.
And, they confirmed it's intentional.
Does anyone have thoughts or disagree on this in terms of pricing and cost effectiveness?
>One of my biggest concerns though is around how easily I could become heavily dependent on this one single company that then can decide to cut me off [...]
How does switching to Bunny make a difference?
It would be super nice to have a setup that uses multiple CDNs w/ automatic failover.
Some of our users were unable to reach our CDN altogether. They couldn't load any assets at all. Bunny's customer service was far too slow to respond and mostly gave unhelpful answers. They couldn't even identify the issue.
In less than 45 minutes, I moved our CDN entirely from Bunny to Cloudflare Workers. Now our CDN just works, and I don't have to debug it for Bunny's customer service team.
Also, this is obviously a marketing post.
Some of you may be skeptical about this but it allows for much easier management when working on multiple SaaS/hobby projects/personal tools.
There’s a cost limit to how much high availability is worth on any project, but vendors like Cloudflare don’t respect that.
I feel like you missed what the author meant with that phrase. The author wasn't talking about for their website, but the internet as a whole.
> I can’t help but feel that the idea of centralizing the internet into a single US corporation feels off.
The point of picking Bunny.net is that it's an alternative to this single entity that's got so much of the internet running through it, and it's less susceptible to the BS in the US.
A few people here are complaining about the lack of a free tier, but Magic Containers can cover a lot of the same ground as Cloudflare's Durable Objects, which IIRC cost a minimum of $5/month.
The solution was to move to Bunny, and that worked for everyone.
With free offerings, you’re always helping the supplier in some way. Then you become the product. Which makes it difficult to understand the value exchange; it’s much easier to do so when you’re just paying a fair sum of money.
I have IPv6-only backends and I had to select serving from the main POPs rather than the entire network (which is fine by me as they are also cheaper).
If you actually care for the resiliency necessary to survive a provider outage you should have more than one provider.
Which means you should be running your own origin and using the simplest CDN features you possibly can to make your use case work.
In a time where more people usually beg for forgiveness instead of asking for permission, it already has
https://www.cloudns.net/premium/
https://www.luadns.com/pricing.html
I've found every other offering to be lacking. Some examples:
- Cloudflare is alright, but has settings footguns if you're not used to Their Way of Doing It™ (e.g., before using DNSControl, I had to manually flip switches to turn off proxying every time I updated my zones).
- deSEC is free and okay, but sometimes quite slow to propagate, and its UI and API are unwieldy.
- DNS Made Easy is often pushed on social media, but it's ridiculously pricey for what you get if you don't need an SLA.
- DNSimple seemed nice, but IIRC I couldn't get a different API token per zone (?).
I'm currently relying mainly on LuaDNS. For me, it functions as a "dumb" DNS host (i.e., not using their Lua configuration-as-code system). Their API is oddly designed, but it's been passable since a recent-ish update, which has allowed me to safely port my zone files to DNSControl.
?
So 1 euro a month is too expensive for you? Wow.
Just pay the 1 Euro or go to GitHub where that is free but goes down almost every week.
This logic doesn't hold much water, however. Abrupt changes in pricing or other conditions happen with paid tiers as well.
Practically, any metered supplier can put you out of business. It usually doesn't happen because destruction is mutually assured.
+1 for using smaller, more independent companies in any case!
You can just move to another provider at that point. At least when it comes to CDN and DNS there’s literally no vendor lock-in.
You can grab your DNS records, export them to CSV, and import them somewhere else easily; and a CDN is just a file server, so you can just hand your files to someone else.
The CDN certainly has it: https://bunny.net/blog/ipv6-returns-to-bunnycdn/
Depending on where I query from, OP's blog does have it as well:
# host jola.dev
jola.dev has address 37.19.207.38
jola.dev has IPv6 address 2400:52e0:1a04::1310:1

Doable, but that removes all the free tiers of all the CDNs. AFAIK they all require an enterprise account to keep using one's own DNS and their own GSLB DNS failover. There are probably a few exceptions, and one could maybe make something of that, but I don't know which ones are the exceptions.
A week ago I (a hobbyist running a small side project for a dollar or two a month in normal usage, so my account is marked as "individual") got hit with a ~$17,000 bill from Google cloud because some combination of key got leaked or my homelab got compromised, and the attacker consumed tens of thousands in gemini usage in only a few hours.
Google denied a rate adjustment, and haven't reached back out to me for a good few days now. My credit card denied the charge because it was over my credit limit by a good few thousand dollars and they suspected fraud, but now I am terrified of being taken to collections and ruining my prospects of renting an apartment due to my credit score/history being ruined, or them just taking me to court.
I am never going to use "use now, pay later" services, especially with cloud portals where it's so hard to put in an actual cap, and where the provider doesn't have any sane rate limits. I am fine paying if it was negligence or a mistake on my part, as a very expensive lesson in security, but $17k is brutal.
The fact that they don't have an easy way to hard-cap usage (especially for an individual account), that their rate limits are ineffective (how on earth is an account that pays a few dollars a month able to run up tens of thousands in just a few hours?), and that there isn't even a phone number to call makes me never want to use their service, or any "use now, pay later" service with no easy caps or rate limits, ever again.
50 cents per domain per month
10 cents per million queries
That’s probably cheap enough to support lots of little hobby sites, and bigger-traffic sites likely have some budget.
I still have no idea what any of this has to do with any clients moving from Cloudflare to Bunny.net, what am I missing?
Granted, Cloudflare also does DDoS protection, and that makes sense for an API. You could do some DDoS protection without stripping TLS, but it can only protect against volumetric attacks like SYN/ACK floods, not against attacks that establish full TCP connections and overwhelm the app server. (Rate limiting incoming connections can go a long way, but depending on the details, the attack might still be enough to overwhelm the serving resources; your use case is up to you to understand.)
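The connection rate limiting mentioned above is often just a token bucket per client. A toy sketch (purely illustrative, not any provider's actual implementation):

```typescript
// Toy token bucket: refills `rate` tokens per second up to `burst`.
// A real deployment would keep one bucket per client IP, with expiry.
class TokenBucket {
  private tokens: number;
  private last: number;

  constructor(private rate: number, private burst: number, now: number = Date.now()) {
    this.tokens = burst; // start full
    this.last = now;
  }

  // Returns true if the request may proceed, consuming one token.
  allow(now: number = Date.now()): boolean {
    const elapsedSec = (now - this.last) / 1000;
    this.last = now;
    this.tokens = Math.min(this.burst, this.tokens + elapsedSec * this.rate);
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}
```

This caps per-client request rates; as noted above, a large enough distributed attack can still overwhelm the origin despite it.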
But if a free offering suddenly says "We are getting rid of free, only starting $899 a month baseline, because we noticed our free users aren't converting and we only want to support enterprise from now on". Well, then I have to move everything.
Still a big price hike can come, but +20% monthly is easier to stomach than if I can't be sure what will happen to the free offering.
The user sent in a help ticket, and Bunny confirmed this response rewrite was intentional and would not fix it.
I wanted to get this out, not to conjecture as to why.
Also before doing this save anything important that Google owns (gmail, youtube videos, anything in storage). The leaders at Google are vengeful enough to completely lock you out for challenging them.
ehhhh, really depends on which CDN features you're using, and at what volume. Using ESI? VCL? Signed URLs or auth? Any other custom functionality? Are you depending on your provider's bot management features which are "CONTACT FOR PRICE" with other providers? Does your CDN provider have a special egress deal with your cloud provider?
It's possible to picture this being easy in the same way that being multi-cloud or multi-region is easy.
[1] Not completely sure but I think this was the incident https://blog.dnsimple.com/2020/07/incident-dns-resolution/
As a consequence I've had to build quite defensively - adopting a PWA approach - heavy caching and background sync. My hope is that latency improves over time because the platform is nice to work with.
Good riddance to the "free" model. It's never actually free. You either pay with your data, or have to consume ads, or you're forcing other customers to pay for your free usage.
Bunny bills per resource utilization (not provisioned), and since we run the backend on Go, it consumes something like 0.01 CPU and 15 MB RAM per idle container and costs pennies.
At some level, it's like they become your edge router.
Not yet. Working on it, though.
https://www.ftc.gov/business-guidance/resources/disclosures-...
It's not perfect but it's better than the alternatives and we really need a power bloc (even if currently only economic) that isn't the US and China.
Alternatives to US big tech are always welcome.
> You provide handlers that fulfil requests from the system.
As I said previously, though I wish they were, such handlers are not part of WinterTC.
And then again, how those handlers are registered is also not part of WinterTC, which I also wish it were.
> APIs like that leak implementation details
How?
Almost all runtimes, like Bunny Edge Scripting, Cloudflare Workers, Deno, Bun, etc. use the same basic signature for the handler:
(request: Request) => Promise<Response>
Only how you register said handler is, unfortunately, different for each runtime.
[1] https://developers.cloudflare.com/workers/runtime-apis/handl...
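A concrete sketch of that split between the portable signature and per-runtime registration (the registration shapes are my reading of each runtime's docs, so double-check before relying on them):

```typescript
// One portable handler using the shared (request) => Promise<Response> signature.
const handler = async (request: Request): Promise<Response> =>
  new Response(`hello from ${new URL(request.url).pathname}`);

// Only the registration step differs per runtime:
//   Cloudflare Workers:   export default { fetch: handler };
//   Deno:                 Deno.serve(handler);
//   Bun:                  Bun.serve({ fetch: handler });
//   Bunny Edge Scripting: BunnySDK.net.http.serve(handler);
```

Keeping the handler in its own module and doing the registration in a tiny per-runtime entry file is one way to stay portable.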
Almost all technological choices I made as a teen were driven by "what hosting can I get for free, as my parents sure as hell won't put down their payment information for that". Back then that usually meant PHP and a max. 50MB MySQL.
Also, as I said in another thread, they charge $1 even for a single test HTTP request.
If they can't they probably should move to an international focused site.
TL;DR my motivation and experience for moving my blog from Cloudflare to bunny.net
I’ve been a long time Cloudflare user. They offer a solid service that is free for the vast majority of their users, that’s very generous. Their infrastructure is massive and their feature set is undeniably incredible.
One of my biggest concerns though is around how easily I could become heavily dependent on this one single company that then can decide to cut me off and disable all of my websites, for any arbitrary reason. It’s a single point of failure for the internet. Every Cloudflare outage ends up in the news. And I can’t help but feel that the idea of centralizing the internet into a single US corporation feels off. Not to mention the various scandals that have surrounded them. So I was open to alternatives.
Bunny.net (affiliate link because why not, raw link here) is a Slovenian (EU) company that is building up a lot of momentum. Their CDN-related services rival Cloudflare already, and although their PoP network is smaller than Cloudflare’s, they score highly on performance and speed across the globe. It’s a genuinely competitive alternative to Cloudflare.
It has the additional benefit of being a European company, and I like the idea of growing and supporting the European tech scene.
I’ve been using various different services, but focusing on this blog, the first thing was Cloudflare as the registrar for the domain name. I did some research on alternative registrars, but I just didn’t find any good European options. The closest I found was INWX, but their lack of free WHOIS Privacy made them a non-option. I ended up with Porkbun. They run on Cloudflare infrastructure, but they have better support. So the remaining thing Cloudflare was doing for me was the “Orange Cloud”: automatic caching, origin hiding, and optional protection features.
So that’s what we’re moving over! I’m gonna walk you through how to set up the bunny.net CDN for your website, with some sensible defaults.
Setting up your bunny.net account is quick, and you get $20 worth of free credits to play around with, valid for 14 days. You don’t need to give them a credit card up front to try things out, but if you do, you get another $30 worth of credits. You do need to confirm your email before you can start setting things up, though. Once you’re out of the trial, you pay per use, which for most cases is cents a month. However, note that bunny.net requires a minimum payment of $1 per month.
I guess a cheap price to pay to stop being the product and start becoming the customer.
The pull zone is the main mechanism for enabling the CDN for your website. You’ll find them under CDN in the left navigation bar. Here’s how to set one up:
And you’re done with the first part!
Now that you’ve set up the pull zone, it’s time to hook it up to your website and domain. Go to the pull zone you created. You’ll see a “hostnames” screen. Time to connect things.
This is the part where bunny.net will really shine through!
If your website is set up to return the appropriate cache headers for each resource, things will just work. Bunny defaults to respecting the cache control headers when pointing a pull zone at an origin site. To verify, go to Caching → General and check that “Respect origin Cache-Control” is set under “Cache expiration time”. Note that if you set no-cache, bunny will use that and will not cache at the edge.
Alternatively, if you don’t have cache headers set up, and you don’t want to control that yourself, you can instead enable Smart Cache. This will default to caching typically cached resources like images, CSS, JS files etc, while avoiding caching things like HTML pages. This will work for most cases!
But I wanted to go faster. If you’ve read my post about building this website, here’s how I’ve set up my cache headers: I added a new pipeline in the router called public and added an extra middleware to it. I technically have everything using this pipeline, but leaving the standard browser pipeline that comes out of the box with Phoenix keeps my options open to add authenticated (uncached) pages in the future.
pipeline :public do
  plug :accepts, ["html"]
  plug :put_root_layout, html: {JolaDevWeb.Layouts, :root}
  plug :put_secure_browser_headers, @secure_headers
  # set the CDN-friendly cache-control header on every response
  plug :put_cdn_cache_header
end

defp put_cdn_cache_header(conn, _opts) do
  # s-maxage=86400: shared caches (the CDN) may keep this for a day
  # max-age=0: browsers must revalidate on every request
  put_resp_header(conn, "cache-control", "public, s-maxage=86400, max-age=0")
end
You can see the whole router here https://github.com/joladev/jola.dev/blob/main/lib/jola_dev_web/router.ex.
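That cache-control value tells shared caches (the CDN) to keep a page for a day while browsers revalidate every time. A rough sketch of the shared-cache directive logic (my own illustration of standard Cache-Control semantics, not Bunny's actual implementation):

```typescript
// Derive a shared-cache (edge) TTL in seconds from a Cache-Control header value.
// For shared caches, s-maxage takes precedence over max-age;
// no-store, no-cache, and private all mean "don't cache at the edge".
function edgeTtlSeconds(cacheControl: string): number {
  const directives = new Map<string, string>();
  for (const part of cacheControl.split(",")) {
    const [key, value] = part.trim().split("=");
    directives.set(key.toLowerCase(), value ?? "");
  }
  if (directives.has("no-store") || directives.has("no-cache") || directives.has("private")) {
    return 0;
  }
  const ttl = directives.get("s-maxage") ?? directives.get("max-age");
  return ttl ? parseInt(ttl, 10) : 0;
}
```

So `public, s-maxage=86400, max-age=0` yields a day of caching at the edge even though browsers treat the page as immediately stale.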
This setup means I even cache the HTML pages, which makes this ridiculously fast. Here’s the landing page response time from various locations, using the Larm response time checker tool:

Because I’m caching the HTML pages, if I publish a new post I do need to purge the pull zone to reset the cached HTML files.
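That purge can be scripted as part of the publish step. A hypothetical deploy hook (the endpoint shape is taken from Bunny's public API docs as I remember them, so verify it against the current API reference before using):

```typescript
// Build the purge endpoint for a pull zone (the ID is a placeholder).
const purgeEndpoint = (pullZoneId: number): string =>
  `https://api.bunny.net/pullzone/${pullZoneId}/purgeCache`;

// Purge the whole pull zone; the AccessKey comes from the Bunny dashboard.
async function purgePullZone(pullZoneId: number, apiKey: string): Promise<void> {
  const res = await fetch(purgeEndpoint(pullZoneId), {
    method: "POST",
    headers: { AccessKey: apiKey },
  });
  if (!res.ok) {
    throw new Error(`Purge failed with HTTP ${res.status}`);
  }
}
```

Wired into CI after the deploy step, this removes the manual dashboard purge from the publishing flow.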
All of these are optional, but nice to have!
On your pull zone page, under General → Hostnames, go toggle “Force SSL” on for your domain to ensure that all requests use SSL. SSL/TLS is pretty standard these days, and many TLDs and websites use HSTS to enforce it, but no harm in enabling it here too.
DDoS protection comes out of the box, but we can set some other things up. First of all, go to Caching and then Origin Shield in the left menu on your pull zone, and activate Origin Shield. Select the location closest to your origin. This reduces load on your server, as bunny.net will cache everything in the Origin Shield location, and all edge locations will try that location first before hitting your server.
Next, go to Caching → General and scroll down. At the bottom of the page you can select Stale Cache: While Origin Offline and While Updating. This means bunny will keep serving cached content, even if it is stale, when it can’t reach your origin, and that it will serve stale content while fetching the latest version. Both are nice-to-haves: nothing you have to enable, but they provide a slightly better service to your users!
Next, let’s set up an Edge rule to redirect any requests to our automatically generated pull zone domain to our actual domain, to avoid confusing crawlers. On your pull zone, in the left menu, click Edge rules.
Set the rule’s action to redirect to https://jola.dev{{path}}, and set the condition to match .*://<slug>.b-cdn.net/*, replacing <slug> with the name given to your pull zone. Now you should be able to go to https://<slug>.b-cdn.net and get redirected to your proper domain!
This post just covers the very basics of getting set up on bunny.net. I haven’t even scratched the surface of edge rules, cache configuration, the Shield features for security and firewalls, video hosting and streaming, edge scripting and edge distributed containers, and much more.
I especially appreciate the great statistics, logs, and metrics you get out of the dashboard. You can even see every single request coming through to help you investigate issues, and clear feedback on what’s getting cached and not. I’m actively moving everything else over and I’m excited for the upcoming S3 compatible storage!
You should give bunny.net a try!
lol, that ship sailed a long time ago. It's certainly not a full federal republic, but it's a lot closer to one than a mere "economic alliance".
The line between those two things in the case of the EU is awful blurry.
The Espace Léopold issues laws that are binding on member nations, wields significant power over trade, fiscal policy, and mandates open borders between member nations. These are hardly the features of a purely economic treaty organisation.
I have no idea what two of those acronyms mean. None of this is part of what a CDN offers.
Yes if you use DDoS protection, or cloudfare’s ZeroTrust or embrace $X proprietary features then what I said no longer applies.
I strictly said DNS and CDN.
Or there is a loosely defined locally-run thing called 'Trading Standards' which is done at the council ("municipality") level.
and for the record I am just being difficult and everyone in tech/mildly well read knows what the (U.S.) FTC is. My point is more that one country's rules don't always matter for the operations of domestic commerce in another amongst their own citizens.
We famously mock our own jurisprudence — "if Parliament passes a law that it is illegal to smoke on the streets of Paris, then it is illegal to smoke on the streets in Paris" — so even when hard legislation exists (4chan/Ofcom shitshow?), it is meaningless.
The only power that matters long term in the universe is sheer force and hard power, and it has always been that way.
Either way, we are on the internet. Pretty international stuff.
But in practice, we almost never receive major contributions from outside the team. Which is fine. We're happy just to have our team working in the open.
The reasons we open sourced it are:
1. Support a realistic local dev environment (without binary blobs).
2. Provide an off-ramp for customers concerned about lock-in. Yes, really. We have big customers that demand this, and we have had big customers that actually did move off Cloudflare by switching to workerd on their own servers. It makes business sense for us to support this because otherwise we couldn't win those big customers in the first place.
The point of this discussion is that you can self-host, and you have a good chance of migrating the code away entirely. That's a big benefit that isn't "an attempt to get free labor". For that use, not only does it not matter if it's meaningfully open source, it doesn't matter if it's open source at all.
VCL = Varnish Configuration Language i.e. how you configure your Fastly services
If you're just using a CDN as a proxy then there's no lock in but plenty of sites are using CDNs for much more than that
1. For DNS we have standardized AXFR requests which the DNS provider needs to support as they are part of the DNS standard. There is not an option of not having that unless you have a really shitty provider that you should change anyway.
2. Same for Mass Import because again DNS already defines these things at the protocol level.
And resetting 2FA or whatever is just the cost of using any service
Personally I have used CF for ~10 years so I have saved $240 and I simultaneously use GitHub Pages and CF Pages for CDN because again I just need to give them a bunch of static files. Adding a third CDN provider would literally be a single command at the end of my build pipeline.
And how is that related to me? My comment said (and the parent I replied to) mentioned DNS and CDN.
Now we add compute services, data storage, whatever D1 is and the other comment mentioned auth/authz
Are people not aware what CDN and DNS are?
In my case, and it was the 90s, I took the time to setup a way to pay by calling a premium (1-900) for $1.49 number so the barrier to entry even for kids was still reasonable.
Maybe in modern day the equivalent is adding Google pay and Apple pay then you cover some kids at least (gift cards and such).
Quite the hassle for the provider, and it will turn away any person who cares about privacy. There's no way to win anymore.
> Minimum Account Balance
> In order to keep your service online, you are required to keep a positive account credit balance. If your account balance drops low, our system will automatically send multiple warning emails. If despite that, you still fail to recharge your account, the system will automatically suspend your account and all your pull zones. Any data in your storage zones will also be deleted after a few days without a backup. Therefore, always make sure to keep your account in good standing.
You proactively replenish your balance, so in the worst case, you can just let the account go.