> Traditional desktop apps were sold as single purchase items on CDs. That just doesn't work for local first software, since you probably just navigate to a website to get the software.
Did I misunderstand this part? A lot of local software is sold as one time purchase downloads.
> Traditional desktop apps were sold as single purchase items on CDs. That just doesn't work for local first software, since you probably just navigate to a website to get the software.
How does the reason you provide actually support the claim it's offered in support of? There are plenty of things sold as single purchases that you buy by just navigating to a website where you make the purchase.
There are an infinite number of things that are sold as single purchases on CDs that you buy by just navigating to a website where you make the purchase.
While local first software might be easier to scale, the financial aspect remains unsolved.
I can see that Harper (the author's software) was acquired by Automattic, so I assume that Automattic is paying the author to maintain Harper, effectively subsidizing the maintenance costs to keep Harper free.
Not every local first (and open source) software has the opportunity to be supported by a big company.
An alternate article for this traffic spike is "PaaS is Easier to Scale". When the author relies on others to do the hosting and handle the scale the author doesn't have to worry about it. That's why he didn't need to be alerted. He's relying on others for that responsibility.
The authors rely on themselves, not other people. The developers of local first software take on the responsibility of making sure the app runs entirely on local resources. That's not the job of anybody else, and you don't need to pay for it like you do with cloud resources. This means it's cheaper for a startup to distribute local first software. The trick is that it's much harder to get paid for the app.
The website is hosted by Automattic. The Firefox extension is hosted by Mozilla. The Chrome extension is hosted by Google. The Obsidian plugin is hosted by Obsidian. The VS Code extension is hosted by Microsoft. The source code is hosted by GitHub. The Discord is hosted by Discord.
If you delegate your entire backend to other companies you won't be the one who has to worry about scaling.
With true local first software, there is no backend. Sometimes there's a static host that just delivers the code, but has no endpoints. Providing the code, or the extension, is not a service that the customer cares about. It's not for the customer. That's for the developer.
I'm talking about the developer having to worry about scaling and not the software. This article is about how the author got a sudden influx of traffic and did not have to worry about scaling anything to support it. That influx of traffic was to his website and then to the various distribution channels of his app. There is a reality that his website would fall over from the traffic and he would have to worry about scaling it, but in this reality he is paying someone else to handle scaling it for him.
If it wasn't local first software and he paid someone else for the backend, he also wouldn't have to worry about scaling the backend. My comment is pointing out that the dichotomy isn't between local first and non local first software, but between self hosting and not self hosting.
Local-first apps have some peculiar technical features, yes. I worked on CRDTs from maybe 2008, and in 2012 we made a collaborative editor. Local-first, all data on the client, syncing over WebSocket. I remember once, while debugging, we had a problem resetting a document: it kept reappearing. The culprit was an iPad lying on the table face down. It synced all the data back. That was seriously different from e.g. Evernote, which was losing data all the time. In our system, intentionally purging data was really difficult.
Once we ran for two weeks with a "poisoned" document that was crashing any server it was uploaded to. The user kept the tab open, just working like nothing was happening. Eventually we found the bug, but in theory it could have kept the entire cluster restarting all the time. Apart from the electricity consumption, that would hardly have changed anything. The load and syncing time would have been worse, but not by much.
With local-first, everything keeps working even without the server.
Here is the 2012 engine, by the way:
https://github.com/gritzko/citrea-model
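The "it kept reappearing" behavior falls straight out of state-based merge semantics. Here is a minimal sketch (not the linked engine's actual design) using a two-phase set, one of the simplest set CRDTs: merges are unions, so a stale replica can sync in at any time without conflict, and removals only stick because tombstones survive forever.

```python
# Minimal state-based set CRDT sketch: a 2P-set (add-set + tombstone set).
# Merge is a pairwise union, so it is order-independent and idempotent --
# any replica (e.g. a stale iPad) can sync back in at any time.

class TwoPhaseSet:
    def __init__(self):
        self.added = set()
        self.removed = set()  # tombstones: a removal must out-survive the add

    def add(self, x):
        self.added.add(x)

    def remove(self, x):
        self.removed.add(x)

    def contains(self, x):
        return x in self.added and x not in self.removed

    def merge(self, other):
        self.added |= other.added
        self.removed |= other.removed


# "Reset" a document on the server, then let a stale replica sync back.
server, ipad = TwoPhaseSet(), TwoPhaseSet()
server.add("note")
ipad.merge(server)            # iPad has the note, then goes face down

server.remove("note")         # reset the document server-side
server.merge(ipad)            # stale iPad state syncs back...
assert not server.contains("note")  # ...but the tombstone wins: no resurrection
```

Without the tombstone set (i.e. a plain union of `added`), the stale iPad would resurrect the note on every sync; with it, truly purging data means keeping the tombstone around forever, which is exactly why intentional deletion is the hard part.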
I purposely did not set up any infrastructure for my iOS app Reflect and made it local-first [0]. I did it so that I could make the app completely private, but it’s come with the wonderful side effect that the product is easily horizontally scalable. The only overhead I have for more users is a busier Discord channel, and so far that community growth has felt very rewarding.
[0] https://apps.apple.com/us/app/reflect-track-anything/id64638...
It's amazing to me how what we used to call a "box product" now has a fancy new name, "local-first". "Box product" is quite well understood. It has a lot of benefits, but the business model is also harder to get right compared to cloud services. For open-source projects, though, that will not be a problem.
I'm old enough to have bought software boxes in stores and I've never heard the term "box-product".
A lot of young people don't know what a box product is. You're showing your age.
How dare people walk around showing their age. Disgusting!
Way to miss the point. They said box product is well understood but it isn't these days.
I'm 40 and I never heard that term.
Three clear advantages of local-first software:
1. No network latency; you do not have to send anything across the Atlantic.
2. You get privacy.
3. It's free; you do not need to pay any SaaS business.
An additional one would be that scale is built in. Every person has their own setup; one central agency doesn't have to take care of it all.
IMHO, this overlooks probably the biggest advantage of all: software you can buy once and run locally is predictable.
While the modern world of mobile devices and near-permanent fast and reliable connectivity has brought some real advantages, it has also brought the ability for software developers to ruthlessly exploit their users in ways no-one would have dreamt of 20 or 30 years ago. Often these are pitched as if they are for the user’s benefit — a UI “enhancement” here, an “improved” feature there, a bit of casual spying “to help us improve our software and share only with carefully selected partners”, a subscription model that “avoids the big up-front cost everyone used to pay” (or some questionable logic about “CAPEX vs OPEX” for business software), or the classic “our startup has been bought by a competitor, but all the customers who chose our product specifically to avoid that competitor’s inferior alternative have nothing to worry about, because the new owners have no ulterior motive and will continue developing it just the way we have so far”.
The truth we all know but don’t want to talk about is that many of the modern trends in software have been widely adopted because they make things easier and/or more profitable for software developers at the direct expense of the user’s experience and/or bank account.
And if you’re willing to pay $195 for the package and upgrades of $50 each time the OS breaks the app, we are good. And as long as this software doesn’t have to sync across mobile and desktop. And as long as this is single user software.
Then sure.
Take Microsoft Office: if you just need Word, Excel, PowerPoint, Outlook and OneNote, that's the same price as a two-year subscription to Office 365 / Microsoft 365 / Copilot. That's potentially a good deal, depending on your needs.
Some people have been running Office 2003 on everything from Windows XP to Windows 10. Assuming they bought the license 22 years ago, that's pretty cheap, probably $15 per year. As a bonus, they've never had their workflow disrupted.
On the other hand, there's 17+ years of security updates missing... and Office macros are a known attack vector.
You shouldn't upgrade the OS on a machine with important paid-for software which you are using.
I think you missed the most important thing, more than any of those: if there is no service, then the service cannot delete or deny access to your data.
And related, if there is no service, then the service cannot fail to secure your data.
No possibility of a billing or login error where you lose access to your stuff because they think you're not current or valid when you are.
No possibility of losing access because the internet is down or your current location is geo blocked etc.
No possibility of your account being killed because they scanned the data and decided it contained child porn or pirated movies or software or cad files to make guns or hate speech etc.
Those overlap with free and privacy, but the separate point is not the money or the privacy; it's the fact that someone else can kill your stuff at any time, without warning or recourse.
And someone else can lose your stuff, either directly by having their own servers broken into, or indirectly, by your login credentials getting leaked on your end.
> the service cannot delete or deny access to your data... the service cannot fail to secure your data.
You, on the other hand, are much more likely to do one or both of these things to yourself.
I will never decide that my kids' doctor photos are child porn and delete all my own access to both my email and the phone number I use to access my own bank and retirement accounts.
The fact that a hard drive can break and you can fail to have a backup is not remotely in the same class of problem of living at the whim of a service provider.
If we're tallying such things up, I've had those sorts of major issues with Trello, Google, Navionics, FitBit, FixD, and a number of other companies over the years. My local data on the other hand has had zero issues.
I’ve seen this happen in a previous job where the IT team of a customer deleted years worth of financial records because they didn’t know about it when they cleaned up the server. Our CEO had to go and help them rebuild their books, at great cost to the customer!
All of those advantages are also reasons why businesses don't adopt it…
It's open source, you "just" have to help write it!
This entire paradigm gets turned on its head with AI. I tried to do this with purely local compute, and it's a bad play. We don't have good edge compute yet.
1. A lot of good models require an amount of VRAM that is only present in data center GPUs.
2. For models which can run locally (Flux, etc.), you get dramatically different performance between top-of-the-line cards and older GPUs. Then you have to serve different models with different sampling techniques to different hardware classes.
3. GPU hardware is expensive and most consumers don't have GPUs. You'll severely limit your TAM if you require a GPU.
4. Mac support is horrible, which alienates half of your potential customers.
It's best to follow the Cursor model where the data center is a necessary evil and the local software is an adapter and visualizer of the local file system.
Define "good edge compute" in a way that doesn't have expectations set by server-based inference. I don't mean this as a loaded request; we simply can't expect to perform the same operations at the same latency as cloud-based models.
These are two entirely separate paradigms. In many instances it is quite literally impossible to depend on models reachable by RF like in an ultra-low power forest mesh scenario for example.
We're in agreement that not all problem domains are amenable to data center compute. Those that don't have internet, etc.
But for consumer software that can be internet connected, data center GPU is dominating local edge compute. That's simply because the models are being designed to utilize a lot of VRAM.
> This entire paradigm gets turned on its head with AI. I tried to do this with purely local compute, and it's a bad play. We don't have good edge compute yet.
Your TV likely has a good enough CPU to run a decent model for home automation. And a game console most definitely does.
I'd love to see a protocol that would allow devices to upload a model to a computer and then let it sleep until a command is received. Current AI models are really self-contained; they don't need complicated infrastructure to run.
Closest thing to true “serverless”: entire MVC app (Django/Rails/Laravel) in the browser with WASM and data persistence by SQLite over CDN.
All the server has to do then is serve binaries, all the business logic is in the client.
What's WASM adding here? Without it you're just describing an ordinary SPA+CDN.
WASM adds the ability to run a local copy of SQLite (or even PostgreSQL) entirely in the browser.
The ability to port existing apps to serverless. See for example Wordpress in WASM.
Brilliant… but now you need to validate that the client did all their business logic correctly without tampering. That alone can be so complex it defeats the point.
No... you don't need that. Not for the overwhelmingly vast majority of cases. Let people use their own software. Tampering? Not my problem. Let people do it if they want.
The "overwhelmingly vast majority of cases" will be an employee of a larger company: a person/computer that cannot be trusted with arbitrary access to data or exceptions to business rules in code.
If it's a single-user app, you can only load data the user actually needs and is cleared to access. And/or lock down the device.
Multi-user app (and if we're talking about companies, it's multiple users by the very definition) where users are not trusted almost always needs either a central service with all the access controls, or a distributed equivalent of it (which is, indeed, very hard to implement). “Local-first” in those cases becomes less relevant, it’s more of a “on-premises/self-host” in this case.
But I think that while end-user, non-business software may be a small market compared to enterprise stuff, it is still a fairly big one with lots of opportunities.
Anything that involves sharing data with other people will run into issues around updating. If your API surface is a shipped SQLite db instead of an API call, it's liable to be abused in so many ways.
Local-first doesn't mean local-only though, yeah? Isolate cloud usage to those collaborative features. If that's a huge ask, then your thingy probably isn't the kind of tool we're talking about localizing here!
Anything that runs as a SaaS, or B2B, has that issue… which is the overwhelming majority of software.
Anything that requires sharing information with other users is also a pain in the neck, as you basically need to treat your internal logic like a proprietary, potentially hostile, file format.
There is a lot of SaaS that is essentially "for the buyer, to the buyer"; what I mean is that the software doesn't serve content to somebody else, or there is no incentive to serve malicious content (e.g. B2B). Why would tampering be relevant in those cases?
There are situations where it's relevant, but I don't think there are as many as you say.
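The "potentially hostile file format" point above can be made concrete: if clients sync raw state instead of calling an API, the server has to re-validate everything on ingest, because a shipped database can be edited with any SQLite browser. A hypothetical sketch (the record fields and business rules here are made up for illustration):

```python
# Hypothetical ingest-side validation for state synced from an untrusted
# local-first client. Field names and rules are invented for this example.

def validate_invoice(rec: dict) -> list[str]:
    """Return a list of rule violations; an empty list means acceptable."""
    errors = []
    if not isinstance(rec.get("id"), str) or not rec.get("id"):
        errors.append("id must be a non-empty string")
    amount = rec.get("amount_cents")
    if not isinstance(amount, int) or amount < 0:
        errors.append("amount_cents must be a non-negative integer")
    if rec.get("status") not in {"draft", "sent", "paid"}:
        errors.append("status outside allowed set")
    # A business rule the client UI normally enforces -- re-checked here
    # because the client's local state cannot be trusted:
    if rec.get("status") == "paid" and rec.get("paid_at") is None:
        errors.append("paid invoices need a paid_at timestamp")
    return errors

assert validate_invoice({"id": "inv-1", "amount_cents": 500,
                         "status": "paid", "paid_at": "2025-06-01"}) == []
assert validate_invoice({"id": "inv-2", "amount_cents": -5,
                         "status": "paid", "paid_at": None}) != []
```

In effect every rule the client enforces in its UI gets duplicated on the ingest path, which is exactly the "treat your own logic as a hostile file format" overhead being described.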
The reason it's easier to scale local software is that it does not rely on cloud resources. As a result it's cheaper for a startup to distribute local first software since they don't need the infrastructure of a traditional cloud app. The problem is there is no business model for local first software like there is for subscriptions with SaaS. Traditional desktop apps were sold as single purchase items on CDs. That just doesn't work for local first software, since you probably just navigate to a website to get the software.
You can just charge subscriptions for local first software. No cloud is irrelevant. Only the value to the user matters.
Where do you check their subscription in order to cut off the service when they stop paying? One of the nice bits about local-first is that there's no need for logins. Do you embed licensing code in your local-first app, which typically ships in a format that's fairly easy to bypass? Pirating desktop software was a big issue for companies. Are we going back to that horrible world?
A simple license key and a monthly ping to a license server (a $5 DigitalOcean droplet) is enough.
Pirates are a cost of doing business. Just ignore them.
No need to make this too complicated.
So that kind of violates the principles of local-first software, since you still need the cloud and a license key in order to run the app. It also means more work for the developer, since in most cases they have no other reason to run a server. And it means they need logins, which local-first software otherwise doesn't require; avoiding them is one of its benefits. Not so easy.
A license key is not a login. And a server like that is not a big deal.
Source: me, I do it this way.
Interesting. I will contemplate. Thanks for explaining.
> Pirates is the cost of doing business. Just ignore them.
There are a lot of cracking groups that circumvent license servers on day one for software that has them.
I'm sure this is the reason Adobe went to the cloud; Adobe couldn't ignore them the way other "box software" vendors could.
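The "monthly ping" scheme mentioned above doesn't have to break the local-first promise of working offline, provided the check degrades gracefully. A hypothetical sketch of the client-side decision (the 30-day grace period is an invented parameter, and the actual network ping is deliberately left out):

```python
from datetime import datetime, timedelta

# Assumed policy: keep the app unlocked for a month past the last good check.
GRACE = timedelta(days=30)

def license_ok(last_good_check: datetime, now: datetime) -> bool:
    """App stays unlocked as long as some check succeeded within GRACE.

    The actual ping (license key -> license server) would happen in the
    background; a failed or impossible ping just leaves last_good_check
    where it was, so being offline never locks the user out immediately.
    """
    return now - last_good_check <= GRACE

now = datetime(2025, 6, 1)
assert license_ok(now - timedelta(days=10), now)      # recent check: fine
assert not license_ok(now - timedelta(days=40), now)  # lapsed: nag or lock
```

The design choice is that the server only ever refreshes a timestamp; all enforcement is local, so a dead license server degrades into a 30-day countdown rather than an instant outage.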
> The problem is there is no business model for local first software like there is for subscriptions with SaaS.
I think this is too broad a stroke to paint with. There's local-first software that still connects to the cloud for additional features. Local-first can enable you to continue to work when offline, but the software can still be more useful when online.
So your primary technical goal is to do local first, and you keep the cloud because it provides a business model that works? That feels very brittle. The way you're describing it, local first is an optional additional element for a cloud-based tool. I don't think that's the primary intent of the local first movement.
Developers can charge money, or subscriptions, for local apps as well. They can probably charge less, as they don't have a cloud provider to pay to host everything. This is pretty common with mobile apps.
It depends on your customer base. The SCADA world is largely local, and believe me, they have no trouble selling subscriptions.
Their customers would probably prefer that there were no subscription, though.
Probably, but you'd be surprised how little that matters.
I remember Skype was local-first. I believe it was the only commercially successful P2P project.
But over time and multiple hard-to-recover incidents they switched to cloud.
Sure, you can try to charge subscriptions, but what are you actually charging for? You're not storing the customer's data or providing a service they need, since the software should work entirely without your help. Fundamentally, charging a subscription without a centralized server is tricky. Even traditional desktop apps that transitioned to the cloud, like Photoshop or Maya, have worked hard to beef up their cloud-based services to justify the subscription fee. A mismatch between your business model and your technical infrastructure is not going to stand up well over time. Don't get me wrong; I'm just trying to figure out what business model works for local-first software.
I think subscription models became associated with SaaS because cloud hype was at its peak around the time the big corps were first migrating their products from perpetual licensing to subscription, and just being on the cloud was seen as a selling point weighty enough to justify the price bump.
Now that cloud hype has died down, I don't see why a subscription model would not be viable just because your product runs locally (assuming all your competitors are already subscription-based). ZBrush has started selling local-first subscriptions, so I guess we'll see soon enough whether that works out for them.
Subscription models are associated with SaaS because you're selling a service. The service is typically storing your data or providing capabilities on the back end. With a local first app, the company is not paying for back-end resources, so there's a mismatch between the expectations of customers who are actually providing the resources on their own computer, and the desires of the company to make money.
> But over time and multiple hard-to-recover incidents they switched to cloud.
My understanding was that they switched to being centralized because phones couldn't run the decentralized version.
> Traditional desktop apps were sold as single purchase items on CDs. That just doesn't work for local first software, since you probably just navigate to a website to get the software.
Did I misunderstand this part? A lot of local software is sold as one time purchase downloads.
> Traditional desktop apps were sold as single purchase items on CDs. That just doesn't work for local first software, since you probably just navigate to a website to get the software.
How does the reason you provide support the idea you provide it in support of? There are an infinite number of things that are sold as single purchases that you buy by just navigating to a website where you make the purchase.
There are an infinite number of things that are sold as single purchases on CDs that you buy by just navigating to a website where you make the purchase.
Local-first is the true serverless, and your device is the real "edge".
So much truth to this post.
Can we go back to native apps yet?
Local first is easier to scale technically.
Paid hosted software is easier to scale financially.
Without the latter, it's very hard to come up with the money to pay people to build, to support the market, etc.
I’m not sure it’s that clear…
Take an application like Slack and consider how to scale it local-first for a team with 1000 people.
And then consider how to coordinate deployment of new features and schema migrations…
Compare that with running a centralized server, which is going to be much easier.
Why local first and not a native application?
"Lots of cloud providers like to brag about being able to scale with their users. I like to brag about not having to scale at all"
Bars.
While local first software might be easier to scale, the financial aspect remains unsolved.
I can see that Harper (the author's software) was acquired by Automattic, so I assume that Automattic is paying the author to maintain Harper, effectively subsidizing the maintenance costs to keep Harper free.
Not every local first (and open source) software has the opportunity to be supported by a big company.
An alternate headline for this traffic spike is "PaaS Is Easier to Scale". When the author relies on others to do the hosting and handle the scale, he doesn't have to worry about it. That's why he didn't need to be alerted: he's relying on others for that responsibility.
The authors rely on themselves, not other people. The developers of local first software take on the responsibility of making sure the app runs entirely on local resources. That's not the job of anybody else, and you don't need to pay for it like you do with cloud resources. This means it's cheaper for a startup to distribute local first software. The trick is that it's much harder to get paid for the app.
>not other people
The website is hosted by Automattic. The Firefox extension is hosted by Mozilla. The Chrome extension is hosted by Google. The Obsidian plugin is hosted by Obsidian. The VS Code extension is hosted by Microsoft. The source code is hosted by GitHub. The Discord is hosted by Discord.
If you delegate your entire backend to other companies you won't be the one who has to worry about scaling.
With true local first software, there is no backend. Sometimes there's a static host that just delivers the code, but has no endpoints. Providing the code, or the extension, is not a service that the customer cares about. It's not for the customer. That's for the developer.
I'm talking about the developer having to worry about scaling, not the software. This article is about how the author got a sudden influx of traffic and did not have to worry about scaling anything to support it. That influx of traffic went to his website and then to the various distribution channels of his app. In principle his website could fall over from the traffic and he would have to worry about scaling it, but in reality he is paying someone else to handle scaling it for him.
If it wasn't local first software and he paid someone else for the backend, he also wouldn't have to worry about scaling the backend. My comment is pointing out that the dichotomy isn't between local first and non local first software, but between self hosting and not self hosting.