Enter the Free Hosting Options...
So, I built my brother's platform on some cheap mini-PCs and they were fine for the longest time. Then I went and installed Docker. Now things run well for a few hours, then they get starved for resources and need a reboot. Not ideal.
I also live 4.5 hours away. Also, not ideal.
My first thought was to throw more hardware at the problem. But they have also run out of ethernet ports. Yet again, not ideal.
Anyway, the short version of the story is that this is forcing me to think about how best to deliver the platform: take the burden off of their budget hardware while retaining the safety of not actually relying on that weak hardware in the first place.
Currently, I have near real-time backups of their database. Or rather, I get incremental update data streamed to me, which I replicate into a backup database. It is pretty awesome.
And Docker was pretty awesome too, aside from the fact that running it on Windows consumes a few extra gigs even when it's doing nothing. I had Watchtower installed and monitoring the whole thing. It was decent.
I can do that fairly securely. It talks to my backend server and loads images from a password-protected, private Docker repo with short-lived tokens. But they have been running in-house for years, originally had no password, and now have not much better than PINs. I have a security solution for that as well.
Anyway... I think the idea will be to set up the main machine (or all of them) as GitLab runners pointed at my private GitLab repo. Instead of local SQL, I'll communicate with a Data Broker of sorts: basically a service running at the edge which will route to my local server when available and sync to a remote, hosted Postgres server when not. And if mine goes down, then it reverses. Requests will go to the hosted Postgres, and sync data will be queued up in persistent storage to feed back to the primary DB when it comes back online.
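Just to give a feel for what I mean by that broker, here is a rough sketch of the routing and replay logic. This is not the actual implementation; the class, the Npgsql usage, and the in-memory queue are all placeholders (the real queue has to live in persistent storage, and replay needs conflict handling).

```csharp
// Rough sketch of the edge data broker's failover idea (names are placeholders).
// Assumes Postgres on both ends, talked to via the Npgsql package.
using System.Collections.Concurrent;
using Npgsql;

public class DataBroker
{
    private readonly string _primaryConn;   // my home server
    private readonly string _fallbackConn;  // hosted Postgres (e.g. Neon)
    private readonly ConcurrentQueue<string> _pendingSync = new(); // stand-in for persistent storage

    public DataBroker(string primaryConn, string fallbackConn)
    {
        _primaryConn = primaryConn;
        _fallbackConn = fallbackConn;
    }

    public async Task ExecuteAsync(string sql)
    {
        try
        {
            await RunAsync(_primaryConn, sql);
        }
        catch (NpgsqlException)
        {
            // Primary is unreachable: run against the hosted replica and queue
            // the statement to feed back to the primary when it comes back online.
            // (Real code would distinguish connectivity failures from query errors.)
            await RunAsync(_fallbackConn, sql);
            _pendingSync.Enqueue(sql);
        }
    }

    public async Task ReplayPendingAsync()
    {
        // Called once the primary is reachable again.
        while (_pendingSync.TryDequeue(out var sql))
            await RunAsync(_primaryConn, sql);
    }

    private static async Task RunAsync(string connString, string sql)
    {
        await using var conn = new NpgsqlConnection(connString);
        await conn.OpenAsync();
        await using var cmd = new NpgsqlCommand(sql, conn);
        await cmd.ExecuteNonQueryAsync();
    }
}
```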
The idea is that by removing both Docker and SQL from the anemic machines, they should start running fast again. The GitLab runner will give me more control and more feedback than Watchtower + Docker did. And then, by brokering the data at the edge, I can aggregate the calls to my cloud services, keeping them either in the free tier or as cheap as humanly possible, while serving the data in most cases from my home network.
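The "aggregate the calls" part is really just caching reads at the edge so the hosted services only see misses. A hypothetical sketch of that piece (MemoryCache here is just a stand-in for whatever the broker actually ends up using):

```csharp
// Hypothetical sketch of serving most reads from the edge so the hosted
// services only see cache misses. Uses the Microsoft.Extensions.Caching.Memory package.
using Microsoft.Extensions.Caching.Memory;

public class EdgeReadCache
{
    private readonly MemoryCache _cache = new(new MemoryCacheOptions());

    // fetchFromCloud only runs on a cache miss, which is what keeps the call
    // volume to the hosted services inside (or near) the free tier.
    public async Task<T> GetAsync<T>(string key, Func<Task<T>> fetchFromCloud)
    {
        var value = await _cache.GetOrCreateAsync(key, entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
            return fetchFromCloud();
        });
        return value!;
    }
}
```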
The most difficult part to source for free or cheap will be the services living at the edge. The ideal would be to leverage something like Google Cloud Run with some function-like serverless apps. .NET supports AOT compilation, which can deliver some pretty impressive cold start times. Not sure what the connection times to the DBs will be like. DB cold starts shouldn't really be an issue. My server will always be running, except when it goes down. When it DOES go down, the replica should have far fewer chances to go cold.
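For the Cloud Run side, the kind of thing I have in mind looks roughly like the AOT-published .NET 8 minimal API below. The endpoint and types are made up for illustration; the Cloud Run-specific bit is just listening on the PORT environment variable, and the fast cold starts come from publishing with <PublishAot>true</PublishAot> in the project file.

```csharp
// Sketch of an AOT-friendly .NET 8 minimal API (endpoint and types are placeholders).
// Built with the Microsoft.NET.Sdk.Web SDK and published with <PublishAot>true</PublishAot>.
using System.Text.Json.Serialization;

var builder = WebApplication.CreateSlimBuilder(args);

// AOT can't use reflection-based JSON, so wire up the source-generated context.
builder.Services.ConfigureHttpJsonOptions(options =>
    options.SerializerOptions.TypeInfoResolverChain.Insert(0, AppJsonContext.Default));

var app = builder.Build();

app.MapGet("/health", () => new Status("up"));

// Cloud Run tells the container which port to listen on via PORT.
var port = Environment.GetEnvironmentVariable("PORT") ?? "8080";
app.Run($"http://0.0.0.0:{port}");

public record Status(string State);

[JsonSerializable(typeof(Status))]
public partial class AppJsonContext : JsonSerializerContext { }
```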
Needless to say, this is an interesting problem.
Right now, the goal is to keep the dev and prototyping somewhere in the vicinity of free. I'm using MongoDB Atlas for a side project that will eventually help with securing this if it ends up in the cloud. I'm using Neon for a hosted Postgres DB. And I'm thinking I'll use Google Cloud Run for the services. Though I've also seen fly.io mentioned, which may be able to host the services as-is for free. This is a REALLY low-volume service (1-3 users, 8hrs/day).
Anyway, the real point of this article was just to talk through the design process and mention the services I'm using or planning to use. Without some ingenuity you probably can't run production software like this. But that isn't the point. If I can make it work, I certainly will. And, if nothing else, I think it is a brilliant study in how to use edge computing to manage costs.