I recently had an experience of overcoming vibe coding that I would like to share. While finishing up my first freelance project, my client and I ran into a problem sending GCode files to a custom 3D printer. The printer's firmware exposes a plain HTTP server, and my website, served over HTTPS, was trying to call those endpoints. So "Mixed Content" warnings kept showing up in the browser console.
To fix this warning, I first tried bundling my web application with Tauri, so the resulting desktop app could call the HTTP endpoints directly. As I learned how to bundle web apps with Tauri, I realized that code signing is effectively required before users can run the app, and signing certificates can be a little expensive. So I went looking for something cheaper that would still give my client a working app to test the printer with.
To solve the problem, I looked for a web hosting solution that could serve over plain HTTP. It turns out that nearly every major web host requires HTTPS, so I couldn't get around the "Mixed Content" warnings that way. I turned to DigitalOcean for their cheap VPSs, because a VPS will happily serve HTTP. I had worked with DigitalOcean before, and I actually like working with remote servers, so I thought I could get something rolling quickly. My freelance project is a static vanilla JS app built with Vite, and my end goal was to build a CI/CD pipeline that deploys the built artifact to DigitalOcean.
To familiarize myself with DigitalOcean again, I asked AI for a simple guide to deploying a website: build the droplet, SSH into it, update the server, install Node and NGINX, and clone the repository. While building the project, I noticed the build was going very slowly; in fact, the droplet killed the build, probably because it ran out of RAM. So I repeated the setup steps, giving the droplet more power each time, and eventually my build could finish on 2GB of RAM.
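The setup steps above can be sketched as a handful of commands. This is a rough reconstruction assuming a fresh Ubuntu droplet; the repository URL, app directory, and Node version are my own placeholders, not details from the project.

```shell
# Update the server (assumes a fresh Ubuntu droplet).
sudo apt-get update && sudo apt-get upgrade -y

# Install NGINX and Node.js (NodeSource setup script for a current LTS).
sudo apt-get install -y nginx
curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash -
sudo apt-get install -y nodejs

# Clone and build the Vite project on the droplet itself.
git clone https://github.com/<user>/<repo>.git app
cd app
npm ci
npm run build   # this is the step that kept dying on low-RAM droplets
```

It was that last `npm run build` step, running on the droplet, that needed 2GB of RAM to survive.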

Let me put this droplet in the ocean of information!
Now I didn't want a 2GB-RAM VPS constantly running when all I was doing was serving static files and PWA assets. I had to build my project somewhere else, and I landed on GitHub Actions, since their runners have around 16GB of RAM and the free tier allows 2,000 build minutes. I could use this and save money!
Side note: I'm of the opinion that AI is a great tool for software engineers. There is no way AI is going to replace software engineers; what it can do is speed up an engineer's work. As such, I thought I could use AI to build out a GitHub Action for me! Sadly, I didn't know what I was doing, and I needed to do some groundwork before I could understand what I was even trying to do.
My first prompt to the AI was, "Help me make a GitHub action that builds and deploys to a digital ocean droplet".
The AI responded with a set of instructions:
- Prepare Your DigitalOcean Droplet
  - This step instructed me to make a deployer user on the VPS.
  - It also instructed me to make a new SSH key pair on my local machine. This was the first thing that confused me, and now I'm not sure why it did. I guess I just didn't know what I was doing.
- Add Secrets to Your GitHub Repository
  - The AI told me to put SSH information into the GitHub repository, which was a little confusing. Eventually, I learned that the GitHub CLI is very helpful for setting the right variables in the right spots.
- Create the GitHub Actions Workflow
  - In summary, this workflow would SSH into the VPS, clone my repository, and build it there.
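To make the key-pair step concrete, here is roughly what the AI was asking for. This is a minimal sketch; the file name and key comment are my own placeholders.

```shell
# Generate a dedicated SSH key pair for deployments (ed25519, no passphrase).
# "deploy_key" and the comment string are illustrative, not from the post.
ssh-keygen -t ed25519 -N "" -C "github-actions-deploy" -f ./deploy_key

# The public half goes into the deployer user's ~/.ssh/authorized_keys on the
# droplet; the private half becomes a repository secret on GitHub.
cat deploy_key.pub
```

The GitHub CLI then makes the secret-upload step painless, e.g. `gh secret set SSH_PRIVATE_KEY < deploy_key` (the secret name is an assumption; use whatever name your workflow references).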
No! That's not what I wanted. I wanted GitHub Actions to build my project and then scp the artifact to my VPS. So I asked a follow-up question, "Does this build and send the artifact to digital ocean?" The response only confused me more. It referenced actions from the user "appleboy". I wasn't sure at the time whether I wanted to use "appleboy", so I decided to ignore that response. Over the next few hours, I struggled to figure out what to do.
Eventually, I decided to get back to basics, and in doing so I taught myself what GitHub Actions was actually doing. I created a simple GitHub Action that would just SSH into the VPS and print a "hello world". Then I stepped it up with an SCP action. After that, I returned to "appleboy". I looked through the ssh-action and scp-action repositories and found that these actions were really well made, with simple usage patterns that fit my project. In short, I learned what I needed GitHub Actions to do.
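That hello-world stepping stone looked roughly like this. It's a sketch assuming secrets named DROPLET_HOST, DROPLET_USER, and SSH_PRIVATE_KEY (those names are mine, not from the project):

```yaml
name: SSH hello world
on: workflow_dispatch

jobs:
  hello:
    runs-on: ubuntu-latest
    steps:
      - name: Say hello on the droplet
        uses: appleboy/ssh-action@v1.0.3
        with:
          host: ${{ secrets.DROPLET_HOST }}
          username: ${{ secrets.DROPLET_USER }}
          key: ${{ secrets.SSH_PRIVATE_KEY }}
          script: echo "hello world"
```

Seeing the runner log that "hello world" came back from the VPS was the moment the pieces clicked.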
And so I coded up and tested my final GitHub Action to deploy an artifact to a DigitalOcean droplet. Reviewing the earlier AI responses, I found that I hadn't really known what I wanted at the time, and I had to teach myself what that was. In the end, I used the AI-generated responses, modified slightly for my use case.
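The shape of the final workflow is roughly the following: build in the GitHub runner, then scp the artifact to the droplet. The secret names, Node version, and target path here are assumptions, not a copy of my actual file.

```yaml
name: Build and deploy
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npm run build   # Vite outputs to dist/ by default
      - name: Copy artifact to the droplet
        uses: appleboy/scp-action@v0.1.7
        with:
          host: ${{ secrets.DROPLET_HOST }}
          username: ${{ secrets.DROPLET_USER }}
          key: ${{ secrets.SSH_PRIVATE_KEY }}
          source: "dist/*"
          strip_components: 1        # drop the leading dist/ path component
          target: "/var/www/html"    # assumed NGINX web root
```

With this shape, the heavy `npm run build` runs on the 16GB runner for free, and the droplet only has to receive static files.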
Now that I am more familiar with GitHub Actions, I can use this knowledge to speed up my freelance work and potentially reach more clients in the future! This experience taught me once again the importance of actually understanding what I am working on while working with AI. With the combination of knowledge and AI speed, software engineering productivity may increase dramatically.
My final GitHub Action may be viewed here.
Use DigitalOcean by clicking this referral button!