The advantage of Docker comes when you want to make your project accessible to everyone.
An Example of When You'd Use Docker
Docker's advantage is strongest when your project is ready to deploy.
Let's say that, on your machine, you have a JavaScript app and a database.
That means you have a Node server, an index.js file, and a MongoDB instance.
For you, there's no issue: as long as you're the only user and everything runs locally, it's great.
Now how do we run this online, for other people to use it?
Let's say we're using AWS (Amazon Web Services).
First, we would create a virtual server on AWS, an EC2 instance, to hold our app. Then we install MongoDB on this EC2 instance.
But having the app communicate directly with MongoDB ends up being a pain point. It's not working for our team.
So we decide to add an API, a natural addition to any app. The app communicates with the API and the API communicates with the database - that's the API's purpose, to be the middleman.
So now our AWS setup has: an EC2 instance running our app (a Node server), our API (another Node server), and our database, MongoDB, installed directly on the instance.
But we've had problems with our app and our EC2 instance crashing before. (Note: I've never had a long-running EC2 instance not crash on me.) Every time it crashes, we have to restart the app. Now we have to add monitoring...
You can see how, after the first crash, we might get tired of this setup.
Isn't there an easier way to do this, that's not such a pain?
There is - with Docker.
How Docker Can Simplify Our Process
One way to simplify things is just to have fewer of them.
Essentially, we can combine the app and the API into a single deployable unit. That's what Docker can do for us. (We could even throw in the MongoDB instance too, though I'll keep that separate.)
So now we create a Docker container that has the app (a Node server) and the API (another Node server).
It works great. It's easier to keep them as one unit for our team, which now only has to check and update in one place.
Docker offers an elegant solution to complex deployments: by consolidating and containerizing your components, it reduces the number of moving parts.
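To make that concrete, here's a minimal sketch of what such a combined container's Dockerfile could look like. The file names (app.js, api.js) and ports are hypothetical, not part of the setup described above:

```dockerfile
# Sketch of a single image holding both Node servers.
# Entry points app.js and api.js are hypothetical names.
FROM node:18

WORKDIR /usr/src/app

# Install dependencies first so Docker can cache this layer
COPY package*.json ./
RUN npm ci

# Copy the rest of the source (app.js, api.js, etc.)
COPY . .

# Hypothetical ports for the app and the API
EXPOSE 3000 4000

# Run both servers in one container; a small start script
# or a process manager is the usual way to do this
CMD ["sh", "-c", "node api.js & node app.js"]
```

A single CMD starting both servers keeps everything as one unit; teams that want to update each piece independently often split the app and API into two containers instead.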
How People Use Docker In The Real World
Remember how I said our EC2 instance would crash? That usually happens at the worst moments - when our app experiences heavy load.
We want to avoid this if at all possible. If we start with Docker, we can.
This is because, with Docker, scaling our application becomes straightforward.
If your user base grows rapidly and demands more resources, you can effortlessly spin up additional containers to handle the load. Conversely, when demand decreases, you can scale down accordingly. This scalability ensures the application remains responsive and reliable.
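As a sketch of what that scaling can look like, here's a hypothetical Docker Compose file; the service and image names are illustrative:

```yaml
# docker-compose.yml (sketch; names are hypothetical)
services:
  app:
    image: my-node-app:latest
    ports:
      - "3000"   # publish to a random host port so replicas don't collide
  mongo:
    image: mongo:7
```

With a file like this, `docker compose up --scale app=5 -d` starts five copies of the app service, and rerunning it with `--scale app=1` brings it back down when demand drops.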
Listen to this engineer, who explained their experience with Docker on r/webdev:
Normally to deploy an application you have to install all the dependencies (MySQL, Node, JRE, ncurses, whatever) on the host system.
With Docker, instead of managing host systems, you just specify a Dockerfile that lists all the dependencies it needs, down to the exact version if you like. And then that image will work wherever the container is run. This also lets you check the Dockerfile into version control, to keep easy track of it.
Now the only thing you have to install on the host system is Docker. And it can be any host system (sort of). Bonus points: if you spin this up in the cloud, you can also have that scripted, so your whole application can be spun up, torn down, scaled, moved, etc.
So now you have "configuration as code" and fewer sysadmin tasks to do.
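The "exact version" point from the quote corresponds to pinning versions in the Dockerfile. Here's a hypothetical example, with illustrative version tags:

```dockerfile
# Pin the base image to an exact version, not a floating tag like "latest"
FROM node:18.19.1-alpine

WORKDIR /usr/src/app

# package-lock.json pins every npm dependency to an exact version;
# "npm ci" installs exactly what the lockfile says
COPY package.json package-lock.json ./
RUN npm ci

COPY . .

CMD ["node", "index.js"]
```

Because this file lives in the repository alongside the code, the whole environment is versioned with the app; that's the "configuration as code" the quote describes.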
In summary, working with containers, while a topic of its own, offers a more efficient and manageable approach than traditional deployment methods.
Technology is complicated enough. Use Docker to make the deployment part a little bit easier.