Go Loco for Local? The No-Bullshit Microservices Dev Guide (part 2)
In the first part of this series, we discussed how developing with microservices differs from monolith development, and how those differences impact the choice of dev environment. In this second post, we’ll talk about when it’s best to leverage the local microservices environment and the tools that can help.
The Pluses of Microservices Local Development
For most developers, the natural impulse is to begin tackling microservices dev on their trusty local machine; it’s the tack just about everyone takes at first. Just like the good ole days, right?
But while this approach feels familiar and safe, it is worth pausing to consider when it is best to work locally, and which tools can help.
Like almost everything else, developing microservices on your own computer has pros and cons. On the plus side: no need for internet connectivity, familiar tooling, and technologies and resources that are literally right at your fingertips. Let’s dive a bit deeper into each of these pluses:
Internet connectivity seems like a red herring, right? After all, when was the last time you were anywhere for any length of time with no internet connection?
In actuality, however, this is a non-trivial issue: For a development environment, any old internet just isn’t enough. Development requires connections that are very fast and very low-latency. Add in essential elements such as the ability to authenticate into your network and log into your VPN, and suddenly, sufficient internet connectivity isn’t trivial at all. As a result, a dev environment that does not need connectivity seems — equally suddenly — quite appealing.
The ability to use existing dev tooling can mean huge savings of time as well as money. If development is done on a familiar platform, there’s no need to invest in new tools and technologies. The sniffers, process monitors, and debuggers that you usually use are most likely just fine. And the time and effort required to teach old devs new tricks and convince them to adopt them can instead be invested in development itself.
Computing, storage, databases and all else you need for development are available right on the machine. There’s no need to spend scarce budgets and even scarcer time on spinning up instances in the cloud for essential resources with hefty price tags attached.
…And the Downsides
Of course, there are significant downsides to developing microservices on local platforms as well. These include being unable to use cloud-based infrastructure such as DBaaS; testing on a different environment than the one in which you will actually run the microservices; and maintaining multiple configurations. Let’s dig into each of these a bit:
Running microservices on a laptop is very different from running the app in the cloud. Obvious differences, which impact all dev-to-production transitions, include hardware configuration; OS type, version, and/or configuration; and network configuration. But the impact is far greater for apps that rely heavily on a service mesh, Kubernetes networking, and other technologies that coordinate communication and interaction between microservices, since these infrastructure layers can only be approximated in a local environment.
Local development environments utilize different technology stacks than those used in the cloud, making them highly susceptible to configuration divergence. Keeping the local configuration updated and in sync is a difficult and annoying task and one that is often neglected. This can have a significant negative impact on local development productivity.
If your application relies on a cloud-based infrastructure such as DBaaS (RDS/DynamoDB/CloudSQL), queues, or SaaS/PaaS products, you won’t be able to fully run your application when developing locally. Replacing cloud-based infrastructure with local alternatives further exacerbates both issues mentioned above. You may also choose to connect to those services remotely, although that creates a new set of provisioning and authentication challenges. We’ll discuss this further in the next post in this series — be sure to check back.
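One common way to soften this problem is to make the infrastructure endpoint a configuration knob, so the same code can point at a local stand-in or at the managed cloud service. A minimal sketch, assuming a hypothetical local Postgres and an env-var convention (the URLs and variable name here are illustrative, not prescribed by any particular platform):

```python
import os

# Hypothetical endpoints: a local Postgres stand-in vs. a managed cloud DB.
LOCAL_DB = "postgresql://localhost:5432/devdb"

def database_url() -> str:
    """Return the cloud endpoint when DATABASE_URL is set, else fall back to local."""
    return os.environ.get("DATABASE_URL", LOCAL_DB)

if __name__ == "__main__":
    print(database_url())
```

The trade-off called out above still applies: the local stand-in will never behave exactly like RDS or CloudSQL, but at least the switch lives in one place instead of being scattered through your code.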
When is Local the Way to Go?
Developing microservices locally on your laptop is a tricky business, as indicated by the pros and cons listed above. So, let’s get to the nitty-gritty of when the pluses of moving back to local outweigh the minuses, and when it is better to stick with the cloud.
Local microservices development is the route you want to take when:
- You are developing only a few microservices and their interactions are easy to define and maintain
- Inputs and outputs for your services are easy to simulate locally
- Being able to develop offline is a significant requirement
- Executing your code does not depend on cloud infrastructure
- You are trying to cut back on cloud computing costs for dev environments
- You’ve moved to the cloud, but are having a hard time figuring out why your code (mis)behaves as it does, and need to observe it more closely than you can in the cloud
The Local Development Toolbox
Now that we’ve touched on when it is reasonable to develop locally, and some of the drawbacks to be aware of, let’s review the tools that can help you with local development and briefly discuss how to apply them to microservices dev.
Processes:
The easiest way to run microservices locally — and with the lowest overhead and least setup time — is simply as processes run directly from your IDE/shell. But be aware: this is likely to require a fair amount of manual intervention and can be error-prone.
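To make the "manual intervention" point concrete, here is a minimal sketch of launching and tearing down a few services as plain processes. The service names and commands are hypothetical stand-ins (real entries would be things like `["./users-api"]` or `["npm", "start"]`):

```python
import subprocess
import sys
import time

# Hypothetical stand-ins for real microservice commands.
SERVICES = {
    "users-api": [sys.executable, "-c", "import time; time.sleep(60)"],
    "orders-api": [sys.executable, "-c", "import time; time.sleep(60)"],
}

def start_all(services):
    """Launch each service as a child process and return the handles."""
    return {name: subprocess.Popen(cmd) for name, cmd in services.items()}

def stop_all(procs):
    """Terminate every child process and wait for it to exit."""
    for p in procs.values():
        p.terminate()
    for p in procs.values():
        p.wait()

if __name__ == "__main__":
    procs = start_all(SERVICES)
    time.sleep(0.2)
    alive = sorted(n for n, p in procs.items() if p.poll() is None)
    print("running:", ", ".join(alive))
    stop_all(procs)
```

Even this toy version hints at the error-prone parts: nothing restarts a crashed service, nothing sequences startup order, and logs from all processes interleave unless you wire up something yourself.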
Docker Compose:
As a way to meet the challenge of container orchestration, Docker Compose is not necessarily a winning horse to bet on these days. But it’s unquestionably a useful utility for running a few microservice containers locally. The big advantage of Docker Compose configuration is that it is much simpler, with a far gentler learning curve, than the more complex Kubernetes YAML files.
The upside here is clear: Developers on your teams will find it much easier. The downside, however, is equally clear: You’ll need to create and maintain local Docker Compose configurations in addition to Kubernetes configurations, and more advanced features may be hard to replicate.
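For a rough sense of scale, a minimal Compose file for two services and a database might look like the sketch below. The service names, images, build paths, and ports are all hypothetical:

```yaml
services:
  users-api:                # hypothetical service built from a local directory
    build: ./users-api
    ports:
      - "8081:8080"
    environment:
      DATABASE_URL: postgresql://postgres:postgres@db:5432/users
    depends_on:
      - db
  orders-api:               # hypothetical second service
    build: ./orders-api
    ports:
      - "8082:8080"
  db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: postgres
```

A single `docker compose up` brings all three up locally. Note that this file mirrors, but does not replace, your Kubernetes manifests, which is exactly the parallel-configuration maintenance burden mentioned above.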
Minikube:
This official Kubernetes tool makes it easy to spin up a Kubernetes instance on your machine. In fact, the latest version of Docker Desktop also comes with Kubernetes built in. It would be reasonable to think that this lets you quickly spin up your application on Kubernetes on your laptop, right? No such luck.
Surprisingly (and disappointingly), more often than not, your Kubernetes configurations will not work out of the box in Minikube and will require some tweaking. For instance, ingress controllers may be missing, or your laptop might not have enough horsepower to run a full-fledged development environment.
Even worse, Kubernetes is hard to learn and is quite daunting for many developers, so it may be difficult to get team members on board.
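By way of illustration, the tweaking typically looks something like the commands below (the resource sizes and the `k8s/` manifest directory are hypothetical; your cluster will need its own adjustments):

```shell
minikube start --cpus=4 --memory=8g   # size the VM for what your laptop can spare
minikube addons enable ingress        # ingress controllers aren't enabled by default
kubectl apply -f k8s/                 # hypothetical directory holding your manifests
kubectl get pods --watch              # watch for pods stuck in Pending/CrashLoopBackOff
```

Each of these steps is a place where a config that works fine in your cloud cluster can quietly fail on a laptop.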
Hotel:
Hotel is a local process manager for running microservices that supports all types of OSes. The simple-to-use utility allows your team to easily define and share local run configurations and see logs for each process.
One major advantage of Hotel is that it can run both containers and processes. Processes are less expensive to run than containers, so when you’re running locally and performance is limited, the ability to run both can be valuable. Hotel also includes a few useful extras, such as handling of simple use cases for HTTPS, DNS, and proxying.
While defining local configurations with Hotel is much easier, it still takes time, effort, and attention to create and update parallel configurations for Hotel and Kubernetes.
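A rough sketch of the workflow, assuming hypothetical service names and commands (check Hotel’s own docs for the exact flags your version supports):

```shell
hotel start                                  # launch the Hotel daemon (UI on localhost:2000)
hotel add 'npm start' -n users-api           # register a process-based service
hotel add 'docker run users-img' -n users-c  # container-based commands can be registered too
hotel ls                                     # list registered services and their state
```

Once registered, the run configuration lives in a file your teammates can share, which is the main draw over everyone hand-rolling their own shell scripts.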
Summary
In this post, we’ve discussed the pros and cons of developing locally, when local is the preferred way to go, and some tools that help make local microservices dev quicker, easier, and more robust.
Check back for the next installment in this series. We’ll cover the pros and cons of cloud dev and offer some guidance as to when it’s the right approach. And finally, in the fourth and final chapter, we’ll discuss tools that can help when you work in the cloud.