5 Tools Every Developer Should Use in 2022

Richard Dubniczky
9 min read · May 9, 2022


Photo by Dan-Cristian Pădureț on Unsplash

Software development is a rapidly expanding field, with new languages, applications, frameworks, and technologies popping up day by day. With such an extensive list of tools to use, it can be difficult to find the right one for the job. Should you go with the tried and true, or try something new and different?

In this article, I aim to highlight some of the industry-standard tools that I think will serve you well in your development journey (in no particular order).

1. Version Control Systems

Photo by Yancy Min on Unsplash

First of all, I have to mention version control systems, and specifically git. Git has become the go-to way to manage software, and with good reason: it provides excellent tools to manage your code base and collaborate with teammates on nearly any task.

You never have to worry about losing code again: even after major breaking changes, the previous versions will still be available. Git uses small save points called commits to store different versions of your code. Commits are stored efficiently as compressed deltas, so the repository isn't going to get too big any time soon. Say goodbye to v1.zip, v2.zip, … on your Dropbox.

Git also lets you keep your code base clean by ignoring build artifacts, executables, and temporary files (after some initial setup in a .gitignore file). You'll no longer send your teammates big zipped folders full of unnecessary files and directories. Everything that's there is there for a reason.
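As a sketch of that initial setup, you can drop a .gitignore file in the project root. The entries below assume a typical NodeJS project, so adjust them to your own stack:

```shell
# Create a .gitignore so dependencies, build output, and secrets
# never enter version control (entries assume a NodeJS project)
cat > .gitignore <<'EOF'
node_modules/
dist/
*.log
.env
EOF
```

Once this file is in place, git status stops listing the ignored paths entirely.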

Adding git source control to a project:

$ cd my/project/directory
$ git init
$ git add .
$ git commit -m "Initial commit"

And boom, you just committed the first version of your code. After changing some files, all you need to do is repeat steps 3 and 4 with a different message. Even though git has a great CLI, I recommend using a visual tool to manage your commits, like the Visual Studio Code source control panel.
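To make that repeat concrete, here is the whole edit-stage-commit cycle in a throwaway repository (the directory name, file name, and messages are just placeholders):

```shell
# Demonstrate the edit -> stage -> commit cycle in a throwaway repository
repo=demo-repo
mkdir -p "$repo"
git -C "$repo" init -q
git -C "$repo" config user.email "demo@example.com"   # git needs an identity to commit
git -C "$repo" config user.name "Demo"

echo "v1" > "$repo/app.txt"
git -C "$repo" add .
git -C "$repo" commit -q -m "Initial commit"

echo "v2" > "$repo/app.txt"                           # change the file...
git -C "$repo" add .
git -C "$repo" commit -q -m "Update app.txt"          # ...and save a new version

git -C "$repo" log --oneline                          # both save points are listed
```

Every commit in that log is a version you can return to at any time.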

There are many great free ways to store your project in the cloud, like GitHub or GitLab, which provide an extra layer of backup, collaboration tools, and visibility. When you create a practice project, you can publish and share it, slowly building yourself a developer portfolio. If you're looking for some inspiration, feel free to check out my GitLab profile.

If you are interested in learning git, here is a great online interactive tool to do just that: https://learngitbranching.js.org/

2. Shell Scripting

Photo by George Girnas on Unsplash

In every developer's life there comes a point (hopefully) when they finish a project and would like to deploy it somewhere. Most of the time this means some sort of Linux machine. Linux, however, comes with its own scripting language, called bash.

Knowing how to write shell scripts, or more specifically Bash scripts, will save you an enormous amount of time through automation. Bash makes it easy to move files from one machine to another or make basic changes to a system.

Every major operating system gives you some way to run bash. On Windows it's available through WSL, on macOS you have a built-in zsh shell (with bash also available), and Linux, of course, has native bash.
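A quick way to check this on any of those systems is to ask the shell itself (the exact version string will differ from machine to machine):

```shell
# Print the current login shell and the installed bash version;
# the same two lines work under WSL, macOS, and native Linux
echo "$SHELL"
bash --version | head -n 1
```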

This means that you can write short scripts for tasks like setting up your environment, running checks, or automating frequently used tasks, and use them on every operating system out there. This portability makes bash one of the most widely useful scripting languages around.

Not long ago I was working with some legacy Laravel code that was using SQLite databases with random seeded data for development. While coding I often had to reset the database and seed it with fake data, so I made a script for it:

#!/bin/bash
rm -f database/db.sqlite    # delete the old database file
touch database/db.sqlite    # create an empty one in its place
php artisan migrate         # rebuild the schema
php artisan db:seed         # fill it with fake data

I saved this script as dbreset.sh and called it from the terminal by typing ./dbreset.sh. It took me 20 seconds to automate something that would take 10 seconds to do every time, maybe hundreds of times.
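One detail worth noting: a freshly created script is not executable by default, so it needs a one-time chmod before ./dbreset.sh works. The hello.sh script below is a stand-in for dbreset.sh so the example runs anywhere:

```shell
# Create a small script, mark it executable (needed only once), then run it
cat > hello.sh <<'EOF'
#!/bin/bash
echo "database reset done"
EOF
chmod +x hello.sh   # without this, ./hello.sh fails with "Permission denied"
./hello.sh
```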

One additional advantage is that if someone else starts working on the project, or I return to it after months or years (let's be honest, sometimes a few days is enough), I won't have to remember the commands used; they will all be in the project folder.

I recommend all developers get familiar with bash since it’s the industry standard used for code deployments, task automation, and environment setup.

If you are unfamiliar with bash, there are plenty of free tutorials and interactive courses available to get you started.

3. Custom Build Systems

Photo by Danist Soh on Unsplash

You started working on your shiny new project in Python, for example. Hah, joke's on you! Python is an interpreted language, so no build system is required. You write a server that just needs a few dependencies to run, which you list in the usual requirements.txt. Everything is awesome, right? Well, maybe.

Now let’s say you hit up the program again after working on something else for some time. It crashes immediately because dependencies are missing. Okay, you install them, but now… what was the command to start it? Did I need to add any parameters? Do I need to set anything up? Which testing system did I use? How do I run a quick test? Let’s read the instructions which I may or may not have created.

As you might guess, going through this will pose an even greater challenge for someone else, especially if they are relatively new to the language or the tools you've used. So some build system might come in handy. NodeJS, for example, has a great way to handle this, using package.json scripts called via npm or yarn.

Let’s try to imitate that using a Makefile:

.PHONY: all install start test

all: install start

install: requirements.txt
	pip install -r requirements.txt

start:
	python server.py 8080

test:
	python -m pytest .

After writing this sample code, save it to a file called Makefile and mention in the usage instructions that the application is managed with make. What does any developer who sees a Makefile do? They run make in their terminal. In this case, running make will install all dependencies and start the application with the given demo parameters, with no time spent reading instructions. As an additional benefit, make comes pre-installed on most Linux distributions, so there is no hassle there.

You just made your project far more convenient for other developers to pick up and start working on. Also, if you are using automatic testing and deployment, it's easier to edit these commands in one place and call make test in the test environment than to remember to edit the commands everywhere.

This is a very simple use-case for a basic Makefile. You can write fancier ones that create virtual environments and update packages whenever requirements.txt changes. Make was originally intended for C/C++, but we just used it for a Python project, so it's language-independent.
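As a sketch of such a fancier Makefile (the target names and paths here are illustrative, not from a specific project), you could rebuild a virtual environment only when requirements.txt changes:

```make
.PHONY: start test

# The venv is rebuilt only when requirements.txt is newer than it
venv/bin/activate: requirements.txt
	python3 -m venv venv
	venv/bin/pip install -r requirements.txt

start: venv/bin/activate
	venv/bin/python server.py 8080

test: venv/bin/activate
	venv/bin/python -m pytest .
```

Make compares file timestamps between a target and its prerequisites, so dependency reinstalls happen automatically and only when actually needed.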

If you are interested in Make and other build systems, the official GNU Make manual is a good place to dig deeper.

4. CI/CD Pipelines

Photo by T K on Unsplash

Oh yes, the buzzword everyone is talking about nowadays: CI/CD, alongside AI and blockchain, of course. But it's popular for good reason! Let's take a look at how a pipeline can help your work.

Let's say you are working on a brand new website with your team. A teammate runs into a problem or wants to discuss a feature, so they ask for help and send some screenshots. You can take a look, but you can't test it on your device or any other device. What do you do? They might push the code and leave it to you to fetch their branch, install everything, build the project, and host it locally. Finally, you can take a look. But doing all this might have taken more of your time than solving the actual problem will.

Can we do better than that? Continuous Deployment says yes. In short, it means deploying (or hosting) each new version of your project as you go. Of course, you wouldn't want to do this manually by uploading to an FTP server every single time. Instead, you can create a pipeline that automatically deploys your code to a server when you push. GitLab, like many other platforms, has a built-in tool for this. Let's see an example:

image: node:14

stages:
  - deploy

pages:
  stage: deploy
  only:
    - main
  artifacts:
    paths:
      - public
  script:
    # Install dependencies
    - yarn install
    # Build page
    - yarn build
    # Move dist to public (required by GitLab Pages)
    - mv ./dist ./public

If you save this short script as .gitlab-ci.yml in a NodeJS repository that builds to the dist folder using yarn, GitLab will deploy your website and make it available as a static page. In this case it will only deploy changes committed to the main branch, so you can control who is allowed to update the deployed site by requiring developers to work on other branches and approving merges manually.

This is just one example, of course: you could remotely connect to an FTP server and move the files, or deploy to a cluster if you're feeling fancy.

Now, you've fixed the problem and everything works fine. Right? Well, mostly. What happens if there were errors in the pushed code? You can look at the merge request and try to spot them before you approve it. You might even download the code and run the tests yourself, which works fine for maybe the first 14 times you do it. After a while, it just gets old. Let's automate this as well!

image: node:14

stages:
  - test
  - deploy

run_tests:
  stage: test
  script:
    - yarn install
    - yarn test

pages:
  # ... unchanged

This has two main functions now:

  1. You will see whether the tests completed successfully after every push, and each merge request will contain the latest pipeline result. So all you need to do is wait for a green check before allowing the merge.
  2. Since the deployment stage comes after the test stage, if any of the tests fail, the deployment will be skipped automatically.

There are a lot of great pipelining tools, including those built into DevOps platforms like GitLab CI and GitHub Actions, as well as standalone tools, the most popular being Jenkins.

Creating pipelines for the most repetitive and critical parts of your projects will help you focus on what’s important in the long run. Once you create a few pipelines yourself, doing the next one will feel way more natural.

5. Container Environments

Photo by Dominik Lückmann on Unsplash

Migrating from one environment to another can be a chore, especially if you've used the same machine for a while and suddenly switch to a new one. You try to download and run a project you worked on a while back, but it does not run.

Chances are, you’ve had this happen to you, or it probably will. One easy step to circumvent this is to build a stable container environment. With that, even in the case of a local build failure, you’ll be able to get it up and running, and start diagnosing the problem faster.

All you need is to set up a Dockerfile in the project root and specify some parameters for your application. Let’s look at this with an example NodeJS server project.

FROM node:15

WORKDIR /app
COPY . .
RUN npm install

ENTRYPOINT [ "npm" ]
CMD [ "start" ]

We specify the base Linux image using the FROM instruction. Thankfully, the NodeJS team already maintains one, so all we need to do is add our code, install the dependencies, and set the command to run when the container starts.

We then build the container and give it a name:

docker build -t node-server .

Afterward, all we need to do is run the container:

docker run node-server

Congratulations! You just made a container for your code for easy deployment. This was a simple example, but containerizing applications is a big subject with countless variations. Use this as a starting point to experiment!

If you're interested in learning more about containers, the official Docker documentation is a good place to start.

These were my picks for the 5 most important tools to use in 2022!
What would make it to your top 5?

Thank you for reading the article, I hope you found it useful! If you did, please consider buying me a coffee, ☕ cheers!

Happy coding,
Richard

Written by Richard Dubniczky

Security/Infrastructure Engineer, Full-Stack Software Developer, Cryptography MSc
