aka how to make a bare-bone PC serve my development needs.
The problem
The project that I'm working on is getting bigger and bigger. IntelliJ and Chrome eat a lot of RAM, and my work laptop has only 16 GB of it, soldered to the motherboard. The development environment itself is Linux (or macOS), but one part of the final product is a plugin for an Autodesk product, which is only available for Windows. Technically speaking, the application has four parts:
- Keycloak for authentication / authorization
- Back-end running with lein repl, using an embedded PostgreSQL
- Front-end via Shadow-cljs, with some additional Portfolio stories, and three different tenants with different CSS
- The plugin, which requires Windows
Naive solution
Of course, I could use virtualization for the Windows guest to test the plugin itself, but it would take resources away from the development environment. Get VirtualBox, allocate memory and a hard disk, install Windows. Then enjoy how swapping comes to save my life and kills my SSD. With NAT networking, even the plugin can reach the platform. It's straightforward, but not so comfy.
Time to change
It's time to buy a second computer. Let's look at the requirements and the possibilities.
Requirements:
- It must run Windows. Not because I want it, but I need it for testing.
- If possible, migrate the heavy parts of the development environment to it.
- It would be nice to use it for other purposes too.
Possibilities:
Buy a used server:
- Not much RAM installed (most of what I found had only 16 GB)
- Used
- Old(er) hardware for a high price
- High power consumption, high electricity bills
Buy a desktop PC:
- Even higher price
- But new hardware
- Still relatively high power consumption
- Tower cases take up a lot of space, and I don't have much to spare
Buy a bare-bone PC:
- Better hardware for a lower price
- Extendable
- Brand new!
Needless to say, I've decided to buy a bare-bone. A 10th generation Intel® Core™ i7-10710U processor (six cores, 12 threads, up to 4.7 GHz, 15 W TDP), a 1 TB M.2 SSD and 64 GB of DDR4 RAM should be future-proof enough. Plenty of USB ports, HDMI, Gigabit LAN, even an RS232 connector. All for the price of a new mid-range laptop.
Operating system(s)
Remote host
The key is virtualization. I need Windows and Linux too, at the same time. It would be nice to have an easy-peasy administration interface, with good documentation available and a big community around it. After some research, I've decided to use Proxmox as the host OS. It has a nice and clean web interface, which makes it easy to install LXC containers and VMs.
Virtual machines
As I mentioned, I need a Windows installation. It's straightforward; just follow the best practices from the PVE wiki. I need a Linux VM too; for easy living, I've chosen Debian. I don't need anything extra, just a stable and familiar system with Clojure, Leiningen, deps, yarn and a PostgreSQL client. I like working in a window manager, and my second favourite choice is Xfce, so that's what the VM got. I have assigned eight cores and 32 GB of memory to the Debian VM, and four cores with 24 GB to the Windows VM.
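I created both VMs through the Proxmox web interface, but the same sizing can also be expressed with Proxmox's qm CLI. The sketch below is only an illustration for the Debian VM: the VM ID, the local-lvm storage, the 200 GB disk and the ISO file name are assumptions, not values taken from my setup.

# Sketch: create the Debian dev VM from the Proxmox shell (ID, storage, ISO assumed)
qm create 101 \
  --name debian-dev \
  --cores 8 \
  --memory 32768 \
  --net0 virtio,bridge=vmbr0 \
  --scsihw virtio-scsi-pci \
  --scsi0 local-lvm:200 \
  --ide2 local:iso/debian-netinst.iso,media=cdrom \
  --ostype l26
qm start 101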
Laptop / console / IDE
My setup stays the same as before: Xubuntu with i3wm, IntelliJ with Cursive. Additionally, I've ended up using Remmina for remotely accessing the Windows VM. The RDP protocol adds file sharing, I can select the screen resolution, and I can disable all the fancy eye-candy for faster rendering. Accessing the Debian VM happens through Proxmox's web interface.
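Remmina is only a front-end for the RDP session; roughly the same connection could be opened from the command line with FreeRDP. This is just an illustration (not what I actually use), and the Windows VM's address and the shared folder are assumptions:

# Hypothetical xfreerdp equivalent of the Remmina profile:
# fixed resolution, clipboard and folder sharing, eye-candy disabled
xfreerdp /v:192.168.0.107 /u:g-krisztian /size:1920x1080 \
  +clipboard /drive:shared,/home/g-krisztian/shared \
  -wallpaper -themes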
Final setup
Network
I have a router; the bare-bone PC is connected via an Ethernet cable, and I've set up a static IP for it, outside the DHCP range to avoid conflicts. I'm lucky, and my virtual machines get semi-fixed IPs from the router; for example, the Debian VM gets 192.168.0.106 every time, so I don't need to deal with static IPs there. My laptop is connected to the same router via Wi-Fi.
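For reference, on a Proxmox host the static address usually lives in /etc/network/interfaces, on the bridge that the VMs share. A sketch of what that could look like; the host IP, gateway and NIC name are assumptions:

# /etc/network/interfaces (sketch): static address on the Proxmox bridge
auto vmbr0
iface vmbr0 inet static
    address 192.168.0.50/24
    gateway 192.168.0.1
    bridge-ports eno1
    bridge-stp off
    bridge-fd 0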
Users
To avoid confusion with different paths, users and passwords, I have set up the same user on all Linux systems.
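In practice this only means creating the same account everywhere, so that paths like /home/g-krisztian/project-dir are identical on the laptop and the VM; a minimal sketch on the Debian VM:

# Create the same user on the Debian VM as on the laptop
sudo adduser g-krisztian
sudo usermod -aG sudo g-krisztian   # optional: sudo rights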
File sharing
In an ideal world, I'd use NFS for file sharing between the two Linux machines, but that could cause issues here. What I learnt while designing my configuration is that inotify may not work on shared file systems, and that could kill Shadow-cljs's refresh mechanism. After some research, I've ended up using rsync. After installing it on both Linux machines, I've set up an rsync daemon config on the virtual machine, exposing the same path as I have on my laptop:
[project-dir]
uid = g-krisztian
gid = *
path = /home/g-krisztian/project-dir
comment = project root
read only = no
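This module definition typically goes into /etc/rsyncd.conf on the VM, and the rsync daemon has to be running to serve it. A minimal sketch, assuming a Debian guest:

# Serve /etc/rsyncd.conf with the rsync daemon
sudo rsync --daemon
# or, using the systemd unit shipped with Debian's rsync package:
sudo systemctl enable --now rsync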
On my laptop, I'm using the File Watchers plugin for IDEA (even for automatic formatting), so I have added a new rule:
Name: Rsync
File type: Any
Scope: Project files
Program: rsync
Arguments: -avz --update $ProjectFileDir$/ rsync://192.168.0.106/project-dir
This way, every time I save a file in the IDE, it syncs the whole project directory. To make it smoother, I've generated and shared an SSH key between the two machines. It's strongly advised to use the --update switch, or to exclude the directories where Shadow-cljs compiles its output; otherwise you'll have a hard time figuring out where the compiled JS files have gone after syncing (see the sketch below).
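What the File Watcher runs is equivalent to the command below. The excluded paths are only examples: .shadow-cljs is the compiler's cache directory, while the actual output directory depends on your build config (resources/public/js is a common choice, assumed here):

# Command-line equivalent of the File Watcher rule, skipping Shadow-cljs output
rsync -avz --update \
  --exclude '.shadow-cljs/' \
  --exclude 'resources/public/js/' \
  /home/g-krisztian/project-dir/ rsync://192.168.0.106/project-dir
# Note: rsync never deletes remote files unless --delete is given;
# see the cleanup note in the conclusions.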
Repl connection
If you work with Clojure, you know that connecting to a running repl is a must for development. But maybe you have never connected to a remote session; in that case, the connection takes these steps.
1. Start the repl on a fixed port, open to connections from anywhere.
In the VM
lein repl :start :port 4000 :host 0.0.0.0
The :port 4000 parameter sets which port should be used for connections, and :host 0.0.0.0 allows connections from any IP address.
2. SSH port forwarding
On the local machine, set up port forwarding to the remote VM for the IDE:
ssh -L 4000:192.168.0.106:4000 g-krisztian@192.168.0.106
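If you do this every day, the forwarding can also live in ~/.ssh/config instead of being retyped; a small sketch, where the devbox alias is my own invention:

# ~/.ssh/config (sketch)
Host devbox
    HostName 192.168.0.106
    User g-krisztian
    LocalForward 4000 localhost:4000
# then the tunnel is simply: ssh devbox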
3. Connect
Set up a remote nREPL session in IDEA:
Name: Container
Connection type: nRepl
Context module: project-name
Connect to server
Host: localhost
Port: 4000
and connect.
Conclusions
What I won
- I don't have to worry about running out of memory or overheating my CPU.
- I can use different database dumps during development without worrying about committing them by accident (they live on the remote VM only).
- Better Windows experience; less is more. I don't need to set up shared folders and clipboards.
- Less noise. The bare-bone PC is in another room; I can barely hear it.
- Remotely running Playwright tests. It's always fun to watch them run on the remote screen.
- Experience that I can share with you. 😉
What I lost
- One second. Whenever I save a file in IDEA, it takes about a second to synchronize the changed files, and Shadow-cljs only starts re-compiling them once the sync is finished.
- Easy travel. When I'm traveling and planning to do some work, I have to pack the bare-bone PC too. And, just to be sure, my router as well.
What is missing
- CLJS repl. I didn't care about connecting to Shadow-cljs's repl, so I have no idea how it could be set up to allow a remote connection.
- Debugging. The debugging that IntelliJ offers for Java and for Clojure, with breakpoints and code evaluation, is something I really miss. The only workaround is starting the repl inside IDEA itself, which works.
- Cleanup. Be aware that rsync only updates files and never deletes any, even if you delete them on your local machine. And some files are really dangerous when they linger where they shouldn't, for example a migration from another branch.
- Same topic, the other way around: synchronization in the reverse direction. If you generate a file from your repl and need it on your local machine, you can use scp to fetch it, as sketched below.
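A minimal sketch of that reverse copy; the file name is hypothetical:

# Pull a file generated on the VM back to the laptop
scp g-krisztian@192.168.0.106:/home/g-krisztian/project-dir/report.csv ~/project-dir/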
References
- Proxmox: https://www.proxmox.com/en/
- Proxmox Windows 10 best practices: https://pve.proxmox.com/wiki/Windows_10_guest_best_practices
- SSH port forwarding / tunneling: https://www.ssh.com/academy/ssh/tunneling-example
- Setup rsync: https://jumpcloud.com/blog/how-to-use-rsync-remote-backup-linux-system