What programming/technical projects have you been working on?
This is a recurring post to discuss programming or other technical projects that we've been working on. Tell us about one of your recent projects, either at work or personal projects. What's interesting about it? Are you having trouble with anything?
Outside work, I'm currently preparing a patchset for the SCANsat mod for the game Kerbal Space Program that will allow players to export the satellite imagery they make as GeoTIFF - an extension to TIFF that also carries metadata letting GIS software interpret the images.
For context: Kerbal Space Program is a video game in which you build (relatively) realistic spacecraft from pieces and then fly them around a fictional solar system governed by (a close approximation to) real orbital mechanics. The game has a big modding community. One of the popular mods is SCANsat, which adds the mechanic of mapping various characteristics of celestial bodies via instruments on satellites. The common use of such mapping is to discover areas with abundant resources (which can be used to refuel and build spacecraft off-world) and terrain flat enough to support a mining base that doesn't glitch the game's physics engine.
At some point, I thought: wouldn't it be nice to analyze SCANsat data in software that's used for similar purposes in the real world? Hence the idea for GeoTIFF exporter.
The main challenge here is understanding how to compute and then encode georeferencing data - the information that lets GIS software determine the actual location any given pixel represents on a celestial body (and thus compute distances, or align multiple images on the same map). Since I've never done anything with GIS before, this required a deep dive into the mathematics of map projections, as well as industry-specific terminology. It turns out the GIS space is very much focused on Earth, and it's hard to find information on encoding data about other celestial bodies (much less fictional ones).
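For anyone curious what "georeferencing data" means concretely: as I understand it from the GeoTIFF spec (not from this patchset, which I haven't seen), the simplest form is an affine pixel-to-map transform built from two tags, ModelTiepointTag and ModelPixelScaleTag. A rough TypeScript sketch of what GIS software computes from those tags:

```typescript
// GeoTIFF georeferencing (simplest case): ModelTiepointTag anchors one
// raster point (i, j) to a model-space point (x, y), and ModelPixelScaleTag
// gives the size of a pixel in model units. Together they define an affine
// transform from pixel indices to map coordinates.
// (Illustrative sketch only; real GeoTIFFs may instead carry a full
// ModelTransformationTag, and the projection itself lives in GeoKeys.)

interface Georeference {
  tiepoint: { i: number; j: number; x: number; y: number }; // raster -> model anchor
  pixelScale: { sx: number; sy: number };                   // model units per pixel
}

// Map a pixel (col, row) to model-space coordinates.
// Note the sign flip on Y: raster rows grow downward, map Y grows upward.
function pixelToModel(
  g: Georeference,
  col: number,
  row: number,
): { x: number; y: number } {
  return {
    x: g.tiepoint.x + (col - g.tiepoint.i) * g.pixelScale.sx,
    y: g.tiepoint.y - (row - g.tiepoint.j) * g.pixelScale.sy,
  };
}
```

For a whole-body equirectangular map, the tiepoint might anchor pixel (0, 0) at lon/lat (-180, 90) with a scale of, say, 0.5 degrees per pixel; the hard part the post describes is choosing and encoding those values for a non-Earth (and here, fictional) body.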
I was hoping to get this done in a week. Right now, I'm two weeks in, and I expect it to take at least one more. But at this point I think I have most of the math and format-specific problems sorted out.
If anyone here is interested in Kerbal Space Program or satellite imagery and GIS stuff, let me know, and I'll post something about it in the future, when I'm done with my mod to a mod :).
I haven't played KSP in awhile, but I greatly enjoyed it. I look forward to seeing your progress!
I'm working on re-re-making my server (after ~2 months), mostly because I got bored and wanted something to do.
I've ditched my "Ansible+bare metal" setup and instead gone (back) to Docker. I'm building all my images from scratch in hopes of keeping everything light and reducing the number of duplicate files by building on top of a shared base. Docker still isn't great, and I still have a few issues with it¹.
Instead of Gitea, I've switched to an ungodly configuration of Gitolite+cgit+git-daemon, as I don't need most of the "fluff" Gitea has, and also wanted to experiment with new things.
For static pages, I'm not entirely sure how to proceed. On my current setup, I use regular (non-bare) Git repositories and a bunch of hacky configuration² to deal with them. As Gitolite only supports bare repositories, I cannot use this setup. I have two choices:
I still have some more stuff to configure before working on trying to mount this pile of Docker images into a proper server setup. I've already done most of the plumbing of this part, and let me tell you it's probably the hackiest part of it all³.
I'll most likely be ditching my self-hosted Mastodon instance when I put this into production. I wrote a bit about why on Mastodon, but because it's at risk of deletion, I'll copy-paste it here instead of linking to the original.
I won't leave the fediverse completely; instead I'll probably go onto an instance hosted by someone else. I have an account on mastodon.social I can re-use, but .social already has too many people on it, so I might look around for somewhere new.
¹: As an example, it's a PITA to deal with "static files" on web-based things. I built a hacky solution for that, but it isn't used for now, as I've ditched both services that would require it.
²: https://git.sr.ht/~admicos/nginxpages exists as an older version of this setup, that uses Docker.
³: Pulling Docker images through a temporary Docker registry hosted on my workstation through a reverse SSH tunnel?! What?
Update (I guess I'll post updates now, why not?)
I've been working on a script to automatically set everything up, and ~3 hours ago I finished™ implementing the hardest parts, and everything mostly works™ (there are still some likely permission issues to work through; should be quick, I hope).
After all that, I'll still need an acme/letsencrypt client. I was looking at dehydrated and getssl, but they seem to require bash, which my host distro does not have by default, and I'm not planning to install it just for a single script.
Suggestions on acme clients that are ... are totally welcome and appreciated.
Aside from SSL, I'll need to configure a firewall. I've already configured ufw to a "good enough" state, so all that's left is fail2ban and figuring out how to integrate it into the Docker setup I have. (Mounting some volumes will most likely Be Enough™.)
So far, all this is either happening on my workstation, or on a virtual machine I can reload back into a "fresh" state to test the scripts again.
If nothing goes wrong, I'm planning to switch over to this setup in 2-3 days. I might keep the old server running for a day or more while figuring out how to deal with Mastodon (might just move back to mastodon.social, it's the easiest option I have). Finally, I'll run `tootctl self-destruct` and tear down the old server for good, backing up what isn't backed up already.

Another Update!
As far as I can tell right now, everything is done! (Locally, anyway)
For acme, I went with lego, mostly because it was already packaged in my host distro. I intentionally didn't automate the SSL parts, as I don't want to deal with the possibility of it going wrong. The repetitive parts are scripted, though, so all I have to do when deploying is to copy a crontab line, edit a script to point it to the non-staging Let's Encrypt endpoint, and run it.
I can't really recall anything else I should've done, so it seems like it's all ready to be deployed. I might just go ahead and do it tomorrow.
For Mastodon, I'll just move back to mastodon.social, as it's the easiest option I have, and doesn't really matter that much.
Final Update!
Re-formatted the server, ran the deployment script, and it seems to have worked mostly well (~3 lines needed changing near the end due to some permission things I should've expected, though these were trivial parts I could've easily done by hand if I had to).
All I need to do now is to re-generate my static sites and upload them by hand, while looking around for stuff to automate that later on!
Tell me how many passwords I've accidentally committed
Re: acme clients, not sure if it'll work for your use case, but check out Caddy (v2).
How do you actually deploy the containers? I'm using Docker a lot to self-host, but I still use Ansible to install the Docker daemon, build the containers, and handle deployment. Curious if anybody has a better alternative.
I build the containers on my local machine (as it's way more powerful than my server), and transfer them through a temporary local Docker registry over an SSH connection. [1, 2, 3].
This isn't as automated as I'd like yet, but it works pretty well if you have software (like Mastodon) that needs way more resources to build than to run. (Which I currently don't, so this is unnecessary for my needs.)
For deployment, my scripts install the Docker daemon automatically, but I start the containers manually (via `docker-compose up`) after I set up everything. This gives me a time window where I can set up SSL, restore backups, or whatever else I might need to do before starting everything up and leaving them be.

I don't expect any of this to be better than whatever you're doing, though.
Ah OK, yes, I suppose I could write a script like that. Was just wondering what other people were using. Thanks for sharing.
I made a static site generator to learn more about Deno! The biggest thing I wanted that I haven't found in existing SSGs is a fully customizable and programmable template pipeline, so I can create pages for any type of input file. So I actually import JS/TS modules at runtime and execute them for each content file, and they can choose to do whatever they want.
The biggest annoyance I'm having is with paths. There are so many paths to keep track of:

- the content root: `/home/forest/projects/websites/blog/content/`
- the content file, relative to that root: `essays/logitech-software.md`
- the output root: `/home/forest/projects/websites/blog/output/`
- the page's output path: `/essays/logitech-software`
- the actual file written: `/essays/logitech-software/index.html`
The thing is, the output path is actually a directory! So every page needs to be aware of that, because I want to use relative paths everywhere (so the resulting site can be hosted in any subfolder). It means a relative link from `/essays/logitech-software` to `/essays/valve-index-review` is actually `../valve-index-review`. So I made a `page.link(outputPath)` method, which converts an output-root-relative filename to a relative link.

Of course, this all seems fairly obvious in hindsight, but it's a lot of work to get everything working correctly in every corner case. (Example: I treat content files named `index` differently: the content file `/essays/valve-index-review/index.md` is compiled to `/essays/valve-index-review`. So the content filename is actually in a different place compared to the output filename.)

I wrote code to rewrite links (including images) like that recently: https://github.com/Apostolique/apos-docs/blob/v0.2.5/src/.eleventy.js#L56. It's surprising how many edge cases there are to cover, and I'm sure I still missed some.
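For what it's worth, the `page.link`-style conversion described above could look roughly like this (my guess at the behavior; `relativeLink` and its signature are made up for illustration, not the actual implementation):

```typescript
// Convert an output-root-relative path (e.g. "/essays/valve-index-review")
// into a link relative to the current page's output *directory*
// (e.g. "/essays/logitech-software", which is served as .../index.html).
function relativeLink(fromDir: string, target: string): string {
  const from = fromDir.split("/").filter(Boolean);
  const to = target.split("/").filter(Boolean);
  // Drop the shared leading segments.
  let common = 0;
  while (common < from.length && common < to.length && from[common] === to[common]) {
    common++;
  }
  // Go up once for each remaining source segment, then down into the target.
  const ups = from.length - common;
  const parts: string[] = [...Array<string>(ups).fill(".."), ...to.slice(common)];
  return parts.length ? parts.join("/") : ".";
}
```

So `relativeLink("/essays/logitech-software", "/essays/valve-index-review")` yields `../valve-index-review`, matching the example in the comment; the corner cases (index files, trailing slashes, links to the page itself) are exactly where this sort of helper gets hairy.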
I’m wondering if using a different type for each kind of path might help? I’m not too familiar with Typescript’s type system, but I would try that in Go if I started getting confused.
This is a very good idea, thank you! I don't know why on earth I didn't think of that. Ah, the things JavaScript does to a man...
I re-wrote https://midnight.pub (a virtual pub / internet message board) from React + serverless storage to Go and Postgres. I completely removed all JavaScript and tried to make it as simple as possible. The executable is tiny and runs on a Linode machine with enough space to last for a really long time. The whole app is served over Gemini and translated to HTML at request time. I found it to be a good compromise, since gemtext is straightforward to parse. Would love to connect with any of you who've got a capsule on Gemini. :)
I really want to figure out what Gemini is and how I can tweak gurlic to serve it. I did check out midnight.pub a while back and found it inspiring. Gurlic uses Go + Postgres too, but there's a fair bit of JS for the frontend.
Do you have any stats for midnight.pub? As in, daily users and activity? And what kind of linode instance are you running, specs-wise?
Anyway, very cool!
Thanks ahq! I'm happy to hear you found it inspiring. :) I didn't know about gurlic, I will definitely check it out as well.
I used to have analytics a while back, but decided to remove them. Usually, there are one or two posts a day. For linode, I take the Nanode 1GB one at $5/month. I also use tarsnap to take daily snapshots of the go executable / database.
The best way to see what Gemini is would be to start by getting a Gemini browser. My favorite one is Amfora, which is actually written in go: https://github.com/makeworld-the-better-one/amfora
Once you have it, you can navigate to CAPCOM, a link aggregator with activity from the whole geminispace: gemini://gemini.circumlunar.space/capcom
One thing you will notice is that Gemini is mostly made for reading content. There's no real equivalent to a POST request like there is in HTTP. That's because most users self-publish their content as static files. I found that having Gemini plus a web frontend for administration (writing/publishing) is a good compromise.
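For anyone wondering how small the protocol is: per the Gemini spec, a request is a single URL line over TLS, and a response starts with one header line of a two-digit status plus a meta field, then the body. A sketch of parsing that header (based on the spec, not midnight's code):

```typescript
// A Gemini response begins with one header line: "<STATUS> <META>\r\n".
// STATUS is two digits (2x = success, and META is then the MIME type,
// usually "text/gemini"); the body follows. There are no other headers,
// no POST, and no state: every request is one TLS connection carrying
// a single URL.
interface GeminiHeader {
  status: number; // e.g. 20 = success, 51 = not found
  meta: string;   // MIME type on success, human-readable message otherwise
}

function parseGeminiHeader(line: string): GeminiHeader {
  const m = line.match(/^(\d\d) (.*?)\r?\n?$/);
  if (!m) throw new Error("malformed Gemini response header");
  return { status: Number(m[1]), meta: m[2] };
}
```

That one header line is essentially the whole response protocol, which is why translating gemtext to HTML at request time stays so cheap.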
By the way, here is the source code of midnight, if you are interested! It's pretty messy since I'm going through a lot of experimenting these days, but maybe it can help you if you want to explore how to code for gemini a bit more: https://sr.ht/~m15o/midnight-pub/
Cheers!
I like that midnight.pub is super lean.
Thanks for the link to the source code, I'll definitely have a look in a bit. Took a glance and it didn't look very messy at all. :)
I don't know what made me think of it, but I ended up looking up Craig Reynolds' original paper on Flocking "Boids" and implementing it. It's pretty cool to tweak the various parameters and see how that affects the flocking of the animals. I just did a simple 2D version. I might expand it to 3D in the future. It wouldn't be hard, but I'm lazy and on vacation.
And by an odd coincidence, Ars Technica released this article today, which discusses research that updates some of our knowledge on flocking and swarming behaviors.
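For anyone who hasn't seen the algorithm: Reynolds' boids boil down to three steering rules per boid. A minimal 2D sketch (the weights here are arbitrary tuning knobs I made up, which is exactly the part that's fun to tweak; a real version would also limit the neighborhood radius and speed):

```typescript
// Classic boids: each boid steers by three rules computed over the flock:
// separation (avoid crowding), alignment (match neighbors' velocity), and
// cohesion (move toward the local center of mass). For brevity, every boid
// here sees every other boid instead of only those within a radius.
interface Boid { x: number; y: number; vx: number; vy: number; }

function step(boids: Boid[], sep = 0.05, ali = 0.05, coh = 0.005): void {
  const next = boids.map((b) => {
    let sx = 0, sy = 0; // separation: away from others
    let ax = 0, ay = 0; // alignment: sum of others' velocities
    let cx = 0, cy = 0; // cohesion: sum of others' positions
    const others = boids.filter((o) => o !== b);
    for (const o of others) {
      sx += b.x - o.x; sy += b.y - o.y;
      ax += o.vx;      ay += o.vy;
      cx += o.x;       cy += o.y;
    }
    const n = others.length || 1;
    return {
      vx: b.vx + sep * (sx / n) + ali * (ax / n - b.vx) + coh * (cx / n - b.x),
      vy: b.vy + sep * (sy / n) + ali * (ay / n - b.vy) + coh * (cy / n - b.y),
    };
  });
  boids.forEach((b, i) => {
    b.vx = next[i].vx; b.vy = next[i].vy;
    b.x += b.vx; b.y += b.vy;
  });
}
```

Turning any one of the three weights up or down changes the flock's character dramatically, which matches the "tweak the parameters and watch" appeal described above.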
This probably seems rudimentary to most, but it's the most involved programming project I've ever completed.
I finally worked up the courage/motivation to try to tackle the Arduino coding side of my LED cloud project, after hemming and hawing for probably a year. Five days of complete trial and error, mostly error, and a few questions to forums and such that were only tangentially helpful (perhaps it's me being a layman, but programmers don't seem particularly interested in giving direct answers to noobs like me), but I finally got some code that mostly does what I'd like it to.
Anyway, I'm glad to have this one under my belt. I really do have a ton of respect for people that do this kind of thing. It's still VERY foreign to me, but I'm a little further on my journey to it all not seeming completely alien, so yay.
I started at my job a year ago with zero knowledge of scripting languages or the command line. I've come a long way since then, and in the last week, I worked on a project to do automated GUI testing.
To start, I wrote some input commands in a file for the software I test. Those call some Perl scripts that do a given operation and then write instructions for the main software, either to continue or to abort. If the preliminary Perl scripts pass, I then use an automated GUI tester to run through a sequence of commands and print out whether they passed or failed, and then I bring those results back into the main software. I had to think a lot about this project, since I wanted it to do/have the following features:
Basically, I needed these scripts and testing framework to be accessible by anyone who looks at them after me. All they need to do now is record the GUI commands, give their GUI test a name and specify 2 command line args for how the script should run (or don't, and the script will use some reasonable defaults).
I installed neuron on my unraid tower to better organize my notes (Zettels). Currently I'm using the issues page of a gitlab repo for managing my Zettels. While this does work fairly well, it's not exactly the intended usage. Compared to neuron, using gitlab issues has several drawbacks.
But neuron still has its drawbacks:
I'm learning React, to attempt to learn more about what "modern web development" is all about. The last time I seriously (not professionally) built websites was before HTML5, so I have a lot to catch up on.
This comes with the purpose of setting up a website for my band. I don't even know what I want to put on it yet, but one of the websites we host our music on has a React library for their streaming API, so I guess I'm going to start by embedding their player into a blank page and build around that? Not sure yet really, I guess I'd hoped that the more I learn about React, the more I'll think of cool stuff to do with it along the way.
I've finally finished setting up automatic disk decryption on my home Debian Sid system using TPM2. It's all automatic, with no need to run any commands manually during kernel updates etc.: just like BitLocker on Windows. And dual boot works just fine too.
I've been sitting down to this project every few months and moving it forward slightly each time. One of my most important goals was simplicity: both to set up and to understand.
I've ended up with a setup that consists of
Now I only want to write a howto somewhere and share it with more folks who might find it useful: which is a challenge in itself, because I don't have a place like this yet. For now I'm mostly leaning toward setting up a static blog using Hugo, hosted on GCS+Cloudflare... Just another small project...
I'm working through Land of Lisp :D
I've published my Roku virtual remote web app: https://sr.ht/~pistos/remoteku/ I think it's ready for others to try to use. Constructive feedback welcome.
I use this regularly, especially to type text quickly into text fields on the Roku (like login credentials, and search terms).
I mentioned previously that I was teaching myself how Perlin noise worked. I ended up figuring out how to make it repeat at the edges, so I made a simple terrain generator that tiles infinitely. It looks like a late-90s demo because it's got terrible lighting and I'm coloring the polygons based on their height (with a little bit of probability mixed in), but it was a fun exercise! Very little of my actual work involves generating terrain, but for some reason, I find exploring these algorithms to be a lot of fun.
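The usual trick for making lattice noise repeat at the edges is to wrap the lattice indices modulo the repeat period, so the edge samples reuse the same lattice values as the opposite edge. A sketch using value noise (simpler than full Perlin, but the wrapping idea is the same; for Perlin you'd wrap the gradient indices instead):

```typescript
// Tileable value noise: lattice values are looked up with indices wrapped
// modulo `period`, so noise(x, y) === noise(x + period, y) and the
// pattern repeats seamlessly.

// Cheap deterministic pseudo-random value in [0, 1) for a lattice point.
// (A real implementation would use a permutation table instead.)
function hash(i: number, j: number): number {
  const n = Math.sin(i * 127.1 + j * 311.7) * 43758.5453;
  return n - Math.floor(n);
}

const fade = (t: number) => t * t * (3 - 2 * t); // smoothstep easing

function tileableNoise(x: number, y: number, period: number): number {
  const xi = Math.floor(x), yi = Math.floor(y);
  const xf = x - xi, yf = y - yi;
  // Wrap lattice indices so the pattern repeats every `period` units.
  const w = (i: number) => ((i % period) + period) % period;
  const v00 = hash(w(xi), w(yi));
  const v10 = hash(w(xi + 1), w(yi));
  const v01 = hash(w(xi), w(yi + 1));
  const v11 = hash(w(xi + 1), w(yi + 1));
  // Bilinear interpolation with eased fractional coordinates.
  const u = fade(xf), v = fade(yf);
  const top = v00 + (v10 - v00) * u;
  const bot = v01 + (v11 - v01) * u;
  return top + (bot - top) * v;
}
```

Because the only thing that changes between `x` and `x + period` is the (wrapped) lattice index, the interpolated values match exactly at the seam, which is what makes the terrain tile.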
I also studied that 2 weeks ago while going through The Book of Shaders, specifically this chapter: https://thebookofshaders.com/11/. It's so fun to play with.