10 votes

What programming/technical projects have you been working on?

This is a recurring post to discuss programming or other technical projects that we've been working on. Tell us about one of your recent projects, either at work or personal projects. What's interesting about it? Are you having trouble with anything?

29 comments

  1. [2]
    TeMPOraL
    Link

    Outside work, I'm currently preparing a patchset for the SCANsat mod for the game Kerbal Space Program that will allow players to export the satellite imagery they make as GeoTIFF - an extension to TIFF that also carries metadata letting GIS software interpret the images.

    For context: Kerbal Space Program is a videogame, in which you build (relatively) realistic spacecraft from pieces and then fly them around a fictional solar system, governed by (close approximation to) real orbital mechanics. The game has a big modding community. One of the popular mods is SCANsat, which adds the mechanic of mapping various characteristics of celestial bodies via instruments on satellites. The common use of such mapping is to discover areas with abundant resources (that can be used to refuel and build spacecraft off-world) and flat enough terrain to support a mining base that doesn't glitch the game's physics engine.

    At some point, I thought: wouldn't it be nice to analyze SCANsat data in software that's used for similar purposes in the real world? Hence the idea for GeoTIFF exporter.

    The main challenge here is understanding how to compute and then encode georeferencing data - information that lets GIS software determine the actual location any given pixel represents on a celestial body (and thus compute distances, or align multiple images on the same map). Since I've never done anything with GIS before, this required a deep dive into the mathematics of map projections, as well as industry-specific terminology. Turns out, the GIS space is very much focused on Earth, and it's hard to find information on encoding data about other celestial bodies (much less fictional ones).
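    The core of that georeferencing - assuming the maps come out as a single global equirectangular image, which is my assumption here rather than a confirmed SCANsat detail - boils down to GeoTIFF's tiepoint + pixel-scale model: anchor one raster point to a model-space coordinate, then give the degrees-per-pixel scale. A minimal Python sketch (function names are mine, not from the mod):

```python
def equirect_georef(width, height):
    """Georeferencing for a global equirectangular map: a tiepoint
    anchoring pixel (0, 0) to (lon=-180, lat=+90), plus the
    degrees-per-pixel scale. Mirrors GeoTIFF's ModelTiepointTag
    and ModelPixelScaleTag."""
    scale_x = 360.0 / width   # degrees of longitude per pixel
    scale_y = 180.0 / height  # degrees of latitude per pixel
    # ModelTiepointTag: raster point (i, j, k) -> model point (x, y, z)
    tiepoint = (0.0, 0.0, 0.0, -180.0, 90.0, 0.0)
    # ModelPixelScaleTag: scale_y is stored positive even though
    # latitude decreases as the row index grows
    pixel_scale = (scale_x, scale_y, 0.0)
    return tiepoint, pixel_scale

def pixel_to_lonlat(col, row, tiepoint, pixel_scale):
    """Recover the (lon, lat) a pixel represents from the two tags."""
    _i, _j, _k, x0, y0, _z = tiepoint
    sx, sy, _sz = pixel_scale
    return (x0 + col * sx, y0 - row * sy)
```

    This is just the math; actually writing these tags into the TIFF (and declaring a non-Earth body via the geokeys) is where the Earth-centric tooling gets in the way.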

    I was hoping to get this done in a week. Right now, I'm two weeks in, and I expect it to take at least one more. But at this point I think I have most of the math and format-specific problems sorted out.

    If anyone here is interested in Kerbal Space Program or satellite imagery and GIS stuff, let me know, and I'll post something about it in the future, when I'm done with my mod to a mod :).

    12 votes
    1. Omnicrola
      Link Parent

      I haven't played KSP in awhile, but I greatly enjoyed it. I look forward to seeing your progress!

      6 votes
  2. [8]
    admicos
    Link

    I'm working on re-re-making my server (after ~2 months), mostly because I got bored and wanted something to do.

    I've ditched my "Ansible+bare metal" setup and instead gone (back) to Docker. I'm building all my images from scratch in hopes of keeping everything light and reducing the amount of duplicate files by building on top of a shared base. Docker still isn't great, and I do have a few issues with it¹.

    Instead of Gitea, I've switched to an ungodly configuration of Gitolite+cgit+git-daemon, as I don't need most of the "fluff" Gitea has, and also to experiment with new things.

    For static pages, I'm not entirely sure how to proceed. On my current setup, I use regular (non-bare) Git repositories and a bunch of hacky configuration² to deal with them. As Gitolite only supports bare repositories, I cannot use this setup. I have two choices:

    • Switch to regular folders + rsync, eliminating Git completely (no file history kept, which might help keep stuff light)
    • Keep using Git, but write something to read files from bare repositories (I already have a prototype of this on hand)

    I still have some more stuff to configure before working on trying to mount this pile of Docker images into a proper server setup. I've already done most of the plumbing of this part, and let me tell you it's probably the hackiest part of it all³.

    I'll most likely be ditching my self-hosted Mastodon instance when I put this into production. I wrote a bit about why on Mastodon, but because it's at risk of deletion I'll copy-paste it here instead of linking to the original:

    Lately (read, the last few hours) I've been thinking of just shutting down this Mastodon instance and moving back to someone else's instance.

    Mastodon is essentially holding my server hostage, both with its high resource usage (~460MB of my server's ~980MB RAM is eaten by Mastodon, and this is a single-user instance!), and with its complex setup (which is even more complex when you don't want to build the sources on the server, because you run out of ram otherwise).

    Because of this, I cannot experiment with new stuff on my server, as I always have to keep either "not breaking Mastodon", or "not running out of RAM" in mind.

    And the main reason I have the server in the first place is to experiment. I could just as easily switch to a static site host like Netlify for my homepage, and throw all my code on sourcehut/codeberg/wherever and keep everything going pretty damn easily.

    I've had a blog post in mind about stuff like Mastodon (or GitLab, or Synapse/Matrix), where they're all resource-hungry and complex monsters that expect an entire server and tech staff dedicated to their own existence, but I know my arguments aren't exactly the best, so I've been holding off on that (look at me with my "standards!" It's not like I've ever published stuff at 3AM high from sleep deprivation at all! hah!)

    I won't leave the fediverse completely; instead I'll probably go onto an instance hosted by someone else. I have an account on mastodon.social I can re-use, but .social already has too many people on it, so I might look around for somewhere new.


    ¹: As an example, it's a PITA to deal with "static files" on web-based things, which I built a hacky solution for, but which isn't used for now as I've ditched both services that would require it.

    ²: https://git.sr.ht/~admicos/nginxpages exists as an older version of this setup, that uses Docker.

    ³: Pulling Docker images through a temporary Docker registry hosted on my workstation through a reverse SSH tunnel?! What?

    9 votes
    1. [4]
      admicos
      Link Parent

      Update (I guess I'll post updates now, why not?)

      I've been working on a script to automatically set everything up, and ~3 hours ago I've finished™ implementing the hardest parts, and everything mostly works™ (there are some likely permission issues to go through still; should be quick, I hope).

      After all that, I'll still need an acme/letsencrypt client. I was looking at dehydrated and getssl, but they seem to require bash, which my host distro does not have by default, and I'm not planning to install it just for a single script.

      Suggestions on acme clients that are ...

      • reasonably lightweight
      • support dns verification with wildcard domains
      • support automating said dns verification through dns server APIs

      ... are totally welcome and appreciated

      Aside from SSL, I'll need to configure a firewall. I've configured ufw to a "good enough" state already, so all that's left is fail2ban and figuring out how to integrate it into the Docker setup I have. (mounting some volumes will most likely Be Enough™)


      So far, all this is either happening on my workstation, or on a virtual machine I can reload back into a "fresh" state to test the scripts again.

      If nothing goes wrong, I'm planning to switch over to this setup in 2-3 days. I might keep the old server running for a day or more while figuring out how to deal with Mastodon (might just move back to mastodon.social, it's the easiest option I have). Finally, I'll run tootctl self-destruct, and tear down the old server for good, backing up what isn't backed up already.

      7 votes
      1. [2]
        admicos
        Link Parent

        Another Update!

        As far as I can tell right now, everything is done! (Locally, anyway)

        For acme, I went with lego, mostly because it was already packaged in my host distro. I intentionally didn't automate the SSL parts, as I don't want to deal with the possibility of it going wrong. The repetitive parts are scripted, though, so all I have to do when deploying is to copy a crontab line, edit a script to point it to the non-staging Let's Encrypt endpoint, and run it.

        I can't really recall anything else I should've done, so it seems like it's all ready to be deployed. I might just go ahead and do it tomorrow.

        For Mastodon, I'll just move back to mastodon.social, as it's the easiest option I have, and doesn't really matter that much.

        4 votes
        1. admicos
          (edited )
          Link Parent

          Final Update!

          Re-formatted the server, ran the deployment script, and it seems to have worked mostly well (~3 lines needed changing near the end due to some permission things I should've expected, though these were trivial parts I could've easily done by hand if I had to).

          All I need to do now is to re-generate my static sites and upload them by hand, while looking around for stuff to automate that later on!

          Tell me how many passwords I've accidentally committed

          3 votes
      2. dedime
        Link Parent

        RE acme clients, not sure if it'll work for your use case. But check out Caddy (v2)

        1 vote
    2. [3]
      simao
      Link Parent

      How do you actually deploy the containers? I am using docker a lot to self-host, but I still use ansible to install the docker daemon, build the containers, and deploy them. Curious if anybody has a better alternative.

      1 vote
      1. [2]
        admicos
        Link Parent

        I build the containers on my local machine (as it's way more powerful than my server), and transfer them through a temporary local Docker registry over an SSH connection. [1, 2, 3].

        This isn't as automated as I'd like yet, but works pretty well if you have software (like Mastodon) that needs way more resources to build than to run. (Which I currently don't, so this is unnecessary for my needs.)

        For deployment, my scripts install Docker daemon automatically, but I start the containers manually (via docker-compose up) after I set up everything. This gives me a time window where I can set up SSL, restore backups, or whatever else I might need to do before starting everything up and leaving them be.

        I don't expect any of this to be better than whatever you're doing, though.

        1 vote
        1. simao
          Link Parent

          ah ok. yes, I suppose I could write a script like that. Was just wondering what other people were using. Thanks for sharing.

          1 vote
  3. [4]
    zlsa
    Link

    I made a static site generator to learn more about Deno! The biggest thing I wanted that I haven't found in existing SSGs is a fully customizable and programmable template pipeline, so I can create pages for any type of input file. So I actually import JS/TS modules at runtime and execute them for each content file, and they can choose to do whatever they want.

    The biggest annoyance I'm having is with paths. There are so many paths to keep track of:

    • Content root (the root filesystem directory for all content): /home/forest/projects/websites/blog/content/
    • Content filename (relative to the content root): essays/logitech-software.md
    • Output root (the root filesystem directory for generated output): /home/forest/projects/websites/blog/output/
    • Output path (this is the URL users will see): /essays/logitech-software
    • Output filename (this is the output path on the filesystem, relative to the output root): /essays/logitech-software/index.html

    The thing is, the output path is actually a directory! So every page needs to be aware of that, because I want to use relative paths everywhere (so the resulting site can be hosted in any subfolder.) It means a relative link from /essays/logitech-software to /essays/valve-index-review is actually ../valve-index-review.

    So I made a page.link(outputPath) method, which converts an output-root-relative filename to a relative link:

    // Gets a page by its output path.
    const logitechSoftware = site.getPage('/essays/logitech-software')
    
    logitechSoftware.link('valve-index-review') // Returns '../valve-index-review'
    logitechSoftware.link(site.getPage('/essays/valve-index-review')) // Returns '../valve-index-review'
    

    Of course, this all seems fairly obvious in hindsight, but it's a lot of work to get everything working correctly in every corner case. (Example: I treat content files named index differently: the content file /essays/valve-index-review/index.md is compiled to /essays/valve-index-review. So the content filename is actually in a different place compared to the output filename.)
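    The link computation itself is a relative-path calculation between the two output directories. A sketch of the same idea in Python (not the actual Deno code):

```python
import posixpath

def relative_link(from_page: str, to_page: str) -> str:
    """Relative link between two output paths, where each output
    path is itself a directory (it serves its own index.html),
    so a sibling page is one level up."""
    return posixpath.relpath(to_page.lstrip("/"), start=from_page.lstrip("/"))
```

    relative_link('/essays/logitech-software', '/essays/valve-index-review') gives '../valve-index-review', matching the example above.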

    7 votes
    1. Apos
      Link Parent

      I wrote code to rewrite links (includes images) like that recently: https://github.com/Apostolique/apos-docs/blob/v0.2.5/src/.eleventy.js#L56. It's surprising how many edge cases there are to cover and I'm sure I still missed some.

      5 votes
    2. [2]
      skybrian
      Link Parent

      I’m wondering if using a different type for each kind of path might help? I’m not too familiar with Typescript’s type system, but I would try that in Go if I started getting confused.
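      For illustration, here's the idea in Python (branded types in TypeScript or named string types in Go would play the same role): NewType gives each kind of path its own type that a checker will enforce, at zero runtime cost. The names here are hypothetical, modeled on the path kinds listed above:

```python
from typing import NewType
import posixpath

# One distinct type per kind of path; a type checker flags mix-ups,
# while at runtime these are still plain strings.
ContentFilename = NewType("ContentFilename", str)  # relative to content root
OutputPath = NewType("OutputPath", str)            # URL-style path users see
OutputFilename = NewType("OutputFilename", str)    # on-disk path under output root

def output_filename(path: OutputPath) -> OutputFilename:
    """Every output path is a directory serving its own index.html."""
    return OutputFilename(posixpath.join(path.lstrip("/"), "index.html"))
```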

      3 votes
      1. zlsa
        Link Parent

        This is a very good idea, thank you! I don't know why on earth I didn't think of that. Ah, the things JavaScript does to a man...

        2 votes
  4. [4]
    m15o
    Link

    I re-wrote https://midnight.pub (a virtual pub - internet messageboard) from react + serverless storage to golang and postgres. I completely removed all javascript and tried to make it as simple as possible. The executable is tiny and runs on a linode machine with enough space to last for a really long time. The whole app is served over gemini and translated to HTML at request time. I found it to be a good compromise since gemtext is straightforward to parse. Would love to connect to any of you who's got a capsule on gemini. :)
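    To give a flavor of how straightforward gemtext is to parse - it's strictly line-oriented, so a minimal gemtext-to-HTML translator (a sketch, not the actual midnight.pub code) fits in one function:

```python
import html

def gemtext_to_html(gemtext: str) -> str:
    """Convert a gemtext document to an HTML fragment, handling the
    core line types: headings, links, list items, quotes,
    preformatted blocks, and plain paragraphs."""
    out, pre = [], False
    for line in gemtext.splitlines():
        if line.startswith("```"):
            pre = not pre                      # toggle preformatted mode
            out.append("<pre>" if pre else "</pre>")
        elif pre:
            out.append(html.escape(line))
        elif line.startswith("=>"):            # => URL [label]
            parts = line[2:].strip().split(maxsplit=1)
            if parts:
                url = html.escape(parts[0], quote=True)
                label = html.escape(parts[1] if len(parts) > 1 else parts[0])
                out.append(f'<p><a href="{url}">{label}</a></p>')
        elif line.startswith("###"):
            out.append(f"<h3>{html.escape(line[3:].strip())}</h3>")
        elif line.startswith("##"):
            out.append(f"<h2>{html.escape(line[2:].strip())}</h2>")
        elif line.startswith("#"):
            out.append(f"<h1>{html.escape(line[1:].strip())}</h1>")
        elif line.startswith("* "):
            out.append(f"<li>{html.escape(line[2:])}</li>")
        elif line.startswith(">"):
            out.append(f"<blockquote>{html.escape(line[1:].strip())}</blockquote>")
        elif line:
            out.append(f"<p>{html.escape(line)}</p>")
    return "\n".join(out)
```

    (A real translator would also want to group consecutive list items into a single ul, but the line-at-a-time structure is the whole point.)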

    5 votes
    1. [3]
      ahq
      Link Parent

      I really want to figure out what Gemini is and how I can tweak gurlic to serve it. I did check out midnight.pub a while back and found it inspiring. Gurlic uses Go + Postgres too, but there's a fair bit of JS for the frontend.

      Do you have any stats for midnight.pub? As in, daily users and activity? And what kind of linode instance are you running, specs-wise?

      Anyway, very cool!

      1 vote
      1. [2]
        m15o
        Link Parent

        Thanks ahq! I'm happy to hear you found it inspiring. :) I didn't know about gurlic, I will definitely check it out as well.

        Do you have any stats for midnight.pub? As in, daily users and activity? And what kind of linode instance are you running, specs-wise?

        I used to have analytics a while back, but decided to remove them. Usually, there are one or two posts a day. For linode, I take the Nanode 1GB one at $5/month. I also use tarsnap to take daily snapshots of the go executable / database.

        I really want to figure out what Gemini is and how I can tweak gurlic to serve it

        The best way to see what Gemini is would be to start by getting a Gemini browser. My favorite one is Amfora, which is actually written in go: https://github.com/makeworld-the-better-one/amfora

        Once you have it, you can navigate to CAPCOM, a link aggregator with activity from the whole geminispace: gemini://gemini.circumlunar.space/capcom

        One thing you will notice is that gemini is mostly made to read content. There's no real equivalent to a POST request like there is in HTTP. That's because most users self-publish their content as static files. I found having Gemini + a web frontend for administration (writing/publishing) is a good compromise.

        By the way, here is the source code of midnight, if you are interested! It's pretty messy since I'm going through a lot of experimenting these days, but maybe it can help you if you want to explore how to code for gemini a bit more: https://sr.ht/~m15o/midnight-pub/

        Cheers!

        2 votes
        1. ahq
          Link Parent

          I like that midnight.pub is super lean.

          Thanks for the link to the source code, I'll definitely have a look in a bit. Took a glance and it didn't look very messy at all. :)

          2 votes
  5. [2]
    joplin
    Link

    I don't know what made me think of it, but I ended up looking up Craig Reynolds' original paper on Flocking "Boids" and implementing it. It's pretty cool to tweak the various parameters and see how that affects the flocking of the animals. I just did a simple 2D version. I might expand it to 3D in the future. It wouldn't be hard, but I'm lazy and on vacation.

    5 votes
    1. joplin
      Link Parent

      And by an odd coincidence, Ars Technica released this article today, which discusses research that updates some of our knowledge on flocking and swarming behaviors.

      2 votes
  6. kayelcio
    Link

    This probably seems rudimentary to most, but it's the most involved programming project I've ever completed.

    I finally worked up the courage/motivation to try to tackle the arduino coding side of my led cloud project, after hemming and hawing for probably a year. Five days of complete trial and error (mostly error), and a few questions to forums and such that were only tangentially helpful (perhaps it's me being a layman, but programmers don't seem particularly interested in giving direct answers to noobs like me), but I finally got some code that mostly does what I'd like it to.

    Anyway, I'm glad to have this one under my belt. I really do have a ton of respect for people that do this kind of thing. It's still VERY foreign to me, but I'm a little further on my journey to it all not seeming completely alien, so yay.

    4 votes
  7. soks_n_sandals
    Link

    I started at my job a year ago with zero knowledge of scripting languages or the command line. I've come a long way since then, and in the last week, I worked on a project to do automated GUI testing.

    To start, I wrote some input commands in a file for the software I test. Those call some Perl scripts that do a given operation and then write instructions for the main software, either to continue or to abort. If the preliminary Perl scripts pass, I then use an automated GUI tester to run through a sequence of commands and print out whether they passed or failed, and then I bring those results back into the main software. I had to think a lot about this project, since I wanted it to do/have the following features:

    • Run in a completely automated fashion with 1 command line entry
    • Determine the platform (Windows/Linux), then execute commands based on that
    • Be agnostic to the directory/location path
    • Be agnostic to the GUI test name
    • Handle errors and provide clear feedback about what broke
    • Take arguments that alter the behavior of the script and require minimal input for the user
    • Be self-documenting and provide clear test results

    Basically, I needed these scripts and testing framework to be accessible by anyone who looks at them after me. All they need to do now is record the GUI commands, give their GUI test a name and specify 2 command line args for how the script should run (or don't, and the script will use some reasonable defaults).
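    As a sketch of what the entry point for such a driver could look like (illustrative Python only - these aren't the actual scripts from this setup), covering the single-command entry, platform detection, and reasonable defaults:

```python
import argparse
import platform

def build_parser():
    """CLI for a hypothetical GUI-test driver; argument names are
    illustrative. Defaults mean a run can start from one command."""
    parser = argparse.ArgumentParser(
        description="Run an automated GUI test and report pass/fail.")
    parser.add_argument("test_name", nargs="?", default="smoke",
                        help="name of the recorded GUI test (default: smoke)")
    parser.add_argument("--mode", choices=["full", "quick"], default="quick",
                        help="how thoroughly to run the command sequence")
    parser.add_argument("--workdir", default=".",
                        help="run relative to any directory (path-agnostic)")
    return parser

def detect_platform():
    """Pick the command flavor based on the host OS."""
    return "windows" if platform.system() == "Windows" else "linux"
```

    Self-documentation then comes for free: argparse generates the --help text from the same declarations.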

    4 votes
  8. psi
    Link

    I installed neuron on my unraid tower to better organize my notes (Zettels). Currently I'm using the issues page of a gitlab repo for managing my Zettels. While this does work fairly well, it's not exactly the intended usage. Compared to neuron, using gitlab issues has several drawbacks.

    1. Ironically, there's no simple/obvious way to backup gitlab issues.
    2. Similarly, there's no simple/obvious way to version-control gitlab issues.
    3. If I lose my gitlab account, I lose my Zettels.
    4. Math mode is clunkier in gitlab-flavored markdown than neuron-flavored markdown.
    5. Linking Zettels is easier with neuron.
    6. I'm at the mercy of the gitlab issues api. As I wrote before, my usage is not really intended, so api changes in the future could break my entire workflow.

    But neuron still has its drawbacks:

    1. Unlike gitlab, there's no built-in editor. Instead I have to install a dedicated editor on my tower.
    2. Currently that dedicated editor is code-server. Unfortunately, since code-server is based on monaco, and since monaco doesn't support mobile browsers, I can't access code-server on my iOS devices (and therefore I'm unable to edit my Zettels when I'm away from my computer).
    3. Neuron is a fairly young project. Maybe it'll be abandoned, maybe api changes will break my workflow. Fortunately, each Zettel is stored in a standard-ish markdown file, so it wouldn't be too difficult to transfer my Zettels to some other project (eg, an SSG) if I had to.
    4 votes
  9. 3d12
    Link

    I'm learning React, to attempt to learn more about what "modern web development" is all about. The last time I seriously (not professionally) built websites was before HTML5, so I have a lot to catch up on.

    This comes with the purpose of setting up a website for my band. I don't even know what I want to put on it yet, but one of the websites we host our music on has a React library for their streaming API, so I guess I'm going to start by embedding their player into a blank page and build around that? Not sure yet really, I guess I'd hoped that the more I learn about React, the more I'll think of cool stuff to do with it along the way.

    3 votes
  10. p2004a
    Link

    I've finally finished automatic disk decryption in my home Debian Sid system using tpm2. It's all automatic, without a need to run any commands manually during kernel updates etc: just like bitlocker on Windows. And dual boot works just fine too.

    I've been sitting down to this project every few months and moving it forward slightly every time. One of my most important goals was simplicity: both to set up and to understand.

    I've ended up with a setup that consists of:

    • secure boot enabled with my own keys for signing bootloaders and dkms modules
    • ditching grub and using systemd-boot instead (I've tried to keep grub for a long time, and I believe it's possible, but IMHO too cumbersome)
    • using EFI unified kernel images that contain the kernel, command line, and initrd, all packed and signed together
    • clevis for the actual tpm2 encryption-key management and initramfs integration
    • a small post-initramfs-build script that builds and copies the EFI unified kernel image to the ESP partition

    Now I only want to write a howto somewhere and share it with more folks who might find it useful: which is a challenge in itself because I don't have a place like this yet. For now I'm mostly leaning toward setting up a static blog using Hugo, hosted on GCS+Cloudflare... Just another small project...

    3 votes
  11. nostradamnit
    Link

    I'm working through Land of Lisp :D

    3 votes
  12. Pistos
    Link

    I've published my Roku virtual remote web app: https://sr.ht/~pistos/remoteku/ I think it's ready for others to try to use. Constructive feedback welcome.

    I use this regularly, especially to type text quickly into text fields on the Roku (like login credentials, and search terms).

    3 votes
  13. [2]
    joplin
    Link

    I mentioned previously that I was teaching myself how Perlin noise worked. I ended up figuring out how to make it repeat at the edges, so I made a simple terrain generator that tiles infinitely. It looks like a late 90s demo because it's got terrible lighting and I'm coloring the polygons based on their height (with a little bit of probability mixed in), but it was a fun exercise! Very little of my actual work involves generating terrain, but for some reason, I find exploring these algorithms to be a lot of fun.
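    For anyone curious, the usual trick for making Perlin noise repeat is to wrap the gradient-lattice indices modulo the repeat period, so the noise at x and at x + period sees the same gradients. A compact sketch:

```python
import math
import random

class TileablePerlin:
    """2D Perlin noise that repeats every `period` lattice cells in
    both axes, achieved by wrapping gradient indices modulo the period."""

    def __init__(self, period, seed=0):
        self.period = period
        rng = random.Random(seed)
        # One random unit gradient per lattice point inside the tile.
        self.grads = [[self._unit(rng) for _ in range(period)]
                      for _ in range(period)]

    @staticmethod
    def _unit(rng):
        a = rng.uniform(0.0, 2.0 * math.pi)
        return (math.cos(a), math.sin(a))

    @staticmethod
    def _fade(t):
        # Perlin's quintic smoothstep: 6t^5 - 15t^4 + 10t^3
        return t * t * t * (t * (t * 6 - 15) + 10)

    def noise(self, x, y):
        x0, y0 = math.floor(x), math.floor(y)
        fx, fy = x - x0, y - y0

        def dot(ix, iy, dx, dy):
            # Wrapping the lattice index here is the whole tiling trick.
            gx, gy = self.grads[ix % self.period][iy % self.period]
            return gx * dx + gy * dy

        u, v = self._fade(fx), self._fade(fy)
        n00 = dot(x0,     y0,     fx,     fy)
        n10 = dot(x0 + 1, y0,     fx - 1, fy)
        n01 = dot(x0,     y0 + 1, fx,     fy - 1)
        n11 = dot(x0 + 1, y0 + 1, fx - 1, fy - 1)
        nx0 = n00 + u * (n10 - n00)
        nx1 = n01 + u * (n11 - n01)
        return nx0 + v * (nx1 - nx0)
```

    Because only the gradient lookup changes, the rest of the algorithm (fade curve, dot products, bilinear blend) stays textbook Perlin.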

    2 votes
    1. Apos
      Link Parent

      I also studied that 2 weeks ago while going through The Book of Shaders, specifically this chapter: https://thebookofshaders.com/11/. It's so fun to play with.

      3 votes