wirelyre's recent activity

  1. Comment on Why OpenBSD Rocks in ~comp

    wirelyre Link Parent

    Within the /lib folder, 9front contains plain text copies of The Manifesto of the Communist Party[1] […]

    Somehow it has never occurred to me that /lib could be for plain text documents. You know, like a library.

    3 votes
  2. Comment on Can You Trust Kurzgesagt Videos? in ~misc

    wirelyre Link Parent

    This one? Seems pretty good to me. In fact, for a 7-minute overview of quantum computing, it's remarkably accurate and not misleading.

    I guess I would criticise their discussion of database searching (alluding to Grover's algorithm), because in order to search an arbitrary database, you'd have to construct the whole database in your quantum computer. Grover's algorithm is not really a database search algorithm in the usual sense of "database" and "search".
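To put rough numbers on it (these are the standard query-complexity figures, not claims from the video):

```latex
% Query counts for unstructured search over N items:
\text{classical: } O(N) \text{ queries}
\qquad \text{vs.} \qquad
\text{Grover: } O(\sqrt{N}) \text{ queries,}
% where each query is one application of the oracle
\qquad U_f \,\lvert x\rangle = (-1)^{f(x)}\,\lvert x\rangle .
```

The speedup is real, but every query means running the oracle U_f (the "database") as a quantum circuit, so the database has to be built inside the quantum computer first.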

    8 votes
  3. Comment on Comparing Textile vs. Markdown for mobile use in ~comp

    wirelyre Link

    In Markdown alt text [for images] seems to be obligatory

    I think that's right; it's parallel to link syntax (link text ↦ alt text, link target ↦ image source).
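Side by side (the URL and filename are placeholders), the image syntax is the link syntax with a `!` in front, and the slots map exactly as above:

```markdown
[link text](https://example.com/page)
![alt text](https://example.com/image.png)
```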

    This also reveals the (main?) difference between Markdown and Textile. Textile is "a shorthand syntax used to generate valid HTML". A Textile document is not the final form. It's just for writing. But Markdown is "a plain text format for writing structured documents". It is for reading and writing.

    If you're reading a structured plain text document, image descriptions are crucial. The embedded image is completely useless, since you can't see it. But if the document is guaranteed to turn into HTML, then alt text is not necessary — for sighted users anyway.

    4 votes
  4. Comment on What programming language do you think deserves more credit? in ~comp

    wirelyre Link Parent

    Erlang takes a lot of inspiration from Prolog too. The syntax in particular resembles Prolog quite strongly, with English punctuation (fac(0) -> 1; fac(N) -> N * fac(N-1).), a strong emphasis on tail recursion (cf. cuts), capitalization distinguishing variables and atoms, and probably more.

    When I walked through the manual a few years ago, I remember being delighted with bitstrings — it's possible to pattern match directly on bytes and bits.
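A small sketch of what that looks like (module and field names are mine, not from the manual): the pattern pulls a 4-bit version and a 4-bit header length straight out of a binary, as an IPv4 parser would.

```erlang
-module(bits).
-export([parse/1]).

%% Pattern match directly on bits: the first byte splits into a
%% 4-bit Version and a 4-bit IHL; the rest stays a binary.
parse(<<Version:4, IHL:4, Rest/binary>>) ->
    {Version, IHL, Rest}.
```

So `bits:parse(<<16#45, 0, 0>>)` gives `{4, 5, <<0, 0>>}`.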

    There are a few other surprises, like hot-swappable code. Essentially all of the unusual features are stuff you'd typically find in libraries (threading, actors, byte parsing), lifted into the core language. It's a very well-designed system.

    2 votes
  5. Comment on Nintendo Makes It Clear that Piracy Is the Only Way to Preserve Video Game History in ~games

    wirelyre Link

    In fairness, I think it should be pointed out that from Nintendo's perspective, this isn't much different from retiring old consoles and halting production of cartridges. They're just not selling the old games anymore.

    The effect is quite different, of course. Download-only titles can't be resold the way physical cartridges or games can. But it's better to think of this as Nintendo's inaction and refusal to change its processes, rather than a specific anti-consumer move.

    3 votes
  6. Comment on Why I use old hardware in ~comp

    wirelyre (edited ) Link

    There could be other benefits to programming under extra constraints.

    One night, in a state of hazy insomnia, I used my smartphone (and a touch keyboard) to write a small language interpreter. Reading through it the next morning, I found that I had naturally kept to ~30 columns, uncramped spacing, and crystal clear data structures. I'm sure I have never written better C.

    Iterating on a problem gives you new views of the problem that are increasingly well suited to your tools. If there are two equally clear ways to solve a problem, then the solution using more primitive tools is probably better.

    Edit. I should complete this argument.

    If you use older hardware or tools, besides creating more widely usable software, you might very well create better software. The development process will refine your conception of the problem past what is natural for more powerful tools. It will also make scope creep less sustainable. Constraints are good.

    10 votes
  7. Comment on How do I hack makefiles? in ~comp

    wirelyre Link Parent

    Object files separate the concern of compilation (e.g. language parsing, optimization, instruction selection, register allocation) from that of linking (i.e. executable layout).

    Imagine you're a compiler. You're reading plain text and producing an executable file.

    When you compile a function, you emit machine code (add this to that, check if it's zero, otherwise jump backwards an instruction). Sometimes, you need to reference code or a memory location that hasn't been defined. This is done by allocating space for that instruction, but leaving the operand out (jump to ____; multiply ____ by two).

    Why do you need to reference undefined stuff? Part of your job is determining the memory layout that the machine code will have when the program is running. But if you haven't compiled that other function yet, you don't know where it will live in memory. And you can't really guess, because you don't know how many bytes each compiled function uses until after it's compiled.

    You might also reference code from a dynamic library, which is code that, by definition, has no assigned memory location until the program starts. The location is unknowable at compile time.

    There is a logical point like this for many compiled languages. The strategy is to leave blanks in the machine code, then keep track of those blanks. Blank spots have names called symbols. A file containing blanks, symbols, and machine code is an object file.

    If the job of a compiler is to produce machine code, then the job is finished once all internal references are resolved and the blanks are filled (mostly). Now you have a blob of binary that is independent of language-specific details (mostly). When you want to finish up by merging object files, possibly produced in wildly different ways (with different languages or compilers), you can use a separate program called a linker that doesn't need to know anything about programming languages.
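A minimal sketch of the blank-and-symbol idea (file and function names are mine): the two halves below would normally live in separate files, each compiled to its own object file.

```c
/* main.c -- compiled alone with `cc -c main.c`, the call below is
 * emitted with its operand left blank, and main.o records an
 * undefined symbol named "square" pointing at the blank. */
int square(int n);          /* declaration only: a blank to fill */

int call_square(int n) {
    return square(n) + 1;   /* target address unknown until link time */
}

/* square.c -- compiled separately with `cc -c square.c`; square.o
 * defines the symbol "square", so the linker (`cc main.o square.o`)
 * can fill the blank with its final address. */
int square(int n) {
    return n * n;
}
```

Running `nm main.o` on the first object file would show `square` flagged `U` (undefined) — exactly the "blank with a name" described above.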

    2 votes
  8. Comment on How do I hack makefiles? in ~comp

    wirelyre Link Parent

    You can also mix object files produced in different ways. For instance, a C compiler, a C++ compiler, and an assembler can all make object files that are usable together.

    1 vote
  9. Comment on How do I hack makefiles? in ~comp

    wirelyre Link Parent

    Typical kids, criticising things That Work and are historically essential, right? :P

    I'll second the Linux Makefile. It's beautiful and well-documented.

    2 votes
  10. Comment on How do I hack makefiles? in ~comp

    wirelyre (edited ) Link
    • Exemplary

    The GNU build system is a huge mess. This comment is an introduction to that mess. It might help you. It's mostly a rant.

    Most ./configure scripts aren't written by hand. That's why they look like casseroles. In fact, most Makefiles also aren't written by hand. We'll get to that.

    Make is nothing more or less than a language for describing dependencies between files: rules for building some files out of others. This is all very good. If you want to use Make in a sensible way, by expressing e.g. "build object files from these .c files, then link them into this executable", you can read the GNU make manual. Read sections 2 and 4, skim section 6, and keep the rest as reference. Suckless sbase, musl, and the Tiny C Compiler all have excellent Makefiles.
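In that sensible style, a whole project's Makefile can be this small (file and program names hypothetical; recipe lines must begin with a tab):

```make
CC     = cc
CFLAGS = -O2 -Wall
OBJS   = main.o parse.o eval.o

prog: $(OBJS)
	$(CC) -o $@ $(OBJS)

%.o: %.c prog.h
	$(CC) $(CFLAGS) -c -o $@ $<

clean:
	rm -f prog $(OBJS)

.PHONY: clean
```

`$@` is the target and `$<` the first prerequisite; the pattern rule says "any .o comes from the matching .c (plus the shared header)".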

    Here comes the mess.

    Autotools

    Back in the day, software had to deal with a bunch of very incompatible compilation environments. Sometimes functions had different signatures, sometimes they weren't defined. So people started writing shell scripts to explore the computer (libraries, headers, etc.), and generate the Makefile automatically from a template (Makefile.in).

    But configure scripts are tedious to write by hand. So GNU Autoconf was born. Autoconf takes configure.ac scripts and makes configure scripts… which make Makefiles… which make programs. And as a bonus, Autoconf scripts are written in an ancient language, M4, which you will never see anywhere else.

    Unfortunately, Make isn't smart enough to keep track of which files depend on each other. If you do it manually but incorrectly, the program might not build, or worse, might build with outdated parts. Fear not! There is a tool called GNU Automake, which explores your source tree and generates Makefile.in (which configure uses to make the real Makefile) automatically! Except Automake can't determine everything about your project, so you need to give it a file Makefile.am as input.

    Wouldn't it be great if you didn't have to write Autoconf scripts? Surely someone has a list of functions that don't exist on some computers or whatever. Good news! Autoconf includes autoscan, which does that for you!

    To recap:

    • You don't want to type compilation commands by hand, so you need a Makefile.
    • Compilers behave differently on different systems, so you need a configure script and a template Makefile.in to generate the Makefile.
    • Configure scripts are similar to each other, so you need configure.ac to generate them.
      • Actually, you don't; they can be deduced automatically by autoscan.
    • Template Makefiles are tedious to write, so they can be deduced automatically. Mostly. They still need Makefile.am.

    Complexity

    Obviously this is all very complex. It's ironic that, towards the goal of a free Unix, GNU sacrificed the Unix philosophy to portability. These programs are all subtly interconnected and difficult to reason about. The portability concerns are outdated too! There's no reason you couldn't write these tools as, say, libraries in shell script; or as programs in GNU Make, which, by the way, is incompatible with other Makes.

    The upshot is that, incredibly, knowing Make will often not help very much when you need to fix builds that use Make. But do write Makefiles for your own projects where possible (i.e. when simple enough). Make is a great tool.

    Other projects saw this problem and fixed it in different ways. Let's explore.

    Alternatives

    Ninja is a replacement for what Make became: a description of file dependencies and build commands, meant to be generated by other tools. Ninja is very fast and simpler than Make. Ninja is in charge of executing builds.
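A hand-written build.ninja for the same kind of project (paths hypothetical) shows how little the format contains: rules, commands, and dependency edges, with no conditionals or functions.

```ninja
rule cc
  command = cc -O2 -Wall -c $in -o $out

rule link
  command = cc $in -o $out

build main.o: cc main.c
build parse.o: cc parse.c
build prog: link main.o parse.o
```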

    CMake is a tool for describing software requirements and finding them on the computer; and for determining which source files depend on which others. It is basically a replacement for configure scripts and what's involved in generating them. It's clunky, but better than casserole. CMake can generate Makefiles, Ninja files, and files for other build systems. CMake is in charge of planning builds.

    Meson is like CMake, except the configuration language is quite different. As far as I can tell, the project emphasizes automatic configuration (edit the config specification as necessary) over CMake's occasional manual tweaks (edit the generated variables as necessary). It's hard to recommend one of these over the other, although Meson can use CMake dependencies and its syntax is more C-like. Meson also generates Ninja files.

    SCons and waf replace the entire build system, from configuration to build execution. I can't talk much about these since I don't use them.

    Hopefully this was useful for someone. My relationship with GNU builds became very strained recently. May yours remain cordial.

    Edit. s/a load of garbage/very complex/, don't be rude.

    19 votes
  11. Comment on The International Gymnastics Federation wants to recognise parkour as a new discipline, with a view to Olympic inclusion in 2024. But the parkour community is opposing the FIG’s efforts in ~sports

    wirelyre Link

    We saw similar criticism with the introduction of skateboarding and surfing at the 2020 Olympics. Both of these sports have grown communities that embrace personal expression and counterculture over pure technical skills.

    Plenty of athletes find the inclusion of these sports in the Olympics to be completely contrary to the spirit of the sports. An LA Times article summarizes:

    More than 5,500 people identifying themselves as skateboarders from around the world have signed an online petition asking the International Olympic Committee not to add their sport to the Games.

    […]

    "Skateboarding is not a 'sport' and we do not want skateboarding exploited and transformed to fit into the Olympic program," the online petition states. "We feel that Olympic involvement will change the face of skateboarding and its individuality and freedoms forever."

    And from an Outside article:

    And young surfers don’t want to listen to Bob Costas narrate John John Florence’s top turns. They want to watch a webisode, scroll through heats on demand (if the waves are firing), and then go surfing.

    There are even some parallels with the difficult involvement of community leaders in skateboarding and parkour. The Guardian article reports that members of the parkour commission left in part due to "no involvement of the international parkour community" in the commission. From a Vice article:

    Getting skating into the Olympics, however, was never Ream's mission. He felt that would happen whether he cooperated or not. But the IOC requires every Olympic sport to have an international federation, and the fear was that an organization with little to no experience with skateboarding or its culture, rules, or people would wind up in charge. So Ream and other icons of skate formed the ISF in 2004. In describing his overall role regarding the Games, Ream said he was "very active in protecting skateboarding in its relationship with the Olympics."

    Having no personal involvement in these sports or communities, I think parkour is obviously a great fit as a sub-discipline of gymnastics. But I worry about the effect of the FIG on future competitive parkour. The 2006 scoring changes in artistic gymnastics represented a huge turning point in the sport. They encouraged significantly trickier skills at the cost of artistic integrity. I hope that parkour can stay close enough to its roots to avoid similar changes.

    4 votes
  12. Comment on Where would a beginner start with data compression? What are some good books for it? in ~comp

    wirelyre Link

    My local library has Sayood's Introduction to Data Compression and Salomon/Motta's Handbook of Data Compression, both of which I would recommend. Sayood is a great medium-paced read that's usable as a self-taught course, although it's a little heavy on the mathematics. Salomon/Motta is an awfully dry reference that is unusable for learning the basics, but it's an incredible overview of compression strategies and has a section for basically every algorithm in common use.

    6 votes
  13. Comment on Programming Challenge - It's raining! in ~comp

    wirelyre (edited ) Link
    • Exemplary

    This is very cool.

    Here is a Haskell solution. Try it online!

    import Data.Ratio
    
    main = print . fill $ map lake [1, 3, 5]
    
    data Lake = Lake { unfilled :: Rational -- litres
                     , rate :: Rational -- litres per hour
                     }
    
    lake :: Rational -> Lake
    lake depth = Lake {unfilled=depth, rate=1}
    
    fill :: [Lake] -> Rational
    fill [] = 0
    fill ls = time + fill (spill 0 (map fill' ls))
      where
        time = minimum . map (\lake -> unfilled lake / rate lake) $ ls
        fill' Lake {unfilled=u, rate=r} = Lake {unfilled=u-r*time, rate=r}
    
    spill :: Rational -> [Lake] -> [Lake]
    spill _ [] = []
    spill incoming (Lake {unfilled=0, rate=r} : ls) = spill (incoming+r) ls
    spill incoming (Lake {unfilled=u, rate=r} : ls) = Lake {unfilled=u, rate=incoming+r} : spill 0 ls
    

    Since the answer can be any rational number, we'll use the built-in Ratio library. Haskell will promote literal integers like 0 and 1 into Rationals automatically.

    At any given time, for each lake, we only care about

    1. how much is unfilled; and
    2. how quickly water is flowing into it.
    import Data.Ratio
    
    data Lake = Lake { unfilled :: Rational -- litres
                     , rate :: Rational -- litres per hour
                     }
    

    At the start, each lake has a flow rate of 1 litre per hour, so we'll make a convenience function to construct a lake from its volume.

    lake :: Rational -> Lake
    lake depth = Lake {unfilled=depth, rate=1}
    

    Finally, we'll declare a function fill, which takes a list of lakes and finds out how long it takes to fill them. Now we can write a main function.

    fill :: [Lake] -> Rational
    main = print . fill $ map lake [1, 3, 5]
    

    Here's the plan: At every time step, we find out how long until some lake is filled next. Then, for each lake, we decrease unfilled by the appropriate amount. Finally, we clean up the list of lakes by removing lakes that are completely full.

    But wait! A lake that is full needs to spill incoming water to the next lake. We need one final auxiliary function, spill. spill takes a list of lakes, and removes lakes that are completely full, but adds water intake rates forward into lakes that are not yet full. That is, if a lake is full, all of the incoming water is carried forward to the next lake.

    spill :: Rational -> [Lake] -> [Lake]
    spill _ [] = []
    spill incoming (Lake {unfilled=0, rate=r} : ls) = spill (incoming+r) ls
    spill incoming (Lake {unfilled=u, rate=r} : ls) = Lake {unfilled=u, rate=incoming+r} : spill 0 ls
    

    This is a common form for recursive functions. Let's step through each case.

    1. Base case: If there are no lakes remaining ([]), no need to do anything.
    2. Otherwise, we have at least one lake. If the lake is full (unfilled=0), all of the incoming water spilled over so far, plus all of the water that used to flow into this lake, overflows forward. Discard the lake.
    3. Otherwise, we have at least one lake and the lake is not yet full. Keep the lake, and add the spilled-over rate to its intake. Zero water overflows forward (because any incoming water will fill this lake first).

    Now the rest is straightforward. For the base case, it takes no time to fill no lakes.

    fill :: [Lake] -> Rational
    fill [] = 0
    

    How long until the next lake is filled to capacity? (A little math: each lake fills in unfilled / rate hours. The minimum of those is the next to fill.)

      where
        time :: Rational
        time = minimum . map (\lake -> unfilled lake / rate lake) $ ls
    

    Once that amount of time has passed (once the next lake has filled to capacity), what are the water levels in a single lake?

      where
        fill' :: Lake -> Lake
        fill' Lake {unfilled=u, rate=r} = Lake {unfilled=u-r*time, rate=r}
    

    Finally, we recurse. After time has passed, we update each lake (map fill' ls), spill over empty lakes (spill 0 _, because 0 litres per hour overflow into the first lake), and add time to however long it takes to fill the new list of lakes.

    fill ls = time + fill (spill 0 (map fill' ls))
    
    11 votes
  14. Comment on Hey, Tildes, what's a strong opinion you hold, but which you also feel like is the minority opinion? in ~talk

    wirelyre Link Parent
    • Exemplary

    Since I'm an active classical musician, I'll engage with the music opinion. (At length, apparently. TL;DR: soft agree in spirit, hard disagree on specifics.)

    Orchestras and other classical ensembles either need to perform new music or not perform at all.

    I think you're saying that there should be no performances of old music at all, which is quite controversial. But if you instead relax this to "concerts and recitals should always have new music" or even "should focus on new music", this is a common opinion. A surprisingly large minority of programs include music written in the past ten years. Plenty of ensembles, particularly chamber ensembles, perform new music almost exclusively.

    There are a billion recordings of Bach/Beethoven/Mozart/Mahler/etc., what does your performance bring to the table that we haven't heard before?

    Most people don't go to a concert to hear the music. They come to hear a performance. They appreciate the atmosphere and the spectacle. Why see Othello onstage when you could watch a film adaptation? Why go see a famous comedian when you could stream the same jokes in a well-edited special at home? For that matter, why do pop bands do concert tours? They've already recorded the "official" versions of the songs, right?

    Concerts and recitals are simply different experiences from listening to recordings. You are surrounded by sound. You literally face the performers. It's very personal.

    And even if you suppose that the whole goal of performance is to generate recordings, each recording really is quite different. I can't tell you how many times I've been listening to multiple recordings of a particular piece and ended up with one from the past 20 years that just blows me away compared to one from the '60s. Maybe the audio quality is better; or the ensemble is tighter; or the performance makes the musical structure clearer; or the performers are better.

    [T]he old guard of composers needs to make way for living musicians writing relevant music.

    I choose to interpret this as defending orchestral arrangements of modern pop because I get to disagree more. ;-)

    Personally, I perform and listen to music because it's fun. I enjoy the actual mechanical process of live music; and the intellectual process of understanding the music, both in the large while studying a piece, and in the small during a performance.

    And pop music is just. so. boring. It's 90% eight-bar phrases in 4/4, verse–chorus form, and a bridge if you're lucky. I appreciate that songs lose a lot when you remove the lyrics. But as far as musical content goes, you're in a desert. I often (and sincerely) encourage beginners who are showing off whatever piano arrangement of the new song they learned, but, like, no, it doesn't actually sound any different from the last one I heard, because you don't have the trap set or steel pans that actually make it distinctive.

    Sorry, got a bit off track there.

    "Relevant", applied to new music, is a bit of a nasty term because of the nature of artistic canon. Music directors program pieces that people enjoy. Sort of by definition. We hear Beethoven's 9th symphony, Haydn 101 "The Clock", or Mozart 41 "Jupiter" so often because they're really good. They're epitomes of the genre, or else groundbreaking somehow. Calling new stuff "relevant" is highly misleading in two senses: it implies that singular new pieces are somehow more important to the genre than those pieces that have informed the landscape for literal centuries; and it obscures that, once, these old pieces were new and relevant and riskily premiered to skeptical crowds.

    When's the last time you saw Haydn Symphony 34 or Mozart Symphony 10 live? Probably never. They're not particularly interesting. The sieve of time lets through only the best "irrelevant" music.

    Classical music has this dead white male fetish

    To some (quite limited) extent this is a tautology. "Western classical music" refers primarily to European stuff since the 17th century, and for a long time, unless you were a white man, good luck getting published. There were a few notable exceptions (who are not objectively better or worse than their contemporaries), but the dominance of white men that we see now is due in no small part to whose music was actually performed at the time.

    And of course most are dead now, because they were born a while ago. So that's that part.

    To the far greater extent that this is a problem with modern directors not bothering to find music by non-white people or women, I agree wholeheartedly. And this is a widely held view.

    8 votes
  15. Comment on Microsoft Announces First Paid-For $20 Linux Distro for Windows 10 October 2018 Update in ~tech

    wirelyre Link Parent

    That makes sense. I bet Windows Server users are generally tied more strongly to software frameworks than to the hardware. And with .NET Core targeting Linux, it might not be a crazy move.

    1 vote
  16. Comment on Microsoft Announces First Paid-For $20 Linux Distro for Windows 10 October 2018 Update in ~tech

    wirelyre Link Parent

    I'm hilariously unqualified to talk about this, but I just can't see that happening. Or, at least, it doesn't seem like it would be a good move. There would be a lot of users stuck with legacy WinAPI applications and drivers. Granted, they could use Wine (it's LGPL) or something, but Microsoft would still be left supporting the whole API. Not much gained, and the greater part of the code is FOSS.

    On the other hand, I'm not sure there's anything inherently desirable about the NT syscall surface or ecosystem. The continuing draw for Linux, I think, has been a stable ABI and a huge set of drivers — despite the bazaar userland. I don't see WSL being attractive enough to overwhelm the GPL Linux community, especially being a second-class interface (it has to coexist with NT/WinAPI).

    4 votes
  17. Comment on A layperson's introduction to Thermodynamics, part 1: Energy, work, heat in ~science

    wirelyre Link

    What is the relationship between thermodynamic entropy and information-theoretic entropy?

    1 vote
  18. Comment on Suggestion: that there be only one all-inclusive topic type on Tildes. in ~tildes

    wirelyre Link Parent

    You could also think of a link-and-text topic as a discussion with a particular context and focus. We already get some context via the group — an article about a video game posted to ~games is different than one posted to ~comp (maybe the latter relates to technical achievements).

    I think there's merit to restricting what kind of discussion is on- and off-topic. This could clarify why the link was posted, and potentially make discussion more productive. Is it worthwhile to ask an interesting question about an otherwise uninteresting article? Is it worthwhile to provide context to a well-written article about an obscure topic?

    1 vote
  19. Comment on Calling Prophet Muhammad a pedophile does not fall within freedom of speech: European court in ~news

    wirelyre Link Parent

    They first refer to it as 'veneration', only later use 'worship'.

    The phrase in question (§188) reads "eine Person […], die den Gegenstand der Verehrung […] bildet". The noun Verehrung derives from verehren.

    I don't speak German natively, but Linguee suggests that the word strongly carries the sense of "adore" and "admire", like an alternative form of ehren, "to honour". Duden separates the religious sense from the secular sense in 1a and 1b.

    So this seems like a subtly bad translation.

    10 votes
  20. Comment on Solo - Open source FIDO2 security USB key in ~tech

    wirelyre Link Parent

    Could you expand on this? I'm having trouble sorting through all of the initialisms.

    1 vote