Jakobeha's recent activity

  1. Comment on Can noise canceling headphones be effective against non continuous noise such as music? in ~tech

    Jakobeha
    Link Parent

    I have a similar solution, except instead of a noise generator I have bone-conduction headphones (AfterShokz). You can wear both ear plugs and bone-conduction headphones, and I also bend the headphones to only cover one ear because I sleep on my side (they're very flexible, and they also make a model that goes on each ear separately).

    6 votes
  2. Comment on Does something like a charity fund for FOSS exist? If not, do you think it could be a good idea? in ~tech

    Jakobeha
    Link

    Something similar is thanks.dev: it scans your projects' dependency trees, then you "donate" a set amount of money and it distributes it among the dependencies you use.
    Although I think it's more common for people to donate to specific projects, as others have mentioned.

    You could also donate to a non-profit foundation, such as Mozilla, Wikipedia, or Creative Commons. Sometimes these groups will select and donate to open-source projects themselves; for example, the Mozilla Technology Fund "supports open source technologists whose work furthers promising approaches to solving pressing issues".

    8 votes
  3. Comment on White House urges use of type safe and memory safe programming languages and hardware in ~tech

    Jakobeha
    Link Parent

    A few points:

    • Code doesn't have to be written in C/C++ or Rust to be performant. It's true that in practice, well-optimized C++ or Rust code is faster than well-optimized Python or JavaScript; but well-optimized Python and JavaScript code is still fast. I'd argue the excessive slowness of modern software isn't because of the language, but because of other factors like performance-draining abstractions and bloat. If your basic GUI-based desktop app, 2D/low-poly 3D game, or website can't function with zero lag on a modern computer, it's not because of the language; and in fact, an inefficient C++ or Rust implementation can be even slower than an efficient Python or JavaScript one. This is reinforced by how slow software also tends to be buggy (and vice versa), but C++ certainly doesn't prevent more bugs than, say, Java or TypeScript (maybe Rust does, and maybe both beat untyped Python or JavaScript; but back to the argument for automatic memory management, the entire point is that it effectively eliminates the memory-related bugs that C and C++ keep).
    • The first source focuses on big companies rewriting their software in faster languages. But importantly, all of them are big companies with already-successful products, and all of them started with a small prototype in a dynamic language. In fact, I believe the correct way to write great software is to start with a prototype written in something like Kotlin, which prioritizes flexibility over performance, and only rewrite it in something like Rust if performance becomes an issue once the concept has solidified. Because if you write the initial app in C++ or Rust, you have to think about memory or lifetimes alongside your core app's architecture; this makes development slower, discourages important-but-big changes, and leaves you with a more convoluted product that you may end up rewriting again anyway.
    • Some languages, like JavaScript and Python, have design decisions which create much more overhead than their automatic memory management and runtime guards: both JavaScript and Python are interpreted (with optional JIT compilation) and don't have a static type system. In general, code written in Swift, Java, or OCaml performs much closer to C than it does to JavaScript or Python (here's a comparison of C/Java/Python programs with a simple benchmark, although benchmarks don't really reflect real-world code).

    The bottom line is that there's a lot of "low-hanging fruit" when it comes to slow software. Firstly, bad time complexity and expensive redundant computations will destroy performance no matter the language. Then there's reducing allocations, swapping inefficient libraries for better or home-grown ones, and other "mid-level" optimizations, which are probably easier in a language like C++ or Rust but are still possible in Java or JavaScript. All of these affect performance much more than removing the overhead of automatic memory management and runtime guards. Eventually you may reach a point where removing the GC and guards is necessary; but people very frequently choose C++ or Rust prematurely, for a project that will never reach that point.
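
    To make the "low-hanging fruit" point concrete, here's a minimal sketch (in TypeScript purely for illustration; the function names are hypothetical, not from any of the sources above) of how an algorithmic fix dwarfs anything the language's memory management costs:

    ```typescript
    // Hypothetical example: the same deduplication task written two ways.
    // The quadratic version is slow in any language; the Set-based version
    // is fast even in a "slow" one.

    // O(n^2): for each item, linearly scan everything already kept.
    function dedupeQuadratic(items: string[]): string[] {
      const result: string[] = [];
      for (const item of items) {
        if (!result.includes(item)) { // linear scan on every iteration
          result.push(item);
        }
      }
      return result;
    }

    // O(n): a Set gives (amortized) constant-time membership checks.
    function dedupeLinear(items: string[]): string[] {
      return [...new Set(items)];
    }
    ```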

    12 votes
  4. Comment on White House urges use of type safe and memory safe programming languages and hardware in ~tech

    Jakobeha
    (edited )
    Link Parent

    The author is correct that memory safety vulnerabilities aren't the only vulnerabilities, and it's possible to write C or C++ in a way that ensures you don't get memory errors.

    Except memory safety vulnerabilities are very common: an analysis by Microsoft in 2019 concluded that over 70% of their CVEs were memory safety issues. And it doesn't matter whether memory unsafety is the worst cause of exploits; it's the cause of some high-profile exploits, so when you're writing secure software, it's something you care about.

    Furthermore, while you can technically write memory-safe C/C++, it's very tedious and hard to get right. It involves formal verification (which makes solving borrow-checker issues feel like a cakewalk) and/or using a subset of the language so small you can barely write anything. The author severely underestimates how pervasive C/C++ footguns are; his example of a "safe" container, std::string, has at least one memory-safety footgun of its own! The entire point of Rust's strict, unintuitive rules is that developers can't be trusted to write safe code on their own, and there are so many seemingly-innocuous ways to get memory corruption; Rust's annoying rules are what's necessary to write truly-safe, high-performance software.


    To the author's credit, and ironically something I don't think he addressed, the discussion around this seems to be missing that Rust is far from the only memory-safe language. In fact, most languages being used today make it so the average developer has to go out of their way to cause memory corruption: I'm talking about Go, Java, Python, JavaScript, Lua, Swift, Haskell, and any other language with garbage collection or some other form of automatic memory management. These languages achieve safety by taking memory control away from the user and managing it themselves (hence "automatic memory management"), and inserting guards before reads, writes, and other potentially-unsafe operations.
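
    As a rough illustration of those guards (a hypothetical TypeScript sketch, not something from the original discussion), the mistakes that corrupt memory in C are either impossible to express here or get caught by a runtime check:

    ```typescript
    // Hypothetical sketch of what runtime guards buy you in a managed language.

    const buffer: number[] = [1, 2, 3];

    // Out-of-bounds read: the engine bounds-checks the index and yields
    // `undefined` instead of reading whatever bytes sit past the array.
    console.log(buffer[100]); // undefined, never adjacent memory

    // Use-after-free can't be written: there is no free(). The array stays
    // alive while anything references it, then the garbage collector reclaims it.

    // Dereferencing a missing value hits a null/undefined guard and throws a
    // TypeError instead of following a dangling pointer.
    const rows: number[][] = [[1, 2], [3]];
    try {
      rows[5]!.push(4); // `!` only silences the type checker; the runtime still checks
    } catch (e) {
      console.log((e as Error).message); // "Cannot read properties of undefined ..."
    }
    ```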

    The drawback is that automatic memory management and guards add overhead, so C++ and Rust are preferred for code that needs to run as fast as possible. But I believe it's rare* for speed to be so important that this overhead is a real issue; at least, rarer than the cases where C/C++/Rust are chosen. Modern computers are fast, and well-written Swift/Go/Java is almost as fast as well-written C. Even if your program needs to be efficient, it's usually only a small part that does the heavy lifting, which can be written in a fast language while the rest is written in a scripting language; this is what most game engines (e.g. Unity, Godot) and neural-network frameworks (e.g. PyTorch) do.

    * Relatively. I can think of a few categories of programs which are exceptions, and I'm sure there are many programs written in these categories.
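
    As a hypothetical sketch of that split (TypeScript here for consistency with the examples above, with Node's zlib bindings standing in for the "fast part"): the orchestration is cheap scripting-language glue, while the compression itself runs in native C code, the same shape as the Unity/Godot/PyTorch setups mentioned above.

    ```typescript
    // Hypothetical sketch: scripting-language glue around a native core.
    // gzipSync/gunzipSync are thin wrappers over zlib, which is written in C.
    import { gzipSync, gunzipSync } from "node:zlib";

    // Glue code: trivial logic, negligible cost, easy to change.
    function archive(entries: Record<string, string>): Buffer {
      const payload = JSON.stringify(entries);
      return gzipSync(Buffer.from(payload, "utf8")); // heavy lifting in native code
    }

    function unarchive(blob: Buffer): Record<string, string> {
      return JSON.parse(gunzipSync(blob).toString("utf8"));
    }

    const blob = archive({ "readme.txt": "hello", "notes.md": "# notes" });
    console.log(unarchive(blob)["readme.txt"]); // "hello"
    ```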

    26 votes
  5. Comment on Is an ethical social media platform even possible? in ~tech

    Jakobeha
    (edited )
    Link Parent

    Hacker News is filled with subtle advertisements and the entire site is a giant advertisement for YCombinator. They also occasionally have “YC company is hiring” posts which can’t be upvoted or commented on, as well as “Launch HN” posts which can be upvoted and commented on but are guaranteed to show up on the front page.

    I will say the advertisements are a lot better than most sites. There’s at most one promoted page at a time (while each page has 30 posts), “subtle” advertisements have to be genuinely interesting or they’ll never make it to the front page, and their comments are practically guaranteed to have criticism (if your app sucks commenters will describe in detail how it sucks; if your app is really good, someone will find something they feel is wrong with it and describe how that makes it suck).

    But it’s far from free of monetization. I’m certain it has netted YCombinator millions if not billions by convincing at least one successful startup to join the YC program. It’s just advertising for a different audience. Even the bare-bones site design is a way to signal “YCombinator is practical and BS-free, so you can trust us with your startup”.

    6 votes
  6. Comment on A 2024 plea for lean software in ~comp

    Jakobeha
    (edited )
    Link Parent

    Modern JIT compilers make it possible to write code in languages like JavaScript, which still looks high-level and doesn't directly manipulate memory, but gets compiled to something efficient.

    The problem is that it's hard to know when said code is being compiled to something efficient, because JIT compilers are very complicated, and the smallest seemingly-irrelevant change can suddenly disable a lot of optimizations. For instance, in V8, calling a JavaScript function with more than four different object shapes makes its call sites megamorphic, which largely prevents them from being optimized; there are a lot of these subtle triggers that cause the compiler to "give up", and it can be hard to figure out why your code suddenly runs slower (statically compiled languages have their own subtle performance traps, like branch mispredictions and cache misses, but there are far fewer of them).
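
    Here's a hypothetical TypeScript sketch of that shape-sensitivity (the `area` function and the object shapes are made up for illustration): the call site is identical at the type level, but the engine sees very different numbers of runtime shapes.

    ```typescript
    // Hypothetical example: each object literal with a different property set
    // or insertion order gets a different hidden class ("shape") in V8, so the
    // property loads inside `area` see many shapes and the inline cache degrades.

    interface HasSize { w: number; h: number }

    function area(s: HasSize): number {
      return s.w * s.h; // these property loads are what the inline caches track
    }

    // Monomorphic: every call sees the same shape -> easy to optimize.
    for (let i = 0; i < 1e6; i++) area({ w: i, h: 2 });

    // Polymorphic/megamorphic: the same TypeScript type, but five different
    // runtime shapes (different property order, extra properties), so the
    // engine can no longer specialize the access.
    const shapes: HasSize[] = [
      { w: 1, h: 2 },
      { h: 2, w: 1 },                        // different insertion order
      { w: 1, h: 2, label: "a" } as HasSize, // extra property
      { w: 1, h: 2, id: 7 } as HasSize,
      { w: 1, h: 2, tag: true } as HasSize,
    ];
    for (let i = 0; i < 1e6; i++) area(shapes[i % shapes.length]!);
    ```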

    Another issue doesn't relate to the language itself but to its community: people who write in low-level languages are generally more aware of the performance characteristics of their code, while people who write in very-high-level languages are not. So the libraries you use in C and Rust may be more optimized, and explicitly document which functions are expensive and which are efficient, while some libraries in JavaScript over-use abstractions and design patterns which make it extra hard for the JIT compiler to optimize them. But as mentioned, this isn't intrinsic, just statistical: there are poorly-written, inefficient C libraries and highly-optimized JavaScript libraries.

    4 votes
  7. Comment on Older folks: Do you feel like work ethic has changed? Better or worse? Do you notice any generalizations? Have the times changed that much? in ~talk

    Jakobeha
    Link Parent

    I suspect another reason is just that the work is less interesting.

    Back in the 90s you could get rich making a generic "social media app" or "first-person shooter" because many of the big players hadn't been developed yet, and those things weren't yet "generic". The tools were a lot worse, but the expected quality and size of software was a lot lower, so while I don't really know, it seems like it was a lot easier to make something useful. Nowadays it seems like there's an app for everything, and you can't expect people to use your app unless it's extremely approachable. It seems like there are a lot of "bullshit jobs", and even work on genuinely important software isn't very creative. Although a big part of this perception is just me growing up.

    Also, as someone mentioned, the uncertainty of raises/promotions is a big factor. People won't put in extra effort unless they get extra reward, if not intrinsic then extrinsic. Which leads to non-innovative software, and I suspect eventually bad software, mainly because nobody can effectively measure productivity or quality, so developers need some level of trust.

    Anyway, not everything can be novel, but I think it would help for both companies and developers to focus on what part of their product is essential and what "improvements" make it more useful. That's what people did back then, when computers ran in megahertz and storage was in megabytes, and despite today's computers being much more powerful and people expecting a lot more, I think it's still relevant.

    12 votes
  8. Comment on The Markup iceberg in ~tech

    Jakobeha
    Link Parent

    I should clarify: the language itself is powerful enough; it's the third-party support which makes Typst "not as powerful as LaTeX". LaTeX has decades of packages providing all sorts of extensions, some of which are complex and surely rely on LaTeX-specific behavior that would make them hard to port.

    Even then, I'm sure Typst fits almost every situation, except when you're working with a team that's more familiar with LaTeX, or have to submit in a very specific format that's defined by a LaTeX template. Unfortunately those exceptions are pretty common.

    5 votes
  9. Comment on The Markup iceberg in ~tech

    Jakobeha
    Link Parent

    It’s pretty good: I tried having ChatGPT convert some MathTeX by giving it a couple of examples, and it made a couple of mistakes, but it nonetheless saved me time.

    Though nowadays I’d use Pandoc, which can now convert LaTeX (or any of its other input formats) to Typst.

    2 votes
  10. Comment on The Markup iceberg in ~tech

    Jakobeha
    Link

    Typst is a new-ish (beta, but it's been out for a while) alternative to LaTeX for writing papers. My experience using it has been really impressive: it's not as powerful as LaTeX, but it has fast live preview, and the syntax is actually intuitive. It has Markdown-like syntax for bold, italic, lists, and headers (lists and headers use different characters than Markdown, but otherwise it's the same), and it has a math mode similar to MathTeX.

    Unfortunately when a paper needs to use a template defined in LaTeX, LaTeX is the only option. But I hope Typst gets more popular because it definitely seems like a serious contender.

    5 votes
  11. Comment on Is there a markdown editor which let me open .md files from Windows? in ~tech

    Jakobeha
    Link

    I assume you also want a sort of WYSIWYG “plain” editor.

    MarkText.

    GhostWriter except it isn’t “perfect” WYSIWYG (monospaced font), but close.

    Typora is very good except it costs money (used to be free, now it’s a 15-day trial).

    6 votes
  12. Comment on Why do some educators dislike teaching people who don't already know? in ~life

    Jakobeha
    (edited )
    Link

    2 big issues with the general way we teach:

    1. Students come in with different levels of experience and learn at different rates. Some students get bored because they’re already ahead and learn fast; some students need review and need more time on each lesson to grasp its concepts.

    2. Professors teach the same classes over and over, so they get bored and disinterested, especially teaching entry-level classes where there isn’t much room for flexibility or creativity.

    Unfortunately, these issues especially apply when students need review and ask review questions. For 1), some students would benefit from review lectures, but most would’ve already understood, so the lectures would be a waste. Also, review lectures mean there’s less time to teach what the course is actually about, so students who do already understand don’t get the amount of learning they signed up for. So the struggling students email and go to office hours for review, which leads to 2): the professor has to waste their time giving the same replies over and over (in contrast, many professors love it when students ask unique and interesting questions).

    To be clear, I believe the students who need extra support and review deserve it. I also believe that students who learn fast deserve to advance through their classes faster. I think there has to be a better way, so that teachers and professors can spend their time teaching interesting material and answering uncommon questions, which is a better use of their talents, and students can learn at their own rate with more personal support. I’m a big fan of online learning, and I imagine the ideal is platforms like Khan Academy replacing lectures entirely, and teachers/professors only teaching interactive seminars and assisting students who have questions and/or need a form of help that the online tools can’t provide.

    28 votes
  13. Comment on Netflix is reportedly exploring adding in-game ads to its gaming service in ~games

    Jakobeha
    Link Parent

    Lots of people don’t care, or at least aren’t bothered/informed enough to do anything about it. Honestly I bet they could get away with a lot more.

    The good news is there’s a large userbase who does care, and is even willing to pay extra, so there will always be alternatives if not workarounds.

    Example: gaming. Everyone knows about the rise of freemium mobile games and micro-transactions, but there are still AAA-quality games produced “the old way” with one-time purchases, like Elden Ring, TOTK, and Baldur’s Gate 3. And they are making plenty of revenue; in fact, Baldur’s Gate 3 is the #1 revenue-producing game on Steam: https://store.steampowered.com/charts/topselling/US.

    2 votes
  14. Comment on Core Internet – what sites and services should we permanently preserve? in ~tech

    Jakobeha
    Link

    GitHub. They already preserved some of it in 2020 in the Arctic Code Vault.

    6 votes
  15. Comment on 50 Algorithms Every Programmer Should Know (Second Edition) in ~comp

    Jakobeha
    Link Parent
    From Ask HN What are some cool but obscure data-structures you know about? What are your favorite algorithms? What is new in algorithms and data structures these days? What are the most requested...
    1 vote
  16. Comment on On GitHub Copilot in ~comp

    Jakobeha
    Link Parent

    https://www.jetbrains.com/ai/

    I believe it’s basically everything except Copilot. It gives you IDE actions which invoke an LLM, like “generate Git commit message”, “generate documentation”, “explain this error”, and “suggest name” (those are the ones I’ve seen so far; their site really isn’t specific and is mostly AI hype, unfortunately). It also provides a ChatGPT clone within IntelliJ, which allegedly has contextual information about your project fed to it somehow.

    IMO it’s not worth it right now. Like most AI, JetBrains AI has a lot of potential but none of its features translate into any practical benefit. Copilot is one of the exceptions which does have practical uses.

    4 votes
  17. Comment on On GitHub Copilot in ~comp

    Jakobeha
    Link

    Copilot is the only LLM I use more than rarely. I use it to write boilerplate, and to write in languages I'm unfamiliar with (sometimes it gives idiomatic completions I wouldn't discover on my own). I check over the completions, and mistakes show up fairly often, especially in non-trivial cases, but I believe it's increased my productivity nonetheless. Surprisingly, it works well for documentation and other English prose (JetBrains AI has a "generate documentation" feature, as well as "generate commit message", but IME right now they're both really inaccurate and gimmicky).

    8 votes
  18. Comment on Do you ever "self filter" before making a post or comment and what is it based on? in ~talk

    Jakobeha
    Link Parent

    The bar for a moderator to outright remove something "not interesting enough" is usually (and IMO should be) very low. The more important factor is what gets promoted.

    Even some of the most biased communities sometimes have good posts and comments (like new information, uncommon facts backed by sources, or uncommon opinions with good arguments), which tend to get pushed to the top of the feed or thread. Most noise already gets filtered by not being upvoted, so it falls to the bottom where most people will never see it. And although there are people who upvote memes and non-credible statements and downvote anything they mildly disagree with, I'm sure there are moderator, "veteran", "new member", and other weights which can counteract those.

    However, it's hard for users and mods to find and promote interesting content when there's so much uninteresting content to sift through.

    I know that what I think is interesting and significant usually differs from others' opinions, and I understand perfectionism: people who almost avoided (or did avoid) sharing meaningful ideas and work because they felt it wasn't interesting enough. Some misses here and there are inevitable, and creating some noise really isn't an issue. But words are cheap, by which I mean: the average real-world impact of a single post is almost 0; there are so many people on the internet that if everyone posted much less frequently, there'd still be continuous new posts; and I think when someone has something especially meaningful to say, they absolutely know. So I also think that, even with a system in place to surface good posts, and even if a weak post simply doesn't get upvoted, people should still self-filter and only post what they believe is high quality.

    4 votes
  19. Comment on Do you ever "self filter" before making a post or comment and what is it based on? in ~talk

    Jakobeha
    (edited )
    Link

    I filter not because of controversy, but because reading over, I realize whatever I’d post isn’t worth it.

    Usually I don’t have anything meaningful to say. Other times I have a fact I’m not sure is accurate, or an opinion which is too one-sided, especially when I think more about it, and the nuanced take seems too obvious and vacuous.

    Maybe this sounds like low self-esteem. But the reality is, lots of internet threads devolve into the same themes, with (sometimes literally) the same basic statements over and over. And some people seem to think they understand how a decent-sized chunk of the world works, but nobody really does. Today’s world is so complicated, with so much hidden information, that I doubt anyone understands more than a deep, narrow slice of whatever they specialize in, plus a surface-level understanding of everything else. Most (but not all) of that surface understanding is no deeper than others’, especially in the community they’re posting in (which is already biased towards them, since it shares their interests).

    I think that people posting too leniently is the main reason why we have echo chambers, black-and-white worldviews, and a low signal-to-noise ratio. I think that if more people read over their posts before submitting, and tried to post only content they believe is significant, accurate, and adds something beyond what’s already been expressed, forums would be better and more interesting. Key word being “try”: I definitely still post tropes and inaccuracies, and filtering has nuance and downsides (like almost everything else); people should just try harder than they do now.

    Tildes is a lot better at this, which is nice even if there’s less discussion, because there’s less noise. Hacker News is sometimes good at this, in part because most of the content is technical, but the culture helps too; but there’s definitely still an echo chamber & noise & inaccuracy problem there. Next are niche Reddit subs, but even with factual, technical content, they tend to be invaded by low-effort discussion or common tropes unless they’re very niche. Popular Reddit subs are the worst of any social networks I use. I don’t use Facebook or Twitter so I can’t compare.

    Some of it’s definitely misinformation, karma farming, bots, etc. But I do think a lot of it is real people with good intentions who don’t realize it, the “Eternal September”. Basically, my argument is that aggressive filtering is a good thing: even if there are fewer posts, most of what gets removed is insignificant noise, and the important posts have concrete, significant content, so they don’t get filtered out.

    26 votes
  20. Comment on Unproven 'winter break' hypothesis seeks to explain ChatGPT's seemingly new reluctance to do hard work in ~tech

    Jakobeha
    Link

    ChatGPT’s system prompt includes the current day, correct?

    So on December 25, if you ask it something mundane like “write me a story” (with, to be clear, no reference to holidays whatsoever), is it going to give you Christmas-themed answers?

    9 votes