onyxleopard's recent activity

  1. Comment on Lemmy, an open-source federated Reddit alternative, gets funding for development in ~tech

    onyxleopard
    Link Parent

    The notion that you can robustly identify bad and good people by looking at individual words, rather than by how they use them, is highly ignorant of how natural language works.

    12 votes
  2. Comment on General-purpose OS, special-purpose OS, and now: vendor-purpose OS in ~comp

    onyxleopard
    Link Parent

    Apple has historically, but they stopped sourcing GPUs from Nvidia, and it seems that history has soured relations between the two companies. Here’s an article that gives some color on the history.

    Edit: FWIW, I recall needing to get replacements for 2 consecutive generations of MacBook Pros with discrete Nvidia mobile GPUs due to GPU hardware failures circa 2005-2008, so there were clearly some issues with quality from Nvidia’s side—Apple had to eat the cost of replacing both my laptops for me due to Nvidia’s poor quality control.

    4 votes
  3. Comment on Lemmy, an open-source federated Reddit alternative, gets funding for development in ~tech

    onyxleopard
    Link Parent

    Trying to preemptively come up with words that you think are bad is an ignorant policy, and I am of the opinion that such ignorance betrays a fundamental misunderstanding of language and discourse, which absolutely is cause for concern. Using automated software to help moderators moderate is a good thing, but this kind of language policing is misguided.

    11 votes
  4. Comment on General-purpose OS, special-purpose OS, and now: vendor-purpose OS in ~comp

    onyxleopard
    Link Parent

    I think in this context, user-focus/orientation is being confused for developer-focus/orientation. I think Apple, Microsoft, and Google probably have a much better understanding of the users of their platforms than the software developers targeting those platforms. Now, I can agree that things like Windows displaying ads that are baked into the OS is one thing. But something like Apple warning users when they install software from developers who don’t sign their apps is user-focused. It’s just not developer-focused if you are a developer who believes they are entitled to distribute their software at the expense of users’ security/privacy/system stability etc.

    Basically, I think macOS is far more user-focused than it is developer-focused, but among the platforms called out, it’s actually still very developer-focused, comparatively. Platforms have to balance users’ safety with developers’ convenience, and it’s pretty clear how that’s trending. I have no major dispute there. I just don’t think this trend is entirely explainable as the platforms being self-serving. They want to make sure the user experience hits a minimum level of quality, even if that means sacrificing quality of life for developers in areas where they have historically been given pretty much free rein. It’s no longer the Wild West for developers. For users who want that freedom, platforms like macOS still offer a great deal of it, at the expense of disabling a bunch of the assurances Apple provides by default, like System Integrity Protection. But if Apple were really solely focused on being self-serving, they wouldn’t allow users to disable SIP or install unsigned apps at all, much less allow things like Boot Camp or virtualizing other OSes on the Mac.

    2 votes
  5. Comment on Apple Worldwide Developers Conference (WWDC) 2020 Livestream & Discussion (starts 10AM June 22 PT / 5PM June 23 UTC) in ~tech

    onyxleopard
    Link Parent

    Having lived through the PPC to x86 transition and the use of the Rosetta translation layer, I’m taking Apple at their word that the transition shouldn’t really be that much of a hassle for end-users. For those who compile or develop their own software for macOS and other platforms, I can imagine some headaches. But as a user, I’m personally not very anxious.

    A lot of existing tech and other operating systems are still primarily on x86 and it's been a good desktop standard for literally decades, with Apple having moved towards it for that reason in 2006 or so.

    I think you’re confusing CPU instruction sets with something else. Apple moved from PowerPC to Intel at the time because building a PowerPC G5 chip that would be viable for laptops was thermally infeasible. The whole spiel in the keynote about power consumption and performance per watt is the exact same motivation today for the current architecture transition. Apple wants their Mac laptops (by far the more popular form factor for personal computers) to be more power-efficient than Intel chips are capable of. Apple has already succeeded in designing, and getting TSMC to manufacture, their own SoCs that, per watt, outperform CPUs from other designers. What’s going to be interesting to see is how far Apple can push their chip designs to compete at the high-performance end. I have an inkling that by the time the Mac platform fully transitions to A-series chips, Macs will clearly be the best bang for your computing dollar on the market.

    My real concern is, again, that this points towards Apple making iOS their primary operating system, which could lead to a more locked-in system and UI compromises moving away from mouse input towards touch/gesture input.

    iOS is already their primary operating system, so if you’re afraid of that, I don’t know what to tell you. We’re already there. Which instruction set the OS runs on has nothing to do with which OS is primary or secondary or tertiary—that has to do with what sells best. The switch to ARM is essentially a way for Apple to reap more rewards from their own chip designs rather than having to be reliant on third parties like Intel. This is a trend we’ve seen from Apple in everything lately. Apple does not want to be reliant on third parties wherever possible, and they are large enough and cash-rich enough now that they can afford to reap the rewards of the investments they’ve made in themselves.

    I could see them having planned just selling iPads with a keyboard attachment as a possible end game for this decade and that just makes me uncomfortable.

    I guess I just don’t understand your perspective. I don’t think there is any “endgame” for Apple other than making good personal computers and well-integrated software with those computers. If Apple actually planned to deprecate macOS, they wouldn’t be doing all this work to transition it to their SoCs. All signs point to macOS being here for a while yet. The increment from 10.15 to 11.0 is actually a really wonderful thing to see, IMO.

    They already got rid of the F-keys and replaced them with a touch bar.

    Apple still sells full-size, external keyboards. You can even use them with iPads today! You can buy nearly any USB or Bluetooth keyboard and it will work with your Mac (or even your iPad now in iOS 13+). I use a 2017 MacBook Pro for work and it has a Touch Bar, and honestly I don’t even notice it’s there. I think the only thing I’ve ever used it for semi-regularly is changing the audio volume. When I’m at my desk at home, I just use my full-size Apple Bluetooth keyboard instead. And this all has absolutely nothing to do with switching to ARM.

    1 vote
  6. Comment on Apple Worldwide Developers Conference (WWDC) 2020 Livestream & Discussion (starts 10AM June 22 PT / 5PM June 23 UTC) in ~tech

    onyxleopard
    Link Parent

    Can you give examples of software that will not be recompiled for ARM by the time Apple stops supporting Intel-based Macs?

    Because if that’s what your anxiety is about, I don’t think it’s warranted. Outside of servers, ARM is becoming the standard architecture for personal computing.

  7. Comment on Apple Worldwide Developers Conference (WWDC) 2020 Livestream & Discussion (starts 10AM June 22 PT / 5PM June 23 UTC) in ~tech

    onyxleopard
    Link Parent

    I generally find a very robust inverse correlation between the size of a company and the quality of the software it produces. Apple is an outlier in that regard.

    3 votes
  8. Comment on Apple Worldwide Developers Conference (WWDC) 2020 Livestream & Discussion (starts 10AM June 22 PT / 5PM June 23 UTC) in ~tech

    onyxleopard
    Link Parent

    I don’t really understand the anxiety before you’ve used it yourself.

    Apple is in a weird position with WWDC keynotes in that the audience is very wide. If they focus on technical details too much, they will lose the outside edges of the audience. You can really think of the keynote as a preview of the consumer-facing features, with a few pointers thrown in for the keen-eyed, more technical audience. For actual macOS devs, there is a whole week of sessions to attend to learn about framework updates, Rosetta 2, Universal Binaries 2, etc.

    Did you live through the PowerPC to x86 transition? Were you anxious then? Are you in the developer program? Are you going to download the beta and try it?

    2 votes
  9. Comment on Apple Worldwide Developers Conference (WWDC) 2020 Livestream & Discussion (starts 10AM June 22 PT / 5PM June 23 UTC) in ~tech

    onyxleopard
    Link Parent

    Apple has a long history of sherlocking 3rd party features.

    And most of the time they do a better job than the 3rd party solution.

    The macOS team sometimes adds things that nobody asked for, though, like Launchpad, which AFAICT was a total dead end and probably shouldn’t have been released in the first place, considering the weird, half-baked UX and the frustration of trying to drag app icons into folders.

    1 vote
  10. Comment on Replacing (potentially) insensitive terminology in programming in ~comp

    onyxleopard
    (edited)
    Link Parent

    Beginners are the ones who would be least hurt by such a change.

    Imagine you are a beginner and you find a git tutorial. The tutorial uses the new terminology of main, but you happen to be working with an existing repository created before the new terminology was adopted, so it uses master. For example, if the tutorial tells you to run git checkout main, this is what happens if you try to do so in a repository where there is no main branch (not to mention the issue if you happened to have a file checked in named “main”):

    git checkout main
    error: pathspec 'main' did not match any file(s) known to git
    

    If I were a beginner trying to use git and this is what I saw, I’d assume I did something wrong.
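
    For what it’s worth, the way out of that confusion, assuming the beginner even knows to look for it, is to list the branches that actually exist and then check out whichever default the repository uses, roughly:

    # list local and remote branches to see whether the default is main or master
    git branch -a
    # then check out whichever one actually exists
    git checkout master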

    What are, exactly, your principles?

    The principles of humanity and charity, as well as the cooperative principle, at least. I don’t go around flouting these principles and assuming the most pessimistic interpretation of every utterance I come across. Anyone who does is not actually interested in being a productive part of a conversation.

    11 votes
  11. Comment on Replacing (potentially) insensitive terminology in programming in ~comp

    onyxleopard
    Link Parent

    Naming things is hard. More so when you’re a non-native speaker. Still, even if Baudis approves of the change, it’s clear to me that he did not intend any harm in his original naming choice, and since git has never had a concept of 'slave' branches, I still contend this is language-policing overreach. If I were Microsoft, I’d spend my resources making more substantive changes within my power to support and amplify the voices of the current protest movement.

    2 votes
  12. Comment on Replacing (potentially) insensitive terminology in programming in ~comp

    onyxleopard
    (edited)
    Link Parent

    Natural language changes via natural processes (like sound change, semantic drift, or epenthesis). The use of the term master in GitHub is up to Microsoft. It can self-censor if it so decides. But segments of a language community do not get to decide how the rest of that community gets to speak by reducing every ambiguity to its most problematic interpretation. The same argument, applied broadly, would suggest that the n-word should be taboo regardless of who says it and how they use it in context. Are you really of the opinion that natural language requires such policing? As a linguist, I’m certain that this kind of prescriptive ideology is misguided.

    Who is this hypothetical change hurting?

    I can’t think of all the possibilities, but it at least hurts anyone who tries to use multiple different git repos and encounters different conventions for the master branch name. This is likely most harmful to beginners who are not initiated into either convention. (git is hard enough to grok on its own without multiple sets of terms being strewn around.)
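
    For those of us who hop between repos that follow both conventions, a crude workaround is to fall back from one branch name to the other, something like:

    # try the newer default branch name first, then fall back to the older one
    git checkout main 2>/dev/null || git checkout master

    But a beginner shouldn’t be expected to know that trick.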

    And besides, when taking actions that have far-reaching effects, whether they harm anyone should not be the sole criterion. What about being principled?

    15 votes
  13. Comment on Replacing (potentially) insensitive terminology in programming in ~comp

    onyxleopard
    Link Parent

    As I’ve already explained in this thread, the sense of “master” in the context of git is distinct from the “master/slave” sense; if you’d bothered to read the thread rather than jumping in without context, you’d understand that.

    3 votes
  14. Comment on Replacing (potentially) insensitive terminology in programming in ~comp

    onyxleopard
    Link Parent

    The logical sequence is quite easy to follow. If you decide to appease every person who willfully misinterprets your words, you’ll spend your whole life mincing your words rather than saying what you mean.

    7 votes
  15. Comment on Replacing (potentially) insensitive terminology in programming in ~comp

    onyxleopard
    Link Parent

    In this example, if one party understands "master" to be offensive and the other party doesn't, that doesn't make one of them wrong.

    If I use a certain sense of an ambiguous word, it is true that I am responsible for making my meaning clear. The way to do this is context. In the context of git and GitHub, the sense of master is abundantly clear from context. The proposed replacement term main is equally or potentially more ambiguous than master; it just happens not to have a potentially racist connotation among its many possible interpretations.

    Words will have different meanings to them.

    Precisely! And if I use one sense of a word, and you interpret a different sense, that can lead to miscommunication. That is why context is important, along with pragmatics. If you are going to ignore the context and insist that some words are problematic regardless of context, you are not being a cooperative communicator.

    It doesn't create any new problems, and appears to solve some existing ones.

    It raises the question of all other identifiers in git and GitHub (or in any other software you care to inspect). Should git change the term cherry-pick because of other senses of 'cherry'?

    [in singular] informal one's virginity: only 3 percent of the students lost their cherry at college.

    I can also imagine this sense is offensive to the sensibilities of some population, so by the precedent set here, mustn’t we change this, too?

    If you don’t think the case of cherry-pick is the same as the case of master, can you articulate why? Because I see no principled way to say that master was worth changing if cherry-pick isn’t.

    7 votes
  16. Comment on Replacing (potentially) insensitive terminology in programming in ~comp

    onyxleopard
    Link Parent

    Why not? Why should it stay that way?

    Because words can have multiple meanings! Polysemy is inherent to natural language! Are we really going to go through every word in every natural language and reduce each one to the most racially insensitive sense we can think of, ignoring all the equally valid senses? What about domains other than racism? Are we going to say that languages that have grammatical gender are problematic now, too, because someone might ignorantly perceive that as bigoted?

    13 votes
  17. Comment on Replacing (potentially) insensitive terminology in programming in ~comp

    onyxleopard
    Link Parent

    That seems less likely than git borrowing from other version control systems, which were based on master/slave.

    If the original meaning was in the sense of the master/slave dichotomy, that dichotomy no longer exists in git. The semantics have changed. To insist that terms are problematic per se, rather than based on their intended meaning and use in context, is incredibly frustrating to me as a linguist. Suggesting that one should self-censor in order to preempt potential ignorant misinterpretations is just as problematic to me as suggesting that one should adhere to prescriptive grammar to appease grammar nazis.

    13 votes
  18. Comment on Replacing (potentially) insensitive terminology in programming in ~comp

    onyxleopard
    Link Parent

    Yes, master/slave is not a big deal, except for a minority of people.

    I totally recognize that this terminology is racist.

    However! The master/slave dichotomy is not what is being invoked by the term master as used in git. The master branch, as used in the context of version control in git, is more like this definition from the New Oxford American Dictionary:

    an original movie, recording, or document from which copies can be made: [as modifier] : the master tape.

    A "master branch" is exactly this sense of 'master'. It is the original branch that new work is branched from. There is no concept of slave in git (even if there was in the version control systems that predate it).
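
    In a typical workflow it plays exactly that role and nothing more, roughly (with my-feature as a placeholder branch name):

    # master (or main) is just the starting point that new work branches off of
    git checkout master
    git checkout -b my-feature
    # ...commit work on the feature branch, then fold it back in...
    git checkout master
    git merge my-feature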

    So, in this case, if someone perceives the term master as problematic, they are perceiving a different meaning of the term than is intended. We can argue about whether intentionality is important when it comes to issues of systemic racism, but I think this policing of language is over-reaching. Natural languages are inherently ambiguous. Those who use natural language shouldn’t have to reduce their vocabularies to exclude ambiguous terms just because someone might interpret an unintended meaning. Should the music and movie industries stop using the term 'master' as well?

    21 votes
  19. Comment on British farmers need all the help science can offer. Time to allow gene editing in ~science

    onyxleopard
    Link Parent

    Most of our crops became that way from hundreds or thousands of years cultivating and selecting for certain properties.

    And what if our world is changing more rapidly than on the order of millennia due to human activity? We may not have a thousand, much less a hundred, years of buffer to react to a food supply crisis. Are we really going to sit on our hands until we are in a position where we have to beg corporations to try to save crops? Isn’t that setting up the corporations to have way too much influence on the food supply?

    1 vote
  20. Comment on British farmers need all the help science can offer. Time to allow gene editing in ~science

    onyxleopard
    (edited)
    Link Parent

    In the US, a patent was originally supposed to be a limited (in terms of time), government-granted monopoly to incentivize innovation.

    If you limit patents to the genetic modification techniques only (that is, only the GM sans the O, so you can make copies of the O, whether by cloning or other forms of reproduction), the question becomes: How do you preserve the incentives for genetic engineering research in the first place? Who will pay for the development of genetic engineering science and technology? If the answer is that R&D is restricted to large corporations like the Bayers, DuPonts, BASFs, Roches, Shires, and Amgens of the world, with their well-funded R&D departments, I think this is still fundamentally problematic.

    If the answer is that there is enough real-world economic incentive to do the research via public funding or research universities anyway, then maybe it’s OK to throw out the current system of incentives (i.e., the US patent system). But if there is no guarantee that such effort will be rewarded, genetic engineering may be too expensive to invest in, and throwing away the current system risks chilling the entire field or limiting it to large corporations.

    There are other perverse incentives of note here (which afflict software, too). Basically, if there is a biotechnology patent of interest, corporations will just wait for the patent to expire, and then begin production when the technology is not encumbered by legal risk.

    I know a molecular biologist who held a patent on a technique for genetically engineering yeast, and it covered a GMO yeast strain. He was able to grant research licenses to a few universities, and he did have a small number of commercial licenses over the life of the patent (remember, patents have a limited duration). But overall, as an individual, he probably spent more resources enforcing his rights as a patent holder, in terms of his own time and patent firms’ time, than he gained in meager financial rewards from licensing the technology. Since the patent and an extension on it expired, he knows of at least two startups that are using the technology, now that there is no monopoly on it. He was also talked out of bringing suit against a corporation that he knew was violating his patent, because patent lawyers told him he would be unlikely to win in court. He says that, knowing what he knows now, he considers the R&D effort he spent back in the 80s wasted. According to him, the promise of being rewarded with a government-sanctioned monopoly was insufficient. And, as an individual, the effort he put into developing the technology and into licensing and enforcing his legal rights as a patent holder resulted in a small profit if he’s being charitable, and likely a deficit if his total time spent were tracked accurately (which it wasn’t, because this was basically his side job for twenty years). If he were to do it again, he says the only way it would have been worth it is if he could have gotten the USPTO to grant him a patent covering all yeast strains made with his technique, but the USPTO doesn’t grant such broad patents.

    So, what would an incentive system look like that would actually encourage innovation and reward innovators? How would such a system overcome perverse incentives that benefit legal entities such as corporations? How would such a system ensure that the fundamental technologies necessary to protect the global food supply from diseases or other risk factors that could cause major disasters actually get developed? How would such a system take into account the medical domain, where the markets for drugs and gene therapies may not present clear incentives to develop solutions?

    I don’t have good answers to these questions, but I think we need scientists involved in government policy decisions to ensure that we don’t end up in a truly unlivable situation.

    6 votes