I think in this context, user-focus/orientation is being confused with developer-focus/orientation. I think Apple, Microsoft, and Google probably have a much better understanding of the users of their platforms than the software developers targeting those platforms do. Now, I can agree that something like Windows displaying ads baked into the OS is a different matter. But something like Apple warning users when they install software from developers who don’t sign their apps is user-focused. It’s just not developer-focused, if you are a developer who believes you are entitled to distribute your software at the expense of users’ security, privacy, system stability, etc.
Basically, I think macOS is far more user-focused than it is developer-focused, but among the platforms called out, it’s actually still very developer-focused, comparatively. Platforms have to balance users’ safety with developers’ convenience, and it’s pretty clear how that’s trending. I have no major dispute there. I just don’t think this trend is entirely explainable as the platforms being self-serving. They want to make sure the user experience hits a minimum level of quality, even if that means sacrificing quality of life for developers in certain respects where they have historically been given pretty much free rein. It’s no longer the Wild West for developers. For users who want that freedom, platforms like macOS still offer a great amount of it, at the expense of disabling a bunch of the assurances Apple provides by default, like System Integrity Protection. But if Apple were really solely focused on being self-serving, they wouldn’t allow users to disable SIP or install unsigned apps at all—much less allow things like Boot Camp or virtualizing other OSes on the Mac.
The Apple section is kind of lolworthy:
Apple is more subtle from the end-user’s perspective. They eschew standards to build walled gardens, opting for Metal rather than Vulkan, for example. They use cryptographic signatures to enforce a racket against developers who just want to ship their programs. They bully vendors in the app store into adding things like microtransactions to increase their revenue. They’ve also long been making similar moves in their hardware design, adding anti-features which are explicitly designed to increase their profit — adding false costs which are ultimately passed onto the consumer.
So we're going to be mad at Apple over Metal but not say anything about DirectX, which is still the predominant graphics API in most of the situations that the people who care about these things complain about? OK.
And the rest of it is all vendor/developer complaints rather than end-user complaints, and even in that light they don't make sense. Yeah, a developer might "just" want to ship their programs, but a user wants to be able to install programs without worrying whether this weird thing off SourceForge might be malware. How is it not user-focused to introduce a modicum of security there? And on macOS you can still install whatever you want; you just need to know your way around the OS well enough to open the gate.
I don't even know how they decided that Apple "bullies vendors into adding microtransactions." If anything, their revenue-sharing policies are a hard disincentive for microtransactions.
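To make the signing point above concrete, here is a toy model of what a Gatekeeper-style check buys the user. Everything in it is my own sketch: the names are made up, and I use a symmetric HMAC as a stand-in for real public-key code signatures, which is nothing like Apple's actual implementation—the point is only that "verify before first launch" is what turns "weird thing off SourceForge" into a warning instead of a silent install.

```python
import hashlib
import hmac

# Hypothetical developer registry: the platform knows a signing key per
# registered developer. (Real platforms use public-key certificates.)
PLATFORM_KEYS = {"good-dev": b"dev-secret"}

def sign(app_bytes, dev_key):
    # Stand-in for a real code signature.
    return hmac.new(dev_key, app_bytes, hashlib.sha256).hexdigest()

def gatekeeper_check(app_bytes, dev_id=None, sig=None):
    # Unsigned apps get a warning; tampered or mis-signed apps get blocked.
    if dev_id is None or sig is None:
        return "warn: unidentified developer"
    key = PLATFORM_KEYS.get(dev_id)
    if key is None or not hmac.compare_digest(sign(app_bytes, key), sig):
        return "block: signature invalid"
    return "allow"

app = b"app binary contents"
print(gatekeeper_check(app, "good-dev", sign(app, b"dev-secret")))  # allow
print(gatekeeper_check(app))  # warn: unidentified developer
print(gatekeeper_check(b"tampered bytes", "good-dev", sign(app, b"dev-secret")))  # block: signature invalid
```

Note that the "warn" path still lets the user proceed—which matches how macOS actually behaves: the gate is there, but the user can open it.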
While Apple did deprecate OpenGL, it's now in the same category as on Windows, I believe. You need a third-party-supplied framework to use it, such as MoltenGL. This is the same as how you can use Vulkan on Apple platforms using MoltenVK. It's there if you want it, but Apple isn't going to maintain it.
OpenGL has always been a clusterfuck on macOS because of Nvidia's crappy driver support. It's not unilaterally an Apple thing. Things would be technically compatible but riddled with bugs and issues.
Apple didn't just ignore Vulkan, they also deprecated OpenGL.
they consistently choose their own interests over those of their users
The thing is, developers are not "the users." Most of Apple's restrictions are designed to protect their users from the bad actions of developers. Those actions could be malicious, but more often they're well-intentioned but overzealous and/or ill-considered. A big part of the Mac user experience, even historically, was Apple taking an active role in exercising control over, or providing guidance to, developers about how to accomplish certain tasks. This was a big reason why the Mac traditionally had such a cohesive culture in its development community and why "Mac apps" all had such a trademark look and feel, even among third parties.
That's a completely solvable problem, but it's in Apple's interest to help keep proprietary apps more convenient and more widely used, because they get money from developers of proprietary apps.
No, this is in my interest: it should be hard for you to send me push notifications. I don't need you spamming me with low-value notifications. I definitely don't need 30 developers, who all have no respect for my time or attention, each deciding to spam me with more push notifications. How does anyone not see that?
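The user-side value being argued for here is easy to model. A crude per-app quota (my own toy numbers and names, not any platform's actual policy) is enough to show how platform friction caps an app's share of the user's attention:

```python
from collections import defaultdict

DAILY_LIMIT = 5  # hypothetical cap, not a real platform number

class NotificationGate:
    """Toy stand-in for the friction a platform adds between apps and users."""

    def __init__(self, limit=DAILY_LIMIT):
        self.limit = limit
        self.sent_today = defaultdict(int)

    def deliver(self, app, message):
        # Drop anything beyond this app's share of the user's attention budget.
        if self.sent_today[app] >= self.limit:
            return False
        self.sent_today[app] += 1
        return True

gate = NotificationGate()
results = [gate.deliver("chatty-app", f"promo {i}") for i in range(8)]
print(results.count(True), "delivered,", results.count(False), "dropped")  # 5 delivered, 3 dropped
```

One chatty app burning its own quota doesn't affect any other app's deliveries, which is the point: the cost of spamming lands on the spammer, not on the user's other 29 apps.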
This is true iff you think that low-user-count software, or software created by independent developers, is inherently less important than high-user-count software, or software created by companies with enough resources to work with Apple the way they want to be worked with.
It really isn't about low-user-count or high-user-count. It's about developers who are properly acculturated into good development hygiene and practices and those who aren't. A lot of people seem to not think they're obligated to be respectful of how they fit into the broader ecosystem that the end user has to deal with.
I think you may be misunderstanding - if I install an XMPP app, that app which I installed is not allowed to send me, the user who installed the app, notifications on my own device. How does that help the user?
If I'm not mistaken, isn't the issue that Apple doesn't allow apps to run whatever they want in the background because it affects battery life pretty negatively? If so, then it helps the user by not allowing any app to just run down your battery mining bitcoins or whatever under the guise of sending you messages.
That’s a thing in iOS. The only difference is that you need to pipe all notifications through Apple’s servers, so devices don’t need to keep connections open to dozens of servers for push notifications. If apps could specify their own notification URLs, your phone would need to either poll those endpoints (using up bandwidth) or keep dozens of connections alive (which would also cost bandwidth and battery).
I don’t think any app code runs when push notifications are received, but I might be wrong as I haven’t ever implemented them.
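The fan-in argument above can be sketched concretely. This is an illustration only—my own class and names, not how APNs actually works: many app backends push through one broker, and the device holds a single connection no matter how many apps want notifications.

```python
class PushBroker:
    """Toy centralized push service: app backends talk to the broker,
    never directly to the device."""

    def __init__(self):
        self.device_queues = {}  # device_id -> queued notifications

    def register(self, device_id):
        self.device_queues.setdefault(device_id, [])

    def push(self, device_id, app, message):
        # Every backend's notification lands in the same per-device queue.
        self.device_queues[device_id].append((app, message))

broker = PushBroker()
broker.register("phone-1")

# Thirty separate app backends, zero extra connections held by the device:
for app_id in range(30):
    broker.push("phone-1", f"app-{app_id}", "new message")

# The device drains everything over its one persistent connection.
inbox = broker.device_queues["phone-1"]
print(len(inbox), "notifications over 1 connection")  # 30 notifications over 1 connection
```

With per-app notification URLs instead, the device would hold 30 long-lived connections (or poll 30 endpoints); here it holds one, which is the battery/bandwidth trade the comment describes.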
Do any Apple computers ship with Nvidia GPUs?
Apple has historically, but they stopped sourcing GPUs from Nvidia and it seems that past history has soured relations between Apple and Nvidia. Here’s an article that gives some color on the history.
Edit: FWIW, I recall needing to get replacements for 2 consecutive generations of MacBook Pros with discrete Nvidia mobile GPUs due to GPU hardware failures circa 2005-2008, so there were clearly some issues with quality from Nvidia’s side—Apple had to eat the cost of replacing both my laptops for me due to Nvidia’s poor quality control.
It feels like you are arguing against the points you wish @tindall were making rather than what he actually wrote. Those messenger applications sending notifications for new messages are a perfect example. Those are absolutely not spam - they are a vital function of the app!
From a wider point of view, the fact that the App Store is a walled garden where you have to play by Apple's rules is not a huge issue by itself. The problem is that there are effectively no alternatives to it. And from an even broader perspective the fact that Apple regularly removes the element of choice throughout various aspects of their hardware and software is in itself anti-consumer. There are many people who actively avoid Apple products simply because they lack the choices they would like to make.
Sorry about that. I try to stay gender neutral when possible but sometimes it just slips out.
I think Linux distributions such as Qubes, Tails and Kali are pretty specialised. Who uses Kali as their default OS? Probably no-one because it is designed for pen testing. Great article though.
That is not true, you can make a local login account, even if Windows frames a local account as a "Limited Experience".
Unfortunately, like so many things with Windows 10, Microsoft is not consistent, which makes it incredibly hard to get good information. Idk if it's A/B testing or what it is.
A few months back, when all the college classes went online, I had to install Windows, and there was no such button. I did still manage to make a local account, but I had to enter an invalid phone number when it asked for one (a trick I had heard about from someone else a few months earlier, when this first started appearing); only then did the option to create a local-only account appear.
The trick I used was simply not connecting to the internet, which meant unplugging the network cable in the computer class. It was annoying.
And when you have to jump through so many hoops and fool the installer to be able to do this, it's not exactly "allowing" you to make a local account.
I would only consider it "allowing me" if it gave me the choice. It needs to be clearly on the screen: make a Microsoft account or a local account.