I agree that there is a big difference between "I made this to do some FOO thing, and I am the only real intended user. It's not really meant for a community of users to do meaningful, repeated work with; it's meant to do a one-off task" and "Here's my newest offering, please pay me 60 US dollars. (But I totally did not give any real effort toward polish or performance; I prioritized my own convenience instead.)"
I am pretty sure that the arguments being presented are about the latter.
The point I raised was that OS vendors these days don't WANT you hooking the actually performant underlying OS calls directly, because they want to change them like a teenage girl changes clothes. They want you to hook a middle abstraction layer, which the OS vendor then shims however it wants as it churns the underlying API. (The problem here is that the OS maker starts getting lax itself, and starts chaining its own middle layers on top of each other, in ever-increasing layers of shims for shims for shims, or never completely replacing one technology with another. See also the complete shitshow that is the "Settings" app versus "Control Panel" on modern Windows, in which SOME functions are now done one way while others are not, etc.)
As an application developer, being stuck holding the bag because the OS vendor is a fickle and imperious tyrant about that kind of thing means having to live with that inefficiency, because the alternative is continually broken software, and the implication that you don't know what you are doing or don't care at all.
In other cases, said application developer has no choice but to target an abstracted set of interfaces to accomplish what they need done, because that thing involves interacting with a hardware device that the OEMs out there fight for dominance over (like video cards and 3D acceleration features). Dealing with these bespoke things is exactly what these abstraction libraries are all about: instead of trying to write code for the 30+ offerings from every OEM out there, you target one set of APIs with a minimum feature level, and do the best you possibly can with that.
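The "one set of APIs with a minimum feature level" approach can be sketched roughly like this. Note this is a purely illustrative mock, not any real graphics API; the names (`Device`, `feature_level`, `pick_render_path`) are made up for the example.

```python
# Hypothetical sketch: code to an advertised feature level, not to a vendor.
# One baseline path that the minimum level guarantees, plus optional paths
# gated on advertised extensions -- instead of 30+ per-OEM code paths.

REQUIRED_FEATURE_LEVEL = 11  # our assumed minimum supported level


class Device:
    """Stand-in for whatever handle the abstraction layer gives back."""

    def __init__(self, feature_level, optional_extensions):
        self.feature_level = feature_level
        self.optional_extensions = set(optional_extensions)


def pick_render_path(device):
    """Choose a code path from capabilities the device advertises."""
    if device.feature_level < REQUIRED_FEATURE_LEVEL:
        raise RuntimeError("device does not meet the minimum feature level")
    # Use an optional feature only when it is advertised; otherwise fall back
    # to the path the minimum feature level guarantees everywhere.
    if "fancy_compute" in device.optional_extensions:
        return "compute_path"
    return "baseline_path"
```

The point is that the application never asks "which vendor is this?", only "what level and extensions do you report?", which is what lets one binary cope with hardware the developer has never seen.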
In these cases, there isn't much the application programmer can do beyond writing the most efficient program logic they can in their own program, and optimizing how they make these calls. That's about it. They aren't responsible for the shitshow the OS is doing, and realistically can't do much about it.
The sins start to crop up when you, as an application developer, start taking extra helpings of libraries, often for things you really don't need a library for, other than your own convenience. (That is, things that are not actually essential like the cases above: say, a library to easily talk to a serial port, or a library to do some fancy string manipulation with easier-to-use primitives. The potential here is basically limitless.)
Once you start doing that to save time for yourself, you are wasting the end user's system resources to load libraries of which 98% goes unused by your application, but the memory is consumed to load the whole library anyway, because that's how this works. And all just because you didn't want to actually write a string manipulation function yourself.
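For a sense of scale: the kind of "fancy string manipulation" that people pull a whole dependency in for is often a few lines in the language you already have. This is an illustrative example (the function and its behavior are mine, not from any particular library):

```python
# A typical "utility library" string helper, written by hand instead:
# shorten a string to a maximum length by eliding its middle.

def truncate_middle(s, max_len, ellipsis="..."):
    """Return s unchanged if it fits, else elide the middle to max_len chars."""
    if len(s) <= max_len:
        return s
    keep = max_len - len(ellipsis)   # characters of s we can keep
    head = (keep + 1) // 2           # keep slightly more at the front
    tail = keep - head
    return s[:head] + ellipsis + (s[-tail:] if tail else "")
```

A dozen lines, zero bytes of someone else's library mapped into the user's address space. Obviously some problems genuinely warrant a dependency; the sin is reaching for one reflexively.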
When enough software does that, you end up with the kinds of bloat that we old bastards here (myself included) find so distasteful.
If you are making a program for just yourself, for a single one-off thing, then sure: you are perfectly legit in not giving two shits, as long as it works. You are the only one who has to use that software anyway. Perfectly OK.
It's when you try to sell it, or your software becomes *THE* solution for a problem out there, that things get bad. If you sell your software, you really REALLY should be more considerate of the end user's system and its resources. You should make every effort to make your software a good guest, not a Karen demanding to see the manager, with elevated privileges.
That's my view on the matter anyway.