I wanted to post this as some context around operating system development, partly as a complement to this response and partly just because it's something that's been discussed here a lot.
This is one of my favorite pieces of tech journalism and speculation, at least in hindsight; it asks a fateful question:
Will Windows Vista be the last of its kind?
Of course, it wasn't; that was Windows 7. But, nonetheless, their predictions are mostly spot on:
a focus on "better support for the emerging multi-core technology"
a concerted response to the threat of "free or ad-supported software delivered over the Internet"
that "nobody will be willing to spend much more than token amount, or [will] expect free usage"
The critical takeaway here, for me, is that:
[The] estimated payroll costs alone for Windows Vista hover around $10 billion.
That's a staggering amount of money: about $14 billion in today's dollars. I can't get a straight number on how much Windows 10 cost to develop, but estimates on various sites range from $15 billion to $18 billion.
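Quick sanity check on that "today's money" figure, assuming the usual CPI-ratio adjustment and that "today" means roughly Windows 11's release (2021); the CPI values below are approximate annual averages I'm supplying myself, not anything from the article:

```python
# Rough inflation adjustment for Vista's ~$10B payroll estimate.
# CPI-U annual averages are approximate assumptions, not sourced figures.
VISTA_PAYROLL_BILLIONS = 10.0
CPI_2006 = 201.6  # assumed average CPI-U for 2006, around Vista's release
CPI_2021 = 271.0  # assumed average CPI-U for 2021, roughly "today"

adjusted = VISTA_PAYROLL_BILLIONS * (CPI_2021 / CPI_2006)
print(f"~${adjusted:.1f} billion in 2021 dollars")  # ~$13.4 billion, same ballpark as ~$14B
```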
Either way, it's expensive as fuck, and with Windows 11 surely costing not much less, I suspect we'll see more revenue-generating features coming to consumer Windows soon.
It's also worth comparing this with the amount of money spent on the free desktop space; maybe a few million per year at most.
I don't know if Vista's development is a good metric for what it costs MS to develop a modern Windows version. XP to Vista was such a massive leap in features, UI, and security compared to everything since. Maybe I'm way off the mark here since I'm speaking as a consumer and not a dev, but 7 was just a polished-up Vista. 8 was a big jump in terms of UI, but then 10 was just a polished-up 8. And 11, while I haven't used it yet, doesn't seem like a massive change either. Certainly not on the scale of XP to Vista.
Yeah, I think you're missing some pretty important behind-the-scenes changes. For one thing, the jump from Vista to 7 involved rewriting huge parts of the kernel, which helped address much of the performance penalty of running Vista. The jump from 7 to 8, and in some ways from 8 to 10, involved a lot of work in the extremely developer-hostile environment of UEFI firmware to implement their custom fastboot and "how dare you boot anything other than Windows" code.
I'd be willing to bet the costs of the 7 to 8 transition, the 7 to 10 transition, and now the 10 to 11 transition are each no less than a tenth of what Vista cost, especially with the inflation of developer salaries over time.
Edit: I'm really curious to know where the money goes when making a product which I, a non-developer, consider to be a terrible product.
I've never worked on an OS or desktop applications, but as a professional software developer who's worked on multiple products that I'm pretty sure users thought were terrible (I'm almost certain this describes every developer with more than a few years' experience), here's my take: it's almost always due to management floundering, usually stemming from unclear or nonexistent direction from the company's executives. In the absence of clear overarching goals, priorities and requirements are constantly in flux, and things get changed or rewritten repeatedly, resulting in a spectacular waste of resources and an unfocused, incoherent final product.
Particularly incompetent or malicious individuals at the middle manager level can cause this sort of thing within their sphere of influence if there's insufficient executive oversight, but in my experience this is rarer. Individual developers can't really cause disasters on the scale of Vista (unless an entire development team is incompetent, which is really on the managers who hired them).
Before anyone asks, I actually don't think software is unusually prone to this particular failure mode (though the extremely short/cheap release process probably leads to more issue resolution in the public eye). There are regular product development disasters in pretty much every field I can think of. If you were going to ask that, or the question interests you, I highly recommend this series of articles comparing software to "traditional" engineering.
Yeah, I would also love to see that. I've spent a lot of time looking at the MS shareholder reports, but they're not very informative; my guess, if anything, is that a lot of time is spent on QA and ensuring compatibility with older versions. That was especially critical for Vista, which replaced the still-widely-used XP and needed that compatibility to get any significant adoption in business. FOSS has a significant advantage there, because compatibility in things like kernel interfaces isn't as critical: on Windows, all you have to go on are those interfaces, whereas in a FOSS environment your custom kernel module can take full advantage of what the kernel offers, and you can see exactly what has changed between versions. If something does break, it's far, far easier for manufacturers and business customers to fix it quickly.
I was expecting a more detailed cost breakdown; a simple product of salaries and time is a bit underwhelming, although illustrative.
I'm not going to consider the amount given to be even remotely accurate. The Seattle Times link (archived here) says that BusinessWeek estimated it at 10,000 employees over 5 years, after Steve Ballmer said "it cost a lot".
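For what it's worth, that kind of estimate really is just headcount times years times a fully loaded cost per employee. Here's a minimal sketch of the arithmetic; the ~$200k/year figure is my own assumption, chosen only to show how the numbers land near the $10 billion quoted upthread, not something from the BusinessWeek piece:

```python
# Back-of-the-envelope reconstruction of the ~$10B payroll figure.
# The per-employee cost is an assumed fully loaded annual cost
# (salary, benefits, overhead), picked purely for illustration.
EMPLOYEES = 10_000                 # from the BusinessWeek estimate cited above
YEARS = 5                          # ditto
COST_PER_EMPLOYEE_YEAR = 200_000   # assumed fully loaded $/year

total = EMPLOYEES * YEARS * COST_PER_EMPLOYEE_YEAR
print(f"~${total / 1e9:.0f} billion")  # ~$10 billion
```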