Open Source Foundational Fatigue: Is It Growing?
A few weeks ago an article appeared on “foundation fatigue” at the Linux Foundation. It touched on a few key points, among them the fatigue felt by participants in Linux Foundation projects over the cost of playing, which can approach $500,000 for some companies. The question of “pay to play” is raised, but not explored in much depth.
The angle I’d like to take on this topic is more along these lines: are the legacy/big vendors dictating the direction of these projects a little too much, are there too many projects out there, and what is the impact of all this on the spirit of what “open source” should really mean?
First off, I have some concerns that the influence exerted by some of the larger vendors is less about pushing the technology forward and more about pushing those vendors’ agendas. In my experience, vendor agendas have much more to do with protecting an installed base and MUCH less to do with advancing the technology. This issue bothers me less when the vendor involved is a legacy networking vendor with no legacy OSS products that is trying to become a player in the emerging set of tools around SDN and NFV. When there is no legacy product driving an attempt to squeeze a few extra dollars out of intellectual property, my concern lessens.
If you have read many of my blogs, you will be familiar with my doubts about the ability of legacy OSS systems to handle where we are headed as an industry, and I really don’t want to see some of those vendors hinder the innovation that community-based, open source development is supposed to bring to a project.
This leads me to my next concern, and something we at Itential have discussed many times: just how many open source projects do we need? It seems like every time you turn around another project pops up that sounds a whole lot like the project that popped up two weeks ago…or two months ago. The number of “me too” projects is a little concerning, but you have to think that eventually natural selection will weed out those that aren’t strong enough to survive.
To give some sense of the issue, I will refer to the two major organizations whose projects we deal with: the Linux Foundation and the OpenStack Foundation.
The OpenStack Foundation identifies 6 projects as “Core.” These are the ones that anyone who has ever even read an article about OpenStack has probably heard of. An additional 13 projects are listed as “Optional” components. So, that is 19 projects…not so bad, right?
Those are the projects referenced on the main OpenStack pages, which are filtered to what is ready for use now, even if still a little immature. If you dig into the OpenStack wiki and see how many projects are really out there, the count is north of 50. Some are active and on the verge of being added to the “official” list on the main page. Some are one-off projects that died on the vine from lack of interest but still exist out in the ecosystem. Some are very niche-focused, such as Tacker for NFV orchestration, but are still very active.
The Linux Foundation lists 51 projects on its website, with more being added every other week, it seems. They span many industries and, as with OpenStack, vary in maturity, which makes the sheer number a little less concerning. But there are also seemingly duplicate projects…or at least REALLY similar ones…such as ONOS/OpenDaylight and Open-O/OPNFV.
This leads to some confusion as to why multiple projects with the same goals should exist within the same organizational structure. Wouldn’t it make more sense to pool the resources of both into a common community? Seems like it to me.
Lastly, I worry that all of the above will lead to a hindrance (at worst) or a watering down (at best) of the innovation that would be possible if open source continued to mean free (as in beer), if communities were integrated and working on a common project (rather than distributed across multiple projects), and if a level of focus were maintained. Each of the organizations mentioned above is probably nearing a tipping point where hard choices about direction have to be made. We just can’t let these issues stop the momentum that open source has built up over the last 5-10 years in virtualization, NFV, and SDN. Open source is the lifeblood of these technologies and of the success possible there.
Agree? Disagree? Feel free to drop me a line to discuss…