The software development ecosystem: the trends for 2017

During my third participation in Stratégies PME, I had the opportunity to give a presentation on the trends for 2017. So, I’ve looked into my crystal ball in order to share my vision of what 2017 has in store for us regarding software development. Here is a summary of what I see!

Agility and continuous delivery

At Pyxis, we’ve been applying Agile principles to carry out our software development projects for over 15 years now. Since then, Agility has spread to other fields: more and more marketing teams, for instance, are using Agile approaches. That is also the case at Pyxis (here is an article on the subject). Even though this way of working is now mainstream for many people, it remains the recommended approach for development teams in 2017.

Continuous delivery makes it possible to deliver more frequently, and to test and gather client feedback on a regular basis. One of its great benefits is the ability to react to change, whether technological change or change in the users’ needs.

On the other hand, it’s a myth to believe that Agility and continuous delivery will reduce development time. Agile methods allow us to achieve smaller targets more quickly and to ensure that we’re still aligned with the actual needs; the cycle time, however, remains unchanged.


DevOps

DevOps is a term we hear more and more often. This approach brings the development and operations teams together: it facilitates communication and automates the entire process from development to operations. To learn more about DevOps, you can read my colleague’s post.

I also include quality control in the DevOps world, and therefore the relationship with the QC teams. It is important that the quality control coordinators take part in the complete cycle, from development to operations. The result: a software application with a truly controlled quality.

Here are the projections for 2017 regarding the adoption of DevOps: the approach should continue its rise in popularity and be even more present in organizations in 2017.



Source: RightScale 2016 State of the Cloud Report


One of the roles of DevOps in facilitating continuous application delivery is deploying the application to an infrastructure where it can run. For a website, that’s the server; for client applications, we’re talking about the installation process…

One of the things DevOps takes into account is virtualization. Nowadays, it would be wrong to assume that the entire infrastructure of our system will run on physical machines; it would be far too expensive.



Virtualization

Traditionally, we installed our application directly on a server. The biggest problem with this approach is that if we wanted the application to run on both Linux and Windows, two machines were required, and they were too often underused. Generally speaking, if you check the CPU usage on your workstation, you’ll notice that the machine is waiting 99% of the time.

Virtualization partly resolves this problem: a virtualization layer is installed on a server, and many operating systems and applications can then be deployed on top of it, giving a better return on investment through better use of the hardware.

Virtualization is increasingly widespread; in fact, the cloud is built on it. For example, whatever web hosting service provider you use, your website runs on virtualized infrastructure: the provider shares the environment between different clients to maximize the use of resources and reduce operating costs. As a result, virtual environments are increasingly small, flexible, and tailored to each client’s needs.

Initially, virtualization was mainly used for test and quality control environments. Nowadays, virtual production environments are reliable enough that they are increasingly common.

Virtualization has its share of challenges, though: there are now many small systems to deploy on larger ones. One of the trends is to script and automate these deployments, an approach known as infrastructure as code.


Infrastructure as code

Infrastructure as code makes it possible to automate deployments across environments (development, test, staging, production). The infrastructure becomes one of the project’s deliverables, just like the source code.

Since the infrastructure is part of the development process, developers gain a better awareness of the production environment’s structure, which prevents errors during releases.
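The core idea can be sketched in a few lines. This is not a real provisioning tool; the environment names, fields, and the `create-vm` command are hypothetical. What matters is that each environment is described as data kept in version control, and the same code turns that description into a reproducible deployment plan.

```python
# A minimal sketch of the "infrastructure as code" idea: each environment
# is described as data that lives in version control alongside the source
# code, and one function turns that description into provisioning commands.
# The fields and the "create-vm" command are hypothetical.

ENVIRONMENTS = {
    "development": {"servers": 1, "os": "linux", "memory_gb": 2},
    "staging":     {"servers": 2, "os": "linux", "memory_gb": 4},
    "production":  {"servers": 4, "os": "linux", "memory_gb": 8},
}

def provision_plan(name):
    """Return the (hypothetical) provisioning commands for one environment."""
    spec = ENVIRONMENTS[name]
    return [
        f"create-vm --os {spec['os']} --memory {spec['memory_gb']}G"
        for _ in range(spec["servers"])
    ]

# The same code deploys every environment, so production is built exactly
# like staging was tested -- only the data differs, which is what prevents
# surprises during releases.
plan = provision_plan("production")
print(len(plan))  # 4 identical, reproducible server definitions
```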


Types of clouds

The trend shows that cloud usage is no longer private or public, but rather private and public: more and more organizations use both private and public cloud services for a single application, notably to better manage data confidentiality. In my opinion, the security of public cloud services is often better than what private organizations are able to manage on their own. Security is a constant race against hackers, and cloud services are often more up to date than average.




Internet of things

IoT is exploding. Since 2008, the number of devices connected to the Internet has exceeded the world population. These new devices generate a lot of data and need to communicate with each other, which requires more and more IP addresses; soon, IPv4 will no longer be able to provide them. Luckily for a gadget lover like me, IPv6 will meet the demand for new addresses. In fact, it offers 2^128, or about 3.4 × 10^38, possible IP addresses.
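The arithmetic behind that number is simple to check: IPv6 addresses are 128 bits long, versus 32 bits for IPv4.

```python
# IPv6 addresses are 128 bits long, so the address space is 2 ** 128,
# compared with 2 ** 32 for IPv4.
ipv4_addresses = 2 ** 32
ipv6_addresses = 2 ** 128

print(ipv4_addresses)   # 4294967296
print(ipv6_addresses)   # 340282366920938463463374607431768211456

# Every single IPv4 address could be replaced by 2 ** 96 IPv6 addresses.
print(ipv6_addresses // ipv4_addresses == 2 ** 96)  # True
```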



What about you? What do you see when looking into your crystal ball?
