Python and C++ (not pictured here) are so far the most popular languages for ML and AI.

In 10 years, most software jobs won’t involve programming.

Kamila Hankiewicz
3 min read · Oct 1, 2019


It has long been discussed that, with advancing technology, we’ll soon be able to take the programming out of programming. Machine learning is seen as one such aid in software development. In a recent article in TopBots, Google research engineer Pete Warden said,

“In 10 years, I predict most software jobs won’t involve programming.”

Before the boom in Agile practices, companies relied heavily on Waterfall software development (also called the Systems Development Life Cycle). In this model, a programmer used a language such as Python or C++ to write code that delivered on the application requirements. Those requirements came from a requirements-definition stage, often involving a software business analyst sitting with a business executive. Application design and development then took place, followed by QA testing. Once satisfied, the team leader deployed the application into production, and then came maintenance. Many risk-averse entities such as banks and other financial institutions still rely on these practices.

In the meantime, developers have modified this cycle many times, such as through Agile software development “sprints” to try to speed things up and lower the up-front resource investment.

Training programs to program themselves

In this new approach to software development projects, engineers focus on preparing domain-specific data and feeding it into machine learning algorithms. The resulting models are trained and continuously improved. In this way, a machine learning model can determine the features and patterns that are important to users without a human, such as a business analyst, explicitly encoding that knowledge. Often, ML models can spot important details that humans have not yet thought of.
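To make the shift concrete, here is a minimal, purely illustrative sketch (the data, labels and threshold rule are all made up for this example) of learning a decision from labelled examples instead of hand-coding the rule:

```python
# Instead of a human writing "flag messages longer than N words",
# we learn the cut-off N from labelled examples.

def featurize(text):
    # One hand-picked feature: message length in words.
    return len(text.split())

def train_threshold(examples):
    """Learn a cut-off separating the two labelled classes.

    examples: list of (text, label), label is "spam" or "ok".
    Returns the midpoint between each class's average feature value.
    """
    spam = [featurize(t) for t, lbl in examples if lbl == "spam"]
    ok = [featurize(t) for t, lbl in examples if lbl == "ok"]
    return (sum(spam) / len(spam) + sum(ok) / len(ok)) / 2

def predict(model, text):
    return "spam" if featurize(text) > model else "ok"

training_data = [
    ("buy cheap pills now click this link today", "spam"),
    ("limited offer act fast win big money now", "spam"),
    ("see you at lunch", "ok"),
    ("meeting moved to 3pm", "ok"),
]

model = train_threshold(training_data)
print(predict(model, "free money click now win prizes today friend"))  # "spam"
```

The decision boundary here comes from the data, not from a business analyst; change the training examples and the behaviour changes with no code rewrite.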

For some tasks, humans are no match for AI because of our attention limits. Determining whether a piece of code duplicates an open-source project, for example, might require analysing a million lines of code. An effective algorithm can do that in seconds.
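As a toy illustration of why machines win here (this is not any vendor's actual tool), hashing lets a program compare huge numbers of code snippets against a known open-source corpus in a single pass:

```python
# Fingerprint snippets so that comparing our code against an
# open-source corpus is a set lookup, not a pairwise scan.
import hashlib

def fingerprint(snippet):
    # Normalise whitespace so trivial formatting changes don't hide copies.
    canonical = " ".join(snippet.split())
    return hashlib.sha256(canonical.encode()).hexdigest()

def find_duplicates(project_files, open_source_files):
    """Return project snippets that also appear in the open-source corpus."""
    known = {fingerprint(s) for s in open_source_files}
    return [s for s in project_files if fingerprint(s) in known]

oss = ["def add(a, b):\n    return a + b", "def sub(a, b):\n    return a - b"]
ours = ["def add(a, b):\n  return a + b", "def mul(a, b):\n    return a * b"]
print(find_duplicates(ours, oss))  # finds the copied add(), despite re-indentation
```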

The programmers of tomorrow will “collect, clean, manipulate, label, analyze and visualize data that feeds neural networks. Neural networks are not just another classifier, they represent the beginning of a fundamental shift in how we write software.”

said Andrej Karpathy, formerly of OpenAI and now Director of AI at Tesla.

The SDLC and Agile approaches will not disappear quickly, since the components surrounding the AI (the user interface, data management and security, for example) will still be written in the more traditional way.

Code as Data

Looking at existing code as data containing insights is a viewpoint consistent with an AI infusion into software development. Code repositories, for example, are a versioned record of everything that went into the application until it reached its current state. Metrics can be computed from program parsing, language classification and other analysis, according to a report in CIO.
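As a small sketch of the "metrics from program parsing" idea, Python's standard ast module can pull simple structural metrics out of source code (the metric names here are my own choice, not from the CIO report):

```python
# Treat source code as data: parse it and count structural features.
import ast

source = """
def greet(name):
    if not name:
        return "hello, stranger"
    return f"hello, {name}"

class Greeter:
    def run(self):
        return greet("world")
"""

tree = ast.parse(source)
metrics = {
    "functions": sum(isinstance(n, ast.FunctionDef) for n in ast.walk(tree)),
    "classes": sum(isinstance(n, ast.ClassDef) for n in ast.walk(tree)),
    "branches": sum(isinstance(n, ast.If) for n in ast.walk(tree)),
}
print(metrics)  # {'functions': 2, 'classes': 1, 'branches': 1}
```

Run over an entire repository's history, counts like these become the raw data that ML models can learn from.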

Many startups see an opportunity to help organisations bridge from Waterfall software development to AI software development. Startup Semmle created security technology that discovers new types of source-code vulnerabilities. Source{d} has developed a platform that automates code review for developers, while giving executives the data to keep IT strategy and AI adoption in sync.

Machine learning models can be built to do the coding work. Startup Diffblue uses machine learning to write unit tests for code. Code reviews can be done by programmed and trained bots before the code is submitted for QA testing. The bot could check for common mistakes, make sure the style is consistent with the existing code base, and work 24/7.
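A toy "review bot" in that spirit might look like this (purely illustrative, not Diffblue's or anyone's actual product): a few mechanical checks that run on every change before a human reviewer ever sees it.

```python
# A minimal pre-review bot: flags common style slips, tirelessly and instantly.
def review(source, max_line_length=79):
    """Return a list of style findings for a source string."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if len(line) > max_line_length:
            findings.append(f"line {lineno}: exceeds {max_line_length} characters")
        if line != line.rstrip():
            findings.append(f"line {lineno}: trailing whitespace")
        if "\t" in line:
            findings.append(f"line {lineno}: tab character (code base uses spaces)")
    return findings

snippet = "def f(x):\n\treturn x  \n"
for finding in review(snippet):
    print(finding)
```

A real bot would learn checks from the existing code base rather than hard-code them, but the workflow is the same: the machine handles the mechanical pass, humans handle the judgement calls.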

For developers entering or already working in the field, it's important to understand the architecture behind the tools and frameworks you are using, to prepare yourself for the future. Over the past 30 years the languages have changed (who uses Pascal now?), the frameworks have changed, the vendors have changed, but implementation patterns tend to repeat themselves in each era. If we understand that, we can begin to see the differences as new technologies come out and apply what we already know in these new contexts.


If you want to learn how you can apply ML and AI with Untrite’s technology to grow your business faster, drop me a message. Happy to help :)

Written by Kamila Hankiewicz

I'm all about tech, business and everything in between | @untrite.com @oishya.com @hankka.com | ex-MD Girls In Tech
