Software 2.0: What is it and why is it important?

You’ve seen classic Software 1.0 coding structures: meet Software 2.0, in which code is written by an optimisation process in the form of neural network training.

Software didn’t just revolutionise computing, it changed human culture forever.

In a little under a century, we went from being constructors, animals that manually built our tools, to the only species on Earth that programs its tasks; from fighting wars with bayonets to presidents being granted the codes for nuclear missiles. We moved from generally settling in small communities and keeping ourselves to ourselves, to branching out further than before, moving to far corners of the globe and keeping in touch via digital means. We transitioned from books and written knowledge to effectively democratising learning, enabling autodidacts to teach themselves about the world via Wikipedia and YouTube.

The words that you’re reading are possible because of software. The device you’re reading them on, too. So what if we could upgrade software itself? What if a shockwave ripped through the very fabric of the digital world? What could the next age of humanity look like? Enter Software 2.0, the idea that software could write itself.

This is a familiar concept to those interested in machine learning. This new brand of software is written in languages more complex than those of Software 1.0; there’s no need for a human to understand exactly what’s being processed. Software 2.0 is written by a neural network: instead of programming into the language everything that you want produced as a final result, Software 2.0 can be seen more as a shopping list of things you want the computer to work out how to do itself.

Frameworks such as TensorFlow and Keras work in a similar way to this. By defining the architecture of the neural network, you can then train it on your data to produce a final result.
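As a minimal sketch of that workflow (the dataset, layer sizes and training settings below are invented for illustration, not taken from this article), you describe an architecture and a goal, and the optimiser finds the weights – the "code" – for you:

```python
# A minimal sketch of the Software 2.0 workflow using Keras (TensorFlow).
# The toy dataset and layer sizes are illustrative assumptions.
import numpy as np
import tensorflow as tf

# Toy data: 1,000 examples with 20 input features and a binary label.
x_train = np.random.rand(1000, 20).astype("float32")
y_train = (x_train.sum(axis=1) > 10).astype("float32")

# "Program" the desired behaviour by describing an architecture...
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# ...and a goal (a loss) for the optimiser to pursue.
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Training searches for weights that satisfy the goal: the "code" writes itself.
model.fit(x_train, y_train, epochs=5, batch_size=32, verbose=0)
```

The human specifies the shape of the program and what success looks like; the training loop fills in the rest.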

How do neural networks work?

Neural networks are sets of algorithms loosely modelled on biological brains. They interpret data by recognising patterns, labelling and grouping them, and they aim to replicate how we think in order to produce a more intuitive, realistic result than something that has to be hand-coded.

Take a spam filter, for example. Given a collection of emails as data, an algorithm could learn to tell spam from genuine emails, perhaps from the sender addresses. As the algorithm analyses more emails, it may discover that the language used in spam differs from that in legitimate mail. A neural network will begin to learn the subtler hallmarks of spam and filter them out.

This is radically different from how conventional coding works: hand-written code does not learn. You can code an email program to place suspicious-looking email addresses into a folder, but it won’t pick up on why they are suspicious.
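A rough sketch of that contrast (the emails, labels and keyword list below are invented examples): a Software 1.0 filter encodes the rule by hand, while a learned filter infers it from labelled examples.

```python
# Contrast a hand-coded (Software 1.0) spam rule with a learned (Software 2.0)
# filter. The emails, labels, and keyword list are illustrative only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

emails = [
    "win a free prize now",        # spam
    "claim your free vacation",    # spam
    "meeting notes attached",      # genuine
    "lunch tomorrow at noon?",     # genuine
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = genuine

# Software 1.0: a human writes the rule explicitly.
def rule_based_is_spam(text):
    suspicious_words = {"free", "prize", "claim"}
    return any(word in text.lower().split() for word in suspicious_words)

# Software 2.0: the rule is learned from labelled examples instead.
vectoriser = CountVectorizer()
features = vectoriser.fit_transform(emails)
classifier = MultinomialNB().fit(features, labels)

new_email = "free prize inside"
print(rule_based_is_spam(new_email))                             # True
print(classifier.predict(vectoriser.transform([new_email]))[0])  # 1 (spam)
```

The hand-written rule only ever knows the keywords it was given; the learned filter can pick up on patterns its author never thought to specify.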

Software 2.0: How neural networks work

Rather than the output being a simple weighted sum of the input data, hidden layers are introduced between the inputs and the outputs. With several layers between the input and output data, each performing its own computation, the network carries out what is called “forward propagation” in order to produce an output that can then be compared against the expected data.
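To make forward propagation concrete, here is a bare-bones sketch in NumPy (the layer sizes, random weights and squared-error loss are illustrative assumptions): the input is passed layer by layer to produce an output, which is then compared against the target.

```python
# A bare-bones sketch of forward propagation through one hidden layer.
# Layer sizes, weights, and the squared-error loss are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

x = rng.random(4)            # input vector (4 features)
target = np.array([1.0])     # expected output

# Randomly initialised weights and biases for the hidden and output layers.
W1, b1 = rng.random((8, 4)), rng.random(8)
W2, b2 = rng.random((1, 8)), rng.random(1)

# Forward propagation: each layer computes a weighted sum plus a non-linearity.
hidden = np.maximum(0, W1 @ x + b1)              # ReLU activation
output = 1 / (1 + np.exp(-(W2 @ hidden + b2)))   # sigmoid activation

# Compare the output against the data; training would then adjust the weights
# (via back-propagation) to reduce this error.
loss = np.mean((output - target) ** 2)
print(output, loss)
```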

Neural networks might sound complex but, in reality, they could end up simplifying software. The instruction set of a neural network is small, so it is easy to place one onto a low-cost chip. Plus, if different Software 2.0 modules could interact, it could become possible for a web browser, for example, to automatically re-tune its own components for better efficiency.

Will Software 2.0 catch on?

Human beings are complex and tend to be pulled in two opposing directions. We’re simultaneously stuck in our ways, almost fearful of change, yet excited by innovation and the possibility of the world around us developing. Are we too set in our existing ways for Software 2.0, or is this an upgrade we can’t ignore?

Software 2.0 offers a number of benefits that can’t be ignored by developers. From its portability to its potential to run with low power consumption, this new way of working could be much more efficient than Software 1.0 in a number of respects. On the other hand, we still don’t know 100% of what there is to know about neural networks – the brain they imitate is famously complicated, after all – and so when mistakes and failures inevitably crop up, it can be difficult to find and eradicate them.

Perhaps the biggest benefit, however, is simply that the 2.0 version of software would be a lot easier to work with day-to-day, assuming of course that everything goes to plan. If an algorithm is difficult to create by hand, it makes sense to create it with Software 2.0: it’s a more intuitive way to code.

It’s unclear just yet whether Software 2.0 will change the planet or simply add another dimension to development. What’s certain, though, is that this new way of writing code will challenge existing structures.

Luke Conrad

Technology & Marketing Enthusiast
