If computer technology is the foundation of our modern working world, then coding is how we shape and expand it. Coding opens up rapidly expanding opportunities, allowing us to create software that solves problems, sweats the small stuff for us and improves our daily lives.
Coding grants us the flexibility to live and work at our most efficient.
Per the US Bureau of Labor Statistics, programming is set to be one of the fastest-growing occupations over the next 10 years. There will be 1.4 million coding jobs to fill in the US alone, with 67% of those jobs outside the technology industry. Clearly, the coding community is growing. But what about the history of coding? How did programming evolve to become what it is today?
The History of Coding
Historians recognise Ada Lovelace as the first computer programmer. Ada, born in 1815, studied mathematics, which at the time was highly unusual for a woman. After meeting Charles Babbage, the inventor of the Analytical Engine, Ada was entranced. In 1843, she published a translation of a French article about the machine, adding extensive notes of her own, which included what came to be recognised as the first algorithm intended to be processed by a machine.
Lovelace’s contribution to coding was just the beginning. In 1923, the German military combined coding and electricity, communicating via secret coded messages on the Enigma machine – a code Alan Turing and his colleagues famously cracked during the Second World War, reportedly helping end the war two years earlier than predicted. Turing had already taken the concept of the programmable machine further: in 1936, he described a more flexible machine. Unlike its predecessors, Turing’s computer could complete more than one task, due to its ability to read multiple instructions in binary code.
The drawback to Turing’s design was that it read instructions from long tapes of paper, which proved impractical. In 1948, researchers at Manchester University came up with a solution – storing the program in electronic memory. Their computer was programmed in binary code, held 128 bytes of memory, and was large enough to fill an entire room.
The late 1950s saw the invention of coding languages that are still in use today – specifically, FORTRAN, LISP and COBOL. Technology developed swiftly throughout the ’60s, with the notion of computer gaming being born, the mouse being invented, and ARPANET, the predecessor to the internet, being created. The 1970s brought further development: the high-level programming language Pascal was forged; interestingly, it is still used by Skype today.
Coding in the Workplace
Many consider the 1980s to be the golden age of technological development, and from the perspective of coding history, it probably was – 1983 saw the conception of C++, a language still in consistent use today (think Adobe, Google Chrome and Microsoft Internet Explorer), while 1987 was the year Perl debuted, a language currently used by IMDb, Amazon and Ticketmaster, amongst others.
By 1989, Tim Berners-Lee had invented the World Wide Web, which has arguably had the biggest impact upon our modern working lives. As part of that invention, Berners-Lee devised HTML, URLs and HTTP – familiar to anyone who’s ever used the internet (that would be 4.2 billion of us).
The 1990s welcomed some of today’s biggest and most recognisable names in coding – think Python, Java, JavaScript and PHP. Without these, we wouldn’t have social media, Android and iOS apps, or streaming services.
Our home lives have been revolutionised by the technology created with these languages, but what about our working lives?
Microsoft’s .NET, a framework supporting several languages and used for the company’s Office and business software, has also had a huge impact upon the modern workplace. The vast majority of office workers are au fait with Office 365, and the .NET framework has enabled those familiar applications to move beyond the desktop PC and into the cloud. This has many positive outcomes, including facilitating the rise of remote and flexible working, and the use of laptops, smartphones and tablets in place of more traditional tower computers.
Coding skills are becoming progressively commonplace across industries outside of computing and software: you no longer need a developer’s skill set to get to grips with basic code, and many of us are more versed in it than we realise. From formulas in Excel to building a simple website or app, most of us have dabbled in code at some point, perhaps without noticing.
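To see just how close everyday spreadsheet work is to programming, here’s a minimal, hypothetical sketch: an Excel-style IF formula alongside an equivalent snippet in Python. The formula, sample scores and pass mark are all invented for illustration.

```python
# In a spreadsheet cell you might write (hypothetical example):
#   =IF(A1 >= 50, "Pass", "Fail")
# The same logic, expressed as a small Python function:

def grade(score: float) -> str:
    """Return "Pass" or "Fail", mirroring the spreadsheet IF formula."""
    return "Pass" if score >= 50 else "Fail"

# Applied to a column of scores, much like filling the formula down a sheet:
scores = [72, 48, 65, 90]  # invented sample data
results = [grade(s) for s in scores]
print(results)  # ['Pass', 'Fail', 'Pass', 'Pass']
```

The point isn’t the syntax – it’s that anyone who has written a conditional formula has already expressed the same kind of logic a programmer writes every day.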
The Future of Coding
Something we should keep in mind is the future of coding, particularly in the workplace. While plenty of us have a functional knowledge of programming languages, a shortage of experts is expected. According to some estimates, there will be only around 400,000 computer science graduates to fill those 1.4 million jobs, leaving roughly 1 million positions unfilled.
This is exactly why educational programmes the world over are adding coding to their curricula – and who knows just how those future programmers will help change our workplace technology?
One thing is for sure: while ever-evolving technology is inevitable, coding will remain at the centre of every development.