History of Hardware and Software
Microcomputers came into existence in the 1970s, ushering in the age of the personal computer and bringing computers into homes. The IBM PC, introduced in 1981, and its many clones featured a modular design that let consumers add and swap components within the system. This upgradeability enabled IBM-compatible PCs to rapidly achieve prominence in the home and business computer markets.
At the same time, other desktop systems and workstations arose, such as computers made by DEC, Sun, and SGI, mainly for business use. Mobile computing took off in the 1980s with the Osborne 1 portable computer, a predecessor of the modern laptop, followed by the IBM ThinkPad tablet in 1992. Personal digital assistants (PDAs) and smartphones flourished in the early 2000s, making it possible for consumers to take computers wherever they went.
All computer software depends on, and is created with, some form of programming language. The earliest programs were written in machine-dependent languages, which demanded significant expertise and training and made software development highly expensive. Later high-level languages required a compiler to translate them into machine-readable code and were designed to develop software for the military, the scientific community, and businesses.
When personal computers entered the market, simpler, interpreted languages such as BASIC made software coding a task that even home computer users could master. Interpreted languages did not need a separate compilation step, which made them easy to debug and quick to modify and rerun.
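A toy sketch, in modern Python rather than period BASIC, of why interpreted languages felt so approachable: each statement is read and executed immediately, with no separate compile step. The tiny LET/PRINT dialect below is invented purely for illustration.

```python
# A toy line-by-line interpreter for a BASIC-flavored mini-language.
# Illustrative only: real BASICs were far richer than this.

def run(program):
    variables = {}
    output = []
    for line in program.strip().splitlines():
        keyword, rest = line.strip().split(" ", 1)
        if keyword == "LET":        # e.g. LET X = 5
            name, expr = rest.split("=", 1)
            variables[name.strip()] = eval(expr, {}, variables)
        elif keyword == "PRINT":    # e.g. PRINT X * Y
            output.append(eval(rest, {}, variables))
    return output

print(run("""
LET X = 5
LET Y = X + 2
PRINT X * Y
"""))  # -> [35]
```

Each line takes effect the moment it is read, which is exactly what made this style of language easy to experiment with at home.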
Another important language was Logo, developed in 1967 to help children get involved in programming. Early software was bundled with the computers it was written for, meaning that to get the software they wanted, customers also had to buy the hardware with it. The Digidyne v. Data General antitrust case, settled in the mid-1980s when the Supreme Court declined to review it, effectively put an end to this practice, while the free software movement arose in the same decade, led by pioneers such as Richard Stallman, founder of the GNU Project.
But the transition to using a time-sharing model instead of batch processing for running programs was perhaps most significant of all because it led to a rapid growth in computing applications. Unfortunately, projects consistently failed to deliver reliably, on time and on budget. Practitioners were forced to admit that they lacked the proper best practices to implement and produce software at scale commercially.
They called it the "Software Crisis". It was clear that designing complex software systems would require better tools and approaches than were available at the time, so the NATO Software Engineering Conference was convened in 1968 to find a solution.
This is really where the term "Software Engineering" found its roots. The conference sought to apply the best practices of project management and production -- already used in traditional engineering disciplines -- to software. As a result, they produced a report which defined the foundations of software engineering. Over the following decades, the discipline of programming saw a familiar tension between the scientific thinking of academia, which tended to seek idealized solutions to engineering challenges, and the practical needs of an industry faced with real-life time and cost pressures and bloated code bases.
The early 1970s saw the emergence of key ideas in systems thinking that allowed engineers to break these giant projects into modular, far more manageable pieces that communicated via interfaces. Another tectonic shift occurred in the early 1980s with the move away from thinking of data as just a continuously changing stream and towards the idea of persisting discrete "objects" that could interact and hold independent state.
More concretely, that shift allowed developers to create and interact with the almost-physical objects of the graphical user interface (GUI), such as menus, icons, and windows. You're probably familiar with the derived term "object-oriented programming". The decades leading up to the present day were marked by astounding increases in computing power following Moore's Law.
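The idea of objects holding independent state can be sketched in present-day Python; the Window class and its move method here are invented for illustration, not taken from any period system.

```python
# Each object bundles its own state (title, position) with the
# operations that act on it -- the shift described above.
class Window:
    def __init__(self, title, x=0, y=0):
        self.title = title
        self.x, self.y = x, y

    def move(self, dx, dy):
        """Shift the window by (dx, dy) on screen."""
        self.x += dx
        self.y += dy

# Two windows respond to the same messages but hold independent state.
a = Window("Editor")
b = Window("Terminal", x=100)
a.move(10, 20)
print(a.x, a.y)  # -> 10 20
print(b.x, b.y)  # -> 100 0
```

Moving one window leaves the other untouched, which is precisely the "independent state" that made GUI elements like menus and windows natural to model as objects.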
This new computing power wasn't entirely beneficial to the state of the industry. Where before engineers needed to be very careful to design efficient programs that could run with the limited memory and processing power of the day, reliance on raw power led to some backwards steps in the quality of code written.
This has led some to decry the rise of "wasteful" software. Source: World Science Festival.
Once software was unbundled from hardware, commercial software became available to the average customer for the first time, and the ability to add different types of programs to any computer quickly became popular.
Software has become more and more complex over the years. In the early days, software accepted only typed keyboard commands. Because floppy disks could hold only a very small amount of data, and most personal computers had no hard drive at all, software had to be very simple. That changed as computer hardware evolved: once hard drives became standard in personal computers, software could be installed before the machine left the distributor.
It also became possible to load larger pieces of software onto computers without sending the customer a stack of disks. Users could now switch between different pieces of software without changing disks, which made computer work much more efficient. When CD-ROMs became standard, larger software could be distributed quickly, easily, and fairly cheaply. CDs held far more information than floppy disks, and programs that were once spread across a dozen floppies fit on a single CD.
They quickly became the standard in software distribution, and by the mid-2000s, floppy disk drives were no longer a standard feature on computers. The creation of DVDs, which hold even more than CDs, made it possible to put bundles of programs, such as the Microsoft Office suite, all on one disc. Thanks to the internet, however, even DVDs are now becoming obsolete.
Many people purchase and directly download their software without any kind of physical medium to contain it, which reduces costs further because nothing needs to be manufactured or shipped to the consumer. Looking ahead, advanced programs may be able to develop code for new programs based on what the user enters or needs; they may even be able to create newer, improved versions of themselves or design entirely new operating systems.
BSC Designer Team.