A computer system comprises hardware and software components that together form a powerful computational tool. These systems play a crucial role across diverse domains, aiding us in numerous tasks. The prevalence of the internet has significantly bolstered the use of computers for information sharing and communication. Computer systems empower us to store, process, display, and transmit information, and even a basic modern computer system typically requires multiple programs to carry out its various functions.


Wednesday, September 3, 2025

Foundations of Computing in 19th-Century England

The history of computers in England during the early 19th century reflects a remarkable blend of visionary thinking and mechanical ingenuity. At the center of this story was Charles Babbage, an English mathematician, engineer, and inventor who is widely hailed as the “father of the computer.” Babbage was deeply concerned with the inaccuracy of mathematical tables, which were manually computed and often riddled with errors. To solve this problem, in 1821 he proposed the Difference Engine, a steam-powered mechanical calculator designed to automatically produce accurate mathematical tables. The following year, on 14 June 1822, he formally presented his idea to the Royal Astronomical Society in a paper titled Note on the application of machinery to the computation of astronomical and mathematical tables. His vision was revolutionary because it aimed to replace human error with mechanical precision.

Although the British government initially supported the project with funding, the Difference Engine was never fully completed due to the complexity of its engineering, high costs, and political disputes. Nevertheless, the attempt marked a turning point, showing that machines could be designed to carry out sophisticated calculations.

By 1837, Babbage had moved beyond his earlier design and imagined the Analytical Engine, an even more ambitious invention that resembled a modern computer in both concept and architecture. Unlike the Difference Engine, which was limited to specific calculations, the Analytical Engine was designed as a general-purpose computing machine. It included components comparable to those of today’s computers: the “mill” (a precursor to the CPU), the “store” (for memory), and input/output devices known as the “reader” and “printer.” It also introduced groundbreaking ideas such as conditional branching, loops, and the ability to store instructions, making it the first design we would now call Turing-complete. The machine would have been programmable using punched cards, a concept borrowed from the Jacquard loom used in textile manufacturing.

A key figure in this story was Ada Lovelace, an English mathematician and close collaborator of Babbage. In 1843, she translated and expanded upon a paper about the Analytical Engine, adding her own extensive notes. These notes included the first algorithm ever intended for machine execution, effectively making her the world’s first computer programmer. She also foresaw the broader potential of computing, suggesting that such machines might one day manipulate not only numbers but also symbols, music, and language.

Although neither the Difference Engine nor the Analytical Engine was built during their lifetimes, the pioneering work of Babbage and Lovelace laid the intellectual foundation for modern computing. Their vision, conceived in 19th-century England, continues to influence technology and inspire innovation well into the digital age.

Friday, January 7, 2022

History of Quattro Pro

Quattro Pro was a spreadsheet application originally developed by Borland Software; it was renamed Corel Quattro Pro when Corel purchased it. Features lacking in earlier versions of Quattro Pro were later added once the program became part of the WordPerfect Office suite.

Quattro Pro was designed as a contender to Microsoft Excel and as an alternative to Lotus 1-2-3.

Borland started in spreadsheets in 1987 with a product called Quattro. Word has it that the internal code name was Buddha because the program was intended to “assume the Lotus position” in the market. Borland changed the name to Quattro Pro for its 1990 release.

Similar to Lotus 1-2-3, Quattro Pro was first released as a DOS program and was later rewritten for the Windows platform.

In the fall of 1989, Borland began shipping Quattro Pro, which was a more powerful product that built upon the original Quattro and trumped 1-2-3 in just about every area.

In 1991, Borland's marketing claimed that two industry studies confirmed its position: customers ranked Borland best among software companies, and Quattro Pro outperformed all Lotus spreadsheets.

Like Lotus, Borland was slow to jump on the Windows bandwagon. When Quattro Pro for Windows finally shipped in the fall of 1992, however, it provided some tough competition for the other two Windows spreadsheets, Excel 4.0 and 1-2-3 Release 1.1 for Windows. Since about 1996, Quattro Pro has run a distant second to Excel's market domination.

In 1994, Novell purchased WordPerfect International and Borland’s entire spreadsheet business. In 1996, WordPerfect and Quattro Pro were both purchased by Corel Corporation.

Corel Quattro Pro is a spreadsheet application used to process business and financial transactions of various kinds. To carry out its various tasks, Corel Quattro Pro organizes each of its notebooks into worksheets.

Friday, February 10, 2017

Search engine technology in history

Over the years, many search engines have been created to help users find desired information on the web.

Search engines existed even before the invention of the World Wide Web. The roots of web search engine technology are in information retrieval (IR) systems, which can be traced back to the work of Luhn at IBM during the late 1950s.

Search tools such as Archie and Veronica searched using the FTP and Gopher protocols long before HTTP came into play.

Early search technology was very primitive: with Archie one could search for file names, while with Veronica one could search for text files and file names.

The 1990s was the decade of the World Wide Web, built over the physical infrastructure of the internet, radically changing the availability of information and making possible the rapid dissemination of digital information across the globe.

By the mid-1990s, many thousands of pages were being added to the World Wide Web each day.

The availability of graphical browsing programs such as Mosaic, Netscape, and Microsoft Internet Explorer made it easy for ordinary PC users to view web pages and to navigate from one page to another.

Thursday, September 22, 2016

Google Books Project

Google Books is a giant catalog of millions of books on almost any topic. Google intends to scan every book ever published and to make the full texts searchable, in the same way that websites can be searched on the company's search engine at google.com.

Google provides full access to public domain books (or those for which permission has been obtained from the publisher).

For copyrighted books, there is a limited ability to search by keyword and view a limited number of pages. The Google Books Project does not put entire books online for free. If a book is still protected by copyright, the user's search results will show only the brief section of the book that includes the word or phrase searched and the page number on which it appears, along with details about the book.

The project started around 2003, when Google approached the Library of Congress with a proposal to digitize all the books in the library. The Library of Congress offered a counterproposal that would include only public domain books, and Google did not follow up.

In December 2004, Google announced partnerships with libraries at the University of Michigan, Harvard, and Stanford, the Bodleian Library at Oxford, and the New York Public Library to digitize all or large portions of their print collections, which would cover more than 15 million volumes.

The number of partners continues to grow. The University of California, the University of Wisconsin, the University of Virginia and the Universidad Complutense de Madrid joined in 2006.

In collaboration with the participating libraries, Google scans books in the public domain along with copyrighted books. The company also intends to scan out-of-print books, which benefits individuals searching for books that might otherwise be difficult or impossible to find.

Sunday, March 27, 2016

Cross-site scripting

Cross-site scripting is one of the problems that has plagued a great many websites. Cross-site scripting, or XSS, is a term for a category of security issues in which an attacker injects HTML tags or scripts into a target website.

Cross-site scripting vulnerabilities date back to 1996, during the early days of the World Wide Web: a time when e-commerce began to take off, the bubble days of Netscape, Yahoo, and the obnoxious blink tag.

In December 1999, David Ross was working in security response for Internet Explorer at Microsoft. He was inspired by the work of Georgi Guninsky, who was at the time finding flaws in Internet Explorer's security model.

David demonstrated that web content could perform 'script injection', effectively bypassing the same security guarantees that Georgi's Internet Explorer flaws bypassed, but with the fault seemingly on the server side rather than in the client-side Internet Explorer code.

Cross-site scripting can be performed by passing scripts in the form of:
*Text boxes
*Cookies
*Query strings
*Web application variables
*Session variables

A web page is vulnerable to cross-site scripting if it dynamically generates document content and bases that content on user-submitted data without first 'sanitizing' that data by removing any embedded HTML tags from it.
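
As a minimal illustration of that sanitizing step, here is a Python sketch; the render_comment helper is hypothetical, but html.escape is the standard library's escaping function. Escaping the HTML metacharacters turns an injected script into inert text:

```python
import html

def render_comment(user_input: str) -> str:
    # Hypothetical helper: escape <, >, &, and quotes so user-supplied
    # text is rendered as plain text instead of interpreted as markup.
    return "<p>" + html.escape(user_input, quote=True) + "</p>"

# A script passed in through a form field is neutralized:
print(render_comment('<script>alert("XSS")</script>'))
# -> <p>&lt;script&gt;alert(&quot;XSS&quot;)&lt;/script&gt;</p>
```

Production systems typically go further, relying on context-aware escaping or templating engines that escape by default rather than hand-rolled helpers like this one.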

Tuesday, February 23, 2016

Video conferencing system in history

Videoconferencing is a telecommunications medium that allows individuals or groups at different locations to exchange video and audio in real time, in face-to-face settings.

Simple analog videoconferences could be established as early as the invention of television. In 1927, when Herbert Hoover was US Secretary of Commerce, he used a videoconference system in Washington to communicate with AT&T President Walter Gifford in New York.

This was a prototype system, but AT&T continued pursuing the idea, developing the Picturephone in the 1950s and launching the commercial Picturephone product in 1970. It was a commercial failure, mostly due to poor picture quality and the lack of efficient video compression techniques.

In 1964, Bell Labs launched its first prototype videoconferencing system. However, limited by the networks and other technology of the day, videoconferencing systems were brought to market only in the 1980s with the development of codec protocols.

During the 1970s, some private videoconferencing links were set up by some very large organizations using analog technology.

Video conferencing saw significant advancement in the 1990s due to many factors, including technical advances in Internet Protocol (IP) networking and more efficient video compression technologies, permitting desktop or PC-based video conferencing.

In 1991, IBM introduced the first PC-based video conferencing system, named PicTel. Although it was a black-and-white system, it was very inexpensive to use. Dedicated systems like it started to appear on the market as ISDN networks expanded throughout the world.

Monday, August 18, 2014

Brief history of computer software

Hardware is physical and may be seen and touched, whereas software is intangible: an intellectual undertaking by a team of programmers. The initial computers did not have a stored program, or what later became known as software.

The earliest high-level programming language was Plankalkül, developed by Konrad Zuse in 1946.

John von Neumann described the concept of storing a program and data in memory in 1945, during the construction of ENIAC. However, the first practical computer to run a stored program was EDSAC, at Cambridge University, England, in 1949.

The ENIAC was at first controlled by wiring, as if it were a gigantic plugboard, but later Nick Metropolis and Dick Clippinger converted it to a machine programmed from ballistic tables: huge racks of dials on which each decimal digit of the program could be set via the knobs of decimal switches.

Grace M. Hopper was an early pioneer in the development of the first programming languages. Her initial work started on the Mark I at Harvard University in 1944.

Software technology during this period was very primitive. The first programs were written out in machine code; that is, programmers directly wrote the numbers that corresponded to the instructions they wanted to store in memory.

In the mid-1960s, lines of code, or LOC, became one of the first known measures of software size. It refers to the number of computer instructions or source statements comprising a computer program and is usually expressed in thousands of lines of code (KLOC).
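
As a rough sketch of how such a count might be taken (the file name program.py and the count_loc helper are hypothetical; real LOC counters also distinguish comments from source statements), one can count non-blank lines and report the total in KLOC:

```python
def count_loc(path: str) -> int:
    # Count non-blank lines; a real tool would also skip comment-only lines.
    with open(path, encoding="utf-8") as f:
        return sum(1 for line in f if line.strip())

loc = count_loc("program.py")  # hypothetical source file
print(f"{loc} LOC = {loc / 1000:.2f} KLOC")
```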

Things began to change in the 1970s, as lower cost personal computers were introduced and software began to be sold as a product.

In 1998, a group of free software users decided to create a new term to describe free software. They felt a new term was needed so users wouldn't confuse 'freedom' with 'no cost'. They coined the term 'open source' and started the Open Source Initiative to promote its use.

Wednesday, July 10, 2013

The first microprocessor

A microprocessor is a multipurpose, programmable, clock-driven, register-based electronic device that reads binary instructions from a storage device called memory, accepts binary data as input, processes the data according to those instructions, and provides the results as output. It is essentially the central processing unit of a computer fabricated on a single silicon chip.

In the mid-1940s, John von Neumann, a brilliant mathematician at Princeton, conceived a theoretical machine in which binary logic and arithmetic could work together in storing detailed programs and performing complex calculations.

The first commercially available microprocessor was the Intel 4004. Produced in 1971, it contained 2,300 PMOS transistors. It was a programmable controller on a chip, meager by today's standards.

The 4004 was a 4-bit device intended to be used together with a few other devices to make a calculator. It was designed by Ted Hoff. A 4-bit microprocessor receives 4 bits of information from outside the chip, performs the necessary processing on them, and then sends a 4-bit result back out.

Realizing that the microprocessor was a commercially viable product, Intel Corporation released the 8008, an extended 8-bit version of the 4004 microprocessor.

In 1974, Intel announced the 8080, which had a much larger instruction set than the 8008 and required only two additional devices to form a functional CPU. The Intel 8080 was the first commercially popular 8-bit microprocessor.

About six months after Intel released the 8080 microprocessor, Motorola Corporation introduced its MC6800 microprocessor.

Around 1978, Intel released the 8086, its first 16-bit microprocessor. With a 16-bit word size, it was possible to represent signed numbers in the range -32,768 to +32,767, which is quite a decent range for performing arithmetic calculations.
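
That range is exactly what two's-complement representation gives: an n-bit word covers -2^(n-1) through 2^(n-1) - 1. A quick Python check of the arithmetic:

```python
# Signed range of an n-bit two's-complement word: -2**(n-1) .. 2**(n-1) - 1
n = 16
print(-2**(n - 1), 2**(n - 1) - 1)  # prints: -32768 32767
```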

Sunday, January 13, 2013

Father of modern digital computers

The earliest device that qualifies as a digital computer is the 'abacus', known in Japan as the 'soroban'. The abacus is the simplest form of digital computer.

The device permits users to represent numbers by the position of beads on a rack. The first mechanical adding machine was invented by the French mathematician Blaise Pascal in 1642. The machine became very popular and was produced on a mass scale.

Charles Babbage, a nineteenth-century professor at Cambridge University, is considered to be the father of modern digital computers. He was born on 26 December 1791 in his father's house in Walworth, Surrey.

During his period, mathematical and statistical tables were prepared by a group of clerks. Even the utmost care and precaution could not eliminate human errors.

Babbage had to spend several hours checking these tables. Soon he became dissatisfied and exasperated with this monotonous work. As a result, he started thinking about building a machine that could compute tables guaranteed to be error-free.

In this process, Babbage designed the 'Difference Engine' in 1822, which could produce reliable tables.

In 1837, Babbage came out with his new idea of the Analytical Engine, which was intended to be completely automatic.

It is for this effort that he is known today as the 'Father of the Modern Digital Computer'. The Analytical Engine was to be capable of performing the basic arithmetic functions for any mathematical problem, and it was to do so at an average speed of 60 additions per minute. It could evaluate algebraic expressions correctly and was also able to produce mathematical and statistical tables correct up to 20 digits.

The Engine had five components:
*A storage unit that held the numbers
*An arithmetic unit, called the Mill, to perform the arithmetic calculations
*A control unit that controlled the activities of the computer
*An input device that gave the numbers and instructions to the computer
*An output device that displayed the result

Unfortunately, he was unable to produce a working model of this machine, mainly because the precision engineering required to manufacture it was not available during that period.

However, his effort established a number of principles which have been shown to be fundamental to the design of any computer.

Dr. Howard Aiken of Harvard University, in association with IBM, developed a large-scale electromechanical computer in 1944. The computer, nicknamed 'Mark I', was based on the concepts of Charles Babbage's Analytical Engine.
