Definition of Computing in English

The term computation has its origin in the Latin word computatio, which lets us approach the notion of computation as an account or calculation. However, the term is generally used as a synonym for computer science (from the French informatique). In this sense, computation brings together the scientific knowledge and methods used to process information automatically.
These automated information systems are built with tools created for this purpose: computers.

According to DigoPaul, experts say that the origin of computing dates back more than three hundred years, to when machines designed for calculation tasks began to be developed. In 1623, Wilhelm Schickard invented the first mechanical calculator.

However, computers capable of performing multiple processes (that is, not limited to mathematical calculations) began to emerge in the 1940s. The massive, domestic use of these machines would only arrive in the 1980s, with the production of personal computers, or PCs. The end of the 20th century, with the rise of the Internet, gave a new impetus to everything related to computer science.

As for the theory of computation, it is considered a science focused on the study and formal definition of computations. This discipline defines a computation as the production of a solution or result, especially in the mathematical or arithmetic sense of the concept, by means of a process or algorithm.

In other words, computing is the science that studies and systematizes the orders and activities executed by a machine, analyzing the factors that take part in this process, among them programming languages, which make it possible to generate an ordered list of instructions the machine can understand.
In the process, two types of analysis are carried out: an organic analysis (translating the instructions into a language the computer can understand) and a functional analysis (gathering the information available in the automated process).

To talk about computing, it is first necessary to define the concept of an algorithm. An algorithm is a set of specific, ordered steps that follow a list of clear rules and aim to solve a particular problem. Algorithms must meet certain conditions: they must be defined (clear, detailing each necessary step without ambiguity), finite (the actions that comprise them must conclude in a logical, clear way), have zero or more inputs and one or more outputs, and be effective (use exactly what is required to solve the problem, spending the minimum of resources and being executable efficiently).
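These conditions can be illustrated with a classic example. The following sketch (in Python, chosen here for illustration; the function name and values are not from the source) shows Euclid's algorithm for the greatest common divisor, which satisfies every condition listed above: each step is unambiguous, the process always terminates, it takes two inputs, returns one output, and uses minimal resources.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b).
    - Defined: each step is a single, unambiguous operation.
    - Finite: b strictly decreases toward 0, so the loop terminates.
    - Inputs/outputs: two inputs, one output.
    - Effective: only a handful of arithmetic operations are needed.
    """
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # → 6
```

Running the example: gcd(48, 18) proceeds through the pairs (48, 18), (18, 12), (12, 6), (6, 0) and returns 6, concluding in a finite, clearly defined number of steps.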

Computing also covers other areas, such as artificial intelligence, graphics and network computing, database systems, computational mathematics, and the various branches of engineering related to computers.

Currently, the development of computers and related technologies has allowed the production of various types of documents, the sending and receiving of e-mail, the creation of digital drawings, audio editing and the printing of books, among many other processes.

It should be noted that the technology underlying computing is microelectronics, combining physical components, or hardware (the processor, memory, etc.), with logical components, or software (the operating system and programs).