The digital environment (mathematical logic, the concept of programming, systems and digital devices) was born through the work of Alan Turing:

1936 His paper On Computable Numbers, with an Application to the Entscheidungsproblem (the decision problem) is the founding text.

1940 He helps break the Enigma cipher used by the Nazis, working at Bletchley Park.

1948 He participates in the creation of one of the first stored-program computers, the Mark 1, at the University of Manchester.

1950 His article asking "Can a machine think?" launches the field of artificial intelligence (see the Turing test).

1952 His work on the biomathematics of morphogenesis develops a model later confirmed by research in chemistry.

Like other great thinkers such as Copernicus, Newton and Einstein, Alan Turing is one of the pioneers who revolutionized our world. Here is the concept of the Turing machine:
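
To make the concept concrete, here is a minimal sketch of a Turing machine in plain Python: a tape, a read/write head, a current state and a transition table. The state names and the sample rules (a binary increment) are purely illustrative, not taken from Turing's paper.

```python
# Minimal Turing machine sketch: a tape, a head, a state, and a transition table.
# The sample program below adds 1 to a binary number written on the tape.

def run_turing_machine(tape, rules, state="start", blank="_", max_steps=1000):
    tape = list(tape)
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        # Extend the tape with blanks if the head moves off either end.
        if head < 0:
            tape.insert(0, blank)
            head = 0
        if head >= len(tape):
            tape.append(blank)
        symbol = tape[head]
        state, new_symbol, move = rules[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(tape).strip(blank)

# Transition table for "add 1 to a binary number" (illustrative example):
# (state, symbol read) -> (next state, symbol written, head move)
rules = {
    ("start", "0"): ("start", "0", "R"),   # scan right to the end of the number
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),   # step back onto the last digit
    ("carry", "1"): ("carry", "0", "L"),   # 1 plus carry gives 0, carry continues
    ("carry", "0"): ("halt",  "1", "L"),   # 0 plus carry gives 1, done
    ("carry", "_"): ("halt",  "1", "L"),   # overflow: write a new leading 1
}

print(run_turing_machine("1011", rules))  # prints "1100" (11 + 1 = 12)
```

Everything a modern computer does can, in principle, be reduced to this kind of table-driven manipulation of symbols on a tape.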

Landmarks

1957: the transistor
Robert Noyce and the Fairchild Semiconductor company. Fairchild was created when eight young engineers came together on the West Coast (the future Silicon Valley), taking advantage of the replacement of germanium by silicon and of the demands of the defense industry (and, later, NASA).

1961: the integrated circuit
Robert Noyce of Fairchild. Digital technology was then used mainly by the US military-industrial complex in the context of the Cold War.

1971: the microprocessor
By Gordon Moore, Robert Noyce and Andrew Grove of Intel. Digital technology now pervades most major industries and finance.

1976: the personal computer motherboard
By Steve Wozniak of Apple. From then on, digital technology is widely used at home and at school.

2001: the iPod
By Steve Jobs of Apple. Digital technology becomes mainstream and mobile.


One of the first information-processing circuits was the planar integrated circuit (1961):

Almost twenty years later came Motorola's 32-bit 68000 (1979):

The digital revolution is recent: Turing (1936), Wiener (1942), von Neumann (1944) and ENIAC (1946). There have been several major advances in the mechanization of information processing:

Calculation (1940-1960)
The first calculating machines served during World War II (1939-1945). This was the era of the military and of mathematicians, whose gigantic machines carried out ballistic calculations, atomic-fission calculations (the Manhattan Project) and cryptology.

Memory-supported media (1960-1980)
During this industrial era, digital technology converted text and analog sounds and images into data. The first mainframes automated traditional production activities, which helped develop the first networks and gave productivity a boost. Time-sharing, networking, workflow and so on were discovered. The pioneers of this stage were the military and academics, creators of the first networks.

Participation (1980-2010)
With the beginning of the microcomputer era, digital technology became much more interactive. It helped content reach far more people than before, through new networks. Millions of users learned to create and circulate their own content (see Douglas Rushkoff on "The Rise of the Amateur"). This was the stage of programmers and, especially, of Apple's evangelists.

Digital capability is a toolbox!

A second shift occurred with the arrival of the Web around 1995, and another with keyboardless phones and tablets (around 2005); we then witnessed an explosion of more than a million applications. This was the stage of hackers, online communities and first-generation social networks.

Thus we move from giga to tera, then peta, and so on:
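
As a reminder of the scale these prefixes represent, here is a short illustrative snippet (decimal SI prefixes, i.e. successive factors of 1000; the binary prefixes kibi, mebi, gibi, etc. use powers of 1024 instead):

```python
# Decimal (SI) prefixes used for data volumes: each step is a factor of 1000.
prefixes = ["kilo", "mega", "giga", "tera", "peta", "exa"]
for i, name in enumerate(prefixes, start=1):
    print(f"1 {name}byte = 10^{3 * i} bytes = {10 ** (3 * i):,} bytes")
```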

After 2020 (?)

Later, digital information will be combined to generate knowledge. This stage will be in the hands of young people who have never known a world without the Web. It will also be the stage of second-generation (collective) social networks, new types of computers, several Webs and the Internet of Things (chapter 3, no 14).

A new algorithm-driven culture is emerging:

The greatest crisis facing our society will be how to transform information into structured knowledge.
Carlos Fuentes

Below is the classic pattern of the evolution of digital devices. The curve is exponential (see below), parallel to that of population growth (chapter 0, no 2).
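
To see what such an exponential curve looks like, here is a small sketch that assumes the often-quoted doubling of transistor counts roughly every two years (Moore's law); the starting point, the roughly 2,300 transistors of the Intel 4004 in 1971, is used only as an illustration.

```python
# Illustrative exponential growth: a quantity that doubles every two years.
def transistors(year, start_year=1971, start_count=2300, doubling_period=2.0):
    """Projected count assuming one doubling every `doubling_period` years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{transistors(year):,.0f}")
```

Each decade multiplies the count by about 32, which is why the curve, plotted on a linear scale, looks like a wall.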

This growth curve heralds a new generation of computers, born from the meeting of the rules of quantum physics with the ever-increasing power of digital technology. The passage from bits to qubits will generate new ways of thinking:

(Time, February 17, 2014)
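
To give a sense of the difference between a bit and a qubit, here is a minimal sketch in plain Python (no quantum library, and the numbers are illustrative): a classical bit holds one of two values, while a qubit's state is described by two amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1.

```python
import math
import random

# A classical bit is simply 0 or 1.
bit = 1

# A qubit's state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measuring yields 0 with probability |alpha|^2.
alpha, beta = 1 / math.sqrt(2), 1 / math.sqrt(2)   # an equal superposition of 0 and 1

def measure(alpha, beta):
    """Collapse the superposition: return 0 or 1 with the corresponding probability."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

print([measure(alpha, beta) for _ in range(10)])  # roughly half 0s and half 1s

# n classical bits hold one of 2**n values at a time, while the state of
# n qubits is described by 2**n amplitudes at once.
print(2 ** 50)  # number of amplitudes needed to describe just 50 qubits
```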

Proprietary software and free software

Proprietary software, developed by private companies that analyze human activities via algorithms, is kept secret. This technology is interesting because its effect is potentially exponential: its value is multiplied by the time users spend with it (the contribution economy).

However, it has a dark side: it may lead users toward social inertia:

Free software operates through the joint management of individuals who treat data as a shared resource: it is maintained by a community of users. The unifying principles are those of the commons and of solidarity. These software packages offer four freedoms: the freedom to use, study, improve and distribute them.
