Abstract
Programming is a significant semiotic activity, resulting in millions of lines of written code: the whole digital revolution is still rooted in writing as a semiotic activity. In this respect, AI applications based on deep learning present no special features: they are standard computer programs relying on the von Neumann/Turing architecture. Yet there is an interesting epistemological difference, which allows a distinction to be drawn between classical programming and machine learning. While the task of programming is always problem solving, in classical programming the programmer inputs rules and data in order to obtain answers as output. A machine learning approach requires a different epistemological wiring: the programmer inputs data together with the required answers, while the software learns, or discovers, the rules. From a semiotic perspective, these two approaches to programming can be characterized by referring to the pairs “grammar” versus “text” and “allography” versus “autography.” A grammar defines a set of rules whose application generates an output formally consistent with those rules. A text, by contrast, acts as an example from which regularities are inferred in order to generate a new text. This epistemological shift on the computation side is coupled with an analogous one on the user side. As data (that is, semiotically, texts) are the driving force, users have to focus on sets of examples in order to interact with the algorithms. The contribution discusses this shift by taking into account the corresponding changes in agency.
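The contrast between the two epistemological wirings can be made concrete with a minimal sketch. The toy temperature-classification task, the function names, and the brute-force threshold learner below are illustrative assumptions introduced here, not part of any system discussed in the paper; the learner merely stands in for the far more complex inference performed by deep learning models.

```python
# Classical programming: the programmer supplies the rules and the data;
# the machine returns the answers. The rule (a "grammar") is written
# explicitly by the programmer.
def classify_by_rule(temperature: float) -> str:
    return "hot" if temperature > 25.0 else "cold"


# Machine learning: the programmer supplies data together with the required
# answers; the machine infers the rule. Here a one-parameter threshold is
# "learned" from labelled examples (a "text") by brute-force search.
def learn_threshold(examples: list[tuple[float, str]]) -> float:
    candidates = sorted(t for t, _ in examples)
    best_threshold, best_correct = candidates[0], -1
    for threshold in candidates:
        correct = sum(
            1
            for t, label in examples
            if ("hot" if t > threshold else "cold") == label
        )
        if correct > best_correct:
            best_threshold, best_correct = threshold, correct
    return best_threshold


# Data and required answers drive the second approach.
data = [(10.0, "cold"), (18.0, "cold"), (27.0, "hot"), (31.0, "hot")]
threshold = learn_threshold(data)  # the inferred rule

print(classify_by_rule(28.0))                  # answer from the explicit rule
print("hot" if 28.0 > threshold else "cold")   # answer from the learned rule
```

In the first function the rule precedes the data, as a grammar precedes its outputs; in the second, the rule is a by-product of a set of examples, as regularities are inferred from a text.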