Žurnalistikos tyrimai ISSN 2029-1132 eISSN 2424-6042
2022, 16, pp. 14–38 DOI: https://doi.org/10.15388/ZT/JR.2022.1

Shifting from Individualism to Genericism: Personalization as a Conspiracy Theory

Ebrahim Mohseni Ahooei
Communications researcher at the University of Vienna, and a member of the Executive Committee of the UNESCO Chair in Cyberspace and Culture
Email: emohseni@ut.ac.ir

Abstract. Drawing on the deep mistrust surrounding classical approaches to consciousness, this paper claims that arguments around the notion of the “personalization” of media or messages are grounded in a misinterpretation. Resting on two presuppositions, the distinct differentiation of human beings and their power to make choices based on reasoning, these approaches have served as the reference for many well-known scientific studies, mainly in media studies, economics, political science, and psychology. Even after their results were refuted by meta-analyses, such theories have sought to maintain their position by resorting to conspiracy theories, the promotion of which, ironically, produces a syndrome of skepticism that sustains its own origins in a vicious circle. While these approaches have been ubiquitous in so-called cognitive priming, in the projection of mass movements, and in political abuses of concepts such as misinformation or disinformation, mainstream work in fields including, but not limited to, Perception Management, Artificial Intelligence, and Machine Learning has relied significantly on both de-individualizing and non-rational processes. This article aims to show that the ontological claims about the centrality of individualism in the latest media and communication technological procedures are grounded in a conspiracy theory. Relying on the method of epistemological reasoning, it argues that individualism and personalization in the media industry are principal tools of social control through the spread of skepticism, exploiting the fictitious nature of the new media sphere for commercial and political purposes.

Keywords: New media, personalization, individualism, genericism, social construction

Received: 2022/12/15. Accepted: 2023/01/30
Copyright © 2022 Ebrahim Mohseni Ahooei. Published by Vilnius University Press. This is an Open Access article distributed under the terms of the Creative Commons Attribution Licence (CC BY), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Introduction

The academic field of communication and media studies is no longer simply about training or acquiring media knowledge and skills to uncover truths. Whether one works as a journalist or a researcher, a lecturer or a student, it is no longer possible to understand what both the media and the truth are without taking a critical approach to the social construction of the media (SCM) and exploring the power and politics of the forces involved in that construction. Such an approach to communications and media is applicable on global and local scales.

Understanding the social construction of media is crucial, especially as a means of decolonizing it across the realm of corporate media dominance. Herman and Chomsky (1988; 2010) illustrate how corporate media globally forms a Doxa, a general form of media articulation, through the production of “consent.” This is accomplished by reducing social space to mere information and thereby making social action manageable through information control. Corporate media offers the most legitimate and dominant articulation of events, which is why it has become the only source of truth in the post-truth era. More importantly, it suppresses other forms of subjectivity by producing and legitimizing one precise sort of subject.

The importance of paying attention to the SCM at the local level goes back to the dominance of the Development Media Theory (DMT) over the communication sphere of developing countries during the twentieth century (Sonaike, 1988). The DMT holds that media is nothing more than a foundation for development. Accordingly, neither truth nor media freedom has any intrinsic value; the only legitimate goal of the media’s mission is to serve the development process. With such an approach, manipulation, deception, and rumor spreading are legitimate as long as they persuade in line with development goals. But manipulation is only possible when the audience is unable to take a critical stance toward the media. Otherwise, the propaganda effect of the media would be severely weakened once persuasion techniques and tactics become overt. The DMT therefore depends on a somewhat passive, vulnerable, and impressionable audience, and under such an approach the definition of media literacy is reduced to the ability to use media tools, such as reading a newspaper or operating a communication device.

From a global perspective, this is why understanding the complex and covert propaganda of the corporate media has become an obligation of journalism, communications, and media studies. At the local level, paying attention to the social construction of media can be effective in overcoming inefficient traditional approaches. Legitimizing state control and surveillance of the media, and treating media manipulation as morally acceptable, is not the only crisis of the DMT. More importantly, that legitimacy was specific to the era of classical media, including the press, radio, cinema, and television. The intellectual remnants of the DMT are, here and now, a misunderstanding, and they form one of the obstacles preventing some countries from playing a commensurate role in the global communication sphere.

Given the above, I focus here on one of these socially constructed concepts, namely individualism, and its media counterpart, personalization, which is mistakenly assumed to be a natural concept. My claim is that over the past decade new media have undergone a paradigm shift from subject-based personalization to object-based genericism. I argue that the concept of personalization has become a myth, sustained through artificial ventilation for reasons that I will elaborate on throughout the text.

The question is this: while the procedures for developing communication and media technologies are in fact based on depersonalization, and the idea of personalization has been repeatedly refuted, why are politicians’ statements and the media industry’s procedures still based on the rhetoric of individualism?

Method

The reasoning of this article is based on the epistemological paradigm (Guba & Lincoln, 1994, p. 108; Hejase & Hejase, 2013, pp. 82-83) and seeks the background conditions and reasons for the variety of ontologies that have been constructed through the modern history of communication sciences as theoretical attempts to understand the nature of media. The epistemological method used in this article capitalizes on an innovative articulation consisting of three distinct ontologies: the Instrumental Approach to Media (IAM), the Media Ecology Approach (MEA), and the Social Construction of Media (SCM). Each theory of communication and media is coupled to one of these three ontologies; once its stance on media and communication is described, its explanatory power can be assessed. Such reasoning focuses particularly on the turning points where an ontology does not withstand reality.

All these are grounds for the epistemological analysis of a false ontological claim regarding the individualistic nature of all recent technological procedures related to media and communication, including Perception Management, Artificial Intelligence, and Machine Learning.

Adopting epistemological reasoning as a method makes it possible not only to show the false nature of individualism in the mainstream procedures of developing communication technologies but also to explain why these procedures insist on pretending to be individualistic and resort to conspiracy theories to falsify reality.

Milestones in the approaches to the media

To most people, media is just a tool or channel of communication. Such an instrumental approach to media (IAM) is the continuation of three decades of competition over computational models of communication by scientists such as Claude Shannon (1948), Wilbur Schramm (1954), and David Berlo (1977), in attempts to provide the mathematical models that dominated the intellectual sphere of the United States from the 1920s to the 1950s. This conceptualization reduces media to a neutral channel of content or information and limits the scope of media studies to media content. The pernicious deficiency of this notion is that it cannot account for the social effects of media.

To compensate for this shortcoming, communication scientists developed the Media Ecology approach (MEA). The metaphor of ecology here focuses on how the media influences social situations and interactions and seeks to understand how different media facilitate different social interactions and structures. From the MEA perspective, each medium has its own unique sociological and psychological characteristics. The concept of the MEA, introduced mainly by Harold Innis, Marshall McLuhan, and Neil Postman (Patterson, 1990), conceives of media more broadly than as a means of communication. Instead, they see it as an entity that encompasses the entire living environment. That is why the media technology of every era, i.e., how people communicate, significantly depicts its culture, ways of thinking, values, social relations, and power.

Information and communication technology, under this approach, has blurred traditional social and cultural boundaries and thus changed the classical structure of the national state and the concept of a national information “border.” According to Lash (2002), in such circumstances the world is divided into two groups: reactive winners, whose reaction to the situation is in line with the new structure, and reactive losers, who try to maintain the lost patterns of the old construct by resisting the new one (Lash, 2002, pp. 137-39).

More importantly, in line with the MEA approach, the social situations of individuals are determined by communication constructs rather than productive structures. As citizens organize and express themselves through algorithm-based services, algorithms and their shared interests become a part of citizens’ identities. When digital services determine the content of media through algorithms, digital media become the technological human subconscious that influences the symbols through which we think, make decisions, and react.

However, criticism against the MEA is widespread, with technological determinism being the most common charge. Critics claim that media ecologists draw too simple a picture of social change driven by technology. Technological determinism treats technology as an independent force that forms society without considering sociocultural factors related to power. As a well-known illustration, Harvey and Williams (1995) point out that McLuhan’s technological determinism of “the medium is the message” (McLuhan, 1964, p. 23) underplays the effect of other factors, economic, cultural, and political, on technologies that do not exist independently of them (Harvey and Williams, 1995). Instead, to influence society, technology must necessarily be socially recreated through human interests, wills, and agency.

While deterministic ecologists derived their approach from their lived experience of the direct influence of mass media on the human psyche and mentality, criticism of their magnification of media effects led the next generation of media ecologists to a softer version that saw media technology as facilitating or modifying change rather than determining its course. New media ecologists, including Joshua Meyrowitz (1999) and James Carey (2008), believe in social constructivism. Meyrowitz (2001), for example, demonstrated the unscientific nature of McLuhan’s approach to the possibility of altering the audience’s nervous balance through media.

According to Meyrowitz (1986), electronic media promote a particular sort of social change by connecting previously separate social spaces and domains. Before electronic media, social spaces were tied to physical spaces. In the past, physical barriers like walls, doors, and gates controlled the flow of information and effectively kept social spaces apart. As electronic media reduced the necessity of face-to-face communication for access to information, the dependence of informational spaces on specific physical spaces weakened. As a result, social spaces and spheres began to merge. In a society where social spaces cannot be clearly distinguished from one another, each actor becomes part of a connective tissue or network of communication. The information flow blurs the boundaries between the private and public spheres. At the same time, the evocative nature of the communication network makes it possible for the network to act as part of the human thought process. As a result, the boundaries between individuals and their networks are lost. This consequence violates the concept of separate and independent individuals, which has been the dominant idea of the epoch.

The social construction of media (SCM)

The media is a social institution. According to Giddens (1984), institutions are both objective structures in the sense that they set the rules for social action, and they are subjective in the sense that they can only exist in the minds of citizens and be accomplished through their actions. Institutions change when enough citizens start behaving differently. The idea means that a social institution cannot emerge or survive without the mechanisms of creating and developing society, including the public mindset.

The media is grounded in a specific social, economic, political, cultural, historical, and technological context, and continues to exist in continuous interaction with this context. Therefore, it is necessary to understand media from a social constructivist perspective. At the same time, one must bear in mind that each medium has technical characteristics that partly determine how it can be used. That is, a medium does not determine the social ecology; it is determined socially. Treating media as a social construct explains the power relations that determine media technologies. The entire technological structure of the Internet, for example, has been, and seemingly will continue to be, determined by power relations. Earlier, Neil Postman (1984) stated that media technologies are a set of ideas or ideologies. Similarly, Fred Turner (2021) argued that the counterculture of the 1960s was a basic factor in the creation of the Internet in the 1970s. According to Turner (2021), the early inventors of the Internet and personal computers were motivated by the idea of a communication technology that could not be controlled by any center and that would create unpredictable communication freedom for individuals. In the 1990s, this idea turned increasingly to the service of neoliberal policies, and as a result, the Internet became a determinative factor in the globalized economy.

In a broad social ecology, not only an economic one, all actors begin to cooperate and participate in a very open way. This is not a result of new media; rather, it reflects the individualistic, anti-authoritarian, and people-centered values of late modern culture. In an information-based society, pervasive networks connect different ideas, cultures, institutions, organizations, and individuals. Everything is part of a whole, and the boundaries between, for example, work and leisure, private and public, and national and international are increasingly blurred. The development of a social ecology does not occur without conflict but creates new problems.

A society organized at the grassroots level is highly individualistic and thus deepens social inequalities. The social ecology is communication-based, meaning that cultural capital and the interactive skills of individuals are emphasized. Cultural and social capital puts citizens in an unequal position, and the hierarchies take on a new form.

In such a situation, where life management is closely associated with symbolic management, citizens become more obsessed with media in order to comprehend the importance and meaning of their lives. The more social capital an individual has, the higher his or her chances of success in matching the meaning of life with symbolic meaning. Similarly, those with the most effective resources for communicating and producing media content will have the greatest power in defining a shared reality. This is why the most significant driver of social development is to have more interaction and communication.

The inefficiency of outdated ontologies

Regardless of what is happening in the academic realm, the media policymaking sphere, as well as the media’s everyday applications in areas including but not limited to advertising, public relations, and propaganda, is taking a very different path. In the world outside the academy, the two approaches of the IAM and the MEA remain popular, while the conception of media as part of a wider ecosystem or social construct has been widely neglected. To be more precise, the ontology of media in the operational space, the search for a more applicable answer to the question “what is the media?”, has not developed in parallel with the practical and theoretical developments of media itself. This ontological discontinuity has several reasons, the investigation of which would require a kind of paleontology of the experience of modernity and its relationship with the concept of media; what is central to the present research, however, is the destructive effect of falling back on the IAM and MEA.

The popularity of the first two approaches is not merely a theoretical issue but a fundamental stance in all media management, research, and policy. Even most of the routine analyses and policies of new media, including social networks and computer games, are based on the IAM and MEA procedures. In sociological research, cultural studies, psychological studies, and educational sciences, most topics, such as the effect of digital media, social networks, or computer games on identity, ethnicity, teenagers, and so on, are continuously problematized on the basis of these two traditional approaches. This is why scientific answers often do not lead to effective social solutions. In terms of policy and legislation, all the usual conservative efforts, such as closing, blocking, and filtering globalized communication, are rooted in the lack of an updated approach to media ontology.

Statesmen, politicians, and media policymakers in closed societies are trapped in the idea that by “closing the borders” of information flow, crystallizing it in a national intranet, they can restrain the flow of information and maintain traditional patterns of power. This complication is the consequence of imposing a mythical ontology on the media. If such authorities had the chance to acquire new knowledge about media ontology, they would find that they are fighting for a fiction that has been obsolete for decades. The media no longer has a demarcated nature that can be closed or limited by national or local boundaries. It is now an interwoven, integrated, and interactive entity consisting of people, meanings, processes, and technologies. In this four-element network structure, no element dominates another; the four elements interact, and any intervention in one has unpredictable effects on all the others.

As noted in the analysis of media as a social construct, the IAM and the MEA are ineffective for three main reasons. The first is that new media have a completely different nature compared to traditional mass media. This heterogeneity relates to differences in the context of construction as well as in the processes and goals of their creation and expansion. A movie screen in a cinema theatre or a TV screen in a family’s living room in 1980 was constructed on the basis of shared consumption. Even the traditional press, whose distribution rested on individual subscription, was able to keep subscription costs low because its content and printing technologies were designed for the masses. The press was the reproduction of a coherent package of information for an unidentified mass.

Constructed differently, new media work with the claim that their technologies are the basis of personalization and that their evolution deepens individualism through these technologies. It is assumed that any new media technology offering more options for hard and soft personalization is more acceptable to users. All traditional media were based on one-way communication, and a passive mass audience was perceived only as a receiver. New media, however, are fundamentally based on mutual interaction and collaboration. Finally, in terms of process, while the success of mass media lay in depersonalizing the audience and suppressing them into a passive mass, new media are rhetorically based on personalization and individualism. Networks and social media, for example, have none of the features, capabilities, or process capacities of mass media. Their role, including creating and directing the flow of social movements, is not of the authoritarian type that was common in traditional mass media. When the subject of investigation has completely changed, the old approaches to it lose their effectiveness.

The second reason for the inefficiency of the IAM and MEA is that all traditional mass media have changed their status after the emergence of new media. Traditional mass media, including the press, radio, cinema, and television, are using the features and capabilities of new media, and this effort has completely changed their being. Today’s television, with its extensive interactive facilities based on both web and broadcast waves, is not similar to the television of the 1990s. Those who believe that traditional media, including television, are still the most significant in conducting the public are oblivious to the fact that traditional mass media themselves achieve their existence through interaction with the new social ecosystem. Talk of new media is not confined to particular forms of media: all media, including traditional ones such as the press or television, have internalized the social construction of new media.

And finally, the third reason relates to the content of the media. The SCM is not only about media technology; it also encompasses content. The process of creating meaning in the message transmitted through the media is dynamic and social. Society continuously negotiates the meanings of real happenings. Therefore, the media cannot produce any meaning outside the scope of what is negotiated in society. This requires simultaneous attention to audiences and how they interpret media content. The social process of creating meaning is not specific to new media; the content of traditional mass media was decoded in the same way by the masses. However, because content production during the period of mass media dominance was exclusively under mass media control, those media had a better chance of creating the desired meaning in the minds of the masses. At the same time, complementary techniques such as causing fear, replication, or harmonization were used to consolidate the intended meaning quickly. That situation has now completely changed. Social groups can exchange messages and participate in the process of producing meaning free from surveillance; therefore, it is no longer as easy, or even possible, to control the meaning of messages as it was in the past. This is another reason why even traditional mass media cannot control the process of meaning production across society in the era of new media.

Insisting on understanding media as an instrument or an ecosystem is the source of inefficiencies, failures, and continuous waste of resources. On closer inspection, we find that such a situation was not usual during the period of dominance of traditional mass media. In previous decades, television, for instance, could achieve the desired level of persuasion of public opinion in the best way. The reason for that capability lay in the correspondence between social conditions and the answer that media trustees gave to the question of media ontology. The administrators of mass media considered media a means of persuasion and of promoting the convergence and social cohesion of the passive masses, and in practice the media still were such a thing. This correspondence of reality and knowledge was the origin of the legitimacy of the mass media’s sovereignty, surveillance, and exclusivity, because it was plain to see how mass media could influence public opinion. In other words, it seemed obvious that such a fearsome “instrument” should be in the hands of a monopoly, because without this monopoly it would not be possible to rule over the masses.

The ideological construction of individualism

Adopting the SCM approach strongly encourages the idea that the subject of media studies is no longer a method for identifying ideological dualities, including information/misinformation, real news/fake news, and fact/conspiracy theory. Such diagnosis is not the duty of scientific effort but the self-imposed responsibility of propaganda campaigns. The right and serious questions concern how these dualities are constructed and what benefits accrue to the forces that intervene in such constructions. Answering these questions goes beyond an understanding of the technical characteristics of the media.

Like all the constructed dualities of neoliberalism, personalization and individualism versus de-individualization and genericism is an ideological construct. Drawing on Althusser’s (1970; 2010) concept of Ideological State Apparatuses (ISAs), Garite (2003) states:

“Within ideology, it appears ‘obvious’ that people are unique, distinguishable, irreplaceable identities—and that, as autonomous individuals, they possess a certain kind of subjectivity or consciousness which is the ultimate source of their beliefs and actions, independent of the world around them” (Garite, 2003, p. 5).

So far, many scholars have revealed the ideological or political construction of individualism and shown its hidden nature through concepts such as hailing or interpellation (Althusser, 1972), control (Baudrillard, 1983), or willing adoption (Belsey, 2003). According to Althusser (1972, p. 175), “the existence of ideology and the hailing or interpellation of individuals as subjects are the same thing.” Related to this concept, Gauntlett (2002) remarks: “interpellation occurs when a person connects with a media text” (Gauntlett, 2002, p. 27). Even in the 1980s, philosophers like Baudrillard rightly realized that “the role of the message is no longer information, but testing and polling, and finally control […]” (Baudrillard, 1983, pp. 119-20). Or as Belsey (2003) puts it, these kinds of actions do not have a compelling quality, but

“people ‘recognize’ (misrecognize) themselves in the ways in which ideology ... calls them by their names and in turn ‘recognizes’ their autonomy. As a result, they ‘work by themselves’, they ‘willingly’ adopt the subject-positions necessary to their participation in the social formation” (Belsey, 2003, p. 61).

All the above bolsters one idea: that the general notion of individualism as a Doxa is a manipulated one. We must, therefore, refute the notion that individualism is a natural state of affairs for humankind. This is the first point of departure for critiquing the systems that see their advantage in personalization and in treating individualism as a natural feature. It is important to focus on the functions of this jargon, the discourse of individualism, because these functions reveal why the approach pretends to be effective despite its incapability.

Specifically, the two main functions of the ideological construction of individualism are, on the one hand, to exonerate political systems and, on the other, to cultivate the dream of human selectivity. Through these two functions, personalization and individualism have far-reaching economic and political implications for both the political system and the market (Fuchs, 2003).

In contrast, it is claimed that the de-individualization and collectivism of the masses are the mechanisms of totalitarian and fascist regimes. In this way, the individualistic “We” and the mass-oriented “Others” become, in a Kantian way, the universal rule of ethics and aesthetic judgment. It is a social construction of good and evil, which claims that individualism is full of freedom, self-confidence, and self-expression, while collectivism is the product of the suppression of individual freedom in opposition to human nature.

Relying on such a deceptive notion of individualism, new media manifest their advantage by claiming to personalize messages, platforms, and applications. In breathtaking competition, new media find their advantage in the so-called respect for individuality, the power to choose, the right to express a personal narrative, and the ability to provide a unique version of media per user.

Refutation of the personalization approach

Despite widespread critical explanations of the ideological nature of individualism, doubts about the practical functions of the concept started appearing only in the early 1970s. Scholars’ acknowledgment of the ideological nature of the social construction of individualism in the modern era had inadvertently implied a belief in its persuasive effectiveness. This conception began to unravel in the 1970s.

Returning to the early 1970s: the advent of relatively high-speed processors prompted scientists to try to discover the universal pattern of everything, including the general pattern of human behavior. The results of the first attempts, however, were not very promising, and their product was the idea of “randomly transitional phenomena” (Sprott, 2003, p. 89) as a logical explanation within Chaos Theory (CT). Although the CT implies the impossibility of designing universal patterns, the theory is itself the product of such a dream. Interdisciplinary studies within the scope of the CT attempted to arrive at such a pattern, but the matter was reversed: it was theorized that even if a model for explaining human behavior exists, the number of variables and their interactions is too great to be accounted for.

Later in the 1970s, and as a reaction to chaos theory, Computational Complexity Theory (CCT) (Karp, 1972) came to dominate. This theory proposed entrusting the discovery of a general pattern between information units to the computer as a practical alternative to chaos theory. Since it is practically impossible for humans to determine the algorithm of relationships between “information units” within a universal pattern, the task should be left to processing systems to discover an iterative pattern between the information units and finally complete the puzzle. It soon became clear that the CCT faced two serious obstacles. Firstly, we cannot define a specific “unit” of information: any breakdown of an information package into its components means the loss of the overall spirit of that package. Secondly, information has something inside it that the computer cannot understand: semanticity. Thus, complexity theory, as the first practical step toward machine learning, failed at the barriers of the unification and the semantics of information.
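To make this delegation concrete, the following is a minimal, purely illustrative sketch (not drawn from Karp or from any study cited here) of handing the search for recurring patterns between “information units” to a machine. The choice of unit and the counting procedure are my assumptions, and they exhibit exactly the two obstacles named above: the split into units is arbitrary, and the resulting frequencies carry no semantics.

```python
# Illustrative sketch only: naive "pattern discovery" over arbitrary units.
from collections import Counter
from itertools import islice

def ngram_counts(text: str, n: int = 2) -> Counter:
    """Count recurring n-gram 'patterns' over naively split units."""
    units = text.lower().split()          # obstacle 1: what counts as a unit?
    ngrams = zip(*(islice(units, i, None) for i in range(n)))
    return Counter(ngrams)                # obstacle 2: frequencies, not meaning

if __name__ == "__main__":
    sample = "the medium is the message and the message is the medium"
    for pattern, freq in ngram_counts(sample).most_common(3):
        print(" ".join(pattern), freq)
```

The sketch shows the general idea only: the machine can find repetition, but the "spirit" of the sentence is lost the moment it is split, and the counts say nothing about what the words mean.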

Although efforts to break down complex semantic structures into smaller parts through projects such as Operad theory continue, these projects are still unable to systematically break down information without human intervention. For example, Operads depend on basic structures called “arguments,” which must be previously defined by humans as “inputs” of the system. So, though “interfaces define which designs are syntactically feasible, key semantic information must be expressed to evaluate candidate designs” (Foley et al., 2021, p. 2).

These controversies continued until 2007, when the Quark Theory (QT) opened a new door into computing science. According to the QT, which the physics community had accepted in 1975 (Griffiths, 1987, p. 42), every entity consists of a set of microcomponents called quarks. A quark is the smallest unit of a phenomenon and cannot be partitioned into smaller particles. This logic applies to any entity, whether animate or inanimate, and, more importantly, it is not arbitrary but a general rule that repeats at larger scales. In the field of information technology, the QT led to a major revolution: the shift from the Internet of information to the Internet of data. In the history of its invention and development, the Internet has never experienced a more fundamental turning point than this.

Putting data in the place of information solved the two problems of the CCT: data carry no semantic mode and can be unified. This revolution took place around 2007 and rapidly transformed all Internet processes, technologies, and platforms. The inventor of the web, Tim Berners-Lee, put it this way in a TED talk in 2009:

“I said, could you put your documents on this web thing? And you did. Thanks. It’s been a blast, hasn’t it? … Now, I want you to put your data on the web. Turns out that there is still huge, unlocked potential. There is still a huge frustration that people have because we haven’t got data on the web as data” (Berners-Lee, 2009).

What Berners-Lee and his other W3 partners try to portray as a natural duty of individuals toward the public good is nothing more than getting people to consent to the transfer of their private data and to the accumulation of public data on the servers of giant digital companies such as Google or Facebook, in order to derive generic patterns for controlling human behavior.
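The following minimal sketch (an illustration of the general idea, not the W3C’s actual specification) contrasts a human-readable document with the same fact expressed as uniform, mergeable data; the URIs and the helper function are hypothetical.

```python
# Illustrative contrast: a "document" versus the same fact as generic "data".
document = "Ada was born in London."              # prose: meaningful to humans only

triple = ("https://example.org/person/ada",       # subject (hypothetical URI)
          "https://example.org/prop/birthPlace",  # predicate (hypothetical URI)
          "https://example.org/place/london")     # object: uniform, mergeable

def merge(*datasets):
    """Triples from many sources can be pooled into one generic graph."""
    return set().union(*datasets)

print(len(merge({triple}, {triple})))  # duplicates collapse: prints 1
```

The point of the contrast is that such triples, gathered from many users, can be pooled and pattern-matched generically without any need for the machine to understand them, which is precisely the property the argument above attributes to the Internet of data.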

The next defining event came in 2012, when Daniel Kahneman, winner of the 2002 Nobel Prize in Economics and an opponent of rational behaviorism, warned of a “train wreck looming” in an open letter to social psychologists working on priming, published on the ‘Nature’ news website, pointing to the inefficiency of Priming Theory (PT). The PT, a theory in psychology, claims that desired behavioral outputs can be obtained by intentionally projecting specific information into each individual’s mind in a personalized way. The theory’s findings, which have so far been the basis of all the controversies and so-called ‘conspiracy theories’ built on data manipulation around the world, proved unreliable in the re-examinations Kahneman pointed to. The context of Kahneman’s letter also included the “exposure of fraudulent social psychologists such as Diederik Stapel, Dirk Smeesters and Lawrence Sanna, who used priming techniques in their work” (Yong, 2012).

Thus, at least as far as scientific findings are concerned, the personalization approach is a myth. The myth-making around this approach has not only been blind to all the competing scientific studies; it also continues to insist on its effectiveness even after its inefficiencies and the scientific manipulation of the related research processes have been disclosed.

Resistance against the scandal

The refutation of the PT practically meant the end of the legitimacy of the personalization approach. Yet it still refuses to accept failure. This resistance has unscientific reasons, and the answer must therefore be sought in the pseudo-scientific mechanisms that conceal its anti-human procedures. Three main reasons explain why politicians and the market continue to support the personalization approach despite its illegitimacy.

The first and most important reason concerns the function of conspiracy theory in social control. The most important application of the personalization approach to the media is its ability to promote skepticism by exposing public opinion to various conspiracy theories. This feature is especially welcome when communities are in shock after an event and there is no obvious way to analyze the reasons that led to the unexpected result. When people witness an unexpected event, conspiracy theories are used to spread suspicion, and the public can be controlled in this way. Ideas such as Russia’s manipulation of American and British voters into voting for Donald Trump or Brexit through priming operations on online social media, Israeli control of the Arab Spring through social media, or Russian influence over European Union (EU) users through the spreading of misinformation are entirely based on conspiracy theories. More surprisingly, these sorts of theories are voiced not by ordinary people but by credible scientists, think tanks, and international institutions.

Even if, for example, Russia has been able to send personalized messages to American or British users through social networks and platforms, this does not mean that such an action has had a definite effect, such as the mental manipulation of users or forcing them into the desired behavior. Despite extensive efforts to gather massive data on the reality of such an action by Russia, there is not a single article proving the effectiveness of such actions. It is as if taking an action were equal to that action having a definite effect. Instead of addressing the real roots of shocking events, conspiracy theories are used in such a sphere to keep their producers safe from any doubt while letting suspicion flow across society and among individuals.

The second reason for keeping the personalization approach alive relates to its commercial and political applications. Collusion between bankers, investors, data analysts, and politicians has kept the feasibility and acceptance of risk analysis based on personalized data safe from criticism, because all parties involved in such a claim benefit from a common myth. Yet the main cause of the 2008 economic crisis was reliance on this inefficient approach (Senior Supervisors Group, 2009). Another example is the Cambridge Analytica scandal, whose claim of having manipulated the minds of voters in the 2016 US presidential election was nothing more than a propaganda effort to legitimize such institutions and maintain a mighty turnover among them.

The third reason goes back to the imaginative existence of the media world. New media create a fictitious world, and users consent by imagining the controllability of that fiction. While new media users are not active subjects in the real world, such an impression gives them a sense of selectivity, control, and centrality. Users react within the realm of an imaginary and fantasy-mediated world. Replacing the impossibility of action in the real world with the imagination of social action leads to consent. This is why and how people, by immersing themselves in an online social network, come to consider acting in it a social norm or even a common morality.

The idealistic self-presentation of digital companies is the availability of information for individuals to create personal narratives, regardless of the dominance of other narratives. The metaverse, for example, is based on such an illusion; it is the idealistic face of the personalized world mediated by the media. This space is neither a new technology nor a turning point in the history of the Internet or new media. The metaverse is merely an enterprise strategy claiming to personalize the imaginary world; that is, it pursues its interests where freedom appears as ideal as possible.

Analysis

The reloaded revolution of 2007 that led to the rise of the Internet of data revealed the illusory nature of the individualism promised by neoliberalism. Internet development processes expose the formalistic manifestation of neoliberalism and the hypocrisy of the individualism within it. However, the notion of individualism as the epistemology of Kantian judgment remains a central element of the ideological jargon of neoliberalism, a concept that can integrate macro-narratives within the system in a non-problematic articulation.

According to the SCM, media is an ideological construction influenced by contextual conditions. While individualism is the central element of the ideology of neoliberalism, new media emerging from that origin also carry a similar ideology. Accordingly, the hypocrisy of neoliberalism in its emphasis on individualism is traceable in new media. On the one hand, it is claimed that the central value in all new media is the personalization of media technology, processes, and content according to the unique characteristics and needs of the user. On the other hand, the procedures of new media development, especially algorithms, artificial intelligence, and machine learning, are based on two fundamental features: de-individualization and de-rationalization.

There is a theoretical contradiction here: with the help of new media and digital technologies, users feel more individualistic selectivity than before, while at the same time the reality of the development trends of these technologies shows that users are constantly, and more than ever before, stripped of their individuality and deprived of a rational basis for their decisions. The concept of reverse democracy provides a reliable explanation for resolving this contradiction. What new media users refer to as selectivity is merely an ideological construct in the form of predetermined and planned ideological interactive paths. It works on Skinner’s model of rewards and punishments (Shrestha, 2017), in which operant conditioning takes the place of rational critical analysis. Therefore, what users think of as the right to choose is the fulfillment of a predetermined task: obedience, without reflection or resistance, in implementing the commands of an all-inclusive system whose universal patterns have made it impossible to recognize its imposed features.
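As a purely hypothetical sketch of the reward-and-punishment loop invoked above (Skinner-style operant conditioning, not any platform’s documented algorithm), the following shows how a “choice” confined to a predetermined menu is reinforced by past engagement rather than by any model of reasoning; all names, weights, and the simulated user are assumptions for illustration.

```python
# Hypothetical operant-conditioning-style loop: reinforcement, not reasoning.
import random

menu = ["clip_a", "clip_b", "clip_c"]          # predetermined interactive paths
weights = {item: 1.0 for item in menu}         # the system's view of the user

def serve() -> str:
    """Pick an item in proportion to accumulated reinforcement."""
    return random.choices(menu, weights=[weights[m] for m in menu])[0]

def reinforce(item: str, engaged: bool) -> None:
    """Reward engagement; the effective 'choice' space quietly narrows."""
    weights[item] += 1.0 if engaged else -0.25
    weights[item] = max(weights[item], 0.1)

for _ in range(100):                           # simulated sessions
    shown = serve()
    reinforce(shown, engaged=(shown == "clip_b"))  # user happens to click clip_b

print(max(weights, key=weights.get))           # the loop converges on clip_b
```

Nothing in the loop consults the user’s reasons; it only feeds back past behaviour, which is the sense in which the “right to choose” collapses into a predetermined path.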

Contrary to common sense, the fears caused by the superiority of artificial intelligence over human cognition, emotion, and motivation relate more to the mechanisms of suppressing these three Kantian capabilities of humans, turning them into objects of operant conditioning, than to the realized or expected advances in artificial intelligence itself. Artificial intelligence and machine learning are nothing but “commands” designed for the machine through algorithms. Similarly, the ideology of interaction is nothing more than “commands” designed for humans through predetermined paths. It is claimed that the increasing complexity of algorithms has made machines human-like, but this is not a comparison between a machine and a genuinely free human; it is a comparison between a cybernetic machine and an operantly conditioned human reduced to the position of an ideological object.

As in the field of artificial intelligence, the individualism of this new level of the human is not self-reliant but an individualism determined within universal generic patterns. According to one of these universal patterns, for example, a human can be exhaustively described by five primary personality traits: extroversion, agreeableness, openness, conscientiousness, and neuroticism. This generic articulation of the human ignores contextual characteristics such as culture, gender, or age, and it suppresses any inconsistency with the imposed classification.
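A minimal, hypothetical sketch of such a universal generic pattern as a data structure: every person is mapped into a fixed five-trait grid, while contextual fields such as culture, gender, or age are accepted but never used. The trait names follow the paragraph above; everything else is an assumption for illustration.

```python
# Hypothetical illustration of a fixed, context-blind classification grid.
from enum import Enum

class Trait(Enum):
    EXTROVERSION = "extroversion"
    AGREEABLENESS = "agreeableness"
    OPENNESS = "openness"
    CONSCIENTIOUSNESS = "conscientiousness"
    NEUROTICISM = "neuroticism"

def classify(scores: dict[str, float], context: dict | None = None) -> Trait:
    """Return the 'dominant' trait; 'context' is accepted but never used."""
    return max(Trait, key=lambda t: scores.get(t.value, 0.0))  # anything outside the grid is discarded

print(classify({"openness": 0.9, "neuroticism": 0.4},
               context={"culture": "unspecified", "age": 30}))
```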

However, the illusion of individualism continues to be sanctified, and the moral considerations that make “Us” vulnerable to defeat by “Others” are constantly invoked. This is the starting point for all conspiracy theories, which underpin all dualistic constructs, including information versus disinformation. The basic problem with these constructions is their irresolvable contradiction. The assumption that individualistic users can be influenced through individualized advertising or propaganda requires the belief that users lack the ability of rational and critical reasoning, because, according to conspiracy theories, individualized messages can direct the behavior of the masses, as in the US presidential election or the UK Brexit vote. Such a belief requires two simultaneous presuppositions. First, it presupposes that some mechanisms can force users to perform planned behavior regardless of their characteristics. If behavior change is possible regardless of individuality, this first assumption rejects the basic claim of individualistic patterns. The second basic premise of conspiracy theories is that it is possible to force users to change their behavior through a series of messages. Acceptance of this claim requires belief in the irrational action and object-like quality of users, that is, something consistent with, or at least similar to, Skinner’s operant conditioning mechanism (Shrestha, 2017) or algorithmic controllability. Therefore, conspiracy theories, in themselves, negate the two principles of individualism and rational liberalism.

Conclusion

The analysis of new media and all its dual structures, including information/misinformation or fact/conspiracy theory, requires a critical approach to the SCM and to the intervention of power and politics in such constructed dualities. This approach has two beneficial consequences. First, it enables journalists and the academic community to decolonize the media, thus restoring power to marginalized forces in a more balanced way. Second, it allows us to move beyond traditional approaches, such as the DMT, IAM, or MEA, that have lost their effectiveness in the post-truth era. The SCM is an effective, informative, and efficient alternative for understanding the media, not in the way corporate media has narrated it to us, but in a more realistic and just one.

As the claim of the possibility of personal narratives of self and life, the rhetoric of individualism has two main functions: first, the exoneration of political systems and the consequent voluntary renunciation by human beings of pursuing the demands and rights entrusted to administrations; and second, the cultivation of the dream of human freedom through the creation of an illusion of selectivity and surrender to the predetermined paths of the consumer market as the only possibility of expressing a unique social existence. Both functions are implementations of social control itself.

New media operate not on the basis of individualistic personalization but on the generic construction of the human cognitive system. Spreading suspicion by shaping conspiracy theories, commercial and political interests, and the imaginative construction of new media are the three main factors in the survival of media personalization assertions. The main customers of media personalization are politicians and the market. Fragmenting society, spreading skepticism, and expanding the anxiety that results from conspiracy theories or from the immersion of users in an imaginary world bring commercial and political benefits to those who are the consumers of users’ consent.

Paying attention to the power and politics of constructed reality in the post-truth era is a requirement for all researchers in the field of new media. Dearticulating and rearticulating reality in ways that reduce the dominance of power and politics in favor of higher explanatory power, and analyzing the effect of power and politics in socially constructed articulations, are some of the suggestions that follow from the arguments made here.

Acknowledgments

The author thanks Hatem El Zein for helpful discussions and comments that significantly improved the presentation.

References

Althusser, L. (1970) Ideology and ideological state apparatuses (notes towards an investigation). Available at: https://www.marxists.org/reference/archive/althusser/1970/ideology.htm (Accessed: 29 January 2023)

Althusser, L. (1972) ‘Ideology and Ideological State Apparatuses: Notes towards an Investigation: Lenin and Philosophy and Other Essays’, in A. Scott and K. Nash (Eds.). New Critical Writings in Political Sociology. New York: Monthly Review, pp. 85-126.

Althusser, L. (2010) ‘Ideology and ideological state apparatuses’, in I. Szeman and T. Kaposy (eds.) Cultural theory: an Anthology, Wiley-Blackwell, 204-22.

Baudrillard, J. (1983) Simulations. New York: Semiotext (e).

Belsey, C. (2003) Critical practice. Routledge.

Berlo, D.K. (1977) ‘Communication as process: Review and commentary’, Annals of the International Communication Association, 1(1), pp. 11-27.

Berners-Lee, T. (2009) ‘The next web’, TED Talks.com. [Online]. Available at: https://www.ted.com/talks/tim_berners_lee_the_next_web (Accessed: 10 June 2022).

Carey, J. W. (2008) Communication as culture, revised edition: Essays on media and society. Routledge.

Foley, J. D., Breiner, S., Subrahmanian, E. and Dusel, J. M. (2021) ‘Operads for complex system design specification, analysis, and synthesis’, Proceedings of the Royal Society A, 477(2250), 20210099.

Fuchs, C. (2003) ‘The Role of the Individual in the Social Information Process’, Entropy, 5(1), pp. 34-60. https://doi.org/10.3390/e5010034

Garite, M. (2003, November) ‘The Ideology of Interactivity (or Video Games and Taylorization of Leisure)’, In DiGRA Conference.

Gauntlett, D. (2002) Media, Gender and Identity. London: Routledge.

Giddens, A. (1984) The Constitution of Society: Outline of the Theory of Structuration. Cambridge: Polity Press.

Griffiths, D. J. (1987) Introduction to Elementary Particles. John Wiley & Sons. ISBN: 978-0-471-60386-3.

Guba, E.G. and Lincoln, Y.S. (1994) ‘Competing paradigms in qualitative research’, in N.K. Denzin and Y.S. Lincoln (eds.) Handbook of Qualitative Research, Thousand Oaks: Sage Publications, Inc., 105-117.

Harvey, D. and Williams, R. (1995) ‘Militant particularism and global ambition: The conceptual politics of place, space, and environment in the work of Raymond Williams’, Social Text, (42), pp. 69-98.

Hejase, A.J. and Hejase, H.J. (2013) Research Methods: A Practical Approach for Business Students (second edition). Philadelphia, PA, USA: Masadir Inc., 82-83.

Herman, E.S. and Chomsky, N. (1988) Manufacturing consent: The political economy of the mass media. New York: Pantheon.

Herman, E. S. and Chomsky, N. (2010). Manufacturing consent: The political economy of the mass media. Random House.

Karp, R. M. (1972) ‘Reducibility Among Combinatorial Problems’, in R. E. Miller; J. W. Thatcher (eds.) Complexity of Computer Computations, New York: Plenum, 85–103.

Lash, S. (2002) Critique of information. Sage.

Meyrowitz, J. (1986) No sense of place: The impact of electronic media on social behavior. Oxford University Press.

Meyrowitz, J. (1999) ‘Understandings of media’, ETC: A Review of General Semantics, 56(1), pp. 44-52.

Meyrowitz, J. (2001) ‘Morphing McLuhan: Medium theory for a new millennium’, in Proceedings of the Media Ecology Association, 2, pp. 8-22.

McLuhan, M. (1964) Understanding media: The extensions of man. MIT Press.

Patterson, G.H. (1990) History and Communications: Harold Innis, Marshall McLuhan, the Interpretation of History. Toronto: University of Toronto Press.

Postman, N. (1984) Amusing Ourselves to Death: Public Discourse in the Age of Show Business. New York: Viking Penguin Inc.

Schramm, W. L. (1954) The process and effects of mass communication. Illinois: The University of Illinois Press.

Senior Supervisors Group (2009) ‘Risk Management Lessons from the Global Banking Crisis of 2008’, Senior Supervisors Group. [Online]. Available at: https://www.sec.gov/news/press/2009/report102109.pdf (Accessed: 10 June 2022)

Shannon, C. (1948) ‘A Mathematical Theory of Communication’, The Bell System Technical Journal, 27, pp. 379–423, 623–656.

Shrestha, P. (2017, November 17) Skinner’s theory on Operant Conditioning. Psychestudy. Available at: https://www.psychestudy.com/behavioral/learning-memory/operant-conditioning/skinner (Accessed: 14 December 2022)

Sprott, J. C. (2003) Chaos and Time-Series Analysis. Oxford University Press.

Sonaike, S.A. (1988) ‘Communication and Third World development: a dead end?’, Gazette, 41(2) pp. 85-108. doi: 10.1177/001654928804100202.

Turner, F. (2021) From counterculture to Cyberculture. Chicago, IL: University of Chicago Press.

Yong, E. (2012) ‘Nobel laureate challenges psychologists to clean up their act’, Nature (2012). https://doi.org/10.1038/nature.2012.11535