While there are many implications, I wish to focus on just four: (1) use of the information received, (2) privacy of the information we receive or share about ourselves, (3) blurring of lines between reality and fantasy, and (4) the fundamental nature of consciousness. I will use the term “technological propinquity” as shorthand for the slightly longer phrase “technologies embedded in humans.”
Information Overload and Decision-Making
The first and most obvious question is: what do we do with all of the information we are receiving about ourselves and our world? We already know more about our body and mind than ever before. We will also soon learn more about our buying habits and other propensities in our daily journey through life. Our colleague, John Krubski, is starting an institute on “Applied Decisional Sciences” – clearly, the challenge of propinquity is great with regard to what we do with the information we receive and the decisions we must make based on it.
This new world of propinquity requires a tolerance of ambiguity along with comfort with the new technologies (most of us can no longer live comfortably as techno-peasants). A major field of research known as “human factors” flourished in psychology over several decades. It concerned (among other things) the way in which people (such as pilots) gained access to and made effective use of complex information (such as altitude, speed, and pitch). This human factors field is even more important today as we address the challenge of making decisions based on even more complex information while navigating our own “airborne” journey. We are entering the world of “advanced human factors.”
Privacy and Our Exposure as Public Selves
Closely related to the issue of making use of the information we receive about ourselves is the accessibility of this information to other people. In many ways, we are more “naked” today than we have ever been—since Adam and Eve wandered through their garden. We now enjoy the services of “Alexa,” who waits for requests from us to provide information, but we also know that Alexa might be providing our requests and related information to other people and institutions. We own computer-aided television sets that provide us with easy access to many channels of information, but we also know (or at least suspect) that these television sets are monitoring our own actions at home. And, of course, we are aware of the extensive information being collected by outside agencies from our hand-held devices and computers.
What do we do about this matter – a trade-off between our access to information from outside ourselves and other people’s access to information from inside ourselves? This challenge is not just about law and ethics; it is also about what we want to disclose and what we want to keep private. It is about the multiplicity of selves we project onto and into the world (what Kenneth Gergen called our “saturated self”). Who are we, and what does this mean for other people in the world with whom we wish to (or must) associate? Do we just create an avatar of ourselves for other people to see in a digital world? Is there such a thing anymore as a true and authentic self who is known intimately by a few people (often set at a limit of about 150 people)? There are many psychological challenges associated with this management of the private and public self.
The Merger of Reality and Fantasy
There is the related matter of somehow discerning between reality and image-production (“virtual reality”) and how we integrate the two when we are wandering through the world receiving both kinds of information at the same time.
Young people around the world are already finding it much more interesting to date an avatar (a person who is able to digitally transform themselves). They never have to actually meet their “date.” And what do we do about the building of relationships with a “machine” that knows more about what we want than anyone with whom we are affiliated? As in the movie “Her,” there might be more to gain from an intimate relationship with a machine than from an intimate relationship with a person. There is a perspective in psychodynamic psychology known as “object relations theory.” This theory might be taking on new relevance with regard to the formation of relationships with technological “objects” (rather than real or fantasized people).
One of the dimensions of psychology that relates closely to this blurring of lines concerns the locus of control. We hope to control at least certain aspects of our life (a predisposition toward an internal locus of control) but know that much of the world around us remains out of our control (a recognition of an external locus of control). We no longer live in a small village where we know everyone and have some say about what is happening in this village. We might not have had much control over the weather (hence the need to play nice with the gods), but we could at least influence our neighbors. Now, with little control over many matters in our lives, do we find ourselves pulled toward a world of fantasy that we can control? Are the digital games we (or at least our children) play becoming more relevant than the real world in which we live? Do we build communities in a fantastical world because we can’t build communities in the world we actually inhabit?
What do we do about this pull toward fantasy and about intimate relationships with machines rather than people? How do we deal with a real world that seems to be beyond our control – or even our influence?
The New Consciousness
This fourth challenge is a real dilly! Our fundamental assumptions about not just reality but also human consciousness might be on the chopping block. Most of Western (and Eastern) philosophy has long assumed that there is some way in which we can reflect on our own thoughts and experiences. Human consciousness was assumed to be a unique (or at least highly developed) feature of human capacity – and a process that resided within each of us (rather than being shared by the entire community). While there are “intersubjectivity” perspectives in psychology and philosophy that suggest consciousness exists in the space (relationship) between two or more people, it was still a matter of human consciousness – not the consciousness of some machine. This might be changing: we might now be reasoning, deciding, and reflecting with the aid of very high-powered machines. We are about to leave not only the driving to the machine (self-driving cars), but also some (or much) of our thinking and reasoning.
As our colleague, John Krubski, has noted, the human brain is much more complex and refined than any computer that now exists (or probably will exist in the near future). However, there are still some domains where we would like some assistance from our technologies. And this assistance could end up capturing some of the work for which we might not want assistance—such as our sense of self and our capacity to reason, reflect, and make decisions. If some technology is telling us what ingredients appeal to our taste buds and what foods contain these ingredients, then we don’t really have to become discerning in our purchases at the supermarket (or in ordering food online). One little bit of consciousness might be lost when our choice of food is mediated by a machine.
This notion of lost consciousness is closely related to the other three challenges mentioned herein. If we are overwhelmed with information, if our boundaries between private and public are invaded, and if we are having a hard time discriminating between reality and fantasy (often preferring the latter), then we might be losing our sense of self and abandoning the hard work of making choices and reflecting on our own actions. We might be losing our unique consciousness (individual or collective). What are the psychological implications of this loss?
Technological Propinquity as a Dimension of Psychology
These multiple challenges are all interesting and perhaps something to write about. We can serve as Paul Reveres racing through the streets declaring that “technological propinquity is coming.” We are likely to be ignored, or at best folks will be curious and ask: what in the world are we talking about? Most importantly, there might not be any work open to these Paul Reveres—especially if technological propinquity, human-embedded technology, advanced human factors, and saturated selves are not in the human vocabulary, and if there is no domain of professional psychology devoted to these matters.
I would suggest that there is work available for these revolutionary riders. With knowledge of the kind of challenges addressed in this preliminary proposal, one might, as a psychologist, work with high-tech firms: how do we help prepare people to deal with the new propinquity, and where do we want to set the boundaries with regard to the human/machine interface? We might also find employment (or at least a consulting contract) working with health-based institutions on how they help their patients and clients handle the information they have received. This is where a cutting-edge alliance between psychology and behavioral economics will yield interdisciplinary expertise regarding the cognitive and emotional implications of human-embedded technologies.
There are other areas where knowledge (and wisdom) with regard to propinquity might be valued. Physical and mental health professionals will certainly be addressing issues of stress and overwhelm associated with this propinquity, as will professionals in the fields of executive coaching, leadership development—and advanced human factors analysis. Of greatest importance is a more fundamental observation: we don’t yet know how this knowledge will be of greatest value—the human-embedded technologies are changing too fast to make any kind of confident predictions about the psychological impacts and remedies. What we do know with confidence is that human-embedded technologies are here to stay and are growing in number and variety. Propinquity might be an obscure word, but it represents a fundamental revolution in the intimate relationship between humans and their technologies.