The Liquid Self Longs for Solidity

Even in consumption, there seems to be a yearning for tangibility and permanence

Humans have always had a reciprocal relationship with technology. It comes as no surprise that the digital revolution has brought about the “liquid self” and that social media has spawned all sorts of expressivist individuals. 

From a purely economic perspective, these developments may be seen as an unalloyed good, judging from the astronomical market value of high tech firms, the global Covid pandemic notwithstanding. Benefits trickle down even to ordinary consumers who, through digitization and electronic platforms, find a near infinite variety of products within reach. The trick consists in transforming goods and services into digital information or “data”. Material objects then become experiences; the tangible, intangible; the permanent, ephemeral; and ownership reduced to access. Reading a book, listening to music, or watching a film has become incredibly easy, cheap, and quick, thanks to Amazon, Spotify, and Netflix, respectively.

Indeed, there seems to be no limit to how much of our lives can be cut up into byte-sized chunks. We find a place to stay on Zillow, order meals through GrubHub, move around town in an Uber, set up office at WeWork, hold meetings on Zoom, and keep files in Dropbox. Women can even choose their outfits from Rent the Runway (apparently, no such service exists for men). Our family is on WhatsApp, friends on Facebook, colleagues on LinkedIn, dates on Zoosk, idols on Instagram, and enemies, probably, on Twitter. And for whatever else we’re looking for, there will always be Google. That most of these services are offered for free distracts us from the fact that in exchange we are constantly being tracked and have given up our privacy, perhaps even things as intimate as our genetic data.

Yet despite the array of choices and conveniences, for the majority, life satisfaction remains elusive. Even in consumption, there seems to be a yearning for tangibility and permanence, quite unlike what one would expect from the liquid, expressivist self. Why is this?

Marketing professor Carey Morewedge and colleagues refer to a loss of “psychological ownership” through the different modes of digital consumption in the so-called “sharing economy”. Basically, it is the sensation that nothing is “mine” any longer. Instead, everything is subject to lending, reselling, renting, or streaming. The loss of attachment makes us sink into a sort of anomie. The lack of control thwarts our agency and our ability to find meaning in events that are fleeting and occur inexorably. Without an identifiable history or goal, we are rendered incapable of developing a sense of self.

The fatal flaw lies in believing we are no more than a flow of mental states or sensations and forgetting our natural embodiment. Having bodies means being moored, subject to time and place, regardless of Internet synchronicity and globalization. We require intimacy and security to grow and develop. We crave things to call our own, if only to protect our vulnerability. Possessions and property guarantee our freedom. We are, of course, relational and social, but never fully transparent individuals, even to ourselves. We want to be able to choose with whom we share our lives. Flourishing cannot be achieved with attention dissipated and existence fragmented to this degree.

Even the liquid, expressivist self longs for solidity and intimacy.

On Carebots: All I want to say is that they don’t really care about us

Barcelona has just rolled out a pilot program where carebots will be deployed to assist in the homes of the elderly living alone. The seniors will be closely monitored in their overall wellbeing before, during, and after the carebots’ stay, and findings will be used to fine-tune the carebots’ programming and use.

Around 90,000 elderly people live by themselves in Barcelona. Yet there is room for only 13,000 in publicly-owned or managed care facilities, with a waiting list of more than 6,000. Users of remote-assistance services have crossed the 100,000 threshold and about 40,000 homes require regular visits from welfare workers. Although the majority of senior citizens prefer to live in their own homes, city services are already overwhelmed.

Carebots, which combine a variety of state-of-the-art technologies, are a godsend in several ways. They are excellent communication centers through which the elderly can keep in touch with friends, family, and medical or emergency assistance personnel through video calls. They can stream the elderly person’s favorite movies and shows, run their music playlists, read them the papers, or even guide them through their regular fitness routines. They’re great for the early detection and management of a wide range of conditions when equipped with medical diagnostic sensors and health apps. Fitted with mechanical arms or an exoskeleton, they can be a trusty mobility aid, especially for the unstable or disoriented. They can even engage in what passes for intelligent conversation, often speaking with a pleasant and empathic female voice. Just observing them can be very entertaining. They are often cute, even cuddly. The perfect toy for grown-ups!

Yet there are serious potential downsides as well when we allow carebots to substitute for or even take over human caregiving. First and foremost is the risk of unhealthy emotional attachments when we depend on carebots to keep the elderly company. After all, carebots are always there for the seniors, 24-7, obeying whatever they’re told and never complaining. This is quite similar to the use of smartphones, tablets, and other electronic gadgets as child-minders. They may give parents and adults a breather; but then the kids get addicted and spend too much time on these devices, failing to socialize or get some exercise, and becoming prone to depression. Although meeting up with friends or going to the gym isn’t exactly a top priority for the elderly, especially in these times of Covid, something’s both wrong and false about relying on a machine to satisfy their social, emotional, and higher-order needs. It’s dehumanizing, if not outright inhuman (so it’s not just a technical problem of how “smart” and functional carebots can become).

As they grow up, we teach kids the difference between things, living things, and fellow humans, so they learn to treat individuals belonging to each of these groups appropriately. We may at first laugh or find it funny when small children start talking to Alexa or Siri; but we’d have reason to worry if such behavior were to persist, because they’re supposed to get over it sooner rather than later.

Why, then, should we be complacent upon hearing Grandpa engage in a spirited conversation with the carebot, just because he’s developing Alzheimer’s? Has it become ok to infantilize him? Does he now deserve anything less than human care?

It would be a mistake to think carebots are a remedy for the loneliness and isolation many of our seniors suffer. Let’s not fall into this collective delusion. For these ills, there’s no substitute for the warmth only other human beings can offer. Pets such as dogs can help, as they are alive and, as higher-order mammals, share some of our emotions. But they cannot take our place either.

As for carebots, despite the name, they don’t really care about us. It’s not because they don’t want to; they just can’t. To care is to experience certain feelings and emotions towards others; to want and desire what’s good for them. Generally we know what that is because we want and desire the same thing for ourselves. But to be capable of this, one needs to be alive and embodied, something which the carebot is not. No programming skill or robotic technology can solve that. 

The carebot is not a tin man and there is no wizard of Oz. 

Barcelona city officials should take note.

Master and Commander. The Dialectic of Human-AI Engagement


The “Master-Slave Dialectic” in Hegel’s Phenomenology of Spirit is perhaps among the most commented-upon passages in his work. It has been used to explain a wide range of processes, from how the human species evolved from lower life-forms (“hominization”) to the psychological development of children, from the transformation of societies through industrialization to the history of nations as they progress into sovereign states. It could also serve as a lens through which to examine the trajectories of human-AI engagement.

The “Master-Slave Dialectic” is a conceptual construct, an idealized story of how two unequal individuals meet, experiencing a deep conflict or even life-threatening struggle in their joint quest for higher-level self-consciousness. As they go through different stages in their relationship, they come to realize how they inescapably depend on each other. Superior self-awareness can only come through recognition of and from the other; self-reflection can only be achieved through the mediation of the other as mirror. Although to affirm the self one needs to deny the other, at the same time the other turns out to be necessary even just to forge the notion of self. Despite the inequality, there is mutual, reciprocal need between them.

AI represents the latest episode in the grand scheme of technological advancements. Like all tools, whether as computer algorithms alone or as algorithms embedded in robots, AI was invented by humans to make work easier, save time, and make life more pleasant by freeing it from drudgery. AI accomplishes this not only by assisting in routine tasks, but also by augmenting and enhancing human agency, through precise medical imaging diagnostics, for instance. So in that respect, we humans are masters, and AI is our close-to-ideal slave. (Indeed, “robot” comes from the Czech robota, the forced labor a serf owed in exchange for his tenancy on farmland.) Yet there is a constant danger of dependence, exacerbated by the fact that AI can now do tasks, such as driving, previously imagined to be exclusive to humans. And here is where the dialectic comes in. By learning to perform activities even better than humans can, might AI one day lord it over us? Could AI in the future be the master and we the slave?

I believe not. For reasons I won’t be able to explain in this essay, strong AI (artificial general intelligence, not to mention superintelligence) will always be consigned to the realm of science fiction. It will never constitute a real threat.

There are limits to Hegel’s master-slave allegory applied to human-AI relationships. First, despite the original inequality, the master did not create the slave; rather, they found each other almost by chance, as previously existing individuals. Humans, on the other hand, create AI from scratch, although they are unable to endow it with life. AI depends entirely on humans for its existence, its reason for being lying in its instrumental value for the goals set and determined by humans. Between humans and AI there is no intersubjectivity. No matter how expert and efficient AI becomes, surpassing humans in particular tasks, it can never establish its own purpose; it has no preferences or desires; it experiences no satisfaction or fulfillment. The ends of AI will always be extrinsic; it is a slave by “nature”.

In the Hegelian version, inexorably, the master becomes dependent on the slave, even as the slave, in turn, becomes dependent, not only on the master, but also on nature as the source and store of raw materials for his work. By carrying out productive activities for the master, however, the slave develops intelligence, skills, and creativity, grounds for dignity, recognition, and heightened self-knowledge. The master, by contrast, regresses to a life dedicated to consumption and enjoyment, becoming no different from individuals belonging to irrational life-forms. The lack of work pushes the master unwittingly into a slavish existence.

Troubling as this twist of fate between master and slave may seem, it is far from inevitable. Not being alive, AI will never develop intelligent consciousness, no matter how hard or how well it works. Moreover, the actual danger lies not with AI, but with humans themselves. There are instances in which dependence on AI is not a bad thing at all: think of robot bomb-defusers. Yet admittedly, there are occasions in which over-dependence on AI causes distinctive human powers to atrophy. Why bother to remember telephone numbers, birthdays, addresses, and the like when there are voice-controlled assistants like Siri, Alexa, or Cortana? What’s the point in learning maths, memorizing poetry, or even learning a language with Google’s ever-expanding array of apps?

Basically, it boils down to a problem of moderation or temperance. Humans ought to make every effort to retain, not relinquish dominion. They have to be on guard against allowing AI to do whatever they don’t want to, be it homework or driving, without considering why. Lacking nothing but a soul, AI will never refuse to do their bidding. Humans may then be spared immediate pain and suffering, but at the steep price of losing higher-order agency and fulfillment. 

AI will only be as good as its human master and commander.   

Virtues Beyond Recognition?

Although most ethical discussions revolve around virtues such as fairness, honesty, and kindness, they almost always conclude with the mere formulation of general rules and principles, the New Zealand scholar Liezl Van Zyl observed. Trying to explain this disconnect motivated “Virtue Ethics: A Contemporary Introduction”, a slim volume where she suggests, among other things, that virtue ethics’ central claims are at once frequently misunderstood, rejected, and accepted all for the wrong reasons.

I share Van Zyl’s puzzlement. As one educated in the Aristotelian-Thomistic (and Catholic) tradition of virtue ethics represented in recent times by Anscombe and MacIntyre, I repeatedly encounter propositions which seem not only foreign, but even contrary to what I have been taught. Let me name a few: the development of a non-normative virtue theory (as opposed to virtue ethics) focusing on the nature and possibility of virtue; the existence of a non-eudaimonistic virtue ethics (that is, not premised upon human flourishing as telos or final end); virtue ethics without rules (Taylor); virtue ethics without practical wisdom; sentimentalist (Slote) and pluralist (Swanton) virtue ethics; and consequentialist (Driver) and deontological (Johnson) virtue ethics. If the allegations on which these new varieties are based are true, then the rejection of the very existence of the virtues as character traits by social psychologists (Harman, Doris, Merritt) should surprise no one. Instead of hollowed-out virtues, we should really just be studying the external contextual features and non-conscious cognitive processes that influence human conduct (“situationism”). Factors such as the number of people in a room, the wearing of white coats or lab gowns, the smell of freshly baked cookies, or finding loose change in phone booths, for example, can have much greater effects on our behavior.

But is this, in fact, the case? Has virtue ethics evolved beyond what Aristotelian-Thomists would recognize? Does Aristotelian-Thomistic virtue ethics have the resources to clarify the points of confusion and respond to novel objections? And more importantly, would the effort matter for the actual practice of virtue ethics, specifically in the domain of business and management?

Below is a brief attempt to make sense of these quandaries. 

Virtue ethics, like all ethics, ought to be normative, providing guidance on what to do or to avoid. Virtues are not abstract ideas or even values simply describing mental qualities or states of affairs. As dispositions to action, they show human beings how to behave, changing the world (and themselves) to be more in accordance with their desires, for the better. Indeed, it would be difficult to justify inquiries into the possibility conditions and nature of the virtues without linking them to how best to live. 

And why do we pursue the virtues? We seek them primarily because we believe they lead to flourishing (eudaimonia), of which they are necessary, partially constitutive elements. Not that virtues are sufficient for flourishing (as the Stoics held), because flourishing also requires possessing adequate material resources, which virtues are unable to guarantee. As distinctly human excellences, virtues may be desirable in themselves, although not as the telos or final end for the sake of which we carry out all our activities. Flourishing alone can give the rational, overarching explanation that the chain of human activities needs in order to make sense. In this regard, Aquinas’ main challenge in adopting Aristotle’s teachings was how to establish the connection between flourishing and the Christian God.

Virtues as excellent character traits do not exclude rule-following, particularly in the case of exceptionless prohibitions (“moral absolutes”). Certain actions (think torture) simply cannot be performed in a morally excellent way, such that they contribute to flourishing. Rules are an instructive first step in acquiring the virtues. Being virtuous implies at least having embodied the rules which define the scope of choiceworthy actions. Thus, virtues are not only compatible with, but also complementary to, observing the Decalogue or divine-positive law, natural law, and the eternal law. Virtues perfect adherence to the law.

Practicing the virtues would not be possible without practical wisdom (phronesis), the disposition to do the right thing, the right way, for the right reason, and in the right circumstances. We are all too familiar with counterexamples of right things done wrongly, with crooked intentions, at the improper time or place, and so forth. Think, for example, of a student correcting a teacher in class with ridicule, to show how smart they are and put the teacher to shame. Clearly, in this case, cleverness is not a virtue, as it is used not to help others, but to embarrass them. The student displays a lack of practical wisdom. That is because virtues are exercised in particular situations, not in general, and therefore require circumspection. Moreover, as distinctively human excellences, the different virtues (justice, courage, temperance, and so forth) require each other to some degree, and practical wisdom ensures that unity and harmony. If the student above were truly virtuous, she would show courage in pointing out the teacher’s errors; justice, because the truth is owed to all; and temperance, by choosing to do so politely. How exactly to perform this action cannot be rule-driven, but only determined by practical wisdom.

Virtues are multi-track, covering a whole range of dispositions, beginning with inclinations or tendencies, through actions, habits, and characters, to lifestyles, all of which are connected to each other by means of some sort of feedback mechanism. The repetition of actions not only reinforces the tendencies from which they spring, but also serves to develop specific habits. Different habits, in turn, configure characters, orienting these to concrete lifestyle choices. Virtues represent excellence at each of these operational levels. Hence, it would be a mistake to call “virtuous” those who experience appropriate or benevolent emotions alone, as sentimentalist virtue ethics is wont to do (Slote), without evaluating the actions carried out, the habits developed, or the kind of character forged, and regardless of whether such emotions are in conformity with reason (feeling less sympathy for a fellow human in need than for a cat, for instance). Being motivated by sentiments of goodwill is by itself insufficient for virtue. Similarly, virtues cannot be reduced to “opportune responses” to the world’s demands in terms of values, status, relationships or bonds, and flourishing, with nothing common between them, as Swanton’s pluralist virtue ethics contends. Virtues are not specific skill sets for reacting to peculiar exigencies of values, status, relationships, and flourishing, but the integral perfection of human beings as such.

Virtues focus primarily on agents, those who ultimately hope to reach moral perfection and achieve flourishing, not on actions. That’s why virtue ethics is different from consequentialism, which judges actions based on cost-benefit analyses, and from deontology, which evaluates actions depending on whether they are in accordance with laws and performed for love of them, out of a sense of duty. Certainly virtue ethics takes into account obedience to laws, as seen above, but always against a background of justice. Likewise, virtue ethics is sensitive to harms and benefits resulting from actions, but without ignoring what is morally right. 

From the aforementioned we can see how virtues allow sufficient control over our actions, and through time, even over our lives as a whole. We are not as helpless or manipulable as situationists assert, despite being subject to contextual or environmental features and non-conscious biases in our actions. We still retain our freedom to choose our actions in accordance with what is rationally defensible, although both freedom and reason are affected by the level of virtue we have already achieved. The virtuous are not only able to think more clearly, but to act more freely as well, notwithstanding inescapable limitations.

Such an understanding of virtue ethics is consequential for its practice in business. First, because it furnishes a conceptual architecture that explains the role of business and its contribution to the common good of flourishing within communities. Wealth-creation ought never to be an end in itself. Secondly, it elucidates the central role of personal moral agents in the quest for integral moral perfection. Moral excellence is not simply a matter of acting in accordance with and for love of the law, but of doing things well and allowing for the transformation of desires and emotions (not their suppression) until they are directed toward the good. And lastly, virtue ethics shows how personal moral excellence can be combined with professional entrepreneurial and managerial excellence. Business activities can be reimagined as MacIntyrean practices with their own internal goods; these goods can then be embedded in individual biographies amidst the challenges of different role-conflicts; and such nested goods can finally be employed to further advance traditions for the shared benefit of individuals and communities. This triple scaffolding affords the virtues substance and sociological relevance across a variety of domains. It also reveals the falsehood of the supposed dilemma between moral excellence, on the one hand, and business competitiveness, on the other.

“Who’s Minding the Kids?” Empathy for Parents Working from Home

Last Sunday, I checked in by phone on my friend and fellow faculty member whom I hadn’t seen during the whole of the previous week. As I feared, the whole family was on self-isolation. An aide at the nursery where they brought their kids had tested positive for Covid-19, and as a precautionary measure they all remained home while awaiting the results of their PCR tests.

At my university we took the most difficult option of teaching in person and remotely at the same time (which earned us record enrollments despite a tuition fee hike). If necessary, faculty could always teach from their own homes or offices. So even with my colleague staying home, there is no reason for anyone to complain. Or is there?

In tech companies such as Facebook, Twitter, and Salesforce, workers with kids who stay home during this pandemic have been taking heat from co-workers without parenting duties. The latter complain that they have to pick up the slack at work for the same pay. Accommodations for parents working from home eat up company resources with no added benefit. Some admit they simply feel lonely at their jobs and miss the company, and their mental health takes a dip. More than just friendly colleagues, they seem to regard coworkers almost as surrogate family. What’s telling is that these grievances come from employees who feel they have sacrificed their youth and fertility to the firm, renouncing families of their own. Now they feel shortchanged, with the company’s attention and care being lavished on others. It’s understandable that they feel jealous, but are they right? Is the arrangement unfair?

It would be wrong, perhaps even illegal, to discriminate against workers for becoming parents, forcing them to choose between their jobs and their children. If anything, working from home means working more, not less, as they also have to mind the kids and supervise their remote schooling. A common observation is that working from home often translates into extended hours, as boundaries blur and job and family tasks bleed into each other. Then there’s the tendency to overcompensate at work, precisely because from home there’s no face-time with the boss. (The fear of being the last to get promoted or the first to get laid off helps.) Working moms with small children are especially vulnerable and run the risk of dropping out of employment altogether. Yet a study on Chinese call center workers reveals greater productivity (processing more calls) at home, not only due to longer hours, but also because it was quieter; besides, people took fewer sick days.

There are several ways in which parents working from home are actually less of a burden on company resources. They pay their own utilities, such as electricity, water, and internet connection. They free up space for social distancing requirements and help shorten elevator queues at office towers. More importantly, it is easier for them to remain Covid-free, and even if they fell sick, they wouldn’t contribute to spreading the virus.

Most of today’s workplaces, sadly, were designed for a different age, when professional, productive activity was centered on physical paper and measured by bundy clocks. Not that we’ve gotten rid of “paper-pushing”, but with the current IT and telecom infrastructure, most of it is done metaphorically now, through attachments. And communication, at least the transmission of data and information, of voice and images, has never been easier, both synchronously and asynchronously. So there’s hardly any need for people to be present in the same physical space for work to proceed. That’s the case even in education, it is claimed, although the jury is still out as to how effective the experiment will be. 

So it’s not unfair to give working parents some margin to mind their children and work from home, especially during the pandemic. The same would apply to taking care of sick relatives or the elderly. Nothing due is taken away from single, unattached employees or those without children by allowing for this set-up. If one day they were to find themselves in the same situation, they too could benefit from the same largesse. Maybe a little more empathy among tech workers is in order.

As for my friend and his family, I’ve just learned they’re all free of Covid, thankfully.

Forgiveness and Flourishing

Flourishing is the basic premise of most ethical thought. Ethical commands and prohibitions, warnings and recommendations only make sense to the extent that we desire to achieve flourishing, individually and as members of communities. A few years back, I wrote a book that tried to integrate social-science inputs from economics and experimental psychology with fundamental philosophical insights on virtue and how best to reach flourishing. I explored the effects of income (both absolute and relative), pleasures and satisfactions, work and leisure, and the quality of institutions (familial, religious, and political), among others, on our shared quest for the good life. However, I failed to take sufficiently into account the imperfect, fragile, and oftentimes broken condition of human nature as found and manifested in each of us. I forgot to make room for forgiveness, and I now attempt to remedy this major oversight.

Forgiveness differs from merely condoning an offense. To condone is to excuse by bringing oneself to ignore an offense; it’s to look the other way, even to the point of deluding oneself that perhaps the offense never happened. To forgive, by contrast, requires looking the offense and the offender straight in the eye, face to face, and acknowledging all the evil and hurt they have wrought. To forgive is painful.

Forgiveness also differs from reconciliation. Forgiveness is an internal process by which people come to terms with their suffering and bitterness, and as a result are able to see their offenders in a more positive light. No longer seeking vengeance, they are now capable of moving forward with their own lives. Reconciliation demands a lot more. It entails the possibility of once more imagining a shared future with the offender and reestablishing trust. Thus, it’s perfectly understandable that even after forgiving, some people are unwilling to go the full journey toward reconciliation. It’s possible to forgive without reconciling. Yet although forgiveness and reconciliation may just be different stages in the same trajectory, there’s reason to believe that reconciliation is forgiveness’ most coveted fruit, its highest aspiration or perfection. So let’s not renounce reconciliation too easily and deprive ourselves of its benefits.

Most people turn to forgiveness initially seeking healing and interior growth. Psychotherapists describe a two-step process toward this goal. First is the need of the aggrieved to share their story in a safe, listening environment, without being judged. The objective is not so much to receive sympathy or advice as the chance to get an oppressive load off their chest, literally to vent. By sharing, the aggrieved are somehow able to put their pain into words and distance themselves from it. Next is the effort to consider the offender’s standpoint, expanding one’s perspective to at least catch a glimpse of the moral complexity behind hurtful decisions and actions. The aim is to allow some space for empathy, without releasing the offender from responsibility or shifting blame onto the victim.

What tends to be overlooked, however, is how the aggrieved person’s forgiveness and the aggressor’s “self-forgiveness” depend on each other for their success. Albeit through different paths, both parties have to be made whole again. To reach “self-forgiveness,” psychotherapists describe traversing different phases: responsibility, remorse, restoration, and renewal. Responsibility means personally owning up to the fault; remorse, embracing guilt; restoration, rebuilding the damage; and renewal, starting life afresh. Each phase has its own set of challenges, but I will focus only on the last two.

Restoration begins by humbly asking for pardon. Sometimes full restoration, strict restitution, is not possible; the dead cannot be brought back to life, nor certain losses adequately compensated, for example. Both parties have to be realistic enough to acknowledge that. Otherwise, the only option is to descend into a bottomless spiral of revenge. That’s why victims and aggressors both have to pin their hopes on the possibility of renewal, when aggressors, from then on, solemnly commit themselves to working toward the victims’ wellbeing, over and above their own, in reparation. Renewal enables the offended to exchange vengeful desires for thoughts of compassion, while permitting the offenders to abandon self-condemnation and recover self-respect.

In its forward-looking aspect, renewal is hardly distinguishable from reconciliation. As Arendt observed, drawing from Jesus’ teachings, vengeance is re-active, absolutely determined by past evils, while forgiveness (without ignoring or erasing the past) allows victims and aggressors to act anew, giving them the freedom to shape the future afresh. Reconciliation is nothing else but building that future together, transforming a former foe into a faithful and dear friend.

Perhaps there’s nothing of which we stand in greater need in our families, organizations, and communities than forgiveness, weak and broken people as we are, despite our best intentions. Instead of giving in to outrage at offenses, canceling and calling each other out, let us take our chances at reconciliation. Not only is that a way of “drowning evil in an abundance of good,” but it also happens to be a sine qua non of ethical life and flourishing in a world less than ideal.

Rest

Rest is not only what’s left once you’ve done everything; it’s also a right. But to what?

Man was made to work, and even more so, to rest. Rest is not only what’s left after one has done one’s job and fulfilled one’s various duties. It’s not just to stop working, taking a momentary pause, as it were, to refresh and recharge in preparation for future endeavors. Neither is it simply to spend time smugly enjoying the hard-earned fruits of one’s labor.

Rest is also a right. But to what? 

Firstly, to leisure, what the ancient Greeks called “scholē”, from which the modern English word “school” comes. One went to “school” not for credentialing or job-training, but to learn how to properly engage in leisure, which was the mark of a cultivated man. Obviously, this opportunity wasn’t open to everyone, only to a privileged few (citizens or land-owning males) who had free time because they didn’t have to carry out servile tasks in farming, artisanry, and commerce. They could count on other people to do this for them. Such a “division of labor” gave this class of school-going elites the chance to take part in seemingly endless discussions about their natural surroundings and, above all, the city, giving rise to philosophy and politics. Note that resting or engaging in leisure didn’t require traveling far to some exotic place. Their idea of a holiday or vacation was very much a “staycation”; in any case, always within the confines of the city.

A unique achievement of the Judeo-Christian tradition is to have extended the right to rest to all human beings, not only to the elites, as had been the case in perhaps all ancient civilizations. This was done in obedience to a divine precept admonishing everyone, represented by the chosen people, to imitate God, who rested on the seventh day after finishing the work of creation. Rest, now, was much more than just engaging in philosophical discussions, as the ancient Greeks did. It meant to do as God does and to contemplate, leaving aside concerns that may be pressing but are, in the end, ephemeral. To rest is to try to live, in the measure possible, in God’s eternal present.

How are we to do that? 

We rest by contemplating nature, which delights the senses and inspires the mind: “God looked upon all that He had made, and indeed, it was very good” (Gen 1:31). We also rest by entering into friendly dialogue with fellow human beings, starting with members of our own families, discovering with wonder similarities and differences among us of amazing richness. Yet more than anything else, we rest by contemplating God’s mystery, at once transcendent and intimate, in prayer. None more beautiful, none more powerful, none greater; the truest of friends, whose words, in loving conversation, at the same time satisfy and make one hunger for more.


College, Post-Covid 19

I hope the global pandemic ends soon, because we’ve found a cure or discovered a vaccine. But until then, we’ll just have to learn to make do, despite life not being anywhere near normal. (Even those who claim otherwise know it.) What began as a health crisis has evolved into a full-blown systemic threat, affecting politics and international relations, the economy, and education, among others. I’ll focus on the impact of the Covid-19 pandemic on college education and what we can reasonably expect in the short to medium term. Most references will be to universities in English-speaking countries, although they would be relevant as well for institutions in Europe and other industrialized regions. In any case, seats of higher learning elsewhere could benefit by keeping an eye on these developments.

It’s common knowledge that the Chinese word for crisis, wéijī, is composed of two characters, one meaning “danger”, and the other, “opportunity”. We also know the English “crisis” derives from the Greek verb “krínein”, meaning “to decide”, thereby indicating an occasion when the need for crucial choice or thoughtful judgement comes to the fore. Let’s apply these cues to our analysis of college pre- and post-Covid.

Even before Covid-19 set in, college education in both North America and Europe was already in crisis, mainly for demographic reasons. In five years, the pool of applicants will have shrunk by a fourth, and by some estimates about 20% of US colleges, especially the smaller ones with fewer than a thousand students, may be forced to shut down. Part of the solution is to try to fill the halls with foreign students from developing countries, in particular China, who moreover can be charged full tuition. My own institution, a medium-sized private university in northern Spain, has about a quarter of its students coming from abroad, mostly from Latin America.

A second factor refers to costs and financing, although this is perhaps more acute in the US. In the past forty years, tuition fees have risen by 260%, double the inflation rate, such that a four-year degree could easily cost around $200,000 at a private college and $100,000 at a public one. University education, worth $5.8 billion in 2018, is Australia’s fourth largest export, after commodities such as coal, iron, and natural gas, and caters mostly to Asians. In Europe, the majority of universities are publicly funded, with no or very low fees that can be paid off with cheap loans. But the problem then becomes finding a job.

This was precisely the situation MOOCs (“Massive Open Online Courses”) sought to address in the early 2010s. Through the use of digital technologies, the marginal cost of each additional user would practically disappear as college-level instruction was broadcast to millions. Such initiatives were not free from difficulties, however, beginning with student motivation, retention, and degree completion, as well as economic sustainability, all of which significantly improved once MOOCs started collecting fees, however minimal.

Then came the Wuhan virus.

Covid-19 certainly did not cause all the troubles afflicting college education, but it served to exacerbate them. First, by preventing the classroom gatherings where most traditional instruction took place. The loss of personal contact was worsened by lockdowns or grave restrictions on freedom of movement among people scattered across different time zones around the globe. Many national borders are still closed, and some warn they will remain that way at least until Christmas. For sure, not all international students will be able to return to school in September. Second, by the economic fallout, with all non-essential business put on hold. Not only government revenues but private incomes as well have taken a big hit, such that students and their families begin to question the value of a college education. We know the price, but is it worth it? No one has had to grapple with this existential question as much as the Class of 2020, as they look for jobs under the worst labor market conditions since the Great Depression.

So how will college be transformed in the wake of Covid-19?

Pundits speak of at least three different models.

First is the “Cyborg University”, which is like MOOCs on steroids, offering everything online. The only difference now is the buy-in from Big Tech, poised to partner with the best brands in education to cash in on the tremendous growth opportunities. Previous not-for-profit joint ventures such as Harvard/MIT-edX and Stanford-Udacity/Coursera could now morph into Udacity/Google-Amazon and Coursera/IBM. They’d pay star professors handsomely for broadcast lectures, while an army of TA-equivalents would be given a pittance for the nitty-gritty of student engagement. Once more this illustrates the “Matthew effect”: to those who have, more shall be given, while from those who have little, even that will be taken away. Presumably there’d be limited subject offerings, most of which would be skills-based and immediately job-friendly.

Second is the “Parallel University” model, with a premium offline and a standard online option. The University of Michigan and the Georgia Institute of Technology, for instance, have gone down this route with some full degree programs. This formula introduces a caste system of sorts even within the same institution.

Third is the temporary “Hybrid” model combining online and offline teaching, without renouncing the residential college experience to the extent health conditions permit. On the one hand, international students may be stranded in their home countries, unable to travel; on the other, locals may be caught in a lockdown or forced to self-isolate because they’re sick or have been in close contact with someone who is. In any case, college facilities cannot simply expand to accommodate everyone while observing mandatory social distancing measures. The stop-gap alternative to classroom teaching then becomes virtual, online instruction, in both synchronous and asynchronous modes. But who would pay tens of thousands of dollars for what amounts, essentially, to a series of Zoom sessions? The hefty price tag would be extremely difficult to justify. So every effort must be made to compensate for the loss of personal engagement through staggered attendance, modified calendars, campus testing and tracing, social bubbles, and technology.

The dangers and opportunities of these post-Covid college formulas are clear. Now how do we choose?

To decide which among the three models fits best, individuals should consider what they really pursue with a college education and why. For some, it might be mere credentialing, having a certificate attesting that they’re legally up to the job or function they wish to perform. For others, it might be gaining some instruction, perhaps not much different from the information available on Wikipedia or the practical knowledge imparted on YouTube. But there will be others who truly seek the full college experience: a period of intense learning and socialization with professors and classmates at a special developmental stage, not only to form a dense web of contacts to move forward professionally, but more importantly, to become the best version of themselves, intellectually and morally, and serve society.