In February, I was walking past the TV as my husband was watching Episode #649 of Real Time with Bill Maher when Maher exclaimed, “New Rule. Enough with the foot washing! From the Bible to the Super Bowl, it is the 2,000-year-old kink that makes Christians go Amen but the rest of us go yuck.” I turned and, pointing emphatically at the TV, blurted out, “Sir, I disagree! What we need is more foot washing!” Maher’s remark caught me because almost two months earlier I had submitted my first essay for the Virginia Tech Institute for Leadership in Technology fellowship, in which I included musings on a scene from Tracy Kidder’s 2023 book Rough Sleepers that involved—you guessed it—foot washing. There’s a resonance to this symbolism that I don’t want to lose to Maher’s weary cynicism, a resonance that can be applied to how we in tech can rethink what it means to lead—if we can get past whatever baggage foot washing imagery holds for us.

As one of the eleven inaugural Humanities in Technology fellows at the Virginia Tech Institute for Leadership in Technology, I hold the motto Ut Prosim—that I may serve—at the forefront of my mind as a guide, like Florence Nightingale’s lamp in the darkness. What does service look like through the lens of a leader, a steward of technology? This fellowship has offered us a chance to read widely and deeply, works ranging from one of my favorites, Sun Tzu’s Art of War, to (for me) maddeningly complex pieces like Ibn Rushd’s Long Commentary on the De Anima, to the soulful conversations between Jiddu Krishnamurti and David Bohm. When discussing how to solve human problems, Krishnamurti said, “Seeing the necessity that we should all come together, people still cannot give up their opinions, their ideas, their own experiences and conclusions.”

People fail to come together to solve problems because they cannot give up their opinions. Think about some of our tech leaders today: do we count them among the open-minded, ready to question their own experiences and conclusions, putting aside their ideas and opinions to consider the views of the other? Marc Andreessen published “The Techno-Optimist Manifesto” last October, during our fellowship. Through our new lens of the humanities, I did not come away from that piece thinking, “Here is a humble leader with a nuanced view of the power and responsibility of technology.” Nor did the authors of the TechCrunch piece, “When was the last time Marc Andreessen talked to a poor person?” The authors write, “The missing link here is how we can use tech to actually take care of people; how to feed them, clothe them... What is missing here is that San Francisco is already the tech hub of the world and is one of the most unequal places in the universe, both socially and economically.” One could say the same of Seattle, headquarters to many tech companies living side by side with a growing number of men and women experiencing homelessness—up 24% since 2022. This critique picks at a worry I have about technology: that for all the glittering promise of a world made better by it, the reality is that we’re more in love with the tech itself than we are with using it to improve the lives of our fellow humans.

What brave new future are we waiting for, why can’t we do that right now?

Andreessen, who suffers from techno-hubris, wrote, “We believe that there is no material problem – whether created by nature or by technology – that cannot be solved with more technology.” In a similar vein, Ray Kurzweil—we read excerpts from his book The Singularity Is Near in November—writes that “emerging technologies will provide the means of providing... the knowledge and wealth to overcome hunger and poverty.” Pardon my skepticism, gentlemen, but these breathy pronouncements seem to fly in the face of the facts. Hunger and poverty have yet to be overcome despite decades of unprecedented growth in wealth and technological advancement. They are problems of human values and priorities: the wealth and technology needed to address them, globally, exist today, but we lack the political will and resolve to direct sufficient capital toward problems that have plagued humankind for generations. Artificial intelligence (AI) advocates have made similar statements, asserting that emerging AI technology will usher in an era when we can expect to solve some of society’s most pressing problems. To this, I and surely others ask, exasperated: what brave new future are we waiting for, and why can’t we do that right now?

A call to ordinary flawed people

We don’t want saints and zealots. We want ordinary flawed people who are going to do the job.

—Barbara McInnis, in Rough Sleepers by Tracy Kidder

Let’s go back to the reason I was yelling at Bill Maher on TV: the foot washing scene in Kidder’s book, which follows the career of Dr. Jim O’Connell. After completing his medical residency at Mass General in 1985, O’Connell is offered his first assignment—at a homeless shelter. His other option is a prestigious fellowship in oncology at Sloan Kettering. To the surprise of many, he agrees to become the first doctor at the Pine Street Inn homeless shelter in Boston, but his humanistic gesture is not received with open arms by the nurses at the clinic.

O’Connell was told by head nurse Barbara McInnis that he’d been “trained all wrong.” She added, “You have to let us retrain you. If you come in with your doctor questions, you won’t learn anything. You have to learn to listen to these patients.” His first assignment was to wash the feet of the homeless patients—which he did, and little else, for nearly a month. During this time, he recognized a number of the homeless men he’d treated at Mass General—often without success. Having failed to get medicine into these men at Mass General, he resigned himself to patiently washing their feet at the nurses’ clinic. That is, until one day a former patient of his, a disheveled man with a 25-year history of medical non-compliance whose feet O’Connell had been washing for several weeks, began a conversation with the doctor that led to the man finally accepting medical treatment.

This story is an example of narrative medicine, a form of writing taught by one of the professors in our fellowship program, Dr. Timothy Kirshner. Narrative medicine aims to humanize the practice of medicine and restore personhood to patients—an instructive medium as we seek to humanize the practice of technology. What struck me here is the patience and humility displayed by a recent graduate of the storied Mass General medical residency program. The nurses at the Pine Street clinic set O’Connell on a path to putting aside his ego and just being with the people he was there to serve, to truly hear and understand what they needed. McInnis’ remark that we want ordinary flawed people who are going to do the job is a call to all of us—we do not need to wait for saints or zealots, nor do we need to be perfect ourselves or at the top of the industry to make a difference. If the measure of a society is what it does for its most vulnerable citizens, is the measure of a tech leader what we do to improve the lives of those most at risk in our society? And if so, how do we—ordinary flawed people—apply this lesson as humanistic leaders in tech today?

We listen. The explosion of AI offerings in the marketplace today is a fitting place to apply this thinking and get busy washing some feet. What does this mean? It means taking seriously the concerns of those on whose backs the data powering these models was built. When we hear objections from artists and creators, or posters on Reddit, angry that their output has been used to power what are now for-profit models, we sit with this discomfort; we take on their burden as if it were our own. We seek solutions from their perspective, not from ours. When concerned citizens ask about the environmental footprint of data centers, AI, and cryptocurrency as they continue to proliferate, we need to ask ourselves about the price of progress. If, according to the IEA, data center energy usage in 2022 was 460 terawatt-hours (TWh) and could increase to 1,000 TWh by 2026—equivalent to the energy use of Japan—we have to join these concerned citizens in asking where this energy is coming from and whether the trade-offs are worth it. And what about data colonialism, where companies pay low wages to workers in developing countries to label and tag the data needed to train large language models? Are we guilty of treating workers like disposable widgets in a tech transaction, or are we considering how to lift these human beings out of poverty with the power of this emerging multi-billion-dollar market?

In fact, are the enormous resources—both human and technical—being poured into AI worth the opportunity cost? Are we any closer to solving foundational challenges like homelessness or poverty with these vast investments—or are we yet again distracted from their urgency by “innovation”? These are not easy questions to answer, but humanistic leaders take the time to candidly and openly consider the impact on people in developed and emerging nations alike, on people of color, on people of varying means and health. When allegations of bias in search algorithms or AI models are raised, humanistic tech leaders don’t dismiss these concerns by claiming that the algorithms and models merely reflect biases and prejudices slurped up from the internet. Instead, we listen to these concerns, engage in debate, and ask how to do better. And we may, like Google with Gemini’s bizarre GenAI images of Nazis and Founding Fathers of color, stumble on the path to addressing bias, but we will keep having these conversations, we will keep listening, we will keep iterating. We definitely should not pull a Sergey Brin circa 2013 and shrug off concerns by claiming “people always have a natural aversion to innovation.”

Have we asked why people are resistant to innovation, or does our techno-hubris merely dismiss them as unenlightened Luddites? Innovation to one person feels like intrusion to another. In one of our last readings, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, Shoshana Zuboff argues that people have become so habituated to tech’s prying eyes that they are resigned to letting companies have their personal data. But make no mistake: resignation is not the same as consent. Is the woman who gives up fighting the persistent man consenting? No. She feels helpless. This pattern that tech uses to relentlessly push for “innovation”—photographing every corner of the earth, gathering extensive data for self-driving cars, mining and mapping our personal search and social data—Zuboff calls the dispossession cycle. The cycle comprises incursion (unilateral access to our private spaces), habituation (the pervasiveness of this incursion becomes an accepted norm), adaptation (the shapeshifting of tech companies to appear to comply with emerging regulations), and redirection (shifting the narrative to continue to justify the original intent). It feels a bit like: we are going to do what we want anyway, so you may as well accept it. Or as our weary woman is told, “I’m doing this for you, baby—sit back and relax, it is for your own good.” I intend that metaphor to be uncomfortable, and I invite you to stay with it for a little while.

Now what?

It has taken me a long time to write about what I’ve seen in the tech landscape, mostly because I feel like one of those ordinary flawed people who have no business speaking up. If you are questioning why I should have a voice, I assure you I ask the same question. I haven’t launched a hot cryptocurrency startup and made (then lost) billions. I don’t have a seat at any venture capital table where I can hand out millions in funding to shape the startup landscape of tomorrow. And I am certainly not leading a Fortune 50 company (few women are), charting a brave new course in the business landscape while lining the pockets of institutional investors. But I live in this world too. I raise children in this world. I lead a team of emerging tech leaders in this world. I see inequality all around me and feel helpless before its vast intractability. If anyone should have a seat at the table, should it not be those of us whose lives are impacted by the decisions made by the unelected leaders in technology? Shouldn’t it be those of us who don’t have the answers, but know that we must come together and find those answers outside of our own opinions and experiences? Those of us willing to wash some feet and sit with the uncomfortable realities of our society?

We need more programs like the one here at Virginia Tech, where we lift up emerging leaders in technology and push them, via the humanities, to think differently about their work and their place in the world. To absorb centuries of human knowledge and thinking, and to consider how it applies to the thorny problems before us today. If you are reading this and thinking, OK, that sounds great, but I don’t have any time—I am worried about my job in this economy—what if I use my voice and my employer does not like it—who is going to listen to me?—I will tell you that I understand. My peers in the fellowship program understand. We have asked ourselves the same questions, but we have stepped forward anyway. I ask you to join us in casting a critical eye on where tech is taking us and who is taking us there, and in questioning our own individual roles in this journey. Join us in Ut Prosim—that I may serve—that we ordinary flawed people may serve. Let’s come together to drive a bold, humanistic vision for leadership in technology.

Disclaimer: The views and opinions expressed in this blog post are my own and do not reflect the policy or position of my employer. All content is written in my personal capacity and is not endorsed by or affiliated with my employer.