Compassion & Tech
This article was originally shared on Substack.
Happy New Year? Well, Happy Lunar New Year now. Regardless, may we live in less interesting times. I can’t really complain in the grand scheme of things, but things haven’t been optimal. Not in a terrible way, just regular third year of pandemic things.
Which is an excellent segue to sharing that I set up a merch store for STEAM Powered, and aside from the usual branded stuff (which is great, check it out), one of the designs is a very simple and to the point “This is not optimal”, which is an expression used by my circle of friends from university, and one that I feel is very relevant at this time.
What else isn’t optimal is the range of product images that adequately display the full line.
It’s available on mugs, hoodies, t-shirts, canvases, and journals so that you can express your underwhelmed displeasure any way you want.
Shameless spruik aside, I’d like to talk a little about compassion and technology.
Compassion and Technology
There are many thoughts about how technological advancement has led to a lack of compassion and empathy. Not only have numerous papers been published on the subject, but it is even incorporated into various science-fiction tropes such as the absence or development of humanity in AI, loss of empathy as a result of cybernetic enhancement, and the devolution of social mores as a result of increased dependence on technology.
But the idea of how we, as a society, have lost our sense of compassion manifests itself in our day to day as well.
An obvious example could be found in internet comment threads and trolling where the virtualised environment facilitates the dehumanisation of those with whom we interact and enables toxic engagements. But on a grander scale, people are being made to feel like subjects who are micro-targeted and analysed, often in ways that lack sensitivity or context. We are the data for algorithms to determine which products are more marketable, gauge political climate, and even guide public sentiment on sensitive topics.
It’s not all grim, though.
Compassion and its relationship with technology ties into so many other fascinating topics relating to humanity and spirituality1 that I won’t be able to do it justice today, but it’s what motivated me to look into how we currently incorporate compassion with technology.
Compassionate Technology
What prompted this area of exploration was a friend of mine sharing this video about Dr Morgan Barense (@morganbarense) and her app Hippocamera, designed to help people practise remembering. It was developed for those with conditions like Alzheimer’s, traumatic brain injury, or other forms of cognitive decline.
Memory loss can be damaging in so many ways, not just to those suffering from it, but to those around them as well, and I was struck by how what is actually a rather simple idea could have such a great impact. It was this app that prompted my wanderings in compassionate technology and allowed me to learn more about the brilliant things happening in this and related spaces.
Australia’s CSIRO developed an app called CALD Assist which supports healthcare clinicians by facilitating communication with patients from non-English speaking backgrounds without an interpreter. Clinicians can ask yes/no questions through the app, and the patient is presented with the question in written or spoken form in their chosen language to help with treatment and care. The questions are designed to cover most standard treatment workflows, and it’s an elegant solution that can make healthcare experiences less intimidating all around when there isn’t a common language.
No Isolation, based in Norway, develops warm technology to facilitate human contact. They have two products: a one-button computer called Komp designed to help people who struggle with technology stay connected with family and friends; and a telepresence avatar called AV1 that allows youths who cannot attend school in person due to long-term illness, or other factors outside of their control, to be remotely present in class and remain engaged academically and socially.
All of these are wonderful applications of compassionate technology, but you’ll notice that they tend to be in the health or medical space, solving problems around human interaction and communication when illness or isolation makes us vulnerable.
There are examples in other areas too, such as online meal-train services that help friends or family after surgery, childbirth, or bereavement. But as one person pointed out, it’s considered less socially acceptable to seek that kind of assistance for reasons relating to mental health. It’s a good point, and there are a lot of challenges around supporting areas such as mental health, abuse, and domestic violence in this way, but it also means there are opportunities here for innovative solutions that should be explored.
Compassion in Technology
It’s one thing to have compassionate tech, but we can’t build compassionate products without compassionate people. And to be fair, our fields don’t always support us in nurturing these qualities, especially when being objective, and in some cases clinical, is highly valued. But we need compassion and other soft skills, and the context that comes with them, or we’ll become so focussed on what we’re doing that we lose sight of what we’re trying to achieve.
Dr Audrey Lobo-Pulo spoke of how we place judgements on these ‘soft’ aspects of the hard sciences, but one is not better than the other. The hard and the soft work in concert.
So how do we become more compassionate in technology? It’s about switching your mindset and effecting cultural change. It’s creating an environment where compassion is a value that is embodied and encouraged from the top down, and, as part of that, shifting from elitism and seeking only “rockstars” who live and breathe tech to cultivating supportive, diverse, and collaborative spaces.
Putting these ideas into practice, April Wensel (@aprilwensel) founded Compassionate Coding to train technologists in emotional intelligence through compassion, mindfulness, and ethics. LinkedIn’s Executive Chairman and past-CEO Jeff Weiner also spoke of compassion as a priority in his commencement address at Wharton in 2018.
I’ve spoken with several guests on STEAM Powered about why we do need to teach ethics and the need for compassion in STEMM, and it’s in no small part due to other conversations I’ve had in the past about the ethics of what we do, or of SaaS ideas that tread the line of the morally grey2. While Jeff speaks of compassion in the context of business leadership, these principles that April highlights are important at all levels from leadership to implementation because we’re all responsible for our own impact.
We need to consider the work we do more holistically and with compassion because as Jeff said, it’s not just about the bottom line, or our end-users, but also our effect on the wider community.
STEAM Powered
There have been two episodes since my last musing, and while I firmly believe all of my guests’ topics are fascinating, I feel these two might hit a little closer to home for most people.
I spoke with Dr Anika Molesworth, farmer, scientist, and storyteller. In our conversation, we talk about Anika's journey and passion for communicating agroecology and climate change awareness.
Everyone talks about climate change, but if you need an angle with a more personal impact, Anika and I speak about the topic in the context of food security and the farmers who produce the food on our tables, and who are seeing the effects of climate change first-hand.
I also spoke with Dr Audrey Lobo-Pulo, liminal technologist. In our conversation, we talk about context and bringing the human element to the forefront of what we do with data and technology rather than as an afterthought.
Audrey introduced me to the concept of warm data: where cold data is the quantitative data we usually work with, warm data is the qualitative data that provides the context, allowing us to derive another dimension of understanding and a more holistic view of the system we’re trying to analyse. Warm data applies to all sorts of analysis, but humans are multidimensional, so it makes sense that our analysis of humans should be as well.
Quite Interesting
The Dalai Lama also attended a SingularityU event in Amsterdam (YouTube) and on the panel about “Robotics, Telepresence and Artificial Intelligence” shared his thoughts on the development of technology in this space, and what more technology companies could be doing to share the good.
Charlene deGuzman (@charstarlene) and Miles Crawford created a short film called I Forgot My Phone (2013), a commentary on the impact of media and technology on society. It was also compared to Ray Bradbury’s Fahrenheit 451 in the paper Corpses, Fetuses and Zombies: The Dehumanization of Media Users in Science Fiction and Mainstream Media by Jill Walker Rettberg.
ProxyAddress in the UK was set up to allow those with unstable living situations to have a stable address that gives them access to essential services. Many systems require a fixed address, and without one, people can’t do basic things like open a bank account, apply for work, or receive mail. The systems we depend on aren’t perfect, and there are calls for change, but in the meantime, at least there’s a way to support those who may fall through the gaps.
There is a Fellowship in Compassion and Artificial Intelligence offered by AMS Healthcare in Canada to support research into the development and impact of digital technology and artificial intelligence in compassionate care. I love this and think we need to see more programs like this in other STEMM fields.
One last bit of existential goodness following on from my footnote about Down and Out in the Magic Kingdom. I mentioned a book called The Ego Trick: What does it mean to be you? by Julian Baggini (BookDepository affiliate) in my conversation with Audrey. To give you an idea of what to expect, the blurb begins “Are you still the person who lived fifteen, ten or five years ago? Fifteen, ten or five minutes ago?”. It’s a fascinating exploration of what makes us who we are.
Thanks for reading, and see you next time!
Stay curious,
Michele
Footnotes
1. If you want to get all existential about this, Down and Out in the Magic Kingdom (free e-book) (BookDepository affiliate) by Cory Doctorow first got me thinking about how much of who you are is the accumulation of your memories and experiences (logs and data) that can be backed up and restored at will, and whether your spirit or essence can be distilled into 0s and 1s or if it’s something more intangible3. ↩
2. I have had conversations where, upon questioning an idea or a course of action, the response was “Well, it’s not illegal”. Folks, spoiler alert. If someone answers you like that, it’s a red flag. ↩
3. Which led me to the hill I’m prepared to die on, developed after I was introduced to Star Trek: TNG and later watching Discovery, about Star Trek’s transporter and replicator technologies and how no one mentions that they cured death in a way that did not require synthetic vessels. Perhaps I will dive into this in a later substack, or I suppose I can get it out of my system through an AU fanfic. ↩