1 00:00:00,120 --> 00:00:04,220 Dear Florian, my apologies for not joining today.
2 00:00:04,250 --> 00:00:09,220 While you're listening to this, I am doing another presentation across town.
3 00:00:09,250 --> 00:00:15,380 No doubt in the not-so-distant future, I'll be able to be in two places at once.
4 00:00:15,410 --> 00:00:17,620 What was sneered at as deepfake,
5 00:00:17,650 --> 00:00:21,940 with the evolution of AI, will simply be replicated iterations of the self
6 00:00:21,970 --> 00:00:25,540 that keep us performing everything, everywhere, all at once.
7 00:00:25,570 --> 00:00:28,260 It's a neoliberal, if not colonial,
8 00:00:28,290 --> 00:00:33,420 dream come true of 24/7 extraction across all longitudes and latitudes.
9 00:00:33,450 --> 00:00:35,100 The sun will never set on this
10 00:00:35,130 --> 00:00:40,220 computational empire, where data is mined and CPUs endlessly render.
11 00:00:40,250 --> 00:00:41,340 FOMO.
12 00:00:41,370 --> 00:00:45,900 That abbreviation for the fear of missing out will be neatly filed away in what
13 00:00:45,930 --> 00:00:51,260 the science fiction writer Bruce Sterling named the Dead Media Project,
14 00:00:51,290 --> 00:00:54,620 with the ubiquitous presence of our proliferated selves.
15 00:00:54,650 --> 00:00:57,300 FOMO, along with dial-up modems,
16 00:00:57,330 --> 00:00:58,980 "Welcome to my home page",
17 00:00:59,010 --> 00:01:01,860 floppy disks and MySpace, will be nothing
18 00:01:01,890 --> 00:01:06,620 but nostalgia for technologists and fodder for media archaeologists.
19 00:01:06,650 --> 00:01:10,700 But for the moment, let's remain in this present while knowing
20 00:01:10,730 --> 00:01:14,260 we're always in the archive with history nipping at our heels.
21 00:01:14,290 --> 00:01:16,100 You asked whether I could give a few
22 00:01:16,130 --> 00:01:20,660 reflections on AI, and to do so I begin with a disclaimer.
23 00:01:20,690 --> 00:01:22,780 I am not an expert in the field.
24 00:01:22,810 --> 00:01:28,340 My perspective is that of a mother, teacher, learner, artist, gardener, cook,
25 00:01:28,370 --> 00:01:31,780 lover, Netflix watcher, and Internet user.
26 00:01:31,810 --> 00:01:36,040 A computer engineer or economist would undoubtedly lend insights regarding
27 00:01:36,070 --> 00:01:41,380 the infrastructural, programmatic and financial underpinnings integral to AI.
28 00:01:41,410 --> 00:01:45,100 As we both know, matter matters, form informs,
29 00:01:45,130 --> 00:01:49,210 and you've got to follow the money to understand where power resides.
30 00:01:49,240 --> 00:01:53,380 Nonetheless, AI is embedded in my daily routine.
31 00:01:53,410 --> 00:01:55,820 It permeates my Google searches,
32 00:01:55,850 --> 00:02:01,020 suggests autogenerated responses in Outlook, gives Amazon recommendations,
33 00:02:01,050 --> 00:02:05,100 provides grammatical corrections in Grammarly, and tells me I might like
34 00:02:05,130 --> 00:02:10,220 season five of The Handmaid's Tale, even though I've never seen season one.
35 00:02:10,250 --> 00:02:11,940 Stalking my activities,
36 00:02:11,970 --> 00:02:17,140 the machines munch and crunch, ingesting and combining my data with others.
37 00:02:17,170 --> 00:02:23,300 It is a mirror, a projection, a mirage, a prompt, and a directive.
38 00:02:23,330 --> 00:02:25,780 A profile can be viewed as a composite:
39 00:02:25,810 --> 00:02:28,700 multitudes glued together like a collage.
40 00:02:28,730 --> 00:02:33,980 But rather than a composite, what if these processes were more akin to composting?
41 00:02:34,010 --> 00:02:39,170 As I write this message, I'm sitting in my garden, looking at my vegetable beds.
42 00:02:39,200 --> 00:02:40,700 Once my courgettes,
43 00:02:40,730 --> 00:02:45,540 pumpkins and beans have gone through their cycle, the leaves and stems are put
44 00:02:45,570 --> 00:02:50,380 into the compost and eventually provide nutrients for the following year.
45 00:02:50,410 --> 00:02:54,420 Every gardener knows the significance of soil and compost.
46 00:02:54,450 --> 00:02:58,980 The golden rule is "it's only as good as what goes into it",
47 00:02:59,010 --> 00:03:02,580 and so the pile is sifted and turned from one box to another.
48 00:03:02,610 --> 00:03:08,260 A gardener also knows compost is not just fertilizer, but the future in the making.
49 00:03:08,290 --> 00:03:12,220 Like those who plant trees, they understand themselves within a longer
50 00:03:12,250 --> 00:03:16,060 timeline, a continuum that is greater than their own.
51 00:03:16,090 --> 00:03:18,640 Sadly, I suspect datasets and their
52 00:03:18,670 --> 00:03:22,060 algorithmic parameters don't get the same attention.
53 00:03:22,090 --> 00:03:26,260 Instead, data is just another resource to be exploited.
54 00:03:26,290 --> 00:03:29,980 Like fossil fuel, it is something to be mined.
55 00:03:30,010 --> 00:03:31,880 Unlike the permaculture paradigm
56 00:03:31,910 --> 00:03:36,980 of renewal and regeneration, this worldview is one of total consumption.
57 00:03:37,010 --> 00:03:42,180 In this model, the motto is "it's only as good as what we can get out of it".
58 00:03:42,210 --> 00:03:44,840 And as the Indian writer Amitav Ghosh
59 00:03:44,870 --> 00:03:49,780 warns in his book "The Nutmeg's Curse: Parables for a Planet in Crisis",
60 00:03:49,810 --> 00:03:54,980 the view of the world as resource has deep and destructive colonial roots.
61 00:03:55,010 --> 00:03:59,380 Recounting the history of the VOC's monopolization of the nutmeg trade
62 00:03:59,410 --> 00:04:04,260 and the subsequent slaughter and eradication of the Bandanese in 1621,
63 00:04:04,290 --> 00:04:09,580 he exposes how these legacies are directly linked to our current climate crisis.
64 00:04:09,610 --> 00:04:14,940 Ghosh contrasts this colonial domination and catastrophic violence with indigenous
65 00:04:14,970 --> 00:04:20,420 knowledge, which values the human and more-than-human, the animate and so-called
66 00:04:20,450 --> 00:04:25,660 inanimate as equally vibrant and, most importantly, interdependent.
67 00:04:25,690 --> 00:04:27,860 Lately, I've been wondering if maybe
68 00:04:27,890 --> 00:04:32,220 the word "artificial" blinds us to the reality that AI is of this Earth.
69 00:04:32,250 --> 00:04:37,940 Despite the deceptive moniker, cloud computing is far from ethereal.
70 00:04:37,970 --> 00:04:43,540 It is matière, that is, it is made of material substance.
71 00:04:43,570 --> 00:04:45,560 System boards are forged from metals
72 00:04:45,590 --> 00:04:49,020 and minerals, processors need enormous amounts of energy,
73 00:04:49,040 --> 00:04:53,540 and ever-expansive server farms require continuous water cooling.
74 00:04:53,570 --> 00:04:55,380 To sustain these systems,
75 00:04:55,410 --> 00:04:59,860 extraction happens on multiple and interconnected levels.
76 00:04:59,890 --> 00:05:00,980 Some time ago,
77 00:05:01,010 --> 00:05:06,780 I read that Google considers its water usage a proprietary trade secret and bars
78 00:05:06,800 --> 00:05:10,700 even public officials from disclosing the company's consumption.
79 00:05:10,720 --> 00:05:16,700 But from legal cases in 2019 alone, it was surmised that over 2.3 billion
80 00:05:16,720 --> 00:05:20,140 gallons of water were being used across three states.
81 00:05:20,160 --> 00:05:22,420 One of those states, Texas,
82 00:05:22,450 --> 00:05:27,100 where I'm from, has suffered unprecedented drought in the past years.
83 00:05:27,130 --> 00:05:30,860 I can't help but ask myself, how can something essential to our
84 00:05:30,890 --> 00:05:34,660 existence be Google's proprietary trade secret?
85 00:05:34,690 --> 00:05:38,900 Clearly, this munching and crunching has an environmental impact.
86 00:05:38,920 --> 00:05:41,620 At this point, you might think I'm going
87 00:05:41,650 --> 00:05:46,180 to make a Luddite argument for turning back the clock, closing Pandora's box,
88 00:05:46,210 --> 00:05:49,100 or putting a spanner in the wheels of acceleration.
89 00:05:49,130 --> 00:05:50,540 But I'm not.
90 00:05:50,570 --> 00:05:53,420 I'm suspicious of such positions.
91 00:05:53,450 --> 00:05:56,020 Technology never uninvents itself.
92 00:05:56,040 --> 00:05:58,420 It only becomes obsolete.
93 00:05:58,450 --> 00:06:02,460 With this in mind, I return to my humble compost pile,
94 00:06:02,480 --> 00:06:07,580 its lessons, and the golden rule of "it's only as good as what goes into it",
95 00:06:07,600 --> 00:06:11,580 as opposed to "it's only as good as what we can get out of it".
96 00:06:11,600 --> 00:06:17,020 Or, as Donna Haraway writes, "I compost my soul in this hot pile.
97 00:06:17,040 --> 00:06:18,940 The worms are not human,
98 00:06:18,970 --> 00:06:24,620 their undulating bodies ingest and reach, and their feces fertilize worlds."
99 00:06:24,650 --> 00:06:27,700 AI is a part of the pile too.
100 00:06:27,730 --> 00:06:31,900 Many have said this before, and much better than I can.
101 00:06:31,920 --> 00:06:36,220 We need to think outside the longstanding beliefs of our own exceptionalism,
102 00:06:36,250 --> 00:06:41,060 acknowledge complexity and interdependency, and consider the long
103 00:06:41,090 --> 00:06:46,140 view. This planet is a wondrous and precious place to be,
104 00:06:46,170 --> 00:06:51,100 and in our compost, we must think about what futures we bring to fruition.
105 00:06:51,130 --> 00:06:53,020 In her book Dear Science,
106 00:06:53,040 --> 00:06:57,860 Katherine McKittrick writes about predictive algorithms, saying they are "anticipatory
107 00:06:57,890 --> 00:07:00,380 computations that tell us what we already know,
108 00:07:00,410 --> 00:07:01,940 but in the future".
109 00:07:01,970 --> 00:07:04,580 If we want different or better or more
110 00:07:04,600 --> 00:07:08,220 just futures and worlds, it is important to notice what kind
111 00:07:08,250 --> 00:07:11,740 of knowledge networks are already predicting our futures.
112 00:07:11,770 --> 00:07:13,900 McKittrick is right.
113 00:07:13,920 --> 00:07:15,620 As I mentioned earlier,
114 00:07:15,650 --> 00:07:19,940 we're always in the archive with history nipping at our heels,
115 00:07:19,970 --> 00:07:23,420 immersed and implicated without any authoritative view.
116 00:07:23,450 --> 00:07:25,740 The question is how to proceed consciously
117 00:07:25,770 --> 00:07:29,460 and critically, with care, with a sense of tending to, and with an eye
118 00:07:29,480 --> 00:07:33,300 on what might be at stake. Rather than training AI,
119 00:07:33,330 --> 00:07:35,860 what if education were emphasized?
120 00:07:35,890 --> 00:07:40,580 It might sound odd in this context, but if we don't, I believe we'll never
121 00:07:40,600 --> 00:07:45,020 witness any intelligence, and only the artificial will remain.
122 00:07:45,040 --> 00:07:47,380 Please stop.
123 00:07:47,410 --> 00:07:51,620 Whatever you do, don't put that garbage in the compost.
124 00:07:51,650 --> 00:07:55,220 If we think about AI within the framework of education,
125 00:07:55,250 --> 00:08:00,020 critical questions can be raised about its curriculum, pedagogical approaches
126 00:08:00,040 --> 00:08:02,980 that might be required, and the disciplinary frameworks that are
127 00:08:03,010 --> 00:08:06,820 needed to enrich knowledge beyond what is solely instrumental.
128 00:08:06,850 --> 00:08:09,020 We can start to question who its teachers
129 00:08:09,040 --> 00:08:13,700 are, what biases are being concretized, what the assessment criteria of its
130 00:08:13,730 --> 00:08:17,260 successes and failures are, and according to whom.
131 00:08:17,290 --> 00:08:22,180 Playing around for this presentation, I spent several hours testing DALL-E 2,
132 00:08:22,210 --> 00:08:26,320 which, according to their marketing description, is "a new AI system that can
133 00:08:26,350 --> 00:08:31,220 create realistic images and art from a description in natural language".
134 00:08:31,250 --> 00:08:33,880 I entered the simple sentence "she sits
135 00:08:33,910 --> 00:08:37,820 in her garden writing a letter while wearing a straw hat".
136 00:08:37,850 --> 00:08:40,420 All the results, and there were over 100
137 00:08:40,440 --> 00:08:44,660 of them, were images of white women sitting in gardens.
138 00:08:44,690 --> 00:08:46,820 I suppose they've never read the Antiguan
139 00:08:46,850 --> 00:08:50,860 American writer and gardener Jamaica Kincaid.
140 00:08:50,890 --> 00:08:52,380 I thought to myself,
141 00:08:52,410 --> 00:08:56,500 it's the manifestation of such an impoverished imagination.
142 00:08:56,530 --> 00:09:00,660 But is this the fault of the machines, datasets and algorithms,
143 00:09:00,690 --> 00:09:02,820 or of those who educate them?
144 00:09:02,850 --> 00:09:07,500 I often think about the fact that Alan Turing's seminal 1950 essay,
145 00:09:07,530 --> 00:09:12,820 "Computing Machinery and Intelligence", was published in "Mind: A Quarterly Review
146 00:09:12,850 --> 00:09:16,420 of Philosophy", and not a computer science journal.
147 00:09:16,440 --> 00:09:18,500 And as we abandon AI to corporate
148 00:09:18,530 --> 00:09:22,180 interests, I think there is a profound message in his choice,
149 00:09:22,200 --> 00:09:26,060 which is well worth mulling over as we churn together in the compost of our
150 00:09:26,080 --> 00:09:30,340 present and create toxins or nutrients for our future.
151 00:09:30,370 --> 00:09:33,100 There are choices to be made: of desolation
152 00:09:33,130 --> 00:09:37,220 and neglect, or of love, stewardship and care.
153 00:09:37,250 --> 00:09:38,840 Much affection, Renée.