NFL Podcast Episode 084: Thrilling Wonder Stories

NFL Podcast Episode 084 - 'Thrilling Wonder Stories' - a tape recording found from the year 2011, when the Internet was weird, friendly, and a wandering/wondering place.

Contributed By: Julian Bleecker

Published On: Friday, February 16, 2024 at 22:28:15 PST

***

Ladies and gentlemen, boys and girls, gather ‘round for an electrifying journey through the cerebral and speculative realm of Thrilling Wonder Stories. Listen as Matt Jones, Liam Young, Bruce Sterling, Kevin Slavin and Julian Bleecker regale you with a tale of imagination, innovation, and intellectual wonder. Our narrative unfolds through a dazzling array of discussions, exploring the intersection of technology, science, and the arts. From the organic to the artificial, from the accidental to the meticulously planned, our speakers weave a tapestry of ideas that challenge the boundaries of reality and fiction. Venture with us into discussions that span the history of consciousness, the marvels of special effects in film, the complexities of artificial intelligence, and the speculative frontiers of science fiction. This transcript is not just a record; it’s a gateway to exploring the future’s potential, the mysteries of the universe, and the uncharted territories of human creativity! Dive into a world where the wonders of tomorrow are discussed today, where imagination is the currency, and the possibilities are as boundless as the stars. Welcome to Thrilling Wonder Stories where every word is a step into the future!

Listen or Watch.

[00:00:00] Bruce Sterling: Would you

[00:00:05] Kevin Slavin: venture?

[00:00:18] Can Geoff Manaugh hear me right now? Hey, can Geoff Manaugh hear me right now? I don’t know. I, because I, that’s, that’s my dream because, because like Can Geoff Manaugh hear you? Yeah, no, I dunno. Like whatever it is I’m doing, I think he might be able to hear me. Can Geoff see it? Can Geoff hear it? He sees all, I know, but I want to feel it.

[00:00:36] You want to feel his aura? I have a sense that he’s like a benign force that

[00:00:45] Matt Jones: oversees.

[00:00:47] Kevin Slavin: He does have maximum possible interface. That’s true. Alright. Okay, we’re going to

[00:01:04] Bruce Sterling: reconvene

[00:01:11] Julian Bleecker: from the

[00:01:11] Matt Jones: organic and the

[00:01:17] imperfect. And the articulate, and the, uh, kind of accidental to pure platonic thought. Expressed like pink laser beams from the low orbit of the earth. Including, on my left, It sort of looks like it’s going to be like a chess battle or something. Or

[00:01:44] Julian Bleecker: battleships or something. I

[00:01:46] Matt Jones: quite like it. Yeah, yeah. So, um, we’re in the final stretch of our kind of strange wedding slash zoo radio format interrogation of dead animals, robots, time zones, film directors.

[00:02:03] Uh, Captain America and Bruce Sterling, who may be the same person. I missed Captain America. You missed Captain America. And, um, somebody who’s certainly not Captain America. More like, more like the Amazing Spider-Man. Um, first of all, we’re going to have, uh, Julian Bleecker. Take it away,

[00:02:22] Julian Bleecker: Julian. Thank you.

[00:02:22] Thanks, Matt. Uh, so I’m just going to do a little presentation that I think maybe synthesizes some of the discussion that’s been going on here, in and about my sort of, um, insight and sort of perspective on, on the special effect. And one thing that occurred to me in the last couple of presentations that was quite interesting was that there was this little bit of, um, uh, sort of pause or disclaimer to say that some, some of the stuff I’m showing you isn’t real, uh, this is just a video, it’s a fake video, or this isn’t ready for market yet.

[00:02:53] And I, and I think that it’s an interesting observation to think that, just to say that, When, when those, when those sort of moments happen, uh, we’re sort of really giving into the effects of, of the market, of capitalism, to say that something isn’t real unless it’s able to be marketed and sold, or if, uh, someone hasn’t tooled up a factory in China to make the thing that it’s not quite real, and of course, um, if, if we expand beyond that, that, that, uh, that sense of what can be real and what can happen, uh, we enter into this other realm that I’m referring to as the special effect, and it’s the thing that makes us think that something is possible, that it could happen.

[00:03:28] And I think in those videos, uh, that you see, it’s like just a demonstration, or it’s a corporate vision of the future, but it’s not quite here, but it will be, we just want to let you know so that our share price goes up, because, uh, you know, share price is based on the ability to generate profits in the future, that, uh, we, we need to expand beyond that.

[00:03:45] And then we should, we shouldn’t have to pause to say, this isn’t real, uh, because in very many ways, of course, those things are, because they’re activating our imaginations, they’re making us think about what could be. And to say that the thing that is real is only the thing that can be actually marketed and sold and productized and made on a mass scale is complete bullshit.

[00:04:06] And I know that because I got a fucking degree.

[00:04:16] This degree was conferred upon me by the University of California. Uh, it’s a, it’s a Doctor of Philosophy, uh, in the History of Consciousness, and it was signed by Arnold Schwarzenegger. He was not only a governor, but he was the Terminator. And the Terminator is important for my work because I was writing about special effects in film.

[00:04:39] And, uh, the joke in the, in the, uh, amongst the Bleecker family is that I, I had to wait. It took me ten years to get it, and I tell them, like, Well, I was waiting for Arnold to actually win the governorship so he could sign it. This is what I wrote about. I wrote about this, the reality effect of technoscience, and this was an effect that I sort of, uh, observed and wanted to consider to think about what is the, what are the ways in which technology and science make us believe in what could be?

[00:05:04] What are the ways in which this reality effect, uh, translates an idea or a speculation into something that is, is, is as real as this table? What are the material effects that these, uh, essentially speculations have on making things possible? And there were a lot of interesting, uh, ways to look, uh, to look at this.

[00:05:23] One of the things I looked at was, uh, visualization technology. So the telescope was very interesting, uh, to me, cause it’s this way of sort of seeing out there, seeing things that were sort of beyond the horizon in a way. Uh, I was also very interested in the ways in which video games translate our sort of imagination into things that could be real.

[00:05:41] Uh, one thing that, uh, for, for the purposes here that was, uh, that I looked at that’s relevant is the special effect. So how is it that, in film, we’re able to sort of suspend our disbelief long enough to believe what we see on the screen? So we believe that there’s this thing called the Terminator. And we do that because we want to enjoy the film.

[00:05:58] Because we paid, I don’t know, now it’s like $14 in the U.S. So I, I better get my money’s worth, so I’m just gonna allow myself to kind of immerse into the, into the narrative and, uh, and, and accept what’s going on on the screen. And in that way, our sort of imagination is activated. We see the content of this story and we sort of allow ourselves to open up

[00:06:17] and believe what’s there, and I think that’s quite an interesting thing from a lot of different perspectives, but in particular from the perspective of just, uh, expanding, uh, the realm of possibility and allowing ourselves not to be as skeptical as to what could happen, because there are a lot of reasons why we want to not be so skeptical.

[00:06:34] Uh, there was, uh, so just to get into this, a little bit of the work and sort of the, the prehistory of the special effect, I think, appears in this book called Leviathan and the Air-Pump, uh, written by this amazing, uh, historian of science, uh, Steven Shapin, and his col, uh, colleague, Simon Schaffer. And they were looking at, uh, the ways in which, in this, in this particular, uh, uh, particular point in the history of science, um, Thomas, uh, Hobbes and Boyle, looking at the possibility for the existence of a vacuum, which at the time was, it was a, was a, was a very dangerous topic.

[00:07:07] ‘cause the idea that there could be a space where nothing, uh, existed went up against all kinds of doctrine, not the least of which was the church, which didn’t want there to be a place where there was nothing because it meant that God wasn’t there, and that’s heresy, and they hang you up by your Buster Browns and do bad things to you.

[00:07:28] Uh, Robert Boyle’s air pump was sort of at the, at the center of this, and uh, just uh, maybe half a kilometer from here, this painting actually hangs. It’s called, um, uh, An Experiment on a Bird in the Air Pump, by Joseph Wright of Derby, so anyone from Derby, you’ve got a place in the history of science, uh, in this painting, which is absolutely beautiful.

[00:07:46] It’s worth just sort of sitting and gazing at, uh, at the National Gallery. Uh, a sort of roving, uh, bringer of great ideas from the capitals of the world is usually hired by a family with some money to come and sort of basically educate the family, and the children in particular, about the wonders of the world.

[00:08:04] So these guys, so it’s the character who’s looking like, uh, he’s looking, uh, I don’t know, like, uh, an aging Pete Townshend kind of guy, just marveling at the science that’s going on. There’s this wonderful kind of tableau here, where you have on the left, you have the young lovers, who are sort of not so much interested in what’s going on with the experiment, more interested in each other.

[00:08:25] You have the young boy, just to their right, just a little bit more to the left, who’s gazing at this bird who’s in this glass vessel, and so the air is being evacuated from the vessel, and so the bird is kind of like not really flying around much. Uh, it’s probably flopping, and the, the story goes that, you know, just as the bird’s ready to expire, of course, they let the air back in, so the bird can live.

[00:08:46] You have to imagine that there were some mis, mishaps along with this. And you have the, the younger fellow who’s maybe thinking about going into the, into the realm of science or medicine, is just sort of taking in what’s going on. You have the, the older gentleman, maybe the patriarch of the family, who is just pondering how the world has changed so much since he was a little boy, and all these great wonders that are going on.

[00:09:07] Of course, you have the father sort of consoling the daughter who is, who is, uh, more concerned about the fate of the bird. And sort of what’s going on here, uh, from, from, as, as a tableau in the, in the history of science and the, the sort of the, um, exhibition of new ideas is this, uh, this, this sort of very mechanical act.

[00:09:25] And that mechanical act is, of course, the air pump, which, as much as that glass vessel there, was big science. So we would take for granted now an air pump because we all have bicycles and we take our car to, you know, we pump up the air and this kind of thing. It was exceptionally difficult to create a device that could evacuate air from a vessel.

[00:09:43] It was the kind of thing where it was the X Prize of its day. And the, uh, the same thing with the manufacture of that glass vessel, to get a glass vessel that could retain its integrity even under negative pressure, and that could be carried around and this kind of thing. This was, uh, you know, this guy’s equivalent of, uh, you know, the most precious apparatus that, that he, that he could have to help demonstrate and, you know, make his way and have his living.

[00:10:09] There’s another illustration in the book that I found particularly fascinating. So this was the experimental theater. And it was quite important that it was seen as a theater, because the only way in which these ideas were, uh, were shared and sort of trans, uh, transmitted to other people was that you had to do these demonstrations.

[00:10:22] So this was effectively the kind of publishing technology of the day, where you had to create these, uh, these, these apparatuses in which people could come, other people could come and see the science and thereby sort of spread the knowledge and sort of wonder about it and speculate about it and of course engage in conversation about it.

[00:10:39] And there weren’t very many of these things. And one of the reasons why there weren’t very many of these things is every little bit of that apparatus is handmade. It’s all made by craftsmen. So you had to find the best craftsmen to make the glass vessel, and I don’t know, maybe they’re coming from, uh, from Venice.

[00:10:53] You have to have the people, all the, the, uh, the, the leather smiths who could make all the fittings and couplings so that they’d be airtight. Uh, this stuff was very important, and in this, in this theater, what people would see is basically this stage here, and, uh, you know, Boyle would sort of come out, and it was very, it was a very sort of theatrical event by all records.

[00:11:13] What you didn’t see were these special effects artisans, as I refer to them, underneath, who were actually doing the very laborious work of making fact. They were doing this physical activity that materialized an idea that was exhibited above the stage. And these were he-men. So these weren’t just like any old stage Joe.

[00:11:32] They had to be found, people who could, you know, once, of course, you start evacuating a chamber, initially it’s quite easy to move the air out, but very quickly it becomes almost impossible. So these big brawlers would have to be brought in to essentially do the pump work. And there was something in that activity, the fact that it was such a, such a, such a physical act, and it wasn’t just ideas being sort of shared and distributed and discussed, that there was this other thing going on, that there was this material activity that was happening that I think is quite important to keep in mind and something that, in my mind, translates very quickly to the work that happens by, uh, by special effects artisans today.

[00:12:08] So, anyone doing special effects know it’s not just click a button and the thing happens. Especially in a world where you sort of, um, I imagine competing for the next greatest effect. Uh, you might have to write special software that doesn’t exist. It’s not just Adobe After Effects. Uh, you might have to come up with new techniques and those techniques are usually kept quite secret until after the, after the film is produced when there’s a lot of behind the scenes.

[00:12:31] How is it made? And precisely this kind of activity, uh, is sort of revealed. And so that relationship between the creation of something that looks real and the material activity that’s happening underneath the stage, and the creation of something that looks real in the theater of film and all the work that went, that goes on underneath it, is the parallel that I think makes it impossible to distinguish between something that is, uh, where we, where we feel forced to apologize that it’s not real, because it is real, because it starts conversations, because it gets people thinking about what could be, despite the fact that you can’t just go to the corner store and buy it.

[00:13:09] So what I want to share with you in the, in the last, um, uh, five minutes or so are three examples that use a kind of special effect, that, uh, reveal ways in which the thing that, you know, you might want to apologize for, but you shouldn’t, is happening. So these are examples where I think there is design work going on, there is proper engineering work going on.

[00:13:28] It’s not just, it’s not just, uh, adornment to something. There are ways in which, in these films, people are speculating about and provoking conversations about some aspect of the future that is not something that we should just shrug at and walk away from and say, like, well, it was good entertainment. It is actually changing the way we think about the world.

[00:13:47] I’m sorry, there’ll be two examples from film and one from a, uh, from a book. So the first one is, is one of my favorites, and it’s been my favorite for years. It’s, uh, 2001: A Space Odyssey, which was made in 1968. Of course, a collaboration between Stanley Kubrick, Arthur C. Clarke, who, uh, who wrote the, wrote the novel that, uh, this was based on, and then, what I think is most important, various rocket scientists who were involved in the creation of the film.

[00:14:10] And I picked this little loop in particular because of the special effect that was, I, correct me if I’m wrong, I hope I’m not, I’m pretty sure it was Douglas Trumbull, young Douglas Trumbull, who was working in special effects on the film. Uh, was, was, I guess, you know, charged with creating these sort of ancillary graphics, which now, you know, we see all the time in a, in any, you know, even in a Jason Bourne thriller.

[00:14:30] Like, there need to be screens popping and bleeping and showing things happening. What I find most fascinating here is that Douglas Trumbull, in collaboration with, uh, with, you know, various scientists, were trying to figure out, were basically showing what screens would look like, uh, in, in, uh, deep space travel, before deep space travel ever happened.

[00:14:48] And before there was any such thing as a graphical user interface. And I just find it absolutely mind blowing that, uh, these people could come together and say, like, well, this is probably what it should look like. And it holds up, I think, even today. That, uh, you know, before the Macintosh, before Adobe Flash, before, um, Squiggle art graphics and those kinds of things, these guys were actually doing it.

[00:15:10] And they were doing it with quite manual technology. So there were no computers to show the computer. Uh, they were just doing it all with hand animation and rear projection. It’s very exciting to me. And the fact that it was, you know, it was 1968 for crying out loud. We hadn’t even gone to the moon yet.

[00:15:28] And then even things like this, like speculating, imagining, showing us what FaceTime would be, uh, many decades before FaceTime even existed. And there’s always a sort of favorite example, the zero gravity toilet. So, if you look, there’s a lot of, uh, very interesting sort of ancillary making-of material, and speculating about it, and so there’s that, that text is actual text.

[00:15:50] It’s not Greek text. So they actually went through, you know, Stanley Kubrick being Stanley Kubrick, and not wanting to like, leave anything out. Uh, said like, okay, we’ll sort it out. How would you deal with a zero gravity toilet? And we’ll put that in as the graphic, which you can’t read there, but if you get the, uh, the making of, uh, documentation, they actually have it written.

[00:16:08] And then, of course, the more recent example, um, from, this is a movie, one of the, one of the movie posters. Uh, Samsung lawyers with straight face used this as a defense against Apple to say like, look, there’s prior art for the iPad.

[00:16:23] And I just find it wonderful that they, that, you know, so we apologize it’s not real, but that these lawyers, who I, I gotta imagine are making a lot of money, went in with a straight face and said, it was real. And, uh, so we didn’t rip off Apple, um, Apple ripped off Stanley Kubrick. Sorry to say.

[00:16:42] Uh, so another example, I just, I love these, kind of these Haynes manuals. So anyone who has had a car, uh, you know, when they were, when they were younger than 22 would probably have one of these to help repair it. And they’ve just done this wonderful thing where they shift that idiom of the, of the service manual and make it just plain as day.

[00:17:00] You have the Enterprise, uh, and that warp core has just gone a little bit, a little bit fuzzy. Here’s the manual to help repair it. And it’s just a wonderful point of entry that engages in, you know, so it’s fan fiction effectively, but it allows people to engage in, uh, the Star Trek universe. Um, by, by putting this sort of facade on it, it almost makes it more interesting in a way.

[00:17:21] It’s like, oh yeah, I got one of these, and this just sits on the shelf next to the one for the old VW van and all that kind of stuff. Of course, they had to make one for Apollo 11. Same idea. Um, now the third example, I feel a little bit funny about, about sharing this because it’s sort of pre release stuff.

[00:17:38] Um, it’s fine to show. Uh, this is, um, it’ll be the first, first production from the Near Future Laboratories Moving Pictures Division. It’s a film called How. This is just a production clip from it. Uh, in this film, what we’re doing is we’re exploring, uh, the ways in which artificial intelligences are becoming more kind of human.

[00:17:58] Uh, in this, in this film what we’re doing is we’re having, there’s a character Hal, he happens to be the other member of the crew that they refer to quite often. Uh, and I thought that it would be a wonderful way to sort of, um, probe our relationship to machines now. Especially with things like Siri, that we have these conversations with them.

[00:18:16] And we sort of want to wonder and speculate about how they might relate to us. And it’s a little bit troubling that we’re not, that we’re, that we’re sort of taking it for granted. And so part of this is sort of saying, like, these are machines, maybe they should, maybe they shouldn’t, have the same sort of conversational tone.

[00:18:30] What would they do if they start overhearing us say things like, Well, we’re gonna, we’re gonna return the iPhone because it’s not working properly. Like, how, what, what are the ways in which they might respond or might, uh, Might, might turn back on us. Or might fight against us. Or might, uh, lock the bathroom door on us and won’t let us out because they’re not able to do that.

[00:18:51] So, this will be coming out in 2012. You heard it here first.

[00:19:03] Thank you, Julian.

[00:19:13] Matt Jones: And now, um, Kevin Slavin, man of many parts. Um, some of which work better than others. They all work. Uh, game designer, writer, teacher, um, prognosticator, uh, man of the world, etc. Um, special effect artisan, perhaps. Um, and, um, Not terribly good at his own AV requirements.

[00:19:42] Kevin Slavin: Uh,

[00:19:47] this is my favorite part of every presentation.

[00:19:53] You would have loved it earlier today then,

[00:19:54] Matt Jones: man. 10

[00:19:57] Kevin Slavin: please. Oh, that’s the least of it.

[00:20:04] Right, isn’t this, isn’t this your favorite part? Okay, now I have control over it. Uh, I could tell a joke. There’s actually a magic button in Keynote to do that. Is there really? It’s going to be flipped, right? Yep, it’s

[00:20:20] Julian Bleecker: totally flipped. There’s magic buttons there. Really? Yeah,

[00:20:23] Kevin Slavin: press play. No, no, that’s not the problem.

[00:20:26] Kevin’s an expert

[00:20:27] Matt Jones: on the way that computers are changing our relationship to physical space. And each other. Um, and this is a performance piece.

[00:20:37] Kevin Slavin: That, uh, brings that out. You guys want to see it again? Yeah. Yeah. I think we’re going to. You are. Alright, let’s see. Yeah. Hey! I’ll take it. Alright. Okay. Um, okay, so this is, uh, Core War.

[00:20:54] Can you guys hear

[00:20:54] Matt Jones: me okay? You have to get quite close to the

[00:20:56] Kevin Slavin: mic. Check. Okay, yeah. So, uh, this is, um, this is Core War. This is a video game from, uh, from 1984. Did anybody ever play this? Has anyone ever played this? It continues to be available commercially. Um, it’s, um, uh, oh wait, it’s, uh, how do I get it to play, I wonder?

[00:21:18] Wow, this is really not my, there we go. Okay. So Core War, um, uh, it is a game by programmers, uh, for programmers. And to play it, you, uh, you write an algorithm. It’s a two player game. And then you unleash it, uh, uh, against the other player’s algorithm. And, uh, uh, in the end only one of them wins. This is Dwarf Scout versus Little Factory.

[00:21:38] I don’t know if you saw what Little Factory did there, but it was an awesome move and it won. Um, this is not, uh, it’s not a popular game. Uh, there’s, there’s really nothing to do in it except, uh, write, write an algorithm and watch it win or lose, and that’s totally, uh, abstract, right? So, you know, let’s see.
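[Editor's note: to give a feel for what "write an algorithm and watch it win or lose" means in Core War, here is a minimal Python sketch of the game's core idea. It models only the classic one-instruction "Imp" warrior from the original 1984 game; the core size, tuple encoding, and step count are illustrative, not taken from any real MARS simulator.]

```python
# Sketch of a Core War-style "core": a circular memory where a warrior's
# instructions live and execute. The classic Imp is a single instruction,
# MOV 0, 1, which copies itself one cell ahead each cycle and so crawls
# endlessly through the core, overwriting whatever it finds.

CORE_SIZE = 16            # real cores hold thousands of cells; small here
core = [None] * CORE_SIZE # None marks an empty cell

IMP = ("MOV", 0, 1)       # (opcode, A-field, B-field), relative addresses
core[0] = IMP             # load the warrior at address 0

pc = 0                    # program counter of this single warrior
for _ in range(10):
    op, a, b = core[pc]
    if op == "MOV":
        # copy the cell at pc+a into the cell at pc+b; addresses wrap
        core[(pc + b) % CORE_SIZE] = core[(pc + a) % CORE_SIZE]
    pc = (pc + 1) % CORE_SIZE  # execution advances to the next cell

# After 10 steps, cells 0 through 10 all hold the Imp's instruction.
```

A real match pits two such programs in the same core, each trying to make the other execute an illegal instruction, which is why there is "nothing to do" as a spectator: the drama is entirely in the algorithms.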

[00:22:02] Um, this is, uh, this is a photograph by, uh, by an artist named Michael Najjar. Uh, and it’s, uh, it’s real, not in the Julian sense, in Julian’s sense of real, but I mean like real, like, I mean like it’s real. Like, this is, like he actually went there, to Patagonia, to take this photograph, um, but then he also worked on it extensively, uh, in Photoshop, presumably, uh, to map the, the contours of the mountains there to the vicissitudes of the Dow Jones Index.

[00:22:34] Uh, so what you see there, uh, that, the high precipice and then that, that big valley there, that’s the 2008 financial crisis. Uh, so this photograph was made, uh, right when, right around 2009 when we were kind of deep in the valley. And I’m not sure exactly where it is that we are now. Um, he did that with, uh, the Dow Jones.

[00:22:55] This is, uh, this is the RTS. That’s Russia. Here’s, uh, the Hang Seng Index. Uh, that’s Hong Kong. Uh, similar topography, mysteriously. Um, and this is art, right? I mean, this is, uh, it’s metaphor. Um, but it’s a metaphor with teeth. Um, and it’s with those, uh, that I want to sort of talk about this idea of, um, contemporary math and like sort of specifically algorithmic operations, um, uh, as sort of something that is no longer derived from the world around us, but rather kind of imposed on it.

[00:23:36] Um, that, uh, that rather than sort of being, um, pulled out of the world, that they’re somehow shaping it. Um, the worlds, you know, around us and inside us, and that over time they take, like, the, the texture of truth, that they kind of ossify and calcify with repetition, um, like the ground beneath our feet. Um, this is, uh, if you go actually further down, like under the earth, um, this is, uh, this is in Texas, uh, Waxahachie, which is, I don’t, I just never believe that that’s a real name for a real place, but I think it is.

[00:24:14] Uh, uh, and if you go deep underground, uh, near Waxahachie in Texas, you’ll find, traced into the rock, although kind of abandoned, uh, uh, it’s not a mine and it’s not political territory. That’s the, the Desertron. Um, and I learned about this of all places from Robert Harris, um, which is a long story. Um, but if the Desertron had been completed, this was America’s answer to the Large Hadron Collider, which was a question that hadn’t been fully asked yet, which was abandoned.

[00:24:44] The entire project was abandoned. This is what it looked like on the ground. Uh, it’s real, it’s abandoned. And this, you know, the, the Desertron supercollider was sort of where America was working on making math real. And this is where they made it real sort of at the same time that it disappeared. It was abandoned in 1993 after going like $8 billion into a projected overrun.

[00:25:09] And this is where, uh, the American physicists were in their own kind of race to zero, um, trying to beat the speed of light. And you’ll note, uh, as architects, um, uh, not a lot of, uh, human affordances, uh, in the architecture here. Not a lot of house paint or, uh, window dressing, um, uh, because all the action is really, uh, underground.

[00:25:35] Um, or it would have been underground if they’d finished it. Um, so this is a 54 mile ring, uh, a tunnel actually blasting through, uh, the United States all to sort of study, uh, you know, the, the matter underneath everything, you know, subatomic particles, nature of the universe, stuff that I know nothing about really.

[00:25:55] Uh, and Clinton was worried when it was voted to be canceled, uh, the president was worried about the economic effects, uh, on Texas of canceling the Desertron project. And the thing that he couldn’t have predicted, uh, or he didn’t predict, or nobody really predicted, was what the larger economic effects of cancelling the Desertron project were.

[00:26:14] Because what it really did was, uh, it took the densest concentration at that moment of American physicists and turned it into a kind of diaspora, uh, a bunch of physicists looking for something to do, uh, some difficult problem to solve. And so they found it, uh, they found it much closer to home.

[00:26:34] Much closer to my home, uh, the financial district, uh, here in New York City, uh, which used to be built on information, and the people who, uh, uh, who moved that information around used to be based on kind of like flesh and blood, uh, which is why you had to build these buildings there to do that. Um, and, uh, really there is this real correlation, if not causality, between that moment where that huge tunnel under the earth gets abandoned and when something else starts to hollow out over on Manhattan Island. And when I look at the Occupy Wall Street guys, who are actually about three blocks from right here,

[00:27:20] I think of like I think of like hermit crabs, you know, of like, you know, taking the shell from something that actually died quite a while ago. And, uh, you know, using that for a home. Um, and it’s, uh, it actually, the occupation has more to do with the way hermit crabs occupy than actual displacement. Um, because there’s really not that many people left down there.

[00:27:45] Um, and that wasn’t sudden. Uh, it wasn’t, uh, uh, immediate. It was sort of slow and subtle. And I only learned about it, uh, years later at, uh, cruising altitude. Um, I was on a Transatlantic flight, uh, next to a Hungarian physicist, um, who is about my age. And, uh, I was talking to him, and I was, what I was curious about was, what, uh, what a Hungarian physicist did during the Cold War.

[00:28:14] Uh, and sort of, you know, in talking about it, he said, uh, he said, Well, you know, I worked on, um, I worked on stealth. And I thought that was kind of interesting because it’s not what, uh, they’re famous for, and, uh, and I said, so you were making stealth, and he said, no, we were breaking stealth. Uh, and he said, you’d, you’d have to know a lot about how stealth works to understand that.

[00:28:37] And I, and I said, yeah, well, I, I kind of do, uh, uh, I know more than some cause I used to work on the F-22 fighter bomber account when I worked in advertising, speaking of ethics. Um, so, so this is what, this is what I know, and I’m going to explain this super quickly, right? Which is that, of course, you know, you can’t just pass a radar signal through 52 tons of metal.

[00:29:00] You can’t just make the thing disappear. So, if you want to, if you want to take this huge thing and you want to make it seem to disappear, what you can do is you, if you can figure out a way, a strategy, to take the big thing and sort of make it look like a million little things, like, you know, a flock of birds.

[00:29:22] You’re only going to see this video like five more times tonight, by the way. Um, uh, if you can, if you can make the big thing look like a million little things, like something that has like the resolution of birds or maybe pollen or something, well then you’d have to calibrate radar to be able to detect every bird in the sky, every piece of pollen.

[00:29:40] And if you're a radar, that's a really shitty job, right? That's a really difficult thing to do, and you don't want to do that. So this is one of the fundamental premises of stealth. And I said, so that's as I understood it. We're having this conversation on a, you know, 747.

[00:30:00] And he said, yeah, but that's only if you're a radar. That's only if you're looking for a radar signal. He said, we look for electrical signals. We look for electrical signals that are just moving through the sky. And he said, in Hungary, if we saw electrical signals moving through the sky with no radar signature, we thought that probably has something to do with the Americans, and it usually did.

[00:30:26] And he said, so we made a black box to look for those electrical signals as they move through the sky. And that's how we sort of broke stealth, long before the F-117 was shot down over Serbia, by chance. And I said, so that's an interesting career, right?

[00:30:50] Because you've basically negated 60 years of aeronautic research. And, you know, where do you go from there? What's your second act? What's the next phase in your career? And he said, well, you know, financial services.

[00:31:08] And I was like, well, that's interesting, because those are in the news really frequently lately. And I said, so what are physicists doing down in financial services? What's the black box of the financial services industry? And he said, actually, it's funny that you ask that, because what he's working on is actually called black box trading.

[00:31:29] Just a coincidence. It's called black box trading. It's another name for something that's more commonly called algo trading, algorithmic trading. And that comes from institutional traders, you know, mutual funds, big banks, having some of the very same problems as the United States Air Force.

[00:31:53] Which, you can make some joke about crashing, but I'll let it go. But if you are a large institutional trader, right, let's say you're going to move like a million shares of Microsoft or whatever, right? You're going to move a million shares through the market. And this has to do with the size of the positions that they're taking.

[00:32:12] So, okay, if you're going to move a million shares, you can't just enter into the market and say, I'm going to buy, or I'm going to sell, a million shares. Because it's a little bit like, you know, imagine playing poker, and nobody's looked at their cards yet. Nobody knows what they have, and you just stand up and go, I'm all in.

[00:32:28] And it's like, okay, well, nobody even knows how to play, right? It actually disrupts the very circumstances that you're seeking to take advantage of. So they can't just put a million shares to buy or sell out in the market. So they engage in what Geoff Manaugh called financial camouflage.

[00:32:48] And they basically use algorithms to break up those million shares into seemingly random chunks of buys and sells, seemingly randomly distributed through the market. And that's all well and good, and it actually worked for a while, except that of course the very same math that you use to hide something mathematically in the market is the very same math that anybody else can use to find it.
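
The order-slicing idea can be sketched in a few lines. This is a toy illustration in Python, not any real execution algorithm; the chunk-size bounds and the function name are invented for the example:

```python
import random

def slice_order(total_shares, min_chunk=500, max_chunk=5000, seed=None):
    """Break one large parent order into randomized child orders,
    so no single print on the tape reveals the full position."""
    rng = random.Random(seed)
    chunks = []
    remaining = total_shares
    while remaining > 0:
        # Last chunk may be smaller than min_chunk: it's just the remainder.
        chunk = min(remaining, rng.randint(min_chunk, max_chunk))
        chunks.append(chunk)
        remaining -= chunk
    return chunks

# A million-share parent order becomes a few hundred small child orders.
children = slice_order(1_000_000, seed=7)
print(len(children), sum(children))
```

The catch described next is exactly that this randomness is statistical, not secret: anyone with the tape and the same math can correlate the chunks back into the parent order.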

[00:33:13] So that's really what's going on in the market, right? If you need to picture what is happening with 70 percent of the trading in the operating system formerly known as your pension, or your mortgage, what you can picture is a series of algorithms in competition, some of which are trying to hide from everything, and some of which are trying to find the algorithms that are hiding.

[00:33:45] And, you know, what could go wrong with such a scheme, really? The Flash Crash of 2:45, which is my favorite name for the event that happened about a year ago, in which the entire American stock market lost 9 percent of its value in about five minutes. So 9 percent of the stock market just disappears, 70 percent of that being just bought, sold, moved around by a bunch of algorithms that no one in particular has written.

[00:34:17] That nobody in particular has any oversight of, nobody in particular has any idea what happened. And in fact, collectively, nobody can even agree on what happened in the flash crash, because there's no human agency left anymore to interrogate. All that a trader has at this point is just a big monitor

[00:34:45] with a number moving up and down and a red button that says stop. And what we're doing, sort of collectively and collaboratively, is writing something that we can't read. We're writing something illegible. We humans have written something that we can't really read anymore, almost at all.

[00:35:07] Because there are still a few superheroes in the world, and some of them statistically have to be in Boston. So these guys are called Nanex, and they are using, I don't know what, math, I suppose, and they interrogate the market data and they can find some of the algorithms. They're basically reverse engineering the algorithms out of the market, the same way all the financial guys are.

[00:35:38] And when they find one, they pull it out and they pin it to the wall, like a butterfly, or like a constellation, right? Like the way that we used to tell ourselves stories about the stars, or I guess we still do. And they're fiction, but they did let us navigate home pretty effectively for a pretty long time.

[00:35:57] They were stories that we told ourselves to make sense out of a world that didn't make any sense. And that's what Nanex is doing with the market: they're making the thing that humans use, which is stories. So they reach in, and this is one that they found a while ago.

[00:36:15] This one is called The Knife. The Carnival. That's the Boston Shuffler. The Castle Wall. Twilight.

[00:36:33] And then, you know, you look back at Core War, at Dwarf Scout and Little Factory. And you realize that if chess was an allegory for war, and Monopoly is some weird twisted allegory for capitalism, then Core War, from 1984, is kind of a masterpiece, right? Because there's nothing abstract about it, right?

[00:36:56] It's not an allegory at all. It's an audition, right? It's an audition for 25 years later, when the war of Dwarf Scout and Little Factory becomes real, with a couple billion dollars on the line. And if you picture the New York Stock Exchange, you know, picture Dwarf Scout and Little Factory.

[00:37:14] And picture the Carnival, and the Knife. And the gag is, the thing that I came to understand as I came to understand all of this, is that of course the Knife and the Boston Shuffler are not just working their way through financial markets, and this isn't really a financial services story.

[00:37:30] It's actually much bigger than money. So when you start to understand this as kind of the physics underneath all this, you can find it almost anywhere you look. This is a book about flies, secondhand; it's about the genetics of flies. This was for sale on Amazon about six months ago, and the asking price was $1.7 million.

[00:37:58] Which seems high, but it would have been a bargain had you acted on it, because four hours later it was $23.6 million. And what you see here is two small-time book traders using, effectively, you know, Dwarf Scout, Little Factory, The Knife, et cetera, using fundamental algorithms to try to set the price of their books in the Amazon book market.

[00:38:28] And what you see here are two algorithms that have caught themselves in a loop. Two algorithms, each of which is itself engaging in an extremely rational behavior, but tied together are asking $23.6 million for a book about flies. And, you know, the only difference between what you see here and what you see in the stock market is that on Amazon it still takes a human to say, like, yeah, alright, I'll take it.
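
The feedback loop is easy to reproduce. Here is a rough simulation in Python, with multipliers loosely modeled on the reported Amazon fly-book incident (one bot slightly undercuts its rival, the other prices a fixed margin above it); the exact figures are illustrative:

```python
def reprice_loop(price_a, price_b, rounds):
    """Two repricing bots, each individually rational, jointly absurd:
    A undercuts B by a hair, B prices well above A (betting on its
    better seller rating), and every round compounds the last."""
    for _ in range(rounds):
        price_a = 0.9983 * price_b   # A: just under the competition
        price_b = 1.2706 * price_a   # B: comfortably above A
    return price_a, price_b

# Start both copies of the book at $100 and let the bots run.
a, b = reprice_loop(100.0, 100.0, 40)
print(f"A: ${a:,.2f}  B: ${b:,.2f}")
```

Each pass multiplies B's price by about 1.268, so forty rounds is enough to push a $100 book past a million dollars, which is the order of magnitude the talk describes.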

[00:38:57] Right? That's the only difference. And, you know, as with Amazon, so with Netflix. So this is an algorithm called Pragmatic Chaos. I don't know what the numbers are here; in the United States, 20 million people are using Netflix to watch their movies.

[00:39:16] Sixty percent of what they're watching is determined by the recommendations of a single algorithm called Pragmatic Chaos, which is, fundamentally, in its own way, determining the taste of 20 million of the 300 million people in America. And the question is, if you can build algorithms to recommend the movies, could you also build algorithms to make them?

[00:39:44] And the answer is, of course, yes. Of course you can do that. This is a bunch of UK financial services guys who took the wisdom that they had gathered from working in financial services for many years and decided to apply it to Hollywood. They have narrative algos that they can use to process your script and tell you whether that is a 30 million dollar, 50 million dollar, or 250 million dollar movie, before it's made.

[00:40:09] And the question is, if these algorithms crashed, how would you know? What would be the evidence of that, except more shitty movies? And the question is, if you have algorithms that are producing culture, and you have algorithms to regulate and recommend their consumption, who is the user up in here?

[00:40:36] And, you know, I'll skip over a bunch of things about grapes and wine and Robert Parker, because we've got to get to robots. Because if you really want to look for all this, if it all seems abstract, you can find them in at least a lot of American households. You'll find them in your house.

[00:40:59] These are the time-lapse photographs that were made of two competing cleaning robots, which, you can see, have very different ideas about what clean is. They're sort of these weird secret architects with different ideas about your bedroom. And the question is, how could robots have such different ideas, and what informed those ideas?

[00:41:27] And the answer is kind of like a shrug, you know, like, I don't really know. Because a lot of the algorithms that are being used in all of this are genetic algorithms. And, you know, we could talk for a long time, and probably you guys know much more about that than I do. This is an excellent visualization of how they work, right?

[00:41:47] This is a genetic algorithm that is just given a few parameters: use a couple wheels and some weight, and try to get from A to B. And you watch this for a while. I've watched this for hours on end, because I think it's just some of the most amazing entertainment produced by computers.

[00:42:03] Because, you know, it's like watching a puppy in the snow, right? You're like, just put the wheels on the bottom. It seems pretty straightforward, and you can watch this for a while. But you let this go, and it learns, of course, a lot quicker than a baby learns. It just keeps mutating until it finds it, and then it reaches a relatively optimal state very quickly, probably about an hour in this particular case. And it's sort of delicious, and it's sort of scary.
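
The mutate-and-select loop driving a demo like that is simple to write down. A minimal sketch in Python, where the fitness function is just similarity to a target bit string, a stand-in for "distance the car travelled"; the population size and mutation scheme are arbitrary choices, not what the demo actually uses:

```python
import random

def evolve(target, pop_size=50, generations=300, seed=1):
    """Elitist genetic algorithm: score everyone, keep the fittest half,
    and let each survivor spawn a one-bit-mutated child."""
    rng = random.Random(seed)
    n = len(target)
    fitness = lambda g: sum(a == b for a, b in zip(g, target))
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # selection: top half survives
        children = []
        for p in parents:
            child = p[:]
            child[rng.randrange(n)] ^= 1      # mutation: flip one random bit
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return best, fitness(best)

best, score = evolve([1, 0] * 16)   # a 32-bit target pattern
print(score, "/ 32")
```

Nothing in the loop "knows" what the target means; selection pressure alone does the work, which is exactly the seeming-smart-without-sensibility point.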

[00:42:38] And not because it's smart, but because it seems smart without any underlying sensibility that we would recognize as human. And I've been thinking a lot about this particular point lately, and it's helped me to think about what's useful about stories and storytelling. I've been reading this book by James Gleick called The Information, and thinking about stories as information with a lossy

[00:43:06] form of compression to help information travel. And the lossiness sort of explains the futility of demanding that stories be true. And then you look at something like this, and you realize that what's unsettling about watching an algorithm arrive at something like this is that you miss the sense that there is an author somewhere, right?

[00:43:28] That what a story is, is basically just information with an author. And what we're looking at here, and what we're looking at in the cleaning robots and the entire stock market, is the sense of living in a world without one, without an author, without authorship. And I don't want to be alarmist, but that's who's in your house, right?

[00:43:51] And it manifests itself in a lot of different ways. This is destination control elevators. I was just on one in Salford a couple hours ago. You guys must know all about this stuff, right? Destination control elevators: instead of this mess of just having everybody decide what car to get into and decide what floor they want to go to, you input that on the outside, and then it's a bin packing algorithm, right?

[00:44:15] You just sort humans into different cans. So there's two problems. One, which I just saw in Salford, is that then you need elevator customer support, right? So now you have a person who stands there and explains to everybody how to use this thing that they've been using for about 50 years without a huge amount of effort.
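
The dispatch problem he's gesturing at really is bin packing. Here is a deliberately crude first-fit sketch in Python; real destination-dispatch systems also optimize wait and travel times, and the capacity and stop limits below are made up for the example:

```python
def assign_cars(requests, capacity=3, max_stops=2):
    """First-fit destination dispatch: each passenger goes into the first
    car that already stops at their floor, or the first car with room
    for another stop; otherwise a new car is opened."""
    cars = []
    for floor in requests:
        car = next((c for c in cars
                    if c["count"] < capacity
                    and (floor in c["stops"] or len(c["stops"]) < max_stops)),
                   None)
        if car is None:
            car = {"stops": set(), "count": 0}   # open a new car
            cars.append(car)
        car["stops"].add(floor)
        car["count"] += 1
    return cars

# Seven passengers keying in their floors at the lobby kiosk:
cars = assign_cars([3, 5, 3, 7, 5, 3, 3])
print([(sorted(c["stops"]), c["count"]) for c in cars])
```

Note the passenger never chooses a car; the algorithm does, which is where the "no buttons inside" panic he describes next comes from.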

[00:44:34] But the more important problem, and the reason that these aren't really being used very much anymore, is that they produce panic when people get in them. Because people get in them, and you can see the problem right away: there are no human affordances left over there on the left.

[00:44:50] There's nothing like a button, nothing to suggest that there's any sense of control at all. There's just a window with a number that moves up and down, and a red button that says stop, right? And we're optimizing all of this to a kind of machine dialect that barely speaks human at all.

[00:45:11] And it's the scale of the thing that makes it crazy, right? And you can go back to Wall Street to really find the madness of the scale, because this is where it gets the teeth all the way in. Because if you are an algo trader, right? If you are the Hungarian physicist, right? If you are running algorithms like Cherokee Nation, Bluegrass, Marco Polo, et cetera, down there where Occupy Wall Street is.

[00:45:35] You need to be faster than you can possibly imagine. You need to be faster than any human scale, right? It takes half a million microseconds to click a mouse, right? But if you're a Wall Street algorithm, like Frog Pond, if you're five microseconds behind, you're a loser. So if you were an algorithm, you would hire an architect, like a friend of mine, who was hollowing out a skyscraper in Frankfurt, right?

[00:46:03] Who was removing all the furniture, all the everything, and just laying down steel to hold up server racks, because the building was never meant to support the weight of the machines that were displacing the people who'd been in there. And all of that for a specific reason, which was so that one algorithm, or maybe a few algorithms, could get closer to the Internet.

[00:46:21] Because we think of the Internet as a distributed system, and of course it is a distributed system, but it's distributed from places. In Manhattan, it's distributed from 60 Broad Street there. And the type of building is called a carrier hotel. This is really where the pipes come right up. They come out of Staunton Sands here in the UK.

[00:46:41] So that's kind of the mother lode for the Internet itself. And the problem being that if your algorithm, if Frog Pond is, whatever that is, about 20 blocks away, that's something like four milliseconds for that information to travel through the earth up to the carrier hotel. And if you are four milliseconds behind, you might as well not be trading at all.

[00:47:06] So you need to be right on top of it, right? So the really clever algorithms, Twilight, the Carnival, the Boston Shuffler, et cetera, are actually taking over the buildings that are closest to the carrier hotels. And when you start to really understand that this is how urban planning is being pursued, ad hoc, you realize that network topology has become the new urban topography.

[00:47:34] You know, when real estate valuations are being determined not by proximity to a school or a subway or a good bar scene, but by the race to zero, right? By proximity to the pipe. And algorithms have their own neighborhoods at this point, built on this sort of new natural resource. Because, you know, pound for pound and inch for inch and dollar for dollar, there's nobody in this room who could squeeze revenue out of a building like the Boston Shuffler can.

[00:48:03] So if you zoom out a little bit further, we're almost all the way out. If you zoom out just a little bit further, you can see that the scale is even bigger than that. What we're looking at here is an 825-mile trench in a straight line between New York City and Chicago. This is actually finished construction now.

[00:48:21] This is built by a company called Spread Networks. Jim Barksdale, first he did Netscape, this is what he does now. Thanks, Jim. This is a pipe laid down through the United States, a tunnel built through rock and mountain, all so that the algorithms moving between New York and Chicago can trade 37 times faster than a mouse click, all just to run the Carnival and the Knife.

[00:48:46] And you realize, you know, that the physicists got their tunnel after all, racing the speed of light. It just happens to be in the Midwest instead of Texas. And it happens that the data they're crunching is from a parallel universe and not our own. And so that 825-mile trench is done, but that's just Chicago and New York.
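
Those numbers hang together on the back of an envelope. Assuming light in fibre travels at roughly two-thirds of its vacuum speed (refractive index about 1.5), the round trip through an 825-mile pipe and the half-a-million-microsecond mouse click from earlier give almost exactly the "37 times faster" figure:

```python
C_VACUUM = 299_792_458          # speed of light in vacuum, m/s
C_FIBER = C_VACUUM / 1.5        # light slows to ~2/3 c in glass fibre
MILE = 1609.344                 # metres per mile

route = 825 * MILE              # the NYC-Chicago trench, in metres
round_trip = 2 * route / C_FIBER
mouse_click = 0.5               # "half a million microseconds", in seconds

print(f"round trip: {round_trip * 1e3:.1f} ms")
print(f"a mouse click is {mouse_click / round_trip:.1f}x slower")
```

That comes out to about 13 milliseconds for the round trip, or roughly 37 mouse clicks' worth of trading per click.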

[00:49:06] And this is a story, really, like all stories, about Manifest Destiny, about looking for another frontier. And there are always more frontiers, and you need dynamite and rock saws to get there, if an algorithm is going to close the deal three milliseconds faster. But there's plenty of space left to go, right?

[00:49:25] So this is theoretical, but important. This is two MIT mathematicians who have calculated planetary-scale computing architectures for electronic trading. This is a whole bunch of really, really fancy stuff about light cones and quantum entanglement and things that I don't get at all, but I understand this part.

[00:49:50] Which is easy to understand: the red parts there are where the people are, where the markets got built in the first place. And if you really want to run algorithms effectively, optimally, in the market, you're going to have to locate your servers where those blue dots are, in order to minimize the time that it takes for the information to travel.

[00:50:12] And the thing that you'll notice, looking at this just casually, is that an awful lot of those blue dots are in the middle of the ocean. And I don't see that as a problem, necessarily, because I think that we can solve that like we've solved everything else. We'll build bubbles or platforms, and we'll figure out how to part the water for that.

[00:50:32] That doesn't seem that ambitious, because it's a super bright future, if you're an algorithm. But it's not the money part that's amazing about all this, right? It's what the money is motivating. The idea that we are terraforming, we're actually terraforming the earth, with this kind of algorithmic efficiency.

[00:50:55] And in this light, if you return to Michael Najjar's photographs of the mountains shaped by the Dow Jones, you realize that it's not a metaphor at all, that it's prophecy, right? That Core War was no allegory, right? That Michael Najjar is not metaphor, that all of this, they're just playful fucking rehearsals for everything that we are already doing.

[00:51:18] Right? For all of the tectonic effects of the math that we've made, right? And this is the bottom line, right? The landscape has always been shaped by nature and by humans, and now it's shaped by this third co-evolutionary force, which is the algorithm. And we will understand those as nature, because in a way, they are.

[00:51:40] Thanks. [Applause]

[00:51:58] So

[00:52:02] Matt Jones: Stories without authors. We can just go home now; we don't need to listen to Bruce. I'm going to introduce Bruce with just a little something, something that I've just dialed up on my pocket computing device.

[00:52:20] There's lots of these things. I figure hundreds, maybe thousands, all different kinds. And every one as stupid as dirt. Or else we'd be dead and disassembled a hundred times already. Pete stared at the dissected robots: a cooling mass of nerve netting, batteries, veiny armor plates, and gelatin. Why do they look so crazy?

[00:52:41] Because they grew up all by themselves.

[00:52:44] No one ever designed them. Ladies and gentlemen, that is from the short story Taklamakan, by Bruce Sterling. This is Bruce Sterling.

[00:52:54] Julian Bleecker: Right, the author without a story.

[00:52:59] Bruce Sterling: So, yeah, I'm the obligatory literary guy at the gig, so I've got like no slides. I'm just going to tell a thrilling wonder personal anecdote that happened to me when I was thinking about showing up here. Because I've seen the other two Thrilling Wonder venues, and I wasn't there, but I was keenly aware that they're full of FX guys and landscape futurists, and I'm quite the fan. And, you know, Cory Doctorow says hi, and William Gibson's got the Berg lab cootin. So I was keen to think, you know, what kind of thrilling wonder am I going to bring to the table here?

[00:53:34] So I was looking over my email and trying to figure out if I could manage to travel in from Turin, which is where I happen to be staying right now, and I was making some scrambled eggs and bangers for breakfast. Sausage, you know. Went out and bought some sausage, so I'm slicing up my sausage.

[00:53:50] I'm cooking it and thinking about the wonderment. And then I realized there was an odd smell in the room. It's like, there's a horse in here. The smell of a wet horse. And then I realized: it's horse sausage. Because I went to the Slow Food grocery in Turin, and, you know, I'm not picky.

[00:54:09] I just tend to pick them out by the shape. And I bought this particularly elaborate Piedmontese sausage, and it was horse. It was an equine sausage. So I was startled by the smell of this sausage. And I thought, you know, this is the most quotidian experience you're supposed to be able to have.

[00:54:30] I mean, you're cooking sausage for breakfast. What could be less gosh-wow? And then it occurred to me: okay, well, what's the most thrilling and wonderful possible sausage? I mean, how could you really jazz it up, you know, get some kind of really high thrilling wonderment out of a thing this everyday?

[00:54:54] And, you know, if you're in the science fiction business, as I am, there are certain things you can deploy. Okay: it's an artificially intelligent sausage. It's a nano-carbon sausage. The sausage time travels. The sausage went to Mars, you know. And there are certain things that I think really do have a thrilling wonder.

[00:55:13] You know, because I'm not a cynic about the subject. I mean, I think it's a real part of experience. There are things like the size of the cosmos and the age of the earth, and these are things that I think would cause wonderment in basically any cognitive organism. You know, a hylozoic amoeba would be kind of amazed by that. An intelligent algorithm that had girdled the planet for its own

[00:55:40] for some very amusement-driven purposes, if it knew how old the universe was, it would feel a sense of some kind of disturbance, or else it would just be kind of a windup Tinkertoy. And all the rest of it, all the rest of the thrilling wonder, is basically socially constructed. Or it's got something to do with our bodies, you know.

[00:55:58] Like thrills: explosions, fires, fights, high-speed chases, your basic FX. These are thrilling, but they don't really provoke any wonder of the kind I'm talking about. They're just metabolic, and they work in any genre. They don't have anything really to do with the spearhead of cognition, like my sausage, okay?

[00:56:16] I can have an exploding sausage on fire with two guys fighting over it, and that's thrilling, right? I mean, the sausage is on fire in those flames, and the guys are just punching, okay? That's not really going where I want it to. So, you know, what is the most thrilling and wondrous horse sausage here in our actual disappointing reality?

[00:56:43] And being a novelist, I'm going to draw a verbal portrait of this thing. Okay, you know what a Przewalski's horse is? Przewalski's horse, very rare form of horse. They're not even related to actual horses. You know, they're more like onagers, or mules. They don't have the same chromosome count as other horses.

[00:57:02] But they're just sort of existent on the earth. I mean, they were discovered by a Polish explorer named Przewalski in Central Asia, and there were just a few of them left. They're the relic horse, you know, kind of a neo hippist. And they were almost all dead, but Przewalski managed to grab a few, and there have been a few kept around.

[00:57:20] But no wild herds, because they're vulnerable. They're small, kind of dwarfy. You can't ride them. They're not real grown-up horses. I mean, they're called a horse, but they're another kind of animal; they just look like horses. And the biggest herd of Przewalski's horses is in Chernobyl.

[00:57:40] Because there's this huge involuntary park in Chernobyl. It's as big as some small European countries. And people realized that, okay, this thing blew up, and it's saturated everything around it, and for generations nobody's going to be able to dwell here, at least not legally.

[00:57:55] So what are we going to do with this thing? Brilliant notion: we're going to round up some Przewalski's horses and just let them run around on the grassy plains of Chernobyl, trampling through the buildings, whatever it is they want to do. A fantastic notion. And the horses took to it.

[00:58:13] They really dug it. I mean, they're wildlife, and there's no people around and no other horses, and they've got their niche there. So they were multiplying, until recently. And then the fans of the Przewalski's horse herd, who periodically go out to keep track of them, noticed that a few are missing, and they're like, okay, what is it?

[00:58:32] Is it equine distemper? Something must have happened to them. Poachers. Poachers are preying on the Przewalski horses in Chernobyl, this vast involuntary park, this damned and gloomy, gothic, high-tech, radioactive netherworld. Somebody's eating the horses. Not private taxidermists. Nobody's making horsehide couches out of them.

[00:59:01] And they're not thrilling wonder supervillain bad guys. I can pretty well tell you who they are. I'm almost certain they're bored Ukrainian teenagers. Male teenagers. And they're sitting out at the edge of the exclusion zone, and it's like, so, you know, have you seen a job lately, Yuri?

[00:59:22] No, Leonid, I really haven't, you know? I mean, we don't have advanced degrees. We've already had a Twitter revolution here in Ukraine. We had the Orange Revolution. We had, like, you know, hip dudes with the sandhills. We went out, we stood in the public squares, froze, shouted the government down, brought in a new government.

[00:59:41] They just did the same thing the other guys did. What are we going to do with ourselves today? Well, you know, let's get in granddad's pickup and take some rifles out into the zone again. It's not a good thing to do. We know it's kind of bad. The cops don't approve, but, you know, at least we can feed the family.

[01:00:02] So they go out there, and they're looking around for pretty much anything they can bag. And along comes one of these little horses. It's like, oh yeah, it's one of them, man. Pick it off, pick it off, hit it in the head. Pow! It goes down, so they rumble over quickly. You know, are there choppers?

[01:00:18] No. It's like, grab him. The two of them are there. Maybe they've got some levers or whatever. They're muscling the thing up into the back of the pickup, and then they drive off with it. Okay, so they've killed the Przewalski's horse. What are they going to do with it? Well, you know, they're going to eat it.

[01:00:33] But they're going to sell it, because they need money more than they need food. And what can you do with a dead horse? A small, dwarfy one that's kind of weird. You can't really sell big pieces of it, like the leg or the chops. You really need to render it, you know? You need to make it sort of unrecognizable, and in fact, you don't even want people to know it's horse.

[01:00:54] And you certainly don't want them to know it's from the Chernobyl exclusion zone, because basically the sucker glows in the dark. I mean, these animals are accumulating radioactive material in their flesh as they're grazing on this radioactive-soaked stuff. So, you know, there they are. It's like, okay, we've got to render it now.

[01:01:10] It’s like, oh, well, that’s not you and me, but it’s this other guy. So they pass it around to this other guy and he’s like running the offshore kind of, you know, handheld, you know. Well, I didn’t used to be a butcher, but you know, I just like tried for a while and after a while I got pretty good at it, you know.

[01:01:24] Chopping the thing up, okay, then it goes into the food distribution chain. Where is that sausage going to go, plausibly speaking? I mean, maybe Belarus, because Belarus is like the last Stalinist outlaw state, which sort of means that everybody breaks the law, and there’s just selective repression, so you know, and it’s close.

[01:01:44] But economically speaking, the answer is almost certainly Moscow, because it’s the center of world distribution, especially for any black market activity and any Ukrainian black market activity. You’re gonna want to offshore it. If it goes offshore, it’s gonna go to Moscow. So who’s eating the sausage?

[01:02:01] Well, there’s a guy sitting in Moscow, probably looks pretty much like the guy in Tarkovsky’s film, Stalker, you know, rugged older guy, you know, got his collar up. He’s in an unlit cold water apartment, you know, and in comes the sausage. And he’s like slicing it up, you know, and it smells a little odd, you know, smells a little odd like horse sausage, but you know, he’s not picky cause he just, you know, he needed this sausage and he’s eating it.

[01:02:29] He’s munching that horse sausage that glows in the dark. The weirdest sausage on earth. The weirdest sausage on earth, if he knew how to look, which he doesn’t, he doesn’t have means, motive, opportunity. Or the desire to figure out where his sausage came from. He’s just plain eating it. Now, that’s a very thrilling wonder story.

[01:02:55] I mean, it’s very BLDGBLOG. Right?

[01:03:02] It’s very Boing Boing. I read them. You know, it’s got the classic Gothic High-Tech sensibility that our time really likes. It’s just kind of a cool, you know, forwardable kind of internet story. You know, it’s not going to go anywhere. It would take some celebrity thing to push it. Like if you learned that Steve Jobs had once visited Ukraine, and he’d tired of his vegetarian diet and eaten a bunch of sausage, you know, not, not wanting anybody to know.

[01:03:31] Are you doing the apples? You know, I’m a vegetarian, but wait, here in the Ukraine they’ll, like, feed me all this stuff. And now he’s dying of pancreatic cancer horribly later, because he did die for some reason, you know? Quite likely something he ate. Some kind of, some kind of quack hippie thing, you know. Well, you know, you look at what happened to the guy.

[01:03:52] Okay, now that would make that story explode. Like, Steve Jobs killed by radioactive Chernobyl horse sausage. You know, if that were the case, it would be a titanic story. It would be like a dominant cultural narrative. You know, it would be like the last stories out of the Titanic. Okay, that’s not, that’s not actually going to happen.

[01:04:16] Did I eat one myself? You know, was that horse sausage actually a Chernobyl sausage that somehow got passed along the slow food chain? You know, maybe it’s not really probable. Maybe it didn’t. So, you know, what is this doing to us? Well, you know, this is a dark, cyberpunk parable of a world with a very conflicted attitude toward thrilling wonder.

[01:04:39] Because we really like it, and the people in this room are super good at it. You can match anybody on the planet. I mean, you’ve got seers of thrilling wonder in this room, visionaries, but that’s not the dominant narrative of our time. I mean, our dominant narratives for the past, wow, 12 years, First Terror War.

[01:05:01] And then Money Panic, and now Giant Ripoff. Those are our kind of narratives, you know, and that’s what you see every day in the newspaper, and that’s kind of what we’re fed, and we’re like way into that. What is this about? I mean, here we are, gathered under the, like, the kindly shelter of the faltering architecture biz.

[01:05:23] I love them. You know, and we’re like a little speculative culture, you know, it’s amazing. You know, how we can follow this exceedingly variegated series of things from all different kinds of disciplines and nobody’s missing a beat. I mean, it’s like supposedly amazing, but nobody’s really all that amazed.

[01:05:41] They’re just kind of, you know, writing stuff down. They’re like following it. Like I get the pitch. Algorithms are taking over the earth. You know, I’m like, I’m fine with that. I’m doing great. Bring on the next one.

[01:05:55] Bruce Sterling: But the world outside these walls is like a giant guy eating the radioactive horse sausage without knowing it, you know, because that’s the condition to which the world is reduced by these events.

[01:06:14] And is that thrilling or wondrous? You know, is that a thrilling and wondrous situation that, you know, real life outside the FX biz? Is that weird? You know, is it, is it? Is it wondrous or not wondrous, or is it somehow both at once, like some kind of figure ground illusion? It’s like a point of, you know, a socially constructed point of view thing where you can just like whip them from one, one to the other.

[01:06:39] Like William Blake, you know, who famously said, there’s infinity in a grain of sand. Okay, why did he say that? You know, because he was a crazy visionary poet. Yeah, but also because it’s the truth. I mean, there really is infinity in a grain of sand. You know, I don’t think anybody in this room would have any trouble demonstrating it.

[01:06:56] You can come up with, like, a PowerPoint pitch: this is my grain of sand, it’s like, look, we can drill down into it. Okay. Any one of us could do that. But you know what happens: if somebody gives you a bucket of sand, nobody ever says thank you for a billion infinities.

[01:07:14] You know, if we really knew what we were doing, we’d be doing something about that. Right? I mean, not just the amazement, the thrill, the wonder, the user base, you know. We would be able to do it in a way that, like, burned with a hard, gem-like flame.

[01:07:36] Whatever next. Thank you.

[01:08:05] So, uh, Kevin, I’m keen to do the Waxahachie thing. Because I’ve been to Waxahachie. So that’s a real place. Yeah, I’m from Texas, so, you know, I went there. And in fact, I went to the public hearings when they were shutting the thing down. And it really is a place to conjure with, you know. I was sent around the world by WIRED once for this, uh, story called Spirit of Mega, which was all about the biggest, most thrilling, and most wondrous architectural projects in the world.

[01:08:31] And I insisted that one of them should be the, the collider, the superconducting supercollider, because, you know, with the figure-ground thing, you had to have one that was a spectacular and utter failure, you know, a kind of anti-, anti-Eiffel Tower. And, um, now, it’s a super interesting area. They never did build the entire loop, by the way.

[01:08:53] They built 16 miles of it and then stopped drilling.

[01:08:56] Kevin Slavin: What do you think is in there now?

[01:08:58] Bruce Sterling: They actually, very carefully, shot all the plugs through with concrete so that nobody could get into them. So there’s a 16-mile arc of tunnel, which is still open, and, you know, about as big as this room. Uh, and it’s just like one solid piece, and, you know, it’ll be discovered someday.

[01:09:15] I mean, some guy with an algorithm is going to be like, what’s this echo in the ground?

[01:09:18] Kevin Slavin: They’ll be like, it’s probably a supercollider.

[01:09:21] Bruce Sterling: No, they’re going to think it’s some kind of pharaoh’s tomb from the long lost industrial age. And they’re going to, you know, bend every effort to go in there and dig it up.

[01:09:29] And they’re going to find somebody abandoned some newspapers and floppy disks in there at the last minute. It really is, you know, it’s a colossal thing. I mean, it’s a super, it’s an element to conjure with, that thing. And it’s because it was buried by physicists and their supporters. And it’s also because of the weird guilt involved.

[01:09:50] I mean, the reason they got that money is not because of

[01:09:59] It’s, it’s a payoff to not blow up the planet.

[01:10:01] Kevin Slavin: Right, right. Yeah, I mean, it’s, yeah, no, it is, it is a funny, it’s, it is a funny thing. I mean, it does, it does conjure up the image of, of, of physicists as sort of like dangerous elements on their own, that these things get built to sort of cluster them around.

[01:10:16] Bruce Sterling: Well, you know, uh, Trinity is an ultimate thrilling wonder moment. Like, it goes off at 5 a.m. and nothing like this has ever happened before or since. You know, and like you can see popular culture pivot on a dime and just, you know, go with this narrative. I mean, that was like a super, super sci-fi gesture to make, and nobody who was there really got over it on some basic Oppenheimer level, you know, they were all marked by that.

[01:10:45] Just don’t do it that way anymore, except maybe in the finance biz. Right. Bruce Banner goes to Wall Street. Um, we are now going to interlocute to you. You’re already starting to do it. Um, so that you each swallow each other’s sausages.

[01:11:05] Kevin Slavin: I’d like to have that struck from the record, please.

[01:11:08] Liam: Um, Julian, when you were showing the picture of, um, of, of the, the famous Pan Am flight to, to, um, the Lunar Station, I was struck by the fact that it wasn’t an infographic.

[01:11:23] It was a list of words. And there, and there are some things, some sort of cultural forms, which just weren’t seen from those points of view. And, and today we’ve been looking at sort of the way that cultural forms kind of spring from technology, and technology springs from cultural forms, and, and kind of, even the most far-sighted

[01:11:46] among, among us, kind of, um...

[01:11:50] Julian Bleecker: I’m gonna go ahead and start with the question.

[01:11:55] What’s the dark matter in, in kind of the stuff that you’re butting up against? I’ll sort of open that to all of you. What are the things that you think you can’t ever see from where you are?

[01:12:09] Julian Bleecker: The unknown unknowns.

[01:12:11] Julian Bleecker: Yeah. Um, it’s scary to even think about. I guess maybe the same way that dark matter is scary to think about.

[01:12:17] This idea that there’s, you know, we, we, we, in a way, maybe almost like algorithmically we want to track to the next thing, or we want to figure out what, what are the other opportunities for the equivalent of only seeing text in, in a, in an information visualization and not realizing that there could be something else.

[01:12:35] Just not knowing enough to know that. And I think it’s finding those things that’s most exciting for me. You know, in very modest ways, a thrilling wonder story is, uh, wheels on luggage. The fact that there was this time before people put wheels on luggage, and we schlepped around 50-pound Samsonites, is absolutely mind-blowing to me.

[01:12:56] Like that, just understanding that transformation. It’s just those small things, just the little accents. So, the super colliding... the superconducting supercollider is that, you know, it and Trinity are the big epic stories that things pivot on. But even those small little accents to the world, I think, are as relevant.

[01:13:14] And trying to find those, just those unexpected little things: I’m just going to change that a little bit to the left, and all of a sudden everything changes, and we can’t imagine how the world existed without it.

[01:13:26] Bruce Sterling: What I really see in those unknown unknowns is the kind of reason to get out of bed in the morning.

[01:13:31] I mean, to me, they represent a kind of Václav Havel-style hope, you know, of salvation, like a reason to get out when things seem darkest. It’s like, things can’t seem that dark, because you don’t know all the determining factors. You can’t really allow yourself to become that, uh, that, uh, depressed, because it’s an act of intellectual, intellectual arrogance.

[01:13:53] You know, there’s just stuff going on that’s beyond our, our, our ken. And, um, that’s very vivifying to me, you know, I, I must say. And looking back over my own life, the things that gave me the most joy and pleasure, the things that made me really happiest to be alive, were things I could never have predicted for myself, ever.

[01:14:12] Yeah. Um, and that’s what really brought me a sense of contentment, you know, not things like, oh, I’ve got this aced, I’ve, like, cynically got this figured out, that’s sure to work out that way. Okay, when that happens, it’s banal. You know, it’s just, okay, fine. You know, I’m not like, oh wow, you know, give me my Mr. Wizard hat.

[01:14:30] I mean, the, the thrill’s gone for that. You know, the thrill is really, uh, is, is elsewhere, and it’s in things I don’t know. Okay.

[01:14:43] Kevin Slavin: Well, um, you know, I, um, there’s this phrase that every American’s using right now, which is, uh, I have no dog in that hunt. Um, and the, the hunt, uh, in question here for me is, um, all this research that I’ve been doing into, into financial products and financial services, which, which I don’t really have any investment in, in the conventional sense of investment.

[01:15:15] Um, but I’m super fascinated by it, because it is its own, you know... they’re abstractions of the world, and the thing that’s interesting to me is, is the vector towards absolute abstraction. Um, uh, and what I give as a design assignment now, uh, to students at Cooper Union, which they did very well at, um, was... So, after, after mortgages, right, like, there was the idea that, like, it’s not the home, right, it’s the, it’s the system of payments around the home.

[01:15:54] After you, after you package those up and tranche those up into these, you know, CDOs, collateralized debt obligations, whatever, you, you pack those up so that it’s impossible to actually even find the outline of a home in there, right? Like, the, you know, the idea that there’s actually somebody living in it, cooking sausage or whatever, has been, like, completely removed out of it, and they managed to build these weird ideas of ideas out of that, and then, and then took us all into a story that we didn’t even know that we were part of. And when that failed, because it always fails, because it always reaches the, the, the conclusion, there, there is in the end something concrete.

[01:16:38] And, um, where do you go from there? And, um, I know where they, where they’re going from there. And it’s so amazing. And I, I give it... so I, I do this thing with students where I just, I outline exactly how the mortgage crisis was, was produced. And then say, so what would you, if you were gonna, if you were gonna carve it up, if you were gonna carve something up now, what would you carve up?

[01:17:04] And, um, uh, I did this very recently and one of them got it, uh, which is life insurance. Uh, which is what’s being carved up right now, uh, in the United States, which is that everyone’s life insurance is being bought and sold and repackaged into these fucked up collateralized debt obligations, exactly like the mortgages were.

[01:17:21] And the difference is, is that it takes longer for humans to fail than a mortgage. Um, so we won’t see that in the five year cycle that we saw with mortgages. We’ll see that in like what might be a 20 or 30 year cycle. What I’m interested in is then, what is after that? Um, what’s after that? What is even further, what takes even longer to fail?
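Kevin’s description of tranching, pooling payment streams until "the outline of a home" disappears, can be sketched as a toy cash-flow waterfall. Everything here is illustrative: the numbers, the single senior/junior split, and the function name are invented for the example, not taken from any real CDO structure.

```python
# Toy cash-flow waterfall (illustrative, not any real CDO structure):
# pool many payment streams, pay the senior tranche first, and let the
# junior tranche absorb whatever shortfall the failed payments create.

def waterfall(payments: list[float], senior_due: float) -> tuple[float, float]:
    """Split pooled cash into (senior, junior) payouts; junior takes losses first."""
    pool = sum(payments)
    senior_paid = min(pool, senior_due)
    junior_paid = pool - senior_paid
    return senior_paid, junior_paid

# Ten mortgages of 100 each; two fail. The senior tranche (due 700) is still
# paid in full, so from the outside nothing looks wrong -- the individual
# homes have disappeared into the pool.
payments = [100.0] * 8 + [0.0, 0.0]
senior, junior = waterfall(payments, senior_due=700.0)
print(senior, junior)  # 700.0 100.0
```

The point the sketch makes is Kevin’s: as long as the senior tranche is paid in full, the individual failures inside the pool are invisible from the outside.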

[01:17:44] Like, what is the abstraction of the abstraction of the abstraction? Um, and because there are all of these designers and physicists, et cetera, who are working on this problem in very real ways. Like, who are deeply, deeply incentivized to work on this problem. And the problem is one of abstraction. It’s not actually production.

[01:18:03] It’s abstraction. Um, and I’m just fascinated by their methodology. I mean, it’s just because, because in that, uh, uh, there are, you know, there are all the stories that are sort of revealed underneath that. Um, but so that to me is the, is the greatest unknown unknown, uh, is what happens after, uh, uh, uh, this next one.

[01:18:30] Julian Bleecker: Um, so can we take some questions for these guys?

[01:18:35] Yes, we can.

[01:18:38] Julian Bleecker: Excellent.

[01:18:41] Bruce Sterling: So, so does knowledge make us cynical? I mean, you know, understanding that there’s a little man behind your scientific experiment making the vacuum happen. Understanding the origins of your sausage. Understanding that actually behind the kind of, the magic of the financial world is some selfish little algorithm.

[01:18:58] I mean, you know, are we just making ourselves unhappy by knowing things? Getting rid of the wonder and the thrill and the magic.

[01:19:07] Julian Bleecker: Well, I think, um, from my perspective, uh, revealing that is, is exceptional, because it allows us to understand that we actually have control over, um, the, you know, the, the path that these things take.

[01:19:19] So knowing that there was actually someone typing on a keyboard, in a very kind of plain way, to construct the algorithm is incredibly liberating, because at some point you can track back and know that it didn’t just come from a space spore. That we actually are in, in, in charge of that bit. And so I think the, the point there is to, is to, is to reveal as much of that as possible.

[01:19:39] Not so that there’s some kind of, there’ll be some kind of fantastic Marxist upturning of the world because people all of a sudden realize the means of production, those kinds of things. But knowing that it is our hand, it is by our hand that these things, uh, occur.

[01:19:50] Bruce Sterling: Well, you know, I’m very attracted by Eastern European cultures and have spent a lot of time in that society.

[01:19:57] And the first time I went to Russia, I was like interviewing a lot of Russians, and I’m like, so, you know, what seems to be the problem here? And, you know, I was, you know, just a little... because they like to complain, and you, you realize they’re, you know, a culture of lamentation. Actually, in, in Serbia, where I spend a lot of the time, a very common popular greeting is, what ails you?

[01:20:25] It’s true, I swear to God. But, uh, you know, the thing that struck me was that all my correspondents, these people I was interviewing, were exceedingly knowledgeable. I mean, they really had elaborate, almost theological ideas of where and at what time their society had gone off the rails. But no two of them agreed on anything.

[01:20:48] So they were super knowledgeable and, you know, very well educated and very hip, and, you know, sort of masters of dark esoteric knowledge. But their cynicism made them incompetent, so that you couldn’t, like, put seven of them into two cabs. You know, they would have, like, fantastic, explosive arguments about who’s supposed to go when, and "but you did this back in 1975," you know, uh, long grievance narratives. And, uh, so, you know, I, I learned to shy away from that kind of all-knowing cynicism, because it’s like a short pathway to just foolishness, you know, and, and it can’t even be called knowledge. You know, what, what it is is a kind of intellectual arrogance.

[01:21:34] And, um, you know, the upshot is that you drink.

[01:21:43] Liam: Ah, one at the back. They’re always at the back.

[01:21:47] Julian Bleecker: Right.

[01:21:54] Bruce Sterling: This is slightly ill-formed, but it sort of ties into that as well. It strikes me that that sort of cynicism, or that sort of processing, is a very adult reaction. Um, of the three examples Kevin chose, you know, chess and Monopoly have expert-level play, but I can teach them to an eight-year-old. I can’t teach an eight-year-old Core War.

[01:22:14] I can’t, I can’t say to them, this is the way you understand the world. And interestingly, the Haynes manual as a reaction is such an adult reaction to the enterprise: I want to know how it works. How do I find that out? I go to the store and I buy a Haynes manual. And I was wondering, with a slightly precocious hat on, how do we explain this to kids?

[01:22:34] How, what’s the actual simple distillation of this? Because right now I’m seeing a really complex, adult, cynical, deep view of the future. But I don’t know how to make the toy that will teach you about this. I don’t know how to embed this in education. I’m curious as to the way you teach this modern world to children.

[01:22:54] Kevin Slavin: And, through education. There’s this thing called video games that are pretty good at that. I mean, like, I mean, that’s what, I mean, that’s what video games do, right? I mean, video games are, they’re, they’re, they’re living systems that many children, uh, engage with, uh, every day and are, uh, working on systems, working with systems of far greater complexity than I think kids have ever done and, and arguably take all of that for granted in a positive way.

[01:23:26] Um, uh, you know, it’s like, you know, the, the greatest procedural complexity that I had as an eight-year-old was two six-sided dice, or I guess 52 cards, right? And the idea of being an eight-year-old with, like, you know, the, the, the tools, uh, in front of us, uh, is incredible. And I don’t, I don’t actually agree that, that any of this is necessarily cynical at all.

[01:23:52] I don’t, I may sound cynical, but I, I, um, I’m not. I mean, I think, I think that the, um, the alternative to that sort of adult reaction, of, like, actually wanting to grasp and understand the system that is underneath all of it... um, the alternative to that hasn’t produced enormous happiness either.

[01:24:19] Um, uh, and I, you know, I see it... I’ve been, I’ve been spending time with, uh, just talking to some of the, the people who are at Occupy Wall Street, and it’s like, they don’t even know... what do you, what do you put on your sign? Like, what do you protest? How do you, you know, how do you, how do you express it?

[01:24:39] Uh, and they, they literally don’t have the vocabulary, uh, many of them. And I don’t think that having the vocabulary is what produces the cynicism. I think it’s, I think it’s just the consequences of the things that we’re talking about. But being able to approach them and understand them and find the, the magic, uh, that is actually sort of embedded inside them... the magic is still amazing to me, and I don’t, I don’t mean to take anything away from that.

[01:25:10] Bruce Sterling: Well, I, I hate seeing kids picked on because of adult troubles. And I, I think it’s very typical of a multidisciplinary meeting; they usually resolve on, let’s teach the kids everything we ourselves don’t understand. Uh, but, you know, my feeling is, like, the best favor you can do to a, to a child is, instead of coming up with some kind of perfect curriculum, you need to set them an example as an engaged adult. You know, like, okay, this thing is my fault.

[01:25:36] I’m accepting responsibility for this. Or, even though this bad thing happened, you know, I’m going to, like, do this other thing. You know, these things happen in life. You know, you teach them how to live, and they, uh, they understand that better than they do some imaginary notion that you’re going to somehow tell them what their own world is like when they’re going to be your age.

[01:26:02] Because even with the best intentions, you can’t really prepare them for that. Even assuming they have jobs, they’re, um, they’re not going to have the same jobs you had, or the same skill set you had. So, you know, why burden them with that? It’s better to just let them live, you know, and, and live in daylight.

[01:26:23] And take them, take them seriously as moral observers. There will be wheels on luggage. You can’t tell ’em that.

[01:26:35] Julian Bleecker: Questions on this side.

[01:26:37] Liam: Ah, it’s always I got it, mate. Okay, good.

[01:26:42] Bruce Sterling: Um, it’s a question about the algorithms again, um. So what seems to be very scary about the algorithms is not the algorithm itself but, uh, how it interacts with the humans. So one could say, um, if the algorithm would interact with the robots or machines, and the machines then would blow up the mountains and make the way and so on, I wouldn’t be very scared of the algorithm then, because clearly no one wants these kinds of machines and they would be stopped.

[01:27:13] But it’s the very fact that humans interact, and it’s the humans that blow up the mountains, that makes the algorithm even more effective.

[01:27:22] Kevin Slavin: That’s interesting. I mean,

[01:27:27] For me, there are two things that make them scary. And to your point, yeah, it’s not the algorithm itself; it’s, you know, it’s math, right? Whatever. But one thing is, um, the idea that there are algorithms that are, not in analytic roles, but in executive roles. Um, so, the algorithms that are pricing books on Amazon are arguably analytic, right?

[01:27:57] Like, they’re looking for a pattern in book prices and they’re trying to set a price based on it. But in the end, it’s actually a human who executes, you know, who says, yes, that book is worth it to me or not. But when you start to see things like the flash crash, where the same stock is being priced at one penny

[01:28:16] you know, this many ticks past the second, and at $99,999.99 the next tick, right? And actually trading, right? Like, actually trading, you know, at enormous volume. It’s the idea that there’s no... the idea that there is no human oversight, right? The idea that there is no babysitter, right? There’s nobody who says, like, yeah, okay, that’s a good idea.

[01:28:40] Right? In fact, the only button is stop. Right? Like, there are no other buttons. Right? That’s, that’s my problem with it, right? Like, my problem with it is, is that, you know, there’s this whole idea that, um, that they’re getting really, really smart, and we’re ignoring that. And it’s like, the problem is that they’re dumb, and that we’re ignoring that.
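A toy version of the "executive algorithm with no babysitter" point: two repricing rules reacting only to each other. The multipliers loosely echo a widely reported Amazon book-pricing incident, but the specific numbers and the function are hypothetical illustrations, not anything described in the talk.

```python
# Hypothetical sketch of two "executive" pricing algorithms with no babysitter.
# Seller B slightly undercuts seller A; seller A marks up over seller B.
# The product of the two rules is > 1, so the price compounds every round.

def run_until_stopped(price_a: float, max_price: float) -> tuple[float, int]:
    """Iterate the two repricing rules until a human finally hits 'stop'."""
    rounds = 0
    while price_a < max_price:
        price_b = 0.998 * price_a    # B undercuts A by 0.2%
        price_a = 1.27 * price_b     # A marks up 27% over B
        rounds += 1
    return price_a, rounds

final_price, rounds = run_until_stopped(price_a=20.0, max_price=1_000_000.0)
print(f"past $1M after {rounds} rounds, at ${final_price:,.2f}")
```

Each rule is locally sensible, but with nobody executing the trade, the feedback loop runs away exponentially until someone presses the one button there is: stop.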

[01:29:01] The problem is, is that they will never have a rational approach, in the sense that we can reason. They’ll never have a reasoned approach, in the sense that humans reason, and we’re treating them like they do. Right? I mean, we’re basically, we’re, we’re treating them as if they reason the way we do, and of course they don’t, because 99% of the time it appears the same.

[01:29:25] 99% of the time, Watson will beat all the other players on Jeopardy!, right? But because it can do that, it feels intelligent to us, and it’s not at all. And I think that’s the, that’s the danger.

[01:29:45] Liam: I mean, I’m intrigued: when you finished, you, you said we need to understand the algorithm as nature, because the algorithm already is, in a way. It made me think of... we, we, last year we spent time in Australia, and this is a geology, a landscape the very formation of which is the subject of story and myth.

[01:30:03] You know, the Dreamtime stories of the Aboriginal Australians are actually embedded in the formation of the ground. But, but cut out of this landscape is this extraordinary mining terrain. A terrain that, that, that’s, uh, kind of an inevitable consequence of our iPhone. You know, the, the gold on the circuit board that conducts better than any other mineral.

[01:30:24] Um, what’s amazing in, in terms of your conversation about algorithms is that things that, that give those holes in the ground shape, unlike the kind of the, the myth of the rainbow serpent that slid through making valleys and rivers, the things that give these mines shape is the fluctuations of the stock market.

[01:30:40] You know, uh, the, the gold price going up and down means that, uh, given the way that they refine the gold, it becomes more effective to dig a bigger hole and mine lower concentrations if the gold price is high. If the gold price is low, the hole is smaller. So you have real-time mining trucks and diggers linked to the stock market, which changes in real time the shape of this hole in the ground.
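Liam’s gold-price/pit-size coupling is, in mining terms, a cutoff-grade effect. A minimal sketch, with entirely hypothetical costs and a simplified recovery model:

```python
# Illustrative cutoff-grade calculation (all numbers hypothetical): a tonne of
# rock is worth mining when the recoverable gold in it covers the cost of
# digging and processing it, so a higher gold price lowers the cutoff grade
# and lets the pit expand into lower-concentration ore.

GRAMS_PER_TROY_OZ = 31.1035

def cutoff_grade(gold_price_per_oz: float,
                 cost_per_tonne: float,
                 recovery: float = 0.90) -> float:
    """Minimum gold grade (grams per tonne) that is economic at a given price."""
    value_per_gram = recovery * gold_price_per_oz / GRAMS_PER_TROY_OZ
    return cost_per_tonne / value_per_gram

# As the price rises, the economic cutoff falls and the hole gets bigger.
for price in (800.0, 1200.0, 1800.0):
    print(f"${price:>6.0f}/oz -> cutoff {cutoff_grade(price, 30.0):.2f} g/t")
```

At a higher gold price, rock with less gold per tonne clears the economic bar, so the market price literally redraws the edge of the hole in the ground.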

[01:31:06] So you have this condition where, you know, the, the, the culture that you’re talking about, the culture of algorithms, is actually shaping, uh, a geology, cut out of which you have another, you have this other kind of cultural landscape, you know, um, running through it. So, I mean, if we know what the world looks like where algorithms shape it, you know, it’s the world around us, now.

[01:31:28] What, what does this world look like when we do start to understand the algorithm as nature? You know, when we have a new cultural relationship to these holes in the ground that aren’t formed by a rainbow serpent, that we don’t understand in myth in the way that we typically relate to nature historically, but we understand in a new way.

[01:31:45] I mean, what, what does that world start to look like? Is that world still fearful for you? Or is it about the, the cognitive kind of imagining of the algorithm and, you know, us kind of coming to terms with that, that makes you feel content, I suppose, with your discovery?

[01:32:07] Kevin Slavin: I’m not sure that it, that it necessarily looks that different. I mean, it’s, it’s case by case. If you go to Mahwah, New Jersey, which is basically, it’s basically just... it would take too long to explain. There, there are, there are points on the earth that have been transformed. Um, where they were, they were substantively different, uh, ten years ago, physically.

[01:32:36] Uh, uh, even tectonic, just like, like actual... like, there’s just, there’s stuff there that wasn’t around, uh, uh, before this. But I think that mostly it’s just sort of, it’s all around us, the way all these other systems are all around us that we’re also completely unaware of. You know, it’s like, it’s like we’re no more aware of the ways that... I talk to the people who work in the building at 60 Hudson Street; they have no idea what’s in that building, right?

[01:33:05] And, but that, that will be the case for all buildings anywhere on the earth. Uh, no matter, no matter what they’re doing. And I think that the idea, it’s like, you know, the fact that everything happened on Wall Street, it’s because that’s where the ships came in. Uh, you know, bringing goods across the Atlantic, you know, the fact that that was also opaque was fine.

[01:33:24] I mean, I, like, I don’t, I don’t know that it’s, it’s, you know, this is what basically all we’re doing here is sort of mining the stories of why things are the way they are, but that doesn’t necessarily, it doesn’t necessarily render them, uh, uh, with a specific value relative to the people who are experiencing that.

[01:33:43] I don’t know. It’s a, not a very good answer to your question.

[01:33:46] Bruce Sterling: Well, I can’t answer that question any better, but I think there’s a human impulse to try and grab as much guilt as possible out of a situation like this. So you know, I would pose you like a thought experiment. Let’s just say that there’s an algorithm that actually looks like a moral actor.

[01:34:03] Let’s say we like crossbreed Google with Siri. So now Google can talk. So, you know, you get online, and so, so Google, what’s happening? I am Google. Okay. You know, at that point, you’re gonna see a whole lot of sudden, you know, a shifting of blame, you know, because the thing has got some kind of human face and can be interrogated and can be, you know, brought into a world of moral activity.

[01:34:29] But, it’s, algorithmically, it wouldn’t necessarily be doing anything more than it was, and, you know, it might not even pass a Turing test. I don’t think Siri passes a Turing test. I mean, Siri just responds to, you know, responds to queries by performing a stochastic algorithmic sifting of all the text that’s already there.

[01:34:48] Right. But, you know, we’re just, we, I mean, we don’t want to either sort of blame the algorithms or we basically, we wanna grab things away from them and somehow make ourselves important. And if we were in a world where we were actually minor players, where, you know, there was a third nature and we were something like.

[01:35:06] I don’t know, fungi. And you know, in the scheme of things, we’d still go on. You know, I mean, we’d still have like thrilling wonder meetings and you know, there’d be divorces and gunshots and you know, whatever it was. I mean, we’d behave like other species did, you know. And I would point out that if nature is somehow an algorithm, I mean, that works, that works backwards.

[01:35:31] You know, algorithms could be natural and nature could be algorithmic, and, you know, everybody has, you know, a kind of Mother Nature Edenic myth that nature never screws up, but there have been five major extinctions, you know, that we know of on, on the planet, you know, where just something went bust, you know, and that wasn’t us or our algorithms, you know, and, and they didn’t just kind of go bust, they went bust for like dozens of millions of years at a time.

[01:36:01] I mean, if that, you know, if that’s what nature is and that’s what an algorithm is, then we ought to be figuring that out and our attitudes toward it. It’s not like it’s perfect and we somehow messed up. It’s not perfect. I mean, there was the flash crash of, you know, 245 million BC, you know. Right, but they’re working on that.

[01:36:25] Kind of, kind of, over.

Liam: Do we have any other questions on the floor that Matt can grab?

[01:36:38] Bruce Sterling: No? Everybody’s, everybody’s worn out by the thrill and the wonder. Philip is shaking his head. Do you have a question, Philip?

[01:36:47] Philip: No, I don’t have a question. Just, just a quick comment, um, to offer back to this, this, this image of the bleak curmudgeons eating horse sausage and, and, and trading, trading cynicisms. Uh, you, you gave us a, a couple of different pictures of William Blake, and, and perhaps it’s an offering back to think of, on one hand, him being an absolute bastard, a curmudgeon, I mean a really horrifically nasty, impatient person, while oscillating to his vision of angels inhabiting every single leaf of every tree in his neighborhood.

[01:37:30] Bruce Sterling: You know, he’s a poet. [Laughter] They’re like that. You know, it’s like the quarrels at the, uh, you know, academic parking lot when there’s less and less, uh, I mean, I’m the literary guy, you know. I’m a literary guy. Once in a while, I hang out with poets. I’m always amazed at how much bitterness they have for one another.

[01:37:50] You would think that they would be full of solidarity. I was like, well, gee, a couple of centuries ago, we could, like, write national anthems and the earth would quake, you know. Everybody would bow the knee when a poet comes in. But they’re just like little backstabbing dudes, you know. It’s alarming how little fellow feeling poets have for one another. Rock stars are icons of solidarity compared to poets.

[01:38:15] I’ll never understand it. And I swear it’s true. It’s a thrilling wonder how little they can manage.

[01:38:27] Liam: I think that’s, I think we’re all worn out, man. But I just wanted to say, uh, yeah, let’s wrap it up there. And, um, I guess all that’s left to be said is really an amazing thank you to the, to what is just a, still a jaw-dropping collection of people.

[01:38:43] I’m still amazed that we’re able to get, um, everyone in the room together. So that’s really, really fantastic. So thanks all for giving up your time and coming in. Um, it really was amazing. Thanks so much.

[01:38:53] [Applause]

[01:39:07] And, uh,

[01:39:08] Liam: And thanks to, uh, before you all head off, and thanks to, um, thanks to Matt Jones, who, uh, who really is a, is a trooper. Every time I seem to get him into this building, I force him to stand in one room for nine hours, not feed him anything or give him anything to drink whatsoever. So, one more.

[01:39:24] Oh yeah, more, more sausage. Um, for the, for the masochists amongst you, um, New York is, is online at the moment, so you can log on and, uh, if you just need more thrilling wonders, um, I think, in typical BLDGBLOG bowerbird fashion, he’s mining the depths of astrobiology right now. Um, so please check in with that.

[01:39:45] Um, one more thing. The live feed, like all the things you’ve seen today, with the exception of Andy Lockley, who’s probably being chased down the street by an angry Christopher Nolan, um, will be, will be online, um, and archived on, uh, ThrillingWonderStories.co.uk. Um, in, uh, well, you know, in a, in a couple of weeks.

[01:40:03] But, well, you know, whenever we get around to doing it. Um, but you, we should be able to get on there, and actually already now you can get on, um, and check it out. Uh, sessions one and two are already up there. Um, but thanks again to, to, to all the speakers and to Matt, and, uh, and thanks to you guys as well for, uh, for coming in, sticking it out.

[01:40:22] Julian Bleecker: Well done, Liam.

[01:40:34] Liam: Thanks so much, guys. See you next year. Yay.

[01:40:48] Julian Bleecker: I’m gonna,

[01:40:57] yeah.

[01:41:10] I’m going to come finish for you. I have to, I have to take off soon. Uh, that’s why I have to disappear in a second.

[01:41:17] Bruce Sterling: I’m running late and this

[01:41:19] Kevin Slavin: has been noticed.

[01:41:20] Julian Bleecker: So, uh, I hear, uh,

[01:41:24] Liam: An app is trying to sell me a coffee as well. Oh

[01:41:27] Julian Bleecker: yeah? I had to hit, uh, Christmas. You’re really good at building it, aren’t you?

[01:41:32] Exactly, that’s, that’s the goal. Hey, how are you? Oh, she’s great. She’s doing

[01:41:40] Liam: really well. Yeah, she’s fine. She likes broccoli. So, uh, this is

[01:41:44] Julian Bleecker: very good. So let me say hello. And, uh, yes, this is Tim. Oh, that’s me. Yeah, I’m not the bugger. I bet she’s not the bugger. The area is like this. Glen. Hey, Glen. Hello. How are you?

[01:41:58] I’m fine. Yeah, I mean, thank you. I adapted it. I’ve probably got the cars and stuff in my inbox. Yeah, I’ve got it all for you. Yeah I put it into my inbox. She’s been asking me. She wants to speak to me. And it’s simples. It’s great, it’s good that it’s brought you out. Yeah. It’s great

[01:42:17] Bruce Sterling: something to explore with someone’s yeah

[01:42:19] Julian Bleecker: in those streets actually.

[01:42:20] You may go on uh, to somebody like I have. I’m going to go and uh, wall up with some of my camera gear. Yeah. No. Yeah. Uh, No. Yeah. No.

[01:42:44] Bruce Sterling: No.

[01:42:47] Julian Bleecker: No.

[01:43:01] No.

[01:43:07] Which was, which was exciting. Uh, I don’t know, that’s it. Are you, uh, are you going to come to dinner? Do you know, uh, do you know what this is? Heather, I think you will. Nice to meet you, it’s a pleasure. Heather designs urban, uh, images, and, uh, did a little bit of work in, uh, illustrating textures, and it was very, very interesting.

[01:43:33] I

[01:43:33] Kevin Slavin: don’t know what we’ll do.

[01:43:35] Julian Bleecker: There’s a lot of people. Reduce your temporary. Well, two months ago, I

[01:43:55] Bruce Sterling: just

[01:43:56] Julian Bleecker: want to say, listen. Stasis. Yeah. Yeah. Yeah.
