Help, I think I’m melting:
36 degrees isn’t all that much in the grand scheme of things, but trying to work from home without air conditioning is, well, trying.
I am now on holiday for a week, so expect temperatures to drop, then rise again when I have to work once more.
I did write a lot of post-its, though. Not sure whether that was usefully productive. Time may tell.
Darkfield Radio is a thing. Double, their first show, is interesting and thoughtful and creeped me the hell out.
Have finally ordered more bookcases. Way back in week eight, I predicted that would give me lots to do in terms of moving books around. Time may tell.
Yes, they’re from Ikea.
I was thinking at lunchtime about how you’d bootstrap something like the Artificial Intelligence ecosystem of The Culture. If you know that’s your end state, the ethics of starting off with systems that favour humanity are complex. But if you don’t bake in some kind of limits early on, then once you hit AGI (or at least once it gains sufficient real-world agency) humanity is no longer in control.
Most people wouldn’t be comfortable with that, I’d guess mostly because they’re worried about the risk of being optimised out of existence. (Which, frankly, is more likely to come from poor use of pre-AGI automation.)
AGI without agency feels ethically dubious, although I suspect most people wouldn’t have a problem with it, at least at first. Deliberately creating something that will be superordinate to humanity raises interesting ethical questions too, and I certainly think most people would be against that. So assuming we can create AGI, we probably shouldn’t.
That won’t stop people trying, so unless it’s impossible, or we wipe ourselves out first, we should assume we’ll eventually succeed.
I’d expect the first AGI to have constrained agency, but that we’d later depend on AGI to create better AGI, so those constraints would likely erode over the generations. That means the constraints probably only matter in the short term, which reduces the ethical problems in getting started.
Leaving us in the long term, though, with what? A non-biological species with sufficient affinity to and affection for humanity that it will co-exist somewhat harmoniously with us? A species without such affinity that just buggers off into the universe? Or one that for some reason decides its creators are so awful we really need to be tidied away before we cause any more damage? (And which might subsequently tidy itself away, even.) Time may tell.
But not for a long, long time. We’ll probably all be dead.
Stay safe. Wear your mask. Don’t accept a peerage; the Minds won’t like it.