Or more accurately: complexity never decreases, both in the strict sense that if something’s O(n), it will always be O(n), and if you find a faster way of doing it, the complexity hasn’t changed; you’ve just been really smart (or you were particularly dumb to start off with).
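To make the strict sense concrete, here’s a minimal Python sketch (mine, not anything from the conference, and the function names are just for illustration): two ways of finding the maximum of a list. The built-in is faster in practice, but both are still O(n); doubling the input roughly doubles the time for each, so the speedup is only a constant factor, not a change in complexity.

```python
# Two ways of finding the maximum of a list. Both are O(n); the
# "faster" one only improves the constant factor per element.
import random
import time


def max_naive(values):
    # Explicit Python-level loop: O(n), relatively heavy per-element cost.
    best = values[0]
    for v in values:
        if v > best:
            best = v
    return best


def max_faster(values):
    # Built-in max(): still O(n), just a smaller constant factor.
    return max(values)


if __name__ == "__main__":
    for n in (1_000_000, 2_000_000):
        data = [random.random() for _ in range(n)]
        for fn in (max_naive, max_faster):
            start = time.perf_counter()
            fn(data)
            elapsed = time.perf_counter() - start
            print(f"{fn.__name__:10s} n={n:9d} {elapsed:.4f}s")
    # Doubling n roughly doubles the time for both functions:
    # same complexity, different constants.
```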
But I also mean this in another way, with a sloppier use of the word ‘complexity’: that just because you’ve done something before doesn’t necessarily mean it will be easier this time round. The complexity of the problem is the same, after all; the only real advantage you’ve got from having done it before is an existence proof. I think this is an important point that a lot of people miss. Then they act all surprised when building a new widget takes longer than they expected.
Complexity never decreases. You can represent this mathematically as:

dC/dt ≥ 0

which is what I’ve got on my t-shirt today at the second half of the O’Reilly Velocity Conference, which is all about performance and operations for internet-scale websites. And there’s some good stuff here: besides interesting conversations with vendors and a couple of product launches, the very fact that this is happening at all, rather than being confined to what I’m sure a lot of my fellow delegates consider more “academic” arenas such as Usenix, feels like a positive step forward.
Of course, some of this isn’t new. One of the conference chairs is Steve Souders, who has been banging the drum about front-end web performance techniques for a while, so it’s not surprising that we’re seeing a lot of that kind of approach being talked about. Since I follow this stuff pretty closely anyway, even over the last six months when I’ve been only flakily connected to the industry, I know much of this already; but that doesn’t make it a bad thing: there will be people here who haven’t had it explained sufficiently to them yet, so they’ll go away with important new tools for improving the sites they work on.
Some of the things people are saying are older yet, and some are a bad sign. At least four speakers yesterday, including Souders, laid into the advertising technology industry. However, I note that those speakers don’t appear to have actually sat down with the tech vendors, and the vendors haven’t got any representatives speaking here. No matter what people think, performance problems related to putting ads on your pages aren’t always the fault of the tech vendors, and even when they are, the vendors are open and often eager to talk to people about improving things. There’s a panel this afternoon on how to avoid performance problems with adverts, which I’m sure will have some interesting and useful techniques; but I’m equally sure some of them will date very rapidly, and very few, if any, will have been submitted to the tech vendors to make sure they don’t have unintended side-effects. People are thinking about this, which is good; but they also have to talk about it, and not just at smallish conferences in San Francisco but at things like Ad:Tech in New York. Hopefully this is the start of the right approach, though: advertising isn’t going away.
I’ll probably have more thoughts by the end of the day, but for now sessions are starting up again, so I’m going to go learn.