In AI, context drift is what happens when the data a model was trained on stops matching the world it's operating in. Everything still looks right, but the ground moved. I think that's happening to people too, not just models. How we make decisions, how we work, how we adopt new tools, it's all shifting faster than most of us are naming it.

I've spent 20 years in growth and product watching how people actually change their behavior. Not how we say we do. How we actually do. Context Drift is where I write about that.

What it's not: a hype newsletter, a roundup of every model release, or a list of ten prompts that will change your life. Other people do those well. I don't want to add to the firehose.

One small ask before you go. If somebody specific came to mind while you were reading this, someone who'd actually get it, forward this to them. That's the whole distribution plan for a while.

Talk soon.
Adam
