Humanist IT: the importance of choice

Diagram of a person in a fuzzy box, crowned by a subjective process and surrounded by a sea of additional nodes. Image by the author, 2025.

Underlying the baseline function of a governed data set is a thread of controlling behavior. The same playbooks are used in egregious monopolistic behavior, one-on-one abuse, cults, and authoritarian regimes. Its primary tool is simplifying information down to a false choice. When that doesn’t work, start lying. When lying doesn’t work, start hitting, or kill off the competition, or over-arm the police in the streets. In other words, forcefully curtail choice. Moat the information. Lock them in. Scare them.

Getting an effective list of the behaviors involved in controlling others is as simple as studying the dark triad. Those behaviors — like siloing, gaslighting, love bombing — are all behaviorally informed levers for manipulating information into a false choice. Not everyone using these behaviors is dark triad; but they are probably within a few degrees of someone who is.

If all the choices boil down to one option, like “use genAI or get left behind,” it’s not a choice. That phrase, in the context of our economic model, is a threat. “Getting left behind” is just a stand-in for “or else.” In this case, it means no longer making money. It’s really just one option: use genAI. This should make people incredibly suspicious from the start.

When we oversimplify a user path, we’re removing information, which also changes the flavor of the choice/non-choice being made. When we hide the ability to cancel, we’re gaslighting. All the patterns of deceptive design are rooted in manipulating information to push people into a false choice.

Curtailed choices are everywhere in our IT. They have become the norm. In the push to simplify — to make a flow so easy people could do it while barely paying attention — we often scoped too narrowly. We called all the things we didn’t really want people to do or be “edge cases,” vowed to circle back someday, and then started a new MVP.

If those edge cases were 60% of our users, we didn’t know, because we were focused on the positive-form metrics we were tracking. We didn’t track what we didn’t want to see. We tracked what made our company feel good, what gave us a sense of job security, or what stroked the ego of someone with too much power.

As more companies did this, it sent out everyday waves of frustration. Little curtailments, scattered everywhere. This is the status quo, so it’s hard to see. But little by little, the frustration grew. Little by little, stuckness coalesced.

People get stuck because information gets stuck.

Usually, information that has been simplified without offering easily found paths to greater density, or alternative, functioning pathways. I am positive that some of this is because our industry was following best practices: following where others had “proven” the least pain-inducing experience (still painful for some, sometimes even for most, because of how best practice is calculated).

Some of it is because information technology is constructed, documented information. When you build information to spec, manipulating it at the source is incredibly tempting for some individuals. They developed metrics of success that often rewarded the deceptive behavior they built into the information, and those metrics were copied.

We got stuck.