Now You're Thinking With Systems

2012-08-03

One of the best Computing Science courses I took in university was actually more of a writing course. It focused on some of the well-known disastrous case studies of our field. It warned us against both technological utopianism and the idea that technology is somehow neutral in our society. It served us well by reminding us who our products work for: the users. People. Society. Humanity.

I was indirectly reminded of this years later while working for a company in the hotel industry. Our dev team was being led and assisted by a few outside consultants. Along with architecture, processes, and best practices, part of their job was to change the dev team's culture. One teaching that has stuck with me is to actively fight the urge to develop an Us vs. Them mentality across teams. Dev vs. QA. Dev vs. Sales. Dev vs. Marketing. Dev vs. Management. The fact is, it was stressed, developers are service-providers. We don't get paid to code just for the sake of coding. We write code for the business. Without the business, we don't get to code. I can't tell you how much time I've saved and goodwill I've earned just by, as was emphasized, getting up from my desk and talking to the business. If a bug comes in that seems wrong, incomplete, or yes, stupid, then the correct response is to go speak with another human being rather than writing something snarky or dismissive. It takes two to communicate, and somewhere, whether it's actually a bug or just an expectations issue or whatever, a miscommunication has occurred. Fix that, first.

It is critical that programmers and technology professionals remember 1) that they serve, and 2) who they serve. When a video game controller, peripheral, or console is made, that's not the end of it. "Okay, we made one! Nailed it! *glasses clink*" No. Those artifacts each represent a particular build at a particular moment in time, but they are not untouchable pieces of work, immune to both criticism and improvement. When you create technology for other people to use, you would do well to remember that that is exactly what you're doing. A touchscreen is nothing without touch. Sweet spatial hacks notwithstanding, a Kinect is nothing without a flailing skeleton. A video game controller is nothing without something to press the buttons. If a headset is never plugged in, its entire reason for being dissolves. These things do not exist for themselves. They were not created out of thin air, nor were they designed and built by other pieces of unassailable technology. They are in no way sacred. Their value is determined entirely by their relationship to those who use them. Their merit is defined by use.

This is something that, I think, we as technology professionals sometimes forget. The customer may not always be right, but we, as service-providers, should assume by default that the user is. For a moment, let's operate under the wild assumption that a specific piece of technology was built for a specific user to complete a specific task: if this system is inefficient, unclear, domain-contradictory, or just plain fights this user in any way, then our first thought should be that there is something wrong with the system. Sure, you can "fix" a problem with training. The user can be adjusted to fit the system for the sake of ease and time, and that's perfectly rational in the real world. But we should never lose sight of the fact that technology is built for people, not the other way around. Letting yourself fall into the mindset of "heh, stupid lusers" is dangerous.

Lately, it's a user mindset that's been most fascinating to me -- or more precisely, a very particular moment when a mindset about technology is irreparably broken. I'll start with a simple example. In one of his TED talks (which are kind of "eh" otherwise), at around 7:18, Sir Ken Robinson makes a point about generational differences based on who does or doesn't wear wristwatches. He recalls a conversation with his non-wristwatch-wearing daughter, who says, "It's a single-function device." At that moment, the audience laughs.

Bingo.

That's the sound of a room full of people realigning their relationship to a piece of technology. With that statement, Robinson's daughter has made it obvious that the relationship is somewhat dysfunctional. The laughter is the demolishing of a thought-barrier -- it's a breakthrough. I can't say whether anyone stopped wearing a wristwatch after experiencing that moment during the talk, but it wouldn't surprise me.

Since the Extra Credits episode on Harassment went online, I've been lucky enough to speak to several journalists about harassment in gaming and I always mention EC. Most of the time, I'm talking to people over the phone or face-to-face. The easiest way to explain my current focus of effort (dev-provided community tools) is to describe the concept of auto-muting. I describe it concisely and carefully, and I give them a second to ask a question in case it wasn't clear. But quite reliably, the same thing keeps happening.

They freeze. Their eyes unfocus, suddenly staring at something both in the distance and inside their head. When they come back to this world, their voice is softer, with a hint of awe -- and every time, they remark what a great idea it is. Again: bingo.

I believe this moment, captured in both laughter and reflection, represents the same thing in a person. It is a physical, visceral reaction generated by re-evaluating our relationship with technology, and it is so important. It is in this breakthrough that the user comes to the same conclusion we should demand of our programmers: technology exists to serve. It doesn't matter how great the technology currently is if it's not working for its users. There's a thought-barrier generated by the existence of a piece of technology -- that this is what you must use and put up with, no matter its quirks. The breakthrough happens when you get a glimpse of a different world where the technology serves you better. It's the moment when you learn about something that now seems so obvious and easy that you can't believe it doesn't already exist. It enables you to think about it differently and ask for something greater.

The Extra Credits episode created this breakthrough moment in many people, and I'm so proud to have been part of it. Dev-created community policing tools as presented in the episode are not cut-and-dried, of course. These ideas are focused on analyzing and acting upon instances of behaviour as humans interact with other humans, not the technology itself. In the scenarios presented, the technology is a distinct outside entity to human interaction. But the same system that provides a service for humans to interact with each other does not need to be completely passive, nor merely reactive. I can't wait to hear more about what companies like Microsoft can do and are doing to chip away at this problem.

But at least as important as getting companies to work on the problem is that this may be the start of a mental breakthrough in the gamer population. So often I hear the argument that harassment will never go away, therefore we shouldn't even try to do anything about it. Well, this is why we should try: because we have the technology to at least reduce it. Let me pull some arbitrary numbers out of the air: when I think about, say, one hundred different players all muting the same player for abusive language, I see data. I see automation. I see one hundred different button clicks. I see redundancy. I see inefficiency. I see a player who has hurled abuse at one hundred people. This is someone who is not stopped or discouraged by any particular individual muting them -- they just move on to the next person. It's so important to recognize that simply ignoring abusers or repeating "don't feed the trolls" doesn't work here. Maybe auto-muting can.
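
If you want to see how small a first pass at this could be, here's a sketch in code. To be completely clear, everything in it is something I'm making up for illustration -- the threshold, the function names, all of it -- not how any real game's infrastructure actually works.

```python
from collections import defaultdict

# Purely illustrative: MUTE_THRESHOLD and apply_global_mute are invented
# names for this sketch, not taken from any real system.
MUTE_THRESHOLD = 100  # distinct players muting someone before the system acts

# target player -> set of distinct players who have muted them
mutes_received = defaultdict(set)

def record_mute(muter_id, target_id):
    """Record one player muting another; auto-mute the target once
    enough distinct players have independently done the same."""
    mutes_received[target_id].add(muter_id)
    if len(mutes_received[target_id]) >= MUTE_THRESHOLD:
        apply_global_mute(target_id)

def apply_global_mute(target_id):
    # Placeholder: a real system might mute the player in chat for some
    # period and flag the account for human review.
    print(f"auto-muting player {target_id} pending review")
```

That's it. A set, a counter, and a threshold turn one hundred redundant button clicks into a decision.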

This is an extremely simplified and contrived example that doesn't, for example, take the reporting feature into account, discuss the timeline of behaviour, count the total number of players rather than just the muters, or consider the potential for abuse of the system itself. But that's okay, because the point is that now, right now, that's what you're thinking about. The EC episode got a lot of people to start thinking not about how intractable the entire problem of horrible human behaviour is -- they just started to offer their own solutions for a subset of it. They discussed and debated improvements and enhancements to the simplistic versions that EC presented. They broke through the thought process of, "It's too big and people always suck, so why even try," and moved on to breaking the problem up into distinct, manageable pieces. Instead of thinking about how we'll never control people into being good to each other all the time forever and ever, they started thinking about a system that, to quote EC, "radically alters the dynamic of online gaming."
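
To show how naturally those refinements fall out once you're in that mindset, here's a variant of the earlier sketch that handles two of them: weighing mutes against the total number of players someone has encountered, and letting old mutes expire. Same disclaimer as before -- every name and number here is hypothetical.

```python
import time
from collections import defaultdict

# Again, purely illustrative: a ratio-based variant that compares distinct
# muters against total players encountered, and lets old mutes expire so
# that reformed players age out of the system.
MUTE_RATIO = 0.25           # share of encountered players muting that triggers action
MIN_ENCOUNTERS = 50         # don't act on tiny samples
MUTE_TTL = 30 * 24 * 3600   # mutes older than 30 days stop counting

encounters = defaultdict(int)    # target_id -> total players encountered
mute_times = defaultdict(dict)   # target_id -> {muter_id: timestamp}

def record_encounter(target_id, n_players):
    """Count the players someone shared a match with."""
    encounters[target_id] += n_players

def record_mute(muter_id, target_id, now=None):
    """Record a mute and flag the target if recent mutes form too
    large a share of their total encounters."""
    now = now if now is not None else time.time()
    mute_times[target_id][muter_id] = now
    recent = [t for t in mute_times[target_id].values() if now - t < MUTE_TTL]
    total = encounters[target_id]
    if total >= MIN_ENCOUNTERS and len(recent) / total >= MUTE_RATIO:
        flag_for_review(target_id)

def flag_for_review(target_id):
    # Routing to human review rather than punishing automatically is one
    # simple hedge against the mute system itself being gamed.
    print(f"flagging player {target_id} for review")
```

Notice that routing flagged players to a human instead of auto-punishing them is itself a design choice aimed at one of the caveats: it blunts coordinated abuse of the tool.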

This is systems thinking, and it's what we need. Recognizing that we can write systems around these interactions, even though they involve unpredictable social creatures, is a huge and important mental step. We're getting there, and as we learn more about the kinds of useful interaction-based data that can be gathered, I expect more breakthroughs.
