October 2023

Design is faith first, then fact

I’ve had 20 years’ practice at Design and almost every time I embark on a new project there’s a little voice inside my head that says, “What if we come out of this $10k research project with nothing? What if my toolkit doesn’t work this time?”

That voice is still there despite 20 years of hard evidence that, every time, that investment in research is money well spent. In fact, every time I finish a project I walk away thinking, “Gee, if I were that client, I’d pay double for that level of insight, de-risking, innovation, etc” – all the things that practitioners know design research is good for.

And so, if even I still have that little voice in my head, with 20 years of direct experience of Design’s value, how can I expect anything but scepticism from a client or executive who’s never seen what design methods can do?

Design as religion

It strikes me that Design is very much like religion. Design, like religion, requires faith – a belief that, if we follow this set of rituals and practices, we will achieve insight salvation; a miraculous understanding of our user/s that will unlock more social and/or financial benefit for the organisation and those it serves.

And, like religion, there are sceptics – design atheists and non-believers. Sometimes, they believe in Engineering. Sometimes, they believe in Product Management. Sometimes, they believe in Six Sigma management or ‘servant leadership’, which comes in many flavours. They say things like, “I don’t need the fancy colours and clothing of your faith, functional is fine. We’ll just ship faster and learn in real life.” Or, they say, “I already read the sacred texts of your Design religion and we don’t need to hire one of the clergy to perform the sacrament for us. I got this.”

And you know what I’ve learned? Sometimes, they don’t need Design. Sometimes, they have a market or a business that has no competition, or whose existing service landscape is so akin to the holy fires of hell that doing anything slightly better than it was before is good enough for now. You don’t always need the pope to exorcise a devil. Sometimes a bit of garlic around your neck will do just fine.

For years, the devout Christian members of my family have told me that I don’t know what I’m missing by not cultivating a personal relationship with God. And for years I reply, “I’m doing just fine, thanks.” And, I truly believe I am.

Just like we do in Design, religious institutions try to use testimony to show us what life could be like on the other side – if only we were more faithful. So and so was cured of cancer. God gave them a new baby when they asked for one. God protected them on their overseas adventure. Their prayers were answered. Loaves and fishes. Water and wine. The list goes on.

In Design, we write case studies to show the unconverted what they’re missing out on; “Company X unlocked 10x revenue with this one simple design change.” or “Investing in user research gave Company Y a whole new product category.” or “User efficiency went up 10m% because we understood their needs.”

Loaves and Fishes. Water and Wine. Designers perform miracles. Jesus does too.

Design as fact

Those of us in the clergy of this Design religion (let’s call ourselves “Professional Designers”) believe we’re offering something special; a version of enlightenment. Those outside the congregation question what the clergy actually does with the pennies they collect from the donation box. The clergy struggle to grow the church and we spend most of our time online talking about how the outside world could be saved, if only they came to church once in a while, or read this sacred text, or repented their sins.

The thing is, no matter what we think of the non-believers, there are some true believers who aren’t one of the clergy – they are non-designers who believe in Design. They are few in number but they are there. I know some! The question is, how did they become that way? What’s their conversion story? Might that be helpful to the Design clergy in evangelising the benefits of Design?

Can I get a witness?

I’ve worked with a lot of non-believers over my time. People who, for one reason or another, thought Design had nothing to offer. But, they read some HBR article at some point and thought, “maybe I’ll give it a try to see what all the fuss is about.” Or, they were working in a highly regulated space and needed the Design process – speaking with users and a report – for some version of due diligence and risk mitigation. There have been, on a very few occasions, the ones who need ‘saving’, too; those who fell so far from the grace of God by shipping waste-of-money software that, through the Design process, they sought repentance (mostly from their boss or manager).

And you know what? As I’m one of the clergy standing out on the street handing out fliers, I’ll take any non-believer I can get.

What I’ve learned over all these years is that someone needs to ‘experience’ the miracle of Design before they become a devoted follower. You can tell people as many testimonies as you like, but until you feel it – i.e. you go from thinking one thing to thinking another about your audience, or your entire body relaxes as you realise that you’ve de-risked whatever decision you were going to make anyway, or you unlock a market bigger than you ever thought possible – you can’t believe in it. You won’t want to go to Church again if you don’t come away feeling different each time.

Give me a shepherd I want to follow

When I was a kid, I was dragged to church weekly by my Mum. It was a really boring Catholic church that simply ‘ticked the boxes’. Read the book. Ding the bell. Recite the prayer. Put coins in the donation box. Eat the bread. Ding the bell again. Shake the priest’s hand on your way out.

I’ve been in design organisations that take a similar approach to their Design work – tick off the steps of the double-diamond (research, report, invent, test, refine etc) and arrive at the end. No internal transformation occurs in either of these scenarios. As a non-believer you’re left thinking, “That was a waste of time. I could’ve done something else with my Sunday morning.”

Why are we surprised if people don’t engage with Design if the clergy makes them yawn or isn’t invested in giving them the transformative experience we know is possible?

One time, I went to my uncle’s Pentecostal service. In contrast to the Catholic service, this was outrageously lively. There were people speaking in tongues, dancing in the aisles, falling over after being ‘touched by the lord’ and re-born into a new life. Unlike the Catholic gospel reading, in the Pentecostal equivalent, the pastor was critically analysing the bible, putting it into historical context, drawing lessons from the text, and applying it to the lives we live today. As a nine-year-old, I was blown away. I remember thinking, “Woah, if this is Church, maybe I could be into it.”

There are some design teams that do this, too. Teams where the design process is participatory at all levels. The clergy draw on a vast toolkit of design methods and apply them to the problem they’re trying to solve in very targeted ways. They don’t ‘always do focus groups’ or ‘speak to 8+ people’. They don’t always ‘do research’, or write reports, or do ‘divergent thinking followed by convergent thinking’. They are, quite simply, goal-oriented and focussed on giving their client the miracle of Design in whatever format suits the problem they’re trying to solve. It’s those clients that go away changed.

Building a Design Congregation

Building the Design Congregation, as it turns out, works much like evangelical religion. People need people-like-them to have experienced the miracle of Design and ‘spread the word’ that Design Is Good. That it does not judge. That it accepts all who are willing to listen, engage, and approach it with a curious and open mind. That Design is here to save them, and humanity, from the evils of the world like waste. Planting an equivalent of the Gideons Bible in every corporate office (Design Thinking by those IDEO guys?) ain’t gonna do it. It comes back to designing the design experience for non-believers.

Those who have experienced the power of Design through an internal transformation of their own worldview know that it’s not just faith forevermore, but fact. It’s not mystic creative genius by the anointed few, it’s logic and deduction. Where Design is *unlike* religion is that Design is empirical, evidence-based, collaborative, and iterative. The Designer is not The Pope, but a shepherd. There may need to be some faith to walk into the church for the first time, but once you walk out, you walk out changed. And then you tell others that, maybe next time, they could go – just once – to see what all the fuss is about.

Once I was blind, now I see, or something like that.

March 2023

When did software lose its softness?

Of all the things software UI could’ve been good for, personalisation tops the list. With no ‘hardware’ interface, the ability to customise and adapt any user interface to anyone’s needs is software UI’s competitive advantage. But that doesn’t seem to be how it’s working out.

Almost 20 years ago, CSS Zen Garden sprang up as a way to show the power of CSS to those new to the web. You could take exactly the same content, and, completely separately, make as many ‘front-ends’ as you wanted. I don’t mean tweaking a button colour here and there like we tend to get caught up on these days. No, this was absolute and complete wholesale UI change.

Two websites using exactly the same HTML can look completely different via CSS Zen Garden
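If you never saw it in action, here’s a minimal sketch of the mechanism (my own invented markup and file names, not the actual Zen Garden source). The content is written once; swapping a single stylesheet link produces a wholly different interface:

    <!-- index.html: the content, written once -->
    <!DOCTYPE html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <title>Zen and the Art of Web Design</title>
      <!-- swap this one line to change the entire interface -->
      <link rel="stylesheet" href="theme-brutalist.css">
    </head>
    <body>
      <main class="page">
        <h1 class="title">Zen and the Art of Web Design</h1>
        <p class="intro">Exactly the same words, radically different presentation.</p>
      </main>
    </body>
    </html>

    /* theme-brutalist.css: stark, monospaced, high contrast */
    body   { background: #000; color: #0f0; font-family: monospace; }
    .title { text-transform: uppercase; border-bottom: 4px solid #0f0; }

    /* theme-garden.css: soft, serif, centred. Same HTML, new world. */
    body   { background: #fdf6e3; color: #333; font-family: Georgia, serif; }
    .page  { max-width: 40em; margin: 0 auto; }
    .title { font-weight: normal; letter-spacing: 0.1em; text-align: center; }

That was the whole trick: form and content live in separate files, so the form can be anything.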

Back then, the internet was relatively new, albeit gaining fast momentum, and so us designers were more fixated on making the stuff ‘look acceptable’ – a focus mostly on aesthetics as we battled with early-days HTML table layouts, 1x1px gifs in a world before rounded corners, caveats around which size of monitor and which browser this website would be best viewed at, and ‘sliced’ images if we wanted things to be a little more ‘responsive’. It really was a moment where graphic designers adapted offline content like brochures to the online alternative – brochure websites.

Along came some ‘fancy’ technologies like Flash and, for a brief moment in time, there was a sense of playfulness to people and businesses who had presences online. Things like ‘splash pages’ were the designer’s equivalent of opening movie titles – how might one set the mood of this website, with a focus on form, before getting to the heart of the content?

As the usefulness of the internet evolved, so too did our focus on, and the emergence of, concepts like ‘usability’ and ‘interaction design’. We did this the best way we knew how – skeuomorphic design – taking real-world affordances for things like buttons and tabs, and mimicking them on websites and other digital services so that people understood what to do when they saw a particular interface element. Meanwhile, consultancies like Nielsen Norman were devising principles and practices for ‘best practice UI design’ which, whether you like it or not, still hold true today.

Software, and the user interfaces through which we interact with it, is our raw material and, by its very nature, is “soft” – abstract, malleable, fluid, adaptable.

Back then, when the world was a little less connected, most businesses expected people like them to visit. I designed websites for ice cream shops, car dealerships, and other small to medium-sized businesses that were mostly interested in giving people in their local area a way to contact them, and then, a bit later, make a purchase online.

But like a mycelium network, the internet exploded, amplified by the advances in mobile computing. Relatively quickly, humans could see, hear, and feel other humans – everywhere.

One thing I never really see the internet get credit for, at least at a meta-level, is how it accelerated humans’ understanding of one another. Some may call it ‘woke-ism’ but, in truth, the internet exposed, and continues to expose, the diversity of experience that exists across the human race – race, gender, disability, neurodivergence – you name it. I feel like we’ve only just scratched the surface; like light refracting into its component parts, there are differences in the human race we haven’t even noticed yet.

And, if we’re honest with ourselves, it’s fair to say we’re struggling with that. We’re struggling to ‘catch up’ to this explosion of awareness by categorising an ever-more-nuanced set of human traits so that we have a common language with which to discuss our individuality and design a world that’s fair and just for all. Because of the inherent complexity in the infinite diversity of human experience, our brains seem to get lazy so, as a way out, it becomes easier to stick with broad generalisations and proxies for certain values – us vs them, woke vs not woke, informed vs uninformed, left vs right. It’s not our fault, it’s just a reaction to the explosive power of the internet and the tools we’ve built to connect ourselves to one another. We are but animals in a rapidly changing habitat.

What’s this got to do with UI?

Well, there’s an opportunity – not necessarily a business one, although some folks will argue it is – for accelerating equality across all humans, and it lives in software.

Let me put it this way. The world is reliant on digital tools and services more than ever, and unless there’s an Independence Day (the movie) level societal collapse, that reliance is not getting any weaker. At the same time, with every day that passes, we are increasing our knowledge and understanding of the ever-growing diversity of the human experience.

See the connection?

There has never been a greater opportunity in human history to create tools and services for anyone. Meanwhile, software (and the user interfaces through which we interact with it) is our raw material and, by its very nature, is soft – abstract, malleable, fluid, adaptable.

Two more websites using exactly the same HTML via CSS Zen Garden

Why then, if we had the technology back in the late 90s to create completely different interfaces with exactly the same content, do we currently inhabit a digital world where software UI has become as stiff as hardware UI once was?

Sure, OK, I’m not an idiot, I know it’s cost. It costs businesses to build UI and the last thing any business wants is to spend money making ‘multiple versions’ of the same UI – designing, deploying, managing, supporting – especially if ‘target audiences’ are small, because the ROI on that investment is likely to be small. For what it’s worth, this is the same argument commonly made about addressing ‘inclusion’ (or ‘accessibility’, if you prefer that term) – and my answer is the same – fine, so who’s fixing it and what happens if we don’t?

The digital world talks about personalisation in the context of selling more stuff. It talks about AI in the context of ‘accelerating the commodification of everything’. It talks about ‘inclusion’ and ‘diversity’ in executive round tables where things get so complicated and nuanced that the easiest thing to do is bury heads in the sand or make ‘decisions’ which are often, though not always, empty promises for reform and change. But, as almost anyone in digital (and ecology) also knows – personalisation and diversity are accelerators of all sorts of success. It seems we’ve decided that it’s just not needed in our interfaces?

Look, I used to code. I used to be able to write HTML, CSS, JavaScript, C++, SQL, blah blah. And sure, over time, our systems have become more complicated and things have progressed such that the focus has been on scale – the most for the many. React, NextJS, and so on promise speed, security, and scale, but the problem of personalisation doesn’t fall on the engineer, designer, or product manager working to ship features because ‘that’s a business model problem’. I guess I’m here to ask the obvious question – what if it wasn’t?

What if, instead of prioritising speed and scale in our engineering frameworks, we built humanity in – a way for anyone to customise the way they interact with the services and tools we make?

The overwhelming joy and possibility of my first experience with CSS Zen Garden unleashed in me my love of the digital medium. For the first time, a designer could divorce form from content. We were no longer trapped in A4 document boundaries or DL brochures. We were no longer bound by the 3m x 2m shopfront window – that sign, or brochure, or document could be for anyone. Back then, I’m ashamed to say that I was also completely unaware of the diversity in the audience – their preferences and abilities – but now that I am, I can’t help but wonder about the power that lies within a modern-day CSS Zen Garden approach to building front-ends; not just for the ‘functional’ stuff like ‘accessibility’ and ‘inclusion’ but for the emotional stuff, too.

Maybe I’d buy more if my Etsy experience could look the way I wanted it to look, rather than what’s easiest for the team to maintain internally. Maybe I’d prefer doing my taxes online if I could organise the interface the way I wanted to, rather than what an accountant thought I should do. Maybe I’d have more fun on Instagram (which would lead to higher engagement) if I could change the colours, fonts, and customise the ways I ‘scrolled’ through content because it was just the way I preferred it? What if I could choose and own the interfaces between me and said company/product or service?
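As a rough sketch of what that could look like with today’s CSS (the token names here are mine, hypothetical, not any real product’s API), one approach is to expose an interface’s design decisions as custom properties and let the person, not the company, set them:

    /* The product ships neutral defaults as custom properties... */
    :root {
      --user-bg: #ffffff;
      --user-text: #1a1a1a;
      --user-font: system-ui, sans-serif;
      --user-spacing: 1rem;
      --user-motion: 1; /* set to 0 to switch animation off entirely */
    }

    /* ...and every component consumes the tokens, never hard-coded values */
    .feed-item {
      background: var(--user-bg);
      color: var(--user-text);
      font-family: var(--user-font);
      padding: var(--user-spacing);
      transition-duration: calc(var(--user-motion) * 200ms);
    }

    /* A person's saved preferences, loaded last, override the lot */
    :root {
      --user-bg: #0b0b0f;
      --user-text: #e8e6e3;
      --user-font: "Atkinson Hyperlegible", sans-serif;
      --user-spacing: 1.5rem;
      --user-motion: 0;
    }

The browser machinery for this has existed for years; what’s missing is the product decision to hand the tokens over to the person using the thing.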

What are we doing – the shapers of the building blocks of our digital experiences – to create a world that marries the best of what “Soft UI” can bring (adaptability, changeability, customisability) with the ever-increasing diversity of human experience that we know exists? And, should we, could we, be doing more?

November 2022

Somewhere between saviour and selfish

For the last decade, I’ve been changing. I grew up on a diet rich in sport, beer, fitness and, well, ‘masculinity’. I’ve spent more hours than I’m proud of in a gym attempting to change my tall and skinny physique into something that resembled the he-man action figures I was surrounded with as a child. I never bothered with picking up music, dance, poetry or art because, well, that wasn’t what men in my life did. Apart from when I got kicked in the shins once, or clobbered around the ear by my parents for some delinquent disobedience, I never properly cried until I was about 24 years old – and that was only at my grandmother’s funeral. I never admitted it at the time, but it felt good.

My parents loved me, fiercely, and so they brought up their boys in the way they were taught boys should be brought up – mostly the way my Dad was brought up. A focus on physical development instead of a focus on emotional development. That’s not to say we didn’t develop emotionally; it just happened as a secondary thing, and because of that, not particularly well. We learned how to ‘be men’, that definition becoming ever more fluid even as I write these words.

Masculinity was sold to me (not just by my parents, but by society) as empowering – feats of strength, control, command, & conquer. Because of this, I’m sure I made some decisions in my early teens and adulthood that ‘helped me get ahead’ in a career or life. After all, as a man, it was up to me to carve my own path, compete against the other, and most importantly: win. In many ways, I won. But in other ways, I lost, and I’m only realising that now.

What I’ve learned about inhabiting the ideas of masculinity as defined by the generation of my parents and earlier is that it’s limiting. Just as ideas of femininity are, at the same time, limiting women. Masculinity constrains options. Forget about art and music; focus on competition and physical strength. Constraining options in a world where we’re overwhelmed with them feels like a good thing. Focus. And, as we’re taught, focus is good, right?

Well, now, I’m not so sure. I can’t help but feel that, as a man, those limits are, well, rather problematic. Not just at an individual level but at a societal one. As I’ve come to learn about the role that gender has played in the way the world is run, what it boils down to becomes rather stark and worrying – it’s preventing all of us from living a complete human experience.

The Mood Meter shows the various levels of complex emotions that men can attune to, if we become more aware of ourselves and how we express these feelings.

If, as a man, I’m not supposed to cry or be scared, what options in the spectrum of complex human emotions can I use to deal with feeling sad or worried? I have seen paths that men take which mostly go to anger and violence (or depression and anxiety) all via frustration. The floodgates to sadness and fear are closed so the water gets channelled elsewhere until the dam inevitably overflows. Because we are unable to feel comfortable expressing the full range of human emotion for fear of ridicule from other men or partners who also have fixed ideas of masculinity, we need to suppress, shortcut, and replace with others. That’s not strength, it’s stupidity. It’s also really sad.

To let down our guard against ‘femininity’ lets in the opportunity to live a more complete human experience.

I suspect that gender has never really been a useful way to categorise humans to anyone except the men who continue to hold and perpetuate their (our) own power. And, yes, it is useful for that. Just take a look at the COP27 delegates.

Leaders pose for a group photo at COP27. Photograph: Mohammed Salem/Reuters. Original source: The Guardian

But, by continuing to perpetuate this system, we are in fact doing ourselves a much greater harm, and hurting others in the process. We, as men, are living lives that may feel powerful, but are, in fact, powerless.

We do not ‘have the balls’ to watch a ballet and tell our male friends that we may actually have enjoyed it and might go next time. We do not ‘have the balls’ to admit vulnerability, fear, or sadness to our male friends when we experience a miscarriage in our family, or financial hardship, or we’re worried about climate change, or our kids’ first day at school. As a ‘bloke’, I’ve feared being the one who doesn’t want to drink alcohol this afternoon, or watch the footy, because I’d rather go to the symphony. Instead, whether we like to believe it or not, we seek ways to relieve this built-up emotional labour. We externalise it upon the world through aggression, drug and alcohol abuse, domestic violence. Internally, we end up diagnosed or undiagnosed with depression, anxiety, loneliness… the list goes on. I’ve been extremely lucky that I’ve had male role models who erred toward the less violent ways. My experience is not everyone else’s.

The question I have, especially for other men, is… is it worth it? The media covers a lot about gender equality and why it’s important. Most of what I read is framed from the women’s perspective – why gender equality will help women. And yes, it will. But, that same ‘pro-women’ coverage provokes some (most?) men to take the immediate and reflexively defensive position: that giving women more means ‘taking away’ from men. The worst examples of coverage try the opposite – to position men as possible ‘saviours’ of women – but that’s not the right way to be thinking about it because it still makes us the heroes; the strong and invulnerable ones.

The other way to look at gender equality is through the selfish lens – what’s in it for me, or the children I may be bringing up? I’ve never met a father who has said, “I want to limit the options for my child.” Gender equality, and normalising ‘taboo’ emotions like grief, worry, and sadness in the workplace, at home, or at the pub, will make it better for men, too. It gives us the opportunity to live fuller and richer human lives, to explore places within ourselves and each other that, just a generation ago, were locked doors. By working on ourselves we also help women and non-binary humans, too. It’s a win, win, win situation.

Trying to undo almost 40 years of social conditioning and identity-building has not been easy, but it’s been one of the most fulfilling projects of my life. It requires a lot of self-reflection and understanding about what I truly enjoy, feel, and think. It also required interacting with other men and women who could show me a different path from the one I grew up on. As they say, you cannot be what you cannot see. I still have problems with depending on others but I’m working on that.

But, the further I go down the path of redefining what it means to be a man, the more I’m realising that gender limits us all. I’d love my nephews, nieces, and anyone that comes behind me in the human project to benefit from thinking human first; not in categories of ‘men’ and ‘women’. So, here I am, finding that I’m writing about it in the hope that this message in the bottle lands with just one other bloke who might be courageous enough to take his first step in the direction of living a more human experience. If we don’t do it for others, maybe doing it for ourselves is what’s needed to start the change?

October 2022

Hybrid working isn’t a middle ground

If we’ve learned anything over the past 24 months, it’s that if everyone is dialled into a call individually, calls work better. With good facilitation, things are more inclusive, equal, and fair. Great meetings with loads of vision and lateral thinking can happen over a video-conference. Mics don’t always need to be off, or on. Neither does video. There’s a time and a place for all of those remote meeting settings, so prescribing ‘a company rule for everyone’ doesn’t work.

You know what else doesn’t work? Two or more people dialling in from a shared webcam while the rest of the meeting participants dial in individually. No amount of training or self-control has been able to discipline the two (or more) co-located people away from engaging in a more fluid, richer conversation together, to the exclusion of those who have dialled in. Body language is rich, turn-taking is slicker, and the centre of gravity of an in-person conversation is so strong that it simply makes it much more difficult for someone else to participate when they aren’t in the same room.

So, where does that leave us? Well, if just one employee has to dial in, it leaves us having to support distributed working. There’s no middle ground. People need good AV equipment, good remote facilitation skills, an understanding of how turn-taking works in online video calls – they (and especially business leaders) need to know how to work in a distributed way. If anything, the idea of saying, ‘we’re hybrid’ sets up an office environment for failure not success. Things will only get harder until we reckon with the underlying question – what’s the office for, now?

What’s the office for?

The office used to be a place where managers would sit, attached to a factory, and make sure workers performed their jobs. But with knowledge work – work that simply requires a laptop and phone – work doesn’t happen in a factory anymore. So, what’s the office for now? Why do we think that ‘returning to work’ is synonymous with being at a particular place for a particular time?

Maybe the office becomes a meeting place? Maybe it’s a place for people who don’t have great working-from-home setups to get some distance from their home so they can work in an environment that’s more ergonomic and conducive to better focus for them.

Maybe it’s a place for people to have focussed collaboration space and work through gnarly problems together – problems that are novel, highly collaborative or where the multi-sensory component of the get-together is important to the outcome (like training). Or, maybe you think it’s still for leaders to ‘watch over’ their employees to make sure they’re still doing their job. But, if that’s the reason, then hybrid won’t work for you either – 100% in the office is probably more your jam because what that says to me is that you don’t yet trust people.

Some examples for re-thinking ‘the office’

Right now, what seems obvious is that to support distributed working well, but also leverage the benefits of a place that many of us can decide to use at the same time, the office could be set up differently; to enable individuals to sit next to one another without risk of background noise or mic crossover. This is easily achieved with some noise-cancelling software and a decent mic (call centres have been doing this for ages, by the way).

This idea provides a way for everyone to dial into remote meetings individually, regardless of location. It makes it inclusive for those who can’t make it in that day. Then, when everyone leaves the meeting, those who chose to work from the same place, say, ‘the office’, can still go to lunch together and enjoy the benefits of in-person time.

Perhaps pairing this idea with optimising the design of the space for larger collaborative group exercises – work that is experiential, novel, or highly collaborative – as things head back towards something that resembles normal, gives ‘the office’ a different but more useful purpose than trying to cram everyone back into individual desks, only to have them all wear headphones anyway because open-plan offices are terrible for concentration.

For some businesses, a communal space for employees still feels important – there are huge benefits to this, but it’s not an ‘office’ anymore. Words like ‘collaboration hub’, ‘meeting place’, ‘homebase’ feel a little more descriptive and true of how those ‘office spaces’ could be used now. No matter what anyone calls it, what it truly means is that there’s no such thing as hybrid, because as long as we choose to support one person dialling in, we all need to have the distributed working skills to make it work inclusively and fairly for that one person who couldn’t be in that day.

Sure, there will be times when full teams can work together, at the same place and at the same time. Supporting distributed working doesn’t mean giving that up. But if teams value inclusivity, having a shared space that’s sort of near where their employees live doesn’t mean they can skip investing in good tools, practices, and processes to support everyone – not just the few who live within a commutable distance of a common space we used to call the ‘office’.

‘Hybrid’ is a false hope

The examples I give aren’t exhaustive, but it worries me that leaders seem to think the decision to adopt a ‘hybrid model’ implies some middle ground – a slight relinquishment of the absolute power an employer used to have over their employees. But, as businesses try to grow out of a pandemic, it’s the employees who have the power now, and it’s up to businesses to adapt.

A hybrid model doesn’t mean less work, it means more. Even a partially distributed team means you need to understand, and nail, how distributed teams work together, properly.

Hybrid work as a middle ground implies that the two ends of the spectrum (all remote, or all in office) are somehow more difficult now. But, to be in the middle means you need an even more nuanced understanding of how work works, what offices are for, how people behave in environments you can’t control, a recognition of the blurry lines between work and life that have always been there but are now more apparent than ever, and that even more elusive value for companies – trust in your employees.

What’s emerging is that for knowledge businesses, there’s far less physical time and effort required in leaning into distributed working as the way forward, regardless of whether 40 people happen to want to work from your collaboration hub for a day or two a week.

Using design to adapt to a post-pandemic workplace

I’ve spent quite a lot of time over the pandemic years helping organisations that have adapted remarkably well to distributed ways of working and are working harder than ever to get better at it. They are experimenting daily, working across time zones and the country to figure out what good looks like for the people they employ – with all their neurodiversity and specific environmental needs. The results are happier employees and better-quality work for the business. All that’s preventing every knowledge business from doing the same is fear. Most of the time it’s fear of ‘losing control.’

So, if you’re a leader who’s curious about how your organisation could better leverage the benefits of distributed teams and the benefits of having ‘the office’, I’m happy to spend an hour or so listening and sharing what I know.

September 2022

The problem with being problem solvers

Design has, for a very long time, been in an identity crisis. The proliferation of job titles, its mixed history with art and artists, and the mystery that surrounds the non-linear, difficult-to-codify nature of the process means that we’ve all struggled to explain what we do to others; not just to someone from outside the industry, like my mum and dad, but to those within it.

Because of the difficulty associated with capturing what Design is, it feels safer to further abstract our explanations of our job until we’re left with phrases like, “Problem Solver”. Generic and understandable.

But the problem with the ‘problem solver’ label is that it makes obvious a bias that we’ve all been guilty of – a designer sees the world as a bunch of problems needing a solution, instead of a complex world that’s simply difficult to understand and predict.

If there’s a problem, I’ll solve it

Stuff annoys me all the time. I hate the way I need to consult the user manual of my air-conditioner unit every time I need to re-program it because it makes no intuitive sense to me at all. I hate the stack of dishes in the company kitchen that sit right in front of the sign that says “Please put your dishes away”. I hate Instagram and Twitter for holding my attention against my will. I hate that the world is broken – climate change, war, genocide etc – the list is endless.

And so the optimistic capitalist within me says, “Great, so many problems, let’s turn them into opportunities!” And so we do, we whack a “How might we” in front of each problem statement:

  • How might we allow people to program the air conditioner easily?
  • How might we ensure people keep the company kitchen tidy?
  • How might we get our attention back from Twitter and Instagram?

We follow this prescribed pattern and ‘ideate’ until we’ve reached the highest order of problems (the most complex ones): How might we fix climate change, stop war, prevent genocide?

Wait. Really?

Sure, the world is imperfect and, as we bumble our way through evolution, some problems will go away and others will take their place. There will, without a doubt, always be problems. Some will be simple ones and others will be more complex. Thank goodness we’ve got designers, nay, wait, Problem Solvers, to help us squash them as they emerge. Right?

The intent to solve vs the intent to intervene

If designers continue to inhabit the title of Problem Solver, what we end up creating is an identity and culture with a default intent to solve – to identify the problem, hone it, invent solutions to it, and take action. And sure, most of the time, the problem goes away. But, inevitably, another (or, more often than not, others) comes along and replaces it. So, which problems do we choose to solve? Which ones can be solved?

This action-oriented mindset – taking action and changing something in our environment – in combination with biasing towards ‘simple’ problems gives us feelings of progress & achievement. It feels really good to change something. We’ve proactively applied our intellect, which manifests in, hang on a minute… Candy Crush? The air-fryer? This cup printer?

Because simple problems are easier to ‘solve’, we seem to be focussing more and more on the inane optimisations of already wealthy, comfortable lives instead of using our incredible deductive and inventive capacities for something more important. Or, worse, we try to apply formulaic processes and methods that are successful in solving simple problems to complex ones and that’s where we run into trouble. But, hold that thought for a moment, let’s discuss medicine.

How the health sector ‘solves’ problems

The health sector has already recognised this and uses a different method of ‘problem-solving’. In health, there are no solutions, only interventions. There is a culture of understanding that drugs and therapies for humans aren’t ‘solutions’. There is an understanding that what may fix one thing for someone might do more harm to the individual, someone else, or a whole community. Because of this, we’ve developed the clinical trials system – a rigorous (not perfect) method for understanding how a health ‘solution’ may impact one or more human lives.

Clinical trials have various stages, from non-human to human, from small scale to large scale. The system tries its best to do things like double-blind testing to remove bias from the process so that the understanding of the intervention is as ‘true’ as it can be at any given time. Again, this isn’t a perfect system – some interventions cause the need for other interventions, and so on – but it’s the ‘safest’ one we’ve got right now. It’s an acknowledgement of the perpetual tweaking and change that is baked into the culture of improving healthcare. It’s the sort of process that’s robust enough to deliver the world a vaccine in a pandemic and save millions of lives.

The language we use shapes the culture we create.

This sort of process or mindset doesn’t exist in software culture. But what if it did? What if software culture had the process of a clinical trial – one that measured the holistic impact it had on humans and non-humans, at different scales, over time, before it was released en masse? What if it wasn’t just focussed on user acquisition and company growth? What if the way we thought about problems wasn’t scoped by what the shareholders are looking for next quarter? And why don’t we see protesting in the streets when a software platform like TikTok goes viral – scaling to billions of users in just a few weeks – but we seem to have a problem with a vaccine? What if software culture started to think of things not as ‘solutions’ to problems, but interventions to them?

The intent to intervene, not solve

Providing a solution implies an end to something – once a solution exists, the problem doesn’t. Often, the list of problems we started with is so long that after one is solved we just pick a problem off the old list and start solving that next.

But, if we start to think of ourselves not as problem solvers, but ‘interveners’, a few things happen (well, they happen in me, anyway):

  1. I start to sound a bit annoying and arrogant, and less like a ‘hero’. What gives me, a ‘professional’ designer, the right to intervene in anyone’s life in the first place? Who asked for my crappy opinion or ‘hunch’ on something? How do I know how to intervene with tools or services in the lives of people I don’t understand? Changing one word helps me see more clearly and returns me to the human-to-human relationship that exists between designer and user; to intervene in anyone’s life, we must understand them and their community, deeply, and also receive their permission to mess about with that, don’t we?
  2. It provides a level of responsibility for the unintended consequences of our interventions. When we’re building and releasing tools and services, we are the ones who decided how and why to intervene in someone’s (or a population’s) life. In any system where humans are involved, we are working with complex adaptive problems, no matter how small the change. Health already understands this. Tweak a human’s environment in one way and, sure enough, humans will use that tool or service in ways no one could imagine. By acknowledging to ourselves that what we’re doing is ‘intervening’, not solving, we’re admitting to ourselves that there will always be an effect (positive and negative) caused by our intervention. Because of this, I’m likely to be a little more careful in how I propose that we intervene and to what extent. Perhaps we’d start to change the scale of how we release our interventions, catching any harm earlier rather than after we’ve changed the brains and neurology of millions of people?
  3. There is no “done”. While it might feel good to address a problem by intervening in a person’s life in some way, the very nature of intervention is that nothing is done – in fact, by intervening to solve one problem, history shows we’re just creating more and different ones. In some sick way, the ‘solution’ mindset is keeping our industry alive and growing. We’re creating problems, not reducing them, so we need more ‘problem solvers’ to help solve them, right? Another unintended consequence I suppose? Who benefits from this mindset, then?

Quite simply, using intervention terminology over solution terminology keeps the mind open and the ears and eyes more aware of our actions, both short and long term. By recognising that what we’re doing is intervening, not solving, perhaps we’re more likely to adopt a listening-first culture; one that moves slower and fixes things, just like the health sector.

If we’re more aware of the systems and lives in which we’re trying to intervene (because let’s be clear, most of the time, no one asked us to do that except someone who sees a way to make a profit from a community), we may approach problems with more empathy, understanding, time and consideration for the lives which we impact.

If this all sounds a bit dramatic, let me illustrate the importance with a case study.

Case study: Mosquito nets in Africa to ‘solve’ malaria

A smart cookie wants to prevent death from mosquito-borne malaria in Africa. There’s a very cost-effective and easy solution to this – mosquito nets. And so, they are deployed to those who need them most. A truly simple life-saving device. Problem gone. Or is it? Because now people are using those nets for fishing. They are infused with insecticide and the holes are much smaller than regular fishing nets. This means not only are they poisoning their food supply, but they’re also destroying the ecosystem on which they depend by pulling in fish and biodiversity that ordinarily wouldn’t be caught by ‘regular’ fishing nets. They’re also being used for chicken coops, football goals, and wedding veils. The problem now is how to stop the problems we’ve created by setting out to solve a different one.

Could calling these nets an intervention have changed these outcomes? Might we have thought a little more broadly about the consequences of giving such a versatile tool to a human? Might we have rolled it out differently – first at a smaller scale to learn, more quickly, about the positive and the inevitable unintended consequences?

So, what’s Design for?

My instinct, when I hear the unintended consequences of the mosquito story, is to solve it. What if they did X instead? What if the process was more like Y? That could have easily been avoided if… and so on. It takes considerable effort for me to stop. And think. Yes, this is an obvious problem to solve, but I need to take more time to understand it. What’s the cause and effect here, in this community, for these people? How do systems of food security, biodiversity, health, and education intersect or overlap – that’s a lifetime’s work and the system will keep changing as soon as we begin to understand or interact with it.

The truth is, I, like many other designers I know, am chomping at the bit to change stuff. I also don’t know a single designer who’s actively set out to do damage to anyone. All we’re trying to do is help, we say. But, in the big scheme of things, our lives are short. And because our lives are short, we haven’t got a lot of time to make the fulfilling impact we’d like to make. Because of this, we bias toward action over consideration, toward solving rather than understanding. People pay us to solve. They don’t pay us to tell everyone to stop, re-consider, take a little longer, try something small and see. At the end of the day, we gotta eat just like everyone else, and if company X won’t ‘solve this problem’, company Y probably will. Won’t they?

I often wonder if designers could work together toward something bigger; something more… intergenerational. What if I spent my time understanding a system, and shared that understanding with another? Set someone else up for success? What if we watched and documented, together, across generations, over a much longer time horizon? What if designers helped to create a human organisational memory – a way to visualise the world and its complexity – the interconnectedness of all things? How might I intervene in my own life and community so that we can nudge us, the problem solvers, in a different direction?

What’s next in Design?

Should we have a clinical trials-like process for software products and services? Software products are fast becoming the primary tools and utilities of our time. Safety features are required in cars and other physical tools and services that intervene in our lives at scale every day; maybe it should be the same with software?

The downside of course is that regulation and systemic change of any kind takes a really long time. And, it’s often opposed vehemently until there are enough deaths or enough destruction for governments or other regulatory bodies to take action. We don’t have that sort of time. We’re also not measuring the non-death impacts of thoughtless or unconsidered software (i.e. think mental health at a micro level, democracy at a macro one).

Perhaps the simpler thing to do is to change our culture, one small behaviour change at a time. If language does truly shape culture, then the “How might we fix, solve, remove, address…” style of question – the ‘absolute’ and ‘finite’ terminology we’ve become accustomed to using, courtesy of ‘thought leaders’ like the Google Sprint and Lean Startup books – might be better phrased as, “How might we intervene in…”.

The curious thing about “How might we intervene?” is that it provokes a simpler and more important question for any problem we’re staring down – “Should we intervene here at all?” We may just find that doing nothing, in many cases, is the best intervention, and this might free up some space in our brains to discover what’s likely – that there are more important fish to air-fry after all?

March 2022

Let’s innovate!

I’ve been in three different innovation teams created by large organisations, in very different sectors, and they’ve all started and ended the same way.

They start with a dream – we want to use our vast capital and resources to ‘start a start-up’; to break free of the governance structures that slow down big organisations’ decision-making, and move quickly to improve profit, people, and planet. And they’ve all ended the same way – the market isn’t ready, or isn’t big enough, for our innovative thinking, and so the commercials don’t justify continued investment.

It’s a curious thing to witness. Three different teams. Three different organisations. Three very different types of tech. They only share one common characteristic – they all began with the technology and not the problem.

The problem with starting without a problem

Look, I get it, tech is exciting. Especially new tech. VR, AI, Web3 and blockchain – it’s all cutting-edge stuff and it’s stuff that companies should have their eye on if they want to take advantage of it or defend themselves against possible disruption. I am all-in for exploring the possibilities of new and emerging technology to see when, how, or if it could be used to benefit the strategic goals of the business. It’s just – in my experience – it doesn’t seem to happen that way.

In each of the three innovation teams I’ve worked within it’s been the same story (simplified to make a point):

  1. Identify new or emerging tech
  2. Deploy engineers and a ‘head of innovation’ to explore it
  3. As they explore, they imagine ways the business could benefit
  4. Keep exploring further
  5. Repeat steps 3 and 4 until the money runs out

In this model, there is always a feeling of progress because what this model does is accelerate learning. And, when we’re learning, we feel we’re making progress. We feel as though we’re moving closer to the big imaginary lightbulb above someone’s head. We’re not sure exactly where we’ll arrive, but we feel arrival is imminent, so we keep going, because we’ll know when we get there and ‘that’s what innovation is about’ (real quote, btw).

And, in many ways, I agree – these are all good things. I’m a big advocate of play for play’s sake; of exploring without a purpose for a while, to learn things that structured learning, with its inherent boundaries, may not teach. But the problem occurs when we’re constantly engaging in divergent thinking – wider and wider – without any sense of synthesis and reflection.

Alternative approaches to tech-led innovation

So, what to do? We want to enable play and exploration, but we also want it to unlock something for the organisation or the company, at some point or at various stages along the way. This is where innovation labs could benefit from one (or both) of the following:

  1. Hypothesis-led testing and validation (aka. The Hare – Move fast and break things)
  2. Problem first, solution second (aka. The Tortoise – Move slow and fix things)

Hypothesis-led innovation

The methods that describe scientific exploration can be easily adapted to corporate innovation. The idea that one can postulate an outcome before beginning to explore it gives some really wide boundaries for innovation teams to play within. In other words, set a goal post in the far distance, then play and explore until we reach it. Then reflect. It’s not complicated. It’s not rocket science, it’s just science.

This isn’t reinventing the wheel, it’s just using the one that has existed for many years in academia and scientific research. It’s the structure that enables play, rather than restricts it. It’s also straightforward:

  1. We believe that…
  2. To verify this, we will…
  3. And measure…
  4. We are right if…

So why is doing this well so difficult? What I’ve seen are four reasons:

  1. Not everyone is a scientist. Theories of change (of which hypothesis-led testing is one) fall over quickly because of people’s under-developed logic skills – things like circular reasoning, and a magical belief in actions or tactics driven by bias and assumptions.
  2. The change or ‘vision’ is multivariate. It’s easy to compare change when you tweak one thing against another, but tweaking two or three things at the same time muddies the experiment and opens the door to all sorts of fallacious reasoning.
  3. Ego gets in the way. Things like confirmation bias and ‘avoiding failure’ prevent humans from seeing things objectively. And, often, failure means missing KPIs or OKRs that lead to promotions and a higher sense of self-worth.
  4. There’s no peer review in corporate innovation. Peer review (i.e. inviting colleagues to critique, objectively, the methods and theories of the working group) isn’t how corporate innovation teams are set up. Crossing ‘silos’ to engage people with no context is difficult and, even if it’s possible, we end up with ‘inventor bias’ where the working team asks peers leading questions like, “You would like it if this was invented, wouldn’t you?”

Hypothesis-led innovation is a robust process – well, it’s the best we’ve got; the foundations of science are built upon it. It’s just that a commercially-oriented culture, where teams measure profit as an outcome (as opposed to an academic one, where the question is “Where do I get my next grant from?”), makes it much more difficult, but not impossible, to do well.

Research-led innovation

While hypothesis-led testing is an appropriate, robust, and ‘active’ way to make progress in innovation, there is an alternative and that’s being ‘research-led’. The difference is subtle, but absolute.

In hypothesis-led innovation, teams rally around a theory and get to work quickly, often playing with tech and outputs to move towards validation or invalidation of their hypothesis. In research-led innovation the ‘act of working’ starts from observation, not action.

Research-led innovation begins with generative research that is grounded in behaviour, not attitudes. It requires an agreement within the working team that observing is valuable, and a collective trust that it will lead to a positive outcome. It requires a deep sense of curiosity and open-mindedness about those outcomes, because it may be that the team doesn’t learn what they expect to learn but, quite critically, they always learn something. That something, even if it describes a path the team can’t follow, is valuable.

Research-led innovation requires a few things to do well:

  1. A definition of a space: this might be a type of customer or non-customer. It may be a particular activity, or a particular environment or domain. Putting a boundary around this is important for constraining the scope of observation – quite simply, it’s impossible to observe infinity.
  2. Excellent and diverse research skills: Experienced researchers with years of practice recording unbiased actions, conversations, and other human behavioural factors are critical. This sort of contextual inquiry goes beyond non-generative research, like usability testing, and lives in the realm of ‘recording the unconscious.’ There are very few people who do this well.
  3. An opportunity mindset: A thing that most money-spending organisations have trouble with because the ROI isn’t clear, and often isn’t for some time. Again, it requires an understanding that generative research *always* produces results, whether the funding team likes those results or not.

I’ve spent time sitting and watching people shop in supermarkets, drive trucks, operate in call centres, all without doing anything but watching and asking a few open-ended questions like, “I saw you just took that bread off the shelf, why that bread?”

The giant leap we all want to make comes from listening first, not acting.

Noticing the everyday is an art and a skill. It requires patience, curiosity and faith – all traits that you won’t see on a job ad for innovation teams looking for ‘fast-paced, exciting, entrepreneurial’ qualities in their people. And therein lies the crux of the problem – observation, as our First Nations people have practised for 60,000 years, is under-valued by the western, start-up mindset of acting before observing. Hence, we return to the comfort zone of hypothesis-led innovation that’s driven western science for hundreds of years.

The giant leap requires patience, not ‘action’

Innovation has a ‘brand’ – new, exciting, untethered, exploratory. Steve Jobs in all black; secret missions to unleash world-altering technology and systems on the world. But its brand is, in itself, its own problem. Because what innovation is really about is change. What innovation goes looking for is ‘the giant leap’ – something that’s truly game-changing for the industry or problem space within which we’re operating. It’s got confirmation bias baked in – if the leap isn’t giant, then it’s not innovation.

But, to make a giant leap, don’t we need to exert lots of energy?

And that’s where we’re going wrong because the way to get giant leaps is, in fact, counter-intuitive. What the giant leap needs is patience. To stop, observe, listen, understand, first. It’s not sexy, energetic, or exciting. But, it’s only after this perceived ‘passive’ activity that we can act. Precisely and swiftly. To go from A, directly to G.

Until we begin to take a listening-first approach, those giant leaps aren’t likely to come. Instead, we’ll either end up with small, incremental change (which isn’t a bad thing, but often not the goal of innovation labs), or the outcome I’ve experienced in three different teams: no change at all, and a growing distrust of the value of ‘innovation labs’ altogether.

January 2022

Software wants to disrupt everything but itself

Software has some powerful attributes, and they’re part of the reason I love working in the industry – just one person and a computer can build and share a tool with billions of people in a matter of moments. The optimistic view of this is that it’s enabling – software can have profound positive effects on the world, almost instantly. It can empower the disempowered, improve the quality of life for people who would otherwise not have that opportunity, and it can unlock access, equality, and justice.

Of course, it could do the opposite, too.

And so we find ourselves stuck in this game. Software companies, driven by their commitment to shareholders to generate profit, release the tools at scale that help them deliver on their shareholder commitments. They ‘push us forward’. They ‘help us progress’ as a culture. They ‘disrupt the status quo’ because the old way isn’t necessarily the good way – in fact, the assumption is that it most likely isn’t.

So the good software spreads. It creates opportunity as promised. It helps more people participate in the economy as promised. It drives up profits as promised. It unlocks value that, until now, was impossible to activate. It changes the culture through saturation and suddenly those affected can’t imagine a world without it.

As the culture changes, software can respond quickly. In fact, that’s the mantra – move fast and break things. Ship it. Measure the effect. Sense what’s needed next, then make that. The build–measure–learn loop that’s so prevalent in software development means that, in a matter of months, several iterations of the tool can be refined, honed and released to meet the needs and demands of the consumers using it. Often the first thing that was shipped morphs, through feedback, to solve a vastly different problem for a vastly different audience – but who’s keeping track of that when all that matters is the promise made to the shareholders? People will pay for this problem to go away, so now we’re doing that.

Have you ever seen a toddler learn to walk? That’s software. One step after another, learning as they go, gathering balance and momentum quickly. The destination is uncertain, in fact, in most cases, completely unknown, but the mantra is forward without falling. Just keep going. That is, until the parent pulls on the harness and re-directs that energy somewhere else – enter regulation.

Iterative regulation isn’t possible

Everyone knows that regulation, compared to software, is slow. But that’s because laws can’t work like software. In fact, it’s culturally agreed that they shouldn’t. Laws are deliberate, long-lasting, and need clear definitions and boundaries. It would be completely impractical to change the definition of Murder or Grand Theft Auto every few months. Good law-making requires deliberation, long-term analysis, science, political engagement. Issues of equity & justice are central to law-making which makes that deep consideration necessary – once a law is passed, people’s lives and choices change. There is no room for misinterpretation and so the language used in law does its best to articulate its meaning as clearly as possible.

But, like with most things, it’s never perfect. Ask two lawyers to interpret the same paragraph of law and both will come away with opposing viewpoints – the defendant’s and the prosecutor’s. The laws we write cannot be divorced from the cultural and political moments in which they’re written. And, similarly, their interpretations, many years later, are also influenced by the moment in which they’re interpreted.

Technological Social Responsibility over Corporate Governance

If it’s impossible for regulation to be iterative, and software, by its nature, is, then who is better placed than software companies themselves to weave some of the role of regulation into the process of designing, building and distributing software at scale? Maybe software companies could play a role in ensuring that issues of equity and justice, those things normally considered central to law-making, are considered alongside profits and innovation. Maybe the additional constraint will unlock even further innovation (as constraint often does) – not a world we know is possible, but one we can’t even imagine until we do it.

It’s the funny thing about software people – so quick to criticise the government, public policy, ‘slow-moving’ institutions. The culture is one of disruption; cause a ruckus then deal with the consequences a few years down the track when regulation inevitably (and often poorly) catches up. Uber, AirBnB, and now Buy Now Pay Later software are just a few examples of software’s inherent nature of disruption – the hare running ahead of the tortoise.

If I looked at regulation like a software person, I’d see an opportunity. Not an opportunity to ‘get around regulation’, as is the default thinking I’ve experienced in the industry, but a possibility of creating a fairer, more ethical, more just world more quickly. And, to be clear, I’m not advocating for corporate governance, but rather Technological Social Responsibility (TSR) – a conviction that those who can put tools in the hands of billions of people overnight not only should consider the longer-term socio-political implications, but must.

What needs to change in software?

Like addressing any deep cultural assumption – e.g. women shouldn’t vote, gay people can’t get married, segregation – the idea within software cultures that software companies could do a better job at uncovering and planning for longer-term detrimental effects of their software is, at the moment, a radical idea.

A drawing of The Overton Window
The Overton Window applies in software cultures, too.

Whenever I float the idea of TSR to software companies it’s met with immediate, knee-jerk responses like, “That’s the role of regulation!” Even though we all know, implicitly, that what that response really means is, “We want unfettered access to scale and impact because we see a business opportunity. And, if our hunch is right, we’ll be well and truly scaled and profitable before regulation has time to catch up. By that time, we’ll have a significant amount of power and money that will help us shape or defend ourselves against it when the time comes.” It seems that software companies want to disrupt everything but themselves.

But, this is the thing I don’t understand about software people: we are so good at thinking about complex domains and systems. Engineers are literally trained to think through risky scenarios and draw out all the things that could go wrong in their code. We think through edge cases and what-ifs all the time. The only difference is scope – we’re just not applying this incredible analytical and logical skill to the broader cultural implications of giving tools to millions of people in the space of days; instead, we’re checking it against the effect on revenue, monthly active users, retention, acquisition, and engagement.
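
To make the contrast concrete, here’s a minimal sketch (my illustration, in TypeScript – the essay itself contains no code) of the edge-case discipline engineers already apply reflexively inside a function. The withdraw example and its checks are entirely hypothetical.

    // A deliberately mundane example: withdrawing money from an account.
    // Engineers habitually enumerate everything that can go wrong *inside* the code.
    function withdraw(balanceCents: number, amountCents: number): number {
      if (!Number.isInteger(amountCents)) throw new Error("fractional cents not allowed");
      if (amountCents <= 0) throw new Error("amount must be positive");
      if (amountCents > balanceCents) throw new Error("insufficient funds");
      return balanceCents - amountCents;
    }

    // The same "what if?" discipline is rarely pointed outward:
    // what if this tool reaches ten million people in a week?
    // Who is excluded, harmed, or left behind?

The skill is identical; only the target differs.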

It’s not as if tools to help us think through these complex implications don’t exist. Regulation has a version of them already and whilst they aren’t perfect, and they probably take longer than we would like, they’re better than nothing. They also present an opportunity for incredible software thinkers to ‘disrupt’ that and design or uncover ways to make that process better.

And, as the world begins to realise the ecological impact of software at scale, new and interesting tools are emerging all the time to help us think more long-term, beyond tomorrow, at an intergenerational and global ecological level.

So, if it’s not for lack of access or availability to these tools, it’s something else. But, like with most things, the barriers to change are based on a deeply held, almost invisible assumption – that we can’t have social responsibility and profit, scale, & power. We also used to think that staying at strangers’ houses, getting a ride from someone you’ve never met, or transferring electronic money from person to person were things we couldn’t have – but here we are.

Reframing the idea of legacy software

Unlike landfill, the legacy that software leaves behind doesn’t visibly pile up (although there is the ecological cost of the raw materials that require us to live in this digital age). Software changes the very thing we cannot always see immediately and struggle to anticipate unless we put a concerted effort towards it – the relationships between things: humans and things, humans and humans, humans and animals, humans and our planet. When those relationships shift, it’s far easier to label it an abstract phrase like ‘the evolving cultural landscape’ and do what software and toolmakers have always done – create something new to deal with the problem that the first thing created. The cycle only speeds up – the more tools, the more unanticipated problems, the more tools we need to handle them.

Software leaves behind the things we struggle the most to measure – the relationships between us

Design, as a practice, seems uniquely placed to help engineers, product managers, and businesses visualise and understand the relationships between things. It’s something we take for granted – a skill that seems so obvious that we struggle to understand why others aren’t doing it (just like ‘listening to the customer’ was about 10 years ago). As Paul Rand so famously said – Design, in its truest form, is nothing but understanding relationships.

And so what if legacy software was thought about like a city or a public garden? Infrastructure with a 20-30 year impact – or even longer? What if it was normal for the planners & builders of software tools and systems to know that they’re working on something that they may not live to see others enjoy but do it anyway? Stuff that doesn’t just improve their lives or the lives of those with access to the internet, but for generations that don’t yet exist?

Sure, the technology that we build these systems on will change, evolve, and unlock new opportunities but perhaps the relationships that software creates between humans and the planet could persist or evolve with more intention?

Yes, there are positives and negatives to any tool that’s provided to the masses; any technology that’s created to solve one problem often creates another. But perhaps if we were more concerned with what we software makers leave behind in the long term, rather than the short-term thinking that’s so pervasive in software-building cultures, we’d start to move the needle on how software thinkers plan the way they disrupt or change the culture. To borrow a little thinking from Regulators, those slow-moving, curmudgeonly lawmakers, we may find ourselves iterating our way to a fairer, more equitable world and leaving fewer people behind as we go, even as the culture evolves.

November 2021

Faith in tech-knowledgey

I was raised a good Catholic boy. I went to Catholic schools (both primary and secondary), and I was an altar boy for a while there in my early teens. I attended church with my mum on most Sundays. Sang the hymns. Said the prayers. Then, one day, thought to myself – hey, I’m not sure I agree with these ideas.

Soon, I stopped going to church. I started to argue with my mum (a Catholic), and my grandmother and uncle (both Protestants) about all the ways in which their blind belief was simply illogical and irrational. I branded myself an atheist but, later on, decided agnosticism was the thing for me – I would subscribe to no belief.

But of course, there was science – good ol’ rational, logical, deductive science. We were unequivocally proving things with science. We’ve proved that we evolved from apes and not from a guy’s rib. That death meant there was no afterlife. That ‘sin’ was something created to help control an unruly population in the Middle Ages. Yeah! Science! This was it. We could do and know anything with science, eventually.

But here’s where even science goes wrong: it turns out that we can and will never know everything. And, because this is true, we need faith. There’s no such thing as atheism or agnosticism in its truest sense. We need to believe in something. Faith is inescapable.

In rejecting Catholicism I simply transferred my belief to another thing, science. And, the more time you spend with science (as I did with Christianity) the more you realise that it can’t do everything, either. It’s only really good for a certain type of knowledge – the stuff that’s observable and measurable by a third party. That’s not a bad thing, it’s just that it still has limits.

Some of us have faith in the free market. Others have faith in dictatorial leaders (on Earth or in Heaven). Some of us have faith in processes (like design) or systems (like democracy), and some of us put our faith in the people closest to us – family and friends. Some of us are beginning to idolise the billionaires because, as it turns out, some of us believe that technology will save us.

Is technology our saviour?

As I write this, the world is beginning to agree that we’re in trouble – if we don’t change the way we live our lives, our species (and a few others) simply won’t survive. And, as I write this, our Prime Minister is seeking to invest 500M dollars in his faith. Yes, he’s an unashamed Pentecostal Christian, but he doesn’t intend to give that money to the church. No. He wants to give it to Technology – technology not taxes.

But what does it mean to have faith in technology?

A few years ago I read something somewhere that stated we only refer to things as ‘technology’ if they were invented in our lifetime. Anything invented before we were born isn’t technology, it just is. Which raises the question: what is the ‘technology’ that our Prime Minister believes in so strongly?

I’ve been working in software for almost 20 years, building tools that help people solve problems of various sizes and scale. I’m pretty proud of the work I’ve done but I’ve also had first-hand experience in the unintended consequences that giving tools to humans always has – first we shape the tools then the tools shape us.

Historically, give or take a few disasters, one could argue that technology has had a net positive effect on the short-term quality of life of the human race. The wheel. The lightbulb. The bicycle. The aeroplane. The invention of democracy, capitalism, industrialism. These tools and systems have indeed raised the tide and lifted the boats with it. We’re living in less poverty, globally, than ever before.

But, how’s it really going? Where are we now? With the growing global agreement that things aren’t looking great, maybe those technologies and systems haven’t been thinking long-term enough. Could there have been another way? Might we still have time to course-correct if things haven’t gone quite to plan?

Maybe to have faith in ‘technology’ really means to have faith in human ingenuity and creativity? That, somehow, we’ll work it out. Our life depends on it. We’ll use science and collaboration. We’ll use the very systems that got us into this predicament to get us out of it. Capitalism and Industrialism, pointed at the exit sign, away from this impending catastrophe, might just be the thing.

But. Even as I write those sentences, something seems off-kilter.

If we keep digging the same hole, with the same shovel, we just dig a deeper hole.

If, over the course of human history, we’ve had faith in technology, and we’ve arrived here, at the brink of our demise, maybe it’s not the answer. Maybe we need to do something different. If we change nothing, nothing changes. Using the same shovel will only make the same hole ever deeper. It seems that what we need now is some sort of airlift, not a bigger shovel.

As a technologist – someone who, for 20 or so years, has witnessed both the profound positive and negative impacts of ‘technology’ (new tools and processes introduced to people’s lives), I have very little faith that doing the same thing – relying on technology – is the way out. Like with almost every problem I’ve built a tool to overcome – the problem is almost never a tooling problem, it’s a people one. And, most of the time, people problems have been solved with people solutions – empathy, listening, fairness, trust, understanding, care, and reciprocity. I don’t think the plan is to spend 500M bucks on those things.

We can’t have a conversation about Enough, either

In a world where many have a lot and many more have a little, it’s difficult to have a conversation about ‘what is enough’, either; the idea of mutuality over hierarchy. Scarcity (both real and manufactured) underpins our largest and most influential system – capitalism, and so to think about a world without scarcity, a world where things like universal basic income exists, feels near impossible until we’re forced to.

We cannot separate humanity from technology, but technology is just half of the equation and I don’t see anyone else thinking about the humanity bit

Many of us, especially those without enough, want more. And who can blame them? It’s a basic, human need for survival. Squirrel away the spoils so we’ll be safe if the disaster happens. But, to get more – to accumulate, save, and store – we need to borrow from the now. It seems that we haven’t historically been very good at borrowing from the now, but we’ve mastered taking from the now. Borrowing requires a paying back. But we’re all told that debt is fine and we should only pay when the debt collector starts knocking. That’s how it’s done.

But we are eating our own gingerbread house. And what we don’t eat gets destroyed by the acid rain that’s beginning to fall. It’s the scrapping for our very survival that seems to be leading to our downfall – like a spider trying to climb the side of the toilet bowl only to slip and fall back in again. The climbing gets more difficult the more tired we become. Meanwhile, the tide is rising. Literally.

So what to believe in, now?

I don’t have solutions, but a recognition and understanding of our place in things – the work of scientists like Carl Sagan, poets like Mary Oliver, religious teachings like Buddhism – seems to be part of an answer. Our Indigenous Australians have been asking, and continue to ask, for a voice to power. But who’s listening? The religious despise the scientists and vice versa. The economists despise the poets and vice versa. The politicians represent their individual preferences, not their people. Everyone is right, and no one is.

I used to have faith that a common enemy would bind us. The scenario of the interplanetary adversary would mean that humans would finally see ourselves as a unit; a common whole. But now, through the course of one of those moments – a global pandemic where each and every human became the vulnerable one – I’ve seen how deeply our beliefs are held and amplified against one another. It seems that no matter the challenge, we will always find a way for it to be us versus us. And, whilst we’re busy arguing with each other, the world around us will disintegrate. Not in a cataclysmic, overnight sense. It’s far more likely to be slow, banal, and boring.

And yet, despite all of the logic and rationality that I can use to analyse the system we’ve built for ourselves that the science tells us will surely spell our demise – I find myself, still, with hope. The science will probably tell me that my brain is wired that way – it’s some sort of biological survival mechanism because the opposite of hope is simply too detrimental to my health. Religion would tell me that all things come to an end or that there’s an afterlife anyway, so there’s nothing to worry about. Capitalism would tell me that the market will work it out for us.

I don’t know what to believe in anymore, but I’m pretty sure technology isn’t it.

October 2021

The deepening of intergenerational digital and social exclusion

Almost every 30-something person I know has a similar story – a moment where their parents or grandparents have tried to achieve something ‘simple’ online (renew a licence, download a government app, order a taxi) only to have failed miserably, leaving everyone, especially that parent or grandparent, incredibly frustrated. From this point, it’s not a difficult path to statements like, “I’m a tech luddite” or “I’m terrible at technology”. From there, the easiest path is the one of least resistance – to opt out of technology.

It’s not technology’s fault, it’s ours

I don’t know a single software designer who, at some point in their career, hasn’t done the following:

A new feature is being designed. It’s, in the scheme of the project, a relatively minor one for an application that already exists. The risk of getting it wrong *feels* low, and there’s a huge backlog of work coming up on the horizon that the team is panicking about. It would take 5 days of work to design, test, and re-design this particular feature with the users of the application to make sure we get it right. It would take 2 hours to do a quick scan of how the world’s biggest software companies (Google, Facebook, Instagram, Netflix) solve the problem, and replicate that insofar as it makes sense to do so. Sure enough, we pick the second option – just this once.

The decision to trade-off time-to-release with usability feels, in the moment, like a pretty low impact one. We use everything in our designer-y brain to justify that decision:

  • The team is under a lot of time pressure
  • It’s just a small feature
  • Chances are, our users are people who also use Google, Facebook, Netflix etc, so we’ll leverage that familiarity to de-risk it
  • Our small team doesn’t have the budget to test everything but we know Google, Facebook, Netflix etc do heaps of testing, so we can trust that by proxy

And sure, on paper, this seems pretty reasonable. Maybe even off paper: when the feature is shipped, the customer service team isn’t inundated with support requests, and so, maybe it worked? Maybe if it worked once, it can work again – the next time those factors of time pressure, feature size, and risk of getting something wrong line up? What if we did it again, and again, and again? Each designer, in each different team. What happens in a year, or two years, or three years down the track?

The slow proliferation of unusability (and language)

A knowledge of how anything works is cumulative. If the first strawberry we ever taste is sweet, we’ll assume all strawberries are sweet until we taste one that isn’t. After that, we realise that some strawberries are sweet, and some are not. Once we get on a train of knowledge, we build on it slowly over time. The same goes for understanding how to interact with digital interfaces.

Example: Buttons

Buttons, in the real world, look something like this:

Two close-up photos of car radio buttons, one vintage and one modern
Top: The original radio buttons via UX Planet | Bottom: A more subtle and ‘modern’ take on buttons via 7428

The physical characteristics of a push button can be described like this:

  1. They protrude from the surface.
  2. Some are concave or convex to communicate, albeit subtly, that your finger belongs there – that you are required to push.

Buttons were inherently physical objects. So, when digital interfaces came along, designers used the familiarity with real-world objects to teach people the function of these graphical elements. They started off looking a bit like this:

Digital buttons that have convex and concave shapes so they look like buttons that can be pressed even though they're digital
Skeuomorphic buttons via Jon Kantner

Then, over time, the “Medium” of digital evolved. Partly through fashion and a need to differentiate in the market, partly through a requirement to ship more quickly, skeuomorphism started to seem ‘dated’, and big companies like Apple and Google (the ones we rely on as a proxy for ‘good, tested design’) decided we would enter an era of an increasingly minimal aesthetic, which culminated in flat design.

Soon enough, led by the large companies, ‘buttons’ started to look like this:

An example of two 'buttons' that are actually flat rounded shapes
The evolution of buttons took on a non-skeuomorphic look which no longer gives clear affordance via Dribbble

And of course, our language didn’t change – we kept calling them ‘buttons’, but their physical characteristics – their affordances – slowly evolved. And, unless you evolved with them, these squircles and circles above don’t look anything like a ‘button’ anymore.

The problem is, we designers are evolving with – and making up – our own affordances, forgetting that ‘regular’ people are not.

When I tell my dad to press ‘the upload button’, he doesn’t see it as a button. It’s just a blue shape with rounded edges or a blue circle with an up arrow in it. I can’t use the word ‘button’ when I’m coaching him over the phone through an interface he doesn’t understand because, well, these objects look nothing like actual buttons that still exist in the real world. And I haven’t even described the issue we’ve got with concepts like “Upload” and “The Cloud” here.
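
To make the affordance point concrete, here’s a minimal sketch (my illustration, in TypeScript – not from the original piece) of why ‘looks like a button’ and ‘behaves like a button’ drift apart. The class name and upload handler are hypothetical.

    // Semantic approach: a real <button> is announced as a button by
    // assistive technology, is keyboard-focusable, and responds to Enter/Space.
    const realButton = document.createElement("button");
    realButton.textContent = "Upload";
    realButton.addEventListener("click", () => startUpload());

    // Common flat-design approach: a styled <div>. It can be made to *look*
    // identical, but it carries none of the built-in affordances above – no
    // role, no focus, no key handling – unless we add each one back by hand.
    const fakeButton = document.createElement("div");
    fakeButton.className = "rounded-blue-shape"; // hypothetical class name
    fakeButton.textContent = "Upload";
    fakeButton.addEventListener("click", () => startUpload());

    document.body.append(realButton, fakeButton);

    // Hypothetical upload handler, just to make the sketch self-contained.
    function startUpload(): void {
      console.log("uploading...");
    }

If even the machine can’t tell our blue shape is a button without extra work, it’s no surprise my dad can’t either.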

Our visual language has evolved, our language hasn’t

The evolution of the visual signals we’re using to denote functionality in the digital world isn’t just constrained to buttons. It’s happening everywhere.

Take ‘tabs’ for example – the ones we use in internet browsers. The word ‘tabs’ comes from the appropriation of the physical tabs we use to separate sections of a document, like this:

A photo of dividing tabs in a physical folder
The ‘original’ tabs haven’t changed their function over many generations. Image via Printwise

So, in the early days of UI design, designers quite rightly took that metaphor, guided by the affordances its shape gave us, and made the early internet browser:

A screenshot of notched tabs in an early desktop internet browser
The same notched shape of ‘tabs’ ported over to the digital world with ease, and persists today. Image via Printwise

This metaphor has persisted surprisingly skeuomorphically over the years. In most desktop internet browsers, the use of tabs is still the ‘norm’, and we still say things like, “I have too many tabs open.”

What’s interesting though, is what we see on mobile. This screenshot is of the latest Chrome browser (yes, another Google product, the ones we rely on for direction).

I can see the call-to-action (which also looks nothing like a button), “New Tab”, in the top-left corner of the screen. But I don’t see anything here that looks anything like a tab. Maybe they’re windows? Squares? Perhaps they’re buttons, now?

This would be fine if you’ve used a desktop internet browser before, but what if you haven’t? What if you’re in the increasing number of people on the planet who have only ever used a mobile device? What does “tab” mean to you now?

Once you start looking for it, it’s everywhere. And, you might think, “what’s the big deal? People will learn it once they use it and then they can evolve with it, just like the rest of us?” Well, to them I say this:

Imagine if you felt like a banana and you asked the grocer to give you one. Instead, they gave you an orange. And then you said, “that’s not a banana, I want a banana”. And then they gave you an apple, instead. In today’s world, we’d give the grocer a 1-star review and move on. Eventually, they’d go out of business and we’d say, “Well, that’s good, they didn’t know what they were talking about, anyway.”

This isn’t about anti-evolution, it’s about exclusion.

I’m not saying our craft shouldn’t evolve; that we should continue to replicate the physical world or that we should be limited by what physical manufacturing is capable of in any particular decade. But, what we’ve done, slowly and steadily, drip-by-drip, is made it very difficult for anyone who isn’t us to use and access services that are vital for their wellbeing and existence in society.

UI isn’t the only problem with software – it’s also how we make it and model it.

Here are just a few datapoints from Digital Inclusion Index Australia:

  • 1.25 million Australian households without internet access at home in 2016-17 (14%)
  • 79.1% of people educated to year 12 or below use the internet, compared with 96.7% of people with a tertiary qualification
  • More than two thirds of people who are homeless had difficulty paying their mobile phone bill in the last 12 months

This means it’s also about how performant we can make our services, so that those with limited bandwidth have access – a small sketch of what that can look like follows the data point below. It goes beyond ‘AA accessibility’ tick-boxing, because not recognising a button as a button, or a tab as a tab, isn’t in the WCAG guidelines – it’s about human-ness.

More than 1.3 million Australians with disability did not access the internet in the last 3 months. One quarter of these people report lack of confidence and knowledge as a reason for not accessing the internet.
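
On the performance point above, here’s a minimal sketch (my illustration, in TypeScript) of treating limited bandwidth as an inclusion concern. The Network Information API probed here is real but non-standard and not supported in every browser, so the code feature-detects and falls back gracefully; the ‘lite’ class is a hypothetical hook for a lighter experience.

    // Shape of the non-standard Network Information API we probe for.
    type ConnectionInfo = { saveData?: boolean; effectiveType?: string };

    function shouldServeLightweightExperience(): boolean {
      const conn = (navigator as Navigator & { connection?: ConnectionInfo }).connection;
      if (!conn) return false; // API unavailable: keep the default experience
      const slow = conn.effectiveType === "slow-2g" || conn.effectiveType === "2g";
      return Boolean(conn.saveData) || slow;
    }

    if (shouldServeLightweightExperience()) {
      // e.g. skip autoplaying video, serve smaller images, defer non-essential scripts
      document.body.classList.add("lite");
    }

None of this is hard; it just requires deciding that the person on a 2G connection matters as much as the one on fibre.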

It won’t happen to me

It’s easy to think that because we work in digital and we’re evolving with the patterns, we’ll be OK. That once we’re on the train, we’ll stay there. It’s our job, after all, to understand what’s emerging in the evolution of our visual language online – how we implicitly communicate form and function to our users as well as how things actually work. I’ve often remarked to people that working in software is my insurance policy against being excluded as I age. But, now at 37, I can already feel it happening.

It happens slowly, and that’s the problem – before you know it, you’re alone.

I have no idea what tools ‘the kids’ are using these days (Gen Z and younger). I mean, I know TikTok exists (our next big ‘global app that everyone uses’), but I don’t use it. But, because ‘everyone’ does, TikTok’s designers have incredible influence and power in establishing interaction design patterns that users will learn and future products will leverage as they evolve their own language. If I miss the TikTok train, how much further do I slip behind? As the saying goes, first we make the tools, then the tools make us.

It turns out that Gen Y and X are the first two generations in the history of humanity using decidedly different tools to those of the generations before. It used to be that our parents and grandparents taught us how to do important things in life; now it’s the reverse – we spend hours on the phone helping them link their COVID-19 Vaccination Certificate to their ServicesNSW App so they can access important services that are vital to their ongoing wellbeing and inclusion in the community.

We have to fix this before we get there

It’s easy to criticise, but maybe that’s what’s needed here. As a set of professionals who have immense power to shape the way humans and computers (and humans and humans) interact with one another, by all accounts, we’re doing a pretty shit job. Those tiny decisions we make every day have cumulative effects that are creating a more divided and unequal society by the day.

We could keep going this way. We could slowly but surely be the ones who, towards the end of their lives, can no longer access important services or interact with businesses like cafes, cinemas, and restaurants because we can’t figure out how to order food or movies anymore. Maybe we’re OK with that? But also, maybe we’re not.

What needs to change? It’s easy to be overwhelmed by everything; to think that one person can’t make a difference. But if nothing different happens, then nothing will change, so here are a few simple suggestions that I’ve applied to my own life. I offer them to others:

Prioritise research and testing in our day-to-day work.

Maybe it’s not OK anymore to ship that small innocuous feature without doing a little bit of testing on it first. Maybe it’s no longer OK to rely on what Google, Netflix, Facebook, or TikTok are doing to achieve a similar function to the one you’re trying to design. It’s easier than ever to get access to millions of people, around the world, to confirm or confront your internal biases in the design solutions you put forward within your team.

Recognise that there is no ‘everyone’.

“Everyone” does not use Google. “Everyone” does not use TikTok. “Everyone” does not use Netflix. We can’t assume that just because Google designs something a particular way, we can lift it and the users we design for will automatically ‘get it’. I’m not saying we can’t use familiarity at all – we should just make sure it works for who we’re designing for in our day-to-day as well. Maybe your users don’t use Google, TikTok, or Netflix.

Call out who we’re excluding.

Because every business is different, every ‘market’ is different. It’s easy to say, “Our target market is 35-44-year-old women with children.” That gives focus to the team, sure. But also calling out who gets left out in this scenario at least surfaces who you’re leaving behind. Are we talking inner-city women with children who have access to good bandwidth and widespread mobile coverage? Yes? No? Being clear on the boundaries of who you’re designing for means you’ll have clear boundaries of who you’re excluding – then we can ask the next most important question, Is that OK? Who might we harm?

Don’t blame parents (grandparents or carers), blame designers.

Our parents and grandparents taught us how to interact with the world so, maybe now, it’s time for us to help them do the same until Design is fixed. Recognising that it’s not the older generations’ lack of ability, or their fault, that ‘they don’t get technology’ is the first step. It’s not that at all; it’s just that for 2+ generations we’ve left them out of the conversation about how they understand the world and how they want us to shape it for them. Maybe it’s time we start listening. For their sake, and for ours – because it won’t be long until we’re them, calling our loved ones to ask how on Earth we can watch a movie these days.

It’s not too late

I have faith that we can turn this around. The designers I know don’t intentionally set out to do the wrong thing but, as E.F. Schumacher states so profoundly in his 1973 book, “Small is Beautiful: Economics as if People Mattered”:

Unintentional Neocolonialism results from the mere drift of things, supported by the best intentions.

And so, we have to stop this drift. This isn’t just about computers, the internet or even technology. It is about using technology as a channel to improve skills, to enhance quality of life, to drive education and to promote economic wellbeing across all elements of society. To be digitally inclusive is really about social inclusion. The opposite is true, too.

September 2021

The misleading connection between art and design

A variation of this article was first posted on cogent.co on May 1, 2019.

There is very little connection between art and human-centred design. I’ve spent my life trying to prove to people that having good visual communication skills and deep critical thinking aren’t mutually exclusive (in fact, if anything they reinforce one another). And, even though I’ve been trying to convince people of this for almost 20 years, I feel like I haven’t made much progress. On the internet, I still have to separate my career as a picture book author/illustrator from my ‘professional design career’. If I don’t, I’m pigeon-holed as a ‘visual’ person or, even worse, ‘a creative’ – that word that people associate with ‘magic.’

Designers don’t make magic

I’ll be honest, being ‘a creative’ sounds pretty cool. I used to think I was one too. It’s sort of arty. It implies a resistance against the status quo. It’s an easy way to stand out from the crowd and make oneself feel pretty special. Like you’re one of the chosen ones with a natural talent for making things look great and breaking conventions while doing it. More importantly, calling oneself ‘a creative’ bestows an enigmatic quality. It’s really hard for people who aren’t ‘creatives’ to understand what we do or how we do it. This “Wizard of Oz” effect makes us feel great.

But, the reality is that designers aren’t special. We’re not enigmas. Our beards, or flannel shirts, or dark-rimmed glasses and monochrome wardrobe might convince people who don’t know otherwise but largely, that’s been unhelpful. Designers, well, good ones anyway, use logic, deduction, collaboration, and scientific methods to produce great design work. That’s it. It may feel like magic when we get it right, but that’s only because critical thinking is a designer’s superpower and not everyone has that.

How did design and art get so confused?

To understand this, we need to have a quick look at history and how design, as a practice, evolved. Before computers there were these people called ‘commercial artists’. They typically had fine arts educations and so you (and anyone else) could call them artists, but even that’s a stretch. They used art materials like pastels, watercolour, charcoal, oil and acrylic paint, and they went to art school but even then, they didn’t create “art”. But, for argument’s sake, let’s consider them the closest thing to artists that designers have ever come. Anyway, companies would pay them to create images (not art) to advertise their businesses.

Now, we fast forward through time. Technology, as it always does and always will, changed the way commercial artists did their work. They went a couple of ways. Some picked up a camera and became photographers as cameras became a cheap and easy way to make images. They walked around and shot images that were used in company ads. Technically, it was still image-making so it made sense. They’d send their photos off to the ‘art department’ where they would be included with a bunch of other things to make the ‘final artwork’. The other half said, “Hey, the photography thing is pretty boring, realism is overrated, I actually like composing the thing that goes to print. I like things like fonts, and colour, and layout and shape and yeah, sure, I’ll include a photo or two as well when it gets supplied”. Sound familiar? Well, that thing became Graphic Design.

And so this discipline of Graphic Design continued to evolve and no one bothered to change the “Art Department” to anything else even though, by now, everyone is definitely not creating art. It started off using manual techniques like cutting paper into various shapes and positioning things on a page to compose the final piece (see Paul Rand). And then computers came along. It wasn’t long before a graphic designer didn’t have to draw all those letters anymore, they had things like digital typefaces (thanks Emigre) to play with and sometimes destroy (like David Carson did). And then, a bit after that, design software evolved such that they no longer had to cut pieces of paper and arrange them manually on a sheet of paper at all, the whole thing was produced in a computer. Still, no one bothered to change what they called the final product. It was easy to keep calling it “Artwork”, “Final Art” etc. As far as designers were concerned, they were still ‘making art’.

And then, the internet came.

Remember when the internet arrived? Those glorious days when web “pages” started to be a thing? Well, it was really new and no one knew what the hell they were doing but graphic designers were probably best-positioned to start trying to ‘help make the internet a more attractive place’. And largely, they did.

Early internet years meant the skills that graphic designers acquired in their careers were largely transferable to this 2D digital environment. Sure, there were button clicks and ‘interactivity’ to deal with but the general theory of visual perception still held true. Visual hierarchy, contrast and the general principles of Gestalt theory of perception served us well.

But, to make software easy for graphic designers to understand, companies like Adobe still used “art” metaphors like slicing and cropping and so on. The output of the process was still called “Artwork”. That’s right, artwork, 60 years on. Graphic designers spent years making ‘online brochures’ and slight variations thereof and here we were calling it “Art”.

The limits of my language are the limits of my world – Wittgenstein

Now, most recently, the internet has shifted from being a bunch of brochures to become pretty damn useful. We’re using it for way more than ‘brochure’ sites. We’re building tools and services. People need these services to function in society now. Banking is just one example. But, here’s where the real problem lies. Because of the events I’ve just described, graphic designers got their hands on the internet first. Because they were there first, they evolved “with” the internet. They learned, through trial and error, what it meant to specify clear interaction paradigms that they thought people knew how to use. I was one of those people. And you know what? We’re still here.

But, if we were to be honest with ourselves for a minute, most of us know we’re not artists now; at least we’ve come that far. We call ourselves “Interaction Designers” or “UXers” or whatever term we keep making up to tell ourselves we’re more interesting, serious, and you know, people should pay us more.

But a job title isn’t enough. No matter what designers end up calling themselves, there is still an intrinsic link in the non-designer’s mind between art and design. Labels are great, but they aren’t nuanced enough. By helping non-designers understand what we actually do, day-to-day, I hope I can continue my somewhat hopeless cause – to separate art and design.

What does a designer actually do?

The problem with being part of a group with good visual-communication skills is that, often, we can struggle with words. We have the rich visual vocabulary and, sometimes, don’t spend enough time writing down and explaining all that we do – the stuff that’s intuitive to us is often the most difficult to explain. So, here, I offer a model for how I think about Design.

Research

Research is an important part of the design process and underpins the quality of decision-making throughout the life of the business and product. The activities that a designer might engage in during this phase are:

  • Identifying questions that the team needs to answer
  • Categorising the demographics of users who will likely use your product
  • Constructing and planning activities like interviews, focus groups & product testing to answer key questions and solve key business problems
  • Analysing and synthesising the outcomes of user and business research
  • Presenting findings back to the team and various business stakeholders clearly and concisely
  • Providing recommendations and guidance on what, when and how to proceed with the business and/or product.

To do effective research, a designer needs to pick the right activities to elicit answers in the deepest and most truthful way. They need to make sure that those activities are completed without subconsciously influencing how a user responds. It’s a very specialised skill — so much so that some designers have been trained in scientific disciplines like Behavioural Psychology and Clinical Research Methods to do this step effectively.

Interaction Design

Interaction Design (IxD) is the process of inventing and describing how users will interact with your product. It’s sometimes called “Human Factors”. The decisions and recommendations made in this phase come from combining two things:

  1. A deep understanding of your users’ needs, business goals and technical environment. This includes:
    • Demographics of users. E.g. Age, gender, social status
    • Environmental factors: Are they on the train, in the car at their desk? Do they have good internet connectivity?
    • Technology: Are they using a mobile phone? Tablet? Desktop? Both? Do they use Android, Mac, Windows, iOS?
    • Situational context: What time of day are they using your product? What did they do just before or after using it?
  2. A broad knowledge of all the ways in which current technology can be used to help your users and business achieve their goals. This includes knowledge of different platforms (iOS/Android, Windows/Mac), different interface elements (buttons, forms, fields, wizards, pages etc.), and different technologies as well the appropriate time and place where these things can be used to achieve a desired outcome.

Designers will typically communicate their thoughts on how they plan for users to interact with your product using a mixture of online and offline tools. Sketching screens with pen and paper can be used to communicate and iterate ideas quickly. Testing ideas with users can be done with sketches, or to simulate a more ‘finished’ product, a designer might create an online interactive prototype of your product before it gets built.

Visual Design

Visual Design (or User Interface (UI) design) is when a designer works with the elements of Graphic Design to produce a representation of the final thing. It’s about what the user sees.

It draws on traditional graphic design principles like colour, shape, typography and layout that work together to produce a screen or series of screens that help the user achieve their goals.

Visual design can help or hinder how easy your product is to use. Things that you want a user to tap or touch, click or type into, need to look a certain way so that users understand how to use them intuitively.

The visual design also plays a strong role in creating emotion in the user. How a user feels can be critical to how they interact and react to your product and the impression that is left on them by your business. For example, if something is hard to use, they might perceive your business is difficult to deal with too.

What else does a designer do?

All three specialties that currently comprise the design process are equally important to the success of a product and ultimately, the success of the business. You could do wonderful research, and create slick interactions, but without a well-considered visual component, the user might be left feeling uninspired or confused. Conversely, a gorgeous interface full of beautiful images and a well-executed logo won’t necessarily help a user achieve their goals. Often without research, the interface may have the wrong content or navigation.

Every project is different and so the depth and attention that a product needs to achieve across all three areas will vary. However, the most important thing is, at minimum, to make sure all three are considered in a lightweight way rather than drop one or two of them completely.

Important ‘soft skills’ of designers

Also, there’s a bunch of other things – they’re called ‘soft skills’ – that make a designer a great one. Here they are:

Empathy, or the ability to understand and share the feelings of others is critical. Without this, we’re unable to understand how painful or joyful something is for someone else. Empathy allows us to design the most positive interaction with a product or a business.

Communication is a no-brainer and, whilst not specific to a designer, it’s what a designer does every single day. They need to communicate with users while doing research, with the team while building software, and with anyone who has an interest in the product and needs ideas conveyed clearly and concisely.

Active listening is part and parcel of being a good communicator. Asking the right questions at the right time can only come from truly concentrating, understanding and responding to others. It’s much harder to do well than you might think.

Self-awareness. A designer needs to know their own strengths and weaknesses, biases and preferences. Only by knowing these well can they perform effective and truthful research and devise solutions that solve problems in the way users need them to be solved. Crucially, this is often different to the way the designer or others in the team would personally like them to be solved.

Problem-solving is an obvious skill for a designer to have but nonetheless, can be difficult to hone. Yes, there are tools and techniques to learn how to problem-solve more effectively and efficiently but the motivation to solve it *well* is something a little harder to find. On top of this, designers are pragmatic and they use exceptional critical thinking. Nothing is perfect, but it doesn’t mean we can’t aim to be.

Imagination is the engine we use for coming up with new and innovative solutions to problems. The ability to create something from scratch that never existed before is unique and, we’ll be honest, a bit magical. Our designers are innately curious folk. They’re always reading, learning, watching and asking why. It’s this natural inquisitiveness that we reckon gives our designers their great imaginations.

Lateral thinking is tightly coupled with imagination. The ability to view a problem from multiple angles, sometimes unusual ones, is what we think lays the foundation for a great creative thinker. Often, it’s the ability to borrow from different contexts and one’s own life experiences that strengthen this in a person. Whether you have experience or not, involving other humans will always produce more ‘lateral’ results.

Story-telling is innately human. It goes to the core of what we are as a species — but to tell a good one requires practice. Designers can spin a good yarn and it’s important. Not everyone in the team will get the chance to talk to users and so it’s up to designers to convey what they hear and learn from users in a way that’s compelling. Designers need to help the entire team build the same level of empathy for their product’s users so that everyone knows the problems they’re trying to solve, and why it’s important to solve them.

Humility. Let’s face it, no one knows everything. Designers are intimately familiar with the design process and the methods and tools they use to do great work but, at the end of the day, they’re human too. They make mistakes, get tired, under-sleep and over-eat too. They might misread a user’s expression, or over-emphasise things occasionally. But they’re also lifelong learners. They use the power of the team to reduce the risk of getting things wrong. After all, great products aren’t built by just one person, and a designer is always part of a team.

Where to from here?

Maybe some people will use this as a hiring guide. Maybe designers will use it to improve the way they talk about the value they add to a team. Maybe, and, most likely, I’ll continue to separate the way I talk about my skills online for fear of being under-valued. Or maybe now, I won’t.

September 2021

Creating a culture of inclusivity in a society of pervasive masculinity

This piece was written for and first published on cogent.co on Oct 17, 2018.

Maintaining a culture of inclusivity in the workplace can’t be a top-down approach. It starts with how we talk to each other, every. single. day.

Not long ago, our Principal Product Manager became our newest GM. It was an incredibly well-deserved appointment. And, like everyone at Cogent, they are smart, creative, goal-driven and, above all, empathetic. These are ALL the qualities one wants in a General Manager and I feel quite lucky to work alongside them.

Like most people do when they assume a new role in an organisation, they updated their LinkedIn profile. And, as anyone familiar with LinkedIn will know, it likes to make a big song and dance about people moving jobs. It notifies everyone in that person’s network and prompts those people to adorn them with congratulations.

And no, this isn’t a story about how LinkedIn’s weird automatic functions went a little haywire and did something stupid like celebrate a demotion. No, it was what appeared in the human responses to their announcement that I found curious.

“You smashed it!”
“You’re killing it!”
“Kicking goals!”

It was, for the most part, quite violent.

This got me thinking. Where did this language come from? And is it even appropriate?

If you knew our GM, you wouldn’t associate any of those words with their character. They don’t smash anything (except maybe avocado before it goes on toast). I’m not sure they’ve ever killed anything either, if they have, they haven’t been caught. And kicked goals? Sport? Ha.

Maybe it was LinkedIn somehow promoting this strong language. But, I checked it out, and it seems unlikely. All LinkedIn does is prompt us to “Say Congrats”. Pretty benign and uninventive, yes, but not violent in any way.

There didn’t seem to be any difference between men and women in their sentiment, either. Women were congratulating them on smashing stuff as much as men were.

Then I thought, Maybe smashing things could be a reference to the ye olde version of ‘smashing’. As in, “That’s a smashing idea old chap”. But I doubt it. I know artisanal hipster things are back, but we haven’t gone THAT far yet.

So if it’s not the tech, and it’s not a reflection of the person’s character, and it’s not gender, and it’s not a yearning to return to a time where the Great Gatsby was first published, then what?

I can’t help but think this is one of those examples of something so deep in our culture, something so historically ingrained that it can’t be anything but the pervasiveness of masculinity.

A reminder of the pervasiveness of masculinity

Smashing things, killing things, and sports-related metaphors like ‘kicking goals’ just ‘sound’ very male to me. It reminds me of the bro-cultures I’ve worked in at other organisations. It makes me reminisce about all of those terrible job ads of the early 2000s (and even some today that haven’t evolved) where people were looking for ninjas and heroes to do their software design work instead of thoughtful, creative people.

It’s been written about time and time again. We know we live in a patriarchal society where ‘male-ness’ has been the dominant metaphor for so long. Where extroversion, over-confidence and swagger all reap the rewards of corporate ladder climbing, and financial and social success. Where introverts, deep-thinkers, and active listeners are not rewarded for the unique perspectives they bring to the table.

Every company shouts from their rooftops about diversity and inclusion. “The glass ceiling doesn’t exist here! We’re a meritocracy,” sounds utterly repetitive now. And no culture is more obviously doing this than that of our various levels of government right now. But here we find ourselves, on LinkedIn, providing positive reinforcement to someone through words like smashing, killing and kicking.

Ouch.

Change the language, change the culture

It’s all well and good for companies to update their values from their boardroom and say that they’re ‘officially’ diverse and inclusive. But, when every human in that organisation has learned that a masculine language – of violence, dominance and sports – is what makes you succeed in life, a new diversity policy won’t make a dent.

At Cogent, language is really important. We recognise that it’s a day-to-day manifestation of the values we hold. And, to be honest, I’m one of those people inside Cogent who call people out for use of language that doesn’t reflect those values. We say we’re a values-driven organisation, and so our language has to be too. Otherwise, well, to be frank, it all goes to shit. Here’s an example.

In every place I’ve worked before Cogent, we’ve referred to people as “Resources”. Or, in some worse places, “Human Capital”. Wow.

No, seriously, WOW. Just take 20 seconds. Think about that. How would you feel if you were “Human Capital”, an “Operational Expenditure”?

If there’s ever a way to dehumanise an employee, it’s to think about them as Human Capital.

Now, when you work in a consultancy full of humans that you call Resources and you need to plan where those ‘resources’ will be ‘deployed’ (yes, a term from defence forces), you end up with “Resource Planning.”

You can see where I’m going with this, so I probably don’t need to go on. The masculine metaphors are rife. But what I do need to show is how Cogent is different. Be prepared: this will sound very, very stupidly simple.

What if we just called people, people?

At Cogent, we employ people. My colleagues are people. My manager is a person. Our clients are people. When we’re discussing what sort of work those people do, we call that “People Planning.” We have a board in our office called “The People Planner”.

The obviousness and the simplicity of this astounds me still.

I know this sounds like a no-brainer, but we’re all from companies who have instilled in us a culture of ‘resources’. Just like trying to change one’s habits to use ‘they/them’ pronouns for our people who don’t identify as male or female, it’s hard, and it takes concerted time and effort.

In conversations, I notice that we’ll still slip the word ‘resources’ in when what we really mean is people. In fact, the planning software we use, 10,000ft, uses “Resource” as its label in place of “People”. I’ve asked them to change it but the response from Georgina, their support person, was this:

“We changed the language to “resource” to incorporate a new placeholder feature we released towards the end of last year. We had previously used the terminology of “user” to refer to people but decided to replace this so that a person/resource was more distinguishable from a user.”

And, as a software designer, I would say the same thing. I mean, I understand that reasoning. You have a large user base and you need to reflect the language they use, otherwise the software doesn’t make sense to them.

But, funnily enough, I’ve noticed that the primary users of that software, our GMs, are the ones who slip up most often in referring to people as ‘resources’ at Cogent. And it’s not their fault. I know the values they hold are consistent with the business’s and with mine. They’re genuinely empathetic, understanding and downright lovely people. It’s clear that the software has an effect on the way we label things. It’s that old saying, “First we shape the tools, then the tools shape us.”

What does this have to do with anything? My point is that it’s because of this sort of thing that we need reminding that our ‘resources’ are people. It’s not a failure of the culture if we slip up, it’s just the context in which we’re working. I’m one of those annoying people who catch people out when they use that term. And, I know I’m really annoying for it. But, to maintain our values-driven culture, we need to train ourselves to use the language that is consistent with our values, not the language we’re bombarded with in every waking moment of our lives; language consistent with the values of neo-liberal mass industrialisation, invented so that bosses could feel better about treating their people like units.

But what about that ‘smashing it’ thing?

In the end, shifting language takes a lot of effort. Trying to keep our language values-aligned in the face of larger cultural trends and the habits we’ve adopted from our lives in different, less progressive workplaces is hard. I can’t imagine the effort required to shift an entire population. It’s one of those things that can’t really be driven by a top-down approach alone. It needs to be one of those collective effort things. Problem is, people don’t often agree on much.

As Cogent grows, our culture is going to evolve, which means our language will too. I can’t be a lone member of the word police, and no one expects me to be. But once enough individuals instil the habit of using values-aligned language, that becomes the norm. Those norms form new habits. And those new habits form culture. Then you can truly have a values-driven business.

So, next time you want to congratulate someone, here are a few suggestions in case you can’t think of anything except violence. Copy and paste them if you like.

“Congratulations. You’ve worked very hard for this!” or

“Well done on your new promotion. Your intelligence, creativity and empathy make you perfect for the job.”

It only takes one of us to talk like this in a comment stream to demonstrate to others that it’s possible to provide positive reinforcement without adopting the male-centric language that we’ve come to fall back on out of habit. A side benefit is that you also sound more articulate.

Soon enough, we’ll all forget about that time we smashed it, killed it and kicked all those goals. Well, at least when we’re talking about people’s career progression. Sport or gladiatorial fights to the death might be a different story.

June 2017

What am I supposed to do with privilege?

This article was first posted on Medium on Jun 28, 2017

So, it turns out I’m about as privileged as you can get. I didn’t know this a few years ago. Like most privilege, it was hidden from me because of the friendship groups and workplaces in which I’d grown up and spent all my time. But over the last couple of years I’d like to think I’ve become more self-aware than ever. Part of that is down to my incredibly progressive workplace, which has brought privilege and diversity into focus for me. I’ve also put a lot of personal time and attention toward identifying and coming to terms with this massive blindspot of mine. I’ve gone through guilt, anger, frustration, anxiety and fear. At first, I resented that all my hard work in school, university and the workplace was now being undermined. Surely, historically discriminated-against groups made ‘privilege’ up to make those who have worked their arses off feel bad. If you want a better life, spend less time complaining and more time doing something about it, right? Just like I did! But, like most knee-jerk reactions to learning something new, there’s a lot more jerk in the response than you’re aware of in the moment. And, since then, I’ve immersed myself in reading, some reflective writing, and, well, truly listening to other humans’ experiences (historically discriminated against and otherwise). In that way that humans evolve and change, so have I.

Just how privileged am I?

The short answer is ‘about as privileged as a human being can get’. Have you heard of a ‘triple-threat’ in show business? They’re a performer who can sing, act and dance. Well, as far as privilege goes, I’m at least a quintuple threat:

1. Heterosexual. Check.
2. Tall. Check.
3. White. Check.
4. Male. Check.
5. Anglo-Celtic name. Check.

I thought about linking to all the evidence for each of these from here but I feared the list would distract from the point of this article. Let me assure you though, there’s study after study to prove it. And my personal experience shows that these privileges, without doubt, exist. Whilst being a quintuple threat comes with plenty of advantages, it also comes with some pretty massive questions to deal with. The main one being, “How much of my current state of being is down to the actions I’ve taken versus the fact that I happened to be born into a society that values every single one of those traits that I can’t change about myself?” I don’t have the answer and I’ve had to accept that I likely never will. For someone who spends their career having answers to questions, that’s hard.

I’m not here to feel sorry for myself though. As you can see, I’ve got far less to complain about than, I don’t know, about 98% of people on the planet. But, for me, just knowing that this privilege exists is not good enough. I need to do something about it by deciding how to use it to help those who aren’t as privileged as I am.

I hear time and time again that people like me are the solution. Us tall, white males are the ones people listen to with the least amount of prejudice/bias. Our opinions matter most and if we suggest something, or take some action, then it’ll spread far and wide and over time, society will change.

Well, now it’s time for action.

I’m a full-time designer and, more recently, a part-time picture book maker. All I’ve ever wanted to do is make a positive impact in the world. As a person of immense privilege, I couldn’t be in a better position to make this happen. So, for the last few months I’ve been reflecting deeply on this very thing. How on earth can I take advantage of my privileged position to help other humans? Well, it turns out, it boils down to just a couple of ways. I can donate time or I can donate money.

Time or money – Who has the time?

Another short answer: not me. One of the perks of being a software designer is that I get to help people. I get to listen to people’s problems, then make those problems go away through the decisions I make along with the team working on a project. I’ll be honest, it’s a freaking great job. It keeps my brain busy and I see real positive change in the world. I try to make most of my design work about health, education and energy — things I care deeply about. I don’t want this part of my life to change right now because it’s ticking all the boxes. It’s good for me and for those who use the things I make.

Outside of my day job, I spend nights and weekends dedicating my time to picture book writing and illustrating. It’s an accidental career that I stumbled upon after uploading a few doodles to Instagram and being ‘discovered’ serendipitously by a publisher. You could spend a lifetime trying to unpack how my privilege has contributed to making that possible, but that’s a whole other post. My work in picture book making is driven by trying to improve childhood literacy and numeracy. Picture books play an incredibly important role in a child’s visual and aural development – two more things I’ve completely taken for granted. And, for whatever reason, I’ve been given this gift (or, if you’re not religious, my genetic code has been randomly arranged in a way that makes it more likely) of being able to construct stories and draw things that people find empathy toward. So I feel I have no choice but to continue this work, too.

As you can imagine, between these two vocations, there’s not a lot of time left. In my ‘downtime’, I eat, sleep and spend time with people who are important to me. Beyond that, there’s not much else. So, I can either give up those two vocations (which I know are doing positive things) to create more time to use my privilege in other ways, or I can use a by-product of them: money.

It’s about the money, honey!

Or should I say, It’s about the money, gender-neutral human!

Obviously, the advantage of working two jobs and not doing much else means I’m financially remunerated for that time. It also means that I don’t have time to train myself in all the things I could do to make a small but heroic impact in the world. You know, those careers where there’s a coal-face element to helping people – social work, early childhood education, nursing, teaching and so on. Low-paying, largely thankless jobs that we all agree are incredibly important for supporting those in a failing social system. Trust me, I’ve thought about taking this path.

What if I quit my current jobs and went to volunteer in remote Australian Outback schools where I could help close the literacy gap for Indigenous children? I could use my design skills, my story-telling skills, and my illustration skills to bring about localised change to a small number of lives. The question still remains, “If I change just one life for the better, would it all be worth it?”

If designing software has taught me one thing, it’s creative problem-solving. Why create something new if you can stitch together pre-existing things to achieve your goal? Well, folks, it turns out money is that glue, and earning to give, at least for those of privilege, is a real choice that’s available.

Earning to give

The gist is that I believe I can have a bigger positive impact on the world by taking advantage of my own privilege. I have two careers, and my privilege is at least partly responsible for enabling that. So, what if I use the financial return from those to fund pre-existing organisations that are better equipped to deliver life-changing services than I will ever be? It’s not a new idea, but right now, I can’t think of a better way.

Take, for example, improving childhood literacy for Indigenous Australians. I could quit my two jobs right now. I could spend a period of time up-skilling in the credentials needed to work with children. Then I could go and live in a small, remote Australian community and teach a class of 12 kids for a year to help them. If I do that for the rest of my life, I’ll probably impact about 700 or so kids.

The alternative is that I can donate to The Australian Literacy and Numeracy Foundation, where it’s conservatively estimated that for every dollar invested in early childhood education, there’s a cumulative saving of $7 that would otherwise be spent on interventions in the short term (e.g. special education support), the mid term (e.g. support for early school leavers), and the long term (e.g. additional job readiness training resulting from lost productivity). A less conservative estimate from Nobel Prize-winning Professor James Heckman suggests that for every $1 spent in the early years, a return on investment of up to $300 can be expected (Greley, B., 2014).

I don’t know about you but the answer is pretty obvious to me. Money does in fact talk. Actually, it shouts from privileged rooftops!

Choosing what to support

There’s still a question about to whom, when and how much money I should give. And to be honest, I don’t have the answer to that. All I can do is look inward at my own personal set of values and make decisions that are important to me. These decisions are based on the empathy and compassion I have for humans and the planet that we happen to inhabit. I think this will be different for everyone and I believe it’s so important to avoid putting a moral judgement on which cause is ‘better’ than another because I also think there’s no objective answer. For me, right now, it’s about refugees, Indigenous Australians and the natural environment. As my life continues on its way, these priorities might change, and they might not. Things may become more or less important and I’m OK with that. What’s important for me is that the organisations I choose to assist are providing effective, evidence-based support to the people and environments they aim to help.

This isn’t a call to arms, or some sort of high-horse gallop through the internet to yell about my privilege and how I’m choosing to spend my money. In fact, I questioned whether I should put this very personal journey on to the internet at all. But, if the internet is anything, it’s a place where the privileged, like me, spend a large chunk of their day. It’s a place where those privileged enough to have access, and privileged with the literacy to use it, can engage and share in the multitudinous viewpoints of an entire planet of humans in order to make the world a more tolerant, equal and better place. So this is why it’s here. Thank you again for your time and attention.

May 2016

I don’t hate work like I should

This was first published on cogent.co

Sometimes I feel bad about the fact that I like my job. I hear it all the time from my friends and from my family. “Work sucks! I hate my job! Urgh, I have to go back to work on Monday.” It’s pretty depressing to know that the people I care about spend 8 hours a day (and most often more than that) doing things they don’t enjoy. What’s worse is that I’m one of those people who don’t do that. I love what I do and I love the place I work, and (yes – cue up the world’s smallest violin) it makes me feel guilty.

Work was something to be complained about

My parents are both blue-collar workers and always have been. They raised 3 kids with little to no money. They scrimped and saved every cent to put us through religious education, which isn’t cheap. They picked up odd jobs and worked weird hours to make ends meet. Jobs like cleaning toilets at a casino, sweeping streets at a local plaza, stacking shelves at a supermarket and so on. These aren’t glamorous, life-fulfilling vocations. It was their solution to the financial pressure they felt in keeping their family’s heads above water. And they complained about it a lot, but who could blame them? Work was something they just had to do to pay the bills. They didn’t have time for hobbies and we hardly had a family holiday beyond a caravan that a next-door neighbour owned. This isn’t a sob story about my upbringing though; it’s here to explain that my very early view of what work should be was shaped really strongly by my parents’ experience, and I suspect I’m not alone. Work was something to be hated and complained about, unfulfilling, but it had to be done.

And now I write this post from a cosy cafe in Melbourne as I sip a green tea and order a breakfast that’s far more expensive than if I made it myself at home, and, well, it’s hard to admit to myself but I’m about to start working. I spend my days listening to the problems that people have in their lives and then I design easy-to-use digital products that help those problems go away for those people. It’s incredibly rewarding work. Sure, there are days that are more difficult than others and sometimes you end up emotionally deflated or frustrated that, due to circumstances out of your control, you can’t solve that person’s problem. But on the whole, it’s averaging out pretty well.

It’s so rare for companies to walk the talk when it comes to living their values that it’s easy to remain sceptical when entering a new work culture

I work for a company that actually values work-life balance and, let me be clear, that’s different from just saying they value it. Cogent has a very strong focus on personal well-being and life outside of the office and, because of this, I’m able to run a second career as a children’s picture-book illustrator. So, I spend my days improving people’s lives through software, and then get to spend my nights and weekends bringing parents and children together through the wonderful world of picture books. In between those things, I try to manage an auto-immune condition, which is really time-consuming. I can’t say that work sucks like my parents did. I actually really enjoy it and, well, this comes with its own set of problems.

When I go back home and sit around having a beer with my Dad and my brother (as the males in our family have traditionally done) we talk about work. “How’s work?” My dad asks us both.

My brother complains that being a plumber is hard work. He was digging a trench in the freezing cold the other day and now has blisters all over his hands that won’t heal until the weekend. My Dad contributes in between wheezes and puffs of his asthma medication. He was loading hundreds of 30kg bags onto aeroplanes at the airport as a baggage handler and his back is starting to seize up. He needs to go to the physio now to get it sorted before he can go back. Which, to him, isn’t a bad result. He doesn’t need to go back to work for a while so “he’s got a few days off”. But then, it’s my turn. What am I supposed to say? That work is great? I’m really enjoying it? Should I go into details about my latest round of user testing and how thrilled people are with the improvements to their software?

How much am I allowed to enjoy working?

On one hand, I think they’d like to hear this sort of story. In some ways, it’s validation for my Dad that his years of toiling have paid off. He’d be happy to know that I’m happy. But on the other hand, it doesn’t show a great deal of empathy for their back-breaking labour if I talk about a nasty paper cut I got the other day. And, more importantly, how can they understand that work doesn’t suck when, well, it’s supposed to! Yes, I’m using extremes here to demonstrate a point but in my job here at Cogent, we’re swimming against the current in some ways. I work with great, smart people every day. We make things together that affect our world and change people’s lives. We’re not working on a big shiny scale but we’re making a difference to people.

I remember when I applied for a job here at Cogent a couple of years ago, it sounded too good to be true. Most companies will say they value work-life balance, personal wellbeing, professional development, transparency, creativity and so on. In my experience, this has always been lipstick on a pig because most organisations talk the talk but don’t walk the walk. So I approached this role with a certain level of scepticism. But 2 years is plenty of time to find the cracks and, to be honest, there haven’t been many. Sure, every workplace will have its ups and downs and we certainly don’t dance around the office with lollipops and rainbows every day, but we’re doing great work and, for the most part, I’m really enjoying it.

So, what do I tell my family when we’re talking about work? They don’t understand the details of what I do day in and day out, so I tell them that I’m lucky I’ve found a really unique company that supports me in doing what I do, inside and outside of work. I’m lucky to work with the group of people I work with and, because of this, work is much easier than my family would be used to. I don’t do back-breaking labour but I’m not bored. I’m solving interesting problems and so I go home tired, brain-tired, but satisfied that tomorrow I’ll get to work with the same group of amazing people and solve even hairier problems with them, all in time to get home for tea.

March 2014

The grass is greener on the front lawn

If Vienetta ice-cream on the table means your parents are having guests over for dinner, or a rainy Saturday means a trip to the local Westfield shopping centre, then chances are you were a child in the suburbs of Australia in the early 90s. Having lived in “The ’burbs” all my life, I took all of what I did as a child for granted, as we often do when we’re kids. But as I now reach the age my parents were when they had me, something has become clear – I’ve taken for granted the very thing that makes the ’burbs the ’burbs: the lawn.

If I look at the typical suburban lawn in Australia through the lens of a designer I see some pretty big functional flaws, the main one being location. Australia is a dry, drought-affected environment. Our native grasses grow in clumps, like a Troll doll buried to its forehead in the earth. Lush green rolling paddocks that we associate with a UK or European landscape simply do not exist here. This indicates to me that the lawn that I grew up with in the suburbs of Sydney – the soft, expansive, evenly-covered bright green play surface – took a lot of work to keep. My Dad must have religiously mowed, fed & weeded our lawn, all outside of my field of vision as an 8-year-old. I was more interested in running through sprinklers and riding the neighbour’s dog up and down the street. The question I ask now, as a designer, is what drove my Dad to inadvertently make the lawn the centrepiece of my childhood memories.

My Dad has always had an interest in plants. He studied horticulture at night for a number of years outside of his job on the local council. Perhaps he maintained our lawn because he loved to do it, he enjoyed it? Or, perhaps it was his overwhelming need to feel as though his kids were looked after. The lawn was a way that he could provide a place for us to play in the safety of our local neighbourhood. But, I remember looking up the length of the nature strip of our street and seeing that all the lawns were green and manicured, not just the lawns of houses that had kids my age. So, if it wasn’t just for us, and it wasn’t just for him, the reason had to come from somewhere else. It comes down to something bigger.

Garden design in England and France (probably Australia’s two biggest influences in the early years of settlement) could be responsible for the lawn I grew up with. Throughout history, Western society has been pretty good at expressing control. Those who had control had the power, and those who had the power had the money. Beautifully manicured, symmetrical lawns with trimmed hedges, roses in rows and geometrically precise garden paths characterise English and French gardens, even today. The Palace of Versailles, the other palaces of France, and any of the many stately homes across the UK landscape demonstrate this common characteristic. The maintenance required to keep this type of garden looking ‘nice’ implies wealth; a wealth of money, time and resources. Owners pay staff to attend to a garden’s every need if they don’t have the time to do it themselves; watering, planting and putting each and every pebble or leaf back in its place. Without this constant attention these gardens fail to impress. The beauty of this type of design is found in the order, in the precision; it projects a statement that says “I am able to control my environment.”

Another possible influence on my childhood lawn could be from another continent: the 1950s in the USA, a period of history where the stereotype of green-grassed suburban homes became a stereotype for a reason. In 1938, Americans were endowed with a 40-hour working week, which meant they no longer had to work on Saturdays at all (as opposed to working just a half day on Saturday prior to 1938). The wealth of time had returned to them. Then, WWII came and went. After a period of uncertainty and a lack of resources, abundance returned and it was time to celebrate. What better way to celebrate and regain a sense of control in the world (and communicate it to the world) than to manicure a lawn? War veterans returned home, family life was re-established, housing and the industrial economy boomed, the world regained a sense of order, and predictability was available again. It didn’t matter anymore whether you were aristocracy or not; even Joe the plumber could mow and water his lawn every Saturday.

What effect did WWII and America’s reaction have on Australia? Australia’s response, of course, was to do what Australia does best: mimic the things that seem to lead to positive outcomes in other parts of the world. Having been settled by the English (who were influenced by the French), the seeds of controlled and geometric garden design were sown long ago (pun definitely intended). Our first settlers cleared land and tried to plant the classic English plants around buildings that mimicked European architecture. After all, this was the model of success in their home country, but these have proven to be unsustainable decisions in the long run. It’s no surprise to find Australians looking to America, with its “success” in industry and economics hundreds of years later, to try and find a way to emulate that success here.

In a similar way but on a much smaller scale, by the time my parents had me they had both successfully built lives after emigrating from Europe. They settled in a new and strange country, they had jobs, they bought a house, and after about 15 years of uncertainty they finally had some control over their lives. What better way to feel aristocratic and express that control to the world and to themselves than keeping that lawn absolutely perfect? The pattern seems to repeat itself: when I am stable and in control, I shall have a healthy green lawn.

Recently, I’ve gone through similar motions to my parents when they were my age; moving interstate, finding a job and vocation I love, getting married and buying a house. I now find myself looking to the presentation of my home as a way of expressing my sense of control and ‘achievement’ because I’ve got more time to do it. I now understand the chain of events that leads me to feel guilty when I opt for a lawn-free landscape. With the lifestyle I lead, I don’t have the ‘wealth’ of time or money to hire staff to maintain my lawn (and yes, neither did my parents) but more importantly, I’m one generation removed from the 50s-80s of which my parents were a part. I’ve seen different world events. The focus of the way we live our lives has been less about mass-consumerism and success through over-production and more about leaving a small footprint behind; not to command and conquer our environment but to live with it. Success isn’t measured by how much stuff you own and how nice it looks, as it might have been in my parents’ time; it’s measured by ‘sustainability’ – respect for the people and place in which you live. As Aboriginal Australians have long said – it’s about living with the land.

My inclination for my own home is to plant native Australian flora, a sprawling, undefined ‘bushland’ that thrives where it is because that’s where it should have always been.

It’s ironic really. It’s taken the Western view of society (in Australia at least) a couple of hundred years to realise that we’ve gone about this a little backwards. As I gaze out of my window and stare at a park that looks as though it’s made of hay because the grass is so dry, I can’t help but laugh. We’ve had this wrong for a long time. English and French aristocratic values expressed in garden design are, quite literally, a design solution applied to a context that isn’t the right fit. We use resources like water in a place where water is becoming increasingly precious in order to maintain an ‘ideal’ that we’ve been trained to believe is the ‘right’ way.

In my work, I spend day after day trying to positively influence behaviour but, I think, a lack of control scares the western mind. Folklore and classic Australian writing paint a picture of a sunburnt country where serial killers roam free. The disorder of the natural Australian landscape has been used by writers and artists throughout Australian history to represent a disorderly mind. This leads us to believe that without explicit control over our landscape, we don’t have sufficient control over our emotions, our mind or our actions. If we cannot command and conquer this rugged outback, do we fail to progress the dominance of our species over everything else in the short time we have on Earth? This sort of question excites me.

I’m often astonished at the design sensibility of Eastern culture; it’s so far removed from my own perceptions of the way the world should be. One example is the Japanese concept of ‘wabi-sabi’. To be able to accept that change is inevitable, to find beauty in the constant evolution of the built and natural environment, is a mindset that seems to have so many potential benefits, both in the way we use resources and also for our mental health. Another Japanese tradition – keeping flowers that have not yet blossomed by the front door to welcome guests – seems strange to the Western mind at first. However, the eastern view values the presentation of the potential for beauty over a short-term beauty whose only evolution is its decline. This view is so at odds with the ‘immediacy’ and ‘urgency’ of the western mindset; I find it equally fascinating and hopeful.

To find beauty in a sprawling, untamed landscape is really difficult for someone who grew up on green grasses and western values. I’m finding it much harder than I thought. I’m actively trying to reverse 30 years of social pre-conceptions about what beauty is, about what is meant by order and control. For me, it’s not about control anymore, and when you let go of this, an unexpected beauty emerges. Since I’ve planted native flora in and around my home, birds I’ve never seen before are regular visitors. Flowers bloom in Spring with very little attention from me but they seem happy. Sure, there’s still a symbiosis there – they still require water to become established and food on occasion – but the time, money and resources required are far more sustainable. It’s no longer a fight for survival for either of us; it’s an acceptance of living in an environment where we both belong. Less water, less cost and less time are required from both parties, yet we both benefit. It’s probably my most successful design solution ever, the one I’m most proud of. I’m aware that it will never win a design award nor be internationally recognised but I’m OK with that. It’s created a story and a sense of comfort – just like the grass lawn of my childhood did for my Dad and me.

February 2013

We’re all writing our own story

Good stories are hard to come by, or at least this is what we’re brought up to believe. There’s only one Tolstoy, one Dickens, one Bronte, right? If you can’t tell a classic that’ll be remembered for 1000 years, then what’s the point of telling any story at all? We imagine these big-name authors of the past sitting in their quiet loft or ‘studio’ by candlelight, carefully penning each and every word, finding the right place at the right time for that comma, that full-stop or that exclamation point. Whilst I cannot speak for Tolstoy, Dickens or Bronte, it’s clear to me that writing is about more than just the language or verse, about more than hard work and endless hours of writing and re-writing – it’s about recognising that our own stories are important and interesting enough to tell.

I remember when I first started at my current workplace, I was tasked with telling those who were already working there about me and my life; it was called “doing a ‘this is your life’.” It’s a great way to give some context to your colleagues about the skills you have and how you’ve become the person you’ve become. I remember struggling with the content of this talk I was supposed to give. I remember sitting back and saying, “this is hard because my life isn’t interesting.” I also remember telling them my story – essentially a chronological list of events that had happened to me since I was born – and the surprise I felt when their discussion of those key events told me that my life was, in fact, interesting to other people.

Fast forward 18 months, and on Tuesday I attended the “Postcard from Melbourne: Jens Lekman” event at the Wheeler Centre in Melbourne. I’ve been a big fan of Jens’ narrative-style songwriting for years so it was great to get the opportunity to listen to him speak about his own inspiration and process for writing his songs. Amongst the many insightful and interesting discussion points, one struck home more strongly than the others – he said, ‘we don’t realise why our own stories are so important.’ This little sentence has stuck with me and I couldn’t help but ruminate on it all week. He’s right! Our own lives are interesting to everyone but ourselves.

Why is this? Why are we fascinated by the events that occur to someone else but not ourselves? Do they feel more serendipitous when something happens to someone who isn’t us? Is it because we don’t have the contextual view of every connection and circumstance in that person’s life to lead up to an interesting event? If every experience we have helps to shape who we are and how we think then our own stories should be the most important to us, not the least important. In fact, the only thing more important than having our own stories is being able to see the value in sharing them with other people.

Inspiration for creative writing (or any writing for that matter) is not some stroke of genius or a whimsical moment where a lightning bolt descends from the heavens and strikes the plot, characters and entire collection of words (in order) for War & Peace into our heads. This inspiration comes from living life. Where you’re born, how you grew up, who your friends were, how early you lost your virginity, what you ate for breakfast that morning before you fell sick in the biggest exam of your life up to that point – there’s not a single person on earth who occupies the same space at the same time as anyone else, which means that each and every individual *is* a unique story waiting to be told, with a unique way of interpreting the world. That, for me, is really exciting!

My recent pursuits in search of creative-writing inspiration have seen me ask myself the same questions over and over – “What am I going to write about? What are the elements of a great character? Is this one good enough? When will that magic moment hit where the words will just fall out of my fingers and on to the page?” I’ve read endless blog posts from inspiring authors, thought about joining a writers’ society and watched video after video on the theory and technical structure of great creative storytelling. Of course, this loop of unanswered questions leads to one inevitable conclusion – staring at a blank page pondering the potential of a great story, not writing one. I recognise in myself a seed of an idea buried in my brain somewhere because I consider myself a ‘creative’ person but alas, nothing comes. We fail to realise that each and every person we’ve ever met, in the way that we’ve met them, is inspiration available only to ourselves and not to anyone else in the world. Family members, friends, pets, that annoying guy on the same train at the same time every day who chews his chewing gum too loudly – they’re all potential characters and they’re all potential plots – we just need to put pen to paper. Easier said than done of course.

Many published authors have repeated that annoying phrase to anyone who is not a published author, “You should write what you know,” to which my usual response is, “Yes, all well and good, but I don’t know anything.” They don’t say, “Write your own stories; write your life down on a piece of paper and show it to others. They’ll tell you what’s interesting, they’ll laugh at the bits you didn’t think were funny. They’ll even share their own similar (or completely unrelated) stories with you.” Writing what you know implies some sort of excellent knowledge of a subject – maths, history, language, science. But, it’s not about writing what you know; it’s about writing what has happened to you! Who have you met? What did they say? How did they say it? It’s about writing down that time you went backpacking in Eastern Europe and stumbled across a gentleman called Enzo who promised to take you to his villa in the country if you gave him the patent leather shoes you bought when you were in Milan … not completely true, but it makes my point. Every story will be interesting to someone.

It’s common knowledge that Charles Dickens used key events in his own life to form the basis for a number of his stories. Tolstoy writes about his home in Russia, while Bronte writes about the tragedy of love. These authors of classics used the same inspiration as our modern-day television writers. A recent Seinfeld documentary showed that almost every plot across all 9 seasons of the show came from a writer’s own experiences – adapting people, places and times and exaggerating circumstances is simply part of great storytelling. Yes, there really was a Soup Nazi, Kramer is a real person, and the character of George Costanza is based on the executive producer and co-writer of the show, Larry David. What did they do when the writers on Seinfeld started to run dry on ideas? They simply got additional people in – people with their own stories! Jens could not have put it better; we just don’t realise why our own stories are so important because we often don’t take the time to tell them to anyone and get that second, third or fourth perspective.

With the barriers between a story and a large audience lower than they have ever been, we’re living in a time that Dickens, Bronte & Tolstoy would no doubt envy. No longer do we require ‘approval’ from the readership of a local paper (like Dickens) before we’re recognised as a ‘writer’. We don’t have to sit in a local market square doing public readings in the hope that a passer-by may take a liking to our words or our voice. The only barrier that comes between a story and its audience in today’s internet-enabled society is the author’s ability to recognise the value in their own stories and to tell them. Your own stories may not seem interesting to you, but let others be the judge of that.

May 2012

Pleasure in waiting: A lesson from the seasons

Three seemingly random events have happened to me in the past couple of months:

  1. I had to write a letter to my 88-year-old uncle who doesn’t use computers.
  2. I’ve started doing watercolour again for the first time since high-school.
  3. My garden cried out to be freed from weeds before the winter set in.

While these 3 events seem completely unrelated, they’ve brought to my attention how impatient I’ve become and I’m forced to reflect, as any designer would, on why this is happening – is it just me or is it a more common issue permeating our culture and values?

I see it on the commute to work every day; trains run 2-3 minutes apart during peak hour and yet there are always at least 2 people who make a dive for the closing carriage doors in the hope that they don’t have to wait another 120 seconds for the next one. Those lucky enough to make it aboard are huffing and puffing; one lady has broken her high heel, another needs to take a shot of her asthma medication and a sip of water. A gentleman has saturated his business attire with sweat. He needs to pull a handkerchief from his pocket to try and mop it up so that his colleagues don’t talk about him behind his back at work; the fear of being known as “the sweaty guy” instead of “the punctual one” when he gets to work at 8:59am instead of 9:02am is just too much to bear.

Out of context this sort of behaviour seems ridiculous but I’m sure that most of us have done it. If not the dash for the train, then the exasperated sigh when an email takes more than 10 seconds to send, or the zig and zag down a footpath to try to get past the old lady who’s trudging her way forward, step by excruciatingly slow step, behind her own walking frame. It leads me to think – were we always like this, or is our society the first to think that waiting 30 seconds for anything was this inconvenient?

It’s generally agreed that technology can’t keep up with our expectations. Our expectations of when things should happen are… well, right now. We often use phrases like, “We can put a man on the moon but we can’t get WiFi to work 1 hour away from the CBD,” and whilst technology isn’t meeting our expectations, it’s doing a pretty good job of setting them. I often hear from clients who, 2 minutes after pressing the ‘send’ button on their email, decide to call me on the phone to make sure I got it and to tell me how beneficial it would be if I could respond to it ASAP. We seem to be suffering from a kind of anxiety associated with silence – if there’s no immediate feedback from our actions then something must clearly be wrong.

It’s easy to talk about impatience in the context of work because, like it or not, time is money. What’s ironic is that time must’ve been far less valuable just 20 years ago, when email, nay fax, was not yet widely used and regular post was the best we could manage. What on earth did people do while they were ‘waiting for confirmation’ of a design concept or, better yet, waiting for a designer to put together a mock-up that probably took about 4 weeks? Over recent months, my 3 separate experiences unrelated to work have taught me some very valuable lessons in what I now like to call “Veruca Salt syndrome”.

For members of my generation who grew up with Roald Dahl’s wonderful story, “Charlie and the Chocolate Factory”, they’ll know the young Veruca Salt character well. Veruca was a spoiled child whose father, a factory owner, gave her everything she desired for fear of a tantrum. Today, her catchphrase, “But I want it nooow!”, rings in my ears more often than it ever has as digital technology continues to set our expectations of how quickly things should happen and of how completely inconvenient it is when we have to wait for anything. The three recent examples of my own impatience have, without question, emphasised this phenomenon to me more dramatically than I could ever have imagined.

My wife and I recently booked a holiday to Scotland. My grandma often told me of the many years she spent in Orkney, off the north coast of the Scottish mainland, and how magical that time was for her. She spoke of the war, Scapa Flow and the Pentland Firth, and how she was almost shot by passing planes whilst carrying milk home from the village centre. My grandma passed away a few years ago, so in order to find out more about her family’s life there I decided to contact my oldest direct relative: my 88-year-old ‘grand-uncle’, her brother.

The most obvious place to start was to ask my parents how I could contact Uncle Terry. They laughed and responded, “Email? You’ll have to write to him, we only have his home address but he loves receiving letters.”

A letter? I don’t think I had ever written a personal letter to a family member before. A postcard, yes, that was easy, “Having a great time… wish you were here… we chose this card because it has a photo of a walrus that we saw yesterday on it. See ya soon. Hugs and Kisses”. But a letter? What do I say? I hardly know him. How long should it be? Is it rude to type it? I don’t want to start writing it and then make a mistake and have to cross whole sentences and paragraphs out. Even if I did write it, how long would it take for him to reply? Maybe I won’t hear back from him at all? How will I know he got it? The anxiety loop I found myself in was enough for me to put it off for 6 months.

When I finally got around to writing it, I asked my mum for help on where to start. Her guiding words were, “Just tell them how the family is.” My immediate reaction was, “Isn’t that boring to most people?” Apparently not. I wrote a 3-page letter to my uncle telling him about my wedding, my parents, my brother, my sister and what they’ve all been doing. I went through the arduous process of selecting photos to send to him. I went to the photo store to have them printed. I wrote a little description of each one and listed the names of the people photographed on the back so he knew who was who – the whole process took about a week. What’s this got to do with impatience? Well, despite all of the running around I had to do, when I popped it in the post the sense of anticipation was astounding! What will happen? Will he get it? Will he write back? Will I get photos of my family in Orkney that I’ve never seen before? Will I be introduced to some old friends of theirs? It feels a bit like Christmas – and it’s a feeling I no longer get with email or digital communication… ever. If it were email, my feelings would be of growing frustration for every day I didn’t receive a reply. I have no way of controlling or knowing what the outcome will be and, quite frankly, it’s overwhelmingly exciting.

My second experience was my recent reintroduction into watercolour – the first time I’ve picked up a brush since high school. The medium is such a joy to work with. The physical relationship between brush, pigment, water and paper, and the varying degrees of wet and dry, light and dark, activates a part of my brain that designing for digital interfaces doesn’t come close to. As beautiful as my first night with brush in hand was, I didn’t realise it until a few days afterwards. It took 3 seconds (about 4 quick strokes) to lay down my first wash since 2001, a strong mix of French ultramarine on the saturated 320gsm paper. As the physical nature of watercolour dictates, I could no longer touch it until it was completely dry, and that takes anywhere between 20 and 60 minutes. What was I supposed to do now? I had all my art supplies out, I had set up the work area, pigment and water were all over the place and it was 10pm – I had to work tomorrow. I paced back and forth anxiously. I considered turning on the house heating to speed up the process. I found it very difficult to concentrate on any other activity like cooking or ironing; I just wanted the wash to dry so I could see how it turned out and continue painting.

About 5 minutes later I found myself with a brush in one hand and a hair dryer in the other, using the gift of electricity to speed up the process. I also found a moment of clarity where I recognised in myself a complete absence of patience. What was the rush? Could I not just rinse the brush, have a good night’s sleep and continue with a dry canvas in the morning or after work the next day? Did I really need to see it now? With watercolour, the feedback comes in the drying process. Some artists say “watercolour will always be a better painter than you are” because the subtleties of the brush strokes only emerge as the pigment dries and fastens to the paper. While this may be true, my problem was that I’m used to having Photoshop “dry” instantly. I can work in layers and save, re-save, undo mistakes and continue. Watercolour is a much harsher critic and has obviously uncovered a personality trait in me (or is it a habit of expectation?) that I wasn’t aware existed.

Lastly, like watercolour, gardening also provides feedback over (what is now considered to be) a long period of time. My dad has a green thumb. He grows his own vegetables and has raised his own chickens. Every season his garden looks picture-perfect, so it’s no surprise that I enjoy the process in my own home too. Well, at least I do now. The first few years after I moved out felt like a constant battle; if in 2 months the basil plant wasn’t double the size then I’d just rip it out and try something else because it obviously wasn’t going to grow. The lesson I learned from my Dad is that with the garden, you need to think in seasons. He plants a plant not for this winter, and not the next winter, but for a winter 3 years into the future. Once planted, the process becomes one of nurturing through water and fertiliser and sun – there is no ‘rapid’ iteration. If there is too much sun (or too little), a plant tells you in 3-4 days and you need to correct it. It is the same with water, mulch and soil health. The time-scale of the natural world is what it is. To be happy with it, you have to walk at the pace it chooses rather than pulling it kicking and screaming at a pace it’s not comfortable with.

Letter-writing, watercolour and gardening all have one thing in common – to get the appropriate feedback, to gain a sense of achievement, to reach the goal, takes much longer than digital technology has trained us to wait. The physicality of communicating by the printed word, or of using the process of drying water to create art, is such that we have to let it lead for once. If you want a beautifully crafted painting, you must wait for the watercolour to dry; only then can you make an informed decision about your next choice of colour, level of saturation and brush type. If you want a blossoming garden next spring, the reality is you can’t have it… maybe you can try for the following spring instead. If you want to communicate with a relative overseas without the use of digital technology, well, your words need to travel 1200kms by boat or by air and back again. All this is at odds with our day-to-day interactions with each other through digital technology. I can transfer bank funds from one account to another in the 5 minutes before I leave the house and have those funds available at the swipe of a card at almost any point of sale – except, of course, for those really inconvenient and despicable shopfronts that don’t yet have EFTPOS facilities, right?

What I’ve been finding as I continue to observe our behaviour in public space is that we’re simply becoming far less patient and, as a result, more irate when the world doesn’t behave according to our expectations. We’re becoming a nation of angry people who can’t get ‘there’ (wherever ‘there’ is) fast enough. The question I’d love to have the answer to is: when will it be fast enough? Technology is 100 times faster than it was 10 years ago and it’s still not fast enough. In 10 years’ time, it’ll probably be 1000 times faster than it is today, but our expectation, no doubt, will be that it should be 1500 times faster. It’s inevitable that it will never match our expectations, which leads me to ask – what can we do about it? The most obvious answer, as I’ve realised in these past months, is to find joy in the waiting; to give yourself permission to be patient.

It sounds like a novel concept – joy in waiting. When I said this to my wife and a couple of colleagues it was scoffed at, like I was joking or had been reading too many Jane Austen novels and my perception of modern-day reality was severely skewed. But, as is often the case with solving a difficult problem, the solution sometimes lies in the opposite, the unexpected. Either find joy in waiting or continue chasing a goal-post that’s always going to be just out of reach. Before writing that letter to my uncle, my barriers to writing had a strong negative bias. After putting the letter in the post, those exact same barriers turned to hopeful expectation and childish giddiness; from thinking about the possibility of not receiving a response to expecting a surprising or delightful one. It’s the same with watercolour. Watching the subtle variations of line and colour form in front of my eyes was (and is) a joyous experience – even if it turns out to be completely different to what I set out to achieve. I get joy from waiting for the garden to respond to the new layer of fertiliser or organic matter I’ve laid down for it – sure, the results don’t manifest immediately, but when I notice that first bud or see that first sprout of a leaf, it’s a feeling I just don’t get from any other experience – especially the digital world.

Waiting promotes the feeling of anticipation like nothing else. It’s a feeling that is so strongly linked with moments in my childhood too. My mum used lay-by religiously for our Christmas presents and we’d know that just over the other side of the lay-by counter lay a wealth of goodies that had been put away in July. Come Christmas time, we’d be the happiest kids on earth. When Christmas finally did roll around, the joy of unwrapping and playing with those presents after waiting so long was second to none. As adults we seem to lose that feeling of pleasure. Waiting becomes inconvenient, inconvenience leads to frustration, frustration to anger, and all of a sudden the blood pressure is through the roof and we’re taking multiple pills a day to try and curb it. No one says, “Have you tried finding joy in waiting?” or “I give you permission to be patient.”

I’ll be the first to admit that these thoughts sound somewhat idealistic; that joy in waiting doesn’t always apply in the context of work when there are deadlines and nagging clients. There’s no time to be patient when you need to be in the office at 6am for a conference call with a colleague in England, and then you want to rush home for dinner to be with your 3-year-old child for 10 minutes before he’s put to bed by the nanny at 7:30pm on the dot. But, while it may seem impractical in some instances, what I’ve found is that there are moments in the day where you’re actually allowed to wait, and I’ve begun to relish the opportunity – not to mention, I’ve started to find more of them. The morning turnstile crush at South Yarra train station is a perfect example.

I used to jam my way into the hordes of people because I was trying to get to work like everyone else – just like most people do at 8:45am on a weekday at an inner-city train station. The reality is that we will all get through those turnstiles eventually, and often within 30 seconds. I’ve had to actively train myself to switch to a mode of patience and ‘acceptance’, and it wasn’t easy – it too takes time. Standing patiently in the throng has given me the opportunity to notice things like the young school boy who helped the old lady through the crowd, or the pregnant woman frantically trying to retrieve her train ticket from a backpack pocket she couldn’t quite reach, only to be helped by a young guy in a suit who swiped her through the barriers using his own ticket.

When you accept waiting, you notice things that make life that little bit more special, interesting and inspirational. The more often you do it, the more you don’t just notice those things – you actively look for them. It heightens the anticipation you feel and breeds an expectation of positivity. It melts away the cynicism we get bombarded with through mainstream news and media channels. We live in a fascinating time of never-before-seen speed. It’s easy to get caught up with the goal of ‘getting somewhere’ but the reality is we all end up in the same place eventually, so why not take a step back and enjoy the wait?

March 2012

Cue, Saturation, Blindness: How we consume the new

I’ve been reading a lot about anthropology lately, trying to get a better understanding of the ‘why’ and the ‘how’ of the way we live our lives. In my attempt to try to understand what makes us tick as a society I’ve recognised a powerful three-stage process that seems to repeat itself on different levels throughout our history and it’s constantly shaping who we are, like a potter moulding clay.

The Cue is the first moment that an idea is planted within the cultural landscape – like a spark or an epiphany. People responsible for the cue are often heralded as creative or innovative by their peers and colleagues. They are perceived as people with such amazing vision that no one can believe that what they’ve come up with hasn’t been thought of before. The Cue often seems so obvious, but only after it’s presented for the first time. It paves the way for a new wave of creativity as others take the Cue and use it in their own way to create something new.

The idea of the Cue can come in many forms, and some have a greater impact on society than others. In the world of scientific progress (a facet of our culture on which we place a lot of importance as a society), Einstein’s theory of relativity or Isaac Newton’s laws of motion could be considered “Cues”. In the land of branding or marketing, a facet of our culture which we often view with less importance than scientific discovery, Coca-Cola’s use of Santa Claus in their 1930s advertising campaigns, dressed in Coke’s brand colours of red and white, or the introduction of the ‘need’ to use deodorant in the late 19th century, are also Cues.

Naturally, Einstein and Newton’s scientific discoveries seem much more important to humanity than the clothes of a fictitious Christmas character or under-arm sweat regulation products, but the scale of this importance may not be quite so different – they simply affect a different part of our lives. Western culture values scientific progress and education more than our visual culture or societal customs. For some reason, the apparent objectivity of science is far easier for us to comprehend – we’re able to identify right from wrong, we’re able to teach these mathematical formulae in a formal classroom environment. Trying to quantify the impact that Santa’s clothes or anti-perspirants have had on our society and environment is a much harder task, but it demonstrates a very valuable point. While these two examples of scientific discovery and two examples of cultural persuasion seem to be at opposite ends of the spectrum, it’s clear that any Cue can change the way we perceive our world, it can be introduced to us by just a single person, and its effect on our society is permanent.

Saturation is the process we seem to adopt of ‘mainstreaming’ or ‘normalising’ a Cue that we feel has the potential to impact our culture in a positive way. Many of the things I take for granted today started sometime before I was born – like the colour of Santa’s clothes. In fact, it’s the same for any generation that takes an idea or habit for granted, and it’s exactly this that makes saturation so difficult to recognise in everyday life.

Last week, I found myself trying to select my next toothbrush from the plethora of choice that the supermarket shelf was offering. I had finally narrowed it down to two or three brushes and caught myself using the ‘tongue-cleaner’ feature of a brush as a deciding factor. At that moment, like a slap in the face, I remembered seeing my first ever advertisement for a toothbrush with a tongue cleaner (about 2 years ago) whilst I was sitting on the couch. I laughed so hard that the milk I was drinking came out of my nose. “Oh. My. Goodness!” I said to myself, “It looks like the ad agencies have finally exhausted every possible iteration of bristle-angles and now it’s time to move on to another part of the mouth. I can’t wait to see the 3-in-1 brush, tongue and cheek masseur in 2015.” It’s ironic that here I was, literally 2-3 years later, having an internal dialogue with myself about whether a tongue-cleaner, at no extra cost over the other brushes, would be a good idea. The Cue had been planted 2 years ago and, with other toothbrush companies bringing out their own version of it over that time, I had subconsciously seen enough of it to accept it as a normal part of my life. The saturation had progressed to blindness.

The third stage of consumption is the concept of Blindness, which could equally be called “acceptance” or “adoption”. It’s the point where our society has been saturated with the original Cue for long enough that it has become the norm. We become used to its presence in our lives and stop questioning it because it’s so ubiquitous and its immediate impact has not been dramatically negative. If humanity were a river, blindness would be the current or flow – the Cue becomes part of our identity and frees up our attention to focus on the next one.

What the experience of the tongue-cleaner made very clear to me is that no matter how unusual, outrageous or immediately frivolous an idea might seem, if it’s repeated for long enough, if we see it often enough and it does not seem to have any short-term negative impacts on health or wellbeing, we’re willing to accept it. We’re willing to give it a go and consider it a ‘progression’ or ‘advancement’ in the way we live our lives. My parents’ generation never had tongue-cleaners. My dad is 58 and only had his first filling last year – no other dental work required. Are we just refining the way we live from generation to generation – the more we discover about the tongue, the more we need to do to manage its bacteria? Or is it simply some marketing executive or ad agency creating the problem and also providing the solution so that an extension can be added to the pool-house before the family comes over for summer?

All jokes aside, I don’t mean any of this to be cynical. I simply find it fascinating. What’s most fascinating about Cue, Saturation and Blindness (CSB) is that we know about it and accept it in certain areas of society. For some reason, when it comes to clothing we seem to accept that fashion will come and go, that style has a short lifespan. Whether it’s jeggings, denim shorts, skinny black jeans, in fact anything from the 80s really – we look back and laugh at ourselves living through the style. We accept that it has changed and we know that it will continue to change. Yet, for some reason we’re sure that we’re not making the same fashion mistakes twice when we make our next purchase of this season’s colours and textures.

What surprises me most is that in other facets of our culture we seem to respond differently, particularly in technology, hygiene and medicine. Faster car? Why not. Another exotic fragrance or make-up with oxygen-activated micro minerals? Yes please. New drug to cure a disease? Why do you even ask?! More features in your smart phone? Of course! In these areas of our society we see each iteration, each cue, as a building block to a more ‘successful’ one; one that’s moving forward, progressing to a future goal, becoming quicker, smarter and healthier; a society that’s improving. We’ve even started using these words to describe today’s products – we have a ‘smart phone’, ‘intelligent lighting’, ‘responsive design’. We do this knowing full well that tomorrow’s smart phone will be smarter, that intelligent lighting in 10 years’ time will be more intelligent than today’s and that design can only become more responsive. So what does that really say about today’s devices and technology?

Since putting this idea to paper, I’ve seen it everywhere. My local supermarket has upwards of 34 different shower gel products. Let me repeat that, 34! As each brand tries to carve out a place in the market for itself we come to accept that fragrances (and I use that term loosely) like “Refreshing Ocean Fresh”, “Foamburst Moisture Delight” and “Coconut and Tiare flower” are valid, attractive and necessary (yes, they are real fragrances). What’s even more interesting to me is that if we were to take away the “Foamburst Moisture Delight” one, there is no doubt that, for at least 15 minutes of someone’s life, there will be confusion and disappointment at being forced to change their habit and buy an unfamiliar alternative. Is it just me or does no one else notice this ridiculous and unnecessary microcosm of soap-free wash products that we’ve convinced ourselves we need?

Before you go ahead and judge me, I am aware that it’s easy to err on the side of negative criticism when it comes to capitalism and consumption. I use the example above because I think it’s relevant for pointing out how the laws of CSB can be applied to something as trivial as body wash. It happens to us gradually and, provided it doesn’t have a negative impact on health, we’re happy to go with it. The health argument is of course the number one influence on whether a cue eventually becomes a part of our everyday lives. The irony is, our impatience as a society means that sometimes we make the mistake of saturating and becoming blind to a cue before the health implications manifest themselves.

Digital technology is the prime example of how big an effect CSB can have on the way we live. The proliferation of mobile phones, just one segment of digital technology, has transformed the way we think, the way we share and the way we communicate with one another. Its impact on us, compared to shower gel, is much more substantial, and most agree that the positives outweigh the negatives… so far. Is this just because the potential health risks simply haven’t been identified and/or confirmed yet? Is the speed of communication and sharing a valid trade-off for the potential harms that long-term mobile phone use could have? How do we decide as a culture where that line is? If it were proved that phones had a detrimental effect on health and wellbeing, would society backflip and find an alternative? How long would it take for us to shift the momentum? I wish I had the answers and, equally, I hope I never have to.

Inevitably I’ve tried to ask myself – is CSB a good or a bad thing? Can it even be one or the other? It’s unlikely we’ll ever have the capacity to know. On one side of the argument, the hypothesis for why the human species has been so successful in reproducing and thriving within its environment is its willingness to adapt and the speed at which it can do so in order to survive. It’s also the basis of evolutionary theory and natural selection. Does this mean that other cultures that are more resistant to adaptation will be lost? For some time it’s been common knowledge in Australia that our indigenous societies have been dying a slow death, their stories and values fading as each generation passes. Is this cultural evolution theory at work?

It begs the question: is the biological metric we currently use to judge the ‘progress’ of humanity – “greater population = greater success” – as relevant to cultural evolution as it is to natural selection? As the global population nears 7 billion it seems that we’re starting to question it ourselves on a much larger scale. How can we sustain this growth as a species? Is it time to keep tabs on how much we’re consuming? We’re being urged to think more holistically about our place in the earth’s ecosystem; learning to live with it rather than on it. Indigenous people the world over share a common respect for their environment and land. There is no sense of domination and rule in these cultural systems, but a sense of harmony and shared responsibility. The CSB model of growth that underpins our western cultural values seems to emphasise capitalism and ownership rather than the values that have sustained indigenous peoples over centuries. These peoples have lived with far less dramatic an impact on the natural world than modern western society, even without the tools and techniques we live with today.

Could it be that the rolling snowball of western culture is reaching a critical mass? At what point does it stop being a snowball of progress and become an avalanche? Will western culture simply become too obese to continue moving forward? Will it die early of a heart attack from an over-indulgence in what was thought to be good for you but turned out to be bad – like lead in make-up, mercury in top-hats or large quantities of offal and claret before bed?

Being aware of western society’s natural tendency towards a CSB consumption model can provide a frame of reference that allows us to think creatively like we never have before. I’m certainly not saying we shouldn’t innovate or create new cues, but the lesson to learn from our own short history is that if we invent something and give it broad exposure, leading to widespread adoption, it will inevitably become part of who we are; it will define our species and the way we live our lives. Perhaps we need to think deeper and harder before we release a cue and saturate ourselves with it, so that the base of our ‘progress’ is built from stone and not from sand. We should be asking ourselves, “What will the impact of a novel idea be once it is a globally accepted norm?” before it becomes one. Being aware of this means we can then ask short-term questions like “Do we really need another shower gel?” and answer them with a more holistic approach, one that addresses the environmental impact of 7 billion people washing litres of shower gel down the drain and into the ocean every day.

Obviously I’m using extreme examples here and I don’t mean to be cynical, nor a preacher of doom and gloom. In fact, like most designers, I’m finding inspiration and motivation in the problem. The values of capitalism are, in some sense, fundamentals of western culture. They have evolved over countless generations and, as such, they prevent a sharp shift in the way our world works; at least within my lifetime. What CSB shows me is that designers are at the coal-face of cues and saturation, and we’re working with an audience that, throughout history, has been responsive to change. Along with perhaps science practitioners, we have the potential to change the way we live and work more than any other profession humans have invented so far. Even with baby steps, we have the ability and the means to help define ourselves as a respectable and sustainable species, and we have the added advantage of knowing that it can happen more quickly in western society than in any other culture in the world.

January 2012

A rose by any other name

The idea of the global marketplace is not new. Where once geography isolated markets and gave competing businesses a smaller playing field from which to make their millions, digital technology now blurs those lines, and businesses large and small compete in a digital Gondwanaland.

For years, Australian retail capitalised on the geographic isolation of its country. Those who could afford to would import goods and sell them on to the Australian people at a mark-up that was normal for the time but is tolerated less and less by today’s standards. The fact that these large, well-funded businesses were the only ones who had access to the products from overseas was their major competitive advantage. With broader access to digital technology than ever before, this advantage is being swallowed up and the Australian people can now purchase from a one-man-band in the United States and have the item delivered to their door at half the cost of purchasing it from an Australian retailer or middle man. In the cut-throat world of business I find it no surprise that business-savvy designers have capitalised on the fear businesses have of losing their once dominant place in the market and have packaged up ‘design’ to sell to those who don’t quite understand it.

If you’re a designer (or even a business-person) and haven’t been living under a rock in the last few years you’ve no doubt heard of phrases like “Lovemarks”, “Design Thinking” and “Collaborative Consumption”. It’s my belief that the invention of these phrases in our new digitally-aware world is no coincidence. It’s not the business moguls who are coming up with these terms, it’s the designers.

As the traditional lines of separation between businesses’ unique selling points blur because of digital technology, many are scrambling to find or create a new point of difference from their competitors. Designers have recognised an opportunity and taken it; design is the obvious choice. It’s a young, immature, mostly misunderstood industry – misunderstood not by designers but by those in business environments. For years designers have been perceived by business as “the people who make it look good”, but ask a designer what they do and they answer “we’re problem solvers” or “we’re creative and strategic thinkers”. Sure enough, those who have been designing have seized the opportunity that digital technology has handed them by branding their ideas and selling them to the highest bidder.

First, let’s start with one of the more common terms in business over the last few years: “Design Thinking.” In 2009, UK-born culture critic Rick Poynor gave a talk in Australia that I was lucky enough to attend. He continually drove home, with the utmost urgency, the need for designers to be aware of the traction that “design thinking” was getting within business environments and its potentially negative impact on our visual culture. Coined by Tim Brown, CEO of design firm IDEO, the term seems to be an attempt to rationalise and explain the process of design to business-minded people.

As designers, we take our problem solving abilities for granted. If a client says, “make something feel lustful,” we know which typeface to choose, colours spring to mind immediately, and we know the right layout and photography that will communicate the feeling. We’re able to devise multiple, equally valid solutions to the problem intuitively and present them with pride to the stakeholder, who says “My wife didn’t like the red colour you used.” We sigh, refine and present something of a compromise. IDEO and Tim Brown have devised a catch-phrase, a series of speaking presentations and self-help books simply explaining this process that designers go through on a day-to-day basis and how it’s supposed to revolutionise business. There is no refinement, there is no “feeling”, no emotion, just words and diagrams. Funnily enough, the business world can’t get enough of it.

I should pause for a moment here and explain that I have nothing against IDEO or Tim Brown and their success with “Design Thinking”, and I don’t believe for a second that ‘design’ can’t revolutionise business. As designers often do, Tim Brown saw a problem and delivered a solution. No doubt it’s made him (and IDEO) a lot of money in book sales and keynote presentations over the last 2 or 3 years. They preach that it broadens the awareness and legitimacy of designers in business, elevating them beyond simple pixel-pushers to strategic business partners whose creativity and innovative thinking can help transform a business. I see it more as a way of making money by using design as it wasn’t intended, and patronising the design community by trying to tell us how we’ll all benefit from it too. Yes, design has the potential to transform business; “Design Thinking” in the hands of people who can’t “feel” design does not.

Tim Brown and IDEO aren’t the only ones to have capitalised handsomely on the juice that businesses are trying to squeeze from the design fruit. Take, for example, “Lovemarks” – a term coined by none other than leading advertising agency “Saatchi & Saatchi”. The Lovemarks website explains it ever so eloquently:

“Brands have run out of juice. More and more people in the world have grown to expect great performance from products, services and experiences. And most often, we get it. Cars start first time, fries are always crisp, dishes shine.

Saatchi & Saatchi looked closely at the question: What makes some brands inspirational, while others struggle?”

Have Saatchi & Saatchi successfully branded successful branding? It sounds like a tongue-twister, but what Lovemarks creates, like all advertising, is a problem. What is a Lovemark? Is your brand a Lovemark? Do you have brands that you love? Can I be a Lovemark? Not only that, they’ve also created the solution. For all the answers to these important questions and more, Saatchi & Saatchi say “buy our book.” According to Saatchi & Saatchi, creating an ethical, responsive and attractive brand isn’t enough anymore; if you’re not a Lovemark then you’re just an ineffective brand.

What’s in it for Saatchi & Saatchi? Profit of course. Book sales, speaking engagements and higher community awareness of the Saatchi & Saatchi name amongst businesses that are looking for that differentiating factor from their competitors. It feels good to be a Lovemark but if you’re not one right now then you’ve got work to do. Who better to help you achieve this almost unattainable accolade than the ones who know most about it – the ones who literally wrote the book on it?

While I’m on this subject, it would be remiss of me not to mention my favourite phrase of this branding movement we’re experiencing: “Collaborative Consumption.” It rolls off the tongue almost poetically. It’s alliterative, catchy and sounds strong and powerful – a real ‘game-changer’, if I may use a business buzz-word to describe a design one. It should come as no surprise that it was named one of Time Magazine’s 10 ideas that will change the world, and businesses are falling head over heels to be a part of it.

My first exposure to Collaborative Consumption was on the ABC’s “Big Ideas” program, where I caught one of its two evangelists, the personable Rachel Botsman, explaining her “Big Idea” at a TEDx Sydney conference. I sat through her 10-minute spiel about what it is, why it’s great and of course… yep, you guessed it, why you should read their book. Her enthusiasm made for a very convincing and entertaining affair. In essence, collaborative consumption is a remarkably simple idea; the reason it makes it into this post on branding is that collaborative consumption is a well-constructed buzz-word (or is it buzz-phrase, or buzz-brand?) for that little thing our mums taught us as kids: sharing.

“Sharing” doesn’t sound innovative. It’s not alliterative, certainly not powerful, and doesn’t come across as ‘game-changing’. In fact, for me it evokes memories of all the times I had to give over half of my ice-cream to my brother when we were growing up because he had dropped his on the floor. Sharing doesn’t sell, but call it “Collaborative Consumption” and, by Jove, Rachel Botsman, Roo Rogers & Co have a book deal, a speaking deal, plumes of media coverage and thus a brand with which to sell an idea that’s existed since the first caveman cooked for two.

Australia’s prime example of a collaborative consumption business model is “GoGet CarShare” – a car-share scheme for those who live within a certain range of the Melbourne CBD. No registration costs, little petrol consumption and no maintenance costs. On the surface it sounds great; collaborative too. Yes, but only for those who can afford to live within the restricted CBD radius and, funnily enough, those with the most convenient access to public transport.

All three of these examples have one thing in common – branding. For many years, companies have used traditional branding philosophy, mostly established in the 1940s and 1950s, to embody and attempt to communicate their values, processes and ideas to a consumer base that didn’t have an avenue to talk back. These of course still work – we feel a sense of quality for couture fashion labels over those whose logos we don’t recognise. The irony is that a breed of hybrid designer/business owners has turned the tables on the business world; instead of focusing on their own logo, on differentiating their own design businesses through visual aids and deep and meaningful metaphors, they’re simply branding their ideas and telling people about them. It seems that this notion of “ideas as products” is what the business world is willing to pay for before they pay for a designer.

The ideas recognised in each of the examples above are not as innovative as many of their evangelists say they are. Creative thinking, consistent and valued customer service and the idea of sharing are not new; they’ve simply been re-packaged by designers who have seen a need in the market – a business opportunity. Is that the role of a designer or a business person? Has the idea of design thinking, Lovemarks or collaborative consumption alone improved the quality of life for all people, or just improved the weekly income of Rachel and Roo, Tim Brown and IDEO, or the team at Saatchi & Saatchi?

Businesses looking for an edge on their competitors at a time when traditional notions of “points of differentiation” crumble around them are having the wool pulled over their eyes. The rapid growth and strengthening of digital technology as a core business asset is scary, and those who are not agile enough to think creatively about how to use it will inevitably fall – see exhibit A, Kodak, and exhibit B, Borders, for proof. These are businesses that were scared to share and engage with their consumers in a 2-way conversation for fear that they’d be loved a little less.

Any designer who understands the way brand perception works would give any business, large or small, the same advice on how to use digital technology to enhance their own brand without the additional cost of book publishing rights or a monumental speaking event:

  1. Be creative and innovative (Design Thinking)
  2. Engage in a conversation with your customers. Use the social nature of people to share your business. (Collaborative Consumption)

By achieving these two simple goals, brand loyalty and customers will inevitably follow – see exhibit C: Apple, Google and Virgin Australia. There’s no need to write a book or tour the world as a speaker. The only question remaining, then, of course, is when will one achieve Lovemark status? There’s only one business with the expertise to decide that – touché, Saatchi & Saatchi. Perhaps we should at least buy their book after all.

January 2012

Well-read is well-fed

The Christmas and New Year break is often my most anticipated holiday of the year. Perhaps I haven’t grown up yet but for me, there’s still something magical about early December. The city gets dressed up to look its best in big bows and tinsel. Tiny over-excited children begin to fill Bourke St Mall, dancing around like popcorn in a pan, about to burst with joy because Santa Claus is comin’ to town. Richie Benaud’s dulcet tones leak out of the television on warm summer days and tumble into nearby rooms of the house. Christmas brings this wonderful air of anticipation and it affords the opportunity to reflect on the year passed – the achievements and the disappointments. In my own reflection, one point has consistently niggled at my conscience: recently, I haven’t been able to write because I haven’t been able to read.

No, I haven’t contracted an illness that has affected my literacy levels; it’s not an onset of dyslexia or attention deficit disorder. Time is not an excuse either, and I haven’t fallen out of love with design – far from it. After taking some time out and looking inward at my habits and the barriers that have kept me from sitting down with a book over the last 5 months, two reasons have distinguished themselves as the root cause of my lack of reading. Firstly, I never developed the habit of reading as a child and secondly, books often fail to hold my interest long enough for me to get to the end of one.

As children, my younger brother and I never read books; not books that were outside of the school study curriculum anyway. It wasn’t that we couldn’t read or didn’t like reading, we were both very proficient readers but we were simply so involved in weekend sports that there wasn’t enough time in the day to sit down with a book. Not having developed this habit of reading in my early years means that today, I still see reading as something you do when you can’t go outside and play. Reading for me is a chore; the last alternative. I feel restless when I’m sitting idle and if I’m not engaged in the content of a book from page one, I find that I force myself to turn pages for ten minutes only to have to constantly turn back and re-read parts because the words that entered through my eyes never made it to my brain. When this happens regularly, it’s hard to see what value, if any, continuing to read will have.

My second barrier to reading is getting to the end of a book. If I don’t read a book from beginning to end I feel incredibly guilty. It sounds ridiculous as I read that statement back to myself, but I find that if I don’t get to the last word on the last page of a book I feel as though I have a weak attention span, like I’ve failed, or that digital technology has altered my brain in such a way that I’ve lost the ability to concentrate on any written piece for more than 300 words at a time. I very seldom blame the poor writing of an author or the fact that I actually have no interest in the subject matter or genre of the book I happen to be reading – both of which are very valid reasons to put down a book and move on to the next one.

Having said all of this, when I do read, I know how I read. I often read two or three chapters in one sitting and, two or three chapters into the next sitting, I forget what I read in the first one. This only reinforces my sense that reading is less productive than physical exercise. It had never crossed my mind to consider reading as exercise for the brain, and that I should force myself to continue despite not being able to reap the rewards of viewing the Adonis staring back at me in the mirror after a boxing or weights session.

Throughout September I tried hard to write. In fact, the harder I tried to write about all the sprouting ideas in my head, the more I failed. It felt like trying to eat fruit from a tree that wasn’t quite ripe yet. The nectarines were crunchy and the oranges were sour. I thought I would just give it some time, incubate a little and wait for the writing to ripen; however, the longer I waited the less important each idea seemed to be. Before I knew it I had waited 5 months and all I had to show for it was a failed, tasteless crop of half-formed ideas and half-finished sentences.

In trying to avoid slipping into a turkey-induced coma on Christmas afternoon, I aimlessly flicked through my trusty Moleskine notebook. It’s where I jot down odd thoughts, observations and ideas while I’m out and about; I never leave home without it. While flicking the pages, refreshing my memory on things I’ve seen and read, I stumbled across a quote from Paul Rand that I had copied from a book of his a couple of years ago:

“If you don’t read, you just don’t know. [Reading] is nourishment when you run – you do not eat bread while you are running.”

With a sporting background like mine, Paul Rand’s running analogy rang familiar bells. It was after reading this quote that I realised how my lack of reading in the last part of 2011 had actually affected the way my brain was operating. It wasn’t receiving its nourishment; it wasn’t being fuelled for the marathon of being a designer every day. It was in a stupor of responding to circumstantial stimulus rather than being trained and honed to find a creative solution to any problem from even the most unlikely of situations. It affected my ability to articulate what it was I was trying to communicate.

I went on to review the “About Me” page of my blog shortly after reading Paul Rand’s quote and rediscovered the reason I had started the blog in the first place – I had formed a habit of reading. Reading had helped me overflow with thoughts about design and encouraged me to reflect, critically analyse and continually question my own creative process. It helped me open my ‘observer-eye’ and re-think everyday occurrences in a new light. I started to use the blog as a way of organising these thoughts and responding to experiences. Improving my writing was only ever meant to be a by-product, and yet here I was, in September 2011, concerned about having ‘writer’s block’.

My wife (an avid reader), as she often does, noticed my frustration before I did. It’s no surprise that my Christmas presents from her were in fact books (accompanied by this wonderful Lego project). She knows I’m not a ‘natural’ reader (especially with the sunny Australian summer weather that drags one outdoors) but nonetheless she understood what my brain had been missing these past months. Perhaps it was also her cunning plan, or maybe a stroke of serendipity, that we managed to squeeze in a trip to Gould’s Books in Newtown while on holidays.

To call Gould’s Books enormous would be an understatement. It’s where books go to die – literally, book heaven. I browsed the aisles of the bookshop for at least 2 hours, an easy task with almost a million books to choose from. I dived into the design books, meandered amongst the biographies and perused the popular fiction only to end up in health sciences (the least likely place I thought I’d find myself). I selected a book from the shelf called Mind and Memory Training by Ernest E. Wood purely because of the cover design; its beautiful vintage colours and type made it jump out at me from the shelf. I almost bought it immediately for the aesthetic alone. I flicked the pages lazily while waiting for my wife to finish her browsing and landed on a section at the back of the book called “The Mind at Work – Reading and Study”. I began to read:

“Read for correction, not for information. Think first then read. When you pick up a book to read, let us say, a chapter on the habits of elephants, you will not immediately open the book and plunge in to the subject. You will first sit with the book unopened on your knee or on the table and say to yourself, “Now, just what do I know about the habits of elephants?” It may be much or it may be little or next to nothing that you know, but whatever it is, you must make yourself review your own knowledge first before you start to add to it. If you have twenty minutes to read, think for 5 and read for 15.”

The concept that the author introduced to me here is what I now refer to as “Active Reading.” Reading non-fiction should first be an activity of affirmation or investigation into thoughts and ideas that I’ve already got, and only after this affirmation should I try to add to them. The author goes on to talk about how preparing for reading is like getting your house in order before trying to fit in a grand piano or another bookshelf. First, find everything you already own that may match this new piece of furniture. Then, arrange it all in a logical order, put each piece in its proper place, and it becomes much easier to see where you can make room for your new furniture. The author suggests that doing this will not only allow you to keep your attention on the subject matter for longer but will increase your brain’s ability to hold the information once you begin to read.

I find that this concept can also be applied to choosing what to read. It would feel odd and unsettling to put a never-before-seen modern chrome and black leather chair in a country-style, wooden and plaid kitchen if you’ve lived in the country-style house for your whole life. Of course, that’s not to say that you should never purchase anything that doesn’t match your existing decor; the styles that you’re comfortable and familiar with. To learn anything about a brand new domain, to introduce any new piece of furniture, one has to start somewhere. Should you wish to add that chrome and leather chair to an old familiar house, you might build a new, empty room first. Take the first moments of reading and relax with the knowledge that you know nothing of what you’re about to read. Perhaps start with painting the walls a sterile white then place the chair inside and spend some time with it – study it from every angle, sleep in your new room, review the chair to become super familiar with it until it feels as comfortable as the country kitchen once did. Perhaps then it might be time to bring home a matching modern lamp and begin to fill out the room with these related items, to build knowledge and familiarity of this new, once alien domain.

Although the concept of active reading sparked some interest in this book, it didn’t address my second barrier to reading: getting to the end of a book. Back in the bookshop I flipped to another page and read a paragraph at random:

“It is always best to have a good book on hand, on philosophy, on history, on travel or science or any other subject, to which one can turn several times a week for mental recreation. There should be no thought of reaching the end of the book, it is to be lived with.”

The sentence “It is to be lived with” brings to mind a long-lasting and comforting relationship – one you seldom get with digital technology but one that books so obviously lend themselves to. In fact, with digital technology, the relationship we have is quite the opposite. If my bookshop experience were a scene from a movie, the angels’ trumpets would have sounded, the choir would have sung in full voice, and a shining light would have beamed down from above to bestow upon me a glowing halo of peace, calm and tranquillity. A weight had been lifted and it was finally OK for me to stop reading a book before I got to the end.

Unlike the way we consume digital content, books seem to imply a linearity – a beginning, a middle and an end; a left-to-right continuum. When was the last time anyone picked up a book and started reading from the middle? It seems counter-intuitive to do so because of the medium in which the content is delivered. It will no doubt take practice for me to adopt some of the advice in Mind and Memory Training. It sounds easy in theory, but I know long-standing childhood habits are hard to change. Perhaps my continuing journey as a reader will lead to an intersection where the habits of digital content consumption that I’ve naturally developed over the past 10 years can aid my ability and confidence to consume the printed word in a more efficient, useful and memorable way. I’m interested to see where this leads.

Just as Christmas is a good time for reflection, it’s also a good time to think about the future and set goals and plans. My plan? To read. Not necessarily read more, but read better. To think and then read. To read and then write. It probably goes without saying, but I slipped the book under my arm and headed to the front counter to pay for it. I spent New Year’s Eve with friends and family and headed home to Melbourne full of confidence that in 2012, I’ll be well-read and well-fed. Come December next year, I hope to have a bumper crop so that I might have the pleasure of indulging and feeding a few hungry minds. If not, I’ll simply enjoy the process of writing and exploring creativity, knowing that each word I type, each thought I explore, is all part of shaping me into a better thinker and a better designer.

May 2011

The role of weight in luxury

As a child, I loved space. I still do. The thought of zero-gravity, of weightlessness, is something I’ve always wanted to experience. It’s ironic that I’m now musing about the “comfort” I’m finding in weight. I’ve been designing interactive digital experiences for 7 years now. I’ve always had an interest in computers and art, so it seemed a natural progression for me to pursue a role in interaction design. It’s really satisfying work – designing, architecting and developing successful online user experiences – but this year something is different. Something is missing in my day-to-day. I find myself pining for something tactile. Something to feel, to hold, something with texture, something that isn’t backlit, something with… well… weight.

I was sitting on the train this morning with my 1200-page copy of Ayn Rand’s “Atlas Shrugged”. My wife was sitting next to me with her e-reader, clicking her way through Moby Dick. A stranger on the other side of me had his Kindle out, making digital notes against a paragraph he obviously thought was important, and the lady across from me was swiping away at her iPhone – I felt like the lucky one. I had with me an object of what some would consider excessive, unnecessary weight these days. This one book is about 2kg in paperback, and yes, it’s just one story. It doesn’t have a library of thousands of books bundled with it, nor can I flip a page with a one-handed click, but it feels comfortable.

Sure, it may be considered inconvenient or impractical, but reading a story isn’t just about information absorption for me; the words conjure images. The feeling of the uncoated, off-white pages, the weight of the pages in my hands, the musty, second-hand bookstore smell that wafts up as I progress through the book – all of it works together to make a strong, multi-sensory experience. I found myself taking immense comfort in weight.

Many of my colleagues and friends say, “digital is the way of the future – you work with the internet and you don’t even have a Facebook account” – and maybe they’re right. Maybe I should. Maybe I should just de-clutter and live in the cloud. However, it’s unlikely that as technology improves, the devices we use will gain weight. It got me thinking: what will this mean? The race is on for hardware producers to provide devices to consumers that are faster, thinner and lighter than their ancestors. No matter how light something becomes, until it’s implanted within us, somehow woven into our biological being, it will always be considered too heavy. Does this mean that the path of technological advance is already laid out before us as inevitable? Are we heading towards a future where interfaces are controlled by brainwaves – we think and therefore it is? Where does that leave our sense of touch?

Historically, weight has been associated with comfort and quality. “Comfort food” is the label given to the meals we prepare that are high in fat, that make us feel good, and that make us gain weight if we overindulge. The paintings of the Botticelli era, which I saw in person in Italy last year, show women with curves and bellies; this “weight” was and still is considered beautiful to look at. When shopping for furniture, you can often distinguish between Ikea chipboard and solid wood by how heavy each is. The heavier the item, often the higher the price – and I’m not talking about shipping costs.

It’s the same with design and architecture books; hard-cover versions cost and weigh substantially more than their cheaper, paperback counterparts. Today’s musicians release vinyl versions of their albums as well as CD and MP3. You’re buying more than the experience of sound with a vinyl – the music feels more substantial as you place it on your record player and pick up the 20g needle to set it on your disc, making your speakers crackle and sing. You’re forced to sit within earshot and listen because in 20 minutes, you’ll have to be there to turn it over if you want to listen to side B. Even money has more value when we can hold it and feel it. A single electronic transaction of $250 makes it so much easier to part with money than feeling the weight of 5 crisp $50 notes transfer from your back pocket to the shop’s cash register. It becomes more than hypothetical visual information; it becomes a physical representation of the work you’ve put in to have the weight in your wallet in the first place. There are many more examples I could share but I digress.

My fear, as the world becomes more digitally engaged, is that humans will become less touch-sensitive, less haptic, less able to feel. Weight is, and will be, seen as an inconvenience rather than a tool that designers can use to create emotions and reactions in people that our other senses are simply not capable of communicating to our brains. My classical training as a designer taught me the devices I had at my disposal to create effective graphic design – colour, line, type and so on – but no one ever mentioned weight beyond light, medium and heavy typefaces. It’s something we feel so naturally that it’s almost too obvious to teach.

Print a business card for a high-priced lawyer or a boutique, international design studio on 120gsm paper and it simply feels wrong. The weight of the card adds credibility to the content. But add 100 grams to the next generation of the iPad and the community will be in an uproar about portability. This begs the question: by what measure do we judge the quality of content we consume in the digital world? It’s interesting that the rules of weight in other forms of consumption – furniture, painting and the printed word – seem to invoke an opposite reaction for interactive digital devices. The flimsier the device, the more we see it as an advancement, as ‘better’. Would an uncoated version of a touch screen feel more luxurious, or would it feel cheap and unfinished? Imagine if every printed document ever produced had a shiny, gloss finish akin to the LCD screens of today’s smart phones.

Are we sacrificing our sense of ‘touch’ and ‘feel’ for convenience and portability, or will touch just take on a different meaning now? Are we about to witness a revolution in making our multi-touch devices more tactile? The quilted back of the Kobo and the plethora of leather, wood and felt cases for all our current mobile devices make it obvious to me that I’m not alone in wanting a more comforting digital experience. It’s clear that humans find comfort in tactile experiences; what interests me going forward in my professional life is how we’re going to address this need if our content is trapped within the confines of the 2D, gloss-coated LCD.

September 2010

Journalism’s role in shaping the way that we think

Is our generation the worst one yet? Or is the next going to get more violent and thus make it even more dangerous for our children? There’s been a spate of recent violence on the public transport system in Melbourne, particularly on the notorious “Frankston” line and the media has not been shy in reporting it. Sensational headlines, sporadic comments by ‘witnesses’ and the same story written in different ways for 3 or 4 days in a row.

Just 20 years ago the internet was in its infancy, an embryo compared to its evolved state today. Static text pages, the god-forbidden splash page and Gecko browsers were it. You needed to be a trained expert in writing HTML, and what little existed of CSS at the time, to call yourself a web designer. Aesthetic sense or graphic design training was at the bottom of the list of required skills, and your knowledge of spacer gifs (or shims as they were called) and your ability to create infinitely-nested tables were the sign of a good web designer. Clients came to you with a brief that essentially said “my business needs to be on the internet.” A web designer was a programmer.

Over the years the role of the internet in our daily lives, as well as the role of a web designer, has undoubtedly changed. We see the traditional ‘web designer’ being split into niche jobs – visual designer, front-end programmer, user experience specialist, back-end programmer, producer, etc. The increasing importance of the internet and the rise of user-generated content has changed the playing field not only for businesses but for individuals as well. The speed of worldwide communication, and the reporting of news in particular, is phenomenal, even by yesterday’s standards.

The ease of web publishing these days seems to be fostering a growing trend suggesting that we, as a society, are getting bored with old content. If something hasn’t changed on a web page in a couple of days we question whether anyone is even updating the site anymore. A Twitter status of “about 3 days ago” probably means that person or business isn’t a serious tweeter – you question whether you should ‘follow’, or return to the site you just visited, if these people are only updating once a week. You know you can get more up-to-date information somewhere else. The expectation around content delivery and our ability as a society to consume this information is, I believe, on an unsustainable curve. An individual simply can’t write quality content every day. When I say quality content I mean the stuff that’s worth reading, that makes you think, that adds value to someone’s daily life.

Humans love to be informed; we thrive on the knowing and sharing of knowledge. “Did you hear about…” is often the opening line around a water cooler, whether it’s about politics, sport or art. We feel pride if we’re the first one to know about a particular event and, conversely, feel ‘out of the loop’ if we haven’t heard something that someone (or several people) have. Media giants know this, and so they feel, as journalists, that they have a responsibility to keep us informed. With the overwhelming pressure on companies like Fairfax Media to keep up the stats for page hits, Facebook ‘friends’ and user readership, it seems they’ve adopted the ‘more is more’ approach – and the problem I see here is repetition. Reporting on the same stories in different ways is skewing our perspective on the relevance and importance of events in our daily lives.

Take train violence: where once a journalist had 24 hours to turn around an article, they can now have it in the public domain in minutes, even seconds if they’re in the right place at the right time. What does this mean for us as media consumers? We get to read the ‘in progress’ edit… for 3 days. It seems that there are 3 distinct approaches to publishing news articles on today’s commercial news websites.

Breaking news is important, so the first article to get published is the ‘we found out this just happened, we don’t know any details but we published it first’ piece. This is the article with the sensational headline “Commuters left shaken by gang attack!” But when you click to read the whole article you get a couple of paragraphs (if you’re lucky) simply describing as many of the W’s (Who, What, Where, Why, How) as the reporter has deemed valid information at that particular moment. Often there’s no witness statement, no police statement, no statement from Metro; for want of better words, it’s simply a ‘tweet’. But, in the grand scheme of things, we associate this news publisher as one with its finger on the pulse, so we’re likely to come back.

Once the website has established itself as bringing you ‘cutting edge’ news, it can relax for the afternoon and do some real journalism. Talk to witnesses and police, get statements and details about what *actually* happened – the stuff that used to make news News. So, 24 hours after the event there’s often a more traditional, in-depth news story published on the website about the events that actually unfolded. The headline is often downgraded to something a little less sensationalist – “Commuters feared for safety” – because, as it turns out, there was no gang, not many commuters needed psychological help and the ‘attack’ involved someone throwing a bottle at a closed train carriage.

Once the real report is presented, a follow-up is sometimes written based on comments that weren’t gathered as part of the original report due to poor reporting, or perhaps as yet another chance to have a shot at the train operator – “Metro has no regrets over train violence” or something to that effect tends to be the headline. Its aim? To keep the story alive and to connect with readers on the common ground of negative social sentiment towards Metro.

What we end up with, over the course of 4 days, is the original, sensational and fear-invoking headline constantly brought to our attention. It feels like 4 days of violence, not one isolated and rare act. If you were to find out that the actual statistics on violent attacks on the Melbourne public transport network were 33 in one million (as reported by The Age), you would think it would be of little concern to the tens of thousands who catch the train or bus or tram each day to and from work. But research shows that violence on the train is of big concern to commuters. What I personally find funny is that the article quoting these figures (written by the same company that wrote the 3 articles telling us how dangerous our trains are) is written with a feeling of bafflement, as if its readers are all a bit paranoid and should relax a bit more on the train ride home.

I don’t expect Fairfax to blame itself for the fear it instils in its readership; if it were my business, my bread on the table, I know I wouldn’t. But responsibility does need to lie somewhere, and with the ability now for talented reporters and writers to publish and re-publish their ideas and research for information-hungry consumers to devour every minute of every day, I believe writers need to take some care to assure the rest of society that violence probably hasn’t increased as much as we think in this new generation. The usual scapegoats of the entertainment industry – video games and violent films – have always been around. In the ’50s, rock’n’roll was considered to be propagated by Lucifer himself! The internet is of course the next thing to be scared of, with its child pornography websites and our complete lack of understanding about privacy and sharing personal information with strangers worldwide, but perhaps the wolves in sheep’s clothing are our journalists and media companies – whether they know it yet or not.