Over the past week, there has been a surge of angst in my Twitter feed. Specifically, design angst. The final article in a five-part series by Eli Schiff, titled The Fall of the Designer, had just been published and was being mentioned frequently by many design professionals I know and follow. Most of these mentions, I should point out, came from people who readily admitted that they had not read much or any of the series, but were clearly upset by the idea the title alone suggests – that the designer's days are numbered. And why wouldn't they be? Every day sees some idea threatening the livelihood – if not the very purpose – of someone making its way, like a virus, over the internet, fueled more by the success of its provocation than any actual affirmation it may find in reality. That participating in the marketplace of ideas online has conditioned us to extreme sensitivity, insecurity, and a tendency toward rash reactivity is an understatement. We should know better, at this point, than to be baited by titles as hyperbolic as this one, whether to hand over anything as valuable as our attention, at least, or at most, our credence. And yet, I imagine this series has received a lot of attention. It received mine. And I, too, reacted abruptly, firing off a series of messages on Twitter scoffing at the extremity of it, the so-called "fall of the designer."
"There is no fall of the designer," I announced. "There is only permanent transition, which is part of what design is in its very nature. All is well."
Though I, too, hadn't yet read Schiff's series, I was confident it couldn't be true. Unless, of course, I've missed something important that he has not. That question was enough reason for me to give Schiff's series more of my attention than a few 140-character bursts had cost. So I read the thing. All 16,000+ words of it. And I'm relieved to say that I don't need to retract any tweets.
But let me get real for a moment. The first two segments – with my first impression coming from the title, of course – had me in a perpetual state of OH HELLLL NO. And by the looks of things on Twitter, I was not alone. I was stopping every two sentences to jot down my rebuttal. But by the third section I realized something: This guy doesn't know. He doesn't know! This isn't a serial essay, thought through from start to finish, deliberately structured to slowly and persuasively guide me from point to point, leading me to a grand conclusion, that the end is, in fact, near for the designer. It may sound that way, especially at first. But don't be fooled. This is the outward processing of an anxious mind. And that's great news. That is something to which I can absolutely relate. Suddenly the chaos of it all felt familiar. The moments of cogency and thoughtfulness alternating with strange, illogical, red-herring conclusions. Who doesn't think like that? Especially when afraid? So, Eli, if you're out there, I get it. Things are scary right now. But you know what? I think it's going to be OK. And even when it's not, life is better when you choose to believe that it's going to be OK. Either way, the future is unknown. So why let the unknown ruin your now by choosing to believe it's not going to be OK?
So I'm not going to dismantle Eli's series, point by point. I thought I was going to do that, but to do so wouldn't be fair to what I've come to believe this series really is – an interior monologue, flailing for meaning and stability amidst the shifting sand of our industry. We can choose to believe in the fall of the designer, or we can choose to believe that the designer can remain calm and stand firm. Which I do. And with that said, there are a few points that Eli raises that I want to address. From now on, I'm going to refer to Eli as "the author," because I'm probably going to get harsh from time to time out of necessity, and it will be easier for me to do that if I put my buddy Eli out of my mind. I'm a lover, not a fighter.
Flat Design
The villain, as far as the author is concerned – the trap door through which the designer falls – is flat design. Imagine that. Flat design. Simple, harmless, Eddie Haskell flat design. Gee. Could flat design really be a wolf in sheep's clothing? Let's give the author the benefit of the doubt and say maybe.
The author begins by homing in on the crown jewel of interaction design in the early aughts: icon design. He says that the integrity of design, as illustrated by the golden age of icon design – all five years of it – has been trampled upon by flat design. That flat design is ideologically opposed to the creative expression that used to make icons a joy to make, but has reduced them to an exercise in corporate obedience. He shows example after example of icons which used to look one way – different, unique – and now look another: the same. Or, perhaps more clearly put, at home in the context in which they are used. To which I wonder, is that such a problem?
A good icon isn't defined by its originality. Or, really – in the context of today's mobile app grid convention – its ability to stand out visually among many. An icon is good if it clearly communicates what will happen when it is clicked or tapped. That's really it.
The author shows two variants of the UMass BusTrack app's icons he designed as representative of what icons could be – ostensibly, that they can be more intricate and/or more unique. OK, but let's call a spade a spade. These particular icons – unique as they may be – are really not that good. They fail at the one job an icon is intended to do. Why? Because they're too intricate. Their busyness makes it difficult to quickly decipher what they represent. It took me too long to figure out that I'm looking at something as theoretically recognizable as a bus. There are so many lines and edges to the image the author created that it looks far more like a labyrinth than a bus. But icons aren't something we're supposed to get lost in, are they? Frankly, the flat design version would be better.
But the author doesn't agree. He says that "the standards for what constitutes 'good-looking' have plummeted dramatically." Setting aside the subjectivity of what is or is not "good looking," and assuming for just a moment that such a thing is objective, it is worth considering that perhaps the standards for what is "good looking" have plummeted because the expectation for how long any visual convention will last has also dropped. With every new operating system's release, we expect something fresh. A big change. Some new pretty shinies to play with. But if we're prioritizing novelty, why wouldn't we expect good design to suffer?
I do find it interesting that flat design is now being blamed for eroding the value of design with a capital D. Especially when most designers were begging for broader visual standards just a decade ago and would have squealed with delight if anything in the Microsoft, Apple, or Google universes looked like it had taken even the slightest direction from anything designed by Saul Bass. But now that this has happened, the complaint is the opposite: that design is too homogeneous. That there's no room for creative expression. Perhaps this is true, but I think the designers complaining about this are misplacing the blame.
"Flat" is a design trend that most benefits the main computing ecosystems in which we all work. It's a means to a corporate homogeneity and ubiquity which, in the course of its work, has become representative of what looks professional and competent in general. So while the corporate computing ecosystems continue to expand for their reasons, we assimilate for our own. Whether Microsoft, Google, or Apple care that the applications we design look more or less like their environments is debatable (Material Design documentation notwithstanding), but it's certainly convenient for them that we care about the graceful integration – down to even the color palette – of our stuff. But ultimately, that we, however subconsciously, submit to that design authority – intentional on their part or not, and I rather doubt that it is – is a testament to the consolidating forces of capitalism. We've accepted a lack of competition on the ecosystem level. The fact that I cannot even name a Martin O'Malley of computing – that three companies have pretty much locked down the world's computing, both in terms of software and hardware – is pretty good evidence of that. It's not a design ideology at work here, it's economics. A design language has emerged, used both by outside designers and developers as well as those within the corporation, not because Apple or Google have an interest in controlling all application development everywhere, but because they have a very clear interest in developers putting their work up for sale in their marketplaces. And developers and designers participating in that marketplace – specifically, designing things that look more at home there – is not some sort of capitulation to the "prevailing ideology." No, they're just trying to get people to use their app. If anything it's a capitulation to the market.
In order to make the ideology charge stick, you need to prove two things: that such a prevailing ideology exists to the degree that alternatives to it are being excluded by the powers that be in a sort of design price-fixing scheme, rather than by the aggregate choices of people like you and me; and that the alternative – all the ugly, clumsy, and just all-around less-good designs that the author laments have been replaced by soulless flat design – is actually better. I haven't seen that yet. Not that I can't imagine what that would be like, though.
But then the question remains of whether flat design, regardless of where it comes from, denudes an essential creative expression within the act of design. Making that case would seem to require parsing out the balance of creative expression, utility, and service that exists within design in a way that covers all acts of design. A tall order. Is it even worth doing? To decry a current trend in design because it doesn't allow for enough creative expression seems completely off-base to me, so long as that conversation continues to be held in the context in which it is: the computing monopolies I just mentioned. So what if some designers have chosen to iterate toward visual anonymity, slowly but surely turning something that looked unique into something that doesn't? Nobody forced them to do that, and nobody is forcing us to. But creative freedom – which, let's stop for a moment and honestly acknowledge, we have in historically unprecedented abundance – has to be balanced with utility and fashion. If a design's expressiveness exceeds its usefulness, well, then it's not going to be used. And if people find something else nicer to look at, then it's not going to get used, either. Homer Simpson designed a pretty darn original car which embodies this point rather completely.
Fashion, though, is a metaphor the author himself uses. He suggests that user interface design isn't enough like fashion, the implication being that designing an icon, for example, should come with the same generous latitude for expression that designing a coat does. But I wonder, how much latitude is there, really, in designing a coat? It must fit the human body: wrapping around a vertically oriented torso, with extensions for two arms, and a way to fasten it shut to keep cold and moisture away. From a structural standpoint, that doesn't allow for much "creativity," does it? Other than its size, length, type of fastener, and whether or not it has a hood, most coats are structurally and functionally the same. Now, they may be made of different materials, in different colors, with different patterns, but little of that has major functional implications beyond the subtle (camouflage, sporting materials, and the like). So there's huge aesthetic range in the act of designing a coat. And somewhat fewer material options. But little to no room to completely reinvent the coat. Nobody's going to add or remove a sleeve. So how does this apply to an icon? Well, its emphasis on utility is certainly the same. You've got all the aesthetic freedom in the world, but if you can't read an icon – if you can't quickly understand what it represents and what will happen when you click it – then it doesn't work. It doesn't matter how many hours you spent making an original and detailed drawing of a bus to go on it.
To further ground this ideology charge, the author quotes Frank Chimero, who writes:
"The web is forcing our hands. And this is fine! Many sites will share design solutions, because we're using the same materials. The consistencies establish best practices; they are proof of design patterns that play off of the needs of a common medium, and not evidence of a visual monoculture."
The author is right not to accept what he calls a "visual monoculture" as simply the manifestation of what Frank calls "best practices." Flat design is no more a best practice than it is for a man today to grow a beard and wear dark-rimmed glasses, a plaid button-up shirt, and tight, tapered raw denim pants. It's fashion. There's that word again. Fashion is great, but fashion comes and goes. But Frank is also on to something with the idea that the web has "forced our hands." I think there's some truth to that, because in light of the ever more complex job that designing for the web has become – because the web is no longer something we experience in the same rectangular box on a desk – finding opportunities for reducing complexity and labor makes sense.
All kinds of things can be done today to push far past the so-called minimalism of flat design. Whether they should be is a matter of context. Whether they will be is a matter of choice. And that choice is always going to be made in light of constraints. Time. Money. So when someone like Frank says that the web has forced us into homogeneity, well, I'm not sure I'd strongly disagree. Because given the choice between being unique and being accessible, I'm going to choose accessibility. And given the choice between shipping on time and on budget and creating a million bespoke versions of my app, well, I think the choice is obvious. To call flat design "illogical" (the author's word) because it doesn't take advantage of what can be done now thanks to the many affordances of better CSS, better image types, and better screen resolution is actually kind of illogical, in the same way it would be illogical to say that Warren Buffett is illogical for choosing to live in a modest Colonial-style home in the Midwest rather than build himself a Taj Mahal. Dude's got bank, but who are we to say what he should spend it on, or what he should like? The point is, you can't really blame people for making a decision that prioritizes getting things done over making a statement, or doing something new, or preserving some high-design integrity. Sometimes there just isn't time for that. It's disappointing, and we should talk about what can be done about this, but there's no point in trying to blame a design trend or any individual(s) for what is essentially the human predicament.
Code
Let's change the subject and talk about code.
The author cites an industry preference for code as yet another way design is being devalued and threatened. He struggles through what the priority between design and code should be – whether it should be expressed as Design -> Code or Code -> Design, as if sequence determines value. And while I agree that this struggle is playing out in design and production team workflows everywhere – everyone earnestly trying to get good things made quicker – I think it's, again, the result of economic forces, not some vast, right-wing conspiracy against design or designers. Like flat design, this phenomenon has more to do, I think, with entrenchment in particular computing ecosystems, and the resulting commoditization of design work through things like frameworks and icon sets. These things exist to ensure stability and increase efficiency, but they rest upon a commitment to other things – a commitment that feels more and more like bedrock (and almost disappears) as more and more people make it. So, at the surface, you've got something like Ruby on Rails, which makes development lightning fast but sits on a stack of commitments: to a database like MySQL, to a web server like Apache, to Linux, to C. And mostly, nobody needs to care about that. But commitment hierarchies like that aren't really set in stone. As the usefulness of the outer layers – the frameworks that abstract design and development in exchange for speed and ease – ebbs and flows, so do the fortunes of companies who make software and hardware with their own commitment hierarchies. So long as the next piece of popular hardware is theirs, a company can maintain commitments to computing ecosystems, which of course makes it impossible for designers and developers to do anything other than make the same commitments. If designers really want to take some power back, they'd worry less about the mounting pressure of programming and think more about how the next big piece of hardware might come from someone other than the big players, and how any code-first dogma would collapse as entirely new commitment hierarchies are built.
Of course, I won't hold my breath waiting for that to happen. And in the meantime, we're going to have to deal with the infighting between designers and developers, and all the strange dogmas that emerge from it. Like the "No PSD" movement. I won't defend that in the slightest. It's rash and silly. But I will say that there are threads within it that are absolutely right on. For example, of course the static nature of Photoshop is a problem. But that's not because "planning out a design visually is a bastardization of the purity of the code" (the author's words summing up the supposed perspective of No PSD adherents). No, this is a straw man. For years, we designed static layouts in Photoshop only to find them sorely incomplete when it came time to interpret them with code. But that's not because code is a better tool than Photoshop. They do completely different things; comparing them makes very little sense.
Ideas go into Photoshop and stay there until they are interpreted and re-articulated with code. No problem there. The "purity" – if it exists at all – is in the idea. But if the idea isn't taking into account all the in-betweenness of use, then its purity is beside the point. It's glib. Photoshop alone is an inadequate tool for interaction design because it is a tool that excels at visualizing things in a static environment. Snapshots of possibilities. It's not good at documenting interactions. It isn't even good at faking them (animations can be played forward and in reverse, but they're little more than an idealization of what could happen). But neither, really, is code. In between what Photoshop is good at (image creation and ideation) and what code is good at (translating structure, logic, and even aesthetic properties into machine language) is this vast, messy world of people and psychology and actions. And that is where Design is, at its heart. Unfortunately, that is also where design is hardest. Which means if you're going to write a few unnecessarily shrill blog posts about the state of design, you have to reduce design down to something much smaller than it is.
There's still a place for Photoshop and even mockups, so on that, I'm in agreement with the author. But they can no longer stand alone. They have to work together more closely with code in order to deal well with the increasing complexity of computing and interactions. And that collaboration is an area where, I think, the author takes a balanced view:
"It should go without saying that everything designed on computers relies on a foundation of code, and a responsible designer should be aware of the performance limitations of the code they are working with or depending on. Yet, in order for a designer to be wary of the current limitations of code, they need not be an expert programmer, as many in industry would demand. The solution is for greater communication between the two complementary practices."
Exactly! But not everyone thinks that is enough.
The author goes on to mention Bret Victor's conclusion that designers unable to "ship" anything "as is" are "helpless" and "timid" as a result of not being "self-reliant." Well, that is just bogus. No one is self-reliant. And to try to be so in the complex environments in which most design problems exist today would be naive and irresponsible. Why anyone with experience would imply that a good designer is "self-sufficient" is beyond me. I'm with the author in repudiating that. But I also doubt that this opinion is held by enough people to be as much of a threat as the author thinks it is.
In fact, the roles one could file under the category of Design are more diverse and complex than ever. Described without using the titles we love to bicker about, Design today involves maintaining a point of view and making plans for the future based upon it, having an aesthetic sensibility and making choices in how it is expressed, understanding functional options and making decisions about which are used, organizing and arranging information, observing use, and gathering feedback and synthesizing conclusions. I'm sure there are others I've missed. The point is that none of these roles requires programming skills. However, many are enhanced by an understanding of the programmer's world. A designer working today in the digital space who doesn't pursue a deeper knowledge of the technology under the skin is just as unprepared to do her job as an industrial designer who doesn't understand how aluminum works. End of story. As for tools and titles, good lord, who cares?
Automation
As distinct from code, automation is the threat that comes from the existence of software anyone can use to design something themselves.
Perhaps this is where the illusion of self-sufficiency comes from. Within most design teams and organizations that I've been able to observe, I have seen no examples of self-sufficiency. But I've certainly heard designers boast of having made a website or something "all by themselves." But then I'm like, yeah, but you used Squarespace. That's cool and all, but that's not self-sufficiency. That's the illusion of it, because Squarespace has abstracted away virtually all the technical expertise you don't have and made clicking a button the only thing required of you. So great job.
But is something like Squarespace a threat? The author worries that it is. In particular, its logo creator tool (which is funny because this is one thing Squarespace has done that I've paid absolutely no attention to whatsoever). But it does exist and yes, you can make a logo with it. Is that a threat to designers? Sure. In the same way that a McDonald's hamburger is a threat to a chef who makes a much better burger with locally sourced ingredients and the meat of a nicely treated cow that costs ten times as much. Is one clearly less than the other – in quality, integrity, value? In my opinion, yes. The McDonald's burger is about as devolved from what it could be as something like that can get and still be edible. But there's always going to be a place for that in the market. Both out of necessity – some people simply don't have the means to think of food consumption any other way than lowest possible cost per calorie, and it verges on elitist for me to maintain the opinion I do about McDonald's in light of that – and out of nature. On that last point, it is in our nature to choose things, the reasons for which cannot possibly be understood and respected by all. So McDonald's counts among its customers those who simply cannot afford to eat elsewhere – perhaps even their own kitchens – which is a blight on capitalism and our culture (again, not a design problem), and those whose taste prefers it. Means doesn't beget taste, nor does lack of means preclude it. Taste is subjective, mysterious, and a major market factor. So, like McDonald's, Squarespace is, perhaps, a threat to a certain stratum of opportunity in the marketplace. Those who cannot afford the cost of hiring a competent designer and paying for the time it requires (of the designer and client) to produce a well-designed identity system may, though they recognize the value of that approach, choose to go the five-dollar logo route instead. And someone who has money to burn may do the same thing. This is not a problem that good design, nor any amount of designer lobbying (or complaining), can solve. It's the other edge of the sword of choice.
But the author worries that the influence of automation isn't limited to the choice a client has between hiring a designer or doing it "themselves." I'm sure that automation has influenced the perception of value in the client/designer relationship, but I'm not sure it has to the degree he thinks it has. He writes:
"Back then [2001], it must have been inconceivable that in a decade's time clients would be so forcefully urging icon designers to slash prices."
Again with the icon designers. OK, well, huh. I actually had a job designing icons in 2001 and was grossly overpaid. So much so that I recall feeling obliged to take a long time to validate what the client was willing to pay – I was just a kid, what did I know? – which of course resulted in overwrought versions of icons I could have just lifted from the existing Windows ecosystem. Of course I never would have done that, because that would have been theft. But I wonder. If my save-to-folder icon basically Taco-Belled the one in Windows 98 – most system icons back then were just remixes of the same basic elements of folders and arrows – did I really avoid the ethical quandary I thought I had? I mean, I got paid big bucks to spit out unnecessarily unique versions of things that already existed. Nevertheless, just a couple of years later, I realized that would be the last time I felt overpaid for design work. I routinely encountered clients who wanted a logo and wanted the experience of getting it to be like receiving a Krispy Kreme donut off the conveyor belt. Same goes for the price. What's inconceivable to me is the notion that a lack of understanding of the design process and the resulting price pressure is at all a recent phenomenon or causally related to the 21st century design bogeymen – flat design, code supremacy, or automation. Please. That's like blaming the internet for the exploitation that makes a UNIQLO t-shirt cost less than $10. Except that The Gap has been doing it the same way, by way of Old Navy, since before people bought clothes online. The digital part just makes it more convenient. The bargains come off the backs of people living pretty miserable lives in far-off places. These disparities have to do with the choices we all face living in a world that is increasingly more affordable for those who can afford more, and increasingly arduous for everyone else. How many designers who justifiably protest the $5 logo keep their UNIQLO t-shirts safely in their blind spot? Right.
To that, the author says, it's not just the economy, it's the people.
"It should be unsurprising that clients both large and small react with hostility towards visual designers after the emergence of flat design. Anyone with their eyes open will recognize that material costs for producing design continue to shrink towards almost zero once a designer has a computer, a subscription to a visual design program and a few drawing implements."
Clients both large and small react this way? Perhaps. But clients large and small always have. It really has nothing to do with material costs. The material costs of designing a logo really haven't changed much. Even with automation. In fact, one could make a case that they've grown. Sure, we use digital tools now, but all told, our digital tools today can be pretty expensive. Laptops and software – if we're talking about just the things you need to make a logo – aren't nearly as cheap as pencils and paper. What's changed are the production costs for identity application. Printing. Signage. Keeping it all under one roof. Marking all of that up – the full-service thing that defines the last century's notion of what an agency is – was always the agency business model. Price pressure hasn't come from technology going after the magic of design. The magic has always been hard to price. Price pressure has come from the fact that technology has gone after all the stuff that used to be easily and profitably marked up. What this most immediately threatens is the possibility of making a profitable gig out of a logo design job for a client that doesn't need much application. If all it's ever going to be is a JPEG in the header of a website, then that significantly reduces the scope of the problem. And yes, the cost, which – if you don't count the not-insignificant investment in a laptop and Adobe Creative Suite or whatever – could be thought of as approaching zero.
Still, someone who values design might be willing to pay a value-based price for something truly unique and intelligently conceived. But keeping a business afloat with those beautiful unicorns as the core clientele is going to be quite challenging, because they're rare, especially in light of the many purveyors of the $5 logo. On the other hand, going for larger clients has its own set of challenges, not to mention those of actually doing the work they need – designing identity systems that span a wide and diverse set of surfaces and contexts. That work cannot approach zero, and that has nothing to do with the computer and software a designer uses. It has to do with the massive increase in complexity, time, and material costs a project like that requires. That hasn't changed much and isn't likely to, regardless of how much longer flat design is in vogue.
What I find most strange in the author's assessment of these issues is that he goes on to compare designers who believe they are "immune to the impact of flat design" with designers who thought they were immune to the impact of digital publishing. Perhaps designer obstinacy today and then is comparable. Obstinacy is obstinacy. But is the contemporary trend of flat design truly comparable to the paradigm shift from print to digital publishing? Hardly. Again, at the root of all of this is economics. Not just that digital tools are cheaper than analog ones. Nor that digital processes are more efficient – and thus cheaper – than analog ones. In some cases, these things are obviously true. But for them to be the sole culprit would mean that all previous demand for design services has come to find today's fast and cheap supply sufficient. That is not the case. Today there is new demand for design services. There are more buyers than in decades past. Which opens up lots of opportunity for supply – mostly at the low end, but also at the high end. After all, I partly reacted with surprise to this idea that today's digital designers who feel confident in the shadow of the supposed flat design juggernaut are just as glib as yesterday's print Pollyannas because there are plenty of designers doing quite well today who clearly don't get digital at all! Some of today's most celebrated designers operating out of our most iconic and revered studios continue to be handed the keys to big, equally iconic brands, yet deliver work that clearly falls apart – both practically and conceptually – once it leaves the safety of the printed page. You know who I'm talking about, so I'll leave it at that. But hey, plenty of "big" clients are still paying BIG money for this incompetent, anachronistic work. The internet hasn't changed that; I doubt flat design will, either.
Ah, the uncertainty of it all.
Where Do We Go From Here?
The author eventually levels off with a discussion of how technological and economic forces have, all told, removed certainty and stability from the careers of most designers. And to be fair, he also (finally!) acknowledges the tenuous connection between flat design and much bigger things like technological change and market forces:
"It is true that the two are not themselves related, but the quality of design they inspire is quite similar. Moreover, the expectations they set for what qualifies as good design in the minds of clients are similar. Clients cannot be blamed for drawing improper conclusions about the value of design when it is commoditized."
It's almost as if, over the course of the series, he has worked through this issue in part, but not in full. After all, he began by clinging to the artisanal craft of late-90s icon design, almost defining design as a creative prerogative, then transitioned to lamenting the apparently hopeless plight of design in light of the operating system and responsive design, and finally emerged calling for something I can absolutely advocate: a bigger and longer view of the role of design in the world, and more proactivity on the part of working designers. I'm glad that happened, but perhaps a less definitive title for this series would have been in order. Like, I dunno, "I Don't Know What's Happening But Am Going to Work Through It Until I Figure Something Out." That certainly would have offended fewer designers.
The issue of infighting among designers as a whole – over what should be done, who should do it, and what we should call one another – is frustrating. It only increases entrenchment and gridlock. But it's the result of things getting ever more complicated and the responsibilities of the "Designer" ever more overwhelming. That specialization would be required of a designer in the midst of all of this should be no surprise. But it's not going to be easy, as specialization requires that we let go of some things we would like to continue doing (and mostly, controlling, because this is often more about love of control than love of the work) and perhaps take on things that we'd rather not do. And, even more challenging to our egos, it requires that we cooperate with those who are new to the field thanks to the opportunity all this complication has created.
Too many critiques of what design is or should be, or how it functions in the marketplace or should, speak more to the org chart to which the critiquing designer is most accustomed than to any meaningful truth about design at large. Or, in other words, most design dogma speaks more to the fears of the preaching designer than to any meaningful truth about design at large. Our isolated, subjective views of the world distort the greater truth, and the only fix for that is to get together and figure this stuff out in community. We're on our own, together.
Web design hasn't sat still long enough to give us a Saul Bass or a Paul Rand or a Milton Glaser or a Charles and Ray Eames. We lack digital design heroes to show us the way. We have Moore's Law to blame for that. Every tech shift has demanded a reaction from graphic design. But our 20th century print design heroes had centuries of stability behind them, which supported observation and mastery. We haven't had that, and we're certainly not doing a great job of having the patience to get it. So let's give ourselves and each other a break.
Between the author and all the people he quotes, a wonderful and deep conversation about the nature of design in this world could happen. But instead, because of some need to plant a flag of opinion with a capital "O," we get a tangled mess of words ripped out of context and pitted against one another in order to support what is ultimately a cry of fear. Of course things are scary. That's not new! Design is scary. To design is to form with intent. Or, in other words, to bring order to disorder. And so we can choose to fall into disorder, or to remain calm and stand firm. Who's with me?
Heavy Rotation: I've really been enjoying BBC Radio 4's Coast and Country podcast. If you like field recordings of walks through quiet places narrated by soft, English voices, then this is for you.
Recent Tabs: This first one came by way of my stepdad (who reads this newsletter sometimes; hi!): Compulsive behavior sells, which makes me sad, even though I acknowledge it to be true. I even left the article partway through to look at beach houses on Expedia.com, which the article goes to great lengths to mention is really good at sucking in and addicting pathetic web-browsing sapiens just like me. The Internet Archive got a redesign. "As Nepal collapses and Baltimore burns, we are still able to do little more than document it." Peter Stoyko's latest is good. How many decisions does a person make in a day? Beyond interaction. The robot alternate universe where everyone watches Friends alone.