The term “e-commerce” is rarely uttered in 2015 because online retail has become ubiquitous. That wasn’t the case in 1998.
Six long years after Book Stacks Unlimited went live and three years into the early days of both Amazon & eBay, a large segment of the population was still wary of making purchases online with their credit cards.
This was the climate when Pierre Fay walked into the Organic Online office in New York with a big idea.
Pierre’s goal was to change the way people discovered fashion & prescription eyewear—moving from traditional brick-and-mortar browsing to a complete online experience. Just months prior to signing with Organic, Pierre had struck a deal with Lenscrafters to host an Eyeweb co-branded kiosk, which would allow potential customers to precisely measure their facial/pupil structure and then capture an image that would be uploaded to the cloud (yes, we had “clouds” in ’98).
Upon returning home, the user would log in to Eyeweb.com, browse an inventory of thousands of frames, virtually try them on against their freshly captured head shot, and ultimately, purchase the product from the comfort of their home.
I was assigned the project under the mentorship of Organic’s Chief Information Architect, Robert Fabricant, as this was my first foray into IA after shifting over from art direction at another interactive firm. The work was intense in terms of getting up to speed on how to design a search and discovery mechanism around SKU-based inventory—rich with descriptive attributes that could drive our frame advisor engine. Between Robert’s direction and the amazing collaboration within the Organic team, I settled in for an interesting ride.
We started on the ground floor, as the branding was developed from scratch via a thorough brand exercise with Pierre. As Fanny Krivoy and the visual team led that process, Robert presented our organizational concepts to Pierre through conceptual diagrams, and as we garnered consensus, I defined the organizational principles, recall methods, hierarchy in the interface, navigational affordance, widget schematics, etc.
With a boutique focus on one retail object, Eyeweb allowed us to draw focus on the homepage to specific frame-centric features, and then divide the global navigation between those features and secondary support locations. Settling on the right nomenclature was important, as we were transitioning people from their mental model of high-end, real-world shopping to a very early online experience. Areas such as the Personal Collection needed to impart the same white glove feel they were used to experiencing in person.
Web design challenges in 1998 were plentiful, but the most ubiquitous—regardless of project type—was limited display resolution (more than 70% of the market had 800×600 px displays or smaller). As such, we had strict constraints when presenting information within the limited real estate available. This not only impacted the structure of the active elements in the interface, but it influenced how much white space we used to compensate for crowded aspects of the interface.
As the user made their way down to the frame detail, we featured the product image prominently, building the actionable affordances of the interface around the frame itself. Before time-on-page was an understood analytics term, we shot for stickiness, which doesn’t seem pertinent in the retail world, as the ultimate goal is to convert purchases. But stickiness also applies to users easily moving from one interesting product to the next in their browse > purchase lifecycle, which increases the potential for conversion.
While we didn’t have the capability to present other frames dynamically via collaborative filtering—or even a simple (in 2015 terms) in-line attribute co-occurrence driven display—we knew that prominently exposing frame attributes was important for sparking secondary discovery.
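The attribute co-occurrence idea we couldn’t yet build is straightforward to sketch today: rank other frames by how many descriptive attributes they share with the one on screen. A minimal illustration, using invented SKUs and attribute names rather than Eyeweb’s actual schema:

```python
from collections import Counter

# Hypothetical frame records; the attribute vocabulary is invented
# for illustration, not Eyeweb's real SKU data.
frames = [
    {"sku": "A100", "attrs": {"round", "metal", "vintage"}},
    {"sku": "B200", "attrs": {"round", "plastic", "vintage"}},
    {"sku": "C300", "attrs": {"square", "metal", "modern"}},
    {"sku": "D400", "attrs": {"round", "metal", "modern"}},
]

def related_frames(sku, frames, top_n=3):
    """Rank the other frames by the number of attributes they share."""
    current = next(f for f in frames if f["sku"] == sku)
    scores = Counter()
    for f in frames:
        if f["sku"] != sku:
            scores[f["sku"]] = len(current["attrs"] & f["attrs"])
    return [s for s, _ in scores.most_common(top_n)]

# Frames sharing the most attributes with A100 come back first.
print(related_frames("A100", frames))
```

The same shared-attribute count, computed across the whole catalog, is the “co-occurrence” signal that would power a related-frames module.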
When we began working on the most compelling form of discovery on the site, the Frame Advisor, Pierre expressed that he wanted something remarkably different, so different is what we targeted.
My sketches began by crafting an explicit linear narrative, using multiple choice questions centered around the notion of style and functional preferences, leveraging the same set of frame attributes exposed on the detail interface and personal collection—to return a set of frames.
Once we left Planning and landed in Design, we took that architectural foundation and built upon it. Our visual designer evolved the explicit questions & answers into an esoteric, implicit input process—using the grid pattern we had established across the site, but in a way that relied entirely on visual clues mapped directly to SKU attributes—moving the user through the recommendation engine.
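Mechanically, that implicit input process amounts to mapping each visual cue to a set of SKU attributes, then returning the frames that satisfy everything the user’s picks imply. A hedged sketch, with cue names, attributes, and frames all invented for illustration:

```python
# Each visual cue in the advisor maps to SKU attributes.
# Cue names, attributes, and frames are invented for illustration.
choice_to_attrs = {
    "beach_photo": {"casual", "plastic"},
    "boardroom_photo": {"formal", "metal"},
}

frames = [
    {"sku": "A100", "attrs": {"casual", "plastic", "round"}},
    {"sku": "B200", "attrs": {"formal", "metal", "square"}},
    {"sku": "C300", "attrs": {"casual", "metal", "round"}},
]

def advise(picks, frames):
    """Return SKUs whose attributes cover everything the picks imply."""
    wanted = set().union(*(choice_to_attrs[p] for p in picks))
    return [f["sku"] for f in frames if wanted <= f["attrs"]]

# Picking the beach photo implies casual + plastic frames.
print(advise(["beach_photo"], frames))
```

The user never sees a question; they see imagery, and the mapping table does the translation into inventory attributes behind the scenes.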
The result was compelling and different. If this was designed for 2015, I’m positive that the experience would’ve held up to the times.
With the co-branded kiosk, we were working with a much more task-based interface with interesting environmental considerations, as the user was operating in public and needed to precisely map specific areas of their face for the system to prep their image for the try-on interface. We homed in on a deliberately simple interaction model and interface, with zero extraneous copy or features, so any shopper could successfully use it. The last thing we wanted was visible frustration from the least tech-savvy users, surrounded by other potential customers.
We carried through the design patterns from the site for both brand and navigation consistency, and designed the Java interface to present a simplified mechanism for dragging points into position. The step-by-step process was clearly labeled and deliberately simple, providing easy-to-find navigation both forward and backward in the process. Yes, we wanted conversion, but not at the expense of usability.
17 years ago we created a true platform—one that had interfaces used in the public space, and online as a true shop, browse, and manage service, while leveraging the then-magic of cloud immediacy to provide a compelling experience. There aren’t many platform experiences that survive over a period of 17 years, and while our Eyeweb experience isn’t one of them, some variation of the business is still operational, so we must’ve done something right.
Silicon Alley circa 1998; the halcyon days of agency product design and development.
Damien Newman’s well-known squiggle is a 30,000-foot view of the process of design. It may be crude, but it’s absolutely clear: the unknown heading into Research leads to the sparks of Ideation and culminates with the clarity and details of Design.
It’s a universal axiom of both art and design—we all start in a fog as we attempt to narrow down to any clear form of visual or behavioral communication—but what it doesn’t begin to touch upon are the moving parts found within the various methods we must employ in order to participate within & contribute to a product development team.
I’ve been thinking about the evolution of my own creative method for a while now, as I’ve moved from personal expression to participating within and creating Design methodologies for both start-up & large organizations over a 20+ year-long career. So if you’re inclined to join me on this journey, be warned up-front: I’m going to start at the beginning and that goes way back.
Being an Artist
Before, during and after art school, I was a “drawer” on most days, and an illustrator when things got fancy. As a drawer, I’d go directly to pad with either pencil or rapidograph, often to remove cognitive processing altogether with blind contour drawings—my portfolio for art school was filled with them. The subject matter was mine to define, and I finished when I felt it was complete.
As I became more serious, I’d spend time understanding the subject matter, find inspiration, and prep in a number of ways—taking photographs, sketching, cutting images out of magazines, you name it. If the illustration was for a class or a client, I’d spend time up front to understand the concept I was attempting to illustrate. My creative process evolved to include other people’s input for the first time.
Undergrad Advertising Design
My illustration focus at Crouse College quickly took a turn to advertising design, and with it, a whole new set of creative challenges and processes were introduced.
We discussed campaign objectives and parameters in studio, sequestered ourselves to research, sketch and write copy, and then returned to present three concepts and be critiqued by our peers. Repeat, narrow down, complete. It may not sound like much at first, but being able to clearly pitch an idea, and then take criticism as constructive rather than negative, is almost as important as the ideation process itself.
Towards the end of my undergrad career, we began to collaborate with Newhouse copywriters, prepping us for the machinations of a real-world creative team (I never pursued a traditional print & TV agency career). With no developers in the mix, this was a commercial design process, but one with greater degrees of complexity than illustration when it came to client input and iterations towards sign-off.
Directly after undergrad, I spent my days designing CD-ROM games after landing an internship at a design and production studio. The biggest contrast between then and now is that the notion of a downloadable patch or a version update simply didn’t exist. We were pre-web, so production houses didn’t staff and operate with the concern that developers constantly needed to be busy shipping product.
We began each project with a singular clear goal of burning a gold disk to take to a reproduction house on a specific date far off in the future. There was no iterative approach, no outcome to meet other than great reviews. We were 100% big bang.
Our creative process looked something like the following:
the writer developed a linear script
our team translated it into a non-linear, node-based script
we storyboarded key frames and the interaction model
environment, sprites, and animations were identified
we then focused on final animations and artwork
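The translation from linear script to node-based script boils down to building a directed graph of scenes, where each player choice points at the next node. A toy sketch of that structure, with scene names and choices invented for illustration:

```python
# A node-based script is a directed graph of scenes; each choice
# names the scene it leads to. All content here is invented.
script = {
    "intro":   {"text": "You wake in the lab.",
                "choices": {"open door": "hallway", "check desk": "desk"}},
    "hallway": {"text": "The hallway is dark.",
                "choices": {"go back": "intro"}},
    "desk":    {"text": "A keycard sits here.",
                "choices": {"open door": "hallway"}},
}

def walk(script, start, picks):
    """Follow a sequence of player choices from a starting scene."""
    node = start
    path = [node]
    for pick in picks:
        node = script[node]["choices"][pick]
        path.append(node)
    return path

# One possible playthrough: detour to the desk, then the hallway.
print(walk(script, "intro", ["check desk", "open door"]))
```

The writer’s linear draft becomes one particular path through this graph; the design team’s job was to add the branches.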
As we made progress, the development team would ramp up their technical requirements and begin to shape their architecture; new details from the design team would impact their approach and they’d iterate accordingly. We would intensely collaborate, and the engineers prototyped both good and throwaway code. The process felt much more like collaboration found in film production than a web-based product.
Our reputations and compensation were dependent on releasing a finished product that found a market. Our distribution partner placed the product into the limited sales channels that existed prior to the robust long tail of online retail, so we only had one chance to make an impression with both reviewers and customers.
So if you ever see me twitch when an MVP is mentioned, that’s simply muscle memory from a product age long ago. It subsides with a good night’s sleep.
As east coast game production dried up, I transitioned to the web and cut my teeth within a couple of agencies; first as an interactive art director, and then as an information architect.
The creative process as an IAD was somewhat similar to undergrad—though pitches and critiques absolutely stepped up in intensity—and my experience in the gaming world prepped me for collaborating with a development team. Similar to gaming, online campaigns, websites, screensavers, etc. were a first-impression pursuit—the notion of iterative updates was nascent in adoption.
As an IA, the work was much more focused on defining and then translating requirements into sketches, schemas, taxonomies, etc., but the ideation, presentation and execution processes were highly similar. The new element was documenting user requirements as an explicit point in the process. Functional requirements (user + business + technical) and strategy docs were the foundation for how we ideated solutions.
= = = = = = = = = =
In the late nineties, when agencies were in the luxurious position to offer soup-to-nuts services, development teams didn’t have the same agency to drive innovation as they enjoy today; they were mostly beholden to the client initiative, which rarely left room for experimentation, as Creative drove the process. Aside from feasibility reviews of deliverables, and early exploration to settle on technical requirements, innovation was boxed within the core vision of the client, the strategy team’s position paper, and the design process.
Only the best of the best shops could honestly tout a record of pushing the boundaries with their work, and those shops were doing it with amazing collaboration and client partnerships across the board. My time at Organic exemplified such a business environment. These days, firms such as IDEO and frog have diversified with such quality and talent that they can demand the proper price to bring innovation to the table.
Smaller studios and boutiques have to be creative in how they sell their creativity, otherwise they fall into a game of pitching beyond their capabilities. Clients may invest upfront to bring in a shop to provide specific thinking or output—forever the value of an external team—but businesses are leaning more and more on newly formed in-house product design teams to take on the majority of design thinking and execution capabilities.
While traditional full lifecycle innovation is difficult in the agency space, outside consulting can foster innovation within this climate through ideation around notions such as brand experience, behavioral and industrial design, and technical approaches.
The challenge ahead for studios will be to move to a business model and underlying IP creation method that can adapt to the shifting world around them to redefine what innovation looks like and how an external team can provide it with clear and understandable ROI.
= = = = = = = = = =
Engineering Leading; Engineering Billing
After stopping by for the tail-end of a startup experience (Tripod)—where method wasn’t much of a focus—I experienced working within a development-centric organization for the first time. As the Chief Information Architect of a consultancy spun off from a global IT shop, my charge was rather specific—to develop an information architecture practice within a creative team methodology that was flexible and scalable, and could roll out across the agency offices across the U.S.
Aside from forming strong allies with creative directors in each office—a difficult balance to strike, as they (understandably) felt comfortable running their own teams as they saw fit—the largest problem I faced was that ancillary teams (business analysts and developers, in particular) operated in an object-oriented, UML methodology. Engagements began with JAD meetings that lasted days, sometimes weeks, and stacks of business and development documentation were produced for the project’s build.
I’m sure good thinking came out of those sessions, but it looked like ceremony for ceremony’s sake, especially as I came to understand that the majority of the work turned out to be websites with less than innovative aspirations, degrees of difficulty, or complex user needs. The scale and complexity of the challenge ahead of me was clear.
My primary goal out of the gate was to figure out how to get tech management on-board with user-centered design in the midst of client engagements constantly kicking-off across the country.
What I immediately discovered is that IA had to own as much of the actor-driven UML documentation as possible or we had no leg to stand on. While we were successful in taking ownership of certain aspects of the process—from key insertion points to deliverables—we just couldn’t nudge the overall methodology, which was a nightmare.
As our team attempted deeper degrees of change, our efforts quickly became a game of tug of war with management. This was the time we lived in (2001)—many of my design peers were experiencing similar struggles with development-centric organizations across the industry. The attempt to convince an entrenched executive technology staff that Design needed to have a seat upstream to inform development of what users needed, even with wins in tow, quickly devolved into a street fight.
Technical documentation = billable hours.
Looking back, the daily struggle for optimizing how Design fit into a potential innovative space created a blind spot for me to the possibilities of more collaborative opportunities. Maybe we could’ve figured out a pre-Lean UX / Agile approach, or complemented JAD with design activities. At the end of the day, I believe we were too early to a problem that not enough people had yet encountered.
Over the next five years, I consciously took on the role of change agent for Design, which—though I didn’t realize it at the time—became as important to me as producing innovative work.
Product Design Where No One Considered Product As A Thing
As with my previous stop, it took results to seal the deal of institutional change.
Over the three years that I was in the employ of Datek / Ameritrade, Design moved from a position of downstream application to an upstream presence that impacted every aspect of the brand made available to the public. But to describe our methodology in a simple sentence just isn’t possible.
Both firms knew they were providing a service to their clients, but at the highest positions internally the authenticated space simply wasn’t perceived as what we commonly understand product to be today. It was treated as the “flip side” of the marketing layer; the logged-in space that pushed traffic to pages or applications that triggered trading charges.
When Ameritrade bought us, the Business provided Development with direction that prioritized desired projects, and Marketing was accustomed to providing all visual assets when applicable. After successfully navigating the domain for a few years, I positioned myself to staff a UX Design team to design a new trading platform after a deep, collaborative analysis of user needs by me and former Datek leadership.
The active-trader platform, Apex, was born, and we had real, innovative work to do.
While our process was more waterfall than not, the front-end engineers were in our group (UX) and were constantly prototyping our designs for testing and skeleton development. Consider us an early EPD team, with the evolutionary notion of product management still stuck between the business analyst and project management roles.
Without this organizational approach, waterfall would’ve stopped development in their tracks, but our team collaboration was top-notch. At any given time we had 10-15 active projects, prior to the era of collaborative platforms (e.g. Basecamp). Our productivity would’ve ground to a halt if it weren’t for how well our internal processes adapted to produce both on time and above expectations.
We pushed forward within a Goal-Directed Design framework to innovate an experience designed specifically to support scenarios particular to our active trading clients. The opportunity to apply human-centered design thinking in a broader manner was limited by our place at the bottom of the engineering reporting structure.
For the past ten years, I operated a design shop that would flex in size depending on the opportunities in front of me. Similarly, the particular engagements I landed (and the domains behind them) greatly influenced the process and methods available for me to employ.
For almost two years, I engaged with a huge player in the media industry, building out my team to six to completely re-imagine how they published to their online brands, which received millions of views per day.
Our design process with the internal project management and development teams was steeped in an early-stage (2007) Agile method blended with GDD, as we needed to understand and meet the needs of a cross section of internal users—editors, designers, media buyers, product management. It also had the challenging objective of being generic enough to be used by teams across multiple brands publishing to similar, yet differing, front-end templates.
After beginning the gig with a round of research to understand the role-based needs of the product, we kicked off the work by identifying key scenarios to pursue, beginning with the “publish and edit article” scenario.
My team, located across the US and the UK, would hold our own daily scrum each morning prior to joining in for the client scrum in the afternoon. We (myself, the PM and dev lead) discussed where we were in designing for the scenario in play, and what we’d complete over the following three-week period. We would sketch, review, and specify, and then work with dev.
As this was an internal application for a high-profile brand, a certain degree of politics limited our iterative learning capabilities. If I were in-house, I probably would’ve fallen on that sword; as it was, I did the best I could, delivering what we had promised to deliver.
The lesson from that experience is that methodology decisions matter, as method—collaboration, research, iteration, agility, etc.—directly impacts the creation of a useful experience for real people.
Design Thinking With A Side Of Innovation?
We began with a process of design, which seems ridiculously simple looking back across this essay, and here I am, fully taking in how much I’ve had to adapt to my environment over the years to thrive, even survive at times.
While I’ve explicitly operated in a manner to nudge others to shift their perspective on how to work with Design over the past 20 years, the evolution of my understanding of Design, and how it fits across industries, form factors, and customer types has immensely changed as well.
Just as business leaders shift their operational tactics as the world evolves around them, we designers must also evolve in order to produce quality work in an ever-shifting technological environment with varying expectations from clients and business partners alike.
My d/Design interests have never led me to submitting entries in an awards chase. I don’t care about accolades; I care about results. My interests lie squarely in the realm of leading a team of practitioners, contributing to an excellent product vision, understanding human needs, innovating solutions, and executing designs to the point of wild success.
What method gets me there is still yet to be defined.
In 2005 I collaborated with Khoi Vinh (a founding partner and Design Principal at Behavior Design) as a contract information architect to redesign the Media Matters for America website.
The organization had launched only a year prior in 2004, but with the explosion of political blogs beginning to make a dent in online conversations (pre-social media), their platform couldn’t handle or take advantage of the overflow of new traffic. If you’re not familiar with the media tracking platform, here’s how they describe themselves:
Media Matters for America is a Web-based, not-for-profit, 501(c)(3) progressive research and information center dedicated to comprehensively monitoring, analyzing, and correcting conservative misinformation in the U.S. media.
The team we worked with was led by a handful of senior research fellows who were mostly in charge of the position pieces on the site. While the incoming goal of the engagement was centered primarily around re-architecting the site to make specific content more findable and visible to users, the team knew that they were posting blindly to the web with only the hope of satisfying audience needs.
Who Actually Needs Monitored Media
In lieu of a large research budget, we decided the next best thing was to run a session with their in-house political consultants and subject matter experts. Armed with sticky pads and markers, we set up shop in their war room in downtown Washington, DC and started in at the beginning:
What do we know about the people who come to the site today?
What do they visit often? What do they do there?
As bloggers, how can we better serve their needs?
Who isn’t currently aware of MMFA that should be aware?
Is there a chance to develop programs or community around misinformation?
When designing an information heavy, public website—as opposed to a task-oriented, internal application—it can be a slippery task to craft design personas that will present even pseudo-exacting context scenarios.
In this case, we were able to start the conversation with a solid understanding of a user base that had particular needs (political bloggers), and then extrapolate outwards to see if similar users might exist with even more discrete goals. Our generic “embed and share” user began to show enough tendencies for us to divide the profile into three primary design personas:
Jonathan Kenney represented the archetype of a journalist that MMFA knew was using the site for research purposes. Upon fleshing out his experiential and career goals, we recognized the opportunity to design a program that fed content into his topic-specific research needs.
Jackson Martin represented the scores of political bloggers that were on the rise at the time. This was pre-social media, but the scenarios were similar: Make content findable, digestible and shareable.
Efrat Zori represented the mass of anti-war and left leaning activists who simply didn’t appreciate being lied to. MMFA can become a community for her to spend time both commenting and sharing media tips.
Somewhat outside our purview, and deeper into the objectives of MMFA itself, lay a fourth potential persona. Dharia Hsin represented a political staffer who was dealing with the media, putting out fires and suggesting policy strategy to her congressman. We believed that she might also be pegged strategically by MMFA as a conduit to share misinformation research and video with her boss to assist movement on the hill.
Moving From Strategy To The Interface
Once we had our personas in play, we crafted a number of scenarios that positioned our users within their daily context and imagined, in the most optimal manner, how they might interact with MMFA. This narrative exercise allowed us to design solutions without the constraints of the interface. We then walked the client through each scenario, and made changes where necessary. Once we were in agreement on the strategic approach concerning user needs, I began to document lower-level scenarios—following key paths throughout this yet-to-be-sketched interface.
The key paths began to resemble functional requirements, but not as an out-of-context collection of direction; rather, much more of an organic semblance of explicit interface needs steeped in user goals—from particular widgets to search results, hero campaigns, and column requirements. We used this process to iterate internally and then with the client, eventually leading to design sketches… but with a twist.
As I began low-fidelity template sketches that were heavily laden with modular features, I visually assigned our personas to each content area. From the home page to content pages to search results, it helped to communicate the bridge between the strategy work and the interface to the team.
From there, as my sketches gained in fidelity and Khoi—probably the most renowned grid designer on the web in this era—worked with the team to establish interface details and a more sophisticated palette, we iterated on module placement, widget behavior and a metadata scheme. Previously, MMFA wasn’t tapping into tagging as an approach to make their content more available, either internally or externally. We made it a priority that all content published—research, video, article, etc.—would have an explicit approach to tagging, which helped the team align more closely with the citizen media launch spots (think: Technorati) of the time.
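The value of that tagging priority is easy to see in miniature: once every published item carries tags, an inverted index lets any surface pull related content regardless of its type. A toy sketch, with hypothetical items and field names:

```python
from collections import defaultdict

# Hypothetical published items; titles, kinds, and tags are invented
# for illustration, not MMFA's actual taxonomy.
items = [
    {"title": "Fox News on climate", "kind": "video", "tags": {"climate", "fox-news"}},
    {"title": "Talk radio roundup", "kind": "article", "tags": {"talk-radio"}},
    {"title": "Climate misinformation study", "kind": "research", "tags": {"climate"}},
]

def build_tag_index(items):
    """Invert the item list into a tag -> list-of-titles lookup."""
    index = defaultdict(list)
    for item in items:
        for tag in item["tags"]:
            index[tag].append(item["title"])
    return index

index = build_tag_index(items)
# One tag surfaces both climate pieces, whatever their content kind.
print(index["climate"])
```

The same index is what made the content legible to external aggregators of the era: a tag was a stable handle on everything MMFA published about a topic.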
In the end, the MMFA team loved the work. Our strategy improved their community-building efforts and laid the groundwork for MMFA to become the go-to place for both bloggers and journalists looking for misinformation in the media.
Ten years after launch, not only can you experience the vast majority of the architectural foundation we put in play, but the domain has skyrocketed in popularity. The advent of social media most definitely fueled such an explosion, but our thinking at the time didn’t limit the domain when a new vehicle for discovery appeared on the horizon.
There aren’t many examples of domain designs that have stood the test of time for more than a decade. That’s very gratifying.
It won’t come as much of a surprise to those close to me that I’m planning on retiring dotmatrix studios as a business this winter. Housekeeping will keep me billing clients as such until the end of this calendar year, but any aspiration of evolving the business passed a few years back. It’s time to move forward, or as my favorite philosopher once said:
The only way to make sense out of change is to plunge into it, move with it, and join the dance.
– Alan Watts
I’ll continue to consult independently while I shift my focus to pursuing the right fit of an in-house design management position.
Quite honestly, it’s a few years past time for me to return to the challenge of building, collaborating and developing professionally with an internal team. And regardless of the results of my full-time pursuits, I’m going to pursue work that is more strategic in nature—which is not to say I won’t dive deep on projects, but at this point in my career I’m exponentially more valuable upstream: consulting on strategy, defining direction, mentoring designers and working cross-functionally.
Before my life becomes too hectic in this next phase of my career, I figured this might be a good time to document the history behind my studio (RIP), hash through its evolution and take into account the positives that I took away from these past 10 years. By no means do I consider this to be a case study of any particular importance, but it could provide interesting fodder for those looking to strike out on their own.
86 Bedford Street
When my stint with Ameritrade ended a decade ago and I moved from Jersey City to Greensboro, my professional goals were simple enough: to open my own shop. After three years of negotiating the waters of a development-centric organization sans executive Design support, I had become disillusioned with the politics of leading an in-house product design team, on deadline, while fighting turf wars to meet the needs of our clients.
In comparison, the notion of running my own shop, making decisions that fit my approach to design, business and the ripening opportunities of the web was more than appealing.
After a few months of consulting I quickly came to realize that I wasn’t interested in constructing a traditional agency; I had zero desire to get caught in the loop of chasing down work in order to maintain a bench of designers. That was when I took inventory and came to understand that my interests revolved around three distinct axes:
to collaborate with as many of the best and brightest I could find
to work on projects that I found to have value beyond a paycheck
to immerse myself in community-building efforts
This “mission” is what drove the choice of the dotmatrix name and mark. Riffing off an analog printer’s output—imagining individual members of a team or community coming together to create something larger than themselves—within the spoken equivalent of a top-level domain, the branding was ironic, cryptic, geeky, aloof, and executed with sophistication (thanks, Tina).
After putting 10+ years in the industry, I was still an artist at heart who trained as a designer and learned the art of business out of necessity. If I was going to run my own shop, it would be by my own rules, so I decided up front that I wasn’t going to chase RFPs and local clients—let alone propose panels and angle for speaking engagements—but rather invest my time and efforts in openly posting my thoughts and ideas about our quickly evolving 2.0 world and attempt to meet potential collaborators.
From ’05 to ’07 I attended a good number of conferences—from SXSW to Emerging Technologies at MIT—and met some brilliant folk along the way (such as Tara, Doc, Nate, and David). The more I posted, the more I became a part of the 2.0 conversation and the more project feelers I’d get in return. Over the last ten years, every project I’ve worked on was the result of either a past colleague reaching out, a referral or someone pinging me based on my writing.
dotmatrix studios never marketed itself with a website or a Twitter account; we were the speakeasy you came to know by word of mouth.
One of the advantages of setting up shop in a city like Greensboro is the cost of living, which can give you the opportunity, if you’re so inclined, to balance your time between paid gigs and creative pursuits.
After contracting with Behavior Design to rethink and design the Media Matters for America platform, I had the opportunity to assemble a full team to design the CMS for Scripps Networks, which was to be used to publish, manage and monetize content for multiple high-visibility online properties (i.e. The Food Network, HGTV, DIY, etc.). I’d co-lead scrum calls on Skype and interview users on location every few weeks. Working out of offices in Knoxville and NYC, with a team that also spanned London and Minnesota, the project was a bona fide feat of collaboration.
In my (few) off-hours, I ended up waist deep in the local music scene, and after watching my musician friends struggle to land crowds at shows and get their music to the masses, I came up with the idea for the dotmatrix project (DMP). Encompassing all three of dotmatrix studios’ mission elements (collaboration, projects of value, community building), DMP was an experience design challenge both online and off.
Offline, we crafted the shows to be experienced by the audience as a mixture of a studio recording, a video shoot and a live show, which got the locals talking and showing up in numbers; online we posted high quality documentation of the shows and promoted both the documentarians and musicians as if they were all rock stars.
The goals of the project were large (winning grants to pay documentarians and opening a venue were just two), and the commitment exhausting (I carried equipment with the engineers, contributed to designing & hanging show posters, produced the shows, managed all creative collaboration, etc.). In the end it was just too much to juggle with full-time paid design work. We produced our last show in May of 2010 and imbibed a few tall ones to celebrate a good run of close to three years’ worth of monthly shows—all through the efforts of a volunteer community of more than 200 local, creative souls.
For the last five years, dotmatrix studios has been slowly ramping down from team-oriented projects to individual consulting, with me eventually teaming up with a few different shops: Bluespark Labs (where I took a UX Director role for a minute, until I felt the itch to try one last time to make dotmatrix work) and my current partner, Analogous. The work has been challenging and interesting—from the startup Knewton.com to FXCM.com to Indiana University Libraries to the Inter-American Development Bank—but priorities change.
It’s A Wrap
Sometimes I look back on where I was in 2005 and wonder if I made the right decision to move from NYC to the south; to leave Silicon Valley offers on the table in order to start up my own business. I have zero regrets. While ten years removed from being employed by a large company might make it difficult to transition back into a full-time product team, too many amazing things have happened to me over the past ten years to think twice about my decision.
If you ever have the chance to do your own thing, regardless of the obstacles that lie in your path, be sure to go for it. Take that risk, as life is too short, or as the kids say nowadays, “YOLO.”
In 2005 we were two years deep into the Iraq war, a year removed from Boston’s first World Series championship in 80+ years, and smack dab in the middle of the search engine gold rush.
Google had made it clear that data mining was not only a lucrative venture, but safe as well (no canaries have died to this day during database dredging operations). The rush was on to either catch up with Google’s progress (see Microsoft’s Bing) or to get out of the innovation game altogether (see Yahoo!).
Caught in the turf battle between the open web search giants were hundreds of thousands of domains of wide-ranging sizes and missions that had either developed their own search engines or licensed the best available on the market. From editorial to retail sites, search was quickly becoming the focus of product teams, as the browse culture evolved into a find culture.
Data vs Proximity
That summer, after leaving Ameritrade and still indecisive about my next move, I had flown out to A9—Amazon’s search R&D team—to present ideas about “block view” proximity in the A9 Yellow Pages interface. The context of the presentation was a UX design interview with Udi Manber and my old Billsville pal, DeWitt Clinton, but the more I prepared back in Jersey City for a proximity UI presentation, the more my understanding of the needs of users within the context of search drew my focus elsewhere.
By the time I had left for Palo Alto, I knew I wouldn’t be a serious candidate for the job; I had fashioned a presentation around the value of structured data for retailers in a Yellow Pages experience, not proximity. This was pre-smartphones, and the ask was a desktop interface.
I ended up moving away from proximity as I skimmed through a physical Yellow Pages, taking notes on my own scanning needs and findability patterns. It became clear to me that while the proximity of a store to my location, within the context of its own block, was useful, it fell squarely into a secondary position compared to the task of actually pinpointing a store that could serve my specific needs.
If I’m going to get in my car and drive to a brick & mortar store, it had better have what I need to purchase.
That notion became even more self-evident when taking a spin with the existing A9 Yellow Pages interface, as inventory and/or service details weren’t exposed at the return level. Conversely, proximity was everywhere. While proximity might help someone find a store, and such relevance is an inherent partner in presenting search returns, it’s the query-matching data from individual businesses that should lead this tango.
For example: the modern day Google search experience expresses this axiom in a number of ways, including:
a query for – [city] movies – returns an elegant display of available movies and showtimes in a particular city, presenting multiple sources of aggregated reviews, a trailer, synopsis, cast list and online availability
a query for – [ethnic type] restaurant – returns a display featuring local restaurants that meet the type criteria, presented within the context of proximity from my current location
a query for – [ethnic] food menu [city] – returns a display featuring the most popular local establishment’s menu as structured data (if managed within the Google My Business environment)
Google understood as early as 2006 that supporting the needs of businesses and their customers by exposing relevant data was a complex task, one that centered around data collection. So as they iterated their algorithmic approach to presenting more accurate business/customer results in their popular, revenue-driving primary search product, they began to focus on launching more structured “human-generated” data products related to business needs.
In 2006, Apps for Your Domain became the first attempt to harness the multitude of online needs for businesses.
Then came the merger of their Local Business Center—which had allowed for the collection of business-related data—with Places in 2010, creating an even more fluid experience of claiming and enriching representations of a business experience online.
Pages launched in 2011 (recently becoming My Business), which allowed businesses to expose their structured data, making it even more findable by potential customers, clients, partner businesses, etc. via a search query or when represented in the G+ environment.
In 2015, with the Knowledge Graph, knowledge panels and entity search in play, Google can now present a range of instantaneous, smart, semantic results to meet user queries, including those of customers looking for specifics from businesses, but they still need well-structured content to parse when an algorithmically informed guess can’t do the job.
So rather than relying on the market to do it on its own, their business products keep pushing the ball forward, including Google Reviews and the notion of us all being Local Guides.
Smartphones have changed search behavior over the years, and as such, proximity has claimed its own place in the search game. That said, just as I imagined when prepping for that presentation of an explicit Yellow Pages product more than a decade ago, finding the data match to a customer’s need still comes first.
Back To The Future
A decade ago, the potential for interfaces to support multivariate needs in a humane fashion was just becoming understandable, let alone practical.
The practice of shipping a UI that explicitly reflected the underlying technology or database, rather than the mental model of the user, was beginning to wear thin. User experience was steadily climbing toward an understood notion, on levels far beyond usability: usefulness, and how it actually affects brand and the bottom line.
My presentation (download PDF) framed a number of explicit design principles that would optimize for usefulness in the interface. Regardless of the results of the data/proximity discussion I planned on undertaking, it was important to establish that the experience would need to be humane.
After presenting such UX axioms, I walked through the primary scenarios of the existing product. The most frustrating aspect of the customer query -> connect with business task centered around the most basic, yet complex element of search recall:
“coffee” in “palo alto” – returned 77 results in an interface steeped in proximity elements, such as distance from current location and a hyper-local map display
“coffee” and “donut” in “palo alto” – returned only 2 results, so either 75 of the previous coffee shops don’t have donuts, or the data isn’t being made available at time of query
“coffee” and “doughnut” in “palo alto” – returned only 1 result, and it wasn’t one of the previous two listings
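The donut/doughnut discrepancy is a classic synonym-normalization gap: the data may exist, but the query terms never get reconciled against it. A minimal sketch of the idea, with entirely hypothetical data and function names:

```python
# Hypothetical sketch: normalize query terms against a synonym map
# so "donut" and "doughnut" recall the same listings.

SYNONYMS = {
    "donut": "doughnut",
    "donuts": "doughnut",
    "doughnuts": "doughnut",
}

def normalize(term):
    """Map a query term (or tag) to its canonical form."""
    term = term.lower()
    return SYNONYMS.get(term, term)

def matches(listing_tags, query_terms):
    """A listing matches when every normalized query term
    appears among its normalized tags."""
    tags = {normalize(t) for t in listing_tags}
    return all(normalize(q) in tags for q in query_terms)

listings = [
    {"name": "Happy Donuts", "tags": ["coffee", "donuts"]},
    {"name": "Peet's", "tags": ["coffee", "espresso"]},
]

# Both spellings now recall the same shop.
print([l["name"] for l in listings if matches(l["tags"], ["coffee", "donut"])])
print([l["name"] for l in listings if matches(l["tags"], ["coffee", "doughnut"])])
```

With normalization in place, the 2-result and 1-result queries above would at minimum collapse into one consistent set.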
There’s no kind way to tell a product team—one that has flown you clear across the country to present a vision for their interface—that no matter what you do in the interface, the current product focus is not on point.
The toughest challenge for the Yellow Pages product wasn’t proximity; it was figuring out a way to garner a rich collection of metadata representative of inventory for their search to parse and return. My perspective was to empower business owners to tag their services, leverage existing user reviews as metadata, and open up APIs to provide structured data for search to leverage:
If the business object model is a restaurant, provide a simple menu input interface on the front-end, or offer to pull in data from a popular third-party service
If the business object model is a hardware store, provide an API to connect with the domain’s inventory database
Do everything you can to make it impossible for me—the customer who is searching for a specific type of wrench—to miss out on spending money at the Ma & Pa hardware store 15 minutes away that I’d rather support than a warehouse store just down the street, simply because the name of the store doesn’t have “wrench” in it.
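The data-leads-proximity argument can be reduced to a sort order: any store whose exposed inventory matches the query outranks a nearer store that merely exists nearby. A toy sketch (store names, fields and the `rank` helper are all hypothetical):

```python
# Hypothetical sketch: rank stores by inventory match first,
# with proximity only as a tiebreaker -- data leads, proximity follows.

stores = [
    {"name": "Warehouse Depot", "miles": 0.5,
     "inventory": ["lumber", "paint"]},
    {"name": "Ma & Pa Hardware", "miles": 7.0,
     "inventory": ["wrench", "hammer", "paint"]},
]

def rank(stores, query):
    """Sort stores so an inventory match outranks mere nearness;
    proximity breaks ties among equally matching stores."""
    def key(store):
        has_item = query.lower() in (i.lower() for i in store["inventory"])
        # False sorts before True, so matching stores come first.
        return (not has_item, store["miles"])
    return sorted(stores, key=key)

for store in rank(stores, "wrench"):
    print(store["name"])  # Ma & Pa Hardware surfaces first
```

The Ma & Pa store wins the "wrench" query despite being 15 minutes away, while a "paint" query would fall back to proximity, since both stores carry it.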
A year after our meeting, block view and Yellow Pages were shut down by Amazon—which seemed to indicate that a focus on Amazon-available products had become the priority—while Google set off down a path toward creating a decentralized, modern-day “Yellow Pages” product centered around business data and customer needs.
Amazon now dominates product search through its own interface, leaving brick & mortar proximity to Google. Their shift into creating physical products such as the Kindle & Echo, while launching their own original entertainment studio for delivery on Prime, marks a wicked understanding of market opportunities and innovation.
The recent announcement of Facebook “working on” a Dislike feature could read like an overstated talking point for a seemingly simple update. I mean, design an icon, add it to all posts and register the count, right? What could be so involved to make this such a “working” situation?
Brand integrity and the bottom-line would be the short answer.
When Like Is >> Love
This “minuscule” feature update has probably been tossed around Facebook upper management for a while now, but not due to a “slow to respond to feedback” perception, as many users might believe.
The Like behavior has become synonymous with the Facebook brand; it’s one of the core experiences we have when using the product. I’d imagine management realizes that Like has transcended the brand itself, becoming a pseudo-proprietary eponym for all likable (favorite, best, etc.) interactions online. Even offline, the term “like,” used in practically any context, has become the collective linguistic signifier for the upturned thumb, which equates with the Facebook brand.
That’s a valuable interactive component for an online product; as valuable as asking for a Kleenex instead of a tissue. As my former colleague Dan Saffer describes it:
[…] Microinteractions, when done well, can become Signature Moments that are associated with the brand. Facebook’s “Like” is one great example. […]
Facebook isn’t going to disrupt such brand DNA haphazardly. While Dislike may make perfect sense to the user within the context of registering a quick opinion on a post, it introduces tangible degrees of dissonance to the purity of the brand as a diametrically opposed interaction. Disrupting such a key aspect of the brand—with the potential of turning the simplicity of a Like experience into something heading in the direction of the faceted rankings of posts on Buzzfeed—takes a measured approach.
So yes, it’s a simple execution, but not from a brand strategy perspective, and even more so when considering the bottom line.
Adding Up The Ads
From individuals (artists, musicians, politicians, etc.) to businesses (phone companies, restaurants, airlines, etc.), people have recognized the value of a Facebook presence, understanding that a high number of likes on a Page can provide:
a perception of credibility in the field, such as a musician with a large following looking to book gigs with established venues
an avenue for traditionally non-sexy, non-communicative brands, such as an AT&T, to get placed in the news feed of users
This understandable path to credibility and gained attention has become a strong revenue model for Facebook, as in-product advertising—presented to highly targeted users—drives conversion to Liked Pages.
So where would Dislike begin and end in this business model?
I would imagine that if Dislike is rolled out to all posts and comments, users would expect the ability to downgrade everything else as well. AT&T’s 5M+ Likes look “impressive” when positioned alone; add in the context of, say, 35M dislikes and… “Houston, we have an issue here.”
This is the fine line that Zuckerberg toes.
One persona—the daily user of Facebook that posts to generate conversation with friends and/or the general public—would thoroughly enjoy the ability to provide “truth” to everything; from politicians to corporations to local restaurants to individuals. But revenue-providing personas? They’re not going to enjoy catching one-to-one cancellations of hard-earned (or paid-for) Likes.
It’ll be interesting to see how Facebook handles the different interests in play, though I have a pretty strong feeling that they’re going to lean in the direction of catering to the user base who parts with cash. Call me crazy.
A Wrench In The Algorithm
Lastly, and potentially most interesting to me (aside from people de-friending people left and right for dropping explicit, negative, system feedback such as DISLIKE!), is how the behavior of Disliking would affect the News Feed. Here’s how Facebook positions its purpose:
The goal of News Feed is to deliver the right content to the right people at the right time so they don’t miss the stories that are important to them.
Facebook designed the approach for how to filter content based on user behavior, and then codified those choices into an algorithm. Of course, no one will be 100% happy with the results, but that’s to be expected and, quite honestly, we’re not dealing with brain surgery here. As long as the context of inclusion feels like it’s in the ballpark, users won’t complain… too much. That said, the behavioral decisions are somewhat obtuse at times. Here’s the approach (from 2013) with my comments in bold:
How often you interact with the friend, Page, or public figure (like an actor or journalist) who posted – Do I really want to see more content from someone I negatively interact with? Isn’t there a good chance that content hidden might be more appropriate?
The number of likes, shares and comments a post receives from the world at large and from your friends in particular – Who’s to say I consider my friends equitable partners in my world view? If one of my friends’ posts surfaces more often, and I explicitly interact with it, does that actually mean I want to do the same down the road?
How much you have interacted with this type of post in the past – Can the system recognize “type” beyond content types of video and music, and explicitly understand the difference between politics and sports? Or more specifically, left vs. right or Jets vs. Patriots?
Whether or not you and other people across Facebook are hiding or reporting a given post – If I hide or report a post, the post is gone. Does the algorithm look at the “type” I reported and draw conclusions about future “similar” content?
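Read as a ranking function, the four factors above amount to a weighted score per post. The weights and signal names below are entirely hypothetical, just to make the shape of the trade-off concrete:

```python
# Entirely hypothetical weights and signals -- a toy sketch of how
# the four listed feed factors could combine into one ranking score.

WEIGHTS = {
    "author_affinity": 3.0,    # how often you interact with the poster
    "global_engagement": 1.5,  # likes/shares/comments from everyone
    "type_affinity": 2.0,      # your history with this type of post
    "hide_rate": -4.0,         # hides/reports across Facebook (negative)
}

def score(post):
    """Linear combination of the four factors; higher ranks higher."""
    return sum(WEIGHTS[k] * post.get(k, 0.0) for k in WEIGHTS)

post = {"author_affinity": 0.8, "global_engagement": 0.4,
        "type_affinity": 0.6, "hide_rate": 0.1}
print(round(score(post), 2))  # → 3.8
```

A Dislike would be a fifth signal, and the hard part is choosing its weight: treat it like a hide and the post type vanishes, treat it like engagement and disliking something surfaces more of it.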
Now toss the Dislike feature into the mix. Am I disliking my friend’s contextualization of a post, more so than the post itself? If I’m passionate about the subject matter, I probably wouldn’t want to see less of it based on a dislike; I may want to dislike more of the same to remain vigilant with my position and potentially impact other people’s perspectives. It’s a tricky balance.
While I’m constantly berating myself for the amount of time I spend on Facebook—tending to draw some pretty ill conclusions about what drives our society’s collective addiction to connecting, pontificating and arguing—I have to admit that this evolution intrigues me. Don’t agree? Like this post and I’ll argue my position on your wall.
MapAmericas was used as a secondary mechanism for IDB employees in the field to input project data, with the additional benefit of geo-targeting projects and their corresponding results. The public-facing map featured every project across Latin America, introducing users to contextual information within a click. The concept was useful, but the interface segmented projects by country, had limited filtered views, and the iconography overlapped and confused navigation. With a low adoption rate across all user types, we had our work cut out for us.
As I wrote up a heuristic evaluation of the user experience, the design team researched mapping solutions out in the wild to review the variety of visual and experiential options in play. After a review of all our findings, we settled in on an overarching approach:
The project map couldn’t exist as a destination point in the global navigation. While it had the potential of being a smart, useful visualization of regional projects, it just didn’t fit the context of how users browse or might use it. We decided early on that the map needed to be accessed contextually from project, sector and country pages while providing avenues to return to the same templates when appropriate.
There’s a large amount of structured data found on project pages, but structured only in terms of the information design of the page, which allowed for great readability. We needed to free that data up as explicit attributes for filtering purposes to navigate within the map interface.
The overlap issue needed to be addressed, so we decided upon a clustering approach for projects in close proximity at high-level views, and for situations when exact geo-coordinates overlapped. We also created two tiers to the map—a pre-project view and a project/results view—which allowed for different behaviors specific to the context of the current task.
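One common way to implement the clustering described above is grid bucketing: at a given zoom level, points falling into the same grid cell collapse into a single cluster marker. A minimal sketch, not the actual MapAmericas implementation (coordinates and cell size are illustrative):

```python
# Hypothetical sketch of grid-based marker clustering: projects whose
# coordinates fall in the same grid cell collapse into one cluster.

from collections import defaultdict

def cluster(points, cell_size=1.0):
    """Bucket (lat, lon) points into square cells of `cell_size` degrees;
    each bucket becomes one cluster with a count and a centroid."""
    cells = defaultdict(list)
    for lat, lon in points:
        cells[(int(lat // cell_size), int(lon // cell_size))].append((lat, lon))
    clusters = []
    for members in cells.values():
        n = len(members)
        center = (sum(p[0] for p in members) / n,
                  sum(p[1] for p in members) / n)
        clusters.append({"count": n, "center": center})
    return clusters

# Two projects near La Paz, one near Bogotá: the nearby pair collapses.
projects = [(-16.5, -68.1), (-16.6, -68.2), (4.6, -74.1)]
print(len(cluster(projects, cell_size=2.0)))  # → 2
```

Shrinking `cell_size` as the user zooms in splits clusters back into individual markers, which maps naturally onto the two-tier view behavior.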
The resulting experience now allows users to view projects across all of Latin America, but subsets particular to persona interests. So an Education Minister in Bolivia can view all education projects in her country, and then broaden the results to show education projects across the entire continent, allowing for better context. Similarly, data points such as total cost and phase were made available as filters, so that same user can now view projects in the approval phase that cost $50M-$100M. These were simple, yet powerful changes to the browse paradigm.
Once the user drills down to a particular project view, results are now explicitly tied to the project, whereas previously, results lived within their own view with no discernible relationship to a project and/or a mechanism to get more project information. The cognitive result is an immediate processing of scale—how many outcomes were produced by a single initiative.
A decent amount of code tweaking remains in order to get the experience exactly right, but that should occur over the next few months. Once we have usage numbers, I’ll update this post.