FACTORYTALK® ANALYTICS FOR DEVICES
AKA, “PROJECT SHELBY” | A DEEP CASE STUDY | ROCKWELL AUTOMATION | PRODUCT DEVELOPMENT METHODOLOGY: AGILE

SUMMARY
ROLE.
UX design lead
Visual design lead
User researcher
Copywriter
OPPORTUNITY.
Customers have copious data, yet no convenient way to mold it into actionable insights. Most analytics solutions focus on the Cloud and top-down approaches that require data egress, a nonnegotiable security risk for most manufacturers.
HYPOTHESIS.
Customers will purchase a solution (hardware with Web-based software) that allows them to get insights at “the edge,” a bottom-up approach that doesn’t require data egress to the Cloud.
USER RESEARCH
Methods
User interviews, usability studies, persona creation, and contextual inquiry.
Challenges
Gaining access to users in security-intensive industrial environments, and identifying appropriate personas and target-user mindsets.
Hypothesis
Our product offering can satisfy the needs of a wide range of users, including maintenance engineers, system designers, and even operators.
Outcome
With the help of stakeholders and the product owner, we gained access to customers in a variety of industries, including system integrators (general manufacturing), food & beverage, and automotive, all of which fueled persona creation.

It all started with talking to customers about our hypothesis. The documents shown above grew out of a series of interviews with a system integrator in Canada who was already a Rockwell Automation customer. At left are my raw notes from one of the discussions; at center, a snippet of early persona generation; and at right, the document that would ultimately inform another one of our personas: the maintenance engineer.

Above are the persona documents I authored and designed after those customer interviews. We concluded that there would be three primary users of our solution: a systems engineer, an operator, and a maintenance engineer. The operator would benefit from the situational awareness the dashboards would provide in a plant-floor setting. The maintenance engineer would benefit from the real-time diagnostics we would show (and push to them) so that they could troubleshoot issues. The systems engineer would benefit from those same diagnostics, which would help with system fit, installation, and performance for other end users. These artifacts helped build consensus and user empathy, and they were posted next to our kanban board so that our small agile team never lost sight of them.

Later, when the product had been released, we conducted multiple usability studies on new-feature hypotheses I’d designed. Above are a couple of snippets from a user research plan I created for a study conducted at Automation Fair 2018. Per our Build/Measure/Learn approach, we pivoted frequently depending on early adopter feedback.
UX DESIGN
Deliverables
User flows, sketches, and wireframes.
Requirements
The product must be fast and ready to use (the out-of-box experience should take users 5 steps or fewer to complete). The product must do more than regurgitate data; it must provide actionable insights in human-readable language. The product must work offline and receive updates offline. The product should learn from its users and get better over time.

The UX design process started with sketching, which was of course messy. The product owner and I had to tackle the dicey problem of how a user would view a list of their devices before we could even consider what a device analytics dashboard would look like. Should it be a flat list? Should it be nested? Should it come from a controller/IO context? Perhaps a network topology context? Something else?

After several rounds of iteration, we created a prototype from linked Balsamiq Mockups wireframes to simulate three different navigational paradigms: tree, drill-in, and a hybrid of the two. We conducted a study at Rockwell Software TechEd in 2016 with 7 participants and received mixed results. In the end, we decided to keep things simple with a flat list, since our solution could only reliably return results from about 100 devices before performance degradation became an issue.

The prototype above gives away what our dashboards began to look like, but it was a journey to get there as well, and you’ll see the final output in the visual design mockups below. Before we settled on anything, the product owner, lead engineer, and I conducted several whiteboarding sessions and interviewed internal SMEs along the way about information architecture, hierarchy, and which information would be most salient. We imposed a restriction of 5-10 KPI widgets/sections of content per device dashboard, since we wanted the information to be a quick read that would foster situational awareness and load quickly for users.

We went through the same iterative approach to determine how a user would onboard with the product, which was both a hardware and a software solution built on Web technology, a newer paradigm in our industry at the time. We challenged ourselves to give users an OOBE (Out of Box Experience) requiring no more than 5 steps to complete, and we were successful in that endeavor.

I mentioned earlier that our target users, particularly maintenance engineers, expected and would benefit from content being pushed to them. After much discussion, research, and exploration, I came up with the concept of “Action Cards,” which conceptually modeled the work order or ticket a worker would receive for a task. This concept also pushed us to ensure action cards could be viewed on a mobile device with the help of Rockwell Automation’s FactoryTalk TeamOne app, a mobile platform that aggregated content and capabilities from other Web-based applications.
Eventually, we gave action cards their own home on a page called the “Action Deck.” Users could open this view from the global navigation bar to see an aggregated list of action cards. While the cards would still be available on their mobile devices, a large form factor offered the added benefit of seeing them all in one place. This concept resonated with usability study participants, and through our Build/Measure/Learn approach, we eventually learned that users preferred using the Action Deck as a large-display dashboard on the plant floor. In the visual design examples, you’ll see some additional functionality and embellishments we made to action cards.


As an additional way of meeting users where they are, and to put less onus on them to navigate a user interface, we leveraged new AI chat bot technology so users could type a quick question from anywhere in the application and get an equally quick answer. We nicknamed our chat bot “Shelby,” a name initially tied to the program because the Shelby muscle car took the best parts of other cars to create something new and special, and our application was likewise assembled from pieces of technology other teams had already built. Over time, Shelby became personified as a German Shepherd, since the bot would fetch answers much the way the overall application sniffed out devices on a network and retrieved them.

Furthermore, parts of our user interface were “trainable” and adapted to user preferences over time. In the wireframe above, you’ll see an additional feature that allowed users to click an accelerator button to invoke the chat bot to answer a common question. This was an onboarding technique to teach users the value of the bot.
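To make the accelerator idea concrete, here is a minimal sketch, with entirely hypothetical names and a hypothetical sample question (this is not the production bot API): an accelerator simply pre-packages a common question and sends it through the same path a typed question would take.

```typescript
// Sketch of the accelerator-button onboarding pattern described above.
// All names and the sample question are hypothetical stand-ins.

interface ChatBot {
  ask(question: string): Promise<string>;
}

// An accelerator wraps a common question so a single click routes it to the bot.
function makeAccelerator(bot: ChatBot, question: string): () => Promise<string> {
  return async () => {
    const answer = await bot.ask(question);
    return answer; // the real UI rendered this in the chat panel alongside the question
  };
}

// Stub bot with a canned reply, wired to one hypothetical accelerator.
const stubBot: ChatBot = {
  ask: async (q) => `You asked: "${q}". (Shelby would answer here.)`,
};

const howManyFaulted = makeAccelerator(stubBot, "How many devices are reporting faults?");
howManyFaulted().then(console.log);
```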
VISUAL DESIGN
UI mockups, styles and icons, and copy.
The system Feed, a view that draws corollary conclusions and inferences based upon data communicated between connected devices; the device dashboards, with diagnostic assessments; the “Action Deck,” a view with industry-first Action Cards that can be voted up and down so that the system may learn user preferences and adapt; and an industry-first chatbot, “Shelby,” which can respond to hundreds of queries, providing users with information about their system.

A look at the OOBE workflow in production. We leveraged an existing style guide, which was tied to our Mobile Foundation Toolkit and served as the basis for the FactoryTalk TeamOne mobile application. Other than TeamOne, FactoryTalk Analytics for Devices was one of the first Web applications to repurpose the style guide, and the team and I contributed multiple components and patterns back into the system.

A screenshot of how the System Feed initially looked in production. Over time, we updated the default “stories” to become more actionable regarding device relationships and less focused on onboarding users.

A screenshot of a PowerFlex drive dashboard in production. The top-level assessment (the gray strip with the warning symbol) was intended to give users human-readable insights so they wouldn’t have to interpret, or cobble together a narrative from, the gauges, trend, and status indicator. While those KPIs are useful, we wanted to surface what mattered most.

Speaking of what matters most, the Action Deck displayed all action cards for a given day, associated with the devices “adopted” by a FactoryTalk Analytics for Devices appliance. Over time, we built additional intelligence into the cards, allowing users to vote a device and its associated diagnostic condition up or down. For example, an issue might occur with a device within a machine on Line 2 of a plant floor, but the maintenance engineer or technician might only care about Line 1, which is their responsibility. They could therefore vote down Line 2 devices and their conditions and vote up anything associated with Line 1. The Action Deck remembered what had been voted down and persisted those votes, so users got more meaningful insights on subsequent uses of the application.
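As a rough illustration of the behavior described above (hypothetical names only, not the production implementation), the voting logic amounts to persisting a preference per device and condition, then using it to filter and rank the cards shown on later visits:

```typescript
// Minimal sketch of the Action Deck voting behavior; the in-memory Map stands in
// for whatever persistent store the product actually used.

type Vote = "up" | "down";

interface ActionCard {
  deviceId: string;   // e.g. a drive on Line 1 (hypothetical identifier)
  condition: string;  // the diagnostic condition the card describes
  createdAt: Date;
}

// Votes are keyed by device + condition so they survive across sessions.
const votes = new Map<string, Vote>();

const keyFor = (card: ActionCard) => `${card.deviceId}:${card.condition}`;

function recordVote(card: ActionCard, vote: Vote): void {
  votes.set(keyFor(card), vote);
}

// Hide voted-down cards, surface voted-up cards first, newest first otherwise.
function rankActionDeck(cards: ActionCard[]): ActionCard[] {
  const score = (c: ActionCard) => (votes.get(keyFor(c)) === "up" ? 0 : 1);
  return cards
    .filter((card) => votes.get(keyFor(card)) !== "down")
    .sort((a, b) => score(a) - score(b) || b.createdAt.getTime() - a.createdAt.getTime());
}
```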

A glimpse at how we showcased the Action Deck in the iOS App Store, as part of the TeamOne platform. As mentioned, the unique selling proposition of the Action Deck and action cards was that they provided “snackable” portions of content, perfect for a small form factor or for a user on the go.

The final “Shelby” logo, which I designed myself!


