FACTORYTALK® ANALYTICS LOGIXAI®

DEEP CASE STUDY | ROCKWELL AUTOMATION | PRODUCT DEVELOPMENT METHODOLOGY: AGILE

SUMMARY

ROLE.

UX design lead

Visual design lead

User researcher

Copywriter

OPPORTUNITY.

Customers must spend hundreds of thousands of dollars for data scientists to derive predictive analytics through machine learning. No product offering in the industrial domain allowed a process or systems engineer to perform similar tasks on their own. A product from Rockwell Automation already existed and personas were established, but it was difficult and time-intensive to use. Moreover, it was expensive.

HYPOTHESIS.

Customers will purchase a new, modern product that sits atop existing technology: a module (with onboard software) that snaps into a controller backplane on their plant floor, collects controller tags, and lets users bind them to desired inputs and outputs to build prediction models. The product's proximity to the controller favors speed, and the built-in software's physics-based algorithm will usher in a new paradigm for data cleansing and machine learning that anyone can use.

UX DESIGN

Deliverables

User flows, sketches, and wireframes.

Requirements

Product must be fast and ready to use (the out-of-box experience should take users five steps or fewer to complete). It must build users' knowledge as they go (no heavy tutorial or non-contextual Help experience). It must provide common process templates to facilitate a smooth experience that doesn't require deep predictive-analytics knowledge. And it should use modern interaction paradigms wherever possible, leveraging new Web tech and a generational shift in automation engineers.

FactoryTalk Analytics LogixAI closely followed on the heels of FactoryTalk Analytics for Devices (FT4D) and FactoryTalk TeamOne, a modern mobile application framework that also provided device-level data. The first thing the product manager (who also managed FT4D) and I did was brainstorm where this new product, code-named "Project Sherlock," should reside in a user's overall workflow. That led to questions: Would they also use FT4D? TeamOne? Do the personas overlap between someone wanting controller-based predictions and basic device diagnostics? How would those workflows go? As the above whiteboard images show, we spent quite a bit of time trying to understand any overlap and potential downstream/upstream hand-offs between the products so that Sherlock would be sufficiently differentiated. But we needed to understand the target persona better…

We brought in a data scientist who was involved with the existing technology, which used an application called Pavilion. While Pavilion was expensive and took a long time to stand up, its customers (and specific users) were still in the sweet spot of who we wanted to target. Through discussions with customers in the oil and gas and heavy-industry verticals, we determined that a controls engineer would be the ideal persona to start with, very similar to "Aaron," a systems engineer persona I'd established through research on the FT4D project. Aaron's goals of modeling his system, fostering repeatability, and keeping costs down aligned well with the type of persona we'd concluded would be ideal for Sherlock. We knew it couldn't be a data scientist; that would run counter to the product's primary mission, which was to extend these capabilities to everyone else.

Eventually, the product manager, other business leaders, and I began to view Sherlock as more of a stand-alone offering. We'd been trying to force a narrative of it residing adjacent to FT4D and other Rockwell applications, but that narrative was becoming complicated and unnecessary. So we set out to understand how to reduce the barriers to entry for everyday "Aaron" users in our customers' operations, using a stand-alone software application (with hardware) built on modern Web technology. The above sketches and workflows show some early explorations of workflow and user interface.

This did lead to one alignment opportunity with FT4D, though: a quick, easy-to-understand out-of-box experience (OOBE). Fortunately, the OOBE workflow I'd designed for FT4D had received positive feedback and favorable usability-study results, and we were able to repurpose much of it for Sherlock's similar experience, albeit with some tweaks. In the above wireframes, you'll see that I'd even explored extending that exact flow into Aaron's task of creating a prediction model, which we'd concluded could entail defining assets and then assigning controller tags to variables. That would allow Sherlock's algorithm to "train" on what a normal day looks like so that it could eventually predict a bad day.

After more exploration, it became clear that Aaron's primary task of creating a prediction model was far more complicated than what could (or should) be possible in an initial OOBE. There were simply too many decisions to make, and we didn't want to delay his access to the software interface with so much bloat. This led to another opportunity: why not provide an overview screen (upon first accessing the interface) that would give Aaron the chance not only to create a new prediction, but also to view existing predictions with their various statuses and any actions he could take on them? He could even watch a video to get started (eventually, we moved to in-line coach marks to keep things contextual and low-friction).

Here's where we get back to the initial desire for a wizard-like flow, akin to OOBE. To keep things simple and bite-sized, I felt Aaron would benefit from a step-through experience when creating a new prediction. In discussions with the product manager and our data science team, we knew we'd need to understand the following right off the bat: which controller Aaron is targeting (there could be multiple in a backplane); whether he is creating a new prediction or adding to an existing one; what type of asset or process he is trying to learn more about (e.g., a pump, boiler, or generator); what he'd like Sherlock to predict (e.g., blockage or cavitation for a pump); plus basics such as a prediction name and description for later recall and retrieval. These decisions were table stakes for anything Sherlock could do going forward.
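To make those table stakes concrete, here's a rough sketch in TypeScript (the stack was modern Web tech) of what Prediction Builder's first step collects. The names and shapes are illustrative assumptions on my part, not the product's actual data model:

```typescript
// Illustrative only: the decisions Prediction Builder's first step captures.
// Names and types are hypothetical, not LogixAI's real data model.

type AssetType = "pump" | "boiler" | "generator";

interface PredictionDefinition {
  name: string;                  // for later recall and retrieval
  description: string;
  controllerId: string;          // which controller in the backplane to target
  mode: "new" | "addToExisting"; // new prediction, or adding to an existing one
  assetType: AssetType;          // the asset or process to learn more about
  predictionTarget: string;      // e.g., "blockage" or "cavitation" for a pump
}

// The kind of definition Aaron might produce in step one:
const pumpPrediction: PredictionDefinition = {
  name: "Line 4 feed pump",
  description: "Watch for early signs of cavitation",
  controllerId: "backplane-slot-2",
  mode: "new",
  assetType: "pump",
  predictionTarget: "cavitation",
};
```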

With the table stakes out of the way, Sherlock would then look across the backplane at the controller Aaron identified and begin to load in all the controller tags. His choices of asset, process, and focus were not in vain, either: we would stage the main area of the interface with common inputs and outputs associated with those choices. This was intended as an accelerator, since Sherlock's primary purpose was to make data science and machine learning more accessible to all. From this view, a user could then drag and drop (or browse for) the controller tags they wanted Sherlock to train against into the appropriate variable "bucket." Aaron's intervention here was necessary, as tags may not follow logical naming conventions and vary greatly by customer, a fact that became abundantly clear during a later usability study in which a participant from Disney became overly distracted by the names of the tags we'd loaded into the prototype. Once a user assigned the tags to the variables and established limits (another intervention we'd determined was needed), they were prepared to move to the next step.
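Sketching that binding step the same way (again, with hypothetical names): each pre-staged variable "bucket" accumulates the tags Aaron drops in, along with the limits he sets, and the step is complete only when every required bucket is satisfied.

```typescript
// Illustrative only: controller tags bound to pre-staged variable "buckets."
// Tag names vary wildly by customer, which is why Aaron assigns them by hand.

interface ControllerTag {
  name: string;       // raw tag name from the controller, e.g. "PMP4_DISCH_PSI"
  dataType: "REAL" | "DINT" | "BOOL";
}

interface TagBinding {
  tag: ControllerTag;
  lowLimit: number;   // limits Aaron establishes before training
  highLimit: number;
}

interface VariableBucket {
  role: "input" | "output";
  label: string;      // e.g. "Discharge pressure", staged from the asset choice
  required: boolean;
  bindings: TagBinding[];
}

// Gate the wizard's next step: every required bucket needs at least one
// tag bound with sensible limits.
function readyForNextStep(buckets: VariableBucket[]): boolean {
  return buckets.every(
    (bucket) =>
      !bucket.required ||
      bucket.bindings.some((b) => b.lowLimit < b.highLimit),
  );
}
```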

Later, we determined that a drag-and-drop paradigm with well-afforded "drop zones" would be the best approach to keep the interface clean and focused. We eventually added a faceted filter control to the tags list and paired the limits fields with each tag/variable's drop zone. Regarding drag and drop: it is always my goal to make the interfaces I design as accessible as possible. However, we did not ship keyboard equivalents for the drag-and-drop interactions in R1, and this remains a key learning I've taken from this project. Once certain oversights worm their way into an interface, it is very difficult to get a "go fast" team (or any other, for that matter) to go back and fix them. This improvement will be prioritized for the next major release.
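For what it's worth, the fix doesn't require reinventing drag and drop. A "select, then assign" interaction covers the same ground from the keyboard; here's a minimal sketch of the idea (my own illustration, not the shipped implementation):

```typescript
// Illustrative only: a keyboard-operable equivalent to drag and drop.
// The user highlights a tag in the list (arrow keys), focuses a drop zone,
// and presses Enter to assign. No pointer required.

interface AssignmentState {
  selectedTag: string | null;      // tag currently highlighted in the list
  buckets: Map<string, string[]>;  // bucket label -> assigned tag names
}

function assignSelectedTag(
  state: AssignmentState,
  bucketLabel: string,
): AssignmentState {
  if (state.selectedTag === null) return state; // nothing highlighted yet
  const buckets = new Map(state.buckets);
  const current = buckets.get(bucketLabel) ?? [];
  buckets.set(bucketLabel, [...current, state.selectedTag]);
  return { selectedTag: null, buckets }; // clear selection after assigning
}
```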

Aaron's final step is to review his handiwork and make sure it's in order. Here's where things get a bit complex, though: while Sherlock can package up the prediction model based on Aaron's preferences, it cannot begin to train against the model until Aaron "activates" it in an application specifically for managing and designing a controls system. That product is Logix Designer, one of Rockwell's flagship products, which is Windows-based. Sherlock's payload is downloadable as the specific file type Logix Designer needs. With that file in hand, Aaron can open Logix Designer and upload the special prediction tags Sherlock created on his behalf. Once they're loaded into the section of Logix Designer where Aaron typically manages tags, he can turn them "on." To make this seamless, we eventually designed an onboarding companion webpage (not a video) that Aaron would keep open as he went through the steps to load the prediction into Logix Designer and learn what to do next.

Once he's completed his work with the wizard (which we later called "Prediction Builder") and understands what to do next in Logix Designer, he selects Finish and finds himself back at the overview screen. Here he can download the payload when it's ready and see the statuses of other predictions associated with that same controller, and possibly others. While an individual prediction could be expressed as "on" within this view, we later suppressed the toggle until users like Aaron completed the prerequisite Logix Designer steps.
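The gating behind that toggle is easy to express; here's a small sketch with assumed status names (the product's actual states weren't labeled this way):

```typescript
// Illustrative only: a prediction's lifecycle, with the "on" toggle
// suppressed until the Logix Designer activation steps are complete.

type PredictionStatus =
  | "building"           // still in Prediction Builder
  | "payloadReady"       // file is ready to download from the overview screen
  | "awaitingActivation" // downloaded, but Logix Designer steps incomplete
  | "active";            // activated in Logix Designer

function showOnToggle(status: PredictionStatus): boolean {
  return status === "active"; // otherwise the toggle stays hidden
}
```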

VISUAL DESIGN

This was the look and feel we started with for LogixAI; it created an OOBE very similar to FT4D's, both functionally and thematically. However, there was change afoot…

Around the time I was designing a high-fidelity vision for Sherlock, the User Experience Design Team's (UxDT) UI design leads were undertaking an adjacent activity: creating a more robust, modern design library built upon Google Material. The above screenshot only partially shows the various components, styles, and themes we were beginning to align, both visually and from an interaction-design perspective. So, it was time to pivot…

Out with the old, and in with the new. Based on the wireframes I'd designed, you can see here (on the Step 1 view) that I gave the interface the high-fidelity treatment per our new system, including the linear form inputs and styles derived from Google Material. In later usability studies, I found that users struggled to acquire those linear-input targets: they simply didn't convey "input" and failed to capitalize on users' existing knowledge, particularly for those like Aaron who were used to Windows-based apps. I made the decision to use the enclosed versions instead, a change supported by the rest of the UxDT.

Here are some examples of the more detailed high-fidelity design specifications I created for developer hand-off. I should point out that "Sherlock" was then evolving to become "FactoryTalk Analytics for Applications." Eventually, the marketing team swooped in and gave it the name LogixAI, which made sense given the necessary integration with the Logix Designer application and our focus on artificial intelligence.

A few more examples for you.

USABILITY STUDIES

Overview

Conducted at the Rockwell Automation TechEd event in San Diego, CA, and at Automation Fair in Philadelphia, PA, both in 2018. There were 15 participants from wide-ranging industry backgrounds and roles.

Methods

Think-aloud qualitative usability studies: Participants used a functioning prototype to test core workflows.

Outcomes

All recommendations that were deemed high or medium priority were implemented in the released product.

These findings were derived from a usability study conducted at Rockwell Automation TechEd in San Diego, California, in 2018, once we had a running prototype. We had 9 participants from companies including 3M, Disney, and Alcoa, plus a participant from Georgia Tech's data science laboratory. The results reinforced the limitations of Google Material's input affordances, but also illuminated issues with our naming and labeling conventions. All of the recommendations listed were eventually added to our agile kanban board and fixed in subsequent releases.

This usability study was conducted at Automation Fair in Philadelphia, Pennsylvania, in 2018. We had 6 participants, mostly from mining and heavy industries. This study had a different learning objective: it focused on the hand-off between LogixAI and Logix Designer and the types of files needed, and it later reinforced our need for a "helper companion" webpage that a user like Aaron would keep open while taking any additional steps in Logix Designer. The study itself was more complex, as we needed Logix Designer installed as well as LogixAI running with fake data. The time needed to reset between sessions was also longer, given the overhead and setup involved with the workflow. However, we were able to use some spare minutes from our perpetually late attendees (who had to traverse a large venue to find the study location) to make it all work.
