Synaptics acquired Validity Sensors in 2013, adding fingerprint sensors to our portfolio of human interface technologies. Validity had no user experience team, so our team was tasked with researching and designing interfaces for fingerprint enrollment.
Fingerprint enrollment is the process of initially creating a template of the fingerprint to authenticate users as needed. The goal of enrollment is to maximize the captured fingerprint area to account for variation in usage environments and finger positions. Enrollment consists of user interface components and matching algorithms that work together to create the best possible enrollment template.
The goals of the project were to:
- Design user interfaces that produce the highest quality fingerprints for authentication.
- Create scalable interfaces for various usage environments and different sensor types.
- Research the best way to keep users engaged during the enrollment process.
- Interface with the Synaptics SDK to develop our UI using the same tools as our customers.
Fingerprint sensors were common on enterprise laptops, but having a fingerprint sensor on a smartphone required us to rethink how to design the user interface within a mobile context. Our team started by looking at the user experience of enrollment on swipe sensors, but quickly moved to area sensors once the technology became available.
We started with designs for three different types of enrollment experiences: fully guided, partially guided, and unguided. After some initial brainstorming across the three, we settled on a deep dive into a fully guided enrollment. We began by iterating on a design geared around a small, rectangular sensor on the front of a mobile device. The shape and location matched the constraints of our next-generation fingerprint sensor, which Samsung had selected for the Galaxy S6.
Our initial design included a significant amount of instruction as we expected most Android users would be completing a fingerprint enrollment for the first time. These instructions were placed on an introductory screen and also provided dynamically during the enrollment. Feedback was an important component during the enrollment process as well. We were comfortable with the visual feedback we had designed (a fingerprint changing in color intensity), but tried different combinations of haptic and audio feedback throughout the design process.
The first step was to build a prototype framework that allowed us to iterate on designs quickly and have multiple enrollment user interfaces within one application. We knew that there were going to be other enrollment designs we would have to implement for different sensor shapes and locations. I initially developed low-fidelity prototypes on Android using a PandaBoard and our external development sensors before moving to a custom AOSP build on a Nexus device.
A detailed Configuration screen allowed us to build tailored enrollments by choosing individual sensors and locations, toggling interface elements, and selecting different feedback types. Since each project required a custom SDK, I created an emulator that allowed us to design, prototype, and test interfaces when the software wasn't available. Once the SDKs were released, it was a very quick process to validate the designs.
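One way to structure such an emulator is behind a common sensor interface, so the enrollment UI cannot tell whether capture events come from real hardware or a stub. The sketch below illustrates that pattern only; the class and method names are hypothetical and do not reflect the actual Synaptics SDK:

```python
from abc import ABC, abstractmethod
import random

class FingerprintSensor(ABC):
    """Interface the enrollment UI codes against; a hardware-backed
    implementation and the emulator are interchangeable."""

    @abstractmethod
    def capture(self) -> dict:
        """Return one capture event: template coverage and image quality."""

class EmulatedSensor(FingerprintSensor):
    """Stand-in used while a project's custom SDK is unavailable.
    Produces plausible capture events so UI flows can be exercised."""

    def __init__(self, seed: int = 0):
        self._rng = random.Random(seed)
        self._coverage = 0.0

    def capture(self) -> dict:
        # Each touch adds a random slice of coverage, capped at 100%.
        self._coverage = min(1.0, self._coverage + self._rng.uniform(0.05, 0.15))
        return {"coverage": self._coverage,
                "quality": self._rng.uniform(0.6, 1.0),
                "done": self._coverage >= 1.0}

def run_enrollment(sensor: FingerprintSensor, max_touches: int = 30) -> int:
    """Drive captures until the template reports full coverage;
    returns the number of touches used."""
    for touch in range(1, max_touches + 1):
        if sensor.capture()["done"]:
            return touch
    return max_touches
```

Swapping `EmulatedSensor` for a hardware-backed implementation once an SDK shipped is what made final validation quick.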
We completed five usability studies, each with eight to ten internally recruited participants, providing us with both quantitative and qualitative feedback. We provided fingerprint images captured by the sensor to the team working on the matching algorithms. We continued to move to higher-fidelity prototypes and used all of the feedback to improve the user interfaces before making a final recommendation prior to the Samsung Galaxy S6 launch.
In general, we decreased the amount of instruction provided in the application based on feedback from users. We also scaled back the text feedback and moved those feedback elements into our visual design. We have since tested interfaces ranging from realistic to abstract, with varying amounts of animation. Our design explorations resulted in users stating that our UI was easy to follow, helpful, and visually appealing.
The application was very successful internally and has since been used by our marketing teams for product demos. Originally the source code was only available to internal development and marketing teams, but is now provided to customers for reference and to utilize in their enrollment interfaces. Our team continues to use the application to prototype new enrollment user interfaces that relate to our next-generation fingerprint sensors. As the application is now provided to customers, I have implemented many of the features that were originally in our engineering applications. These include features such as fingerprint management, displaying fingerprint bitmaps, and viewing debug information from the sensor.
Interaction Design, Prototyping, Android Development, and User Testing.
I wanted to create a testing tool in order to measure the performance of our touch technology on mobile devices against our competitors. Our team had testing tools to measure our touch performance on Windows, but those tools were not compatible with Android. Our team thought it would be useful to conduct research related to the real-world usability of our devices and get results that reflected user performance across all of the applications on a mobile device.
Touchscreen performance on mobile devices is especially important due to issues like non-linearity across the surface, poor edge performance, industrial design limitations, varied usage environments, and a wide range of form factors. Having an extendable platform to develop different types of tests would allow us to fulfill our research needs and also help our marketing teams in showcasing the performance of our touchscreens.
The goals of the project were to:
- Design a usability platform for tests evaluating touchscreen performance on mobile devices.
- Implement features allowing for unmoderated usability studies to be conducted.
- Add networking features to help expedite and simplify the data collection process.
- Provide data to internal marketing teams about our touchscreen sensor performance.
One of the important tenets of the application was to simplify the process of conducting user studies. I believed this was a situation where I could design an application that would allow mobile user studies to be run with little direct oversight from a test moderator. This kind of design would allow our team to complete these competitive studies in the background and focus on more in-depth studies that required a moderator to be present.
After brainstorming different platforms to develop the application on, UX Suite took shape as a web application and an Android application that worked together. The Android application would house all of the usability tests and collect the research data. The web application would allow for remote test configuration and management of results. The application would sync results with the web server, and the web server would send updated test configuration values back to the application.
I started with the design of a test related to the most common task on a smartphone, which was selection. The selection test was based on one of the most widely accepted predictive models for testing the accuracy of target selection (Fitts’ law). The web control panel was designed to allow for easy access to all of the options included in the test with the same user interface as what was seen on the Android device. I included the ability to manage results, assign default test values, link custom surveys, and edit the text for instructions from the web interface. Lastly, device-specific settings could be sent down to the application including the test background color, default audio level, default haptic strength, and whether results were shown to the participant at the end of the test.
The Android application targeted devices running an OS version of 4.0 (Ice Cream Sandwich) or later, which accounted for 84% of active devices at the time of development. Detailed results were saved as CSV files with headers dedicated to session identifiers, device information, test configuration, and the test results. The CSV files allowed for easy importing and analysis in most statistical applications. The application also hooked into Google Drive to handle post-test surveys in order to ease survey creation and analysis. The results were saved both to the internal storage of the device and uploaded to the server for easy management of results.
Two internal user studies were conducted on the application to assess its usability and the self-moderated study process. For both of the studies, participants completed the selection test within UX Suite. Subjective questionnaires were administered after the test through an embedded Google Drive survey within the application. The first study had participants complete the remote usability study using a device that was dropped off on their desk. The second study had participants install the application on their personal devices.
The results of the research showed that participants completed the usability studies much faster when utilizing their own device. More than half of the participants in the second study completed the test on the same day the instructions were sent out via email. The results of that study showed that 83% of participants believed that the instructions provided inside the application were "very clear." Additionally, 89% of participants stated that completing the study without a moderator present was "very easy." Every participant in the research study said they would complete more self-moderated studies in the future. The results of both user studies were very encouraging and justified the continued refinement of the application and the self-moderated usability research process.
The application has continued to be used internally and has also been provided to several of our hardware partners, such as HP and Lenovo. The results of the user studies were published at MobileHCI 2014 and the application was released to the public on the Google Play Store. My colleagues also utilized the application for their research on proximity-based touchscreen interactions that was published at CHI 2015.
Interaction/Visual Design, Prototyping, Android/Web Development, User Testing, and Statistics.
Image Suite is an acquisition modality targeted at clinics worldwide that do not need the powerful DirectView system from Carestream. Due to poor feedback from users in the field, our team was tasked with investigating the reported issues and recommending solutions. Some areas of concern were the inefficient workflow, the number of pop-up windows, and outdated design patterns.
Our team spent more than six months working on an overhaul that completely reimagined the user interface of the product. This was an iterative process that included designing and developing interactive wireframes, performing usability tests, conducting management reviews, and having discussions with an offshore development team.
The goals of the project were to:
- Redesign the acquisition workflow to make technicians more efficient.
- Add the most requested features, such as switching between patients and search.
- Bring the product visuals and interactions more in line with the DirectView product.
One of the most important aspects of the redesign was to make sure that all of the current features would make it into the new revision of the software. This required us to keep every feature discoverable and usable while we reduced the complexity of the interface. The focus of the redesign for our team was reducing the number of steps required by the technician. We wanted to increase their efficiency in common tasks, such as placing image markers on the exposure. We designed an interface that allowed them to place image markers immediately after scanning in an image, rather than having to visit a post-processing screen.
The redesign also allowed us to add shortcuts in the workflow, such as starting the exam immediately after creating the order for an X-ray. Lastly, we removed all unnecessary dialog boxes and pop-up windows that even our expert users found confusing. When the redesign was completed, the user interface was reduced to only three screens (Worklist, Order Entry, and Acquisition).
An offshore development team was responsible for the final implementation of our redesign, so our team completed the design process using Axure RP. Axure allowed us to develop an interactive prototype that we could ship to the implementation team, and we were also able to export a design specification and helpful documentation directly from the prototype. It simplified our team's workflow, and we continued to use it for other projects due to the positive feedback.
The major obstacle for our team during the redesign process was how to evaluate our different ideas and prototypes. We did not have consistent access to users of the Image Suite product for frequent usability tests. As a result, we were only able to complete a few usability studies that gave us positive feedback towards the end of the design process. The qualitative and quantitative feedback we received from the usability studies reinforced the improvements to the product.
When the redesign shipped in 2013, the marketing materials focused on the more efficient workflow and the new touch-friendly interface.
Interaction Design, Prototyping, User Testing, and Statistics.
The System Configuration menu on our DirectView product was the number one source of frustration for our customers and field engineers. Our users had concerns about both the overall navigation method and the confusing content of the individual configuration pages. Technicians described the menu as intimidating and found the locations of menu items unnecessarily confusing and cumbersome.
When our team reviewed the menu, we identified the biggest issue as the large list of buttons that had to be scrolled through: multiple levels of button-only menus eventually led to the configuration page a user was looking for. Ultimately, we decided the menu needed a complete redesign, as reworking the hierarchy alone would solve only some of the issues.
The goals of the project were to:
- Redesign the navigation of the menu to solve the issues related to poor navigation.
- Add new features that reduce the complexity of the configuration screen.
- Rework the individual configuration pages with modern design patterns and widgets.
- Reduce the time needed to configure the product for first-time use.
The first task I completed related to the redesign was to conduct a card sorting exercise with our internal subject-matter experts. The redesign utilized the results of that exercise and reduced the menu hierarchy to only two levels. After the new configuration categories were created, I worked on combining similar configuration pages within the interface. For example, the original menu had four separate pages related to configuring bar codes on patient wristbands that I combined into one page.
I decided on the accordion design pattern for the new navigation menu to reduce the number of items on the screen at once. Another improvement was to simply alphabetize the pages under each category; previously, each new configuration page was just appended to the end of the configuration menu. Other improvements I added to the design included the ability to search, browser-like navigation buttons, thumbnails, and user favorites.
After the navigation menu was completed, I went into each configuration page to reduce the complexity. I also updated the contents of each page with modern design patterns and widgets. The last task related to the design of the menu was to complete a design guide for developers. The goal of the design guide was to make sure that developers would be consistent with the redesigned pages when adding new pages to the menu.
During the redesign, I met frequently with our application engineers. Our engineers often traveled to the headquarters for training on new features and upcoming products. I made it a priority to meet with new groups that were arriving to get consistent feedback throughout the process of the redesign.
During the project, I met with three groups of about fifteen users to show the new user interface and solicit feedback. The technicians were excited by the redesigned navigation and appreciated that our team was working to make their on-site work at customer locations easier. Feedback from users during the process was unanimously positive. Due to the time constraints of the visiting application engineers, however, I was never able to put the redesign through formal usability testing with them.
After handing off the high-fidelity prototype to the DirectView product team, I left Carestream Health for a new career opportunity.
Interaction/Visual Design, Prototyping, and Web Development.