Do we know our users?
THE PROBLEM
We were losing customers in our small business segment; they were not realizing the value proposition of our platform. We needed to better understand our small business customers, their needs, and the differences among customer types. Research was asked to gather data on our customers and their personas, then build user journeys. With more complete user journeys, we could understand how to customize services for particular customer groups. Did we really know our users?
THE HYPOTHESIS
Many of our teams did not have a true grasp of the users we were partnering with and designing solutions for. Because of this, we often did not right-size solutions for our customers but instead offered a "one size fits all" approach. With more complete user journeys and personas, we could understand how to customize services for particular customer groups and stem the tide of declining retention in particular business segments.
THE PROCESS
First, we needed to understand what specific questions we wanted to answer about our user population. I collaborated with the Product Manager, a Senior Researcher, and a UX Designer to narrow and group our questions by themes: demographics, motivations, and technology usage. Once we had defined and refined our questions, we determined that an internal Literature Review made sense. Could we answer any of our questions based on previously conducted research?
THE METHODOLOGY
Literature Review. I set about connecting with other departments to see if they had any published research that might be useful in my efforts. Several previously completed studies had worked to identify usage patterns and preferences of our user population. I created a requirements document by which to judge the internal resources, including age of publication and whether the focus was on our learners or our administrators. If a previously completed study met the inclusion criteria, it could be used as reference material for the Literature Review. Once I had a working list of included and excluded resources, I reviewed our questions against them.
THE FINAL ARTIFACT
While the resource materials themselves cannot be shared because they are proprietary, the final artifact was a written report, presented below.
Abstract. Relias strives to remain competitive by prioritizing user-centered design in ongoing UX projects. To better understand our customers' needs and create products they enjoy, we conducted a literature review prior to initiating a new study. This review allowed us to streamline open-ended questions and identify gaps in learner profiles for further inquiry.
1 Introduction
Understanding our learner population is essential to designing solutions that meet the needs of our users. In order to ensure we can design across teams and verticals, Relias must fully comprehend the well-rounded personas of the learners. Relias must understand their demographics, motivations, and technology device usage patterns to devise solutions that best meet their needs. Previous Market Intelligence Team (MIT) research provided some details regarding our user populations. Additional data sources, including Power BI dashboards and Google Analytics, will also be reviewed. The findings from those sources provide the content for this literature review.
Demographics
This study aims to understand more about who Relias learners are; knowing only that our customer base consists primarily of administrators and learners is insufficient. We also need to understand the following about our learners:
- General age range
- The average number of years of healthcare experience
- Certification types, if any
- Learning styles of users, and
- Assistive Technology usage
Motivations
For our purposes, motivations include why users access the Relias platform or other learning platforms. We need to better understand the intent of learners who access our platform, and how frequently they use it for mandatory learning versus non-mandatory personal learning. What kinds of courses and content do they seek out, prefer, and complete?
Technology Usage
To develop solutions that are easy for our audience to use, we need to understand the technology preferences, accessibility needs, and limitations of our learner user base. A clear picture of the preferred and most frequently used devices, and of the drawbacks and benefits of each device type for accessing the platform, will help drive short- and long-term roadmap decisions across the platform.
2 Research methodology
The methodology to uncover any previously elicited responses to our outstanding questions consisted of a literature review of previously conducted quantitative and qualitative studies. The review process included planning the assessment, conducting the review, and reporting on the findings.
2.1 Planning
Planning the review involved several steps. The process started with identifying the outstanding questions to be answered about our learners. Questions were submitted by Relias Product Managers, UX Design team members, and the UX Research team in the three categories of demographics, motivation, and technology usage.
2.2 Gathering resources
Gathering resources was the next phase of the planning stage. The MIT provided some of the resources to be reviewed for this inquiry, and the Condens repository was also reviewed. Finally, the Product Operations teams were contacted to determine whether they had any additional resources that could be included; they provided resources for review via Google Analytics and Power BI. The resources gathered from all teams included:
- Mobile Product Report PowerPoint
- 2023 State of Healthcare Training Survey Report PowerPoint
- Power BI Dashboards
- Google Analytics Dashboards (Aug 2022 – Aug 2023)
2.3 Resource review
A resource review starts with establishing protocols to determine whether the resources meet the necessary criteria for inclusion. The requirements ensure that the resources are:
- Learner-centered studies
- Focused on demographics, motivations, or technology usage
- Timely (no more than two years old)
Resources are excluded if they are:
- Admin-centered studies
- Not related to demographics, motivations, or technology usage
- Outdated (more than two years old)
- Not Relias-centered study results
These criteria determine whether previous research findings from the various Relias teams are included or excluded. The resources listed above met the inclusion criteria and are referenced throughout this review. A small, hypothetical sketch of this screening logic follows.
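The screening itself was done by hand against the requirements document; purely as an illustration for readers who think better in code, the same inclusion/exclusion rules could be expressed as a simple filter. Every name here, the Resource fields, the sample report, and the meets_inclusion_criteria helper, is hypothetical and exists only to restate the criteria above.

```python
from dataclasses import dataclass
from datetime import date

# Question categories the review cares about (from the criteria above).
CATEGORIES = {"demographics", "motivations", "technology usage"}

@dataclass
class Resource:
    title: str
    published: date          # publication date of the study
    audience: str            # "learner" or "admin"
    categories: set[str]     # which question categories the study covers
    relias_centered: bool    # whether the study focuses on Relias users

def meets_inclusion_criteria(resource: Resource, today: date) -> bool:
    """Apply the review's hypothetical inclusion/exclusion rules to one resource."""
    recent_enough = (today - resource.published).days <= 2 * 365   # no more than two years old
    learner_centered = resource.audience == "learner"              # exclude admin-centered studies
    relevant = bool(resource.categories & CATEGORIES)              # covers at least one category
    return recent_enough and learner_centered and relevant and resource.relias_centered

# Example usage with an invented resource entry.
mobile_report = Resource(
    title="Mobile Product Report",
    published=date(2023, 3, 1),
    audience="learner",
    categories={"demographics", "technology usage"},
    relias_centered=True,
)
print(meets_inclusion_criteria(mobile_report, date(2023, 10, 1)))  # True
```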
2.4 Research questions
These are the questions we aim to answer with our literature review.
Demographic, Motivation, and Technology Questions
Who are our learners?
- What is the general age range of our learners?
- What are their roles?
- What professional credentials do they have?
- On average, how long have they worked in healthcare?
- What are the most common learning styles for our users?
- Do our learners rely on any assistive technology?
What motivates our learners to participate in non-mandated courses?
- Do they take non-mandated courses?
- If so, why and for what purpose?
- Have they ever utilized what they learned from those courses?
- Who provided that course?
- Why did they take it?
- Did they gain any value from it?
- Did they take it during work hours or outside of work? [can be related to work or not]
What types of technology, if any, do they interact with at work? At home?
- How comfortable are they with interacting with technology? (scale of 1 – 7, 1: least comfortable; 7: most comfortable) (computers, smartphones, tablets)
- Do users have a preferred device they use for professional training or learning? (Ex: mobile vs. desktop) (ex: Compliance, CEUs, etc.)
- What, if any, other types of devices might users use beyond their preferred device?
- What platforms do users use for work-related email, calendars, and SMS? What about personal email, calendars, and SMS?
3 Results
The review of the literature and documentation provided by the MIT and Product Operations teams revealed some results. In two categories, Demographics and Technology Usage, the following findings address the outstanding questions.
Demographics
What is the general age range of our users?
The resources reviewed were not consistent on this question. According to the Mobile Product Report PowerPoint, the most commonly selected age range for learners was 51-60, chosen by 31% of respondents; the second most common was 41-50, at 23%. The median age was 40 among individual contributors, according to the State of Healthcare Training Survey Report. Finally, Google Analytics also captured age demographics, showing 25-34 as the most represented age range. While this range differs from the first two resources, it should be noted that Google Analytics does not split the data into learners versus people managers. The discrepancy between the first two resources (an age range of 51-60 versus a median age of 40) needs to be investigated further to truly understand the age range of our learners.
Average Years in Healthcare
Three of the four resources reviewed did not capture the average years worked in healthcare. The one that did was the 2023 State of Healthcare Training Survey Report. According to this study, the range with the highest selection was 0 to 5 years.
What certification types, if any, do learners hold?
To understand certification types, the Power BI Dashboard was reviewed. This data suggests that most learners add their certifications to their learner profiles. Nursing professionals had the highest selection, at 40%; non-licensed direct care professionals were second, at nearly 13%. However, the data also reflects that, on average, learners add their certification types 69 days after creating their profiles rather than at initial profile creation. Additionally, there was no data indicating whether learners updated their certification types once initially added.
The 2023 State of Healthcare Training Survey Report also referenced job roles. According to this study, the ranking was reversed among individual contributors, with non-medical care professionals ranking first at 34% and nurse professionals second at 26%. While both studies indicated nurse and non-nurse professionals as the top two categories, additional research is necessary to truly understand the correct percentages for these two groups.
None of the resources reviewed answered the following demographic questions:
- What are the learning styles of users?
- What percentage of our learner population uses assistive technology?
These questions remain open and should be included in a future study.
Motivation
Unfortunately, none of the studies posed the questions we sought to answer in the motivation category. All questions below remain outstanding and must be posed in additional research.
- When did users last take a course or program that their employer mandated?
- Which courses have users taken that their employer did not mandate?
- Have users utilized the learning received from courses not mandated by their employer?
- Who provided the course?
- In what format was that course taken?
- Why were they taken?
- Did users find the courses valuable?
- Did users take the training during or outside of work hours?
Technology
There were also some answers to the pending questions in the Technology category. Those answers address the following questions:
What types of technology do users interact with outside of work?
The only study that asked participants about their technology preferences outside of work was the Mobile Product Report. This study captured preferences for technology used for social media, with mobile devices receiving the highest share at 32%. That said, social media is not the only context in which learners may interact with technology outside of work, and the survey did not pose additional questions about usage beyond social media. The results may represent our population, but the scope of the question was limiting.
Do learners have a preferred device for professional training or learning?
According to the Mobile Product Report, 41% of learners preferred to learn on a desktop when learning content was 30 minutes long, 40% used their desktops for watching short how-to videos, and 51% preferred desktops for taking assessments or exams. This is supported by the 2023 State of Healthcare Training Survey Report, in which an overwhelming 88% of individual contributors selected desktops/laptops as the device on which they are most likely to complete online training. A review of the Mobile Overview in Google Analytics also supports this preference for desktops: over the past year, individuals used a desktop 78% of the time to access the RLMS (Relias Learning Management System). While Analytics does not differentiate between learners and administrators, these high percentages are consistent with the findings of previous studies about our learners.
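The Google Analytics figure above was read off the Mobile Overview dashboard for the Aug 2022 – Aug 2023 window. Purely as an assumption-laden sketch, and not how the original numbers were produced, the same device-category split could be pulled programmatically with the GA4 Data API, provided the site is tracked in a GA4 property and the google-analytics-data client library is installed; the property ID below is a placeholder.

```python
# Hypothetical sketch: pulling the device-category split (desktop vs. mobile vs. tablet)
# for the Aug 2022 - Aug 2023 window via the GA4 Data API instead of the dashboard.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import DateRange, Dimension, Metric, RunReportRequest

def device_category_share(property_id: str) -> dict[str, float]:
    """Return each device category's share of sessions over the review window."""
    client = BetaAnalyticsDataClient()  # uses Application Default Credentials
    request = RunReportRequest(
        property=f"properties/{property_id}",       # placeholder GA4 property ID
        dimensions=[Dimension(name="deviceCategory")],
        metrics=[Metric(name="sessions")],
        date_ranges=[DateRange(start_date="2022-08-01", end_date="2023-08-31")],
    )
    response = client.run_report(request)
    counts = {row.dimension_values[0].value: float(row.metric_values[0].value)
              for row in response.rows}
    total = sum(counts.values()) or 1.0
    return {device: count / total for device, count in counts.items()}

# Example (placeholder property ID); output shape only, values are illustrative:
# print(device_category_share("123456789"))  # e.g. {"desktop": 0.78, "mobile": 0.18, "tablet": 0.04}
```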
What, if any, other types of devices might learners use beyond their preferred devices?
Three resources, the Mobile Product Report, the 2023 State of Healthcare Training Survey Report, and Google Analytics, all reflected the same answer to this question, with mobile devices coming in second to desktops across all sources. The Mobile Product Report placed mobile usage second at 34%. In the 2023 State of Healthcare Training Survey Report, 6% of individual contributors reported using mobile devices for training. Google Analytics showed mobile device logins at 18%. All three resources showed far less mobile device usage than desktop usage, and tablets rated last in all three sources.
What platform do our learners use for work-related emails or personal emails and calendars?
The Mobile Product Report gathered feedback on how users prefer to handle governance, risk, and compliance tasks, as well as sending email and other electronic communication. While the response was overwhelmingly desktops at 54%, this may be attributable to admin users rather than learners. Learners may also not have made a distinction between work and personal email communication, given the wording of the questions posed in the survey. No other resources posed this question.
Two outstanding questions remain under the Technology category:
What type of technology do users interact with at work?
(While we may be able to make some assumptions given the overwhelming percentages cited for desktops across studies, this question was not directly asked by any of the previous studies.)
How comfortable are users interacting with technology?
These questions must be directly posed to our user base in an additional study.
4 Conclusion
It can be difficult for any member or group of the Relias team to grasp our learners' personas fully. While studies have provided some of the data for the various Relias teams, there was not one single source of truth. Gathering the data required understanding where the data lives and finding the right people to point to those resources. In some instances, extra time and effort were needed to gain access to the resources before review. The impact on our various Relias teams is that they probably would not know (a) that these resources exist, (b) how to gain access to them, and most importantly, (c) what these resources do or do not reveal about our learners. Because many Relians may not yet have been exposed to this kind of data about our learners, they may not truly understand our learner population. Essentially, they may be planning, designing, implementing solutions for, or supporting our learners in a vacuum. Our teams need one source that allows them to get to know our learners in more detail.
Next, there were some discrepancies within the data, and some questions posed by the literature review remain unanswered. An additional study is needed to fill the knowledge gaps. Additionally, while two resources gave specific breakouts of individual learners versus administrators or people managers, the other two did not have a learner-specific focus. The results with mixed audiences need to be confirmed in a learner-specific study, and the results from the Google Analytics and Power BI dashboards should be vetted against that study. As a follow-up, we can explore whether the two resources can be set up to give specific breakouts for learners versus administrators. If it is possible to separate these two populations, they may provide more support for the assertions within this review.
Finally, we need to learn more about the motivations of our learner populations. We may have gotten closer to answering some demographic questions, but other questions remain open or need clarity. We know some of our learners' technology preferences for work-related and social tasks. However, we still need answers to questions such as which courses learners have taken that an employer did not mandate, what courses they take apart from those that are required, and in what formats they have taken them. We need a well-rounded understanding of our learners before we can build true personas and design around their preferences. A complete understanding of our learners will allow us to devise better solutions that impact their work and personal lives, allowing us to better serve them as our customers.
THE OUTCOME
The literature review was a good first step and did answer a few of our outstanding questions. Additional research was necessary to answer the rest, and those remaining questions were being folded into a larger, more comprehensive study to be completed in the first quarter of 2024. Did we know our users? Not completely, but work was underway to close that gap.
THE HINDSIGHT
This was work I completed near the end of my tenure with this LMS provider. In hindsight, I wish it had been more of a starting point when I joined the organization, so all of our teams could have become truly familiar with our user population, their user journeys, and their workflows before we attempted to solve problems for them. It might have spared us from designing solutions that did not fit the personas.