
Goal

Observe whether users comprehend the GovStack methodology and Building Blocks (BBs) while engaging in the USCT simulation.

Area Of Focus

Primary Focus: USCT Simulation (Onboarding and Contextual Panel + USCT Wireframes)

  • Do participants understand the conceptual BB interaction while experiencing the simulation?

  • Understanding the different roles/perspective changes

  • Clarity of the structure and flow

  • Do participants understand the complexity and the usage of BBs throughout the whole process?

Secondary Focus

USCT Wireframes

  • Do the wireframes contain the information users expect?

  • Are the actions understandable to the user?

Contextual Panel

  • (Will be filled after design)

Onboarding page

  • Clarity of the structure of the information

  • Is any additional information needed on the onboarding page?

(Suggestion) Two test runs:
First test run: start with the onboarding page to see how long the participant stays on the page itself to grasp the information needed to start the simulation.

Second test run: start with the simulation itself, to see whether the contextual panel is sufficient and we could minimise the information needed on the onboarding page.

Participants

Identifying and Recruiting the participants

Participants should be selected based on the functional scope of the “Policy Decision Maker” persona.

Due to their industry knowledge and network, the Governance Committee and team members such as Meelis and Nico should be in charge of identifying fitting participants.

Screener

Include/exclude | Factor | Value | Comment
include | Age | 30-65 | Most decision makers have reached a certain degree of professional maturity; a higher age raises the potential fit
include | Gender | - |
include | Language | English |
include | Job | Public sector / government servants with leading responsibilities, employees of a public sector vendor, or politicians; alternatively, people whose job involves daily exposure to the IT sector and cloud (e.g. a bank manager), but no technical experts (e.g. developers) |
exclude | Knowledge | Knowledge of or exposure to GovStack as a project |

(Recruiting process should be decided)

Running the Tests

https://sensible.com/download-files/

Type of test

Remote Moderated Usability Testing

  • Usability testing through prototype

  • Qualitative/Contextual interview

Number of tests: 5 tests, with 1 or 2 internal warm-up tests beforehand (see the suggestion above regarding the onboarding page)

Tools/Set-up/Roles

Tools:

  • Figma (Wireframes)

  • Microsoft Teams (Communication)

  • Confluence (Notetaking)

Time frame: 45 min

  • Possible agenda for interview:

    • 5 min introduction

    • 5-10 min onboarding page

    • 15-20 min for USCT Simulation

    • 10-20 min for follow-up questions → Qualitative/Contextual Interview

Roles:

Will be decided later and may change between interviews.

Recruiting Participants

First Email:

Dear [Participant],

We are thrilled to invite you to participate in our upcoming user test!
(Name) forwarded us your contact information. Your feedback is valuable to us and can help us improve our product. The test will take about 45 minutes, and you can choose a time slot on one of the following dates:

13.04. and 14.04.2023

https://rallly.co/p/wNfkcDqXiyZX

The test will be conducted online via Microsoft Teams; the invitation will be sent to you after you choose a time slot.

With your consent, this session will be recorded. We want to assure you that your privacy is important to us, so we will only use the recording internally and store it for a short amount of time. Your personal information will not be shared with anyone outside the meeting.

If you have any questions, please don’t hesitate to ask.

Thank you for considering this invitation, and we look forward to hearing from you soon!

Best regards,

Gofore UX-Team

Follow Up Email:

Dear [Participant],

Thank you for choosing [Timeslot]. Here is the link to our session.

[Link]

To participate in the test, you will need a laptop with a webcam, a calm environment, and a stable internet connection. You will be testing a clickable prototype of a product.

We appreciate your time and effort.

If you have any questions, please don’t hesitate to ask.

Best regards,

Gofore UX-Team

Location/Time

13.04. - 14.04. from 9 AM to 12 PM

→ 2 Test Users from Gofore on Thursday, 06.04. and Tuesday, 12.04.


Agenda

  1. Define what you will test

  2. Set up Test Session

  3. Prepare the user and obtain their consent

  4. Test the onboarding page, tutorial and simulation

  5. Write up notes

  6. Action list


Script

Test Plan

  • User Testing Intro

  • Consent

  • Warm-up Questions

  • Prototype

    • Onboarding Page

      • Experiencing onboarding page

      • Questions

      • Task

    • Tutorial

      • Experiencing Tutorial Page

        • Depending on the user’s actions, questions will be asked (during the tutorial and simulation)

    • Simulation (Revisited)

      • Experiencing first page

      • Questions

      • Task

  • Interview regarding the whole experience (Including onboarding, simulation and tutorial)

It is important to note that the script is flexible; it will be used as a guideline.

1. User Testing Intro

To get the test started, we should thank them for taking the time and helping us with our research.

After that, we’re going to introduce ourselves with our name, company and job title. (2-3 Minutes)

Hi, _____________. My name is ___________, and I’m going to be walking you through this session today.
Before we begin, I have some information for you, and I’m going to read it to make sure that I cover everything.
You probably already have a good idea of why we asked you here, but let me go over it again briefly. We’re asking people to try using a website that we’re working on so we can see whether it works as intended. The session should take about 45 minutes.
The first thing I want to make clear is that we’re testing the site, not you. You can’t do anything wrong. We’d like you to explore the product freely and share your thoughts with us, so please speak out loud during the session.

Legal Considerations and Consent Form

Ask the legal questions and have the note-taker start the recording once the tester has given consent. (3 minutes)

Question 1: Do you have any questions about the recording process or about what will happen to the recording afterwards? → If not, I will start recording the session.

Background: Starting question, asked before the recording starts. We should ask for consent on camera.

Notes: Start the recording if the answer is yes.

Question 2: Do you have any objections to us recording this session, including your voice and face, or to my colleague taking notes on the side?

Notes: If the answer is no, continue with the questions about personal data.

Warm-up Questions

Questions: Please tell us a few details about yourself.

Warm-up questions:

  • What is your name and age?

  • Could you please tell us about your job title and the tasks of your job?

Civil servant tester question:

  • Can you briefly describe your role and responsibilities within the public sector?

Background: Gather personal information and check whether the tester fits the screener criteria.

Notes: Continue with a short onboarding text and post the link in the Teams chat. Explain what will happen next in the session.

Prototype

Questions we want to ask during the session

The user clicks on the link and we ask them to share their screen with us. Explain that the user should take the time to read through the onboarding page - this should take about 5-10 minutes.

If the user asks you how to do a task, please do not tell them, but rather ask “How do you think the task should be done?” or “What would you expect it to do?”

“Thank you! Let’s get started with the session. Please share your screen with us, so we can observe your interactions with the website/product. (Wait for them to open the page)

(First Request) Please keep in mind that this is a clickable prototype, but for now please do not click any buttons. You can scroll if you want to. Please think out loud while doing so and explain what you are seeing and what your first impression of the page is.

(Wait a bit for them to explain) (Ask questions about the onboarding page depending on what they explained)

  • Can you describe the goal of the website?

  • In your own words, please describe what the “Building Blocks” are

  • Do you have an understanding of what the “Simulation” is?

  • (Improvise)

(End of onboarding)

Please take your time and scan the website first.”

---

Each block below lists the questions and requests, the task, the background (what we want to find out) and the moderator notes for the corresponding part of the test.

Onboarding page

Questions and requests:

  1. Exposure test: What was your first impression of the page?

  2. Exposure test: Can you describe the goal of the website?

  3. In your own words, please describe what the “Building Blocks” are.

  4. Did you get an idea of what the “Simulation” is?

  5. Could you please start the simulation?

Task: Explain the simulation, the building blocks and the use cases in your own words and start the simulation.

Background: Does the onboarding page explain the simulation adequately? What information would we need to add to explain the simulation better? Is the page self-explanatory?

Notes: Wait a few moments before asking questions.

After the tutorial is completed

Questions:

  1. How did you feel about the pacing and the content/text of the tutorial?

  2. Was there anything in the tutorial that surprised you or that you didn't expect?

  3. How confident do you feel in your ability to use the simulation after completing the tutorial?

Task: Explore the tutorial.

Background: We want to check whether the user knows what to do on their own; otherwise the tutorial is not a good onboarding option.

Notes: The tutorial starts - wait until the user has clicked through the whole tutorial and only intervene if asked.

Simulation task

Questions and requests:

  → After you have experienced the simulation, please explain the building blocks in general in your own words.

  → Please explain one specific building block.

  1. How do you feel about the user interface of the simulation?

    1. What does the simulation do?

  2. Can you describe the roles and tasks in the simulation?

  3. (Ask after the first few interactions:) What tasks have you completed so far?

  4. How do you feel about the hints?

  5. How do you feel about the DIAL menu? Do you find it helpful? If yes/no, please explain why.

  6. How do you feel about the contextual change? Do you understand which perspective you are currently viewing?

  7. How do you feel about the progress bar?

  8. How would you describe your understanding of how the building blocks are used?

Task: Test the main functionalities (TBD by Artun Gürkan and Martha Vasquez).

Background: Screen background: GovStack approach and Building Blocks. Ask in between: Can you walk me through your thought process as you completed the tasks? How “good” do the UI elements feel? Did the information come across as planned? Does the perspective/context change work? Does the progress bar work? If not, it can perhaps be removed from the layout.

Notes: Ask these questions after the user has used the simulation for a few actions. How useful are the overlays? This part should take about 10-15 minutes.

End Questions:

  1. What are the most memorable tasks and features of this simulation for you?

  2. What would make your experience on the simulation better?

  3. Follow-up Tutorial question: After experiencing the simulation - did you feel well introduced to it by the website and tutorial?

  4. Would this simulation support you in making a decision regarding GovStack?

    1. If not, what feature/information was missing?

Ask the user to close the prototype and stop the screen sharing, so we can ask them a couple more questions.

After the test - stop the recording and say

“Thank you very much for your participation, your feedback really helps us improve our product.”

After (To-Do List)

  • create a task list and go through it item by item

    • if there are a lot of highlights on that list, we know which feature to redesign

  • evaluate your own performance

  • Create a to-do list with action items

    • improvements and why → refer to the task and question of the session

    • explain the problem in a “How might we” - question

    • add possible suggestions

    • should/could be improved section

  • create follow-up Jira tasks
