
User-Centred Design of a Pervasive Interface
Motivation
The motivation behind this study comes from a request made by the organisation FDM to create a visualisation tool for a set of anonymised data. This work is focussed on the design and creation of an intuitive data visualisation tool that encompasses all of the features requested by FDM. Currently, there is no easy way for the employees of FDM to gain useful insight from the database, leading to the data being ignored. The organisation requested a solution that provides an easy way to generate an outcome giving them the information they need to make more informed business decisions. Therefore, the visualisation tool will be developed with the user-centred design process in mind. The tool created at the end of the project will draw on the knowledge gained from the user studies to ensure that it is easy to navigate and usable by all employees.
Methodology
As defined by the ISO (the International Organization for Standardization), the human-centred approach to development aims to make any system created useful by focussing on the users’ needs and requirements and by utilising usability techniques (ISO 2010). By incorporating users throughout the development process, the complete system should be moulded to fit the users’ needs and abilities. Without any user involvement, the complete product may stray from the original focus of ease of use for the employees. The main feature examined in the research is whether or not the user is able to generate a visualisation, with any data or chart. Although alternative pathways to creating the final graph exist, I will be looking at whether the user uses the most efficient one.
In order to ensure that the user is considered throughout the development process, observational user testing is to be carried out at the low-fidelity and mid-fidelity prototype stages. Observational user testing was selected because, in comparison to usability testing alone, additional information can be gathered on top of the measures recorded. Through observation, the user’s feelings towards the system can be learned from their commentary, which can be helpful in detecting problems. For example, when a user makes a verbal expression of confusion, we can assume that this section of the system is a problem area, even if the user does not make a direct error. The next iteration of the prototype can then be altered to ensure that other users do not become stuck at the same point. Without observing the users as they test the prototype, there is a chance that their true feelings could be missed. By video recording the sessions as well, a transcript of the tests can be made for future analysis and reference.
For this study, quantitative and qualitative measures will be recorded. The quantitative measure to record is when the user makes an error or takes the wrong pathway. The optimal pathway is defined as the user generating the visualisation without having to use the help page, logging out, or asking for assistance. Consulting help documentation does not reflect poorly on the users’ abilities; however, as one of the aims for the system is that the user will be able to generate visualisations easily, the design should be intuitive enough for even a naïve user to use it straight away. Users will be asked about their feelings towards the layout and their overall satisfaction with the prototype once they have completed the test, which will help to give insight into thoughts that may not have been expressed during the test. The qualitative measures recorded will be any expression of confusion by the user (facial or vocal), any expression of wanting to give up, whether the user pauses or seems stuck on the task at any point, and requests for advice on how to do the task. The higher the number of measures recorded, the less effective the design, leading to more alterations being made in the next iteration. Using a combination of qualitative and quantitative measures ensures that some of the limitations of each method are offset by the other. For example, from a user who was easily able to generate the visualisation but verbally expressed their discontent, we will have recorded extra information not captured by a single measure. The number of users chosen was 5; this allowed me to conduct a more in-depth study given the limited time frame.
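As a purely illustrative sketch of how these observational measures could be tallied and compared across prototype iterations, the following TypeScript outline records the counts per participant; the type and function names are assumptions made for this example and are not part of the study materials.

// Hypothetical tally of observational measures for one participant.
interface SessionMeasures {
    confusionExpressions: number;  // facial or vocal expressions of confusion
    giveUpExpressions: number;     // expressions of wanting to give up
    pauses: number;                // times the user paused or seemed stuck
    adviceRequests: number;        // times the user asked how to complete the task
    layoutRating: number;          // 1-5 rating of the layout
    satisfactionRating: number;    // 1-5 overall satisfaction rating
}

// A simple severity total: the higher the total, the less effective the design.
function severityScore(m: SessionMeasures): number {
    return m.confusionExpressions + m.giveUpExpressions + m.pauses + m.adviceRequests;
}

// Mean severity across all participants, used to compare one iteration with the next.
function meanSeverity(sessions: SessionMeasures[]): number {
    return sessions.reduce((sum, m) => sum + severityScore(m), 0) / sessions.length;
}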
A cognitive walkthrough, shown below, was carried out before conducting the user tests, to discover where users may encounter problems. From the walkthrough we can estimate that the user will most likely be able to navigate the website without issue, but may have trouble creating the graph itself. As this is not a process that users will already be familiar with, unlike logging into a website, they will have to learn how to do it. Users will have the option to go to the help page for more information; however, this takes them away from the most efficient path.

Research
Paper Prototype
A pen and paper approach was used for the initial low-fidelity prototype. A limitation of a paper prototype is the difficulty of depicting all of the features the complete system will contain; therefore, a simple design that imitates the desired functionality was used. A benefit of using a paper prototype is the flexibility to change and redesign prototypes quickly, giving the opportunity to look at alternative designs without spending too long producing them. The prototype works by the experimenter manually changing the pages after the user ‘clicks’.
Prior to beginning the study, users were shown a user instruction document that outlined what the task was and how to use the prototype. They were also asked to sign a consent form stating that they were happy for their details to be used within the research. Each test was recorded on video, or photographs were taken. Once they had completed the task, users were asked to fill out a form rating their feelings towards the layout of the prototype and their overall satisfaction with it. Responses were given on a 1-5 rating scale, with 1 being completely negative and 5 being very positive. Users were also given the option to provide any additional feedback about the prototype.
User Evaluation
All users were able to create a visualisation from the paper prototype; however, some users did encounter bouts of confusion during their tests. Two users expressed confusion when they reached the visualisation page of the prototype, unsure of how to select the data they wanted; both opted to tap randomly on one of the text boxes rather than selecting the data with the tick box. Additionally, on three counts, users asked for advice on how to select the data for the visualisation, one user asking, “I would tick this, yes?”. Across all of the user tests combined, there were 8 counts of users pausing and not doing anything with the prototype. For all users, this occurred on the visualisation page. For some of this time the users were reading the instructions for what to do; after this, however, they showed signs of hesitation and confusion. One user hesitantly asked whether they should be selecting an attribute or not, and another remarked that “that is confusing” when the page was shown to them.



Requirements
Overall the design used for this prototype was somewhat successful, as all users could create a visualisation. However, 4 of the users only managed to create the visualisation after learning that they could not continue without selecting all of the necessary data. Even though the users were able to create visualisations this way, we do not know whether they would have been able to without prompts. The assumption could be made that a user navigating a system with the same design as the prototype would turn to the help page once they realised they had reached a point where they could not continue. This demonstrates that the prototype design is not intuitive; the FDM employees should be able to create a visualisation quickly and efficiently, without having to look for assistance. For the next prototype, the design will be altered to ensure that a novice user will be able to use the system straight away.
The most consistently highlighted issue with the paper prototype was the data selection process on the visualisation page. All but one user had trouble with this portion of the prototype. Only two users realised that the square boxes next to the data objects were tick boxes and that, in order to select an object, the user had to tick its box. This feedback suggests that the data selection form is not intuitive, as it caused confusion for more than one user.
For the next iteration of the prototype, the tick boxes will be removed, and the data selection will be done through a series of dropdown boxes with clearer instruction labels, so that the user will be instantly prompted on how to select the data they want. Additionally, as the limitations of a static paper prototype do not apply to a digital prototype, the methods of data selection can become more interactive, and users will be able to select ‘actual’ data that affects the following selections. As none of the users used the help page, no feedback was given on it, and it will not be modified greatly for the next iteration; by improving the visualisation page in the digital prototype, the likelihood of the user needing to look at the help page should decrease.
Digital Prototype
After evaluating the feedback from the paper prototype, the next iteration, a mid-fidelity prototype, was created using the web wire-framing tool UXPin. The aim was to mimic the look of an actual website, so that the experience for the user was more in line with how using the complete system would feel. When designing this prototype, Shneiderman’s ‘Eight Golden Rules of Interface Design’ were taken into consideration (Shneiderman 1998), so that the system could be as aesthetically pleasing for the user as possible. The first rule states that the system should be consistent, which includes vocabulary, colours, layout, and fonts. Each page of the prototype kept the same colours, fonts, and layout, in order to maintain consistency and give the impression of pages belonging to one website. The colours selected were derived from the organisation’s house style and were kept the same on every page.
From the findings of the previous iteration, the tick boxes were removed and dropdown boxes were added in their place. Additionally, the nature of the data selection boxes was altered: the option to narrow down the data used in the visualisation was added, with the user being able to select a condition and operator. This was added to bring the prototype closer to the desired functionality of the final system. The other pages were kept mostly the same, except for the help page, where clearer instructions with visual aids were added.
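As a minimal sketch of the kind of selection the dropdown boxes imply (a chart type, an attribute, and an optional operator and condition for narrowing the data), the following TypeScript outline shows one possible shape for a completed form; the field names, chart types, and example values are assumptions for illustration only.

// Hypothetical shape of a completed data-selection form.
type Operator = '=' | '>' | '<' | '>=' | '<=';

interface VisualisationRequest {
    chartType: 'bar' | 'line' | 'pie';  // chart chosen from the first dropdown
    attribute: string;                  // data attribute to visualise
    operator?: Operator;                // optional operator for narrowing the data
    condition?: string | number;        // value that the operator compares against
}

// Example: a bar chart of records where a hypothetical 'salary' attribute exceeds 30000.
const exampleRequest: VisualisationRequest = {
    chartType: 'bar',
    attribute: 'salary',
    operator: '>',
    condition: 30000,
};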
The prototype demonstrates the basic functionality of the complete system, much like the low-fidelity prototype; however, it bears more similarity to an actual website. The selected wire-framing tool allows the prototype to simulate the functionality and navigation of an actual website, which should increase user engagement, as users will feel as though they are interacting with a real website. The prototype can be found at the following link.
User Evaluation
The feedback from the digital prototype was much improved compared with its predecessor. There were significantly fewer counts of users pausing and appearing stuck on the task; only one user found themselves stuck on a portion of the prototype. They explained in their feedback form that the ‘select operator option was confusing’, and that ‘it didn’t make much sense when [they] first clicked on it’, and suggested reversing the order of the operator and condition options. There was also a decrease in the instances where users had to ask for help on how to complete the task. One of the limitations of using the digital prototype software was that the user was able to create a chart without selecting all of the necessary data; one user created their chart without selecting either an operator or a condition.



The results from the digital prototype demonstrate that the design implemented was more successful in allowing users to efficiently create a visualisation. Fewer errors were made, and users gave more positive feedback; one user said that they liked the organisation of the site and that the prototype was ‘easy to follow’. However, one limitation of using the UXPin tool was that there was no way to ensure that the user completed the whole data selection form before a chart was created. In the final system for FDM, the data the user selects would be pulled from a database and displayed; if the user does not fill out the entire form, the chart would not be generated, as there would be no data to display. In a future high-fidelity prototype, a validation feature would be added to mimic this behaviour of the actual system. Additional instructions on the visualisation page would also be added so that the user knows that all data fields need to be completed before the visualisation can be generated.
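A minimal sketch of such a validation step is given below, assuming a form with the chart type, attribute, operator, and condition fields described above; the function and field names are illustrative and not the final system’s API.

// Hypothetical check that every required field is completed before a chart is generated.
interface SelectionForm {
    chartType?: string;
    attribute?: string;
    operator?: string;
    condition?: string | number;
}

function validateSelection(form: SelectionForm): string[] {
    const errors: string[] = [];
    if (!form.chartType) errors.push('Please choose a chart type.');
    if (!form.attribute) errors.push('Please select a data attribute.');
    if (!form.operator) errors.push('Please select an operator.');
    if (form.condition === undefined || form.condition === '') {
        errors.push('Please enter a condition value.');
    }
    return errors;  // an empty list means the visualisation can be generated
}

// The prototype would only generate the chart when no errors are returned, and would
// otherwise display the messages next to the relevant dropdowns.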