Hi-Audio — User Evaluation Results

Preliminary user evaluation of the Hi-Audio online platform for collaborative audio recording. Results from 22 participants, including task-based performance, SUS, and NASA-TLX metrics.

View the project on GitHub: idsinge/hiaudio_user_evaluation_results

Supplementary Materials


Overview

This document provides participant-level supplementary data for the Hi-Audio user evaluation study. Participants are identified by an anonymous label (P01–P22) that corresponds to their entry in the anonymized per-participant document docs/Hi-Audio_Participant_Comments_Anonymized.md. All personally identifiable information has been replaced with JotForm Submission IDs.

Scoring procedures:


Participant Mapping

| Participant | Submission ID | Hi-Audio UUID | Recordings |
|---|---|---|---|
| P01 | 6416169389419749315 | LhX24erm7R7i8rxSRWah76 | link |
| P02 | 6409432612171376683 | FSrnB4DPjFP5EwgHjekyMM | link |
| P03 | 6416147786326656385 | eMWdcY56whcMayiF5oZJHG | link |
| P04 | 6420720104814184671 | Bcg7LsTy8uDRaDTtTkJEWw | link |
| P05 | 6409505034176280614 | 3D76kkBoqxofHfut3dQHTj | link |
| P06 | 6408690289642566299 | ZcnHQypCweDenC3XuWj5yS | link |
| P07 | 6413564565214864018 | 3nLNBNXbCVtEdosGYKuHeE | link |
| P08 | 6417239988428702414 | eUtQHBH5jUDj3eCsvLmQkn | link |
| P09 | 6417715954425401224 | 7G9NTxoHauPEvkdkrjuQLv | link |
| P10 | 6412896872714817218 | mqiFEK4x9WrB27Rewij6oT | link |
| P11 | 6413633065228695943 | 3ythJ29PSrH4PAkv7mP36E | link |
| P12 | 6421178822523703074 | FMuW7UiGACpde4dqykt3tJ | link |
| P13 | 6419627128718388274 | YDAp2nHnLnyLTPB49frora | link |
| P14 | 6421404144652090461 | BVTm5sb6AsiaKnmwX7L5F8 | link |
| P15 | 6421684511428984688 | BSXwAVCNL8t5ECYbEgwdRd | link |
| P16 | 6420455651028514320 | ZKskFBE7GN84uR4DvRi4C7 | link |
| P17 | 6419757055218539765 | PNX5yjFpNGd34BUi9Tn2qs | link |
| P18 | 6420494692118337517 | L6F282RuqPipYu7AjQQR3c | link |
| P19 | 6420480828325347605 | n7rpkV4acHLCMER6khzCgh | link |
| P20 | 6421275260197236995 | UhkW6SWdK55esHa5UBF3A7 | link |
| P21 | 6420550821525052209 | nSiXgfRrWFFHW4beKWpYQ3 | link |
| P22 | 6420655997663445267 | CbGMhcPyQQaonPsBoRWAjx | link |

Per-Participant Task Performance

Y = completed successfully, N = not completed.

| Participant | T1 | T2 | T3 | T4 | T5 | T6 | T7 | T8 | T9 | T10 | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|
| P01 | N | N | N | N | Y | Y | Y | Y | Y | N | 5/10 |
| P02 | N | N | N | Y | Y | Y | Y | Y | Y | Y | 7/10 |
| P03 | Y | N | Y | Y | Y | Y | Y | Y | Y | N | 8/10 |
| P04 | N | N | Y | Y | Y | Y | Y | Y | Y | Y | 8/10 |
| P05 | Y | N | N | N | Y | N | N | N | Y | N | 3/10 |
| P06 | Y | N | Y | Y | Y | Y | Y | Y | Y | N | 8/10 |
| P07 | N | N | N | N | Y | Y | N | Y | Y | Y | 5/10 |
| P08 | Y | Y | N | Y | Y | Y | Y | Y | Y | Y | 9/10 |
| P09 | Y | N | Y | Y | Y | Y | Y | Y | Y | Y | 9/10 |
| P10 | Y | N | N | Y | Y | Y | N | N | Y | N | 5/10 |
| P11 | Y | Y | N | Y | Y | Y | Y | Y | Y | N | 8/10 |
| P12 | Y | N | Y | Y | Y | Y | Y | N | Y | N | 7/10 |
| P13 | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 10/10 |
| P14 | Y | N | Y | Y | N | Y | N | Y | Y | Y | 7/10 |
| P15 | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 10/10 |
| P16 | N | N | Y | Y | Y | Y | Y | Y | Y | N | 7/10 |
| P17 | N | Y | Y | N | Y | Y | Y | Y | Y | Y | 8/10 |
| P18 | Y | Y | Y | Y | Y | Y | Y | Y | Y | N | 9/10 |
| P19 | Y | Y | Y | N | Y | Y | N | N | Y | N | 6/10 |
| P20 | Y | N | Y | Y | Y | Y | N | N | Y | N | 6/10 |
| P21 | Y | N | N | Y | Y | N | Y | Y | Y | Y | 7/10 |
| P22 | N | N | N | Y | Y | Y | Y | Y | Y | N | 6/10 |

Individual SUS Scores

Raw item responses (1–5 scale) and computed SUS score (0–100). Odd-numbered items are positively worded; even-numbered items are negatively worded (reversed in scoring).

| Participant | Q1 | Q2 | Q3 | Q4 | Q5 | Q6 | Q7 | Q8 | Q9 | Q10 | SUS Score |
|---|---|---|---|---|---|---|---|---|---|---|---|
| P01 | 3 | 3 | 2 | 3 | 3 | 3 | 2 | 3 | 2 | 3 | 42.5 |
| P02 | 3 | 2 | 2 | 3 | 4 | 2 | 2 | 2 | 2 | 3 | 52.5 |
| P03 | 1 | 4 | 1 | 1 | 3 | 4 | 1 | 5 | 1 | 3 | 25.0 |
| P04 | 3 | 4 | 2 | 3 | 2 | 4 | 1 | 5 | 2 | 3 | 27.5 |
| P05 | 3 | 2 | 5 | 3 | 3 | 3 | 4 | 1 | 4 | 1 | 72.5 |
| P06 | 2 | 1 | 3 | 1 | 4 | 2 | 4 | 3 | 5 | 2 | 72.5 |
| P07 | 2 | 2 | 4 | 3 | 3 | 2 | 4 | 2 | 3 | 1 | 65.0 |
| P08 | 1 | 1 | 4 | 1 | 4 | 2 | 5 | 1 | 5 | 1 | 82.5 |
| P09 | 3 | 4 | 1 | 2 | 3 | 5 | 1 | 5 | 3 | 2 | 32.5 |
| P10 | 1 | 3 | 1 | 4 | 3 | 2 | 1 | 4 | 3 | 1 | 37.5 |
| P11 | 3 | 1 | 5 | 1 | 5 | 1 | 5 | 5 | 4 | 1 | 82.5 |
| P12 | 3 | 2 | 3 | 2 | 3 | 2 | 1 | 3 | 2 | 1 | 55.0 |
| P13 | 3 | 3 | 2 | 1 | 2 | 4 | 5 | 3 | 2 | 1 | 55.0 |
| P14 | 3 | 2 | 4 | 1 | 4 | 2 | 4 | 2 | 4 | 1 | 77.5 |
| P15 | 2 | 1 | 4 | 1 | 2 | 1 | 4 | 3 | 3 | 2 | 67.5 |
| P16 | 3 | 1 | 3 | 2 | 4 | 2 | 5 | 3 | 4 | 3 | 70.0 |
| P17 | 3 | 4 | 3 | 2 | 2 | 2 | 1 | 2 | 3 | 4 | 45.0 |
| P18 | 4 | 1 | 3 | 1 | 5 | 1 | 4 | 3 | 2 | 1 | 77.5 |
| P19 | 1 | 2 | 3 | 2 | 4 | 2 | 1 | 1 | 4 | 4 | 55.0 |
| P20 | 1 | 5 | 2 | 5 | 1 | 3 | 4 | 5 | 1 | 5 | 15.0 |
| P21 | 2 | 2 | 4 | 4 | 2 | 2 | 4 | 2 | 1 | 4 | 47.5 |
| P22 | 3 | 2 | 3 | 2 | 3 | 1 | 2 | 1 | 4 | 1 | 70.0 |
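The SUS Score column can be reproduced from the raw responses using the standard scoring rule described above. A minimal sketch in Python (`sus_score` is an illustrative helper, not part of the platform):

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten raw
    item responses on a 1-5 scale. Odd-numbered (positively worded)
    items contribute (response - 1); even-numbered (negatively
    worded) items are reversed and contribute (5 - response). The
    summed contributions are multiplied by 2.5."""
    assert len(responses) == 10
    total = sum(
        (r - 1) if i % 2 == 1 else (5 - r)
        for i, r in enumerate(responses, start=1)
    )
    return total * 2.5

# P01's responses from the table above:
print(sus_score([3, 3, 2, 3, 3, 3, 2, 3, 2, 3]))  # 42.5
```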

Individual NASA-TLX Scores

Raw dimension scores (0–100 scale, as answered by participants) and the computed Raw NASA-TLX score, i.e. the unweighted mean of the six dimensions. Own Performance is shown here as the raw survey value (0 = low, 100 = high perceived performance); it is inverted (100 − value) before computing the Raw Score, so that higher values indicate greater workload across all dimensions.

| Participant | Mental | Physical | Temporal | Own Perf. | Effort | Frustration | Raw Score |
|---|---|---|---|---|---|---|---|
| P01 | 52 | 33 | 30 | 54 | 28 | 21 | 35.0 |
| P02 | 50 | 5 | 30 | 40 | 75 | 60 | 46.7 |
| P03 | 30 | 12 | 25 | 79 | 57 | 87 | 38.7 |
| P04 | 59 | 0 | 50 | 91 | 66 | 92 | 46.0 |
| P05 | 19 | 13 | 43 | 50 | 64 | 27 | 36.0 |
| P06 | 80 | 15 | 31 | 89 | 69 | 16 | 37.0 |
| P07 | 65 | 21 | 31 | 41 | 50 | 71 | 49.5 |
| P08 | 0 | 0 | 10 | 100 | 0 | 0 | 1.7 |
| P09 | 72 | 29 | 27 | 23 | 10 | 77 | 48.7 |
| P10 | 75 | 0 | 50 | 100 | 0 | 25 | 25.0 |
| P11 | 50 | 53 | 0 | 82 | 21 | 0 | 23.7 |
| P12 | 90 | 1 | 37 | 75 | 72 | 59 | 47.3 |
| P13 | 65 | 10 | 20 | 65 | 70 | 85 | 47.5 |
| P14 | 17 | 4 | 21 | 82 | 18 | 15 | 15.5 |
| P15 | 30 | 20 | 20 | 80 | 30 | 30 | 25.0 |
| P16 | 37 | 7 | 60 | 11 | 19 | 46 | 43.0 |
| P17 | 77 | 11 | 25 | 74 | 29 | 77 | 40.8 |
| P18 | 65 | 0 | 79 | 100 | 21 | 55 | 36.7 |
| P19 | 81 | 38 | 16 | 16 | 55 | 45 | 53.2 |
| P20 | 73 | 45 | 11 | 11 | 12 | 9 | 39.8 |
| P21 | 83 | 0 | 28 | 0 | 75 | 80 | 61.0 |
| P22 | 26 | 11 | 16 | 33 | 14 | 2 | 22.7 |
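The Raw Score column follows the unweighted ("Raw TLX") convention described above. A minimal sketch in Python (`raw_tlx` is an illustrative helper; table values are rounded to one decimal):

```python
def raw_tlx(mental, physical, temporal, performance, effort, frustration):
    """Raw (unweighted) NASA-TLX workload score: the mean of the six
    dimension scores (0-100), with Own Performance inverted
    (100 - value) so that higher values indicate greater workload
    on every dimension."""
    dims = [mental, physical, temporal, 100 - performance, effort, frustration]
    return sum(dims) / len(dims)

# P01's dimension scores from the table above:
print(round(raw_tlx(52, 33, 30, 54, 28, 21), 1))  # 35.0
```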

Latency Measurements

Self-reported round-trip latency from the platform’s built-in estimation tool (Task T5). Values include the signal-to-noise ratio (dB) where available. Notes reflect conditions observed in recordings or participant comments.

| Participant | Measurement |
|---|---|
| P01 | 481.48 ms, ratio 22.57 dB (probably Bluetooth earphones) |
| P02 | 132.69 ms, ratio 32.80 dB |
| P03 | 166.27 ms, ratio 29.23 dB |
| P04 | 313.13 ms, ratio 26.65 dB |
| P05 | 218.65 ms, ratio 38.52 dB (tracks not well synced) |
| P06 | 186.71 ms, ratio 35.07 dB |
| P07 | 50.50 ms, ratio 31.54 dB |
| P08 | 217.92 ms |
| P09 | ERROR (test was run but failed due to an incorrect procedure) |
| P10 | 137.05 ms, ratio 29.48 dB |
| P11 | 47.94 ms |
| P12 | 597.75 ms, ratio 19.33 dB (no headphones used) |
| P13 | 206.92 ms, ratio 19.83 dB |
| P14 | Not provided (no earphones used; feedback/echo in recording) |
| P15 | 176.71 ms |
| P16 | 119.82 ms, ratio 28.63 dB |
| P17 | 304.77 ms, ratio 28.94 dB (Android phone, probably Bluetooth headset) |
| P18 | 211.32 ms, ratio 23.91 dB |
| P19 | 469.33 ms, ratio 27.06 dB (Bluetooth buds) |
| P20 | 272.10 ms |
| P21 | 92.63 ms, ratio 33.20 dB |
| P22 | 144.52 ms, ratio 23.04 dB |
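The internals of the platform's estimation tool are not documented here, but round-trip latency tools of this kind typically play a probe signal and cross-correlate it with the microphone capture; on that reading, the reported "ratio" acts as a peak-to-background confidence measure. A hypothetical sketch under those assumptions (`estimate_latency` and its ratio definition are illustrative, not the platform's actual implementation):

```python
import numpy as np

def estimate_latency(playback, recording, sample_rate):
    """Estimate round-trip latency by cross-correlating the played
    probe signal with the microphone capture. Returns the latency in
    milliseconds and a peak-to-background ratio in dB (an assumed
    stand-in for the platform's reported 'ratio')."""
    corr = np.correlate(recording, playback, mode="full")
    # Offset of the probe within the recording, in samples
    lag = int(np.argmax(np.abs(corr))) - (len(playback) - 1)
    latency_ms = 1000.0 * lag / sample_rate
    # Sharper correlation peaks relative to background -> higher dB
    peak = np.max(np.abs(corr))
    background = np.median(np.abs(corr)) + 1e-12
    ratio_db = 20.0 * np.log10(peak / background)
    return latency_ms, ratio_db
```

Under this model, a low ratio (such as P12's 19.33 dB without headphones) would correspond to a less distinct correlation peak and hence a less trustworthy latency estimate.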

Qualitative Feedback

Anonymized comments from the post-task survey ([Survey]) and follow-up email exchanges ([Email]). Evaluator notes (observations from recording review) are shown in italics.

P01

[Survey]

P02

[Email]

[Survey]

P03

[Survey]

P04

[Survey]

P05

Evaluator notes:

(Tracks not well synced; the latency may have been estimated incorrectly, or the measurement simply performed poorly.)

[Email]

[Survey]

P06

Evaluator notes:

Conclusion: he confirms he added the invite email “hiaudioparis@gmail.com”, but in the end no collaborator was present, which suggests the invitation process was not clear enough and that he did not confirm in the pop-up dialog.

[Survey]

P07

[Survey]

P08

[Survey]

P09

[Survey]

P10

[Survey]

P11

[Survey]

P12

Evaluator notes:

(He reports that no headphones were used.)

[Survey]

P13

[Survey]

P14

Evaluator notes:

No latency value was provided. The recordings show an audible delay, and no earphones were used, so there is feedback (echo) in the recording.

[Survey]

P15

[Survey]

P16

Evaluator notes:

He managed to change the title in collaboration, but no audio was recorded; the track may have been deleted before it was synced with the server.

[Survey]

P17

Evaluator notes:

(On an Android Pixel phone, probably with a Bluetooth headset: Marshall Major IV.) He did record Au clair de la Lune, but he created a new collection and composition for it himself.

[Survey]

P18

[Survey]

P19

Evaluator notes:

He used “Bluetooth buds”. He recorded Au clair de la lune outside the main collection.

[Email]

[Survey]

P20

Evaluator notes:

She recorded Au clair de la lune outside the main collection

[Survey]

P21

Evaluator notes:

Good estimation of the latency (metronome recorded). The participant cloned both templates for Happy Birthday (language and instruments) but did not record anything inside either of them.

[Survey]

P22

[Survey]