
Body Analysis

Company: Advanced Health Intelligence (AHI)

Complexity: 9.7222/10

Fun factor: 8.334/10

Project details

  • Platform: iOS, Android
  • Design system management: Supernova.io
  • Design system: Material 3, UCDL
  • Product type: B2B SaaS / SDK
  • Design tool: Figma
  • Roles: Chief Design Officer, UX Researcher, Product Manager, Product Lead, Customer Support, Video [Director, Producer, Editor].

What is it, and what did you do?

  • Accurate, repeatable body composition and circumference measurements using a smartphone.
  • Multiple responsibilities: UX & market research, user testing, developer handoff, prototypes, app design, scan results, developer docs, product docs, marketing assets (print, digital), video guides/promos, promotional decks, multi-day film shoot, training.

Project context

  • Developed a mobile-based body scanning feature leveraging computer vision to assess user posture and body composition in real-time.
  • Aimed to bridge the gap between clinical physical assessments and accessible at-home monitoring for the health and wellness vertical.

Biometric outputs

  • Chest, hip, waist, and thigh circumferences; body fat percentage; waist-hip ratio; waist-height ratio; obesity risk; central obesity risk.
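As a rough illustration of how the ratio-based outputs above relate to the risk flags, here is a minimal sketch. The thresholds follow commonly cited screening guidance (e.g. the 0.5 waist-to-height rule of thumb); the actual AHI model logic is not public, so treat function names and cut-offs as assumptions.

```python
# Illustrative only: thresholds and names are assumptions, not AHI's SDK logic.

def waist_hip_ratio(waist_cm: float, hip_cm: float) -> float:
    return waist_cm / hip_cm

def waist_height_ratio(waist_cm: float, height_cm: float) -> float:
    return waist_cm / height_cm

def central_obesity_risk(whtr: float) -> str:
    # WHtR >= 0.5 is a widely used screening cut-off for central obesity.
    return "elevated" if whtr >= 0.5 else "typical"

print(round(waist_hip_ratio(85, 100), 2))     # 0.85
print(round(waist_height_ratio(85, 175), 2))  # 0.49
print(central_obesity_risk(85 / 175))         # typical
```

In practice these ratios are derived from the circumference outputs of the scan itself, which is why measurement repeatability matters so much downstream.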

Body Analysis: Involvement and updates

I researched and designed the technology through all major milestones, including:

  1. Two-person experience → single-person experience.

    • New guide, Phone Alignment (interactive UI), staged countdowns, error states, failure states, cloud segmentation.
  2. Static capture and outline → dynamic capture and scaled outlines.

    • On-device pose checking, phone height detection, real-time messaging (and errors), failure states, on-device segmentation.
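The on-device pose checking and real-time messaging mentioned above can be sketched as a simple rule over 2D keypoints from a pose model. Keypoint names, angle thresholds, and message wording here are illustrative assumptions, not the AHI implementation:

```python
# Hypothetical on-device pose check: given 2D keypoints (image coordinates,
# y grows downward), verify the arms are roughly in an A-pose and emit a
# real-time guidance message. All thresholds are illustrative.
import math

def arm_angle_deg(shoulder, wrist):
    # Angle of the shoulder-to-wrist vector below horizontal.
    dx = wrist[0] - shoulder[0]
    dy = wrist[1] - shoulder[1]
    return math.degrees(math.atan2(dy, abs(dx)))

def pose_message(keypoints):
    angle = arm_angle_deg(keypoints["left_shoulder"], keypoints["left_wrist"])
    if angle < 20:
        return "Lower your arms slightly"
    if angle > 70:
        return "Raise your arms slightly"
    return "Hold still"

print(pose_message({"left_shoulder": (0.4, 0.3), "left_wrist": (0.25, 0.45)}))
```

Running checks like this per-frame on-device is what enables the real-time messaging and failure states without sending imagery to the cloud.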

This meant working closely with engineering, ensuring the UX improved whilst accuracy and repeatability were maintained. Whilst the front end underwent significant change, the AI models also moved from cloud to on-device, so that no images leave the device.

Screenshots: Version 1x, MVP (v2), Dev-v2

Use cases & user archetypes

  • The biometric outputs support obesity prediction, placing the product directly in digital health, telehealth, and insurance pipelines.
  • A less effective use case is apparel and fitness.
  • It can also be combined with other scans (like the BHA) and patient data to contribute to further predictive health markers.

Major challenges & constraints

  • Environmental variance: Mitigating computer vision failures caused by poor domestic lighting and low-contrast clothing against complex backgrounds.
  • Privacy & trust: Overcoming user hesitation regarding capturing and processing semi-nude or form-fitting imagery on a cloud-based architecture.
  • Instructional clarity: designing an intuitive guidance system (visual and haptic) to ensure users stand at the correct distance and angle without frustration.

Body Analysis App - sample screenshot

UX design & research frameworks

  • Technology Acceptance Model (TAM): Utilised to analyse and optimise perceived usefulness and ease of use, directly influencing the onboarding flow design.
  • Nielsen’s 10 Usability Heuristics: Applied ‘Match between system and the real world’ to align scanning instructions with natural human mirroring behaviours.
  • Double Diamond Process: Strictly followed the Discover/Define phases to narrow the MVP scope from ‘full medical diagnosis’ to ‘wellness indicators’.
  • System Usability Scale (SUS): Conducted post-testing analysis yielding a score of 82, validating the iterative improvements to the scanning reticle UI.
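For readers unfamiliar with SUS, the reported score of 82 comes from the standard ten-item scoring formula (Brooke, 1996). A minimal sketch of that calculation, assuming 1–5 Likert responses:

```python
# Standard SUS scoring: odd-numbered items contribute (response - 1),
# even-numbered items contribute (5 - response); the sum is scaled by 2.5
# to give a 0-100 score.

def sus_score(responses):
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses):
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0 (best possible)
```

A score of 82 sits well above the commonly cited average of 68, which is why it was treated as validation of the reticle UI iterations.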

Outcomes

  • Achieved a 40% reduction in scan failure rates through the implementation of real-time AR guidance.
  • Validated the ‘privacy-first’ local processing model, which tested significantly higher for user trust during qualitative interviews.
  • New opportunities: “Privacy mode