(App screenshots / hardware prototype photo coming soon)
Why I built this
I wear a bunch of trackers and I'm still confused about my health data. There's a gap between 'your resting HR was 62 bpm' and 'here's what that means for you today.' I wanted to close it.
Overview
Beet is a real-time health data aggregator that pulls from multiple wearable sensors and surfaces meaningful, actionable insights — not just raw numbers. It processes signals at the edge for low-latency feedback and works offline-first.
The Problem
What's broken
Wearable health apps drown users in metrics without context. Data is siloed across devices, and the insights layer is usually shallow or nonexistent. Most apps just repackage manufacturer data.
The Solution
What I built
A unified aggregation layer that reads from multiple sources (Apple Health, BLE devices, custom sensors), processes locally for speed and privacy, and surfaces insights through a minimal, calm iOS interface.
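A minimal sketch of what such an aggregation layer could look like. All names here (`BiometricSample`, `Aggregator`, the source cases) are illustrative, not Beet's actual API; the point is a single chronologically ordered timeline merged from multiple sources.

```swift
import Foundation

// Illustrative source tags — one per ingestion path mentioned above.
enum SampleSource {
    case healthKit
    case blePeripheral(name: String)
    case customSensor
}

// One normalized record, regardless of where the reading came from.
struct BiometricSample {
    let metric: String      // e.g. "restingHeartRate"
    let value: Double
    let unit: String        // e.g. "bpm"
    let timestamp: Date
    let source: SampleSource
}

// Merges streams from multiple sources into one sorted timeline,
// so the insights layer can scan a single chronological stream.
final class Aggregator {
    private(set) var samples: [BiometricSample] = []

    func ingest(_ sample: BiometricSample) {
        // Insert in timestamp order; sources deliver out of sync.
        let idx = samples.firstIndex { $0.timestamp > sample.timestamp } ?? samples.endIndex
        samples.insert(sample, at: idx)
    }
}
```

Keeping everything in one ordered stream on-device is also what makes the privacy and offline-first claims cheap to uphold: no per-source cloud sync is needed before insights can run.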
Technical Approach
The core is a Swift/SwiftUI iOS app that uses HealthKit and CoreBluetooth for data ingestion. A lightweight on-device inference model interprets patterns in real time — no cloud round-trips for basic insights. BLE peripherals communicate via custom GATT profiles I defined for a prototype sensor node.
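The CoreBluetooth side of that ingestion path might look like the sketch below. The service and characteristic UUIDs are placeholders (the project defines its own custom GATT profile), and the `[flags, bpm]` payload layout is an assumption for illustration.

```swift
import CoreBluetooth

// Hypothetical UUIDs — stand-ins for the custom GATT profile.
let sensorServiceUUID = CBUUID(string: "F00D")
let heartRateCharUUID = CBUUID(string: "BEA7")

final class SensorScanner: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private var central: CBCentralManager!
    private var sensor: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [sensorServiceUUID])
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        sensor = peripheral            // retain, or the connection is dropped
        peripheral.delegate = self
        central.stopScan()
        central.connect(peripheral)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.discoverServices([sensorServiceUUID])
    }

    // After discovering the characteristic, subscribing with
    // setNotifyValue(true, for:) makes readings arrive push-style —
    // no polling, which is what keeps feedback latency low.
    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        guard let data = characteristic.value, data.count >= 2 else { return }
        let bpm = Int(data[1])         // assumed [flags, bpm] payload
        print("Heart rate: \(bpm) bpm")
    }
}
```

Notification-based characteristics (rather than read polling) are the standard BLE pattern for low-latency streams like this.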
Stack
Swift · SwiftUI · HealthKit · CoreBluetooth · custom BLE/GATT profiles · on-device ML
Process
Problem Mapping
Identified the most meaningful signals to track and correlate
iOS Foundation
Built HealthKit + CoreBluetooth data pipeline
Custom Sensor Node
Prototyped a BLE peripheral for additional biometric sensing
Insights Layer
Training a lightweight on-device model for pattern interpretation (in progress)
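The HealthKit half of the "iOS Foundation" step can be sketched as an anchored query that streams new samples as they land. This is a minimal, illustrative version: error handling is trimmed, and it only covers resting heart rate.

```swift
import HealthKit

// Sketch: request read access to resting heart rate, then stream samples.
let store = HKHealthStore()
let restingHR = HKQuantityType.quantityType(forIdentifier: .restingHeartRate)!

// One handler serves both the initial batch and live updates.
let handler: (HKAnchoredObjectQuery, [HKSample]?, [HKDeletedObject]?,
              HKQueryAnchor?, Error?) -> Void = { _, samples, _, _, _ in
    for case let sample as HKQuantitySample in samples ?? [] {
        let bpm = sample.quantity.doubleValue(
            for: HKUnit.count().unitDivided(by: .minute()))
        print("Resting HR: \(bpm) bpm at \(sample.startDate)")
    }
}

store.requestAuthorization(toShare: nil, read: [restingHR]) { granted, _ in
    guard granted else { return }
    let query = HKAnchoredObjectQuery(type: restingHR,
                                      predicate: nil,
                                      anchor: nil,
                                      limit: HKObjectQueryNoLimit,
                                      resultsHandler: handler)
    query.updateHandler = handler   // keeps the stream live after the first batch
    store.execute(query)
}
```

An anchored query with an `updateHandler` is HealthKit's idiomatic way to get incremental deliveries instead of re-fetching everything, which fits the real-time, offline-first design.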
Results & Impact
- Working iOS prototype with multi-source data aggregation
- Sub-100ms latency for real-time sensor feedback
- Custom BLE sensor node functional on breadboard
- Ongoing: refining the insight generation model