SamiBo Platform

SamiBo is a multimodal diagnostic platform that unifies biosensors (blood & urine), wearables (heart rate, HRV, steps, sleep), and imaging (X‑ray/CT/MRI/Skin) into one explainable assessment. AI models estimate internal biomarkers from non‑invasive signals, auto‑select the best source via BLE routing, and generate a multi‑page clinical‑style report. A hybrid calibration layer lets users (or clinics) periodically anchor AI estimates to verified lab results, improving accuracy over time. The GUI highlights behavior‑linked feedback — including projected Life Gained and Money Saved — to encourage sustained healthy habits.

Multimodal AI: Biosensors • Wearables • Imaging

SamiBo fuses heterogeneous inputs using AI models designed for healthcare: sensor correction, source quality scoring, and explainable overlays (e.g., Grad‑CAM for imaging). Wearable‑only users can still obtain non‑invasive biomarker estimates; clinics can add blood/urine sensors for higher precision.
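
As an illustration of source quality scoring, here is a minimal Python sketch; the heuristic (completeness times a jitter-based smoothness term) and the quality_score name are assumptions for illustration, not SamiBo's actual scoring model.

```python
import numpy as np

def quality_score(samples: np.ndarray, expected_hz: float, window_s: float) -> float:
    """Score one signal window in [0, 1]: completeness x smoothness (illustrative)."""
    expected_n = int(expected_hz * window_s)
    completeness = min(len(samples) / max(expected_n, 1), 1.0)  # penalize dropped samples
    if len(samples) < 2:
        return 0.0
    jitter = float(np.std(np.diff(samples)))  # first-difference noise proxy
    spread = float(np.std(samples)) + 1e-9    # normalize by the signal's own scale
    smoothness = 1.0 / (1.0 + jitter / spread)
    return completeness * smoothness
```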

Calibration is privacy‑preserving and optional: submit lab test results when available to realign predictions to your personal baseline.

MODULES

BioSense Terminal

Touchscreen diagnostic unit; runs biosensor & imaging AI; offline-capable.

Wearable Bridge

BLE device discovery, mapping, and calibration; streams HR/HRV/SpO₂/steps.
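
A minimal sketch of BLE discovery and heart-rate streaming using the open-source bleak library; the "Polar" name hint and the 30-second window are placeholders, and only the standard GATT Heart Rate Measurement characteristic (0x2A37) is assumed, not SamiBo's bridge internals.

```python
import asyncio
from bleak import BleakClient, BleakScanner

HR_MEASUREMENT = "00002a37-0000-1000-8000-00805f9b34fb"  # standard GATT UUID

def parse_heart_rate(data: bytearray) -> int:
    """Decode a Heart Rate Measurement packet (GATT 0x2A37)."""
    flags = data[0]
    if flags & 0x01:                       # flag bit 0: 16-bit HR value
        return int.from_bytes(data[1:3], "little")
    return data[1]                         # otherwise 8-bit HR value

async def stream_heart_rate(name_hint: str = "Polar") -> None:
    # Discover nearby BLE devices and pick one by advertised name.
    device = await BleakScanner.find_device_by_filter(
        lambda d, ad: d.name and name_hint in d.name, timeout=10.0
    )
    if device is None:
        raise RuntimeError("no matching wearable found")
    async with BleakClient(device) as client:
        def on_hr(_, data: bytearray):
            print("HR:", parse_heart_rate(data), "bpm")
        await client.start_notify(HR_MEASUREMENT, on_hr)
        await asyncio.sleep(30)            # stream for a fixed demo window
        await client.stop_notify(HR_MEASUREMENT)

asyncio.run(stream_heart_rate())
```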

Imaging AI

Explainable models for X‑ray/CT/MRI/Skin with Grad‑CAM overlays.
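
Grad-CAM itself is a published technique; the sketch below shows the standard hook-based recipe in PyTorch, with a generic torchvision classifier standing in for SamiBo's imaging models.

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

def grad_cam(model, target_layer, image, class_idx=None):
    """Compute a Grad-CAM heatmap for one image tensor of shape (1, C, H, W)."""
    acts, grads = {}, {}
    h1 = target_layer.register_forward_hook(lambda m, i, o: acts.update(v=o))
    h2 = target_layer.register_full_backward_hook(lambda m, gi, go: grads.update(v=go[0]))
    logits = model(image)
    if class_idx is None:
        class_idx = logits.argmax(dim=1).item()
    model.zero_grad()
    logits[0, class_idx].backward()        # gradients of the target class score
    h1.remove(); h2.remove()
    # Weight each activation map by its spatially pooled gradient, then ReLU.
    weights = grads["v"].mean(dim=(2, 3), keepdim=True)
    cam = F.relu((weights * acts["v"]).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear", align_corners=False)
    return (cam / (cam.max() + 1e-8)).squeeze()  # normalized to [0, 1]

model = resnet18(weights=None).eval()
heatmap = grad_cam(model, model.layer4[-1], torch.randn(1, 3, 224, 224))
```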

Clinical Report Generator

Multi‑page clinical‑style PDF reports with confidence scores, explainability, and color‑coded reference ranges.

HOW IT WORKS

Connect Devices

Detect BLE biosensors & wearables; map devices to biomarkers in one place.
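
One way to represent the device-to-biomarker mapping is a small registry keyed by BLE address; the addresses, the DeviceBinding type, and the marker names below are illustrative placeholders, not SamiBo's schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DeviceBinding:
    address: str        # BLE address from discovery
    kind: str           # "wearable" or "biosensor"
    biomarkers: tuple   # markers this device is trusted to report

REGISTRY = {
    "A4:C1:38:00:11:22": DeviceBinding("A4:C1:38:00:11:22", "wearable",
                                       ("hr_bpm", "hrv_ms", "spo2_pct")),
    "D8:3A:DD:33:44:55": DeviceBinding("D8:3A:DD:33:44:55", "biosensor",
                                       ("glucose_mg_dl", "spo2_pct")),
}

def sources_for(marker: str):
    """All bound devices that can report a given biomarker."""
    return [b for b in REGISTRY.values() if marker in b.biomarkers]
```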

Acquire & Verify

Stream vitals and biomarkers; apply signal checks, unit normalization, and sanity filters.
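
A hedged sketch of the verification step: convert each reading to canonical units, then apply range-based sanity filters. The bounds and the conversion table are plausible examples, not SamiBo's published limits.

```python
# Plausible physiological bounds; the platform's actual limits are not public.
PLAUSIBLE = {"hr_bpm": (25, 230), "spo2_pct": (70, 100), "glucose_mg_dl": (20, 600)}
TO_CANONICAL = {"glucose_mmol_l": ("glucose_mg_dl", lambda v: v * 18.016)}

def normalize_and_check(marker: str, value: float) -> tuple[str, float] | None:
    """Convert to canonical units, then reject out-of-range readings."""
    if marker in TO_CANONICAL:
        marker, convert = TO_CANONICAL[marker]
        value = convert(value)
    lo, hi = PLAUSIBLE.get(marker, (float("-inf"), float("inf")))
    if not (lo <= value <= hi):
        return None  # fails the sanity filter; drop or flag upstream
    return marker, value
```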

AI Fusion

Fuse biosensors + wearables + imaging; estimate biomarkers; pick best source via runtime quality scoring.
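
As a simplified stand-in for the fusion step, a quality-weighted average over per-source estimates; the production models are presumably learned rather than a fixed weighted mean, so treat this only as the shape of the interface.

```python
def fuse_estimates(estimates: list[tuple[float, float]]) -> float:
    """Quality-weighted average of (value, quality) pairs, quality in (0, 1]."""
    total_w = sum(q for _, q in estimates)
    if total_w == 0:
        raise ValueError("no usable sources")
    return sum(v * q for v, q in estimates) / total_w

# e.g., SpO2 from a wearable (97.0, quality 0.6) and a biosensor (96.2, quality 0.9)
fused = fuse_estimates([(97.0, 0.6), (96.2, 0.9)])
```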

Report

Generate a multi‑page clinical‑style PDF with explainability and color‑coded ranges.
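
A minimal report-writing sketch with the open-source reportlab library, mapping each value to a traffic-light color against its reference range; the layout, the example rows, and the 10% "borderline" margin are assumptions, not the production template.

```python
from reportlab.lib.colors import black, green, orange, red
from reportlab.lib.pagesizes import A4
from reportlab.pdfgen import canvas

def band_color(value, low, high):
    """Map a reading to a traffic-light color against its reference range."""
    if low <= value <= high:
        return green
    margin = 0.1 * (high - low)  # assumed borderline band around the range
    return orange if (low - margin) <= value <= (high + margin) else red

def write_report(path, rows):
    c = canvas.Canvas(path, pagesize=A4)
    y = 800
    for name, value, unit, low, high in rows:
        c.setFillColor(black)
        c.drawString(60, y, f"{name}:")
        c.setFillColor(band_color(value, low, high))
        c.drawString(220, y, f"{value} {unit}  (ref {low}-{high})")
        y -= 20
    c.showPage()  # one page per section in the real multi-page report
    c.save()

write_report("report.pdf", [("Glucose", 104, "mg/dL", 70, 99),
                            ("SpO2", 97, "%", 95, 100)])
```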

Calibration (Optional)

Enter trusted blood test results to personalize and improve accuracy.
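
One common way to anchor AI estimates to lab values is a per-biomarker linear recalibration; the least-squares fit below is a sketch under that assumption, not SamiBo's calibration model.

```python
import numpy as np

def fit_calibration(ai_estimates, lab_values):
    """Fit value_cal = a * value_ai + b from paired lab anchors (one biomarker)."""
    a, b = np.polyfit(np.asarray(ai_estimates), np.asarray(lab_values), 1)
    return a, b

def apply_calibration(value, a, b):
    return a * value + b

# Three paired readings for one user: AI estimate vs verified lab result.
a, b = fit_calibration([101, 118, 96], [98, 112, 95])
print(apply_calibration(110.0, a, b))
```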

Source Routing

For overlapping markers (e.g., SpO₂), the system auto‑selects wearable vs biosensor based on signal quality.
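
Routing can then be as simple as picking the highest-scoring source at read time; a sketch, assuming quality scores like the one outlined earlier.

```python
def route_source(candidates: dict[str, tuple[float, float]]) -> tuple[str, float]:
    """Pick the source with the highest runtime quality score.

    candidates maps source name -> (value, quality score in [0, 1]).
    """
    source = max(candidates, key=lambda s: candidates[s][1])
    return source, candidates[source][0]

# Overlapping SpO2 readings: the biosensor wins on quality here.
src, spo2 = route_source({"wearable": (97.0, 0.62), "biosensor": (96.4, 0.91)})
```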

Explainability

Every prediction carries confidence and, for imaging, Grad‑CAM overlays for transparent review.
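
A sketch of how a prediction record might carry its confidence and an optional overlay reference; the field names here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Prediction:
    biomarker: str
    value: float
    confidence: float                 # [0, 1], reported alongside every value
    source: str                       # which device or model produced it
    overlay_path: str | None = None   # Grad-CAM image for imaging predictions
    notes: list[str] = field(default_factory=list)

p = Prediction("spo2_pct", 96.4, confidence=0.91, source="biosensor")
```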

Feedback

Life Gained and Money Saved indicators reinforce healthy behavior.

OUR TEAM

We are a product-driven AI health team combining applied-ML researchers, embedded/edge engineers, and clinical-workflow specialists. We build explainable, multimodal diagnostics—integrating biosensor data, wearable signals, and medical images—backed by quality control, calibration to lab references, and transparent model reporting. Our engineers deliver device-agnostic BLE and sensor integrations, robust data/ML pipelines, and production-ready Python on touchscreen hardware. We run structured pilots, document model cards, and uphold privacy/security and regulatory-aware practices. Hardware and UX capabilities span custom enclosures, touchscreen UI, and ODM collaboration to translate prototypes into field-ready, scalable systems.



Sam Peng

Phil Fan

Jonathan Ziegler

Amy Wang

Alan Xu

Meriem Benmmlouk

PARTNERS & PILOT SITES

We collaborate with ecosystem partners and pilot sites to validate workflow, usability, and calibration in real‑world settings.