A deep, honest walkthrough of every capability — from raw extraction to FHIR output to on-device face scan. Jump to what you need; each section takes under a minute to read.
Device displays, clinical documents, and the human face — all collapse into the same canonical JSON response.
Per-device vision pipelines that understand what a reading means, not just what pixels look like. Purpose-trained for blood-pressure monitors, glucometers, pulse oximeters, thermometers, scales, peak-flow meters, ECG monitors and more.
{
  "success": true,
  "deviceType": "blood-pressure",
  "data": {
    "systolic": { "value": 128, "unit": "mmHg", "confidence": 0.996 },
    "diastolic": { "value": 82, "unit": "mmHg", "confidence": 0.993 },
    "pulse": { "value": 74, "unit": "bpm", "confidence": 0.988 }
  },
  "processingTime": 487 // ms
}

Prescriptions, discharge summaries, lab reports, clinical and progress notes, imaging reports, and more. Not raw OCR — the output lands in proper clinical schemas (drug names, dosages, frequencies; diagnoses; impressions; measurements).
Our remote photoplethysmography pipeline reads subtle colour changes in your skin from a front-camera video and estimates heart rate, HRV, respiratory rate, SpO₂, stress, wellness-band BP, heart age, and more — without a cuff, strap, or wearable.
Every modality — device, document, face scan — lands in the same canonical response shape. Your UI parses it once, and the same code handles all 16+ extraction types.
Stable shape: success, deviceType, data.{field}.{value, unit, confidence}, timestamp, processingTime. Never changes.
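As a sketch, consuming that stable envelope takes only a few lines. The field names below follow the shape above; the helper itself is hypothetical, not part of any SDK:

```python
from typing import Any

def extract_values(envelope: dict[str, Any]) -> dict[str, float]:
    """Flatten the canonical envelope into {field: value}, skipping failed calls."""
    if not envelope.get("success"):
        return {}
    return {name: field["value"] for name, field in envelope["data"].items()}

envelope = {
    "success": True,
    "deviceType": "blood-pressure",
    "data": {
        "systolic": {"value": 128, "unit": "mmHg", "confidence": 0.996},
        "diastolic": {"value": 82, "unit": "mmHg", "confidence": 0.993},
    },
    "processingTime": 487,
}
print(extract_values(envelope))  # {'systolic': 128, 'diastolic': 82}
```

Because the shape never changes, this one parser covers device readings, documents, and face scans alike.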
Set includeFhir: true and receive a Bundle of Observation resources with LOINC codes, ready to POST straight into Epic, Cerner, Athena, or any R4-compatible EHR.
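For illustration, this is the kind of R4 resource such a bundle carries, built by hand. LOINC 8480-6 is the standard code for systolic blood pressure; the API's actual bundle may differ in detail, so treat this as a shape sketch only:

```python
def make_observation(loinc: str, display: str, value: float, unit: str) -> dict:
    """Build a minimal FHIR R4 Observation with a LOINC coding."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [{"system": "http://loinc.org", "code": loinc, "display": display}]
        },
        "valueQuantity": {
            "value": value, "unit": unit, "system": "http://unitsofmeasure.org"
        },
    }

bundle = {
    "resourceType": "Bundle",
    "type": "collection",
    "entry": [{"resource": make_observation("8480-6", "Systolic blood pressure", 128, "mmHg")}],
}
```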
Fire-and-forget integrations. Exponential-backoff retries (up to 24 h), HMAC-SHA256 signing with customer-managed keys, replay protection, dead-letter queue. (Live · Pro+)
Pull all extractions for a project as JSONL, CSV, or FHIR NDJSON over a signed S3 URL. Great for analytics warehouses. (Live)
Bring your own Epic / Cerner / HL7 v2 endpoint; we'll deliver Observations there instead of a webhook. (Beta · Enterprise)
No-code integrations so ops teams can route extractions into CRMs, spreadsheets, and Slack without a developer. (Coming Q3)
Every value comes with a calibrated confidence score, so your code can decide what to auto-accept and what to route to human review.
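On the receiving end, the HMAC-SHA256 webhook signing mentioned earlier can be checked like this. The `timestamp.body` signing scheme and the five-minute replay window are assumptions for the sketch; check the webhook docs for the exact format:

```python
import hashlib
import hmac
import time

def verify_webhook(secret: bytes, body: bytes, signature_hex: str,
                   timestamp: int, max_age_s: int = 300) -> bool:
    """Reject stale deliveries (replay protection), then compare digests in constant time."""
    if abs(time.time() - timestamp) > max_age_s:
        return False
    expected = hmac.new(secret, f"{timestamp}.".encode() + body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)

secret = b"whsec_example"  # hypothetical key, not a real format
body = b'{"extractionId": "ext_123"}'
ts = int(time.time())
sig = hmac.new(secret, f"{ts}.".encode() + body, hashlib.sha256).hexdigest()
print(verify_webhook(secret, body, sig, ts))  # True
```

`hmac.compare_digest` avoids timing side channels; the timestamp check is what makes replayed deliveries fail even with a valid signature.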
Each extracted value gets its own confidence score from 0–1. Set thresholds per field, per device, or globally.
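Acting on those scores is your own policy. A minimal sketch of per-field thresholds with a global default; the routing function is hypothetical:

```python
def route(data: dict, thresholds: dict[str, float], default: float = 0.95):
    """Split extracted fields into auto-accepted values and ones needing human review."""
    accepted, review = {}, {}
    for name, field in data.items():
        bucket = accepted if field["confidence"] >= thresholds.get(name, default) else review
        bucket[name] = field["value"]
    return accepted, review

data = {
    "systolic": {"value": 128, "unit": "mmHg", "confidence": 0.996},
    "pulse": {"value": 74, "unit": "bpm", "confidence": 0.91},
}
accepted, review = route(data, {"pulse": 0.98})  # stricter bar for pulse only
print(accepted, review)  # {'systolic': 128} {'pulse': 74}
```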
Convert mg/dL ↔ mmol/L, °C ↔ °F, kg ↔ lb automatically — or keep the original. Your call.
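If you convert client-side instead, the arithmetic is small. This sketch uses the conventional factor of 18.0 for glucose (mg/dL per mmol/L; the exact molar-mass factor is 18.016) plus the standard temperature and weight formulas:

```python
def convert(value: float, src: str, dst: str) -> float:
    """Convert between the unit pairs mentioned above; extend the table as needed."""
    table = {
        ("mg/dL", "mmol/L"): lambda v: v / 18.0,   # glucose, conventional factor
        ("mmol/L", "mg/dL"): lambda v: v * 18.0,
        ("degC", "degF"): lambda v: v * 9 / 5 + 32,
        ("degF", "degC"): lambda v: (v - 32) * 5 / 9,
        ("kg", "lb"): lambda v: v * 2.20462,
        ("lb", "kg"): lambda v: v / 2.20462,
    }
    if src == dst:
        return value
    return table[(src, dst)](value)

print(convert(99, "mg/dL", "mmol/L"))  # 5.5
```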
Pass deviceType: "auto-detect" and we identify the device. Explicit hints give slightly lower latency.
Out-of-range flags (e.g., HR of 300 bpm) are surfaced with a warnings array so you don't silently ingest garbage.
Submit known-correct readings back to us to help improve the model. Private to your project.
Published accuracy benchmarks updated monthly. Filter by device manufacturer, display type, and image quality.
p50 ~280 ms, p95 < 500 ms, p99 < 900 ms on typical mobile networks. Uptime 99.9% (99.99% on Enterprise).
Every extraction has a stable extractionId. Pass it back to pin results to a canonical record in your DB.
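Pinning on extractionId makes ingestion idempotent under retries and redelivery. A sketch with a dict standing in for your database:

```python
class ExtractionStore:
    """Dedupe by extractionId so redelivered results never create duplicate rows."""

    def __init__(self):
        self._rows: dict[str, dict] = {}

    def upsert(self, extraction_id: str, payload: dict) -> bool:
        """Return True for a first-seen record, False for a redelivery."""
        is_new = extraction_id not in self._rows
        self._rows[extraction_id] = payload
        return is_new

store = ExtractionStore()
print(store.upsert("ext_abc", {"systolic": 128}))  # True
print(store.upsert("ext_abc", {"systolic": 128}))  # False (duplicate delivery)
```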
The REST API covers everything; the SDKs make it effortless. Face-scan SDKs run on-device so raw video never leaves the user.
Every capability is available over HTTPS. No SDK required — just curl, fetch, requests, or your HTTP client of choice.
TypeScript module. Handles camera permissions, quality detection, countdown UI, and on-device rPPG inference. Ships with React, Vue, and vanilla-JS adapters. (Live)
Swift package, iOS 15+. On-device face scan. Built-in SwiftUI components for quick integration, or headless mode for custom UI. (Beta)
Kotlin library, minSdk 24. Compose-friendly. On-device face scan with the same capability surface as iOS. (Beta)
Wraps the native iOS and Android SDKs. Single import, same hook API on both platforms. (Coming Q3)
Dart package. Works on iOS and Android. Same adapter contract as every other face-scan SDK, so you can swap vendors without rewriting UI. (Coming Q3)
The things auditors ask about, already done. Enterprise plans add a signed BAA, zero-retention mode, customer-managed keys, and an SLA you can forward to procurement.
Infrastructure designed for PHI from day one. BAA available on Enterprise. PHI isolation per project; no cross-tenant leakage.
Type I audit scheduled for Q2. Controls already mapped and monitored; auditor's interim report available under NDA.
Process data in the US, EU, or India, as required for EU data residency and India's DPDPA. Custom regions for Enterprise.
Data Processing Agreement on request. Subject-access and deletion workflows built into the dashboard.
Encryption in transit and at rest. Keys rotated automatically. Customer-managed keys (KMS BYOK) on Enterprise.
Raw images deleted 60 s after processing by default. Enterprise supports immediate deletion and "process-and-forget" mode.
Immutable, exportable log of every API call, admin action, and key rotation. Stream to S3, Splunk, or Datadog.
Deploy SmartScanPro inside your AWS / Azure / GCP account. Data never leaves your perimeter. Helm chart + Terraform module available.
No friction between signing up and making your first authenticated call. Full OpenAPI spec, an interactive playground, and sensible defaults all the way down.
Register → create project → get API key → ship a real request. No credit card. No sales call. No "approvals" queue.
Upload an image in the dashboard and see the full JSON response, FHIR Bundle, and confidence heatmap — before you write any code.
Machine-readable openapi.yaml — generate clients in any language, auto-wire Postman, or plug into AI coding assistants.
Pass Idempotency-Key to make retries safe. Repeat requests within 24 h return the cached response.
Every response includes X-RateLimit-Remaining, X-RateLimit-Reset, and Retry-After. Your backoff writes itself.
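A backoff helper built on those headers might look like this. Treating `X-RateLimit-Reset` as seconds-until-reset is an assumption (some APIs send an epoch timestamp), so confirm against the API reference:

```python
def next_delay(headers: dict[str, str], attempt: int,
               base: float = 0.5, cap: float = 60.0) -> float:
    """Prefer the server's Retry-After; otherwise fall back to capped exponential backoff."""
    retry_after = headers.get("Retry-After")
    if retry_after is not None:
        return float(retry_after)
    if headers.get("X-RateLimit-Remaining") == "0":
        # Sleep until the window resets (seconds-until-reset assumed here).
        return float(headers.get("X-RateLimit-Reset", base))
    return min(cap, base * 2 ** attempt)

print(next_delay({"Retry-After": "3"}, attempt=0))             # 3.0
print(next_delay({"X-RateLimit-Remaining": "42"}, attempt=2))  # 2.0
```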
Every error returns a stable code, human message, and hints for resolution. No string-matching required.
Dashboard charts by project, device type, endpoint, and time. Export raw usage logs as CSV or JSONL.
Attach tags to any call (e.g., customer:42, flow:onboarding) to filter dashboards and audit logs.
Pro+ plans get a shared Slack channel with our engineering team. Typical response in < 2 hours, EU & PT hours covered.
Four real workflows built on SmartScanPro. Each one took customers under a week to ship the first version.
Patient joins a video call. Doctor asks them to measure their BP. Patient snaps the device screen in their phone app. Vitals land in the EHR before they hang up.
Pharmacy uploads prescription photos to their portal; adjudicator system auto-accepts high-confidence claims, routes edge cases to a human reviewer.
Trial participants bring their own devices (BYOD). A single app onboards any device the participant happens to own — no device kit to ship.
Employees do a 30-second face scan each week. Coaches see structured trends; HR sees population-level insights. Zero hardware, zero wearables.
The next three quarters. Dates are guidance, not promises — we ship when quality is right.
Current beta graduates to GA with Swift/Kotlin support for the full face-scan capability set. White-label branding on Enterprise.
Cross-platform adapters on top of the native SDKs. Same hook API on both platforms; no more "which stack should we pick?"
Ops-friendly triggers and actions. Route extractions into CRMs, spreadsheets, Slack, Notion without a developer.
Audit in progress. Report will be available under NDA on Enterprise plans.
Replacing the licensed component for wellness-band BP with our own model, trained on a proprietary multi-region dataset.
For legacy hospital systems that prefer ORU/MDM messages over FHIR. Emitted in parallel to the JSON envelope.
Free for the first 100 calls. No credit card. Upgrade the moment you ship. Every capability above is available on the free tier — you never discover something is paywalled at the last minute.