The Challenge

Use computer vision to interpret human palm lines and features in real time. The goal was to classify distinctive palm characteristics into meaningful categories while delivering an engaging, low-latency experience inside a mobile app.

Overview

Industry: Lifestyle / AI
Duration: 2.5 months
Total project hours: 620
Technology: Python, React Native, TensorFlow
Integrations: Camera API

The Solution

We trained a custom convolutional neural network (CNN) on a curated dataset of palm images, annotated with interpretive classifications derived from palmistry logic. The model was optimized for edge performance and integrated seamlessly into a React Native mobile app.
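The case study does not publish the network architecture. As an illustration only, a compact Keras CNN for multi-class palm-feature classification might look like the sketch below; the input size, class count, and layer choices are all assumptions, and the commented TFLite conversion is the usual route to the edge-optimized deployment described above.

```python
# Illustrative sketch only: the actual architecture, input size, and
# palm-feature class list are not published; these values are assumptions.
import tensorflow as tf

NUM_CLASSES = 6              # hypothetical number of palm-feature categories
INPUT_SHAPE = (224, 224, 3)  # assumed camera-frame input size

def build_palm_classifier() -> tf.keras.Model:
    """A small CNN suited to on-device inference."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=INPUT_SHAPE),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

model = build_palm_classifier()
probs = model(tf.zeros((1, *INPUT_SHAPE)))  # dummy forward pass

# For edge deployment, a trained Keras model is typically converted
# to TensorFlow Lite before bundling into the mobile app:
#   converter = tf.lite.TFLiteConverter.from_keras_model(model)
#   tflite_bytes = converter.convert()
```

Keeping the parameter count small (pooling early, global average pooling instead of large dense layers) is a common way to hit mobile latency budgets without quantization alone doing all the work.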

The app captures hand images through the device camera, preprocesses them locally, and performs inference on-device to ensure real-time responsiveness and data privacy. A user-friendly interface guides the scanning process, delivers visual feedback, and presents interpretations in an intuitive, narrative format.
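The exact preprocessing pipeline is not specified. As a minimal sketch of the local step, "preprocesses them locally" typically amounts to resizing the camera frame to the model's input size and scaling pixel values; the pure-Python version below is for illustration only (a real app would use optimized native ops), and the target size and [0, 1] normalization are assumptions.

```python
# Minimal sketch of local preprocessing before on-device inference:
# nearest-neighbour resize to the model's input size, then normalize
# pixel values to [0, 1]. Target size and scheme are assumptions.

def preprocess(frame, target=64):
    """Resize an H x W grid of 0-255 grayscale pixels to target x target
    and scale each value into [0, 1]."""
    h, w = len(frame), len(frame[0])
    return [
        [frame[y * h // target][x * w // target] / 255.0
         for x in range(target)]
        for y in range(target)
    ]

# Example: a 128x128 all-white frame becomes a 64x64 grid of 1.0s.
frame = [[255] * 128 for _ in range(128)]
out = preprocess(frame)
```

Running this step (and inference) on-device, rather than uploading frames, is what gives the app both its responsiveness and its privacy guarantee.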