Founder & CEO of solCme

Shahar Maoz

Musician. Developer. Founder.

I'm building solCme, a digital rehabilitation platform that uses computer vision to turn body movement into sound.
The platform captures movement through a standard webcam, translates it into musical control, and generates clinical rehabilitation data in real time.

Background
Jerusalem Academy of Music and Dance: Classical Guitar
Open University: Psychology
John Bryce: Full-Stack Development
15+ Years in Music Education & Therapy

The Problem We're Solving

Rehabilitation exercises are repetitive and hard to stick with. Therapists lack objective data between sessions.
solCme solves both problems at once.

Movement as Input

Our instruments adapt to the player, not the other way around. A head tilt, a finger gesture, a facial expression. Any controlled movement can produce sound.

Clinical-Grade Data

Each session automatically captures range of motion, stability, fatigue, and consistency. Therapists get structured reports without extra documentation work.
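To make the metrics above concrete, here is a minimal Python sketch that derives range-of-motion, stability, and fatigue figures from a series of joint-angle samples. The metric definitions, function names, and thresholds are illustrative assumptions for this sketch, not solCme's actual clinical formulas.

```python
from statistics import pstdev

def session_metrics(angles):
    """Summarize one exercise session from joint-angle samples (degrees).

    `angles` is a list of angles (e.g. elbow flexion) sampled over the
    session. All metric definitions here are illustrative only.
    """
    rom = max(angles) - min(angles)      # range of motion, in degrees
    stability = pstdev(angles)           # lower spread = steadier movement
    # Fatigue proxy: compare ROM in the first vs. last third of the session.
    third = max(1, len(angles) // 3)
    early, late = angles[:third], angles[-third:]
    fatigue = (max(early) - min(early)) - (max(late) - min(late))
    return {"rom": rom, "stability": round(stability, 2), "fatigue": fatigue}

print(session_metrics([10, 40, 80, 15, 75, 20, 60, 25, 55, 30]))
```

A positive fatigue value in this sketch means the patient's usable range shrank as the session went on, which is the kind of trend a therapist could review between visits.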

Patient Motivation

Music gives patients a reason to repeat their exercises, which matters for stroke recovery, pediatric autism therapy, and cognitive decline.


solCme

solCme is a digital rehabilitation platform that translates body movement into musical expression using computer vision. No wearables, no special hardware. Just a webcam.
The system creates a feedback loop: movement produces sound in real time, while the platform logs clinical metrics for the therapist.

HandSynth: hand gesture recognition to sound control
HeadSynth: facial movement and expression to sound for facial rehabilitation
Personal calibration layer that adapts to tremor, range, and intent per user
Rehabilitation dashboard for range of motion, stability, fatigue & consistency tracking
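The personal calibration layer in the list above can be sketched as follows: learn each user's own movement range during a short calibration phase, then map raw readings into a full control signal while smoothing out tremor. The function names, the smoothing factor, and the exponential-moving-average approach are assumptions made for this sketch, not solCme's actual calibration code.

```python
def make_calibrated_mapper(samples, smoothing=0.3):
    """Build a per-user mapping from raw movement values to a 0..1 control signal.

    `samples` come from a short calibration phase; `smoothing` damps tremor
    with an exponential moving average. Illustrative sketch only.
    """
    lo, hi = min(samples), max(samples)
    span = (hi - lo) or 1.0
    state = {"ema": None}

    def to_control(raw):
        # Exponential moving average to damp tremor between frames.
        prev = state["ema"]
        ema = raw if prev is None else smoothing * raw + (1 - smoothing) * prev
        state["ema"] = ema
        # Clamp to this user's own calibrated range.
        return min(1.0, max(0.0, (ema - lo) / span))

    return to_control

# A user with a limited range still reaches the full control range:
mapper = make_calibrated_mapper([0.2, 0.25, 0.6, 0.55])
print(mapper(0.6))  # → 1.0 (top of this user's personal range)
```

The design point is that the instrument adapts to the player: a small personal range is stretched to the full musical range rather than forcing the patient to match an able-bodied baseline.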
Visit solCme
Our Instruments

HandSynth

Translates hand gestures and finger tracking into musical control. Designed for upper-limb motor rehabilitation.
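As a rough sketch of how a gesture-to-sound mapping like this could work, the snippet below quantizes a normalized fingertip height (which a webcam hand-tracking library would supply) onto a pentatonic scale, so any controlled movement lands on a musical note. The scale choice, function names, and parameters are assumptions for illustration, not HandSynth's actual mapping.

```python
PENTATONIC = [0, 2, 4, 7, 9]  # major-pentatonic scale degrees, in semitones

def height_to_midi(height, root=60, octaves=2):
    """Map a fingertip height in [0, 1] to a MIDI note on a pentatonic scale."""
    steps = len(PENTATONIC) * octaves
    i = min(int(height * steps), steps - 1)   # quantize, clamp top of range
    octave, degree = divmod(i, len(PENTATONIC))
    return root + 12 * octave + PENTATONIC[degree]

print(height_to_midi(0.0))   # → 60 (middle C)
print(height_to_midi(0.95))  # → 81
```

Quantizing to a scale is a common accessibility choice in adaptive instruments: every reachable position sounds consonant, so imprecise movement still produces music rather than noise.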

HeadSynth

Turns facial expressions and head movement into sound. Built for facial nerve rehabilitation and synkinesis detection.

Data Dashboard

Real-time and session-over-session progress tracking for therapists, patients, and clinical teams

Background

I'm a full-stack developer and classical guitarist from Israel. For over 15 years I performed, taught, and used music in therapeutic settings, working with autistic children, motor rehabilitation patients, and elderly populations.

I taught myself Python during a long hospitalization, then completed full-stack training at John Bryce. That period gave me both technical skills and a personal understanding of motor rehabilitation from the patient side.

Now I build systems that connect computer vision, sound synthesis, and clinical measurement. That combination is what solCme is built on.

EARLIER
Musician & Educator
Performed and taught at leading institutions.
Used music therapeutically with rehabilitation populations.
TRANSITION
Developer
Self-taught Python during a medical recovery period. Completed full-stack development training at John Bryce.
NOW
Founder & CEO of solCme
Building a digital rehabilitation platform powered by computer vision and sound synthesis.

Let's Talk

Open to clinical partnerships, pilot programs, and conversations about accessible music technology.


Press & Media

Media coverage and press materials.

Maariv (מעריב)

Israel's new entrepreneurship arena: a blend of music, AI, and technology. A feature on the Israeli venture that connects sound, AI, and social impact.

Read the Maariv article

Press Materials

Download our official press kit and materials.

Press Kit (English) Press Kit (Hebrew)

About

solCme is a digital rehabilitation platform that translates body movement into musical expression using computer vision.

For press inquiries, contact shahar@solcme.com