Haptics on iOS, Android, and web do not behave the same way. This article documents how one design system team, working on a large server-driven UI project, resolved that inconsistency by translating Apple's three semantic haptic types (Impact, Selection, and Notification) into three numeric parameters that map identically across all three platforms and sync directly with Figma components via JSON.
The core architectural decision is worth understanding before you skip to the code. The team rejected Android's flat list of hardware constants, such as KEYBOARD_TAP and CLOCK_TICK, because there is no semantic model behind them. Instead they anchored the entire system to Apple's UIFeedbackGenerator taxonomy, then built the Android and web layers to conform to it. The result is a single set of named presets (Impact Light, Impact Rigid, Notification Error, and so on) that designers assign in Figma and developers receive as structured JSON, with no translation step in between.
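To make the architecture concrete, here is a minimal sketch of what such a token pipeline could look like. The article does not publish its schema, so the parameter names (`intensity`, `sharpness`, `durationMs`), the token keys, and the numeric values below are illustrative assumptions, not the team's actual JSON.

```typescript
// Hypothetical shape of one haptic design token. Three numeric
// parameters stand in for the article's cross-platform values;
// the specific fields are assumptions for illustration.
type HapticPreset = {
  name: string;       // semantic name assigned in Figma
  intensity: number;  // 0..1, maps to amplitude on each platform
  sharpness: number;  // 0..1, maps to waveform character
  durationMs: number; // pulse length in milliseconds
};

// A few presets named in the article, with invented values.
const presets: Record<string, HapticPreset> = {
  "impact.light":       { name: "Impact Light",       intensity: 0.4, sharpness: 0.5, durationMs: 10 },
  "impact.rigid":       { name: "Impact Rigid",       intensity: 0.9, sharpness: 1.0, durationMs: 10 },
  "notification.error": { name: "Notification Error", intensity: 1.0, sharpness: 0.6, durationMs: 60 },
};

// Each platform layer would consume the same numbers, e.g. via
// CoreHaptics on iOS, VibrationEffect on Android, and
// navigator.vibrate on the web.
function resolvePreset(token: string): HapticPreset {
  const preset = presets[token];
  if (!preset) throw new Error(`Unknown haptic token: ${token}`);
  return preset;
}
```

The point of the single lookup is that Figma, iOS, Android, and web all key off the same token string, so no per-platform translation table ever drifts out of sync.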
The article earns a full read because the cross-platform mapping is shown in detail, not summarized. The author uses Mobbin references to map eight real interaction categories (tap, swipe, drag and drop, error states, success confirmation, scroll boundaries, long press, and gamification) to the presets, giving you a practical decision tree rather than a philosophy lecture. If you are building or auditing a design system that touches mobile, this is the implementation record you would otherwise have to write yourself.
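A category-to-preset decision tree of that kind reduces to a plain lookup in code. The pairings below are plausible guesses for illustration only, not the article's published table, and the preset token strings are assumed names.

```typescript
// Illustrative mapping from the eight interaction categories the
// article lists to hypothetical preset tokens. Every pairing here
// is an assumption; consult the original article for the real table.
const categoryToPreset: Record<string, string> = {
  tap:                 "selection",
  swipe:               "impact.light",
  dragAndDrop:         "impact.rigid",
  errorState:          "notification.error",
  successConfirmation: "notification.success",
  scrollBoundary:      "impact.light",
  longPress:           "impact.medium",
  gamification:        "notification.success",
};

// Fall back to the mildest feedback for unmapped interactions.
function presetFor(category: string): string {
  return categoryToPreset[category] ?? "selection";
}
```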
[READ ORIGINAL →]