System Overview
Viewed at a simplified level, the Haptic process can be outlined as follows:
LLM providers connect their models to the Haptic front end by exposing their API endpoints.
Users access these LLMs and their response-generation services through Haptic. Where required, models that charge for their services can impose additional conditions (e.g., a minimum token balance in the user's wallet, or per-query micro-payments); see the gateway sketch after this outline.
Haptic encourages gathering comparative feedback on the same category of questions from many users and builds a scoring mechanism that feeds the preference model; see the preference-scoring sketch after this outline. In the upcoming phase, we will also collect comparative scores from the same user across different LLMs.
Once sufficient feedback has been gathered from users, Haptic will assist in completing one iteration of LLM parameter retraining.
LLM providers will compensate Haptic based on a composite value derived from the number of high-feedback queries and the number of retraining iterations performed on their parameter set; see the billing sketch after this outline.
Haptic will create significant earning opportunities for human feedback providers through the HAI staking module.
Stakers will receive their token rewards from LLM products directly via a claim method, while ETH/stablecoin revenue will depend on the timelock-based xHAI held by users; see the staking sketch after this outline.
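
Gateway sketch: the first two steps describe a gateway pattern in which providers register their API endpoints and Haptic checks any payment conditions before routing a user's query. The following is a minimal sketch under assumed names; Provider, Gateway, min_balance, and fee_per_query are illustrative placeholders, not part of the Haptic specification.

```python
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    endpoint: str               # API endpoint exposed by the provider
    min_balance: float = 0.0    # assumed: minimum wallet balance required
    fee_per_query: float = 0.0  # assumed: micro-payment charged per query

class Gateway:
    """Hypothetical Haptic front-end gateway: registers providers and gates access."""

    def __init__(self) -> None:
        self.providers: dict[str, Provider] = {}

    def register(self, provider: Provider) -> None:
        self.providers[provider.name] = provider

    def route(self, provider_name: str, wallet_balance: float, prompt: str) -> str:
        p = self.providers[provider_name]
        if wallet_balance < p.min_balance:
            raise PermissionError("wallet balance below the provider's minimum")
        # A real gateway would forward the prompt to p.endpoint and settle the
        # micro-payment; here we only describe the outgoing call.
        return f"POST {p.endpoint} (fee {p.fee_per_query}): {prompt}"

# Example usage
gw = Gateway()
gw.register(Provider("example-llm", "https://api.example.com/v1/chat",
                     min_balance=10.0, fee_per_query=0.01))
print(gw.route("example-llm", wallet_balance=25.0, prompt="Explain staking."))
```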
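
Preference-scoring sketch: the outline does not fix a scoring algorithm, but one common way to turn pairwise "response A was better than response B" feedback into scores usable by a preference model is an Elo-style (Bradley-Terry) update. The code below is an illustrative assumption, not Haptic's defined mechanism.

```python
import math
from collections import defaultdict

# Elo-style ratings derived from pairwise "model A answered better than model B" feedback.
ratings: dict[str, float] = defaultdict(lambda: 1000.0)
K = 32  # update step size

def record_comparison(winner: str, loser: str) -> None:
    """Update scores after one user prefers `winner`'s response over `loser`'s."""
    expected_win = 1.0 / (1.0 + math.pow(10.0, (ratings[loser] - ratings[winner]) / 400.0))
    ratings[winner] += K * (1.0 - expected_win)
    ratings[loser] -= K * (1.0 - expected_win)

# Comparative feedback gathered on the same category of questions from various users
record_comparison("model-a", "model-b")
record_comparison("model-a", "model-c")
record_comparison("model-b", "model-c")
print(dict(ratings))
```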
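
Billing sketch: steps four and five tie retraining and provider compensation to accumulated feedback, but the thresholds and weights are not specified. The sketch below shows one hedged formulation in which a retraining iteration is triggered once enough feedback samples exist, and the provider fee is a weighted composite of high-feedback query count and completed retraining iterations; all constants are placeholders.

```python
RETRAIN_THRESHOLD = 10_000  # assumed feedback samples needed per retraining iteration
W_QUERIES = 0.002           # assumed fee weight per high-feedback query
W_ITERATIONS = 500.0        # assumed fee weight per retraining iteration

def retraining_due(feedback_samples: int) -> bool:
    """Trigger one iteration of parameter retraining once enough feedback is gathered."""
    return feedback_samples >= RETRAIN_THRESHOLD

def provider_fee(high_feedback_queries: int, retraining_iterations: int) -> float:
    """Composite value the LLM provider pays Haptic (weights are placeholders)."""
    return W_QUERIES * high_feedback_queries + W_ITERATIONS * retraining_iterations

print(retraining_due(12_500))  # True: enough feedback for one retraining pass
print(provider_fee(high_feedback_queries=12_500, retraining_iterations=1))
```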
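
Staking sketch: the last two steps split rewards into two streams, token rewards from LLM products claimable by HAI stakers, and ETH/stablecoin revenue distributed according to timelock-based xHAI. The sketch below is a minimal off-chain model of that split; the linear timelock weighting and names such as xhai_weight are assumptions, not the defined mechanism.

```python
from dataclasses import dataclass

@dataclass
class Staker:
    address: str
    hai_staked: float   # HAI staked in the staking module
    xhai: float         # timelock-based xHAI held
    lock_months: int    # remaining timelock on the xHAI position

def xhai_weight(s: Staker, max_lock_months: int = 48) -> float:
    """Assumed weighting: xHAI counts proportionally to its remaining timelock."""
    return s.xhai * min(s.lock_months, max_lock_months) / max_lock_months

def token_claim(s: Staker, total_token_rewards: float, total_hai_staked: float) -> float:
    """Token rewards from LLM products, claimable pro-rata to staked HAI."""
    return total_token_rewards * s.hai_staked / total_hai_staked

def revenue_share(s: Staker, total_revenue: float, stakers: list[Staker]) -> float:
    """ETH/stablecoin revenue, distributed by timelock-weighted xHAI."""
    total_weight = sum(xhai_weight(x) for x in stakers)
    return total_revenue * xhai_weight(s) / total_weight

# Example usage with two hypothetical stakers
stakers = [Staker("0xabc", 1_000, 500, 12), Staker("0xdef", 3_000, 2_000, 48)]
total_hai = sum(s.hai_staked for s in stakers)
for s in stakers:
    print(s.address, token_claim(s, 10_000, total_hai), revenue_share(s, 5_000, stakers))
```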