FLOPS
FLOPS, or Floating Point Operations, are the basic unit Tracebloc uses to measure and allocate resources for training machine learning models and for collaborations. Note that in this context FLOPS refers to the total number of floating-point operations performed, not to the rate of operations per second (the more common meaning of the acronym in hardware benchmarking).
To quantify different amounts of FLOPS, the following prefixes are used (a conversion sketch follows the list):
- KF (Kilo Flops) = 10^3 FLOPS
- MF (Mega Flops) = 10^6 FLOPS
- GF (Giga Flops) = 10^9 FLOPS
- TF (Tera Flops) = 10^12 FLOPS
- PF (Peta Flops) = 10^15 FLOPS
- EF (Exa Flops) = 10^18 FLOPS
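Converting between these units is a straight power-of-ten scaling. As a minimal sketch in Python (the `format_flops` helper below is hypothetical and not part of the Tracebloc API), a raw FLOP count can be formatted with these prefixes like this:

```python
# Prefix table from the list above, largest scale first.
_PREFIXES = [
    (10**18, "EF"),  # Exa Flops
    (10**15, "PF"),  # Peta Flops
    (10**12, "TF"),  # Tera Flops
    (10**9,  "GF"),  # Giga Flops
    (10**6,  "MF"),  # Mega Flops
    (10**3,  "KF"),  # Kilo Flops
]

def format_flops(flops: float) -> str:
    """Format a raw FLOP count with the largest applicable prefix."""
    for scale, unit in _PREFIXES:
        if flops >= scale:
            return f"{flops / scale:.2f} {unit}"
    return f"{flops:.0f} FLOPS"

print(format_flops(3.5e9))   # -> "3.50 GF"
print(format_flops(1.2e16))  # -> "12.00 PF"
```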
These units help determine the computational cost of training models or participating in collaborations on the platform.
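For intuition about where a model lands on this scale, a common rule of thumb is that a dense layer with m inputs and n outputs costs about 2·m·n FLOPS per forward pass (one multiply and one add per weight), with the backward pass adding roughly twice that again during training. The back-of-envelope sketch below uses that rule; it is an illustrative estimate, not Tracebloc's exact accounting method:

```python
def estimate_dense_training_flops(m_inputs: int, n_outputs: int,
                                  samples: int, epochs: int) -> float:
    """Rough FLOP estimate for training a single dense layer.

    Forward pass: ~2 * m * n FLOPS per sample (multiply + add per weight).
    Backward pass: commonly approximated as ~2x the forward cost,
    so each training step costs ~3x the forward pass in total.
    """
    forward = 2 * m_inputs * n_outputs
    per_sample = 3 * forward  # forward + backward
    return float(per_sample) * samples * epochs

# Example: a 1024 -> 512 layer trained on 100,000 samples for 10 epochs
total = estimate_dense_training_flops(1024, 512, 100_000, 10)
print(f"{total:.2e} FLOPS")  # ~3.15e12, i.e. roughly 3 TF
```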