(Information Science Expert) Lecture on IoT and Edge Computing
-
Allowing for a certain degree of computational error (aiming for approximately correct calculations)
- In exchange, aiming for improved circuit processing speed and energy efficiency
-
Can be used for tasks like “RMS (Recognition, Mining, Synthesis)”
- Tasks with ambiguity, such as machine learning and image generation
- Characteristics:
- Input data itself contains a lot of noise
- No golden result (no unique correct answer)
- Depends on human perception (ambiguity) when generating images or natural language
- Connected to the concept of “ambiguity” in Natural Language Processing
- In that sense, Scrapbox can also be considered ambiguous (the “connections” depend on human perception)
- Statistical/probabilistic
- Self-recovery mechanism
- For example, machine learning iterates until convergence, so occasional mistakes along the way are corrected by later iterations
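A minimal sketch of this self-recovery idea (an assumed example, not from the lecture): gradient descent on f(x) = (x - 3)² with a deliberately erroneous gradient. Because each iteration corrects earlier mistakes, the injected error barely affects the final result.

```python
import random

def noisy_gradient_descent(steps=500, lr=0.1, seed=0):
    """Minimize f(x) = (x - 3)^2 with a perturbed gradient."""
    rng = random.Random(seed)
    x = 0.0
    for _ in range(steps):
        grad = 2 * (x - 3)               # exact gradient
        grad += rng.uniform(-0.5, 0.5)   # injected "computational error"
        x -= lr * grad
    return x

print(noisy_gradient_descent())  # lands near the true minimum x = 3
```

Despite per-step errors of up to ±0.5, the iterate settles within a few hundredths of the true minimum, which is why iterative workloads tolerate approximate hardware well.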
-
How to “allow for error”?
- Representation of decimal points
- For example, 0.09
- 0.09 has no exact finite binary representation; with 32 bits it is stored as approximately 0.089999996
- However, this increases computational complexity and circuit area
- Therefore, sacrificing precision and quantizing values to be represented in fewer bits
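A small sketch of this quantization trade-off (an assumed example, not lecture code): represent 0.09 in fixed point with a varying number of fractional bits. Fewer bits mean smaller, faster circuits but a larger error.

```python
def quantize(x: float, frac_bits: int) -> float:
    """Round x to the nearest multiple of 2**-frac_bits (fixed point)."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

for bits in (4, 8, 16, 24):
    q = quantize(0.09, bits)
    print(f"{bits:2d} fractional bits: {q:.10f} (error {abs(q - 0.09):.2e})")
```

With 8 fractional bits 0.09 becomes 0.08984375; each extra bit roughly halves the worst-case error, so the bit width is a direct dial between precision and circuit cost.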
- Truncating calculations partway, like tearing along perforations (cf. loop perforation: skipping some loop iterations)
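A minimal loop-perforation sketch (an assumed example): skip a fraction of the loop iterations and average over only the sampled elements, trading accuracy for fewer operations.

```python
def perforated_mean(xs, skip=2):
    """Approximate mean using only every `skip`-th element."""
    sampled = xs[::skip]
    return sum(sampled) / len(sampled)

data = [float(i % 10) for i in range(1000)]
print(perforated_mean(data, skip=1))  # exact mean: 4.5
print(perforated_mean(data, skip=2))  # approximate, with half the work
```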
- Data reuse
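One way data reuse can look in an approximate setting (an assumed sketch, not the lecture's method): cache results keyed on a coarsely rounded input, so "close enough" inputs reuse an earlier answer instead of recomputing.

```python
import math

_cache = {}

def approx_sin(x, decimals=2):
    """sin(x), but inputs in the same rounding bucket share one result."""
    key = round(x, decimals)
    if key not in _cache:
        _cache[key] = math.sin(key)   # compute once per bucket
    return _cache[key]

print(approx_sin(1.0001))  # computed
print(approx_sin(1.0049))  # reused: falls in the same bucket as 1.0001
```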
- Approximating the computational unit itself
- Imagine splitting long division into a left half and a right half and computing them independently
- A carry can no longer propagate across the boundary digit, but that error is accepted
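The same idea appears in approximate adders. A sketch echoing the long-division analogy (an assumed example): add two 16-bit numbers as two independent 8-bit halves, discarding the carry from the low half into the high half.

```python
def approx_add16(a: int, b: int) -> int:
    """16-bit add split into two 8-bit halves; the cross-boundary carry is dropped."""
    lo = (a & 0xFF) + (b & 0xFF)               # low halves; overflow is lost
    hi = ((a >> 8) & 0xFF) + ((b >> 8) & 0xFF)  # high halves, no carry-in
    return ((hi & 0xFF) << 8) | (lo & 0xFF)

print(approx_add16(0x1234, 0x0011) == 0x1234 + 0x0011)  # True: no carry crosses the boundary
print(hex(approx_add16(0x00FF, 0x0001)))                # carry at the boundary is lost
```

Breaking the carry chain shortens the adder's critical path, which is exactly the speed/energy win this approach buys at the cost of occasional wrong digits.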