The leaders in computing are now profiting from investments in new number systems made half a decade ago. NVIDIA transformed the AI ecosystem with its 16-bit half-precision float, providing a boost over Intel and enabling it to gain share in the valuable data center market. Google designed the Tensor Processing Unit to accelerate AI cloud services; the TPU uses an 8-bit integer format to create a 100x benefit over its competitors. Microsoft is using an 8-bit floating point with 2-bit exponents for its Project Brainwave. And China’s Huawei is using a fixed-point format in its 4G/5G base stations to gain a performance-per-Watt advantage over its US competitors, who still use IEEE floating point.
All these companies realized that Moore’s Law and Dennard scaling have reached a plateau, and that the efficiency of computation is now a direct limiter on performance, scaling, and power. For Deep Learning specifically, and High-Performance Computing in general, IEEE floating point has shown its deficiencies in silicon efficiency, information density, and even mathematical accuracy.
The posit number system is positioned as a replacement for IEEE floating point, offering significant improvements in performance per Watt, information density, and reproducibility. A posit is a tapered floating-point format with a very efficient encoding of real numbers. It has only two exceptional values: zero and NaR (not-a-real). The posit encoding improves precision compared to floats of the same bit width, which leads to higher performance and lower cost for big-data applications. Furthermore, the posit standard defines rules for reproducibility in concurrent environments, enabling higher productivity and lower cost in software development for multi-core and many-core deployments.
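To make the tapered encoding concrete, here is a minimal, self-contained sketch of a posit decoder (the function decode_posit8 and its structure are our own illustration, not an excerpt from any posit library). It decodes an 8-bit posit with a configurable number of exponent bits, es, into a double, handling the two exceptional encodings and the sign, regime, exponent, and fraction fields described above:

```cpp
#include <cmath>
#include <cstdint>
#include <iostream>

// Illustrative decoder for an 8-bit posit with 'es' exponent bits.
// value = (-1)^sign * 2^((2^es)*regime + exponent) * (1 + fraction)
double decode_posit8(std::uint8_t bits, unsigned es) {
    if (bits == 0x00) return 0.0;                 // the single encoding of zero
    if (bits == 0x80) return std::nan("");        // NaR (not-a-real)

    bool negative = bits & 0x80;
    std::uint8_t v = negative ? std::uint8_t(~bits + 1) : bits;  // two's complement for negatives

    // Regime: run of identical bits starting below the sign bit, terminated
    // by the opposite bit or by the end of the word.
    int regime_bit = (v >> 6) & 1;
    int run = 0;
    int pos = 6;
    while (pos >= 0 && ((v >> pos) & 1) == regime_bit) { ++run; --pos; }
    int regime = regime_bit ? run - 1 : -run;
    if (pos >= 0) --pos;                          // skip the regime-terminating bit

    // Exponent: up to 'es' bits; missing low-order bits are implicit zeros.
    int exponent = 0;
    for (unsigned i = 0; i < es; ++i) {
        exponent <<= 1;
        if (pos >= 0) { exponent |= (v >> pos) & 1; --pos; }
    }

    // Fraction: the remaining bits, with a hidden leading 1.
    double fraction = 0.0, weight = 0.5;
    for (; pos >= 0; --pos, weight *= 0.5) {
        if ((v >> pos) & 1) fraction += weight;
    }

    double value = std::ldexp(1.0 + fraction, (1 << es) * regime + exponent);
    return negative ? -value : value;
}

int main() {
    // posit<8,0> examples: 0x40 -> 1.0, 0x60 -> 2.0, 0x50 -> 1.5, 0xC0 -> -1.0
    std::cout << decode_posit8(0x40, 0) << ' ' << decode_posit8(0x60, 0) << ' '
              << decode_posit8(0x50, 0) << ' ' << decode_posit8(0xC0, 0) << '\n';
}
```

For posit<8,0>, the bit patterns 0x40, 0x60, 0x50, and 0xC0 decode to 1.0, 2.0, 1.5, and -1.0. The variable-length regime is what makes the format tapered: values near 1.0 get the most fraction bits, while very large and very small values trade fraction bits for dynamic range.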
In the following blog posts, we will introduce the posit number system format and report on experiments that compare IEEE floats to posits in real applications. Here is a reference to a full software environment for you tinkerers: http://stillwater-sc.com/assets/content/stillwater-universal-sw.html
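As a hedged sketch of what tinkering with that environment can look like, the snippet below assumes a recent layout of the Stillwater Universal headers; the include path and namespace have changed across releases, so adjust them to the version you install.

```cpp
#include <iostream>
// Header path and namespace assume a recent Universal release; older releases differ.
#include <universal/number/posit/posit.hpp>

int main() {
    using Posit = sw::universal::posit<32, 2>;   // standard 32-bit posit: 32 bits, es = 2

    Posit a = 0.1, b = 0.2;
    Posit c = a + b;                             // arithmetic via overloaded operators
    std::cout << "posit<32,2>: 0.1 + 0.2 = " << c << '\n';
    std::cout << "as double  : " << double(c) << '\n';
}
```

Because posit<nbits, es> behaves as a drop-in arithmetic type with overloaded operators, experimenting with posits in an existing numerical kernel is largely a matter of swapping the type for double or float.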