Syed Asad Alam, Andrew Anderson, Barbara Barabasz and David Gregg, "Winograd Convolution for Deep Neural Networks: Efficient Point Selection", ACM Transactions on Embedded Computing Systems, December 2022

Winograd's fast convolution algorithm can greatly reduce the high computational cost of deep neural networks. However, there is a fundamental trade-off between the computational cost of the convolution and the floating-point accuracy of the result. This paper proposes a novel method to improve the numerical accuracy of Winograd convolution by selecting the algorithm's evaluation points so that intermediate values simplify through cancellation. The proposed point selections reduce the error by up to 35% with no additional computational cost.

https://dl.acm.org/doi/full/10.1145/3524069
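
For context, the sketch below is a minimal NumPy implementation of the classic F(2,3) Winograd 1-D convolution, built from the standard evaluation points 0, 1, -1 and the point at infinity. It only illustrates the reduced multiplication count and the role played by the chosen points; it is not the point-selection scheme proposed in the paper.

    # Minimal F(2,3) Winograd 1-D convolution sketch (standard Lavin & Gray
    # matrices for points 0, 1, -1, infinity). Illustrative only; not the
    # paper's proposed point selection.
    import numpy as np

    # Input, filter, and output transform matrices for F(2,3).
    BT = np.array([[1,  0, -1,  0],
                   [0,  1,  1,  0],
                   [0, -1,  1,  0],
                   [0,  1,  0, -1]], dtype=np.float32)
    G  = np.array([[1.0,  0.0, 0.0],
                   [0.5,  0.5, 0.5],
                   [0.5, -0.5, 0.5],
                   [0.0,  0.0, 1.0]], dtype=np.float32)
    AT = np.array([[1, 1,  1,  0],
                   [0, 1, -1, -1]], dtype=np.float32)

    def winograd_f23(d, g):
        # Two outputs of a 3-tap convolution using 4 multiplications
        # (the element-wise product) instead of the 6 of the direct method.
        return AT @ ((G @ g) * (BT @ d))

    d = np.array([1.0, 2.0, 3.0, 4.0], dtype=np.float32)   # input tile
    g = np.array([0.5, 0.25, 0.125], dtype=np.float32)     # filter taps
    print(winograd_f23(d, g))                    # Winograd result
    print(np.correlate(d, g, mode='valid'))      # direct result for comparison

For larger output tiles the transform matrices grow and their entries depend on the chosen points, which is where the accuracy trade-off studied in the paper arises.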