Pay attention to what you read: Non-recurrent handwritten text-line recognition
Authors:
Highlights:
• Novel adaptation of transformers to handwriting recognition, bypassing recurrent neural nets (a minimal sketch follows this list).
• Competitive results achieved in a low-resource scenario with a synthetically pretrained model.
• Extensive ablation and comparative studies conducted to understand and properly adapt the transformer for HTR.
• Implicit language modelling ability demonstrated.
• State-of-the-art performance achieved on the public IAM dataset.
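The first highlight names the core idea: replace the recurrent decoder of conventional HTR pipelines with a transformer that attends directly over visual features of the whole text line. The sketch below is a minimal, hypothetical illustration of that general architecture in PyTorch, not the authors' implementation; the class name, layer sizes, CNN backbone, and vocabulary size are all assumptions, and positional encodings are omitted for brevity.

```python
# Hypothetical sketch of a non-recurrent (transformer-based) HTR model,
# NOT the paper's code: a small CNN turns a text-line image into a
# horizontal sequence of visual features, and a standard transformer
# decodes characters by attending over that sequence.
import torch
import torch.nn as nn

class TransformerHTR(nn.Module):
    def __init__(self, vocab_size: int, d_model: int = 256,
                 nhead: int = 4, num_layers: int = 4):
        super().__init__()
        # CNN backbone: downsamples and collapses the height dimension,
        # leaving one feature vector per horizontal position.
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, d_model, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, None)),       # (B, d_model, 1, W')
        )
        self.char_embed = nn.Embedding(vocab_size, d_model)
        # A real model would add positional encodings to both the visual
        # and character sequences; they are left out here for brevity.
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True,
        )
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, images: torch.Tensor, tgt_tokens: torch.Tensor):
        # images: (B, 1, H, W); tgt_tokens: (B, T) teacher-forced characters.
        feats = self.backbone(images).squeeze(2).permute(0, 2, 1)  # (B, W', d_model)
        tgt = self.char_embed(tgt_tokens)                          # (B, T, d_model)
        # Causal mask so each character attends only to previous ones.
        mask = self.transformer.generate_square_subsequent_mask(tgt_tokens.size(1))
        dec = self.transformer(feats, tgt, tgt_mask=mask)
        return self.out(dec)                                       # (B, T, vocab_size)

# Usage: character logits for two 64x256 line images and 10-step targets.
model = TransformerHTR(vocab_size=80)
logits = model(torch.randn(2, 1, 64, 256), torch.randint(0, 80, (2, 10)))
```

Because the decoder is attention-based rather than recurrent, all target positions are processed in parallel during training, which is one practical motivation for dropping the recurrent nets mentioned in the highlights.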
Keywords: Handwriting text recognition, Transformers, Self-Attention, Implicit language model
Article history: Received 31 July 2020, Revised 25 April 2022, Accepted 29 April 2022, Available online 4 May 2022, Version of Record 10 May 2022.
DOI: https://doi.org/10.1016/j.patcog.2022.108766