Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling

Publication Type:
Conference Proceeding
Citation:
ICLR 2018
Issue Date:
2018