
Jakob Uszkoreit

Jakob Uszkoreit is a German computer scientist and one of the eight co-authors of "Attention Is All You Need," the 2017 paper that introduced the Transformer architecture. His work at Google contributed to the design and validation of the Transformer. He later applied the insights from that work to biological sequence modelling, co-founding Inceptive to develop RNA therapeutics using deep learning, at the intersection of NLP architecture research and computational biology.

Early Life and Education

Born in Germany, Uszkoreit comes from a family with deep roots in computational linguistics — his father, Hans Uszkoreit, is a prominent computational linguist at the German Research Center for Artificial Intelligence (DFKI) and Saarland University. Jakob studied computer science and joined Google, where he worked on natural language understanding and neural architecture design.

2010s: Worked at Google on natural language processing and machine learning
2017: Co-authored "Attention Is All You Need"
2021: Co-founded Inceptive, applying Transformer models to RNA design

Key Contributions

Uszkoreit contributed to the Transformer's design; the paper's contribution notes credit him with proposing the replacement of recurrence with self-attention. The architecture injects sequence order information through sinusoidal positional encodings: PE(pos, 2i) = sin(pos / 10000^(2i/d_model)) and PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model)). This scheme lets the model generalise to sequence lengths not seen during training and provides a smooth, continuous representation of position.
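As a sketch, the sinusoidal encoding above can be computed directly with NumPy; the sequence length and model dimension chosen here are illustrative, not values from the paper:

```python
import numpy as np

def positional_encoding(max_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encoding from "Attention Is All You Need".

    PE(pos, 2i)   = sin(pos / 10000^(2i/d_model))
    PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model))
    """
    positions = np.arange(max_len)[:, np.newaxis]        # shape (max_len, 1)
    i = np.arange(0, d_model, 2)[np.newaxis, :]          # even dims: 0, 2, ...
    angles = positions / np.power(10000.0, i / d_model)  # (max_len, d_model/2)

    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # even indices get the sine channel
    pe[:, 1::2] = np.cos(angles)  # odd indices get the cosine channel
    return pe

pe = positional_encoding(max_len=128, d_model=16)
```

Because each position is a fixed, deterministic function of pos, the table can be extended to any length at inference time, which is what allows generalisation beyond training-time sequence lengths.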

His subsequent work demonstrated that the self-attention mechanisms developed for natural language could be applied to biological sequences — proteins and RNA — where the same fundamental challenge of modelling long-range dependencies in sequences arises. This cross-pollination between NLP and biology exemplifies how architectural innovations in computational linguistics can have far-reaching impact.
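To make the cross-domain point concrete, here is a minimal single-head scaled dot-product self-attention pass over nucleotide tokens instead of word tokens. All names, sizes, and the random weights are assumptions for illustration, not Inceptive's actual models:

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB = "ACGU"                 # RNA alphabet in place of a word vocabulary
seq = "AUGGCCAUU"              # toy RNA fragment
tokens = np.array([VOCAB.index(c) for c in seq])

d_model = 8
embed = rng.normal(size=(len(VOCAB), d_model))
x = embed[tokens]              # (seq_len, d_model) token embeddings

# Single-head self-attention: softmax(Q K^T / sqrt(d)) V
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
Q, K, V = x @ Wq, x @ Wk, x @ Wv
scores = Q @ K.T / np.sqrt(d_model)

# Numerically stable softmax over each row of the score matrix
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)

# Every position attends to every other in one step, regardless of distance
out = weights @ V              # (seq_len, d_model)
```

Nothing in the mechanism refers to language: attention only sees a sequence of embeddings, which is why the same machinery transfers to proteins and RNA, where long-range pairings matter as much as long-range syntactic dependencies do in text.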

"The principles of attention and representation learning that work for language turn out to be remarkably effective for biological sequences as well." — Jakob Uszkoreit, on applying Transformers to biology

Legacy

Uszkoreit's contributions to the Transformer paper helped enable the revolution in language modelling. His subsequent career demonstrates the broad applicability of NLP architectures beyond language, particularly in computational biology. His family connection to computational linguistics — bridging symbolic and neural approaches across generations — is a unique thread in the field's history.



References

  1. Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., & Polosukhin, I. (2017). Attention is all you need. Advances in Neural Information Processing Systems, 30, 5998–6008.
  2. Uszkoreit, H. (2000). What is computational linguistics? DFKI Technical Report.
  3. Rives, A., et al. (2021). Biological structure and function emerge from scaling unsupervised learning to 250 million protein sequences. Proceedings of the National Academy of Sciences, 118(15), e2016239118.
  4. Lin, T., Wang, Y., Liu, X., & Qiu, X. (2022). A survey of transformers. AI Open, 3, 111–132.
