OpenAlex Citation Counts

OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find utility in this listing of citing articles!

If you click an article title, you'll navigate to the article as listed in CrossRef. If you click an Open Access link, you'll navigate to that article's "best Open Access location". Clicking a citation count will open this listing for that article. Lastly, at the bottom of the page, you'll find basic pagination options.
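Listings like this can also be retrieved programmatically: the OpenAlex works endpoint supports a `cites:` filter that returns the works citing a given OpenAlex work ID, with `page` and `per-page` parameters for pagination. Below is a minimal sketch using only the Python standard library; the work ID shown is a hypothetical placeholder, so substitute the real OpenAlex ID of the requested article.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API_BASE = "https://api.openalex.org/works"

def citing_works_url(openalex_id: str, page: int = 1, per_page: int = 25) -> str:
    """Build the OpenAlex query URL for works citing the given work ID."""
    params = {"filter": f"cites:{openalex_id}", "page": page, "per-page": per_page}
    return f"{API_BASE}?{urlencode(params)}"

def fetch_citing_works(openalex_id: str, page: int = 1, per_page: int = 25):
    """Fetch one page of citing works as (title, cited_by_count) pairs."""
    with urlopen(citing_works_url(openalex_id, page, per_page)) as resp:
        data = json.load(resp)
    return [(w["title"], w["cited_by_count"]) for w in data["results"]]

if __name__ == "__main__":
    # "W0000000000" is a placeholder, not the real ID of the requested article.
    print(citing_works_url("W0000000000"))
```

Sorting the results by citation count on the server side is also possible via the API's `sort` parameter, which is how a "most-cited first" view like this one can be produced without client-side sorting.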

Requested Article:

Biological structure and function emerge from scaling unsupervised learning to 250 million protein sequences
Alexander Rives, Joshua Meier, Tom Sercu, et al.
Proceedings of the National Academy of Sciences (2021) Vol. 118, Iss. 15
Open Access | Times Cited: 1827

Showing 1-25 of 1827 citing articles:

Highly accurate protein structure prediction with AlphaFold
John Jumper, Richard Evans, Alexander Pritzel, et al.
Nature (2021) Vol. 596, Iss. 7873, pp. 583-589
Open Access | Times Cited: 28635

Evolutionary-scale prediction of atomic-level protein structure with a language model
Zeming Lin, Halil Akin, Roshan Rao, et al.
Science (2023) Vol. 379, Iss. 6637, pp. 1123-1130
Open Access | Times Cited: 1818

SignalP 6.0 predicts all five types of signal peptides using protein language models
Felix Teufel, José Juan Almagro Armenteros, Alexander Rosenberg Johansen, et al.
Nature Biotechnology (2022) Vol. 40, Iss. 7, pp. 1023-1025
Open Access | Times Cited: 1546

Evaluating Large Language Models Trained on Code
Mark Chen, Jerry Tworek, Heewoo Jun, et al.
arXiv (Cornell University) (2021)
Open Access | Times Cited: 1104

DeepTMHMM predicts alpha and beta transmembrane proteins using deep neural networks
Jeppe Hallgren, Konstantinos D. Tsirigos, Mads Damgaard Pedersen, et al.
bioRxiv (Cold Spring Harbor Laboratory) (2022)
Open Access | Times Cited: 813

Accurate proteome-wide missense variant effect prediction with AlphaMissense
Jun Cheng, Guido Novati, Joshua Pan, et al.
Science (2023) Vol. 381, Iss. 6664
Open Access | Times Cited: 672

Generative Pretraining From Pixels
Mark Chen, Alec Radford, Rewon Child, et al.
International Conference on Machine Learning (2020) Vol. 1, pp. 1691-1703
Closed Access | Times Cited: 610

A survey of transformers
Tianyang Lin, Yuxin Wang, Xiangyang Liu, et al.
AI Open (2022) Vol. 3, pp. 111-132
Open Access | Times Cited: 596

Scientific discovery in the age of artificial intelligence
Hanchen Wang, Tianfan Fu, Yuanqi Du, et al.
Nature (2023) Vol. 620, Iss. 7972, pp. 47-60
Closed Access | Times Cited: 582

Large language models generate functional protein sequences across diverse families
Ali Madani, Ben Krause, Eric R. Greene, et al.
Nature Biotechnology (2023) Vol. 41, Iss. 8, pp. 1099-1106
Open Access | Times Cited: 514

Modeling aspects of the language of life through transfer-learning protein sequences
Michael Heinzinger, Ahmed Elnaggar, Yu Wang, et al.
BMC Bioinformatics (2019) Vol. 20, Iss. 1
Open Access | Times Cited: 481

ProteinBERT: a universal deep-learning model of protein sequence and function
Nadav Brandes, Dan Ofer, Yam Peleg, et al.
Bioinformatics (2022) Vol. 38, Iss. 8, pp. 2102-2110
Open Access | Times Cited: 445

DeepLoc 2.0: multi-label subcellular localization prediction using protein language models
Vineet Thumuluri, José Juan Almagro Armenteros, Alexander Rosenberg Johansen, et al.
Nucleic Acids Research (2022) Vol. 50, Iss. W1, pp. W228-W234
Open Access | Times Cited: 370

ProtGPT2 is a deep unsupervised language model for protein design
Noelia Ferruz, Steffen Schmidt, Birte Höcker
Nature Communications (2022) Vol. 13, Iss. 1
Open Access | Times Cited: 369

DeepGOPlus: improved protein function prediction from sequence
Maxat Kulmanov, Robert Hoehndorf
Bioinformatics (2019) Vol. 36, Iss. 2, pp. 422-429
Open Access | Times Cited: 342

Learning the protein language: Evolution, structure, and function
Tristan Bepler, Bonnie Berger
Cell Systems (2021) Vol. 12, Iss. 6, pp. 654-669.e3
Open Access | Times Cited: 330

Evaluating Protein Transfer Learning with TAPE
Roshan Rao, Nicholas Bhattacharya, Neil Thomas, et al.
arXiv (Cornell University) (2019)
Open Access | Times Cited: 309

Single-sequence protein structure prediction using a language model and deep learning
Ratul Chowdhury, Nazim Bouatta, Surojit Biswas, et al.
Nature Biotechnology (2022) Vol. 40, Iss. 11, pp. 1617-1623
Open Access | Times Cited: 285

Protein design and variant prediction using autoregressive generative models
Jung-Eun Shin, Adam J. Riesselman, Aaron W. Kollasch, et al.
Nature Communications (2021) Vol. 12, Iss. 1
Open Access | Times Cited: 269

The language of proteins: NLP, machine learning & protein sequences
Dan Ofer, Nadav Brandes, Michal Linial
Computational and Structural Biotechnology Journal (2021) Vol. 19, pp. 1750-1758
Open Access | Times Cited: 262

High-resolution de novo structure prediction from primary sequence
Ruidong Wu, Fan Ding, Rui Wang, et al.
bioRxiv (Cold Spring Harbor Laboratory) (2022)
Open Access | Times Cited: 258

Evolutionary-scale prediction of atomic level protein structure with a language model
Zeming Lin, Halil Akin, Roshan Rao, et al.
bioRxiv (Cold Spring Harbor Laboratory) (2022)
Open Access | Times Cited: 251

Biological structure and function emerge from scaling unsupervised learning to 250 million protein sequences
Alexander Rives, Joshua Meier, Tom Sercu, et al.
bioRxiv (Cold Spring Harbor Laboratory) (2019)
Open Access | Times Cited: 233

Learning inverse folding from millions of predicted structures
Chloe Hsu, Robert Verkuil, Jason Liu, et al.
bioRxiv (Cold Spring Harbor Laboratory) (2022)
Open Access | Times Cited: 221

Page 1 - Next Page
