Title:

The General Language Understanding Evaluation (GLUE) For Evaluating Foundation Models

Original Title:

SuperGLUE Benchmark

Data Science

Data Science - General, Advanced Contents

The content is about:

The General Language Understanding Evaluation (GLUE) benchmark is a collection of resources for training, evaluating, and analyzing natural language understanding systems across nine English sentence- and sentence-pair tasks.
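Each GLUE task is scored with its own metric (accuracy, F1, Matthews correlation, or Pearson/Spearman correlation, depending on the task). As a minimal sketch of one of these, the snippet below implements the Matthews correlation coefficient, the metric GLUE uses for the CoLA acceptability task; the function name and interface are illustrative, not part of any official GLUE code.

```python
import math

def matthews_corrcoef(labels, preds):
    """Matthews correlation coefficient for binary 0/1 labels.

    This is the metric GLUE reports for the CoLA task. Returns a
    value in [-1, 1]; 0.0 is returned when the denominator is zero
    (the conventional degenerate-case value).
    """
    tp = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 1)
    tn = sum(1 for y, p in zip(labels, preds) if y == 0 and p == 0)
    fp = sum(1 for y, p in zip(labels, preds) if y == 0 and p == 1)
    fn = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom
```

Perfect predictions yield 1.0, chance-level predictions yield values near 0.0, which is why GLUE prefers this metric over raw accuracy for the heavily class-imbalanced CoLA data.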

Other Features:

Paper, Code, Tasks, Leaderboard

Name of Reference:

GLUE Benchmark

Link of Reference:

Other independent links related to important downloads or content are listed separately below.

Downloads or other useful links

GLUE Benchmark (Previous version)

Paper

Leaderboard