
How can I contribute?
Test your model against our benchmark!
This is the easiest way to contribute to the txt2sql community! Compare your model's performance against other models on classic txt2sql questions as well as on generative responses.
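If you want a feel for what a benchmark-style check involves before you submit, the sketch below measures execution accuracy: it runs a model's predicted SQL and the gold SQL against the same database and compares the result sets. This is a generic illustration, not our actual evaluation harness; `shop.db`, the example question, and the `my_model` call are all hypothetical placeholders.

```python
import sqlite3
from collections import Counter

def execution_match(db_path: str, predicted_sql: str, gold_sql: str) -> bool:
    """Return True if predicted and gold SQL yield the same rows (order-insensitive)."""
    conn = sqlite3.connect(db_path)
    try:
        pred_rows = conn.execute(predicted_sql).fetchall()
        gold_rows = conn.execute(gold_sql).fetchall()
    except sqlite3.Error:
        return False  # the SQL failed to parse or execute
    finally:
        conn.close()
    return Counter(pred_rows) == Counter(gold_rows)

# Hypothetical usage -- substitute your own model call and database:
# predicted = my_model.generate("How many customers placed an order in 2023?", schema)
# print(execution_match("shop.db", predicted,
#       "SELECT COUNT(DISTINCT customer_id) FROM orders "
#       "WHERE strftime('%Y', order_date) = '2023'"))
```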
Evaluate soft-eval model responses
Every binary evaluation counts toward our soft-eval ranking. Help us surface the best models by judging model-to-model responses on the Evaluate page.
Contribute to our open-source PyPI package
Our PyPI package is live and active! You can contribute by making a PR, joining our public discussions, or giving us feedback.
Send in a database schema
Do you have a database you want txt2sql-hacked? We can help! We can incorporate a version of your database schema into our next release iteration, putting it in front of a community of researchers and developers.
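Not sure what to send? A schema-only export (table definitions, no data) is usually the right starting point. The sketch below shows one way to pull the CREATE TABLE statements out of a SQLite database; the file names are placeholders, and this is just an illustrative approach, not a required submission format.

```python
import sqlite3

# Illustrative only: dump the CREATE TABLE statements from a SQLite database so
# the schema (and none of the data) can be shared. "my_database.db" is a placeholder.
conn = sqlite3.connect("my_database.db")
ddl = [row[0] for row in conn.execute(
    "SELECT sql FROM sqlite_master WHERE type = 'table' AND sql IS NOT NULL"
)]
conn.close()

with open("my_schema.sql", "w") as f:
    f.write(";\n\n".join(ddl) + ";\n")
```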
Contribute natural language queries
Have interesting business questions that challenge txt2sql models? Share your natural language queries and help expand our benchmark with diverse, real-world scenarios that push the boundaries of AI.
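For example, a single contribution might be a question paired with the SQL you would expect a correct model to produce. The schema, table names, and question below are invented purely for illustration.

```python
# A hypothetical question/SQL pair over an invented retail schema -- the kind of
# real-world business question that stretches txt2sql models.
contribution = {
    "question": "Which three product categories had the highest refund rate last quarter?",
    "sql": """
        SELECT p.category,
               AVG(CASE WHEN o.status = 'refunded' THEN 1.0 ELSE 0.0 END) AS refund_rate
        FROM orders o
        JOIN products p ON p.product_id = o.product_id
        WHERE o.order_date >= DATE('now', '-3 months')
        GROUP BY p.category
        ORDER BY refund_rate DESC
        LIMIT 3;
    """,
}
```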
Database Schema Performance
Your database could be the next challenge that pushes txt2sql research forward!
See how contributed database schemas perform across different models.
| Database Name | Number of Tables | Best Model Score | Average Score |
|---|---|---|---|
| TBD | X | XX.X% | XX.X% |
Database performance measured across evaluated models in the benchmark