---
language: de
license: mit
datasets:
- wikipedia
- OPUS
- OpenLegalData
---

# German ELECTRA base
|
|
Released in October 2020, this is a German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our [paper](https://arxiv.org/pdf/2010.10906.pdf), we outline the steps taken to train our model. Our evaluation suggests that this model is somewhat undertrained. For best performance from a base-sized model, we recommend deepset/gbert-base.
|
|
## Overview
**Paper:** [here](https://arxiv.org/pdf/2010.10906.pdf)
**Architecture:** ELECTRA base (discriminator)
**Language:** German
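This checkpoint is the ELECTRA *discriminator*: during pre-training it predicts, for every input token, whether that token is original or was substituted by a small generator network (replaced-token detection). The shape of that interface can be sketched with a randomly initialized toy configuration from `transformers`; the sizes below are illustrative only, not the real gelectra-base dimensions, and no model download is needed:

```python
import torch
from transformers import ElectraConfig, ElectraForPreTraining

# Toy configuration for illustration only -- the released gelectra-base
# uses BERT-base-like sizes (12 layers, hidden size 768), not these.
config = ElectraConfig(
    vocab_size=100,
    embedding_size=16,
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=64,
)
model = ElectraForPreTraining(config)  # randomly initialized discriminator
model.eval()

input_ids = torch.randint(0, config.vocab_size, (1, 8))  # batch of 1, 8 tokens
with torch.no_grad():
    logits = model(input_ids).logits  # one original-vs-replaced logit per token

print(logits.shape)  # torch.Size([1, 8])
```

For actual use, load the released weights instead, e.g. `AutoModel.from_pretrained("deepset/gelectra-base")`, and fine-tune with a task-specific head.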
|
|
## Performance
```
GermEval18 Coarse: 76.02
GermEval18 Fine:   42.22
GermEval14:        86.02
```
|
|
See also:
- [deepset/gbert-base](https://huggingface.co/deepset/gbert-base)
- [deepset/gbert-large](https://huggingface.co/deepset/gbert-large)
- [deepset/gelectra-base](https://huggingface.co/deepset/gelectra-base)
- [deepset/gelectra-large](https://huggingface.co/deepset/gelectra-large)
- [deepset/gelectra-base-generator](https://huggingface.co/deepset/gelectra-base-generator)
- [deepset/gelectra-large-generator](https://huggingface.co/deepset/gelectra-large-generator)
|
|
## Authors
Branden Chan: `branden.chan [at] deepset.ai`
Stefan Schweter: `stefan [at] schweter.eu`
Timo Möller: `timo.moeller [at] deepset.ai`
|
|
## About us
|
|
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
    <div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
        <img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/deepset-logo-colored.png" class="w-40"/>
    </div>
    <div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
        <img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/haystack-logo-colored.png" class="w-40"/>
    </div>
</div>

[deepset](http://deepset.ai/) is the company behind the production-ready open-source AI framework [Haystack](https://haystack.deepset.ai/).
|
|
Some of our other work:
- [Distilled roberta-base-squad2 (aka "tinyroberta-squad2")](https://huggingface.co/deepset/tinyroberta-squad2)
- [German BERT](https://deepset.ai/german-bert), [GermanQuAD and GermanDPR](https://deepset.ai/germanquad), [German embedding model](https://huggingface.co/mixedbread-ai/deepset-mxbai-embed-de-large-v1)
- [deepset Cloud](https://www.deepset.ai/deepset-cloud-product), [deepset Studio](https://www.deepset.ai/deepset-studio)
|
|
## Get in touch and join the Haystack community
|
|
<p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://docs.haystack.deepset.ai">Documentation</a></strong>.


We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community">Discord community open to everyone!</a></strong></p>
|
|
[Twitter](https://twitter.com/Haystack_AI) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://haystack.deepset.ai/) | [YouTube](https://www.youtube.com/@deepset_ai)
|
|
By the way: [we're hiring!](http://www.deepset.ai/jobs)