Impact of pre-training on background knowledge and societal bias
<p>With appropriate pre-training on unstructured text, larger and more accurate neural network models can be trained. Unfortunately, unstructured pre-training data may contain undesired societal biases, which a model may mimic and amplify. This thesis focuses on both improving unsuperv...</p>
| Field | Value |
|---|---|
| Main Author | Kocijan, V |
| Other Authors | Lukasiewicz, T |
| Format | Thesis |
| Language | English |
| Published | 2021 |
Similar Items

- Capsule neural tensor networks with multi-aspect information for Few-shot Knowledge Graph Completion
  by: Li, Qianyu, et al.
  Published: (2023)
- Fusing topology contexts and logical rules in language models for knowledge graph completion
  by: Lin, Qika, et al.
  Published: (2023)
- A Knowledge Based Approach to Facilitate Engineering Design
  by: Seshasai, Satwik, et al.
  Published: (2002)
- Toward a Theory of Representation Design
  by: Baalen, Jeffrey Van
  Published: (2004)
- The use of microorganisms to remove drilling fluid filter cake
  by: Amy Shareena Abd. Mubin, 1986-, et al.
  Published: (2009)