
Commit b84a913

Author: baaraban
Commit message: Initial commit
0 parents  commit b84a913

19 files changed: +340514 -0 lines changed

.gitignore

+6
@@ -0,0 +1,6 @@
+**/__pycache__/
+**/.ipynb_checkpoints/
+**/.idea/
+**/embeddings/
+
+**/tryings.ipynb

DL-NLP-homework.pdf

77.1 KB
Binary file not shown.

README.md

+18
@@ -0,0 +1,18 @@
+## Overview
+The main file for grading is solution.ipynb in the root folder. <br>
+The HTML version of this file can be found in the "html" folder.<br>
+All of the supporting code is extracted into logically separated .py files inside the "scripts" folder.<br>
+
+## Task in detail
+1. <b>Implement functionality to read and process the NER 2003 English Shared Task data in CoNLL file format; the data will be provided (10% of the score).</b><br>
+The needed functionality can be found in scripts/util.py (a format-reading sketch is given after the diff).
+2. <b>Implement 3 strategies for loading the embeddings.</b><br>
+The needed functionality is located in scripts/embedding_fabric.py (one possible reading of the strategies is sketched after the diff).
+3. <b>Implement training on batches.</b><br>
+The batching function is in scripts/utils.py; the logic for training in batches is implemented in scripts/training_model.py (a minimal batching sketch follows the diff).
+4. <b>Implement the calculation of token-level Precision / Recall / F1 / F0.5 scores averaged over all classes.</b><br>
+The implementation is in scripts/metrics.py (a metric sketch follows the diff).
+5. <b>Report the performance (F1 and F0.5 scores) on the dev / test subsets w.r.t. the epoch number during training, for the first 5 epochs, for each strategy of loading the embeddings.</b><br>
+The experiment execution and results can be recreated by running the solution.ipynb file.
+I did not follow the instructions strictly: for each model and epoch I validated the results on the dev set, but the evaluation on the test subset was done only after training.<br>
+Sorry about that, I noticed this requirement too late.
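
The scripts referenced in the README are not part of this diff, so the snippets below are only hedged sketches of the described functionality, not the repository's code. First, a minimal reader for CoNLL-2003 NER files, assuming the standard layout of one token per line with whitespace-separated columns (token, POS tag, chunk tag, NER tag), blank lines between sentences, and `-DOCSTART-` document markers; the function name `read_conll2003` is illustrative.

```python
# Sketch only: not the implementation from scripts/util.py.
def read_conll2003(path):
    """Return a list of sentences, each a list of (token, ner_tag) pairs."""
    sentences, current = [], []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("-DOCSTART-"):
                if current:
                    sentences.append(current)
                    current = []
                continue
            fields = line.split()
            # First column is the token, last column is the NER tag (e.g. B-PER, O).
            current.append((fields[0], fields[-1]))
    if current:
        sentences.append(current)
    return sentences
```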
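The README does not spell out the 3 embedding-loading strategies, so the following is only one plausible interpretation: random initialization, pretrained vectors kept frozen, and pretrained vectors used as initialization and fine-tuned. The factory below is a framework-agnostic sketch and is not the code from scripts/embedding_fabric.py; the function name, strategy labels, and initialization range are all assumptions.

```python
import numpy as np

# Sketch only: builds just the initial embedding matrix for an assumed set of strategies.
def build_embedding_matrix(vocab, dim, strategy, pretrained=None):
    """vocab: token -> row index; pretrained: token -> vector of length dim (optional)."""
    # Start from small random vectors; pretrained strategies overwrite known tokens.
    matrix = np.random.uniform(-0.25, 0.25, size=(len(vocab), dim)).astype("float32")
    if strategy == "random":
        trainable = True
    elif strategy in ("pretrained_frozen", "pretrained_finetuned"):
        for token, idx in vocab.items():
            vector = (pretrained or {}).get(token)
            if vector is not None:
                matrix[idx] = vector
        trainable = (strategy == "pretrained_finetuned")
    else:
        raise ValueError(f"unknown strategy: {strategy}")
    return matrix, trainable
```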
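For training on batches, a minimal batch iterator could look like the sketch below; the real batching function in scripts/utils.py may pad sequences, bucket by length, or return tensors, none of which is shown here.

```python
import random

# Sketch only: a plain batch iterator over preprocessed examples.
def iterate_batches(examples, batch_size, shuffle=True):
    """Yield lists of up to batch_size examples; the last batch may be smaller."""
    order = list(range(len(examples)))
    if shuffle:
        random.shuffle(order)
    for start in range(0, len(order), batch_size):
        yield [examples[i] for i in order[start:start + batch_size]]
```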
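For the token-level metrics, F-beta is defined as F_beta = (1 + beta^2) * P * R / (beta^2 * P + R), so F1 weights precision and recall equally while F0.5 weights precision more heavily. The sketch below macro-averages precision and recall over classes and computes F1/F0.5 from those macro values; averaging per-class F scores instead is an equally common convention, and how scripts/metrics.py treats the "O" tag or zero divisions is not shown in this diff.

```python
from collections import Counter

def f_beta(precision, recall, beta):
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Sketch only: token-level scores macro-averaged over classes.
def token_level_scores(gold_tags, pred_tags, betas=(1.0, 0.5)):
    """gold_tags, pred_tags: flat, aligned lists of per-token tags."""
    tp, fp, fn = Counter(), Counter(), Counter()
    for gold, pred in zip(gold_tags, pred_tags):
        if gold == pred:
            tp[gold] += 1
        else:
            fp[pred] += 1
            fn[gold] += 1
    classes = set(tp) | set(fp) | set(fn)
    per_class = {
        c: (
            tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0,  # precision
            tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0,  # recall
        )
        for c in classes
    }
    n = max(len(classes), 1)
    macro_p = sum(p for p, _ in per_class.values()) / n
    macro_r = sum(r for _, r in per_class.values()) / n
    scores = {"precision": macro_p, "recall": macro_r}
    for beta in betas:
        scores[f"F{beta:g}"] = f_beta(macro_p, macro_r, beta)
    return scores
```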
