rbelanec committed
Commit b361080 · verified · 1 Parent(s): 9bb1a73

Model save

Files changed (2):
  1. README.md +262 -0
  2. adapter_model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,262 @@
---
library_name: peft
license: gemma
base_model: google/gemma-3-1b-it
tags:
- llama-factory
- generated_from_trainer
model-index:
- name: train_multirc_1745950262
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# train_multirc_1745950262

This model is a fine-tuned version of [google/gemma-3-1b-it](https://huggingface.co/google/gemma-3-1b-it) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.3402
- Num Input Tokens Seen: 76963024

## Model description

More information needed

## Intended uses & limitations

More information needed

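Until the author fills this section in: the repo holds a PEFT adapter for [google/gemma-3-1b-it](https://huggingface.co/google/gemma-3-1b-it), so inference loads the base model and attaches the adapter on top. A minimal loading sketch, assuming the adapter is published as `rbelanec/train_multirc_1745950262` (inferred from the committer and run name, not stated in the card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "google/gemma-3-1b-it"
adapter_id = "rbelanec/train_multirc_1745950262"  # assumed repo id, adjust as needed

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(base, adapter_id)  # attaches the trained adapter

inputs = tokenizer("Paragraph: ... Question: ... Answer:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```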
## Training and evaluation data

More information needed

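The trainer did not record the dataset, but the run name `train_multirc_1745950262` suggests the MultiRC task from SuperGLUE; that is an inference from the name, not something the card states. A loading sketch under that assumption:

```python
from datasets import load_dataset

# Assumption: "multirc" in the run name refers to SuperGLUE MultiRC. The
# "super_glue" Hub dataset was historically script-based, so this exact call
# may need a different repo id or an older `datasets` release.
multirc = load_dataset("super_glue", "multirc")
print(multirc["train"][0])  # paragraph, question, answer, binary label
```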
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 123
- gradient_accumulation_steps: 2
- total_train_batch_size: 4
- optimizer: adamw_torch (betas=(0.9, 0.999), epsilon=1e-08, no additional optimizer arguments)
- lr_scheduler_type: cosine
- training_steps: 40000

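For orientation, the list above maps roughly onto Hugging Face `TrainingArguments` as sketched below. The run was launched through LLaMA-Factory, which uses its own YAML configs, so this is an illustrative translation rather than the original configuration; `output_dir` is assumed, and warmup is left at the library default since the card lists none.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="train_multirc_1745950262",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=123,
    gradient_accumulation_steps=2,  # effective train batch: 2 * 2 = 4
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    max_steps=40_000,
)
```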
### Training results

| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|:-------------:|:------:|:-----:|:---------------:|:-----------------:|
| 2.857 | 0.0326 | 200 | 3.5027 | 385088 |
| 3.2623 | 0.0653 | 400 | 3.4216 | 770352 |
| 2.9273 | 0.0979 | 600 | 3.3862 | 1160480 |
| 2.3565 | 0.1305 | 800 | 3.3748 | 1543296 |
| 2.9929 | 0.1631 | 1000 | 3.3255 | 1931808 |
| 3.3083 | 0.1958 | 1200 | 3.3574 | 2315744 |
| 2.5698 | 0.2284 | 1400 | 3.3604 | 2710208 |
| 3.5809 | 0.2610 | 1600 | 3.3383 | 3095216 |
| 3.2428 | 0.2937 | 1800 | 3.3427 | 3483504 |
| 3.0597 | 0.3263 | 2000 | 3.3370 | 3872976 |
| 3.813 | 0.3589 | 2200 | 3.3336 | 4254272 |
| 3.866 | 0.3915 | 2400 | 3.3476 | 4637376 |
| 3.2625 | 0.4242 | 2600 | 3.3691 | 5019664 |
| 4.0913 | 0.4568 | 2800 | 3.3289 | 5406912 |
| 3.2895 | 0.4894 | 3000 | 3.3188 | 5786080 |
| 3.27 | 0.5221 | 3200 | 3.3495 | 6167600 |
| 3.4807 | 0.5547 | 3400 | 3.3346 | 6553904 |
| 3.9413 | 0.5873 | 3600 | 3.3282 | 6936656 |
| 3.0852 | 0.6200 | 3800 | 3.3407 | 7321136 |
| 3.3249 | 0.6526 | 4000 | 3.3292 | 7709856 |
| 2.8785 | 0.6852 | 4200 | 3.3577 | 8100560 |
| 3.0068 | 0.7178 | 4400 | 3.3204 | 8482208 |
| 3.9827 | 0.7505 | 4600 | 3.3427 | 8868016 |
| 3.1844 | 0.7831 | 4800 | 3.3166 | 9254560 |
| 2.9282 | 0.8157 | 5000 | 3.3262 | 9634544 |
| 3.329 | 0.8484 | 5200 | 3.3274 | 10013984 |
| 3.8631 | 0.8810 | 5400 | 3.3282 | 10397792 |
| 3.3116 | 0.9136 | 5600 | 3.3371 | 10784512 |
| 2.5499 | 0.9462 | 5800 | 3.3406 | 11165168 |
| 2.2533 | 0.9789 | 6000 | 3.3248 | 11553056 |
| 2.9946 | 1.0114 | 6200 | 3.3201 | 11940352 |
| 2.331 | 1.0440 | 6400 | 3.3198 | 12331920 |
| 2.64 | 1.0767 | 6600 | 3.3347 | 12726352 |
| 3.6418 | 1.1093 | 6800 | 3.3414 | 13105200 |
| 3.6926 | 1.1419 | 7000 | 3.3382 | 13483648 |
| 2.6131 | 1.1746 | 7200 | 3.3405 | 13862816 |
| 3.8571 | 1.2072 | 7400 | 3.3480 | 14252288 |
| 3.1471 | 1.2398 | 7600 | 3.3307 | 14638816 |
| 3.6328 | 1.2725 | 7800 | 3.3687 | 15024560 |
| 2.5802 | 1.3051 | 8000 | 3.3462 | 15412000 |
| 3.9663 | 1.3377 | 8200 | 3.3536 | 15789456 |
| 3.242 | 1.3703 | 8400 | 3.3473 | 16173616 |
| 2.9885 | 1.4030 | 8600 | 3.3319 | 16558464 |
| 3.2899 | 1.4356 | 8800 | 3.3373 | 16945488 |
| 3.8357 | 1.4682 | 9000 | 3.3396 | 17338800 |
| 2.511 | 1.5009 | 9200 | 3.3474 | 17729104 |
| 3.2625 | 1.5335 | 9400 | 3.3420 | 18107328 |
| 3.4682 | 1.5661 | 9600 | 3.3343 | 18497776 |
| 3.3546 | 1.5987 | 9800 | 3.3381 | 18881008 |
| 3.7871 | 1.6314 | 10000 | 3.3457 | 19266960 |
| 3.1788 | 1.6640 | 10200 | 3.3292 | 19650480 |
| 3.2014 | 1.6966 | 10400 | 3.3497 | 20041120 |
| 3.1338 | 1.7293 | 10600 | 3.3413 | 20421120 |
| 4.135 | 1.7619 | 10800 | 3.3325 | 20808496 |
| 3.0333 | 1.7945 | 11000 | 3.3395 | 21195024 |
| 2.8654 | 1.8271 | 11200 | 3.3553 | 21570368 |
| 2.9058 | 1.8598 | 11400 | 3.3378 | 21950896 |
| 3.2689 | 1.8924 | 11600 | 3.3470 | 22333376 |
| 2.5088 | 1.9250 | 11800 | 3.3491 | 22714512 |
| 3.3349 | 1.9577 | 12000 | 3.3541 | 23099888 |
| 2.5775 | 1.9903 | 12200 | 3.3717 | 23482400 |
| 3.0117 | 2.0228 | 12400 | 3.3611 | 23860160 |
| 3.3181 | 2.0555 | 12600 | 3.3665 | 24249008 |
| 3.4065 | 2.0881 | 12800 | 3.3510 | 24639552 |
| 3.3034 | 2.1207 | 13000 | 3.3573 | 25026880 |
| 2.4809 | 2.1534 | 13200 | 3.3575 | 25410448 |
| 2.7989 | 2.1860 | 13400 | 3.3628 | 25785744 |
| 3.3774 | 2.2186 | 13600 | 3.3717 | 26163104 |
| 3.7661 | 2.2512 | 13800 | 3.3681 | 26546240 |
| 3.3281 | 2.2839 | 14000 | 3.3671 | 26923408 |
| 3.3407 | 2.3165 | 14200 | 3.3710 | 27309344 |
| 3.6008 | 2.3491 | 14400 | 3.3624 | 27698752 |
| 3.2149 | 2.3818 | 14600 | 3.3701 | 28082208 |
| 3.1563 | 2.4144 | 14800 | 3.3692 | 28468576 |
| 3.712 | 2.4470 | 15000 | 3.3657 | 28856272 |
| 3.1448 | 2.4796 | 15200 | 3.3596 | 29234704 |
| 2.6872 | 2.5123 | 15400 | 3.3429 | 29617728 |
| 4.1079 | 2.5449 | 15600 | 3.3574 | 30004032 |
| 3.3531 | 2.5775 | 15800 | 3.3597 | 30386752 |
| 3.3259 | 2.6102 | 16000 | 3.3508 | 30774224 |
| 3.0081 | 2.6428 | 16200 | 3.3413 | 31164304 |
| 3.8045 | 2.6754 | 16400 | 3.3482 | 31548832 |
| 2.5507 | 2.7081 | 16600 | 3.3499 | 31943568 |
| 3.342 | 2.7407 | 16800 | 3.3370 | 32327088 |
| 3.1806 | 2.7733 | 17000 | 3.3335 | 32713728 |
| 2.9748 | 2.8059 | 17200 | 3.3408 | 33093744 |
| 3.2799 | 2.8386 | 17400 | 3.3411 | 33484336 |
| 2.7973 | 2.8712 | 17600 | 3.3399 | 33875072 |
| 3.497 | 2.9038 | 17800 | 3.3334 | 34264832 |
| 3.1674 | 2.9365 | 18000 | 3.3369 | 34652800 |
| 3.3584 | 2.9691 | 18200 | 3.3395 | 35036144 |
| 3.1948 | 3.0016 | 18400 | 3.3313 | 35410304 |
| 3.6125 | 3.0343 | 18600 | 3.3329 | 35808688 |
| 2.9242 | 3.0669 | 18800 | 3.3340 | 36200720 |
| 3.0657 | 3.0995 | 19000 | 3.3365 | 36580112 |
| 3.4129 | 3.1321 | 19200 | 3.3343 | 36961872 |
| 2.4262 | 3.1648 | 19400 | 3.3348 | 37345136 |
| 3.3369 | 3.1974 | 19600 | 3.3297 | 37732992 |
| 3.0873 | 3.2300 | 19800 | 3.3333 | 38118784 |
| 2.6554 | 3.2627 | 20000 | 3.3385 | 38503392 |
| 3.5597 | 3.2953 | 20200 | 3.3338 | 38885696 |
| 3.5319 | 3.3279 | 20400 | 3.3332 | 39270320 |
| 2.9296 | 3.3606 | 20600 | 3.3318 | 39665472 |
| 3.0795 | 3.3932 | 20800 | 3.3292 | 40049680 |
| 3.4726 | 3.4258 | 21000 | 3.3370 | 40436560 |
| 3.3311 | 3.4584 | 21200 | 3.3357 | 40820704 |
| 3.94 | 3.4911 | 21400 | 3.3415 | 41202080 |
| 2.414 | 3.5237 | 21600 | 3.3442 | 41588560 |
| 3.3365 | 3.5563 | 21800 | 3.3424 | 41977888 |
| 3.1493 | 3.5890 | 22000 | 3.3374 | 42361392 |
| 3.7413 | 3.6216 | 22200 | 3.3442 | 42746416 |
| 2.7835 | 3.6542 | 22400 | 3.3415 | 43126400 |
| 3.1416 | 3.6868 | 22600 | 3.3417 | 43513248 |
| 4.1077 | 3.7195 | 22800 | 3.3385 | 43896720 |
| 3.1847 | 3.7521 | 23000 | 3.3420 | 44278640 |
| 3.1111 | 3.7847 | 23200 | 3.3425 | 44666464 |
| 2.8387 | 3.8174 | 23400 | 3.3423 | 45047360 |
| 3.5614 | 3.8500 | 23600 | 3.3412 | 45426496 |
| 2.8613 | 3.8826 | 23800 | 3.3445 | 45813536 |
| 3.2588 | 3.9152 | 24000 | 3.3387 | 46192656 |
| 2.5633 | 3.9479 | 24200 | 3.3436 | 46576928 |
| 4.2529 | 3.9805 | 24400 | 3.3428 | 46965120 |
| 3.0276 | 4.0131 | 24600 | 3.3475 | 47347920 |
| 3.1426 | 4.0457 | 24800 | 3.3347 | 47741360 |
| 2.9586 | 4.0783 | 25000 | 3.3380 | 48131120 |
| 3.51 | 4.1109 | 25200 | 3.3403 | 48513200 |
| 3.5387 | 4.1436 | 25400 | 3.3383 | 48894496 |
| 2.7833 | 4.1762 | 25600 | 3.3424 | 49280736 |
| 3.2627 | 4.2088 | 25800 | 3.3448 | 49662304 |
| 3.2861 | 4.2415 | 26000 | 3.3386 | 50049312 |
| 2.8297 | 4.2741 | 26200 | 3.3398 | 50433008 |
| 3.7263 | 4.3067 | 26400 | 3.3382 | 50815824 |
| 3.2323 | 4.3393 | 26600 | 3.3389 | 51200224 |
| 3.079 | 4.3720 | 26800 | 3.3459 | 51585680 |
| 3.7935 | 4.4046 | 27000 | 3.3449 | 51969184 |
| 2.6226 | 4.4372 | 27200 | 3.3459 | 52363216 |
| 3.8058 | 4.4699 | 27400 | 3.3450 | 52737552 |
| 3.5748 | 4.5025 | 27600 | 3.3485 | 53112128 |
| 3.168 | 4.5351 | 27800 | 3.3442 | 53489200 |
| 3.2045 | 4.5677 | 28000 | 3.3454 | 53870832 |
| 3.3387 | 4.6004 | 28200 | 3.3447 | 54260848 |
| 3.2715 | 4.6330 | 28400 | 3.3464 | 54647840 |
| 2.7614 | 4.6656 | 28600 | 3.3428 | 55035376 |
| 2.8811 | 4.6983 | 28800 | 3.3451 | 55421296 |
| 3.2825 | 4.7309 | 29000 | 3.3448 | 55807776 |
| 3.315 | 4.7635 | 29200 | 3.3454 | 56188960 |
| 3.6957 | 4.7961 | 29400 | 3.3451 | 56576864 |
| 3.9208 | 4.8288 | 29600 | 3.3394 | 56959888 |
| 3.2552 | 4.8614 | 29800 | 3.3442 | 57347776 |
| 3.0983 | 4.8940 | 30000 | 3.3445 | 57727072 |
| 3.3484 | 4.9267 | 30200 | 3.3448 | 58119904 |
| 3.6294 | 4.9593 | 30400 | 3.3468 | 58503776 |
| 3.1149 | 4.9919 | 30600 | 3.3455 | 58892528 |
| 2.9508 | 5.0245 | 30800 | 3.3448 | 59278112 |
| 3.7158 | 5.0571 | 31000 | 3.3395 | 59663264 |
| 2.9366 | 5.0897 | 31200 | 3.3395 | 60047056 |
| 2.7281 | 5.1224 | 31400 | 3.3434 | 60433680 |
| 3.9076 | 5.1550 | 31600 | 3.3407 | 60809376 |
| 3.2993 | 5.1876 | 31800 | 3.3402 | 61186608 |
| 2.8036 | 5.2202 | 32000 | 3.3395 | 61567504 |
| 3.2689 | 5.2529 | 32200 | 3.3391 | 61958976 |
| 2.7509 | 5.2855 | 32400 | 3.3434 | 62346176 |
| 4.7217 | 5.3181 | 32600 | 3.3395 | 62734064 |
| 2.6531 | 5.3508 | 32800 | 3.3395 | 63124752 |
| 2.744 | 5.3834 | 33000 | 3.3407 | 63517792 |
| 2.8734 | 5.4160 | 33200 | 3.3407 | 63894896 |
| 3.7619 | 5.4486 | 33400 | 3.3407 | 64277584 |
| 3.6114 | 5.4813 | 33600 | 3.3402 | 64661856 |
| 3.2589 | 5.5139 | 33800 | 3.3402 | 65043136 |
| 3.401 | 5.5465 | 34000 | 3.3402 | 65439360 |
| 3.4793 | 5.5792 | 34200 | 3.3402 | 65819600 |
| 2.8978 | 5.6118 | 34400 | 3.3402 | 66199376 |
| 2.7495 | 5.6444 | 34600 | 3.3402 | 66583936 |
| 3.9339 | 5.6771 | 34800 | 3.3402 | 66968960 |
| 2.2472 | 5.7097 | 35000 | 3.3402 | 67361344 |
| 3.9815 | 5.7423 | 35200 | 3.3402 | 67746288 |
| 4.0091 | 5.7749 | 35400 | 3.3402 | 68131952 |
| 3.0225 | 5.8076 | 35600 | 3.3402 | 68514656 |
| 3.0446 | 5.8402 | 35800 | 3.3402 | 68904544 |
| 3.671 | 5.8728 | 36000 | 3.3402 | 69286320 |
| 3.2561 | 5.9055 | 36200 | 3.3402 | 69676640 |
| 3.7261 | 5.9381 | 36400 | 3.3402 | 70057024 |
| 3.8545 | 5.9707 | 36600 | 3.3402 | 70432848 |
| 2.8251 | 6.0033 | 36800 | 3.3402 | 70819440 |
| 3.6945 | 6.0359 | 37000 | 3.3402 | 71203008 |
| 3.9083 | 6.0685 | 37200 | 3.3402 | 71588672 |
| 3.5624 | 6.1012 | 37400 | 3.3402 | 71972608 |
| 3.4654 | 6.1338 | 37600 | 3.3402 | 72358032 |
| 3.7411 | 6.1664 | 37800 | 3.3402 | 72749840 |
| 3.2175 | 6.1990 | 38000 | 3.3402 | 73128448 |
| 3.3323 | 6.2317 | 38200 | 3.3402 | 73518048 |
| 3.9256 | 6.2643 | 38400 | 3.3402 | 73911328 |
| 3.2926 | 6.2969 | 38600 | 3.3402 | 74293168 |
| 3.0041 | 6.3296 | 38800 | 3.3402 | 74668864 |
| 3.8996 | 6.3622 | 39000 | 3.3402 | 75058640 |
| 3.4163 | 6.3948 | 39200 | 3.3402 | 75440784 |
| 3.9653 | 6.4274 | 39400 | 3.3402 | 75822528 |
| 3.666 | 6.4601 | 39600 | 3.3402 | 76198368 |
| 2.5767 | 6.4927 | 39800 | 3.3402 | 76581104 |
| 3.2328 | 6.5253 | 40000 | 3.3402 | 76963024 |

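One pattern worth noting in the table: from about step 33,600 onward the validation loss is pinned at exactly 3.3402, which is consistent with the cosine schedule having decayed the learning rate to near zero late in the run. A quick back-of-the-envelope check (assuming zero warmup, which the card does not specify):

```python
import math

def cosine_lr(step: int, base_lr: float = 5e-5, total_steps: int = 40_000) -> float:
    """Cosine decay to zero, matching a no-warmup cosine schedule."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

for step in (200, 20_000, 32_000, 38_000):
    print(f"step {step:>6}: lr = {cosine_lr(step):.2e}")
# step 32000 gives ~4.8e-06 and step 38000 ~3.1e-07, so late updates are tiny.
```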
### Framework versions

- PEFT 0.15.2.dev0
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:9a8c1ff5f25287d38aa94c2219ac1ddfde3caaa792ce61a80cdce437acd76988
+ oid sha256:19e412e7a39082ad5455ce3f94c5a119aa9eede660b17b58ea2514675da94fa6
  size 257072