Safetensors
Frywind committed on
Commit c273ab8 · verified · 1 parent: 5b8680f

Upload folder using huggingface_hub
Files changed (34)
  1. .gitattributes +3 -0
  2. qwen3_4b_beauty/added_tokens.json +1308 -0
  3. qwen3_4b_beauty/config.json +30 -0
  4. qwen3_4b_beauty/generation_config.json +13 -0
  5. qwen3_4b_beauty/merges.txt +0 -0
  6. qwen3_4b_beauty/model-00001-of-00002.safetensors +3 -0
  7. qwen3_4b_beauty/model-00002-of-00002.safetensors +3 -0
  8. qwen3_4b_beauty/model.safetensors.index.json +406 -0
  9. qwen3_4b_beauty/special_tokens_map.json +1298 -0
  10. qwen3_4b_beauty/tokenizer.json +3 -0
  11. qwen3_4b_beauty/tokenizer_config.json +0 -0
  12. qwen3_4b_beauty/vocab.json +0 -0
  13. qwen3_4b_instruments/added_tokens.json +1308 -0
  14. qwen3_4b_instruments/config.json +30 -0
  15. qwen3_4b_instruments/generation_config.json +13 -0
  16. qwen3_4b_instruments/merges.txt +0 -0
  17. qwen3_4b_instruments/model-00001-of-00002.safetensors +3 -0
  18. qwen3_4b_instruments/model-00002-of-00002.safetensors +3 -0
  19. qwen3_4b_instruments/model.safetensors.index.json +406 -0
  20. qwen3_4b_instruments/special_tokens_map.json +1298 -0
  21. qwen3_4b_instruments/tokenizer.json +3 -0
  22. qwen3_4b_instruments/tokenizer_config.json +0 -0
  23. qwen3_4b_instruments/vocab.json +0 -0
  24. qwen3_4b_sports/added_tokens.json +1308 -0
  25. qwen3_4b_sports/config.json +30 -0
  26. qwen3_4b_sports/generation_config.json +13 -0
  27. qwen3_4b_sports/merges.txt +0 -0
  28. qwen3_4b_sports/model-00001-of-00002.safetensors +3 -0
  29. qwen3_4b_sports/model-00002-of-00002.safetensors +3 -0
  30. qwen3_4b_sports/model.safetensors.index.json +406 -0
  31. qwen3_4b_sports/special_tokens_map.json +1298 -0
  32. qwen3_4b_sports/tokenizer.json +3 -0
  33. qwen3_4b_sports/tokenizer_config.json +0 -0
  34. qwen3_4b_sports/vocab.json +0 -0
.gitattributes CHANGED
@@ -33,3 +33,6 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+ qwen3_4b_beauty/tokenizer.json filter=lfs diff=lfs merge=lfs -text
+ qwen3_4b_instruments/tokenizer.json filter=lfs diff=lfs merge=lfs -text
+ qwen3_4b_sports/tokenizer.json filter=lfs diff=lfs merge=lfs -text
qwen3_4b_beauty/added_tokens.json ADDED
@@ -0,0 +1,1308 @@
+ {
+ "</think>": 151668,
+ "</tool_call>": 151658,
+ "</tool_response>": 151666,
+ "<think>": 151667,
+ "<tool_call>": 151657,
+ "<tool_response>": 151665,
+ "<|a_100|>": 151768,
+ "<|a_101|>": 151769,
+ "<|a_102|>": 151770,
+ "<|a_103|>": 151771,
+ "<|a_104|>": 151772,
+ "<|a_105|>": 151773,
+ "<|a_106|>": 151774,
+ "<|a_107|>": 151775,
+ "<|a_108|>": 151776,
+ "<|a_109|>": 151777,
+ "<|a_10|>": 151678,
+ "<|a_110|>": 151778,
+ "<|a_111|>": 151779,
+ "<|a_112|>": 151780,
+ "<|a_113|>": 151781,
+ "<|a_114|>": 151782,
+ "<|a_115|>": 151783,
+ "<|a_116|>": 151784,
+ "<|a_117|>": 151785,
+ "<|a_118|>": 151786,
+ "<|a_119|>": 151787,
+ "<|a_11|>": 151679,
+ "<|a_120|>": 151788,
+ "<|a_121|>": 151789,
+ "<|a_122|>": 151790,
+ "<|a_123|>": 151791,
+ "<|a_124|>": 151792,
+ "<|a_125|>": 151793,
+ "<|a_126|>": 151794,
+ "<|a_127|>": 151795,
+ "<|a_128|>": 151796,
+ "<|a_129|>": 151797,
+ "<|a_12|>": 151680,
+ "<|a_130|>": 151798,
+ "<|a_131|>": 151799,
+ "<|a_132|>": 151800,
+ "<|a_133|>": 151801,
+ "<|a_134|>": 151802,
+ "<|a_135|>": 151803,
+ "<|a_136|>": 151804,
+ "<|a_137|>": 151805,
+ "<|a_138|>": 151806,
+ "<|a_139|>": 151807,
+ "<|a_13|>": 151681,
+ "<|a_140|>": 151808,
+ "<|a_141|>": 151809,
+ "<|a_142|>": 151810,
+ "<|a_143|>": 151811,
+ "<|a_144|>": 151812,
+ "<|a_145|>": 151813,
+ "<|a_146|>": 151814,
+ "<|a_147|>": 151815,
+ "<|a_148|>": 151816,
+ "<|a_149|>": 151817,
+ "<|a_14|>": 151682,
+ "<|a_150|>": 151818,
+ "<|a_151|>": 151819,
+ "<|a_152|>": 151820,
+ "<|a_153|>": 151821,
+ "<|a_154|>": 151822,
+ "<|a_155|>": 151823,
+ "<|a_156|>": 151824,
+ "<|a_157|>": 151825,
+ "<|a_158|>": 151826,
+ "<|a_159|>": 151827,
+ "<|a_15|>": 151683,
+ "<|a_160|>": 151828,
+ "<|a_161|>": 151829,
+ "<|a_162|>": 151830,
+ "<|a_163|>": 151831,
+ "<|a_164|>": 151832,
+ "<|a_165|>": 151833,
+ "<|a_166|>": 151834,
+ "<|a_167|>": 151835,
+ "<|a_168|>": 151836,
+ "<|a_169|>": 151837,
+ "<|a_16|>": 151684,
+ "<|a_170|>": 151838,
+ "<|a_171|>": 151839,
+ "<|a_172|>": 151840,
+ "<|a_173|>": 151841,
+ "<|a_174|>": 151842,
+ "<|a_175|>": 151843,
+ "<|a_176|>": 151844,
+ "<|a_177|>": 151845,
+ "<|a_178|>": 151846,
+ "<|a_179|>": 151847,
+ "<|a_17|>": 151685,
+ "<|a_180|>": 151848,
+ "<|a_181|>": 151849,
+ "<|a_182|>": 151850,
+ "<|a_183|>": 151851,
+ "<|a_184|>": 151852,
+ "<|a_185|>": 151853,
+ "<|a_186|>": 151854,
+ "<|a_187|>": 151855,
+ "<|a_188|>": 151856,
+ "<|a_189|>": 151857,
+ "<|a_18|>": 151686,
+ "<|a_190|>": 151858,
+ "<|a_191|>": 151859,
+ "<|a_192|>": 151860,
+ "<|a_193|>": 151861,
+ "<|a_194|>": 151862,
+ "<|a_195|>": 151863,
+ "<|a_196|>": 151864,
+ "<|a_197|>": 151865,
+ "<|a_198|>": 151866,
+ "<|a_199|>": 151867,
+ "<|a_19|>": 151687,
+ "<|a_1|>": 151669,
+ "<|a_200|>": 151868,
+ "<|a_201|>": 151869,
+ "<|a_202|>": 151870,
+ "<|a_203|>": 151871,
+ "<|a_204|>": 151872,
+ "<|a_205|>": 151873,
+ "<|a_206|>": 151874,
+ "<|a_207|>": 151875,
+ "<|a_208|>": 151876,
+ "<|a_209|>": 151877,
+ "<|a_20|>": 151688,
+ "<|a_210|>": 151878,
+ "<|a_211|>": 151879,
+ "<|a_212|>": 151880,
+ "<|a_213|>": 151881,
+ "<|a_214|>": 151882,
+ "<|a_215|>": 151883,
+ "<|a_216|>": 151884,
+ "<|a_217|>": 151885,
+ "<|a_218|>": 151886,
+ "<|a_219|>": 151887,
+ "<|a_21|>": 151689,
+ "<|a_220|>": 151888,
+ "<|a_221|>": 151889,
+ "<|a_222|>": 151890,
+ "<|a_223|>": 151891,
+ "<|a_224|>": 151892,
+ "<|a_225|>": 151893,
+ "<|a_226|>": 151894,
+ "<|a_227|>": 151895,
+ "<|a_228|>": 151896,
+ "<|a_229|>": 151897,
+ "<|a_22|>": 151690,
+ "<|a_230|>": 151898,
+ "<|a_231|>": 151899,
+ "<|a_232|>": 151900,
+ "<|a_233|>": 151901,
+ "<|a_234|>": 151902,
+ "<|a_235|>": 151903,
+ "<|a_236|>": 151904,
+ "<|a_237|>": 151905,
+ "<|a_238|>": 151906,
+ "<|a_239|>": 151907,
+ "<|a_23|>": 151691,
+ "<|a_240|>": 151908,
+ "<|a_241|>": 151909,
+ "<|a_242|>": 151910,
+ "<|a_243|>": 151911,
+ "<|a_244|>": 151912,
+ "<|a_245|>": 151913,
+ "<|a_246|>": 151914,
+ "<|a_247|>": 151915,
+ "<|a_248|>": 151916,
+ "<|a_249|>": 151917,
+ "<|a_24|>": 151692,
+ "<|a_250|>": 151918,
+ "<|a_251|>": 151919,
+ "<|a_252|>": 151920,
+ "<|a_253|>": 151921,
+ "<|a_254|>": 151922,
+ "<|a_255|>": 151923,
+ "<|a_256|>": 151924,
+ "<|a_25|>": 151693,
+ "<|a_26|>": 151694,
+ "<|a_27|>": 151695,
+ "<|a_28|>": 151696,
+ "<|a_29|>": 151697,
+ "<|a_2|>": 151670,
+ "<|a_30|>": 151698,
+ "<|a_31|>": 151699,
+ "<|a_32|>": 151700,
+ "<|a_33|>": 151701,
+ "<|a_34|>": 151702,
+ "<|a_35|>": 151703,
+ "<|a_36|>": 151704,
+ "<|a_37|>": 151705,
+ "<|a_38|>": 151706,
+ "<|a_39|>": 151707,
+ "<|a_3|>": 151671,
+ "<|a_40|>": 151708,
+ "<|a_41|>": 151709,
+ "<|a_42|>": 151710,
+ "<|a_43|>": 151711,
+ "<|a_44|>": 151712,
+ "<|a_45|>": 151713,
+ "<|a_46|>": 151714,
+ "<|a_47|>": 151715,
+ "<|a_48|>": 151716,
+ "<|a_49|>": 151717,
+ "<|a_4|>": 151672,
+ "<|a_50|>": 151718,
+ "<|a_51|>": 151719,
+ "<|a_52|>": 151720,
+ "<|a_53|>": 151721,
+ "<|a_54|>": 151722,
+ "<|a_55|>": 151723,
+ "<|a_56|>": 151724,
+ "<|a_57|>": 151725,
+ "<|a_58|>": 151726,
+ "<|a_59|>": 151727,
+ "<|a_5|>": 151673,
+ "<|a_60|>": 151728,
+ "<|a_61|>": 151729,
+ "<|a_62|>": 151730,
+ "<|a_63|>": 151731,
+ "<|a_64|>": 151732,
+ "<|a_65|>": 151733,
+ "<|a_66|>": 151734,
+ "<|a_67|>": 151735,
+ "<|a_68|>": 151736,
+ "<|a_69|>": 151737,
+ "<|a_6|>": 151674,
+ "<|a_70|>": 151738,
+ "<|a_71|>": 151739,
+ "<|a_72|>": 151740,
+ "<|a_73|>": 151741,
+ "<|a_74|>": 151742,
+ "<|a_75|>": 151743,
+ "<|a_76|>": 151744,
+ "<|a_77|>": 151745,
+ "<|a_78|>": 151746,
+ "<|a_79|>": 151747,
+ "<|a_7|>": 151675,
+ "<|a_80|>": 151748,
+ "<|a_81|>": 151749,
+ "<|a_82|>": 151750,
+ "<|a_83|>": 151751,
+ "<|a_84|>": 151752,
+ "<|a_85|>": 151753,
+ "<|a_86|>": 151754,
+ "<|a_87|>": 151755,
+ "<|a_88|>": 151756,
+ "<|a_89|>": 151757,
+ "<|a_8|>": 151676,
+ "<|a_90|>": 151758,
+ "<|a_91|>": 151759,
+ "<|a_92|>": 151760,
+ "<|a_93|>": 151761,
+ "<|a_94|>": 151762,
+ "<|a_95|>": 151763,
+ "<|a_96|>": 151764,
+ "<|a_97|>": 151765,
+ "<|a_98|>": 151766,
+ "<|a_99|>": 151767,
+ "<|a_9|>": 151677,
+ "<|b_100|>": 152024,
+ "<|b_101|>": 152025,
+ "<|b_102|>": 152026,
+ "<|b_103|>": 152027,
+ "<|b_104|>": 152028,
+ "<|b_105|>": 152029,
+ "<|b_106|>": 152030,
+ "<|b_107|>": 152031,
+ "<|b_108|>": 152032,
+ "<|b_109|>": 152033,
+ "<|b_10|>": 151934,
+ "<|b_110|>": 152034,
+ "<|b_111|>": 152035,
+ "<|b_112|>": 152036,
+ "<|b_113|>": 152037,
+ "<|b_114|>": 152038,
+ "<|b_115|>": 152039,
+ "<|b_116|>": 152040,
+ "<|b_117|>": 152041,
+ "<|b_118|>": 152042,
+ "<|b_119|>": 152043,
+ "<|b_11|>": 151935,
+ "<|b_120|>": 152044,
+ "<|b_121|>": 152045,
+ "<|b_122|>": 152046,
+ "<|b_123|>": 152047,
+ "<|b_124|>": 152048,
+ "<|b_125|>": 152049,
+ "<|b_126|>": 152050,
+ "<|b_127|>": 152051,
+ "<|b_128|>": 152052,
+ "<|b_129|>": 152053,
+ "<|b_12|>": 151936,
+ "<|b_130|>": 152054,
+ "<|b_131|>": 152055,
+ "<|b_132|>": 152056,
+ "<|b_133|>": 152057,
+ "<|b_134|>": 152058,
+ "<|b_135|>": 152059,
+ "<|b_136|>": 152060,
+ "<|b_137|>": 152061,
+ "<|b_138|>": 152062,
+ "<|b_139|>": 152063,
+ "<|b_13|>": 151937,
+ "<|b_140|>": 152064,
+ "<|b_141|>": 152065,
+ "<|b_142|>": 152066,
+ "<|b_143|>": 152067,
+ "<|b_144|>": 152068,
+ "<|b_145|>": 152069,
+ "<|b_146|>": 152070,
+ "<|b_147|>": 152071,
+ "<|b_148|>": 152072,
+ "<|b_149|>": 152073,
+ "<|b_14|>": 151938,
+ "<|b_150|>": 152074,
+ "<|b_151|>": 152075,
+ "<|b_152|>": 152076,
+ "<|b_153|>": 152077,
+ "<|b_154|>": 152078,
+ "<|b_155|>": 152079,
+ "<|b_156|>": 152080,
+ "<|b_157|>": 152081,
+ "<|b_158|>": 152082,
+ "<|b_159|>": 152083,
+ "<|b_15|>": 151939,
+ "<|b_160|>": 152084,
+ "<|b_161|>": 152085,
+ "<|b_162|>": 152086,
+ "<|b_163|>": 152087,
+ "<|b_164|>": 152088,
+ "<|b_165|>": 152089,
+ "<|b_166|>": 152090,
+ "<|b_167|>": 152091,
+ "<|b_168|>": 152092,
+ "<|b_169|>": 152093,
+ "<|b_16|>": 151940,
+ "<|b_170|>": 152094,
+ "<|b_171|>": 152095,
+ "<|b_172|>": 152096,
+ "<|b_173|>": 152097,
+ "<|b_174|>": 152098,
+ "<|b_175|>": 152099,
+ "<|b_176|>": 152100,
+ "<|b_177|>": 152101,
+ "<|b_178|>": 152102,
+ "<|b_179|>": 152103,
+ "<|b_17|>": 151941,
+ "<|b_180|>": 152104,
+ "<|b_181|>": 152105,
+ "<|b_182|>": 152106,
+ "<|b_183|>": 152107,
+ "<|b_184|>": 152108,
+ "<|b_185|>": 152109,
+ "<|b_186|>": 152110,
+ "<|b_187|>": 152111,
+ "<|b_188|>": 152112,
+ "<|b_189|>": 152113,
+ "<|b_18|>": 151942,
+ "<|b_190|>": 152114,
+ "<|b_191|>": 152115,
+ "<|b_192|>": 152116,
+ "<|b_193|>": 152117,
+ "<|b_194|>": 152118,
+ "<|b_195|>": 152119,
+ "<|b_196|>": 152120,
+ "<|b_197|>": 152121,
+ "<|b_198|>": 152122,
+ "<|b_199|>": 152123,
+ "<|b_19|>": 151943,
+ "<|b_1|>": 151925,
+ "<|b_200|>": 152124,
+ "<|b_201|>": 152125,
+ "<|b_202|>": 152126,
+ "<|b_203|>": 152127,
+ "<|b_204|>": 152128,
+ "<|b_205|>": 152129,
+ "<|b_206|>": 152130,
+ "<|b_207|>": 152131,
+ "<|b_208|>": 152132,
+ "<|b_209|>": 152133,
+ "<|b_20|>": 151944,
+ "<|b_210|>": 152134,
+ "<|b_211|>": 152135,
+ "<|b_212|>": 152136,
+ "<|b_213|>": 152137,
+ "<|b_214|>": 152138,
+ "<|b_215|>": 152139,
+ "<|b_216|>": 152140,
+ "<|b_217|>": 152141,
+ "<|b_218|>": 152142,
+ "<|b_219|>": 152143,
+ "<|b_21|>": 151945,
+ "<|b_220|>": 152144,
+ "<|b_221|>": 152145,
+ "<|b_222|>": 152146,
+ "<|b_223|>": 152147,
+ "<|b_224|>": 152148,
+ "<|b_225|>": 152149,
+ "<|b_226|>": 152150,
+ "<|b_227|>": 152151,
+ "<|b_228|>": 152152,
+ "<|b_229|>": 152153,
+ "<|b_22|>": 151946,
+ "<|b_230|>": 152154,
+ "<|b_231|>": 152155,
+ "<|b_232|>": 152156,
+ "<|b_233|>": 152157,
+ "<|b_234|>": 152158,
+ "<|b_235|>": 152159,
+ "<|b_236|>": 152160,
+ "<|b_237|>": 152161,
+ "<|b_238|>": 152162,
+ "<|b_239|>": 152163,
+ "<|b_23|>": 151947,
+ "<|b_240|>": 152164,
+ "<|b_241|>": 152165,
+ "<|b_242|>": 152166,
+ "<|b_243|>": 152167,
+ "<|b_244|>": 152168,
+ "<|b_245|>": 152169,
+ "<|b_246|>": 152170,
+ "<|b_247|>": 152171,
+ "<|b_248|>": 152172,
+ "<|b_249|>": 152173,
+ "<|b_24|>": 151948,
+ "<|b_250|>": 152174,
+ "<|b_251|>": 152175,
+ "<|b_252|>": 152176,
+ "<|b_253|>": 152177,
+ "<|b_254|>": 152178,
+ "<|b_255|>": 152179,
+ "<|b_256|>": 152180,
+ "<|b_25|>": 151949,
+ "<|b_26|>": 151950,
+ "<|b_27|>": 151951,
+ "<|b_28|>": 151952,
+ "<|b_29|>": 151953,
+ "<|b_2|>": 151926,
+ "<|b_30|>": 151954,
+ "<|b_31|>": 151955,
+ "<|b_32|>": 151956,
+ "<|b_33|>": 151957,
+ "<|b_34|>": 151958,
+ "<|b_35|>": 151959,
+ "<|b_36|>": 151960,
+ "<|b_37|>": 151961,
+ "<|b_38|>": 151962,
+ "<|b_39|>": 151963,
+ "<|b_3|>": 151927,
+ "<|b_40|>": 151964,
+ "<|b_41|>": 151965,
+ "<|b_42|>": 151966,
+ "<|b_43|>": 151967,
+ "<|b_44|>": 151968,
+ "<|b_45|>": 151969,
+ "<|b_46|>": 151970,
+ "<|b_47|>": 151971,
+ "<|b_48|>": 151972,
+ "<|b_49|>": 151973,
+ "<|b_4|>": 151928,
+ "<|b_50|>": 151974,
+ "<|b_51|>": 151975,
+ "<|b_52|>": 151976,
+ "<|b_53|>": 151977,
+ "<|b_54|>": 151978,
+ "<|b_55|>": 151979,
+ "<|b_56|>": 151980,
+ "<|b_57|>": 151981,
+ "<|b_58|>": 151982,
+ "<|b_59|>": 151983,
+ "<|b_5|>": 151929,
+ "<|b_60|>": 151984,
+ "<|b_61|>": 151985,
+ "<|b_62|>": 151986,
+ "<|b_63|>": 151987,
+ "<|b_64|>": 151988,
+ "<|b_65|>": 151989,
+ "<|b_66|>": 151990,
+ "<|b_67|>": 151991,
+ "<|b_68|>": 151992,
+ "<|b_69|>": 151993,
+ "<|b_6|>": 151930,
+ "<|b_70|>": 151994,
+ "<|b_71|>": 151995,
+ "<|b_72|>": 151996,
+ "<|b_73|>": 151997,
+ "<|b_74|>": 151998,
+ "<|b_75|>": 151999,
+ "<|b_76|>": 152000,
+ "<|b_77|>": 152001,
+ "<|b_78|>": 152002,
+ "<|b_79|>": 152003,
+ "<|b_7|>": 151931,
+ "<|b_80|>": 152004,
+ "<|b_81|>": 152005,
+ "<|b_82|>": 152006,
+ "<|b_83|>": 152007,
+ "<|b_84|>": 152008,
+ "<|b_85|>": 152009,
+ "<|b_86|>": 152010,
+ "<|b_87|>": 152011,
+ "<|b_88|>": 152012,
+ "<|b_89|>": 152013,
+ "<|b_8|>": 151932,
+ "<|b_90|>": 152014,
+ "<|b_91|>": 152015,
+ "<|b_92|>": 152016,
+ "<|b_93|>": 152017,
+ "<|b_94|>": 152018,
+ "<|b_95|>": 152019,
+ "<|b_96|>": 152020,
+ "<|b_97|>": 152021,
+ "<|b_98|>": 152022,
+ "<|b_99|>": 152023,
+ "<|b_9|>": 151933,
+ "<|box_end|>": 151649,
+ "<|box_start|>": 151648,
+ "<|c_100|>": 152280,
+ "<|c_101|>": 152281,
+ "<|c_102|>": 152282,
+ "<|c_103|>": 152283,
+ "<|c_104|>": 152284,
+ "<|c_105|>": 152285,
+ "<|c_106|>": 152286,
+ "<|c_107|>": 152287,
+ "<|c_108|>": 152288,
+ "<|c_109|>": 152289,
+ "<|c_10|>": 152190,
+ "<|c_110|>": 152290,
+ "<|c_111|>": 152291,
+ "<|c_112|>": 152292,
+ "<|c_113|>": 152293,
+ "<|c_114|>": 152294,
+ "<|c_115|>": 152295,
+ "<|c_116|>": 152296,
+ "<|c_117|>": 152297,
+ "<|c_118|>": 152298,
+ "<|c_119|>": 152299,
+ "<|c_11|>": 152191,
+ "<|c_120|>": 152300,
+ "<|c_121|>": 152301,
+ "<|c_122|>": 152302,
+ "<|c_123|>": 152303,
+ "<|c_124|>": 152304,
+ "<|c_125|>": 152305,
+ "<|c_126|>": 152306,
+ "<|c_127|>": 152307,
+ "<|c_128|>": 152308,
+ "<|c_129|>": 152309,
+ "<|c_12|>": 152192,
+ "<|c_130|>": 152310,
+ "<|c_131|>": 152311,
+ "<|c_132|>": 152312,
+ "<|c_133|>": 152313,
+ "<|c_134|>": 152314,
+ "<|c_135|>": 152315,
+ "<|c_136|>": 152316,
+ "<|c_137|>": 152317,
+ "<|c_138|>": 152318,
+ "<|c_139|>": 152319,
+ "<|c_13|>": 152193,
+ "<|c_140|>": 152320,
+ "<|c_141|>": 152321,
+ "<|c_142|>": 152322,
+ "<|c_143|>": 152323,
+ "<|c_144|>": 152324,
+ "<|c_145|>": 152325,
+ "<|c_146|>": 152326,
+ "<|c_147|>": 152327,
+ "<|c_148|>": 152328,
+ "<|c_149|>": 152329,
+ "<|c_14|>": 152194,
+ "<|c_150|>": 152330,
+ "<|c_151|>": 152331,
+ "<|c_152|>": 152332,
+ "<|c_153|>": 152333,
+ "<|c_154|>": 152334,
+ "<|c_155|>": 152335,
+ "<|c_156|>": 152336,
+ "<|c_157|>": 152337,
+ "<|c_158|>": 152338,
+ "<|c_159|>": 152339,
+ "<|c_15|>": 152195,
+ "<|c_160|>": 152340,
+ "<|c_161|>": 152341,
+ "<|c_162|>": 152342,
+ "<|c_163|>": 152343,
+ "<|c_164|>": 152344,
+ "<|c_165|>": 152345,
+ "<|c_166|>": 152346,
+ "<|c_167|>": 152347,
+ "<|c_168|>": 152348,
+ "<|c_169|>": 152349,
+ "<|c_16|>": 152196,
+ "<|c_170|>": 152350,
+ "<|c_171|>": 152351,
+ "<|c_172|>": 152352,
+ "<|c_173|>": 152353,
+ "<|c_174|>": 152354,
+ "<|c_175|>": 152355,
+ "<|c_176|>": 152356,
+ "<|c_177|>": 152357,
+ "<|c_178|>": 152358,
+ "<|c_179|>": 152359,
+ "<|c_17|>": 152197,
+ "<|c_180|>": 152360,
+ "<|c_181|>": 152361,
+ "<|c_182|>": 152362,
+ "<|c_183|>": 152363,
+ "<|c_184|>": 152364,
+ "<|c_185|>": 152365,
+ "<|c_186|>": 152366,
+ "<|c_187|>": 152367,
+ "<|c_188|>": 152368,
+ "<|c_189|>": 152369,
+ "<|c_18|>": 152198,
+ "<|c_190|>": 152370,
+ "<|c_191|>": 152371,
+ "<|c_192|>": 152372,
+ "<|c_193|>": 152373,
+ "<|c_194|>": 152374,
+ "<|c_195|>": 152375,
+ "<|c_196|>": 152376,
+ "<|c_197|>": 152377,
+ "<|c_198|>": 152378,
+ "<|c_199|>": 152379,
+ "<|c_19|>": 152199,
+ "<|c_1|>": 152181,
+ "<|c_200|>": 152380,
+ "<|c_201|>": 152381,
+ "<|c_202|>": 152382,
+ "<|c_203|>": 152383,
+ "<|c_204|>": 152384,
+ "<|c_205|>": 152385,
+ "<|c_206|>": 152386,
+ "<|c_207|>": 152387,
+ "<|c_208|>": 152388,
+ "<|c_209|>": 152389,
+ "<|c_20|>": 152200,
+ "<|c_210|>": 152390,
+ "<|c_211|>": 152391,
+ "<|c_212|>": 152392,
+ "<|c_213|>": 152393,
+ "<|c_214|>": 152394,
+ "<|c_215|>": 152395,
+ "<|c_216|>": 152396,
+ "<|c_217|>": 152397,
+ "<|c_218|>": 152398,
+ "<|c_219|>": 152399,
+ "<|c_21|>": 152201,
+ "<|c_220|>": 152400,
+ "<|c_221|>": 152401,
+ "<|c_222|>": 152402,
+ "<|c_223|>": 152403,
+ "<|c_224|>": 152404,
+ "<|c_225|>": 152405,
+ "<|c_226|>": 152406,
+ "<|c_227|>": 152407,
+ "<|c_228|>": 152408,
+ "<|c_229|>": 152409,
+ "<|c_22|>": 152202,
+ "<|c_230|>": 152410,
+ "<|c_231|>": 152411,
+ "<|c_232|>": 152412,
+ "<|c_233|>": 152413,
+ "<|c_234|>": 152414,
+ "<|c_235|>": 152415,
+ "<|c_236|>": 152416,
+ "<|c_237|>": 152417,
+ "<|c_238|>": 152418,
+ "<|c_239|>": 152419,
+ "<|c_23|>": 152203,
+ "<|c_240|>": 152420,
+ "<|c_241|>": 152421,
+ "<|c_242|>": 152422,
+ "<|c_243|>": 152423,
+ "<|c_244|>": 152424,
+ "<|c_245|>": 152425,
+ "<|c_246|>": 152426,
+ "<|c_247|>": 152427,
+ "<|c_248|>": 152428,
+ "<|c_249|>": 152429,
+ "<|c_24|>": 152204,
+ "<|c_250|>": 152430,
+ "<|c_251|>": 152431,
+ "<|c_252|>": 152432,
+ "<|c_253|>": 152433,
+ "<|c_254|>": 152434,
+ "<|c_255|>": 152435,
+ "<|c_256|>": 152436,
+ "<|c_25|>": 152205,
+ "<|c_26|>": 152206,
+ "<|c_27|>": 152207,
+ "<|c_28|>": 152208,
+ "<|c_29|>": 152209,
+ "<|c_2|>": 152182,
+ "<|c_30|>": 152210,
+ "<|c_31|>": 152211,
+ "<|c_32|>": 152212,
+ "<|c_33|>": 152213,
+ "<|c_34|>": 152214,
+ "<|c_35|>": 152215,
+ "<|c_36|>": 152216,
+ "<|c_37|>": 152217,
+ "<|c_38|>": 152218,
+ "<|c_39|>": 152219,
+ "<|c_3|>": 152183,
+ "<|c_40|>": 152220,
+ "<|c_41|>": 152221,
+ "<|c_42|>": 152222,
+ "<|c_43|>": 152223,
+ "<|c_44|>": 152224,
+ "<|c_45|>": 152225,
+ "<|c_46|>": 152226,
+ "<|c_47|>": 152227,
+ "<|c_48|>": 152228,
+ "<|c_49|>": 152229,
+ "<|c_4|>": 152184,
+ "<|c_50|>": 152230,
+ "<|c_51|>": 152231,
+ "<|c_52|>": 152232,
+ "<|c_53|>": 152233,
+ "<|c_54|>": 152234,
+ "<|c_55|>": 152235,
+ "<|c_56|>": 152236,
+ "<|c_57|>": 152237,
+ "<|c_58|>": 152238,
+ "<|c_59|>": 152239,
+ "<|c_5|>": 152185,
+ "<|c_60|>": 152240,
+ "<|c_61|>": 152241,
+ "<|c_62|>": 152242,
+ "<|c_63|>": 152243,
+ "<|c_64|>": 152244,
+ "<|c_65|>": 152245,
+ "<|c_66|>": 152246,
+ "<|c_67|>": 152247,
+ "<|c_68|>": 152248,
+ "<|c_69|>": 152249,
+ "<|c_6|>": 152186,
+ "<|c_70|>": 152250,
+ "<|c_71|>": 152251,
+ "<|c_72|>": 152252,
+ "<|c_73|>": 152253,
+ "<|c_74|>": 152254,
+ "<|c_75|>": 152255,
+ "<|c_76|>": 152256,
+ "<|c_77|>": 152257,
+ "<|c_78|>": 152258,
+ "<|c_79|>": 152259,
+ "<|c_7|>": 152187,
+ "<|c_80|>": 152260,
+ "<|c_81|>": 152261,
+ "<|c_82|>": 152262,
+ "<|c_83|>": 152263,
+ "<|c_84|>": 152264,
+ "<|c_85|>": 152265,
+ "<|c_86|>": 152266,
+ "<|c_87|>": 152267,
+ "<|c_88|>": 152268,
+ "<|c_89|>": 152269,
+ "<|c_8|>": 152188,
+ "<|c_90|>": 152270,
+ "<|c_91|>": 152271,
+ "<|c_92|>": 152272,
+ "<|c_93|>": 152273,
+ "<|c_94|>": 152274,
+ "<|c_95|>": 152275,
+ "<|c_96|>": 152276,
+ "<|c_97|>": 152277,
+ "<|c_98|>": 152278,
+ "<|c_99|>": 152279,
+ "<|c_9|>": 152189,
+ "<|d_100|>": 152536,
+ "<|d_101|>": 152537,
+ "<|d_102|>": 152538,
+ "<|d_103|>": 152539,
+ "<|d_104|>": 152540,
+ "<|d_105|>": 152541,
+ "<|d_106|>": 152542,
+ "<|d_107|>": 152543,
+ "<|d_108|>": 152544,
+ "<|d_109|>": 152545,
+ "<|d_10|>": 152446,
+ "<|d_110|>": 152546,
+ "<|d_111|>": 152547,
+ "<|d_112|>": 152548,
+ "<|d_113|>": 152549,
+ "<|d_114|>": 152550,
+ "<|d_115|>": 152551,
+ "<|d_116|>": 152552,
+ "<|d_117|>": 152553,
+ "<|d_118|>": 152554,
+ "<|d_119|>": 152555,
+ "<|d_11|>": 152447,
+ "<|d_120|>": 152556,
+ "<|d_121|>": 152557,
+ "<|d_122|>": 152558,
+ "<|d_123|>": 152559,
+ "<|d_124|>": 152560,
+ "<|d_125|>": 152561,
+ "<|d_126|>": 152562,
+ "<|d_127|>": 152563,
+ "<|d_128|>": 152564,
+ "<|d_129|>": 152565,
+ "<|d_12|>": 152448,
+ "<|d_130|>": 152566,
+ "<|d_131|>": 152567,
+ "<|d_132|>": 152568,
+ "<|d_133|>": 152569,
+ "<|d_134|>": 152570,
+ "<|d_135|>": 152571,
+ "<|d_136|>": 152572,
+ "<|d_137|>": 152573,
+ "<|d_138|>": 152574,
+ "<|d_139|>": 152575,
+ "<|d_13|>": 152449,
+ "<|d_140|>": 152576,
+ "<|d_141|>": 152577,
+ "<|d_142|>": 152578,
+ "<|d_143|>": 152579,
+ "<|d_144|>": 152580,
+ "<|d_145|>": 152581,
+ "<|d_146|>": 152582,
+ "<|d_147|>": 152583,
+ "<|d_148|>": 152584,
+ "<|d_149|>": 152585,
+ "<|d_14|>": 152450,
+ "<|d_150|>": 152586,
+ "<|d_151|>": 152587,
+ "<|d_152|>": 152588,
+ "<|d_153|>": 152589,
+ "<|d_154|>": 152590,
+ "<|d_155|>": 152591,
+ "<|d_156|>": 152592,
+ "<|d_157|>": 152593,
+ "<|d_158|>": 152594,
+ "<|d_159|>": 152595,
+ "<|d_15|>": 152451,
+ "<|d_160|>": 152596,
+ "<|d_161|>": 152597,
+ "<|d_162|>": 152598,
+ "<|d_163|>": 152599,
+ "<|d_164|>": 152600,
+ "<|d_165|>": 152601,
+ "<|d_166|>": 152602,
+ "<|d_167|>": 152603,
+ "<|d_168|>": 152604,
+ "<|d_169|>": 152605,
+ "<|d_16|>": 152452,
+ "<|d_170|>": 152606,
+ "<|d_171|>": 152607,
+ "<|d_172|>": 152608,
+ "<|d_173|>": 152609,
+ "<|d_174|>": 152610,
+ "<|d_175|>": 152611,
+ "<|d_176|>": 152612,
+ "<|d_177|>": 152613,
+ "<|d_178|>": 152614,
+ "<|d_179|>": 152615,
+ "<|d_17|>": 152453,
+ "<|d_180|>": 152616,
+ "<|d_181|>": 152617,
+ "<|d_182|>": 152618,
+ "<|d_183|>": 152619,
+ "<|d_184|>": 152620,
+ "<|d_185|>": 152621,
+ "<|d_186|>": 152622,
+ "<|d_187|>": 152623,
+ "<|d_188|>": 152624,
+ "<|d_189|>": 152625,
+ "<|d_18|>": 152454,
+ "<|d_190|>": 152626,
+ "<|d_191|>": 152627,
+ "<|d_192|>": 152628,
+ "<|d_193|>": 152629,
+ "<|d_194|>": 152630,
+ "<|d_195|>": 152631,
+ "<|d_196|>": 152632,
+ "<|d_197|>": 152633,
+ "<|d_198|>": 152634,
+ "<|d_199|>": 152635,
+ "<|d_19|>": 152455,
+ "<|d_1|>": 152437,
+ "<|d_200|>": 152636,
+ "<|d_201|>": 152637,
+ "<|d_202|>": 152638,
+ "<|d_203|>": 152639,
+ "<|d_204|>": 152640,
+ "<|d_205|>": 152641,
+ "<|d_206|>": 152642,
+ "<|d_207|>": 152643,
+ "<|d_208|>": 152644,
+ "<|d_209|>": 152645,
+ "<|d_20|>": 152456,
+ "<|d_210|>": 152646,
+ "<|d_211|>": 152647,
+ "<|d_212|>": 152648,
+ "<|d_213|>": 152649,
+ "<|d_214|>": 152650,
+ "<|d_215|>": 152651,
+ "<|d_216|>": 152652,
+ "<|d_217|>": 152653,
+ "<|d_218|>": 152654,
+ "<|d_219|>": 152655,
+ "<|d_21|>": 152457,
+ "<|d_220|>": 152656,
+ "<|d_221|>": 152657,
+ "<|d_222|>": 152658,
+ "<|d_223|>": 152659,
+ "<|d_224|>": 152660,
+ "<|d_225|>": 152661,
+ "<|d_226|>": 152662,
+ "<|d_227|>": 152663,
+ "<|d_228|>": 152664,
+ "<|d_229|>": 152665,
+ "<|d_22|>": 152458,
+ "<|d_230|>": 152666,
+ "<|d_231|>": 152667,
+ "<|d_232|>": 152668,
+ "<|d_233|>": 152669,
+ "<|d_234|>": 152670,
+ "<|d_235|>": 152671,
+ "<|d_236|>": 152672,
+ "<|d_237|>": 152673,
+ "<|d_238|>": 152674,
+ "<|d_239|>": 152675,
+ "<|d_23|>": 152459,
+ "<|d_240|>": 152676,
+ "<|d_241|>": 152677,
+ "<|d_242|>": 152678,
+ "<|d_243|>": 152679,
+ "<|d_244|>": 152680,
+ "<|d_245|>": 152681,
+ "<|d_246|>": 152682,
+ "<|d_247|>": 152683,
+ "<|d_248|>": 152684,
+ "<|d_249|>": 152685,
+ "<|d_24|>": 152460,
+ "<|d_250|>": 152686,
+ "<|d_251|>": 152687,
+ "<|d_252|>": 152688,
+ "<|d_253|>": 152689,
+ "<|d_254|>": 152690,
+ "<|d_255|>": 152691,
+ "<|d_256|>": 152692,
+ "<|d_25|>": 152461,
+ "<|d_26|>": 152462,
+ "<|d_27|>": 152463,
+ "<|d_28|>": 152464,
+ "<|d_29|>": 152465,
+ "<|d_2|>": 152438,
+ "<|d_30|>": 152466,
+ "<|d_31|>": 152467,
+ "<|d_32|>": 152468,
+ "<|d_33|>": 152469,
+ "<|d_34|>": 152470,
+ "<|d_35|>": 152471,
+ "<|d_36|>": 152472,
+ "<|d_37|>": 152473,
+ "<|d_38|>": 152474,
+ "<|d_39|>": 152475,
+ "<|d_3|>": 152439,
+ "<|d_40|>": 152476,
+ "<|d_41|>": 152477,
+ "<|d_42|>": 152478,
+ "<|d_43|>": 152479,
+ "<|d_44|>": 152480,
+ "<|d_45|>": 152481,
+ "<|d_46|>": 152482,
+ "<|d_47|>": 152483,
+ "<|d_48|>": 152484,
+ "<|d_49|>": 152485,
+ "<|d_4|>": 152440,
+ "<|d_50|>": 152486,
+ "<|d_51|>": 152487,
+ "<|d_52|>": 152488,
+ "<|d_53|>": 152489,
+ "<|d_54|>": 152490,
+ "<|d_55|>": 152491,
+ "<|d_56|>": 152492,
+ "<|d_57|>": 152493,
+ "<|d_58|>": 152494,
+ "<|d_59|>": 152495,
+ "<|d_5|>": 152441,
+ "<|d_60|>": 152496,
+ "<|d_61|>": 152497,
+ "<|d_62|>": 152498,
+ "<|d_63|>": 152499,
+ "<|d_64|>": 152500,
+ "<|d_65|>": 152501,
+ "<|d_66|>": 152502,
+ "<|d_67|>": 152503,
+ "<|d_68|>": 152504,
+ "<|d_69|>": 152505,
+ "<|d_6|>": 152442,
+ "<|d_70|>": 152506,
+ "<|d_71|>": 152507,
+ "<|d_72|>": 152508,
+ "<|d_73|>": 152509,
+ "<|d_74|>": 152510,
+ "<|d_75|>": 152511,
+ "<|d_76|>": 152512,
+ "<|d_77|>": 152513,
+ "<|d_78|>": 152514,
+ "<|d_79|>": 152515,
+ "<|d_7|>": 152443,
+ "<|d_80|>": 152516,
+ "<|d_81|>": 152517,
+ "<|d_82|>": 152518,
+ "<|d_83|>": 152519,
+ "<|d_84|>": 152520,
+ "<|d_85|>": 152521,
+ "<|d_86|>": 152522,
+ "<|d_87|>": 152523,
+ "<|d_88|>": 152524,
+ "<|d_89|>": 152525,
+ "<|d_8|>": 152444,
+ "<|d_90|>": 152526,
+ "<|d_91|>": 152527,
+ "<|d_92|>": 152528,
+ "<|d_93|>": 152529,
+ "<|d_94|>": 152530,
+ "<|d_95|>": 152531,
+ "<|d_96|>": 152532,
+ "<|d_97|>": 152533,
+ "<|d_98|>": 152534,
+ "<|d_99|>": 152535,
+ "<|d_9|>": 152445,
+ "<|e_100|>": 152792,
+ "<|e_101|>": 152793,
+ "<|e_102|>": 152794,
+ "<|e_103|>": 152795,
+ "<|e_104|>": 152796,
+ "<|e_105|>": 152797,
+ "<|e_106|>": 152798,
+ "<|e_107|>": 152799,
+ "<|e_108|>": 152800,
+ "<|e_109|>": 152801,
+ "<|e_10|>": 152702,
+ "<|e_110|>": 152802,
+ "<|e_111|>": 152803,
+ "<|e_112|>": 152804,
+ "<|e_113|>": 152805,
+ "<|e_114|>": 152806,
+ "<|e_115|>": 152807,
+ "<|e_116|>": 152808,
+ "<|e_117|>": 152809,
+ "<|e_118|>": 152810,
+ "<|e_119|>": 152811,
+ "<|e_11|>": 152703,
+ "<|e_120|>": 152812,
+ "<|e_121|>": 152813,
+ "<|e_122|>": 152814,
+ "<|e_123|>": 152815,
+ "<|e_124|>": 152816,
+ "<|e_125|>": 152817,
+ "<|e_126|>": 152818,
+ "<|e_127|>": 152819,
+ "<|e_128|>": 152820,
+ "<|e_129|>": 152821,
+ "<|e_12|>": 152704,
+ "<|e_130|>": 152822,
+ "<|e_131|>": 152823,
+ "<|e_132|>": 152824,
+ "<|e_133|>": 152825,
+ "<|e_134|>": 152826,
+ "<|e_135|>": 152827,
+ "<|e_136|>": 152828,
+ "<|e_137|>": 152829,
+ "<|e_138|>": 152830,
+ "<|e_139|>": 152831,
+ "<|e_13|>": 152705,
+ "<|e_140|>": 152832,
+ "<|e_141|>": 152833,
+ "<|e_142|>": 152834,
+ "<|e_143|>": 152835,
+ "<|e_144|>": 152836,
+ "<|e_145|>": 152837,
+ "<|e_146|>": 152838,
+ "<|e_147|>": 152839,
1086
+ "<|e_148|>": 152840,
1087
+ "<|e_149|>": 152841,
1088
+ "<|e_14|>": 152706,
1089
+ "<|e_150|>": 152842,
1090
+ "<|e_151|>": 152843,
1091
+ "<|e_152|>": 152844,
1092
+ "<|e_153|>": 152845,
1093
+ "<|e_154|>": 152846,
1094
+ "<|e_155|>": 152847,
1095
+ "<|e_156|>": 152848,
1096
+ "<|e_157|>": 152849,
1097
+ "<|e_158|>": 152850,
1098
+ "<|e_159|>": 152851,
1099
+ "<|e_15|>": 152707,
1100
+ "<|e_160|>": 152852,
1101
+ "<|e_161|>": 152853,
1102
+ "<|e_162|>": 152854,
1103
+ "<|e_163|>": 152855,
1104
+ "<|e_164|>": 152856,
1105
+ "<|e_165|>": 152857,
1106
+ "<|e_166|>": 152858,
1107
+ "<|e_167|>": 152859,
1108
+ "<|e_168|>": 152860,
1109
+ "<|e_169|>": 152861,
1110
+ "<|e_16|>": 152708,
1111
+ "<|e_170|>": 152862,
1112
+ "<|e_171|>": 152863,
1113
+ "<|e_172|>": 152864,
1114
+ "<|e_173|>": 152865,
1115
+ "<|e_174|>": 152866,
1116
+ "<|e_175|>": 152867,
1117
+ "<|e_176|>": 152868,
1118
+ "<|e_177|>": 152869,
1119
+ "<|e_178|>": 152870,
1120
+ "<|e_179|>": 152871,
1121
+ "<|e_17|>": 152709,
1122
+ "<|e_180|>": 152872,
1123
+ "<|e_181|>": 152873,
1124
+ "<|e_182|>": 152874,
1125
+ "<|e_183|>": 152875,
1126
+ "<|e_184|>": 152876,
1127
+ "<|e_185|>": 152877,
1128
+ "<|e_186|>": 152878,
1129
+ "<|e_187|>": 152879,
1130
+ "<|e_188|>": 152880,
1131
+ "<|e_189|>": 152881,
1132
+ "<|e_18|>": 152710,
1133
+ "<|e_190|>": 152882,
1134
+ "<|e_191|>": 152883,
1135
+ "<|e_192|>": 152884,
1136
+ "<|e_193|>": 152885,
1137
+ "<|e_194|>": 152886,
1138
+ "<|e_195|>": 152887,
1139
+ "<|e_196|>": 152888,
1140
+ "<|e_197|>": 152889,
1141
+ "<|e_198|>": 152890,
1142
+ "<|e_199|>": 152891,
1143
+ "<|e_19|>": 152711,
1144
+ "<|e_1|>": 152693,
1145
+ "<|e_200|>": 152892,
1146
+ "<|e_201|>": 152893,
1147
+ "<|e_202|>": 152894,
1148
+ "<|e_203|>": 152895,
1149
+ "<|e_204|>": 152896,
1150
+ "<|e_205|>": 152897,
1151
+ "<|e_206|>": 152898,
1152
+ "<|e_207|>": 152899,
1153
+ "<|e_208|>": 152900,
1154
+ "<|e_209|>": 152901,
1155
+ "<|e_20|>": 152712,
1156
+ "<|e_210|>": 152902,
1157
+ "<|e_211|>": 152903,
1158
+ "<|e_212|>": 152904,
1159
+ "<|e_213|>": 152905,
1160
+ "<|e_214|>": 152906,
1161
+ "<|e_215|>": 152907,
1162
+ "<|e_216|>": 152908,
1163
+ "<|e_217|>": 152909,
1164
+ "<|e_218|>": 152910,
1165
+ "<|e_219|>": 152911,
1166
+ "<|e_21|>": 152713,
1167
+ "<|e_220|>": 152912,
1168
+ "<|e_221|>": 152913,
1169
+ "<|e_222|>": 152914,
1170
+ "<|e_223|>": 152915,
1171
+ "<|e_224|>": 152916,
1172
+ "<|e_225|>": 152917,
1173
+ "<|e_226|>": 152918,
1174
+ "<|e_227|>": 152919,
1175
+ "<|e_228|>": 152920,
1176
+ "<|e_229|>": 152921,
1177
+ "<|e_22|>": 152714,
1178
+ "<|e_230|>": 152922,
1179
+ "<|e_231|>": 152923,
1180
+ "<|e_232|>": 152924,
1181
+ "<|e_233|>": 152925,
1182
+ "<|e_234|>": 152926,
1183
+ "<|e_235|>": 152927,
1184
+ "<|e_236|>": 152928,
1185
+ "<|e_237|>": 152929,
1186
+ "<|e_238|>": 152930,
1187
+ "<|e_239|>": 152931,
1188
+ "<|e_23|>": 152715,
1189
+ "<|e_240|>": 152932,
1190
+ "<|e_241|>": 152933,
1191
+ "<|e_242|>": 152934,
1192
+ "<|e_243|>": 152935,
1193
+ "<|e_244|>": 152936,
1194
+ "<|e_245|>": 152937,
1195
+ "<|e_246|>": 152938,
1196
+ "<|e_247|>": 152939,
1197
+ "<|e_248|>": 152940,
1198
+ "<|e_249|>": 152941,
1199
+ "<|e_24|>": 152716,
1200
+ "<|e_250|>": 152942,
1201
+ "<|e_251|>": 152943,
1202
+ "<|e_252|>": 152944,
1203
+ "<|e_253|>": 152945,
1204
+ "<|e_254|>": 152946,
1205
+ "<|e_255|>": 152947,
1206
+ "<|e_256|>": 152948,
1207
+ "<|e_25|>": 152717,
1208
+ "<|e_26|>": 152718,
1209
+ "<|e_27|>": 152719,
1210
+ "<|e_28|>": 152720,
1211
+ "<|e_29|>": 152721,
1212
+ "<|e_2|>": 152694,
1213
+ "<|e_30|>": 152722,
1214
+ "<|e_31|>": 152723,
1215
+ "<|e_32|>": 152724,
1216
+ "<|e_33|>": 152725,
1217
+ "<|e_34|>": 152726,
1218
+ "<|e_35|>": 152727,
1219
+ "<|e_36|>": 152728,
1220
+ "<|e_37|>": 152729,
1221
+ "<|e_38|>": 152730,
1222
+ "<|e_39|>": 152731,
1223
+ "<|e_3|>": 152695,
1224
+ "<|e_40|>": 152732,
1225
+ "<|e_41|>": 152733,
1226
+ "<|e_42|>": 152734,
1227
+ "<|e_43|>": 152735,
1228
+ "<|e_44|>": 152736,
1229
+ "<|e_45|>": 152737,
1230
+ "<|e_46|>": 152738,
1231
+ "<|e_47|>": 152739,
1232
+ "<|e_48|>": 152740,
1233
+ "<|e_49|>": 152741,
1234
+ "<|e_4|>": 152696,
1235
+ "<|e_50|>": 152742,
1236
+ "<|e_51|>": 152743,
1237
+ "<|e_52|>": 152744,
1238
+ "<|e_53|>": 152745,
1239
+ "<|e_54|>": 152746,
1240
+ "<|e_55|>": 152747,
1241
+ "<|e_56|>": 152748,
1242
+ "<|e_57|>": 152749,
1243
+ "<|e_58|>": 152750,
1244
+ "<|e_59|>": 152751,
1245
+ "<|e_5|>": 152697,
1246
+ "<|e_60|>": 152752,
1247
+ "<|e_61|>": 152753,
1248
+ "<|e_62|>": 152754,
1249
+ "<|e_63|>": 152755,
1250
+ "<|e_64|>": 152756,
1251
+ "<|e_65|>": 152757,
1252
+ "<|e_66|>": 152758,
1253
+ "<|e_67|>": 152759,
1254
+ "<|e_68|>": 152760,
1255
+ "<|e_69|>": 152761,
1256
+ "<|e_6|>": 152698,
1257
+ "<|e_70|>": 152762,
1258
+ "<|e_71|>": 152763,
1259
+ "<|e_72|>": 152764,
1260
+ "<|e_73|>": 152765,
1261
+ "<|e_74|>": 152766,
1262
+ "<|e_75|>": 152767,
1263
+ "<|e_76|>": 152768,
1264
+ "<|e_77|>": 152769,
1265
+ "<|e_78|>": 152770,
1266
+ "<|e_79|>": 152771,
1267
+ "<|e_7|>": 152699,
1268
+ "<|e_80|>": 152772,
1269
+ "<|e_81|>": 152773,
1270
+ "<|e_82|>": 152774,
1271
+ "<|e_83|>": 152775,
1272
+ "<|e_84|>": 152776,
1273
+ "<|e_85|>": 152777,
1274
+ "<|e_86|>": 152778,
1275
+ "<|e_87|>": 152779,
1276
+ "<|e_88|>": 152780,
1277
+ "<|e_89|>": 152781,
1278
+ "<|e_8|>": 152700,
1279
+ "<|e_90|>": 152782,
1280
+ "<|e_91|>": 152783,
1281
+ "<|e_92|>": 152784,
1282
+ "<|e_93|>": 152785,
1283
+ "<|e_94|>": 152786,
1284
+ "<|e_95|>": 152787,
1285
+ "<|e_96|>": 152788,
1286
+ "<|e_97|>": 152789,
1287
+ "<|e_98|>": 152790,
1288
+ "<|e_99|>": 152791,
1289
+ "<|e_9|>": 152701,
1290
+ "<|endoftext|>": 151643,
1291
+ "<|file_sep|>": 151664,
1292
+ "<|fim_middle|>": 151660,
1293
+ "<|fim_pad|>": 151662,
1294
+ "<|fim_prefix|>": 151659,
1295
+ "<|fim_suffix|>": 151661,
1296
+ "<|im_end|>": 151645,
1297
+ "<|im_start|>": 151644,
1298
+ "<|image_pad|>": 151655,
1299
+ "<|object_ref_end|>": 151647,
1300
+ "<|object_ref_start|>": 151646,
1301
+ "<|quad_end|>": 151651,
1302
+ "<|quad_start|>": 151650,
1303
+ "<|repo_name|>": 151663,
1304
+ "<|video_pad|>": 151656,
1305
+ "<|vision_end|>": 151653,
1306
+ "<|vision_pad|>": 151654,
1307
+ "<|vision_start|>": 151652
1308
+ }
qwen3_4b_beauty/config.json ADDED
@@ -0,0 +1,30 @@
+ {
+ "architectures": [
+ "Qwen3ForCausalLM"
+ ],
+ "attention_bias": false,
+ "attention_dropout": 0.0,
+ "eos_token_id": 151645,
+ "head_dim": 128,
+ "hidden_act": "silu",
+ "hidden_size": 2560,
+ "initializer_range": 0.02,
+ "intermediate_size": 9728,
+ "max_position_embeddings": 262144,
+ "max_window_layers": 36,
+ "model_type": "qwen3",
+ "num_attention_heads": 32,
+ "num_hidden_layers": 36,
+ "num_key_value_heads": 8,
+ "pad_token_id": 151643,
+ "rms_norm_eps": 1e-06,
+ "rope_scaling": null,
+ "rope_theta": 5000000,
+ "sliding_window": null,
+ "tie_word_embeddings": true,
+ "torch_dtype": "bfloat16",
+ "transformers_version": "4.51.1",
+ "use_cache": false,
+ "use_sliding_window": false,
+ "vocab_size": 152949
+ }
qwen3_4b_beauty/generation_config.json ADDED
@@ -0,0 +1,13 @@
+ {
+ "bos_token_id": 151643,
+ "do_sample": true,
+ "eos_token_id": [
+ 151645,
+ 151643
+ ],
+ "pad_token_id": 151643,
+ "temperature": 0.7,
+ "top_k": 20,
+ "top_p": 0.8,
+ "transformers_version": "4.51.1"
+ }
qwen3_4b_beauty/merges.txt ADDED
The diff for this file is too large to render. See raw diff
qwen3_4b_beauty/model-00001-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2c824e297e8dea2ff4198759a6129fed11ffda953a2171140a7a9721bb767c5e
+ size 4990760184
qwen3_4b_beauty/model-00002-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:eb1ee7637f924a6431968f3b701350b2bf9052c315a968d96aae112d9862e891
+ size 3842507440
qwen3_4b_beauty/model.safetensors.index.json ADDED
@@ -0,0 +1,406 @@
+ {
+ "metadata": {
+ "total_size": 8833221632
+ },
+ "weight_map": {
+ "lm_head.weight": "model-00001-of-00002.safetensors",
+ "model.embed_tokens.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.10.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.10.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.11.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.11.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.11.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.11.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.11.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.11.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.13.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.13.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.13.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.15.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.15.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.15.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.15.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.16.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.16.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.16.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.18.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.18.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.19.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.19.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.19.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.19.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.20.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.20.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.20.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.20.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.21.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.21.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.21.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.22.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.22.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.22.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.23.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.23.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.23.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.23.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.24.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.24.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.24.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.24.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.25.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.25.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.25.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.25.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.25.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.26.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.26.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.26.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.26.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.26.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.26.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.26.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.26.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.26.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.26.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.26.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.27.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.27.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.27.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.27.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.27.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.27.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.27.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.27.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.27.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.27.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.27.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.28.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.28.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.28.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.28.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.28.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.28.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.28.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.28.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.28.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.28.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.28.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.29.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.29.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.29.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.29.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.29.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.29.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.29.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.29.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.29.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.29.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.29.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.3.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.3.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.3.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.3.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.3.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.3.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.3.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.3.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.30.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.30.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.30.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.30.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.30.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.30.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.30.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.30.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.30.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
281
+ "model.layers.30.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
282
+ "model.layers.30.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
283
+ "model.layers.31.input_layernorm.weight": "model-00001-of-00002.safetensors",
284
+ "model.layers.31.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
285
+ "model.layers.31.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
286
+ "model.layers.31.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
287
+ "model.layers.31.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
288
+ "model.layers.31.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
289
+ "model.layers.31.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
290
+ "model.layers.31.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
291
+ "model.layers.31.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
292
+ "model.layers.31.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
293
+ "model.layers.31.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
294
+ "model.layers.32.input_layernorm.weight": "model-00001-of-00002.safetensors",
295
+ "model.layers.32.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
296
+ "model.layers.32.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
297
+ "model.layers.32.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
298
+ "model.layers.32.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
299
+ "model.layers.32.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
300
+ "model.layers.32.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
301
+ "model.layers.32.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
302
+ "model.layers.32.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
303
+ "model.layers.32.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
304
+ "model.layers.32.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
305
+ "model.layers.33.input_layernorm.weight": "model-00002-of-00002.safetensors",
306
+ "model.layers.33.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
307
+ "model.layers.33.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
308
+ "model.layers.33.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
309
+ "model.layers.33.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
310
+ "model.layers.33.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
311
+ "model.layers.33.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
312
+ "model.layers.33.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
313
+ "model.layers.33.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
314
+ "model.layers.33.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
315
+ "model.layers.33.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
316
+ "model.layers.34.input_layernorm.weight": "model-00002-of-00002.safetensors",
317
+ "model.layers.34.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
318
+ "model.layers.34.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
319
+ "model.layers.34.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
320
+ "model.layers.34.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
321
+ "model.layers.34.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
322
+ "model.layers.34.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
323
+ "model.layers.34.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
324
+ "model.layers.34.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
325
+ "model.layers.34.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
326
+ "model.layers.34.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
327
+ "model.layers.35.input_layernorm.weight": "model-00002-of-00002.safetensors",
328
+ "model.layers.35.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
329
+ "model.layers.35.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
330
+ "model.layers.35.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
331
+ "model.layers.35.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
332
+ "model.layers.35.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
333
+ "model.layers.35.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
334
+ "model.layers.35.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
335
+ "model.layers.35.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
336
+ "model.layers.35.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
337
+ "model.layers.35.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
338
+ "model.layers.4.input_layernorm.weight": "model-00002-of-00002.safetensors",
339
+ "model.layers.4.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
340
+ "model.layers.4.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
341
+ "model.layers.4.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
342
+ "model.layers.4.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
343
+ "model.layers.4.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
344
+ "model.layers.4.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
345
+ "model.layers.4.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
346
+ "model.layers.4.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
347
+ "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
348
+ "model.layers.4.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
349
+ "model.layers.5.input_layernorm.weight": "model-00002-of-00002.safetensors",
350
+ "model.layers.5.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
351
+ "model.layers.5.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
352
+ "model.layers.5.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
353
+ "model.layers.5.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
354
+ "model.layers.5.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
355
+ "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
356
+ "model.layers.5.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
357
+ "model.layers.5.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
358
+ "model.layers.5.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
359
+ "model.layers.5.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
360
+ "model.layers.6.input_layernorm.weight": "model-00001-of-00002.safetensors",
361
+ "model.layers.6.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
362
+ "model.layers.6.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
363
+ "model.layers.6.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
364
+ "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
365
+ "model.layers.6.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
366
+ "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
367
+ "model.layers.6.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
368
+ "model.layers.6.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
369
+ "model.layers.6.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
370
+ "model.layers.6.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
371
+ "model.layers.7.input_layernorm.weight": "model-00001-of-00002.safetensors",
372
+ "model.layers.7.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
373
+ "model.layers.7.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
374
+ "model.layers.7.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
375
+ "model.layers.7.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
376
+ "model.layers.7.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
377
+ "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
378
+ "model.layers.7.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
379
+ "model.layers.7.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
380
+ "model.layers.7.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
381
+ "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
382
+ "model.layers.8.input_layernorm.weight": "model-00002-of-00002.safetensors",
383
+ "model.layers.8.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
384
+ "model.layers.8.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
385
+ "model.layers.8.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
386
+ "model.layers.8.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
387
+ "model.layers.8.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
388
+ "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
389
+ "model.layers.8.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
390
+ "model.layers.8.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
391
+ "model.layers.8.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
392
+ "model.layers.8.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
393
+ "model.layers.9.input_layernorm.weight": "model-00001-of-00002.safetensors",
394
+ "model.layers.9.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
395
+ "model.layers.9.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
396
+ "model.layers.9.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
397
+ "model.layers.9.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
398
+ "model.layers.9.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
399
+ "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
400
+ "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
401
+ "model.layers.9.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
402
+ "model.layers.9.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
403
+ "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
404
+ "model.norm.weight": "model-00002-of-00002.safetensors"
405
+ }
406
+ }
qwen3_4b_beauty/special_tokens_map.json ADDED
@@ -0,0 +1,1298 @@
+ {
+ "additional_special_tokens": [
+ "<|a_1|>",
+ "<|a_2|>",
+ "<|a_3|>",
+ "<|a_4|>",
+ "<|a_5|>",
+ "<|a_6|>",
+ "<|a_7|>",
+ "<|a_8|>",
+ "<|a_9|>",
+ "<|a_10|>",
+ "<|a_11|>",
+ "<|a_12|>",
+ "<|a_13|>",
+ "<|a_14|>",
+ "<|a_15|>",
+ "<|a_16|>",
+ "<|a_17|>",
+ "<|a_18|>",
+ "<|a_19|>",
+ "<|a_20|>",
+ "<|a_21|>",
+ "<|a_22|>",
+ "<|a_23|>",
+ "<|a_24|>",
+ "<|a_25|>",
+ "<|a_26|>",
+ "<|a_27|>",
+ "<|a_28|>",
+ "<|a_29|>",
+ "<|a_30|>",
+ "<|a_31|>",
+ "<|a_32|>",
+ "<|a_33|>",
+ "<|a_34|>",
+ "<|a_35|>",
+ "<|a_36|>",
+ "<|a_37|>",
+ "<|a_38|>",
+ "<|a_39|>",
+ "<|a_40|>",
+ "<|a_41|>",
+ "<|a_42|>",
+ "<|a_43|>",
+ "<|a_44|>",
+ "<|a_45|>",
+ "<|a_46|>",
+ "<|a_47|>",
+ "<|a_48|>",
+ "<|a_49|>",
+ "<|a_50|>",
+ "<|a_51|>",
+ "<|a_52|>",
+ "<|a_53|>",
+ "<|a_54|>",
+ "<|a_55|>",
+ "<|a_56|>",
+ "<|a_57|>",
+ "<|a_58|>",
+ "<|a_59|>",
+ "<|a_60|>",
+ "<|a_61|>",
+ "<|a_62|>",
+ "<|a_63|>",
+ "<|a_64|>",
+ "<|a_65|>",
+ "<|a_66|>",
+ "<|a_67|>",
+ "<|a_68|>",
+ "<|a_69|>",
+ "<|a_70|>",
+ "<|a_71|>",
+ "<|a_72|>",
+ "<|a_73|>",
+ "<|a_74|>",
+ "<|a_75|>",
+ "<|a_76|>",
+ "<|a_77|>",
+ "<|a_78|>",
+ "<|a_79|>",
+ "<|a_80|>",
+ "<|a_81|>",
+ "<|a_82|>",
+ "<|a_83|>",
+ "<|a_84|>",
+ "<|a_85|>",
+ "<|a_86|>",
+ "<|a_87|>",
+ "<|a_88|>",
+ "<|a_89|>",
+ "<|a_90|>",
+ "<|a_91|>",
+ "<|a_92|>",
+ "<|a_93|>",
+ "<|a_94|>",
+ "<|a_95|>",
+ "<|a_96|>",
+ "<|a_97|>",
+ "<|a_98|>",
+ "<|a_99|>",
+ "<|a_100|>",
+ "<|a_101|>",
+ "<|a_102|>",
+ "<|a_103|>",
+ "<|a_104|>",
+ "<|a_105|>",
+ "<|a_106|>",
+ "<|a_107|>",
+ "<|a_108|>",
+ "<|a_109|>",
+ "<|a_110|>",
+ "<|a_111|>",
+ "<|a_112|>",
+ "<|a_113|>",
+ "<|a_114|>",
+ "<|a_115|>",
+ "<|a_116|>",
+ "<|a_117|>",
+ "<|a_118|>",
+ "<|a_119|>",
+ "<|a_120|>",
+ "<|a_121|>",
+ "<|a_122|>",
+ "<|a_123|>",
+ "<|a_124|>",
+ "<|a_125|>",
+ "<|a_126|>",
+ "<|a_127|>",
+ "<|a_128|>",
+ "<|a_129|>",
+ "<|a_130|>",
+ "<|a_131|>",
+ "<|a_132|>",
+ "<|a_133|>",
+ "<|a_134|>",
+ "<|a_135|>",
+ "<|a_136|>",
+ "<|a_137|>",
+ "<|a_138|>",
+ "<|a_139|>",
+ "<|a_140|>",
+ "<|a_141|>",
+ "<|a_142|>",
+ "<|a_143|>",
+ "<|a_144|>",
+ "<|a_145|>",
+ "<|a_146|>",
+ "<|a_147|>",
+ "<|a_148|>",
+ "<|a_149|>",
+ "<|a_150|>",
+ "<|a_151|>",
+ "<|a_152|>",
+ "<|a_153|>",
+ "<|a_154|>",
+ "<|a_155|>",
+ "<|a_156|>",
+ "<|a_157|>",
+ "<|a_158|>",
+ "<|a_159|>",
+ "<|a_160|>",
+ "<|a_161|>",
+ "<|a_162|>",
+ "<|a_163|>",
+ "<|a_164|>",
+ "<|a_165|>",
+ "<|a_166|>",
+ "<|a_167|>",
+ "<|a_168|>",
+ "<|a_169|>",
+ "<|a_170|>",
+ "<|a_171|>",
+ "<|a_172|>",
+ "<|a_173|>",
+ "<|a_174|>",
+ "<|a_175|>",
+ "<|a_176|>",
+ "<|a_177|>",
+ "<|a_178|>",
+ "<|a_179|>",
+ "<|a_180|>",
+ "<|a_181|>",
+ "<|a_182|>",
+ "<|a_183|>",
+ "<|a_184|>",
+ "<|a_185|>",
+ "<|a_186|>",
+ "<|a_187|>",
+ "<|a_188|>",
+ "<|a_189|>",
+ "<|a_190|>",
+ "<|a_191|>",
+ "<|a_192|>",
+ "<|a_193|>",
+ "<|a_194|>",
+ "<|a_195|>",
+ "<|a_196|>",
+ "<|a_197|>",
+ "<|a_198|>",
+ "<|a_199|>",
+ "<|a_200|>",
+ "<|a_201|>",
+ "<|a_202|>",
+ "<|a_203|>",
+ "<|a_204|>",
+ "<|a_205|>",
+ "<|a_206|>",
+ "<|a_207|>",
+ "<|a_208|>",
+ "<|a_209|>",
+ "<|a_210|>",
+ "<|a_211|>",
+ "<|a_212|>",
+ "<|a_213|>",
+ "<|a_214|>",
+ "<|a_215|>",
+ "<|a_216|>",
+ "<|a_217|>",
+ "<|a_218|>",
+ "<|a_219|>",
+ "<|a_220|>",
+ "<|a_221|>",
+ "<|a_222|>",
+ "<|a_223|>",
+ "<|a_224|>",
+ "<|a_225|>",
+ "<|a_226|>",
+ "<|a_227|>",
+ "<|a_228|>",
+ "<|a_229|>",
+ "<|a_230|>",
+ "<|a_231|>",
+ "<|a_232|>",
+ "<|a_233|>",
+ "<|a_234|>",
+ "<|a_235|>",
+ "<|a_236|>",
+ "<|a_237|>",
+ "<|a_238|>",
+ "<|a_239|>",
+ "<|a_240|>",
+ "<|a_241|>",
+ "<|a_242|>",
+ "<|a_243|>",
+ "<|a_244|>",
+ "<|a_245|>",
+ "<|a_246|>",
+ "<|a_247|>",
+ "<|a_248|>",
+ "<|a_249|>",
+ "<|a_250|>",
+ "<|a_251|>",
+ "<|a_252|>",
+ "<|a_253|>",
+ "<|a_254|>",
+ "<|a_255|>",
+ "<|a_256|>",
+ "<|b_1|>",
+ "<|b_2|>",
+ "<|b_3|>",
+ "<|b_4|>",
+ "<|b_5|>",
+ "<|b_6|>",
+ "<|b_7|>",
+ "<|b_8|>",
+ "<|b_9|>",
+ "<|b_10|>",
+ "<|b_11|>",
+ "<|b_12|>",
+ "<|b_13|>",
+ "<|b_14|>",
+ "<|b_15|>",
+ "<|b_16|>",
+ "<|b_17|>",
+ "<|b_18|>",
+ "<|b_19|>",
+ "<|b_20|>",
+ "<|b_21|>",
+ "<|b_22|>",
+ "<|b_23|>",
+ "<|b_24|>",
+ "<|b_25|>",
+ "<|b_26|>",
+ "<|b_27|>",
+ "<|b_28|>",
+ "<|b_29|>",
+ "<|b_30|>",
+ "<|b_31|>",
+ "<|b_32|>",
+ "<|b_33|>",
+ "<|b_34|>",
+ "<|b_35|>",
+ "<|b_36|>",
+ "<|b_37|>",
+ "<|b_38|>",
+ "<|b_39|>",
+ "<|b_40|>",
+ "<|b_41|>",
+ "<|b_42|>",
+ "<|b_43|>",
+ "<|b_44|>",
+ "<|b_45|>",
+ "<|b_46|>",
+ "<|b_47|>",
+ "<|b_48|>",
+ "<|b_49|>",
+ "<|b_50|>",
+ "<|b_51|>",
+ "<|b_52|>",
+ "<|b_53|>",
+ "<|b_54|>",
+ "<|b_55|>",
+ "<|b_56|>",
+ "<|b_57|>",
+ "<|b_58|>",
+ "<|b_59|>",
+ "<|b_60|>",
+ "<|b_61|>",
+ "<|b_62|>",
+ "<|b_63|>",
+ "<|b_64|>",
+ "<|b_65|>",
+ "<|b_66|>",
+ "<|b_67|>",
+ "<|b_68|>",
+ "<|b_69|>",
+ "<|b_70|>",
+ "<|b_71|>",
+ "<|b_72|>",
+ "<|b_73|>",
+ "<|b_74|>",
+ "<|b_75|>",
+ "<|b_76|>",
+ "<|b_77|>",
+ "<|b_78|>",
+ "<|b_79|>",
+ "<|b_80|>",
+ "<|b_81|>",
+ "<|b_82|>",
+ "<|b_83|>",
+ "<|b_84|>",
+ "<|b_85|>",
+ "<|b_86|>",
+ "<|b_87|>",
+ "<|b_88|>",
+ "<|b_89|>",
+ "<|b_90|>",
+ "<|b_91|>",
+ "<|b_92|>",
+ "<|b_93|>",
+ "<|b_94|>",
+ "<|b_95|>",
+ "<|b_96|>",
+ "<|b_97|>",
+ "<|b_98|>",
+ "<|b_99|>",
+ "<|b_100|>",
+ "<|b_101|>",
+ "<|b_102|>",
+ "<|b_103|>",
+ "<|b_104|>",
+ "<|b_105|>",
+ "<|b_106|>",
+ "<|b_107|>",
+ "<|b_108|>",
+ "<|b_109|>",
+ "<|b_110|>",
+ "<|b_111|>",
+ "<|b_112|>",
+ "<|b_113|>",
+ "<|b_114|>",
+ "<|b_115|>",
+ "<|b_116|>",
+ "<|b_117|>",
+ "<|b_118|>",
+ "<|b_119|>",
+ "<|b_120|>",
+ "<|b_121|>",
+ "<|b_122|>",
+ "<|b_123|>",
+ "<|b_124|>",
+ "<|b_125|>",
+ "<|b_126|>",
+ "<|b_127|>",
+ "<|b_128|>",
+ "<|b_129|>",
+ "<|b_130|>",
+ "<|b_131|>",
+ "<|b_132|>",
+ "<|b_133|>",
+ "<|b_134|>",
+ "<|b_135|>",
+ "<|b_136|>",
+ "<|b_137|>",
+ "<|b_138|>",
+ "<|b_139|>",
+ "<|b_140|>",
+ "<|b_141|>",
+ "<|b_142|>",
+ "<|b_143|>",
+ "<|b_144|>",
+ "<|b_145|>",
+ "<|b_146|>",
+ "<|b_147|>",
+ "<|b_148|>",
+ "<|b_149|>",
+ "<|b_150|>",
+ "<|b_151|>",
+ "<|b_152|>",
+ "<|b_153|>",
+ "<|b_154|>",
+ "<|b_155|>",
+ "<|b_156|>",
+ "<|b_157|>",
+ "<|b_158|>",
+ "<|b_159|>",
+ "<|b_160|>",
+ "<|b_161|>",
+ "<|b_162|>",
+ "<|b_163|>",
+ "<|b_164|>",
+ "<|b_165|>",
+ "<|b_166|>",
+ "<|b_167|>",
+ "<|b_168|>",
+ "<|b_169|>",
+ "<|b_170|>",
+ "<|b_171|>",
+ "<|b_172|>",
+ "<|b_173|>",
+ "<|b_174|>",
+ "<|b_175|>",
+ "<|b_176|>",
+ "<|b_177|>",
+ "<|b_178|>",
+ "<|b_179|>",
+ "<|b_180|>",
+ "<|b_181|>",
+ "<|b_182|>",
+ "<|b_183|>",
+ "<|b_184|>",
+ "<|b_185|>",
+ "<|b_186|>",
+ "<|b_187|>",
+ "<|b_188|>",
+ "<|b_189|>",
+ "<|b_190|>",
+ "<|b_191|>",
+ "<|b_192|>",
+ "<|b_193|>",
+ "<|b_194|>",
+ "<|b_195|>",
+ "<|b_196|>",
+ "<|b_197|>",
+ "<|b_198|>",
+ "<|b_199|>",
+ "<|b_200|>",
+ "<|b_201|>",
+ "<|b_202|>",
+ "<|b_203|>",
+ "<|b_204|>",
+ "<|b_205|>",
+ "<|b_206|>",
+ "<|b_207|>",
+ "<|b_208|>",
+ "<|b_209|>",
+ "<|b_210|>",
+ "<|b_211|>",
+ "<|b_212|>",
+ "<|b_213|>",
+ "<|b_214|>",
+ "<|b_215|>",
+ "<|b_216|>",
+ "<|b_217|>",
+ "<|b_218|>",
+ "<|b_219|>",
+ "<|b_220|>",
+ "<|b_221|>",
+ "<|b_222|>",
+ "<|b_223|>",
+ "<|b_224|>",
+ "<|b_225|>",
+ "<|b_226|>",
+ "<|b_227|>",
+ "<|b_228|>",
+ "<|b_229|>",
+ "<|b_230|>",
+ "<|b_231|>",
+ "<|b_232|>",
+ "<|b_233|>",
+ "<|b_234|>",
+ "<|b_235|>",
+ "<|b_236|>",
+ "<|b_237|>",
+ "<|b_238|>",
+ "<|b_239|>",
+ "<|b_240|>",
+ "<|b_241|>",
+ "<|b_242|>",
+ "<|b_243|>",
+ "<|b_244|>",
+ "<|b_245|>",
+ "<|b_246|>",
+ "<|b_247|>",
+ "<|b_248|>",
+ "<|b_249|>",
+ "<|b_250|>",
+ "<|b_251|>",
+ "<|b_252|>",
+ "<|b_253|>",
+ "<|b_254|>",
+ "<|b_255|>",
+ "<|b_256|>",
+ "<|c_1|>",
+ "<|c_2|>",
+ "<|c_3|>",
+ "<|c_4|>",
+ "<|c_5|>",
+ "<|c_6|>",
+ "<|c_7|>",
+ "<|c_8|>",
+ "<|c_9|>",
+ "<|c_10|>",
+ "<|c_11|>",
+ "<|c_12|>",
+ "<|c_13|>",
+ "<|c_14|>",
+ "<|c_15|>",
+ "<|c_16|>",
+ "<|c_17|>",
+ "<|c_18|>",
+ "<|c_19|>",
+ "<|c_20|>",
+ "<|c_21|>",
+ "<|c_22|>",
+ "<|c_23|>",
+ "<|c_24|>",
+ "<|c_25|>",
+ "<|c_26|>",
+ "<|c_27|>",
+ "<|c_28|>",
+ "<|c_29|>",
+ "<|c_30|>",
+ "<|c_31|>",
+ "<|c_32|>",
+ "<|c_33|>",
+ "<|c_34|>",
+ "<|c_35|>",
+ "<|c_36|>",
+ "<|c_37|>",
+ "<|c_38|>",
+ "<|c_39|>",
+ "<|c_40|>",
+ "<|c_41|>",
+ "<|c_42|>",
+ "<|c_43|>",
+ "<|c_44|>",
+ "<|c_45|>",
+ "<|c_46|>",
+ "<|c_47|>",
+ "<|c_48|>",
+ "<|c_49|>",
+ "<|c_50|>",
+ "<|c_51|>",
+ "<|c_52|>",
+ "<|c_53|>",
+ "<|c_54|>",
+ "<|c_55|>",
+ "<|c_56|>",
+ "<|c_57|>",
+ "<|c_58|>",
+ "<|c_59|>",
+ "<|c_60|>",
+ "<|c_61|>",
+ "<|c_62|>",
+ "<|c_63|>",
+ "<|c_64|>",
+ "<|c_65|>",
+ "<|c_66|>",
+ "<|c_67|>",
+ "<|c_68|>",
+ "<|c_69|>",
+ "<|c_70|>",
+ "<|c_71|>",
+ "<|c_72|>",
+ "<|c_73|>",
+ "<|c_74|>",
+ "<|c_75|>",
+ "<|c_76|>",
+ "<|c_77|>",
+ "<|c_78|>",
+ "<|c_79|>",
+ "<|c_80|>",
+ "<|c_81|>",
+ "<|c_82|>",
+ "<|c_83|>",
+ "<|c_84|>",
+ "<|c_85|>",
+ "<|c_86|>",
+ "<|c_87|>",
+ "<|c_88|>",
+ "<|c_89|>",
+ "<|c_90|>",
+ "<|c_91|>",
+ "<|c_92|>",
+ "<|c_93|>",
+ "<|c_94|>",
+ "<|c_95|>",
+ "<|c_96|>",
+ "<|c_97|>",
+ "<|c_98|>",
+ "<|c_99|>",
+ "<|c_100|>",
+ "<|c_101|>",
+ "<|c_102|>",
+ "<|c_103|>",
+ "<|c_104|>",
+ "<|c_105|>",
+ "<|c_106|>",
+ "<|c_107|>",
+ "<|c_108|>",
+ "<|c_109|>",
+ "<|c_110|>",
+ "<|c_111|>",
+ "<|c_112|>",
+ "<|c_113|>",
+ "<|c_114|>",
+ "<|c_115|>",
+ "<|c_116|>",
+ "<|c_117|>",
+ "<|c_118|>",
+ "<|c_119|>",
+ "<|c_120|>",
+ "<|c_121|>",
+ "<|c_122|>",
+ "<|c_123|>",
+ "<|c_124|>",
+ "<|c_125|>",
+ "<|c_126|>",
+ "<|c_127|>",
+ "<|c_128|>",
+ "<|c_129|>",
+ "<|c_130|>",
+ "<|c_131|>",
+ "<|c_132|>",
+ "<|c_133|>",
+ "<|c_134|>",
+ "<|c_135|>",
+ "<|c_136|>",
+ "<|c_137|>",
+ "<|c_138|>",
+ "<|c_139|>",
+ "<|c_140|>",
+ "<|c_141|>",
+ "<|c_142|>",
+ "<|c_143|>",
+ "<|c_144|>",
+ "<|c_145|>",
+ "<|c_146|>",
+ "<|c_147|>",
+ "<|c_148|>",
+ "<|c_149|>",
+ "<|c_150|>",
+ "<|c_151|>",
+ "<|c_152|>",
+ "<|c_153|>",
+ "<|c_154|>",
+ "<|c_155|>",
+ "<|c_156|>",
+ "<|c_157|>",
+ "<|c_158|>",
+ "<|c_159|>",
+ "<|c_160|>",
+ "<|c_161|>",
+ "<|c_162|>",
+ "<|c_163|>",
+ "<|c_164|>",
+ "<|c_165|>",
+ "<|c_166|>",
+ "<|c_167|>",
+ "<|c_168|>",
+ "<|c_169|>",
+ "<|c_170|>",
+ "<|c_171|>",
+ "<|c_172|>",
+ "<|c_173|>",
+ "<|c_174|>",
+ "<|c_175|>",
+ "<|c_176|>",
+ "<|c_177|>",
+ "<|c_178|>",
+ "<|c_179|>",
+ "<|c_180|>",
+ "<|c_181|>",
+ "<|c_182|>",
+ "<|c_183|>",
+ "<|c_184|>",
+ "<|c_185|>",
+ "<|c_186|>",
+ "<|c_187|>",
+ "<|c_188|>",
+ "<|c_189|>",
+ "<|c_190|>",
+ "<|c_191|>",
+ "<|c_192|>",
+ "<|c_193|>",
+ "<|c_194|>",
+ "<|c_195|>",
+ "<|c_196|>",
+ "<|c_197|>",
+ "<|c_198|>",
+ "<|c_199|>",
+ "<|c_200|>",
+ "<|c_201|>",
+ "<|c_202|>",
+ "<|c_203|>",
+ "<|c_204|>",
+ "<|c_205|>",
+ "<|c_206|>",
+ "<|c_207|>",
+ "<|c_208|>",
+ "<|c_209|>",
+ "<|c_210|>",
+ "<|c_211|>",
+ "<|c_212|>",
+ "<|c_213|>",
+ "<|c_214|>",
+ "<|c_215|>",
+ "<|c_216|>",
+ "<|c_217|>",
+ "<|c_218|>",
+ "<|c_219|>",
+ "<|c_220|>",
+ "<|c_221|>",
+ "<|c_222|>",
+ "<|c_223|>",
+ "<|c_224|>",
+ "<|c_225|>",
+ "<|c_226|>",
+ "<|c_227|>",
+ "<|c_228|>",
+ "<|c_229|>",
+ "<|c_230|>",
+ "<|c_231|>",
+ "<|c_232|>",
+ "<|c_233|>",
+ "<|c_234|>",
+ "<|c_235|>",
+ "<|c_236|>",
+ "<|c_237|>",
+ "<|c_238|>",
+ "<|c_239|>",
+ "<|c_240|>",
+ "<|c_241|>",
+ "<|c_242|>",
+ "<|c_243|>",
+ "<|c_244|>",
+ "<|c_245|>",
+ "<|c_246|>",
+ "<|c_247|>",
+ "<|c_248|>",
+ "<|c_249|>",
+ "<|c_250|>",
+ "<|c_251|>",
+ "<|c_252|>",
+ "<|c_253|>",
+ "<|c_254|>",
+ "<|c_255|>",
+ "<|c_256|>",
+ "<|d_1|>",
+ "<|d_2|>",
+ "<|d_3|>",
+ "<|d_4|>",
+ "<|d_5|>",
+ "<|d_6|>",
+ "<|d_7|>",
+ "<|d_8|>",
+ "<|d_9|>",
+ "<|d_10|>",
+ "<|d_11|>",
+ "<|d_12|>",
+ "<|d_13|>",
+ "<|d_14|>",
+ "<|d_15|>",
+ "<|d_16|>",
+ "<|d_17|>",
+ "<|d_18|>",
+ "<|d_19|>",
+ "<|d_20|>",
+ "<|d_21|>",
+ "<|d_22|>",
+ "<|d_23|>",
+ "<|d_24|>",
+ "<|d_25|>",
+ "<|d_26|>",
+ "<|d_27|>",
+ "<|d_28|>",
+ "<|d_29|>",
+ "<|d_30|>",
+ "<|d_31|>",
+ "<|d_32|>",
+ "<|d_33|>",
+ "<|d_34|>",
+ "<|d_35|>",
+ "<|d_36|>",
+ "<|d_37|>",
+ "<|d_38|>",
+ "<|d_39|>",
+ "<|d_40|>",
+ "<|d_41|>",
+ "<|d_42|>",
+ "<|d_43|>",
+ "<|d_44|>",
+ "<|d_45|>",
+ "<|d_46|>",
+ "<|d_47|>",
+ "<|d_48|>",
+ "<|d_49|>",
+ "<|d_50|>",
+ "<|d_51|>",
+ "<|d_52|>",
+ "<|d_53|>",
+ "<|d_54|>",
+ "<|d_55|>",
+ "<|d_56|>",
+ "<|d_57|>",
+ "<|d_58|>",
+ "<|d_59|>",
+ "<|d_60|>",
+ "<|d_61|>",
+ "<|d_62|>",
+ "<|d_63|>",
+ "<|d_64|>",
+ "<|d_65|>",
+ "<|d_66|>",
+ "<|d_67|>",
+ "<|d_68|>",
+ "<|d_69|>",
+ "<|d_70|>",
+ "<|d_71|>",
+ "<|d_72|>",
+ "<|d_73|>",
+ "<|d_74|>",
+ "<|d_75|>",
+ "<|d_76|>",
+ "<|d_77|>",
+ "<|d_78|>",
+ "<|d_79|>",
+ "<|d_80|>",
+ "<|d_81|>",
+ "<|d_82|>",
+ "<|d_83|>",
+ "<|d_84|>",
+ "<|d_85|>",
+ "<|d_86|>",
+ "<|d_87|>",
+ "<|d_88|>",
+ "<|d_89|>",
+ "<|d_90|>",
+ "<|d_91|>",
+ "<|d_92|>",
+ "<|d_93|>",
+ "<|d_94|>",
+ "<|d_95|>",
+ "<|d_96|>",
+ "<|d_97|>",
+ "<|d_98|>",
+ "<|d_99|>",
+ "<|d_100|>",
+ "<|d_101|>",
+ "<|d_102|>",
+ "<|d_103|>",
+ "<|d_104|>",
+ "<|d_105|>",
+ "<|d_106|>",
+ "<|d_107|>",
+ "<|d_108|>",
+ "<|d_109|>",
+ "<|d_110|>",
+ "<|d_111|>",
+ "<|d_112|>",
+ "<|d_113|>",
+ "<|d_114|>",
+ "<|d_115|>",
+ "<|d_116|>",
+ "<|d_117|>",
+ "<|d_118|>",
+ "<|d_119|>",
+ "<|d_120|>",
+ "<|d_121|>",
+ "<|d_122|>",
+ "<|d_123|>",
+ "<|d_124|>",
+ "<|d_125|>",
+ "<|d_126|>",
+ "<|d_127|>",
+ "<|d_128|>",
+ "<|d_129|>",
+ "<|d_130|>",
+ "<|d_131|>",
+ "<|d_132|>",
+ "<|d_133|>",
+ "<|d_134|>",
+ "<|d_135|>",
+ "<|d_136|>",
+ "<|d_137|>",
+ "<|d_138|>",
+ "<|d_139|>",
+ "<|d_140|>",
+ "<|d_141|>",
+ "<|d_142|>",
+ "<|d_143|>",
+ "<|d_144|>",
+ "<|d_145|>",
+ "<|d_146|>",
+ "<|d_147|>",
+ "<|d_148|>",
+ "<|d_149|>",
+ "<|d_150|>",
+ "<|d_151|>",
+ "<|d_152|>",
+ "<|d_153|>",
+ "<|d_154|>",
+ "<|d_155|>",
+ "<|d_156|>",
+ "<|d_157|>",
+ "<|d_158|>",
+ "<|d_159|>",
+ "<|d_160|>",
+ "<|d_161|>",
+ "<|d_162|>",
+ "<|d_163|>",
+ "<|d_164|>",
+ "<|d_165|>",
+ "<|d_166|>",
+ "<|d_167|>",
+ "<|d_168|>",
+ "<|d_169|>",
+ "<|d_170|>",
+ "<|d_171|>",
+ "<|d_172|>",
+ "<|d_173|>",
+ "<|d_174|>",
+ "<|d_175|>",
+ "<|d_176|>",
+ "<|d_177|>",
+ "<|d_178|>",
+ "<|d_179|>",
+ "<|d_180|>",
+ "<|d_181|>",
+ "<|d_182|>",
+ "<|d_183|>",
+ "<|d_184|>",
+ "<|d_185|>",
+ "<|d_186|>",
+ "<|d_187|>",
+ "<|d_188|>",
+ "<|d_189|>",
+ "<|d_190|>",
+ "<|d_191|>",
+ "<|d_192|>",
+ "<|d_193|>",
+ "<|d_194|>",
+ "<|d_195|>",
+ "<|d_196|>",
+ "<|d_197|>",
+ "<|d_198|>",
+ "<|d_199|>",
+ "<|d_200|>",
+ "<|d_201|>",
+ "<|d_202|>",
+ "<|d_203|>",
+ "<|d_204|>",
+ "<|d_205|>",
+ "<|d_206|>",
+ "<|d_207|>",
+ "<|d_208|>",
+ "<|d_209|>",
+ "<|d_210|>",
+ "<|d_211|>",
+ "<|d_212|>",
+ "<|d_213|>",
+ "<|d_214|>",
+ "<|d_215|>",
+ "<|d_216|>",
+ "<|d_217|>",
+ "<|d_218|>",
+ "<|d_219|>",
+ "<|d_220|>",
+ "<|d_221|>",
+ "<|d_222|>",
+ "<|d_223|>",
+ "<|d_224|>",
+ "<|d_225|>",
+ "<|d_226|>",
+ "<|d_227|>",
+ "<|d_228|>",
+ "<|d_229|>",
+ "<|d_230|>",
+ "<|d_231|>",
+ "<|d_232|>",
+ "<|d_233|>",
+ "<|d_234|>",
+ "<|d_235|>",
+ "<|d_236|>",
+ "<|d_237|>",
+ "<|d_238|>",
+ "<|d_239|>",
+ "<|d_240|>",
+ "<|d_241|>",
+ "<|d_242|>",
+ "<|d_243|>",
+ "<|d_244|>",
+ "<|d_245|>",
+ "<|d_246|>",
+ "<|d_247|>",
+ "<|d_248|>",
+ "<|d_249|>",
+ "<|d_250|>",
+ "<|d_251|>",
+ "<|d_252|>",
1023
+ "<|d_253|>",
1024
+ "<|d_254|>",
1025
+ "<|d_255|>",
1026
+ "<|d_256|>",
1027
+ "<|e_1|>",
1028
+ "<|e_2|>",
1029
+ "<|e_3|>",
1030
+ "<|e_4|>",
1031
+ "<|e_5|>",
1032
+ "<|e_6|>",
1033
+ "<|e_7|>",
1034
+ "<|e_8|>",
1035
+ "<|e_9|>",
1036
+ "<|e_10|>",
1037
+ "<|e_11|>",
1038
+ "<|e_12|>",
1039
+ "<|e_13|>",
1040
+ "<|e_14|>",
1041
+ "<|e_15|>",
1042
+ "<|e_16|>",
1043
+ "<|e_17|>",
1044
+ "<|e_18|>",
1045
+ "<|e_19|>",
1046
+ "<|e_20|>",
1047
+ "<|e_21|>",
1048
+ "<|e_22|>",
1049
+ "<|e_23|>",
1050
+ "<|e_24|>",
1051
+ "<|e_25|>",
1052
+ "<|e_26|>",
1053
+ "<|e_27|>",
1054
+ "<|e_28|>",
1055
+ "<|e_29|>",
1056
+ "<|e_30|>",
1057
+ "<|e_31|>",
1058
+ "<|e_32|>",
1059
+ "<|e_33|>",
1060
+ "<|e_34|>",
1061
+ "<|e_35|>",
1062
+ "<|e_36|>",
1063
+ "<|e_37|>",
1064
+ "<|e_38|>",
1065
+ "<|e_39|>",
1066
+ "<|e_40|>",
1067
+ "<|e_41|>",
1068
+ "<|e_42|>",
1069
+ "<|e_43|>",
1070
+ "<|e_44|>",
1071
+ "<|e_45|>",
1072
+ "<|e_46|>",
1073
+ "<|e_47|>",
1074
+ "<|e_48|>",
1075
+ "<|e_49|>",
1076
+ "<|e_50|>",
1077
+ "<|e_51|>",
1078
+ "<|e_52|>",
1079
+ "<|e_53|>",
1080
+ "<|e_54|>",
1081
+ "<|e_55|>",
1082
+ "<|e_56|>",
1083
+ "<|e_57|>",
1084
+ "<|e_58|>",
1085
+ "<|e_59|>",
1086
+ "<|e_60|>",
1087
+ "<|e_61|>",
1088
+ "<|e_62|>",
1089
+ "<|e_63|>",
1090
+ "<|e_64|>",
1091
+ "<|e_65|>",
1092
+ "<|e_66|>",
1093
+ "<|e_67|>",
1094
+ "<|e_68|>",
1095
+ "<|e_69|>",
1096
+ "<|e_70|>",
1097
+ "<|e_71|>",
1098
+ "<|e_72|>",
1099
+ "<|e_73|>",
1100
+ "<|e_74|>",
1101
+ "<|e_75|>",
1102
+ "<|e_76|>",
1103
+ "<|e_77|>",
1104
+ "<|e_78|>",
1105
+ "<|e_79|>",
1106
+ "<|e_80|>",
1107
+ "<|e_81|>",
1108
+ "<|e_82|>",
1109
+ "<|e_83|>",
1110
+ "<|e_84|>",
1111
+ "<|e_85|>",
1112
+ "<|e_86|>",
1113
+ "<|e_87|>",
1114
+ "<|e_88|>",
1115
+ "<|e_89|>",
1116
+ "<|e_90|>",
1117
+ "<|e_91|>",
1118
+ "<|e_92|>",
1119
+ "<|e_93|>",
1120
+ "<|e_94|>",
1121
+ "<|e_95|>",
1122
+ "<|e_96|>",
1123
+ "<|e_97|>",
1124
+ "<|e_98|>",
1125
+ "<|e_99|>",
1126
+ "<|e_100|>",
1127
+ "<|e_101|>",
1128
+ "<|e_102|>",
1129
+ "<|e_103|>",
1130
+ "<|e_104|>",
1131
+ "<|e_105|>",
1132
+ "<|e_106|>",
1133
+ "<|e_107|>",
1134
+ "<|e_108|>",
1135
+ "<|e_109|>",
1136
+ "<|e_110|>",
1137
+ "<|e_111|>",
1138
+ "<|e_112|>",
1139
+ "<|e_113|>",
1140
+ "<|e_114|>",
1141
+ "<|e_115|>",
1142
+ "<|e_116|>",
1143
+ "<|e_117|>",
1144
+ "<|e_118|>",
1145
+ "<|e_119|>",
1146
+ "<|e_120|>",
1147
+ "<|e_121|>",
1148
+ "<|e_122|>",
1149
+ "<|e_123|>",
1150
+ "<|e_124|>",
1151
+ "<|e_125|>",
1152
+ "<|e_126|>",
1153
+ "<|e_127|>",
1154
+ "<|e_128|>",
1155
+ "<|e_129|>",
1156
+ "<|e_130|>",
1157
+ "<|e_131|>",
1158
+ "<|e_132|>",
1159
+ "<|e_133|>",
1160
+ "<|e_134|>",
1161
+ "<|e_135|>",
1162
+ "<|e_136|>",
1163
+ "<|e_137|>",
1164
+ "<|e_138|>",
1165
+ "<|e_139|>",
1166
+ "<|e_140|>",
1167
+ "<|e_141|>",
1168
+ "<|e_142|>",
1169
+ "<|e_143|>",
1170
+ "<|e_144|>",
1171
+ "<|e_145|>",
1172
+ "<|e_146|>",
1173
+ "<|e_147|>",
1174
+ "<|e_148|>",
1175
+ "<|e_149|>",
1176
+ "<|e_150|>",
1177
+ "<|e_151|>",
1178
+ "<|e_152|>",
1179
+ "<|e_153|>",
1180
+ "<|e_154|>",
1181
+ "<|e_155|>",
1182
+ "<|e_156|>",
1183
+ "<|e_157|>",
1184
+ "<|e_158|>",
1185
+ "<|e_159|>",
1186
+ "<|e_160|>",
1187
+ "<|e_161|>",
1188
+ "<|e_162|>",
1189
+ "<|e_163|>",
1190
+ "<|e_164|>",
1191
+ "<|e_165|>",
1192
+ "<|e_166|>",
1193
+ "<|e_167|>",
1194
+ "<|e_168|>",
1195
+ "<|e_169|>",
1196
+ "<|e_170|>",
1197
+ "<|e_171|>",
1198
+ "<|e_172|>",
1199
+ "<|e_173|>",
1200
+ "<|e_174|>",
1201
+ "<|e_175|>",
1202
+ "<|e_176|>",
1203
+ "<|e_177|>",
1204
+ "<|e_178|>",
1205
+ "<|e_179|>",
1206
+ "<|e_180|>",
1207
+ "<|e_181|>",
1208
+ "<|e_182|>",
1209
+ "<|e_183|>",
1210
+ "<|e_184|>",
1211
+ "<|e_185|>",
1212
+ "<|e_186|>",
1213
+ "<|e_187|>",
1214
+ "<|e_188|>",
1215
+ "<|e_189|>",
1216
+ "<|e_190|>",
1217
+ "<|e_191|>",
1218
+ "<|e_192|>",
1219
+ "<|e_193|>",
1220
+ "<|e_194|>",
1221
+ "<|e_195|>",
1222
+ "<|e_196|>",
1223
+ "<|e_197|>",
1224
+ "<|e_198|>",
1225
+ "<|e_199|>",
1226
+ "<|e_200|>",
1227
+ "<|e_201|>",
1228
+ "<|e_202|>",
1229
+ "<|e_203|>",
1230
+ "<|e_204|>",
1231
+ "<|e_205|>",
1232
+ "<|e_206|>",
1233
+ "<|e_207|>",
1234
+ "<|e_208|>",
1235
+ "<|e_209|>",
1236
+ "<|e_210|>",
1237
+ "<|e_211|>",
1238
+ "<|e_212|>",
1239
+ "<|e_213|>",
1240
+ "<|e_214|>",
1241
+ "<|e_215|>",
1242
+ "<|e_216|>",
1243
+ "<|e_217|>",
1244
+ "<|e_218|>",
1245
+ "<|e_219|>",
1246
+ "<|e_220|>",
1247
+ "<|e_221|>",
1248
+ "<|e_222|>",
1249
+ "<|e_223|>",
1250
+ "<|e_224|>",
1251
+ "<|e_225|>",
1252
+ "<|e_226|>",
1253
+ "<|e_227|>",
1254
+ "<|e_228|>",
1255
+ "<|e_229|>",
1256
+ "<|e_230|>",
1257
+ "<|e_231|>",
1258
+ "<|e_232|>",
1259
+ "<|e_233|>",
1260
+ "<|e_234|>",
1261
+ "<|e_235|>",
1262
+ "<|e_236|>",
1263
+ "<|e_237|>",
1264
+ "<|e_238|>",
1265
+ "<|e_239|>",
1266
+ "<|e_240|>",
1267
+ "<|e_241|>",
1268
+ "<|e_242|>",
1269
+ "<|e_243|>",
1270
+ "<|e_244|>",
1271
+ "<|e_245|>",
1272
+ "<|e_246|>",
1273
+ "<|e_247|>",
1274
+ "<|e_248|>",
1275
+ "<|e_249|>",
1276
+ "<|e_250|>",
1277
+ "<|e_251|>",
1278
+ "<|e_252|>",
1279
+ "<|e_253|>",
1280
+ "<|e_254|>",
1281
+ "<|e_255|>",
1282
+ "<|e_256|>"
1283
+ ],
1284
+ "eos_token": {
1285
+ "content": "<|im_end|>",
1286
+ "lstrip": false,
1287
+ "normalized": false,
1288
+ "rstrip": false,
1289
+ "single_word": false
1290
+ },
1291
+ "pad_token": {
1292
+ "content": "<|endoftext|>",
1293
+ "lstrip": false,
1294
+ "normalized": false,
1295
+ "rstrip": false,
1296
+ "single_word": false
1297
+ }
1298
+ }
qwen3_4b_beauty/tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3bcd0ab896130d768fa805cf9a6ffeacac3cbb78921e1b68a664da7735314452
+ size 11660194
qwen3_4b_beauty/tokenizer_config.json ADDED
The diff for this file is too large to render. See raw diff
 
qwen3_4b_beauty/vocab.json ADDED
The diff for this file is too large to render. See raw diff
 
qwen3_4b_instruments/added_tokens.json ADDED
@@ -0,0 +1,1308 @@
+ {
+ "</think>": 151668,
+ "</tool_call>": 151658,
+ "</tool_response>": 151666,
+ "<think>": 151667,
+ "<tool_call>": 151657,
+ "<tool_response>": 151665,
+ "<|a_100|>": 151768,
+ "<|a_101|>": 151769,
+ "<|a_102|>": 151770,
+ "<|a_103|>": 151771,
+ "<|a_104|>": 151772,
+ "<|a_105|>": 151773,
+ "<|a_106|>": 151774,
+ "<|a_107|>": 151775,
+ "<|a_108|>": 151776,
+ "<|a_109|>": 151777,
+ "<|a_10|>": 151678,
+ "<|a_110|>": 151778,
+ "<|a_111|>": 151779,
+ "<|a_112|>": 151780,
+ "<|a_113|>": 151781,
+ "<|a_114|>": 151782,
+ "<|a_115|>": 151783,
+ "<|a_116|>": 151784,
+ "<|a_117|>": 151785,
+ "<|a_118|>": 151786,
+ "<|a_119|>": 151787,
+ "<|a_11|>": 151679,
+ "<|a_120|>": 151788,
+ "<|a_121|>": 151789,
+ "<|a_122|>": 151790,
+ "<|a_123|>": 151791,
+ "<|a_124|>": 151792,
+ "<|a_125|>": 151793,
+ "<|a_126|>": 151794,
+ "<|a_127|>": 151795,
+ "<|a_128|>": 151796,
+ "<|a_129|>": 151797,
+ "<|a_12|>": 151680,
+ "<|a_130|>": 151798,
+ "<|a_131|>": 151799,
+ "<|a_132|>": 151800,
+ "<|a_133|>": 151801,
+ "<|a_134|>": 151802,
+ "<|a_135|>": 151803,
+ "<|a_136|>": 151804,
+ "<|a_137|>": 151805,
+ "<|a_138|>": 151806,
+ "<|a_139|>": 151807,
+ "<|a_13|>": 151681,
+ "<|a_140|>": 151808,
+ "<|a_141|>": 151809,
+ "<|a_142|>": 151810,
+ "<|a_143|>": 151811,
+ "<|a_144|>": 151812,
+ "<|a_145|>": 151813,
+ "<|a_146|>": 151814,
+ "<|a_147|>": 151815,
+ "<|a_148|>": 151816,
+ "<|a_149|>": 151817,
+ "<|a_14|>": 151682,
+ "<|a_150|>": 151818,
+ "<|a_151|>": 151819,
+ "<|a_152|>": 151820,
+ "<|a_153|>": 151821,
+ "<|a_154|>": 151822,
+ "<|a_155|>": 151823,
+ "<|a_156|>": 151824,
+ "<|a_157|>": 151825,
+ "<|a_158|>": 151826,
+ "<|a_159|>": 151827,
+ "<|a_15|>": 151683,
+ "<|a_160|>": 151828,
+ "<|a_161|>": 151829,
+ "<|a_162|>": 151830,
+ "<|a_163|>": 151831,
+ "<|a_164|>": 151832,
+ "<|a_165|>": 151833,
+ "<|a_166|>": 151834,
+ "<|a_167|>": 151835,
+ "<|a_168|>": 151836,
+ "<|a_169|>": 151837,
+ "<|a_16|>": 151684,
+ "<|a_170|>": 151838,
+ "<|a_171|>": 151839,
+ "<|a_172|>": 151840,
+ "<|a_173|>": 151841,
+ "<|a_174|>": 151842,
+ "<|a_175|>": 151843,
+ "<|a_176|>": 151844,
+ "<|a_177|>": 151845,
+ "<|a_178|>": 151846,
+ "<|a_179|>": 151847,
+ "<|a_17|>": 151685,
+ "<|a_180|>": 151848,
+ "<|a_181|>": 151849,
+ "<|a_182|>": 151850,
+ "<|a_183|>": 151851,
+ "<|a_184|>": 151852,
+ "<|a_185|>": 151853,
+ "<|a_186|>": 151854,
+ "<|a_187|>": 151855,
+ "<|a_188|>": 151856,
+ "<|a_189|>": 151857,
+ "<|a_18|>": 151686,
+ "<|a_190|>": 151858,
+ "<|a_191|>": 151859,
+ "<|a_192|>": 151860,
+ "<|a_193|>": 151861,
+ "<|a_194|>": 151862,
+ "<|a_195|>": 151863,
+ "<|a_196|>": 151864,
+ "<|a_197|>": 151865,
+ "<|a_198|>": 151866,
+ "<|a_199|>": 151867,
+ "<|a_19|>": 151687,
+ "<|a_1|>": 151669,
+ "<|a_200|>": 151868,
+ "<|a_201|>": 151869,
+ "<|a_202|>": 151870,
+ "<|a_203|>": 151871,
+ "<|a_204|>": 151872,
+ "<|a_205|>": 151873,
+ "<|a_206|>": 151874,
+ "<|a_207|>": 151875,
+ "<|a_208|>": 151876,
+ "<|a_209|>": 151877,
+ "<|a_20|>": 151688,
+ "<|a_210|>": 151878,
+ "<|a_211|>": 151879,
+ "<|a_212|>": 151880,
+ "<|a_213|>": 151881,
+ "<|a_214|>": 151882,
+ "<|a_215|>": 151883,
+ "<|a_216|>": 151884,
+ "<|a_217|>": 151885,
+ "<|a_218|>": 151886,
+ "<|a_219|>": 151887,
+ "<|a_21|>": 151689,
+ "<|a_220|>": 151888,
+ "<|a_221|>": 151889,
+ "<|a_222|>": 151890,
+ "<|a_223|>": 151891,
+ "<|a_224|>": 151892,
+ "<|a_225|>": 151893,
+ "<|a_226|>": 151894,
+ "<|a_227|>": 151895,
+ "<|a_228|>": 151896,
+ "<|a_229|>": 151897,
+ "<|a_22|>": 151690,
+ "<|a_230|>": 151898,
+ "<|a_231|>": 151899,
+ "<|a_232|>": 151900,
+ "<|a_233|>": 151901,
+ "<|a_234|>": 151902,
+ "<|a_235|>": 151903,
+ "<|a_236|>": 151904,
+ "<|a_237|>": 151905,
+ "<|a_238|>": 151906,
+ "<|a_239|>": 151907,
+ "<|a_23|>": 151691,
+ "<|a_240|>": 151908,
+ "<|a_241|>": 151909,
+ "<|a_242|>": 151910,
+ "<|a_243|>": 151911,
+ "<|a_244|>": 151912,
+ "<|a_245|>": 151913,
+ "<|a_246|>": 151914,
+ "<|a_247|>": 151915,
+ "<|a_248|>": 151916,
+ "<|a_249|>": 151917,
+ "<|a_24|>": 151692,
+ "<|a_250|>": 151918,
+ "<|a_251|>": 151919,
+ "<|a_252|>": 151920,
+ "<|a_253|>": 151921,
+ "<|a_254|>": 151922,
+ "<|a_255|>": 151923,
+ "<|a_256|>": 151924,
+ "<|a_25|>": 151693,
+ "<|a_26|>": 151694,
+ "<|a_27|>": 151695,
+ "<|a_28|>": 151696,
+ "<|a_29|>": 151697,
+ "<|a_2|>": 151670,
+ "<|a_30|>": 151698,
+ "<|a_31|>": 151699,
+ "<|a_32|>": 151700,
+ "<|a_33|>": 151701,
+ "<|a_34|>": 151702,
+ "<|a_35|>": 151703,
+ "<|a_36|>": 151704,
+ "<|a_37|>": 151705,
+ "<|a_38|>": 151706,
+ "<|a_39|>": 151707,
+ "<|a_3|>": 151671,
+ "<|a_40|>": 151708,
+ "<|a_41|>": 151709,
+ "<|a_42|>": 151710,
+ "<|a_43|>": 151711,
+ "<|a_44|>": 151712,
+ "<|a_45|>": 151713,
+ "<|a_46|>": 151714,
+ "<|a_47|>": 151715,
+ "<|a_48|>": 151716,
+ "<|a_49|>": 151717,
+ "<|a_4|>": 151672,
+ "<|a_50|>": 151718,
+ "<|a_51|>": 151719,
+ "<|a_52|>": 151720,
+ "<|a_53|>": 151721,
+ "<|a_54|>": 151722,
+ "<|a_55|>": 151723,
+ "<|a_56|>": 151724,
+ "<|a_57|>": 151725,
+ "<|a_58|>": 151726,
+ "<|a_59|>": 151727,
+ "<|a_5|>": 151673,
+ "<|a_60|>": 151728,
+ "<|a_61|>": 151729,
+ "<|a_62|>": 151730,
+ "<|a_63|>": 151731,
+ "<|a_64|>": 151732,
+ "<|a_65|>": 151733,
+ "<|a_66|>": 151734,
+ "<|a_67|>": 151735,
+ "<|a_68|>": 151736,
+ "<|a_69|>": 151737,
+ "<|a_6|>": 151674,
+ "<|a_70|>": 151738,
+ "<|a_71|>": 151739,
+ "<|a_72|>": 151740,
+ "<|a_73|>": 151741,
+ "<|a_74|>": 151742,
+ "<|a_75|>": 151743,
+ "<|a_76|>": 151744,
+ "<|a_77|>": 151745,
+ "<|a_78|>": 151746,
+ "<|a_79|>": 151747,
+ "<|a_7|>": 151675,
+ "<|a_80|>": 151748,
+ "<|a_81|>": 151749,
+ "<|a_82|>": 151750,
+ "<|a_83|>": 151751,
+ "<|a_84|>": 151752,
+ "<|a_85|>": 151753,
+ "<|a_86|>": 151754,
+ "<|a_87|>": 151755,
+ "<|a_88|>": 151756,
+ "<|a_89|>": 151757,
+ "<|a_8|>": 151676,
+ "<|a_90|>": 151758,
+ "<|a_91|>": 151759,
+ "<|a_92|>": 151760,
+ "<|a_93|>": 151761,
+ "<|a_94|>": 151762,
+ "<|a_95|>": 151763,
+ "<|a_96|>": 151764,
+ "<|a_97|>": 151765,
+ "<|a_98|>": 151766,
+ "<|a_99|>": 151767,
+ "<|a_9|>": 151677,
+ "<|b_100|>": 152024,
+ "<|b_101|>": 152025,
+ "<|b_102|>": 152026,
+ "<|b_103|>": 152027,
+ "<|b_104|>": 152028,
+ "<|b_105|>": 152029,
+ "<|b_106|>": 152030,
+ "<|b_107|>": 152031,
+ "<|b_108|>": 152032,
+ "<|b_109|>": 152033,
+ "<|b_10|>": 151934,
+ "<|b_110|>": 152034,
+ "<|b_111|>": 152035,
+ "<|b_112|>": 152036,
+ "<|b_113|>": 152037,
+ "<|b_114|>": 152038,
+ "<|b_115|>": 152039,
+ "<|b_116|>": 152040,
+ "<|b_117|>": 152041,
+ "<|b_118|>": 152042,
+ "<|b_119|>": 152043,
+ "<|b_11|>": 151935,
+ "<|b_120|>": 152044,
+ "<|b_121|>": 152045,
+ "<|b_122|>": 152046,
+ "<|b_123|>": 152047,
+ "<|b_124|>": 152048,
+ "<|b_125|>": 152049,
+ "<|b_126|>": 152050,
+ "<|b_127|>": 152051,
+ "<|b_128|>": 152052,
+ "<|b_129|>": 152053,
+ "<|b_12|>": 151936,
+ "<|b_130|>": 152054,
+ "<|b_131|>": 152055,
+ "<|b_132|>": 152056,
+ "<|b_133|>": 152057,
+ "<|b_134|>": 152058,
+ "<|b_135|>": 152059,
+ "<|b_136|>": 152060,
+ "<|b_137|>": 152061,
+ "<|b_138|>": 152062,
+ "<|b_139|>": 152063,
+ "<|b_13|>": 151937,
+ "<|b_140|>": 152064,
+ "<|b_141|>": 152065,
+ "<|b_142|>": 152066,
+ "<|b_143|>": 152067,
+ "<|b_144|>": 152068,
+ "<|b_145|>": 152069,
+ "<|b_146|>": 152070,
+ "<|b_147|>": 152071,
+ "<|b_148|>": 152072,
+ "<|b_149|>": 152073,
+ "<|b_14|>": 151938,
+ "<|b_150|>": 152074,
+ "<|b_151|>": 152075,
+ "<|b_152|>": 152076,
+ "<|b_153|>": 152077,
+ "<|b_154|>": 152078,
+ "<|b_155|>": 152079,
+ "<|b_156|>": 152080,
+ "<|b_157|>": 152081,
+ "<|b_158|>": 152082,
+ "<|b_159|>": 152083,
+ "<|b_15|>": 151939,
+ "<|b_160|>": 152084,
+ "<|b_161|>": 152085,
+ "<|b_162|>": 152086,
+ "<|b_163|>": 152087,
+ "<|b_164|>": 152088,
+ "<|b_165|>": 152089,
+ "<|b_166|>": 152090,
+ "<|b_167|>": 152091,
+ "<|b_168|>": 152092,
+ "<|b_169|>": 152093,
+ "<|b_16|>": 151940,
+ "<|b_170|>": 152094,
+ "<|b_171|>": 152095,
+ "<|b_172|>": 152096,
+ "<|b_173|>": 152097,
+ "<|b_174|>": 152098,
+ "<|b_175|>": 152099,
+ "<|b_176|>": 152100,
+ "<|b_177|>": 152101,
+ "<|b_178|>": 152102,
+ "<|b_179|>": 152103,
+ "<|b_17|>": 151941,
+ "<|b_180|>": 152104,
+ "<|b_181|>": 152105,
+ "<|b_182|>": 152106,
+ "<|b_183|>": 152107,
+ "<|b_184|>": 152108,
+ "<|b_185|>": 152109,
+ "<|b_186|>": 152110,
+ "<|b_187|>": 152111,
+ "<|b_188|>": 152112,
+ "<|b_189|>": 152113,
+ "<|b_18|>": 151942,
+ "<|b_190|>": 152114,
+ "<|b_191|>": 152115,
+ "<|b_192|>": 152116,
+ "<|b_193|>": 152117,
+ "<|b_194|>": 152118,
+ "<|b_195|>": 152119,
+ "<|b_196|>": 152120,
+ "<|b_197|>": 152121,
+ "<|b_198|>": 152122,
+ "<|b_199|>": 152123,
+ "<|b_19|>": 151943,
+ "<|b_1|>": 151925,
+ "<|b_200|>": 152124,
+ "<|b_201|>": 152125,
+ "<|b_202|>": 152126,
+ "<|b_203|>": 152127,
+ "<|b_204|>": 152128,
+ "<|b_205|>": 152129,
+ "<|b_206|>": 152130,
+ "<|b_207|>": 152131,
+ "<|b_208|>": 152132,
+ "<|b_209|>": 152133,
+ "<|b_20|>": 151944,
+ "<|b_210|>": 152134,
+ "<|b_211|>": 152135,
+ "<|b_212|>": 152136,
+ "<|b_213|>": 152137,
+ "<|b_214|>": 152138,
+ "<|b_215|>": 152139,
+ "<|b_216|>": 152140,
+ "<|b_217|>": 152141,
+ "<|b_218|>": 152142,
+ "<|b_219|>": 152143,
+ "<|b_21|>": 151945,
+ "<|b_220|>": 152144,
+ "<|b_221|>": 152145,
+ "<|b_222|>": 152146,
+ "<|b_223|>": 152147,
+ "<|b_224|>": 152148,
+ "<|b_225|>": 152149,
+ "<|b_226|>": 152150,
+ "<|b_227|>": 152151,
+ "<|b_228|>": 152152,
+ "<|b_229|>": 152153,
+ "<|b_22|>": 151946,
+ "<|b_230|>": 152154,
+ "<|b_231|>": 152155,
+ "<|b_232|>": 152156,
+ "<|b_233|>": 152157,
+ "<|b_234|>": 152158,
+ "<|b_235|>": 152159,
+ "<|b_236|>": 152160,
+ "<|b_237|>": 152161,
+ "<|b_238|>": 152162,
+ "<|b_239|>": 152163,
+ "<|b_23|>": 151947,
+ "<|b_240|>": 152164,
+ "<|b_241|>": 152165,
+ "<|b_242|>": 152166,
+ "<|b_243|>": 152167,
+ "<|b_244|>": 152168,
+ "<|b_245|>": 152169,
+ "<|b_246|>": 152170,
+ "<|b_247|>": 152171,
+ "<|b_248|>": 152172,
+ "<|b_249|>": 152173,
+ "<|b_24|>": 151948,
+ "<|b_250|>": 152174,
+ "<|b_251|>": 152175,
+ "<|b_252|>": 152176,
+ "<|b_253|>": 152177,
+ "<|b_254|>": 152178,
+ "<|b_255|>": 152179,
+ "<|b_256|>": 152180,
+ "<|b_25|>": 151949,
+ "<|b_26|>": 151950,
+ "<|b_27|>": 151951,
+ "<|b_28|>": 151952,
+ "<|b_29|>": 151953,
+ "<|b_2|>": 151926,
+ "<|b_30|>": 151954,
+ "<|b_31|>": 151955,
+ "<|b_32|>": 151956,
+ "<|b_33|>": 151957,
+ "<|b_34|>": 151958,
+ "<|b_35|>": 151959,
+ "<|b_36|>": 151960,
+ "<|b_37|>": 151961,
+ "<|b_38|>": 151962,
+ "<|b_39|>": 151963,
+ "<|b_3|>": 151927,
+ "<|b_40|>": 151964,
+ "<|b_41|>": 151965,
+ "<|b_42|>": 151966,
+ "<|b_43|>": 151967,
+ "<|b_44|>": 151968,
+ "<|b_45|>": 151969,
+ "<|b_46|>": 151970,
+ "<|b_47|>": 151971,
+ "<|b_48|>": 151972,
+ "<|b_49|>": 151973,
+ "<|b_4|>": 151928,
+ "<|b_50|>": 151974,
+ "<|b_51|>": 151975,
+ "<|b_52|>": 151976,
+ "<|b_53|>": 151977,
+ "<|b_54|>": 151978,
+ "<|b_55|>": 151979,
+ "<|b_56|>": 151980,
+ "<|b_57|>": 151981,
+ "<|b_58|>": 151982,
+ "<|b_59|>": 151983,
+ "<|b_5|>": 151929,
+ "<|b_60|>": 151984,
+ "<|b_61|>": 151985,
+ "<|b_62|>": 151986,
+ "<|b_63|>": 151987,
+ "<|b_64|>": 151988,
+ "<|b_65|>": 151989,
+ "<|b_66|>": 151990,
+ "<|b_67|>": 151991,
+ "<|b_68|>": 151992,
+ "<|b_69|>": 151993,
+ "<|b_6|>": 151930,
+ "<|b_70|>": 151994,
+ "<|b_71|>": 151995,
+ "<|b_72|>": 151996,
+ "<|b_73|>": 151997,
+ "<|b_74|>": 151998,
+ "<|b_75|>": 151999,
+ "<|b_76|>": 152000,
+ "<|b_77|>": 152001,
+ "<|b_78|>": 152002,
+ "<|b_79|>": 152003,
+ "<|b_7|>": 151931,
+ "<|b_80|>": 152004,
+ "<|b_81|>": 152005,
+ "<|b_82|>": 152006,
+ "<|b_83|>": 152007,
+ "<|b_84|>": 152008,
+ "<|b_85|>": 152009,
+ "<|b_86|>": 152010,
+ "<|b_87|>": 152011,
+ "<|b_88|>": 152012,
+ "<|b_89|>": 152013,
+ "<|b_8|>": 151932,
+ "<|b_90|>": 152014,
+ "<|b_91|>": 152015,
+ "<|b_92|>": 152016,
+ "<|b_93|>": 152017,
+ "<|b_94|>": 152018,
+ "<|b_95|>": 152019,
+ "<|b_96|>": 152020,
+ "<|b_97|>": 152021,
+ "<|b_98|>": 152022,
+ "<|b_99|>": 152023,
+ "<|b_9|>": 151933,
+ "<|box_end|>": 151649,
+ "<|box_start|>": 151648,
+ "<|c_100|>": 152280,
+ "<|c_101|>": 152281,
+ "<|c_102|>": 152282,
+ "<|c_103|>": 152283,
+ "<|c_104|>": 152284,
+ "<|c_105|>": 152285,
+ "<|c_106|>": 152286,
+ "<|c_107|>": 152287,
+ "<|c_108|>": 152288,
+ "<|c_109|>": 152289,
+ "<|c_10|>": 152190,
+ "<|c_110|>": 152290,
+ "<|c_111|>": 152291,
+ "<|c_112|>": 152292,
+ "<|c_113|>": 152293,
+ "<|c_114|>": 152294,
+ "<|c_115|>": 152295,
+ "<|c_116|>": 152296,
+ "<|c_117|>": 152297,
+ "<|c_118|>": 152298,
+ "<|c_119|>": 152299,
+ "<|c_11|>": 152191,
+ "<|c_120|>": 152300,
+ "<|c_121|>": 152301,
+ "<|c_122|>": 152302,
+ "<|c_123|>": 152303,
+ "<|c_124|>": 152304,
+ "<|c_125|>": 152305,
+ "<|c_126|>": 152306,
+ "<|c_127|>": 152307,
+ "<|c_128|>": 152308,
+ "<|c_129|>": 152309,
+ "<|c_12|>": 152192,
+ "<|c_130|>": 152310,
+ "<|c_131|>": 152311,
+ "<|c_132|>": 152312,
+ "<|c_133|>": 152313,
+ "<|c_134|>": 152314,
+ "<|c_135|>": 152315,
+ "<|c_136|>": 152316,
+ "<|c_137|>": 152317,
+ "<|c_138|>": 152318,
+ "<|c_139|>": 152319,
+ "<|c_13|>": 152193,
+ "<|c_140|>": 152320,
+ "<|c_141|>": 152321,
+ "<|c_142|>": 152322,
+ "<|c_143|>": 152323,
+ "<|c_144|>": 152324,
+ "<|c_145|>": 152325,
+ "<|c_146|>": 152326,
+ "<|c_147|>": 152327,
+ "<|c_148|>": 152328,
+ "<|c_149|>": 152329,
+ "<|c_14|>": 152194,
+ "<|c_150|>": 152330,
+ "<|c_151|>": 152331,
+ "<|c_152|>": 152332,
+ "<|c_153|>": 152333,
+ "<|c_154|>": 152334,
+ "<|c_155|>": 152335,
+ "<|c_156|>": 152336,
+ "<|c_157|>": 152337,
+ "<|c_158|>": 152338,
+ "<|c_159|>": 152339,
+ "<|c_15|>": 152195,
+ "<|c_160|>": 152340,
+ "<|c_161|>": 152341,
+ "<|c_162|>": 152342,
+ "<|c_163|>": 152343,
+ "<|c_164|>": 152344,
+ "<|c_165|>": 152345,
+ "<|c_166|>": 152346,
+ "<|c_167|>": 152347,
+ "<|c_168|>": 152348,
+ "<|c_169|>": 152349,
+ "<|c_16|>": 152196,
+ "<|c_170|>": 152350,
+ "<|c_171|>": 152351,
+ "<|c_172|>": 152352,
+ "<|c_173|>": 152353,
+ "<|c_174|>": 152354,
+ "<|c_175|>": 152355,
+ "<|c_176|>": 152356,
+ "<|c_177|>": 152357,
+ "<|c_178|>": 152358,
+ "<|c_179|>": 152359,
+ "<|c_17|>": 152197,
+ "<|c_180|>": 152360,
+ "<|c_181|>": 152361,
+ "<|c_182|>": 152362,
+ "<|c_183|>": 152363,
+ "<|c_184|>": 152364,
+ "<|c_185|>": 152365,
+ "<|c_186|>": 152366,
+ "<|c_187|>": 152367,
+ "<|c_188|>": 152368,
+ "<|c_189|>": 152369,
+ "<|c_18|>": 152198,
+ "<|c_190|>": 152370,
+ "<|c_191|>": 152371,
+ "<|c_192|>": 152372,
+ "<|c_193|>": 152373,
+ "<|c_194|>": 152374,
+ "<|c_195|>": 152375,
+ "<|c_196|>": 152376,
+ "<|c_197|>": 152377,
+ "<|c_198|>": 152378,
+ "<|c_199|>": 152379,
+ "<|c_19|>": 152199,
+ "<|c_1|>": 152181,
+ "<|c_200|>": 152380,
+ "<|c_201|>": 152381,
+ "<|c_202|>": 152382,
+ "<|c_203|>": 152383,
+ "<|c_204|>": 152384,
+ "<|c_205|>": 152385,
+ "<|c_206|>": 152386,
+ "<|c_207|>": 152387,
+ "<|c_208|>": 152388,
+ "<|c_209|>": 152389,
+ "<|c_20|>": 152200,
+ "<|c_210|>": 152390,
+ "<|c_211|>": 152391,
+ "<|c_212|>": 152392,
+ "<|c_213|>": 152393,
+ "<|c_214|>": 152394,
+ "<|c_215|>": 152395,
+ "<|c_216|>": 152396,
+ "<|c_217|>": 152397,
+ "<|c_218|>": 152398,
+ "<|c_219|>": 152399,
+ "<|c_21|>": 152201,
+ "<|c_220|>": 152400,
+ "<|c_221|>": 152401,
+ "<|c_222|>": 152402,
+ "<|c_223|>": 152403,
+ "<|c_224|>": 152404,
+ "<|c_225|>": 152405,
+ "<|c_226|>": 152406,
+ "<|c_227|>": 152407,
+ "<|c_228|>": 152408,
+ "<|c_229|>": 152409,
+ "<|c_22|>": 152202,
+ "<|c_230|>": 152410,
+ "<|c_231|>": 152411,
+ "<|c_232|>": 152412,
+ "<|c_233|>": 152413,
+ "<|c_234|>": 152414,
+ "<|c_235|>": 152415,
+ "<|c_236|>": 152416,
+ "<|c_237|>": 152417,
+ "<|c_238|>": 152418,
+ "<|c_239|>": 152419,
+ "<|c_23|>": 152203,
+ "<|c_240|>": 152420,
+ "<|c_241|>": 152421,
+ "<|c_242|>": 152422,
+ "<|c_243|>": 152423,
+ "<|c_244|>": 152424,
+ "<|c_245|>": 152425,
+ "<|c_246|>": 152426,
+ "<|c_247|>": 152427,
+ "<|c_248|>": 152428,
+ "<|c_249|>": 152429,
+ "<|c_24|>": 152204,
+ "<|c_250|>": 152430,
+ "<|c_251|>": 152431,
+ "<|c_252|>": 152432,
+ "<|c_253|>": 152433,
+ "<|c_254|>": 152434,
+ "<|c_255|>": 152435,
+ "<|c_256|>": 152436,
+ "<|c_25|>": 152205,
+ "<|c_26|>": 152206,
+ "<|c_27|>": 152207,
+ "<|c_28|>": 152208,
+ "<|c_29|>": 152209,
+ "<|c_2|>": 152182,
+ "<|c_30|>": 152210,
+ "<|c_31|>": 152211,
+ "<|c_32|>": 152212,
+ "<|c_33|>": 152213,
+ "<|c_34|>": 152214,
+ "<|c_35|>": 152215,
+ "<|c_36|>": 152216,
+ "<|c_37|>": 152217,
+ "<|c_38|>": 152218,
+ "<|c_39|>": 152219,
+ "<|c_3|>": 152183,
+ "<|c_40|>": 152220,
+ "<|c_41|>": 152221,
+ "<|c_42|>": 152222,
+ "<|c_43|>": 152223,
+ "<|c_44|>": 152224,
+ "<|c_45|>": 152225,
+ "<|c_46|>": 152226,
+ "<|c_47|>": 152227,
+ "<|c_48|>": 152228,
+ "<|c_49|>": 152229,
+ "<|c_4|>": 152184,
+ "<|c_50|>": 152230,
+ "<|c_51|>": 152231,
+ "<|c_52|>": 152232,
+ "<|c_53|>": 152233,
+ "<|c_54|>": 152234,
+ "<|c_55|>": 152235,
+ "<|c_56|>": 152236,
+ "<|c_57|>": 152237,
+ "<|c_58|>": 152238,
+ "<|c_59|>": 152239,
+ "<|c_5|>": 152185,
+ "<|c_60|>": 152240,
+ "<|c_61|>": 152241,
+ "<|c_62|>": 152242,
+ "<|c_63|>": 152243,
+ "<|c_64|>": 152244,
+ "<|c_65|>": 152245,
+ "<|c_66|>": 152246,
+ "<|c_67|>": 152247,
+ "<|c_68|>": 152248,
+ "<|c_69|>": 152249,
+ "<|c_6|>": 152186,
+ "<|c_70|>": 152250,
+ "<|c_71|>": 152251,
+ "<|c_72|>": 152252,
+ "<|c_73|>": 152253,
+ "<|c_74|>": 152254,
+ "<|c_75|>": 152255,
+ "<|c_76|>": 152256,
+ "<|c_77|>": 152257,
+ "<|c_78|>": 152258,
+ "<|c_79|>": 152259,
+ "<|c_7|>": 152187,
+ "<|c_80|>": 152260,
+ "<|c_81|>": 152261,
+ "<|c_82|>": 152262,
+ "<|c_83|>": 152263,
+ "<|c_84|>": 152264,
+ "<|c_85|>": 152265,
+ "<|c_86|>": 152266,
+ "<|c_87|>": 152267,
+ "<|c_88|>": 152268,
+ "<|c_89|>": 152269,
+ "<|c_8|>": 152188,
+ "<|c_90|>": 152270,
+ "<|c_91|>": 152271,
+ "<|c_92|>": 152272,
+ "<|c_93|>": 152273,
+ "<|c_94|>": 152274,
+ "<|c_95|>": 152275,
+ "<|c_96|>": 152276,
+ "<|c_97|>": 152277,
+ "<|c_98|>": 152278,
+ "<|c_99|>": 152279,
+ "<|c_9|>": 152189,
+ "<|d_100|>": 152536,
+ "<|d_101|>": 152537,
+ "<|d_102|>": 152538,
+ "<|d_103|>": 152539,
+ "<|d_104|>": 152540,
+ "<|d_105|>": 152541,
+ "<|d_106|>": 152542,
+ "<|d_107|>": 152543,
+ "<|d_108|>": 152544,
+ "<|d_109|>": 152545,
+ "<|d_10|>": 152446,
+ "<|d_110|>": 152546,
+ "<|d_111|>": 152547,
+ "<|d_112|>": 152548,
+ "<|d_113|>": 152549,
+ "<|d_114|>": 152550,
+ "<|d_115|>": 152551,
+ "<|d_116|>": 152552,
+ "<|d_117|>": 152553,
+ "<|d_118|>": 152554,
+ "<|d_119|>": 152555,
+ "<|d_11|>": 152447,
+ "<|d_120|>": 152556,
+ "<|d_121|>": 152557,
+ "<|d_122|>": 152558,
+ "<|d_123|>": 152559,
+ "<|d_124|>": 152560,
+ "<|d_125|>": 152561,
+ "<|d_126|>": 152562,
+ "<|d_127|>": 152563,
+ "<|d_128|>": 152564,
+ "<|d_129|>": 152565,
+ "<|d_12|>": 152448,
+ "<|d_130|>": 152566,
+ "<|d_131|>": 152567,
+ "<|d_132|>": 152568,
+ "<|d_133|>": 152569,
+ "<|d_134|>": 152570,
+ "<|d_135|>": 152571,
+ "<|d_136|>": 152572,
+ "<|d_137|>": 152573,
+ "<|d_138|>": 152574,
+ "<|d_139|>": 152575,
+ "<|d_13|>": 152449,
+ "<|d_140|>": 152576,
+ "<|d_141|>": 152577,
+ "<|d_142|>": 152578,
+ "<|d_143|>": 152579,
+ "<|d_144|>": 152580,
+ "<|d_145|>": 152581,
+ "<|d_146|>": 152582,
+ "<|d_147|>": 152583,
+ "<|d_148|>": 152584,
+ "<|d_149|>": 152585,
+ "<|d_14|>": 152450,
+ "<|d_150|>": 152586,
+ "<|d_151|>": 152587,
+ "<|d_152|>": 152588,
+ "<|d_153|>": 152589,
+ "<|d_154|>": 152590,
+ "<|d_155|>": 152591,
+ "<|d_156|>": 152592,
+ "<|d_157|>": 152593,
+ "<|d_158|>": 152594,
+ "<|d_159|>": 152595,
+ "<|d_15|>": 152451,
+ "<|d_160|>": 152596,
+ "<|d_161|>": 152597,
+ "<|d_162|>": 152598,
+ "<|d_163|>": 152599,
+ "<|d_164|>": 152600,
+ "<|d_165|>": 152601,
+ "<|d_166|>": 152602,
+ "<|d_167|>": 152603,
+ "<|d_168|>": 152604,
+ "<|d_169|>": 152605,
+ "<|d_16|>": 152452,
+ "<|d_170|>": 152606,
+ "<|d_171|>": 152607,
+ "<|d_172|>": 152608,
+ "<|d_173|>": 152609,
+ "<|d_174|>": 152610,
+ "<|d_175|>": 152611,
+ "<|d_176|>": 152612,
+ "<|d_177|>": 152613,
+ "<|d_178|>": 152614,
+ "<|d_179|>": 152615,
+ "<|d_17|>": 152453,
+ "<|d_180|>": 152616,
+ "<|d_181|>": 152617,
+ "<|d_182|>": 152618,
+ "<|d_183|>": 152619,
+ "<|d_184|>": 152620,
+ "<|d_185|>": 152621,
+ "<|d_186|>": 152622,
+ "<|d_187|>": 152623,
+ "<|d_188|>": 152624,
+ "<|d_189|>": 152625,
+ "<|d_18|>": 152454,
+ "<|d_190|>": 152626,
+ "<|d_191|>": 152627,
+ "<|d_192|>": 152628,
+ "<|d_193|>": 152629,
+ "<|d_194|>": 152630,
+ "<|d_195|>": 152631,
+ "<|d_196|>": 152632,
+ "<|d_197|>": 152633,
+ "<|d_198|>": 152634,
+ "<|d_199|>": 152635,
887
+ "<|d_19|>": 152455,
888
+ "<|d_1|>": 152437,
889
+ "<|d_200|>": 152636,
890
+ "<|d_201|>": 152637,
891
+ "<|d_202|>": 152638,
892
+ "<|d_203|>": 152639,
893
+ "<|d_204|>": 152640,
894
+ "<|d_205|>": 152641,
895
+ "<|d_206|>": 152642,
896
+ "<|d_207|>": 152643,
897
+ "<|d_208|>": 152644,
898
+ "<|d_209|>": 152645,
899
+ "<|d_20|>": 152456,
900
+ "<|d_210|>": 152646,
901
+ "<|d_211|>": 152647,
902
+ "<|d_212|>": 152648,
903
+ "<|d_213|>": 152649,
904
+ "<|d_214|>": 152650,
905
+ "<|d_215|>": 152651,
906
+ "<|d_216|>": 152652,
907
+ "<|d_217|>": 152653,
908
+ "<|d_218|>": 152654,
909
+ "<|d_219|>": 152655,
910
+ "<|d_21|>": 152457,
911
+ "<|d_220|>": 152656,
912
+ "<|d_221|>": 152657,
913
+ "<|d_222|>": 152658,
914
+ "<|d_223|>": 152659,
915
+ "<|d_224|>": 152660,
916
+ "<|d_225|>": 152661,
917
+ "<|d_226|>": 152662,
918
+ "<|d_227|>": 152663,
919
+ "<|d_228|>": 152664,
920
+ "<|d_229|>": 152665,
921
+ "<|d_22|>": 152458,
922
+ "<|d_230|>": 152666,
923
+ "<|d_231|>": 152667,
924
+ "<|d_232|>": 152668,
925
+ "<|d_233|>": 152669,
926
+ "<|d_234|>": 152670,
927
+ "<|d_235|>": 152671,
928
+ "<|d_236|>": 152672,
929
+ "<|d_237|>": 152673,
930
+ "<|d_238|>": 152674,
931
+ "<|d_239|>": 152675,
932
+ "<|d_23|>": 152459,
933
+ "<|d_240|>": 152676,
934
+ "<|d_241|>": 152677,
935
+ "<|d_242|>": 152678,
936
+ "<|d_243|>": 152679,
937
+ "<|d_244|>": 152680,
938
+ "<|d_245|>": 152681,
939
+ "<|d_246|>": 152682,
940
+ "<|d_247|>": 152683,
941
+ "<|d_248|>": 152684,
942
+ "<|d_249|>": 152685,
943
+ "<|d_24|>": 152460,
944
+ "<|d_250|>": 152686,
945
+ "<|d_251|>": 152687,
946
+ "<|d_252|>": 152688,
947
+ "<|d_253|>": 152689,
948
+ "<|d_254|>": 152690,
949
+ "<|d_255|>": 152691,
950
+ "<|d_256|>": 152692,
951
+ "<|d_25|>": 152461,
952
+ "<|d_26|>": 152462,
953
+ "<|d_27|>": 152463,
954
+ "<|d_28|>": 152464,
955
+ "<|d_29|>": 152465,
956
+ "<|d_2|>": 152438,
957
+ "<|d_30|>": 152466,
958
+ "<|d_31|>": 152467,
959
+ "<|d_32|>": 152468,
960
+ "<|d_33|>": 152469,
961
+ "<|d_34|>": 152470,
962
+ "<|d_35|>": 152471,
963
+ "<|d_36|>": 152472,
964
+ "<|d_37|>": 152473,
965
+ "<|d_38|>": 152474,
966
+ "<|d_39|>": 152475,
967
+ "<|d_3|>": 152439,
968
+ "<|d_40|>": 152476,
969
+ "<|d_41|>": 152477,
970
+ "<|d_42|>": 152478,
971
+ "<|d_43|>": 152479,
972
+ "<|d_44|>": 152480,
973
+ "<|d_45|>": 152481,
974
+ "<|d_46|>": 152482,
975
+ "<|d_47|>": 152483,
976
+ "<|d_48|>": 152484,
977
+ "<|d_49|>": 152485,
978
+ "<|d_4|>": 152440,
979
+ "<|d_50|>": 152486,
980
+ "<|d_51|>": 152487,
981
+ "<|d_52|>": 152488,
982
+ "<|d_53|>": 152489,
983
+ "<|d_54|>": 152490,
984
+ "<|d_55|>": 152491,
985
+ "<|d_56|>": 152492,
986
+ "<|d_57|>": 152493,
987
+ "<|d_58|>": 152494,
988
+ "<|d_59|>": 152495,
989
+ "<|d_5|>": 152441,
990
+ "<|d_60|>": 152496,
991
+ "<|d_61|>": 152497,
992
+ "<|d_62|>": 152498,
993
+ "<|d_63|>": 152499,
994
+ "<|d_64|>": 152500,
995
+ "<|d_65|>": 152501,
996
+ "<|d_66|>": 152502,
997
+ "<|d_67|>": 152503,
998
+ "<|d_68|>": 152504,
999
+ "<|d_69|>": 152505,
1000
+ "<|d_6|>": 152442,
1001
+ "<|d_70|>": 152506,
1002
+ "<|d_71|>": 152507,
1003
+ "<|d_72|>": 152508,
1004
+ "<|d_73|>": 152509,
1005
+ "<|d_74|>": 152510,
1006
+ "<|d_75|>": 152511,
1007
+ "<|d_76|>": 152512,
1008
+ "<|d_77|>": 152513,
1009
+ "<|d_78|>": 152514,
1010
+ "<|d_79|>": 152515,
1011
+ "<|d_7|>": 152443,
1012
+ "<|d_80|>": 152516,
1013
+ "<|d_81|>": 152517,
1014
+ "<|d_82|>": 152518,
1015
+ "<|d_83|>": 152519,
1016
+ "<|d_84|>": 152520,
1017
+ "<|d_85|>": 152521,
1018
+ "<|d_86|>": 152522,
1019
+ "<|d_87|>": 152523,
1020
+ "<|d_88|>": 152524,
1021
+ "<|d_89|>": 152525,
1022
+ "<|d_8|>": 152444,
1023
+ "<|d_90|>": 152526,
1024
+ "<|d_91|>": 152527,
1025
+ "<|d_92|>": 152528,
1026
+ "<|d_93|>": 152529,
1027
+ "<|d_94|>": 152530,
1028
+ "<|d_95|>": 152531,
1029
+ "<|d_96|>": 152532,
1030
+ "<|d_97|>": 152533,
1031
+ "<|d_98|>": 152534,
1032
+ "<|d_99|>": 152535,
1033
+ "<|d_9|>": 152445,
1034
+ "<|e_100|>": 152792,
1035
+ "<|e_101|>": 152793,
1036
+ "<|e_102|>": 152794,
1037
+ "<|e_103|>": 152795,
1038
+ "<|e_104|>": 152796,
1039
+ "<|e_105|>": 152797,
1040
+ "<|e_106|>": 152798,
1041
+ "<|e_107|>": 152799,
1042
+ "<|e_108|>": 152800,
1043
+ "<|e_109|>": 152801,
1044
+ "<|e_10|>": 152702,
1045
+ "<|e_110|>": 152802,
1046
+ "<|e_111|>": 152803,
1047
+ "<|e_112|>": 152804,
1048
+ "<|e_113|>": 152805,
1049
+ "<|e_114|>": 152806,
1050
+ "<|e_115|>": 152807,
1051
+ "<|e_116|>": 152808,
1052
+ "<|e_117|>": 152809,
1053
+ "<|e_118|>": 152810,
1054
+ "<|e_119|>": 152811,
1055
+ "<|e_11|>": 152703,
1056
+ "<|e_120|>": 152812,
1057
+ "<|e_121|>": 152813,
1058
+ "<|e_122|>": 152814,
1059
+ "<|e_123|>": 152815,
1060
+ "<|e_124|>": 152816,
1061
+ "<|e_125|>": 152817,
1062
+ "<|e_126|>": 152818,
1063
+ "<|e_127|>": 152819,
1064
+ "<|e_128|>": 152820,
1065
+ "<|e_129|>": 152821,
1066
+ "<|e_12|>": 152704,
1067
+ "<|e_130|>": 152822,
1068
+ "<|e_131|>": 152823,
1069
+ "<|e_132|>": 152824,
1070
+ "<|e_133|>": 152825,
1071
+ "<|e_134|>": 152826,
1072
+ "<|e_135|>": 152827,
1073
+ "<|e_136|>": 152828,
1074
+ "<|e_137|>": 152829,
1075
+ "<|e_138|>": 152830,
1076
+ "<|e_139|>": 152831,
1077
+ "<|e_13|>": 152705,
1078
+ "<|e_140|>": 152832,
1079
+ "<|e_141|>": 152833,
1080
+ "<|e_142|>": 152834,
1081
+ "<|e_143|>": 152835,
1082
+ "<|e_144|>": 152836,
1083
+ "<|e_145|>": 152837,
1084
+ "<|e_146|>": 152838,
1085
+ "<|e_147|>": 152839,
1086
+ "<|e_148|>": 152840,
1087
+ "<|e_149|>": 152841,
1088
+ "<|e_14|>": 152706,
1089
+ "<|e_150|>": 152842,
1090
+ "<|e_151|>": 152843,
1091
+ "<|e_152|>": 152844,
1092
+ "<|e_153|>": 152845,
1093
+ "<|e_154|>": 152846,
1094
+ "<|e_155|>": 152847,
1095
+ "<|e_156|>": 152848,
1096
+ "<|e_157|>": 152849,
1097
+ "<|e_158|>": 152850,
1098
+ "<|e_159|>": 152851,
1099
+ "<|e_15|>": 152707,
1100
+ "<|e_160|>": 152852,
1101
+ "<|e_161|>": 152853,
1102
+ "<|e_162|>": 152854,
1103
+ "<|e_163|>": 152855,
1104
+ "<|e_164|>": 152856,
1105
+ "<|e_165|>": 152857,
1106
+ "<|e_166|>": 152858,
1107
+ "<|e_167|>": 152859,
1108
+ "<|e_168|>": 152860,
1109
+ "<|e_169|>": 152861,
1110
+ "<|e_16|>": 152708,
1111
+ "<|e_170|>": 152862,
1112
+ "<|e_171|>": 152863,
1113
+ "<|e_172|>": 152864,
1114
+ "<|e_173|>": 152865,
1115
+ "<|e_174|>": 152866,
1116
+ "<|e_175|>": 152867,
1117
+ "<|e_176|>": 152868,
1118
+ "<|e_177|>": 152869,
1119
+ "<|e_178|>": 152870,
1120
+ "<|e_179|>": 152871,
1121
+ "<|e_17|>": 152709,
1122
+ "<|e_180|>": 152872,
1123
+ "<|e_181|>": 152873,
1124
+ "<|e_182|>": 152874,
1125
+ "<|e_183|>": 152875,
1126
+ "<|e_184|>": 152876,
1127
+ "<|e_185|>": 152877,
1128
+ "<|e_186|>": 152878,
1129
+ "<|e_187|>": 152879,
1130
+ "<|e_188|>": 152880,
1131
+ "<|e_189|>": 152881,
1132
+ "<|e_18|>": 152710,
1133
+ "<|e_190|>": 152882,
1134
+ "<|e_191|>": 152883,
1135
+ "<|e_192|>": 152884,
1136
+ "<|e_193|>": 152885,
1137
+ "<|e_194|>": 152886,
1138
+ "<|e_195|>": 152887,
1139
+ "<|e_196|>": 152888,
1140
+ "<|e_197|>": 152889,
1141
+ "<|e_198|>": 152890,
1142
+ "<|e_199|>": 152891,
1143
+ "<|e_19|>": 152711,
1144
+ "<|e_1|>": 152693,
1145
+ "<|e_200|>": 152892,
1146
+ "<|e_201|>": 152893,
1147
+ "<|e_202|>": 152894,
1148
+ "<|e_203|>": 152895,
1149
+ "<|e_204|>": 152896,
1150
+ "<|e_205|>": 152897,
1151
+ "<|e_206|>": 152898,
1152
+ "<|e_207|>": 152899,
1153
+ "<|e_208|>": 152900,
1154
+ "<|e_209|>": 152901,
1155
+ "<|e_20|>": 152712,
1156
+ "<|e_210|>": 152902,
1157
+ "<|e_211|>": 152903,
1158
+ "<|e_212|>": 152904,
1159
+ "<|e_213|>": 152905,
1160
+ "<|e_214|>": 152906,
1161
+ "<|e_215|>": 152907,
1162
+ "<|e_216|>": 152908,
1163
+ "<|e_217|>": 152909,
1164
+ "<|e_218|>": 152910,
1165
+ "<|e_219|>": 152911,
1166
+ "<|e_21|>": 152713,
1167
+ "<|e_220|>": 152912,
1168
+ "<|e_221|>": 152913,
1169
+ "<|e_222|>": 152914,
1170
+ "<|e_223|>": 152915,
1171
+ "<|e_224|>": 152916,
1172
+ "<|e_225|>": 152917,
1173
+ "<|e_226|>": 152918,
1174
+ "<|e_227|>": 152919,
1175
+ "<|e_228|>": 152920,
1176
+ "<|e_229|>": 152921,
1177
+ "<|e_22|>": 152714,
1178
+ "<|e_230|>": 152922,
1179
+ "<|e_231|>": 152923,
1180
+ "<|e_232|>": 152924,
1181
+ "<|e_233|>": 152925,
1182
+ "<|e_234|>": 152926,
1183
+ "<|e_235|>": 152927,
1184
+ "<|e_236|>": 152928,
1185
+ "<|e_237|>": 152929,
1186
+ "<|e_238|>": 152930,
1187
+ "<|e_239|>": 152931,
1188
+ "<|e_23|>": 152715,
1189
+ "<|e_240|>": 152932,
1190
+ "<|e_241|>": 152933,
1191
+ "<|e_242|>": 152934,
1192
+ "<|e_243|>": 152935,
1193
+ "<|e_244|>": 152936,
1194
+ "<|e_245|>": 152937,
1195
+ "<|e_246|>": 152938,
1196
+ "<|e_247|>": 152939,
1197
+ "<|e_248|>": 152940,
1198
+ "<|e_249|>": 152941,
1199
+ "<|e_24|>": 152716,
1200
+ "<|e_250|>": 152942,
1201
+ "<|e_251|>": 152943,
1202
+ "<|e_252|>": 152944,
1203
+ "<|e_253|>": 152945,
1204
+ "<|e_254|>": 152946,
1205
+ "<|e_255|>": 152947,
1206
+ "<|e_256|>": 152948,
1207
+ "<|e_25|>": 152717,
1208
+ "<|e_26|>": 152718,
1209
+ "<|e_27|>": 152719,
1210
+ "<|e_28|>": 152720,
1211
+ "<|e_29|>": 152721,
1212
+ "<|e_2|>": 152694,
1213
+ "<|e_30|>": 152722,
1214
+ "<|e_31|>": 152723,
1215
+ "<|e_32|>": 152724,
1216
+ "<|e_33|>": 152725,
1217
+ "<|e_34|>": 152726,
1218
+ "<|e_35|>": 152727,
1219
+ "<|e_36|>": 152728,
1220
+ "<|e_37|>": 152729,
1221
+ "<|e_38|>": 152730,
1222
+ "<|e_39|>": 152731,
1223
+ "<|e_3|>": 152695,
1224
+ "<|e_40|>": 152732,
1225
+ "<|e_41|>": 152733,
1226
+ "<|e_42|>": 152734,
1227
+ "<|e_43|>": 152735,
1228
+ "<|e_44|>": 152736,
1229
+ "<|e_45|>": 152737,
1230
+ "<|e_46|>": 152738,
1231
+ "<|e_47|>": 152739,
1232
+ "<|e_48|>": 152740,
1233
+ "<|e_49|>": 152741,
1234
+ "<|e_4|>": 152696,
1235
+ "<|e_50|>": 152742,
1236
+ "<|e_51|>": 152743,
1237
+ "<|e_52|>": 152744,
1238
+ "<|e_53|>": 152745,
1239
+ "<|e_54|>": 152746,
1240
+ "<|e_55|>": 152747,
1241
+ "<|e_56|>": 152748,
1242
+ "<|e_57|>": 152749,
1243
+ "<|e_58|>": 152750,
1244
+ "<|e_59|>": 152751,
1245
+ "<|e_5|>": 152697,
1246
+ "<|e_60|>": 152752,
1247
+ "<|e_61|>": 152753,
1248
+ "<|e_62|>": 152754,
1249
+ "<|e_63|>": 152755,
1250
+ "<|e_64|>": 152756,
1251
+ "<|e_65|>": 152757,
1252
+ "<|e_66|>": 152758,
1253
+ "<|e_67|>": 152759,
1254
+ "<|e_68|>": 152760,
1255
+ "<|e_69|>": 152761,
1256
+ "<|e_6|>": 152698,
1257
+ "<|e_70|>": 152762,
1258
+ "<|e_71|>": 152763,
1259
+ "<|e_72|>": 152764,
1260
+ "<|e_73|>": 152765,
1261
+ "<|e_74|>": 152766,
1262
+ "<|e_75|>": 152767,
1263
+ "<|e_76|>": 152768,
1264
+ "<|e_77|>": 152769,
1265
+ "<|e_78|>": 152770,
1266
+ "<|e_79|>": 152771,
1267
+ "<|e_7|>": 152699,
1268
+ "<|e_80|>": 152772,
1269
+ "<|e_81|>": 152773,
1270
+ "<|e_82|>": 152774,
1271
+ "<|e_83|>": 152775,
1272
+ "<|e_84|>": 152776,
1273
+ "<|e_85|>": 152777,
1274
+ "<|e_86|>": 152778,
1275
+ "<|e_87|>": 152779,
1276
+ "<|e_88|>": 152780,
1277
+ "<|e_89|>": 152781,
1278
+ "<|e_8|>": 152700,
1279
+ "<|e_90|>": 152782,
1280
+ "<|e_91|>": 152783,
1281
+ "<|e_92|>": 152784,
1282
+ "<|e_93|>": 152785,
1283
+ "<|e_94|>": 152786,
1284
+ "<|e_95|>": 152787,
1285
+ "<|e_96|>": 152788,
1286
+ "<|e_97|>": 152789,
1287
+ "<|e_98|>": 152790,
1288
+ "<|e_99|>": 152791,
1289
+ "<|e_9|>": 152701,
1290
+ "<|endoftext|>": 151643,
1291
+ "<|file_sep|>": 151664,
1292
+ "<|fim_middle|>": 151660,
1293
+ "<|fim_pad|>": 151662,
1294
+ "<|fim_prefix|>": 151659,
1295
+ "<|fim_suffix|>": 151661,
1296
+ "<|im_end|>": 151645,
1297
+ "<|im_start|>": 151644,
1298
+ "<|image_pad|>": 151655,
1299
+ "<|object_ref_end|>": 151647,
1300
+ "<|object_ref_start|>": 151646,
1301
+ "<|quad_end|>": 151651,
1302
+ "<|quad_start|>": 151650,
1303
+ "<|repo_name|>": 151663,
1304
+ "<|video_pad|>": 151656,
1305
+ "<|vision_end|>": 151653,
1306
+ "<|vision_pad|>": 151654,
1307
+ "<|vision_start|>": 151652
1308
+ }
qwen3_4b_instruments/config.json ADDED
@@ -0,0 +1,30 @@
+ {
+ "architectures": [
+ "Qwen3ForCausalLM"
+ ],
+ "attention_bias": false,
+ "attention_dropout": 0.0,
+ "eos_token_id": 151645,
+ "head_dim": 128,
+ "hidden_act": "silu",
+ "hidden_size": 2560,
+ "initializer_range": 0.02,
+ "intermediate_size": 9728,
+ "max_position_embeddings": 262144,
+ "max_window_layers": 36,
+ "model_type": "qwen3",
+ "num_attention_heads": 32,
+ "num_hidden_layers": 36,
+ "num_key_value_heads": 8,
+ "pad_token_id": 151643,
+ "rms_norm_eps": 1e-06,
+ "rope_scaling": null,
+ "rope_theta": 5000000,
+ "sliding_window": null,
+ "tie_word_embeddings": true,
+ "torch_dtype": "bfloat16",
+ "transformers_version": "4.51.1",
+ "use_cache": false,
+ "use_sliding_window": false,
+ "vocab_size": 152949
+ }
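The config above fixes the model's attention geometry: 32 query heads sharing 8 key/value heads of width 128 (grouped-query attention). A minimal stdlib-only sketch deriving the projection widths those numbers imply; the JSON subset is copied from the file above, and the variable names are illustrative, not from the repo:

```python
import json

# Subset of qwen3_4b_instruments/config.json, copied from the diff above.
cfg = json.loads("""
{
  "head_dim": 128,
  "hidden_size": 2560,
  "num_attention_heads": 32,
  "num_hidden_layers": 36,
  "num_key_value_heads": 8,
  "vocab_size": 152949
}
""")

# Grouped-query attention: each group of query heads reads one KV head.
q_width = cfg["num_attention_heads"] * cfg["head_dim"]    # 32 * 128 = 4096
kv_width = cfg["num_key_value_heads"] * cfg["head_dim"]   # 8 * 128 = 1024
group_size = cfg["num_attention_heads"] // cfg["num_key_value_heads"]  # 4

print(q_width, kv_width, group_size)  # → 4096 1024 4
```

Note that `q_width` (4096) differs from `hidden_size` (2560), which is why the config carries an explicit `head_dim` instead of deriving it as `hidden_size / num_attention_heads`.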
qwen3_4b_instruments/generation_config.json ADDED
@@ -0,0 +1,13 @@
+ {
+ "bos_token_id": 151643,
+ "do_sample": true,
+ "eos_token_id": [
+ 151645,
+ 151643
+ ],
+ "pad_token_id": 151643,
+ "temperature": 0.7,
+ "top_k": 20,
+ "top_p": 0.8,
+ "transformers_version": "4.51.1"
+ }
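The sampling settings above compose in a fixed order: logits are scaled by 1/temperature, the 20 most likely tokens are kept (top_k), then the smallest high-probability prefix reaching 0.8 of the mass is kept (top_p) and renormalized. A stdlib-only sketch of that filter; the function name `filter_logits` is illustrative, not an API from transformers or this repo:

```python
import math

def filter_logits(logits, temperature=0.7, top_k=20, top_p=0.8):
    """Return {token_id: prob} after temperature, top-k, and top-p filtering."""
    scaled = [l / temperature for l in logits]
    # Numerically stable softmax.
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # top-k: keep the k highest-probability token ids, most likely first.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:top_k]
    # top-p: keep the smallest prefix of that list whose mass reaches top_p.
    kept, mass = [], 0.0
    for i in order:
        kept.append(i)
        mass += probs[i]
        if mass >= top_p:
            break
    z = sum(probs[i] for i in kept)
    return {i: probs[i] / z for i in kept}

dist = filter_logits([2.0, 1.0, 0.5, 0.1, -1.0], top_k=3, top_p=0.8)
```

With these toy logits the nucleus closes after two tokens, so `dist` holds only the two most likely ids, renormalized to sum to 1.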
qwen3_4b_instruments/merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
qwen3_4b_instruments/model-00001-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:04ddd9747b8d616144b781801f249db40e8613d67950d5bd0589d38a6d812418
+ size 4974276200
qwen3_4b_instruments/model-00002-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:20983d2f41777d388edf64f0a40a6ac8a8e02eb7dfef429d0909f952eb994424
+ size 3858991352
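Both `.safetensors` entries are Git LFS pointer files, not the weights themselves: three `key value` lines giving the spec version, the SHA-256 of the real blob, and its size in bytes. A small sketch parsing one; the pointer text is copied from the second shard above:

```python
# A Git LFS pointer file: "version <url>", "oid sha256:<hex>", "size <bytes>".
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:20983d2f41777d388edf64f0a40a6ac8a8e02eb7dfef429d0909f952eb994424
size 3858991352
"""

# Each line is "key value"; split on the first space only.
fields = dict(line.split(" ", 1) for line in pointer.strip().splitlines())
algo, digest = fields["oid"].split(":", 1)
size_bytes = int(fields["size"])  # ~3.86 GB for this shard
```

Tools that are not LFS-aware see only this ~130-byte text file; `git lfs` resolves the oid against the LFS store to fetch the actual shard.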
qwen3_4b_instruments/model.safetensors.index.json ADDED
@@ -0,0 +1,406 @@
+ {
+ "metadata": {
+ "total_size": 8833221632
+ },
+ "weight_map": {
+ "lm_head.weight": "model-00001-of-00002.safetensors",
+ "model.embed_tokens.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.10.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.10.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.10.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.10.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.11.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.11.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.11.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.13.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.13.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.13.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.13.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.13.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.15.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.15.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.15.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.15.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.15.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.16.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.16.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.16.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.16.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.16.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.16.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.16.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.18.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.18.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.18.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.18.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.19.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.19.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.19.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.19.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.19.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.19.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.20.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.20.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.20.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.21.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.21.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.21.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.21.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.21.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.22.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.22.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.22.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.22.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.22.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.23.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.23.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.23.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.23.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.23.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.23.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.24.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.24.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.24.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.24.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.25.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.25.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.25.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.25.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.25.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
217
+ "model.layers.26.input_layernorm.weight": "model-00001-of-00002.safetensors",
218
+ "model.layers.26.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
219
+ "model.layers.26.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
220
+ "model.layers.26.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
221
+ "model.layers.26.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
222
+ "model.layers.26.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
223
+ "model.layers.26.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
224
+ "model.layers.26.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
225
+ "model.layers.26.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
226
+ "model.layers.26.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
227
+ "model.layers.26.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
228
+ "model.layers.27.input_layernorm.weight": "model-00001-of-00002.safetensors",
229
+ "model.layers.27.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
230
+ "model.layers.27.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
231
+ "model.layers.27.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
232
+ "model.layers.27.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
233
+ "model.layers.27.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
234
+ "model.layers.27.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
235
+ "model.layers.27.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
236
+ "model.layers.27.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
237
+ "model.layers.27.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
238
+ "model.layers.27.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
239
+ "model.layers.28.input_layernorm.weight": "model-00002-of-00002.safetensors",
240
+ "model.layers.28.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
241
+ "model.layers.28.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
242
+ "model.layers.28.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
243
+ "model.layers.28.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
244
+ "model.layers.28.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
245
+ "model.layers.28.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
246
+ "model.layers.28.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
247
+ "model.layers.28.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
248
+ "model.layers.28.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
249
+ "model.layers.28.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
250
+ "model.layers.29.input_layernorm.weight": "model-00002-of-00002.safetensors",
251
+ "model.layers.29.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
252
+ "model.layers.29.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
253
+ "model.layers.29.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
254
+ "model.layers.29.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
255
+ "model.layers.29.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
256
+ "model.layers.29.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
257
+ "model.layers.29.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
258
+ "model.layers.29.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
259
+ "model.layers.29.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
260
+ "model.layers.29.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
261
+ "model.layers.3.input_layernorm.weight": "model-00001-of-00002.safetensors",
262
+ "model.layers.3.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
263
+ "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
264
+ "model.layers.3.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
265
+ "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
266
+ "model.layers.3.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
267
+ "model.layers.3.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
268
+ "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
269
+ "model.layers.3.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
270
+ "model.layers.3.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
271
+ "model.layers.3.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
272
+ "model.layers.30.input_layernorm.weight": "model-00002-of-00002.safetensors",
273
+ "model.layers.30.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
274
+ "model.layers.30.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
275
+ "model.layers.30.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
276
+ "model.layers.30.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
277
+ "model.layers.30.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
278
+ "model.layers.30.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
279
+ "model.layers.30.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
280
+ "model.layers.30.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
281
+ "model.layers.30.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
282
+ "model.layers.30.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
283
+ "model.layers.31.input_layernorm.weight": "model-00002-of-00002.safetensors",
284
+ "model.layers.31.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
285
+ "model.layers.31.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
286
+ "model.layers.31.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
287
+ "model.layers.31.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
288
+ "model.layers.31.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
289
+ "model.layers.31.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
290
+ "model.layers.31.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
291
+ "model.layers.31.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
292
+ "model.layers.31.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
293
+ "model.layers.31.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
294
+ "model.layers.32.input_layernorm.weight": "model-00001-of-00002.safetensors",
295
+ "model.layers.32.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
296
+ "model.layers.32.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
297
+ "model.layers.32.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
298
+ "model.layers.32.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
299
+ "model.layers.32.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
300
+ "model.layers.32.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
301
+ "model.layers.32.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
302
+ "model.layers.32.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
303
+ "model.layers.32.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
304
+ "model.layers.32.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
305
+ "model.layers.33.input_layernorm.weight": "model-00002-of-00002.safetensors",
306
+ "model.layers.33.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
307
+ "model.layers.33.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
308
+ "model.layers.33.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
309
+ "model.layers.33.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
310
+ "model.layers.33.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
311
+ "model.layers.33.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
312
+ "model.layers.33.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
313
+ "model.layers.33.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
314
+ "model.layers.33.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
315
+ "model.layers.33.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
316
+ "model.layers.34.input_layernorm.weight": "model-00001-of-00002.safetensors",
317
+ "model.layers.34.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
318
+ "model.layers.34.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
319
+ "model.layers.34.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
320
+ "model.layers.34.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
321
+ "model.layers.34.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
322
+ "model.layers.34.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
323
+ "model.layers.34.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
324
+ "model.layers.34.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
325
+ "model.layers.34.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
326
+ "model.layers.34.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
327
+ "model.layers.35.input_layernorm.weight": "model-00002-of-00002.safetensors",
328
+ "model.layers.35.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
329
+ "model.layers.35.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
330
+ "model.layers.35.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
331
+ "model.layers.35.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
332
+ "model.layers.35.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
333
+ "model.layers.35.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
334
+ "model.layers.35.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
335
+ "model.layers.35.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
336
+ "model.layers.35.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
337
+ "model.layers.35.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
338
+ "model.layers.4.input_layernorm.weight": "model-00002-of-00002.safetensors",
339
+ "model.layers.4.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
340
+ "model.layers.4.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
341
+ "model.layers.4.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
342
+ "model.layers.4.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
343
+ "model.layers.4.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
344
+ "model.layers.4.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
345
+ "model.layers.4.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
346
+ "model.layers.4.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
347
+ "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
348
+ "model.layers.4.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
349
+ "model.layers.5.input_layernorm.weight": "model-00001-of-00002.safetensors",
350
+ "model.layers.5.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
351
+ "model.layers.5.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
352
+ "model.layers.5.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
353
+ "model.layers.5.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
354
+ "model.layers.5.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
355
+ "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
356
+ "model.layers.5.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
357
+ "model.layers.5.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
358
+ "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
359
+ "model.layers.5.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
360
+ "model.layers.6.input_layernorm.weight": "model-00002-of-00002.safetensors",
361
+ "model.layers.6.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
362
+ "model.layers.6.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
363
+ "model.layers.6.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
364
+ "model.layers.6.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
365
+ "model.layers.6.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
366
+ "model.layers.6.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
367
+ "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
368
+ "model.layers.6.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
369
+ "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
370
+ "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
371
+ "model.layers.7.input_layernorm.weight": "model-00002-of-00002.safetensors",
372
+ "model.layers.7.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
373
+ "model.layers.7.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
374
+ "model.layers.7.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
375
+ "model.layers.7.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
376
+ "model.layers.7.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
377
+ "model.layers.7.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
378
+ "model.layers.7.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
379
+ "model.layers.7.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
380
+ "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
381
+ "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
382
+ "model.layers.8.input_layernorm.weight": "model-00002-of-00002.safetensors",
383
+ "model.layers.8.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
384
+ "model.layers.8.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
385
+ "model.layers.8.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
386
+ "model.layers.8.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
387
+ "model.layers.8.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
388
+ "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
389
+ "model.layers.8.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
390
+ "model.layers.8.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
391
+ "model.layers.8.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
392
+ "model.layers.8.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
393
+ "model.layers.9.input_layernorm.weight": "model-00002-of-00002.safetensors",
394
+ "model.layers.9.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
395
+ "model.layers.9.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
396
+ "model.layers.9.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
397
+ "model.layers.9.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
398
+ "model.layers.9.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
399
+ "model.layers.9.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
400
+ "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
401
+ "model.layers.9.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
402
+ "model.layers.9.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
403
+ "model.layers.9.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
404
+ "model.norm.weight": "model-00002-of-00002.safetensors"
405
+ }
406
+ }
qwen3_4b_instruments/special_tokens_map.json ADDED
@@ -0,0 +1,1298 @@
+ {
+ "additional_special_tokens": [
+ "<|a_1|>",
+ "<|a_2|>",
+ "<|a_3|>",
+ "<|a_4|>",
+ "<|a_5|>",
+ "<|a_6|>",
+ "<|a_7|>",
+ "<|a_8|>",
+ "<|a_9|>",
+ "<|a_10|>",
+ "<|a_11|>",
+ "<|a_12|>",
+ "<|a_13|>",
+ "<|a_14|>",
+ "<|a_15|>",
+ "<|a_16|>",
+ "<|a_17|>",
+ "<|a_18|>",
+ "<|a_19|>",
+ "<|a_20|>",
+ "<|a_21|>",
+ "<|a_22|>",
+ "<|a_23|>",
+ "<|a_24|>",
+ "<|a_25|>",
+ "<|a_26|>",
+ "<|a_27|>",
+ "<|a_28|>",
+ "<|a_29|>",
+ "<|a_30|>",
+ "<|a_31|>",
+ "<|a_32|>",
+ "<|a_33|>",
+ "<|a_34|>",
+ "<|a_35|>",
+ "<|a_36|>",
+ "<|a_37|>",
+ "<|a_38|>",
+ "<|a_39|>",
+ "<|a_40|>",
+ "<|a_41|>",
+ "<|a_42|>",
+ "<|a_43|>",
+ "<|a_44|>",
+ "<|a_45|>",
+ "<|a_46|>",
+ "<|a_47|>",
+ "<|a_48|>",
+ "<|a_49|>",
+ "<|a_50|>",
+ "<|a_51|>",
+ "<|a_52|>",
+ "<|a_53|>",
+ "<|a_54|>",
+ "<|a_55|>",
+ "<|a_56|>",
+ "<|a_57|>",
+ "<|a_58|>",
+ "<|a_59|>",
+ "<|a_60|>",
+ "<|a_61|>",
+ "<|a_62|>",
+ "<|a_63|>",
+ "<|a_64|>",
+ "<|a_65|>",
+ "<|a_66|>",
+ "<|a_67|>",
+ "<|a_68|>",
+ "<|a_69|>",
+ "<|a_70|>",
+ "<|a_71|>",
+ "<|a_72|>",
+ "<|a_73|>",
+ "<|a_74|>",
+ "<|a_75|>",
+ "<|a_76|>",
+ "<|a_77|>",
+ "<|a_78|>",
+ "<|a_79|>",
+ "<|a_80|>",
+ "<|a_81|>",
+ "<|a_82|>",
+ "<|a_83|>",
+ "<|a_84|>",
+ "<|a_85|>",
+ "<|a_86|>",
+ "<|a_87|>",
+ "<|a_88|>",
+ "<|a_89|>",
+ "<|a_90|>",
+ "<|a_91|>",
+ "<|a_92|>",
+ "<|a_93|>",
+ "<|a_94|>",
+ "<|a_95|>",
+ "<|a_96|>",
+ "<|a_97|>",
+ "<|a_98|>",
+ "<|a_99|>",
+ "<|a_100|>",
+ "<|a_101|>",
+ "<|a_102|>",
+ "<|a_103|>",
+ "<|a_104|>",
+ "<|a_105|>",
+ "<|a_106|>",
+ "<|a_107|>",
+ "<|a_108|>",
+ "<|a_109|>",
+ "<|a_110|>",
+ "<|a_111|>",
+ "<|a_112|>",
+ "<|a_113|>",
+ "<|a_114|>",
+ "<|a_115|>",
+ "<|a_116|>",
+ "<|a_117|>",
+ "<|a_118|>",
+ "<|a_119|>",
+ "<|a_120|>",
+ "<|a_121|>",
+ "<|a_122|>",
+ "<|a_123|>",
+ "<|a_124|>",
+ "<|a_125|>",
+ "<|a_126|>",
+ "<|a_127|>",
+ "<|a_128|>",
+ "<|a_129|>",
+ "<|a_130|>",
+ "<|a_131|>",
+ "<|a_132|>",
+ "<|a_133|>",
+ "<|a_134|>",
+ "<|a_135|>",
+ "<|a_136|>",
+ "<|a_137|>",
+ "<|a_138|>",
+ "<|a_139|>",
+ "<|a_140|>",
+ "<|a_141|>",
+ "<|a_142|>",
+ "<|a_143|>",
+ "<|a_144|>",
+ "<|a_145|>",
+ "<|a_146|>",
+ "<|a_147|>",
+ "<|a_148|>",
+ "<|a_149|>",
+ "<|a_150|>",
+ "<|a_151|>",
+ "<|a_152|>",
+ "<|a_153|>",
+ "<|a_154|>",
+ "<|a_155|>",
+ "<|a_156|>",
+ "<|a_157|>",
+ "<|a_158|>",
+ "<|a_159|>",
+ "<|a_160|>",
+ "<|a_161|>",
+ "<|a_162|>",
+ "<|a_163|>",
+ "<|a_164|>",
+ "<|a_165|>",
+ "<|a_166|>",
+ "<|a_167|>",
+ "<|a_168|>",
+ "<|a_169|>",
+ "<|a_170|>",
+ "<|a_171|>",
+ "<|a_172|>",
+ "<|a_173|>",
+ "<|a_174|>",
+ "<|a_175|>",
+ "<|a_176|>",
+ "<|a_177|>",
+ "<|a_178|>",
+ "<|a_179|>",
+ "<|a_180|>",
+ "<|a_181|>",
+ "<|a_182|>",
+ "<|a_183|>",
+ "<|a_184|>",
+ "<|a_185|>",
+ "<|a_186|>",
+ "<|a_187|>",
+ "<|a_188|>",
+ "<|a_189|>",
+ "<|a_190|>",
+ "<|a_191|>",
+ "<|a_192|>",
+ "<|a_193|>",
+ "<|a_194|>",
+ "<|a_195|>",
+ "<|a_196|>",
+ "<|a_197|>",
+ "<|a_198|>",
+ "<|a_199|>",
+ "<|a_200|>",
+ "<|a_201|>",
+ "<|a_202|>",
+ "<|a_203|>",
+ "<|a_204|>",
+ "<|a_205|>",
+ "<|a_206|>",
+ "<|a_207|>",
+ "<|a_208|>",
+ "<|a_209|>",
+ "<|a_210|>",
+ "<|a_211|>",
+ "<|a_212|>",
+ "<|a_213|>",
+ "<|a_214|>",
+ "<|a_215|>",
+ "<|a_216|>",
+ "<|a_217|>",
+ "<|a_218|>",
+ "<|a_219|>",
+ "<|a_220|>",
+ "<|a_221|>",
+ "<|a_222|>",
+ "<|a_223|>",
+ "<|a_224|>",
+ "<|a_225|>",
+ "<|a_226|>",
+ "<|a_227|>",
+ "<|a_228|>",
+ "<|a_229|>",
+ "<|a_230|>",
+ "<|a_231|>",
+ "<|a_232|>",
+ "<|a_233|>",
+ "<|a_234|>",
+ "<|a_235|>",
+ "<|a_236|>",
+ "<|a_237|>",
+ "<|a_238|>",
+ "<|a_239|>",
+ "<|a_240|>",
+ "<|a_241|>",
+ "<|a_242|>",
+ "<|a_243|>",
+ "<|a_244|>",
+ "<|a_245|>",
+ "<|a_246|>",
+ "<|a_247|>",
+ "<|a_248|>",
+ "<|a_249|>",
+ "<|a_250|>",
+ "<|a_251|>",
+ "<|a_252|>",
+ "<|a_253|>",
+ "<|a_254|>",
+ "<|a_255|>",
+ "<|a_256|>",
+ "<|b_1|>",
+ "<|b_2|>",
+ "<|b_3|>",
+ "<|b_4|>",
+ "<|b_5|>",
+ "<|b_6|>",
+ "<|b_7|>",
+ "<|b_8|>",
+ "<|b_9|>",
+ "<|b_10|>",
+ "<|b_11|>",
+ "<|b_12|>",
+ "<|b_13|>",
+ "<|b_14|>",
+ "<|b_15|>",
+ "<|b_16|>",
+ "<|b_17|>",
+ "<|b_18|>",
+ "<|b_19|>",
+ "<|b_20|>",
+ "<|b_21|>",
+ "<|b_22|>",
+ "<|b_23|>",
+ "<|b_24|>",
+ "<|b_25|>",
+ "<|b_26|>",
+ "<|b_27|>",
+ "<|b_28|>",
+ "<|b_29|>",
+ "<|b_30|>",
+ "<|b_31|>",
+ "<|b_32|>",
+ "<|b_33|>",
+ "<|b_34|>",
+ "<|b_35|>",
+ "<|b_36|>",
+ "<|b_37|>",
+ "<|b_38|>",
+ "<|b_39|>",
+ "<|b_40|>",
+ "<|b_41|>",
+ "<|b_42|>",
+ "<|b_43|>",
+ "<|b_44|>",
+ "<|b_45|>",
+ "<|b_46|>",
+ "<|b_47|>",
+ "<|b_48|>",
+ "<|b_49|>",
+ "<|b_50|>",
+ "<|b_51|>",
+ "<|b_52|>",
+ "<|b_53|>",
+ "<|b_54|>",
+ "<|b_55|>",
+ "<|b_56|>",
+ "<|b_57|>",
+ "<|b_58|>",
+ "<|b_59|>",
+ "<|b_60|>",
+ "<|b_61|>",
+ "<|b_62|>",
+ "<|b_63|>",
+ "<|b_64|>",
+ "<|b_65|>",
+ "<|b_66|>",
+ "<|b_67|>",
+ "<|b_68|>",
+ "<|b_69|>",
+ "<|b_70|>",
+ "<|b_71|>",
+ "<|b_72|>",
+ "<|b_73|>",
+ "<|b_74|>",
+ "<|b_75|>",
+ "<|b_76|>",
+ "<|b_77|>",
+ "<|b_78|>",
+ "<|b_79|>",
+ "<|b_80|>",
+ "<|b_81|>",
+ "<|b_82|>",
+ "<|b_83|>",
+ "<|b_84|>",
+ "<|b_85|>",
+ "<|b_86|>",
+ "<|b_87|>",
+ "<|b_88|>",
+ "<|b_89|>",
+ "<|b_90|>",
+ "<|b_91|>",
+ "<|b_92|>",
+ "<|b_93|>",
+ "<|b_94|>",
+ "<|b_95|>",
+ "<|b_96|>",
+ "<|b_97|>",
+ "<|b_98|>",
+ "<|b_99|>",
+ "<|b_100|>",
+ "<|b_101|>",
+ "<|b_102|>",
+ "<|b_103|>",
+ "<|b_104|>",
+ "<|b_105|>",
+ "<|b_106|>",
+ "<|b_107|>",
+ "<|b_108|>",
+ "<|b_109|>",
+ "<|b_110|>",
+ "<|b_111|>",
+ "<|b_112|>",
+ "<|b_113|>",
+ "<|b_114|>",
+ "<|b_115|>",
+ "<|b_116|>",
+ "<|b_117|>",
+ "<|b_118|>",
+ "<|b_119|>",
+ "<|b_120|>",
+ "<|b_121|>",
+ "<|b_122|>",
+ "<|b_123|>",
+ "<|b_124|>",
+ "<|b_125|>",
+ "<|b_126|>",
+ "<|b_127|>",
+ "<|b_128|>",
+ "<|b_129|>",
+ "<|b_130|>",
+ "<|b_131|>",
+ "<|b_132|>",
+ "<|b_133|>",
+ "<|b_134|>",
+ "<|b_135|>",
+ "<|b_136|>",
+ "<|b_137|>",
+ "<|b_138|>",
+ "<|b_139|>",
+ "<|b_140|>",
+ "<|b_141|>",
+ "<|b_142|>",
+ "<|b_143|>",
+ "<|b_144|>",
+ "<|b_145|>",
+ "<|b_146|>",
+ "<|b_147|>",
+ "<|b_148|>",
+ "<|b_149|>",
+ "<|b_150|>",
+ "<|b_151|>",
+ "<|b_152|>",
+ "<|b_153|>",
+ "<|b_154|>",
+ "<|b_155|>",
+ "<|b_156|>",
+ "<|b_157|>",
+ "<|b_158|>",
+ "<|b_159|>",
+ "<|b_160|>",
+ "<|b_161|>",
+ "<|b_162|>",
+ "<|b_163|>",
+ "<|b_164|>",
+ "<|b_165|>",
+ "<|b_166|>",
+ "<|b_167|>",
+ "<|b_168|>",
+ "<|b_169|>",
+ "<|b_170|>",
+ "<|b_171|>",
+ "<|b_172|>",
+ "<|b_173|>",
+ "<|b_174|>",
+ "<|b_175|>",
+ "<|b_176|>",
+ "<|b_177|>",
+ "<|b_178|>",
+ "<|b_179|>",
+ "<|b_180|>",
+ "<|b_181|>",
+ "<|b_182|>",
+ "<|b_183|>",
+ "<|b_184|>",
+ "<|b_185|>",
+ "<|b_186|>",
+ "<|b_187|>",
+ "<|b_188|>",
+ "<|b_189|>",
+ "<|b_190|>",
+ "<|b_191|>",
+ "<|b_192|>",
+ "<|b_193|>",
+ "<|b_194|>",
+ "<|b_195|>",
+ "<|b_196|>",
+ "<|b_197|>",
+ "<|b_198|>",
+ "<|b_199|>",
+ "<|b_200|>",
+ "<|b_201|>",
+ "<|b_202|>",
+ "<|b_203|>",
+ "<|b_204|>",
+ "<|b_205|>",
+ "<|b_206|>",
+ "<|b_207|>",
+ "<|b_208|>",
+ "<|b_209|>",
+ "<|b_210|>",
+ "<|b_211|>",
+ "<|b_212|>",
+ "<|b_213|>",
+ "<|b_214|>",
+ "<|b_215|>",
+ "<|b_216|>",
+ "<|b_217|>",
+ "<|b_218|>",
+ "<|b_219|>",
+ "<|b_220|>",
+ "<|b_221|>",
+ "<|b_222|>",
+ "<|b_223|>",
+ "<|b_224|>",
+ "<|b_225|>",
+ "<|b_226|>",
+ "<|b_227|>",
+ "<|b_228|>",
+ "<|b_229|>",
+ "<|b_230|>",
+ "<|b_231|>",
+ "<|b_232|>",
+ "<|b_233|>",
+ "<|b_234|>",
+ "<|b_235|>",
+ "<|b_236|>",
+ "<|b_237|>",
+ "<|b_238|>",
+ "<|b_239|>",
+ "<|b_240|>",
+ "<|b_241|>",
+ "<|b_242|>",
+ "<|b_243|>",
+ "<|b_244|>",
+ "<|b_245|>",
+ "<|b_246|>",
+ "<|b_247|>",
+ "<|b_248|>",
+ "<|b_249|>",
+ "<|b_250|>",
+ "<|b_251|>",
+ "<|b_252|>",
+ "<|b_253|>",
+ "<|b_254|>",
+ "<|b_255|>",
+ "<|b_256|>",
+ "<|c_1|>",
+ "<|c_2|>",
+ "<|c_3|>",
+ "<|c_4|>",
+ "<|c_5|>",
+ "<|c_6|>",
+ "<|c_7|>",
+ "<|c_8|>",
+ "<|c_9|>",
+ "<|c_10|>",
+ "<|c_11|>",
+ "<|c_12|>",
+ "<|c_13|>",
+ "<|c_14|>",
+ "<|c_15|>",
+ "<|c_16|>",
+ "<|c_17|>",
+ "<|c_18|>",
+ "<|c_19|>",
+ "<|c_20|>",
+ "<|c_21|>",
+ "<|c_22|>",
+ "<|c_23|>",
+ "<|c_24|>",
+ "<|c_25|>",
+ "<|c_26|>",
+ "<|c_27|>",
+ "<|c_28|>",
+ "<|c_29|>",
+ "<|c_30|>",
+ "<|c_31|>",
+ "<|c_32|>",
+ "<|c_33|>",
+ "<|c_34|>",
+ "<|c_35|>",
+ "<|c_36|>",
+ "<|c_37|>",
+ "<|c_38|>",
+ "<|c_39|>",
+ "<|c_40|>",
+ "<|c_41|>",
+ "<|c_42|>",
+ "<|c_43|>",
+ "<|c_44|>",
+ "<|c_45|>",
+ "<|c_46|>",
+ "<|c_47|>",
+ "<|c_48|>",
+ "<|c_49|>",
+ "<|c_50|>",
+ "<|c_51|>",
+ "<|c_52|>",
+ "<|c_53|>",
+ "<|c_54|>",
+ "<|c_55|>",
+ "<|c_56|>",
+ "<|c_57|>",
+ "<|c_58|>",
+ "<|c_59|>",
+ "<|c_60|>",
+ "<|c_61|>",
+ "<|c_62|>",
+ "<|c_63|>",
+ "<|c_64|>",
+ "<|c_65|>",
+ "<|c_66|>",
+ "<|c_67|>",
+ "<|c_68|>",
+ "<|c_69|>",
+ "<|c_70|>",
+ "<|c_71|>",
+ "<|c_72|>",
+ "<|c_73|>",
+ "<|c_74|>",
+ "<|c_75|>",
+ "<|c_76|>",
+ "<|c_77|>",
+ "<|c_78|>",
+ "<|c_79|>",
+ "<|c_80|>",
+ "<|c_81|>",
+ "<|c_82|>",
+ "<|c_83|>",
+ "<|c_84|>",
+ "<|c_85|>",
+ "<|c_86|>",
+ "<|c_87|>",
+ "<|c_88|>",
+ "<|c_89|>",
+ "<|c_90|>",
+ "<|c_91|>",
+ "<|c_92|>",
+ "<|c_93|>",
+ "<|c_94|>",
+ "<|c_95|>",
+ "<|c_96|>",
+ "<|c_97|>",
+ "<|c_98|>",
+ "<|c_99|>",
+ "<|c_100|>",
+ "<|c_101|>",
+ "<|c_102|>",
+ "<|c_103|>",
+ "<|c_104|>",
+ "<|c_105|>",
+ "<|c_106|>",
+ "<|c_107|>",
+ "<|c_108|>",
+ "<|c_109|>",
+ "<|c_110|>",
+ "<|c_111|>",
+ "<|c_112|>",
+ "<|c_113|>",
+ "<|c_114|>",
+ "<|c_115|>",
+ "<|c_116|>",
+ "<|c_117|>",
+ "<|c_118|>",
+ "<|c_119|>",
+ "<|c_120|>",
+ "<|c_121|>",
+ "<|c_122|>",
+ "<|c_123|>",
+ "<|c_124|>",
+ "<|c_125|>",
+ "<|c_126|>",
+ "<|c_127|>",
+ "<|c_128|>",
+ "<|c_129|>",
+ "<|c_130|>",
+ "<|c_131|>",
+ "<|c_132|>",
+ "<|c_133|>",
+ "<|c_134|>",
+ "<|c_135|>",
+ "<|c_136|>",
+ "<|c_137|>",
+ "<|c_138|>",
+ "<|c_139|>",
+ "<|c_140|>",
+ "<|c_141|>",
+ "<|c_142|>",
+ "<|c_143|>",
+ "<|c_144|>",
+ "<|c_145|>",
+ "<|c_146|>",
+ "<|c_147|>",
+ "<|c_148|>",
+ "<|c_149|>",
+ "<|c_150|>",
+ "<|c_151|>",
+ "<|c_152|>",
+ "<|c_153|>",
+ "<|c_154|>",
+ "<|c_155|>",
+ "<|c_156|>",
+ "<|c_157|>",
+ "<|c_158|>",
+ "<|c_159|>",
+ "<|c_160|>",
+ "<|c_161|>",
+ "<|c_162|>",
+ "<|c_163|>",
+ "<|c_164|>",
+ "<|c_165|>",
+ "<|c_166|>",
+ "<|c_167|>",
+ "<|c_168|>",
+ "<|c_169|>",
+ "<|c_170|>",
+ "<|c_171|>",
+ "<|c_172|>",
+ "<|c_173|>",
+ "<|c_174|>",
+ "<|c_175|>",
+ "<|c_176|>",
+ "<|c_177|>",
+ "<|c_178|>",
+ "<|c_179|>",
+ "<|c_180|>",
+ "<|c_181|>",
+ "<|c_182|>",
+ "<|c_183|>",
+ "<|c_184|>",
+ "<|c_185|>",
+ "<|c_186|>",
+ "<|c_187|>",
+ "<|c_188|>",
+ "<|c_189|>",
+ "<|c_190|>",
+ "<|c_191|>",
+ "<|c_192|>",
+ "<|c_193|>",
+ "<|c_194|>",
+ "<|c_195|>",
+ "<|c_196|>",
+ "<|c_197|>",
+ "<|c_198|>",
+ "<|c_199|>",
+ "<|c_200|>",
+ "<|c_201|>",
+ "<|c_202|>",
+ "<|c_203|>",
+ "<|c_204|>",
+ "<|c_205|>",
+ "<|c_206|>",
+ "<|c_207|>",
+ "<|c_208|>",
+ "<|c_209|>",
+ "<|c_210|>",
+ "<|c_211|>",
+ "<|c_212|>",
+ "<|c_213|>",
+ "<|c_214|>",
+ "<|c_215|>",
+ "<|c_216|>",
+ "<|c_217|>",
+ "<|c_218|>",
733
+ "<|c_219|>",
734
+ "<|c_220|>",
735
+ "<|c_221|>",
736
+ "<|c_222|>",
737
+ "<|c_223|>",
738
+ "<|c_224|>",
739
+ "<|c_225|>",
740
+ "<|c_226|>",
741
+ "<|c_227|>",
742
+ "<|c_228|>",
743
+ "<|c_229|>",
744
+ "<|c_230|>",
745
+ "<|c_231|>",
746
+ "<|c_232|>",
747
+ "<|c_233|>",
748
+ "<|c_234|>",
749
+ "<|c_235|>",
750
+ "<|c_236|>",
751
+ "<|c_237|>",
752
+ "<|c_238|>",
753
+ "<|c_239|>",
754
+ "<|c_240|>",
755
+ "<|c_241|>",
756
+ "<|c_242|>",
757
+ "<|c_243|>",
758
+ "<|c_244|>",
759
+ "<|c_245|>",
760
+ "<|c_246|>",
761
+ "<|c_247|>",
762
+ "<|c_248|>",
763
+ "<|c_249|>",
764
+ "<|c_250|>",
765
+ "<|c_251|>",
766
+ "<|c_252|>",
767
+ "<|c_253|>",
768
+ "<|c_254|>",
769
+ "<|c_255|>",
770
+ "<|c_256|>",
771
+ "<|d_1|>",
772
+ "<|d_2|>",
773
+ "<|d_3|>",
774
+ "<|d_4|>",
775
+ "<|d_5|>",
776
+ "<|d_6|>",
777
+ "<|d_7|>",
778
+ "<|d_8|>",
779
+ "<|d_9|>",
780
+ "<|d_10|>",
781
+ "<|d_11|>",
782
+ "<|d_12|>",
783
+ "<|d_13|>",
784
+ "<|d_14|>",
785
+ "<|d_15|>",
786
+ "<|d_16|>",
787
+ "<|d_17|>",
788
+ "<|d_18|>",
789
+ "<|d_19|>",
790
+ "<|d_20|>",
791
+ "<|d_21|>",
792
+ "<|d_22|>",
793
+ "<|d_23|>",
794
+ "<|d_24|>",
795
+ "<|d_25|>",
796
+ "<|d_26|>",
797
+ "<|d_27|>",
798
+ "<|d_28|>",
799
+ "<|d_29|>",
800
+ "<|d_30|>",
801
+ "<|d_31|>",
802
+ "<|d_32|>",
803
+ "<|d_33|>",
804
+ "<|d_34|>",
805
+ "<|d_35|>",
806
+ "<|d_36|>",
807
+ "<|d_37|>",
808
+ "<|d_38|>",
809
+ "<|d_39|>",
810
+ "<|d_40|>",
811
+ "<|d_41|>",
812
+ "<|d_42|>",
813
+ "<|d_43|>",
814
+ "<|d_44|>",
815
+ "<|d_45|>",
816
+ "<|d_46|>",
817
+ "<|d_47|>",
818
+ "<|d_48|>",
819
+ "<|d_49|>",
820
+ "<|d_50|>",
821
+ "<|d_51|>",
822
+ "<|d_52|>",
823
+ "<|d_53|>",
824
+ "<|d_54|>",
825
+ "<|d_55|>",
826
+ "<|d_56|>",
827
+ "<|d_57|>",
828
+ "<|d_58|>",
829
+ "<|d_59|>",
830
+ "<|d_60|>",
831
+ "<|d_61|>",
832
+ "<|d_62|>",
833
+ "<|d_63|>",
834
+ "<|d_64|>",
835
+ "<|d_65|>",
836
+ "<|d_66|>",
837
+ "<|d_67|>",
838
+ "<|d_68|>",
839
+ "<|d_69|>",
840
+ "<|d_70|>",
841
+ "<|d_71|>",
842
+ "<|d_72|>",
843
+ "<|d_73|>",
844
+ "<|d_74|>",
845
+ "<|d_75|>",
846
+ "<|d_76|>",
847
+ "<|d_77|>",
848
+ "<|d_78|>",
849
+ "<|d_79|>",
850
+ "<|d_80|>",
851
+ "<|d_81|>",
852
+ "<|d_82|>",
853
+ "<|d_83|>",
854
+ "<|d_84|>",
855
+ "<|d_85|>",
856
+ "<|d_86|>",
857
+ "<|d_87|>",
858
+ "<|d_88|>",
859
+ "<|d_89|>",
860
+ "<|d_90|>",
861
+ "<|d_91|>",
862
+ "<|d_92|>",
863
+ "<|d_93|>",
864
+ "<|d_94|>",
865
+ "<|d_95|>",
866
+ "<|d_96|>",
867
+ "<|d_97|>",
868
+ "<|d_98|>",
869
+ "<|d_99|>",
870
+ "<|d_100|>",
871
+ "<|d_101|>",
872
+ "<|d_102|>",
873
+ "<|d_103|>",
874
+ "<|d_104|>",
875
+ "<|d_105|>",
876
+ "<|d_106|>",
877
+ "<|d_107|>",
878
+ "<|d_108|>",
879
+ "<|d_109|>",
880
+ "<|d_110|>",
881
+ "<|d_111|>",
882
+ "<|d_112|>",
883
+ "<|d_113|>",
884
+ "<|d_114|>",
885
+ "<|d_115|>",
886
+ "<|d_116|>",
887
+ "<|d_117|>",
888
+ "<|d_118|>",
889
+ "<|d_119|>",
890
+ "<|d_120|>",
891
+ "<|d_121|>",
892
+ "<|d_122|>",
893
+ "<|d_123|>",
894
+ "<|d_124|>",
895
+ "<|d_125|>",
896
+ "<|d_126|>",
897
+ "<|d_127|>",
898
+ "<|d_128|>",
899
+ "<|d_129|>",
900
+ "<|d_130|>",
901
+ "<|d_131|>",
902
+ "<|d_132|>",
903
+ "<|d_133|>",
904
+ "<|d_134|>",
905
+ "<|d_135|>",
906
+ "<|d_136|>",
907
+ "<|d_137|>",
908
+ "<|d_138|>",
909
+ "<|d_139|>",
910
+ "<|d_140|>",
911
+ "<|d_141|>",
912
+ "<|d_142|>",
913
+ "<|d_143|>",
914
+ "<|d_144|>",
915
+ "<|d_145|>",
916
+ "<|d_146|>",
917
+ "<|d_147|>",
918
+ "<|d_148|>",
919
+ "<|d_149|>",
920
+ "<|d_150|>",
921
+ "<|d_151|>",
922
+ "<|d_152|>",
923
+ "<|d_153|>",
924
+ "<|d_154|>",
925
+ "<|d_155|>",
926
+ "<|d_156|>",
927
+ "<|d_157|>",
928
+ "<|d_158|>",
929
+ "<|d_159|>",
930
+ "<|d_160|>",
931
+ "<|d_161|>",
932
+ "<|d_162|>",
933
+ "<|d_163|>",
934
+ "<|d_164|>",
935
+ "<|d_165|>",
936
+ "<|d_166|>",
937
+ "<|d_167|>",
938
+ "<|d_168|>",
939
+ "<|d_169|>",
940
+ "<|d_170|>",
941
+ "<|d_171|>",
942
+ "<|d_172|>",
943
+ "<|d_173|>",
944
+ "<|d_174|>",
945
+ "<|d_175|>",
946
+ "<|d_176|>",
947
+ "<|d_177|>",
948
+ "<|d_178|>",
949
+ "<|d_179|>",
950
+ "<|d_180|>",
951
+ "<|d_181|>",
952
+ "<|d_182|>",
953
+ "<|d_183|>",
954
+ "<|d_184|>",
955
+ "<|d_185|>",
956
+ "<|d_186|>",
957
+ "<|d_187|>",
958
+ "<|d_188|>",
959
+ "<|d_189|>",
960
+ "<|d_190|>",
961
+ "<|d_191|>",
962
+ "<|d_192|>",
963
+ "<|d_193|>",
964
+ "<|d_194|>",
965
+ "<|d_195|>",
966
+ "<|d_196|>",
967
+ "<|d_197|>",
968
+ "<|d_198|>",
969
+ "<|d_199|>",
970
+ "<|d_200|>",
971
+ "<|d_201|>",
972
+ "<|d_202|>",
973
+ "<|d_203|>",
974
+ "<|d_204|>",
975
+ "<|d_205|>",
976
+ "<|d_206|>",
977
+ "<|d_207|>",
978
+ "<|d_208|>",
979
+ "<|d_209|>",
980
+ "<|d_210|>",
981
+ "<|d_211|>",
982
+ "<|d_212|>",
983
+ "<|d_213|>",
984
+ "<|d_214|>",
985
+ "<|d_215|>",
986
+ "<|d_216|>",
987
+ "<|d_217|>",
988
+ "<|d_218|>",
989
+ "<|d_219|>",
990
+ "<|d_220|>",
991
+ "<|d_221|>",
992
+ "<|d_222|>",
993
+ "<|d_223|>",
994
+ "<|d_224|>",
995
+ "<|d_225|>",
996
+ "<|d_226|>",
997
+ "<|d_227|>",
998
+ "<|d_228|>",
999
+ "<|d_229|>",
1000
+ "<|d_230|>",
1001
+ "<|d_231|>",
1002
+ "<|d_232|>",
1003
+ "<|d_233|>",
1004
+ "<|d_234|>",
1005
+ "<|d_235|>",
1006
+ "<|d_236|>",
1007
+ "<|d_237|>",
1008
+ "<|d_238|>",
1009
+ "<|d_239|>",
1010
+ "<|d_240|>",
1011
+ "<|d_241|>",
1012
+ "<|d_242|>",
1013
+ "<|d_243|>",
1014
+ "<|d_244|>",
1015
+ "<|d_245|>",
1016
+ "<|d_246|>",
1017
+ "<|d_247|>",
1018
+ "<|d_248|>",
1019
+ "<|d_249|>",
1020
+ "<|d_250|>",
1021
+ "<|d_251|>",
1022
+ "<|d_252|>",
1023
+ "<|d_253|>",
1024
+ "<|d_254|>",
1025
+ "<|d_255|>",
1026
+ "<|d_256|>",
1027
+ "<|e_1|>",
1028
+ "<|e_2|>",
1029
+ "<|e_3|>",
1030
+ "<|e_4|>",
1031
+ "<|e_5|>",
1032
+ "<|e_6|>",
1033
+ "<|e_7|>",
1034
+ "<|e_8|>",
1035
+ "<|e_9|>",
1036
+ "<|e_10|>",
1037
+ "<|e_11|>",
1038
+ "<|e_12|>",
1039
+ "<|e_13|>",
1040
+ "<|e_14|>",
1041
+ "<|e_15|>",
1042
+ "<|e_16|>",
1043
+ "<|e_17|>",
1044
+ "<|e_18|>",
1045
+ "<|e_19|>",
1046
+ "<|e_20|>",
1047
+ "<|e_21|>",
1048
+ "<|e_22|>",
1049
+ "<|e_23|>",
1050
+ "<|e_24|>",
1051
+ "<|e_25|>",
1052
+ "<|e_26|>",
1053
+ "<|e_27|>",
1054
+ "<|e_28|>",
1055
+ "<|e_29|>",
1056
+ "<|e_30|>",
1057
+ "<|e_31|>",
1058
+ "<|e_32|>",
1059
+ "<|e_33|>",
1060
+ "<|e_34|>",
1061
+ "<|e_35|>",
1062
+ "<|e_36|>",
1063
+ "<|e_37|>",
1064
+ "<|e_38|>",
1065
+ "<|e_39|>",
1066
+ "<|e_40|>",
1067
+ "<|e_41|>",
1068
+ "<|e_42|>",
1069
+ "<|e_43|>",
1070
+ "<|e_44|>",
1071
+ "<|e_45|>",
1072
+ "<|e_46|>",
1073
+ "<|e_47|>",
1074
+ "<|e_48|>",
1075
+ "<|e_49|>",
1076
+ "<|e_50|>",
1077
+ "<|e_51|>",
1078
+ "<|e_52|>",
1079
+ "<|e_53|>",
1080
+ "<|e_54|>",
1081
+ "<|e_55|>",
1082
+ "<|e_56|>",
1083
+ "<|e_57|>",
1084
+ "<|e_58|>",
1085
+ "<|e_59|>",
1086
+ "<|e_60|>",
1087
+ "<|e_61|>",
1088
+ "<|e_62|>",
1089
+ "<|e_63|>",
1090
+ "<|e_64|>",
1091
+ "<|e_65|>",
1092
+ "<|e_66|>",
1093
+ "<|e_67|>",
1094
+ "<|e_68|>",
1095
+ "<|e_69|>",
1096
+ "<|e_70|>",
1097
+ "<|e_71|>",
1098
+ "<|e_72|>",
1099
+ "<|e_73|>",
1100
+ "<|e_74|>",
1101
+ "<|e_75|>",
1102
+ "<|e_76|>",
1103
+ "<|e_77|>",
1104
+ "<|e_78|>",
1105
+ "<|e_79|>",
1106
+ "<|e_80|>",
1107
+ "<|e_81|>",
1108
+ "<|e_82|>",
1109
+ "<|e_83|>",
1110
+ "<|e_84|>",
1111
+ "<|e_85|>",
1112
+ "<|e_86|>",
1113
+ "<|e_87|>",
1114
+ "<|e_88|>",
1115
+ "<|e_89|>",
1116
+ "<|e_90|>",
1117
+ "<|e_91|>",
1118
+ "<|e_92|>",
1119
+ "<|e_93|>",
1120
+ "<|e_94|>",
1121
+ "<|e_95|>",
1122
+ "<|e_96|>",
1123
+ "<|e_97|>",
1124
+ "<|e_98|>",
1125
+ "<|e_99|>",
1126
+ "<|e_100|>",
1127
+ "<|e_101|>",
1128
+ "<|e_102|>",
1129
+ "<|e_103|>",
1130
+ "<|e_104|>",
1131
+ "<|e_105|>",
1132
+ "<|e_106|>",
1133
+ "<|e_107|>",
1134
+ "<|e_108|>",
1135
+ "<|e_109|>",
1136
+ "<|e_110|>",
1137
+ "<|e_111|>",
1138
+ "<|e_112|>",
1139
+ "<|e_113|>",
1140
+ "<|e_114|>",
1141
+ "<|e_115|>",
1142
+ "<|e_116|>",
1143
+ "<|e_117|>",
1144
+ "<|e_118|>",
1145
+ "<|e_119|>",
1146
+ "<|e_120|>",
1147
+ "<|e_121|>",
1148
+ "<|e_122|>",
1149
+ "<|e_123|>",
1150
+ "<|e_124|>",
1151
+ "<|e_125|>",
1152
+ "<|e_126|>",
1153
+ "<|e_127|>",
1154
+ "<|e_128|>",
1155
+ "<|e_129|>",
1156
+ "<|e_130|>",
1157
+ "<|e_131|>",
1158
+ "<|e_132|>",
1159
+ "<|e_133|>",
1160
+ "<|e_134|>",
1161
+ "<|e_135|>",
1162
+ "<|e_136|>",
1163
+ "<|e_137|>",
1164
+ "<|e_138|>",
1165
+ "<|e_139|>",
1166
+ "<|e_140|>",
1167
+ "<|e_141|>",
1168
+ "<|e_142|>",
1169
+ "<|e_143|>",
1170
+ "<|e_144|>",
1171
+ "<|e_145|>",
1172
+ "<|e_146|>",
1173
+ "<|e_147|>",
1174
+ "<|e_148|>",
1175
+ "<|e_149|>",
1176
+ "<|e_150|>",
1177
+ "<|e_151|>",
1178
+ "<|e_152|>",
1179
+ "<|e_153|>",
1180
+ "<|e_154|>",
1181
+ "<|e_155|>",
1182
+ "<|e_156|>",
1183
+ "<|e_157|>",
1184
+ "<|e_158|>",
1185
+ "<|e_159|>",
1186
+ "<|e_160|>",
1187
+ "<|e_161|>",
1188
+ "<|e_162|>",
1189
+ "<|e_163|>",
1190
+ "<|e_164|>",
1191
+ "<|e_165|>",
1192
+ "<|e_166|>",
1193
+ "<|e_167|>",
1194
+ "<|e_168|>",
1195
+ "<|e_169|>",
1196
+ "<|e_170|>",
1197
+ "<|e_171|>",
1198
+ "<|e_172|>",
1199
+ "<|e_173|>",
1200
+ "<|e_174|>",
1201
+ "<|e_175|>",
1202
+ "<|e_176|>",
1203
+ "<|e_177|>",
1204
+ "<|e_178|>",
1205
+ "<|e_179|>",
1206
+ "<|e_180|>",
1207
+ "<|e_181|>",
1208
+ "<|e_182|>",
1209
+ "<|e_183|>",
1210
+ "<|e_184|>",
1211
+ "<|e_185|>",
1212
+ "<|e_186|>",
1213
+ "<|e_187|>",
1214
+ "<|e_188|>",
1215
+ "<|e_189|>",
1216
+ "<|e_190|>",
1217
+ "<|e_191|>",
1218
+ "<|e_192|>",
1219
+ "<|e_193|>",
1220
+ "<|e_194|>",
1221
+ "<|e_195|>",
1222
+ "<|e_196|>",
1223
+ "<|e_197|>",
1224
+ "<|e_198|>",
1225
+ "<|e_199|>",
1226
+ "<|e_200|>",
1227
+ "<|e_201|>",
1228
+ "<|e_202|>",
1229
+ "<|e_203|>",
1230
+ "<|e_204|>",
1231
+ "<|e_205|>",
1232
+ "<|e_206|>",
1233
+ "<|e_207|>",
1234
+ "<|e_208|>",
1235
+ "<|e_209|>",
1236
+ "<|e_210|>",
1237
+ "<|e_211|>",
1238
+ "<|e_212|>",
1239
+ "<|e_213|>",
1240
+ "<|e_214|>",
1241
+ "<|e_215|>",
1242
+ "<|e_216|>",
1243
+ "<|e_217|>",
1244
+ "<|e_218|>",
1245
+ "<|e_219|>",
1246
+ "<|e_220|>",
1247
+ "<|e_221|>",
1248
+ "<|e_222|>",
1249
+ "<|e_223|>",
1250
+ "<|e_224|>",
1251
+ "<|e_225|>",
1252
+ "<|e_226|>",
1253
+ "<|e_227|>",
1254
+ "<|e_228|>",
1255
+ "<|e_229|>",
1256
+ "<|e_230|>",
1257
+ "<|e_231|>",
1258
+ "<|e_232|>",
1259
+ "<|e_233|>",
1260
+ "<|e_234|>",
1261
+ "<|e_235|>",
1262
+ "<|e_236|>",
1263
+ "<|e_237|>",
1264
+ "<|e_238|>",
1265
+ "<|e_239|>",
1266
+ "<|e_240|>",
1267
+ "<|e_241|>",
1268
+ "<|e_242|>",
1269
+ "<|e_243|>",
1270
+ "<|e_244|>",
1271
+ "<|e_245|>",
1272
+ "<|e_246|>",
1273
+ "<|e_247|>",
1274
+ "<|e_248|>",
1275
+ "<|e_249|>",
1276
+ "<|e_250|>",
1277
+ "<|e_251|>",
1278
+ "<|e_252|>",
1279
+ "<|e_253|>",
1280
+ "<|e_254|>",
1281
+ "<|e_255|>",
1282
+ "<|e_256|>"
1283
+ ],
1284
+ "eos_token": {
1285
+ "content": "<|im_end|>",
1286
+ "lstrip": false,
1287
+ "normalized": false,
1288
+ "rstrip": false,
1289
+ "single_word": false
1290
+ },
1291
+ "pad_token": {
1292
+ "content": "<|endoftext|>",
1293
+ "lstrip": false,
1294
+ "normalized": false,
1295
+ "rstrip": false,
1296
+ "single_word": false
1297
+ }
1298
+ }
qwen3_4b_instruments/tokenizer.json ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:3bcd0ab896130d768fa805cf9a6ffeacac3cbb78921e1b68a664da7735314452
3
+ size 11660194
qwen3_4b_instruments/tokenizer_config.json ADDED
The diff for this file is too large to render. See raw diff
 
qwen3_4b_instruments/vocab.json ADDED
The diff for this file is too large to render. See raw diff
 
qwen3_4b_sports/added_tokens.json ADDED
@@ -0,0 +1,1308 @@
1
+ {
2
+ "</think>": 151668,
3
+ "</tool_call>": 151658,
4
+ "</tool_response>": 151666,
5
+ "<think>": 151667,
6
+ "<tool_call>": 151657,
7
+ "<tool_response>": 151665,
8
+ "<|a_100|>": 151768,
9
+ "<|a_101|>": 151769,
10
+ "<|a_102|>": 151770,
11
+ "<|a_103|>": 151771,
12
+ "<|a_104|>": 151772,
13
+ "<|a_105|>": 151773,
14
+ "<|a_106|>": 151774,
15
+ "<|a_107|>": 151775,
16
+ "<|a_108|>": 151776,
17
+ "<|a_109|>": 151777,
18
+ "<|a_10|>": 151678,
19
+ "<|a_110|>": 151778,
20
+ "<|a_111|>": 151779,
21
+ "<|a_112|>": 151780,
22
+ "<|a_113|>": 151781,
23
+ "<|a_114|>": 151782,
24
+ "<|a_115|>": 151783,
25
+ "<|a_116|>": 151784,
26
+ "<|a_117|>": 151785,
27
+ "<|a_118|>": 151786,
28
+ "<|a_119|>": 151787,
29
+ "<|a_11|>": 151679,
30
+ "<|a_120|>": 151788,
31
+ "<|a_121|>": 151789,
32
+ "<|a_122|>": 151790,
33
+ "<|a_123|>": 151791,
34
+ "<|a_124|>": 151792,
35
+ "<|a_125|>": 151793,
36
+ "<|a_126|>": 151794,
37
+ "<|a_127|>": 151795,
38
+ "<|a_128|>": 151796,
39
+ "<|a_129|>": 151797,
40
+ "<|a_12|>": 151680,
41
+ "<|a_130|>": 151798,
42
+ "<|a_131|>": 151799,
43
+ "<|a_132|>": 151800,
44
+ "<|a_133|>": 151801,
45
+ "<|a_134|>": 151802,
46
+ "<|a_135|>": 151803,
47
+ "<|a_136|>": 151804,
48
+ "<|a_137|>": 151805,
49
+ "<|a_138|>": 151806,
50
+ "<|a_139|>": 151807,
51
+ "<|a_13|>": 151681,
52
+ "<|a_140|>": 151808,
53
+ "<|a_141|>": 151809,
54
+ "<|a_142|>": 151810,
55
+ "<|a_143|>": 151811,
56
+ "<|a_144|>": 151812,
57
+ "<|a_145|>": 151813,
58
+ "<|a_146|>": 151814,
59
+ "<|a_147|>": 151815,
60
+ "<|a_148|>": 151816,
61
+ "<|a_149|>": 151817,
62
+ "<|a_14|>": 151682,
63
+ "<|a_150|>": 151818,
64
+ "<|a_151|>": 151819,
65
+ "<|a_152|>": 151820,
66
+ "<|a_153|>": 151821,
67
+ "<|a_154|>": 151822,
68
+ "<|a_155|>": 151823,
69
+ "<|a_156|>": 151824,
70
+ "<|a_157|>": 151825,
71
+ "<|a_158|>": 151826,
72
+ "<|a_159|>": 151827,
73
+ "<|a_15|>": 151683,
74
+ "<|a_160|>": 151828,
75
+ "<|a_161|>": 151829,
76
+ "<|a_162|>": 151830,
77
+ "<|a_163|>": 151831,
78
+ "<|a_164|>": 151832,
79
+ "<|a_165|>": 151833,
80
+ "<|a_166|>": 151834,
81
+ "<|a_167|>": 151835,
82
+ "<|a_168|>": 151836,
83
+ "<|a_169|>": 151837,
84
+ "<|a_16|>": 151684,
85
+ "<|a_170|>": 151838,
86
+ "<|a_171|>": 151839,
87
+ "<|a_172|>": 151840,
88
+ "<|a_173|>": 151841,
89
+ "<|a_174|>": 151842,
90
+ "<|a_175|>": 151843,
91
+ "<|a_176|>": 151844,
92
+ "<|a_177|>": 151845,
93
+ "<|a_178|>": 151846,
94
+ "<|a_179|>": 151847,
95
+ "<|a_17|>": 151685,
96
+ "<|a_180|>": 151848,
97
+ "<|a_181|>": 151849,
98
+ "<|a_182|>": 151850,
99
+ "<|a_183|>": 151851,
100
+ "<|a_184|>": 151852,
101
+ "<|a_185|>": 151853,
102
+ "<|a_186|>": 151854,
103
+ "<|a_187|>": 151855,
104
+ "<|a_188|>": 151856,
105
+ "<|a_189|>": 151857,
106
+ "<|a_18|>": 151686,
107
+ "<|a_190|>": 151858,
108
+ "<|a_191|>": 151859,
109
+ "<|a_192|>": 151860,
110
+ "<|a_193|>": 151861,
111
+ "<|a_194|>": 151862,
112
+ "<|a_195|>": 151863,
113
+ "<|a_196|>": 151864,
114
+ "<|a_197|>": 151865,
115
+ "<|a_198|>": 151866,
116
+ "<|a_199|>": 151867,
117
+ "<|a_19|>": 151687,
118
+ "<|a_1|>": 151669,
119
+ "<|a_200|>": 151868,
120
+ "<|a_201|>": 151869,
121
+ "<|a_202|>": 151870,
122
+ "<|a_203|>": 151871,
123
+ "<|a_204|>": 151872,
124
+ "<|a_205|>": 151873,
125
+ "<|a_206|>": 151874,
126
+ "<|a_207|>": 151875,
127
+ "<|a_208|>": 151876,
128
+ "<|a_209|>": 151877,
129
+ "<|a_20|>": 151688,
130
+ "<|a_210|>": 151878,
131
+ "<|a_211|>": 151879,
132
+ "<|a_212|>": 151880,
133
+ "<|a_213|>": 151881,
134
+ "<|a_214|>": 151882,
135
+ "<|a_215|>": 151883,
136
+ "<|a_216|>": 151884,
137
+ "<|a_217|>": 151885,
138
+ "<|a_218|>": 151886,
139
+ "<|a_219|>": 151887,
140
+ "<|a_21|>": 151689,
141
+ "<|a_220|>": 151888,
142
+ "<|a_221|>": 151889,
143
+ "<|a_222|>": 151890,
144
+ "<|a_223|>": 151891,
145
+ "<|a_224|>": 151892,
146
+ "<|a_225|>": 151893,
147
+ "<|a_226|>": 151894,
148
+ "<|a_227|>": 151895,
149
+ "<|a_228|>": 151896,
150
+ "<|a_229|>": 151897,
151
+ "<|a_22|>": 151690,
152
+ "<|a_230|>": 151898,
153
+ "<|a_231|>": 151899,
154
+ "<|a_232|>": 151900,
155
+ "<|a_233|>": 151901,
156
+ "<|a_234|>": 151902,
157
+ "<|a_235|>": 151903,
158
+ "<|a_236|>": 151904,
159
+ "<|a_237|>": 151905,
160
+ "<|a_238|>": 151906,
161
+ "<|a_239|>": 151907,
162
+ "<|a_23|>": 151691,
163
+ "<|a_240|>": 151908,
164
+ "<|a_241|>": 151909,
165
+ "<|a_242|>": 151910,
166
+ "<|a_243|>": 151911,
167
+ "<|a_244|>": 151912,
168
+ "<|a_245|>": 151913,
169
+ "<|a_246|>": 151914,
170
+ "<|a_247|>": 151915,
171
+ "<|a_248|>": 151916,
172
+ "<|a_249|>": 151917,
173
+ "<|a_24|>": 151692,
174
+ "<|a_250|>": 151918,
175
+ "<|a_251|>": 151919,
176
+ "<|a_252|>": 151920,
177
+ "<|a_253|>": 151921,
178
+ "<|a_254|>": 151922,
179
+ "<|a_255|>": 151923,
180
+ "<|a_256|>": 151924,
181
+ "<|a_25|>": 151693,
182
+ "<|a_26|>": 151694,
183
+ "<|a_27|>": 151695,
184
+ "<|a_28|>": 151696,
185
+ "<|a_29|>": 151697,
186
+ "<|a_2|>": 151670,
187
+ "<|a_30|>": 151698,
188
+ "<|a_31|>": 151699,
189
+ "<|a_32|>": 151700,
190
+ "<|a_33|>": 151701,
191
+ "<|a_34|>": 151702,
192
+ "<|a_35|>": 151703,
193
+ "<|a_36|>": 151704,
194
+ "<|a_37|>": 151705,
195
+ "<|a_38|>": 151706,
196
+ "<|a_39|>": 151707,
197
+ "<|a_3|>": 151671,
198
+ "<|a_40|>": 151708,
199
+ "<|a_41|>": 151709,
200
+ "<|a_42|>": 151710,
201
+ "<|a_43|>": 151711,
202
+ "<|a_44|>": 151712,
203
+ "<|a_45|>": 151713,
204
+ "<|a_46|>": 151714,
205
+ "<|a_47|>": 151715,
206
+ "<|a_48|>": 151716,
207
+ "<|a_49|>": 151717,
208
+ "<|a_4|>": 151672,
209
+ "<|a_50|>": 151718,
210
+ "<|a_51|>": 151719,
211
+ "<|a_52|>": 151720,
212
+ "<|a_53|>": 151721,
213
+ "<|a_54|>": 151722,
214
+ "<|a_55|>": 151723,
215
+ "<|a_56|>": 151724,
216
+ "<|a_57|>": 151725,
217
+ "<|a_58|>": 151726,
218
+ "<|a_59|>": 151727,
219
+ "<|a_5|>": 151673,
220
+ "<|a_60|>": 151728,
221
+ "<|a_61|>": 151729,
222
+ "<|a_62|>": 151730,
223
+ "<|a_63|>": 151731,
224
+ "<|a_64|>": 151732,
225
+ "<|a_65|>": 151733,
226
+ "<|a_66|>": 151734,
227
+ "<|a_67|>": 151735,
228
+ "<|a_68|>": 151736,
229
+ "<|a_69|>": 151737,
230
+ "<|a_6|>": 151674,
231
+ "<|a_70|>": 151738,
232
+ "<|a_71|>": 151739,
233
+ "<|a_72|>": 151740,
234
+ "<|a_73|>": 151741,
235
+ "<|a_74|>": 151742,
236
+ "<|a_75|>": 151743,
237
+ "<|a_76|>": 151744,
238
+ "<|a_77|>": 151745,
239
+ "<|a_78|>": 151746,
240
+ "<|a_79|>": 151747,
241
+ "<|a_7|>": 151675,
242
+ "<|a_80|>": 151748,
243
+ "<|a_81|>": 151749,
244
+ "<|a_82|>": 151750,
245
+ "<|a_83|>": 151751,
246
+ "<|a_84|>": 151752,
247
+ "<|a_85|>": 151753,
248
+ "<|a_86|>": 151754,
249
+ "<|a_87|>": 151755,
250
+ "<|a_88|>": 151756,
251
+ "<|a_89|>": 151757,
252
+ "<|a_8|>": 151676,
253
+ "<|a_90|>": 151758,
254
+ "<|a_91|>": 151759,
255
+ "<|a_92|>": 151760,
256
+ "<|a_93|>": 151761,
257
+ "<|a_94|>": 151762,
258
+ "<|a_95|>": 151763,
259
+ "<|a_96|>": 151764,
260
+ "<|a_97|>": 151765,
261
+ "<|a_98|>": 151766,
262
+ "<|a_99|>": 151767,
263
+ "<|a_9|>": 151677,
264
+ "<|b_100|>": 152024,
265
+ "<|b_101|>": 152025,
266
+ "<|b_102|>": 152026,
267
+ "<|b_103|>": 152027,
268
+ "<|b_104|>": 152028,
269
+ "<|b_105|>": 152029,
270
+ "<|b_106|>": 152030,
271
+ "<|b_107|>": 152031,
272
+ "<|b_108|>": 152032,
273
+ "<|b_109|>": 152033,
274
+ "<|b_10|>": 151934,
275
+ "<|b_110|>": 152034,
276
+ "<|b_111|>": 152035,
277
+ "<|b_112|>": 152036,
278
+ "<|b_113|>": 152037,
279
+ "<|b_114|>": 152038,
280
+ "<|b_115|>": 152039,
281
+ "<|b_116|>": 152040,
282
+ "<|b_117|>": 152041,
283
+ "<|b_118|>": 152042,
284
+ "<|b_119|>": 152043,
285
+ "<|b_11|>": 151935,
286
+ "<|b_120|>": 152044,
287
+ "<|b_121|>": 152045,
288
+ "<|b_122|>": 152046,
289
+ "<|b_123|>": 152047,
290
+ "<|b_124|>": 152048,
291
+ "<|b_125|>": 152049,
292
+ "<|b_126|>": 152050,
293
+ "<|b_127|>": 152051,
294
+ "<|b_128|>": 152052,
295
+ "<|b_129|>": 152053,
296
+ "<|b_12|>": 151936,
297
+ "<|b_130|>": 152054,
298
+ "<|b_131|>": 152055,
299
+ "<|b_132|>": 152056,
300
+ "<|b_133|>": 152057,
301
+ "<|b_134|>": 152058,
302
+ "<|b_135|>": 152059,
303
+ "<|b_136|>": 152060,
304
+ "<|b_137|>": 152061,
305
+ "<|b_138|>": 152062,
306
+ "<|b_139|>": 152063,
307
+ "<|b_13|>": 151937,
308
+ "<|b_140|>": 152064,
309
+ "<|b_141|>": 152065,
310
+ "<|b_142|>": 152066,
311
+ "<|b_143|>": 152067,
312
+ "<|b_144|>": 152068,
313
+ "<|b_145|>": 152069,
314
+ "<|b_146|>": 152070,
315
+ "<|b_147|>": 152071,
316
+ "<|b_148|>": 152072,
317
+ "<|b_149|>": 152073,
318
+ "<|b_14|>": 151938,
319
+ "<|b_150|>": 152074,
320
+ "<|b_151|>": 152075,
321
+ "<|b_152|>": 152076,
322
+ "<|b_153|>": 152077,
323
+ "<|b_154|>": 152078,
324
+ "<|b_155|>": 152079,
325
+ "<|b_156|>": 152080,
326
+ "<|b_157|>": 152081,
327
+ "<|b_158|>": 152082,
328
+ "<|b_159|>": 152083,
329
+ "<|b_15|>": 151939,
330
+ "<|b_160|>": 152084,
331
+ "<|b_161|>": 152085,
332
+ "<|b_162|>": 152086,
333
+ "<|b_163|>": 152087,
334
+ "<|b_164|>": 152088,
335
+ "<|b_165|>": 152089,
336
+ "<|b_166|>": 152090,
337
+ "<|b_167|>": 152091,
338
+ "<|b_168|>": 152092,
339
+ "<|b_169|>": 152093,
340
+ "<|b_16|>": 151940,
341
+ "<|b_170|>": 152094,
342
+ "<|b_171|>": 152095,
343
+ "<|b_172|>": 152096,
344
+ "<|b_173|>": 152097,
345
+ "<|b_174|>": 152098,
346
+ "<|b_175|>": 152099,
347
+ "<|b_176|>": 152100,
348
+ "<|b_177|>": 152101,
349
+ "<|b_178|>": 152102,
350
+ "<|b_179|>": 152103,
351
+ "<|b_17|>": 151941,
352
+ "<|b_180|>": 152104,
353
+ "<|b_181|>": 152105,
354
+ "<|b_182|>": 152106,
355
+ "<|b_183|>": 152107,
356
+ "<|b_184|>": 152108,
357
+ "<|b_185|>": 152109,
358
+ "<|b_186|>": 152110,
359
+ "<|b_187|>": 152111,
360
+ "<|b_188|>": 152112,
361
+ "<|b_189|>": 152113,
362
+ "<|b_18|>": 151942,
363
+ "<|b_190|>": 152114,
364
+ "<|b_191|>": 152115,
365
+ "<|b_192|>": 152116,
366
+ "<|b_193|>": 152117,
367
+ "<|b_194|>": 152118,
368
+ "<|b_195|>": 152119,
369
+ "<|b_196|>": 152120,
370
+ "<|b_197|>": 152121,
371
+ "<|b_198|>": 152122,
372
+ "<|b_199|>": 152123,
373
+ "<|b_19|>": 151943,
374
+ "<|b_1|>": 151925,
375
+ "<|b_200|>": 152124,
376
+ "<|b_201|>": 152125,
377
+ "<|b_202|>": 152126,
378
+ "<|b_203|>": 152127,
379
+ "<|b_204|>": 152128,
380
+ "<|b_205|>": 152129,
381
+ "<|b_206|>": 152130,
382
+ "<|b_207|>": 152131,
383
+ "<|b_208|>": 152132,
384
+ "<|b_209|>": 152133,
385
+ "<|b_20|>": 151944,
386
+ "<|b_210|>": 152134,
387
+ "<|b_211|>": 152135,
388
+ "<|b_212|>": 152136,
389
+ "<|b_213|>": 152137,
390
+ "<|b_214|>": 152138,
391
+ "<|b_215|>": 152139,
392
+ "<|b_216|>": 152140,
393
+ "<|b_217|>": 152141,
394
+ "<|b_218|>": 152142,
395
+ "<|b_219|>": 152143,
396
+ "<|b_21|>": 151945,
397
+ "<|b_220|>": 152144,
398
+ "<|b_221|>": 152145,
399
+ "<|b_222|>": 152146,
400
+ "<|b_223|>": 152147,
401
+ "<|b_224|>": 152148,
402
+ "<|b_225|>": 152149,
403
+ "<|b_226|>": 152150,
404
+ "<|b_227|>": 152151,
405
+ "<|b_228|>": 152152,
406
+ "<|b_229|>": 152153,
407
+ "<|b_22|>": 151946,
408
+ "<|b_230|>": 152154,
409
+ "<|b_231|>": 152155,
410
+ "<|b_232|>": 152156,
411
+ "<|b_233|>": 152157,
412
+ "<|b_234|>": 152158,
413
+ "<|b_235|>": 152159,
414
+ "<|b_236|>": 152160,
415
+ "<|b_237|>": 152161,
416
+ "<|b_238|>": 152162,
417
+ "<|b_239|>": 152163,
418
+ "<|b_23|>": 151947,
419
+ "<|b_240|>": 152164,
420
+ "<|b_241|>": 152165,
421
+ "<|b_242|>": 152166,
422
+ "<|b_243|>": 152167,
423
+ "<|b_244|>": 152168,
424
+ "<|b_245|>": 152169,
425
+ "<|b_246|>": 152170,
426
+ "<|b_247|>": 152171,
427
+ "<|b_248|>": 152172,
428
+ "<|b_249|>": 152173,
429
+ "<|b_24|>": 151948,
430
+ "<|b_250|>": 152174,
431
+ "<|b_251|>": 152175,
432
+ "<|b_252|>": 152176,
433
+ "<|b_253|>": 152177,
434
+ "<|b_254|>": 152178,
435
+ "<|b_255|>": 152179,
436
+ "<|b_256|>": 152180,
437
+ "<|b_25|>": 151949,
438
+ "<|b_26|>": 151950,
439
+ "<|b_27|>": 151951,
440
+ "<|b_28|>": 151952,
441
+ "<|b_29|>": 151953,
442
+ "<|b_2|>": 151926,
443
+ "<|b_30|>": 151954,
444
+ "<|b_31|>": 151955,
445
+ "<|b_32|>": 151956,
446
+ "<|b_33|>": 151957,
447
+ "<|b_34|>": 151958,
448
+ "<|b_35|>": 151959,
449
+ "<|b_36|>": 151960,
450
+ "<|b_37|>": 151961,
451
+ "<|b_38|>": 151962,
452
+ "<|b_39|>": 151963,
453
+ "<|b_3|>": 151927,
454
+ "<|b_40|>": 151964,
455
+ "<|b_41|>": 151965,
456
+ "<|b_42|>": 151966,
457
+ "<|b_43|>": 151967,
458
+ "<|b_44|>": 151968,
459
+ "<|b_45|>": 151969,
460
+ "<|b_46|>": 151970,
461
+ "<|b_47|>": 151971,
462
+ "<|b_48|>": 151972,
463
+ "<|b_49|>": 151973,
464
+ "<|b_4|>": 151928,
465
+ "<|b_50|>": 151974,
466
+ "<|b_51|>": 151975,
467
+ "<|b_52|>": 151976,
468
+ "<|b_53|>": 151977,
469
+ "<|b_54|>": 151978,
470
+ "<|b_55|>": 151979,
471
+ "<|b_56|>": 151980,
472
+ "<|b_57|>": 151981,
473
+ "<|b_58|>": 151982,
474
+ "<|b_59|>": 151983,
475
+ "<|b_5|>": 151929,
476
+ "<|b_60|>": 151984,
477
+ "<|b_61|>": 151985,
478
+ "<|b_62|>": 151986,
479
+ "<|b_63|>": 151987,
480
+ "<|b_64|>": 151988,
481
+ "<|b_65|>": 151989,
482
+ "<|b_66|>": 151990,
483
+ "<|b_67|>": 151991,
484
+ "<|b_68|>": 151992,
485
+ "<|b_69|>": 151993,
486
+ "<|b_6|>": 151930,
487
+ "<|b_70|>": 151994,
488
+ "<|b_71|>": 151995,
489
+ "<|b_72|>": 151996,
490
+ "<|b_73|>": 151997,
491
+ "<|b_74|>": 151998,
492
+ "<|b_75|>": 151999,
493
+ "<|b_76|>": 152000,
494
+ "<|b_77|>": 152001,
495
+ "<|b_78|>": 152002,
496
+ "<|b_79|>": 152003,
497
+ "<|b_7|>": 151931,
498
+ "<|b_80|>": 152004,
499
+ "<|b_81|>": 152005,
500
+ "<|b_82|>": 152006,
501
+ "<|b_83|>": 152007,
502
+ "<|b_84|>": 152008,
503
+ "<|b_85|>": 152009,
504
+ "<|b_86|>": 152010,
505
+ "<|b_87|>": 152011,
506
+ "<|b_88|>": 152012,
507
+ "<|b_89|>": 152013,
508
+ "<|b_8|>": 151932,
509
+ "<|b_90|>": 152014,
510
+ "<|b_91|>": 152015,
511
+ "<|b_92|>": 152016,
512
+ "<|b_93|>": 152017,
513
+ "<|b_94|>": 152018,
514
+ "<|b_95|>": 152019,
515
+ "<|b_96|>": 152020,
516
+ "<|b_97|>": 152021,
517
+ "<|b_98|>": 152022,
518
+ "<|b_99|>": 152023,
519
+ "<|b_9|>": 151933,
520
+ "<|box_end|>": 151649,
521
+ "<|box_start|>": 151648,
522
+ "<|c_100|>": 152280,
523
+ "<|c_101|>": 152281,
524
+ "<|c_102|>": 152282,
525
+ "<|c_103|>": 152283,
526
+ "<|c_104|>": 152284,
527
+ "<|c_105|>": 152285,
528
+ "<|c_106|>": 152286,
529
+ "<|c_107|>": 152287,
530
+ "<|c_108|>": 152288,
531
+ "<|c_109|>": 152289,
532
+ "<|c_10|>": 152190,
533
+ "<|c_110|>": 152290,
534
+ "<|c_111|>": 152291,
535
+ "<|c_112|>": 152292,
536
+ "<|c_113|>": 152293,
537
+ "<|c_114|>": 152294,
538
+ "<|c_115|>": 152295,
539
+ "<|c_116|>": 152296,
540
+ "<|c_117|>": 152297,
541
+ "<|c_118|>": 152298,
542
+ "<|c_119|>": 152299,
543
+ "<|c_11|>": 152191,
544
+ "<|c_120|>": 152300,
545
+ "<|c_121|>": 152301,
546
+ "<|c_122|>": 152302,
547
+ "<|c_123|>": 152303,
548
+ "<|c_124|>": 152304,
549
+ "<|c_125|>": 152305,
550
+ "<|c_126|>": 152306,
551
+ "<|c_127|>": 152307,
552
+ "<|c_128|>": 152308,
553
+ "<|c_129|>": 152309,
554
+ "<|c_12|>": 152192,
555
+ "<|c_130|>": 152310,
556
+ "<|c_131|>": 152311,
557
+ "<|c_132|>": 152312,
558
+ "<|c_133|>": 152313,
559
+ "<|c_134|>": 152314,
560
+ "<|c_135|>": 152315,
561
+ "<|c_136|>": 152316,
562
+ "<|c_137|>": 152317,
563
+ "<|c_138|>": 152318,
564
+ "<|c_139|>": 152319,
565
+ "<|c_13|>": 152193,
566
+ "<|c_140|>": 152320,
567
+ "<|c_141|>": 152321,
568
+ "<|c_142|>": 152322,
569
+ "<|c_143|>": 152323,
570
+ "<|c_144|>": 152324,
571
+ "<|c_145|>": 152325,
572
+ "<|c_146|>": 152326,
573
+ "<|c_147|>": 152327,
574
+ "<|c_148|>": 152328,
575
+ "<|c_149|>": 152329,
576
+ "<|c_14|>": 152194,
577
+ "<|c_150|>": 152330,
578
+ "<|c_151|>": 152331,
579
+ "<|c_152|>": 152332,
580
+ "<|c_153|>": 152333,
581
+ "<|c_154|>": 152334,
582
+ "<|c_155|>": 152335,
583
+ "<|c_156|>": 152336,
584
+ "<|c_157|>": 152337,
585
+ "<|c_158|>": 152338,
586
+ "<|c_159|>": 152339,
587
+ "<|c_15|>": 152195,
588
+ "<|c_160|>": 152340,
589
+ "<|c_161|>": 152341,
590
+ "<|c_162|>": 152342,
591
+ "<|c_163|>": 152343,
592
+ "<|c_164|>": 152344,
593
+ "<|c_165|>": 152345,
594
+ "<|c_166|>": 152346,
595
+ "<|c_167|>": 152347,
596
+ "<|c_168|>": 152348,
597
+ "<|c_169|>": 152349,
598
+ "<|c_16|>": 152196,
599
+ "<|c_170|>": 152350,
600
+ "<|c_171|>": 152351,
601
+ "<|c_172|>": 152352,
602
+ "<|c_173|>": 152353,
603
+ "<|c_174|>": 152354,
604
+ "<|c_175|>": 152355,
605
+ "<|c_176|>": 152356,
606
+ "<|c_177|>": 152357,
607
+ "<|c_178|>": 152358,
608
+ "<|c_179|>": 152359,
609
+ "<|c_17|>": 152197,
610
+ "<|c_180|>": 152360,
611
+ "<|c_181|>": 152361,
612
+ "<|c_182|>": 152362,
613
+ "<|c_183|>": 152363,
614
+ "<|c_184|>": 152364,
615
+ "<|c_185|>": 152365,
616
+ "<|c_186|>": 152366,
617
+ "<|c_187|>": 152367,
618
+ "<|c_188|>": 152368,
619
+ "<|c_189|>": 152369,
620
+ "<|c_18|>": 152198,
621
+ "<|c_190|>": 152370,
622
+ "<|c_191|>": 152371,
623
+ "<|c_192|>": 152372,
624
+ "<|c_193|>": 152373,
625
+ "<|c_194|>": 152374,
626
+ "<|c_195|>": 152375,
627
+ "<|c_196|>": 152376,
628
+ "<|c_197|>": 152377,
629
+ "<|c_198|>": 152378,
630
+ "<|c_199|>": 152379,
631
+ "<|c_19|>": 152199,
632
+ "<|c_1|>": 152181,
633
+ "<|c_200|>": 152380,
634
+ "<|c_201|>": 152381,
635
+ "<|c_202|>": 152382,
636
+ "<|c_203|>": 152383,
637
+ "<|c_204|>": 152384,
638
+ "<|c_205|>": 152385,
639
+ "<|c_206|>": 152386,
640
+ "<|c_207|>": 152387,
641
+ "<|c_208|>": 152388,
642
+ "<|c_209|>": 152389,
643
+ "<|c_20|>": 152200,
644
+ "<|c_210|>": 152390,
645
+ "<|c_211|>": 152391,
646
+ "<|c_212|>": 152392,
647
+ "<|c_213|>": 152393,
648
+ "<|c_214|>": 152394,
649
+ "<|c_215|>": 152395,
650
+ "<|c_216|>": 152396,
651
+ "<|c_217|>": 152397,
652
+ "<|c_218|>": 152398,
653
+ "<|c_219|>": 152399,
654
+ "<|c_21|>": 152201,
655
+ "<|c_220|>": 152400,
656
+ "<|c_221|>": 152401,
657
+ "<|c_222|>": 152402,
658
+ "<|c_223|>": 152403,
659
+ "<|c_224|>": 152404,
660
+ "<|c_225|>": 152405,
661
+ "<|c_226|>": 152406,
662
+ "<|c_227|>": 152407,
663
+ "<|c_228|>": 152408,
664
+ "<|c_229|>": 152409,
665
+ "<|c_22|>": 152202,
666
+ "<|c_230|>": 152410,
667
+ "<|c_231|>": 152411,
668
+ "<|c_232|>": 152412,
669
+ "<|c_233|>": 152413,
670
+ "<|c_234|>": 152414,
671
+ "<|c_235|>": 152415,
672
+ "<|c_236|>": 152416,
673
+ "<|c_237|>": 152417,
674
+ "<|c_238|>": 152418,
675
+ "<|c_239|>": 152419,
676
+ "<|c_23|>": 152203,
677
+ "<|c_240|>": 152420,
678
+ "<|c_241|>": 152421,
679
+ "<|c_242|>": 152422,
680
+ "<|c_243|>": 152423,
681
+ "<|c_244|>": 152424,
682
+ "<|c_245|>": 152425,
683
+ "<|c_246|>": 152426,
684
+ "<|c_247|>": 152427,
685
+ "<|c_248|>": 152428,
686
+ "<|c_249|>": 152429,
687
+ "<|c_24|>": 152204,
688
+ "<|c_250|>": 152430,
689
+ "<|c_251|>": 152431,
690
+ "<|c_252|>": 152432,
691
+ "<|c_253|>": 152433,
692
+ "<|c_254|>": 152434,
693
+ "<|c_255|>": 152435,
694
+ "<|c_256|>": 152436,
695
+ "<|c_25|>": 152205,
696
+ "<|c_26|>": 152206,
697
+ "<|c_27|>": 152207,
698
+ "<|c_28|>": 152208,
699
+ "<|c_29|>": 152209,
700
+ "<|c_2|>": 152182,
701
+ "<|c_30|>": 152210,
702
+ "<|c_31|>": 152211,
703
+ "<|c_32|>": 152212,
704
+ "<|c_33|>": 152213,
705
+ "<|c_34|>": 152214,
706
+ "<|c_35|>": 152215,
707
+ "<|c_36|>": 152216,
708
+ "<|c_37|>": 152217,
709
+ "<|c_38|>": 152218,
710
+ "<|c_39|>": 152219,
711
+ "<|c_3|>": 152183,
712
+ "<|c_40|>": 152220,
713
+ "<|c_41|>": 152221,
714
+ "<|c_42|>": 152222,
715
+ "<|c_43|>": 152223,
716
+ "<|c_44|>": 152224,
717
+ "<|c_45|>": 152225,
718
+ "<|c_46|>": 152226,
719
+ "<|c_47|>": 152227,
720
+ "<|c_48|>": 152228,
721
+ "<|c_49|>": 152229,
722
+ "<|c_4|>": 152184,
723
+ "<|c_50|>": 152230,
724
+ "<|c_51|>": 152231,
725
+ "<|c_52|>": 152232,
726
+ "<|c_53|>": 152233,
727
+ "<|c_54|>": 152234,
728
+ "<|c_55|>": 152235,
729
+ "<|c_56|>": 152236,
730
+ "<|c_57|>": 152237,
731
+ "<|c_58|>": 152238,
732
+ "<|c_59|>": 152239,
733
+ "<|c_5|>": 152185,
734
+ "<|c_60|>": 152240,
735
+ "<|c_61|>": 152241,
736
+ "<|c_62|>": 152242,
737
+ "<|c_63|>": 152243,
738
+ "<|c_64|>": 152244,
739
+ "<|c_65|>": 152245,
740
+ "<|c_66|>": 152246,
741
+ "<|c_67|>": 152247,
742
+ "<|c_68|>": 152248,
743
+ "<|c_69|>": 152249,
744
+ "<|c_6|>": 152186,
745
+ "<|c_70|>": 152250,
746
+ "<|c_71|>": 152251,
747
+ "<|c_72|>": 152252,
748
+ "<|c_73|>": 152253,
749
+ "<|c_74|>": 152254,
750
+ "<|c_75|>": 152255,
751
+ "<|c_76|>": 152256,
752
+ "<|c_77|>": 152257,
753
+ "<|c_78|>": 152258,
754
+ "<|c_79|>": 152259,
755
+ "<|c_7|>": 152187,
756
+ "<|c_80|>": 152260,
757
+ "<|c_81|>": 152261,
758
+ "<|c_82|>": 152262,
759
+ "<|c_83|>": 152263,
760
+ "<|c_84|>": 152264,
761
+ "<|c_85|>": 152265,
762
+ "<|c_86|>": 152266,
763
+ "<|c_87|>": 152267,
764
+ "<|c_88|>": 152268,
765
+ "<|c_89|>": 152269,
766
+ "<|c_8|>": 152188,
767
+ "<|c_90|>": 152270,
768
+ "<|c_91|>": 152271,
769
+ "<|c_92|>": 152272,
770
+ "<|c_93|>": 152273,
771
+ "<|c_94|>": 152274,
772
+ "<|c_95|>": 152275,
773
+ "<|c_96|>": 152276,
774
+ "<|c_97|>": 152277,
775
+ "<|c_98|>": 152278,
776
+ "<|c_99|>": 152279,
777
+ "<|c_9|>": 152189,
778
+ "<|d_100|>": 152536,
779
+ "<|d_101|>": 152537,
780
+ "<|d_102|>": 152538,
781
+ "<|d_103|>": 152539,
782
+ "<|d_104|>": 152540,
783
+ "<|d_105|>": 152541,
784
+ "<|d_106|>": 152542,
785
+ "<|d_107|>": 152543,
786
+ "<|d_108|>": 152544,
787
+ "<|d_109|>": 152545,
788
+ "<|d_10|>": 152446,
789
+ "<|d_110|>": 152546,
790
+ "<|d_111|>": 152547,
791
+ "<|d_112|>": 152548,
792
+ "<|d_113|>": 152549,
793
+ "<|d_114|>": 152550,
794
+ "<|d_115|>": 152551,
795
+ "<|d_116|>": 152552,
796
+ "<|d_117|>": 152553,
797
+ "<|d_118|>": 152554,
798
+ "<|d_119|>": 152555,
799
+ "<|d_11|>": 152447,
800
+ "<|d_120|>": 152556,
801
+ "<|d_121|>": 152557,
802
+ "<|d_122|>": 152558,
803
+ "<|d_123|>": 152559,
804
+ "<|d_124|>": 152560,
805
+ "<|d_125|>": 152561,
806
+ "<|d_126|>": 152562,
807
+ "<|d_127|>": 152563,
808
+ "<|d_128|>": 152564,
809
+ "<|d_129|>": 152565,
810
+ "<|d_12|>": 152448,
811
+ "<|d_130|>": 152566,
812
+ "<|d_131|>": 152567,
813
+ "<|d_132|>": 152568,
814
+ "<|d_133|>": 152569,
815
+ "<|d_134|>": 152570,
816
+ "<|d_135|>": 152571,
817
+ "<|d_136|>": 152572,
818
+ "<|d_137|>": 152573,
819
+ "<|d_138|>": 152574,
820
+ "<|d_139|>": 152575,
821
+ "<|d_13|>": 152449,
822
+ "<|d_140|>": 152576,
823
+ "<|d_141|>": 152577,
824
+ "<|d_142|>": 152578,
825
+ "<|d_143|>": 152579,
826
+ "<|d_144|>": 152580,
827
+ "<|d_145|>": 152581,
828
+ "<|d_146|>": 152582,
829
+ "<|d_147|>": 152583,
830
+ "<|d_148|>": 152584,
831
+ "<|d_149|>": 152585,
832
+ "<|d_14|>": 152450,
833
+ "<|d_150|>": 152586,
834
+ "<|d_151|>": 152587,
835
+ "<|d_152|>": 152588,
836
+ "<|d_153|>": 152589,
837
+ "<|d_154|>": 152590,
838
+ "<|d_155|>": 152591,
839
+ "<|d_156|>": 152592,
840
+ "<|d_157|>": 152593,
841
+ "<|d_158|>": 152594,
842
+ "<|d_159|>": 152595,
843
+ "<|d_15|>": 152451,
844
+ "<|d_160|>": 152596,
845
+ "<|d_161|>": 152597,
846
+ "<|d_162|>": 152598,
847
+ "<|d_163|>": 152599,
848
+ "<|d_164|>": 152600,
849
+ "<|d_165|>": 152601,
850
+ "<|d_166|>": 152602,
851
+ "<|d_167|>": 152603,
852
+ "<|d_168|>": 152604,
853
+ "<|d_169|>": 152605,
854
+ "<|d_16|>": 152452,
855
+ "<|d_170|>": 152606,
856
+ "<|d_171|>": 152607,
857
+ "<|d_172|>": 152608,
858
+ "<|d_173|>": 152609,
859
+ "<|d_174|>": 152610,
860
+ "<|d_175|>": 152611,
861
+ "<|d_176|>": 152612,
862
+ "<|d_177|>": 152613,
863
+ "<|d_178|>": 152614,
864
+ "<|d_179|>": 152615,
865
+ "<|d_17|>": 152453,
866
+ "<|d_180|>": 152616,
867
+ "<|d_181|>": 152617,
868
+ "<|d_182|>": 152618,
869
+ "<|d_183|>": 152619,
870
+ "<|d_184|>": 152620,
871
+ "<|d_185|>": 152621,
872
+ "<|d_186|>": 152622,
873
+ "<|d_187|>": 152623,
874
+ "<|d_188|>": 152624,
875
+ "<|d_189|>": 152625,
876
+ "<|d_18|>": 152454,
877
+ "<|d_190|>": 152626,
878
+ "<|d_191|>": 152627,
879
+ "<|d_192|>": 152628,
880
+ "<|d_193|>": 152629,
881
+ "<|d_194|>": 152630,
882
+ "<|d_195|>": 152631,
883
+ "<|d_196|>": 152632,
884
+ "<|d_197|>": 152633,
885
+ "<|d_198|>": 152634,
886
+ "<|d_199|>": 152635,
887
+ "<|d_19|>": 152455,
888
+ "<|d_1|>": 152437,
889
+ "<|d_200|>": 152636,
890
+ "<|d_201|>": 152637,
891
+ "<|d_202|>": 152638,
892
+ "<|d_203|>": 152639,
893
+ "<|d_204|>": 152640,
894
+ "<|d_205|>": 152641,
895
+ "<|d_206|>": 152642,
896
+ "<|d_207|>": 152643,
897
+ "<|d_208|>": 152644,
898
+ "<|d_209|>": 152645,
899
+ "<|d_20|>": 152456,
900
+ "<|d_210|>": 152646,
901
+ "<|d_211|>": 152647,
902
+ "<|d_212|>": 152648,
903
+ "<|d_213|>": 152649,
904
+ "<|d_214|>": 152650,
905
+ "<|d_215|>": 152651,
906
+ "<|d_216|>": 152652,
907
+ "<|d_217|>": 152653,
908
+ "<|d_218|>": 152654,
909
+ "<|d_219|>": 152655,
910
+ "<|d_21|>": 152457,
911
+ "<|d_220|>": 152656,
912
+ "<|d_221|>": 152657,
913
+ "<|d_222|>": 152658,
914
+ "<|d_223|>": 152659,
915
+ "<|d_224|>": 152660,
916
+ "<|d_225|>": 152661,
917
+ "<|d_226|>": 152662,
918
+ "<|d_227|>": 152663,
919
+ "<|d_228|>": 152664,
920
+ "<|d_229|>": 152665,
921
+ "<|d_22|>": 152458,
922
+ "<|d_230|>": 152666,
923
+ "<|d_231|>": 152667,
924
+ "<|d_232|>": 152668,
925
+ "<|d_233|>": 152669,
926
+ "<|d_234|>": 152670,
927
+ "<|d_235|>": 152671,
928
+ "<|d_236|>": 152672,
929
+ "<|d_237|>": 152673,
930
+ "<|d_238|>": 152674,
931
+ "<|d_239|>": 152675,
932
+ "<|d_23|>": 152459,
933
+ "<|d_240|>": 152676,
934
+ "<|d_241|>": 152677,
935
+ "<|d_242|>": 152678,
936
+ "<|d_243|>": 152679,
937
+ "<|d_244|>": 152680,
938
+ "<|d_245|>": 152681,
939
+ "<|d_246|>": 152682,
940
+ "<|d_247|>": 152683,
941
+ "<|d_248|>": 152684,
942
+ "<|d_249|>": 152685,
943
+ "<|d_24|>": 152460,
944
+ "<|d_250|>": 152686,
945
+ "<|d_251|>": 152687,
946
+ "<|d_252|>": 152688,
947
+ "<|d_253|>": 152689,
948
+ "<|d_254|>": 152690,
949
+ "<|d_255|>": 152691,
950
+ "<|d_256|>": 152692,
951
+ "<|d_25|>": 152461,
952
+ "<|d_26|>": 152462,
953
+ "<|d_27|>": 152463,
954
+ "<|d_28|>": 152464,
955
+ "<|d_29|>": 152465,
956
+ "<|d_2|>": 152438,
957
+ "<|d_30|>": 152466,
958
+ "<|d_31|>": 152467,
959
+ "<|d_32|>": 152468,
960
+ "<|d_33|>": 152469,
961
+ "<|d_34|>": 152470,
962
+ "<|d_35|>": 152471,
963
+ "<|d_36|>": 152472,
964
+ "<|d_37|>": 152473,
965
+ "<|d_38|>": 152474,
966
+ "<|d_39|>": 152475,
967
+ "<|d_3|>": 152439,
968
+ "<|d_40|>": 152476,
969
+ "<|d_41|>": 152477,
970
+ "<|d_42|>": 152478,
971
+ "<|d_43|>": 152479,
972
+ "<|d_44|>": 152480,
973
+ "<|d_45|>": 152481,
974
+ "<|d_46|>": 152482,
975
+ "<|d_47|>": 152483,
976
+ "<|d_48|>": 152484,
977
+ "<|d_49|>": 152485,
978
+ "<|d_4|>": 152440,
979
+ "<|d_50|>": 152486,
980
+ "<|d_51|>": 152487,
981
+ "<|d_52|>": 152488,
982
+ "<|d_53|>": 152489,
983
+ "<|d_54|>": 152490,
984
+ "<|d_55|>": 152491,
985
+ "<|d_56|>": 152492,
986
+ "<|d_57|>": 152493,
987
+ "<|d_58|>": 152494,
988
+ "<|d_59|>": 152495,
989
+ "<|d_5|>": 152441,
990
+ "<|d_60|>": 152496,
991
+ "<|d_61|>": 152497,
992
+ "<|d_62|>": 152498,
993
+ "<|d_63|>": 152499,
994
+ "<|d_64|>": 152500,
995
+ "<|d_65|>": 152501,
996
+ "<|d_66|>": 152502,
997
+ "<|d_67|>": 152503,
998
+ "<|d_68|>": 152504,
999
+ "<|d_69|>": 152505,
1000
+ "<|d_6|>": 152442,
1001
+ "<|d_70|>": 152506,
1002
+ "<|d_71|>": 152507,
1003
+ "<|d_72|>": 152508,
1004
+ "<|d_73|>": 152509,
1005
+ "<|d_74|>": 152510,
1006
+ "<|d_75|>": 152511,
1007
+ "<|d_76|>": 152512,
1008
+ "<|d_77|>": 152513,
1009
+ "<|d_78|>": 152514,
1010
+ "<|d_79|>": 152515,
1011
+ "<|d_7|>": 152443,
1012
+ "<|d_80|>": 152516,
1013
+ "<|d_81|>": 152517,
1014
+ "<|d_82|>": 152518,
1015
+ "<|d_83|>": 152519,
1016
+ "<|d_84|>": 152520,
1017
+ "<|d_85|>": 152521,
1018
+ "<|d_86|>": 152522,
1019
+ "<|d_87|>": 152523,
1020
+ "<|d_88|>": 152524,
1021
+ "<|d_89|>": 152525,
1022
+ "<|d_8|>": 152444,
1023
+ "<|d_90|>": 152526,
1024
+ "<|d_91|>": 152527,
1025
+ "<|d_92|>": 152528,
1026
+ "<|d_93|>": 152529,
1027
+ "<|d_94|>": 152530,
1028
+ "<|d_95|>": 152531,
1029
+ "<|d_96|>": 152532,
1030
+ "<|d_97|>": 152533,
1031
+ "<|d_98|>": 152534,
1032
+ "<|d_99|>": 152535,
1033
+ "<|d_9|>": 152445,
1034
+ "<|e_100|>": 152792,
1035
+ "<|e_101|>": 152793,
1036
+ "<|e_102|>": 152794,
1037
+ "<|e_103|>": 152795,
1038
+ "<|e_104|>": 152796,
1039
+ "<|e_105|>": 152797,
1040
+ "<|e_106|>": 152798,
1041
+ "<|e_107|>": 152799,
1042
+ "<|e_108|>": 152800,
1043
+ "<|e_109|>": 152801,
1044
+ "<|e_10|>": 152702,
1045
+ "<|e_110|>": 152802,
1046
+ "<|e_111|>": 152803,
1047
+ "<|e_112|>": 152804,
1048
+ "<|e_113|>": 152805,
1049
+ "<|e_114|>": 152806,
1050
+ "<|e_115|>": 152807,
1051
+ "<|e_116|>": 152808,
1052
+ "<|e_117|>": 152809,
1053
+ "<|e_118|>": 152810,
1054
+ "<|e_119|>": 152811,
1055
+ "<|e_11|>": 152703,
1056
+ "<|e_120|>": 152812,
1057
+ "<|e_121|>": 152813,
1058
+ "<|e_122|>": 152814,
1059
+ "<|e_123|>": 152815,
1060
+ "<|e_124|>": 152816,
1061
+ "<|e_125|>": 152817,
1062
+ "<|e_126|>": 152818,
1063
+ "<|e_127|>": 152819,
1064
+ "<|e_128|>": 152820,
1065
+ "<|e_129|>": 152821,
1066
+ "<|e_12|>": 152704,
1067
+ "<|e_130|>": 152822,
1068
+ "<|e_131|>": 152823,
1069
+ "<|e_132|>": 152824,
1070
+ "<|e_133|>": 152825,
1071
+ "<|e_134|>": 152826,
1072
+ "<|e_135|>": 152827,
1073
+ "<|e_136|>": 152828,
1074
+ "<|e_137|>": 152829,
1075
+ "<|e_138|>": 152830,
1076
+ "<|e_139|>": 152831,
1077
+ "<|e_13|>": 152705,
1078
+ "<|e_140|>": 152832,
1079
+ "<|e_141|>": 152833,
1080
+ "<|e_142|>": 152834,
1081
+ "<|e_143|>": 152835,
1082
+ "<|e_144|>": 152836,
1083
+ "<|e_145|>": 152837,
1084
+ "<|e_146|>": 152838,
1085
+ "<|e_147|>": 152839,
1086
+ "<|e_148|>": 152840,
1087
+ "<|e_149|>": 152841,
1088
+ "<|e_14|>": 152706,
1089
+ "<|e_150|>": 152842,
1090
+ "<|e_151|>": 152843,
1091
+ "<|e_152|>": 152844,
1092
+ "<|e_153|>": 152845,
1093
+ "<|e_154|>": 152846,
1094
+ "<|e_155|>": 152847,
1095
+ "<|e_156|>": 152848,
1096
+ "<|e_157|>": 152849,
1097
+ "<|e_158|>": 152850,
1098
+ "<|e_159|>": 152851,
1099
+ "<|e_15|>": 152707,
1100
+ "<|e_160|>": 152852,
1101
+ "<|e_161|>": 152853,
1102
+ "<|e_162|>": 152854,
1103
+ "<|e_163|>": 152855,
1104
+ "<|e_164|>": 152856,
1105
+ "<|e_165|>": 152857,
1106
+ "<|e_166|>": 152858,
1107
+ "<|e_167|>": 152859,
1108
+ "<|e_168|>": 152860,
1109
+ "<|e_169|>": 152861,
1110
+ "<|e_16|>": 152708,
1111
+ "<|e_170|>": 152862,
1112
+ "<|e_171|>": 152863,
1113
+ "<|e_172|>": 152864,
1114
+ "<|e_173|>": 152865,
1115
+ "<|e_174|>": 152866,
1116
+ "<|e_175|>": 152867,
1117
+ "<|e_176|>": 152868,
1118
+ "<|e_177|>": 152869,
1119
+ "<|e_178|>": 152870,
1120
+ "<|e_179|>": 152871,
1121
+ "<|e_17|>": 152709,
1122
+ "<|e_180|>": 152872,
1123
+ "<|e_181|>": 152873,
1124
+ "<|e_182|>": 152874,
1125
+ "<|e_183|>": 152875,
1126
+ "<|e_184|>": 152876,
1127
+ "<|e_185|>": 152877,
1128
+ "<|e_186|>": 152878,
1129
+ "<|e_187|>": 152879,
1130
+ "<|e_188|>": 152880,
1131
+ "<|e_189|>": 152881,
1132
+ "<|e_18|>": 152710,
1133
+ "<|e_190|>": 152882,
1134
+ "<|e_191|>": 152883,
1135
+ "<|e_192|>": 152884,
1136
+ "<|e_193|>": 152885,
1137
+ "<|e_194|>": 152886,
1138
+ "<|e_195|>": 152887,
1139
+ "<|e_196|>": 152888,
1140
+ "<|e_197|>": 152889,
1141
+ "<|e_198|>": 152890,
1142
+ "<|e_199|>": 152891,
1143
+ "<|e_19|>": 152711,
1144
+ "<|e_1|>": 152693,
1145
+ "<|e_200|>": 152892,
1146
+ "<|e_201|>": 152893,
1147
+ "<|e_202|>": 152894,
1148
+ "<|e_203|>": 152895,
1149
+ "<|e_204|>": 152896,
1150
+ "<|e_205|>": 152897,
1151
+ "<|e_206|>": 152898,
1152
+ "<|e_207|>": 152899,
1153
+ "<|e_208|>": 152900,
1154
+ "<|e_209|>": 152901,
1155
+ "<|e_20|>": 152712,
1156
+ "<|e_210|>": 152902,
1157
+ "<|e_211|>": 152903,
1158
+ "<|e_212|>": 152904,
1159
+ "<|e_213|>": 152905,
1160
+ "<|e_214|>": 152906,
1161
+ "<|e_215|>": 152907,
1162
+ "<|e_216|>": 152908,
1163
+ "<|e_217|>": 152909,
1164
+ "<|e_218|>": 152910,
1165
+ "<|e_219|>": 152911,
1166
+ "<|e_21|>": 152713,
1167
+ "<|e_220|>": 152912,
1168
+ "<|e_221|>": 152913,
1169
+ "<|e_222|>": 152914,
1170
+ "<|e_223|>": 152915,
1171
+ "<|e_224|>": 152916,
1172
+ "<|e_225|>": 152917,
1173
+ "<|e_226|>": 152918,
1174
+ "<|e_227|>": 152919,
1175
+ "<|e_228|>": 152920,
1176
+ "<|e_229|>": 152921,
1177
+ "<|e_22|>": 152714,
1178
+ "<|e_230|>": 152922,
1179
+ "<|e_231|>": 152923,
1180
+ "<|e_232|>": 152924,
1181
+ "<|e_233|>": 152925,
1182
+ "<|e_234|>": 152926,
1183
+ "<|e_235|>": 152927,
1184
+ "<|e_236|>": 152928,
1185
+ "<|e_237|>": 152929,
1186
+ "<|e_238|>": 152930,
1187
+ "<|e_239|>": 152931,
1188
+ "<|e_23|>": 152715,
1189
+ "<|e_240|>": 152932,
1190
+ "<|e_241|>": 152933,
1191
+ "<|e_242|>": 152934,
1192
+ "<|e_243|>": 152935,
1193
+ "<|e_244|>": 152936,
1194
+ "<|e_245|>": 152937,
1195
+ "<|e_246|>": 152938,
1196
+ "<|e_247|>": 152939,
1197
+ "<|e_248|>": 152940,
1198
+ "<|e_249|>": 152941,
1199
+ "<|e_24|>": 152716,
1200
+ "<|e_250|>": 152942,
1201
+ "<|e_251|>": 152943,
1202
+ "<|e_252|>": 152944,
1203
+ "<|e_253|>": 152945,
1204
+ "<|e_254|>": 152946,
1205
+ "<|e_255|>": 152947,
1206
+ "<|e_256|>": 152948,
1207
+ "<|e_25|>": 152717,
1208
+ "<|e_26|>": 152718,
1209
+ "<|e_27|>": 152719,
1210
+ "<|e_28|>": 152720,
1211
+ "<|e_29|>": 152721,
1212
+ "<|e_2|>": 152694,
1213
+ "<|e_30|>": 152722,
1214
+ "<|e_31|>": 152723,
1215
+ "<|e_32|>": 152724,
1216
+ "<|e_33|>": 152725,
1217
+ "<|e_34|>": 152726,
1218
+ "<|e_35|>": 152727,
1219
+ "<|e_36|>": 152728,
1220
+ "<|e_37|>": 152729,
1221
+ "<|e_38|>": 152730,
1222
+ "<|e_39|>": 152731,
1223
+ "<|e_3|>": 152695,
1224
+ "<|e_40|>": 152732,
1225
+ "<|e_41|>": 152733,
1226
+ "<|e_42|>": 152734,
1227
+ "<|e_43|>": 152735,
1228
+ "<|e_44|>": 152736,
1229
+ "<|e_45|>": 152737,
1230
+ "<|e_46|>": 152738,
1231
+ "<|e_47|>": 152739,
1232
+ "<|e_48|>": 152740,
1233
+ "<|e_49|>": 152741,
1234
+ "<|e_4|>": 152696,
1235
+ "<|e_50|>": 152742,
1236
+ "<|e_51|>": 152743,
1237
+ "<|e_52|>": 152744,
1238
+ "<|e_53|>": 152745,
1239
+ "<|e_54|>": 152746,
1240
+ "<|e_55|>": 152747,
1241
+ "<|e_56|>": 152748,
1242
+ "<|e_57|>": 152749,
1243
+ "<|e_58|>": 152750,
1244
+ "<|e_59|>": 152751,
1245
+ "<|e_5|>": 152697,
1246
+ "<|e_60|>": 152752,
1247
+ "<|e_61|>": 152753,
1248
+ "<|e_62|>": 152754,
1249
+ "<|e_63|>": 152755,
1250
+ "<|e_64|>": 152756,
1251
+ "<|e_65|>": 152757,
1252
+ "<|e_66|>": 152758,
1253
+ "<|e_67|>": 152759,
1254
+ "<|e_68|>": 152760,
1255
+ "<|e_69|>": 152761,
1256
+ "<|e_6|>": 152698,
1257
+ "<|e_70|>": 152762,
1258
+ "<|e_71|>": 152763,
1259
+ "<|e_72|>": 152764,
1260
+ "<|e_73|>": 152765,
1261
+ "<|e_74|>": 152766,
1262
+ "<|e_75|>": 152767,
1263
+ "<|e_76|>": 152768,
1264
+ "<|e_77|>": 152769,
1265
+ "<|e_78|>": 152770,
1266
+ "<|e_79|>": 152771,
1267
+ "<|e_7|>": 152699,
1268
+ "<|e_80|>": 152772,
1269
+ "<|e_81|>": 152773,
1270
+ "<|e_82|>": 152774,
1271
+ "<|e_83|>": 152775,
1272
+ "<|e_84|>": 152776,
1273
+ "<|e_85|>": 152777,
1274
+ "<|e_86|>": 152778,
1275
+ "<|e_87|>": 152779,
1276
+ "<|e_88|>": 152780,
1277
+ "<|e_89|>": 152781,
1278
+ "<|e_8|>": 152700,
1279
+ "<|e_90|>": 152782,
1280
+ "<|e_91|>": 152783,
1281
+ "<|e_92|>": 152784,
1282
+ "<|e_93|>": 152785,
1283
+ "<|e_94|>": 152786,
1284
+ "<|e_95|>": 152787,
1285
+ "<|e_96|>": 152788,
1286
+ "<|e_97|>": 152789,
1287
+ "<|e_98|>": 152790,
1288
+ "<|e_99|>": 152791,
1289
+ "<|e_9|>": 152701,
1290
+ "<|endoftext|>": 151643,
1291
+ "<|file_sep|>": 151664,
1292
+ "<|fim_middle|>": 151660,
1293
+ "<|fim_pad|>": 151662,
1294
+ "<|fim_prefix|>": 151659,
1295
+ "<|fim_suffix|>": 151661,
1296
+ "<|im_end|>": 151645,
1297
+ "<|im_start|>": 151644,
1298
+ "<|image_pad|>": 151655,
1299
+ "<|object_ref_end|>": 151647,
1300
+ "<|object_ref_start|>": 151646,
1301
+ "<|quad_end|>": 151651,
1302
+ "<|quad_start|>": 151650,
1303
+ "<|repo_name|>": 151663,
1304
+ "<|video_pad|>": 151656,
1305
+ "<|vision_end|>": 151653,
1306
+ "<|vision_pad|>": 151654,
1307
+ "<|vision_start|>": 151652
1308
+ }
qwen3_4b_sports/config.json ADDED
@@ -0,0 +1,30 @@
+ {
+ "architectures": [
+ "Qwen3ForCausalLM"
+ ],
+ "attention_bias": false,
+ "attention_dropout": 0.0,
+ "eos_token_id": 151645,
+ "head_dim": 128,
+ "hidden_act": "silu",
+ "hidden_size": 2560,
+ "initializer_range": 0.02,
+ "intermediate_size": 9728,
+ "max_position_embeddings": 262144,
+ "max_window_layers": 36,
+ "model_type": "qwen3",
+ "num_attention_heads": 32,
+ "num_hidden_layers": 36,
+ "num_key_value_heads": 8,
+ "pad_token_id": 151643,
+ "rms_norm_eps": 1e-06,
+ "rope_scaling": null,
+ "rope_theta": 5000000,
+ "sliding_window": null,
+ "tie_word_embeddings": true,
+ "torch_dtype": "bfloat16",
+ "transformers_version": "4.51.1",
+ "use_cache": false,
+ "use_sliding_window": false,
+ "vocab_size": 152949
+ }
qwen3_4b_sports/generation_config.json ADDED
@@ -0,0 +1,13 @@
+ {
+ "bos_token_id": 151643,
+ "do_sample": true,
+ "eos_token_id": [
+ 151645,
+ 151643
+ ],
+ "pad_token_id": 151643,
+ "temperature": 0.7,
+ "top_k": 20,
+ "top_p": 0.8,
+ "transformers_version": "4.51.1"
+ }
qwen3_4b_sports/merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
qwen3_4b_sports/model-00001-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e39218ddcacaa49a6e8b0a1a7b568bfae81c361c5aad08075554ea708b208744
+ size 4476896280
qwen3_4b_sports/model-00002-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c40fc52a8f52cd54f874afe95dde83e70632bc3ba78249e52b6cd0763e3074d1
+ size 4356371336
qwen3_4b_sports/model.safetensors.index.json ADDED
@@ -0,0 +1,406 @@
+ {
+ "metadata": {
+ "total_size": 8833221632
+ },
+ "weight_map": {
+ "lm_head.weight": "model-00001-of-00002.safetensors",
+ "model.embed_tokens.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.1.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.10.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.10.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.10.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.11.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.11.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.11.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.11.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.11.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.12.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.13.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.13.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.13.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.13.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.14.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.15.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.15.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.15.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.15.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.16.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.16.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.17.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.18.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.18.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.18.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.18.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.19.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.19.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.20.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
153
+ "model.layers.20.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
154
+ "model.layers.20.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
155
+ "model.layers.20.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
156
+ "model.layers.20.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
157
+ "model.layers.20.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
158
+ "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
159
+ "model.layers.20.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
160
+ "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
161
+ "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
162
+ "model.layers.21.input_layernorm.weight": "model-00002-of-00002.safetensors",
163
+ "model.layers.21.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
164
+ "model.layers.21.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
165
+ "model.layers.21.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
166
+ "model.layers.21.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
167
+ "model.layers.21.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
168
+ "model.layers.21.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
169
+ "model.layers.21.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
170
+ "model.layers.21.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
171
+ "model.layers.21.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
172
+ "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
173
+ "model.layers.22.input_layernorm.weight": "model-00001-of-00002.safetensors",
174
+ "model.layers.22.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
175
+ "model.layers.22.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
176
+ "model.layers.22.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
177
+ "model.layers.22.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
178
+ "model.layers.22.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
179
+ "model.layers.22.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
180
+ "model.layers.22.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
181
+ "model.layers.22.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
182
+ "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
183
+ "model.layers.22.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
184
+ "model.layers.23.input_layernorm.weight": "model-00001-of-00002.safetensors",
185
+ "model.layers.23.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
186
+ "model.layers.23.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
187
+ "model.layers.23.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
188
+ "model.layers.23.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
189
+ "model.layers.23.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
190
+ "model.layers.23.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
191
+ "model.layers.23.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
192
+ "model.layers.23.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
193
+ "model.layers.23.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
194
+ "model.layers.23.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
195
+ "model.layers.24.input_layernorm.weight": "model-00002-of-00002.safetensors",
196
+ "model.layers.24.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
197
+ "model.layers.24.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
198
+ "model.layers.24.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
199
+ "model.layers.24.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
200
+ "model.layers.24.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
201
+ "model.layers.24.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
202
+ "model.layers.24.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
203
+ "model.layers.24.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
204
+ "model.layers.24.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
205
+ "model.layers.24.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
206
+ "model.layers.25.input_layernorm.weight": "model-00002-of-00002.safetensors",
207
+ "model.layers.25.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
208
+ "model.layers.25.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
209
+ "model.layers.25.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
210
+ "model.layers.25.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
211
+ "model.layers.25.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
212
+ "model.layers.25.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
213
+ "model.layers.25.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
214
+ "model.layers.25.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
215
+ "model.layers.25.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
216
+ "model.layers.25.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
217
+ "model.layers.26.input_layernorm.weight": "model-00002-of-00002.safetensors",
218
+ "model.layers.26.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
219
+ "model.layers.26.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
220
+ "model.layers.26.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
221
+ "model.layers.26.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
222
+ "model.layers.26.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
223
+ "model.layers.26.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
224
+ "model.layers.26.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
225
+ "model.layers.26.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
226
+ "model.layers.26.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
227
+ "model.layers.26.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
228
+ "model.layers.27.input_layernorm.weight": "model-00001-of-00002.safetensors",
229
+ "model.layers.27.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
230
+ "model.layers.27.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
231
+ "model.layers.27.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
232
+ "model.layers.27.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
233
+ "model.layers.27.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
234
+ "model.layers.27.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
235
+ "model.layers.27.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
236
+ "model.layers.27.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
237
+ "model.layers.27.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
238
+ "model.layers.27.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
239
+ "model.layers.28.input_layernorm.weight": "model-00002-of-00002.safetensors",
240
+ "model.layers.28.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
241
+ "model.layers.28.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
242
+ "model.layers.28.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
243
+ "model.layers.28.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
244
+ "model.layers.28.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
245
+ "model.layers.28.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
246
+ "model.layers.28.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
247
+ "model.layers.28.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
248
+ "model.layers.28.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
249
+ "model.layers.28.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
250
+ "model.layers.29.input_layernorm.weight": "model-00002-of-00002.safetensors",
251
+ "model.layers.29.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
252
+ "model.layers.29.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
253
+ "model.layers.29.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
254
+ "model.layers.29.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
255
+ "model.layers.29.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
256
+ "model.layers.29.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
257
+ "model.layers.29.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
258
+ "model.layers.29.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
259
+ "model.layers.29.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
260
+ "model.layers.29.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
261
+ "model.layers.3.input_layernorm.weight": "model-00001-of-00002.safetensors",
262
+ "model.layers.3.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
263
+ "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
264
+ "model.layers.3.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
265
+ "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
266
+ "model.layers.3.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
267
+ "model.layers.3.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
268
+ "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
269
+ "model.layers.3.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
270
+ "model.layers.3.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
271
+ "model.layers.3.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
272
+ "model.layers.30.input_layernorm.weight": "model-00002-of-00002.safetensors",
273
+ "model.layers.30.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
274
+ "model.layers.30.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
275
+ "model.layers.30.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
276
+ "model.layers.30.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
277
+ "model.layers.30.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
278
+ "model.layers.30.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
279
+ "model.layers.30.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
280
+ "model.layers.30.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
281
+ "model.layers.30.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
282
+ "model.layers.30.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
283
+ "model.layers.31.input_layernorm.weight": "model-00002-of-00002.safetensors",
284
+ "model.layers.31.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
285
+ "model.layers.31.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
286
+ "model.layers.31.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
287
+ "model.layers.31.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
288
+ "model.layers.31.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
289
+ "model.layers.31.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
290
+ "model.layers.31.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
291
+ "model.layers.31.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
292
+ "model.layers.31.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
293
+ "model.layers.31.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
294
+ "model.layers.32.input_layernorm.weight": "model-00001-of-00002.safetensors",
295
+ "model.layers.32.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
296
+ "model.layers.32.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
297
+ "model.layers.32.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
298
+ "model.layers.32.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
299
+ "model.layers.32.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
300
+ "model.layers.32.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
301
+ "model.layers.32.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
302
+ "model.layers.32.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
303
+ "model.layers.32.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
304
+ "model.layers.32.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
305
+ "model.layers.33.input_layernorm.weight": "model-00002-of-00002.safetensors",
306
+ "model.layers.33.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
307
+ "model.layers.33.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
308
+ "model.layers.33.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
309
+ "model.layers.33.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
310
+ "model.layers.33.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
311
+ "model.layers.33.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
312
+ "model.layers.33.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
313
+ "model.layers.33.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
314
+ "model.layers.33.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
315
+ "model.layers.33.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
316
+ "model.layers.34.input_layernorm.weight": "model-00002-of-00002.safetensors",
317
+ "model.layers.34.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
318
+ "model.layers.34.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
319
+ "model.layers.34.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
320
+ "model.layers.34.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
321
+ "model.layers.34.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
322
+ "model.layers.34.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
323
+ "model.layers.34.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
324
+ "model.layers.34.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
325
+ "model.layers.34.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
326
+ "model.layers.34.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
327
+ "model.layers.35.input_layernorm.weight": "model-00002-of-00002.safetensors",
328
+ "model.layers.35.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
329
+ "model.layers.35.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
330
+ "model.layers.35.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
331
+ "model.layers.35.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
332
+ "model.layers.35.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
333
+ "model.layers.35.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
334
+ "model.layers.35.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
335
+ "model.layers.35.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
336
+ "model.layers.35.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
337
+ "model.layers.35.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
338
+ "model.layers.4.input_layernorm.weight": "model-00002-of-00002.safetensors",
339
+ "model.layers.4.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
340
+ "model.layers.4.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
341
+ "model.layers.4.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
342
+ "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
343
+ "model.layers.4.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
344
+ "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
345
+ "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
346
+ "model.layers.4.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
347
+ "model.layers.4.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
348
+ "model.layers.4.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
349
+ "model.layers.5.input_layernorm.weight": "model-00002-of-00002.safetensors",
350
+ "model.layers.5.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
351
+ "model.layers.5.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
352
+ "model.layers.5.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
353
+ "model.layers.5.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
354
+ "model.layers.5.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
355
+ "model.layers.5.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
356
+ "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
357
+ "model.layers.5.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
358
+ "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
359
+ "model.layers.5.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
360
+ "model.layers.6.input_layernorm.weight": "model-00002-of-00002.safetensors",
361
+ "model.layers.6.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
362
+ "model.layers.6.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
363
+ "model.layers.6.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
364
+ "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
365
+ "model.layers.6.self_attn.k_norm.weight": "model-00002-of-00002.safetensors",
366
+ "model.layers.6.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
367
+ "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
368
+ "model.layers.6.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
369
+ "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
370
+ "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
371
+ "model.layers.7.input_layernorm.weight": "model-00001-of-00002.safetensors",
372
+ "model.layers.7.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
373
+ "model.layers.7.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
374
+ "model.layers.7.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
375
+ "model.layers.7.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
376
+ "model.layers.7.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
377
+ "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
378
+ "model.layers.7.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
379
+ "model.layers.7.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
380
+ "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
381
+ "model.layers.7.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
382
+ "model.layers.8.input_layernorm.weight": "model-00002-of-00002.safetensors",
383
+ "model.layers.8.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
384
+ "model.layers.8.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
385
+ "model.layers.8.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
386
+ "model.layers.8.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
387
+ "model.layers.8.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
388
+ "model.layers.8.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
389
+ "model.layers.8.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
390
+ "model.layers.8.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
391
+ "model.layers.8.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
392
+ "model.layers.8.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
393
+ "model.layers.9.input_layernorm.weight": "model-00002-of-00002.safetensors",
394
+ "model.layers.9.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
395
+ "model.layers.9.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
396
+ "model.layers.9.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
397
+ "model.layers.9.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
398
+ "model.layers.9.self_attn.k_norm.weight": "model-00001-of-00002.safetensors",
399
+ "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
400
+ "model.layers.9.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
401
+ "model.layers.9.self_attn.q_norm.weight": "model-00001-of-00002.safetensors",
402
+ "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
403
+ "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
404
+ "model.norm.weight": "model-00001-of-00002.safetensors"
405
+ }
406
+ }
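For context, the `weight_map` in the index above is what a loader consults to find which shard file stores each tensor. A minimal sketch of that lookup, using a hypothetical excerpt of the map (the values shown are taken from the index in this commit; the `shard_for` helper is illustrative, not part of any library):

```python
# Hypothetical excerpt of the "weight_map" from model.safetensors.index.json:
# parameter name -> shard file holding its tensor.
weight_map = {
    "model.layers.17.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.17.self_attn.q_norm.weight": "model-00002-of-00002.safetensors",
    "model.norm.weight": "model-00001-of-00002.safetensors",
}

def shard_for(param: str) -> str:
    """Return the shard file that stores `param`, as a checkpoint loader would."""
    return weight_map[param]

print(shard_for("model.norm.weight"))  # model-00001-of-00002.safetensors
```

Loaders such as `transformers` perform this resolution automatically when an index file is present, so each shard is opened only for the parameters it actually contains.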
qwen3_4b_sports/special_tokens_map.json ADDED
@@ -0,0 +1,1298 @@
+ {
+ "additional_special_tokens": [
+ "<|a_1|>",
+ "<|a_2|>",
+ "<|a_3|>",
+ "<|a_4|>",
+ "<|a_5|>",
+ "<|a_6|>",
+ "<|a_7|>",
+ "<|a_8|>",
+ "<|a_9|>",
+ "<|a_10|>",
+ "<|a_11|>",
+ "<|a_12|>",
+ "<|a_13|>",
+ "<|a_14|>",
+ "<|a_15|>",
+ "<|a_16|>",
+ "<|a_17|>",
+ "<|a_18|>",
+ "<|a_19|>",
+ "<|a_20|>",
+ "<|a_21|>",
+ "<|a_22|>",
+ "<|a_23|>",
+ "<|a_24|>",
+ "<|a_25|>",
+ "<|a_26|>",
+ "<|a_27|>",
+ "<|a_28|>",
+ "<|a_29|>",
+ "<|a_30|>",
+ "<|a_31|>",
+ "<|a_32|>",
+ "<|a_33|>",
+ "<|a_34|>",
+ "<|a_35|>",
+ "<|a_36|>",
+ "<|a_37|>",
+ "<|a_38|>",
+ "<|a_39|>",
+ "<|a_40|>",
+ "<|a_41|>",
+ "<|a_42|>",
+ "<|a_43|>",
+ "<|a_44|>",
+ "<|a_45|>",
+ "<|a_46|>",
+ "<|a_47|>",
+ "<|a_48|>",
+ "<|a_49|>",
+ "<|a_50|>",
+ "<|a_51|>",
+ "<|a_52|>",
+ "<|a_53|>",
+ "<|a_54|>",
+ "<|a_55|>",
+ "<|a_56|>",
+ "<|a_57|>",
+ "<|a_58|>",
+ "<|a_59|>",
+ "<|a_60|>",
+ "<|a_61|>",
+ "<|a_62|>",
+ "<|a_63|>",
+ "<|a_64|>",
+ "<|a_65|>",
+ "<|a_66|>",
+ "<|a_67|>",
+ "<|a_68|>",
+ "<|a_69|>",
+ "<|a_70|>",
+ "<|a_71|>",
+ "<|a_72|>",
+ "<|a_73|>",
+ "<|a_74|>",
+ "<|a_75|>",
+ "<|a_76|>",
+ "<|a_77|>",
+ "<|a_78|>",
+ "<|a_79|>",
+ "<|a_80|>",
+ "<|a_81|>",
+ "<|a_82|>",
+ "<|a_83|>",
+ "<|a_84|>",
+ "<|a_85|>",
+ "<|a_86|>",
+ "<|a_87|>",
+ "<|a_88|>",
+ "<|a_89|>",
+ "<|a_90|>",
+ "<|a_91|>",
+ "<|a_92|>",
+ "<|a_93|>",
+ "<|a_94|>",
+ "<|a_95|>",
+ "<|a_96|>",
+ "<|a_97|>",
+ "<|a_98|>",
+ "<|a_99|>",
+ "<|a_100|>",
+ "<|a_101|>",
+ "<|a_102|>",
+ "<|a_103|>",
+ "<|a_104|>",
+ "<|a_105|>",
+ "<|a_106|>",
+ "<|a_107|>",
+ "<|a_108|>",
+ "<|a_109|>",
+ "<|a_110|>",
+ "<|a_111|>",
+ "<|a_112|>",
+ "<|a_113|>",
+ "<|a_114|>",
+ "<|a_115|>",
+ "<|a_116|>",
+ "<|a_117|>",
+ "<|a_118|>",
+ "<|a_119|>",
+ "<|a_120|>",
+ "<|a_121|>",
+ "<|a_122|>",
+ "<|a_123|>",
+ "<|a_124|>",
+ "<|a_125|>",
+ "<|a_126|>",
+ "<|a_127|>",
+ "<|a_128|>",
+ "<|a_129|>",
+ "<|a_130|>",
+ "<|a_131|>",
+ "<|a_132|>",
+ "<|a_133|>",
+ "<|a_134|>",
+ "<|a_135|>",
+ "<|a_136|>",
+ "<|a_137|>",
+ "<|a_138|>",
+ "<|a_139|>",
+ "<|a_140|>",
+ "<|a_141|>",
+ "<|a_142|>",
+ "<|a_143|>",
+ "<|a_144|>",
+ "<|a_145|>",
+ "<|a_146|>",
+ "<|a_147|>",
+ "<|a_148|>",
+ "<|a_149|>",
+ "<|a_150|>",
+ "<|a_151|>",
+ "<|a_152|>",
+ "<|a_153|>",
+ "<|a_154|>",
+ "<|a_155|>",
+ "<|a_156|>",
+ "<|a_157|>",
+ "<|a_158|>",
+ "<|a_159|>",
+ "<|a_160|>",
+ "<|a_161|>",
+ "<|a_162|>",
+ "<|a_163|>",
+ "<|a_164|>",
+ "<|a_165|>",
+ "<|a_166|>",
+ "<|a_167|>",
+ "<|a_168|>",
+ "<|a_169|>",
+ "<|a_170|>",
+ "<|a_171|>",
+ "<|a_172|>",
+ "<|a_173|>",
+ "<|a_174|>",
+ "<|a_175|>",
+ "<|a_176|>",
+ "<|a_177|>",
+ "<|a_178|>",
+ "<|a_179|>",
+ "<|a_180|>",
+ "<|a_181|>",
+ "<|a_182|>",
+ "<|a_183|>",
+ "<|a_184|>",
+ "<|a_185|>",
+ "<|a_186|>",
+ "<|a_187|>",
+ "<|a_188|>",
+ "<|a_189|>",
+ "<|a_190|>",
+ "<|a_191|>",
+ "<|a_192|>",
+ "<|a_193|>",
+ "<|a_194|>",
+ "<|a_195|>",
+ "<|a_196|>",
+ "<|a_197|>",
+ "<|a_198|>",
+ "<|a_199|>",
+ "<|a_200|>",
+ "<|a_201|>",
+ "<|a_202|>",
+ "<|a_203|>",
+ "<|a_204|>",
+ "<|a_205|>",
+ "<|a_206|>",
+ "<|a_207|>",
+ "<|a_208|>",
+ "<|a_209|>",
+ "<|a_210|>",
+ "<|a_211|>",
+ "<|a_212|>",
+ "<|a_213|>",
+ "<|a_214|>",
+ "<|a_215|>",
+ "<|a_216|>",
+ "<|a_217|>",
+ "<|a_218|>",
+ "<|a_219|>",
+ "<|a_220|>",
+ "<|a_221|>",
+ "<|a_222|>",
+ "<|a_223|>",
+ "<|a_224|>",
+ "<|a_225|>",
+ "<|a_226|>",
+ "<|a_227|>",
+ "<|a_228|>",
+ "<|a_229|>",
+ "<|a_230|>",
+ "<|a_231|>",
+ "<|a_232|>",
+ "<|a_233|>",
+ "<|a_234|>",
+ "<|a_235|>",
+ "<|a_236|>",
+ "<|a_237|>",
+ "<|a_238|>",
+ "<|a_239|>",
+ "<|a_240|>",
+ "<|a_241|>",
+ "<|a_242|>",
+ "<|a_243|>",
+ "<|a_244|>",
+ "<|a_245|>",
+ "<|a_246|>",
+ "<|a_247|>",
+ "<|a_248|>",
+ "<|a_249|>",
+ "<|a_250|>",
+ "<|a_251|>",
+ "<|a_252|>",
+ "<|a_253|>",
+ "<|a_254|>",
+ "<|a_255|>",
+ "<|a_256|>",
+ "<|b_1|>",
+ "<|b_2|>",
+ "<|b_3|>",
+ "<|b_4|>",
+ "<|b_5|>",
+ "<|b_6|>",
+ "<|b_7|>",
+ "<|b_8|>",
+ "<|b_9|>",
+ "<|b_10|>",
+ "<|b_11|>",
+ "<|b_12|>",
+ "<|b_13|>",
+ "<|b_14|>",
+ "<|b_15|>",
+ "<|b_16|>",
+ "<|b_17|>",
+ "<|b_18|>",
+ "<|b_19|>",
+ "<|b_20|>",
+ "<|b_21|>",
+ "<|b_22|>",
+ "<|b_23|>",
+ "<|b_24|>",
+ "<|b_25|>",
+ "<|b_26|>",
+ "<|b_27|>",
+ "<|b_28|>",
+ "<|b_29|>",
+ "<|b_30|>",
+ "<|b_31|>",
+ "<|b_32|>",
+ "<|b_33|>",
+ "<|b_34|>",
+ "<|b_35|>",
+ "<|b_36|>",
+ "<|b_37|>",
+ "<|b_38|>",
+ "<|b_39|>",
+ "<|b_40|>",
+ "<|b_41|>",
+ "<|b_42|>",
+ "<|b_43|>",
+ "<|b_44|>",
+ "<|b_45|>",
+ "<|b_46|>",
+ "<|b_47|>",
+ "<|b_48|>",
+ "<|b_49|>",
+ "<|b_50|>",
+ "<|b_51|>",
+ "<|b_52|>",
+ "<|b_53|>",
+ "<|b_54|>",
+ "<|b_55|>",
+ "<|b_56|>",
+ "<|b_57|>",
+ "<|b_58|>",
+ "<|b_59|>",
+ "<|b_60|>",
+ "<|b_61|>",
+ "<|b_62|>",
+ "<|b_63|>",
+ "<|b_64|>",
+ "<|b_65|>",
+ "<|b_66|>",
+ "<|b_67|>",
+ "<|b_68|>",
+ "<|b_69|>",
+ "<|b_70|>",
+ "<|b_71|>",
+ "<|b_72|>",
+ "<|b_73|>",
+ "<|b_74|>",
+ "<|b_75|>",
+ "<|b_76|>",
+ "<|b_77|>",
+ "<|b_78|>",
+ "<|b_79|>",
+ "<|b_80|>",
+ "<|b_81|>",
+ "<|b_82|>",
+ "<|b_83|>",
+ "<|b_84|>",
+ "<|b_85|>",
+ "<|b_86|>",
+ "<|b_87|>",
+ "<|b_88|>",
+ "<|b_89|>",
+ "<|b_90|>",
+ "<|b_91|>",
+ "<|b_92|>",
+ "<|b_93|>",
+ "<|b_94|>",
+ "<|b_95|>",
+ "<|b_96|>",
+ "<|b_97|>",
+ "<|b_98|>",
+ "<|b_99|>",
+ "<|b_100|>",
+ "<|b_101|>",
+ "<|b_102|>",
+ "<|b_103|>",
+ "<|b_104|>",
+ "<|b_105|>",
+ "<|b_106|>",
+ "<|b_107|>",
+ "<|b_108|>",
+ "<|b_109|>",
+ "<|b_110|>",
+ "<|b_111|>",
+ "<|b_112|>",
+ "<|b_113|>",
+ "<|b_114|>",
+ "<|b_115|>",
+ "<|b_116|>",
+ "<|b_117|>",
+ "<|b_118|>",
+ "<|b_119|>",
+ "<|b_120|>",
+ "<|b_121|>",
+ "<|b_122|>",
+ "<|b_123|>",
+ "<|b_124|>",
+ "<|b_125|>",
+ "<|b_126|>",
+ "<|b_127|>",
+ "<|b_128|>",
+ "<|b_129|>",
+ "<|b_130|>",
+ "<|b_131|>",
+ "<|b_132|>",
+ "<|b_133|>",
+ "<|b_134|>",
+ "<|b_135|>",
+ "<|b_136|>",
+ "<|b_137|>",
+ "<|b_138|>",
+ "<|b_139|>",
+ "<|b_140|>",
+ "<|b_141|>",
+ "<|b_142|>",
+ "<|b_143|>",
+ "<|b_144|>",
+ "<|b_145|>",
+ "<|b_146|>",
+ "<|b_147|>",
+ "<|b_148|>",
+ "<|b_149|>",
+ "<|b_150|>",
+ "<|b_151|>",
+ "<|b_152|>",
+ "<|b_153|>",
+ "<|b_154|>",
+ "<|b_155|>",
+ "<|b_156|>",
+ "<|b_157|>",
+ "<|b_158|>",
+ "<|b_159|>",
+ "<|b_160|>",
+ "<|b_161|>",
+ "<|b_162|>",
+ "<|b_163|>",
+ "<|b_164|>",
+ "<|b_165|>",
+ "<|b_166|>",
+ "<|b_167|>",
+ "<|b_168|>",
+ "<|b_169|>",
+ "<|b_170|>",
+ "<|b_171|>",
+ "<|b_172|>",
+ "<|b_173|>",
+ "<|b_174|>",
+ "<|b_175|>",
+ "<|b_176|>",
+ "<|b_177|>",
+ "<|b_178|>",
+ "<|b_179|>",
+ "<|b_180|>",
+ "<|b_181|>",
+ "<|b_182|>",
+ "<|b_183|>",
+ "<|b_184|>",
443
+ "<|b_185|>",
444
+ "<|b_186|>",
445
+ "<|b_187|>",
446
+ "<|b_188|>",
447
+ "<|b_189|>",
448
+ "<|b_190|>",
449
+ "<|b_191|>",
450
+ "<|b_192|>",
451
+ "<|b_193|>",
452
+ "<|b_194|>",
453
+ "<|b_195|>",
454
+ "<|b_196|>",
455
+ "<|b_197|>",
456
+ "<|b_198|>",
457
+ "<|b_199|>",
458
+ "<|b_200|>",
459
+ "<|b_201|>",
460
+ "<|b_202|>",
461
+ "<|b_203|>",
462
+ "<|b_204|>",
463
+ "<|b_205|>",
464
+ "<|b_206|>",
465
+ "<|b_207|>",
466
+ "<|b_208|>",
467
+ "<|b_209|>",
468
+ "<|b_210|>",
469
+ "<|b_211|>",
470
+ "<|b_212|>",
471
+ "<|b_213|>",
472
+ "<|b_214|>",
473
+ "<|b_215|>",
474
+ "<|b_216|>",
475
+ "<|b_217|>",
476
+ "<|b_218|>",
477
+ "<|b_219|>",
478
+ "<|b_220|>",
479
+ "<|b_221|>",
480
+ "<|b_222|>",
481
+ "<|b_223|>",
482
+ "<|b_224|>",
483
+ "<|b_225|>",
484
+ "<|b_226|>",
485
+ "<|b_227|>",
486
+ "<|b_228|>",
487
+ "<|b_229|>",
488
+ "<|b_230|>",
489
+ "<|b_231|>",
490
+ "<|b_232|>",
491
+ "<|b_233|>",
492
+ "<|b_234|>",
493
+ "<|b_235|>",
494
+ "<|b_236|>",
495
+ "<|b_237|>",
496
+ "<|b_238|>",
497
+ "<|b_239|>",
498
+ "<|b_240|>",
499
+ "<|b_241|>",
500
+ "<|b_242|>",
501
+ "<|b_243|>",
502
+ "<|b_244|>",
503
+ "<|b_245|>",
504
+ "<|b_246|>",
505
+ "<|b_247|>",
506
+ "<|b_248|>",
507
+ "<|b_249|>",
508
+ "<|b_250|>",
509
+ "<|b_251|>",
510
+ "<|b_252|>",
511
+ "<|b_253|>",
512
+ "<|b_254|>",
513
+ "<|b_255|>",
514
+ "<|b_256|>",
515
+ "<|c_1|>",
516
+ "<|c_2|>",
517
+ "<|c_3|>",
518
+ "<|c_4|>",
519
+ "<|c_5|>",
520
+ "<|c_6|>",
521
+ "<|c_7|>",
522
+ "<|c_8|>",
523
+ "<|c_9|>",
524
+ "<|c_10|>",
525
+ "<|c_11|>",
526
+ "<|c_12|>",
527
+ "<|c_13|>",
528
+ "<|c_14|>",
529
+ "<|c_15|>",
530
+ "<|c_16|>",
531
+ "<|c_17|>",
532
+ "<|c_18|>",
533
+ "<|c_19|>",
534
+ "<|c_20|>",
535
+ "<|c_21|>",
536
+ "<|c_22|>",
537
+ "<|c_23|>",
538
+ "<|c_24|>",
539
+ "<|c_25|>",
540
+ "<|c_26|>",
541
+ "<|c_27|>",
542
+ "<|c_28|>",
543
+ "<|c_29|>",
544
+ "<|c_30|>",
545
+ "<|c_31|>",
546
+ "<|c_32|>",
547
+ "<|c_33|>",
548
+ "<|c_34|>",
549
+ "<|c_35|>",
550
+ "<|c_36|>",
551
+ "<|c_37|>",
552
+ "<|c_38|>",
553
+ "<|c_39|>",
554
+ "<|c_40|>",
555
+ "<|c_41|>",
556
+ "<|c_42|>",
557
+ "<|c_43|>",
558
+ "<|c_44|>",
559
+ "<|c_45|>",
560
+ "<|c_46|>",
561
+ "<|c_47|>",
562
+ "<|c_48|>",
563
+ "<|c_49|>",
564
+ "<|c_50|>",
565
+ "<|c_51|>",
566
+ "<|c_52|>",
567
+ "<|c_53|>",
568
+ "<|c_54|>",
569
+ "<|c_55|>",
570
+ "<|c_56|>",
571
+ "<|c_57|>",
572
+ "<|c_58|>",
573
+ "<|c_59|>",
574
+ "<|c_60|>",
575
+ "<|c_61|>",
576
+ "<|c_62|>",
577
+ "<|c_63|>",
578
+ "<|c_64|>",
579
+ "<|c_65|>",
580
+ "<|c_66|>",
581
+ "<|c_67|>",
582
+ "<|c_68|>",
583
+ "<|c_69|>",
584
+ "<|c_70|>",
585
+ "<|c_71|>",
586
+ "<|c_72|>",
587
+ "<|c_73|>",
588
+ "<|c_74|>",
589
+ "<|c_75|>",
590
+ "<|c_76|>",
591
+ "<|c_77|>",
592
+ "<|c_78|>",
593
+ "<|c_79|>",
594
+ "<|c_80|>",
595
+ "<|c_81|>",
596
+ "<|c_82|>",
597
+ "<|c_83|>",
598
+ "<|c_84|>",
599
+ "<|c_85|>",
600
+ "<|c_86|>",
601
+ "<|c_87|>",
602
+ "<|c_88|>",
603
+ "<|c_89|>",
604
+ "<|c_90|>",
605
+ "<|c_91|>",
606
+ "<|c_92|>",
607
+ "<|c_93|>",
608
+ "<|c_94|>",
609
+ "<|c_95|>",
610
+ "<|c_96|>",
611
+ "<|c_97|>",
612
+ "<|c_98|>",
613
+ "<|c_99|>",
614
+ "<|c_100|>",
615
+ "<|c_101|>",
616
+ "<|c_102|>",
617
+ "<|c_103|>",
618
+ "<|c_104|>",
619
+ "<|c_105|>",
620
+ "<|c_106|>",
621
+ "<|c_107|>",
622
+ "<|c_108|>",
623
+ "<|c_109|>",
624
+ "<|c_110|>",
625
+ "<|c_111|>",
626
+ "<|c_112|>",
627
+ "<|c_113|>",
628
+ "<|c_114|>",
629
+ "<|c_115|>",
630
+ "<|c_116|>",
631
+ "<|c_117|>",
632
+ "<|c_118|>",
633
+ "<|c_119|>",
634
+ "<|c_120|>",
635
+ "<|c_121|>",
636
+ "<|c_122|>",
637
+ "<|c_123|>",
638
+ "<|c_124|>",
639
+ "<|c_125|>",
640
+ "<|c_126|>",
641
+ "<|c_127|>",
642
+ "<|c_128|>",
643
+ "<|c_129|>",
644
+ "<|c_130|>",
645
+ "<|c_131|>",
646
+ "<|c_132|>",
647
+ "<|c_133|>",
648
+ "<|c_134|>",
649
+ "<|c_135|>",
650
+ "<|c_136|>",
651
+ "<|c_137|>",
652
+ "<|c_138|>",
653
+ "<|c_139|>",
654
+ "<|c_140|>",
655
+ "<|c_141|>",
656
+ "<|c_142|>",
657
+ "<|c_143|>",
658
+ "<|c_144|>",
659
+ "<|c_145|>",
660
+ "<|c_146|>",
661
+ "<|c_147|>",
662
+ "<|c_148|>",
663
+ "<|c_149|>",
664
+ "<|c_150|>",
665
+ "<|c_151|>",
666
+ "<|c_152|>",
667
+ "<|c_153|>",
668
+ "<|c_154|>",
669
+ "<|c_155|>",
670
+ "<|c_156|>",
671
+ "<|c_157|>",
672
+ "<|c_158|>",
673
+ "<|c_159|>",
674
+ "<|c_160|>",
675
+ "<|c_161|>",
676
+ "<|c_162|>",
677
+ "<|c_163|>",
678
+ "<|c_164|>",
679
+ "<|c_165|>",
680
+ "<|c_166|>",
681
+ "<|c_167|>",
682
+ "<|c_168|>",
683
+ "<|c_169|>",
684
+ "<|c_170|>",
685
+ "<|c_171|>",
686
+ "<|c_172|>",
687
+ "<|c_173|>",
688
+ "<|c_174|>",
689
+ "<|c_175|>",
690
+ "<|c_176|>",
691
+ "<|c_177|>",
692
+ "<|c_178|>",
693
+ "<|c_179|>",
694
+ "<|c_180|>",
695
+ "<|c_181|>",
696
+ "<|c_182|>",
697
+ "<|c_183|>",
698
+ "<|c_184|>",
699
+ "<|c_185|>",
700
+ "<|c_186|>",
701
+ "<|c_187|>",
702
+ "<|c_188|>",
703
+ "<|c_189|>",
704
+ "<|c_190|>",
705
+ "<|c_191|>",
706
+ "<|c_192|>",
707
+ "<|c_193|>",
708
+ "<|c_194|>",
709
+ "<|c_195|>",
710
+ "<|c_196|>",
711
+ "<|c_197|>",
712
+ "<|c_198|>",
713
+ "<|c_199|>",
714
+ "<|c_200|>",
715
+ "<|c_201|>",
716
+ "<|c_202|>",
717
+ "<|c_203|>",
718
+ "<|c_204|>",
719
+ "<|c_205|>",
720
+ "<|c_206|>",
721
+ "<|c_207|>",
722
+ "<|c_208|>",
723
+ "<|c_209|>",
724
+ "<|c_210|>",
725
+ "<|c_211|>",
726
+ "<|c_212|>",
727
+ "<|c_213|>",
728
+ "<|c_214|>",
729
+ "<|c_215|>",
730
+ "<|c_216|>",
731
+ "<|c_217|>",
732
+ "<|c_218|>",
733
+ "<|c_219|>",
734
+ "<|c_220|>",
735
+ "<|c_221|>",
736
+ "<|c_222|>",
737
+ "<|c_223|>",
738
+ "<|c_224|>",
739
+ "<|c_225|>",
740
+ "<|c_226|>",
741
+ "<|c_227|>",
742
+ "<|c_228|>",
743
+ "<|c_229|>",
744
+ "<|c_230|>",
745
+ "<|c_231|>",
746
+ "<|c_232|>",
747
+ "<|c_233|>",
748
+ "<|c_234|>",
749
+ "<|c_235|>",
750
+ "<|c_236|>",
751
+ "<|c_237|>",
752
+ "<|c_238|>",
753
+ "<|c_239|>",
754
+ "<|c_240|>",
755
+ "<|c_241|>",
756
+ "<|c_242|>",
757
+ "<|c_243|>",
758
+ "<|c_244|>",
759
+ "<|c_245|>",
760
+ "<|c_246|>",
761
+ "<|c_247|>",
762
+ "<|c_248|>",
763
+ "<|c_249|>",
764
+ "<|c_250|>",
765
+ "<|c_251|>",
766
+ "<|c_252|>",
767
+ "<|c_253|>",
768
+ "<|c_254|>",
769
+ "<|c_255|>",
770
+ "<|c_256|>",
771
+ "<|d_1|>",
772
+ "<|d_2|>",
773
+ "<|d_3|>",
774
+ "<|d_4|>",
775
+ "<|d_5|>",
776
+ "<|d_6|>",
777
+ "<|d_7|>",
778
+ "<|d_8|>",
779
+ "<|d_9|>",
780
+ "<|d_10|>",
781
+ "<|d_11|>",
782
+ "<|d_12|>",
783
+ "<|d_13|>",
784
+ "<|d_14|>",
785
+ "<|d_15|>",
786
+ "<|d_16|>",
787
+ "<|d_17|>",
788
+ "<|d_18|>",
789
+ "<|d_19|>",
790
+ "<|d_20|>",
791
+ "<|d_21|>",
792
+ "<|d_22|>",
793
+ "<|d_23|>",
794
+ "<|d_24|>",
795
+ "<|d_25|>",
796
+ "<|d_26|>",
797
+ "<|d_27|>",
798
+ "<|d_28|>",
799
+ "<|d_29|>",
800
+ "<|d_30|>",
801
+ "<|d_31|>",
802
+ "<|d_32|>",
803
+ "<|d_33|>",
804
+ "<|d_34|>",
805
+ "<|d_35|>",
806
+ "<|d_36|>",
807
+ "<|d_37|>",
808
+ "<|d_38|>",
809
+ "<|d_39|>",
810
+ "<|d_40|>",
811
+ "<|d_41|>",
812
+ "<|d_42|>",
813
+ "<|d_43|>",
814
+ "<|d_44|>",
815
+ "<|d_45|>",
816
+ "<|d_46|>",
817
+ "<|d_47|>",
818
+ "<|d_48|>",
819
+ "<|d_49|>",
820
+ "<|d_50|>",
821
+ "<|d_51|>",
822
+ "<|d_52|>",
823
+ "<|d_53|>",
824
+ "<|d_54|>",
825
+ "<|d_55|>",
826
+ "<|d_56|>",
827
+ "<|d_57|>",
828
+ "<|d_58|>",
829
+ "<|d_59|>",
830
+ "<|d_60|>",
831
+ "<|d_61|>",
832
+ "<|d_62|>",
833
+ "<|d_63|>",
834
+ "<|d_64|>",
835
+ "<|d_65|>",
836
+ "<|d_66|>",
837
+ "<|d_67|>",
838
+ "<|d_68|>",
839
+ "<|d_69|>",
840
+ "<|d_70|>",
841
+ "<|d_71|>",
842
+ "<|d_72|>",
843
+ "<|d_73|>",
844
+ "<|d_74|>",
845
+ "<|d_75|>",
846
+ "<|d_76|>",
847
+ "<|d_77|>",
848
+ "<|d_78|>",
849
+ "<|d_79|>",
850
+ "<|d_80|>",
851
+ "<|d_81|>",
852
+ "<|d_82|>",
853
+ "<|d_83|>",
854
+ "<|d_84|>",
855
+ "<|d_85|>",
856
+ "<|d_86|>",
857
+ "<|d_87|>",
858
+ "<|d_88|>",
859
+ "<|d_89|>",
860
+ "<|d_90|>",
861
+ "<|d_91|>",
862
+ "<|d_92|>",
863
+ "<|d_93|>",
864
+ "<|d_94|>",
865
+ "<|d_95|>",
866
+ "<|d_96|>",
867
+ "<|d_97|>",
868
+ "<|d_98|>",
869
+ "<|d_99|>",
870
+ "<|d_100|>",
871
+ "<|d_101|>",
872
+ "<|d_102|>",
873
+ "<|d_103|>",
874
+ "<|d_104|>",
875
+ "<|d_105|>",
876
+ "<|d_106|>",
877
+ "<|d_107|>",
878
+ "<|d_108|>",
879
+ "<|d_109|>",
880
+ "<|d_110|>",
881
+ "<|d_111|>",
882
+ "<|d_112|>",
883
+ "<|d_113|>",
884
+ "<|d_114|>",
885
+ "<|d_115|>",
886
+ "<|d_116|>",
887
+ "<|d_117|>",
888
+ "<|d_118|>",
889
+ "<|d_119|>",
890
+ "<|d_120|>",
891
+ "<|d_121|>",
892
+ "<|d_122|>",
893
+ "<|d_123|>",
894
+ "<|d_124|>",
895
+ "<|d_125|>",
896
+ "<|d_126|>",
897
+ "<|d_127|>",
898
+ "<|d_128|>",
899
+ "<|d_129|>",
900
+ "<|d_130|>",
901
+ "<|d_131|>",
902
+ "<|d_132|>",
903
+ "<|d_133|>",
904
+ "<|d_134|>",
905
+ "<|d_135|>",
906
+ "<|d_136|>",
907
+ "<|d_137|>",
908
+ "<|d_138|>",
909
+ "<|d_139|>",
910
+ "<|d_140|>",
911
+ "<|d_141|>",
912
+ "<|d_142|>",
913
+ "<|d_143|>",
914
+ "<|d_144|>",
915
+ "<|d_145|>",
916
+ "<|d_146|>",
917
+ "<|d_147|>",
918
+ "<|d_148|>",
919
+ "<|d_149|>",
920
+ "<|d_150|>",
921
+ "<|d_151|>",
922
+ "<|d_152|>",
923
+ "<|d_153|>",
924
+ "<|d_154|>",
925
+ "<|d_155|>",
926
+ "<|d_156|>",
927
+ "<|d_157|>",
928
+ "<|d_158|>",
929
+ "<|d_159|>",
930
+ "<|d_160|>",
931
+ "<|d_161|>",
932
+ "<|d_162|>",
933
+ "<|d_163|>",
934
+ "<|d_164|>",
935
+ "<|d_165|>",
936
+ "<|d_166|>",
937
+ "<|d_167|>",
938
+ "<|d_168|>",
939
+ "<|d_169|>",
940
+ "<|d_170|>",
941
+ "<|d_171|>",
942
+ "<|d_172|>",
943
+ "<|d_173|>",
944
+ "<|d_174|>",
945
+ "<|d_175|>",
946
+ "<|d_176|>",
947
+ "<|d_177|>",
948
+ "<|d_178|>",
949
+ "<|d_179|>",
950
+ "<|d_180|>",
951
+ "<|d_181|>",
952
+ "<|d_182|>",
953
+ "<|d_183|>",
954
+ "<|d_184|>",
955
+ "<|d_185|>",
956
+ "<|d_186|>",
957
+ "<|d_187|>",
958
+ "<|d_188|>",
959
+ "<|d_189|>",
960
+ "<|d_190|>",
961
+ "<|d_191|>",
962
+ "<|d_192|>",
963
+ "<|d_193|>",
964
+ "<|d_194|>",
965
+ "<|d_195|>",
966
+ "<|d_196|>",
967
+ "<|d_197|>",
968
+ "<|d_198|>",
969
+ "<|d_199|>",
970
+ "<|d_200|>",
971
+ "<|d_201|>",
972
+ "<|d_202|>",
973
+ "<|d_203|>",
974
+ "<|d_204|>",
975
+ "<|d_205|>",
976
+ "<|d_206|>",
977
+ "<|d_207|>",
978
+ "<|d_208|>",
979
+ "<|d_209|>",
980
+ "<|d_210|>",
981
+ "<|d_211|>",
982
+ "<|d_212|>",
983
+ "<|d_213|>",
984
+ "<|d_214|>",
985
+ "<|d_215|>",
986
+ "<|d_216|>",
987
+ "<|d_217|>",
988
+ "<|d_218|>",
989
+ "<|d_219|>",
990
+ "<|d_220|>",
991
+ "<|d_221|>",
992
+ "<|d_222|>",
993
+ "<|d_223|>",
994
+ "<|d_224|>",
995
+ "<|d_225|>",
996
+ "<|d_226|>",
997
+ "<|d_227|>",
998
+ "<|d_228|>",
999
+ "<|d_229|>",
1000
+ "<|d_230|>",
1001
+ "<|d_231|>",
1002
+ "<|d_232|>",
1003
+ "<|d_233|>",
1004
+ "<|d_234|>",
1005
+ "<|d_235|>",
1006
+ "<|d_236|>",
1007
+ "<|d_237|>",
1008
+ "<|d_238|>",
1009
+ "<|d_239|>",
1010
+ "<|d_240|>",
1011
+ "<|d_241|>",
1012
+ "<|d_242|>",
1013
+ "<|d_243|>",
1014
+ "<|d_244|>",
1015
+ "<|d_245|>",
1016
+ "<|d_246|>",
1017
+ "<|d_247|>",
1018
+ "<|d_248|>",
1019
+ "<|d_249|>",
1020
+ "<|d_250|>",
1021
+ "<|d_251|>",
1022
+ "<|d_252|>",
1023
+ "<|d_253|>",
1024
+ "<|d_254|>",
1025
+ "<|d_255|>",
1026
+ "<|d_256|>",
1027
+ "<|e_1|>",
1028
+ "<|e_2|>",
1029
+ "<|e_3|>",
1030
+ "<|e_4|>",
1031
+ "<|e_5|>",
1032
+ "<|e_6|>",
1033
+ "<|e_7|>",
1034
+ "<|e_8|>",
1035
+ "<|e_9|>",
1036
+ "<|e_10|>",
1037
+ "<|e_11|>",
1038
+ "<|e_12|>",
1039
+ "<|e_13|>",
1040
+ "<|e_14|>",
1041
+ "<|e_15|>",
1042
+ "<|e_16|>",
1043
+ "<|e_17|>",
1044
+ "<|e_18|>",
1045
+ "<|e_19|>",
1046
+ "<|e_20|>",
1047
+ "<|e_21|>",
1048
+ "<|e_22|>",
1049
+ "<|e_23|>",
1050
+ "<|e_24|>",
1051
+ "<|e_25|>",
1052
+ "<|e_26|>",
1053
+ "<|e_27|>",
1054
+ "<|e_28|>",
1055
+ "<|e_29|>",
1056
+ "<|e_30|>",
1057
+ "<|e_31|>",
1058
+ "<|e_32|>",
1059
+ "<|e_33|>",
1060
+ "<|e_34|>",
1061
+ "<|e_35|>",
1062
+ "<|e_36|>",
1063
+ "<|e_37|>",
1064
+ "<|e_38|>",
1065
+ "<|e_39|>",
1066
+ "<|e_40|>",
1067
+ "<|e_41|>",
1068
+ "<|e_42|>",
1069
+ "<|e_43|>",
1070
+ "<|e_44|>",
1071
+ "<|e_45|>",
1072
+ "<|e_46|>",
1073
+ "<|e_47|>",
1074
+ "<|e_48|>",
1075
+ "<|e_49|>",
1076
+ "<|e_50|>",
1077
+ "<|e_51|>",
1078
+ "<|e_52|>",
1079
+ "<|e_53|>",
1080
+ "<|e_54|>",
1081
+ "<|e_55|>",
1082
+ "<|e_56|>",
1083
+ "<|e_57|>",
1084
+ "<|e_58|>",
1085
+ "<|e_59|>",
1086
+ "<|e_60|>",
1087
+ "<|e_61|>",
1088
+ "<|e_62|>",
1089
+ "<|e_63|>",
1090
+ "<|e_64|>",
1091
+ "<|e_65|>",
1092
+ "<|e_66|>",
1093
+ "<|e_67|>",
1094
+ "<|e_68|>",
1095
+ "<|e_69|>",
1096
+ "<|e_70|>",
1097
+ "<|e_71|>",
1098
+ "<|e_72|>",
1099
+ "<|e_73|>",
1100
+ "<|e_74|>",
1101
+ "<|e_75|>",
1102
+ "<|e_76|>",
1103
+ "<|e_77|>",
1104
+ "<|e_78|>",
1105
+ "<|e_79|>",
1106
+ "<|e_80|>",
1107
+ "<|e_81|>",
1108
+ "<|e_82|>",
1109
+ "<|e_83|>",
1110
+ "<|e_84|>",
1111
+ "<|e_85|>",
1112
+ "<|e_86|>",
1113
+ "<|e_87|>",
1114
+ "<|e_88|>",
1115
+ "<|e_89|>",
1116
+ "<|e_90|>",
1117
+ "<|e_91|>",
1118
+ "<|e_92|>",
1119
+ "<|e_93|>",
1120
+ "<|e_94|>",
1121
+ "<|e_95|>",
1122
+ "<|e_96|>",
1123
+ "<|e_97|>",
1124
+ "<|e_98|>",
1125
+ "<|e_99|>",
1126
+ "<|e_100|>",
1127
+ "<|e_101|>",
1128
+ "<|e_102|>",
1129
+ "<|e_103|>",
1130
+ "<|e_104|>",
1131
+ "<|e_105|>",
1132
+ "<|e_106|>",
1133
+ "<|e_107|>",
1134
+ "<|e_108|>",
1135
+ "<|e_109|>",
1136
+ "<|e_110|>",
1137
+ "<|e_111|>",
1138
+ "<|e_112|>",
1139
+ "<|e_113|>",
1140
+ "<|e_114|>",
1141
+ "<|e_115|>",
1142
+ "<|e_116|>",
1143
+ "<|e_117|>",
1144
+ "<|e_118|>",
1145
+ "<|e_119|>",
1146
+ "<|e_120|>",
1147
+ "<|e_121|>",
1148
+ "<|e_122|>",
1149
+ "<|e_123|>",
1150
+ "<|e_124|>",
1151
+ "<|e_125|>",
1152
+ "<|e_126|>",
1153
+ "<|e_127|>",
1154
+ "<|e_128|>",
1155
+ "<|e_129|>",
1156
+ "<|e_130|>",
1157
+ "<|e_131|>",
1158
+ "<|e_132|>",
1159
+ "<|e_133|>",
1160
+ "<|e_134|>",
1161
+ "<|e_135|>",
1162
+ "<|e_136|>",
1163
+ "<|e_137|>",
1164
+ "<|e_138|>",
1165
+ "<|e_139|>",
1166
+ "<|e_140|>",
1167
+ "<|e_141|>",
1168
+ "<|e_142|>",
1169
+ "<|e_143|>",
1170
+ "<|e_144|>",
1171
+ "<|e_145|>",
1172
+ "<|e_146|>",
1173
+ "<|e_147|>",
1174
+ "<|e_148|>",
1175
+ "<|e_149|>",
1176
+ "<|e_150|>",
1177
+ "<|e_151|>",
1178
+ "<|e_152|>",
1179
+ "<|e_153|>",
1180
+ "<|e_154|>",
1181
+ "<|e_155|>",
1182
+ "<|e_156|>",
1183
+ "<|e_157|>",
1184
+ "<|e_158|>",
1185
+ "<|e_159|>",
1186
+ "<|e_160|>",
1187
+ "<|e_161|>",
1188
+ "<|e_162|>",
1189
+ "<|e_163|>",
1190
+ "<|e_164|>",
1191
+ "<|e_165|>",
1192
+ "<|e_166|>",
1193
+ "<|e_167|>",
1194
+ "<|e_168|>",
1195
+ "<|e_169|>",
1196
+ "<|e_170|>",
1197
+ "<|e_171|>",
1198
+ "<|e_172|>",
1199
+ "<|e_173|>",
1200
+ "<|e_174|>",
1201
+ "<|e_175|>",
1202
+ "<|e_176|>",
1203
+ "<|e_177|>",
1204
+ "<|e_178|>",
1205
+ "<|e_179|>",
1206
+ "<|e_180|>",
1207
+ "<|e_181|>",
1208
+ "<|e_182|>",
1209
+ "<|e_183|>",
1210
+ "<|e_184|>",
1211
+ "<|e_185|>",
1212
+ "<|e_186|>",
1213
+ "<|e_187|>",
1214
+ "<|e_188|>",
1215
+ "<|e_189|>",
1216
+ "<|e_190|>",
1217
+ "<|e_191|>",
1218
+ "<|e_192|>",
1219
+ "<|e_193|>",
1220
+ "<|e_194|>",
1221
+ "<|e_195|>",
1222
+ "<|e_196|>",
1223
+ "<|e_197|>",
1224
+ "<|e_198|>",
1225
+ "<|e_199|>",
1226
+ "<|e_200|>",
1227
+ "<|e_201|>",
1228
+ "<|e_202|>",
1229
+ "<|e_203|>",
1230
+ "<|e_204|>",
1231
+ "<|e_205|>",
1232
+ "<|e_206|>",
1233
+ "<|e_207|>",
1234
+ "<|e_208|>",
1235
+ "<|e_209|>",
1236
+ "<|e_210|>",
1237
+ "<|e_211|>",
1238
+ "<|e_212|>",
1239
+ "<|e_213|>",
1240
+ "<|e_214|>",
1241
+ "<|e_215|>",
1242
+ "<|e_216|>",
1243
+ "<|e_217|>",
1244
+ "<|e_218|>",
1245
+ "<|e_219|>",
1246
+ "<|e_220|>",
1247
+ "<|e_221|>",
1248
+ "<|e_222|>",
1249
+ "<|e_223|>",
1250
+ "<|e_224|>",
1251
+ "<|e_225|>",
1252
+ "<|e_226|>",
1253
+ "<|e_227|>",
1254
+ "<|e_228|>",
1255
+ "<|e_229|>",
1256
+ "<|e_230|>",
1257
+ "<|e_231|>",
1258
+ "<|e_232|>",
1259
+ "<|e_233|>",
1260
+ "<|e_234|>",
1261
+ "<|e_235|>",
1262
+ "<|e_236|>",
1263
+ "<|e_237|>",
1264
+ "<|e_238|>",
1265
+ "<|e_239|>",
1266
+ "<|e_240|>",
1267
+ "<|e_241|>",
1268
+ "<|e_242|>",
1269
+ "<|e_243|>",
1270
+ "<|e_244|>",
1271
+ "<|e_245|>",
1272
+ "<|e_246|>",
1273
+ "<|e_247|>",
1274
+ "<|e_248|>",
1275
+ "<|e_249|>",
1276
+ "<|e_250|>",
1277
+ "<|e_251|>",
1278
+ "<|e_252|>",
1279
+ "<|e_253|>",
1280
+ "<|e_254|>",
1281
+ "<|e_255|>",
1282
+ "<|e_256|>"
1283
+ ],
+ "eos_token": {
+ "content": "<|im_end|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
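The added special tokens above follow a simple grid pattern: five prefixes (`a` through `e`), each with indices up to 256. As a sanity check on what this diff adds, a minimal sketch that reconstructs the list is shown below. Assumption: only the tail of the `a` block is visible in this chunk, so the sketch assumes each prefix spans 1..256 like the fully shown `b`-`e` blocks.

```python
# Sketch: reconstruct the added-token grid visible in this diff.
# Assumption: every prefix a-e covers indices 1..256, matching the
# b/c/d/e blocks shown in full (only the tail of "a" appears here).
PREFIXES = "abcde"
CODES_PER_PREFIX = 256

added_tokens = [
    f"<|{p}_{i}|>" for p in PREFIXES for i in range(1, CODES_PER_PREFIX + 1)
]

# 5 prefixes x 256 codes = 1280 extra special tokens under this assumption.
assert len(added_tokens) == 1280
assert added_tokens[0] == "<|a_1|>"
assert added_tokens[-1] == "<|e_256|>"
```

Generating the list programmatically like this is also a convenient way to cross-check the `added_tokens.json` entries against the vocabulary size reported in `config.json`.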
qwen3_4b_sports/tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3bcd0ab896130d768fa805cf9a6ffeacac3cbb78921e1b68a664da7735314452
+ size 11660194
qwen3_4b_sports/tokenizer_config.json ADDED
The diff for this file is too large to render.
qwen3_4b_sports/vocab.json ADDED
The diff for this file is too large to render.