# finetune-instance-segmentation-ade20k-mini-mask2former
This model is a fine-tuned version of facebook/mask2former-swin-tiny-coco-instance on the qubvel-hf/ade20k-mini dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the metrics list):
- Loss: 28.6625
- Map: 0.2266
- Map 50: 0.4359
- Map 75: 0.2107
- Map Small: 0.1469
- Map Medium: 0.6658
- Map Large: 0.8156
- Mar 1: 0.0959
- Mar 10: 0.2562
- Mar 100: 0.2914
- Mar Small: 0.2182
- Mar Medium: 0.7126
- Mar Large: 0.8476
- Map Person: 0.159
- Mar 100 Person: 0.2146
- Map Car: 0.2941
- Mar 100 Car: 0.3683
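For inference, the fine-tuned checkpoint can be loaded through the standard `transformers` API for Mask2Former. The snippet below is a hedged sketch, not taken from the training code: the repository id is the one this card belongs to, and the image URL is only a placeholder.

```python
import torch
import requests
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

# Repository id from this model card; swap in a local path if preferred.
checkpoint = "woodBorjo/finetune-instance-segmentation-ade20k-mini-mask2former"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = Mask2FormerForUniversalSegmentation.from_pretrained(checkpoint)

# Placeholder image; any RGB image works.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Merge mask logits and class scores into per-instance masks at the original resolution.
results = processor.post_process_instance_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]
for segment in results["segments_info"]:
    label = model.config.id2label[segment["label_id"]]
    print(f"{label}: score={segment['score']:.3f}")
```

`results["segmentation"]` additionally holds a per-pixel instance-id map that can be visualized alongside the printed labels and scores.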
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (ADAMW_TORCH_FUSED) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 40.0
- mixed_precision_training: Native AMP
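The list above corresponds roughly to the `TrainingArguments` configuration sketched below. This is an assumption-laden reconstruction, not the original training script: the output directory is a placeholder, dataset preparation and the `Trainer` call are omitted, and `fp16=True` is assumed as the "Native AMP" setting.

```python
from transformers import TrainingArguments

# Hedged sketch of the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="finetune-instance-segmentation-ade20k-mini-mask2former",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size of 16
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="constant",
    num_train_epochs=40,
    fp16=True,                        # "Native AMP" mixed-precision training (assumed fp16)
)
```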
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Person | Mar 100 Person | Map Car | Mar 100 Car |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 34.5369 | 1.0 | 100 | 32.5221 | 0.1698 | 0.3433 | 0.1514 | 0.108 | 0.5888 | 0.7927 | 0.0881 | 0.2369 | 0.2807 | 0.2063 | 0.7072 | 0.8625 | 0.1244 | 0.1982 | 0.2151 | 0.3632 |
| 28.4476 | 2.0 | 200 | 30.4610 | 0.1927 | 0.3894 | 0.1741 | 0.1265 | 0.6125 | 0.8154 | 0.0938 | 0.2451 | 0.2845 | 0.2107 | 0.7076 | 0.8594 | 0.134 | 0.2015 | 0.2513 | 0.3675 |
| 26.9079 | 3.0 | 300 | 29.2391 | 0.2024 | 0.4059 | 0.1874 | 0.1358 | 0.6313 | 0.8212 | 0.0942 | 0.2521 | 0.2891 | 0.2155 | 0.712 | 0.8594 | 0.1377 | 0.2055 | 0.2671 | 0.3728 |
| 25.9174 | 4.0 | 400 | 28.7160 | 0.2108 | 0.4157 | 0.1969 | 0.1403 | 0.6379 | 0.8089 | 0.0957 | 0.2524 | 0.2873 | 0.2141 | 0.7104 | 0.8316 | 0.1447 | 0.207 | 0.277 | 0.3677 |
| 25.3421 | 5.0 | 500 | 28.3916 | 0.2147 | 0.4224 | 0.2003 | 0.1421 | 0.6461 | 0.8321 | 0.0961 | 0.2527 | 0.2918 | 0.2185 | 0.7119 | 0.8649 | 0.1483 | 0.2093 | 0.2811 | 0.3743 |
| 24.7176 | 6.0 | 600 | 28.2911 | 0.2185 | 0.4226 | 0.2064 | 0.1431 | 0.6528 | 0.8301 | 0.0971 | 0.2572 | 0.293 | 0.2197 | 0.7138 | 0.8562 | 0.1499 | 0.2118 | 0.2871 | 0.3741 |
| 24.13 | 7.0 | 700 | 27.7788 | 0.22 | 0.4299 | 0.207 | 0.1451 | 0.649 | 0.799 | 0.0972 | 0.2588 | 0.2933 | 0.221 | 0.7107 | 0.8316 | 0.154 | 0.2127 | 0.286 | 0.374 |
| 23.7107 | 8.0 | 800 | 27.7578 | 0.2176 | 0.4247 | 0.2027 | 0.1423 | 0.654 | 0.8346 | 0.0953 | 0.2553 | 0.2935 | 0.2199 | 0.7163 | 0.8649 | 0.1522 | 0.2119 | 0.2831 | 0.3752 |
| 23.4264 | 9.0 | 900 | 27.4461 | 0.2208 | 0.4289 | 0.2114 | 0.143 | 0.6539 | 0.83 | 0.0961 | 0.2545 | 0.2953 | 0.222 | 0.7154 | 0.8649 | 0.153 | 0.2116 | 0.2886 | 0.3789 |
| 23.2262 | 10.0 | 1000 | 27.7515 | 0.2238 | 0.4362 | 0.2112 | 0.1468 | 0.6532 | 0.8306 | 0.0956 | 0.2567 | 0.2944 | 0.2213 | 0.7136 | 0.8618 | 0.1541 | 0.2101 | 0.2936 | 0.3786 |
| 22.8743 | 11.0 | 1100 | 27.9457 | 0.226 | 0.442 | 0.2148 | 0.1495 | 0.6519 | 0.8096 | 0.0967 | 0.2555 | 0.2931 | 0.221 | 0.7086 | 0.8365 | 0.1555 | 0.2122 | 0.2964 | 0.3741 |
| 22.4995 | 12.0 | 1200 | 27.7485 | 0.2261 | 0.4364 | 0.217 | 0.1478 | 0.6582 | 0.8332 | 0.0973 | 0.256 | 0.2959 | 0.2233 | 0.7128 | 0.8594 | 0.1575 | 0.2137 | 0.2947 | 0.378 |
| 22.4303 | 13.0 | 1300 | 27.8137 | 0.2233 | 0.4266 | 0.2162 | 0.1459 | 0.6551 | 0.8331 | 0.0971 | 0.2538 | 0.2903 | 0.2166 | 0.7131 | 0.8587 | 0.1552 | 0.2112 | 0.2914 | 0.3693 |
| 22.1673 | 14.0 | 1400 | 27.4536 | 0.2253 | 0.4356 | 0.2176 | 0.1479 | 0.6554 | 0.8283 | 0.0984 | 0.2547 | 0.2931 | 0.2203 | 0.7116 | 0.8531 | 0.1564 | 0.2148 | 0.2943 | 0.3714 |
| 21.8272 | 15.0 | 1500 | 27.3626 | 0.2254 | 0.4398 | 0.2153 | 0.1475 | 0.6579 | 0.7987 | 0.0977 | 0.2532 | 0.2922 | 0.2201 | 0.7092 | 0.8229 | 0.1572 | 0.2118 | 0.2937 | 0.3726 |
| 21.6443 | 16.0 | 1600 | 27.1325 | 0.2286 | 0.4452 | 0.217 | 0.1502 | 0.6586 | 0.8147 | 0.0988 | 0.2578 | 0.2961 | 0.2235 | 0.714 | 0.8438 | 0.1563 | 0.2146 | 0.301 | 0.3776 |
| 21.5326 | 17.0 | 1700 | 27.5965 | 0.2275 | 0.44 | 0.2192 | 0.1482 | 0.6594 | 0.8221 | 0.0986 | 0.2584 | 0.2954 | 0.2227 | 0.7134 | 0.8531 | 0.1584 | 0.2144 | 0.2966 | 0.3764 |
| 21.472 | 18.0 | 1800 | 27.7345 | 0.2282 | 0.4368 | 0.2146 | 0.15 | 0.6577 | 0.8217 | 0.0983 | 0.2559 | 0.2958 | 0.223 | 0.714 | 0.8531 | 0.1571 | 0.2135 | 0.2992 | 0.378 |
| 21.053 | 19.0 | 1900 | 27.4663 | 0.2266 | 0.4383 | 0.2143 | 0.148 | 0.6569 | 0.8177 | 0.0977 | 0.2548 | 0.2905 | 0.2177 | 0.7097 | 0.8413 | 0.1571 | 0.2095 | 0.296 | 0.3714 |
| 20.9179 | 20.0 | 2000 | 27.5020 | 0.226 | 0.441 | 0.2115 | 0.1479 | 0.6601 | 0.7901 | 0.0976 | 0.2546 | 0.2927 | 0.2203 | 0.7115 | 0.8167 | 0.1564 | 0.2135 | 0.2955 | 0.3719 |
| 20.6969 | 21.0 | 2100 | 27.2502 | 0.2296 | 0.4421 | 0.214 | 0.151 | 0.6619 | 0.8299 | 0.0992 | 0.257 | 0.2938 | 0.221 | 0.712 | 0.8531 | 0.1586 | 0.2139 | 0.3007 | 0.3737 |
| 20.5419 | 22.0 | 2200 | 27.7422 | 0.2307 | 0.4456 | 0.2169 | 0.1533 | 0.6568 | 0.8284 | 0.0995 | 0.2575 | 0.2946 | 0.2221 | 0.7113 | 0.8531 | 0.1581 | 0.2132 | 0.3033 | 0.3761 |
| 20.5334 | 23.0 | 2300 | 27.0527 | 0.2271 | 0.4377 | 0.2171 | 0.1482 | 0.663 | 0.8093 | 0.0966 | 0.2542 | 0.2904 | 0.2172 | 0.7125 | 0.8413 | 0.1559 | 0.2092 | 0.2983 | 0.3717 |
| 20.3248 | 24.0 | 2400 | 27.8191 | 0.2299 | 0.4435 | 0.2155 | 0.1518 | 0.6572 | 0.82 | 0.0991 | 0.2577 | 0.2957 | 0.2237 | 0.7086 | 0.8531 | 0.1571 | 0.2113 | 0.3026 | 0.38 |
| 20.2354 | 25.0 | 2500 | 27.6890 | 0.2283 | 0.4423 | 0.2154 | 0.1506 | 0.6592 | 0.8293 | 0.0984 | 0.2569 | 0.2964 | 0.2238 | 0.7135 | 0.8562 | 0.1565 | 0.2146 | 0.3 | 0.3782 |
| 20.1336 | 26.0 | 2600 | 28.0179 | 0.2282 | 0.4346 | 0.2132 | 0.1499 | 0.6648 | 0.813 | 0.0976 | 0.2553 | 0.2928 | 0.2191 | 0.7175 | 0.8451 | 0.1559 | 0.2139 | 0.3004 | 0.3717 |
| 19.7615 | 27.0 | 2700 | 27.9383 | 0.2287 | 0.4421 | 0.216 | 0.1512 | 0.6594 | 0.8105 | 0.0995 | 0.2572 | 0.293 | 0.22 | 0.7131 | 0.8444 | 0.1582 | 0.2148 | 0.2991 | 0.3711 |
| 19.7833 | 28.0 | 2800 | 27.5669 | 0.2283 | 0.4403 | 0.2146 | 0.1501 | 0.6567 | 0.7939 | 0.0976 | 0.2573 | 0.2922 | 0.2204 | 0.7079 | 0.8167 | 0.1573 | 0.2137 | 0.2993 | 0.3708 |
| 19.6696 | 29.0 | 2900 | 27.6334 | 0.2283 | 0.4357 | 0.2157 | 0.149 | 0.6596 | 0.7861 | 0.0977 | 0.2546 | 0.2895 | 0.2168 | 0.7108 | 0.8142 | 0.1567 | 0.2106 | 0.2998 | 0.3684 |
| 19.5177 | 30.0 | 3000 | 28.1526 | 0.2319 | 0.4429 | 0.2135 | 0.1528 | 0.6626 | 0.8203 | 0.0973 | 0.2577 | 0.2957 | 0.2228 | 0.7146 | 0.8524 | 0.1593 | 0.2135 | 0.3045 | 0.3779 |
| 19.4806 | 31.0 | 3100 | 27.8846 | 0.227 | 0.4398 | 0.2143 | 0.1499 | 0.6564 | 0.8211 | 0.0965 | 0.2564 | 0.2916 | 0.2186 | 0.712 | 0.8438 | 0.1584 | 0.2138 | 0.2957 | 0.3695 |
| 19.194 | 32.0 | 3200 | 28.0200 | 0.2286 | 0.4407 | 0.2161 | 0.1501 | 0.6616 | 0.8088 | 0.0971 | 0.256 | 0.2911 | 0.2179 | 0.7134 | 0.8326 | 0.1603 | 0.2147 | 0.2969 | 0.3675 |
| 19.2278 | 33.0 | 3300 | 27.3223 | 0.2312 | 0.4385 | 0.2168 | 0.1511 | 0.6626 | 0.8097 | 0.0979 | 0.2547 | 0.2916 | 0.2191 | 0.709 | 0.842 | 0.1604 | 0.212 | 0.302 | 0.3711 |
| 19.048 | 34.0 | 3400 | 28.1198 | 0.2284 | 0.4417 | 0.2148 | 0.1507 | 0.6603 | 0.8197 | 0.0981 | 0.2566 | 0.2933 | 0.2206 | 0.7114 | 0.85 | 0.158 | 0.2145 | 0.2988 | 0.3722 |
| 18.9002 | 35.0 | 3500 | 28.7206 | 0.2299 | 0.4379 | 0.2155 | 0.1524 | 0.6601 | 0.8173 | 0.0983 | 0.2597 | 0.294 | 0.2214 | 0.7118 | 0.8476 | 0.1601 | 0.2132 | 0.2996 | 0.3747 |
| 18.9445 | 36.0 | 3600 | 28.2101 | 0.2301 | 0.4442 | 0.2144 | 0.1506 | 0.6613 | 0.8214 | 0.0983 | 0.2577 | 0.2931 | 0.2205 | 0.7102 | 0.8507 | 0.1599 | 0.2142 | 0.3004 | 0.3719 |
| 18.7087 | 37.0 | 3700 | 28.8212 | 0.2299 | 0.4437 | 0.2113 | 0.15 | 0.6698 | 0.7972 | 0.0987 | 0.2574 | 0.2927 | 0.2192 | 0.7188 | 0.8236 | 0.1621 | 0.215 | 0.2978 | 0.3704 |
| 18.7487 | 38.0 | 3800 | 28.7313 | 0.2293 | 0.4405 | 0.2129 | 0.1489 | 0.6649 | 0.7981 | 0.0975 | 0.2576 | 0.2941 | 0.2216 | 0.7134 | 0.8253 | 0.1598 | 0.2147 | 0.2988 | 0.3735 |
| 18.594 | 39.0 | 3900 | 28.3669 | 0.2304 | 0.4431 | 0.2138 | 0.1522 | 0.6619 | 0.8048 | 0.0974 | 0.2592 | 0.2973 | 0.2252 | 0.7117 | 0.85 | 0.1612 | 0.2146 | 0.2995 | 0.38 |
| 18.4462 | 40.0 | 4000 | 28.6625 | 0.2266 | 0.4359 | 0.2107 | 0.1469 | 0.6658 | 0.8156 | 0.0959 | 0.2562 | 0.2914 | 0.2182 | 0.7126 | 0.8476 | 0.159 | 0.2146 | 0.2941 | 0.3683 |
### Framework versions
- Transformers 4.57.1
- PyTorch 2.9.0+cu128
- Datasets 4.3.0
- Tokenizers 0.22.1