--- Setting up PumpDataModule ---
✅ DataModule setup complete.
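The internals of PumpDataModule are not shown in the log; judging from the model summary below, it serves batches of 51 sensor features reshaped to (1, 51, 1) with a 3-dimensional regression target. A minimal plain-PyTorch sketch under those assumptions (all names and shapes here are inferred, not taken from the real module):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class PumpDataset(Dataset):
    """Hypothetical dataset: each sample is a vector of 51 features,
    reshaped to (1, 51, 1) so a Conv2d stack can consume it, paired with
    a 3-dimensional regression target."""

    def __init__(self, features, targets):
        self.features = torch.as_tensor(features, dtype=torch.float32)
        self.targets = torch.as_tensor(targets, dtype=torch.float32)

    def __len__(self):
        return len(self.features)

    def __getitem__(self, idx):
        return self.features[idx].reshape(1, 51, 1), self.targets[idx]

# Random placeholder data; the real module would load pump sensor records.
train_loader = DataLoader(
    PumpDataset(torch.randn(64, 51), torch.randn(64, 3)),
    batch_size=16, shuffle=True,
)
```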
==========================================================================================
Layer (type:depth-idx)                   Output Shape              Param #
==========================================================================================
NeuralNetwork                            [1, 3]                    --
├─ModuleList: 1-1                        --                        --
│    └─BatchNorm2d: 2-1                  [1, 1, 51, 1]             2
│    └─Conv2d: 2-2                       [1, 8, 26, 1]             32
│    └─BatchNorm2d: 2-3                  [1, 8, 26, 1]             16
│    └─ReLU: 2-4                         [1, 8, 26, 1]             --
│    └─Conv2d: 2-5                       [1, 16, 13, 1]            400
│    └─BatchNorm2d: 2-6                  [1, 16, 13, 1]            32
│    └─ReLU: 2-7                         [1, 16, 13, 1]            --
│    └─Conv2d: 2-8                       [1, 32, 7, 1]             1,568
│    └─BatchNorm2d: 2-9                  [1, 32, 7, 1]             64
│    └─ReLU: 2-10                        [1, 32, 7, 1]             --
│    └─AvgPool2d: 2-11                   [1, 32, 4, 1]             --
│    └─Flatten: 2-12                     [1, 128]                  --
│    └─Linear: 2-13                      [1, 64]                   8,256
│    └─ReLU: 2-14                        [1, 64]                   --
│    └─Linear: 2-15                      [1, 3]                    195
==========================================================================================
Total params: 10,565
Trainable params: 10,565
Non-trainable params: 0
Total mult-adds (M): 0.03
==========================================================================================
Input size (MB): 0.00
Forward/backward pass size (MB): 0.01
Params size (MB): 0.04
Estimated Total Size (MB): 0.05
==========================================================================================
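The architecture can be reconstructed from the summary: kernel sizes, strides and padding below are inferred from the output shapes and parameter counts (e.g. 1*8*3 + 8 = 32 for the first conv), so this is a sketch consistent with the table, not the original source:

```python
import torch
import torch.nn as nn

class NeuralNetwork(nn.Module):
    """Sketch reconstructed from the torchinfo summary; layer hyper-
    parameters are inferred from output shapes and param counts."""

    def __init__(self):
        super().__init__()
        self.layers = nn.ModuleList([
            nn.BatchNorm2d(1),                          # 2 params
            nn.Conv2d(1, 8, (3, 1), (2, 1), (1, 0)),    # 51 -> 26, 32 params
            nn.BatchNorm2d(8),                          # 16 params
            nn.ReLU(),
            nn.Conv2d(8, 16, (3, 1), (2, 1), (1, 0)),   # 26 -> 13, 400 params
            nn.BatchNorm2d(16),                         # 32 params
            nn.ReLU(),
            nn.Conv2d(16, 32, (3, 1), (2, 1), (1, 0)),  # 13 -> 7, 1,568 params
            nn.BatchNorm2d(32),                         # 64 params
            nn.ReLU(),
            nn.AvgPool2d((2, 1), ceil_mode=True),       # 7 -> 4
            nn.Flatten(),                               # 32 * 4 = 128
            nn.Linear(128, 64),                         # 8,256 params
            nn.ReLU(),
            nn.Linear(64, 3),                           # 195 params
        ])

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x
```

This reproduces the summary's total of 10,565 parameters and the [1, 3] output for a [1, 1, 51, 1] input.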

--- Starting FP32 Training ---
Epoch: 1         LR: 0.001       Train Loss: 0.79987     Val Loss: 0.63328
Epoch: 2         LR: 0.001       Train Loss: 0.60594     Val Loss: 0.51342
Epoch: 3         LR: 0.00099     Train Loss: 0.51595     Val Loss: 0.45563
Epoch: 4         LR: 0.00099     Train Loss: 0.46011     Val Loss: 0.51723
Epoch: 5         LR: 0.00098     Train Loss: 0.42488     Val Loss: 0.51573
Epoch: 6         LR: 0.00098     Train Loss: 0.40542     Val Loss: 0.57677
Epoch: 7         LR: 0.00097     Train Loss: 0.38991     Val Loss: 0.42355
Epoch: 8         LR: 0.00096     Train Loss: 0.38141     Val Loss: 0.40248
Epoch: 9         LR: 0.00095     Train Loss: 0.37117     Val Loss: 0.32169
Epoch: 10        LR: 0.00093     Train Loss: 0.36391     Val Loss: 0.32408
Epoch: 11        LR: 0.00092     Train Loss: 0.35921     Val Loss: 0.31018
Epoch: 12        LR: 0.0009      Train Loss: 0.34965     Val Loss: 0.29195
Epoch: 13        LR: 0.00089     Train Loss: 0.34727     Val Loss: 0.28629
Epoch: 14        LR: 0.00087     Train Loss: 0.34658     Val Loss: 0.27471
Epoch: 15        LR: 0.00085     Train Loss: 0.33536     Val Loss: 0.27776
Epoch: 16        LR: 0.00083     Train Loss: 0.33213     Val Loss: 0.26858
Epoch: 17        LR: 0.00081     Train Loss: 0.32734     Val Loss: 0.27044
Epoch: 18        LR: 0.00079     Train Loss: 0.32247     Val Loss: 0.2658
Epoch: 19        LR: 0.00077     Train Loss: 0.32387     Val Loss: 0.26363
Epoch: 20        LR: 0.00075     Train Loss: 0.32181     Val Loss: 0.25941
Epoch: 21        LR: 0.00073     Train Loss: 0.31386     Val Loss: 0.25478
Epoch: 22        LR: 0.0007      Train Loss: 0.31154     Val Loss: 0.261
Epoch: 23        LR: 0.00068     Train Loss: 0.31439     Val Loss: 0.25658
Epoch: 24        LR: 0.00065     Train Loss: 0.30751     Val Loss: 0.25245
Epoch: 25        LR: 0.00063     Train Loss: 0.31251     Val Loss: 0.26561
Epoch: 26        LR: 0.0006      Train Loss: 0.30942     Val Loss: 0.24342
Epoch: 27        LR: 0.00058     Train Loss: 0.301       Val Loss: 0.24642
Epoch: 28        LR: 0.00055     Train Loss: 0.30431     Val Loss: 0.249
Epoch: 29        LR: 0.00053     Train Loss: 0.29906     Val Loss: 0.23579
Epoch: 30        LR: 0.0005      Train Loss: 0.29639     Val Loss: 0.23665
Epoch: 31        LR: 0.00047     Train Loss: 0.29427     Val Loss: 0.23671
Epoch: 32        LR: 0.00045     Train Loss: 0.29512     Val Loss: 0.23978
Epoch: 33        LR: 0.00042     Train Loss: 0.29266     Val Loss: 0.24023
Epoch: 34        LR: 0.0004      Train Loss: 0.29295     Val Loss: 0.23724
Epoch: 35        LR: 0.00037     Train Loss: 0.29217     Val Loss: 0.23734
Epoch: 36        LR: 0.00035     Train Loss: 0.28855     Val Loss: 0.23096
Epoch: 37        LR: 0.00032     Train Loss: 0.29108     Val Loss: 0.23042
Epoch: 38        LR: 0.0003      Train Loss: 0.28774     Val Loss: 0.23451
Epoch: 39        LR: 0.00027     Train Loss: 0.28866     Val Loss: 0.23563
Epoch: 40        LR: 0.00025     Train Loss: 0.28885     Val Loss: 0.22655
Epoch: 41        LR: 0.00023     Train Loss: 0.28423     Val Loss: 0.22905
Epoch: 42        LR: 0.00021     Train Loss: 0.28273     Val Loss: 0.22855
Epoch: 43        LR: 0.00019     Train Loss: 0.2833      Val Loss: 0.23527
Epoch: 44        LR: 0.00017     Train Loss: 0.28342     Val Loss: 0.22797
Epoch: 45        LR: 0.00015     Train Loss: 0.28391     Val Loss: 0.2227
Epoch: 46        LR: 0.00013     Train Loss: 0.28227     Val Loss: 0.23132
Epoch: 47        LR: 0.00011     Train Loss: 0.28027     Val Loss: 0.22465
Epoch: 48        LR: 0.0001      Train Loss: 0.28037     Val Loss: 0.23329
Epoch: 49        LR: 8e-05       Train Loss: 0.27888     Val Loss: 0.22557
Epoch: 50        LR: 7e-05       Train Loss: 0.27764     Val Loss: 0.22279
Epoch: 51        LR: 5e-05       Train Loss: 0.27861     Val Loss: 0.22241
Epoch: 52        LR: 4e-05       Train Loss: 0.27925     Val Loss: 0.22759
Epoch: 53        LR: 3e-05       Train Loss: 0.27808     Val Loss: 0.22287
Epoch: 54        LR: 2e-05       Train Loss: 0.27832     Val Loss: 0.22674
Epoch: 55        LR: 2e-05       Train Loss: 0.27812     Val Loss: 0.21964
Epoch: 56        LR: 1e-05       Train Loss: 0.27747     Val Loss: 0.2238
Epoch: 57        LR: 1e-05       Train Loss: 0.27761     Val Loss: 0.22825
Epoch: 58        LR: 0.0         Train Loss: 0.27548     Val Loss: 0.21883
Epoch: 59        LR: 0.0         Train Loss: 0.27618     Val Loss: 0.21848
Epoch: 60        LR: 0.0         Train Loss: 0.27732     Val Loss: 0.22458
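The LR column decays from 1e-3 at epoch 1 to ~0 at epoch 60 and passes through 5e-4 at the halfway point, which is consistent with cosine annealing over the 60 epochs. A minimal sketch of such a loop (the optimizer choice, loss function and data here are assumptions; only the schedule shape is taken from the log):

```python
import torch

model = torch.nn.Linear(51, 3)  # stand-in for the real network
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=60)
loss_fn = torch.nn.MSELoss()    # assumed; the log only reports a generic loss

lr_history = []
for epoch in range(1, 61):
    x, y = torch.randn(16, 51), torch.randn(16, 3)  # placeholder batch
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    scheduler.step()
    lr_history.append(scheduler.get_last_lr()[0])  # matches the LR column
```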

Trained FP32 Model R2-Score: 0.77479    SMAPE: 0.676
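The two reported metrics can be computed as below. Note that SMAPE has several common variants (fractional vs. percentage, with or without the factor of 2), so the exact formula behind the logged 0.676 is an assumption; this sketch uses a fractional symmetric form:

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination; multi-output targets are pooled here,
    though the aggregation used in the log is unknown."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean(axis=0)) ** 2)
    return 1.0 - ss_res / ss_tot

def smape(y_true, y_pred, eps=1e-8):
    """Symmetric mean absolute percentage error as a fraction; one of
    several common SMAPE definitions."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.mean(2.0 * np.abs(y_pred - y_true)
                   / (np.abs(y_true) + np.abs(y_pred) + eps))
```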


--- Starting QAT Fine-Tuning ---
Epoch: 1         LR: 0.0001      Train Loss: 0.7146      Val Loss: 0.84786
Epoch: 2         LR: 0.0001      Train Loss: 0.6457      Val Loss: 0.59974
Epoch: 3         LR: 0.0001      Train Loss: 0.60398     Val Loss: 0.66121
Epoch: 4         LR: 0.0001      Train Loss: 0.59204     Val Loss: 0.6158
Epoch: 5         LR: 9e-05       Train Loss: 0.57161     Val Loss: 0.54803
Epoch: 6         LR: 9e-05       Train Loss: 0.56271     Val Loss: 0.50966
Epoch: 7         LR: 9e-05       Train Loss: 0.56116     Val Loss: 0.53495
Epoch: 8         LR: 8e-05       Train Loss: 0.5611      Val Loss: 0.5329
Epoch: 9         LR: 8e-05       Train Loss: 0.55009     Val Loss: 0.51751
Epoch: 10        LR: 8e-05       Train Loss: 0.55351     Val Loss: 0.56883
Epoch: 11        LR: 7e-05       Train Loss: 0.54568     Val Loss: 0.52725
Epoch: 12        LR: 7e-05       Train Loss: 0.54912     Val Loss: 0.50522
Epoch: 13        LR: 6e-05       Train Loss: 0.5448      Val Loss: 0.52417
Epoch: 14        LR: 6e-05       Train Loss: 0.54506     Val Loss: 0.51931
Freezing BN for subsequent epochs
Epoch: 15        LR: 5e-05       Train Loss: 0.60983     Val Loss: 0.50065
Epoch: 16        LR: 4e-05       Train Loss: 0.56399     Val Loss: 0.53103
Epoch: 17        LR: 4e-05       Train Loss: 0.55697     Val Loss: 0.53592
Epoch: 18        LR: 3e-05       Train Loss: 0.5681      Val Loss: 0.50948
Epoch: 19        LR: 3e-05       Train Loss: 0.56953     Val Loss: 0.47122
Epoch: 20        LR: 3e-05       Train Loss: 0.55753     Val Loss: 0.53415
Epoch: 21        LR: 2e-05       Train Loss: 0.56054     Val Loss: 0.48304
Epoch: 22        LR: 2e-05       Train Loss: 0.54917     Val Loss: 0.51425
Epoch: 23        LR: 1e-05       Train Loss: 0.54795     Val Loss: 0.46628
Epoch: 24        LR: 1e-05       Train Loss: 0.54416     Val Loss: 0.53487
Epoch: 25        LR: 1e-05       Train Loss: 0.53473     Val Loss: 0.50085
Epoch: 26        LR: 0.0         Train Loss: 0.5357      Val Loss: 0.47827
Epoch: 27        LR: 0.0         Train Loss: 0.5357      Val Loss: 0.47469
Epoch: 28        LR: 0.0         Train Loss: 0.53861     Val Loss: 0.47364
Epoch: 29        LR: 0.0         Train Loss: 0.52794     Val Loss: 0.48684
Freezing ranges for subsequent epochs
Epoch: 30        LR: 0.0         Train Loss: 0.5327      Val Loss: 0.4694

QAT Model R2-Score: 0.52503     SMAPE: 1.057
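The two log messages ("Freezing BN" at epoch 15, "Freezing ranges" at epoch 30) match the standard PyTorch quantization-aware-training recipe: fine-tune the fake-quantized model, freeze batch-norm statistics partway through, then freeze the observer ranges near the end. A sketch under the assumption that `torch.ao.quantization` is used, shown on a toy Conv-BN-ReLU block rather than the full network:

```python
import torch
import torch.ao.quantization as tq

# Toy stand-in for one Conv-BN-ReLU block of the trained FP32 model.
model = torch.nn.Sequential(
    torch.nn.Conv2d(1, 8, (3, 1), (2, 1), (1, 0)),
    torch.nn.BatchNorm2d(8),
    torch.nn.ReLU(),
)
model.train()
model.qconfig = tq.get_default_qat_qconfig("fbgemm")
model = tq.fuse_modules_qat(model, [["0", "1", "2"]])  # -> ConvBnReLU2d
qat_model = tq.prepare_qat(model)

for epoch in range(1, 31):
    # ... one fine-tuning epoch (cosine-annealed LR from 1e-4, per the log) ...
    if epoch == 15:
        # stop updating BN running statistics
        qat_model.apply(torch.nn.intrinsic.qat.freeze_bn_stats)
    if epoch == 30:
        # stop updating fake-quantization ranges (freeze observers)
        qat_model.apply(tq.disable_observer)
```

Freezing in this order lets the weights first adapt to quantization noise while statistics and ranges still track the data, then settle against fixed quantization parameters before conversion.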