| Name | ParityModified-1647044117-29644-1-0_1 |
| Workunit | 10363746 |
| Created | 23 Mar 2022, 1:40:08 UTC |
| Sent | 23 Mar 2022, 3:34:47 UTC |
| Report deadline | 31 Mar 2022, 3:34:47 UTC |
| Received | 13 Apr 2022, 19:07:59 UTC |
| Server state | Over |
| Outcome | Success |
| Client state | Done |
| Exit status | 0 (0x00000000) |
| Computer ID | 21129 |
| Run time | 2 hours 44 min 37 sec |
| CPU time | 2 hours 27 min 33 sec |
| Validate state | Task was reported too late to validate |
| Credit | 0.00 |
| Device peak FLOPS | 4,378.10 GFLOPS |
| Application version | Machine Learning Dataset Generator (GPU) v9.75 (cuda10200) windows_x86_64 |
| Peak working set size | 1.62 GB |
| Peak swap size | 3.61 GB |
| Peak disk usage | 1.54 GB |
<core_client_version>7.16.20</core_client_version> <![CDATA[ <stderr_txt> 47 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-04-13 08:58:48 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-04-13 08:58:48 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-04-13 08:58:48 main:494] : INFO : Creating Model [2022-04-13 08:58:49 main:507] : INFO : Preparing config file [2022-04-13 08:58:49 main:511] : INFO : Found checkpoint, attempting to load... [2022-04-13 08:58:49 main:512] : INFO : Loading config [2022-04-13 08:58:49 main:514] : INFO : Loading state [2022-04-13 08:59:00 main:559] : INFO : Loading DataLoader into Memory [2022-04-13 08:59:00 main:562] : INFO : Starting Training [2022-04-13 08:59:09 main:574] : INFO : Epoch 1777 | loss: 0.0313955 | val_loss: 0.0312801 | Time: 8527.75 ms [2022-04-13 08:59:14 main:574] : INFO : Epoch 1778 | loss: 0.0311886 | val_loss: 0.0311729 | Time: 4638.11 ms [2022-04-13 08:59:18 main:574] : INFO : Epoch 1779 | loss: 0.0311329 | val_loss: 0.0311572 | Time: 4492.13 ms [2022-04-13 08:59:23 main:574] : INFO : Epoch 1780 | loss: 0.0311258 | val_loss: 0.0311521 | Time: 4363.09 ms [2022-04-13 08:59:27 main:574] : INFO : Epoch 1781 | loss: 0.031125 | val_loss: 0.0311544 | Time: 4167.19 ms [2022-04-13 08:59:32 main:574] : INFO : Epoch 1782 | loss: 0.0311236 | val_loss: 0.0311507 | Time: 4372.36 ms [2022-04-13 08:59:37 main:574] : INFO : Epoch 1783 | loss: 0.0311229 | val_loss: 0.0311558 | Time: 4204.37 ms [2022-04-13 08:59:41 main:574] : INFO : Epoch 1784 | loss: 0.0311235 | val_loss: 0.0311509 | Time: 4415.45 ms [2022-04-13 08:59:46 main:574] : INFO : Epoch 1785 | loss: 0.0311221 | val_loss: 0.03115 | Time: 4294.65 ms [2022-04-13 08:59:50 main:574] : INFO : Epoch 1786 | loss: 0.0311216 | val_loss: 0.0311546 | Time: 4384.05 ms [2022-04-13 08:59:55 main:574] : INFO : Epoch 1787 | loss: 0.0311223 | val_loss: 0.0311544 | Time: 4310.71 ms [2022-04-13 08:59:59 main:574] : INFO : Epoch 1788 | loss: 0.0311224 | val_loss: 0.03115 | Time: 4236.83 ms [2022-04-13 09:00:04 main:574] : INFO : Epoch 1789 | loss: 0.0311225 | val_loss: 0.0311529 | Time: 4185.44 ms [2022-04-13 09:00:08 main:574] : INFO : Epoch 1790 | loss: 0.0311248 | val_loss: 0.0311515 | Time: 4191.88 ms [2022-04-13 09:00:12 main:574] : INFO : Epoch 1791 | loss: 0.0311208 | val_loss: 0.0311498 | Time: 4266.13 ms [2022-04-13 09:00:17 main:574] : INFO : Epoch 1792 | loss: 0.031122 | val_loss: 0.0311527 | Time: 4241.82 ms [2022-04-13 09:00:21 main:574] : INFO : Epoch 1793 | loss: 0.0311232 | val_loss: 0.0311494 | Time: 4310.65 ms [2022-04-13 09:00:26 main:574] : INFO : Epoch 1794 | loss: 0.0311207 | val_loss: 0.0311508 | Time: 4243.81 ms [2022-04-13 09:00:30 main:574] : INFO : Epoch 1795 | loss: 0.031119 | val_loss: 0.0311522 | Time: 4166.42 ms [2022-04-13 09:00:35 main:574] : INFO : Epoch 1796 | loss: 0.0311192 | val_loss: 0.0311484 | Time: 4259.97 ms [2022-04-13 09:00:39 main:574] : INFO : Epoch 1797 | loss: 0.0311203 | val_loss: 0.0311543 | Time: 4256.73 ms [2022-04-13 09:00:43 main:574] : INFO : Epoch 1798 | loss: 0.0311219 | val_loss: 0.0311556 | Time: 3991.51 ms [2022-04-13 09:00:48 main:574] : INFO : Epoch 1799 | loss: 0.0311214 | val_loss: 0.0311504 | Time: 4027.78 ms [2022-04-13 09:00:52 main:574] : INFO : Epoch 1800 | loss: 0.0311185 | val_loss: 0.031151 | Time: 3998.16 ms [2022-04-13 09:00:56 main:574] : INFO : Epoch 1801 | loss: 0.0311164 | val_loss: 0.0311526 | Time: 4087.25 ms 
[2022-04-13 09:01:00 main:574] : INFO : Epoch 1802 | loss: 0.0311195 | val_loss: 0.0311491 | Time: 3999.27 ms [2022-04-13 09:01:05 main:574] : INFO : Epoch 1803 | loss: 0.0311162 | val_loss: 0.0311459 | Time: 4087.69 ms [2022-04-13 09:01:09 main:574] : INFO : Epoch 1804 | loss: 0.0311154 | val_loss: 0.0311513 | Time: 4124.88 ms [2022-04-13 09:01:14 main:574] : INFO : Epoch 1805 | loss: 0.0311139 | val_loss: 0.0311513 | Time: 4404.55 ms [2022-04-13 09:01:18 main:574] : INFO : Epoch 1806 | loss: 0.0311138 | val_loss: 0.0311497 | Time: 4418.34 ms [2022-04-13 09:01:23 main:574] : INFO : Epoch 1807 | loss: 0.0311136 | val_loss: 0.03115 | Time: 4196 ms [2022-04-13 09:01:28 main:574] : INFO : Epoch 1808 | loss: 0.0311145 | val_loss: 0.0311537 | Time: 4755.36 ms [2022-04-13 09:01:32 main:574] : INFO : Epoch 1809 | loss: 0.0311151 | val_loss: 0.0311512 | Time: 4283.22 ms [2022-04-13 09:01:37 main:574] : INFO : Epoch 1810 | loss: 0.0311136 | val_loss: 0.0311494 | Time: 4261.9 ms [2022-04-13 09:01:41 main:574] : INFO : Epoch 1811 | loss: 0.0311137 | val_loss: 0.0311513 | Time: 4182.39 ms [2022-04-13 09:01:45 main:574] : INFO : Epoch 1812 | loss: 0.0311111 | val_loss: 0.0311461 | Time: 3983.1 ms [2022-04-13 09:01:50 main:574] : INFO : Epoch 1813 | loss: 0.0311129 | val_loss: 0.0311454 | Time: 4027.2 ms [2022-04-13 09:01:54 main:574] : INFO : Epoch 1814 | loss: 0.031111 | val_loss: 0.0311454 | Time: 4023.22 ms [2022-04-13 09:01:58 main:574] : INFO : Epoch 1815 | loss: 0.0311102 | val_loss: 0.031145 | Time: 3989.39 ms [2022-04-13 09:02:02 main:574] : INFO : Epoch 1816 | loss: 0.0311118 | val_loss: 0.0311441 | Time: 4036.66 ms [2022-04-13 09:02:06 main:574] : INFO : Epoch 1817 | loss: 0.0311139 | val_loss: 0.0311467 | Time: 3995.42 ms [2022-04-13 09:02:11 main:574] : INFO : Epoch 1818 | loss: 0.0311167 | val_loss: 0.0311471 | Time: 3997.94 ms [2022-04-13 09:02:15 main:574] : INFO : Epoch 1819 | loss: 0.0311146 | val_loss: 0.0311479 | Time: 4039.34 ms [2022-04-13 09:02:19 main:574] : INFO : Epoch 1820 | loss: 0.0311147 | val_loss: 0.0311518 | Time: 4188.64 ms [2022-04-13 09:02:24 main:574] : INFO : Epoch 1821 | loss: 0.0311119 | val_loss: 0.0311471 | Time: 3972.97 ms [2022-04-13 09:02:28 main:574] : INFO : Epoch 1822 | loss: 0.0311104 | val_loss: 0.0311451 | Time: 4035.57 ms [2022-04-13 09:02:32 main:574] : INFO : Epoch 1823 | loss: 0.0311126 | val_loss: 0.0311524 | Time: 4030.69 ms [2022-04-13 09:02:36 main:574] : INFO : Epoch 1824 | loss: 0.0311117 | val_loss: 0.0311522 | Time: 4106.93 ms [2022-04-13 09:02:40 main:574] : INFO : Epoch 1825 | loss: 0.0311144 | val_loss: 0.0311475 | Time: 3993.6 ms [2022-04-13 09:02:45 main:574] : INFO : Epoch 1826 | loss: 0.0311158 | val_loss: 0.031148 | Time: 4075.87 ms [2022-04-13 09:02:49 main:574] : INFO : Epoch 1827 | loss: 0.0311177 | val_loss: 0.0311461 | Time: 4021.84 ms [2022-04-13 09:02:53 main:574] : INFO : Epoch 1828 | loss: 0.0311189 | val_loss: 0.0311477 | Time: 4093.59 ms [2022-04-13 09:02:58 main:574] : INFO : Epoch 1829 | loss: 0.0311164 | val_loss: 0.0311453 | Time: 4025.14 ms [2022-04-13 09:03:02 main:574] : INFO : Epoch 1830 | loss: 0.0311147 | val_loss: 0.0311431 | Time: 4137.1 ms [2022-04-13 09:03:06 main:574] : INFO : Epoch 1831 | loss: 0.0311165 | val_loss: 0.0311577 | Time: 4029.63 ms [2022-04-13 09:03:10 main:574] : INFO : Epoch 1832 | loss: 0.0311165 | val_loss: 0.0311518 | Time: 4191.47 ms [2022-04-13 09:03:15 main:574] : INFO : Epoch 1833 | loss: 0.0311123 | val_loss: 0.0311487 | Time: 3946.06 ms [2022-04-13 09:03:19 main:574] : INFO : Epoch 
1834 | loss: 0.0311101 | val_loss: 0.031144 | Time: 4072.59 ms [2022-04-13 09:03:23 main:574] : INFO : Epoch 1835 | loss: 0.0311138 | val_loss: 0.031148 | Time: 3908.3 ms [2022-04-13 09:03:28 main:574] : INFO : Epoch 1836 | loss: 0.0311174 | val_loss: 0.0311515 | Time: 4173.31 ms [2022-04-13 09:03:32 main:574] : INFO : Epoch 1837 | loss: 0.031116 | val_loss: 0.0311451 | Time: 4130.71 ms [2022-04-13 09:03:37 main:574] : INFO : Epoch 1838 | loss: 0.0311127 | val_loss: 0.0311476 | Time: 4209.36 ms [2022-04-13 09:03:41 main:574] : INFO : Epoch 1839 | loss: 0.0311103 | val_loss: 0.031152 | Time: 4049.51 ms [2022-04-13 09:03:45 main:574] : INFO : Epoch 1840 | loss: 0.0311261 | val_loss: 0.0311505 | Time: 4092.23 ms [2022-04-13 09:03:49 main:574] : INFO : Epoch 1841 | loss: 0.0311346 | val_loss: 0.0311514 | Time: 3918.53 ms [2022-04-13 09:03:54 main:574] : INFO : Epoch 1842 | loss: 0.0311327 | val_loss: 0.0311498 | Time: 4251.14 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB) [2022-04-13 09:58:40 main:435] : INFO : Set logging level to 1 [2022-04-13 09:58:40 main:441] : INFO : Running in BOINC Client mode [2022-04-13 09:58:40 main:444] : INFO : Resolving all filenames [2022-04-13 09:58:40 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-04-13 09:58:40 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-04-13 09:58:40 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-04-13 09:58:40 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-04-13 09:58:40 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-04-13 09:58:40 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-04-13 09:58:40 main:474] : INFO : Configuration: [2022-04-13 09:58:40 main:475] : INFO : Model type: GRU [2022-04-13 09:58:40 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-04-13 09:58:41 main:477] : INFO : Max Epochs: 2048 [2022-04-13 09:58:41 main:478] : INFO : Batch Size: 128 [2022-04-13 09:58:41 main:479] : INFO : Learning Rate: 0.01 [2022-04-13 09:58:41 main:480] : INFO : Patience: 10 [2022-04-13 09:58:41 main:481] : INFO : Hidden Width: 12 [2022-04-13 09:58:41 main:482] : INFO : # Recurrent Layers: 4 [2022-04-13 09:58:41 main:483] : INFO : # Backend Layers: 4 [2022-04-13 09:58:41 main:484] : INFO : # Threads: 1 [2022-04-13 09:58:41 main:486] : INFO : Preparing Dataset [2022-04-13 09:58:41 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-04-13 09:58:41 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-04-13 09:59:02 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-04-13 09:59:02 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-04-13 09:59:02 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-04-13 09:59:02 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-04-13 09:59:03 main:494] : INFO : Creating Model [2022-04-13 09:59:03 main:507] : INFO : Preparing config file [2022-04-13 09:59:03 main:511] : INFO : Found checkpoint, attempting to load... 
[2022-04-13 09:59:03 main:512] : INFO : Loading config [2022-04-13 09:59:03 main:514] : INFO : Loading state [2022-04-13 09:59:12 main:559] : INFO : Loading DataLoader into Memory [2022-04-13 09:59:12 main:562] : INFO : Starting Training [2022-04-13 09:59:19 main:574] : INFO : Epoch 1813 | loss: 0.0312625 | val_loss: 0.0311973 | Time: 6736.04 ms [2022-04-13 09:59:24 main:574] : INFO : Epoch 1814 | loss: 0.0311339 | val_loss: 0.0311489 | Time: 4250.29 ms [2022-04-13 09:59:28 main:574] : INFO : Epoch 1815 | loss: 0.0311163 | val_loss: 0.0311486 | Time: 4261.96 ms [2022-04-13 09:59:33 main:574] : INFO : Epoch 1816 | loss: 0.031113 | val_loss: 0.0311506 | Time: 4269.76 ms [2022-04-13 09:59:37 main:574] : INFO : Epoch 1817 | loss: 0.0311135 | val_loss: 0.031151 | Time: 4138.96 ms [2022-04-13 09:59:42 main:574] : INFO : Epoch 1818 | loss: 0.0311141 | val_loss: 0.0311489 | Time: 4438.83 ms [2022-04-13 09:59:46 main:574] : INFO : Epoch 1819 | loss: 0.0311151 | val_loss: 0.0311534 | Time: 4239.56 ms [2022-04-13 09:59:50 main:574] : INFO : Epoch 1820 | loss: 0.0311233 | val_loss: 0.0311471 | Time: 4231.45 ms [2022-04-13 09:59:54 main:574] : INFO : Epoch 1821 | loss: 0.0311219 | val_loss: 0.0311481 | Time: 4115.9 ms [2022-04-13 09:59:59 main:574] : INFO : Epoch 1822 | loss: 0.031118 | val_loss: 0.0311494 | Time: 4362.92 ms [2022-04-13 10:00:03 main:574] : INFO : Epoch 1823 | loss: 0.0311182 | val_loss: 0.0311511 | Time: 4185.31 ms [2022-04-13 10:00:07 main:574] : INFO : Epoch 1824 | loss: 0.031121 | val_loss: 0.0311516 | Time: 4219.1 ms [2022-04-13 10:00:12 main:574] : INFO : Epoch 1825 | loss: 0.0311176 | val_loss: 0.031154 | Time: 4134.09 ms [2022-04-13 10:00:16 main:574] : INFO : Epoch 1826 | loss: 0.0311162 | val_loss: 0.0311521 | Time: 4169.15 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB) [2022-04-13 10:41:49 main:435] : INFO : Set logging level to 1 [2022-04-13 10:41:49 main:441] : INFO : Running in BOINC Client mode [2022-04-13 10:41:50 main:444] : INFO : Resolving all filenames [2022-04-13 10:41:50 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-04-13 10:41:50 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-04-13 10:41:50 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-04-13 10:41:50 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-04-13 10:41:50 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-04-13 10:41:50 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-04-13 10:41:50 main:474] : INFO : Configuration: [2022-04-13 10:41:50 main:475] : INFO : Model type: GRU [2022-04-13 10:41:50 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-04-13 10:41:50 main:477] : INFO : Max Epochs: 2048 [2022-04-13 10:41:50 main:478] : INFO : Batch Size: 128 [2022-04-13 10:41:50 main:479] : INFO : Learning Rate: 0.01 [2022-04-13 10:41:50 main:480] : INFO : Patience: 10 [2022-04-13 10:41:50 main:481] : INFO : Hidden Width: 12 [2022-04-13 10:41:50 main:482] : INFO : # Recurrent Layers: 4 [2022-04-13 10:41:50 main:483] : INFO : # Backend Layers: 4 [2022-04-13 10:41:50 main:484] : INFO : # Threads: 1 [2022-04-13 10:41:50 main:486] : INFO : Preparing Dataset [2022-04-13 10:41:50 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-04-13 10:41:51 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-04-13 10:41:53 
load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-04-13 10:41:53 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-04-13 10:41:53 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-04-13 10:41:53 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-04-13 10:41:53 main:494] : INFO : Creating Model [2022-04-13 10:41:53 main:507] : INFO : Preparing config file [2022-04-13 10:41:53 main:511] : INFO : Found checkpoint, attempting to load... [2022-04-13 10:41:53 main:512] : INFO : Loading config [2022-04-13 10:41:53 main:514] : INFO : Loading state [2022-04-13 10:41:55 main:559] : INFO : Loading DataLoader into Memory [2022-04-13 10:41:55 main:562] : INFO : Starting Training [2022-04-13 10:42:00 main:574] : INFO : Epoch 1813 | loss: 0.0312642 | val_loss: 0.0312005 | Time: 5140.29 ms [2022-04-13 10:42:05 main:574] : INFO : Epoch 1814 | loss: 0.0311394 | val_loss: 0.0311516 | Time: 4654.61 ms [2022-04-13 10:42:10 main:574] : INFO : Epoch 1815 | loss: 0.0311196 | val_loss: 0.0311518 | Time: 4676.18 ms [2022-04-13 10:42:14 main:574] : INFO : Epoch 1816 | loss: 0.0311163 | val_loss: 0.0311507 | Time: 4613.34 ms [2022-04-13 10:42:19 main:574] : INFO : Epoch 1817 | loss: 0.0311145 | val_loss: 0.0311455 | Time: 4657.91 ms [2022-04-13 10:42:24 main:574] : INFO : Epoch 1818 | loss: 0.0311129 | val_loss: 0.0311485 | Time: 4712.97 ms [2022-04-13 10:42:28 main:574] : INFO : Epoch 1819 | loss: 0.0311132 | val_loss: 0.0311554 | Time: 4565.32 ms [2022-04-13 10:42:33 main:574] : INFO : Epoch 1820 | loss: 0.0311173 | val_loss: 0.0311487 | Time: 4582.21 ms [2022-04-13 10:42:38 main:574] : INFO : Epoch 1821 | loss: 0.0311163 | val_loss: 0.0311517 | Time: 4643.32 ms [2022-04-13 10:42:42 main:574] : INFO : Epoch 1822 | loss: 0.0311149 | val_loss: 0.031154 | Time: 4642.88 ms [2022-04-13 10:42:47 main:574] : INFO : Epoch 1823 | loss: 0.031113 | val_loss: 0.0311495 | Time: 4696.88 ms [2022-04-13 10:42:52 main:574] : INFO : Epoch 1824 | loss: 0.0311138 | val_loss: 0.0311524 | Time: 5142.35 ms [2022-04-13 10:42:57 main:574] : INFO : Epoch 1825 | loss: 0.0311163 | val_loss: 0.0311513 | Time: 4613.28 ms [2022-04-13 10:43:02 main:574] : INFO : Epoch 1826 | loss: 0.03112 | val_loss: 0.0311478 | Time: 4743.88 ms [2022-04-13 10:43:06 main:574] : INFO : Epoch 1827 | loss: 0.0311152 | val_loss: 0.0311498 | Time: 4609.05 ms [2022-04-13 10:43:11 main:574] : INFO : Epoch 1828 | loss: 0.031113 | val_loss: 0.0311595 | Time: 4553.06 ms [2022-04-13 10:43:16 main:574] : INFO : Epoch 1829 | loss: 0.0311137 | val_loss: 0.0311519 | Time: 4710.13 ms [2022-04-13 10:43:21 main:574] : INFO : Epoch 1830 | loss: 0.0311122 | val_loss: 0.0311517 | Time: 4614.18 ms [2022-04-13 10:43:25 main:574] : INFO : Epoch 1831 | loss: 0.0311098 | val_loss: 0.0311566 | Time: 4531.7 ms [2022-04-13 10:43:30 main:574] : INFO : Epoch 1832 | loss: 0.0311106 | val_loss: 0.0311515 | Time: 4634.98 ms [2022-04-13 10:43:35 main:574] : INFO : Epoch 1833 | loss: 0.0311122 | val_loss: 0.0311585 | Time: 4637.68 ms [2022-04-13 10:43:39 main:574] : INFO : Epoch 1834 | loss: 0.0311129 | val_loss: 0.0311567 | Time: 4574.97 ms [2022-04-13 10:43:44 main:574] : INFO : Epoch 1835 | loss: 0.0311114 | val_loss: 0.0311512 | Time: 4667.97 ms [2022-04-13 10:43:49 main:574] : INFO : Epoch 1836 | loss: 0.0311116 | val_loss: 0.0311535 | Time: 4622.32 ms [2022-04-13 10:43:53 main:574] : INFO : Epoch 1837 | loss: 0.0311104 | val_loss: 0.031154 | 
Time: 4766.98 ms [2022-04-13 10:43:58 main:574] : INFO : Epoch 1838 | loss: 0.0311093 | val_loss: 0.0311525 | Time: 4690.35 ms [2022-04-13 10:44:03 main:574] : INFO : Epoch 1839 | loss: 0.0311086 | val_loss: 0.0311549 | Time: 4643.05 ms [2022-04-13 10:44:08 main:574] : INFO : Epoch 1840 | loss: 0.0311081 | val_loss: 0.0311539 | Time: 4844.36 ms [2022-04-13 10:44:13 main:574] : INFO : Epoch 1841 | loss: 0.03111 | val_loss: 0.0311527 | Time: 4709.43 ms [2022-04-13 10:44:17 main:574] : INFO : Epoch 1842 | loss: 0.0311063 | val_loss: 0.0311506 | Time: 4709.5 ms [2022-04-13 10:44:32 main:574] : INFO : Epoch 1843 | loss: 0.0311064 | val_loss: 0.0311494 | Time: 15030.2 ms [2022-04-13 10:44:37 main:574] : INFO : Epoch 1844 | loss: 0.0311071 | val_loss: 0.0311538 | Time: 4721.74 ms [2022-04-13 10:44:42 main:574] : INFO : Epoch 1845 | loss: 0.0311066 | val_loss: 0.0311554 | Time: 4629.54 ms [2022-04-13 10:44:47 main:574] : INFO : Epoch 1846 | loss: 0.0311088 | val_loss: 0.0311529 | Time: 4640.65 ms [2022-04-13 10:44:51 main:574] : INFO : Epoch 1847 | loss: 0.0311092 | val_loss: 0.0311481 | Time: 4784.1 ms [2022-04-13 10:44:56 main:574] : INFO : Epoch 1848 | loss: 0.031111 | val_loss: 0.0311565 | Time: 4620.87 ms [2022-04-13 10:45:01 main:574] : INFO : Epoch 1849 | loss: 0.0311124 | val_loss: 0.0311592 | Time: 4645.96 ms [2022-04-13 10:45:06 main:574] : INFO : Epoch 1850 | loss: 0.0311117 | val_loss: 0.0311501 | Time: 4788.57 ms [2022-04-13 10:45:11 main:574] : INFO : Epoch 1851 | loss: 0.0311139 | val_loss: 0.0311517 | Time: 4672.44 ms [2022-04-13 10:45:15 main:574] : INFO : Epoch 1852 | loss: 0.0311098 | val_loss: 0.0311562 | Time: 4647.38 ms [2022-04-13 10:45:20 main:574] : INFO : Epoch 1853 | loss: 0.0311094 | val_loss: 0.0311559 | Time: 4733.18 ms [2022-04-13 10:45:25 main:574] : INFO : Epoch 1854 | loss: 0.0311112 | val_loss: 0.0311523 | Time: 4587.42 ms [2022-04-13 10:45:30 main:574] : INFO : Epoch 1855 | loss: 0.0311101 | val_loss: 0.0311501 | Time: 4632.57 ms [2022-04-13 10:45:34 main:574] : INFO : Epoch 1856 | loss: 0.0311089 | val_loss: 0.031152 | Time: 4780.1 ms [2022-04-13 10:45:39 main:574] : INFO : Epoch 1857 | loss: 0.0311121 | val_loss: 0.0311557 | Time: 4645.87 ms [2022-04-13 10:45:44 main:574] : INFO : Epoch 1858 | loss: 0.03111 | val_loss: 0.0311544 | Time: 4779.62 ms [2022-04-13 10:45:49 main:574] : INFO : Epoch 1859 | loss: 0.0311081 | val_loss: 0.0311514 | Time: 4577.33 ms [2022-04-13 10:45:53 main:574] : INFO : Epoch 1860 | loss: 0.031107 | val_loss: 0.0311524 | Time: 4557.1 ms [2022-04-13 10:45:58 main:574] : INFO : Epoch 1861 | loss: 0.0311059 | val_loss: 0.0311491 | Time: 4610.74 ms [2022-04-13 10:46:03 main:574] : INFO : Epoch 1862 | loss: 0.0311038 | val_loss: 0.031156 | Time: 4627.6 ms [2022-04-13 10:46:07 main:574] : INFO : Epoch 1863 | loss: 0.0311055 | val_loss: 0.0311566 | Time: 4629.15 ms [2022-04-13 10:46:12 main:574] : INFO : Epoch 1864 | loss: 0.0311083 | val_loss: 0.0311615 | Time: 4626.13 ms [2022-04-13 10:46:17 main:574] : INFO : Epoch 1865 | loss: 0.0311107 | val_loss: 0.0311475 | Time: 4586.35 ms [2022-04-13 10:46:21 main:574] : INFO : Epoch 1866 | loss: 0.0311116 | val_loss: 0.0311496 | Time: 4630.8 ms [2022-04-13 10:46:26 main:574] : INFO : Epoch 1867 | loss: 0.0311093 | val_loss: 0.0311542 | Time: 4581.76 ms [2022-04-13 10:46:31 main:574] : INFO : Epoch 1868 | loss: 0.0311083 | val_loss: 0.0311471 | Time: 4590.1 ms [2022-04-13 10:46:35 main:574] : INFO : Epoch 1869 | loss: 0.031109 | val_loss: 0.0311479 | Time: 4711.5 ms [2022-04-13 10:46:40 main:574] : 
INFO : Epoch 1870 | loss: 0.0311044 | val_loss: 0.0311499 | Time: 4577.12 ms [2022-04-13 10:46:45 main:574] : INFO : Epoch 1871 | loss: 0.0311058 | val_loss: 0.031154 | Time: 4593.81 ms [2022-04-13 10:46:49 main:574] : INFO : Epoch 1872 | loss: 0.0311047 | val_loss: 0.0311535 | Time: 4668.88 ms [2022-04-13 10:46:54 main:574] : INFO : Epoch 1873 | loss: 0.0311049 | val_loss: 0.0311531 | Time: 4640.57 ms [2022-04-13 10:46:59 main:574] : INFO : Epoch 1874 | loss: 0.031107 | val_loss: 0.0311483 | Time: 4652.65 ms [2022-04-13 10:47:04 main:574] : INFO : Epoch 1875 | loss: 0.0311072 | val_loss: 0.0311526 | Time: 4684.07 ms [2022-04-13 10:47:08 main:574] : INFO : Epoch 1876 | loss: 0.0311053 | val_loss: 0.0311543 | Time: 4546.77 ms [2022-04-13 10:47:13 main:574] : INFO : Epoch 1877 | loss: 0.0311045 | val_loss: 0.0311524 | Time: 4615.13 ms [2022-04-13 10:47:18 main:574] : INFO : Epoch 1878 | loss: 0.0311082 | val_loss: 0.0311617 | Time: 4709.86 ms [2022-04-13 10:47:22 main:574] : INFO : Epoch 1879 | loss: 0.0311096 | val_loss: 0.0311474 | Time: 4549.69 ms [2022-04-13 10:47:27 main:574] : INFO : Epoch 1880 | loss: 0.0311062 | val_loss: 0.0311519 | Time: 4691.5 ms [2022-04-13 10:47:32 main:574] : INFO : Epoch 1881 | loss: 0.0311048 | val_loss: 0.0311464 | Time: 4627.11 ms [2022-04-13 10:47:36 main:574] : INFO : Epoch 1882 | loss: 0.0311115 | val_loss: 0.031156 | Time: 4685.6 ms [2022-04-13 10:47:41 main:574] : INFO : Epoch 1883 | loss: 0.0311096 | val_loss: 0.0311504 | Time: 4639.98 ms [2022-04-13 10:47:46 main:574] : INFO : Epoch 1884 | loss: 0.0311064 | val_loss: 0.0311467 | Time: 4575.33 ms [2022-04-13 10:47:50 main:574] : INFO : Epoch 1885 | loss: 0.0311054 | val_loss: 0.0311503 | Time: 4608.35 ms [2022-04-13 10:47:55 main:574] : INFO : Epoch 1886 | loss: 0.0311061 | val_loss: 0.0311458 | Time: 4615.24 ms [2022-04-13 10:48:00 main:574] : INFO : Epoch 1887 | loss: 0.0311054 | val_loss: 0.0311493 | Time: 4641.43 ms [2022-04-13 10:48:04 main:574] : INFO : Epoch 1888 | loss: 0.0311058 | val_loss: 0.0311542 | Time: 4717.96 ms [2022-04-13 10:48:10 main:574] : INFO : Epoch 1889 | loss: 0.0311041 | val_loss: 0.0311564 | Time: 4930.26 ms [2022-04-13 10:48:14 main:574] : INFO : Epoch 1890 | loss: 0.0311042 | val_loss: 0.0311557 | Time: 4741.94 ms [2022-04-13 10:48:21 main:574] : INFO : Epoch 1891 | loss: 0.0311053 | val_loss: 0.0311528 | Time: 6227.72 ms [2022-04-13 10:48:27 main:574] : INFO : Epoch 1892 | loss: 0.0311068 | val_loss: 0.0311498 | Time: 6170.79 ms [2022-04-13 10:48:33 main:574] : INFO : Epoch 1893 | loss: 0.0311056 | val_loss: 0.0311573 | Time: 6257.17 ms [2022-04-13 10:48:40 main:574] : INFO : Epoch 1894 | loss: 0.0311068 | val_loss: 0.0311574 | Time: 6334.41 ms [2022-04-13 10:48:46 main:574] : INFO : Epoch 1895 | loss: 0.0311085 | val_loss: 0.0311611 | Time: 6208.56 ms [2022-04-13 10:48:52 main:574] : INFO : Epoch 1896 | loss: 0.0311101 | val_loss: 0.0311575 | Time: 6312.9 ms [2022-04-13 10:48:58 main:574] : INFO : Epoch 1897 | loss: 0.0311061 | val_loss: 0.0311566 | Time: 6187.82 ms [2022-04-13 10:49:05 main:574] : INFO : Epoch 1898 | loss: 0.03111 | val_loss: 0.0311542 | Time: 6147.12 ms [2022-04-13 10:49:11 main:574] : INFO : Epoch 1899 | loss: 0.0311078 | val_loss: 0.0311567 | Time: 6208.72 ms [2022-04-13 10:49:17 main:574] : INFO : Epoch 1900 | loss: 0.0311064 | val_loss: 0.0311515 | Time: 6090.99 ms [2022-04-13 10:49:23 main:574] : INFO : Epoch 1901 | loss: 0.0311141 | val_loss: 0.031154 | Time: 6207.19 ms [2022-04-13 10:49:29 main:574] : INFO : Epoch 1902 | loss: 0.0311194 | 
val_loss: 0.0311541 | Time: 6201.87 ms [2022-04-13 10:49:36 main:574] : INFO : Epoch 1903 | loss: 0.0311147 | val_loss: 0.0311497 | Time: 6214.88 ms [2022-04-13 10:49:42 main:574] : INFO : Epoch 1904 | loss: 0.0311136 | val_loss: 0.0311491 | Time: 6155.11 ms [2022-04-13 10:49:48 main:574] : INFO : Epoch 1905 | loss: 0.0311098 | val_loss: 0.0311504 | Time: 6126.65 ms [2022-04-13 10:49:54 main:574] : INFO : Epoch 1906 | loss: 0.0311103 | val_loss: 0.0311483 | Time: 6145.31 ms [2022-04-13 10:50:01 main:574] : INFO : Epoch 1907 | loss: 0.0311085 | val_loss: 0.031148 | Time: 6174.6 ms [2022-04-13 10:50:07 main:574] : INFO : Epoch 1908 | loss: 0.0311082 | val_loss: 0.0311575 | Time: 6075.12 ms [2022-04-13 10:50:13 main:574] : INFO : Epoch 1909 | loss: 0.0311095 | val_loss: 0.0311573 | Time: 6082.01 ms [2022-04-13 10:50:19 main:574] : INFO : Epoch 1910 | loss: 0.0311072 | val_loss: 0.0311502 | Time: 6144.34 ms [2022-04-13 10:50:25 main:574] : INFO : Epoch 1911 | loss: 0.0311067 | val_loss: 0.0311481 | Time: 6202.26 ms [2022-04-13 10:50:31 main:574] : INFO : Epoch 1912 | loss: 0.0311061 | val_loss: 0.0311557 | Time: 6213.05 ms [2022-04-13 10:50:38 main:574] : INFO : Epoch 1913 | loss: 0.0311035 | val_loss: 0.0311516 | Time: 6255.23 ms [2022-04-13 10:50:44 main:574] : INFO : Epoch 1914 | loss: 0.0311022 | val_loss: 0.0311541 | Time: 6130.44 ms [2022-04-13 10:50:50 main:574] : INFO : Epoch 1915 | loss: 0.0311001 | val_loss: 0.0311544 | Time: 6143.77 ms [2022-04-13 10:50:56 main:574] : INFO : Epoch 1916 | loss: 0.031101 | val_loss: 0.0311518 | Time: 6110.2 ms [2022-04-13 10:51:03 main:574] : INFO : Epoch 1917 | loss: 0.0311014 | val_loss: 0.0311523 | Time: 6234.47 ms [2022-04-13 10:51:09 main:574] : INFO : Epoch 1918 | loss: 0.0311081 | val_loss: 0.0311474 | Time: 6041.58 ms [2022-04-13 10:51:15 main:574] : INFO : Epoch 1919 | loss: 0.0311373 | val_loss: 0.0311512 | Time: 6265.31 ms [2022-04-13 10:51:21 main:574] : INFO : Epoch 1920 | loss: 0.0311381 | val_loss: 0.0311521 | Time: 6176.8 ms [2022-04-13 10:51:27 main:574] : INFO : Epoch 1921 | loss: 0.0311337 | val_loss: 0.0311596 | Time: 6260.08 ms [2022-04-13 10:51:34 main:574] : INFO : Epoch 1922 | loss: 0.0311285 | val_loss: 0.0311516 | Time: 6149.83 ms [2022-04-13 10:51:40 main:574] : INFO : Epoch 1923 | loss: 0.0311265 | val_loss: 0.0311514 | Time: 6155.82 ms [2022-04-13 10:51:46 main:574] : INFO : Epoch 1924 | loss: 0.0311255 | val_loss: 0.0311499 | Time: 6173.64 ms [2022-04-13 10:51:52 main:574] : INFO : Epoch 1925 | loss: 0.0311238 | val_loss: 0.0311557 | Time: 6078.6 ms [2022-04-13 10:51:59 main:574] : INFO : Epoch 1926 | loss: 0.0311224 | val_loss: 0.0311513 | Time: 6196.83 ms [2022-04-13 10:52:05 main:574] : INFO : Epoch 1927 | loss: 0.0311194 | val_loss: 0.0311504 | Time: 6126.02 ms [2022-04-13 10:52:11 main:574] : INFO : Epoch 1928 | loss: 0.0311184 | val_loss: 0.0311505 | Time: 6246.91 ms [2022-04-13 10:52:17 main:574] : INFO : Epoch 1929 | loss: 0.0311197 | val_loss: 0.0311512 | Time: 6077.14 ms [2022-04-13 10:52:23 main:574] : INFO : Epoch 1930 | loss: 0.0311242 | val_loss: 0.0311598 | Time: 6123.94 ms [2022-04-13 10:52:30 main:574] : INFO : Epoch 1931 | loss: 0.0311319 | val_loss: 0.0311559 | Time: 6228.86 ms [2022-04-13 10:52:36 main:574] : INFO : Epoch 1932 | loss: 0.0311275 | val_loss: 0.03115 | Time: 6363.06 ms [2022-04-13 10:52:42 main:574] : INFO : Epoch 1933 | loss: 0.031124 | val_loss: 0.0311501 | Time: 6096.31 ms [2022-04-13 10:52:48 main:574] : INFO : Epoch 1934 | loss: 0.0311223 | val_loss: 0.0311484 | Time: 6166.1 ms 
[2022-04-13 10:52:55 main:574] : INFO : Epoch 1935 | loss: 0.0311197 | val_loss: 0.0311483 | Time: 6206.18 ms [2022-04-13 10:53:01 main:574] : INFO : Epoch 1936 | loss: 0.0311198 | val_loss: 0.0311481 | Time: 6081.84 ms [2022-04-13 10:53:07 main:574] : INFO : Epoch 1937 | loss: 0.0311184 | val_loss: 0.0311494 | Time: 6497.73 ms [2022-04-13 10:53:13 main:574] : INFO : Epoch 1938 | loss: 0.0311165 | val_loss: 0.0311494 | Time: 6107.09 ms [2022-04-13 10:53:20 main:574] : INFO : Epoch 1939 | loss: 0.0311157 | val_loss: 0.0311503 | Time: 6117.73 ms [2022-04-13 10:53:26 main:574] : INFO : Epoch 1940 | loss: 0.0311159 | val_loss: 0.0311508 | Time: 6278.81 ms [2022-04-13 10:53:32 main:574] : INFO : Epoch 1941 | loss: 0.0311146 | val_loss: 0.0311485 | Time: 6198.97 ms [2022-04-13 10:53:39 main:574] : INFO : Epoch 1942 | loss: 0.0311125 | val_loss: 0.0311476 | Time: 6327.16 ms [2022-04-13 10:53:45 main:574] : INFO : Epoch 1943 | loss: 0.0311172 | val_loss: 0.0311513 | Time: 6237.8 ms [2022-04-13 10:53:51 main:574] : INFO : Epoch 1944 | loss: 0.0311183 | val_loss: 0.031146 | Time: 6260.4 ms [2022-04-13 10:53:57 main:574] : INFO : Epoch 1945 | loss: 0.0311139 | val_loss: 0.0311528 | Time: 6201.08 ms [2022-04-13 10:54:04 main:574] : INFO : Epoch 1946 | loss: 0.0311129 | val_loss: 0.0311492 | Time: 6306.09 ms [2022-04-13 10:54:10 main:574] : INFO : Epoch 1947 | loss: 0.0311139 | val_loss: 0.0311506 | Time: 6163.98 ms [2022-04-13 10:54:16 main:574] : INFO : Epoch 1948 | loss: 0.0311122 | val_loss: 0.0311541 | Time: 6324.49 ms [2022-04-13 10:54:23 main:574] : INFO : Epoch 1949 | loss: 0.0311128 | val_loss: 0.0311489 | Time: 6280.74 ms [2022-04-13 10:54:29 main:574] : INFO : Epoch 1950 | loss: 0.0311119 | val_loss: 0.0311546 | Time: 6345.82 ms [2022-04-13 10:54:35 main:574] : INFO : Epoch 1951 | loss: 0.0311129 | val_loss: 0.0311583 | Time: 6226.18 ms [2022-04-13 10:54:42 main:574] : INFO : Epoch 1952 | loss: 0.031114 | val_loss: 0.031152 | Time: 6393.64 ms [2022-04-13 10:54:48 main:574] : INFO : Epoch 1953 | loss: 0.0311108 | val_loss: 0.0311518 | Time: 6165.18 ms [2022-04-13 10:54:54 main:574] : INFO : Epoch 1954 | loss: 0.0311112 | val_loss: 0.0311495 | Time: 6236.77 ms [2022-04-13 10:55:00 main:574] : INFO : Epoch 1955 | loss: 0.0311109 | val_loss: 0.0311509 | Time: 5516.86 ms [2022-04-13 10:55:04 main:574] : INFO : Epoch 1956 | loss: 0.0311111 | val_loss: 0.0311474 | Time: 4471.85 ms [2022-04-13 10:55:09 main:574] : INFO : Epoch 1957 | loss: 0.0311104 | val_loss: 0.0311545 | Time: 4409.64 ms [2022-04-13 10:55:13 main:574] : INFO : Epoch 1958 | loss: 0.0311114 | val_loss: 0.0311498 | Time: 4514.23 ms [2022-04-13 10:55:18 main:574] : INFO : Epoch 1959 | loss: 0.0311103 | val_loss: 0.0311457 | Time: 4509.96 ms [2022-04-13 10:55:22 main:574] : INFO : Epoch 1960 | loss: 0.031111 | val_loss: 0.0311457 | Time: 4475.56 ms [2022-04-13 10:55:27 main:574] : INFO : Epoch 1961 | loss: 0.0311098 | val_loss: 0.0311553 | Time: 4524.53 ms [2022-04-13 10:55:31 main:574] : INFO : Epoch 1962 | loss: 0.0311076 | val_loss: 0.0311522 | Time: 4409.15 ms [2022-04-13 10:55:36 main:574] : INFO : Epoch 1963 | loss: 0.0311082 | val_loss: 0.0311484 | Time: 4484.04 ms [2022-04-13 10:55:40 main:574] : INFO : Epoch 1964 | loss: 0.031109 | val_loss: 0.0311577 | Time: 4512.63 ms [2022-04-13 10:55:45 main:574] : INFO : Epoch 1965 | loss: 0.0311071 | val_loss: 0.0311516 | Time: 4616.98 ms [2022-04-13 10:55:50 main:574] : INFO : Epoch 1966 | loss: 0.0311078 | val_loss: 0.0311472 | Time: 4466.54 ms [2022-04-13 10:55:54 main:574] : INFO : 
Epoch 1967 | loss: 0.0311104 | val_loss: 0.0311528 | Time: 4471.03 ms [2022-04-13 10:55:59 main:574] : INFO : Epoch 1968 | loss: 0.0311127 | val_loss: 0.031156 | Time: 4522.13 ms [2022-04-13 10:56:03 main:574] : INFO : Epoch 1969 | loss: 0.0311229 | val_loss: 0.0311594 | Time: 4513.81 ms [2022-04-13 10:56:08 main:574] : INFO : Epoch 1970 | loss: 0.031123 | val_loss: 0.0311469 | Time: 4471.62 ms [2022-04-13 10:56:12 main:574] : INFO : Epoch 1971 | loss: 0.0311177 | val_loss: 0.0311497 | Time: 4544.91 ms [2022-04-13 10:56:17 main:574] : INFO : Epoch 1972 | loss: 0.0311161 | val_loss: 0.0311507 | Time: 4436.98 ms [2022-04-13 10:56:22 main:574] : INFO : Epoch 1973 | loss: 0.0311158 | val_loss: 0.0311498 | Time: 4556.6 ms [2022-04-13 10:56:26 main:574] : INFO : Epoch 1974 | loss: 0.0311198 | val_loss: 0.0311589 | Time: 4393.47 ms [2022-04-13 10:56:31 main:574] : INFO : Epoch 1975 | loss: 0.0311204 | val_loss: 0.0311497 | Time: 4491.52 ms [2022-04-13 10:56:35 main:574] : INFO : Epoch 1976 | loss: 0.0311165 | val_loss: 0.0311565 | Time: 4404.28 ms [2022-04-13 10:56:40 main:574] : INFO : Epoch 1977 | loss: 0.0311119 | val_loss: 0.0311513 | Time: 4671.55 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB) [2022-04-13 11:20:52 main:435] : INFO : Set logging level to 1 [2022-04-13 11:20:52 main:441] : INFO : Running in BOINC Client mode [2022-04-13 11:20:52 main:444] : INFO : Resolving all filenames [2022-04-13 11:20:52 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-04-13 11:20:52 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-04-13 11:20:52 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-04-13 11:20:52 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-04-13 11:20:52 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-04-13 11:20:52 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-04-13 11:20:52 main:474] : INFO : Configuration: [2022-04-13 11:20:52 main:475] : INFO : Model type: GRU [2022-04-13 11:20:52 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-04-13 11:20:52 main:477] : INFO : Max Epochs: 2048 [2022-04-13 11:20:52 main:478] : INFO : Batch Size: 128 [2022-04-13 11:20:52 main:479] : INFO : Learning Rate: 0.01 [2022-04-13 11:20:52 main:480] : INFO : Patience: 10 [2022-04-13 11:20:52 main:481] : INFO : Hidden Width: 12 [2022-04-13 11:20:52 main:482] : INFO : # Recurrent Layers: 4 [2022-04-13 11:20:52 main:483] : INFO : # Backend Layers: 4 [2022-04-13 11:20:52 main:484] : INFO : # Threads: 1 [2022-04-13 11:20:52 main:486] : INFO : Preparing Dataset [2022-04-13 11:20:52 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-04-13 11:20:53 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-04-13 11:20:54 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-04-13 11:20:55 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-04-13 11:20:55 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-04-13 11:20:55 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-04-13 11:20:55 main:494] : INFO : Creating Model [2022-04-13 11:20:55 main:507] : INFO : Preparing config file [2022-04-13 11:20:55 main:511] : INFO : Found checkpoint, attempting to load... 
[2022-04-13 11:20:55 main:512] : INFO : Loading config [2022-04-13 11:20:55 main:514] : INFO : Loading state [2022-04-13 11:20:56 main:559] : INFO : Loading DataLoader into Memory [2022-04-13 11:20:56 main:562] : INFO : Starting Training [2022-04-13 11:21:00 main:574] : INFO : Epoch 1957 | loss: 0.0311595 | val_loss: 0.0311485 | Time: 4485.13 ms [2022-04-13 11:21:05 main:574] : INFO : Epoch 1958 | loss: 0.0311185 | val_loss: 0.0311432 | Time: 4334.15 ms [2022-04-13 11:21:09 main:574] : INFO : Epoch 1959 | loss: 0.0311142 | val_loss: 0.0311439 | Time: 4193.41 ms [2022-04-13 11:21:14 main:574] : INFO : Epoch 1960 | loss: 0.0311113 | val_loss: 0.0311465 | Time: 4274.49 ms [2022-04-13 11:21:18 main:574] : INFO : Epoch 1961 | loss: 0.0311118 | val_loss: 0.0311506 | Time: 4188.61 ms [2022-04-13 11:21:22 main:574] : INFO : Epoch 1962 | loss: 0.0311135 | val_loss: 0.0311584 | Time: 4283.81 ms [2022-04-13 11:21:26 main:574] : INFO : Epoch 1963 | loss: 0.0311109 | val_loss: 0.0311489 | Time: 4192.68 ms [2022-04-13 11:21:31 main:574] : INFO : Epoch 1964 | loss: 0.0311089 | val_loss: 0.0311527 | Time: 4251.45 ms [2022-04-13 11:21:35 main:574] : INFO : Epoch 1965 | loss: 0.0311069 | val_loss: 0.0311473 | Time: 4195.35 ms [2022-04-13 11:21:39 main:574] : INFO : Epoch 1966 | loss: 0.0311109 | val_loss: 0.0311512 | Time: 4319.97 ms [2022-04-13 11:21:44 main:574] : INFO : Epoch 1967 | loss: 0.0311079 | val_loss: 0.0311567 | Time: 4213.99 ms [2022-04-13 11:21:48 main:574] : INFO : Epoch 1968 | loss: 0.0311081 | val_loss: 0.0311518 | Time: 4237.73 ms [2022-04-13 11:21:52 main:574] : INFO : Epoch 1969 | loss: 0.0311153 | val_loss: 0.0311518 | Time: 4295.95 ms [2022-04-13 11:21:57 main:574] : INFO : Epoch 1970 | loss: 0.031112 | val_loss: 0.0311486 | Time: 4312.17 ms [2022-04-13 11:22:01 main:574] : INFO : Epoch 1971 | loss: 0.0311116 | val_loss: 0.0311536 | Time: 4231.32 ms [2022-04-13 11:22:05 main:574] : INFO : Epoch 1972 | loss: 0.0311228 | val_loss: 0.0311612 | Time: 4331.1 ms [2022-04-13 11:22:10 main:574] : INFO : Epoch 1973 | loss: 0.0311249 | val_loss: 0.0311524 | Time: 4155.31 ms [2022-04-13 11:22:14 main:574] : INFO : Epoch 1974 | loss: 0.0311174 | val_loss: 0.0311566 | Time: 4306.22 ms [2022-04-13 11:22:18 main:574] : INFO : Epoch 1975 | loss: 0.031119 | val_loss: 0.0311517 | Time: 4145.43 ms [2022-04-13 11:22:22 main:574] : INFO : Epoch 1976 | loss: 0.0311141 | val_loss: 0.0311441 | Time: 4303.08 ms [2022-04-13 11:22:27 main:574] : INFO : Epoch 1977 | loss: 0.0311116 | val_loss: 0.0311539 | Time: 4119.29 ms [2022-04-13 11:22:31 main:574] : INFO : Epoch 1978 | loss: 0.0311099 | val_loss: 0.0311517 | Time: 4259.8 ms [2022-04-13 11:22:35 main:574] : INFO : Epoch 1979 | loss: 0.0311087 | val_loss: 0.0311483 | Time: 4177.85 ms [2022-04-13 11:22:40 main:574] : INFO : Epoch 1980 | loss: 0.0311099 | val_loss: 0.0311464 | Time: 4297.91 ms [2022-04-13 11:22:44 main:574] : INFO : Epoch 1981 | loss: 0.0311082 | val_loss: 0.031151 | Time: 4150.89 ms [2022-04-13 11:22:48 main:574] : INFO : Epoch 1982 | loss: 0.0311071 | val_loss: 0.031154 | Time: 4252.38 ms [2022-04-13 11:22:52 main:574] : INFO : Epoch 1983 | loss: 0.0311056 | val_loss: 0.0311531 | Time: 4188.93 ms [2022-04-13 11:22:57 main:574] : INFO : Epoch 1984 | loss: 0.0311059 | val_loss: 0.031153 | Time: 4273.9 ms [2022-04-13 11:23:01 main:574] : INFO : Epoch 1985 | loss: 0.0311247 | val_loss: 0.0311495 | Time: 4210.13 ms [2022-04-13 11:23:05 main:574] : INFO : Epoch 1986 | loss: 0.0311275 | val_loss: 0.0311513 | Time: 4262.38 ms [2022-04-13 11:23:09 
main:574] : INFO : Epoch 1987 | loss: 0.0311245 | val_loss: 0.0311533 | Time: 4177.34 ms [2022-04-13 11:23:14 main:574] : INFO : Epoch 1988 | loss: 0.0311203 | val_loss: 0.0311484 | Time: 4286.17 ms [2022-04-13 11:23:19 main:574] : INFO : Epoch 1989 | loss: 0.0311181 | val_loss: 0.0311478 | Time: 4209.72 ms [2022-04-13 11:23:23 main:574] : INFO : Epoch 1990 | loss: 0.031118 | val_loss: 0.0311468 | Time: 4184.34 ms [2022-04-13 11:23:28 main:574] : INFO : Epoch 1991 | loss: 0.0311164 | val_loss: 0.0311614 | Time: 4196.7 ms [2022-04-13 11:23:32 main:574] : INFO : Epoch 1992 | loss: 0.0311187 | val_loss: 0.0311514 | Time: 4373.28 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB) [2022-04-13 11:57:45 main:435] : INFO : Set logging level to 1 [2022-04-13 11:57:45 main:441] : INFO : Running in BOINC Client mode [2022-04-13 11:57:45 main:444] : INFO : Resolving all filenames [2022-04-13 11:57:45 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-04-13 11:57:45 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-04-13 11:57:45 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-04-13 11:57:45 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-04-13 11:57:45 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-04-13 11:57:45 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-04-13 11:57:45 main:474] : INFO : Configuration: [2022-04-13 11:57:45 main:475] : INFO : Model type: GRU [2022-04-13 11:57:46 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-04-13 11:57:46 main:477] : INFO : Max Epochs: 2048 [2022-04-13 11:57:46 main:478] : INFO : Batch Size: 128 [2022-04-13 11:57:46 main:479] : INFO : Learning Rate: 0.01 [2022-04-13 11:57:46 main:480] : INFO : Patience: 10 [2022-04-13 11:57:46 main:481] : INFO : Hidden Width: 12 [2022-04-13 11:57:46 main:482] : INFO : # Recurrent Layers: 4 [2022-04-13 11:57:46 main:483] : INFO : # Backend Layers: 4 [2022-04-13 11:57:46 main:484] : INFO : # Threads: 1 [2022-04-13 11:57:46 main:486] : INFO : Preparing Dataset [2022-04-13 11:57:46 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-04-13 11:57:46 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-04-13 11:57:48 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-04-13 11:57:48 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-04-13 11:57:48 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-04-13 11:57:48 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-04-13 11:57:48 main:494] : INFO : Creating Model [2022-04-13 11:57:48 main:507] : INFO : Preparing config file [2022-04-13 11:57:48 main:511] : INFO : Found checkpoint, attempting to load... 
[2022-04-13 11:57:49 main:512] : INFO : Loading config [2022-04-13 11:57:49 main:514] : INFO : Loading state [2022-04-13 11:57:50 main:559] : INFO : Loading DataLoader into Memory [2022-04-13 11:57:50 main:562] : INFO : Starting Training [2022-04-13 11:57:55 main:574] : INFO : Epoch 1957 | loss: 0.0311606 | val_loss: 0.0311527 | Time: 4349.96 ms [2022-04-13 11:57:59 main:574] : INFO : Epoch 1958 | loss: 0.0311185 | val_loss: 0.0311494 | Time: 4117.32 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB) [2022-04-13 12:01:11 main:435] : INFO : Set logging level to 1 [2022-04-13 12:01:11 main:441] : INFO : Running in BOINC Client mode [2022-04-13 12:01:11 main:444] : INFO : Resolving all filenames [2022-04-13 12:01:11 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-04-13 12:01:11 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-04-13 12:01:11 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-04-13 12:01:11 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-04-13 12:01:11 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-04-13 12:01:12 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-04-13 12:01:12 main:474] : INFO : Configuration: [2022-04-13 12:01:12 main:475] : INFO : Model type: GRU [2022-04-13 12:01:12 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-04-13 12:01:12 main:477] : INFO : Max Epochs: 2048 [2022-04-13 12:01:12 main:478] : INFO : Batch Size: 128 [2022-04-13 12:01:13 main:479] : INFO : Learning Rate: 0.01 [2022-04-13 12:01:13 main:480] : INFO : Patience: 10 [2022-04-13 12:01:13 main:481] : INFO : Hidden Width: 12 [2022-04-13 12:01:13 main:482] : INFO : # Recurrent Layers: 4 [2022-04-13 12:01:13 main:483] : INFO : # Backend Layers: 4 [2022-04-13 12:01:13 main:484] : INFO : # Threads: 1 [2022-04-13 12:01:13 main:486] : INFO : Preparing Dataset [2022-04-13 12:01:14 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-04-13 12:01:14 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-04-13 12:01:15 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-04-13 12:01:16 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-04-13 12:01:16 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-04-13 12:01:16 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-04-13 12:01:16 main:494] : INFO : Creating Model [2022-04-13 12:01:16 main:507] : INFO : Preparing config file [2022-04-13 12:01:16 main:511] : INFO : Found checkpoint, attempting to load... 
[2022-04-13 12:01:16 main:512] : INFO : Loading config [2022-04-13 12:01:16 main:514] : INFO : Loading state [2022-04-13 12:01:17 main:559] : INFO : Loading DataLoader into Memory [2022-04-13 12:01:17 main:562] : INFO : Starting Training [2022-04-13 12:01:21 main:574] : INFO : Epoch 1957 | loss: 0.0311601 | val_loss: 0.0311545 | Time: 4245.75 ms [2022-04-13 12:01:25 main:574] : INFO : Epoch 1958 | loss: 0.0311192 | val_loss: 0.0311509 | Time: 4006.61 ms [2022-04-13 12:01:29 main:574] : INFO : Epoch 1959 | loss: 0.0311112 | val_loss: 0.0311465 | Time: 4036.89 ms [2022-04-13 12:01:33 main:574] : INFO : Epoch 1960 | loss: 0.031109 | val_loss: 0.0311478 | Time: 4002.92 ms [2022-04-13 12:01:37 main:574] : INFO : Epoch 1961 | loss: 0.0311091 | val_loss: 0.031155 | Time: 3967.28 ms [2022-04-13 12:01:41 main:574] : INFO : Epoch 1962 | loss: 0.0311127 | val_loss: 0.031144 | Time: 4005.44 ms [2022-04-13 12:01:45 main:574] : INFO : Epoch 1963 | loss: 0.0311133 | val_loss: 0.0311458 | Time: 4037.69 ms [2022-04-13 12:01:49 main:574] : INFO : Epoch 1964 | loss: 0.0311133 | val_loss: 0.0311478 | Time: 4046.1 ms [2022-04-13 12:01:53 main:574] : INFO : Epoch 1965 | loss: 0.0311115 | val_loss: 0.0311475 | Time: 3987.62 ms [2022-04-13 12:01:57 main:574] : INFO : Epoch 1966 | loss: 0.0311091 | val_loss: 0.0311528 | Time: 4005.49 ms [2022-04-13 12:02:02 main:574] : INFO : Epoch 1967 | loss: 0.0311113 | val_loss: 0.0311498 | Time: 4103.91 ms [2022-04-13 12:02:06 main:574] : INFO : Epoch 1968 | loss: 0.0311103 | val_loss: 0.0311522 | Time: 4015.53 ms [2022-04-13 12:02:10 main:574] : INFO : Epoch 1969 | loss: 0.0311104 | val_loss: 0.0311522 | Time: 3936.82 ms [2022-04-13 12:02:14 main:574] : INFO : Epoch 1970 | loss: 0.0311111 | val_loss: 0.0311526 | Time: 4169.41 ms [2022-04-13 12:02:18 main:574] : INFO : Epoch 1971 | loss: 0.0311086 | val_loss: 0.0311517 | Time: 3934.62 ms [2022-04-13 12:02:22 main:574] : INFO : Epoch 1972 | loss: 0.0311084 | val_loss: 0.0311494 | Time: 3967.41 ms [2022-04-13 12:02:26 main:574] : INFO : Epoch 1973 | loss: 0.0311064 | val_loss: 0.0311503 | Time: 3959.87 ms [2022-04-13 12:02:30 main:574] : INFO : Epoch 1974 | loss: 0.0311099 | val_loss: 0.0311535 | Time: 3985.24 ms [2022-04-13 12:02:34 main:574] : INFO : Epoch 1975 | loss: 0.0311074 | val_loss: 0.0311487 | Time: 3975.81 ms [2022-04-13 12:02:38 main:574] : INFO : Epoch 1976 | loss: 0.0311084 | val_loss: 0.0311502 | Time: 3986.18 ms [2022-04-13 12:02:42 main:574] : INFO : Epoch 1977 | loss: 0.0311075 | val_loss: 0.0311539 | Time: 3971.55 ms [2022-04-13 12:02:46 main:574] : INFO : Epoch 1978 | loss: 0.0311076 | val_loss: 0.0311513 | Time: 3928.1 ms [2022-04-13 12:02:50 main:574] : INFO : Epoch 1979 | loss: 0.0311095 | val_loss: 0.0311496 | Time: 3993.87 ms [2022-04-13 12:02:54 main:574] : INFO : Epoch 1980 | loss: 0.0311093 | val_loss: 0.0311448 | Time: 3994.05 ms [2022-04-13 12:02:58 main:574] : INFO : Epoch 1981 | loss: 0.0311121 | val_loss: 0.0311488 | Time: 3969.7 ms [2022-04-13 12:03:02 main:574] : INFO : Epoch 1982 | loss: 0.031114 | val_loss: 0.031147 | Time: 4009.41 ms [2022-04-13 12:03:06 main:574] : INFO : Epoch 1983 | loss: 0.0311101 | val_loss: 0.0311494 | Time: 3990.49 ms [2022-04-13 12:03:10 main:574] : INFO : Epoch 1984 | loss: 0.0311106 | val_loss: 0.0311512 | Time: 3932.11 ms [2022-04-13 12:03:14 main:574] : INFO : Epoch 1985 | loss: 0.0311102 | val_loss: 0.0311546 | Time: 4096.24 ms [2022-04-13 12:03:18 main:574] : INFO : Epoch 1986 | loss: 0.0311117 | val_loss: 0.0311518 | Time: 3950.47 ms [2022-04-13 12:03:22 
main:574] : INFO : Epoch 1987 | loss: 0.0311112 | val_loss: 0.0311567 | Time: 3966.29 ms [2022-04-13 12:03:27 main:574] : INFO : Epoch 1988 | loss: 0.0311119 | val_loss: 0.0311597 | Time: 4007.12 ms [2022-04-13 12:03:31 main:574] : INFO : Epoch 1989 | loss: 0.0311257 | val_loss: 0.0311523 | Time: 3993.47 ms [2022-04-13 12:03:35 main:574] : INFO : Epoch 1990 | loss: 0.031124 | val_loss: 0.0311499 | Time: 3963.07 ms [2022-04-13 12:03:39 main:574] : INFO : Epoch 1991 | loss: 0.0311208 | val_loss: 0.0311515 | Time: 4083.83 ms [2022-04-13 12:03:43 main:574] : INFO : Epoch 1992 | loss: 0.0311179 | val_loss: 0.0311512 | Time: 3987.44 ms [2022-04-13 12:03:47 main:574] : INFO : Epoch 1993 | loss: 0.0311138 | val_loss: 0.0311578 | Time: 4009.8 ms [2022-04-13 12:03:51 main:574] : INFO : Epoch 1994 | loss: 0.031115 | val_loss: 0.0311524 | Time: 4261.46 ms [2022-04-13 12:03:55 main:574] : INFO : Epoch 1995 | loss: 0.0311155 | val_loss: 0.0311586 | Time: 4121.57 ms [2022-04-13 12:03:59 main:574] : INFO : Epoch 1996 | loss: 0.0311118 | val_loss: 0.0311503 | Time: 4101.4 ms [2022-04-13 12:04:04 main:574] : INFO : Epoch 1997 | loss: 0.0311107 | val_loss: 0.0311513 | Time: 4145.42 ms [2022-04-13 12:04:08 main:574] : INFO : Epoch 1998 | loss: 0.031111 | val_loss: 0.0311504 | Time: 4092.92 ms [2022-04-13 12:04:12 main:574] : INFO : Epoch 1999 | loss: 0.0311148 | val_loss: 0.0311489 | Time: 4131.42 ms [2022-04-13 12:04:16 main:574] : INFO : Epoch 2000 | loss: 0.0311112 | val_loss: 0.0311476 | Time: 4129.2 ms [2022-04-13 12:04:20 main:574] : INFO : Epoch 2001 | loss: 0.0311106 | val_loss: 0.031154 | Time: 4131.98 ms [2022-04-13 12:04:24 main:574] : INFO : Epoch 2002 | loss: 0.0311097 | val_loss: 0.0311527 | Time: 4150.24 ms [2022-04-13 12:04:29 main:574] : INFO : Epoch 2003 | loss: 0.0311127 | val_loss: 0.0311498 | Time: 4492.2 ms [2022-04-13 12:04:33 main:574] : INFO : Epoch 2004 | loss: 0.0311126 | val_loss: 0.0311521 | Time: 4286.88 ms [2022-04-13 12:04:38 main:574] : INFO : Epoch 2005 | loss: 0.0311109 | val_loss: 0.0311527 | Time: 4295.17 ms [2022-04-13 12:04:42 main:574] : INFO : Epoch 2006 | loss: 0.0311184 | val_loss: 0.0311536 | Time: 4274.01 ms [2022-04-13 12:04:46 main:574] : INFO : Epoch 2007 | loss: 0.0311174 | val_loss: 0.0311498 | Time: 4312.2 ms [2022-04-13 12:04:51 main:574] : INFO : Epoch 2008 | loss: 0.031113 | val_loss: 0.0311551 | Time: 4360.3 ms [2022-04-13 12:04:55 main:574] : INFO : Epoch 2009 | loss: 0.031112 | val_loss: 0.03115 | Time: 4441.65 ms [2022-04-13 12:05:00 main:574] : INFO : Epoch 2010 | loss: 0.0311111 | val_loss: 0.0311519 | Time: 4300.99 ms [2022-04-13 12:05:04 main:574] : INFO : Epoch 2011 | loss: 0.0311214 | val_loss: 0.0311512 | Time: 4284.34 ms [2022-04-13 12:05:08 main:574] : INFO : Epoch 2012 | loss: 0.0311203 | val_loss: 0.0311545 | Time: 4278.69 ms [2022-04-13 12:05:13 main:574] : INFO : Epoch 2013 | loss: 0.0311157 | val_loss: 0.0311554 | Time: 4321.92 ms [2022-04-13 12:05:17 main:574] : INFO : Epoch 2014 | loss: 0.0311138 | val_loss: 0.0311543 | Time: 4324.57 ms [2022-04-13 12:05:21 main:574] : INFO : Epoch 2015 | loss: 0.0311108 | val_loss: 0.0311501 | Time: 4308.82 ms [2022-04-13 12:05:26 main:574] : INFO : Epoch 2016 | loss: 0.0311117 | val_loss: 0.0311537 | Time: 4245.11 ms [2022-04-13 12:05:30 main:574] : INFO : Epoch 2017 | loss: 0.0311093 | val_loss: 0.0311461 | Time: 4285.55 ms [2022-04-13 12:05:34 main:574] : INFO : Epoch 2018 | loss: 0.0311083 | val_loss: 0.0311503 | Time: 4210.75 ms [2022-04-13 12:05:39 main:574] : INFO : Epoch 2019 | loss: 0.0311094 
| val_loss: 0.0311534 | Time: 4282.71 ms [2022-04-13 12:05:43 main:574] : INFO : Epoch 2020 | loss: 0.0311233 | val_loss: 0.0311542 | Time: 4254.43 ms [2022-04-13 12:05:47 main:574] : INFO : Epoch 2021 | loss: 0.0311243 | val_loss: 0.0311544 | Time: 4229.65 ms [2022-04-13 12:05:51 main:574] : INFO : Epoch 2022 | loss: 0.0311205 | val_loss: 0.0311495 | Time: 4271.86 ms [2022-04-13 12:05:56 main:574] : INFO : Epoch 2023 | loss: 0.0311158 | val_loss: 0.0311538 | Time: 4293.86 ms [2022-04-13 12:06:00 main:574] : INFO : Epoch 2024 | loss: 0.0311147 | val_loss: 0.0311528 | Time: 4251.42 ms [2022-04-13 12:06:04 main:574] : INFO : Epoch 2025 | loss: 0.0311146 | val_loss: 0.0311539 | Time: 4290.21 ms [2022-04-13 12:06:09 main:574] : INFO : Epoch 2026 | loss: 0.0311135 | val_loss: 0.0311532 | Time: 4349.75 ms [2022-04-13 12:06:13 main:574] : INFO : Epoch 2027 | loss: 0.0311125 | val_loss: 0.0311541 | Time: 4319.39 ms [2022-04-13 12:06:18 main:574] : INFO : Epoch 2028 | loss: 0.0311128 | val_loss: 0.0311573 | Time: 4297.29 ms [2022-04-13 12:06:22 main:574] : INFO : Epoch 2029 | loss: 0.0311101 | val_loss: 0.0311495 | Time: 4258.08 ms [2022-04-13 12:06:26 main:574] : INFO : Epoch 2030 | loss: 0.031109 | val_loss: 0.0311545 | Time: 4252.49 ms [2022-04-13 12:06:30 main:574] : INFO : Epoch 2031 | loss: 0.031108 | val_loss: 0.0311552 | Time: 4269.12 ms [2022-04-13 12:06:35 main:574] : INFO : Epoch 2032 | loss: 0.0311093 | val_loss: 0.0311505 | Time: 4227.26 ms [2022-04-13 12:06:39 main:574] : INFO : Epoch 2033 | loss: 0.0311128 | val_loss: 0.0311568 | Time: 4234.59 ms [2022-04-13 12:06:43 main:574] : INFO : Epoch 2034 | loss: 0.0311072 | val_loss: 0.0311537 | Time: 4271.95 ms [2022-04-13 12:06:48 main:574] : INFO : Epoch 2035 | loss: 0.0311062 | val_loss: 0.0311539 | Time: 4255.78 ms [2022-04-13 12:06:52 main:574] : INFO : Epoch 2036 | loss: 0.031116 | val_loss: 0.0311607 | Time: 4296.11 ms [2022-04-13 12:06:56 main:574] : INFO : Epoch 2037 | loss: 0.0311211 | val_loss: 0.031159 | Time: 4288.65 ms [2022-04-13 12:07:01 main:574] : INFO : Epoch 2038 | loss: 0.0311186 | val_loss: 0.0311508 | Time: 4261.77 ms [2022-04-13 12:07:05 main:574] : INFO : Epoch 2039 | loss: 0.0311169 | val_loss: 0.0311612 | Time: 4235.99 ms [2022-04-13 12:07:09 main:574] : INFO : Epoch 2040 | loss: 0.0311239 | val_loss: 0.0311592 | Time: 4268.84 ms [2022-04-13 12:07:13 main:574] : INFO : Epoch 2041 | loss: 0.0311202 | val_loss: 0.0311578 | Time: 4280.57 ms [2022-04-13 12:07:18 main:574] : INFO : Epoch 2042 | loss: 0.0311177 | val_loss: 0.0311526 | Time: 4271.86 ms [2022-04-13 12:07:22 main:574] : INFO : Epoch 2043 | loss: 0.0311163 | val_loss: 0.0311567 | Time: 4219.78 ms [2022-04-13 12:07:26 main:574] : INFO : Epoch 2044 | loss: 0.031116 | val_loss: 0.0311592 | Time: 4260.52 ms [2022-04-13 12:07:31 main:574] : INFO : Epoch 2045 | loss: 0.0311168 | val_loss: 0.0311541 | Time: 4279.68 ms [2022-04-13 12:07:35 main:574] : INFO : Epoch 2046 | loss: 0.031112 | val_loss: 0.0311534 | Time: 4306.98 ms [2022-04-13 12:07:40 main:574] : INFO : Epoch 2047 | loss: 0.0311115 | val_loss: 0.031151 | Time: 4380.35 ms [2022-04-13 12:07:44 main:574] : INFO : Epoch 2048 | loss: 0.0311124 | val_loss: 0.0311508 | Time: 4344.82 ms [2022-04-13 12:07:44 main:597] : INFO : Saving trained model to model-final.pt, val_loss 0.0311508 [2022-04-13 12:07:44 main:603] : INFO : Saving end state to config to file [2022-04-13 12:07:44 main:608] : INFO : Success, exiting.. 12:07:44 (30292): called boinc_finish(0) </stderr_txt> ]]>
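The stderr log above shows the application loading the training split (/Xt, /Yt, 2048 examples) and the validation split (/Xv, /Yv, 512 examples) from dataset.hdf5 entirely into memory before training starts. A minimal sketch of that loading step, assuming a plain HDF5 layout and using h5py with PyTorch; the client itself is built against libTorch, so the function and variable names here are illustrative only:

```python
# Minimal sketch of the HDF5-into-tensor loading step seen in the log.
# Assumes dataset.hdf5 stores float arrays under /Xt, /Yt, /Xv, /Yv;
# names mirror the log messages but are not MLC@Home's actual code.
import h5py
import torch

def load_hdf5_ds_into_tensor(path: str, ds_name: str) -> torch.Tensor:
    """Load one HDF5 dataset fully into an in-memory tensor."""
    with h5py.File(path, "r") as f:
        print(f"INFO : Loading Dataset {ds_name} from {path} into memory")
        return torch.from_numpy(f[ds_name][...])

def load(path: str = "dataset.hdf5"):
    """Load training and validation splits, as in the 'Preparing Dataset' phase."""
    x_train = load_hdf5_ds_into_tensor(path, "/Xt")
    y_train = load_hdf5_ds_into_tensor(path, "/Yt")
    x_val = load_hdf5_ds_into_tensor(path, "/Xv")
    y_val = load_hdf5_ds_into_tensor(path, "/Yv")
    print(f"INFO : Successfully loaded dataset of {len(x_train)} examples into memory.")
    return (x_train, y_train), (x_val, y_val)
```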
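The configuration block repeated at each restart (Model type: GRU, Hidden Width: 12, # Recurrent Layers: 4, # Backend Layers: 4) describes a small recurrent network. A hedged PyTorch sketch of a model with that shape follows; the input/output sizes and the exact layout of the fully connected "backend" stack are assumptions, since the actual libTorch model definition is not part of this output:

```python
import torch
import torch.nn as nn

class GRUNet(nn.Module):
    """Small GRU followed by a stack of fully connected 'backend' layers.

    Hidden width 12 and the 4 + 4 layer counts mirror the logged
    configuration; n_inputs and n_outputs are placeholders, not values
    taken from this task.
    """
    def __init__(self, n_inputs: int, n_outputs: int,
                 hidden: int = 12, recurrent_layers: int = 4,
                 backend_layers: int = 4):
        super().__init__()
        self.gru = nn.GRU(n_inputs, hidden,
                          num_layers=recurrent_layers, batch_first=True)
        backend = []
        for _ in range(backend_layers - 1):
            backend += [nn.Linear(hidden, hidden), nn.ReLU()]
        backend.append(nn.Linear(hidden, n_outputs))   # 4 Linear layers in total
        self.backend = nn.Sequential(*backend)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.gru(x)                 # out: (batch, seq, hidden)
        return self.backend(out[:, -1, :])   # predict from the last time step
```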
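Finally, the log illustrates the checkpoint-and-resume behaviour: after each BOINC client restart the application finds snapshot.pt, reloads its config and state, and resumes from an earlier epoch; at epoch 2048 it writes model-final.pt and calls boinc_finish(0). A rough sketch of such a resume-aware training loop under the logged settings (Max Epochs 2048, Batch Size 128, Learning Rate 0.01, Patience 10, Validation Loss Threshold 0.0001); the snapshot contents, optimizer, loss, and early-stopping rule are assumptions rather than MLC@Home's actual implementation:

```python
import os
import torch
from torch.utils.data import DataLoader, TensorDataset

def train(model, train_set, val_set,
          max_epochs=2048, batch_size=128, lr=0.01,
          patience=10, val_loss_threshold=1e-4,
          snapshot_path="snapshot.pt", final_path="model-final.pt"):
    opt = torch.optim.Adam(model.parameters(), lr=lr)   # assumed optimizer
    loss_fn = torch.nn.MSELoss()                        # assumed loss
    start_epoch, best_val, stale = 0, float("inf"), 0

    # Resume from the checkpoint if one exists, as in the restarts logged above.
    if os.path.exists(snapshot_path):
        snap = torch.load(snapshot_path)
        model.load_state_dict(snap["model"])
        opt.load_state_dict(snap["optimizer"])
        start_epoch, best_val = snap["epoch"], snap["best_val"]

    train_dl = DataLoader(TensorDataset(*train_set), batch_size=batch_size, shuffle=True)
    val_dl = DataLoader(TensorDataset(*val_set), batch_size=batch_size)

    for epoch in range(start_epoch, max_epochs):
        model.train()
        for x, y in train_dl:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

        model.eval()
        with torch.no_grad():
            val_loss = sum(loss_fn(model(x), y).item() for x, y in val_dl) / len(val_dl)

        # Periodic checkpoint so the task can resume after a client restart.
        torch.save({"model": model.state_dict(), "optimizer": opt.state_dict(),
                    "epoch": epoch + 1, "best_val": min(best_val, val_loss)},
                   snapshot_path)

        # Assumed early-stopping rule derived from the logged Patience / Threshold.
        stale = 0 if val_loss < best_val - val_loss_threshold else stale + 1
        best_val = min(best_val, val_loss)
        if stale >= patience:
            break

    # Matches the final "Saving trained model to model-final.pt" step.
    torch.save(model.state_dict(), final_path)
```

Combined with the earlier sketches, a run would look roughly like `train(GRUNet(n_inputs, n_outputs), *load("dataset.hdf5"))`; again, this is purely illustrative and not the project's actual code path.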
©2022 MLC@Home Team
A project of the Cognition, Robotics, and Learning (CORAL) Lab at the University of Maryland, Baltimore County (UMBC)