| Name | ParityModified-1647048670-24514-3-0_0 |
| Workunit | 11617571 |
| Created | 10 Apr 2022, 12:48:34 UTC |
| Sent | 10 Apr 2022, 19:40:02 UTC |
| Report deadline | 18 Apr 2022, 19:40:02 UTC |
| Received | 11 Apr 2022, 11:35:28 UTC |
| Server state | Over |
| Outcome | Success |
| Client state | Done |
| Exit status | 0 (0x00000000) |
| Computer ID | 18276 |
| Run time | 5 hours 41 min 37 sec |
| CPU time | 4 hours 12 min 24 sec |
| Validate state | Valid |
| Credit | 4,160.00 |
| Device peak FLOPS | 1,218.24 GFLOPS |
| Application version | Machine Learning Dataset Generator (GPU) v9.75 (cuda10200) windows_x86_64 |
| Peak working set size | 2.08 GB |
| Peak swap size | 4.31 GB |
| Peak disk usage | 1.54 GB |
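Each training-progress entry in the stderr log below follows a fixed pattern: `[timestamp source:line] : INFO : Epoch N | loss: L | val_loss: V | Time: T ms`. A minimal sketch of pulling the numeric fields out of such a line — the regex and function names here are my own, not part of the MLC@Home client:

```python
import re

# Pattern for one training-progress entry as it appears in the stderr log.
# Groups: timestamp, epoch number, training loss, validation loss, epoch time (ms).
ENTRY = re.compile(
    r"\[(?P<ts>[\d-]+ [\d:]+) \w+:\d+\] : INFO : "
    r"Epoch (?P<epoch>\d+) \| loss: (?P<loss>[\d.]+) \| "
    r"val_loss: (?P<val_loss>[\d.]+) \| Time: (?P<ms>[\d.]+) ms"
)

def parse_entry(line):
    """Return (epoch, loss, val_loss, ms) for a progress line, or None."""
    m = ENTRY.search(line)
    if m is None:
        return None
    return (int(m.group("epoch")), float(m.group("loss")),
            float(m.group("val_loss")), float(m.group("ms")))

sample = ("[2022-04-11 09:35:23 main:574] : INFO : Epoch 1734 | "
          "loss: 0.0310467 | val_loss: 0.0311009 | Time: 10001.8 ms")
print(parse_entry(sample))  # (1734, 0.0310467, 0.0311009, 10001.8)
```

Startup and dataset-loading lines (`Set logging level`, `Resolved: …`, and so on) do not match the pattern and simply parse to `None`.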
<core_client_version>7.16.20</core_client_version> <![CDATA[ <stderr_txt> : Epoch 1733 | loss: 0.0310502 | val_loss: 0.0310951 | Time: 10151.2 ms [2022-04-11 09:35:23 main:574] : INFO : Epoch 1734 | loss: 0.0310467 | val_loss: 0.0311009 | Time: 10001.8 ms [2022-04-11 09:35:33 main:574] : INFO : Epoch 1735 | loss: 0.0310483 | val_loss: 0.0310959 | Time: 10218.2 ms [2022-04-11 09:35:43 main:574] : INFO : Epoch 1736 | loss: 0.0310453 | val_loss: 0.0311016 | Time: 10186.8 ms [2022-04-11 09:35:53 main:574] : INFO : Epoch 1737 | loss: 0.0310497 | val_loss: 0.0311044 | Time: 10154.8 ms [2022-04-11 09:36:04 main:574] : INFO : Epoch 1738 | loss: 0.0310465 | val_loss: 0.0310966 | Time: 10223.6 ms [2022-04-11 09:36:14 main:574] : INFO : Epoch 1739 | loss: 0.0310514 | val_loss: 0.0311052 | Time: 10107.8 ms [2022-04-11 09:36:24 main:574] : INFO : Epoch 1740 | loss: 0.0310526 | val_loss: 0.0311102 | Time: 10416.4 ms [2022-04-11 09:36:34 main:574] : INFO : Epoch 1741 | loss: 0.0310519 | val_loss: 0.0310943 | Time: 10220.1 ms [2022-04-11 09:36:45 main:574] : INFO : Epoch 1742 | loss: 0.0310506 | val_loss: 0.0311032 | Time: 10480.2 ms [2022-04-11 09:36:55 main:574] : INFO : Epoch 1743 | loss: 0.031049 | val_loss: 0.0310998 | Time: 10341.3 ms [2022-04-11 09:37:06 main:574] : INFO : Epoch 1744 | loss: 0.031045 | val_loss: 0.0310958 | Time: 10419.6 ms [2022-04-11 09:37:16 main:574] : INFO : Epoch 1745 | loss: 0.0310458 | val_loss: 0.0311047 | Time: 10006.4 ms [2022-04-11 09:37:26 main:574] : INFO : Epoch 1746 | loss: 0.0310551 | val_loss: 0.0311015 | Time: 10079.3 ms [2022-04-11 09:37:36 main:574] : INFO : Epoch 1747 | loss: 0.0310476 | val_loss: 0.031091 | Time: 10381.5 ms [2022-04-11 09:37:46 main:574] : INFO : Epoch 1748 | loss: 0.0310461 | val_loss: 0.0311004 | Time: 9871.9 ms [2022-04-11 09:37:56 main:574] : INFO : Epoch 1749 | loss: 0.0310442 | val_loss: 0.0311075 | Time: 10191.1 ms [2022-04-11 09:38:07 main:574] : INFO : Epoch 1750 | loss: 0.0310491 | val_loss: 0.0311134 | 
Time: 10336.7 ms [2022-04-11 09:38:17 main:574] : INFO : Epoch 1751 | loss: 0.0310492 | val_loss: 0.0311 | Time: 10020 ms [2022-04-11 09:38:27 main:574] : INFO : Epoch 1752 | loss: 0.031044 | val_loss: 0.0310927 | Time: 10426.9 ms [2022-04-11 09:38:37 main:574] : INFO : Epoch 1753 | loss: 0.0310405 | val_loss: 0.0310933 | Time: 10301 ms [2022-04-11 09:38:48 main:574] : INFO : Epoch 1754 | loss: 0.0310424 | val_loss: 0.0310969 | Time: 10512.5 ms [2022-04-11 09:38:58 main:574] : INFO : Epoch 1755 | loss: 0.0310533 | val_loss: 0.0311145 | Time: 10230.1 ms [2022-04-11 09:39:08 main:574] : INFO : Epoch 1756 | loss: 0.0310482 | val_loss: 0.0311087 | Time: 10112.5 ms [2022-04-11 09:39:18 main:574] : INFO : Epoch 1757 | loss: 0.0310453 | val_loss: 0.0311125 | Time: 9868.95 ms [2022-04-11 09:39:29 main:574] : INFO : Epoch 1758 | loss: 0.0310435 | val_loss: 0.0311034 | Time: 10324.2 ms [2022-04-11 09:39:39 main:574] : INFO : Epoch 1759 | loss: 0.0310449 | val_loss: 0.0310976 | Time: 10287 ms [2022-04-11 09:39:49 main:574] : INFO : Epoch 1760 | loss: 0.0310431 | val_loss: 0.0311067 | Time: 10261.9 ms [2022-04-11 09:39:59 main:574] : INFO : Epoch 1761 | loss: 0.0310417 | val_loss: 0.031108 | Time: 10181.7 ms [2022-04-11 09:40:10 main:574] : INFO : Epoch 1762 | loss: 0.0310385 | val_loss: 0.0311082 | Time: 10323.4 ms [2022-04-11 09:40:20 main:574] : INFO : Epoch 1763 | loss: 0.0310368 | val_loss: 0.0311062 | Time: 10311.4 ms [2022-04-11 09:40:30 main:574] : INFO : Epoch 1764 | loss: 0.0310339 | val_loss: 0.0310982 | Time: 10438.2 ms [2022-04-11 09:40:41 main:574] : INFO : Epoch 1765 | loss: 0.031034 | val_loss: 0.031104 | Time: 10293.3 ms [2022-04-11 09:40:51 main:574] : INFO : Epoch 1766 | loss: 0.0310386 | val_loss: 0.0311055 | Time: 9984.79 ms [2022-04-11 09:41:01 main:574] : INFO : Epoch 1767 | loss: 0.0310354 | val_loss: 0.0311056 | Time: 9975.07 ms [2022-04-11 09:41:11 main:574] : INFO : Epoch 1768 | loss: 0.0310385 | val_loss: 0.031106 | Time: 10283.1 ms [2022-04-11 
09:41:21 main:574] : INFO : Epoch 1769 | loss: 0.0310413 | val_loss: 0.0311071 | Time: 10351.4 ms [2022-04-11 09:41:32 main:574] : INFO : Epoch 1770 | loss: 0.031037 | val_loss: 0.0311068 | Time: 10191.4 ms [2022-04-11 09:41:42 main:574] : INFO : Epoch 1771 | loss: 0.0310336 | val_loss: 0.031097 | Time: 10394.3 ms [2022-04-11 09:41:52 main:574] : INFO : Epoch 1772 | loss: 0.0310386 | val_loss: 0.0311082 | Time: 9763.51 ms [2022-04-11 09:42:02 main:574] : INFO : Epoch 1773 | loss: 0.0310366 | val_loss: 0.0311052 | Time: 9840.8 ms [2022-04-11 09:42:12 main:574] : INFO : Epoch 1774 | loss: 0.0310336 | val_loss: 0.0310958 | Time: 10289.9 ms [2022-04-11 09:42:22 main:574] : INFO : Epoch 1775 | loss: 0.0310375 | val_loss: 0.0311058 | Time: 10298.3 ms [2022-04-11 09:42:33 main:574] : INFO : Epoch 1776 | loss: 0.0310366 | val_loss: 0.0310964 | Time: 10603.6 ms [2022-04-11 09:42:43 main:574] : INFO : Epoch 1777 | loss: 0.03104 | val_loss: 0.0311019 | Time: 10300 ms [2022-04-11 09:42:54 main:574] : INFO : Epoch 1778 | loss: 0.0310375 | val_loss: 0.0311029 | Time: 10391.4 ms [2022-04-11 09:43:04 main:574] : INFO : Epoch 1779 | loss: 0.0310352 | val_loss: 0.0311018 | Time: 10041.9 ms [2022-04-11 09:43:14 main:574] : INFO : Epoch 1780 | loss: 0.0310349 | val_loss: 0.0311238 | Time: 10333.6 ms [2022-04-11 09:43:24 main:574] : INFO : Epoch 1781 | loss: 0.0310497 | val_loss: 0.0310994 | Time: 10167.4 ms [2022-04-11 09:43:35 main:574] : INFO : Epoch 1782 | loss: 0.0310508 | val_loss: 0.0311113 | Time: 10634.1 ms [2022-04-11 09:43:45 main:574] : INFO : Epoch 1783 | loss: 0.0310476 | val_loss: 0.0311108 | Time: 10119.3 ms [2022-04-11 09:43:55 main:574] : INFO : Epoch 1784 | loss: 0.031046 | val_loss: 0.0311011 | Time: 10174 ms [2022-04-11 09:44:06 main:574] : INFO : Epoch 1785 | loss: 0.0310429 | val_loss: 0.0311108 | Time: 10567.2 ms [2022-04-11 09:44:16 main:574] : INFO : Epoch 1786 | loss: 0.0310414 | val_loss: 0.0310959 | Time: 10373.3 ms [2022-04-11 09:44:26 main:574] : INFO : 
Epoch 1787 | loss: 0.0310402 | val_loss: 0.0311084 | Time: 10087 ms [2022-04-11 09:44:36 main:574] : INFO : Epoch 1788 | loss: 0.0310418 | val_loss: 0.031107 | Time: 10192.1 ms [2022-04-11 09:44:47 main:574] : INFO : Epoch 1789 | loss: 0.0310497 | val_loss: 0.0311041 | Time: 10335.2 ms [2022-04-11 09:44:57 main:574] : INFO : Epoch 1790 | loss: 0.0310492 | val_loss: 0.0310956 | Time: 10572.4 ms [2022-04-11 09:45:07 main:574] : INFO : Epoch 1791 | loss: 0.0310446 | val_loss: 0.0310971 | Time: 10129.4 ms [2022-04-11 09:45:18 main:574] : INFO : Epoch 1792 | loss: 0.0310379 | val_loss: 0.0311005 | Time: 10360.7 ms [2022-04-11 09:45:28 main:574] : INFO : Epoch 1793 | loss: 0.0310385 | val_loss: 0.0311007 | Time: 10099.6 ms [2022-04-11 09:45:39 main:574] : INFO : Epoch 1794 | loss: 0.0310401 | val_loss: 0.0310944 | Time: 10566.3 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce MX250) [2022-04-11 16:24:19 main:435] : INFO : Set logging level to 1 [2022-04-11 16:24:19 main:441] : INFO : Running in BOINC Client mode [2022-04-11 16:24:19 main:444] : INFO : Resolving all filenames [2022-04-11 16:24:19 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-04-11 16:24:19 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-04-11 16:24:19 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-04-11 16:24:19 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-04-11 16:24:19 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-04-11 16:24:19 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-04-11 16:24:19 main:474] : INFO : Configuration: [2022-04-11 16:24:19 main:475] : INFO : Model type: GRU [2022-04-11 16:24:19 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-04-11 16:24:19 main:477] : INFO : Max Epochs: 2048 [2022-04-11 16:24:19 main:478] : INFO : Batch Size: 128 [2022-04-11 16:24:19 main:479] : 
INFO : Learning Rate: 0.01 [2022-04-11 16:24:19 main:480] : INFO : Patience: 10 [2022-04-11 16:24:19 main:481] : INFO : Hidden Width: 12 [2022-04-11 16:24:19 main:482] : INFO : # Recurrent Layers: 4 [2022-04-11 16:24:19 main:483] : INFO : # Backend Layers: 4 [2022-04-11 16:24:19 main:484] : INFO : # Threads: 1 [2022-04-11 16:24:19 main:486] : INFO : Preparing Dataset [2022-04-11 16:24:19 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-04-11 16:24:20 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-04-11 16:24:22 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-04-11 16:24:22 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-04-11 16:24:22 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-04-11 16:24:22 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-04-11 16:24:22 main:494] : INFO : Creating Model [2022-04-11 16:24:22 main:507] : INFO : Preparing config file [2022-04-11 16:24:22 main:511] : INFO : Found checkpoint, attempting to load... 
[2022-04-11 16:24:22 main:512] : INFO : Loading config [2022-04-11 16:24:22 main:514] : INFO : Loading state [2022-04-11 16:24:23 main:559] : INFO : Loading DataLoader into Memory [2022-04-11 16:24:23 main:562] : INFO : Starting Training [2022-04-11 16:24:31 main:574] : INFO : Epoch 1787 | loss: 0.0311264 | val_loss: 0.0311075 | Time: 8177.81 ms [2022-04-11 16:24:39 main:574] : INFO : Epoch 1788 | loss: 0.0310531 | val_loss: 0.0311172 | Time: 8171.77 ms [2022-04-11 16:24:47 main:574] : INFO : Epoch 1789 | loss: 0.0310477 | val_loss: 0.0311067 | Time: 8033.16 ms [2022-04-11 16:24:56 main:574] : INFO : Epoch 1790 | loss: 0.0310427 | val_loss: 0.0311086 | Time: 8156.91 ms [2022-04-11 16:25:03 main:574] : INFO : Epoch 1791 | loss: 0.0310391 | val_loss: 0.0311112 | Time: 7770.34 ms [2022-04-11 16:25:11 main:574] : INFO : Epoch 1792 | loss: 0.0310342 | val_loss: 0.0311128 | Time: 8061.25 ms [2022-04-11 16:25:20 main:574] : INFO : Epoch 1793 | loss: 0.0310348 | val_loss: 0.0311061 | Time: 8089.74 ms [2022-04-11 16:25:42 main:574] : INFO : Epoch 1794 | loss: 0.0310368 | val_loss: 0.0311059 | Time: 21984.8 ms [2022-04-11 16:25:50 main:574] : INFO : Epoch 1795 | loss: 0.0310349 | val_loss: 0.0311074 | Time: 8799.68 ms [2022-04-11 16:25:59 main:574] : INFO : Epoch 1796 | loss: 0.0310387 | val_loss: 0.0310977 | Time: 8860.17 ms [2022-04-11 16:26:08 main:574] : INFO : Epoch 1797 | loss: 0.0310366 | val_loss: 0.0311103 | Time: 9084.4 ms [2022-04-11 16:26:18 main:574] : INFO : Epoch 1798 | loss: 0.0310382 | val_loss: 0.0311106 | Time: 9520.81 ms [2022-04-11 16:26:48 main:574] : INFO : Epoch 1799 | loss: 0.0310393 | val_loss: 0.0311044 | Time: 29750.9 ms [2022-04-11 16:27:06 main:574] : INFO : Epoch 1800 | loss: 0.0310414 | val_loss: 0.0310894 | Time: 18493.7 ms [2022-04-11 16:27:15 main:574] : INFO : Epoch 1801 | loss: 0.0310436 | val_loss: 0.0311016 | Time: 8428.76 ms [2022-04-11 16:27:23 main:574] : INFO : Epoch 1802 | loss: 0.0310445 | val_loss: 0.0310992 | Time: 8411.7 ms 
[2022-04-11 16:27:32 main:574] : INFO : Epoch 1803 | loss: 0.0310417 | val_loss: 0.0311072 | Time: 8627.23 ms [2022-04-11 16:27:40 main:574] : INFO : Epoch 1804 | loss: 0.0310397 | val_loss: 0.0311057 | Time: 8644.74 ms [2022-04-11 16:27:49 main:574] : INFO : Epoch 1805 | loss: 0.031037 | val_loss: 0.031093 | Time: 8648.78 ms [2022-04-11 16:27:57 main:574] : INFO : Epoch 1806 | loss: 0.0310347 | val_loss: 0.0311034 | Time: 8237.18 ms [2022-04-11 16:28:05 main:574] : INFO : Epoch 1807 | loss: 0.0310353 | val_loss: 0.031094 | Time: 8284.35 ms [2022-04-11 16:28:14 main:574] : INFO : Epoch 1808 | loss: 0.0310333 | val_loss: 0.0311032 | Time: 8315.36 ms [2022-04-11 16:28:22 main:574] : INFO : Epoch 1809 | loss: 0.0310347 | val_loss: 0.0310991 | Time: 8098.75 ms [2022-04-11 16:28:30 main:574] : INFO : Epoch 1810 | loss: 0.0310402 | val_loss: 0.0310948 | Time: 8192.69 ms [2022-04-11 16:28:38 main:574] : INFO : Epoch 1811 | loss: 0.0310346 | val_loss: 0.0311058 | Time: 8279.3 ms [2022-04-11 16:28:47 main:574] : INFO : Epoch 1812 | loss: 0.0310445 | val_loss: 0.0311023 | Time: 8408.43 ms [2022-04-11 16:28:55 main:574] : INFO : Epoch 1813 | loss: 0.0310454 | val_loss: 0.0310963 | Time: 8400.65 ms [2022-04-11 16:29:04 main:574] : INFO : Epoch 1814 | loss: 0.0310419 | val_loss: 0.0311022 | Time: 8453.91 ms [2022-04-11 16:29:12 main:574] : INFO : Epoch 1815 | loss: 0.0310401 | val_loss: 0.0311088 | Time: 8189.54 ms [2022-04-11 16:29:20 main:574] : INFO : Epoch 1816 | loss: 0.0310512 | val_loss: 0.0311048 | Time: 8408.83 ms [2022-04-11 16:29:29 main:574] : INFO : Epoch 1817 | loss: 0.0310409 | val_loss: 0.031106 | Time: 8328.88 ms [2022-04-11 16:29:37 main:574] : INFO : Epoch 1818 | loss: 0.0310422 | val_loss: 0.031109 | Time: 8275.43 ms [2022-04-11 16:29:45 main:574] : INFO : Epoch 1819 | loss: 0.0310394 | val_loss: 0.0310975 | Time: 8314.53 ms [2022-04-11 16:29:54 main:574] : INFO : Epoch 1820 | loss: 0.0310351 | val_loss: 0.0311093 | Time: 8368.42 ms [2022-04-11 16:30:02 
main:574] : INFO : Epoch 1821 | loss: 0.0310309 | val_loss: 0.0311063 | Time: 8494.29 ms [2022-04-11 16:30:11 main:574] : INFO : Epoch 1822 | loss: 0.0310297 | val_loss: 0.0311144 | Time: 8452.81 ms [2022-04-11 16:30:19 main:574] : INFO : Epoch 1823 | loss: 0.0310298 | val_loss: 0.0311126 | Time: 8391.2 ms [2022-04-11 16:30:27 main:574] : INFO : Epoch 1824 | loss: 0.0310292 | val_loss: 0.0311003 | Time: 8374.56 ms [2022-04-11 16:30:36 main:574] : INFO : Epoch 1825 | loss: 0.0310308 | val_loss: 0.0311135 | Time: 8306.3 ms [2022-04-11 16:30:44 main:574] : INFO : Epoch 1826 | loss: 0.0310385 | val_loss: 0.0311081 | Time: 8261.99 ms [2022-04-11 16:30:52 main:574] : INFO : Epoch 1827 | loss: 0.0310466 | val_loss: 0.0310955 | Time: 8175.2 ms [2022-04-11 16:31:01 main:574] : INFO : Epoch 1828 | loss: 0.0310441 | val_loss: 0.0311022 | Time: 8509.77 ms [2022-04-11 16:31:09 main:574] : INFO : Epoch 1829 | loss: 0.0310359 | val_loss: 0.0310953 | Time: 8457.35 ms [2022-04-11 16:31:17 main:574] : INFO : Epoch 1830 | loss: 0.0310356 | val_loss: 0.0310933 | Time: 8123.69 ms [2022-04-11 16:31:26 main:574] : INFO : Epoch 1831 | loss: 0.0310321 | val_loss: 0.0311003 | Time: 8874.79 ms [2022-04-11 16:31:35 main:574] : INFO : Epoch 1832 | loss: 0.0310347 | val_loss: 0.0310996 | Time: 8338.09 ms [2022-04-11 16:31:43 main:574] : INFO : Epoch 1833 | loss: 0.0310505 | val_loss: 0.031099 | Time: 8507.75 ms [2022-04-11 16:31:52 main:574] : INFO : Epoch 1834 | loss: 0.0310545 | val_loss: 0.0311061 | Time: 8617.81 ms [2022-04-11 16:32:00 main:574] : INFO : Epoch 1835 | loss: 0.0310516 | val_loss: 0.0311091 | Time: 8435.19 ms [2022-04-11 16:32:09 main:574] : INFO : Epoch 1836 | loss: 0.0310524 | val_loss: 0.031103 | Time: 8453.29 ms [2022-04-11 16:32:17 main:574] : INFO : Epoch 1837 | loss: 0.0310519 | val_loss: 0.0310997 | Time: 8508.75 ms [2022-04-11 16:32:25 main:574] : INFO : Epoch 1838 | loss: 0.0310472 | val_loss: 0.0311152 | Time: 8197.13 ms [2022-04-11 16:32:34 main:574] : INFO : Epoch 
1839 | loss: 0.0310426 | val_loss: 0.0311068 | Time: 8347.61 ms [2022-04-11 16:32:42 main:574] : INFO : Epoch 1840 | loss: 0.0310414 | val_loss: 0.0311006 | Time: 8400.14 ms [2022-04-11 16:32:50 main:574] : INFO : Epoch 1841 | loss: 0.0310368 | val_loss: 0.0310909 | Time: 8288.04 ms [2022-04-11 16:32:59 main:574] : INFO : Epoch 1842 | loss: 0.0310331 | val_loss: 0.0311022 | Time: 8364.06 ms [2022-04-11 16:33:07 main:574] : INFO : Epoch 1843 | loss: 0.0310348 | val_loss: 0.0310993 | Time: 8292.15 ms [2022-04-11 16:33:15 main:574] : INFO : Epoch 1844 | loss: 0.0310384 | val_loss: 0.0311074 | Time: 8141.31 ms [2022-04-11 16:33:24 main:574] : INFO : Epoch 1845 | loss: 0.0310413 | val_loss: 0.0311059 | Time: 8307.71 ms [2022-04-11 16:33:32 main:574] : INFO : Epoch 1846 | loss: 0.0310387 | val_loss: 0.0311124 | Time: 8406.6 ms [2022-04-11 16:33:40 main:574] : INFO : Epoch 1847 | loss: 0.0310385 | val_loss: 0.0311023 | Time: 8361.38 ms [2022-04-11 16:33:49 main:574] : INFO : Epoch 1848 | loss: 0.0310372 | val_loss: 0.0311086 | Time: 8532.25 ms [2022-04-11 16:33:58 main:574] : INFO : Epoch 1849 | loss: 0.0310326 | val_loss: 0.0310998 | Time: 8631.03 ms [2022-04-11 16:34:06 main:574] : INFO : Epoch 1850 | loss: 0.0310306 | val_loss: 0.031101 | Time: 8580.94 ms [2022-04-11 16:34:15 main:574] : INFO : Epoch 1851 | loss: 0.0310334 | val_loss: 0.0311053 | Time: 8424.67 ms [2022-04-11 16:34:23 main:574] : INFO : Epoch 1852 | loss: 0.0310348 | val_loss: 0.0311156 | Time: 8525.44 ms [2022-04-11 16:34:32 main:574] : INFO : Epoch 1853 | loss: 0.0310342 | val_loss: 0.0311079 | Time: 8404.23 ms [2022-04-11 16:34:40 main:574] : INFO : Epoch 1854 | loss: 0.0310306 | val_loss: 0.0311043 | Time: 8484.67 ms [2022-04-11 16:34:48 main:574] : INFO : Epoch 1855 | loss: 0.0310358 | val_loss: 0.0311049 | Time: 8340.64 ms [2022-04-11 16:34:57 main:574] : INFO : Epoch 1856 | loss: 0.0310305 | val_loss: 0.031105 | Time: 8630.36 ms [2022-04-11 16:35:05 main:574] : INFO : Epoch 1857 | loss: 0.0310294 
| val_loss: 0.0311051 | Time: 8385.47 ms [2022-04-11 16:35:14 main:574] : INFO : Epoch 1858 | loss: 0.0310326 | val_loss: 0.0311138 | Time: 8426.53 ms [2022-04-11 16:35:22 main:574] : INFO : Epoch 1859 | loss: 0.0310352 | val_loss: 0.0311072 | Time: 8408.93 ms [2022-04-11 16:35:31 main:574] : INFO : Epoch 1860 | loss: 0.0310353 | val_loss: 0.0310996 | Time: 8434.22 ms [2022-04-11 16:35:39 main:574] : INFO : Epoch 1861 | loss: 0.0310355 | val_loss: 0.0311091 | Time: 8615.88 ms [2022-04-11 16:35:48 main:574] : INFO : Epoch 1862 | loss: 0.0310348 | val_loss: 0.031097 | Time: 8604.22 ms [2022-04-11 16:35:57 main:574] : INFO : Epoch 1863 | loss: 0.0310352 | val_loss: 0.0311067 | Time: 8589.08 ms [2022-04-11 16:36:05 main:574] : INFO : Epoch 1864 | loss: 0.0310331 | val_loss: 0.0310994 | Time: 8338.16 ms [2022-04-11 16:36:13 main:574] : INFO : Epoch 1865 | loss: 0.0310322 | val_loss: 0.031104 | Time: 8488.75 ms [2022-04-11 16:36:22 main:574] : INFO : Epoch 1866 | loss: 0.0310313 | val_loss: 0.0311161 | Time: 8418.12 ms [2022-04-11 16:36:30 main:574] : INFO : Epoch 1867 | loss: 0.0310367 | val_loss: 0.0310982 | Time: 8384.27 ms [2022-04-11 16:36:38 main:574] : INFO : Epoch 1868 | loss: 0.0310322 | val_loss: 0.0311076 | Time: 8147.01 ms [2022-04-11 16:36:47 main:574] : INFO : Epoch 1869 | loss: 0.0310357 | val_loss: 0.031097 | Time: 8321.32 ms [2022-04-11 16:36:55 main:574] : INFO : Epoch 1870 | loss: 0.0310303 | val_loss: 0.0311162 | Time: 8327.26 ms [2022-04-11 16:37:04 main:574] : INFO : Epoch 1871 | loss: 0.0310286 | val_loss: 0.0311015 | Time: 8470.64 ms [2022-04-11 16:37:12 main:574] : INFO : Epoch 1872 | loss: 0.0310412 | val_loss: 0.0310972 | Time: 8515.36 ms [2022-04-11 16:37:21 main:574] : INFO : Epoch 1873 | loss: 0.0310353 | val_loss: 0.0310916 | Time: 8729.5 ms [2022-04-11 16:37:29 main:574] : INFO : Epoch 1874 | loss: 0.0310474 | val_loss: 0.0311058 | Time: 8614.91 ms [2022-04-11 16:37:38 main:574] : INFO : Epoch 1875 | loss: 0.0310649 | val_loss: 0.0311141 | 
Time: 8637.52 ms [2022-04-11 16:37:47 main:574] : INFO : Epoch 1876 | loss: 0.0310617 | val_loss: 0.0311245 | Time: 8521.91 ms [2022-04-11 16:37:55 main:574] : INFO : Epoch 1877 | loss: 0.0310612 | val_loss: 0.0311091 | Time: 8507.5 ms [2022-04-11 16:38:04 main:574] : INFO : Epoch 1878 | loss: 0.0310627 | val_loss: 0.0311179 | Time: 8720.04 ms [2022-04-11 16:38:22 main:574] : INFO : Epoch 1879 | loss: 0.0310623 | val_loss: 0.0311056 | Time: 18538.5 ms [2022-04-11 16:38:31 main:574] : INFO : Epoch 1880 | loss: 0.0310604 | val_loss: 0.0311006 | Time: 8332.55 ms [2022-04-11 16:38:39 main:574] : INFO : Epoch 1881 | loss: 0.0310578 | val_loss: 0.0311071 | Time: 8558.8 ms [2022-04-11 16:38:48 main:574] : INFO : Epoch 1882 | loss: 0.0310538 | val_loss: 0.031109 | Time: 8327.96 ms [2022-04-11 16:38:57 main:574] : INFO : Epoch 1883 | loss: 0.0310488 | val_loss: 0.0311075 | Time: 8779.6 ms [2022-04-11 16:39:05 main:574] : INFO : Epoch 1884 | loss: 0.0310459 | val_loss: 0.0311031 | Time: 8385.8 ms [2022-04-11 16:39:13 main:574] : INFO : Epoch 1885 | loss: 0.0310441 | val_loss: 0.0311082 | Time: 8417.96 ms [2022-04-11 16:39:22 main:574] : INFO : Epoch 1886 | loss: 0.0310393 | val_loss: 0.0311099 | Time: 8932.18 ms [2022-04-11 16:39:31 main:574] : INFO : Epoch 1887 | loss: 0.0310405 | val_loss: 0.0311089 | Time: 8637.3 ms [2022-04-11 16:39:39 main:574] : INFO : Epoch 1888 | loss: 0.0310378 | val_loss: 0.0311001 | Time: 8423.79 ms [2022-04-11 16:39:48 main:574] : INFO : Epoch 1889 | loss: 0.0310401 | val_loss: 0.0310997 | Time: 8668.38 ms [2022-04-11 16:39:57 main:574] : INFO : Epoch 1890 | loss: 0.0310397 | val_loss: 0.0310986 | Time: 8500.88 ms [2022-04-11 16:40:05 main:574] : INFO : Epoch 1891 | loss: 0.0310353 | val_loss: 0.0311087 | Time: 8324.52 ms [2022-04-11 16:40:13 main:574] : INFO : Epoch 1892 | loss: 0.0310356 | val_loss: 0.0311022 | Time: 8390.03 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1660) 
[2022-04-11 17:09:34 main:435] : INFO : Set logging level to 1 [2022-04-11 17:09:34 main:441] : INFO : Running in BOINC Client mode [2022-04-11 17:09:34 main:444] : INFO : Resolving all filenames [2022-04-11 17:09:34 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-04-11 17:09:34 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-04-11 17:09:34 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-04-11 17:09:34 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-04-11 17:09:34 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-04-11 17:09:34 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-04-11 17:09:34 main:474] : INFO : Configuration: [2022-04-11 17:09:34 main:475] : INFO : Model type: GRU [2022-04-11 17:09:34 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-04-11 17:09:34 main:477] : INFO : Max Epochs: 2048 [2022-04-11 17:09:34 main:478] : INFO : Batch Size: 128 [2022-04-11 17:09:34 main:479] : INFO : Learning Rate: 0.01 [2022-04-11 17:09:34 main:480] : INFO : Patience: 10 [2022-04-11 17:09:34 main:481] : INFO : Hidden Width: 12 [2022-04-11 17:09:34 main:482] : INFO : # Recurrent Layers: 4 [2022-04-11 17:09:34 main:483] : INFO : # Backend Layers: 4 [2022-04-11 17:09:34 main:484] : INFO : # Threads: 1 [2022-04-11 17:09:34 main:486] : INFO : Preparing Dataset [2022-04-11 17:09:34 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-04-11 17:09:34 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-04-11 17:09:37 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. 
[2022-04-11 17:09:37 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-04-11 17:09:38 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-04-11 17:09:38 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-04-11 17:09:38 main:494] : INFO : Creating Model [2022-04-11 17:09:38 main:507] : INFO : Preparing config file [2022-04-11 17:09:38 main:511] : INFO : Found checkpoint, attempting to load... [2022-04-11 17:09:38 main:512] : INFO : Loading config [2022-04-11 17:09:38 main:514] : INFO : Loading state [2022-04-11 17:09:40 main:559] : INFO : Loading DataLoader into Memory [2022-04-11 17:09:40 main:562] : INFO : Starting Training [2022-04-11 17:09:54 main:574] : INFO : Epoch 1880 | loss: 0.0311076 | val_loss: 0.0311179 | Time: 14245.9 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1660) [2022-04-11 17:22:11 main:435] : INFO : Set logging level to 1 [2022-04-11 17:22:11 main:441] : INFO : Running in BOINC Client mode [2022-04-11 17:22:11 main:444] : INFO : Resolving all filenames [2022-04-11 17:22:11 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-04-11 17:22:11 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-04-11 17:22:11 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-04-11 17:22:11 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-04-11 17:22:11 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-04-11 17:22:11 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-04-11 17:22:11 main:474] : INFO : Configuration: [2022-04-11 17:22:11 main:475] : INFO : Model type: GRU [2022-04-11 17:22:11 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-04-11 17:22:11 main:477] : INFO : Max Epochs: 2048 [2022-04-11 17:22:11 main:478] : INFO : Batch Size: 128 
[2022-04-11 17:22:11 main:479] : INFO : Learning Rate: 0.01 [2022-04-11 17:22:11 main:480] : INFO : Patience: 10 [2022-04-11 17:22:11 main:481] : INFO : Hidden Width: 12 [2022-04-11 17:22:11 main:482] : INFO : # Recurrent Layers: 4 [2022-04-11 17:22:11 main:483] : INFO : # Backend Layers: 4 [2022-04-11 17:22:11 main:484] : INFO : # Threads: 1 [2022-04-11 17:22:11 main:486] : INFO : Preparing Dataset [2022-04-11 17:22:11 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-04-11 17:22:11 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-04-11 17:22:14 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-04-11 17:22:14 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-04-11 17:22:14 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-04-11 17:22:15 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-04-11 17:22:15 main:494] : INFO : Creating Model [2022-04-11 17:22:15 main:507] : INFO : Preparing config file [2022-04-11 17:22:15 main:511] : INFO : Found checkpoint, attempting to load... 
[2022-04-11 17:22:15 main:512] : INFO : Loading config [2022-04-11 17:22:15 main:514] : INFO : Loading state [2022-04-11 17:22:17 main:559] : INFO : Loading DataLoader into Memory [2022-04-11 17:22:17 main:562] : INFO : Starting Training [2022-04-11 17:22:33 main:574] : INFO : Epoch 1880 | loss: 0.0311199 | val_loss: 0.0311169 | Time: 16636 ms [2022-04-11 17:22:49 main:574] : INFO : Epoch 1881 | loss: 0.0310591 | val_loss: 0.0311049 | Time: 15349.9 ms [2022-04-11 17:23:04 main:574] : INFO : Epoch 1882 | loss: 0.0310502 | val_loss: 0.031099 | Time: 15170 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1660) [2022-04-11 17:27:57 main:435] : INFO : Set logging level to 1 [2022-04-11 17:27:57 main:441] : INFO : Running in BOINC Client mode [2022-04-11 17:27:57 main:444] : INFO : Resolving all filenames [2022-04-11 17:27:57 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-04-11 17:27:57 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-04-11 17:27:57 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-04-11 17:27:57 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-04-11 17:27:57 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-04-11 17:27:57 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-04-11 17:27:57 main:474] : INFO : Configuration: [2022-04-11 17:27:57 main:475] : INFO : Model type: GRU [2022-04-11 17:27:57 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-04-11 17:27:57 main:477] : INFO : Max Epochs: 2048 [2022-04-11 17:27:57 main:478] : INFO : Batch Size: 128 [2022-04-11 17:27:57 main:479] : INFO : Learning Rate: 0.01 [2022-04-11 17:27:57 main:480] : INFO : Patience: 10 [2022-04-11 17:27:57 main:481] : INFO : Hidden Width: 12 [2022-04-11 17:27:57 main:482] : INFO : # Recurrent Layers: 4 [2022-04-11 17:27:57 main:483] : INFO : # Backend Layers: 4 [2022-04-11 
17:27:57 main:484] : INFO : # Threads: 1 [2022-04-11 17:27:57 main:486] : INFO : Preparing Dataset [2022-04-11 17:27:57 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-04-11 17:27:58 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-04-11 17:28:01 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-04-11 17:28:01 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-04-11 17:28:01 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-04-11 17:28:02 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-04-11 17:28:02 main:494] : INFO : Creating Model [2022-04-11 17:28:02 main:507] : INFO : Preparing config file [2022-04-11 17:28:02 main:511] : INFO : Found checkpoint, attempting to load... [2022-04-11 17:28:02 main:512] : INFO : Loading config [2022-04-11 17:28:02 main:514] : INFO : Loading state Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1660) [2022-04-11 17:32:42 main:435] : INFO : Set logging level to 1 [2022-04-11 17:32:42 main:441] : INFO : Running in BOINC Client mode [2022-04-11 17:32:42 main:444] : INFO : Resolving all filenames [2022-04-11 17:32:42 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-04-11 17:32:42 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-04-11 17:32:42 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-04-11 17:32:42 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-04-11 17:32:42 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-04-11 17:32:42 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-04-11 17:32:42 main:474] : INFO : Configuration: [2022-04-11 17:32:42 main:475] : INFO : Model type: GRU [2022-04-11 
17:32:42 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-04-11 17:32:42 main:477] : INFO : Max Epochs: 2048 [2022-04-11 17:32:42 main:478] : INFO : Batch Size: 128 [2022-04-11 17:32:42 main:479] : INFO : Learning Rate: 0.01 [2022-04-11 17:32:42 main:480] : INFO : Patience: 10 [2022-04-11 17:32:42 main:481] : INFO : Hidden Width: 12 [2022-04-11 17:32:42 main:482] : INFO : # Recurrent Layers: 4 [2022-04-11 17:32:42 main:483] : INFO : # Backend Layers: 4 [2022-04-11 17:32:42 main:484] : INFO : # Threads: 1 [2022-04-11 17:32:42 main:486] : INFO : Preparing Dataset [2022-04-11 17:32:42 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-04-11 17:32:42 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1660) [2022-04-11 17:39:23 main:435] : INFO : Set logging level to 1 [2022-04-11 17:39:23 main:441] : INFO : Running in BOINC Client mode [2022-04-11 17:39:23 main:444] : INFO : Resolving all filenames [2022-04-11 17:39:23 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-04-11 17:39:23 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-04-11 17:39:23 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-04-11 17:39:23 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-04-11 17:39:23 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-04-11 17:39:23 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-04-11 17:39:23 main:474] : INFO : Configuration: [2022-04-11 17:39:23 main:475] : INFO : Model type: GRU [2022-04-11 17:39:23 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-04-11 17:39:23 main:477] : INFO : Max Epochs: 2048 [2022-04-11 17:39:23 main:478] : INFO : Batch Size: 128 [2022-04-11 17:39:23 main:479] : INFO : Learning Rate: 0.01 
[2022-04-11 17:39:23 main:480] : INFO : Patience: 10
[2022-04-11 17:39:23 main:481] : INFO : Hidden Width: 12
[2022-04-11 17:39:23 main:482] : INFO : # Recurrent Layers: 4
[2022-04-11 17:39:23 main:483] : INFO : # Backend Layers: 4
[2022-04-11 17:39:23 main:484] : INFO : # Threads: 1
[2022-04-11 17:39:23 main:486] : INFO : Preparing Dataset
[2022-04-11 17:39:23 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-11 17:39:24 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-11 17:39:27 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory.
[2022-04-11 17:39:27 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-11 17:39:27 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-11 17:39:27 load:106] : INFO : Successfully loaded dataset of 512 examples into memory.
[2022-04-11 17:39:27 main:494] : INFO : Creating Model
[2022-04-11 17:39:27 main:507] : INFO : Preparing config file
[2022-04-11 17:39:27 main:511] : INFO : Found checkpoint, attempting to load...
[2022-04-11 17:39:27 main:512] : INFO : Loading config
[2022-04-11 17:39:27 main:514] : INFO : Loading state
[2022-04-11 17:39:29 main:559] : INFO : Loading DataLoader into Memory
[2022-04-11 17:39:29 main:562] : INFO : Starting Training
[2022-04-11 17:39:45 main:574] : INFO : Epoch 1880 | loss: 0.0311134 | val_loss: 0.0311137 | Time: 16103.3 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1660)
[2022-04-11 18:08:50 main:435] : INFO : Set logging level to 1
[2022-04-11 18:08:50 main:441] : INFO : Running in BOINC Client mode
[2022-04-11 18:08:50 main:444] : INFO : Resolving all filenames
[2022-04-11 18:08:50 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-04-11 18:08:50 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1)
[2022-04-11 18:08:50 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-04-11 18:08:50 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-04-11 18:08:50 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-04-11 18:08:50 main:472] : INFO : Dataset filename: dataset.hdf5
[2022-04-11 18:08:50 main:474] : INFO : Configuration:
[2022-04-11 18:08:50 main:475] : INFO : Model type: GRU
[2022-04-11 18:08:50 main:476] : INFO : Validation Loss Threshold: 0.0001
[2022-04-11 18:08:50 main:477] : INFO : Max Epochs: 2048
[2022-04-11 18:08:50 main:478] : INFO : Batch Size: 128
[2022-04-11 18:08:50 main:479] : INFO : Learning Rate: 0.01
[2022-04-11 18:08:50 main:480] : INFO : Patience: 10
[2022-04-11 18:08:50 main:481] : INFO : Hidden Width: 12
[2022-04-11 18:08:50 main:482] : INFO : # Recurrent Layers: 4
[2022-04-11 18:08:50 main:483] : INFO : # Backend Layers: 4
[2022-04-11 18:08:50 main:484] : INFO : # Threads: 1
[2022-04-11 18:08:50 main:486] : INFO : Preparing Dataset
[2022-04-11 18:08:50 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-11 18:08:51 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-11 18:08:53 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory.
[2022-04-11 18:08:53 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-11 18:08:53 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-11 18:08:53 load:106] : INFO : Successfully loaded dataset of 512 examples into memory.
[2022-04-11 18:08:53 main:494] : INFO : Creating Model
[2022-04-11 18:08:53 main:507] : INFO : Preparing config file
[2022-04-11 18:08:53 main:511] : INFO : Found checkpoint, attempting to load...
[2022-04-11 18:08:53 main:512] : INFO : Loading config
[2022-04-11 18:08:53 main:514] : INFO : Loading state
[2022-04-11 18:08:54 main:559] : INFO : Loading DataLoader into Memory
[2022-04-11 18:08:54 main:562] : INFO : Starting Training
[2022-04-11 18:09:04 main:574] : INFO : Epoch 1880 | loss: 0.0311177 | val_loss: 0.031114 | Time: 9865.57 ms
[2022-04-11 18:09:13 main:574] : INFO : Epoch 1881 | loss: 0.0310543 | val_loss: 0.0311068 | Time: 9175.76 ms
[2022-04-11 18:09:22 main:574] : INFO : Epoch 1882 | loss: 0.0310495 | val_loss: 0.0311023 | Time: 9133.06 ms
[2022-04-11 18:09:32 main:574] : INFO : Epoch 1883 | loss: 0.0310472 | val_loss: 0.0311081 | Time: 9115.66 ms
[2022-04-11 18:09:41 main:574] : INFO : Epoch 1884 | loss: 0.0310445 | val_loss: 0.0310999 | Time: 9255.21 ms
[2022-04-11 18:09:50 main:574] : INFO : Epoch 1885 | loss: 0.0310419 | val_loss: 0.0311046 | Time: 9054.11 ms
[2022-04-11 18:09:59 main:574] : INFO : Epoch 1886 | loss: 0.0310417 | val_loss: 0.0311008 | Time: 9237.4 ms
[2022-04-11 18:10:08 main:574] : INFO : Epoch 1887 | loss: 0.0310373 | val_loss: 0.0311066 | Time: 9214.34 ms
[2022-04-11 18:10:18 main:574] : INFO : Epoch 1888 | loss: 0.031042 | val_loss: 0.031113 | Time: 9138.44 ms
[2022-04-11 18:10:27 main:574] : INFO : Epoch 1889 | loss: 0.0310408 | val_loss: 0.0311097 | Time: 9380.2 ms
[2022-04-11 18:10:36 main:574] : INFO : Epoch 1890 | loss: 0.0310547 | val_loss: 0.0311112 | Time: 9022.41 ms
[2022-04-11 18:10:45 main:574] : INFO : Epoch 1891 | loss: 0.0310546 | val_loss: 0.031111 | Time: 9288.23 ms
[2022-04-11 18:10:55 main:574] : INFO : Epoch 1892 | loss: 0.0310479 | val_loss: 0.0311167 | Time: 9421.94 ms
[2022-04-11 18:11:04 main:574] : INFO : Epoch 1893 | loss: 0.0310451 | val_loss: 0.0311041 | Time: 9176.82 ms
[2022-04-11 18:11:13 main:574] : INFO : Epoch 1894 | loss: 0.0310431 | val_loss: 0.0311028 | Time: 9101.69 ms
[2022-04-11 18:11:22 main:574] : INFO : Epoch 1895 | loss: 0.0310412 | val_loss: 0.0311096 | Time: 9098.53 ms
[2022-04-11 18:11:31 main:574] : INFO : Epoch 1896 | loss: 0.0310402 | val_loss: 0.0311055 | Time: 9296.11 ms
[2022-04-11 18:11:40 main:574] : INFO : Epoch 1897 | loss: 0.0310387 | val_loss: 0.0311013 | Time: 9100.94 ms
[2022-04-11 18:11:50 main:574] : INFO : Epoch 1898 | loss: 0.0310378 | val_loss: 0.0311077 | Time: 9210.08 ms
[2022-04-11 18:12:00 main:574] : INFO : Epoch 1899 | loss: 0.0310354 | val_loss: 0.0311021 | Time: 10217.2 ms
[2022-04-11 18:12:10 main:574] : INFO : Epoch 1900 | loss: 0.031036 | val_loss: 0.0311161 | Time: 10111.8 ms
[2022-04-11 18:12:19 main:574] : INFO : Epoch 1901 | loss: 0.0310328 | val_loss: 0.0311009 | Time: 9369.65 ms
[2022-04-11 18:12:29 main:574] : INFO : Epoch 1902 | loss: 0.0310354 | val_loss: 0.0311063 | Time: 9173.67 ms
[2022-04-11 18:12:38 main:574] : INFO : Epoch 1903 | loss: 0.0310369 | val_loss: 0.0311215 | Time: 9299.83 ms
[2022-04-11 18:12:47 main:574] : INFO : Epoch 1904 | loss: 0.0310407 | val_loss: 0.0311087 | Time: 9195.32 ms
[2022-04-11 18:12:57 main:574] : INFO : Epoch 1905 | loss: 0.0310364 | val_loss: 0.0310999 | Time: 9808.94 ms
[2022-04-11 18:13:06 main:574] : INFO : Epoch 1906 | loss: 0.0310409 | val_loss: 0.0310997 | Time: 9105.37 ms
[2022-04-11 18:13:15 main:574] : INFO : Epoch 1907 | loss: 0.0310476 | val_loss: 0.0311015 | Time: 9171.46 ms
[2022-04-11 18:13:25 main:574] : INFO : Epoch 1908 | loss: 0.0310422 | val_loss: 0.031113 | Time: 9414.41 ms
[2022-04-11 18:13:34 main:574] : INFO : Epoch 1909 | loss: 0.0310359 | val_loss: 0.0311069 | Time: 9240.49 ms
[2022-04-11 18:13:43 main:574] : INFO : Epoch 1910 | loss: 0.0310355 | val_loss: 0.0311135 | Time: 9410.11 ms
[2022-04-11 18:13:53 main:574] : INFO : Epoch 1911 | loss: 0.03104 | val_loss: 0.0311178 | Time: 9257.8 ms
[2022-04-11 18:14:02 main:574] : INFO : Epoch 1912 | loss: 0.0310417 | val_loss: 0.0311122 | Time: 9245.46 ms
[2022-04-11 18:14:11 main:574] : INFO : Epoch 1913 | loss: 0.0310394 | val_loss: 0.0311086 | Time: 9383.48 ms
[2022-04-11 18:14:20 main:574] : INFO : Epoch 1914 | loss: 0.0310421 | val_loss: 0.0311084 | Time: 9182.73 ms
[2022-04-11 18:14:30 main:574] : INFO : Epoch 1915 | loss: 0.031052 | val_loss: 0.0311074 | Time: 9225.11 ms
[2022-04-11 18:14:39 main:574] : INFO : Epoch 1916 | loss: 0.0310533 | val_loss: 0.0311095 | Time: 9365.53 ms
[2022-04-11 18:14:48 main:574] : INFO : Epoch 1917 | loss: 0.0310487 | val_loss: 0.0311069 | Time: 9133.06 ms
[2022-04-11 18:14:58 main:574] : INFO : Epoch 1918 | loss: 0.0310427 | val_loss: 0.0310983 | Time: 9410.76 ms
[2022-04-11 18:15:07 main:574] : INFO : Epoch 1919 | loss: 0.0310508 | val_loss: 0.0311071 | Time: 9153.75 ms
[2022-04-11 18:15:16 main:574] : INFO : Epoch 1920 | loss: 0.0310434 | val_loss: 0.0311131 | Time: 9484.48 ms
[2022-04-11 18:15:26 main:574] : INFO : Epoch 1921 | loss: 0.0310479 | val_loss: 0.0311048 | Time: 9199.04 ms
[2022-04-11 18:15:35 main:574] : INFO : Epoch 1922 | loss: 0.0310411 | val_loss: 0.031098 | Time: 9274.78 ms
[2022-04-11 18:15:44 main:574] : INFO : Epoch 1923 | loss: 0.0310382 | val_loss: 0.031102 | Time: 9451.65 ms
[2022-04-11 18:15:54 main:574] : INFO : Epoch 1924 | loss: 0.0310357 | val_loss: 0.031093 | Time: 9499.77 ms
[2022-04-11 18:16:03 main:574] : INFO : Epoch 1925 | loss: 0.0310403 | val_loss: 0.0311017 | Time: 9432.72 ms
[2022-04-11 18:16:13 main:574] : INFO : Epoch 1926 | loss: 0.031041 | val_loss: 0.0311181 | Time: 9477.85 ms
[2022-04-11 18:16:22 main:574] : INFO : Epoch 1927 | loss: 0.0310424 | val_loss: 0.0310996 | Time: 9266.27 ms
[2022-04-11 18:16:31 main:574] : INFO : Epoch 1928 | loss: 0.0310411 | val_loss: 0.0311005 | Time: 9413.52 ms
[2022-04-11 18:16:41 main:574] : INFO : Epoch 1929 | loss: 0.0310399 | val_loss: 0.0310968 | Time: 9371.27 ms
[2022-04-11 18:16:50 main:574] : INFO : Epoch 1930 | loss: 0.0310397 | val_loss: 0.0310963 | Time: 9193.81 ms
[2022-04-11 18:16:59 main:574] : INFO : Epoch 1931 | loss: 0.0310314 | val_loss: 0.0311022 | Time: 9254.1 ms
[2022-04-11 18:17:09 main:574] : INFO : Epoch 1932 | loss: 0.031029 | val_loss: 0.0310936 | Time: 9276.02 ms
[2022-04-11 18:17:18 main:574] : INFO : Epoch 1933 | loss: 0.031032 | val_loss: 0.031102 | Time: 9352.56 ms
[2022-04-11 18:17:27 main:574] : INFO : Epoch 1934 | loss: 0.0310358 | val_loss: 0.0311051 | Time: 9321.36 ms
[2022-04-11 18:17:37 main:574] : INFO : Epoch 1935 | loss: 0.0310312 | val_loss: 0.0310974 | Time: 9370.86 ms
[2022-04-11 18:17:46 main:574] : INFO : Epoch 1936 | loss: 0.0310291 | val_loss: 0.0310952 | Time: 9552.78 ms
[2022-04-11 18:17:55 main:574] : INFO : Epoch 1937 | loss: 0.031026 | val_loss: 0.0310966 | Time: 9315.1 ms
[2022-04-11 18:18:05 main:574] : INFO : Epoch 1938 | loss: 0.0310306 | val_loss: 0.0311007 | Time: 9212.16 ms
[2022-04-11 18:18:14 main:574] : INFO : Epoch 1939 | loss: 0.0310343 | val_loss: 0.0310993 | Time: 9365.86 ms
[2022-04-11 18:18:24 main:574] : INFO : Epoch 1940 | loss: 0.0310316 | val_loss: 0.0311028 | Time: 9461.4 ms
[2022-04-11 18:18:33 main:574] : INFO : Epoch 1941 | loss: 0.0310307 | val_loss: 0.0311035 | Time: 9275.07 ms
[2022-04-11 18:18:42 main:574] : INFO : Epoch 1942 | loss: 0.0310276 | val_loss: 0.0310968 | Time: 9211.38 ms
[2022-04-11 18:18:51 main:574] : INFO : Epoch 1943 | loss: 0.0310358 | val_loss: 0.0310968 | Time: 9375.5 ms
[2022-04-11 18:19:01 main:574] : INFO : Epoch 1944 | loss: 0.0310341 | val_loss: 0.0310959 | Time: 9223.34 ms
[2022-04-11 18:19:10 main:574] : INFO : Epoch 1945 | loss: 0.0310372 | val_loss: 0.0311036 | Time: 9160.86 ms
[2022-04-11 18:19:19 main:574] : INFO : Epoch 1946 | loss: 0.0310355 | val_loss: 0.0310927 | Time: 9257.07 ms
[2022-04-11 18:19:28 main:574] : INFO : Epoch 1947 | loss: 0.0310349 | val_loss: 0.0310967 | Time: 9079.27 ms
[2022-04-11 18:19:38 main:574] : INFO : Epoch 1948 | loss: 0.03104 | val_loss: 0.0310954 | Time: 9350.54 ms
[2022-04-11 18:19:47 main:574] : INFO : Epoch 1949 | loss: 0.0310358 | val_loss: 0.031098 | Time: 9336.89 ms
[2022-04-11 18:19:56 main:574] : INFO : Epoch 1950 | loss: 0.0310333 | val_loss: 0.0311004 | Time: 9457.32 ms
[2022-04-11 18:20:06 main:574] : INFO : Epoch 1951 | loss: 0.0310329 | val_loss: 0.0311078 | Time: 9243.77 ms
[2022-04-11 18:20:15 main:574] : INFO : Epoch 1952 | loss: 0.0310351 | val_loss: 0.0311007 | Time: 9507.58 ms
[2022-04-11 18:20:25 main:574] : INFO : Epoch 1953 | loss: 0.0310568 | val_loss: 0.0311072 | Time: 9360.92 ms
[2022-04-11 18:20:34 main:574] : INFO : Epoch 1954 | loss: 0.0310627 | val_loss: 0.0310948 | Time: 9153.86 ms
[2022-04-11 18:20:43 main:574] : INFO : Epoch 1955 | loss: 0.031052 | val_loss: 0.0311013 | Time: 9494.94 ms
[2022-04-11 18:20:52 main:574] : INFO : Epoch 1956 | loss: 0.0310491 | val_loss: 0.0311085 | Time: 9215.14 ms
[2022-04-11 18:21:02 main:574] : INFO : Epoch 1957 | loss: 0.0310501 | val_loss: 0.0311117 | Time: 9354.03 ms
[2022-04-11 18:21:11 main:574] : INFO : Epoch 1958 | loss: 0.0310588 | val_loss: 0.0311012 | Time: 9369.64 ms
[2022-04-11 18:21:20 main:574] : INFO : Epoch 1959 | loss: 0.0310682 | val_loss: 0.031102 | Time: 9151.83 ms
[2022-04-11 18:21:30 main:574] : INFO : Epoch 1960 | loss: 0.0310622 | val_loss: 0.0311043 | Time: 9623.36 ms
[2022-04-11 18:21:39 main:574] : INFO : Epoch 1961 | loss: 0.0310582 | val_loss: 0.031104 | Time: 9317.64 ms
[2022-04-11 18:21:48 main:574] : INFO : Epoch 1962 | loss: 0.0310543 | val_loss: 0.0311 | Time: 9112.22 ms
[2022-04-11 18:21:58 main:574] : INFO : Epoch 1963 | loss: 0.0310562 | val_loss: 0.0311047 | Time: 9260.22 ms
[2022-04-11 18:22:07 main:574] : INFO : Epoch 1964 | loss: 0.0310568 | val_loss: 0.0311062 | Time: 9346.31 ms
[2022-04-11 18:22:16 main:574] : INFO : Epoch 1965 | loss: 0.031059 | val_loss: 0.0311105 | Time: 9269.66 ms
[2022-04-11 18:22:25 main:574] : INFO : Epoch 1966 | loss: 0.0310655 | val_loss: 0.0311058 | Time: 9119.39 ms
[2022-04-11 18:22:35 main:574] : INFO : Epoch 1967 | loss: 0.0310554 | val_loss: 0.0310952 | Time: 9295.18 ms
[2022-04-11 18:22:44 main:574] : INFO : Epoch 1968 | loss: 0.0310512 | val_loss: 0.0311094 | Time: 9253.57 ms
[2022-04-11 18:22:53 main:574] : INFO : Epoch 1969 | loss: 0.031048 | val_loss: 0.031106 | Time: 9342.06 ms
[2022-04-11 18:23:03 main:574] : INFO : Epoch 1970 | loss: 0.0310437 | val_loss: 0.031103 | Time: 9321.64 ms
[2022-04-11 18:23:12 main:574] : INFO : Epoch 1971 | loss: 0.0310414 | val_loss: 0.0311052 | Time: 9571.75 ms
[2022-04-11 18:23:22 main:574] : INFO : Epoch 1972 | loss: 0.031041 | val_loss: 0.0311021 | Time: 9441.05 ms
[2022-04-11 18:23:31 main:574] : INFO : Epoch 1973 | loss: 0.0310396 | val_loss: 0.0311059 | Time: 9163.19 ms
[2022-04-11 18:23:40 main:574] : INFO : Epoch 1974 | loss: 0.0310407 | val_loss: 0.0310987 | Time: 9366.22 ms
[2022-04-11 18:23:50 main:574] : INFO : Epoch 1975 | loss: 0.0310415 | val_loss: 0.031098 | Time: 9396.19 ms
[2022-04-11 18:23:59 main:574] : INFO : Epoch 1976 | loss: 0.0310371 | val_loss: 0.0310943 | Time: 9584.66 ms
[2022-04-11 18:24:09 main:574] : INFO : Epoch 1977 | loss: 0.0310357 | val_loss: 0.0311053 | Time: 9494.77 ms
[2022-04-11 18:24:18 main:574] : INFO : Epoch 1978 | loss: 0.0310395 | val_loss: 0.0311038 | Time: 9443.52 ms
[2022-04-11 18:24:27 main:574] : INFO : Epoch 1979 | loss: 0.0310394 | val_loss: 0.0310985 | Time: 9198.59 ms
[2022-04-11 18:24:37 main:574] : INFO : Epoch 1980 | loss: 0.0310402 | val_loss: 0.0311053 | Time: 9212.92 ms
[2022-04-11 18:24:46 main:574] : INFO : Epoch 1981 | loss: 0.0310401 | val_loss: 0.0311115 | Time: 9308.06 ms
[2022-04-11 18:24:55 main:574] : INFO : Epoch 1982 | loss: 0.0310388 | val_loss: 0.031104 | Time: 9401.85 ms
[2022-04-11 18:25:05 main:574] : INFO : Epoch 1983 | loss: 0.0310346 | val_loss: 0.0310991 | Time: 9201.72 ms
[2022-04-11 18:25:14 main:574] : INFO : Epoch 1984 | loss: 0.0310328 | val_loss: 0.0311061 | Time: 9188.33 ms
[2022-04-11 18:25:23 main:574] : INFO : Epoch 1985 | loss: 0.031033 | val_loss: 0.0310988 | Time: 9371.29 ms
[2022-04-11 18:25:32 main:574] : INFO : Epoch 1986 | loss: 0.0310367 | val_loss: 0.0311091 | Time: 9142 ms
[2022-04-11 18:25:42 main:574] : INFO : Epoch 1987 | loss: 0.031037 | val_loss: 0.0310998 | Time: 9401.19 ms
[2022-04-11 18:25:51 main:574] : INFO : Epoch 1988 | loss: 0.0310326 | val_loss: 0.0311037 | Time: 9285.34 ms
[2022-04-11 18:26:00 main:574] : INFO : Epoch 1989 | loss: 0.0310315 | val_loss: 0.0310993 | Time: 9305.1 ms
[2022-04-11 18:26:10 main:574] : INFO : Epoch 1990 | loss: 0.0310314 | val_loss: 0.0311054 | Time: 10060.5 ms
[2022-04-11 18:26:20 main:574] : INFO : Epoch 1991 | loss: 0.0310339 | val_loss: 0.0311147 | Time: 9599.27 ms
[2022-04-11 18:26:29 main:574] : INFO : Epoch 1992 | loss: 0.0310304 | val_loss: 0.0311158 | Time: 9381.47 ms
[2022-04-11 18:26:39 main:574] : INFO : Epoch 1993 | loss: 0.0310298 | val_loss: 0.0311047 | Time: 9334.98 ms
[2022-04-11 18:26:48 main:574] : INFO : Epoch 1994 | loss: 0.0310262 | val_loss: 0.0311114 | Time: 9405.14 ms
[2022-04-11 18:26:58 main:574] : INFO : Epoch 1995 | loss: 0.0310306 | val_loss: 0.0311142 | Time: 9288.51 ms
[2022-04-11 18:27:07 main:574] : INFO : Epoch 1996 | loss: 0.0310302 | val_loss: 0.0311068 | Time: 9518.63 ms
[2022-04-11 18:27:16 main:574] : INFO : Epoch 1997 | loss: 0.0310301 | val_loss: 0.0311074 | Time: 9326.41 ms
[2022-04-11 18:27:26 main:574] : INFO : Epoch 1998 | loss: 0.0310274 | val_loss: 0.0310964 | Time: 9193.95 ms
[2022-04-11 18:27:35 main:574] : INFO : Epoch 1999 | loss: 0.0310282 | val_loss: 0.0311066 | Time: 9269.43 ms
[2022-04-11 18:27:44 main:574] : INFO : Epoch 2000 | loss: 0.031035 | val_loss: 0.031104 | Time: 9466.83 ms
[2022-04-11 18:27:53 main:574] : INFO : Epoch 2001 | loss: 0.0310317 | val_loss: 0.0311095 | Time: 9007 ms
[2022-04-11 18:28:03 main:574] : INFO : Epoch 2002 | loss: 0.0310329 | val_loss: 0.0311052 | Time: 9283.28 ms
[2022-04-11 18:28:12 main:574] : INFO : Epoch 2003 | loss: 0.0310389 | val_loss: 0.0311134 | Time: 9362.96 ms
[2022-04-11 18:28:21 main:574] : INFO : Epoch 2004 | loss: 0.03104 | val_loss: 0.0311044 | Time: 9427.49 ms
[2022-04-11 18:28:31 main:574] : INFO : Epoch 2005 | loss: 0.031043 | val_loss: 0.0311083 | Time: 9308.06 ms
[2022-04-11 18:28:40 main:574] : INFO : Epoch 2006 | loss: 0.0310358 | val_loss: 0.0311011 | Time: 9257.99 ms
[2022-04-11 18:28:49 main:574] : INFO : Epoch 2007 | loss: 0.0310391 | val_loss: 0.0311063 | Time: 9177.19 ms
[2022-04-11 18:28:58 main:574] : INFO : Epoch 2008 | loss: 0.031039 | val_loss: 0.0311016 | Time: 9086.99 ms
[2022-04-11 18:29:08 main:574] : INFO : Epoch 2009 | loss: 0.0310354 | val_loss: 0.0311071 | Time: 9265.42 ms
[2022-04-11 18:29:17 main:574] : INFO : Epoch 2010 | loss: 0.0310346 | val_loss: 0.0311062 | Time: 9549.37 ms
[2022-04-11 18:29:27 main:574] : INFO : Epoch 2011 | loss: 0.0310422 | val_loss: 0.0311057 | Time: 9418.04 ms
[2022-04-11 18:29:36 main:574] : INFO : Epoch 2012 | loss: 0.0310404 | val_loss: 0.0311057 | Time: 9335.76 ms
[2022-04-11 18:29:45 main:574] : INFO : Epoch 2013 | loss: 0.0310368 | val_loss: 0.0311078 | Time: 9435.46 ms
[2022-04-11 18:29:54 main:574] : INFO : Epoch 2014 | loss: 0.0310351 | val_loss: 0.0310915 | Time: 9017.97 ms
[2022-04-11 18:30:03 main:574] : INFO : Epoch 2015 | loss: 0.0310418 | val_loss: 0.031109 | Time: 9069.56 ms
[2022-04-11 18:30:13 main:574] : INFO : Epoch 2016 | loss: 0.0310374 | val_loss: 0.0310939 | Time: 9551.11 ms
[2022-04-11 18:30:22 main:574] : INFO : Epoch 2017 | loss: 0.0310698 | val_loss: 0.0311018 | Time: 9273.22 ms
[2022-04-11 18:30:32 main:574] : INFO : Epoch 2018 | loss: 0.0310747 | val_loss: 0.0311087 | Time: 9235.12 ms
[2022-04-11 18:30:41 main:574] : INFO : Epoch 2019 | loss: 0.0310644 | val_loss: 0.0311025 | Time: 9181.43 ms
[2022-04-11 18:30:50 main:574] : INFO : Epoch 2020 | loss: 0.0310592 | val_loss: 0.0311063 | Time: 9377.15 ms
[2022-04-11 18:30:59 main:574] : INFO : Epoch 2021 | loss: 0.0310572 | val_loss: 0.0311093 | Time: 9174.93 ms
[2022-04-11 18:31:09 main:574] : INFO : Epoch 2022 | loss: 0.031051 | val_loss: 0.0311109 | Time: 9446.8 ms
[2022-04-11 18:31:18 main:574] : INFO : Epoch 2023 | loss: 0.0310512 | val_loss: 0.0311061 | Time: 9293.17 ms
[2022-04-11 18:31:27 main:574] : INFO : Epoch 2024 | loss: 0.0310448 | val_loss: 0.0311153 | Time: 9137.37 ms
[2022-04-11 18:31:36 main:574] : INFO : Epoch 2025 | loss: 0.031041 | val_loss: 0.0311063 | Time: 9208 ms
[2022-04-11 18:31:46 main:574] : INFO : Epoch 2026 | loss: 0.0310462 | val_loss: 0.0311112 | Time: 9436 ms
[2022-04-11 18:31:56 main:574] : INFO : Epoch 2027 | loss: 0.0310411 | val_loss: 0.0311047 | Time: 9666.44 ms
[2022-04-11 18:32:05 main:574] : INFO : Epoch 2028 | loss: 0.0310393 | val_loss: 0.0311001 | Time: 9681.27 ms
[2022-04-11 18:32:15 main:574] : INFO : Epoch 2029 | loss: 0.0310393 | val_loss: 0.031116 | Time: 9630.54 ms
[2022-04-11 18:32:24 main:574] : INFO : Epoch 2030 | loss: 0.0310364 | val_loss: 0.0311109 | Time: 9325.01 ms
[2022-04-11 18:32:34 main:574] : INFO : Epoch 2031 | loss: 0.0310337 | val_loss: 0.031113 | Time: 9554.83 ms
[2022-04-11 18:32:44 main:574] : INFO : Epoch 2032 | loss: 0.0310325 | val_loss: 0.031115 | Time: 10126.9 ms
[2022-04-11 18:32:53 main:574] : INFO : Epoch 2033 | loss: 0.0310319 | val_loss: 0.0311115 | Time: 9461.4 ms
[2022-04-11 18:33:03 main:574] : INFO : Epoch 2034 | loss: 0.0310341 | val_loss: 0.0311046 | Time: 9295.3 ms
[2022-04-11 18:33:12 main:574] : INFO : Epoch 2035 | loss: 0.031038 | val_loss: 0.0311085 | Time: 9302.52 ms
[2022-04-11 18:33:21 main:574] : INFO : Epoch 2036 | loss: 0.031038 | val_loss: 0.0311097 | Time: 9195.46 ms
[2022-04-11 18:33:31 main:574] : INFO : Epoch 2037 | loss: 0.0310382 | val_loss: 0.0311161 | Time: 9300.72 ms
[2022-04-11 18:33:40 main:574] : INFO : Epoch 2038 | loss: 0.0310375 | val_loss: 0.0311002 | Time: 9278.09 ms
[2022-04-11 18:33:49 main:574] : INFO : Epoch 2039 | loss: 0.0310388 | val_loss: 0.0310975 | Time: 9355.38 ms
[2022-04-11 18:33:58 main:574] : INFO : Epoch 2040 | loss: 0.0310404 | val_loss: 0.0310979 | Time: 9185.14 ms
[2022-04-11 18:34:08 main:574] : INFO : Epoch 2041 | loss: 0.0310384 | val_loss: 0.031097 | Time: 9402.86 ms
[2022-04-11 18:34:17 main:574] : INFO : Epoch 2042 | loss: 0.0310369 | val_loss: 0.0310982 | Time: 9592.73 ms
[2022-04-11 18:34:27 main:574] : INFO : Epoch 2043 | loss: 0.0310353 | val_loss: 0.0310982 | Time: 9258.38 ms
[2022-04-11 18:34:36 main:574] : INFO : Epoch 2044 | loss: 0.0310307 | val_loss: 0.0310996 | Time: 9407.51 ms
[2022-04-11 18:34:46 main:574] : INFO : Epoch 2045 | loss: 0.031032 | val_loss: 0.0311071 | Time: 9541.11 ms
[2022-04-11 18:34:55 main:574] : INFO : Epoch 2046 | loss: 0.0310313 | val_loss: 0.031101 | Time: 9247.82 ms
[2022-04-11 18:35:04 main:574] : INFO : Epoch 2047 | loss: 0.0310289 | val_loss: 0.0311183 | Time: 9325.83 ms
[2022-04-11 18:35:14 main:574] : INFO : Epoch 2048 | loss: 0.0310363 | val_loss: 0.0310927 | Time: 9475.38 ms
[2022-04-11 18:35:14 main:597] : INFO : Saving trained model to model-final.pt, val_loss 0.0310927
[2022-04-11 18:35:14 main:603] : INFO : Saving end state to config to file
[2022-04-11 18:35:14 main:608] : INFO : Success, exiting..
18:35:14 (22568): called boinc_finish(0)
</stderr_txt>
]]>
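The log above shows the two behaviours worth noting in this result: each application restart (the repeated version banners) resumes from snapshot.pt and logs Epoch 1880 again, and the run finally ends at Max Epochs (2048) because val_loss hovers near 0.0311 and never reaches the Validation Loss Threshold of 0.0001. A minimal, stdlib-only sketch of that checkpoint-resume and stop logic follows; the file name mirrors the log's snapshot.pt, but the functions and snapshot layout here are assumptions for illustration, not MLC@Home's actual implementation.

```python
# Hypothetical sketch of checkpoint-resume with a val_loss stopping threshold,
# modelled on the behaviour visible in the stderr log. Not the project's code.
import os
import pickle

MAX_EPOCHS = 2048            # "Max Epochs: 2048" from the logged configuration
VAL_LOSS_THRESHOLD = 0.0001  # "Validation Loss Threshold: 0.0001"

def load_snapshot(path):
    """Resume from a checkpoint if one exists (snapshot.pt in the log)."""
    if os.path.exists(path):
        with open(path, "rb") as f:
            return pickle.load(f)        # {"epoch": ..., "state": ...}
    return {"epoch": 0, "state": None}

def save_snapshot(snap, path):
    with open(path, "wb") as f:
        pickle.dump(snap, f)

def train(run_epoch, path):
    """run_epoch(epoch, state) -> (state, val_loss). Returns the last epoch run.

    Because the snapshot is written only after an epoch completes, a process
    killed mid-epoch re-runs that epoch on restart -- which is consistent with
    "Epoch 1880" appearing after each "Starting Training" in the log.
    """
    snap = load_snapshot(path)
    epoch = snap["epoch"]
    while epoch < MAX_EPOCHS:
        epoch += 1
        snap["state"], val_loss = run_epoch(epoch, snap["state"])
        snap["epoch"] = epoch
        save_snapshot(snap, path)         # a later restart resumes from here
        if val_loss < VAL_LOSS_THRESHOLD: # never satisfied in this workunit
            break
    return epoch
```

With a run_epoch whose validation loss plateaus around 0.0311, the threshold branch never fires and the loop exits only at epoch 2048, after which the final model would be written out, matching the "Saving trained model to model-final.pt" line that closes the log.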
©2022 MLC@Home Team
A project of the Cognition, Robotics, and Learning (CORAL) Lab at the University of Maryland, Baltimore County (UMBC)