| Field | Value |
| --- | --- |
| Name | ParityModified-1639961302-21958-0_0 |
| Workunit | 7403265 |
| Created | 20 Dec 2021, 0:48:26 UTC |
| Sent | 4 Jan 2022, 1:23:44 UTC |
| Report deadline | 12 Jan 2022, 1:23:44 UTC |
| Received | 9 Jan 2022, 2:03:28 UTC |
| Server state | Over |
| Outcome | Success |
| Client state | Done |
| Exit status | 0 (0x00000000) |
| Computer ID | 6020 |
| Run time | 7 hours 17 min 10 sec |
| CPU time | 6 hours 54 min 18 sec |
| Validate state | Valid |
| Credit | 4,160.00 |
| Device peak FLOPS | 1,880.96 GFLOPS |
| Application version | Machine Learning Dataset Generator (GPU) v9.75 (cuda10200) windows_x86_64 |
| Peak working set size | 1.64 GB |
| Peak swap size | 3.60 GB |
| Peak disk usage | 1.54 GB |
<core_client_version>7.16.20</core_client_version> <![CDATA[ <stderr_txt> loss: 0.0311185 | val_loss: 0.0312019 | Time: 8029.66 ms [2022-01-06 13:49:27 main:574] : INFO : Epoch 1686 | loss: 0.0311299 | val_loss: 0.0311814 | Time: 8073.48 ms [2022-01-06 13:49:35 main:574] : INFO : Epoch 1687 | loss: 0.0311359 | val_loss: 0.0312033 | Time: 8068.28 ms [2022-01-06 13:49:43 main:574] : INFO : Epoch 1688 | loss: 0.0311295 | val_loss: 0.0312034 | Time: 8099.86 ms [2022-01-06 13:49:51 main:574] : INFO : Epoch 1689 | loss: 0.0311216 | val_loss: 0.0311991 | Time: 8276.45 ms [2022-01-06 13:49:59 main:574] : INFO : Epoch 1690 | loss: 0.0311171 | val_loss: 0.031196 | Time: 8025.78 ms [2022-01-06 13:50:07 main:574] : INFO : Epoch 1691 | loss: 0.0311213 | val_loss: 0.0311982 | Time: 7348.74 ms [2022-01-06 13:50:15 main:574] : INFO : Epoch 1692 | loss: 0.0311307 | val_loss: 0.031194 | Time: 8097.3 ms [2022-01-06 13:50:23 main:574] : INFO : Epoch 1693 | loss: 0.0311259 | val_loss: 0.0312029 | Time: 7965.03 ms [2022-01-06 13:50:31 main:574] : INFO : Epoch 1694 | loss: 0.0311202 | val_loss: 0.0312028 | Time: 7787.43 ms [2022-01-06 13:50:38 main:574] : INFO : Epoch 1695 | loss: 0.0311181 | val_loss: 0.0311977 | Time: 7919.48 ms [2022-01-06 13:50:46 main:574] : INFO : Epoch 1696 | loss: 0.0311165 | val_loss: 0.0312025 | Time: 7620.51 ms [2022-01-06 13:50:54 main:574] : INFO : Epoch 1697 | loss: 0.0311161 | val_loss: 0.0312001 | Time: 7509.58 ms [2022-01-06 13:51:02 main:574] : INFO : Epoch 1698 | loss: 0.031116 | val_loss: 0.0312144 | Time: 8048.89 ms [2022-01-06 13:51:09 main:574] : INFO : Epoch 1699 | loss: 0.0311169 | val_loss: 0.0312028 | Time: 7459.21 ms [2022-01-06 13:51:16 main:574] : INFO : Epoch 1700 | loss: 0.0311203 | val_loss: 0.0311992 | Time: 7054.98 ms [2022-01-06 13:51:25 main:574] : INFO : Epoch 1701 | loss: 0.0311194 | val_loss: 0.0311969 | Time: 8419.46 ms [2022-01-06 13:51:33 main:574] : INFO : Epoch 1702 | loss: 0.0311165 | val_loss: 0.0312064 | Time: 8700.54 ms [2022-01-06 13:51:41 main:574] : INFO : Epoch 1703 | loss: 0.0311166 | val_loss: 0.0311946 | Time: 8077.36 ms [2022-01-06 13:51:49 main:574] : INFO : Epoch 1704 | loss: 0.0311237 | val_loss: 0.0311983 | Time: 7532.55 ms [2022-01-06 13:51:57 main:574] : INFO : Epoch 1705 | loss: 0.0311139 | val_loss: 0.0312068 | Time: 7974.89 ms [2022-01-06 13:52:05 main:574] : INFO : Epoch 1706 | loss: 0.0311153 | val_loss: 0.0311966 | Time: 7673.98 ms [2022-01-06 13:52:13 main:574] : INFO : Epoch 1707 | loss: 0.0311135 | val_loss: 0.0312021 | Time: 7819.92 ms [2022-01-06 13:52:20 main:574] : INFO : Epoch 1708 | loss: 0.0311132 | val_loss: 0.0311972 | Time: 7277.36 ms [2022-01-06 13:52:27 main:574] : INFO : Epoch 1709 | loss: 0.0311195 | val_loss: 0.0311933 | Time: 7102.44 ms [2022-01-06 13:52:35 main:574] : INFO : Epoch 1710 | loss: 0.0311132 | val_loss: 0.0312075 | Time: 7959.18 ms [2022-01-06 13:52:43 main:574] : INFO : Epoch 1711 | loss: 0.0311118 | val_loss: 0.0312075 | Time: 7971.18 ms [2022-01-06 13:52:51 main:574] : INFO : Epoch 1712 | loss: 0.0311123 | val_loss: 0.0312126 | Time: 7807.02 ms [2022-01-06 13:52:59 main:574] : INFO : Epoch 1713 | loss: 0.0311225 | val_loss: 0.0311989 | Time: 8131.89 ms [2022-01-06 13:53:07 main:574] : INFO : Epoch 1714 | loss: 0.031122 | val_loss: 0.0311995 | Time: 7894.19 ms [2022-01-06 13:53:14 main:574] : INFO : Epoch 1715 | loss: 0.0311213 | val_loss: 0.0312103 | Time: 7389.2 ms [2022-01-06 13:53:22 main:574] : INFO : Epoch 1716 | loss: 0.0311174 | val_loss: 0.0311971 | Time: 8001.87 ms [2022-01-06 
13:53:30 main:574] : INFO : Epoch 1717 | loss: 0.0311166 | val_loss: 0.0311984 | Time: 7601.81 ms [2022-01-06 13:53:38 main:574] : INFO : Epoch 1718 | loss: 0.0311137 | val_loss: 0.0312029 | Time: 7894.27 ms [2022-01-06 13:53:45 main:574] : INFO : Epoch 1719 | loss: 0.0311123 | val_loss: 0.0312012 | Time: 7789.15 ms [2022-01-06 13:53:53 main:574] : INFO : Epoch 1720 | loss: 0.0311098 | val_loss: 0.0312092 | Time: 7682.35 ms [2022-01-06 13:54:01 main:574] : INFO : Epoch 1721 | loss: 0.0311149 | val_loss: 0.0311988 | Time: 7923.11 ms [2022-01-06 13:54:09 main:574] : INFO : Epoch 1722 | loss: 0.031114 | val_loss: 0.0312006 | Time: 7884.97 ms [2022-01-06 13:54:17 main:574] : INFO : Epoch 1723 | loss: 0.0311239 | val_loss: 0.0312035 | Time: 7802.85 ms [2022-01-06 13:54:25 main:574] : INFO : Epoch 1724 | loss: 0.0311192 | val_loss: 0.0311982 | Time: 8101.65 ms [2022-01-06 13:54:33 main:574] : INFO : Epoch 1725 | loss: 0.031117 | val_loss: 0.0311976 | Time: 7742.39 ms [2022-01-06 13:54:41 main:574] : INFO : Epoch 1726 | loss: 0.031112 | val_loss: 0.0312069 | Time: 8202.31 ms [2022-01-06 13:54:49 main:574] : INFO : Epoch 1727 | loss: 0.0311158 | val_loss: 0.0311996 | Time: 7827.45 ms [2022-01-06 13:54:56 main:574] : INFO : Epoch 1728 | loss: 0.0311166 | val_loss: 0.0311987 | Time: 7733.62 ms [2022-01-06 13:55:04 main:574] : INFO : Epoch 1729 | loss: 0.031108 | val_loss: 0.0312163 | Time: 7648.76 ms [2022-01-06 13:55:12 main:574] : INFO : Epoch 1730 | loss: 0.0311079 | val_loss: 0.0312052 | Time: 7679.45 ms [2022-01-06 13:55:20 main:574] : INFO : Epoch 1731 | loss: 0.0311101 | val_loss: 0.0312086 | Time: 8082.37 ms [2022-01-06 13:55:28 main:574] : INFO : Epoch 1732 | loss: 0.0311114 | val_loss: 0.031213 | Time: 7920.86 ms [2022-01-06 13:55:36 main:574] : INFO : Epoch 1733 | loss: 0.0311098 | val_loss: 0.0312058 | Time: 8395.22 ms [2022-01-06 13:55:44 main:574] : INFO : Epoch 1734 | loss: 0.0311082 | val_loss: 0.0312159 | Time: 7581.4 ms [2022-01-06 13:55:51 main:574] : INFO : Epoch 1735 | loss: 0.0311127 | val_loss: 0.031206 | Time: 7103.32 ms [2022-01-06 13:55:59 main:574] : INFO : Epoch 1736 | loss: 0.0311146 | val_loss: 0.0311999 | Time: 8129.88 ms [2022-01-06 13:56:07 main:574] : INFO : Epoch 1737 | loss: 0.031116 | val_loss: 0.0312029 | Time: 7677.49 ms [2022-01-06 13:56:15 main:574] : INFO : Epoch 1738 | loss: 0.0311151 | val_loss: 0.0311957 | Time: 7935.18 ms [2022-01-06 13:56:23 main:574] : INFO : Epoch 1739 | loss: 0.0311189 | val_loss: 0.0312049 | Time: 7777.56 ms [2022-01-06 13:56:31 main:574] : INFO : Epoch 1740 | loss: 0.0311248 | val_loss: 0.0311942 | Time: 8087.57 ms [2022-01-06 13:56:39 main:574] : INFO : Epoch 1741 | loss: 0.0311263 | val_loss: 0.0311968 | Time: 8044.07 ms [2022-01-06 13:56:47 main:574] : INFO : Epoch 1742 | loss: 0.0311206 | val_loss: 0.0312045 | Time: 7819.43 ms [2022-01-06 13:56:55 main:574] : INFO : Epoch 1743 | loss: 0.0311189 | val_loss: 0.0311988 | Time: 8024.79 ms [2022-01-06 13:57:02 main:574] : INFO : Epoch 1744 | loss: 0.0311158 | val_loss: 0.0312 | Time: 7712.93 ms [2022-01-06 13:57:10 main:574] : INFO : Epoch 1745 | loss: 0.031112 | val_loss: 0.0312042 | Time: 7506.66 ms [2022-01-06 13:57:18 main:574] : INFO : Epoch 1746 | loss: 0.0311181 | val_loss: 0.0311888 | Time: 7608.95 ms [2022-01-06 13:57:25 main:574] : INFO : Epoch 1747 | loss: 0.0311182 | val_loss: 0.0312019 | Time: 7767.25 ms [2022-01-06 13:57:33 main:574] : INFO : Epoch 1748 | loss: 0.0311138 | val_loss: 0.0311954 | Time: 8046.49 ms [2022-01-06 13:57:41 main:574] : INFO : Epoch 1749 | loss: 
0.0311145 | val_loss: 0.0312036 | Time: 8048.78 ms [2022-01-06 13:57:49 main:574] : INFO : Epoch 1750 | loss: 0.031112 | val_loss: 0.0312048 | Time: 7571.45 ms [2022-01-06 13:57:57 main:574] : INFO : Epoch 1751 | loss: 0.0311084 | val_loss: 0.0312048 | Time: 8126.99 ms [2022-01-06 13:58:05 main:574] : INFO : Epoch 1752 | loss: 0.0311195 | val_loss: 0.0311998 | Time: 8078.58 ms [2022-01-06 13:58:13 main:574] : INFO : Epoch 1753 | loss: 0.031116 | val_loss: 0.0312088 | Time: 7767.04 ms [2022-01-06 13:58:21 main:574] : INFO : Epoch 1754 | loss: 0.031111 | val_loss: 0.0312105 | Time: 8230.58 ms [2022-01-06 13:58:29 main:574] : INFO : Epoch 1755 | loss: 0.0311088 | val_loss: 0.0311996 | Time: 8063.38 ms [2022-01-06 13:58:38 main:574] : INFO : Epoch 1756 | loss: 0.0311052 | val_loss: 0.0312151 | Time: 8194.55 ms [2022-01-06 13:58:46 main:574] : INFO : Epoch 1757 | loss: 0.0311053 | val_loss: 0.0312219 | Time: 8160.56 ms [2022-01-06 13:58:54 main:574] : INFO : Epoch 1758 | loss: 0.0311016 | val_loss: 0.0311997 | Time: 8166.73 ms [2022-01-06 13:59:02 main:574] : INFO : Epoch 1759 | loss: 0.0311042 | val_loss: 0.0312045 | Time: 7785.54 ms [2022-01-06 13:59:10 main:574] : INFO : Epoch 1760 | loss: 0.0311087 | val_loss: 0.0312091 | Time: 7842.01 ms [2022-01-06 13:59:18 main:574] : INFO : Epoch 1761 | loss: 0.0311077 | val_loss: 0.0312147 | Time: 8090.14 ms [2022-01-06 13:59:25 main:574] : INFO : Epoch 1762 | loss: 0.0311055 | val_loss: 0.0312103 | Time: 7492.87 ms [2022-01-06 13:59:33 main:574] : INFO : Epoch 1763 | loss: 0.0311061 | val_loss: 0.0312027 | Time: 7954.51 ms [2022-01-06 13:59:41 main:574] : INFO : Epoch 1764 | loss: 0.0311028 | val_loss: 0.0312007 | Time: 7891.44 ms [2022-01-06 13:59:49 main:574] : INFO : Epoch 1765 | loss: 0.0311138 | val_loss: 0.0312058 | Time: 7690.55 ms [2022-01-06 13:59:56 main:574] : INFO : Epoch 1766 | loss: 0.0311543 | val_loss: 0.0311871 | Time: 7645.73 ms [2022-01-06 14:00:04 main:574] : INFO : Epoch 1767 | loss: 0.0311612 | val_loss: 0.0311859 | Time: 7743.12 ms [2022-01-06 14:00:12 main:574] : INFO : Epoch 1768 | loss: 0.0311496 | val_loss: 0.0311845 | Time: 8185.2 ms [2022-01-06 14:00:20 main:574] : INFO : Epoch 1769 | loss: 0.0311444 | val_loss: 0.0311873 | Time: 7817.91 ms [2022-01-06 14:00:28 main:574] : INFO : Epoch 1770 | loss: 0.0311408 | val_loss: 0.0311857 | Time: 7839.11 ms [2022-01-06 14:00:36 main:574] : INFO : Epoch 1771 | loss: 0.03114 | val_loss: 0.0311872 | Time: 7600.52 ms [2022-01-06 14:00:44 main:574] : INFO : Epoch 1772 | loss: 0.0311413 | val_loss: 0.0311858 | Time: 7868.78 ms [2022-01-06 14:00:51 main:574] : INFO : Epoch 1773 | loss: 0.0311427 | val_loss: 0.0311835 | Time: 7495.08 ms [2022-01-06 14:00:59 main:574] : INFO : Epoch 1774 | loss: 0.0311515 | val_loss: 0.0311802 | Time: 7816.86 ms [2022-01-06 14:01:07 main:574] : INFO : Epoch 1775 | loss: 0.0311441 | val_loss: 0.0311838 | Time: 7850.24 ms [2022-01-06 14:01:15 main:574] : INFO : Epoch 1776 | loss: 0.031139 | val_loss: 0.0311782 | Time: 8004.98 ms [2022-01-06 14:01:23 main:574] : INFO : Epoch 1777 | loss: 0.0311332 | val_loss: 0.0311801 | Time: 7802.41 ms [2022-01-06 14:01:31 main:574] : INFO : Epoch 1778 | loss: 0.0311297 | val_loss: 0.0311841 | Time: 7954.32 ms [2022-01-06 14:01:39 main:574] : INFO : Epoch 1779 | loss: 0.0311279 | val_loss: 0.0311833 | Time: 8055.13 ms [2022-01-06 14:01:47 main:574] : INFO : Epoch 1780 | loss: 0.0311245 | val_loss: 0.0311906 | Time: 8066.13 ms [2022-01-06 14:01:55 main:574] : INFO : Epoch 1781 | loss: 0.0311226 | val_loss: 0.0311986 | Time: 
8067.98 ms [2022-01-06 14:02:03 main:574] : INFO : Epoch 1782 | loss: 0.0311228 | val_loss: 0.031196 | Time: 8062.66 ms [2022-01-06 14:02:11 main:574] : INFO : Epoch 1783 | loss: 0.0311222 | val_loss: 0.0311992 | Time: 8089.83 ms [2022-01-06 14:02:19 main:574] : INFO : Epoch 1784 | loss: 0.0311174 | val_loss: 0.0311915 | Time: 7826.24 ms [2022-01-06 14:02:27 main:574] : INFO : Epoch 1785 | loss: 0.0311174 | val_loss: 0.0311886 | Time: 8052.17 ms [2022-01-06 14:02:34 main:574] : INFO : Epoch 1786 | loss: 0.0311186 | val_loss: 0.0312005 | Time: 7246.54 ms [2022-01-06 14:02:42 main:574] : INFO : Epoch 1787 | loss: 0.0311174 | val_loss: 0.0312059 | Time: 7840 ms [2022-01-06 14:02:50 main:574] : INFO : Epoch 1788 | loss: 0.031113 | val_loss: 0.0311978 | Time: 7783.45 ms [2022-01-06 14:02:58 main:574] : INFO : Epoch 1789 | loss: 0.0311113 | val_loss: 0.0311994 | Time: 8220.21 ms [2022-01-06 14:03:06 main:574] : INFO : Epoch 1790 | loss: 0.0311159 | val_loss: 0.0311989 | Time: 7994.23 ms [2022-01-06 14:03:14 main:574] : INFO : Epoch 1791 | loss: 0.0311217 | val_loss: 0.0311922 | Time: 7862.69 ms [2022-01-06 14:03:22 main:574] : INFO : Epoch 1792 | loss: 0.0311231 | val_loss: 0.0312 | Time: 7809.76 ms [2022-01-06 14:03:30 main:574] : INFO : Epoch 1793 | loss: 0.0311193 | val_loss: 0.0312054 | Time: 7859.78 ms [2022-01-06 14:03:38 main:574] : INFO : Epoch 1794 | loss: 0.0311244 | val_loss: 0.0311908 | Time: 8088.16 ms [2022-01-06 14:03:46 main:574] : INFO : Epoch 1795 | loss: 0.0311235 | val_loss: 0.0312073 | Time: 7925.27 ms [2022-01-06 14:03:54 main:574] : INFO : Epoch 1796 | loss: 0.031133 | val_loss: 0.0311842 | Time: 8132.03 ms [2022-01-06 14:04:02 main:574] : INFO : Epoch 1797 | loss: 0.0311282 | val_loss: 0.0311881 | Time: 7934.51 ms [2022-01-06 14:04:10 main:574] : INFO : Epoch 1798 | loss: 0.0311214 | val_loss: 0.0311853 | Time: 8151.78 ms [2022-01-06 14:04:18 main:574] : INFO : Epoch 1799 | loss: 0.0311176 | val_loss: 0.0311842 | Time: 7662.03 ms [2022-01-06 14:04:26 main:574] : INFO : Epoch 1800 | loss: 0.0311125 | val_loss: 0.031207 | Time: 7890.99 ms [2022-01-06 14:04:33 main:574] : INFO : Epoch 1801 | loss: 0.0311158 | val_loss: 0.031194 | Time: 7838.58 ms [2022-01-06 14:04:41 main:574] : INFO : Epoch 1802 | loss: 0.0311154 | val_loss: 0.0311952 | Time: 7915.69 ms [2022-01-06 14:04:50 main:574] : INFO : Epoch 1803 | loss: 0.031116 | val_loss: 0.031202 | Time: 8410.77 ms [2022-01-06 14:04:58 main:574] : INFO : Epoch 1804 | loss: 0.0311115 | val_loss: 0.0312063 | Time: 8103.84 ms [2022-01-06 14:05:06 main:574] : INFO : Epoch 1805 | loss: 0.0311152 | val_loss: 0.0312019 | Time: 8055.57 ms [2022-01-06 14:05:14 main:574] : INFO : Epoch 1806 | loss: 0.0311158 | val_loss: 0.0311988 | Time: 7657.35 ms [2022-01-06 14:05:21 main:574] : INFO : Epoch 1807 | loss: 0.0311128 | val_loss: 0.0312023 | Time: 7565.28 ms [2022-01-06 14:05:29 main:574] : INFO : Epoch 1808 | loss: 0.0311132 | val_loss: 0.0311995 | Time: 7749.66 ms [2022-01-06 14:05:37 main:574] : INFO : Epoch 1809 | loss: 0.031109 | val_loss: 0.031204 | Time: 7873.66 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: GeForce MX350) [2022-01-07 10:28:45 main:435] : INFO : Set logging level to 1 [2022-01-07 10:28:45 main:441] : INFO : Running in BOINC Client mode [2022-01-07 10:28:45 main:444] : INFO : Resolving all filenames [2022-01-07 10:28:45 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-07 10:28:45 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) 
[2022-01-07 10:28:45 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-07 10:28:45 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0) [2022-01-07 10:28:45 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-07 10:28:45 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-07 10:28:45 main:474] : INFO : Configuration: [2022-01-07 10:28:45 main:475] : INFO : Model type: GRU [2022-01-07 10:28:45 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-07 10:28:45 main:477] : INFO : Max Epochs: 2048 [2022-01-07 10:28:45 main:478] : INFO : Batch Size: 128 [2022-01-07 10:28:45 main:479] : INFO : Learning Rate: 0.01 [2022-01-07 10:28:45 main:480] : INFO : Patience: 10 [2022-01-07 10:28:45 main:481] : INFO : Hidden Width: 12 [2022-01-07 10:28:45 main:482] : INFO : # Recurrent Layers: 4 [2022-01-07 10:28:45 main:483] : INFO : # Backend Layers: 4 [2022-01-07 10:28:45 main:484] : INFO : # Threads: 1 [2022-01-07 10:28:45 main:486] : INFO : Preparing Dataset [2022-01-07 10:28:45 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-07 10:28:45 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-07 10:28:48 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-01-07 10:28:48 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-07 10:28:48 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-07 10:28:49 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-07 10:28:49 main:494] : INFO : Creating Model [2022-01-07 10:28:49 main:507] : INFO : Preparing config file [2022-01-07 10:28:49 main:511] : INFO : Found checkpoint, attempting to load... 
[2022-01-07 10:28:49 main:512] : INFO : Loading config [2022-01-07 10:28:49 main:514] : INFO : Loading state [2022-01-07 10:28:50 main:559] : INFO : Loading DataLoader into Memory [2022-01-07 10:28:50 main:562] : INFO : Starting Training [2022-01-07 10:28:56 main:574] : INFO : Epoch 1805 | loss: 0.0314679 | val_loss: 0.0312069 | Time: 6045.11 ms [2022-01-07 10:29:01 main:574] : INFO : Epoch 1806 | loss: 0.0311991 | val_loss: 0.0311866 | Time: 4972.97 ms [2022-01-07 10:29:07 main:574] : INFO : Epoch 1807 | loss: 0.0311483 | val_loss: 0.0311916 | Time: 5762.75 ms [2022-01-07 10:29:13 main:574] : INFO : Epoch 1808 | loss: 0.031134 | val_loss: 0.031191 | Time: 6113.09 ms [2022-01-07 10:29:20 main:574] : INFO : Epoch 1809 | loss: 0.0311221 | val_loss: 0.0311901 | Time: 7069.3 ms [2022-01-07 10:29:27 main:574] : INFO : Epoch 1810 | loss: 0.0311197 | val_loss: 0.0311966 | Time: 7000.41 ms [2022-01-07 10:29:35 main:574] : INFO : Epoch 1811 | loss: 0.0311199 | val_loss: 0.0311931 | Time: 7336.29 ms [2022-01-07 10:29:42 main:574] : INFO : Epoch 1812 | loss: 0.031118 | val_loss: 0.0311936 | Time: 6799.02 ms [2022-01-07 10:29:52 main:574] : INFO : Epoch 1813 | loss: 0.0311127 | val_loss: 0.0311976 | Time: 10732.8 ms [2022-01-07 10:30:10 main:574] : INFO : Epoch 1814 | loss: 0.0311134 | val_loss: 0.0311924 | Time: 18063 ms [2022-01-07 10:30:16 main:574] : INFO : Epoch 1815 | loss: 0.0311134 | val_loss: 0.0312005 | Time: 5993.72 ms [2022-01-07 10:30:23 main:574] : INFO : Epoch 1816 | loss: 0.0311086 | val_loss: 0.0312069 | Time: 6707.29 ms [2022-01-07 10:30:30 main:574] : INFO : Epoch 1817 | loss: 0.0311135 | val_loss: 0.031204 | Time: 7239.18 ms [2022-01-07 10:30:38 main:574] : INFO : Epoch 1818 | loss: 0.0311156 | val_loss: 0.0312034 | Time: 7220.72 ms [2022-01-07 10:30:45 main:574] : INFO : Epoch 1819 | loss: 0.0311116 | val_loss: 0.0312069 | Time: 7387.06 ms [2022-01-07 10:30:52 main:574] : INFO : Epoch 1820 | loss: 0.0311113 | val_loss: 0.0312063 | Time: 6843.28 ms [2022-01-07 10:30:59 main:574] : INFO : Epoch 1821 | loss: 0.0311113 | val_loss: 0.031192 | Time: 7018.78 ms [2022-01-07 10:31:06 main:574] : INFO : Epoch 1822 | loss: 0.031115 | val_loss: 0.0312135 | Time: 6920.74 ms [2022-01-07 10:31:14 main:574] : INFO : Epoch 1823 | loss: 0.0311146 | val_loss: 0.0312025 | Time: 7931.31 ms [2022-01-07 10:31:21 main:574] : INFO : Epoch 1824 | loss: 0.0311296 | val_loss: 0.0311975 | Time: 7307.16 ms [2022-01-07 10:31:28 main:574] : INFO : Epoch 1825 | loss: 0.031128 | val_loss: 0.0311856 | Time: 7068.85 ms [2022-01-07 10:31:35 main:574] : INFO : Epoch 1826 | loss: 0.0311226 | val_loss: 0.0312076 | Time: 6944.44 ms [2022-01-07 10:31:42 main:574] : INFO : Epoch 1827 | loss: 0.0311197 | val_loss: 0.0311991 | Time: 6821.04 ms [2022-01-07 10:31:49 main:574] : INFO : Epoch 1828 | loss: 0.0311132 | val_loss: 0.0312091 | Time: 7226.87 ms [2022-01-07 10:31:56 main:574] : INFO : Epoch 1829 | loss: 0.0311141 | val_loss: 0.0311975 | Time: 6753.52 ms [2022-01-07 10:32:03 main:574] : INFO : Epoch 1830 | loss: 0.0311122 | val_loss: 0.0311999 | Time: 6989.74 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: GeForce MX350) [2022-01-08 00:27:26 main:435] : INFO : Set logging level to 1 [2022-01-08 00:27:26 main:441] : INFO : Running in BOINC Client mode [2022-01-08 00:27:26 main:444] : INFO : Resolving all filenames [2022-01-08 00:27:26 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-08 00:27:26 main:452] : INFO : Resolved: model.cfg => model.cfg (exists 
= 1) [2022-01-08 00:27:26 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-08 00:27:26 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0) [2022-01-08 00:27:26 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-08 00:27:26 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-08 00:27:26 main:474] : INFO : Configuration: [2022-01-08 00:27:26 main:475] : INFO : Model type: GRU [2022-01-08 00:27:26 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-08 00:27:26 main:477] : INFO : Max Epochs: 2048 [2022-01-08 00:27:26 main:478] : INFO : Batch Size: 128 [2022-01-08 00:27:26 main:479] : INFO : Learning Rate: 0.01 [2022-01-08 00:27:26 main:480] : INFO : Patience: 10 [2022-01-08 00:27:26 main:481] : INFO : Hidden Width: 12 [2022-01-08 00:27:26 main:482] : INFO : # Recurrent Layers: 4 [2022-01-08 00:27:26 main:483] : INFO : # Backend Layers: 4 [2022-01-08 00:27:26 main:484] : INFO : # Threads: 1 [2022-01-08 00:27:26 main:486] : INFO : Preparing Dataset [2022-01-08 00:27:26 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-08 00:27:28 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-08 00:27:36 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-01-08 00:27:36 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-08 00:27:37 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-08 00:27:37 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-08 00:27:37 main:494] : INFO : Creating Model [2022-01-08 00:27:37 main:507] : INFO : Preparing config file [2022-01-08 00:27:37 main:511] : INFO : Found checkpoint, attempting to load... 
[2022-01-08 00:27:37 main:512] : INFO : Loading config [2022-01-08 00:27:37 main:514] : INFO : Loading state [2022-01-08 00:27:48 main:559] : INFO : Loading DataLoader into Memory [2022-01-08 00:27:48 main:562] : INFO : Starting Training [2022-01-08 00:28:52 main:574] : INFO : Epoch 1831 | loss: 0.031517 | val_loss: 0.0312982 | Time: 64375.7 ms [2022-01-08 00:29:13 main:574] : INFO : Epoch 1832 | loss: 0.0311723 | val_loss: 0.0312182 | Time: 20504.5 ms [2022-01-08 00:29:38 main:574] : INFO : Epoch 1833 | loss: 0.0311284 | val_loss: 0.0311912 | Time: 24881.2 ms [2022-01-08 00:29:59 main:574] : INFO : Epoch 1834 | loss: 0.031116 | val_loss: 0.0311944 | Time: 20739.7 ms [2022-01-08 00:30:18 main:574] : INFO : Epoch 1835 | loss: 0.0311151 | val_loss: 0.031193 | Time: 19599.3 ms [2022-01-08 00:30:38 main:574] : INFO : Epoch 1836 | loss: 0.0311155 | val_loss: 0.0311996 | Time: 19952.8 ms [2022-01-08 00:30:59 main:574] : INFO : Epoch 1837 | loss: 0.0311086 | val_loss: 0.0312058 | Time: 20380.8 ms [2022-01-08 00:31:20 main:574] : INFO : Epoch 1838 | loss: 0.0311062 | val_loss: 0.0312085 | Time: 21314.5 ms [2022-01-08 00:31:40 main:574] : INFO : Epoch 1839 | loss: 0.0311023 | val_loss: 0.0312008 | Time: 20509.8 ms [2022-01-08 00:32:01 main:574] : INFO : Epoch 1840 | loss: 0.0311034 | val_loss: 0.0312081 | Time: 20213.9 ms [2022-01-08 00:32:26 main:574] : INFO : Epoch 1841 | loss: 0.0311045 | val_loss: 0.0312111 | Time: 25687.2 ms [2022-01-08 00:32:46 main:574] : INFO : Epoch 1842 | loss: 0.0311022 | val_loss: 0.0311943 | Time: 19134.4 ms [2022-01-08 00:33:05 main:574] : INFO : Epoch 1843 | loss: 0.0311031 | val_loss: 0.0312124 | Time: 19641.9 ms [2022-01-08 00:33:24 main:574] : INFO : Epoch 1844 | loss: 0.0310976 | val_loss: 0.0312014 | Time: 19125.2 ms [2022-01-08 00:33:44 main:574] : INFO : Epoch 1845 | loss: 0.0310983 | val_loss: 0.0312119 | Time: 19437.2 ms [2022-01-08 00:34:03 main:574] : INFO : Epoch 1846 | loss: 0.031101 | val_loss: 0.031214 | Time: 18737.4 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: GeForce MX350) [2022-01-08 10:37:38 main:435] : INFO : Set logging level to 1 [2022-01-08 10:37:38 main:441] : INFO : Running in BOINC Client mode [2022-01-08 10:37:38 main:444] : INFO : Resolving all filenames [2022-01-08 10:37:38 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-08 10:37:38 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-08 10:37:38 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-08 10:37:38 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0) [2022-01-08 10:37:38 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-08 10:37:38 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-08 10:37:38 main:474] : INFO : Configuration: [2022-01-08 10:37:38 main:475] : INFO : Model type: GRU [2022-01-08 10:37:38 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-08 10:37:38 main:477] : INFO : Max Epochs: 2048 [2022-01-08 10:37:38 main:478] : INFO : Batch Size: 128 [2022-01-08 10:37:38 main:479] : INFO : Learning Rate: 0.01 [2022-01-08 10:37:38 main:480] : INFO : Patience: 10 [2022-01-08 10:37:38 main:481] : INFO : Hidden Width: 12 [2022-01-08 10:37:38 main:482] : INFO : # Recurrent Layers: 4 [2022-01-08 10:37:38 main:483] : INFO : # Backend Layers: 4 [2022-01-08 10:37:38 main:484] : INFO : # Threads: 1 [2022-01-08 10:37:38 main:486] : INFO : Preparing Dataset [2022-01-08 10:37:38 
load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-08 10:37:38 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-08 10:37:41 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-01-08 10:37:41 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-08 10:37:41 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-08 10:37:41 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-08 10:37:41 main:494] : INFO : Creating Model [2022-01-08 10:37:41 main:507] : INFO : Preparing config file [2022-01-08 10:37:41 main:511] : INFO : Found checkpoint, attempting to load... [2022-01-08 10:37:41 main:512] : INFO : Loading config [2022-01-08 10:37:41 main:514] : INFO : Loading state [2022-01-08 10:37:42 main:559] : INFO : Loading DataLoader into Memory [2022-01-08 10:37:42 main:562] : INFO : Starting Training [2022-01-08 10:37:48 main:574] : INFO : Epoch 1838 | loss: 0.03156 | val_loss: 0.0312907 | Time: 5953.33 ms [2022-01-08 10:37:59 main:574] : INFO : Epoch 1839 | loss: 0.0312332 | val_loss: 0.0311889 | Time: 11679.6 ms [2022-01-08 10:38:25 main:574] : INFO : Epoch 1840 | loss: 0.0311733 | val_loss: 0.0311786 | Time: 25727.9 ms [2022-01-08 10:38:54 main:574] : INFO : Epoch 1841 | loss: 0.0311555 | val_loss: 0.0311773 | Time: 28349.2 ms [2022-01-08 10:39:21 main:574] : INFO : Epoch 1842 | loss: 0.0311442 | val_loss: 0.0311829 | Time: 27521.9 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: GeForce MX350) [2022-01-08 10:50:48 main:435] : INFO : Set logging level to 1 [2022-01-08 10:50:48 main:441] : INFO : Running in BOINC Client mode [2022-01-08 10:50:48 main:444] : INFO : Resolving all filenames [2022-01-08 10:50:48 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-08 10:50:48 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-08 10:50:48 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-08 10:50:48 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0) [2022-01-08 10:50:48 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-08 10:50:48 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-08 10:50:48 main:474] : INFO : Configuration: [2022-01-08 10:50:48 main:475] : INFO : Model type: GRU [2022-01-08 10:50:48 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-08 10:50:48 main:477] : INFO : Max Epochs: 2048 [2022-01-08 10:50:48 main:478] : INFO : Batch Size: 128 [2022-01-08 10:50:48 main:479] : INFO : Learning Rate: 0.01 [2022-01-08 10:50:48 main:480] : INFO : Patience: 10 [2022-01-08 10:50:48 main:481] : INFO : Hidden Width: 12 [2022-01-08 10:50:48 main:482] : INFO : # Recurrent Layers: 4 [2022-01-08 10:50:48 main:483] : INFO : # Backend Layers: 4 [2022-01-08 10:50:48 main:484] : INFO : # Threads: 1 [2022-01-08 10:50:48 main:486] : INFO : Preparing Dataset [2022-01-08 10:50:48 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-08 10:50:49 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-08 10:50:51 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. 
[2022-01-08 10:50:51 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-08 10:50:51 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-08 10:50:51 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-08 10:50:51 main:494] : INFO : Creating Model [2022-01-08 10:50:51 main:507] : INFO : Preparing config file [2022-01-08 10:50:51 main:511] : INFO : Found checkpoint, attempting to load... [2022-01-08 10:50:51 main:512] : INFO : Loading config [2022-01-08 10:50:51 main:514] : INFO : Loading state [2022-01-08 10:50:52 main:559] : INFO : Loading DataLoader into Memory [2022-01-08 10:50:53 main:562] : INFO : Starting Training [2022-01-08 10:50:59 main:574] : INFO : Epoch 1838 | loss: 0.0317283 | val_loss: 0.0313178 | Time: 6121.38 ms [2022-01-08 10:51:12 main:574] : INFO : Epoch 1839 | loss: 0.0312366 | val_loss: 0.0311862 | Time: 12998.1 ms [2022-01-08 10:51:41 main:574] : INFO : Epoch 1840 | loss: 0.0311637 | val_loss: 0.0311808 | Time: 28676.3 ms [2022-01-08 10:52:09 main:574] : INFO : Epoch 1841 | loss: 0.0311353 | val_loss: 0.031193 | Time: 28335.2 ms [2022-01-08 10:52:38 main:574] : INFO : Epoch 1842 | loss: 0.0311174 | val_loss: 0.0312029 | Time: 29322.5 ms [2022-01-08 10:53:06 main:574] : INFO : Epoch 1843 | loss: 0.0311122 | val_loss: 0.0311941 | Time: 27447.4 ms [2022-01-08 10:53:34 main:574] : INFO : Epoch 1844 | loss: 0.0311075 | val_loss: 0.0311963 | Time: 27929.5 ms [2022-01-08 10:54:01 main:574] : INFO : Epoch 1845 | loss: 0.0311105 | val_loss: 0.0312009 | Time: 27238.6 ms [2022-01-08 10:54:27 main:574] : INFO : Epoch 1846 | loss: 0.0311109 | val_loss: 0.0312135 | Time: 26431.3 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: GeForce MX350) [2022-01-08 20:06:25 main:435] : INFO : Set logging level to 1 [2022-01-08 20:06:25 main:441] : INFO : Running in BOINC Client mode [2022-01-08 20:06:25 main:444] : INFO : Resolving all filenames [2022-01-08 20:06:25 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-08 20:06:25 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-08 20:06:25 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-08 20:06:25 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0) [2022-01-08 20:06:25 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-08 20:06:25 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-08 20:06:25 main:474] : INFO : Configuration: [2022-01-08 20:06:25 main:475] : INFO : Model type: GRU [2022-01-08 20:06:25 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-08 20:06:25 main:477] : INFO : Max Epochs: 2048 [2022-01-08 20:06:25 main:478] : INFO : Batch Size: 128 [2022-01-08 20:06:25 main:479] : INFO : Learning Rate: 0.01 [2022-01-08 20:06:25 main:480] : INFO : Patience: 10 [2022-01-08 20:06:25 main:481] : INFO : Hidden Width: 12 [2022-01-08 20:06:25 main:482] : INFO : # Recurrent Layers: 4 [2022-01-08 20:06:25 main:483] : INFO : # Backend Layers: 4 [2022-01-08 20:06:25 main:484] : INFO : # Threads: 1 [2022-01-08 20:06:25 main:486] : INFO : Preparing Dataset [2022-01-08 20:06:25 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-08 20:06:26 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-08 20:06:35 load:106] : INFO : Successfully loaded dataset of 
2048 examples into memory. [2022-01-08 20:06:35 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-08 20:06:35 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-08 20:06:36 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-08 20:06:36 main:494] : INFO : Creating Model [2022-01-08 20:06:36 main:507] : INFO : Preparing config file [2022-01-08 20:06:36 main:511] : INFO : Found checkpoint, attempting to load... [2022-01-08 20:06:36 main:512] : INFO : Loading config [2022-01-08 20:06:36 main:514] : INFO : Loading state [2022-01-08 20:06:42 main:559] : INFO : Loading DataLoader into Memory [2022-01-08 20:06:43 main:562] : INFO : Starting Training [2022-01-08 20:07:22 main:574] : INFO : Epoch 1847 | loss: 0.0317232 | val_loss: 0.0312694 | Time: 38960.4 ms [2022-01-08 20:07:47 main:574] : INFO : Epoch 1848 | loss: 0.0312811 | val_loss: 0.031262 | Time: 24977 ms [2022-01-08 20:08:11 main:574] : INFO : Epoch 1849 | loss: 0.0312361 | val_loss: 0.0312275 | Time: 24861.8 ms [2022-01-08 20:08:36 main:574] : INFO : Epoch 1850 | loss: 0.0312122 | val_loss: 0.0312037 | Time: 24970.3 ms [2022-01-08 20:09:02 main:574] : INFO : Epoch 1851 | loss: 0.031174 | val_loss: 0.0311883 | Time: 25708.7 ms [2022-01-08 20:09:28 main:574] : INFO : Epoch 1852 | loss: 0.0311377 | val_loss: 0.0311855 | Time: 25772 ms [2022-01-08 20:09:53 main:574] : INFO : Epoch 1853 | loss: 0.0311266 | val_loss: 0.0311928 | Time: 25109.8 ms [2022-01-08 20:10:19 main:574] : INFO : Epoch 1854 | loss: 0.0311245 | val_loss: 0.0311934 | Time: 25550.9 ms [2022-01-08 20:10:45 main:574] : INFO : Epoch 1855 | loss: 0.031122 | val_loss: 0.0311929 | Time: 25860.1 ms [2022-01-08 20:11:09 main:574] : INFO : Epoch 1856 | loss: 0.0311146 | val_loss: 0.0312026 | Time: 24734.8 ms [2022-01-08 20:11:35 main:574] : INFO : Epoch 1857 | loss: 0.0311151 | val_loss: 0.0311995 | Time: 26043.2 ms [2022-01-08 20:12:01 main:574] : INFO : Epoch 1858 | loss: 0.0311115 | val_loss: 0.0312118 | Time: 25366.6 ms [2022-01-08 20:12:26 main:574] : INFO : Epoch 1859 | loss: 0.031111 | val_loss: 0.0312159 | Time: 24897.3 ms [2022-01-08 20:12:51 main:574] : INFO : Epoch 1860 | loss: 0.0311114 | val_loss: 0.0311957 | Time: 25579.6 ms [2022-01-08 20:13:16 main:574] : INFO : Epoch 1861 | loss: 0.0311103 | val_loss: 0.0312037 | Time: 24808.6 ms [2022-01-08 20:13:42 main:574] : INFO : Epoch 1862 | loss: 0.031111 | val_loss: 0.0312013 | Time: 25337.2 ms [2022-01-08 20:14:07 main:574] : INFO : Epoch 1863 | loss: 0.0311102 | val_loss: 0.031199 | Time: 25966.3 ms [2022-01-08 20:14:33 main:574] : INFO : Epoch 1864 | loss: 0.0311077 | val_loss: 0.0312055 | Time: 25291.7 ms [2022-01-08 20:14:58 main:574] : INFO : Epoch 1865 | loss: 0.0311064 | val_loss: 0.031201 | Time: 25488.9 ms [2022-01-08 20:15:23 main:574] : INFO : Epoch 1866 | loss: 0.0311039 | val_loss: 0.0311935 | Time: 25157.6 ms [2022-01-08 20:15:49 main:574] : INFO : Epoch 1867 | loss: 0.0311086 | val_loss: 0.0311992 | Time: 25692.1 ms [2022-01-08 20:16:14 main:574] : INFO : Epoch 1868 | loss: 0.0311176 | val_loss: 0.0311971 | Time: 25056.7 ms [2022-01-08 20:16:39 main:574] : INFO : Epoch 1869 | loss: 0.0311308 | val_loss: 0.031184 | Time: 25130.4 ms [2022-01-08 20:17:05 main:574] : INFO : Epoch 1870 | loss: 0.0311205 | val_loss: 0.0312019 | Time: 25567.1 ms [2022-01-08 20:17:31 main:574] : INFO : Epoch 1871 | loss: 0.031115 | val_loss: 0.031195 | Time: 25530.7 ms [2022-01-08 20:17:56 main:574] : INFO : 
Epoch 1872 | loss: 0.0311073 | val_loss: 0.0312021 | Time: 25241.3 ms [2022-01-08 20:18:21 main:574] : INFO : Epoch 1873 | loss: 0.0311063 | val_loss: 0.0312029 | Time: 25202.1 ms [2022-01-08 20:18:47 main:574] : INFO : Epoch 1874 | loss: 0.031105 | val_loss: 0.031209 | Time: 25657.7 ms [2022-01-08 20:19:12 main:574] : INFO : Epoch 1875 | loss: 0.0311031 | val_loss: 0.0312145 | Time: 25559.7 ms [2022-01-08 20:19:48 main:574] : INFO : Epoch 1876 | loss: 0.0311014 | val_loss: 0.0312098 | Time: 35861.3 ms [2022-01-08 20:20:14 main:574] : INFO : Epoch 1877 | loss: 0.0311042 | val_loss: 0.0312066 | Time: 25508.4 ms [2022-01-08 20:20:39 main:574] : INFO : Epoch 1878 | loss: 0.0311011 | val_loss: 0.031205 | Time: 25496.9 ms [2022-01-08 20:21:05 main:574] : INFO : Epoch 1879 | loss: 0.0310987 | val_loss: 0.0312234 | Time: 25754.4 ms [2022-01-08 20:21:31 main:574] : INFO : Epoch 1880 | loss: 0.0311027 | val_loss: 0.0312078 | Time: 26116.8 ms [2022-01-08 20:21:58 main:574] : INFO : Epoch 1881 | loss: 0.0311064 | val_loss: 0.0311993 | Time: 26389.8 ms [2022-01-08 20:22:23 main:574] : INFO : Epoch 1882 | loss: 0.0311081 | val_loss: 0.0311955 | Time: 25037.9 ms [2022-01-08 20:22:48 main:574] : INFO : Epoch 1883 | loss: 0.0311079 | val_loss: 0.0312181 | Time: 25307.6 ms [2022-01-08 20:23:14 main:574] : INFO : Epoch 1884 | loss: 0.0311097 | val_loss: 0.0312069 | Time: 25687.3 ms [2022-01-08 20:23:39 main:574] : INFO : Epoch 1885 | loss: 0.0310991 | val_loss: 0.0312187 | Time: 25525.8 ms [2022-01-08 20:24:05 main:574] : INFO : Epoch 1886 | loss: 0.0310964 | val_loss: 0.0312118 | Time: 25272.2 ms [2022-01-08 20:24:29 main:574] : INFO : Epoch 1887 | loss: 0.0311024 | val_loss: 0.0312178 | Time: 24944.2 ms [2022-01-08 20:24:55 main:574] : INFO : Epoch 1888 | loss: 0.0310975 | val_loss: 0.0312059 | Time: 25632.2 ms [2022-01-08 20:25:20 main:574] : INFO : Epoch 1889 | loss: 0.0310981 | val_loss: 0.0312082 | Time: 25317.8 ms [2022-01-08 20:25:46 main:574] : INFO : Epoch 1890 | loss: 0.0310978 | val_loss: 0.0312201 | Time: 25027.7 ms [2022-01-08 20:26:11 main:574] : INFO : Epoch 1891 | loss: 0.0311021 | val_loss: 0.03121 | Time: 25036.8 ms [2022-01-08 20:26:36 main:574] : INFO : Epoch 1892 | loss: 0.0311027 | val_loss: 0.0312129 | Time: 25390.9 ms [2022-01-08 20:27:02 main:574] : INFO : Epoch 1893 | loss: 0.0311172 | val_loss: 0.0311957 | Time: 25746.5 ms [2022-01-08 20:27:26 main:574] : INFO : Epoch 1894 | loss: 0.03111 | val_loss: 0.0312068 | Time: 24241.3 ms [2022-01-08 20:27:51 main:574] : INFO : Epoch 1895 | loss: 0.0311045 | val_loss: 0.0312093 | Time: 25331.8 ms [2022-01-08 20:28:17 main:574] : INFO : Epoch 1896 | loss: 0.0310978 | val_loss: 0.0312023 | Time: 25512 ms [2022-01-08 20:28:43 main:574] : INFO : Epoch 1897 | loss: 0.0311015 | val_loss: 0.0312127 | Time: 26113.8 ms [2022-01-08 20:29:09 main:574] : INFO : Epoch 1898 | loss: 0.0310985 | val_loss: 0.0312099 | Time: 26094.9 ms [2022-01-08 20:29:34 main:574] : INFO : Epoch 1899 | loss: 0.0310988 | val_loss: 0.0312204 | Time: 25180 ms [2022-01-08 20:30:00 main:574] : INFO : Epoch 1900 | loss: 0.0310994 | val_loss: 0.0312101 | Time: 25562.1 ms [2022-01-08 20:30:25 main:574] : INFO : Epoch 1901 | loss: 0.0310994 | val_loss: 0.0312106 | Time: 25425.6 ms [2022-01-08 20:30:50 main:574] : INFO : Epoch 1902 | loss: 0.0311016 | val_loss: 0.0312051 | Time: 24855.9 ms [2022-01-08 20:31:15 main:574] : INFO : Epoch 1903 | loss: 0.031099 | val_loss: 0.0312146 | Time: 24712.9 ms [2022-01-08 20:31:40 main:574] : INFO : Epoch 1904 | loss: 0.0311003 | val_loss: 
0.0312204 | Time: 25032.6 ms [2022-01-08 20:32:05 main:574] : INFO : Epoch 1905 | loss: 0.0311002 | val_loss: 0.0312208 | Time: 24710.7 ms [2022-01-08 20:32:30 main:574] : INFO : Epoch 1906 | loss: 0.0310975 | val_loss: 0.0312123 | Time: 25593.9 ms [2022-01-08 20:32:55 main:574] : INFO : Epoch 1907 | loss: 0.0310963 | val_loss: 0.0312009 | Time: 24735.9 ms [2022-01-08 20:33:21 main:574] : INFO : Epoch 1908 | loss: 0.0310957 | val_loss: 0.0312207 | Time: 25385.2 ms [2022-01-08 20:33:47 main:574] : INFO : Epoch 1909 | loss: 0.0310917 | val_loss: 0.0312043 | Time: 26206.8 ms [2022-01-08 20:34:12 main:574] : INFO : Epoch 1910 | loss: 0.0310956 | val_loss: 0.0312189 | Time: 25454.5 ms [2022-01-08 20:34:38 main:574] : INFO : Epoch 1911 | loss: 0.0310938 | val_loss: 0.0312076 | Time: 25339.3 ms [2022-01-08 20:35:03 main:574] : INFO : Epoch 1912 | loss: 0.031093 | val_loss: 0.031226 | Time: 25680.8 ms [2022-01-08 20:35:29 main:574] : INFO : Epoch 1913 | loss: 0.0310907 | val_loss: 0.0312187 | Time: 25249.7 ms [2022-01-08 20:35:54 main:574] : INFO : Epoch 1914 | loss: 0.0310884 | val_loss: 0.0312132 | Time: 25626.6 ms [2022-01-08 20:36:20 main:574] : INFO : Epoch 1915 | loss: 0.0310877 | val_loss: 0.0312182 | Time: 25484.3 ms [2022-01-08 20:36:45 main:574] : INFO : Epoch 1916 | loss: 0.0310877 | val_loss: 0.0312036 | Time: 24975.9 ms [2022-01-08 20:37:10 main:574] : INFO : Epoch 1917 | loss: 0.0310901 | val_loss: 0.03123 | Time: 24999.9 ms [2022-01-08 20:37:35 main:574] : INFO : Epoch 1918 | loss: 0.0310888 | val_loss: 0.0312343 | Time: 25550.1 ms [2022-01-08 20:38:01 main:574] : INFO : Epoch 1919 | loss: 0.0310939 | val_loss: 0.03121 | Time: 25328.6 ms [2022-01-08 20:38:26 main:574] : INFO : Epoch 1920 | loss: 0.0310924 | val_loss: 0.0312298 | Time: 24976.3 ms [2022-01-08 20:38:51 main:574] : INFO : Epoch 1921 | loss: 0.0310906 | val_loss: 0.0312322 | Time: 25469.2 ms [2022-01-08 20:39:17 main:574] : INFO : Epoch 1922 | loss: 0.0310909 | val_loss: 0.0312351 | Time: 25439.4 ms [2022-01-08 20:39:42 main:574] : INFO : Epoch 1923 | loss: 0.0310863 | val_loss: 0.0312197 | Time: 24999.5 ms [2022-01-08 20:40:07 main:574] : INFO : Epoch 1924 | loss: 0.0310859 | val_loss: 0.0312318 | Time: 25516.7 ms [2022-01-08 20:40:33 main:574] : INFO : Epoch 1925 | loss: 0.0310846 | val_loss: 0.0312146 | Time: 25405.8 ms [2022-01-08 20:40:57 main:574] : INFO : Epoch 1926 | loss: 0.0310921 | val_loss: 0.0312197 | Time: 24650.8 ms [2022-01-08 20:41:23 main:574] : INFO : Epoch 1927 | loss: 0.0310915 | val_loss: 0.031231 | Time: 25763.1 ms [2022-01-08 20:41:49 main:574] : INFO : Epoch 1928 | loss: 0.0311 | val_loss: 0.0312124 | Time: 26072.2 ms [2022-01-08 20:42:15 main:574] : INFO : Epoch 1929 | loss: 0.0310968 | val_loss: 0.031223 | Time: 25508.4 ms [2022-01-08 20:42:40 main:574] : INFO : Epoch 1930 | loss: 0.0310928 | val_loss: 0.0312165 | Time: 25090.4 ms [2022-01-08 20:43:05 main:574] : INFO : Epoch 1931 | loss: 0.0310895 | val_loss: 0.0312255 | Time: 25331.5 ms [2022-01-08 20:43:30 main:574] : INFO : Epoch 1932 | loss: 0.0311185 | val_loss: 0.031204 | Time: 24795.4 ms [2022-01-08 20:44:07 main:574] : INFO : Epoch 1933 | loss: 0.0311165 | val_loss: 0.0312036 | Time: 37141.6 ms [2022-01-08 20:44:33 main:574] : INFO : Epoch 1934 | loss: 0.0311064 | val_loss: 0.0312008 | Time: 25697.6 ms [2022-01-08 20:44:58 main:574] : INFO : Epoch 1935 | loss: 0.0311022 | val_loss: 0.0312047 | Time: 25269.7 ms [2022-01-08 20:45:24 main:574] : INFO : Epoch 1936 | loss: 0.0310964 | val_loss: 0.0312177 | Time: 25362.9 ms [2022-01-08 
20:45:49 main:574] : INFO : Epoch 1937 | loss: 0.0310996 | val_loss: 0.0312297 | Time: 25278 ms [2022-01-08 20:46:14 main:574] : INFO : Epoch 1938 | loss: 0.0310905 | val_loss: 0.0312296 | Time: 24602.3 ms [2022-01-08 20:46:39 main:574] : INFO : Epoch 1939 | loss: 0.0310923 | val_loss: 0.031218 | Time: 25192.3 ms [2022-01-08 20:47:04 main:574] : INFO : Epoch 1940 | loss: 0.0310935 | val_loss: 0.0312229 | Time: 24982.9 ms [2022-01-08 20:47:28 main:574] : INFO : Epoch 1941 | loss: 0.0310942 | val_loss: 0.0312089 | Time: 24318.4 ms [2022-01-08 20:47:53 main:574] : INFO : Epoch 1942 | loss: 0.0310945 | val_loss: 0.0312251 | Time: 24931.5 ms [2022-01-08 20:48:18 main:574] : INFO : Epoch 1943 | loss: 0.0310899 | val_loss: 0.0312135 | Time: 25078.5 ms [2022-01-08 20:48:44 main:574] : INFO : Epoch 1944 | loss: 0.0310893 | val_loss: 0.0312228 | Time: 25458.3 ms [2022-01-08 20:49:09 main:574] : INFO : Epoch 1945 | loss: 0.0310839 | val_loss: 0.0312169 | Time: 25347 ms [2022-01-08 20:49:35 main:574] : INFO : Epoch 1946 | loss: 0.0310901 | val_loss: 0.0312299 | Time: 25791.6 ms [2022-01-08 20:50:15 main:574] : INFO : Epoch 1947 | loss: 0.0310882 | val_loss: 0.0312428 | Time: 40356.8 ms [2022-01-08 20:50:41 main:574] : INFO : Epoch 1948 | loss: 0.0310836 | val_loss: 0.0312203 | Time: 25620.1 ms [2022-01-08 20:51:07 main:574] : INFO : Epoch 1949 | loss: 0.0310885 | val_loss: 0.0312203 | Time: 26321 ms [2022-01-08 20:51:32 main:574] : INFO : Epoch 1950 | loss: 0.0310949 | val_loss: 0.0312138 | Time: 24971.2 ms [2022-01-08 20:51:53 main:574] : INFO : Epoch 1951 | loss: 0.0310951 | val_loss: 0.0312101 | Time: 20496.4 ms [2022-01-08 20:51:59 main:574] : INFO : Epoch 1952 | loss: 0.0310999 | val_loss: 0.0312372 | Time: 6627.01 ms [2022-01-08 20:52:06 main:574] : INFO : Epoch 1953 | loss: 0.0311319 | val_loss: 0.0311893 | Time: 6868.99 ms [2022-01-08 20:52:13 main:574] : INFO : Epoch 1954 | loss: 0.0311366 | val_loss: 0.0311942 | Time: 6854.92 ms [2022-01-08 20:52:20 main:574] : INFO : Epoch 1955 | loss: 0.0311274 | val_loss: 0.0312029 | Time: 7052.23 ms [2022-01-08 20:52:27 main:574] : INFO : Epoch 1956 | loss: 0.0311185 | val_loss: 0.0312058 | Time: 6604.22 ms [2022-01-08 20:52:34 main:574] : INFO : Epoch 1957 | loss: 0.0311195 | val_loss: 0.0312006 | Time: 7115.79 ms [2022-01-08 20:52:41 main:574] : INFO : Epoch 1958 | loss: 0.0311316 | val_loss: 0.0311972 | Time: 7241.15 ms [2022-01-08 20:52:49 main:574] : INFO : Epoch 1959 | loss: 0.0311236 | val_loss: 0.0312019 | Time: 7235 ms [2022-01-08 20:52:56 main:574] : INFO : Epoch 1960 | loss: 0.0311154 | val_loss: 0.0312164 | Time: 7015.5 ms [2022-01-08 20:53:03 main:574] : INFO : Epoch 1961 | loss: 0.0311114 | val_loss: 0.0312165 | Time: 7332.24 ms [2022-01-08 20:53:10 main:574] : INFO : Epoch 1962 | loss: 0.0311097 | val_loss: 0.0312079 | Time: 6840.72 ms [2022-01-08 20:53:17 main:574] : INFO : Epoch 1963 | loss: 0.0311159 | val_loss: 0.0312069 | Time: 7261 ms [2022-01-08 20:53:24 main:574] : INFO : Epoch 1964 | loss: 0.0311081 | val_loss: 0.0312066 | Time: 7185.05 ms [2022-01-08 20:53:31 main:574] : INFO : Epoch 1965 | loss: 0.0311024 | val_loss: 0.0312091 | Time: 7095.83 ms [2022-01-08 20:53:38 main:574] : INFO : Epoch 1966 | loss: 0.0311025 | val_loss: 0.0312262 | Time: 6969.02 ms [2022-01-08 20:53:46 main:574] : INFO : Epoch 1967 | loss: 0.031104 | val_loss: 0.0312179 | Time: 7275.66 ms [2022-01-08 20:53:52 main:574] : INFO : Epoch 1968 | loss: 0.0311039 | val_loss: 0.0312192 | Time: 6840.97 ms [2022-01-08 20:54:00 main:574] : INFO : Epoch 1969 | loss: 
0.031097 | val_loss: 0.0312182 | Time: 7514.47 ms [2022-01-08 20:54:07 main:574] : INFO : Epoch 1970 | loss: 0.0310983 | val_loss: 0.0312193 | Time: 6928.21 ms [2022-01-08 20:54:14 main:574] : INFO : Epoch 1971 | loss: 0.0310988 | val_loss: 0.0312154 | Time: 6945.92 ms [2022-01-08 20:54:21 main:574] : INFO : Epoch 1972 | loss: 0.0310911 | val_loss: 0.0312166 | Time: 7388.01 ms [2022-01-08 20:54:28 main:574] : INFO : Epoch 1973 | loss: 0.0310909 | val_loss: 0.0312223 | Time: 6994.54 ms [2022-01-08 20:54:35 main:574] : INFO : Epoch 1974 | loss: 0.0310919 | val_loss: 0.0312144 | Time: 7043.56 ms [2022-01-08 20:54:42 main:574] : INFO : Epoch 1975 | loss: 0.0310989 | val_loss: 0.031224 | Time: 6878.58 ms [2022-01-08 20:54:50 main:574] : INFO : Epoch 1976 | loss: 0.0310969 | val_loss: 0.0312237 | Time: 7305.59 ms [2022-01-08 20:54:58 main:574] : INFO : Epoch 1977 | loss: 0.0310906 | val_loss: 0.0312107 | Time: 7940.34 ms [2022-01-08 20:55:05 main:574] : INFO : Epoch 1978 | loss: 0.0310944 | val_loss: 0.031215 | Time: 7339.79 ms [2022-01-08 20:55:12 main:574] : INFO : Epoch 1979 | loss: 0.0310868 | val_loss: 0.031227 | Time: 6711.24 ms [2022-01-08 20:55:19 main:574] : INFO : Epoch 1980 | loss: 0.0310841 | val_loss: 0.0312233 | Time: 7195.26 ms [2022-01-08 20:55:26 main:574] : INFO : Epoch 1981 | loss: 0.0310796 | val_loss: 0.0312326 | Time: 7034.87 ms [2022-01-08 20:55:33 main:574] : INFO : Epoch 1982 | loss: 0.0310885 | val_loss: 0.0312142 | Time: 6864.58 ms [2022-01-08 20:55:40 main:574] : INFO : Epoch 1983 | loss: 0.0310888 | val_loss: 0.0312362 | Time: 7314.47 ms [2022-01-08 20:55:47 main:574] : INFO : Epoch 1984 | loss: 0.0310838 | val_loss: 0.0312258 | Time: 7150.59 ms [2022-01-08 20:55:54 main:574] : INFO : Epoch 1985 | loss: 0.0310793 | val_loss: 0.0312309 | Time: 6989.04 ms [2022-01-08 20:56:01 main:574] : INFO : Epoch 1986 | loss: 0.0310858 | val_loss: 0.0312287 | Time: 7013.53 ms [2022-01-08 20:56:08 main:574] : INFO : Epoch 1987 | loss: 0.0310868 | val_loss: 0.0312234 | Time: 6731.61 ms [2022-01-08 20:56:15 main:574] : INFO : Epoch 1988 | loss: 0.0310868 | val_loss: 0.0312205 | Time: 6805.71 ms [2022-01-08 20:56:22 main:574] : INFO : Epoch 1989 | loss: 0.0310813 | val_loss: 0.0312329 | Time: 7229.3 ms [2022-01-08 20:56:29 main:574] : INFO : Epoch 1990 | loss: 0.0310811 | val_loss: 0.0312241 | Time: 7361.21 ms [2022-01-08 20:56:37 main:574] : INFO : Epoch 1991 | loss: 0.0310811 | val_loss: 0.0312297 | Time: 7164.55 ms [2022-01-08 20:56:43 main:574] : INFO : Epoch 1992 | loss: 0.0310824 | val_loss: 0.031235 | Time: 6821.17 ms [2022-01-08 20:56:50 main:574] : INFO : Epoch 1993 | loss: 0.0310846 | val_loss: 0.0312272 | Time: 6721.23 ms [2022-01-08 20:56:57 main:574] : INFO : Epoch 1994 | loss: 0.0311002 | val_loss: 0.0312177 | Time: 7055.87 ms [2022-01-08 20:57:04 main:574] : INFO : Epoch 1995 | loss: 0.0311029 | val_loss: 0.0312364 | Time: 7051.04 ms [2022-01-08 20:57:11 main:574] : INFO : Epoch 1996 | loss: 0.0310943 | val_loss: 0.0312312 | Time: 6804.12 ms [2022-01-08 20:57:18 main:574] : INFO : Epoch 1997 | loss: 0.0311047 | val_loss: 0.0312143 | Time: 6970.48 ms [2022-01-08 20:57:25 main:574] : INFO : Epoch 1998 | loss: 0.0310975 | val_loss: 0.031215 | Time: 7276.86 ms [2022-01-08 20:57:32 main:574] : INFO : Epoch 1999 | loss: 0.0310899 | val_loss: 0.0312217 | Time: 6736.04 ms [2022-01-08 20:57:39 main:574] : INFO : Epoch 2000 | loss: 0.0310836 | val_loss: 0.0312151 | Time: 7024.91 ms [2022-01-08 20:57:46 main:574] : INFO : Epoch 2001 | loss: 0.0310834 | val_loss: 0.0312119 | Time: 
7112.02 ms [2022-01-08 20:57:53 main:574] : INFO : Epoch 2002 | loss: 0.0310811 | val_loss: 0.0312335 | Time: 7049.18 ms [2022-01-08 20:58:01 main:574] : INFO : Epoch 2003 | loss: 0.0310823 | val_loss: 0.0312213 | Time: 7181.35 ms [2022-01-08 20:58:08 main:574] : INFO : Epoch 2004 | loss: 0.0310877 | val_loss: 0.0312322 | Time: 7083.5 ms [2022-01-08 20:58:15 main:574] : INFO : Epoch 2005 | loss: 0.031083 | val_loss: 0.0312202 | Time: 6961.37 ms [2022-01-08 20:58:22 main:574] : INFO : Epoch 2006 | loss: 0.0310888 | val_loss: 0.0312152 | Time: 6993.57 ms [2022-01-08 20:58:29 main:574] : INFO : Epoch 2007 | loss: 0.0310833 | val_loss: 0.0312375 | Time: 7097.47 ms [2022-01-08 20:58:36 main:574] : INFO : Epoch 2008 | loss: 0.0310771 | val_loss: 0.0312173 | Time: 7196.28 ms [2022-01-08 20:58:43 main:574] : INFO : Epoch 2009 | loss: 0.0310804 | val_loss: 0.0312312 | Time: 7099.96 ms [2022-01-08 20:58:51 main:574] : INFO : Epoch 2010 | loss: 0.0310884 | val_loss: 0.0312546 | Time: 7360.41 ms [2022-01-08 20:58:58 main:574] : INFO : Epoch 2011 | loss: 0.0310856 | val_loss: 0.0312377 | Time: 7107.6 ms [2022-01-08 20:59:05 main:574] : INFO : Epoch 2012 | loss: 0.0310846 | val_loss: 0.0312262 | Time: 7213.88 ms [2022-01-08 20:59:12 main:574] : INFO : Epoch 2013 | loss: 0.0310849 | val_loss: 0.0312282 | Time: 7119.5 ms [2022-01-08 20:59:19 main:574] : INFO : Epoch 2014 | loss: 0.031084 | val_loss: 0.0312267 | Time: 7263.05 ms [2022-01-08 20:59:26 main:574] : INFO : Epoch 2015 | loss: 0.0310888 | val_loss: 0.0312253 | Time: 7152.53 ms [2022-01-08 20:59:33 main:574] : INFO : Epoch 2016 | loss: 0.0310839 | val_loss: 0.0312345 | Time: 6985.46 ms [2022-01-08 20:59:40 main:574] : INFO : Epoch 2017 | loss: 0.0310791 | val_loss: 0.0312234 | Time: 6789.62 ms [2022-01-08 20:59:47 main:574] : INFO : Epoch 2018 | loss: 0.0310912 | val_loss: 0.0312097 | Time: 7073.19 ms [2022-01-08 20:59:54 main:574] : INFO : Epoch 2019 | loss: 0.0311282 | val_loss: 0.0311941 | Time: 6903.42 ms [2022-01-08 21:00:01 main:574] : INFO : Epoch 2020 | loss: 0.0311249 | val_loss: 0.0311968 | Time: 6900.23 ms [2022-01-08 21:00:08 main:574] : INFO : Epoch 2021 | loss: 0.0311136 | val_loss: 0.0312086 | Time: 7130.78 ms [2022-01-08 21:00:15 main:574] : INFO : Epoch 2022 | loss: 0.0311015 | val_loss: 0.0312019 | Time: 6983.1 ms [2022-01-08 21:00:22 main:574] : INFO : Epoch 2023 | loss: 0.0310973 | val_loss: 0.0312022 | Time: 6741.95 ms [2022-01-08 21:00:30 main:574] : INFO : Epoch 2024 | loss: 0.0310931 | val_loss: 0.0312141 | Time: 7443.11 ms [2022-01-08 21:00:37 main:574] : INFO : Epoch 2025 | loss: 0.0310985 | val_loss: 0.0311969 | Time: 6999.2 ms [2022-01-08 21:00:43 main:574] : INFO : Epoch 2026 | loss: 0.0311131 | val_loss: 0.0312049 | Time: 6867.28 ms [2022-01-08 21:00:50 main:574] : INFO : Epoch 2027 | loss: 0.0311041 | val_loss: 0.0312136 | Time: 6908.93 ms [2022-01-08 21:00:57 main:574] : INFO : Epoch 2028 | loss: 0.0310973 | val_loss: 0.0312046 | Time: 6931.26 ms [2022-01-08 21:01:04 main:574] : INFO : Epoch 2029 | loss: 0.0310928 | val_loss: 0.0312274 | Time: 6996.67 ms [2022-01-08 21:01:11 main:574] : INFO : Epoch 2030 | loss: 0.0310961 | val_loss: 0.0312165 | Time: 6842.67 ms [2022-01-08 21:01:18 main:574] : INFO : Epoch 2031 | loss: 0.0310931 | val_loss: 0.0312232 | Time: 6792.01 ms [2022-01-08 21:01:25 main:574] : INFO : Epoch 2032 | loss: 0.0310865 | val_loss: 0.0312244 | Time: 6997 ms [2022-01-08 21:01:32 main:574] : INFO : Epoch 2033 | loss: 0.0310907 | val_loss: 0.0312172 | Time: 7048.54 ms [2022-01-08 21:01:39 main:574] : 
INFO : Epoch 2034 | loss: 0.0310902 | val_loss: 0.0312216 | Time: 7198.09 ms [2022-01-08 21:01:46 main:574] : INFO : Epoch 2035 | loss: 0.031087 | val_loss: 0.0312307 | Time: 6911.17 ms [2022-01-08 21:01:53 main:574] : INFO : Epoch 2036 | loss: 0.0311052 | val_loss: 0.0312057 | Time: 7279.51 ms [2022-01-08 21:02:01 main:574] : INFO : Epoch 2037 | loss: 0.0310994 | val_loss: 0.0312182 | Time: 7175.78 ms [2022-01-08 21:02:08 main:574] : INFO : Epoch 2038 | loss: 0.0310907 | val_loss: 0.0312399 | Time: 6894.66 ms [2022-01-08 21:02:15 main:574] : INFO : Epoch 2039 | loss: 0.0310867 | val_loss: 0.031228 | Time: 7119.4 ms [2022-01-08 21:02:22 main:574] : INFO : Epoch 2040 | loss: 0.0310783 | val_loss: 0.0312408 | Time: 6968.09 ms [2022-01-08 21:02:29 main:574] : INFO : Epoch 2041 | loss: 0.0310819 | val_loss: 0.0312432 | Time: 6915.75 ms [2022-01-08 21:02:36 main:574] : INFO : Epoch 2042 | loss: 0.0310795 | val_loss: 0.0312206 | Time: 7006.71 ms [2022-01-08 21:02:42 main:574] : INFO : Epoch 2043 | loss: 0.0310823 | val_loss: 0.0312281 | Time: 6812.93 ms [2022-01-08 21:02:49 main:574] : INFO : Epoch 2044 | loss: 0.0310844 | val_loss: 0.0312056 | Time: 6784.88 ms [2022-01-08 21:02:56 main:574] : INFO : Epoch 2045 | loss: 0.0310968 | val_loss: 0.0312204 | Time: 6675.95 ms [2022-01-08 21:03:03 main:574] : INFO : Epoch 2046 | loss: 0.0310928 | val_loss: 0.0312202 | Time: 6918.7 ms [2022-01-08 21:03:10 main:574] : INFO : Epoch 2047 | loss: 0.0310832 | val_loss: 0.0312244 | Time: 6745.88 ms [2022-01-08 21:03:17 main:574] : INFO : Epoch 2048 | loss: 0.0310788 | val_loss: 0.0312269 | Time: 7252.64 ms [2022-01-08 21:03:17 main:597] : INFO : Saving trained model to model-final.pt, val_loss 0.0312269 [2022-01-08 21:03:17 main:603] : INFO : Saving end state to config to file [2022-01-08 21:03:17 main:608] : INFO : Success, exiting.. 21:03:17 (12320): called boinc_finish(0) </stderr_txt> ]]>
©2022 MLC@Home Team
A project of the Cognition, Robotics, and Learning (CORAL) Lab at the University of Maryland, Baltimore County (UMBC)