| Field | Value |
| --- | --- |
| Name | ParityModified-1639956799-25721-1-0_0 |
| Workunit | 7672361 |
| Created | 2 Jan 2022, 16:25:32 UTC |
| Sent | 16 Jan 2022, 14:12:49 UTC |
| Report deadline | 24 Jan 2022, 14:12:49 UTC |
| Received | 9 Feb 2022, 4:18:49 UTC |
| Server state | Over |
| Outcome | Success |
| Client state | Done |
| Exit status | 0 (0x00000000) |
| Computer ID | 11391 |
| Run time | 3 days 7 hours 7 min 51 sec |
| CPU time | 12 hours 23 min 19 sec |
| Validate state | Task was reported too late to validate |
| Credit | 0.00 |
| Device peak FLOPS | 1,441.83 GFLOPS |
| Application version | Machine Learning Dataset Generator (GPU) v9.75 (cuda10200) windows_x86_64 |
| Peak working set size | 1.54 GB |
| Peak swap size | 3.44 GB |
| Peak disk usage | 1.54 GB |
<core_client_version>7.16.20</core_client_version> <![CDATA[ <stderr_txt> : 4 [2022-01-28 01:46:25 main:484] : INFO : # Threads: 1 [2022-01-28 01:46:25 main:486] : INFO : Preparing Dataset [2022-01-28 01:46:27 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-28 01:46:55 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-28 01:50:32 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-01-28 01:50:32 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-28 01:50:37 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-28 01:50:45 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-28 01:50:46 main:494] : INFO : Creating Model [2022-01-28 01:50:47 main:507] : INFO : Preparing config file [2022-01-28 01:50:47 main:511] : INFO : Found checkpoint, attempting to load... [2022-01-28 01:50:47 main:512] : INFO : Loading config [2022-01-28 01:50:48 main:514] : INFO : Loading state [2022-01-28 01:54:37 main:559] : INFO : Loading DataLoader into Memory [2022-01-28 01:54:48 main:562] : INFO : Starting Training [2022-01-28 02:22:50 main:574] : INFO : Epoch 1712 | loss: 0.0311436 | val_loss: 0.0311777 | Time: 1.67584e+06 ms [2022-01-28 02:35:41 main:574] : INFO : Epoch 1713 | loss: 0.0311258 | val_loss: 0.031169 | Time: 718989 ms [2022-01-28 02:43:20 main:574] : INFO : Epoch 1714 | loss: 0.031123 | val_loss: 0.0311696 | Time: 453192 ms [2022-01-28 02:47:29 main:574] : INFO : Epoch 1715 | loss: 0.0311177 | val_loss: 0.031168 | Time: 233248 ms [2022-01-28 02:50:34 main:574] : INFO : Epoch 1716 | loss: 0.0311153 | val_loss: 0.0311757 | Time: 179562 ms [2022-01-28 02:53:18 main:574] : INFO : Epoch 1717 | loss: 0.0311157 | val_loss: 0.0311715 | Time: 162611 ms [2022-01-28 02:56:41 main:574] : INFO : Epoch 1718 | loss: 0.0311156 | val_loss: 0.0311772 | Time: 195000 ms [2022-01-28 02:59:57 main:574] : INFO : Epoch 1719 | loss: 0.0311131 | val_loss: 0.0311722 | Time: 196191 ms [2022-01-28 03:02:38 main:574] : INFO : Epoch 1720 | loss: 0.0311134 | val_loss: 0.0311765 | Time: 159574 ms [2022-01-28 03:05:29 main:574] : INFO : Epoch 1721 | loss: 0.0311129 | val_loss: 0.0311735 | Time: 170896 ms [2022-01-28 03:08:30 main:574] : INFO : Epoch 1722 | loss: 0.0311183 | val_loss: 0.0311755 | Time: 178253 ms [2022-01-28 03:11:06 main:574] : INFO : Epoch 1723 | loss: 0.0311167 | val_loss: 0.0311704 | Time: 155901 ms [2022-01-28 03:14:15 main:574] : INFO : Epoch 1724 | loss: 0.0311133 | val_loss: 0.031172 | Time: 185679 ms [2022-01-28 03:17:24 main:574] : INFO : Epoch 1725 | loss: 0.0311218 | val_loss: 0.0311696 | Time: 188228 ms [2022-01-28 03:20:20 main:574] : INFO : Epoch 1726 | loss: 0.0311195 | val_loss: 0.0311694 | Time: 170015 ms [2022-01-28 03:22:32 main:574] : INFO : Epoch 1727 | loss: 0.0311188 | val_loss: 0.0311719 | Time: 131612 ms [2022-01-28 03:33:14 main:574] : INFO : Epoch 1728 | loss: 0.0311165 | val_loss: 0.0311744 | Time: 636780 ms [2022-01-28 03:50:45 main:574] : INFO : Epoch 1729 | loss: 0.0311185 | val_loss: 0.03117 | Time: 954817 ms [2022-01-28 04:05:52 main:574] : INFO : Epoch 1730 | loss: 0.0311137 | val_loss: 0.0311791 | Time: 904740 ms [2022-01-28 04:23:36 main:574] : INFO : Epoch 1731 | loss: 0.0311123 | val_loss: 0.0311712 | Time: 1.05964e+06 ms [2022-01-28 04:28:25 main:574] : INFO : Epoch 1732 | loss: 0.0311158 | val_loss: 0.0311749 | Time: 266504 ms 
[2022-01-28 04:30:28 main:574] : INFO : Epoch 1733 | loss: 0.031114 | val_loss: 0.0311742 | Time: 121335 ms [2022-01-28 04:34:04 main:574] : INFO : Epoch 1734 | loss: 0.0311135 | val_loss: 0.0311809 | Time: 214636 ms [2022-01-28 04:36:07 main:574] : INFO : Epoch 1735 | loss: 0.0311096 | val_loss: 0.0311766 | Time: 115457 ms [2022-01-28 04:38:11 main:574] : INFO : Epoch 1736 | loss: 0.0311084 | val_loss: 0.0311781 | Time: 124060 ms [2022-01-28 04:40:12 main:574] : INFO : Epoch 1737 | loss: 0.0311081 | val_loss: 0.0311765 | Time: 118557 ms [2022-01-28 04:42:22 main:574] : INFO : Epoch 1738 | loss: 0.0311104 | val_loss: 0.031177 | Time: 129367 ms [2022-01-28 04:45:00 main:574] : INFO : Epoch 1739 | loss: 0.0311112 | val_loss: 0.0311739 | Time: 154851 ms [2022-01-28 04:47:18 main:574] : INFO : Epoch 1740 | loss: 0.03111 | val_loss: 0.0311814 | Time: 138048 ms [2022-01-28 04:49:22 main:574] : INFO : Epoch 1741 | loss: 0.0311081 | val_loss: 0.0311776 | Time: 123278 ms [2022-01-28 04:51:10 main:574] : INFO : Epoch 1742 | loss: 0.0311097 | val_loss: 0.031174 | Time: 108194 ms [2022-01-28 04:54:24 main:574] : INFO : Epoch 1743 | loss: 0.0311117 | val_loss: 0.0311667 | Time: 191348 ms [2022-01-28 04:56:10 main:574] : INFO : Epoch 1744 | loss: 0.0311112 | val_loss: 0.0311724 | Time: 105885 ms [2022-01-28 04:58:19 main:574] : INFO : Epoch 1745 | loss: 0.031121 | val_loss: 0.0311728 | Time: 122019 ms [2022-01-28 05:00:25 main:574] : INFO : Epoch 1746 | loss: 0.0311215 | val_loss: 0.0311716 | Time: 126029 ms [2022-01-28 05:02:02 main:574] : INFO : Epoch 1747 | loss: 0.031118 | val_loss: 0.0311751 | Time: 92118 ms [2022-01-28 05:04:28 main:574] : INFO : Epoch 1748 | loss: 0.0311186 | val_loss: 0.0311692 | Time: 145985 ms [2022-01-28 05:06:17 main:574] : INFO : Epoch 1749 | loss: 0.0311148 | val_loss: 0.0311727 | Time: 103478 ms [2022-01-28 05:08:07 main:574] : INFO : Epoch 1750 | loss: 0.0311151 | val_loss: 0.0311695 | Time: 108093 ms [2022-01-28 05:10:31 main:574] : INFO : Epoch 1751 | loss: 0.0311167 | val_loss: 0.0311712 | Time: 134200 ms [2022-01-28 05:13:46 main:574] : INFO : Epoch 1752 | loss: 0.0311165 | val_loss: 0.0311697 | Time: 193480 ms [2022-01-28 05:17:05 main:574] : INFO : Epoch 1753 | loss: 0.0311162 | val_loss: 0.0311744 | Time: 189724 ms [2022-01-28 05:19:21 main:574] : INFO : Epoch 1754 | loss: 0.031115 | val_loss: 0.0311705 | Time: 135959 ms [2022-01-28 05:21:12 main:574] : INFO : Epoch 1755 | loss: 0.0311134 | val_loss: 0.0311703 | Time: 100514 ms [2022-01-28 05:23:25 main:574] : INFO : Epoch 1756 | loss: 0.0311136 | val_loss: 0.031173 | Time: 132683 ms [2022-01-28 05:42:29 main:574] : INFO : Epoch 1757 | loss: 0.0311109 | val_loss: 0.0311752 | Time: 1.1198e+06 ms [2022-01-28 06:02:37 main:574] : INFO : Epoch 1758 | loss: 0.031111 | val_loss: 0.0311716 | Time: 1.16272e+06 ms [2022-01-28 06:06:20 main:574] : INFO : Epoch 1759 | loss: 0.0311106 | val_loss: 0.0311763 | Time: 163726 ms [2022-01-28 06:13:55 main:574] : INFO : Epoch 1760 | loss: 0.0311136 | val_loss: 0.0311763 | Time: 454881 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 950M) [2022-01-29 01:47:46 main:435] : INFO : Set logging level to 1 [2022-01-29 01:47:46 main:441] : INFO : Running in BOINC Client mode [2022-01-29 01:47:46 main:444] : INFO : Resolving all filenames [2022-01-29 01:47:46 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-29 01:47:46 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-29 01:47:46 
main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-29 01:47:47 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-01-29 01:47:47 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-29 01:47:47 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-29 01:47:47 main:474] : INFO : Configuration: [2022-01-29 01:47:47 main:475] : INFO : Model type: GRU [2022-01-29 01:47:47 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-29 01:47:47 main:477] : INFO : Max Epochs: 2048 [2022-01-29 01:47:47 main:478] : INFO : Batch Size: 128 [2022-01-29 01:47:47 main:479] : INFO : Learning Rate: 0.01 [2022-01-29 01:47:47 main:480] : INFO : Patience: 10 [2022-01-29 01:47:47 main:481] : INFO : Hidden Width: 12 [2022-01-29 01:47:47 main:482] : INFO : # Recurrent Layers: 4 [2022-01-29 01:47:47 main:483] : INFO : # Backend Layers: 4 [2022-01-29 01:47:47 main:484] : INFO : # Threads: 1 [2022-01-29 01:47:47 main:486] : INFO : Preparing Dataset [2022-01-29 01:47:48 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-29 01:48:09 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-29 01:56:30 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-01-29 01:56:30 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-29 01:56:52 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-29 01:57:07 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-29 01:57:07 main:494] : INFO : Creating Model [2022-01-29 01:57:07 main:507] : INFO : Preparing config file [2022-01-29 01:57:08 main:511] : INFO : Found checkpoint, attempting to load... 
[2022-01-29 01:57:08 main:512] : INFO : Loading config [2022-01-29 01:57:08 main:514] : INFO : Loading state [2022-01-29 02:00:47 main:559] : INFO : Loading DataLoader into Memory [2022-01-29 02:00:48 main:562] : INFO : Starting Training [2022-01-29 02:07:03 main:574] : INFO : Epoch 1761 | loss: 0.0311454 | val_loss: 0.0311775 | Time: 374904 ms [2022-01-29 02:10:08 main:574] : INFO : Epoch 1762 | loss: 0.0311188 | val_loss: 0.0311799 | Time: 171873 ms [2022-01-29 02:13:22 main:574] : INFO : Epoch 1763 | loss: 0.0311124 | val_loss: 0.0311715 | Time: 193894 ms [2022-01-29 02:16:29 main:574] : INFO : Epoch 1764 | loss: 0.0311106 | val_loss: 0.0311701 | Time: 171366 ms [2022-01-29 02:19:21 main:574] : INFO : Epoch 1765 | loss: 0.0311101 | val_loss: 0.031177 | Time: 171349 ms [2022-01-29 02:22:40 main:574] : INFO : Epoch 1766 | loss: 0.0311153 | val_loss: 0.0311707 | Time: 197557 ms [2022-01-29 02:25:51 main:574] : INFO : Epoch 1767 | loss: 0.0311133 | val_loss: 0.0311771 | Time: 178345 ms [2022-01-29 02:28:45 main:574] : INFO : Epoch 1768 | loss: 0.0311131 | val_loss: 0.0311715 | Time: 173954 ms [2022-01-29 02:31:26 main:574] : INFO : Epoch 1769 | loss: 0.0311112 | val_loss: 0.0311767 | Time: 151549 ms [2022-01-29 02:34:13 main:574] : INFO : Epoch 1770 | loss: 0.0311118 | val_loss: 0.0311754 | Time: 166354 ms [2022-01-29 02:37:37 main:574] : INFO : Epoch 1771 | loss: 0.03111 | val_loss: 0.0311718 | Time: 191788 ms [2022-01-29 02:39:59 main:574] : INFO : Epoch 1772 | loss: 0.0311088 | val_loss: 0.0311709 | Time: 140340 ms [2022-01-29 02:43:17 main:574] : INFO : Epoch 1773 | loss: 0.0311059 | val_loss: 0.0311752 | Time: 180144 ms [2022-01-29 02:45:31 main:574] : INFO : Epoch 1774 | loss: 0.0311071 | val_loss: 0.0311759 | Time: 133588 ms [2022-01-29 02:47:25 main:574] : INFO : Epoch 1775 | loss: 0.0311112 | val_loss: 0.031171 | Time: 104443 ms [2022-01-29 02:49:44 main:574] : INFO : Epoch 1776 | loss: 0.0311103 | val_loss: 0.03118 | Time: 138716 ms [2022-01-29 02:52:13 main:574] : INFO : Epoch 1777 | loss: 0.0311066 | val_loss: 0.0311714 | Time: 137799 ms [2022-01-29 02:54:37 main:574] : INFO : Epoch 1778 | loss: 0.0311096 | val_loss: 0.0311774 | Time: 142907 ms [2022-01-29 02:56:56 main:574] : INFO : Epoch 1779 | loss: 0.031113 | val_loss: 0.031172 | Time: 130745 ms [2022-01-29 02:59:52 main:574] : INFO : Epoch 1780 | loss: 0.0311189 | val_loss: 0.0311737 | Time: 175544 ms [2022-01-29 03:02:14 main:574] : INFO : Epoch 1781 | loss: 0.0311182 | val_loss: 0.0311723 | Time: 132974 ms [2022-01-29 03:04:42 main:574] : INFO : Epoch 1782 | loss: 0.031118 | val_loss: 0.0311678 | Time: 148172 ms [2022-01-29 03:07:38 main:574] : INFO : Epoch 1783 | loss: 0.031117 | val_loss: 0.0311695 | Time: 164520 ms [2022-01-29 03:10:03 main:574] : INFO : Epoch 1784 | loss: 0.0311158 | val_loss: 0.0311693 | Time: 143939 ms [2022-01-29 03:13:50 main:574] : INFO : Epoch 1785 | loss: 0.0311138 | val_loss: 0.0311703 | Time: 224582 ms [2022-01-29 03:17:42 main:574] : INFO : Epoch 1786 | loss: 0.0311103 | val_loss: 0.0311759 | Time: 205269 ms [2022-01-29 03:20:38 main:574] : INFO : Epoch 1787 | loss: 0.0311132 | val_loss: 0.0311682 | Time: 171208 ms [2022-01-29 03:23:09 main:574] : INFO : Epoch 1788 | loss: 0.0311149 | val_loss: 0.0311708 | Time: 150348 ms [2022-01-29 03:26:08 main:574] : INFO : Epoch 1789 | loss: 0.0311145 | val_loss: 0.0311734 | Time: 166974 ms [2022-01-29 03:29:24 main:574] : INFO : Epoch 1790 | loss: 0.0311154 | val_loss: 0.0311643 | Time: 195875 ms [2022-01-29 03:33:30 main:574] : INFO : Epoch 1791 | 
loss: 0.0311162 | val_loss: 0.03117 | Time: 218071 ms [2022-01-29 03:36:16 main:574] : INFO : Epoch 1792 | loss: 0.0311157 | val_loss: 0.0311742 | Time: 162043 ms [2022-01-29 03:39:23 main:574] : INFO : Epoch 1793 | loss: 0.0311129 | val_loss: 0.0311708 | Time: 186552 ms [2022-01-29 03:41:52 main:574] : INFO : Epoch 1794 | loss: 0.0311127 | val_loss: 0.031167 | Time: 140165 ms [2022-01-29 03:44:38 main:574] : INFO : Epoch 1795 | loss: 0.0311138 | val_loss: 0.0311786 | Time: 163723 ms [2022-01-29 03:48:43 main:574] : INFO : Epoch 1796 | loss: 0.0311115 | val_loss: 0.031177 | Time: 237536 ms [2022-01-29 03:52:04 main:574] : INFO : Epoch 1797 | loss: 0.0311141 | val_loss: 0.0311713 | Time: 194620 ms [2022-01-29 03:54:54 main:574] : INFO : Epoch 1798 | loss: 0.0311128 | val_loss: 0.0311715 | Time: 168796 ms [2022-01-29 03:57:42 main:574] : INFO : Epoch 1799 | loss: 0.0311129 | val_loss: 0.0311733 | Time: 156524 ms [2022-01-29 04:00:36 main:574] : INFO : Epoch 1800 | loss: 0.0311143 | val_loss: 0.0311663 | Time: 173926 ms [2022-01-29 04:02:52 main:574] : INFO : Epoch 1801 | loss: 0.031118 | val_loss: 0.0311756 | Time: 127693 ms [2022-01-29 04:05:38 main:574] : INFO : Epoch 1802 | loss: 0.0311288 | val_loss: 0.031166 | Time: 166231 ms [2022-01-29 04:07:48 main:574] : INFO : Epoch 1803 | loss: 0.0311233 | val_loss: 0.0311729 | Time: 122550 ms [2022-01-29 04:09:24 main:574] : INFO : Epoch 1804 | loss: 0.0311189 | val_loss: 0.0311777 | Time: 95750.4 ms [2022-01-29 04:12:26 main:574] : INFO : Epoch 1805 | loss: 0.0311339 | val_loss: 0.0311721 | Time: 165389 ms [2022-01-29 04:14:52 main:574] : INFO : Epoch 1806 | loss: 0.0311514 | val_loss: 0.0311628 | Time: 145912 ms [2022-01-29 04:18:04 main:574] : INFO : Epoch 1807 | loss: 0.031147 | val_loss: 0.031166 | Time: 169660 ms [2022-01-29 04:20:24 main:574] : INFO : Epoch 1808 | loss: 0.0311389 | val_loss: 0.0311674 | Time: 139590 ms [2022-01-29 04:22:53 main:574] : INFO : Epoch 1809 | loss: 0.0311339 | val_loss: 0.0311608 | Time: 138488 ms [2022-01-29 04:29:03 main:574] : INFO : Epoch 1810 | loss: 0.0311304 | val_loss: 0.0311639 | Time: 368807 ms [2022-01-29 04:31:37 main:574] : INFO : Epoch 1811 | loss: 0.0311281 | val_loss: 0.0311646 | Time: 143228 ms [2022-01-29 04:33:47 main:574] : INFO : Epoch 1812 | loss: 0.0311289 | val_loss: 0.031166 | Time: 129234 ms [2022-01-29 04:40:28 main:574] : INFO : Epoch 1813 | loss: 0.0311253 | val_loss: 0.0311679 | Time: 388937 ms [2022-01-29 04:47:05 main:574] : INFO : Epoch 1814 | loss: 0.0311235 | val_loss: 0.0311687 | Time: 388190 ms [2022-01-29 04:58:11 main:574] : INFO : Epoch 1815 | loss: 0.0311219 | val_loss: 0.0311744 | Time: 664577 ms [2022-01-29 05:05:50 main:574] : INFO : Epoch 1816 | loss: 0.0311227 | val_loss: 0.0311721 | Time: 430560 ms [2022-01-29 05:17:06 main:574] : INFO : Epoch 1817 | loss: 0.0311259 | val_loss: 0.0311659 | Time: 670667 ms [2022-01-29 05:25:34 main:574] : INFO : Epoch 1818 | loss: 0.031124 | val_loss: 0.03117 | Time: 468270 ms [2022-01-29 05:31:04 main:574] : INFO : Epoch 1819 | loss: 0.031121 | val_loss: 0.0311736 | Time: 297971 ms [2022-01-29 05:35:33 main:574] : INFO : Epoch 1820 | loss: 0.031121 | val_loss: 0.0311686 | Time: 255879 ms [2022-01-29 05:39:01 main:574] : INFO : Epoch 1821 | loss: 0.0311202 | val_loss: 0.0311718 | Time: 196546 ms [2022-01-29 05:41:41 main:574] : INFO : Epoch 1822 | loss: 0.0311182 | val_loss: 0.0311716 | Time: 159415 ms [2022-01-29 05:44:19 main:574] : INFO : Epoch 1823 | loss: 0.0311165 | val_loss: 0.0311696 | Time: 147563 ms [2022-01-29 05:47:47 
main:574] : INFO : Epoch 1824 | loss: 0.031123 | val_loss: 0.0311711 | Time: 208560 ms [2022-01-29 05:55:23 main:574] : INFO : Epoch 1825 | loss: 0.0311174 | val_loss: 0.0311737 | Time: 444359 ms [2022-01-29 06:02:52 main:574] : INFO : Epoch 1826 | loss: 0.0311168 | val_loss: 0.0311774 | Time: 420159 ms [2022-01-29 06:08:51 main:574] : INFO : Epoch 1827 | loss: 0.0311161 | val_loss: 0.0311715 | Time: 350618 ms [2022-01-29 06:13:34 main:574] : INFO : Epoch 1828 | loss: 0.0311154 | val_loss: 0.0311729 | Time: 275684 ms [2022-01-29 06:17:57 main:574] : INFO : Epoch 1829 | loss: 0.0311132 | val_loss: 0.031174 | Time: 239242 ms [2022-01-29 06:21:35 main:574] : INFO : Epoch 1830 | loss: 0.0311135 | val_loss: 0.0311692 | Time: 214211 ms [2022-01-29 06:25:33 main:574] : INFO : Epoch 1831 | loss: 0.0311126 | val_loss: 0.031176 | Time: 224216 ms [2022-01-29 06:31:12 main:574] : INFO : Epoch 1832 | loss: 0.0311101 | val_loss: 0.0311783 | Time: 328053 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 950M) [2022-02-08 01:16:46 main:435] : INFO : Set logging level to 1 [2022-02-08 01:16:50 main:441] : INFO : Running in BOINC Client mode [2022-02-08 01:16:51 main:444] : INFO : Resolving all filenames [2022-02-08 01:16:52 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-02-08 01:16:53 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-02-08 01:16:54 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-02-08 01:16:55 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-02-08 01:16:59 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-02-08 01:16:59 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-02-08 01:17:00 main:474] : INFO : Configuration: [2022-02-08 01:17:01 main:475] : INFO : Model type: GRU [2022-02-08 01:17:02 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-02-08 01:17:02 main:477] : INFO : Max Epochs: 2048 [2022-02-08 01:17:03 main:478] : INFO : Batch Size: 128 [2022-02-08 01:17:03 main:479] : INFO : Learning Rate: 0.01 [2022-02-08 01:17:03 main:480] : INFO : Patience: 10 [2022-02-08 01:17:03 main:481] : INFO : Hidden Width: 12 [2022-02-08 01:17:04 main:482] : INFO : # Recurrent Layers: 4 [2022-02-08 01:17:04 main:483] : INFO : # Backend Layers: 4 [2022-02-08 01:17:04 main:484] : INFO : # Threads: 1 [2022-02-08 01:17:04 main:486] : INFO : Preparing Dataset [2022-02-08 01:17:04 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-02-08 01:17:22 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-02-08 01:18:34 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-02-08 01:18:35 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-02-08 01:18:37 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-02-08 01:18:40 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-02-08 01:18:41 main:494] : INFO : Creating Model [2022-02-08 01:18:41 main:507] : INFO : Preparing config file [2022-02-08 01:18:42 main:511] : INFO : Found checkpoint, attempting to load... 
[2022-02-08 01:18:42 main:512] : INFO : Loading config [2022-02-08 01:18:42 main:514] : INFO : Loading state [2022-02-08 01:19:48 main:559] : INFO : Loading DataLoader into Memory [2022-02-08 01:19:49 main:562] : INFO : Starting Training [2022-02-08 01:20:41 main:574] : INFO : Epoch 1833 | loss: 0.0311462 | val_loss: 0.0311794 | Time: 52258.8 ms [2022-02-08 01:21:49 main:574] : INFO : Epoch 1834 | loss: 0.0311184 | val_loss: 0.0311806 | Time: 66449.9 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 950M) [2022-02-08 01:28:41 main:435] : INFO : Set logging level to 1 [2022-02-08 01:28:41 main:441] : INFO : Running in BOINC Client mode [2022-02-08 01:28:41 main:444] : INFO : Resolving all filenames [2022-02-08 01:28:41 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-02-08 01:28:41 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-02-08 01:28:41 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-02-08 01:28:41 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-02-08 01:28:41 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-02-08 01:28:42 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-02-08 01:28:43 main:474] : INFO : Configuration: [2022-02-08 01:28:44 main:475] : INFO : Model type: GRU [2022-02-08 01:28:44 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-02-08 01:28:44 main:477] : INFO : Max Epochs: 2048 [2022-02-08 01:28:47 main:478] : INFO : Batch Size: 128 [2022-02-08 01:28:58 main:479] : INFO : Learning Rate: 0.01 [2022-02-08 01:28:58 main:480] : INFO : Patience: 10 [2022-02-08 01:28:58 main:481] : INFO : Hidden Width: 12 [2022-02-08 01:28:58 main:482] : INFO : # Recurrent Layers: 4 [2022-02-08 01:28:58 main:483] : INFO : # Backend Layers: 4 [2022-02-08 01:28:58 main:484] : INFO : # Threads: 1 [2022-02-08 01:28:59 main:486] : INFO : Preparing Dataset [2022-02-08 01:28:59 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-02-08 01:29:54 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 950M) [2022-02-08 01:51:10 main:435] : INFO : Set logging level to 1 [2022-02-08 01:51:15 main:441] : INFO : Running in BOINC Client mode [2022-02-08 01:51:15 main:444] : INFO : Resolving all filenames [2022-02-08 01:51:15 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-02-08 01:51:15 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-02-08 01:51:19 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-02-08 01:51:23 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-02-08 01:51:28 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-02-08 01:51:32 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-02-08 01:51:39 main:474] : INFO : Configuration: [2022-02-08 01:51:39 main:475] : INFO : Model type: GRU [2022-02-08 01:51:39 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-02-08 01:51:40 main:477] : INFO : Max Epochs: 2048 [2022-02-08 01:51:45 main:478] : INFO : Batch Size: 128 [2022-02-08 01:51:55 main:479] : INFO : Learning Rate: 0.01 [2022-02-08 01:51:55 main:480] : INFO : Patience: 10 [2022-02-08 01:51:56 main:481] : INFO : Hidden Width: 12 [2022-02-08 01:51:56 main:482] : INFO : 
# Recurrent Layers: 4 [2022-02-08 01:51:56 main:483] : INFO : # Backend Layers: 4 [2022-02-08 01:51:57 main:484] : INFO : # Threads: 1 [2022-02-08 01:51:57 main:486] : INFO : Preparing Dataset [2022-02-08 01:51:57 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-02-08 01:52:31 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-02-08 02:00:11 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-02-08 02:00:11 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-02-08 02:00:21 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-02-08 02:00:53 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-02-08 02:00:53 main:494] : INFO : Creating Model [2022-02-08 02:00:53 main:507] : INFO : Preparing config file [2022-02-08 02:00:54 main:511] : INFO : Found checkpoint, attempting to load... [2022-02-08 02:00:54 main:512] : INFO : Loading config [2022-02-08 02:00:54 main:514] : INFO : Loading state [2022-02-08 02:04:47 main:559] : INFO : Loading DataLoader into Memory [2022-02-08 02:04:59 main:562] : INFO : Starting Training [2022-02-08 02:06:53 main:574] : INFO : Epoch 1834 | loss: 0.0311562 | val_loss: 0.0311838 | Time: 108530 ms [2022-02-08 02:08:51 main:574] : INFO : Epoch 1835 | loss: 0.0311251 | val_loss: 0.0311796 | Time: 114281 ms [2022-02-08 02:10:14 main:574] : INFO : Epoch 1836 | loss: 0.0311181 | val_loss: 0.0311765 | Time: 83264.4 ms [2022-02-08 02:11:45 main:574] : INFO : Epoch 1837 | loss: 0.0311148 | val_loss: 0.0311735 | Time: 89701 ms [2022-02-08 02:13:10 main:574] : INFO : Epoch 1838 | loss: 0.0311152 | val_loss: 0.0311737 | Time: 84830.7 ms [2022-02-08 02:14:44 main:574] : INFO : Epoch 1839 | loss: 0.0311144 | val_loss: 0.0311766 | Time: 93967.2 ms [2022-02-08 02:16:30 main:574] : INFO : Epoch 1840 | loss: 0.0311094 | val_loss: 0.0311723 | Time: 105599 ms [2022-02-08 02:18:05 main:574] : INFO : Epoch 1841 | loss: 0.0311096 | val_loss: 0.0311769 | Time: 93280.7 ms [2022-02-08 02:19:41 main:574] : INFO : Epoch 1842 | loss: 0.0311105 | val_loss: 0.0311798 | Time: 96113.3 ms [2022-02-08 02:21:10 main:574] : INFO : Epoch 1843 | loss: 0.0311108 | val_loss: 0.0311768 | Time: 88768.6 ms [2022-02-08 02:23:21 main:574] : INFO : Epoch 1844 | loss: 0.0311114 | val_loss: 0.0311761 | Time: 115226 ms [2022-02-08 02:25:28 main:574] : INFO : Epoch 1845 | loss: 0.0311139 | val_loss: 0.0311791 | Time: 127227 ms [2022-02-08 02:27:27 main:574] : INFO : Epoch 1846 | loss: 0.0311189 | val_loss: 0.031174 | Time: 118513 ms [2022-02-08 02:29:02 main:574] : INFO : Epoch 1847 | loss: 0.031116 | val_loss: 0.0311769 | Time: 95625.9 ms [2022-02-08 02:30:39 main:574] : INFO : Epoch 1848 | loss: 0.0311146 | val_loss: 0.0311727 | Time: 93734.8 ms [2022-02-08 02:32:38 main:574] : INFO : Epoch 1849 | loss: 0.0311118 | val_loss: 0.0311772 | Time: 118653 ms [2022-02-08 02:34:22 main:574] : INFO : Epoch 1850 | loss: 0.0311108 | val_loss: 0.0311744 | Time: 103330 ms [2022-02-08 02:36:13 main:574] : INFO : Epoch 1851 | loss: 0.0311106 | val_loss: 0.0311767 | Time: 110408 ms [2022-02-08 02:38:15 main:574] : INFO : Epoch 1852 | loss: 0.031109 | val_loss: 0.0311757 | Time: 117288 ms [2022-02-08 02:39:52 main:574] : INFO : Epoch 1853 | loss: 0.0311085 | val_loss: 0.031176 | Time: 97166.5 ms [2022-02-08 02:41:50 main:574] : INFO : Epoch 1854 | loss: 0.0311091 | val_loss: 0.0311929 | Time: 109253 ms 
[2022-02-08 02:43:12 main:574] : INFO : Epoch 1855 | loss: 0.031119 | val_loss: 0.0311747 | Time: 81586.8 ms [2022-02-08 02:44:58 main:574] : INFO : Epoch 1856 | loss: 0.0311218 | val_loss: 0.0311753 | Time: 105357 ms [2022-02-08 02:47:03 main:574] : INFO : Epoch 1857 | loss: 0.031117 | val_loss: 0.0311769 | Time: 124194 ms [2022-02-08 02:49:05 main:574] : INFO : Epoch 1858 | loss: 0.0311129 | val_loss: 0.0311819 | Time: 122147 ms [2022-02-08 02:50:49 main:574] : INFO : Epoch 1859 | loss: 0.031109 | val_loss: 0.0311736 | Time: 103825 ms [2022-02-08 02:52:35 main:574] : INFO : Epoch 1860 | loss: 0.0311083 | val_loss: 0.0311814 | Time: 105443 ms [2022-02-08 02:54:27 main:574] : INFO : Epoch 1861 | loss: 0.0311122 | val_loss: 0.0311772 | Time: 106696 ms [2022-02-08 02:56:07 main:574] : INFO : Epoch 1862 | loss: 0.0311096 | val_loss: 0.0311772 | Time: 98264.5 ms [2022-02-08 02:56:48 main:574] : INFO : Epoch 1863 | loss: 0.0311107 | val_loss: 0.031173 | Time: 41605.7 ms [2022-02-08 02:58:32 main:574] : INFO : Epoch 1864 | loss: 0.0311112 | val_loss: 0.0311866 | Time: 103734 ms [2022-02-08 03:00:06 main:574] : INFO : Epoch 1865 | loss: 0.0311113 | val_loss: 0.031172 | Time: 93164.2 ms [2022-02-08 03:01:28 main:574] : INFO : Epoch 1866 | loss: 0.031109 | val_loss: 0.0311802 | Time: 79357.9 ms [2022-02-08 03:02:56 main:574] : INFO : Epoch 1867 | loss: 0.0311106 | val_loss: 0.0311764 | Time: 87107.4 ms [2022-02-08 03:04:31 main:574] : INFO : Epoch 1868 | loss: 0.0311091 | val_loss: 0.0311858 | Time: 94814.2 ms [2022-02-08 03:05:59 main:574] : INFO : Epoch 1869 | loss: 0.0311088 | val_loss: 0.0311835 | Time: 87701.6 ms [2022-02-08 03:09:35 main:574] : INFO : Epoch 1870 | loss: 0.0311085 | val_loss: 0.0311774 | Time: 215386 ms [2022-02-08 03:13:34 main:574] : INFO : Epoch 1871 | loss: 0.0311091 | val_loss: 0.0311793 | Time: 225870 ms [2022-02-08 03:16:56 main:574] : INFO : Epoch 1872 | loss: 0.0311109 | val_loss: 0.0311739 | Time: 195226 ms [2022-02-08 03:20:53 main:574] : INFO : Epoch 1873 | loss: 0.0311077 | val_loss: 0.031174 | Time: 236450 ms [2022-02-08 03:23:48 main:574] : INFO : Epoch 1874 | loss: 0.0311087 | val_loss: 0.0311819 | Time: 175254 ms [2022-02-08 03:26:42 main:574] : INFO : Epoch 1875 | loss: 0.0311063 | val_loss: 0.0311746 | Time: 171716 ms [2022-02-08 03:28:56 main:574] : INFO : Epoch 1876 | loss: 0.0311069 | val_loss: 0.0311836 | Time: 123698 ms [2022-02-08 03:31:13 main:574] : INFO : Epoch 1877 | loss: 0.0311066 | val_loss: 0.0311812 | Time: 137274 ms [2022-02-08 03:34:01 main:574] : INFO : Epoch 1878 | loss: 0.0311065 | val_loss: 0.0311892 | Time: 148479 ms [2022-02-08 03:38:23 main:574] : INFO : Epoch 1879 | loss: 0.0311069 | val_loss: 0.0311781 | Time: 260462 ms [2022-02-08 03:41:01 main:574] : INFO : Epoch 1880 | loss: 0.0311045 | val_loss: 0.0311806 | Time: 151333 ms [2022-02-08 03:43:26 main:574] : INFO : Epoch 1881 | loss: 0.0311039 | val_loss: 0.0311787 | Time: 144426 ms [2022-02-08 03:46:04 main:574] : INFO : Epoch 1882 | loss: 0.0311047 | val_loss: 0.0311817 | Time: 149939 ms [2022-02-08 03:49:25 main:574] : INFO : Epoch 1883 | loss: 0.0311073 | val_loss: 0.0311761 | Time: 201490 ms [2022-02-08 03:53:09 main:574] : INFO : Epoch 1884 | loss: 0.0311236 | val_loss: 0.0311644 | Time: 209703 ms [2022-02-08 03:56:47 main:574] : INFO : Epoch 1885 | loss: 0.0311208 | val_loss: 0.0311668 | Time: 189709 ms [2022-02-08 04:03:15 main:574] : INFO : Epoch 1886 | loss: 0.0311162 | val_loss: 0.0311771 | Time: 382039 ms [2022-02-08 04:06:41 main:574] : INFO : Epoch 1887 | loss: 
0.0311138 | val_loss: 0.0311714 | Time: 188093 ms [2022-02-08 04:08:16 main:574] : INFO : Epoch 1888 | loss: 0.0311132 | val_loss: 0.0311688 | Time: 94432 ms [2022-02-08 04:10:03 main:574] : INFO : Epoch 1889 | loss: 0.0311146 | val_loss: 0.0311748 | Time: 99210.6 ms [2022-02-08 04:11:29 main:574] : INFO : Epoch 1890 | loss: 0.0311131 | val_loss: 0.0311767 | Time: 85638.3 ms [2022-02-08 04:12:49 main:574] : INFO : Epoch 1891 | loss: 0.0311112 | val_loss: 0.0311724 | Time: 79058.6 ms [2022-02-08 04:14:31 main:574] : INFO : Epoch 1892 | loss: 0.0311112 | val_loss: 0.0311741 | Time: 96903.4 ms [2022-02-08 04:15:59 main:574] : INFO : Epoch 1893 | loss: 0.0311107 | val_loss: 0.0311756 | Time: 88311.2 ms [2022-02-08 04:17:30 main:574] : INFO : Epoch 1894 | loss: 0.0311111 | val_loss: 0.0311733 | Time: 90975.3 ms [2022-02-08 04:18:56 main:574] : INFO : Epoch 1895 | loss: 0.031114 | val_loss: 0.0311765 | Time: 82083.1 ms [2022-02-08 04:20:19 main:574] : INFO : Epoch 1896 | loss: 0.0311154 | val_loss: 0.0311764 | Time: 82682.9 ms [2022-02-08 04:21:47 main:574] : INFO : Epoch 1897 | loss: 0.0311152 | val_loss: 0.0311741 | Time: 87426.5 ms [2022-02-08 04:23:06 main:574] : INFO : Epoch 1898 | loss: 0.0311162 | val_loss: 0.0311644 | Time: 73233.1 ms [2022-02-08 04:25:18 main:574] : INFO : Epoch 1899 | loss: 0.0311145 | val_loss: 0.0311748 | Time: 131097 ms [2022-02-08 04:27:04 main:574] : INFO : Epoch 1900 | loss: 0.0311119 | val_loss: 0.0311802 | Time: 99231.7 ms [2022-02-08 04:28:37 main:574] : INFO : Epoch 1901 | loss: 0.031115 | val_loss: 0.031176 | Time: 91809.4 ms [2022-02-08 04:30:09 main:574] : INFO : Epoch 1902 | loss: 0.0311283 | val_loss: 0.0311674 | Time: 90424.5 ms [2022-02-08 04:31:23 main:574] : INFO : Epoch 1903 | loss: 0.0311246 | val_loss: 0.0311749 | Time: 70920.3 ms [2022-02-08 04:32:35 main:574] : INFO : Epoch 1904 | loss: 0.0311335 | val_loss: 0.0311609 | Time: 72127.8 ms [2022-02-08 04:33:57 main:574] : INFO : Epoch 1905 | loss: 0.0311398 | val_loss: 0.0311629 | Time: 81278.7 ms [2022-02-08 04:35:37 main:574] : INFO : Epoch 1906 | loss: 0.0311373 | val_loss: 0.0311653 | Time: 99896.3 ms [2022-02-08 04:36:40 main:574] : INFO : Epoch 1907 | loss: 0.0311379 | val_loss: 0.0311637 | Time: 62487.6 ms [2022-02-08 04:38:04 main:574] : INFO : Epoch 1908 | loss: 0.031136 | val_loss: 0.0311622 | Time: 84522.2 ms [2022-02-08 04:39:40 main:574] : INFO : Epoch 1909 | loss: 0.0311311 | val_loss: 0.031165 | Time: 91579.8 ms [2022-02-08 04:41:08 main:574] : INFO : Epoch 1910 | loss: 0.0311302 | val_loss: 0.0311684 | Time: 88059.3 ms [2022-02-08 04:42:30 main:574] : INFO : Epoch 1911 | loss: 0.0311288 | val_loss: 0.0311775 | Time: 81736.3 ms [2022-02-08 04:43:55 main:574] : INFO : Epoch 1912 | loss: 0.0311282 | val_loss: 0.0311689 | Time: 81759.8 ms [2022-02-08 04:45:07 main:574] : INFO : Epoch 1913 | loss: 0.0311266 | val_loss: 0.0311692 | Time: 71894.5 ms [2022-02-08 04:46:48 main:574] : INFO : Epoch 1914 | loss: 0.0311277 | val_loss: 0.0311647 | Time: 100763 ms [2022-02-08 04:48:22 main:574] : INFO : Epoch 1915 | loss: 0.0311251 | val_loss: 0.0311631 | Time: 89848.4 ms [2022-02-08 04:49:43 main:574] : INFO : Epoch 1916 | loss: 0.0311234 | val_loss: 0.0311641 | Time: 80364.6 ms [2022-02-08 04:51:21 main:574] : INFO : Epoch 1917 | loss: 0.031122 | val_loss: 0.0311678 | Time: 97949.5 ms [2022-02-08 04:52:51 main:574] : INFO : Epoch 1918 | loss: 0.031121 | val_loss: 0.0311654 | Time: 86819 ms [2022-02-08 04:54:30 main:574] : INFO : Epoch 1919 | loss: 0.0311201 | val_loss: 0.0311608 | Time: 97814.8 
ms [2022-02-08 04:55:50 main:574] : INFO : Epoch 1920 | loss: 0.0311191 | val_loss: 0.0311693 | Time: 79696 ms [2022-02-08 04:57:28 main:574] : INFO : Epoch 1921 | loss: 0.0311227 | val_loss: 0.03117 | Time: 93633.9 ms [2022-02-08 04:58:45 main:574] : INFO : Epoch 1922 | loss: 0.0311501 | val_loss: 0.0311701 | Time: 77109.1 ms [2022-02-08 05:00:11 main:574] : INFO : Epoch 1923 | loss: 0.0311601 | val_loss: 0.0311646 | Time: 86129.7 ms [2022-02-08 05:01:30 main:574] : INFO : Epoch 1924 | loss: 0.0311528 | val_loss: 0.0311624 | Time: 78013.9 ms [2022-02-08 05:02:47 main:574] : INFO : Epoch 1925 | loss: 0.0311484 | val_loss: 0.031159 | Time: 77281.1 ms [2022-02-08 05:04:15 main:574] : INFO : Epoch 1926 | loss: 0.0311478 | val_loss: 0.03116 | Time: 87417.3 ms [2022-02-08 05:06:03 main:574] : INFO : Epoch 1927 | loss: 0.0311448 | val_loss: 0.0311591 | Time: 99785.1 ms [2022-02-08 05:09:10 main:574] : INFO : Epoch 1928 | loss: 0.0311442 | val_loss: 0.0311578 | Time: 186354 ms [2022-02-08 05:13:50 main:574] : INFO : Epoch 1929 | loss: 0.0311422 | val_loss: 0.0311624 | Time: 272488 ms [2022-02-08 05:17:59 main:574] : INFO : Epoch 1930 | loss: 0.0311405 | val_loss: 0.0311633 | Time: 231148 ms [2022-02-08 05:21:48 main:574] : INFO : Epoch 1931 | loss: 0.0311383 | val_loss: 0.0311606 | Time: 224811 ms [2022-02-08 05:25:26 main:574] : INFO : Epoch 1932 | loss: 0.0311373 | val_loss: 0.0311605 | Time: 210653 ms [2022-02-08 05:28:16 main:574] : INFO : Epoch 1933 | loss: 0.0311374 | val_loss: 0.0311653 | Time: 162457 ms [2022-02-08 05:32:07 main:574] : INFO : Epoch 1934 | loss: 0.0311363 | val_loss: 0.0311605 | Time: 230170 ms [2022-02-08 05:37:12 main:574] : INFO : Epoch 1935 | loss: 0.0311358 | val_loss: 0.0311625 | Time: 283679 ms [2022-02-08 05:41:39 main:574] : INFO : Epoch 1936 | loss: 0.0311368 | val_loss: 0.0311651 | Time: 254347 ms [2022-02-08 05:45:10 main:574] : INFO : Epoch 1937 | loss: 0.0311336 | val_loss: 0.0311633 | Time: 199955 ms [2022-02-08 05:49:12 main:574] : INFO : Epoch 1938 | loss: 0.0311339 | val_loss: 0.0311626 | Time: 241839 ms [2022-02-08 05:54:28 main:574] : INFO : Epoch 1939 | loss: 0.0311319 | val_loss: 0.0311637 | Time: 292515 ms [2022-02-08 05:59:07 main:574] : INFO : Epoch 1940 | loss: 0.0311339 | val_loss: 0.0311664 | Time: 237133 ms [2022-02-08 06:02:55 main:574] : INFO : Epoch 1941 | loss: 0.0311318 | val_loss: 0.0311648 | Time: 208557 ms [2022-02-08 06:06:38 main:574] : INFO : Epoch 1942 | loss: 0.03113 | val_loss: 0.0311662 | Time: 217295 ms [2022-02-08 06:08:17 main:574] : INFO : Epoch 1943 | loss: 0.0311276 | val_loss: 0.031169 | Time: 98533.1 ms [2022-02-08 06:10:11 main:574] : INFO : Epoch 1944 | loss: 0.0311284 | val_loss: 0.0311707 | Time: 114201 ms [2022-02-08 06:12:02 main:574] : INFO : Epoch 1945 | loss: 0.0311283 | val_loss: 0.0311669 | Time: 100116 ms [2022-02-08 06:13:26 main:574] : INFO : Epoch 1946 | loss: 0.0311266 | val_loss: 0.0311714 | Time: 83723.2 ms [2022-02-08 06:15:03 main:574] : INFO : Epoch 1947 | loss: 0.031125 | val_loss: 0.0311722 | Time: 96136.4 ms [2022-02-08 06:16:12 main:574] : INFO : Epoch 1948 | loss: 0.0311264 | val_loss: 0.0311676 | Time: 64775.6 ms [2022-02-08 06:17:38 main:574] : INFO : Epoch 1949 | loss: 0.0311295 | val_loss: 0.0311701 | Time: 84968.9 ms [2022-02-08 06:19:10 main:574] : INFO : Epoch 1950 | loss: 0.0311287 | val_loss: 0.0311675 | Time: 92197.7 ms [2022-02-08 06:20:31 main:574] : INFO : Epoch 1951 | loss: 0.0311261 | val_loss: 0.0311677 | Time: 79199.1 ms Machine Learning Dataset Generator v9.75 (Windows/x64) 
(libTorch: release/1.6 GPU: NVIDIA GeForce GTX 950M) [2022-02-09 00:03:38 main:435] : INFO : Set logging level to 1 [2022-02-09 00:03:39 main:441] : INFO : Running in BOINC Client mode [2022-02-09 00:03:39 main:444] : INFO : Resolving all filenames [2022-02-09 00:03:39 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-02-09 00:03:39 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-02-09 00:03:39 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-02-09 00:03:39 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-02-09 00:03:39 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-02-09 00:03:39 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-02-09 00:03:39 main:474] : INFO : Configuration: [2022-02-09 00:03:39 main:475] : INFO : Model type: GRU [2022-02-09 00:03:40 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-02-09 00:03:40 main:477] : INFO : Max Epochs: 2048 [2022-02-09 00:03:40 main:478] : INFO : Batch Size: 128 [2022-02-09 00:03:40 main:479] : INFO : Learning Rate: 0.01 [2022-02-09 00:03:40 main:480] : INFO : Patience: 10 [2022-02-09 00:03:40 main:481] : INFO : Hidden Width: 12 [2022-02-09 00:03:40 main:482] : INFO : # Recurrent Layers: 4 [2022-02-09 00:03:40 main:483] : INFO : # Backend Layers: 4 [2022-02-09 00:03:40 main:484] : INFO : # Threads: 1 [2022-02-09 00:03:41 main:486] : INFO : Preparing Dataset [2022-02-09 00:03:41 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-02-09 00:03:46 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-02-09 00:08:02 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-02-09 00:08:02 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-02-09 00:08:05 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-02-09 00:08:14 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-02-09 00:08:15 main:494] : INFO : Creating Model [2022-02-09 00:08:15 main:507] : INFO : Preparing config file [2022-02-09 00:08:15 main:511] : INFO : Found checkpoint, attempting to load... 
[2022-02-09 00:08:15 main:512] : INFO : Loading config [2022-02-09 00:08:15 main:514] : INFO : Loading state [2022-02-09 00:10:58 main:559] : INFO : Loading DataLoader into Memory [2022-02-09 00:11:00 main:562] : INFO : Starting Training Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 950M) [2022-02-09 00:21:47 main:435] : INFO : Set logging level to 1 [2022-02-09 00:21:47 main:441] : INFO : Running in BOINC Client mode [2022-02-09 00:21:47 main:444] : INFO : Resolving all filenames [2022-02-09 00:21:48 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-02-09 00:21:49 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-02-09 00:21:49 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-02-09 00:21:49 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-02-09 00:21:49 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-02-09 00:21:49 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-02-09 00:21:50 main:474] : INFO : Configuration: [2022-02-09 00:21:50 main:475] : INFO : Model type: GRU [2022-02-09 00:21:50 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-02-09 00:21:50 main:477] : INFO : Max Epochs: 2048 [2022-02-09 00:21:50 main:478] : INFO : Batch Size: 128 [2022-02-09 00:21:50 main:479] : INFO : Learning Rate: 0.01 [2022-02-09 00:21:51 main:480] : INFO : Patience: 10 [2022-02-09 00:21:51 main:481] : INFO : Hidden Width: 12 [2022-02-09 00:21:51 main:482] : INFO : # Recurrent Layers: 4 [2022-02-09 00:21:51 main:483] : INFO : # Backend Layers: 4 [2022-02-09 00:21:51 main:484] : INFO : # Threads: 1 [2022-02-09 00:21:51 main:486] : INFO : Preparing Dataset [2022-02-09 00:21:51 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-02-09 00:21:58 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 950M) [2022-02-09 01:09:54 main:435] : INFO : Set logging level to 1 [2022-02-09 01:09:54 main:441] : INFO : Running in BOINC Client mode [2022-02-09 01:09:54 main:444] : INFO : Resolving all filenames [2022-02-09 01:09:54 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-02-09 01:09:54 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-02-09 01:09:54 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-02-09 01:09:55 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-02-09 01:09:55 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-02-09 01:09:55 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-02-09 01:09:56 main:474] : INFO : Configuration: [2022-02-09 01:09:56 main:475] : INFO : Model type: GRU [2022-02-09 01:09:56 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-02-09 01:09:56 main:477] : INFO : Max Epochs: 2048 [2022-02-09 01:09:58 main:478] : INFO : Batch Size: 128 [2022-02-09 01:09:59 main:479] : INFO : Learning Rate: 0.01 [2022-02-09 01:09:59 main:480] : INFO : Patience: 10 [2022-02-09 01:10:00 main:481] : INFO : Hidden Width: 12 [2022-02-09 01:10:00 main:482] : INFO : # Recurrent Layers: 4 [2022-02-09 01:10:00 main:483] : INFO : # Backend Layers: 4 [2022-02-09 01:10:00 main:484] : INFO : # Threads: 1 [2022-02-09 01:10:00 main:486] : INFO : Preparing Dataset [2022-02-09 01:10:00 
load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-02-09 01:10:08 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-02-09 01:17:57 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-02-09 01:17:57 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-02-09 01:18:00 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-02-09 01:18:16 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-02-09 01:18:17 main:494] : INFO : Creating Model [2022-02-09 01:18:20 main:507] : INFO : Preparing config file [2022-02-09 01:18:20 main:511] : INFO : Found checkpoint, attempting to load... [2022-02-09 01:18:20 main:512] : INFO : Loading config [2022-02-09 01:18:20 main:514] : INFO : Loading state [2022-02-09 01:21:20 main:559] : INFO : Loading DataLoader into Memory [2022-02-09 01:21:25 main:562] : INFO : Starting Training [2022-02-09 01:24:27 main:574] : INFO : Epoch 1951 | loss: 0.0311715 | val_loss: 0.0311657 | Time: 181795 ms [2022-02-09 01:27:06 main:574] : INFO : Epoch 1952 | loss: 0.0311376 | val_loss: 0.0311628 | Time: 151818 ms [2022-02-09 01:29:40 main:574] : INFO : Epoch 1953 | loss: 0.031132 | val_loss: 0.0311682 | Time: 153519 ms [2022-02-09 01:31:10 main:574] : INFO : Epoch 1954 | loss: 0.0311277 | val_loss: 0.0311647 | Time: 77120.6 ms [2022-02-09 01:33:44 main:574] : INFO : Epoch 1955 | loss: 0.031125 | val_loss: 0.031167 | Time: 153185 ms [2022-02-09 01:35:33 main:574] : INFO : Epoch 1956 | loss: 0.0311245 | val_loss: 0.0311652 | Time: 108783 ms [2022-02-09 01:37:58 main:574] : INFO : Epoch 1957 | loss: 0.0311258 | val_loss: 0.0311642 | Time: 145519 ms [2022-02-09 01:40:35 main:574] : INFO : Epoch 1958 | loss: 0.0311284 | val_loss: 0.0311679 | Time: 152170 ms [2022-02-09 01:42:26 main:574] : INFO : Epoch 1959 | loss: 0.0311277 | val_loss: 0.0311639 | Time: 110766 ms [2022-02-09 01:44:38 main:574] : INFO : Epoch 1960 | loss: 0.0311258 | val_loss: 0.0311664 | Time: 129196 ms [2022-02-09 01:47:07 main:574] : INFO : Epoch 1961 | loss: 0.0311246 | val_loss: 0.0311685 | Time: 146271 ms [2022-02-09 01:49:14 main:574] : INFO : Epoch 1962 | loss: 0.0311231 | val_loss: 0.031169 | Time: 122607 ms [2022-02-09 01:50:56 main:574] : INFO : Epoch 1963 | loss: 0.0311264 | val_loss: 0.0311722 | Time: 100784 ms [2022-02-09 01:53:14 main:574] : INFO : Epoch 1964 | loss: 0.0311248 | val_loss: 0.0311645 | Time: 131289 ms [2022-02-09 01:54:59 main:574] : INFO : Epoch 1965 | loss: 0.0311214 | val_loss: 0.0311656 | Time: 104812 ms [2022-02-09 01:57:13 main:574] : INFO : Epoch 1966 | loss: 0.0311229 | val_loss: 0.0311647 | Time: 125742 ms [2022-02-09 01:58:51 main:574] : INFO : Epoch 1967 | loss: 0.0311238 | val_loss: 0.031163 | Time: 98238.1 ms [2022-02-09 02:01:13 main:574] : INFO : Epoch 1968 | loss: 0.0311208 | val_loss: 0.0311682 | Time: 135760 ms [2022-02-09 02:03:35 main:574] : INFO : Epoch 1969 | loss: 0.0311234 | val_loss: 0.0311681 | Time: 141992 ms [2022-02-09 02:05:46 main:574] : INFO : Epoch 1970 | loss: 0.0311213 | val_loss: 0.0311639 | Time: 118606 ms [2022-02-09 02:07:58 main:574] : INFO : Epoch 1971 | loss: 0.03112 | val_loss: 0.0311719 | Time: 129925 ms [2022-02-09 02:10:40 main:574] : INFO : Epoch 1972 | loss: 0.0311191 | val_loss: 0.0311664 | Time: 153673 ms [2022-02-09 02:12:55 main:574] : INFO : Epoch 1973 | loss: 0.0311196 | val_loss: 0.0311681 | Time: 133886 ms 
[2022-02-09 02:15:15 main:574] : INFO : Epoch 1974 | loss: 0.0311244 | val_loss: 0.031166 | Time: 126256 ms [2022-02-09 02:17:07 main:574] : INFO : Epoch 1975 | loss: 0.0311214 | val_loss: 0.0311635 | Time: 112107 ms [2022-02-09 02:19:12 main:574] : INFO : Epoch 1976 | loss: 0.0311187 | val_loss: 0.031168 | Time: 124914 ms [2022-02-09 02:21:15 main:574] : INFO : Epoch 1977 | loss: 0.0311185 | val_loss: 0.0311631 | Time: 122042 ms [2022-02-09 02:23:25 main:574] : INFO : Epoch 1978 | loss: 0.0311253 | val_loss: 0.031163 | Time: 117406 ms [2022-02-09 02:25:24 main:574] : INFO : Epoch 1979 | loss: 0.0311249 | val_loss: 0.0311587 | Time: 118156 ms [2022-02-09 02:27:31 main:574] : INFO : Epoch 1980 | loss: 0.0311253 | val_loss: 0.0311637 | Time: 119487 ms [2022-02-09 02:29:47 main:574] : INFO : Epoch 1981 | loss: 0.0311228 | val_loss: 0.0311655 | Time: 134135 ms [2022-02-09 02:32:23 main:574] : INFO : Epoch 1982 | loss: 0.0311226 | val_loss: 0.0311622 | Time: 140351 ms [2022-02-09 02:34:46 main:574] : INFO : Epoch 1983 | loss: 0.0311243 | val_loss: 0.0311704 | Time: 127968 ms [2022-02-09 02:38:00 main:574] : INFO : Epoch 1984 | loss: 0.0311218 | val_loss: 0.0311598 | Time: 156402 ms [2022-02-09 02:40:16 main:574] : INFO : Epoch 1985 | loss: 0.0311226 | val_loss: 0.0311682 | Time: 135247 ms [2022-02-09 02:42:44 main:574] : INFO : Epoch 1986 | loss: 0.031121 | val_loss: 0.0311663 | Time: 125819 ms [2022-02-09 02:45:21 main:574] : INFO : Epoch 1987 | loss: 0.0311208 | val_loss: 0.0311629 | Time: 156021 ms [2022-02-09 02:47:56 main:574] : INFO : Epoch 1988 | loss: 0.0311198 | val_loss: 0.0311676 | Time: 144162 ms [2022-02-09 02:49:39 main:574] : INFO : Epoch 1989 | loss: 0.0311165 | val_loss: 0.0311674 | Time: 101589 ms [2022-02-09 02:51:48 main:574] : INFO : Epoch 1990 | loss: 0.0311162 | val_loss: 0.0311703 | Time: 126041 ms [2022-02-09 02:53:50 main:574] : INFO : Epoch 1991 | loss: 0.0311146 | val_loss: 0.0311724 | Time: 120145 ms [2022-02-09 02:55:20 main:574] : INFO : Epoch 1992 | loss: 0.0311203 | val_loss: 0.0311719 | Time: 89516.3 ms [2022-02-09 02:56:47 main:574] : INFO : Epoch 1993 | loss: 0.0311233 | val_loss: 0.0311655 | Time: 86201.5 ms [2022-02-09 02:57:05 main:574] : INFO : Epoch 1994 | loss: 0.0311307 | val_loss: 0.0311764 | Time: 17723.9 ms [2022-02-09 02:59:03 main:574] : INFO : Epoch 1995 | loss: 0.0311485 | val_loss: 0.0311679 | Time: 115994 ms [2022-02-09 03:01:30 main:574] : INFO : Epoch 1996 | loss: 0.0311365 | val_loss: 0.0311663 | Time: 141923 ms [2022-02-09 03:03:24 main:574] : INFO : Epoch 1997 | loss: 0.0311232 | val_loss: 0.0311666 | Time: 113089 ms [2022-02-09 03:05:05 main:574] : INFO : Epoch 1998 | loss: 0.0311228 | val_loss: 0.0311686 | Time: 95116.9 ms [2022-02-09 03:06:33 main:574] : INFO : Epoch 1999 | loss: 0.0311247 | val_loss: 0.0311679 | Time: 88056.2 ms [2022-02-09 03:08:25 main:574] : INFO : Epoch 2000 | loss: 0.0311228 | val_loss: 0.0311696 | Time: 108968 ms [2022-02-09 03:10:07 main:574] : INFO : Epoch 2001 | loss: 0.0311201 | val_loss: 0.0311706 | Time: 88229.5 ms [2022-02-09 03:12:09 main:574] : INFO : Epoch 2002 | loss: 0.0311176 | val_loss: 0.0311696 | Time: 121132 ms [2022-02-09 03:14:02 main:574] : INFO : Epoch 2003 | loss: 0.0311206 | val_loss: 0.03117 | Time: 108825 ms [2022-02-09 03:16:20 main:574] : INFO : Epoch 2004 | loss: 0.0311185 | val_loss: 0.031163 | Time: 137047 ms [2022-02-09 03:18:11 main:574] : INFO : Epoch 2005 | loss: 0.0311208 | val_loss: 0.0311708 | Time: 108765 ms [2022-02-09 03:19:25 main:574] : INFO : Epoch 2006 | loss: 
0.0311197 | val_loss: 0.0311716 | Time: 73676.5 ms [2022-02-09 03:21:09 main:574] : INFO : Epoch 2007 | loss: 0.031118 | val_loss: 0.031169 | Time: 102075 ms [2022-02-09 03:23:27 main:574] : INFO : Epoch 2008 | loss: 0.0311165 | val_loss: 0.0311733 | Time: 133070 ms [2022-02-09 03:25:28 main:574] : INFO : Epoch 2009 | loss: 0.0311156 | val_loss: 0.0311669 | Time: 119748 ms [2022-02-09 03:27:27 main:574] : INFO : Epoch 2010 | loss: 0.0311167 | val_loss: 0.0311659 | Time: 114721 ms [2022-02-09 03:29:19 main:574] : INFO : Epoch 2011 | loss: 0.0311141 | val_loss: 0.0311704 | Time: 108488 ms [2022-02-09 03:32:01 main:574] : INFO : Epoch 2012 | loss: 0.0311118 | val_loss: 0.0311697 | Time: 155999 ms [2022-02-09 03:34:16 main:574] : INFO : Epoch 2013 | loss: 0.0311138 | val_loss: 0.0311713 | Time: 134036 ms [2022-02-09 03:36:38 main:574] : INFO : Epoch 2014 | loss: 0.0311166 | val_loss: 0.0311852 | Time: 136750 ms [2022-02-09 03:38:56 main:574] : INFO : Epoch 2015 | loss: 0.0311167 | val_loss: 0.0311688 | Time: 136664 ms [2022-02-09 03:41:12 main:574] : INFO : Epoch 2016 | loss: 0.0311164 | val_loss: 0.031168 | Time: 127531 ms [2022-02-09 03:43:43 main:574] : INFO : Epoch 2017 | loss: 0.0311152 | val_loss: 0.0311679 | Time: 150074 ms [2022-02-09 03:46:15 main:574] : INFO : Epoch 2018 | loss: 0.0311129 | val_loss: 0.031171 | Time: 143434 ms [2022-02-09 03:48:57 main:574] : INFO : Epoch 2019 | loss: 0.0311136 | val_loss: 0.03117 | Time: 160130 ms [2022-02-09 03:52:21 main:574] : INFO : Epoch 2020 | loss: 0.0311146 | val_loss: 0.0311701 | Time: 177936 ms [2022-02-09 03:55:08 main:574] : INFO : Epoch 2021 | loss: 0.0311149 | val_loss: 0.0311742 | Time: 165860 ms [2022-02-09 03:57:08 main:574] : INFO : Epoch 2022 | loss: 0.0311144 | val_loss: 0.0311655 | Time: 108266 ms [2022-02-09 03:59:01 main:574] : INFO : Epoch 2023 | loss: 0.0311171 | val_loss: 0.0311702 | Time: 112719 ms [2022-02-09 04:01:16 main:574] : INFO : Epoch 2024 | loss: 0.0311157 | val_loss: 0.0311668 | Time: 124792 ms [2022-02-09 04:03:49 main:574] : INFO : Epoch 2025 | loss: 0.0311167 | val_loss: 0.0311688 | Time: 152213 ms [2022-02-09 04:06:25 main:574] : INFO : Epoch 2026 | loss: 0.0311234 | val_loss: 0.0311645 | Time: 146914 ms [2022-02-09 04:08:43 main:574] : INFO : Epoch 2027 | loss: 0.0311224 | val_loss: 0.0311643 | Time: 133402 ms [2022-02-09 04:11:04 main:574] : INFO : Epoch 2028 | loss: 0.0311213 | val_loss: 0.031168 | Time: 129519 ms [2022-02-09 04:14:45 main:574] : INFO : Epoch 2029 | loss: 0.0311171 | val_loss: 0.0311619 | Time: 219003 ms [2022-02-09 04:17:49 main:574] : INFO : Epoch 2030 | loss: 0.0311151 | val_loss: 0.0311673 | Time: 176759 ms [2022-02-09 04:20:18 main:574] : INFO : Epoch 2031 | loss: 0.031118 | val_loss: 0.0311658 | Time: 144402 ms [2022-02-09 04:23:02 main:574] : INFO : Epoch 2032 | loss: 0.0311234 | val_loss: 0.0311739 | Time: 155171 ms [2022-02-09 04:24:17 main:574] : INFO : Epoch 2033 | loss: 0.0311227 | val_loss: 0.0311661 | Time: 74260.3 ms [2022-02-09 04:28:42 main:574] : INFO : Epoch 2034 | loss: 0.0311195 | val_loss: 0.031162 | Time: 226845 ms [2022-02-09 04:31:06 main:574] : INFO : Epoch 2035 | loss: 0.0311216 | val_loss: 0.0311617 | Time: 133948 ms [2022-02-09 04:33:19 main:574] : INFO : Epoch 2036 | loss: 0.0311224 | val_loss: 0.0311696 | Time: 131898 ms [2022-02-09 04:36:20 main:574] : INFO : Epoch 2037 | loss: 0.0311185 | val_loss: 0.0311671 | Time: 156698 ms [2022-02-09 04:42:44 main:574] : INFO : Epoch 2038 | loss: 0.031128 | val_loss: 0.0311589 | Time: 383154 ms [2022-02-09 04:51:31 
main:574] : INFO : Epoch 2039 | loss: 0.031129 | val_loss: 0.0311617 | Time: 471648 ms [2022-02-09 04:55:36 main:574] : INFO : Epoch 2040 | loss: 0.0311279 | val_loss: 0.0311678 | Time: 200897 ms [2022-02-09 04:58:01 main:574] : INFO : Epoch 2041 | loss: 0.0311232 | val_loss: 0.0311665 | Time: 133162 ms [2022-02-09 05:00:23 main:574] : INFO : Epoch 2042 | loss: 0.031119 | val_loss: 0.0311617 | Time: 141653 ms [2022-02-09 05:03:34 main:574] : INFO : Epoch 2043 | loss: 0.0311166 | val_loss: 0.0311698 | Time: 186127 ms [2022-02-09 05:06:07 main:574] : INFO : Epoch 2044 | loss: 0.0311388 | val_loss: 0.0311683 | Time: 152241 ms [2022-02-09 05:08:52 main:574] : INFO : Epoch 2045 | loss: 0.0311388 | val_loss: 0.0311717 | Time: 159360 ms [2022-02-09 05:11:24 main:574] : INFO : Epoch 2046 | loss: 0.031138 | val_loss: 0.0311721 | Time: 151041 ms [2022-02-09 05:14:31 main:574] : INFO : Epoch 2047 | loss: 0.0311335 | val_loss: 0.0311637 | Time: 163653 ms [2022-02-09 05:18:27 main:574] : INFO : Epoch 2048 | loss: 0.0311294 | val_loss: 0.0311689 | Time: 232197 ms [2022-02-09 05:18:35 main:597] : INFO : Saving trained model to model-final.pt, val_loss 0.0311689 [2022-02-09 05:18:35 main:603] : INFO : Saving end state to config to file [2022-02-09 05:18:36 main:608] : INFO : Success, exiting.. 05:18:36 (11200): called boinc_finish(0) </stderr_txt> ]]>
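The stderr log above traces each restart of the task through the same sequence: the application loads the training tensors (/Xt, /Yt, 2048 examples) and validation tensors (/Xv, /Yv, 512 examples) from dataset.hdf5 into memory, rebuilds the model, restores the snapshot.pt checkpoint, and resumes training from the last completed epoch. The sketch below is a rough Python approximation of the load_hdf5_ds_into_tensor / load steps named in the log; the actual application is a C++ libTorch 1.6 binary, so everything here other than the file name and dataset paths is an assumption.

```python
# Hedged sketch: approximate reconstruction of the dataset-loading phase seen in
# the log (load_hdf5_ds_into_tensor / load). The real app is C++/libTorch; the
# function names, dtypes, and return layout below are assumptions.
import h5py
import torch

def load_hdf5_ds_into_tensor(h5file, name):
    """Read one HDF5 dataset (e.g. /Xt) fully into an in-memory float tensor."""
    print(f"Loading Dataset {name} from {h5file.filename} into memory")
    return torch.from_numpy(h5file[name][...]).float()

def load(path="dataset.hdf5"):
    """Load the training (/Xt, /Yt) and validation (/Xv, /Yv) splits."""
    with h5py.File(path, "r") as f:
        Xt = load_hdf5_ds_into_tensor(f, "/Xt")
        Yt = load_hdf5_ds_into_tensor(f, "/Yt")
        Xv = load_hdf5_ds_into_tensor(f, "/Xv")
        Yv = load_hdf5_ds_into_tensor(f, "/Yv")
    print(f"Successfully loaded dataset of {len(Xt)} examples into memory.")  # 2048 in the log
    print(f"Successfully loaded dataset of {len(Xv)} examples into memory.")  # 512 in the log
    return (Xt, Yt), (Xv, Yv)
```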
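The configuration echoed at every restart (GRU, hidden width 12, 4 recurrent layers, 4 backend layers, batch size 128, learning rate 0.01, patience 10, validation-loss threshold 0.0001, max epochs 2048), together with the "Found checkpoint, attempting to load..." and "Saving trained model to model-final.pt" messages, implies a resumable training loop that runs until the epoch cap or an early-stopping criterion is met; this task stopped at the 2048-epoch cap. Below is a minimal PyTorch sketch of such a loop using the tensors from the loading sketch above. The optimizer choice, the exact meaning of "patience" and the threshold, the snapshot contents, and the model wiring are all guesses, not MLC@Home's actual implementation.

```python
# Hedged sketch of the resumable training loop implied by the log. Everything
# below (optimizer, stopping rule, snapshot layout, model wiring) is assumed.
import os
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

class GRUModel(nn.Module):
    def __init__(self, in_dim, out_dim, hidden=12, rec_layers=4, backend_layers=4):
        super().__init__()
        self.gru = nn.GRU(in_dim, hidden, num_layers=rec_layers, batch_first=True)
        layers = []
        for _ in range(backend_layers - 1):
            layers += [nn.Linear(hidden, hidden), nn.ReLU()]
        layers += [nn.Linear(hidden, out_dim)]
        self.backend = nn.Sequential(*layers)

    def forward(self, x):
        out, _ = self.gru(x)             # (batch, seq, hidden)
        return self.backend(out[:, -1])  # predict from the last time step (assumption)

def train(Xt, Yt, Xv, Yv, snapshot="snapshot.pt", final="model-final.pt",
          max_epochs=2048, batch_size=128, lr=0.01, patience=10, threshold=1e-4):
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = GRUModel(Xt.shape[-1], Yt.shape[-1]).to(device)
    opt = torch.optim.Adam(model.parameters(), lr=lr)    # optimizer type is a guess
    loss_fn = nn.MSELoss()
    start_epoch, best_val, stale = 0, float("inf"), 0

    if os.path.exists(snapshot):                          # "Found checkpoint, attempting to load..."
        state = torch.load(snapshot, map_location=device)
        model.load_state_dict(state["model"])
        opt.load_state_dict(state["opt"])
        start_epoch, best_val = state["epoch"], state["best_val"]

    loader = DataLoader(TensorDataset(Xt, Yt), batch_size=batch_size, shuffle=True)
    for epoch in range(start_epoch, max_epochs):
        model.train()
        total = 0.0
        for xb, yb in loader:
            xb, yb = xb.to(device), yb.to(device)
            opt.zero_grad()
            loss = loss_fn(model(xb), yb)
            loss.backward()
            opt.step()
            total += loss.item() * len(xb)
        model.eval()
        with torch.no_grad():
            val_loss = loss_fn(model(Xv.to(device)), Yv.to(device)).item()
        print(f"Epoch {epoch + 1} | loss: {total / len(Xt):g} | val_loss: {val_loss:g}")

        # Snapshot after every epoch so a BOINC restart can resume mid-run.
        torch.save({"epoch": epoch + 1, "model": model.state_dict(),
                    "opt": opt.state_dict(), "best_val": best_val}, snapshot)

        # Assumed early-stopping rule: stop after `patience` epochs without the
        # validation loss improving by more than `threshold`.
        if best_val - val_loss > threshold:
            best_val, stale = val_loss, 0
        else:
            stale += 1
            if stale >= patience:
                break

    torch.save(model.state_dict(), final)                 # "Saving trained model to model-final.pt"
```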
©2022 MLC@Home Team
A project of the Cognition, Robotics, and Learning (CORAL) Lab at the University of Maryland, Baltimore County (UMBC)