| Name | ParityModified-1639958790-1804-1-0_0 |
| Workunit | 7765380 |
| Created | 4 Jan 2022, 1:30:19 UTC |
| Sent | 17 Jan 2022, 19:05:06 UTC |
| Report deadline | 25 Jan 2022, 19:05:06 UTC |
| Received | 18 Jan 2022, 19:31:47 UTC |
| Server state | Over |
| Outcome | Success |
| Client state | Done |
| Exit status | 0 (0x00000000) |
| Computer ID | 157 |
| Run time | 12 hours 17 min 34 sec |
| CPU time | 6 hours 52 min 43 sec |
| Validate state | Valid |
| Credit | 4,160.00 |
| Device peak FLOPS | 4,474.06 GFLOPS |
| Application version | Machine Learning Dataset Generator (GPU) v9.75 (cuda10200) windows_x86_64 |
| Peak working set size | 1.62 GB |
| Peak swap size | 3.61 GB |
| Peak disk usage | 1.54 GB |
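The stderr_txt block that follows logs one line per training epoch in the form "Epoch N | loss: X | val_loss: Y | Time: Z ms", restarting from the saved checkpoint each time the BOINC client preempts and resumes the task. As an illustrative aside only (the input file name stderr.txt and the exact spacing are assumptions, not part of the project's tooling), a few lines of Python are enough to recover the loss curve from such a dump:

```python
# Illustrative sketch only (the file name "stderr.txt" is an assumption): pull
# the per-epoch loss curve out of a stderr dump in the format shown below.
import re

# One match per "Epoch N | loss: X | val_loss: Y | Time: Z ms" entry; \s+ also
# bridges entries that were wrapped across line breaks in the dump.
EPOCH_RE = re.compile(
    r"Epoch\s+(\d+)\s*\|\s*loss:\s*([\d.eE+-]+)"
    r"\s*\|\s*val_loss:\s*([\d.eE+-]+)\s*\|\s*Time:\s*([\d.]+)\s*ms"
)

with open("stderr.txt", encoding="utf-8") as f:
    text = f.read()

history = [(int(e), float(l), float(v), float(t))
           for e, l, v, t in EPOCH_RE.findall(text)]

# The task restarts several times, so epoch numbers repeat; keep the last
# occurrence of each epoch.
latest = {}
for row in history:
    latest[row[0]] = row

print(f"parsed {len(history)} epoch lines, covering epochs {min(latest)}-{max(latest)}")
```

Because the task restarts several times, the same epoch numbers appear more than once in the log, which is why the sketch keeps only the last occurrence of each.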
<core_client_version>7.16.20</core_client_version> <![CDATA[ <stderr_txt> ms [2022-01-18 04:22:11 main:574] : INFO : Epoch 1780 | loss: 0.0311464 | val_loss: 0.0311621 | Time: 18016.2 ms [2022-01-18 04:22:29 main:574] : INFO : Epoch 1781 | loss: 0.0311453 | val_loss: 0.0311667 | Time: 17606 ms [2022-01-18 04:22:46 main:574] : INFO : Epoch 1782 | loss: 0.0311454 | val_loss: 0.0311639 | Time: 17530.5 ms [2022-01-18 04:23:04 main:574] : INFO : Epoch 1783 | loss: 0.0311441 | val_loss: 0.0311647 | Time: 18022.6 ms [2022-01-18 04:23:22 main:574] : INFO : Epoch 1784 | loss: 0.0311452 | val_loss: 0.0311663 | Time: 17579.7 ms [2022-01-18 04:23:40 main:574] : INFO : Epoch 1785 | loss: 0.0311444 | val_loss: 0.0311645 | Time: 18125.5 ms [2022-01-18 04:23:59 main:574] : INFO : Epoch 1786 | loss: 0.0311449 | val_loss: 0.0311658 | Time: 18123 ms [2022-01-18 04:24:17 main:574] : INFO : Epoch 1787 | loss: 0.0311446 | val_loss: 0.031169 | Time: 18028.9 ms [2022-01-18 04:24:35 main:574] : INFO : Epoch 1788 | loss: 0.0311437 | val_loss: 0.0311675 | Time: 18517.5 ms [2022-01-18 04:24:54 main:574] : INFO : Epoch 1789 | loss: 0.0311433 | val_loss: 0.0311615 | Time: 18052 ms [2022-01-18 04:25:12 main:574] : INFO : Epoch 1790 | loss: 0.0311428 | val_loss: 0.0311645 | Time: 18214.7 ms [2022-01-18 04:25:29 main:574] : INFO : Epoch 1791 | loss: 0.0311409 | val_loss: 0.0311655 | Time: 17448.5 ms [2022-01-18 04:25:48 main:574] : INFO : Epoch 1792 | loss: 0.0311416 | val_loss: 0.0311645 | Time: 17934.6 ms [2022-01-18 04:26:06 main:574] : INFO : Epoch 1793 | loss: 0.0311455 | val_loss: 0.0311686 | Time: 18461.5 ms [2022-01-18 04:26:24 main:574] : INFO : Epoch 1794 | loss: 0.0311615 | val_loss: 0.0311811 | Time: 17447.9 ms [2022-01-18 04:26:43 main:574] : INFO : Epoch 1795 | loss: 0.0326339 | val_loss: 0.0326441 | Time: 18880.5 ms [2022-01-18 04:27:00 main:574] : INFO : Epoch 1796 | loss: 0.0326987 | val_loss: 0.0316089 | Time: 17413 ms [2022-01-18 04:27:18 main:574] : INFO : Epoch 1797 | loss: 0.0324217 | val_loss: 0.0315558 | Time: 17221.3 ms [2022-01-18 04:27:36 main:574] : INFO : Epoch 1798 | loss: 0.0313962 | val_loss: 0.031295 | Time: 18414 ms [2022-01-18 04:27:54 main:574] : INFO : Epoch 1799 | loss: 0.0312501 | val_loss: 0.0312147 | Time: 17674.9 ms [2022-01-18 04:28:12 main:574] : INFO : Epoch 1800 | loss: 0.031207 | val_loss: 0.0311897 | Time: 17926.2 ms [2022-01-18 04:28:30 main:574] : INFO : Epoch 1801 | loss: 0.0311849 | val_loss: 0.0311766 | Time: 18471.6 ms [2022-01-18 04:28:48 main:574] : INFO : Epoch 1802 | loss: 0.0311768 | val_loss: 0.031174 | Time: 17144.4 ms [2022-01-18 04:29:07 main:574] : INFO : Epoch 1803 | loss: 0.0311738 | val_loss: 0.0311755 | Time: 18749.1 ms [2022-01-18 04:29:24 main:574] : INFO : Epoch 1804 | loss: 0.0311723 | val_loss: 0.0311735 | Time: 17643.7 ms [2022-01-18 04:29:43 main:574] : INFO : Epoch 1805 | loss: 0.0311688 | val_loss: 0.0311723 | Time: 18105.3 ms [2022-01-18 04:30:01 main:574] : INFO : Epoch 1806 | loss: 0.0311663 | val_loss: 0.0311702 | Time: 18477.1 ms [2022-01-18 04:30:20 main:574] : INFO : Epoch 1807 | loss: 0.0311654 | val_loss: 0.0311682 | Time: 18473 ms [2022-01-18 04:30:38 main:574] : INFO : Epoch 1808 | loss: 0.0311647 | val_loss: 0.0311701 | Time: 17733.8 ms [2022-01-18 04:30:56 main:574] : INFO : Epoch 1809 | loss: 0.0311639 | val_loss: 0.0311666 | Time: 18259.1 ms [2022-01-18 04:31:14 main:574] : INFO : Epoch 1810 | loss: 0.0311622 | val_loss: 0.0311674 | Time: 17548.9 ms [2022-01-18 04:31:32 main:574] : INFO : Epoch 1811 | loss: 0.0311628 | val_loss: 
0.0311654 | Time: 17942.2 ms [2022-01-18 04:31:49 main:574] : INFO : Epoch 1812 | loss: 0.0311632 | val_loss: 0.0311646 | Time: 17188.2 ms [2022-01-18 04:32:08 main:574] : INFO : Epoch 1813 | loss: 0.0311623 | val_loss: 0.0311642 | Time: 18444 ms [2022-01-18 04:32:25 main:574] : INFO : Epoch 1814 | loss: 0.0311614 | val_loss: 0.0311643 | Time: 17848.5 ms [2022-01-18 04:32:44 main:574] : INFO : Epoch 1815 | loss: 0.0311611 | val_loss: 0.0311726 | Time: 18250.1 ms [2022-01-18 04:33:02 main:574] : INFO : Epoch 1816 | loss: 0.0311604 | val_loss: 0.031164 | Time: 18421 ms [2022-01-18 04:33:19 main:574] : INFO : Epoch 1817 | loss: 0.0311606 | val_loss: 0.0311636 | Time: 16547.5 ms [2022-01-18 04:33:37 main:574] : INFO : Epoch 1818 | loss: 0.0311588 | val_loss: 0.0311634 | Time: 18149.2 ms [2022-01-18 04:33:55 main:574] : INFO : Epoch 1819 | loss: 0.0311587 | val_loss: 0.0311658 | Time: 17375.8 ms [2022-01-18 04:34:13 main:574] : INFO : Epoch 1820 | loss: 0.0311576 | val_loss: 0.031168 | Time: 18205.5 ms [2022-01-18 04:34:31 main:574] : INFO : Epoch 1821 | loss: 0.0311575 | val_loss: 0.0311664 | Time: 18145 ms [2022-01-18 04:34:49 main:574] : INFO : Epoch 1822 | loss: 0.0311584 | val_loss: 0.0311638 | Time: 17361.8 ms [2022-01-18 04:35:08 main:574] : INFO : Epoch 1823 | loss: 0.0311582 | val_loss: 0.0311654 | Time: 18657.5 ms [2022-01-18 04:35:25 main:574] : INFO : Epoch 1824 | loss: 0.0311574 | val_loss: 0.0311632 | Time: 17859.3 ms [2022-01-18 04:35:43 main:574] : INFO : Epoch 1825 | loss: 0.0311567 | val_loss: 0.031163 | Time: 17776.9 ms [2022-01-18 04:36:01 main:574] : INFO : Epoch 1826 | loss: 0.0311581 | val_loss: 0.0311691 | Time: 17658.6 ms [2022-01-18 04:36:18 main:574] : INFO : Epoch 1827 | loss: 0.0311561 | val_loss: 0.031164 | Time: 17138.4 ms [2022-01-18 04:36:37 main:574] : INFO : Epoch 1828 | loss: 0.0311542 | val_loss: 0.0311619 | Time: 18229 ms [2022-01-18 04:36:55 main:574] : INFO : Epoch 1829 | loss: 0.0311546 | val_loss: 0.0311636 | Time: 17874.4 ms [2022-01-18 04:37:13 main:574] : INFO : Epoch 1830 | loss: 0.0311549 | val_loss: 0.0311653 | Time: 17821 ms [2022-01-18 04:37:30 main:574] : INFO : Epoch 1831 | loss: 0.0311551 | val_loss: 0.0311639 | Time: 17224.9 ms [2022-01-18 04:37:48 main:574] : INFO : Epoch 1832 | loss: 0.031153 | val_loss: 0.0311635 | Time: 17911.4 ms [2022-01-18 04:38:07 main:574] : INFO : Epoch 1833 | loss: 0.0311522 | val_loss: 0.0311644 | Time: 18608 ms [2022-01-18 04:38:25 main:574] : INFO : Epoch 1834 | loss: 0.0311545 | val_loss: 0.0311644 | Time: 18055.7 ms [2022-01-18 04:38:43 main:574] : INFO : Epoch 1835 | loss: 0.0311533 | val_loss: 0.0311639 | Time: 18105.3 ms [2022-01-18 04:39:01 main:574] : INFO : Epoch 1836 | loss: 0.0311528 | val_loss: 0.0311647 | Time: 17688 ms [2022-01-18 04:39:18 main:574] : INFO : Epoch 1837 | loss: 0.0311522 | val_loss: 0.031164 | Time: 17522 ms [2022-01-18 04:39:37 main:574] : INFO : Epoch 1838 | loss: 0.0311532 | val_loss: 0.0311638 | Time: 18555.4 ms [2022-01-18 04:39:55 main:574] : INFO : Epoch 1839 | loss: 0.0311518 | val_loss: 0.0311654 | Time: 17607.7 ms [2022-01-18 04:40:13 main:574] : INFO : Epoch 1840 | loss: 0.0311564 | val_loss: 0.0311669 | Time: 18370.3 ms [2022-01-18 04:40:31 main:574] : INFO : Epoch 1841 | loss: 0.0311552 | val_loss: 0.0311675 | Time: 17422.4 ms [2022-01-18 04:40:49 main:574] : INFO : Epoch 1842 | loss: 0.0311551 | val_loss: 0.0311635 | Time: 17632.1 ms [2022-01-18 04:41:07 main:574] : INFO : Epoch 1843 | loss: 0.0311521 | val_loss: 0.0311665 | Time: 18754.6 ms [2022-01-18 04:41:25 
main:574] : INFO : Epoch 1844 | loss: 0.0311513 | val_loss: 0.031164 | Time: 17670.2 ms [2022-01-18 04:41:44 main:574] : INFO : Epoch 1845 | loss: 0.0311499 | val_loss: 0.0311637 | Time: 18626.4 ms [2022-01-18 04:42:02 main:574] : INFO : Epoch 1846 | loss: 0.0311494 | val_loss: 0.0311643 | Time: 18315.3 ms [2022-01-18 04:42:19 main:574] : INFO : Epoch 1847 | loss: 0.0311486 | val_loss: 0.031164 | Time: 17052.5 ms [2022-01-18 04:42:38 main:574] : INFO : Epoch 1848 | loss: 0.0311504 | val_loss: 0.031164 | Time: 18640.1 ms [2022-01-18 04:42:58 main:574] : INFO : Epoch 1849 | loss: 0.0311494 | val_loss: 0.0311684 | Time: 19647.1 ms [2022-01-18 04:43:26 main:574] : INFO : Epoch 1850 | loss: 0.0311513 | val_loss: 0.0311656 | Time: 27248.4 ms [2022-01-18 04:43:55 main:574] : INFO : Epoch 1851 | loss: 0.0311477 | val_loss: 0.0311655 | Time: 28811.4 ms [2022-01-18 04:44:14 main:574] : INFO : Epoch 1852 | loss: 0.0311471 | val_loss: 0.0311657 | Time: 19418.3 ms [2022-01-18 04:44:32 main:574] : INFO : Epoch 1853 | loss: 0.0311509 | val_loss: 0.0311611 | Time: 17744.7 ms [2022-01-18 04:44:50 main:574] : INFO : Epoch 1854 | loss: 0.0311508 | val_loss: 0.0311603 | Time: 17664.4 ms [2022-01-18 04:45:08 main:574] : INFO : Epoch 1855 | loss: 0.0311544 | val_loss: 0.0311643 | Time: 18543.1 ms [2022-01-18 04:45:27 main:574] : INFO : Epoch 1856 | loss: 0.0311536 | val_loss: 0.0311646 | Time: 18217.7 ms [2022-01-18 04:45:44 main:574] : INFO : Epoch 1857 | loss: 0.0311519 | val_loss: 0.031165 | Time: 17589.9 ms [2022-01-18 04:46:02 main:574] : INFO : Epoch 1858 | loss: 0.031151 | val_loss: 0.0311645 | Time: 17276.4 ms [2022-01-18 04:46:20 main:574] : INFO : Epoch 1859 | loss: 0.0311507 | val_loss: 0.0311647 | Time: 18067 ms [2022-01-18 04:46:38 main:574] : INFO : Epoch 1860 | loss: 0.0311499 | val_loss: 0.0311638 | Time: 18251.7 ms [2022-01-18 04:46:57 main:574] : INFO : Epoch 1861 | loss: 0.0311482 | val_loss: 0.0311613 | Time: 18741.9 ms [2022-01-18 04:47:20 main:574] : INFO : Epoch 1862 | loss: 0.0311473 | val_loss: 0.031165 | Time: 22149.6 ms [2022-01-18 04:47:38 main:574] : INFO : Epoch 1863 | loss: 0.0311497 | val_loss: 0.0311618 | Time: 17987.9 ms [2022-01-18 04:47:56 main:574] : INFO : Epoch 1864 | loss: 0.0311469 | val_loss: 0.0311626 | Time: 17847.9 ms [2022-01-18 04:48:13 main:574] : INFO : Epoch 1865 | loss: 0.0311463 | val_loss: 0.0311646 | Time: 17817.4 ms [2022-01-18 04:48:31 main:574] : INFO : Epoch 1866 | loss: 0.0311468 | val_loss: 0.031166 | Time: 17005.4 ms [2022-01-18 04:48:49 main:574] : INFO : Epoch 1867 | loss: 0.0311509 | val_loss: 0.0311655 | Time: 18330.8 ms [2022-01-18 04:49:07 main:574] : INFO : Epoch 1868 | loss: 0.0311474 | val_loss: 0.0311682 | Time: 17683.7 ms [2022-01-18 04:49:24 main:574] : INFO : Epoch 1869 | loss: 0.0311482 | val_loss: 0.0311642 | Time: 17668.7 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB) [2022-01-18 07:48:49 main:435] : INFO : Set logging level to 1 [2022-01-18 07:48:49 main:441] : INFO : Running in BOINC Client mode [2022-01-18 07:48:49 main:444] : INFO : Resolving all filenames [2022-01-18 07:48:49 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-18 07:48:50 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-18 07:48:50 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-18 07:48:50 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-01-18 07:48:50 main:452] : INFO : 
Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-18 07:48:51 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-18 07:48:51 main:474] : INFO : Configuration: [2022-01-18 07:48:51 main:475] : INFO : Model type: GRU [2022-01-18 07:48:52 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-18 07:48:52 main:477] : INFO : Max Epochs: 2048 [2022-01-18 07:48:52 main:478] : INFO : Batch Size: 128 [2022-01-18 07:48:52 main:479] : INFO : Learning Rate: 0.01 [2022-01-18 07:48:52 main:480] : INFO : Patience: 10 [2022-01-18 07:48:53 main:481] : INFO : Hidden Width: 12 [2022-01-18 07:48:53 main:482] : INFO : # Recurrent Layers: 4 [2022-01-18 07:48:53 main:483] : INFO : # Backend Layers: 4 [2022-01-18 07:48:53 main:484] : INFO : # Threads: 1 [2022-01-18 07:48:53 main:486] : INFO : Preparing Dataset [2022-01-18 07:48:53 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-18 07:48:55 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-18 07:49:13 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-01-18 07:49:13 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-18 07:49:14 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-18 07:49:14 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-18 07:49:14 main:494] : INFO : Creating Model [2022-01-18 07:49:14 main:507] : INFO : Preparing config file [2022-01-18 07:49:15 main:511] : INFO : Found checkpoint, attempting to load... [2022-01-18 07:49:15 main:512] : INFO : Loading config [2022-01-18 07:49:15 main:514] : INFO : Loading state [2022-01-18 07:49:23 main:559] : INFO : Loading DataLoader into Memory [2022-01-18 07:49:24 main:562] : INFO : Starting Training [2022-01-18 07:50:02 main:574] : INFO : Epoch 1870 | loss: 0.0314805 | val_loss: 0.0312583 | Time: 38626.8 ms [2022-01-18 07:50:34 main:574] : INFO : Epoch 1871 | loss: 0.0312496 | val_loss: 0.03122 | Time: 31285.2 ms [2022-01-18 07:51:11 main:574] : INFO : Epoch 1872 | loss: 0.031194 | val_loss: 0.0311731 | Time: 37282 ms [2022-01-18 07:51:43 main:574] : INFO : Epoch 1873 | loss: 0.0311615 | val_loss: 0.0311757 | Time: 31618.4 ms [2022-01-18 07:52:14 main:574] : INFO : Epoch 1874 | loss: 0.0311578 | val_loss: 0.0311709 | Time: 31008.8 ms [2022-01-18 07:52:44 main:574] : INFO : Epoch 1875 | loss: 0.0311577 | val_loss: 0.0311653 | Time: 29487.3 ms [2022-01-18 07:53:16 main:574] : INFO : Epoch 1876 | loss: 0.031153 | val_loss: 0.0311683 | Time: 31784.4 ms [2022-01-18 07:53:47 main:574] : INFO : Epoch 1877 | loss: 0.0311516 | val_loss: 0.0311704 | Time: 31373.4 ms [2022-01-18 07:54:20 main:574] : INFO : Epoch 1878 | loss: 0.0311507 | val_loss: 0.0311646 | Time: 32596.8 ms [2022-01-18 07:54:52 main:574] : INFO : Epoch 1879 | loss: 0.0311531 | val_loss: 0.0311668 | Time: 31852.6 ms [2022-01-18 07:55:24 main:574] : INFO : Epoch 1880 | loss: 0.031153 | val_loss: 0.0311673 | Time: 31253.8 ms [2022-01-18 07:55:55 main:574] : INFO : Epoch 1881 | loss: 0.0311524 | val_loss: 0.0311654 | Time: 30916.5 ms [2022-01-18 07:56:25 main:574] : INFO : Epoch 1882 | loss: 0.0311503 | val_loss: 0.0311667 | Time: 29984.9 ms [2022-01-18 07:56:57 main:574] : INFO : Epoch 1883 | loss: 0.0311514 | val_loss: 0.03117 | Time: 32211.8 ms [2022-01-18 07:57:27 main:574] : INFO : Epoch 1884 | loss: 0.031151 | val_loss: 0.0311659 | Time: 29720.9 ms [2022-01-18 07:57:57 
main:574] : INFO : Epoch 1885 | loss: 0.0311497 | val_loss: 0.0311657 | Time: 30257.1 ms [2022-01-18 07:58:27 main:574] : INFO : Epoch 1886 | loss: 0.0311506 | val_loss: 0.0311634 | Time: 29391.3 ms [2022-01-18 07:58:57 main:574] : INFO : Epoch 1887 | loss: 0.0311525 | val_loss: 0.0311697 | Time: 30379.5 ms [2022-01-18 07:59:29 main:574] : INFO : Epoch 1888 | loss: 0.0311488 | val_loss: 0.0311639 | Time: 31547.4 ms [2022-01-18 07:59:59 main:574] : INFO : Epoch 1889 | loss: 0.031148 | val_loss: 0.0311631 | Time: 29940.4 ms [2022-01-18 08:00:29 main:574] : INFO : Epoch 1890 | loss: 0.0311474 | val_loss: 0.0311655 | Time: 29602.9 ms [2022-01-18 08:00:57 main:574] : INFO : Epoch 1891 | loss: 0.0311496 | val_loss: 0.0311656 | Time: 27387.9 ms [2022-01-18 08:01:21 main:574] : INFO : Epoch 1892 | loss: 0.031152 | val_loss: 0.0311685 | Time: 23980.8 ms [2022-01-18 08:01:42 main:574] : INFO : Epoch 1893 | loss: 0.0311487 | val_loss: 0.0311654 | Time: 20471.2 ms [2022-01-18 08:02:08 main:574] : INFO : Epoch 1894 | loss: 0.0311458 | val_loss: 0.0311652 | Time: 26061.7 ms [2022-01-18 08:02:26 main:574] : INFO : Epoch 1895 | loss: 0.0311476 | val_loss: 0.0311617 | Time: 17884.5 ms [2022-01-18 08:02:45 main:574] : INFO : Epoch 1896 | loss: 0.031151 | val_loss: 0.0311642 | Time: 18724.1 ms [2022-01-18 08:03:03 main:574] : INFO : Epoch 1897 | loss: 0.0311488 | val_loss: 0.031166 | Time: 18178.1 ms [2022-01-18 08:03:21 main:574] : INFO : Epoch 1898 | loss: 0.0311496 | val_loss: 0.0311678 | Time: 17912.7 ms [2022-01-18 08:03:39 main:574] : INFO : Epoch 1899 | loss: 0.0311493 | val_loss: 0.0311654 | Time: 18088.3 ms [2022-01-18 08:03:57 main:574] : INFO : Epoch 1900 | loss: 0.0311511 | val_loss: 0.0311641 | Time: 17975.1 ms [2022-01-18 08:04:15 main:574] : INFO : Epoch 1901 | loss: 0.0311517 | val_loss: 0.0311628 | Time: 18078.3 ms [2022-01-18 08:04:34 main:574] : INFO : Epoch 1902 | loss: 0.0311492 | val_loss: 0.0311651 | Time: 18068.3 ms [2022-01-18 08:04:51 main:574] : INFO : Epoch 1903 | loss: 0.0311498 | val_loss: 0.0311616 | Time: 17484.5 ms [2022-01-18 08:05:09 main:574] : INFO : Epoch 1904 | loss: 0.0311473 | val_loss: 0.0311617 | Time: 17399.2 ms [2022-01-18 08:05:26 main:574] : INFO : Epoch 1905 | loss: 0.0311469 | val_loss: 0.0311605 | Time: 17189.9 ms [2022-01-18 08:05:44 main:574] : INFO : Epoch 1906 | loss: 0.0311449 | val_loss: 0.0311631 | Time: 18018.1 ms [2022-01-18 08:06:02 main:574] : INFO : Epoch 1907 | loss: 0.0311437 | val_loss: 0.0311611 | Time: 17478.3 ms [2022-01-18 08:06:19 main:574] : INFO : Epoch 1908 | loss: 0.031144 | val_loss: 0.0311607 | Time: 17755.6 ms [2022-01-18 08:06:37 main:574] : INFO : Epoch 1909 | loss: 0.0311466 | val_loss: 0.0311633 | Time: 17259.6 ms [2022-01-18 08:06:55 main:574] : INFO : Epoch 1910 | loss: 0.031146 | val_loss: 0.0311621 | Time: 18157.5 ms [2022-01-18 08:07:13 main:574] : INFO : Epoch 1911 | loss: 0.0311459 | val_loss: 0.0311613 | Time: 17729 ms [2022-01-18 08:07:30 main:574] : INFO : Epoch 1912 | loss: 0.0311465 | val_loss: 0.0311662 | Time: 17301.9 ms [2022-01-18 08:07:48 main:574] : INFO : Epoch 1913 | loss: 0.0311469 | val_loss: 0.0311619 | Time: 17812.9 ms [2022-01-18 08:08:06 main:574] : INFO : Epoch 1914 | loss: 0.0311478 | val_loss: 0.0311614 | Time: 17941.6 ms [2022-01-18 08:08:25 main:574] : INFO : Epoch 1915 | loss: 0.0311454 | val_loss: 0.0311615 | Time: 18906.4 ms [2022-01-18 08:08:44 main:574] : INFO : Epoch 1916 | loss: 0.0311438 | val_loss: 0.0311629 | Time: 18418.4 ms [2022-01-18 08:09:02 main:574] : INFO : Epoch 1917 | loss: 
0.0311431 | val_loss: 0.0311645 | Time: 18096.5 ms [2022-01-18 08:09:20 main:574] : INFO : Epoch 1918 | loss: 0.0311439 | val_loss: 0.0311656 | Time: 17954.9 ms [2022-01-18 08:09:38 main:574] : INFO : Epoch 1919 | loss: 0.0311451 | val_loss: 0.0311634 | Time: 17427.5 ms [2022-01-18 08:09:56 main:574] : INFO : Epoch 1920 | loss: 0.0311409 | val_loss: 0.0311605 | Time: 17903.4 ms [2022-01-18 08:10:14 main:574] : INFO : Epoch 1921 | loss: 0.0311402 | val_loss: 0.0311665 | Time: 17964.4 ms [2022-01-18 08:10:31 main:574] : INFO : Epoch 1922 | loss: 0.0311399 | val_loss: 0.0311627 | Time: 17347.8 ms [2022-01-18 08:10:48 main:574] : INFO : Epoch 1923 | loss: 0.0311404 | val_loss: 0.0311635 | Time: 17200.2 ms [2022-01-18 08:11:06 main:574] : INFO : Epoch 1924 | loss: 0.031141 | val_loss: 0.0311641 | Time: 17491.6 ms [2022-01-18 08:11:24 main:574] : INFO : Epoch 1925 | loss: 0.0311415 | val_loss: 0.0311651 | Time: 17489.9 ms [2022-01-18 08:11:42 main:574] : INFO : Epoch 1926 | loss: 0.031144 | val_loss: 0.0311692 | Time: 18774.2 ms [2022-01-18 08:12:00 main:574] : INFO : Epoch 1927 | loss: 0.0311411 | val_loss: 0.031164 | Time: 17710.7 ms [2022-01-18 08:12:18 main:574] : INFO : Epoch 1928 | loss: 0.0311403 | val_loss: 0.0311638 | Time: 17323.9 ms [2022-01-18 08:12:36 main:574] : INFO : Epoch 1929 | loss: 0.0311433 | val_loss: 0.0311673 | Time: 18521.2 ms [2022-01-18 08:12:55 main:574] : INFO : Epoch 1930 | loss: 0.0311443 | val_loss: 0.0311655 | Time: 18164.3 ms [2022-01-18 08:13:13 main:574] : INFO : Epoch 1931 | loss: 0.0311501 | val_loss: 0.0311637 | Time: 18301.9 ms [2022-01-18 08:13:31 main:574] : INFO : Epoch 1932 | loss: 0.0311455 | val_loss: 0.0311652 | Time: 17557 ms [2022-01-18 08:13:48 main:574] : INFO : Epoch 1933 | loss: 0.0311458 | val_loss: 0.0311667 | Time: 16980.7 ms [2022-01-18 08:14:07 main:574] : INFO : Epoch 1934 | loss: 0.0311428 | val_loss: 0.031165 | Time: 18781 ms [2022-01-18 08:14:25 main:574] : INFO : Epoch 1935 | loss: 0.0311417 | val_loss: 0.0311653 | Time: 18137.9 ms [2022-01-18 08:14:43 main:574] : INFO : Epoch 1936 | loss: 0.0311425 | val_loss: 0.0311607 | Time: 17794.6 ms [2022-01-18 08:15:01 main:574] : INFO : Epoch 1937 | loss: 0.0311426 | val_loss: 0.0311608 | Time: 17687.3 ms [2022-01-18 08:15:18 main:574] : INFO : Epoch 1938 | loss: 0.031143 | val_loss: 0.0311602 | Time: 17269.8 ms [2022-01-18 08:15:36 main:574] : INFO : Epoch 1939 | loss: 0.0311405 | val_loss: 0.0311629 | Time: 17760.5 ms [2022-01-18 08:15:54 main:574] : INFO : Epoch 1940 | loss: 0.0311399 | val_loss: 0.0311636 | Time: 17670 ms [2022-01-18 08:16:11 main:574] : INFO : Epoch 1941 | loss: 0.0311412 | val_loss: 0.0311644 | Time: 17375.8 ms [2022-01-18 08:16:29 main:574] : INFO : Epoch 1942 | loss: 0.0311419 | val_loss: 0.0311658 | Time: 17609.5 ms [2022-01-18 08:16:46 main:574] : INFO : Epoch 1943 | loss: 0.0311441 | val_loss: 0.0311618 | Time: 17547.3 ms [2022-01-18 08:17:05 main:574] : INFO : Epoch 1944 | loss: 0.0311415 | val_loss: 0.0311623 | Time: 18261.3 ms [2022-01-18 08:17:23 main:574] : INFO : Epoch 1945 | loss: 0.0311417 | val_loss: 0.0311643 | Time: 17867.6 ms [2022-01-18 08:17:41 main:574] : INFO : Epoch 1946 | loss: 0.0311475 | val_loss: 0.0311695 | Time: 17816.1 ms [2022-01-18 08:17:59 main:574] : INFO : Epoch 1947 | loss: 0.0311471 | val_loss: 0.0311644 | Time: 17698.9 ms [2022-01-18 08:18:17 main:574] : INFO : Epoch 1948 | loss: 0.0311412 | val_loss: 0.0311679 | Time: 17991.1 ms [2022-01-18 08:18:34 main:574] : INFO : Epoch 1949 | loss: 0.0311412 | val_loss: 0.031167 | Time: 
17684.4 ms [2022-01-18 08:18:52 main:574] : INFO : Epoch 1950 | loss: 0.0311518 | val_loss: 0.0311775 | Time: 17847.4 ms [2022-01-18 08:19:10 main:574] : INFO : Epoch 1951 | loss: 0.0311552 | val_loss: 0.0311739 | Time: 18034.5 ms [2022-01-18 08:19:28 main:574] : INFO : Epoch 1952 | loss: 0.031149 | val_loss: 0.0311651 | Time: 17165.9 ms [2022-01-18 08:19:46 main:574] : INFO : Epoch 1953 | loss: 0.0311508 | val_loss: 0.0311725 | Time: 17728.6 ms [2022-01-18 08:20:03 main:574] : INFO : Epoch 1954 | loss: 0.0311548 | val_loss: 0.0311676 | Time: 17543.4 ms [2022-01-18 08:20:21 main:574] : INFO : Epoch 1955 | loss: 0.0311533 | val_loss: 0.0311649 | Time: 17979.6 ms [2022-01-18 08:20:39 main:574] : INFO : Epoch 1956 | loss: 0.0311503 | val_loss: 0.0311709 | Time: 17699 ms [2022-01-18 08:20:57 main:574] : INFO : Epoch 1957 | loss: 0.031152 | val_loss: 0.0311643 | Time: 17562.9 ms [2022-01-18 08:21:15 main:574] : INFO : Epoch 1958 | loss: 0.0311535 | val_loss: 0.0311644 | Time: 17805.2 ms [2022-01-18 08:21:33 main:574] : INFO : Epoch 1959 | loss: 0.0311536 | val_loss: 0.0311617 | Time: 18374.8 ms [2022-01-18 08:21:51 main:574] : INFO : Epoch 1960 | loss: 0.0311496 | val_loss: 0.0311622 | Time: 17613.1 ms [2022-01-18 08:22:09 main:574] : INFO : Epoch 1961 | loss: 0.0311467 | val_loss: 0.0311611 | Time: 17565.2 ms [2022-01-18 08:22:26 main:574] : INFO : Epoch 1962 | loss: 0.0311442 | val_loss: 0.0311617 | Time: 17668.8 ms [2022-01-18 08:22:45 main:574] : INFO : Epoch 1963 | loss: 0.0311459 | val_loss: 0.0311641 | Time: 18557.3 ms [2022-01-18 08:23:03 main:574] : INFO : Epoch 1964 | loss: 0.0311491 | val_loss: 0.0311678 | Time: 18245.5 ms [2022-01-18 08:23:21 main:574] : INFO : Epoch 1965 | loss: 0.0311496 | val_loss: 0.0311607 | Time: 17388.3 ms [2022-01-18 08:23:39 main:574] : INFO : Epoch 1966 | loss: 0.0311535 | val_loss: 0.0311606 | Time: 18106.7 ms [2022-01-18 08:23:57 main:574] : INFO : Epoch 1967 | loss: 0.0311543 | val_loss: 0.0311603 | Time: 18136.6 ms [2022-01-18 08:24:16 main:574] : INFO : Epoch 1968 | loss: 0.0311536 | val_loss: 0.0311622 | Time: 19002.3 ms [2022-01-18 08:24:34 main:574] : INFO : Epoch 1969 | loss: 0.0311522 | val_loss: 0.031162 | Time: 17165.3 ms [2022-01-18 08:24:53 main:574] : INFO : Epoch 1970 | loss: 0.0311518 | val_loss: 0.0311619 | Time: 18904.3 ms [2022-01-18 08:25:10 main:574] : INFO : Epoch 1971 | loss: 0.0311479 | val_loss: 0.0311616 | Time: 17274.4 ms [2022-01-18 08:25:28 main:574] : INFO : Epoch 1972 | loss: 0.0311472 | val_loss: 0.0311648 | Time: 17650.6 ms [2022-01-18 08:25:47 main:574] : INFO : Epoch 1973 | loss: 0.0311467 | val_loss: 0.0311614 | Time: 18665.1 ms [2022-01-18 08:26:05 main:574] : INFO : Epoch 1974 | loss: 0.0311472 | val_loss: 0.0311649 | Time: 17846.6 ms [2022-01-18 08:26:22 main:574] : INFO : Epoch 1975 | loss: 0.031149 | val_loss: 0.0311607 | Time: 17141.4 ms [2022-01-18 08:26:39 main:574] : INFO : Epoch 1976 | loss: 0.0311478 | val_loss: 0.0311654 | Time: 17200.7 ms [2022-01-18 08:26:58 main:574] : INFO : Epoch 1977 | loss: 0.0311484 | val_loss: 0.0311701 | Time: 18767.1 ms [2022-01-18 08:27:16 main:574] : INFO : Epoch 1978 | loss: 0.0311543 | val_loss: 0.0311673 | Time: 17949.8 ms [2022-01-18 08:27:34 main:574] : INFO : Epoch 1979 | loss: 0.0311499 | val_loss: 0.031165 | Time: 17906.9 ms [2022-01-18 08:27:51 main:574] : INFO : Epoch 1980 | loss: 0.0311467 | val_loss: 0.0311682 | Time: 17286.8 ms [2022-01-18 08:28:10 main:574] : INFO : Epoch 1981 | loss: 0.0311466 | val_loss: 0.0311659 | Time: 18144.3 ms [2022-01-18 08:28:28 main:574] 
: INFO : Epoch 1982 | loss: 0.0311464 | val_loss: 0.0311651 | Time: 17773.8 ms [2022-01-18 08:28:46 main:574] : INFO : Epoch 1983 | loss: 0.0311471 | val_loss: 0.031164 | Time: 18115.4 ms [2022-01-18 08:29:03 main:574] : INFO : Epoch 1984 | loss: 0.031147 | val_loss: 0.0311674 | Time: 17619.3 ms [2022-01-18 08:29:21 main:574] : INFO : Epoch 1985 | loss: 0.0311437 | val_loss: 0.0311679 | Time: 17932.9 ms [2022-01-18 08:29:39 main:574] : INFO : Epoch 1986 | loss: 0.0311439 | val_loss: 0.0311613 | Time: 17270.1 ms [2022-01-18 08:29:58 main:574] : INFO : Epoch 1987 | loss: 0.0311463 | val_loss: 0.0311624 | Time: 18556.4 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB) [2022-01-18 08:45:39 main:435] : INFO : Set logging level to 1 [2022-01-18 08:45:39 main:441] : INFO : Running in BOINC Client mode [2022-01-18 08:45:39 main:444] : INFO : Resolving all filenames [2022-01-18 08:45:39 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-18 08:45:39 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-18 08:45:39 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-18 08:45:39 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-01-18 08:45:39 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-18 08:45:39 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-18 08:45:39 main:474] : INFO : Configuration: [2022-01-18 08:45:39 main:475] : INFO : Model type: GRU [2022-01-18 08:45:40 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-18 08:45:40 main:477] : INFO : Max Epochs: 2048 [2022-01-18 08:45:40 main:478] : INFO : Batch Size: 128 [2022-01-18 08:45:40 main:479] : INFO : Learning Rate: 0.01 [2022-01-18 08:45:40 main:480] : INFO : Patience: 10 [2022-01-18 08:45:40 main:481] : INFO : Hidden Width: 12 [2022-01-18 08:45:40 main:482] : INFO : # Recurrent Layers: 4 [2022-01-18 08:45:41 main:483] : INFO : # Backend Layers: 4 [2022-01-18 08:45:41 main:484] : INFO : # Threads: 1 [2022-01-18 08:45:41 main:486] : INFO : Preparing Dataset [2022-01-18 08:45:41 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-18 08:45:42 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-18 08:45:45 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-01-18 08:45:46 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-18 08:45:46 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-18 08:45:46 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-18 08:45:46 main:494] : INFO : Creating Model [2022-01-18 08:45:47 main:507] : INFO : Preparing config file [2022-01-18 08:45:47 main:511] : INFO : Found checkpoint, attempting to load... 
[2022-01-18 08:45:47 main:512] : INFO : Loading config [2022-01-18 08:45:47 main:514] : INFO : Loading state [2022-01-18 08:45:49 main:559] : INFO : Loading DataLoader into Memory [2022-01-18 08:45:49 main:562] : INFO : Starting Training [2022-01-18 08:46:08 main:574] : INFO : Epoch 1962 | loss: 0.0313036 | val_loss: 0.0312259 | Time: 18755 ms [2022-01-18 08:46:25 main:574] : INFO : Epoch 1963 | loss: 0.0311802 | val_loss: 0.0311861 | Time: 16960.4 ms [2022-01-18 08:46:42 main:574] : INFO : Epoch 1964 | loss: 0.0311525 | val_loss: 0.0311646 | Time: 17015.1 ms [2022-01-18 08:47:00 main:574] : INFO : Epoch 1965 | loss: 0.0311467 | val_loss: 0.0311657 | Time: 17522.9 ms [2022-01-18 08:47:30 main:574] : INFO : Epoch 1966 | loss: 0.0311498 | val_loss: 0.0311688 | Time: 30343.8 ms [2022-01-18 08:47:49 main:574] : INFO : Epoch 1967 | loss: 0.0311483 | val_loss: 0.0311676 | Time: 18810.7 ms [2022-01-18 08:48:07 main:574] : INFO : Epoch 1968 | loss: 0.0311451 | val_loss: 0.0311654 | Time: 17532.4 ms [2022-01-18 08:48:24 main:574] : INFO : Epoch 1969 | loss: 0.0311425 | val_loss: 0.0311644 | Time: 16980.9 ms [2022-01-18 08:48:43 main:574] : INFO : Epoch 1970 | loss: 0.0311406 | val_loss: 0.0311654 | Time: 19130.1 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB) [2022-01-18 08:50:45 main:435] : INFO : Set logging level to 1 [2022-01-18 08:50:45 main:441] : INFO : Running in BOINC Client mode [2022-01-18 08:50:46 main:444] : INFO : Resolving all filenames [2022-01-18 08:50:46 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-18 08:50:46 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-18 08:50:46 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-18 08:50:46 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-01-18 08:50:46 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-18 08:50:46 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-18 08:50:46 main:474] : INFO : Configuration: [2022-01-18 08:50:46 main:475] : INFO : Model type: GRU [2022-01-18 08:50:46 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-18 08:50:47 main:477] : INFO : Max Epochs: 2048 [2022-01-18 08:50:47 main:478] : INFO : Batch Size: 128 [2022-01-18 08:50:47 main:479] : INFO : Learning Rate: 0.01 [2022-01-18 08:50:48 main:480] : INFO : Patience: 10 [2022-01-18 08:50:48 main:481] : INFO : Hidden Width: 12 [2022-01-18 08:50:48 main:482] : INFO : # Recurrent Layers: 4 [2022-01-18 08:50:48 main:483] : INFO : # Backend Layers: 4 [2022-01-18 08:50:49 main:484] : INFO : # Threads: 1 [2022-01-18 08:50:49 main:486] : INFO : Preparing Dataset [2022-01-18 08:50:49 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-18 08:50:50 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-18 08:50:54 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-01-18 08:50:54 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-18 08:50:55 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-18 08:50:55 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. 
[2022-01-18 08:50:55 main:494] : INFO : Creating Model [2022-01-18 08:50:55 main:507] : INFO : Preparing config file [2022-01-18 08:50:55 main:511] : INFO : Found checkpoint, attempting to load... [2022-01-18 08:50:55 main:512] : INFO : Loading config [2022-01-18 08:50:56 main:514] : INFO : Loading state [2022-01-18 08:50:59 main:559] : INFO : Loading DataLoader into Memory [2022-01-18 08:50:59 main:562] : INFO : Starting Training [2022-01-18 08:51:17 main:574] : INFO : Epoch 1962 | loss: 0.0312945 | val_loss: 0.0312346 | Time: 18116.6 ms [2022-01-18 08:51:34 main:574] : INFO : Epoch 1963 | loss: 0.0311897 | val_loss: 0.031181 | Time: 16760.2 ms [2022-01-18 08:51:52 main:574] : INFO : Epoch 1964 | loss: 0.0311539 | val_loss: 0.0311654 | Time: 17784.2 ms [2022-01-18 08:52:10 main:574] : INFO : Epoch 1965 | loss: 0.031147 | val_loss: 0.0311709 | Time: 17891.4 ms [2022-01-18 08:52:29 main:574] : INFO : Epoch 1966 | loss: 0.0311516 | val_loss: 0.0311646 | Time: 18472.7 ms [2022-01-18 08:52:47 main:574] : INFO : Epoch 1967 | loss: 0.0311461 | val_loss: 0.0311665 | Time: 18046 ms [2022-01-18 08:53:04 main:574] : INFO : Epoch 1968 | loss: 0.0311457 | val_loss: 0.0311691 | Time: 17417.7 ms [2022-01-18 08:53:22 main:574] : INFO : Epoch 1969 | loss: 0.0311438 | val_loss: 0.0311636 | Time: 17798 ms [2022-01-18 08:53:40 main:574] : INFO : Epoch 1970 | loss: 0.0311459 | val_loss: 0.0311693 | Time: 17257.5 ms [2022-01-18 08:53:59 main:574] : INFO : Epoch 1971 | loss: 0.0311418 | val_loss: 0.0311659 | Time: 19119.6 ms [2022-01-18 08:54:17 main:574] : INFO : Epoch 1972 | loss: 0.0311403 | val_loss: 0.031171 | Time: 17626.8 ms [2022-01-18 08:54:34 main:574] : INFO : Epoch 1973 | loss: 0.0311446 | val_loss: 0.0311697 | Time: 17503.5 ms [2022-01-18 08:54:52 main:574] : INFO : Epoch 1974 | loss: 0.0311431 | val_loss: 0.0311691 | Time: 17633.9 ms [2022-01-18 08:55:17 main:574] : INFO : Epoch 1975 | loss: 0.0311425 | val_loss: 0.0311675 | Time: 25184.4 ms [2022-01-18 08:55:47 main:574] : INFO : Epoch 1976 | loss: 0.0311416 | val_loss: 0.0311679 | Time: 29479.5 ms [2022-01-18 08:56:09 main:574] : INFO : Epoch 1977 | loss: 0.0311423 | val_loss: 0.0311645 | Time: 21705.7 ms [2022-01-18 08:56:27 main:574] : INFO : Epoch 1978 | loss: 0.0311407 | val_loss: 0.0311633 | Time: 17527.6 ms [2022-01-18 08:56:45 main:574] : INFO : Epoch 1979 | loss: 0.0311433 | val_loss: 0.0311623 | Time: 18113.7 ms [2022-01-18 08:57:03 main:574] : INFO : Epoch 1980 | loss: 0.031142 | val_loss: 0.0311707 | Time: 17634.1 ms [2022-01-18 08:57:21 main:574] : INFO : Epoch 1981 | loss: 0.0311451 | val_loss: 0.0311646 | Time: 17758.6 ms [2022-01-18 08:57:40 main:574] : INFO : Epoch 1982 | loss: 0.0311436 | val_loss: 0.0311624 | Time: 19381.8 ms [2022-01-18 08:57:59 main:574] : INFO : Epoch 1983 | loss: 0.0311444 | val_loss: 0.0311646 | Time: 18463.5 ms [2022-01-18 08:58:17 main:574] : INFO : Epoch 1984 | loss: 0.0311443 | val_loss: 0.0311643 | Time: 18329.8 ms [2022-01-18 08:58:35 main:574] : INFO : Epoch 1985 | loss: 0.0311435 | val_loss: 0.0311644 | Time: 17491.3 ms [2022-01-18 08:58:53 main:574] : INFO : Epoch 1986 | loss: 0.0311431 | val_loss: 0.0311682 | Time: 17910 ms [2022-01-18 08:59:10 main:574] : INFO : Epoch 1987 | loss: 0.0311474 | val_loss: 0.0311656 | Time: 17612.3 ms [2022-01-18 08:59:28 main:574] : INFO : Epoch 1988 | loss: 0.0311445 | val_loss: 0.0311677 | Time: 17739.6 ms [2022-01-18 08:59:48 main:574] : INFO : Epoch 1989 | loss: 0.0311462 | val_loss: 0.0311606 | Time: 19347.7 ms [2022-01-18 09:00:05 main:574] : INFO : Epoch 
1990 | loss: 0.0311431 | val_loss: 0.0311689 | Time: 17788.6 ms [2022-01-18 09:00:24 main:574] : INFO : Epoch 1991 | loss: 0.0311423 | val_loss: 0.0311619 | Time: 18252.8 ms [2022-01-18 09:00:41 main:574] : INFO : Epoch 1992 | loss: 0.0311421 | val_loss: 0.0311657 | Time: 17500.6 ms [2022-01-18 09:01:12 main:574] : INFO : Epoch 1993 | loss: 0.0311407 | val_loss: 0.031158 | Time: 30197 ms [2022-01-18 09:01:41 main:574] : INFO : Epoch 1994 | loss: 0.0311389 | val_loss: 0.031162 | Time: 29117.9 ms [2022-01-18 09:02:11 main:574] : INFO : Epoch 1995 | loss: 0.0311391 | val_loss: 0.0311646 | Time: 29720.1 ms [2022-01-18 09:02:37 main:574] : INFO : Epoch 1996 | loss: 0.0311391 | val_loss: 0.0311624 | Time: 26285.6 ms [2022-01-18 09:03:02 main:574] : INFO : Epoch 1997 | loss: 0.0311381 | val_loss: 0.0311666 | Time: 24804.6 ms [2022-01-18 09:03:31 main:574] : INFO : Epoch 1998 | loss: 0.0311364 | val_loss: 0.0311661 | Time: 28766.9 ms [2022-01-18 09:04:00 main:574] : INFO : Epoch 1999 | loss: 0.0311394 | val_loss: 0.0311629 | Time: 29588.5 ms [2022-01-18 09:04:25 main:574] : INFO : Epoch 2000 | loss: 0.0311419 | val_loss: 0.0311627 | Time: 24927.4 ms [2022-01-18 09:04:51 main:574] : INFO : Epoch 2001 | loss: 0.0311394 | val_loss: 0.0311653 | Time: 25545.1 ms [2022-01-18 09:05:17 main:574] : INFO : Epoch 2002 | loss: 0.0311422 | val_loss: 0.0311731 | Time: 25325.1 ms [2022-01-18 09:05:42 main:574] : INFO : Epoch 2003 | loss: 0.0311447 | val_loss: 0.0311666 | Time: 25114 ms [2022-01-18 09:06:11 main:574] : INFO : Epoch 2004 | loss: 0.0311468 | val_loss: 0.0311652 | Time: 28613.5 ms [2022-01-18 09:06:37 main:574] : INFO : Epoch 2005 | loss: 0.0311472 | val_loss: 0.0311644 | Time: 26664 ms [2022-01-18 09:07:07 main:574] : INFO : Epoch 2006 | loss: 0.0311447 | val_loss: 0.0311634 | Time: 29081.2 ms [2022-01-18 09:07:37 main:574] : INFO : Epoch 2007 | loss: 0.0311449 | val_loss: 0.0311666 | Time: 29506.4 ms [2022-01-18 09:08:02 main:574] : INFO : Epoch 2008 | loss: 0.0311444 | val_loss: 0.0311644 | Time: 25137.9 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB) [2022-01-18 09:21:34 main:435] : INFO : Set logging level to 1 [2022-01-18 09:21:34 main:441] : INFO : Running in BOINC Client mode [2022-01-18 09:21:34 main:444] : INFO : Resolving all filenames [2022-01-18 09:21:34 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-18 09:21:34 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-18 09:21:35 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-18 09:21:35 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-01-18 09:21:35 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-18 09:21:35 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-18 09:21:35 main:474] : INFO : Configuration: [2022-01-18 09:21:35 main:475] : INFO : Model type: GRU [2022-01-18 09:21:36 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-18 09:21:36 main:477] : INFO : Max Epochs: 2048 [2022-01-18 09:21:36 main:478] : INFO : Batch Size: 128 [2022-01-18 09:21:36 main:479] : INFO : Learning Rate: 0.01 [2022-01-18 09:21:36 main:480] : INFO : Patience: 10 [2022-01-18 09:21:37 main:481] : INFO : Hidden Width: 12 [2022-01-18 09:21:37 main:482] : INFO : # Recurrent Layers: 4 [2022-01-18 09:21:37 main:483] : INFO : # Backend Layers: 4 [2022-01-18 09:21:37 main:484] : INFO : # Threads: 1 [2022-01-18 09:21:37 main:486] : 
INFO : Preparing Dataset [2022-01-18 09:21:38 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-18 09:21:39 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-18 09:21:47 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-01-18 09:21:48 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-18 09:21:48 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-18 09:21:48 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-18 09:21:49 main:494] : INFO : Creating Model [2022-01-18 09:21:49 main:507] : INFO : Preparing config file [2022-01-18 09:21:49 main:511] : INFO : Found checkpoint, attempting to load... [2022-01-18 09:21:49 main:512] : INFO : Loading config [2022-01-18 09:21:50 main:514] : INFO : Loading state [2022-01-18 09:21:52 main:559] : INFO : Loading DataLoader into Memory [2022-01-18 09:21:52 main:562] : INFO : Starting Training [2022-01-18 09:22:22 main:574] : INFO : Epoch 1962 | loss: 0.0312888 | val_loss: 0.0312148 | Time: 29820.5 ms [2022-01-18 09:22:50 main:574] : INFO : Epoch 1963 | loss: 0.0311835 | val_loss: 0.0311681 | Time: 27481.6 ms [2022-01-18 09:23:14 main:574] : INFO : Epoch 1964 | loss: 0.0311497 | val_loss: 0.031166 | Time: 24439.5 ms [2022-01-18 09:23:41 main:574] : INFO : Epoch 1965 | loss: 0.0311492 | val_loss: 0.0311674 | Time: 26824.3 ms [2022-01-18 09:24:12 main:574] : INFO : Epoch 1966 | loss: 0.0311473 | val_loss: 0.0311636 | Time: 30652 ms [2022-01-18 09:24:37 main:574] : INFO : Epoch 1967 | loss: 0.0311445 | val_loss: 0.0311626 | Time: 25028.5 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB) [2022-01-18 09:26:42 main:435] : INFO : Set logging level to 1 [2022-01-18 09:26:42 main:441] : INFO : Running in BOINC Client mode [2022-01-18 09:26:43 main:444] : INFO : Resolving all filenames [2022-01-18 09:26:43 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-18 09:26:43 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-18 09:26:43 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-18 09:26:43 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-01-18 09:26:43 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-18 09:26:43 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-18 09:26:43 main:474] : INFO : Configuration: [2022-01-18 09:26:43 main:475] : INFO : Model type: GRU [2022-01-18 09:26:43 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-18 09:26:43 main:477] : INFO : Max Epochs: 2048 [2022-01-18 09:26:43 main:478] : INFO : Batch Size: 128 [2022-01-18 09:26:44 main:479] : INFO : Learning Rate: 0.01 [2022-01-18 09:26:44 main:480] : INFO : Patience: 10 [2022-01-18 09:26:45 main:481] : INFO : Hidden Width: 12 [2022-01-18 09:26:45 main:482] : INFO : # Recurrent Layers: 4 [2022-01-18 09:26:45 main:483] : INFO : # Backend Layers: 4 [2022-01-18 09:26:45 main:484] : INFO : # Threads: 1 [2022-01-18 09:26:45 main:486] : INFO : Preparing Dataset [2022-01-18 09:26:46 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-18 09:26:47 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-18 09:26:54 load:106] : INFO : 
Successfully loaded dataset of 2048 examples into memory. [2022-01-18 09:26:54 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-18 09:26:55 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-18 09:26:55 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-18 09:26:55 main:494] : INFO : Creating Model [2022-01-18 09:26:55 main:507] : INFO : Preparing config file [2022-01-18 09:26:56 main:511] : INFO : Found checkpoint, attempting to load... [2022-01-18 09:26:56 main:512] : INFO : Loading config [2022-01-18 09:26:56 main:514] : INFO : Loading state [2022-01-18 09:27:02 main:559] : INFO : Loading DataLoader into Memory [2022-01-18 09:27:03 main:562] : INFO : Starting Training [2022-01-18 09:27:29 main:574] : INFO : Epoch 1962 | loss: 0.0312954 | val_loss: 0.0312245 | Time: 26134.3 ms [2022-01-18 09:27:53 main:574] : INFO : Epoch 1963 | loss: 0.0311837 | val_loss: 0.0311825 | Time: 23854.4 ms [2022-01-18 09:28:20 main:574] : INFO : Epoch 1964 | loss: 0.0311556 | val_loss: 0.0311664 | Time: 27287.8 ms [2022-01-18 09:28:44 main:574] : INFO : Epoch 1965 | loss: 0.0311497 | val_loss: 0.031164 | Time: 23482.1 ms [2022-01-18 09:29:10 main:574] : INFO : Epoch 1966 | loss: 0.0311466 | val_loss: 0.0311631 | Time: 26088.3 ms [2022-01-18 09:29:34 main:574] : INFO : Epoch 1967 | loss: 0.0311447 | val_loss: 0.031163 | Time: 23471.4 ms [2022-01-18 09:30:03 main:574] : INFO : Epoch 1968 | loss: 0.0311427 | val_loss: 0.0311619 | Time: 29103.2 ms [2022-01-18 09:30:26 main:574] : INFO : Epoch 1969 | loss: 0.0311429 | val_loss: 0.0311644 | Time: 23405.5 ms [2022-01-18 09:30:55 main:574] : INFO : Epoch 1970 | loss: 0.0311454 | val_loss: 0.0311614 | Time: 28281.2 ms [2022-01-18 09:31:19 main:574] : INFO : Epoch 1971 | loss: 0.031145 | val_loss: 0.0311672 | Time: 24505.8 ms [2022-01-18 09:31:44 main:574] : INFO : Epoch 1972 | loss: 0.0311505 | val_loss: 0.0311624 | Time: 25168.6 ms [2022-01-18 09:32:08 main:574] : INFO : Epoch 1973 | loss: 0.0311484 | val_loss: 0.0311765 | Time: 23630.4 ms [2022-01-18 09:32:33 main:574] : INFO : Epoch 1974 | loss: 0.031153 | val_loss: 0.0311695 | Time: 25020.4 ms [2022-01-18 09:32:59 main:574] : INFO : Epoch 1975 | loss: 0.0311465 | val_loss: 0.0311684 | Time: 26090.8 ms [2022-01-18 09:33:24 main:574] : INFO : Epoch 1976 | loss: 0.0311439 | val_loss: 0.0311631 | Time: 23903.7 ms [2022-01-18 09:33:50 main:574] : INFO : Epoch 1977 | loss: 0.0311454 | val_loss: 0.0311646 | Time: 26331.5 ms [2022-01-18 09:34:15 main:574] : INFO : Epoch 1978 | loss: 0.031145 | val_loss: 0.0311675 | Time: 24358.4 ms [2022-01-18 09:34:39 main:574] : INFO : Epoch 1979 | loss: 0.0311466 | val_loss: 0.0311648 | Time: 24050.6 ms [2022-01-18 09:35:06 main:574] : INFO : Epoch 1980 | loss: 0.0311467 | val_loss: 0.0311643 | Time: 27581.9 ms [2022-01-18 09:35:32 main:574] : INFO : Epoch 1981 | loss: 0.0311456 | val_loss: 0.0311682 | Time: 25495.3 ms [2022-01-18 09:35:56 main:574] : INFO : Epoch 1982 | loss: 0.0311455 | val_loss: 0.0311636 | Time: 24018.6 ms [2022-01-18 09:36:24 main:574] : INFO : Epoch 1983 | loss: 0.0311436 | val_loss: 0.0311625 | Time: 27211.4 ms [2022-01-18 09:36:49 main:574] : INFO : Epoch 1984 | loss: 0.0311449 | val_loss: 0.0311616 | Time: 25398.8 ms [2022-01-18 09:37:15 main:574] : INFO : Epoch 1985 | loss: 0.0311466 | val_loss: 0.0311678 | Time: 25440.8 ms [2022-01-18 09:37:35 main:574] : INFO : Epoch 1986 | loss: 0.0311475 | val_loss: 0.0311647 | Time: 20027.8 ms 
[2022-01-18 09:38:03 main:574] : INFO : Epoch 1987 | loss: 0.0311472 | val_loss: 0.0311656 | Time: 27957.6 ms [2022-01-18 09:38:31 main:574] : INFO : Epoch 1988 | loss: 0.0311471 | val_loss: 0.0311626 | Time: 27611.1 ms [2022-01-18 09:39:14 main:574] : INFO : Epoch 1989 | loss: 0.0311436 | val_loss: 0.0311673 | Time: 43040.4 ms [2022-01-18 09:39:46 main:574] : INFO : Epoch 1990 | loss: 0.0311418 | val_loss: 0.0311634 | Time: 31521 ms [2022-01-18 09:40:15 main:574] : INFO : Epoch 1991 | loss: 0.0311421 | val_loss: 0.0311647 | Time: 29443.4 ms [2022-01-18 09:40:45 main:574] : INFO : Epoch 1992 | loss: 0.0311411 | val_loss: 0.0311596 | Time: 29906.7 ms [2022-01-18 09:41:15 main:574] : INFO : Epoch 1993 | loss: 0.0311421 | val_loss: 0.0311576 | Time: 29470.1 ms [2022-01-18 09:41:46 main:574] : INFO : Epoch 1994 | loss: 0.0311402 | val_loss: 0.0311629 | Time: 30435.7 ms [2022-01-18 09:42:16 main:574] : INFO : Epoch 1995 | loss: 0.0311385 | val_loss: 0.0311621 | Time: 29663.1 ms [2022-01-18 09:42:46 main:574] : INFO : Epoch 1996 | loss: 0.031138 | val_loss: 0.0311599 | Time: 29983.4 ms [2022-01-18 09:43:18 main:574] : INFO : Epoch 1997 | loss: 0.0311389 | val_loss: 0.0311594 | Time: 32503.7 ms [2022-01-18 09:43:50 main:574] : INFO : Epoch 1998 | loss: 0.0311388 | val_loss: 0.0311619 | Time: 31898.4 ms [2022-01-18 09:44:22 main:574] : INFO : Epoch 1999 | loss: 0.0311383 | val_loss: 0.0311629 | Time: 31645.9 ms [2022-01-18 09:44:55 main:574] : INFO : Epoch 2000 | loss: 0.03114 | val_loss: 0.0311594 | Time: 32275.4 ms [2022-01-18 09:45:25 main:574] : INFO : Epoch 2001 | loss: 0.0311394 | val_loss: 0.0311651 | Time: 30325.3 ms [2022-01-18 09:45:55 main:574] : INFO : Epoch 2002 | loss: 0.0311373 | val_loss: 0.0311615 | Time: 30026.7 ms [2022-01-18 09:46:25 main:574] : INFO : Epoch 2003 | loss: 0.0311377 | val_loss: 0.0311611 | Time: 30122.7 ms [2022-01-18 09:46:56 main:574] : INFO : Epoch 2004 | loss: 0.0311387 | val_loss: 0.0311627 | Time: 30136.1 ms [2022-01-18 09:47:25 main:574] : INFO : Epoch 2005 | loss: 0.0311381 | val_loss: 0.0311633 | Time: 29787.9 ms [2022-01-18 09:47:55 main:574] : INFO : Epoch 2006 | loss: 0.0311373 | val_loss: 0.0311626 | Time: 29836.7 ms [2022-01-18 09:48:24 main:574] : INFO : Epoch 2007 | loss: 0.0311406 | val_loss: 0.0311611 | Time: 28628.4 ms [2022-01-18 09:48:56 main:574] : INFO : Epoch 2008 | loss: 0.0311395 | val_loss: 0.0311681 | Time: 30941.2 ms [2022-01-18 09:49:26 main:574] : INFO : Epoch 2009 | loss: 0.0311382 | val_loss: 0.0311663 | Time: 29883.2 ms [2022-01-18 09:49:56 main:574] : INFO : Epoch 2010 | loss: 0.0311401 | val_loss: 0.0311679 | Time: 29926.3 ms [2022-01-18 09:50:25 main:574] : INFO : Epoch 2011 | loss: 0.031141 | val_loss: 0.0311684 | Time: 29497.3 ms [2022-01-18 09:51:03 main:574] : INFO : Epoch 2012 | loss: 0.0311511 | val_loss: 0.0311699 | Time: 37865 ms [2022-01-18 09:51:34 main:574] : INFO : Epoch 2013 | loss: 0.0311739 | val_loss: 0.0311762 | Time: 30571.1 ms [2022-01-18 09:52:06 main:574] : INFO : Epoch 2014 | loss: 0.031181 | val_loss: 0.0311757 | Time: 31704.7 ms [2022-01-18 09:52:36 main:574] : INFO : Epoch 2015 | loss: 0.0311794 | val_loss: 0.0311739 | Time: 30214.7 ms [2022-01-18 09:53:07 main:574] : INFO : Epoch 2016 | loss: 0.0311758 | val_loss: 0.0311681 | Time: 30108.5 ms [2022-01-18 09:53:38 main:574] : INFO : Epoch 2017 | loss: 0.0311756 | val_loss: 0.0311763 | Time: 31147.1 ms [2022-01-18 09:53:57 main:574] : INFO : Epoch 2018 | loss: 0.0311744 | val_loss: 0.0311662 | Time: 19020.9 ms [2022-01-18 09:54:25 main:574] : INFO : 
Epoch 2019 | loss: 0.0311736 | val_loss: 0.0311691 | Time: 28046.8 ms [2022-01-18 09:54:43 main:574] : INFO : Epoch 2020 | loss: 0.0311708 | val_loss: 0.0311703 | Time: 18376.2 ms [2022-01-18 09:55:02 main:574] : INFO : Epoch 2021 | loss: 0.0311714 | val_loss: 0.0311697 | Time: 17921 ms [2022-01-18 09:55:19 main:574] : INFO : Epoch 2022 | loss: 0.0311701 | val_loss: 0.0311693 | Time: 17620.9 ms [2022-01-18 09:55:37 main:574] : INFO : Epoch 2023 | loss: 0.0311692 | val_loss: 0.0311685 | Time: 17719.7 ms [2022-01-18 09:55:55 main:574] : INFO : Epoch 2024 | loss: 0.0311675 | val_loss: 0.0311684 | Time: 18138.1 ms [2022-01-18 09:56:14 main:574] : INFO : Epoch 2025 | loss: 0.0311685 | val_loss: 0.0311684 | Time: 18043.7 ms [2022-01-18 09:56:32 main:574] : INFO : Epoch 2026 | loss: 0.0311686 | val_loss: 0.0311691 | Time: 18396.9 ms [2022-01-18 09:56:50 main:574] : INFO : Epoch 2027 | loss: 0.0311706 | val_loss: 0.0311667 | Time: 18282.1 ms [2022-01-18 09:57:08 main:574] : INFO : Epoch 2028 | loss: 0.0311687 | val_loss: 0.0311689 | Time: 17321.3 ms [2022-01-18 09:57:26 main:574] : INFO : Epoch 2029 | loss: 0.031168 | val_loss: 0.0311667 | Time: 18553.4 ms [2022-01-18 09:57:45 main:574] : INFO : Epoch 2030 | loss: 0.0311665 | val_loss: 0.0311652 | Time: 18613.6 ms [2022-01-18 09:58:03 main:574] : INFO : Epoch 2031 | loss: 0.031168 | val_loss: 0.0311668 | Time: 17941.2 ms [2022-01-18 09:58:21 main:574] : INFO : Epoch 2032 | loss: 0.0311679 | val_loss: 0.0311681 | Time: 17528.1 ms [2022-01-18 09:58:38 main:574] : INFO : Epoch 2033 | loss: 0.0311656 | val_loss: 0.0311681 | Time: 17439.3 ms [2022-01-18 09:58:56 main:574] : INFO : Epoch 2034 | loss: 0.0311695 | val_loss: 0.0311663 | Time: 17946.4 ms [2022-01-18 09:59:14 main:574] : INFO : Epoch 2035 | loss: 0.0311673 | val_loss: 0.0311654 | Time: 17586.2 ms [2022-01-18 09:59:32 main:574] : INFO : Epoch 2036 | loss: 0.031165 | val_loss: 0.0311634 | Time: 17602.5 ms [2022-01-18 09:59:49 main:574] : INFO : Epoch 2037 | loss: 0.0311616 | val_loss: 0.0311613 | Time: 17492.2 ms [2022-01-18 10:00:08 main:574] : INFO : Epoch 2038 | loss: 0.0311633 | val_loss: 0.0311677 | Time: 17895.1 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB) [2022-01-18 12:25:53 main:435] : INFO : Set logging level to 1 [2022-01-18 12:25:53 main:441] : INFO : Running in BOINC Client mode [2022-01-18 12:25:53 main:444] : INFO : Resolving all filenames [2022-01-18 12:25:53 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-18 12:25:54 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-18 12:25:54 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-18 12:25:54 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-01-18 12:25:54 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-18 12:25:54 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-18 12:25:54 main:474] : INFO : Configuration: [2022-01-18 12:25:54 main:475] : INFO : Model type: GRU [2022-01-18 12:25:54 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-18 12:25:54 main:477] : INFO : Max Epochs: 2048 [2022-01-18 12:25:54 main:478] : INFO : Batch Size: 128 [2022-01-18 12:25:54 main:479] : INFO : Learning Rate: 0.01 [2022-01-18 12:25:54 main:480] : INFO : Patience: 10 [2022-01-18 12:25:54 main:481] : INFO : Hidden Width: 12 [2022-01-18 12:25:54 main:482] : INFO : # Recurrent Layers: 4 [2022-01-18 12:25:54 main:483] 
: INFO : # Backend Layers: 4 [2022-01-18 12:25:54 main:484] : INFO : # Threads: 1 [2022-01-18 12:25:55 main:486] : INFO : Preparing Dataset [2022-01-18 12:25:55 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-18 12:25:56 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-18 12:26:05 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-01-18 12:26:05 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-18 12:26:05 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-18 12:26:06 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-18 12:26:06 main:494] : INFO : Creating Model [2022-01-18 12:26:06 main:507] : INFO : Preparing config file [2022-01-18 12:26:07 main:511] : INFO : Found checkpoint, attempting to load... [2022-01-18 12:26:07 main:512] : INFO : Loading config [2022-01-18 12:26:07 main:514] : INFO : Loading state [2022-01-18 12:26:13 main:559] : INFO : Loading DataLoader into Memory [2022-01-18 12:26:13 main:562] : INFO : Starting Training [2022-01-18 12:26:43 main:574] : INFO : Epoch 2039 | loss: 0.0312625 | val_loss: 0.031183 | Time: 29768.2 ms [2022-01-18 12:27:10 main:574] : INFO : Epoch 2040 | loss: 0.0311776 | val_loss: 0.0311706 | Time: 26637.7 ms [2022-01-18 12:27:35 main:574] : INFO : Epoch 2041 | loss: 0.0311725 | val_loss: 0.0311781 | Time: 24169.1 ms [2022-01-18 12:28:01 main:574] : INFO : Epoch 2042 | loss: 0.0311686 | val_loss: 0.0311692 | Time: 26648.1 ms [2022-01-18 12:28:28 main:574] : INFO : Epoch 2043 | loss: 0.031165 | val_loss: 0.0311643 | Time: 26097.3 ms [2022-01-18 12:28:54 main:574] : INFO : Epoch 2044 | loss: 0.0311649 | val_loss: 0.0311739 | Time: 25996 ms [2022-01-18 12:29:18 main:574] : INFO : Epoch 2045 | loss: 0.0311639 | val_loss: 0.0311656 | Time: 24473.4 ms [2022-01-18 12:29:45 main:574] : INFO : Epoch 2046 | loss: 0.0311637 | val_loss: 0.0311707 | Time: 26282.4 ms [2022-01-18 12:30:14 main:574] : INFO : Epoch 2047 | loss: 0.0311635 | val_loss: 0.0311665 | Time: 29381 ms [2022-01-18 12:30:38 main:574] : INFO : Epoch 2048 | loss: 0.0311631 | val_loss: 0.0311675 | Time: 23972.5 ms [2022-01-18 12:30:38 main:597] : INFO : Saving trained model to model-final.pt, val_loss 0.0311675 [2022-01-18 12:30:39 main:603] : INFO : Saving end state to config to file [2022-01-18 12:30:39 main:608] : INFO : Success, exiting.. 12:30:39 (8932): called boinc_finish(0) </stderr_txt> ]]>
©2022 MLC@Home Team
A project of the Cognition, Robotics, and Learning (CORAL) Lab at the University of Maryland, Baltimore County (UMBC)