| Field | Value |
| --- | --- |
| Name | ParityModified-1647041472-8266-0_0 |
| Workunit | 10087381 |
| Created | 11 Mar 2022, 23:31:15 UTC |
| Sent | 20 Mar 2022, 15:30:34 UTC |
| Report deadline | 28 Mar 2022, 15:30:34 UTC |
| Received | 30 Mar 2022, 1:38:25 UTC |
| Server state | Over |
| Outcome | Success |
| Client state | Done |
| Exit status | 0 (0x00000000) |
| Computer ID | 14113 |
| Run time | 4 hours 13 min 44 sec |
| CPU time | 4 hours 12 min 5 sec |
| Validate state | Task was reported too late to validate |
| Credit | 0.00 |
| Device peak FLOPS | 8,298.90 GFLOPS |
| Application version | Machine Learning Dataset Generator (GPU) v9.75 (cuda10200) windows_x86_64 |
| Peak working set size | 2.91 GB |
| Peak swap size | 4.68 GB |
| Peak disk usage | 1.54 GB |
<core_client_version>7.16.20</core_client_version>
<![CDATA[
<stderr_txt>
: INFO : Epoch 1784 | loss: 0.0311184 | val_loss: 0.0311701 | Time: 2434.24 ms
[2022-03-26 12:53:52 main:574] : INFO : Epoch 1785 | loss: 0.0311172 | val_loss: 0.0311707 | Time: 2609.22 ms
[2022-03-26 12:53:54 main:574] : INFO : Epoch 1786 | loss: 0.0311187 | val_loss: 0.0311736 | Time: 2426.05 ms
[2022-03-26 12:53:56 main:574] : INFO : Epoch 1787 | loss: 0.0311192 | val_loss: 0.0311713 | Time: 2448.96 ms
[2022-03-26 12:53:59 main:574] : INFO : Epoch 1788 | loss: 0.0311196 | val_loss: 0.0311674 | Time: 2485.95 ms
[2022-03-26 12:54:01 main:574] : INFO : Epoch 1789 | loss: 0.0311198 | val_loss: 0.0311723 | Time: 2291.74 ms
[2022-03-26 12:54:04 main:574] : INFO : Epoch 1790 | loss: 0.031117 | val_loss: 0.0311729 | Time: 2470.27 ms
[2022-03-26 12:54:06 main:574] : INFO : Epoch 1791 | loss: 0.0311182 | val_loss: 0.0311696 | Time: 2519.54 ms
[2022-03-26 12:54:09 main:574] : INFO : Epoch 1792 | loss: 0.0311216 | val_loss: 0.0311748 | Time: 2393.98 ms
[2022-03-26 12:54:11 main:574] : INFO : Epoch 1793 | loss: 0.0311225 | val_loss: 0.0311707 | Time: 2471.17 ms
[2022-03-26 12:54:14 main:574] : INFO : Epoch 1794 | loss: 0.0311276 | val_loss: 0.0311733 | Time: 2486.37 ms
[2022-03-26 12:54:16 main:574] : INFO : Epoch 1795 | loss: 0.0311261 | val_loss: 0.0311681 | Time: 2516.63 ms
[2022-03-26 12:54:19 main:574] : INFO : Epoch 1796 | loss: 0.0311236 | val_loss: 0.0311712 | Time: 2473.84 ms
[2022-03-26 12:54:21 main:574] : INFO : Epoch 1797 | loss: 0.0311246 | val_loss: 0.0311688 | Time: 2448.6 ms
[2022-03-26 12:54:24 main:574] : INFO : Epoch 1798 | loss: 0.0311237 | val_loss: 0.0311767 | Time: 2508.08 ms
[2022-03-26 12:54:26 main:574] : INFO : Epoch 1799 | loss: 0.0311253 | val_loss: 0.0311723 | Time: 2356.41 ms
[2022-03-26 12:54:28 main:574] : INFO : Epoch 1800 | loss: 0.0311219 | val_loss: 0.0311724 | Time: 2548.66 ms
[2022-03-26 12:54:31 main:574] : INFO : Epoch 1801 | loss: 0.0311283 | val_loss: 0.0311708 | Time: 2452.23 ms
[2022-03-26 12:54:33 main:574] : INFO : Epoch 1802 | loss: 0.0311351 | val_loss: 0.031175 | Time: 2494.72 ms
[2022-03-26 12:54:36 main:574] : INFO : Epoch 1803 | loss: 0.0311376 | val_loss: 0.0311694 | Time: 2540.37 ms
[2022-03-26 12:54:38 main:574] : INFO : Epoch 1804 | loss: 0.031137 | val_loss: 0.0311729 | Time: 2410.33 ms
[2022-03-26 12:54:41 main:574] : INFO : Epoch 1805 | loss: 0.0311283 | val_loss: 0.0311707 | Time: 2501.91 ms
[2022-03-26 12:54:43 main:574] : INFO : Epoch 1806 | loss: 0.0311304 | val_loss: 0.0311713 | Time: 2544.8 ms
[2022-03-26 12:54:46 main:574] : INFO : Epoch 1807 | loss: 0.0311318 | val_loss: 0.031169 | Time: 2372.59 ms
[2022-03-26 12:54:48 main:574] : INFO : Epoch 1808 | loss: 0.0311283 | val_loss: 0.0311678 | Time: 2605.91 ms
[2022-03-26 12:54:51 main:574] : INFO : Epoch 1809 | loss: 0.0311257 | val_loss: 0.0311714 | Time: 2502.63 ms
[2022-03-26 12:54:53 main:574] : INFO : Epoch 1810 | loss: 0.0311231 | val_loss: 0.0311665 | Time: 2383.95 ms
[2022-03-26 12:54:56 main:574] : INFO : Epoch 1811 | loss: 0.0311203 | val_loss: 0.031167 | Time: 2731.05 ms
[2022-03-26 12:54:59 main:574] : INFO : Epoch 1812 | loss: 0.0311199 | val_loss: 0.0311674 | Time: 2433.37 ms
[2022-03-26 12:55:01 main:574] : INFO : Epoch 1813 | loss: 0.0311181 | val_loss: 0.03117 | Time: 2530.17 ms
[2022-03-26 12:55:04 main:574] : INFO : Epoch 1814 | loss: 0.0311161 | val_loss: 0.03117 | Time: 2581.81 ms
[2022-03-26 12:55:06 main:574] : INFO : Epoch 1815 | loss: 0.0311182 | val_loss: 0.031167 | Time: 2378.62 ms
[2022-03-26 12:55:09 main:574] : INFO : Epoch 1816 | loss: 0.0311176 | val_loss: 0.0311712 | Time: 2558.57 ms
[2022-03-26 12:55:11 main:574] : INFO : Epoch 1817 | loss: 0.031116 | val_loss: 0.0311712 | Time: 2492.36 ms
[2022-03-26 12:55:13 main:574] : INFO : Epoch 1818 | loss: 0.0311178 | val_loss: 0.0311671 | Time: 2388.04 ms
[2022-03-26 12:55:16 main:574] : INFO : Epoch 1819 | loss: 0.0311248 | val_loss: 0.0311638 | Time: 2539.91 ms
[2022-03-26 12:55:18 main:574] : INFO : Epoch 1820 | loss: 0.0311306 | val_loss: 0.0311701 | Time: 2449.35 ms
[2022-03-26 12:55:21 main:574] : INFO : Epoch 1821 | loss: 0.0311229 | val_loss: 0.0311714 | Time: 2446.96 ms
[2022-03-26 12:55:24 main:574] : INFO : Epoch 1822 | loss: 0.031122 | val_loss: 0.0311662 | Time: 2563.65 ms
[2022-03-26 12:55:26 main:574] : INFO : Epoch 1823 | loss: 0.0311192 | val_loss: 0.0311725 | Time: 2406.98 ms
[2022-03-26 12:55:28 main:574] : INFO : Epoch 1824 | loss: 0.0311163 | val_loss: 0.0311717 | Time: 2496.61 ms
[2022-03-26 12:55:31 main:574] : INFO : Epoch 1825 | loss: 0.0311186 | val_loss: 0.0311684 | Time: 2447.3 ms
[2022-03-26 12:55:33 main:574] : INFO : Epoch 1826 | loss: 0.0311163 | val_loss: 0.031169 | Time: 2393.64 ms
[2022-03-26 12:55:36 main:574] : INFO : Epoch 1827 | loss: 0.0311249 | val_loss: 0.0311676 | Time: 2500.65 ms
[2022-03-26 12:55:38 main:574] : INFO : Epoch 1828 | loss: 0.0311248 | val_loss: 0.0311704 | Time: 2537.74 ms
[2022-03-26 12:55:41 main:574] : INFO : Epoch 1829 | loss: 0.0311227 | val_loss: 0.0311719 | Time: 2458.49 ms
[2022-03-26 12:55:43 main:574] : INFO : Epoch 1830 | loss: 0.0311182 | val_loss: 0.0311688 | Time: 2602.41 ms
[2022-03-26 12:55:46 main:574] : INFO : Epoch 1831 | loss: 0.0311208 | val_loss: 0.0311752 | Time: 2429.36 ms
[2022-03-26 12:55:48 main:574] : INFO : Epoch 1832 | loss: 0.0311235 | val_loss: 0.0311667 | Time: 2517.42 ms
[2022-03-26 12:55:51 main:574] : INFO : Epoch 1833 | loss: 0.0311208 | val_loss: 0.0311685 | Time: 2544.39 ms
[2022-03-26 12:55:53 main:574] : INFO : Epoch 1834 | loss: 0.0311205 | val_loss: 0.0311818 | Time: 2447.3 ms
[2022-03-26 12:55:56 main:574] : INFO : Epoch 1835 | loss: 0.0311197 | val_loss: 0.0311698 | Time: 2555.58 ms
[2022-03-26 12:55:58 main:574] : INFO : Epoch 1836 | loss: 0.0311168 | val_loss: 0.0311692 | Time: 2418.02 ms
[2022-03-26 12:56:01 main:574] : INFO : Epoch 1837 | loss: 0.0311162 | val_loss: 0.0311739 | Time: 2580.95 ms
[2022-03-26 12:56:03 main:574] : INFO : Epoch 1838 | loss: 0.0311214 | val_loss: 0.0311676 | Time: 2451.38 ms
[2022-03-26 12:56:06 main:574] : INFO : Epoch 1839 | loss: 0.0311196 | val_loss: 0.0311768 | Time: 2520.55 ms
[2022-03-26 12:56:08 main:574] : INFO : Epoch 1840 | loss: 0.031118 | val_loss: 0.0311737 | Time: 2536.19 ms
[2022-03-26 12:56:11 main:574] : INFO : Epoch 1841 | loss: 0.0311173 | val_loss: 0.0311787 | Time: 2379.8 ms
[2022-03-26 12:56:13 main:574] : INFO : Epoch 1842 | loss: 0.0311162 | val_loss: 0.0311709 | Time: 2537.56 ms
[2022-03-26 12:56:16 main:574] : INFO : Epoch 1843 | loss: 0.0311156 | val_loss: 0.031177 | Time: 2692.57 ms
[2022-03-26 12:56:19 main:574] : INFO : Epoch 1844 | loss: 0.0311182 | val_loss: 0.0311749 | Time: 2524.54 ms
[2022-03-26 12:56:21 main:574] : INFO : Epoch 1845 | loss: 0.0311164 | val_loss: 0.0311787 | Time: 2417.73 ms
[2022-03-26 12:56:23 main:574] : INFO : Epoch 1846 | loss: 0.0311161 | val_loss: 0.0311776 | Time: 2368.86 ms
[2022-03-26 12:56:26 main:574] : INFO : Epoch 1847 | loss: 0.0311122 | val_loss: 0.0311721 | Time: 2343.86 ms
[2022-03-26 12:56:28 main:574] : INFO : Epoch 1848 | loss: 0.0311104 | val_loss: 0.0311721 | Time: 2364.12 ms
[2022-03-26 12:56:30 main:574] : INFO : Epoch 1849 | loss: 0.0311136 | val_loss: 0.0311703 | Time: 2306.06 ms
[2022-03-26 12:56:33 main:574] : INFO : Epoch 1850 | loss: 0.0311184 | val_loss: 0.031176 | Time: 2330.33 ms
[2022-03-26 12:56:35 main:574] : INFO : Epoch 1851 | loss: 0.031121 | val_loss: 0.0311745 | Time: 2396.81 ms
[2022-03-26 12:56:37 main:574] : INFO : Epoch 1852 | loss: 0.0311194 | val_loss: 0.0311731 | Time: 2317.77 ms
[2022-03-26 12:56:40 main:574] : INFO : Epoch 1853 | loss: 0.0311148 | val_loss: 0.0311758 | Time: 2424.55 ms
[2022-03-26 12:56:42 main:574] : INFO : Epoch 1854 | loss: 0.0311201 | val_loss: 0.0311747 | Time: 2403.3 ms
[2022-03-26 12:56:45 main:574] : INFO : Epoch 1855 | loss: 0.0311201 | val_loss: 0.0311689 | Time: 2359.28 ms
[2022-03-26 12:56:47 main:574] : INFO : Epoch 1856 | loss: 0.031122 | val_loss: 0.0311682 | Time: 2394.56 ms
[2022-03-26 12:56:50 main:574] : INFO : Epoch 1857 | loss: 0.0311185 | val_loss: 0.0311672 | Time: 2425.95 ms
[2022-03-26 12:56:52 main:574] : INFO : Epoch 1858 | loss: 0.0311158 | val_loss: 0.0311694 | Time: 2449.39 ms
[2022-03-26 12:56:55 main:574] : INFO : Epoch 1859 | loss: 0.0311185 | val_loss: 0.0311693 | Time: 2561.94 ms
[2022-03-26 12:56:57 main:574] : INFO : Epoch 1860 | loss: 0.0311138 | val_loss: 0.031172 | Time: 2326.34 ms
[2022-03-26 12:56:59 main:574] : INFO : Epoch 1861 | loss: 0.0311202 | val_loss: 0.031176 | Time: 2476.02 ms
[2022-03-26 12:57:02 main:574] : INFO : Epoch 1862 | loss: 0.0311235 | val_loss: 0.0311768 | Time: 2579.18 ms
[2022-03-26 12:57:04 main:574] : INFO : Epoch 1863 | loss: 0.0311228 | val_loss: 0.0311693 | Time: 2377.16 ms
[2022-03-26 12:57:07 main:574] : INFO : Epoch 1864 | loss: 0.0311219 | val_loss: 0.0311769 | Time: 2491.74 ms
[2022-03-26 12:57:09 main:574] : INFO : Epoch 1865 | loss: 0.031121 | val_loss: 0.031177 | Time: 2409.65 ms
[2022-03-26 12:57:12 main:574] : INFO : Epoch 1866 | loss: 0.0311242 | val_loss: 0.0311701 | Time: 2502.95 ms
[2022-03-26 12:57:14 main:574] : INFO : Epoch 1867 | loss: 0.0311262 | val_loss: 0.0311755 | Time: 2611.46 ms
[2022-03-26 12:57:17 main:574] : INFO : Epoch 1868 | loss: 0.0311235 | val_loss: 0.0311757 | Time: 2381.76 ms
[2022-03-26 12:57:19 main:574] : INFO : Epoch 1869 | loss: 0.0311217 | val_loss: 0.0311767 | Time: 2420.67 ms
[2022-03-26 12:57:22 main:574] : INFO : Epoch 1870 | loss: 0.0311236 | val_loss: 0.0311698 | Time: 2552.33 ms
[2022-03-26 12:57:24 main:574] : INFO : Epoch 1871 | loss: 0.0311229 | val_loss: 0.0311727 | Time: 2275.36 ms
[2022-03-26 12:57:26 main:574] : INFO : Epoch 1872 | loss: 0.0311232 | val_loss: 0.0311683 | Time: 2443.93 ms
[2022-03-26 12:57:29 main:574] : INFO : Epoch 1873 | loss: 0.0311188 | val_loss: 0.031175 | Time: 2470.85 ms
[2022-03-26 12:57:31 main:574] : INFO : Epoch 1874 | loss: 0.0311204 | val_loss: 0.031179 | Time: 2334.4 ms
[2022-03-26 12:57:34 main:574] : INFO : Epoch 1875 | loss: 0.0311179 | val_loss: 0.0311722 | Time: 2501.32 ms
[2022-03-26 12:57:36 main:574] : INFO : Epoch 1876 | loss: 0.0311247 | val_loss: 0.0311728 | Time: 2450.53 ms
[2022-03-26 12:57:39 main:574] : INFO : Epoch 1877 | loss: 0.0311238 | val_loss: 0.0311677 | Time: 2470.55 ms
[2022-03-26 12:57:41 main:574] : INFO : Epoch 1878 | loss: 0.0311206 | val_loss: 0.0311675 | Time: 2478.97 ms
[2022-03-26 12:57:44 main:574] : INFO : Epoch 1879 | loss: 0.0311233 | val_loss: 0.0311719 | Time: 2401.99 ms
[2022-03-26 12:57:46 main:574] : INFO : Epoch 1880 | loss: 0.0311339 | val_loss: 0.0311681 | Time: 2547.64 ms
[2022-03-26 12:57:49 main:574] : INFO : Epoch 1881 | loss: 0.0311402 | val_loss: 0.0311666 | Time: 2478.69 ms
[2022-03-26 12:57:51 main:574] : INFO : Epoch 1882 | loss: 0.0311275 | val_loss: 0.0311701 | Time: 2550.93 ms
[2022-03-26 12:57:54 main:574] : INFO : Epoch 1883 | loss: 0.0311202 | val_loss: 0.0311684 | Time: 2462.83 ms
[2022-03-26 12:57:56 main:574] : INFO : Epoch 1884 | loss: 0.0311174 | val_loss: 0.0311693 | Time: 2327.68 ms
[2022-03-26 12:57:58 main:574] : INFO : Epoch 1885 | loss: 0.031116 | val_loss: 0.0311707 | Time: 2477.16 ms
[2022-03-26 12:58:01 main:574] : INFO : Epoch 1886 | loss: 0.0311172 | val_loss: 0.0311748 | Time: 2378.85 ms
[2022-03-26 12:58:03 main:574] : INFO : Epoch 1887 | loss: 0.0311171 | val_loss: 0.0311669 | Time: 2377.17 ms
[2022-03-26 12:58:06 main:574] : INFO : Epoch 1888 | loss: 0.0311212 | val_loss: 0.0311762 | Time: 2582.73 ms
[2022-03-26 12:58:08 main:574] : INFO : Epoch 1889 | loss: 0.0311234 | val_loss: 0.0311674 | Time: 2347.6 ms
[2022-03-26 12:58:11 main:574] : INFO : Epoch 1890 | loss: 0.0311265 | val_loss: 0.0311673 | Time: 2551.39 ms
[2022-03-26 12:58:13 main:574] : INFO : Epoch 1891 | loss: 0.0311216 | val_loss: 0.0311663 | Time: 2423.21 ms
[2022-03-26 12:58:16 main:574] : INFO : Epoch 1892 | loss: 0.0311175 | val_loss: 0.0311739 | Time: 2370.86 ms
[2022-03-26 12:58:18 main:574] : INFO : Epoch 1893 | loss: 0.0311135 | val_loss: 0.0311739 | Time: 2463.29 ms
[2022-03-26 12:58:20 main:574] : INFO : Epoch 1894 | loss: 0.0311136 | val_loss: 0.031182 | Time: 2294.57 ms
[2022-03-26 12:58:23 main:574] : INFO : Epoch 1895 | loss: 0.031112 | val_loss: 0.0311787 | Time: 2363.99 ms
[2022-03-26 12:58:25 main:574] : INFO : Epoch 1896 | loss: 0.0311208 | val_loss: 0.0311734 | Time: 2583.57 ms
[2022-03-26 12:58:28 main:574] : INFO : Epoch 1897 | loss: 0.0311183 | val_loss: 0.031177 | Time: 2429.7 ms
[2022-03-26 12:58:30 main:574] : INFO : Epoch 1898 | loss: 0.0311193 | val_loss: 0.0311734 | Time: 2425.77 ms
[2022-03-26 12:58:33 main:574] : INFO : Epoch 1899 | loss: 0.0311163 | val_loss: 0.0311716 | Time: 2519.04 ms
[2022-03-26 12:58:35 main:574] : INFO : Epoch 1900 | loss: 0.0311122 | val_loss: 0.0311746 | Time: 2393.11 ms
[2022-03-26 12:58:38 main:574] : INFO : Epoch 1901 | loss: 0.0311188 | val_loss: 0.031169 | Time: 2534.37 ms
[2022-03-26 12:58:40 main:574] : INFO : Epoch 1902 | loss: 0.0311182 | val_loss: 0.03117 | Time: 2494.77 ms
[2022-03-26 12:58:42 main:574] : INFO : Epoch 1903 | loss: 0.0311162 | val_loss: 0.0311739 | Time: 2332.72 ms
[2022-03-26 12:58:45 main:574] : INFO : Epoch 1904 | loss: 0.0311111 | val_loss: 0.031172 | Time: 2555.2 ms
[2022-03-26 12:58:47 main:574] : INFO : Epoch 1905 | loss: 0.0311134 | val_loss: 0.0311696 | Time: 2486.56 ms
[2022-03-26 12:58:50 main:574] : INFO : Epoch 1906 | loss: 0.031115 | val_loss: 0.0311737 | Time: 2399.5 ms
[2022-03-26 12:58:52 main:574] : INFO : Epoch 1907 | loss: 0.0311149 | val_loss: 0.0311719 | Time: 2380.03 ms
[2022-03-26 12:58:55 main:574] : INFO : Epoch 1908 | loss: 0.0311163 | val_loss: 0.0311738 | Time: 2427.27 ms
[2022-03-26 12:58:57 main:574] : INFO : Epoch 1909 | loss: 0.0311205 | val_loss: 0.0311772 | Time: 2519.56 ms
[2022-03-26 12:59:00 main:574] : INFO : Epoch 1910 | loss: 0.0311189 | val_loss: 0.031171 | Time: 2500.43 ms
[2022-03-26 12:59:02 main:574] : INFO : Epoch 1911 | loss: 0.0311186 | val_loss: 0.0311747 | Time: 2357.88 ms
[2022-03-26 12:59:05 main:574] : INFO : Epoch 1912 | loss: 0.0311179 | val_loss: 0.0311734 | Time: 2487.53 ms
[2022-03-26 12:59:07 main:574] : INFO : Epoch 1913 | loss: 0.0311181 | val_loss: 0.0311761 | Time: 2515.23 ms
[2022-03-26 12:59:09 main:574] : INFO : Epoch 1914 | loss: 0.0311147 | val_loss: 0.0311801 | Time: 2387.18 ms
[2022-03-26 12:59:12 main:574] : INFO : Epoch 1915 | loss: 0.0311171 | val_loss: 0.0311763 | Time: 2423.03 ms
[2022-03-26 12:59:14 main:574] : INFO : Epoch 1916 | loss: 0.031121 | val_loss: 0.0311733 | Time: 2443.01 ms
[2022-03-26 12:59:17 main:574] : INFO : Epoch 1917 | loss: 0.0311168 | val_loss: 0.0311766 | Time: 2357.35 ms
[2022-03-26 12:59:19 main:574] : INFO : Epoch 1918 | loss: 0.031115 | val_loss: 0.0311756 | Time: 2443.33 ms
[2022-03-26 12:59:22 main:574] : INFO : Epoch 1919 | loss: 0.0311104 | val_loss: 0.0311728 | Time: 2390.85 ms
[2022-03-26 12:59:24 main:574] : INFO : Epoch 1920 | loss: 0.03111 | val_loss: 0.0311815 | Time: 2535.9 ms
[2022-03-26 12:59:27 main:574] : INFO : Epoch 1921 | loss: 0.0311089 | val_loss: 0.0311753 | Time: 2608.15 ms
[2022-03-26 12:59:29 main:574] : INFO : Epoch 1922 | loss: 0.0311069 | val_loss: 0.0311773 | Time: 2562.32 ms
[2022-03-26 12:59:32 main:574] : INFO : Epoch 1923 | loss: 0.0311067 | val_loss: 0.0311788 | Time: 2474.42 ms
[2022-03-26 12:59:34 main:574] : INFO : Epoch 1924 | loss: 0.0311095 | val_loss: 0.0311786 | Time: 2442.85 ms
[2022-03-26 12:59:37 main:574] : INFO : Epoch 1925 | loss: 0.0311095 | val_loss: 0.0311881 | Time: 2497.35 ms
[2022-03-26 12:59:39 main:574] : INFO : Epoch 1926 | loss: 0.0311072 | val_loss: 0.0311895 | Time: 2436.82 ms
[2022-03-26 12:59:41 main:574] : INFO : Epoch 1927 | loss: 0.031119 | val_loss: 0.0311807 | Time: 2337.18 ms
[2022-03-26 12:59:44 main:574] : INFO : Epoch 1928 | loss: 0.031125 | val_loss: 0.0311758 | Time: 2477.69 ms
[2022-03-26 12:59:46 main:574] : INFO : Epoch 1929 | loss: 0.0311184 | val_loss: 0.0311688 | Time: 2461.34 ms
[2022-03-26 12:59:49 main:574] : INFO : Epoch 1930 | loss: 0.0311186 | val_loss: 0.0311735 | Time: 2358.6 ms
[2022-03-26 12:59:51 main:574] : INFO : Epoch 1931 | loss: 0.031119 | val_loss: 0.0311707 | Time: 2485.57 ms
[2022-03-26 12:59:54 main:574] : INFO : Epoch 1932 | loss: 0.031118 | val_loss: 0.0311753 | Time: 2324.83 ms
[2022-03-26 12:59:56 main:574] : INFO : Epoch 1933 | loss: 0.0311147 | val_loss: 0.0311821 | Time: 2671.97 ms
[2022-03-26 12:59:59 main:574] : INFO : Epoch 1934 | loss: 0.0311138 | val_loss: 0.0311782 | Time: 2432.61 ms
[2022-03-26 13:00:01 main:574] : INFO : Epoch 1935 | loss: 0.0311145 | val_loss: 0.0311749 | Time: 2365.66 ms
[2022-03-26 13:00:04 main:574] : INFO : Epoch 1936 | loss: 0.0311135 | val_loss: 0.031173 | Time: 2678.35 ms
[2022-03-26 13:00:06 main:574] : INFO : Epoch 1937 | loss: 0.0311152 | val_loss: 0.0311764 | Time: 2384.39 ms
[2022-03-26 13:00:09 main:574] : INFO : Epoch 1938 | loss: 0.0311149 | val_loss: 0.0311785 | Time: 2385.63 ms
[2022-03-26 13:00:11 main:574] : INFO : Epoch 1939 | loss: 0.0311127 | val_loss: 0.0311726 | Time: 2560.9 ms
[2022-03-26 13:00:13 main:574] : INFO : Epoch 1940 | loss: 0.0311213 | val_loss: 0.0311677 | Time: 2319.25 ms
[2022-03-26 13:00:16 main:574] : INFO : Epoch 1941 | loss: 0.0311208 | val_loss: 0.0311769 | Time: 2317.6 ms
[2022-03-26 13:00:18 main:574] : INFO : Epoch 1942 | loss: 0.031122 | val_loss: 0.0311729 | Time: 2642.37 ms
[2022-03-26 13:00:21 main:574] : INFO : Epoch 1943 | loss: 0.0311275 | val_loss: 0.0311754 | Time: 2254.78 ms
[2022-03-26 13:00:23 main:574] : INFO : Epoch 1944 | loss: 0.0311276 | val_loss: 0.0311741 | Time: 2492.62 ms
[2022-03-26 13:00:26 main:574] : INFO : Epoch 1945 | loss: 0.0311254 | val_loss: 0.0311746 | Time: 2480.75 ms
[2022-03-26 13:00:28 main:574] : INFO : Epoch 1946 | loss: 0.0311198 | val_loss: 0.0311764 | Time: 2399.1 ms
[2022-03-26 13:00:31 main:574] : INFO : Epoch 1947 | loss: 0.0311159 | val_loss: 0.0311684 | Time: 2447.73 ms
[2022-03-26 13:00:33 main:574] : INFO : Epoch 1948 | loss: 0.0311175 | val_loss: 0.0311746 | Time: 2475.17 ms
[2022-03-26 13:00:35 main:574] : INFO : Epoch 1949 | loss: 0.0311152 | val_loss: 0.0311827 | Time: 2274.29 ms
[2022-03-26 13:00:38 main:574] : INFO : Epoch 1950 | loss: 0.0311181 | val_loss: 0.0311793 | Time: 2711.74 ms
[2022-03-26 13:00:40 main:574] : INFO : Epoch 1951 | loss: 0.0311238 | val_loss: 0.0311773 | Time: 2464.05 ms
[2022-03-26 13:00:43 main:574] : INFO : Epoch 1952 | loss: 0.0311249 | val_loss: 0.0311869 | Time: 2485.09 ms
[2022-03-26 13:00:46 main:574] : INFO : Epoch 1953 | loss: 0.0311179 | val_loss: 0.0311709 | Time: 2550.59 ms
[2022-03-26 13:00:48 main:574] : INFO : Epoch 1954 | loss: 0.0311132 | val_loss: 0.0311769 | Time: 2403.79 ms
[2022-03-26 13:00:50 main:574] : INFO : Epoch 1955 | loss: 0.0311156 | val_loss: 0.0311748 | Time: 2422.75 ms
[2022-03-26 13:00:53 main:574] : INFO : Epoch 1956 | loss: 0.0311227 | val_loss: 0.0311843 | Time: 2447.48 ms
[2022-03-26 13:00:55 main:574] : INFO : Epoch 1957 | loss: 0.0311175 | val_loss: 0.0311784 | Time: 2273.26 ms
[2022-03-26 13:00:58 main:574] : INFO : Epoch 1958 | loss: 0.0311185 | val_loss: 0.0311805 | Time: 2486.87 ms
[2022-03-26 13:01:00 main:574] : INFO : Epoch 1959 | loss: 0.0311159 | val_loss: 0.0311803 | Time: 2552.27 ms
[2022-03-26 13:01:03 main:574] : INFO : Epoch 1960 | loss: 0.0311165 | val_loss: 0.0311791 | Time: 2478.22 ms
[2022-03-26 13:01:05 main:574] : INFO : Epoch 1961 | loss: 0.0311161 | val_loss: 0.0311743 | Time: 2484.71 ms
[2022-03-26 13:01:08 main:574] : INFO : Epoch 1962 | loss: 0.0311192 | val_loss: 0.031175 | Time: 2470.81 ms
[2022-03-26 13:01:10 main:574] : INFO : Epoch 1963 | loss: 0.0311155 | val_loss: 0.0311795 | Time: 2423.37 ms
[2022-03-26 13:01:12 main:574] : INFO : Epoch 1964 | loss: 0.0311118 | val_loss: 0.0311768 | Time: 2429.12 ms
[2022-03-26 13:01:15 main:574] : INFO : Epoch 1965 | loss: 0.0311118 | val_loss: 0.0311774 | Time: 2541.56 ms
[2022-03-26 13:01:18 main:574] : INFO : Epoch 1966 | loss: 0.0311086 | val_loss: 0.0311861 | Time: 2580.63 ms
[2022-03-26 13:01:20 main:574] : INFO : Epoch 1967 | loss: 0.031112 | val_loss: 0.0311765 | Time: 2467.19 ms
[2022-03-26 13:01:23 main:574] : INFO : Epoch 1968 | loss: 0.031112 | val_loss: 0.0311783 | Time: 2472.02 ms
[2022-03-26 13:01:25 main:574] : INFO : Epoch 1969 | loss: 0.0311083 | val_loss: 0.0311756 | Time: 2512.95 ms
[2022-03-26 13:01:28 main:574] : INFO : Epoch 1970 | loss: 0.0311487 | val_loss: 0.031182 | Time: 2549.65 ms
[2022-03-26 13:01:30 main:574] : INFO : Epoch 1971 | loss: 0.0311843 | val_loss: 0.0311825 | Time: 2480.2 ms
[2022-03-26 13:01:33 main:574] : INFO : Epoch 1972 | loss: 0.0311822 | val_loss: 0.0311789 | Time: 2419.17 ms
[2022-03-26 13:01:35 main:574] : INFO : Epoch 1973 | loss: 0.0311751 | val_loss: 0.0311733 | Time: 2500.6 ms
[2022-03-26 13:01:37 main:574] : INFO : Epoch 1974 | loss: 0.0311707 | val_loss: 0.031173 | Time: 2485.3 ms
[2022-03-26 13:01:40 main:574] : INFO : Epoch 1975 | loss: 0.0311962 | val_loss: 0.0312392 | Time: 2437.31 ms
[2022-03-26 13:01:42 main:574] : INFO : Epoch 1976 | loss: 0.031333 | val_loss: 0.0312749 | Time: 2479.07 ms
[2022-03-26 13:01:45 main:574] : INFO : Epoch 1977 | loss: 0.0312546 | val_loss: 0.0312268 | Time: 2452.78 ms
[2022-03-26 13:01:47 main:574] : INFO : Epoch 1978 | loss: 0.031217 | val_loss: 0.0312088 | Time: 2377.84 ms
[2022-03-26 13:01:50 main:574] : INFO : Epoch 1979 | loss: 0.0312061 | val_loss: 0.0312037 | Time: 2500.39 ms
[2022-03-26 13:01:52 main:574] : INFO : Epoch 1980 | loss: 0.0312043 | val_loss: 0.0312007 | Time: 2379.17 ms
[2022-03-26 13:01:55 main:574] : INFO : Epoch 1981 | loss: 0.0312003 | val_loss: 0.0311968 | Time: 2431.76 ms
[2022-03-26 13:01:57 main:574] : INFO : Epoch 1982 | loss: 0.0311988 | val_loss: 0.0311951 | Time: 2566.88 ms
[2022-03-26 13:02:00 main:574] : INFO : Epoch 1983 | loss: 0.0311987 | val_loss: 0.0311949 | Time: 2462.05 ms
[2022-03-26 13:02:02 main:574] : INFO : Epoch 1984 | loss: 0.0311962 | val_loss: 0.0311997 | Time: 2610.86 ms
[2022-03-26 13:02:05 main:574] : INFO : Epoch 1985 | loss: 0.0311971 | val_loss: 0.0312012 | Time: 2462.49 ms
[2022-03-26 13:02:07 main:574] : INFO : Epoch 1986 | loss: 0.0311945 | val_loss: 0.0311956 | Time: 2509.29 ms
[2022-03-26 13:02:10 main:574] : INFO : Epoch 1987 | loss: 0.0311928 | val_loss: 0.0311911 | Time: 2620.45 ms
[2022-03-26 13:02:12 main:574] : INFO : Epoch 1988 | loss: 0.0311924 | val_loss: 0.0311902 | Time: 2422.57 ms
[2022-03-26 13:02:15 main:574] : INFO : Epoch 1989 | loss: 0.03119 | val_loss: 0.0311887 | Time: 2505.5 ms
[2022-03-26 13:02:17 main:574] : INFO : Epoch 1990 | loss: 0.0311885 | val_loss: 0.0311882 | Time: 2617.93 ms
[2022-03-26 13:02:20 main:574] : INFO : Epoch 1991 | loss: 0.0311874 | val_loss: 0.0311893 | Time: 2550.35 ms
[2022-03-26 13:02:22 main:574] : INFO : Epoch 1992 | loss: 0.0311883 | val_loss: 0.0311893 | Time: 2499.95 ms
[2022-03-26 13:02:25 main:574] : INFO : Epoch 1993 | loss: 0.0311866 | val_loss: 0.0311893 | Time: 2612.91 ms

Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 3070 Laptop GPU)
[2022-03-27 10:03:04 main:435] : INFO : Set logging level to 1
[2022-03-27 10:03:04 main:441] : INFO : Running in BOINC Client mode
[2022-03-27 10:03:04 main:444] : INFO : Resolving all filenames
[2022-03-27 10:03:04 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-27 10:03:04 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-27 10:03:04 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-27 10:03:04 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0)
[2022-03-27 10:03:04 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-27 10:03:04 main:472] : INFO : Dataset filename: dataset.hdf5
[2022-03-27 10:03:04 main:474] : INFO : Configuration:
[2022-03-27 10:03:04 main:475] : INFO : Model type: GRU
[2022-03-27 10:03:04 main:476] : INFO : Validation Loss Threshold: 0.0001
[2022-03-27 10:03:04 main:477] : INFO : Max Epochs: 2048
[2022-03-27 10:03:04 main:478] : INFO : Batch Size: 128
[2022-03-27 10:03:04 main:479] : INFO : Learning Rate: 0.01
[2022-03-27 10:03:04 main:480] : INFO : Patience: 10
[2022-03-27 10:03:04 main:481] : INFO : Hidden Width: 12
[2022-03-27 10:03:04 main:482] : INFO : # Recurrent Layers: 4
[2022-03-27 10:03:04 main:483] : INFO : # Backend Layers: 4
[2022-03-27 10:03:04 main:484] : INFO : # Threads: 1
[2022-03-27 10:03:04 main:486] : INFO : Preparing Dataset
[2022-03-27 10:03:04 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-27 10:03:04 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory

Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 3070 Laptop GPU)
[2022-03-27 18:26:38 main:435] : INFO : Set logging level to 1
[2022-03-27 18:26:38 main:441] : INFO : Running in BOINC Client mode
[2022-03-27 18:26:38 main:444] : INFO : Resolving all filenames
[2022-03-27 18:26:38 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-27 18:26:38 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-27 18:26:38 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-27 18:26:38 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0)
[2022-03-27 18:26:38 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-27 18:26:38 main:472] : INFO : Dataset filename: dataset.hdf5
[2022-03-27 18:26:38 main:474] : INFO : Configuration:
[2022-03-27 18:26:38 main:475] : INFO : Model type: GRU
[2022-03-27 18:26:38 main:476] : INFO : Validation Loss Threshold: 0.0001
[2022-03-27 18:26:38 main:477] : INFO : Max Epochs: 2048
[2022-03-27 18:26:38 main:478] : INFO : Batch Size: 128
[2022-03-27 18:26:38 main:479] : INFO : Learning Rate: 0.01
[2022-03-27 18:26:38 main:480] : INFO : Patience: 10
[2022-03-27 18:26:38 main:481] : INFO : Hidden Width: 12
[2022-03-27 18:26:38 main:482] : INFO : # Recurrent Layers: 4
[2022-03-27 18:26:38 main:483] : INFO : # Backend Layers: 4
[2022-03-27 18:26:38 main:484] : INFO : # Threads: 1
[2022-03-27 18:26:38 main:486] : INFO : Preparing Dataset
[2022-03-27 18:26:38 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-27 18:26:39 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory

Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 3070 Laptop GPU)
[2022-03-27 18:50:07 main:435] : INFO : Set logging level to 1
[2022-03-27 18:50:07 main:441] : INFO : Running in BOINC Client mode
[2022-03-27 18:50:07 main:444] : INFO : Resolving all filenames
[2022-03-27 18:50:07 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-27 18:50:07 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-27 18:50:07 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-27 18:50:07 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0)
[2022-03-27 18:50:07 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-27 18:50:07 main:472] : INFO : Dataset filename: dataset.hdf5
[2022-03-27 18:50:07 main:474] : INFO : Configuration:
[2022-03-27 18:50:07 main:475] : INFO : Model type: GRU
[2022-03-27 18:50:07 main:476] : INFO : Validation Loss Threshold: 0.0001
[2022-03-27 18:50:07 main:477] : INFO : Max Epochs: 2048
[2022-03-27 18:50:07 main:478] : INFO : Batch Size: 128
[2022-03-27 18:50:07 main:479] : INFO : Learning Rate: 0.01
[2022-03-27 18:50:07 main:480] : INFO : Patience: 10
[2022-03-27 18:50:07 main:481] : INFO : Hidden Width: 12
[2022-03-27 18:50:07 main:482] : INFO : # Recurrent Layers: 4
[2022-03-27 18:50:07 main:483] : INFO : # Backend Layers: 4
[2022-03-27 18:50:07 main:484] : INFO : # Threads: 1
[2022-03-27 18:50:07 main:486] : INFO : Preparing Dataset
[2022-03-27 18:50:07 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-27 18:50:07 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory

Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 3070 Laptop GPU)
[2022-03-28 06:24:37 main:435] : INFO : Set logging level to 1
[2022-03-28 06:24:37 main:441] : INFO : Running in BOINC Client mode
[2022-03-28 06:24:37 main:444] : INFO : Resolving all filenames
[2022-03-28 06:24:37 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-28 06:24:37 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-28 06:24:37 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-28 06:24:37 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0)
[2022-03-28 06:24:37 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-28 06:24:37 main:472] : INFO : Dataset filename: dataset.hdf5
[2022-03-28 06:24:37 main:474] : INFO : Configuration:
[2022-03-28 06:24:37 main:475] : INFO : Model type: GRU
[2022-03-28 06:24:37 main:476] : INFO : Validation Loss Threshold: 0.0001
[2022-03-28 06:24:37 main:477] : INFO : Max Epochs: 2048
[2022-03-28 06:24:37 main:478] : INFO : Batch Size: 128
[2022-03-28 06:24:37 main:479] : INFO : Learning Rate: 0.01
[2022-03-28 06:24:37 main:480] : INFO : Patience: 10

Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 3070 Laptop GPU)
[2022-03-28 19:20:48 main:435] : INFO : Set logging level to 1
[2022-03-28 19:20:48 main:441] : INFO : Running in BOINC Client mode
[2022-03-28 19:20:48 main:444] : INFO : Resolving all filenames
[2022-03-28 19:20:48 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-28 19:20:48 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-28 19:20:48 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-28 19:20:48 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0)
[2022-03-28 19:20:48 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-28 19:20:48 main:472] : INFO : Dataset filename: dataset.hdf5
[2022-03-28 19:20:48 main:474] : INFO : Configuration:
[2022-03-28 19:20:48 main:475] : INFO : Model type: GRU
[2022-03-28 19:20:48 main:476] : INFO : Validation Loss Threshold: 0.0001
[2022-03-28 19:20:48 main:477] : INFO : Max Epochs: 2048
[2022-03-28 19:20:48 main:478] : INFO : Batch Size: 128
[2022-03-28 19:20:48 main:479] : INFO : Learning Rate: 0.01
[2022-03-28 19:20:48 main:480] : INFO : Patience: 10
[2022-03-28 19:20:48 main:481] : INFO : Hidden Width: 12
[2022-03-28 19:20:48 main:482] : INFO : # Recurrent Layers: 4
[2022-03-28 19:20:48 main:483] : INFO : # Backend Layers: 4
[2022-03-28 19:20:48 main:484] : INFO : # Threads: 1
[2022-03-28 19:20:48 main:486] : INFO : Preparing Dataset
[2022-03-28 19:20:48 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-28 19:20:49 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory

Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 3070 Laptop GPU)
[2022-03-28 19:48:53 main:435] : INFO : Set logging level to 1
[2022-03-28 19:48:53 main:441] : INFO : Running in BOINC Client mode
[2022-03-28 19:48:53 main:444] : INFO : Resolving all filenames
[2022-03-28 19:48:53 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-28 19:48:53 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-28 19:48:53 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-28 19:48:53 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0)
[2022-03-28 19:48:53 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-28 19:48:53 main:472] : INFO : Dataset filename: dataset.hdf5
[2022-03-28 19:48:53 main:474] : INFO : Configuration:
[2022-03-28 19:48:53 main:475] : INFO : Model type: GRU
[2022-03-28 19:48:53 main:476] : INFO : Validation Loss Threshold: 0.0001
[2022-03-28 19:48:53 main:477] : INFO : Max Epochs: 2048
[2022-03-28 19:48:53 main:478] : INFO : Batch Size: 128
[2022-03-28 19:48:53 main:479] : INFO : Learning Rate: 0.01
[2022-03-28 19:48:53 main:480] : INFO : Patience: 10
[2022-03-28 19:48:53 main:481] : INFO : Hidden Width: 12
[2022-03-28 19:48:53 main:482] : INFO : # Recurrent Layers: 4
[2022-03-28 19:48:53 main:483] : INFO : # Backend Layers: 4
[2022-03-28 19:48:53 main:484] : INFO : # Threads: 1
[2022-03-28 19:48:53 main:486] : INFO : Preparing Dataset
[2022-03-28 19:48:53 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-28 19:48:53 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory

Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 3070 Laptop GPU)
[2022-03-28 20:08:28 main:435] : INFO : Set logging level to 1
[2022-03-28 20:08:28 main:441] : INFO : Running in BOINC Client mode
[2022-03-28 20:08:28 main:444] : INFO : Resolving all filenames
[2022-03-28 20:08:28 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-28 20:08:28 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-28 20:08:28 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-28 20:08:28 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0)
[2022-03-28 20:08:28 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-28 20:08:28 
main:472] : INFO : Dataset filename: dataset.hdf5 [2022-03-28 20:08:28 main:474] : INFO : Configuration: [2022-03-28 20:08:28 main:475] : INFO : Model type: GRU [2022-03-28 20:08:28 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-03-28 20:08:28 main:477] : INFO : Max Epochs: 2048 [2022-03-28 20:08:28 main:478] : INFO : Batch Size: 128 [2022-03-28 20:08:28 main:479] : INFO : Learning Rate: 0.01 [2022-03-28 20:08:28 main:480] : INFO : Patience: 10 [2022-03-28 20:08:28 main:481] : INFO : Hidden Width: 12 [2022-03-28 20:08:28 main:482] : INFO : # Recurrent Layers: 4 [2022-03-28 20:08:28 main:483] : INFO : # Backend Layers: 4 [2022-03-28 20:08:28 main:484] : INFO : # Threads: 1 [2022-03-28 20:08:28 main:486] : INFO : Preparing Dataset [2022-03-28 20:08:28 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-03-28 20:08:28 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 3070 Laptop GPU) [2022-03-29 06:24:25 main:435] : INFO : Set logging level to 1 [2022-03-29 06:24:25 main:441] : INFO : Running in BOINC Client mode [2022-03-29 06:24:25 main:444] : INFO : Resolving all filenames [2022-03-29 06:24:25 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-03-29 06:24:25 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-03-29 06:24:25 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-03-29 06:24:25 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0) [2022-03-29 06:24:25 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-03-29 06:24:25 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-03-29 06:24:25 main:474] : INFO : Configuration: [2022-03-29 06:24:25 main:475] : INFO : Model type: GRU [2022-03-29 06:24:25 main:476] : INFO : Validation Loss Threshold: 0.0001 
[2022-03-29 06:24:25 main:477] : INFO : Max Epochs: 2048 [2022-03-29 06:24:25 main:478] : INFO : Batch Size: 128 [2022-03-29 06:24:25 main:479] : INFO : Learning Rate: 0.01 [2022-03-29 06:24:25 main:480] : INFO : Patience: 10 [2022-03-29 06:24:25 main:481] : INFO : Hidden Width: 12 [2022-03-29 06:24:25 main:482] : INFO : # Recurrent Layers: 4 [2022-03-29 06:24:25 main:483] : INFO : # Backend Layers: 4 [2022-03-29 06:24:25 main:484] : INFO : # Threads: 1 [2022-03-29 06:24:25 main:486] : INFO : Preparing Dataset [2022-03-29 06:24:25 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-03-29 06:24:25 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 3070 Laptop GPU) [2022-03-29 17:06:23 main:435] : INFO : Set logging level to 1 [2022-03-29 17:06:23 main:441] : INFO : Running in BOINC Client mode [2022-03-29 17:06:23 main:444] : INFO : Resolving all filenames [2022-03-29 17:06:23 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-03-29 17:06:23 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-03-29 17:06:23 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-03-29 17:06:23 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0) [2022-03-29 17:06:23 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-03-29 17:06:23 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-03-29 17:06:23 main:474] : INFO : Configuration: [2022-03-29 17:06:23 main:475] : INFO : Model type: GRU [2022-03-29 17:06:23 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-03-29 17:06:23 main:477] : INFO : Max Epochs: 2048 [2022-03-29 17:06:23 main:478] : INFO : Batch Size: 128 [2022-03-29 17:06:23 main:479] : INFO : Learning Rate: 0.01 [2022-03-29 17:06:23 main:480] : INFO : Patience: 10 
[2022-03-29 17:06:23 main:481] : INFO : Hidden Width: 12 [2022-03-29 17:06:23 main:482] : INFO : # Recurrent Layers: 4 [2022-03-29 17:06:23 main:483] : INFO : # Backend Layers: 4 [2022-03-29 17:06:23 main:484] : INFO : # Threads: 1 [2022-03-29 17:06:23 main:486] : INFO : Preparing Dataset [2022-03-29 17:06:23 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-03-29 17:06:23 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 3070 Laptop GPU) [2022-03-29 17:37:16 main:435] : INFO : Set logging level to 1 [2022-03-29 17:37:16 main:441] : INFO : Running in BOINC Client mode [2022-03-29 17:37:16 main:444] : INFO : Resolving all filenames [2022-03-29 17:37:16 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-03-29 17:37:16 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-03-29 17:37:16 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-03-29 17:37:16 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 0) [2022-03-29 17:37:16 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-03-29 17:37:16 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-03-29 17:37:16 main:474] : INFO : Configuration: [2022-03-29 17:37:16 main:475] : INFO : Model type: GRU [2022-03-29 17:37:16 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-03-29 17:37:16 main:477] : INFO : Max Epochs: 2048 [2022-03-29 17:37:16 main:478] : INFO : Batch Size: 128 [2022-03-29 17:37:16 main:479] : INFO : Learning Rate: 0.01 [2022-03-29 17:37:16 main:480] : INFO : Patience: 10 [2022-03-29 17:37:16 main:481] : INFO : Hidden Width: 12 [2022-03-29 17:37:16 main:482] : INFO : # Recurrent Layers: 4 [2022-03-29 17:37:16 main:483] : INFO : # Backend Layers: 4 [2022-03-29 17:37:16 main:484] : INFO : # Threads: 1 
[2022-03-29 17:37:16 main:486] : INFO : Preparing Dataset [2022-03-29 17:37:16 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-03-29 17:37:16 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-03-29 18:05:30 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-03-29 18:05:30 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-03-29 18:05:30 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-03-29 18:05:31 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-03-29 18:05:31 main:494] : INFO : Creating Model [2022-03-29 18:05:31 main:507] : INFO : Preparing config file [2022-03-29 18:05:31 main:511] : INFO : Found checkpoint, attempting to load... [2022-03-29 18:05:31 main:512] : INFO : Loading config [2022-03-29 18:05:31 main:514] : INFO : Loading state [2022-03-29 18:31:50 main:559] : INFO : Loading DataLoader into Memory [2022-03-29 18:31:50 main:562] : INFO : Starting Training [2022-03-29 18:34:07 main:574] : INFO : Epoch 1926 | loss: 0.0312459 | val_loss: 0.0312085 | Time: 136457 ms [2022-03-29 18:34:08 main:574] : INFO : Epoch 1927 | loss: 0.0311333 | val_loss: 0.0311808 | Time: 1854.25 ms [2022-03-29 18:34:10 main:574] : INFO : Epoch 1928 | loss: 0.0311158 | val_loss: 0.0311777 | Time: 1918.44 ms [2022-03-29 18:34:12 main:574] : INFO : Epoch 1929 | loss: 0.0311144 | val_loss: 0.0311794 | Time: 1835.62 ms [2022-03-29 18:34:14 main:574] : INFO : Epoch 1930 | loss: 0.0311163 | val_loss: 0.0311803 | Time: 1904.31 ms [2022-03-29 18:34:16 main:574] : INFO : Epoch 1931 | loss: 0.031114 | val_loss: 0.031175 | Time: 1854.26 ms [2022-03-29 18:34:18 main:574] : INFO : Epoch 1932 | loss: 0.0311174 | val_loss: 0.0311771 | Time: 1883.49 ms [2022-03-29 18:34:20 main:574] : INFO : Epoch 1933 | loss: 0.0311143 | val_loss: 0.0311865 | Time: 1843.36 
ms [2022-03-29 18:34:22 main:574] : INFO : Epoch 1934 | loss: 0.031111 | val_loss: 0.0311846 | Time: 1870.33 ms [2022-03-29 18:34:24 main:574] : INFO : Epoch 1935 | loss: 0.0311116 | val_loss: 0.0311734 | Time: 1926.83 ms [2022-03-29 18:34:26 main:574] : INFO : Epoch 1936 | loss: 0.0311154 | val_loss: 0.0311735 | Time: 2019.07 ms [2022-03-29 18:34:28 main:574] : INFO : Epoch 1937 | loss: 0.0311194 | val_loss: 0.0311763 | Time: 2022.38 ms [2022-03-29 18:34:30 main:574] : INFO : Epoch 1938 | loss: 0.0311139 | val_loss: 0.0311783 | Time: 2005.24 ms [2022-03-29 18:34:32 main:574] : INFO : Epoch 1939 | loss: 0.0311132 | val_loss: 0.0311818 | Time: 1946.36 ms [2022-03-29 18:34:34 main:574] : INFO : Epoch 1940 | loss: 0.0311102 | val_loss: 0.0311762 | Time: 2149.47 ms [2022-03-29 18:34:36 main:574] : INFO : Epoch 1941 | loss: 0.0311107 | val_loss: 0.0311827 | Time: 2034.09 ms [2022-03-29 18:34:38 main:574] : INFO : Epoch 1942 | loss: 0.0311094 | val_loss: 0.0311829 | Time: 1957.04 ms [2022-03-29 18:34:40 main:574] : INFO : Epoch 1943 | loss: 0.0311063 | val_loss: 0.0311822 | Time: 2004.57 ms [2022-03-29 18:34:42 main:574] : INFO : Epoch 1944 | loss: 0.0311047 | val_loss: 0.0311765 | Time: 2089.68 ms [2022-03-29 18:34:44 main:574] : INFO : Epoch 1945 | loss: 0.031107 | val_loss: 0.0311817 | Time: 1998.74 ms [2022-03-29 18:34:46 main:574] : INFO : Epoch 1946 | loss: 0.0311085 | val_loss: 0.031179 | Time: 2112.43 ms [2022-03-29 18:34:48 main:574] : INFO : Epoch 1947 | loss: 0.0311085 | val_loss: 0.0311776 | Time: 2143.76 ms [2022-03-29 18:34:50 main:574] : INFO : Epoch 1948 | loss: 0.031112 | val_loss: 0.0311809 | Time: 1923.7 ms [2022-03-29 18:34:52 main:574] : INFO : Epoch 1949 | loss: 0.031117 | val_loss: 0.0311794 | Time: 2008.18 ms [2022-03-29 18:34:54 main:574] : INFO : Epoch 1950 | loss: 0.0311151 | val_loss: 0.0311806 | Time: 2020.09 ms [2022-03-29 18:34:56 main:574] : INFO : Epoch 1951 | loss: 0.0311098 | val_loss: 0.031183 | Time: 2133.66 ms [2022-03-29 18:34:58 
main:574] : INFO : Epoch 1952 | loss: 0.0311137 | val_loss: 0.031173 | Time: 2108.32 ms [2022-03-29 18:35:01 main:574] : INFO : Epoch 1953 | loss: 0.0311127 | val_loss: 0.0311792 | Time: 2215.52 ms [2022-03-29 18:35:02 main:574] : INFO : Epoch 1954 | loss: 0.0311173 | val_loss: 0.0311847 | Time: 1943.53 ms [2022-03-29 18:35:04 main:574] : INFO : Epoch 1955 | loss: 0.0311141 | val_loss: 0.031169 | Time: 1999.03 ms [2022-03-29 18:35:07 main:574] : INFO : Epoch 1956 | loss: 0.0311119 | val_loss: 0.0311802 | Time: 2033.3 ms [2022-03-29 18:35:09 main:574] : INFO : Epoch 1957 | loss: 0.0311113 | val_loss: 0.0311796 | Time: 2029.65 ms [2022-03-29 18:35:11 main:574] : INFO : Epoch 1958 | loss: 0.0311117 | val_loss: 0.0311816 | Time: 1993.53 ms [2022-03-29 18:35:13 main:574] : INFO : Epoch 1959 | loss: 0.0311082 | val_loss: 0.0311779 | Time: 2074.05 ms [2022-03-29 18:35:15 main:574] : INFO : Epoch 1960 | loss: 0.0311076 | val_loss: 0.0311765 | Time: 2009.19 ms [2022-03-29 18:35:17 main:574] : INFO : Epoch 1961 | loss: 0.0311174 | val_loss: 0.0311781 | Time: 2024.05 ms [2022-03-29 18:35:19 main:574] : INFO : Epoch 1962 | loss: 0.0311156 | val_loss: 0.0311803 | Time: 2070.38 ms [2022-03-29 18:35:21 main:574] : INFO : Epoch 1963 | loss: 0.0311087 | val_loss: 0.0311765 | Time: 2007.71 ms [2022-03-29 18:35:23 main:574] : INFO : Epoch 1964 | loss: 0.0311056 | val_loss: 0.0311765 | Time: 2050.38 ms [2022-03-29 18:35:25 main:574] : INFO : Epoch 1965 | loss: 0.0311085 | val_loss: 0.0311813 | Time: 2009.84 ms [2022-03-29 18:35:27 main:574] : INFO : Epoch 1966 | loss: 0.0311066 | val_loss: 0.0311819 | Time: 2024.8 ms [2022-03-29 18:35:29 main:574] : INFO : Epoch 1967 | loss: 0.031114 | val_loss: 0.0311756 | Time: 2094.65 ms [2022-03-29 18:35:31 main:574] : INFO : Epoch 1968 | loss: 0.0311332 | val_loss: 0.0311703 | Time: 2069.32 ms [2022-03-29 18:35:33 main:574] : INFO : Epoch 1969 | loss: 0.0311293 | val_loss: 0.0311714 | Time: 2087.86 ms [2022-03-29 18:35:35 main:574] : INFO : Epoch 
1970 | loss: 0.0311263 | val_loss: 0.0311739 | Time: 2045.92 ms [2022-03-29 18:35:37 main:574] : INFO : Epoch 1971 | loss: 0.0311212 | val_loss: 0.0311795 | Time: 2051.44 ms [2022-03-29 18:35:39 main:574] : INFO : Epoch 1972 | loss: 0.0311186 | val_loss: 0.0311704 | Time: 2047.49 ms [2022-03-29 18:35:41 main:574] : INFO : Epoch 1973 | loss: 0.031119 | val_loss: 0.0311747 | Time: 2098.01 ms [2022-03-29 18:35:43 main:574] : INFO : Epoch 1974 | loss: 0.0311144 | val_loss: 0.0311765 | Time: 2020.96 ms [2022-03-29 18:35:45 main:574] : INFO : Epoch 1975 | loss: 0.0311211 | val_loss: 0.0311786 | Time: 2058.24 ms [2022-03-29 18:35:48 main:574] : INFO : Epoch 1976 | loss: 0.0311251 | val_loss: 0.0311759 | Time: 2054.13 ms [2022-03-29 18:35:50 main:574] : INFO : Epoch 1977 | loss: 0.0311302 | val_loss: 0.0311702 | Time: 2004.12 ms [2022-03-29 18:35:52 main:574] : INFO : Epoch 1978 | loss: 0.0311244 | val_loss: 0.0311725 | Time: 2035.77 ms [2022-03-29 18:35:54 main:574] : INFO : Epoch 1979 | loss: 0.0311257 | val_loss: 0.0311703 | Time: 2025.92 ms [2022-03-29 18:35:56 main:574] : INFO : Epoch 1980 | loss: 0.0311225 | val_loss: 0.0311793 | Time: 2016.94 ms [2022-03-29 18:35:58 main:574] : INFO : Epoch 1981 | loss: 0.0311214 | val_loss: 0.0311742 | Time: 2112.87 ms [2022-03-29 18:36:00 main:574] : INFO : Epoch 1982 | loss: 0.0311167 | val_loss: 0.0311776 | Time: 2051.97 ms [2022-03-29 18:36:02 main:574] : INFO : Epoch 1983 | loss: 0.0311365 | val_loss: 0.0311718 | Time: 2024.64 ms [2022-03-29 18:36:04 main:574] : INFO : Epoch 1984 | loss: 0.0311434 | val_loss: 0.0311706 | Time: 1972.2 ms [2022-03-29 18:36:06 main:574] : INFO : Epoch 1985 | loss: 0.0311331 | val_loss: 0.0311691 | Time: 2017.95 ms [2022-03-29 18:36:08 main:574] : INFO : Epoch 1986 | loss: 0.0311275 | val_loss: 0.0311694 | Time: 2081.33 ms [2022-03-29 18:36:10 main:574] : INFO : Epoch 1987 | loss: 0.0311257 | val_loss: 0.031173 | Time: 2056.88 ms [2022-03-29 18:36:12 main:574] : INFO : Epoch 1988 | loss: 0.0311206 
| val_loss: 0.0311719 | Time: 2065.7 ms [2022-03-29 18:36:14 main:574] : INFO : Epoch 1989 | loss: 0.0311173 | val_loss: 0.0311781 | Time: 1976.63 ms [2022-03-29 18:36:16 main:574] : INFO : Epoch 1990 | loss: 0.0311181 | val_loss: 0.031177 | Time: 2030.8 ms [2022-03-29 18:36:18 main:574] : INFO : Epoch 1991 | loss: 0.0311149 | val_loss: 0.0311753 | Time: 2109.97 ms [2022-03-29 18:36:20 main:574] : INFO : Epoch 1992 | loss: 0.0311126 | val_loss: 0.0311844 | Time: 2008.7 ms [2022-03-29 18:36:22 main:574] : INFO : Epoch 1993 | loss: 0.0311136 | val_loss: 0.0311751 | Time: 2009.23 ms [2022-03-29 18:36:24 main:574] : INFO : Epoch 1994 | loss: 0.0311153 | val_loss: 0.0311836 | Time: 2062.75 ms [2022-03-29 18:36:26 main:574] : INFO : Epoch 1995 | loss: 0.0311148 | val_loss: 0.0311758 | Time: 2085.88 ms [2022-03-29 18:36:28 main:574] : INFO : Epoch 1996 | loss: 0.0311128 | val_loss: 0.0311756 | Time: 2052.25 ms [2022-03-29 18:36:30 main:574] : INFO : Epoch 1997 | loss: 0.0311106 | val_loss: 0.0311879 | Time: 1974.78 ms [2022-03-29 18:36:32 main:574] : INFO : Epoch 1998 | loss: 0.0311095 | val_loss: 0.031178 | Time: 2076.37 ms [2022-03-29 18:36:35 main:574] : INFO : Epoch 1999 | loss: 0.0311133 | val_loss: 0.0311767 | Time: 2035.67 ms [2022-03-29 18:36:37 main:574] : INFO : Epoch 2000 | loss: 0.0311084 | val_loss: 0.0311831 | Time: 2054.34 ms [2022-03-29 18:36:39 main:574] : INFO : Epoch 2001 | loss: 0.0311084 | val_loss: 0.0311805 | Time: 2162.48 ms [2022-03-29 18:36:41 main:574] : INFO : Epoch 2002 | loss: 0.031105 | val_loss: 0.0311826 | Time: 2198.55 ms [2022-03-29 18:36:43 main:574] : INFO : Epoch 2003 | loss: 0.0311076 | val_loss: 0.031182 | Time: 2034.21 ms [2022-03-29 18:36:45 main:574] : INFO : Epoch 2004 | loss: 0.0311133 | val_loss: 0.0311816 | Time: 2221.38 ms [2022-03-29 18:36:47 main:574] : INFO : Epoch 2005 | loss: 0.0311112 | val_loss: 0.0311851 | Time: 2057.93 ms [2022-03-29 18:36:49 main:574] : INFO : Epoch 2006 | loss: 0.0311107 | val_loss: 0.0311832 | 
Time: 2057.51 ms [2022-03-29 18:36:51 main:574] : INFO : Epoch 2007 | loss: 0.0311113 | val_loss: 0.0311785 | Time: 2052.63 ms [2022-03-29 18:36:53 main:574] : INFO : Epoch 2008 | loss: 0.0311072 | val_loss: 0.0311836 | Time: 1977.76 ms [2022-03-29 18:36:55 main:574] : INFO : Epoch 2009 | loss: 0.0311063 | val_loss: 0.0311886 | Time: 2081.51 ms [2022-03-29 18:36:58 main:574] : INFO : Epoch 2010 | loss: 0.0311083 | val_loss: 0.0311881 | Time: 2118.38 ms [2022-03-29 18:37:00 main:574] : INFO : Epoch 2011 | loss: 0.0311057 | val_loss: 0.0311873 | Time: 2097.06 ms [2022-03-29 18:37:02 main:574] : INFO : Epoch 2012 | loss: 0.0311072 | val_loss: 0.0311814 | Time: 2125.37 ms [2022-03-29 18:37:04 main:574] : INFO : Epoch 2013 | loss: 0.0311102 | val_loss: 0.0311761 | Time: 1967.17 ms [2022-03-29 18:37:06 main:574] : INFO : Epoch 2014 | loss: 0.0311067 | val_loss: 0.0311848 | Time: 2049.08 ms [2022-03-29 18:37:08 main:574] : INFO : Epoch 2015 | loss: 0.0311063 | val_loss: 0.0311829 | Time: 2112.84 ms [2022-03-29 18:37:10 main:574] : INFO : Epoch 2016 | loss: 0.031107 | val_loss: 0.0311855 | Time: 2030.78 ms [2022-03-29 18:37:12 main:574] : INFO : Epoch 2017 | loss: 0.0311061 | val_loss: 0.0311886 | Time: 2076.68 ms [2022-03-29 18:37:14 main:574] : INFO : Epoch 2018 | loss: 0.0311079 | val_loss: 0.031183 | Time: 1997.75 ms [2022-03-29 18:37:16 main:574] : INFO : Epoch 2019 | loss: 0.0311105 | val_loss: 0.0311864 | Time: 2076.13 ms [2022-03-29 18:37:18 main:574] : INFO : Epoch 2020 | loss: 0.031135 | val_loss: 0.0311731 | Time: 2028.88 ms [2022-03-29 18:37:20 main:574] : INFO : Epoch 2021 | loss: 0.0311329 | val_loss: 0.0311686 | Time: 1986.4 ms [2022-03-29 18:37:22 main:574] : INFO : Epoch 2022 | loss: 0.0311288 | val_loss: 0.0311748 | Time: 1990.74 ms [2022-03-29 18:37:24 main:574] : INFO : Epoch 2023 | loss: 0.0311284 | val_loss: 0.0311685 | Time: 2169.69 ms [2022-03-29 18:37:27 main:574] : INFO : Epoch 2024 | loss: 0.0311299 | val_loss: 0.0311719 | Time: 2290.91 ms 
[2022-03-29 18:37:29 main:574] : INFO : Epoch 2025 | loss: 0.0311269 | val_loss: 0.0311735 | Time: 2179.85 ms [2022-03-29 18:37:31 main:574] : INFO : Epoch 2026 | loss: 0.0311301 | val_loss: 0.0311705 | Time: 2069.56 ms [2022-03-29 18:37:33 main:574] : INFO : Epoch 2027 | loss: 0.0311253 | val_loss: 0.0311722 | Time: 2099.06 ms [2022-03-29 18:37:35 main:574] : INFO : Epoch 2028 | loss: 0.0311207 | val_loss: 0.0311786 | Time: 2071.55 ms [2022-03-29 18:37:37 main:574] : INFO : Epoch 2029 | loss: 0.03112 | val_loss: 0.0311763 | Time: 2109.97 ms [2022-03-29 18:37:39 main:574] : INFO : Epoch 2030 | loss: 0.0311159 | val_loss: 0.0311855 | Time: 2011.3 ms [2022-03-29 18:37:41 main:574] : INFO : Epoch 2031 | loss: 0.031114 | val_loss: 0.0311763 | Time: 2051.79 ms [2022-03-29 18:37:43 main:574] : INFO : Epoch 2032 | loss: 0.0311368 | val_loss: 0.0311756 | Time: 2043.06 ms [2022-03-29 18:37:45 main:574] : INFO : Epoch 2033 | loss: 0.031139 | val_loss: 0.031167 | Time: 2078.96 ms [2022-03-29 18:37:47 main:574] : INFO : Epoch 2034 | loss: 0.031128 | val_loss: 0.0311723 | Time: 2053.87 ms [2022-03-29 18:37:50 main:574] : INFO : Epoch 2035 | loss: 0.0311247 | val_loss: 0.0311741 | Time: 2118.66 ms [2022-03-29 18:37:52 main:574] : INFO : Epoch 2036 | loss: 0.0311197 | val_loss: 0.0311719 | Time: 2064.19 ms [2022-03-29 18:37:54 main:574] : INFO : Epoch 2037 | loss: 0.0311147 | val_loss: 0.0311755 | Time: 2128.02 ms [2022-03-29 18:37:56 main:574] : INFO : Epoch 2038 | loss: 0.0311112 | val_loss: 0.0311791 | Time: 1994.31 ms [2022-03-29 18:37:58 main:574] : INFO : Epoch 2039 | loss: 0.0311122 | val_loss: 0.0311703 | Time: 1993.09 ms [2022-03-29 18:38:00 main:574] : INFO : Epoch 2040 | loss: 0.0311194 | val_loss: 0.0311705 | Time: 2082.75 ms [2022-03-29 18:38:02 main:574] : INFO : Epoch 2041 | loss: 0.0311285 | val_loss: 0.0311656 | Time: 2130.66 ms [2022-03-29 18:38:04 main:574] : INFO : Epoch 2042 | loss: 0.0311257 | val_loss: 0.0311743 | Time: 2126.33 ms [2022-03-29 18:38:06 
main:574] : INFO : Epoch 2043 | loss: 0.0311183 | val_loss: 0.0311747 | Time: 2102.79 ms [2022-03-29 18:38:08 main:574] : INFO : Epoch 2044 | loss: 0.0311172 | val_loss: 0.0311731 | Time: 1980.9 ms [2022-03-29 18:38:10 main:574] : INFO : Epoch 2045 | loss: 0.0311128 | val_loss: 0.0311797 | Time: 2147.19 ms [2022-03-29 18:38:12 main:574] : INFO : Epoch 2046 | loss: 0.0311151 | val_loss: 0.031173 | Time: 2044.24 ms [2022-03-29 18:38:15 main:574] : INFO : Epoch 2047 | loss: 0.0311153 | val_loss: 0.0311708 | Time: 2126.65 ms [2022-03-29 18:38:17 main:574] : INFO : Epoch 2048 | loss: 0.0311163 | val_loss: 0.0311719 | Time: 2085.16 ms [2022-03-29 18:38:17 main:597] : INFO : Saving trained model to model-final.pt, val_loss 0.0311719 [2022-03-29 18:38:17 main:603] : INFO : Saving end state to config to file [2022-03-29 18:38:17 main:608] : INFO : Success, exiting.. 18:38:17 (15160): called boinc_finish(0) </stderr_txt> ]]>
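The run above resumed from snapshot.pt at epoch 1926 and stopped only when it hit the Max Epochs cap (2048), with val_loss (≈0.0312) still far above the Validation Loss Threshold (0.0001). The application's actual stopping logic is not visible in the log; the sketch below is only one common way to combine the three logged parameters (threshold 0.0001, max epochs 2048, patience 10) into a stopping rule, with a hypothetical helper name. Note that this run plateaued for well over 10 epochs yet kept training, so the real application evidently applies "Patience" differently (perhaps to learning-rate scheduling rather than early stopping).

```python
# Hypothetical sketch -- NOT the MLC@Home application's code.
# Parameters mirror the values reported in the log above:
# Validation Loss Threshold: 0.0001, Max Epochs: 2048, Patience: 10.

def should_stop(epoch, val_losses, threshold=0.0001, max_epochs=2048, patience=10):
    """Decide whether to stop after observing the val_loss history so far.

    Returns (stop, reason).
    """
    # 1. Target reached: the most recent validation loss is good enough.
    if val_losses[-1] <= threshold:
        return True, "threshold reached"
    # 2. Hard cap on training length (what actually ended the logged run).
    if epoch >= max_epochs:
        return True, "max epochs reached"
    # 3. Early stopping: the best val_loss seen so far is older than
    #    `patience` epochs, i.e. no new minimum in the last `patience` epochs.
    if len(val_losses) > patience and min(val_losses[-patience:]) >= min(val_losses[:-patience]):
        return True, "no improvement within patience"
    return False, "continue"

if __name__ == "__main__":
    # First few val_loss values after the resume at epoch 1926:
    history = [0.0312085, 0.0311808, 0.0311777]
    print(should_stop(1928, history))  # -> (False, 'continue')
```

Under this rule the logged plateau (no new val_loss minimum between roughly epochs 1955 and 2033) would have triggered an early stop, which is why the patience interpretation above is flagged as an assumption.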
©2022 MLC@Home Team
A project of the Cognition, Robotics, and Learning (CORAL) Lab at the University of Maryland, Baltimore County (UMBC)