| Name | ParityModified-1639955755-22885-1-0_0 |
| Workunit | 7631372 |
| Created | 31 Dec 2021, 21:50:36 UTC |
| Sent | 15 Jan 2022, 11:59:41 UTC |
| Report deadline | 23 Jan 2022, 11:59:41 UTC |
| Received | 17 Jan 2022, 12:52:48 UTC |
| Server state | Over |
| Outcome | Success |
| Client state | Done |
| Exit status | 0 (0x00000000) |
| Computer ID | 6741 |
| Run time | 4 hours 27 min 15 sec |
| CPU time | 3 hours 45 min 56 sec |
| Validate state | Valid |
| Credit | 4,160.00 |
| Device peak FLOPS | 786.79 GFLOPS |
| Application version | Machine Learning Dataset Generator (GPU) v9.75 (cuda10200) windows_x86_64 |
| Peak working set size | 1.54 GB |
| Peak swap size | 3.44 GB |
| Peak disk usage | 1.54 GB |
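The stderr log below records the network this task trains: a GRU with hidden width 12, 4 recurrent layers, and 4 "backend" (fully connected) layers, trained with batch size 128, learning rate 0.01, patience 10, a validation-loss threshold of 0.0001, and a cap of 2048 epochs. The actual application is built on the libTorch 1.6 C++ API, so the sketch below is only a rough Python/PyTorch equivalent; the class name, input/output sizes, and exact layer layout are assumptions for illustration, not MLC@Home's implementation.

```python
import torch
import torch.nn as nn

class GruSketch(nn.Module):
    """Hypothetical stand-in for the model described in the log:
    a GRU (hidden width 12, 4 recurrent layers) followed by 4
    fully connected "backend" layers. Feature and output sizes are
    placeholders; the real network's I/O shapes are not in the log."""

    def __init__(self, n_features=8, hidden=12, rnn_layers=4,
                 backend_layers=4, n_outputs=1):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden,
                          num_layers=rnn_layers, batch_first=True)
        layers = []
        for _ in range(backend_layers - 1):
            layers += [nn.Linear(hidden, hidden), nn.ReLU()]
        layers.append(nn.Linear(hidden, n_outputs))
        self.backend = nn.Sequential(*layers)

    def forward(self, x):                 # x: (batch, seq_len, n_features)
        out, _ = self.gru(x)              # out: (batch, seq_len, hidden)
        return self.backend(out[:, -1])   # predict from the last time step
```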
<core_client_version>7.16.20</core_client_version> <![CDATA[ <stderr_txt> : 0.0311092 | val_loss: 0.0311426 | Time: 33508.1 ms [2022-01-17 19:01:42 main:574] : INFO : Epoch 1748 | loss: 0.0311086 | val_loss: 0.031145 | Time: 20656.7 ms [2022-01-17 19:02:03 main:574] : INFO : Epoch 1749 | loss: 0.0311115 | val_loss: 0.0311455 | Time: 21509.2 ms [2022-01-17 19:02:24 main:574] : INFO : Epoch 1750 | loss: 0.0311062 | val_loss: 0.0311406 | Time: 20310.3 ms [2022-01-17 19:02:59 main:574] : INFO : Epoch 1751 | loss: 0.0311029 | val_loss: 0.0311452 | Time: 35072.3 ms [2022-01-17 19:03:20 main:574] : INFO : Epoch 1752 | loss: 0.0311002 | val_loss: 0.0311404 | Time: 21344.4 ms [2022-01-17 19:03:41 main:574] : INFO : Epoch 1753 | loss: 0.0310971 | val_loss: 0.0311422 | Time: 20587.2 ms [2022-01-17 19:04:13 main:574] : INFO : Epoch 1754 | loss: 0.0310957 | val_loss: 0.0311449 | Time: 32249.1 ms [2022-01-17 19:04:35 main:574] : INFO : Epoch 1755 | loss: 0.0310957 | val_loss: 0.0311466 | Time: 21348.2 ms [2022-01-17 19:04:54 main:574] : INFO : Epoch 1756 | loss: 0.0310992 | val_loss: 0.0311447 | Time: 19717.4 ms [2022-01-17 19:05:17 main:574] : INFO : Epoch 1757 | loss: 0.0310954 | val_loss: 0.0311412 | Time: 22376.5 ms [2022-01-17 19:05:38 main:574] : INFO : Epoch 1758 | loss: 0.0310924 | val_loss: 0.0311454 | Time: 20437.2 ms [2022-01-17 19:05:59 main:574] : INFO : Epoch 1759 | loss: 0.0310928 | val_loss: 0.0311475 | Time: 21521.2 ms [2022-01-17 19:06:20 main:574] : INFO : Epoch 1760 | loss: 0.0310988 | val_loss: 0.0311447 | Time: 20978.5 ms [2022-01-17 19:06:41 main:574] : INFO : Epoch 1761 | loss: 0.0310956 | val_loss: 0.031143 | Time: 21082.1 ms [2022-01-17 19:07:02 main:574] : INFO : Epoch 1762 | loss: 0.0310938 | val_loss: 0.0311454 | Time: 21140.1 ms [2022-01-17 19:07:22 main:574] : INFO : Epoch 1763 | loss: 0.0310973 | val_loss: 0.0311433 | Time: 20019.3 ms [2022-01-17 19:07:57 main:574] : INFO : Epoch 1764 | loss: 0.0310987 | val_loss: 0.0311398 | Time: 34354.3 ms [2022-01-17 19:08:18 main:574] : INFO : Epoch 1765 | loss: 0.0310994 | val_loss: 0.031147 | Time: 21221.4 ms [2022-01-17 19:08:39 main:574] : INFO : Epoch 1766 | loss: 0.0310973 | val_loss: 0.031148 | Time: 20651.9 ms [2022-01-17 19:09:00 main:574] : INFO : Epoch 1767 | loss: 0.0310969 | val_loss: 0.0311455 | Time: 21165.5 ms [2022-01-17 19:09:23 main:574] : INFO : Epoch 1768 | loss: 0.0310951 | val_loss: 0.031149 | Time: 22165 ms [2022-01-17 19:09:44 main:574] : INFO : Epoch 1769 | loss: 0.031093 | val_loss: 0.0311489 | Time: 21103.5 ms [2022-01-17 19:11:00 main:574] : INFO : Epoch 1770 | loss: 0.0310908 | val_loss: 0.0311492 | Time: 76523.4 ms [2022-01-17 19:11:21 main:574] : INFO : Epoch 1771 | loss: 0.0310882 | val_loss: 0.0311463 | Time: 20257 ms [2022-01-17 19:11:42 main:574] : INFO : Epoch 1772 | loss: 0.0310922 | val_loss: 0.0311459 | Time: 21012.1 ms [2022-01-17 19:12:03 main:574] : INFO : Epoch 1773 | loss: 0.0310954 | val_loss: 0.0311499 | Time: 21099.8 ms [2022-01-17 19:12:24 main:574] : INFO : Epoch 1774 | loss: 0.031105 | val_loss: 0.0311508 | Time: 20417.2 ms [2022-01-17 19:12:44 main:574] : INFO : Epoch 1775 | loss: 0.0311045 | val_loss: 0.0311447 | Time: 19922.1 ms [2022-01-17 19:13:16 main:574] : INFO : Epoch 1776 | loss: 0.0310989 | val_loss: 0.0311417 | Time: 32447.5 ms [2022-01-17 19:13:36 main:574] : INFO : Epoch 1777 | loss: 0.0310952 | val_loss: 0.0311421 | Time: 20085.7 ms [2022-01-17 19:13:58 main:574] : INFO : Epoch 1778 | loss: 0.0310923 | val_loss: 0.0311455 | Time: 22042.7 ms [2022-01-17 19:14:19 
main:574] : INFO : Epoch 1779 | loss: 0.0310978 | val_loss: 0.0311441 | Time: 21087 ms [2022-01-17 19:14:52 main:574] : INFO : Epoch 1780 | loss: 0.0310947 | val_loss: 0.0311465 | Time: 32629.1 ms [2022-01-17 19:15:12 main:574] : INFO : Epoch 1781 | loss: 0.031091 | val_loss: 0.0311435 | Time: 19976 ms [2022-01-17 19:15:34 main:574] : INFO : Epoch 1782 | loss: 0.0310891 | val_loss: 0.0311489 | Time: 21597.4 ms [2022-01-17 19:15:55 main:574] : INFO : Epoch 1783 | loss: 0.0310904 | val_loss: 0.0311468 | Time: 21321.7 ms [2022-01-17 19:16:17 main:574] : INFO : Epoch 1784 | loss: 0.0310896 | val_loss: 0.0311428 | Time: 21420.4 ms [2022-01-17 19:16:38 main:574] : INFO : Epoch 1785 | loss: 0.0310884 | val_loss: 0.0311484 | Time: 20810 ms [2022-01-17 19:16:59 main:574] : INFO : Epoch 1786 | loss: 0.0310873 | val_loss: 0.0311509 | Time: 20834.1 ms [2022-01-17 19:17:20 main:574] : INFO : Epoch 1787 | loss: 0.031112 | val_loss: 0.0311528 | Time: 21857.9 ms [2022-01-17 19:17:41 main:574] : INFO : Epoch 1788 | loss: 0.0311435 | val_loss: 0.0311512 | Time: 20262 ms [2022-01-17 19:18:25 main:574] : INFO : Epoch 1789 | loss: 0.0311349 | val_loss: 0.0311439 | Time: 44426 ms [2022-01-17 19:18:46 main:574] : INFO : Epoch 1790 | loss: 0.0311268 | val_loss: 0.0311466 | Time: 20498.6 ms [2022-01-17 19:19:06 main:574] : INFO : Epoch 1791 | loss: 0.0311253 | val_loss: 0.0311451 | Time: 20081.2 ms [2022-01-17 19:19:27 main:574] : INFO : Epoch 1792 | loss: 0.0311218 | val_loss: 0.0311429 | Time: 20303.1 ms [2022-01-17 19:19:48 main:574] : INFO : Epoch 1793 | loss: 0.0311192 | val_loss: 0.0311421 | Time: 20989.9 ms [2022-01-17 19:20:08 main:574] : INFO : Epoch 1794 | loss: 0.0311175 | val_loss: 0.0311392 | Time: 20209.1 ms [2022-01-17 19:20:29 main:574] : INFO : Epoch 1795 | loss: 0.0311142 | val_loss: 0.0311423 | Time: 20659.1 ms [2022-01-17 19:20:50 main:574] : INFO : Epoch 1796 | loss: 0.0311129 | val_loss: 0.0311441 | Time: 21744.9 ms [2022-01-17 19:21:11 main:574] : INFO : Epoch 1797 | loss: 0.0311143 | val_loss: 0.0311432 | Time: 20596.6 ms [2022-01-17 19:21:32 main:574] : INFO : Epoch 1798 | loss: 0.0311135 | val_loss: 0.0311508 | Time: 20796.4 ms [2022-01-17 19:21:54 main:574] : INFO : Epoch 1799 | loss: 0.0313274 | val_loss: 0.0325588 | Time: 21801.7 ms [2022-01-17 19:22:26 main:574] : INFO : Epoch 1800 | loss: 0.0502897 | val_loss: 0.0421246 | Time: 32253.3 ms [2022-01-17 19:22:47 main:574] : INFO : Epoch 1801 | loss: 0.0387829 | val_loss: 0.0380351 | Time: 20994.7 ms [2022-01-17 19:23:07 main:574] : INFO : Epoch 1802 | loss: 0.0357945 | val_loss: 0.0314705 | Time: 19238.9 ms [2022-01-17 19:23:32 main:574] : INFO : Epoch 1803 | loss: 0.0314919 | val_loss: 0.0312909 | Time: 25276.4 ms [2022-01-17 19:23:52 main:574] : INFO : Epoch 1804 | loss: 0.0312867 | val_loss: 0.0312576 | Time: 20320.4 ms [2022-01-17 19:24:13 main:574] : INFO : Epoch 1805 | loss: 0.031249 | val_loss: 0.0312418 | Time: 20446.7 ms [2022-01-17 19:24:34 main:574] : INFO : Epoch 1806 | loss: 0.0312395 | val_loss: 0.0312366 | Time: 21077.7 ms [2022-01-17 19:24:56 main:574] : INFO : Epoch 1807 | loss: 0.031235 | val_loss: 0.0312316 | Time: 21450 ms [2022-01-17 19:26:11 main:574] : INFO : Epoch 1808 | loss: 0.0312301 | val_loss: 0.031226 | Time: 75705.7 ms [2022-01-17 19:26:32 main:574] : INFO : Epoch 1809 | loss: 0.0312231 | val_loss: 0.0312178 | Time: 20530.1 ms [2022-01-17 19:26:52 main:574] : INFO : Epoch 1810 | loss: 0.0312153 | val_loss: 0.0312097 | Time: 19830.5 ms [2022-01-17 19:27:13 main:574] : INFO : Epoch 1811 | loss: 0.0312068 | 
val_loss: 0.0312012 | Time: 21508.7 ms [2022-01-17 19:27:33 main:574] : INFO : Epoch 1812 | loss: 0.0311962 | val_loss: 0.0311873 | Time: 19979.3 ms [2022-01-17 19:27:54 main:574] : INFO : Epoch 1813 | loss: 0.0311831 | val_loss: 0.0311777 | Time: 20891.8 ms [2022-01-17 19:28:15 main:574] : INFO : Epoch 1814 | loss: 0.031173 | val_loss: 0.0311689 | Time: 20813.4 ms [2022-01-17 19:28:47 main:574] : INFO : Epoch 1815 | loss: 0.0311657 | val_loss: 0.0311634 | Time: 31882.7 ms [2022-01-17 19:29:08 main:574] : INFO : Epoch 1816 | loss: 0.0311609 | val_loss: 0.0311598 | Time: 20428.8 ms [2022-01-17 19:29:28 main:574] : INFO : Epoch 1817 | loss: 0.0311575 | val_loss: 0.0311579 | Time: 20483.6 ms [2022-01-17 19:30:01 main:574] : INFO : Epoch 1818 | loss: 0.0311552 | val_loss: 0.0311565 | Time: 32339.1 ms [2022-01-17 19:30:22 main:574] : INFO : Epoch 1819 | loss: 0.0311536 | val_loss: 0.0311552 | Time: 20824.8 ms [2022-01-17 19:30:45 main:574] : INFO : Epoch 1820 | loss: 0.0311522 | val_loss: 0.0311543 | Time: 22844.5 ms [2022-01-17 19:31:06 main:574] : INFO : Epoch 1821 | loss: 0.031151 | val_loss: 0.031154 | Time: 20898.3 ms [2022-01-17 19:31:25 main:574] : INFO : Epoch 1822 | loss: 0.0311503 | val_loss: 0.0311533 | Time: 19857.2 ms [2022-01-17 19:31:45 main:574] : INFO : Epoch 1823 | loss: 0.0311494 | val_loss: 0.0311522 | Time: 19125.5 ms [2022-01-17 19:32:04 main:574] : INFO : Epoch 1824 | loss: 0.0311486 | val_loss: 0.0311527 | Time: 19669.3 ms [2022-01-17 19:32:24 main:574] : INFO : Epoch 1825 | loss: 0.0311482 | val_loss: 0.0311507 | Time: 19925.2 ms [2022-01-17 19:32:45 main:574] : INFO : Epoch 1826 | loss: 0.0311468 | val_loss: 0.0311507 | Time: 20492.2 ms [2022-01-17 19:33:04 main:574] : INFO : Epoch 1827 | loss: 0.0311465 | val_loss: 0.0311509 | Time: 19469.3 ms [2022-01-17 19:33:25 main:574] : INFO : Epoch 1828 | loss: 0.031146 | val_loss: 0.0311502 | Time: 20457.7 ms [2022-01-17 19:33:56 main:574] : INFO : Epoch 1829 | loss: 0.0311453 | val_loss: 0.0311499 | Time: 31172.4 ms [2022-01-17 19:34:16 main:574] : INFO : Epoch 1830 | loss: 0.0311446 | val_loss: 0.0311497 | Time: 19920.9 ms [2022-01-17 19:34:35 main:574] : INFO : Epoch 1831 | loss: 0.031144 | val_loss: 0.031149 | Time: 18850.6 ms [2022-01-17 19:34:56 main:574] : INFO : Epoch 1832 | loss: 0.0311434 | val_loss: 0.0311487 | Time: 20311.5 ms [2022-01-17 19:35:16 main:574] : INFO : Epoch 1833 | loss: 0.0311433 | val_loss: 0.0311489 | Time: 20134.3 ms [2022-01-17 19:35:26 main:574] : INFO : Epoch 1834 | loss: 0.0311428 | val_loss: 0.0311486 | Time: 9885.76 ms [2022-01-17 19:35:47 main:574] : INFO : Epoch 1835 | loss: 0.031142 | val_loss: 0.0311491 | Time: 21524 ms [2022-01-17 19:36:09 main:574] : INFO : Epoch 1836 | loss: 0.0311418 | val_loss: 0.031148 | Time: 21202.7 ms [2022-01-17 19:36:30 main:574] : INFO : Epoch 1837 | loss: 0.0311412 | val_loss: 0.0311479 | Time: 21335.2 ms [2022-01-17 19:36:51 main:574] : INFO : Epoch 1838 | loss: 0.0311411 | val_loss: 0.0311477 | Time: 20804.3 ms [2022-01-17 19:37:12 main:574] : INFO : Epoch 1839 | loss: 0.0311402 | val_loss: 0.0311481 | Time: 20806.7 ms [2022-01-17 19:37:33 main:574] : INFO : Epoch 1840 | loss: 0.0311397 | val_loss: 0.0311473 | Time: 21198.1 ms [2022-01-17 19:38:05 main:574] : INFO : Epoch 1841 | loss: 0.0311391 | val_loss: 0.0311466 | Time: 31654.1 ms [2022-01-17 19:38:27 main:574] : INFO : Epoch 1842 | loss: 0.0311384 | val_loss: 0.0311466 | Time: 21339.7 ms [2022-01-17 19:38:57 main:574] : INFO : Epoch 1843 | loss: 0.0311382 | val_loss: 0.0311463 | Time: 30508.6 ms 
[2022-01-17 19:39:31 main:574] : INFO : Epoch 1844 | loss: 0.0311382 | val_loss: 0.0311461 | Time: 33644.7 ms [2022-01-17 19:43:09 main:574] : INFO : Epoch 1845 | loss: 0.0311379 | val_loss: 0.0311455 | Time: 217536 ms [2022-01-17 19:43:30 main:574] : INFO : Epoch 1846 | loss: 0.0311375 | val_loss: 0.0311453 | Time: 21382.3 ms [2022-01-17 19:43:52 main:574] : INFO : Epoch 1847 | loss: 0.0311386 | val_loss: 0.0311448 | Time: 22231.1 ms [2022-01-17 19:44:16 main:574] : INFO : Epoch 1848 | loss: 0.0311376 | val_loss: 0.0311437 | Time: 23061.4 ms [2022-01-17 19:44:37 main:574] : INFO : Epoch 1849 | loss: 0.0311363 | val_loss: 0.0311455 | Time: 21133.1 ms [2022-01-17 19:45:09 main:574] : INFO : Epoch 1850 | loss: 0.0311357 | val_loss: 0.0311466 | Time: 32476.8 ms [2022-01-17 19:45:30 main:574] : INFO : Epoch 1851 | loss: 0.0311352 | val_loss: 0.0311461 | Time: 20542.5 ms [2022-01-17 19:45:49 main:574] : INFO : Epoch 1852 | loss: 0.0311356 | val_loss: 0.0311452 | Time: 19547 ms [2022-01-17 19:46:12 main:574] : INFO : Epoch 1853 | loss: 0.0311346 | val_loss: 0.0311446 | Time: 22102 ms [2022-01-17 19:46:32 main:574] : INFO : Epoch 1854 | loss: 0.031134 | val_loss: 0.0311452 | Time: 20295.3 ms [2022-01-17 19:46:54 main:574] : INFO : Epoch 1855 | loss: 0.0311336 | val_loss: 0.0311452 | Time: 21617.7 ms [2022-01-17 19:47:14 main:574] : INFO : Epoch 1856 | loss: 0.0311335 | val_loss: 0.0311447 | Time: 20341.2 ms [2022-01-17 19:47:35 main:574] : INFO : Epoch 1857 | loss: 0.0311318 | val_loss: 0.0311437 | Time: 21291.5 ms [2022-01-17 19:47:55 main:574] : INFO : Epoch 1858 | loss: 0.0311315 | val_loss: 0.0311432 | Time: 19720.5 ms [2022-01-17 19:48:18 main:574] : INFO : Epoch 1859 | loss: 0.0311313 | val_loss: 0.0311429 | Time: 22392.9 ms [2022-01-17 19:48:51 main:574] : INFO : Epoch 1860 | loss: 0.0311313 | val_loss: 0.0311425 | Time: 33148 ms [2022-01-17 19:49:24 main:574] : INFO : Epoch 1861 | loss: 0.03113 | val_loss: 0.0311431 | Time: 32714.6 ms [2022-01-17 19:49:44 main:574] : INFO : Epoch 1862 | loss: 0.0311296 | val_loss: 0.0311429 | Time: 20680 ms [2022-01-17 19:50:07 main:574] : INFO : Epoch 1863 | loss: 0.031129 | val_loss: 0.0311442 | Time: 22786.7 ms [2022-01-17 19:50:30 main:574] : INFO : Epoch 1864 | loss: 0.031129 | val_loss: 0.0311438 | Time: 22070.2 ms [2022-01-17 19:50:51 main:574] : INFO : Epoch 1865 | loss: 0.031128 | val_loss: 0.0311434 | Time: 21307.1 ms [2022-01-17 19:51:23 main:574] : INFO : Epoch 1866 | loss: 0.0311277 | val_loss: 0.0311444 | Time: 32114.5 ms [2022-01-17 19:51:44 main:574] : INFO : Epoch 1867 | loss: 0.0311296 | val_loss: 0.031142 | Time: 20161.6 ms [2022-01-17 19:52:04 main:574] : INFO : Epoch 1868 | loss: 0.0311293 | val_loss: 0.0311433 | Time: 20777.8 ms [2022-01-17 19:52:16 main:574] : INFO : Epoch 1869 | loss: 0.0311293 | val_loss: 0.0311413 | Time: 11662.5 ms [2022-01-17 19:52:37 main:574] : INFO : Epoch 1870 | loss: 0.0311267 | val_loss: 0.0311416 | Time: 21148.2 ms [2022-01-17 19:52:59 main:574] : INFO : Epoch 1871 | loss: 0.0311262 | val_loss: 0.0311413 | Time: 21418 ms [2022-01-17 19:53:21 main:574] : INFO : Epoch 1872 | loss: 0.0311298 | val_loss: 0.0311444 | Time: 21951.2 ms [2022-01-17 19:53:41 main:574] : INFO : Epoch 1873 | loss: 0.0311314 | val_loss: 0.0311428 | Time: 20356.7 ms [2022-01-17 19:54:02 main:574] : INFO : Epoch 1874 | loss: 0.0311298 | val_loss: 0.0311421 | Time: 20720.8 ms [2022-01-17 19:54:43 main:574] : INFO : Epoch 1875 | loss: 0.031127 | val_loss: 0.0311409 | Time: 41106.9 ms [2022-01-17 19:55:56 main:574] : INFO : Epoch 1876 | 
loss: 0.0311253 | val_loss: 0.0311425 | Time: 72629 ms [2022-01-17 19:56:17 main:574] : INFO : Epoch 1877 | loss: 0.0311245 | val_loss: 0.0311407 | Time: 20953.4 ms [2022-01-17 19:56:37 main:574] : INFO : Epoch 1878 | loss: 0.0311238 | val_loss: 0.0311399 | Time: 20421.6 ms [2022-01-17 19:56:59 main:574] : INFO : Epoch 1879 | loss: 0.0311242 | val_loss: 0.0311388 | Time: 21803.2 ms [2022-01-17 19:57:21 main:574] : INFO : Epoch 1880 | loss: 0.0311231 | val_loss: 0.0311404 | Time: 21286 ms [2022-01-17 19:57:42 main:574] : INFO : Epoch 1881 | loss: 0.0311232 | val_loss: 0.0311392 | Time: 20991.4 ms [2022-01-17 19:58:05 main:574] : INFO : Epoch 1882 | loss: 0.0311222 | val_loss: 0.0311397 | Time: 22341 ms [2022-01-17 19:58:26 main:574] : INFO : Epoch 1883 | loss: 0.0311223 | val_loss: 0.0311384 | Time: 21811.4 ms [2022-01-17 19:58:47 main:574] : INFO : Epoch 1884 | loss: 0.0311222 | val_loss: 0.0311367 | Time: 20181.2 ms [2022-01-17 19:59:21 main:574] : INFO : Epoch 1885 | loss: 0.0311206 | val_loss: 0.0311374 | Time: 34414.2 ms [2022-01-17 19:59:45 main:574] : INFO : Epoch 1886 | loss: 0.0311201 | val_loss: 0.0311394 | Time: 23872.1 ms [2022-01-17 20:00:18 main:574] : INFO : Epoch 1887 | loss: 0.0311198 | val_loss: 0.031138 | Time: 32683.4 ms [2022-01-17 20:00:39 main:574] : INFO : Epoch 1888 | loss: 0.0311181 | val_loss: 0.0311369 | Time: 21414.5 ms [2022-01-17 20:01:00 main:574] : INFO : Epoch 1889 | loss: 0.0311177 | val_loss: 0.0311383 | Time: 20658.2 ms [2022-01-17 20:01:22 main:574] : INFO : Epoch 1890 | loss: 0.0311178 | val_loss: 0.0311362 | Time: 21067.1 ms [2022-01-17 20:01:43 main:574] : INFO : Epoch 1891 | loss: 0.0311177 | val_loss: 0.0311375 | Time: 21414.9 ms [2022-01-17 20:02:05 main:574] : INFO : Epoch 1892 | loss: 0.031121 | val_loss: 0.031137 | Time: 21369.4 ms [2022-01-17 20:02:26 main:574] : INFO : Epoch 1893 | loss: 0.031121 | val_loss: 0.031136 | Time: 21011.1 ms [2022-01-17 20:02:46 main:574] : INFO : Epoch 1894 | loss: 0.0311189 | val_loss: 0.031135 | Time: 20233.4 ms [2022-01-17 20:03:07 main:574] : INFO : Epoch 1895 | loss: 0.0311182 | val_loss: 0.0311356 | Time: 21208.1 ms [2022-01-17 20:03:28 main:574] : INFO : Epoch 1896 | loss: 0.0311187 | val_loss: 0.0311371 | Time: 20841.6 ms [2022-01-17 20:03:49 main:574] : INFO : Epoch 1897 | loss: 0.031119 | val_loss: 0.0311371 | Time: 21059.3 ms [2022-01-17 20:04:20 main:574] : INFO : Epoch 1898 | loss: 0.0311181 | val_loss: 0.0311376 | Time: 30751.7 ms [2022-01-17 20:04:53 main:574] : INFO : Epoch 1899 | loss: 0.0311171 | val_loss: 0.0311384 | Time: 32224.8 ms [2022-01-17 20:05:13 main:574] : INFO : Epoch 1900 | loss: 0.0311195 | val_loss: 0.0311359 | Time: 20726.5 ms [2022-01-17 20:05:34 main:574] : INFO : Epoch 1901 | loss: 0.0311185 | val_loss: 0.0311364 | Time: 20739 ms [2022-01-17 20:05:55 main:574] : INFO : Epoch 1902 | loss: 0.0311178 | val_loss: 0.0311375 | Time: 20616.9 ms [2022-01-17 20:06:16 main:574] : INFO : Epoch 1903 | loss: 0.0311163 | val_loss: 0.0311362 | Time: 21364 ms [2022-01-17 20:06:37 main:574] : INFO : Epoch 1904 | loss: 0.0311168 | val_loss: 0.0311339 | Time: 20838.1 ms [2022-01-17 20:06:58 main:574] : INFO : Epoch 1905 | loss: 0.0311157 | val_loss: 0.0311359 | Time: 20712.6 ms [2022-01-17 20:07:19 main:574] : INFO : Epoch 1906 | loss: 0.0311143 | val_loss: 0.0311345 | Time: 20606.8 ms [2022-01-17 20:07:40 main:574] : INFO : Epoch 1907 | loss: 0.0311161 | val_loss: 0.0311339 | Time: 21727.6 ms [2022-01-17 20:08:14 main:574] : INFO : Epoch 1908 | loss: 0.0311144 | val_loss: 0.0311332 | Time: 
33050.7 ms [2022-01-17 20:08:35 main:574] : INFO : Epoch 1909 | loss: 0.0311146 | val_loss: 0.0311308 | Time: 20969.9 ms [2022-01-17 20:08:56 main:574] : INFO : Epoch 1910 | loss: 0.0311127 | val_loss: 0.0311313 | Time: 21079.7 ms [2022-01-17 20:09:17 main:574] : INFO : Epoch 1911 | loss: 0.0311125 | val_loss: 0.0311302 | Time: 20529 ms [2022-01-17 20:09:39 main:574] : INFO : Epoch 1912 | loss: 0.0311113 | val_loss: 0.0311304 | Time: 22150.8 ms [2022-01-17 20:11:05 main:574] : INFO : Epoch 1913 | loss: 0.0311131 | val_loss: 0.0311311 | Time: 86601.9 ms [2022-01-17 20:11:26 main:574] : INFO : Epoch 1914 | loss: 0.0311124 | val_loss: 0.0311309 | Time: 20323.6 ms [2022-01-17 20:11:47 main:574] : INFO : Epoch 1915 | loss: 0.0311127 | val_loss: 0.0311365 | Time: 20489.8 ms [2022-01-17 20:12:08 main:574] : INFO : Epoch 1916 | loss: 0.0311113 | val_loss: 0.0311323 | Time: 20990.2 ms [2022-01-17 20:12:28 main:574] : INFO : Epoch 1917 | loss: 0.0311101 | val_loss: 0.0311325 | Time: 20070 ms [2022-01-17 20:12:49 main:574] : INFO : Epoch 1918 | loss: 0.0311092 | val_loss: 0.0311332 | Time: 21150.5 ms [2022-01-17 20:13:11 main:574] : INFO : Epoch 1919 | loss: 0.0311105 | val_loss: 0.0311314 | Time: 21675.3 ms [2022-01-17 20:13:32 main:574] : INFO : Epoch 1920 | loss: 0.0311096 | val_loss: 0.0311338 | Time: 21587.9 ms [2022-01-17 20:13:51 main:574] : INFO : Epoch 1921 | loss: 0.0311081 | val_loss: 0.0311347 | Time: 18987.4 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce 930MX) [2022-01-17 20:43:01 main:435] : INFO : Set logging level to 1 [2022-01-17 20:43:01 main:441] : INFO : Running in BOINC Client mode [2022-01-17 20:43:01 main:444] : INFO : Resolving all filenames [2022-01-17 20:43:01 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-17 20:43:01 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-17 20:43:01 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-17 20:43:01 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-01-17 20:43:01 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-17 20:43:01 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-17 20:43:01 main:474] : INFO : Configuration: [2022-01-17 20:43:01 main:475] : INFO : Model type: GRU [2022-01-17 20:43:01 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-17 20:43:01 main:477] : INFO : Max Epochs: 2048 [2022-01-17 20:43:01 main:478] : INFO : Batch Size: 128 [2022-01-17 20:43:01 main:479] : INFO : Learning Rate: 0.01 [2022-01-17 20:43:01 main:480] : INFO : Patience: 10 [2022-01-17 20:43:01 main:481] : INFO : Hidden Width: 12 [2022-01-17 20:43:01 main:482] : INFO : # Recurrent Layers: 4 [2022-01-17 20:43:01 main:483] : INFO : # Backend Layers: 4 [2022-01-17 20:43:01 main:484] : INFO : # Threads: 1 [2022-01-17 20:43:01 main:486] : INFO : Preparing Dataset [2022-01-17 20:43:01 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-17 20:43:02 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-17 20:43:07 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. 
[2022-01-17 20:43:07 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-17 20:43:07 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-17 20:43:07 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-17 20:43:07 main:494] : INFO : Creating Model [2022-01-17 20:43:07 main:507] : INFO : Preparing config file [2022-01-17 20:43:07 main:511] : INFO : Found checkpoint, attempting to load... [2022-01-17 20:43:07 main:512] : INFO : Loading config [2022-01-17 20:43:07 main:514] : INFO : Loading state [2022-01-17 20:43:10 main:559] : INFO : Loading DataLoader into Memory [2022-01-17 20:43:10 main:562] : INFO : Starting Training [2022-01-17 20:43:32 main:574] : INFO : Epoch 1914 | loss: 0.031772 | val_loss: 0.0313316 | Time: 21913.7 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce 930MX) [2022-01-17 20:48:18 main:435] : INFO : Set logging level to 1 [2022-01-17 20:48:18 main:441] : INFO : Running in BOINC Client mode [2022-01-17 20:48:18 main:444] : INFO : Resolving all filenames [2022-01-17 20:48:19 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-17 20:48:19 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-17 20:48:19 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-17 20:48:19 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-01-17 20:48:19 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-17 20:48:19 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-17 20:48:19 main:474] : INFO : Configuration: [2022-01-17 20:48:19 main:475] : INFO : Model type: GRU [2022-01-17 20:48:19 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-17 20:48:19 main:477] : INFO : Max Epochs: 2048 [2022-01-17 20:48:19 main:478] : INFO : Batch Size: 128 [2022-01-17 20:48:19 main:479] : INFO : Learning Rate: 0.01 [2022-01-17 20:48:19 main:480] : INFO : Patience: 10 [2022-01-17 20:48:19 main:481] : INFO : Hidden Width: 12 [2022-01-17 20:48:19 main:482] : INFO : # Recurrent Layers: 4 [2022-01-17 20:48:19 main:483] : INFO : # Backend Layers: 4 [2022-01-17 20:48:19 main:484] : INFO : # Threads: 1 [2022-01-17 20:48:19 main:486] : INFO : Preparing Dataset [2022-01-17 20:48:19 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-17 20:48:19 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-17 20:48:22 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-01-17 20:48:22 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-17 20:48:23 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-17 20:48:23 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-17 20:48:23 main:494] : INFO : Creating Model [2022-01-17 20:48:23 main:507] : INFO : Preparing config file [2022-01-17 20:48:23 main:511] : INFO : Found checkpoint, attempting to load... 
[2022-01-17 20:48:23 main:512] : INFO : Loading config [2022-01-17 20:48:23 main:514] : INFO : Loading state [2022-01-17 20:48:24 main:559] : INFO : Loading DataLoader into Memory [2022-01-17 20:48:24 main:562] : INFO : Starting Training [2022-01-17 20:48:31 main:574] : INFO : Epoch 1914 | loss: 0.0319604 | val_loss: 0.0313769 | Time: 6875.83 ms [2022-01-17 20:48:39 main:574] : INFO : Epoch 1915 | loss: 0.0312588 | val_loss: 0.0312066 | Time: 7637.18 ms [2022-01-17 20:49:00 main:574] : INFO : Epoch 1916 | loss: 0.0311769 | val_loss: 0.0311628 | Time: 21302.7 ms [2022-01-17 20:49:22 main:574] : INFO : Epoch 1917 | loss: 0.0311354 | val_loss: 0.0311461 | Time: 21917.8 ms [2022-01-17 20:49:52 main:574] : INFO : Epoch 1918 | loss: 0.0311246 | val_loss: 0.0311385 | Time: 29659.9 ms [2022-01-17 20:50:12 main:574] : INFO : Epoch 1919 | loss: 0.0311216 | val_loss: 0.0311361 | Time: 19819 ms [2022-01-17 20:50:34 main:574] : INFO : Epoch 1920 | loss: 0.0311184 | val_loss: 0.0311308 | Time: 22490.2 ms [2022-01-17 20:50:55 main:574] : INFO : Epoch 1921 | loss: 0.0311156 | val_loss: 0.0311312 | Time: 20629.6 ms [2022-01-17 20:51:18 main:574] : INFO : Epoch 1922 | loss: 0.0311152 | val_loss: 0.0311287 | Time: 22894.9 ms [2022-01-17 20:51:39 main:574] : INFO : Epoch 1923 | loss: 0.0311136 | val_loss: 0.0311285 | Time: 21110 ms [2022-01-17 20:52:46 main:574] : INFO : Epoch 1924 | loss: 0.0311117 | val_loss: 0.0311299 | Time: 66928.8 ms [2022-01-17 20:56:05 main:574] : INFO : Epoch 1925 | loss: 0.0311117 | val_loss: 0.0311295 | Time: 198603 ms [2022-01-17 21:02:51 main:574] : INFO : Epoch 1926 | loss: 0.0311115 | val_loss: 0.0311301 | Time: 405975 ms [2022-01-17 21:03:12 main:574] : INFO : Epoch 1927 | loss: 0.0311107 | val_loss: 0.0311303 | Time: 20898.7 ms [2022-01-17 21:03:33 main:574] : INFO : Epoch 1928 | loss: 0.0311096 | val_loss: 0.0311296 | Time: 20209.8 ms [2022-01-17 21:03:53 main:574] : INFO : Epoch 1929 | loss: 0.0311123 | val_loss: 0.0311346 | Time: 20365.1 ms [2022-01-17 21:04:13 main:574] : INFO : Epoch 1930 | loss: 0.0311156 | val_loss: 0.0311304 | Time: 19539.9 ms [2022-01-17 21:04:33 main:574] : INFO : Epoch 1931 | loss: 0.0311152 | val_loss: 0.031129 | Time: 20753.3 ms [2022-01-17 21:04:54 main:574] : INFO : Epoch 1932 | loss: 0.0311132 | val_loss: 0.0311302 | Time: 20216.5 ms [2022-01-17 21:05:38 main:574] : INFO : Epoch 1933 | loss: 0.031111 | val_loss: 0.031129 | Time: 44011.5 ms [2022-01-17 21:05:58 main:574] : INFO : Epoch 1934 | loss: 0.0311105 | val_loss: 0.0311271 | Time: 20313.3 ms [2022-01-17 21:06:20 main:574] : INFO : Epoch 1935 | loss: 0.0311094 | val_loss: 0.0311293 | Time: 21480.8 ms [2022-01-17 21:06:42 main:574] : INFO : Epoch 1936 | loss: 0.0311097 | val_loss: 0.031129 | Time: 21146.8 ms [2022-01-17 21:07:03 main:574] : INFO : Epoch 1937 | loss: 0.0311066 | val_loss: 0.0311277 | Time: 21316.9 ms [2022-01-17 21:07:24 main:574] : INFO : Epoch 1938 | loss: 0.0311054 | val_loss: 0.0311298 | Time: 20791 ms [2022-01-17 21:07:44 main:574] : INFO : Epoch 1939 | loss: 0.0311064 | val_loss: 0.031131 | Time: 20429.9 ms [2022-01-17 21:08:06 main:574] : INFO : Epoch 1940 | loss: 0.0311088 | val_loss: 0.0311303 | Time: 21676.1 ms [2022-01-17 21:08:27 main:574] : INFO : Epoch 1941 | loss: 0.0311084 | val_loss: 0.031127 | Time: 21037.3 ms [2022-01-17 21:08:49 main:574] : INFO : Epoch 1942 | loss: 0.0311062 | val_loss: 0.0311322 | Time: 21481.8 ms [2022-01-17 21:09:20 main:574] : INFO : Epoch 1943 | loss: 0.0311065 | val_loss: 0.0311303 | Time: 31457.8 ms [2022-01-17 21:09:42 main:574] 
: INFO : Epoch 1944 | loss: 0.0311042 | val_loss: 0.0311288 | Time: 22060.1 ms [2022-01-17 21:10:04 main:574] : INFO : Epoch 1945 | loss: 0.0311036 | val_loss: 0.0311309 | Time: 21348.5 ms [2022-01-17 21:10:35 main:574] : INFO : Epoch 1946 | loss: 0.0311048 | val_loss: 0.0311302 | Time: 31601.9 ms [2022-01-17 21:10:56 main:574] : INFO : Epoch 1947 | loss: 0.0311075 | val_loss: 0.0311321 | Time: 20425.5 ms [2022-01-17 21:11:17 main:574] : INFO : Epoch 1948 | loss: 0.0311029 | val_loss: 0.0311325 | Time: 21194.6 ms [2022-01-17 21:11:38 main:574] : INFO : Epoch 1949 | loss: 0.0311036 | val_loss: 0.0311319 | Time: 21169.6 ms [2022-01-17 21:12:41 main:574] : INFO : Epoch 1950 | loss: 0.0311021 | val_loss: 0.0311356 | Time: 62134.7 ms [2022-01-17 21:13:02 main:574] : INFO : Epoch 1951 | loss: 0.0311008 | val_loss: 0.0311293 | Time: 20975.2 ms [2022-01-17 21:13:33 main:574] : INFO : Epoch 1952 | loss: 0.0310997 | val_loss: 0.0311303 | Time: 20736.9 ms [2022-01-17 21:13:54 main:574] : INFO : Epoch 1953 | loss: 0.0310993 | val_loss: 0.0311262 | Time: 21459.2 ms [2022-01-17 21:14:17 main:574] : INFO : Epoch 1954 | loss: 0.0311014 | val_loss: 0.0311292 | Time: 22603.7 ms [2022-01-17 21:14:37 main:574] : INFO : Epoch 1955 | loss: 0.0311032 | val_loss: 0.0311304 | Time: 20562.3 ms [2022-01-17 21:15:00 main:574] : INFO : Epoch 1956 | loss: 0.0311015 | val_loss: 0.0311291 | Time: 23009.5 ms [2022-01-17 21:15:45 main:574] : INFO : Epoch 1957 | loss: 0.0311002 | val_loss: 0.0311322 | Time: 44779 ms [2022-01-17 21:16:05 main:574] : INFO : Epoch 1958 | loss: 0.0311033 | val_loss: 0.0311317 | Time: 19363 ms [2022-01-17 21:16:26 main:574] : INFO : Epoch 1959 | loss: 0.031105 | val_loss: 0.0311307 | Time: 21065.7 ms [2022-01-17 21:16:47 main:574] : INFO : Epoch 1960 | loss: 0.0311184 | val_loss: 0.0311383 | Time: 21180.2 ms [2022-01-17 21:17:08 main:574] : INFO : Epoch 1961 | loss: 0.0311178 | val_loss: 0.0311366 | Time: 21167.3 ms [2022-01-17 21:17:30 main:574] : INFO : Epoch 1962 | loss: 0.031113 | val_loss: 0.0311376 | Time: 21629.8 ms [2022-01-17 21:17:51 main:574] : INFO : Epoch 1963 | loss: 0.0311106 | val_loss: 0.0311367 | Time: 21186 ms [2022-01-17 21:18:11 main:574] : INFO : Epoch 1964 | loss: 0.0311096 | val_loss: 0.0311385 | Time: 19945.3 ms [2022-01-17 21:18:32 main:574] : INFO : Epoch 1965 | loss: 0.0311192 | val_loss: 0.0311407 | Time: 20944.5 ms [2022-01-17 21:18:55 main:574] : INFO : Epoch 1966 | loss: 0.0311179 | val_loss: 0.0311372 | Time: 22372.4 ms [2022-01-17 21:19:15 main:574] : INFO : Epoch 1967 | loss: 0.0311158 | val_loss: 0.0311325 | Time: 20310.5 ms [2022-01-17 21:19:34 main:574] : INFO : Epoch 1968 | loss: 0.031111 | val_loss: 0.0311356 | Time: 19133.4 ms [2022-01-17 21:19:52 main:574] : INFO : Epoch 1969 | loss: 0.03111 | val_loss: 0.0311345 | Time: 17490.1 ms [2022-01-17 21:19:58 main:574] : INFO : Epoch 1970 | loss: 0.0311079 | val_loss: 0.0311447 | Time: 6757.86 ms [2022-01-17 21:20:05 main:574] : INFO : Epoch 1971 | loss: 0.0311101 | val_loss: 0.0311322 | Time: 6860 ms [2022-01-17 21:20:12 main:574] : INFO : Epoch 1972 | loss: 0.0311034 | val_loss: 0.0311318 | Time: 6948.2 ms [2022-01-17 21:20:19 main:574] : INFO : Epoch 1973 | loss: 0.0311019 | val_loss: 0.0311359 | Time: 6854.56 ms [2022-01-17 21:20:26 main:574] : INFO : Epoch 1974 | loss: 0.0311054 | val_loss: 0.0311293 | Time: 6911.43 ms [2022-01-17 21:20:36 main:574] : INFO : Epoch 1975 | loss: 0.0311044 | val_loss: 0.0311457 | Time: 9753.37 ms [2022-01-17 21:20:53 main:574] : INFO : Epoch 1976 | loss: 0.0311047 | val_loss: 
0.0311416 | Time: 16911.4 ms [2022-01-17 21:21:00 main:574] : INFO : Epoch 1977 | loss: 0.0311037 | val_loss: 0.0311325 | Time: 6865.45 ms [2022-01-17 21:21:07 main:574] : INFO : Epoch 1978 | loss: 0.0310996 | val_loss: 0.0311364 | Time: 6975.74 ms [2022-01-17 21:21:14 main:574] : INFO : Epoch 1979 | loss: 0.0311015 | val_loss: 0.0311365 | Time: 6763.96 ms [2022-01-17 21:21:21 main:574] : INFO : Epoch 1980 | loss: 0.0311024 | val_loss: 0.0311311 | Time: 6910.21 ms [2022-01-17 21:21:28 main:574] : INFO : Epoch 1981 | loss: 0.031104 | val_loss: 0.0311367 | Time: 6890.12 ms [2022-01-17 21:21:34 main:574] : INFO : Epoch 1982 | loss: 0.0311038 | val_loss: 0.0311322 | Time: 6874.56 ms [2022-01-17 21:21:41 main:574] : INFO : Epoch 1983 | loss: 0.0311011 | val_loss: 0.0311358 | Time: 6815.16 ms [2022-01-17 21:21:48 main:574] : INFO : Epoch 1984 | loss: 0.0311013 | val_loss: 0.0311328 | Time: 6917.62 ms [2022-01-17 21:21:55 main:574] : INFO : Epoch 1985 | loss: 0.0310988 | val_loss: 0.0311304 | Time: 6996.46 ms [2022-01-17 21:22:02 main:574] : INFO : Epoch 1986 | loss: 0.0310983 | val_loss: 0.031132 | Time: 6773.9 ms [2022-01-17 21:22:09 main:574] : INFO : Epoch 1987 | loss: 0.0310974 | val_loss: 0.0311306 | Time: 6964.42 ms [2022-01-17 21:22:16 main:574] : INFO : Epoch 1988 | loss: 0.0310966 | val_loss: 0.0311329 | Time: 6754.69 ms [2022-01-17 21:22:23 main:574] : INFO : Epoch 1989 | loss: 0.0310981 | val_loss: 0.0311328 | Time: 7064.59 ms [2022-01-17 21:22:30 main:574] : INFO : Epoch 1990 | loss: 0.0310972 | val_loss: 0.0311301 | Time: 6762.62 ms [2022-01-17 21:22:37 main:574] : INFO : Epoch 1991 | loss: 0.0310993 | val_loss: 0.0311305 | Time: 6911.64 ms [2022-01-17 21:22:43 main:574] : INFO : Epoch 1992 | loss: 0.0311005 | val_loss: 0.0311401 | Time: 6782.61 ms [2022-01-17 21:22:50 main:574] : INFO : Epoch 1993 | loss: 0.0311016 | val_loss: 0.0311296 | Time: 6775.61 ms [2022-01-17 21:22:58 main:574] : INFO : Epoch 1994 | loss: 0.031098 | val_loss: 0.0311309 | Time: 7373.75 ms [2022-01-17 21:23:04 main:574] : INFO : Epoch 1995 | loss: 0.0310993 | val_loss: 0.0311292 | Time: 6756.92 ms [2022-01-17 21:23:13 main:574] : INFO : Epoch 1996 | loss: 0.0310981 | val_loss: 0.0311287 | Time: 8234.12 ms [2022-01-17 21:23:20 main:574] : INFO : Epoch 1997 | loss: 0.0310968 | val_loss: 0.0311308 | Time: 7244.57 ms [2022-01-17 21:23:27 main:574] : INFO : Epoch 1998 | loss: 0.0310973 | val_loss: 0.0311317 | Time: 6829.87 ms [2022-01-17 21:23:34 main:574] : INFO : Epoch 1999 | loss: 0.0310991 | val_loss: 0.0311316 | Time: 6778.99 ms [2022-01-17 21:23:41 main:574] : INFO : Epoch 2000 | loss: 0.0310987 | val_loss: 0.0311301 | Time: 7190.05 ms [2022-01-17 21:23:48 main:574] : INFO : Epoch 2001 | loss: 0.0310977 | val_loss: 0.0311289 | Time: 6797.46 ms [2022-01-17 21:23:55 main:574] : INFO : Epoch 2002 | loss: 0.0310968 | val_loss: 0.0311295 | Time: 6769.66 ms [2022-01-17 21:24:02 main:574] : INFO : Epoch 2003 | loss: 0.031098 | val_loss: 0.03113 | Time: 7302.76 ms [2022-01-17 21:24:09 main:574] : INFO : Epoch 2004 | loss: 0.0310941 | val_loss: 0.0311281 | Time: 6788.11 ms [2022-01-17 21:24:15 main:574] : INFO : Epoch 2005 | loss: 0.0310934 | val_loss: 0.0311362 | Time: 6787.5 ms [2022-01-17 21:24:22 main:574] : INFO : Epoch 2006 | loss: 0.0310924 | val_loss: 0.0311306 | Time: 6895.35 ms [2022-01-17 21:24:29 main:574] : INFO : Epoch 2007 | loss: 0.0310919 | val_loss: 0.0311334 | Time: 6792.74 ms [2022-01-17 21:24:36 main:574] : INFO : Epoch 2008 | loss: 0.0310934 | val_loss: 0.0311338 | Time: 6990.41 ms [2022-01-17 
21:24:43 main:574] : INFO : Epoch 2009 | loss: 0.0310929 | val_loss: 0.0311332 | Time: 6900.05 ms [2022-01-17 21:24:50 main:574] : INFO : Epoch 2010 | loss: 0.0310922 | val_loss: 0.0311302 | Time: 6767.48 ms [2022-01-17 21:24:57 main:574] : INFO : Epoch 2011 | loss: 0.0310927 | val_loss: 0.0311383 | Time: 6779.26 ms [2022-01-17 21:25:04 main:574] : INFO : Epoch 2012 | loss: 0.0310933 | val_loss: 0.0311337 | Time: 6788.13 ms [2022-01-17 21:25:10 main:574] : INFO : Epoch 2013 | loss: 0.0310919 | val_loss: 0.0311338 | Time: 6790.56 ms [2022-01-17 21:25:17 main:574] : INFO : Epoch 2014 | loss: 0.0310973 | val_loss: 0.0311309 | Time: 6781.98 ms [2022-01-17 21:25:24 main:574] : INFO : Epoch 2015 | loss: 0.0310979 | val_loss: 0.0311324 | Time: 6770.33 ms [2022-01-17 21:25:31 main:574] : INFO : Epoch 2016 | loss: 0.0310979 | val_loss: 0.0311389 | Time: 6788.59 ms [2022-01-17 21:25:38 main:574] : INFO : Epoch 2017 | loss: 0.0310976 | val_loss: 0.0311301 | Time: 6784.28 ms [2022-01-17 21:25:45 main:574] : INFO : Epoch 2018 | loss: 0.0310949 | val_loss: 0.0311324 | Time: 7236.05 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce 930MX) [2022-01-17 21:29:54 main:435] : INFO : Set logging level to 1 [2022-01-17 21:29:54 main:441] : INFO : Running in BOINC Client mode [2022-01-17 21:29:54 main:444] : INFO : Resolving all filenames [2022-01-17 21:29:54 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-17 21:29:54 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-17 21:29:54 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-17 21:29:54 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-01-17 21:29:54 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-17 21:29:54 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-17 21:29:54 main:474] : INFO : Configuration: [2022-01-17 21:29:54 main:475] : INFO : Model type: GRU [2022-01-17 21:29:54 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-17 21:29:54 main:477] : INFO : Max Epochs: 2048 [2022-01-17 21:29:54 main:478] : INFO : Batch Size: 128 [2022-01-17 21:29:54 main:479] : INFO : Learning Rate: 0.01 [2022-01-17 21:29:54 main:480] : INFO : Patience: 10 [2022-01-17 21:29:54 main:481] : INFO : Hidden Width: 12 [2022-01-17 21:29:54 main:482] : INFO : # Recurrent Layers: 4 [2022-01-17 21:29:54 main:483] : INFO : # Backend Layers: 4 [2022-01-17 21:29:55 main:484] : INFO : # Threads: 1 [2022-01-17 21:29:55 main:486] : INFO : Preparing Dataset [2022-01-17 21:29:55 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-17 21:29:55 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-17 21:29:58 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-01-17 21:29:58 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-17 21:29:58 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-17 21:29:58 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-17 21:29:58 main:494] : INFO : Creating Model [2022-01-17 21:29:58 main:507] : INFO : Preparing config file [2022-01-17 21:29:58 main:511] : INFO : Found checkpoint, attempting to load... 
[2022-01-17 21:29:58 main:512] : INFO : Loading config [2022-01-17 21:29:59 main:514] : INFO : Loading state [2022-01-17 21:29:59 main:559] : INFO : Loading DataLoader into Memory [2022-01-17 21:30:00 main:562] : INFO : Starting Training [2022-01-17 21:30:17 main:574] : INFO : Epoch 2000 | loss: 0.0312929 | val_loss: 0.0312151 | Time: 17587.2 ms [2022-01-17 21:30:24 main:574] : INFO : Epoch 2001 | loss: 0.0311437 | val_loss: 0.0311447 | Time: 6722.24 ms [2022-01-17 21:30:31 main:574] : INFO : Epoch 2002 | loss: 0.0311058 | val_loss: 0.0311373 | Time: 6678.63 ms [2022-01-17 21:30:38 main:574] : INFO : Epoch 2003 | loss: 0.0311023 | val_loss: 0.0311287 | Time: 6721.93 ms [2022-01-17 21:30:44 main:574] : INFO : Epoch 2004 | loss: 0.0310988 | val_loss: 0.0311324 | Time: 6845.93 ms [2022-01-17 21:31:43 main:574] : INFO : Epoch 2005 | loss: 0.0310971 | val_loss: 0.0311325 | Time: 58620.2 ms [2022-01-17 21:31:50 main:574] : INFO : Epoch 2006 | loss: 0.0310972 | val_loss: 0.0311365 | Time: 6553.42 ms [2022-01-17 21:31:57 main:574] : INFO : Epoch 2007 | loss: 0.0310991 | val_loss: 0.0311409 | Time: 7010.85 ms [2022-01-17 21:32:03 main:574] : INFO : Epoch 2008 | loss: 0.0310978 | val_loss: 0.0311389 | Time: 6765.09 ms [2022-01-17 21:32:10 main:574] : INFO : Epoch 2009 | loss: 0.0311012 | val_loss: 0.0311316 | Time: 6737.77 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce 930MX) [2022-01-17 21:35:12 main:435] : INFO : Set logging level to 1 [2022-01-17 21:35:13 main:441] : INFO : Running in BOINC Client mode [2022-01-17 21:35:13 main:444] : INFO : Resolving all filenames [2022-01-17 21:35:13 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-17 21:35:13 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-17 21:35:13 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-17 21:35:13 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-01-17 21:35:13 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-17 21:35:13 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-17 21:35:13 main:474] : INFO : Configuration: [2022-01-17 21:35:13 main:475] : INFO : Model type: GRU [2022-01-17 21:35:13 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-17 21:35:13 main:477] : INFO : Max Epochs: 2048 [2022-01-17 21:35:13 main:478] : INFO : Batch Size: 128 [2022-01-17 21:35:13 main:479] : INFO : Learning Rate: 0.01 [2022-01-17 21:35:13 main:480] : INFO : Patience: 10 [2022-01-17 21:35:13 main:481] : INFO : Hidden Width: 12 [2022-01-17 21:35:13 main:482] : INFO : # Recurrent Layers: 4 [2022-01-17 21:35:13 main:483] : INFO : # Backend Layers: 4 [2022-01-17 21:35:13 main:484] : INFO : # Threads: 1 [2022-01-17 21:35:13 main:486] : INFO : Preparing Dataset [2022-01-17 21:35:13 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-17 21:35:13 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-17 21:35:16 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. [2022-01-17 21:35:16 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-17 21:35:16 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-17 21:35:17 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. 
[2022-01-17 21:35:17 main:494] : INFO : Creating Model [2022-01-17 21:35:17 main:507] : INFO : Preparing config file [2022-01-17 21:35:17 main:511] : INFO : Found checkpoint, attempting to load... [2022-01-17 21:35:17 main:512] : INFO : Loading config [2022-01-17 21:35:17 main:514] : INFO : Loading state [2022-01-17 21:35:18 main:559] : INFO : Loading DataLoader into Memory [2022-01-17 21:35:18 main:562] : INFO : Starting Training [2022-01-17 21:35:25 main:574] : INFO : Epoch 2000 | loss: 0.0313057 | val_loss: 0.031217 | Time: 6997.1 ms [2022-01-17 21:35:31 main:574] : INFO : Epoch 2001 | loss: 0.0311497 | val_loss: 0.0311566 | Time: 6683.59 ms [2022-01-17 21:35:38 main:574] : INFO : Epoch 2002 | loss: 0.0311094 | val_loss: 0.0311379 | Time: 6684.62 ms [2022-01-17 21:35:45 main:574] : INFO : Epoch 2003 | loss: 0.0311021 | val_loss: 0.0311356 | Time: 6730.41 ms [2022-01-17 21:35:52 main:574] : INFO : Epoch 2004 | loss: 0.0310984 | val_loss: 0.0311344 | Time: 6873.48 ms [2022-01-17 21:35:59 main:574] : INFO : Epoch 2005 | loss: 0.0310965 | val_loss: 0.0311289 | Time: 6734.37 ms [2022-01-17 21:36:29 main:574] : INFO : Epoch 2006 | loss: 0.0310993 | val_loss: 0.0311293 | Time: 30056.1 ms [2022-01-17 21:36:35 main:574] : INFO : Epoch 2007 | loss: 0.0310996 | val_loss: 0.0311369 | Time: 6752.64 ms [2022-01-17 21:36:42 main:574] : INFO : Epoch 2008 | loss: 0.0310963 | val_loss: 0.0311352 | Time: 6982.06 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce 930MX) [2022-01-17 21:39:44 main:435] : INFO : Set logging level to 1 [2022-01-17 21:39:44 main:441] : INFO : Running in BOINC Client mode [2022-01-17 21:39:44 main:444] : INFO : Resolving all filenames [2022-01-17 21:39:44 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-17 21:39:44 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-17 21:39:44 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-17 21:39:44 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-01-17 21:39:44 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-17 21:39:44 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-17 21:39:44 main:474] : INFO : Configuration: [2022-01-17 21:39:44 main:475] : INFO : Model type: GRU [2022-01-17 21:39:44 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-17 21:39:44 main:477] : INFO : Max Epochs: 2048 [2022-01-17 21:39:44 main:478] : INFO : Batch Size: 128 [2022-01-17 21:39:44 main:479] : INFO : Learning Rate: 0.01 [2022-01-17 21:39:44 main:480] : INFO : Patience: 10 [2022-01-17 21:39:45 main:481] : INFO : Hidden Width: 12 [2022-01-17 21:39:45 main:482] : INFO : # Recurrent Layers: 4 [2022-01-17 21:39:45 main:483] : INFO : # Backend Layers: 4 [2022-01-17 21:39:45 main:484] : INFO : # Threads: 1 [2022-01-17 21:39:45 main:486] : INFO : Preparing Dataset [2022-01-17 21:39:45 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-17 21:39:45 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-17 21:39:59 load:106] : INFO : Successfully loaded dataset of 2048 examples into memory. 
[2022-01-17 21:39:59 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-17 21:39:59 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-17 21:40:00 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-17 21:40:00 main:494] : INFO : Creating Model [2022-01-17 21:40:00 main:507] : INFO : Preparing config file [2022-01-17 21:40:00 main:511] : INFO : Found checkpoint, attempting to load... [2022-01-17 21:40:00 main:512] : INFO : Loading config [2022-01-17 21:40:00 main:514] : INFO : Loading state [2022-01-17 21:40:02 main:559] : INFO : Loading DataLoader into Memory [2022-01-17 21:40:02 main:562] : INFO : Starting Training [2022-01-17 21:40:09 main:574] : INFO : Epoch 2000 | loss: 0.0312796 | val_loss: 0.0311959 | Time: 7082.83 ms [2022-01-17 21:40:15 main:574] : INFO : Epoch 2001 | loss: 0.03113 | val_loss: 0.0311392 | Time: 6643.32 ms [2022-01-17 21:40:22 main:574] : INFO : Epoch 2002 | loss: 0.0311033 | val_loss: 0.0311356 | Time: 6697.59 ms [2022-01-17 21:40:29 main:574] : INFO : Epoch 2003 | loss: 0.0311043 | val_loss: 0.0311386 | Time: 6737.83 ms [2022-01-17 21:40:36 main:574] : INFO : Epoch 2004 | loss: 0.031103 | val_loss: 0.0311312 | Time: 6920.21 ms [2022-01-17 21:40:43 main:574] : INFO : Epoch 2005 | loss: 0.0310994 | val_loss: 0.0311291 | Time: 6735.28 ms [2022-01-17 21:41:42 main:574] : INFO : Epoch 2006 | loss: 0.0311001 | val_loss: 0.0311335 | Time: 59312.4 ms [2022-01-17 21:41:49 main:574] : INFO : Epoch 2007 | loss: 0.0310994 | val_loss: 0.0311309 | Time: 6814.07 ms [2022-01-17 21:41:55 main:574] : INFO : Epoch 2008 | loss: 0.0310973 | val_loss: 0.0311338 | Time: 6758.89 ms Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce 930MX) [2022-01-17 21:45:02 main:435] : INFO : Set logging level to 1 [2022-01-17 21:45:03 main:441] : INFO : Running in BOINC Client mode [2022-01-17 21:45:03 main:444] : INFO : Resolving all filenames [2022-01-17 21:45:03 main:452] : INFO : Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1) [2022-01-17 21:45:03 main:452] : INFO : Resolved: model.cfg => model.cfg (exists = 1) [2022-01-17 21:45:03 main:452] : INFO : Resolved: model-final.pt => model-final.pt (exists = 0) [2022-01-17 21:45:03 main:452] : INFO : Resolved: model-input.pt => model-input.pt (exists = 1) [2022-01-17 21:45:03 main:452] : INFO : Resolved: snapshot.pt => snapshot.pt (exists = 1) [2022-01-17 21:45:03 main:472] : INFO : Dataset filename: dataset.hdf5 [2022-01-17 21:45:03 main:474] : INFO : Configuration: [2022-01-17 21:45:03 main:475] : INFO : Model type: GRU [2022-01-17 21:45:03 main:476] : INFO : Validation Loss Threshold: 0.0001 [2022-01-17 21:45:03 main:477] : INFO : Max Epochs: 2048 [2022-01-17 21:45:03 main:478] : INFO : Batch Size: 128 [2022-01-17 21:45:03 main:479] : INFO : Learning Rate: 0.01 [2022-01-17 21:45:03 main:480] : INFO : Patience: 10 [2022-01-17 21:45:03 main:481] : INFO : Hidden Width: 12 [2022-01-17 21:45:03 main:482] : INFO : # Recurrent Layers: 4 [2022-01-17 21:45:03 main:483] : INFO : # Backend Layers: 4 [2022-01-17 21:45:03 main:484] : INFO : # Threads: 1 [2022-01-17 21:45:03 main:486] : INFO : Preparing Dataset [2022-01-17 21:45:03 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xt from dataset.hdf5 into memory [2022-01-17 21:45:03 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yt from dataset.hdf5 into memory [2022-01-17 21:45:06 load:106] : INFO : Successfully loaded 
dataset of 2048 examples into memory. [2022-01-17 21:45:06 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Xv from dataset.hdf5 into memory [2022-01-17 21:45:06 load_hdf5_ds_into_tensor:28] : INFO : Loading Dataset /Yv from dataset.hdf5 into memory [2022-01-17 21:45:07 load:106] : INFO : Successfully loaded dataset of 512 examples into memory. [2022-01-17 21:45:07 main:494] : INFO : Creating Model [2022-01-17 21:45:07 main:507] : INFO : Preparing config file [2022-01-17 21:45:07 main:511] : INFO : Found checkpoint, attempting to load... [2022-01-17 21:45:07 main:512] : INFO : Loading config [2022-01-17 21:45:07 main:514] : INFO : Loading state [2022-01-17 21:45:08 main:559] : INFO : Loading DataLoader into Memory [2022-01-17 21:45:08 main:562] : INFO : Starting Training [2022-01-17 21:45:15 main:574] : INFO : Epoch 2000 | loss: 0.0312997 | val_loss: 0.0312116 | Time: 6877.79 ms [2022-01-17 21:45:21 main:574] : INFO : Epoch 2001 | loss: 0.0311431 | val_loss: 0.0311479 | Time: 6608.09 ms [2022-01-17 21:45:28 main:574] : INFO : Epoch 2002 | loss: 0.0311086 | val_loss: 0.0311341 | Time: 6711.08 ms [2022-01-17 21:45:35 main:574] : INFO : Epoch 2003 | loss: 0.0311035 | val_loss: 0.031134 | Time: 6713.35 ms [2022-01-17 21:45:42 main:574] : INFO : Epoch 2004 | loss: 0.0311008 | val_loss: 0.0311331 | Time: 6824.57 ms [2022-01-17 21:45:49 main:574] : INFO : Epoch 2005 | loss: 0.031098 | val_loss: 0.0311345 | Time: 6829.38 ms [2022-01-17 21:45:55 main:574] : INFO : Epoch 2006 | loss: 0.0311004 | val_loss: 0.0311323 | Time: 6950.27 ms [2022-01-17 21:46:02 main:574] : INFO : Epoch 2007 | loss: 0.0310977 | val_loss: 0.031133 | Time: 6804.27 ms [2022-01-17 21:46:10 main:574] : INFO : Epoch 2008 | loss: 0.0311001 | val_loss: 0.0311278 | Time: 7814.36 ms [2022-01-17 21:46:17 main:574] : INFO : Epoch 2009 | loss: 0.031098 | val_loss: 0.0311255 | Time: 6874.8 ms [2022-01-17 21:46:45 main:574] : INFO : Epoch 2010 | loss: 0.0310964 | val_loss: 0.0311319 | Time: 27760.7 ms [2022-01-17 21:46:52 main:574] : INFO : Epoch 2011 | loss: 0.0310963 | val_loss: 0.0311298 | Time: 6722.91 ms [2022-01-17 21:46:58 main:574] : INFO : Epoch 2012 | loss: 0.0310982 | val_loss: 0.0311344 | Time: 6738.73 ms [2022-01-17 21:47:07 main:574] : INFO : Epoch 2013 | loss: 0.0310966 | val_loss: 0.0311277 | Time: 8162.48 ms [2022-01-17 21:47:23 main:574] : INFO : Epoch 2014 | loss: 0.0310969 | val_loss: 0.0311283 | Time: 16896.6 ms [2022-01-17 21:47:30 main:574] : INFO : Epoch 2015 | loss: 0.0310984 | val_loss: 0.0311313 | Time: 6771.5 ms [2022-01-17 21:47:37 main:574] : INFO : Epoch 2016 | loss: 0.0310976 | val_loss: 0.0311322 | Time: 6849.79 ms [2022-01-17 21:47:44 main:574] : INFO : Epoch 2017 | loss: 0.0310973 | val_loss: 0.0311322 | Time: 6861.78 ms [2022-01-17 21:47:51 main:574] : INFO : Epoch 2018 | loss: 0.0310977 | val_loss: 0.0311339 | Time: 7262.48 ms [2022-01-17 21:47:59 main:574] : INFO : Epoch 2019 | loss: 0.0310998 | val_loss: 0.0311315 | Time: 7423.84 ms [2022-01-17 21:48:17 main:574] : INFO : Epoch 2020 | loss: 0.0310966 | val_loss: 0.0311396 | Time: 17808 ms [2022-01-17 21:48:24 main:574] : INFO : Epoch 2021 | loss: 0.0311005 | val_loss: 0.0311423 | Time: 7243.9 ms [2022-01-17 21:48:31 main:574] : INFO : Epoch 2022 | loss: 0.0310986 | val_loss: 0.0311308 | Time: 6899.62 ms [2022-01-17 21:48:38 main:574] : INFO : Epoch 2023 | loss: 0.0310953 | val_loss: 0.031129 | Time: 6918.31 ms [2022-01-17 21:48:45 main:574] : INFO : Epoch 2024 | loss: 0.0310945 | val_loss: 0.0311348 | Time: 6790.92 ms [2022-01-17 21:48:52 
main:574] : INFO : Epoch 2025 | loss: 0.031094 | val_loss: 0.0311282 | Time: 6980.28 ms [2022-01-17 21:48:59 main:574] : INFO : Epoch 2026 | loss: 0.0310946 | val_loss: 0.0311316 | Time: 6794.43 ms [2022-01-17 21:49:16 main:574] : INFO : Epoch 2027 | loss: 0.0310946 | val_loss: 0.0311347 | Time: 17426.6 ms [2022-01-17 21:49:23 main:574] : INFO : Epoch 2028 | loss: 0.0311027 | val_loss: 0.0311281 | Time: 6798.88 ms [2022-01-17 21:49:30 main:574] : INFO : Epoch 2029 | loss: 0.0311004 | val_loss: 0.0311351 | Time: 7117.2 ms [2022-01-17 21:49:37 main:574] : INFO : Epoch 2030 | loss: 0.031095 | val_loss: 0.0311288 | Time: 6814.1 ms [2022-01-17 21:49:44 main:574] : INFO : Epoch 2031 | loss: 0.0310947 | val_loss: 0.0311297 | Time: 7085.42 ms [2022-01-17 21:49:52 main:574] : INFO : Epoch 2032 | loss: 0.0310982 | val_loss: 0.0311402 | Time: 7797.2 ms [2022-01-17 21:50:09 main:574] : INFO : Epoch 2033 | loss: 0.0310975 | val_loss: 0.0311343 | Time: 16868.1 ms [2022-01-17 21:50:15 main:574] : INFO : Epoch 2034 | loss: 0.0311012 | val_loss: 0.0311333 | Time: 6755.57 ms [2022-01-17 21:50:23 main:574] : INFO : Epoch 2035 | loss: 0.0310978 | val_loss: 0.0311275 | Time: 7237.59 ms [2022-01-17 21:50:29 main:574] : INFO : Epoch 2036 | loss: 0.0311002 | val_loss: 0.0311347 | Time: 6757.67 ms [2022-01-17 21:50:37 main:574] : INFO : Epoch 2037 | loss: 0.0310982 | val_loss: 0.0311288 | Time: 7509.36 ms [2022-01-17 21:50:54 main:574] : INFO : Epoch 2038 | loss: 0.0310947 | val_loss: 0.0311336 | Time: 17526 ms [2022-01-17 21:51:01 main:574] : INFO : Epoch 2039 | loss: 0.0310934 | val_loss: 0.031128 | Time: 6811.23 ms [2022-01-17 21:51:08 main:574] : INFO : Epoch 2040 | loss: 0.0310933 | val_loss: 0.031129 | Time: 6787.62 ms [2022-01-17 21:51:15 main:574] : INFO : Epoch 2041 | loss: 0.0310924 | val_loss: 0.0311306 | Time: 6776.8 ms [2022-01-17 21:51:22 main:574] : INFO : Epoch 2042 | loss: 0.0310945 | val_loss: 0.0311311 | Time: 7151.05 ms [2022-01-17 21:51:29 main:574] : INFO : Epoch 2043 | loss: 0.0310957 | val_loss: 0.0311315 | Time: 7120.72 ms [2022-01-17 21:51:59 main:574] : INFO : Epoch 2044 | loss: 0.0310946 | val_loss: 0.0311355 | Time: 30052 ms [2022-01-17 21:52:07 main:574] : INFO : Epoch 2045 | loss: 0.0310959 | val_loss: 0.0311294 | Time: 7282.78 ms [2022-01-17 21:52:14 main:574] : INFO : Epoch 2046 | loss: 0.0310959 | val_loss: 0.0311313 | Time: 6836.65 ms [2022-01-17 21:52:21 main:574] : INFO : Epoch 2047 | loss: 0.0310943 | val_loss: 0.0311313 | Time: 7036.88 ms [2022-01-17 21:52:28 main:574] : INFO : Epoch 2048 | loss: 0.0310917 | val_loss: 0.0311341 | Time: 6960.86 ms [2022-01-17 21:52:28 main:597] : INFO : Saving trained model to model-final.pt, val_loss 0.0311341 [2022-01-17 21:52:28 main:603] : INFO : Saving end state to config to file [2022-01-17 21:52:28 main:608] : INFO : Success, exiting.. 21:52:28 (21596): called boinc_finish(0) </stderr_txt> ]]>
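The repeated startup banners in the log above show the BOINC client suspending and restarting the application several times; on each restart the app resolves its input files, reloads dataset.hdf5 (the /Xt, /Yt training split of 2048 examples and the /Xv, /Yv validation split of 512 examples), restores snapshot.pt, and resumes from the last checkpointed epoch until it reaches epoch 2048, writes model-final.pt, and calls boinc_finish(0). Below is a minimal Python/PyTorch sketch of that load-resume-train loop: the file names and HDF5 keys come from the log, but the function itself, the loss, the optimizer choice, and the snapshot layout are assumptions rather than the project's actual code (which is C++ using libTorch and the BOINC API).

```python
import os
import h5py
import torch
from torch.utils.data import DataLoader, TensorDataset

def load_split(path, x_key, y_key):
    """Load one split (e.g. /Xt, /Yt) from the HDF5 file into tensors."""
    with h5py.File(path, "r") as f:
        x = torch.tensor(f[x_key][...], dtype=torch.float32)
        y = torch.tensor(f[y_key][...], dtype=torch.float32)
    return TensorDataset(x, y)

def train_resumable(model, max_epochs=2048, batch_size=128, lr=0.01,
                    val_threshold=1e-4, data="dataset.hdf5",
                    snapshot="snapshot.pt", final="model-final.pt"):
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model.to(device)
    train_loader = DataLoader(load_split(data, "/Xt", "/Yt"),
                              batch_size=batch_size, shuffle=True)
    val_loader = DataLoader(load_split(data, "/Xv", "/Yv"),
                            batch_size=batch_size)
    loss_fn = torch.nn.MSELoss()                       # assumed; not stated in the log
    opt = torch.optim.Adam(model.parameters(), lr=lr)  # assumed optimizer

    start_epoch = 1
    if os.path.exists(snapshot):          # resume after a client restart
        state = torch.load(snapshot, map_location=device)
        model.load_state_dict(state["model"])
        opt.load_state_dict(state["optimizer"])
        start_epoch = state["epoch"] + 1

    for epoch in range(start_epoch, max_epochs + 1):
        model.train()
        train_loss = 0.0
        for x, y in train_loader:
            x, y = x.to(device), y.to(device)
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
            train_loss += loss.item() * x.size(0)
        train_loss /= len(train_loader.dataset)

        model.eval()
        val_loss = 0.0
        with torch.no_grad():
            for x, y in val_loader:
                x, y = x.to(device), y.to(device)
                val_loss += loss_fn(model(x), y).item() * x.size(0)
        val_loss /= len(val_loader.dataset)

        # Checkpoint every epoch so a restart can pick up where it left off.
        torch.save({"model": model.state_dict(),
                    "optimizer": opt.state_dict(),
                    "epoch": epoch}, snapshot)
        print(f"Epoch {epoch} | loss: {train_loss:.6g} | val_loss: {val_loss:.6g}")

        if val_loss < val_threshold:      # early exit on the loss threshold
            break

    torch.save(model.state_dict(), final)  # the finished model this task reports
```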
©2022 MLC@Home Team
A project of the Cognition, Robotics, and Learning (CORAL) Lab at the University of Maryland, Baltimore County (UMBC)