Task 12583961

Name: ParityModified-1645996871-27788-2-0_0
Workunit: 9743373
Created: 5 Mar 2022, 21:29:39 UTC
Sent: 9 Mar 2022, 16:49:28 UTC
Report deadline: 17 Mar 2022, 16:49:28 UTC
Received: 10 Mar 2022, 16:23:13 UTC
Server state: Over
Outcome: Success
Client state: Done
Exit status: 0 (0x00000000)
Computer ID: 19252
Run time: 50 min 54 sec
CPU time: 50 min 4 sec
Validate state: Valid
Credit: 4,160.00
Device peak FLOPS: 4,569.32 GFLOPS
Application version: Machine Learning Dataset Generator (GPU) v9.75 (cuda10200) windows_x86_64
Peak working set size: 1.63 GB
Peak swap size: 3.61 GB
Peak disk usage: 1.54 GB

Stderr output

<core_client_version>7.16.20</core_client_version>
<![CDATA[
<stderr_txt>
INFO	:	    Patience: 10
[2022-03-09 17:47:23	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-09 17:47:23	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-09 17:47:23	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-09 17:47:23	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-09 17:47:23	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-09 17:47:23	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-09 17:47:23	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-09 17:47:24	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-09 17:47:24	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-09 17:47:25	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-09 17:47:25	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-09 17:47:25	                main:494]	:	INFO	:	Creating Model
[2022-03-09 17:47:25	                main:507]	:	INFO	:	Preparing config file
[2022-03-09 17:47:25	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-09 17:47:25	                main:512]	:	INFO	:	Loading config
[2022-03-09 17:47:25	                main:514]	:	INFO	:	Loading state
[2022-03-09 17:47:25	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-09 17:47:25	                main:562]	:	INFO	:	Starting Training
[2022-03-09 17:47:27	                main:574]	:	INFO	:	Epoch 1867 | loss: 0.0311895 | val_loss: 0.0311747 | Time: 1795.74 ms
[2022-03-09 17:47:29	                main:574]	:	INFO	:	Epoch 1868 | loss: 0.0311686 | val_loss: 0.0311668 | Time: 1638.22 ms
[2022-03-09 17:47:31	                main:574]	:	INFO	:	Epoch 1869 | loss: 0.0311592 | val_loss: 0.0311638 | Time: 1883.99 ms
[2022-03-09 17:47:33	                main:574]	:	INFO	:	Epoch 1870 | loss: 0.0311561 | val_loss: 0.0311625 | Time: 2324.91 ms
[2022-03-09 17:47:35	                main:574]	:	INFO	:	Epoch 1871 | loss: 0.0311553 | val_loss: 0.0311609 | Time: 2354.72 ms
[2022-03-09 17:47:38	                main:574]	:	INFO	:	Epoch 1872 | loss: 0.0311531 | val_loss: 0.0311652 | Time: 2358 ms
[2022-03-09 17:47:40	                main:574]	:	INFO	:	Epoch 1873 | loss: 0.0311551 | val_loss: 0.0311665 | Time: 2356.7 ms
[2022-03-09 17:47:42	                main:574]	:	INFO	:	Epoch 1874 | loss: 0.0311553 | val_loss: 0.0311606 | Time: 2360.57 ms
[2022-03-09 17:47:45	                main:574]	:	INFO	:	Epoch 1875 | loss: 0.0311542 | val_loss: 0.0311588 | Time: 2349.65 ms
[2022-03-09 17:47:47	                main:574]	:	INFO	:	Epoch 1876 | loss: 0.0311538 | val_loss: 0.0311605 | Time: 2364.65 ms
[2022-03-09 17:47:49	                main:574]	:	INFO	:	Epoch 1877 | loss: 0.0311556 | val_loss: 0.0311585 | Time: 2357.96 ms
[2022-03-09 17:47:52	                main:574]	:	INFO	:	Epoch 1878 | loss: 0.0311533 | val_loss: 0.0311588 | Time: 2358.93 ms
[2022-03-09 17:47:54	                main:574]	:	INFO	:	Epoch 1879 | loss: 0.0311544 | val_loss: 0.0311624 | Time: 2362.94 ms
[2022-03-09 17:47:57	                main:574]	:	INFO	:	Epoch 1880 | loss: 0.0311558 | val_loss: 0.031164 | Time: 2373.27 ms
[2022-03-09 17:47:59	                main:574]	:	INFO	:	Epoch 1881 | loss: 0.0311582 | val_loss: 0.0311609 | Time: 2368.28 ms
[2022-03-09 17:48:01	                main:574]	:	INFO	:	Epoch 1882 | loss: 0.0311573 | val_loss: 0.0311611 | Time: 2361.46 ms
[2022-03-09 17:48:04	                main:574]	:	INFO	:	Epoch 1883 | loss: 0.0311567 | val_loss: 0.0311626 | Time: 2362.05 ms
[2022-03-09 17:48:06	                main:574]	:	INFO	:	Epoch 1884 | loss: 0.0311557 | val_loss: 0.0311615 | Time: 2357.52 ms
[2022-03-09 17:48:08	                main:574]	:	INFO	:	Epoch 1885 | loss: 0.0311552 | val_loss: 0.0311608 | Time: 2361.23 ms
[2022-03-09 17:48:11	                main:574]	:	INFO	:	Epoch 1886 | loss: 0.031153 | val_loss: 0.0311577 | Time: 2364.53 ms
[2022-03-09 17:48:13	                main:574]	:	INFO	:	Epoch 1887 | loss: 0.0311517 | val_loss: 0.0311571 | Time: 2364.89 ms
[2022-03-09 17:48:15	                main:574]	:	INFO	:	Epoch 1888 | loss: 0.0311502 | val_loss: 0.0311566 | Time: 2363.97 ms
[2022-03-09 17:48:18	                main:574]	:	INFO	:	Epoch 1889 | loss: 0.0311501 | val_loss: 0.0311586 | Time: 2357.9 ms
[2022-03-09 17:48:20	                main:574]	:	INFO	:	Epoch 1890 | loss: 0.0311516 | val_loss: 0.0311602 | Time: 2368.95 ms
[2022-03-09 17:48:23	                main:574]	:	INFO	:	Epoch 1891 | loss: 0.0311521 | val_loss: 0.0311599 | Time: 2410.45 ms
[2022-03-09 17:48:25	                main:574]	:	INFO	:	Epoch 1892 | loss: 0.0311521 | val_loss: 0.0311597 | Time: 2361.49 ms
[2022-03-09 17:48:27	                main:574]	:	INFO	:	Epoch 1893 | loss: 0.0311532 | val_loss: 0.0311553 | Time: 2370.77 ms
[2022-03-09 17:48:30	                main:574]	:	INFO	:	Epoch 1894 | loss: 0.0311508 | val_loss: 0.031156 | Time: 2358.01 ms
[2022-03-09 17:48:32	                main:574]	:	INFO	:	Epoch 1895 | loss: 0.0311527 | val_loss: 0.031158 | Time: 2362.13 ms
[2022-03-09 17:48:34	                main:574]	:	INFO	:	Epoch 1896 | loss: 0.0311525 | val_loss: 0.0311576 | Time: 2363.74 ms
[2022-03-09 17:48:37	                main:574]	:	INFO	:	Epoch 1897 | loss: 0.0311496 | val_loss: 0.0311567 | Time: 2355.81 ms
[2022-03-09 17:48:39	                main:574]	:	INFO	:	Epoch 1898 | loss: 0.0311485 | val_loss: 0.0311547 | Time: 2371.55 ms
[2022-03-09 17:48:42	                main:574]	:	INFO	:	Epoch 1899 | loss: 0.0311481 | val_loss: 0.0311584 | Time: 2371.44 ms
[2022-03-09 17:48:44	                main:574]	:	INFO	:	Epoch 1900 | loss: 0.031148 | val_loss: 0.0311527 | Time: 2360.52 ms
[2022-03-09 17:48:46	                main:574]	:	INFO	:	Epoch 1901 | loss: 0.0311459 | val_loss: 0.0311521 | Time: 2368.49 ms
[2022-03-09 17:48:49	                main:574]	:	INFO	:	Epoch 1902 | loss: 0.0311464 | val_loss: 0.0311554 | Time: 2368.47 ms
[2022-03-09 17:48:51	                main:574]	:	INFO	:	Epoch 1903 | loss: 0.0311458 | val_loss: 0.0311541 | Time: 2366.57 ms
[2022-03-09 17:48:53	                main:574]	:	INFO	:	Epoch 1904 | loss: 0.0311446 | val_loss: 0.0311533 | Time: 2373.03 ms
[2022-03-09 17:48:56	                main:574]	:	INFO	:	Epoch 1905 | loss: 0.0311462 | val_loss: 0.0311558 | Time: 2360.93 ms
[2022-03-09 17:48:58	                main:574]	:	INFO	:	Epoch 1906 | loss: 0.0311454 | val_loss: 0.0311542 | Time: 2375.07 ms
[2022-03-09 17:49:01	                main:574]	:	INFO	:	Epoch 1907 | loss: 0.031145 | val_loss: 0.0311542 | Time: 2359.95 ms
[2022-03-09 17:49:03	                main:574]	:	INFO	:	Epoch 1908 | loss: 0.0311444 | val_loss: 0.0311529 | Time: 2364.38 ms
[2022-03-09 17:49:05	                main:574]	:	INFO	:	Epoch 1909 | loss: 0.0311445 | val_loss: 0.0311545 | Time: 2366.75 ms
[2022-03-09 17:49:08	                main:574]	:	INFO	:	Epoch 1910 | loss: 0.0311434 | val_loss: 0.0311508 | Time: 2359.19 ms
[2022-03-09 17:49:10	                main:574]	:	INFO	:	Epoch 1911 | loss: 0.0311442 | val_loss: 0.0311584 | Time: 2363.31 ms
[2022-03-09 17:49:12	                main:574]	:	INFO	:	Epoch 1912 | loss: 0.0311447 | val_loss: 0.0311517 | Time: 2369.47 ms
[2022-03-09 17:49:15	                main:574]	:	INFO	:	Epoch 1913 | loss: 0.0311448 | val_loss: 0.0311481 | Time: 2365.21 ms
[2022-03-09 17:49:17	                main:574]	:	INFO	:	Epoch 1914 | loss: 0.0311424 | val_loss: 0.0311463 | Time: 2360.81 ms
[2022-03-09 17:49:19	                main:574]	:	INFO	:	Epoch 1915 | loss: 0.0311417 | val_loss: 0.0311484 | Time: 2363.79 ms
[2022-03-09 17:49:22	                main:574]	:	INFO	:	Epoch 1916 | loss: 0.0311419 | val_loss: 0.0311524 | Time: 2362.9 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB)
[2022-03-09 17:52:24	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-09 17:52:24	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-09 17:52:24	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-09 17:52:24	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-09 17:52:24	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-09 17:52:24	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-09 17:52:24	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-09 17:52:24	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-09 17:52:24	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-09 17:52:24	                main:474]	:	INFO	:	Configuration: 
[2022-03-09 17:52:24	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-09 17:52:24	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-09 17:52:24	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-09 17:52:24	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-09 17:52:24	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-09 17:52:24	                main:480]	:	INFO	:	    Patience: 10
[2022-03-09 17:52:24	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-09 17:52:24	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-09 17:52:24	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-09 17:52:24	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-09 17:52:24	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-09 17:52:24	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-09 17:52:24	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-09 17:52:26	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-09 17:52:26	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-09 17:52:26	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-09 17:52:26	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-09 17:52:26	                main:494]	:	INFO	:	Creating Model
[2022-03-09 17:52:26	                main:507]	:	INFO	:	Preparing config file
[2022-03-09 17:52:26	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-09 17:52:26	                main:512]	:	INFO	:	Loading config
[2022-03-09 17:52:26	                main:514]	:	INFO	:	Loading state
[2022-03-09 17:52:26	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-09 17:52:26	                main:562]	:	INFO	:	Starting Training
[2022-03-09 17:52:28	                main:574]	:	INFO	:	Epoch 1867 | loss: 0.0311886 | val_loss: 0.0311768 | Time: 1803.54 ms
[2022-03-09 17:52:30	                main:574]	:	INFO	:	Epoch 1868 | loss: 0.0311601 | val_loss: 0.0311633 | Time: 1614.29 ms
[2022-03-09 17:52:32	                main:574]	:	INFO	:	Epoch 1869 | loss: 0.0311571 | val_loss: 0.0311686 | Time: 1919.41 ms
[2022-03-09 17:52:34	                main:574]	:	INFO	:	Epoch 1870 | loss: 0.0311551 | val_loss: 0.031164 | Time: 2315.69 ms
[2022-03-09 17:52:36	                main:574]	:	INFO	:	Epoch 1871 | loss: 0.0311548 | val_loss: 0.0311642 | Time: 2326.96 ms
[2022-03-09 17:52:39	                main:574]	:	INFO	:	Epoch 1872 | loss: 0.0311535 | val_loss: 0.0311623 | Time: 2337.15 ms
[2022-03-09 17:52:41	                main:574]	:	INFO	:	Epoch 1873 | loss: 0.0311555 | val_loss: 0.0311601 | Time: 2345.2 ms
[2022-03-09 17:52:43	                main:574]	:	INFO	:	Epoch 1874 | loss: 0.0311571 | val_loss: 0.0311611 | Time: 2358.63 ms
[2022-03-09 17:52:46	                main:574]	:	INFO	:	Epoch 1875 | loss: 0.0311542 | val_loss: 0.0311603 | Time: 2349.61 ms
[2022-03-09 17:52:48	                main:574]	:	INFO	:	Epoch 1876 | loss: 0.0311536 | val_loss: 0.0311608 | Time: 2356.35 ms
[2022-03-09 17:52:51	                main:574]	:	INFO	:	Epoch 1877 | loss: 0.0311538 | val_loss: 0.03116 | Time: 2354.63 ms
[2022-03-09 17:52:53	                main:574]	:	INFO	:	Epoch 1878 | loss: 0.0311526 | val_loss: 0.0311602 | Time: 2352.14 ms
[2022-03-09 17:52:55	                main:574]	:	INFO	:	Epoch 1879 | loss: 0.0311521 | val_loss: 0.0311604 | Time: 2361.06 ms
[2022-03-09 17:52:58	                main:574]	:	INFO	:	Epoch 1880 | loss: 0.0311506 | val_loss: 0.0311612 | Time: 2356.04 ms
[2022-03-09 17:53:00	                main:574]	:	INFO	:	Epoch 1881 | loss: 0.0311504 | val_loss: 0.0311596 | Time: 2356.32 ms
[2022-03-09 17:53:02	                main:574]	:	INFO	:	Epoch 1882 | loss: 0.0311516 | val_loss: 0.0311623 | Time: 2354.37 ms
[2022-03-09 17:53:05	                main:574]	:	INFO	:	Epoch 1883 | loss: 0.0311573 | val_loss: 0.0311638 | Time: 2355.46 ms
[2022-03-09 17:53:07	                main:574]	:	INFO	:	Epoch 1884 | loss: 0.0311544 | val_loss: 0.03116 | Time: 2361.48 ms
[2022-03-09 17:53:09	                main:574]	:	INFO	:	Epoch 1885 | loss: 0.0311516 | val_loss: 0.0311608 | Time: 2366.93 ms
[2022-03-09 17:53:12	                main:574]	:	INFO	:	Epoch 1886 | loss: 0.0311544 | val_loss: 0.0311657 | Time: 2366.13 ms
[2022-03-09 17:53:14	                main:574]	:	INFO	:	Epoch 1887 | loss: 0.0311582 | val_loss: 0.0311622 | Time: 2355.56 ms
[2022-03-09 17:53:17	                main:574]	:	INFO	:	Epoch 1888 | loss: 0.0311568 | val_loss: 0.0311616 | Time: 2360.29 ms
[2022-03-09 17:53:19	                main:574]	:	INFO	:	Epoch 1889 | loss: 0.0311557 | val_loss: 0.0311594 | Time: 2358.26 ms
[2022-03-09 17:53:21	                main:574]	:	INFO	:	Epoch 1890 | loss: 0.0311565 | val_loss: 0.031162 | Time: 2355.42 ms
[2022-03-09 17:53:24	                main:574]	:	INFO	:	Epoch 1891 | loss: 0.0311597 | val_loss: 0.0311611 | Time: 2414.92 ms
[2022-03-09 17:53:26	                main:574]	:	INFO	:	Epoch 1892 | loss: 0.0311587 | val_loss: 0.0311644 | Time: 2355.81 ms
[2022-03-09 17:53:28	                main:574]	:	INFO	:	Epoch 1893 | loss: 0.031159 | val_loss: 0.0311633 | Time: 2366.2 ms
[2022-03-09 17:53:31	                main:574]	:	INFO	:	Epoch 1894 | loss: 0.0311599 | val_loss: 0.0311677 | Time: 2359.02 ms
[2022-03-09 17:53:33	                main:574]	:	INFO	:	Epoch 1895 | loss: 0.0311599 | val_loss: 0.0311653 | Time: 2358.68 ms
[2022-03-09 17:53:35	                main:574]	:	INFO	:	Epoch 1896 | loss: 0.0311591 | val_loss: 0.0311639 | Time: 2351.99 ms
[2022-03-09 17:53:38	                main:574]	:	INFO	:	Epoch 1897 | loss: 0.0311591 | val_loss: 0.0311631 | Time: 2360.65 ms
[2022-03-09 17:53:40	                main:574]	:	INFO	:	Epoch 1898 | loss: 0.0311602 | val_loss: 0.0311654 | Time: 2352.27 ms
[2022-03-09 17:53:43	                main:574]	:	INFO	:	Epoch 1899 | loss: 0.0311643 | val_loss: 0.031166 | Time: 2356.89 ms
[2022-03-09 17:53:45	                main:574]	:	INFO	:	Epoch 1900 | loss: 0.0311612 | val_loss: 0.031162 | Time: 2358.09 ms
[2022-03-09 17:53:47	                main:574]	:	INFO	:	Epoch 1901 | loss: 0.0311578 | val_loss: 0.0311617 | Time: 2363.4 ms
[2022-03-09 17:53:50	                main:574]	:	INFO	:	Epoch 1902 | loss: 0.0311573 | val_loss: 0.0311639 | Time: 2358.28 ms
[2022-03-09 17:53:52	                main:574]	:	INFO	:	Epoch 1903 | loss: 0.0311582 | val_loss: 0.0311642 | Time: 2363.7 ms
[2022-03-09 17:53:54	                main:574]	:	INFO	:	Epoch 1904 | loss: 0.0311644 | val_loss: 0.031166 | Time: 2355.74 ms
[2022-03-09 17:53:57	                main:574]	:	INFO	:	Epoch 1905 | loss: 0.0311629 | val_loss: 0.0311643 | Time: 2359.25 ms
[2022-03-09 17:53:59	                main:574]	:	INFO	:	Epoch 1906 | loss: 0.0311612 | val_loss: 0.0311643 | Time: 2358.03 ms
[2022-03-09 17:54:01	                main:574]	:	INFO	:	Epoch 1907 | loss: 0.031161 | val_loss: 0.0311649 | Time: 2355.92 ms
[2022-03-09 17:54:04	                main:574]	:	INFO	:	Epoch 1908 | loss: 0.031161 | val_loss: 0.0311637 | Time: 2355.09 ms
[2022-03-09 17:54:06	                main:574]	:	INFO	:	Epoch 1909 | loss: 0.0311613 | val_loss: 0.0311615 | Time: 2357.91 ms
[2022-03-09 17:54:09	                main:574]	:	INFO	:	Epoch 1910 | loss: 0.0311594 | val_loss: 0.0311625 | Time: 2354.52 ms
[2022-03-09 17:54:11	                main:574]	:	INFO	:	Epoch 1911 | loss: 0.0311569 | val_loss: 0.0311611 | Time: 2355.14 ms
[2022-03-09 17:54:13	                main:574]	:	INFO	:	Epoch 1912 | loss: 0.0311557 | val_loss: 0.0311631 | Time: 2349.16 ms
[2022-03-09 17:54:16	                main:574]	:	INFO	:	Epoch 1913 | loss: 0.0311559 | val_loss: 0.0311612 | Time: 2354.08 ms
[2022-03-09 17:54:18	                main:574]	:	INFO	:	Epoch 1914 | loss: 0.0311551 | val_loss: 0.0311609 | Time: 2351.76 ms
[2022-03-09 17:54:20	                main:574]	:	INFO	:	Epoch 1915 | loss: 0.0311554 | val_loss: 0.0311625 | Time: 2349.07 ms
[2022-03-09 17:54:23	                main:574]	:	INFO	:	Epoch 1916 | loss: 0.0311547 | val_loss: 0.0311657 | Time: 2351.15 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB)
[2022-03-09 17:57:25	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-09 17:57:25	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-09 17:57:25	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-09 17:57:25	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-09 17:57:25	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-09 17:57:25	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-09 17:57:25	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-09 17:57:25	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-09 17:57:25	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-09 17:57:25	                main:474]	:	INFO	:	Configuration: 
[2022-03-09 17:57:25	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-09 17:57:25	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-09 17:57:25	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-09 17:57:25	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-09 17:57:25	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-09 17:57:25	                main:480]	:	INFO	:	    Patience: 10
[2022-03-09 17:57:25	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-09 17:57:25	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-09 17:57:25	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-09 17:57:25	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-09 17:57:25	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-09 17:57:25	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-09 17:57:25	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-09 17:57:27	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-09 17:57:27	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-09 17:57:27	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-09 17:57:27	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-09 17:57:27	                main:494]	:	INFO	:	Creating Model
[2022-03-09 17:57:27	                main:507]	:	INFO	:	Preparing config file
[2022-03-09 17:57:27	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-09 17:57:27	                main:512]	:	INFO	:	Loading config
[2022-03-09 17:57:27	                main:514]	:	INFO	:	Loading state
[2022-03-09 17:57:27	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-09 17:57:27	                main:562]	:	INFO	:	Starting Training
[2022-03-09 17:57:29	                main:574]	:	INFO	:	Epoch 1867 | loss: 0.0311854 | val_loss: 0.0311681 | Time: 1803.25 ms
[2022-03-09 17:57:31	                main:574]	:	INFO	:	Epoch 1868 | loss: 0.0311587 | val_loss: 0.031162 | Time: 1618.93 ms
[2022-03-09 17:57:33	                main:574]	:	INFO	:	Epoch 1869 | loss: 0.0311557 | val_loss: 0.0311602 | Time: 1928.98 ms
[2022-03-09 17:57:35	                main:574]	:	INFO	:	Epoch 1870 | loss: 0.0311543 | val_loss: 0.0311623 | Time: 2332.41 ms
[2022-03-09 17:57:37	                main:574]	:	INFO	:	Epoch 1871 | loss: 0.0311554 | val_loss: 0.0311604 | Time: 2336.01 ms
[2022-03-09 17:57:40	                main:574]	:	INFO	:	Epoch 1872 | loss: 0.031154 | val_loss: 0.0311636 | Time: 2342.45 ms
[2022-03-09 17:57:42	                main:574]	:	INFO	:	Epoch 1873 | loss: 0.03116 | val_loss: 0.031165 | Time: 2350.21 ms
[2022-03-09 17:57:45	                main:574]	:	INFO	:	Epoch 1874 | loss: 0.0311574 | val_loss: 0.0311651 | Time: 2375.17 ms
[2022-03-09 17:57:47	                main:574]	:	INFO	:	Epoch 1875 | loss: 0.0311591 | val_loss: 0.0311612 | Time: 2370.38 ms
[2022-03-09 17:57:49	                main:574]	:	INFO	:	Epoch 1876 | loss: 0.0311562 | val_loss: 0.0311617 | Time: 2362.98 ms
[2022-03-09 17:57:52	                main:574]	:	INFO	:	Epoch 1877 | loss: 0.031155 | val_loss: 0.031161 | Time: 2368.34 ms
[2022-03-09 17:57:54	                main:574]	:	INFO	:	Epoch 1878 | loss: 0.0311543 | val_loss: 0.0311621 | Time: 2373.87 ms
[2022-03-09 17:57:56	                main:574]	:	INFO	:	Epoch 1879 | loss: 0.0311531 | val_loss: 0.031164 | Time: 2373.28 ms
[2022-03-09 17:57:59	                main:574]	:	INFO	:	Epoch 1880 | loss: 0.0311537 | val_loss: 0.0311621 | Time: 2366.81 ms
[2022-03-09 17:58:01	                main:574]	:	INFO	:	Epoch 1881 | loss: 0.031153 | val_loss: 0.0311625 | Time: 2376.9 ms
[2022-03-09 17:58:04	                main:574]	:	INFO	:	Epoch 1882 | loss: 0.0311517 | val_loss: 0.031158 | Time: 2371.75 ms
[2022-03-09 17:58:06	                main:574]	:	INFO	:	Epoch 1883 | loss: 0.0311513 | val_loss: 0.0311595 | Time: 2366.74 ms
[2022-03-09 17:58:08	                main:574]	:	INFO	:	Epoch 1884 | loss: 0.0311503 | val_loss: 0.0311573 | Time: 2373.87 ms
[2022-03-09 17:58:11	                main:574]	:	INFO	:	Epoch 1885 | loss: 0.0311494 | val_loss: 0.0311648 | Time: 2368.4 ms
[2022-03-09 17:58:13	                main:574]	:	INFO	:	Epoch 1886 | loss: 0.0311513 | val_loss: 0.0311573 | Time: 2369.52 ms
[2022-03-09 17:58:15	                main:574]	:	INFO	:	Epoch 1887 | loss: 0.0311492 | val_loss: 0.0311647 | Time: 2363.48 ms
[2022-03-09 17:58:18	                main:574]	:	INFO	:	Epoch 1888 | loss: 0.0311496 | val_loss: 0.0311589 | Time: 2375.32 ms
[2022-03-09 17:58:20	                main:574]	:	INFO	:	Epoch 1889 | loss: 0.0311482 | val_loss: 0.0311596 | Time: 2367.1 ms
[2022-03-09 17:58:23	                main:574]	:	INFO	:	Epoch 1890 | loss: 0.03115 | val_loss: 0.031158 | Time: 2373.45 ms
[2022-03-09 17:58:25	                main:574]	:	INFO	:	Epoch 1891 | loss: 0.0311478 | val_loss: 0.0311552 | Time: 2382.13 ms
[2022-03-09 17:58:27	                main:574]	:	INFO	:	Epoch 1892 | loss: 0.031147 | val_loss: 0.0311577 | Time: 2359.64 ms
[2022-03-09 17:58:30	                main:574]	:	INFO	:	Epoch 1893 | loss: 0.0311483 | val_loss: 0.0311613 | Time: 2359.66 ms
[2022-03-09 17:58:32	                main:574]	:	INFO	:	Epoch 1894 | loss: 0.0311501 | val_loss: 0.0311581 | Time: 2361.8 ms
[2022-03-09 17:58:34	                main:574]	:	INFO	:	Epoch 1895 | loss: 0.0311491 | val_loss: 0.0311572 | Time: 2357.62 ms
[2022-03-09 17:58:37	                main:574]	:	INFO	:	Epoch 1896 | loss: 0.0311513 | val_loss: 0.0311557 | Time: 2353.01 ms
[2022-03-09 17:58:39	                main:574]	:	INFO	:	Epoch 1897 | loss: 0.0311485 | val_loss: 0.0311565 | Time: 2357.41 ms
[2022-03-09 17:58:41	                main:574]	:	INFO	:	Epoch 1898 | loss: 0.0311497 | val_loss: 0.0311607 | Time: 2365.39 ms
[2022-03-09 17:58:44	                main:574]	:	INFO	:	Epoch 1899 | loss: 0.0311511 | val_loss: 0.0311587 | Time: 2360.34 ms
[2022-03-09 17:58:46	                main:574]	:	INFO	:	Epoch 1900 | loss: 0.0311498 | val_loss: 0.0311605 | Time: 2362.23 ms
[2022-03-09 17:58:49	                main:574]	:	INFO	:	Epoch 1901 | loss: 0.0311496 | val_loss: 0.0311552 | Time: 2362.51 ms
[2022-03-09 17:58:51	                main:574]	:	INFO	:	Epoch 1902 | loss: 0.0311482 | val_loss: 0.0311584 | Time: 2363.61 ms
[2022-03-09 17:58:53	                main:574]	:	INFO	:	Epoch 1903 | loss: 0.0311501 | val_loss: 0.0311591 | Time: 2359.24 ms
[2022-03-09 17:58:56	                main:574]	:	INFO	:	Epoch 1904 | loss: 0.0311479 | val_loss: 0.0311565 | Time: 2356.53 ms
[2022-03-09 17:58:58	                main:574]	:	INFO	:	Epoch 1905 | loss: 0.031149 | val_loss: 0.031156 | Time: 2359.5 ms
[2022-03-09 17:59:01	                main:574]	:	INFO	:	Epoch 1906 | loss: 0.0311489 | val_loss: 0.0311527 | Time: 2367.24 ms
[2022-03-09 17:59:03	                main:574]	:	INFO	:	Epoch 1907 | loss: 0.0311485 | val_loss: 0.031153 | Time: 2359.92 ms
[2022-03-09 17:59:05	                main:574]	:	INFO	:	Epoch 1908 | loss: 0.0311477 | val_loss: 0.0311529 | Time: 2357.41 ms
[2022-03-09 17:59:08	                main:574]	:	INFO	:	Epoch 1909 | loss: 0.031147 | val_loss: 0.0311531 | Time: 2358.51 ms
[2022-03-09 17:59:10	                main:574]	:	INFO	:	Epoch 1910 | loss: 0.0311531 | val_loss: 0.0311572 | Time: 2368.2 ms
[2022-03-09 17:59:12	                main:574]	:	INFO	:	Epoch 1911 | loss: 0.0311498 | val_loss: 0.0311551 | Time: 2352.94 ms
[2022-03-09 17:59:15	                main:574]	:	INFO	:	Epoch 1912 | loss: 0.0311476 | val_loss: 0.0311535 | Time: 2362.42 ms
[2022-03-09 17:59:17	                main:574]	:	INFO	:	Epoch 1913 | loss: 0.0311487 | val_loss: 0.0311544 | Time: 2356.25 ms
[2022-03-09 17:59:19	                main:574]	:	INFO	:	Epoch 1914 | loss: 0.0311488 | val_loss: 0.0311571 | Time: 2359.31 ms
[2022-03-09 17:59:22	                main:574]	:	INFO	:	Epoch 1915 | loss: 0.0311465 | val_loss: 0.0311529 | Time: 2356.34 ms
[2022-03-09 17:59:24	                main:574]	:	INFO	:	Epoch 1916 | loss: 0.0311449 | val_loss: 0.0311539 | Time: 2352.35 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB)
[2022-03-09 18:02:25	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-09 18:02:25	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-09 18:02:25	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-09 18:02:25	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-09 18:02:25	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-09 18:02:25	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-09 18:02:25	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-09 18:02:25	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-09 18:02:25	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-09 18:02:25	                main:474]	:	INFO	:	Configuration: 
[2022-03-09 18:02:25	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-09 18:02:25	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-09 18:02:25	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-09 18:02:25	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-09 18:02:25	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-09 18:02:25	                main:480]	:	INFO	:	    Patience: 10
[2022-03-09 18:02:25	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-09 18:02:25	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-09 18:02:25	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-09 18:02:26	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-09 18:02:26	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-09 18:02:26	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-09 18:02:26	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-09 18:02:27	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-09 18:02:27	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-09 18:02:27	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-09 18:02:27	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-09 18:02:27	                main:494]	:	INFO	:	Creating Model
[2022-03-09 18:02:27	                main:507]	:	INFO	:	Preparing config file
[2022-03-09 18:02:27	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-09 18:02:27	                main:512]	:	INFO	:	Loading config
[2022-03-09 18:02:27	                main:514]	:	INFO	:	Loading state
[2022-03-09 18:02:28	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-09 18:02:28	                main:562]	:	INFO	:	Starting Training
[2022-03-09 18:02:30	                main:574]	:	INFO	:	Epoch 1867 | loss: 0.0311869 | val_loss: 0.0311705 | Time: 1789.98 ms
[2022-03-09 18:02:31	                main:574]	:	INFO	:	Epoch 1868 | loss: 0.0311576 | val_loss: 0.0311625 | Time: 1611.16 ms
[2022-03-09 18:02:33	                main:574]	:	INFO	:	Epoch 1869 | loss: 0.0311548 | val_loss: 0.0311628 | Time: 1709.19 ms
[2022-03-09 18:02:35	                main:574]	:	INFO	:	Epoch 1870 | loss: 0.0311534 | val_loss: 0.0311633 | Time: 2307.02 ms
[2022-03-09 18:02:38	                main:574]	:	INFO	:	Epoch 1871 | loss: 0.0311538 | val_loss: 0.0311635 | Time: 2319.06 ms
[2022-03-09 18:02:40	                main:574]	:	INFO	:	Epoch 1872 | loss: 0.0311533 | val_loss: 0.0311651 | Time: 2342.14 ms
[2022-03-09 18:02:42	                main:574]	:	INFO	:	Epoch 1873 | loss: 0.0311559 | val_loss: 0.0311655 | Time: 2340.08 ms
[2022-03-09 18:02:45	                main:574]	:	INFO	:	Epoch 1874 | loss: 0.0311555 | val_loss: 0.0311655 | Time: 2344.38 ms
[2022-03-09 18:02:47	                main:574]	:	INFO	:	Epoch 1875 | loss: 0.031153 | val_loss: 0.0311629 | Time: 2358.03 ms
[2022-03-09 18:02:50	                main:574]	:	INFO	:	Epoch 1876 | loss: 0.031152 | val_loss: 0.0311632 | Time: 2351.79 ms
[2022-03-09 18:02:52	                main:574]	:	INFO	:	Epoch 1877 | loss: 0.0311531 | val_loss: 0.0311653 | Time: 2359 ms
[2022-03-09 18:02:54	                main:574]	:	INFO	:	Epoch 1878 | loss: 0.0311557 | val_loss: 0.0311637 | Time: 2356.77 ms
[2022-03-09 18:02:57	                main:574]	:	INFO	:	Epoch 1879 | loss: 0.0311548 | val_loss: 0.0311639 | Time: 2362.72 ms
[2022-03-09 18:02:59	                main:574]	:	INFO	:	Epoch 1880 | loss: 0.0311534 | val_loss: 0.0311654 | Time: 2359.24 ms
[2022-03-09 18:03:01	                main:574]	:	INFO	:	Epoch 1881 | loss: 0.0311525 | val_loss: 0.031163 | Time: 2356.29 ms
[2022-03-09 18:03:04	                main:574]	:	INFO	:	Epoch 1882 | loss: 0.0311517 | val_loss: 0.0311581 | Time: 2359.98 ms
[2022-03-09 18:03:06	                main:574]	:	INFO	:	Epoch 1883 | loss: 0.0311519 | val_loss: 0.0311594 | Time: 2355.43 ms
[2022-03-09 18:03:08	                main:574]	:	INFO	:	Epoch 1884 | loss: 0.0311526 | val_loss: 0.0311589 | Time: 2358.76 ms
[2022-03-09 18:03:11	                main:574]	:	INFO	:	Epoch 1885 | loss: 0.0311519 | val_loss: 0.0311585 | Time: 2366.67 ms
[2022-03-09 18:03:13	                main:574]	:	INFO	:	Epoch 1886 | loss: 0.0311534 | val_loss: 0.0311637 | Time: 2365.5 ms
[2022-03-09 18:03:16	                main:574]	:	INFO	:	Epoch 1887 | loss: 0.0311535 | val_loss: 0.0311592 | Time: 2361.96 ms
[2022-03-09 18:03:18	                main:574]	:	INFO	:	Epoch 1888 | loss: 0.0311545 | val_loss: 0.031159 | Time: 2361.27 ms
[2022-03-09 18:03:20	                main:574]	:	INFO	:	Epoch 1889 | loss: 0.0311522 | val_loss: 0.0311607 | Time: 2357.29 ms
[2022-03-09 18:03:23	                main:574]	:	INFO	:	Epoch 1890 | loss: 0.0311529 | val_loss: 0.0311592 | Time: 2357.14 ms
[2022-03-09 18:03:25	                main:574]	:	INFO	:	Epoch 1891 | loss: 0.0311511 | val_loss: 0.0311583 | Time: 2364.88 ms
[2022-03-09 18:03:27	                main:574]	:	INFO	:	Epoch 1892 | loss: 0.0311512 | val_loss: 0.0311658 | Time: 2380.89 ms
[2022-03-09 18:03:30	                main:574]	:	INFO	:	Epoch 1893 | loss: 0.031152 | val_loss: 0.0311629 | Time: 2349.12 ms
[2022-03-09 18:03:32	                main:574]	:	INFO	:	Epoch 1894 | loss: 0.0311544 | val_loss: 0.0311596 | Time: 2361.82 ms
[2022-03-09 18:03:34	                main:574]	:	INFO	:	Epoch 1895 | loss: 0.0311493 | val_loss: 0.0311569 | Time: 2347.14 ms
[2022-03-09 18:03:37	                main:574]	:	INFO	:	Epoch 1896 | loss: 0.03115 | val_loss: 0.0311565 | Time: 2350.62 ms
[2022-03-09 18:03:39	                main:574]	:	INFO	:	Epoch 1897 | loss: 0.0311508 | val_loss: 0.0311571 | Time: 2351.29 ms
[2022-03-09 18:03:42	                main:574]	:	INFO	:	Epoch 1898 | loss: 0.0311495 | val_loss: 0.0311565 | Time: 2351.65 ms
[2022-03-09 18:03:44	                main:574]	:	INFO	:	Epoch 1899 | loss: 0.0311482 | val_loss: 0.0311567 | Time: 2348.99 ms
[2022-03-09 18:03:46	                main:574]	:	INFO	:	Epoch 1900 | loss: 0.0311502 | val_loss: 0.0311579 | Time: 2351.43 ms
[2022-03-09 18:03:49	                main:574]	:	INFO	:	Epoch 1901 | loss: 0.0311486 | val_loss: 0.0311578 | Time: 2370.04 ms
[2022-03-09 18:03:51	                main:574]	:	INFO	:	Epoch 1902 | loss: 0.0311534 | val_loss: 0.0311567 | Time: 2355.62 ms
[2022-03-09 18:03:54	                main:574]	:	INFO	:	Epoch 1903 | loss: 0.0311521 | val_loss: 0.031157 | Time: 2354.63 ms
[2022-03-09 18:07:26	                main:574]	:	INFO	:	Epoch 1904 | loss: 0.0311504 | val_loss: 0.0311543 | Time: 212646 ms
[2022-03-09 18:07:30	                main:574]	:	INFO	:	Epoch 1905 | loss: 0.0311493 | val_loss: 0.0311542 | Time: 3166.59 ms
[2022-03-09 18:07:33	                main:574]	:	INFO	:	Epoch 1906 | loss: 0.0311477 | val_loss: 0.0311557 | Time: 3204.7 ms
[2022-03-09 18:07:36	                main:574]	:	INFO	:	Epoch 1907 | loss: 0.0311473 | val_loss: 0.0311528 | Time: 3259.21 ms
[2022-03-09 18:07:39	                main:574]	:	INFO	:	Epoch 1908 | loss: 0.0311464 | val_loss: 0.031152 | Time: 3272.72 ms
[2022-03-09 18:07:43	                main:574]	:	INFO	:	Epoch 1909 | loss: 0.0311447 | val_loss: 0.0311524 | Time: 3296.11 ms
[2022-03-09 18:07:46	                main:574]	:	INFO	:	Epoch 1910 | loss: 0.0311455 | val_loss: 0.0311511 | Time: 3301.72 ms
[2022-03-09 18:07:49	                main:574]	:	INFO	:	Epoch 1911 | loss: 0.031146 | val_loss: 0.0311569 | Time: 3291.29 ms
[2022-03-09 18:07:53	                main:574]	:	INFO	:	Epoch 1912 | loss: 0.0311443 | val_loss: 0.0311528 | Time: 3313.69 ms
[2022-03-09 18:07:56	                main:574]	:	INFO	:	Epoch 1913 | loss: 0.0311447 | val_loss: 0.031152 | Time: 3317.99 ms
[2022-03-09 18:08:00	                main:574]	:	INFO	:	Epoch 1914 | loss: 0.0311479 | val_loss: 0.0311524 | Time: 3322.87 ms
[2022-03-09 18:08:03	                main:574]	:	INFO	:	Epoch 1915 | loss: 0.0311483 | val_loss: 0.0311534 | Time: 3314.54 ms
[2022-03-09 18:08:06	                main:574]	:	INFO	:	Epoch 1916 | loss: 0.031145 | val_loss: 0.0311526 | Time: 3312.48 ms
[2022-03-09 18:08:10	                main:574]	:	INFO	:	Epoch 1917 | loss: 0.0311447 | val_loss: 0.0311518 | Time: 3308.06 ms
[2022-03-09 18:08:13	                main:574]	:	INFO	:	Epoch 1918 | loss: 0.0311439 | val_loss: 0.0311522 | Time: 3318.3 ms
[2022-03-09 18:08:17	                main:574]	:	INFO	:	Epoch 1919 | loss: 0.0311436 | val_loss: 0.031151 | Time: 3315.24 ms
[2022-03-09 18:08:20	                main:574]	:	INFO	:	Epoch 1920 | loss: 0.031142 | val_loss: 0.0311494 | Time: 3308.87 ms
[2022-03-09 18:08:23	                main:574]	:	INFO	:	Epoch 1921 | loss: 0.0311428 | val_loss: 0.0311514 | Time: 3310.1 ms
[2022-03-09 18:08:27	                main:574]	:	INFO	:	Epoch 1922 | loss: 0.0311434 | val_loss: 0.0311505 | Time: 3321.01 ms
[2022-03-09 18:08:30	                main:574]	:	INFO	:	Epoch 1923 | loss: 0.0311426 | val_loss: 0.0311511 | Time: 3321.1 ms
[2022-03-09 18:08:34	                main:574]	:	INFO	:	Epoch 1924 | loss: 0.0311426 | val_loss: 0.0311486 | Time: 3318.46 ms
[2022-03-09 18:08:37	                main:574]	:	INFO	:	Epoch 1925 | loss: 0.0311414 | val_loss: 0.0311484 | Time: 3317.12 ms
[2022-03-09 18:08:41	                main:574]	:	INFO	:	Epoch 1926 | loss: 0.0311408 | val_loss: 0.0311481 | Time: 3307.86 ms
[2022-03-09 18:08:44	                main:574]	:	INFO	:	Epoch 1927 | loss: 0.0311404 | val_loss: 0.0311489 | Time: 3308.02 ms
[2022-03-09 18:08:47	                main:574]	:	INFO	:	Epoch 1928 | loss: 0.0311402 | val_loss: 0.0311506 | Time: 3325.35 ms
[2022-03-09 18:08:51	                main:574]	:	INFO	:	Epoch 1929 | loss: 0.0311412 | val_loss: 0.0311487 | Time: 3321.84 ms
[2022-03-09 18:08:54	                main:574]	:	INFO	:	Epoch 1930 | loss: 0.031141 | val_loss: 0.0311486 | Time: 3304.6 ms
[2022-03-09 18:08:58	                main:574]	:	INFO	:	Epoch 1931 | loss: 0.0311482 | val_loss: 0.0311635 | Time: 3310.68 ms
[2022-03-09 18:09:01	                main:574]	:	INFO	:	Epoch 1932 | loss: 0.0311595 | val_loss: 0.0311568 | Time: 3312.98 ms
[2022-03-09 18:09:05	                main:574]	:	INFO	:	Epoch 1933 | loss: 0.0311556 | val_loss: 0.0311523 | Time: 3327.77 ms
[2022-03-09 18:09:08	                main:574]	:	INFO	:	Epoch 1934 | loss: 0.0311521 | val_loss: 0.0311524 | Time: 3304.24 ms
[2022-03-09 18:09:11	                main:574]	:	INFO	:	Epoch 1935 | loss: 0.0311507 | val_loss: 0.0311504 | Time: 3312.89 ms
[2022-03-09 18:09:15	                main:574]	:	INFO	:	Epoch 1936 | loss: 0.0311494 | val_loss: 0.031153 | Time: 3315.05 ms
[2022-03-09 18:09:18	                main:574]	:	INFO	:	Epoch 1937 | loss: 0.0311484 | val_loss: 0.031149 | Time: 3321.01 ms
[2022-03-09 18:09:21	                main:574]	:	INFO	:	Epoch 1938 | loss: 0.0311489 | val_loss: 0.0311491 | Time: 3313.89 ms
[2022-03-09 18:09:25	                main:574]	:	INFO	:	Epoch 1939 | loss: 0.0311484 | val_loss: 0.0311471 | Time: 3315.56 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB)
[2022-03-09 18:12:28	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-09 18:12:28	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-09 18:12:28	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-09 18:12:28	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-09 18:12:28	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-09 18:12:28	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-09 18:12:28	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-09 18:12:29	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-09 18:12:29	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-09 18:12:29	                main:474]	:	INFO	:	Configuration: 
[2022-03-09 18:12:29	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-09 18:12:29	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-09 18:12:29	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-09 18:12:29	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-09 18:12:29	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-09 18:12:29	                main:480]	:	INFO	:	    Patience: 10
[2022-03-09 18:12:29	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-09 18:12:29	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-09 18:12:29	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-09 18:12:29	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-09 18:12:29	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-09 18:12:29	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-09 18:12:29	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-09 18:12:31	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-09 18:12:31	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-09 18:12:31	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-09 18:12:31	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-09 18:12:31	                main:494]	:	INFO	:	Creating Model
[2022-03-09 18:12:31	                main:507]	:	INFO	:	Preparing config file
[2022-03-09 18:12:31	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-09 18:12:31	                main:512]	:	INFO	:	Loading config
[2022-03-09 18:12:31	                main:514]	:	INFO	:	Loading state
[2022-03-09 18:12:32	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-09 18:12:32	                main:562]	:	INFO	:	Starting Training
[2022-03-09 18:12:34	                main:574]	:	INFO	:	Epoch 1905 | loss: 0.0311885 | val_loss: 0.0311649 | Time: 1784.67 ms
[2022-03-09 18:12:36	                main:574]	:	INFO	:	Epoch 1906 | loss: 0.0311536 | val_loss: 0.0311588 | Time: 1630.35 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GTX 1060 6GB)
[2022-03-10 10:13:40	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-10 10:13:40	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-10 10:13:40	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-10 10:13:40	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-10 10:13:40	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-10 10:13:40	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-10 10:13:40	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-10 10:13:40	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-10 10:13:40	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-10 10:13:40	                main:474]	:	INFO	:	Configuration: 
[2022-03-10 10:13:40	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-10 10:13:40	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-10 10:13:40	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-10 10:13:40	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-10 10:13:40	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-10 10:13:40	                main:480]	:	INFO	:	    Patience: 10
[2022-03-10 10:13:40	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-10 10:13:40	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-10 10:13:40	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-10 10:13:40	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-10 10:13:40	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-10 10:13:40	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-10 10:13:40	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-10 10:13:42	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-10 10:13:42	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-10 10:13:42	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-10 10:13:43	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-10 10:13:43	                main:494]	:	INFO	:	Creating Model
[2022-03-10 10:13:43	                main:507]	:	INFO	:	Preparing config file
[2022-03-10 10:13:43	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-10 10:13:43	                main:512]	:	INFO	:	Loading config
[2022-03-10 10:13:43	                main:514]	:	INFO	:	Loading state
[2022-03-10 10:13:44	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-10 10:13:44	                main:562]	:	INFO	:	Starting Training
[2022-03-10 10:13:46	                main:574]	:	INFO	:	Epoch 1905 | loss: 0.0311876 | val_loss: 0.0311619 | Time: 2079.71 ms
[2022-03-10 10:13:48	                main:574]	:	INFO	:	Epoch 1906 | loss: 0.0311533 | val_loss: 0.0311551 | Time: 2240.56 ms
[2022-03-10 10:13:51	                main:574]	:	INFO	:	Epoch 1907 | loss: 0.0311498 | val_loss: 0.0311544 | Time: 2443.15 ms
[2022-03-10 10:13:57	                main:574]	:	INFO	:	Epoch 1908 | loss: 0.0311495 | val_loss: 0.0311547 | Time: 6288.94 ms
[2022-03-10 10:13:59	                main:574]	:	INFO	:	Epoch 1909 | loss: 0.0311502 | val_loss: 0.0311556 | Time: 1744.31 ms
[2022-03-10 10:14:00	                main:574]	:	INFO	:	Epoch 1910 | loss: 0.0311485 | val_loss: 0.0311551 | Time: 1585.89 ms
[2022-03-10 10:14:02	                main:574]	:	INFO	:	Epoch 1911 | loss: 0.0311482 | val_loss: 0.0311543 | Time: 1580.82 ms
[2022-03-10 10:14:04	                main:574]	:	INFO	:	Epoch 1912 | loss: 0.0311497 | val_loss: 0.0311572 | Time: 1543.3 ms
[2022-03-10 10:14:05	                main:574]	:	INFO	:	Epoch 1913 | loss: 0.0311469 | val_loss: 0.031156 | Time: 1515.57 ms
[2022-03-10 10:14:07	                main:574]	:	INFO	:	Epoch 1914 | loss: 0.0311459 | val_loss: 0.0311528 | Time: 1490.1 ms
[2022-03-10 10:14:08	                main:574]	:	INFO	:	Epoch 1915 | loss: 0.0311466 | val_loss: 0.0311551 | Time: 1443.69 ms
[2022-03-10 10:14:09	                main:574]	:	INFO	:	Epoch 1916 | loss: 0.0311475 | val_loss: 0.031158 | Time: 1436.07 ms
[2022-03-10 10:14:11	                main:574]	:	INFO	:	Epoch 1917 | loss: 0.0311465 | val_loss: 0.0311542 | Time: 1428.21 ms
[2022-03-10 10:14:12	                main:574]	:	INFO	:	Epoch 1918 | loss: 0.0311447 | val_loss: 0.0311555 | Time: 1455.97 ms
[2022-03-10 10:14:14	                main:574]	:	INFO	:	Epoch 1919 | loss: 0.031145 | val_loss: 0.0311528 | Time: 1442.77 ms
[2022-03-10 10:14:15	                main:574]	:	INFO	:	Epoch 1920 | loss: 0.031145 | val_loss: 0.0311508 | Time: 1440.76 ms
[2022-03-10 10:14:17	                main:574]	:	INFO	:	Epoch 1921 | loss: 0.0311434 | val_loss: 0.0311509 | Time: 1438.18 ms
[2022-03-10 10:14:18	                main:574]	:	INFO	:	Epoch 1922 | loss: 0.0311435 | val_loss: 0.0311503 | Time: 1426.72 ms
[2022-03-10 10:14:20	                main:574]	:	INFO	:	Epoch 1923 | loss: 0.0311428 | val_loss: 0.0311525 | Time: 1445.84 ms
[2022-03-10 10:14:21	                main:574]	:	INFO	:	Epoch 1924 | loss: 0.0311463 | val_loss: 0.0311601 | Time: 1460 ms
[2022-03-10 10:14:22	                main:574]	:	INFO	:	Epoch 1925 | loss: 0.0311462 | val_loss: 0.0311537 | Time: 1452.2 ms
[2022-03-10 10:14:24	                main:574]	:	INFO	:	Epoch 1926 | loss: 0.031146 | val_loss: 0.0311552 | Time: 1445.37 ms
[2022-03-10 10:14:25	                main:574]	:	INFO	:	Epoch 1927 | loss: 0.0311472 | val_loss: 0.0311527 | Time: 1455.27 ms
[2022-03-10 10:14:27	                main:574]	:	INFO	:	Epoch 1928 | loss: 0.0311455 | val_loss: 0.0311505 | Time: 1447.05 ms
[2022-03-10 10:14:28	                main:574]	:	INFO	:	Epoch 1929 | loss: 0.0311457 | val_loss: 0.0311518 | Time: 1467.49 ms
[2022-03-10 10:14:30	                main:574]	:	INFO	:	Epoch 1930 | loss: 0.0311465 | val_loss: 0.031153 | Time: 1445.65 ms
[2022-03-10 10:14:31	                main:574]	:	INFO	:	Epoch 1931 | loss: 0.031144 | val_loss: 0.03115 | Time: 1438.2 ms
[2022-03-10 10:14:33	                main:574]	:	INFO	:	Epoch 1932 | loss: 0.0311455 | val_loss: 0.0311535 | Time: 1468.48 ms
[2022-03-10 10:14:34	                main:574]	:	INFO	:	Epoch 1933 | loss: 0.0311439 | val_loss: 0.0311497 | Time: 1456.66 ms
[2022-03-10 10:14:36	                main:574]	:	INFO	:	Epoch 1934 | loss: 0.0311415 | val_loss: 0.0311487 | Time: 1444.77 ms
[2022-03-10 10:14:37	                main:574]	:	INFO	:	Epoch 1935 | loss: 0.0311411 | val_loss: 0.0311496 | Time: 1435.61 ms
[2022-03-10 10:14:38	                main:574]	:	INFO	:	Epoch 1936 | loss: 0.0311431 | val_loss: 0.0311519 | Time: 1461.39 ms
[2022-03-10 10:14:40	                main:574]	:	INFO	:	Epoch 1937 | loss: 0.0311442 | val_loss: 0.0311513 | Time: 1467.07 ms
[2022-03-10 10:14:41	                main:574]	:	INFO	:	Epoch 1938 | loss: 0.031143 | val_loss: 0.0311487 | Time: 1470.64 ms
[2022-03-10 10:14:43	                main:574]	:	INFO	:	Epoch 1939 | loss: 0.0311399 | val_loss: 0.0311495 | Time: 1541.02 ms
[2022-03-10 10:14:44	                main:574]	:	INFO	:	Epoch 1940 | loss: 0.0311408 | val_loss: 0.0311495 | Time: 1476.08 ms
[2022-03-10 10:14:46	                main:574]	:	INFO	:	Epoch 1941 | loss: 0.0311404 | val_loss: 0.0311459 | Time: 1419.16 ms
[2022-03-10 10:14:47	                main:574]	:	INFO	:	Epoch 1942 | loss: 0.0311394 | val_loss: 0.031147 | Time: 1432.71 ms
[2022-03-10 10:14:49	                main:574]	:	INFO	:	Epoch 1943 | loss: 0.0311376 | val_loss: 0.0311501 | Time: 1447.29 ms
[2022-03-10 10:14:50	                main:574]	:	INFO	:	Epoch 1944 | loss: 0.031138 | val_loss: 0.0311528 | Time: 1440.78 ms
[2022-03-10 10:14:52	                main:574]	:	INFO	:	Epoch 1945 | loss: 0.0311388 | val_loss: 0.0311475 | Time: 1436.07 ms
[2022-03-10 10:14:53	                main:574]	:	INFO	:	Epoch 1946 | loss: 0.0311377 | val_loss: 0.0311466 | Time: 1453.96 ms
[2022-03-10 10:14:55	                main:574]	:	INFO	:	Epoch 1947 | loss: 0.031137 | val_loss: 0.0311477 | Time: 1449.56 ms
[2022-03-10 10:14:56	                main:574]	:	INFO	:	Epoch 1948 | loss: 0.0311373 | val_loss: 0.0311477 | Time: 1447.65 ms
[2022-03-10 10:14:57	                main:574]	:	INFO	:	Epoch 1949 | loss: 0.0311378 | val_loss: 0.0311476 | Time: 1436.57 ms
[2022-03-10 10:14:59	                main:574]	:	INFO	:	Epoch 1950 | loss: 0.0311392 | val_loss: 0.0311524 | Time: 1459.28 ms
[2022-03-10 10:15:00	                main:574]	:	INFO	:	Epoch 1951 | loss: 0.0311413 | val_loss: 0.0311533 | Time: 1450.93 ms
[2022-03-10 10:15:02	                main:574]	:	INFO	:	Epoch 1952 | loss: 0.031141 | val_loss: 0.0311476 | Time: 1435.27 ms
[2022-03-10 10:15:03	                main:574]	:	INFO	:	Epoch 1953 | loss: 0.0311408 | val_loss: 0.0311471 | Time: 1428.57 ms
[2022-03-10 10:15:05	                main:574]	:	INFO	:	Epoch 1954 | loss: 0.0311389 | val_loss: 0.031148 | Time: 1466.44 ms
[2022-03-10 10:15:06	                main:574]	:	INFO	:	Epoch 1955 | loss: 0.0311379 | val_loss: 0.0311468 | Time: 1426.94 ms
[2022-03-10 10:15:08	                main:574]	:	INFO	:	Epoch 1956 | loss: 0.0311411 | val_loss: 0.0311508 | Time: 1451.03 ms
[2022-03-10 10:15:09	                main:574]	:	INFO	:	Epoch 1957 | loss: 0.0311393 | val_loss: 0.0311495 | Time: 1454.97 ms
[2022-03-10 10:15:11	                main:574]	:	INFO	:	Epoch 1958 | loss: 0.0311394 | val_loss: 0.0311538 | Time: 1463.21 ms
[2022-03-10 10:15:12	                main:574]	:	INFO	:	Epoch 1959 | loss: 0.0311411 | val_loss: 0.0311477 | Time: 1482.43 ms
[2022-03-10 10:15:14	                main:574]	:	INFO	:	Epoch 1960 | loss: 0.0311389 | val_loss: 0.031147 | Time: 1448.9 ms
[2022-03-10 10:15:15	                main:574]	:	INFO	:	Epoch 1961 | loss: 0.0311364 | val_loss: 0.0311464 | Time: 1434.27 ms
[2022-03-10 10:15:16	                main:574]	:	INFO	:	Epoch 1962 | loss: 0.0311362 | val_loss: 0.0311455 | Time: 1474.42 ms
[2022-03-10 10:15:18	                main:574]	:	INFO	:	Epoch 1963 | loss: 0.0311361 | val_loss: 0.0311464 | Time: 1439.51 ms
[2022-03-10 10:15:19	                main:574]	:	INFO	:	Epoch 1964 | loss: 0.0311354 | val_loss: 0.0311497 | Time: 1459.53 ms
[2022-03-10 10:15:21	                main:574]	:	INFO	:	Epoch 1965 | loss: 0.0311359 | val_loss: 0.0311462 | Time: 1714.61 ms
[2022-03-10 10:15:23	                main:574]	:	INFO	:	Epoch 1966 | loss: 0.0311344 | val_loss: 0.0311467 | Time: 1610.53 ms
[2022-03-10 10:15:24	                main:574]	:	INFO	:	Epoch 1967 | loss: 0.0311365 | val_loss: 0.0311487 | Time: 1571.87 ms
[2022-03-10 10:15:26	                main:574]	:	INFO	:	Epoch 1968 | loss: 0.0311404 | val_loss: 0.0311492 | Time: 1463.72 ms
[2022-03-10 10:15:27	                main:574]	:	INFO	:	Epoch 1969 | loss: 0.0311459 | val_loss: 0.0311501 | Time: 1486.77 ms
[2022-03-10 10:15:29	                main:574]	:	INFO	:	Epoch 1970 | loss: 0.0311426 | val_loss: 0.0311491 | Time: 1465.42 ms
[2022-03-10 10:15:30	                main:574]	:	INFO	:	Epoch 1971 | loss: 0.0311412 | val_loss: 0.0311485 | Time: 1446.47 ms
[2022-03-10 10:15:32	                main:574]	:	INFO	:	Epoch 1972 | loss: 0.0311398 | val_loss: 0.031151 | Time: 1612.55 ms
[2022-03-10 10:15:33	                main:574]	:	INFO	:	Epoch 1973 | loss: 0.0311405 | val_loss: 0.0311504 | Time: 1560.75 ms
[2022-03-10 10:15:35	                main:574]	:	INFO	:	Epoch 1974 | loss: 0.0311393 | val_loss: 0.0311511 | Time: 1425.26 ms
[2022-03-10 10:15:36	                main:574]	:	INFO	:	Epoch 1975 | loss: 0.0311392 | val_loss: 0.0311511 | Time: 1452.12 ms
[2022-03-10 10:15:38	                main:574]	:	INFO	:	Epoch 1976 | loss: 0.0311435 | val_loss: 0.0311553 | Time: 1491.2 ms
[2022-03-10 10:15:39	                main:574]	:	INFO	:	Epoch 1977 | loss: 0.0311392 | val_loss: 0.0311503 | Time: 1443.84 ms
[2022-03-10 10:15:41	                main:574]	:	INFO	:	Epoch 1978 | loss: 0.0311412 | val_loss: 0.0311518 | Time: 1452.76 ms
[2022-03-10 10:15:42	                main:574]	:	INFO	:	Epoch 1979 | loss: 0.0311395 | val_loss: 0.0311489 | Time: 1440.66 ms
[2022-03-10 10:15:44	                main:574]	:	INFO	:	Epoch 1980 | loss: 0.0311432 | val_loss: 0.031148 | Time: 1448.92 ms
[2022-03-10 10:15:45	                main:574]	:	INFO	:	Epoch 1981 | loss: 0.0311419 | val_loss: 0.0311466 | Time: 1462.86 ms
[2022-03-10 10:15:46	                main:574]	:	INFO	:	Epoch 1982 | loss: 0.0311392 | val_loss: 0.0311497 | Time: 1435.98 ms
[2022-03-10 10:15:48	                main:574]	:	INFO	:	Epoch 1983 | loss: 0.031138 | val_loss: 0.0311498 | Time: 1434.01 ms
[2022-03-10 10:15:49	                main:574]	:	INFO	:	Epoch 1984 | loss: 0.0311366 | val_loss: 0.0311482 | Time: 1455.02 ms
[2022-03-10 10:15:51	                main:574]	:	INFO	:	Epoch 1985 | loss: 0.0311367 | val_loss: 0.0311497 | Time: 1443.55 ms
[2022-03-10 10:15:52	                main:574]	:	INFO	:	Epoch 1986 | loss: 0.0311362 | val_loss: 0.0311466 | Time: 1446.97 ms
[2022-03-10 10:15:54	                main:574]	:	INFO	:	Epoch 1987 | loss: 0.0311353 | val_loss: 0.031154 | Time: 1434.22 ms
[2022-03-10 10:15:55	                main:574]	:	INFO	:	Epoch 1988 | loss: 0.0311378 | val_loss: 0.031149 | Time: 1421.16 ms
[2022-03-10 10:15:57	                main:574]	:	INFO	:	Epoch 1989 | loss: 0.031135 | val_loss: 0.0311469 | Time: 1439.31 ms
[2022-03-10 10:15:58	                main:574]	:	INFO	:	Epoch 1990 | loss: 0.0311342 | val_loss: 0.0311489 | Time: 1492.39 ms
[2022-03-10 10:16:00	                main:574]	:	INFO	:	Epoch 1991 | loss: 0.0311325 | val_loss: 0.0311468 | Time: 1530.28 ms
[2022-03-10 10:16:01	                main:574]	:	INFO	:	Epoch 1992 | loss: 0.031134 | val_loss: 0.0311473 | Time: 1494.46 ms
[2022-03-10 10:16:03	                main:574]	:	INFO	:	Epoch 1993 | loss: 0.031134 | val_loss: 0.0311482 | Time: 1480.65 ms
[2022-03-10 10:16:04	                main:574]	:	INFO	:	Epoch 1994 | loss: 0.0311353 | val_loss: 0.0311483 | Time: 1494.85 ms
[2022-03-10 10:16:06	                main:574]	:	INFO	:	Epoch 1995 | loss: 0.0311375 | val_loss: 0.0311504 | Time: 1477.66 ms
[2022-03-10 10:16:07	                main:574]	:	INFO	:	Epoch 1996 | loss: 0.0311367 | val_loss: 0.0311502 | Time: 1494.97 ms
[2022-03-10 10:16:09	                main:574]	:	INFO	:	Epoch 1997 | loss: 0.0311362 | val_loss: 0.0311466 | Time: 1481.09 ms
[2022-03-10 10:16:10	                main:574]	:	INFO	:	Epoch 1998 | loss: 0.0311343 | val_loss: 0.0311464 | Time: 1492.58 ms
[2022-03-10 10:16:12	                main:574]	:	INFO	:	Epoch 1999 | loss: 0.0311347 | val_loss: 0.0311488 | Time: 1510.07 ms
[2022-03-10 10:16:13	                main:574]	:	INFO	:	Epoch 2000 | loss: 0.0311368 | val_loss: 0.0311484 | Time: 1455.24 ms
[2022-03-10 10:16:15	                main:574]	:	INFO	:	Epoch 2001 | loss: 0.0311376 | val_loss: 0.0311479 | Time: 1476.14 ms
[2022-03-10 10:16:16	                main:574]	:	INFO	:	Epoch 2002 | loss: 0.031137 | val_loss: 0.0311488 | Time: 1489.74 ms
[2022-03-10 10:16:18	                main:574]	:	INFO	:	Epoch 2003 | loss: 0.0311353 | val_loss: 0.0311496 | Time: 1497.02 ms
[2022-03-10 10:16:19	                main:574]	:	INFO	:	Epoch 2004 | loss: 0.0311344 | val_loss: 0.0311454 | Time: 1489.62 ms
[2022-03-10 10:16:21	                main:574]	:	INFO	:	Epoch 2005 | loss: 0.0311347 | val_loss: 0.031149 | Time: 1504.7 ms
[2022-03-10 10:16:22	                main:574]	:	INFO	:	Epoch 2006 | loss: 0.0311339 | val_loss: 0.0311516 | Time: 1609.03 ms
[2022-03-10 10:16:24	                main:574]	:	INFO	:	Epoch 2007 | loss: 0.0311333 | val_loss: 0.0311449 | Time: 1515.89 ms
[2022-03-10 10:16:25	                main:574]	:	INFO	:	Epoch 2008 | loss: 0.0311316 | val_loss: 0.031147 | Time: 1491.55 ms
[2022-03-10 10:16:27	                main:574]	:	INFO	:	Epoch 2009 | loss: 0.0311327 | val_loss: 0.0311461 | Time: 1498.78 ms
[2022-03-10 10:16:28	                main:574]	:	INFO	:	Epoch 2010 | loss: 0.0311348 | val_loss: 0.0311442 | Time: 1498.17 ms
[2022-03-10 10:16:30	                main:574]	:	INFO	:	Epoch 2011 | loss: 0.0311322 | val_loss: 0.0311455 | Time: 1521 ms
[2022-03-10 10:16:31	                main:574]	:	INFO	:	Epoch 2012 | loss: 0.0311336 | val_loss: 0.0311454 | Time: 1486.1 ms
[2022-03-10 10:16:33	                main:574]	:	INFO	:	Epoch 2013 | loss: 0.0311324 | val_loss: 0.0311454 | Time: 1520.46 ms
[2022-03-10 10:16:34	                main:574]	:	INFO	:	Epoch 2014 | loss: 0.0311323 | val_loss: 0.0311511 | Time: 1491.87 ms
[2022-03-10 10:16:36	                main:574]	:	INFO	:	Epoch 2015 | loss: 0.0311314 | val_loss: 0.0311427 | Time: 1499.32 ms
[2022-03-10 10:16:37	                main:574]	:	INFO	:	Epoch 2016 | loss: 0.031132 | val_loss: 0.0311439 | Time: 1483.55 ms
[2022-03-10 10:16:39	                main:574]	:	INFO	:	Epoch 2017 | loss: 0.0311318 | val_loss: 0.0311471 | Time: 1500.45 ms
[2022-03-10 10:16:40	                main:574]	:	INFO	:	Epoch 2018 | loss: 0.0311307 | val_loss: 0.031148 | Time: 1493.69 ms
[2022-03-10 10:16:42	                main:574]	:	INFO	:	Epoch 2019 | loss: 0.0311357 | val_loss: 0.0311535 | Time: 1473.16 ms
[2022-03-10 10:16:43	                main:574]	:	INFO	:	Epoch 2020 | loss: 0.0311338 | val_loss: 0.0311477 | Time: 1488.3 ms
[2022-03-10 10:16:45	                main:574]	:	INFO	:	Epoch 2021 | loss: 0.0311325 | val_loss: 0.0311448 | Time: 1528.5 ms
[2022-03-10 10:16:46	                main:574]	:	INFO	:	Epoch 2022 | loss: 0.0311351 | val_loss: 0.0311481 | Time: 1486.56 ms
[2022-03-10 10:16:48	                main:574]	:	INFO	:	Epoch 2023 | loss: 0.0311378 | val_loss: 0.0311466 | Time: 1515.55 ms
[2022-03-10 10:16:49	                main:574]	:	INFO	:	Epoch 2024 | loss: 0.0311375 | val_loss: 0.031146 | Time: 1504.61 ms
[2022-03-10 10:16:51	                main:574]	:	INFO	:	Epoch 2025 | loss: 0.0311333 | val_loss: 0.0311479 | Time: 1512.78 ms
[2022-03-10 10:16:52	                main:574]	:	INFO	:	Epoch 2026 | loss: 0.0311347 | val_loss: 0.031147 | Time: 1505.09 ms
[2022-03-10 10:16:54	                main:574]	:	INFO	:	Epoch 2027 | loss: 0.0311435 | val_loss: 0.0311522 | Time: 1492.32 ms
[2022-03-10 10:16:55	                main:574]	:	INFO	:	Epoch 2028 | loss: 0.0311469 | val_loss: 0.031146 | Time: 1520.4 ms
[2022-03-10 10:16:57	                main:574]	:	INFO	:	Epoch 2029 | loss: 0.0311438 | val_loss: 0.0311465 | Time: 1510.68 ms
[2022-03-10 10:16:58	                main:574]	:	INFO	:	Epoch 2030 | loss: 0.0311456 | val_loss: 0.0311492 | Time: 1496.76 ms
[2022-03-10 10:17:00	                main:574]	:	INFO	:	Epoch 2031 | loss: 0.0311432 | val_loss: 0.0311439 | Time: 1481.04 ms
[2022-03-10 10:17:01	                main:574]	:	INFO	:	Epoch 2032 | loss: 0.0311402 | val_loss: 0.0311444 | Time: 1491.61 ms
[2022-03-10 10:17:03	                main:574]	:	INFO	:	Epoch 2033 | loss: 0.0311378 | val_loss: 0.0311433 | Time: 1527.46 ms
[2022-03-10 10:17:04	                main:574]	:	INFO	:	Epoch 2034 | loss: 0.0311347 | val_loss: 0.0311467 | Time: 1512.85 ms
[2022-03-10 10:17:06	                main:574]	:	INFO	:	Epoch 2035 | loss: 0.0311345 | val_loss: 0.0311447 | Time: 1526.41 ms
[2022-03-10 10:17:07	                main:574]	:	INFO	:	Epoch 2036 | loss: 0.0311332 | val_loss: 0.0311451 | Time: 1486.54 ms
[2022-03-10 10:17:09	                main:574]	:	INFO	:	Epoch 2037 | loss: 0.0311323 | val_loss: 0.0311457 | Time: 1514.55 ms
[2022-03-10 10:17:10	                main:574]	:	INFO	:	Epoch 2038 | loss: 0.0311314 | val_loss: 0.0311415 | Time: 1532.09 ms
[2022-03-10 10:17:12	                main:574]	:	INFO	:	Epoch 2039 | loss: 0.0311313 | val_loss: 0.0311435 | Time: 1494.53 ms
[2022-03-10 10:17:13	                main:574]	:	INFO	:	Epoch 2040 | loss: 0.0311344 | val_loss: 0.0311448 | Time: 1476.05 ms
[2022-03-10 10:17:15	                main:574]	:	INFO	:	Epoch 2041 | loss: 0.0311393 | val_loss: 0.0311554 | Time: 1498.9 ms
[2022-03-10 10:17:16	                main:574]	:	INFO	:	Epoch 2042 | loss: 0.0311669 | val_loss: 0.0311736 | Time: 1479.11 ms
[2022-03-10 10:17:18	                main:574]	:	INFO	:	Epoch 2043 | loss: 0.0311811 | val_loss: 0.0311817 | Time: 1501.88 ms
[2022-03-10 10:17:19	                main:574]	:	INFO	:	Epoch 2044 | loss: 0.0311812 | val_loss: 0.0311765 | Time: 1510.41 ms
[2022-03-10 10:17:21	                main:574]	:	INFO	:	Epoch 2045 | loss: 0.0311847 | val_loss: 0.0311843 | Time: 1490.39 ms
[2022-03-10 10:17:22	                main:574]	:	INFO	:	Epoch 2046 | loss: 0.0311891 | val_loss: 0.0311786 | Time: 1504.96 ms
[2022-03-10 10:17:24	                main:574]	:	INFO	:	Epoch 2047 | loss: 0.0311815 | val_loss: 0.0311757 | Time: 1480.77 ms
[2022-03-10 10:17:25	                main:574]	:	INFO	:	Epoch 2048 | loss: 0.0311758 | val_loss: 0.031173 | Time: 1505.04 ms
[2022-03-10 10:17:25	                main:597]	:	INFO	:	Saving trained model to model-final.pt, val_loss 0.031173
[2022-03-10 10:17:25	                main:603]	:	INFO	:	Saving end state to config to file
[2022-03-10 10:17:25	                main:608]	:	INFO	:	Success, exiting..
10:17:25 (11572): called boinc_finish(0)

</stderr_txt>
]]>
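
The per-epoch lines above (emitted at `main:574`) follow a fixed `Epoch N | loss: … | val_loss: … | Time: … ms` format, so loss curves can be recovered from a saved stderr dump. A minimal sketch, assuming Python and that the log is available as plain text (the regex and function name here are illustrative, not part of the application):

```python
import re

# Matches the per-epoch lines emitted at main:574, e.g.
# "[...] Epoch 2048 | loss: 0.0311758 | val_loss: 0.031173 | Time: 1505.04 ms"
EPOCH_RE = re.compile(
    r"Epoch (\d+) \| loss: ([\d.]+) \| val_loss: ([\d.]+) \| Time: ([\d.]+) ms"
)

def parse_epochs(lines):
    """Yield (epoch, loss, val_loss, time_ms) tuples from stderr log lines."""
    for line in lines:
        m = EPOCH_RE.search(line)
        if m:
            yield (int(m.group(1)), float(m.group(2)),
                   float(m.group(3)), float(m.group(4)))
```

Note that epoch numbers restart after each checkpoint resume (the run above replays from epoch 1905 twice), so duplicate epoch indices should be expected when plotting.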
©2022 MLC@Home Team
A project of the Cognition, Robotics, and Learning (CORAL) Lab at the University of Maryland, Baltimore County (UMBC)