Task 12841537

Name ParityModified-1645994049-10255-3-0_0
Workunit 9984684
Created 9 Mar 2022, 20:59:41 UTC
Sent 11 Mar 2022, 11:28:45 UTC
Report deadline 19 Mar 2022, 11:28:45 UTC
Received 16 Mar 2022, 19:03:43 UTC
Server state Over
Outcome Success
Client state Done
Exit status 0 (0x00000000)
Computer ID 6180
Run time 3 hours 37 min 15 sec
CPU time 3 hours 22 min 27 sec
Validate state Valid
Credit 4,160.00
Device peak FLOPS 884.73 GFLOPS
Application version Machine Learning Dataset Generator (GPU) v9.75 (cuda10200)
windows_x86_64
Peak working set size 1.54 GB
Peak swap size 3.44 GB
Peak disk usage 1.54 GB

Stderr output

<core_client_version>7.16.20</core_client_version>
<![CDATA[
<stderr_txt>
0.031129 | val_loss: 0.0311657 | Time: 6373.85 ms
[2022-03-16 14:22:37	                main:574]	:	INFO	:	Epoch 1731 | loss: 0.0311285 | val_loss: 0.0311658 | Time: 6342.52 ms
[2022-03-16 14:22:43	                main:574]	:	INFO	:	Epoch 1732 | loss: 0.0311284 | val_loss: 0.0311647 | Time: 6331.07 ms
[2022-03-16 14:22:50	                main:574]	:	INFO	:	Epoch 1733 | loss: 0.0311278 | val_loss: 0.0311644 | Time: 6325.38 ms
[2022-03-16 14:22:56	                main:574]	:	INFO	:	Epoch 1734 | loss: 0.031126 | val_loss: 0.0311646 | Time: 6313.25 ms
[2022-03-16 14:23:02	                main:574]	:	INFO	:	Epoch 1735 | loss: 0.0311263 | val_loss: 0.031165 | Time: 6291.2 ms
[2022-03-16 14:23:09	                main:574]	:	INFO	:	Epoch 1736 | loss: 0.0311264 | val_loss: 0.0311658 | Time: 6282.35 ms
[2022-03-16 14:23:15	                main:574]	:	INFO	:	Epoch 1737 | loss: 0.0311272 | val_loss: 0.0311659 | Time: 6298.64 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce 940MX)
[2022-03-16 14:24:17	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-16 14:24:17	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-16 14:24:17	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-16 14:24:17	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-16 14:24:18	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-16 14:24:18	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-16 14:24:18	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-16 14:24:18	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
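The "Resolved: X => X (exists = …)" lines above come from BOINC filename resolution: under the BOINC client, a logical filename may actually be a small XML soft-link stub pointing at the physical file elsewhere in the project directory. A minimal Python sketch of that lookup follows; the helper name and the soft-link handling are assumptions based on BOINC's documented `<soft_link>` convention, not this application's actual code:

```python
import os
import re

def resolve_filename(logical_name: str) -> str:
    """Mimic BOINC-style filename resolution: if the logical file is a
    soft-link XML stub (<soft_link>physical/path</soft_link>), return the
    physical path it names; otherwise return the name unchanged."""
    try:
        # errors="replace" so a real binary file (e.g. a .pt) doesn't raise
        with open(logical_name, "r", errors="replace") as f:
            head = f.read(1024)
    except OSError:
        return logical_name  # file absent: the caller logs exists = 0
    m = re.search(r"<soft_link>\s*(.*?)\s*</soft_link>", head)
    return m.group(1) if m else logical_name

# Hypothetical stub redirecting "dataset.hdf5" to a project-relative path.
with open("dataset.hdf5", "w") as f:
    f.write("<soft_link>../../projects/example/dataset_1234.hdf5</soft_link>")
print(resolve_filename("dataset.hdf5"))  # → ../../projects/example/dataset_1234.hdf5
```

In this log the resolved names equal the logical names, which is what the helper returns when no soft-link stub is present.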
[2022-03-16 14:24:18	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-16 14:24:18	                main:474]	:	INFO	:	Configuration: 
[2022-03-16 14:24:18	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-16 14:24:18	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-16 14:24:18	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-16 14:24:18	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-16 14:24:18	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-16 14:24:18	                main:480]	:	INFO	:	    Patience: 10
[2022-03-16 14:24:18	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-16 14:24:18	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-16 14:24:18	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-16 14:24:18	                main:484]	:	INFO	:	    # Threads: 1
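For reference, the hyperparameters logged in the Configuration block above can be collected in one place. This is a hypothetical Python rendering of the run's settings; the key names and dict layout are illustrative, since the actual model.cfg format is not shown in the log:

```python
# Values exactly as reported in the log above; key names are illustrative,
# not the application's real config schema.
config = {
    "model_type": "GRU",
    "val_loss_threshold": 1e-4,   # training stops if val_loss drops below this
    "max_epochs": 2048,
    "batch_size": 128,
    "learning_rate": 0.01,
    "patience": 10,               # exact semantics (early stop vs. LR schedule) not shown in the log
    "hidden_width": 12,
    "num_recurrent_layers": 4,
    "num_backend_layers": 4,
    "num_threads": 1,
}

# With the 2048 training examples loaded below and batch size 128,
# each epoch is 16 batches.
batches_per_epoch = 2048 // config["batch_size"]
print(batches_per_epoch)  # → 16
```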
[2022-03-16 14:24:18	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-16 14:24:18	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-16 14:24:18	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-16 14:24:21	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-16 14:24:21	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-16 14:24:21	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-16 14:24:21	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-16 14:24:21	                main:494]	:	INFO	:	Creating Model
[2022-03-16 14:24:21	                main:507]	:	INFO	:	Preparing config file
[2022-03-16 14:24:21	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-16 14:24:21	                main:512]	:	INFO	:	Loading config
[2022-03-16 14:24:21	                main:514]	:	INFO	:	Loading state
[2022-03-16 14:24:22	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-16 14:24:22	                main:562]	:	INFO	:	Starting Training
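The epoch lines that follow are produced by a train/validate loop that snapshots its state each epoch so the BOINC client can suspend and resume the task; the later restarts in this log pick up from snapshot.pt in exactly this way. Below is a skeletal sketch of such a loop in plain Python. The step functions, the JSON snapshot format, and the loss curve are stand-ins for illustration only, not the application's actual libTorch implementation:

```python
import json
import os
import time

def train_one_epoch(epoch):            # stand-in for the real libTorch training pass
    return 0.0312 * (1.0 - 1e-5 * epoch)

def validate(epoch):                   # stand-in for the real validation pass
    return 0.0312 * (1.0 - 9e-6 * epoch)

SNAPSHOT = "snapshot.json"             # the real app writes snapshot.pt
MAX_EPOCHS = 2048
VAL_LOSS_THRESHOLD = 1e-4

# Resume from the last snapshot if one exists, matching the log's
# "Found checkpoint, attempting to load..." behaviour.
start = 0
if os.path.exists(SNAPSHOT):
    with open(SNAPSHOT) as f:
        start = json.load(f)["epoch"] + 1

for epoch in range(start, MAX_EPOCHS):
    t0 = time.monotonic()
    loss = train_one_epoch(epoch)
    val_loss = validate(epoch)
    ms = (time.monotonic() - t0) * 1000.0
    print(f"Epoch {epoch} | loss: {loss:.7g} | val_loss: {val_loss:.7g} | Time: {ms:.2f} ms")
    with open(SNAPSHOT, "w") as f:     # snapshot after every epoch so a restart loses little work
        json.dump({"epoch": epoch, "val_loss": val_loss}, f)
    if val_loss < VAL_LOSS_THRESHOLD:  # early exit once the threshold is met
        break
```

With the stand-in losses above the threshold is never reached, so the loop runs to the epoch cap, as this task does (the run ends near the Max Epochs value of 2048 rather than at the validation-loss threshold).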
[2022-03-16 14:24:29	                main:574]	:	INFO	:	Epoch 1738 | loss: 0.03115 | val_loss: 0.0311653 | Time: 6332.46 ms
[2022-03-16 14:24:35	                main:574]	:	INFO	:	Epoch 1739 | loss: 0.0311281 | val_loss: 0.0311705 | Time: 6122.79 ms
[2022-03-16 14:24:41	                main:574]	:	INFO	:	Epoch 1740 | loss: 0.0311256 | val_loss: 0.0311673 | Time: 6275.9 ms
[2022-03-16 14:24:48	                main:574]	:	INFO	:	Epoch 1741 | loss: 0.0311242 | val_loss: 0.0311702 | Time: 6332.2 ms
[2022-03-16 14:24:54	                main:574]	:	INFO	:	Epoch 1742 | loss: 0.031127 | val_loss: 0.0311651 | Time: 6302.82 ms
[2022-03-16 14:25:00	                main:574]	:	INFO	:	Epoch 1743 | loss: 0.031125 | val_loss: 0.0311623 | Time: 6306.61 ms
[2022-03-16 14:25:06	                main:574]	:	INFO	:	Epoch 1744 | loss: 0.0311248 | val_loss: 0.0311645 | Time: 6288.9 ms
[2022-03-16 14:25:13	                main:574]	:	INFO	:	Epoch 1745 | loss: 0.0311263 | val_loss: 0.031166 | Time: 6288.88 ms
[2022-03-16 14:25:19	                main:574]	:	INFO	:	Epoch 1746 | loss: 0.0311281 | val_loss: 0.0311683 | Time: 6293.48 ms
[2022-03-16 14:25:25	                main:574]	:	INFO	:	Epoch 1747 | loss: 0.0311278 | val_loss: 0.0311665 | Time: 6315.05 ms
[2022-03-16 14:25:32	                main:574]	:	INFO	:	Epoch 1748 | loss: 0.0311292 | val_loss: 0.0311668 | Time: 6324.66 ms
[2022-03-16 14:25:38	                main:574]	:	INFO	:	Epoch 1749 | loss: 0.0311284 | val_loss: 0.0311677 | Time: 6340.79 ms
[2022-03-16 14:25:44	                main:574]	:	INFO	:	Epoch 1750 | loss: 0.0311279 | val_loss: 0.0311677 | Time: 6295.73 ms
[2022-03-16 14:25:51	                main:574]	:	INFO	:	Epoch 1751 | loss: 0.0311319 | val_loss: 0.0311695 | Time: 6280.81 ms
[2022-03-16 14:25:57	                main:574]	:	INFO	:	Epoch 1752 | loss: 0.031132 | val_loss: 0.0311688 | Time: 6293.24 ms
[2022-03-16 14:26:03	                main:574]	:	INFO	:	Epoch 1753 | loss: 0.0311304 | val_loss: 0.0311666 | Time: 6304.96 ms
[2022-03-16 14:26:10	                main:574]	:	INFO	:	Epoch 1754 | loss: 0.0311279 | val_loss: 0.0311709 | Time: 6335.85 ms
[2022-03-16 14:26:16	                main:574]	:	INFO	:	Epoch 1755 | loss: 0.0311291 | val_loss: 0.0311737 | Time: 6295.09 ms
[2022-03-16 14:26:22	                main:574]	:	INFO	:	Epoch 1756 | loss: 0.0311325 | val_loss: 0.031173 | Time: 6290.37 ms
[2022-03-16 14:26:29	                main:574]	:	INFO	:	Epoch 1757 | loss: 0.031128 | val_loss: 0.031164 | Time: 6321.84 ms
[2022-03-16 14:26:35	                main:574]	:	INFO	:	Epoch 1758 | loss: 0.031126 | val_loss: 0.0311632 | Time: 6281.2 ms
[2022-03-16 14:26:41	                main:574]	:	INFO	:	Epoch 1759 | loss: 0.0311262 | val_loss: 0.0311675 | Time: 6313.99 ms
[2022-03-16 14:26:47	                main:574]	:	INFO	:	Epoch 1760 | loss: 0.0311264 | val_loss: 0.031165 | Time: 6334.95 ms
[2022-03-16 14:26:54	                main:574]	:	INFO	:	Epoch 1761 | loss: 0.0311271 | val_loss: 0.0311649 | Time: 6321.8 ms
[2022-03-16 14:27:00	                main:574]	:	INFO	:	Epoch 1762 | loss: 0.0311305 | val_loss: 0.031166 | Time: 6303.06 ms
[2022-03-16 14:27:06	                main:574]	:	INFO	:	Epoch 1763 | loss: 0.0311311 | val_loss: 0.0311652 | Time: 6294.96 ms
[2022-03-16 14:27:13	                main:574]	:	INFO	:	Epoch 1764 | loss: 0.0311313 | val_loss: 0.0311643 | Time: 6305.4 ms
[2022-03-16 14:27:19	                main:574]	:	INFO	:	Epoch 1765 | loss: 0.0311307 | val_loss: 0.031161 | Time: 6332.07 ms
[2022-03-16 14:27:25	                main:574]	:	INFO	:	Epoch 1766 | loss: 0.0311289 | val_loss: 0.0311669 | Time: 6339.41 ms
[2022-03-16 14:27:32	                main:574]	:	INFO	:	Epoch 1767 | loss: 0.0311296 | val_loss: 0.0311609 | Time: 6287.57 ms
[2022-03-16 14:27:38	                main:574]	:	INFO	:	Epoch 1768 | loss: 0.0311313 | val_loss: 0.0311643 | Time: 6348.73 ms
[2022-03-16 14:27:44	                main:574]	:	INFO	:	Epoch 1769 | loss: 0.0311276 | val_loss: 0.0311695 | Time: 6347.27 ms
[2022-03-16 14:27:51	                main:574]	:	INFO	:	Epoch 1770 | loss: 0.0311275 | val_loss: 0.0311649 | Time: 6295.53 ms
[2022-03-16 14:27:57	                main:574]	:	INFO	:	Epoch 1771 | loss: 0.0311271 | val_loss: 0.0311722 | Time: 6341.05 ms
[2022-03-16 14:28:03	                main:574]	:	INFO	:	Epoch 1772 | loss: 0.031127 | val_loss: 0.0311628 | Time: 6337.92 ms
[2022-03-16 14:28:10	                main:574]	:	INFO	:	Epoch 1773 | loss: 0.0311253 | val_loss: 0.0311672 | Time: 6316.73 ms
[2022-03-16 14:28:16	                main:574]	:	INFO	:	Epoch 1774 | loss: 0.0311275 | val_loss: 0.0311612 | Time: 6334.39 ms
[2022-03-16 14:28:22	                main:574]	:	INFO	:	Epoch 1775 | loss: 0.0311291 | val_loss: 0.0311658 | Time: 6319.27 ms
[2022-03-16 14:28:29	                main:574]	:	INFO	:	Epoch 1776 | loss: 0.0311333 | val_loss: 0.0311658 | Time: 6282.9 ms
[2022-03-16 14:28:35	                main:574]	:	INFO	:	Epoch 1777 | loss: 0.0311368 | val_loss: 0.0311686 | Time: 6340.59 ms
[2022-03-16 14:28:41	                main:574]	:	INFO	:	Epoch 1778 | loss: 0.0311354 | val_loss: 0.0311681 | Time: 6314.07 ms
[2022-03-16 14:28:48	                main:574]	:	INFO	:	Epoch 1779 | loss: 0.0311337 | val_loss: 0.0311673 | Time: 6326.52 ms
[2022-03-16 14:28:54	                main:574]	:	INFO	:	Epoch 1780 | loss: 0.0311332 | val_loss: 0.0311645 | Time: 6309.09 ms
[2022-03-16 14:29:00	                main:574]	:	INFO	:	Epoch 1781 | loss: 0.0311309 | val_loss: 0.0311705 | Time: 6292.22 ms
[2022-03-16 14:29:07	                main:574]	:	INFO	:	Epoch 1782 | loss: 0.0311309 | val_loss: 0.03117 | Time: 6289.81 ms
[2022-03-16 14:29:13	                main:574]	:	INFO	:	Epoch 1783 | loss: 0.0311319 | val_loss: 0.0311721 | Time: 6317.54 ms
[2022-03-16 14:29:19	                main:574]	:	INFO	:	Epoch 1784 | loss: 0.0311305 | val_loss: 0.0311712 | Time: 6303.07 ms
[2022-03-16 14:29:25	                main:574]	:	INFO	:	Epoch 1785 | loss: 0.0311302 | val_loss: 0.0311635 | Time: 6274.45 ms
[2022-03-16 14:29:32	                main:574]	:	INFO	:	Epoch 1786 | loss: 0.0311289 | val_loss: 0.0311664 | Time: 6304.15 ms
[2022-03-16 14:29:38	                main:574]	:	INFO	:	Epoch 1787 | loss: 0.0311295 | val_loss: 0.0311641 | Time: 6302.46 ms
[2022-03-16 14:29:44	                main:574]	:	INFO	:	Epoch 1788 | loss: 0.0311302 | val_loss: 0.0311626 | Time: 6311.56 ms
[2022-03-16 14:29:51	                main:574]	:	INFO	:	Epoch 1789 | loss: 0.0311305 | val_loss: 0.031172 | Time: 6289.97 ms
[2022-03-16 14:29:57	                main:574]	:	INFO	:	Epoch 1790 | loss: 0.0311288 | val_loss: 0.031167 | Time: 6288.07 ms
[2022-03-16 14:30:03	                main:574]	:	INFO	:	Epoch 1791 | loss: 0.0311307 | val_loss: 0.0311615 | Time: 6285.07 ms
[2022-03-16 14:30:10	                main:574]	:	INFO	:	Epoch 1792 | loss: 0.0311301 | val_loss: 0.0311703 | Time: 6381.1 ms
[2022-03-16 14:30:16	                main:574]	:	INFO	:	Epoch 1793 | loss: 0.0311319 | val_loss: 0.0311669 | Time: 6348.53 ms
[2022-03-16 14:30:22	                main:574]	:	INFO	:	Epoch 1794 | loss: 0.0311337 | val_loss: 0.0311696 | Time: 6301.52 ms
[2022-03-16 14:30:29	                main:574]	:	INFO	:	Epoch 1795 | loss: 0.0311291 | val_loss: 0.0311654 | Time: 6312.58 ms
[2022-03-16 14:30:35	                main:574]	:	INFO	:	Epoch 1796 | loss: 0.0311325 | val_loss: 0.0311749 | Time: 6345.56 ms
[2022-03-16 14:30:41	                main:574]	:	INFO	:	Epoch 1797 | loss: 0.0311305 | val_loss: 0.0311705 | Time: 6312.49 ms
[2022-03-16 14:30:48	                main:574]	:	INFO	:	Epoch 1798 | loss: 0.031128 | val_loss: 0.0311714 | Time: 6464.49 ms
[2022-03-16 14:30:54	                main:574]	:	INFO	:	Epoch 1799 | loss: 0.0311315 | val_loss: 0.0311726 | Time: 6404.98 ms
[2022-03-16 14:31:01	                main:574]	:	INFO	:	Epoch 1800 | loss: 0.0311399 | val_loss: 0.0311586 | Time: 6344.3 ms
[2022-03-16 14:31:07	                main:574]	:	INFO	:	Epoch 1801 | loss: 0.0311485 | val_loss: 0.031165 | Time: 6303.19 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce 940MX)
[2022-03-16 14:32:43	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-16 14:32:43	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-16 14:32:43	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-16 14:32:43	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-16 14:32:43	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-16 14:32:43	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-16 14:32:43	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-16 14:32:43	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-16 14:32:43	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-16 14:32:43	                main:474]	:	INFO	:	Configuration: 
[2022-03-16 14:32:43	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-16 14:32:43	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-16 14:32:43	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-16 14:32:43	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-16 14:32:43	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-16 14:32:43	                main:480]	:	INFO	:	    Patience: 10
[2022-03-16 14:32:43	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-16 14:32:43	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-16 14:32:43	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-16 14:32:43	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-16 14:32:43	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-16 14:32:43	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-16 14:32:44	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-16 14:32:47	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-16 14:32:47	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-16 14:32:47	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-16 14:32:47	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-16 14:32:47	                main:494]	:	INFO	:	Creating Model
[2022-03-16 14:32:47	                main:507]	:	INFO	:	Preparing config file
[2022-03-16 14:32:47	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-16 14:32:47	                main:512]	:	INFO	:	Loading config
[2022-03-16 14:32:47	                main:514]	:	INFO	:	Loading state
[2022-03-16 14:32:48	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-16 14:32:48	                main:562]	:	INFO	:	Starting Training
[2022-03-16 14:32:55	                main:574]	:	INFO	:	Epoch 1797 | loss: 0.0311539 | val_loss: 0.031167 | Time: 6282.47 ms
[2022-03-16 14:33:01	                main:574]	:	INFO	:	Epoch 1798 | loss: 0.031134 | val_loss: 0.0311707 | Time: 6204.35 ms
[2022-03-16 14:33:07	                main:574]	:	INFO	:	Epoch 1799 | loss: 0.0311317 | val_loss: 0.0311679 | Time: 6232.21 ms
[2022-03-16 14:33:13	                main:574]	:	INFO	:	Epoch 1800 | loss: 0.0311304 | val_loss: 0.031173 | Time: 6301.71 ms
[2022-03-16 14:33:20	                main:574]	:	INFO	:	Epoch 1801 | loss: 0.0311319 | val_loss: 0.0311714 | Time: 6358.59 ms
[2022-03-16 14:33:26	                main:574]	:	INFO	:	Epoch 1802 | loss: 0.0311312 | val_loss: 0.0311686 | Time: 6329.3 ms
[2022-03-16 14:33:32	                main:574]	:	INFO	:	Epoch 1803 | loss: 0.0311322 | val_loss: 0.0311727 | Time: 6335.46 ms
[2022-03-16 14:33:39	                main:574]	:	INFO	:	Epoch 1804 | loss: 0.0311314 | val_loss: 0.0311746 | Time: 6288.21 ms
[2022-03-16 14:33:45	                main:574]	:	INFO	:	Epoch 1805 | loss: 0.0311333 | val_loss: 0.0311702 | Time: 6288.09 ms
[2022-03-16 14:33:51	                main:574]	:	INFO	:	Epoch 1806 | loss: 0.0311325 | val_loss: 0.0311693 | Time: 6356.98 ms
[2022-03-16 14:33:58	                main:574]	:	INFO	:	Epoch 1807 | loss: 0.0311325 | val_loss: 0.0311653 | Time: 6336.42 ms
[2022-03-16 14:34:04	                main:574]	:	INFO	:	Epoch 1808 | loss: 0.0311314 | val_loss: 0.0311675 | Time: 6378.69 ms
[2022-03-16 14:34:10	                main:574]	:	INFO	:	Epoch 1809 | loss: 0.0311333 | val_loss: 0.0311662 | Time: 6314.83 ms
[2022-03-16 14:34:17	                main:574]	:	INFO	:	Epoch 1810 | loss: 0.0311332 | val_loss: 0.0311629 | Time: 6288.26 ms
[2022-03-16 14:34:23	                main:574]	:	INFO	:	Epoch 1811 | loss: 0.031137 | val_loss: 0.0311659 | Time: 6318.83 ms
[2022-03-16 14:34:29	                main:574]	:	INFO	:	Epoch 1812 | loss: 0.031137 | val_loss: 0.0311684 | Time: 6324.98 ms
[2022-03-16 14:34:36	                main:574]	:	INFO	:	Epoch 1813 | loss: 0.0311361 | val_loss: 0.0311693 | Time: 6354.9 ms
[2022-03-16 14:34:42	                main:574]	:	INFO	:	Epoch 1814 | loss: 0.0311342 | val_loss: 0.0311693 | Time: 6347.46 ms
[2022-03-16 14:34:48	                main:574]	:	INFO	:	Epoch 1815 | loss: 0.0311326 | val_loss: 0.031166 | Time: 6369.65 ms
[2022-03-16 14:34:55	                main:574]	:	INFO	:	Epoch 1816 | loss: 0.0311323 | val_loss: 0.0311666 | Time: 6326.69 ms
[2022-03-16 14:35:01	                main:574]	:	INFO	:	Epoch 1817 | loss: 0.0311323 | val_loss: 0.0311752 | Time: 6315.94 ms
[2022-03-16 14:35:07	                main:574]	:	INFO	:	Epoch 1818 | loss: 0.0311352 | val_loss: 0.0311707 | Time: 6301.15 ms
[2022-03-16 14:35:14	                main:574]	:	INFO	:	Epoch 1819 | loss: 0.0311342 | val_loss: 0.0311652 | Time: 6303.12 ms
[2022-03-16 14:35:20	                main:574]	:	INFO	:	Epoch 1820 | loss: 0.0311326 | val_loss: 0.0311633 | Time: 6325.78 ms
[2022-03-16 14:35:26	                main:574]	:	INFO	:	Epoch 1821 | loss: 0.0311323 | val_loss: 0.0311661 | Time: 6300.7 ms
[2022-03-16 14:35:33	                main:574]	:	INFO	:	Epoch 1822 | loss: 0.0311318 | val_loss: 0.0311674 | Time: 6321.74 ms
[2022-03-16 14:35:39	                main:574]	:	INFO	:	Epoch 1823 | loss: 0.0311319 | val_loss: 0.0311675 | Time: 6289.58 ms
[2022-03-16 14:35:45	                main:574]	:	INFO	:	Epoch 1824 | loss: 0.0311324 | val_loss: 0.0311698 | Time: 6290.83 ms
[2022-03-16 14:35:52	                main:574]	:	INFO	:	Epoch 1825 | loss: 0.0311325 | val_loss: 0.0311675 | Time: 6331.5 ms
[2022-03-16 14:35:58	                main:574]	:	INFO	:	Epoch 1826 | loss: 0.0311339 | val_loss: 0.0311616 | Time: 6310.62 ms
[2022-03-16 14:36:04	                main:574]	:	INFO	:	Epoch 1827 | loss: 0.0311358 | val_loss: 0.0311623 | Time: 6342.91 ms
[2022-03-16 14:36:11	                main:574]	:	INFO	:	Epoch 1828 | loss: 0.031135 | val_loss: 0.0311624 | Time: 6284.3 ms
[2022-03-16 14:36:17	                main:574]	:	INFO	:	Epoch 1829 | loss: 0.0311353 | val_loss: 0.0311607 | Time: 6285.03 ms
[2022-03-16 14:36:23	                main:574]	:	INFO	:	Epoch 1830 | loss: 0.0311359 | val_loss: 0.0311637 | Time: 6282.88 ms
[2022-03-16 14:36:29	                main:574]	:	INFO	:	Epoch 1831 | loss: 0.0311344 | val_loss: 0.0311622 | Time: 6310.77 ms
[2022-03-16 14:36:36	                main:574]	:	INFO	:	Epoch 1832 | loss: 0.0311356 | val_loss: 0.0311637 | Time: 6316.75 ms
[2022-03-16 14:36:42	                main:574]	:	INFO	:	Epoch 1833 | loss: 0.0311369 | val_loss: 0.0311647 | Time: 6320.95 ms
[2022-03-16 14:36:48	                main:574]	:	INFO	:	Epoch 1834 | loss: 0.0311384 | val_loss: 0.031165 | Time: 6320.52 ms
[2022-03-16 14:36:55	                main:574]	:	INFO	:	Epoch 1835 | loss: 0.0311408 | val_loss: 0.0311671 | Time: 6303.74 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce 940MX)
[2022-03-16 14:37:56	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-16 14:37:56	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-16 14:37:56	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-16 14:37:56	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-16 14:37:56	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-16 14:37:56	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-16 14:37:56	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-16 14:37:56	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-16 14:37:56	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-16 14:37:56	                main:474]	:	INFO	:	Configuration: 
[2022-03-16 14:37:57	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-16 14:37:57	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-16 14:37:57	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-16 14:37:57	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-16 14:37:57	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-16 14:37:57	                main:480]	:	INFO	:	    Patience: 10
[2022-03-16 14:37:57	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-16 14:37:57	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-16 14:37:57	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-16 14:37:57	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-16 14:37:57	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-16 14:37:57	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-16 14:37:57	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-16 14:38:00	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-16 14:38:00	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-16 14:38:00	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-16 14:38:00	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-16 14:38:00	                main:494]	:	INFO	:	Creating Model
[2022-03-16 14:38:00	                main:507]	:	INFO	:	Preparing config file
[2022-03-16 14:38:00	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-16 14:38:00	                main:512]	:	INFO	:	Loading config
[2022-03-16 14:38:00	                main:514]	:	INFO	:	Loading state
[2022-03-16 14:38:01	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-16 14:38:01	                main:562]	:	INFO	:	Starting Training
[2022-03-16 14:38:08	                main:574]	:	INFO	:	Epoch 1836 | loss: 0.0311609 | val_loss: 0.0311742 | Time: 6452.82 ms
[2022-03-16 14:38:14	                main:574]	:	INFO	:	Epoch 1837 | loss: 0.0311417 | val_loss: 0.031168 | Time: 6153.46 ms
[2022-03-16 14:38:20	                main:574]	:	INFO	:	Epoch 1838 | loss: 0.0311411 | val_loss: 0.031171 | Time: 6269.82 ms
[2022-03-16 14:38:26	                main:574]	:	INFO	:	Epoch 1839 | loss: 0.03114 | val_loss: 0.0311714 | Time: 6288.2 ms
[2022-03-16 14:38:33	                main:574]	:	INFO	:	Epoch 1840 | loss: 0.0311379 | val_loss: 0.031167 | Time: 6299.41 ms
[2022-03-16 14:38:39	                main:574]	:	INFO	:	Epoch 1841 | loss: 0.0311356 | val_loss: 0.0311655 | Time: 6336.24 ms
[2022-03-16 14:38:45	                main:574]	:	INFO	:	Epoch 1842 | loss: 0.0311358 | val_loss: 0.0311653 | Time: 6322.5 ms
[2022-03-16 14:38:52	                main:574]	:	INFO	:	Epoch 1843 | loss: 0.0311377 | val_loss: 0.0311801 | Time: 6315.51 ms
[2022-03-16 14:38:58	                main:574]	:	INFO	:	Epoch 1844 | loss: 0.0311403 | val_loss: 0.0311752 | Time: 6373.81 ms
[2022-03-16 14:39:04	                main:574]	:	INFO	:	Epoch 1845 | loss: 0.0311368 | val_loss: 0.0311698 | Time: 6314.14 ms
[2022-03-16 14:39:11	                main:574]	:	INFO	:	Epoch 1846 | loss: 0.0311366 | val_loss: 0.0311694 | Time: 6312.75 ms
[2022-03-16 14:39:17	                main:574]	:	INFO	:	Epoch 1847 | loss: 0.031137 | val_loss: 0.0311706 | Time: 6334.38 ms
[2022-03-16 14:39:23	                main:574]	:	INFO	:	Epoch 1848 | loss: 0.0311361 | val_loss: 0.031173 | Time: 6324.53 ms
[2022-03-16 14:39:30	                main:574]	:	INFO	:	Epoch 1849 | loss: 0.0311368 | val_loss: 0.0311779 | Time: 7020.19 ms
[2022-03-16 14:39:37	                main:574]	:	INFO	:	Epoch 1850 | loss: 0.0311356 | val_loss: 0.031172 | Time: 6362.22 ms
[2022-03-16 14:39:43	                main:574]	:	INFO	:	Epoch 1851 | loss: 0.0311347 | val_loss: 0.0311717 | Time: 6329.58 ms
[2022-03-16 14:39:49	                main:574]	:	INFO	:	Epoch 1852 | loss: 0.0311353 | val_loss: 0.0311653 | Time: 6310.2 ms
[2022-03-16 14:39:56	                main:574]	:	INFO	:	Epoch 1853 | loss: 0.0311339 | val_loss: 0.0311679 | Time: 6313.94 ms
[2022-03-16 14:40:02	                main:574]	:	INFO	:	Epoch 1854 | loss: 0.0311338 | val_loss: 0.0311655 | Time: 6355.24 ms
[2022-03-16 14:40:09	                main:574]	:	INFO	:	Epoch 1855 | loss: 0.0311338 | val_loss: 0.0311677 | Time: 6340 ms
[2022-03-16 14:40:15	                main:574]	:	INFO	:	Epoch 1856 | loss: 0.0311335 | val_loss: 0.0311695 | Time: 6317.53 ms
[2022-03-16 14:40:21	                main:574]	:	INFO	:	Epoch 1857 | loss: 0.0311333 | val_loss: 0.0311685 | Time: 6284.13 ms
[2022-03-16 14:40:27	                main:574]	:	INFO	:	Epoch 1858 | loss: 0.0311329 | val_loss: 0.0311687 | Time: 6314.46 ms
[2022-03-16 14:40:34	                main:574]	:	INFO	:	Epoch 1859 | loss: 0.0311315 | val_loss: 0.0311698 | Time: 6927.66 ms
[2022-03-16 14:40:41	                main:574]	:	INFO	:	Epoch 1860 | loss: 0.0311313 | val_loss: 0.0311709 | Time: 6644.25 ms
[2022-03-16 14:40:48	                main:574]	:	INFO	:	Epoch 1861 | loss: 0.0311308 | val_loss: 0.0311676 | Time: 6730.73 ms
[2022-03-16 14:40:54	                main:574]	:	INFO	:	Epoch 1862 | loss: 0.0311312 | val_loss: 0.0311687 | Time: 6710.26 ms
[2022-03-16 14:41:01	                main:574]	:	INFO	:	Epoch 1863 | loss: 0.0311338 | val_loss: 0.0311707 | Time: 6414.46 ms
[2022-03-16 14:41:07	                main:574]	:	INFO	:	Epoch 1864 | loss: 0.0311329 | val_loss: 0.031166 | Time: 6365.62 ms
[2022-03-16 14:41:14	                main:574]	:	INFO	:	Epoch 1865 | loss: 0.031131 | val_loss: 0.0311683 | Time: 6295.58 ms
[2022-03-16 14:41:20	                main:574]	:	INFO	:	Epoch 1866 | loss: 0.0311298 | val_loss: 0.0311665 | Time: 6394.51 ms
[2022-03-16 14:41:26	                main:574]	:	INFO	:	Epoch 1867 | loss: 0.0311289 | val_loss: 0.0311654 | Time: 6373.7 ms
[2022-03-16 14:41:33	                main:574]	:	INFO	:	Epoch 1868 | loss: 0.0311288 | val_loss: 0.0311688 | Time: 6359.02 ms
[2022-03-16 14:41:39	                main:574]	:	INFO	:	Epoch 1869 | loss: 0.0311328 | val_loss: 0.0311639 | Time: 6314.83 ms
[2022-03-16 14:41:45	                main:574]	:	INFO	:	Epoch 1870 | loss: 0.031136 | val_loss: 0.0311659 | Time: 6337.49 ms
[2022-03-16 14:41:52	                main:574]	:	INFO	:	Epoch 1871 | loss: 0.031133 | val_loss: 0.0311655 | Time: 6303.25 ms
[2022-03-16 14:41:58	                main:574]	:	INFO	:	Epoch 1872 | loss: 0.0311324 | val_loss: 0.0311653 | Time: 6396.89 ms
[2022-03-16 14:42:04	                main:574]	:	INFO	:	Epoch 1873 | loss: 0.0311317 | val_loss: 0.0311611 | Time: 6353.26 ms
[2022-03-16 14:42:11	                main:574]	:	INFO	:	Epoch 1874 | loss: 0.0311305 | val_loss: 0.0311668 | Time: 6358.89 ms
[2022-03-16 14:42:17	                main:574]	:	INFO	:	Epoch 1875 | loss: 0.0311319 | val_loss: 0.0311669 | Time: 6345.05 ms
[2022-03-16 14:42:23	                main:574]	:	INFO	:	Epoch 1876 | loss: 0.0311309 | val_loss: 0.0311664 | Time: 6288.59 ms
[2022-03-16 14:42:30	                main:574]	:	INFO	:	Epoch 1877 | loss: 0.0311304 | val_loss: 0.0311627 | Time: 6304.71 ms
[2022-03-16 14:42:36	                main:574]	:	INFO	:	Epoch 1878 | loss: 0.0311303 | val_loss: 0.0311717 | Time: 6388.58 ms
[2022-03-16 14:42:43	                main:574]	:	INFO	:	Epoch 1879 | loss: 0.0311305 | val_loss: 0.0311684 | Time: 6378.85 ms
[2022-03-16 14:42:49	                main:574]	:	INFO	:	Epoch 1880 | loss: 0.0311301 | val_loss: 0.0311634 | Time: 6295.75 ms
[2022-03-16 14:42:55	                main:574]	:	INFO	:	Epoch 1881 | loss: 0.0311311 | val_loss: 0.0311653 | Time: 6427.48 ms
[2022-03-16 14:43:02	                main:574]	:	INFO	:	Epoch 1882 | loss: 0.0311313 | val_loss: 0.0311684 | Time: 6297.11 ms
[2022-03-16 14:43:08	                main:574]	:	INFO	:	Epoch 1883 | loss: 0.0311318 | val_loss: 0.0311626 | Time: 6378.8 ms
[2022-03-16 14:43:14	                main:574]	:	INFO	:	Epoch 1884 | loss: 0.0311305 | val_loss: 0.0311645 | Time: 6352.56 ms
[2022-03-16 14:43:21	                main:574]	:	INFO	:	Epoch 1885 | loss: 0.0311331 | val_loss: 0.0311649 | Time: 6333.33 ms
[2022-03-16 14:43:27	                main:574]	:	INFO	:	Epoch 1886 | loss: 0.0311339 | val_loss: 0.0311646 | Time: 6300.25 ms
[2022-03-16 14:43:33	                main:574]	:	INFO	:	Epoch 1887 | loss: 0.0311334 | val_loss: 0.0311646 | Time: 6306.43 ms
[2022-03-16 14:43:40	                main:574]	:	INFO	:	Epoch 1888 | loss: 0.0311321 | val_loss: 0.0311665 | Time: 6294.8 ms
[2022-03-16 14:43:46	                main:574]	:	INFO	:	Epoch 1889 | loss: 0.031131 | val_loss: 0.0311655 | Time: 6301.28 ms
[2022-03-16 14:43:52	                main:574]	:	INFO	:	Epoch 1890 | loss: 0.031131 | val_loss: 0.0311661 | Time: 6375.14 ms
[2022-03-16 14:43:59	                main:574]	:	INFO	:	Epoch 1891 | loss: 0.0311305 | val_loss: 0.0311719 | Time: 6349.04 ms
[2022-03-16 14:44:05	                main:574]	:	INFO	:	Epoch 1892 | loss: 0.0311351 | val_loss: 0.0311631 | Time: 6294.47 ms
[2022-03-16 14:44:11	                main:574]	:	INFO	:	Epoch 1893 | loss: 0.0311376 | val_loss: 0.0311634 | Time: 6307.15 ms
[2022-03-16 14:44:17	                main:574]	:	INFO	:	Epoch 1894 | loss: 0.0311379 | val_loss: 0.0311646 | Time: 6286.94 ms
[2022-03-16 14:44:24	                main:574]	:	INFO	:	Epoch 1895 | loss: 0.0311366 | val_loss: 0.0311643 | Time: 6306.52 ms
[2022-03-16 14:44:30	                main:574]	:	INFO	:	Epoch 1896 | loss: 0.0311367 | val_loss: 0.0311669 | Time: 6465.6 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce 940MX)
[2022-03-16 14:45:36	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-16 14:45:36	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-16 14:45:36	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-16 14:45:36	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-16 14:45:36	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-16 14:45:36	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-16 14:45:36	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-16 14:45:36	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-16 14:45:36	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-16 14:45:36	                main:474]	:	INFO	:	Configuration: 
[2022-03-16 14:45:36	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-16 14:45:36	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-16 14:45:36	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-16 14:45:36	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-16 14:45:36	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-16 14:45:36	                main:480]	:	INFO	:	    Patience: 10
[2022-03-16 14:45:36	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-16 14:45:36	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-16 14:45:36	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-16 14:45:36	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-16 14:45:36	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-16 14:45:36	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-16 14:45:37	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-16 14:45:39	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-16 14:45:39	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-16 14:45:39	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-16 14:45:39	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-16 14:45:39	                main:494]	:	INFO	:	Creating Model
[2022-03-16 14:45:39	                main:507]	:	INFO	:	Preparing config file
[2022-03-16 14:45:39	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-16 14:45:39	                main:512]	:	INFO	:	Loading config
[2022-03-16 14:45:39	                main:514]	:	INFO	:	Loading state
[2022-03-16 14:45:40	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-16 14:45:40	                main:562]	:	INFO	:	Starting Training
[2022-03-16 14:45:47	                main:574]	:	INFO	:	Epoch 1895 | loss: 0.0311642 | val_loss: 0.0311727 | Time: 6392.39 ms
[2022-03-16 14:45:53	                main:574]	:	INFO	:	Epoch 1896 | loss: 0.0311411 | val_loss: 0.0311656 | Time: 6199.76 ms
[2022-03-16 14:45:59	                main:574]	:	INFO	:	Epoch 1897 | loss: 0.0311371 | val_loss: 0.0311638 | Time: 6235.47 ms
[2022-03-16 14:46:05	                main:574]	:	INFO	:	Epoch 1898 | loss: 0.0311334 | val_loss: 0.0311626 | Time: 6244.96 ms
[2022-03-16 14:46:11	                main:574]	:	INFO	:	Epoch 1899 | loss: 0.0311335 | val_loss: 0.0311663 | Time: 6263.61 ms
[2022-03-16 14:46:18	                main:574]	:	INFO	:	Epoch 1900 | loss: 0.0311329 | val_loss: 0.031169 | Time: 6270.3 ms
[2022-03-16 14:46:24	                main:574]	:	INFO	:	Epoch 1901 | loss: 0.0311348 | val_loss: 0.0311595 | Time: 6303.03 ms
[2022-03-16 14:46:30	                main:574]	:	INFO	:	Epoch 1902 | loss: 0.0311355 | val_loss: 0.031168 | Time: 6281.15 ms
[2022-03-16 14:46:37	                main:574]	:	INFO	:	Epoch 1903 | loss: 0.0311358 | val_loss: 0.0311617 | Time: 6302.1 ms
[2022-03-16 14:46:43	                main:574]	:	INFO	:	Epoch 1904 | loss: 0.0311335 | val_loss: 0.0311616 | Time: 6265.49 ms
[2022-03-16 14:46:49	                main:574]	:	INFO	:	Epoch 1905 | loss: 0.031133 | val_loss: 0.0311612 | Time: 6281.38 ms
[2022-03-16 14:46:55	                main:574]	:	INFO	:	Epoch 1906 | loss: 0.0311335 | val_loss: 0.0311636 | Time: 6270.3 ms
[2022-03-16 14:47:02	                main:574]	:	INFO	:	Epoch 1907 | loss: 0.0311316 | val_loss: 0.0311654 | Time: 6276.44 ms
[2022-03-16 14:47:08	                main:574]	:	INFO	:	Epoch 1908 | loss: 0.0311318 | val_loss: 0.0311651 | Time: 6271.21 ms
[2022-03-16 14:47:14	                main:574]	:	INFO	:	Epoch 1909 | loss: 0.0311315 | val_loss: 0.0311706 | Time: 6295.37 ms
[2022-03-16 14:47:21	                main:574]	:	INFO	:	Epoch 1910 | loss: 0.0311335 | val_loss: 0.0311685 | Time: 6290.48 ms
[2022-03-16 14:47:27	                main:574]	:	INFO	:	Epoch 1911 | loss: 0.0311387 | val_loss: 0.0311694 | Time: 6289.9 ms
[2022-03-16 14:47:33	                main:574]	:	INFO	:	Epoch 1912 | loss: 0.0311391 | val_loss: 0.0311708 | Time: 6275.45 ms
[2022-03-16 14:47:39	                main:574]	:	INFO	:	Epoch 1913 | loss: 0.0311391 | val_loss: 0.0311699 | Time: 6279.69 ms
[2022-03-16 14:47:46	                main:574]	:	INFO	:	Epoch 1914 | loss: 0.0311375 | val_loss: 0.0311717 | Time: 6273.82 ms
[2022-03-16 14:47:52	                main:574]	:	INFO	:	Epoch 1915 | loss: 0.0311356 | val_loss: 0.0311747 | Time: 6283.46 ms
[2022-03-16 14:47:58	                main:574]	:	INFO	:	Epoch 1916 | loss: 0.0311356 | val_loss: 0.0311732 | Time: 6301.46 ms
[2022-03-16 14:48:05	                main:574]	:	INFO	:	Epoch 1917 | loss: 0.0311344 | val_loss: 0.0311707 | Time: 6268.28 ms
[2022-03-16 14:48:11	                main:574]	:	INFO	:	Epoch 1918 | loss: 0.0311358 | val_loss: 0.0311721 | Time: 6266.62 ms
[2022-03-16 14:48:17	                main:574]	:	INFO	:	Epoch 1919 | loss: 0.031136 | val_loss: 0.0311629 | Time: 6266.91 ms
[2022-03-16 14:48:23	                main:574]	:	INFO	:	Epoch 1920 | loss: 0.0311331 | val_loss: 0.0311661 | Time: 6271.91 ms
[2022-03-16 14:48:30	                main:574]	:	INFO	:	Epoch 1921 | loss: 0.0311313 | val_loss: 0.0311638 | Time: 6270.89 ms
[2022-03-16 14:48:37	                main:574]	:	INFO	:	Epoch 1922 | loss: 0.0311322 | val_loss: 0.0311642 | Time: 6796.78 ms
[2022-03-16 14:48:43	                main:574]	:	INFO	:	Epoch 1923 | loss: 0.0311353 | val_loss: 0.0311669 | Time: 6733.61 ms
[2022-03-16 14:49:10	                main:574]	:	INFO	:	Epoch 1924 | loss: 0.0311385 | val_loss: 0.0311731 | Time: 26890.4 ms
[2022-03-16 14:49:17	                main:574]	:	INFO	:	Epoch 1925 | loss: 0.0311397 | val_loss: 0.0311726 | Time: 6334.77 ms
[2022-03-16 14:49:23	                main:574]	:	INFO	:	Epoch 1926 | loss: 0.0311365 | val_loss: 0.0311704 | Time: 6362.62 ms
[2022-03-16 14:49:29	                main:574]	:	INFO	:	Epoch 1927 | loss: 0.0311362 | val_loss: 0.0311678 | Time: 6351.41 ms
[2022-03-16 14:49:36	                main:574]	:	INFO	:	Epoch 1928 | loss: 0.0311369 | val_loss: 0.0311669 | Time: 6321.73 ms
[2022-03-16 14:49:42	                main:574]	:	INFO	:	Epoch 1929 | loss: 0.0311374 | val_loss: 0.0311691 | Time: 6302.68 ms
[2022-03-16 14:49:48	                main:574]	:	INFO	:	Epoch 1930 | loss: 0.0311339 | val_loss: 0.0311621 | Time: 6312.67 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce 940MX)
[2022-03-16 14:50:50	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-16 14:50:50	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-16 14:50:50	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-16 14:50:50	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-16 14:50:50	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-16 14:50:50	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-16 14:50:50	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-16 14:50:50	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-16 14:50:50	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-16 14:50:50	                main:474]	:	INFO	:	Configuration: 
[2022-03-16 14:50:50	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-16 14:50:50	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-16 14:50:50	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-16 14:50:50	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-16 14:50:50	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-16 14:50:50	                main:480]	:	INFO	:	    Patience: 10
[2022-03-16 14:50:50	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-16 14:50:50	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-16 14:50:50	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-16 14:50:50	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-16 14:50:50	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-16 14:50:50	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-16 14:50:51	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-16 14:50:53	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-16 14:50:53	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-16 14:50:53	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-16 14:50:53	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-16 14:50:53	                main:494]	:	INFO	:	Creating Model
[2022-03-16 14:50:53	                main:507]	:	INFO	:	Preparing config file
[2022-03-16 14:50:53	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-16 14:50:53	                main:512]	:	INFO	:	Loading config
[2022-03-16 14:50:53	                main:514]	:	INFO	:	Loading state
[2022-03-16 14:50:54	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-16 14:50:54	                main:562]	:	INFO	:	Starting Training
[2022-03-16 14:51:01	                main:574]	:	INFO	:	Epoch 1931 | loss: 0.0311679 | val_loss: 0.0311664 | Time: 6418.98 ms
[2022-03-16 14:51:07	                main:574]	:	INFO	:	Epoch 1932 | loss: 0.0311362 | val_loss: 0.0311655 | Time: 6234.69 ms
[2022-03-16 14:51:13	                main:574]	:	INFO	:	Epoch 1933 | loss: 0.0311329 | val_loss: 0.0311676 | Time: 6228.25 ms
[2022-03-16 14:51:19	                main:574]	:	INFO	:	Epoch 1934 | loss: 0.0311336 | val_loss: 0.0311689 | Time: 6399.2 ms
[2022-03-16 14:51:26	                main:574]	:	INFO	:	Epoch 1935 | loss: 0.0311343 | val_loss: 0.0311692 | Time: 6313.22 ms
[2022-03-16 14:51:32	                main:574]	:	INFO	:	Epoch 1936 | loss: 0.0311332 | val_loss: 0.0311684 | Time: 6286.41 ms
[2022-03-16 14:51:38	                main:574]	:	INFO	:	Epoch 1937 | loss: 0.0311341 | val_loss: 0.0311725 | Time: 6313.03 ms
[2022-03-16 14:51:45	                main:574]	:	INFO	:	Epoch 1938 | loss: 0.0311341 | val_loss: 0.0311646 | Time: 6318.16 ms
[2022-03-16 14:51:51	                main:574]	:	INFO	:	Epoch 1939 | loss: 0.0311334 | val_loss: 0.0311635 | Time: 6308.75 ms
[2022-03-16 14:51:57	                main:574]	:	INFO	:	Epoch 1940 | loss: 0.031134 | val_loss: 0.0311699 | Time: 6271.15 ms
[2022-03-16 14:52:04	                main:574]	:	INFO	:	Epoch 1941 | loss: 0.0311334 | val_loss: 0.0311663 | Time: 6335.45 ms
[2022-03-16 14:52:10	                main:574]	:	INFO	:	Epoch 1942 | loss: 0.0311342 | val_loss: 0.0311657 | Time: 6288.91 ms
[2022-03-16 14:52:16	                main:574]	:	INFO	:	Epoch 1943 | loss: 0.0311338 | val_loss: 0.0311661 | Time: 6280.39 ms
[2022-03-16 14:52:22	                main:574]	:	INFO	:	Epoch 1944 | loss: 0.0311384 | val_loss: 0.0311662 | Time: 6300.55 ms
[2022-03-16 14:52:29	                main:574]	:	INFO	:	Epoch 1945 | loss: 0.0311381 | val_loss: 0.0311649 | Time: 6275.17 ms
[2022-03-16 14:52:35	                main:574]	:	INFO	:	Epoch 1946 | loss: 0.0311365 | val_loss: 0.0311635 | Time: 6279.62 ms
[2022-03-16 14:52:41	                main:574]	:	INFO	:	Epoch 1947 | loss: 0.0311378 | val_loss: 0.031165 | Time: 6309.03 ms
[2022-03-16 14:52:48	                main:574]	:	INFO	:	Epoch 1948 | loss: 0.0311381 | val_loss: 0.0311619 | Time: 6293.88 ms
[2022-03-16 14:52:54	                main:574]	:	INFO	:	Epoch 1949 | loss: 0.0311348 | val_loss: 0.0311618 | Time: 6269.82 ms
[2022-03-16 14:53:00	                main:574]	:	INFO	:	Epoch 1950 | loss: 0.0311327 | val_loss: 0.031164 | Time: 6289.5 ms
[2022-03-16 14:53:07	                main:574]	:	INFO	:	Epoch 1951 | loss: 0.0311337 | val_loss: 0.0311656 | Time: 6285.16 ms
[2022-03-16 14:53:13	                main:574]	:	INFO	:	Epoch 1952 | loss: 0.0311346 | val_loss: 0.0311602 | Time: 6292.47 ms
[2022-03-16 14:53:19	                main:574]	:	INFO	:	Epoch 1953 | loss: 0.0311308 | val_loss: 0.0311661 | Time: 6272.78 ms
[2022-03-16 14:53:25	                main:574]	:	INFO	:	Epoch 1954 | loss: 0.0311306 | val_loss: 0.0311641 | Time: 6274.43 ms
[2022-03-16 14:53:32	                main:574]	:	INFO	:	Epoch 1955 | loss: 0.0311308 | val_loss: 0.0311628 | Time: 6268.23 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce 940MX)
[2022-03-16 19:51:03	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-16 19:51:03	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-16 19:51:03	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-16 19:51:03	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-16 19:51:03	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-16 19:51:03	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-16 19:51:03	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-16 19:51:03	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-16 19:51:03	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-16 19:51:03	                main:474]	:	INFO	:	Configuration: 
[2022-03-16 19:51:03	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-16 19:51:03	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-16 19:51:03	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-16 19:51:03	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-16 19:51:03	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-16 19:51:03	                main:480]	:	INFO	:	    Patience: 10
[2022-03-16 19:51:03	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-16 19:51:03	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-16 19:51:03	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-16 19:51:03	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-16 19:51:03	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-16 19:51:03	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-16 19:51:03	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-16 19:51:08	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-16 19:51:08	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-16 19:51:08	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-16 19:51:08	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-16 19:51:08	                main:494]	:	INFO	:	Creating Model
[2022-03-16 19:51:08	                main:507]	:	INFO	:	Preparing config file
[2022-03-16 19:51:08	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-16 19:51:08	                main:512]	:	INFO	:	Loading config
[2022-03-16 19:51:08	                main:514]	:	INFO	:	Loading state
[2022-03-16 19:51:10	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-16 19:51:10	                main:562]	:	INFO	:	Starting Training
[2022-03-16 19:51:17	                main:574]	:	INFO	:	Epoch 1950 | loss: 0.0311629 | val_loss: 0.0311697 | Time: 7072.85 ms
[2022-03-16 19:51:24	                main:574]	:	INFO	:	Epoch 1951 | loss: 0.031137 | val_loss: 0.0311624 | Time: 6232.37 ms
[2022-03-16 19:51:30	                main:574]	:	INFO	:	Epoch 1952 | loss: 0.0311325 | val_loss: 0.0311649 | Time: 6269.01 ms
[2022-03-16 19:51:36	                main:574]	:	INFO	:	Epoch 1953 | loss: 0.0311321 | val_loss: 0.0311656 | Time: 6278.4 ms
[2022-03-16 19:51:42	                main:574]	:	INFO	:	Epoch 1954 | loss: 0.0311316 | val_loss: 0.0311621 | Time: 6308.84 ms
[2022-03-16 19:51:49	                main:574]	:	INFO	:	Epoch 1955 | loss: 0.0311308 | val_loss: 0.0311665 | Time: 6275.88 ms
[2022-03-16 19:51:55	                main:574]	:	INFO	:	Epoch 1956 | loss: 0.0311342 | val_loss: 0.0311631 | Time: 6266.78 ms
[2022-03-16 19:52:01	                main:574]	:	INFO	:	Epoch 1957 | loss: 0.0311314 | val_loss: 0.0311628 | Time: 6282.2 ms
[2022-03-16 19:52:08	                main:574]	:	INFO	:	Epoch 1958 | loss: 0.0311292 | val_loss: 0.0311642 | Time: 6255.32 ms
[2022-03-16 19:52:14	                main:574]	:	INFO	:	Epoch 1959 | loss: 0.0311311 | val_loss: 0.0311653 | Time: 6281.73 ms
[2022-03-16 19:52:20	                main:574]	:	INFO	:	Epoch 1960 | loss: 0.0311338 | val_loss: 0.0311629 | Time: 6278.36 ms
[2022-03-16 19:52:26	                main:574]	:	INFO	:	Epoch 1961 | loss: 0.0311301 | val_loss: 0.0311658 | Time: 6291.4 ms
[2022-03-16 19:52:33	                main:574]	:	INFO	:	Epoch 1962 | loss: 0.0311283 | val_loss: 0.0311633 | Time: 6266.02 ms
[2022-03-16 19:52:39	                main:574]	:	INFO	:	Epoch 1963 | loss: 0.0311285 | val_loss: 0.0311698 | Time: 6277.32 ms
[2022-03-16 19:52:45	                main:574]	:	INFO	:	Epoch 1964 | loss: 0.0311312 | val_loss: 0.0311676 | Time: 6265.4 ms
[2022-03-16 19:52:52	                main:574]	:	INFO	:	Epoch 1965 | loss: 0.0311291 | val_loss: 0.031171 | Time: 6262.42 ms
[2022-03-16 19:52:58	                main:574]	:	INFO	:	Epoch 1966 | loss: 0.0311293 | val_loss: 0.0311696 | Time: 6284.22 ms
[2022-03-16 19:53:04	                main:574]	:	INFO	:	Epoch 1967 | loss: 0.0311297 | val_loss: 0.0311655 | Time: 6270.33 ms
[2022-03-16 19:53:10	                main:574]	:	INFO	:	Epoch 1968 | loss: 0.0311285 | val_loss: 0.0311705 | Time: 6291.58 ms
[2022-03-16 19:53:17	                main:574]	:	INFO	:	Epoch 1969 | loss: 0.0311276 | val_loss: 0.0311636 | Time: 6291.97 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce 940MX)
[2022-03-16 19:55:00	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-16 19:55:00	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-16 19:55:00	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-16 19:55:00	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-16 19:55:00	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-16 19:55:00	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-16 19:55:00	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-16 19:55:00	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-16 19:55:00	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-16 19:55:00	                main:474]	:	INFO	:	Configuration: 
[2022-03-16 19:55:00	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-16 19:55:00	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-16 19:55:00	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-16 19:55:00	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-16 19:55:00	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-16 19:55:00	                main:480]	:	INFO	:	    Patience: 10
[2022-03-16 19:55:00	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-16 19:55:00	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-16 19:55:00	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-16 19:55:00	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-16 19:55:00	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-16 19:55:00	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-16 19:55:01	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-16 19:55:03	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-16 19:55:03	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-16 19:55:03	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-16 19:55:03	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-16 19:55:03	                main:494]	:	INFO	:	Creating Model
[2022-03-16 19:55:03	                main:507]	:	INFO	:	Preparing config file
[2022-03-16 19:55:03	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-16 19:55:03	                main:512]	:	INFO	:	Loading config
[2022-03-16 19:55:03	                main:514]	:	INFO	:	Loading state
[2022-03-16 19:55:04	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-16 19:55:04	                main:562]	:	INFO	:	Starting Training
[2022-03-16 19:55:11	                main:574]	:	INFO	:	Epoch 1969 | loss: 0.0311551 | val_loss: 0.0311691 | Time: 6477.08 ms
[2022-03-16 19:55:17	                main:574]	:	INFO	:	Epoch 1970 | loss: 0.0311349 | val_loss: 0.0311631 | Time: 6177.68 ms
[2022-03-16 19:55:23	                main:574]	:	INFO	:	Epoch 1971 | loss: 0.0311298 | val_loss: 0.0311642 | Time: 6219.86 ms
[2022-03-16 19:55:29	                main:574]	:	INFO	:	Epoch 1972 | loss: 0.0311287 | val_loss: 0.0311627 | Time: 6247.35 ms
[2022-03-16 19:55:35	                main:574]	:	INFO	:	Epoch 1973 | loss: 0.0311286 | val_loss: 0.031165 | Time: 6256.36 ms
[2022-03-16 19:55:42	                main:574]	:	INFO	:	Epoch 1974 | loss: 0.031129 | val_loss: 0.0311672 | Time: 6287.1 ms
[2022-03-16 19:55:48	                main:574]	:	INFO	:	Epoch 1975 | loss: 0.0311286 | val_loss: 0.0311669 | Time: 6274.15 ms
[2022-03-16 19:55:54	                main:574]	:	INFO	:	Epoch 1976 | loss: 0.0311296 | val_loss: 0.031162 | Time: 6292.57 ms
[2022-03-16 19:56:01	                main:574]	:	INFO	:	Epoch 1977 | loss: 0.0311282 | val_loss: 0.0311722 | Time: 6335.61 ms
[2022-03-16 19:56:07	                main:574]	:	INFO	:	Epoch 1978 | loss: 0.0311297 | val_loss: 0.0311651 | Time: 6271.78 ms
[2022-03-16 19:56:13	                main:574]	:	INFO	:	Epoch 1979 | loss: 0.0311283 | val_loss: 0.0311651 | Time: 6269.13 ms
[2022-03-16 19:56:20	                main:574]	:	INFO	:	Epoch 1980 | loss: 0.0311299 | val_loss: 0.0311662 | Time: 6290.35 ms
[2022-03-16 19:56:26	                main:574]	:	INFO	:	Epoch 1981 | loss: 0.0311286 | val_loss: 0.0311681 | Time: 6305.63 ms
[2022-03-16 19:56:32	                main:574]	:	INFO	:	Epoch 1982 | loss: 0.0311298 | val_loss: 0.0311714 | Time: 6302.26 ms
[2022-03-16 19:56:38	                main:574]	:	INFO	:	Epoch 1983 | loss: 0.0311278 | val_loss: 0.031169 | Time: 6277.99 ms
[2022-03-16 19:56:45	                main:574]	:	INFO	:	Epoch 1984 | loss: 0.0311275 | val_loss: 0.0311664 | Time: 6288.91 ms
[2022-03-16 19:56:51	                main:574]	:	INFO	:	Epoch 1985 | loss: 0.0311272 | val_loss: 0.0311686 | Time: 6277.1 ms
[2022-03-16 19:56:57	                main:574]	:	INFO	:	Epoch 1986 | loss: 0.031126 | val_loss: 0.0311697 | Time: 6268.04 ms
[2022-03-16 19:57:04	                main:574]	:	INFO	:	Epoch 1987 | loss: 0.0311273 | val_loss: 0.0311682 | Time: 6276.46 ms
[2022-03-16 19:57:10	                main:574]	:	INFO	:	Epoch 1988 | loss: 0.0311275 | val_loss: 0.031169 | Time: 6302.5 ms
[2022-03-16 19:57:16	                main:574]	:	INFO	:	Epoch 1989 | loss: 0.0311266 | val_loss: 0.0311688 | Time: 6277.07 ms
[2022-03-16 19:57:22	                main:574]	:	INFO	:	Epoch 1990 | loss: 0.0311254 | val_loss: 0.0311722 | Time: 6295.74 ms
[2022-03-16 19:57:29	                main:574]	:	INFO	:	Epoch 1991 | loss: 0.0311275 | val_loss: 0.0311694 | Time: 6290.77 ms
[2022-03-16 19:57:35	                main:574]	:	INFO	:	Epoch 1992 | loss: 0.0311265 | val_loss: 0.0311696 | Time: 6290.64 ms
[2022-03-16 19:57:41	                main:574]	:	INFO	:	Epoch 1993 | loss: 0.031127 | val_loss: 0.0311635 | Time: 6250.87 ms
[2022-03-16 19:57:48	                main:574]	:	INFO	:	Epoch 1994 | loss: 0.0311264 | val_loss: 0.0311678 | Time: 6250.72 ms
[2022-03-16 19:57:54	                main:574]	:	INFO	:	Epoch 1995 | loss: 0.0311256 | val_loss: 0.0311664 | Time: 6272.05 ms
[2022-03-16 19:58:00	                main:574]	:	INFO	:	Epoch 1996 | loss: 0.0311281 | val_loss: 0.031168 | Time: 6323.86 ms
[2022-03-16 19:58:06	                main:574]	:	INFO	:	Epoch 1997 | loss: 0.0311285 | val_loss: 0.0311705 | Time: 6298.61 ms
[2022-03-16 19:58:13	                main:574]	:	INFO	:	Epoch 1998 | loss: 0.03113 | val_loss: 0.0311711 | Time: 6269.2 ms
[2022-03-16 19:58:19	                main:574]	:	INFO	:	Epoch 1999 | loss: 0.0311305 | val_loss: 0.0311673 | Time: 6285.89 ms
[2022-03-16 19:58:25	                main:574]	:	INFO	:	Epoch 2000 | loss: 0.0311281 | val_loss: 0.0311673 | Time: 6250.51 ms
[2022-03-16 19:58:32	                main:574]	:	INFO	:	Epoch 2001 | loss: 0.0311268 | val_loss: 0.0311638 | Time: 6254.78 ms
[2022-03-16 19:58:38	                main:574]	:	INFO	:	Epoch 2002 | loss: 0.0311271 | val_loss: 0.0311696 | Time: 6276.37 ms
[2022-03-16 19:58:44	                main:574]	:	INFO	:	Epoch 2003 | loss: 0.0311279 | val_loss: 0.031163 | Time: 6270.19 ms
[2022-03-16 19:58:50	                main:574]	:	INFO	:	Epoch 2004 | loss: 0.0311293 | val_loss: 0.0311717 | Time: 6286.7 ms
[2022-03-16 19:58:57	                main:574]	:	INFO	:	Epoch 2005 | loss: 0.0311322 | val_loss: 0.0311765 | Time: 6299.25 ms
[2022-03-16 19:59:03	                main:574]	:	INFO	:	Epoch 2006 | loss: 0.0311309 | val_loss: 0.0311665 | Time: 6395.35 ms
[2022-03-16 19:59:09	                main:574]	:	INFO	:	Epoch 2007 | loss: 0.0311326 | val_loss: 0.0311627 | Time: 6311.14 ms
[2022-03-16 19:59:16	                main:574]	:	INFO	:	Epoch 2008 | loss: 0.0311317 | val_loss: 0.0311698 | Time: 6278.01 ms
[2022-03-16 19:59:22	                main:574]	:	INFO	:	Epoch 2009 | loss: 0.0311317 | val_loss: 0.0311688 | Time: 6308.63 ms
[2022-03-16 19:59:28	                main:574]	:	INFO	:	Epoch 2010 | loss: 0.0311319 | val_loss: 0.031171 | Time: 6373.22 ms
[2022-03-16 19:59:35	                main:574]	:	INFO	:	Epoch 2011 | loss: 0.0311311 | val_loss: 0.0311648 | Time: 6348.52 ms
[2022-03-16 19:59:41	                main:574]	:	INFO	:	Epoch 2012 | loss: 0.0311305 | val_loss: 0.0311682 | Time: 6314.15 ms
[2022-03-16 19:59:47	                main:574]	:	INFO	:	Epoch 2013 | loss: 0.0311328 | val_loss: 0.0311716 | Time: 6271.9 ms
[2022-03-16 19:59:54	                main:574]	:	INFO	:	Epoch 2014 | loss: 0.0311296 | val_loss: 0.0311657 | Time: 6289.46 ms
[2022-03-16 20:00:00	                main:574]	:	INFO	:	Epoch 2015 | loss: 0.031128 | val_loss: 0.0311736 | Time: 6299.11 ms
[2022-03-16 20:00:06	                main:574]	:	INFO	:	Epoch 2016 | loss: 0.0311309 | val_loss: 0.0311715 | Time: 6341.17 ms
[2022-03-16 20:00:13	                main:574]	:	INFO	:	Epoch 2017 | loss: 0.0311288 | val_loss: 0.0311683 | Time: 6286.04 ms
[2022-03-16 20:00:19	                main:574]	:	INFO	:	Epoch 2018 | loss: 0.0311285 | val_loss: 0.0311683 | Time: 6340.58 ms
[2022-03-16 20:00:25	                main:574]	:	INFO	:	Epoch 2019 | loss: 0.0311275 | val_loss: 0.0311731 | Time: 6303.45 ms
[2022-03-16 20:00:31	                main:574]	:	INFO	:	Epoch 2020 | loss: 0.0311281 | val_loss: 0.0311712 | Time: 6263.14 ms
[2022-03-16 20:00:38	                main:574]	:	INFO	:	Epoch 2021 | loss: 0.0311267 | val_loss: 0.0311691 | Time: 6250.96 ms
[2022-03-16 20:00:44	                main:574]	:	INFO	:	Epoch 2022 | loss: 0.0311255 | val_loss: 0.0311755 | Time: 6265.88 ms
[2022-03-16 20:00:50	                main:574]	:	INFO	:	Epoch 2023 | loss: 0.0311282 | val_loss: 0.0311672 | Time: 6267.12 ms
[2022-03-16 20:00:57	                main:574]	:	INFO	:	Epoch 2024 | loss: 0.0311275 | val_loss: 0.0311661 | Time: 6291.13 ms
[2022-03-16 20:01:03	                main:574]	:	INFO	:	Epoch 2025 | loss: 0.0311308 | val_loss: 0.0311706 | Time: 6294.43 ms
[2022-03-16 20:01:09	                main:574]	:	INFO	:	Epoch 2026 | loss: 0.03113 | val_loss: 0.0311687 | Time: 6258.99 ms
[2022-03-16 20:01:15	                main:574]	:	INFO	:	Epoch 2027 | loss: 0.0311309 | val_loss: 0.031166 | Time: 6249.53 ms
[2022-03-16 20:01:22	                main:574]	:	INFO	:	Epoch 2028 | loss: 0.031133 | val_loss: 0.0311749 | Time: 6304.18 ms
[2022-03-16 20:01:28	                main:574]	:	INFO	:	Epoch 2029 | loss: 0.0311308 | val_loss: 0.0311704 | Time: 6313.62 ms
[2022-03-16 20:01:34	                main:574]	:	INFO	:	Epoch 2030 | loss: 0.031131 | val_loss: 0.0311704 | Time: 6298.8 ms
[2022-03-16 20:01:41	                main:574]	:	INFO	:	Epoch 2031 | loss: 0.0311304 | val_loss: 0.0311685 | Time: 6284.12 ms
[2022-03-16 20:01:47	                main:574]	:	INFO	:	Epoch 2032 | loss: 0.0311321 | val_loss: 0.0311687 | Time: 6277.06 ms
[2022-03-16 20:01:53	                main:574]	:	INFO	:	Epoch 2033 | loss: 0.0311322 | val_loss: 0.0311708 | Time: 6290.52 ms
[2022-03-16 20:01:59	                main:574]	:	INFO	:	Epoch 2034 | loss: 0.03113 | val_loss: 0.0311698 | Time: 6256.91 ms
[2022-03-16 20:02:06	                main:574]	:	INFO	:	Epoch 2035 | loss: 0.0311315 | val_loss: 0.0311693 | Time: 6307.59 ms
[2022-03-16 20:02:12	                main:574]	:	INFO	:	Epoch 2036 | loss: 0.031129 | val_loss: 0.0311693 | Time: 6289.85 ms
[2022-03-16 20:02:18	                main:574]	:	INFO	:	Epoch 2037 | loss: 0.0311281 | val_loss: 0.0311739 | Time: 6290.37 ms
[2022-03-16 20:02:25	                main:574]	:	INFO	:	Epoch 2038 | loss: 0.031127 | val_loss: 0.0311679 | Time: 6267.92 ms
[2022-03-16 20:02:31	                main:574]	:	INFO	:	Epoch 2039 | loss: 0.031128 | val_loss: 0.0311687 | Time: 6274.27 ms
[2022-03-16 20:02:37	                main:574]	:	INFO	:	Epoch 2040 | loss: 0.0311265 | val_loss: 0.031173 | Time: 6261.45 ms
[2022-03-16 20:02:43	                main:574]	:	INFO	:	Epoch 2041 | loss: 0.0311259 | val_loss: 0.0311681 | Time: 6240.6 ms
[2022-03-16 20:02:50	                main:574]	:	INFO	:	Epoch 2042 | loss: 0.0311252 | val_loss: 0.0311681 | Time: 6258.97 ms
[2022-03-16 20:02:56	                main:574]	:	INFO	:	Epoch 2043 | loss: 0.0311255 | val_loss: 0.0311645 | Time: 6269.04 ms
[2022-03-16 20:03:02	                main:574]	:	INFO	:	Epoch 2044 | loss: 0.0311255 | val_loss: 0.0311694 | Time: 6290.39 ms
[2022-03-16 20:03:09	                main:574]	:	INFO	:	Epoch 2045 | loss: 0.0311261 | val_loss: 0.0311718 | Time: 6361.54 ms
[2022-03-16 20:03:15	                main:574]	:	INFO	:	Epoch 2046 | loss: 0.0311281 | val_loss: 0.0311674 | Time: 6278.89 ms
[2022-03-16 20:03:21	                main:574]	:	INFO	:	Epoch 2047 | loss: 0.0311286 | val_loss: 0.0311673 | Time: 6273.69 ms
[2022-03-16 20:03:27	                main:574]	:	INFO	:	Epoch 2048 | loss: 0.0311284 | val_loss: 0.0311739 | Time: 6262.88 ms
[2022-03-16 20:03:27	                main:597]	:	INFO	:	Saving trained model to model-final.pt, val_loss 0.0311739
[2022-03-16 20:03:27	                main:603]	:	INFO	:	Saving end state to config to file
[2022-03-16 20:03:27	                main:608]	:	INFO	:	Success, exiting..
20:03:27 (4892): called boinc_finish(0)

</stderr_txt>
]]>
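
The startup blocks in the log above show each client restart reloading `snapshot.pt` and resuming near the last saved epoch, with training capped at Max Epochs 2048, Patience 10, and a Validation Loss Threshold of 0.0001. A minimal, purely illustrative Python sketch of one plausible resume-plus-early-stopping loop follows; the function and snapshot field names are assumptions for illustration, not MLC@Home's actual code:

```python
def train(step, snapshot, max_epochs=2048, patience=10, threshold=1e-4):
    """Resume from `snapshot` and train until max_epochs, or until
    val_loss fails to improve by `threshold` for `patience` epochs.
    (Hypothetical sketch; not the project's real implementation.)"""
    epoch = snapshot.get("epoch", 0)            # resume point after a restart
    best = snapshot.get("best_val", float("inf"))
    stale = 0                                   # epochs since last improvement
    while epoch < max_epochs:
        epoch += 1
        val_loss = step(epoch)                  # one epoch of training + validation
        if best - val_loss > threshold:         # counts as a real improvement
            best, stale = val_loss, 0
        else:
            stale += 1
            if stale >= patience:               # plateau: stop early
                break
        # checkpoint so the next restart resumes here
        snapshot.update(epoch=epoch, best_val=best)
    return epoch, best
```

Note that in the log the validation loss plateaus near 0.03116 yet the run still reaches epoch 2048; frequent restarts (each reloading the snapshot) could plausibly reset such a patience counter, which this sketch does not attempt to model.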


©2022 MLC@Home Team
A project of the Cognition, Robotics, and Learning (CORAL) Lab at the University of Maryland, Baltimore County (UMBC)