Task: 11558622

Name: ParityModified-1639964508-416-3-0_0
Workunit: 8781618
Created: 14 Feb 2022, 16:13:43 UTC
Sent: 25 Feb 2022, 19:42:12 UTC
Report deadline: 5 Mar 2022, 19:42:12 UTC
Received: 3 Mar 2022, 20:38:30 UTC
Server state: Over
Outcome: Success
Client state: Done
Exit status: 0 (0x00000000)
Computer ID: 16732
Run time: 2 hours 41 min 36 sec
CPU time: 2 hours 37 min 25 sec
Validate state: Valid
Credit: 4,160.00
Device peak FLOPS: 1,132.15 GFLOPS
Application version: Machine Learning Dataset Generator (GPU) v9.75 (cuda10200) windows_x86_64
Peak working set size: 1.62 GB
Peak swap size: 3.57 GB
Peak disk usage: 1.54 GB

Stderr output

<core_client_version>7.16.20</core_client_version>
<![CDATA[
<stderr_txt>
             main:574]	:	INFO	:	Epoch 1802 | loss: 0.0311822 | val_loss: 0.0311795 | Time: 4175.73 ms
[2022-03-03 11:15:09	                main:574]	:	INFO	:	Epoch 1803 | loss: 0.0311809 | val_loss: 0.0311798 | Time: 4087.55 ms
[2022-03-03 11:15:13	                main:574]	:	INFO	:	Epoch 1804 | loss: 0.0311809 | val_loss: 0.0311815 | Time: 4049.53 ms
[2022-03-03 11:15:17	                main:574]	:	INFO	:	Epoch 1805 | loss: 0.0311818 | val_loss: 0.0311819 | Time: 4032.97 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GT 1030)
[2022-03-03 11:17:59	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-03 11:17:59	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-03 11:17:59	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-03 11:17:59	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-03 11:17:59	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-03 11:17:59	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-03 11:17:59	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-03 11:17:59	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-03 11:17:59	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-03 11:17:59	                main:474]	:	INFO	:	Configuration: 
[2022-03-03 11:17:59	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-03 11:17:59	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-03 11:17:59	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-03 11:17:59	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-03 11:17:59	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-03 11:17:59	                main:480]	:	INFO	:	    Patience: 10
[2022-03-03 11:17:59	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-03 11:17:59	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-03 11:17:59	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-03 11:17:59	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-03 11:17:59	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-03 11:17:59	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GT 1030)
[2022-03-03 11:21:27	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-03 11:21:27	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-03 11:21:27	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-03 11:21:27	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-03 11:21:27	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-03 11:21:27	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-03 11:21:27	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-03 11:21:27	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-03 11:21:27	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-03 11:21:27	                main:474]	:	INFO	:	Configuration: 
[2022-03-03 11:21:27	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-03 11:21:27	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-03 11:21:27	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-03 11:21:27	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-03 11:21:27	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-03 11:21:27	                main:480]	:	INFO	:	    Patience: 10
[2022-03-03 11:21:27	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-03 11:21:27	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-03 11:21:27	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-03 11:21:27	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-03 11:21:27	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-03 11:21:27	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-03 11:21:27	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-03 11:21:28	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-03 11:21:28	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-03 11:21:29	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-03 11:21:29	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-03 11:21:29	                main:494]	:	INFO	:	Creating Model
[2022-03-03 11:21:29	                main:507]	:	INFO	:	Preparing config file
[2022-03-03 11:21:29	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-03 11:21:29	                main:512]	:	INFO	:	Loading config
[2022-03-03 11:21:29	                main:514]	:	INFO	:	Loading state
[2022-03-03 11:21:29	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-03 11:21:29	                main:562]	:	INFO	:	Starting Training
[2022-03-03 11:21:34	                main:574]	:	INFO	:	Epoch 1793 | loss: 0.0312145 | val_loss: 0.0311818 | Time: 4395.18 ms
[2022-03-03 11:21:38	                main:574]	:	INFO	:	Epoch 1794 | loss: 0.0311861 | val_loss: 0.0311808 | Time: 4421.56 ms
[2022-03-03 11:21:43	                main:574]	:	INFO	:	Epoch 1795 | loss: 0.0311839 | val_loss: 0.031181 | Time: 5095.91 ms
[2022-03-03 11:21:48	                main:574]	:	INFO	:	Epoch 1796 | loss: 0.0311829 | val_loss: 0.0311808 | Time: 5261.63 ms
[2022-03-03 11:21:53	                main:574]	:	INFO	:	Epoch 1797 | loss: 0.0311835 | val_loss: 0.0311813 | Time: 5034.9 ms
[2022-03-03 11:21:59	                main:574]	:	INFO	:	Epoch 1798 | loss: 0.0311827 | val_loss: 0.0311824 | Time: 5275.35 ms
[2022-03-03 11:22:04	                main:574]	:	INFO	:	Epoch 1799 | loss: 0.0311826 | val_loss: 0.031189 | Time: 5223.68 ms
[2022-03-03 11:22:09	                main:574]	:	INFO	:	Epoch 1800 | loss: 0.0311847 | val_loss: 0.0311834 | Time: 4909.84 ms
[2022-03-03 11:22:14	                main:574]	:	INFO	:	Epoch 1801 | loss: 0.031182 | val_loss: 0.0311797 | Time: 5246.64 ms
[2022-03-03 11:22:19	                main:574]	:	INFO	:	Epoch 1802 | loss: 0.031181 | val_loss: 0.0311797 | Time: 4873.2 ms
[2022-03-03 11:22:24	                main:574]	:	INFO	:	Epoch 1803 | loss: 0.0311802 | val_loss: 0.031181 | Time: 5216.62 ms
[2022-03-03 11:22:30	                main:574]	:	INFO	:	Epoch 1804 | loss: 0.0311809 | val_loss: 0.0311806 | Time: 5461.47 ms
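The configuration block above lists "Validation Loss Threshold: 0.0001" and "Patience: 10", i.e. training ends either when the validation loss drops below the threshold or when it stops improving for a number of consecutive epochs. A minimal sketch of such an early-stopping rule, assuming a conventional patience scheme (the function name and exact criterion are illustrative, not taken from the application's source):

```python
def should_stop(val_losses, threshold=0.0001, patience=10):
    """Decide whether training should stop, given the per-epoch
    validation losses so far (most recent last).

    Stops when the latest loss is below `threshold`, or when none of
    the last `patience` epochs improved on the best loss seen before
    that window. Hypothetical sketch of a patience-based rule; the
    real application may differ in detail.
    """
    if not val_losses:
        return False
    if val_losses[-1] < threshold:
        return True                      # converged below the threshold
    if len(val_losses) <= patience:
        return False                     # not enough history yet
    best_before = min(val_losses[:-patience])
    # no epoch in the patience window beat the earlier best => stop
    return min(val_losses[-patience:]) >= best_before
```

Under this rule the flat ~0.03118 validation losses logged above would keep resetting the patience window only on tiny improvements, which is consistent with training continuing well past epoch 1800.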
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GT 1030)
[2022-03-03 11:30:08	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-03 11:30:08	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-03 11:30:08	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-03 11:30:08	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-03 11:30:08	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-03 11:30:08	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-03 11:30:08	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-03 11:30:08	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-03 11:30:08	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-03 11:30:08	                main:474]	:	INFO	:	Configuration: 
[2022-03-03 11:30:08	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-03 11:30:08	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-03 11:30:08	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-03 11:30:08	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-03 11:30:08	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-03 11:30:08	                main:480]	:	INFO	:	    Patience: 10
[2022-03-03 11:30:08	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-03 11:30:08	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-03 11:30:08	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-03 11:30:08	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-03 11:30:08	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-03 11:30:08	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-03 11:30:08	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-03 11:30:09	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-03 11:30:09	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-03 11:30:09	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-03 11:30:09	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-03 11:30:09	                main:494]	:	INFO	:	Creating Model
[2022-03-03 11:30:09	                main:507]	:	INFO	:	Preparing config file
[2022-03-03 11:30:09	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-03 11:30:09	                main:512]	:	INFO	:	Loading config
[2022-03-03 11:30:09	                main:514]	:	INFO	:	Loading state
[2022-03-03 11:30:10	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-03 11:30:10	                main:562]	:	INFO	:	Starting Training
[2022-03-03 11:30:14	                main:574]	:	INFO	:	Epoch 1793 | loss: 0.0312134 | val_loss: 0.0311807 | Time: 4171.08 ms
[2022-03-03 11:30:19	                main:574]	:	INFO	:	Epoch 1794 | loss: 0.0311832 | val_loss: 0.0311805 | Time: 4326.41 ms
[2022-03-03 11:30:24	                main:574]	:	INFO	:	Epoch 1795 | loss: 0.0311832 | val_loss: 0.0311815 | Time: 5062.03 ms
[2022-03-03 11:30:29	                main:574]	:	INFO	:	Epoch 1796 | loss: 0.0311833 | val_loss: 0.0311792 | Time: 5332.83 ms
[2022-03-03 11:30:34	                main:574]	:	INFO	:	Epoch 1797 | loss: 0.0311831 | val_loss: 0.0311807 | Time: 4904.32 ms
[2022-03-03 11:30:39	                main:574]	:	INFO	:	Epoch 1798 | loss: 0.0311834 | val_loss: 0.0311816 | Time: 5190.36 ms
[2022-03-03 11:30:44	                main:574]	:	INFO	:	Epoch 1799 | loss: 0.0311842 | val_loss: 0.0311809 | Time: 5188.91 ms
[2022-03-03 11:30:49	                main:574]	:	INFO	:	Epoch 1800 | loss: 0.0311838 | val_loss: 0.0311816 | Time: 5027.18 ms
[2022-03-03 11:30:55	                main:574]	:	INFO	:	Epoch 1801 | loss: 0.0311831 | val_loss: 0.0311795 | Time: 5297.96 ms
[2022-03-03 11:30:59	                main:574]	:	INFO	:	Epoch 1802 | loss: 0.0311834 | val_loss: 0.0311831 | Time: 4850.86 ms
[2022-03-03 11:31:05	                main:574]	:	INFO	:	Epoch 1803 | loss: 0.0311832 | val_loss: 0.0311811 | Time: 5258.32 ms
[2022-03-03 11:31:10	                main:574]	:	INFO	:	Epoch 1804 | loss: 0.031184 | val_loss: 0.0311819 | Time: 5107.75 ms
[2022-03-03 11:31:15	                main:574]	:	INFO	:	Epoch 1805 | loss: 0.0311848 | val_loss: 0.031181 | Time: 5230.06 ms
[2022-03-03 11:31:20	                main:574]	:	INFO	:	Epoch 1806 | loss: 0.0311844 | val_loss: 0.0311805 | Time: 5352.7 ms
[2022-03-03 11:31:25	                main:574]	:	INFO	:	Epoch 1807 | loss: 0.0311842 | val_loss: 0.03118 | Time: 4809.55 ms
[2022-03-03 11:31:31	                main:574]	:	INFO	:	Epoch 1808 | loss: 0.0311843 | val_loss: 0.0311799 | Time: 5287.69 ms
[2022-03-03 11:31:35	                main:574]	:	INFO	:	Epoch 1809 | loss: 0.0311852 | val_loss: 0.0311802 | Time: 4866.48 ms
[2022-03-03 11:31:40	                main:574]	:	INFO	:	Epoch 1810 | loss: 0.0311854 | val_loss: 0.0311835 | Time: 4163.77 ms
[2022-03-03 11:31:44	                main:574]	:	INFO	:	Epoch 1811 | loss: 0.0311861 | val_loss: 0.0311865 | Time: 4057.44 ms
[2022-03-03 11:31:48	                main:574]	:	INFO	:	Epoch 1812 | loss: 0.0311897 | val_loss: 0.0311876 | Time: 4127.37 ms
[2022-03-03 11:31:52	                main:574]	:	INFO	:	Epoch 1813 | loss: 0.0311914 | val_loss: 0.0311872 | Time: 4107.48 ms
[2022-03-03 11:31:56	                main:574]	:	INFO	:	Epoch 1814 | loss: 0.0311912 | val_loss: 0.0311891 | Time: 4112.75 ms
[2022-03-03 11:32:00	                main:574]	:	INFO	:	Epoch 1815 | loss: 0.0311943 | val_loss: 0.0311926 | Time: 4272.82 ms
[2022-03-03 11:32:04	                main:574]	:	INFO	:	Epoch 1816 | loss: 0.031193 | val_loss: 0.0311941 | Time: 4081.27 ms
[2022-03-03 11:32:08	                main:574]	:	INFO	:	Epoch 1817 | loss: 0.031197 | val_loss: 0.0311937 | Time: 4159.44 ms
[2022-03-03 11:32:13	                main:574]	:	INFO	:	Epoch 1818 | loss: 0.0311949 | val_loss: 0.0311939 | Time: 4048.45 ms
[2022-03-03 11:32:17	                main:574]	:	INFO	:	Epoch 1819 | loss: 0.0311952 | val_loss: 0.0311943 | Time: 4002.15 ms
[2022-03-03 11:32:21	                main:574]	:	INFO	:	Epoch 1820 | loss: 0.0311947 | val_loss: 0.0311978 | Time: 4059.71 ms
[2022-03-03 11:32:25	                main:574]	:	INFO	:	Epoch 1821 | loss: 0.0311988 | val_loss: 0.0312022 | Time: 4187.53 ms
[2022-03-03 11:32:29	                main:574]	:	INFO	:	Epoch 1822 | loss: 0.0312035 | val_loss: 0.0312064 | Time: 4062.2 ms
[2022-03-03 11:32:33	                main:574]	:	INFO	:	Epoch 1823 | loss: 0.0312034 | val_loss: 0.0312021 | Time: 4036.14 ms
[2022-03-03 11:32:37	                main:574]	:	INFO	:	Epoch 1824 | loss: 0.0312015 | val_loss: 0.0312002 | Time: 4100.59 ms
[2022-03-03 11:32:41	                main:574]	:	INFO	:	Epoch 1825 | loss: 0.0312 | val_loss: 0.0312002 | Time: 4057.3 ms
[2022-03-03 11:32:45	                main:574]	:	INFO	:	Epoch 1826 | loss: 0.0312008 | val_loss: 0.0311986 | Time: 4034.1 ms
[2022-03-03 11:32:49	                main:574]	:	INFO	:	Epoch 1827 | loss: 0.0311987 | val_loss: 0.0311989 | Time: 4142.25 ms
[2022-03-03 11:32:53	                main:574]	:	INFO	:	Epoch 1828 | loss: 0.0311998 | val_loss: 0.0311963 | Time: 4099.6 ms
[2022-03-03 11:32:57	                main:574]	:	INFO	:	Epoch 1829 | loss: 0.0311996 | val_loss: 0.0311967 | Time: 4063.47 ms
[2022-03-03 11:33:01	                main:574]	:	INFO	:	Epoch 1830 | loss: 0.0311994 | val_loss: 0.0311964 | Time: 4091.97 ms
[2022-03-03 11:33:06	                main:574]	:	INFO	:	Epoch 1831 | loss: 0.0311993 | val_loss: 0.0311988 | Time: 4135.29 ms
[2022-03-03 11:33:10	                main:574]	:	INFO	:	Epoch 1832 | loss: 0.0311991 | val_loss: 0.0311993 | Time: 4090.82 ms
[2022-03-03 11:33:14	                main:574]	:	INFO	:	Epoch 1833 | loss: 0.0311991 | val_loss: 0.0311979 | Time: 4072.07 ms
[2022-03-03 11:33:18	                main:574]	:	INFO	:	Epoch 1834 | loss: 0.0312005 | val_loss: 0.0312031 | Time: 4009.45 ms
[2022-03-03 11:33:22	                main:574]	:	INFO	:	Epoch 1835 | loss: 0.0312001 | val_loss: 0.0311977 | Time: 3990.19 ms
[2022-03-03 11:33:26	                main:574]	:	INFO	:	Epoch 1836 | loss: 0.0312012 | val_loss: 0.0311978 | Time: 4174.29 ms
[2022-03-03 11:33:30	                main:574]	:	INFO	:	Epoch 1837 | loss: 0.0311985 | val_loss: 0.031197 | Time: 4059.93 ms
[2022-03-03 11:33:34	                main:574]	:	INFO	:	Epoch 1838 | loss: 0.0311979 | val_loss: 0.0311968 | Time: 4094.63 ms
[2022-03-03 11:33:38	                main:574]	:	INFO	:	Epoch 1839 | loss: 0.0311973 | val_loss: 0.0311972 | Time: 4182.25 ms
[2022-03-03 11:33:42	                main:574]	:	INFO	:	Epoch 1840 | loss: 0.0311975 | val_loss: 0.0311972 | Time: 4131.83 ms
[2022-03-03 11:33:46	                main:574]	:	INFO	:	Epoch 1841 | loss: 0.0311964 | val_loss: 0.0311973 | Time: 3997.32 ms
[2022-03-03 11:33:51	                main:574]	:	INFO	:	Epoch 1842 | loss: 0.0311963 | val_loss: 0.0311974 | Time: 4163.25 ms
[2022-03-03 11:33:55	                main:574]	:	INFO	:	Epoch 1843 | loss: 0.0311967 | val_loss: 0.0311981 | Time: 4130.23 ms
[2022-03-03 11:33:59	                main:574]	:	INFO	:	Epoch 1844 | loss: 0.0311967 | val_loss: 0.0311974 | Time: 4015.71 ms
[2022-03-03 11:34:03	                main:574]	:	INFO	:	Epoch 1845 | loss: 0.0311967 | val_loss: 0.0311977 | Time: 4126 ms
[2022-03-03 11:34:07	                main:574]	:	INFO	:	Epoch 1846 | loss: 0.0311976 | val_loss: 0.0312007 | Time: 4114.57 ms
[2022-03-03 11:34:11	                main:574]	:	INFO	:	Epoch 1847 | loss: 0.0311967 | val_loss: 0.0311959 | Time: 4116.07 ms
[2022-03-03 11:34:15	                main:574]	:	INFO	:	Epoch 1848 | loss: 0.0311959 | val_loss: 0.0311959 | Time: 4106.87 ms
[2022-03-03 11:34:19	                main:574]	:	INFO	:	Epoch 1849 | loss: 0.0311963 | val_loss: 0.031196 | Time: 4043.86 ms
[2022-03-03 11:34:23	                main:574]	:	INFO	:	Epoch 1850 | loss: 0.0311961 | val_loss: 0.0311955 | Time: 4084.75 ms
[2022-03-03 11:34:27	                main:574]	:	INFO	:	Epoch 1851 | loss: 0.0311957 | val_loss: 0.0311954 | Time: 4051.42 ms
[2022-03-03 11:34:32	                main:574]	:	INFO	:	Epoch 1852 | loss: 0.0311964 | val_loss: 0.0311954 | Time: 4147.49 ms
[2022-03-03 11:34:36	                main:574]	:	INFO	:	Epoch 1853 | loss: 0.0311972 | val_loss: 0.031197 | Time: 4044.2 ms
[2022-03-03 11:34:40	                main:574]	:	INFO	:	Epoch 1854 | loss: 0.0311961 | val_loss: 0.0311959 | Time: 4163.79 ms
[2022-03-03 11:34:44	                main:574]	:	INFO	:	Epoch 1855 | loss: 0.0311966 | val_loss: 0.0311971 | Time: 4069.62 ms
[2022-03-03 11:34:48	                main:574]	:	INFO	:	Epoch 1856 | loss: 0.0311965 | val_loss: 0.0311974 | Time: 4014.33 ms
[2022-03-03 11:34:52	                main:574]	:	INFO	:	Epoch 1857 | loss: 0.0311962 | val_loss: 0.0311985 | Time: 4095.49 ms
[2022-03-03 11:34:56	                main:574]	:	INFO	:	Epoch 1858 | loss: 0.0311966 | val_loss: 0.0311993 | Time: 4030.95 ms
[2022-03-03 11:35:00	                main:574]	:	INFO	:	Epoch 1859 | loss: 0.0311964 | val_loss: 0.0311984 | Time: 4052.91 ms
[2022-03-03 11:35:04	                main:574]	:	INFO	:	Epoch 1860 | loss: 0.0311964 | val_loss: 0.0311985 | Time: 4051.85 ms
[2022-03-03 11:35:08	                main:574]	:	INFO	:	Epoch 1861 | loss: 0.0311976 | val_loss: 0.0311981 | Time: 4041.66 ms
[2022-03-03 11:35:12	                main:574]	:	INFO	:	Epoch 1862 | loss: 0.0311984 | val_loss: 0.0311991 | Time: 4209 ms
[2022-03-03 11:35:16	                main:574]	:	INFO	:	Epoch 1863 | loss: 0.0311978 | val_loss: 0.0311993 | Time: 4008.07 ms
[2022-03-03 11:35:20	                main:574]	:	INFO	:	Epoch 1864 | loss: 0.0311964 | val_loss: 0.0312 | Time: 4000.44 ms
[2022-03-03 11:35:24	                main:574]	:	INFO	:	Epoch 1865 | loss: 0.0311961 | val_loss: 0.0311974 | Time: 4028.08 ms
[2022-03-03 11:35:28	                main:574]	:	INFO	:	Epoch 1866 | loss: 0.0311985 | val_loss: 0.031199 | Time: 4055.63 ms
[2022-03-03 11:35:33	                main:574]	:	INFO	:	Epoch 1867 | loss: 0.0311956 | val_loss: 0.0311975 | Time: 4182.62 ms
[2022-03-03 11:35:37	                main:574]	:	INFO	:	Epoch 1868 | loss: 0.0311961 | val_loss: 0.0311975 | Time: 4092.03 ms
[2022-03-03 11:35:41	                main:574]	:	INFO	:	Epoch 1869 | loss: 0.0311955 | val_loss: 0.0311972 | Time: 4089.19 ms
[2022-03-03 11:35:45	                main:574]	:	INFO	:	Epoch 1870 | loss: 0.0311956 | val_loss: 0.0311969 | Time: 4123.5 ms
[2022-03-03 11:35:49	                main:574]	:	INFO	:	Epoch 1871 | loss: 0.0311957 | val_loss: 0.0311987 | Time: 4030.97 ms
[2022-03-03 11:35:53	                main:574]	:	INFO	:	Epoch 1872 | loss: 0.0311958 | val_loss: 0.0311972 | Time: 4086.94 ms
[2022-03-03 11:35:57	                main:574]	:	INFO	:	Epoch 1873 | loss: 0.031196 | val_loss: 0.0311976 | Time: 4228.95 ms
[2022-03-03 11:36:01	                main:574]	:	INFO	:	Epoch 1874 | loss: 0.0311963 | val_loss: 0.0311977 | Time: 4073.08 ms
[2022-03-03 11:36:05	                main:574]	:	INFO	:	Epoch 1875 | loss: 0.0311965 | val_loss: 0.0311972 | Time: 4012.6 ms
[2022-03-03 11:36:10	                main:574]	:	INFO	:	Epoch 1876 | loss: 0.0311966 | val_loss: 0.0311974 | Time: 4144.37 ms
[2022-03-03 11:36:14	                main:574]	:	INFO	:	Epoch 1877 | loss: 0.0311986 | val_loss: 0.0311987 | Time: 4061.45 ms
[2022-03-03 11:36:18	                main:574]	:	INFO	:	Epoch 1878 | loss: 0.0312032 | val_loss: 0.0312066 | Time: 4041.48 ms
[2022-03-03 11:36:22	                main:574]	:	INFO	:	Epoch 1879 | loss: 0.0312078 | val_loss: 0.0312091 | Time: 4071.87 ms
[2022-03-03 11:36:26	                main:574]	:	INFO	:	Epoch 1880 | loss: 0.0312117 | val_loss: 0.031209 | Time: 3980.24 ms
[2022-03-03 11:36:30	                main:574]	:	INFO	:	Epoch 1881 | loss: 0.0312084 | val_loss: 0.0312067 | Time: 4119.12 ms
[2022-03-03 11:36:34	                main:574]	:	INFO	:	Epoch 1882 | loss: 0.0312078 | val_loss: 0.0312075 | Time: 4085.29 ms
[2022-03-03 11:36:38	                main:574]	:	INFO	:	Epoch 1883 | loss: 0.0312084 | val_loss: 0.0312094 | Time: 4028.5 ms
[2022-03-03 11:36:42	                main:574]	:	INFO	:	Epoch 1884 | loss: 0.0312105 | val_loss: 0.0312156 | Time: 4067.31 ms
[2022-03-03 11:36:46	                main:574]	:	INFO	:	Epoch 1885 | loss: 0.0312168 | val_loss: 0.0312197 | Time: 4039.78 ms
[2022-03-03 11:36:50	                main:574]	:	INFO	:	Epoch 1886 | loss: 0.0312205 | val_loss: 0.031211 | Time: 4039.59 ms
[2022-03-03 11:36:54	                main:574]	:	INFO	:	Epoch 1887 | loss: 0.0312176 | val_loss: 0.0312125 | Time: 4165.37 ms
[2022-03-03 11:36:58	                main:574]	:	INFO	:	Epoch 1888 | loss: 0.0312168 | val_loss: 0.0312094 | Time: 4042.78 ms
[2022-03-03 11:37:02	                main:574]	:	INFO	:	Epoch 1889 | loss: 0.0312154 | val_loss: 0.031209 | Time: 4081.15 ms
[2022-03-03 11:37:06	                main:574]	:	INFO	:	Epoch 1890 | loss: 0.0312165 | val_loss: 0.0312125 | Time: 4018 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GT 1030)
[2022-03-03 11:40:16	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-03 11:40:16	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-03 11:40:16	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-03 11:40:16	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-03 11:40:16	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-03 11:40:16	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-03 11:40:16	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-03 11:40:16	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-03 11:40:16	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-03 11:40:16	                main:474]	:	INFO	:	Configuration: 
[2022-03-03 11:40:16	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-03 11:40:16	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-03 11:40:16	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-03 11:40:16	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-03 11:40:16	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-03 11:40:16	                main:480]	:	INFO	:	    Patience: 10
[2022-03-03 11:40:16	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-03 11:40:16	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-03 11:40:16	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-03 11:40:16	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-03 11:40:16	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-03 11:40:16	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GT 1030)
[2022-03-03 11:51:50	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-03 11:51:50	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-03 11:51:50	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-03 11:51:50	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-03 11:51:50	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-03 11:51:50	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-03 11:51:50	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-03 11:51:50	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-03 11:51:50	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-03 11:51:50	                main:474]	:	INFO	:	Configuration: 
[2022-03-03 11:51:50	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-03 11:51:50	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-03 11:51:50	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-03 11:51:50	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-03 11:51:50	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-03 11:51:50	                main:480]	:	INFO	:	    Patience: 10
[2022-03-03 11:51:50	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-03 11:51:50	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-03 11:51:50	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-03 11:51:50	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-03 11:51:50	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-03 11:51:50	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-03 11:51:50	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-03 11:51:51	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-03 11:51:51	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-03 11:51:51	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-03 11:51:51	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-03 11:51:51	                main:494]	:	INFO	:	Creating Model
[2022-03-03 11:51:51	                main:507]	:	INFO	:	Preparing config file
[2022-03-03 11:51:51	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-03 11:51:51	                main:512]	:	INFO	:	Loading config
[2022-03-03 11:51:51	                main:514]	:	INFO	:	Loading state
[2022-03-03 11:51:52	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-03 11:51:52	                main:562]	:	INFO	:	Starting Training
[2022-03-03 11:51:56	                main:574]	:	INFO	:	Epoch 1878 | loss: 0.0312158 | val_loss: 0.0312016 | Time: 4270.38 ms
[2022-03-03 11:52:00	                main:574]	:	INFO	:	Epoch 1879 | loss: 0.0311986 | val_loss: 0.0311983 | Time: 4153.76 ms
[2022-03-03 11:52:04	                main:574]	:	INFO	:	Epoch 1880 | loss: 0.031198 | val_loss: 0.0311972 | Time: 4067.73 ms
[2022-03-03 11:52:08	                main:574]	:	INFO	:	Epoch 1881 | loss: 0.031199 | val_loss: 0.0311949 | Time: 4063.88 ms
[2022-03-03 11:52:13	                main:574]	:	INFO	:	Epoch 1882 | loss: 0.0311984 | val_loss: 0.0311951 | Time: 4137.72 ms
[2022-03-03 11:52:17	                main:574]	:	INFO	:	Epoch 1883 | loss: 0.0311987 | val_loss: 0.0311952 | Time: 4093 ms
[2022-03-03 11:52:21	                main:574]	:	INFO	:	Epoch 1884 | loss: 0.0311964 | val_loss: 0.0311942 | Time: 4111.37 ms
[2022-03-03 11:52:25	                main:574]	:	INFO	:	Epoch 1885 | loss: 0.0312017 | val_loss: 0.0312089 | Time: 4044.6 ms
[2022-03-03 11:52:29	                main:574]	:	INFO	:	Epoch 1886 | loss: 0.0312068 | val_loss: 0.0312037 | Time: 4152.75 ms
[2022-03-03 11:52:33	                main:574]	:	INFO	:	Epoch 1887 | loss: 0.0312044 | val_loss: 0.0312028 | Time: 4157.53 ms
[2022-03-03 11:52:37	                main:574]	:	INFO	:	Epoch 1888 | loss: 0.031203 | val_loss: 0.0312012 | Time: 4158.08 ms
[2022-03-03 11:52:41	                main:574]	:	INFO	:	Epoch 1889 | loss: 0.0312025 | val_loss: 0.0312016 | Time: 4100.85 ms
[2022-03-03 11:52:46	                main:574]	:	INFO	:	Epoch 1890 | loss: 0.0312021 | val_loss: 0.0312006 | Time: 4153.62 ms
[2022-03-03 11:52:50	                main:574]	:	INFO	:	Epoch 1891 | loss: 0.0312022 | val_loss: 0.0312003 | Time: 4136.42 ms
[2022-03-03 11:52:54	                main:574]	:	INFO	:	Epoch 1892 | loss: 0.0312016 | val_loss: 0.0312004 | Time: 4156.11 ms
[2022-03-03 11:52:58	                main:574]	:	INFO	:	Epoch 1893 | loss: 0.0312015 | val_loss: 0.0312012 | Time: 4054.03 ms
[2022-03-03 11:53:02	                main:574]	:	INFO	:	Epoch 1894 | loss: 0.0312011 | val_loss: 0.0312033 | Time: 4128.02 ms
[2022-03-03 11:53:06	                main:574]	:	INFO	:	Epoch 1895 | loss: 0.0312022 | val_loss: 0.0312012 | Time: 4090.18 ms
[2022-03-03 11:53:10	                main:574]	:	INFO	:	Epoch 1896 | loss: 0.0312038 | val_loss: 0.0312023 | Time: 4087.67 ms
[2022-03-03 11:53:14	                main:574]	:	INFO	:	Epoch 1897 | loss: 0.031201 | val_loss: 0.0311997 | Time: 4271.66 ms
[2022-03-03 11:53:19	                main:574]	:	INFO	:	Epoch 1898 | loss: 0.0312023 | val_loss: 0.0311993 | Time: 4229.03 ms
[2022-03-03 11:53:23	                main:574]	:	INFO	:	Epoch 1899 | loss: 0.0312014 | val_loss: 0.0311981 | Time: 4293.77 ms
[2022-03-03 11:53:27	                main:574]	:	INFO	:	Epoch 1900 | loss: 0.0312012 | val_loss: 0.0312004 | Time: 4163 ms
[2022-03-03 11:53:31	                main:574]	:	INFO	:	Epoch 1901 | loss: 0.031203 | val_loss: 0.0312032 | Time: 4179.5 ms
[2022-03-03 11:53:35	                main:574]	:	INFO	:	Epoch 1902 | loss: 0.0312044 | val_loss: 0.0311985 | Time: 3986.71 ms
[2022-03-03 11:53:39	                main:574]	:	INFO	:	Epoch 1903 | loss: 0.031204 | val_loss: 0.0312099 | Time: 4128.36 ms
[2022-03-03 11:53:44	                main:574]	:	INFO	:	Epoch 1904 | loss: 0.0312042 | val_loss: 0.0312002 | Time: 4118.91 ms
[2022-03-03 11:53:48	                main:574]	:	INFO	:	Epoch 1905 | loss: 0.0312037 | val_loss: 0.0311975 | Time: 4171.94 ms
[2022-03-03 11:53:52	                main:574]	:	INFO	:	Epoch 1906 | loss: 0.0312009 | val_loss: 0.0311978 | Time: 4154.8 ms
[2022-03-03 11:53:56	                main:574]	:	INFO	:	Epoch 1907 | loss: 0.0311997 | val_loss: 0.031197 | Time: 4140.49 ms
[2022-03-03 11:54:00	                main:574]	:	INFO	:	Epoch 1908 | loss: 0.0311988 | val_loss: 0.0311971 | Time: 4065.92 ms
[2022-03-03 11:54:04	                main:574]	:	INFO	:	Epoch 1909 | loss: 0.0311994 | val_loss: 0.0311968 | Time: 4061.35 ms
[2022-03-03 11:54:08	                main:574]	:	INFO	:	Epoch 1910 | loss: 0.0311993 | val_loss: 0.0311975 | Time: 4048.79 ms
[2022-03-03 11:54:12	                main:574]	:	INFO	:	Epoch 1911 | loss: 0.0311988 | val_loss: 0.0311971 | Time: 4027.95 ms
[2022-03-03 11:54:16	                main:574]	:	INFO	:	Epoch 1912 | loss: 0.031199 | val_loss: 0.0311965 | Time: 4031.76 ms
[2022-03-03 11:54:20	                main:574]	:	INFO	:	Epoch 1913 | loss: 0.0311984 | val_loss: 0.0311967 | Time: 4110.02 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GT 1030)
[2022-03-03 12:30:14	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-03 12:30:14	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-03 12:30:14	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-03 12:30:14	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-03 12:30:14	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-03 12:30:14	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-03 12:30:14	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-03 12:30:14	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-03 12:30:14	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-03 12:30:14	                main:474]	:	INFO	:	Configuration: 
[2022-03-03 12:30:14	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-03 12:30:14	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-03 12:30:14	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-03 12:30:14	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-03 12:30:14	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-03 12:30:14	                main:480]	:	INFO	:	    Patience: 10
[2022-03-03 12:30:14	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-03 12:30:14	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-03 12:30:14	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-03 12:30:14	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-03 12:30:14	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-03 12:30:14	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-03 12:30:14	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-03 12:30:15	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-03 12:30:15	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-03 12:30:15	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-03 12:30:15	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-03 12:30:15	                main:494]	:	INFO	:	Creating Model
[2022-03-03 12:30:15	                main:507]	:	INFO	:	Preparing config file
[2022-03-03 12:30:15	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-03 12:30:15	                main:512]	:	INFO	:	Loading config
[2022-03-03 12:30:15	                main:514]	:	INFO	:	Loading state
[2022-03-03 12:30:16	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-03 12:30:16	                main:562]	:	INFO	:	Starting Training
[2022-03-03 12:30:20	                main:574]	:	INFO	:	Epoch 1878 | loss: 0.0312295 | val_loss: 0.0312059 | Time: 4195.48 ms
[2022-03-03 12:30:24	                main:574]	:	INFO	:	Epoch 1879 | loss: 0.0312039 | val_loss: 0.0312054 | Time: 4030.29 ms
[2022-03-03 12:30:28	                main:574]	:	INFO	:	Epoch 1880 | loss: 0.031201 | val_loss: 0.0312033 | Time: 3996.17 ms
[2022-03-03 12:30:32	                main:574]	:	INFO	:	Epoch 1881 | loss: 0.0312009 | val_loss: 0.0312006 | Time: 4125.2 ms
[2022-03-03 12:30:36	                main:574]	:	INFO	:	Epoch 1882 | loss: 0.0312001 | val_loss: 0.0311999 | Time: 4083.49 ms
[2022-03-03 12:30:40	                main:574]	:	INFO	:	Epoch 1883 | loss: 0.0312003 | val_loss: 0.0312028 | Time: 4049.73 ms
[2022-03-03 12:30:44	                main:574]	:	INFO	:	Epoch 1884 | loss: 0.0312031 | val_loss: 0.0312034 | Time: 4033.64 ms
[2022-03-03 12:30:48	                main:574]	:	INFO	:	Epoch 1885 | loss: 0.0312029 | val_loss: 0.0312025 | Time: 4090.45 ms
[2022-03-03 12:30:53	                main:574]	:	INFO	:	Epoch 1886 | loss: 0.0312028 | val_loss: 0.0312028 | Time: 4111.18 ms
[2022-03-03 12:30:57	                main:574]	:	INFO	:	Epoch 1887 | loss: 0.0312034 | val_loss: 0.0312038 | Time: 3995.65 ms
[2022-03-03 12:31:01	                main:574]	:	INFO	:	Epoch 1888 | loss: 0.0312024 | val_loss: 0.031201 | Time: 4054 ms
[2022-03-03 12:31:05	                main:574]	:	INFO	:	Epoch 1889 | loss: 0.0312032 | val_loss: 0.031203 | Time: 4029.38 ms
[2022-03-03 12:31:09	                main:574]	:	INFO	:	Epoch 1890 | loss: 0.0312028 | val_loss: 0.0312043 | Time: 4081.31 ms
[2022-03-03 12:31:13	                main:574]	:	INFO	:	Epoch 1891 | loss: 0.031203 | val_loss: 0.0312017 | Time: 4133.25 ms
[2022-03-03 12:31:17	                main:574]	:	INFO	:	Epoch 1892 | loss: 0.0312032 | val_loss: 0.0312075 | Time: 4017.13 ms
[2022-03-03 12:31:21	                main:574]	:	INFO	:	Epoch 1893 | loss: 0.0312074 | val_loss: 0.0312048 | Time: 3983.06 ms
[2022-03-03 12:31:25	                main:574]	:	INFO	:	Epoch 1894 | loss: 0.0312024 | val_loss: 0.0311974 | Time: 4057.55 ms
[2022-03-03 12:31:29	                main:574]	:	INFO	:	Epoch 1895 | loss: 0.0312006 | val_loss: 0.0311985 | Time: 4073.22 ms
[2022-03-03 12:31:33	                main:574]	:	INFO	:	Epoch 1896 | loss: 0.0311998 | val_loss: 0.031202 | Time: 3975.67 ms
[2022-03-03 12:31:37	                main:574]	:	INFO	:	Epoch 1897 | loss: 0.0311987 | val_loss: 0.0311967 | Time: 4052.7 ms
[2022-03-03 12:31:41	                main:574]	:	INFO	:	Epoch 1898 | loss: 0.0311983 | val_loss: 0.031196 | Time: 4045.22 ms
[2022-03-03 12:31:45	                main:574]	:	INFO	:	Epoch 1899 | loss: 0.0311995 | val_loss: 0.0311967 | Time: 4046.51 ms
[2022-03-03 12:31:49	                main:574]	:	INFO	:	Epoch 1900 | loss: 0.0311985 | val_loss: 0.0311946 | Time: 4110.65 ms
[2022-03-03 12:31:53	                main:574]	:	INFO	:	Epoch 1901 | loss: 0.0311977 | val_loss: 0.031199 | Time: 4034.57 ms
[2022-03-03 12:31:57	                main:574]	:	INFO	:	Epoch 1902 | loss: 0.0311991 | val_loss: 0.0311981 | Time: 4093.45 ms
[2022-03-03 12:32:01	                main:574]	:	INFO	:	Epoch 1903 | loss: 0.0311987 | val_loss: 0.0311978 | Time: 4026.83 ms
[2022-03-03 12:32:05	                main:574]	:	INFO	:	Epoch 1904 | loss: 0.0311989 | val_loss: 0.0312019 | Time: 4063.55 ms
[2022-03-03 12:32:10	                main:574]	:	INFO	:	Epoch 1905 | loss: 0.0312006 | val_loss: 0.0311987 | Time: 4025.81 ms
[2022-03-03 12:32:14	                main:574]	:	INFO	:	Epoch 1906 | loss: 0.0311982 | val_loss: 0.0311959 | Time: 4095.4 ms
[2022-03-03 12:32:18	                main:574]	:	INFO	:	Epoch 1907 | loss: 0.0311974 | val_loss: 0.0311963 | Time: 4145.12 ms
[2022-03-03 12:32:22	                main:574]	:	INFO	:	Epoch 1908 | loss: 0.0311961 | val_loss: 0.0311953 | Time: 4043.8 ms
[2022-03-03 12:32:26	                main:574]	:	INFO	:	Epoch 1909 | loss: 0.0311973 | val_loss: 0.0311965 | Time: 4161.72 ms
[2022-03-03 12:32:30	                main:574]	:	INFO	:	Epoch 1910 | loss: 0.0311975 | val_loss: 0.0311977 | Time: 4191.83 ms
[2022-03-03 12:32:34	                main:574]	:	INFO	:	Epoch 1911 | loss: 0.0311958 | val_loss: 0.0311944 | Time: 4090.02 ms
[2022-03-03 12:32:38	                main:574]	:	INFO	:	Epoch 1912 | loss: 0.0311953 | val_loss: 0.0311945 | Time: 4023.4 ms
[2022-03-03 12:32:42	                main:574]	:	INFO	:	Epoch 1913 | loss: 0.0311954 | val_loss: 0.0311949 | Time: 4068.46 ms
[2022-03-03 12:32:47	                main:574]	:	INFO	:	Epoch 1914 | loss: 0.031196 | val_loss: 0.0311954 | Time: 4140.11 ms
[2022-03-03 12:32:51	                main:574]	:	INFO	:	Epoch 1915 | loss: 0.0311954 | val_loss: 0.0311956 | Time: 4146.81 ms
[2022-03-03 12:32:55	                main:574]	:	INFO	:	Epoch 1916 | loss: 0.0311948 | val_loss: 0.0311945 | Time: 4072.25 ms
[2022-03-03 12:32:59	                main:574]	:	INFO	:	Epoch 1917 | loss: 0.0311957 | val_loss: 0.0311971 | Time: 4079.74 ms
[2022-03-03 12:33:03	                main:574]	:	INFO	:	Epoch 1918 | loss: 0.031198 | val_loss: 0.0311967 | Time: 4163.93 ms
[2022-03-03 12:33:07	                main:574]	:	INFO	:	Epoch 1919 | loss: 0.0311954 | val_loss: 0.0311935 | Time: 4082.47 ms
[2022-03-03 12:33:11	                main:574]	:	INFO	:	Epoch 1920 | loss: 0.0311953 | val_loss: 0.0311956 | Time: 4070.59 ms
[2022-03-03 12:33:15	                main:574]	:	INFO	:	Epoch 1921 | loss: 0.0311957 | val_loss: 0.0311942 | Time: 4032.4 ms
[2022-03-03 12:33:19	                main:574]	:	INFO	:	Epoch 1922 | loss: 0.0311951 | val_loss: 0.0311935 | Time: 4132.61 ms
[2022-03-03 12:33:23	                main:574]	:	INFO	:	Epoch 1923 | loss: 0.0311951 | val_loss: 0.0311948 | Time: 4054.94 ms
[2022-03-03 12:33:27	                main:574]	:	INFO	:	Epoch 1924 | loss: 0.0311948 | val_loss: 0.0311929 | Time: 4084.37 ms
[2022-03-03 12:33:32	                main:574]	:	INFO	:	Epoch 1925 | loss: 0.0311941 | val_loss: 0.0311932 | Time: 4097.55 ms
[2022-03-03 12:33:36	                main:574]	:	INFO	:	Epoch 1926 | loss: 0.0311941 | val_loss: 0.031193 | Time: 4122.97 ms
[2022-03-03 12:33:40	                main:574]	:	INFO	:	Epoch 1927 | loss: 0.031196 | val_loss: 0.0311929 | Time: 4058.33 ms
[2022-03-03 12:33:44	                main:574]	:	INFO	:	Epoch 1928 | loss: 0.0311951 | val_loss: 0.0311958 | Time: 4048.41 ms
[2022-03-03 12:33:48	                main:574]	:	INFO	:	Epoch 1929 | loss: 0.0311949 | val_loss: 0.0312011 | Time: 4073.55 ms
[2022-03-03 12:33:52	                main:574]	:	INFO	:	Epoch 1930 | loss: 0.0311967 | val_loss: 0.0311931 | Time: 4060.46 ms
[2022-03-03 12:33:56	                main:574]	:	INFO	:	Epoch 1931 | loss: 0.0311939 | val_loss: 0.0311936 | Time: 4120.6 ms
[2022-03-03 12:34:00	                main:574]	:	INFO	:	Epoch 1932 | loss: 0.0311941 | val_loss: 0.0311931 | Time: 4147.39 ms
[2022-03-03 12:34:04	                main:574]	:	INFO	:	Epoch 1933 | loss: 0.0311975 | val_loss: 0.0311963 | Time: 4161.36 ms
[2022-03-03 12:34:08	                main:574]	:	INFO	:	Epoch 1934 | loss: 0.0311947 | val_loss: 0.0311946 | Time: 4048.2 ms
[2022-03-03 12:34:12	                main:574]	:	INFO	:	Epoch 1935 | loss: 0.0311948 | val_loss: 0.0311933 | Time: 4091.01 ms
[2022-03-03 12:34:17	                main:574]	:	INFO	:	Epoch 1936 | loss: 0.0311946 | val_loss: 0.0311925 | Time: 4173.33 ms
[2022-03-03 12:34:21	                main:574]	:	INFO	:	Epoch 1937 | loss: 0.0311933 | val_loss: 0.0311917 | Time: 4003.24 ms
[2022-03-03 12:34:25	                main:574]	:	INFO	:	Epoch 1938 | loss: 0.0311926 | val_loss: 0.0311929 | Time: 4103.52 ms
[2022-03-03 12:34:29	                main:574]	:	INFO	:	Epoch 1939 | loss: 0.0311933 | val_loss: 0.0311941 | Time: 4168.73 ms
[2022-03-03 12:34:33	                main:574]	:	INFO	:	Epoch 1940 | loss: 0.0311938 | val_loss: 0.0311965 | Time: 4044.33 ms
[2022-03-03 12:34:37	                main:574]	:	INFO	:	Epoch 1941 | loss: 0.0311937 | val_loss: 0.0311922 | Time: 4000.41 ms
[2022-03-03 12:34:41	                main:574]	:	INFO	:	Epoch 1942 | loss: 0.031193 | val_loss: 0.0311906 | Time: 4102.31 ms
[2022-03-03 12:34:45	                main:574]	:	INFO	:	Epoch 1943 | loss: 0.0311933 | val_loss: 0.031192 | Time: 4107.8 ms
[2022-03-03 12:34:49	                main:574]	:	INFO	:	Epoch 1944 | loss: 0.0311969 | val_loss: 0.0312012 | Time: 4108.15 ms
[2022-03-03 12:34:53	                main:574]	:	INFO	:	Epoch 1945 | loss: 0.0312028 | val_loss: 0.0312032 | Time: 4010.78 ms
[2022-03-03 12:34:57	                main:574]	:	INFO	:	Epoch 1946 | loss: 0.0312031 | val_loss: 0.0312026 | Time: 4014.28 ms
[2022-03-03 12:35:01	                main:574]	:	INFO	:	Epoch 1947 | loss: 0.0312013 | val_loss: 0.0312002 | Time: 3965.68 ms
[2022-03-03 12:35:05	                main:574]	:	INFO	:	Epoch 1948 | loss: 0.0311998 | val_loss: 0.0311993 | Time: 4071.09 ms
[2022-03-03 12:35:09	                main:574]	:	INFO	:	Epoch 1949 | loss: 0.0311986 | val_loss: 0.0311986 | Time: 4021.53 ms
[2022-03-03 12:35:13	                main:574]	:	INFO	:	Epoch 1950 | loss: 0.0311981 | val_loss: 0.0311997 | Time: 4075.76 ms
[2022-03-03 12:35:18	                main:574]	:	INFO	:	Epoch 1951 | loss: 0.0311988 | val_loss: 0.0312007 | Time: 4205.29 ms
[2022-03-03 12:35:22	                main:574]	:	INFO	:	Epoch 1952 | loss: 0.0311988 | val_loss: 0.031198 | Time: 4025.29 ms
[2022-03-03 12:35:26	                main:574]	:	INFO	:	Epoch 1953 | loss: 0.0311985 | val_loss: 0.0311988 | Time: 4031 ms
[2022-03-03 12:35:30	                main:574]	:	INFO	:	Epoch 1954 | loss: 0.0311978 | val_loss: 0.0311972 | Time: 4063.34 ms
[2022-03-03 12:35:34	                main:574]	:	INFO	:	Epoch 1955 | loss: 0.0312005 | val_loss: 0.0311993 | Time: 4116.7 ms
[2022-03-03 12:35:38	                main:574]	:	INFO	:	Epoch 1956 | loss: 0.0311973 | val_loss: 0.031197 | Time: 4197.62 ms
[2022-03-03 12:35:42	                main:574]	:	INFO	:	Epoch 1957 | loss: 0.0311975 | val_loss: 0.0312 | Time: 4267.56 ms
[2022-03-03 12:35:47	                main:574]	:	INFO	:	Epoch 1958 | loss: 0.0311982 | val_loss: 0.0311972 | Time: 4109.63 ms
[2022-03-03 12:35:51	                main:574]	:	INFO	:	Epoch 1959 | loss: 0.031198 | val_loss: 0.0312003 | Time: 4056.7 ms
[2022-03-03 12:35:55	                main:574]	:	INFO	:	Epoch 1960 | loss: 0.0311983 | val_loss: 0.0311965 | Time: 4150.3 ms
[2022-03-03 12:35:59	                main:574]	:	INFO	:	Epoch 1961 | loss: 0.0311976 | val_loss: 0.0311983 | Time: 4008.03 ms
[2022-03-03 12:36:03	                main:574]	:	INFO	:	Epoch 1962 | loss: 0.0311976 | val_loss: 0.0311982 | Time: 4057.83 ms
[2022-03-03 12:36:07	                main:574]	:	INFO	:	Epoch 1963 | loss: 0.0311997 | val_loss: 0.0312023 | Time: 4034.66 ms
[2022-03-03 12:36:11	                main:574]	:	INFO	:	Epoch 1964 | loss: 0.0311979 | val_loss: 0.031197 | Time: 4114.3 ms
[2022-03-03 12:36:15	                main:574]	:	INFO	:	Epoch 1965 | loss: 0.0311979 | val_loss: 0.0312055 | Time: 4079.47 ms
[2022-03-03 12:36:19	                main:574]	:	INFO	:	Epoch 1966 | loss: 0.0311992 | val_loss: 0.0311965 | Time: 4201.13 ms
[2022-03-03 12:36:23	                main:574]	:	INFO	:	Epoch 1967 | loss: 0.0311968 | val_loss: 0.0311967 | Time: 4073.9 ms
[2022-03-03 12:36:27	                main:574]	:	INFO	:	Epoch 1968 | loss: 0.0311966 | val_loss: 0.0311962 | Time: 4113.73 ms
[2022-03-03 12:36:32	                main:574]	:	INFO	:	Epoch 1969 | loss: 0.0311968 | val_loss: 0.0311984 | Time: 4099.4 ms
[2022-03-03 12:36:36	                main:574]	:	INFO	:	Epoch 1970 | loss: 0.0311971 | val_loss: 0.0312014 | Time: 4141.01 ms
[2022-03-03 12:36:40	                main:574]	:	INFO	:	Epoch 1971 | loss: 0.0311996 | val_loss: 0.031198 | Time: 4150.26 ms
[2022-03-03 12:36:44	                main:574]	:	INFO	:	Epoch 1972 | loss: 0.0311989 | val_loss: 0.0311975 | Time: 4029.81 ms
[2022-03-03 12:36:48	                main:574]	:	INFO	:	Epoch 1973 | loss: 0.0311992 | val_loss: 0.0312042 | Time: 4003.2 ms
[2022-03-03 12:36:52	                main:574]	:	INFO	:	Epoch 1974 | loss: 0.0312004 | val_loss: 0.0312004 | Time: 4037.9 ms
[2022-03-03 12:36:56	                main:574]	:	INFO	:	Epoch 1975 | loss: 0.0311993 | val_loss: 0.0312026 | Time: 4052.49 ms
[2022-03-03 12:37:00	                main:574]	:	INFO	:	Epoch 1976 | loss: 0.0311989 | val_loss: 0.0311997 | Time: 4131.59 ms
[2022-03-03 12:37:04	                main:574]	:	INFO	:	Epoch 1977 | loss: 0.0311991 | val_loss: 0.0312062 | Time: 4092.9 ms
[2022-03-03 12:37:08	                main:574]	:	INFO	:	Epoch 1978 | loss: 0.0312006 | val_loss: 0.0312 | Time: 4079.71 ms
[2022-03-03 12:37:12	                main:574]	:	INFO	:	Epoch 1979 | loss: 0.0312018 | val_loss: 0.0312002 | Time: 3995.2 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GT 1030)
[2022-03-03 13:42:45	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-03 13:42:45	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-03 13:42:45	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-03 13:42:45	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-03 13:42:45	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-03 13:42:45	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-03 13:42:45	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-03 13:42:45	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-03 13:42:45	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-03 13:42:45	                main:474]	:	INFO	:	Configuration: 
[2022-03-03 13:42:45	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-03 13:42:45	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-03 13:42:45	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-03 13:42:45	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-03 13:42:45	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-03 13:42:45	                main:480]	:	INFO	:	    Patience: 10
[2022-03-03 13:42:45	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-03 13:42:45	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-03 13:42:45	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-03 13:42:45	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-03 13:42:45	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-03 13:42:45	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GT 1030)
[2022-03-03 14:32:40	                main:435]	:	INFO	:	Set logging level to 1
[2022-03-03 14:32:40	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-03-03 14:32:40	                main:444]	:	INFO	:	Resolving all filenames
[2022-03-03 14:32:40	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-03-03 14:32:40	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-03-03 14:32:40	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-03-03 14:32:40	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-03-03 14:32:40	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-03-03 14:32:40	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-03-03 14:32:40	                main:474]	:	INFO	:	Configuration: 
[2022-03-03 14:32:40	                main:475]	:	INFO	:	    Model type: GRU
[2022-03-03 14:32:40	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-03-03 14:32:40	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-03-03 14:32:40	                main:478]	:	INFO	:	    Batch Size: 128
[2022-03-03 14:32:40	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-03-03 14:32:40	                main:480]	:	INFO	:	    Patience: 10
[2022-03-03 14:32:40	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-03-03 14:32:40	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-03-03 14:32:40	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-03-03 14:32:40	                main:484]	:	INFO	:	    # Threads: 1
[2022-03-03 14:32:40	                main:486]	:	INFO	:	Preparing Dataset
[2022-03-03 14:32:40	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-03-03 14:32:40	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-03-03 14:32:42	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-03-03 14:32:42	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-03-03 14:32:42	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-03-03 14:32:42	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-03-03 14:32:42	                main:494]	:	INFO	:	Creating Model
[2022-03-03 14:32:42	                main:507]	:	INFO	:	Preparing config file
[2022-03-03 14:32:42	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-03-03 14:32:42	                main:512]	:	INFO	:	Loading config
[2022-03-03 14:32:42	                main:514]	:	INFO	:	Loading state
[2022-03-03 14:32:42	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-03-03 14:32:42	                main:562]	:	INFO	:	Starting Training
[2022-03-03 14:32:47	                main:574]	:	INFO	:	Epoch 1967 | loss: 0.0312241 | val_loss: 0.0311994 | Time: 4246.89 ms
[2022-03-03 14:32:51	                main:574]	:	INFO	:	Epoch 1968 | loss: 0.0311968 | val_loss: 0.0311946 | Time: 4012.62 ms
[2022-03-03 14:32:55	                main:574]	:	INFO	:	Epoch 1969 | loss: 0.0311939 | val_loss: 0.0311939 | Time: 4112.07 ms
[2022-03-03 14:32:59	                main:574]	:	INFO	:	Epoch 1970 | loss: 0.0311929 | val_loss: 0.0311945 | Time: 4025.42 ms
[2022-03-03 14:33:03	                main:574]	:	INFO	:	Epoch 1971 | loss: 0.0311911 | val_loss: 0.0311933 | Time: 4030.88 ms
[2022-03-03 14:33:07	                main:574]	:	INFO	:	Epoch 1972 | loss: 0.0311919 | val_loss: 0.0311953 | Time: 4049.54 ms
[2022-03-03 14:33:11	                main:574]	:	INFO	:	Epoch 1973 | loss: 0.0311935 | val_loss: 0.0311941 | Time: 4107.92 ms
[2022-03-03 14:33:15	                main:574]	:	INFO	:	Epoch 1974 | loss: 0.0311948 | val_loss: 0.0311958 | Time: 4041.47 ms
[2022-03-03 14:33:19	                main:574]	:	INFO	:	Epoch 1975 | loss: 0.0311955 | val_loss: 0.031195 | Time: 3980.54 ms
[2022-03-03 14:33:23	                main:574]	:	INFO	:	Epoch 1976 | loss: 0.031196 | val_loss: 0.0311958 | Time: 4087.02 ms
[2022-03-03 14:33:27	                main:574]	:	INFO	:	Epoch 1977 | loss: 0.0311951 | val_loss: 0.0311929 | Time: 4058.13 ms
[2022-03-03 14:33:31	                main:574]	:	INFO	:	Epoch 1978 | loss: 0.0311961 | val_loss: 0.0311929 | Time: 4042.35 ms
[2022-03-03 14:33:35	                main:574]	:	INFO	:	Epoch 1979 | loss: 0.0311964 | val_loss: 0.0311934 | Time: 4100.88 ms
[2022-03-03 14:33:39	                main:574]	:	INFO	:	Epoch 1980 | loss: 0.0311972 | val_loss: 0.0311949 | Time: 4139.54 ms
[2022-03-03 14:33:44	                main:574]	:	INFO	:	Epoch 1981 | loss: 0.0311967 | val_loss: 0.0311934 | Time: 4144.93 ms
[2022-03-03 14:33:48	                main:574]	:	INFO	:	Epoch 1982 | loss: 0.0311961 | val_loss: 0.0311947 | Time: 4114.2 ms
[2022-03-03 14:33:52	                main:574]	:	INFO	:	Epoch 1983 | loss: 0.0311965 | val_loss: 0.0311958 | Time: 4121.25 ms
[2022-03-03 14:33:56	                main:574]	:	INFO	:	Epoch 1984 | loss: 0.0311949 | val_loss: 0.0311942 | Time: 4087.85 ms
[2022-03-03 14:34:00	                main:574]	:	INFO	:	Epoch 1985 | loss: 0.031195 | val_loss: 0.0311947 | Time: 4129.14 ms
[2022-03-03 14:34:04	                main:574]	:	INFO	:	Epoch 1986 | loss: 0.0311961 | val_loss: 0.0311934 | Time: 4146.51 ms
[2022-03-03 14:34:08	                main:574]	:	INFO	:	Epoch 1987 | loss: 0.0311951 | val_loss: 0.0311937 | Time: 4031.65 ms
[2022-03-03 14:34:12	                main:574]	:	INFO	:	Epoch 1988 | loss: 0.0311961 | val_loss: 0.0311953 | Time: 4067.68 ms
[2022-03-03 14:34:16	                main:574]	:	INFO	:	Epoch 1989 | loss: 0.0311943 | val_loss: 0.0311924 | Time: 4066.86 ms
[2022-03-03 14:34:21	                main:574]	:	INFO	:	Epoch 1990 | loss: 0.0311937 | val_loss: 0.0311917 | Time: 4115.32 ms
[2022-03-03 14:34:25	                main:574]	:	INFO	:	Epoch 1991 | loss: 0.0311939 | val_loss: 0.0311906 | Time: 4113.55 ms
[2022-03-03 14:34:29	                main:574]	:	INFO	:	Epoch 1992 | loss: 0.0311935 | val_loss: 0.0311945 | Time: 4058 ms
[2022-03-03 14:34:33	                main:574]	:	INFO	:	Epoch 1993 | loss: 0.0311941 | val_loss: 0.0311918 | Time: 4264.63 ms
[2022-03-03 14:34:37	                main:574]	:	INFO	:	Epoch 1994 | loss: 0.0311944 | val_loss: 0.0311934 | Time: 4023.77 ms
[2022-03-03 14:34:41	                main:574]	:	INFO	:	Epoch 1995 | loss: 0.0311964 | val_loss: 0.0311944 | Time: 4055.69 ms
[2022-03-03 14:34:45	                main:574]	:	INFO	:	Epoch 1996 | loss: 0.0311951 | val_loss: 0.0311946 | Time: 4157.47 ms
[2022-03-03 14:34:49	                main:574]	:	INFO	:	Epoch 1997 | loss: 0.0311957 | val_loss: 0.0311977 | Time: 4107.1 ms
[2022-03-03 14:34:53	                main:574]	:	INFO	:	Epoch 1998 | loss: 0.031196 | val_loss: 0.0311946 | Time: 4032.49 ms
[2022-03-03 14:34:57	                main:574]	:	INFO	:	Epoch 1999 | loss: 0.0311959 | val_loss: 0.0311945 | Time: 4016.36 ms
[2022-03-03 14:35:01	                main:574]	:	INFO	:	Epoch 2000 | loss: 0.0312006 | val_loss: 0.0311999 | Time: 3995.45 ms
[2022-03-03 14:35:05	                main:574]	:	INFO	:	Epoch 2001 | loss: 0.0311974 | val_loss: 0.0311944 | Time: 4034.51 ms
[2022-03-03 14:35:09	                main:574]	:	INFO	:	Epoch 2002 | loss: 0.031195 | val_loss: 0.0311941 | Time: 4023.79 ms
[2022-03-03 14:35:14	                main:574]	:	INFO	:	Epoch 2003 | loss: 0.0311949 | val_loss: 0.0311949 | Time: 4138.96 ms
[2022-03-03 14:35:18	                main:574]	:	INFO	:	Epoch 2004 | loss: 0.0311955 | val_loss: 0.0311949 | Time: 4168.57 ms
[2022-03-03 14:35:22	                main:574]	:	INFO	:	Epoch 2005 | loss: 0.0311941 | val_loss: 0.0311961 | Time: 4094.99 ms
[2022-03-03 14:35:26	                main:574]	:	INFO	:	Epoch 2006 | loss: 0.0311942 | val_loss: 0.0311941 | Time: 4182.01 ms
[2022-03-03 14:35:30	                main:574]	:	INFO	:	Epoch 2007 | loss: 0.0311945 | val_loss: 0.0311963 | Time: 3980.34 ms
[2022-03-03 14:35:34	                main:574]	:	INFO	:	Epoch 2008 | loss: 0.0311946 | val_loss: 0.031193 | Time: 4178.71 ms
[2022-03-03 14:35:38	                main:574]	:	INFO	:	Epoch 2009 | loss: 0.0311948 | val_loss: 0.0311933 | Time: 4035.53 ms
[2022-03-03 14:35:42	                main:574]	:	INFO	:	Epoch 2010 | loss: 0.0311932 | val_loss: 0.0311943 | Time: 4167.25 ms
[2022-03-03 14:35:46	                main:574]	:	INFO	:	Epoch 2011 | loss: 0.0311945 | val_loss: 0.0311933 | Time: 4024.19 ms
[2022-03-03 14:35:51	                main:574]	:	INFO	:	Epoch 2012 | loss: 0.0311937 | val_loss: 0.0311929 | Time: 4104.92 ms
[2022-03-03 14:35:55	                main:574]	:	INFO	:	Epoch 2013 | loss: 0.0311936 | val_loss: 0.0311934 | Time: 4060.75 ms
[2022-03-03 14:35:59	                main:574]	:	INFO	:	Epoch 2014 | loss: 0.0311928 | val_loss: 0.031193 | Time: 4157.29 ms
[2022-03-03 14:36:03	                main:574]	:	INFO	:	Epoch 2015 | loss: 0.0311936 | val_loss: 0.0311924 | Time: 4068.06 ms
[2022-03-03 14:36:07	                main:574]	:	INFO	:	Epoch 2016 | loss: 0.0311935 | val_loss: 0.0311947 | Time: 4051.19 ms
[2022-03-03 14:36:11	                main:574]	:	INFO	:	Epoch 2017 | loss: 0.0311936 | val_loss: 0.0311939 | Time: 4069.12 ms
[2022-03-03 14:36:15	                main:574]	:	INFO	:	Epoch 2018 | loss: 0.0311927 | val_loss: 0.0311936 | Time: 4160.16 ms
[2022-03-03 14:36:19	                main:574]	:	INFO	:	Epoch 2019 | loss: 0.0311929 | val_loss: 0.0311928 | Time: 4039.2 ms
[2022-03-03 14:36:23	                main:574]	:	INFO	:	Epoch 2020 | loss: 0.0311924 | val_loss: 0.0311934 | Time: 4096.37 ms
[2022-03-03 14:36:27	                main:574]	:	INFO	:	Epoch 2021 | loss: 0.0311924 | val_loss: 0.0311922 | Time: 4050.07 ms
[2022-03-03 14:36:31	                main:574]	:	INFO	:	Epoch 2022 | loss: 0.0311923 | val_loss: 0.0311927 | Time: 4089.02 ms
[2022-03-03 14:36:36	                main:574]	:	INFO	:	Epoch 2023 | loss: 0.0311919 | val_loss: 0.0311943 | Time: 4131.36 ms
[2022-03-03 14:36:40	                main:574]	:	INFO	:	Epoch 2024 | loss: 0.0311931 | val_loss: 0.0311933 | Time: 4150.16 ms
[2022-03-03 14:36:44	                main:574]	:	INFO	:	Epoch 2025 | loss: 0.0311921 | val_loss: 0.0311921 | Time: 4076.28 ms
[2022-03-03 14:36:48	                main:574]	:	INFO	:	Epoch 2026 | loss: 0.0311964 | val_loss: 0.031201 | Time: 4141.52 ms
[2022-03-03 14:36:52	                main:574]	:	INFO	:	Epoch 2027 | loss: 0.031194 | val_loss: 0.0311924 | Time: 4041.06 ms
[2022-03-03 14:36:56	                main:574]	:	INFO	:	Epoch 2028 | loss: 0.0311924 | val_loss: 0.0311918 | Time: 4056.21 ms
[2022-03-03 14:37:00	                main:574]	:	INFO	:	Epoch 2029 | loss: 0.0311914 | val_loss: 0.0311918 | Time: 4061.28 ms
[2022-03-03 14:37:04	                main:574]	:	INFO	:	Epoch 2030 | loss: 0.031192 | val_loss: 0.0311919 | Time: 4117.99 ms
[2022-03-03 14:37:08	                main:574]	:	INFO	:	Epoch 2031 | loss: 0.0311945 | val_loss: 0.0311968 | Time: 4090.94 ms
[2022-03-03 14:37:12	                main:574]	:	INFO	:	Epoch 2032 | loss: 0.0311977 | val_loss: 0.0312005 | Time: 4105.24 ms
[2022-03-03 14:37:16	                main:574]	:	INFO	:	Epoch 2033 | loss: 0.0311987 | val_loss: 0.0312002 | Time: 4092.54 ms
[2022-03-03 14:37:21	                main:574]	:	INFO	:	Epoch 2034 | loss: 0.0312008 | val_loss: 0.0312067 | Time: 4069.43 ms
[2022-03-03 14:37:25	                main:574]	:	INFO	:	Epoch 2035 | loss: 0.0311995 | val_loss: 0.0312015 | Time: 4072.11 ms
[2022-03-03 14:37:29	                main:574]	:	INFO	:	Epoch 2036 | loss: 0.0311995 | val_loss: 0.0312008 | Time: 4033.38 ms
[2022-03-03 14:37:33	                main:574]	:	INFO	:	Epoch 2037 | loss: 0.0311979 | val_loss: 0.0311987 | Time: 4136.82 ms
[2022-03-03 14:37:37	                main:574]	:	INFO	:	Epoch 2038 | loss: 0.0311982 | val_loss: 0.0312 | Time: 4020.28 ms
[2022-03-03 14:37:41	                main:574]	:	INFO	:	Epoch 2039 | loss: 0.0311991 | val_loss: 0.0311994 | Time: 4158.64 ms
[2022-03-03 14:37:45	                main:574]	:	INFO	:	Epoch 2040 | loss: 0.0311997 | val_loss: 0.0312018 | Time: 4062.44 ms
[2022-03-03 14:37:49	                main:574]	:	INFO	:	Epoch 2041 | loss: 0.0312015 | val_loss: 0.0312015 | Time: 4008.43 ms
[2022-03-03 14:37:53	                main:574]	:	INFO	:	Epoch 2042 | loss: 0.0312004 | val_loss: 0.0312013 | Time: 4028.71 ms
[2022-03-03 14:37:57	                main:574]	:	INFO	:	Epoch 2043 | loss: 0.0312009 | val_loss: 0.0311999 | Time: 4091.52 ms
[2022-03-03 14:38:01	                main:574]	:	INFO	:	Epoch 2044 | loss: 0.0311994 | val_loss: 0.0311993 | Time: 4106.53 ms
[2022-03-03 14:38:05	                main:574]	:	INFO	:	Epoch 2045 | loss: 0.0311993 | val_loss: 0.0312038 | Time: 4101.62 ms
[2022-03-03 14:38:09	                main:574]	:	INFO	:	Epoch 2046 | loss: 0.0311992 | val_loss: 0.0311994 | Time: 4071.39 ms
[2022-03-03 14:38:14	                main:574]	:	INFO	:	Epoch 2047 | loss: 0.0311987 | val_loss: 0.0311994 | Time: 4086.06 ms
[2022-03-03 14:38:18	                main:574]	:	INFO	:	Epoch 2048 | loss: 0.0312001 | val_loss: 0.0312012 | Time: 4085.21 ms
[2022-03-03 14:38:18	                main:597]	:	INFO	:	Saving trained model to model-final.pt, val_loss 0.0312012
[2022-03-03 14:38:18	                main:603]	:	INFO	:	Saving end state to config to file
[2022-03-03 14:38:18	                main:608]	:	INFO	:	Success, exiting..
14:38:18 (14760): called boinc_finish(0)

</stderr_txt>
]]>
©2022 MLC@Home Team
A project of the Cognition, Robotics, and Learning (CORAL) Lab at the University of Maryland, Baltimore County (UMBC)