Task 14597631

Name ParityModified-1647043297-13359-3-0_0
Workunit 11618583
Created 11 Apr 2022, 21:02:40 UTC
Sent 11 Apr 2022, 21:02:52 UTC
Report deadline 19 Apr 2022, 21:02:52 UTC
Received 18 Apr 2022, 10:34:35 UTC
Server state Over
Outcome Success
Client state Done
Exit status 0 (0x00000000)
Computer ID 13966
Run time 2 hours 14 min 1 sec
CPU time 2 hours 10 min 42 sec
Validate state Valid
Credit 4,160.00
Device peak FLOPS 10,956.98 GFLOPS
Application version Machine Learning Dataset Generator (GPU) v9.75 (cuda10200)
windows_x86_64
Peak working set size 2.08 GB
Peak swap size 4.37 GB
Peak disk usage 1.54 GB

Stderr output

<core_client_version>7.16.20</core_client_version>
<![CDATA[
<stderr_txt>
9 | Time: 5026.69 ms
[2022-04-18 19:26:17	                main:574]	:	INFO	:	Epoch 1731 | loss: 0.0311179 | val_loss: 0.0311745 | Time: 4740.8 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 2080)
[2022-04-18 19:33:59	                main:435]	:	INFO	:	Set logging level to 1
[2022-04-18 19:33:59	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-04-18 19:33:59	                main:444]	:	INFO	:	Resolving all filenames
[2022-04-18 19:33:59	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-04-18 19:33:59	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-04-18 19:33:59	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-04-18 19:33:59	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-04-18 19:33:59	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-04-18 19:33:59	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-04-18 19:33:59	                main:474]	:	INFO	:	Configuration: 
[2022-04-18 19:33:59	                main:475]	:	INFO	:	    Model type: GRU
[2022-04-18 19:33:59	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-04-18 19:33:59	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-04-18 19:33:59	                main:478]	:	INFO	:	    Batch Size: 128
[2022-04-18 19:33:59	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-04-18 19:33:59	                main:480]	:	INFO	:	    Patience: 10
[2022-04-18 19:33:59	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-04-18 19:33:59	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-04-18 19:33:59	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-04-18 19:33:59	                main:484]	:	INFO	:	    # Threads: 1
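The configuration block above describes a small sequence model: four GRU layers of hidden width 12 followed by four fully connected "backend" layers, trained with batch size 128 and learning rate 0.01. The application itself is a C++/libTorch binary, so the following is only a rough PyTorch sketch of a network with the logged shape; the input/output sizes and the choice of Adam as optimizer are assumptions, not taken from the log.

    import torch
    import torch.nn as nn

    class GRUNet(nn.Module):
        # Illustrative only: 4 GRU layers of hidden width 12 feeding 4 "backend"
        # (fully connected) layers, matching the logged hyperparameters. Input and
        # output sizes are placeholders; the real ones come from dataset.hdf5.
        def __init__(self, n_features=1, hidden=12, rnn_layers=4, backend_layers=4, n_out=1):
            super().__init__()
            self.gru = nn.GRU(n_features, hidden, num_layers=rnn_layers, batch_first=True)
            fc = []
            for _ in range(backend_layers - 1):
                fc += [nn.Linear(hidden, hidden), nn.ReLU()]
            fc.append(nn.Linear(hidden, n_out))
            self.backend = nn.Sequential(*fc)

        def forward(self, x):                    # x: (batch, seq_len, n_features)
            out, _ = self.gru(x)
            return self.backend(out[:, -1, :])   # predict from the last time step

    model = GRUNet()                             # the real run places this on the RTX 2080
    optimizer = torch.optim.Adam(model.parameters(), lr=0.01)  # Learning Rate: 0.01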
[2022-04-18 19:33:59	                main:486]	:	INFO	:	Preparing Dataset
[2022-04-18 19:33:59	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-18 19:33:59	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-18 19:34:01	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-04-18 19:34:01	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-18 19:34:01	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-18 19:34:01	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
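The /Xt, /Yt (training, 2048 examples) and /Xv, /Yv (validation, 512 examples) datasets are read whole from dataset.hdf5 into memory, as load_hdf5_ds_into_tensor reports above. A rough Python equivalent of that loading step, assuming h5py and the dataset names shown in the log:

    import h5py
    import torch

    def load_hdf5_ds_into_tensor(path, name):
        # Read one HDF5 dataset (e.g. /Xt) fully into an in-memory torch tensor.
        with h5py.File(path, "r") as f:
            return torch.from_numpy(f[name][...])   # [...] pulls the whole dataset

    Xt = load_hdf5_ds_into_tensor("dataset.hdf5", "/Xt")   # training inputs
    Yt = load_hdf5_ds_into_tensor("dataset.hdf5", "/Yt")   # training targets
    Xv = load_hdf5_ds_into_tensor("dataset.hdf5", "/Xv")   # validation inputs
    Yv = load_hdf5_ds_into_tensor("dataset.hdf5", "/Yv")   # validation targets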
[2022-04-18 19:34:01	                main:494]	:	INFO	:	Creating Model
[2022-04-18 19:34:01	                main:507]	:	INFO	:	Preparing config file
[2022-04-18 19:34:01	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-04-18 19:34:01	                main:512]	:	INFO	:	Loading config
[2022-04-18 19:34:01	                main:514]	:	INFO	:	Loading state
[2022-04-18 19:34:02	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-04-18 19:34:02	                main:562]	:	INFO	:	Starting Training
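Because snapshot.pt already exists (exists = 1 above), the run resumes from that checkpoint rather than starting fresh from model-input.pt, which is why the first epoch reported below is 1723 rather than 0. A minimal resume step in the same PyTorch terms as the sketches above; the snapshot's internal layout is not documented in the log, so the key names here are assumptions:

    import torch

    # Resume from the snapshot written by an earlier pass over this task.
    ckpt = torch.load("snapshot.pt", map_location="cpu")
    model.load_state_dict(ckpt["model"])      # assumed key names
    optimizer.load_state_dict(ckpt["optim"])
    start_epoch = ckpt["epoch"] + 1           # 1723 for the segment below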
[2022-04-18 19:34:07	                main:574]	:	INFO	:	Epoch 1723 | loss: 0.0311386 | val_loss: 0.0311775 | Time: 5059.15 ms
[2022-04-18 19:34:12	                main:574]	:	INFO	:	Epoch 1724 | loss: 0.0311156 | val_loss: 0.0311812 | Time: 4588.46 ms
[2022-04-18 19:34:16	                main:574]	:	INFO	:	Epoch 1725 | loss: 0.0311106 | val_loss: 0.0311883 | Time: 4615.75 ms
[2022-04-18 19:34:21	                main:574]	:	INFO	:	Epoch 1726 | loss: 0.03111 | val_loss: 0.0311823 | Time: 4606.03 ms
[2022-04-18 19:34:26	                main:574]	:	INFO	:	Epoch 1727 | loss: 0.0311102 | val_loss: 0.0311849 | Time: 4672.86 ms
[2022-04-18 19:34:30	                main:574]	:	INFO	:	Epoch 1728 | loss: 0.0311107 | val_loss: 0.0311896 | Time: 4566.49 ms
[2022-04-18 19:34:35	                main:574]	:	INFO	:	Epoch 1729 | loss: 0.0311091 | val_loss: 0.0311817 | Time: 4570.39 ms
[2022-04-18 19:34:39	                main:574]	:	INFO	:	Epoch 1730 | loss: 0.0311068 | val_loss: 0.0311773 | Time: 4648.42 ms
[2022-04-18 19:34:44	                main:574]	:	INFO	:	Epoch 1731 | loss: 0.0311055 | val_loss: 0.0311835 | Time: 4576.5 ms
[2022-04-18 19:34:49	                main:574]	:	INFO	:	Epoch 1732 | loss: 0.0311101 | val_loss: 0.0311798 | Time: 4566.07 ms
[2022-04-18 19:34:53	                main:574]	:	INFO	:	Epoch 1733 | loss: 0.0311119 | val_loss: 0.0311804 | Time: 4634.38 ms
[2022-04-18 19:34:58	                main:574]	:	INFO	:	Epoch 1734 | loss: 0.0311103 | val_loss: 0.0311756 | Time: 4625.51 ms
[2022-04-18 19:35:03	                main:574]	:	INFO	:	Epoch 1735 | loss: 0.0311127 | val_loss: 0.0311831 | Time: 5591.32 ms
[2022-04-18 19:35:10	                main:574]	:	INFO	:	Epoch 1736 | loss: 0.0311157 | val_loss: 0.0311785 | Time: 5930.82 ms
[2022-04-18 19:35:15	                main:574]	:	INFO	:	Epoch 1737 | loss: 0.0311099 | val_loss: 0.0311834 | Time: 5243.97 ms
[2022-04-18 19:35:20	                main:574]	:	INFO	:	Epoch 1738 | loss: 0.031108 | val_loss: 0.031177 | Time: 5027.29 ms
[2022-04-18 19:35:25	                main:574]	:	INFO	:	Epoch 1739 | loss: 0.0311065 | val_loss: 0.0311815 | Time: 4885.64 ms
[2022-04-18 19:35:30	                main:574]	:	INFO	:	Epoch 1740 | loss: 0.0311077 | val_loss: 0.0311819 | Time: 4816.56 ms
[2022-04-18 19:35:34	                main:574]	:	INFO	:	Epoch 1741 | loss: 0.0311078 | val_loss: 0.0311873 | Time: 4711.39 ms
[2022-04-18 19:35:39	                main:574]	:	INFO	:	Epoch 1742 | loss: 0.031107 | val_loss: 0.0311784 | Time: 4559.41 ms
[2022-04-18 19:35:43	                main:574]	:	INFO	:	Epoch 1743 | loss: 0.0311104 | val_loss: 0.0311828 | Time: 4584.02 ms
[2022-04-18 19:35:48	                main:574]	:	INFO	:	Epoch 1744 | loss: 0.0311124 | val_loss: 0.0311788 | Time: 4581.65 ms
[2022-04-18 19:35:53	                main:574]	:	INFO	:	Epoch 1745 | loss: 0.0311129 | val_loss: 0.0311797 | Time: 4677.62 ms
[2022-04-18 19:35:58	                main:574]	:	INFO	:	Epoch 1746 | loss: 0.031111 | val_loss: 0.031182 | Time: 5089.58 ms
[2022-04-18 19:36:03	                main:574]	:	INFO	:	Epoch 1747 | loss: 0.0311095 | val_loss: 0.0311773 | Time: 4908.04 ms
[2022-04-18 19:36:07	                main:574]	:	INFO	:	Epoch 1748 | loss: 0.0311098 | val_loss: 0.031181 | Time: 4713.09 ms
[2022-04-18 19:36:12	                main:574]	:	INFO	:	Epoch 1749 | loss: 0.0311068 | val_loss: 0.0311854 | Time: 4602.93 ms
[2022-04-18 19:36:17	                main:574]	:	INFO	:	Epoch 1750 | loss: 0.0311097 | val_loss: 0.0311843 | Time: 4647.21 ms
[2022-04-18 19:36:21	                main:574]	:	INFO	:	Epoch 1751 | loss: 0.0311089 | val_loss: 0.0311856 | Time: 4613.7 ms
[2022-04-18 19:36:26	                main:574]	:	INFO	:	Epoch 1752 | loss: 0.031112 | val_loss: 0.0311832 | Time: 4608.5 ms
[2022-04-18 19:36:31	                main:574]	:	INFO	:	Epoch 1753 | loss: 0.0311118 | val_loss: 0.0311774 | Time: 4672.77 ms
[2022-04-18 19:36:35	                main:574]	:	INFO	:	Epoch 1754 | loss: 0.0311122 | val_loss: 0.0311791 | Time: 4544.33 ms
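From here on the same startup sequence repeats several times: the BOINC client suspended and restarted the task, and each restart reloads snapshot.pt and resumes from the last checkpointed epoch, which is why epoch numbers overlap between segments (this segment ends at epoch 1754 and the next resumes at 1751). Within a segment, each epoch trains over the 2048-example set in batches of 128, evaluates on the 512-example validation set, and logs the losses and wall time. The exact loss function and stopping rule are internal to the application; the validation loss stays near 0.0311, far above the 0.0001 threshold, so the run continues toward Max Epochs, and Patience may well feed a plateau scheduler rather than early stopping. A hedged sketch of such a loop, continuing the Python sketches above (MSE loss, the scheduler, and the checkpoint/output formats are assumptions):

    import time
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader, TensorDataset

    criterion = nn.MSELoss()                              # assumed; the loss type is not logged
    train_loader = DataLoader(TensorDataset(Xt.float(), Yt.float()),
                              batch_size=128, shuffle=True)           # Batch Size: 128
    # One plausible reading of "Patience: 10": reduce the learning rate on a plateau.
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, patience=10)

    for epoch in range(start_epoch, 2048):                # Max Epochs: 2048
        t0 = time.time()
        model.train()
        running = 0.0
        for xb, yb in train_loader:
            optimizer.zero_grad()
            loss = criterion(model(xb), yb)
            loss.backward()
            optimizer.step()
            running += loss.item() * xb.size(0)
        train_loss = running / len(train_loader.dataset)
        model.eval()
        with torch.no_grad():
            val_loss = criterion(model(Xv.float()), Yv.float()).item()
        scheduler.step(val_loss)
        print(f"Epoch {epoch} | loss: {train_loss:g} | val_loss: {val_loss:g} | "
              f"Time: {(time.time() - t0) * 1000:.2f} ms")
        # Periodic snapshot so a client restart can resume mid-run.
        torch.save({"epoch": epoch, "model": model.state_dict(),
                    "optim": optimizer.state_dict()}, "snapshot.pt")
        if val_loss < 1e-4:                               # Validation Loss Threshold: 0.0001
            break
    torch.save(model.state_dict(), "model-final.pt")      # presumed final output file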
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 2080)
[2022-04-18 19:46:35	                main:435]	:	INFO	:	Set logging level to 1
[2022-04-18 19:46:35	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-04-18 19:46:35	                main:444]	:	INFO	:	Resolving all filenames
[2022-04-18 19:46:35	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-04-18 19:46:35	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-04-18 19:46:35	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-04-18 19:46:35	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-04-18 19:46:35	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-04-18 19:46:35	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-04-18 19:46:35	                main:474]	:	INFO	:	Configuration: 
[2022-04-18 19:46:35	                main:475]	:	INFO	:	    Model type: GRU
[2022-04-18 19:46:35	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-04-18 19:46:35	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-04-18 19:46:35	                main:478]	:	INFO	:	    Batch Size: 128
[2022-04-18 19:46:35	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-04-18 19:46:35	                main:480]	:	INFO	:	    Patience: 10
[2022-04-18 19:46:35	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-04-18 19:46:35	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-04-18 19:46:35	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-04-18 19:46:35	                main:484]	:	INFO	:	    # Threads: 1
[2022-04-18 19:46:35	                main:486]	:	INFO	:	Preparing Dataset
[2022-04-18 19:46:35	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-18 19:46:35	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-18 19:46:37	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-04-18 19:46:37	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-18 19:46:37	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-18 19:46:37	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-04-18 19:46:37	                main:494]	:	INFO	:	Creating Model
[2022-04-18 19:46:37	                main:507]	:	INFO	:	Preparing config file
[2022-04-18 19:46:37	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-04-18 19:46:37	                main:512]	:	INFO	:	Loading config
[2022-04-18 19:46:37	                main:514]	:	INFO	:	Loading state
[2022-04-18 19:46:38	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-04-18 19:46:38	                main:562]	:	INFO	:	Starting Training
[2022-04-18 19:46:43	                main:574]	:	INFO	:	Epoch 1751 | loss: 0.0311366 | val_loss: 0.0311829 | Time: 5181.49 ms
[2022-04-18 19:46:48	                main:574]	:	INFO	:	Epoch 1752 | loss: 0.0311142 | val_loss: 0.0311868 | Time: 4992.87 ms
[2022-04-18 19:46:53	                main:574]	:	INFO	:	Epoch 1753 | loss: 0.0311127 | val_loss: 0.031179 | Time: 4833.86 ms
[2022-04-18 19:46:58	                main:574]	:	INFO	:	Epoch 1754 | loss: 0.0311083 | val_loss: 0.0311804 | Time: 4908.37 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 2080)
[2022-04-18 19:53:58	                main:435]	:	INFO	:	Set logging level to 1
[2022-04-18 19:53:58	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-04-18 19:53:58	                main:444]	:	INFO	:	Resolving all filenames
[2022-04-18 19:53:58	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-04-18 19:53:58	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-04-18 19:53:58	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-04-18 19:53:58	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-04-18 19:53:58	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-04-18 19:53:58	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-04-18 19:53:58	                main:474]	:	INFO	:	Configuration: 
[2022-04-18 19:53:58	                main:475]	:	INFO	:	    Model type: GRU
[2022-04-18 19:53:58	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-04-18 19:53:58	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-04-18 19:53:58	                main:478]	:	INFO	:	    Batch Size: 128
[2022-04-18 19:53:58	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-04-18 19:53:58	                main:480]	:	INFO	:	    Patience: 10
[2022-04-18 19:53:58	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-04-18 19:53:58	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-04-18 19:53:58	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-04-18 19:53:58	                main:484]	:	INFO	:	    # Threads: 1
[2022-04-18 19:53:58	                main:486]	:	INFO	:	Preparing Dataset
[2022-04-18 19:53:58	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-18 19:53:58	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-18 19:54:00	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-04-18 19:54:00	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-18 19:54:00	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-18 19:54:00	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-04-18 19:54:00	                main:494]	:	INFO	:	Creating Model
[2022-04-18 19:54:00	                main:507]	:	INFO	:	Preparing config file
[2022-04-18 19:54:00	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-04-18 19:54:00	                main:512]	:	INFO	:	Loading config
[2022-04-18 19:54:00	                main:514]	:	INFO	:	Loading state
[2022-04-18 19:54:01	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-04-18 19:54:01	                main:562]	:	INFO	:	Starting Training
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 2080)
[2022-04-18 19:57:03	                main:435]	:	INFO	:	Set logging level to 1
[2022-04-18 19:57:03	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-04-18 19:57:03	                main:444]	:	INFO	:	Resolving all filenames
[2022-04-18 19:57:03	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-04-18 19:57:03	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-04-18 19:57:03	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-04-18 19:57:03	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-04-18 19:57:03	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-04-18 19:57:03	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-04-18 19:57:03	                main:474]	:	INFO	:	Configuration: 
[2022-04-18 19:57:03	                main:475]	:	INFO	:	    Model type: GRU
[2022-04-18 19:57:03	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-04-18 19:57:03	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-04-18 19:57:03	                main:478]	:	INFO	:	    Batch Size: 128
[2022-04-18 19:57:03	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-04-18 19:57:03	                main:480]	:	INFO	:	    Patience: 10
[2022-04-18 19:57:03	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-04-18 19:57:03	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-04-18 19:57:03	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-04-18 19:57:03	                main:484]	:	INFO	:	    # Threads: 1
[2022-04-18 19:57:03	                main:486]	:	INFO	:	Preparing Dataset
[2022-04-18 19:57:03	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-18 19:57:04	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 2080)
[2022-04-18 20:00:20	                main:435]	:	INFO	:	Set logging level to 1
[2022-04-18 20:00:20	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-04-18 20:00:20	                main:444]	:	INFO	:	Resolving all filenames
[2022-04-18 20:00:20	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-04-18 20:00:20	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-04-18 20:00:20	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-04-18 20:00:20	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-04-18 20:00:20	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-04-18 20:00:20	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-04-18 20:00:20	                main:474]	:	INFO	:	Configuration: 
[2022-04-18 20:00:20	                main:475]	:	INFO	:	    Model type: GRU
[2022-04-18 20:00:20	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-04-18 20:00:20	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-04-18 20:00:20	                main:478]	:	INFO	:	    Batch Size: 128
[2022-04-18 20:00:20	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-04-18 20:00:20	                main:480]	:	INFO	:	    Patience: 10
[2022-04-18 20:00:20	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-04-18 20:00:20	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-04-18 20:00:20	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-04-18 20:00:20	                main:484]	:	INFO	:	    # Threads: 1
[2022-04-18 20:00:20	                main:486]	:	INFO	:	Preparing Dataset
[2022-04-18 20:00:20	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-18 20:00:21	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-18 20:00:22	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-04-18 20:00:22	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-18 20:00:22	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-18 20:00:22	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-04-18 20:00:22	                main:494]	:	INFO	:	Creating Model
[2022-04-18 20:00:22	                main:507]	:	INFO	:	Preparing config file
[2022-04-18 20:00:22	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-04-18 20:00:22	                main:512]	:	INFO	:	Loading config
[2022-04-18 20:00:23	                main:514]	:	INFO	:	Loading state
[2022-04-18 20:00:24	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-04-18 20:00:24	                main:562]	:	INFO	:	Starting Training
[2022-04-18 20:00:29	                main:574]	:	INFO	:	Epoch 1751 | loss: 0.0311385 | val_loss: 0.0311825 | Time: 4988.87 ms
[2022-04-18 20:00:33	                main:574]	:	INFO	:	Epoch 1752 | loss: 0.0311175 | val_loss: 0.0311775 | Time: 4810.44 ms
[2022-04-18 20:00:38	                main:574]	:	INFO	:	Epoch 1753 | loss: 0.0311121 | val_loss: 0.0311833 | Time: 4779.34 ms
[2022-04-18 20:00:43	                main:574]	:	INFO	:	Epoch 1754 | loss: 0.0311122 | val_loss: 0.0311894 | Time: 4662.75 ms
[2022-04-18 20:00:48	                main:574]	:	INFO	:	Epoch 1755 | loss: 0.031115 | val_loss: 0.03118 | Time: 4716.95 ms
[2022-04-18 20:00:52	                main:574]	:	INFO	:	Epoch 1756 | loss: 0.0311144 | val_loss: 0.0311841 | Time: 4768.41 ms
[2022-04-18 20:00:57	                main:574]	:	INFO	:	Epoch 1757 | loss: 0.0311131 | val_loss: 0.0311813 | Time: 4725.12 ms
[2022-04-18 20:01:02	                main:574]	:	INFO	:	Epoch 1758 | loss: 0.0311102 | val_loss: 0.0311886 | Time: 4767.71 ms
[2022-04-18 20:01:07	                main:574]	:	INFO	:	Epoch 1759 | loss: 0.0311081 | val_loss: 0.0311854 | Time: 4813.7 ms
[2022-04-18 20:01:11	                main:574]	:	INFO	:	Epoch 1760 | loss: 0.0311079 | val_loss: 0.0311794 | Time: 4629.03 ms
[2022-04-18 20:01:16	                main:574]	:	INFO	:	Epoch 1761 | loss: 0.0311074 | val_loss: 0.031185 | Time: 4627.07 ms
[2022-04-18 20:01:21	                main:574]	:	INFO	:	Epoch 1762 | loss: 0.031108 | val_loss: 0.0311824 | Time: 4709.64 ms
[2022-04-18 20:01:25	                main:574]	:	INFO	:	Epoch 1763 | loss: 0.0311096 | val_loss: 0.0311793 | Time: 4656.94 ms
[2022-04-18 20:01:30	                main:574]	:	INFO	:	Epoch 1764 | loss: 0.031109 | val_loss: 0.0311827 | Time: 4648.35 ms
[2022-04-18 20:01:35	                main:574]	:	INFO	:	Epoch 1765 | loss: 0.031109 | val_loss: 0.0311875 | Time: 4666.7 ms
[2022-04-18 20:01:39	                main:574]	:	INFO	:	Epoch 1766 | loss: 0.0311104 | val_loss: 0.0311752 | Time: 4645.66 ms
[2022-04-18 20:01:44	                main:574]	:	INFO	:	Epoch 1767 | loss: 0.0311077 | val_loss: 0.0311827 | Time: 4630.9 ms
[2022-04-18 20:01:49	                main:574]	:	INFO	:	Epoch 1768 | loss: 0.0311092 | val_loss: 0.0311878 | Time: 4567.73 ms
[2022-04-18 20:01:53	                main:574]	:	INFO	:	Epoch 1769 | loss: 0.0311091 | val_loss: 0.0311786 | Time: 4831.22 ms
[2022-04-18 20:01:58	                main:574]	:	INFO	:	Epoch 1770 | loss: 0.0311099 | val_loss: 0.0311794 | Time: 4675.96 ms
[2022-04-18 20:02:03	                main:574]	:	INFO	:	Epoch 1771 | loss: 0.0311086 | val_loss: 0.031185 | Time: 4742.97 ms
[2022-04-18 20:02:07	                main:574]	:	INFO	:	Epoch 1772 | loss: 0.0311078 | val_loss: 0.0311871 | Time: 4695.11 ms
[2022-04-18 20:02:12	                main:574]	:	INFO	:	Epoch 1773 | loss: 0.031105 | val_loss: 0.0311821 | Time: 4668.84 ms
[2022-04-18 20:02:17	                main:574]	:	INFO	:	Epoch 1774 | loss: 0.0311025 | val_loss: 0.0311905 | Time: 4602.45 ms
[2022-04-18 20:02:21	                main:574]	:	INFO	:	Epoch 1775 | loss: 0.0311032 | val_loss: 0.0311857 | Time: 4716.84 ms
[2022-04-18 20:02:26	                main:574]	:	INFO	:	Epoch 1776 | loss: 0.0311045 | val_loss: 0.0311868 | Time: 4654.65 ms
[2022-04-18 20:02:31	                main:574]	:	INFO	:	Epoch 1777 | loss: 0.0311045 | val_loss: 0.0311826 | Time: 4665.72 ms
[2022-04-18 20:02:35	                main:574]	:	INFO	:	Epoch 1778 | loss: 0.0311076 | val_loss: 0.0311855 | Time: 4676.1 ms
[2022-04-18 20:02:40	                main:574]	:	INFO	:	Epoch 1779 | loss: 0.0311162 | val_loss: 0.0311754 | Time: 4658.34 ms
[2022-04-18 20:02:45	                main:574]	:	INFO	:	Epoch 1780 | loss: 0.031112 | val_loss: 0.0311778 | Time: 4626.29 ms
[2022-04-18 20:02:49	                main:574]	:	INFO	:	Epoch 1781 | loss: 0.0311083 | val_loss: 0.0311846 | Time: 4603.14 ms
[2022-04-18 20:02:54	                main:574]	:	INFO	:	Epoch 1782 | loss: 0.0311077 | val_loss: 0.03119 | Time: 4733.02 ms
[2022-04-18 20:02:59	                main:574]	:	INFO	:	Epoch 1783 | loss: 0.0311077 | val_loss: 0.0311798 | Time: 4721.83 ms
[2022-04-18 20:03:04	                main:574]	:	INFO	:	Epoch 1784 | loss: 0.0311071 | val_loss: 0.0311873 | Time: 4756.14 ms
[2022-04-18 20:03:08	                main:574]	:	INFO	:	Epoch 1785 | loss: 0.031109 | val_loss: 0.0311754 | Time: 4719.14 ms
[2022-04-18 20:03:13	                main:574]	:	INFO	:	Epoch 1786 | loss: 0.0311096 | val_loss: 0.0311829 | Time: 4781.3 ms
[2022-04-18 20:03:18	                main:574]	:	INFO	:	Epoch 1787 | loss: 0.0311102 | val_loss: 0.0311809 | Time: 4780.77 ms
[2022-04-18 20:03:23	                main:574]	:	INFO	:	Epoch 1788 | loss: 0.0311068 | val_loss: 0.0311849 | Time: 4753.69 ms
[2022-04-18 20:03:27	                main:574]	:	INFO	:	Epoch 1789 | loss: 0.0311054 | val_loss: 0.0311866 | Time: 4679.91 ms
[2022-04-18 20:03:32	                main:574]	:	INFO	:	Epoch 1790 | loss: 0.0311063 | val_loss: 0.0311812 | Time: 4655.81 ms
[2022-04-18 20:03:37	                main:574]	:	INFO	:	Epoch 1791 | loss: 0.0311085 | val_loss: 0.0311872 | Time: 4601.34 ms
[2022-04-18 20:03:41	                main:574]	:	INFO	:	Epoch 1792 | loss: 0.0311123 | val_loss: 0.0311775 | Time: 4593.39 ms
[2022-04-18 20:03:46	                main:574]	:	INFO	:	Epoch 1793 | loss: 0.0311116 | val_loss: 0.0311866 | Time: 4694.02 ms
[2022-04-18 20:03:51	                main:574]	:	INFO	:	Epoch 1794 | loss: 0.031113 | val_loss: 0.0311691 | Time: 4682.16 ms
[2022-04-18 20:03:55	                main:574]	:	INFO	:	Epoch 1795 | loss: 0.0311134 | val_loss: 0.031183 | Time: 4681.39 ms
[2022-04-18 20:04:00	                main:574]	:	INFO	:	Epoch 1796 | loss: 0.0311175 | val_loss: 0.0311839 | Time: 4621.8 ms
[2022-04-18 20:04:05	                main:574]	:	INFO	:	Epoch 1797 | loss: 0.031118 | val_loss: 0.0311864 | Time: 4687.35 ms
[2022-04-18 20:04:09	                main:574]	:	INFO	:	Epoch 1798 | loss: 0.0311126 | val_loss: 0.0311762 | Time: 4529.94 ms
[2022-04-18 20:04:14	                main:574]	:	INFO	:	Epoch 1799 | loss: 0.031109 | val_loss: 0.031186 | Time: 4629.19 ms
[2022-04-18 20:04:19	                main:574]	:	INFO	:	Epoch 1800 | loss: 0.0311103 | val_loss: 0.0311875 | Time: 4767.15 ms
[2022-04-18 20:04:23	                main:574]	:	INFO	:	Epoch 1801 | loss: 0.0311159 | val_loss: 0.0311807 | Time: 4658.43 ms
[2022-04-18 20:04:28	                main:574]	:	INFO	:	Epoch 1802 | loss: 0.0311139 | val_loss: 0.031187 | Time: 4612.87 ms
[2022-04-18 20:04:32	                main:574]	:	INFO	:	Epoch 1803 | loss: 0.0311141 | val_loss: 0.0311806 | Time: 4630.53 ms
[2022-04-18 20:04:37	                main:574]	:	INFO	:	Epoch 1804 | loss: 0.0311112 | val_loss: 0.031182 | Time: 4610.48 ms
[2022-04-18 20:04:42	                main:574]	:	INFO	:	Epoch 1805 | loss: 0.0311138 | val_loss: 0.0311723 | Time: 4672.77 ms
[2022-04-18 20:04:46	                main:574]	:	INFO	:	Epoch 1806 | loss: 0.0311317 | val_loss: 0.0311765 | Time: 4690.75 ms
[2022-04-18 20:04:51	                main:574]	:	INFO	:	Epoch 1807 | loss: 0.0311315 | val_loss: 0.0311706 | Time: 4616.31 ms
[2022-04-18 20:04:56	                main:574]	:	INFO	:	Epoch 1808 | loss: 0.0311324 | val_loss: 0.0311712 | Time: 4668.32 ms
[2022-04-18 20:05:01	                main:574]	:	INFO	:	Epoch 1809 | loss: 0.0311423 | val_loss: 0.0311803 | Time: 4728.03 ms
[2022-04-18 20:05:05	                main:574]	:	INFO	:	Epoch 1810 | loss: 0.0311486 | val_loss: 0.0311791 | Time: 4693.48 ms
[2022-04-18 20:05:10	                main:574]	:	INFO	:	Epoch 1811 | loss: 0.0311451 | val_loss: 0.0311761 | Time: 4654.79 ms
[2022-04-18 20:05:15	                main:574]	:	INFO	:	Epoch 1812 | loss: 0.0311443 | val_loss: 0.0311741 | Time: 4704.85 ms
[2022-04-18 20:05:20	                main:574]	:	INFO	:	Epoch 1813 | loss: 0.0311394 | val_loss: 0.0311714 | Time: 4942.51 ms
[2022-04-18 20:05:24	                main:574]	:	INFO	:	Epoch 1814 | loss: 0.0311344 | val_loss: 0.0311724 | Time: 4838.38 ms
[2022-04-18 20:05:29	                main:574]	:	INFO	:	Epoch 1815 | loss: 0.031132 | val_loss: 0.0311722 | Time: 4731.91 ms
[2022-04-18 20:05:34	                main:574]	:	INFO	:	Epoch 1816 | loss: 0.0311313 | val_loss: 0.0311756 | Time: 4759.03 ms
[2022-04-18 20:05:39	                main:574]	:	INFO	:	Epoch 1817 | loss: 0.0311268 | val_loss: 0.0311795 | Time: 4774.42 ms
[2022-04-18 20:05:43	                main:574]	:	INFO	:	Epoch 1818 | loss: 0.0311256 | val_loss: 0.031171 | Time: 4682.52 ms
[2022-04-18 20:05:48	                main:574]	:	INFO	:	Epoch 1819 | loss: 0.0311248 | val_loss: 0.0311741 | Time: 4665.17 ms
[2022-04-18 20:05:53	                main:574]	:	INFO	:	Epoch 1820 | loss: 0.0311243 | val_loss: 0.0311709 | Time: 4701.08 ms
[2022-04-18 20:05:57	                main:574]	:	INFO	:	Epoch 1821 | loss: 0.0311217 | val_loss: 0.0311736 | Time: 4638.81 ms
[2022-04-18 20:06:02	                main:574]	:	INFO	:	Epoch 1822 | loss: 0.0311215 | val_loss: 0.0311749 | Time: 4734.36 ms
[2022-04-18 20:06:07	                main:574]	:	INFO	:	Epoch 1823 | loss: 0.0311198 | val_loss: 0.0311779 | Time: 4642.81 ms
[2022-04-18 20:06:11	                main:574]	:	INFO	:	Epoch 1824 | loss: 0.0311218 | val_loss: 0.0311816 | Time: 4614.88 ms
[2022-04-18 20:06:16	                main:574]	:	INFO	:	Epoch 1825 | loss: 0.0311433 | val_loss: 0.0311773 | Time: 4656.77 ms
[2022-04-18 20:06:21	                main:574]	:	INFO	:	Epoch 1826 | loss: 0.0311405 | val_loss: 0.0311785 | Time: 4595.54 ms
[2022-04-18 20:06:25	                main:574]	:	INFO	:	Epoch 1827 | loss: 0.0311363 | val_loss: 0.0311796 | Time: 4675.62 ms
[2022-04-18 20:06:30	                main:574]	:	INFO	:	Epoch 1828 | loss: 0.031131 | val_loss: 0.031179 | Time: 4668.13 ms
[2022-04-18 20:06:35	                main:574]	:	INFO	:	Epoch 1829 | loss: 0.0311296 | val_loss: 0.0311809 | Time: 4658.27 ms
[2022-04-18 20:06:39	                main:574]	:	INFO	:	Epoch 1830 | loss: 0.0311299 | val_loss: 0.0311801 | Time: 4634.79 ms
[2022-04-18 20:06:44	                main:574]	:	INFO	:	Epoch 1831 | loss: 0.0311318 | val_loss: 0.0311726 | Time: 4663.13 ms
[2022-04-18 20:06:49	                main:574]	:	INFO	:	Epoch 1832 | loss: 0.0311287 | val_loss: 0.0311739 | Time: 4612.46 ms
[2022-04-18 20:06:53	                main:574]	:	INFO	:	Epoch 1833 | loss: 0.0311274 | val_loss: 0.0311788 | Time: 4658.7 ms
[2022-04-18 20:06:58	                main:574]	:	INFO	:	Epoch 1834 | loss: 0.0311224 | val_loss: 0.0311797 | Time: 4596.66 ms
[2022-04-18 20:07:03	                main:574]	:	INFO	:	Epoch 1835 | loss: 0.0311261 | val_loss: 0.0311763 | Time: 4766.08 ms
[2022-04-18 20:07:07	                main:574]	:	INFO	:	Epoch 1836 | loss: 0.0311227 | val_loss: 0.0311771 | Time: 4630.51 ms
[2022-04-18 20:07:12	                main:574]	:	INFO	:	Epoch 1837 | loss: 0.0311245 | val_loss: 0.0311752 | Time: 4675.35 ms
[2022-04-18 20:07:17	                main:574]	:	INFO	:	Epoch 1838 | loss: 0.0311252 | val_loss: 0.0311762 | Time: 4665.87 ms
[2022-04-18 20:07:21	                main:574]	:	INFO	:	Epoch 1839 | loss: 0.0311244 | val_loss: 0.0311764 | Time: 4732.72 ms
[2022-04-18 20:07:26	                main:574]	:	INFO	:	Epoch 1840 | loss: 0.0311232 | val_loss: 0.0311741 | Time: 4671.78 ms
[2022-04-18 20:07:31	                main:574]	:	INFO	:	Epoch 1841 | loss: 0.0311223 | val_loss: 0.0311766 | Time: 4690.11 ms
[2022-04-18 20:07:35	                main:574]	:	INFO	:	Epoch 1842 | loss: 0.0311211 | val_loss: 0.0311801 | Time: 4697.69 ms
[2022-04-18 20:07:40	                main:574]	:	INFO	:	Epoch 1843 | loss: 0.0311189 | val_loss: 0.031176 | Time: 4694.14 ms
[2022-04-18 20:07:45	                main:574]	:	INFO	:	Epoch 1844 | loss: 0.0311175 | val_loss: 0.0311782 | Time: 4691.58 ms
[2022-04-18 20:07:50	                main:574]	:	INFO	:	Epoch 1845 | loss: 0.0311151 | val_loss: 0.0311768 | Time: 4749.09 ms
[2022-04-18 20:07:54	                main:574]	:	INFO	:	Epoch 1846 | loss: 0.0311175 | val_loss: 0.0311803 | Time: 4721.66 ms
[2022-04-18 20:07:59	                main:574]	:	INFO	:	Epoch 1847 | loss: 0.031123 | val_loss: 0.0311852 | Time: 4683.16 ms
[2022-04-18 20:08:04	                main:574]	:	INFO	:	Epoch 1848 | loss: 0.0311186 | val_loss: 0.0311755 | Time: 4768.42 ms
[2022-04-18 20:08:08	                main:574]	:	INFO	:	Epoch 1849 | loss: 0.0311167 | val_loss: 0.0311773 | Time: 4626.67 ms
[2022-04-18 20:08:13	                main:574]	:	INFO	:	Epoch 1850 | loss: 0.0311172 | val_loss: 0.0311781 | Time: 4720.85 ms
[2022-04-18 20:08:18	                main:574]	:	INFO	:	Epoch 1851 | loss: 0.0311155 | val_loss: 0.0311802 | Time: 4733.53 ms
[2022-04-18 20:08:23	                main:574]	:	INFO	:	Epoch 1852 | loss: 0.0311152 | val_loss: 0.0311765 | Time: 5151.06 ms
[2022-04-18 20:08:28	                main:574]	:	INFO	:	Epoch 1853 | loss: 0.0311175 | val_loss: 0.0311805 | Time: 5139.92 ms
[2022-04-18 20:08:33	                main:574]	:	INFO	:	Epoch 1854 | loss: 0.0311164 | val_loss: 0.0311806 | Time: 4634 ms
[2022-04-18 20:08:38	                main:574]	:	INFO	:	Epoch 1855 | loss: 0.0311164 | val_loss: 0.0311781 | Time: 4652 ms
[2022-04-18 20:08:42	                main:574]	:	INFO	:	Epoch 1856 | loss: 0.031118 | val_loss: 0.031177 | Time: 4682.72 ms
[2022-04-18 20:08:47	                main:574]	:	INFO	:	Epoch 1857 | loss: 0.0311144 | val_loss: 0.0311819 | Time: 4615.37 ms
[2022-04-18 20:08:51	                main:574]	:	INFO	:	Epoch 1858 | loss: 0.0311137 | val_loss: 0.0311824 | Time: 4630.42 ms
[2022-04-18 20:08:56	                main:574]	:	INFO	:	Epoch 1859 | loss: 0.0311136 | val_loss: 0.0311823 | Time: 4675.32 ms
[2022-04-18 20:09:01	                main:574]	:	INFO	:	Epoch 1860 | loss: 0.0311168 | val_loss: 0.0311806 | Time: 4699.63 ms
[2022-04-18 20:09:05	                main:574]	:	INFO	:	Epoch 1861 | loss: 0.0311162 | val_loss: 0.0311815 | Time: 4655.02 ms
[2022-04-18 20:09:10	                main:574]	:	INFO	:	Epoch 1862 | loss: 0.0311138 | val_loss: 0.0311851 | Time: 4621.55 ms
[2022-04-18 20:09:15	                main:574]	:	INFO	:	Epoch 1863 | loss: 0.0311205 | val_loss: 0.0311855 | Time: 4604.58 ms
[2022-04-18 20:09:19	                main:574]	:	INFO	:	Epoch 1864 | loss: 0.0311265 | val_loss: 0.0311816 | Time: 4702.4 ms
[2022-04-18 20:09:24	                main:574]	:	INFO	:	Epoch 1865 | loss: 0.0311238 | val_loss: 0.0311853 | Time: 4673.85 ms
[2022-04-18 20:09:29	                main:574]	:	INFO	:	Epoch 1866 | loss: 0.0311221 | val_loss: 0.031179 | Time: 4608.88 ms
[2022-04-18 20:09:33	                main:574]	:	INFO	:	Epoch 1867 | loss: 0.0311199 | val_loss: 0.0311824 | Time: 4647.86 ms
[2022-04-18 20:09:38	                main:574]	:	INFO	:	Epoch 1868 | loss: 0.0311189 | val_loss: 0.0311795 | Time: 4651.74 ms
[2022-04-18 20:09:43	                main:574]	:	INFO	:	Epoch 1869 | loss: 0.0311185 | val_loss: 0.0311837 | Time: 4617.1 ms
[2022-04-18 20:09:47	                main:574]	:	INFO	:	Epoch 1870 | loss: 0.0311175 | val_loss: 0.03118 | Time: 4684.99 ms
[2022-04-18 20:09:52	                main:574]	:	INFO	:	Epoch 1871 | loss: 0.0311302 | val_loss: 0.031183 | Time: 4667.47 ms
[2022-04-18 20:09:57	                main:574]	:	INFO	:	Epoch 1872 | loss: 0.0311336 | val_loss: 0.0311769 | Time: 4625.94 ms
[2022-04-18 20:10:01	                main:574]	:	INFO	:	Epoch 1873 | loss: 0.0311301 | val_loss: 0.0311773 | Time: 4768.05 ms
[2022-04-18 20:10:06	                main:574]	:	INFO	:	Epoch 1874 | loss: 0.0311279 | val_loss: 0.0311765 | Time: 4801.35 ms
[2022-04-18 20:10:11	                main:574]	:	INFO	:	Epoch 1875 | loss: 0.0311244 | val_loss: 0.0311716 | Time: 4671.7 ms
[2022-04-18 20:10:16	                main:574]	:	INFO	:	Epoch 1876 | loss: 0.0311221 | val_loss: 0.0311737 | Time: 4837.4 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 2080)
[2022-04-18 20:15:03	                main:435]	:	INFO	:	Set logging level to 1
[2022-04-18 20:15:03	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-04-18 20:15:03	                main:444]	:	INFO	:	Resolving all filenames
[2022-04-18 20:15:03	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-04-18 20:15:03	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-04-18 20:15:03	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-04-18 20:15:03	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-04-18 20:15:03	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-04-18 20:15:03	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-04-18 20:15:03	                main:474]	:	INFO	:	Configuration: 
[2022-04-18 20:15:03	                main:475]	:	INFO	:	    Model type: GRU
[2022-04-18 20:15:03	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-04-18 20:15:03	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-04-18 20:15:03	                main:478]	:	INFO	:	    Batch Size: 128
[2022-04-18 20:15:03	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-04-18 20:15:03	                main:480]	:	INFO	:	    Patience: 10
[2022-04-18 20:15:03	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-04-18 20:15:03	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-04-18 20:15:03	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-04-18 20:15:03	                main:484]	:	INFO	:	    # Threads: 1
[2022-04-18 20:15:03	                main:486]	:	INFO	:	Preparing Dataset
[2022-04-18 20:15:03	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-18 20:15:03	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-18 20:15:05	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-04-18 20:15:05	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-18 20:15:05	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-18 20:15:05	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-04-18 20:15:05	                main:494]	:	INFO	:	Creating Model
[2022-04-18 20:15:05	                main:507]	:	INFO	:	Preparing config file
[2022-04-18 20:15:05	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-04-18 20:15:05	                main:512]	:	INFO	:	Loading config
[2022-04-18 20:15:05	                main:514]	:	INFO	:	Loading state
[2022-04-18 20:15:06	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-04-18 20:15:07	                main:562]	:	INFO	:	Starting Training
[2022-04-18 20:15:12	                main:574]	:	INFO	:	Epoch 1876 | loss: 0.0311502 | val_loss: 0.0311807 | Time: 5004.33 ms
[2022-04-18 20:15:16	                main:574]	:	INFO	:	Epoch 1877 | loss: 0.0311282 | val_loss: 0.0311766 | Time: 4582.91 ms
[2022-04-18 20:15:21	                main:574]	:	INFO	:	Epoch 1878 | loss: 0.0311257 | val_loss: 0.0311798 | Time: 4625.18 ms
[2022-04-18 20:15:26	                main:574]	:	INFO	:	Epoch 1879 | loss: 0.0311222 | val_loss: 0.0311763 | Time: 4935.48 ms
[2022-04-18 20:15:30	                main:574]	:	INFO	:	Epoch 1880 | loss: 0.0311213 | val_loss: 0.0311789 | Time: 4621.62 ms
[2022-04-18 20:15:35	                main:574]	:	INFO	:	Epoch 1881 | loss: 0.0311186 | val_loss: 0.0311765 | Time: 4577.61 ms
[2022-04-18 20:15:40	                main:574]	:	INFO	:	Epoch 1882 | loss: 0.0311183 | val_loss: 0.0311751 | Time: 4618.23 ms
[2022-04-18 20:15:44	                main:574]	:	INFO	:	Epoch 1883 | loss: 0.0311213 | val_loss: 0.0311763 | Time: 4593.57 ms
[2022-04-18 20:15:49	                main:574]	:	INFO	:	Epoch 1884 | loss: 0.0311203 | val_loss: 0.0311781 | Time: 4609.89 ms
[2022-04-18 20:15:53	                main:574]	:	INFO	:	Epoch 1885 | loss: 0.0311176 | val_loss: 0.0311812 | Time: 4616.4 ms
[2022-04-18 20:15:58	                main:574]	:	INFO	:	Epoch 1886 | loss: 0.0311202 | val_loss: 0.03118 | Time: 4593.35 ms
[2022-04-18 20:16:03	                main:574]	:	INFO	:	Epoch 1887 | loss: 0.0311175 | val_loss: 0.0311778 | Time: 4638.54 ms
[2022-04-18 20:16:07	                main:574]	:	INFO	:	Epoch 1888 | loss: 0.0311172 | val_loss: 0.0311833 | Time: 4664.91 ms
[2022-04-18 20:16:12	                main:574]	:	INFO	:	Epoch 1889 | loss: 0.0311164 | val_loss: 0.0311848 | Time: 4663.96 ms
[2022-04-18 20:16:17	                main:574]	:	INFO	:	Epoch 1890 | loss: 0.0311202 | val_loss: 0.0311777 | Time: 4647.18 ms
[2022-04-18 20:16:21	                main:574]	:	INFO	:	Epoch 1891 | loss: 0.031117 | val_loss: 0.0311793 | Time: 4612.78 ms
[2022-04-18 20:16:26	                main:574]	:	INFO	:	Epoch 1892 | loss: 0.031115 | val_loss: 0.0311761 | Time: 4568.3 ms
[2022-04-18 20:16:31	                main:574]	:	INFO	:	Epoch 1893 | loss: 0.0311131 | val_loss: 0.0311781 | Time: 4666.35 ms
[2022-04-18 20:16:35	                main:574]	:	INFO	:	Epoch 1894 | loss: 0.0311128 | val_loss: 0.0311843 | Time: 4530.98 ms
[2022-04-18 20:16:40	                main:574]	:	INFO	:	Epoch 1895 | loss: 0.0311131 | val_loss: 0.031186 | Time: 4604.43 ms
[2022-04-18 20:16:44	                main:574]	:	INFO	:	Epoch 1896 | loss: 0.0311123 | val_loss: 0.0311849 | Time: 4638.52 ms
[2022-04-18 20:16:49	                main:574]	:	INFO	:	Epoch 1897 | loss: 0.0311118 | val_loss: 0.0311798 | Time: 4540.76 ms
[2022-04-18 20:16:54	                main:574]	:	INFO	:	Epoch 1898 | loss: 0.0311144 | val_loss: 0.0311767 | Time: 4646.27 ms
[2022-04-18 20:16:58	                main:574]	:	INFO	:	Epoch 1899 | loss: 0.0311175 | val_loss: 0.0311806 | Time: 4584.32 ms
[2022-04-18 20:17:03	                main:574]	:	INFO	:	Epoch 1900 | loss: 0.0311211 | val_loss: 0.0311812 | Time: 4916 ms
[2022-04-18 20:17:08	                main:574]	:	INFO	:	Epoch 1901 | loss: 0.0311225 | val_loss: 0.031178 | Time: 4858.34 ms
[2022-04-18 20:17:13	                main:574]	:	INFO	:	Epoch 1902 | loss: 0.0311213 | val_loss: 0.0311729 | Time: 4628.18 ms
[2022-04-18 20:17:17	                main:574]	:	INFO	:	Epoch 1903 | loss: 0.031121 | val_loss: 0.0311747 | Time: 4605.35 ms
[2022-04-18 20:17:22	                main:574]	:	INFO	:	Epoch 1904 | loss: 0.0311194 | val_loss: 0.0311727 | Time: 4750.93 ms
[2022-04-18 20:17:27	                main:574]	:	INFO	:	Epoch 1905 | loss: 0.031122 | val_loss: 0.0311753 | Time: 4790.16 ms
[2022-04-18 20:17:31	                main:574]	:	INFO	:	Epoch 1906 | loss: 0.0311227 | val_loss: 0.0311762 | Time: 4751.58 ms
[2022-04-18 20:17:36	                main:574]	:	INFO	:	Epoch 1907 | loss: 0.0311205 | val_loss: 0.0311669 | Time: 4714.7 ms
[2022-04-18 20:17:41	                main:574]	:	INFO	:	Epoch 1908 | loss: 0.0311185 | val_loss: 0.0311728 | Time: 4744.89 ms
[2022-04-18 20:17:46	                main:574]	:	INFO	:	Epoch 1909 | loss: 0.031118 | val_loss: 0.0311787 | Time: 4623.89 ms
[2022-04-18 20:17:50	                main:574]	:	INFO	:	Epoch 1910 | loss: 0.0311201 | val_loss: 0.0311746 | Time: 4659.43 ms
[2022-04-18 20:17:55	                main:574]	:	INFO	:	Epoch 1911 | loss: 0.0311224 | val_loss: 0.0311752 | Time: 4604.68 ms
[2022-04-18 20:17:59	                main:574]	:	INFO	:	Epoch 1912 | loss: 0.0311226 | val_loss: 0.0311704 | Time: 4626.16 ms
[2022-04-18 20:18:04	                main:574]	:	INFO	:	Epoch 1913 | loss: 0.0311179 | val_loss: 0.0311723 | Time: 4633.34 ms
[2022-04-18 20:18:09	                main:574]	:	INFO	:	Epoch 1914 | loss: 0.0311179 | val_loss: 0.0311717 | Time: 4614.17 ms
[2022-04-18 20:18:13	                main:574]	:	INFO	:	Epoch 1915 | loss: 0.0311155 | val_loss: 0.0311697 | Time: 4625.3 ms
[2022-04-18 20:18:18	                main:574]	:	INFO	:	Epoch 1916 | loss: 0.0311128 | val_loss: 0.0311715 | Time: 4550.22 ms
[2022-04-18 20:18:23	                main:574]	:	INFO	:	Epoch 1917 | loss: 0.031113 | val_loss: 0.031172 | Time: 4653.61 ms
[2022-04-18 20:18:27	                main:574]	:	INFO	:	Epoch 1918 | loss: 0.0311126 | val_loss: 0.0311742 | Time: 4663.7 ms
[2022-04-18 20:18:32	                main:574]	:	INFO	:	Epoch 1919 | loss: 0.0311106 | val_loss: 0.0311737 | Time: 4639.44 ms
[2022-04-18 20:18:36	                main:574]	:	INFO	:	Epoch 1920 | loss: 0.0311133 | val_loss: 0.0311809 | Time: 4581.75 ms
[2022-04-18 20:18:41	                main:574]	:	INFO	:	Epoch 1921 | loss: 0.0311165 | val_loss: 0.0311772 | Time: 4565.83 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce RTX 2080)
[2022-04-18 20:24:02	                main:435]	:	INFO	:	Set logging level to 1
[2022-04-18 20:24:02	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-04-18 20:24:02	                main:444]	:	INFO	:	Resolving all filenames
[2022-04-18 20:24:02	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-04-18 20:24:02	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-04-18 20:24:02	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-04-18 20:24:02	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-04-18 20:24:02	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-04-18 20:24:02	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-04-18 20:24:02	                main:474]	:	INFO	:	Configuration: 
[2022-04-18 20:24:02	                main:475]	:	INFO	:	    Model type: GRU
[2022-04-18 20:24:02	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-04-18 20:24:02	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-04-18 20:24:02	                main:478]	:	INFO	:	    Batch Size: 128
[2022-04-18 20:24:02	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-04-18 20:24:02	                main:480]	:	INFO	:	    Patience: 10
[2022-04-18 20:24:02	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-04-18 20:24:02	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-04-18 20:24:02	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-04-18 20:24:02	                main:484]	:	INFO	:	    # Threads: 1
[2022-04-18 20:24:02	                main:486]	:	INFO	:	Preparing Dataset
[2022-04-18 20:24:02	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-18 20:24:02	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-18 20:24:04	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-04-18 20:24:04	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-18 20:24:04	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-18 20:24:04	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-04-18 20:24:04	                main:494]	:	INFO	:	Creating Model
[2022-04-18 20:24:04	                main:507]	:	INFO	:	Preparing config file
[2022-04-18 20:24:04	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-04-18 20:24:04	                main:512]	:	INFO	:	Loading config
[2022-04-18 20:24:04	                main:514]	:	INFO	:	Loading state
[2022-04-18 20:24:05	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-04-18 20:24:05	                main:562]	:	INFO	:	Starting Training
[2022-04-18 20:24:10	                main:574]	:	INFO	:	Epoch 1918 | loss: 0.0311409 | val_loss: 0.0311767 | Time: 5228.15 ms
[2022-04-18 20:24:15	                main:574]	:	INFO	:	Epoch 1919 | loss: 0.0311206 | val_loss: 0.0311789 | Time: 4946.25 ms
[2022-04-18 20:24:20	                main:574]	:	INFO	:	Epoch 1920 | loss: 0.0311185 | val_loss: 0.0311716 | Time: 4924.81 ms
[2022-04-18 20:24:25	                main:574]	:	INFO	:	Epoch 1921 | loss: 0.0311181 | val_loss: 0.0311685 | Time: 5209.85 ms
[2022-04-18 20:24:30	                main:574]	:	INFO	:	Epoch 1922 | loss: 0.0311172 | val_loss: 0.0311794 | Time: 4776.93 ms
[2022-04-18 20:24:35	                main:574]	:	INFO	:	Epoch 1923 | loss: 0.0311199 | val_loss: 0.0311707 | Time: 4713.22 ms
[2022-04-18 20:24:40	                main:574]	:	INFO	:	Epoch 1924 | loss: 0.0311242 | val_loss: 0.0311744 | Time: 4605.03 ms
[2022-04-18 20:24:44	                main:574]	:	INFO	:	Epoch 1925 | loss: 0.0311217 | val_loss: 0.0311779 | Time: 4662.22 ms
[2022-04-18 20:24:49	                main:574]	:	INFO	:	Epoch 1926 | loss: 0.0311187 | val_loss: 0.0311821 | Time: 4749.18 ms
[2022-04-18 20:24:54	                main:574]	:	INFO	:	Epoch 1927 | loss: 0.0311149 | val_loss: 0.031176 | Time: 4724.05 ms
[2022-04-18 20:24:59	                main:574]	:	INFO	:	Epoch 1928 | loss: 0.0311167 | val_loss: 0.0311746 | Time: 4936.57 ms
[2022-04-18 20:25:04	                main:574]	:	INFO	:	Epoch 1929 | loss: 0.0311162 | val_loss: 0.031173 | Time: 5082.28 ms
[2022-04-18 20:25:08	                main:574]	:	INFO	:	Epoch 1930 | loss: 0.031116 | val_loss: 0.0311742 | Time: 4683.17 ms
[2022-04-18 20:25:13	                main:574]	:	INFO	:	Epoch 1931 | loss: 0.0311148 | val_loss: 0.0311759 | Time: 4603.23 ms
[2022-04-18 20:25:18	                main:574]	:	INFO	:	Epoch 1932 | loss: 0.0311127 | val_loss: 0.0311768 | Time: 4610.07 ms
[2022-04-18 20:25:22	                main:574]	:	INFO	:	Epoch 1933 | loss: 0.031113 | val_loss: 0.0311817 | Time: 4729.81 ms
[2022-04-18 20:25:27	                main:574]	:	INFO	:	Epoch 1934 | loss: 0.0311167 | val_loss: 0.0311791 | Time: 4933.44 ms
[2022-04-18 20:25:32	                main:574]	:	INFO	:	Epoch 1935 | loss: 0.0311307 | val_loss: 0.0311804 | Time: 4640.4 ms
[2022-04-18 20:25:37	                main:574]	:	INFO	:	Epoch 1936 | loss: 0.0311258 | val_loss: 0.0311774 | Time: 4707.11 ms
[2022-04-18 20:25:41	                main:574]	:	INFO	:	Epoch 1937 | loss: 0.0311226 | val_loss: 0.0311808 | Time: 4671.48 ms
[2022-04-18 20:25:46	                main:574]	:	INFO	:	Epoch 1938 | loss: 0.0311177 | val_loss: 0.0311776 | Time: 4544.5 ms
[2022-04-18 20:25:51	                main:574]	:	INFO	:	Epoch 1939 | loss: 0.0311156 | val_loss: 0.0311772 | Time: 4571.94 ms
[2022-04-18 20:25:55	                main:574]	:	INFO	:	Epoch 1940 | loss: 0.0311177 | val_loss: 0.0311844 | Time: 4582.53 ms
[2022-04-18 20:26:00	                main:574]	:	INFO	:	Epoch 1941 | loss: 0.0311212 | val_loss: 0.031174 | Time: 4674.19 ms
[2022-04-18 20:26:04	                main:574]	:	INFO	:	Epoch 1942 | loss: 0.0311158 | val_loss: 0.0311748 | Time: 4688.13 ms
[2022-04-18 20:26:09	                main:574]	:	INFO	:	Epoch 1943 | loss: 0.0311139 | val_loss: 0.0311738 | Time: 4606.06 ms
[2022-04-18 20:26:14	                main:574]	:	INFO	:	Epoch 1944 | loss: 0.0311125 | val_loss: 0.0311848 | Time: 4616.85 ms
[2022-04-18 20:26:18	                main:574]	:	INFO	:	Epoch 1945 | loss: 0.0311139 | val_loss: 0.0311749 | Time: 4582.92 ms
[2022-04-18 20:26:23	                main:574]	:	INFO	:	Epoch 1946 | loss: 0.0311178 | val_loss: 0.0311752 | Time: 4691.37 ms
[2022-04-18 20:26:28	                main:574]	:	INFO	:	Epoch 1947 | loss: 0.0311142 | val_loss: 0.0311806 | Time: 4724.01 ms
[2022-04-18 20:26:32	                main:574]	:	INFO	:	Epoch 1948 | loss: 0.0311099 | val_loss: 0.0311787 | Time: 4648.9 ms
[2022-04-18 20:26:37	                main:574]	:	INFO	:	Epoch 1949 | loss: 0.0311102 | val_loss: 0.0311751 | Time: 4686.79 ms
[2022-04-18 20:26:42	                main:574]	:	INFO	:	Epoch 1950 | loss: 0.0311122 | val_loss: 0.0311803 | Time: 4745.9 ms
[2022-04-18 20:26:47	                main:574]	:	INFO	:	Epoch 1951 | loss: 0.0311093 | val_loss: 0.0311827 | Time: 4721.56 ms
[2022-04-18 20:26:51	                main:574]	:	INFO	:	Epoch 1952 | loss: 0.0311092 | val_loss: 0.0311804 | Time: 4599.9 ms
[2022-04-18 20:26:56	                main:574]	:	INFO	:	Epoch 1953 | loss: 0.0311098 | val_loss: 0.0311811 | Time: 4787.32 ms
[2022-04-18 20:27:01	                main:574]	:	INFO	:	Epoch 1954 | loss: 0.031111 | val_loss: 0.0311807 | Time: 4798.2 ms
[2022-04-18 20:27:05	                main:574]	:	INFO	:	Epoch 1955 | loss: 0.0311139 | val_loss: 0.0311777 | Time: 4726.59 ms
[2022-04-18 20:27:10	                main:574]	:	INFO	:	Epoch 1956 | loss: 0.0311238 | val_loss: 0.0311758 | Time: 4588.74 ms
[2022-04-18 20:27:15	                main:574]	:	INFO	:	Epoch 1957 | loss: 0.0311235 | val_loss: 0.0311774 | Time: 4622.67 ms
[2022-04-18 20:27:19	                main:574]	:	INFO	:	Epoch 1958 | loss: 0.0311172 | val_loss: 0.0311779 | Time: 4758 ms
[2022-04-18 20:27:24	                main:574]	:	INFO	:	Epoch 1959 | loss: 0.0311151 | val_loss: 0.0311793 | Time: 4874.17 ms
[2022-04-18 20:27:29	                main:574]	:	INFO	:	Epoch 1960 | loss: 0.0311119 | val_loss: 0.0311801 | Time: 4831.29 ms
[2022-04-18 20:27:34	                main:574]	:	INFO	:	Epoch 1961 | loss: 0.0311098 | val_loss: 0.0311815 | Time: 4601.34 ms
[2022-04-18 20:27:38	                main:574]	:	INFO	:	Epoch 1962 | loss: 0.0311117 | val_loss: 0.0311821 | Time: 4628.03 ms
[2022-04-18 20:27:43	                main:574]	:	INFO	:	Epoch 1963 | loss: 0.0311091 | val_loss: 0.031178 | Time: 4551.2 ms
[2022-04-18 20:27:48	                main:574]	:	INFO	:	Epoch 1964 | loss: 0.0311079 | val_loss: 0.0311808 | Time: 4710.94 ms
[2022-04-18 20:27:53	                main:574]	:	INFO	:	Epoch 1965 | loss: 0.0311111 | val_loss: 0.0311828 | Time: 4810.12 ms
[2022-04-18 20:27:57	                main:574]	:	INFO	:	Epoch 1966 | loss: 0.0311162 | val_loss: 0.031182 | Time: 4652.99 ms
[2022-04-18 20:28:02	                main:574]	:	INFO	:	Epoch 1967 | loss: 0.0311263 | val_loss: 0.0311726 | Time: 4631.17 ms
[2022-04-18 20:28:06	                main:574]	:	INFO	:	Epoch 1968 | loss: 0.0311217 | val_loss: 0.0311754 | Time: 4614.3 ms
[2022-04-18 20:28:11	                main:574]	:	INFO	:	Epoch 1969 | loss: 0.0311176 | val_loss: 0.031181 | Time: 4686.87 ms
[2022-04-18 20:28:16	                main:574]	:	INFO	:	Epoch 1970 | loss: 0.0311182 | val_loss: 0.0311737 | Time: 4559.49 ms
[2022-04-18 20:28:20	                main:574]	:	INFO	:	Epoch 1971 | loss: 0.0311161 | val_loss: 0.0311815 | Time: 4668.55 ms
[2022-04-18 20:28:25	                main:574]	:	INFO	:	Epoch 1972 | loss: 0.0311175 | val_loss: 0.0311837 | Time: 4542.65 ms
[2022-04-18 20:28:30	                main:574]	:	INFO	:	Epoch 1973 | loss: 0.0311298 | val_loss: 0.0311836 | Time: 4693.03 ms
[2022-04-18 20:28:34	                main:574]	:	INFO	:	Epoch 1974 | loss: 0.031129 | val_loss: 0.0311741 | Time: 4623.59 ms
[2022-04-18 20:28:39	                main:574]	:	INFO	:	Epoch 1975 | loss: 0.0311215 | val_loss: 0.0311727 | Time: 4596.42 ms
[2022-04-18 20:28:43	                main:574]	:	INFO	:	Epoch 1976 | loss: 0.0311197 | val_loss: 0.0311763 | Time: 4618.93 ms
[2022-04-18 20:28:48	                main:574]	:	INFO	:	Epoch 1977 | loss: 0.0311233 | val_loss: 0.0311727 | Time: 4641.69 ms
[2022-04-18 20:28:53	                main:574]	:	INFO	:	Epoch 1978 | loss: 0.0311286 | val_loss: 0.0311766 | Time: 4696.17 ms
[2022-04-18 20:28:58	                main:574]	:	INFO	:	Epoch 1979 | loss: 0.0311266 | val_loss: 0.0311695 | Time: 4748.39 ms
[2022-04-18 20:29:02	                main:574]	:	INFO	:	Epoch 1980 | loss: 0.031122 | val_loss: 0.0311764 | Time: 4556.54 ms
[2022-04-18 20:29:07	                main:574]	:	INFO	:	Epoch 1981 | loss: 0.0311183 | val_loss: 0.0311696 | Time: 4752.93 ms
[2022-04-18 20:29:12	                main:574]	:	INFO	:	Epoch 1982 | loss: 0.0311178 | val_loss: 0.0311815 | Time: 4724.24 ms
[2022-04-18 20:29:16	                main:574]	:	INFO	:	Epoch 1983 | loss: 0.0311173 | val_loss: 0.0311782 | Time: 4625.2 ms
[2022-04-18 20:29:21	                main:574]	:	INFO	:	Epoch 1984 | loss: 0.0311161 | val_loss: 0.0311732 | Time: 4608.07 ms
[2022-04-18 20:29:26	                main:574]	:	INFO	:	Epoch 1985 | loss: 0.0311128 | val_loss: 0.0311784 | Time: 4682.55 ms
[2022-04-18 20:29:31	                main:574]	:	INFO	:	Epoch 1986 | loss: 0.031113 | val_loss: 0.0311751 | Time: 4917.63 ms
[2022-04-18 20:29:35	                main:574]	:	INFO	:	Epoch 1987 | loss: 0.0311142 | val_loss: 0.0311772 | Time: 4741.03 ms
[2022-04-18 20:29:40	                main:574]	:	INFO	:	Epoch 1988 | loss: 0.0311128 | val_loss: 0.0311749 | Time: 4634.31 ms
[2022-04-18 20:29:44	                main:574]	:	INFO	:	Epoch 1989 | loss: 0.0311103 | val_loss: 0.0311758 | Time: 4555.05 ms
[2022-04-18 20:29:49	                main:574]	:	INFO	:	Epoch 1990 | loss: 0.0311112 | val_loss: 0.0311794 | Time: 4676.28 ms
[2022-04-18 20:29:54	                main:574]	:	INFO	:	Epoch 1991 | loss: 0.0311097 | val_loss: 0.0311745 | Time: 4792.56 ms
[2022-04-18 20:29:59	                main:574]	:	INFO	:	Epoch 1992 | loss: 0.0311093 | val_loss: 0.0311757 | Time: 4656.74 ms
[2022-04-18 20:30:03	                main:574]	:	INFO	:	Epoch 1993 | loss: 0.0311081 | val_loss: 0.0311752 | Time: 4646.15 ms
[2022-04-18 20:30:08	                main:574]	:	INFO	:	Epoch 1994 | loss: 0.0311091 | val_loss: 0.0311736 | Time: 4583.22 ms
[2022-04-18 20:30:12	                main:574]	:	INFO	:	Epoch 1995 | loss: 0.0311094 | val_loss: 0.0311726 | Time: 4601.98 ms
[2022-04-18 20:30:17	                main:574]	:	INFO	:	Epoch 1996 | loss: 0.0311106 | val_loss: 0.0311699 | Time: 4598.76 ms
[2022-04-18 20:30:22	                main:574]	:	INFO	:	Epoch 1997 | loss: 0.0311105 | val_loss: 0.031173 | Time: 4699.47 ms
[2022-04-18 20:30:27	                main:574]	:	INFO	:	Epoch 1998 | loss: 0.0311081 | val_loss: 0.0311753 | Time: 4787.1 ms
[2022-04-18 20:30:31	                main:574]	:	INFO	:	Epoch 1999 | loss: 0.031108 | val_loss: 0.0311684 | Time: 4557.24 ms
[2022-04-18 20:30:36	                main:574]	:	INFO	:	Epoch 2000 | loss: 0.0311072 | val_loss: 0.0311747 | Time: 4575.65 ms
[2022-04-18 20:30:40	                main:574]	:	INFO	:	Epoch 2001 | loss: 0.0311048 | val_loss: 0.0311688 | Time: 4701.86 ms
[2022-04-18 20:30:45	                main:574]	:	INFO	:	Epoch 2002 | loss: 0.0311063 | val_loss: 0.0311708 | Time: 4698.24 ms
[2022-04-18 20:30:50	                main:574]	:	INFO	:	Epoch 2003 | loss: 0.0311054 | val_loss: 0.0311743 | Time: 4624.5 ms
[2022-04-18 20:30:54	                main:574]	:	INFO	:	Epoch 2004 | loss: 0.0311061 | val_loss: 0.0311732 | Time: 4614.74 ms
[2022-04-18 20:30:59	                main:574]	:	INFO	:	Epoch 2005 | loss: 0.0311038 | val_loss: 0.0311745 | Time: 4711.85 ms
[2022-04-18 20:31:04	                main:574]	:	INFO	:	Epoch 2006 | loss: 0.0311059 | val_loss: 0.0311734 | Time: 4745.86 ms
[2022-04-18 20:31:08	                main:574]	:	INFO	:	Epoch 2007 | loss: 0.0311083 | val_loss: 0.0311771 | Time: 4638.69 ms
[2022-04-18 20:31:13	                main:574]	:	INFO	:	Epoch 2008 | loss: 0.0311118 | val_loss: 0.0311781 | Time: 4652.53 ms
[2022-04-18 20:31:18	                main:574]	:	INFO	:	Epoch 2009 | loss: 0.0311086 | val_loss: 0.0311718 | Time: 4701.37 ms
[2022-04-18 20:31:22	                main:574]	:	INFO	:	Epoch 2010 | loss: 0.0311063 | val_loss: 0.0311724 | Time: 4612.13 ms
[2022-04-18 20:31:27	                main:574]	:	INFO	:	Epoch 2011 | loss: 0.031105 | val_loss: 0.031174 | Time: 4770.68 ms
[2022-04-18 20:31:32	                main:574]	:	INFO	:	Epoch 2012 | loss: 0.0311056 | val_loss: 0.0311734 | Time: 4664.15 ms
[2022-04-18 20:31:37	                main:574]	:	INFO	:	Epoch 2013 | loss: 0.031105 | val_loss: 0.0311781 | Time: 4606.03 ms
[2022-04-18 20:31:41	                main:574]	:	INFO	:	Epoch 2014 | loss: 0.0311051 | val_loss: 0.031174 | Time: 4596.78 ms
[2022-04-18 20:31:46	                main:574]	:	INFO	:	Epoch 2015 | loss: 0.031104 | val_loss: 0.0311777 | Time: 4632.82 ms
[2022-04-18 20:31:51	                main:574]	:	INFO	:	Epoch 2016 | loss: 0.0311058 | val_loss: 0.0311778 | Time: 4707.66 ms
[2022-04-18 20:31:55	                main:574]	:	INFO	:	Epoch 2017 | loss: 0.0311077 | val_loss: 0.0311804 | Time: 4649.99 ms
[2022-04-18 20:32:00	                main:574]	:	INFO	:	Epoch 2018 | loss: 0.0311075 | val_loss: 0.0311802 | Time: 4627.41 ms
[2022-04-18 20:32:04	                main:574]	:	INFO	:	Epoch 2019 | loss: 0.031114 | val_loss: 0.0311808 | Time: 4568.51 ms
[2022-04-18 20:32:09	                main:574]	:	INFO	:	Epoch 2020 | loss: 0.0311179 | val_loss: 0.0311749 | Time: 4589.78 ms
[2022-04-18 20:32:14	                main:574]	:	INFO	:	Epoch 2021 | loss: 0.0311263 | val_loss: 0.031178 | Time: 4635.04 ms
[2022-04-18 20:32:18	                main:574]	:	INFO	:	Epoch 2022 | loss: 0.0311237 | val_loss: 0.0311728 | Time: 4669.81 ms
[2022-04-18 20:32:23	                main:574]	:	INFO	:	Epoch 2023 | loss: 0.0311226 | val_loss: 0.0311725 | Time: 4682.95 ms
[2022-04-18 20:32:28	                main:574]	:	INFO	:	Epoch 2024 | loss: 0.0311251 | val_loss: 0.0311735 | Time: 4752.91 ms
[2022-04-18 20:32:32	                main:574]	:	INFO	:	Epoch 2025 | loss: 0.0311237 | val_loss: 0.0311701 | Time: 4574.59 ms
[2022-04-18 20:32:37	                main:574]	:	INFO	:	Epoch 2026 | loss: 0.031122 | val_loss: 0.0311819 | Time: 4554.47 ms
[2022-04-18 20:32:41	                main:574]	:	INFO	:	Epoch 2027 | loss: 0.0311165 | val_loss: 0.0311747 | Time: 4530 ms
[2022-04-18 20:32:46	                main:574]	:	INFO	:	Epoch 2028 | loss: 0.0311135 | val_loss: 0.0311759 | Time: 4672.26 ms
[2022-04-18 20:32:51	                main:574]	:	INFO	:	Epoch 2029 | loss: 0.0311137 | val_loss: 0.0311745 | Time: 4647.78 ms
[2022-04-18 20:32:55	                main:574]	:	INFO	:	Epoch 2030 | loss: 0.0311125 | val_loss: 0.0311723 | Time: 4567.17 ms
[2022-04-18 20:33:00	                main:574]	:	INFO	:	Epoch 2031 | loss: 0.0311089 | val_loss: 0.0311774 | Time: 4650.65 ms
[2022-04-18 20:33:05	                main:574]	:	INFO	:	Epoch 2032 | loss: 0.0311088 | val_loss: 0.0311736 | Time: 4610.12 ms
[2022-04-18 20:33:09	                main:574]	:	INFO	:	Epoch 2033 | loss: 0.0311166 | val_loss: 0.0311735 | Time: 4629.44 ms
[2022-04-18 20:33:14	                main:574]	:	INFO	:	Epoch 2034 | loss: 0.0311202 | val_loss: 0.0311716 | Time: 4544.71 ms
[2022-04-18 20:33:18	                main:574]	:	INFO	:	Epoch 2035 | loss: 0.0311165 | val_loss: 0.0311716 | Time: 4531.73 ms
[2022-04-18 20:33:23	                main:574]	:	INFO	:	Epoch 2036 | loss: 0.031114 | val_loss: 0.031174 | Time: 4551.06 ms
[2022-04-18 20:33:28	                main:574]	:	INFO	:	Epoch 2037 | loss: 0.0311124 | val_loss: 0.0311761 | Time: 4613.7 ms
[2022-04-18 20:33:32	                main:574]	:	INFO	:	Epoch 2038 | loss: 0.0311115 | val_loss: 0.0311755 | Time: 4773.32 ms
[2022-04-18 20:33:37	                main:574]	:	INFO	:	Epoch 2039 | loss: 0.0311173 | val_loss: 0.0311697 | Time: 4656.75 ms
[2022-04-18 20:33:42	                main:574]	:	INFO	:	Epoch 2040 | loss: 0.0311183 | val_loss: 0.0311715 | Time: 4595.42 ms
[2022-04-18 20:33:46	                main:574]	:	INFO	:	Epoch 2041 | loss: 0.0311188 | val_loss: 0.0311734 | Time: 4612.76 ms
[2022-04-18 20:33:51	                main:574]	:	INFO	:	Epoch 2042 | loss: 0.0311205 | val_loss: 0.0311741 | Time: 4660.13 ms
[2022-04-18 20:33:56	                main:574]	:	INFO	:	Epoch 2043 | loss: 0.0311158 | val_loss: 0.0311776 | Time: 4604.81 ms
[2022-04-18 20:34:00	                main:574]	:	INFO	:	Epoch 2044 | loss: 0.0311121 | val_loss: 0.031174 | Time: 4699.08 ms
[2022-04-18 20:34:05	                main:574]	:	INFO	:	Epoch 2045 | loss: 0.0311161 | val_loss: 0.031178 | Time: 4833.29 ms
[2022-04-18 20:34:10	                main:574]	:	INFO	:	Epoch 2046 | loss: 0.0311254 | val_loss: 0.0311809 | Time: 4503.03 ms
[2022-04-18 20:34:14	                main:574]	:	INFO	:	Epoch 2047 | loss: 0.0311212 | val_loss: 0.0311765 | Time: 4570.01 ms
[2022-04-18 20:34:19	                main:574]	:	INFO	:	Epoch 2048 | loss: 0.0311188 | val_loss: 0.0311763 | Time: 4537.94 ms
[2022-04-18 20:34:19	                main:597]	:	INFO	:	Saving trained model to model-final.pt, val_loss 0.0311763
[2022-04-18 20:34:19	                main:603]	:	INFO	:	Saving end state to config file
[2022-04-18 20:34:19	                main:608]	:	INFO	:	Success, exiting..
20:34:19 (194268): called boinc_finish(0)

</stderr_txt>
]]>
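The tail of the log shows the run exhausting its epoch budget: training loss and validation loss both plateau around 0.0311-0.0312, no early stop fires, and immediately after epoch 2048 the application writes the final weights to model-final.pt and exits. For readers unfamiliar with this kind of output, the sketch below shows the general shape of a loop that would emit the "Epoch N | loss | val_loss | Time" lines above. It is a minimal illustration only, not the MLC@Home client's actual code: train_one_epoch, evaluate, and save_model are hypothetical stand-ins, and the early-stop target is an assumed parameter included to show why a run like this one falls through to the epoch cap.

// Minimal sketch (not the project's real code) of a training loop that
// produces log lines like:
//   Epoch 2048 | loss: 0.0311188 | val_loss: 0.0311763 | Time: 4537.94 ms
#include <chrono>
#include <cstdio>

// Hypothetical placeholders for the real training and validation passes.
double train_one_epoch() { return 0.0311; }   // returns mean training loss
double evaluate()        { return 0.0312; }   // returns mean validation loss
void   save_model(const char* path, double val_loss) {
    std::printf("Saving trained model to %s, val_loss %g\n", path, val_loss);
}

int main() {
    const int    max_epochs = 2048;   // epoch cap; this run stops right after epoch 2048
    const double val_target = 1e-6;   // hypothetical early-stop target, never reached here
    double       val_loss   = 1.0;

    for (int epoch = 1; epoch <= max_epochs; ++epoch) {
        auto t0 = std::chrono::steady_clock::now();

        double loss = train_one_epoch();
        val_loss    = evaluate();

        auto t1 = std::chrono::steady_clock::now();
        double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();

        std::printf("Epoch %d | loss: %g | val_loss: %g | Time: %.2f ms\n",
                    epoch, loss, val_loss, ms);

        if (val_loss < val_target)    // would end the run early; never triggers above
            break;
    }

    // Epoch cap reached without hitting the target, as in the log above.
    save_model("model-final.pt", val_loss);
    return 0;
}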

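The final "called boinc_finish(0)" line is the standard way a BOINC science application reports a clean exit to the client. The sketch below shows the minimal lifecycle of such an application using the BOINC API calls boinc_init, boinc_fraction_done, and boinc_finish; the work performed between them (run_workunit here) is a hypothetical placeholder for the training shown above, and the snippet assumes the program is built against the BOINC API library.

// Minimal BOINC application skeleton; the science work is a hypothetical stub.
// Requires the BOINC API headers and library to build.
#include "boinc_api.h"

// Hypothetical placeholder for the actual workunit (model training, in this case).
static int run_workunit() {
    for (int step = 1; step <= 100; ++step) {
        // ... do a slice of work ...
        boinc_fraction_done(step / 100.0);  // report progress to the client
    }
    return 0;
}

int main() {
    // Attach to the BOINC client (sets up communication, checkpoint hints, etc.).
    int rc = boinc_init();
    if (rc) return rc;

    rc = run_workunit();

    // Report completion; a zero status marks the task as successful.
    // boinc_finish terminates the process and does not return.
    boinc_finish(rc);
    return rc;  // unreachable, kept to satisfy the compiler
}

Once boinc_finish is called, the client takes over: it uploads the task's output files (model-final.pt for this workunit) and reports the result to the project server, which is what produces the validated result recorded on this page.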

©2022 MLC@Home Team
A project of the Cognition, Robotics, and Learning (CORAL) Lab at the University of Maryland, Baltimore County (UMBC)