Task 13694139

Name ParityModified-1647048779-30988-1-0_0
Workunit 10784768
Created 24 Mar 2022, 5:34:10 UTC
Sent 1 Apr 2022, 8:42:44 UTC
Report deadline 9 Apr 2022, 8:42:44 UTC
Received 1 Apr 2022, 22:18:02 UTC
Server state Over
Outcome Success
Client state Done
Exit status 0 (0x00000000)
Computer ID 6170
Run time 5 hours 47 min 39 sec
CPU time 4 hours 45 min 26 sec
Validate state Valid
Credit 4,160.00
Device peak FLOPS 1,130.52 GFLOPS
Application version Machine Learning Dataset Generator (GPU) v9.75 (cuda10200) windows_x86_64
Peak working set size 1.62 GB
Peak swap size 3.57 GB
Peak disk usage 1.54 GB

Stderr output

<core_client_version>7.16.20</core_client_version>
<![CDATA[
<stderr_txt>
79]	:	INFO	:	    Learning Rate: 0.01
[2022-04-01 13:10:59	                main:480]	:	INFO	:	    Patience: 10
[2022-04-01 13:10:59	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-04-01 13:10:59	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-04-01 13:10:59	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-04-01 13:10:59	                main:484]	:	INFO	:	    # Threads: 1
[2022-04-01 13:10:59	                main:486]	:	INFO	:	Preparing Dataset
[2022-04-01 13:10:59	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-01 13:11:00	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-01 13:11:07	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-04-01 13:11:07	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-01 13:11:07	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-01 13:11:08	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-04-01 13:11:08	                main:494]	:	INFO	:	Creating Model
[2022-04-01 13:11:08	                main:507]	:	INFO	:	Preparing config file
[2022-04-01 13:11:08	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-04-01 13:11:08	                main:512]	:	INFO	:	Loading config
[2022-04-01 13:11:08	                main:514]	:	INFO	:	Loading state
[2022-04-01 13:11:11	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-04-01 13:11:11	                main:562]	:	INFO	:	Starting Training
[2022-04-01 13:11:26	                main:574]	:	INFO	:	Epoch 1729 | loss: 0.000528822 | val_loss: 0.000108228 | Time: 14922.7 ms
[2022-04-01 13:11:41	                main:574]	:	INFO	:	Epoch 1730 | loss: 7.55902e-05 | val_loss: 4.99893e-05 | Time: 15495.9 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GT 1030)
[2022-04-01 13:14:14	                main:435]	:	INFO	:	Set logging level to 1
[2022-04-01 13:14:14	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-04-01 13:14:14	                main:444]	:	INFO	:	Resolving all filenames
[2022-04-01 13:14:14	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-04-01 13:14:14	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-04-01 13:14:14	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-04-01 13:14:14	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-04-01 13:14:14	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-04-01 13:14:14	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-04-01 13:14:14	                main:474]	:	INFO	:	Configuration: 
[2022-04-01 13:14:14	                main:475]	:	INFO	:	    Model type: GRU
[2022-04-01 13:14:14	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-04-01 13:14:14	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-04-01 13:14:14	                main:478]	:	INFO	:	    Batch Size: 128
[2022-04-01 13:14:14	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-04-01 13:14:14	                main:480]	:	INFO	:	    Patience: 10
[2022-04-01 13:14:14	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-04-01 13:14:14	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-04-01 13:14:14	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-04-01 13:14:14	                main:484]	:	INFO	:	    # Threads: 1
[2022-04-01 13:14:14	                main:486]	:	INFO	:	Preparing Dataset
[2022-04-01 13:14:15	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-01 13:14:16	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-01 13:14:21	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-04-01 13:14:22	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-01 13:14:22	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-01 13:14:22	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-04-01 13:14:22	                main:494]	:	INFO	:	Creating Model
[2022-04-01 13:14:22	                main:507]	:	INFO	:	Preparing config file
[2022-04-01 13:14:22	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-04-01 13:14:23	                main:512]	:	INFO	:	Loading config
[2022-04-01 13:14:23	                main:514]	:	INFO	:	Loading state
[2022-04-01 13:14:25	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-04-01 13:14:25	                main:562]	:	INFO	:	Starting Training
[2022-04-01 13:14:39	                main:574]	:	INFO	:	Epoch 1729 | loss: 0.000538336 | val_loss: 5.49835e-05 | Time: 13294.6 ms
[2022-04-01 13:14:50	                main:574]	:	INFO	:	Epoch 1730 | loss: 7.43934e-05 | val_loss: 3.28561e-05 | Time: 11511.8 ms
[2022-04-01 13:15:00	                main:574]	:	INFO	:	Epoch 1731 | loss: 2.72274e-05 | val_loss: 2.2082e-05 | Time: 9996.57 ms
[2022-04-01 13:15:10	                main:574]	:	INFO	:	Epoch 1732 | loss: 1.77555e-05 | val_loss: 1.57402e-05 | Time: 9309.25 ms
[2022-04-01 13:15:18	                main:574]	:	INFO	:	Epoch 1733 | loss: 1.53226e-05 | val_loss: 1.42197e-05 | Time: 7859.2 ms
[2022-04-01 13:15:25	                main:574]	:	INFO	:	Epoch 1734 | loss: 1.48323e-05 | val_loss: 1.35932e-05 | Time: 7469.37 ms
[2022-04-01 13:15:32	                main:574]	:	INFO	:	Epoch 1735 | loss: 1.54071e-05 | val_loss: 1.40369e-05 | Time: 7124.06 ms
[2022-04-01 13:15:39	                main:574]	:	INFO	:	Epoch 1736 | loss: 1.51555e-05 | val_loss: 1.56324e-05 | Time: 6945.78 ms
[2022-04-01 13:15:46	                main:574]	:	INFO	:	Epoch 1737 | loss: 1.47108e-05 | val_loss: 1.34793e-05 | Time: 6530.03 ms
[2022-04-01 13:15:53	                main:574]	:	INFO	:	Epoch 1738 | loss: 1.46392e-05 | val_loss: 1.33316e-05 | Time: 6718.23 ms
[2022-04-01 13:16:04	                main:574]	:	INFO	:	Epoch 1739 | loss: 1.4646e-05 | val_loss: 1.32848e-05 | Time: 11645.3 ms
[2022-04-01 13:16:13	                main:574]	:	INFO	:	Epoch 1740 | loss: 1.46339e-05 | val_loss: 1.33653e-05 | Time: 8568.86 ms
[2022-04-01 13:16:21	                main:574]	:	INFO	:	Epoch 1741 | loss: 1.47227e-05 | val_loss: 1.35589e-05 | Time: 8005.57 ms
[2022-04-01 13:16:31	                main:574]	:	INFO	:	Epoch 1742 | loss: 1.4672e-05 | val_loss: 1.33902e-05 | Time: 10049.4 ms
[2022-04-01 13:16:41	                main:574]	:	INFO	:	Epoch 1743 | loss: 1.46731e-05 | val_loss: 1.33961e-05 | Time: 10018.4 ms
[2022-04-01 13:16:51	                main:574]	:	INFO	:	Epoch 1744 | loss: 1.46912e-05 | val_loss: 1.32569e-05 | Time: 9442.38 ms
[2022-04-01 13:16:57	                main:574]	:	INFO	:	Epoch 1745 | loss: 1.46435e-05 | val_loss: 1.32577e-05 | Time: 5928.96 ms
[2022-04-01 13:17:03	                main:574]	:	INFO	:	Epoch 1746 | loss: 1.46469e-05 | val_loss: 1.32994e-05 | Time: 5904.58 ms
[2022-04-01 13:17:13	                main:574]	:	INFO	:	Epoch 1747 | loss: 1.46265e-05 | val_loss: 1.34677e-05 | Time: 9609.05 ms
[2022-04-01 13:17:23	                main:574]	:	INFO	:	Epoch 1748 | loss: 1.46399e-05 | val_loss: 1.32398e-05 | Time: 9591.78 ms
[2022-04-01 13:17:32	                main:574]	:	INFO	:	Epoch 1749 | loss: 1.46206e-05 | val_loss: 1.33312e-05 | Time: 9122.77 ms
[2022-04-01 13:17:42	                main:574]	:	INFO	:	Epoch 1750 | loss: 1.45978e-05 | val_loss: 1.32692e-05 | Time: 9573.86 ms
[2022-04-01 13:17:53	                main:574]	:	INFO	:	Epoch 1751 | loss: 1.46444e-05 | val_loss: 1.32821e-05 | Time: 10815.7 ms
[2022-04-01 13:18:04	                main:574]	:	INFO	:	Epoch 1752 | loss: 1.46218e-05 | val_loss: 1.33973e-05 | Time: 10725.3 ms
[2022-04-01 13:18:14	                main:574]	:	INFO	:	Epoch 1753 | loss: 1.4637e-05 | val_loss: 1.32781e-05 | Time: 9601.24 ms
[2022-04-01 13:18:24	                main:574]	:	INFO	:	Epoch 1754 | loss: 1.47424e-05 | val_loss: 1.34965e-05 | Time: 10110.3 ms
[2022-04-01 13:18:35	                main:574]	:	INFO	:	Epoch 1755 | loss: 1.4617e-05 | val_loss: 1.34388e-05 | Time: 9948.75 ms
[2022-04-01 13:18:45	                main:574]	:	INFO	:	Epoch 1756 | loss: 1.46288e-05 | val_loss: 1.37479e-05 | Time: 10300.9 ms
[2022-04-01 13:18:52	                main:574]	:	INFO	:	Epoch 1757 | loss: 1.46713e-05 | val_loss: 1.35992e-05 | Time: 6782.04 ms
[2022-04-01 13:19:03	                main:574]	:	INFO	:	Epoch 1758 | loss: 1.47013e-05 | val_loss: 1.3918e-05 | Time: 10691.2 ms
[2022-04-01 13:19:09	                main:574]	:	INFO	:	Epoch 1759 | loss: 1.46502e-05 | val_loss: 1.35542e-05 | Time: 6196.53 ms
[2022-04-01 13:19:20	                main:574]	:	INFO	:	Epoch 1760 | loss: 1.46192e-05 | val_loss: 1.38118e-05 | Time: 9264.95 ms
[2022-04-01 13:19:33	                main:574]	:	INFO	:	Epoch 1761 | loss: 1.46396e-05 | val_loss: 1.39191e-05 | Time: 9438.32 ms
[2022-04-01 13:19:44	                main:574]	:	INFO	:	Epoch 1762 | loss: 1.46309e-05 | val_loss: 1.37839e-05 | Time: 9877.97 ms
[2022-04-01 13:19:56	                main:574]	:	INFO	:	Epoch 1763 | loss: 1.46976e-05 | val_loss: 1.38083e-05 | Time: 12444.5 ms
[2022-04-01 13:20:03	                main:574]	:	INFO	:	Epoch 1764 | loss: 1.47111e-05 | val_loss: 1.38699e-05 | Time: 7119.33 ms
[2022-04-01 13:20:14	                main:574]	:	INFO	:	Epoch 1765 | loss: 1.47477e-05 | val_loss: 1.38069e-05 | Time: 10029.9 ms
[2022-04-01 13:20:26	                main:574]	:	INFO	:	Epoch 1766 | loss: 1.47166e-05 | val_loss: 1.38986e-05 | Time: 11992 ms
[2022-04-01 13:20:33	                main:574]	:	INFO	:	Epoch 1767 | loss: 1.49139e-05 | val_loss: 1.36992e-05 | Time: 7440.55 ms
[2022-04-01 13:20:46	                main:574]	:	INFO	:	Epoch 1768 | loss: 1.47291e-05 | val_loss: 1.40843e-05 | Time: 12338.3 ms
[2022-04-01 13:20:58	                main:574]	:	INFO	:	Epoch 1769 | loss: 1.48219e-05 | val_loss: 1.42187e-05 | Time: 11966 ms
[2022-04-01 13:21:10	                main:574]	:	INFO	:	Epoch 1770 | loss: 1.48109e-05 | val_loss: 1.40954e-05 | Time: 11503.5 ms
[2022-04-01 13:21:17	                main:574]	:	INFO	:	Epoch 1771 | loss: 1.47989e-05 | val_loss: 1.3788e-05 | Time: 7131.03 ms
[2022-04-01 13:21:30	                main:574]	:	INFO	:	Epoch 1772 | loss: 1.48426e-05 | val_loss: 1.40974e-05 | Time: 12475.6 ms
[2022-04-01 13:21:41	                main:574]	:	INFO	:	Epoch 1773 | loss: 1.48153e-05 | val_loss: 1.46408e-05 | Time: 11521 ms
[2022-04-01 13:21:53	                main:574]	:	INFO	:	Epoch 1774 | loss: 1.82211e-05 | val_loss: 1.57214e-05 | Time: 11364.2 ms
[2022-04-01 13:22:01	                main:574]	:	INFO	:	Epoch 1775 | loss: 1.52135e-05 | val_loss: 1.39237e-05 | Time: 7598.05 ms
[2022-04-01 13:22:13	                main:574]	:	INFO	:	Epoch 1776 | loss: 1.47367e-05 | val_loss: 1.36459e-05 | Time: 12512.1 ms
[2022-04-01 13:22:27	                main:574]	:	INFO	:	Epoch 1777 | loss: 1.46312e-05 | val_loss: 1.38865e-05 | Time: 12894.2 ms
[2022-04-01 13:22:39	                main:574]	:	INFO	:	Epoch 1778 | loss: 1.46915e-05 | val_loss: 1.3776e-05 | Time: 11838.3 ms
[2022-04-01 13:22:49	                main:574]	:	INFO	:	Epoch 1779 | loss: 1.46501e-05 | val_loss: 1.3987e-05 | Time: 10252.8 ms
[2022-04-01 13:23:01	                main:574]	:	INFO	:	Epoch 1780 | loss: 1.46836e-05 | val_loss: 1.39019e-05 | Time: 11489.7 ms
[2022-04-01 13:23:12	                main:574]	:	INFO	:	Epoch 1781 | loss: 1.46256e-05 | val_loss: 1.38807e-05 | Time: 11608.6 ms
[2022-04-01 13:23:24	                main:574]	:	INFO	:	Epoch 1782 | loss: 1.75029e-05 | val_loss: 3.14289e-05 | Time: 11720.1 ms
[2022-04-01 13:23:34	                main:574]	:	INFO	:	Epoch 1783 | loss: 1.96892e-05 | val_loss: 1.37198e-05 | Time: 9894.71 ms
[2022-04-01 13:23:44	                main:574]	:	INFO	:	Epoch 1784 | loss: 1.50892e-05 | val_loss: 1.38597e-05 | Time: 9688.77 ms
[2022-04-01 13:23:56	                main:574]	:	INFO	:	Epoch 1785 | loss: 1.47145e-05 | val_loss: 1.39413e-05 | Time: 11819.9 ms
[2022-04-01 13:24:09	                main:574]	:	INFO	:	Epoch 1786 | loss: 1.46428e-05 | val_loss: 1.38005e-05 | Time: 11972.5 ms
[2022-04-01 13:24:21	                main:574]	:	INFO	:	Epoch 1787 | loss: 1.46263e-05 | val_loss: 1.38079e-05 | Time: 12029.1 ms
[2022-04-01 13:24:34	                main:574]	:	INFO	:	Epoch 1788 | loss: 1.47992e-05 | val_loss: 1.40053e-05 | Time: 12254 ms
[2022-04-01 13:24:47	                main:574]	:	INFO	:	Epoch 1789 | loss: 1.48217e-05 | val_loss: 1.41661e-05 | Time: 12643 ms
[2022-04-01 13:25:00	                main:574]	:	INFO	:	Epoch 1790 | loss: 1.46183e-05 | val_loss: 1.38452e-05 | Time: 12451.7 ms
[2022-04-01 13:25:11	                main:574]	:	INFO	:	Epoch 1791 | loss: 1.46383e-05 | val_loss: 1.3877e-05 | Time: 11166.9 ms
[2022-04-01 13:25:24	                main:574]	:	INFO	:	Epoch 1792 | loss: 1.48061e-05 | val_loss: 1.48006e-05 | Time: 12787.4 ms
[2022-04-01 13:25:37	                main:574]	:	INFO	:	Epoch 1793 | loss: 1.62815e-05 | val_loss: 1.43132e-05 | Time: 12332.2 ms
[2022-04-01 13:25:48	                main:574]	:	INFO	:	Epoch 1794 | loss: 1.50586e-05 | val_loss: 1.44861e-05 | Time: 11561.2 ms
[2022-04-01 13:26:00	                main:574]	:	INFO	:	Epoch 1795 | loss: 1.47613e-05 | val_loss: 1.42637e-05 | Time: 11484 ms
[2022-04-01 13:26:12	                main:574]	:	INFO	:	Epoch 1796 | loss: 1.5184e-05 | val_loss: 1.51652e-05 | Time: 11740.2 ms
[2022-04-01 13:26:22	                main:574]	:	INFO	:	Epoch 1797 | loss: 1.69908e-05 | val_loss: 1.41443e-05 | Time: 9200.1 ms
[2022-04-01 13:26:28	                main:574]	:	INFO	:	Epoch 1798 | loss: 1.49971e-05 | val_loss: 1.40547e-05 | Time: 5912.53 ms
[2022-04-01 13:26:34	                main:574]	:	INFO	:	Epoch 1799 | loss: 1.47362e-05 | val_loss: 1.37937e-05 | Time: 5782.15 ms
[2022-04-01 13:26:39	                main:574]	:	INFO	:	Epoch 1800 | loss: 1.52668e-05 | val_loss: 1.60132e-05 | Time: 5782.73 ms
[2022-04-01 13:26:45	                main:574]	:	INFO	:	Epoch 1801 | loss: 1.5889e-05 | val_loss: 1.39261e-05 | Time: 5728.76 ms
[2022-04-01 13:26:55	                main:574]	:	INFO	:	Epoch 1802 | loss: 1.55573e-05 | val_loss: 1.62992e-05 | Time: 9542.44 ms
[2022-04-01 13:27:06	                main:574]	:	INFO	:	Epoch 1803 | loss: 1.57506e-05 | val_loss: 1.45752e-05 | Time: 9788.22 ms
[2022-04-01 13:27:17	                main:574]	:	INFO	:	Epoch 1804 | loss: 1.48062e-05 | val_loss: 1.38666e-05 | Time: 11472.5 ms
[2022-04-01 13:27:28	                main:574]	:	INFO	:	Epoch 1805 | loss: 1.51925e-05 | val_loss: 1.45336e-05 | Time: 10962 ms
[2022-04-01 13:27:35	                main:574]	:	INFO	:	Epoch 1806 | loss: 1.61807e-05 | val_loss: 1.75029e-05 | Time: 6469.54 ms
[2022-04-01 13:27:46	                main:574]	:	INFO	:	Epoch 1807 | loss: 1.59512e-05 | val_loss: 1.45999e-05 | Time: 10630.1 ms
[2022-04-01 13:27:59	                main:574]	:	INFO	:	Epoch 1808 | loss: 1.50857e-05 | val_loss: 1.38924e-05 | Time: 12187.3 ms
[2022-04-01 13:28:09	                main:574]	:	INFO	:	Epoch 1809 | loss: 1.50185e-05 | val_loss: 1.40273e-05 | Time: 9744 ms
[2022-04-01 13:28:15	                main:574]	:	INFO	:	Epoch 1810 | loss: 1.49376e-05 | val_loss: 1.40804e-05 | Time: 5900.75 ms
[2022-04-01 13:28:21	                main:574]	:	INFO	:	Epoch 1811 | loss: 1.51444e-05 | val_loss: 1.49316e-05 | Time: 5516.29 ms
[2022-04-01 13:28:26	                main:574]	:	INFO	:	Epoch 1812 | loss: 1.53963e-05 | val_loss: 1.44161e-05 | Time: 5374.61 ms
[2022-04-01 13:28:32	                main:574]	:	INFO	:	Epoch 1813 | loss: 1.58786e-05 | val_loss: 1.37568e-05 | Time: 5509.98 ms
[2022-04-01 13:28:37	                main:574]	:	INFO	:	Epoch 1814 | loss: 1.51591e-05 | val_loss: 1.47438e-05 | Time: 5552.96 ms
[2022-04-01 13:28:49	                main:574]	:	INFO	:	Epoch 1815 | loss: 1.61051e-05 | val_loss: 1.57128e-05 | Time: 11930.1 ms
[2022-04-01 13:29:01	                main:574]	:	INFO	:	Epoch 1816 | loss: 1.59559e-05 | val_loss: 1.37762e-05 | Time: 11614 ms
[2022-04-01 13:29:13	                main:574]	:	INFO	:	Epoch 1817 | loss: 1.4933e-05 | val_loss: 1.3912e-05 | Time: 11472.4 ms
[2022-04-01 13:29:20	                main:574]	:	INFO	:	Epoch 1818 | loss: 1.53342e-05 | val_loss: 1.45342e-05 | Time: 7343.97 ms
[2022-04-01 13:29:26	                main:574]	:	INFO	:	Epoch 1819 | loss: 1.48796e-05 | val_loss: 1.36541e-05 | Time: 5854.66 ms
[2022-04-01 13:29:36	                main:574]	:	INFO	:	Epoch 1820 | loss: 1.885e-05 | val_loss: 9.44862e-05 | Time: 9137.55 ms
[2022-04-01 13:29:42	                main:574]	:	INFO	:	Epoch 1821 | loss: 0.000511307 | val_loss: 0.000676337 | Time: 5721.95 ms
[2022-04-01 13:29:47	                main:574]	:	INFO	:	Epoch 1822 | loss: 0.000313322 | val_loss: 0.000112614 | Time: 5464.67 ms
[2022-04-01 13:29:53	                main:574]	:	INFO	:	Epoch 1823 | loss: 7.71277e-05 | val_loss: 5.46489e-05 | Time: 5527.91 ms
[2022-04-01 13:29:58	                main:574]	:	INFO	:	Epoch 1824 | loss: 3.9039e-05 | val_loss: 4.14787e-05 | Time: 5452.32 ms
[2022-04-01 13:30:04	                main:574]	:	INFO	:	Epoch 1825 | loss: 2.71088e-05 | val_loss: 2.1805e-05 | Time: 5574.74 ms
[2022-04-01 13:30:09	                main:574]	:	INFO	:	Epoch 1826 | loss: 1.94073e-05 | val_loss: 1.80388e-05 | Time: 5507.93 ms
[2022-04-01 13:30:15	                main:574]	:	INFO	:	Epoch 1827 | loss: 1.68678e-05 | val_loss: 1.74046e-05 | Time: 5718.68 ms
[2022-04-01 13:30:28	                main:574]	:	INFO	:	Epoch 1828 | loss: 1.58704e-05 | val_loss: 1.64897e-05 | Time: 12937.5 ms
[2022-04-01 13:30:40	                main:574]	:	INFO	:	Epoch 1829 | loss: 1.56084e-05 | val_loss: 2.00262e-05 | Time: 11429.7 ms
[2022-04-01 13:30:47	                main:574]	:	INFO	:	Epoch 1830 | loss: 1.7163e-05 | val_loss: 1.68856e-05 | Time: 6724.79 ms
[2022-04-01 13:30:52	                main:574]	:	INFO	:	Epoch 1831 | loss: 1.63952e-05 | val_loss: 1.52245e-05 | Time: 5631.31 ms
[2022-04-01 13:30:58	                main:574]	:	INFO	:	Epoch 1832 | loss: 1.53525e-05 | val_loss: 1.53022e-05 | Time: 5577.05 ms
[2022-04-01 13:31:04	                main:574]	:	INFO	:	Epoch 1833 | loss: 1.54234e-05 | val_loss: 1.52421e-05 | Time: 5669.63 ms
[2022-04-01 13:31:09	                main:574]	:	INFO	:	Epoch 1834 | loss: 1.52892e-05 | val_loss: 1.51783e-05 | Time: 5582.41 ms
[2022-04-01 13:31:15	                main:574]	:	INFO	:	Epoch 1835 | loss: 1.52314e-05 | val_loss: 1.50283e-05 | Time: 5431.97 ms
[2022-04-01 13:31:20	                main:574]	:	INFO	:	Epoch 1836 | loss: 1.51882e-05 | val_loss: 1.51296e-05 | Time: 5466.21 ms
[2022-04-01 13:31:26	                main:574]	:	INFO	:	Epoch 1837 | loss: 1.51548e-05 | val_loss: 1.49971e-05 | Time: 5364.88 ms
[2022-04-01 13:31:32	                main:574]	:	INFO	:	Epoch 1838 | loss: 1.51327e-05 | val_loss: 1.50488e-05 | Time: 6018.19 ms
[2022-04-01 13:31:42	                main:574]	:	INFO	:	Epoch 1839 | loss: 1.51032e-05 | val_loss: 1.52243e-05 | Time: 10396.7 ms
[2022-04-01 13:31:48	                main:574]	:	INFO	:	Epoch 1840 | loss: 1.49527e-05 | val_loss: 1.51777e-05 | Time: 5752.46 ms
[2022-04-01 13:31:58	                main:574]	:	INFO	:	Epoch 1841 | loss: 1.48522e-05 | val_loss: 1.51823e-05 | Time: 9422.4 ms
[2022-04-01 13:32:03	                main:574]	:	INFO	:	Epoch 1842 | loss: 1.47901e-05 | val_loss: 1.50351e-05 | Time: 5850.72 ms
[2022-04-01 13:32:13	                main:574]	:	INFO	:	Epoch 1843 | loss: 1.47775e-05 | val_loss: 1.50235e-05 | Time: 9025.63 ms
[2022-04-01 13:32:18	                main:574]	:	INFO	:	Epoch 1844 | loss: 1.47761e-05 | val_loss: 1.50354e-05 | Time: 5758.56 ms
[2022-04-01 13:32:24	                main:574]	:	INFO	:	Epoch 1845 | loss: 1.47748e-05 | val_loss: 1.49907e-05 | Time: 5467.95 ms
[2022-04-01 13:32:30	                main:574]	:	INFO	:	Epoch 1846 | loss: 1.47767e-05 | val_loss: 1.50165e-05 | Time: 5548.95 ms
[2022-04-01 13:32:35	                main:574]	:	INFO	:	Epoch 1847 | loss: 1.47696e-05 | val_loss: 1.49998e-05 | Time: 5455.9 ms
[2022-04-01 13:32:46	                main:574]	:	INFO	:	Epoch 1848 | loss: 1.47665e-05 | val_loss: 1.4998e-05 | Time: 11074.2 ms
[2022-04-01 13:32:52	                main:574]	:	INFO	:	Epoch 1849 | loss: 1.47544e-05 | val_loss: 1.50347e-05 | Time: 5748.79 ms
[2022-04-01 13:33:04	                main:574]	:	INFO	:	Epoch 1850 | loss: 1.47665e-05 | val_loss: 1.49534e-05 | Time: 11476.8 ms
[2022-04-01 13:33:13	                main:574]	:	INFO	:	Epoch 1851 | loss: 1.47556e-05 | val_loss: 1.50295e-05 | Time: 8768 ms
[2022-04-01 13:33:24	                main:574]	:	INFO	:	Epoch 1852 | loss: 1.47415e-05 | val_loss: 1.49968e-05 | Time: 11443.1 ms
[2022-04-01 13:33:33	                main:574]	:	INFO	:	Epoch 1853 | loss: 1.47492e-05 | val_loss: 1.50099e-05 | Time: 8696.23 ms
[2022-04-01 13:33:39	                main:574]	:	INFO	:	Epoch 1854 | loss: 1.47532e-05 | val_loss: 1.5032e-05 | Time: 6358.81 ms
[2022-04-01 13:33:45	                main:574]	:	INFO	:	Epoch 1855 | loss: 1.47464e-05 | val_loss: 1.49756e-05 | Time: 5590.05 ms
[2022-04-01 13:33:50	                main:574]	:	INFO	:	Epoch 1856 | loss: 1.4773e-05 | val_loss: 1.49284e-05 | Time: 5467.63 ms
[2022-04-01 13:33:56	                main:574]	:	INFO	:	Epoch 1857 | loss: 1.47515e-05 | val_loss: 1.50207e-05 | Time: 5651.61 ms
[2022-04-01 13:34:02	                main:574]	:	INFO	:	Epoch 1858 | loss: 1.47448e-05 | val_loss: 1.50154e-05 | Time: 5604.22 ms
[2022-04-01 13:34:08	                main:574]	:	INFO	:	Epoch 1859 | loss: 1.47546e-05 | val_loss: 1.50348e-05 | Time: 5826.86 ms
[2022-04-01 13:34:13	                main:574]	:	INFO	:	Epoch 1860 | loss: 1.47354e-05 | val_loss: 1.49821e-05 | Time: 5620.46 ms
[2022-04-01 13:34:19	                main:574]	:	INFO	:	Epoch 1861 | loss: 1.4783e-05 | val_loss: 1.50042e-05 | Time: 5524.94 ms
[2022-04-01 13:34:24	                main:574]	:	INFO	:	Epoch 1862 | loss: 1.47613e-05 | val_loss: 1.50301e-05 | Time: 5654.92 ms
[2022-04-01 13:34:30	                main:574]	:	INFO	:	Epoch 1863 | loss: 1.4756e-05 | val_loss: 1.49451e-05 | Time: 5713.45 ms
[2022-04-01 13:34:36	                main:574]	:	INFO	:	Epoch 1864 | loss: 1.47268e-05 | val_loss: 1.49925e-05 | Time: 5633.94 ms
[2022-04-01 13:34:42	                main:574]	:	INFO	:	Epoch 1865 | loss: 1.47412e-05 | val_loss: 1.50306e-05 | Time: 5660.8 ms
[2022-04-01 13:34:47	                main:574]	:	INFO	:	Epoch 1866 | loss: 1.4743e-05 | val_loss: 1.50258e-05 | Time: 5751.82 ms
[2022-04-01 13:34:53	                main:574]	:	INFO	:	Epoch 1867 | loss: 1.47508e-05 | val_loss: 1.49663e-05 | Time: 5557.26 ms
[2022-04-01 13:34:59	                main:574]	:	INFO	:	Epoch 1868 | loss: 1.478e-05 | val_loss: 1.50204e-05 | Time: 5530.13 ms
[2022-04-01 13:35:04	                main:574]	:	INFO	:	Epoch 1869 | loss: 1.47298e-05 | val_loss: 1.49654e-05 | Time: 5718.23 ms
[2022-04-01 13:35:10	                main:574]	:	INFO	:	Epoch 1870 | loss: 1.47511e-05 | val_loss: 1.49762e-05 | Time: 5511.55 ms
[2022-04-01 13:35:15	                main:574]	:	INFO	:	Epoch 1871 | loss: 1.47462e-05 | val_loss: 1.50149e-05 | Time: 5551.59 ms
[2022-04-01 13:35:21	                main:574]	:	INFO	:	Epoch 1872 | loss: 1.47844e-05 | val_loss: 1.50184e-05 | Time: 5483.34 ms
[2022-04-01 13:35:27	                main:574]	:	INFO	:	Epoch 1873 | loss: 1.47494e-05 | val_loss: 1.49497e-05 | Time: 5616.12 ms
[2022-04-01 13:35:32	                main:574]	:	INFO	:	Epoch 1874 | loss: 1.47644e-05 | val_loss: 1.49673e-05 | Time: 5538.81 ms
[2022-04-01 13:35:38	                main:574]	:	INFO	:	Epoch 1875 | loss: 1.47596e-05 | val_loss: 1.51712e-05 | Time: 5664.7 ms
[2022-04-01 13:35:44	                main:574]	:	INFO	:	Epoch 1876 | loss: 1.54488e-05 | val_loss: 1.50852e-05 | Time: 5632.57 ms
[2022-04-01 13:35:49	                main:574]	:	INFO	:	Epoch 1877 | loss: 1.48772e-05 | val_loss: 1.50928e-05 | Time: 5613.01 ms
[2022-04-01 13:35:55	                main:574]	:	INFO	:	Epoch 1878 | loss: 1.47614e-05 | val_loss: 1.49601e-05 | Time: 5498.97 ms
[2022-04-01 13:36:05	                main:574]	:	INFO	:	Epoch 1879 | loss: 1.47606e-05 | val_loss: 1.49556e-05 | Time: 10431.4 ms
[2022-04-01 13:36:13	                main:574]	:	INFO	:	Epoch 1880 | loss: 1.50838e-05 | val_loss: 1.57284e-05 | Time: 7203.51 ms
[2022-04-01 13:36:25	                main:574]	:	INFO	:	Epoch 1881 | loss: 3.95986e-05 | val_loss: 2.74924e-05 | Time: 12702.8 ms
[2022-04-01 13:36:34	                main:574]	:	INFO	:	Epoch 1882 | loss: 1.87539e-05 | val_loss: 1.53635e-05 | Time: 8440.82 ms
[2022-04-01 13:36:41	                main:574]	:	INFO	:	Epoch 1883 | loss: 1.63604e-05 | val_loss: 1.94942e-05 | Time: 6665.46 ms
[2022-04-01 13:36:46	                main:574]	:	INFO	:	Epoch 1884 | loss: 1.71481e-05 | val_loss: 1.48642e-05 | Time: 5654.78 ms
[2022-04-01 13:36:52	                main:574]	:	INFO	:	Epoch 1885 | loss: 1.50684e-05 | val_loss: 1.40716e-05 | Time: 5429.62 ms
[2022-04-01 13:36:57	                main:574]	:	INFO	:	Epoch 1886 | loss: 1.47826e-05 | val_loss: 1.41491e-05 | Time: 5601.7 ms
[2022-04-01 13:37:03	                main:574]	:	INFO	:	Epoch 1887 | loss: 1.48019e-05 | val_loss: 1.41477e-05 | Time: 5639.46 ms
[2022-04-01 13:37:09	                main:574]	:	INFO	:	Epoch 1888 | loss: 1.47552e-05 | val_loss: 1.41579e-05 | Time: 5734.44 ms
[2022-04-01 13:37:19	                main:574]	:	INFO	:	Epoch 1889 | loss: 1.47559e-05 | val_loss: 1.41156e-05 | Time: 9781.2 ms
[2022-04-01 13:37:25	                main:574]	:	INFO	:	Epoch 1890 | loss: 1.47033e-05 | val_loss: 1.40617e-05 | Time: 6409.64 ms
[2022-04-01 13:37:31	                main:574]	:	INFO	:	Epoch 1891 | loss: 1.47996e-05 | val_loss: 1.42466e-05 | Time: 5926.67 ms
[2022-04-01 13:37:43	                main:574]	:	INFO	:	Epoch 1892 | loss: 1.47439e-05 | val_loss: 1.39959e-05 | Time: 11312.4 ms
[2022-04-01 13:37:52	                main:574]	:	INFO	:	Epoch 1893 | loss: 1.47913e-05 | val_loss: 1.45613e-05 | Time: 9262.77 ms
[2022-04-01 13:38:01	                main:574]	:	INFO	:	Epoch 1894 | loss: 1.48893e-05 | val_loss: 1.42561e-05 | Time: 8852.82 ms
[2022-04-01 13:38:13	                main:574]	:	INFO	:	Epoch 1895 | loss: 1.57685e-05 | val_loss: 1.48721e-05 | Time: 12446.1 ms
[2022-04-01 13:38:23	                main:574]	:	INFO	:	Epoch 1896 | loss: 1.59595e-05 | val_loss: 1.77869e-05 | Time: 9862.69 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GT 1030)
[2022-04-01 13:42:50	                main:435]	:	INFO	:	Set logging level to 1
[2022-04-01 13:42:50	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-04-01 13:42:50	                main:444]	:	INFO	:	Resolving all filenames
[2022-04-01 13:42:50	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-04-01 13:42:50	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-04-01 13:42:50	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-04-01 13:42:50	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-04-01 13:42:51	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-04-01 13:42:51	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-04-01 13:42:51	                main:474]	:	INFO	:	Configuration: 
[2022-04-01 13:42:51	                main:475]	:	INFO	:	    Model type: GRU
[2022-04-01 13:42:51	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-04-01 13:42:51	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-04-01 13:42:51	                main:478]	:	INFO	:	    Batch Size: 128
[2022-04-01 13:42:51	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-04-01 13:42:51	                main:480]	:	INFO	:	    Patience: 10
[2022-04-01 13:42:51	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-04-01 13:42:51	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-04-01 13:42:51	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-04-01 13:42:51	                main:484]	:	INFO	:	    # Threads: 1
[2022-04-01 13:42:51	                main:486]	:	INFO	:	Preparing Dataset
[2022-04-01 13:42:51	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-01 13:42:52	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-01 13:42:59	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-04-01 13:42:59	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-01 13:42:59	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-01 13:43:00	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-04-01 13:43:00	                main:494]	:	INFO	:	Creating Model
[2022-04-01 13:43:00	                main:507]	:	INFO	:	Preparing config file
[2022-04-01 13:43:00	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-04-01 13:43:00	                main:512]	:	INFO	:	Loading config
[2022-04-01 13:43:00	                main:514]	:	INFO	:	Loading state
[2022-04-01 13:43:03	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-04-01 13:43:03	                main:562]	:	INFO	:	Starting Training
[2022-04-01 13:43:18	                main:574]	:	INFO	:	Epoch 1895 | loss: 0.000638552 | val_loss: 5.8206e-05 | Time: 15263.2 ms
[2022-04-01 13:43:27	                main:574]	:	INFO	:	Epoch 1896 | loss: 8.59359e-05 | val_loss: 3.47692e-05 | Time: 8215.55 ms
[2022-04-01 13:43:34	                main:574]	:	INFO	:	Epoch 1897 | loss: 2.83189e-05 | val_loss: 1.78784e-05 | Time: 7014.43 ms
[2022-04-01 13:43:40	                main:574]	:	INFO	:	Epoch 1898 | loss: 1.72178e-05 | val_loss: 1.48149e-05 | Time: 6436.6 ms
[2022-04-01 13:43:46	                main:574]	:	INFO	:	Epoch 1899 | loss: 1.55208e-05 | val_loss: 1.44411e-05 | Time: 6073.81 ms
[2022-04-01 13:43:53	                main:574]	:	INFO	:	Epoch 1900 | loss: 1.58919e-05 | val_loss: 1.4792e-05 | Time: 6269.93 ms
[2022-04-01 13:43:59	                main:574]	:	INFO	:	Epoch 1901 | loss: 1.55904e-05 | val_loss: 1.50759e-05 | Time: 6084.24 ms
[2022-04-01 13:44:05	                main:574]	:	INFO	:	Epoch 1902 | loss: 1.58382e-05 | val_loss: 1.44882e-05 | Time: 6052.39 ms
[2022-04-01 13:44:11	                main:574]	:	INFO	:	Epoch 1903 | loss: 1.60628e-05 | val_loss: 1.40898e-05 | Time: 6034.59 ms
[2022-04-01 13:44:17	                main:574]	:	INFO	:	Epoch 1904 | loss: 1.55396e-05 | val_loss: 1.49929e-05 | Time: 6093.29 ms
[2022-04-01 13:44:30	                main:574]	:	INFO	:	Epoch 1905 | loss: 1.49342e-05 | val_loss: 1.44974e-05 | Time: 12579.1 ms
[2022-04-01 13:44:38	                main:574]	:	INFO	:	Epoch 1906 | loss: 1.50049e-05 | val_loss: 1.37474e-05 | Time: 8031.84 ms
[2022-04-01 13:44:44	                main:574]	:	INFO	:	Epoch 1907 | loss: 1.48038e-05 | val_loss: 1.47675e-05 | Time: 6590.52 ms
[2022-04-01 13:44:51	                main:574]	:	INFO	:	Epoch 1908 | loss: 1.47965e-05 | val_loss: 1.30893e-05 | Time: 6523.1 ms
[2022-04-01 13:45:05	                main:574]	:	INFO	:	Epoch 1909 | loss: 1.46579e-05 | val_loss: 1.33133e-05 | Time: 13711.1 ms
[2022-04-01 13:45:17	                main:574]	:	INFO	:	Epoch 1910 | loss: 1.46537e-05 | val_loss: 1.31424e-05 | Time: 11692.3 ms
[2022-04-01 13:45:26	                main:574]	:	INFO	:	Epoch 1911 | loss: 1.46295e-05 | val_loss: 1.33094e-05 | Time: 9184.79 ms
[2022-04-01 13:45:33	                main:574]	:	INFO	:	Epoch 1912 | loss: 1.46016e-05 | val_loss: 1.30997e-05 | Time: 7580.83 ms
[2022-04-01 13:45:40	                main:574]	:	INFO	:	Epoch 1913 | loss: 1.45949e-05 | val_loss: 1.32943e-05 | Time: 6852.7 ms
[2022-04-01 13:45:47	                main:574]	:	INFO	:	Epoch 1914 | loss: 1.46481e-05 | val_loss: 1.3129e-05 | Time: 6221.16 ms
[2022-04-01 13:45:53	                main:574]	:	INFO	:	Epoch 1915 | loss: 1.46439e-05 | val_loss: 1.30932e-05 | Time: 6253.86 ms
[2022-04-01 13:45:59	                main:574]	:	INFO	:	Epoch 1916 | loss: 1.46305e-05 | val_loss: 1.30981e-05 | Time: 6116.04 ms
[2022-04-01 13:46:05	                main:574]	:	INFO	:	Epoch 1917 | loss: 1.47095e-05 | val_loss: 1.31875e-05 | Time: 6207.55 ms
[2022-04-01 13:46:12	                main:574]	:	INFO	:	Epoch 1918 | loss: 1.46235e-05 | val_loss: 1.32555e-05 | Time: 6318.45 ms
[2022-04-01 13:46:18	                main:574]	:	INFO	:	Epoch 1919 | loss: 1.46165e-05 | val_loss: 1.32124e-05 | Time: 6082.86 ms
[2022-04-01 13:46:24	                main:574]	:	INFO	:	Epoch 1920 | loss: 1.46457e-05 | val_loss: 1.32941e-05 | Time: 6255.84 ms
[2022-04-01 13:46:30	                main:574]	:	INFO	:	Epoch 1921 | loss: 1.46414e-05 | val_loss: 1.32716e-05 | Time: 6194.28 ms
[2022-04-01 13:46:43	                main:574]	:	INFO	:	Epoch 1922 | loss: 1.46302e-05 | val_loss: 1.31579e-05 | Time: 12327.9 ms
[2022-04-01 13:46:51	                main:574]	:	INFO	:	Epoch 1923 | loss: 1.46153e-05 | val_loss: 1.31522e-05 | Time: 8116.87 ms
[2022-04-01 13:46:57	                main:574]	:	INFO	:	Epoch 1924 | loss: 1.4624e-05 | val_loss: 1.31647e-05 | Time: 6396.59 ms
[2022-04-01 13:47:04	                main:574]	:	INFO	:	Epoch 1925 | loss: 1.46095e-05 | val_loss: 1.3263e-05 | Time: 6250.99 ms
[2022-04-01 13:47:10	                main:574]	:	INFO	:	Epoch 1926 | loss: 1.45935e-05 | val_loss: 1.31399e-05 | Time: 6356.18 ms
[2022-04-01 13:47:16	                main:574]	:	INFO	:	Epoch 1927 | loss: 1.4627e-05 | val_loss: 1.31554e-05 | Time: 6292.85 ms
[2022-04-01 13:47:23	                main:574]	:	INFO	:	Epoch 1928 | loss: 1.45926e-05 | val_loss: 1.32109e-05 | Time: 6198.96 ms
[2022-04-01 13:47:29	                main:574]	:	INFO	:	Epoch 1929 | loss: 1.45765e-05 | val_loss: 1.34582e-05 | Time: 6223.96 ms
[2022-04-01 13:47:35	                main:574]	:	INFO	:	Epoch 1930 | loss: 1.46652e-05 | val_loss: 1.32788e-05 | Time: 6327.06 ms
[2022-04-01 13:47:50	                main:574]	:	INFO	:	Epoch 1931 | loss: 1.46568e-05 | val_loss: 1.3234e-05 | Time: 14158.9 ms
[2022-04-01 13:48:01	                main:574]	:	INFO	:	Epoch 1932 | loss: 1.4612e-05 | val_loss: 1.31948e-05 | Time: 11436 ms
[2022-04-01 13:48:10	                main:574]	:	INFO	:	Epoch 1933 | loss: 1.45969e-05 | val_loss: 1.32496e-05 | Time: 8713.75 ms
[2022-04-01 13:48:17	                main:574]	:	INFO	:	Epoch 1934 | loss: 1.46139e-05 | val_loss: 1.34667e-05 | Time: 7402.52 ms
[2022-04-01 13:48:25	                main:574]	:	INFO	:	Epoch 1935 | loss: 1.4592e-05 | val_loss: 1.33104e-05 | Time: 7298.69 ms
[2022-04-01 13:48:32	                main:574]	:	INFO	:	Epoch 1936 | loss: 1.45823e-05 | val_loss: 1.32878e-05 | Time: 7072.04 ms
[2022-04-01 13:48:39	                main:574]	:	INFO	:	Epoch 1937 | loss: 1.45914e-05 | val_loss: 1.32961e-05 | Time: 6968.49 ms
[2022-04-01 13:48:45	                main:574]	:	INFO	:	Epoch 1938 | loss: 1.4593e-05 | val_loss: 1.35285e-05 | Time: 6343.45 ms
[2022-04-01 13:48:51	                main:574]	:	INFO	:	Epoch 1939 | loss: 1.46444e-05 | val_loss: 1.34838e-05 | Time: 6116.37 ms
[2022-04-01 13:48:57	                main:574]	:	INFO	:	Epoch 1940 | loss: 1.45905e-05 | val_loss: 1.33378e-05 | Time: 6099.86 ms
[2022-04-01 13:49:04	                main:574]	:	INFO	:	Epoch 1941 | loss: 1.46336e-05 | val_loss: 1.33465e-05 | Time: 6136.88 ms
[2022-04-01 13:49:10	                main:574]	:	INFO	:	Epoch 1942 | loss: 1.46193e-05 | val_loss: 1.32753e-05 | Time: 6262.09 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GT 1030)
[2022-04-01 13:51:28	                main:435]	:	INFO	:	Set logging level to 1
[2022-04-01 13:51:29	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-04-01 13:51:29	                main:444]	:	INFO	:	Resolving all filenames
[2022-04-01 13:51:29	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-04-01 13:51:29	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-04-01 13:51:29	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-04-01 13:51:29	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-04-01 13:51:29	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-04-01 13:51:29	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-04-01 13:51:29	                main:474]	:	INFO	:	Configuration: 
[2022-04-01 13:51:29	                main:475]	:	INFO	:	    Model type: GRU
[2022-04-01 13:51:29	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-04-01 13:51:29	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-04-01 13:51:29	                main:478]	:	INFO	:	    Batch Size: 128
[2022-04-01 13:51:29	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-04-01 13:51:29	                main:480]	:	INFO	:	    Patience: 10
[2022-04-01 13:51:29	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-04-01 13:51:29	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-04-01 13:51:29	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-04-01 13:51:29	                main:484]	:	INFO	:	    # Threads: 1
[2022-04-01 13:51:29	                main:486]	:	INFO	:	Preparing Dataset
[2022-04-01 13:51:29	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-01 13:51:30	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-01 13:51:37	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-04-01 13:51:37	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-01 13:51:37	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-01 13:51:38	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-04-01 13:51:38	                main:494]	:	INFO	:	Creating Model
[2022-04-01 13:51:38	                main:507]	:	INFO	:	Preparing config file
[2022-04-01 13:51:38	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-04-01 13:51:38	                main:512]	:	INFO	:	Loading config
[2022-04-01 13:51:38	                main:514]	:	INFO	:	Loading state
[2022-04-01 13:51:41	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-04-01 13:51:41	                main:562]	:	INFO	:	Starting Training
[2022-04-01 13:51:58	                main:574]	:	INFO	:	Epoch 1942 | loss: 0.000491533 | val_loss: 8.59576e-05 | Time: 16874.9 ms
[2022-04-01 13:52:12	                main:574]	:	INFO	:	Epoch 1943 | loss: 7.55496e-05 | val_loss: 2.75258e-05 | Time: 14849 ms
[2022-04-01 13:52:25	                main:574]	:	INFO	:	Epoch 1944 | loss: 2.64912e-05 | val_loss: 2.14101e-05 | Time: 12365.6 ms
[2022-04-01 13:52:37	                main:574]	:	INFO	:	Epoch 1945 | loss: 1.77888e-05 | val_loss: 1.43359e-05 | Time: 12513.9 ms
[2022-04-01 13:52:49	                main:574]	:	INFO	:	Epoch 1946 | loss: 1.53677e-05 | val_loss: 1.39332e-05 | Time: 11394.3 ms
[2022-04-01 13:52:59	                main:574]	:	INFO	:	Epoch 1947 | loss: 1.47416e-05 | val_loss: 1.35723e-05 | Time: 10322.9 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GT 1030)
[2022-04-01 13:55:51	                main:435]	:	INFO	:	Set logging level to 1
[2022-04-01 13:55:51	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-04-01 13:55:52	                main:444]	:	INFO	:	Resolving all filenames
[2022-04-01 13:55:52	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-04-01 13:55:52	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-04-01 13:55:52	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-04-01 13:55:52	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-04-01 13:55:52	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-04-01 13:55:52	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-04-01 13:55:52	                main:474]	:	INFO	:	Configuration: 
[2022-04-01 13:55:52	                main:475]	:	INFO	:	    Model type: GRU
[2022-04-01 13:55:52	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-04-01 13:55:52	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-04-01 13:55:52	                main:478]	:	INFO	:	    Batch Size: 128
[2022-04-01 13:55:52	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-04-01 13:55:52	                main:480]	:	INFO	:	    Patience: 10
[2022-04-01 13:55:52	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-04-01 13:55:52	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-04-01 13:55:52	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-04-01 13:55:52	                main:484]	:	INFO	:	    # Threads: 1
[2022-04-01 13:55:52	                main:486]	:	INFO	:	Preparing Dataset
[2022-04-01 13:55:52	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-01 13:55:54	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-01 13:56:01	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-04-01 13:56:01	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-01 13:56:02	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-01 13:56:02	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-04-01 13:56:02	                main:494]	:	INFO	:	Creating Model
[2022-04-01 13:56:02	                main:507]	:	INFO	:	Preparing config file
[2022-04-01 13:56:03	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-04-01 13:56:03	                main:512]	:	INFO	:	Loading config
[2022-04-01 13:56:03	                main:514]	:	INFO	:	Loading state
[2022-04-01 13:56:06	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-04-01 13:56:06	                main:562]	:	INFO	:	Starting Training
[2022-04-01 13:56:18	                main:574]	:	INFO	:	Epoch 1947 | loss: 0.000383234 | val_loss: 0.000118603 | Time: 12045.1 ms
[2022-04-01 13:56:26	                main:574]	:	INFO	:	Epoch 1948 | loss: 5.98007e-05 | val_loss: 3.32665e-05 | Time: 7508.74 ms
[2022-04-01 13:56:33	                main:574]	:	INFO	:	Epoch 1949 | loss: 2.31279e-05 | val_loss: 1.58571e-05 | Time: 7024.78 ms
[2022-04-01 13:56:43	                main:574]	:	INFO	:	Epoch 1950 | loss: 1.64878e-05 | val_loss: 1.43509e-05 | Time: 10213.4 ms
[2022-04-01 13:56:50	                main:574]	:	INFO	:	Epoch 1951 | loss: 1.59025e-05 | val_loss: 1.40061e-05 | Time: 6869.53 ms
[2022-04-01 13:56:57	                main:574]	:	INFO	:	Epoch 1952 | loss: 2.62078e-05 | val_loss: 1.95458e-05 | Time: 7017.18 ms
[2022-04-01 13:57:04	                main:574]	:	INFO	:	Epoch 1953 | loss: 1.67483e-05 | val_loss: 1.467e-05 | Time: 6931.02 ms
[2022-04-01 13:57:11	                main:574]	:	INFO	:	Epoch 1954 | loss: 1.51739e-05 | val_loss: 1.44873e-05 | Time: 7056.39 ms
[2022-04-01 13:57:18	                main:574]	:	INFO	:	Epoch 1955 | loss: 1.86299e-05 | val_loss: 1.63519e-05 | Time: 6989.73 ms
[2022-04-01 13:57:30	                main:574]	:	INFO	:	Epoch 1956 | loss: 1.59508e-05 | val_loss: 1.30487e-05 | Time: 11736.1 ms
[2022-04-01 13:57:38	                main:574]	:	INFO	:	Epoch 1957 | loss: 1.47582e-05 | val_loss: 1.36579e-05 | Time: 7935.79 ms
[2022-04-01 13:57:49	                main:574]	:	INFO	:	Epoch 1958 | loss: 1.46832e-05 | val_loss: 1.38506e-05 | Time: 10987.9 ms
[2022-04-01 13:57:58	                main:574]	:	INFO	:	Epoch 1959 | loss: 1.45922e-05 | val_loss: 1.36718e-05 | Time: 8469.47 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GT 1030)
[2022-04-01 14:00:18	                main:435]	:	INFO	:	Set logging level to 1
[2022-04-01 14:00:18	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-04-01 14:00:18	                main:444]	:	INFO	:	Resolving all filenames
[2022-04-01 14:00:18	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-04-01 14:00:18	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-04-01 14:00:18	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-04-01 14:00:18	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-04-01 14:00:18	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-04-01 14:00:18	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-04-01 14:00:18	                main:474]	:	INFO	:	Configuration: 
[2022-04-01 14:00:19	                main:475]	:	INFO	:	    Model type: GRU
[2022-04-01 14:00:19	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-04-01 14:00:19	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-04-01 14:00:19	                main:478]	:	INFO	:	    Batch Size: 128
[2022-04-01 14:00:19	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-04-01 14:00:19	                main:480]	:	INFO	:	    Patience: 10
[2022-04-01 14:00:19	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-04-01 14:00:19	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-04-01 14:00:19	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-04-01 14:00:19	                main:484]	:	INFO	:	    # Threads: 1
[2022-04-01 14:00:19	                main:486]	:	INFO	:	Preparing Dataset
[2022-04-01 14:00:19	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-01 14:00:20	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-01 14:00:27	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-04-01 14:00:27	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-01 14:00:27	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-01 14:00:28	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-04-01 14:00:28	                main:494]	:	INFO	:	Creating Model
[2022-04-01 14:00:28	                main:507]	:	INFO	:	Preparing config file
[2022-04-01 14:00:28	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-04-01 14:00:28	                main:512]	:	INFO	:	Loading config
[2022-04-01 14:00:28	                main:514]	:	INFO	:	Loading state
[2022-04-01 14:00:31	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-04-01 14:00:31	                main:562]	:	INFO	:	Starting Training
[2022-04-01 14:00:43	                main:574]	:	INFO	:	Epoch 1953 | loss: 0.000275014 | val_loss: 9.91825e-05 | Time: 11912.1 ms
[2022-04-01 14:00:55	                main:574]	:	INFO	:	Epoch 1954 | loss: 4.98298e-05 | val_loss: 2.64544e-05 | Time: 11563.9 ms
[2022-04-01 14:01:08	                main:574]	:	INFO	:	Epoch 1955 | loss: 2.24642e-05 | val_loss: 1.45003e-05 | Time: 12398.3 ms
[2022-04-01 14:01:16	                main:574]	:	INFO	:	Epoch 1956 | loss: 1.59451e-05 | val_loss: 1.33549e-05 | Time: 8427.39 ms
[2022-04-01 14:01:24	                main:574]	:	INFO	:	Epoch 1957 | loss: 1.5039e-05 | val_loss: 1.32949e-05 | Time: 8251.93 ms
[2022-04-01 14:01:32	                main:574]	:	INFO	:	Epoch 1958 | loss: 1.48289e-05 | val_loss: 1.32534e-05 | Time: 7210.3 ms
[2022-04-01 14:01:39	                main:574]	:	INFO	:	Epoch 1959 | loss: 1.47032e-05 | val_loss: 1.31848e-05 | Time: 7138.91 ms
[2022-04-01 14:01:46	                main:574]	:	INFO	:	Epoch 1960 | loss: 1.46985e-05 | val_loss: 1.31448e-05 | Time: 6661.24 ms
[2022-04-01 14:01:52	                main:574]	:	INFO	:	Epoch 1961 | loss: 1.46527e-05 | val_loss: 1.3107e-05 | Time: 6261.68 ms
[2022-04-01 14:01:58	                main:574]	:	INFO	:	Epoch 1962 | loss: 1.47767e-05 | val_loss: 1.4077e-05 | Time: 6388.81 ms
[2022-04-01 14:02:05	                main:574]	:	INFO	:	Epoch 1963 | loss: 1.5033e-05 | val_loss: 1.3887e-05 | Time: 6261.37 ms
[2022-04-01 14:02:17	                main:574]	:	INFO	:	Epoch 1964 | loss: 1.47663e-05 | val_loss: 1.32392e-05 | Time: 12267.3 ms
[2022-04-01 14:02:25	                main:574]	:	INFO	:	Epoch 1965 | loss: 1.50547e-05 | val_loss: 1.36151e-05 | Time: 7664.94 ms
[2022-04-01 14:02:31	                main:574]	:	INFO	:	Epoch 1966 | loss: 1.47054e-05 | val_loss: 1.3182e-05 | Time: 6431.92 ms
[2022-04-01 14:02:37	                main:574]	:	INFO	:	Epoch 1967 | loss: 3.14832e-05 | val_loss: 1.78091e-05 | Time: 6277 ms
[2022-04-01 14:02:44	                main:574]	:	INFO	:	Epoch 1968 | loss: 7.52501e-05 | val_loss: 3.57448e-05 | Time: 6322.13 ms
[2022-04-01 14:02:50	                main:574]	:	INFO	:	Epoch 1969 | loss: 2.55512e-05 | val_loss: 2.16469e-05 | Time: 6244.68 ms
[2022-04-01 14:02:57	                main:574]	:	INFO	:	Epoch 1970 | loss: 2.11768e-05 | val_loss: 1.83463e-05 | Time: 6471.21 ms
[2022-04-01 14:03:03	                main:574]	:	INFO	:	Epoch 1971 | loss: 1.84297e-05 | val_loss: 1.5033e-05 | Time: 6120.21 ms
[2022-04-01 14:03:09	                main:574]	:	INFO	:	Epoch 1972 | loss: 1.61664e-05 | val_loss: 1.38364e-05 | Time: 6354 ms
[2022-04-01 14:03:15	                main:574]	:	INFO	:	Epoch 1973 | loss: 1.54658e-05 | val_loss: 1.4051e-05 | Time: 6140.21 ms
[2022-04-01 14:03:22	                main:574]	:	INFO	:	Epoch 1974 | loss: 1.47839e-05 | val_loss: 1.38907e-05 | Time: 6209.09 ms
[2022-04-01 14:03:35	                main:574]	:	INFO	:	Epoch 1975 | loss: 1.46847e-05 | val_loss: 1.38914e-05 | Time: 13042.7 ms
[2022-04-01 14:03:45	                main:574]	:	INFO	:	Epoch 1976 | loss: 1.46524e-05 | val_loss: 1.39722e-05 | Time: 10407.4 ms
[2022-04-01 14:03:54	                main:574]	:	INFO	:	Epoch 1977 | loss: 1.46687e-05 | val_loss: 1.39981e-05 | Time: 8303.11 ms
[2022-04-01 14:04:06	                main:574]	:	INFO	:	Epoch 1978 | loss: 1.46665e-05 | val_loss: 1.39622e-05 | Time: 12623.4 ms
[2022-04-01 14:04:17	                main:574]	:	INFO	:	Epoch 1979 | loss: 1.46385e-05 | val_loss: 1.38488e-05 | Time: 10666.9 ms
[2022-04-01 14:04:27	                main:574]	:	INFO	:	Epoch 1980 | loss: 1.53043e-05 | val_loss: 1.8284e-05 | Time: 9777.63 ms
[2022-04-01 14:04:35	                main:574]	:	INFO	:	Epoch 1981 | loss: 1.55217e-05 | val_loss: 1.41551e-05 | Time: 8073.59 ms
[2022-04-01 14:04:48	                main:574]	:	INFO	:	Epoch 1982 | loss: 1.47628e-05 | val_loss: 1.41448e-05 | Time: 12919.2 ms
[2022-04-01 14:04:59	                main:574]	:	INFO	:	Epoch 1983 | loss: 1.46543e-05 | val_loss: 1.38547e-05 | Time: 10953.9 ms
[2022-04-01 14:05:07	                main:574]	:	INFO	:	Epoch 1984 | loss: 1.46486e-05 | val_loss: 1.40403e-05 | Time: 7947.22 ms
[2022-04-01 14:05:14	                main:574]	:	INFO	:	Epoch 1985 | loss: 1.46029e-05 | val_loss: 1.39174e-05 | Time: 6976.15 ms
[2022-04-01 14:05:20	                main:574]	:	INFO	:	Epoch 1986 | loss: 1.46231e-05 | val_loss: 1.38727e-05 | Time: 6208.43 ms
[2022-04-01 14:05:27	                main:574]	:	INFO	:	Epoch 1987 | loss: 1.46384e-05 | val_loss: 1.40133e-05 | Time: 6341.82 ms
[2022-04-01 14:05:33	                main:574]	:	INFO	:	Epoch 1988 | loss: 1.46283e-05 | val_loss: 1.388e-05 | Time: 6401.23 ms
[2022-04-01 14:05:47	                main:574]	:	INFO	:	Epoch 1989 | loss: 1.46047e-05 | val_loss: 1.38553e-05 | Time: 13700 ms
[2022-04-01 14:05:55	                main:574]	:	INFO	:	Epoch 1990 | loss: 1.46101e-05 | val_loss: 1.39799e-05 | Time: 7939.79 ms
[2022-04-01 14:06:08	                main:574]	:	INFO	:	Epoch 1991 | loss: 1.46824e-05 | val_loss: 1.40403e-05 | Time: 12527.7 ms
[2022-04-01 14:06:16	                main:574]	:	INFO	:	Epoch 1992 | loss: 1.47496e-05 | val_loss: 1.38755e-05 | Time: 8029.63 ms
[2022-04-01 14:06:23	                main:574]	:	INFO	:	Epoch 1993 | loss: 1.4909e-05 | val_loss: 1.42016e-05 | Time: 7461.01 ms
[2022-04-01 14:06:30	                main:574]	:	INFO	:	Epoch 1994 | loss: 1.48664e-05 | val_loss: 1.42782e-05 | Time: 6957.49 ms
[2022-04-01 14:06:44	                main:574]	:	INFO	:	Epoch 1995 | loss: 1.58144e-05 | val_loss: 1.44062e-05 | Time: 13836.1 ms
[2022-04-01 14:06:54	                main:574]	:	INFO	:	Epoch 1996 | loss: 1.52292e-05 | val_loss: 1.39726e-05 | Time: 9731.07 ms
[2022-04-01 14:07:02	                main:574]	:	INFO	:	Epoch 1997 | loss: 1.48416e-05 | val_loss: 1.44738e-05 | Time: 8330.33 ms
[2022-04-01 14:07:10	                main:574]	:	INFO	:	Epoch 1998 | loss: 1.49507e-05 | val_loss: 1.37403e-05 | Time: 7693.94 ms
[2022-04-01 14:07:17	                main:574]	:	INFO	:	Epoch 1999 | loss: 1.47161e-05 | val_loss: 1.45124e-05 | Time: 7244.79 ms
[2022-04-01 14:07:24	                main:574]	:	INFO	:	Epoch 2000 | loss: 1.48348e-05 | val_loss: 1.38456e-05 | Time: 6436.13 ms
[2022-04-01 14:07:30	                main:574]	:	INFO	:	Epoch 2001 | loss: 1.49114e-05 | val_loss: 1.53226e-05 | Time: 6409.84 ms
[2022-04-01 14:07:37	                main:574]	:	INFO	:	Epoch 2002 | loss: 1.60903e-05 | val_loss: 1.43238e-05 | Time: 6247.66 ms
[2022-04-01 14:07:43	                main:574]	:	INFO	:	Epoch 2003 | loss: 1.49543e-05 | val_loss: 1.38158e-05 | Time: 6394.83 ms
[2022-04-01 14:07:49	                main:574]	:	INFO	:	Epoch 2004 | loss: 1.47006e-05 | val_loss: 1.39417e-05 | Time: 6245.2 ms
[2022-04-01 14:07:56	                main:574]	:	INFO	:	Epoch 2005 | loss: 1.46199e-05 | val_loss: 1.39776e-05 | Time: 6230.63 ms
[2022-04-01 14:08:09	                main:574]	:	INFO	:	Epoch 2006 | loss: 1.47097e-05 | val_loss: 1.38632e-05 | Time: 12903.9 ms
[2022-04-01 14:08:18	                main:574]	:	INFO	:	Epoch 2007 | loss: 1.59062e-05 | val_loss: 1.63973e-05 | Time: 9534.87 ms
[2022-04-01 14:08:26	                main:574]	:	INFO	:	Epoch 2008 | loss: 1.57836e-05 | val_loss: 1.39659e-05 | Time: 7819.39 ms
[2022-04-01 14:08:33	                main:574]	:	INFO	:	Epoch 2009 | loss: 1.5138e-05 | val_loss: 1.46315e-05 | Time: 7129.74 ms
[2022-04-01 14:08:40	                main:574]	:	INFO	:	Epoch 2010 | loss: 1.46918e-05 | val_loss: 1.39263e-05 | Time: 7018.37 ms
[2022-04-01 14:08:47	                main:574]	:	INFO	:	Epoch 2011 | loss: 1.46178e-05 | val_loss: 1.41866e-05 | Time: 6822.9 ms
[2022-04-01 14:08:54	                main:574]	:	INFO	:	Epoch 2012 | loss: 1.53192e-05 | val_loss: 1.53334e-05 | Time: 6615.11 ms
[2022-04-01 14:09:08	                main:574]	:	INFO	:	Epoch 2013 | loss: 1.49784e-05 | val_loss: 1.43754e-05 | Time: 13829.8 ms
[2022-04-01 14:09:18	                main:574]	:	INFO	:	Epoch 2014 | loss: 1.50003e-05 | val_loss: 1.40469e-05 | Time: 9598.51 ms
[2022-04-01 14:09:25	                main:574]	:	INFO	:	Epoch 2015 | loss: 1.60033e-05 | val_loss: 1.40357e-05 | Time: 7359.2 ms
[2022-04-01 14:09:32	                main:574]	:	INFO	:	Epoch 2016 | loss: 1.49657e-05 | val_loss: 1.48834e-05 | Time: 6764.11 ms
[2022-04-01 14:09:38	                main:574]	:	INFO	:	Epoch 2017 | loss: 1.49698e-05 | val_loss: 1.39012e-05 | Time: 6595.88 ms
[2022-04-01 14:09:45	                main:574]	:	INFO	:	Epoch 2018 | loss: 1.49434e-05 | val_loss: 1.44103e-05 | Time: 6302.12 ms
[2022-04-01 14:09:58	                main:574]	:	INFO	:	Epoch 2019 | loss: 1.47438e-05 | val_loss: 1.4329e-05 | Time: 13317 ms
[2022-04-01 14:10:09	                main:574]	:	INFO	:	Epoch 2020 | loss: 1.48111e-05 | val_loss: 1.3972e-05 | Time: 10918.6 ms
[2022-04-01 14:10:22	                main:574]	:	INFO	:	Epoch 2021 | loss: 1.51213e-05 | val_loss: 1.5306e-05 | Time: 13055.1 ms
[2022-04-01 14:10:31	                main:574]	:	INFO	:	Epoch 2022 | loss: 1.58276e-05 | val_loss: 1.43785e-05 | Time: 8172.64 ms
[2022-04-01 14:10:38	                main:574]	:	INFO	:	Epoch 2023 | loss: 1.50712e-05 | val_loss: 1.4037e-05 | Time: 7059.52 ms
[2022-04-01 14:10:45	                main:574]	:	INFO	:	Epoch 2024 | loss: 1.47906e-05 | val_loss: 1.41537e-05 | Time: 7201.42 ms
[2022-04-01 14:10:52	                main:574]	:	INFO	:	Epoch 2025 | loss: 1.56961e-05 | val_loss: 1.42328e-05 | Time: 6705.5 ms
[2022-04-01 14:10:59	                main:574]	:	INFO	:	Epoch 2026 | loss: 1.47012e-05 | val_loss: 1.41718e-05 | Time: 6763.48 ms
[2022-04-01 14:11:05	                main:574]	:	INFO	:	Epoch 2027 | loss: 1.59237e-05 | val_loss: 1.47307e-05 | Time: 6666.91 ms
[2022-04-01 14:11:12	                main:574]	:	INFO	:	Epoch 2028 | loss: 1.50634e-05 | val_loss: 1.52709e-05 | Time: 6247.43 ms
[2022-04-01 14:11:18	                main:574]	:	INFO	:	Epoch 2029 | loss: 1.52037e-05 | val_loss: 1.44317e-05 | Time: 6430.9 ms
[2022-04-01 14:11:32	                main:574]	:	INFO	:	Epoch 2030 | loss: 1.48693e-05 | val_loss: 1.42198e-05 | Time: 13816.2 ms
[2022-04-01 14:11:44	                main:574]	:	INFO	:	Epoch 2031 | loss: 1.50845e-05 | val_loss: 1.44503e-05 | Time: 11701.1 ms
[2022-04-01 14:11:56	                main:574]	:	INFO	:	Epoch 2032 | loss: 1.47988e-05 | val_loss: 1.41535e-05 | Time: 12450.3 ms
[2022-04-01 14:12:06	                main:574]	:	INFO	:	Epoch 2033 | loss: 1.55762e-05 | val_loss: 1.6816e-05 | Time: 10202.7 ms
[2022-04-01 14:12:16	                main:574]	:	INFO	:	Epoch 2034 | loss: 1.5533e-05 | val_loss: 1.40834e-05 | Time: 9793.15 ms
[2022-04-01 14:12:27	                main:574]	:	INFO	:	Epoch 2035 | loss: 1.47155e-05 | val_loss: 1.44733e-05 | Time: 10217.9 ms
[2022-04-01 14:12:34	                main:574]	:	INFO	:	Epoch 2036 | loss: 1.53633e-05 | val_loss: 1.53926e-05 | Time: 7608.54 ms
[2022-04-01 14:12:41	                main:574]	:	INFO	:	Epoch 2037 | loss: 1.52061e-05 | val_loss: 1.42366e-05 | Time: 7123.92 ms
[2022-04-01 14:12:48	                main:574]	:	INFO	:	Epoch 2038 | loss: 1.54526e-05 | val_loss: 1.84259e-05 | Time: 6576.32 ms
[2022-04-01 14:12:54	                main:574]	:	INFO	:	Epoch 2039 | loss: 1.70445e-05 | val_loss: 1.59946e-05 | Time: 6154.43 ms
[2022-04-01 14:13:00	                main:574]	:	INFO	:	Epoch 2040 | loss: 1.52629e-05 | val_loss: 1.43031e-05 | Time: 6157.77 ms
[2022-04-01 14:13:15	                main:574]	:	INFO	:	Epoch 2041 | loss: 1.50803e-05 | val_loss: 1.47427e-05 | Time: 14292.9 ms
[2022-04-01 14:13:24	                main:574]	:	INFO	:	Epoch 2042 | loss: 1.47989e-05 | val_loss: 1.4305e-05 | Time: 9398.25 ms
Machine Learning Dataset Generator v9.75 (Windows/x64) (libTorch: release/1.6 GPU: NVIDIA GeForce GT 1030)
[2022-04-01 14:15:47	                main:435]	:	INFO	:	Set logging level to 1
[2022-04-01 14:15:47	                main:441]	:	INFO	:	Running in BOINC Client mode
[2022-04-01 14:15:47	                main:444]	:	INFO	:	Resolving all filenames
[2022-04-01 14:15:47	                main:452]	:	INFO	:	Resolved: dataset.hdf5 => dataset.hdf5 (exists = 1)
[2022-04-01 14:15:47	                main:452]	:	INFO	:	Resolved: model.cfg => model.cfg (exists = 1)
[2022-04-01 14:15:47	                main:452]	:	INFO	:	Resolved: model-final.pt => model-final.pt (exists = 0)
[2022-04-01 14:15:47	                main:452]	:	INFO	:	Resolved: model-input.pt => model-input.pt (exists = 1)
[2022-04-01 14:15:47	                main:452]	:	INFO	:	Resolved: snapshot.pt => snapshot.pt (exists = 1)
[2022-04-01 14:15:47	                main:472]	:	INFO	:	Dataset filename: dataset.hdf5
[2022-04-01 14:15:47	                main:474]	:	INFO	:	Configuration: 
[2022-04-01 14:15:47	                main:475]	:	INFO	:	    Model type: GRU
[2022-04-01 14:15:47	                main:476]	:	INFO	:	    Validation Loss Threshold: 0.0001
[2022-04-01 14:15:47	                main:477]	:	INFO	:	    Max Epochs: 2048
[2022-04-01 14:15:47	                main:478]	:	INFO	:	    Batch Size: 128
[2022-04-01 14:15:48	                main:479]	:	INFO	:	    Learning Rate: 0.01
[2022-04-01 14:15:48	                main:480]	:	INFO	:	    Patience: 10
[2022-04-01 14:15:48	                main:481]	:	INFO	:	    Hidden Width: 12
[2022-04-01 14:15:48	                main:482]	:	INFO	:	    # Recurrent Layers: 4
[2022-04-01 14:15:48	                main:483]	:	INFO	:	    # Backend Layers: 4
[2022-04-01 14:15:48	                main:484]	:	INFO	:	    # Threads: 1
[2022-04-01 14:15:48	                main:486]	:	INFO	:	Preparing Dataset
[2022-04-01 14:15:48	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xt from dataset.hdf5 into memory
[2022-04-01 14:15:49	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yt from dataset.hdf5 into memory
[2022-04-01 14:15:56	                load:106]	:	INFO	:	Successfully loaded dataset of 2048 examples into memory.
[2022-04-01 14:15:56	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Xv from dataset.hdf5 into memory
[2022-04-01 14:15:56	load_hdf5_ds_into_tensor:28]	:	INFO	:	Loading Dataset /Yv from dataset.hdf5 into memory
[2022-04-01 14:15:57	                load:106]	:	INFO	:	Successfully loaded dataset of 512 examples into memory.
[2022-04-01 14:15:57	                main:494]	:	INFO	:	Creating Model
[2022-04-01 14:15:57	                main:507]	:	INFO	:	Preparing config file
[2022-04-01 14:15:57	                main:511]	:	INFO	:	Found checkpoint, attempting to load... 
[2022-04-01 14:15:57	                main:512]	:	INFO	:	Loading config
[2022-04-01 14:15:57	                main:514]	:	INFO	:	Loading state
[2022-04-01 14:16:00	                main:559]	:	INFO	:	Loading DataLoader into Memory
[2022-04-01 14:16:00	                main:562]	:	INFO	:	Starting Training
[2022-04-01 14:16:13	                main:574]	:	INFO	:	Epoch 2043 | loss: 0.000365716 | val_loss: 9.25369e-05 | Time: 12508.2 ms
[2022-04-01 14:16:27	                main:574]	:	INFO	:	Epoch 2044 | loss: 5.48082e-05 | val_loss: 2.45034e-05 | Time: 13922.6 ms
[2022-04-01 14:16:36	                main:574]	:	INFO	:	Epoch 2045 | loss: 2.34104e-05 | val_loss: 1.84897e-05 | Time: 9467.81 ms
[2022-04-01 14:16:44	                main:574]	:	INFO	:	Epoch 2046 | loss: 1.67258e-05 | val_loss: 1.4156e-05 | Time: 8052.58 ms
[2022-04-01 14:16:51	                main:574]	:	INFO	:	Epoch 2047 | loss: 1.53213e-05 | val_loss: 1.36434e-05 | Time: 6971.37 ms
[2022-04-01 14:17:03	                main:574]	:	INFO	:	Epoch 2048 | loss: 1.50137e-05 | val_loss: 1.34149e-05 | Time: 12178.7 ms
[2022-04-01 14:17:03	                main:597]	:	INFO	:	Saving trained model to model-final.pt, val_loss 1.34149e-05
[2022-04-01 14:17:03	                main:603]	:	INFO	:	Saving end state to config to file
[2022-04-01 14:17:04	                main:608]	:	INFO	:	Success, exiting..
14:17:04 (4904): called boinc_finish(0)

</stderr_txt>
]]>
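The stderr log above shows the BOINC client stopping and restarting the application several times; on each restart it reloads `snapshot.pt`, resumes near the epoch where it left off (with a few epochs of overlap, since the snapshot lags the last logged epoch), and continues until Max Epochs (2048) is reached. A minimal sketch of that checkpoint-and-resume pattern, in plain Python rather than the project's actual libTorch code — `train`, `snapshot.json`, and `epochs_per_session` are illustrative names, not part of the real application:

```python
import json
import os
import tempfile

def train(max_epochs, snapshot_path, epochs_per_session):
    """Run up to `epochs_per_session` epochs, resuming from a snapshot if one exists."""
    start = 0
    if os.path.exists(snapshot_path):
        with open(snapshot_path) as f:
            start = json.load(f)["epoch"]  # resume where the previous session stopped
    epoch = start
    for epoch in range(start + 1, min(start + epochs_per_session, max_epochs) + 1):
        # one training epoch (forward/backward pass) would run here
        with open(snapshot_path, "w") as f:
            json.dump({"epoch": epoch}, f)  # checkpoint after every epoch
    return epoch

# Simulate several client sessions, each interrupted before max_epochs is reached;
# every restart picks the epoch counter back up from the snapshot file.
snap = os.path.join(tempfile.mkdtemp(), "snapshot.json")
sessions = [train(2048, snap, 500) for _ in range(5)]
print(sessions)  # → [500, 1000, 1500, 2000, 2048]
```

This is why the workunit still reports `Exit status 0` despite the interruptions: progress lives in the snapshot, not in the process, so the final session can save `model-final.pt` and call `boinc_finish(0)` once epoch 2048 completes.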


©2022 MLC@Home Team
A project of the Cognition, Robotics, and Learning (CORAL) Lab at the University of Maryland, Baltimore County (UMBC)