Commit Graph

35 Commits

SHA1 Message Date
0c3c709155 Switched submodule to a relative URL so the push mirror works (sketch below) 2025-07-12 21:27:08 +00:00
5dc1f8c554 Updated training_files submodule with a README 2025-07-12 21:17:00 +00:00
30e16213fe Added training_files submodule 2025-05-13 02:32:22 +00:00
d7422cd99b Moved training files to a separate folder in preparation for making training_files a submodule 2025-05-13 01:40:59 +00:00
1f00ca4da4 Inverted pendulum report added. Should be finished 2025-04-06 22:23:15 +00:00
89c06c5c42 Fixed a huge issue where base loss centroid convergence was not using 'one' as its reference loss function, resulting in opposite trends. 2025-04-06 20:52:50 +00:00
8155b0f7ae Basic report added. Report is still missing a significant number of graphs and conclusions 2025-04-01 18:57:36 +00:00
aa533f2a7e Changing linear regression back to log-based 2025-04-01 00:16:31 +00:00
a1c931480f Changed linear regression to not account for log axes 2025-04-01 00:12:36 +00:00
867553353b Generated convergence plots for 'centroid' parameters such as t_median, t_mean, R (later/early). 2025-03-31 21:50:30 +00:00
4d90689a60 Generated plots for time_weighting, time_weighting_learning_rate_sweep, and base_loss_learning_rate_sweep 2025-03-30 22:22:10 +00:00
bfa0f3fb02 Trained additional base loss functions for lr sweep. Generated data for time weighting. Generated composite epoch evolution plots 2025-03-29 19:23:52 +00:00
e238bed91e Finding best learning rates from the sweeps 2025-03-29 02:07:34 +00:00
aa34bfac8c Base loss learning rate sweep training done. Made composite plots for epoch evolution of different time weightings 2025-03-26 04:20:07 +00:00
a401ca3f59 Finished re-training the max_normalized time weighting (with a minimum weight of 0.01) and the time_weight_learning_sweep. Started work on base loss function training 2025-03-12 22:33:37 +00:00
eb71ab0de9 Finished training with mirrored weights. Plotted max normalized with mirror weights 2025-02-25 23:27:49 +00:00
68f918f51c Added mirror weight functions 2025-02-25 04:20:15 +00:00
c89998da28 Fixed issues where the initialized controllers could differ. Created a controller_base.pth that is used for all controller initialization 2025-02-24 02:28:16 +00:00
8be7ad97a8 Plotting theta across epochs for the different loss functions 2025-02-22 23:45:53 +00:00
7d4d34a580 Better analysis file structure 2025-02-19 04:02:53 +00:00
b9212e5a52 Average normalized results across epochs have been plotted 2025-02-19 03:06:48 +00:00
8f92ce3ee1 Restructured files. Changed weight functions so they are always normalized to the 0-1 range (aka max normalization); also updated average normalization (sketch below) 2025-02-18 18:45:15 +00:00
28c5d14fe8 Plotted the max-normalized controller across epochs. Also training the average-normalized variant 2025-02-18 00:40:29 +00:00
071669696b Completed training for max normalized comparison 2025-02-17 18:47:05 +00:00
3614b66aee Normalization of loss functions based on max weight 2025-02-17 02:54:56 +00:00
3fc78d4508 Looking at the different controllers as they trained 2025-02-16 00:03:29 +00:00
a865d37722 Controller across epochs plotter 2025-02-15 23:07:51 +00:00
dd97ce7335 Inverted pendulum controllers trained 2025-02-15 16:16:09 +00:00
ceda15213b Trained inverted pendulum, no time weights 2025-02-06 13:39:53 +00:00
0bb316f052 Reworked training files to save models after every epoch (sketch below) 2025-02-06 03:10:29 +00:00
5f70241418 Inverted pendulum training started for no time weight, linear, quadratic, cubic, and exponential 2025-02-05 05:07:15 +00:00
a6273835b1 Inverted pendulum with desired theta trained 2025-02-04 03:57:47 +00:00
cdefc00226 Adding 'desired_theta' as neural network input 2025-02-03 22:28:41 +00:00
1b7b40adbc Inverted pendulum testing 2025-02-01 20:34:31 +00:00
f76fa8709d Inverted pendulum testing 2025-02-01 20:34:19 +00:00
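
Implementation sketches

Commit 0c3c709155 is small but load-bearing: when a .gitmodules entry uses a relative URL, git resolves it against the superproject's own remote, so a push mirror hosted elsewhere clones the submodule from its own copy rather than pointing back at the original host. A minimal sketch of such an entry, assuming the submodule repository sits next to this one on the same server (the exact path is an assumption, not taken from the log):

```ini
# .gitmodules -- hypothetical entry. "../training_files.git" is resolved
# relative to whatever remote the superproject was cloned from, so a push
# mirror's clone resolves the submodule under the mirror host as well.
[submodule "training_files"]
	path = training_files
	url = ../training_files.git
```

After changing the URL, `git submodule sync --recursive` propagates it into the checked-out clone's .git/config.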
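
Several commits (3614b66aee, 8f92ce3ee1, a401ca3f59, 5f70241418) revolve around time-weighting the training loss with linear, quadratic, cubic, or exponential curves that are max-normalized, i.e. rescaled so the largest weight is exactly 1, with a small floor (the log mentions a minimum weight of 0.01). The repository's actual implementation is not reproduced here; the sketch below only illustrates the idea, and every name in it is invented for the example.

```python
import torch

def max_normalized_weights(kind: str, num_steps: int, min_weight: float = 0.01) -> torch.Tensor:
    """Illustrative sketch: build a per-timestep weight curve over the rollout
    and rescale it so its maximum is exactly 1 ("max normalization")."""
    t = torch.linspace(0.0, 1.0, num_steps)      # normalized time, 0 = start of rollout
    curves = {
        "none": torch.ones(num_steps),           # unweighted baseline
        "linear": t,
        "quadratic": t ** 2,
        "cubic": t ** 3,
        "exponential": torch.exp(t) - 1.0,
    }
    w = curves[kind]
    w = w / w.max()                              # peak weight becomes 1
    return w.clamp(min=min_weight)               # 0.01 floor keeps early timesteps from vanishing

# A time-weighted loss would then combine per-timestep errors with these weights,
# e.g. (max_normalized_weights("quadratic", T) * per_step_error).mean().
```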
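
Two other commits define the experimental hygiene: c89998da28 makes every controller start from the same saved weights (controller_base.pth) so loss functions are compared from an identical initialization, and 0bb316f052 saves a checkpoint after every epoch so behaviour can be plotted across training. The sketch below shows that pattern under stated assumptions; the controller architecture, optimiser, loss, and epoch count are placeholders, not taken from the repository.

```python
import torch
import torch.nn as nn

# Placeholder controller: four pendulum state variables plus desired_theta
# (commit cdefc00226) in, one control output out. The real architecture is unknown.
controller = nn.Sequential(nn.Linear(5, 64), nn.Tanh(), nn.Linear(64, 1))

# The shared starting point is created once and reused by every run
# (commit c89998da28), e.g.:
#   torch.save(controller.state_dict(), "controller_base.pth")
controller.load_state_dict(torch.load("controller_base.pth"))

def rollout_loss(net: nn.Module) -> torch.Tensor:
    """Stand-in for the real pendulum-simulation loss; a dummy quadratic here."""
    state = torch.zeros(1, 5)
    return net(state).pow(2).mean()

optimizer = torch.optim.Adam(controller.parameters(), lr=1e-3)
num_epochs = 100                                 # placeholder value

for epoch in range(num_epochs):
    optimizer.zero_grad()
    loss = rollout_loss(controller)
    loss.backward()
    optimizer.step()
    # Checkpoint after every epoch so the controller's evolution can be plotted later
    # (commit 0bb316f052).
    torch.save(controller.state_dict(), f"controller_epoch_{epoch:04d}.pth")
```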