Time Dependent Classification/Regression¶
I have been interested in time-dependent classification/regression for a long time. Most machine learning models are static: the assumed relationship between the input variables and the model parameters is fixed. This assumption often holds, but there are cases where the parameters for the same input variable must be allowed to change with time. Furthermore, the change in the parameters should be small relative to their values at the previous timesteps. One simple way to approach this problem is to fit a separate model for each timestep; however, this will not allow us to learn from all the data at hand. A second approach is to learn a base set of parameters plus a separate parameter for each timestep. Although more appropriate, this second approach still does not let us enforce the constraint that the model parameters at a given timestep should not change too much from their values at the previous timestep.
I was introduced to the concept of convex optimization through Stephen P. Boyd's lecture videos on the topic from the Machine Learning Summer School 2015 in Tübingen. Prof. Boyd has a very engaging style of explaining the topic, and his lectures included some code samples using the CVXPY library.
First plot using pelican¶
I am trying out this new format for blogging using Jupyter notebooks and the Pelican library. The motivation is to easily share my experiments as notebooks and write accompanying thoughts alongside them.
For this post I will test whether matplotlib plotting and LaTeX equations work. I will simply plot the following two equations.
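A minimal version of such a check might look like the sketch below. The post's actual two equations are not reproduced here; a sine curve and an exponential decay serve only as placeholders, and the LaTeX labels are illustrative.

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs in scripts
import matplotlib.pyplot as plt
import numpy as np

# Placeholder curves; not the equations from the original post.
x = np.linspace(0, 2 * np.pi, 200)
fig, ax = plt.subplots()
ax.plot(x, np.sin(x), label=r"$y = \sin(x)$")          # LaTeX in the legend
ax.plot(x, np.exp(-x / np.pi), label=r"$y = e^{-x/\pi}$")
ax.set_xlabel(r"$x$")
ax.legend()
fig.savefig("test_plot.png")
```

If the generated figure renders the legend entries as typeset math, both the plotting and the LaTeX support are working in the notebook-to-Pelican pipeline.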
Semi-supervised way of learning graph embeddings
End-to-end module to learn NLP tasks as Q/A tasks.