Derek Montanez

I am an undergraduate student at Texas State University, majoring in Computer Science and minoring in Mathematics.

In the summer of 2022 I will be working in the DIMACS REU program on a project called "Exploring the Tradeoffs Between Compression and Performance in Deep Neural Networks", mentored by Prof. Waheed Bajwa.

Email: derekmont02@gmail.com


Research Log

Week 1 (6/6 - 10/6) - This week was largely about adjusting to a whole new experience. My goal was to read through a survey of different neural network compression methods, since I am very new to neural networks in general. I also started watching a one-week course on neural networks presented by MIT.

Week 2 (13/6 - 17/6) - My advisor assigned me to start building my own bibliography of different compression methods and to separate my readings according to method. I was also assigned to pick a compression method of my choice, either from the survey paper or from any published research paper, read through it, and present it as a method I want to delve deeper into and implement/study.

Week 3 (20/6 - 24/6) - This week I started to code my first neural network. I was tasked with creating a simple MLP and training it as well as I could to meet the standard baseline for testing CIFAR-10 with an MLP, and I was able to meet that baseline. I also created a presentation on recurring ideas in my bibliography, looking for patterns and correlations between papers.
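
Here is a rough TensorFlow/Keras sketch of the kind of MLP baseline I mean (TensorFlow is the framework I eventually settled on; the layer sizes and training settings here are placeholders, not my actual week-3 code):

    import tensorflow as tf

    # Load CIFAR-10 and scale pixel values to [0, 1].
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
    x_train, x_test = x_train / 255.0, x_test / 255.0

    # A simple MLP: flatten the 32x32x3 image, then two hidden layers.
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(32, 32, 3)),
        tf.keras.layers.Dense(512, activation="relu"),
        tf.keras.layers.Dense(256, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    model.fit(x_train, y_train, epochs=20, batch_size=128,
              validation_data=(x_test, y_test))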

Week 4 (20/6 - 24/6) - I realized I wanted to change the compression method I had chosen to implement: it would not yield great results, and its disadvantages outweigh its advantages. I switched to a pruning method that I believe will yield much better results with fewer drawbacks. With that decided, I started to build my first convolutional neural networks. The architectures are VGG-16 and AlexNet, which are very large models that I need to run on a Linux machine rather than my own laptop.

Week 5 (27/6 - 1/7) - Coming off building my models last week, I was having a lot of trouble with PyTorch, so I switched and learned how to build my models using TensorFlow. I was able to run VGG-16 and AlexNet on CIFAR-10, but I was not getting good results, so I need to adjust the hyperparameters. I also started to code the pruning algorithm and believe I have made good progress. I plan to finish the pruning code this upcoming week and test the model.
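
For reference, here is a minimal sketch of this setup, using the stock Keras VGG16 graph trained from scratch on CIFAR-10 (the optimizer settings and epoch count are stand-ins for the hyperparameters I am still tuning):

    import tensorflow as tf

    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
    x_train, x_test = x_train / 255.0, x_test / 255.0

    # Stock VGG16 architecture from Keras, randomly initialized,
    # with a 10-way classifier head for CIFAR-10.
    model = tf.keras.applications.VGG16(
        weights=None,             # train from scratch, no ImageNet weights
        input_shape=(32, 32, 3),
        classes=10,
    )

    # These are the knobs I am still adjusting.
    optimizer = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)
    model.compile(optimizer=optimizer,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    model.fit(x_train, y_train, epochs=50, batch_size=128,
              validation_data=(x_test, y_test))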

Week 6 (4/7 - 8/7) - I was able to finish the pruning method; it took most of my week, with a lot of trial and error. I could prune my model correctly, but the changes I was making were not saving, and I could not figure out why. David, one of the PhD students, showed me a better way to structure the code, which helped a lot, and I was able to finish the algorithm. Next week I want to get my testing done and start trying a new pruning method.
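
If it helps anyone hitting the same wall: my guess in hindsight is the standard Keras gotcha that layer.get_weights() returns NumPy copies, so editing them does nothing until the copies are written back with layer.set_weights(). A minimal sketch of that pattern, with an example magnitude-based filter criterion (the 10% fraction is just an illustrative value, and it assumes the Conv2D layers keep their default bias):

    import numpy as np
    import tensorflow as tf

    def prune_small_filters(model, fraction=0.1):
        """Zero out the `fraction` of filters with the smallest L1 norm
        in each Conv2D layer, writing the result back into the layer."""
        for layer in model.layers:
            if not isinstance(layer, tf.keras.layers.Conv2D):
                continue
            kernel, bias = layer.get_weights()   # NumPy *copies*
            # L1 norm of each filter; the last axis indexes output filters.
            norms = np.abs(kernel).sum(axis=(0, 1, 2))
            n_prune = int(fraction * len(norms))
            prune_idx = np.argsort(norms)[:n_prune]
            kernel[..., prune_idx] = 0.0
            bias[prune_idx] = 0.0
            # The crucial step: write the copies back, or nothing changes.
            layer.set_weights([kernel, bias])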

Week 7 (11/7 - 15/7) - I spent much of my week writing my new method, "Online Filter Weakening and Pruning for Efficient Convnets", a method for convolutional neural networks that applies a scaling factor to a random percentage of filters in each layer after each backpropagation step. This gradually drives some filters to 0 by the end of training, so they can then be pruned safely. I finished most of the code and hope to reuse some of the code from my previous method. I also decided I needed to switch models, because VGG-16 and AlexNet were taking too long to train and were not reaching the accuracy I would have liked, and I still had many more tests to run.
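
A rough sketch of the core idea as a Keras callback is below; the weakening factor and the fraction of filters touched per step are illustrative assumptions, not the method's actual schedule:

    import numpy as np
    import tensorflow as tf

    class FilterWeakening(tf.keras.callbacks.Callback):
        """After every backprop step, multiply a random subset of each
        Conv2D layer's filters by a factor < 1, so that repeatedly chosen
        filters decay toward zero and can be pruned after training."""

        def __init__(self, scale=0.9, fraction=0.1):
            super().__init__()
            self.scale = scale        # weakening factor applied to a filter
            self.fraction = fraction  # share of filters scaled each step

        def on_train_batch_end(self, batch, logs=None):
            for layer in self.model.layers:
                if not isinstance(layer, tf.keras.layers.Conv2D):
                    continue
                kernel = layer.kernel      # tf.Variable, shape (h, w, in, out)
                n_filters = kernel.shape[-1]
                n_pick = max(1, int(self.fraction * n_filters))
                idx = np.random.choice(n_filters, size=n_pick, replace=False)
                # Per-filter mask: `scale` for chosen filters, 1 elsewhere.
                mask = np.ones(n_filters, dtype=np.float32)
                mask[idx] = self.scale
                kernel.assign(kernel * mask)  # broadcasts over the last axis

Passing FilterWeakening(scale=0.9, fraction=0.1) in model.fit(..., callbacks=[...]) applies the scaling after every training batch.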

Week 8 (18/7 - 22/7) - This week I ran tests on a new model, SCNNB, a much smaller convolutional neural network. The tests ran quickly, so whenever I made a mistake in my program I could recover fast. I ran 20 tests in total on CIFAR-10 and Fashion-MNIST. I spent much of the week preparing my presentation and adding to my final report.
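
For context, SCNNB is a shallow convolutional network built from conv / batch-norm / max-pool blocks. The Keras sketch below is only my reading of that general shape; the filter counts and dense width are illustrative assumptions, not the paper's exact values:

    import tensorflow as tf

    def build_small_cnn(input_shape=(32, 32, 3), num_classes=10):
        """A shallow conv-BN-pool network in the spirit of SCNNB."""
        return tf.keras.Sequential([
            tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu",
                                   input_shape=input_shape),
            tf.keras.layers.BatchNormalization(),
            tf.keras.layers.MaxPooling2D(2),
            tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu"),
            tf.keras.layers.BatchNormalization(),
            tf.keras.layers.MaxPooling2D(2),
            tf.keras.layers.Flatten(),
            tf.keras.layers.Dense(1280, activation="relu"),
            tf.keras.layers.Dropout(0.5),
            tf.keras.layers.Dense(num_classes, activation="softmax"),
        ])

For Fashion-MNIST the input shape becomes (28, 28, 1) and the rest stays the same.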

Week 9 (25/7 - 29/7) - This was my final week of the program, and I spent most of it finalizing my tests and my final report. I finished my tests by Monday night, and I was able to finish my final draft by Wednesday night. I was also instructed to write a README file for my program; I started it this week but was not able to finish, and I plan to wrap it up in the upcoming week. I also need to update my presentation from the week before with the new test results.