Colorado State University
Informing Guessing Attacks on Publicly Performed Secrets
Unlike traditional authentication, mobile authentication is frequently performed in public settings, so a certain amount of information leakage is inevitable. A common attack called "shoulder surfing" exploits this inadvertently leaked information for various harmful purposes. The objective of this project is to determine whether information can be obtained by observing the full-body motion of a user unlocking a mobile device. A loftier goal of my research project is to use this information to improve the success rate of guessing attacks, even in situations where the phone screen is not visible to the observer.
- Week 1:
- This week, I met several members of the lab and got acquainted with the project I will be working on this summer. On Wednesday, my first day, I had orientation with the DIMACS REU group and then my first meeting with Gradeigh to discuss my research project. He linked me to several resources for computer vision and motion capture, which I reviewed for much of Wednesday afternoon and Thursday morning. On Thursday, I also went through several OpenCV/JavaCV tutorials and created a primitive GUI for displaying and editing live video feed from my webcam. I met with Gradeigh and the other interns, Samantha and Erica, for lunch on Thursday, where I described my project and learned about theirs. On Friday, I created a slide set for a presentation about my project that I will give to the DIMACS REU group on Monday, and I showed the slides to Gradeigh that afternoon for feedback. After incorporating his suggestions, I will send the slides to Janne for additional comments.
- Week 2:
- I began this week by giving a presentation on Monday morning to my DIMACS REU cohort summarizing my project and my goals for the summer. After that, I went back to alternating between reading papers and experimenting with OpenCV for the rest of the week. After struggling with the limited documentation for JavaCV, I decided to switch my primary language to C++, which fits the structure of OpenCV more naturally. To improve my comfort level with OpenCV, I created a program that identifies hands and phones in a live video feed using HSV threshold segmentation, foreground segmentation, and contour analysis. I also created a separate program that measures frame-to-frame differences, since quantifying change/movement between frames will be useful later. In addition, I read many papers about hand, finger, and gesture detection to decide which algorithms I'll need in my implementation. I was particularly intrigued by the use of Hidden Markov Models in similar applications and look forward to learning more about them. I met with Janne on Thursday afternoon to discuss my project and got good feedback on my work so far and my goals for the next few weeks. Janne also provided me with an external webcam that I will use next week to collect motion capture data. Finally, on Friday I set up the webcam system and began modifying my object-tracking program to recognize the sticky notes that I will use as motion capture markers.
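The frame-difference idea above can be sketched in a few lines. This is a hypothetical, OpenCV-free illustration (in practice `cv::absdiff` and `cv::mean` do this for real frames): treat each grayscale frame as a flat vector of pixel intensities and score movement as the mean absolute difference between consecutive frames.

```cpp
#include <cassert>
#include <cstdint>
#include <cstdlib>
#include <vector>

// Mean absolute difference between two grayscale frames of equal size.
// A higher score indicates more change (movement) between the frames.
double frameDifference(const std::vector<std::uint8_t>& prev,
                       const std::vector<std::uint8_t>& curr) {
    assert(prev.size() == curr.size() && !prev.empty());
    long long total = 0;
    for (std::size_t i = 0; i < prev.size(); ++i) {
        total += std::abs(static_cast<int>(curr[i]) - static_cast<int>(prev[i]));
    }
    return static_cast<double>(total) / static_cast<double>(prev.size());
}
```

Thresholding this score over a video gives a crude "is anything moving?" signal per frame pair.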
- Week 3:
- This week, I focused on developing my experimental setup and preparing for motion capture data collection. I explored many open-source motion capture/motion tracking software packages and chose one called Kinovea to analyze my video data. It will allow me to track specific objects and export their paths in (x, y, t) format, from which I can calculate quantities like velocity, acceleration, and deflection. I had a helpful conversation with my graduate student mentor in which we weighed the various camera positions we could use and decided to begin by placing the camera in front of the user. We also discussed which objects to track and settled on the device corners, hands, fingers, arms (specifically forearm, wrist, and elbow), and shoulders. When I collect data next week, I will place markers on all of these spots and see what kinds of movements can be identified. I will have each person enter the same sequence of four-digit PIN codes on the same device, so the conditions are consistent across participants. I also read a paper called "ACCessory: Password Inference using Accelerometers on Smartphones," which was quite interesting because the authors' research goal was very similar to mine but with a different source of information leakage. Also, I got paid! Yay!
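Turning an exported (x, y, t) path into velocity and acceleration is a simple finite-difference computation. A minimal sketch, assuming the tracker exports one (x, y, t) sample per frame (the `Sample` struct and function names here are illustrative, not Kinovea's API):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Sample { double x, y, t; };  // tracked marker position at time t

// Speed (magnitude of velocity) between consecutive (x, y, t) samples.
std::vector<double> speeds(const std::vector<Sample>& path) {
    std::vector<double> out;
    for (std::size_t i = 1; i < path.size(); ++i) {
        double dx = path[i].x - path[i - 1].x;
        double dy = path[i].y - path[i - 1].y;
        double dt = path[i].t - path[i - 1].t;
        out.push_back(std::sqrt(dx * dx + dy * dy) / dt);
    }
    return out;
}

// Acceleration as the change in speed per unit time between
// consecutive speed estimates.
std::vector<double> accelerations(const std::vector<Sample>& path) {
    std::vector<double> v = speeds(path);
    std::vector<double> out;
    for (std::size_t i = 1; i < v.size(); ++i) {
        double dt = (path[i + 1].t - path[i - 1].t) / 2.0;
        out.push_back((v[i] - v[i - 1]) / dt);
    }
    return out;
}
```

For example, a marker moving at a constant 5 units/s yields constant speeds and zero acceleration, while a direction reversal shows up as a dip in speed.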
- Week 4:
- Week 4 was spent formalizing my experimental set-up and creating an experimental protocol document. I had to reconsider several aspects of my set-up, like the patterns/PINs that will be entered by the participants and ways to keep as many potential variables constant as possible. Gradeigh raised the idea that more than one participant might not be necessary at this point in the feasibility analysis, since there will probably be limited variation between individuals in such a controlled setting. We also discussed the idea of differentiating more general motions (up/down, left/right) as opposed to trying to identify specific patterns. The experimental set-up process took longer than I was expecting, so I was unable to make as much progress as I would have liked towards several of my goals for this week. Fully completing these goals will be my primary focus going into next week.
- Week 5:
- This week, I collected a lot of movement data and made progress on building tools to analyze it efficiently. In particular, I worked extensively with R, building a script that visualizes the movement as a line graph. I focused on the side view this week because more information seemed to be available in that orientation, but I will look into other orientations next week.
- Week 6:
- This week, I worked to improve the readability of the movement visualization plots from last week and expanded my collection of recorded movements to include the back orientation as well as the side orientation. I also had a good discussion with Janne and Gradeigh about the state of the project and potential next steps. I began to put together a slideshow for my final presentation for the REU group, which will be 10 minutes long and will take place on either Thursday July 13th or Friday July 14th. I also started to work on my final writeup for the REU program, which is due in a few weeks.
- Week 7:
- This week I finalized my slides and gave my final presentation to my REU group. I met with both Janne and Gradeigh to discuss the future of my project. We came up with several goals for the remaining two weeks of my REU, including incorporating 3D video analysis and adding PIN passwords to the dataset. I also began to create more complex movement plots, which I talked about in depth in my presentation. I spoke with Varun about continuing the project into the fall semester and offered to answer any questions he has about the methodology or research goals.
- Week 8:
- This week I continued my work with the Tango tablet and made some progress with recording PIN entries. Unfortunately, Gradeigh and I came to the conclusion that I don't have enough time this summer to really dig into the Tango work, like creating a new Android app to record the data we need, so we will leave that part of the project for the student who takes over after I leave. I recorded several basic PIN entries and created movement plots. It appears that the movement plots for PINs are similar in shape to those for equivalent pattern passwords, but the movement is far less exaggerated and has a smaller magnitude. This week I also began working on curve analysis for the movement plots. I'm following the approach taken in "Cracking Android Pattern Lock in Five Attempts" for this analysis, since their research problem was very similar to ours. In particular, I'm smoothing the movement plots, finding turning points (which will indicate direction shifts in the pattern/PIN), and calculating line lengths and directions. Hopefully we'll be able to use those line attributes for the classifier when we reach that stage. Also, I worked on my final paper, the first draft of which will be ready early next week.
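The smoothing and turning-point steps above can be sketched as follows. This is an illustrative version under simple assumptions (a plain moving average for smoothing and sign changes in the slope for turning points), not the exact pipeline from the paper:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Moving-average smoothing over a window of 2*radius + 1 samples,
// with the window clamped at the ends of the series.
std::vector<double> smooth(const std::vector<double>& y, std::size_t radius) {
    std::vector<double> out(y.size());
    for (std::size_t i = 0; i < y.size(); ++i) {
        std::size_t lo = (i >= radius) ? i - radius : 0;
        std::size_t hi = std::min(i + radius, y.size() - 1);
        double sum = 0.0;
        for (std::size_t j = lo; j <= hi; ++j) sum += y[j];
        out[i] = sum / static_cast<double>(hi - lo + 1);
    }
    return out;
}

// Indices of local extrema (turning points), where the slope changes
// sign. On a movement plot these should correspond to direction shifts
// in the entered pattern or PIN.
std::vector<std::size_t> turningPoints(const std::vector<double>& y) {
    std::vector<std::size_t> pts;
    for (std::size_t i = 1; i + 1 < y.size(); ++i) {
        double left = y[i] - y[i - 1];
        double right = y[i + 1] - y[i];
        if ((left > 0 && right < 0) || (left < 0 && right > 0)) pts.push_back(i);
    }
    return pts;
}
```

Consecutive turning points then delimit the line segments whose lengths and directions could feed the classifier.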
- Week 9:
- During my final week at the DIMACS REU, I completed the first draft of my final research paper and got feedback from Janne and Gradeigh. After reviewing their suggestions, I finalized my paper and turned it in to Lazaros along with my evaluation report. I also said goodbye to my friends in Janne's lab and my fellow REU students :( but I'm excited to be going home to Colorado on Friday. Thank you to everyone who worked to make this REU happen!