I am broadly interested in using mathematical models to analyze computation structures at a fine granularity in order to design better systems. As for the level at which these systems are implemented, I haven't found anything I'm particularly partial towards: I've worked at the architecture, runtime systems, and compiler levels, depending on the specifics of the project. I have yet to invest deeply in any one of these areas, though I have to admit that instrumenting and implementing at the compiler level has been the least frustrating experience.
I am also mildly interested in machine learning research, specifically as it pertains to neuro-inspired algorithms. I concentrated my cognitive science degree in theoretical neuroscience, and viewing machine learning research through that lens is very interesting to me.
I am extremely fortunate to have been advised by Saugata Ghose since my freshman year at CMU. Much of the development of my technical interests was very positively shaped under his guidance.
I spent the summer after my freshman year doing research full time with my advisor and a couple of other research interns. Since then, I have been doing research part time, working on virtual memory projects.
S. Ghose, A. G. Yağlıkçı, R. Gupta, D. Lee, K. Kudrolli, W. X. Liu, H. Hassan, K. K. Chang, N. Chatterjee, A. Agrawal, M. O'Connor, and O. Mutlu. "What Your DRAM Power Models Are Not Telling You: Lessons from a Detailed Experimental Study." In ACM SIGMETRICS, Jun. 2018. Published in Proc. of the ACM on Measurement and Analysis of Computing Systems (POMACS), Vol. 2, No. 3, Dec. 2018.