Mobile Computing

Spring 2023

This document contains the reading list for the mobile systems and applications course. The list was compiled to cover the major areas of mobile computing comprehensively. The papers to be presented in class are required reading; the other papers are optional, but I encourage you to read them as well. This course was designed together with Prof. Youngki Lee at Seoul National University.


Please use the following link to select your papers to present.

Link

IMPORTANT: Each student is expected to present two papers during the semester. Students in the School of Integrated Technology who selected this course as a qualification requirement should present three papers. Paper selection is first come, first served.


Week 1: Class Intro & Innovative Applications (No Paper Presentation)

How to Read a Paper [ACM SIGCOMM Computer Communication Review ’07]

Ubicomp Systems at 20: Progress, Opportunities, and Challenges [IEEE Pervasive Computing ’12]


Week 2: Human Behavior and Context Sensing

A Survey of Mobile Phone Sensing [IEEE Communications Magazine '10] (This paper is just for reading, not for presentation)

A practical approach for recognizing eating moments with wrist-mounted inertial sensing [ACM UbiComp '15]

Hivemind: Social Control-and-Use of IoT towards Democratization of Public Spaces [ACM MobiSys ’21]

iMon: Appearance-based Gaze Tracking System on Mobile Devices [ACM UbiComp '22]

SpeechQoE: A Novel Personalized QoE Assessment Model for Voice Services via Speech Sensing [ACM SenSys '22]


Week 3: Mobile Healthcare

Assessing Mental Health, Academic Performance and Behavioral Trends of College Students using Smartphones [ACM UbiComp ’14]

VitaMon: measuring heart rate variability using smartphone front camera [ACM SenSys '19]

eBP: A Wearable System For Frequent and Comfortable Blood Pressure Monitoring From User’s Ear [ACM MobiCom '19]

EarHealth: An Earphone-based Acoustic Otoscope for Detection of Multiple Ear Diseases in Daily Life [ACM MobiSys '22]


Week 4: Indoor Localization and Analytics 

Need Accurate User Behavior? Pay Attention to Groups! [ACM UbiComp '15]

QueueVadis: queuing analytics using smartphones [ACM/IEEE IPSN ’15]

Symphony: Localizing Multiple Acoustic Sources with a Single Microphone Array [ACM SenSys '20]

SmartLOC: Indoor Localization with Smartphone Anchors for On-Demand Delivery [ACM UbiComp '22]


Week 5: Mobile / Embedded System Privacy

DarkneTZ: Towards Model Privacy at the Edge using Trusted Execution Environments [ACM MobiSys '20]

Alexa, Stop Spying on Me!: Speech Privacy Protection Against Voice Assistants [ACM SenSys '20]

LiteZKP: Lightening Zero-Knowledge Proof-based Blockchains for IoT and Edge Platforms [IEEE Systems Journal '21]

PPFL: Privacy-preserving Federated Learning with Trusted Execution Environments [ACM MobiSys '21]


Week 6: Guest Lecture

Robert LiKamWa (Arizona State University)

Bio: Robert LiKamWa is an associate professor at Arizona State University, appointed in the School of Arts, Media and Engineering (AME) and the School of Electrical, Computer and Energy Engineering (ECEE). LiKamWa directs Meteor Studio, which explores the research and design of software and hardware for mobile Augmented Reality, Virtual Reality, Mixed Reality, and visual computing systems, and their ability to help people tell their stories. To this end, Meteor Studio's research and design projects span three arcs: (i) advanced visual capture and processing systems, (ii) systems for hybrid virtual-physical immersion through augmentation of senses, and (iii) design frameworks for data-driven augmented reality and virtual reality storytelling and sensemaking. Prior to joining ASU, LiKamWa completed his bachelor's, master's, and doctoral degrees in the Department of Electrical and Computer Engineering at Rice University. He has also interned at Microsoft Research in Redmond, Washington and Beijing, China; at the Samsung Mobile Processor Innovation Lab in Richardson, Texas; and at the National Institute of Standards and Technology in Boulder, Colorado.


Week 7: Mobile Graphics and Systems

Graphics-aware Power Governing for Mobile Devices [ACM MobiSys '19]

GLEAM: An illumination estimation framework for real-time photorealistic augmented reality on mobile devices [ACM MobiSys '19]

LpGL: Low-power Graphics Library for Mobile AR Headsets [ACM MobiSys '19]

UltraDepth: Exposing High-Resolution Texture from Depth Cameras [ACM SenSys '21]

Breaking edge shackles: Infrastructure-free collaborative mobile augmented reality [ACM SenSys '22]


Week 8: Midterm Exam Week


Week 9: Low-power System Design

Avoiding the Rush Hours: WiFi Energy Management via Traffic Isolation [ACM MobiSys '11]

Energy Characterization and Optimization of Image Sensing Toward Continuous Mobile Vision [ACM MobiSys '13]

zTT: Learning-based DVFS with Zero Thermal Throttling for Mobile Devices [ACM MobiSys '21]

Adaptive Intelligence for Batteryless Sensors Using Software-Accelerated Tsetlin Machines [ACM SenSys '22]


Week 10: Innovative Design and Development Tools

AMC: verifying user interface properties for vehicular applications [ACM MobiSys '13]

Automatic and scalable fault detection for mobile applications [ACM MobiSys '14]

PUMA: Programmable UI-Automation for Large-Scale Dynamic Analysis of Mobile Apps [ACM MobiSys '14]

Battery-free MakeCode: Accessible Programming for Intermittent Computing [ACM UbiComp '22]


Week 11: Mobile and Embedded Deep Learning (1)

Deep Learning in the Era of Edge Computing: Challenges and Opportunities [arXiv '20] (This paper is a must-read, not for presentation)

DeepX: A Software Accelerator for Low-Power Deep Learning Inference on Mobile Devices [ACM/IEEE IPSN '16]

MobiSR: Efficient On-Device Super-Resolution through Heterogeneous Mobile Processors [ACM MobiCom '19]

Memory-efficient DNN Training on Mobile Devices [ACM MobiSys '22]


Week 12: Mobile and Embedded Deep Learning (2)

Mic2Mic: Using Cycle-Consistent Generative Adversarial Networks to Overcome Microphone Variability in Speech Systems [ACM/IEEE IPSN '19]

Enabling Real-time Sign Language Translation on Mobile Platforms with On-board Depth Cameras [ACM UbiComp '21]

nn-Meter: Towards Accurate Latency Prediction of Deep-Learning Model Inference on Diverse Edge Devices [ACM MobiSys '21]


Week 13: Invited Talk (TBD)


Week 14: Mobile Cloud and Edge Computing 

CloneCloud: Elastic Execution Between Mobile Device and Cloud [EuroSys '11]

Neurosurgeon: Collaborative Intelligence Between the Cloud and Mobile Edge [ASPLOS '17]

SPINN: synergistic progressive inference of neural networks over device and cloud [ACM MobiSys '20]

RT-mDL: Supporting Real-Time Mixed Deep Learning Tasks on Edge Platforms [ACM SenSys '21]


Week 15: Reading Period


Week 16: Final Exam