[Paper Notes] Neurosurgeon: Collaborative Intelligence Between the Cloud and Mobile Edge
Key points
- How feasible is it to execute large-scale intelligent workloads on today’s mobile platforms?
- At what point is the cost of transferring speech and image data over the wireless network too high to justify cloud processing?
- What role should the mobile edge play in providing processing support for intelligent applications requiring heavy computation?
Contributions
- In-depth examination of the status quo
  - the latency and energy consumption of executing state-of-the-art DNNs in the cloud and on the mobile device
- DNN compute and data size characteristics study
  - DNN layers have significantly different compute and data size characteristics depending on their type and configuration
- DNN computation partitioning across the cloud and mobile edge
- Neurosurgeon runtime system and layer performance prediction models
  - a set of models to predict the latency and power consumption of a DNN layer based on its type and configuration
  - a system to intelligently partition DNN computation between the mobile and cloud
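Per the paper, the layer performance models are regressions over a layer's type and configuration. A minimal sketch of that shape in Python; the coefficients and feature choices below are made-up placeholders, not the paper's fitted values:

```python
# Shape of a Neurosurgeon-style per-layer latency model: one simple
# regression per layer type. Coefficients a/b are invented placeholders.

def conv_latency_ms(flops, a=2.0e-9, b=0.4):
    # Convolution latency modeled as linear in the layer's FLOPs.
    return a * flops + b

def fc_latency_ms(in_dim, out_dim, a=1.5e-9, b=0.2):
    # Fully-connected latency modeled as linear in the weight count.
    return a * (in_dim * out_dim) + b
```

In practice one such model would be fitted per layer type and per platform (mobile GPU vs. server GPU), then queried at runtime without executing the layer.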
Model
Algorithm
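With per-layer latency predictions in hand, the partition algorithm evaluates every candidate split point: run the first k layers on the mobile device, upload that point's intermediate output, run the remaining layers in the cloud, and keep the split with the lowest total latency. A hedged Python sketch; the function signature and all numbers are illustrative, not from the paper's implementation:

```python
def best_partition(mobile_ms, cloud_ms, upload_bytes, uplink_bps):
    """Pick the split k that minimizes predicted end-to-end latency.

    mobile_ms[i] / cloud_ms[i]: predicted latency of layer i on the
      mobile device / in the cloud (from the per-layer models).
    upload_bytes[k]: bytes sent over the network when splitting at k
      (upload_bytes[0] = raw input size, i.e. full cloud offload).
    Split k runs layers [0, k) locally and layers [k, n) remotely;
    k == n means everything runs on the mobile device (no upload).
    """
    n = len(mobile_ms)
    best = (float("inf"), n)
    for k in range(n + 1):
        transfer_ms = 0.0 if k == n else upload_bytes[k] * 8.0 / uplink_bps * 1000.0
        total = sum(mobile_ms[:k]) + transfer_ms + sum(cloud_ms[k:])
        best = min(best, (total, k))
    return best[1], best[0]

# Toy example: the cloud is much faster, but layer outputs shrink,
# so the best split sits where the data to upload becomes small.
mobile = [10.0, 20.0, 30.0]
cloud = [1.0, 2.0, 3.0]
upload = [4_000_000, 400_000, 4_000]   # bytes at splits 0, 1, 2
k, lat = best_partition(mobile, cloud, upload, uplink_bps=10_000_000)
# → k == 2: split after the second layer
```

The same loop can minimize predicted mobile energy instead of latency by swapping the per-layer cost arrays.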
Experimental setup
- mobile devices:
- server:
- framework: Caffe
- model: AlexNet
- other:
  - TestMyNet (measures the wireless bandwidth)
  - Watts Up (measures energy consumption): Watts Up? Power Meter. https://www.wattsupmeters.com/. Accessed: 2015-05.
  - Thrift: an open-source, flexible RPC interface for inter-process communication
  - MAUI: a general offloading framework (can serve as an experimental comparison point)
    - Eduardo Cuervo, Aruna Balasubramanian, Dae-ki Cho, Alec Wolman, Stefan Saroiu, Ranveer Chandra, and Paramvir Bahl. MAUI: Making Smartphones Last Longer with Code Offload. In Proceedings of the 8th International Conference on Mobile Systems, Applications, and Services, pages 49–62. ACM, 2010.
  - BigHouse (data center throughput simulation)
    - David Meisner, Junjie Wu, and Thomas F. Wenisch. BigHouse: A Simulation Infrastructure for Data Center Systems. ISPASS ’12: International Symposium on Performance Analysis of Systems and Software, April 2012.