DeFog: Fog Computing Benchmarks
Fog computing envisions that deploying the services of an application across resources in the cloud and resources located at the edge of the network may improve overall application performance compared to running the application in the cloud alone. However, there are currently no benchmarks that directly compare the performance of an application across cloud-only, edge-only, and cloud-edge deployment platforms to provide insight into any performance improvement. This paper proposes DeFog, the first Fog benchmarking suite, to: (i) alleviate the burden of Fog benchmarking by using a standard methodology, and (ii) facilitate understanding of the target platform by collecting a catalogue of relevant metrics for a set of benchmarks. The current portfolio of DeFog benchmarks comprises six relevant applications conducive to using the edge. Experimental studies are carried out on multiple target platforms to demonstrate the use of DeFog for collecting metrics related to application latencies (communication and computation), for understanding the impact of stress and concurrent users on application latencies, and for understanding the performance of deploying different combinations of an application's services across the cloud and edge. DeFog is available for public download (https://github.com/qub-blesson/DeFog).
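As a rough illustration of the kind of latency decomposition the abstract describes, the following minimal Python sketch separates end-to-end latency into communication and computation components for a service deployed on the cloud or the edge. It is not taken from DeFog itself: the endpoint URLs and the `X-Compute-Time` response header are hypothetical, standing in for however a benchmarked service reports its own processing time.

```python
import time
import requests  # assumed HTTP transport; DeFog's actual mechanism may differ


def measure_latencies(endpoint: str, payload: bytes) -> dict:
    """Decompose end-to-end latency into communication and computation parts.

    Assumes the service reports its own processing time in a
    hypothetical 'X-Compute-Time' response header (seconds).
    """
    start = time.perf_counter()
    response = requests.post(endpoint, data=payload, timeout=30)
    total = time.perf_counter() - start
    compute = float(response.headers.get("X-Compute-Time", 0.0))
    return {
        "total_s": total,
        "computation_s": compute,
        "communication_s": total - compute,
    }


# Compare the same service deployed cloud-only vs. on the edge (hypothetical URLs)
for name, url in [("cloud", "https://cloud.example.com/infer"),
                  ("edge", "http://edge.local:8080/infer")]:
    print(name, measure_latencies(url, b"sample input"))
```

Under this assumption, the communication component is simply the observed round-trip time minus the server-reported computation time, which is what makes cloud-only, edge-only, and cloud-edge deployments directly comparable.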