Smaller Is Bigger: Rethinking the Embedding Rate of Deep Hiding
Deep hiding, concealing secret information using Deep Neural Networks (DNNs), can significantly increase the embedding rate and improve the efficiency of secret sharing. Existing works mainly focus on designing DNNs with higher embedding rates or richer functionalities. In this paper, we aim to answer some fundamental questions: what determines the embedding rate of deep hiding, and how can it be increased? To this end, we first propose a novel Local Deep Hiding (LDH) scheme that significantly increases the embedding rate by hiding large secret images into small local regions of cover images. Our scheme consists of three DNNs: hiding, locating, and revealing. The hiding network converts a secret image into a small, imperceptible, compact secret code that is embedded into a random local region of a cover image. The locating network assists the revealing process by identifying the positions of secret codes in the stego image, while the revealing network recovers all full-size secret images from these identified local regions. Our LDH achieves an extremely high embedding rate of 16×24 bpp and exhibits superior robustness to common image distortions. We also conduct comprehensive experiments to evaluate our scheme under various system settings, and we further quantitatively analyze the trade-off between the embedding rate and image quality with different image restoration algorithms.
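As a rough illustration of the hide–locate–reveal pipeline described above, the following is a minimal PyTorch sketch. All module architectures, names (LDH, ConvBlock), and sizes (a 64×64 code, 4× upsampling) are illustrative assumptions, not the authors' actual design.

```python
# Hypothetical sketch of a Local Deep Hiding (LDH) style pipeline:
# hide a full-size secret image as a small local code, locate the code
# in the stego image, then reveal the full-size secret from that region.
import torch
import torch.nn as nn


class ConvBlock(nn.Module):
    """Two stacked 3x3 convolutions used by all three toy networks."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1),
        )

    def forward(self, x):
        return self.body(x)


class LDH(nn.Module):
    def __init__(self, code_size=64):  # code_size is an assumed value
        super().__init__()
        self.code_size = code_size
        # Hiding network: compresses the secret image into a compact code patch.
        self.hiding = nn.Sequential(
            ConvBlock(3, 32), nn.AdaptiveAvgPool2d(code_size), nn.Conv2d(32, 3, 1))
        # Locating network: predicts a heatmap marking the embedded code region.
        self.locating = nn.Sequential(ConvBlock(3, 32), nn.Conv2d(32, 1, 1))
        # Revealing network: reconstructs the full-size secret from the local code.
        self.revealing = nn.Sequential(
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            ConvBlock(3, 32), nn.Conv2d(32, 3, 1))

    def hide(self, secret, cover):
        code = self.hiding(secret)                     # compact secret code
        _, _, H, W = cover.shape
        s = self.code_size
        # Embed the code additively at a random local region of the cover.
        y = torch.randint(0, H - s + 1, (1,)).item()
        x = torch.randint(0, W - s + 1, (1,)).item()
        stego = cover.clone()
        stego[:, :, y:y + s, x:x + s] = stego[:, :, y:y + s, x:x + s] + code
        return stego

    def reveal(self, stego):
        mask = torch.sigmoid(self.locating(stego))     # (B, 1, H, W) heatmap
        s = self.code_size
        H, W = mask.shape[-2:]
        secrets = []
        for i in range(stego.shape[0]):
            # Use the strongest response as the top-left corner of the code
            # (a simplification; real localization would be more careful).
            flat_idx = int(mask[i, 0].argmax())
            y = min(flat_idx // W, H - s)
            x = min(flat_idx % W, W - s)
            patch = stego[i:i + 1, :, y:y + s, x:x + s]
            secrets.append(self.revealing(patch))      # full-size reconstruction
        return torch.cat(secrets, dim=0)


if __name__ == "__main__":
    model = LDH()
    cover = torch.rand(1, 3, 512, 512)
    secret = torch.rand(1, 3, 256, 256)
    stego = model.hide(secret, cover)
    recovered = model.reveal(stego)                    # (1, 3, 256, 256)
    print(stego.shape, recovered.shape)
```

In this sketch the secret only perturbs a small local region of the cover, which is the intuition behind the high embedding rate: the revealed secret is much larger than the area it occupies in the stego image.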