PREF: Phasorial Embedding Fields for Compact Neural Representations
We present a phasorial embedding field (PREF) as a compact representation to facilitate neural signal modeling and reconstruction tasks. Pure multi-layer perceptron (MLP)-based neural techniques are biased towards low-frequency signals and rely on deep layers or Fourier encodings to avoid losing fine details. PREF instead employs a compact and physically explainable encoding field based on the phasor formulation of the Fourier embedding space. We conduct comprehensive experiments to demonstrate the advantages of PREF over the latest spatial embedding techniques. We then develop a highly efficient frequency learning framework using an approximate inverse Fourier transform scheme for PREF, along with a novel Parseval regularizer. Extensive experiments show that our efficient and compact frequency-based neural signal processing technique matches, and in some cases surpasses, the state of the art in 2D image completion, 3D SDF surface regression, and 5D radiance field reconstruction.
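To make the high-level idea concrete, below is a minimal, illustrative sketch (not the authors' reference implementation) of a phasor-style embedding: a small grid of learnable complex Fourier coefficients is converted into a spatial feature field via an inverse FFT (an approximate inverse Fourier transform), queried by interpolation, and constrained by a Parseval-style frequency penalty. Names such as `PhasorEmbedding2D`, `num_freqs`, and `grid_res` are assumptions made for this sketch, not terms from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PhasorEmbedding2D(nn.Module):
    """Sketch of a phasor-style frequency embedding (illustrative only)."""

    def __init__(self, num_freqs: int = 16, feature_dim: int = 8, grid_res: int = 64):
        super().__init__()
        # Learnable complex Fourier coefficients: (feature_dim, num_freqs, num_freqs).
        self.coeffs = nn.Parameter(
            torch.randn(feature_dim, num_freqs, num_freqs, dtype=torch.cfloat) * 0.01
        )
        self.grid_res = grid_res

    def spatial_field(self) -> torch.Tensor:
        # Zero-pad the coefficient grid and apply an inverse FFT to obtain a
        # real-valued spatial feature grid (approximate inverse Fourier transform).
        d, k, _ = self.coeffs.shape
        padded = torch.zeros(d, self.grid_res, self.grid_res, dtype=torch.cfloat)
        padded[:, :k, :k] = self.coeffs
        return torch.fft.ifft2(padded, norm="ortho").real  # (d, R, R)

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        # coords: (N, 2) in [-1, 1]; bilinearly interpolate the spatial field.
        field = self.spatial_field().unsqueeze(0)            # (1, d, R, R)
        grid = coords.view(1, -1, 1, 2)                      # (1, N, 1, 2)
        feats = F.grid_sample(field, grid, align_corners=True)
        return feats.squeeze(0).squeeze(-1).t()              # (N, d)

    def parseval_regularizer(self) -> torch.Tensor:
        # Parseval-style penalty: a frequency-weighted norm of the coefficients
        # bounds the spatial gradient energy of the field (by Parseval's theorem),
        # discouraging high-frequency noise in the reconstruction.
        k = self.coeffs.shape[-1]
        freqs = torch.arange(k, dtype=torch.float32)
        weight = freqs.view(1, -1, 1) ** 2 + freqs.view(1, 1, -1) ** 2
        return (weight * self.coeffs.abs() ** 2).mean()
```

In such a setup, a small MLP decoder (not shown) would map the interpolated features to the target signal, and the regularizer would be added to the task loss with a small weight; the actual PREF formulation, dimensionality, and regularization details are specified in the full paper.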