> **Note:** This repository was archived by its owner on Jul 2, 2024 and is now read-only.

# NeRFactor: Data Generation

This folder contains code and instructions for:

1. rendering images of a scene and converting the rendered images into a TensorFlow dataset (`nerf_synth/`),
2. processing the two real 360-degree captures from NeRF (`nerf_real/`), and
3. converting the MERL binary BRDFs into a TensorFlow dataset (`merl/`).

## Converting the MERL Binary BRDFs Into a TensorFlow Dataset

1. Download the MERL BRDF dataset to `$proj_root/data/brdf_merl/`.

2. Convert the dataset into our format:

   ```bash
   proj_root='/data/vision/billf/intrinsic/sim'
   repo_dir="$proj_root/code/nerfactor"
   indir="$proj_root/data/brdf_merl"
   ims='256'
   outdir="$proj_root/data/brdf_merl_npz/ims${ims}_envmaph16_spp1"
   REPO_DIR="$repo_dir" "$repo_dir"/data_gen/merl/make_dataset_run.sh "$indir" "$ims" "$outdir"
   ```

   During this conversion, the BRDFs are also visualized to `$outdir/vis` in the form of characteristic slices and renders.
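If you need BRDF datasets at more than one resolution, the invocation above can be wrapped in a loop over `$ims`. This is only a sketch: resolutions other than 256 are assumptions, and the script call is echoed rather than executed, so the snippet merely illustrates the directory-naming convention.

```shell
proj_root='/data/vision/billf/intrinsic/sim'
repo_dir="$proj_root/code/nerfactor"
indir="$proj_root/data/brdf_merl"

# Assumed resolutions; only 256 is used in the instructions above
for ims in 128 256 512; do
    # Output directory encodes the image size, as in the instructions above
    outdir="$proj_root/data/brdf_merl_npz/ims${ims}_envmaph16_spp1"
    # Echo (not run) the conversion command for each resolution
    echo REPO_DIR="$repo_dir" "$repo_dir/data_gen/merl/make_dataset_run.sh" "$indir" "$ims" "$outdir"
done
```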

## NeRF: Synthetic Data

This section is relevant only if you want to render your own data, e.g., using your own scene or light probe. If our data already suffice for your purpose, it is much easier to simply download our renderings and skip the following instructions.

Go to `nerf_synth/` and follow the instructions there.

## NeRF: Real Captures

This section is relevant only if you want to process your own capture. If our processed version of the NeRF 360-degree real captures already suffices for your purpose, it is much easier to simply download it and skip the following instructions.

1. Download the 360-degree real captures by NeRF from here.

2. Convert these real images and their COLMAP poses into our format:

   ```bash
   scene='pinecone'
   proj_root='/data/vision/billf/intrinsic/sim'
   repo_dir="$proj_root/code/nerfactor"
   scene_dir="$proj_root/data/nerf_real_360/$scene"
   h='512'
   n_vali='2'
   outroot="$proj_root/data/nerf_real_360_proc/${scene}"
   REPO_DIR="$repo_dir" "$repo_dir/data_gen/nerf_real/make_dataset_run.sh" \
       --scene_dir="$scene_dir" --h="$h" --n_vali="$n_vali" --outroot="$outroot"
   ```
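To process both real 360-degree captures in one pass, the call above can be looped over the scene names. This is a sketch that assumes the second capture is named `vasedeck`; the command is echoed rather than run, purely to illustrate the per-scene paths.

```shell
proj_root='/data/vision/billf/intrinsic/sim'
repo_dir="$proj_root/code/nerfactor"
h='512'
n_vali='2'

# 'vasedeck' is an assumed name for the second NeRF real capture
for scene in pinecone vasedeck; do
    # Per-scene input and output directories, following the layout above
    scene_dir="$proj_root/data/nerf_real_360/$scene"
    outroot="$proj_root/data/nerf_real_360_proc/${scene}"
    # Echo (not run) the conversion command for each scene
    echo REPO_DIR="$repo_dir" "$repo_dir/data_gen/nerf_real/make_dataset_run.sh" \
        --scene_dir="$scene_dir" --h="$h" --n_vali="$n_vali" --outroot="$outroot"
done
```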

## DTU: MVS Shape

On 05/31/2022, we added our main scripts that convert MVS shape output into our data format (surface points, normals, light visibility, etc.); please see `dtu_mvs/`.