
redkitchen incomplete #10

Open
lo2aayy opened this issue Jan 16, 2018 · 7 comments

Comments

lo2aayy commented Jan 16, 2018

Hi,

I tried reconstructing the whole redkitchen sequence, but the reconstruction doesn't look like the one given in the dataset; some parts of it are cropped or incomplete. Do you have any idea why?

The picture below is my reconstruction.

[screenshot: my reconstruction, taken 2018-01-16]

and this picture is the original one:

[image: original redkitchen reconstruction]

andyzeng (Owner) commented

Hello! Have you tried increasing the size of the volumetric voxel grid? With the default parameters (voxel_size = 0.006 and voxel_grid_dim = 500x500x500), the voxel grid only covers a 3 m x 3 m x 3 m volume.
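
For reference, the metric coverage follows directly from those two parameters. A minimal C++ sketch (not from the repo; parameter names borrowed from demo.cu, values are the defaults quoted above):

```cpp
// Minimal sketch: how the default parameters translate into metric coverage.
// The math is just side length = voxel_size * voxel_grid_dim.
#include <cstdio>

int main() {
  const float voxel_size = 0.006f;   // meters per voxel (default)
  const int voxel_grid_dim = 500;    // voxels per side (default)

  const float extent_m = voxel_size * voxel_grid_dim;  // 0.006 * 500 = 3.0 m
  std::printf("volume covers %.1f x %.1f x %.1f m\n", extent_m, extent_m, extent_m);
  return 0;
}
```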

lo2aayy (Author) commented Jan 18, 2018

Thanks for your reply.
However, when I increase voxel_grid_dim to any size above 1200x1200x1200, I get the following error:

terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc
Aborted (core dumped)

and with anything less than 1200 the reconstruction is still incomplete.

I also have another question: is it possible to directly use the binary TSDF volume from the 7-Scenes dataset (the .raw file) with your tsdf2mesh function to generate a mesh?

andyzeng (Owner) commented

The error occurs because the voxel grid is too large to fit in memory. Since the code creates two voxel grids (one storing distance values and another storing weights for computing running averages), a voxel grid of size 1200x1200x1200 takes 12+ GB of RAM and GPU memory.
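
For anyone hitting the same std::bad_alloc, the back-of-the-envelope memory math is roughly this (a sketch assuming 4-byte floats per voxel and the two grids mentioned above):

```cpp
// Rough memory estimate for a 1200^3 TSDF volume: two float grids
// (distance values + weights), 4 bytes per voxel each.
#include <cstdio>

int main() {
  const long long dim = 1200;
  const long long voxels = dim * dim * dim;            // 1.728e9 voxels
  const double bytes = 2.0 * voxels * sizeof(float);   // two float grids
  std::printf("~%.1f GB needed in RAM and again on the GPU\n", bytes / 1e9);
  // Prints ~13.8 GB, consistent with the 12+ GB figure above.
  return 0;
}
```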

You'll have to play around with the parameters (voxel_grid_origin_*, voxel_size, and voxel_grid_dim_*) to get the same result as the mesh you showed. Try voxel_size = 0.01 to set each voxel to 1 cm; this lowers the resolution but increases the metric coverage of the volume without increasing memory usage.
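
As a concrete example of that trade-off, a small sketch (parameter names from the list above; the origin value is a placeholder you would tune for your scene):

```cpp
// Hypothetical parameter choice: coarser 1 cm voxels stretch the same 500^3
// grid from 3 m to 5 m per side with no change in memory use.
#include <cstdio>

int main() {
  const float voxel_size = 0.01f;           // 1 cm voxels instead of 6 mm
  const int   voxel_grid_dim_x = 500;       // unchanged -> same memory footprint
  const float voxel_grid_origin_x = -2.5f;  // placeholder: position the volume
                                            // so it actually contains the kitchen
  std::printf("x spans [%.2f, %.2f] m\n",
              voxel_grid_origin_x,
              voxel_grid_origin_x + voxel_grid_dim_x * voxel_size);
  return 0;
}
```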

The tsdf2mesh in this repository is not out-of-the-box compatible with 7-scenes .raw files. You will need to modify tsdf2mesh to support that.

lo2aayy (Author) commented Jan 20, 2018

Thanks, but I was trying to use the info in the .mhd file of redkitchen from the 7-Scenes dataset to make a mesh using tsdf2mesh. The .mhd file says that the offset is 0 0 3000 and the element spacing is 11.718750, with mm as the unit, so I converted it to meters and reconstructed the scene. The problem is that the mesh now has a range of 0.48-5.82 in y, 0.0117-6.00 in x, and 3.01-7.96 in z, but the poses in the dataset do not fall inside this model. Could this be because the model is in camera coordinates and has to be converted to global coordinates? Also, when you swap the two columns while converting the model from voxel coordinates to camera coordinates, don't you have to change the sign?
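
(For concreteness, here is a small sketch of the conversion described in this comment, using the .mhd values quoted above and the standard MetaImage convention of position = Offset + index * ElementSpacing; the voxel index below is just an example.)

```cpp
// Sketch: map a voxel index from the 7-Scenes redkitchen TSDF volume to meters,
// using the .mhd values quoted above (Offset = 0 0 3000, ElementSpacing =
// 11.718750, units in mm).
#include <cstdio>

int main() {
  const double offset_mm[3] = {0.0, 0.0, 3000.0};
  const double spacing_mm   = 11.718750;
  const double spacing_m    = spacing_mm / 1000.0;   // 0.01171875 m

  const int idx[3] = {100, 200, 50};                 // arbitrary example voxel (x, y, z)
  double pos_m[3];
  for (int i = 0; i < 3; ++i)
    pos_m[i] = offset_mm[i] / 1000.0 + idx[i] * spacing_m;

  std::printf("voxel (%d, %d, %d) -> (%.3f, %.3f, %.3f) m\n",
              idx[0], idx[1], idx[2], pos_m[0], pos_m[1], pos_m[2]);
  return 0;
}
```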

andyzeng (Owner) commented

> Could this be because the model is in camera coordinates and has to be converted to global coordinates?

Correct. The fused model created by demo.cu lies in the camera coordinates of the base frame that you specify (see base_frame_idx in demo.cu). This is different from the models produced by the 7-scenes dataset.

> Also, when you swap the two columns while converting the model from voxel coordinates to camera coordinates, don't you have to change the sign?

No. The transformation between voxel coordinates and the base camera coordinates (that the model was fused in) should only amount to a translation and a scaling. The swap that occurs in tsdf2mesh.m is only there to account for Matlab's y-first indexing.
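
To make that concrete, a minimal sketch of the voxel-to-base-camera mapping just described (parameter names follow demo.cu; the origin values are placeholders for whatever the fusion was run with, and no sign flip appears anywhere):

```cpp
// Sketch: voxel indices from the demo.cu volume map to the base camera frame
// by a scale (voxel_size) and a translation (voxel_grid_origin_*) only.
// The column swap in tsdf2mesh.m is purely Matlab's y-first indexing.
#include <cstdio>

int main() {
  const float voxel_size = 0.006f;
  const float voxel_grid_origin[3] = {-1.5f, -1.5f, 0.5f};  // placeholders

  const int idx[3] = {250, 250, 100};   // example voxel (x, y, z)
  float cam[3];
  for (int i = 0; i < 3; ++i)
    cam[i] = voxel_grid_origin[i] + idx[i] * voxel_size;

  std::printf("voxel (%d, %d, %d) -> base camera frame (%.3f, %.3f, %.3f) m\n",
              idx[0], idx[1], idx[2], cam[0], cam[1], cam[2]);
  return 0;
}
```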

lo2aayy (Author) commented Jan 22, 2018

Now I can read the .raw file given in the redkitchen dataset and create a mesh using tsdf2mesh, but it's not compatible with the poses in the redkitchen dataset: when I plot the poses they don't lie on the model. Do you have any idea why?

ez4lionky commented

> Now I can read the .raw file given in the redkitchen dataset and create a mesh using tsdf2mesh, but it's not compatible with the poses in the redkitchen dataset: when I plot the poses they don't lie on the model. Do you have any idea why?

Did you solve the problem? How can I read the .raw file from the 7-Scenes dataset and generate a mesh?


The Python version of TSDF-fusion seems to implement automatic estimation of the voxel volume bounds.
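
(Not an authoritative answer, but a minimal C++ sketch of loading such a volume; the dimensions, element type, and file name below are assumptions, so check your own .mhd header (DimSize, ElementType, ElementDataFile) and adjust accordingly.)

```cpp
// Sketch: read a raw TSDF volume into memory. All sizes/types/paths here are
// assumptions taken from a typical .mhd header -- verify against yours.
#include <cstdio>
#include <fstream>
#include <vector>

int main() {
  const int dim_x = 512, dim_y = 512, dim_z = 512;  // assumed DimSize
  const char* raw_path = "redkitchen.raw";          // assumed ElementDataFile

  std::ifstream in(raw_path, std::ios::binary);
  if (!in) { std::fprintf(stderr, "cannot open %s\n", raw_path); return 1; }

  std::vector<float> tsdf(static_cast<size_t>(dim_x) * dim_y * dim_z);  // assumes MET_FLOAT
  in.read(reinterpret_cast<char*>(tsdf.data()),
          static_cast<std::streamsize>(tsdf.size() * sizeof(float)));
  if (!in) { std::fprintf(stderr, "short read: check DimSize/ElementType\n"); return 1; }

  // From here the volume could be handed to a marching-cubes step
  // (e.g. a modified tsdf2mesh) to extract a mesh.
  std::printf("loaded %zu voxels; tsdf[0] = %f\n", tsdf.size(), tsdf[0]);
  return 0;
}
```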
