I have been trying to use my own dataset with MeshCNN, but I cannot figure out how to make the code accept it.

I created a new dataset in the `datasets` folder with the same directory structure as the shrec16 dataset, but when I run the code after changing the `--dataroot` option to `datasets/anchor_obj` in `train.sh`, I get the following error:
```
  File "/home/sawaiz/Documents/Lab/Projects/MeshCNN/MeshCNN-master/data/__init__.py", line 12, in CreateDataset
    dataset = ClassificationData(opt)
  File "/home/sawaiz/Documents/Lab/Projects/MeshCNN/MeshCNN-master/data/classification_data.py", line 19, in __init__
    self.get_mean_std()
  File "/home/sawaiz/Documents/Lab/Projects/MeshCNN/MeshCNN-master/data/base_dataset.py", line 32, in get_mean_std
    for i, data in enumerate(self):
  File "/home/sawaiz/Documents/Lab/Projects/MeshCNN/MeshCNN-master/data/classification_data.py", line 34, in __getitem__
    edge_features = pad(edge_features, self.opt.ninput_edges)
  File "/home/sawaiz/Documents/Lab/Projects/MeshCNN/MeshCNN-master/util/util.py", line 22, in pad
    return np.pad(input_arr, pad_width=npad, mode='constant', constant_values=val)
  File "/home/sawaiz/.local/lib/python3.10/site-packages/numpy/lib/arraypad.py", line 748, in pad
    pad_width = _as_pairs(pad_width, array.ndim, as_index=True)
  File "/home/sawaiz/.local/lib/python3.10/site-packages/numpy/lib/arraypad.py", line 518, in _as_pairs
    raise ValueError("index can't contain negative values")
ValueError: index can't contain negative values
```
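For reference, this `np.pad` error occurs whenever the requested pad width is negative, which in this traceback likely means a mesh has more edges than `--ninput_edges` allows for. A minimal sketch of that NumPy behavior (shapes here are hypothetical, not MeshCNN's actual values):

```python
import numpy as np

edge_features = np.zeros((5, 900))   # hypothetical: a mesh with 900 edges
ninput_edges = 750                   # hypothetical: configured input size

# Padding the edge dimension up to ninput_edges yields a negative width
# when the mesh already has more edges than the target size.
npad = ((0, 0), (0, ninput_edges - edge_features.shape[1]))
try:
    np.pad(edge_features, pad_width=npad, mode='constant', constant_values=0)
except ValueError as e:
    print(e)  # index can't contain negative values
```

So the padding step can only grow the feature array, never shrink it; meshes above the edge budget trigger exactly this exception.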
I suspect I need to preprocess my data in a particular way, but I am not sure how. Could you please clarify what is required?

My dataset is a face dataset. I used the MediaPipe face landmarker to extract 3D facial landmarks and then converted the resulting point cloud into a mesh with the Open3D library. I can open the .obj files and view the objects. The number of vertices and faces varies across the files.
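Since the meshes have variable sizes, one quick diagnostic is to count the edges of each .obj file and compare against the configured edge budget. A rough sketch (assuming triangular faces; `datasets/anchor_obj` is the dataset path from above):

```python
import glob

def count_edges(obj_path):
    """Count unique undirected edges in a triangular .obj mesh."""
    edges = set()
    with open(obj_path) as f:
        for line in f:
            if line.startswith('f '):
                # Keep only the vertex index before any '/' (v/vt/vn syntax).
                idx = [int(tok.split('/')[0]) for tok in line.split()[1:]]
                # Each consecutive pair of face corners (wrapping around) is an edge.
                for a, b in zip(idx, idx[1:] + idx[:1]):
                    edges.add((min(a, b), max(a, b)))
    return len(edges)

if __name__ == '__main__':
    paths = glob.glob('datasets/anchor_obj/**/*.obj', recursive=True)
    if paths:
        # The largest edge count is a lower bound for --ninput_edges.
        print(max(count_edges(p) for p in paths))
```

If the maximum exceeds the `--ninput_edges` value in `train.sh`, raising that option (or decimating the meshes to a fixed edge count) should avoid the negative-padding error.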