I am stuck with the code below, as it takes a long time on Google Colab.
    embeddings = []
    for i in range(metadata.shape[0]):
        img_path = metadata[i].image_path()
        img = load_image(img_path)
        img = (img / 255.).astype(np.float32)
        img = cv2.resize(img, dsize=(224, 224))
        embedding_vector = vgg_face_descriptor.predict(np.expand_dims(img, axis=0))
        embeddings.append(embedding_vector)
This code has to create an embedding vector for each of 11,000 images. Also, some images may fail to load; please help me fill in zeros for those images.
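For reference, here is a sketch of what I mean: calling predict() once per image is the main bottleneck, so batching many images into one predict() call should help, and any image that fails to load keeps an all-zero row. This assumes `metadata`, `load_image`, and `vgg_face_descriptor` from my notebook; the resize/scale preprocessing from the loop above would go inside the `load_and_preprocess` callable.

```python
import numpy as np

def compute_embeddings(paths, load_and_preprocess, model, batch_size=64):
    """Batched embeddings; images that fail to load get zero vectors.

    `load_and_preprocess` should return a (224, 224, 3) float32 array
    (the resize/scale steps from the loop above) or raise on failure.
    """
    dim = model.output_shape[-1]          # embedding size (2622 for VGG-Face)
    embeddings = np.zeros((len(paths), dim), dtype=np.float32)
    batch, idx = [], []
    for i, p in enumerate(paths):
        try:
            batch.append(load_and_preprocess(p))
            idx.append(i)
        except Exception:
            continue                      # unreadable image: row i stays zero
        if len(batch) == batch_size:
            # one predict() call for the whole batch instead of per image
            embeddings[idx] = model.predict(np.stack(batch))
            batch, idx = [], []
    if batch:                             # flush the final partial batch
        embeddings[idx] = model.predict(np.stack(batch))
    return embeddings
```

Calling it would look like `compute_embeddings([metadata[i].image_path() for i in range(metadata.shape[0])], my_loader, vgg_face_descriptor)`, where `my_loader` wraps load_image, the /255 scaling, and cv2.resize.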
I am using Pinterest Data for my academic work.
I have to submit this before 3rd July; please help.
BangPypers mailing list
[hidden email] https://mail.python.org/mailman/listinfo/bangpypers