Add a value to a single-select structured metadata field
I want to use the Node SDK to add a new value to a single-select structured metadata list through the API. As far as I can tell, I have to fetch the whole datasource, append the new item, and write the datasource back.
The problem is that my list has thousands of items. Is there an easy way to add a single value without having to download the entire list?
Answers
-
Hi @mgosciniak,
Using Node, you should be able to edit an existing data source like this:
var csv = require("csvtojson"); // npm install --save csvtojson@latest
var cloudinary = require('cloudinary').v2;
var util = require('util'); // npm install --save util

/** config **/
cloudinary.config({
  cloud_name: 'xxx',
  api_key: 'xxxx',
  api_secret: 'xxxx',
  secure: true
});

/** The structure of the CSV can be:
 * external_id,value
 * Breathing_Space_2,Breathing Space
 *
 * or
 *
 * value,external_id
 * Breathing Space,Breathing_Space_2
 *
 * The script will handle both cases properly.
 **/
csv().fromFile("./****_export.csv").then(function (jsonArrayObj) {
  cloudinary.api.update_metadata_field_datasource('external_id_of_field', {
    "values": jsonArrayObj
  }, function (error, result) {
    console.log(result, error);
  });
});
You just need to make sure the structure of the CSV follows the structure in the comment, and change the
external_id_of_field
value. Hope that helps.
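Since `update_metadata_field_datasource` performs a partial update (entries with new external_ids are appended, existing ones are updated), you shouldn't need to download the list first to add one value. Here is a minimal sketch of building the payload for a single entry; the entry values and the field name are hypothetical placeholders:

```javascript
// Build the partial-update payload for one new datasource entry.
// Only the entries passed in "values" are touched by the update.
function singleEntryPayload(externalId, value) {
  return { values: [{ external_id: externalId, value: value }] };
}

// Usage (requires configured credentials; 'external_id_of_field' is
// the field's external_id, and the entry below is just an example):
// const cloudinary = require('cloudinary').v2;
// cloudinary.api.update_metadata_field_datasource(
//   'external_id_of_field',
//   singleEntryPayload('Breathing_Space_3', 'Breathing Space')
// ).then(console.log).catch(console.error);
```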
Best,
Loic
-
Thanks, but there is another problem: after every datasource update, the API returns the full datasource. Is it possible to disable this? The list will have ~100k items, and returning all of that data on each call is a big problem for us.
-
Hi,
It's not possible to change the response. Are you worried about it because you save these records, or are you asking hypothetically?
Thanks
-Tamara
-
We will be adding data to this collection very often on our backend side. If every add request returns the full datasource, it will be really slow. We will have 100k or more items on this list, and returning 100k rows each time would be a killer for our system.
-
Hey @mgosciniak.
If you have several thousand metadata entries, a better approach may be to use a free-text field on the Cloudinary side, and then make use of webhook notifications to notify your site/application that the metadata has been set or updated, and perform the validation on your side.
For instance, if you're using a SKU, you could set that as a mandatory free-text field, and when the notification comes through, check against your database of items and ensure the SKU matches. If it doesn't, you could then perform an update to the asset via the API to change its moderation status, notify the user who uploaded/edited the file, or take some other action.
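The server-side check described above could be sketched like this; `validateSkuFromWebhook`, the `sku` metadata field name, and the notification shape are illustrative assumptions, not a fixed Cloudinary payload format:

```javascript
// Sketch: when a webhook notification arrives, validate the free-text
// SKU field against your own database of known SKUs.
function validateSkuFromWebhook(notification, validSkus) {
  const sku = notification.metadata && notification.metadata.sku;
  return typeof sku === 'string' && validSkus.has(sku);
}

// Usage inside a webhook endpoint (Express-style, illustrative only):
// app.post('/cloudinary-webhook', (req, res) => {
//   if (!validateSkuFromWebhook(req.body, knownSkus)) {
//     // e.g. change the asset's moderation status via the Admin API,
//     // or notify the user who uploaded/edited the file
//   }
//   res.sendStatus(200);
// });
```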
I hope this helps. Please let me know if there's anything further we can do.
Kind regards,
-Danny
-
We can't use a text field. We must use a select list with external_ids. This list will be synchronized with a source that comes from our system. During asset upload, the user will select one of the list values; then we will get a webhook from Cloudinary and act on it on our side. The values in this list must have external IDs because they are not unique. So we need a list with 100k+ items.
-
I appreciate that there is a sync to maintain; however, the maximum number of options you can have in a single metadata list is around 2-3,000, so 100k+ isn't supported. Plus, from a user-experience perspective, having your users manually scroll and read through such a large list isn't recommended anyway.
I would strongly recommend performing the validation on the server side for the best results.
-
As it works today, the user sees only the first 50 items of the list in the Media Library; the rest is searchable.
The problem is what I wrote in another post: your system fetches the whole list in one request from the frontend (currently 3k items). In that post, one of your colleagues said it is possible to change this limit and asked me to open a private ticket. But even if you raise the limit, the frontend may still have a performance issue.
And we must have this long list. A little thing, but a critical issue for my company.
-
Hi again @mgosciniak
We do have a little wiggle room in terms of how much we can expand the maximum number of items in the metadata list, but upwards of 100,000, or even upwards of 5,000 will have a significant performance hit for anyone using the Media Library. When a user uploads or edits an asset on your account, we have to pull in all of the metadata so that the autocomplete can populate. With such a large number of options, fetching this metadata will take a not-insignificant amount of time to do, likely causing the browser to hang while these requests complete. And as previously mentioned, these metadata fields are also returned in the API too, and sadly we're not able to prevent that from happening.
I hate to be the bearer of bad news, but I'm afraid we won't be able to support such a large number of metadata list items.
That being said, if you ultimately were to perform the validation on the server side, then you could still ensure accuracy for users who are entering in the metadata, and also not have the Media Library hang for any uploads or changes to metadata.