The CSV import extension has a limit of 25,000 rows per CSV. Does anyone know if using input.fileAsync in a script has the same limit?
I don’t know about the scripting limits, so hopefully somebody else can answer that below.
However, you can always use Make, which has no row limit. I just did a podcast episode on that last week:
I don’t believe that input.fileAsync has a row limit.
Thanks @ScottWorld. I love Make. But in this case, I think it would eat up way too many operations.
The client exports her member data out of Mighty Networks every Thursday and imports it into Airtable to reconcile it with the data she stores there about each member. She just went over 25,000 members, so she now has to split the CSV before importing. With Make, it would take over 50,000 operations every Thursday (25,000+ for the search step plus 25,000 for the upsert step).
I’m really close to having the custom script ready for her. It is going to make the import process so much smoother! An added benefit is that I’ll be able to automatically reconcile linked relationships as part of the script. (That was something she had to do manually after importing with the regular CSV import extension.)
I just watched the Built On Air episode because learning how to use CSVs with Make has been on my to-do list. I do have a question: are API rate limits ever an issue when processing that much data into Airtable? Does Make slow itself down enough to avoid hitting that limit while cycling through each row of the CSV? I have other Make scenarios that are driving me bananas because I keep hitting the API rate limit of 5 requests per second per base.
You’re right!
I just tested a CSV that had 25,189 rows and it worked.
I don’t think that Make slows itself down, but if it goes too fast, you can always insert a Sleep module into your scenario, which will pause the scenario for the number of seconds that you specify.