Formula to compare 6 dates and determine which was first

I’m looking for a way to determine which course a customer purchased first. The problem is that each course has its own table. (I REALLY wish we could restructure this and put all course enrollments in one table, but there is an obstacle I don’t think we can overcome.)

So it breaks down to there being 7 tables:

  • Members Table (all members)
  • Course 1
  • Course 2
  • Course 3
  • Course 4
  • Course 5
  • Course 6

Contacts could be linked to just one course or any number of them. There is a “joined date” on each course’s table, and those dates are also in rollup fields on the main Members table.

The client really wants a way to see the order the courses were purchased in. I told them I could probably find a way to show which course was purchased first, but that was probably as much as we could realistically do without restructuring the data.

I was picturing a really long nested IF formula comparing the dates to work out the first course purchased. But I’ve tried soooooo many things and I can’t seem to get it.

Any suggestions?

It’s not you; I have a hunch it’s the data model, :wink: specifically the lack of a transaction register. Imagine a [flat] list of every course purchased, the course ID, the member, and the date.

  1. Filter for customer
  2. Sort on date (ascending)
  3. Bob’s your uncle.
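
Purely made-up rows, just to show the shape:

  Member     Course ID   Purchase Date
  Ann M.     Course 3    2023-01-14
  Ann M.     Course 1    2023-02-02
  Ben T.     Course 5    2023-01-20

Filter to Ann, sort ascending on Purchase Date, and the top row is her first course (Course 3).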

What am I missing?

Correct, and it probably never should have been stored in separate tables. Sure, you may want to render it that way for particular requirements, but typically, the most basic data structure is the one that will afford the greatest agility, and that’s the good ol’ trusty transaction register (i.e., transactional data).

Yep, and when a new course is added, new fields are added, and then it’s off to the races to re-scale the system. Any formula dependencies on existing and all future fields must then be changed, right?

Transactional data eliminates this.

This is amazing! You always have all the answers!

It gave the earliest date. Woo-hoo!! I really didn’t think this was going to be possible. And then I wrote a formula to translate that into which product was the first.
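
For anyone who finds this later, the gist (sketched with made-up field names, trimmed to three of the six courses, and assuming the rollups are set to something like MAX(values) so each comes through as a single date) is to turn each joined date into a sortable YYYYMMDD number, let blanks lose the comparison, and convert the winner back into a date:

  IF(
    OR({Course 1 Joined}, {Course 2 Joined}, {Course 3 Joined}),
    DATETIME_PARSE(
      MIN(
        IF({Course 1 Joined}, VALUE(DATETIME_FORMAT({Course 1 Joined}, 'YYYYMMDD')), 99999999),
        IF({Course 2 Joined}, VALUE(DATETIME_FORMAT({Course 2 Joined}, 'YYYYMMDD')), 99999999),
        IF({Course 3 Joined}, VALUE(DATETIME_FORMAT({Course 3 Joined}, 'YYYYMMDD')), 99999999)
      ) & '',
      'YYYYMMDD'
    )
  )

and then, if that formula lives in a field called {First Purchase Date}, translating it into a course name is just more guarded comparisons:

  IF(
    {First Purchase Date},
    IF(
      IF({Course 1 Joined}, DATETIME_FORMAT({Course 1 Joined}, 'YYYYMMDD')) = DATETIME_FORMAT({First Purchase Date}, 'YYYYMMDD'),
      "Course 1",
      IF(
        IF({Course 2 Joined}, DATETIME_FORMAT({Course 2 Joined}, 'YYYYMMDD')) = DATETIME_FORMAT({First Purchase Date}, 'YYYYMMDD'),
        "Course 2",
        "Course 3"
      )
    )
  )

It only compares at day-level granularity, so if two courses were joined on the same day the one listed first in the formula wins, and the remaining courses are just more of the same lines.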

Thank you so much!!

Yup. I agree that a different scheme would be better. But Alisha said that she already wants to do this but cannot right now. And I’m on my phone with only one hand available, so I thought I’d just answer the question.

The obstacle we have comes down to the course platform the client is using - Mighty Networks.

It’s horrible to work with. They don’t have an open API. The few Zap triggers they have are basically useless. They don’t make it easy to access data.

Regular manual CSV imports to update student data/progress are our only solution. The client isn’t tech savvy but has become comfortable enough to handle the regular imports on her own.

The process is tedious. The way MN works, she has to initiate getting the csv from each of the 6 courses separately and wait for them to be emailed to her.

We are “merging” the data in using the CSV import extension to update the students’ progress and recent activity. The only unique identifier in the report is their Member ID (no reference at all in the CSV as to which course!). If we put all courses on the same table, this process would require her to modify the CSV columns before every import. She likes doing it daily to every couple of days, so that would be a lot of extra work for her.

But I agree it is not the best way from a data management/analysis standpoint!

This is my third client using Mighty Networks. The more I work with it, the more I dislike it.

Understood. So, you could back into the transactional list using a script process that generates the missing ideal table.

@Alisha_Sadeski Just an FYI that the process you described regarding importing the CSV files could be easily automated with Make.com, and it would also give you significantly more functionality than what you’re getting right now from Airtable’s CSV Import tool. You could even fully automate the process by having Make monitor your emails for those CSV files. Otherwise, they could just be dropped into a cloud folder for processing.

Ahhhhh!!! I use Make a lot. But this is not a use case I ever thought of (or knew was possible). :exploding_head:

I just went and played around in Make! I had never even noticed the csv tool before. I just set up a simple little test. It worked and was super smooth and fast. This is so good to know. Thanks for teaching me something new. :blush:

I love using robots to do tasks like this, not just to save time but to prevent human error! I see huge benefit to setting it up this way… except for the number of tasks. :frowning: The combined rows from the imports she is doing come to over 25,000. (We can’t export just the updated students, so each report includes ALL students from all time in that course.)

So glad to have learned this. It’s been on my to-do list to watch the Make videos you shared recently! I’m sure there is so much more to discover!

That’s great to hear! Glad I could help! :blush:

Make is very powerful, indeed.

Imagine a different approach that requires only your data and one API call to OpenAI that costs about 1/60th of a penny.

Given this GPT query:

This is a straightforward idea, but when querying data that may not be ideally shaped, we will increasingly see LLMs provide answers in cases where scripting or formula-based approaches are complex and require skills and time (of course).

RELATED - it’s unfortunate that Airtable never took up my 2020 suggestion that scripts could become the underlying code for custom formulas. Had they done this, the formula that @Kuovonne shared could have been:

gptPrompt("Given this list of dates, determine which is the earliest:", [{field1}, {field2}, {fieldn...}])

I import about 10 CSVs into Airtable on a daily basis. The Airtable extensions work well (especially since they don’t cost a penny), but they are limited. I like Integromat because it’s so powerful in many areas; however, it becomes complicated if I need to modify the CSV data in some way before import and do that on a routine basis.

I use https://csvbox.io/, another alternative is https://www.easycsv.io/, and I used to use https://www.easydatatransform.com/ for offline CSV manipulation, though then I would still use the Airtable extension to import. The first two are ideal for a workflow with Airtable, since you can still visualize the configuration and change it easily if the CSV changes. With Integromat, changing the settings is what I find takes more time when the CSV changes.