Large Language Models (LLMs) are a type of artificial intelligence (AI) designed to understand and generate human-like language using deep learning techniques. They are built on transformer neural networks, often spanning billions of parameters, which enable them to process and analyse vast amounts of text data and perform tasks such as text generation, translation and summarisation. Some prominent examples of LLMs include OpenAI GPT-4, Anthropic Claude, Google Gemini and DeepSeek.
These models have revolutionised natural language processing (NLP) by enabling machines to perform human-like communication tasks with remarkable accuracy.
All of the LLMs mentioned above offer access via standard JSON/REST API interfaces, and Claris FileMaker first introduced support for integrating with REST APIs via the cURL options in the Insert from URL script step as far back as FileMaker 16 (released in 2017). FileMaker 2024 dramatically simplifies the work of leveraging LLMs by adding a set of native AI-specific script steps, including Configure AI Account, Insert Embedding and Perform Semantic Find.
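To make the 'standard JSON/REST' point concrete, the sketch below shows the kind of embeddings request that sits behind these integrations, written in Python rather than as a FileMaker script so the payload is easy to see. It assumes OpenAI's /v1/embeddings endpoint and the text-embedding-3-small model purely as an example (any of the providers above expose an equivalent call), and the API key shown is a placeholder.

import requests

API_KEY = "sk-..."  # placeholder - substitute your own provider API key

def get_embedding(text: str) -> list[float]:
    """Ask the provider's REST API for a vector embedding of a piece of text."""
    response = requests.post(
        "https://api.openai.com/v1/embeddings",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": "text-embedding-3-small", "input": text},
        timeout=30,
    )
    response.raise_for_status()
    # The response contains the embedding as a long list of floating-point numbers.
    return response.json()["data"][0]["embedding"]

print(len(get_embedding("Experienced B2B Marketing Professional")))  # dimensionality of the vector

The native script steps in FileMaker 2024 wrap this kind of request for you, so there is no longer any need to build the JSON payload and cURL options by hand.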
We have a number of clients using the FileMaker platform for Recruitment (as Consultants or in-house HR Teams). As you might expect, they have extensive databases with thousands (or tens of thousands) of Candidate profiles, usually with a summary of the Candidate's CV stored in a text field. Commonly, the first stage in generating a shortlist for a new role is to whittle down potential candidates based on their availability, the seniority of the role and the salary band, but this can still leave hundreds of suitable applicants that the Recruiter would need to review individually, or run keyword searches against their CV summaries in the hope of getting down to a manageable list of 5-10 prospective candidates to contact.
Classic keyword searches can be exceptionally hit and miss - if you are looking at CVs for a Marketing position then just about every candidate will mention 'Marketing' somewhere in their CV, so that isn't a useful way to refine the search. Likewise, too specific a search term might incorrectly rule out candidates who are otherwise qualified - for example, an exact-match search on 'B2B Marketing' would miss candidates with 'B2B and Consumer Marketing' or 'Marketing (B2B)' in their CV summary.
Clearly, what we need here is a search technology that focuses on understanding the meaning and intent behind the Consultant's query rather than bluntly matching keywords - and that is exactly what semantic search does. A semantic search for an 'Experienced B2B Marketing Professional with background in IT' should pick up candidates whose CVs use phrases as varied as 'Experienced B2B Marketeer', 'B2B Veteran' or 'Cloud Services Marketing (B2B)'.
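To make the contrast concrete, here is a small Python sketch: an exact keyword test misses all of the variant phrasings above, whereas semantic search compares embedding vectors using a similarity measure such as cosine similarity, which scores close paraphrases highly. The vectors below are made-up toy values purely to show the arithmetic, not real embeddings.

from math import sqrt

cvs = [
    "Experienced B2B Marketeer",
    "B2B and Consumer Marketing lead",
    "Cloud Services Marketing (B2B)",
]

# Exact keyword search: none of these summaries contain the literal string "B2B Marketing".
print([cv for cv in cvs if "B2B Marketing" in cv])  # -> []

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """1.0 means the vectors point the same way (same meaning); 0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# Toy three-dimensional vectors standing in for real embeddings,
# which typically have hundreds or thousands of dimensions.
query_vec = [0.9, 0.3, 0.1]      # "Experienced B2B Marketing Professional with background in IT"
candidate_vec = [0.8, 0.4, 0.2]  # "Experienced B2B Marketeer"
print(round(cosine_similarity(query_vec, candidate_vec), 2))  # ~0.98, i.e. a very close semantic match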
How do you achieve this in FileMaker? First, you need to register with an LLM provider and obtain an API key (Cohere offers a free trial account). Then you just need to set up three scripts:
The first script, 'Configure AI Account', simply contains the authentication information, so that it can be run as a subscript each time the LLM account needs to be used. The second, 'Embedding Details Data - Applicants', takes the text from the 'Biography' (Text) field and pushes the vectorised representation returned by the LLM into the 'Biography_Embedding' (Container) field; in practice you would probably want this to run via a Script Trigger every time the 'Biography' field is updated. Finally, the last script, 'Perform Semantic Find - Applicants', takes the text from the 'SearchApplicantGlobal' (Global text) field, uses the LLM to generate a vectorised representation of it, and compares that to the contents of the 'Biography_Embedding' field for the current found set of records. This is limited to returning only the 10 closest matches whose similarity score is greater than 25%.
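For readers who like to see the moving parts, the Python sketch below mirrors what those three scripts do conceptually: store an embedding alongside each biography, embed the search text, then keep only the closest matches above the threshold. The field and function names are illustrative, the provider endpoint is just one example, and FileMaker's native script steps handle all of this for you.

import requests
from math import sqrt

API_KEY = "sk-..."  # the equivalent of 'Configure AI Account' - a placeholder key

def embed(text: str) -> list[float]:
    # Same style of provider call as the earlier sketch; swap in your configured account/model.
    r = requests.post(
        "https://api.openai.com/v1/embeddings",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": "text-embedding-3-small", "input": text},
        timeout=30,
    )
    r.raise_for_status()
    return r.json()["data"][0]["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# 'Insert Embedding' equivalent: store a vector alongside each Biography.
applicants = [
    {"name": "A. Smith", "biography": "Experienced B2B Marketeer in cloud services"},
    {"name": "B. Jones", "biography": "Consumer PR specialist"},
]
for applicant in applicants:
    applicant["biography_embedding"] = embed(applicant["biography"])

# 'Perform Semantic Find' equivalent: embed the query, score the found set,
# keep only matches above the 25% similarity threshold and return the top 10.
query_vec = embed("Experienced B2B Marketing Professional with background in IT")
scored = [(cosine(query_vec, a["biography_embedding"]), a["name"]) for a in applicants]
shortlist = sorted((s for s in scored if s[0] > 0.25), reverse=True)[:10]
print(shortlist)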
You now have a system that allows you to carry out semantic searches on the biography of applicants rather than just keyword searches!
Despite how magical semantic search can seem when it is correctly implemented, it does have some technical constraints which need to be considered in real-world applications:
If you don't know where to start with adding AI to your Claris FileMaker App, then why not contact our Team for a free initial consultation. We're here to help you achieve your business goals.