Introduction
This is a follow-up to the previous PAF post. It's a slight variation on my ubiquitous Order Processing Agent & Tools flow.
Here the orders are in a DB and a user can browse the DB table and decide which orders to process, e.g. I see shedloads of orders, but decide, for some reason known only to myself, to process just the order with the number 678.
Chat input - the user input, such as "process order 678".
Agent - this agent takes the input, extracts the orderNr, and surfaces it in an SQL statement, e.g. SELECT * from orders WHERE orderNr = '678'.
I join this to the SQL Query action.
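In essence, the agent's job here boils down to pulling a number out of free text and wrapping it in a SELECT. A minimal Python sketch of that step, using a regex as a stand-in for the LLM extraction (the function name is my own):

```python
import re

def order_sql(user_input: str) -> str:
    """Extract an order number from free text and wrap it in a SELECT.

    A regex stand-in for what the Langflow agent asks the LLM to do.
    """
    match = re.search(r"\b(\d+)\b", user_input)
    if not match:
        raise ValueError("no order number found in input")
    return f"SELECT * from orders WHERE orderNr = '{match.group(1)}'"

print(order_sql("process order 678"))
# SELECT * from orders WHERE orderNr = '678'
```

In production you would bind the value as a query parameter rather than interpolate it into the string, but the agent here emits a literal SQL statement for the SQL Query action to run.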
Now to the order processing agent -
Here I just push the result of the SQL execution to the Order Processing Agent. Click on Playground to test the flow -
A European LLM, excellent!
I'm starting with the Experiment for free option -
I get my API key - and off I go. Now to OIC: here I create a new project and add a REST (invoke) connection -
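Before wiring it into OIC, it helps to see the bare REST call the invoke connection will make. A sketch against Mistral's chat-completions endpoint (the model name and prompt are placeholders of mine; the URL, header, and payload shapes follow Mistral's public API):

```python
import json
import urllib.request

MISTRAL_URL = "https://api.mistral.ai/v1/chat/completions"

def build_request(api_key: str, prompt: str,
                  model: str = "mistral-small-latest") -> urllib.request.Request:
    """Build the HTTP request the OIC invoke connection effectively sends."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        MISTRAL_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# Actually sending it requires a real key:
# with urllib.request.urlopen(build_request(my_key, "Hello")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```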
I also add a REST (trigger) connection - then I create a simple sync integration. Input and output are simple strings. I add the Mistral invoke -
Here is the full Response structure - I complete the Map actions - the second Map -
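That second Map essentially picks the assistant's text out of the Response structure. Illustrated in Python against a trimmed-down response (the field names follow Mistral's chat-completions schema; the content string is made up):

```python
# Trimmed-down Mistral chat-completions response (illustrative values only).
response = {
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Hello from Mistral"},
            "finish_reason": "stop",
        }
    ]
}

# What the second Map does: choices[0].message.content -> the output string.
reply = response["choices"][0]["message"]["content"]
print(reply)  # Hello from Mistral
```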
Note how I use the Mistral AI component, to allow the Agent to use this LLM.
I just need to specify the Model name and Mistral API Key.
Summa Summarum
Mistral AI is simple to use, be it from Postman or as the LLM for your agent in Langflow. What's in a name? Mistral - a strong, cold, dry, northwesterly wind that blows from southern France into the Mediterranean Sea, particularly in winter and spring. Originating from the Occitan word for "master," this wind is known for its force, often reaching speeds of over 100 km/h.
Courtesy of Wikipedia.