Misc functions
ai
Commentary
added in 1.6.0
Generates free text by making calls to LLM providers. Ollama and AWS Bedrock are currently supported.
Each LLM call contains at least two things: a prompt of your choosing and a preprompt assembled by ShadowTraffic that contains all variables generated before this function call. This preprompt helps give the LLM more context for whatever you're generating.
You can pass arbitrary custom options to the underlying model, like setting the temperature.
Examples
Calling Ollama
Set service to ollama and provide the minimum required parameters. You can supply arbitrary options to the underlying model by specifying an optional options map under the ollama key. Consult the Ollama docs for the content of this map.
{
"_gen": "ai",
"service": "ollama",
"ollama": {
"url": "http://localhost:11434",
"model": "mistral:7b-instruct",
"prompt": "What should I name my next cat?"
}
}
This example might return strings like Whiskers, Mittens, and Fluffy.
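As a sketch of the optional options map described above, the same call might look like the following, assuming standard Ollama model options such as temperature and num_predict (check the Ollama docs for the options your model accepts):
{
  "_gen": "ai",
  "service": "ollama",
  "ollama": {
    "url": "http://localhost:11434",
    "model": "mistral:7b-instruct",
    "prompt": "What should I name my next cat?",
    "options": {
      "temperature": 0.8,
      "num_predict": 64
    }
  }
}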
Calling Bedrock
Set service to bedrock and provide the minimum required parameters.
{
"_gen": "ai",
"service": "bedrock",
"bedrock": {
"modelId": "xxx",
"prompt": "Write a two sentence incident report for a software bug."
}
}
This example returns strings similar to the prior example.
Integrating variables
Any variables that have been evaluated prior to this function call will automatically be passed as a preprompt. In this case, a map of name and age with their generated values will get sent to the LLM. The idea is that this gives the LLM more context to work with. In this example, it can generate more specific health reports.
{
"topic": "sandbox",
"vars": {
"name": {
"_gen": "string",
"expr": "#{Name.fullName}"
},
"age": {
"_gen": "normalDistribution",
"mean": 40,
"sd": 10,
"decimals": 0
}
},
"value": {
"_gen": "ai",
"service": "ollama",
"ollama": {
"url": "http://localhost:11434",
"model": "mistral:7b-instruct",
"prompt": "Write a health report appropriate for the age of the patient."
}
}
}
As an example output, this prompt generated:
Name: Mr. Whitley Greenfelder
Age: 40
Health Report:
Mr. Greenfelder, at the age of 40, is entering a crucial period in his life regarding maintaining optimal health and preventing potential health issues. Forty is considered middle age, and it's important to focus on overall wellness.
Here are some recommendations for Mr. Greenfelder based on his current age:
1. Regular Exercise: Aim for at least 150 minutes of moderate-intensity aerobic activity or 75 minutes of vigorous-intensity activity per week. Additionally, include muscle-strengthening activities on two or more days a week to maintain strong bones and muscles.
2. Balanced Diet: Consume a diet rich in fruits, vegetables, whole grains, lean proteins, and healthy fats. Limit processed foods, sugars, sodium, saturated and trans fats.
...
Modifying Bedrock calls
You can supply arbitrary options to the underlying Bedrock model by specifying an optional inferenceConfig map under the bedrock key. Consult the AWS docs for the content of this map.
{
"_gen": "ai",
"service": "bedrock",
"bedrock": {
"modelId": "xxx",
"prompt": "Write a hotel review.",
"inferenceConfig": {
"temperature": 0.3
}
}
}
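The same map can carry other inference settings. For instance, a sketch assuming Bedrock's maxTokens and topP parameters are passed through unchanged:
{
  "_gen": "ai",
  "service": "bedrock",
  "bedrock": {
    "modelId": "xxx",
    "prompt": "Write a hotel review.",
    "inferenceConfig": {
      "temperature": 0.3,
      "maxTokens": 256,
      "topP": 0.9
    }
  }
}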
Specification
JSON schema
{
"type": "object",
"properties": {
"service": {
"type": "string",
"enum": [
"ollama",
"bedrock"
]
},
"ollama": {
"type": "object",
"properties": {
"url": {
"type": "string"
},
"model": {
"type": "string"
},
"prompt": {
"type": "string"
},
"options": {
"type": "object"
}
},
"required": [
"url",
"model",
"prompt"
]
},
"bedrock": {
"type": "object",
"properties": {
"modelId": {
"type": "string"
},
"prompt": {
"type": "string"
},
"inferenceConfig": {
"type": "object"
}
}
}
},
"required": [
"service"
]
}