tg-invoke-prompt
Invokes the LLM prompt service using predefined prompt templates with variable substitution.
Synopsis
tg-invoke-prompt [options] template-id [variable=value ...]
Description
The tg-invoke-prompt command invokes TrustGraph's LLM prompt service using predefined prompt templates. Templates contain placeholder variables in the format `{variable}` that are replaced with values provided on the command line.
This provides a structured way to interact with language models using consistent, reusable prompt templates for tasks such as question answering, text extraction, and analysis.
Options
Required Arguments
| Option | Description |
|---|---|
| template-id | Prompt template identifier (e.g., question, extract-definitions) |
Optional Arguments
| Option | Default | Description |
|---|---|---|
| -u, --url URL | $TRUSTGRAPH_URL or http://localhost:8088/ | TrustGraph API URL |
| -t, --token TOKEN | $TRUSTGRAPH_TOKEN | Authentication token |
| -f, --flow-id ID | default | Flow instance ID to use |
| variable=value | (none) | Template variable assignments (can be specified multiple times) |
Examples
Basic Question Answering
tg-invoke-prompt question \
text="What is artificial intelligence?" \
context="AI research field"
Extract Definitions
tg-invoke-prompt extract-definitions \
document="Machine learning is a subset of AI..." \
terms="machine learning,neural networks"
Text Summarization
tg-invoke-prompt summarize \
text="$(cat document.txt)" \
max_length="200"
With Custom Flow
tg-invoke-prompt analysis \
-f "research-flow" \
data="$(cat research-data.json)" \
focus="trends"
Variable Substitution
Templates use `{variable}` placeholders that are replaced with command-line values:
# Template: "Good {time}, {name}!"
tg-invoke-prompt greeting name="Alice" time="morning"
# Result: "Good morning, Alice!"
Variables can contain:
- Simple text values
- File contents via `$(cat file.txt)`
- Command output via `$(command)`
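For reference, the substitution step can be sketched in Python. This is only an illustration of the `{variable}` replacement semantics described above, not the prompt service's actual implementation:

```python
def render(template: str, **variables: str) -> str:
    """Replace each {name} placeholder with its supplied value."""
    for name, value in variables.items():
        template = template.replace("{" + name + "}", value)
    return template

# Mirrors: tg-invoke-prompt greeting name="Alice" time="morning"
print(render("Good {time}, {name}!", name="Alice", time="morning"))
# → Good morning, Alice!
```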
Environment Variables
- TRUSTGRAPH_URL: Default API URL
- TRUSTGRAPH_TOKEN: Default authentication token
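Setting these once per shell session avoids repeating `-u` and `-t` on every invocation. The URL and token values below are illustrative:

```shell
# Connection defaults read by the CLI when -u / -t are not given
# (values shown are examples, not real credentials)
export TRUSTGRAPH_URL="http://localhost:8088/"
export TRUSTGRAPH_TOKEN="example-token"

# -u and -t can now be omitted:
#   tg-invoke-prompt question text="What is artificial intelligence?"
```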
Related Commands
- tg-show-prompts - Display configured prompts
- tg-set-prompt - Configure prompt templates
- tg-invoke-llm - Invoke the LLM directly
API Integration
This command uses the Prompt API to execute prompt templates with variable substitution.