Command-Line Interface¶
FuzzyInfer provides a powerful CLI for working with fuzzy inference from the command line.
Installation¶
The CLI is included when you install fuzzy-infer:
Verify installation:
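Assuming the package is distributed on PyPI under the name fuzzy-infer (inferred from the project name, not confirmed here), installation and a version check look like:

```shell
# Install (PyPI package name assumed from the project name)
pip install fuzzy-infer

# Verify the CLI is on your PATH
fuzzy-infer --version
```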
Commands¶
Run Inference¶
Execute inference from JSONL input:
# Run from stdin
echo '{"type":"fact","pred":"is-bird","args":["robin"],"deg":0.9}' | fuzzy-infer run
# Run with rules file
fuzzy-infer run rules.jsonl < facts.jsonl
# Run with both facts and rules
fuzzy-infer run rules.jsonl facts.jsonl
Query Facts¶
Query specific predicates:
# Query all facts with predicate
fuzzy-infer query "is-mammal" < knowledge.jsonl
# Query with specific arguments
fuzzy-infer query "is-mammal" "dog" < knowledge.jsonl
# Filter by degree
fuzzy-infer query "is-bird" --min-degree 0.8 < facts.jsonl
Add Facts¶
Create facts from command line:
# Add single fact (degree defaults to 1.0)
fuzzy-infer fact "is-bird" "robin"
# Add with specific degree
fuzzy-infer fact "is-bird" "robin" 0.9
# Multiple arguments
fuzzy-infer fact "temperature" "room-1" "hot" 0.75
List Facts¶
Show all facts:
# List all facts
fuzzy-infer facts < knowledge.jsonl
# Format as JSON
fuzzy-infer facts --format json < knowledge.jsonl
# Filter by predicate
fuzzy-infer facts --predicate "is-mammal" < knowledge.jsonl
Validate¶
Check JSONL format:
# Validate facts and rules
fuzzy-infer validate knowledge.jsonl
# Validate from stdin
cat facts.jsonl | fuzzy-infer validate
JSONL Format¶
FuzzyInfer uses JSONL (JSON Lines) - one JSON object per line.
Fact Format¶
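Each fact is a single JSON object on its own line; the shape below matches the examples used throughout this page:

```json
{"type":"fact","pred":"is-bird","args":["robin"],"deg":0.9}
```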
Fields:
- type: Must be "fact"
- pred: Predicate name (string)
- args: Array of arguments
- deg: Degree (0.0 to 1.0, defaults to 1.0)
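These field constraints can be spot-checked without fuzzy-infer at all, using only Python's standard json module:

```shell
# Validate one fact line by hand: must be type "fact" with deg in [0, 1]
echo '{"type":"fact","pred":"is-bird","args":["robin"],"deg":0.9}' | \
  python3 -c 'import json,sys; f=json.load(sys.stdin); assert f["type"]=="fact" and 0.0<=f.get("deg",1.0)<=1.0; print("ok")'  # → ok
```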
Rule Format¶
{"type":"rule","name":"birds-fly","cond":[{"pred":"is-bird","args":["?x"]}],"actions":[{"action":"add","fact":{"pred":"can-fly","args":["?x"],"deg":0.9}}],"priority":50}
Fields:
- type: Must be "rule"
- name: Rule identifier (string, optional)
- cond: Array of condition objects
- actions: Array of action objects
- priority: Execution priority (integer, defaults to 50)
Examples¶
Basic Inference¶
Create a facts file:
# facts.jsonl
cat > facts.jsonl << 'EOF'
{"type":"fact","pred":"is-bird","args":["robin"],"deg":0.9}
{"type":"fact","pred":"is-bird","args":["eagle"],"deg":1.0}
{"type":"fact","pred":"has-wings","args":["robin"],"deg":1.0}
{"type":"fact","pred":"has-wings","args":["eagle"],"deg":1.0}
EOF
Create a rules file:
# rules.jsonl
cat > rules.jsonl << 'EOF'
{"type":"rule","name":"birds-fly","cond":[{"pred":"is-bird","args":["?x"],"deg":"?d","deg-pred":[">","?d",0.7]}],"actions":[{"action":"add","fact":{"pred":"can-fly","args":["?x"],"deg":["*","?d",0.9]}}]}
EOF
Run inference:
fuzzy-infer run rules.jsonl facts.jsonl
Output:
{"type":"fact","pred":"can-fly","args":["robin"],"deg":0.81}
{"type":"fact","pred":"can-fly","args":["eagle"],"deg":0.9}
Pipeline Processing¶
Chain multiple operations:
# Create facts, run inference, filter results
echo '{"type":"fact","pred":"is-mammal","args":["dog"],"deg":1.0}' | \
fuzzy-infer run mammal_rules.jsonl | \
fuzzy-infer query "warm-blooded" --min-degree 0.9 | \
jq -r '.args[0]'
Batch Processing¶
Process multiple files:
# Merge multiple knowledge bases
cat kb1.jsonl kb2.jsonl kb3.jsonl | \
fuzzy-infer run rules.jsonl | \
fuzzy-infer facts > merged_kb.jsonl
Filtering with jq¶
Combine with jq for powerful filtering:
# Get all high-confidence mammals
fuzzy-infer facts < knowledge.jsonl | \
jq 'select(.pred == "is-mammal" and .deg > 0.9)'
# Count facts by predicate
fuzzy-infer facts < knowledge.jsonl | \
jq -r '.pred' | sort | uniq -c
# Get unique predicates
fuzzy-infer facts < knowledge.jsonl | \
jq -r '.pred' | sort -u
Integration with Unix Tools¶
grep: Filter facts¶
# Find all bird-related facts
fuzzy-infer facts < knowledge.jsonl | grep "bird"
# Case-insensitive search
fuzzy-infer facts < knowledge.jsonl | grep -i "MAMMAL"
awk: Extract fields¶
# Extract the first argument from each fact (field index assumes the
# key order used in this page's examples; jq is more robust)
fuzzy-infer facts < knowledge.jsonl | \
awk -F'"' '{print $12}'
# Calculate average degree (assumes an explicit "deg" on every line)
fuzzy-infer facts < knowledge.jsonl | \
awk -F'deg":' '{sum+=$2; count++} END {print sum/count}'
sort: Order results¶
# Sort facts by predicate
fuzzy-infer facts < knowledge.jsonl | sort
# Sort by degree (requires jq)
fuzzy-infer facts < knowledge.jsonl | \
jq -s 'sort_by(.deg) | reverse | .[]'
wc: Count facts¶
# Count total facts
fuzzy-infer facts < knowledge.jsonl | wc -l
# Count high-confidence facts
fuzzy-infer facts < knowledge.jsonl | \
jq 'select(.deg > 0.9)' | wc -l
Real-World Use Cases¶
Sensor Data Processing¶
#!/bin/bash
# Process IoT sensor stream
# Read sensor data, apply rules, extract alerts
cat sensors.jsonl | \
fuzzy-infer run sensor_rules.jsonl | \
fuzzy-infer query "alert" --min-degree 0.8 | \
while read alert; do
echo "$alert" | jq -r '.args[0]' | \
xargs -I {} echo "ALERT: Sensor {} requires attention"
done
Log Analysis¶
#!/bin/bash
# Analyze application logs for anomalies
# Convert logs to facts, detect anomalies
tail -f app.log | \
./log_to_jsonl.py | \
fuzzy-infer run anomaly_rules.jsonl | \
fuzzy-infer query "anomaly" | \
jq -r '"Anomaly detected: \(.args[0]) at \(.deg * 100)% confidence"'
Classification Pipeline¶
#!/bin/bash
# Classify documents
for doc in documents/*.txt; do
./extract_features.py "$doc" | \
fuzzy-infer run classification_rules.jsonl | \
fuzzy-infer query "category" | \
jq --arg file "$doc" '{file: $file, category: .args[1], confidence: .deg}'
done | jq -s '.'
Options Reference¶
Global Options¶
--version # Show version
--help # Show help message
--verbose, -v # Verbose output
--quiet, -q # Suppress output
run Command¶
fuzzy-infer run [OPTIONS] [RULES_FILE] [FACTS_FILE]
Options:
--max-iterations N # Maximum inference iterations (default: 100)
--output, -o FILE # Write results to file
--format FORMAT # Output format: jsonl (default), json
query Command¶
fuzzy-infer query [OPTIONS] PREDICATE [ARGS...]
Options:
--min-degree N # Minimum degree threshold (0.0 to 1.0)
--max-degree N # Maximum degree threshold
--format FORMAT # Output format: jsonl (default), json, text
--limit N # Limit number of results
facts Command¶
fuzzy-infer facts [OPTIONS]
Options:
--predicate PRED # Filter by predicate
--format FORMAT # Output format: jsonl (default), json, text
--sort-by FIELD # Sort by: predicate, degree
validate Command¶
fuzzy-infer validate [OPTIONS] [FILE]
Options:
--strict # Enable strict validation
--show-errors # Show detailed errors
Configuration¶
Create a configuration file at ~/.fuzzy-infer.json:
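No schema for this file is documented here; a plausible sketch that mirrors the CLI flags might look like the following (treat every key name as an assumption, not a documented option):

```json
{
  "max_iterations": 100,
  "output_format": "jsonl",
  "log_level": "INFO"
}
```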
Or use environment variables:
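FUZZY_INFER_LOG_LEVEL is used in the debugging examples later on this page; any other variable name here is an assumption following the same prefix convention:

```shell
# FUZZY_INFER_LOG_LEVEL also appears in the Debugging section
export FUZZY_INFER_LOG_LEVEL=DEBUG
# Hypothetical: assumed to mirror the --max-iterations flag
export FUZZY_INFER_MAX_ITERATIONS=500
```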
Error Handling¶
Invalid JSONL¶
# Validate before processing
fuzzy-infer validate knowledge.jsonl || exit 1
fuzzy-infer run rules.jsonl knowledge.jsonl
Iteration Limits¶
# Increase iterations for complex rules
fuzzy-infer run --max-iterations 500 complex_rules.jsonl facts.jsonl
Malformed Rules¶
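A rule that parses as JSON can still be structurally invalid, for example when the actions array is missing. A sketch using the validate options documented above (the exact error output depends on the implementation):

```shell
# A rule with no "actions" array parses as JSON but is not a usable rule;
# --show-errors should point at the offending line
echo '{"type":"rule","name":"broken","cond":[{"pred":"is-bird","args":["?x"]}]}' | \
  fuzzy-infer validate --show-errors
```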
Performance Tips¶
1. Use Streaming¶
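Because both input and output are JSONL, stages can process line by line; piping directly avoids writing intermediate files (the producer script here is illustrative):

```shell
# Stream facts from any producer straight into inference; no temp files
./produce_facts.sh | \
  fuzzy-infer run rules.jsonl | \
  fuzzy-infer query "result"
```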
2. Filter Early¶
# Filter before inference when possible
cat facts.jsonl | grep "high-priority" | fuzzy-infer run rules.jsonl
3. Parallel Processing¶
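When fact files are independent of each other, each can be run through the same rules concurrently; a sketch with xargs -P (the shard layout is illustrative):

```shell
# Run inference over each shard with up to 4 concurrent jobs
ls shards/*.jsonl | \
  xargs -P 4 -I {} sh -c 'fuzzy-infer run rules.jsonl "{}" > "{}.out"'
```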
4. Cache Results¶
# Cache inference results
fuzzy-infer run rules.jsonl facts.jsonl | tee results.jsonl | fuzzy-infer query "target"
Debugging¶
Enable Debug Logging¶
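Set FUZZY_INFER_LOG_LEVEL (the same variable used in the trace example below) for the whole session:

```shell
# DEBUG logging goes to stderr, so results on stdout stay pipeable
export FUZZY_INFER_LOG_LEVEL=DEBUG
fuzzy-infer run rules.jsonl facts.jsonl
```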
Trace Inference¶
# Show each inference step
FUZZY_INFER_LOG_LEVEL=DEBUG fuzzy-infer run rules.jsonl facts.jsonl 2>&1 | \
grep "Fired rule"
Inspect Intermediate Results¶
# Save intermediate results
fuzzy-infer run rules.jsonl facts.jsonl | tee intermediate.jsonl | \
fuzzy-infer query "result"
Best Practices¶
1. Use Consistent JSONL¶
# Validate all inputs
for file in *.jsonl; do
fuzzy-infer validate "$file" || echo "Invalid: $file"
done
2. Version Control Rules¶
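Rules files are plain text, so they diff and review like any other source; pinning a rules file revision makes inference results reproducible:

```shell
# Track rule changes alongside the code that uses them
git add rules.jsonl
git commit -m "Lower birds-fly degree threshold"

# Review how a rule evolved
git log -p -- rules.jsonl
```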
3. Document Rules¶
# Add comments in separate documentation
# rules_documentation.md
cat > rules_documentation.md << 'EOF'
## birds-fly
Infers that birds can fly with 90% confidence.
Exceptions handled by higher-priority rules.
EOF
4. Test Rules¶
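A simple golden-file test: run a fixed set of facts through the rules and diff against a known-good snapshot (filenames are illustrative):

```shell
# Regenerate output and compare with the expected snapshot
fuzzy-infer run rules.jsonl test_facts.jsonl > actual.jsonl
diff expected.jsonl actual.jsonl && echo "rules OK"
```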
5. Monitor Performance¶
# Time inference
time fuzzy-infer run rules.jsonl facts.jsonl
# Profile with large datasets
/usr/bin/time -v fuzzy-infer run rules.jsonl large_facts.jsonl
Summary¶
- The CLI provides a Unix-style interface to FuzzyInfer
- JSONL format enables streaming and composability
- Integrates seamlessly with jq, grep, awk, and other Unix tools
- Commands: run, query, facts, validate
- Pipeline-friendly for complex workflows
- Configuration via files or environment variables
Next: Unix Pipes - Building inference pipelines