| Command | Purpose | Example |
|---|---|---|
| stats | Aggregate data | `stats count() by user` |
| chart | Visualize aggregations | `chart count() by severity` |
| timechart | Time-based aggregation | `timechart span=1h count()` |
| streamstats | Running statistics | `streamstats count() as num` |
| eventstats | Add stats to events | `eventstats avg(bytes) as avg` |
| bin | Bucket values | `bin span=5m` |
| Command | Purpose | Example |
|---|---|---|
| where | Filter results | `where count > 10` |
| head | First N results | `head 100` |
| tail | Last N results | `tail 50` |
| dedup | Remove duplicates | `dedup user` |
| top | Most common | `top src_ip` |
| rare | Least common | `rare dest_port` |
| sample | Random sample | `sample 1000` |
| Command | Purpose | Example |
|---|---|---|
| eval | Calculate fields | `eval total=bytes_in+bytes_out` |
| rename | Rename fields | `rename src_ip as source` |
| fields | Select fields | `fields user, action, timestamp` |
| table | Display fields | `table user, src_ip, action` |
| rex | Regex extract / sed replace | `rex "user=(?<username>\w+)"` |
| regex | Regex filter | `regex field="pattern"` |
| spath | Parse JSON/XML | `spath path=user.email` |
| fillnull | Replace nulls | `fillnull value="unknown"` |
| mvexpand | Expand arrays | `mvexpand tags` |
| Command | Purpose | Example |
|---|---|---|
| sort | Order results | `sort -count` |
| reverse | Reverse order | `reverse` |
| Command | Purpose | Example |
|---|---|---|
| ai | LLM classification | `ai prompt="Classify as NORMAL or MALICIOUS"` |
| lookup | Enrich with lookup tables | `lookup ip_reputation src_ip` |
| inputlookup | Fetch from external URLs | `inputlookup url="https://..." key=src_ip` |
| prevalence | Prevalence filter | `prevalence hash_prevalence < 5` |
| resolve_identity | IP/user/host identity | `resolve_identity field=src_ip` |
| risk | Assign risk score | `risk score=50 entity=user` |
| anomaly | Detect outliers | `anomaly field=bytes threshold=3` |
| asset | Asset investigation view | `src_host="ws-01" \| asset` |
| Command | Purpose | Example |
|---|---|---|
| transaction | Group events | `transaction session_id maxspan=30m` |
| sequence | Ordered patterns | `sequence by user [login] [access]` |
| funnel | Conversion analysis | `funnel by user window=1h step1=...` |
| tree | Hierarchical view | `tree process` or `tree web` |
| Command | Purpose | Example |
|---|---|---|
| append | Append subsearch results | `append [search status=500]` |
| join | Join with subsearch | `join user [search source_type="users"]` |
| format | Format subsearch as string | `[search action=bad \| fields src_ip \| format]` |
| return | Return values from subsearch | `[search threat="high" \| return file_hash]` |
# Brute force detection
action=login status=failure
| bin span=10m
| stats count() by time_bucket, src_ip
| where count > 5
# Data exfiltration
* | bin span=5m
| stats sum(bytes_out) as outbound by time_bucket, src_ip
| where outbound > 100000000
# Port scanning
* | bin span=1m
| stats dc(dest_port) as unique_ports by time_bucket, src_ip
| where unique_ports > 50
# Asset investigation
src_host="workstation-42" | asset
# User activity timeline
user="john.doe"
| timechart span=1h count() by action
# Rare file execution
file_hash=*
| prevalence hash_prevalence < 5
| table timestamp, file_hash, process_name, src_host
# Anomalous behavior
* | anomaly field=bytes_out by user threshold=3
| where is_anomaly=true
# Lateral movement
action=login
| transaction user maxspan=1h
| where eventcount > 5 AND dc(src_host) > 3
# Privilege escalation chain
* | sequence by user maxspan=30m
[privilege="user"]
[action="privilege_escalation"]
[privilege="admin"]
# Process tree investigation
source_type=sysmon action=process_create src_host="SUSPICIOUS-HOST"
| tree process
# PowerShell execution chain with prevalence
source_type=sysmon action=process_create
| tree process root="/powershell|pwsh/"
# Web session flow analysis
source_type=squid_proxy user="suspicious_user"
| tree web
# Enrich with external threat feed
* | inputlookup url="https://api.threatfeed.io/{src_ip}" key=src_ip
| where inputlookup_risk_score > 80
# Correlate with subsearch
* | join type=inner src_ip [search action=suspicious | table src_ip]
Counting: count(), dc(field)
Math: sum(field), avg(field), min(field), max(field)
Stats: median(field), percentile(field, N), perc95(field), stdev(field), var(field)
Lists: values(field), list(field), mode(field)
Time: earliest(field), latest(field), first(field), last(field)
Visualization: sparkline(field)
Other: range(field)
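Several of these aggregates can be combined in one stats call; a sketch, assuming multiple comma-separated aggregates are allowed as in most SPL-style languages (event and field names are illustrative):

```
action=login
| stats count() as logins, dc(src_ip) as sources, earliest(timestamp) as first_seen by user
| sort -logins
```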
String: lower(), upper(), substr(), replace(), trim(), len(), split(), concat()
Math: abs(), ceil(), floor(), round(), sqrt(), pow(), log(), exp()
Type: tonumber(), tostring(), tobool()
Conditional: if(), case(), coalesce(), nullif()
Time: now(), strftime(), strptime(), time()
Network: is_private_ip(), is_public_ip(), cidr_match()
Crypto: md5(), sha1(), sha256()
Encoding: base64_encode(), base64_decode(), hex_encode(), hex_decode(), url_encode(), url_decode()
Security: entropy(), defang(), refang()
Domain/URL: extract_domain(), extract_tld(), extract_path()
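A sketch chaining a few of these functions inside eval; the field names (`query`, `timestamp`) and the entropy threshold are illustrative:

```
source_type=dns
| eval domain=lower(query)
| eval name_entropy=entropy(domain)
| eval verdict=if(name_entropy > 4, "suspicious", "benign")
| table timestamp, domain, name_entropy, verdict
```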
Infix Operators: CONTAINS, LIKE (usable inside eval expressions, e.g. if(cmd CONTAINS "-enc", 1, 0))
isnull(field), isnotnull(field), like(field, pattern), match(field, regex), cidrmatch(cidr, field)
Comparison: =, !=, >, <, >=, <=
Pattern: LIKE, NOT LIKE, CONTAINS, STARTSWITH, ENDSWITH
Regex: =/pattern/, !=/pattern/
List: IN, NOT IN
Logical: AND, OR, NOT
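These operators combine in the search expression before the first pipe; a sketch with illustrative field names (exact LIKE/IN value syntax may vary):

```
status >= 500 AND uri CONTAINS "/api/" AND NOT src_ip =/10\..*/
| head 100
```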
Seconds: 30s
Minutes: 5m, 15m, 30m
Hours: 1h, 6h, 12h
Days: 1d, 7d, 30d
Weeks: 1w, 4w
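Any of these units can be supplied to a `span=` parameter, for example (field names illustrative):

```
action=login status=failure
| timechart span=30m count() by src_ip
```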
Filter early - Use search expressions before pipes
Limit results - Use head/tail when exploring
Aggregate - stats is faster than returning raw events
Index fields - Use indexed fields in search expressions
Avoid wildcards - Specific filters are faster
Bin before stats - For time-windowed detection
Dedup wisely - Can be memory-intensive
Sample for testing - Use sample when developing queries
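Putting several tips together: filter early on specific indexed fields, bin before stats, and aggregate rather than returning raw events. A sketch with illustrative field values:

```
source_type=firewall action=blocked dest_port=22
| bin span=15m
| stats count() by time_bucket, src_ip
| where count > 20
```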