Rekko’s analysis pipeline runs deep, multi-stage research on a market. A run takes 30-90 seconds, so the API uses an async pattern: trigger, poll, retrieve.

The three-step pattern

1. Trigger

POST /v1/markets/{platform}/{market_id}/analyze

Returns 202 Accepted immediately with an analysis_id and poll_url:

{
  "analysis_id": "rk-abc123",
  "status": "running",
  "poll_url": "/v1/markets/kalshi/KXFED-26MAR19/analyze/rk-abc123/status",
  "estimated_seconds": 60
}
2. Poll

GET /v1/markets/{platform}/{market_id}/analyze/{analysis_id}/status

Poll every 5 seconds until status changes to "complete" or "error":

{
  "analysis_id": "rk-abc123",
  "status": "complete"
}
3. Retrieve

GET /v1/markets/{platform}/{market_id}/analysis

Returns the full Analysis object with probability, confidence, edge, recommendation, and optional expansions.

Complete example

import time
import httpx

client = httpx.Client(
    base_url="https://api.rekko.ai/v1",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
)

# 1. Trigger
resp = client.post("/markets/kalshi/KXFED-26MAR19/analyze")
analysis_id = resp.json()["analysis_id"]

# 2. Poll
while True:
    status = client.get(
        f"/markets/kalshi/KXFED-26MAR19/analyze/{analysis_id}/status"
    ).json()
    if status["status"] == "complete":
        break
    if status["status"] == "error":
        raise RuntimeError(status.get("error_message", "Analysis failed"))
    time.sleep(5)

# 3. Retrieve (with causal decomposition)
analysis = client.get(
    "/markets/kalshi/KXFED-26MAR19/analysis",
    params={"expand": "causal,scenarios"},
).json()

print(f"Probability: {analysis['probability']}")
print(f"Edge: {analysis['edge']}")
print(f"Recommendation: {analysis['recommendation']}")
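The `while True` loop above polls forever if the backend never resolves; in practice you may want a timeout. A minimal sketch of a reusable poll helper, under the assumption that giving up after a few minutes is acceptable (the `poll_until_done` name and timeout behavior are ours, not part of any Rekko SDK):

```python
import time


def poll_until_done(fetch_status, interval=5.0, timeout=300.0, sleep=time.sleep):
    """Call fetch_status() every `interval` seconds until it reports
    "complete" or "error", giving up after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status()
        if status["status"] == "complete":
            return status
        if status["status"] == "error":
            raise RuntimeError(status.get("error_message", "Analysis failed"))
        sleep(interval)
    raise TimeoutError(f"Analysis did not finish within {timeout:.0f} seconds")
```

You would pass it a closure over the status endpoint, e.g. `poll_until_done(lambda: client.get(f"/markets/kalshi/KXFED-26MAR19/analyze/{analysis_id}/status").json())`. Injecting `sleep` also makes the helper easy to test without real waits.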

The Analysis object

| Field | Type | Description |
| --- | --- | --- |
| probability | float (0-1) | Rekko’s estimated true probability |
| confidence | float (0-1) | Confidence in the estimate |
| edge | float | probability - market_price (positive = underpriced YES) |
| recommendation | string | "BUY_YES", "BUY_NO", or "NO_TRADE" |
| risk_rating | string | "low", "medium", or "high" |
| summary | string | Executive summary |
| key_factors | string[] | Top probability drivers |
| risks | string[] | Ways the analysis could be wrong |
| source_count | integer | Number of sources analyzed |
| freshness | string | "fresh" (<24h), "stale" (24-72h), "expired" (>72h) |
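Per the table, edge is simply probability minus the market price, so its sign tells you which side the model considers mispriced. A small illustration of that arithmetic (the helper names are ours, and the API already returns edge precomputed; this is not Rekko's recommendation logic):

```python
def compute_edge(probability: float, market_price: float) -> float:
    """edge = probability - market_price; positive means YES looks underpriced."""
    return probability - market_price


def mispriced_side(edge: float) -> str:
    """Read the sign of the edge (illustrative only)."""
    if edge > 0:
        return "YES looks underpriced"
    if edge < 0:
        return "NO looks underpriced"
    return "fairly priced"
```

For example, an estimated probability of 0.62 against a market price of 0.55 gives an edge of +0.07 in favor of YES.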

Caching and freshness

Analyses are cached. The freshness field tells you how stale the result is:
| Freshness | Age | Meaning |
| --- | --- | --- |
| fresh | < 24 hours | Recent, likely still accurate |
| stale | 24-72 hours | May have drifted, consider re-triggering |
| expired | > 72 hours | Likely outdated, re-trigger recommended |
To force a fresh analysis even when a cached one exists, use ?force=true on the trigger endpoint.
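The freshness buckets above map directly onto the result's age, so a client can decide locally whether to re-trigger. A sketch of that decision, assuming the buckets in the table (the function names and the "only re-trigger when expired" policy are ours):

```python
def classify_freshness(age_hours: float) -> str:
    """Bucket an analysis age per the docs: fresh (<24h), stale (24-72h), expired (>72h)."""
    if age_hours < 24:
        return "fresh"
    if age_hours <= 72:
        return "stale"
    return "expired"


def should_force_refresh(freshness: str) -> bool:
    """Re-trigger only expired analyses; reuse fresh or stale ones."""
    return freshness == "expired"
```

When `should_force_refresh` returns True, you would re-trigger with the force flag, e.g. `client.post(url, params={"force": "true"})`.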

Expandable fields

Add ?expand=scenarios,causal,meta to the retrieve step for deeper data:
| Expansion | What it adds |
| --- | --- |
| scenarios | Bull/base/bear scenario assessments with probabilities and triggers |
| causal | Causal factor decomposition, Rekko’s key differentiator |
| meta | Pipeline metadata: what would change our mind, edge assessment, duration |
See Expand Parameter for the full reference.
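The expand values combine into one comma-separated query parameter, as in the retrieve step earlier. A small helper that builds the parameter and catches typos against the three documented expansions (the helper and its validation are illustrative, not part of the API client):

```python
VALID_EXPANSIONS = {"scenarios", "causal", "meta"}


def expand_params(*expansions: str) -> dict:
    """Build the ?expand= query parameter, rejecting unknown expansion names."""
    unknown = set(expansions) - VALID_EXPANSIONS
    if unknown:
        raise ValueError(f"Unknown expansions: {sorted(unknown)}")
    return {"expand": ",".join(expansions)}
```

For example, `client.get(url, params=expand_params("causal", "scenarios"))` reproduces the retrieve call from the complete example.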