first commit

Dennis Thiessen
2026-02-20 17:31:01 +01:00
commit 61ab24490d
160 changed files with 17034 additions and 0 deletions


@@ -0,0 +1 @@
{"specId": "fa730cf4-a14d-4f62-8993-fd7db6fe25cc", "workflowType": "requirements-first", "specType": "feature"}

File diff suppressed because it is too large.


@@ -0,0 +1,221 @@
# Requirements Document
## Introduction
This document defines the requirements for the Stock Data Backend — an opinionated investing-signal platform built with Python/FastAPI and PostgreSQL, focused on NASDAQ stocks. The platform's philosophy: find the path of least resistance (trend direction), identify key support/resistance zones, detect asymmetric risk-reward setups, and surface the best opportunities through a unified scoring pipeline. It does not attempt to predict price — it identifies where conditions are most favorable.
Every data source (OHLCV, technical indicators, sentiment, fundamentals) feeds into a single composite scoring and ranking system that auto-populates a watchlist and flags trade setups. Data ingestion is exclusively via the configured market data provider — users do not upload data directly.
This is an MVP focused on delivering actionable signals. Engineering concerns (API format, database indexing, logging, connection pooling, graceful shutdown) are design constraints, not requirements.
## Glossary
- **Backend_Service**: The FastAPI-based Python web application that exposes REST API endpoints.
- **Ticker**: A unique NASDAQ stock symbol (e.g., AAPL, MSFT) being tracked by the system.
- **OHLCV_Record**: A single price data point containing Open, High, Low, Close, and Volume values for a specific Ticker on a specific date.
- **Ticker_Registry**: The subsystem responsible for adding, removing, listing, and looking up tracked NASDAQ tickers.
- **Price_Store**: The subsystem responsible for persisting and retrieving OHLCV price data in PostgreSQL.
- **Ingestion_Pipeline**: The subsystem responsible for importing stock data into the Price_Store via the configured market data provider.
- **Data_Collector**: A scheduled job that periodically fetches the latest price data for all tracked tickers and upserts it into the Price_Store.
- **Auth_Service**: The subsystem responsible for user registration, login, JWT token management, and role-based access control.
- **User**: A registered account with a username, hashed password, and assigned role (user or admin).
- **Admin**: A User with the admin role who can manage other users and configure system settings.
- **Access_Token**: A JWT token issued upon login, expires after 60 minutes.
- **ADX**: Average Directional Index — measures trend strength (0-100). Values above 25 indicate a strong trend.
- **EMA**: Exponential Moving Average — configurable period. EMA Cross (e.g., 20/50) determines directional bias.
- **RSI**: Relative Strength Index — momentum oscillator (0-100). Overbought >70, oversold <30.
- **ATR**: Average True Range — measures price volatility. Used for stop-loss and target placement.
- **Volume_Profile**: Distribution of traded volume across price levels, producing POC, Value Area, HVN, and LVN.
- **POC**: Point of Control — price level with highest traded volume.
- **HVN**: High Volume Node — above-average volume level, acts as support/resistance magnet.
- **LVN**: Low Volume Node — below-average volume level, acts as breakout zone.
- **Pivot_Point**: A support or resistance level from swing highs and swing lows.
- **SR_Level**: A support or resistance level tagged with type, strength score, and detection method.
- **SR_Detector**: The subsystem that auto-calculates support and resistance levels.
- **Sentiment_Score**: A record containing bullish/bearish/neutral classification, confidence (0-100), source, and timestamp for a Ticker.
- **Fundamental_Data**: Key financial metrics: P/E ratio, revenue growth rate, earnings surprise %, and market cap.
- **Composite_Score**: A weighted aggregate score (0-100) from all dimension scores for a Ticker.
- **Dimension_Score**: A normalized score (0-100) for a single analysis dimension (technical, S/R quality, sentiment, fundamental, momentum).
- **Scoring_Engine**: The subsystem that computes dimension scores, applies weights, and produces Composite_Scores.
- **RR_Scanner**: The subsystem that scans for asymmetric risk-reward trade setups.
- **Trade_Setup**: A detected trade opportunity with entry, stop-loss, target, R:R ratio, direction (long/short), and Composite_Score.
- **Watchlist**: A curated list of top-ranked tickers from the Scoring_Engine, with manual add/remove support.
- **System_Settings**: Persisted configuration values managed by admins.
## Requirements
### Requirement 1: Ticker Management
**User Story:** As a user, I want to manage the NASDAQ tickers I am tracking, so that I can control which stocks the system analyzes.
#### Acceptance Criteria
- 1.1 WHEN a user submits a valid NASDAQ ticker symbol, THE Ticker_Registry SHALL create a new ticker entry and return the created ticker with its metadata.
- 1.2 WHEN a user submits a ticker symbol that already exists, THE Backend_Service SHALL return a duplicate error.
- 1.3 WHEN a user submits an empty or whitespace-only ticker symbol, THE Backend_Service SHALL reject the request with a validation error.
- 1.4 WHEN a user requests the list of tracked tickers, THE Ticker_Registry SHALL return all tickers sorted alphabetically by symbol.
- 1.5 WHEN a user requests deletion of a tracked ticker, THE Ticker_Registry SHALL remove the ticker and all associated data (OHLCV, scores, setups).
- 1.6 WHEN a user requests deletion of a ticker that does not exist, THE Backend_Service SHALL return a not-found error.
### Requirement 2: OHLCV Price Data Storage
**User Story:** As a user, I want the system to store historical OHLCV price data, so that technical analysis and signal detection have a data foundation.
#### Acceptance Criteria
- 2.1 THE Price_Store SHALL persist each OHLCV_Record with: ticker symbol, date, open, high, low, close, and volume.
- 2.2 THE Price_Store SHALL enforce uniqueness on (ticker symbol, date).
- 2.3 THE Backend_Service SHALL reject OHLCV_Records where high < low, any price is negative, volume is negative, or date is in the future.
- 2.4 THE Backend_Service SHALL reject OHLCV_Records for tickers not in the Ticker_Registry.
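Criterion 2.3 bundles several independent checks; as a compact sketch (the function name and error strings are illustrative, not part of the API):

```python
from datetime import date

def validate_ohlcv(open_, high, low, close, volume, day, today):
    """Collect every rule violation from Req 2.3 for one OHLCV_Record."""
    errors = []
    if high < low:
        errors.append("high < low")
    if any(p < 0 for p in (open_, high, low, close)):
        errors.append("negative price")
    if volume < 0:
        errors.append("negative volume")
    if day > today:
        errors.append("date in the future")
    return errors
```

Returning all violations at once (rather than failing on the first) makes the validation error in 2.3 more useful to API clients.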
### Requirement 3: Data Ingestion
**User Story:** As a user, I want the system to fetch stock data from the market data provider, so that my price history stays current.
#### Acceptance Criteria
- 3.1 WHEN a user requests a data fetch for a ticker and date range, THE Ingestion_Pipeline SHALL fetch from the configured provider and upsert into the Price_Store.
- 3.2 IF the provider is unreachable or errors, THE Ingestion_Pipeline SHALL return a descriptive error without modifying existing data.
- 3.3 IF the provider returns a rate-limit error, THE Ingestion_Pipeline SHALL record progress and return a response indicating how many records were ingested, so the fetch can be resumed without gaps.
- 3.4 WHEN a rate-limited fetch is resumed for the same ticker and date range, THE Ingestion_Pipeline SHALL continue from the last successfully ingested date.
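The resume behavior in 3.3/3.4 amounts to tracking the last successfully ingested date and restarting just after it. A minimal sketch, where `fetch_day` and `RateLimitError` are hypothetical stand-ins for the provider call and its rate-limit failure:

```python
class RateLimitError(Exception):
    """Raised by the provider when the request quota is exhausted."""

def ingest_range(fetch_day, days, last_ingested=None):
    """Ingest a list of trading days, recording progress so that a
    rate-limited run can resume from the day after the last success
    (Req 3.3/3.4). Stops cleanly on RateLimitError, keeping what landed."""
    start = 0 if last_ingested is None else days.index(last_ingested) + 1
    ingested = []
    for day in days[start:]:
        try:
            ingested.append(fetch_day(day))
        except RateLimitError:
            break  # report partial progress; caller persists last_ingested
        last_ingested = day
    return ingested, last_ingested
```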
### Requirement 4: Scheduled Data Collection
**User Story:** As a user, I want the system to automatically fetch the latest price data on a schedule, so that my data stays current without manual intervention.
#### Acceptance Criteria
- 4.1 THE Data_Collector SHALL periodically fetch the latest daily OHLCV data for all tracked tickers.
- 4.2 THE Data_Collector SHALL upsert records, updating existing ones if they already exist.
- 4.3 WHEN the Data_Collector encounters an error for a specific ticker, it SHALL log the error and continue with remaining tickers.
- 4.4 THE Data_Collector SHALL be configurable for frequency (daily, hourly) via configuration.
- 4.5 IF a rate limit is hit during collection, THE Data_Collector SHALL record the last successful ticker and resume from there on the next run.
### Requirement 5: Technical Analysis
**User Story:** As a user, I want the system to compute key technical indicators, so that trend strength, momentum, and volatility feed into the scoring pipeline.
#### Acceptance Criteria
- 5.1 THE Backend_Service SHALL compute the following from OHLCV data: ADX, EMA (default periods 20 and 50), RSI (default 14-period), ATR (default 14-period), Volume_Profile (POC, Value Area, HVN, LVN), and Pivot_Points (swing highs/lows).
- 5.2 WHEN an indicator is requested for a Ticker and date range, THE Backend_Service SHALL return both raw values and a normalized score (0-100).
- 5.3 WHEN an EMA Cross signal is requested, THE Backend_Service SHALL compare short vs long EMA and return directional bias (bullish, bearish, neutral).
- 5.4 IF insufficient data exists to compute an indicator, THE Backend_Service SHALL return an error indicating the minimum data requirement.
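The EMA Cross signal in 5.3 reduces to comparing two smoothed series. A sketch using the glossary's default 20/50 periods and the standard `2/(period+1)` smoothing (the exact warm-up handling in the codebase may differ):

```python
def ema(values, period):
    """Exponential moving average, seeded on the first value."""
    k = 2 / (period + 1)
    out = values[0]
    for v in values[1:]:
        out = v * k + out * (1 - k)
    return out

def ema_cross_bias(closes, short=20, long=50):
    """Directional bias from the short-vs-long EMA comparison (Req 5.3)."""
    s, l = ema(closes, short), ema(closes, long)
    if s > l:
        return "bullish"
    if s < l:
        return "bearish"
    return "neutral"
```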
### Requirement 6: Support/Resistance Detection
**User Story:** As a user, I want the system to auto-calculate support and resistance levels, so that I can see key price zones where buying or selling pressure concentrates.
#### Acceptance Criteria
- 6.1 THE SR_Detector SHALL identify SR_Levels from Volume_Profile (HVN/LVN zones) and from Pivot_Points (swing highs/lows).
- 6.2 THE SR_Detector SHALL assign each level a strength score (0-100) based on how many times price has respected that level.
- 6.3 THE SR_Detector SHALL tag each level as "support" or "resistance" relative to current price.
- 6.4 WHEN new OHLCV data arrives for a Ticker, THE SR_Detector SHALL recalculate its SR_Levels.
- 6.5 THE SR_Detector SHALL merge levels from different methods within a configurable price tolerance (default 0.5%) into a single consolidated level.
- 6.6 WHEN a user requests SR_Levels for a Ticker, THE SR_Detector SHALL return them sorted by strength descending, with the detection method indicated.
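The merging rule in 6.5 can be sketched as a single pass over price-sorted levels (the `(price, strength)` tuples and the keep-max-strength choice are simplifying assumptions; the full SR_Level record carries more fields):

```python
def merge_levels(levels, tolerance=0.005):
    """Consolidate S/R levels whose prices lie within `tolerance`
    (default 0.5%, Req 6.5) into one level: averaged price, max strength."""
    merged = []
    for price, strength in sorted(levels):
        if merged and abs(price - merged[-1][0]) / merged[-1][0] <= tolerance:
            p, s = merged[-1]
            merged[-1] = ((p + price) / 2, max(s, strength))
        else:
            merged.append((price, strength))
    return merged
```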
### Requirement 7: Sentiment Data
**User Story:** As a user, I want sentiment data to feed into the scoring pipeline, so that social mood is factored into signal detection.
#### Acceptance Criteria
- 7.1 THE Backend_Service SHALL periodically collect sentiment data for all tracked tickers from a configured source at a configurable interval (default 30 minutes).
- 7.2 EACH Sentiment_Score SHALL contain: classification (bullish/bearish/neutral), confidence (0-100), source identifier, and timestamp.
- 7.3 IF the sentiment source is unreachable, THE Backend_Service SHALL log the error and retain existing data.
- 7.4 WHEN computing the sentiment Dimension_Score, THE Scoring_Engine SHALL aggregate recent scores within a configurable lookback window (default 24h) using configurable source weights and time decay.
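The aggregation in 7.4 is a decay-weighted average over the lookback window. A sketch assuming exponential decay with a 6-hour half-life (the half-life, and encoding each score as a 0-100 value, are illustrative choices, not spec):

```python
def sentiment_dimension_score(scores, now_h, lookback_h=24.0, half_life_h=6.0):
    """Aggregate recent sentiment into one 0-100 dimension score (Req 7.4).
    `scores` is a list of (value_0_100, timestamp_h, source_weight)."""
    num = den = 0.0
    for value, t_h, weight in scores:
        age = now_h - t_h
        if age < 0 or age > lookback_h:
            continue  # outside the configurable lookback window
        decay = 0.5 ** (age / half_life_h)  # newer scores count more
        num += value * weight * decay
        den += weight * decay
    return num / den if den else None  # None: no usable data in window
```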
### Requirement 8: Fundamental Data
**User Story:** As a user, I want key fundamental metrics to feed into the scoring pipeline, so that financial quality is factored into signal detection.
#### Acceptance Criteria
- 8.1 THE Backend_Service SHALL fetch and store Fundamental_Data for each tracked Ticker: P/E ratio, revenue growth rate, earnings surprise %, and market cap.
- 8.2 THE Data_Collector SHALL periodically fetch updated Fundamental_Data (default daily).
- 8.3 IF the data source is unreachable, THE Backend_Service SHALL log the error and retain the most recent data.
- 8.4 WHEN new Fundamental_Data arrives, THE Scoring_Engine SHALL mark the fundamental Dimension_Score as stale.
### Requirement 9: Composite Scoring and Ranking
**User Story:** As a user, I want each stock scored across all dimensions with configurable weights, so that I can rank stocks by a single unified metric tuned to my preferences.
#### Acceptance Criteria
- 9.1 THE Scoring_Engine SHALL compute a Dimension_Score (0-100) per Ticker for: technical, S/R quality, sentiment, fundamental, and momentum.
- 9.2 THE Scoring_Engine SHALL compute a Composite_Score as the weighted average of available Dimension_Scores using user-configurable weights.
- 9.3 WHEN a Ticker is missing data for one or more dimensions, THE Scoring_Engine SHALL use only available dimensions (re-normalizing weights) and indicate which are missing.
- 9.4 WHEN underlying data changes, THE Scoring_Engine SHALL mark the affected Composite_Score as stale.
- 9.5 WHEN a stale score is requested, THE Scoring_Engine SHALL recompute on-demand. No background recomputation.
- 9.6 WHEN a user requests rankings, THE Scoring_Engine SHALL return tickers sorted by Composite_Score descending with all Dimension_Scores included.
- 9.7 WHEN a user updates dimension weights, THE Scoring_Engine SHALL recompute all Composite_Scores.
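Criteria 9.2/9.3 together define a weighted average that re-normalizes over whichever dimensions are present. A sketch (the dict shapes are illustrative):

```python
def composite_score(dimension_scores, weights):
    """Weighted average over available dimensions only, re-normalizing
    around the missing ones (Req 9.2/9.3). Returns (score, missing)."""
    available = {d: s for d, s in dimension_scores.items() if s is not None}
    missing = sorted(set(weights) - set(available))
    total_w = sum(weights[d] for d in available)
    if total_w == 0:
        return None, missing  # nothing to score
    score = sum(available[d] * weights[d] for d in available) / total_w
    return score, missing
```

Dividing by the sum of the *available* weights is what keeps the composite on the same 0-100 scale when a dimension drops out.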
### Requirement 10: Asymmetric R:R Trade Detection
**User Story:** As a user, I want the system to scan for trade setups with favorable risk-reward ratios, so that I see highly asymmetric opportunities without manual chart analysis.
#### Acceptance Criteria
- 10.1 THE RR_Scanner SHALL periodically scan all tracked tickers for Trade_Setups meeting a configurable R:R threshold (default 3:1).
- 10.2 FOR long setups: target = nearest SR_Level above price, stop = ATR-based distance below price.
- 10.3 FOR short setups: target = nearest SR_Level below price, stop = ATR-based distance above price.
- 10.4 EACH Trade_Setup SHALL include: entry price, stop-loss, target, R:R ratio, direction (long/short), and Composite_Score.
- 10.5 WHEN underlying SR_Levels or price data changes, THE RR_Scanner SHALL recalculate and remove setups that no longer meet the threshold.
- 10.6 THE RR_Scanner SHALL be configurable for scan frequency via configuration.
- 10.7 IF a Ticker lacks sufficient SR_Levels or ATR data, THE RR_Scanner SHALL skip it and log the reason.
- 10.8 WHEN a user requests trade setups, results SHALL be sorted by R:R descending (secondary: Composite_Score descending), with optional direction filter.
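For the long side, 10.2 plus the 3:1 threshold from 10.1 work out as below (the 1.5x ATR stop multiplier is an assumed default; the spec only says "ATR-based"; the short side mirrors the arithmetic):

```python
def long_setup(entry, sr_levels, atr, atr_mult=1.5, min_rr=3.0):
    """Long Trade_Setup per Req 10.2: target = nearest SR level above
    entry, stop = ATR-based distance below. Returns None if the R:R
    ratio misses the threshold (default 3:1, Req 10.1)."""
    targets = [lvl for lvl in sr_levels if lvl > entry]
    if not targets or atr <= 0:
        return None  # insufficient SR/ATR data: skip (Req 10.7)
    target = min(targets)          # nearest level above price
    stop = entry - atr * atr_mult
    rr = (target - entry) / (entry - stop)
    if rr < min_rr:
        return None
    return {"entry": entry, "stop": stop, "target": target,
            "rr": rr, "direction": "long"}
```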
### Requirement 11: Watchlist
**User Story:** As a user, I want a watchlist of top-ranked stocks that auto-populates from scoring, so that I always have a curated shortlist of the best opportunities.
#### Acceptance Criteria
- 11.1 THE Watchlist SHALL auto-include the top-X tickers by Composite_Score (X configurable, default 10).
- 11.2 WHEN requested, THE Watchlist SHALL return each entry with Composite_Score, Dimension_Scores, R:R ratio (if setup exists), and active SR_Levels.
- 11.3 Users MAY manually add/remove tickers. Manual additions are tagged and not subject to auto-population rules.
- 11.4 THE Watchlist SHALL enforce a max size of auto-populate count + 10 manual additions (default max 20).
- 11.5 WHEN Composite_Scores are recomputed, auto-populated entries SHALL update to reflect new rankings.
- 11.6 THE Watchlist SHALL be sortable by Composite_Score, any Dimension_Score, or R:R ratio.
### Requirement 12: User Authentication
**User Story:** As a system owner, I want user registration and login with role-based access, so that only authorized users can access signals and analysis.
#### Acceptance Criteria
- 12.1 WHEN registration is enabled and valid credentials are submitted, THE Auth_Service SHALL create a User with no API access by default.
- 12.2 WHEN registration is disabled, THE Auth_Service SHALL reject registration.
- 12.3 WHEN valid login credentials are submitted, THE Auth_Service SHALL return an Access_Token (60-minute expiry).
- 12.4 WHEN invalid credentials are submitted, THE Auth_Service SHALL return an error without revealing which field was wrong.
- 12.5 Unauthenticated requests to protected endpoints SHALL receive 401. Authenticated users without granted access SHALL receive 403.
- 12.6 WHEN a token expires, THE Backend_Service SHALL return 401 indicating expiration.
### Requirement 13: Admin Management
**User Story:** As an admin, I want to manage users, control system settings, and perform data maintenance.
#### Acceptance Criteria
- 13.1 WHEN the system initializes for the first time, a default admin account SHALL be created (username: "admin", password: "admin").
- 13.2 Admins SHALL be able to: grant/revoke user access, toggle registration, list all users, reset user passwords, and create new user accounts.
- 13.3 Admins SHALL be able to: enable/disable scheduled jobs, update system settings (frequencies, thresholds, weights, watchlist size), and trigger manual job runs.
- 13.4 Admins SHALL be able to delete all data older than a specified number of days (OHLCV, sentiment, fundamentals). Ticker entries, user accounts, and latest scores SHALL be preserved.
- 13.5 Admin endpoints SHALL be restricted to users with the admin role.
## Design Constraints
The following are engineering concerns to be addressed during design, not user-facing requirements:
- Consistent JSON API envelope (status, data, error fields) with appropriate HTTP status codes
- OpenAPI/Swagger documentation endpoint
- Versioned URL prefixes (/api/v1/)
- Composite database index on (ticker, date) for range query performance
- Date-only storage for OHLCV (no time component)
- Database migrations for schema management
- Structured JSON logging with configurable levels
- Database connection pooling (default 5 connections)
- Health check endpoint (unauthenticated)
- Graceful shutdown (complete in-flight requests, stop jobs, close pool)
- Market data provider behind an interface/protocol for swappability


@@ -0,0 +1,255 @@
# Implementation Plan: Stock Data Backend
## Overview
Incremental build of the investing-signal platform: foundation first (config, DB, models, auth), then domain services (tickers, OHLCV, ingestion, indicators, S/R, sentiment, fundamentals), then scoring/ranking (scoring engine, R:R scanner, watchlist), then scheduled jobs, deployment templates, and final wiring. Each step builds on the previous one and ends with the new components integrated into the application.
## Tasks
- [x] 1. Project scaffolding, configuration, and database foundation
- [x] 1.1 Create project structure with `pyproject.toml`, `.env.example`, `alembic.ini`, and `app/` package
- Create `pyproject.toml` with dependencies: fastapi, uvicorn, sqlalchemy[asyncio], asyncpg, alembic, pydantic-settings, python-jose, passlib[bcrypt], apscheduler, httpx, alpaca-py, google-genai, hypothesis
- Create `.env.example` with all environment variables from design
- Create `app/__init__.py`, `app/config.py` (pydantic-settings `Settings` class)
- Create `app/database.py` (async SQLAlchemy engine, session factory, connection pooling)
- _Requirements: Design Constraints (connection pooling, config)_
- [x] 1.2 Create all SQLAlchemy ORM models and Alembic initial migration
- Create `app/models/__init__.py` and model files: `ticker.py`, `ohlcv.py`, `user.py`, `sentiment.py`, `fundamental.py`, `score.py`, `sr_level.py`, `trade_setup.py`, `watchlist.py`, `settings.py`
- Implement all 12 entities from the ERD: User, Ticker, OHLCVRecord, SentimentScore, FundamentalData, SRLevel, DimensionScore, CompositeScore, TradeSetup, WatchlistEntry, SystemSetting, IngestionProgress
- Include composite unique constraints, indexes, and cascade deletes per design
- Initialize Alembic (`alembic/env.py`) and generate initial migration
- _Requirements: 2.1, 2.2, Design Constraints (composite index on ticker+date)_
- [x] 1.3 Create shared schemas, exception hierarchy, and API envelope
- Create `app/schemas/common.py` with `APIEnvelope` model (status, data, error)
- Create `app/middleware.py` with global exception handler mapping `AppError` subclasses to JSON envelope responses
- Create exception classes: `AppError`, `ValidationError`, `NotFoundError`, `DuplicateError`, `AuthenticationError`, `AuthorizationError`, `ProviderError`, `RateLimitError`
- _Requirements: Design Constraints (JSON envelope, HTTP status codes)_
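The envelope and the `AppError` mapping from task 1.3 could look like this (the HTTP codes per class and the `status` values are assumptions consistent with the design constraints, not the codebase's exact definitions):

```python
class AppError(Exception):
    status_code = 500
    code = "APP_ERROR"

class NotFoundError(AppError):
    status_code = 404
    code = "NOT_FOUND"

class DuplicateError(AppError):
    status_code = 409
    code = "DUPLICATE"

def to_envelope(result=None, error=None):
    """Shape every response as the design's (status, data, error)
    envelope; the global exception handler maps AppError subclasses here."""
    if error is None:
        return 200, {"status": "ok", "data": result, "error": None}
    return error.status_code, {"status": "error", "data": None,
                               "error": {"code": error.code,
                                         "message": str(error)}}
```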
- [x] 1.4 Create FastAPI app entry point with lifespan, health check, and dependency injection
- Create `app/main.py` with FastAPI app, lifespan handler (DB pool startup/shutdown, default admin creation)
- Create `app/dependencies.py` with `Depends()` factories for DB session, current user, admin guard
- Create `app/routers/health.py` with unauthenticated `/api/v1/health` endpoint
- Wire health router into app
- _Requirements: 13.1, Design Constraints (health check, graceful shutdown, versioned URLs)_
- [x] 2. Authentication and admin services
- [x] 2.1 Implement Auth Service and auth router
- Create `app/services/auth_service.py`: registration (configurable on/off, creates no-access user), login (bcrypt verify, JWT generation with 60-min expiry), token validation
- Create `app/schemas/auth.py`: RegisterRequest, LoginRequest, TokenResponse
- Create `app/routers/auth.py`: `POST /api/v1/auth/register`, `POST /api/v1/auth/login`
- Implement JWT middleware in `app/dependencies.py` for `get_current_user` and `require_admin`
- _Requirements: 12.1, 12.2, 12.3, 12.4, 12.5, 12.6_
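Task 2.1's token issuance hinges on the 60-minute expiry from Req 12.3. The plan uses python-jose; this stdlib HS256 sketch only illustrates the claim layout and the expiry arithmetic:

```python
import base64, hashlib, hmac, json

def _b64(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def issue_token(username: str, secret: str, now: float, ttl_s: int = 3600) -> str:
    """Hand-rolled HS256 JWT with a 60-minute expiry (Req 12.3)."""
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64(json.dumps({"sub": username, "exp": int(now) + ttl_s}).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = _b64(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def is_expired(token: str, now: float) -> bool:
    """Expiry check only; signature verification omitted in this sketch."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    return now >= claims["exp"]
```

An expired token detected this way is what maps to the 401-with-expiration response in Req 12.6.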
- [ ]* 2.2 Write property tests for auth (Properties 34-38)
- **Property 34: Registration creates no-access user** — _Validates: Requirements 12.1_
- **Property 35: Registration disabled rejects all attempts** — _Validates: Requirements 12.2_
- **Property 36: Login returns valid JWT** — _Validates: Requirements 12.3_
- **Property 37: Invalid credentials return generic error** — _Validates: Requirements 12.4_
- **Property 38: Access control enforcement** — _Validates: Requirements 12.5_
- [x] 2.3 Implement Admin Service and admin router
- Create `app/services/admin_service.py`: grant/revoke access, toggle registration, list users, reset passwords, create accounts, system settings CRUD, data cleanup (delete old OHLCV/sentiment/fundamentals preserving tickers/users/scores), job control
- Create `app/schemas/admin.py`: UserManagement, SystemSettingUpdate, DataCleanupRequest
- Create `app/routers/admin.py`: admin-only endpoints under `/api/v1/admin/`
- _Requirements: 13.1, 13.2, 13.3, 13.4, 13.5_
- [ ]* 2.4 Write property tests for admin (Properties 39-40)
- **Property 39: Admin user management operations** — _Validates: Requirements 13.2_
- **Property 40: Data cleanup preserves structure** — _Validates: Requirements 13.4_
- [x] 3. Checkpoint - Ensure all tests pass
- Ensure all tests pass; ask the user if questions arise.
- [x] 4. Ticker management and OHLCV price storage
- [x] 4.1 Implement Ticker Registry service and router
- Create `app/services/ticker_service.py`: add (validate non-empty, uppercase, alphanumeric, check uniqueness), delete (cascade all associated data), list (sorted alphabetically)
- Create `app/schemas/ticker.py`: TickerCreate, TickerResponse
- Create `app/routers/tickers.py`: `POST /api/v1/tickers`, `GET /api/v1/tickers`, `DELETE /api/v1/tickers/{symbol}`
- _Requirements: 1.1, 1.2, 1.3, 1.4, 1.5, 1.6_
- [ ]* 4.2 Write property tests for ticker management (Properties 1-4)
- **Property 1: Ticker creation round-trip** — _Validates: Requirements 1.1_
- **Property 2: Duplicate ticker rejection** — _Validates: Requirements 1.2_
- **Property 3: Whitespace ticker rejection** — _Validates: Requirements 1.3_
- **Property 4: Ticker deletion cascades** — _Validates: Requirements 1.5_
- [x] 4.3 Implement Price Store service and OHLCV router
- Create `app/services/price_service.py`: upsert OHLCV (validate high >= low, prices >= 0, volume >= 0, date <= today, ticker exists), query by ticker + date range
- Create `app/schemas/ohlcv.py`: OHLCVCreate, OHLCVResponse
- Create `app/routers/ohlcv.py`: `POST /api/v1/ohlcv`, `GET /api/v1/ohlcv/{symbol}`
- On upsert: invalidate LRU cache for ticker, mark composite score as stale
- _Requirements: 2.1, 2.2, 2.3, 2.4_
- [ ]* 4.4 Write property tests for OHLCV (Properties 5-7)
- **Property 5: OHLCV storage round-trip** — _Validates: Requirements 2.1, 2.2_
- **Property 6: OHLCV validation rejects invalid records** — _Validates: Requirements 2.3_
- **Property 7: OHLCV rejects unregistered tickers** — _Validates: Requirements 2.4_
- [x] 5. Market data provider and ingestion pipeline
- [x] 5.1 Implement provider protocols and concrete implementations
- Create `app/providers/protocol.py`: `MarketDataProvider` Protocol (fetch_ohlcv), `SentimentProvider` Protocol (fetch_sentiment), `FundamentalProvider` Protocol (fetch_fundamentals)
- Create `app/providers/alpaca.py`: Alpaca OHLCV provider using `alpaca-py` SDK — fetches daily bars by ticker and date range
- Create `app/providers/gemini_sentiment.py`: Gemini sentiment provider using `google-genai` with search grounding — sends structured prompt per ticker, parses JSON response (classification + confidence)
- Create `app/providers/fmp.py`: Financial Modeling Prep fundamentals provider using `httpx` — fetches P/E, revenue growth, earnings surprise, market cap
- _Requirements: Design Constraints (provider behind interface)_
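Task 5.1's `MarketDataProvider` is a `typing.Protocol`, so concrete providers and test stubs need no inheritance, only a matching method. A sketch (the `fetch_ohlcv` signature is an assumption based on the task description):

```python
from datetime import date
from typing import Protocol

class MarketDataProvider(Protocol):
    """Structural interface: anything exposing fetch_ohlcv can back the
    Ingestion_Pipeline (Alpaca in the plan, an in-memory stub in tests)."""
    def fetch_ohlcv(self, symbol: str, start: date, end: date) -> list[dict]:
        ...

class StubProvider:
    """No inheritance needed; the shape of the method is the contract."""
    def fetch_ohlcv(self, symbol, start, end):
        return [{"symbol": symbol, "date": start, "open": 1.0, "high": 1.0,
                 "low": 1.0, "close": 1.0, "volume": 100}]

def ingest(provider: MarketDataProvider, symbol: str, start: date, end: date):
    return provider.fetch_ohlcv(symbol, start, end)
```

This is what makes the "provider behind an interface for swappability" design constraint testable without network access.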
- [x] 5.2 Implement Ingestion Pipeline service and router
- Create `app/services/ingestion_service.py`: fetch + upsert with rate-limit handling (track `last_ingested_date`, return partial progress on rate limit, resume from last date + 1 day), provider error handling (descriptive error, no data modification)
- Create `app/routers/ingestion.py`: `POST /api/v1/ingestion/fetch/{symbol}`
- _Requirements: 3.1, 3.2, 3.3, 3.4_
- [ ]* 5.3 Write property tests for ingestion (Properties 8-9)
- **Property 8: Provider error preserves existing data** — _Validates: Requirements 3.2, 7.3, 8.3_
- **Property 9: Rate-limit resume continuity** — _Validates: Requirements 3.3, 3.4, 4.5_
- [x] 6. Checkpoint - Ensure all tests pass
- Ensure all tests pass; ask the user if questions arise.
- [x] 7. Technical analysis and S/R detection
- [x] 7.1 Implement LRU cache wrapper with invalidation
- Create `app/cache.py`: LRU cache wrapper (max 1000 entries) keyed on ticker + date range + indicator type, with per-ticker invalidation method
- _Requirements: Design Constraints (LRU cache)_
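Task 7.1's cache can be built on `OrderedDict`; a sketch with the plan's 1000-entry cap and the per-ticker invalidation the upsert path (task 4.3) relies on. Keys are assumed to be tuples whose first element is the ticker:

```python
from collections import OrderedDict

class IndicatorCache:
    """LRU cache keyed on (ticker, date_range, indicator_type) with
    per-ticker invalidation, per task 7.1."""
    def __init__(self, max_entries=1000):
        self.max_entries = max_entries
        self._data = OrderedDict()

    def get(self, key):
        if key in self._data:
            self._data.move_to_end(key)  # mark as most recently used
            return self._data[key]
        return None

    def put(self, key, value):
        self._data[key] = value
        self._data.move_to_end(key)
        if len(self._data) > self.max_entries:
            self._data.popitem(last=False)  # evict least recently used

    def invalidate_ticker(self, ticker):
        for key in [k for k in self._data if k[0] == ticker]:
            del self._data[key]
```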
- [x] 7.2 Implement Technical Analysis service and indicators router
- Create `app/services/indicator_service.py`: compute ADX (28+ bars), EMA (period+1 bars, default 20/50), RSI (15+ bars, 14-period), ATR (15+ bars, 14-period), Volume Profile (20+ bars, POC/Value Area/HVN/LVN), Pivot Points (5+ bars, swing highs/lows)
- Each indicator returns raw values + normalized 0-100 score
- Implement EMA cross signal (bullish/bearish/neutral based on short vs long EMA comparison)
- Enforce minimum data requirements, return error if insufficient
- Create `app/schemas/indicator.py`: IndicatorRequest, IndicatorResponse, EMACrossResponse
- Create `app/routers/indicators.py`: `GET /api/v1/indicators/{symbol}/{indicator_type}`, `GET /api/v1/indicators/{symbol}/ema-cross`
- _Requirements: 5.1, 5.2, 5.3, 5.4_
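As one concrete instance of task 7.2's minimum-bar enforcement and raw-value computation, a 14-period RSI with Wilder's smoothing (the plan's "15+ bars" is `period + 1`; since RSI is already 0-100, the raw value doubles as the normalized score here):

```python
def rsi(closes, period=14):
    """Wilder's RSI; raises ValueError below period + 1 closes (Req 5.4)."""
    if len(closes) < period + 1:
        raise ValueError(f"need at least {period + 1} closes")
    gains = losses = 0.0
    for prev, cur in zip(closes, closes[1:period + 1]):
        delta = cur - prev
        gains += max(delta, 0.0)
        losses += max(-delta, 0.0)
    avg_gain, avg_loss = gains / period, losses / period
    for prev, cur in zip(closes[period:], closes[period + 1:]):
        delta = cur - prev          # Wilder smoothing over remaining bars
        avg_gain = (avg_gain * (period - 1) + max(delta, 0.0)) / period
        avg_loss = (avg_loss * (period - 1) + max(-delta, 0.0)) / period
    if avg_loss == 0:
        return 100.0
    return 100 - 100 / (1 + avg_gain / avg_loss)
```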
- [ ]* 7.3 Write property tests for indicators (Properties 11-14)
- **Property 11: Score bounds invariant** — _Validates: Requirements 5.2, 6.2, 9.1_
- **Property 12: Indicator minimum data enforcement** — _Validates: Requirements 5.4_
- **Property 13: EMA cross directional bias** — _Validates: Requirements 5.3_
- **Property 14: Indicator computation determinism** — _Validates: Requirements 5.1_
- [x] 7.4 Implement S/R Detector service and router
- Create `app/services/sr_service.py`: detect SR levels from Volume Profile (HVN/LVN) and Pivot Points (swing highs/lows), assign strength scores (0-100 based on price respect count), merge levels within tolerance (default 0.5%), tag as support/resistance relative to current price, recalculate on new OHLCV data
- Create `app/schemas/sr_level.py`: SRLevelResponse
- Create `app/routers/sr_levels.py`: `GET /api/v1/sr-levels/{symbol}` (sorted by strength descending)
- _Requirements: 6.1, 6.2, 6.3, 6.4, 6.5, 6.6_
- [ ]* 7.5 Write property tests for S/R detection (Properties 15-17)
- **Property 15: SR level support/resistance tagging** — _Validates: Requirements 6.3_
- **Property 16: SR level merging within tolerance** — _Validates: Requirements 6.5_
- **Property 17: SR level detection from data** — _Validates: Requirements 6.1_
- [x] 8. Sentiment and fundamental data services
- [x] 8.1 Implement Sentiment service and router
- Create `app/services/sentiment_service.py`: store sentiment records (classification, confidence, source, timestamp), compute dimension score with time-decay weighted average over configurable lookback window (default 24h)
- Create `app/schemas/sentiment.py`: SentimentResponse
- Create `app/routers/sentiment.py`: `GET /api/v1/sentiment/{symbol}`
- _Requirements: 7.1, 7.2, 7.3, 7.4_
- [ ]* 8.2 Write property tests for sentiment (Properties 18-19)
- **Property 18: Sentiment score data shape** — _Validates: Requirements 7.2_
- **Property 19: Sentiment dimension score uses time decay** — _Validates: Requirements 7.4_
- [x] 8.3 Implement Fundamental Data service and router
- Create `app/services/fundamental_service.py`: store fundamental data (P/E, revenue growth, earnings surprise, market cap), mark fundamental dimension score as stale on new data
- Create `app/schemas/fundamental.py`: FundamentalResponse
- Create `app/routers/fundamentals.py`: `GET /api/v1/fundamentals/{symbol}`
- _Requirements: 8.1, 8.2, 8.3, 8.4_
- [ ]* 8.4 Write property test for fundamentals (Property 20)
- **Property 20: Fundamental data storage round-trip** — _Validates: Requirements 8.1_
- [x] 9. Checkpoint - Ensure all tests pass
- Ensure all tests pass; ask the user if questions arise.
- [x] 10. Scoring engine, R:R scanner, and watchlist
- [x] 10.1 Implement Scoring Engine service and router
- Create `app/services/scoring_service.py`: compute dimension scores (technical, sr_quality, sentiment, fundamental, momentum) each 0-100, compute composite score as weighted average of available dimensions with re-normalized weights, staleness marking/recomputation on demand, weight update triggers full recomputation
- Create `app/schemas/score.py`: ScoreResponse, WeightUpdateRequest, RankingResponse
- Create `app/routers/scores.py`: `GET /api/v1/scores/{symbol}`, `GET /api/v1/rankings`, `PUT /api/v1/scores/weights`
- _Requirements: 9.1, 9.2, 9.3, 9.4, 9.5, 9.6, 9.7_
- [ ]* 10.2 Write property tests for scoring (Properties 21-25)
- **Property 21: Composite score is weighted average** — _Validates: Requirements 9.2_
- **Property 22: Missing dimensions re-normalize weights** — _Validates: Requirements 9.3_
- **Property 23: Staleness marking on data change** — _Validates: Requirements 9.4_
- **Property 24: Stale score recomputation on demand** — _Validates: Requirements 9.5_
- **Property 25: Weight update triggers full recomputation** — _Validates: Requirements 9.7_
- [x] 10.3 Implement R:R Scanner service and router
- Create `app/services/rr_scanner_service.py`: scan tickers for trade setups (long: target = nearest SR above, stop = entry - ATR×multiplier; short: target = nearest SR below, stop = entry + ATR×multiplier), filter by R:R threshold (default 3:1), recalculate/prune on data change, skip tickers without sufficient SR/ATR data
- Create `app/schemas/trade_setup.py`: TradeSetupResponse
- Create `app/routers/trades.py`: `GET /api/v1/trades` (sorted by R:R desc, secondary composite desc, optional direction filter)
- _Requirements: 10.1, 10.2, 10.3, 10.4, 10.5, 10.6, 10.7, 10.8_
- [ ]* 10.4 Write property tests for R:R scanner (Properties 26-29)
- **Property 26: Trade setup R:R threshold filtering** — _Validates: Requirements 10.1_
- **Property 27: Trade setup computation correctness** — _Validates: Requirements 10.2, 10.3_
- **Property 28: Trade setup data completeness** — _Validates: Requirements 10.4_
- **Property 29: Trade setup pruning on data change** — _Validates: Requirements 10.5_
- [x] 10.5 Implement Watchlist service and router
- Create `app/services/watchlist_service.py`: auto-populate top-X by composite score (default 10), manual add/remove (tagged, not subject to auto-population), enforce cap (auto + 10 manual, default max 20), update auto entries on score recomputation
- Create `app/schemas/watchlist.py`: WatchlistEntryResponse (includes composite score, dimension scores, R:R ratio, SR levels)
- Create `app/routers/watchlist.py`: `GET /api/v1/watchlist`, `POST /api/v1/watchlist/{symbol}`, `DELETE /api/v1/watchlist/{symbol}` (sortable by composite, dimension, or R:R)
- _Requirements: 11.1, 11.2, 11.3, 11.4, 11.5, 11.6_
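The auto-population and cap rules in 10.5 amount to a small merge of manual entries with the top-N by composite score. A sketch under assumed shapes (`populate_watchlist`, the `symbol -> score` mapping, and the `"auto"`/`"manual"` tags are hypothetical):

```python
def populate_watchlist(scores: dict[str, float],
                       manual: set[str],
                       auto_size: int = 10,
                       max_size: int = 20) -> dict[str, str]:
    """Return symbol -> tag. Manual entries are kept regardless of score
    and are not subject to auto-population; auto slots are filled with
    the top-N by composite score; the combined list never exceeds max_size."""
    entries: dict[str, str] = {sym: "manual" for sym in manual}
    ranked = sorted(scores, key=lambda s: scores[s], reverse=True)
    auto_count = 0
    for sym in ranked:
        if auto_count >= auto_size or len(entries) >= max_size:
            break
        if sym not in entries:  # a manually-added top ticker keeps its tag
            entries[sym] = "auto"
            auto_count += 1
    return entries
```

Re-running this after each score recomputation replaces the auto entries while leaving manual ones untouched, which is the behaviour Properties 30, 32, and 33 exercise.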
- [ ]* 10.6 Write property tests for watchlist (Properties 30-33)
- **Property 30: Watchlist auto-population** — _Validates: Requirements 11.1_
- **Property 31: Watchlist entry data completeness** — _Validates: Requirements 11.2_
- **Property 32: Manual watchlist entries persist through auto-population** — _Validates: Requirements 11.3_
- **Property 33: Watchlist size cap enforcement** — _Validates: Requirements 11.4_
- [x] 11. Checkpoint - Ensure all tests pass
- Ensure all tests pass; ask the user if any questions arise.
- [x] 12. Scheduled jobs and sorting correctness
- [x] 12.1 Implement APScheduler job definitions and scheduler integration
- Create `app/scheduler.py`: define scheduled jobs for Data Collector (OHLCV fetch for all tickers, configurable frequency), Sentiment Collector (default 30 min), Fundamental Collector (default daily), R:R Scanner (configurable frequency)
- Each job: process all tracked tickers independently (one failure doesn't stop others), log errors with structured JSON, handle rate limits (record last successful ticker, resume next run)
- Wire scheduler into FastAPI lifespan (start on startup, shutdown gracefully)
- _Requirements: 4.1, 4.2, 4.3, 4.4, 4.5, 7.1, 8.2, 10.6_
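The per-ticker isolation and rate-limit resume behaviour each job needs can be sketched independently of APScheduler itself. `run_collection_job`, the injected `fetch` callable, and the `state` dict are illustrative stand-ins for the real service and persistence layer:

```python
import json
import logging

class RateLimitError(Exception):
    """Raised by the provider client when the API rate limit is hit."""

def run_collection_job(tickers: list[str], fetch, state: dict) -> None:
    """Process each tracked ticker independently: a per-ticker failure is
    logged as structured JSON and does not stop the rest. On a rate limit
    the job stops and records the last successful ticker so the next run
    can resume after it."""
    start = 0
    last = state.get("last_successful")
    if last in tickers:
        start = tickers.index(last) + 1  # resume after the last success
    for symbol in tickers[start:]:
        try:
            fetch(symbol)
            state["last_successful"] = symbol
        except RateLimitError:
            logging.error(json.dumps({"event": "rate_limited", "ticker": symbol}))
            return  # next run resumes from state["last_successful"]
        except Exception as exc:
            logging.error(json.dumps({"event": "fetch_failed", "ticker": symbol,
                                      "error": str(exc)}))
    state["last_successful"] = None  # full pass done; next run starts fresh
```

The scheduler then only has to invoke this body at the configured interval; starting and stopping it inside the FastAPI lifespan keeps shutdown graceful.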
- [ ]* 12.2 Write property test for scheduled collection (Property 10)
- **Property 10: Scheduled collection processes all tickers** — _Validates: Requirements 4.1, 4.3, 7.1, 8.2_
- [ ]* 12.3 Write property test for sorting correctness (Property 41)
- **Property 41: Sorting correctness** — _Validates: Requirements 1.4, 6.6, 9.6, 10.8, 11.6_
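For the trades endpoint, the ordering this property checks (R:R descending, composite score descending as tie-break) is a plain two-key sort. A minimal sketch with hypothetical dict-shaped setups:

```python
def sort_setups(setups: list[dict]) -> list[dict]:
    """Order trade setups by R:R descending, then composite score
    descending, the ordering GET /api/v1/trades must return."""
    return sorted(setups, key=lambda s: (-s["rr"], -s["composite"]))

def is_sorted(setups: list[dict]) -> bool:
    """The pairwise invariant a sorting property test can assert."""
    return all((a["rr"], a["composite"]) >= (b["rr"], b["composite"])
               for a, b in zip(setups, setups[1:]))
```

The same pattern (negate numeric keys for descending order) covers the other sortable listings named in Requirements 1.4, 6.6, 9.6, and 11.6.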
- [x] 13. Test infrastructure and shared fixtures
- [x] 13.1 Create test configuration and shared fixtures
- Create `tests/conftest.py`: test DB session fixture (transaction rollback per test), FastAPI test client fixture, mock `MarketDataProvider`, hypothesis custom strategies (`valid_ticker_symbols`, `whitespace_strings`, `valid_ohlcv_records`, `invalid_ohlcv_records`, `dimension_scores`, `weight_configs`, `sr_levels`, `sentiment_scores`, `trade_setups`)
- Create `tests/__init__.py`, `tests/unit/__init__.py`, `tests/property/__init__.py`
- _Requirements: Design (Testing Strategy)_
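The `valid_ohlcv_records` and `invalid_ohlcv_records` strategies hinge on one invariant, which valid records satisfy and invalid ones violate. A framework-free sketch of that invariant (the predicate name is hypothetical):

```python
def is_valid_ohlcv(open_: float, high: float, low: float,
                   close: float, volume: int) -> bool:
    """OHLCV bar invariant: high is the bar's maximum, low its minimum,
    prices are positive, and volume is non-negative."""
    return (
        low > 0
        and low <= min(open_, close)
        and max(open_, close) <= high
        and volume >= 0
    )
```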
- [x] 14. Deployment templates and CI/CD
- [x] 14.1 Create deployment configuration files
- Create `deploy/nginx.conf` (reverse proxy for signal.thiessen.io)
- Create `deploy/stock-data-backend.service` (systemd unit file)
- Create `deploy/setup_db.sh` (idempotent DB creation + migration script)
- Create `.gitea/workflows/deploy.yml` (lint → test → deploy pipeline)
- _Requirements: Design (Deployment and Infrastructure)_
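A sketch of what the systemd unit might contain; the user, paths, port, and uvicorn invocation are assumptions, only the service name and the reverse-proxy arrangement come from the tasks above:

```ini
# deploy/stock-data-backend.service (sketch; paths and user are assumed)
[Unit]
Description=Stock Data Backend (FastAPI)
After=network.target postgresql.service

[Service]
User=stockdata
WorkingDirectory=/opt/stock-data-backend
EnvironmentFile=/opt/stock-data-backend/.env
ExecStart=/opt/stock-data-backend/.venv/bin/uvicorn app.main:app --host 127.0.0.1 --port 8000
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Binding uvicorn to 127.0.0.1 keeps the app reachable only through the nginx reverse proxy in `deploy/nginx.conf`.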
- [x] 15. Final wiring and integration
- [x] 15.1 Wire all routers into FastAPI app and verify OpenAPI docs
- Register all routers in `app/main.py` under `/api/v1/` prefix
- Verify Swagger/OpenAPI docs endpoint works at `/docs`
- Ensure all middleware (logging, error handling, auth) is applied
- _Requirements: Design Constraints (OpenAPI/Swagger, versioned URLs)_
- [ ]* 15.2 Write integration tests for key API flows
- Test end-to-end: register → login → add ticker → fetch data → get indicators → get scores → get watchlist
- Test auth enforcement: unauthenticated → 401, no-access user → 403, admin endpoints → 403 for non-admin
- Test error flows: duplicate ticker → 409, invalid OHLCV → 400, missing ticker → 404
- _Requirements: 1.1-1.6, 2.1-2.4, 12.1-12.6_
- [x] 16. Final checkpoint - Ensure all tests pass
- Ensure all tests pass; ask the user if any questions arise.
## Notes
- Tasks marked with `*` are optional and can be skipped for a faster MVP
- Each task references specific requirements for traceability
- Checkpoints ensure incremental validation
- Property tests validate the 41 correctness properties from the design document using `hypothesis`
- Unit tests validate specific examples and edge cases
- All code is Python 3.12+ with FastAPI, SQLAlchemy async, and PostgreSQL