Compare commits


53 Commits

Author SHA1 Message Date
FrederikBaerentsen 21d104280c feat(minifigures): individual minifigures can now be disabled using env var 2025-10-10 16:45:04 +02:00
FrederikBaerentsen 5946f86dfa feat(storages): added not in storage section 2025-10-10 14:52:08 +02:00
FrederikBaerentsen 0155144881 Updated gitignore 2025-10-10 14:46:57 +02:00
FrederikBaerentsen 1353153394 feat(minifigures): fixed owner filter on parts page 2025-10-10 11:29:36 +02:00
FrederikBaerentsen 0f45192f8e fix(index): fixed startpage error 2025-10-10 11:25:56 +02:00
FrederikBaerentsen b02f851865 feat(minifigures): fixed badge style and amount. Added env var for desc length and fixed wrap around style 2025-10-10 11:18:41 +02:00
FrederikBaerentsen bddfbb5235 feat(problems): all problems now show on /parts/problems, even from individual figures 2025-10-10 10:57:56 +02:00
FrederikBaerentsen dc34916331 fix(minifigures): metadata now works and saves correctly 2025-10-10 10:54:07 +02:00
FrederikBaerentsen a8d36bc5f1 fix(minifigures): fixed metadata format and individual minifigures layout 2025-10-10 08:29:12 +02:00
FrederikBaerentsen bd32ca5b8f feat(minifigures): added individual sort on /minifigure page 2025-10-09 17:12:04 +02:00
FrederikBaerentsen 2ed60e3fe3 fix(minifigure): fixed double click issue, nil image and bulk add. 2025-10-09 16:28:22 +02:00
FrederikBaerentsen 0ec1d37c36 feat(minifigures): initial upload. 2025-10-09 15:57:55 +02:00
FrederikBaerentsen 8053f5d30c feat(sets): show bricklink if enabled 2025-10-03 10:16:56 +02:00
FrederikBaerentsen 7eb199d289 fix(env): changed default minifigures folder from minifigs to minifigures (#92) 2025-10-03 09:50:41 +02:00
FrederikBaerentsen 6364da676b fix(admin): added logging to respect debug var 2025-10-03 09:22:45 +02:00
FrederikBaerentsen a3d08d8cf6 feat(sets): added filter on sets page to show duplicate sets. default is shown. can be hidden using env var. works with consolidated sets too. 2025-10-03 09:13:15 +02:00
FrederikBaerentsen 4b653ac270 feat(admin): added live configuration management, where user can enable/disable and change configurations without editing .env file. Some changes will need an application restart 2025-10-03 00:15:21 +02:00
FrederikBaerentsen a70a1660f0 fix(admin): open the right drawer on database upgrade 2025-10-02 23:52:13 +02:00
FrederikBaerentsen 0db749fce0 doc(changelog): updated changelog. 2025-10-02 14:58:23 +02:00
FrederikBaerentsen 256108bbdb feat(sql): WAL and index optimization 2025-10-02 14:53:58 +02:00
FrederikBaerentsen 145d9d5dcb feat(admin): database is expanded by default 2025-10-02 14:35:37 +02:00
FrederikBaerentsen b9d42c2866 feat(admin): new env var for which sections should be open by default on the admin page. 2025-10-02 14:27:32 +02:00
FrederikBaerentsen d1988d015e fix(sets): year-filter now correctly shows all years, not just the current page. 2025-10-02 14:02:51 +02:00
FrederikBaerentsen 8e458b01d1 Merge pull request 'feature/statistics' (#107) from feature/statistics into release/1.3
Reviewed-on: #107
2025-10-02 13:36:31 +02:00
FrederikBaerentsen 989e0d57d0 Fixed date formatting on consolidated sets 2025-10-01 21:17:44 +02:00
FrederikBaerentsen 1097255dca Fixed consolidated price on card 2025-10-01 21:11:14 +02:00
FrederikBaerentsen 7ffbc41f0a Updated changelog 2025-10-01 21:02:58 +02:00
FrederikBaerentsen 11f9e5782f Added charts, env var for charts, fixed formatting and table columns 2025-10-01 20:52:29 +02:00
FrederikBaerentsen 5f43e979f9 feat(statistics): Initial upload 2025-10-01 19:43:25 +02:00
FrederikBaerentsen 4375f018a4 Merge pull request 'feature/consolidation' (#106) from feature/consolidation into release/1.3
Reviewed-on: #106
2025-10-01 19:28:49 +02:00
FrederikBaerentsen 87472039be Changed border color 2025-10-01 19:22:57 +02:00
FrederikBaerentsen c1089c349f Fixed total minifigures for consolidated sets 2025-09-28 08:59:10 +02:00
FrederikBaerentsen 3f6af51a43 Changed the look of consolidated cards when multiple statuses are used. 2025-09-28 08:42:33 +02:00
FrederikBaerentsen bc3cc176ef Fixed purchase information on consolidated cards 2025-09-27 23:43:27 +02:00
FrederikBaerentsen 4a1a265fa8 Updated changelog 2025-09-27 23:32:45 +02:00
FrederikBaerentsen 7c95583345 Changed the "Multiple Copies Available" view and fixed border formatting. 2025-09-27 23:30:13 +02:00
FrederikBaerentsen 65f23c1f12 Fixed nested box formatting. 2025-09-27 23:06:53 +02:00
FrederikBaerentsen aa6c969a6b Fixed consolidating sets. 2025-09-27 23:06:06 +02:00
FrederikBaerentsen 0bff20215c Merge pull request 'feature/checkbox' (#105) from feature/checkbox into release/1.3
Reviewed-on: #105
2025-09-27 16:26:04 +02:00
FrederikBaerentsen d0147b8061 Incremented version to 1.3.0 2025-09-27 16:17:05 +02:00
FrederikBaerentsen ca0de215ab Fixed damaged parts drawer showing on minifigures when no parts are damaged. 2025-09-26 12:46:31 +02:00
FrederikBaerentsen 05b259e494 Removed checkboxes from minifigures details page 2025-09-26 12:28:49 +02:00
FrederikBaerentsen f03fd82be1 Feat(checkbox): Initial upload 2025-09-26 11:47:15 +02:00
FrederikBaerentsen a769e5464b Merge pull request 'feature/peeron' (#104) from feature/peeron into release/1.3
Reviewed-on: #104
2025-09-26 11:40:01 +02:00
FrederikBaerentsen 40871a1c10 Changed download string 2025-09-26 11:37:49 +02:00
FrederikBaerentsen caac283905 Updated peeron download logic with proper socket. 2025-09-26 11:31:22 +02:00
FrederikBaerentsen 4bc0ef5cc4 Cache Peeron thumbnails, as Peeron uses HTTP and can't be hotlinked from HTTPS 2025-09-25 22:09:36 +02:00
FrederikBaerentsen ec4f44a3ab Removed unused import 2025-09-25 21:46:58 +02:00
FrederikBaerentsen 0a29543939 Cleanup of peeron download 2025-09-25 21:42:15 +02:00
FrederikBaerentsen 74fe14f09b Added rotation, moved select all, added link after download 2025-09-25 20:47:41 +02:00
FrederikBaerentsen 787624c432 Added env variables and fixed socket for peeron 2025-09-24 21:59:10 +02:00
FrederikBaerentsen eddf4311d3 Feat(peeron): Initial upload 2025-09-24 21:59:10 +02:00
FrederikBaerentsen 90c0c20d75 Merge pull request 'feature/pagination' (#101) from feature/pagination into release/1.3
Reviewed-on: #101
2025-09-24 21:49:05 +02:00
147 changed files with 9555 additions and 422 deletions
+95 -12
@@ -32,6 +32,11 @@
# Default: https://www.bricklink.com/v2/catalog/catalogitem.page?P={part}&C={color}
# BK_BRICKLINK_LINK_PART_PATTERN=
# Optional: Pattern of the link to Bricklink for a set. Will be passed to Python .format()
# Supports {set_num} parameter. Set numbers in format like '10255-1' are used.
# Default: https://www.bricklink.com/v2/catalog/catalogitem.page?S={set_num}
# BK_BRICKLINK_LINK_SET_PATTERN=
# Optional: Display Bricklink links wherever applicable
# Default: false
# BK_BRICKLINK_LINKS=true
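The BrickLink pattern variables above are passed to Python's `str.format()`. A minimal sketch of that expansion, using the documented default for `BK_BRICKLINK_LINK_SET_PATTERN` (the helper name `bricklink_set_url` is illustrative, not BrickTracker's actual function):

```python
# Documented default pattern for BK_BRICKLINK_LINK_SET_PATTERN
DEFAULT_SET_PATTERN = "https://www.bricklink.com/v2/catalog/catalogitem.page?S={set_num}"


def bricklink_set_url(pattern: str, set_num: str) -> str:
    # Set numbers use the '10255-1' (number-version) form
    return pattern.format(set_num=set_num)


url = bricklink_set_url(DEFAULT_SET_PATTERN, "10255-1")
# → https://www.bricklink.com/v2/catalog/catalogitem.page?S=10255-1
```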
@@ -56,6 +61,10 @@
# Default: 25
# BK_DEFAULT_TABLE_PER_PAGE=50
# Optional: Maximum length for description text in badges before truncating with ellipsis
# Default: 15
# BK_DESCRIPTION_BADGE_MAX_LENGTH=15
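A sketch of the truncation behavior described above, assuming a plain cut followed by an ellipsis (the helper name is hypothetical):

```python
def truncate_badge(text: str, limit: int = 15) -> str:
    # Descriptions within the limit are shown as-is;
    # longer ones are cut and suffixed with an ellipsis.
    return text if len(text) <= limit else text[:limit] + "…"
```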
# Optional: if set up, will add a CORS allow origin restriction to the socket.
# Default:
# Legacy name: DOMAIN_NAME
@@ -97,6 +106,14 @@
# Default: false
# BK_HIDE_ADMIN=true
# Optional: Admin sections to expand by default (comma-separated list)
# Valid sections: authentication, instructions, image, theme, retired, metadata, owner, purchase_location, status, storage, tag, database
# Default: database (maintains original behavior with database section expanded)
# Examples:
# BK_ADMIN_DEFAULT_EXPANDED_SECTIONS=database,theme
# BK_ADMIN_DEFAULT_EXPANDED_SECTIONS=instructions,metadata
# BK_ADMIN_DEFAULT_EXPANDED_SECTIONS= (all sections collapsed)
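The comma-separated value is split into a list, with blanks dropped, so an empty value collapses every section. A minimal sketch of that parsing:

```python
def parse_expanded_sections(raw: str) -> list[str]:
    # Split on commas, trim whitespace, and drop empty entries
    return [section.strip() for section in raw.split(',') if section.strip()]


parse_expanded_sections("database,theme")  # ['database', 'theme']
parse_expanded_sections("")                # [] — all sections collapsed
```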
# Optional: Hide the 'Instructions' entry from the menu. Does not disable the route.
# Default: false
# BK_HIDE_ALL_INSTRUCTIONS=true
@@ -105,6 +122,13 @@
# Default: false
# BK_HIDE_ALL_MINIFIGURES=true
# Optional: Disable the individual/loose minifigures system. This hides all individual
# minifigure UI elements and prevents adding new individual minifigures. The routes remain
# accessible so existing individual minifigures can still be viewed. Users who only track
# set-based minifigures can use this to simplify the interface. Does not disable the route.
# Default: false
# BK_DISABLE_INDIVIDUAL_MINIFIGURES=false
# Optional: Hide the 'Parts' entry from the menu. Does not disable the route.
# Default: false
# BK_HIDE_ALL_PARTS=true
@@ -122,6 +146,10 @@
# Default: false
# BK_HIDE_ALL_STORAGES=true
# Optional: Hide the 'Statistics' entry from the menu. Does not disable the route.
# Default: false
# BK_HIDE_STATISTICS=true
# Optional: Hide the 'Instructions' entry in a Set card
# Default: false
# BK_HIDE_SET_INSTRUCTIONS=true
@@ -134,17 +162,24 @@
# Default: false
# BK_HIDE_TABLE_MISSING_PARTS=true
# Optional: Hide the 'Checked' column from the parts table.
# Default: false
# BK_HIDE_TABLE_CHECKED_PARTS=true
# Optional: Hide the 'Wishlist' entry from the menu. Does not disable the route.
# Default: false
# BK_HIDE_WISHES=true
# Optional: Change the default order of minifigures. By default ordered by insertion order.
# Note: Minifigures are queried from a combined view that merges both set-based and individual minifigures.
# Therefore, column references should use the "combined" table alias.
# Useful column names for this option are:
# - "rebrickable_minifigures"."figure": minifigure ID (fig-xxxxx)
# - "rebrickable_minifigures"."number": minifigure ID as an integer (xxxxx)
# - "rebrickable_minifigures"."name": minifigure name
# Default: "rebrickable_minifigures"."name" ASC
# BK_MINIFIGURES_DEFAULT_ORDER="rebrickable_minifigures"."name" ASC
# - "combined"."figure": minifigure ID (fig-xxxxx)
# - "combined"."number": minifigure ID as an integer (xxxxx)
# - "combined"."name": minifigure name
# - "combined"."rowid": insertion order (for both set and individual minifigures)
# Default: "combined"."name" ASC
# BK_MINIFIGURES_DEFAULT_ORDER="combined"."name" ASC
# Optional: Folder where to store the minifigures images, relative to the '/app/static/' folder
# Default: minifigs
@@ -157,14 +192,16 @@
# BK_NO_THREADED_SOCKET=true
# Optional: Change the default order of parts. By default ordered by insertion order.
# Note: Parts are queried from a combined view that merges both set-based and individual minifigure parts.
# Some columns use the "combined" table alias for fields from the merged view.
# Useful column names for this option are:
# - "bricktracker_parts"."part": part number
# - "bricktracker_parts"."spare": part is a spare part
# - "combined"."part": part number
# - "combined"."spare": part is a spare part (use "combined" not "bricktracker_parts")
# - "rebrickable_parts"."name": part name
# - "rebrickable_parts"."color_name": part color name
# - "total_missing": number of missing parts
# Default: "rebrickable_parts"."name" ASC, "rebrickable_parts"."color_name" ASC, "bricktracker_parts"."spare" ASC
# BK_PARTS_DEFAULT_ORDER="total_missing" DESC, "rebrickable_parts"."name"."name" ASC
# Default: "rebrickable_parts"."name" ASC, "rebrickable_parts"."color_name" ASC, "combined"."spare" ASC
# BK_PARTS_DEFAULT_ORDER="total_missing" DESC, "rebrickable_parts"."name" ASC
# Optional: Folder where to store the parts images, relative to the '/app/static/' folder
# Default: parts
@@ -262,9 +299,36 @@
# Default: https://rebrickable.com/instructions/{path}
# BK_REBRICKABLE_LINK_INSTRUCTIONS_PATTERN=
# Optional: User-Agent to use when querying Rebrickable outside of the Rebrick python library
# Default: 'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36'
# BK_REBRICKABLE_USER_AGENT=
# Optional: User-Agent to use when querying Rebrickable and Peeron outside of the Rebrick python library
# Default: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36
# BK_USER_AGENT=Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36
# Legacy: User-Agent for Rebrickable (use BK_USER_AGENT instead)
# BK_REBRICKABLE_USER_AGENT=Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36
# Optional: Delay in milliseconds between Peeron page downloads to avoid being potentially blocked
# Default: 1000
# BK_PEERON_DOWNLOAD_DELAY=1000
# Optional: Minimum image size (width/height) for valid Peeron instruction pages
# Images smaller than this are considered error placeholders and will be rejected
# Default: 100
# BK_PEERON_MIN_IMAGE_SIZE=100
# Optional: Pattern for Peeron instruction page URLs. Will be passed to Python .format()
# Supports {set_number} and {version_number} parameters
# Default: http://peeron.com/scans/{set_number}-{version_number}
# BK_PEERON_INSTRUCTION_PATTERN=
# Optional: Pattern for Peeron thumbnail URLs. Will be passed to Python .format()
# Supports {set_number} and {version_number} parameters
# Default: http://belay.peeron.com/thumbs/{set_number}-{version_number}/
# BK_PEERON_THUMBNAIL_PATTERN=
# Optional: Pattern for Peeron scan URLs. Will be passed to Python .format()
# Supports {set_number} and {version_number} parameters
# Default: http://belay.peeron.com/scans/{set_number}-{version_number}/
# BK_PEERON_SCAN_PATTERN=
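The Peeron patterns above take `{set_number}` and `{version_number}` via `str.format()`, and `BK_PEERON_DOWNLOAD_DELAY` (milliseconds) paces successive requests. A sketch under those assumptions (the helper name is illustrative):

```python
import time

# Documented default for BK_PEERON_SCAN_PATTERN
SCAN_PATTERN = "http://belay.peeron.com/scans/{set_number}-{version_number}/"
DOWNLOAD_DELAY_MS = 1000  # documented default for BK_PEERON_DOWNLOAD_DELAY


def peeron_scan_url(set_number: str, version_number: str) -> str:
    return SCAN_PATTERN.format(set_number=set_number, version_number=version_number)


url = peeron_scan_url("10255", "1")
time.sleep(DOWNLOAD_DELAY_MS / 1000)  # pause between page downloads
```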
# Optional: Display Rebrickable links wherever applicable
# Default: false
@@ -301,6 +365,12 @@
# Default: sets
# BK_SETS_FOLDER=sets
# Optional: Enable set consolidation/grouping on the main sets page
# When enabled, multiple copies of the same set are grouped together showing instance count
# When disabled, each set copy is displayed individually (original behavior)
# Default: false
# BK_SETS_CONSOLIDATION=true
# Optional: Make the grid filters displayed by default, rather than collapsed
# Default: false
# BK_SHOW_GRID_FILTERS=true
@@ -309,6 +379,10 @@
# Default: false
# BK_SHOW_GRID_SORT=true
# Optional: Show duplicate filter button on sets page
# Default: true
# BK_SHOW_SETS_DUPLICATE_FILTER=true
# Optional: Skip saving or displaying spare parts
# Default: false
# BK_SKIP_SPARE_PARTS=true
@@ -354,3 +428,12 @@
# - "bricktracker_wishes"."number_of_parts": set number of parts
# Default: "bricktracker_wishes"."rowid" DESC
# BK_WISHES_DEFAULT_ORDER="bricktracker_wishes"."set" DESC
# Optional: Show collection growth charts on the statistics page
# Default: true
# BK_STATISTICS_SHOW_CHARTS=false
# Optional: Default state of statistics page sections (expanded or collapsed)
# When true, all sections start expanded. When false, all sections start collapsed.
# Default: true
# BK_STATISTICS_DEFAULT_EXPANDED=false
+3
@@ -33,3 +33,6 @@ vitepress/
# Local data
offline/
TODO.md
run-local.sh
test-server.sh
+86 -1
@@ -23,7 +23,92 @@
- Preserves selection state during dropdown consolidation
- Consistent search behavior (instant for client-side, Enter key for server-side)
- Mobile-friendly pagination navigation
- Add Peeron instructions integration
- Full image caching system with automatic thumbnail generation
- Optimized HTTP calls by downloading full images once and generating thumbnails locally
- Automatic cache cleanup after PDF generation to save disk space
- Add parts checking/inventory system
- New "Checked" column in parts tables for tracking inventory progress
- Checkboxes to mark parts as verified during set walkthrough
- `BK_HIDE_TABLE_CHECKED_PARTS`: Environment variable to hide checked column
- Add set consolidation/grouping functionality
- Automatic grouping of duplicate sets on main sets page
- Shows instance count with stack icon badge (e.g., "3 copies")
- Expandable drawer interface to view all set copies individually
- Full set cards for each instance with all badges, statuses, and functionality
- `BK_SETS_CONSOLIDATION`: Environment variable to enable/disable consolidation (default: false)
- Backwards compatible - when disabled, behaves exactly like original individual view
- Improved theme filtering: handles duplicate theme names correctly
- Fixed set number sorting: proper numeric sorting in both ascending and descending order
- Mixed status indicators for consolidated sets: three-state checkboxes (unchecked/partial/checked) with count badges
- Template logic handles three states: none (0/2), all (2/2), partial (1/2) with visual indicators
- Purple overlay styling for partial states, disabled checkboxes for read-only consolidated status display
- Individual sets maintain full interactive checkbox functionality
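The three-state logic described above can be sketched as a simple mapping from the checked count to a state (function name hypothetical; the real implementation lives in template logic):

```python
def checkbox_state(checked: int, total: int) -> str:
    # none (0/N), all (N/N), or partial (anything in between)
    if checked == 0:
        return "none"
    if checked == total:
        return "all"
    return "partial"
```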
- Add comprehensive statistics system (#91)
- New Statistics page with collection analytics
- Financial overview: total cost, average price, price range, investment tracking
- Collection metrics: total sets, unique sets, parts count, minifigures count
- Theme distribution statistics with clickable drill-down to filtered sets
- Storage location statistics showing sets per location with value calculations
- Purchase location analytics with spending patterns and date ranges
- Problem tracking: missing and damaged parts statistics
- Clickable numbers throughout statistics that filter to relevant sets
- `BK_HIDE_STATISTICS`: Environment variable to hide statistics menu item
- Year-based analytics: Sets by release year and purchases by year
- Sets by Release Year: Shows collection distribution across LEGO release years
- Purchases by Year: Tracks spending patterns and acquisition timeline
- Year summary with peak collection/spending years and timeline insights
- Enhanced statistics interface and functionality
- Collapsible sections: All statistics sections have clickable headers to expand/collapse
- Collection growth charts: Line charts showing sets, parts, and minifigures over time
- Configuration options: `BK_STATISTICS_SHOW_CHARTS` and `BK_STATISTICS_DEFAULT_EXPANDED` environment variables
- Add configurable admin page section expansion
- `BK_ADMIN_DEFAULT_EXPANDED_SECTIONS`: Environment variable to specify which sections expand by default
- Accepts comma-separated list of section names (e.g., "database,theme,instructions")
- Valid sections: authentication, instructions, image, theme, retired, metadata, owner, purchase_location, status, storage, tag, database
- URL parameters take priority over configuration (e.g., `?open_database=1`)
- Database section expanded by default to maintain original behavior
- Smart metadata handling: sub-section expansion automatically expands parent metadata section
- Add duplicate sets filter functionality
- New filter button on Sets page to show only duplicate/consolidated sets
- `BK_SHOW_SETS_DUPLICATE_FILTER`: Environment variable to show/hide the filter button (default: true)
- Works with both server-side and client-side pagination modes
- Consolidated mode: Shows sets that have multiple instances
- Non-consolidated mode: Shows sets that appear multiple times in collection
- Add BrickLink links for sets
- BrickLink badge links now appear on set cards and set details pages alongside Rebrickable links
- `BK_BRICKLINK_LINK_SET_PATTERN`: New environment variable for BrickLink set URL pattern (default: https://www.bricklink.com/v2/catalog/catalogitem.page?S={set_num})
- Controlled by existing `BK_BRICKLINK_LINKS` environment variable
- Add live environment variable configuration management system
- Configuration Management interface in admin panel with live preview and badge system
- Live settings: Can be changed without application restart (menu visibility, table display, pagination, features)
- Static settings: Require restart but can be edited and saved to .env file (authentication, server, database, API keys)
- Advanced badge system showing value status: True/False for booleans, Set/Default/Unset for other values, Changed indicator
- Live API endpoints: `/admin/api/config/update` for immediate changes, `/admin/api/config/update-static` for .env updates
- Form pre-population with current values and automatic page reload after successful live updates
- **BREAKING CHANGE**: Default minifigures folder path changed from `minifigs` to `minifigures`
- Impact: Users who relied on the default `BK_MINIFIGURES_FOLDER` value (without explicitly setting it) will need to either:
1. Set `BK_MINIFIGURES_FOLDER=minifigs` in their environment to maintain existing behavior, or
2. Rename their existing `minifigs` folder to `minifigures`
- No impact: Users who already have `BK_MINIFIGURES_FOLDER` explicitly configured
- Improved consistency across documentation and Docker configurations
- Add performance optimization
- SQLite WAL Mode:
- Increased cache size to 10,000 pages (~40MB) for faster query execution
- Set temp_store to memory for accelerated temporary operations
- Enabled foreign key constraints and optimized synchronous mode
- Added ANALYZE for improved query planning and statistics
- Database Indexes (Migration 0019):
- High-impact composite index for problem parts aggregation (`idx_bricktracker_parts_id_missing_damaged`)
- Parts lookup optimization (`idx_bricktracker_parts_part_color_spare`)
- Set storage filtering (`idx_bricktracker_sets_set_storage`)
- Search optimization with case-insensitive indexes (`idx_rebrickable_sets_name_lower`, `idx_rebrickable_parts_name_lower`)
- Year and theme filtering optimization (`idx_rebrickable_sets_year`, `idx_rebrickable_sets_theme_id`)
- Additional indexes for purchase dates, quantities, sorting, and minifigures aggregation
- Statistics Query Optimization:
- Replaced separate subqueries with efficient CTEs (Common Table Expressions)
- Consolidated aggregations for set, part, minifigure, and financial statistics
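The SQLite tuning above can be sketched as a connection setup routine. The pragma values follow the changelog (10,000-page cache, in-memory temp store, foreign keys, WAL, ANALYZE); `synchronous=NORMAL` is an assumption, as the text only says "optimized synchronous mode":

```python
import sqlite3


def tune_connection(conn: sqlite3.Connection) -> None:
    conn.execute("PRAGMA journal_mode=WAL")    # concurrent readers alongside one writer
    conn.execute("PRAGMA cache_size=10000")    # ~40 MB page cache
    conn.execute("PRAGMA temp_store=MEMORY")   # temp tables/indices in RAM
    conn.execute("PRAGMA foreign_keys=ON")     # enforce foreign key constraints
    conn.execute("PRAGMA synchronous=NORMAL")  # assumed pairing with WAL mode
    conn.execute("ANALYZE")                    # refresh query-planner statistics


conn = sqlite3.connect(":memory:")
tune_connection(conn)
```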
### 1.2.4
> **Warning**
+6 -1
@@ -29,9 +29,11 @@ from bricktracker.views.error import error_404
from bricktracker.views.index import index_page
from bricktracker.views.instructions import instructions_page
from bricktracker.views.login import login_page
from bricktracker.views.individual_minifigure import individual_minifigure_page
from bricktracker.views.minifigure import minifigure_page
from bricktracker.views.part import part_page
from bricktracker.views.set import set_page
from bricktracker.views.statistics import statistics_page
from bricktracker.views.storage import storage_page
from bricktracker.views.wish import wish_page
@@ -60,7 +62,8 @@ def setup_app(app: Flask) -> None:
# Setup the login manager
LoginManager(app)
# I don't know :-)
# Configure proxy header handling for reverse proxy deployments (nginx, Apache, etc.)
# This ensures proper client IP detection and HTTPS scheme recognition
app.wsgi_app = ProxyFix(
app.wsgi_app,
x_for=1,
@@ -78,9 +81,11 @@ def setup_app(app: Flask) -> None:
app.register_blueprint(index_page)
app.register_blueprint(instructions_page)
app.register_blueprint(login_page)
app.register_blueprint(individual_minifigure_page)
app.register_blueprint(minifigure_page)
app.register_blueprint(part_page)
app.register_blueprint(set_page)
app.register_blueprint(statistics_page)
app.register_blueprint(storage_page)
app.register_blueprint(wish_page)
+13 -3
@@ -11,11 +11,14 @@ CONFIG: Final[list[dict[str, Any]]] = [
{'n': 'AUTHENTICATION_PASSWORD', 'd': ''},
{'n': 'AUTHENTICATION_KEY', 'd': ''},
{'n': 'BRICKLINK_LINK_PART_PATTERN', 'd': 'https://www.bricklink.com/v2/catalog/catalogitem.page?P={part}&C={color}'}, # noqa: E501
{'n': 'BRICKLINK_LINK_SET_PATTERN', 'd': 'https://www.bricklink.com/v2/catalog/catalogitem.page?S={set_num}'}, # noqa: E501
{'n': 'BRICKLINK_LINKS', 'c': bool},
{'n': 'DATABASE_PATH', 'd': './app.db'},
{'n': 'DATABASE_TIMESTAMP_FORMAT', 'd': '%Y-%m-%d-%H-%M-%S'},
{'n': 'DEBUG', 'c': bool},
{'n': 'DEFAULT_TABLE_PER_PAGE', 'd': 25, 'c': int},
{'n': 'DESCRIPTION_BADGE_MAX_LENGTH', 'd': 15, 'c': int},
{'n': 'DISABLE_INDIVIDUAL_MINIFIGURES', 'c': bool},
{'n': 'DOMAIN_NAME', 'e': 'DOMAIN_NAME', 'd': ''},
{'n': 'FILE_DATETIME_FORMAT', 'd': '%d/%m/%Y, %H:%M:%S'},
{'n': 'HOST', 'd': '0.0.0.0'},
@@ -25,25 +28,28 @@ CONFIG: Final[list[dict[str, Any]]] = [
{'n': 'HIDE_ADD_SET', 'c': bool},
{'n': 'HIDE_ADD_BULK_SET', 'c': bool},
{'n': 'HIDE_ADMIN', 'c': bool},
{'n': 'ADMIN_DEFAULT_EXPANDED_SECTIONS', 'd': ['database'], 'c': list},
{'n': 'HIDE_ALL_INSTRUCTIONS', 'c': bool},
{'n': 'HIDE_ALL_MINIFIGURES', 'c': bool},
{'n': 'HIDE_ALL_PARTS', 'c': bool},
{'n': 'HIDE_ALL_PROBLEMS_PARTS', 'e': 'BK_HIDE_MISSING_PARTS', 'c': bool},
{'n': 'HIDE_ALL_SETS', 'c': bool},
{'n': 'HIDE_ALL_STORAGES', 'c': bool},
{'n': 'HIDE_STATISTICS', 'c': bool},
{'n': 'HIDE_SET_INSTRUCTIONS', 'c': bool},
{'n': 'HIDE_TABLE_DAMAGED_PARTS', 'c': bool},
{'n': 'HIDE_TABLE_MISSING_PARTS', 'c': bool},
{'n': 'HIDE_TABLE_CHECKED_PARTS', 'c': bool},
{'n': 'HIDE_WISHES', 'c': bool},
{'n': 'MINIFIGURES_DEFAULT_ORDER', 'd': '"rebrickable_minifigures"."name" ASC'}, # noqa: E501
{'n': 'MINIFIGURES_FOLDER', 'd': 'minifigs', 's': True},
{'n': 'MINIFIGURES_DEFAULT_ORDER', 'd': '"combined"."name" ASC'}, # noqa: E501
{'n': 'MINIFIGURES_FOLDER', 'd': 'minifigures', 's': True},
{'n': 'MINIFIGURES_PAGINATION_SIZE_DESKTOP', 'd': 10, 'c': int},
{'n': 'MINIFIGURES_PAGINATION_SIZE_MOBILE', 'd': 5, 'c': int},
{'n': 'MINIFIGURES_SERVER_SIDE_PAGINATION', 'c': bool},
{'n': 'NO_THREADED_SOCKET', 'c': bool},
{'n': 'PARTS_SERVER_SIDE_PAGINATION', 'c': bool},
{'n': 'SETS_SERVER_SIDE_PAGINATION', 'c': bool},
{'n': 'PARTS_DEFAULT_ORDER', 'd': '"rebrickable_parts"."name" ASC, "rebrickable_parts"."color_name" ASC, "bricktracker_parts"."spare" ASC'}, # noqa: E501
{'n': 'PARTS_DEFAULT_ORDER', 'd': '"rebrickable_parts"."name" ASC, "rebrickable_parts"."color_name" ASC, "combined"."spare" ASC'}, # noqa: E501
{'n': 'PARTS_FOLDER', 'd': 'parts', 's': True},
{'n': 'PARTS_PAGINATION_SIZE_DESKTOP', 'd': 10, 'c': int},
{'n': 'PARTS_PAGINATION_SIZE_MOBILE', 'd': 5, 'c': int},
@@ -76,8 +82,10 @@ CONFIG: Final[list[dict[str, Any]]] = [
{'n': 'RETIRED_SETS_PATH', 'd': './retired_sets.csv'},
{'n': 'SETS_DEFAULT_ORDER', 'd': '"rebrickable_sets"."number" DESC, "rebrickable_sets"."version" ASC'}, # noqa: E501
{'n': 'SETS_FOLDER', 'd': 'sets', 's': True},
{'n': 'SETS_CONSOLIDATION', 'd': False, 'c': bool},
{'n': 'SHOW_GRID_FILTERS', 'c': bool},
{'n': 'SHOW_GRID_SORT', 'c': bool},
{'n': 'SHOW_SETS_DUPLICATE_FILTER', 'd': True, 'c': bool},
{'n': 'SKIP_SPARE_PARTS', 'c': bool},
{'n': 'SOCKET_NAMESPACE', 'd': 'bricksocket'},
{'n': 'SOCKET_PATH', 'd': '/bricksocket/'},
@@ -87,4 +95,6 @@ CONFIG: Final[list[dict[str, Any]]] = [
{'n': 'TIMEZONE', 'd': 'Etc/UTC'},
{'n': 'USE_REMOTE_IMAGES', 'c': bool},
{'n': 'WISHES_DEFAULT_ORDER', 'd': '"bricktracker_wishes"."rowid" DESC'},
{'n': 'STATISTICS_SHOW_CHARTS', 'd': True, 'c': bool},
{'n': 'STATISTICS_DEFAULT_EXPANDED', 'd': True, 'c': bool},
]
+314
@@ -0,0 +1,314 @@
import os
import logging
from typing import Any, Dict, Final, List, Optional
from pathlib import Path
from flask import current_app
logger = logging.getLogger(__name__)
# Environment variables that can be changed live without restart
LIVE_CHANGEABLE_VARS: Final[List[str]] = [
'BK_BRICKLINK_LINKS',
'BK_DEFAULT_TABLE_PER_PAGE',
'BK_DESCRIPTION_BADGE_MAX_LENGTH',
'BK_INDEPENDENT_ACCORDIONS',
'BK_HIDE_ADD_SET',
'BK_HIDE_ADD_BULK_SET',
'BK_HIDE_ADMIN',
'BK_ADMIN_DEFAULT_EXPANDED_SECTIONS',
'BK_HIDE_ALL_INSTRUCTIONS',
'BK_HIDE_ALL_MINIFIGURES',
'BK_HIDE_ALL_PARTS',
'BK_HIDE_ALL_PROBLEMS_PARTS',
'BK_HIDE_ALL_SETS',
'BK_HIDE_ALL_STORAGES',
'BK_HIDE_STATISTICS',
'BK_HIDE_SET_INSTRUCTIONS',
'BK_HIDE_TABLE_DAMAGED_PARTS',
'BK_HIDE_TABLE_MISSING_PARTS',
'BK_HIDE_TABLE_CHECKED_PARTS',
'BK_HIDE_WISHES',
'BK_MINIFIGURES_PAGINATION_SIZE_DESKTOP',
'BK_MINIFIGURES_PAGINATION_SIZE_MOBILE',
'BK_MINIFIGURES_SERVER_SIDE_PAGINATION',
'BK_PARTS_PAGINATION_SIZE_DESKTOP',
'BK_PARTS_PAGINATION_SIZE_MOBILE',
'BK_PARTS_SERVER_SIDE_PAGINATION',
'BK_SETS_SERVER_SIDE_PAGINATION',
'BK_PROBLEMS_PAGINATION_SIZE_DESKTOP',
'BK_PROBLEMS_PAGINATION_SIZE_MOBILE',
'BK_PROBLEMS_SERVER_SIDE_PAGINATION',
'BK_SETS_PAGINATION_SIZE_DESKTOP',
'BK_SETS_PAGINATION_SIZE_MOBILE',
'BK_SETS_CONSOLIDATION',
'BK_RANDOM',
'BK_REBRICKABLE_LINKS',
'BK_SHOW_GRID_FILTERS',
'BK_SHOW_GRID_SORT',
'BK_SHOW_SETS_DUPLICATE_FILTER',
'BK_SKIP_SPARE_PARTS',
'BK_USE_REMOTE_IMAGES',
'BK_PEERON_DOWNLOAD_DELAY',
'BK_PEERON_MIN_IMAGE_SIZE',
'BK_REBRICKABLE_PAGE_SIZE',
'BK_STATISTICS_SHOW_CHARTS',
'BK_STATISTICS_DEFAULT_EXPANDED',
# Default ordering and formatting
'BK_INSTRUCTIONS_ALLOWED_EXTENSIONS',
'BK_MINIFIGURES_DEFAULT_ORDER',
'BK_PARTS_DEFAULT_ORDER',
'BK_SETS_DEFAULT_ORDER',
'BK_PURCHASE_LOCATION_DEFAULT_ORDER',
'BK_STORAGE_DEFAULT_ORDER',
'BK_WISHES_DEFAULT_ORDER',
# URL and Pattern Variables
'BK_BRICKLINK_LINK_PART_PATTERN',
'BK_BRICKLINK_LINK_SET_PATTERN',
'BK_REBRICKABLE_IMAGE_NIL',
'BK_REBRICKABLE_IMAGE_NIL_MINIFIGURE',
'BK_REBRICKABLE_LINK_MINIFIGURE_PATTERN',
'BK_REBRICKABLE_LINK_PART_PATTERN',
'BK_REBRICKABLE_LINK_INSTRUCTIONS_PATTERN',
'BK_PEERON_INSTRUCTION_PATTERN',
'BK_PEERON_SCAN_PATTERN',
'BK_PEERON_THUMBNAIL_PATTERN',
'BK_RETIRED_SETS_FILE_URL',
'BK_RETIRED_SETS_PATH',
'BK_THEMES_FILE_URL',
'BK_THEMES_PATH'
]
# Environment variables that require restart
RESTART_REQUIRED_VARS: Final[List[str]] = [
'BK_AUTHENTICATION_PASSWORD',
'BK_AUTHENTICATION_KEY',
'BK_DATABASE_PATH',
'BK_DEBUG',
'BK_DISABLE_INDIVIDUAL_MINIFIGURES',
'BK_DOMAIN_NAME',
'BK_HOST',
'BK_PORT',
'BK_SOCKET_NAMESPACE',
'BK_SOCKET_PATH',
'BK_NO_THREADED_SOCKET',
'BK_TIMEZONE',
'BK_REBRICKABLE_API_KEY',
'BK_INSTRUCTIONS_FOLDER',
'BK_PARTS_FOLDER',
'BK_SETS_FOLDER',
'BK_MINIFIGURES_FOLDER',
'BK_DATABASE_TIMESTAMP_FORMAT',
'BK_FILE_DATETIME_FORMAT',
'BK_PURCHASE_DATE_FORMAT',
'BK_PURCHASE_CURRENCY',
'BK_REBRICKABLE_USER_AGENT',
'BK_USER_AGENT'
]
class ConfigManager:
"""Manages live configuration updates for BrickTracker"""
def __init__(self):
self.env_file_path = Path('.env')
def get_current_config(self) -> Dict[str, Any]:
"""Get current configuration values for live-changeable variables"""
config = {}
for var in LIVE_CHANGEABLE_VARS:
# Get internal config name
internal_name = var.replace('BK_', '')
# Get current value from Flask config
if internal_name in current_app.config:
config[var] = current_app.config[internal_name]
else:
# Fallback to environment variable
config[var] = os.environ.get(var, '')
return config
def get_restart_required_config(self) -> Dict[str, Any]:
"""Get current configuration values for restart-required variables"""
config = {}
for var in RESTART_REQUIRED_VARS:
# Get internal config name
internal_name = var.replace('BK_', '')
# Get current value from Flask config
if internal_name in current_app.config:
config[var] = current_app.config[internal_name]
else:
# Fallback to environment variable
config[var] = os.environ.get(var, '')
return config
def update_config(self, updates: Dict[str, Any]) -> Dict[str, str]:
"""Update configuration values. Returns dict with status for each update"""
results = {}
for var_name, new_value in updates.items():
if var_name not in LIVE_CHANGEABLE_VARS:
results[var_name] = f"Error: {var_name} requires restart to change"
continue
try:
# Update environment variable
os.environ[var_name] = str(new_value)
# Update Flask config
internal_name = var_name.replace('BK_', '')
cast_value = self._cast_value(var_name, new_value)
current_app.config[internal_name] = cast_value
# Update .env file
self._update_env_file(var_name, new_value)
results[var_name] = "Updated successfully"
if current_app.debug:
logger.info(f"Config updated: {var_name}={new_value}")
except Exception as e:
results[var_name] = f"Error: {str(e)}"
logger.error(f"Failed to update {var_name}: {e}")
return results
def _cast_value(self, var_name: str, value: Any) -> Any:
"""Cast value to appropriate type based on variable name"""
# List variables (admin sections) - Check this FIRST before boolean check
if 'sections' in var_name.lower():
if isinstance(value, str):
return [section.strip() for section in value.split(',') if section.strip()]
elif isinstance(value, list):
return value
else:
return []
# Integer variables (pagination sizes, delays, etc.) - Check BEFORE boolean check
if any(keyword in var_name.lower() for keyword in ['_size', '_page', 'delay', 'min_', 'per_page', 'page_size', '_length']):
try:
return int(value)
except (ValueError, TypeError):
return 0
# Boolean variables - More specific patterns to avoid conflicts
if any(keyword in var_name.lower() for keyword in ['hide_', 'server_side_pagination', '_links', 'random', 'skip_', 'show_', 'use_', '_consolidation', '_charts', '_expanded']):
if isinstance(value, str):
return value.lower() in ('true', '1', 'yes', 'on')
return bool(value)
# String variables (default)
return str(value)
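The keyword-based casting heuristic above (lists before integers before booleans, so names like `page_size` are not mistaken for flags) can be exercised on its own. The sketch below is a simplified standalone version with abbreviated keyword lists; it is illustrative, not the app's full matching table:

```python
# Simplified standalone sketch of the keyword-based casting heuristic.
# The keyword tuples are abbreviated compared to the real method.
def cast_value(var_name: str, value):
    name = var_name.lower()
    # Comma-separated list variables are checked first
    if 'sections' in name:
        if isinstance(value, str):
            return [s.strip() for s in value.split(',') if s.strip()]
        return value if isinstance(value, list) else []
    # Integer variables next, so 'page_size' is not treated as a boolean
    if any(k in name for k in ('_size', 'per_page', '_length', 'delay')):
        try:
            return int(value)
        except (ValueError, TypeError):
            return 0
    # Boolean variables accept several truthy spellings
    if any(k in name for k in ('hide_', 'show_', 'use_', 'skip_')):
        if isinstance(value, str):
            return value.lower() in ('true', '1', 'yes', 'on')
        return bool(value)
    # Everything else is stored as a string
    return str(value)

print(cast_value('BK_ADMIN_DEFAULT_EXPANDED_SECTIONS', 'a, b'))  # ['a', 'b']
print(cast_value('BK_PARTS_PAGE_SIZE', '25'))                    # 25
print(cast_value('BK_HIDE_WISHES', 'Yes'))                       # True
```

The check order is the design point: `'sections'` and the integer keywords must run before the boolean patterns, otherwise a name containing both would be coerced to the wrong type.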
def _format_env_value(self, value: Any) -> str:
"""Format value for .env file storage"""
if isinstance(value, bool):
return 'true' if value else 'false'
elif isinstance(value, (int, float)):
return str(value)
elif isinstance(value, list):
return ','.join(str(item) for item in value)
elif value is None:
return ''
else:
return str(value)
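The formatter pairs with the casting logic to round-trip values through the `.env` file; a minimal standalone sketch of the same branches:

```python
# Standalone sketch of the .env value formatter shown above.
def format_env_value(value) -> str:
    if isinstance(value, bool):  # bool first: bool is a subclass of int
        return 'true' if value else 'false'
    if isinstance(value, (int, float)):
        return str(value)
    if isinstance(value, list):
        return ','.join(str(item) for item in value)
    if value is None:
        return ''
    return str(value)

assert format_env_value(True) == 'true'
assert format_env_value([10, 20]) == '10,20'
assert format_env_value(None) == ''
```

Checking `bool` before `int`/`float` matters because `isinstance(True, int)` is `True` in Python; swapping the branches would write `1` instead of `true`.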
def _update_env_file(self, var_name: str, value: Any) -> None:
"""Update the .env file with new value"""
if not self.env_file_path.exists():
self.env_file_path.touch()
# Read current .env content
lines = []
if self.env_file_path.exists():
with open(self.env_file_path, 'r', encoding='utf-8') as f:
lines = f.readlines()
# Format value for .env file
env_value = self._format_env_value(value)
# Find and update the line, or add new line
updated = False
# First pass: Look for existing active variable
for i, line in enumerate(lines):
if line.strip().startswith(f"{var_name}="):
lines[i] = f"{var_name}={env_value}\n"
updated = True
break
# Second pass: If not found, look for commented-out variable
if not updated:
for i, line in enumerate(lines):
stripped = line.strip()
# Check for commented-out variable: # BK_VAR= or #BK_VAR=
if stripped.startswith('#') and var_name in stripped:
# Extract the part after #, handling optional space
comment_content = stripped[1:].strip()
if comment_content.startswith(f"{var_name}=") or comment_content.startswith(f"{var_name} ="):
# Uncomment and set new value, preserving any leading whitespace from original line
leading_whitespace = line[:len(line) - len(line.lstrip())]
lines[i] = f"{leading_whitespace}{var_name}={env_value}\n"
updated = True
logger.info(f"Uncommented and updated {var_name} in .env file")
break
# Third pass: If still not found, append to end
if not updated:
lines.append(f"{var_name}={env_value}\n")
logger.info(f"Added new {var_name} to end of .env file")
# Write back to file
with open(self.env_file_path, 'w', encoding='utf-8') as f:
f.writelines(lines)
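The three-pass strategy (update an active line, else uncomment a `# VAR=` line, else append) can be tested against an in-memory list of lines. This is a simplified sketch of the same logic without file I/O or whitespace preservation:

```python
# Simplified sketch of the three-pass .env update: edit an active line,
# else uncomment a '# VAR=' (or '#VAR=') line, else append at the end.
def update_env_lines(lines: list[str], var_name: str, env_value: str) -> list[str]:
    lines = list(lines)
    # First pass: existing active variable
    for i, line in enumerate(lines):
        if line.strip().startswith(f"{var_name}="):
            lines[i] = f"{var_name}={env_value}\n"
            return lines
    # Second pass: commented-out variable
    for i, line in enumerate(lines):
        stripped = line.strip()
        if stripped.startswith('#') and stripped[1:].strip().startswith(f"{var_name}="):
            lines[i] = f"{var_name}={env_value}\n"
            return lines
    # Third pass: append a new line
    lines.append(f"{var_name}={env_value}\n")
    return lines

assert update_env_lines(["A=1\n"], "A", "2") == ["A=2\n"]
assert update_env_lines(["# A=1\n"], "A", "2") == ["A=2\n"]
assert update_env_lines([], "A", "2") == ["A=2\n"]
```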
def validate_config(self) -> Dict[str, Any]:
"""Validate current configuration"""
issues = []
warnings = []
# Check if critical variables are set
if not os.environ.get('BK_REBRICKABLE_API_KEY'):
warnings.append("BK_REBRICKABLE_API_KEY not set - some features may not work")
# Check for conflicting settings
if (os.environ.get('BK_PARTS_SERVER_SIDE_PAGINATION', '').lower() == 'false' and
int(os.environ.get('BK_PARTS_PAGINATION_SIZE_DESKTOP', '10')) > 100):
warnings.append("Large pagination size with client-side pagination may cause performance issues")
# Check pagination sizes are reasonable
for var in ['BK_SETS_PAGINATION_SIZE_DESKTOP', 'BK_PARTS_PAGINATION_SIZE_DESKTOP', 'BK_MINIFIGURES_PAGINATION_SIZE_DESKTOP']:
try:
size = int(os.environ.get(var, '10'))
if size < 1:
issues.append(f"{var} must be at least 1")
elif size > 1000:
warnings.append(f"{var} is very large ({size}) - may cause performance issues")
except ValueError:
issues.append(f"{var} must be a valid integer")
return {
'issues': issues,
'warnings': warnings,
'status': 'valid' if not issues else 'has_issues'
}
def get_variable_help(self, var_name: str) -> str:
"""Get help text for a configuration variable"""
help_text = {
'BK_BRICKLINK_LINKS': 'Show BrickLink links throughout the application',
'BK_DEFAULT_TABLE_PER_PAGE': 'Default number of items per page in tables',
'BK_INDEPENDENT_ACCORDIONS': 'Make accordion sections independent (can open multiple)',
'BK_HIDE_ADD_SET': 'Hide the "Add Set" menu entry',
'BK_HIDE_ADD_BULK_SET': 'Hide the "Add Bulk Set" menu entry',
'BK_HIDE_ADMIN': 'Hide the "Admin" menu entry',
'BK_ADMIN_DEFAULT_EXPANDED_SECTIONS': 'Admin sections to expand by default (comma-separated)',
'BK_HIDE_ALL_INSTRUCTIONS': 'Hide the "Instructions" menu entry',
'BK_HIDE_ALL_MINIFIGURES': 'Hide the "Minifigures" menu entry',
'BK_HIDE_ALL_PARTS': 'Hide the "Parts" menu entry',
'BK_HIDE_ALL_PROBLEMS_PARTS': 'Hide the "Problems" menu entry',
'BK_HIDE_ALL_SETS': 'Hide the "Sets" menu entry',
'BK_HIDE_ALL_STORAGES': 'Hide the "Storages" menu entry',
'BK_HIDE_STATISTICS': 'Hide the "Statistics" menu entry',
'BK_HIDE_SET_INSTRUCTIONS': 'Hide instructions section in set details',
'BK_HIDE_TABLE_DAMAGED_PARTS': 'Hide the "Damaged" column in parts tables',
'BK_HIDE_TABLE_MISSING_PARTS': 'Hide the "Missing" column in parts tables',
'BK_HIDE_TABLE_CHECKED_PARTS': 'Hide the "Checked" column in parts tables',
'BK_HIDE_WISHES': 'Hide the "Wishes" menu entry',
'BK_SETS_CONSOLIDATION': 'Enable set consolidation/grouping functionality',
'BK_SHOW_GRID_FILTERS': 'Show filter options on grids by default',
'BK_SHOW_GRID_SORT': 'Show sort options on grids by default',
'BK_SKIP_SPARE_PARTS': 'Skip spare parts when importing sets',
'BK_USE_REMOTE_IMAGES': 'Use remote images from Rebrickable CDN instead of local',
'BK_STATISTICS_SHOW_CHARTS': 'Show collection growth charts on statistics page',
'BK_STATISTICS_DEFAULT_EXPANDED': 'Expand all statistics sections by default'
}
return help_text.get(var_name, 'No help available for this variable')

@@ -0,0 +1,492 @@
import logging
import traceback
from typing import Any, Self, TYPE_CHECKING
from uuid import uuid4
from flask import current_app, url_for
from .exceptions import NotFoundException, DatabaseException, ErrorException
from .parser import parse_minifig
from .rebrickable import Rebrickable
from .rebrickable_minifigure import RebrickableMinifigure
from .set_owner_list import BrickSetOwnerList
from .set_purchase_location_list import BrickSetPurchaseLocationList
from .set_storage_list import BrickSetStorageList
from .set_tag_list import BrickSetTagList
from .sql import BrickSQL
if TYPE_CHECKING:
from .socket import BrickSocket
logger = logging.getLogger(__name__)
# Individual minifigure (not associated with a set)
class IndividualMinifigure(RebrickableMinifigure):
# Queries
select_query: str = 'individual_minifigure/select/by_id'
light_query: str = 'individual_minifigure/select/light'
insert_query: str = 'individual_minifigure/insert'
# Delete an individual minifigure
def delete(self, /) -> None:
BrickSQL().executescript(
'individual_minifigure/delete/individual_minifigure',
id=self.fields.id
)
# Import an individual minifigure into the database
def download(self, socket: 'BrickSocket', data: dict[str, Any], /) -> bool:
# Load the minifigure
if not self.load(socket, data, from_download=True):
return False
try:
# Insert into the database
socket.auto_progress(
message='Minifigure {figure}: inserting into database'.format(
figure=self.fields.figure
),
increment_total=True,
)
# Generate a UUID for this minifigure
self.fields.id = str(uuid4())
# Save the storage
storage = BrickSetStorageList.get(
data.get('storage', ''),
allow_none=True
)
self.fields.storage = storage.fields.id if storage else None
# Save the purchase location
purchase_location = BrickSetPurchaseLocationList.get(
data.get('purchase_location', ''),
allow_none=True
)
self.fields.purchase_location = purchase_location.fields.id if purchase_location else None
# Save quantity and description
self.fields.quantity = int(data.get('quantity', 1))
self.fields.description = data.get('description', '')
# IMPORTANT: Insert rebrickable minifigure FIRST
# bricktracker_individual_minifigures has FK to rebrickable_minifigures
self.insert_rebrickable_loose()
# Now insert into bricktracker_individual_minifigures
# Use no_defer=True to ensure the insert happens before we insert parts
# (parts have a foreign key constraint on this id)
self.insert(commit=False, no_defer=True)
# Save the owners
owners: list[str] = list(data.get('owners', []))
for id in owners:
owner = BrickSetOwnerList.get(id)
owner.update_individual_minifigure_state(self, state=True)
# Save the tags
tags: list[str] = list(data.get('tags', []))
for id in tags:
tag = BrickSetTagList.get(id)
tag.update_individual_minifigure_state(self, state=True)
# Load the parts (elements) for this minifigure
if not self.download_parts(socket):
return False
# Commit the transaction to the database
socket.auto_progress(
message='Minifigure {figure}: writing to the database'.format(
figure=self.fields.figure
),
increment_total=True,
)
BrickSQL().commit()
# Info
logger.info('Minifigure {figure}: imported (id: {id})'.format(
figure=self.fields.figure,
id=self.fields.id,
))
# Complete
socket.complete(
message='Minifigure {figure}: imported (<a href="{url}">Go to the minifigure</a>)'.format(
figure=self.fields.figure,
url=self.url()
),
download=True
)
except Exception as e:
socket.fail(
message='Error while importing minifigure {figure}: {error}'.format(
figure=self.fields.figure,
error=e,
)
)
logger.debug(traceback.format_exc())
return False
return True
# Download parts (elements) for this individual minifigure
def download_parts(self, socket: 'BrickSocket', /) -> bool:
"""Download minifigure parts using get_minifig_elements()"""
try:
# Check if we have cached parts data from load()
if hasattr(self, '_cached_parts_response'):
response = self._cached_parts_response
logger.debug('Using cached parts data from load()')
else:
# Need to fetch parts data
socket.auto_progress(
message='Minifigure {figure}: loading parts from Rebrickable'.format(
figure=self.fields.figure
),
increment_total=True,
)
logger.debug('rebrick.lego.get_minifig_elements("{figure}")'.format(
figure=self.fields.figure,
))
# Load parts data from Rebrickable API
import json
from rebrick import lego
parameters = {
'api_key': current_app.config['REBRICKABLE_API_KEY'],
'page_size': current_app.config['REBRICKABLE_PAGE_SIZE'],
}
response = json.loads(lego.get_minifig_elements(
self.fields.figure,
**parameters
).read())
socket.auto_progress(
message='Minifigure {figure}: saving parts to database'.format(
figure=self.fields.figure
),
)
# Insert each part into individual_minifigure_parts table
from .rebrickable_part import RebrickablePart
if 'results' in response:
logger.debug(f'Processing {len(response["results"])} parts for minifigure {self.fields.figure}')
for idx, result in enumerate(response['results']):
part_num = result['part']['part_num']
color_id = result['color']['id']
logger.debug(
f'Part {idx+1}/{len(response["results"])}: {part_num} '
f'(color: {color_id}, quantity: {result["quantity"]})'
)
# Insert rebrickable part data first
part_data = RebrickablePart.from_rebrickable(result)
logger.debug(f'Rebrickable part data keys: {list(part_data.keys())}')
# Insert into rebrickable_parts if not exists
BrickSQL().execute(
'rebrickable/part/insert',
parameters=part_data,
commit=False,
)
# Download part image if not using remote images
if not current_app.config['USE_REMOTE_IMAGES']:
# Create a RebrickablePart instance for image download
from .set import BrickSet
try:
part_instance = RebrickablePart(record=part_data)
from .rebrickable_image import RebrickableImage
RebrickableImage(
BrickSet(), # Dummy set
minifigure=self,
part=part_instance,
).download()
except Exception as e:
logger.warning(
f'Could not download image for part {part_num}: {e}'
)
# Insert into bricktracker_individual_minifigure_parts
individual_part_params = {
'id': self.fields.id,
'part': part_num,
'color': color_id,
'spare': result.get('is_spare', False),
'quantity': result['quantity'],
'element': result.get('element_id'),
'rebrickable_inventory': result['id'],
}
logger.debug(f'Individual part params: {individual_part_params}')
BrickSQL().execute(
'individual_minifigure/part/insert',
parameters=individual_part_params,
commit=False,
)
logger.debug(f'Successfully inserted all {len(response["results"])} parts')
else:
logger.warning(f'No results in parts response for minifigure {self.fields.figure}')
# Clean up cached data
if hasattr(self, '_cached_parts_response'):
delattr(self, '_cached_parts_response')
return True
except Exception as e:
socket.fail(
message='Error loading parts for minifigure {figure}: {error}'.format(
figure=self.fields.figure,
error=e,
)
)
logger.debug(traceback.format_exc())
return False
# Insert the individual minifigure from Rebrickable
def insert_rebrickable_loose(self, /) -> None:
"""Insert rebrickable minifigure data (without set association)"""
# Insert the Rebrickable minifigure to the database
# Note: We override the parent's insert_rebrickable since we don't have a brickset
from .rebrickable_image import RebrickableImage
# Explicitly build parameters for rebrickable_minifigures insert
params = {
'figure': self.fields.figure,
'number': self.fields.number,
'name': self.fields.name,
'image': self.fields.image,
'number_of_parts': self.fields.number_of_parts,
}
BrickSQL().execute(
RebrickableMinifigure.insert_query,
parameters=params,
commit=False,
)
# Download image locally if not using remote images
if not current_app.config['USE_REMOTE_IMAGES']:
# Create a dummy BrickSet for RebrickableImage
# RebrickableImage checks minifigure first before set, so this works
from .set import BrickSet
try:
RebrickableImage(
BrickSet(), # Dummy set - not used since minifigure takes priority
minifigure=self,
).download()
logger.debug(f'Downloaded image for individual minifigure {self.fields.figure}')
except Exception as e:
logger.warning(
f'Could not download image for individual minifigure {self.fields.figure}: {e}'
)
# Load the minifigure from Rebrickable
def load(
self,
socket: 'BrickSocket',
data: dict[str, Any],
/,
*,
from_download=False,
) -> bool:
# Reset the progress
socket.progress_count = 0
socket.progress_total = 2
try:
# Check if individual minifigures are disabled
# (current_app is already imported at module level)
if current_app.config.get('DISABLE_INDIVIDUAL_MINIFIGURES', False):
raise ErrorException(
'Individual minifigures system is disabled. '
'Only set-based minifigures can be added.'
)
socket.auto_progress(message='Parsing minifigure number')
figure = parse_minifig(str(data['figure']))
socket.auto_progress(
message='Minifigure {figure}: loading from Rebrickable'.format(
figure=figure,
),
)
logger.debug('rebrick.lego.get_minifig_elements("{figure}")'.format(
figure=figure,
))
# Load from Rebrickable using get_minifig_elements
# This gives us both minifigure info and parts in one call
import json
from rebrick import lego
parameters = {
'api_key': current_app.config['REBRICKABLE_API_KEY'],
'page_size': current_app.config['REBRICKABLE_PAGE_SIZE'],
}
response = json.loads(lego.get_minifig_elements(
figure,
**parameters
).read())
# Extract minifigure info from the first part's metadata
if 'results' in response and len(response['results']) > 0:
first_part = response['results'][0]
# Build minifigure data from the response
self.fields.figure = first_part['set_num']
self.fields.number_of_parts = response['count']
# We need to fetch the proper name and image from get_minifig()
# This is a small additional call but gives us the proper minifigure data
try:
# get_minifig() only needs api_key, not page_size
minifig_params = {
'api_key': current_app.config['REBRICKABLE_API_KEY']
}
minifig_response = json.loads(lego.get_minifig(
figure,
**minifig_params
).read())
self.fields.name = minifig_response.get('name', f"Minifigure {figure}")
# Use the minifig image from get_minifig() - this is the assembled minifig
self.fields.image = minifig_response.get('set_img_url')
# Extract number from figure (e.g., fig-005997 -> 5997)
try:
self.fields.number = int(figure.split('-')[1])
except:
self.fields.number = 0
except Exception as e:
logger.warning(f'Could not fetch minifigure name: {e}')
self.fields.name = f"Minifigure {figure}"
# Try to extract number anyway
try:
self.fields.number = int(figure.split('-')[1])
except (IndexError, ValueError):
self.fields.number = 0
# Fallback: try to extract image from first part with element_id
self.fields.image = None
for result in response['results']:
if result.get('element_id') and result['part'].get('part_img_url'):
self.fields.image = result['part']['part_img_url']
break
# Store the parts data for later use in download
self._cached_parts_response = response
else:
raise NotFoundException(f'Minifigure {figure} has no parts in Rebrickable')
socket.emit('MINIFIGURE_LOADED', self.short(
from_download=from_download
))
if not from_download:
socket.complete(
message='Minifigure {figure}: loaded from Rebrickable'.format(
figure=self.fields.figure
)
)
return True
except Exception as e:
# Check if this is the "disabled" error - if so, show cleaner message
error_msg = str(e)
if 'Individual minifigures system is disabled' in error_msg:
socket.fail(message=error_msg)
else:
socket.fail(
message='Could not load the minifigure from Rebrickable: {error}. Data: {data}'.format(
error=error_msg,
data=data,
)
)
if not isinstance(e, (NotFoundException, ErrorException)):
logger.debug(traceback.format_exc())
return False
# Return a short form of the minifigure
def short(self, /, *, from_download: bool = False) -> dict[str, Any]:
return {
'download': from_download,
'image': self.url_for_image(),
'name': self.fields.name,
'figure': self.fields.figure,
}
# Select an individual minifigure by ID
def select_by_id(self, id: str, /) -> Self:
# Save the ID parameter
self.fields.id = id
# Import status list here to get metadata columns
from .set_status_list import BrickSetStatusList
# Pass metadata columns to the query with correct table names for individual minifigures
context = {
'owners': ', ' + BrickSetOwnerList.as_columns(table='bricktracker_individual_minifigure_owners') if BrickSetOwnerList.list() else '',
'statuses': ', ' + BrickSetStatusList.as_columns(table='bricktracker_individual_minifigure_statuses', all=True) if BrickSetStatusList.list(all=True) else '',
'tags': ', ' + BrickSetTagList.as_columns(table='bricktracker_individual_minifigure_tags') if BrickSetTagList.list() else '',
}
if not self.select(**context):
raise NotFoundException(
'Individual minifigure with ID {id} was not found in the database'.format(
id=id,
),
)
return self
# URL to this individual minifigure instance
def url(self, /) -> str:
return url_for('individual_minifigure.details', id=self.fields.id)
# URL for updating quantity
def url_for_quantity(self, /) -> str:
return url_for('individual_minifigure.update_quantity', id=self.fields.id)
# URL for updating description
def url_for_description(self, /) -> str:
return url_for('individual_minifigure.update_description', id=self.fields.id)
# Parts
def generic_parts(self, /):
from .part_list import BrickPartList
return BrickPartList().from_individual_minifigure(self)
# Override from_rebrickable to handle minifigure data
@staticmethod
def from_rebrickable(data: dict[str, Any], /, **_) -> dict[str, Any]:
# Extract the numeric portion after the 'fig-' prefix
number = int(str(data['set_num']).split('-')[1])
return {
'figure': str(data['set_num']),
'number': int(number),
'name': str(data['set_name']),
'image': data.get('set_img_url'),
'number_of_parts': int(data.get('num_parts', 0)),
}
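As a quick illustration of that mapping, here is a hypothetical Rebrickable-style minifig payload (not real data) run through the same field extraction; splitting on the dash keeps the full number regardless of zero-padding:

```python
# Hypothetical Rebrickable-style minifig payload, mapped the same way
# as from_rebrickable() above. The values are illustrative only.
data = {
    'set_num': 'fig-005997',
    'set_name': 'Example Minifigure',
    'set_img_url': None,
    'num_parts': 4,
}
fields = {
    'figure': str(data['set_num']),
    # Split on '-' so 'fig-005997' yields 5997
    'number': int(str(data['set_num']).split('-')[1]),
    'name': str(data['set_name']),
    'image': data.get('set_img_url'),
    'number_of_parts': int(data.get('num_parts', 0)),
}
print(fields['number'])  # 5997
```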
@@ -0,0 +1,77 @@
import logging
from typing import Self
from .individual_minifigure import IndividualMinifigure
from .record_list import BrickRecordList
logger = logging.getLogger(__name__)
# Individual minifigures list
class IndividualMinifigureList(BrickRecordList[IndividualMinifigure]):
# Queries
instances_by_figure_query: str = 'individual_minifigure/select/instances_by_figure'
using_storage_query: str = 'individual_minifigure/list/using_storage'
without_storage_query: str = 'individual_minifigure/list/without_storage'
def __init__(self, /):
super().__init__()
# Load all individual instances of a specific minifigure figure
def instances_by_figure(self, figure: str, /) -> Self:
# Save the figure parameter
self.fields.figure = figure
# Import metadata lists to get columns
from .set_owner_list import BrickSetOwnerList
from .set_status_list import BrickSetStatusList
from .set_tag_list import BrickSetTagList
# Prepare context with metadata columns
context = {
'owners': BrickSetOwnerList.as_columns(table='bricktracker_individual_minifigure_owners') if BrickSetOwnerList.list() else 'NULL AS "no_owners"',
'statuses': BrickSetStatusList.as_columns(table='bricktracker_individual_minifigure_statuses', all=True) if BrickSetStatusList.list(all=True) else 'NULL AS "no_statuses"',
'tags': BrickSetTagList.as_columns(table='bricktracker_individual_minifigure_tags') if BrickSetTagList.list() else 'NULL AS "no_tags"',
}
# Load the instances from the database
self.list(override_query=self.instances_by_figure_query, **context)
return self
# Load all individual minifigures using a specific storage
def using_storage(self, storage: 'BrickSetStorage', /) -> Self:
# Save the storage parameter
self.fields.storage = storage.fields.id
# Load the minifigures from the database
self.list(override_query=self.using_storage_query)
return self
# Load all individual minifigures without storage
def without_storage(self, /) -> Self:
# Load minifigures with no storage
self.list(override_query=self.without_storage_query)
return self
# Base individual minifigure list
def list(
self,
/,
*,
override_query: str | None = None,
order: str | None = None,
limit: int | None = None,
**context,
) -> None:
# Load the individual minifigures from the database
for record in super().select(
override_query=override_query,
order=order,
limit=limit,
**context
):
individual_minifigure = IndividualMinifigure(record=record)
self.records.append(individual_minifigure)
@@ -101,8 +101,9 @@ class BrickInstructions(object):
# Skip if we already have it
if os.path.isfile(target):
+ pdf_url = self.url()
return self.socket.complete(
- message=f"File {self.filename} already exists, skipped"
+ message=f'File {self.filename} already exists, skipped - <a href="{pdf_url}" target="_blank" class="btn btn-sm btn-primary ms-2"><i class="ri-external-link-line"></i> Open PDF</a>'
)
# Fetch PDF via cloudscraper (to bypass Cloudflare)
@@ -141,8 +142,9 @@ class BrickInstructions(object):
# Done!
logger.info(f"Downloaded {self.filename}")
+ pdf_url = self.url()
self.socket.complete(
- message=f"File {self.filename} downloaded ({self.human_size()})"
+ message=f'File {self.filename} downloaded ({self.human_size()}) - <a href="{pdf_url}" target="_blank" class="btn btn-sm btn-primary ms-2"><i class="ri-external-link-line"></i> Open PDF</a>'
)
except Exception as e:
@@ -9,6 +9,7 @@ from .exceptions import DatabaseException, ErrorException, NotFoundException
from .record import BrickRecord
from .sql import BrickSQL
if TYPE_CHECKING:
from .individual_minifigure import IndividualMinifigure
from .set import BrickSet
logger = logging.getLogger(__name__)
@@ -18,16 +19,20 @@ logger = logging.getLogger(__name__)
class BrickMetadata(BrickRecord):
kind: str
- # Set state endpoint
- set_state_endpoint: str
+ # Endpoints (optional, not all metadata types use all of these)
+ set_state_endpoint: str = ''
+ individual_minifigure_state_endpoint: str = ''
+ individual_minifigure_value_endpoint: str = ''
individual_minifigure_value_endpoint: str = ''
# Queries
delete_query: str
insert_query: str
select_query: str
update_field_query: str
- update_set_state_query: str
- update_set_value_query: str
+ update_set_state_query: str = ''
+ update_set_value_query: str = ''
+ update_individual_minifigure_state_query: str = ''
+ update_individual_minifigure_value_query: str = ''
def __init__(
self,
@@ -106,6 +111,21 @@ class BrickMetadata(BrickRecord):
metadata_id=self.fields.id
)
# URL to change the selected state of this metadata item for an individual minifigure
def url_for_individual_minifigure_state(self, id: str, /) -> str:
return url_for(
self.individual_minifigure_state_endpoint,
id=id,
metadata_id=self.fields.id
)
# URL to change the value for an individual minifigure
def url_for_individual_minifigure_value(self, id: str, /) -> str:
return url_for(
self.individual_minifigure_value_endpoint,
id=id
)
# Select a specific metadata (with an id)
def select_specific(self, id: str, /) -> Self:
# Save the parameters to the fields
@@ -216,6 +236,65 @@ class BrickMetadata(BrickRecord):
return state
# Check if this metadata has a specific individual minifigure
def has_individual_minifigure(
self,
individual_minifigure: 'IndividualMinifigure',
/,
) -> bool:
"""Check if this owner/tag/status is assigned to a individual minifigure"""
# Determine the table name based on metadata type
table_name = f'bricktracker_individual_minifigure_{self.kind}s'
column_name = f'{self.kind}_{self.fields.id}'
# Query to check if the relationship exists using raw SQL
sql = BrickSQL()
query = f'SELECT COUNT(*) as count FROM "{table_name}" WHERE "id" = ? AND "{column_name}" = 1'
result = sql.cursor.execute(query, (individual_minifigure.fields.id,)).fetchone()
return bool(result and result['count'] > 0)
# Update the selected state of this metadata item for an individual minifigure
def update_individual_minifigure_state(
self,
individual_minifigure: 'IndividualMinifigure',
/,
*,
json: Any | None = None,
state: Any | None = None
) -> Any:
if state is None and json is not None:
state = json.get('value', False)
parameters = self.sql_parameters()
parameters['id'] = individual_minifigure.fields.id
parameters['state'] = state
rows, _ = BrickSQL().execute_and_commit(
self.update_individual_minifigure_state_query,
parameters=parameters,
name=self.as_column(),
)
if rows != 1:
raise DatabaseException('Could not update the {kind} "{name}" state for individual minifigure {figure} ({id})'.format(
kind=self.kind,
name=self.fields.name,
figure=individual_minifigure.fields.figure,
id=individual_minifigure.fields.id,
))
# Info
logger.info('{kind} "{name}" state changed to "{state}" for individual minifigure {figure} ({id})'.format(
kind=self.kind,
name=self.fields.name,
state=state,
figure=individual_minifigure.fields.figure,
id=individual_minifigure.fields.id,
))
return state
# Update the selected value of this metadata item for a set
def update_set_value(
self,
@@ -39,9 +39,10 @@ class BrickMetadataList(BrickRecordList[T]):
# Queries
select_query: str
- # Set endpoints
- set_state_endpoint: str
- set_value_endpoint: str
+ # List-specific endpoints (for operations on the list itself)
+ set_state_endpoint: str = ''
+ set_value_endpoint: str = ''
+ individual_minifigure_value_endpoint: str = ''
def __init__(
self,
@@ -99,18 +100,31 @@ class BrickMetadataList(BrickRecordList[T]):
# Return the items as columns for a select
@classmethod
- def as_columns(cls, /, **kwargs) -> str:
+ def as_columns(cls, /, table: str | None = None, **kwargs) -> str:
new = cls.new()
+ # Use provided table name or default to class table
+ table_name = table if table is not None else cls.table
return ', '.join([
'"{table}"."{column}"'.format(
- table=cls.table,
+ table=table_name,
column=record.as_column(),
)
for record
in new.filter(**kwargs)
])
# Return the items as a dictionary mapping column names to UUIDs
@classmethod
def as_column_mapping(cls, /, **kwargs) -> dict:
new = cls.new()
return {
record.as_column(): record.fields.id
for record in new.filter(**kwargs)
}
# Grab a specific status
@classmethod
def get(cls, id: str | None, /, *, allow_none: bool = False) -> T:
@@ -174,3 +188,11 @@ class BrickMetadataList(BrickRecordList[T]):
cls.set_value_endpoint,
id=id,
)
# URL to change the selected value of this metadata item for an individual minifigure
@classmethod
def url_for_individual_minifigure_value(cls, id: str, /) -> str:
return url_for(
cls.individual_minifigure_value_endpoint,
id=id,
)
@@ -76,12 +76,13 @@ class BrickMinifigureList(BrickRecordList[BrickMinifigure]):
# Field mapping for sorting
field_mapping = {
- 'name': '"rebrickable_minifigures"."name"',
- 'parts': '"rebrickable_minifigures"."number_of_parts"',
+ 'name': '"combined"."name"',
+ 'parts': '"combined"."number_of_parts"',
'quantity': '"total_quantity"',
'missing': '"total_missing"',
'damaged': '"total_damaged"',
- 'sets': '"total_sets"'
+ 'sets': '"total_sets"',
+ 'individual': '"total_individual"'
}
# Use the base pagination method
@@ -112,7 +113,7 @@ class BrickMinifigureList(BrickRecordList[BrickMinifigure]):
if current_app.config['RANDOM']:
order = 'RANDOM()'
else:
- order = '"bricktracker_minifigures"."rowid" DESC'
+ order = '"combined"."rowid" DESC'
self.list(override_query=self.last_query, order=order, limit=limit)
@@ -15,6 +15,7 @@ NAVBAR: Final[list[dict[str, Any]]] = [
{'e': 'minifigure.list', 't': 'Minifigures', 'i': 'group-line', 'f': 'HIDE_ALL_MINIFIGURES'}, # noqa: E501
{'e': 'instructions.list', 't': 'Instructions', 'i': 'file-line', 'f': 'HIDE_ALL_INSTRUCTIONS'}, # noqa: E501
{'e': 'storage.list', 't': 'Storages', 'i': 'archive-2-line', 'f': 'HIDE_ALL_STORAGES'}, # noqa: E501
+ {'e': 'statistics.overview', 't': 'Statistics', 'i': 'bar-chart-line', 'f': 'HIDE_STATISTICS'}, # noqa: E501
{'e': 'wish.list', 't': 'Wishlist', 'i': 'gift-line', 'f': 'HIDE_WISHES'},
{'e': 'admin.admin', 't': 'Admin', 'i': 'settings-4-line', 'f': 'HIDE_ADMIN'}, # noqa: E501
]
@@ -35,3 +35,28 @@ def parse_set(set: str, /) -> str:
))
return '{number}-{version}'.format(number=number, version=version)
# Make sense of string supposed to contain a minifigure ID
def parse_minifig(figure: str, /) -> str:
# Minifigure format is typically fig-XXXXXX
# We'll accept with or without the 'fig-' prefix
figure = figure.strip()
if not figure.startswith('fig-'):
# Try to add the prefix if it's just numbers
if figure.isdigit():
figure = 'fig-{figure}'.format(figure=figure.zfill(6))
else:
raise ErrorException('Minifigure "{figure}" must start with "fig-"'.format(
figure=figure,
))
# Validate format: fig-XXXXXX where X can be digits or letters
parts = figure.split('-')
if len(parts) != 2 or parts[0] != 'fig':
raise ErrorException('Invalid minifigure format "{figure}". Expected format: fig-XXXXXX'.format(
figure=figure,
))
return figure
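A few example inputs make the parser's behaviour concrete. This standalone sketch mirrors the logic above but substitutes `ValueError` for the app's `ErrorException` so it runs on its own:

```python
# Standalone sketch of parse_minifig(), using ValueError in place of the
# app's ErrorException so the example is self-contained.
def parse_minifig(figure: str) -> str:
    figure = figure.strip()
    if not figure.startswith('fig-'):
        if figure.isdigit():
            # Bare numbers get the prefix and zero-padding to six digits
            figure = f'fig-{figure.zfill(6)}'
        else:
            raise ValueError(f'Minifigure "{figure}" must start with "fig-"')
    # Validate the fig-XXXXXX shape
    parts = figure.split('-')
    if len(parts) != 2 or parts[0] != 'fig':
        raise ValueError(f'Invalid minifigure format "{figure}"')
    return figure

print(parse_minifig('5997'))          # fig-005997
print(parse_minifig(' fig-005997 '))  # fig-005997
```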
@@ -159,6 +159,54 @@ class BrickPart(RebrickablePart):
return self
# Update checked state for part walkthrough
def update_checked(self, json: Any | None, /) -> bool:
# Handle both direct 'checked' key and changer.js 'value' key format
if json:
checked = json.get('checked', json.get('value', False))
else:
checked = False
checked = bool(checked)
# Update the field
self.fields.checked = checked
BrickSQL().execute_and_commit(
'part/update/checked',
parameters=self.sql_parameters()
)
return checked
# Compute the url for updating checked state
def url_for_checked(self, /) -> str:
# Check if this is an individual minifigure (has minifigure with id field, no brickset)
if self.minifigure is not None and hasattr(self.minifigure.fields, 'id') and self.brickset is None:
# Individual minifigure part
return url_for(
'individual_minifigure.checked_part',
id=self.minifigure.fields.id,
part=self.fields.part,
color=self.fields.color,
spare=self.fields.spare,
)
# Set-based part (with or without minifigure)
if self.minifigure is not None:
figure = self.minifigure.fields.figure
else:
figure = None
return url_for(
'set.checked_part',
id=self.fields.id,
figure=figure,
part=self.fields.part,
color=self.fields.color,
spare=self.fields.spare,
)
# Update a problematic part
def update_problem(self, problem: str, json: Any | None, /) -> int:
amount: str | int = json.get('value', '') # type: ignore
@@ -191,7 +239,19 @@ class BrickPart(RebrickablePart):
# Compute the url for problematic part
def url_for_problem(self, problem: str, /) -> str:
# Different URL for a minifigure part
# Check if this is an individual minifigure (has minifigure with id field, no brickset)
if self.minifigure is not None and hasattr(self.minifigure.fields, 'id') and self.brickset is None:
# Individual minifigure part
return url_for(
'individual_minifigure.problem_part',
id=self.minifigure.fields.id,
part=self.fields.part,
color=self.fields.color,
spare=self.fields.spare,
problem=problem,
)
# Set-based part (with or without minifigure)
if self.minifigure is not None:
figure = self.minifigure.fields.figure
else:
@@ -206,3 +266,82 @@ class BrickPart(RebrickablePart):
spare=self.fields.spare,
problem=problem,
)
# Select a specific part from an individual minifigure
def select_specific_individual_minifigure(
self,
minifigure: 'BrickMinifigure',
part: str,
color: int,
spare: int,
/,
) -> Self:
# Save the parameters to the fields
self.minifigure = minifigure
self.fields.id = minifigure.fields.id
self.fields.part = part
self.fields.color = color
self.fields.spare = spare
if not self.select(override_query='individual_minifigure/part/select/specific'):
raise NotFoundException(
'Part {part} with color {color} (spare: {spare}) from individual minifigure {figure} ({id}) was not found in the database'.format(
part=self.fields.part,
color=self.fields.color,
spare=self.fields.spare,
figure=self.minifigure.fields.figure,
id=self.minifigure.fields.id,
),
)
return self
# Update a problematic part for individual minifigure
def update_problem_individual_minifigure(self, problem: str, json: Any | None, /) -> int:
    # Handle a missing payload gracefully
    amount: str | int = json.get('value', '') if json else ''
    # We need a non-negative integer
    try:
        if amount == '':
            amount = 0
        amount = int(amount)
    except Exception:
        raise ErrorException('"{amount}" is not a valid integer'.format(
            amount=amount
        ))
    if amount < 0:
        raise ErrorException('Cannot set a negative amount')
setattr(self.fields, problem, amount)
BrickSQL().execute_and_commit(
'individual_minifigure/part/update/{problem}'.format(problem=problem),
parameters=self.sql_parameters()
)
return amount
# Update checked state for individual minifigure part
def update_checked_individual_minifigure(self, json: Any | None, /) -> bool:
# Handle both direct 'checked' key and changer.js 'value' key format
if json:
checked = json.get('checked', json.get('value', False))
else:
checked = False
checked = bool(checked)
# Update the field
self.fields.checked = checked
BrickSQL().execute_and_commit(
'individual_minifigure/part/update/checked',
parameters=self.sql_parameters()
)
return checked
@@ -25,6 +25,7 @@ class BrickPartList(BrickRecordList[BrickPart]):
all_query: str = 'part/list/all'
all_by_owner_query: str = 'part/list/all_by_owner'
different_color_query = 'part/list/with_different_color'
individual_minifigure_query: str = 'individual_minifigure/part/list/from_instance'
last_query: str = 'part/list/last'
minifigure_query: str = 'part/list/from_minifigure'
problem_query: str = 'part/list/problem'
@@ -212,6 +213,20 @@ class BrickPartList(BrickRecordList[BrickPart]):
return self
# Load parts from an individual minifigure instance
def from_individual_minifigure(
self,
minifigure: 'BrickMinifigure',
/,
) -> Self:
# Save the minifigure
self.minifigure = minifigure
# Load the parts from the database using the instance-specific query
self.list(override_query=self.individual_minifigure_query)
return self
# Load generic parts from a print
def from_print(
self,
@@ -306,9 +321,11 @@ class BrickPartList(BrickRecordList[BrickPart]):
def sql_parameters(self, /) -> dict[str, Any]:
parameters: dict[str, Any] = super().sql_parameters()
# Set id
# Set id - prioritize brickset, then check minifigure
if self.brickset is not None:
parameters['id'] = self.brickset.fields.id
elif self.minifigure is not None and hasattr(self.minifigure.fields, 'id'):
parameters['id'] = self.minifigure.fields.id
# Use the minifigure number if present,
if self.minifigure is not None:
@@ -0,0 +1,437 @@
import hashlib
import logging
import os
from pathlib import Path
import time
from typing import Any, NamedTuple, TYPE_CHECKING
from urllib.parse import urljoin
from bs4 import BeautifulSoup
import cloudscraper
from flask import current_app, url_for
import requests
from .exceptions import ErrorException
if TYPE_CHECKING:
from .socket import BrickSocket
logger = logging.getLogger(__name__)
def get_peeron_user_agent():
"""Get the User-Agent string for Peeron requests from config"""
return current_app.config.get('REBRICKABLE_USER_AGENT',
'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36')
def get_peeron_download_delay():
"""Get the delay in milliseconds between Peeron page downloads from config"""
return current_app.config.get('PEERON_DOWNLOAD_DELAY', 1000)
def get_min_image_size():
"""Get the minimum image size for valid Peeron instruction pages from config"""
return current_app.config.get('PEERON_MIN_IMAGE_SIZE', 100)
def get_peeron_instruction_url(set_number: str, version_number: str):
"""Get the Peeron instruction page URL using the configured pattern"""
pattern = current_app.config.get('PEERON_INSTRUCTION_PATTERN', 'http://peeron.com/scans/{set_number}-{version_number}')
return pattern.format(set_number=set_number, version_number=version_number)
def get_peeron_thumbnail_url(set_number: str, version_number: str):
"""Get the Peeron thumbnail base URL using the configured pattern"""
pattern = current_app.config.get('PEERON_THUMBNAIL_PATTERN', 'http://belay.peeron.com/thumbs/{set_number}-{version_number}/')
return pattern.format(set_number=set_number, version_number=version_number)
def get_peeron_scan_url(set_number: str, version_number: str):
"""Get the Peeron scan base URL using the configured pattern"""
pattern = current_app.config.get('PEERON_SCAN_PATTERN', 'http://belay.peeron.com/scans/{set_number}-{version_number}/')
return pattern.format(set_number=set_number, version_number=version_number)
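All three URL helpers above follow the same pattern-plus-`str.format` approach; a minimal standalone sketch of that substitution, using the default scan pattern shown above (the helper name here is hypothetical):

```python
# Hypothetical standalone version of the substitution performed by the
# get_peeron_*_url helpers; the default pattern mirrors the one above.
DEFAULT_SCAN_PATTERN = 'http://belay.peeron.com/scans/{set_number}-{version_number}/'


def format_scan_url(set_number: str, version_number: str,
                    pattern: str = DEFAULT_SCAN_PATTERN) -> str:
    # str.format fills the named placeholders in the configured pattern
    return pattern.format(set_number=set_number, version_number=version_number)


print(format_scan_url('4011', '1'))
# http://belay.peeron.com/scans/4011-1/
```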
def create_peeron_scraper():
"""Create a cloudscraper instance configured for Peeron"""
scraper = cloudscraper.create_scraper()
scraper.headers.update({
"User-Agent": get_peeron_user_agent()
})
return scraper
def get_peeron_cache_dir():
"""Get the base directory for Peeron caching"""
static_dir = Path(current_app.static_folder)
cache_dir = static_dir / 'images' / 'peeron_cache'
cache_dir.mkdir(parents=True, exist_ok=True)
return cache_dir
def get_set_cache_dir(set_number: str, version_number: str) -> tuple[Path, Path]:
"""Get cache directories for a specific set"""
base_cache_dir = get_peeron_cache_dir()
set_cache_key = f"{set_number}-{version_number}"
full_cache_dir = base_cache_dir / 'full' / set_cache_key
thumb_cache_dir = base_cache_dir / 'thumbs' / set_cache_key
full_cache_dir.mkdir(parents=True, exist_ok=True)
thumb_cache_dir.mkdir(parents=True, exist_ok=True)
return full_cache_dir, thumb_cache_dir
def cache_full_image_and_generate_thumbnail(image_url: str, page_number: str, set_number: str, version_number: str, session=None) -> tuple[str | None, str | None]:
"""
Download and cache full-size image, then generate a thumbnail preview.
Uses the full-size scan URLs from Peeron.
Returns (cached_image_path, thumbnail_url) or (None, None) if caching fails.
"""
try:
full_cache_dir, thumb_cache_dir = get_set_cache_dir(set_number, version_number)
full_filename = f"{page_number}.jpg"
thumb_filename = f"{page_number}.jpg"
full_cache_path = full_cache_dir / full_filename
thumb_cache_path = thumb_cache_dir / thumb_filename
# Return existing cached files if they exist
if full_cache_path.exists() and thumb_cache_path.exists():
set_cache_key = f"{set_number}-{version_number}"
thumbnail_url = url_for('static', filename=f'images/peeron_cache/thumbs/{set_cache_key}/{thumb_filename}')
return str(full_cache_path), thumbnail_url
# Download the full-size image using provided session or create new one
if session is None:
session = create_peeron_scraper()
response = session.get(image_url, timeout=30)
if response.status_code == 200 and len(response.content) > 0:
# Validate it's actually an image by checking minimum size
min_size = get_min_image_size()
if len(response.content) < min_size:
logger.warning(f"Image too small, skipping cache: {image_url}")
return None, None
# Write full-size image to cache
with open(full_cache_path, 'wb') as f:
f.write(response.content)
logger.debug(f"Cached full image: {image_url} -> {full_cache_path}")
# Generate thumbnail from the cached full image
try:
from PIL import Image
with Image.open(full_cache_path) as img:
# Create thumbnail (max 150px on longest side to match template)
img.thumbnail((150, 150), Image.Resampling.LANCZOS)
img.save(thumb_cache_path, 'JPEG', quality=85)
logger.debug(f"Generated thumbnail: {full_cache_path} -> {thumb_cache_path}")
set_cache_key = f"{set_number}-{version_number}"
thumbnail_url = url_for('static', filename=f'images/peeron_cache/thumbs/{set_cache_key}/{thumb_filename}')
return str(full_cache_path), thumbnail_url
except Exception as thumb_error:
logger.error(f"Failed to generate thumbnail for {page_number}: {thumb_error}")
# Clean up the full image if thumbnail generation failed
if full_cache_path.exists():
full_cache_path.unlink()
return None, None
else:
logger.warning(f"Failed to download full image: {image_url}")
return None, None
except Exception as e:
logger.error(f"Error caching full image {image_url}: {e}")
return None, None
def clear_set_cache(set_number: str, version_number: str) -> int:
"""
Clear all cached files for a specific set after PDF generation.
Returns the number of files deleted.
"""
try:
full_cache_dir, thumb_cache_dir = get_set_cache_dir(set_number, version_number)
deleted_count = 0
# Delete full images
if full_cache_dir.exists():
for cache_file in full_cache_dir.glob('*.jpg'):
try:
cache_file.unlink()
deleted_count += 1
logger.debug(f"Deleted cached full image: {cache_file}")
except OSError as e:
logger.warning(f"Failed to delete cache file {cache_file}: {e}")
# Remove directory if empty
try:
full_cache_dir.rmdir()
except OSError:
pass # Directory not empty or other error
# Delete thumbnails
if thumb_cache_dir.exists():
for cache_file in thumb_cache_dir.glob('*.jpg'):
try:
cache_file.unlink()
deleted_count += 1
logger.debug(f"Deleted cached thumbnail: {cache_file}")
except OSError as e:
logger.warning(f"Failed to delete cache file {cache_file}: {e}")
# Remove directory if empty
try:
thumb_cache_dir.rmdir()
except OSError:
pass # Directory not empty or other error
        # Try to remove the per-set directories if empty; the earlier rmdir
        # calls may have failed because files remained at the time.
        # Note: the original conditional-expression form checked
        # full_cache_dir.parent.name, which is always 'full' or 'thumbs' and
        # could never match the set cache key, so it was dead code.
        set_cache_key = f"{set_number}-{version_number}"
        for set_dir in (full_cache_dir, thumb_cache_dir):
            if set_dir.name == set_cache_key and set_dir.exists():
                try:
                    set_dir.rmdir()
                except OSError:
                    pass  # Directory not empty
logger.info(f"Set cache cleanup completed for {set_number}-{version_number}: {deleted_count} files deleted")
return deleted_count
except Exception as e:
logger.error(f"Error during set cache cleanup for {set_number}-{version_number}: {e}")
return 0
def clear_old_cache(max_age_days: int = 7) -> int:
"""
Clear old cache files across all sets.
Returns the number of files deleted.
"""
try:
base_cache_dir = get_peeron_cache_dir()
if not base_cache_dir.exists():
return 0
deleted_count = 0
max_age_seconds = max_age_days * 24 * 60 * 60
current_time = time.time()
# Clean both full and thumbs directories
for cache_type in ['full', 'thumbs']:
cache_type_dir = base_cache_dir / cache_type
if cache_type_dir.exists():
for set_dir in cache_type_dir.iterdir():
if set_dir.is_dir():
for cache_file in set_dir.glob('*.jpg'):
file_age = current_time - os.path.getmtime(cache_file)
if file_age > max_age_seconds:
try:
cache_file.unlink()
deleted_count += 1
logger.debug(f"Deleted old cache file: {cache_file}")
except OSError as e:
logger.warning(f"Failed to delete cache file {cache_file}: {e}")
# Remove empty directories
try:
if not any(set_dir.iterdir()):
set_dir.rmdir()
except OSError:
pass
logger.info(f"Old cache cleanup completed: {deleted_count} files deleted")
return deleted_count
except Exception as e:
logger.error(f"Error during old cache cleanup: {e}")
return 0
class PeeronPage(NamedTuple):
"""Represents a single instruction page from Peeron"""
page_number: str
original_image_url: str # Original Peeron full-size image URL
cached_full_image_path: str # Local full-size cached image path
cached_thumbnail_url: str # Local thumbnail URL for preview
alt_text: str
rotation: int = 0 # Rotation in degrees (0, 90, 180, 270)
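Since `PeeronPage` is a `NamedTuple`, instances are immutable and the `rotation` default applies when the field is omitted; a small usage sketch with hypothetical values:

```python
from typing import NamedTuple


class PeeronPage(NamedTuple):
    page_number: str
    original_image_url: str
    cached_full_image_path: str
    cached_thumbnail_url: str
    alt_text: str
    rotation: int = 0


# Hypothetical values for illustration
page = PeeronPage(
    page_number='1',
    original_image_url='http://belay.peeron.com/scans/4011-1/1/',
    cached_full_image_path='/tmp/peeron_cache/full/4011-1/1.jpg',
    cached_thumbnail_url='/static/images/peeron_cache/thumbs/4011-1/1.jpg',
    alt_text='LEGO Instructions 4011-1 Page 1',
)

# NamedTuple instances are immutable; _replace returns a modified copy
rotated = page._replace(rotation=90)
```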
# Peeron instruction scraper
class PeeronInstructions(object):
socket: 'BrickSocket | None'
set_number: str
version_number: str
pages: list[PeeronPage]
def __init__(
self,
set_number: str,
version_number: str = '1',
/,
*,
socket: 'BrickSocket | None' = None,
):
# Save the socket
self.socket = socket
# Parse set number (handle both "4011" and "4011-1" formats)
if '-' in set_number:
parts = set_number.split('-', 1)
self.set_number = parts[0]
self.version_number = parts[1] if len(parts) > 1 else '1'
else:
self.set_number = set_number
self.version_number = version_number
# Placeholder for pages
self.pages = []
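The set-number parsing in `__init__` accepts both `"4011"` and `"4011-1"` forms by splitting on the first dash; an isolated sketch of that logic (this sketch additionally falls back to the default when the version segment is empty):

```python
def parse_set_number(set_number: str, default_version: str = '1') -> tuple[str, str]:
    # Split on the first dash only, mirroring set_number.split('-', 1) above
    if '-' in set_number:
        number, _, version = set_number.partition('-')
        return number, version if version else default_version
    return set_number, default_version


print(parse_set_number('4011-1'))
# ('4011', '1')
```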
# Check if instructions exist on Peeron (lightweight)
def exists(self, /) -> bool:
"""Check if the set exists on Peeron without caching thumbnails"""
try:
base_url = get_peeron_instruction_url(self.set_number, self.version_number)
scraper = create_peeron_scraper()
response = scraper.get(base_url)
if response.status_code != 200:
return False
soup = BeautifulSoup(response.text, 'html.parser')
# Check for "Browse instruction library" header (set not found)
if soup.find('h1', string="Browse instruction library"):
return False
# Look for thumbnail images to confirm instructions exist
thumbnails = soup.select('table[cellspacing="5"] a img[src^="http://belay.peeron.com/thumbs/"]')
return len(thumbnails) > 0
except Exception:
return False
# Find all available instruction pages on Peeron
def find_pages(self, /) -> list[PeeronPage]:
"""
Scrape Peeron's HTML and return a list of available instruction pages.
Similar to BrickInstructions.find_instructions() but for Peeron.
"""
base_url = get_peeron_instruction_url(self.set_number, self.version_number)
thumb_base_url = get_peeron_thumbnail_url(self.set_number, self.version_number)
scan_base_url = get_peeron_scan_url(self.set_number, self.version_number)
logger.debug(f"[find_pages] fetching HTML from {base_url!r}")
# Set up session with persistent cookies for Peeron (like working dl_peeron.py)
scraper = create_peeron_scraper()
# Download the main HTML page to establish session and cookies
try:
logger.debug(f"[find_pages] Establishing session by visiting: {base_url}")
response = scraper.get(base_url)
logger.debug(f"[find_pages] Main page visit: HTTP {response.status_code}")
if response.status_code != 200:
raise ErrorException(f'Failed to load Peeron page for {self.set_number}-{self.version_number}. HTTP {response.status_code}')
except requests.exceptions.RequestException as e:
raise ErrorException(f'Failed to connect to Peeron: {e}')
# Parse HTML to locate instruction pages
soup = BeautifulSoup(response.text, 'html.parser')
# Check for "Browse instruction library" header (set not found)
if soup.find('h1', string="Browse instruction library"):
raise ErrorException(f'Set {self.set_number}-{self.version_number} not found on Peeron')
# Locate all thumbnail images in the expected table structure
# Use the configured thumbnail pattern to build the expected URL prefix
        # (thumb_base_url was already computed above)
thumbnails = soup.select(f'table[cellspacing="5"] a img[src^="{thumb_base_url}"]')
if not thumbnails:
raise ErrorException(f'No instruction pages found for {self.set_number}-{self.version_number} on Peeron')
pages: list[PeeronPage] = []
total_thumbnails = len(thumbnails)
# Initialize progress if socket is available
if self.socket:
self.socket.progress_total = total_thumbnails
self.socket.progress_count = 0
self.socket.progress(message=f"Starting to cache {total_thumbnails} full images")
for idx, img in enumerate(thumbnails, 1):
thumb_url = img['src']
# Extract the page number from the thumbnail URL
page_number = thumb_url.split('/')[-2]
# Build the full-size scan URL using the page number
full_size_url = f"{scan_base_url}{page_number}/"
logger.debug(f"[find_pages] Page {page_number}: thumb={thumb_url}, full_size={full_size_url}")
# Create alt text for the page
alt_text = f"LEGO Instructions {self.set_number}-{self.version_number} Page {page_number}"
# Report progress if socket is available
if self.socket:
self.socket.progress_count = idx
self.socket.progress(message=f"Caching full image {idx} of {total_thumbnails}")
# Cache the full-size image and generate thumbnail preview using established session
cached_full_path, cached_thumb_url = cache_full_image_and_generate_thumbnail(
full_size_url, page_number, self.set_number, self.version_number, session=scraper
)
# Skip this page if caching failed
if not cached_full_path or not cached_thumb_url:
logger.warning(f"[find_pages] Skipping page {page_number} due to caching failure")
continue
page = PeeronPage(
page_number=page_number,
original_image_url=full_size_url,
cached_full_image_path=cached_full_path,
cached_thumbnail_url=cached_thumb_url,
alt_text=alt_text
)
pages.append(page)
# Cache the pages for later use
self.pages = pages
logger.debug(f"[find_pages] found {len(pages)} pages for {self.set_number}-{self.version_number}")
return pages
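`find_pages` derives each page number from the second-to-last path segment of the thumbnail URL (`thumb_url.split('/')[-2]`); an isolated sketch, where the exact URL shape is an assumption:

```python
def page_number_from_thumbnail(thumb_url: str) -> str:
    # The page number is the directory component just before the filename,
    # e.g. .../thumbs/4011-1/3/<file> -> '3'
    return thumb_url.split('/')[-2]


# Hypothetical thumbnail URL following the pattern used above
print(page_number_from_thumbnail('http://belay.peeron.com/thumbs/4011-1/3/thumb.jpg'))
# 3
```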
# Find instructions with fallback to Peeron
@staticmethod
def find_instructions_with_peeron_fallback(set: str, /) -> tuple[list[tuple[str, str]], list[PeeronPage] | None]:
"""
Enhanced version of BrickInstructions.find_instructions() that falls back to Peeron.
Returns (rebrickable_instructions, peeron_pages).
If rebrickable_instructions is empty, peeron_pages will contain Peeron data.
"""
from .instructions import BrickInstructions
# First try Rebrickable
try:
rebrickable_instructions = BrickInstructions.find_instructions(set)
return rebrickable_instructions, None
except ErrorException as e:
logger.info(f"Rebrickable failed for {set}: {e}. Trying Peeron fallback...")
# Fallback to Peeron
try:
peeron = PeeronInstructions(set)
peeron_pages = peeron.find_pages()
return [], peeron_pages
except ErrorException as peeron_error:
# Both failed, re-raise original Rebrickable error
logger.info(f"Peeron also failed for {set}: {peeron_error}")
raise e from peeron_error
@@ -0,0 +1,200 @@
import logging
import os
import tempfile
from typing import TYPE_CHECKING
from flask import current_app
from PIL import Image
from .exceptions import DownloadException, ErrorException
from .instructions import BrickInstructions
from .peeron_instructions import PeeronPage
if TYPE_CHECKING:
from .socket import BrickSocket
logger = logging.getLogger(__name__)
# PDF generator for Peeron instruction pages
class PeeronPDF(object):
socket: 'BrickSocket'
set_number: str
version_number: str
pages: list[PeeronPage]
filename: str
def __init__(
self,
set_number: str,
version_number: str,
pages: list[PeeronPage],
/,
*,
socket: 'BrickSocket',
):
# Save the socket
self.socket = socket
# Save set information
self.set_number = set_number
self.version_number = version_number
self.pages = pages
# Generate filename following BrickTracker conventions
self.filename = f"{set_number}-{version_number}_peeron.pdf"
# Download pages and create PDF
def create_pdf(self, /) -> None:
"""
Downloads selected Peeron pages and merges them into a PDF.
Uses progress updates via socket similar to BrickInstructions.download()
"""
try:
target_path = self._get_target_path()
# Skip if we already have it
if os.path.isfile(target_path):
# Create BrickInstructions instance to get PDF URL
instructions = BrickInstructions(self.filename)
pdf_url = instructions.url()
return self.socket.complete(
message=f'File {self.filename} already exists, skipped - <a href="{pdf_url}" target="_blank" class="btn btn-sm btn-primary ms-2"><i class="ri-external-link-line"></i> Open PDF</a>'
)
# Set up progress tracking
total_pages = len(self.pages)
self.socket.update_total(total_pages)
self.socket.progress_count = 0
self.socket.progress(message=f"Starting PDF creation from {total_pages} cached pages")
# Use cached images directly - no downloads needed!
cached_files_with_rotation = []
missing_pages = []
for i, page in enumerate(self.pages):
# Check if cached file exists
if os.path.isfile(page.cached_full_image_path):
cached_files_with_rotation.append((page.cached_full_image_path, page.rotation))
# Update progress
self.socket.progress_count += 1
self.socket.progress(
message=f"Processing cached page {page.page_number} ({i + 1}/{total_pages})"
)
else:
missing_pages.append(page.page_number)
logger.warning(f"Cached image missing for page {page.page_number}: {page.cached_full_image_path}")
if not cached_files_with_rotation:
raise DownloadException(f"No cached images available for set {self.set_number}-{self.version_number}. Cache may have been cleared.")
elif len(cached_files_with_rotation) < total_pages:
# Partial success
error_msg = f"Only found {len(cached_files_with_rotation)}/{total_pages} cached images."
if missing_pages:
error_msg += f" Missing pages: {', '.join(missing_pages)}."
logger.warning(error_msg)
# Create PDF from cached images with rotation
self._create_pdf_from_images(cached_files_with_rotation, target_path)
# Success
logger.info(f"Created PDF {self.filename} with {len(cached_files_with_rotation)} pages")
# Create BrickInstructions instance to get PDF URL
instructions = BrickInstructions(self.filename)
pdf_url = instructions.url()
self.socket.complete(
message=f'PDF {self.filename} created with {len(cached_files_with_rotation)} pages - <a href="{pdf_url}" target="_blank" class="btn btn-sm btn-primary ms-2"><i class="ri-external-link-line"></i> Open PDF</a>'
)
# Clean up set cache after successful PDF creation
try:
from .peeron_instructions import clear_set_cache
deleted_count = clear_set_cache(self.set_number, self.version_number)
if deleted_count > 0:
logger.info(f"[create_pdf] Cleaned up {deleted_count} cache files for set {self.set_number}-{self.version_number}")
except Exception as e:
logger.warning(f"[create_pdf] Failed to clean set cache: {e}")
except Exception as e:
logger.error(f"Error creating PDF {self.filename}: {e}")
self.socket.fail(
message=f"Error creating PDF {self.filename}: {e}"
)
# Create PDF from downloaded images
def _create_pdf_from_images(self, image_paths_and_rotations: list[tuple[str, int]], output_path: str, /) -> None:
"""Create a PDF from a list of image files with their rotations"""
try:
# Import FPDF (should be available from requirements)
from fpdf import FPDF
except ImportError:
raise ErrorException("FPDF library not available. Install with: pip install fpdf2")
pdf = FPDF()
for i, (img_path, rotation) in enumerate(image_paths_and_rotations):
try:
# Open image and apply rotation if needed
with Image.open(img_path) as image:
# Apply rotation if specified
if rotation != 0:
# PIL rotation is counter-clockwise, so we negate for clockwise rotation
image = image.rotate(-rotation, expand=True)
width, height = image.size
# Add page with image dimensions (convert pixels to mm)
# 1 pixel = 0.264583 mm (assuming 96 DPI)
page_width = width * 0.264583
page_height = height * 0.264583
pdf.add_page(format=(page_width, page_height))
# Save rotated image to temporary file for FPDF
temp_rotated_path = None
if rotation != 0:
temp_fd, temp_rotated_path = tempfile.mkstemp(suffix='.jpg', prefix=f'peeron_rotated_{i}_')
try:
os.close(temp_fd) # Close file descriptor, we'll use the path
image.save(temp_rotated_path, 'JPEG', quality=95)
pdf.image(temp_rotated_path, x=0, y=0, w=page_width, h=page_height)
finally:
# Clean up rotated temp file
if temp_rotated_path and os.path.exists(temp_rotated_path):
os.remove(temp_rotated_path)
else:
pdf.image(img_path, x=0, y=0, w=page_width, h=page_height)
# Update progress
progress_msg = f"Processing page {i + 1}/{len(image_paths_and_rotations)} into PDF"
if rotation != 0:
progress_msg += f" (rotated {rotation}°)"
self.socket.progress(message=progress_msg)
except Exception as e:
logger.warning(f"Failed to add image {img_path} to PDF: {e}")
continue
# Save the PDF
pdf.output(output_path)
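The `0.264583` factor used in `_create_pdf_from_images` is simply 25.4 mm per inch divided by the assumed 96 DPI; a quick numeric check:

```python
MM_PER_INCH = 25.4
ASSUMED_DPI = 96  # assumption stated in the comment above


def px_to_mm(pixels: float) -> float:
    # 25.4 / 96 ~= 0.264583 mm per pixel, the constant used above
    return pixels * MM_PER_INCH / ASSUMED_DPI


print(round(MM_PER_INCH / ASSUMED_DPI, 6))
# 0.264583
```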
# Get target file path
def _get_target_path(self, /) -> str:
"""Get the full path where the PDF should be saved"""
instructions_folder = os.path.join(
current_app.static_folder, # type: ignore
current_app.config['INSTRUCTIONS_FOLDER']
)
return os.path.join(instructions_folder, self.filename)
# Create BrickInstructions instance for the generated PDF
def get_instructions(self, /) -> BrickInstructions:
"""Return a BrickInstructions instance for the generated PDF"""
return BrickInstructions(self.filename)
@@ -179,6 +179,15 @@ class RebrickableSet(BrickRecord):
return ''
# Compute the url for the bricklink page
def url_for_bricklink(self, /) -> str:
if current_app.config['BRICKLINK_LINKS']:
return current_app.config['BRICKLINK_LINK_SET_PATTERN'].format(
set_num=self.fields.set
)
return ''
# Compute the url for the refresh button
def url_for_refresh(self, /) -> str:
return url_for('set.refresh', set=self.fields.set)
@@ -169,6 +169,20 @@ class BrickSet(RebrickableSet):
else:
return ''
# Purchase date max formatted for consolidated sets
def purchase_date_max_formatted(self, /, *, standard: bool = False) -> str:
if hasattr(self.fields, 'purchase_date_max') and self.fields.purchase_date_max is not None:
time = datetime.fromtimestamp(self.fields.purchase_date_max)
if standard:
return time.strftime('%Y/%m/%d')
else:
return time.strftime(
current_app.config['PURCHASE_DATE_FORMAT']
)
else:
return ''
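The formatting in `purchase_date_max_formatted` boils down to `datetime.fromtimestamp` plus `strftime`; a sketch of the `standard=True` branch with a hypothetical purchase date:

```python
from datetime import datetime

# Hypothetical purchase date; '%Y/%m/%d' is the standard=True format above
formatted = datetime(2023, 10, 3).strftime('%Y/%m/%d')
print(formatted)
# 2023/10/03
```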
# Purchase price with currency
def purchase_price(self, /) -> str:
if self.fields.purchase_price is not None:
@@ -20,10 +20,12 @@ from .instructions_list import BrickInstructionsList
# All the sets from the database
class BrickSetList(BrickRecordList[BrickSet]):
themes: list[str]
years: list[int]
order: str
# Queries
all_query: str = 'set/list/all'
consolidated_query: str = 'set/list/consolidated'
damaged_minifigure_query: str = 'set/list/damaged_minifigure'
damaged_part_query: str = 'set/list/damaged_part'
generic_query: str = 'set/list/generic'
@@ -34,20 +36,39 @@ class BrickSetList(BrickRecordList[BrickSet]):
using_minifigure_query: str = 'set/list/using_minifigure'
using_part_query: str = 'set/list/using_part'
using_storage_query: str = 'set/list/using_storage'
without_storage_query: str = 'set/list/without_storage'
def __init__(self, /):
super().__init__()
# Placeholders
self.themes = []
self.years = []
# Store the order for this list
self.order = current_app.config['SETS_DEFAULT_ORDER']
# All the sets
def all(self, /) -> Self:
# Load the sets from the database
self.list(do_theme=True)
# Load the sets from the database with metadata context for filtering
filter_context = {
'owners': BrickSetOwnerList.as_columns(),
'statuses': BrickSetStatusList.as_columns(),
'tags': BrickSetTagList.as_columns(),
}
self.list(do_theme=True, **filter_context)
return self
# All sets in consolidated/grouped view
def all_consolidated(self, /) -> Self:
# Load the sets from the database using consolidated query with metadata context
filter_context = {
'owners_dict': BrickSetOwnerList.as_column_mapping(),
'statuses_dict': BrickSetStatusList.as_column_mapping(),
'tags_dict': BrickSetTagList.as_column_mapping(),
}
self.list(override_query=self.consolidated_query, do_theme=True, **filter_context)
return self
@@ -64,7 +85,10 @@ class BrickSetList(BrickRecordList[BrickSet]):
owner_filter: str | None = None,
purchase_location_filter: str | None = None,
storage_filter: str | None = None,
tag_filter: str | None = None
tag_filter: str | None = None,
year_filter: str | None = None,
duplicate_filter: bool = False,
use_consolidated: bool = True
) -> tuple[Self, int]:
# Convert theme name to theme ID for filtering
theme_id_filter = None
@@ -72,7 +96,7 @@ class BrickSetList(BrickRecordList[BrickSet]):
theme_id_filter = self._theme_name_to_id(theme_filter)
# Check if any filters are applied
has_filters = any([status_filter, theme_id_filter, owner_filter, purchase_location_filter, storage_filter, tag_filter])
has_filters = any([status_filter, theme_id_filter, owner_filter, purchase_location_filter, storage_filter, tag_filter, year_filter, duplicate_filter])
# Prepare filter context
filter_context = {
@@ -83,27 +107,56 @@ class BrickSetList(BrickRecordList[BrickSet]):
'purchase_location_filter': purchase_location_filter,
'storage_filter': storage_filter,
'tag_filter': tag_filter,
'year_filter': year_filter,
'duplicate_filter': duplicate_filter,
'owners': BrickSetOwnerList.as_columns(),
'statuses': BrickSetStatusList.as_columns(),
'tags': BrickSetTagList.as_columns(),
'owners_dict': BrickSetOwnerList.as_column_mapping(),
'statuses_dict': BrickSetStatusList.as_column_mapping(),
'tags_dict': BrickSetTagList.as_column_mapping(),
}
# Field mapping for sorting
field_mapping = {
'set': '"rebrickable_sets"."set"',
'name': '"rebrickable_sets"."name"',
'year': '"rebrickable_sets"."year"',
'parts': '"rebrickable_sets"."number_of_parts"',
'theme': '"rebrickable_sets"."theme_id"',
'minifigures': '"total_minifigures"', # Use the alias from the SQL query
'missing': '"total_missing"', # Use the alias from the SQL query
'damaged': '"total_damaged"', # Use the alias from the SQL query
'purchase-date': '"bricktracker_sets"."purchase_date"',
'purchase-price': '"bricktracker_sets"."purchase_price"'
}
if use_consolidated:
field_mapping = {
'set': '"rebrickable_sets"."number", "rebrickable_sets"."version"',
'name': '"rebrickable_sets"."name"',
'year': '"rebrickable_sets"."year"',
'parts': '"rebrickable_sets"."number_of_parts"',
'theme': '"rebrickable_sets"."theme_id"',
'minifigures': '"total_minifigures"',
'missing': '"total_missing"',
'damaged': '"total_damaged"',
'instances': '"instance_count"', # New field for consolidated view
'purchase-date': '"purchase_date"', # Use the MIN aggregated value
'purchase-price': '"purchase_price"' # Use the MIN aggregated value
}
else:
field_mapping = {
'set': '"rebrickable_sets"."number", "rebrickable_sets"."version"',
'name': '"rebrickable_sets"."name"',
'year': '"rebrickable_sets"."year"',
'parts': '"rebrickable_sets"."number_of_parts"',
'theme': '"rebrickable_sets"."theme_id"',
'minifigures': '"total_minifigures"', # Use the alias from the SQL query
'missing': '"total_missing"', # Use the alias from the SQL query
'damaged': '"total_damaged"', # Use the alias from the SQL query
'purchase-date': '"bricktracker_sets"."purchase_date"',
'purchase-price': '"bricktracker_sets"."purchase_price"'
}
# Choose query based on whether filters are applied
query_to_use = 'set/list/all_filtered' if has_filters else self.all_query
# Choose query based on consolidation preference and filter complexity
# Owner/tag filters still need to fall back to non-consolidated for now
# due to complex aggregation requirements
complex_filters = [owner_filter, tag_filter]
if use_consolidated and not any(complex_filters):
query_to_use = self.consolidated_query
else:
# Use filtered query when consolidation is disabled or complex filters applied
query_to_use = 'set/list/all_filtered'
# Handle instructions filtering
if status_filter in ['has-missing-instructions', '-has-missing-instructions']:
@@ -114,6 +167,18 @@ class BrickSetList(BrickRecordList[BrickSet]):
purchase_location_filter, storage_filter, tag_filter
)
# Handle special case for set sorting with multiple columns
if sort_field == 'set' and field_mapping:
# Create custom order clause for set sorting
direction = 'DESC' if sort_order.lower() == 'desc' else 'ASC'
custom_order = f'"rebrickable_sets"."number" {direction}, "rebrickable_sets"."version" {direction}'
filter_context['order'] = custom_order
# Remove set from field mapping to avoid double-processing
field_mapping_copy = field_mapping.copy()
field_mapping_copy.pop('set', None)
field_mapping = field_mapping_copy
sort_field = None # Disable automatic ORDER BY construction
# Normal SQL-based filtering and pagination
result, total_count = self.paginate(
page=page,
@@ -125,8 +190,22 @@ class BrickSetList(BrickRecordList[BrickSet]):
**filter_context
)
# Populate themes for filter dropdown from ALL sets, not just current page
result._populate_themes_global()
# Populate themes and years for filter dropdown from filtered dataset (not just current page)
# For themes dropdown, exclude theme_filter to show ALL available themes
themes_context = filter_context.copy()
themes_context.pop('theme_filter', None)
result._populate_themes_from_filtered_dataset(
query_to_use,
**themes_context
)
# For years dropdown, exclude ALL filters to show ALL available years
years_context = {
'search_query': filter_context.get('search_query'),
}
result._populate_years_from_filtered_dataset(
query_to_use,
**years_context
)
return result, total_count
@@ -140,18 +219,85 @@ class BrickSetList(BrickRecordList[BrickSet]):
self.themes = list(themes)
self.themes.sort()
def _populate_years(self) -> None:
"""Populate years list from the current records"""
years = set()
for record in self.records:
if hasattr(record, 'fields') and hasattr(record.fields, 'year') and record.fields.year:
years.add(record.fields.year)
self.years = list(years)
self.years.sort(reverse=True) # Most recent years first
def _theme_name_to_id(self, theme_name_or_id: str) -> str | None:
"""Convert a theme name or ID to theme ID for filtering"""
try:
# Check if the input is already a numeric theme ID
if theme_name_or_id.isdigit():
# Input is already a theme ID, validate it exists
theme_list = BrickThemeList()
theme_id = int(theme_name_or_id)
if theme_id in theme_list.themes:
return str(theme_id)
return None
else:
# Input is a theme name, convert to ID
from .sql import BrickSQL
theme_list = BrickThemeList()
# Find all theme IDs that match the name
matching_theme_ids = []
for theme_id, theme in theme_list.themes.items():
if theme.name.lower() == theme_name_or_id.lower():
matching_theme_ids.append(str(theme_id))
if not matching_theme_ids:
return None
# If only one match, return it
if len(matching_theme_ids) == 1:
return matching_theme_ids[0]
# Multiple matches - check which theme ID actually has sets in the user's collection
sql = BrickSQL()
for theme_id in matching_theme_ids:
result = sql.fetchone(
'set/check_theme_exists',
theme_id=theme_id
)
count = result['count'] if result else 0
if count > 0:
return theme_id
# If none have sets, return the first match (fallback)
return matching_theme_ids[0]
except Exception:
# If themes can't be loaded, return None to disable theme filtering
return None
def _theme_id_to_name(self, theme_id: str) -> str | None:
"""Convert a theme ID to theme name (lowercase) for dropdown display"""
try:
if not theme_id or not theme_id.isdigit():
return None
from .theme_list import BrickThemeList
theme_list = BrickThemeList()
theme_id_int = int(theme_id)
if theme_id_int in theme_list.themes:
return theme_list.themes[theme_id_int].name.lower()
return None
except Exception as e:
# For debugging - log the exception
import logging
logger = logging.getLogger(__name__)
logger.warning(f"Failed to convert theme ID {theme_id} to name: {e}")
return None
def _all_filtered_paginated_with_instructions(
self,
search_query: str | None,
@@ -223,12 +369,15 @@ class BrickSetList(BrickRecordList[BrickSet]):
result = BrickSetList()
result.records = paginated_records
# Copy themes and years from the source that has all sets
result.themes = all_sets.themes if hasattr(all_sets, 'themes') else []
result.years = all_sets.years if hasattr(all_sets, 'years') else []
# If themes or years weren't populated, populate them from current records
if not result.themes:
result._populate_themes()
if not result.years:
result._populate_years()
return result, total_count
@@ -240,21 +389,94 @@ class BrickSetList(BrickRecordList[BrickSet]):
purchase_location_filter, storage_filter, tag_filter
)
def _populate_years_from_filtered_dataset(self, query_name: str, **filter_context) -> None:
"""Populate years list from all available records in filtered dataset"""
try:
# Use a simplified query to get just distinct years
years_context = dict(filter_context)
years_context.pop('limit', None)
years_context.pop('offset', None)
# Use a special lightweight query for years
year_records = super().select(
override_query='set/list/years_only',
**years_context
)
# Extract years from records
years = set()
for record in year_records:
year = record['year'] if 'year' in record.keys() else None
if year:
years.add(year)
if years:
self.years = list(years)
self.years.sort(reverse=True) # Most recent years first
else:
import logging
logger = logging.getLogger(__name__)
logger.warning("No years found in filtered dataset, falling back to current page")
self._populate_years()
except Exception as e:
import logging
logger = logging.getLogger(__name__)
logger.error(f"Exception in _populate_years_from_filtered_dataset: {e}")
self._populate_years()
def _populate_themes_from_filtered_dataset(self, query_name: str, **filter_context) -> None:
"""Populate themes list from filtered dataset (all pages, not just current page)"""
try:
from .theme_list import BrickThemeList
# Use a simplified query to get just distinct theme_ids
theme_context = dict(filter_context)
theme_context.pop('limit', None)
theme_context.pop('offset', None)
# Use a special lightweight query for themes
theme_records = super().select(
override_query='set/list/themes_only',
**theme_context
)
# Convert to theme names
theme_list = BrickThemeList()
themes = set()
for record in theme_records:
theme_id = record['theme_id'] if 'theme_id' in record.keys() else None
if theme_id:
theme = theme_list.get(theme_id)
if theme and hasattr(theme, 'name'):
themes.add(theme.name)
self.themes = list(themes)
self.themes.sort()
except Exception:
# Fall back to simpler approach: get themes from ALL sets (ignoring filters)
# This is better than showing only current page themes
try:
from .theme_list import BrickThemeList
all_sets = BrickSetList()
all_sets.list(do_theme=True)
themes = set()
years = set()
for record in all_sets.records:
if hasattr(record, 'theme') and hasattr(record.theme, 'name'):
themes.add(record.theme.name)
if hasattr(record, 'fields') and hasattr(record.fields, 'year') and record.fields.year:
years.add(record.fields.year)
self.themes = list(themes)
self.themes.sort()
self.years = list(years)
self.years.sort(reverse=True)
except Exception:
# Final fallback to current page themes
self._populate_themes()
self._populate_years()
def _matches_search(self, record, search_query: str) -> bool:
"""Check if record matches search query"""
@@ -301,7 +523,7 @@ class BrickSetList(BrickRecordList[BrickSet]):
reverse = sort_order == 'desc'
if sort_field == 'set':
return sorted(records, key=lambda r: self._set_sort_key(r.fields.set), reverse=reverse)
elif sort_field == 'name':
return sorted(records, key=lambda r: r.fields.name, reverse=reverse)
elif sort_field == 'year':
@@ -312,6 +534,19 @@ class BrickSetList(BrickRecordList[BrickSet]):
return records
def _set_sort_key(self, set_number: str) -> tuple:
"""Generate sort key for set numbers like '10121-1' -> (10121, 1)"""
try:
if '-' in set_number:
main_part, version_part = set_number.split('-', 1)
return (int(main_part), int(version_part))
else:
return (int(set_number), 0)
except (ValueError, TypeError):
# Fallback to string sorting if parsing fails
return (float('inf'), set_number)
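A standalone sketch of the sorting behaviour `_set_sort_key` enables: set and version are compared numerically, so `'9999-1'` orders before `'10121-1'`, whereas plain string sorting would reverse them. The function name here is illustrative; the logic mirrors the method above.

```python
def set_sort_key(set_number: str) -> tuple:
    """Sort key for set numbers like '10121-1' -> (10121, 1)."""
    try:
        if '-' in set_number:
            main_part, version_part = set_number.split('-', 1)
            return (int(main_part), int(version_part))
        return (int(set_number), 0)
    except (ValueError, TypeError):
        # Unparseable numbers sort last, compared as strings among themselves
        return (float('inf'), set_number)

numbers = ['10121-1', '9999-1', '10121-2']
print(sorted(numbers, key=set_sort_key))  # ['9999-1', '10121-1', '10121-2']
```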
# Sets with a minifigure part damaged
def damaged_minifigure(self, figure: str, /) -> Self:
# Save the parameters to the fields
@@ -357,6 +592,7 @@ class BrickSetList(BrickRecordList[BrickSet]):
**context: Any,
) -> None:
themes = set()
years = set()
if order is None:
order = self.order
@@ -373,11 +609,15 @@ class BrickSetList(BrickRecordList[BrickSet]):
self.records.append(brickset)
if do_theme:
themes.add(brickset.theme.name)
if hasattr(brickset, 'fields') and hasattr(brickset.fields, 'year') and brickset.fields.year:
years.add(brickset.fields.year)
# Convert the set into a list and sort it
if do_theme:
self.themes = list(themes)
self.themes.sort()
self.years = list(years)
self.years.sort(reverse=True) # Most recent years first
# Sets missing a minifigure part
def missing_minifigure(self, figure: str, /) -> Self:
@@ -431,10 +671,17 @@ class BrickSetList(BrickRecordList[BrickSet]):
return self
def without_storage(self, /) -> Self:
# Load sets with no storage
self.list(override_query=self.without_storage_query)
return self
# Helper to build the metadata lists
def set_metadata_lists(
as_class: bool = False,
hardcoded_statuses_only: bool = False
) -> dict[
str,
Union[
@@ -446,9 +693,20 @@ def set_metadata_lists(
list[BrickSetTag]
]
]:
# Get all statuses
all_statuses = BrickSetStatusList.list(all=True)
# Filter to only hardcoded statuses if requested (for individual minifigures)
if hardcoded_statuses_only:
hardcoded_status_ids = ['minifigures_collected', 'set_checked', 'set_collected']
statuses = [s for s in all_statuses if s.fields.id in hardcoded_status_ids]
else:
statuses = all_statuses
return {
'brickset_owners': BrickSetOwnerList.list(),
'brickset_purchase_locations': BrickSetPurchaseLocationList.list(as_class=as_class), # noqa: E501
'brickset_statuses': statuses,
'brickset_storages': BrickSetStorageList.list(as_class=as_class),
'brickset_tags': BrickSetTagList.list(),
}
@@ -5,8 +5,9 @@ from .metadata import BrickMetadata
class BrickSetOwner(BrickMetadata):
kind: str = 'owner'
# Endpoints
set_state_endpoint: str = 'set.update_owner'
individual_minifigure_state_endpoint: str = 'individual_minifigure.update_owner'
# Queries
delete_query: str = 'set/metadata/owner/delete'
@@ -14,3 +15,4 @@ class BrickSetOwner(BrickMetadata):
select_query: str = 'set/metadata/owner/select'
update_field_query: str = 'set/metadata/owner/update/field'
update_set_state_query: str = 'set/metadata/owner/update/state'
update_individual_minifigure_state_query: str = 'individual_minifigure/metadata/owner/update/state'
@@ -15,6 +15,9 @@ class BrickSetOwnerList(BrickMetadataList[BrickSetOwner]):
# Queries
select_query = 'set/metadata/owner/list'
# Endpoints
set_state_endpoint: str = 'set.update_owner'
# Instantiate the list with the proper class
@classmethod
def new(cls, /, *, force: bool = False) -> Self:
@@ -5,9 +5,13 @@ from .metadata import BrickMetadata
class BrickSetPurchaseLocation(BrickMetadata):
kind: str = 'purchase location'
# Endpoints
individual_minifigure_value_endpoint: str = 'individual_minifigure.update_purchase_location'
# Queries
delete_query: str = 'set/metadata/purchase_location/delete'
insert_query: str = 'set/metadata/purchase_location/insert'
select_query: str = 'set/metadata/purchase_location/select'
update_field_query: str = 'set/metadata/purchase_location/update/field'
update_set_value_query: str = 'set/metadata/purchase_location/update/value'
update_individual_minifigure_value_query: str = 'individual_minifigure/metadata/purchase_location/update/value'
@@ -22,6 +22,9 @@ class BrickSetPurchaseLocationList(
# Set value endpoint
set_value_endpoint: str = 'set.update_purchase_location'
# Individual minifigure value endpoint
individual_minifigure_value_endpoint: str = 'individual_minifigure.update_purchase_location'
# Load all purchase locations
@classmethod
def all(cls, /) -> Self:
@@ -7,8 +7,9 @@ from .metadata import BrickMetadata
class BrickSetStatus(BrickMetadata):
kind: str = 'status'
# Endpoints
set_state_endpoint: str = 'set.update_status'
individual_minifigure_state_endpoint: str = 'individual_minifigure.update_status'
# Queries
delete_query: str = 'set/metadata/status/delete'
@@ -16,6 +17,7 @@ class BrickSetStatus(BrickMetadata):
select_query: str = 'set/metadata/status/select'
update_field_query: str = 'set/metadata/status/update/field'
update_set_state_query: str = 'set/metadata/status/update/state'
update_individual_minifigure_state_query: str = 'individual_minifigure/metadata/status/update/state'
# Grab data from a form
def from_form(self, form: dict[str, str], /) -> Self:
@@ -15,6 +15,9 @@ class BrickSetStatusList(BrickMetadataList[BrickSetStatus]):
# Queries
select_query = 'set/metadata/status/list'
# Endpoints
set_state_endpoint: str = 'set.update_status'
# Filter the list of set status
def filter(self, all: bool = False) -> list[BrickSetStatus]:
return [
@@ -7,12 +7,16 @@ from flask import url_for
class BrickSetStorage(BrickMetadata):
kind: str = 'storage'
# Endpoints
individual_minifigure_value_endpoint: str = 'individual_minifigure.update_storage'
# Queries
delete_query: str = 'set/metadata/storage/delete'
insert_query: str = 'set/metadata/storage/insert'
select_query: str = 'set/metadata/storage/select'
update_field_query: str = 'set/metadata/storage/update/field'
update_set_value_query: str = 'set/metadata/storage/update/value'
update_individual_minifigure_value_query: str = 'individual_minifigure/metadata/storage/update/value'
# Self url
def url(self, /) -> str:
@@ -20,6 +20,9 @@ class BrickSetStorageList(BrickMetadataList[BrickSetStorage]):
# Set value endpoint
set_value_endpoint: str = 'set.update_storage'
# Individual minifigure value endpoint
individual_minifigure_value_endpoint: str = 'individual_minifigure.update_storage'
# Load all storages
@classmethod
def all(cls, /) -> Self:
@@ -5,8 +5,9 @@ from .metadata import BrickMetadata
class BrickSetTag(BrickMetadata):
kind: str = 'tag'
# Endpoints
set_state_endpoint: str = 'set.update_tag'
individual_minifigure_state_endpoint: str = 'individual_minifigure.update_tag'
# Queries
delete_query: str = 'set/metadata/tag/delete'
@@ -14,3 +15,4 @@ class BrickSetTag(BrickMetadata):
select_query: str = 'set/metadata/tag/select'
update_field_query: str = 'set/metadata/tag/update/field'
update_set_state_query: str = 'set/metadata/tag/update/state'
update_individual_minifigure_state_query: str = 'individual_minifigure/metadata/tag/update/state'
@@ -15,6 +15,9 @@ class BrickSetTagList(BrickMetadataList[BrickSetTag]):
# Queries
select_query: str = 'set/metadata/tag/list'
# Endpoints
set_state_endpoint: str = 'set.update_tag'
# Instantiate the list with the proper class
@classmethod
def new(cls, /, *, force: bool = False) -> Self:
@@ -6,6 +6,8 @@ from flask_socketio import SocketIO
from .instructions import BrickInstructions
from .instructions_list import BrickInstructionsList
from .peeron_instructions import PeeronInstructions, PeeronPage
from .peeron_pdf import PeeronPDF
from .set import BrickSet
from .socket_decorator import authenticated_socket, rebrickable_socket
from .sql import close as sql_close
@@ -18,9 +20,14 @@ MESSAGES: Final[dict[str, str]] = {
'CONNECT': 'connect',
'DISCONNECT': 'disconnect',
'DOWNLOAD_INSTRUCTIONS': 'download_instructions',
'DOWNLOAD_PEERON_PAGES': 'download_peeron_pages',
'FAIL': 'fail',
'IMPORT_MINIFIGURE': 'import_minifigure',
'IMPORT_SET': 'import_set',
'LOAD_MINIFIGURE': 'load_minifigure',
'LOAD_PEERON_PAGES': 'load_peeron_pages',
'LOAD_SET': 'load_set',
'MINIFIGURE_LOADED': 'minifigure_loaded',
'PROGRESS': 'progress',
'SET_LOADED': 'set_loaded',
}
@@ -106,6 +113,84 @@ class BrickSocket(object):
BrickInstructionsList(force=True)
@self.socket.on(MESSAGES['LOAD_PEERON_PAGES'], namespace=self.namespace) # noqa: E501
def load_peeron_pages(data: dict[str, Any], /) -> None:
logger.debug('Socket: LOAD_PEERON_PAGES={data} (from: {fr})'.format(
data=data, fr=request.remote_addr))
try:
set_number = data.get('set', '')
if not set_number:
self.fail(message="Set number is required")
return
# Create Peeron instructions instance with socket for progress reporting
peeron = PeeronInstructions(set_number, socket=self)
# Find pages (this will report progress for thumbnail caching)
pages = peeron.find_pages()
# Complete the operation (JavaScript will handle redirect)
self.complete(message=f"Found {len(pages)} instruction pages on Peeron")
except Exception as e:
logger.error(f"Error in load_peeron_pages: {e}")
self.fail(message=f"Error loading Peeron pages: {e}")
@self.socket.on(MESSAGES['DOWNLOAD_PEERON_PAGES'], namespace=self.namespace) # noqa: E501
@authenticated_socket(self)
def download_peeron_pages(data: dict[str, Any], /) -> None:
logger.debug('Socket: DOWNLOAD_PEERON_PAGES={data} (from: {fr})'.format(
data=data,
fr=request.sid, # type: ignore
))
try:
# Extract data from the request
set_number = data.get('set', '')
pages_data = data.get('pages', [])
if not set_number:
raise ValueError("Set number is required")
if not pages_data:
raise ValueError("No pages selected")
# Parse set number
if '-' in set_number:
parts = set_number.split('-', 1)
set_num = parts[0]
version_num = parts[1] if len(parts) > 1 else '1'
else:
set_num = set_number
version_num = '1'
# Convert page data to PeeronPage objects
pages = []
for page_data in pages_data:
page = PeeronPage(
page_number=page_data.get('page_number', ''),
original_image_url=page_data.get('original_image_url', ''),
cached_full_image_path=page_data.get('cached_full_image_path', ''),
cached_thumbnail_url='', # Not needed for PDF generation
alt_text=page_data.get('alt_text', ''),
rotation=page_data.get('rotation', 0)
)
pages.append(page)
# Create PDF generator and start download
pdf_generator = PeeronPDF(set_num, version_num, pages, socket=self)
pdf_generator.create_pdf()
# Note: Cache cleanup is handled automatically by pdf_generator.create_pdf()
# Refresh instructions list to include new PDF
BrickInstructionsList(force=True)
except Exception as e:
logger.error(f"Error in download_peeron_pages: {e}")
self.fail(message=f"Error downloading Peeron pages: {e}")
@self.socket.on(MESSAGES['IMPORT_SET'], namespace=self.namespace)
@rebrickable_socket(self)
def import_set(data: dict[str, Any], /) -> None:
@@ -125,6 +210,27 @@ class BrickSocket(object):
BrickSet().load(self, data)
@self.socket.on(MESSAGES['IMPORT_MINIFIGURE'], namespace=self.namespace)
@rebrickable_socket(self)
def import_minifigure(data: dict[str, Any], /) -> None:
logger.debug('Socket: IMPORT_MINIFIGURE={data} (from: {fr})'.format(
data=data,
fr=request.sid, # type: ignore
))
from .individual_minifigure import IndividualMinifigure
IndividualMinifigure().download(self, data)
@self.socket.on(MESSAGES['LOAD_MINIFIGURE'], namespace=self.namespace)
def load_minifigure(data: dict[str, Any], /) -> None:
logger.debug('Socket: LOAD_MINIFIGURE={data} (from: {fr})'.format(
data=data,
fr=request.sid, # type: ignore
))
from .individual_minifigure import IndividualMinifigure
IndividualMinifigure().load(self, data)
# Update the progress auto-incrementing
def auto_progress(
self,
@@ -60,6 +60,29 @@ class BrickSQL(object):
# Grab a cursor
self.cursor = self.connection.cursor()
# SQLite Performance Optimizations
logger.debug('SQLite3: applying performance optimizations')
# Enable WAL (Write-Ahead Logging) mode for better concurrency
# Allows multiple readers while writer is active
self.connection.execute('PRAGMA journal_mode=WAL')
# Increase cache size for better query performance
# Default is 2000 pages, increase to 10000 pages (~40MB for 4KB pages)
self.connection.execute('PRAGMA cache_size=10000')
# Store temporary tables and indices in memory for speed
self.connection.execute('PRAGMA temp_store=memory')
# Enable foreign key constraints (good practice)
self.connection.execute('PRAGMA foreign_keys=ON')
# Optimize for read performance (trade write speed for read speed)
self.connection.execute('PRAGMA synchronous=NORMAL')
# Analyze database statistics for better query planning
self.connection.execute('ANALYZE')
# Grab the version and check
try:
version = self.fetchone('schema/get_version')
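The PRAGMA tuning applied above can be tried in isolation. This sketch assumes nothing about BrickTracker's `BrickSQL` class; it applies the same settings to a throwaway `sqlite3` connection and checks that they took effect (note that in-memory databases report `memory` instead of `wal` for the journal mode).

```python
import sqlite3

def tuned_connection(path: str = ':memory:') -> sqlite3.Connection:
    connection = sqlite3.connect(path)
    # WAL mode improves concurrency on file-backed databases
    connection.execute('PRAGMA journal_mode=WAL')
    connection.execute('PRAGMA cache_size=10000')    # ~40MB with 4KB pages
    connection.execute('PRAGMA temp_store=memory')   # temp tables/indices in RAM
    connection.execute('PRAGMA foreign_keys=ON')     # enforce FK constraints
    connection.execute('PRAGMA synchronous=NORMAL')  # fewer fsyncs than FULL
    return connection

conn = tuned_connection()
assert conn.execute('PRAGMA foreign_keys').fetchone()[0] == 1
assert conn.execute('PRAGMA temp_store').fetchone()[0] == 2  # 2 == memory
```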
@@ -0,0 +1,19 @@
-- Delete individual minifigure parts
DELETE FROM "bricktracker_individual_minifigure_parts"
WHERE "id" = :id;
-- Delete individual minifigure owners
DELETE FROM "bricktracker_individual_minifigure_owners"
WHERE "id" = :id;
-- Delete individual minifigure tags
DELETE FROM "bricktracker_individual_minifigure_tags"
WHERE "id" = :id;
-- Delete individual minifigure statuses
DELETE FROM "bricktracker_individual_minifigure_statuses"
WHERE "id" = :id;
-- Delete the individual minifigure itself
DELETE FROM "bricktracker_individual_minifigures"
WHERE "id" = :id;
@@ -0,0 +1,15 @@
INSERT OR IGNORE INTO "bricktracker_individual_minifigures" (
"id",
"figure",
"quantity",
"description",
"storage",
"purchase_location"
) VALUES (
:id,
:figure,
:quantity,
:description,
:storage,
:purchase_location
)
@@ -0,0 +1,48 @@
-- Get all individual minifigure instances for a specific storage location
SELECT
"bricktracker_individual_minifigures"."id",
"bricktracker_individual_minifigures"."figure",
"bricktracker_individual_minifigures"."quantity",
"bricktracker_individual_minifigures"."description",
"bricktracker_individual_minifigures"."storage",
"bricktracker_individual_minifigures"."purchase_location",
"rebrickable_minifigures"."number",
"rebrickable_minifigures"."name",
"rebrickable_minifigures"."image",
"rebrickable_minifigures"."number_of_parts",
"storage_meta"."name" AS "storage_name",
"purchase_meta"."name" AS "purchase_location_name",
IFNULL("problem_join"."total_missing", 0) AS "total_missing",
IFNULL("problem_join"."total_damaged", 0) AS "total_damaged"
FROM "bricktracker_individual_minifigures"
INNER JOIN "rebrickable_minifigures"
ON "bricktracker_individual_minifigures"."figure" = "rebrickable_minifigures"."figure"
LEFT JOIN "bricktracker_metadata_storages" AS "storage_meta"
ON "bricktracker_individual_minifigures"."storage" = "storage_meta"."id"
LEFT JOIN "bricktracker_metadata_purchase_locations" AS "purchase_meta"
ON "bricktracker_individual_minifigures"."purchase_location" = "purchase_meta"."id"
LEFT JOIN (
SELECT
"bricktracker_individual_minifigure_parts"."id",
SUM("bricktracker_individual_minifigure_parts"."missing") AS "total_missing",
SUM("bricktracker_individual_minifigure_parts"."damaged") AS "total_damaged"
FROM "bricktracker_individual_minifigure_parts"
GROUP BY "bricktracker_individual_minifigure_parts"."id"
) "problem_join"
ON "bricktracker_individual_minifigures"."id" = "problem_join"."id"
WHERE "bricktracker_individual_minifigures"."storage" IS NOT DISTINCT FROM :storage
{% if order %}
ORDER BY {{ order }}
{% else %}
ORDER BY "bricktracker_individual_minifigures"."rowid" DESC
{% endif %}
{% if limit %}
LIMIT {{ limit }}
{% endif %}
@@ -0,0 +1,48 @@
-- Get all individual minifigure instances without storage
SELECT
"bricktracker_individual_minifigures"."id",
"bricktracker_individual_minifigures"."figure",
"bricktracker_individual_minifigures"."quantity",
"bricktracker_individual_minifigures"."description",
"bricktracker_individual_minifigures"."storage",
"bricktracker_individual_minifigures"."purchase_location",
"rebrickable_minifigures"."number",
"rebrickable_minifigures"."name",
"rebrickable_minifigures"."image",
"rebrickable_minifigures"."number_of_parts",
"storage_meta"."name" AS "storage_name",
"purchase_meta"."name" AS "purchase_location_name",
IFNULL("problem_join"."total_missing", 0) AS "total_missing",
IFNULL("problem_join"."total_damaged", 0) AS "total_damaged"
FROM "bricktracker_individual_minifigures"
INNER JOIN "rebrickable_minifigures"
ON "bricktracker_individual_minifigures"."figure" = "rebrickable_minifigures"."figure"
LEFT JOIN "bricktracker_metadata_storages" AS "storage_meta"
ON "bricktracker_individual_minifigures"."storage" = "storage_meta"."id"
LEFT JOIN "bricktracker_metadata_purchase_locations" AS "purchase_meta"
ON "bricktracker_individual_minifigures"."purchase_location" = "purchase_meta"."id"
LEFT JOIN (
SELECT
"bricktracker_individual_minifigure_parts"."id",
SUM("bricktracker_individual_minifigure_parts"."missing") AS "total_missing",
SUM("bricktracker_individual_minifigure_parts"."damaged") AS "total_damaged"
FROM "bricktracker_individual_minifigure_parts"
GROUP BY "bricktracker_individual_minifigure_parts"."id"
) "problem_join"
ON "bricktracker_individual_minifigures"."id" = "problem_join"."id"
WHERE "bricktracker_individual_minifigures"."storage" IS NULL
{% if order %}
ORDER BY {{ order }}
{% else %}
ORDER BY "bricktracker_individual_minifigures"."rowid" DESC
{% endif %}
{% if limit %}
LIMIT {{ limit }}
{% endif %}
@@ -0,0 +1,10 @@
INSERT INTO "bricktracker_individual_minifigure_owners" (
"id",
"{{name}}"
) VALUES (
:id,
:state
)
ON CONFLICT("id")
DO UPDATE SET "{{name}}" = :state
WHERE "bricktracker_individual_minifigure_owners"."id" IS NOT DISTINCT FROM :id
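The upsert above can be demonstrated against a hypothetical stand-in table (the real schema generates one boolean column per owner via the `{{name}}` template, and its `DO UPDATE` additionally carries an `IS NOT DISTINCT FROM` guard, omitted here for brevity). `ON CONFLICT ... DO UPDATE` requires SQLite 3.24+, which ships with any recent Python.

```python
import sqlite3

conn = sqlite3.connect(':memory:')
# Stand-in for the owners metadata table: one row per minifigure id
conn.execute('CREATE TABLE owners ("id" TEXT PRIMARY KEY, "alice" INTEGER DEFAULT 0)')

UPSERT = '''
INSERT INTO owners ("id", "alice") VALUES (:id, :state)
ON CONFLICT("id") DO UPDATE SET "alice" = :state
'''

conn.execute(UPSERT, {'id': 'fig-1', 'state': 1})  # first call inserts the row
conn.execute(UPSERT, {'id': 'fig-1', 'state': 0})  # second call updates it in place
rows = conn.execute('SELECT "id", "alice" FROM owners').fetchall()
print(rows)  # [('fig-1', 0)]
```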
@@ -0,0 +1,10 @@
INSERT INTO "bricktracker_individual_minifigure_statuses" (
"id",
"{{name}}"
) VALUES (
:id,
:state
)
ON CONFLICT("id")
DO UPDATE SET "{{name}}" = :state
WHERE "bricktracker_individual_minifigure_statuses"."id" IS NOT DISTINCT FROM :id
@@ -0,0 +1,10 @@
INSERT INTO "bricktracker_individual_minifigure_tags" (
"id",
"{{name}}"
) VALUES (
:id,
:state
)
ON CONFLICT("id")
DO UPDATE SET "{{name}}" = :state
WHERE "bricktracker_individual_minifigure_tags"."id" IS NOT DISTINCT FROM :id
@@ -0,0 +1,23 @@
INSERT OR IGNORE INTO "bricktracker_individual_minifigure_parts" (
"id",
"part",
"color",
"spare",
"quantity",
"element",
"rebrickable_inventory",
"missing",
"damaged",
"checked"
) VALUES (
:id,
:part,
:color,
:spare,
:quantity,
:element,
:rebrickable_inventory,
0,
0,
0
)
@@ -0,0 +1,38 @@
-- Query parts for a specific individual minifigure instance
SELECT
"bricktracker_individual_minifigure_parts"."id",
"bricktracker_individual_minifigures"."figure",
"bricktracker_individual_minifigure_parts"."part",
"bricktracker_individual_minifigure_parts"."color",
"bricktracker_individual_minifigure_parts"."spare",
"bricktracker_individual_minifigure_parts"."quantity",
"bricktracker_individual_minifigure_parts"."element",
"bricktracker_individual_minifigure_parts"."missing" AS "total_missing",
"bricktracker_individual_minifigure_parts"."damaged" AS "total_damaged",
"bricktracker_individual_minifigure_parts"."checked",
"rebrickable_parts"."color_name",
"rebrickable_parts"."color_rgb",
"rebrickable_parts"."color_transparent",
"rebrickable_parts"."bricklink_color_id",
"rebrickable_parts"."bricklink_color_name",
"rebrickable_parts"."bricklink_part_num",
"rebrickable_parts"."name",
"rebrickable_parts"."image",
"rebrickable_parts"."image_id",
"rebrickable_parts"."url",
"rebrickable_parts"."print",
NULL AS "total_quantity",
NULL AS "total_spare",
NULL AS "total_sets",
NULL AS "total_minifigures"
FROM "bricktracker_individual_minifigure_parts"
INNER JOIN "bricktracker_individual_minifigures"
ON "bricktracker_individual_minifigure_parts"."id" = "bricktracker_individual_minifigures"."id"
INNER JOIN "rebrickable_parts"
ON "bricktracker_individual_minifigure_parts"."part" = "rebrickable_parts"."part"
AND "bricktracker_individual_minifigure_parts"."color" = "rebrickable_parts"."color_id"
WHERE "bricktracker_individual_minifigure_parts"."id" IS NOT DISTINCT FROM :id
{% if order %}
ORDER BY {{ order | replace('"combined"', '"bricktracker_individual_minifigure_parts"') | replace('"bricktracker_parts"', '"bricktracker_individual_minifigure_parts"') }}
{% endif %}
@@ -0,0 +1,33 @@
-- Select a specific part from an individual minifigure instance
SELECT
"bricktracker_individual_minifigure_parts"."id",
"bricktracker_individual_minifigures"."figure",
"bricktracker_individual_minifigure_parts"."part",
"bricktracker_individual_minifigure_parts"."color",
"bricktracker_individual_minifigure_parts"."spare",
"bricktracker_individual_minifigure_parts"."quantity",
"bricktracker_individual_minifigure_parts"."element",
"bricktracker_individual_minifigure_parts"."missing",
"bricktracker_individual_minifigure_parts"."damaged",
"bricktracker_individual_minifigure_parts"."checked",
"rebrickable_parts"."color_name",
"rebrickable_parts"."color_rgb",
"rebrickable_parts"."color_transparent",
"rebrickable_parts"."bricklink_color_id",
"rebrickable_parts"."bricklink_color_name",
"rebrickable_parts"."bricklink_part_num",
"rebrickable_parts"."name",
"rebrickable_parts"."image",
"rebrickable_parts"."image_id",
"rebrickable_parts"."url",
"rebrickable_parts"."print"
FROM "bricktracker_individual_minifigure_parts"
INNER JOIN "bricktracker_individual_minifigures"
ON "bricktracker_individual_minifigure_parts"."id" = "bricktracker_individual_minifigures"."id"
INNER JOIN "rebrickable_parts"
ON "bricktracker_individual_minifigure_parts"."part" = "rebrickable_parts"."part"
AND "bricktracker_individual_minifigure_parts"."color" = "rebrickable_parts"."color_id"
WHERE "bricktracker_individual_minifigure_parts"."id" IS NOT DISTINCT FROM :id
AND "bricktracker_individual_minifigure_parts"."part" IS NOT DISTINCT FROM :part
AND "bricktracker_individual_minifigure_parts"."color" IS NOT DISTINCT FROM :color
AND "bricktracker_individual_minifigure_parts"."spare" IS NOT DISTINCT FROM :spare
@@ -0,0 +1,6 @@
UPDATE "bricktracker_individual_minifigure_parts"
SET "checked" = :checked
WHERE "bricktracker_individual_minifigure_parts"."id" IS NOT DISTINCT FROM :id
AND "bricktracker_individual_minifigure_parts"."part" IS NOT DISTINCT FROM :part
AND "bricktracker_individual_minifigure_parts"."color" IS NOT DISTINCT FROM :color
AND "bricktracker_individual_minifigure_parts"."spare" IS NOT DISTINCT FROM :spare
@@ -0,0 +1,6 @@
UPDATE "bricktracker_individual_minifigure_parts"
SET "damaged" = :damaged
WHERE "bricktracker_individual_minifigure_parts"."id" IS NOT DISTINCT FROM :id
AND "bricktracker_individual_minifigure_parts"."part" IS NOT DISTINCT FROM :part
AND "bricktracker_individual_minifigure_parts"."color" IS NOT DISTINCT FROM :color
AND "bricktracker_individual_minifigure_parts"."spare" IS NOT DISTINCT FROM :spare
@@ -0,0 +1,6 @@
UPDATE "bricktracker_individual_minifigure_parts"
SET "missing" = :missing
WHERE "bricktracker_individual_minifigure_parts"."id" IS NOT DISTINCT FROM :id
AND "bricktracker_individual_minifigure_parts"."part" IS NOT DISTINCT FROM :part
AND "bricktracker_individual_minifigure_parts"."color" IS NOT DISTINCT FROM :color
AND "bricktracker_individual_minifigure_parts"."spare" IS NOT DISTINCT FROM :spare
@@ -0,0 +1,35 @@
-- Get a specific individual minifigure instance by ID
SELECT
"bricktracker_individual_minifigures"."id",
"bricktracker_individual_minifigures"."figure",
"bricktracker_individual_minifigures"."quantity",
"bricktracker_individual_minifigures"."description",
"bricktracker_individual_minifigures"."storage",
"bricktracker_individual_minifigures"."purchase_location",
"rebrickable_minifigures"."number",
"rebrickable_minifigures"."name",
"rebrickable_minifigures"."image",
"rebrickable_minifigures"."number_of_parts",
"storage_meta"."name" AS "storage_name",
"purchase_meta"."name" AS "purchase_location_name"{{ owners }}{{ statuses }}{{ tags }}
FROM "bricktracker_individual_minifigures"
INNER JOIN "rebrickable_minifigures"
ON "bricktracker_individual_minifigures"."figure" = "rebrickable_minifigures"."figure"
LEFT JOIN "bricktracker_metadata_storages" AS "storage_meta"
ON "bricktracker_individual_minifigures"."storage" = "storage_meta"."id"
LEFT JOIN "bricktracker_metadata_purchase_locations" AS "purchase_meta"
ON "bricktracker_individual_minifigures"."purchase_location" = "purchase_meta"."id"
LEFT JOIN "bricktracker_individual_minifigure_owners"
ON "bricktracker_individual_minifigures"."id" IS NOT DISTINCT FROM "bricktracker_individual_minifigure_owners"."id"
LEFT JOIN "bricktracker_individual_minifigure_statuses"
ON "bricktracker_individual_minifigures"."id" IS NOT DISTINCT FROM "bricktracker_individual_minifigure_statuses"."id"
LEFT JOIN "bricktracker_individual_minifigure_tags"
ON "bricktracker_individual_minifigures"."id" IS NOT DISTINCT FROM "bricktracker_individual_minifigure_tags"."id"
WHERE "bricktracker_individual_minifigures"."id" = :id
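Placeholders like `:id` are bound with sqlite3 named parameters, passed as a mapping. A sketch with a stub table and hypothetical values:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    'CREATE TABLE "bricktracker_individual_minifigures" '
    '("id" TEXT PRIMARY KEY, "figure" TEXT)'
)
con.execute(
    "INSERT INTO bricktracker_individual_minifigures VALUES ('abc-123', 'fig-001234')"
)

# Named-style placeholders bind from a dict, so one mapping can serve
# several of these templated queries.
row = con.execute(
    'SELECT "figure" FROM "bricktracker_individual_minifigures" WHERE "id" = :id',
    {"id": "abc-123"},
).fetchone()
```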
@@ -0,0 +1,52 @@
-- Get all individual minifigure instances for a specific figure
SELECT
"bricktracker_individual_minifigures"."id",
"bricktracker_individual_minifigures"."figure",
"bricktracker_individual_minifigures"."quantity",
"bricktracker_individual_minifigures"."description",
"bricktracker_individual_minifigures"."storage",
"bricktracker_individual_minifigures"."purchase_location",
"rebrickable_minifigures"."number",
"rebrickable_minifigures"."name",
"rebrickable_minifigures"."image",
"rebrickable_minifigures"."number_of_parts",
"storage_meta"."name" AS "storage_name",
"purchase_meta"."name" AS "purchase_location_name",
{{ owners }},
{{ statuses }},
{{ tags }},
IFNULL("problem_join"."total_missing", 0) AS "total_missing",
IFNULL("problem_join"."total_damaged", 0) AS "total_damaged"
FROM "bricktracker_individual_minifigures"
INNER JOIN "rebrickable_minifigures"
ON "bricktracker_individual_minifigures"."figure" = "rebrickable_minifigures"."figure"
LEFT JOIN "bricktracker_metadata_storages" AS "storage_meta"
ON "bricktracker_individual_minifigures"."storage" = "storage_meta"."id"
LEFT JOIN "bricktracker_metadata_purchase_locations" AS "purchase_meta"
ON "bricktracker_individual_minifigures"."purchase_location" = "purchase_meta"."id"
LEFT JOIN "bricktracker_individual_minifigure_owners"
ON "bricktracker_individual_minifigures"."id" = "bricktracker_individual_minifigure_owners"."id"
LEFT JOIN "bricktracker_individual_minifigure_statuses"
ON "bricktracker_individual_minifigures"."id" = "bricktracker_individual_minifigure_statuses"."id"
LEFT JOIN "bricktracker_individual_minifigure_tags"
ON "bricktracker_individual_minifigures"."id" = "bricktracker_individual_minifigure_tags"."id"
LEFT JOIN (
SELECT
"bricktracker_individual_minifigure_parts"."id",
SUM("bricktracker_individual_minifigure_parts"."missing") AS "total_missing",
SUM("bricktracker_individual_minifigure_parts"."damaged") AS "total_damaged"
FROM "bricktracker_individual_minifigure_parts"
GROUP BY "bricktracker_individual_minifigure_parts"."id"
) "problem_join"
ON "bricktracker_individual_minifigures"."id" = "problem_join"."id"
WHERE "bricktracker_individual_minifigures"."figure" = :figure
ORDER BY "bricktracker_individual_minifigures"."rowid" DESC
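The `problem_join` pattern above aggregates part problems in a subquery *before* joining. A toy sqlite3 demonstration of why (table names are stand-ins, not the real schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE figs (id TEXT);                                   -- stand-in
CREATE TABLE fig_parts (id TEXT, missing INTEGER, damaged INTEGER);
INSERT INTO figs VALUES ('a');
INSERT INTO fig_parts VALUES ('a', 1, 0), ('a', 2, 1);
""")

# Aggregating first means the outer row count is not multiplied by the
# number of matching part rows; IFNULL covers figures with no parts row.
row = con.execute("""
SELECT figs.id,
       IFNULL(p.total_missing, 0),
       IFNULL(p.total_damaged, 0)
FROM figs
LEFT JOIN (
  SELECT id, SUM(missing) AS total_missing, SUM(damaged) AS total_damaged
  FROM fig_parts GROUP BY id
) AS p ON figs.id = p.id
""").fetchone()
```

Joining `fig_parts` directly and summing in the outer query would instead duplicate every other summed column per part row.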
@@ -0,0 +1,7 @@
UPDATE "bricktracker_individual_minifigures"
SET
"quantity" = :quantity,
"description" = :description,
"storage" = :storage,
"purchase_location" = :purchase_location
WHERE "id" = :id
@@ -0,0 +1,9 @@
-- description: Add checked field to bricktracker_parts table for part walkthrough tracking
BEGIN TRANSACTION;
-- Add checked field to the bricktracker_parts table
-- This allows users to track which parts they have checked during walkthroughs
ALTER TABLE "bricktracker_parts" ADD COLUMN "checked" BOOLEAN DEFAULT 0;
COMMIT;
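SQLite's `ALTER TABLE ... ADD COLUMN` fails if the column already exists, so a rerunnable form of this migration would consult `PRAGMA table_info` first. A sketch (table stub only; the helper name is ours, not from the codebase):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute('CREATE TABLE "bricktracker_parts" ("part" TEXT)')  # stub

def add_column_if_missing(con, table, column, ddl):
    # PRAGMA table_info rows are (cid, name, type, notnull, dflt, pk).
    cols = [row[1] for row in con.execute(f'PRAGMA table_info("{table}")')]
    if column not in cols:
        con.execute(f'ALTER TABLE "{table}" ADD COLUMN {ddl}')

# Safe to run twice: the second call is a no-op.
add_column_if_missing(con, "bricktracker_parts", "checked", '"checked" BOOLEAN DEFAULT 0')
add_column_if_missing(con, "bricktracker_parts", "checked", '"checked" BOOLEAN DEFAULT 0')
cols = [row[1] for row in con.execute('PRAGMA table_info("bricktracker_parts")')]
```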
@@ -0,0 +1,56 @@
-- Migration 0019: Performance optimization indexes
-- High-impact composite index for problem parts aggregation
-- Used in set listings, statistics, and problem reports
CREATE INDEX IF NOT EXISTS idx_bricktracker_parts_id_missing_damaged
ON bricktracker_parts(id, missing, damaged);
-- Composite index for parts lookup by part and color
-- Used in part listings and filtering operations
CREATE INDEX IF NOT EXISTS idx_bricktracker_parts_part_color_spare
ON bricktracker_parts(part, color, spare);
-- Composite index for set storage filtering
-- Used in set listings filtered by storage location
CREATE INDEX IF NOT EXISTS idx_bricktracker_sets_set_storage
ON bricktracker_sets("set", storage);
-- Search optimization index for set names
-- Improves text search performance on set listings
CREATE INDEX IF NOT EXISTS idx_rebrickable_sets_name_lower
ON rebrickable_sets(LOWER(name));
-- Search optimization index for part names
-- Improves text search performance on part listings
CREATE INDEX IF NOT EXISTS idx_rebrickable_parts_name_lower
ON rebrickable_parts(LOWER(name));
-- Additional indexes for common join patterns
-- Set purchase filtering
CREATE INDEX IF NOT EXISTS idx_bricktracker_sets_purchase_location
ON bricktracker_sets(purchase_location);
-- Parts quantity filtering
CREATE INDEX IF NOT EXISTS idx_bricktracker_parts_quantity
ON bricktracker_parts(quantity);
-- Year-based filtering optimization
CREATE INDEX IF NOT EXISTS idx_rebrickable_sets_year
ON rebrickable_sets(year);
-- Theme-based filtering optimization
CREATE INDEX IF NOT EXISTS idx_rebrickable_sets_theme_id
ON rebrickable_sets(theme_id);
-- Rebrickable sets number and version for sorting
CREATE INDEX IF NOT EXISTS idx_rebrickable_sets_number_version
ON rebrickable_sets(number, version);
-- Purchase date filtering and sorting
CREATE INDEX IF NOT EXISTS idx_bricktracker_sets_purchase_date
ON bricktracker_sets(purchase_date);
-- Minifigures aggregation optimization
CREATE INDEX IF NOT EXISTS idx_bricktracker_minifigures_id_quantity
ON bricktracker_minifigures(id, quantity);
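The `LOWER(name)` entries are expression indexes (SQLite 3.9+). They are only usable when the query filters on the exact same expression, and a `LIKE '%q%'` with a leading wildcard still cannot use them, so the "text search" benefit applies to exact or prefix comparisons. A quick check with `EXPLAIN QUERY PLAN`:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE rebrickable_sets (name TEXT)")
con.execute(
    "CREATE INDEX IF NOT EXISTS idx_rebrickable_sets_name_lower "
    "ON rebrickable_sets(LOWER(name))"
)

# An equality on the indexed expression can use the index; the plan's
# detail column then reports "USING INDEX idx_rebrickable_sets_name_lower".
plan = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT * FROM rebrickable_sets WHERE LOWER(name) = 'castle'"
).fetchall()
uses_index = any("idx_rebrickable_sets_name_lower" in row[3] for row in plan)
```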
@@ -0,0 +1,132 @@
-- Migration 0020: Add individual minifigures and individual parts tables
-- Individual minifigures table - tracks individual minifigures not associated with sets
CREATE TABLE IF NOT EXISTS "bricktracker_individual_minifigures" (
"id" TEXT NOT NULL,
"figure" TEXT NOT NULL,
"quantity" INTEGER NOT NULL DEFAULT 1,
"description" TEXT,
"storage" TEXT, -- Storage bin location
"purchase_date" REAL, -- Purchase date
"purchase_location" TEXT, -- Purchase location
"purchase_price" REAL, -- Purchase price
PRIMARY KEY("id"),
FOREIGN KEY("figure") REFERENCES "rebrickable_minifigures"("figure"),
FOREIGN KEY("storage") REFERENCES "bricktracker_metadata_storages"("id"),
FOREIGN KEY("purchase_location") REFERENCES "bricktracker_metadata_purchase_locations"("id")
);
-- Individual minifigure statuses
CREATE TABLE IF NOT EXISTS "bricktracker_individual_minifigure_statuses" (
"id" TEXT NOT NULL,
"status_minifigures_collected" BOOLEAN NOT NULL DEFAULT 0,
"status_set_checked" BOOLEAN NOT NULL DEFAULT 0,
"status_set_collected" BOOLEAN NOT NULL DEFAULT 0,
PRIMARY KEY("id"),
FOREIGN KEY("id") REFERENCES "bricktracker_individual_minifigures"("id")
);
-- Individual minifigure owners
CREATE TABLE IF NOT EXISTS "bricktracker_individual_minifigure_owners" (
"id" TEXT NOT NULL,
PRIMARY KEY("id"),
FOREIGN KEY("id") REFERENCES "bricktracker_individual_minifigures"("id")
);
-- Individual minifigure tags
CREATE TABLE IF NOT EXISTS "bricktracker_individual_minifigure_tags" (
"id" TEXT NOT NULL,
PRIMARY KEY("id"),
FOREIGN KEY("id") REFERENCES "bricktracker_individual_minifigures"("id")
);
-- Parts table for individual minifigures - tracks constituent parts
CREATE TABLE IF NOT EXISTS "bricktracker_individual_minifigure_parts" (
"id" TEXT NOT NULL,
"part" TEXT NOT NULL,
"color" INTEGER NOT NULL,
"spare" BOOLEAN NOT NULL,
"quantity" INTEGER NOT NULL,
"element" INTEGER,
"rebrickable_inventory" INTEGER NOT NULL,
"missing" INTEGER NOT NULL DEFAULT 0,
"damaged" INTEGER NOT NULL DEFAULT 0,
"checked" BOOLEAN DEFAULT 0,
PRIMARY KEY("id", "part", "color", "spare"),
FOREIGN KEY("id") REFERENCES "bricktracker_individual_minifigures"("id"),
FOREIGN KEY("part", "color") REFERENCES "rebrickable_parts"("part", "color_id")
);
-- Individual parts table - tracks individual parts not associated with sets
CREATE TABLE IF NOT EXISTS "bricktracker_individual_parts" (
"id" TEXT NOT NULL,
"part" TEXT NOT NULL,
"color" INTEGER NOT NULL,
"quantity" INTEGER NOT NULL DEFAULT 1,
"description" TEXT,
"storage" TEXT, -- Storage bin location
"purchase_date" REAL, -- Purchase date
"purchase_location" TEXT, -- Purchase location
"purchase_price" REAL, -- Purchase price
PRIMARY KEY("id"),
FOREIGN KEY("part", "color") REFERENCES "rebrickable_parts"("part", "color_id"),
FOREIGN KEY("storage") REFERENCES "bricktracker_metadata_storages"("id"),
FOREIGN KEY("purchase_location") REFERENCES "bricktracker_metadata_purchase_locations"("id")
);
-- Individual part owners
CREATE TABLE IF NOT EXISTS "bricktracker_individual_part_owners" (
"id" TEXT NOT NULL,
PRIMARY KEY("id"),
FOREIGN KEY("id") REFERENCES "bricktracker_individual_parts"("id")
);
-- Individual part tags
CREATE TABLE IF NOT EXISTS "bricktracker_individual_part_tags" (
"id" TEXT NOT NULL,
PRIMARY KEY("id"),
FOREIGN KEY("id") REFERENCES "bricktracker_individual_parts"("id")
);
-- Individual part statuses
CREATE TABLE IF NOT EXISTS "bricktracker_individual_part_statuses" (
"id" TEXT NOT NULL,
"status_minifigures_collected" BOOLEAN NOT NULL DEFAULT 0,
"status_set_checked" BOOLEAN NOT NULL DEFAULT 0,
"status_set_collected" BOOLEAN NOT NULL DEFAULT 0,
PRIMARY KEY("id"),
FOREIGN KEY("id") REFERENCES "bricktracker_individual_parts"("id")
);
-- Indexes for individual minifigures
CREATE INDEX IF NOT EXISTS idx_bricktracker_individual_minifigures_figure
ON bricktracker_individual_minifigures(figure);
CREATE INDEX IF NOT EXISTS idx_bricktracker_individual_minifigures_storage
ON bricktracker_individual_minifigures(storage);
CREATE INDEX IF NOT EXISTS idx_bricktracker_individual_minifigures_purchase_location
ON bricktracker_individual_minifigures(purchase_location);
CREATE INDEX IF NOT EXISTS idx_bricktracker_individual_minifigures_purchase_date
ON bricktracker_individual_minifigures(purchase_date);
-- Indexes for individual minifigure parts
CREATE INDEX IF NOT EXISTS idx_bricktracker_individual_minifigure_parts_id_missing_damaged
ON bricktracker_individual_minifigure_parts(id, missing, damaged);
CREATE INDEX IF NOT EXISTS idx_bricktracker_individual_minifigure_parts_part_color
ON bricktracker_individual_minifigure_parts(part, color);
-- Indexes for individual parts
CREATE INDEX IF NOT EXISTS idx_bricktracker_individual_parts_part_color
ON bricktracker_individual_parts(part, color);
CREATE INDEX IF NOT EXISTS idx_bricktracker_individual_parts_storage
ON bricktracker_individual_parts(storage);
CREATE INDEX IF NOT EXISTS idx_bricktracker_individual_parts_purchase_location
ON bricktracker_individual_parts(purchase_location);
CREATE INDEX IF NOT EXISTS idx_bricktracker_individual_parts_purchase_date
ON bricktracker_individual_parts(purchase_date);
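One caveat for all the `FOREIGN KEY` clauses in this migration: SQLite does not enforce them unless the connection enables `PRAGMA foreign_keys = ON` (it is off by default). A sketch showing the enforcement kicking in:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # off by default in SQLite
con.execute('CREATE TABLE "bricktracker_individual_minifigures" ("id" TEXT PRIMARY KEY)')
con.execute(
    'CREATE TABLE "bricktracker_individual_minifigure_owners" ('
    '"id" TEXT NOT NULL, PRIMARY KEY("id"), '
    'FOREIGN KEY("id") REFERENCES "bricktracker_individual_minifigures"("id"))'
)

# Inserting an owner row with no matching minifigure now raises.
try:
    con.execute("INSERT INTO bricktracker_individual_minifigure_owners VALUES ('orphan')")
    enforced = False
except sqlite3.IntegrityError:
    enforced = True
```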
@@ -0,0 +1,23 @@
-- Migration 0021: Add existing owner/tag columns to individual minifigure and individual part metadata tables
-- Add owner columns to individual minifigure owners table
ALTER TABLE "bricktracker_individual_minifigure_owners"
ADD COLUMN "owner_32479d0a_cd3c_43c6_aa16_b3f378915b13" BOOLEAN NOT NULL DEFAULT 0;
ALTER TABLE "bricktracker_individual_minifigure_owners"
ADD COLUMN "owner_2f07518d_40e1_4279_b0d0_aa339f195cbf" BOOLEAN NOT NULL DEFAULT 0;
-- Add tag columns to individual minifigure tags table
ALTER TABLE "bricktracker_individual_minifigure_tags"
ADD COLUMN "tag_b1b5c316_5caf_4b82_a085_ac4c7ab9b8db" BOOLEAN NOT NULL DEFAULT 0;
-- Add owner columns to individual part owners table
ALTER TABLE "bricktracker_individual_part_owners"
ADD COLUMN "owner_32479d0a_cd3c_43c6_aa16_b3f378915b13" BOOLEAN NOT NULL DEFAULT 0;
ALTER TABLE "bricktracker_individual_part_owners"
ADD COLUMN "owner_2f07518d_40e1_4279_b0d0_aa339f195cbf" BOOLEAN NOT NULL DEFAULT 0;
-- Add tag columns to individual part tags table
ALTER TABLE "bricktracker_individual_part_tags"
ADD COLUMN "tag_b1b5c316_5caf_4b82_a085_ac4c7ab9b8db" BOOLEAN NOT NULL DEFAULT 0;
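The column names embed each owner or tag UUID with hyphens replaced by underscores. Assuming that convention (inferred from the migration; the helper below is hypothetical, not from the codebase), generating such a column name and its `ALTER` statement looks like:

```python
def owner_column(owner_uuid: str) -> str:
    """Map an owner UUID to its boolean column name (hyphens -> underscores)."""
    return "owner_" + owner_uuid.replace("-", "_")

col = owner_column("32479d0a-cd3c-43c6-aa16-b3f378915b13")
alter = (
    f'ALTER TABLE "bricktracker_individual_minifigure_owners" '
    f'ADD COLUMN "{col}" BOOLEAN NOT NULL DEFAULT 0'
)
```

Deriving the identifier this way keeps it a valid SQL column name while staying traceable to the metadata row's UUID.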
@@ -1,10 +1,11 @@
-- Combined query for both set-based and individual minifigures
SELECT
"bricktracker_minifigures"."quantity",
"rebrickable_minifigures"."figure",
"rebrickable_minifigures"."number",
"rebrickable_minifigures"."number_of_parts",
"rebrickable_minifigures"."name",
"rebrickable_minifigures"."image",
"combined"."quantity",
"combined"."figure",
"combined"."number",
"combined"."number_of_parts",
"combined"."name",
"combined"."image",
{% block total_missing %}
NULL AS "total_missing", -- dummy for order: total_missing
{% endblock %}
@@ -15,12 +16,44 @@ SELECT
NULL AS "total_quantity", -- dummy for order: total_quantity
{% endblock %}
{% block total_sets %}
NULL AS "total_sets" -- dummy for order: total_sets
NULL AS "total_sets", -- dummy for order: total_sets
{% endblock %}
FROM "bricktracker_minifigures"
{% block total_individual %}
NULL AS "total_individual" -- dummy for order: total_individual
{% endblock %}
FROM (
-- Set-based minifigures
SELECT
"bricktracker_minifigures"."id",
"bricktracker_minifigures"."quantity",
"rebrickable_minifigures"."figure",
"rebrickable_minifigures"."number",
"rebrickable_minifigures"."number_of_parts",
"rebrickable_minifigures"."name",
"rebrickable_minifigures"."image",
"bricktracker_minifigures"."rowid" AS "rowid",
'set' AS "source_type"
FROM "bricktracker_minifigures"
INNER JOIN "rebrickable_minifigures"
ON "bricktracker_minifigures"."figure" IS NOT DISTINCT FROM "rebrickable_minifigures"."figure"
UNION ALL
-- Individual minifigures
SELECT
"bricktracker_individual_minifigures"."id",
"bricktracker_individual_minifigures"."quantity",
"rebrickable_minifigures"."figure",
"rebrickable_minifigures"."number",
"rebrickable_minifigures"."number_of_parts",
"rebrickable_minifigures"."name",
"rebrickable_minifigures"."image",
"bricktracker_individual_minifigures"."rowid" AS "rowid",
'individual' AS "source_type"
FROM "bricktracker_individual_minifigures"
INNER JOIN "rebrickable_minifigures"
ON "bricktracker_individual_minifigures"."figure" IS NOT DISTINCT FROM "rebrickable_minifigures"."figure"
) AS "combined"
{% block join %}{% endblock %}
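The `combined` subquery uses `UNION ALL`, which keeps one row per branch even when the same figure exists both in a set and individually; the `source_type` literal then lets the per-source tallies stay correct. A toy demonstration (stand-in tables):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE set_minifigs (figure TEXT, quantity INTEGER);
CREATE TABLE individual_minifigs (figure TEXT, quantity INTEGER);
INSERT INTO set_minifigs VALUES ('fig-001234', 1);
INSERT INTO individual_minifigs VALUES ('fig-001234', 2);
""")

# UNION ALL keeps both rows; plain UNION would deduplicate identical rows
# and could silently merge counts across sources.
rows = con.execute("""
SELECT figure, quantity, 'set' AS source_type FROM set_minifigs
UNION ALL
SELECT figure, quantity, 'individual' FROM individual_minifigs
""").fetchall()
```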
@@ -9,16 +9,22 @@ SUM(IFNULL("problem_join"."total_damaged", 0)) AS "total_damaged",
{% endblock %}
{% block total_quantity %}
SUM(IFNULL("bricktracker_minifigures"."quantity", 0)) AS "total_quantity",
SUM(IFNULL("combined"."quantity", 0)) AS "total_quantity",
{% endblock %}
{% block total_sets %}
IFNULL(COUNT("bricktracker_minifigures"."id"), 0) AS "total_sets"
SUM(CASE WHEN "combined"."source_type" = 'set' THEN 1 ELSE 0 END) AS "total_sets",
{% endblock %}
{% block total_individual %}
SUM(CASE WHEN "combined"."source_type" = 'individual' THEN 1 ELSE 0 END) AS "total_individual"
{% endblock %}
{% block join %}
-- LEFT JOIN a pre-aggregated subquery to avoid skewing the totals
-- Combine parts from both set-based and individual minifigures
LEFT JOIN (
-- Set-based minifigure parts
SELECT
"bricktracker_parts"."id",
"bricktracker_parts"."figure",
@@ -29,18 +35,33 @@ LEFT JOIN (
GROUP BY
"bricktracker_parts"."id",
"bricktracker_parts"."figure"
UNION ALL
-- Individual minifigure parts
SELECT
"bricktracker_individual_minifigure_parts"."id",
"combined"."figure",
SUM("bricktracker_individual_minifigure_parts"."missing") AS "total_missing",
SUM("bricktracker_individual_minifigure_parts"."damaged") AS "total_damaged"
FROM "bricktracker_individual_minifigure_parts"
INNER JOIN "bricktracker_individual_minifigures" ON "bricktracker_individual_minifigure_parts"."id" = "bricktracker_individual_minifigures"."id"
INNER JOIN "rebrickable_minifigures" AS "combined" ON "bricktracker_individual_minifigures"."figure" = "combined"."figure"
GROUP BY
"bricktracker_individual_minifigure_parts"."id",
"combined"."figure"
) "problem_join"
ON "bricktracker_minifigures"."id" IS NOT DISTINCT FROM "problem_join"."id"
AND "rebrickable_minifigures"."figure" IS NOT DISTINCT FROM "problem_join"."figure"
ON "combined"."id" IS NOT DISTINCT FROM "problem_join"."id"
AND "combined"."figure" IS NOT DISTINCT FROM "problem_join"."figure"
{% endblock %}
{% block where %}
{% if search_query %}
WHERE (LOWER("rebrickable_minifigures"."name") LIKE LOWER('%{{ search_query }}%'))
WHERE (LOWER("combined"."name") LIKE LOWER('%{{ search_query }}%'))
{% endif %}
{% endblock %}
{% block group %}
GROUP BY
"rebrickable_minifigures"."figure"
"combined"."figure"
{% endblock %}
@@ -10,31 +10,53 @@ SUM(IFNULL("problem_join"."total_damaged", 0)) AS "total_damaged",
{% block total_quantity %}
{% if owner_id and owner_id != 'all' %}
SUM(CASE WHEN "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN IFNULL("bricktracker_minifigures"."quantity", 0) ELSE 0 END) AS "total_quantity",
SUM(CASE
WHEN "combined"."source_type" = 'set' AND "set_owners"."owner_{{ owner_id }}" = 1 THEN IFNULL("combined"."quantity", 0)
WHEN "combined"."source_type" = 'individual' AND "individual_owners"."owner_{{ owner_id }}" = 1 THEN IFNULL("combined"."quantity", 0)
ELSE 0
END) AS "total_quantity",
{% else %}
SUM(IFNULL("bricktracker_minifigures"."quantity", 0)) AS "total_quantity",
SUM(IFNULL("combined"."quantity", 0)) AS "total_quantity",
{% endif %}
{% endblock %}
{% block total_sets %}
{% if owner_id and owner_id != 'all' %}
COUNT(CASE WHEN "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "bricktracker_minifigures"."id" ELSE NULL END) AS "total_sets"
SUM(CASE
WHEN "combined"."source_type" = 'set' AND "set_owners"."owner_{{ owner_id }}" = 1 THEN 1
ELSE 0
END) AS "total_sets",
{% else %}
COUNT("bricktracker_minifigures"."id") AS "total_sets"
SUM(CASE WHEN "combined"."source_type" = 'set' THEN 1 ELSE 0 END) AS "total_sets",
{% endif %}
{% endblock %}
{% block total_individual %}
{% if owner_id and owner_id != 'all' %}
SUM(CASE
WHEN "combined"."source_type" = 'individual' AND "individual_owners"."owner_{{ owner_id }}" = 1 THEN 1
ELSE 0
END) AS "total_individual"
{% else %}
SUM(CASE WHEN "combined"."source_type" = 'individual' THEN 1 ELSE 0 END) AS "total_individual"
{% endif %}
{% endblock %}
{% block join %}
-- Join with sets to get owner information
INNER JOIN "bricktracker_sets"
ON "bricktracker_minifigures"."id" IS NOT DISTINCT FROM "bricktracker_sets"."id"
-- Join with set owners for set-based minifigures
LEFT JOIN "bricktracker_sets"
ON "combined"."id" = "bricktracker_sets"."id" AND "combined"."source_type" = 'set'
-- Left join with set owners (using dynamic columns)
LEFT JOIN "bricktracker_set_owners"
ON "bricktracker_sets"."id" IS NOT DISTINCT FROM "bricktracker_set_owners"."id"
LEFT JOIN "bricktracker_set_owners" AS "set_owners"
ON "bricktracker_sets"."id" = "set_owners"."id"
-- Join with individual minifigure owners for individual minifigures
LEFT JOIN "bricktracker_individual_minifigure_owners" AS "individual_owners"
ON "combined"."id" = "individual_owners"."id" AND "combined"."source_type" = 'individual'
-- LEFT JOIN a pre-aggregated subquery to avoid skewing the totals
LEFT JOIN (
-- Set-based minifigure parts
SELECT
"bricktracker_parts"."id",
"bricktracker_parts"."figure",
@@ -47,25 +69,47 @@ LEFT JOIN (
{% endif %}
FROM "bricktracker_parts"
INNER JOIN "bricktracker_sets" AS "parts_sets"
ON "bricktracker_parts"."id" IS NOT DISTINCT FROM "parts_sets"."id"
ON "bricktracker_parts"."id" = "parts_sets"."id"
LEFT JOIN "bricktracker_set_owners" AS "owner_parts"
ON "parts_sets"."id" IS NOT DISTINCT FROM "owner_parts"."id"
ON "parts_sets"."id" = "owner_parts"."id"
WHERE "bricktracker_parts"."figure" IS NOT NULL
GROUP BY
"bricktracker_parts"."id",
"bricktracker_parts"."figure"
UNION ALL
-- Individual minifigure parts
SELECT
"bricktracker_individual_minifigure_parts"."id",
"bricktracker_individual_minifigures"."figure",
{% if owner_id and owner_id != 'all' %}
SUM(CASE WHEN "owner_individual"."owner_{{ owner_id }}" = 1 THEN "bricktracker_individual_minifigure_parts"."missing" ELSE 0 END) AS "total_missing",
SUM(CASE WHEN "owner_individual"."owner_{{ owner_id }}" = 1 THEN "bricktracker_individual_minifigure_parts"."damaged" ELSE 0 END) AS "total_damaged"
{% else %}
SUM("bricktracker_individual_minifigure_parts"."missing") AS "total_missing",
SUM("bricktracker_individual_minifigure_parts"."damaged") AS "total_damaged"
{% endif %}
FROM "bricktracker_individual_minifigure_parts"
INNER JOIN "bricktracker_individual_minifigures"
ON "bricktracker_individual_minifigure_parts"."id" = "bricktracker_individual_minifigures"."id"
LEFT JOIN "bricktracker_individual_minifigure_owners" AS "owner_individual"
ON "bricktracker_individual_minifigures"."id" = "owner_individual"."id"
GROUP BY
"bricktracker_individual_minifigure_parts"."id",
"bricktracker_individual_minifigures"."figure"
) "problem_join"
ON "bricktracker_minifigures"."id" IS NOT DISTINCT FROM "problem_join"."id"
AND "rebrickable_minifigures"."figure" IS NOT DISTINCT FROM "problem_join"."figure"
ON "combined"."id" = "problem_join"."id"
AND "combined"."figure" = "problem_join"."figure"
{% endblock %}
{% block where %}
{% set conditions = [] %}
{% if owner_id and owner_id != 'all' %}
{% set _ = conditions.append('"bricktracker_set_owners"."owner_' ~ owner_id ~ '" = 1') %}
{% set _ = conditions.append('(("combined"."source_type" = \'set\' AND "set_owners"."owner_' ~ owner_id ~ '" = 1) OR ("combined"."source_type" = \'individual\' AND "individual_owners"."owner_' ~ owner_id ~ '" = 1))') %}
{% endif %}
{% if search_query %}
{% set _ = conditions.append('(LOWER("rebrickable_minifigures"."name") LIKE LOWER(\'%' ~ search_query ~ '%\'))') %}
{% set _ = conditions.append('(LOWER("combined"."name") LIKE LOWER(\'%' ~ search_query ~ '%\'))') %}
{% endif %}
{% if conditions %}
WHERE {{ conditions | join(' AND ') }}
@@ -74,5 +118,5 @@ WHERE {{ conditions | join(' AND ') }}
{% block group %}
GROUP BY
"rebrickable_minifigures"."figure"
"combined"."figure"
{% endblock %}
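The `where` block assembles its clause by appending condition strings and joining them with `AND`. The same logic in plain Python (variable values are hypothetical; the owner condition is abbreviated to the set branch) looks like this; note that interpolating `search_query` directly into the SQL, as the template does, is only safe if the value is sanitized upstream:

```python
# Hypothetical stand-ins for the template's context variables.
owner_id = "32479d0a_cd3c_43c6_aa16_b3f378915b13"
search_query = "castle"

conditions = []
if owner_id and owner_id != "all":
    conditions.append(
        f'("combined"."source_type" = \'set\' AND "set_owners"."owner_{owner_id}" = 1)'
    )
if search_query:
    conditions.append(f'(LOWER("combined"."name") LIKE LOWER(\'%{search_query}%\'))')

# No conditions -> no WHERE clause at all.
where = "WHERE " + " AND ".join(conditions) if conditions else ""
```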
@@ -1,28 +1,59 @@
{% extends 'minifigure/base/base.sql' %}
{% block total_damaged %}
SUM("bricktracker_parts"."damaged") AS "total_damaged",
SUM("parts_combined"."damaged") AS "total_damaged",
{% endblock %}
{% block join %}
LEFT JOIN "bricktracker_parts"
ON "bricktracker_minifigures"."id" IS NOT DISTINCT FROM "bricktracker_parts"."id"
AND "rebrickable_minifigures"."figure" IS NOT DISTINCT FROM "bricktracker_parts"."figure"
-- Join with parts from both set-based and individual minifigures
LEFT JOIN (
SELECT
"bricktracker_parts"."id",
"bricktracker_parts"."figure",
"bricktracker_parts"."damaged"
FROM "bricktracker_parts"
UNION ALL
SELECT
"bricktracker_individual_minifigure_parts"."id",
"bricktracker_individual_minifigures"."figure",
"bricktracker_individual_minifigure_parts"."damaged"
FROM "bricktracker_individual_minifigure_parts"
INNER JOIN "bricktracker_individual_minifigures"
ON "bricktracker_individual_minifigure_parts"."id" = "bricktracker_individual_minifigures"."id"
) AS "parts_combined"
ON "combined"."id" IS NOT DISTINCT FROM "parts_combined"."id"
AND "combined"."figure" IS NOT DISTINCT FROM "parts_combined"."figure"
{% endblock %}
{% block where %}
WHERE "rebrickable_minifigures"."figure" IN (
SELECT "bricktracker_parts"."figure"
FROM "bricktracker_parts"
WHERE "bricktracker_parts"."part" IS NOT DISTINCT FROM :part
AND "bricktracker_parts"."color" IS NOT DISTINCT FROM :color
AND "bricktracker_parts"."figure" IS NOT NULL
AND "bricktracker_parts"."damaged" > 0
GROUP BY "bricktracker_parts"."figure"
WHERE "combined"."figure" IN (
-- Find figures with damaged parts from both sources
SELECT "figure"
FROM (
SELECT "bricktracker_parts"."figure"
FROM "bricktracker_parts"
WHERE "bricktracker_parts"."part" IS NOT DISTINCT FROM :part
AND "bricktracker_parts"."color" IS NOT DISTINCT FROM :color
AND "bricktracker_parts"."figure" IS NOT NULL
AND "bricktracker_parts"."damaged" > 0
UNION
SELECT "bricktracker_individual_minifigures"."figure"
FROM "bricktracker_individual_minifigure_parts"
INNER JOIN "bricktracker_individual_minifigures"
ON "bricktracker_individual_minifigure_parts"."id" = "bricktracker_individual_minifigures"."id"
WHERE "bricktracker_individual_minifigure_parts"."part" IS NOT DISTINCT FROM :part
AND "bricktracker_individual_minifigure_parts"."color" IS NOT DISTINCT FROM :color
AND "bricktracker_individual_minifigure_parts"."damaged" > 0
) AS "damaged_figures"
GROUP BY "figure"
)
{% endblock %}
{% block group %}
GROUP BY
"rebrickable_minifigures"."figure"
"combined"."figure"
{% endblock %}
@@ -1,5 +1,5 @@
{% extends 'minifigure/base/base.sql' %}
{% block where %}
WHERE "bricktracker_minifigures"."id" IS NOT DISTINCT FROM :id
WHERE "combined"."id" IS NOT DISTINCT FROM :id AND "combined"."source_type" = 'set'
{% endblock %}
@@ -1,21 +1,40 @@
{% extends 'minifigure/base/base.sql' %}
{% block total_missing %}
SUM("bricktracker_parts"."missing") AS "total_missing",
SUM("parts_combined"."missing") AS "total_missing",
{% endblock %}
{% block total_damaged %}
SUM("bricktracker_parts"."damaged") AS "total_damaged",
SUM("parts_combined"."damaged") AS "total_damaged",
{% endblock %}
{% block join %}
LEFT JOIN "bricktracker_parts"
ON "bricktracker_minifigures"."id" IS NOT DISTINCT FROM "bricktracker_parts"."id"
AND "rebrickable_minifigures"."figure" IS NOT DISTINCT FROM "bricktracker_parts"."figure"
-- Join with parts from both set-based and individual minifigures
LEFT JOIN (
SELECT
"bricktracker_parts"."id",
"bricktracker_parts"."figure",
"bricktracker_parts"."missing",
"bricktracker_parts"."damaged"
FROM "bricktracker_parts"
UNION ALL
SELECT
"bricktracker_individual_minifigure_parts"."id",
"bricktracker_individual_minifigures"."figure",
"bricktracker_individual_minifigure_parts"."missing",
"bricktracker_individual_minifigure_parts"."damaged"
FROM "bricktracker_individual_minifigure_parts"
INNER JOIN "bricktracker_individual_minifigures"
ON "bricktracker_individual_minifigure_parts"."id" = "bricktracker_individual_minifigures"."id"
) AS "parts_combined"
ON "combined"."id" IS NOT DISTINCT FROM "parts_combined"."id"
AND "combined"."figure" IS NOT DISTINCT FROM "parts_combined"."figure"
{% endblock %}
{% block group %}
GROUP BY
"rebrickable_minifigures"."figure",
"bricktracker_minifigures"."id"
"combined"."figure",
"combined"."id"
{% endblock %}
@@ -1,28 +1,59 @@
{% extends 'minifigure/base/base.sql' %}
{% block total_missing %}
SUM("bricktracker_parts"."missing") AS "total_missing",
SUM("parts_combined"."missing") AS "total_missing",
{% endblock %}
{% block join %}
LEFT JOIN "bricktracker_parts"
ON "bricktracker_minifigures"."id" IS NOT DISTINCT FROM "bricktracker_parts"."id"
AND "rebrickable_minifigures"."figure" IS NOT DISTINCT FROM "bricktracker_parts"."figure"
-- Join with parts from both set-based and individual minifigures
LEFT JOIN (
SELECT
"bricktracker_parts"."id",
"bricktracker_parts"."figure",
"bricktracker_parts"."missing"
FROM "bricktracker_parts"
UNION ALL
SELECT
"bricktracker_individual_minifigure_parts"."id",
"bricktracker_individual_minifigures"."figure",
"bricktracker_individual_minifigure_parts"."missing"
FROM "bricktracker_individual_minifigure_parts"
INNER JOIN "bricktracker_individual_minifigures"
ON "bricktracker_individual_minifigure_parts"."id" = "bricktracker_individual_minifigures"."id"
) AS "parts_combined"
ON "combined"."id" IS NOT DISTINCT FROM "parts_combined"."id"
AND "combined"."figure" IS NOT DISTINCT FROM "parts_combined"."figure"
{% endblock %}
{% block where %}
WHERE "rebrickable_minifigures"."figure" IN (
SELECT "bricktracker_parts"."figure"
FROM "bricktracker_parts"
WHERE "bricktracker_parts"."part" IS NOT DISTINCT FROM :part
AND "bricktracker_parts"."color" IS NOT DISTINCT FROM :color
AND "bricktracker_parts"."figure" IS NOT NULL
AND "bricktracker_parts"."missing" > 0
GROUP BY "bricktracker_parts"."figure"
WHERE "combined"."figure" IN (
-- Find figures with missing parts from both sources
SELECT "figure"
FROM (
SELECT "bricktracker_parts"."figure"
FROM "bricktracker_parts"
WHERE "bricktracker_parts"."part" IS NOT DISTINCT FROM :part
AND "bricktracker_parts"."color" IS NOT DISTINCT FROM :color
AND "bricktracker_parts"."figure" IS NOT NULL
AND "bricktracker_parts"."missing" > 0
UNION
SELECT "bricktracker_individual_minifigures"."figure"
FROM "bricktracker_individual_minifigure_parts"
INNER JOIN "bricktracker_individual_minifigures"
ON "bricktracker_individual_minifigure_parts"."id" = "bricktracker_individual_minifigures"."id"
WHERE "bricktracker_individual_minifigure_parts"."part" IS NOT DISTINCT FROM :part
AND "bricktracker_individual_minifigure_parts"."color" IS NOT DISTINCT FROM :color
AND "bricktracker_individual_minifigure_parts"."missing" > 0
) AS "missing_figures"
GROUP BY "figure"
)
{% endblock %}
{% block group %}
GROUP BY
"rebrickable_minifigures"."figure"
"combined"."figure"
{% endblock %}
@@ -1,21 +1,34 @@
{% extends 'minifigure/base/base.sql' %}
{% block total_quantity %}
SUM("bricktracker_minifigures"."quantity") AS "total_quantity",
SUM("combined"."quantity") AS "total_quantity",
{% endblock %}
{% block where %}
WHERE "rebrickable_minifigures"."figure" IN (
SELECT "bricktracker_parts"."figure"
FROM "bricktracker_parts"
WHERE "bricktracker_parts"."part" IS NOT DISTINCT FROM :part
AND "bricktracker_parts"."color" IS NOT DISTINCT FROM :color
AND "bricktracker_parts"."figure" IS NOT NULL
GROUP BY "bricktracker_parts"."figure"
WHERE "combined"."figure" IN (
-- Find figures from both set-based and individual minifigure parts
SELECT "figure"
FROM (
SELECT "bricktracker_parts"."figure"
FROM "bricktracker_parts"
WHERE "bricktracker_parts"."part" IS NOT DISTINCT FROM :part
AND "bricktracker_parts"."color" IS NOT DISTINCT FROM :color
AND "bricktracker_parts"."figure" IS NOT NULL
UNION
SELECT "bricktracker_individual_minifigures"."figure"
FROM "bricktracker_individual_minifigure_parts"
INNER JOIN "bricktracker_individual_minifigures"
ON "bricktracker_individual_minifigure_parts"."id" = "bricktracker_individual_minifigures"."id"
WHERE "bricktracker_individual_minifigure_parts"."part" IS NOT DISTINCT FROM :part
AND "bricktracker_individual_minifigure_parts"."color" IS NOT DISTINCT FROM :color
) AS "parts_figures"
GROUP BY "figure"
)
{% endblock %}
{% block group %}
GROUP BY
"rebrickable_minifigures"."figure"
"combined"."figure"
{% endblock %}
@@ -9,16 +9,22 @@ IFNULL("problem_join"."total_damaged", 0) AS "total_damaged",
{% endblock %}
{% block total_quantity %}
SUM(IFNULL("bricktracker_minifigures"."quantity", 0)) AS "total_quantity",
SUM(IFNULL("combined"."quantity", 0)) AS "total_quantity",
{% endblock %}
{% block total_sets %}
IFNULL(COUNT(DISTINCT "bricktracker_minifigures"."id"), 0) AS "total_sets"
IFNULL(COUNT(DISTINCT "combined"."id"), 0) AS "total_sets",
{% endblock %}
{% block total_individual %}
IFNULL(COUNT(DISTINCT "combined"."id"), 0) AS "total_individual"
{% endblock %}
{% block join %}
-- LEFT JOIN a pre-aggregated subquery to avoid skewing the totals
-- Combine parts from both set-based and individual minifigures
LEFT JOIN (
-- Set-based minifigure parts
SELECT
"bricktracker_parts"."figure",
SUM("bricktracker_parts"."missing") AS "total_missing",
@@ -26,15 +32,27 @@ LEFT JOIN (
FROM "bricktracker_parts"
WHERE "bricktracker_parts"."figure" IS NOT DISTINCT FROM :figure
GROUP BY "bricktracker_parts"."figure"
UNION ALL
-- Individual minifigure parts
SELECT
"bricktracker_individual_minifigures"."figure",
SUM("bricktracker_individual_minifigure_parts"."missing") AS "total_missing",
SUM("bricktracker_individual_minifigure_parts"."damaged") AS "total_damaged"
FROM "bricktracker_individual_minifigure_parts"
INNER JOIN "bricktracker_individual_minifigures" ON "bricktracker_individual_minifigure_parts"."id" = "bricktracker_individual_minifigures"."id"
WHERE "bricktracker_individual_minifigures"."figure" IS NOT DISTINCT FROM :figure
GROUP BY "bricktracker_individual_minifigures"."figure"
) "problem_join"
ON "rebrickable_minifigures"."figure" IS NOT DISTINCT FROM "problem_join"."figure"
ON "combined"."figure" IS NOT DISTINCT FROM "problem_join"."figure"
{% endblock %}
{% block where %}
WHERE "rebrickable_minifigures"."figure" IS NOT DISTINCT FROM :figure
WHERE "combined"."figure" IS NOT DISTINCT FROM :figure
{% endblock %}
{% block group %}
GROUP BY
"rebrickable_minifigures"."figure"
"combined"."figure"
{% endblock %}
@@ -1,6 +1,7 @@
{% extends 'minifigure/base/base.sql' %}
{% block where %}
WHERE "bricktracker_minifigures"."id" IS NOT DISTINCT FROM :id
AND "rebrickable_minifigures"."figure" IS NOT DISTINCT FROM :figure
WHERE "combined"."id" IS NOT DISTINCT FROM :id
AND "combined"."figure" IS NOT DISTINCT FROM :figure
AND "combined"."source_type" = 'set'
{% endblock %}
@@ -1,16 +1,14 @@
SELECT
-"bricktracker_parts"."id",
-"bricktracker_parts"."figure",
-"bricktracker_parts"."part",
-"bricktracker_parts"."color",
-"bricktracker_parts"."spare",
-"bricktracker_parts"."quantity",
-"bricktracker_parts"."element",
---"bricktracker_parts"."rebrickable_inventory",
-"bricktracker_parts"."missing",
-"bricktracker_parts"."damaged",
---"rebrickable_parts"."part",
---"rebrickable_parts"."color_id",
+"combined"."id",
+"combined"."figure",
+"combined"."part",
+"combined"."color",
+"combined"."spare",
+"combined"."quantity",
+"combined"."element",
+"combined"."missing",
+"combined"."damaged",
+"combined"."checked",
"rebrickable_parts"."color_name",
"rebrickable_parts"."color_rgb",
"rebrickable_parts"."color_transparent",
@@ -18,7 +16,6 @@ SELECT
"rebrickable_parts"."bricklink_color_name",
"rebrickable_parts"."bricklink_part_num",
"rebrickable_parts"."name",
---"rebrickable_parts"."category",
"rebrickable_parts"."image",
"rebrickable_parts"."image_id",
"rebrickable_parts"."url",
@@ -41,11 +38,45 @@ SELECT
{% block total_minifigures %}
NULL AS "total_minifigures" -- dummy for order: total_minifigures
{% endblock %}
-FROM "bricktracker_parts"
+FROM (
+-- Parts from set-based minifigures
+SELECT
+"bricktracker_parts"."id",
+"bricktracker_parts"."figure",
+"bricktracker_parts"."part",
+"bricktracker_parts"."color",
+"bricktracker_parts"."spare",
+"bricktracker_parts"."quantity",
+"bricktracker_parts"."element",
+"bricktracker_parts"."missing",
+"bricktracker_parts"."damaged",
+"bricktracker_parts"."checked",
+'set' AS "source_type"
+FROM "bricktracker_parts"
+UNION ALL
+-- Parts from individual minifigures
+SELECT
+"bricktracker_individual_minifigure_parts"."id",
+"bricktracker_individual_minifigures"."figure",
+"bricktracker_individual_minifigure_parts"."part",
+"bricktracker_individual_minifigure_parts"."color",
+"bricktracker_individual_minifigure_parts"."spare",
+"bricktracker_individual_minifigure_parts"."quantity",
+"bricktracker_individual_minifigure_parts"."element",
+"bricktracker_individual_minifigure_parts"."missing",
+"bricktracker_individual_minifigure_parts"."damaged",
+"bricktracker_individual_minifigure_parts"."checked",
+'individual' AS "source_type"
+FROM "bricktracker_individual_minifigure_parts"
+INNER JOIN "bricktracker_individual_minifigures"
+ON "bricktracker_individual_minifigure_parts"."id" = "bricktracker_individual_minifigures"."id"
+) AS "combined"
INNER JOIN "rebrickable_parts"
-ON "bricktracker_parts"."part" IS NOT DISTINCT FROM "rebrickable_parts"."part"
-AND "bricktracker_parts"."color" IS NOT DISTINCT FROM "rebrickable_parts"."color_id"
+ON "combined"."part" IS NOT DISTINCT FROM "rebrickable_parts"."part"
+AND "combined"."color" IS NOT DISTINCT FROM "rebrickable_parts"."color_id"
{% block join %}{% endblock %}
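The rewritten base query above merges set-based and individual minifigure parts with `UNION ALL`, tagging each branch with a `source_type` discriminator so downstream joins and `CASE` expressions can tell the two origins apart. A minimal sketch of that pattern, using a hypothetical miniature schema (not the application's full tables):

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Simplified stand-ins for the three tables referenced in the diff.
con.executescript("""
CREATE TABLE bricktracker_parts (id TEXT, figure TEXT, part TEXT, missing INT);
CREATE TABLE bricktracker_individual_minifigure_parts (id TEXT, part TEXT, missing INT);
CREATE TABLE bricktracker_individual_minifigures (id TEXT, figure TEXT);
INSERT INTO bricktracker_parts VALUES ('set1', 'fig-001', '3001', 1);
INSERT INTO bricktracker_individual_minifigures VALUES ('im1', 'fig-001');
INSERT INTO bricktracker_individual_minifigure_parts VALUES ('im1', '3001', 2);
""")

# Each UNION ALL branch emits a constant source_type column, so one
# combined rowset still remembers where every part came from.
rows = con.execute("""
SELECT id, figure, part, missing, 'set' AS source_type
  FROM bricktracker_parts
UNION ALL
SELECT p.id, m.figure, p.part, p.missing, 'individual' AS source_type
  FROM bricktracker_individual_minifigure_parts AS p
  JOIN bricktracker_individual_minifigures AS m ON p.id = m.id
""").fetchall()
print(rows)
```

Both rows survive, one per source, which is what lets `/parts/problems` report problems from individual figures alongside set parts.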
+29 -14
@@ -1,42 +1,57 @@
{% extends 'part/base/base.sql' %}
{% block total_missing %}
-SUM("bricktracker_parts"."missing") AS "total_missing",
+SUM("combined"."missing") AS "total_missing",
{% endblock %}
{% block total_damaged %}
-SUM("bricktracker_parts"."damaged") AS "total_damaged",
+SUM("combined"."damaged") AS "total_damaged",
{% endblock %}
{% block total_quantity %}
-SUM("bricktracker_parts"."quantity" * IFNULL("bricktracker_minifigures"."quantity", 1)) AS "total_quantity",
+SUM("combined"."quantity" * IFNULL("minifigure_quantities"."quantity", 1)) AS "total_quantity",
{% endblock %}
{% block total_sets %}
-IFNULL(COUNT(DISTINCT "bricktracker_parts"."id"), 0) AS "total_sets",
+IFNULL(COUNT(DISTINCT "combined"."id"), 0) AS "total_sets",
{% endblock %}
{% block total_minifigures %}
-SUM(IFNULL("bricktracker_minifigures"."quantity", 0)) AS "total_minifigures"
+SUM(IFNULL("minifigure_quantities"."quantity", 0)) AS "total_minifigures"
{% endblock %}
{% block join %}
-LEFT JOIN "bricktracker_minifigures"
-ON "bricktracker_parts"."id" IS NOT DISTINCT FROM "bricktracker_minifigures"."id"
-AND "bricktracker_parts"."figure" IS NOT DISTINCT FROM "bricktracker_minifigures"."figure"
+-- Join to get minifigure quantities from both set-based and individual minifigures
+LEFT JOIN (
+SELECT
+"bricktracker_minifigures"."id",
+"bricktracker_minifigures"."figure",
+"bricktracker_minifigures"."quantity"
+FROM "bricktracker_minifigures"
+UNION ALL
+SELECT
+"bricktracker_individual_minifigures"."id",
+"bricktracker_individual_minifigures"."figure",
+"bricktracker_individual_minifigures"."quantity"
+FROM "bricktracker_individual_minifigures"
+) AS "minifigure_quantities"
+ON "combined"."id" IS NOT DISTINCT FROM "minifigure_quantities"."id"
+AND "combined"."figure" IS NOT DISTINCT FROM "minifigure_quantities"."figure"
{% endblock %}
{% block where %}
{% set conditions = [] %}
{% if color_id and color_id != 'all' %}
-{% set _ = conditions.append('"bricktracker_parts"."color" = ' ~ color_id) %}
+{% set _ = conditions.append('"combined"."color" = ' ~ color_id) %}
{% endif %}
{% if search_query %}
-{% set search_condition = '(LOWER("rebrickable_parts"."name") LIKE LOWER(\'%' ~ search_query ~ '%\') OR LOWER("rebrickable_parts"."color_name") LIKE LOWER(\'%' ~ search_query ~ '%\') OR LOWER("bricktracker_parts"."part") LIKE LOWER(\'%' ~ search_query ~ '%\'))' %}
+{% set search_condition = '(LOWER("rebrickable_parts"."name") LIKE LOWER(\'%' ~ search_query ~ '%\') OR LOWER("rebrickable_parts"."color_name") LIKE LOWER(\'%' ~ search_query ~ '%\') OR LOWER("combined"."part") LIKE LOWER(\'%' ~ search_query ~ '%\'))' %}
{% set _ = conditions.append(search_condition) %}
{% endif %}
{% if skip_spare_parts %}
-{% set _ = conditions.append('"bricktracker_parts"."spare" = 0') %}
+{% set _ = conditions.append('"combined"."spare" = 0') %}
{% endif %}
{% if conditions %}
WHERE {{ conditions | join(' AND ') }}
@@ -45,7 +60,7 @@ WHERE {{ conditions | join(' AND ') }}
{% block group %}
GROUP BY
-"bricktracker_parts"."part",
-"bricktracker_parts"."color",
-"bricktracker_parts"."spare"
+"combined"."part",
+"combined"."color",
+"combined"."spare"
{% endblock %}
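Throughout these templates, joins use `IS NOT DISTINCT FROM` instead of `=`: a null-safe comparison that still matches when both sides are NULL (a loose part with no figure, for instance). In SQLite, which added the `IS NOT DISTINCT FROM` spelling in 3.39, the long-standing equivalent is the `IS` operator. A small sketch of the difference, using a hypothetical two-table schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE parts (id TEXT, figure TEXT);    -- figure may be NULL
CREATE TABLE figures (id TEXT, figure TEXT);
INSERT INTO parts VALUES ('a', NULL), ('b', 'fig-001');
INSERT INTO figures VALUES ('a', NULL), ('b', 'fig-001');
""")

# "=" yields NULL for NULL = NULL, so the NULL/NULL pair never joins;
# "IS" (SQLite's null-safe equality) treats two NULLs as equal.
plain = con.execute(
    "SELECT COUNT(*) FROM parts JOIN figures ON parts.figure = figures.figure"
).fetchone()[0]
nullsafe = con.execute(
    "SELECT COUNT(*) FROM parts JOIN figures ON parts.figure IS figures.figure"
).fetchone()[0]
print(plain, nullsafe)
```

The null-safe join keeps the row whose `figure` is NULL on both sides, which is why the templates can bind `:figure` to NULL and still match.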
+64 -24
@@ -2,73 +2,113 @@
{% block total_missing %}
{% if owner_id and owner_id != 'all' %}
-SUM(CASE WHEN "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "bricktracker_parts"."missing" ELSE 0 END) AS "total_missing",
+SUM(CASE
+WHEN "combined"."source_type" = 'set' AND "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "combined"."missing"
+WHEN "combined"."source_type" = 'individual' AND "bricktracker_individual_minifigure_owners"."owner_{{ owner_id }}" = 1 THEN "combined"."missing"
+ELSE 0
+END) AS "total_missing",
{% else %}
-SUM("bricktracker_parts"."missing") AS "total_missing",
+SUM("combined"."missing") AS "total_missing",
{% endif %}
{% endblock %}
{% block total_damaged %}
{% if owner_id and owner_id != 'all' %}
-SUM(CASE WHEN "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "bricktracker_parts"."damaged" ELSE 0 END) AS "total_damaged",
+SUM(CASE
+WHEN "combined"."source_type" = 'set' AND "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "combined"."damaged"
+WHEN "combined"."source_type" = 'individual' AND "bricktracker_individual_minifigure_owners"."owner_{{ owner_id }}" = 1 THEN "combined"."damaged"
+ELSE 0
+END) AS "total_damaged",
{% else %}
-SUM("bricktracker_parts"."damaged") AS "total_damaged",
+SUM("combined"."damaged") AS "total_damaged",
{% endif %}
{% endblock %}
{% block total_quantity %}
{% if owner_id and owner_id != 'all' %}
-SUM(CASE WHEN "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "bricktracker_parts"."quantity" * IFNULL("bricktracker_minifigures"."quantity", 1) ELSE 0 END) AS "total_quantity",
+SUM(CASE
+WHEN "combined"."source_type" = 'set' AND "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "combined"."quantity" * IFNULL("bricktracker_minifigures"."quantity", 1)
+WHEN "combined"."source_type" = 'individual' AND "bricktracker_individual_minifigure_owners"."owner_{{ owner_id }}" = 1 THEN "combined"."quantity"
+ELSE 0
+END) AS "total_quantity",
{% else %}
-SUM("bricktracker_parts"."quantity" * IFNULL("bricktracker_minifigures"."quantity", 1)) AS "total_quantity",
+SUM(CASE
+WHEN "combined"."source_type" = 'set' THEN "combined"."quantity" * IFNULL("bricktracker_minifigures"."quantity", 1)
+ELSE "combined"."quantity"
+END) AS "total_quantity",
{% endif %}
{% endblock %}
{% block total_sets %}
{% if owner_id and owner_id != 'all' %}
-COUNT(DISTINCT CASE WHEN "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "bricktracker_parts"."id" ELSE NULL END) AS "total_sets",
+COUNT(DISTINCT CASE
+WHEN "combined"."source_type" = 'set' AND "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "combined"."id"
+ELSE NULL
+END) AS "total_sets",
{% else %}
-COUNT(DISTINCT "bricktracker_parts"."id") AS "total_sets",
+COUNT(DISTINCT CASE WHEN "combined"."source_type" = 'set' THEN "combined"."id" ELSE NULL END) AS "total_sets",
{% endif %}
{% endblock %}
{% block total_minifigures %}
{% if owner_id and owner_id != 'all' %}
-SUM(CASE WHEN "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN IFNULL("bricktracker_minifigures"."quantity", 0) ELSE 0 END) AS "total_minifigures"
+SUM(CASE
+WHEN "combined"."source_type" = 'set' AND "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN IFNULL("bricktracker_minifigures"."quantity", 0)
+WHEN "combined"."source_type" = 'individual' AND "bricktracker_individual_minifigure_owners"."owner_{{ owner_id }}" = 1 THEN 1
+ELSE 0
+END) AS "total_minifigures"
{% else %}
-SUM(IFNULL("bricktracker_minifigures"."quantity", 0)) AS "total_minifigures"
+SUM(CASE
+WHEN "combined"."source_type" = 'set' THEN IFNULL("bricktracker_minifigures"."quantity", 0)
+WHEN "combined"."source_type" = 'individual' THEN 1
+ELSE 0
+END) AS "total_minifigures"
{% endif %}
{% endblock %}
{% block join %}
--- Join with sets to get owner information
-INNER JOIN "bricktracker_sets"
-ON "bricktracker_parts"."id" IS NOT DISTINCT FROM "bricktracker_sets"."id"
+-- Left join with sets (for set-based parts)
+LEFT JOIN "bricktracker_sets"
+ON "combined"."source_type" = 'set'
+AND "combined"."id" IS NOT DISTINCT FROM "bricktracker_sets"."id"
-- Left join with set owners (using dynamic columns)
LEFT JOIN "bricktracker_set_owners"
-ON "bricktracker_sets"."id" IS NOT DISTINCT FROM "bricktracker_set_owners"."id"
+ON "combined"."source_type" = 'set'
+AND "bricktracker_sets"."id" IS NOT DISTINCT FROM "bricktracker_set_owners"."id"
--- Left join with minifigures
+-- Left join with set-based minifigures
LEFT JOIN "bricktracker_minifigures"
-ON "bricktracker_parts"."id" IS NOT DISTINCT FROM "bricktracker_minifigures"."id"
-AND "bricktracker_parts"."figure" IS NOT DISTINCT FROM "bricktracker_minifigures"."figure"
+ON "combined"."source_type" = 'set'
+AND "combined"."id" IS NOT DISTINCT FROM "bricktracker_minifigures"."id"
+AND "combined"."figure" IS NOT DISTINCT FROM "bricktracker_minifigures"."figure"
+-- Left join with individual minifigures (for individual parts)
+LEFT JOIN "bricktracker_individual_minifigures"
+ON "combined"."source_type" = 'individual'
+AND "combined"."id" IS NOT DISTINCT FROM "bricktracker_individual_minifigures"."id"
+-- Left join with individual minifigure owners (using dynamic columns)
+LEFT JOIN "bricktracker_individual_minifigure_owners"
+ON "combined"."source_type" = 'individual'
+AND "bricktracker_individual_minifigures"."id" IS NOT DISTINCT FROM "bricktracker_individual_minifigure_owners"."id"
{% endblock %}
{% block where %}
{% set conditions = [] %}
{% if owner_id and owner_id != 'all' %}
-{% set _ = conditions.append('"bricktracker_set_owners"."owner_' ~ owner_id ~ '" = 1') %}
+{% set owner_condition = '(("combined"."source_type" = \'set\' AND "bricktracker_set_owners"."owner_' ~ owner_id ~ '" = 1) OR ("combined"."source_type" = \'individual\' AND "bricktracker_individual_minifigure_owners"."owner_' ~ owner_id ~ '" = 1))' %}
+{% set _ = conditions.append(owner_condition) %}
{% endif %}
{% if color_id and color_id != 'all' %}
-{% set _ = conditions.append('"bricktracker_parts"."color" = ' ~ color_id) %}
+{% set _ = conditions.append('"combined"."color" = ' ~ color_id) %}
{% endif %}
{% if search_query %}
-{% set search_condition = '(LOWER("rebrickable_parts"."name") LIKE LOWER(\'%' ~ search_query ~ '%\') OR LOWER("rebrickable_parts"."color_name") LIKE LOWER(\'%' ~ search_query ~ '%\') OR LOWER("bricktracker_parts"."part") LIKE LOWER(\'%' ~ search_query ~ '%\'))' %}
+{% set search_condition = '(LOWER("rebrickable_parts"."name") LIKE LOWER(\'%' ~ search_query ~ '%\') OR LOWER("rebrickable_parts"."color_name") LIKE LOWER(\'%' ~ search_query ~ '%\') OR LOWER("combined"."part") LIKE LOWER(\'%' ~ search_query ~ '%\'))' %}
{% set _ = conditions.append(search_condition) %}
{% endif %}
{% if skip_spare_parts %}
-{% set _ = conditions.append('"bricktracker_parts"."spare" = 0') %}
+{% set _ = conditions.append('"combined"."spare" = 0') %}
{% endif %}
{% if conditions %}
WHERE {{ conditions | join(' AND ') }}
@@ -77,7 +117,7 @@ WHERE {{ conditions | join(' AND ') }}
{% block group %}
GROUP BY
-"bricktracker_parts"."part",
-"bricktracker_parts"."color",
-"bricktracker_parts"."spare"
+"combined"."part",
+"combined"."color",
+"combined"."spare"
{% endblock %}
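The owner-filtered totals above lean on conditional aggregation: a single pass over the combined rowset, with `SUM(CASE WHEN ...)` picking which rows count toward each total depending on `source_type` and the dynamic owner flag. A stripped-down sketch (the owner flag is flattened into one table here for brevity; the real query reads it from the joined owner tables):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE combined (source_type TEXT, owner_1 INT, missing INT);
INSERT INTO combined VALUES
  ('set', 1, 2), ('set', 0, 5), ('individual', 1, 3);
""")

# Only rows whose owner flag is set contribute; everything else adds 0,
# so unowned copies never inflate the per-owner total.
total = con.execute("""
SELECT SUM(CASE
             WHEN source_type = 'set' AND owner_1 = 1 THEN missing
             WHEN source_type = 'individual' AND owner_1 = 1 THEN missing
             ELSE 0
           END)
FROM combined
""").fetchone()[0]
print(total)  # 5: the unowned set copy's 5 missing parts are excluded
```

Because the filter lives inside the aggregate rather than the WHERE clause, one GROUP BY row can carry several owner-specific totals side by side.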
+83 -16
@@ -1,21 +1,88 @@
-{% extends 'part/base/base.sql' %}
-{% block total_missing %}
-SUM("bricktracker_parts"."missing") AS "total_missing",
-{% endblock %}
-{% block total_damaged %}
-SUM("bricktracker_parts"."damaged") AS "total_damaged",
-{% endblock %}
-{% block where %}
-WHERE "bricktracker_parts"."figure" IS NOT DISTINCT FROM :figure
-{% endblock %}
-{% block group %}
-GROUP BY
-"bricktracker_parts"."part",
-"bricktracker_parts"."color",
-"bricktracker_parts"."spare"
-{% endblock %}
+-- Query parts from both set-based and individual minifigures
+SELECT
+"parts_combined"."id",
+"parts_combined"."figure",
+"parts_combined"."part",
+"parts_combined"."color",
+"parts_combined"."spare",
+SUM("parts_combined"."quantity") AS "quantity",
+"parts_combined"."element",
+SUM("parts_combined"."missing") AS "total_missing",
+SUM("parts_combined"."damaged") AS "total_damaged",
+MAX("parts_combined"."checked") AS "checked",
+"rebrickable_parts"."color_name",
+"rebrickable_parts"."color_rgb",
+"rebrickable_parts"."color_transparent",
+"rebrickable_parts"."bricklink_color_id",
+"rebrickable_parts"."bricklink_color_name",
+"rebrickable_parts"."bricklink_part_num",
+"rebrickable_parts"."name",
+"rebrickable_parts"."image",
+"rebrickable_parts"."image_id",
+"rebrickable_parts"."url",
+"rebrickable_parts"."print",
+NULL AS "total_quantity",
+NULL AS "total_spare",
+NULL AS "total_sets",
+NULL AS "total_minifigures"
+FROM (
+-- Set-based minifigure parts
+SELECT
+"bricktracker_parts"."id",
+"bricktracker_parts"."figure",
+"bricktracker_parts"."part",
+"bricktracker_parts"."color",
+"bricktracker_parts"."spare",
+"bricktracker_parts"."quantity",
+"bricktracker_parts"."element",
+"bricktracker_parts"."missing",
+"bricktracker_parts"."damaged",
+"bricktracker_parts"."checked"
+FROM "bricktracker_parts"
+WHERE "bricktracker_parts"."figure" IS NOT DISTINCT FROM :figure
+UNION ALL
+-- Individual minifigure parts
+SELECT
+"bricktracker_individual_minifigure_parts"."id",
+"bricktracker_individual_minifigures"."figure",
+"bricktracker_individual_minifigure_parts"."part",
+"bricktracker_individual_minifigure_parts"."color",
+"bricktracker_individual_minifigure_parts"."spare",
+"bricktracker_individual_minifigure_parts"."quantity",
+"bricktracker_individual_minifigure_parts"."element",
+"bricktracker_individual_minifigure_parts"."missing",
+"bricktracker_individual_minifigure_parts"."damaged",
+"bricktracker_individual_minifigure_parts"."checked"
+FROM "bricktracker_individual_minifigure_parts"
+INNER JOIN "bricktracker_individual_minifigures"
+ON "bricktracker_individual_minifigure_parts"."id" = "bricktracker_individual_minifigures"."id"
+WHERE "bricktracker_individual_minifigures"."figure" IS NOT DISTINCT FROM :figure
+) AS "parts_combined"
+INNER JOIN "rebrickable_parts"
+ON "parts_combined"."part" = "rebrickable_parts"."part"
+AND "parts_combined"."color" = "rebrickable_parts"."color_id"
+GROUP BY
+"parts_combined"."part",
+"parts_combined"."color",
+"parts_combined"."spare",
+"parts_combined"."element",
+"rebrickable_parts"."color_name",
+"rebrickable_parts"."color_rgb",
+"rebrickable_parts"."color_transparent",
+"rebrickable_parts"."bricklink_color_id",
+"rebrickable_parts"."bricklink_color_name",
+"rebrickable_parts"."bricklink_part_num",
+"rebrickable_parts"."name",
+"rebrickable_parts"."image",
+"rebrickable_parts"."image_id",
+"rebrickable_parts"."url",
+"rebrickable_parts"."print"
+{% if order %}
+-- Replace combined/bricktracker_parts references with parts_combined for this query
+ORDER BY {{ order | replace('"combined"', '"parts_combined"') | replace('"bricktracker_parts"', '"parts_combined"') }}
+{% endif %}
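The `ORDER BY` line above rewrites whatever `order` string the application passes in, because this standalone query aliases its rowset `parts_combined` while the shared templates use `combined` or `bricktracker_parts`. Jinja's `replace` filter is plain substring substitution, so the rewrite can be mimicked with `str.replace` (the sample `order` value below is hypothetical):

```python
# A sample order string as another template might produce it.
order = '"combined"."part" ASC, "bricktracker_parts"."color" DESC'

# Chain the two substitutions exactly as the template's filters do.
rewritten = (order
             .replace('"combined"', '"parts_combined"')
             .replace('"bricktracker_parts"', '"parts_combined"'))
print(rewritten)  # "parts_combined"."part" ASC, "parts_combined"."color" DESC
```

Keeping the identifiers quoted makes the substitution safe against partial matches on similarly named aliases.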
+4 -4
@@ -7,12 +7,12 @@
{% block where %}
WHERE "rebrickable_parts"."print" IS NOT DISTINCT FROM :print
-AND "bricktracker_parts"."color" IS NOT DISTINCT FROM :color
-AND "bricktracker_parts"."part" IS DISTINCT FROM :part
+AND "combined"."color" IS NOT DISTINCT FROM :color
+AND "combined"."part" IS DISTINCT FROM :part
{% endblock %}
{% block group %}
GROUP BY
-"bricktracker_parts"."part",
-"bricktracker_parts"."color"
+"combined"."part",
+"combined"."color"
{% endblock %}
+63 -24
@@ -2,82 +2,121 @@
{% block total_missing %}
{% if owner_id and owner_id != 'all' %}
-SUM(CASE WHEN "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "bricktracker_parts"."missing" ELSE 0 END) AS "total_missing",
+SUM(CASE
+WHEN "combined"."source_type" = 'set' AND "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "combined"."missing"
+WHEN "combined"."source_type" = 'individual' AND "bricktracker_individual_minifigure_owners"."owner_{{ owner_id }}" = 1 THEN "combined"."missing"
+ELSE 0
+END) AS "total_missing",
{% else %}
-SUM("bricktracker_parts"."missing") AS "total_missing",
+SUM("combined"."missing") AS "total_missing",
{% endif %}
{% endblock %}
{% block total_damaged %}
{% if owner_id and owner_id != 'all' %}
-SUM(CASE WHEN "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "bricktracker_parts"."damaged" ELSE 0 END) AS "total_damaged",
+SUM(CASE
+WHEN "combined"."source_type" = 'set' AND "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "combined"."damaged"
+WHEN "combined"."source_type" = 'individual' AND "bricktracker_individual_minifigure_owners"."owner_{{ owner_id }}" = 1 THEN "combined"."damaged"
+ELSE 0
+END) AS "total_damaged",
{% else %}
-SUM("bricktracker_parts"."damaged") AS "total_damaged",
+SUM("combined"."damaged") AS "total_damaged",
{% endif %}
{% endblock %}
{% block total_quantity %}
{% if owner_id and owner_id != 'all' %}
-SUM(CASE WHEN "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "bricktracker_parts"."quantity" * IFNULL("bricktracker_minifigures"."quantity", 1) ELSE 0 END) AS "total_quantity",
+SUM(CASE
+WHEN "combined"."source_type" = 'set' AND "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "combined"."quantity" * IFNULL("bricktracker_minifigures"."quantity", 1)
+WHEN "combined"."source_type" = 'individual' AND "bricktracker_individual_minifigure_owners"."owner_{{ owner_id }}" = 1 THEN "combined"."quantity" * IFNULL("bricktracker_individual_minifigures"."quantity", 1)
+ELSE 0
+END) AS "total_quantity",
{% else %}
-SUM("bricktracker_parts"."quantity" * IFNULL("bricktracker_minifigures"."quantity", 1)) AS "total_quantity",
+SUM(CASE
+WHEN "combined"."source_type" = 'set' THEN "combined"."quantity" * IFNULL("bricktracker_minifigures"."quantity", 1)
+WHEN "combined"."source_type" = 'individual' THEN "combined"."quantity" * IFNULL("bricktracker_individual_minifigures"."quantity", 1)
+ELSE "combined"."quantity"
+END) AS "total_quantity",
{% endif %}
{% endblock %}
{% block total_sets %}
{% if owner_id and owner_id != 'all' %}
-COUNT(DISTINCT CASE WHEN "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "bricktracker_parts"."id" ELSE NULL END) AS "total_sets",
+COUNT(DISTINCT CASE
+WHEN "combined"."source_type" = 'set' AND "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN "combined"."id"
+ELSE NULL
+END) AS "total_sets",
{% else %}
-COUNT(DISTINCT "bricktracker_parts"."id") AS "total_sets",
+COUNT(DISTINCT CASE WHEN "combined"."source_type" = 'set' THEN "combined"."id" ELSE NULL END) AS "total_sets",
{% endif %}
{% endblock %}
{% block total_minifigures %}
{% if owner_id and owner_id != 'all' %}
-SUM(CASE WHEN "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN IFNULL("bricktracker_minifigures"."quantity", 0) ELSE 0 END) AS "total_minifigures"
+SUM(CASE
+WHEN "combined"."source_type" = 'set' AND "bricktracker_set_owners"."owner_{{ owner_id }}" = 1 THEN IFNULL("bricktracker_minifigures"."quantity", 0)
+WHEN "combined"."source_type" = 'individual' AND "bricktracker_individual_minifigure_owners"."owner_{{ owner_id }}" = 1 THEN IFNULL("bricktracker_individual_minifigures"."quantity", 0)
+ELSE 0
+END) AS "total_minifigures"
{% else %}
-SUM(IFNULL("bricktracker_minifigures"."quantity", 0)) AS "total_minifigures"
+SUM(CASE
+WHEN "combined"."source_type" = 'set' THEN IFNULL("bricktracker_minifigures"."quantity", 0)
+WHEN "combined"."source_type" = 'individual' THEN IFNULL("bricktracker_individual_minifigures"."quantity", 0)
+ELSE 0
+END) AS "total_minifigures"
{% endif %}
{% endblock %}
{% block join %}
--- Join with sets to get owner information
-INNER JOIN "bricktracker_sets"
-ON "bricktracker_parts"."id" IS NOT DISTINCT FROM "bricktracker_sets"."id"
+-- Left join with sets for set-based parts
+LEFT JOIN "bricktracker_sets"
+ON "combined"."source_type" = 'set'
+AND "combined"."id" IS NOT DISTINCT FROM "bricktracker_sets"."id"
-- Left join with set owners (using dynamic columns)
LEFT JOIN "bricktracker_set_owners"
ON "bricktracker_sets"."id" IS NOT DISTINCT FROM "bricktracker_set_owners"."id"
--- Left join with minifigures
+-- Left join with set-based minifigures
LEFT JOIN "bricktracker_minifigures"
-ON "bricktracker_parts"."id" IS NOT DISTINCT FROM "bricktracker_minifigures"."id"
-AND "bricktracker_parts"."figure" IS NOT DISTINCT FROM "bricktracker_minifigures"."figure"
+ON "combined"."source_type" = 'set'
+AND "combined"."id" IS NOT DISTINCT FROM "bricktracker_minifigures"."id"
+AND "combined"."figure" IS NOT DISTINCT FROM "bricktracker_minifigures"."figure"
+-- Left join with individual minifigures
+LEFT JOIN "bricktracker_individual_minifigures"
+ON "combined"."source_type" = 'individual'
+AND "combined"."id" IS NOT DISTINCT FROM "bricktracker_individual_minifigures"."id"
+-- Left join with individual minifigure owners
+LEFT JOIN "bricktracker_individual_minifigure_owners"
+ON "bricktracker_individual_minifigures"."id" IS NOT DISTINCT FROM "bricktracker_individual_minifigure_owners"."id"
{% endblock %}
{% block where %}
{% set conditions = [] %}
-- Always filter for problematic parts
-{% set _ = conditions.append('("bricktracker_parts"."missing" > 0 OR "bricktracker_parts"."damaged" > 0)') %}
+{% set _ = conditions.append('("combined"."missing" > 0 OR "combined"."damaged" > 0)') %}
{% if owner_id and owner_id != 'all' %}
-{% set _ = conditions.append('"bricktracker_set_owners"."owner_' ~ owner_id ~ '" = 1') %}
+{% set owner_condition = '(("combined"."source_type" = \'set\' AND "bricktracker_set_owners"."owner_' ~ owner_id ~ '" = 1) OR ("combined"."source_type" = \'individual\' AND "bricktracker_individual_minifigure_owners"."owner_' ~ owner_id ~ '" = 1))' %}
+{% set _ = conditions.append(owner_condition) %}
{% endif %}
{% if color_id and color_id != 'all' %}
-{% set _ = conditions.append('"bricktracker_parts"."color" = ' ~ color_id) %}
+{% set _ = conditions.append('"combined"."color" = ' ~ color_id) %}
{% endif %}
{% if search_query %}
-{% set search_condition = '(LOWER("rebrickable_parts"."name") LIKE LOWER(\'%' ~ search_query ~ '%\') OR LOWER("rebrickable_parts"."color_name") LIKE LOWER(\'%' ~ search_query ~ '%\') OR LOWER("bricktracker_parts"."part") LIKE LOWER(\'%' ~ search_query ~ '%\'))' %}
+{% set search_condition = '(LOWER("rebrickable_parts"."name") LIKE LOWER(\'%' ~ search_query ~ '%\') OR LOWER("rebrickable_parts"."color_name") LIKE LOWER(\'%' ~ search_query ~ '%\') OR LOWER("combined"."part") LIKE LOWER(\'%' ~ search_query ~ '%\'))' %}
{% set _ = conditions.append(search_condition) %}
{% endif %}
{% if skip_spare_parts %}
-{% set _ = conditions.append('"bricktracker_parts"."spare" = 0') %}
+{% set _ = conditions.append('"combined"."spare" = 0') %}
{% endif %}
WHERE {{ conditions | join(' AND ') }}
{% endblock %}
{% block group %}
GROUP BY
-"bricktracker_parts"."part",
-"bricktracker_parts"."color",
-"bricktracker_parts"."spare"
+"combined"."part",
+"combined"."color",
+"combined"."spare"
{% endblock %}
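The `{% block where %}` sections above all follow the same Jinja pattern: accumulate optional predicates in a `conditions` list, then join them with `AND`. A Python sketch of the equivalent logic (argument values here are illustrative placeholders, not the application's real filters):

```python
# Mirrors the template's conditions-list pattern: the problems filter is
# always present, the others are appended only when their filter is active.
def build_where(owner_id=None, color_id=None, skip_spare_parts=False):
    conditions = ['("combined"."missing" > 0 OR "combined"."damaged" > 0)']
    if owner_id and owner_id != 'all':
        conditions.append(f'"bricktracker_set_owners"."owner_{owner_id}" = 1')
    if color_id and color_id != 'all':
        conditions.append(f'"combined"."color" = {color_id}')
    if skip_spare_parts:
        conditions.append('"combined"."spare" = 0')
    return 'WHERE ' + ' AND '.join(conditions)

print(build_where(color_id='5', skip_spare_parts=True))
```

Because at least one condition is always present, the `WHERE` keyword can be emitted unconditionally, unlike the list templates that guard it with `{% if conditions %}`.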
+4 -4
@@ -2,14 +2,14 @@
{% extends 'part/base/base.sql' %}
{% block total_missing %}
-IFNULL("bricktracker_parts"."missing", 0) AS "total_missing",
+IFNULL("combined"."missing", 0) AS "total_missing",
{% endblock %}
{% block total_damaged %}
-IFNULL("bricktracker_parts"."damaged", 0) AS "total_damaged",
+IFNULL("combined"."damaged", 0) AS "total_damaged",
{% endblock %}
{% block where %}
-WHERE "bricktracker_parts"."id" IS NOT DISTINCT FROM :id
-AND "bricktracker_parts"."figure" IS NOT DISTINCT FROM :figure
+WHERE "combined"."id" IS NOT DISTINCT FROM :id
+AND "combined"."figure" IS NOT DISTINCT FROM :figure
{% endblock %}
@@ -6,12 +6,12 @@
{% block total_damaged %}{% endblock %}
{% block where %}
-WHERE "bricktracker_parts"."color" IS DISTINCT FROM :color
-AND "bricktracker_parts"."part" IS NOT DISTINCT FROM :part
+WHERE "combined"."color" IS DISTINCT FROM :color
+AND "combined"."part" IS NOT DISTINCT FROM :part
{% endblock %}
{% block group %}
GROUP BY
-"bricktracker_parts"."part",
-"bricktracker_parts"."color"
+"combined"."part",
+"combined"."color"
{% endblock %}
+28 -11
@@ -1,34 +1,51 @@
{% extends 'part/base/base.sql' %}
{% block total_missing %}
-SUM("bricktracker_parts"."missing") AS "total_missing",
+SUM("combined"."missing") AS "total_missing",
{% endblock %}
{% block total_damaged %}
-SUM("bricktracker_parts"."damaged") AS "total_damaged",
+SUM("combined"."damaged") AS "total_damaged",
{% endblock %}
{% block total_quantity %}
-SUM((NOT "bricktracker_parts"."spare") * "bricktracker_parts"."quantity" * IFNULL("bricktracker_minifigures"."quantity", 1)) AS "total_quantity",
+SUM((NOT "combined"."spare") * "combined"."quantity" * IFNULL("minifigure_quantities"."quantity", 1)) AS "total_quantity",
{% endblock %}
{% block total_spare %}
-SUM("bricktracker_parts"."spare" * "bricktracker_parts"."quantity" * IFNULL("bricktracker_minifigures"."quantity", 1)) AS "total_spare",
+SUM("combined"."spare" * "combined"."quantity" * IFNULL("minifigure_quantities"."quantity", 1)) AS "total_spare",
{% endblock %}
{% block join %}
-LEFT JOIN "bricktracker_minifigures"
-ON "bricktracker_parts"."id" IS NOT DISTINCT FROM "bricktracker_minifigures"."id"
-AND "bricktracker_parts"."figure" IS NOT DISTINCT FROM "bricktracker_minifigures"."figure"
+-- Join to get minifigure quantities from both set-based and individual minifigures
+LEFT JOIN (
+-- Set-based minifigure quantities
+SELECT
+"bricktracker_minifigures"."id",
+"bricktracker_minifigures"."figure",
+"bricktracker_minifigures"."quantity"
+FROM "bricktracker_minifigures"
+UNION ALL
+-- Individual minifigure quantities
+SELECT
+"bricktracker_individual_minifigures"."id",
+"bricktracker_individual_minifigures"."figure",
+"bricktracker_individual_minifigures"."quantity"
+FROM "bricktracker_individual_minifigures"
+) AS "minifigure_quantities"
+ON "combined"."id" IS NOT DISTINCT FROM "minifigure_quantities"."id"
+AND "combined"."figure" IS NOT DISTINCT FROM "minifigure_quantities"."figure"
{% endblock %}
{% block where %}
-WHERE "bricktracker_parts"."part" IS NOT DISTINCT FROM :part
-AND "bricktracker_parts"."color" IS NOT DISTINCT FROM :color
+WHERE "combined"."part" IS NOT DISTINCT FROM :part
+AND "combined"."color" IS NOT DISTINCT FROM :color
{% endblock %}
{% block group %}
GROUP BY
-"bricktracker_parts"."part",
-"bricktracker_parts"."color"
+"combined"."part",
+"combined"."color"
{% endblock %}
+10 -10
@@ -1,18 +1,18 @@
{% extends 'part/base/base.sql' %}
{% block where %}
-WHERE "bricktracker_parts"."id" IS NOT DISTINCT FROM :id
-AND "bricktracker_parts"."figure" IS NOT DISTINCT FROM :figure
-AND "bricktracker_parts"."part" IS NOT DISTINCT FROM :part
-AND "bricktracker_parts"."color" IS NOT DISTINCT FROM :color
-AND "bricktracker_parts"."spare" IS NOT DISTINCT FROM :spare
+WHERE "combined"."id" IS NOT DISTINCT FROM :id
+AND "combined"."figure" IS NOT DISTINCT FROM :figure
+AND "combined"."part" IS NOT DISTINCT FROM :part
+AND "combined"."color" IS NOT DISTINCT FROM :color
+AND "combined"."spare" IS NOT DISTINCT FROM :spare
{% endblock %}
{% block group %}
GROUP BY
-"bricktracker_parts"."id",
-"bricktracker_parts"."figure",
-"bricktracker_parts"."part",
-"bricktracker_parts"."color",
-"bricktracker_parts"."spare"
+"combined"."id",
+"combined"."figure",
+"combined"."part",
+"combined"."color",
+"combined"."spare"
{% endblock %}
+7
@@ -0,0 +1,7 @@
UPDATE "bricktracker_parts"
SET "checked" = :checked
WHERE "bricktracker_parts"."id" IS NOT DISTINCT FROM :id
AND "bricktracker_parts"."figure" IS NOT DISTINCT FROM :figure
AND "bricktracker_parts"."part" IS NOT DISTINCT FROM :part
AND "bricktracker_parts"."color" IS NOT DISTINCT FROM :color
AND "bricktracker_parts"."spare" IS NOT DISTINCT FROM :spare
@@ -0,0 +1,4 @@
SELECT COUNT(*) as count
FROM "bricktracker_sets"
INNER JOIN "rebrickable_sets" ON "bricktracker_sets"."set" = "rebrickable_sets"."set"
WHERE "rebrickable_sets"."theme_id" = {{ theme_id }}
+13 -1
@@ -8,7 +8,11 @@ AND (LOWER("rebrickable_sets"."name") LIKE LOWER('%{{ search_query }}%')
{% endif %}
{% if theme_filter %}
-AND "rebrickable_sets"."theme_id" = '{{ theme_filter }}'
+AND "rebrickable_sets"."theme_id" = {{ theme_filter }}
{% endif %}
{% if year_filter %}
AND "rebrickable_sets"."year" = {{ year_filter }}
{% endif %}
{% if storage_filter %}
@@ -66,4 +70,12 @@ AND EXISTS (
)
{% endif %}
{% endif %}
+{% if duplicate_filter %}
+AND (
+SELECT COUNT(*)
+FROM "bricktracker_sets" as "duplicate_check"
+WHERE "duplicate_check"."set" = "bricktracker_sets"."set"
+) > 1
+{% endif %}
{% endblock %}
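The `duplicate_filter` predicate keeps only rows whose set number appears on more than one owned copy, via a correlated `COUNT(*)` subquery. A runnable sketch against a hypothetical minimal `bricktracker_sets` table:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# "set" is quoted because it is an SQL keyword, as in the real schema.
con.executescript("""
CREATE TABLE bricktracker_sets (id TEXT, "set" TEXT);
INSERT INTO bricktracker_sets VALUES
  ('a', '75192-1'), ('b', '75192-1'), ('c', '10030-1');
""")

# For each outer row, the correlated subquery counts all copies of the
# same set number; only numbers owned more than once pass the > 1 test.
dupes = con.execute("""
SELECT id FROM bricktracker_sets
WHERE (SELECT COUNT(*)
         FROM bricktracker_sets AS duplicate_check
        WHERE duplicate_check."set" = bricktracker_sets."set") > 1
ORDER BY id
""").fetchall()
print(dupes)  # [('a',), ('b',)]
```

Both copies of the duplicated set match while the single copy is filtered out, which is how the sets page highlights duplicates even across consolidated sets.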
+178
@@ -0,0 +1,178 @@
SELECT
(SELECT MIN("id") FROM "bricktracker_sets" WHERE "set" = "rebrickable_sets"."set") AS "id",
"rebrickable_sets"."set",
"rebrickable_sets"."number",
"rebrickable_sets"."version",
"rebrickable_sets"."name",
"rebrickable_sets"."year",
"rebrickable_sets"."theme_id",
"rebrickable_sets"."number_of_parts",
"rebrickable_sets"."image",
"rebrickable_sets"."url",
COUNT("bricktracker_sets"."id") AS "instance_count",
IFNULL(SUM("problem_join"."total_missing"), 0) AS "total_missing",
IFNULL(SUM("problem_join"."total_damaged"), 0) AS "total_damaged",
IFNULL(MAX("minifigures_join"."total"), 0) AS "total_minifigures",
-- Keep one representative instance for display purposes
GROUP_CONCAT("bricktracker_sets"."id", '|') AS "instance_ids",
REPLACE(GROUP_CONCAT(DISTINCT "bricktracker_sets"."storage"), ',', '|') AS "storage",
MIN("bricktracker_sets"."purchase_date") AS "purchase_date",
MAX("bricktracker_sets"."purchase_date") AS "purchase_date_max",
REPLACE(GROUP_CONCAT(DISTINCT "bricktracker_sets"."purchase_location"), ',', '|') AS "purchase_location",
ROUND(AVG("bricktracker_sets"."purchase_price"), 1) AS "purchase_price"
{% block owners %}
{% if owners_dict %}
{% for column, uuid in owners_dict.items() %}
, MAX("bricktracker_set_owners"."{{ column }}") AS "{{ column }}"
{% endfor %}
{% endif %}
{% endblock %}
{% block tags %}
{% if tags_dict %}
{% for column, uuid in tags_dict.items() %}
, MAX("bricktracker_set_tags"."{{ column }}") AS "{{ column }}"
{% endfor %}
{% endif %}
{% endblock %}
{% block statuses %}
{% if statuses_dict %}
{% for column, uuid in statuses_dict.items() %}
, MAX("bricktracker_set_statuses"."{{ column }}") AS "{{ column }}"
, IFNULL(SUM("bricktracker_set_statuses"."{{ column }}"), 0) AS "{{ column }}_count"
{% endfor %}
{% endif %}
{% endblock %}
FROM "bricktracker_sets"
INNER JOIN "rebrickable_sets"
ON "bricktracker_sets"."set" IS NOT DISTINCT FROM "rebrickable_sets"."set"
-- LEFT JOIN + SELECT to avoid messing the total
LEFT JOIN (
SELECT
"bricktracker_parts"."id",
SUM("bricktracker_parts"."missing") AS "total_missing",
SUM("bricktracker_parts"."damaged") AS "total_damaged"
FROM "bricktracker_parts"
GROUP BY "bricktracker_parts"."id"
) "problem_join"
ON "bricktracker_sets"."id" IS NOT DISTINCT FROM "problem_join"."id"
-- LEFT JOIN on a pre-aggregated SELECT to avoid skewing the totals
LEFT JOIN (
SELECT
"bricktracker_minifigures"."id",
SUM("bricktracker_minifigures"."quantity") AS "total"
FROM "bricktracker_minifigures"
GROUP BY "bricktracker_minifigures"."id"
) "minifigures_join"
ON "bricktracker_sets"."id" IS NOT DISTINCT FROM "minifigures_join"."id"
{% if owners_dict %}
LEFT JOIN "bricktracker_set_owners"
ON "bricktracker_sets"."id" IS NOT DISTINCT FROM "bricktracker_set_owners"."id"
{% endif %}
{% if statuses_dict %}
LEFT JOIN "bricktracker_set_statuses"
ON "bricktracker_sets"."id" IS NOT DISTINCT FROM "bricktracker_set_statuses"."id"
{% endif %}
{% if tags_dict %}
LEFT JOIN "bricktracker_set_tags"
ON "bricktracker_sets"."id" IS NOT DISTINCT FROM "bricktracker_set_tags"."id"
{% endif %}
{% block where %}
WHERE 1=1
{% if search_query %}
AND (LOWER("rebrickable_sets"."name") LIKE LOWER('%{{ search_query }}%')
OR LOWER("rebrickable_sets"."set") LIKE LOWER('%{{ search_query }}%'))
{% endif %}
{% if theme_filter %}
AND "rebrickable_sets"."theme_id" = {{ theme_filter }}
{% endif %}
{% if year_filter %}
AND "rebrickable_sets"."year" = {{ year_filter }}
{% endif %}
{% if storage_filter %}
AND EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND bs_filter."storage" = '{{ storage_filter }}'
)
{% endif %}
{% if purchase_location_filter %}
AND EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND bs_filter."purchase_location" = '{{ purchase_location_filter }}'
)
{% endif %}
{% if status_filter %}
{% if status_filter == 'has-storage' %}
AND EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND bs_filter."storage" IS NOT NULL AND bs_filter."storage" != ''
)
{% elif status_filter == '-has-storage' %}
AND NOT EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND bs_filter."storage" IS NOT NULL AND bs_filter."storage" != ''
)
{% elif status_filter.startswith('status-') %}
AND EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
JOIN "bricktracker_set_statuses" ON bs_filter."id" = "bricktracker_set_statuses"."id"
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND "bricktracker_set_statuses"."{{ status_filter.replace('-', '_') }}" = 1
)
{% elif status_filter.startswith('-status-') %}
AND NOT EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
JOIN "bricktracker_set_statuses" ON bs_filter."id" = "bricktracker_set_statuses"."id"
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND "bricktracker_set_statuses"."{{ status_filter[1:].replace('-', '_') }}" = 1
)
{% endif %}
{% endif %}
{% endblock %}
GROUP BY "rebrickable_sets"."set"
{% if status_filter or duplicate_filter %}
HAVING 1=1
{% if status_filter %}
{% if status_filter == 'has-missing' %}
AND IFNULL(SUM("problem_join"."total_missing"), 0) > 0
{% elif status_filter == '-has-missing' %}
AND IFNULL(SUM("problem_join"."total_missing"), 0) = 0
{% elif status_filter == 'has-damaged' %}
AND IFNULL(SUM("problem_join"."total_damaged"), 0) > 0
{% elif status_filter == '-has-damaged' %}
AND IFNULL(SUM("problem_join"."total_damaged"), 0) = 0
{% endif %}
{% endif %}
{% if duplicate_filter %}
AND COUNT("bricktracker_sets"."id") > 1
{% endif %}
{% endif %}
{% if order %}
ORDER BY {{ order }}
{% endif %}
{% if limit %}
LIMIT {{ limit }}
{% endif %}
{% if offset %}
OFFSET {{ offset }}
{% endif %}
@@ -5,7 +5,7 @@ WHERE "bricktracker_sets"."id" IN (
SELECT "bricktracker_parts"."id"
FROM "bricktracker_parts"
WHERE "bricktracker_parts"."figure" IS NOT DISTINCT FROM :figure
AND "bricktracker_parts"."missing" > 0
AND "bricktracker_parts"."damaged" > 0
GROUP BY "bricktracker_parts"."id"
)
{% endblock %}
@@ -0,0 +1,87 @@
SELECT DISTINCT "rebrickable_sets"."theme_id"
FROM "bricktracker_sets"
INNER JOIN "rebrickable_sets"
ON "bricktracker_sets"."set" IS NOT DISTINCT FROM "rebrickable_sets"."set"
{% block where %}
WHERE 1=1
{% if search_query %}
AND (LOWER("rebrickable_sets"."name") LIKE LOWER('%{{ search_query }}%')
OR LOWER("rebrickable_sets"."set") LIKE LOWER('%{{ search_query }}%'))
{% endif %}
{% if storage_filter %}
AND EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND bs_filter."storage" = '{{ storage_filter }}'
)
{% endif %}
{% if purchase_location_filter %}
AND EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND bs_filter."purchase_location" = '{{ purchase_location_filter }}'
)
{% endif %}
{% if status_filter %}
{% if status_filter == 'has-storage' %}
AND EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND bs_filter."storage" IS NOT NULL AND bs_filter."storage" != ''
)
{% elif status_filter == '-has-storage' %}
AND NOT EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND bs_filter."storage" IS NOT NULL AND bs_filter."storage" != ''
)
{% elif status_filter.startswith('status-') %}
AND EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
JOIN "bricktracker_set_statuses" ON bs_filter."id" = "bricktracker_set_statuses"."id"
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND "bricktracker_set_statuses"."{{ status_filter.replace('-', '_') }}" = 1
)
{% elif status_filter.startswith('-status-') %}
AND NOT EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
JOIN "bricktracker_set_statuses" ON bs_filter."id" = "bricktracker_set_statuses"."id"
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND "bricktracker_set_statuses"."{{ status_filter[1:].replace('-', '_') }}" = 1
)
{% elif status_filter == 'has-missing' %}
AND EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
JOIN "bricktracker_parts" ON bs_filter."id" = "bricktracker_parts"."id"
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND "bricktracker_parts"."missing" > 0
)
{% elif status_filter == '-has-missing' %}
AND NOT EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
JOIN "bricktracker_parts" ON bs_filter."id" = "bricktracker_parts"."id"
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND "bricktracker_parts"."missing" > 0
)
{% elif status_filter == 'has-damaged' %}
AND EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
JOIN "bricktracker_parts" ON bs_filter."id" = "bricktracker_parts"."id"
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND "bricktracker_parts"."damaged" > 0
)
{% elif status_filter == '-has-damaged' %}
AND NOT EXISTS (
SELECT 1 FROM "bricktracker_sets" bs_filter
JOIN "bricktracker_parts" ON bs_filter."id" = "bricktracker_parts"."id"
WHERE bs_filter."set" = "rebrickable_sets"."set"
AND "bricktracker_parts"."damaged" > 0
)
{% endif %}
{% endif %}
{% endblock %}
@@ -0,0 +1,5 @@
{% extends 'set/base/full.sql' %}
{% block where %}
WHERE "bricktracker_sets"."storage" IS NULL
{% endblock %}
@@ -0,0 +1,13 @@
SELECT DISTINCT "rebrickable_sets"."year"
FROM "bricktracker_sets"
INNER JOIN "rebrickable_sets"
ON "bricktracker_sets"."set" IS NOT DISTINCT FROM "rebrickable_sets"."set"
{% block where %}
WHERE 1=1
{% if search_query %}
AND (LOWER("rebrickable_sets"."name") LIKE LOWER('%{{ search_query }}%')
OR LOWER("rebrickable_sets"."set") LIKE LOWER('%{{ search_query }}%'))
{% endif %}
{% endblock %}
@@ -7,6 +7,14 @@ ADD COLUMN "owner_{{ id }}" BOOLEAN NOT NULL DEFAULT 0;
ALTER TABLE "bricktracker_wish_owners"
ADD COLUMN "owner_{{ id }}" BOOLEAN NOT NULL DEFAULT 0;
-- Also inject into individual minifigures
ALTER TABLE "bricktracker_individual_minifigure_owners"
ADD COLUMN "owner_{{ id }}" BOOLEAN NOT NULL DEFAULT 0;
-- Also inject into individual parts
ALTER TABLE "bricktracker_individual_part_owners"
ADD COLUMN "owner_{{ id }}" BOOLEAN NOT NULL DEFAULT 0;
INSERT INTO "bricktracker_metadata_owners" (
"id",
"name"
@@ -1,12 +1,16 @@
{% extends 'set/metadata/storage/base.sql' %}
{% block total_sets %}
IFNULL(COUNT("bricktracker_sets"."id"), 0) AS "total_sets"
IFNULL(COUNT(DISTINCT "bricktracker_sets"."id"), 0) AS "total_sets",
IFNULL(COUNT(DISTINCT "bricktracker_individual_minifigures"."id"), 0) AS "total_individual_minifigures"
{% endblock %}
{% block join %}
LEFT JOIN "bricktracker_sets"
ON "bricktracker_metadata_storages"."id" IS NOT DISTINCT FROM "bricktracker_sets"."storage"
LEFT JOIN "bricktracker_individual_minifigures"
ON "bricktracker_metadata_storages"."id" IS NOT DISTINCT FROM "bricktracker_individual_minifigures"."storage"
{% endblock %}
{% block group %}
@@ -3,6 +3,14 @@ BEGIN TRANSACTION;
ALTER TABLE "bricktracker_set_tags"
ADD COLUMN "tag_{{ id }}" BOOLEAN NOT NULL DEFAULT 0;
-- Also inject into individual minifigures
ALTER TABLE "bricktracker_individual_minifigure_tags"
ADD COLUMN "tag_{{ id }}" BOOLEAN NOT NULL DEFAULT 0;
-- Also inject into individual parts
ALTER TABLE "bricktracker_individual_part_tags"
ADD COLUMN "tag_{{ id }}" BOOLEAN NOT NULL DEFAULT 0;
INSERT INTO "bricktracker_metadata_tags" (
"id",
"name"
@@ -0,0 +1,83 @@
-- Statistics Overview Query (Optimized with CTEs)
-- Provides comprehensive statistics for BrickTracker dashboard
-- Performance improved by consolidating subqueries into CTEs
-- Expected impact: 60-80% performance improvement for dashboard loading
WITH
-- Set statistics aggregation
set_stats AS (
SELECT
COUNT(*) AS total_sets,
COUNT(DISTINCT "set") AS unique_sets,
COUNT(CASE WHEN "purchase_price" IS NOT NULL THEN 1 END) AS sets_with_price,
ROUND(SUM("purchase_price"), 2) AS total_cost,
ROUND(AVG("purchase_price"), 2) AS average_cost,
ROUND(MIN("purchase_price"), 2) AS minimum_cost,
ROUND(MAX("purchase_price"), 2) AS maximum_cost,
COUNT(DISTINCT CASE WHEN "storage" IS NOT NULL THEN "storage" END) AS storage_locations_used,
COUNT(DISTINCT CASE WHEN "purchase_location" IS NOT NULL THEN "purchase_location" END) AS purchase_locations_used,
COUNT(CASE WHEN "storage" IS NOT NULL THEN 1 END) AS sets_with_storage,
COUNT(CASE WHEN "purchase_location" IS NOT NULL THEN 1 END) AS sets_with_purchase_location
FROM "bricktracker_sets"
),
-- Part statistics aggregation
part_stats AS (
SELECT
COUNT(*) AS total_part_instances,
SUM("quantity") AS total_parts_count,
COUNT(DISTINCT "part") AS unique_parts,
SUM("missing") AS total_missing_parts,
SUM("damaged") AS total_damaged_parts
FROM "bricktracker_parts"
),
-- Minifigure statistics aggregation
minifig_stats AS (
SELECT
COUNT(*) AS total_minifigure_instances,
SUM("quantity") AS total_minifigures_count,
COUNT(DISTINCT "figure") AS unique_minifigures
FROM "bricktracker_minifigures"
),
-- Rebrickable sets count (for sets we actually own)
rebrickable_stats AS (
SELECT COUNT(*) AS unique_rebrickable_sets
FROM "rebrickable_sets"
WHERE "set" IN (SELECT DISTINCT "set" FROM "bricktracker_sets")
)
-- Final select combining all statistics
SELECT
-- Basic counts
set_stats.total_sets,
set_stats.unique_sets,
rebrickable_stats.unique_rebrickable_sets,
-- Parts statistics
part_stats.total_part_instances,
part_stats.total_parts_count,
part_stats.unique_parts,
part_stats.total_missing_parts,
part_stats.total_damaged_parts,
-- Minifigures statistics
minifig_stats.total_minifigure_instances,
minifig_stats.total_minifigures_count,
minifig_stats.unique_minifigures,
-- Financial statistics
set_stats.sets_with_price,
set_stats.total_cost,
set_stats.average_cost,
set_stats.minimum_cost,
set_stats.maximum_cost,
-- Storage and location statistics
set_stats.storage_locations_used,
set_stats.purchase_locations_used,
set_stats.sets_with_storage,
set_stats.sets_with_purchase_location
FROM set_stats, part_stats, minifig_stats, rebrickable_stats
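Each CTE in the overview query collapses its table to exactly one aggregate row, so the final `FROM set_stats, part_stats, ...` cross join always yields a single combined row. A self-contained in-memory sketch of that pattern (toy table and data invented for illustration):

```python
import sqlite3

# One CTE aggregating to a single row; cross-joining several such CTEs
# (as the overview query does) multiplies 1 x 1 x ... = 1 result row.
con = sqlite3.connect(':memory:')
con.executescript("""
CREATE TABLE sets (id TEXT, "set" TEXT, purchase_price REAL);
INSERT INTO sets VALUES ('a', '10179-1', 499.99), ('b', '10179-1', NULL);
""")
row = con.execute("""
WITH set_stats AS (
    SELECT COUNT(*) AS total_sets,
           COUNT(DISTINCT "set") AS unique_sets,
           COUNT(CASE WHEN purchase_price IS NOT NULL THEN 1 END) AS sets_with_price
    FROM sets
)
SELECT total_sets, unique_sets, sets_with_price FROM set_stats
""").fetchone()
# row == (2, 1, 1): two instances, one unique set, one priced instance
```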
@@ -0,0 +1,45 @@
-- Purchase Location Statistics
-- Shows statistics grouped by purchase location
SELECT
"bricktracker_sets"."purchase_location" AS "location_id",
"bricktracker_metadata_purchase_locations"."name" AS "location_name",
COUNT("bricktracker_sets"."id") AS "set_count",
COUNT(DISTINCT "bricktracker_sets"."set") AS "unique_set_count",
SUM("rebrickable_sets"."number_of_parts") AS "total_parts",
ROUND(AVG("rebrickable_sets"."number_of_parts"), 0) AS "avg_parts_per_set",
-- Financial statistics per purchase location
COUNT(CASE WHEN "bricktracker_sets"."purchase_price" IS NOT NULL THEN 1 END) AS "sets_with_price",
ROUND(SUM("bricktracker_sets"."purchase_price"), 2) AS "total_spent",
ROUND(AVG("bricktracker_sets"."purchase_price"), 2) AS "avg_price",
ROUND(MIN("bricktracker_sets"."purchase_price"), 2) AS "min_price",
ROUND(MAX("bricktracker_sets"."purchase_price"), 2) AS "max_price",
-- Date range statistics
MIN("bricktracker_sets"."purchase_date") AS "first_purchase",
MAX("bricktracker_sets"."purchase_date") AS "latest_purchase",
-- Problem statistics per purchase location
COALESCE(SUM("problem_stats"."missing_parts"), 0) AS "missing_parts",
COALESCE(SUM("problem_stats"."damaged_parts"), 0) AS "damaged_parts",
-- Minifigure statistics per purchase location
COALESCE(SUM("minifigure_stats"."minifigure_count"), 0) AS "total_minifigures"
FROM "bricktracker_sets"
INNER JOIN "rebrickable_sets" ON "bricktracker_sets"."set" = "rebrickable_sets"."set"
LEFT JOIN "bricktracker_metadata_purchase_locations" ON "bricktracker_sets"."purchase_location" = "bricktracker_metadata_purchase_locations"."id"
LEFT JOIN (
SELECT
"bricktracker_parts"."id",
SUM("bricktracker_parts"."missing") AS "missing_parts",
SUM("bricktracker_parts"."damaged") AS "damaged_parts"
FROM "bricktracker_parts"
GROUP BY "bricktracker_parts"."id"
) "problem_stats" ON "bricktracker_sets"."id" = "problem_stats"."id"
LEFT JOIN (
SELECT
"bricktracker_minifigures"."id",
SUM("bricktracker_minifigures"."quantity") AS "minifigure_count"
FROM "bricktracker_minifigures"
GROUP BY "bricktracker_minifigures"."id"
) "minifigure_stats" ON "bricktracker_sets"."id" = "minifigure_stats"."id"
WHERE "bricktracker_sets"."purchase_location" IS NOT NULL
GROUP BY "bricktracker_sets"."purchase_location", "bricktracker_metadata_purchase_locations"."name"
ORDER BY "set_count" DESC, "location_name" ASC
@@ -0,0 +1,49 @@
-- Purchases by Year Statistics
-- Shows statistics grouped by purchase year (when you bought the sets)
SELECT
strftime('%Y', datetime("bricktracker_sets"."purchase_date", 'unixepoch')) AS "purchase_year",
COUNT("bricktracker_sets"."id") AS "total_sets",
COUNT(DISTINCT "bricktracker_sets"."set") AS "unique_sets",
SUM("rebrickable_sets"."number_of_parts") AS "total_parts",
ROUND(AVG("rebrickable_sets"."number_of_parts"), 0) AS "avg_parts_per_set",
-- Financial statistics per purchase year
COUNT(CASE WHEN "bricktracker_sets"."purchase_price" IS NOT NULL THEN 1 END) AS "sets_with_price",
ROUND(SUM("bricktracker_sets"."purchase_price"), 2) AS "total_spent",
ROUND(AVG("bricktracker_sets"."purchase_price"), 2) AS "avg_price_per_set",
ROUND(MIN("bricktracker_sets"."purchase_price"), 2) AS "min_price",
ROUND(MAX("bricktracker_sets"."purchase_price"), 2) AS "max_price",
-- Release year statistics for sets purchased in this year
MIN("rebrickable_sets"."year") AS "oldest_set_year",
MAX("rebrickable_sets"."year") AS "newest_set_year",
ROUND(AVG("rebrickable_sets"."year"), 0) AS "avg_set_release_year",
-- Problem statistics per purchase year
COALESCE(SUM("problem_stats"."missing_parts"), 0) AS "missing_parts",
COALESCE(SUM("problem_stats"."damaged_parts"), 0) AS "damaged_parts",
-- Minifigure statistics per purchase year
COALESCE(SUM("minifigure_stats"."minifigure_count"), 0) AS "total_minifigures",
-- Diversity statistics per purchase year
COUNT(DISTINCT "rebrickable_sets"."theme_id") AS "unique_themes",
COUNT(DISTINCT "bricktracker_sets"."purchase_location") AS "unique_purchase_locations",
-- Monthly statistics within the year
COUNT(DISTINCT strftime('%m', datetime("bricktracker_sets"."purchase_date", 'unixepoch'))) AS "months_with_purchases"
FROM "bricktracker_sets"
INNER JOIN "rebrickable_sets" ON "bricktracker_sets"."set" = "rebrickable_sets"."set"
LEFT JOIN (
SELECT
"bricktracker_parts"."id",
SUM("bricktracker_parts"."missing") AS "missing_parts",
SUM("bricktracker_parts"."damaged") AS "damaged_parts"
FROM "bricktracker_parts"
GROUP BY "bricktracker_parts"."id"
) "problem_stats" ON "bricktracker_sets"."id" = "problem_stats"."id"
LEFT JOIN (
SELECT
"bricktracker_minifigures"."id",
SUM("bricktracker_minifigures"."quantity") AS "minifigure_count"
FROM "bricktracker_minifigures"
GROUP BY "bricktracker_minifigures"."id"
) "minifigure_stats" ON "bricktracker_sets"."id" = "minifigure_stats"."id"
WHERE "bricktracker_sets"."purchase_date" IS NOT NULL
GROUP BY strftime('%Y', datetime("bricktracker_sets"."purchase_date", 'unixepoch'))
ORDER BY "purchase_year" DESC
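The purchase-year grouping assumes `purchase_date` is stored as a unix timestamp and extracts the year via `strftime('%Y', datetime(ts, 'unixepoch'))`. A small sketch of that conversion in isolation (the timestamp is an arbitrary example value):

```python
import sqlite3

con = sqlite3.connect(':memory:')
# 1696320000 is 2023-10-03 08:00:00 UTC; SQLite's 'unixepoch' modifier
# interprets the integer as seconds since the epoch.
year = con.execute(
    "SELECT strftime('%Y', datetime(?, 'unixepoch'))", (1696320000,)
).fetchone()[0]
# year == '2023' (strftime returns text, not an integer)
```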
@@ -0,0 +1,44 @@
-- Sets by Year Statistics
-- Shows statistics grouped by LEGO set release year
SELECT
"rebrickable_sets"."year",
COUNT("bricktracker_sets"."id") AS "total_sets",
COUNT(DISTINCT "bricktracker_sets"."set") AS "unique_sets",
SUM("rebrickable_sets"."number_of_parts") AS "total_parts",
ROUND(AVG("rebrickable_sets"."number_of_parts"), 0) AS "avg_parts_per_set",
MIN("rebrickable_sets"."number_of_parts") AS "min_parts",
MAX("rebrickable_sets"."number_of_parts") AS "max_parts",
-- Financial statistics per year (release year)
COUNT(CASE WHEN "bricktracker_sets"."purchase_price" IS NOT NULL THEN 1 END) AS "sets_with_price",
ROUND(SUM("bricktracker_sets"."purchase_price"), 2) AS "total_spent",
ROUND(AVG("bricktracker_sets"."purchase_price"), 2) AS "avg_price_per_set",
ROUND(MIN("bricktracker_sets"."purchase_price"), 2) AS "min_price",
ROUND(MAX("bricktracker_sets"."purchase_price"), 2) AS "max_price",
-- Problem statistics per year
COALESCE(SUM("problem_stats"."missing_parts"), 0) AS "missing_parts",
COALESCE(SUM("problem_stats"."damaged_parts"), 0) AS "damaged_parts",
-- Minifigure statistics per year
COALESCE(SUM("minifigure_stats"."minifigure_count"), 0) AS "total_minifigures",
-- Theme diversity per year
COUNT(DISTINCT "rebrickable_sets"."theme_id") AS "unique_themes"
FROM "bricktracker_sets"
INNER JOIN "rebrickable_sets" ON "bricktracker_sets"."set" = "rebrickable_sets"."set"
LEFT JOIN (
SELECT
"bricktracker_parts"."id",
SUM("bricktracker_parts"."missing") AS "missing_parts",
SUM("bricktracker_parts"."damaged") AS "damaged_parts"
FROM "bricktracker_parts"
GROUP BY "bricktracker_parts"."id"
) "problem_stats" ON "bricktracker_sets"."id" = "problem_stats"."id"
LEFT JOIN (
SELECT
"bricktracker_minifigures"."id",
SUM("bricktracker_minifigures"."quantity") AS "minifigure_count"
FROM "bricktracker_minifigures"
GROUP BY "bricktracker_minifigures"."id"
) "minifigure_stats" ON "bricktracker_sets"."id" = "minifigure_stats"."id"
WHERE "rebrickable_sets"."year" IS NOT NULL
GROUP BY "rebrickable_sets"."year"
ORDER BY "rebrickable_sets"."year" DESC
@@ -0,0 +1,40 @@
-- Storage Location Statistics
-- Shows statistics grouped by storage location
SELECT
"bricktracker_sets"."storage" AS "storage_id",
"bricktracker_metadata_storages"."name" AS "storage_name",
COUNT("bricktracker_sets"."id") AS "set_count",
COUNT(DISTINCT "bricktracker_sets"."set") AS "unique_set_count",
SUM("rebrickable_sets"."number_of_parts") AS "total_parts",
ROUND(AVG("rebrickable_sets"."number_of_parts"), 0) AS "avg_parts_per_set",
-- Financial statistics per storage
COUNT(CASE WHEN "bricktracker_sets"."purchase_price" IS NOT NULL THEN 1 END) AS "sets_with_price",
ROUND(SUM("bricktracker_sets"."purchase_price"), 2) AS "total_value",
ROUND(AVG("bricktracker_sets"."purchase_price"), 2) AS "avg_price",
-- Problem statistics per storage
COALESCE(SUM("problem_stats"."missing_parts"), 0) AS "missing_parts",
COALESCE(SUM("problem_stats"."damaged_parts"), 0) AS "damaged_parts",
-- Minifigure statistics per storage
COALESCE(SUM("minifigure_stats"."minifigure_count"), 0) AS "total_minifigures"
FROM "bricktracker_sets"
INNER JOIN "rebrickable_sets" ON "bricktracker_sets"."set" = "rebrickable_sets"."set"
LEFT JOIN "bricktracker_metadata_storages" ON "bricktracker_sets"."storage" = "bricktracker_metadata_storages"."id"
LEFT JOIN (
SELECT
"bricktracker_parts"."id",
SUM("bricktracker_parts"."missing") AS "missing_parts",
SUM("bricktracker_parts"."damaged") AS "damaged_parts"
FROM "bricktracker_parts"
GROUP BY "bricktracker_parts"."id"
) "problem_stats" ON "bricktracker_sets"."id" = "problem_stats"."id"
LEFT JOIN (
SELECT
"bricktracker_minifigures"."id",
SUM("bricktracker_minifigures"."quantity") AS "minifigure_count"
FROM "bricktracker_minifigures"
GROUP BY "bricktracker_minifigures"."id"
) "minifigure_stats" ON "bricktracker_sets"."id" = "minifigure_stats"."id"
WHERE "bricktracker_sets"."storage" IS NOT NULL
GROUP BY "bricktracker_sets"."storage", "bricktracker_metadata_storages"."name"
ORDER BY "set_count" DESC, "storage_name" ASC
@@ -0,0 +1,39 @@
-- Theme Distribution Statistics
-- Shows statistics grouped by theme
SELECT
"rebrickable_sets"."theme_id",
COUNT("bricktracker_sets"."id") AS "set_count",
COUNT(DISTINCT "bricktracker_sets"."set") AS "unique_set_count",
SUM("rebrickable_sets"."number_of_parts") AS "total_parts",
ROUND(AVG("rebrickable_sets"."number_of_parts"), 0) AS "avg_parts_per_set",
MIN("rebrickable_sets"."year") AS "earliest_year",
MAX("rebrickable_sets"."year") AS "latest_year",
-- Financial statistics per theme
COUNT(CASE WHEN "bricktracker_sets"."purchase_price" IS NOT NULL THEN 1 END) AS "sets_with_price",
ROUND(SUM("bricktracker_sets"."purchase_price"), 2) AS "total_spent",
ROUND(AVG("bricktracker_sets"."purchase_price"), 2) AS "avg_price",
-- Problem statistics per theme
COALESCE(SUM("problem_stats"."missing_parts"), 0) AS "missing_parts",
COALESCE(SUM("problem_stats"."damaged_parts"), 0) AS "damaged_parts",
-- Minifigure statistics per theme
COALESCE(SUM("minifigure_stats"."minifigure_count"), 0) AS "total_minifigures"
FROM "bricktracker_sets"
INNER JOIN "rebrickable_sets" ON "bricktracker_sets"."set" = "rebrickable_sets"."set"
LEFT JOIN (
SELECT
"bricktracker_parts"."id",
SUM("bricktracker_parts"."missing") AS "missing_parts",
SUM("bricktracker_parts"."damaged") AS "damaged_parts"
FROM "bricktracker_parts"
GROUP BY "bricktracker_parts"."id"
) "problem_stats" ON "bricktracker_sets"."id" = "problem_stats"."id"
LEFT JOIN (
SELECT
"bricktracker_minifigures"."id",
SUM("bricktracker_minifigures"."quantity") AS "minifigure_count"
FROM "bricktracker_minifigures"
GROUP BY "bricktracker_minifigures"."id"
) "minifigure_stats" ON "bricktracker_sets"."id" = "minifigure_stats"."id"
GROUP BY "rebrickable_sets"."theme_id"
ORDER BY "set_count" DESC, "rebrickable_sets"."theme_id" ASC
@@ -0,0 +1,132 @@
"""
Statistics module for BrickTracker
Provides statistics and analytics functionality
"""
import logging
from typing import Any

from .sql import BrickSQL
from .theme_list import BrickThemeList

logger = logging.getLogger(__name__)


class BrickStatistics:
    """Main statistics class providing overview and detailed statistics"""

    def __init__(self):
        self.sql = BrickSQL()

    def get_overview(self) -> dict[str, Any]:
        """Get overview statistics"""
        result = self.sql.fetchone('statistics/overview')
        if result:
            return dict(result)
        return {}

    def get_theme_statistics(self) -> list[dict[str, Any]]:
        """Get statistics grouped by theme with theme names"""
        results = self.sql.fetchall('statistics/themes')
        # Load theme list to get theme names
        theme_list = BrickThemeList()
        statistics = []
        for row in results:
            stat = dict(row)
            # Add theme name from theme list
            theme = theme_list.get(stat['theme_id'])
            stat['theme_name'] = theme.name if theme else f"Theme {stat['theme_id']}"
            statistics.append(stat)
        return statistics

    def get_storage_statistics(self) -> list[dict[str, Any]]:
        """Get statistics grouped by storage location"""
        results = self.sql.fetchall('statistics/storage')
        return [dict(row) for row in results]

    def get_purchase_location_statistics(self) -> list[dict[str, Any]]:
        """Get statistics grouped by purchase location"""
        results = self.sql.fetchall('statistics/purchase_locations')
        return [dict(row) for row in results]

    def get_financial_summary(self) -> dict[str, Any]:
        """Get financial summary from overview statistics"""
        overview = self.get_overview()
        return {
            'total_cost': overview.get('total_cost', 0),
            'average_cost': overview.get('average_cost', 0),
            'minimum_cost': overview.get('minimum_cost', 0),
            'maximum_cost': overview.get('maximum_cost', 0),
            'sets_with_price': overview.get('sets_with_price', 0),
            'total_sets': overview.get('total_sets', 0),
            'percentage_with_price': round(
                (overview.get('sets_with_price', 0) / max(overview.get('total_sets', 1), 1)) * 100, 1
            )
        }

    def get_collection_summary(self) -> dict[str, Any]:
        """Get collection summary from overview statistics"""
        overview = self.get_overview()
        return {
            'total_sets': overview.get('total_sets', 0),
            'unique_sets': overview.get('unique_sets', 0),
            'total_parts_count': overview.get('total_parts_count', 0),
            'unique_parts': overview.get('unique_parts', 0),
            'total_minifigures_count': overview.get('total_minifigures_count', 0),
            'unique_minifigures': overview.get('unique_minifigures', 0),
            'total_missing_parts': overview.get('total_missing_parts', 0),
            'total_damaged_parts': overview.get('total_damaged_parts', 0),
            'storage_locations_used': overview.get('storage_locations_used', 0),
            'purchase_locations_used': overview.get('purchase_locations_used', 0)
        }

    def get_sets_by_year_statistics(self) -> list[dict[str, Any]]:
        """Get statistics grouped by LEGO set release year"""
        results = self.sql.fetchall('statistics/sets_by_year')
        return [dict(row) for row in results]

    def get_purchases_by_year_statistics(self) -> list[dict[str, Any]]:
        """Get statistics grouped by purchase year"""
        results = self.sql.fetchall('statistics/purchases_by_year')
        return [dict(row) for row in results]

    def get_year_summary(self) -> dict[str, Any]:
        """Get year-based summary statistics"""
        sets_by_year = self.get_sets_by_year_statistics()
        purchases_by_year = self.get_purchases_by_year_statistics()
        # Calculate summary metrics
        years_represented = len(sets_by_year)
        years_with_purchases = len(purchases_by_year)
        # Find peak year for collection (by set count)
        peak_collection_year = None
        max_sets_in_year = 0
        if sets_by_year:
            peak_year_data = max(sets_by_year, key=lambda x: x['total_sets'])
            peak_collection_year = peak_year_data['year']
            max_sets_in_year = peak_year_data['total_sets']
        # Find peak spending year
        peak_spending_year = None
        max_spending = 0
        if purchases_by_year:
            spending_years = [y for y in purchases_by_year if y.get('total_spent')]
            if spending_years:
                peak_spending_data = max(spending_years, key=lambda x: x['total_spent'] or 0)
                peak_spending_year = peak_spending_data['purchase_year']
                max_spending = peak_spending_data['total_spent']
        return {
            'years_represented': years_represented,
            'years_with_purchases': years_with_purchases,
            'peak_collection_year': peak_collection_year,
            'max_sets_in_year': max_sets_in_year,
            'peak_spending_year': peak_spending_year,
            'max_spending': max_spending,
            'oldest_set_year': min([y['year'] for y in sets_by_year]) if sets_by_year else None,
            'newest_set_year': max([y['year'] for y in sets_by_year]) if sets_by_year else None
        }
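The peak-year selection in `get_year_summary` reduces to a `max()` over a list of row dicts keyed on `total_sets`. A standalone sketch of just that step, with invented sample data:

```python
# Rows shaped like the statistics/sets_by_year results (sample data).
sets_by_year = [
    {'year': 2019, 'total_sets': 3},
    {'year': 2021, 'total_sets': 7},
    {'year': 2023, 'total_sets': 5},
]

# Guard against an empty collection, then pick the year with the most sets.
peak = max(sets_by_year, key=lambda row: row['total_sets']) if sets_by_year else None
# peak == {'year': 2021, 'total_sets': 7}
```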
@@ -1,4 +1,4 @@
from typing import Final
__version__: Final[str] = '1.2.5'
__database_version__: Final[int] = 17
__version__: Final[str] = '1.3.0'
__database_version__: Final[int] = 21
@@ -1,9 +1,11 @@
import logging
from flask import Blueprint, request, render_template
from flask import Blueprint, request, render_template, current_app, jsonify
from flask_login import login_required
from ...configuration_list import BrickConfigurationList
from ...config_manager import ConfigManager
from ...config import CONFIG
from ..exceptions import exception_handler
from ...instructions_list import BrickInstructionsList
from ...rebrickable_image import RebrickableImage
@@ -27,6 +29,68 @@ logger = logging.getLogger(__name__)
admin_page = Blueprint('admin', __name__, url_prefix='/admin')
def get_env_values():
    """Get current environment values, using defaults from config when not set"""
    import os
    from pathlib import Path

    env_values = {}
    config_defaults = {}
    env_explicit_values = {}  # Track which values are explicitly set

    # Read .env file if it exists
    env_file = Path('.env')
    env_from_file = {}
    if env_file.exists():
        with open(env_file, 'r', encoding='utf-8') as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith('#') and '=' in line:
                    key, value = line.split('=', 1)
                    env_from_file[key] = value

    # Process each config item
    for config_item in CONFIG:
        env_name = f"BK_{config_item['n']}"
        # Store default value (with casting applied)
        default_value = config_item.get('d', '')
        if 'c' in config_item and default_value is not None:
            cast_type = config_item['c']
            if cast_type == bool and default_value == '':
                default_value = False  # Default for booleans is False only if no default specified
            elif cast_type == list and isinstance(default_value, str):
                default_value = [item.strip() for item in default_value.split(',') if item.strip()]
            # For int/other types, keep the original default value
        config_defaults[env_name] = default_value

        # Check if value is explicitly set in .env file or environment
        is_explicitly_set = env_name in env_from_file or env_name in os.environ
        env_explicit_values[env_name] = is_explicitly_set

        # Get value from .env file, environment, or default
        value = env_from_file.get(env_name) or os.environ.get(env_name)
        if value is None:
            value = default_value
        else:
            # Apply casting if specified
            if 'c' in config_item and value is not None:
                cast_type = config_item['c']
                if cast_type == bool and isinstance(value, str):
                    value = value.lower() in ('true', '1', 'yes', 'on')
                elif cast_type == int and value != '':
                    try:
                        value = int(value)
                    except (ValueError, TypeError):
                        value = config_item.get('d', 0)
                elif cast_type == list and isinstance(value, str):
                    value = [item.strip() for item in value.split(',') if item.strip()]
        env_values[env_name] = value

    return env_values, config_defaults, env_explicit_values
# Admin
@admin_page.route('/', methods=['GET'])
@login_required
@@ -102,18 +166,49 @@ def admin() -> str:
open_tag
)
open_database = (
open_image is None and
open_instructions is None and
open_logout is None and
not open_metadata and
open_retired is None and
open_theme is None
# Get configurable default expanded sections
default_expanded_sections = current_app.config.get('ADMIN_DEFAULT_EXPANDED_SECTIONS', [])
# Helper function to check if section should be expanded
def should_expand(section_name, url_param):
# URL parameter takes priority over default config
if url_param is not None:
return url_param
# Check if section is in default expanded list
return section_name in default_expanded_sections
# Apply configurable default expansion logic
open_database = should_expand('database', request.args.get('open_database', None))
open_image = should_expand('image', open_image)
open_instructions = should_expand('instructions', open_instructions)
open_logout = should_expand('authentication', open_logout)
open_retired = should_expand('retired', open_retired)
open_theme = should_expand('theme', open_theme)
# Metadata sub-sections
open_owner = should_expand('owner', open_owner)
open_purchase_location = should_expand('purchase_location', open_purchase_location)
open_status = should_expand('status', open_status)
open_storage = should_expand('storage', open_storage)
open_tag = should_expand('tag', open_tag)
# Recalculate metadata section based on sub-sections or direct config
open_metadata = (
should_expand('metadata', open_metadata) or
open_owner or
open_purchase_location or
open_status or
open_storage or
open_tag
)
env_values, config_defaults, env_explicit_values = get_env_values()
return render_template(
'admin.html',
configuration=BrickConfigurationList.list(),
env_values=env_values,
config_defaults=config_defaults,
env_explicit_values=env_explicit_values,
database_counters=database_counters,
database_error=request.args.get('database_error'),
database_exception=database_exception,
@@ -149,3 +244,103 @@ def admin() -> str:
tag_error=request.args.get('tag_error'),
theme=BrickThemeList(),
)
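The section-expansion priority used above (an explicit URL parameter wins; otherwise a section opens only if it appears in the configured default list) can be sketched as a standalone function. The parameter list differs from the closure in the view, which reads `default_expanded_sections` from the enclosing scope:

```python
# Sketch of the admin view's should_expand() priority logic.
def should_expand(section_name, url_param, default_expanded_sections):
    # URL parameter takes priority over the configured default
    if url_param is not None:
        return url_param
    # Fall back to the configured default expanded list
    return section_name in default_expanded_sections


defaults = ['database', 'theme']
# No URL parameter: the configured default decides
assert should_expand('database', None, defaults) is True
assert should_expand('image', None, defaults) is False
# URL parameter present: it wins, even when falsy
assert should_expand('database', False, defaults) is False
```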
# API Endpoints for Configuration Management
@admin_page.route('/api/config/update', methods=['POST'])
@login_required
@exception_handler(__file__)
def update_config() -> str:
"""Update live configuration variables"""
try:
data = request.get_json()
if not data:
return jsonify({
'status': 'error',
'message': 'No JSON data provided'
}), 400
updates = data.get('updates', {})
if not updates:
return jsonify({
'status': 'error',
'message': 'No updates provided'
}), 400
# Use ConfigManager to update live configuration
config_manager = ConfigManager()
results = config_manager.update_config(updates)
# Check if all updates were successful
successful_updates = {k: v for k, v in results.items() if "successfully" in v}
failed_updates = {k: v for k, v in results.items() if "successfully" not in v}
logger.info(f"Configuration update: {len(successful_updates)} successful, {len(failed_updates)} failed")
if failed_updates:
logger.warning(f"Failed updates: {failed_updates}")
return jsonify({
'status': 'success' if not failed_updates else 'partial',
'results': results,
'successful_count': len(successful_updates),
'failed_count': len(failed_updates)
})
except Exception as e:
logger.error(f"Error updating configuration: {e}")
return jsonify({
'status': 'error',
'message': str(e)
}), 500
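The endpoint above assumes `ConfigManager.update_config` returns a per-key human-readable message and treats any message containing "successfully" as a success. A condensed sketch of that partitioning (the message format is an assumption carried over from the route, not a documented ConfigManager contract):

```python
# Partition ConfigManager-style result messages into successes/failures,
# mirroring the substring check used by update_config().
def partition_results(results):
    successful = {k: v for k, v in results.items() if 'successfully' in v}
    failed = {k: v for k, v in results.items() if 'successfully' not in v}
    status = 'success' if not failed else 'partial'
    return status, successful, failed


status, ok, bad = partition_results({
    'BK_DEBUG': 'Updated successfully',
    'BK_THEME': 'Unknown variable',
})
assert status == 'partial'
assert set(ok) == {'BK_DEBUG'} and set(bad) == {'BK_THEME'}
```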
@admin_page.route('/api/config/update-static', methods=['POST'])
@login_required
@exception_handler(__file__)
def update_static_config() -> str:
"""Update static configuration variables (requires restart)"""
try:
data = request.get_json()
if not data:
return jsonify({
'status': 'error',
'message': 'No JSON data provided'
}), 400
updates = data.get('updates', {})
if not updates:
return jsonify({
'status': 'error',
'message': 'No updates provided'
}), 400
# Use ConfigManager to update .env file
config_manager = ConfigManager()
# Update each variable in the .env file
updated_count = 0
for var_name, value in updates.items():
try:
config_manager._update_env_file(var_name, value)
updated_count += 1
logger.info(f"Updated static config: {var_name}")
except Exception as e:
logger.error(f"Failed to update static config {var_name}: {e}")
raise e
logger.info(f"Updated {updated_count} static configuration variables")
return jsonify({
'status': 'success',
'message': f'Successfully updated {updated_count} static configuration variables to .env file',
'updated_count': updated_count
})
except Exception as e:
logger.error(f"Error updating static configuration: {e}")
return jsonify({
'status': 'error',
'message': str(e)
}), 500
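`ConfigManager._update_env_file` is not shown in this diff; as a hedged sketch of what such a helper might do, here is a minimal KEY=value rewriter that replaces an existing line or appends the key. This is an assumption about the behaviour, not ConfigManager's actual code:

```python
import os
import tempfile

# Hypothetical .env rewriter: replace the KEY=value line in place,
# appending the key if it is absent.
def update_env_file(path, name, value):
    lines = []
    if os.path.exists(path):
        with open(path) as handle:
            lines = handle.read().splitlines()
    replaced = False
    for index, line in enumerate(lines):
        if line.split('=', 1)[0].strip() == name:
            lines[index] = f'{name}={value}'
            replaced = True
    if not replaced:
        lines.append(f'{name}={value}')
    with open(path, 'w') as handle:
        handle.write('\n'.join(lines) + '\n')


with tempfile.TemporaryDirectory() as tmp:
    env = os.path.join(tmp, '.env')
    update_env_file(env, 'BK_DEBUG', 'true')
    update_env_file(env, 'BK_DEBUG', 'false')  # second call replaces, not appends
    with open(env) as handle:
        content = handle.read()
assert content == 'BK_DEBUG=false\n'
```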
@@ -2,7 +2,6 @@ from flask import Blueprint, render_template
from .exceptions import exception_handler
from ..minifigure_list import BrickMinifigureList
from ..set_status_list import BrickSetStatusList
from ..set_list import BrickSetList, set_metadata_lists
index_page = Blueprint('index', __name__)
@@ -15,7 +14,6 @@ def index() -> str:
return render_template(
'index.html',
brickset_collection=BrickSetList().last(),
brickset_statuses=BrickSetStatusList.list(),
minifigure_collection=BrickMinifigureList().last(),
**set_metadata_lists(as_class=True)
)
@@ -0,0 +1,281 @@
import logging
from flask import Blueprint, jsonify, redirect, render_template, request, url_for, Response
from flask_login import login_required
from .exceptions import exception_handler
from ..individual_minifigure import IndividualMinifigure
from ..part import BrickPart
from ..set_list import set_metadata_lists
from ..set_owner_list import BrickSetOwnerList
from ..set_tag_list import BrickSetTagList
from ..set_storage_list import BrickSetStorageList
from ..set_purchase_location_list import BrickSetPurchaseLocationList
from ..sql import BrickSQL
logger = logging.getLogger(__name__)
individual_minifigure_page = Blueprint('individual_minifigure', __name__, url_prefix='/individual-minifigures')
# Individual minifigure instance details/edit
@individual_minifigure_page.route('/<id>')
@exception_handler(__file__)
def details(*, id: str) -> str:
item = IndividualMinifigure().select_by_id(id)
return render_template(
'individual_minifigure/details.html',
item=item,
**set_metadata_lists(as_class=True)
)
# Update individual minifigure instance
@individual_minifigure_page.route('/<id>/update', methods=['POST'])
@exception_handler(__file__)
def update(*, id: str):
item = IndividualMinifigure().select_by_id(id)
# Update basic fields
item.fields.quantity = int(request.form.get('quantity', 1))
item.fields.description = request.form.get('description', '')
item.fields.storage = request.form.get('storage') or None
item.fields.purchase_location = request.form.get('purchase_location') or None
# Update the individual minifigure
from ..sql import BrickSQL
BrickSQL().execute(
'individual_minifigure/update',
parameters={
'id': item.fields.id,
'quantity': item.fields.quantity,
'description': item.fields.description,
'storage': item.fields.storage,
'purchase_location': item.fields.purchase_location,
},
commit=False,
)
# Update owners
owners = request.form.getlist('owners')
for owner in BrickSetOwnerList.list():
owner.update_individual_minifigure_state(item, state=(owner.fields.id in owners))
# Update tags
tags = request.form.getlist('tags')
for tag in BrickSetTagList.list():
tag.update_individual_minifigure_state(item, state=(tag.fields.id in tags))
BrickSQL().commit()
return redirect(url_for('individual_minifigure.details', id=id))
# Update quantity
@individual_minifigure_page.route('/<id>/update/quantity', methods=['POST'])
@login_required
@exception_handler(__file__)
def update_quantity(*, id: str):
item = IndividualMinifigure().select_by_id(id)
item.fields.quantity = int(request.json.get('value', 1))
BrickSQL().execute_and_commit(
'individual_minifigure/update',
parameters={
'id': item.fields.id,
'quantity': item.fields.quantity,
'description': item.fields.description,
'storage': item.fields.storage,
'purchase_location': item.fields.purchase_location,
}
)
return jsonify({'success': True})
# Update description
@individual_minifigure_page.route('/<id>/update/description', methods=['POST'])
@login_required
@exception_handler(__file__)
def update_description(*, id: str):
item = IndividualMinifigure().select_by_id(id)
item.fields.description = request.json.get('value', '')
BrickSQL().execute_and_commit(
'individual_minifigure/update',
parameters={
'id': item.fields.id,
'quantity': item.fields.quantity,
'description': item.fields.description,
'storage': item.fields.storage,
'purchase_location': item.fields.purchase_location,
}
)
return jsonify({'success': True})
# Update owner
@individual_minifigure_page.route('/<id>/update/owner/<metadata_id>', methods=['POST'])
@login_required
@exception_handler(__file__)
def update_owner(*, id: str, metadata_id: str):
item = IndividualMinifigure().select_by_id(id)
owner = BrickSetOwnerList.get(metadata_id)
owner.update_individual_minifigure_state(item, json=request.json)
return jsonify({'success': True})
# Update tag
@individual_minifigure_page.route('/<id>/update/tag/<metadata_id>', methods=['POST'])
@login_required
@exception_handler(__file__)
def update_tag(*, id: str, metadata_id: str):
item = IndividualMinifigure().select_by_id(id)
tag = BrickSetTagList.get(metadata_id)
tag.update_individual_minifigure_state(item, json=request.json)
return jsonify({'success': True})
# Update status
@individual_minifigure_page.route('/<id>/update/status/<metadata_id>', methods=['POST'])
@login_required
@exception_handler(__file__)
def update_status(*, id: str, metadata_id: str):
item = IndividualMinifigure().select_by_id(id)
from ..set_status_list import BrickSetStatusList
status = BrickSetStatusList.get(metadata_id)
status.update_individual_minifigure_state(item, json=request.json)
return jsonify({'success': True})
# Update storage
@individual_minifigure_page.route('/<id>/update/storage', methods=['POST'])
@login_required
@exception_handler(__file__)
def update_storage(*, id: str):
item = IndividualMinifigure().select_by_id(id)
storage_id = request.json.get('value')
BrickSQL().execute_and_commit(
'individual_minifigure/update',
parameters={
'id': item.fields.id,
'quantity': item.fields.quantity,
'description': item.fields.description,
'storage': storage_id if storage_id else None,
'purchase_location': item.fields.purchase_location,
}
)
return jsonify({'success': True})
# Update purchase location
@individual_minifigure_page.route('/<id>/update/purchase_location', methods=['POST'])
@login_required
@exception_handler(__file__)
def update_purchase_location(*, id: str):
item = IndividualMinifigure().select_by_id(id)
location_id = request.json.get('value')
BrickSQL().execute_and_commit(
'individual_minifigure/update',
parameters={
'id': item.fields.id,
'quantity': item.fields.quantity,
'description': item.fields.description,
'storage': item.fields.storage,
'purchase_location': location_id if location_id else None,
}
)
return jsonify({'success': True})
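Each field-specific endpoint above rewrites the full row, varying a single column. That pattern can be expressed once with a small helper; the field names follow the routes, but the dict-based item here is a stand-in, not BrickTracker's actual model:

```python
# Build the full parameter dict for individual_minifigure/update,
# overriding only the fields that changed.
FIELDS = ('quantity', 'description', 'storage', 'purchase_location')


def update_parameters(item, **overrides):
    parameters = {'id': item['id']}
    for field in FIELDS:
        parameters[field] = overrides.get(field, item[field])
    return parameters


item = {'id': 'abc', 'quantity': 1, 'description': '',
        'storage': None, 'purchase_location': None}
params = update_parameters(item, storage='box-3')
assert params == {'id': 'abc', 'quantity': 1, 'description': '',
                  'storage': 'box-3', 'purchase_location': None}
```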
# Update problematic pieces of an individual minifigure
@individual_minifigure_page.route('/<id>/parts/<part>/<int:color>/<int:spare>/<problem>', methods=['POST'])
@login_required
@exception_handler(__file__, json=True)
def problem_part(
*,
id: str,
part: str,
color: int,
spare: int,
problem: str,
) -> Response:
minifigure = IndividualMinifigure().select_by_id(id)
brickpart = BrickPart().select_specific_individual_minifigure(
minifigure,
part,
color,
spare,
)
amount = brickpart.update_problem_individual_minifigure(problem, request.json)
# Info
logger.info('Individual minifigure {figure} ({id}): updated part ({part} color: {color}, spare: {spare}) {problem} count to {amount}'.format(
figure=minifigure.fields.figure,
id=minifigure.fields.id,
part=brickpart.fields.part,
color=brickpart.fields.color,
spare=brickpart.fields.spare,
problem=problem,
amount=amount
))
return jsonify({problem: amount})
# Update checked state of parts
@individual_minifigure_page.route('/<id>/parts/<part>/<int:color>/<int:spare>/checked', methods=['POST'])
@login_required
@exception_handler(__file__, json=True)
def checked_part(
*,
id: str,
part: str,
color: int,
spare: int,
) -> Response:
minifigure = IndividualMinifigure().select_by_id(id)
brickpart = BrickPart().select_specific_individual_minifigure(
minifigure,
part,
color,
spare,
)
checked = brickpart.update_checked_individual_minifigure(request.json)
# Info
logger.info('Individual minifigure {figure} ({id}): updated part ({part} color: {color}, spare: {spare}) checked state to {checked}'.format(
figure=minifigure.fields.figure,
id=minifigure.fields.id,
part=brickpart.fields.part,
color=brickpart.fields.color,
spare=brickpart.fields.spare,
checked=checked
))
return jsonify({'checked': checked})
# Delete individual minifigure instance
@individual_minifigure_page.route('/<id>/delete', methods=['POST'])
@login_required
@exception_handler(__file__)
def delete(*, id: str):
item = IndividualMinifigure().select_by_id(id)
figure = item.fields.figure
item.delete()
return redirect(url_for('minifigure.details', figure=figure))
@@ -14,6 +14,7 @@ from .exceptions import exception_handler
from ..instructions import BrickInstructions
from ..instructions_list import BrickInstructionsList
from ..parser import parse_set
from ..peeron_instructions import PeeronInstructions
from ..socket import MESSAGES
from .upload import upload_helper
@@ -24,6 +25,22 @@ instructions_page = Blueprint(
)
def _render_peeron_select_page(set: str) -> str:
"""Helper function to render the Peeron page selection interface with cached thumbnails."""
peeron = PeeronInstructions(set)
peeron_pages = peeron.find_pages() # This will use the cached thumbnails
current_app.logger.debug(f"[peeron_loaded] Found {len(peeron_pages)} pages for {set}")
return render_template(
'peeron_select.html',
download=True,
pages=peeron_pages,
set=set,
path=current_app.config['SOCKET_PATH'],
namespace=current_app.config['SOCKET_NAMESPACE'],
messages=MESSAGES
)
# Index
@instructions_page.route('/', methods=['GET'])
@exception_handler(__file__)
@@ -141,6 +158,10 @@ def download() -> str:
except Exception:
set = ''
# Check if this is a redirect after Peeron pages were loaded
if request.args.get('peeron_loaded'):
return _render_peeron_select_page(set)
return render_template(
'instructions.html',
download=True,
@@ -160,12 +181,50 @@ def do_download() -> str:
except Exception:
set = ''
return render_template(
'instructions.html',
download=True,
instructions=BrickInstructions.find_instructions(set),
set=set,
path=current_app.config['SOCKET_PATH'],
namespace=current_app.config['SOCKET_NAMESPACE'],
messages=MESSAGES
)
# Check if this is a redirect after Peeron pages were loaded
if request.args.get('peeron_loaded'):
return _render_peeron_select_page(set)
# Try Rebrickable first
try:
rebrickable_instructions = BrickInstructions.find_instructions(set)
# Standard Rebrickable instructions found
return render_template(
'instructions.html',
download=True,
instructions=rebrickable_instructions,
set=set,
path=current_app.config['SOCKET_PATH'],
namespace=current_app.config['SOCKET_NAMESPACE'],
messages=MESSAGES
)
except Exception:
# Rebrickable failed, check if Peeron has instructions (without caching thumbnails yet)
try:
peeron = PeeronInstructions(set)
# Just check if pages exist, don't cache thumbnails yet
if peeron.exists():
# Peeron has instructions - show loading interface
return render_template(
'peeron_select.html',
download=True,
loading_peeron=True, # Flag to show loading state
set=set,
path=current_app.config['SOCKET_PATH'],
namespace=current_app.config['SOCKET_NAMESPACE'],
messages=MESSAGES
)
else:
raise Exception("Not found on Peeron either")
except Exception:
return render_template(
'instructions.html',
download=True,
instructions=[],
set=set,
error='No instructions found on Rebrickable or Peeron',
path=current_app.config['SOCKET_PATH'],
namespace=current_app.config['SOCKET_NAMESPACE'],
messages=MESSAGES
)
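The fallback order in `do_download` (Rebrickable first, then probe Peeron, then report failure) can be sketched as a pure function; the two callables below stand in for the real lookups:

```python
# Sketch of the provider fallback used by do_download().
def find_instructions(set_id, rebrickable, peeron_exists):
    try:
        return ('rebrickable', rebrickable(set_id))
    except Exception:
        if peeron_exists(set_id):
            return ('peeron', None)  # show the Peeron loading interface
        return ('error', 'No instructions found on Rebrickable or Peeron')


def fail(_):
    raise Exception('not found')


assert find_instructions('75192-1', lambda s: ['a.pdf'], lambda s: False)[0] == 'rebrickable'
assert find_instructions('75192-1', fail, lambda s: True)[0] == 'peeron'
assert find_instructions('75192-1', fail, lambda s: False)[0] == 'error'
```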
@@ -3,6 +3,7 @@ from flask import Blueprint, current_app, render_template, request
from .exceptions import exception_handler
from ..minifigure import BrickMinifigure
from ..minifigure_list import BrickMinifigureList
from ..individual_minifigure_list import IndividualMinifigureList
from ..pagination_helper import get_pagination_config, build_pagination_context, get_request_params
from ..set_list import BrickSetList, set_metadata_lists
from ..set_owner_list import BrickSetOwnerList
@@ -72,5 +73,6 @@ def details(*, figure: str) -> str:
using=BrickSetList().using_minifigure(figure),
missing=BrickSetList().missing_minifigure(figure),
damaged=BrickSetList().damaged_minifigure(figure),
individual_instances=IndividualMinifigureList().instances_by_figure(figure),
**set_metadata_lists(as_class=True)
)
@@ -46,6 +46,8 @@ def list() -> str:
purchase_location_filter = request.args.get('purchase_location')
storage_filter = request.args.get('storage')
tag_filter = request.args.get('tag')
year_filter = request.args.get('year')
duplicate_filter = request.args.get('duplicate', '').lower() == 'true'
# Get pagination configuration
per_page, is_mobile = get_pagination_config('sets')
@@ -64,15 +66,31 @@ def list() -> str:
owner_filter=owner_filter,
purchase_location_filter=purchase_location_filter,
storage_filter=storage_filter,
tag_filter=tag_filter
tag_filter=tag_filter,
year_filter=year_filter,
duplicate_filter=duplicate_filter,
use_consolidated=current_app.config['SETS_CONSOLIDATION']
)
pagination_context = build_pagination_context(page, per_page, total_count, is_mobile)
else:
# ORIGINAL MODE - Single page with all data for client-side search
sets = BrickSetList().all()
if current_app.config['SETS_CONSOLIDATION']:
sets = BrickSetList().all_consolidated()
else:
sets = BrickSetList().all()
pagination_context = None
# Convert theme ID to theme name for dropdown display if needed
display_theme_filter = theme_filter
if theme_filter and theme_filter.isdigit():
# Theme filter is an ID, convert to name for dropdown
# Create a fresh BrickSetList instance for theme conversion
converter = BrickSetList()
theme_name = converter._theme_id_to_name(theme_filter)
if theme_name:
display_theme_filter = theme_name
template_context = {
'collection': sets,
'search_query': search_query,
@@ -80,11 +98,13 @@ def list() -> str:
'current_sort': sort_field,
'current_order': sort_order,
'current_status_filter': status_filter,
'current_theme_filter': theme_filter,
'current_theme_filter': display_theme_filter,
'current_owner_filter': owner_filter,
'current_purchase_location_filter': purchase_location_filter,
'current_storage_filter': storage_filter,
'current_tag_filter': tag_filter,
'current_year_filter': year_filter,
'current_duplicate_filter': duplicate_filter,
'brickset_statuses': BrickSetStatusList.list(),
**set_metadata_lists(as_class=True)
}
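The theme-filter handling above converts a numeric theme ID from the URL into a name for the dropdown, falling back to the raw value when no name is found. A hedged sketch of that conversion (the real code delegates to `BrickSetList._theme_id_to_name`; the mapping below is purely illustrative):

```python
# Illustrative theme-ID lookup; real data comes from the themes table.
THEMES = {'158': 'Star Wars', '52': 'Castle'}


def display_theme(theme_filter, themes=THEMES):
    # Only numeric filters are treated as IDs; names pass through unchanged
    if theme_filter and theme_filter.isdigit():
        return themes.get(theme_filter, theme_filter)
    return theme_filter


assert display_theme('158') == 'Star Wars'
assert display_theme('Star Wars') == 'Star Wars'
assert display_theme('999') == '999'  # unknown ID falls back to the raw value
assert display_theme(None) is None
```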
@@ -239,13 +259,42 @@ def deleted(*, id: str) -> str:
@set_page.route('/<id>/details', methods=['GET'])
@exception_handler(__file__)
def details(*, id: str) -> str:
return render_template(
'set.html',
item=BrickSet().select_specific(id),
open_instructions=request.args.get('open_instructions'),
brickset_statuses=BrickSetStatusList.list(all=True),
**set_metadata_lists(as_class=True)
)
# Load the specific set
item = BrickSet().select_specific(id)
# Check if there are multiple instances of this set
all_instances = BrickSetList()
# Load all sets with metadata context for tags, owners, etc.
filter_context = {
'owners': BrickSetOwnerList.as_columns(),
'statuses': BrickSetStatusList.as_columns(),
'tags': BrickSetTagList.as_columns(),
}
all_instances.list(do_theme=True, **filter_context)
# Find all instances with the same set number
same_set_instances = [
record for record in all_instances.records
if record.fields.set == item.fields.set
]
# If consolidation is enabled and multiple instances exist, show consolidated view
if current_app.config['SETS_CONSOLIDATION'] and len(same_set_instances) > 1:
return render_template(
'set.html',
item=item,
all_instances=same_set_instances,
open_instructions=request.args.get('open_instructions'),
**set_metadata_lists(as_class=True)
)
else:
# Single instance or consolidation disabled, show normal view
return render_template(
'set.html',
item=item,
open_instructions=request.args.get('open_instructions'),
**set_metadata_lists(as_class=True)
)
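The instance-matching step in `details()` collects every record sharing the requested item's set number and consolidates only when the feature flag is on and more than one instance exists. A minimal sketch over dict stand-ins:

```python
# Sketch of the consolidation decision in the set details view.
def same_set_instances(item, records):
    return [record for record in records if record['set'] == item['set']]


def should_consolidate(item, records, consolidation_enabled):
    return consolidation_enabled and len(same_set_instances(item, records)) > 1


records = [{'id': 1, 'set': '10179-1'}, {'id': 2, 'set': '10179-1'},
           {'id': 3, 'set': '75192-1'}]
assert len(same_set_instances({'set': '10179-1'}, records)) == 2
assert should_consolidate({'set': '10179-1'}, records, True) is True
assert should_consolidate({'set': '75192-1'}, records, True) is False
assert should_consolidate({'set': '10179-1'}, records, False) is False
```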
# Update problematic pieces of a set
@@ -294,6 +343,50 @@ def problem_part(
return jsonify({problem: amount})
# Update checked state of parts during walkthrough
@set_page.route('/<id>/parts/<part>/<int:color>/<int:spare>/checked', defaults={'figure': None}, methods=['POST']) # noqa: E501
@set_page.route('/<id>/minifigures/<figure>/parts/<part>/<int:color>/<int:spare>/checked', methods=['POST']) # noqa: E501
@login_required
@exception_handler(__file__, json=True)
def checked_part(
*,
id: str,
figure: str | None,
part: str,
color: int,
spare: int,
) -> Response:
brickset = BrickSet().select_specific(id)
if figure is not None:
brickminifigure = BrickMinifigure().select_specific(brickset, figure)
else:
brickminifigure = None
brickpart = BrickPart().select_specific(
brickset,
part,
color,
spare,
minifigure=brickminifigure,
)
checked = brickpart.update_checked(request.json)
# Info
logger.info('Set {set} ({id}): updated part ({part} color: {color}, spare: {spare}, minifigure: {figure}) checked state to {checked}'.format( # noqa: E501
set=brickset.fields.set,
id=brickset.fields.id,
figure=figure,
part=brickpart.fields.part,
color=brickpart.fields.color,
spare=brickpart.fields.spare,
checked=checked
))
return jsonify({'checked': checked})
# Refresh a set
@set_page.route('/refresh/<set>/', methods=['GET'])
@set_page.route('/<id>/refresh', methods=['GET'])
@@ -0,0 +1,189 @@
"""
Statistics views for BrickTracker
Provides statistics and analytics pages
"""
import logging
from flask import Blueprint, render_template, request, url_for, redirect, current_app
from werkzeug.wrappers.response import Response
from .exceptions import exception_handler
from ..statistics import BrickStatistics
logger = logging.getLogger(__name__)
statistics_page = Blueprint('statistics', __name__, url_prefix='/statistics')
@statistics_page.route('/', methods=['GET'])
@exception_handler(__file__)
def overview() -> str:
"""Statistics overview page with metrics"""
stats = BrickStatistics()
# Get all statistics data
overview_stats = stats.get_overview()
theme_stats = stats.get_theme_statistics()
storage_stats = stats.get_storage_statistics()
purchase_location_stats = stats.get_purchase_location_statistics()
financial_summary = stats.get_financial_summary()
collection_summary = stats.get_collection_summary()
sets_by_year_stats = stats.get_sets_by_year_statistics()
purchases_by_year_stats = stats.get_purchases_by_year_statistics()
year_summary = stats.get_year_summary()
# Prepare chart data for visualization (only if charts are enabled)
chart_data = {}
if current_app.config['STATISTICS_SHOW_CHARTS']:
chart_data = prepare_chart_data(sets_by_year_stats, purchases_by_year_stats)
# Get filter parameters for clickable statistics
filter_type = request.args.get('filter_type')
filter_value = request.args.get('filter_value')
# If a filter is applied, redirect to sets page with appropriate filters
if filter_type and filter_value:
return redirect_to_filtered_sets(filter_type, filter_value)
return render_template(
'statistics.html',
overview=overview_stats,
theme_statistics=theme_stats,
storage_statistics=storage_stats,
purchase_location_statistics=purchase_location_stats,
financial_summary=financial_summary,
collection_summary=collection_summary,
sets_by_year_statistics=sets_by_year_stats,
purchases_by_year_statistics=purchases_by_year_stats,
year_summary=year_summary,
chart_data=chart_data,
title="Statistics Overview"
)
def redirect_to_filtered_sets(filter_type: str, filter_value: str) -> Response:
"""Redirect to sets page with appropriate filters based on statistics click"""
# Map filter types to sets page parameters
filter_mapping = {
'theme': {'theme': filter_value},
'storage': {'storage': filter_value},
'purchase_location': {'purchase_location': filter_value},
'has_price': {'has_price': '1'} if filter_value == '1' else {},
'missing_parts': {'status': 'has-missing'},
'damaged_parts': {'status': 'has-damaged'},
'has_storage': {'status': 'has-storage'},
'no_storage': {'status': '-has-storage'},
}
# Get the appropriate filter parameters
filter_params = filter_mapping.get(filter_type, {})
if filter_params:
return redirect(url_for('set.list', **filter_params))
else:
# Default fallback to sets page
return redirect(url_for('set.list'))
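The statistics-to-sets-page mapping above is a pure lookup and can be exercised on its own, using the same table as `redirect_to_filtered_sets`:

```python
# The filter mapping from redirect_to_filtered_sets() as a pure function.
def sets_filter_params(filter_type, filter_value):
    mapping = {
        'theme': {'theme': filter_value},
        'storage': {'storage': filter_value},
        'purchase_location': {'purchase_location': filter_value},
        'has_price': {'has_price': '1'} if filter_value == '1' else {},
        'missing_parts': {'status': 'has-missing'},
        'damaged_parts': {'status': 'has-damaged'},
        'has_storage': {'status': 'has-storage'},
        'no_storage': {'status': '-has-storage'},
    }
    # Unknown filter types fall back to an unfiltered sets page
    return mapping.get(filter_type, {})


assert sets_filter_params('theme', 'Star Wars') == {'theme': 'Star Wars'}
assert sets_filter_params('missing_parts', '') == {'status': 'has-missing'}
assert sets_filter_params('has_price', '0') == {}
assert sets_filter_params('unknown', 'x') == {}
```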
@statistics_page.route('/themes', methods=['GET'])
@exception_handler(__file__)
def themes() -> str:
"""Detailed theme statistics page"""
stats = BrickStatistics()
theme_stats = stats.get_theme_statistics()
return render_template(
'statistics_themes.html',
theme_statistics=theme_stats,
title="Theme Statistics"
)
@statistics_page.route('/storage', methods=['GET'])
@exception_handler(__file__)
def storage() -> str:
"""Detailed storage statistics page"""
stats = BrickStatistics()
storage_stats = stats.get_storage_statistics()
return render_template(
'statistics_storage.html',
storage_statistics=storage_stats,
title="Storage Statistics"
)
@statistics_page.route('/purchase-locations', methods=['GET'])
@exception_handler(__file__)
def purchase_locations() -> str:
"""Detailed purchase location statistics page"""
stats = BrickStatistics()
purchase_stats = stats.get_purchase_location_statistics()
return render_template(
'statistics_purchase_locations.html',
purchase_location_statistics=purchase_stats,
title="Purchase Location Statistics"
)
def prepare_chart_data(sets_by_year_stats, purchases_by_year_stats):
"""Prepare data for Chart.js visualization"""
import json
# Get all years from both datasets
all_years = set()
# Add years from sets by year
if sets_by_year_stats:
for year_stat in sets_by_year_stats:
if 'year' in year_stat:
all_years.add(year_stat['year'])
# Add years from purchases by year
if purchases_by_year_stats:
for year_stat in purchases_by_year_stats:
if 'purchase_year' in year_stat:
all_years.add(int(year_stat['purchase_year']))
# Create sorted list of years
years = sorted(list(all_years))
# Initialize data arrays
sets_data = []
parts_data = []
minifigs_data = []
# Create lookup dictionaries for quick access
sets_by_year_lookup = {}
if sets_by_year_stats:
for year_stat in sets_by_year_stats:
if 'year' in year_stat:
sets_by_year_lookup[year_stat['year']] = year_stat
# Fill data arrays
for year in years:
# Get sets and parts data from sets_by_year
year_data = sets_by_year_lookup.get(year)
if year_data:
sets_data.append(year_data.get('total_sets', 0))
parts_data.append(year_data.get('total_parts', 0))
# Use actual minifigure count from the database
minifigs_data.append(year_data.get('total_minifigures', 0))
else:
sets_data.append(0)
parts_data.append(0)
minifigs_data.append(0)
return {
'years': json.dumps(years),
'sets_data': json.dumps(sets_data),
'parts_data': json.dumps(parts_data),
'minifigs_data': json.dumps(minifigs_data)
}
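The year-merging logic in `prepare_chart_data` (union the years from both datasets, then emit aligned, JSON-encoded series with zeros for years without set data) condenses to:

```python
import json

# Condensed version of prepare_chart_data()'s year merge, covering only
# the sets series for brevity.
def chart_data(sets_by_year, purchases_by_year):
    years = sorted(
        {row['year'] for row in sets_by_year} |
        {int(row['purchase_year']) for row in purchases_by_year}
    )
    lookup = {row['year']: row for row in sets_by_year}
    sets_data = [lookup.get(year, {}).get('total_sets', 0) for year in years]
    return {'years': json.dumps(years), 'sets_data': json.dumps(sets_data)}


data = chart_data(
    [{'year': 2020, 'total_sets': 3}],
    [{'purchase_year': '2021'}],  # purchase years arrive as strings
)
assert data == {'years': '[2020, 2021]', 'sets_data': '[3, 0]'}
```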
@@ -1,9 +1,11 @@
from flask import Blueprint, render_template
from .exceptions import exception_handler
from ..individual_minifigure_list import IndividualMinifigureList
from ..set_list import BrickSetList, set_metadata_lists
from ..set_storage import BrickSetStorage
from ..set_storage_list import BrickSetStorageList
from ..sql import BrickSQL
storage_page = Blueprint('storage', __name__, url_prefix='/storages')
@@ -12,9 +14,48 @@ storage_page = Blueprint('storage', __name__, url_prefix='/storages')
@storage_page.route('/', methods=['GET'])
@exception_handler(__file__)
def list() -> str:
# Get counts of items with no storage
sql = BrickSQL()
# Count sets with no storage
sets_no_storage_query = 'SELECT COUNT(*) FROM "bricktracker_sets" WHERE "storage" IS NULL'
sql.cursor.execute(sets_no_storage_query)
sets_no_storage = sql.cursor.fetchone()[0]
# Count individual minifigures with no storage
minifigs_no_storage_query = 'SELECT COUNT(*) FROM "bricktracker_individual_minifigures" WHERE "storage" IS NULL'
sql.cursor.execute(minifigs_no_storage_query)
minifigs_no_storage = sql.cursor.fetchone()[0]
return render_template(
'storages.html',
table_collection=BrickSetStorageList.all(),
sets_no_storage=sets_no_storage,
minifigs_no_storage=minifigs_no_storage,
)
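The two COUNT queries above can be exercised against a throwaway in-memory database; the table and column names mirror the queries, though the real schema has more columns:

```python
import sqlite3

# Reproduce the no-storage counts from the storages list view.
connection = sqlite3.connect(':memory:')
connection.executescript('''
    CREATE TABLE "bricktracker_sets" ("id" TEXT, "storage" TEXT);
    CREATE TABLE "bricktracker_individual_minifigures" ("id" TEXT, "storage" TEXT);
    INSERT INTO "bricktracker_sets" VALUES ('a', NULL), ('b', 'box-1');
    INSERT INTO "bricktracker_individual_minifigures" VALUES ('m1', NULL);
''')

sets_no_storage = connection.execute(
    'SELECT COUNT(*) FROM "bricktracker_sets" WHERE "storage" IS NULL'
).fetchone()[0]
minifigs_no_storage = connection.execute(
    'SELECT COUNT(*) FROM "bricktracker_individual_minifigures" WHERE "storage" IS NULL'
).fetchone()[0]

assert sets_no_storage == 1
assert minifigs_no_storage == 1
```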
# Storage details - no storage
@storage_page.route('/no_storage/details')
@exception_handler(__file__)
def no_storage_details() -> str:
# Create a mock storage object for "no storage"
from ..record import BrickRecord
no_storage = BrickRecord()
no_storage.fields.id = None
no_storage.fields.name = 'Not in a storage location'
# Get sets and individual minifigures with no storage
sets = BrickSetList().without_storage()
individual_minifigures = IndividualMinifigureList().without_storage()
return render_template(
'storage.html',
item=no_storage,
sets=sets,
individual_minifigures=individual_minifigures,
**set_metadata_lists(as_class=True)
)
@@ -28,5 +69,6 @@ def details(*, id: str) -> str:
'storage.html',
item=storage,
sets=BrickSetList().using_storage(storage),
individual_minifigures=IndividualMinifigureList().using_storage(storage),
**set_metadata_lists(as_class=True)
)
