---
name: data-import-workflow
description: Managing the lifecycle of weapon and mod data from external sources to the local database
---
# Data Import Workflow Skill
Use this skill when working on or running the data import pipeline.
## Scope
- Running the importer to populate or refresh data
- Extending the importer to handle new fields or types
- Debugging import failures
## Running the Importer
| Command | Use Case |
|---|---|
| `task importer:start` | Fetch fresh data from the tarkov.dev API and update the database. |
| `task importer:start:use-cache` | Use locally cached JSON files (faster, avoids API rate limits). |
| `task importer:start:cache-only` | Fetch from the API and update the local cache, but skip database writes. |
## Import Behavior
- Purge: Existing weapons, mods, trader offers, and optimum builds are deleted.
- Fetch: Data is retrieved from the tarkov.dev GraphQL API (or the local cache).
- Transform: The external schema is mapped to the internal `models` structs.
- Persist: Data is written to the database within a transaction.
- Invalidate: `optimum_builds` are purged, since they depend on the imported data.
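To make the flow concrete, here is a minimal sketch of one import run using a `database/sql`-style transaction. The helper functions (`fetchAndTransform`, `persist`) and the table names other than `optimum_builds` are assumptions for illustration; the real code lives in `internal/importers/` and `internal/models/` and may be structured differently.

```go
// Hypothetical sketch of the import lifecycle; actual names differ.
package importer

import (
	"context"
	"database/sql"
	"fmt"
)

func runImport(ctx context.Context, db *sql.DB, useCache bool) error {
	tx, err := db.BeginTx(ctx, nil)
	if err != nil {
		return err
	}
	defer tx.Rollback() // no-op once Commit succeeds

	// Purge: clear existing weapons, mods, trader offers, and optimum builds.
	// Table names here are assumptions for illustration.
	for _, table := range []string{"optimum_builds", "trader_offers", "mods", "weapons"} {
		if _, err := tx.ExecContext(ctx, "DELETE FROM "+table); err != nil {
			return fmt.Errorf("purge %s: %w", table, err)
		}
	}

	// Fetch + Transform: pull from the tarkov.dev GraphQL API (or local cache)
	// and map the external schema onto the internal models structs.
	weapons, mods, err := fetchAndTransform(ctx, useCache) // hypothetical helper
	if err != nil {
		return fmt.Errorf("fetch/transform: %w", err)
	}

	// Persist: write everything within the same transaction.
	if err := persist(ctx, tx, weapons, mods); err != nil { // hypothetical helper
		return fmt.Errorf("persist: %w", err)
	}

	return tx.Commit()
}
```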
## Extending the Importer
When adding new fields or types:
- Update GraphQL query: Modify `internal/tarkovdev/schemas/queries.graphql`.
- Regenerate client: Run `task tarkovdev:regenerate`.
- Update models: Add new fields to the structs in `internal/models/`.
- Update mapping: Adjust the conversion logic in `internal/importers/`.
- Update SQL: Modify the `Upsert` functions in `internal/models/` to include the new columns.
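As a rough illustration of the model and mapping steps, the sketch below adds a hypothetical `Ergonomics` field. The struct layout, the generated `tarkovdev.Weapon` type, and the field names are assumptions; check the actual definitions in `internal/models/` and `internal/importers/` before copying anything.

```go
// internal/models/ (sketch): add the new field to the internal struct.
// The corresponding Upsert function would also need the new column added
// to its INSERT statement. Names are illustrative only.
type Weapon struct {
	ID         string
	Name       string
	Ergonomics float64 // hypothetical new field
}

// internal/importers/ (sketch): map the regenerated API type onto the model
// once the field has been added to queries.graphql and the client regenerated.
func weaponFromAPI(w tarkovdev.Weapon) models.Weapon { // hypothetical generated type
	return models.Weapon{
		ID:         w.Id,
		Name:       w.Name,
		Ergonomics: w.Ergonomics, // newly queried field
	}
}
```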
## Troubleshooting
- Rate limiting: Use `--use-cache` for repeated runs during development (see `task importer:start:use-cache` above).
- Schema mismatch: Run `task tarkovdev:get-schema`, then `task tarkovdev:regenerate`.
- Connection issues: Verify `POSTGRES_HOST` in `.env` and ensure the database is running.