Added uploads, part renaming, bulk data import acceptance
This commit is contained in:
112 ACCEPT_JOBS_README.md Normal file
@@ -0,0 +1,112 @@
# Bulk Import Job Acceptance Automation

This feature automates the acceptance of bulk import jobs in PartDB.

## What It Does
The automation will:

1. Navigate to your import job page (or you can navigate there manually)
2. Find all selectable "Update Part" buttons (only `btn btn-primary` without the `disabled` class)
3. For each button:
   - Click the button and wait for the page to load (the flow stays on the same page)
   - Click "Save" and wait for the page to load
   - Click "Save" again and wait for the page to load
   - Click "Complete" to finish the job
4. Repeat until no more enabled "Update Part" buttons are found
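The repeat-until-done loop in the steps above can be sketched in plain Python (a minimal simulation, not the Selenium implementation; the button dicts stand in for DOM elements and `process` for the click/Save/Save/Complete sequence):

```python
def selectable(buttons):
    """Buttons that are clickable: 'Update Part' text without the 'disabled' class."""
    return [b for b in buttons
            if b["text"].lower() == "update part"
            and "disabled" not in b["class"].split()]

def accept_all(buttons, process):
    """Process buttons one at a time until none remain selectable."""
    accepted = 0
    while True:
        ready = selectable(buttons)
        if not ready:
            break
        btn = ready[0]
        process(btn)                 # stand-in for Update Part -> Save -> Save -> Complete
        btn["class"] += " disabled"  # a completed job's button becomes disabled
        accepted += 1
    return accepted
```

Once a job completes, its button gains the `disabled` class, so the loop naturally terminates when every job has been handled.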
## How to Use

### Option 1: From the UI (Recommended)

1. Run the main application: `python main.py`
2. On the home page, click the **"Accept Import Jobs"** button in the Tools section
3. A browser window will open
4. When prompted, navigate to the import job page where the "Update part" buttons are
5. Press Enter in the console to start the automation
6. Watch as the automation processes each job
7. When complete, press Enter to close the browser
### Option 2: Standalone Script

1. Open PowerShell/Terminal
2. Run: `python workflows\accept_import_jobs.py`
3. Follow the same steps as above
### Option 3: With Direct URL

If you know the exact URL of the import job page, you can pass it to the workflow directly:

```python
from workflows.accept_import_jobs import run_accept_import_jobs

# Provide the direct URL
run_accept_import_jobs("https://partdb.neutronservices.duckdns.org/en/import/jobs/123")
```
## Configuration

In `config.py`, you can adjust:

```python
# Maximum number of jobs to process in one run (prevents infinite loops)
ACCEPT_JOBS_MAX_ITERATIONS = 100

# Delay between job attempts (seconds)
ACCEPT_JOBS_RETRY_DELAY = 1.0

# Whether to run the browser in headless mode
HEADLESS_CONTROLLER = False  # Set to True to hide the browser window
```
## Button Detection

The automation specifically looks for "Update Part" buttons that:

- Have the class `btn btn-primary` (indicating a clickable button)
- Do **NOT** have the `disabled` class (which would make them unclickable)

This ensures only valid, actionable import jobs are processed; disabled buttons are skipped.

Button texts detected:

- "Update Part"
- "Update part"

It will click Save/Complete buttons with these texts:

- "Save"
- "Save changes"
- "Complete"

**Important:** The automation filters out any buttons with `class="btn btn-primary disabled"` to avoid clicking non-actionable buttons.
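The class-and-text filter described above amounts to the following check (a pure-Python sketch; the real implementation applies the same rule through XPath predicates and Selenium element attributes):

```python
def is_actionable(class_attr: str, text: str) -> bool:
    """True for an 'Update Part'/'Update part' button that is not disabled."""
    classes = class_attr.split()
    return (
        text.strip().lower() == "update part"  # matches both capitalizations
        and "btn" in classes
        and "btn-primary" in classes
        and "disabled" not in classes          # skip non-actionable buttons
    )
```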
## Troubleshooting

### No buttons found

- Make sure you're on the correct page with import jobs
- Check that there are "Update Part" buttons with class `btn btn-primary` (without `disabled`)
- The buttons must be visible, enabled, and must not have the `disabled` class
- Only jobs that are ready to process will have enabled buttons

### Automation stops early

- Check the console output for error messages
- Some jobs might have different button text or layout
- You can adjust the XPath selectors in `provider/selenium_flow.py` if needed

### Browser closes immediately

- Make sure you press Enter only when you're on the correct page
- Check that you're logged in to PartDB
## Statistics

After completion, you'll see:

- Number of jobs successfully processed
- Number of jobs that failed
- Total time taken
## Technical Details

The automation uses:

- **Selenium WebDriver** for browser automation
- **Firefox** as the default browser (with Chrome fallback)
- **Robust element detection** that handles stale elements and page reloads
- **Automatic retry logic** for clicking buttons

The main function is `accept_bulk_import_jobs()` in `provider/selenium_flow.py`.
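The retry logic for clicks can be sketched generically (`retry_click` is a hypothetical helper for illustration; the project's actual retry logic lives in `provider/selenium_flow.py`):

```python
import time

def retry_click(click_fn, attempts: int = 3, delay: float = 0.5) -> bool:
    """Call click_fn until it succeeds or attempts run out; return success.

    Stale-element and overlay errors surface as exceptions from the click,
    so a short backoff and retry usually recovers after a page reload.
    """
    for i in range(attempts):
        try:
            click_fn()
            return True
        except Exception:
            if i < attempts - 1:
                time.sleep(delay)  # back off before retrying
    return False
```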
7 Import CSVs/voltage_reference.csv Normal file
@@ -0,0 +1,7 @@
Datasheet,Image,DK Part #,Mfr Part #,Mfr,Supplier,Description,Stock,Price,@ qty,Min Qty, Package,Series,Product Status,Reference Type,Output Type,Voltage - Output (Min/Fixed),Current - Output,Tolerance,Temperature Coefficient,Noise - 0.1Hz to 10Hz,Noise - 10Hz to 10kHz,Voltage - Input,Current - Supply,Operating Temperature,Mounting Type,Package / Case,Supplier Device Package,URL
https://www.ti.com/lit/ds/symlink/ref30.pdf?ts=1743588330976&ref_url=https%253A%252F%252Fwww.ti.com%252Fproduct%252FREF30%252Fpart-details%252FREF3033AIDBZT%253FkeyMatch%253DREF3033AIDBZT%2526tisearch%253Duniversal_search%2526usecase%253DOPN,//mm.digikey.com/Volume0/opasdata/d220001/medias/images/3969/296%7E4203227%7EDBZ%7E3.JPG,"296-26324-2-ND,296-26324-1-ND,296-26324-6-ND",REF3033AIDBZR,Texas Instruments,Texas Instruments,IC VREF SERIES 0.2% SOT23-3,"24,432",1.75,0,1,"Tape & Reel (TR),Cut Tape (CT),Digi-Reel®",-,Active,Series,Fixed,3.3V,25 mA,±0.2%,75ppm/°C,36µVp-p,105µVrms,3.35V ~ 5.5V,50µA,-40°C ~ 125°C (TA),Surface Mount,"TO-236-3, SC-59, SOT-23-3",SOT-23-3,https://www.digikey.com.au/en/products/detail/texas-instruments/REF3033AIDBZR/1573916
https://www.ti.com/lit/ds/symlink/ref30.pdf?ts=1743588330976&ref_url=https%253A%252F%252Fwww.ti.com%252Fproduct%252FREF30%252Fpart-details%252FREF3033AIDBZT%253FkeyMatch%253DREF3033AIDBZT%2526tisearch%253Duniversal_search%2526usecase%253DOPN,//mm.digikey.com/Volume0/opasdata/d220001/medias/images/3969/296%7E4203227%7EDBZ%7E3.JPG,"296-26321-2-ND,296-26321-1-ND,296-26321-6-ND",REF3020AIDBZR,Texas Instruments,Texas Instruments,IC VREF SERIES 0.2% SOT23-3,"19,223",1.75,0,1,"Tape & Reel (TR),Cut Tape (CT),Digi-Reel®",-,Active,Series,Fixed,2.048V,25 mA,±0.2%,75ppm/°C,23µVp-p,65µVrms,2.098V ~ 5.5V,50µA,-40°C ~ 125°C (TA),Surface Mount,"TO-236-3, SC-59, SOT-23-3",SOT-23-3,https://www.digikey.com.au/en/products/detail/texas-instruments/REF3020AIDBZR/1573908
https://www.ti.com/lit/ds/symlink/ref30.pdf?ts=1743588330976&ref_url=https%253A%252F%252Fwww.ti.com%252Fproduct%252FREF30%252Fpart-details%252FREF3033AIDBZT%253FkeyMatch%253DREF3033AIDBZT%2526tisearch%253Duniversal_search%2526usecase%253DOPN,//mm.digikey.com/Volume0/opasdata/d220001/medias/images/3969/296%7E4203227%7EDBZ%7E3.JPG,"296-26323-2-ND,296-26323-1-ND,296-26323-6-ND",REF3030AIDBZR,Texas Instruments,Texas Instruments,IC VREF SERIES 0.2% SOT23-3,"11,327",1.75,0,1,"Tape & Reel (TR),Cut Tape (CT),Digi-Reel®",-,Active,Series,Fixed,3V,25 mA,±0.2%,75ppm/°C,33µVp-p,94µVrms,3.05V ~ 5.5V,50µA,-40°C ~ 125°C (TA),Surface Mount,"TO-236-3, SC-59, SOT-23-3",SOT-23-3,https://www.digikey.com.au/en/products/detail/texas-instruments/REF3030AIDBZR/1573913
https://www.ti.com/lit/ds/symlink/ref30.pdf?ts=1743588330976&ref_url=https%253A%252F%252Fwww.ti.com%252Fproduct%252FREF30%252Fpart-details%252FREF3033AIDBZT%253FkeyMatch%253DREF3033AIDBZT%2526tisearch%253Duniversal_search%2526usecase%253DOPN,//mm.digikey.com/Volume0/opasdata/d220001/medias/images/3969/296%7E4203227%7EDBZ%7E3.JPG,"296-26322-2-ND,296-26322-1-ND,296-26322-6-ND",REF3025AIDBZR,Texas Instruments,Texas Instruments,IC VREF SERIES 0.2% SOT23-3,"11,179",1.75,0,1,"Tape & Reel (TR),Cut Tape (CT),Digi-Reel®",-,Active,Series,Fixed,2.5V,25 mA,±0.2%,75ppm/°C,28µVp-p,80µVrms,2.55V ~ 5.5V,50µA,-40°C ~ 125°C (TA),Surface Mount,"TO-236-3, SC-59, SOT-23-3",SOT-23-3,https://www.digikey.com.au/en/products/detail/texas-instruments/REF3025AIDBZR/1573911
https://www.ti.com/lit/ds/symlink/ref30.pdf?ts=1743588330976&ref_url=https%253A%252F%252Fwww.ti.com%252Fproduct%252FREF30%252Fpart-details%252FREF3033AIDBZT%253FkeyMatch%253DREF3033AIDBZT%2526tisearch%253Duniversal_search%2526usecase%253DOPN,//mm.digikey.com/Volume0/opasdata/d220001/medias/images/3969/296%7E4203227%7EDBZ%7E3.JPG,"296-32213-2-ND,296-32213-1-ND,296-32213-6-ND",REF3012AIDBZR,Texas Instruments,Texas Instruments,IC VREF SERIES 0.2% SOT23-3,"3,143",1.75,0,1,"Tape & Reel (TR),Cut Tape (CT),Digi-Reel®",-,Active,Series,Fixed,1.25V,25 mA,±0.2%,75ppm/°C,14µVp-p,42µVrms,1.8V ~ 5.5V,50µA,-40°C ~ 125°C (TA),Surface Mount,"TO-236-3, SC-59, SOT-23-3",SOT-23-3,https://www.digikey.com.au/en/products/detail/texas-instruments/REF3012AIDBZR/1573905
https://www.ti.com/lit/ds/symlink/ref30.pdf?ts=1743588330976&ref_url=https%253A%252F%252Fwww.ti.com%252Fproduct%252FREF30%252Fpart-details%252FREF3033AIDBZT%253FkeyMatch%253DREF3033AIDBZT%2526tisearch%253Duniversal_search%2526usecase%253DOPN,//mm.digikey.com/Volume0/opasdata/d220001/medias/images/3969/296%7E4203227%7EDBZ%7E3.JPG,"REF3040AIDBZTTR-ND,REF3040AIDBZTCT-ND,REF3040AIDBZTDKR-ND",REF3040AIDBZT,Texas Instruments,Texas Instruments,IC VREF SERIES 0.2% SOT23-3,"7,615",2.06,0,1,"Tape & Reel (TR),Cut Tape (CT),Digi-Reel®",-,Active,Series,Fixed,4.096V,25 mA,±0.2%,75ppm/°C,45µVp-p,128µVrms,4.146V ~ 5.5V,50µA,-40°C ~ 125°C (TA),Surface Mount,"TO-236-3, SC-59, SOT-23-3",SOT-23-3,https://www.digikey.com.au/en/products/detail/texas-instruments/REF3040AIDBZT/459294
19 config.py
@@ -5,7 +5,7 @@ PARTDB_TOKEN = "tcp_564c6518a8476c25c68778e640c1bf40eecdec9f67be580bbd6504e9b6e
 UI_LANG_PATH = "/en"

 # Modes: "bulk" or "scan"
-MODE = "scan"
+MODE = "bulk"

 # Scanner
 COM_PORT = "COM7"
@@ -30,4 +30,19 @@ ENV_USER = "Nick"
 ENV_PASSWORD = "O@IyECa^XND7BvPpRX9XRKBhv%XVwCV4"

 # UI defaults
 WINDOW_GEOM = "860x560"
+
+# Bulk import job acceptance
+ACCEPT_JOBS_MAX_ITERATIONS = 100  # Maximum number of jobs to process in one run
+ACCEPT_JOBS_RETRY_DELAY = 1.0  # Delay between job attempts (seconds)
+
+# Bulk add workflow settings
+ENABLE_RESISTORS_0805 = False
+ENABLE_RESISTORS_0603 = False
+ENABLE_CAPS_0805 = False
+ADD_1206_FOR_LOW_V_CAPS = False
+LOW_V_CAP_THRESHOLD_V = 10.0
+UPSIZED_1206_TARGET_V = "25V"
+DEFAULT_CAP_MANUFACTURER = "Samsung"
+MAX_TO_CREATE = None  # None for all, or a number to limit
+SKIP_IF_EXISTS = True
@@ -1,7 +1,7 @@
 import re, math
 from typing import Optional, List

-RE_RES_SIMPLE = re.compile(r"(?i)^\s*(\d+(?:\.\d+)?)\s*(ohm|ohms|r|k|m|kohm|mohm|kΩ|mΩ|kOhm|MOhm)?\s*$")
+RE_RES_SIMPLE = re.compile(r"(?i)^\s*(\d+(?:\.\d+)?)\s*(ohm|ohms|r|k|m|kohm|kohms|mohm|mohms|kΩ|mΩ|kOhm|kOhms|MOhm|MOhms)?\s*$")
 RE_RES_LETTER = re.compile(r"(?i)^\s*(\d+)([rkm])(\d+)?\s*$")
 RE_CAP_SIMPLE = re.compile(r"(?i)^(\d+(?:\.\d+)?)(p|n|u|m|pf|nf|uf|mf|f)?$")
 RE_CAP_LETTER = re.compile(r"(?i)^(\d+)([pnu])(\d+)?$")
@@ -44,11 +44,11 @@ def value_to_code(ohms: float) -> str:

 def parse_resistance_to_ohms(value: str, unit: Optional[str]) -> Optional[float]:
     if value is None: return None
-    s = str(value).strip().replace(" ", "").replace("Ω","ohm").replace("Ω","ohm")
+    s = str(value).strip().replace(" ", "").replace(",", "").replace("Ω","ohm").replace("Ω","ohm")
     m = RE_RES_SIMPLE.fullmatch(s)
     if m and unit is None:
         num = float(m.group(1)); u = (m.group(2) or "").lower()
-        table = {"ohm":1.0, "ohms":1.0, "r":1.0, "k":1e3, "kohm":1e3, "kω":1e3, "m":1e6, "mohm":1e6, "mω":1e6}
+        table = {"ohm":1.0, "ohms":1.0, "r":1.0, "k":1e3, "kohm":1e3, "kohms":1e3, "kω":1e3, "m":1e6, "mohm":1e6, "mohms":1e6, "mω":1e6}
         return num * table.get(u, 1.0)
     m = RE_RES_LETTER.fullmatch(s)
     if m and unit is None:
@@ -60,7 +60,7 @@ def parse_resistance_to_ohms(value: str, unit: Optional[str]) -> Optional[float]
         num = float(s)
         if unit is None: return num
         u = str(unit).strip().lower()
-        table = {"ohm":1.0, "ohms":1.0, "r":1.0, "k":1e3, "kohm":1e3, "m":1e6, "mohm":1e6}
+        table = {"ohm":1.0, "ohms":1.0, "r":1.0, "k":1e3, "kohm":1e3, "kohms":1e3, "m":1e6, "mohm":1e6, "mohms":1e6}
         mul = table.get(u);
         return num * mul if mul else None
     except ValueError:
@@ -85,7 +85,7 @@ def format_ohms_for_eda(ohms: float) -> str:

 def parse_capacitance_to_farads(value: str, unit: Optional[str]) -> Optional[float]:
     if value is None: return None
-    s = str(value).strip().replace(" ", "").replace("µ","u").replace("μ","u")
+    s = str(value).strip().replace(" ", "").replace(",", "").replace("µ","u").replace("μ","u")
     m = RE_CAP_SIMPLE.fullmatch(s)
     if m and unit is None:
         num = float(m.group(1)); su = (m.group(2) or "f").lower()
@@ -0,0 +1 @@
+[{"name": "PHPSESSID", "value": "5c79b03879c439fa0991d88e4d5206b1", "path": "/", "domain": "partdb.neutronservices.duckdns.org", "secure": true, "httpOnly": true, "sameSite": "Lax"}]
@@ -663,6 +663,218 @@ def set_eda_from_capacitance(api: PartDB, part_id: int, *, max_wait_s: int = 12,
         if time.time() >= deadline: return False
         time.sleep(poll_every)
+
+
+def accept_bulk_import_jobs(driver, base_url: str, lang: str, job_url: str = None, max_iterations: int = 100) -> Tuple[int, int, int]:
+    """
+    Automates accepting bulk import jobs by:
+    1. Finding and marking skipped parts (cards with "0 results found") as skipped
+    2. Finding the first selectable "Update part" button in a border-success card
+    3. Clicking it and waiting for page load
+    4. Clicking "Save" and waiting for page load
+    5. Clicking "Save" again and waiting for page load
+    6. Clicking "Complete" and waiting for page load
+    7. Repeating until no more selectable buttons exist
+
+    Returns (successful_count, failed_count, skipped_count)
+    """
+    # Navigate to the job/import page if a URL was provided
+    if job_url:
+        driver.get(job_url)
+        time.sleep(1.5)
+
+    successful = 0
+    failed = 0
+    total_skipped = 0
+
+    for iteration in range(max_iterations):
+        print(f"\n[Accept Jobs] Iteration {iteration + 1}/{max_iterations}")
+
+        # Scroll to top first to ensure we see all buttons
+        try:
+            driver.execute_script("window.scrollTo(0, 0);")
+            time.sleep(0.5)
+        except Exception:
+            pass
+
+        # First check for skipped parts (border-warning cards with "0 results found"
+        # or "No results found") and mark them by clicking "Mark Skipped"
+        skipped_count = 0
+        try:
+            # Find cards with border-warning that have a "0 results found" badge or "No results found" alert
+            warning_cards = driver.find_elements(By.XPATH, "//div[contains(@class, 'card') and contains(@class, 'border-warning')]")
+
+            for card in warning_cards:
+                try:
+                    # Check if it has a "0 results found" badge or "No results found" message
+                    has_no_results = False
+                    try:
+                        badge = card.find_element(By.XPATH, ".//span[contains(@class, 'badge') and contains(@class, 'bg-info') and contains(., 'results found')]")
+                        if badge and '0 results' in badge.text:
+                            has_no_results = True
+                    except Exception:
+                        pass
+
+                    if not has_no_results:
+                        try:
+                            alert = card.find_element(By.XPATH, ".//div[contains(@class, 'alert-info') and contains(., 'No results found')]")
+                            if alert:
+                                has_no_results = True
+                        except Exception:
+                            pass
+
+                    if has_no_results:
+                        # This card should be skipped; click its "Mark Skipped" button if available
+                        try:
+                            mark_skipped_btn = card.find_element(By.XPATH, ".//button[contains(., 'Mark Skipped')]")
+                            if mark_skipped_btn and mark_skipped_btn.is_displayed():
+                                driver.execute_script("arguments[0].scrollIntoView({block:'center', behavior:'smooth'});", mark_skipped_btn)
+                                time.sleep(0.3)
+                                try:
+                                    mark_skipped_btn.click()
+                                except Exception:
+                                    driver.execute_script("arguments[0].click();", mark_skipped_btn)
+                                skipped_count += 1
+                                print("[Accept Jobs] Marked card as skipped (no results found)")
+                                time.sleep(0.5)
+                        except Exception:
+                            pass
+                except Exception:
+                    continue
+
+            if skipped_count > 0:
+                print(f"[Accept Jobs] Marked {skipped_count} cards as skipped (no results)")
+                total_skipped += skipped_count
+                time.sleep(1.0)  # Wait after marking items as skipped
+        except Exception as e:
+            print(f"[Accept Jobs] Error checking for skipped cards: {e}")
+
+        # Find the first "Update part" button that is NOT disabled (no 'disabled' in class)
+        update_button = None
+        try:
+            # Find <a> or <button> elements with "Update Part" text that don't have 'disabled' in their class
+            possible_xpaths = [
+                "//a[contains(@class, 'btn') and not(contains(@class, 'disabled')) and contains(., 'Update Part')]",
+                "//a[contains(@class, 'btn') and not(contains(@class, 'disabled')) and contains(., 'Update part')]",
+                "//button[contains(@class, 'btn') and not(contains(@class, 'disabled')) and contains(., 'Update Part')]",
+                "//button[contains(@class, 'btn') and not(contains(@class, 'disabled')) and contains(., 'Update part')]",
+            ]
+
+            for xpath in possible_xpaths:
+                try:
+                    elements = driver.find_elements(By.XPATH, xpath)
+                    for el in elements:
+                        # Double-check: must not have 'disabled' in the class attribute
+                        class_attr = el.get_attribute('class') or ''
+                        if 'disabled' in class_attr.lower():
+                            continue
+                        if el.is_displayed() and el.is_enabled():
+                            # Found a valid button; use the first one
+                            update_button = el
+                            print(f"[Accept Jobs] Found button with text: '{el.text.strip()}' and class: '{class_attr}'")
+                            break
+                    if update_button:
+                        break
+                except Exception as e:
+                    print(f"[Accept Jobs] Error with xpath {xpath}: {e}")
+                    continue
+        except Exception as e:
+            print(f"[Accept Jobs] Error finding buttons: {e}")
+            break
+
+        if not update_button:
+            print("[Accept Jobs] No more selectable 'Update part' buttons (without 'disabled' class) found. Done.")
+            break
+
+        # Click the button and wait for the page to load
+        try:
+            # Scroll into view
+            driver.execute_script("arguments[0].scrollIntoView({block:'center', behavior:'smooth'});", update_button)
+            time.sleep(0.5)
+
+            # Click the button, falling back to a JS click
+            try:
+                update_button.click()
+            except Exception:
+                driver.execute_script("arguments[0].click();", update_button)
+
+            print("[Accept Jobs] Clicked 'Update part' button, waiting for page load...")
+            time.sleep(2.5)  # Wait for page to load
+        except Exception as e:
+            print(f"[Accept Jobs] Failed to click button: {e}")
+            failed += 1
+            continue
+
+        # Still on the same page (no tab switch) - click Save the first time
+        save_success = False
+        try:
+            print("[Accept Jobs] Looking for first 'Save' button...")
+            ok = click_xpath_robust(driver, [
+                "//button[@type='submit' and @id='part_base_save']",
+                "//button[@type='submit' and contains(@name, 'save')]",
+                "//button[normalize-space()='Save changes']",
+                "//button[contains(normalize-space(), 'Save changes')]",
+                "//button[normalize-space()='Save']",
+                "//input[@type='submit' and contains(@value, 'Save')]",
+            ], total_timeout=15)
+
+            if ok:
+                print("[Accept Jobs] First 'Save' clicked, waiting for page load...")
+                time.sleep(2.5)  # Wait for page to load
+
+                # Click Save a second time
+                print("[Accept Jobs] Looking for second 'Save' button...")
+                ok = click_xpath_robust(driver, [
+                    "//button[@type='submit' and @id='part_base_save']",
+                    "//button[@type='submit' and contains(@name, 'save')]",
+                    "//button[normalize-space()='Save changes']",
+                    "//button[contains(normalize-space(), 'Save changes')]",
+                    "//button[normalize-space()='Save']",
+                    "//input[@type='submit' and contains(@value, 'Save')]",
+                ], total_timeout=15)
+
+                if ok:
+                    print("[Accept Jobs] Second 'Save' clicked, waiting for page load...")
+                    time.sleep(2.5)  # Wait for page to load
+
+                    # Click Complete
+                    print("[Accept Jobs] Looking for 'Complete' button...")
+                    ok = click_xpath_robust(driver, [
+                        "//button[normalize-space()='Complete']",
+                        "//button[contains(normalize-space(), 'Complete')]",
+                        "//input[@type='submit' and contains(@value, 'Complete')]",
+                        "//a[contains(normalize-space(), 'Complete')]",
+                    ], total_timeout=15)
+
+                    if ok:
+                        print("[Accept Jobs] 'Complete' clicked successfully!")
+                        save_success = True
+                        time.sleep(2.0)  # Wait for completion to process
+                    else:
+                        print("[Accept Jobs] Failed to find/click 'Complete' button")
+                else:
+                    print("[Accept Jobs] Failed to find/click second 'Save' button")
+            else:
+                print("[Accept Jobs] Failed to find/click first 'Save' button")
+        except Exception as e:
+            print(f"[Accept Jobs] Error during save/complete sequence: {e}")
+
+        if save_success:
+            successful += 1
+            print(f"[Accept Jobs] Job accepted successfully! (Total: {successful})")
+        else:
+            failed += 1
+            print(f"[Accept Jobs] Job failed! (Total failed: {failed})")
+
+        # Small delay before the next iteration
+        time.sleep(1.0)
+
+    print(f"\n[Accept Jobs] Complete: {successful} successful, {failed} failed, {total_skipped} skipped")
+    return (successful, failed, total_skipped)
+
+
 def update_part_from_providers_once(part_id: int, *, headless: bool = True) -> Tuple[bool, str]:
     """
     Run the same 'Tools → Info providers → Update' flow once for a part.
101 test_accept_jobs.py Normal file
@@ -0,0 +1,101 @@
"""
|
||||||
|
Quick test script for the bulk import job acceptance automation.
|
||||||
|
|
||||||
|
This script demonstrates how to use the automation with minimal setup.
|
||||||
|
"""
|
||||||
|
|
||||||
|
from provider.selenium_flow import start_firefox_resilient, ensure_logged_in, accept_bulk_import_jobs
|
||||||
|
from config import PARTDB_BASE, UI_LANG_PATH
|
||||||
|
|
||||||
|
|
||||||
|
def test_accept_jobs():
|
||||||
|
"""
|
||||||
|
Test the accept_bulk_import_jobs function.
|
||||||
|
This will open a browser, log you in, and let you manually navigate
|
||||||
|
to the import jobs page before starting automation.
|
||||||
|
"""
|
||||||
|
print("=" * 70)
|
||||||
|
print("BULK IMPORT JOB ACCEPTANCE - TEST MODE")
|
||||||
|
print("=" * 70)
|
||||||
|
print()
|
||||||
|
print("This will:")
|
||||||
|
print("1. Open a Firefox browser")
|
||||||
|
print("2. Log you in to PartDB")
|
||||||
|
print("3. Wait for you to navigate to the import jobs page")
|
||||||
|
print("4. Start automating the acceptance of jobs")
|
||||||
|
print()
|
||||||
|
print("-" * 70)
|
||||||
|
|
||||||
|
# Start browser
|
||||||
|
print("Starting browser...")
|
||||||
|
driver = start_firefox_resilient(headless_first=False)
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Navigate and login
|
||||||
|
print("Navigating to PartDB...")
|
||||||
|
driver.get(PARTDB_BASE + "/")
|
||||||
|
|
||||||
|
print("Logging in...")
|
||||||
|
if not ensure_logged_in(driver, PARTDB_BASE, interactive_ok=True, wait_s=600):
|
||||||
|
print("ERROR: Could not log in. Exiting.")
|
||||||
|
return
|
||||||
|
|
||||||
|
print("✓ Login successful!")
|
||||||
|
print()
|
||||||
|
|
||||||
|
# Wait for user to navigate
|
||||||
|
print("=" * 70)
|
||||||
|
print("READY TO START")
|
||||||
|
print("=" * 70)
|
||||||
|
print()
|
||||||
|
print("Please navigate to your import jobs page in the browser.")
|
||||||
|
print("The URL should look something like:")
|
||||||
|
print(" https://partdb.../en/import/jobs/...")
|
||||||
|
print()
|
||||||
|
print("When you're on the page with 'Update part' buttons,")
|
||||||
|
input("press ENTER to start automation... ")
|
||||||
|
print()
|
||||||
|
|
||||||
|
# Run automation
|
||||||
|
print("Starting automation...")
|
||||||
|
print("=" * 70)
|
||||||
|
successful, failed, skipped = accept_bulk_import_jobs(
|
||||||
|
driver,
|
||||||
|
PARTDB_BASE,
|
||||||
|
UI_LANG_PATH,
|
||||||
|
job_url=None, # User already navigated
|
||||||
|
max_iterations=100
|
||||||
|
)
|
||||||
|
|
||||||
|
# Results
|
||||||
|
print()
|
||||||
|
print("=" * 70)
|
||||||
|
print("AUTOMATION COMPLETE")
|
||||||
|
print("=" * 70)
|
||||||
|
print(f"✓ Successfully processed: {successful} jobs")
|
||||||
|
print(f"✗ Failed: {failed} jobs")
|
||||||
|
print(f"⊘ Skipped (no results): {skipped} jobs")
|
||||||
|
print("=" * 70)
|
||||||
|
print()
|
||||||
|
|
||||||
|
# Keep browser open
|
||||||
|
print("Browser will remain open for inspection.")
|
||||||
|
input("Press ENTER to close and exit... ")
|
||||||
|
|
||||||
|
except KeyboardInterrupt:
|
||||||
|
print("\n\nInterrupted by user.")
|
||||||
|
except Exception as e:
|
||||||
|
print(f"\n\nERROR: {e}")
|
||||||
|
import traceback
|
||||||
|
traceback.print_exc()
|
||||||
|
input("\nPress ENTER to close... ")
|
||||||
|
finally:
|
||||||
|
try:
|
||||||
|
driver.quit()
|
||||||
|
print("Browser closed.")
|
||||||
|
except Exception:
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
|
if __name__ == "__main__":
|
||||||
|
test_accept_jobs()
|
||||||
178 ui/app_tk.py
@@ -15,10 +15,175 @@ from apis.digikey_api import suggest_category_from_digikey
 # ------------ Pages ------------

 class HomePage(ttk.Frame):
-    """Always listening; on scan -> ask Part-DB; route to View or Create."""
+    """Home page with tools and settings."""
     def __init__(self, master, app):
         super().__init__(master)
         self.app = app
+
+        ttk.Label(self, text="Part-DB Helper", font=("Segoe UI", 16, "bold")).pack(pady=(18,8))
+
+        # Settings section
+        settings_frame = ttk.LabelFrame(self, text="Settings", padding=10)
+        settings_frame.pack(fill="x", padx=20, pady=(0,10))
+
+        # Mode toggle (Live/Dry Run)
+        mode_frame = ttk.Frame(settings_frame)
+        mode_frame.pack(fill="x", pady=2)
+        ttk.Label(mode_frame, text="Standardization Mode:", width=20).pack(side="left")
+        self.mode_var = tk.StringVar(value="live")
+        ttk.Radiobutton(mode_frame, text="Live (Make Changes)", variable=self.mode_var, value="live").pack(side="left", padx=5)
+        ttk.Radiobutton(mode_frame, text="Dry Run (Preview Only)", variable=self.mode_var, value="dry").pack(side="left", padx=5)
+
+        # Provider update toggle
+        provider_frame = ttk.Frame(settings_frame)
+        provider_frame.pack(fill="x", pady=2)
+        ttk.Label(provider_frame, text="Provider Updates:", width=20).pack(side="left")
+        self.provider_var = tk.BooleanVar(value=True)
+        ttk.Checkbutton(provider_frame, text="Update from information providers (Digi-Key, etc.)", variable=self.provider_var).pack(side="left", padx=5)
+
+        # Tools section
+        ttk.Separator(self, orient="horizontal").pack(fill="x", pady=10, padx=20)
+        ttk.Label(self, text="Tools", font=("Segoe UI", 14, "bold")).pack(pady=(0,10))
+
+        tools_frame = ttk.Frame(self)
+        tools_frame.pack(pady=(0,10))
+        ttk.Button(tools_frame, text="Scanner", command=self.goto_scanner, width=25).pack(pady=4)
+        ttk.Button(tools_frame, text="Accept Import Jobs", command=self.run_accept_jobs, width=25).pack(pady=4)
+        ttk.Button(tools_frame, text="Run Bulk Add", command=self.run_bulk_add, width=25).pack(pady=4)
+        ttk.Button(tools_frame, text="Import from CSV Files", command=self.run_import_csv, width=25).pack(pady=4)
+        ttk.Button(tools_frame, text="Update Components", command=self.run_standardize_components, width=25).pack(pady=4)
+
+    def goto_scanner(self):
+        """Navigate to the scanner page."""
+        self.app.goto_scanner()
+
+    def run_accept_jobs(self):
+        """Launch the accept-import-jobs automation in a new thread."""
+        from workflows.accept_import_jobs import run_accept_import_jobs
+
+        def work():
+            run_accept_import_jobs(auto_close=True)
+
+        t = threading.Thread(target=work, daemon=True)
+        t.start()
+        messagebox.showinfo("Started", "Import job acceptance automation started in browser.\nCheck the browser window for progress.")
+
+    def run_bulk_add(self):
+        """Launch the bulk add workflow."""
+        from workflows.bulk_add import run_bulk_add
+
+        def work():
+            run_bulk_add()
+
+        t = threading.Thread(target=work, daemon=True)
+        t.start()
+        messagebox.showinfo("Started", "Bulk add workflow started in browser.\nCheck the browser window for progress.")
+
+    def run_import_csv(self):
+        """Launch the CSV import workflow."""
+        update_providers = self.provider_var.get()
+
+        from workflows.import_from_csv import run_import_from_csv
+        from ui.progress_dialog import ProgressDialog
+
+        # Create progress dialog
+        progress = ProgressDialog(self.app, "Importing from CSV")
+
+        def progress_callback(current, total, status):
+            progress.update(current, total, status)
+            return progress.is_cancelled()
+
+        def work():
+            try:
+                run_import_from_csv(update_providers=update_providers, progress_callback=progress_callback)
+            finally:
+                progress.close()
+                # Show completion message
+                self.app.after(0, lambda: messagebox.showinfo("Import Complete", "CSV import finished!\nCheck console for details."))
+
+        t = threading.Thread(target=work, daemon=True)
+        t.start()
+
+    def run_standardize_components(self):
+        """Launch the component standardization workflow."""
|
||||||
|
dry_run = (self.mode_var.get() == "dry")
|
||||||
|
|
||||||
|
from workflows.standardize_components import run_standardize_components
|
||||||
|
from ui.progress_dialog import ProgressDialog
|
||||||
|
|
||||||
|
# Create progress dialog - self.app is the Tk root
|
||||||
|
progress = ProgressDialog(self.app, "Standardizing Components")
|
||||||
|
|
||||||
|
def progress_callback(current, total, status):
|
||||||
|
progress.update(current, total, status)
|
||||||
|
return progress.is_cancelled()
|
||||||
|
|
||||||
|
def work():
|
||||||
|
try:
|
||||||
|
run_standardize_components(dry_run=dry_run, progress_callback=progress_callback)
|
||||||
|
finally:
|
||||||
|
progress.close()
|
||||||
|
|
||||||
|
t = threading.Thread(target=work, daemon=True)
|
||||||
|
t.start()
|
||||||
|
|
||||||
|
def run_standardize_passives(self):
|
||||||
|
"""Launch the passive components standardization workflow."""
|
||||||
|
dry_run = (self.mode_var.get() == "dry")
|
||||||
|
|
||||||
|
from workflows.standardize_passives import run_standardize_passives
|
||||||
|
from ui.progress_dialog import ProgressDialog
|
||||||
|
|
||||||
|
# Create progress dialog - self.app is the Tk root
|
||||||
|
progress = ProgressDialog(self.app, "Standardizing Passives")
|
||||||
|
|
||||||
|
def progress_callback(current, total, status):
|
||||||
|
progress.update(current, total, status)
|
||||||
|
return progress.is_cancelled()
|
||||||
|
|
||||||
|
def work():
|
||||||
|
try:
|
||||||
|
run_standardize_passives(category_name="Passives", dry_run=dry_run, progress_callback=progress_callback)
|
||||||
|
finally:
|
||||||
|
progress.close()
|
||||||
|
|
||||||
|
t = threading.Thread(target=work, daemon=True)
|
||||||
|
t.start()
|
||||||
|
|
||||||
|
def run_standardize_asdmb(self):
|
||||||
|
"""Launch the ASDMB crystal standardization workflow."""
|
||||||
|
dry_run = (self.mode_var.get() == "dry")
|
||||||
|
provider_update = self.provider_var.get() and not dry_run
|
||||||
|
|
||||||
|
from workflows.standardize_asdmb import run_standardize_asdmb
|
||||||
|
from ui.progress_dialog import ProgressDialog
|
||||||
|
|
||||||
|
# Create progress dialog - self.app is the Tk root
|
||||||
|
progress = ProgressDialog(self.app, "Standardizing ASDMB Crystals")
|
||||||
|
|
||||||
|
def progress_callback(current, total, status):
|
||||||
|
progress.update(current, total, status)
|
||||||
|
return progress.is_cancelled()
|
||||||
|
|
||||||
|
def work():
|
||||||
|
try:
|
||||||
|
run_standardize_asdmb(category_name="Clock - ASDMB", dry_run=dry_run, update_providers=provider_update, progress_callback=progress_callback)
|
||||||
|
finally:
|
||||||
|
progress.close()
|
||||||
|
|
||||||
|
t = threading.Thread(target=work, daemon=True)
|
||||||
|
t.start()
|
||||||
|
|
||||||
|
|
||||||
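All of these launcher methods share one shape: a daemon worker thread plus a `progress_callback(current, total, status)` that returns `True` when the user has cancelled. A minimal standalone sketch of that contract (dummy names, no Tk involved):

```python
import threading

def run_with_progress(items, progress_callback):
    """Process items, stopping early if the callback reports cancellation."""
    done = []
    for i, item in enumerate(items):
        # The callback returns True once the user has cancelled
        if progress_callback(i, len(items), f"Processing {item}"):
            break
        done.append(item)
    return done

def callback(current, total, status):
    return current >= 2  # pretend the user cancels at the third item

t_result = []
t = threading.Thread(target=lambda: t_result.append(
    run_with_progress(["a", "b", "c", "d"], callback)), daemon=True)
t.start()
t.join()
print(t_result[0])  # ['a', 'b']
```

Returning the cancellation flag from the callback keeps the worker in charge of stopping itself, so the UI thread never has to kill the thread.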
class ScannerPage(ttk.Frame):
    """Scanner page for scanning Digi-Key labels."""
    def __init__(self, master, app):
        super().__init__(master)
        self.app = app

        # Back button
        ttk.Button(self, text="← Back to Home", command=self.app.goto_home).pack(anchor="w", padx=10, pady=10)

        ttk.Label(self, text="Scan a Digi-Key code", font=("Segoe UI", 16, "bold")).pack(pady=(18, 8))
        ttk.Label(self, text=f"Listening on {COM_PORT} @ {BAUD_RATE}").pack(pady=(0, 12))
@@ -29,7 +194,7 @@ class HomePage(ttk.Frame):
         ttk.Entry(wrap, textvariable=self.last_var, state="readonly").pack(side="left", fill="x", expand=True, padx=(8,0))

         ttk.Label(self, text="(This page listens continuously. Just scan another label.)", foreground="#666").pack()

     def on_scan(self, raw: str):
         self.last_var.set(raw)
         fields = parse_digikey(raw)
@@ -50,6 +215,7 @@ class HomePage(ttk.Frame):
             # No part -> prepare creation
             self.app.goto_create(fields, raw)


 class ViewPage(ttk.Frame):
     """Shows existing part summary + button to force Digi-Key provider update."""
     def __init__(self, master, app):
@@ -275,10 +441,11 @@ class App(tk.Tk):
         self.pdb = PartDB(PARTDB_BASE, PARTDB_TOKEN)

         self.home = HomePage(self, self)
+        self.scanner = ScannerPage(self, self)
         self.view = ViewPage(self, self)
         self.create = CreatePage(self, self)

-        for f in (self.home, self.view, self.create):
+        for f in (self.home, self.scanner, self.view, self.create):
             f.grid(row=0, column=0, sticky="nsew")

         # start listening thread once
@@ -290,10 +457,13 @@ class App(tk.Tk):
     def on_scan(self, flat: str):
         if self.busy:
             return  # ignore scans during provider update
-        self.home.on_scan(flat)
+        self.scanner.on_scan(flat)

     def goto_home(self):
         self.home.tkraise()

+    def goto_scanner(self):
+        self.scanner.tkraise()
+
     def goto_view(self, summary: dict):
         self.view.set_summary(summary)
127
ui/progress_dialog.py
Normal file
@@ -0,0 +1,127 @@
"""
|
||||||
|
Progress dialog with progress bar, ETA, and cancel button.
|
||||||
|
"""
|
||||||
|
import tkinter as tk
|
||||||
|
from tkinter import ttk
|
||||||
|
import time
|
||||||
|
import threading
|
||||||
|
|
||||||
|
|
||||||
|
class ProgressDialog:
|
||||||
|
"""
|
||||||
|
A modal progress dialog with progress bar, status text, ETA, and cancel button.
|
||||||
|
Thread-safe for updating from background threads.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self, parent, title="Progress"):
|
||||||
|
self.parent = parent
|
||||||
|
self.cancelled = False
|
||||||
|
self._start_time = time.time()
|
||||||
|
|
||||||
|
# Create dialog
|
||||||
|
self.dialog = tk.Toplevel(parent)
|
||||||
|
self.dialog.title(title)
|
||||||
|
self.dialog.geometry("500x150")
|
||||||
|
self.dialog.resizable(False, False)
|
||||||
|
|
||||||
|
# Make it modal
|
||||||
|
self.dialog.transient(parent)
|
||||||
|
self.dialog.grab_set()
|
||||||
|
|
||||||
|
# Center on parent
|
||||||
|
self.dialog.update_idletasks()
|
||||||
|
x = parent.winfo_x() + (parent.winfo_width() - 500) // 2
|
||||||
|
y = parent.winfo_y() + (parent.winfo_height() - 150) // 2
|
||||||
|
self.dialog.geometry(f"+{x}+{y}")
|
||||||
|
|
||||||
|
# Status label
|
||||||
|
self.status_var = tk.StringVar(value="Starting...")
|
||||||
|
ttk.Label(self.dialog, textvariable=self.status_var, font=("Segoe UI", 10)).pack(pady=(15, 5))
|
||||||
|
|
||||||
|
# Progress bar
|
||||||
|
self.progress = ttk.Progressbar(self.dialog, length=450, mode='determinate')
|
||||||
|
self.progress.pack(pady=(5, 5), padx=25)
|
||||||
|
|
||||||
|
# ETA label
|
||||||
|
self.eta_var = tk.StringVar(value="Calculating...")
|
||||||
|
ttk.Label(self.dialog, textvariable=self.eta_var, font=("Segoe UI", 9), foreground="#666").pack(pady=(0, 10))
|
||||||
|
|
||||||
|
# Cancel button
|
||||||
|
self.cancel_btn = ttk.Button(self.dialog, text="Cancel", command=self.cancel)
|
||||||
|
self.cancel_btn.pack(pady=(0, 15))
|
||||||
|
|
||||||
|
# Handle window close
|
||||||
|
self.dialog.protocol("WM_DELETE_WINDOW", self.cancel)
|
||||||
|
|
||||||
|
def update(self, current, total, status_text=None):
|
||||||
|
"""
|
||||||
|
Update progress bar and ETA.
|
||||||
|
Thread-safe - can be called from background threads.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
current: Current item number (0-based or 1-based)
|
||||||
|
total: Total number of items
|
||||||
|
status_text: Optional status text to display
|
||||||
|
"""
|
||||||
|
def _update():
|
||||||
|
if self.cancelled:
|
||||||
|
return
|
||||||
|
|
||||||
|
# Update progress bar
|
||||||
|
if total > 0:
|
||||||
|
percentage = (current / total) * 100
|
||||||
|
self.progress['value'] = percentage
|
||||||
|
|
||||||
|
# Update status text
|
||||||
|
if status_text:
|
||||||
|
self.status_var.set(status_text)
|
||||||
|
|
||||||
|
# Calculate and update ETA
|
||||||
|
if current > 0 and total > 0:
|
||||||
|
elapsed = time.time() - self._start_time
|
||||||
|
avg_time_per_item = elapsed / current
|
||||||
|
remaining_items = total - current
|
||||||
|
eta_seconds = remaining_items * avg_time_per_item
|
||||||
|
|
||||||
|
if eta_seconds < 60:
|
||||||
|
eta_text = f"ETA: {int(eta_seconds)}s"
|
||||||
|
elif eta_seconds < 3600:
|
||||||
|
eta_text = f"ETA: {int(eta_seconds / 60)}m {int(eta_seconds % 60)}s"
|
||||||
|
else:
|
||||||
|
hours = int(eta_seconds / 3600)
|
||||||
|
minutes = int((eta_seconds % 3600) / 60)
|
||||||
|
eta_text = f"ETA: {hours}h {minutes}m"
|
||||||
|
|
||||||
|
self.eta_var.set(f"{current}/{total} items - {eta_text}")
|
||||||
|
else:
|
||||||
|
self.eta_var.set(f"{current}/{total} items")
|
||||||
|
|
||||||
|
# Schedule update on main thread
|
||||||
|
if threading.current_thread() == threading.main_thread():
|
||||||
|
_update()
|
||||||
|
else:
|
||||||
|
self.dialog.after(0, _update)
|
||||||
|
|
||||||
|
def cancel(self):
|
||||||
|
"""Mark as cancelled and close dialog."""
|
||||||
|
self.cancelled = True
|
||||||
|
self.close()
|
||||||
|
|
||||||
|
def close(self):
|
||||||
|
"""Close the dialog."""
|
||||||
|
def _close():
|
||||||
|
try:
|
||||||
|
self.dialog.grab_release()
|
||||||
|
self.dialog.destroy()
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
|
||||||
|
# Schedule on main thread
|
||||||
|
if threading.current_thread() == threading.main_thread():
|
||||||
|
_close()
|
||||||
|
else:
|
||||||
|
self.dialog.after(0, _close)
|
||||||
|
|
||||||
|
def is_cancelled(self):
|
||||||
|
"""Check if user cancelled the operation."""
|
||||||
|
return self.cancelled
|
||||||
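The ETA string logic inside `update()` is easy to unit-test once factored into a pure helper; a minimal sketch of the same thresholds (the `format_eta` name is illustrative, not part of the class):

```python
def format_eta(eta_seconds: float) -> str:
    """Format a remaining-time estimate the way ProgressDialog renders it."""
    if eta_seconds < 60:
        return f"ETA: {int(eta_seconds)}s"
    elif eta_seconds < 3600:
        return f"ETA: {int(eta_seconds / 60)}m {int(eta_seconds % 60)}s"
    hours = int(eta_seconds / 3600)
    minutes = int((eta_seconds % 3600) / 60)
    return f"ETA: {hours}h {minutes}m"

print(format_eta(45))    # ETA: 45s
print(format_eta(125))   # ETA: 2m 5s
print(format_eta(3700))  # ETA: 1h 1m
```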
163
workflows/accept_import_jobs.py
Normal file
@@ -0,0 +1,163 @@
"""
Workflow to accept bulk import jobs automatically.

This script automates the process of accepting import jobs by:
1. Navigating to the bulk import management page
2. Clicking the "View Results" button for the first import job
3. Finding and clicking "Update Part" buttons (only those without 'disabled' class)
4. Clicking Save twice and Complete for each job (same page, no tabs)
5. Repeating until all jobs are processed
"""

import time
from selenium.webdriver.common.by import By
from config import PARTDB_BASE, UI_LANG_PATH, HEADLESS_CONTROLLER
from provider.selenium_flow import (
    start_firefox_resilient,
    ensure_logged_in,
    accept_bulk_import_jobs
)


def run_accept_import_jobs(job_url: str = None, auto_close: bool = False):
    """
    Main function to accept bulk import jobs.

    Args:
        job_url: Optional URL to the specific import job page.
                 If None, uses the default bulk import management page.
        auto_close: If True, automatically closes the browser after completion.
                    If False, waits for user input before closing.
    """
    print("=== Starting Bulk Import Job Acceptance ===\n")

    # Default to the bulk import management page
    if job_url is None:
        job_url = PARTDB_BASE + "/en/tools/bulk_info_provider_import/manage"

    # Start browser
    driver = start_firefox_resilient(headless_first=HEADLESS_CONTROLLER)

    try:
        # Navigate to base URL and log in
        driver.get(PARTDB_BASE + "/")

        if not ensure_logged_in(driver, PARTDB_BASE, interactive_ok=True, wait_s=600):
            print("Could not log in; aborting.")
            return

        print("Login successful!\n")

        # Navigate to the bulk import management page
        print(f"Navigating to: {job_url}")
        driver.get(job_url)
        time.sleep(2.0)

        # Find and click the first "View Results" button for an ACTIVE job
        print("Looking for 'View Results' button on active job...")
        try:
            view_results_button = None

            # Find all table rows
            rows = driver.find_elements(By.XPATH, "//tbody/tr")

            for row in rows:
                try:
                    # Check if this row has an "Active" badge or "In Progress" status
                    badges = row.find_elements(By.XPATH, ".//span[contains(@class, 'badge')]")
                    is_active = False

                    for badge in badges:
                        badge_text = badge.text.strip().lower()
                        if 'active' in badge_text or 'in progress' in badge_text:
                            is_active = True
                            break

                    # If this row is active, find its "View Results" button
                    if is_active:
                        view_btn = row.find_elements(By.XPATH, ".//a[contains(@class, 'btn') and contains(., 'View Results')]")
                        if view_btn:
                            view_results_button = view_btn[0]
                            print("Found active job with 'View Results' button")
                            break
                except Exception:
                    continue

            # Fallback: if no active job found, just get the first View Results button
            if not view_results_button:
                print("No active job found, looking for any 'View Results' button...")
                xpaths = [
                    "//a[contains(@class, 'btn') and contains(., 'View Results')]",
                    "//a[contains(@href, '/bulk_info_provider_import/step2/') and contains(@class, 'btn-primary')]",
                ]

                for xpath in xpaths:
                    elements = driver.find_elements(By.XPATH, xpath)
                    if elements:
                        view_results_button = elements[0]
                        break

            if view_results_button:
                print("Clicking 'View Results' button...")
                driver.execute_script("arguments[0].scrollIntoView({block:'center'});", view_results_button)
                time.sleep(0.5)
                view_results_button.click()
                time.sleep(2.0)
                print("✓ Navigated to results page")
            else:
                print("Could not find 'View Results' button. Make sure there's an import job to process.")
                if not auto_close:
                    print("Press Enter to close...")
                    input()
                return
        except Exception as e:
            print(f"Error clicking 'View Results': {e}")
            if not auto_close:
                print("Press Enter to close...")
                input()
            return

        # Run the automation
        print("\nStarting automation...")
        print("=" * 70)
        successful, failed, skipped = accept_bulk_import_jobs(
            driver,
            PARTDB_BASE,
            UI_LANG_PATH,
            job_url=None,  # Already on the page
            max_iterations=100
        )

        print("\n" + "=" * 60)
        print("AUTOMATION COMPLETE")
        print(f"Successfully processed: {successful} jobs")
        print(f"Failed: {failed} jobs")
        print(f"Skipped (no results): {skipped} jobs")
        print("=" * 60)

        # Keep browser open for inspection if not auto_close
        if not auto_close:
            print("\nBrowser will remain open for inspection.")
            print("Press Enter to close...")
            input()

    except Exception as e:
        print(f"\nError during automation: {e}")
        import traceback
        traceback.print_exc()

        if not auto_close:
            print("\nPress Enter to close...")
            input()

    finally:
        try:
            driver.quit()
        except Exception:
            pass


if __name__ == "__main__":
    # You can optionally provide a direct URL to the import job page
    # Example: run_accept_import_jobs("https://partdb.neutronservices.duckdns.org/en/import/jobs/123")
    run_accept_import_jobs()
420
workflows/import_from_csv.py
Normal file
@@ -0,0 +1,420 @@
"""
Import parts from Digi-Key CSV exports and update from providers.

This workflow:
1. Scans the "import CSVs" folder for CSV files
2. Reads Digi-Key part numbers and other info from each CSV
3. Creates parts in PartDB if they don't exist
4. Triggers provider updates to fetch full information
5. Sets EDA values based on part type
"""

import os
import csv
import re
from pathlib import Path
from typing import List, Dict, Tuple
from tqdm import tqdm

from config import PARTDB_BASE, PARTDB_TOKEN, HEADLESS_PROVIDER
from apis.partdb_api import PartDB
from provider.selenium_flow import (
    start_firefox_resilient, ensure_logged_in, run_provider_update_flow
)
from parsers.values import (
    parse_resistance_to_ohms, format_ohms_for_eda,
    parse_capacitance_to_farads, format_farads_for_eda
)


def find_csv_files(folder_path: str = "import CSVs") -> List[Path]:
    """Find all CSV files in the import folder."""
    folder = Path(folder_path)
    if not folder.exists():
        print(f"Creating folder: {folder}")
        folder.mkdir(parents=True, exist_ok=True)
        return []

    csv_files = list(folder.glob("*.csv"))
    return csv_files


def parse_digikey_csv(csv_path: Path) -> List[Dict[str, str]]:
    """
    Parse a Digi-Key CSV export file.

    Expected columns (case-insensitive):
    - Digi-Key Part Number
    - Manufacturer Part Number
    - Manufacturer
    - Description
    - Quantity Available
    - Unit Price

    Returns list of part dictionaries.
    """
    parts = []

    with open(csv_path, 'r', encoding='utf-8-sig') as f:
        reader = csv.DictReader(f)

        # Normalize column names (remove BOM, strip whitespace, lowercase)
        if reader.fieldnames:
            reader.fieldnames = [name.strip().lower() for name in reader.fieldnames]

        for row in reader:
            # Skip empty rows
            if not any(row.values()):
                continue

            # Extract relevant fields (try multiple column name variations)
            dkpn = (row.get('dk part #') or
                    row.get('digi-key part number') or
                    row.get('digikey part number') or
                    row.get('part number') or '').strip()

            # Handle multiple DK part numbers separated by commas
            if dkpn and ',' in dkpn:
                dkpn = dkpn.split(',')[0].strip()

            mpn = (row.get('mfr part #') or
                   row.get('manufacturer part number') or
                   row.get('mfr part number') or
                   row.get('mpn') or '').strip()

            manufacturer = (row.get('mfr') or
                            row.get('manufacturer') or '').strip()

            description = (row.get('description') or
                           row.get('product description') or '').strip()

            # Skip if no MPN
            if not mpn:
                continue

            parts.append({
                'dkpn': dkpn,
                'mpn': mpn,
                'manufacturer': manufacturer,
                'description': description
            })

    return parts
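The header-normalization trick above (lowercasing `reader.fieldnames` before iterating) can be sanity-checked against an in-memory CSV; a standalone sketch that does not import the module:

```python
import csv
import io

# Sample export with mixed-case headers, mirroring what
# parse_digikey_csv() normalizes before the column lookups.
sample = io.StringIO(
    "Digi-Key Part Number,Manufacturer Part Number,Manufacturer,Description\n"
    "296-1234-ND,TPS54302DDCR,Texas Instruments,Buck converter\n"
)

reader = csv.DictReader(sample)
# Accessing .fieldnames consumes the header row; reassigning it makes
# every subsequent row keyed by the normalized names.
reader.fieldnames = [name.strip().lower() for name in reader.fieldnames]

rows = []
for row in reader:
    rows.append({
        'dkpn': (row.get('digi-key part number') or '').strip(),
        'mpn': (row.get('manufacturer part number') or '').strip(),
    })

print(rows[0]['dkpn'])  # 296-1234-ND
```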
def create_part_if_not_exists(api: PartDB, part_info: Dict[str, str]) -> Tuple[bool, int, str]:
    """
    Create a part in PartDB if it doesn't already exist.

    Returns: (created, part_id, message)
    - created: True if part was created, False if it already existed
    - part_id: The part ID (existing or newly created)
    - message: Status message
    """
    mpn = part_info['mpn']
    dkpn = part_info.get('dkpn')
    manufacturer_name = part_info.get('manufacturer', 'Unknown')

    # Check if part already exists
    existing_id = api.find_part_exact(dkpn=dkpn, mpn=mpn)
    if existing_id:
        return (False, existing_id, "Already exists")

    # Create minimal part (provider update will fill in details)
    try:
        # Get or create manufacturer
        manufacturer_id = api.ensure_manufacturer(manufacturer_name)

        # Use a default category - provider update will suggest a better one
        # You can change this to a "To Be Categorized" category ID
        default_category_id = 1  # Change this to your default category

        part_id = api.create_part(
            category_id=default_category_id,
            manufacturer_id=manufacturer_id,
            name=mpn,  # Will be updated by provider
            mpn=mpn,
            description=part_info.get('description', ''),
            footprint_id=None,
            product_url=f"https://www.digikey.com/product-detail/en/-/{dkpn}" if dkpn else None
        )

        return (True, part_id, "Created")

    except Exception as e:
        return (False, 0, f"Failed: {str(e)}")


def set_eda_value_for_part(api: PartDB, part_id: int, mpn: str):
    """
    Set the EDA value for a part based on its type.

    For passives (resistors, capacitors, inductors): use the component value.
    For ICs and other components: use the base of the IC name.

    Args:
        api: PartDB API instance
        part_id: ID of the part to update
        mpn: Manufacturer part number
    """
    # Get part details
    try:
        part = api.get_part(part_id)
        name = part.get('name', mpn).upper()
        description = part.get('description', '').upper()

        # Detect part type and extract value
        eda_value = None

        # Check for resistors
        if any(indicator in name or indicator in description for indicator in ['OHM', 'Ω', 'RESISTOR', 'RES ']):
            try:
                ohms = parse_resistance_to_ohms(name)
                if ohms is not None:
                    eda_value = format_ohms_for_eda(ohms)
                    print(f"  Setting resistor EDA value: {eda_value}")
            except Exception:
                pass

        # Check for capacitors
        elif any(indicator in name or indicator in description for indicator in ['FARAD', 'F ', 'CAP', 'CAPACITOR']):
            try:
                farads = parse_capacitance_to_farads(name)
                if farads is not None:
                    eda_value = format_farads_for_eda(farads)
                    print(f"  Setting capacitor EDA value: {eda_value}")
            except Exception:
                pass

        # Check for inductors
        elif any(indicator in name or indicator in description for indicator in ['INDUCTOR', 'HENRY', 'H ', 'IND ']):
            # Extract inductance value: patterns like "10uH", "100nH", etc.
            match = re.search(r'(\d+\.?\d*)\s*(M|K|µ|U|N|P)?H', name, re.IGNORECASE)
            if match:
                value = float(match.group(1))
                unit = match.group(2) or ''

                # Convert to henries. Check micro first: 'µ'.upper() is the
                # Greek capital mu, not an ASCII 'M', so it must never fall
                # through to (or collide with) the milli branch.
                if unit in ('µ', 'μ', 'Μ') or unit.upper() == 'U':
                    value *= 1e-6
                elif unit.upper() == 'M':
                    value *= 1e-3
                elif unit.upper() == 'N':
                    value *= 1e-9
                elif unit.upper() == 'P':
                    value *= 1e-12

                # Format for EDA
                if value >= 1:
                    eda_value = f"{value:.2f}H"
                elif value >= 1e-3:
                    eda_value = f"{value * 1e3:.2f}mH"
                elif value >= 1e-6:
                    eda_value = f"{value * 1e6:.2f}µH"
                else:
                    eda_value = f"{value * 1e9:.2f}nH"
                print(f"  Setting inductor EDA value: {eda_value}")

        # For ICs and other components, use base name
        if eda_value is None:
            # Extract base name (remove package suffix)
            # Common IC patterns: "TPS54302DDCR" -> "TPS54302"
            # Try to extract the base part number (before package code)
            match = re.match(r'^([A-Z0-9]+?)([A-Z]{2,4}[A-Z]?)?$', mpn)
            if match:
                base_name = match.group(1)
                # If we found a reasonable base (at least 5 chars), use it
                if len(base_name) >= 5:
                    eda_value = base_name
                    print(f"  Setting IC/component EDA value: {eda_value}")
                else:
                    # Use full MPN if base is too short
                    eda_value = mpn
                    print(f"  Setting EDA value to full MPN: {eda_value}")
            else:
                # Use full MPN if pattern doesn't match
                eda_value = mpn
                print(f"  Setting EDA value to full MPN: {eda_value}")

        # Set the EDA value
        if eda_value:
            api.patch_eda_value(part_id, eda_value)

    except Exception as e:
        raise RuntimeError(f"Error setting EDA value: {e}") from e
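The formatting branch in `set_eda_value_for_part` is worth exercising in isolation; a standalone sketch of the same thresholds (the `format_henries` name is illustrative, not from the codebase):

```python
def format_henries(value: float) -> str:
    """Format an inductance in henries with the same thresholds as above."""
    if value >= 1:
        return f"{value:.2f}H"
    elif value >= 1e-3:
        return f"{value * 1e3:.2f}mH"
    elif value >= 1e-6:
        return f"{value * 1e6:.2f}µH"
    return f"{value * 1e9:.2f}nH"

print(format_henries(10e-6))   # 10.00µH
print(format_henries(4.7e-3))  # 4.70mH
print(format_henries(100e-9))  # 100.00nH
```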
def run_import_from_csv(folder_path: str = "import CSVs", update_providers: bool = True, progress_callback=None):
    """
    Main function to import parts from CSV files.

    Args:
        folder_path: Path to folder containing CSV files
        update_providers: If True, trigger provider updates after creating parts
        progress_callback: Optional callback for progress updates
    """
    print("=" * 70)
    print("IMPORT PARTS FROM CSV FILES")
    print("=" * 70)
    print(f"Folder: {folder_path}")
    print(f"Provider updates: {'ENABLED' if update_providers else 'DISABLED'}")
    print("=" * 70)
    print()

    # Find CSV files
    csv_files = find_csv_files(folder_path)
    if not csv_files:
        print(f"No CSV files found in '{folder_path}'")
        print("Place Digi-Key CSV exports in this folder and try again.")
        return

    print(f"Found {len(csv_files)} CSV file(s):")
    for csv_file in csv_files:
        print(f"  - {csv_file.name}")
    print()

    # Initialize API
    api = PartDB(PARTDB_BASE, PARTDB_TOKEN)

    # Process each CSV file
    all_created_parts = []
    total_processed = 0
    total_created = 0
    total_skipped = 0
    total_failed = 0

    # Count parts across all files once up front, so the progress callback
    # doesn't re-parse every CSV on each iteration
    grand_total = sum(len(parse_digikey_csv(f)) for f in csv_files) if progress_callback else 0

    for csv_file in csv_files:
        print(f"\nProcessing: {csv_file.name}")
        print("-" * 70)

        # Parse CSV
        try:
            parts = parse_digikey_csv(csv_file)
            print(f"Found {len(parts)} parts in CSV")
        except Exception as e:
            print(f"Error parsing CSV: {e}")
            continue

        if not parts:
            print("No valid parts found in CSV")
            continue

        # Create parts
        use_tqdm = not progress_callback
        iterator = tqdm(parts, desc="Creating parts") if use_tqdm else parts

        # Offset of this file's first part within the overall run; kept
        # separate from total_processed, which advances inside the loop
        file_offset = total_processed

        for idx, part_info in enumerate(iterator):
            # Check for cancellation
            if progress_callback:
                cancelled = progress_callback(
                    file_offset + idx,
                    grand_total,
                    f"Processing {csv_file.name}: {part_info['mpn']}"
                )
                if cancelled:
                    print("\n⚠ Operation cancelled by user")
                    return

            created, part_id, message = create_part_if_not_exists(api, part_info)

            if created:
                total_created += 1
                all_created_parts.append((part_id, part_info['mpn']))
                if not use_tqdm:
                    print(f"✓ Created: {part_info['mpn']} (ID: {part_id})")
            elif part_id > 0:
                total_skipped += 1
            else:
                total_failed += 1
                print(f"✗ Failed: {part_info['mpn']} - {message}")

            total_processed += 1

    # Summary
    print("\n" + "=" * 70)
    print("IMPORT SUMMARY")
    print("=" * 70)
    print(f"Total parts processed: {total_processed}")
    print(f"Created: {total_created}")
    print(f"Skipped (exist): {total_skipped}")
    print(f"Failed: {total_failed}")
    print("=" * 70)

    # Provider updates
    if all_created_parts and update_providers:
        print("\n" + "=" * 70)
        print("TRIGGERING PROVIDER UPDATES")
        print("=" * 70)
        print(f"Updating {len(all_created_parts)} newly created parts from providers...")

        try:
            print("Starting browser...")
            driver = start_firefox_resilient(headless_first=HEADLESS_PROVIDER)

            print("Logging in...")
            driver.get(PARTDB_BASE + "/")
            if not ensure_logged_in(driver, PARTDB_BASE, interactive_ok=True, wait_s=120):
                print("Failed to log in!")
                driver.quit()
                return

            controller = driver.current_window_handle
            provider_success = 0
            provider_failed = 0

            use_tqdm = not progress_callback
            iterator = tqdm(all_created_parts, desc="Updating from providers") if use_tqdm else all_created_parts

            for idx, (part_id, mpn) in enumerate(iterator):
                # Check for cancellation
                if progress_callback:
                    cancelled = progress_callback(
                        idx,
                        len(all_created_parts),
                        f"Updating from providers: {mpn}"
                    )
                    if cancelled:
                        print("\n⚠ Provider updates cancelled by user")
                        break

                try:
                    success = run_provider_update_flow(driver, PARTDB_BASE, part_id, controller)
                    if success:
                        provider_success += 1
                        # Set EDA value after provider update
|
||||||
|
try:
|
||||||
|
set_eda_value_for_part(api, part_id, mpn)
|
||||||
|
except Exception as e:
|
||||||
|
print(f" Warning: Could not set EDA value for {mpn}: {e}")
|
||||||
|
else:
|
||||||
|
provider_failed += 1
|
||||||
|
print(f"✗ Provider update failed for: {mpn}")
|
||||||
|
except Exception as e:
|
||||||
|
provider_failed += 1
|
||||||
|
print(f"✗ Error updating {mpn}: {e}")
|
||||||
|
|
||||||
|
driver.quit()
|
||||||
|
|
||||||
|
print("\n" + "=" * 70)
|
||||||
|
print("PROVIDER UPDATE SUMMARY")
|
||||||
|
print("=" * 70)
|
||||||
|
print(f"Successful: {provider_success}")
|
||||||
|
print(f"Failed: {provider_failed}")
|
||||||
|
print("=" * 70)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Error during provider updates: {e}")
|
||||||
|
|
||||||
|
print("\nDone!")
|
||||||
|
|
||||||
|
|
||||||
|
if __name__ == "__main__":
|
||||||
|
import sys
|
||||||
|
|
||||||
|
# Check for --no-provider flag
|
||||||
|
update_providers = "--no-provider" not in sys.argv
|
||||||
|
|
||||||
|
run_import_from_csv(update_providers=update_providers)
|
||||||
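The counters above bucket each part by the return value of `create_part_if_not_exists` (whose body is outside this hunk). As a standalone sketch of that counting logic, assuming the `(created, part_id, message)` contract where a positive `part_id` with `created=False` means the part already existed, and where `classify_import_result` is a hypothetical helper used only for illustration:

```python
def classify_import_result(created: bool, part_id: int) -> str:
    """Mirror of the created/skipped/failed bucketing in run_import_from_csv."""
    if created:
        return "created"
    if part_id > 0:
        return "skipped"   # part already exists in PartDB
    return "failed"        # no valid part ID came back

print(classify_import_result(True, 42))   # created
print(classify_import_result(False, 42))  # skipped
print(classify_import_result(False, -1))  # failed
```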
456
workflows/standardize_asdmb.py
Normal file
@@ -0,0 +1,456 @@
"""
Workflow to standardize ASDMB crystal parts.

This script goes through all parts in the "Clock - ASDMB" category and:
1. Splits the name at "/" - the first part becomes the name, the second part becomes the description
2. For parts without a description after splitting, triggers an info provider update

Uses the PartDB API for all operations.
"""

import re
from typing import Optional, List, Tuple
from tqdm import tqdm

from config import PARTDB_BASE, PARTDB_TOKEN
from apis.partdb_api import PartDB

def extract_part_category_id(api: PartDB, part: dict) -> Optional[int]:
    """
    Extract the category ID from a part record, checking the top-level
    "category" field as well as JSON:API-style relationships and attributes.
    """
    part_cat = part.get("category")
    part_cat_id = None

    if isinstance(part_cat, dict):
        part_cat_id = api._extract_id(part_cat)
    elif isinstance(part_cat, str):
        try:
            if "/categories/" in part_cat:
                part_cat_id = int(part_cat.strip("/").split("/")[-1])
            else:
                part_cat_id = int(''.join(c for c in part_cat if c.isdigit()))
        except Exception:
            pass
    elif isinstance(part_cat, int):
        part_cat_id = part_cat

    # Also check relationships for the category
    if part_cat_id is None:
        relationships = part.get("relationships", {})
        if relationships:
            rel_cat = relationships.get("category")
            if isinstance(rel_cat, dict):
                rel_cat_data = rel_cat.get("data", {})
                if isinstance(rel_cat_data, dict):
                    part_cat_id = api._extract_id(rel_cat_data)

    # Also check attributes
    if part_cat_id is None:
        attributes = part.get("attributes", {})
        if attributes:
            attr_cat = attributes.get("category")
            if attr_cat:
                if isinstance(attr_cat, dict):
                    part_cat_id = api._extract_id(attr_cat)
                elif isinstance(attr_cat, (int, str)):
                    try:
                        part_cat_id = int(str(attr_cat).strip("/").split("/")[-1])
                    except Exception:
                        pass

    return part_cat_id


def get_all_parts_in_category(api: PartDB, category_name: str) -> List[dict]:
    """
    Get all parts in a specific category and its subcategories.
    """
    # First, find the category
    categories = api.list_categories()
    target_cat_id = None

    for cat in categories:
        name = (cat.get("name") or "").strip()
        if name.lower() == category_name.lower():
            target_cat_id = api._extract_id(cat)
            break

    if not target_cat_id:
        print(f"Category '{category_name}' not found!")
        return []

    print(f"Found category '{category_name}' with ID {target_cat_id}")

    # Find all subcategories
    subcategory_ids = [target_cat_id]

    def find_children(parent_id: int):
        for cat in categories:
            parent = cat.get("parent")
            if parent:
                parent_id_str = None
                if isinstance(parent, dict):
                    parent_id_str = parent.get("id") or parent.get("_id")
                elif isinstance(parent, str):
                    parent_id_str = parent

                if parent_id_str:
                    # Extract just the number
                    if isinstance(parent_id_str, str):
                        parent_num = int(''.join(c for c in parent_id_str if c.isdigit()))
                    else:
                        parent_num = int(parent_id_str)

                    if parent_num == parent_id:
                        child_id = api._extract_id(cat)
                        if child_id and child_id not in subcategory_ids:
                            subcategory_ids.append(child_id)
                            print(f"  Found subcategory: {cat.get('name')} (ID: {child_id})")
                            find_children(child_id)

    find_children(target_cat_id)

    print(f"Total categories to process: {len(subcategory_ids)}")
    print(f"Category IDs: {subcategory_ids}")

    # Fetch all parts in this category with pagination
    all_parts = []
    page = 1
    per_page = 30  # Use a smaller page size to match the API default

    print("\nFetching parts from API...")
    while True:
        params = {"per_page": per_page, "page": page}
        print(f"  Fetching page {page}...")

        try:
            parts = api._get("/api/parts", params=params)

            # Normalize both response shapes: a plain list, or {"data": [...], "meta": {...}}
            if isinstance(parts, list):
                data, meta = parts, {}
            elif isinstance(parts, dict):
                data, meta = parts.get("data", []), parts.get("meta", {})
            else:
                break

            if not data:
                print("  No parts returned, stopping")
                break

            # Filter by category
            matches_this_page = 0
            category_ids_found = set()  # every category ID seen on this page (debug aid)
            for part in data:
                part_cat_id = extract_part_category_id(api, part)
                if part_cat_id:
                    category_ids_found.add(part_cat_id)
                if part_cat_id and part_cat_id in subcategory_ids:
                    all_parts.append(part)
                    matches_this_page += 1

            print(f"  Got {len(data)} parts ({matches_this_page} matches, total: {len(all_parts)})")

            # Check whether there are more pages using meta, falling back to page size
            has_more = False
            if meta.get("current_page") and meta.get("last_page"):
                has_more = meta["current_page"] < meta["last_page"]
            elif len(data) >= per_page:
                has_more = True

            if not has_more:
                break

            page += 1
            # Safety check
            if page > 100:
                print("  Warning: Fetched 100 pages, stopping")
                break

        except Exception as e:
            print(f"  Error fetching page {page}: {e}")
            break

    print(f"\nFound {len(all_parts)} parts in category")
    return all_parts

def standardize_asdmb_part(api: PartDB, part: dict, dry_run: bool = False) -> Tuple[bool, str, bool]:
    """
    Standardize a single ASDMB crystal part.

    Returns: (success, message, needs_provider_update)
    """
    part_id = api._extract_id(part)
    if not part_id:
        return (False, "No part ID", False)

    # Get the current name and description
    current_name = part.get("name") or part.get("attributes", {}).get("name") or ""
    current_desc = part.get("description") or part.get("attributes", {}).get("description") or ""

    # Split the name at "/" to get the new name (first part)
    new_name = current_name
    if "/" in current_name:
        new_name = current_name.split("/", 1)[0].strip()

    # Split the description at "/" to get the new description (second part)
    new_description = ""
    needs_provider_update = False

    if "/" in current_desc:
        parts = current_desc.split("/", 1)
        new_description = parts[1].strip() if len(parts) > 1 else ""
        if not new_description:
            needs_provider_update = True
    elif not current_desc.strip():
        # No description at all
        needs_provider_update = True
    else:
        # Has a description but no "/" - leave as is
        new_description = current_desc

    # Check what needs updating
    changes = []

    if current_name != new_name:
        changes.append(f"name: '{current_name}' → '{new_name}'")

    if new_description and current_desc != new_description:
        changes.append(f"desc: '{current_desc}' → '{new_description}'")

    if needs_provider_update:
        changes.append("needs provider update for description")

    if not changes:
        return (True, "Already correct", False)

    if dry_run:
        return (True, f"Would update: {'; '.join(changes)}", needs_provider_update)

    # Apply updates
    try:
        payload = {
            "name": new_name
        }

        # Only update the description if we have one and it changed
        if new_description and new_description != current_desc:
            payload["description"] = new_description

        r = api._patch_merge(f"/api/parts/{part_id}", payload)
        if r.status_code not in range(200, 300):
            return (False, f"Failed to update: {r.status_code}", needs_provider_update)

        result_msg = f"Updated: {'; '.join(changes)}"
        return (True, result_msg, needs_provider_update)

    except Exception as e:
        return (False, f"Update failed: {e}", needs_provider_update)

def run_standardize_asdmb(category_name: str = "Clock - ASDMB", dry_run: bool = False, update_providers: bool = False, progress_callback=None):
    """
    Main function to standardize all ASDMB crystal parts.

    Args:
        category_name: Name of the category to process (default: "Clock - ASDMB")
        dry_run: If True, don't make any changes
        update_providers: If True, trigger provider updates for parts without descriptions
        progress_callback: Optional callback function(current, total, status_text);
            returns True if the operation should be cancelled
    """
    print("=" * 70)
    print("ASDMB CRYSTAL STANDARDIZATION")
    print("=" * 70)
    print(f"Category: {category_name}")
    print(f"Mode: {'DRY RUN (no changes)' if dry_run else 'LIVE MODE (will update parts)'}")
    print(f"Provider updates: {'ENABLED' if update_providers else 'DISABLED'}")
    print("=" * 70)

    # Initialize API
    api = PartDB(PARTDB_BASE, PARTDB_TOKEN)

    # Get all parts in the category
    print("\nFetching parts from category...")
    parts = get_all_parts_in_category(api, category_name)

    if not parts:
        print("No parts found!")
        return

    print(f"\nProcessing {len(parts)} parts...")

    # Track results
    successful = 0
    failed = 0
    skipped = 0
    needs_provider = []

    # Process each part
    use_tqdm = not progress_callback
    iterator = tqdm(parts, desc="Processing parts") if use_tqdm else parts

    for idx, part in enumerate(iterator):
        # Check for cancellation
        if progress_callback:
            cancelled = progress_callback(idx, len(parts), f"Processing part {idx+1}/{len(parts)}...")
            if cancelled:
                print("\n⚠ Operation cancelled by user")
                break

        part_name = part.get("name") or "Unknown"
        part_id = api._extract_id(part)

        success, message, needs_update = standardize_asdmb_part(api, part, dry_run)

        if success:
            if "Already correct" in message or "skipping" in message:
                skipped += 1
            else:
                successful += 1
                print(f"✓ {part_name}: {message}")

            if needs_update:
                needs_provider.append((part_id, part_name))
        else:
            failed += 1
            print(f"✗ {part_name}: {message}")

    # Final progress update
    if progress_callback:
        progress_callback(len(parts), len(parts), "Complete!")

    # Summary
    print("\n" + "=" * 70)
    print("SUMMARY")
    print("=" * 70)
    print(f"Total parts: {len(parts)}")
    print(f"Updated: {successful}")
    print(f"Failed: {failed}")
    print(f"Skipped: {skipped}")
    print(f"Need provider update: {len(needs_provider)}")

    if needs_provider and update_providers and not dry_run:
        print("\n" + "=" * 70)
        print("TRIGGERING PROVIDER UPDATES")
        print("=" * 70)

        # Import the selenium flow for provider updates
        try:
            from provider.selenium_flow import start_firefox_resilient, ensure_logged_in, run_provider_update_flow
            from config import HEADLESS_PROVIDER

            print("Starting browser...")
            driver = start_firefox_resilient(headless_first=HEADLESS_PROVIDER)

            print("Logging in...")
            driver.get(PARTDB_BASE + "/")
            if not ensure_logged_in(driver, PARTDB_BASE, interactive_ok=True, wait_s=120):
                print("Failed to log in!")
                driver.quit()
                return

            controller = driver.current_window_handle
            provider_success = 0
            provider_failed = 0

            for part_id, part_name in tqdm(needs_provider, desc="Updating from providers"):
                print(f"\nUpdating {part_name}...")
                ok, where = run_provider_update_flow(driver, PARTDB_BASE, "/en/", part_id, controller)

                if ok:
                    provider_success += 1
                    print("  ✓ Success")
                else:
                    provider_failed += 1
                    print(f"  ✗ Failed at: {where}")

            driver.quit()

            print("\n" + "=" * 70)
            print("PROVIDER UPDATE SUMMARY")
            print("=" * 70)
            print(f"Successful: {provider_success}")
            print(f"Failed: {provider_failed}")

        except Exception as e:
            print(f"Error during provider updates: {e}")
            import traceback
            traceback.print_exc()

    elif needs_provider and not update_providers:
        print("\nParts needing provider update:")
        for part_id, part_name in needs_provider[:10]:  # Show first 10
            print(f"  - {part_name} (ID: {part_id})")
        if len(needs_provider) > 10:
            print(f"  ... and {len(needs_provider) - 10} more")
        print("\nRe-run with update_providers=True to trigger provider updates")

    print("\n" + "=" * 70)


if __name__ == "__main__":
    import sys

    dry_run = "--dry-run" in sys.argv or "-d" in sys.argv
    update_providers = "--update-providers" in sys.argv or "-u" in sys.argv

    run_standardize_asdmb(dry_run=dry_run, update_providers=update_providers)
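The "/" convention that `standardize_asdmb_part` applies to both fields can be shown in isolation. `split_at_slash` below is a hypothetical helper written only to illustrate the rule (it is not part of the file above), and the example MPN is made up:

```python
def split_at_slash(raw: str) -> tuple:
    """ASDMB rule: text before the first '/' becomes the name,
    text after it becomes the description (empty if there is no '/')."""
    if "/" in raw:
        name, desc = raw.split("/", 1)
        return name.strip(), desc.strip()
    return raw.strip(), ""

# A part whose description came back empty after the split is the case
# that sets needs_provider_update in standardize_asdmb_part.
print(split_at_slash("ASDMB-12.000MHZ-LC-T / MEMS oscillator 12 MHz"))
print(split_at_slash("ASDMB-12.000MHZ-LC-T/"))  # empty description
```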
446
workflows/standardize_components.py
Normal file
@@ -0,0 +1,446 @@
"""
Workflow to standardize components in PartDB.

For parts in the Passives category:
- Sets the name to "Value Package" format
- Sets the description to "MPN Value Package Tolerance Voltage/Current/Power" format
- Sets the EDA value to match the component value
- Fixes names/descriptions with "/" separators

For parts in other categories:
- Checks if the EDA value is set; if not, sets it to the part name
- Fixes names/descriptions with "/" separators
"""

import re
from typing import Optional, List, Tuple
from tqdm import tqdm

from config import PARTDB_BASE, PARTDB_TOKEN
from apis.partdb_api import PartDB
from workflows.standardize_passives import standardize_part as standardize_passive_part

def fix_slash_separated_fields(name: str, description: str) -> Tuple[str, str]:
    """
    Fix names and descriptions that have two parts separated by '/'.
    For example: "ABC123/XYZ789" -> "ABC123"

    Returns:
        Tuple of (fixed_name, fixed_description)
    """
    fixed_name = name
    fixed_desc = description

    # Check if the name has a slash with content on both sides
    if '/' in name:
        parts = name.split('/')
        if len(parts) == 2 and parts[0].strip() and parts[1].strip():
            # Take the first part
            fixed_name = parts[0].strip()

    # Check if the description has a slash with content on both sides
    if '/' in description:
        parts = description.split('/')
        if len(parts) == 2 and parts[0].strip() and parts[1].strip():
            # Take the first part
            fixed_desc = parts[0].strip()

    return fixed_name, fixed_desc

def standardize_non_passive_part(api: PartDB, part: dict, dry_run: bool = False) -> Tuple[bool, str]:
    """
    Standardize a non-passive component.

    For non-passives:
    1. Fix slash-separated names/descriptions
    2. Ensure the EDA value is set (use the name if not set)

    Returns:
        (success: bool, message: str)
    """
    part_id = api._extract_id(part)
    current_name = part.get("name", "")
    current_desc = part.get("description", "")
    current_eda = part.get("value", "")

    # Fix slash-separated fields
    new_name, new_desc = fix_slash_separated_fields(current_name, current_desc)

    # Check if the EDA value needs to be set
    new_eda = current_eda
    if not current_eda or current_eda.strip() == "":
        new_eda = current_name

    # Check what needs to be changed
    changes = []
    if current_name != new_name:
        changes.append(f"name: '{current_name}' -> '{new_name}'")
    if current_desc != new_desc:
        changes.append(f"desc: '{current_desc}' -> '{new_desc}'")
    if current_eda != new_eda:
        changes.append(f"EDA: '{current_eda}' -> '{new_eda}'")

    if not changes:
        return (True, "Already correct")

    if dry_run:
        return (True, f"Would update: {'; '.join(changes)}")

    # Apply updates
    try:
        # Update the name and description if needed
        if current_name != new_name or current_desc != new_desc:
            payload = {}
            if current_name != new_name:
                payload["name"] = new_name
            if current_desc != new_desc:
                payload["description"] = new_desc

            r = api._patch_merge(f"/api/parts/{part_id}", payload)
            if r.status_code not in range(200, 300):
                return (False, f"Failed to update name/desc: {r.status_code}")

        # Update the EDA value if needed
        if current_eda != new_eda:
            success = api.patch_eda_value(part_id, new_eda)
            if not success:
                return (False, "Failed to update EDA value")

        return (True, f"Updated: {'; '.join(changes)}")

    except Exception as e:
        return (False, f"Update failed: {e}")

def get_all_parts_paginated(api: PartDB) -> List[dict]:
    """
    Get all parts from PartDB using pagination.
    """
    all_parts = []
    page = 1
    per_page = 30

    print("Fetching all parts from API...")

    while True:
        params = {"per_page": per_page, "page": page}
        print(f"  Fetching page {page}...")

        try:
            parts = api._get("/api/parts", params=params)

            if isinstance(parts, list):
                if not parts:
                    break
                all_parts.extend(parts)
                print(f"  Got {len(parts)} parts (total: {len(all_parts)})")
                page += 1
            else:
                print(f"  Unexpected response type: {type(parts)}")
                break

        except Exception as e:
            print(f"  Error fetching parts: {e}")
            break

    print(f"Total parts fetched: {len(all_parts)}")
    return all_parts

def is_part_in_passives_category(api: PartDB, part: dict, passives_category_ids: List[int]) -> bool:
    """
    Check if a part belongs to the Passives category or any of its subcategories.
    """
    category = part.get("category")
    if not category:
        return False

    # Extract the category ID
    if isinstance(category, dict):
        cat_id_str = category.get("id") or category.get("_id")
    elif isinstance(category, str):
        cat_id_str = category
    else:
        return False

    # Parse the ID
    try:
        if isinstance(cat_id_str, str):
            cat_id = int(''.join(c for c in cat_id_str if c.isdigit()))
        else:
            cat_id = int(cat_id_str)

        return cat_id in passives_category_ids
    except (TypeError, ValueError):
        return False

def get_passives_category_ids(api: PartDB) -> List[int]:
    """
    Get all category IDs for Passives and its subcategories.
    """
    categories = api.list_categories()
    target_cat_id = None

    # Find the Passives category
    for cat in categories:
        name = (cat.get("name") or "").strip()
        if name.lower() == "passives":
            target_cat_id = api._extract_id(cat)
            break

    if not target_cat_id:
        print("Warning: Passives category not found!")
        return []

    # Find all subcategories
    category_ids = [target_cat_id]

    def find_children(parent_id: int):
        for cat in categories:
            parent = cat.get("parent")
            if parent:
                parent_id_str = None
                if isinstance(parent, dict):
                    parent_id_str = parent.get("id") or parent.get("_id")
                elif isinstance(parent, str):
                    parent_id_str = parent

                if parent_id_str:
                    try:
                        if isinstance(parent_id_str, str):
                            parent_num = int(''.join(c for c in parent_id_str if c.isdigit()))
                        else:
                            parent_num = int(parent_id_str)

                        if parent_num == parent_id:
                            child_id = api._extract_id(cat)
                            if child_id and child_id not in category_ids:
                                category_ids.append(child_id)
                                find_children(child_id)
                    except (TypeError, ValueError):
                        pass

    find_children(target_cat_id)

    print(f"Found Passives category with {len(category_ids)} total categories (including subcategories)")
    return category_ids

def run_standardize_components(dry_run: bool = False, progress_callback=None):
    """
    Main function to standardize all components.

    For passives: full standardization (value, format, etc.)
    For others: fix slashes, ensure the EDA value is set

    Args:
        dry_run: If True, only show what would be changed without making changes
        progress_callback: Optional callback function(current, total, status_text);
            returns True if the operation should be cancelled
    """
    print("=" * 70)
    print("COMPONENT STANDARDIZATION")
    print("=" * 70)
    print(f"Mode: {'DRY RUN (no changes)' if dry_run else 'LIVE (making changes)'}")
    print("=" * 70)
    print()

    # Initialize API
    api = PartDB(PARTDB_BASE, PARTDB_TOKEN)

    # Get Passives category IDs
    passives_ids = get_passives_category_ids(api)

    # Get all parts
    print("\nFetching all parts...")
    all_parts = get_all_parts_paginated(api)

    if not all_parts:
        print("No parts found!")
        return

    # Separate passives from the others
    passives = []
    non_passives = []

    for part in all_parts:
        if is_part_in_passives_category(api, part, passives_ids):
            passives.append(part)
        else:
            non_passives.append(part)

    print(f"\nFound {len(passives)} passive components")
    print(f"Found {len(non_passives)} non-passive components")
    print(f"Total: {len(all_parts)} parts")
    print()

    # Statistics
    passive_success = 0
    passive_already_correct = 0
    passive_failed = 0

    non_passive_success = 0
    non_passive_already_correct = 0
    non_passive_failed = 0

    errors = []

    # Calculate the total for progress reporting
    total_parts = len(all_parts)
    processed = 0

    # Process passives
    if passives:
        print("=" * 70)
        print("PROCESSING PASSIVE COMPONENTS")
        print("=" * 70)

        bar = tqdm(passives, desc="Passives", unit="part") if not progress_callback else None

        for part in passives:
            # Check for cancellation
            if progress_callback:
                cancelled = progress_callback(
                    processed, total_parts,
                    f"Processing passives ({processed+1}/{total_parts})..."
                )
                if cancelled:
                    print("\n⚠ Operation cancelled by user")
                    if bar:
                        bar.close()
                    return

            part_id = api._extract_id(part)
            mpn = part.get("manufacturer_product_number") or part.get("mpn") or f"ID:{part_id}"

            if bar:
                bar.set_postfix_str(mpn[:30])

            success, message = standardize_passive_part(api, part, dry_run=dry_run)

            if success:
                if "Already correct" in message:
                    passive_already_correct += 1
                else:
                    passive_success += 1
                    if bar:
                        tqdm.write(f"✓ {mpn}: {message}")
                    else:
                        print(f"✓ {mpn}: {message}")
            else:
                passive_failed += 1
                errors.append((mpn, message))
                if bar:
                    tqdm.write(f"✗ {mpn}: {message}")
                else:
                    print(f"✗ {mpn}: {message}")

            processed += 1

        if bar:
            bar.close()

    # Process non-passives
    if non_passives:
        print()
        print("=" * 70)
        print("PROCESSING NON-PASSIVE COMPONENTS")
        print("=" * 70)

        bar = tqdm(non_passives, desc="Others", unit="part") if not progress_callback else None

        for part in non_passives:
            # Check for cancellation
            if progress_callback:
                cancelled = progress_callback(
                    processed, total_parts,
                    f"Processing non-passives ({processed+1}/{total_parts})..."
                )
                if cancelled:
                    print("\n⚠ Operation cancelled by user")
                    if bar:
                        bar.close()
                    return

            part_id = api._extract_id(part)
            mpn = part.get("manufacturer_product_number") or part.get("mpn") or f"ID:{part_id}"

            if bar:
                bar.set_postfix_str(mpn[:30])

            success, message = standardize_non_passive_part(api, part, dry_run=dry_run)

            if success:
                if "Already correct" in message:
                    non_passive_already_correct += 1
                else:
                    non_passive_success += 1
                    if bar:
                        tqdm.write(f"✓ {mpn}: {message}")
                    else:
                        print(f"✓ {mpn}: {message}")
            else:
                non_passive_failed += 1
                errors.append((mpn, message))
                if bar:
                    tqdm.write(f"✗ {mpn}: {message}")
                else:
                    print(f"✗ {mpn}: {message}")
|
||||||
|
|
||||||
|
processed += 1
|
||||||
|
|
||||||
|
if bar:
|
||||||
|
bar.close()
|
||||||
|
|
||||||
|
# Final progress update
|
||||||
|
if progress_callback:
|
||||||
|
progress_callback(total_parts, total_parts, "Complete!")
|
||||||
|
|
||||||
|
# Print summary
|
||||||
|
print()
|
||||||
|
print("=" * 70)
|
||||||
|
print("SUMMARY")
|
||||||
|
print("=" * 70)
|
||||||
|
print(f"PASSIVE COMPONENTS:")
|
||||||
|
print(f" Total processed: {len(passives)}")
|
||||||
|
print(f" Already correct: {passive_already_correct}")
|
||||||
|
print(f" Successfully updated: {passive_success}")
|
||||||
|
print(f" Failed: {passive_failed}")
|
||||||
|
print()
|
||||||
|
print(f"NON-PASSIVE COMPONENTS:")
|
||||||
|
print(f" Total processed: {len(non_passives)}")
|
||||||
|
print(f" Already correct: {non_passive_already_correct}")
|
||||||
|
print(f" Successfully updated: {non_passive_success}")
|
||||||
|
print(f" Failed: {non_passive_failed}")
|
||||||
|
print()
|
||||||
|
print(f"TOTAL:")
|
||||||
|
print(f" Parts processed: {total_parts}")
|
||||||
|
print(f" Successfully updated: {passive_success + non_passive_success}")
|
||||||
|
print(f" Failed: {passive_failed + non_passive_failed}")
|
||||||
|
print("=" * 70)
|
||||||
|
|
||||||
|
if errors:
|
||||||
|
print()
|
||||||
|
print("ERRORS:")
|
||||||
|
for mpn, msg in errors[:20]: # Show first 20 errors
|
||||||
|
print(f" {mpn}: {msg}")
|
||||||
|
if len(errors) > 20:
|
||||||
|
print(f" ... and {len(errors) - 20} more")
|
||||||
|
|
||||||
|
print()
|
||||||
|
|
||||||
|
|
||||||
|
if __name__ == "__main__":
|
||||||
|
import sys
|
||||||
|
|
||||||
|
# Check for dry-run flag
|
||||||
|
dry_run = "--dry-run" in sys.argv or "-n" in sys.argv
|
||||||
|
|
||||||
|
if dry_run:
|
||||||
|
print("Running in DRY RUN mode - no changes will be made")
|
||||||
|
print()
|
||||||
|
|
||||||
|
run_standardize_components(dry_run=dry_run)
|
||||||
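The cancellation checks above rely on a `progress_callback` that receives `(done, total, message)` and returns a truthy value to request cancellation. The exact callback used by the UI is not shown here, so the following is a minimal sketch of a compatible callback, with `make_console_callback` and its `cancel_after` parameter being illustrative names, not part of the codebase:

```python
# Hypothetical progress_callback compatible with the loops above:
# it is called as callback(done, total, message) and its return value
# is interpreted as "cancel requested" when truthy.
def make_console_callback(cancel_after=None):
    """Build a callback that prints progress and optionally cancels."""
    def callback(done, total, message):
        # Guard against division by zero when there is nothing to process.
        pct = (100 * done // total) if total else 100
        print(f"[{pct:3d}%] {message}")
        # Returning True signals the worker loop to stop early.
        return cancel_after is not None and done >= cancel_after
    return callback

cb = make_console_callback(cancel_after=2)
# The loop would stop at the first call where the callback returns True.
results = [cb(i, 4, f"step {i}") for i in range(4)]
```

Because the worker checks the return value before each part, cancellation takes effect between parts rather than mid-update, which keeps individual PartDB writes atomic.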
1084
workflows/standardize_passives.py
Normal file
File diff suppressed because it is too large