43 Commits

Author SHA1 Message Date
d4012589e6 add 0.6.6 2026-01-18 19:55:59 +00:00
8ac67fa60c 0.6.6 2026-01-18 19:53:34 +00:00
981490c152 0.6.6 2026-01-18 19:53:12 +00:00
99b2a0f0c5 0.6.6 2026-01-18 19:52:27 +00:00
7ead494765 0.6.6 2026-01-18 19:51:31 +00:00
342b6f6c57 0.6.6 2026-01-18 19:50:44 +00:00
66ca63b2be I 2026-01-18 19:35:19 +00:00
e8325f722f I 2026-01-18 19:32:58 +00:00
7c1a91937a add 0.6.5 2026-01-18 18:55:40 +00:00
7ac3289bb7 0.6.5 2026-01-18 18:53:52 +00:00
19bdbd1b9a 0.6.5 2026-01-18 18:53:30 +00:00
24363cd2ac 0.6.5 2026-01-18 18:52:55 +00:00
e19ca5bff1 0.6.5 2026-01-18 18:51:48 +00:00
05897d4370 0.6.5 2026-01-18 18:51:17 +00:00
7a3a28d87f add 0.6.4 2026-01-18 16:58:08 +00:00
240cded8a9 0.6.4 2026-01-18 16:56:23 +00:00
31e241f052 0.6.4 2026-01-18 16:56:04 +00:00
de579682a0 0.6.4 2026-01-18 16:55:36 +00:00
9acbd5046c Add 0.6.3 2026-01-18 15:54:21 +00:00
8d63c88e69 0.6.3 2026-01-18 15:53:52 +00:00
cffb0af60e 0.6.3 2026-01-18 15:53:03 +00:00
857b7a127a 0.6 3 2026-01-18 15:52:28 +00:00
66b24ece48 0.6.3 2026-01-18 15:52:01 +00:00
0cc3b466e0 0.6.3 2026-01-18 15:51:34 +00:00
f1e03b31a1 add 0.6.2 2026-01-18 13:16:42 +00:00
4e12d596d6 0.6.2 2026-01-18 13:12:47 +00:00
fa97f89afb 0.6.2 2026-01-18 13:12:21 +00:00
0718bee185 0.6.2 2026-01-18 13:11:39 +00:00
1a53107450 0.6.2 2026-01-18 13:10:59 +00:00
ab82cc6fd3 0.6.2 2026-01-18 13:10:16 +00:00
8e51f144e1 0.6.2 2026-01-18 13:09:37 +00:00
f292e22301 Fix restore version 2026-01-18 09:08:25 +00:00
2eb194c001 Add 0.6 1 2026-01-18 09:07:40 +00:00
f4e367987a 0.6.1 2026-01-18 09:07:03 +00:00
08aa4b5e15 0.6.0 2026-01-18 08:37:07 +00:00
b1676482f0 0.6.0 2026-01-18 08:36:34 +00:00
e46cd6e488 0.6.0 2026-01-18 08:34:44 +00:00
edd2fdd3fb 0.6.0 2026-01-18 08:33:34 +00:00
a4a0c1462b 0.6.0 2026-01-18 08:32:51 +00:00
196e63c08e 0.6.0 2026-01-18 08:32:06 +00:00
518ac1d59d add 0.5.11 2026-01-18 07:47:24 +00:00
ad699dc69a 0.5.11 2026-01-18 07:45:56 +00:00
a8e247d288 Add backup 2026-01-18 07:45:34 +00:00
9 changed files with 1761 additions and 96 deletions

View File

@@ -11,6 +11,94 @@ Sections:
---
## [0.6.6] - 2026-01-18
### Added
- Source filter to limit repositories by origin: BCS Official, HACS, or Custom.
- Visual source badges for repositories (BCS Official, HACS, Custom).
- Restored HACS enable/disable toggle in the Store UI.
### Changed
- HACS repositories now display human-readable names and descriptions based on official HACS metadata.
- Improved Store usability on mobile devices by fixing back navigation from repository detail view.
### Fixed
- Fixed missing HACS toggle after UI updates.
- Fixed mobile browser back button exiting the Store instead of returning to the repository list.
## [0.6.5] - 2026-01-18
### Added
- Separate handling of HACS official repositories with an enable/disable toggle in the Store UI.
- HACS repositories are now loaded independently from the main store index.
### Changed
- Store index can remain minimal and curated; HACS repositories are no longer required in store.yaml.
- Improved Store performance and clarity by clearly separating repository sources.
### Fixed
- Browser cache issues resolved by proper panel cache-busting for UI updates.
### Internal
- No changes to install, update, backup, or restore logic.
- Fully backward compatible with existing installations and configurations.
## [0.6.4] - 2026-01-18
### Fixed
- Fixed long Home Assistant startup times caused by background repository enrichment running too early.
### Changed
- Background repository enrichment is now started only after Home Assistant has fully started.
- Repository cache updates now run fully asynchronously without blocking Home Assistant startup.
### Internal
- Improved alignment with Home Assistant startup lifecycle.
- No functional changes to store behavior or UI.
## [0.6.3] - 2026-01-18
### Changed
- Improved Store performance for large indexes by avoiding full metadata enrichment during list refresh.
- Repository details are loaded on demand, reducing initial load time and network requests.
- Index refresh is skipped when the index content has not changed.
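
The unchanged-index skip can be sketched as a content-hash check. A minimal sketch, assuming the index is fetched as text; the real refresh signature also folds in custom repos and the HACS toggle:

```python
import hashlib


class IndexCache:
    """Skip expensive refresh work when the fetched index is byte-identical."""

    def __init__(self) -> None:
        self.last_hash: str | None = None

    def should_refresh(self, index_text: str) -> bool:
        h = hashlib.sha256(index_text.encode("utf-8")).hexdigest()
        if h == self.last_hash:
            return False  # index unchanged: keep the current repo list
        self.last_hash = h
        return True
```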
## [0.6.2] - 2026-01-18
### Added
- Selectable install/update version per repository (install older releases/tags to downgrade when needed).
- New API endpoint to list available versions for a repository: `GET /api/bcs/versions?repo_id=...`.
## [0.6.1] - 2026-01-18
### Fixed
- Restored integrations now correctly report the restored version instead of the latest installed version.
- Update availability is correctly recalculated after restoring a backup, allowing updates to be applied again.
- Improved restore compatibility with backups created before version metadata was introduced.
## [0.6.0] - 2026-01-18
### Added
- Restore feature with selection of the last available backups (up to 5 per domain).
- New API endpoints to list and restore backups:
- `GET /api/bcs/backups?repo_id=...`
- `POST /api/bcs/restore?repo_id=...&backup_id=...`
### Safety
- Restoring a backup triggers a “restart required” prompt to apply the recovered integration state.
### Notes
- This is a major release milestone consolidating install/update/uninstall, backup/rollback, and restore workflows.
## [0.5.11] - 2026-01-18
### Added
- Automatic backup of existing custom components before install or update.
- Backup retention with a configurable limit per domain.
### Safety
- Automatic rollback is triggered if an install or update fails after a backup was created.
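
Retention can be sketched as a sort-and-slice over timestamped folder names; a minimal sketch assuming the `YYYYMMDD_HHMMSS` id format, where lexicographic order equals chronological order:

```python
def prune_candidates(backup_ids: list[str], keep: int = 5) -> list[str]:
    """Return the backup ids that fall outside the newest `keep` and should be deleted."""
    return sorted(backup_ids, reverse=True)[keep:]
```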
## [0.5.10] - 2026-01-17
### Added

View File

@@ -4,7 +4,7 @@ description: >
Supports GitHub, GitLab, Gitea and Bahmcloud repositories with
a central index, UI panel and API, similar to HACS but independent.
category: Integrations
author: Bahmcloud
maintainer: Bahmcloud

View File

@@ -5,7 +5,8 @@ from datetime import timedelta
from homeassistant.core import HomeAssistant
from homeassistant.components.panel_custom import async_register_panel
from homeassistant.helpers.event import async_track_time_interval, async_call_later
from homeassistant.const import EVENT_HOMEASSISTANT_STARTED
from homeassistant.helpers.discovery import async_load_platform
from .core import BCSCore, BCSConfig, BCSError
@@ -34,21 +35,31 @@ async def async_setup(hass: HomeAssistant, config: dict) -> bool:
from .views import (
StaticAssetsView,
BCSApiView,
BCSSettingsView,
BCSReadmeView,
BCSVersionsView,
BCSRepoDetailView,
BCSCustomRepoView,
BCSInstallView,
BCSUpdateView,
BCSUninstallView,
BCSBackupsView,
BCSRestoreView,
BCSRestartView,
)
hass.http.register_view(StaticAssetsView())
hass.http.register_view(BCSApiView(core))
hass.http.register_view(BCSSettingsView(core))
hass.http.register_view(BCSReadmeView(core))
hass.http.register_view(BCSVersionsView(core))
hass.http.register_view(BCSRepoDetailView(core))
hass.http.register_view(BCSCustomRepoView(core))
hass.http.register_view(BCSInstallView(core))
hass.http.register_view(BCSUpdateView(core))
hass.http.register_view(BCSUninstallView(core))
hass.http.register_view(BCSBackupsView(core))
hass.http.register_view(BCSRestoreView(core))
hass.http.register_view(BCSRestartView(core))
await async_register_panel(
@@ -56,17 +67,24 @@ async def async_setup(hass: HomeAssistant, config: dict) -> bool:
frontend_url_path="bahmcloud-store",
webcomponent_name="bahmcloud-store-panel",
# IMPORTANT: bump v to avoid caching old JS
module_url="/api/bahmcloud_store_static/panel.js?v=108",
sidebar_title="Bahmcloud Store",
sidebar_icon="mdi:store",
require_admin=True,
config={},
)

async def _do_startup_refresh(_now=None) -> None:
    try:
        await core.full_refresh(source="startup")
    except BCSError as e:
        _LOGGER.error("Initial refresh failed: %s", e)

# Do not block Home Assistant startup. Schedule the initial refresh after HA started.
def _on_ha_started(_event) -> None:
    async_call_later(hass, 30, _do_startup_refresh)

hass.bus.async_listen_once(EVENT_HOMEASSISTANT_STARTED, _on_ha_started)
async def periodic(_now) -> None:
    try:

View File

@@ -20,7 +20,7 @@ from homeassistant.helpers import issue_registry as ir
from homeassistant.util import yaml as ha_yaml
from .storage import BCSStorage, CustomRepo
from .providers import fetch_repo_info, detect_provider, RepoInfo, fetch_readme_markdown, fetch_repo_versions
from .metadata import fetch_repo_metadata, RepoMetadata
_LOGGER = logging.getLogger(__name__)
@@ -30,6 +30,14 @@ DOMAIN = "bahmcloud_store"
SIGNAL_UPDATED = f"{DOMAIN}_updated"
RESTART_REQUIRED_ISSUE_ID = "restart_required"
BACKUP_META_FILENAME = ".bcs_backup_meta.json"
# Optional HACS integrations index (GitHub repositories only).
HACS_INTEGRATIONS_URL = "https://data-v2.hacs.xyz/integration/repositories.json"
HACS_INTEGRATIONS_DATA_URL = "https://data-v2.hacs.xyz/integration/data.json"
HACS_DEFAULT_CATEGORY = "Integrations"
HACS_CACHE_TTL_SECONDS = 60 * 60 * 24 # 24h
class BCSError(Exception):
"""BCS core error."""
@@ -87,17 +95,53 @@ class BCSCore:
self.last_index_hash: str | None = None
self.last_index_loaded_at: float | None = None
# Fast refresh: skip expensive processing when index/custom repos unchanged
self._last_refresh_signature: str | None = None
self._install_lock = asyncio.Lock()
self._installed_cache: dict[str, Any] = {}
# Persistent settings (UI toggles etc.)
self.settings: dict[str, Any] = {"hacs_enabled": False}
# Cached HACS metadata (display names/descriptions). Loaded from storage.
self._hacs_meta_fetched_at: int = 0
self._hacs_meta: dict[str, dict[str, Any]] = {}
self._hacs_meta_lock = asyncio.Lock()
# Phase F2: backups before install/update
self._backup_root = Path(self.hass.config.path(".bcs_backups"))
self._backup_keep_per_domain: int = 5
async def async_initialize(self) -> None:
"""Async initialization that avoids blocking file IO."""
self.version = await self._read_manifest_version_async()
await self._refresh_installed_cache()
# Load persistent settings (do not fail startup)
try:
s = await self.storage.get_settings()
if isinstance(s, dict):
self.settings.update(s)
except Exception:
pass
# After a successful HA restart, restart-required is no longer relevant.
self._clear_restart_required_issue()
# Load cached HACS metadata (optional; improves UX when HACS toggle is enabled).
try:
hc = await self.storage.get_hacs_cache()
if isinstance(hc, dict):
self._hacs_meta_fetched_at = int(hc.get("fetched_at") or 0)
repos = hc.get("repos")
if isinstance(repos, dict):
# Normalize to string keys
self._hacs_meta = {str(k): (v if isinstance(v, dict) else {}) for k, v in repos.items()}
except Exception:
self._hacs_meta_fetched_at = 0
self._hacs_meta = {}
async def _read_manifest_version_async(self) -> str:
def _read() -> str:
try:
@@ -166,17 +210,62 @@ class BCSCore:
data = (self._installed_cache or {}).get(repo_id)
return data if isinstance(data, dict) else None
def get_settings_public(self) -> dict[str, Any]:
"""Return UI-relevant settings (no I/O)."""
return {
"hacs_enabled": bool(self.settings.get("hacs_enabled", False)),
}
async def set_settings(self, updates: dict[str, Any]) -> dict[str, Any]:
"""Persist settings and apply them."""
safe_updates: dict[str, Any] = {}
if "hacs_enabled" in (updates or {}):
safe_updates["hacs_enabled"] = bool(updates.get("hacs_enabled"))
merged = await self.storage.set_settings(safe_updates)
if isinstance(merged, dict):
self.settings.update(merged)
# Reload repo list after changing settings.
await self.full_refresh(source="settings")
return self.get_settings_public()
async def refresh(self) -> None:
index_repos, refresh_seconds = await self._load_index_repos()
self.refresh_seconds = refresh_seconds
hacs_enabled = bool(self.settings.get("hacs_enabled", False))
hacs_repos: list[RepoItem] = []
if hacs_enabled:
try:
hacs_repos = await self._load_hacs_repos()
except Exception as e:
_LOGGER.warning("BCS HACS index load failed: %s", e)
custom_repos = await self.storage.list_custom_repos()
# Fast path: if index + custom repos did not change, skip expensive work.
try:
custom_sig = [(c.id, (c.url or '').strip(), (c.name or '').strip()) for c in (custom_repos or [])]
custom_sig.sort()
hacs_sig = len(hacs_repos) if hacs_enabled else 0
refresh_signature = json.dumps({"index_hash": self.last_index_hash, "custom": custom_sig, "hacs": hacs_sig, "hacs_enabled": hacs_enabled}, sort_keys=True)
except Exception:
refresh_signature = f"{self.last_index_hash}:{len(custom_repos or [])}:{'h' if hacs_enabled else 'n'}:{len(hacs_repos)}"
if self._last_refresh_signature and refresh_signature == self._last_refresh_signature and self.repos:
_LOGGER.debug("BCS refresh skipped (no changes detected)")
return
merged: dict[str, RepoItem] = {}
for item in index_repos:
merged[item.id] = item
for item in hacs_repos:
merged[item.id] = item
for c in custom_repos:
merged[c.id] = RepoItem(
id=c.id,
@@ -188,16 +277,207 @@ class BCSCore:
for r in merged.values():
r.provider = detect_provider(r.url)
# Apply cached HACS display metadata immediately (fast UX).
if hacs_enabled and hacs_repos:
self._apply_hacs_meta(merged)
# Refresh HACS metadata in the background if cache is missing/stale.
if self._hacs_meta_needs_refresh():
self.hass.async_create_task(self._refresh_hacs_meta_background())
await self._enrich_installed_only(merged)
self.repos = merged
self._last_refresh_signature = refresh_signature
_LOGGER.info(
"BCS refresh complete: repos=%s (index=%s, hacs=%s, custom=%s)",
len(self.repos),
len([r for r in self.repos.values() if r.source == "index"]),
len([r for r in self.repos.values() if r.source == "hacs"]),
len([r for r in self.repos.values() if r.source == "custom"]),
)
async def _load_hacs_repos(self) -> list[RepoItem]:
"""Load the official HACS integrations repository list.
This is used as an optional additional source to keep the local store index small.
We only parse owner/repo strings and map them to GitHub URLs.
"""
session = async_get_clientsession(self.hass)
headers = {
"User-Agent": "BahmcloudStore (Home Assistant)",
"Cache-Control": "no-cache, no-store, max-age=0",
"Pragma": "no-cache",
}
async with session.get(HACS_INTEGRATIONS_URL, timeout=60, headers=headers) as resp:
if resp.status != 200:
raise BCSError(f"HACS index returned {resp.status}")
data = await resp.json()
if not isinstance(data, list):
raise BCSError("HACS repositories.json must be a list")
items: list[RepoItem] = []
for entry in data:
if not isinstance(entry, str):
continue
full_name = entry.strip().strip("/")
if not full_name or "/" not in full_name:
continue
repo_id = f"hacs:{full_name.lower()}"
owner = full_name.split("/", 1)[0].strip()
items.append(
RepoItem(
id=repo_id,
# Name is improved later via cached HACS meta (manifest.name).
name=full_name,
url=f"https://github.com/{full_name}",
source="hacs",
owner=owner,
provider_repo_name=full_name, # keep stable owner/repo reference
meta_category=HACS_DEFAULT_CATEGORY,
)
)
return items
def _hacs_meta_needs_refresh(self) -> bool:
if not self._hacs_meta_fetched_at or not self._hacs_meta:
return True
age = int(time.time()) - int(self._hacs_meta_fetched_at)
return age > HACS_CACHE_TTL_SECONDS
def _apply_hacs_meta(self, merged: dict[str, RepoItem]) -> None:
"""Apply cached HACS metadata to matching repos (no I/O)."""
if not self._hacs_meta:
return
def _full_name_from_repo(r: RepoItem) -> str | None:
# Prefer the original owner/repo (stable) if we kept it.
if r.provider_repo_name and "/" in str(r.provider_repo_name):
return str(r.provider_repo_name).strip()
# Fall back to URL path: https://github.com/owner/repo
try:
u = urlparse((r.url or "").strip())
parts = [p for p in (u.path or "").strip("/").split("/") if p]
if len(parts) >= 2:
repo = parts[1]
if repo.endswith(".git"):
repo = repo[:-4]
return f"{parts[0]}/{repo}"
except Exception:
pass
return None
for r in merged.values():
if r.source != "hacs":
continue
key = _full_name_from_repo(r)
if not key or "/" not in key:
continue
meta = self._hacs_meta.get(key)
if not isinstance(meta, dict) or not meta:
continue
# Prefer HACS manifest name as display name.
display_name = meta.get("name")
if isinstance(display_name, str) and display_name.strip():
r.name = display_name.strip()
r.meta_name = display_name.strip()
desc = meta.get("description")
if isinstance(desc, str) and desc.strip():
r.meta_description = desc.strip()
domain = meta.get("domain")
# We don't store domain in RepoItem fields, but keep it in meta_source for debugging.
# (Optional: extend RepoItem later if needed.)
if isinstance(domain, str) and domain.strip():
# Keep under meta_source marker to help identify source.
pass
r.meta_source = r.meta_source or "hacs"
r.meta_category = r.meta_category or HACS_DEFAULT_CATEGORY
async def _refresh_hacs_meta_background(self) -> None:
"""Fetch and cache HACS integration metadata in the background.
Uses the official HACS data endpoint which includes manifest data.
This avoids per-repo GitHub calls and improves the UX (names/descriptions).
"""
async with self._hacs_meta_lock:
# Another task might have refreshed already.
if not self._hacs_meta_needs_refresh():
return
session = async_get_clientsession(self.hass)
headers = {
"User-Agent": "BahmcloudStore (Home Assistant)",
"Cache-Control": "no-cache, no-store, max-age=0",
"Pragma": "no-cache",
}
try:
async with session.get(HACS_INTEGRATIONS_DATA_URL, timeout=120, headers=headers) as resp:
if resp.status != 200:
raise BCSError(f"HACS data.json returned {resp.status}")
data = await resp.json()
except Exception as e:
_LOGGER.warning("BCS HACS meta refresh failed: %s", e)
return
# Build mapping owner/repo -> {name, description, domain}
meta_map: dict[str, dict[str, Any]] = {}
if isinstance(data, dict):
for _, obj in data.items():
if not isinstance(obj, dict):
continue
full_name = obj.get("full_name")
if not isinstance(full_name, str) or "/" not in full_name:
continue
manifest = obj.get("manifest")
mname = None
mdesc = None
mdomain = None
if isinstance(manifest, dict):
mname = manifest.get("name")
mdesc = manifest.get("description")
mdomain = manifest.get("domain")
entry: dict[str, Any] = {}
if isinstance(mname, str) and mname.strip():
entry["name"] = mname.strip()
if isinstance(mdesc, str) and mdesc.strip():
entry["description"] = mdesc.strip()
if isinstance(mdomain, str) and mdomain.strip():
entry["domain"] = mdomain.strip()
if entry:
meta_map[full_name.strip()] = entry
self._hacs_meta = meta_map
self._hacs_meta_fetched_at = int(time.time())
try:
await self.storage.set_hacs_cache({
"fetched_at": self._hacs_meta_fetched_at,
"repos": self._hacs_meta,
})
except Exception:
_LOGGER.debug("Failed to persist HACS cache", exc_info=True)
# Apply meta to current repos and notify UI.
try:
self._apply_hacs_meta(self.repos)
except Exception:
pass
_LOGGER.info("BCS HACS metadata cached: repos=%s", len(self._hacs_meta))
self.signal_updated()
async def _enrich_and_resolve(self, merged: dict[str, RepoItem]) -> None:
sem = asyncio.Semaphore(6)
@@ -232,6 +512,87 @@ class BCSCore:
await asyncio.gather(*(process_one(r) for r in merged.values()), return_exceptions=True)
async def _enrich_installed_only(self, merged: dict[str, RepoItem]) -> None:
"""Enrich only installed repos (fast refresh for large indexes).
This keeps the backend responsive even with thousands of repositories.
Details for non-installed repos are fetched on-demand.
"""
installed_map: dict[str, Any] = getattr(self, "_installed_cache", {}) or {}
if not isinstance(installed_map, dict) or not installed_map:
return
to_process: list[RepoItem] = []
for rid in installed_map.keys():
r = merged.get(str(rid))
if r:
to_process.append(r)
if not to_process:
return
sem = asyncio.Semaphore(6)
async def process_one(r: RepoItem) -> None:
async with sem:
await self._enrich_one_repo(r)
await asyncio.gather(*(process_one(r) for r in to_process), return_exceptions=True)
async def _enrich_one_repo(self, r: RepoItem) -> None:
"""Fetch provider info + metadata for a single repo item."""
info: RepoInfo = await fetch_repo_info(self.hass, r.url)
r.provider = info.provider or r.provider
r.owner = info.owner or r.owner
r.provider_repo_name = info.repo_name
r.provider_description = info.description
r.default_branch = info.default_branch or r.default_branch
r.latest_version = info.latest_version
r.latest_version_source = info.latest_version_source
md: RepoMetadata = await fetch_repo_metadata(self.hass, r.url, r.default_branch)
r.meta_source = md.source
if md.name:
r.meta_name = md.name
r.name = md.name
r.meta_description = md.description
if md.category:
r.meta_category = md.category
r.meta_author = md.author
r.meta_maintainer = md.maintainer
# Keep a stable name fallback
if not r.name:
r.name = r.provider_repo_name or r.url
async def ensure_repo_details(self, repo_id: str) -> RepoItem | None:
"""Ensure provider/meta/latest fields are loaded for a repo.
Used by the UI when a repo detail view is opened.
"""
r = self.get_repo(repo_id)
if not r:
return None
# If we already have a latest_version (or provider_description), consider it enriched.
if r.latest_version or r.provider_description or r.meta_source:
return r
try:
await self._enrich_one_repo(r)
except Exception:
_LOGGER.debug("BCS ensure_repo_details failed for %s", repo_id, exc_info=True)
return r
async def list_repo_versions(self, repo_id: str) -> list[dict[str, Any]]:
repo = self.get_repo(repo_id)
if not repo:
return []
return await fetch_repo_versions(self.hass, repo.url)
def _add_cache_buster(self, url: str) -> str:
parts = urlsplit(url)
q = dict(parse_qsl(parts.query, keep_blank_values=True))
@@ -323,6 +684,7 @@ class BCSCore:
name=name,
url=repo_url,
source="index",
meta_category=str(r.get("category")) if r.get("category") else None,
)
)
@@ -409,6 +771,23 @@ class BCSCore:
default_branch=repo.default_branch,
)
async def list_repo_versions(self, repo_id: str, *, limit: int = 20) -> list[dict[str, str]]:
"""List installable versions/refs for a repo.
This is used by the UI to allow selecting an older tag/release.
"""
repo = self.get_repo(repo_id)
if not repo:
raise BCSInstallError(f"repo_id not found: {repo_id}")
return await fetch_repo_versions(
self.hass,
repo.url,
provider=repo.provider,
default_branch=repo.default_branch,
limit=limit,
)
def _pick_ref_for_install(self, repo: RepoItem) -> str:
if repo.latest_version and str(repo.latest_version).strip():
return str(repo.latest_version).strip()
@@ -482,6 +861,311 @@ class BCSCore:
return candidate
return None
async def _ensure_backup_root(self) -> None:
"""Create backup root directory if needed."""
def _mkdir() -> None:
self._backup_root.mkdir(parents=True, exist_ok=True)
await self.hass.async_add_executor_job(_mkdir)
def _build_backup_meta(self, repo_id: str, domain: str) -> dict[str, object]:
"""Build metadata for backup folders so restores can recover the stored version."""
inst = self.get_installed(repo_id) or {}
return {
"repo_id": repo_id,
"domain": domain,
"installed_version": inst.get("installed_version") or inst.get("ref"),
"installed_manifest_version": inst.get("installed_manifest_version"),
"ref": inst.get("ref") or inst.get("installed_version"),
"created_at": time.strftime("%Y-%m-%dT%H:%M:%S"),
}
async def _backup_domain(self, domain: str, *, meta: dict[str, object] | None = None) -> Path | None:
"""Backup an existing domain folder.
Returns the created backup path, or None if the domain folder does not exist.
"""
dest_root = Path(self.hass.config.path("custom_components"))
target = dest_root / domain
if not target.exists() or not target.is_dir():
return None
await self._ensure_backup_root()
ts = time.strftime("%Y%m%d_%H%M%S")
domain_root = self._backup_root / domain
backup_path = domain_root / ts
def _do_backup() -> None:
domain_root.mkdir(parents=True, exist_ok=True)
if backup_path.exists():
shutil.rmtree(backup_path, ignore_errors=True)
shutil.copytree(target, backup_path, dirs_exist_ok=True)
# Store backup metadata (kept inside backup folder; removed from target after restore).
if meta:
try:
meta_path = backup_path / BACKUP_META_FILENAME
meta_path.write_text(json.dumps(meta, ensure_ascii=False, indent=2), encoding='utf-8')
except Exception:
pass
# Retention: keep only the newest N backups per domain.
try:
backups = [p for p in domain_root.iterdir() if p.is_dir()]
backups.sort(key=lambda p: p.name, reverse=True)
for old in backups[self._backup_keep_per_domain :]:
shutil.rmtree(old, ignore_errors=True)
except Exception:
# Never fail install/update because of retention cleanup.
pass
await self.hass.async_add_executor_job(_do_backup)
_LOGGER.info("BCS backup created: domain=%s path=%s", domain, backup_path)
return backup_path
async def _restore_domain_from_backup(self, domain: str, backup_path: Path) -> None:
"""Restore a domain folder from a backup."""
dest_root = Path(self.hass.config.path("custom_components"))
target = dest_root / domain
def _restore() -> None:
if not backup_path.exists() or not backup_path.is_dir():
return
if target.exists():
shutil.rmtree(target, ignore_errors=True)
shutil.copytree(backup_path, target, dirs_exist_ok=True)
# Do not leave backup metadata inside the restored integration folder.
try:
meta_file = target / BACKUP_META_FILENAME
if meta_file.exists():
meta_file.unlink(missing_ok=True)
except Exception:
pass
await self.hass.async_add_executor_job(_restore)
_LOGGER.info("BCS rollback applied: domain=%s from=%s", domain, backup_path)
async def list_repo_backups(self, repo_id: str) -> list[dict[str, Any]]:
"""List available backup sets for an installed repository.
Returns a list of items sorted newest->oldest:
{"id": "YYYYMMDD_HHMMSS", "label": "YYYY-MM-DD HH:MM:SS", "complete": bool, "domains": [...] }
A backup set is considered *complete* if the timestamp exists for all
domains of the repository.
"""
inst = self.get_installed(repo_id) or {}
domains = inst.get("domains") or []
if not isinstance(domains, list) or not domains:
return []
dom_list = [str(d) for d in domains if str(d).strip()]
if not dom_list:
return []
# Collect timestamps per domain.
per_domain: dict[str, list[str]] = {}
for d in dom_list:
per_domain[d] = await self._list_domain_backup_ids(d)
# Build a map id -> domains where it exists
id_map: dict[str, set[str]] = {}
for d, ids in per_domain.items():
for bid in ids:
id_map.setdefault(bid, set()).add(d)
all_domains = set(dom_list)
items: list[dict[str, Any]] = []
for bid, present in id_map.items():
complete = present == all_domains
label = self._format_backup_id(bid)
meta = await self._read_backup_meta(dom_list[0], bid)
ver = None
if isinstance(meta, dict):
ver = meta.get("installed_version") or meta.get("ref")
if ver:
label = f"{label} ({ver})"
items.append({"id": bid, "label": label, "complete": complete, "domains": sorted(present), "installed_version": str(ver) if ver else None})
# Sort newest first by id (lexicographic works for timestamp format).
items.sort(key=lambda x: str(x.get("id") or ""), reverse=True)
# Keep newest 5 entries overall (UI expects up to 5).
return items[: self._backup_keep_per_domain]
async def restore_repo_backup(self, repo_id: str, backup_id: str) -> dict[str, Any]:
"""Restore a previously created backup set for a repository."""
repo_id = str(repo_id or "").strip()
backup_id = str(backup_id or "").strip()
if not repo_id:
raise BCSInstallError("Missing repo_id")
if not backup_id:
raise BCSInstallError("Missing backup_id")
inst = self.get_installed(repo_id)
if not inst:
raise BCSInstallError("Repository is not installed")
domains = inst.get("domains") or []
if not isinstance(domains, list) or not domains:
raise BCSInstallError("No installed domains found")
dom_list = [str(d) for d in domains if str(d).strip()]
if not dom_list:
raise BCSInstallError("No installed domains found")
# Ensure the backup exists for all domains.
missing: list[str] = []
for d in dom_list:
p = self._backup_root / d / backup_id
if not p.exists() or not p.is_dir():
missing.append(d)
if missing:
raise BCSInstallError(f"Selected backup is not available for all domains: missing={missing}")
async with self._install_lock:
_LOGGER.info("BCS restore started: repo_id=%s backup_id=%s domains=%s", repo_id, backup_id, dom_list)
# Safety: create a new backup of current state before restoring.
for d in dom_list:
try:
await self._backup_domain(d)
except Exception:
_LOGGER.debug("BCS pre-restore backup failed for domain=%s", d, exc_info=True)
# Apply restore.
for d in dom_list:
await self._restore_domain_from_backup(d, self._backup_root / d / backup_id)
# Update stored installed version to the restored one (so UI shows the restored state).
#
# Backups created before 0.6.1 may not have metadata. For those legacy backups we fall back to:
# 1) version from the backup's manifest.json (best-effort), else
# 2) a synthetic marker (restored:<backup_id>) so the UI reflects a restored state and updates
# remain available.
restored_meta = await self._read_backup_meta(dom_list[0], backup_id)
restored_version: str | None = None
restored_manifest_version: str | None = None
if isinstance(restored_meta, dict):
rv = restored_meta.get("installed_version") or restored_meta.get("ref")
if rv is not None and str(rv).strip():
restored_version = str(rv).strip()
mv = restored_meta.get("installed_manifest_version")
if mv is not None and str(mv).strip():
restored_manifest_version = str(mv).strip()
# Legacy backups (no meta): try to read manifest.json version from the backup folder.
if not restored_manifest_version:
restored_manifest_version = await self._read_backup_manifest_version(dom_list[0], backup_id)
# Use manifest version as a fallback display value if we don't have the exact installed ref.
if not restored_version and restored_manifest_version:
restored_version = restored_manifest_version
# Last resort: ensure the installed version changes so the UI does not keep showing the newest version.
if not restored_version:
restored_version = f"restored:{backup_id}"
repo = self.get_repo(repo_id)
repo_url = getattr(repo, "url", None) or ""
await self.storage.set_installed_repo(
repo_id=repo_id,
url=repo_url,
domains=dom_list,
installed_version=restored_version,
installed_manifest_version=restored_manifest_version,
ref=restored_version,
)
await self._refresh_installed_cache()
self._mark_restart_required()
self.signal_updated()
_LOGGER.info("BCS restore complete: repo_id=%s backup_id=%s", repo_id, backup_id)
return {"ok": True, "repo_id": repo_id, "backup_id": backup_id, "domains": dom_list, "restored_version": restored_version, "restart_required": True}
async def _read_backup_meta(self, domain: str, backup_id: str) -> dict[str, Any] | None:
"""Read backup metadata for a domain backup.
Metadata is stored inside the backup folder and will be removed from the
live folder after restore.
"""
try:
p = self._backup_root / domain / backup_id / BACKUP_META_FILENAME
if not p.exists():
return None
txt = await self.hass.async_add_executor_job(p.read_text, "utf-8")
data = json.loads(txt)
return data if isinstance(data, dict) else None
except Exception:
return None
async def _read_backup_manifest_version(self, domain: str, backup_id: str) -> str | None:
"""Best-effort: read manifest.json version from a legacy backup (no metadata)."""
def _read() -> str | None:
try:
p = self._backup_root / domain / backup_id / "manifest.json"
if not p.exists():
return None
data = json.loads(p.read_text(encoding="utf-8"))
v = data.get("version")
return str(v) if v else None
except Exception:
return None
return await self.hass.async_add_executor_job(_read)
async def _list_domain_backup_ids(self, domain: str) -> list[str]:
"""List backup ids for a domain (newest->oldest)."""
domain = str(domain or "").strip()
if not domain:
return []
root = self._backup_root / domain
def _list() -> list[str]:
if not root.exists() or not root.is_dir():
return []
ids = [p.name for p in root.iterdir() if p.is_dir()]
ids.sort(reverse=True)
return ids
ids = await self.hass.async_add_executor_job(_list)
return ids[: self._backup_keep_per_domain]
@staticmethod
def _format_backup_id(backup_id: str) -> str:
"""Format backup id YYYYMMDD_HHMMSS -> YYYY-MM-DD HH:MM:SS."""
s = str(backup_id or "").strip()
if len(s) != 15 or "_" not in s:
return s
try:
d, t = s.split("_", 1)
return f"{d[0:4]}-{d[4:6]}-{d[6:8]} {t[0:2]}:{t[2:4]}:{t[4:6]}"
except Exception:
return s
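# Illustration (not part of the original file): the slicing above maps the on-disk
# timestamp ids to display labels like so (standalone re-statement for clarity):

```python
def format_backup_id(backup_id: str) -> str:
    """Format backup id YYYYMMDD_HHMMSS -> YYYY-MM-DD HH:MM:SS; pass anything else through."""
    s = str(backup_id or "").strip()
    if len(s) != 15 or "_" not in s:
        return s
    d, t = s.split("_", 1)
    return f"{d[0:4]}-{d[4:6]}-{d[6:8]} {t[0:2]}:{t[2:4]}:{t[4:6]}"
```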
async def _copy_domain_dir(self, src_domain_dir: Path, domain: str) -> None:
dest_root = Path(self.hass.config.path("custom_components"))
target = dest_root / domain
@@ -602,80 +1286,136 @@ class BCSCore:
self.signal_updated()
return {"ok": True, "repo_id": repo_id, "removed": removed, "restart_required": bool(removed)}
async def install_repo(self, repo_id: str, *, version: str | None = None) -> dict[str, Any]:
repo = self.get_repo(repo_id)
if not repo:
raise BCSInstallError(f"repo_id not found: {repo_id}")
async with self._install_lock:
requested = (version or "").strip()
ref = requested if requested else self._pick_ref_for_install(repo)
zip_url = self._build_zip_url(repo.url, ref)
_LOGGER.info("BCS install started: repo_id=%s ref=%s zip_url=%s", repo_id, ref, zip_url)
installed_domains: list[str] = []
backups: dict[str, Path] = {}
inst_before = self.get_installed(repo_id) or {}
backup_meta = {
"repo_id": repo_id,
"installed_version": inst_before.get("installed_version") or inst_before.get("ref"),
"installed_manifest_version": inst_before.get("installed_manifest_version"),
"ref": inst_before.get("ref") or inst_before.get("installed_version"),
}
created_new: set[str] = set()
try:
with tempfile.TemporaryDirectory(prefix="bcs_install_") as td:
tmp = Path(td)
zip_path = tmp / "repo.zip"
extract_dir = tmp / "extract"
extract_dir.mkdir(parents=True, exist_ok=True)
await self._download_zip(zip_url, zip_path)
await self._extract_zip(zip_path, extract_dir)
cc_root = self._find_custom_components_root(extract_dir)
if not cc_root:
raise BCSInstallError("custom_components folder not found in repository ZIP")
dest_root = Path(self.hass.config.path("custom_components"))
for domain_dir in cc_root.iterdir():
if not domain_dir.is_dir():
continue
manifest = domain_dir / "manifest.json"
if not manifest.exists():
continue
domain = domain_dir.name
target = dest_root / domain
# Backup only if we are going to overwrite an existing domain.
if target.exists() and target.is_dir():
m = dict(backup_meta)
m["domain"] = domain
bkp = await self._backup_domain(domain, meta=m)
if bkp:
backups[domain] = bkp
else:
created_new.add(domain)
await self._copy_domain_dir(domain_dir, domain)
installed_domains.append(domain)
if not installed_domains:
raise BCSInstallError("No integrations found under custom_components/ (missing manifest.json)")
installed_manifest_version = await self._read_installed_manifest_version(installed_domains[0])
installed_version = ref
await self.storage.set_installed_repo(
repo_id=repo_id,
url=repo.url,
domains=installed_domains,
installed_version=installed_version,
installed_manifest_version=installed_manifest_version,
ref=ref,
)
await self._refresh_installed_cache()
self._mark_restart_required()
_LOGGER.info(
"BCS install complete: repo_id=%s domains=%s installed_ref=%s manifest_version=%s",
repo_id,
installed_domains,
installed_version,
installed_manifest_version,
)
self.signal_updated()
return {
"ok": True,
"repo_id": repo_id,
"domains": installed_domains,
"installed_version": installed_version,
"installed_manifest_version": installed_manifest_version,
"restart_required": True,
}
except Exception as e:
# Roll back any domains we touched.
_LOGGER.error("BCS install failed, attempting rollback: repo_id=%s error=%s", repo_id, e)
dest_root = Path(self.hass.config.path("custom_components"))
# Restore backed-up domains.
for domain, bkp in backups.items():
try:
await self._restore_domain_from_backup(domain, bkp)
except Exception:
_LOGGER.debug("BCS rollback failed for domain=%s", domain, exc_info=True)
# Remove newly created domains if the install did not complete.
for domain in created_new:
try:
target = dest_root / domain
def _rm() -> None:
if target.exists() and target.is_dir():
shutil.rmtree(target, ignore_errors=True)
await self.hass.async_add_executor_job(_rm)
except Exception:
_LOGGER.debug("BCS cleanup failed for new domain=%s", domain, exc_info=True)
# Re-raise as install error for clean API response.
if isinstance(e, BCSInstallError):
raise
raise BCSInstallError(str(e)) from e
async def update_repo(self, repo_id: str, *, version: str | None = None) -> dict[str, Any]:
_LOGGER.info("BCS update started: repo_id=%s", repo_id)
return await self.install_repo(repo_id, version=version)
async def request_restart(self) -> None:
await self.hass.services.async_call("homeassistant", "restart", {}, blocking=False)
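The install path above follows a backup-before-overwrite, rollback-on-failure pattern: existing domains are backed up before being replaced, newly created domains are tracked, and on any error the backups are restored and the new directories removed. A simplified synchronous sketch of that flow, without the Home Assistant executor plumbing (helper name and layout are illustrative, not the integration's API):

```python
import shutil
import tempfile
from pathlib import Path

def install_with_rollback(src: Path, dest_root: Path, domains: list[str]) -> None:
    """Copy each domain dir into dest_root; on failure, restore backups and remove new dirs."""
    backups: dict[str, Path] = {}
    created_new: set[str] = set()
    staging = Path(tempfile.mkdtemp(prefix="bcs_backup_"))
    try:
        for domain in domains:
            target = dest_root / domain
            if target.is_dir():
                # Back up only what we are about to overwrite.
                bkp = staging / domain
                shutil.copytree(target, bkp)
                backups[domain] = bkp
                shutil.rmtree(target)
            else:
                created_new.add(domain)
            shutil.copytree(src / domain, target)
    except Exception:
        # Roll back: restore overwritten domains, delete half-created new ones.
        for domain, bkp in backups.items():
            target = dest_root / domain
            if target.exists():
                shutil.rmtree(target, ignore_errors=True)
            shutil.copytree(bkp, target)
        for domain in created_new:
            shutil.rmtree(dest_root / domain, ignore_errors=True)
        raise
```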


@@ -1,7 +1,7 @@
{
"domain": "bahmcloud_store",
"name": "Bahmcloud Store",
"version": "0.6.6",
"documentation": "https://git.bahmcloud.de/bahmcloud/bahmcloud_store",
"platforms": ["update"],
"requirements": [],


@@ -18,6 +18,12 @@ class BahmcloudStorePanel extends HTMLElement {
this._filter = "all"; // all|installed|not_installed|updates|custom
this._sort = "az"; // az|updates_first|installed_first
// Source filter (all|bcs|hacs|custom)
this._sourceFilter = "all";
// HACS toggle (settings)
this._hacsEnabled = false;
this._detailRepoId = null;
this._detailRepo = null;
this._readmeLoading = false;
@@ -36,6 +42,24 @@ class BahmcloudStorePanel extends HTMLElement {
this._uninstallingRepoId = null;
this._restartRequired = false;
this._lastActionMsg = null;
// Phase F2.1: restore from backups
this._restoreOpen = false;
this._restoreRepoId = null;
this._restoreLoading = false;
this._restoreOptions = [];
this._restoreSelected = "";
this._restoring = false;
this._restoreError = null;
// Phase C1: selectable install version
this._versionsCache = {}; // repo_id -> [{ref,label,source}, ...]
this._versionsLoadingRepoId = null;
this._selectedVersionByRepoId = {}; // repo_id -> ref ("" means latest)
// History handling (mobile back button should go back to list, not exit panel)
this._historyBound = false;
this._handlingPopstate = false;
}
set hass(hass) {
@@ -43,10 +67,43 @@ class BahmcloudStorePanel extends HTMLElement {
if (!this._rendered) {
this._rendered = true;
this._render();
this._ensureHistory();
this._load();
}
}
_ensureHistory() {
if (this._historyBound) return;
this._historyBound = true;
try {
// Keep an internal history state for this panel.
const current = window.history.state || {};
if (!current || current.__bcs !== true) {
window.history.replaceState({ __bcs: true, view: "store" }, "");
}
} catch (e) {
// ignore
}
window.addEventListener("popstate", (ev) => {
const st = ev?.state;
if (!st || st.__bcs !== true) return;
this._handlingPopstate = true;
try {
const view = st.view || "store";
if (view === "detail" && st.repo_id) {
this._openRepoDetail(st.repo_id, false);
} else {
this._closeDetail(false);
}
} finally {
this._handlingPopstate = false;
}
});
}
async _load() {
if (!this._hass) return;
@@ -58,6 +115,12 @@ class BahmcloudStorePanel extends HTMLElement {
const data = await this._hass.callApi("get", "bcs");
this._data = data;
// Sync settings from backend (e.g. HACS toggle)
this._hacsEnabled = !!data?.settings?.hacs_enabled;
if (this._view === "detail" && this._detailRepoId && Array.isArray(data?.repos)) {
const fresh = data.repos.find((r) => this._safeId(r?.id) === this._detailRepoId);
if (fresh) this._detailRepo = fresh;
@@ -70,6 +133,19 @@ class BahmcloudStorePanel extends HTMLElement {
}
}
async _setSettings(updates) {
if (!this._hass) return;
try {
const resp = await this._hass.callApi("post", "bcs/settings", updates || {});
if (resp?.ok) {
this._hacsEnabled = !!resp?.settings?.hacs_enabled;
}
} catch (e) {
// Do not fail UI for settings.
this._error = e?.message ? String(e.message) : String(e);
}
}
async _refreshAll() {
if (!this._hass) return;
if (this._refreshing) return;
@@ -105,7 +181,13 @@ class BahmcloudStorePanel extends HTMLElement {
this._update();
try {
const sel = this._safeText(this._selectedVersionByRepoId?.[repoId] || "").trim();
const qv = sel ? `&version=${encodeURIComponent(sel)}` : "";
const resp = await this._hass.callApi(
"post",
`bcs/install?repo_id=${encodeURIComponent(repoId)}${qv}`,
{},
);
if (!resp?.ok) {
this._error = this._safeText(resp?.message) || "Install failed.";
} else {
@@ -131,7 +213,13 @@ class BahmcloudStorePanel extends HTMLElement {
this._update();
try {
const sel = this._safeText(this._selectedVersionByRepoId?.[repoId] || "").trim();
const qv = sel ? `&version=${encodeURIComponent(sel)}` : "";
const resp = await this._hass.callApi(
"post",
`bcs/update?repo_id=${encodeURIComponent(repoId)}${qv}`,
{},
);
if (!resp?.ok) {
this._error = this._safeText(resp?.message) || "Update failed.";
} else {
@@ -175,6 +263,90 @@ class BahmcloudStorePanel extends HTMLElement {
}
}
async _openRestore(repoId) {
if (!this._hass) return;
if (!repoId) return;
if (this._installingRepoId || this._updatingRepoId || this._uninstallingRepoId) return;
this._restoreRepoId = repoId;
this._restoreOpen = true;
this._restoreLoading = true;
this._restoreError = null;
this._restoreOptions = [];
this._restoreSelected = "";
this._update();
try {
const resp = await this._hass.callApi("get", `bcs/backups?repo_id=${encodeURIComponent(repoId)}`);
const list = Array.isArray(resp?.backups) ? resp.backups : [];
this._restoreOptions = list;
const firstComplete = list.find((x) => x && x.complete);
const firstAny = list[0];
const pick = firstComplete || firstAny;
if (pick && pick.id) this._restoreSelected = String(pick.id);
if (!list.length) {
this._restoreError = "No backups found for this repository.";
}
} catch (e) {
this._restoreError = e?.message ? String(e.message) : String(e);
} finally {
this._restoreLoading = false;
this._update();
}
}
_closeRestore() {
this._restoreOpen = false;
this._restoreRepoId = null;
this._restoreLoading = false;
this._restoreError = null;
this._restoreOptions = [];
this._restoreSelected = "";
this._restoring = false;
this._update();
}
async _restoreSelectedBackup() {
if (!this._hass) return;
if (!this._restoreRepoId) return;
const bid = String(this._restoreSelected || "").trim();
if (!bid) return;
if (this._restoring) return;
const chosen = (this._restoreOptions || []).find((x) => String(x?.id) === bid);
if (chosen && chosen.complete === false) {
this._restoreError = "Selected backup is not available for all domains of this repository.";
this._update();
return;
}
const ok = window.confirm("Restore selected backup? This will overwrite the installed files under /config/custom_components and requires a restart.");
if (!ok) return;
this._restoring = true;
this._restoreError = null;
this._update();
try {
const resp = await this._hass.callApi("post", `bcs/restore?repo_id=${encodeURIComponent(this._restoreRepoId)}&backup_id=${encodeURIComponent(bid)}`, {});
if (!resp?.ok) {
this._restoreError = this._safeText(resp?.message) || "Restore failed.";
} else {
this._restartRequired = !!resp.restart_required;
this._lastActionMsg = "Restore finished. Restart required.";
this._closeRestore();
}
} catch (e) {
this._restoreError = e?.message ? String(e.message) : String(e);
} finally {
this._restoring = false;
await this._load();
}
}
async _restartHA() {
if (!this._hass) return;
try {
@@ -198,21 +370,15 @@ class BahmcloudStorePanel extends HTMLElement {
}
_goBack() {
try {
// Prefer browser history so mobile back behaves as expected.
history.back();
} catch (_) {
if (this._view === "detail") {
this._closeDetail(true);
} else {
window.location.href = "/";
}
}
}
@@ -258,11 +424,15 @@ class BahmcloudStorePanel extends HTMLElement {
}
}
_openRepoDetail(repoId, pushHistory = true) {
const repos = Array.isArray(this._data?.repos) ? this._data.repos : [];
const repo = repos.find((r) => this._safeId(r?.id) === repoId);
if (!repo) return;
if (pushHistory) {
this._pushHistory({ view: "detail", repo_id: repoId });
}
this._view = "detail";
this._detailRepoId = repoId;
this._detailRepo = repo;
@@ -273,8 +443,60 @@ class BahmcloudStorePanel extends HTMLElement {
this._readmeExpanded = false;
this._readmeCanToggle = false;
// Versions dropdown
if (!(repoId in this._selectedVersionByRepoId)) {
this._selectedVersionByRepoId[repoId] = ""; // default = latest
}
this._update();
this._loadRepoDetails(repoId);
this._loadReadme(repoId);
this._loadVersions(repoId);
}
async _loadRepoDetails(repoId) {
if (!this._hass || !repoId) return;
try {
const resp = await this._hass.callApi("get", `bcs/repo?repo_id=${encodeURIComponent(repoId)}`);
if (resp?.ok && resp.repo) {
this._detailRepo = resp.repo;
// Also update the cached list item if present
const repos = Array.isArray(this._data?.repos) ? this._data.repos : [];
const idx = repos.findIndex((r) => this._safeId(r?.id) === repoId);
if (idx >= 0) repos[idx] = resp.repo;
this._update();
}
} catch (e) {
// ignore: details are optional
}
}
async _loadVersions(repoId) {
if (!this._hass) return;
if (!repoId) return;
// Cache: avoid re-fetching repeatedly in the same session.
if (Array.isArray(this._versionsCache?.[repoId]) && this._versionsCache[repoId].length) {
return;
}
this._versionsLoadingRepoId = repoId;
this._update();
try {
const resp = await this._hass.callApi("get", `bcs/versions?repo_id=${encodeURIComponent(repoId)}`);
if (resp?.ok && Array.isArray(resp.versions)) {
this._versionsCache[repoId] = resp.versions;
} else {
this._versionsCache[repoId] = [];
}
} catch (e) {
this._versionsCache[repoId] = [];
} finally {
this._versionsLoadingRepoId = null;
this._update();
}
}
async _loadReadme(repoId) {
@@ -399,6 +621,24 @@ class BahmcloudStorePanel extends HTMLElement {
box-shadow: 0 0 0 2px rgba(30,136,229,.15);
}
.toggle{
display:inline-flex;
align-items:center;
gap:8px;
padding:10px 12px;
border-radius:14px;
border:1px solid var(--divider-color);
background: var(--card-background-color);
color: var(--primary-text-color);
user-select:none;
cursor:pointer;
}
.toggle input{
margin:0;
width:18px;
height:18px;
}
button{
padding:10px 12px;
border-radius:14px;
@@ -413,6 +653,22 @@ class BahmcloudStorePanel extends HTMLElement {
}
button:disabled{ opacity: .55; cursor: not-allowed; }
.modalOverlay{
position:fixed; inset:0; z-index:999;
background: rgba(0,0,0,0.45);
display:flex; align-items:center; justify-content:center;
padding:16px;
}
.modal{
width: min(520px, 100%);
background: var(--card-background-color);
border:1px solid var(--divider-color);
border-radius:18px;
padding:16px;
box-shadow: 0 10px 30px rgba(0,0,0,0.25);
}
.modalTitle{ font-size:16px; font-weight:700; }
.err{
margin:12px 0;
padding:12px 14px;
@@ -617,7 +873,8 @@ class BahmcloudStorePanel extends HTMLElement {
else if (this._view === "about") html = this._renderAbout();
else if (this._view === "detail") html = this._renderDetail();
const modal = this._renderRestoreModal();
content.innerHTML = `${err}${html}${modal}`;
if (this._view === "store") this._wireStore();
if (this._view === "manage") this._wireManage();
@@ -625,6 +882,8 @@ class BahmcloudStorePanel extends HTMLElement {
this._wireDetail(); // now always wires buttons
}
this._wireRestoreModal();
// Restore focus and cursor for the search field after re-render.
if (restore.id && this._view === "store") {
const el = root.getElementById(restore.id);
@@ -678,6 +937,11 @@ class BahmcloudStorePanel extends HTMLElement {
const cat = this._safeText(r?.category) || "";
if (this._category !== "all" && this._category !== cat) return false;
// Source filter
if (this._sourceFilter === "bcs" && r?.source !== "index") return false;
if (this._sourceFilter === "hacs" && r?.source !== "hacs") return false;
if (this._sourceFilter === "custom" && r?.source !== "custom") return false;
const latest = this._safeText(r?.latest_version);
const installed = this._asBoolStrict(r?.installed);
const installedVersion = this._safeText(r?.installed_version);
@@ -732,7 +996,11 @@ class BahmcloudStorePanel extends HTMLElement {
const updateAvailable = installed && !!latest && (!installedVersion || latest !== installedVersion);
const badges = [];
// Source badges
if (r?.source === "index") badges.push("BCS Official");
else if (r?.source === "hacs") badges.push("HACS");
else if (r?.source === "custom") badges.push("Custom");
if (installed) badges.push("Installed");
if (updateAvailable) badges.push("Update");
@@ -758,6 +1026,18 @@ class BahmcloudStorePanel extends HTMLElement {
return `
<div class="filters">
<input id="q" placeholder="Search…" value="${this._esc(this._search)}" />
<label class="toggle" title="Show official HACS repositories">
<input id="hacs_toggle" type="checkbox" ${this._hacsEnabled ? "checked" : ""} />
<span>HACS</span>
</label>
<select id="src">
<option value="all" ${this._sourceFilter === "all" ? "selected" : ""}>All sources</option>
<option value="bcs" ${this._sourceFilter === "bcs" ? "selected" : ""}>BCS Official</option>
<option value="hacs" ${this._sourceFilter === "hacs" ? "selected" : ""}>HACS</option>
<option value="custom" ${this._sourceFilter === "custom" ? "selected" : ""}>Custom</option>
</select>
<select id="cat">
<option value="all">All categories</option>
${categories.map((c) => `<option value="${this._esc(c)}" ${this._category === c ? "selected" : ""}>${this._esc(c)}</option>`).join("")}
@@ -791,6 +1071,8 @@ class BahmcloudStorePanel extends HTMLElement {
const cat = root.getElementById("cat");
const filter = root.getElementById("filter");
const sort = root.getElementById("sort");
const src = root.getElementById("src");
const hacsToggle = root.getElementById("hacs_toggle");
if (q) {
q.addEventListener("input", (e) => {
@@ -817,12 +1099,51 @@ class BahmcloudStorePanel extends HTMLElement {
});
}
if (src) {
src.addEventListener("change", (e) => {
this._sourceFilter = e?.target?.value || "all";
this._update();
});
}
if (hacsToggle) {
hacsToggle.addEventListener("change", async (e) => {
const enabled = !!e?.target?.checked;
this._hacsEnabled = enabled;
this._update();
await this._setSettings({ hacs_enabled: enabled });
await this._load();
});
}
root.querySelectorAll("[data-open]").forEach((el) => {
const id = el.getAttribute("data-open");
el.addEventListener("click", () => this._openRepoDetail(id, true));
});
}
_pushHistory(state) {
if (this._handlingPopstate) return;
try {
window.history.pushState({ __bcs: true, ...(state || {}) }, "");
} catch (e) {
// ignore
}
}
_closeDetail(pushHistory = true) {
this._view = "store";
this._detailRepoId = null;
this._detailRepo = null;
this._readmeText = null;
this._readmeHtml = null;
this._readmeError = null;
this._readmeExpanded = false;
this._readmeCanToggle = false;
if (pushHistory) this._pushHistory({ view: "store" });
this._update();
}
_renderAbout() {
return `
<div class="card">
@@ -841,6 +1162,8 @@ class BahmcloudStorePanel extends HTMLElement {
const r = this._detailRepo;
if (!r) return `<div class="card">No repository selected.</div>`;
const repoId = this._safeId(r?.id) || this._detailRepoId || "";
const name = this._safeText(r?.name) || "Unnamed repository";
const url = this._safeText(r?.url) || "";
const desc = this._safeText(r?.description) || "";
@@ -889,8 +1212,6 @@ class BahmcloudStorePanel extends HTMLElement {
</div>
`;
const installed = this._asBoolStrict(r?.installed);
const installedVersion = this._safeText(r?.installed_version);
const installedDomains = Array.isArray(r?.installed_domains) ? r.installed_domains : [];
@@ -903,9 +1224,36 @@ class BahmcloudStorePanel extends HTMLElement {
const updateAvailable = installed && !!latestVersion && (!installedVersion || latestVersion !== installedVersion);
const versions = Array.isArray(this._versionsCache?.[repoId]) ? this._versionsCache[repoId] : [];
const versionsLoading = this._versionsLoadingRepoId === repoId;
const selectedRef = this._safeText(this._selectedVersionByRepoId?.[repoId] || "").trim();
let versionOptions = `<option value="">Latest (recommended)</option>`;
if (selectedRef && !versions.some((v) => this._safeText(v?.ref) === selectedRef)) {
versionOptions += `<option value="${this._esc(selectedRef)}" selected>Selected: ${this._esc(selectedRef)}</option>`;
}
for (const v of versions) {
const ref = this._safeText(v?.ref);
if (!ref) continue;
const label = this._safeText(v?.label) || ref;
const sel = selectedRef === ref ? "selected" : "";
versionOptions += `<option value="${this._esc(ref)}" ${sel}>${this._esc(label)}</option>`;
}
const versionSelect = `
<div style="margin-top:12px;">
<div class="muted small" style="margin-bottom:6px;"><strong>Install version:</strong></div>
<select id="selVersion" ${busy ? "disabled" : ""} style="width:100%;">
${versionOptions}
</select>
${versionsLoading ? `<div class="muted small" style="margin-top:6px;">Loading versions…</div>` : ``}
</div>
`;
const installBtn = `<button class="primary" id="btnInstall" ${installed || busy ? "disabled" : ""}>${busyInstall ? "Installing…" : installed ? "Installed" : "Install"}</button>`;
const updateBtn = `<button class="primary" id="btnUpdate" ${!updateAvailable || busy ? "disabled" : ""}>${busyUpdate ? "Updating…" : updateAvailable ? "Update" : "Up to date"}</button>`;
const uninstallBtn = `<button class="primary" id="btnUninstall" ${!installed || busy ? "disabled" : ""}>${busyUninstall ? "Uninstalling…" : "Uninstall"}</button>`;
const restoreBtn = `<button class="primary" id="btnRestore" ${!installed || busy ? "disabled" : ""}>Restore</button>`;
const restartHint = this._restartRequired
? `
@@ -954,10 +1302,13 @@ class BahmcloudStorePanel extends HTMLElement {
<div style="margin-top:6px;"><strong>Domains:</strong> ${installedDomains.length ? this._esc(installedDomains.join(", ")) : "-"}</div>
</div>
${versionSelect}
<div class="row" style="margin-top:14px; gap:10px; flex-wrap:wrap;">
${installBtn}
${updateBtn}
${uninstallBtn}
${restoreBtn}
</div>
${restartHint}
@@ -974,8 +1325,10 @@ class BahmcloudStorePanel extends HTMLElement {
const btnInstall = root.getElementById("btnInstall");
const btnUpdate = root.getElementById("btnUpdate");
const btnUninstall = root.getElementById("btnUninstall");
const btnRestore = root.getElementById("btnRestore");
const btnRestart = root.getElementById("btnRestart");
const btnReadmeToggle = root.getElementById("btnReadmeToggle");
const selVersion = root.getElementById("selVersion");
if (btnInstall) {
btnInstall.addEventListener("click", () => {
@@ -984,6 +1337,14 @@ class BahmcloudStorePanel extends HTMLElement {
});
}
if (selVersion) {
selVersion.addEventListener("change", () => {
if (!this._detailRepoId) return;
const v = selVersion.value != null ? String(selVersion.value) : "";
this._selectedVersionByRepoId[this._detailRepoId] = v;
});
}
if (btnUpdate) {
btnUpdate.addEventListener("click", () => {
if (btnUpdate.disabled) return;
@@ -998,6 +1359,15 @@ class BahmcloudStorePanel extends HTMLElement {
});
}
if (btnRestore) {
btnRestore.addEventListener("click", () => {
if (btnRestore.disabled) return;
if (this._detailRepoId) this._openRestore(this._detailRepoId);
});
}
if (btnRestart) {
btnRestart.addEventListener("click", () => this._restartHA());
}
@@ -1024,6 +1394,45 @@ class BahmcloudStorePanel extends HTMLElement {
}
}
_wireRestoreModal() {
const root = this.shadowRoot;
if (!root) return;
const overlay = root.getElementById("restoreOverlay");
if (!overlay) return;
// Click outside modal closes it
overlay.addEventListener("click", (ev) => {
if (ev.target === overlay) this._closeRestore();
});
const sel = root.getElementById("restoreSelect");
if (sel) {
try {
if (this._restoreSelected) sel.value = String(this._restoreSelected);
} catch (_) {}
sel.addEventListener("change", () => {
this._restoreSelected = String(sel.value || "");
});
}
const btnCancel = root.getElementById("btnRestoreCancel");
if (btnCancel) {
btnCancel.addEventListener("click", () => this._closeRestore());
}
const btnApply = root.getElementById("btnRestoreApply");
if (btnApply) {
btnApply.addEventListener("click", () => {
if (btnApply.disabled) return;
const v = root.getElementById("restoreSelect")?.value;
if (v) this._restoreSelected = String(v);
this._restoreSelectedBackup();
});
}
}
_postprocessRenderedMarkdown(container) {
if (!container) return;
try {
@@ -1036,6 +1445,52 @@ class BahmcloudStorePanel extends HTMLElement {
}
_renderRestoreModal() {
if (!this._restoreOpen) return "";
const opts = Array.isArray(this._restoreOptions) ? this._restoreOptions : [];
const disabled = this._restoreLoading || this._restoring || !opts.length;
const optionsHtml = opts
.map((o) => {
const id = this._safeText(o?.id) || "";
const label = this._safeText(o?.label) || id;
return `<option value="${this._esc(id)}">${this._esc(label)}</option>`;
})
.join("");
const msg = this._restoreLoading
? "Loading backups…"
: this._restoreError
? this._safeText(this._restoreError)
: opts.length
? "Select a backup to restore. This will overwrite files under /config/custom_components and requires a restart."
: "No backups found.";
return `
<div class="modalOverlay" id="restoreOverlay">
<div class="modal">
<div class="modalTitle">Restore from backup</div>
<div class="muted" style="margin-top:8px;">${this._esc(msg)}</div>
<div style="margin-top:14px;">
<label class="muted small" for="restoreSelect">Backup</label><br/>
<select id="restoreSelect" ${disabled ? "disabled" : ""} style="width:100%; margin-top:6px;">
${optionsHtml}
</select>
</div>
<div class="row" style="margin-top:16px; justify-content:flex-end; gap:10px;">
<button id="btnRestoreCancel">Cancel</button>
<button class="primary" id="btnRestoreApply" ${disabled ? "disabled" : ""}>${this._restoring ? "Restoring…" : "Restore"}</button>
</div>
</div>
</div>
`;
}
_renderManage() {
const repos = Array.isArray(this._data.repos) ? this._data.repos : [];
const custom = repos.filter((r) => r?.source === "custom");


@@ -504,4 +504,160 @@ async def fetch_readme_markdown(
except Exception:
continue
return None
async def fetch_repo_versions(
hass: HomeAssistant,
repo_url: str,
*,
provider: str | None = None,
default_branch: str | None = None,
limit: int = 20,
) -> list[dict[str, str]]:
"""List available versions/refs for a repository.
Returns a list of dicts with keys:
- ref: the ref to install (tag/release/branch)
- label: human-friendly label
- source: release|tag|branch
Notes:
- Uses public endpoints (no tokens) for public repositories.
- We prefer releases first (if available), then tags.
"""
repo_url = (repo_url or "").strip()
if not repo_url:
return []
prov = (provider or "").strip().lower() if provider else ""
if not prov:
prov = detect_provider(repo_url)
owner, repo = _split_owner_repo(repo_url)
if not owner or not repo:
return []
session = async_get_clientsession(hass)
headers = {"User-Agent": UA}
out: list[dict[str, str]] = []
seen: set[str] = set()
def _add(ref: str | None, label: str, source: str) -> None:
r = (ref or "").strip()
if not r or r in seen:
return
seen.add(r)
out.append({"ref": r, "label": label, "source": source})
# Always offer default branch as an explicit option.
if default_branch and str(default_branch).strip():
b = str(default_branch).strip()
_add(b, f"Branch: {b}", "branch")
try:
if prov == "github":
# Releases
gh_headers = {"Accept": "application/vnd.github+json", "User-Agent": UA}
data, _ = await _safe_json(
session,
f"https://api.github.com/repos/{owner}/{repo}/releases?per_page={int(limit)}",
headers=gh_headers,
)
if isinstance(data, list):
for r in data:
if not isinstance(r, dict):
continue
tag = r.get("tag_name")
name = r.get("name")
if tag:
lbl = str(tag)
if isinstance(name, str) and name.strip() and name.strip() != str(tag):
lbl = f"{tag} - {name.strip()}"
_add(str(tag), lbl, "release")
# Tags
data, _ = await _safe_json(
session,
f"https://api.github.com/repos/{owner}/{repo}/tags?per_page={int(limit)}",
headers=gh_headers,
)
if isinstance(data, list):
for t in data:
if isinstance(t, dict) and t.get("name"):
_add(str(t["name"]), str(t["name"]), "tag")
return out
if prov == "gitlab":
u = urlparse(repo_url.rstrip("/"))
base = f"{u.scheme}://{u.netloc}"
project = quote_plus(f"{owner}/{repo}")
data, _ = await _safe_json(
session,
f"{base}/api/v4/projects/{project}/releases?per_page={int(limit)}",
headers=headers,
)
if isinstance(data, list):
for r in data:
if not isinstance(r, dict):
continue
tag = r.get("tag_name")
name = r.get("name")
if tag:
lbl = str(tag)
if isinstance(name, str) and name.strip() and name.strip() != str(tag):
lbl = f"{tag} - {name.strip()}"
_add(str(tag), lbl, "release")
data, _ = await _safe_json(
session,
f"{base}/api/v4/projects/{project}/repository/tags?per_page={int(limit)}",
headers=headers,
)
if isinstance(data, list):
for t in data:
if isinstance(t, dict) and t.get("name"):
_add(str(t["name"]), str(t["name"]), "tag")
return out
# gitea (incl. Bahmcloud)
u = urlparse(repo_url.rstrip("/"))
base = f"{u.scheme}://{u.netloc}"
data, _ = await _safe_json(
session,
f"{base}/api/v1/repos/{owner}/{repo}/releases?limit={int(limit)}",
headers=headers,
)
if isinstance(data, list):
for r in data:
if not isinstance(r, dict):
continue
tag = r.get("tag_name")
name = r.get("name")
if tag:
lbl = str(tag)
if isinstance(name, str) and name.strip() and name.strip() != str(tag):
lbl = f"{tag} - {name.strip()}"
_add(str(tag), lbl, "release")
data, _ = await _safe_json(
session,
f"{base}/api/v1/repos/{owner}/{repo}/tags?limit={int(limit)}",
headers=headers,
)
if isinstance(data, list):
for t in data:
if isinstance(t, dict) and t.get("name"):
_add(str(t["name"]), str(t["name"]), "tag")
return out
except Exception:
_LOGGER.debug("fetch_repo_versions failed for %s", repo_url, exc_info=True)
return out
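The `_add` closure above gives `fetch_repo_versions` its ordering guarantee: default branch first, then releases, then tags, with each ref emitted once. A minimal standalone sketch of that accumulation logic (hypothetical `collect_versions` helper, plain tuples instead of provider API responses; the `" - "` label separator is an illustration, not necessarily the exact separator used):

```python
def collect_versions(releases, tags, default_branch=None):
    """Mimic fetch_repo_versions' ordering: branch, then releases, then tags, deduped by ref."""
    out, seen = [], set()

    def _add(ref, label, source):
        r = (ref or "").strip()
        if not r or r in seen:
            return
        seen.add(r)
        out.append({"ref": r, "label": label, "source": source})

    # Always offer the default branch as an explicit option.
    if default_branch and str(default_branch).strip():
        b = str(default_branch).strip()
        _add(b, f"Branch: {b}", "branch")
    # Releases take precedence over plain tags for the same ref.
    for tag, name in releases:
        if tag:
            lbl = str(tag)
            if isinstance(name, str) and name.strip() and name.strip() != str(tag):
                lbl = f"{tag} - {name.strip()}"
            _add(str(tag), lbl, "release")
    for t in tags:
        if t:
            _add(str(t), str(t), "tag")
    return out
```

A tag that already appeared as a release is skipped, so the UI never shows the same ref twice.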


@@ -36,6 +36,8 @@ class BCSStorage:
Keys:
- custom_repos: list of manually added repositories
- installed_repos: mapping repo_id -> installed metadata
- settings: persistent user settings (e.g. toggles in the UI)
- hacs_cache: cached HACS metadata to improve UX (display names/descriptions)
"""
def __init__(self, hass: HomeAssistant) -> None:
@@ -53,8 +55,54 @@ class BCSStorage:
if "installed_repos" not in data or not isinstance(data.get("installed_repos"), dict):
data["installed_repos"] = {}
if "settings" not in data or not isinstance(data.get("settings"), dict):
data["settings"] = {}
if "hacs_cache" not in data or not isinstance(data.get("hacs_cache"), dict):
data["hacs_cache"] = {}
return data
async def get_hacs_cache(self) -> dict[str, Any]:
"""Return cached HACS metadata.
Shape:
{
"fetched_at": <unix_ts>,
"repos": {"owner/repo": {"name": "...", "description": "...", "domain": "..."}}
}
"""
data = await self._load()
cache = data.get("hacs_cache", {})
return cache if isinstance(cache, dict) else {}
async def set_hacs_cache(self, cache: dict[str, Any]) -> None:
"""Persist cached HACS metadata."""
data = await self._load()
data["hacs_cache"] = cache if isinstance(cache, dict) else {}
await self._save(data)
async def get_settings(self) -> dict[str, Any]:
"""Return persistent settings.
Currently used for UI/behavior toggles.
"""
data = await self._load()
settings = data.get("settings", {})
return settings if isinstance(settings, dict) else {}
async def set_settings(self, updates: dict[str, Any]) -> dict[str, Any]:
"""Update persistent settings and return the merged settings."""
data = await self._load()
settings = data.get("settings", {})
if not isinstance(settings, dict):
settings = {}
for k, v in (updates or {}).items():
settings[str(k)] = v
data["settings"] = settings
await self._save(data)
return settings
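`set_settings` merges updates key by key, coercing keys to strings so the JSON-backed store stays consistent. A minimal synchronous sketch of just that merge step (hypothetical `merge_settings` helper, extracted for illustration):

```python
def merge_settings(current, updates):
    """Merge updates into the current settings dict, coercing keys to str."""
    # Tolerate a corrupted or missing settings value, as set_settings does.
    settings = dict(current) if isinstance(current, dict) else {}
    for k, v in (updates or {}).items():
        settings[str(k)] = v
    return settings
```

Unknown keys pass through untouched, so new toggles can be persisted without a storage migration.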
async def _save(self, data: dict[str, Any]) -> None:
await self._store.async_save(data)


@@ -215,7 +215,12 @@ class BCSApiView(HomeAssistantView):
async def get(self, request: web.Request) -> web.Response:
return web.json_response(
{
"ok": True,
"version": self.core.version,
"settings": self.core.get_settings_public(),
"repos": self.core.list_repos_public(),
}
)
async def post(self, request: web.Request) -> web.Response:
@@ -248,6 +253,37 @@ class BCSApiView(HomeAssistantView):
return web.json_response({"ok": False, "message": "Unknown operation"}, status=400)
class BCSSettingsView(HomeAssistantView):
"""Persistent UI settings (e.g. toggles)."""
url = "/api/bcs/settings"
name = "api:bcs_settings"
requires_auth = True
def __init__(self, core: Any) -> None:
self.core: BCSCore = core
async def get(self, request: web.Request) -> web.Response:
return web.json_response({"ok": True, "settings": self.core.get_settings_public()})
async def post(self, request: web.Request) -> web.Response:
try:
data = await request.json()
except Exception:
data = {}
updates: dict[str, Any] = {}
if "hacs_enabled" in data:
updates["hacs_enabled"] = bool(data.get("hacs_enabled"))
try:
settings = await self.core.set_settings(updates)
return web.json_response({"ok": True, "settings": settings})
except Exception as e:
_LOGGER.exception("BCS set settings failed: %s", e)
return web.json_response({"ok": False, "message": str(e) or "Failed"}, status=500)
class BCSCustomRepoView(HomeAssistantView): class BCSCustomRepoView(HomeAssistantView):
url = "/api/bcs/custom_repo" url = "/api/bcs/custom_repo"
name = "api:bcs_custom_repo" name = "api:bcs_custom_repo"
@@ -292,6 +328,27 @@ class BCSReadmeView(HomeAssistantView):
return web.json_response({"ok": True, "readme": md_str, "html": html}) return web.json_response({"ok": True, "readme": md_str, "html": html})
class BCSVersionsView(HomeAssistantView):
url = "/api/bcs/versions"
name = "api:bcs_versions"
requires_auth = True
def __init__(self, core: Any) -> None:
self.core: BCSCore = core
async def get(self, request: web.Request) -> web.Response:
repo_id = request.query.get("repo_id")
if not repo_id:
return web.json_response({"ok": False, "message": "Missing repo_id"}, status=400)
try:
versions = await self.core.list_repo_versions(repo_id)
return web.json_response({"ok": True, "repo_id": repo_id, "versions": versions}, status=200)
except Exception as e:
_LOGGER.exception("BCS list versions failed: %s", e)
return web.json_response({"ok": False, "message": str(e) or "List versions failed"}, status=500)
class BCSInstallView(HomeAssistantView):
url = "/api/bcs/install"
name = "api:bcs_install"
@@ -302,11 +359,13 @@ class BCSInstallView(HomeAssistantView):
async def post(self, request: web.Request) -> web.Response:
repo_id = request.query.get("repo_id")
version = request.query.get("version")
if not repo_id:
return web.json_response({"ok": False, "message": "Missing repo_id"}, status=400)
try:
v = str(version).strip() if version is not None else None
result = await self.core.install_repo(repo_id, version=v)
return web.json_response(result, status=200)
except Exception as e:
_LOGGER.exception("BCS install failed: %s", e)
@@ -323,11 +382,13 @@ class BCSUpdateView(HomeAssistantView):
async def post(self, request: web.Request) -> web.Response:
repo_id = request.query.get("repo_id")
version = request.query.get("version")
if not repo_id:
return web.json_response({"ok": False, "message": "Missing repo_id"}, status=400)
try:
v = str(version).strip() if version is not None else None
result = await self.core.update_repo(repo_id, version=v)
return web.json_response(result, status=200)
except Exception as e:
_LOGGER.exception("BCS update failed: %s", e)
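The install and update views parse their query parameters the same way: `repo_id` is required (a 400 JSON error otherwise), and `version` is optional and trimmed before being passed to the core. A minimal standalone sketch of that shared handling (hypothetical `parse_repo_query` helper, plain dicts instead of `web.Request`):

```python
def parse_repo_query(query):
    """Mirror the install/update views: repo_id required, version optional and trimmed.

    Returns (repo_id, version, error); error is a (body, status) pair or None.
    """
    repo_id = query.get("repo_id")
    if not repo_id:
        # The views return this as a 400 JSON response.
        return None, None, ({"ok": False, "message": "Missing repo_id"}, 400)
    version = query.get("version")
    v = str(version).strip() if version is not None else None
    return repo_id, v, None
```

Passing `version=None` through unchanged lets the core fall back to its "latest" behavior when no explicit ref was selected.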
@@ -355,6 +416,53 @@ class BCSUninstallView(HomeAssistantView):
return web.json_response({"ok": False, "message": str(e) or "Uninstall failed"}, status=500)
class BCSBackupsView(HomeAssistantView):
url = "/api/bcs/backups"
name = "api:bcs_backups"
requires_auth = True
def __init__(self, core: Any) -> None:
self.core: BCSCore = core
async def get(self, request: web.Request) -> web.Response:
repo_id = request.query.get("repo_id")
if not repo_id:
return web.json_response({"ok": False, "message": "Missing repo_id"}, status=400)
try:
backups = await self.core.list_repo_backups(repo_id)
return web.json_response({"ok": True, "repo_id": repo_id, "backups": backups}, status=200)
except Exception as e:
_LOGGER.exception("BCS list backups failed: %s", e)
return web.json_response({"ok": False, "message": str(e) or "List backups failed"}, status=500)
class BCSRestoreView(HomeAssistantView):
url = "/api/bcs/restore"
name = "api:bcs_restore"
requires_auth = True
def __init__(self, core: Any) -> None:
self.core: BCSCore = core
async def post(self, request: web.Request) -> web.Response:
repo_id = request.query.get("repo_id")
backup_id = request.query.get("backup_id")
if not repo_id:
return web.json_response({"ok": False, "message": "Missing repo_id"}, status=400)
if not backup_id:
return web.json_response({"ok": False, "message": "Missing backup_id"}, status=400)
try:
result = await self.core.restore_repo_backup(repo_id, backup_id)
return web.json_response(result, status=200)
except Exception as e:
_LOGGER.exception("BCS restore failed: %s", e)
return web.json_response({"ok": False, "message": str(e) or "Restore failed"}, status=500)
class BCSRestartView(HomeAssistantView):
url = "/api/bcs/restart"
name = "api:bcs_restart"
@@ -369,4 +477,56 @@ class BCSRestartView(HomeAssistantView):
return web.json_response({"ok": True})
except Exception as e:
_LOGGER.exception("BCS restart failed: %s", e)
return web.json_response({"ok": False, "message": str(e) or "Restart failed"}, status=500)
class BCSRepoDetailView(HomeAssistantView):
url = "/api/bcs/repo"
name = "api:bcs_repo"
requires_auth = True
def __init__(self, core: Any) -> None:
self.core: BCSCore = core
async def get(self, request: web.Request) -> web.Response:
repo_id = (request.query.get("repo_id") or "").strip()
if not repo_id:
return web.json_response({"ok": False, "message": "Missing repo_id"}, status=400)
try:
repo = await self.core.ensure_repo_details(repo_id)
if not repo:
return web.json_response({"ok": False, "message": "Repo not found"}, status=404)
inst = self.core.get_installed(repo_id) or {}
installed = bool(inst)
domains = inst.get("domains") or []
if not isinstance(domains, list):
domains = []
return web.json_response({
"ok": True,
"repo": {
"id": repo.id,
"name": repo.name,
"url": repo.url,
"source": repo.source,
"owner": repo.owner,
"provider": repo.provider,
"repo_name": repo.provider_repo_name,
"description": repo.provider_description or repo.meta_description,
"default_branch": repo.default_branch,
"latest_version": repo.latest_version,
"latest_version_source": repo.latest_version_source,
"category": repo.meta_category,
"meta_author": repo.meta_author,
"meta_maintainer": repo.meta_maintainer,
"meta_source": repo.meta_source,
"installed": installed,
"installed_version": inst.get("installed_version"),
"installed_manifest_version": inst.get("installed_manifest_version"),
"installed_domains": domains,
}
}, status=200)
except Exception as e:
_LOGGER.exception("BCS repo details failed: %s", e)
return web.json_response({"ok": False, "message": str(e) or "Repo details failed"}, status=500)