Compare commits
57 Commits
| SHA1 |
|---|
| c18e93406a |
| 9af18ba090 |
| fff50a1580 |
| f8e9967c3a |
| 7bc493eb45 |
| b97b970a45 |
| 593e0c367d |
| 8e0817a64b |
| dfc7e44565 |
| c9c4f99fbf |
| 37cc11c9ee |
| 9c773c07e8 |
| c04612e159 |
| 5796012189 |
| 01576153d8 |
| 30484a08c1 |
| faf122aa1c |
| 1e86df49e9 |
| df631eec9e |
| 07240d1268 |
| 50587ffbbd |
| d6347e7e59 |
| 870e77ec13 |
| 38fb9fb073 |
| c20bd4dd07 |
| 296c816633 |
| 18a2b5529c |
| 246fab7e1e |
| ce5802721f |
| 2f46966fe2 |
| 132f9e27c1 |
| 618511be73 |
| 6488b434d8 |
| bffc594da5 |
| d78217100c |
| 09e1ef1af5 |
| 9ad558c9ab |
| 19df0eea22 |
| 745979b9a6 |
| f861b2490a |
| 32946c1a98 |
| a9a681d801 |
| 2ae6ac43a5 |
| 504c126c2c |
| 85cc97b557 |
| 4ca80a9c88 |
| ac5bc8a6f4 |
| c4361cc8bd |
| 1794d579d2 |
| bcfbf7151c |
| 38730cdd31 |
| 5d5d78d727 |
| 67297bfc9c |
| 82fda5dfc4 |
| 907f14b73c |
| 3eefd447ac |
| 72ce95525c |
CHANGELOG.md (85 changed lines)
@@ -11,19 +11,84 @@ Sections:

---

## [0.5.3] - 2026-01-16

### Added

- Native Home Assistant Update entities for installed repositories (shown under **Settings → System → Updates**).
- Human-friendly update names based on the repository name (instead of internal repo IDs like `index:1`).

### Changed

- The update UI now behaves like official Home Assistant integrations (the update action is triggered via the HA Updates screen).

## [0.5.2] - 2026-01-16

### Added

- Install and update backend endpoints (`POST /api/bcs/install`, `POST /api/bcs/update`) to install repositories into `/config/custom_components`.
- Installed version tracking based on the actually installed ref (tag/release/branch), stored persistently to support repositories with outdated or `0.0.0` manifest versions.
- API fields `installed_version` (installed ref) and `installed_manifest_version` (informational) to improve transparency in the UI.

### Changed

- Update availability is now evaluated using the stored installed ref (instead of the `manifest.json` version), preventing false-positive updates when repositories do not maintain manifest versions.

### Fixed

- Repositories with `manifest.json` version `0.0.0` (or stale versions) no longer appear to constantly require updates after installing the latest release/tag.

## [0.5.0] - 2026-01-15

### Added

- Manual refresh button that triggers a full backend refresh (store index + provider data).
- Unified refresh pipeline: startup, timer and UI now use the same refresh logic.
- Cache-busting for store index requests to always fetch the latest `store.yaml`.

### Improved

- Logging for store index loading and parsing.
- Refresh behavior is now deterministic and verifiable via logs.

### Fixed

- The refresh button previously only reloaded cached data.
- The store index was not always reloaded immediately on user action.

## [0.4.1] - 2026-01-15

### Fixed

- GitLab README loading now uses robust raw file endpoints.
- Added support for nested GitLab groups when resolving README paths.
- Added fallback handling for multiple README filenames (`README.md`, `README`, `README.rst`, etc.).
- Added branch fallback logic for README loading (`default`, `main`, `master`).
- Improved error resilience so README loading failures never break the store core.
- No behavior change for the GitHub and Gitea providers.

## [0.4.0] - 2026-01-15

### Added

- Repository detail view (second page) in the Store UI.
- README rendering using Home Assistant's `ha-markdown` element.
- Floating action buttons (FAB):
  - Open repository
  - Reload README
  - Install (coming soon)
  - Update (coming soon)
- Search field and category filter on the repository list page.
- New authenticated API endpoint: `GET /api/bcs/readme?repo_id=<id>` returns README markdown (best-effort).
- Initial public release of the Bahmcloud Store integration.
- Sidebar panel with repository browser UI.
- Support for loading repositories from a central `store.yaml` index.
- Support for custom repositories added by the user.
- Provider abstraction for GitHub, GitLab and Gitea:
  - Fetch repository information (name, description, default branch).
  - Resolve latest version from releases or tags, with fallback mechanisms.
- Repository metadata support via `bcs.yaml`, `hacs.yaml` or `hacs.json`.
- README loading and rendering pipeline:
  - Fetch raw README files.
  - Server-side Markdown rendering.
  - Sanitized HTML output for the panel UI.
- Auto-refresh mechanism for the store index and repository metadata.
- API endpoints to list repositories, add a custom repository and remove a repository; persisted via Home Assistant storage (`.storage/bcs_store`).
- Public static asset endpoint for the panel JS (`/api/bahmcloud_store_static/...`) without auth (required for HA custom panels).
- Initial API namespace:
  - `GET /api/bcs`: list merged repositories (index + custom)
  - `POST /api/bcs`: add custom repository
  - `DELETE /api/bcs/custom_repo`: remove custom repository

### Changed

- Repository cards are now clickable to open the detail view.
||||
bcs.yaml (new file, 19 lines)
@@ -0,0 +1,19 @@

```yaml
name: Bahmcloud Store
description: >
  Provider-neutral custom integration store for Home Assistant.
  Supports GitHub, GitLab, Gitea and Bahmcloud repositories with
  a central index, UI panel and API, similar to HACS but independent.

category: Store

author: Bahmcloud
maintainer: Bahmcloud

domains:
  - bahmcloud_store

min_ha_version: "2024.1.0"

homepage: https://git.bahmcloud.de/bahmcloud/bahmcloud_store
issues: https://git.bahmcloud.de/bahmcloud/bahmcloud_store/issues
source: https://git.bahmcloud.de/bahmcloud/bahmcloud_store
```
||||
@@ -1,59 +1,166 @@

```python
from __future__ import annotations

import logging
from datetime import timedelta
from dataclasses import dataclass
from typing import Any

from homeassistant.const import Platform
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.discovery import async_load_platform
from homeassistant.helpers.event import async_track_time_interval
from homeassistant.components.panel_custom import async_register_panel
from homeassistant.components.update import UpdateEntity, UpdateEntityFeature
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.entity import EntityCategory

from .core import DOMAIN, SIGNAL_UPDATED, BCSCore, BCSConfig, BCSError

_LOGGER = logging.getLogger(__name__)

DEFAULT_STORE_URL = "https://git.bahmcloud.de/bahmcloud/ha_store/raw/branch/main/store.yaml"
CONF_STORE_URL = "store_url"


async def async_setup(hass: HomeAssistant, config: dict) -> bool:
    cfg = config.get(DOMAIN, {})
    store_url = cfg.get(CONF_STORE_URL, DEFAULT_STORE_URL)

    core = BCSCore(hass, BCSConfig(store_url=store_url))
    hass.data[DOMAIN] = core

    await core.register_http_views()

    # RESTORE: keep the module_url pattern that worked for you
    await async_register_panel(
        hass,
        frontend_url_path="bahmcloud-store",
        webcomponent_name="bahmcloud-store-panel",
        module_url="/api/bahmcloud_store_static/panel.js?v=42",
        sidebar_title="Bahmcloud Store",
        sidebar_icon="mdi:store",
        require_admin=True,
        config={},
    )

    try:
        await core.refresh()
    except BCSError as e:
        _LOGGER.error("Initial refresh failed: %s", e)

    async def periodic(_now) -> None:
        try:
            await core.refresh()
            core.signal_updated()
        except BCSError as e:
            _LOGGER.warning("Periodic refresh failed: %s", e)

    interval = timedelta(seconds=int(core.refresh_seconds or 300))
    async_track_time_interval(hass, periodic, interval)

    await async_load_platform(hass, Platform.UPDATE, DOMAIN, {}, config)
    return True


def _pretty_repo_name(core: BCSCore, repo_id: str) -> str:
    """Return a human-friendly name for a repo update entity."""
    try:
        repo = core.get_repo(repo_id)
        if repo and getattr(repo, "name", None):
            name = str(repo.name).strip()
            if name:
                return name
    except Exception:
        pass

    # Fallbacks
    if repo_id.startswith("index:"):
        return f"BCS Index {repo_id.split(':', 1)[1]}"
    if repo_id.startswith("custom:"):
        return f"BCS Custom {repo_id.split(':', 1)[1]}"
    return f"BCS {repo_id}"


@dataclass(frozen=True)
class _RepoKey:
    repo_id: str


class BCSRepoUpdateEntity(UpdateEntity):
    """Update entity representing a BCS-managed repository."""

    _attr_entity_category = EntityCategory.DIAGNOSTIC
    _attr_supported_features = UpdateEntityFeature.INSTALL

    def __init__(self, core: BCSCore, repo_id: str) -> None:
        self._core = core
        self._repo_id = repo_id
        self._in_progress = False

        # Stable unique id (do NOT change)
        self._attr_unique_id = f"{DOMAIN}:{repo_id}"

        # Human-friendly name in UI
        pretty = _pretty_repo_name(core, repo_id)
        self._attr_name = pretty

        # Title shown in the entity dialog
        self._attr_title = pretty

    @property
    def available(self) -> bool:
        repo = self._core.get_repo(self._repo_id)
        installed = self._core.get_installed(self._repo_id)
        return repo is not None and installed is not None

    @property
    def in_progress(self) -> bool | None:
        return self._in_progress

    @property
    def installed_version(self) -> str | None:
        installed = self._core.get_installed(self._repo_id) or {}
        v = installed.get("installed_version") or installed.get("ref")
        return str(v) if v else None

    @property
    def latest_version(self) -> str | None:
        repo = self._core.get_repo(self._repo_id)
        if not repo:
            return None
        v = getattr(repo, "latest_version", None)
        return str(v) if v else None

    @property
    def update_available(self) -> bool:
        latest = self.latest_version
        installed = self.installed_version
        if not latest or not installed:
            return False
        return latest != installed

    def version_is_newer(self, latest_version: str, installed_version: str) -> bool:
        return latest_version != installed_version

    @property
    def release_url(self) -> str | None:
        repo = self._core.get_repo(self._repo_id)
        return getattr(repo, "url", None) if repo else None

    async def async_install(self, version: str | None, backup: bool, **kwargs: Any) -> None:
        if version is not None:
            _LOGGER.debug(
                "BCS update entity requested specific version=%s (ignored)", version
            )

        self._in_progress = True
        self.async_write_ha_state()

        try:
            await self._core.update_repo(self._repo_id)
        finally:
            self._in_progress = False
            self.async_write_ha_state()


@callback
def _sync_entities(
    core: BCSCore,
    existing: dict[str, BCSRepoUpdateEntity],
    async_add_entities: AddEntitiesCallback,
) -> None:
    """Ensure there is one update entity per installed repo."""
    installed_map = getattr(core, "_installed_cache", {}) or {}
    new_entities: list[BCSRepoUpdateEntity] = []

    for repo_id, data in installed_map.items():
        if not isinstance(data, dict):
            continue
        if repo_id in existing:
            continue

        ent = BCSRepoUpdateEntity(core, repo_id)
        existing[repo_id] = ent
        new_entities.append(ent)

    if new_entities:
        async_add_entities(new_entities)

    for ent in existing.values():
        ent.async_write_ha_state()


async def async_setup_platform(
    hass: HomeAssistant,
    config,
    async_add_entities: AddEntitiesCallback,
    discovery_info=None,
):
    """Set up BCS update entities."""
    core: BCSCore | None = hass.data.get(DOMAIN)
    if not core:
        _LOGGER.debug("BCS core not available, skipping update platform setup")
        return

    entities: dict[str, BCSRepoUpdateEntity] = {}

    _sync_entities(core, entities, async_add_entities)

    @callback
    def _handle_update() -> None:
        _sync_entities(core, entities, async_add_entities)

    async_dispatcher_connect(hass, SIGNAL_UPDATED, _handle_update)
```
||||
@@ -1,361 +1,166 @@

```python
from __future__ import annotations

import asyncio
import json
import logging
from dataclasses import dataclass
from pathlib import Path
from typing import Any
from urllib.parse import urlparse

from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.util import yaml as ha_yaml

from .storage import BCSStorage, CustomRepo
from .views import StaticAssetsView, BCSApiView, BCSReadmeView
from .custom_repo_view import BCSCustomRepoView
from .providers import fetch_repo_info, detect_provider, RepoInfo
from .metadata import fetch_repo_metadata, RepoMetadata

_LOGGER = logging.getLogger(__name__)

DOMAIN = "bahmcloud_store"


class BCSError(Exception):
    """BCS core error."""


@dataclass
class BCSConfig:
    store_url: str


@dataclass
class RepoItem:
    id: str
    name: str
    url: str
    source: str  # "index" | "custom"

    owner: str | None = None
    provider: str | None = None
    provider_repo_name: str | None = None
    provider_description: str | None = None
    default_branch: str | None = None

    latest_version: str | None = None
    latest_version_source: str | None = None  # "release" | "tag" | None

    meta_source: str | None = None
    meta_name: str | None = None
    meta_description: str | None = None
    meta_category: str | None = None
    meta_author: str | None = None
    meta_maintainer: str | None = None


class BCSCore:
    def __init__(self, hass: HomeAssistant, config: BCSConfig) -> None:
        self.hass = hass
        self.config = config
        self.storage = BCSStorage(hass)

        self.refresh_seconds: int = 300
        self.repos: dict[str, RepoItem] = {}
        self._listeners: list[callable] = []

        self.version: str = self._read_manifest_version()

    def _read_manifest_version(self) -> str:
        try:
            manifest_path = Path(__file__).resolve().parent / "manifest.json"
            data = json.loads(manifest_path.read_text(encoding="utf-8"))
            v = data.get("version")
            return str(v) if v else "unknown"
        except Exception:
            return "unknown"

    def add_listener(self, cb) -> None:
        self._listeners.append(cb)

    def signal_updated(self) -> None:
        for cb in list(self._listeners):
            try:
                cb()
            except Exception:
                pass

    async def register_http_views(self) -> None:
        self.hass.http.register_view(StaticAssetsView())
        self.hass.http.register_view(BCSApiView(self))
        self.hass.http.register_view(BCSReadmeView(self))
        self.hass.http.register_view(BCSCustomRepoView(self))

    def get_repo(self, repo_id: str) -> RepoItem | None:
        return self.repos.get(repo_id)

    async def refresh(self) -> None:
        index_repos, refresh_seconds = await self._load_index_repos()
        self.refresh_seconds = refresh_seconds

        custom_repos = await self.storage.list_custom_repos()

        merged: dict[str, RepoItem] = {}

        for item in index_repos:
            merged[item.id] = item

        for c in custom_repos:
            merged[c.id] = RepoItem(
                id=c.id,
                name=(c.name or c.url),
                url=c.url,
                source="custom",
            )

        for r in merged.values():
            r.provider = detect_provider(r.url)

        await self._enrich_and_resolve(merged)
        self.repos = merged

    async def _enrich_and_resolve(self, merged: dict[str, RepoItem]) -> None:
        sem = asyncio.Semaphore(6)

        async def process_one(r: RepoItem) -> None:
            async with sem:
                info: RepoInfo = await fetch_repo_info(self.hass, r.url)

                r.provider = info.provider or r.provider
                r.owner = info.owner or r.owner
                r.provider_repo_name = info.repo_name
                r.provider_description = info.description
                r.default_branch = info.default_branch or r.default_branch

                r.latest_version = info.latest_version
                r.latest_version_source = info.latest_version_source

                md: RepoMetadata = await fetch_repo_metadata(self.hass, r.url, r.default_branch)
                r.meta_source = md.source
                r.meta_name = md.name
                r.meta_description = md.description
                r.meta_category = md.category
                r.meta_author = md.author
                r.meta_maintainer = md.maintainer

                has_user_or_index_name = bool(r.name) and (r.name != r.url) and (not str(r.name).startswith("http"))
                if r.meta_name:
                    r.name = r.meta_name
                elif not has_user_or_index_name and r.provider_repo_name:
                    r.name = r.provider_repo_name
                elif not r.name:
                    r.name = r.url

        await asyncio.gather(*(process_one(r) for r in merged.values()), return_exceptions=True)

    async def _load_index_repos(self) -> tuple[list[RepoItem], int]:
        session = async_get_clientsession(self.hass)
        try:
            async with session.get(self.config.store_url, timeout=20) as resp:
                if resp.status != 200:
                    raise BCSError(f"store_url returned {resp.status}")
                raw = await resp.text()
        except Exception as e:
            raise BCSError(f"Failed fetching store index: {e}") from e

        try:
            data = ha_yaml.parse_yaml(raw)
            if not isinstance(data, dict):
                raise BCSError("store.yaml must be a mapping")

            refresh_seconds = int(data.get("refresh_seconds", 300))
            repos = data.get("repos", [])
            if not isinstance(repos, list):
                raise BCSError("store.yaml 'repos' must be a list")

            items: list[RepoItem] = []
            for i, r in enumerate(repos):
                if not isinstance(r, dict):
                    continue
                url = str(r.get("url", "")).strip()
                if not url:
                    continue
                name = str(r.get("name") or url).strip()

                items.append(
                    RepoItem(
                        id=f"index:{i}",
                        name=name,
                        url=url,
                        source="index",
                    )
                )

            return items, refresh_seconds
        except Exception as e:
            raise BCSError(f"Invalid store.yaml: {e}") from e

    async def add_custom_repo(self, url: str, name: str | None) -> CustomRepo:
        url = str(url or "").strip()
        if not url:
            raise BCSError("Missing url")

        c = await self.storage.add_custom_repo(url, name)
        await self.refresh()
        self.signal_updated()
        return c

    async def remove_custom_repo(self, repo_id: str) -> None:
        await self.storage.remove_custom_repo(repo_id)
        await self.refresh()
        self.signal_updated()

    async def list_custom_repos(self) -> list[CustomRepo]:
        return await self.storage.list_custom_repos()

    def list_repos_public(self) -> list[dict[str, Any]]:
        out: list[dict[str, Any]] = []
        for r in self.repos.values():
            out.append(
                {
                    "id": r.id,
                    "name": r.name,
                    "url": r.url,
                    "source": r.source,
                    "owner": r.owner,
                    "provider": r.provider,
                    "repo_name": r.provider_repo_name,
                    "description": r.provider_description or r.meta_description,
                    "default_branch": r.default_branch,
                    "latest_version": r.latest_version,
                    "latest_version_source": r.latest_version_source,
                    "category": r.meta_category,
                    "meta_author": r.meta_author,
                    "meta_maintainer": r.meta_maintainer,
                    "meta_source": r.meta_source,
                }
            )
        return out

    def _split_owner_repo(self, repo_url: str) -> tuple[str | None, str | None]:
        u = urlparse(repo_url.rstrip("/"))
        parts = [p for p in u.path.strip("/").split("/") if p]
        if len(parts) < 2:
            return None, None
        owner = parts[0].strip() or None
        name = parts[1].strip()
        if name.endswith(".git"):
            name = name[:-4]
        name = name.strip() or None
        return owner, name

    def _is_github(self, repo_url: str) -> bool:
        return "github.com" in urlparse(repo_url).netloc.lower()

    def _is_gitea(self, repo_url: str) -> bool:
        host = urlparse(repo_url).netloc.lower()
        return host and "github.com" not in host and "gitlab.com" not in host

    async def _fetch_text(self, url: str) -> str | None:
        session = async_get_clientsession(self.hass)
        try:
            async with session.get(url, timeout=20) as resp:
                if resp.status != 200:
                    return None
                return await resp.text()
        except Exception:
            return None

    async def fetch_readme_markdown(self, repo_id: str) -> str | None:
        """Fetch README markdown from GitHub, Gitea or GitLab.

        Defensive behavior:
        - tries multiple common filenames
        - tries multiple branches (default, main, master)
        - uses public raw endpoints (no tokens required for public repositories)
        """
        repo = self.get_repo(repo_id)
        if not repo:
            return None

        repo_url = (repo.url or "").strip()
        if not repo_url:
            return None

        # Branch fallbacks
        branch_candidates: list[str] = []
        if repo.default_branch and str(repo.default_branch).strip():
            branch_candidates.append(str(repo.default_branch).strip())
        for b in ("main", "master"):
            if b not in branch_candidates:
                branch_candidates.append(b)

        # Filename fallbacks
        filenames = ["README.md", "readme.md", "README.MD", "README.rst", "README"]

        provider = (repo.provider or "").strip().lower()
        if not provider:
            provider = detect_provider(repo_url) or ""

        u = urlparse(repo_url.rstrip("/"))
        host = (u.netloc or "").lower()

        candidates: list[str] = []

        if self._is_github(repo_url):
            owner, name = self._split_owner_repo(repo_url)
            if not owner or not name:
                return None
            for branch in branch_candidates:
                base = f"https://raw.githubusercontent.com/{owner}/{name}/{branch}"
                candidates.extend([f"{base}/{fn}" for fn in filenames])

        elif provider == "gitlab" or "gitlab" in host:
            # GitLab can have nested groups: /group/subgroup/repo
            parts = [p for p in u.path.strip("/").split("/") if p]
            if len(parts) < 2:
                return None

            repo_name = parts[-1].strip()
            if repo_name.endswith(".git"):
                repo_name = repo_name[:-4]
            group_path = "/".join(parts[:-1]).strip("/")
            if not group_path or not repo_name:
                return None

            root = f"{u.scheme}://{u.netloc}/{group_path}/{repo_name}"
            for branch in branch_candidates:
                bases = [
                    f"{root}/-/raw/{branch}",
                    # Some instances may expose /raw/<branch> as well
                    f"{root}/raw/{branch}",
                ]
                for b in bases:
                    candidates.extend([f"{b}/{fn}" for fn in filenames])

        elif self._is_gitea(repo_url):
            owner, name = self._split_owner_repo(repo_url)
            if not owner or not name:
                return None

            root = f"{u.scheme}://{u.netloc}/{owner}/{name}"

            for branch in branch_candidates:
                bases = [
                    f"{root}/raw/branch/{branch}",
                    f"{root}/raw/{branch}",
                ]
                for b in bases:
                    candidates.extend([f"{b}/{fn}" for fn in filenames])

        else:
            return None

        for url in candidates:
            txt = await self._fetch_text(url)
            if txt and txt.strip():
                return txt

        return None
```
||||
```diff
@@ -1,9 +1,10 @@
 {
   "domain": "bahmcloud_store",
   "name": "Bahmcloud Store",
-  "version": "0.4.0",
+  "version": "0.5.3",
   "documentation": "https://git.bahmcloud.de/bahmcloud/bahmcloud_store",
+  "platforms": ["update"],
   "requirements": [],
-  "codeowners": [],
+  "codeowners": ["@bahmcloud"],
   "iot_class": "local_polling"
 }
```
||||
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -6,6 +6,8 @@ import xml.etree.ElementTree as ET
|
||||
from dataclasses import dataclass
|
||||
from urllib.parse import quote_plus, urlparse
|
||||
|
||||
from packaging.version import InvalidVersion, Version
|
||||
|
||||
from homeassistant.core import HomeAssistant
|
||||
from homeassistant.helpers.aiohttp_client import async_get_clientsession
|
||||
|
||||
@@ -51,12 +53,7 @@ def detect_provider(repo_url: str) -> str:
|
||||
return "github"
|
||||
if "gitlab" in host:
|
||||
return "gitlab"
|
||||
|
||||
owner, repo = _split_owner_repo(repo_url)
|
||||
if owner and repo:
|
||||
return "gitea"
|
||||
|
||||
return "generic"
|
||||
return "gitea"
|
||||
|
||||
|
||||
async def _safe_json(session, url: str, *, headers: dict | None = None, timeout: int = 20):
|
||||
@@ -82,130 +79,113 @@ async def _safe_text(session, url: str, *, headers: dict | None = None, timeout:


def _extract_tag_from_github_url(url: str) -> str | None:
    m = re.search(r"/releases/tag/([^/?#]+)", url)
    if m:
        return m.group(1)
    m = re.search(r"/tag/([^/?#]+)", url)
    if m:
        return m.group(1)
    return None


def _strip_html(s: str) -> str:
    # minimal HTML entity cleanup for meta descriptions
    out = (
        s.replace("&amp;", "&")
        .replace("&quot;", '"')
        .replace("&#39;", "'")
        .replace("&lt;", "<")
        .replace("&gt;", ">")
    )
    return re.sub(r"\s+", " ", out).strip()
    m = re.search(r"/releases/tag/([^/?#]+)", url or "")
    if not m:
        return None
    return m.group(1).strip() or None

def _extract_meta(html: str, *, prop: str | None = None, name: str | None = None) -> str | None:
    # Extract <meta property="og:description" content="...">
    # or <meta name="description" content="...">
    if not html:
        return None
    if prop:
        # property="..." content="..."
        m = re.search(
            r'<meta[^>]+property=["\']' + re.escape(prop) + r'["\'][^>]+content=["\']([^"\']+)["\']',
            html,
            flags=re.IGNORECASE,
        )
        m = re.search(rf'<meta\s+property="{re.escape(prop)}"\s+content="([^"]+)"', html)
        if m:
            return _strip_html(m.group(1))
        m = re.search(
            r'<meta[^>]+content=["\']([^"\']+)["\'][^>]+property=["\']' + re.escape(prop) + r'["\']',
            html,
            flags=re.IGNORECASE,
        )
        if m:
            return _strip_html(m.group(1))

        return m.group(1).strip()
    if name:
        m = re.search(
            r'<meta[^>]+name=["\']' + re.escape(name) + r'["\'][^>]+content=["\']([^"\']+)["\']',
            html,
            flags=re.IGNORECASE,
        )
        m = re.search(rf'<meta\s+name="{re.escape(name)}"\s+content="([^"]+)"', html)
        if m:
            return _strip_html(m.group(1))
        m = re.search(
            r'<meta[^>]+content=["\']([^"\']+)["\'][^>]+name=["\']' + re.escape(name) + r'["\']',
            html,
            flags=re.IGNORECASE,
        )
        if m:
            return _strip_html(m.group(1))

        return m.group(1).strip()
    return None

async def _github_description_html(hass: HomeAssistant, owner: str, repo: str) -> str | None:
    """
    GitHub API may be rate-limited; fetch public HTML and read meta description.
    """
    session = async_get_clientsession(hass)
    headers = {
        "User-Agent": UA,
        "Accept": "text/html,application/xhtml+xml",
    }
def _semver_key(tag: str) -> Version | None:
    t = (tag or "").strip()
    if not t:
        return None
    if t.startswith(("v", "V")):
        t = t[1:]
    try:
        return Version(t)
    except InvalidVersion:
        return None

    html, status = await _safe_text(session, f"https://github.com/{owner}/{repo}", headers=headers)
    if not html or status != 200:


def _pick_highest_semver(tags: list[str]) -> str | None:
    parsed: list[tuple[Version, str]] = []
    for t in tags:
        if not isinstance(t, str):
            continue
        ts = t.strip()
        if not ts:
            continue
        v = _semver_key(ts)
        if v is not None:
            parsed.append((v, ts))

    if not parsed:
        return None
    parsed.sort(key=lambda x: x[0], reverse=True)
    return parsed[0][1]

async def _github_description_html(hass: HomeAssistant, owner: str, repo: str) -> str | None:
    session = async_get_clientsession(hass)
    url = f"https://github.com/{owner}/{repo}"
    html, status = await _safe_text(session, url, headers={"User-Agent": UA})
    if status != 200 or not html:
        return None

    desc = _extract_meta(html, prop="og:description")
    if desc:
        return desc

    desc = _extract_meta(html, name="description")
    if desc:
        return desc

    return None
    return _extract_meta(html, name="description")

async def _github_latest_version_atom(hass: HomeAssistant, owner: str, repo: str) -> tuple[str | None, str | None]:
    session = async_get_clientsession(hass)
    headers = {"User-Agent": UA, "Accept": "application/atom+xml,text/xml;q=0.9,*/*;q=0.8"}

    xml_text, _ = await _safe_text(session, f"https://github.com/{owner}/{repo}/releases.atom", headers=headers)
    if not xml_text:
    url = f"https://github.com/{owner}/{repo}/releases.atom"
    atom, status = await _safe_text(session, url, headers={"User-Agent": UA})
    if status != 200 or not atom:
        return None, None

    try:
        root = ET.fromstring(xml_text)
    except Exception:
        return None, None

    for entry in root.findall(".//{*}entry"):
        for link in entry.findall(".//{*}link"):
            href = link.attrib.get("href")
            if not href:
                continue
            tag = _extract_tag_from_github_url(href)
        root = ET.fromstring(atom)
        ns = {"a": "http://www.w3.org/2005/Atom"}
        entry = root.find("a:entry", ns)
        if entry is None:
            return None, None
        link = entry.find("a:link", ns)
        if link is not None and link.attrib.get("href"):
            tag = _extract_tag_from_github_url(link.attrib["href"])
            if tag:
                return tag, "atom"
        title = entry.find("a:title", ns)
        if title is not None and title.text:
            t = title.text.strip()
            if t:
                return t, "atom"
    except Exception:
        return None, None

    return None, None

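The namespace-aware Atom parsing in the new version can be checked offline against a minimal `releases.atom`-shaped document (the feed content below is illustrative):

```python
import re
import xml.etree.ElementTree as ET

ATOM = '''<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <entry>
    <title>v0.5.3</title>
    <link rel="alternate" href="https://github.com/bahmcloud/bahmcloud_store/releases/tag/v0.5.3"/>
  </entry>
</feed>'''

# Explicit Atom namespace mapping, as in the diff.
ns = {"a": "http://www.w3.org/2005/Atom"}
root = ET.fromstring(ATOM)
entry = root.find("a:entry", ns)
link = entry.find("a:link", ns)
m = re.search(r"/releases/tag/([^/?#]+)", link.attrib["href"])
# Prefer the tag from the link href, fall back to the entry title.
tag = m.group(1) if m else entry.find("a:title", ns).text.strip()
```

The first feed entry is the newest release, which is why only one `entry` is inspected.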
async def _github_latest_version_redirect(hass: HomeAssistant, owner: str, repo: str) -> tuple[str | None, str | None]:
    session = async_get_clientsession(hass)
    headers = {"User-Agent": UA}
    url = f"https://github.com/{owner}/{repo}/releases/latest"
    try:
        async with session.head(url, allow_redirects=False, timeout=15, headers=headers) as resp:
            if resp.status in (301, 302, 303, 307, 308):
                loc = resp.headers.get("Location")
                if loc:
                    tag = _extract_tag_from_github_url(loc)
                    if tag:
                        return tag, "release"
        async with session.get(url, timeout=20, headers={"User-Agent": UA}, allow_redirects=True) as resp:
            if resp.status != 200:
                return None, None
            final = str(resp.url)
            tag = _extract_tag_from_github_url(final)
            if tag:
                return tag, "release"
    except Exception:
        pass
    return None, None

    return None, None

@@ -213,75 +193,125 @@ async def _github_latest_version_api(hass: HomeAssistant, owner: str, repo: str)
    session = async_get_clientsession(hass)
    headers = {"Accept": "application/vnd.github+json", "User-Agent": UA}

    data, _ = await _safe_json(session, f"https://api.github.com/repos/{owner}/{repo}/releases/latest", headers=headers)
    if isinstance(data, dict):
        tag = data.get("tag_name") or data.get("name")
        if isinstance(tag, str) and tag.strip():
            return tag.strip(), "release"
    data, status = await _safe_json(session, f"https://api.github.com/repos/{owner}/{repo}/releases/latest", headers=headers)
    if isinstance(data, dict) and data.get("tag_name"):
        return str(data["tag_name"]), "release"

    data, _ = await _safe_json(session, f"https://api.github.com/repos/{owner}/{repo}/tags?per_page=1", headers=headers)
    if isinstance(data, list) and data:
        tag = data[0].get("name")
        if isinstance(tag, str) and tag.strip():
            return tag.strip(), "tag"
    # No releases -> pick highest semver from many tags (instead of per_page=1)
    if status == 404:
        data, _ = await _safe_json(session, f"https://api.github.com/repos/{owner}/{repo}/tags?per_page=100", headers=headers)
        tags: list[str] = []
        if isinstance(data, list):
            for t in data:
                if isinstance(t, dict) and t.get("name"):
                    tags.append(str(t["name"]))

        best = _pick_highest_semver(tags)
        if best:
            return best, "tag"

        # fallback: keep old behavior (first tag)
        if tags:
            return tags[0], "tag"

    return None, None


async def _github_latest_version(hass: HomeAssistant, owner: str, repo: str) -> tuple[str | None, str | None]:
    tag, src = await _github_latest_version_atom(hass, owner, repo)
    if tag:
        return tag, src

    tag, src = await _github_latest_version_redirect(hass, owner, repo)
    if tag:
        return tag, src

    return await _github_latest_version_api(hass, owner, repo)
    tag, src = await _github_latest_version_api(hass, owner, repo)
    if tag:
        return tag, src

    return await _github_latest_version_atom(hass, owner, repo)

async def _gitea_latest_version(hass: HomeAssistant, base: str, owner: str, repo: str) -> tuple[str | None, str | None]:
    session = async_get_clientsession(hass)

    data, _ = await _safe_json(session, f"{base}/api/v1/repos/{owner}/{repo}/releases?limit=1")
    if isinstance(data, list) and data:
        tag = data[0].get("tag_name") or data[0].get("name")
        if isinstance(tag, str) and tag.strip():
            return tag.strip(), "release"
    # releases: fetch multiple, pick highest semver (instead of limit=1)
    data, _ = await _safe_json(session, f"{base}/api/v1/repos/{owner}/{repo}/releases?limit=50")
    rel_tags: list[str] = []
    if isinstance(data, list):
        for r in data:
            if isinstance(r, dict) and r.get("tag_name"):
                rel_tags.append(str(r["tag_name"]))

    data, _ = await _safe_json(session, f"{base}/api/v1/repos/{owner}/{repo}/tags?limit=1")
    if isinstance(data, list) and data:
        tag = data[0].get("name")
        if isinstance(tag, str) and tag.strip():
            return tag.strip(), "tag"
    best_rel = _pick_highest_semver(rel_tags)
    if best_rel:
        return best_rel, "release"
    if rel_tags:
        return rel_tags[0], "release"

    # tags: fetch multiple, pick highest semver (instead of limit=1)
    data, _ = await _safe_json(session, f"{base}/api/v1/repos/{owner}/{repo}/tags?limit=50")
    tags: list[str] = []
    if isinstance(data, list):
        for t in data:
            if isinstance(t, dict) and t.get("name"):
                tags.append(str(t["name"]))

    best = _pick_highest_semver(tags)
    if best:
        return best, "tag"
    if tags:
        return tags[0], "tag"

    return None, None

async def _gitlab_latest_version(hass: HomeAssistant, base: str, owner: str, repo: str) -> tuple[str | None, str | None]:
async def _gitlab_latest_version(
    hass: HomeAssistant, base: str, owner: str, repo: str
) -> tuple[str | None, str | None]:
    session = async_get_clientsession(hass)
    headers = {"User-Agent": UA}

    project = quote_plus(f"{owner}/{repo}")

    data, _ = await _safe_json(
        session,
        f"{base}/api/v4/projects/{project}/releases?per_page=1&order_by=released_at&sort=desc",
        headers=headers,
    )
    if isinstance(data, list) and data:
        tag = data[0].get("tag_name") or data[0].get("name")
        if isinstance(tag, str) and tag.strip():
            return tag.strip(), "release"
    # releases: fetch multiple, pick highest semver (instead of per_page=1)
    data, _ = await _safe_json(session, f"{base}/api/v4/projects/{project}/releases?per_page=50", headers=headers)
    rel_tags: list[str] = []
    if isinstance(data, list):
        for r in data:
            if isinstance(r, dict) and r.get("tag_name"):
                rel_tags.append(str(r["tag_name"]))

    data, _ = await _safe_json(
        session,
        f"{base}/api/v4/projects/{project}/repository/tags?per_page=1&order_by=updated&sort=desc",
        headers=headers,
    )
    if isinstance(data, list) and data:
        tag = data[0].get("name")
        if isinstance(tag, str) and tag.strip():
            return tag.strip(), "tag"
    best_rel = _pick_highest_semver(rel_tags)
    if best_rel:
        return best_rel, "release"
    if rel_tags:
        return rel_tags[0], "release"

    # tags: fetch multiple, pick highest semver (instead of per_page=1)
    data, _ = await _safe_json(session, f"{base}/api/v4/projects/{project}/repository/tags?per_page=50", headers=headers)
    tags: list[str] = []
    if isinstance(data, list):
        for t in data:
            if isinstance(t, dict) and t.get("name"):
                tags.append(str(t["name"]))

    best = _pick_highest_semver(tags)
    if best:
        return best, "tag"
    if tags:
        return tags[0], "tag"

    # atom fallback
    atom, status = await _safe_text(session, f"{base}/{owner}/{repo}/-/tags?format=atom", headers=headers)
    if status == 200 and atom:
        try:
            root = ET.fromstring(atom)
            ns = {"a": "http://www.w3.org/2005/Atom"}
            entry = root.find("a:entry", ns)
            if entry is not None:
                title = entry.find("a:title", ns)
                if title is not None and title.text:
                    return title.text.strip(), "atom"
        except Exception:
            pass

    return None, None

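GitLab's v4 API addresses a project by its URL-encoded path, which is why the code builds `project` with `quote_plus` before interpolating it into the endpoint:

```python
from urllib.parse import quote_plus

owner, repo = "bahmcloud", "bahmcloud_store"
# The slash must be percent-encoded so the whole "owner/repo" path
# names a single project in /api/v4/projects/<id-or-encoded-path>/.
project = quote_plus(f"{owner}/{repo}")
```

Without the encoding, the slash would be treated as a path separator and the API route would not match.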
@@ -307,7 +337,6 @@ async def fetch_repo_info(hass: HomeAssistant, repo_url: str) -> RepoInfo:

    try:
        if provider == "github":
            # Try API repo details (may be rate-limited)
            headers = {"Accept": "application/vnd.github+json", "User-Agent": UA}
            data, status = await _safe_json(session, f"https://api.github.com/repos/{owner}/{repo}", headers=headers)

@@ -318,12 +347,10 @@ async def fetch_repo_info(hass: HomeAssistant, repo_url: str) -> RepoInfo:
            if isinstance(data.get("owner"), dict) and data["owner"].get("login"):
                info.owner = data["owner"]["login"]
            else:
                # If API blocked, still set reasonable defaults
                if status == 403:
                    _LOGGER.debug("GitHub API blocked/rate-limited for repo info %s/%s", owner, repo)
                info.default_branch = "main"

            # If description missing, fetch from GitHub HTML
            if not info.description:
                desc = await _github_description_html(hass, owner, repo)
                if desc:
@@ -371,8 +398,110 @@ async def fetch_repo_info(hass: HomeAssistant, repo_url: str) -> RepoInfo:
            info.latest_version_source = src
            return info

    return info

    except Exception as e:
        _LOGGER.debug("Provider fetch failed for %s: %s", repo_url, e)
        return info
        _LOGGER.debug("fetch_repo_info failed for %s: %s", repo_url, e)

    return info

async def fetch_readme_markdown(
    hass: HomeAssistant,
    repo_url: str,
    *,
    provider: str | None = None,
    default_branch: str | None = None,
) -> str | None:
    """Fetch README Markdown for public repositories (GitHub/GitLab/Gitea).

    Defensive behavior:
    - tries multiple common README filenames
    - tries multiple branches (default, main, master)
    - uses public raw endpoints (no tokens required for public repositories)
    """
    repo_url = (repo_url or "").strip()
    if not repo_url:
        return None

    prov = (provider or "").strip().lower() if provider else ""
    if not prov:
        prov = detect_provider(repo_url)

    branch_candidates: list[str] = []
    if default_branch and str(default_branch).strip():
        branch_candidates.append(str(default_branch).strip())
    for b in ("main", "master"):
        if b not in branch_candidates:
            branch_candidates.append(b)

    filenames = ["README.md", "readme.md", "README.MD", "README.rst", "README"]

    session = async_get_clientsession(hass)
    headers = {"User-Agent": UA}

    def _normalize_gitlab_path(path: str) -> str | None:
        p = (path or "").strip().strip("/")
        if not p:
            return None
        parts = [x for x in p.split("/") if x]
        if len(parts) < 2:
            return None
        if parts[-1].endswith(".git"):
            parts[-1] = parts[-1][:-4]
        return "/".join(parts)

    candidates: list[str] = []

    if prov == "github":
        owner, repo = _split_owner_repo(repo_url)
        if not owner or not repo:
            return None
        for branch in branch_candidates:
            base = f"https://raw.githubusercontent.com/{owner}/{repo}/{branch}"
            for fn in filenames:
                candidates.append(f"{base}/{fn}")

    elif prov == "gitea":
        owner, repo = _split_owner_repo(repo_url)
        if not owner or not repo:
            return None
        u = urlparse(repo_url.rstrip("/"))
        root = f"{u.scheme}://{u.netloc}/{owner}/{repo}"
        for branch in branch_candidates:
            bases = [
                f"{root}/raw/branch/{branch}",
                f"{root}/raw/{branch}",
            ]
            for b in bases:
                for fn in filenames:
                    candidates.append(f"{b}/{fn}")

    elif prov == "gitlab":
        u = urlparse(repo_url.rstrip("/"))
        path_repo = _normalize_gitlab_path(u.path)
        if not path_repo:
            return None
        root = f"{u.scheme}://{u.netloc}/{path_repo}"
        for branch in branch_candidates:
            bases = [
                f"{root}/-/raw/{branch}",
                f"{root}/raw/{branch}",
            ]
            for b in bases:
                for fn in filenames:
                    candidates.append(f"{b}/{fn}")

    else:
        return None

    for url in candidates:
        try:
            async with session.get(url, timeout=20, headers=headers) as resp:
                if resp.status != 200:
                    continue
                txt = await resp.text()
                if txt and txt.strip():
                    return txt
        except Exception:
            continue

    return None
@@ -1,5 +1,6 @@
from __future__ import annotations

import time
import uuid
from dataclasses import dataclass
from typing import Any
@@ -18,19 +19,40 @@ class CustomRepo:
    name: str | None = None


@dataclass
class InstalledRepo:
    repo_id: str
    url: str
    domains: list[str]
    installed_at: int
    installed_version: str | None = None  # BCS "installed ref" (tag/release/branch)
    installed_manifest_version: str | None = None  # informational only
    ref: str | None = None  # kept for backward compatibility / diagnostics


class BCSStorage:
    """Persistent storage for manually added repositories."""
    """Persistent storage for Bahmcloud Store.

    Keys:
    - custom_repos: list of manually added repositories
    - installed_repos: mapping repo_id -> installed metadata
    """

    def __init__(self, hass: HomeAssistant) -> None:
        self.hass = hass
        self._store = Store(hass, _STORAGE_VERSION, _STORAGE_KEY)
        self._store: Store[dict[str, Any]] = Store(hass, _STORAGE_VERSION, _STORAGE_KEY)

    async def _load(self) -> dict[str, Any]:
        data = await self._store.async_load()
        if not data:
            return {"custom_repos": []}
        if "custom_repos" not in data:
        data = await self._store.async_load() or {}
        if not isinstance(data, dict):
            data = {}

        if "custom_repos" not in data or not isinstance(data.get("custom_repos"), list):
            data["custom_repos"] = []

        if "installed_repos" not in data or not isinstance(data.get("installed_repos"), dict):
            data["installed_repos"] = {}

        return data

    async def _save(self, data: dict[str, Any]) -> None:
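The defensive `_load` normalization can be expressed as a pure function: whatever malformed payload comes back from storage collapses to a well-formed dict (helper name here is illustrative):

```python
from typing import Any

def normalize_storage(data: Any) -> dict[str, Any]:
    # Anything that is not a dict (None, list, str, ...) starts fresh.
    if not isinstance(data, dict):
        data = {}
    # Each key is forced to its expected container type.
    if not isinstance(data.get("custom_repos"), list):
        data["custom_repos"] = []
    if not isinstance(data.get("installed_repos"), dict):
        data["installed_repos"] = {}
    return data
```

This guards against both a first run (store returns `None`) and a corrupted store file.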
@@ -43,24 +65,20 @@ class BCSStorage:
        for r in repos:
            if not isinstance(r, dict):
                continue
            rid = str(r.get("id") or "")
            url = str(r.get("url") or "")
            name = r.get("name")
            if rid and url:
                out.append(CustomRepo(id=rid, url=url, name=str(name) if name else None))
            rid = r.get("id")
            url = r.get("url")
            if not rid or not url:
                continue
            out.append(CustomRepo(id=str(rid), url=str(url), name=r.get("name")))
        return out

    async def add_custom_repo(self, url: str, name: str | None) -> CustomRepo:
        data = await self._load()
        repos = data.get("custom_repos", [])

        # Deduplicate by URL
        # De-duplicate by URL
        for r in repos:
            if isinstance(r, dict) and str(r.get("url", "")).strip() == url.strip():
                # Update name if provided
                if name:
                    r["name"] = name
                await self._save(data)
            if isinstance(r, dict) and str(r.get("url") or "").strip() == url.strip():
                return CustomRepo(id=str(r["id"]), url=str(r["url"]), name=r.get("name"))

        rid = f"custom:{uuid.uuid4().hex[:10]}"
@@ -73,6 +91,94 @@ class BCSStorage:
    async def remove_custom_repo(self, repo_id: str) -> None:
        data = await self._load()
        repos = data.get("custom_repos", [])
        data["custom_repos"] = [r for r in repos if not (isinstance(r, dict) and r.get("id") == repo_id)]
        data["custom_repos"] = [
            r for r in repos if not (isinstance(r, dict) and r.get("id") == repo_id)
        ]
        await self._save(data)

    async def get_installed_repo(self, repo_id: str) -> InstalledRepo | None:
        data = await self._load()
        installed = data.get("installed_repos", {})
        if not isinstance(installed, dict):
            return None
        entry = installed.get(repo_id)
        if not isinstance(entry, dict):
            return None

        try:
            domains = entry.get("domains") or []
            if not isinstance(domains, list):
                domains = []
            domains = [str(d) for d in domains if str(d).strip()]

            installed_version = entry.get("installed_version")
            ref = entry.get("ref")

            # Backward compatibility:
            # If installed_version wasn't stored, fall back to ref.
            if (not installed_version) and ref:
                installed_version = ref

            installed_manifest_version = entry.get("installed_manifest_version")

            return InstalledRepo(
                repo_id=str(entry.get("repo_id") or repo_id),
                url=str(entry.get("url") or ""),
                domains=domains,
                installed_at=int(entry.get("installed_at") or 0),
                installed_version=str(installed_version) if installed_version else None,
                installed_manifest_version=str(installed_manifest_version) if installed_manifest_version else None,
                ref=str(ref) if ref else None,
            )
        except Exception:
            return None

    async def list_installed_repos(self) -> list[InstalledRepo]:
        data = await self._load()
        installed = data.get("installed_repos", {})
        out: list[InstalledRepo] = []
        if not isinstance(installed, dict):
            return out
        for rid in list(installed.keys()):
            item = await self.get_installed_repo(str(rid))
            if item:
                out.append(item)
        return out

    async def set_installed_repo(
        self,
        *,
        repo_id: str,
        url: str,
        domains: list[str],
        installed_version: str | None,
        installed_manifest_version: str | None = None,
        ref: str | None,
    ) -> None:
        data = await self._load()
        installed = data.get("installed_repos", {})
        if not isinstance(installed, dict):
            installed = {}
        data["installed_repos"] = installed

        installed[str(repo_id)] = {
            "repo_id": str(repo_id),
            "url": str(url),
            "domains": [str(d) for d in (domains or []) if str(d).strip()],
            "installed_at": int(time.time()),
            # IMPORTANT: this is what BCS uses as "installed version" (ref/tag/branch)
            "installed_version": installed_version,
            # informational only
            "installed_manifest_version": installed_manifest_version,
            # keep ref too (debug/backward compatibility)
            "ref": ref,
        }
        await self._save(data)

    async def remove_installed_repo(self, repo_id: str) -> None:
        data = await self._load()
        installed = data.get("installed_repos", {})
        if isinstance(installed, dict) and repo_id in installed:
            installed.pop(repo_id, None)
            data["installed_repos"] = installed
            await self._save(data)
8
custom_components/bahmcloud_store/strings.json
Normal file
@@ -0,0 +1,8 @@
{
    "issues": {
        "restart_required": {
            "title": "Restart required",
            "description": "One or more integrations were installed or updated by Bahmcloud Store. Restart Home Assistant to load the changes."
        }
    }
}
@@ -1,11 +1,146 @@
from __future__ import annotations

# NOTE:
# Update entities will be implemented once installation/provider resolution is in place.
# This stub prevents platform load errors and keeps the integration stable in 0.3.0.
import logging
from dataclasses import dataclass
from typing import Any

from homeassistant.core import HomeAssistant
from homeassistant.components.update import UpdateEntity, UpdateEntityFeature
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.entity import EntityCategory

from .core import DOMAIN, SIGNAL_UPDATED, BCSCore

_LOGGER = logging.getLogger(__name__)


def _pretty_repo_name(core: BCSCore, repo_id: str) -> str:
    """Return a human-friendly name for a repo update entity."""
    try:
        repo = core.get_repo(repo_id)
        if repo and getattr(repo, "name", None):
            name = str(repo.name).strip()
            if name:
                return name
    except Exception:
        pass

    # Fallbacks
    if repo_id.startswith("index:"):
        return f"BCS Index {repo_id.split(':', 1)[1]}"
    if repo_id.startswith("custom:"):
        return f"BCS Custom {repo_id.split(':', 1)[1]}"
    return f"BCS {repo_id}"

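The fallback naming (used when the repo record carries no display name) is pure string handling and can be checked in isolation:

```python
def fallback_name(repo_id: str) -> str:
    # Mirror of the fallback branch in _pretty_repo_name:
    # turn internal ids like "index:1" into readable labels.
    if repo_id.startswith("index:"):
        return f"BCS Index {repo_id.split(':', 1)[1]}"
    if repo_id.startswith("custom:"):
        return f"BCS Custom {repo_id.split(':', 1)[1]}"
    return f"BCS {repo_id}"
```

`split(':', 1)` keeps any further colons inside the id intact, so only the prefix is consumed.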
@dataclass(frozen=True)
class _RepoKey:
    repo_id: str


class BCSRepoUpdateEntity(UpdateEntity):
    """Update entity representing a BCS-managed repository."""

    _attr_entity_category = EntityCategory.DIAGNOSTIC
    _attr_supported_features = UpdateEntityFeature.INSTALL

    def __init__(self, core: BCSCore, repo_id: str) -> None:
        self._core = core
        self._repo_id = repo_id
        self._in_progress = False

        # Stable unique id (do NOT change)
        self._attr_unique_id = f"{DOMAIN}:{repo_id}"

        # Human-friendly name in UI
        pretty = _pretty_repo_name(core, repo_id)
        self._attr_name = pretty

        # Title shown in the entity dialog
        self._attr_title = pretty

    @property
    def available(self) -> bool:
        repo = self._core.get_repo(self._repo_id)
        installed = self._core.get_installed(self._repo_id)
        return repo is not None and installed is not None

    @property
    def in_progress(self) -> bool | None:
        return self._in_progress

    @property
    def installed_version(self) -> str | None:
        installed = self._core.get_installed(self._repo_id) or {}
        v = installed.get("installed_version") or installed.get("ref")
        return str(v) if v else None

    @property
    def latest_version(self) -> str | None:
        repo = self._core.get_repo(self._repo_id)
        if not repo:
            return None
        v = getattr(repo, "latest_version", None)
        return str(v) if v else None

    @property
    def update_available(self) -> bool:
        latest = self.latest_version
        installed = self.installed_version
        if not latest or not installed:
            return False
        return latest != installed

    def version_is_newer(self, latest_version: str, installed_version: str) -> bool:
        return latest_version != installed_version

    @property
    def release_url(self) -> str | None:
        repo = self._core.get_repo(self._repo_id)
        return getattr(repo, "url", None) if repo else None

    async def async_install(self, version: str | None, backup: bool, **kwargs: Any) -> None:
        if version is not None:
            _LOGGER.debug(
                "BCS update entity requested specific version=%s (ignored)", version
            )

        self._in_progress = True
        self.async_write_ha_state()

        try:
            await self._core.update_repo(self._repo_id)
        finally:
            self._in_progress = False
            self.async_write_ha_state()

@callback
def _sync_entities(
    core: BCSCore,
    existing: dict[str, BCSRepoUpdateEntity],
    async_add_entities: AddEntitiesCallback,
) -> None:
    """Ensure there is one update entity per installed repo."""
    installed_map = getattr(core, "_installed_cache", {}) or {}
    new_entities: list[BCSRepoUpdateEntity] = []

    for repo_id, data in installed_map.items():
        if not isinstance(data, dict):
            continue
        if repo_id in existing:
            continue

        ent = BCSRepoUpdateEntity(core, repo_id)
        existing[repo_id] = ent
        new_entities.append(ent)

    if new_entities:
        async_add_entities(new_entities)

    for ent in existing.values():
        ent.async_write_ha_state()


async def async_setup_platform(
@@ -14,4 +149,18 @@ async def async_setup_platform(
    async_add_entities: AddEntitiesCallback,
    discovery_info=None,
):
    return
    """Set up BCS update entities."""
    core: BCSCore | None = hass.data.get(DOMAIN)
    if not core:
        _LOGGER.debug("BCS core not available, skipping update platform setup")
        return

    entities: dict[str, BCSRepoUpdateEntity] = {}

    _sync_entities(core, entities, async_add_entities)

    @callback
    def _handle_update() -> None:
        _sync_entities(core, entities, async_add_entities)

    async_dispatcher_connect(hass, SIGNAL_UPDATED, _handle_update)
@@ -16,14 +16,12 @@ _LOGGER = logging.getLogger(__name__)


def _render_markdown_server_side(md: str) -> str | None:
    """Render Markdown -> sanitized HTML (server-side)."""
    text = (md or "").strip()
    if not text:
        return None

    html: str | None = None

    # 1) python-markdown
    try:
        import markdown as mdlib  # type: ignore

@@ -39,7 +37,6 @@ def _render_markdown_server_side(md: str) -> str | None:
    if not html:
        return None

    # 2) Sanitize via bleach
    try:
        import bleach  # type: ignore

@@ -124,16 +121,6 @@ def _maybe_decode_base64(content: str, encoding: Any) -> str | None:
|
||||
|
||||
|
||||
def _extract_text_recursive(obj: Any, depth: int = 0) -> str | None:
|
||||
"""
|
||||
Robust extraction for README markdown.
|
||||
|
||||
Handles:
|
||||
- str / bytes
|
||||
- dict with:
|
||||
- {content: "...", encoding: "base64"} (possibly nested)
|
||||
- {readme: "..."} etc.
|
||||
- list of dicts (pick first matching)
|
||||
"""
|
||||
if obj is None:
|
||||
return None
|
||||
|
||||
@@ -150,21 +137,16 @@ def _extract_text_recursive(obj: Any, depth: int = 0) -> str | None:
|
||||
return None
|
||||
|
||||
if isinstance(obj, dict):
|
||||
# 1) If it looks like "file content"
|
||||
content = obj.get("content")
|
||||
encoding = obj.get("encoding")
|
||||
|
||||
# Base64 decode if possible
|
||||
decoded = _maybe_decode_base64(content, encoding)
|
||||
if decoded:
|
||||
return decoded
|
||||
|
||||
# content may already be plain text
|
||||
if isinstance(content, str) and (not isinstance(encoding, str) or not encoding.strip()):
|
||||
# Heuristic: treat as markdown if it has typical markdown chars, otherwise still return
|
||||
return content
|
||||
|
||||
# 2) direct text keys (readme/markdown/text/body/data)
|
||||
for k in _TEXT_KEYS:
|
||||
v = obj.get(k)
|
||||
if isinstance(v, str):
|
||||
@@ -175,7 +157,6 @@ def _extract_text_recursive(obj: Any, depth: int = 0) -> str | None:
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# 3) Sometimes nested under "file" / "result" / "payload" etc.
|
||||
for v in obj.values():
|
||||
out = _extract_text_recursive(v, depth + 1)
|
||||
if out:
@@ -198,7 +179,7 @@ class StaticAssetsView(HomeAssistantView):
     name = "api:bahmcloud_store_static"
     requires_auth = False

-    async def get(self, request: web.Request, path: str) -> web.Response:
+    async def get(self, request: web.Request, path: str) -> web.StreamResponse:
         base = Path(__file__).resolve().parent / "panel"
         base_resolved = base.resolve()
@@ -218,24 +199,7 @@ class StaticAssetsView(HomeAssistantView):
             _LOGGER.error("BCS static asset not found: %s", target)
             return web.Response(status=404)

-        content_type = "text/plain"
-        charset = None
-
-        if target.suffix == ".js":
-            content_type = "application/javascript"
-            charset = "utf-8"
-        elif target.suffix == ".html":
-            content_type = "text/html"
-            charset = "utf-8"
-        elif target.suffix == ".css":
-            content_type = "text/css"
-            charset = "utf-8"
-        elif target.suffix == ".svg":
-            content_type = "image/svg+xml"
-        elif target.suffix == ".png":
-            content_type = "image/png"
-
-        resp = web.Response(body=target.read_bytes(), content_type=content_type, charset=charset)
+        resp = web.FileResponse(path=target)
         resp.headers["Cache-Control"] = "no-store, no-cache, must-revalidate, max-age=0"
         resp.headers["Pragma"] = "no-cache"
         return resp
@@ -247,7 +211,7 @@ class BCSApiView(HomeAssistantView):
     requires_auth = True

     def __init__(self, core: Any) -> None:
-        self.core = core
+        self.core: BCSCore = core

     async def get(self, request: web.Request) -> web.Response:
         return web.json_response(
@@ -255,7 +219,21 @@ class BCSApiView(HomeAssistantView):
         )

     async def post(self, request: web.Request) -> web.Response:
-        data = await request.json()
+        action = request.query.get("action")
+        if action == "refresh":
+            _LOGGER.info("BCS manual refresh triggered via API")
+            try:
+                await self.core.full_refresh(source="manual")
+                return web.json_response({"ok": True})
+            except Exception as e:
+                _LOGGER.error("BCS manual refresh failed: %s", e)
+                return web.json_response({"ok": False, "message": "Refresh failed"}, status=500)
+
+        try:
+            data = await request.json()
+        except Exception:
+            data = {}

         op = data.get("op")

         if op == "add_custom_repo":
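Wrapping `request.json()` in `try/except` means an empty or malformed body degrades to `{}` (and thus a harmless `op = None`) instead of an unhandled exception. The same idea, sketched as a standalone helper (the function is hypothetical, for illustration only):

```python
import json
from typing import Any

def parse_json_body(raw: bytes) -> dict[str, Any]:
    """Tolerant body parsing, as in the post() handler above: a missing or
    malformed JSON body degrades to an empty dict instead of a 500."""
    try:
        data = json.loads(raw.decode("utf-8"))
    except (ValueError, UnicodeDecodeError):
        return {}
    # A valid JSON body that isn't an object (e.g. a list) is also unusable.
    return data if isinstance(data, dict) else {}
```

This keeps the `?action=refresh` fast path body-free: refresh is triggered purely by query string, so clients can POST with no payload at all.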
@@ -276,7 +254,7 @@ class BCSCustomRepoView(HomeAssistantView):
     requires_auth = True

     def __init__(self, core: Any) -> None:
-        self.core = core
+        self.core: BCSCore = core

     async def delete(self, request: web.Request) -> web.Response:
         repo_id = request.query.get("id")
@@ -292,7 +270,7 @@ class BCSReadmeView(HomeAssistantView):
     requires_auth = True

     def __init__(self, core: Any) -> None:
-        self.core = core
+        self.core: BCSCore = core

     async def get(self, request: web.Request) -> web.Response:
         repo_id = request.query.get("repo_id")
@@ -309,8 +287,65 @@ class BCSReadmeView(HomeAssistantView):
                 status=404,
             )

+        # Ensure strict JSON string output (avoid accidental objects)
+        md_str = str(md)
+
+        html = _render_markdown_server_side(md_str)
+        return web.json_response({"ok": True, "readme": md_str, "html": html})
+
+
+class BCSInstallView(HomeAssistantView):
+    url = "/api/bcs/install"
+    name = "api:bcs_install"
+    requires_auth = True
+
+    def __init__(self, core: Any) -> None:
+        self.core: BCSCore = core
+
+    async def post(self, request: web.Request) -> web.Response:
+        repo_id = request.query.get("repo_id")
+        if not repo_id:
+            return web.json_response({"ok": False, "message": "Missing repo_id"}, status=400)
+
+        try:
+            result = await self.core.install_repo(repo_id)
+            return web.json_response(result, status=200)
+        except Exception as e:
+            _LOGGER.exception("BCS install failed: %s", e)
+            return web.json_response({"ok": False, "message": str(e) or "Install failed"}, status=500)
+
+
+class BCSUpdateView(HomeAssistantView):
+    url = "/api/bcs/update"
+    name = "api:bcs_update"
+    requires_auth = True
+
+    def __init__(self, core: Any) -> None:
+        self.core: BCSCore = core
+
+    async def post(self, request: web.Request) -> web.Response:
+        repo_id = request.query.get("repo_id")
+        if not repo_id:
+            return web.json_response({"ok": False, "message": "Missing repo_id"}, status=400)
+
+        try:
+            result = await self.core.update_repo(repo_id)
+            return web.json_response(result, status=200)
+        except Exception as e:
+            _LOGGER.exception("BCS update failed: %s", e)
+            return web.json_response({"ok": False, "message": str(e) or "Update failed"}, status=500)
+
+
+class BCSRestartView(HomeAssistantView):
+    url = "/api/bcs/restart"
+    name = "api:bcs_restart"
+    requires_auth = True
+
+    def __init__(self, core: Any) -> None:
+        self.core: BCSCore = core
+
+    async def post(self, request: web.Request) -> web.Response:
+        try:
+            await self.core.request_restart()
+            return web.json_response({"ok": True})
+        except Exception as e:
+            _LOGGER.exception("BCS restart failed: %s", e)
+            return web.json_response({"ok": False, "message": str(e) or "Restart failed"}, status=500)
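All three new endpoints share the same shape: run a core coroutine, return `{"ok": true, ...}` on success, and on failure log with `_LOGGER.exception` and return `{"ok": false, "message": ...}` with status 500. That envelope can be factored into a helper; a sketch under the assumption that the operation is any awaitable (the helper itself is illustrative, not part of the source):

```python
import asyncio
from typing import Any, Awaitable, Callable

async def run_with_envelope(
    op: Callable[[], Awaitable[Any]], failure_msg: str
) -> tuple[dict[str, Any], int]:
    """Wrap a core coroutine in the {"ok": ...} JSON envelope plus HTTP status
    used by the BCS install/update/restart endpoints above."""
    try:
        await op()
        return {"ok": True}, 200
    except Exception as e:
        # Fall back to a generic message when the exception has no text,
        # mirroring `str(e) or "Restart failed"` in the handlers above.
        return {"ok": False, "message": str(e) or failure_msg}, 500

async def _ok() -> None:
    return None

async def _boom() -> None:
    raise RuntimeError("boom")

ok_result = asyncio.run(run_with_envelope(_ok, "Restart failed"))
err_result = asyncio.run(run_with_envelope(_boom, "Restart failed"))
```

Keeping the envelope uniform means the panel frontend can treat every action button identically: check `ok`, and surface `message` on failure.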