Compare commits
33 Commits
| Author | SHA1 | Date |
|---|---|---|
| | 9f949b3c09 | |
| | 6e4893f59d | |
| | 93adfa6d27 | |
| | 6b3ed99876 | |
| | dd4ff8b8ff | |
| | 1def6e3512 | |
| | b0055e174f | |
| | 698d0ca3f7 | |
| | c6c6276fe6 | |
| | 2916e49973 | |
| | 01bb82fa5a | |
| | 0405c340a3 | |
| | 1da0b26320 | |
| | f140181c45 | |
| | 77921ba6a8 | |
| | 0c1bee31f4 | |
| | da446f5ab0 | |
| | dbc460f0bb | |
| | bb64acab09 | |
| | aa91a427ba | |
| | 7d81e7aff3 | |
| | e3b86d6b4e | |
| | d1f04b8b5e | |
| | 721af9bc83 | |
| | c170ca5433 | |
| | 8651efd578 | |
| | 3511ebd247 | |
| | 64491e19e1 | |
| | 3c84ec43a2 | |
| | 4ec91cb657 | |
| | b7afa284aa | |
| | e6153c881a | |
| | 38e54f0ab2 | |

@@ -4,7 +4,7 @@ venv
logs
sqlStorage
playground
alembic.ini
.DS_Store
messages.pot
activeConfig
__pycache__

@@ -0,0 +1,275 @@
# System Architecture Overview

This document is the single, up-to-date source of information about the platform: architecture, protocols, data, configuration, scenarios, and operations. It supersedes the scattered and outdated documents that preceded it.

## Contents
- Components and topology
- Decentralized layer (membership, network size estimation, replication, metrics)
- Content upload and conversion
- Content viewing and purchase (UI/UX requirements)
- API (key endpoints and payloads)
- Keys and data schemas (DHT)
- Configuration and defaults
- Observability and metrics
- Sequence diagrams (Mermaid)
- Build and testing

---

## Components and topology

- Backend API: a Sanic (Python) service with Telegram bots; PostgreSQL database (SQLAlchemy + Alembic).
- Storage: local filesystem (uploads/derivatives); IPFS (kubo) for retrieval/pinning; tusd for resumable uploads.
- Converters: containerized ffmpeg workers (`convert_v3`, `convert_process`).
- Frontend: SPA (Vite + TypeScript) served by an nginx container.
- Decentralized layer: an in-process DHT handling membership, replica leases, and content metrics.

```mermaid
flowchart LR
Client -- TWA/HTTP --> Frontend
Frontend -- REST --> API[Backend API]
API -- tus hooks --> tusd
API -- SQL --> Postgres
API -- IPC --> Workers[Converters]
API -- IPFS --> IPFS
API -- DHT --> DHT[(In-Process DHT)]
DHT -- CRDT Merge --> DHT
```

---

## Decentralized layer

### Identifiers and versions
- NodeID = blake3(Ed25519 public key), a 256-bit hex string.
- ContentID = blake3(encrypted blob), an immutable content identifier.
- schema_version = v1, recorded in every DHT key/record.
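
A minimal sketch of how these identifiers could be derived, assuming the `blake3` and `PyNaCl` packages; the helper names are illustrative and not taken from the codebase:

```python
from blake3 import blake3
from nacl.signing import SigningKey  # Ed25519 keypair

def node_id_from_pubkey(public_key: bytes) -> str:
    # NodeID = blake3(Ed25519 public key), hex-encoded (256 bits)
    return blake3(public_key).hexdigest()

def content_id_from_blob(encrypted_blob: bytes) -> str:
    # ContentID = blake3(encrypted blob); immutable for the content's lifetime
    return blake3(encrypted_blob).hexdigest()

key = SigningKey.generate()
print(node_id_from_pubkey(bytes(key.verify_key)))
print(content_id_from_blob(b"...encrypted bytes..."))
```
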
### Membership
- Handshake `/api/v1/network.handshake`: the request is signed with Ed25519 and verified on the receiving side. Without a valid signature the node responds with 400 BAD_SIGNATURE.
- The payload includes: node details (version, capabilities, IPFS), metrics, the list of known public nodes, and reachability receipts (`reachability_receipts`: issuer, target, ASN, timestamp, signature).
- Membership state is a CRDT LWW-Set (adds/removes) with a TTL (`DHT_MEMBERSHIP_TTL=600` seconds), plus a HyperLogLog for cardinality estimation (N_local).
- "Island" filtering: nodes with `reachability_ratio < q` (default `q=0.6`) are excluded from the N_estimate computation and from replica selection.
- The final estimate is `N_estimate = max(valid N_local reported by peers)`; a sketch of this computation follows the diagram below.

```mermaid
sequenceDiagram
participant A as Node A
participant B as Node B
A->>B: POST /network.handshake {nonce, ts, node, receipts, signature}
B->>B: verify ts/nonce and signature
B->>B: upsert member; merge(receipts)
B-->>A: {node, known_public_nodes, n_estimate, server_signature}
A->>A: merge; N_estimate = max(N_local, received values)
```
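
A sketch of the island filtering and N_estimate aggregation described above; the report structure is an assumption for illustration, not the actual wire format:

```python
from dataclasses import dataclass

@dataclass
class PeerReport:
    node_id: str
    n_local: int               # peer's own HyperLogLog-based size estimate
    reachability_ratio: float  # share of receipts confirming the peer is reachable

def estimate_network_size(own_n_local: int, reports: list[PeerReport], q: float = 0.6) -> int:
    # Drop "islands": peers whose reachability ratio is below the quorum threshold q.
    valid = [r.n_local for r in reports if r.reachability_ratio >= q]
    # N_estimate = max of our own estimate and the valid peer estimates.
    return max([own_n_local, *valid])

print(estimate_network_size(4, [PeerReport("a", 7, 0.9), PeerReport("b", 50, 0.1)]))  # -> 7
```
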

### Replication and leases
- Prefix selection: `p = max(0, round(log2(N_estimate / R_target)))`, where `R_target >= 3` (default 3).
- Responsible nodes: those whose first `p` bits of NodeID match the first `p` bits of ContentID.
- The leader is the minimal NodeID among the responsible nodes.
- The leader issues `replica_leases` (TTL=600 seconds), enforcing diversity: at least 3 distinct first IP octets and, when available, 3 distinct ASNs.
- Candidates are ranked by the rendezvous score `blake3(ContentID || NodeID)`; a sketch of prefix selection, leader election, and this ranking follows the state diagram below.
- Holders send heartbeats every 60 seconds; after 3 misses a holder is considered down and its lease is reassigned within 180 seconds.
- Under- and over-replication are recorded in the `conflict_log` and in Prometheus metrics.

```mermaid
stateDiagram-v2
[*] --> Discover
Discover: Handshakes + receipts
Discover --> Active: TTL & ASN quorum
Active --> Leader: Leader election for prefix p
Leader --> Leased: Lease issuance (diversity)
Leased --> Monitoring: Heartbeat 60s
Monitoring --> Reassign: 3 misses
Reassign --> Leased
```
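
A compact sketch of prefix selection, responsibility by bit prefix, leader election, and rendezvous ranking, assuming the `blake3` package; the function names are illustrative:

```python
import math
from blake3 import blake3

def prefix_length(n_estimate: int, r_target: int = 3) -> int:
    # p = max(0, round(log2(N_estimate / R_target)))
    return max(0, round(math.log2(max(n_estimate, 1) / r_target)))

def bit_prefix(hex_id: str, p: int) -> str:
    # First p bits of a hex-encoded 256-bit identifier.
    return bin(int(hex_id, 16))[2:].zfill(256)[:p]

def responsible_nodes(content_id: str, node_ids: list[str], p: int) -> list[str]:
    target = bit_prefix(content_id, p)
    return [n for n in node_ids if bit_prefix(n, p) == target]

def rendezvous_rank(content_id: str, node_ids: list[str]) -> list[str]:
    # Rank candidates by blake3(ContentID || NodeID); lower digest first.
    return sorted(node_ids, key=lambda n: blake3(bytes.fromhex(content_id) + bytes.fromhex(n)).hexdigest())

nodes = ["aa" * 32, "0b" * 32, "f1" * 32]
p = prefix_length(n_estimate=24, r_target=3)   # -> 3
resp = responsible_nodes("0c" * 32, nodes, p)  # nodes sharing the 3-bit prefix
leader = min(resp) if resp else None           # minimal NodeID among responsible nodes
print(p, resp, leader, rendezvous_rank("0c" * 32, nodes))
```
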

### Metrics (windows)
- On a view event, CRDT deltas are produced:
  - a PN-Counter for the view count;
  - a HyperLogLog for unique ViewIDs (ViewID = blake3(ContentID || device_salt));
  - G-Counters for watch_time, bytes_out, and completion count.
- Windows are hourly (`DHT_METRIC_WINDOW_SEC`), keyed by `MetricKey = blake3(ContentID || WindowID)`.
- Merges are commutative and deterministic; a sketch of the window keying and a G-Counter merge follows.
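
A sketch of window keying and a G-Counter merge, assuming the `blake3` package; the exact key encoding is an assumption:

```python
import time
from blake3 import blake3

WINDOW_SEC = 3600  # DHT_METRIC_WINDOW_SEC

def metric_key(content_id: str, ts: float | None = None) -> str:
    # MetricKey = blake3(ContentID || WindowID), one key per hourly window.
    window_id = int((ts or time.time()) // WINDOW_SEC)
    return blake3(bytes.fromhex(content_id) + window_id.to_bytes(8, "big")).hexdigest()

def g_counter_merge(a: dict[str, int], b: dict[str, int]) -> dict[str, int]:
    # Per-node maxima: commutative, associative, idempotent (e.g. watch_time per node).
    return {node: max(a.get(node, 0), b.get(node, 0)) for node in a.keys() | b.keys()}

print(metric_key("ab" * 32))
print(g_counter_merge({"node1": 120}, {"node1": 90, "node2": 40}))  # {'node1': 120, 'node2': 40}
```
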

---

## Content upload and conversion

1) The client uploads to `tusd` (resumable). The backend receives HTTP hooks at `/api/v1/upload.tus-hook`.
2) A database record is created for the encrypted content, and the workers produce derivatives:
   - for media: preview/low/high;
   - for binaries: the original (available only with a license).
3) `/api/v1/content.view` returns `display_options` and the aggregated conversion/upload state.

```mermaid
sequenceDiagram
participant C as Client
participant T as tusd
participant B as Backend
participant W as Workers
participant DB as PostgreSQL

C->>T: upload
T->>B: hooks (pre/post-finish)
B->>DB: create content
B->>W: conversion queue
W->>DB: derive/previews
C->>B: GET /content.view
B->>DB: resolve derivatives
B-->>C: display_options + status
```

---

## Viewing and purchase (UI/UX)

- `/api/v1/content.view/<content_address>` determines the available renditions:
  - binary content without a preview: the original is served only when a license is present;
  - audio/video: preview/low for unauthorized users, decrypted_low/high for users with access.
- While conversion is in progress, the frontend shows a "processing" status and never fake links.
- Cover image:
  - a fixed square slot; the image is fitted without stretching or distortion;
  - empty areas are not filled with black; the background matches the page background.
- The "Buy with TON/Stars" buttons always stay on one line (no horizontal or vertical scrolling of the content on small screens).
- The selection logic is summarized in the flowchart and in the sketch below.

```mermaid
flowchart LR
View[content.view] --> Resolve[Resolve derivatives]
Resolve --> Ready{Ready?}
Ready -- No --> Info[Status: processing/pending]
Ready -- Yes --> Options
Options -- Binary + no license --> HideOriginal[Hide original]
Options -- Media + no license --> PreviewLow[preview/low]
Options -- Has license --> Decrypted[decrypted low/high|original]
```
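
A sketch of the selection rule from the flowchart above; the option names mirror the derivative kinds mentioned in this document, and the function is illustrative rather than the actual backend implementation:

```python
def pick_display_option(content_kind: str, has_license: bool, ready: bool) -> str | None:
    """Return which rendition to expose, or None when the original must stay hidden."""
    if not ready:
        return "processing"           # conversion still running: show status, no links
    if has_license:
        return "decrypted_low/high" if content_kind in ("audio", "video") else "original"
    if content_kind in ("audio", "video"):
        return "preview/low"          # unauthorized users get the preview renditions
    return None                       # binary without a license: hide the original

assert pick_display_option("binary", has_license=False, ready=True) is None
assert pick_display_option("audio", has_license=False, ready=True) == "preview/low"
```
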

---

## API (key endpoints)

- `GET /api/system.version`: service freshness check.
- `POST /api/v1/network.handshake`: membership exchange (the request must carry an Ed25519 signature). Example request:

```json
{
  "version": "3.0.0",
  "schema_version": "v1",
  "public_key": "<base58 ed25519 pubkey>",
  "node_id": "<blake3(pubkey)>",
  "public_host": "https://node.example",
  "node_type": "public|private",
  "metrics": {"uptime_sec": 123, "content_count": 42},
  "capabilities": {"accepts_inbound": true, "is_bootstrap": false},
  "ipfs": {"multiaddrs": ["/ip4/.../tcp/4001"], "peer_id": "..."},
  "known_public_nodes": [],
  "reachability_receipts": [],
  "timestamp": 1710000000,
  "nonce": "<hex>",
  "signature": "<base58 ed25519 signature>"
}
```
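
A hedged sketch of how a client might produce the `signature` field, assuming `PyNaCl` and `base58`; the canonicalization (signing the body without the `signature` key, sorted and compact) is an assumption, not confirmed by this document:

```python
import json
from base58 import b58encode
from nacl.signing import SigningKey

def sign_handshake(payload: dict, signing_key: SigningKey) -> dict:
    body = {k: v for k, v in payload.items() if k != "signature"}
    message = json.dumps(body, sort_keys=True, separators=(",", ":")).encode()
    signature = signing_key.sign(message).signature  # detached Ed25519 signature
    return {**body, "signature": b58encode(signature).decode()}
```
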

- `GET /api/v1/content.view/<content_address>`: returns `display_options`, `status`, `conversion`.
- `GET /api/v1.5/storage/<file_hash>`: serves a file.
- `GET /metrics`: Prometheus metrics exposition (or a fallback counter dump).

---

## DHT keys and schemas

- `MetaKey(content_id)`: replication metadata:
  - `replica_leases`: map `{lease_id -> {node_id, issued_at, expires_at, asn, ip_first_octet, heartbeat_at, score}}`;
  - `leader`: the leader's NodeID; `revision`: revision number;
  - `conflict_log`: an array of events such as `UNDER/OVER/LEASE_EXPIRED`.

- `MembershipKey(node_id)`: membership:
  - `members`: LWW-Set; `receipts`: LWW-Set;
  - `hll`: HyperLogLog; `reports`: maps of local N estimates;
  - `logical_counter`: logical counter used for LWW domination.

- `MetricKey(content_id, window_id)`: window metrics:
  - `views`: PN-Counter; `unique`: HLL; `watch_time`, `bytes_out`, `completions`: G-Counters.

All records are signed and merged deterministically: CRDT logic plus LWW domination over (`logical_counter`, `timestamp`, `node_id`); a sketch of the domination rule follows.
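
A minimal sketch of the LWW domination rule over `(logical_counter, timestamp, node_id)`; the record shape is simplified for illustration:

```python
from dataclasses import dataclass

@dataclass
class DhtRecord:
    logical_counter: int
    timestamp: float
    node_id: str
    value: dict

def lww_merge(a: DhtRecord, b: DhtRecord) -> DhtRecord:
    # Deterministic winner: compare the tuple in priority order, so every
    # node converges to the same record regardless of merge order.
    return max(a, b, key=lambda r: (r.logical_counter, r.timestamp, r.node_id))

old = DhtRecord(1, 100.0, "aa", {"leader": "aa"})
new = DhtRecord(2, 90.0, "bb", {"leader": "bb"})
assert lww_merge(old, new) is lww_merge(new, old) is new  # logical_counter dominates
```
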

---

## Configuration and defaults

- Network/handshakes: `NODE_PRIVACY`, `PUBLIC_HOST`, `HANDSHAKE_INTERVAL_SEC`, `NETWORK_TLS_VERIFY`, IPFS peers/bootstraps.
- DHT (collected in the `.env` sketch below):
  - `DHT_MIN_RECEIPTS=5`, `DHT_MIN_REACHABILITY=0.6`, `DHT_MEMBERSHIP_TTL=600`;
  - `DHT_REPLICATION_TARGET=3`, `DHT_LEASE_TTL=600`;
  - `DHT_HEARTBEAT_INTERVAL=60`, `DHT_HEARTBEAT_MISS_THRESHOLD=3`;
  - `DHT_MIN_ASN=3`, `DHT_MIN_IP_OCTETS=3`;
  - `DHT_METRIC_WINDOW_SEC=3600`.
- Conversion: `CONVERT_*` quotas, `MAX_CONTENT_SIZE_MB`.

Note: PoW admission and Kademlia k-buckets are not active in the code at this stage; they are part of the design and can be implemented separately.
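
An illustrative `.env` fragment collecting the DHT defaults listed above (names and values as documented here; whether every variable is read in a given deployment is an assumption):

```bash
# DHT defaults (illustrative .env fragment)
DHT_MIN_RECEIPTS=5
DHT_MIN_REACHABILITY=0.6
DHT_MEMBERSHIP_TTL=600
DHT_REPLICATION_TARGET=3
DHT_LEASE_TTL=600
DHT_HEARTBEAT_INTERVAL=60
DHT_HEARTBEAT_MISS_THRESHOLD=3
DHT_MIN_ASN=3
DHT_MIN_IP_OCTETS=3
DHT_METRIC_WINDOW_SEC=3600
```
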

---

## Observability and metrics

Prometheus (declared in the sketch below):
- `dht_replication_under_total`, `dht_replication_over_total`, `dht_leader_changes_total`;
- `dht_merge_conflicts_total`;
- `dht_view_count_total`, `dht_unique_view_estimate`, `dht_watch_time_seconds`.

Logs: structured HTTP errors (with ids), the per-replication `conflict_log`, and node registration events.
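
A sketch of how these series could be declared with `prometheus_client`; the metric names are the ones listed above, while the registration style and label-free form are assumptions:

```python
from prometheus_client import Counter, Gauge

DHT_REPLICATION_UNDER = Counter("dht_replication_under_total", "Under-replication events detected")
DHT_REPLICATION_OVER = Counter("dht_replication_over_total", "Over-replication events detected")
DHT_LEADER_CHANGES = Counter("dht_leader_changes_total", "Leader changes per content prefix")
DHT_MERGE_CONFLICTS = Counter("dht_merge_conflicts_total", "CRDT merge conflicts resolved")
DHT_VIEW_COUNT = Counter("dht_view_count_total", "Total recorded views")
DHT_UNIQUE_VIEWS = Gauge("dht_unique_view_estimate", "HyperLogLog estimate of unique viewers")
DHT_WATCH_TIME = Counter("dht_watch_time_seconds", "Accumulated watch time in seconds")

DHT_VIEW_COUNT.inc()        # on a view event
DHT_WATCH_TIME.inc(42.0)    # add watched seconds
DHT_UNIQUE_VIEWS.set(1234)  # refresh the HLL estimate
```
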

---

## Sequence diagrams (summary)

### N_estimate update
```mermaid
sequenceDiagram
participant Peer
participant Membership
participant DHT
Peer->>Membership: handshake(payload, receipts)
Membership->>Membership: merge LWW/receipts
Membership->>Membership: update HLL and N_local
Membership->>DHT: persist MembershipKey
Membership->>Membership: N_estimate = max(valid reports)
```

### Leader election and lease issuance
```mermaid
sequenceDiagram
participant L as Leader
participant R as Responsible
L->>L: p = round(log2(N_est/R))
L->>R: rank by rendezvous(ContentID, NodeID)
L->>L: assign leases (diversity)
R-->>L: heartbeat/60s
L->>L: reassign on 3 misses
```

### Publishing window metrics
```mermaid
sequenceDiagram
participant C as Client
participant API as Backend
participant M as Metrics
participant D as DHT
C->>API: GET content.view?watch_time,bytes_out
API->>M: record_view(delta)
M->>D: merge MetricKey(ContentID, window)
API-->>Prom: /metrics
```

---

## Build and testing

```bash
# Start the environment (example for /home/configs)
docker compose -f /home/configs/docker-compose.yml --env-file /home/configs/.env up -d --build

# DHT layer tests
cd uploader-bot
python3 -m unittest discover -s tests/dht
```

@@ -1,5 +1,14 @@
# Sanic Telegram Bot [template]

Full system documentation (architecture, protocols, configuration, diagrams): see `ARCHITECTURE.md`.

### Running the DHT integration tests

```shell
cd uploader-bot
python3 -m unittest discover -s tests/dht
```

---
## Run
```shell

@@ -0,0 +1,35 @@
[alembic]
script_location = alembic
sqlalchemy.url = ${DATABASE_URL}

[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s

@@ -1,3 +1,4 @@
import os
from logging.config import fileConfig

from sqlalchemy import engine_from_config

@@ -7,6 +8,10 @@ from alembic import context

config = context.config

database_url = os.environ.get("DATABASE_URL")
if database_url:
    config.set_main_option("sqlalchemy.url", database_url)

if config.config_file_name is not None:
    fileConfig(config.config_file_name)

@@ -0,0 +1,26 @@
"""add artist column to encrypted content

Revision ID: b1f2d3c4a5b6
Revises: a7c1357e8d15
Create Date: 2024-06-05 00:00:00.000000

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision: str = 'b1f2d3c4a5b6'
down_revision: Union[str, None] = 'a7c1357e8d15'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    op.add_column('encrypted_contents', sa.Column('artist', sa.String(length=512), nullable=True))


def downgrade() -> None:
    op.drop_column('encrypted_contents', 'artist')

@@ -0,0 +1,38 @@
"""expand telegram_id precision on stars invoices

Revision ID: c2d4e6f8a1b2
Revises: b1f2d3c4a5b6
Create Date: 2025-10-17 00:00:00.000000

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision: str = 'c2d4e6f8a1b2'
down_revision: Union[str, None] = 'b1f2d3c4a5b6'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    op.alter_column(
        'stars_invoices',
        'telegram_id',
        existing_type=sa.Integer(),
        type_=sa.BigInteger(),
        existing_nullable=True,
    )


def downgrade() -> None:
    op.alter_column(
        'stars_invoices',
        'telegram_id',
        existing_type=sa.BigInteger(),
        type_=sa.Integer(),
        existing_nullable=True,
    )

@@ -0,0 +1,70 @@
"""create dht_records and rdap_cache tables

Revision ID: d3e5f7a9c0d1
Revises: c2d4e6f8a1b2
Create Date: 2025-10-22 00:00:00.000000

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision: str = 'd3e5f7a9c0d1'
down_revision: Union[str, None] = 'c2d4e6f8a1b2'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    bind = op.get_bind()
    inspector = sa.inspect(bind)

    # dht_records
    if not inspector.has_table('dht_records'):
        op.create_table(
            'dht_records',
            sa.Column('fingerprint', sa.String(length=128), primary_key=True),
            sa.Column('key', sa.String(length=512), nullable=False),
            sa.Column('schema_version', sa.String(length=16), nullable=False, server_default='v1'),
            sa.Column('logical_counter', sa.Integer(), nullable=False, server_default='0'),
            sa.Column('timestamp', sa.Float(), nullable=False, server_default='0'),
            sa.Column('node_id', sa.String(length=128), nullable=False),
            sa.Column('signature', sa.String(length=512), nullable=True),
            sa.Column('value', sa.JSON(), nullable=False, server_default=sa.text("'{}'::jsonb")),
            sa.Column('updated_at', sa.DateTime(), nullable=False, server_default=sa.text('CURRENT_TIMESTAMP')),
        )
    # ensure index exists (but don't fail if it already exists)
    try:
        existing_indexes = {idx['name'] for idx in inspector.get_indexes('dht_records')}
    except Exception:
        existing_indexes = set()
    if 'ix_dht_records_key' not in existing_indexes:
        op.create_index('ix_dht_records_key', 'dht_records', ['key'])

    # rdap_cache
    if not inspector.has_table('rdap_cache'):
        op.create_table(
            'rdap_cache',
            sa.Column('ip', sa.String(length=64), primary_key=True),
            sa.Column('asn', sa.Integer(), nullable=True),
            sa.Column('source', sa.String(length=64), nullable=True),
            sa.Column('updated_at', sa.DateTime(), nullable=False, server_default=sa.text('CURRENT_TIMESTAMP')),
        )


def downgrade() -> None:
    try:
        op.drop_table('rdap_cache')
    except Exception:
        pass
    try:
        op.drop_index('ix_dht_records_key', table_name='dht_records')
    except Exception:
        pass
    try:
        op.drop_table('dht_records')
    except Exception:
        pass

@@ -25,7 +25,32 @@ if int(os.getenv("SANIC_MAINTENANCE", '0')) == 1:
    while True:
        time.sleep(1)

from app.core.models import Memory

def init_db_schema_sync() -> None:
    """Initialise all SQLAlchemy models in the database before services start.

    This ensures that every table defined on AlchemyBase.metadata (including
    newer ones like DHT and service_config) exists before any component
    accesses the database.
    """
    try:
        from sqlalchemy import create_engine
        from app.core.models import AlchemyBase  # imports all models and populates metadata

        db_url = os.environ.get('DATABASE_URL')
        if not db_url:
            raise RuntimeError('DATABASE_URL is not set')

        # Normalise DSN to sync driver for schema creation
        if '+asyncpg' in db_url:
            db_url_sync = db_url.replace('+asyncpg', '+psycopg2')
        else:
            db_url_sync = db_url

        sync_engine = create_engine(db_url_sync, pool_pre_ping=True)
        AlchemyBase.metadata.create_all(sync_engine)
    except Exception as e:
        make_log('Startup', f'DB sync init failed: {e}', level='error')


async def queue_daemon(app):

@@ -78,37 +103,22 @@ async def execute_queue(app):


if __name__ == '__main__':
    # Ensure DB schema is fully initialised for all models
    init_db_schema_sync()

    from app.core.models import Memory
    main_memory = Memory()
    if startup_target == '__main__':
        # Defer heavy imports to avoid side effects in background services
        # Mark this process as the primary node for seeding/config init
        os.environ.setdefault('NODE_ROLE', 'primary')
        # Create DB tables synchronously before importing HTTP app to satisfy _secrets
        try:
            from sqlalchemy import create_engine
            from app.core.models import AlchemyBase  # imports all models
            db_url = os.environ.get('DATABASE_URL')
            if not db_url:
                raise RuntimeError('DATABASE_URL is not set')
            # Normalize to sync driver
            if '+asyncpg' in db_url:
                db_url_sync = db_url.replace('+asyncpg', '+psycopg2')
            else:
                db_url_sync = db_url
            sync_engine = create_engine(db_url_sync, pool_pre_ping=True)
            AlchemyBase.metadata.create_all(sync_engine)
        except Exception as e:
            make_log('Startup', f'DB sync init failed: {e}', level='error')
        from app.api import app
        from app.bot import dp as uploader_bot_dp
        from app.client_bot import dp as client_bot_dp
        # Delay aiogram dispatcher creation until loop is running
        from app.core._config import SANIC_PORT, PROJECT_HOST, DATABASE_URL
        from app.core.network.nodes import network_handshake_daemon, bootstrap_once_and_exit_if_failed
        from app.core.network.maintenance import replication_daemon, heartbeat_daemon, dht_gossip_daemon

        app.ctx.memory = main_memory
        for _target in [uploader_bot_dp, client_bot_dp]:
            _target._s_memory = app.ctx.memory

        app.ctx.memory._app = app

        # Ensure DB schema exists using the same event loop as Sanic (idempotent)

@@ -116,11 +126,28 @@ if __name__ == '__main__':

        app.add_task(execute_queue(app))
        app.add_task(queue_daemon(app))
        app.add_task(uploader_bot_dp.start_polling(app.ctx.memory._telegram_bot))
        app.add_task(client_bot_dp.start_polling(app.ctx.memory._client_telegram_bot))
        # Start bots after loop is ready
        async def _start_bots():
            try:
                from app.bot import create_dispatcher as create_uploader_dp
                from app.client_bot import create_dispatcher as create_client_dp
                uploader_bot_dp = create_uploader_dp()
                client_bot_dp = create_client_dp()
                for _target in [uploader_bot_dp, client_bot_dp]:
                    _target._s_memory = app.ctx.memory
                await asyncio.gather(
                    uploader_bot_dp.start_polling(app.ctx.memory._telegram_bot),
                    client_bot_dp.start_polling(app.ctx.memory._client_telegram_bot),
                )
            except Exception as e:
                make_log('Bots', f'Failed to start bots: {e}', level='error')
        app.add_task(_start_bots())
        # Start network handshake daemon and bootstrap step
        app.add_task(network_handshake_daemon(app))
        app.add_task(bootstrap_once_and_exit_if_failed())
        app.add_task(replication_daemon(app))
        app.add_task(heartbeat_daemon(app))
        app.add_task(dht_gossip_daemon(app))

        app.run(host='0.0.0.0', port=SANIC_PORT)
    else:

@@ -151,6 +178,9 @@ if __name__ == '__main__':
    elif startup_target == 'derivative_janitor':
        from app.core.background.derivative_cache_janitor import main_fn as target_fn
        time.sleep(5)
    elif startup_target == 'events_sync':
        from app.core.background.event_sync_service import main_fn as target_fn
        time.sleep(5)

    startup_fn = startup_fn or target_fn
    assert startup_fn

@@ -163,7 +193,11 @@ if __name__ == '__main__':
                 level='error')
        sys.exit(1)

    loop = asyncio.get_event_loop()
    try:
        loop = asyncio.get_event_loop()
    except RuntimeError:
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
    try:
        # Background services no longer perform schema initialization
        loop.run_until_complete(wrapped_startup_fn(main_memory))

@@ -20,11 +20,12 @@ from app.api.routes.network import (
    s_api_v1_network_nodes,
    s_api_v1_network_handshake,
)
from app.api.routes.network_events import s_api_v1_network_events
from app.api.routes.auth import s_api_v1_auth_twa, s_api_v1_auth_select_wallet, s_api_v1_auth_me
from app.api.routes.statics import s_api_tonconnect_manifest, s_api_platform_metadata
from app.api.routes.node_storage import s_api_v1_storage_post, s_api_v1_storage_get, \
    s_api_v1_storage_decode_cid
from app.api.routes.progressive_storage import s_api_v1_5_storage_get, s_api_v1_5_storage_post
from app.api.routes.progressive_storage import s_api_v1_5_storage_get, s_api_v1_5_storage_post, s_api_v1_storage_fetch, s_api_v1_storage_proxy
from app.api.routes.upload_tus import s_api_v1_upload_tus_hook
from app.api.routes.account import s_api_v1_account_get
from app.api.routes._blockchain import s_api_v1_blockchain_send_new_content_message, \

@@ -33,17 +34,34 @@ from app.api.routes.content import s_api_v1_content_list, s_api_v1_content_view,
from app.api.routes.content_index import s_api_v1_content_index, s_api_v1_content_delta
from app.api.routes.derivatives import s_api_v1_content_derivatives
from app.api.routes.admin import (
    s_api_v1_admin_blockchain,
    s_api_v1_admin_cache_cleanup,
    s_api_v1_admin_cache_setlimits,
    s_api_v1_admin_events,
    s_api_v1_admin_licenses,
    s_api_v1_admin_login,
    s_api_v1_admin_logout,
    s_api_v1_admin_users_setadmin,
    s_api_v1_admin_node_setrole,
    s_api_v1_admin_nodes,
    s_api_v1_admin_overview,
    s_api_v1_admin_stars,
    s_api_v1_admin_status,
    s_api_v1_admin_cache_setlimits,
    s_api_v1_admin_cache_cleanup,
    s_api_v1_admin_storage,
    s_api_v1_admin_sync_setlimits,
    s_api_v1_admin_system,
    s_api_v1_admin_uploads,
    s_api_v1_admin_users,
    s_api_v1_admin_network,
    s_api_v1_admin_network_config,
    s_api_v1_admin_network_config_set,
)
from app.api.routes.tonconnect import s_api_v1_tonconnect_new, s_api_v1_tonconnect_logout
from app.api.routes.keys import s_api_v1_keys_request
from app.api.routes.sync import s_api_v1_sync_pin, s_api_v1_sync_status
from app.api.routes.upload_status import s_api_v1_upload_status
from app.api.routes.metrics import s_api_metrics
from app.api.routes.dht import s_api_v1_dht_get, s_api_v1_dht_put


app.add_route(s_index, "/", methods=["GET", "OPTIONS"])

@@ -56,6 +74,7 @@ app.add_route(s_api_system_send_status, "/api/system.sendStatus", methods=["POST
app.add_route(s_api_v1_network_info, "/api/v1/network.info", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_network_nodes, "/api/v1/network.nodes", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_network_handshake, "/api/v1/network.handshake", methods=["POST", "OPTIONS"])
app.add_route(s_api_v1_network_events, "/api/v1/network.events", methods=["GET", "OPTIONS"])

app.add_route(s_api_tonconnect_manifest, "/api/tonconnect-manifest.json", methods=["GET", "OPTIONS"])
app.add_route(s_api_platform_metadata, "/api/platform-metadata.json", methods=["GET", "OPTIONS"])

@@ -69,6 +88,8 @@ app.add_route(s_api_v1_tonconnect_logout, "/api/v1/tonconnect.logout", methods=[

app.add_route(s_api_v1_5_storage_post, "/api/v1.5/storage", methods=["POST", "OPTIONS"])
app.add_route(s_api_v1_5_storage_get, "/api/v1.5/storage/<file_hash>", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_storage_fetch, "/api/v1/storage.fetch/<file_hash>", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_storage_proxy, "/api/v1/storage.proxy/<file_hash>", methods=["GET", "OPTIONS"])

app.add_route(s_api_v1_storage_post, "/api/v1/storage", methods=["POST", "OPTIONS"])
app.add_route(s_api_v1_storage_get, "/api/v1/storage/<file_hash>", methods=["GET", "OPTIONS"])

@@ -86,12 +107,27 @@ app.add_route(s_api_v1_5_content_list, "/api/v1.5/content.list", methods=["GET",
app.add_route(s_api_v1_content_index, "/api/v1/content.index", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_content_delta, "/api/v1/content.delta", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_content_derivatives, "/api/v1/content.derivatives", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_admin_login, "/api/v1/admin.login", methods=["POST", "OPTIONS"])
app.add_route(s_api_v1_admin_logout, "/api/v1/admin.logout", methods=["POST", "OPTIONS"])
app.add_route(s_api_v1_admin_overview, "/api/v1/admin.overview", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_admin_storage, "/api/v1/admin.storage", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_admin_uploads, "/api/v1/admin.uploads", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_admin_users, "/api/v1/admin.users", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_admin_users_setadmin, "/api/v1/admin.users.setAdmin", methods=["POST", "OPTIONS"])
app.add_route(s_api_v1_admin_licenses, "/api/v1/admin.licenses", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_admin_stars, "/api/v1/admin.stars", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_admin_events, "/api/v1/admin.events", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_admin_system, "/api/v1/admin.system", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_admin_blockchain, "/api/v1/admin.blockchain", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_admin_node_setrole, "/api/v1/admin.node.setRole", methods=["POST", "OPTIONS"])
app.add_route(s_api_v1_admin_nodes, "/api/v1/admin.nodes", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_admin_status, "/api/v1/admin.status", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_admin_cache_setlimits, "/api/v1/admin.cache.setLimits", methods=["POST", "OPTIONS"])
app.add_route(s_api_v1_admin_cache_cleanup, "/api/v1/admin.cache.cleanup", methods=["POST", "OPTIONS"])
app.add_route(s_api_v1_admin_sync_setlimits, "/api/v1/admin.sync.setLimits", methods=["POST", "OPTIONS"])
app.add_route(s_api_v1_admin_network, "/api/v1/admin.network", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_admin_network_config, "/api/v1/admin.network.config", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_admin_network_config_set, "/api/v1/admin.network.config.set", methods=["POST", "OPTIONS"])

# tusd HTTP hooks
app.add_route(s_api_v1_upload_tus_hook, "/api/v1/upload.tus-hook", methods=["POST", "OPTIONS"])

@@ -101,6 +137,9 @@ app.add_route(s_api_v1_keys_request, "/api/v1/keys.request", methods=["POST", "O
app.add_route(s_api_v1_sync_pin, "/api/v1/sync.pin", methods=["POST", "OPTIONS"])
app.add_route(s_api_v1_sync_status, "/api/v1/sync.status", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_upload_status, "/api/v1/upload.status/<upload_id>", methods=["GET", "OPTIONS"])
app.add_route(s_api_metrics, "/metrics", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_dht_get, "/api/v1/dht.get", methods=["GET", "OPTIONS"])
app.add_route(s_api_v1_dht_put, "/api/v1/dht.put", methods=["POST", "OPTIONS"])


@app.exception(BaseException)

@@ -156,10 +195,4 @@ async def s_handle_exception(request, exception):

    response_buffer = response.json(payload, status=status)
    response_buffer = await close_db_session(request, response_buffer)
    response_buffer.headers["Access-Control-Allow-Origin"] = "*"
    response_buffer.headers["Access-Control-Allow-Methods"] = "GET, POST, OPTIONS"
    response_buffer.headers["Access-Control-Allow-Headers"] = "Origin, Content-Type, Accept, Authorization, Referer, User-Agent, Sec-Fetch-Dest, Sec-Fetch-Mode, Sec-Fetch-Site, x-request-id"
    response_buffer.headers["Access-Control-Allow-Credentials"] = "true"
    response_buffer.headers["X-Session-Id"] = session_id
    response_buffer.headers["X-Error-Id"] = error_id
    return response_buffer

Binary file not shown.
Binary file not shown.

@@ -1,3 +1,4 @@
import os
from base58 import b58decode
from sanic import response as sanic_response
from uuid import uuid4

@@ -17,20 +18,30 @@ from app.core.log_context import (
)


ENABLE_INTERNAL_CORS = os.getenv("ENABLE_INTERNAL_CORS", "1").lower() in {"1", "true", "yes"}


def attach_headers(response, request=None):
    response.headers.pop("Access-Control-Allow-Origin", None)
    response.headers.pop("Access-Control-Allow-Methods", None)
    response.headers.pop("Access-Control-Allow-Headers", None)
    response.headers.pop("Access-Control-Allow-Credentials", None)

    if not ENABLE_INTERNAL_CORS:
        return response

    response.headers["Access-Control-Allow-Origin"] = "*"
    response.headers["Access-Control-Allow-Methods"] = "GET, POST, OPTIONS"
    response.headers["Access-Control-Allow-Headers"] = "Origin, Content-Type, Accept, Authorization, Referer, User-Agent, Sec-Fetch-Dest, Sec-Fetch-Mode, Sec-Fetch-Site, x-file-name, x-last-chunk, x-chunk-start, x-upload-id, x-request-id"
    # response.headers["Access-Control-Allow-Credentials"] = "true"
    try:
        sid = getattr(request.ctx, 'session_id', None) if request else None
        if sid:
            response.headers["X-Session-Id"] = sid
    except BaseException:
        pass
    response.headers["Access-Control-Allow-Methods"] = "GET, POST, OPTIONS, PATCH, HEAD"
    response.headers["Access-Control-Allow-Headers"] = (
        "Origin, Content-Type, Accept, Authorization, Referer, User-Agent, Sec-Fetch-Dest, Sec-Fetch-Mode, "
        "Sec-Fetch-Site, Tus-Resumable, tus-resumable, Upload-Length, upload-length, Upload-Offset, upload-offset, "
        "Upload-Metadata, upload-metadata, Upload-Defer-Length, upload-defer-length, Upload-Concat, upload-concat, "
        "x-file-name, x-last-chunk, x-chunk-start, x-upload-id, x-request-id"
    )
    return response



async def try_authorization(request):
    token = request.headers.get("Authorization")
    if not token:

@@ -200,6 +211,8 @@ async def close_request_handler(request, response):
    if request.method == 'OPTIONS':
        response = sanic_response.text("OK")

    response = attach_headers(response, request)

    try:
        await request.ctx.db_session.close()
    except BaseException:

@@ -214,14 +227,11 @@ async def close_request_handler(request, response):
    except BaseException:
        pass

    response = attach_headers(response, request)

    return request, response


async def close_db_session(request, response):
    request, response = await close_request_handler(request, response)
    response = attach_headers(response, request)
    # Clear contextvars
    try:
        ctx_session_id.set(None)

Binary files not shown (21 files).

@@ -7,7 +7,6 @@ from sqlalchemy import and_, select, func
from tonsdk.boc import begin_cell, begin_dict
from tonsdk.utils import Address

from base58 import b58encode
from app.core._blockchain.ton.connect import TonConnect, wallet_obj_by_name
from app.core._blockchain.ton.platform import platform
from app.core._config import PROJECT_HOST

@@ -57,27 +56,31 @@ async def s_api_v1_blockchain_send_new_content_message(request):
        assert field_key in request.json, f"No {field_key} provided"
        assert field_value(request.json[field_key]), f"Invalid {field_key} provided"

        artist = request.json.get('artist')
        if artist is not None:
            assert isinstance(artist, str), "Invalid artist provided"
            artist = artist.strip()
            if artist == "":
                artist = None
        else:
            artist = None

        # Support legacy: 'content' as decrypted ContentId; and new: 'content' as encrypted IPFS CID
        source_content_cid, cid_err = resolve_content(request.json['content'])
        assert not cid_err, f"Invalid content CID provided: {cid_err}"

        encrypted_content_cid = None
        try:
            # Legacy path
            decrypted_content_cid, err = resolve_content(request.json['content'])
            assert not err
            decrypted_content = (await request.ctx.db_session.execute(
                select(StoredContent).where(StoredContent.hash == decrypted_content_cid.content_hash_b58)
            )).scalars().first()
            assert decrypted_content and decrypted_content.type == "local/content_bin"
        decrypted_content = (await request.ctx.db_session.execute(
            select(StoredContent).where(StoredContent.hash == source_content_cid.content_hash_b58)
        )).scalars().first()

        if decrypted_content and decrypted_content.type == "local/content_bin":
            encrypted_content = await create_encrypted_content(request.ctx.db_session, decrypted_content)
            encrypted_content_cid = encrypted_content.cid
        except BaseException:
            # New path: treat provided string as encrypted IPFS CID (ENCF v1)
            encrypted_ipfs_cid = request.json['content']
            class _EC:  # tiny adapter to mimic .serialize_v2()
                def __init__(self, s: str):
                    self._s = s
                def serialize_v2(self, include_accept_type: bool = False):
                    return self._s
            encrypted_content_cid = _EC(encrypted_ipfs_cid)
        elif source_content_cid.cid_format == 'ipfs':
            encrypted_content_cid = source_content_cid
        else:
            raise AssertionError("Provided content is neither locally available nor a valid encrypted CID")

        if request.json['image']:
            image_content_cid, err = resolve_content(request.json['image'])

@@ -91,11 +94,16 @@ async def s_api_v1_blockchain_send_new_content_message(request):
            image_content = None


        content_title = f"{', '.join(request.json['authors'])} – {request.json['title']}" if request.json['authors'] else request.json['title']
        content_title = request.json['title']
        if artist:
            content_title = f"{artist} – {content_title}"
        elif request.json['authors']:
            content_title = f"{', '.join(request.json['authors'])} – {request.json['title']}"

        metadata_content = await create_metadata_for_item(
            request.ctx.db_session,
            title=content_title,
            title=request.json['title'],
            artist=artist,
            cover_url=f"{PROJECT_HOST}/api/v1.5/storage/{image_content_cid.serialize_v2()}" if image_content_cid else None,
            authors=request.json['authors'],
            hashtags=request.json['hashtags'],

@@ -152,7 +160,7 @@ async def s_api_v1_blockchain_send_new_content_message(request):
            user_id = str(request.ctx.user.id),
            user_internal_id=request.ctx.user.id,
            action_type='freeUpload',
            action_ref=str(encrypted_content_cid.content_hash),
            action_ref=encrypted_content_cid.serialize_v2(),
            created=datetime.now()
        )
        request.ctx.db_session.add(promo_action)

@@ -209,7 +217,7 @@ async def s_api_v1_blockchain_send_new_content_message(request):
                title=content_title,
                free_count=(promo_free_upload_available - 1)
            ), message_type='hint', message_meta={
                'encrypted_content_hash': b58encode(encrypted_content_cid.content_hash).decode(),
                'encrypted_content_hash': encrypted_content_cid.content_hash_b58,
                'hint_type': 'uploadContentTxRequested'
            }
        )

@@ -219,54 +227,59 @@ async def s_api_v1_blockchain_send_new_content_message(request):
                'payload': ""
            })

        user_wallet_address = await request.ctx.user.wallet_address_async(request.ctx.db_session)
        assert user_wallet_address, "Wallet address is not linked"

        await request.ctx.user_uploader_wrapper.send_message(
            request.ctx.user.translated('p_uploadContentTxRequested').format(
                title=content_title,
            ), message_type='hint', message_meta={
                'encrypted_content_hash': b58encode(encrypted_content_cid.content_hash).decode(),
                'encrypted_content_hash': encrypted_content_cid.content_hash_b58,
                'hint_type': 'uploadContentTxRequested'
            }
        )

        payload_cell = (
            begin_cell()
            .store_uint(0x5491d08c, 32)
            .store_uint(int.from_bytes(encrypted_content_cid.content_hash, "big", signed=False), 256)
            .store_address(Address(user_wallet_address))
            .store_ref(
                begin_cell()
                .store_ref(
                    begin_cell()
                    .store_coins(int(0))
                    .store_coins(int(0))
                    .store_coins(int(request.json['price']))
                    .end_cell()
                )
                .store_maybe_ref(royalties_dict.end_dict())
                .store_uint(0, 1)
                .end_cell()
            )
            .store_ref(
                begin_cell()
                .store_ref(
                    begin_cell()
                    .store_bytes(f"{PROJECT_HOST}/api/v1.5/storage/{metadata_content.cid.serialize_v2(include_accept_type=True)}".encode())
                    .end_cell()
                )
                .store_ref(
                    begin_cell()
                    .store_ref(begin_cell().store_bytes(f"{encrypted_content_cid.serialize_v2()}".encode()).end_cell())
                    .store_ref(begin_cell().store_bytes(f"{image_content_cid.serialize_v2() if image_content_cid else ''}".encode()).end_cell())
                    .store_ref(begin_cell().store_bytes(f"{metadata_content.cid.serialize_v2()}".encode()).end_cell())
                    .end_cell()
                )
                .end_cell()
            )
            .end_cell()
        )

        return response.json({
            'address': platform.address.to_string(1, 1, 1),
            'amount': str(int(0.03 * 10 ** 9)),
            'payload': b64encode(
                begin_cell()
                .store_uint(0x5491d08c, 32)
                .store_uint(int.from_bytes(encrypted_content_cid.content_hash, "big", signed=False), 256)
                .store_uint(0, 2)
                .store_ref(
                    begin_cell()
                    .store_ref(
                        begin_cell()
                        .store_coins(int(0))
                        .store_coins(int(0))
                        .store_coins(int(request.json['price']))
                        .end_cell()
                    )
                    .store_maybe_ref(royalties_dict.end_dict())
                    .store_uint(0, 1)
                    .end_cell()
                )
                .store_ref(
                    begin_cell()
                    .store_ref(
                        begin_cell()
                        .store_bytes(f"{PROJECT_HOST}/api/v1.5/storage/{metadata_content.cid.serialize_v2(include_accept_type=True)}".encode())
                        .end_cell()
                    )
                    .store_ref(
                        begin_cell()
                        .store_ref(begin_cell().store_bytes(f"{encrypted_content_cid.serialize_v2()}".encode()).end_cell())
                        .store_ref(begin_cell().store_bytes(f"{image_content_cid.serialize_v2() if image_content_cid else ''}".encode()).end_cell())
                        .store_ref(begin_cell().store_bytes(f"{metadata_content.cid.serialize_v2()}".encode()).end_cell())
                        .end_cell()
                    )
                    .end_cell()
                )
                .end_cell().to_boc(False)
            ).decode()
            'payload': b64encode(payload_cell.to_boc(False)).decode()
        })
    except BaseException as e:
        make_log("Blockchain", f"Error while sending new content message: {e}" + '\n' + traceback.format_exc(), level='error')

@@ -290,14 +303,15 @@ async def s_api_v1_blockchain_send_purchase_content_message(request):
    license_exist = (await request.ctx.db_session.execute(select(UserContent).where(
        UserContent.onchain_address == request.json['content_address']
    ))).scalars().first()
    if license_exist:
        from app.core.content.content_id import ContentId
        _cid = ContentId.deserialize(license_exist.content.cid.serialize_v2())
        r_content = (await request.ctx.db_session.execute(select(StoredContent).where(StoredContent.hash == _cid.content_hash_b58))).scalars().first()
    from app.core.content.content_id import ContentId

    if license_exist and license_exist.content_id:
        r_content = (await request.ctx.db_session.execute(select(StoredContent).where(
            StoredContent.id == license_exist.content_id
        ))).scalars().first()
    else:
        from app.core.content.content_id import ContentId
        _cid = ContentId.deserialize(request.json['content_address'])
        r_content = (await request.ctx.db_session.execute(select(StoredContent).where(StoredContent.hash == _cid.content_hash_b58))).scalars().first()
        requested_cid = ContentId.deserialize(request.json['content_address'])
        r_content = (await request.ctx.db_session.execute(select(StoredContent).where(StoredContent.hash == requested_cid.content_hash_b58))).scalars().first()

    async def open_content_async(session, sc: StoredContent):
        if not sc.encrypted:

File diff suppressed because it is too large.

@@ -61,6 +61,21 @@ async def s_api_v1_auth_twa(request):
        )).scalars().first()
        assert known_user, "User not created"

        meta_updated = False
        if not (known_user.meta or {}).get('ref_id'):
            known_user.ensure_ref_id()
            meta_updated = True

        incoming_ref_id = auth_data.get('ref_id')
        stored_ref_id = (known_user.meta or {}).get('ref_id')
        if incoming_ref_id and incoming_ref_id != stored_ref_id:
            if (known_user.meta or {}).get('referrer_id') != incoming_ref_id:
                known_user.meta = {
                    **(known_user.meta or {}),
                    'referrer_id': incoming_ref_id
                }
                meta_updated = True

        new_user_key = await known_user.create_api_token_v1(request.ctx.db_session, "USER_API_V1")
        if auth_data['ton_proof']:
            try:

@@ -91,12 +106,17 @@ async def s_api_v1_auth_twa(request):
                    user_id=known_user.id,
                    network='ton',
                    wallet_key='web2-client==1',
                    connection_id=connection_payload,
                    # `ton_proof.payload` is expected to be single-use in many wallets (and it is unique per auth call here),
                    # but client-side retries/replays can happen; keep payload separately and make DB id unique.
                    connection_id=f"{connection_payload}.{uuid4().hex}",
                    wallet_address=Address(wallet_info.account.address).to_string(1, 1, 1),
                    keys={
                        'ton_proof': auth_data['ton_proof']
                        'ton_proof': auth_data['ton_proof'],
                        'ton_proof_payload': connection_payload,
                    },
                    meta={
                        'ton_proof_payload': connection_payload,
                    },
                    meta={},
                    created=datetime.now(),
                    updated=datetime.now(),
                    invalidated=False,

@@ -116,6 +136,8 @@ async def s_api_v1_auth_twa(request):
            )
        ).order_by(WalletConnection.created.desc()))).scalars().first()
        known_user.last_use = datetime.now()
        if meta_updated:
            known_user.updated = datetime.now()
        await request.ctx.db_session.commit()

        return response.json({

@ -1,6 +1,7 @@
|
|||
from __future__ import annotations
|
||||
from datetime import datetime, timedelta
|
||||
from sanic import response
|
||||
from sqlalchemy import select, and_, func
|
||||
from sqlalchemy import select, and_, func, or_
|
||||
from aiogram import Bot, types
|
||||
from sqlalchemy import and_
|
||||
from app.core.logger import make_log
|
||||
|
|
@ -9,9 +10,13 @@ from app.core.models.node_storage import StoredContent
|
|||
from app.core.models.keys import KnownKey
|
||||
from app.core.models import StarsInvoice
|
||||
from app.core.models.content.user_content import UserContent
|
||||
from app.core._config import CLIENT_TELEGRAM_API_KEY, PROJECT_HOST
|
||||
from app.core.models.content_v3 import EncryptedContent as ECv3, ContentDerivative as CDv3
|
||||
from app.core._config import CLIENT_TELEGRAM_API_KEY, CLIENT_TELEGRAM_BOT_USERNAME, PROJECT_HOST
|
||||
from app.core.models.content_v3 import EncryptedContent as ECv3, ContentDerivative as CDv3, UploadSession
|
||||
from app.core.content.content_id import ContentId
|
||||
from app.core.network.dht import MetricsAggregator
|
||||
import os
|
||||
import json
|
||||
import time
|
||||
import uuid
|
||||
|
||||
|
||||
|
|
@ -28,7 +33,7 @@ async def s_api_v1_content_list(request):
|
|||
select(StoredContent)
|
||||
.where(
|
||||
StoredContent.type.like(store + '%'),
|
||||
StoredContent.disabled == False
|
||||
StoredContent.disabled.is_(None)
|
||||
)
|
||||
.order_by(StoredContent.created.desc())
|
||||
.offset(offset)
|
||||
|
|
@ -50,8 +55,15 @@ async def s_api_v1_content_view(request, content_address: str):
|
|||
license_exist = (await request.ctx.db_session.execute(
|
||||
select(UserContent).where(UserContent.onchain_address == content_address)
|
||||
)).scalars().first()
|
||||
license_address = None
|
||||
if license_exist:
|
||||
content_address = license_exist.content.cid.serialize_v2()
|
||||
license_address = license_exist.onchain_address
|
||||
if license_exist.content_id:
|
||||
linked_content = (await request.ctx.db_session.execute(
|
||||
select(StoredContent).where(StoredContent.id == license_exist.content_id)
|
||||
)).scalars().first()
|
||||
if linked_content:
|
||||
content_address = linked_content.cid.serialize_v2()
|
||||
|
||||
from app.core.content.content_id import ContentId
|
||||
cid = ContentId.deserialize(content_address)
|
||||
|
|
@ -71,12 +83,38 @@ async def s_api_v1_content_view(request, content_address: str):
|
|||
content_type = ctype.split('/')[0]
|
||||
except Exception:
|
||||
content_type = 'application'
|
||||
return {'encrypted_content': encrypted, 'decrypted_content': decrypted, 'content_type': content_type}
|
||||
content = await open_content_async(request.ctx.db_session, r_content)
|
||||
return {
|
||||
'encrypted_content': encrypted,
|
||||
'decrypted_content': decrypted,
|
||||
'content_type': content_type,
|
||||
'content_mime': ctype,
|
||||
}
|
||||
try:
|
||||
content = await open_content_async(request.ctx.db_session, r_content)
|
||||
except AssertionError:
|
||||
# Fallback: handle plain stored content without encrypted/decrypted pairing
|
||||
sc = r_content
|
||||
from mimetypes import guess_type as _guess
|
||||
_mime, _ = _guess(sc.filename or '')
|
||||
_mime = _mime or 'application/octet-stream'
|
||||
try:
|
||||
_ctype = _mime.split('/')[0]
|
||||
except Exception:
|
||||
_ctype = 'application'
|
||||
content = {
|
||||
'encrypted_content': sc,
|
||||
'decrypted_content': sc,
|
||||
'content_type': _ctype,
|
||||
'content_mime': _mime,
|
||||
}
|
||||
|
||||
master_address = content['encrypted_content'].meta.get('item_address', '')
|
||||
opts = {
|
||||
'content_type': content['content_type'], # возможно с ошибками, нужно переделать на ffprobe
|
||||
'content_address': content['encrypted_content'].meta.get('item_address', '')
|
||||
'content_mime': content.get('content_mime'),
|
||||
'content_address': license_address or master_address,
|
||||
'license_address': license_address,
|
||||
'master_address': master_address,
|
||||
}
|
||||
if content['encrypted_content'].key_id:
|
||||
known_key = (await request.ctx.db_session.execute(
|
||||
|
|
@ -92,18 +130,26 @@ async def s_api_v1_content_view(request, content_address: str):
|
|||
have_access = False
|
||||
if request.ctx.user:
|
||||
user_wallet_address = await request.ctx.user.wallet_address_async(request.ctx.db_session)
|
||||
user_telegram_id = getattr(request.ctx.user, 'telegram_id', None)
|
||||
or_clauses = [StarsInvoice.user_id == request.ctx.user.id]
|
||||
if user_telegram_id is not None:
|
||||
or_clauses.append(StarsInvoice.telegram_id == user_telegram_id)
|
||||
stars_access = False
|
||||
if or_clauses:
|
||||
stars_access = bool((await request.ctx.db_session.execute(select(StarsInvoice).where(
|
||||
and_(
|
||||
StarsInvoice.content_hash == content['encrypted_content'].hash,
|
||||
StarsInvoice.paid.is_(True),
|
||||
or_(*or_clauses)
|
||||
)
|
||||
))).scalars().first())
|
||||
|
||||
have_access = (
|
||||
(content['encrypted_content'].owner_address == user_wallet_address)
|
||||
or bool((await request.ctx.db_session.execute(select(UserContent).where(
|
||||
and_(UserContent.owner_address == user_wallet_address, UserContent.status == 'active', UserContent.content_id == content['encrypted_content'].id)
|
||||
))).scalars().first()) \
|
||||
or bool((await request.ctx.db_session.execute(select(StarsInvoice).where(
|
||||
and_(
|
||||
StarsInvoice.user_id == request.ctx.user.id,
|
||||
StarsInvoice.content_hash == content['encrypted_content'].hash,
|
||||
StarsInvoice.paid == True
|
||||
)
|
||||
))).scalars().first())
|
||||
or stars_access
|
||||
)
|
||||
|
||||
if not have_access:
|
||||
|
|
@ -112,8 +158,10 @@ async def s_api_v1_content_view(request, content_address: str):
|
|||
current_star_rate = 0.00000001
|
||||
|
||||
stars_cost = int(int(content['encrypted_content'].meta['license']['resale']['price']) / 1e9 / current_star_rate * 1.2)
|
||||
if request.ctx.user.telegram_id in [5587262915, 6861699286]:
|
||||
if getattr(request.ctx.user, 'is_admin', False):
|
||||
stars_cost = 2
|
||||
else:
|
||||
stars_cost = int(int(content['encrypted_content'].meta['license']['resale']['price']) / 1e9 / current_star_rate * 1.2)
|
||||
|
||||
invoice_id = f"access_{uuid.uuid4().hex}"
|
||||
exist_invoice = (await request.ctx.db_session.execute(select(StarsInvoice).where(
|
||||
|
|
@ -144,7 +192,9 @@ async def s_api_v1_content_view(request, content_address: str):
|
|||
amount=stars_cost,
|
||||
user_id=request.ctx.user.id,
|
||||
content_hash=content['encrypted_content'].hash,
|
||||
invoice_url=invoice_url
|
||||
invoice_url=invoice_url,
|
||||
telegram_id=getattr(request.ctx.user, 'telegram_id', None),
|
||||
bot_username=CLIENT_TELEGRAM_BOT_USERNAME,
|
||||
)
|
||||
)
|
||||
await request.ctx.db_session.commit()
|
||||
|
|
@ -159,57 +209,320 @@ async def s_api_v1_content_view(request, content_address: str):
|
|||
|
||||
display_options = {
|
||||
'content_url': None,
|
||||
'content_kind': None,
|
||||
'has_preview': False,
|
||||
'original_available': False,
|
||||
'requires_license': False,
|
||||
}
|
||||
|
||||
if have_access:
|
||||
opts['have_licenses'].append('listen')
|
||||
|
||||
converted_content = content['encrypted_content'].meta.get('converted_content')
|
||||
if converted_content:
|
||||
user_content_option = 'low_preview'
|
||||
if have_access:
|
||||
user_content_option = 'low'
|
||||
encrypted_json = content['encrypted_content'].json_format()
|
||||
decrypted_json = content['decrypted_content'].json_format()
|
||||
|
||||
converted_content = (await request.ctx.db_session.execute(select(StoredContent).where(
|
||||
StoredContent.hash == converted_content[user_content_option]
|
||||
))).scalars().first()
|
||||
if converted_content:
|
||||
display_options['content_url'] = converted_content.web_url
|
||||
opts['content_ext'] = converted_content.filename.split('.')[-1]
|
||||
enc_cid = encrypted_json.get('content_cid') or encrypted_json.get('encrypted_cid')
|
||||
ec_v3 = None
|
||||
derivative_rows = []
|
||||
if enc_cid:
|
||||
ec_v3 = (await request.ctx.db_session.execute(select(ECv3).where(ECv3.encrypted_cid == enc_cid))).scalars().first()
|
||||
if ec_v3:
|
||||
derivative_rows = (await request.ctx.db_session.execute(select(CDv3).where(CDv3.content_id == ec_v3.id))).scalars().all()
|
||||
|
||||
upload_row = None
|
||||
if enc_cid:
|
||||
upload_row = (await request.ctx.db_session.execute(select(UploadSession).where(UploadSession.encrypted_cid == enc_cid))).scalars().first()
|
||||
|
||||
converted_meta_map = dict(content['encrypted_content'].meta.get('converted_content') or {})
|
||||
|
||||
content_mime = (
|
||||
(ec_v3.content_type if ec_v3 and ec_v3.content_type else None)
|
||||
or decrypted_json.get('content_type')
|
||||
or encrypted_json.get('content_type')
|
||||
or opts.get('content_mime')
|
||||
or 'application/octet-stream'
|
||||
)
|
||||
# Fallback: if stored content reports generic application/*, try guess by filename
|
||||
try:
|
||||
if content_mime.startswith('application/'):
|
||||
from mimetypes import guess_type as _guess
|
||||
_fn = decrypted_json.get('filename') or encrypted_json.get('filename') or ''
|
||||
_gm, _ = _guess(_fn)
|
||||
if _gm:
|
||||
content_mime = _gm
|
||||
except Exception:
|
||||
pass
|
||||
opts['content_mime'] = content_mime
|
||||
try:
|
||||
opts['content_type'] = content_mime.split('/')[0]
|
||||
except Exception:
|
||||
opts['content_type'] = opts.get('content_type') or 'application'
|
||||
|
||||
content_kind = 'audio'
|
||||
if content_mime.startswith('video/'):
|
||||
content_kind = 'video'
|
||||
elif content_mime.startswith('audio/'):
|
||||
content_kind = 'audio'
|
||||
else:
|
||||
# v3 fallback: use derivatives table linked via encrypted_cid from onchain meta
|
||||
enc_cid = content['encrypted_content'].meta.get('content_cid') or content['encrypted_content'].meta.get('encrypted_cid')
|
||||
if enc_cid:
|
||||
ec = (await request.ctx.db_session.execute(select(ECv3).where(ECv3.encrypted_cid == enc_cid))).scalars().first()
|
||||
if ec:
|
||||
# choose preview for non-access; low for access
|
||||
desired = ['decrypted_preview'] if not have_access else ['decrypted_low', 'decrypted_high']
|
||||
rows = (await request.ctx.db_session.execute(select(CDv3).where(CDv3.content_id == ec.id, CDv3.status == 'ready'))).scalars().all()
|
||||
chosen = None
|
||||
for kind in desired:
|
||||
chosen = next((r for r in rows if r.kind == kind), None)
|
||||
if chosen:
|
||||
break
|
||||
if chosen and chosen.local_path:
|
||||
h = chosen.local_path.split('/')[-1]
|
||||
display_options['content_url'] = f"{PROJECT_HOST}/api/v1.5/storage/{h}"
|
||||
opts['content_ext'] = (chosen.content_type or '').split('/')[-1] if chosen.content_type else None
|
||||
content_kind = 'binary'
|
||||
|
||||
content_meta = content['encrypted_content'].json_format()
|
||||
from app.core.content.content_id import ContentId
|
||||
display_options['content_kind'] = content_kind
|
||||
display_options['requires_license'] = (not have_access) and content_kind == 'binary'
|
||||
|
||||
derivative_latest = {}
|
||||
if derivative_rows:
|
||||
derivative_sorted = sorted(derivative_rows, key=lambda row: row.created_at or datetime.min)
|
||||
for row in derivative_sorted:
|
||||
derivative_latest[row.kind] = row
|
||||
|
||||
def _row_to_hash_and_url(row):
|
||||
if not row or not row.local_path:
|
||||
return None, None
|
||||
file_hash = row.local_path.split('/')[-1]
|
||||
return file_hash, f"{PROJECT_HOST}/api/v1/storage.proxy/{file_hash}"
|
||||
|
||||
has_preview = bool(derivative_latest.get('decrypted_preview') or converted_meta_map.get('low_preview'))
|
||||
display_options['has_preview'] = has_preview
|
||||
display_options['original_available'] = bool(derivative_latest.get('decrypted_original') or converted_meta_map.get('original'))
|
||||
|
||||
chosen_row = None
|
||||
if content_kind == 'binary':
|
||||
if have_access and 'decrypted_original' in derivative_latest:
|
||||
chosen_row = derivative_latest['decrypted_original']
|
||||
elif have_access:
|
||||
for key in ('decrypted_low', 'decrypted_high'):
|
||||
if key in derivative_latest:
|
||||
chosen_row = derivative_latest[key]
|
||||
break
|
||||
else:
|
||||
for key in ('decrypted_preview', 'decrypted_low'):
|
||||
if key in derivative_latest:
|
||||
chosen_row = derivative_latest[key]
|
||||
break
|
||||
|
||||
    def _make_token_for(hash_value: str, scope: str, user_id: int | None) -> str:
        try:
            from app.core._crypto.signer import Signer
            from app.core._secrets import hot_seed, hot_pubkey
            from app.core._utils.b58 import b58encode as _b58e
            signer = Signer(hot_seed)
            # Media URLs are polled very frequently by the web client (e.g. every 5s).
            # If we generate a new exp for every request, the signed URL changes every poll,
            # forcing the player to reload and breaking continuous streaming.
            #
            # To keep URLs stable while still expiring tokens, we "bucket" exp time.
            # Default behavior keeps tokens stable for ~10 minutes; can be tuned via env.
            ttl_sec = int(os.getenv("STORAGE_PROXY_TOKEN_TTL_SEC", "600"))
            bucket_sec = int(os.getenv("STORAGE_PROXY_TOKEN_BUCKET_SEC", str(ttl_sec)))
            ttl_sec = max(1, ttl_sec)
            bucket_sec = max(1, bucket_sec)
            now = int(time.time())
            exp_base = now + ttl_sec
            # Always move to the next bucket boundary so the token doesn't flip immediately
            # after a boundary due to rounding edge cases.
            exp = ((exp_base // bucket_sec) + 1) * bucket_sec
            uid = int(user_id or 0)
            payload = {'hash': hash_value, 'scope': scope, 'exp': exp, 'uid': uid}
            blob = json.dumps(payload, sort_keys=True, separators=(",", ":")).encode()
            sig = signer.sign(blob)
            pub = _b58e(hot_pubkey).decode()
            return f"pub={pub}&exp={exp}&scope={scope}&uid={uid}&sig={sig}"
        except Exception:
            return ""
|
||||
|
||||
if chosen_row:
|
||||
file_hash, url = _row_to_hash_and_url(chosen_row)
|
||||
if url:
|
||||
token = _make_token_for(file_hash or '', 'full' if have_access else 'preview', getattr(request.ctx.user, 'id', None))
|
||||
display_options['content_url'] = f"{url}?{token}" if token else url
|
||||
ext_candidate = None
|
||||
if chosen_row.content_type:
|
||||
ext_candidate = chosen_row.content_type.split('/')[-1]
|
||||
elif '/' in content_mime:
|
||||
ext_candidate = content_mime.split('/')[-1]
|
||||
if ext_candidate:
|
||||
opts['content_ext'] = ext_candidate
|
||||
if content_kind == 'binary':
|
||||
display_options['original_available'] = True
|
||||
converted_meta_map.setdefault('original', file_hash)
|
||||
elif have_access:
|
||||
converted_meta_map.setdefault('low', file_hash)
|
||||
else:
|
||||
converted_meta_map.setdefault('low_preview', file_hash)
|
||||
|
||||
if not display_options['content_url'] and converted_meta_map:
|
||||
if content_kind == 'binary':
|
||||
preference = ['original'] if have_access else []
|
||||
else:
|
||||
preference = ['low', 'high', 'low_preview'] if have_access else ['low_preview', 'low', 'high']
|
||||
for key in preference:
|
||||
hash_value = converted_meta_map.get(key)
|
||||
if not hash_value:
|
||||
continue
|
||||
# Go straight through the proxy (even if there is no local record)
|
||||
token = _make_token_for(hash_value, 'full' if have_access else 'preview', getattr(request.ctx.user, 'id', None))
|
||||
display_options['content_url'] = f"{PROJECT_HOST}/api/v1/storage.proxy/{hash_value}?{token}" if token else f"{PROJECT_HOST}/api/v1/storage.proxy/{hash_value}"
|
||||
if '/' in content_mime:
|
||||
opts['content_ext'] = content_mime.split('/')[-1]
|
||||
if content_kind == 'binary':
|
||||
display_options['original_available'] = True
|
||||
break
|
||||
|
||||
# Final fallback: no derivatives known — serve stored content directly for AV
|
||||
if not display_options['content_url'] and content_kind in ('audio', 'video'):
|
||||
from app.core._utils.b58 import b58encode as _b58e
|
||||
scid = decrypted_json.get('cid') or encrypted_json.get('cid')
|
||||
try:
|
||||
from app.core.content.content_id import ContentId as _CID
|
||||
if scid:
|
||||
_cid = _CID.deserialize(scid)
|
||||
h = _cid.content_hash_b58
|
||||
else:
|
||||
h = decrypted_json.get('hash')
|
||||
except Exception:
|
||||
h = decrypted_json.get('hash')
|
||||
if h:
|
||||
token = _make_token_for(h, 'preview' if not have_access else 'full', getattr(request.ctx.user, 'id', None))
|
||||
display_options['content_url'] = f"{PROJECT_HOST}/api/v1/storage.proxy/{h}?{token}" if token else f"{PROJECT_HOST}/api/v1/storage.proxy/{h}"
|
||||
|
||||
# Metadata fallback
|
||||
content_meta = encrypted_json
|
||||
content_metadata_json = None
|
||||
_mcid = content_meta.get('metadata_cid') or None
|
||||
content_metadata = None
|
||||
if _mcid:
|
||||
_cid = ContentId.deserialize(_mcid)
|
||||
content_metadata = (await request.ctx.db_session.execute(select(StoredContent).where(StoredContent.hash == _cid.content_hash_b58))).scalars().first()
|
||||
with open(content_metadata.filepath, 'r') as f:
|
||||
content_metadata_json = json.loads(f.read())
|
||||
if content_metadata:
|
||||
try:
|
||||
with open(content_metadata.filepath, 'r') as f:
|
||||
content_metadata_json = json.loads(f.read())
|
||||
except Exception as exc:
|
||||
make_log("Content", f"Can't read metadata file: {exc}", level='warning')
|
||||
|
||||
if not content_metadata_json:
|
||||
fallback_name = (ec_v3.title if ec_v3 else None) or content_meta.get('title') or content_meta.get('cid')
|
||||
fallback_description = (ec_v3.description if ec_v3 else '') or ''
|
||||
content_metadata_json = {
|
||||
'name': fallback_name or 'Без названия',
|
||||
'description': fallback_description,
|
||||
'downloadable': False,
|
||||
}
|
||||
cover_cid = content_meta.get('cover_cid')
|
||||
if cover_cid:
|
||||
token = _make_token_for(cover_cid, 'preview', getattr(request.ctx.user, 'id', None))
|
||||
content_metadata_json.setdefault('image', f"{PROJECT_HOST}/api/v1/storage.proxy/{cover_cid}?{token}" if token else f"{PROJECT_HOST}/api/v1/storage.proxy/{cover_cid}")
|
||||
|
||||
display_options['metadata'] = content_metadata_json
|
||||
|
||||
opts['downloadable'] = content_metadata_json.get('downloadable', False)
|
||||
if opts['downloadable']:
|
||||
if not ('listen' in opts['have_licenses']):
|
||||
opts['downloadable'] = False
|
||||
if opts['downloadable'] and 'listen' not in opts['have_licenses']:
|
||||
opts['downloadable'] = False
|
||||
|
||||
# Conversion status summary
|
||||
conversion_summary = {}
|
||||
conversion_details = []
|
||||
derivative_summary_map = {}
|
||||
for row in derivative_latest.values():
|
||||
conversion_summary[row.status] = conversion_summary.get(row.status, 0) + 1
|
||||
derivative_summary_map[row.kind] = row
|
||||
conversion_details.append({
|
||||
'kind': row.kind,
|
||||
'status': row.status,
|
||||
'size_bytes': row.size_bytes,
|
||||
'content_type': row.content_type,
|
||||
'error': row.error,
|
||||
'updated_at': (row.last_access_at or row.created_at).isoformat() + 'Z' if (row.last_access_at or row.created_at) else None,
|
||||
})
|
||||
|
||||
required_kinds = set()
|
||||
if content_kind == 'binary':
|
||||
if derivative_latest.get('decrypted_original') or converted_meta_map.get('original'):
|
||||
required_kinds.add('decrypted_original')
|
||||
else:
|
||||
required_kinds = {'decrypted_low', 'decrypted_high'}
|
||||
if ec_v3 and ec_v3.content_type and ec_v3.content_type.startswith('video/'):
|
||||
required_kinds.add('decrypted_preview')
|
||||
|
||||
statuses_by_kind = {kind: row.status for kind, row in derivative_summary_map.items() if kind in required_kinds}
|
||||
conversion_state = 'pending'
|
||||
if required_kinds and all(statuses_by_kind.get(kind) == 'ready' for kind in required_kinds):
|
||||
conversion_state = 'ready'
|
||||
elif any(statuses_by_kind.get(kind) == 'failed' for kind in required_kinds):
|
||||
conversion_state = 'failed'
|
||||
elif any(statuses_by_kind.get(kind) in ('processing', 'pending') for kind in required_kinds):
|
||||
conversion_state = 'processing'
|
||||
elif statuses_by_kind:
|
||||
conversion_state = 'partial'
|
||||
|
||||
if display_options['content_url']:
|
||||
conversion_state = 'ready'
|
||||
|
||||
upload_info = None
|
||||
if upload_row:
|
||||
upload_info = {
|
||||
'id': upload_row.id,
|
||||
'state': upload_row.state,
|
||||
'error': upload_row.error,
|
||||
'created_at': upload_row.created_at.isoformat() + 'Z' if upload_row.created_at else None,
|
||||
'updated_at': upload_row.updated_at.isoformat() + 'Z' if upload_row.updated_at else None,
|
||||
}
|
||||
|
||||
upload_state = upload_row.state if upload_row else None
|
||||
if conversion_state == 'failed' or upload_state in ('failed', 'conversion_failed'):
|
||||
final_state = 'failed'
|
||||
elif conversion_state == 'ready':
|
||||
final_state = 'ready'
|
||||
elif conversion_state in ('processing', 'partial') or upload_state in ('processing', 'pinned'):
|
||||
final_state = 'processing'
|
||||
else:
|
||||
final_state = 'uploaded'
|
||||
|
||||
conversion_info = {
|
||||
'state': conversion_state,
|
||||
'summary': conversion_summary,
|
||||
'details': conversion_details,
|
||||
'required_kinds': list(required_kinds),
|
||||
}
|
||||
|
||||
opts['conversion'] = conversion_info
|
||||
opts['upload'] = upload_info
|
||||
opts['status'] = {
|
||||
'state': final_state,
|
||||
'conversion_state': conversion_state,
|
||||
'upload_state': upload_info['state'] if upload_info else None,
|
||||
'has_access': have_access,
|
||||
}
|
||||
if not opts.get('content_ext') and '/' in content_mime:
|
||||
opts['content_ext'] = content_mime.split('/')[-1]
|
||||
|
||||
metrics_mgr: MetricsAggregator | None = getattr(request.app.ctx.memory, "metrics", None)
|
||||
if metrics_mgr:
|
||||
viewer_salt_raw = request.headers.get("X-View-Salt")
|
||||
if viewer_salt_raw:
|
||||
try:
|
||||
viewer_salt = bytes.fromhex(viewer_salt_raw)
|
||||
except ValueError:
|
||||
viewer_salt = viewer_salt_raw.encode()
|
||||
elif request.ctx.user:
|
||||
viewer_salt = f"user:{request.ctx.user.id}".encode()
|
||||
else:
|
||||
viewer_salt = (request.remote_addr or request.ip or "anonymous").encode()
|
||||
try:
|
||||
watch_time_param = int(request.args.get("watch_time", 0))
|
||||
except (TypeError, ValueError):
|
||||
watch_time_param = 0
|
||||
try:
|
||||
bytes_out_param = int(request.args.get("bytes_out", 0))
|
||||
except (TypeError, ValueError):
|
||||
bytes_out_param = 0
|
||||
completed_param = request.args.get("completed", "0") in ("1", "true", "True")
|
||||
metrics_mgr.record_view(
|
||||
content_id=content['encrypted_content'].hash,
|
||||
viewer_salt=viewer_salt,
|
||||
watch_time=watch_time_param,
|
||||
bytes_out=bytes_out_param,
|
||||
completed=completed_param,
|
||||
)
|
||||
|
||||
return response.json({
|
||||
**opts,
|
||||
|
|
|
|||
|
|
@ -0,0 +1,125 @@
|
|||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
from typing import Any, Dict, List
|
||||
|
||||
from sanic import response
|
||||
|
||||
from app.core.logger import make_log
|
||||
from app.core._utils.b58 import b58decode
|
||||
from app.core.network.dht.records import DHTRecord
|
||||
from app.core.network.dht.store import DHTStore
|
||||
from app.core.network.dht.crypto import compute_node_id
|
||||
from app.core.network.dht.keys import MetaKey, MembershipKey, MetricKey
|
||||
from sqlalchemy import select
|
||||
from app.core.models.my_network import KnownNode
|
||||
|
||||
|
||||
def _merge_strategy_for(key: str):
    # Pick the correct merge strategy based on the key prefix
    from app.core.network.dht.replication import ReplicationState
    from app.core.network.dht.membership import MembershipState
    from app.core.network.dht.metrics import ContentMetricsState
    if key.startswith('meta:'):
        return lambda a, b: ReplicationState.from_dict(a).merge_with(ReplicationState.from_dict(b)).to_dict()
    if key.startswith('membership:'):
        # Membership normally needs a node_id, but that only matters for local state; a plain CRDT merge is enough here
        return lambda a, b: MembershipState.from_dict('remote', None, a).merge(MembershipState.from_dict('remote', None, b)).to_dict()
    if key.startswith('metric:'):
        return lambda a, b: ContentMetricsState.from_dict('remote', a).merge(ContentMetricsState.from_dict('remote', b)).to_dict()
    return lambda a, b: b
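Usage is a one-liner: the returned callable merges a local and an incoming value for that key, with last-write-wins as the fallback for unknown prefixes (the key and value names below are hypothetical):

```python
merge = _merge_strategy_for("membership:members")    # hypothetical key
merged_value = merge(local_value, incoming_value)     # both are plain payload dicts
```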
|
||||
|
||||
|
||||
async def s_api_v1_dht_get(request):
    """Return a DHT record by fingerprint or key."""
    store: DHTStore = request.app.ctx.memory.dht_store
    fp = request.args.get('fingerprint')
    key = request.args.get('key')
    if fp:
        rec = store.get(fp)
        if not rec:
            return response.json({'error': 'NOT_FOUND'}, status=404)
        return response.json({**rec.to_payload(), 'signature': rec.signature})
    if key:
        snap = store.snapshot()
        for _fp, payload in snap.items():
            if payload.get('key') == key:
                return response.json(payload)
        return response.json({'error': 'NOT_FOUND'}, status=404)
    return response.json({'error': 'BAD_REQUEST'}, status=400)
|
||||
|
||||
|
||||
def _verify_publisher(node_id: str, public_key_b58: str) -> bool:
    try:
        derived = compute_node_id(b58decode(public_key_b58))
        return derived == node_id
    except Exception:
        return False
|
||||
|
||||
|
||||
async def s_api_v1_dht_put(request):
    """Accept DHT record(s), verify the signature and perform merge/persist.

    Supports a single record (record: {...}) and a batch (records: [{...}]).
    Requires the sender's public_key field and a matching node_id.
    """
    mem = request.app.ctx.memory
    store: DHTStore = mem.dht_store
    data = request.json or {}
    public_key = data.get('public_key')
    if not public_key:
        return response.json({'error': 'MISSING_PUBLIC_KEY'}, status=400)
|
||||
|
||||
# Determine publisher role (trusted/read-only/deny)
|
||||
role = None
|
||||
try:
|
||||
session = request.ctx.db_session
|
||||
kn = (await session.execute(select(KnownNode).where(KnownNode.public_key == public_key))).scalars().first()
|
||||
role = (kn.meta or {}).get('role') if kn and kn.meta else None
|
||||
except Exception:
|
||||
role = None
|
||||
|
||||
def _process_one(payload: Dict[str, Any]) -> Dict[str, Any]:
|
||||
try:
|
||||
rec = DHTRecord.create(
|
||||
key=payload['key'],
|
||||
fingerprint=payload['fingerprint'],
|
||||
value=payload['value'],
|
||||
node_id=payload['node_id'],
|
||||
logical_counter=int(payload['logical_counter']),
|
||||
signature=payload.get('signature'),
|
||||
timestamp=float(payload.get('timestamp') or 0),
|
||||
)
|
||||
except Exception as e:
|
||||
return {'error': f'BAD_RECORD: {e}'}
|
||||
if not _verify_publisher(rec.node_id, public_key):
|
||||
return {'error': 'NODE_ID_MISMATCH'}
|
||||
# Verify the record's signature
|
||||
if not rec.verify(public_key):
|
||||
return {'error': 'BAD_SIGNATURE'}
|
||||
# Enforce ACL: untrusted nodes may not mutate meta/metric records
|
||||
if role != 'trusted':
|
||||
if rec.key.startswith('meta:') or rec.key.startswith('metric:'):
|
||||
return {'error': 'FORBIDDEN_NOT_TRUSTED'}
|
||||
merge_fn = _merge_strategy_for(rec.key)
|
||||
try:
|
||||
merged = store.merge_record(rec, merge_fn)
|
||||
return {'ok': True, 'fingerprint': merged.fingerprint}
|
||||
except Exception as e:
|
||||
make_log('DHT.put', f'merge failed: {e}', level='warning')
|
||||
return {'error': 'MERGE_FAILED'}
|
||||
|
||||
if 'record' in data:
|
||||
result = _process_one(data['record'])
|
||||
status = 200 if 'ok' in result else 400
|
||||
return response.json(result, status=status)
|
||||
elif 'records' in data and isinstance(data['records'], list):
|
||||
results: List[Dict[str, Any]] = []
|
||||
ok = True
|
||||
for item in data['records']:
|
||||
res = _process_one(item)
|
||||
if 'error' in res:
|
||||
ok = False
|
||||
results.append(res)
|
||||
return response.json({'ok': ok, 'results': results}, status=200 if ok else 207)
|
||||
return response.json({'error': 'BAD_REQUEST'}, status=400)
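For reference, a minimal sketch of a body this handler accepts (single-record form; every value is made up, field names follow the parsing code above):

```python
put_body = {
    "public_key": "<base58 Ed25519 key of the publisher>",
    "record": {
        "key": "membership:3f9c...",        # prefix selects the merge strategy
        "fingerprint": "a91c...",
        "value": {"...": "CRDT state dict"},
        "node_id": "<must equal blake3(public_key)>",
        "logical_counter": 7,
        "timestamp": 1700000000.0,
        "signature": "<base58 signature over the record>",
    },
    # batch form: replace "record" with "records": [ {...}, {...} ]
}
```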
|
||||
|
|
@ -0,0 +1,39 @@
|
|||
from __future__ import annotations
|
||||
|
||||
from sanic import response
|
||||
|
||||
|
||||
async def s_api_metrics(request):
    try:
        from prometheus_client import generate_latest, CONTENT_TYPE_LATEST  # type: ignore
        data = generate_latest()
        return response.raw(data, content_type=CONTENT_TYPE_LATEST)
    except Exception:
        # Fallback: export minimal in-process counters from DHT module, if available
        try:
            from app.core.network.dht import prometheus as dprom

            def dump(metric_obj, metric_name):
                lines = []
                values = getattr(metric_obj, "_values", {})
                for labels, value in values.items():
                    label_str = ",".join(f'{k}="{v}"' for k, v in labels)
                    if label_str:
                        lines.append(f"{metric_name}{{{label_str}}} {value}")
                    else:
                        lines.append(f"{metric_name} {value}")
                return lines

            parts = []
            parts += dump(dprom.replication_under, "dht_replication_under_total")
            parts += dump(dprom.replication_over, "dht_replication_over_total")
            parts += dump(dprom.leader_changes, "dht_leader_changes_total")
            parts += dump(dprom.merge_conflicts, "dht_merge_conflicts_total")
            parts += dump(dprom.view_count_total, "dht_view_count_total")
            parts += dump(dprom.unique_estimate, "dht_unique_view_estimate")
            parts += dump(dprom.watch_time_seconds, "dht_watch_time_seconds")
            body = "\n".join(parts) + ("\n" if parts else "")
            return response.text(body, content_type="text/plain; version=0.0.4")
        except Exception:
            return response.text("")
|
||||
|
||||
|
|
@ -4,12 +4,11 @@ import json
|
|||
from datetime import datetime
|
||||
from typing import Dict, Any
|
||||
|
||||
from base58 import b58decode
|
||||
from app.core._utils.b58 import b58decode
|
||||
from sanic import response
|
||||
from sqlalchemy import select
|
||||
from urllib.parse import urlparse
|
||||
|
||||
from app.core.logger import make_log
|
||||
from app.core.models.my_network import KnownNode
|
||||
from app.core.network.constants import CURRENT_PROTOCOL_VERSION, NODE_TYPE_PRIVATE
|
||||
from app.core.network.config import NODE_PRIVACY
|
||||
from app.core.network.handshake import build_handshake_payload, compute_node_info, sign_response
|
||||
|
|
@ -17,6 +16,56 @@ from app.core.network.nodes import upsert_known_node, list_known_public_nodes
|
|||
from app.core.network.semver import compatibility
|
||||
from app.core.network.guard import check_rate_limit, check_timestamp_fresh, check_and_remember_nonce
|
||||
from app.core.network.config import HANDSHAKE_TS_TOLERANCE_SEC
|
||||
from app.core.ipfs_client import swarm_connect
|
||||
from app.core._config import PROJECT_HOST
|
||||
from app.core.events.service import record_event
|
||||
from app.core.network.asn import resolver as asn_resolver
|
||||
from app.core.network.dht import compute_node_id, dht_config, ReachabilityReceipt
|
||||
|
||||
|
||||
def _port_from_public_host(public_host: str) -> int:
    """Return an integer port extracted from a public_host URL or host:port string."""
    if not public_host:
        return 80
    parsed = urlparse(public_host)
    if parsed.scheme:
        if parsed.port:
            return parsed.port
        return 443 if parsed.scheme == "https" else 80
    host_port = public_host.strip()
    if ":" in host_port:
        candidate = host_port.rsplit(":", 1)[-1]
        try:
            return int(candidate)
        except (TypeError, ValueError):
            pass
    return 80
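A few examples of what the helper returns (hostnames are made up); bare `host:port` strings without a scheme fall back to splitting on the last colon:

```python
assert _port_from_public_host("https://node.example.com") == 443        # https default
assert _port_from_public_host("http://node.example.com") == 80          # http default
assert _port_from_public_host("http://node.example.com:8080") == 8080   # explicit port wins
assert _port_from_public_host("") == 80                                  # empty value -> fallback
```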
|
||||
|
||||
|
||||
def _extract_ipfs_meta(payload: Dict[str, Any]) -> Dict[str, Any]:
|
||||
ipfs = payload or {}
|
||||
multiaddrs = ipfs.get("multiaddrs") or []
|
||||
if not isinstance(multiaddrs, list):
|
||||
multiaddrs = [multiaddrs]
|
||||
normalized_multiaddrs = [str(m) for m in multiaddrs if m]
|
||||
meta: Dict[str, Any] = {}
|
||||
if normalized_multiaddrs:
|
||||
meta["multiaddrs"] = normalized_multiaddrs
|
||||
peer_id = ipfs.get("peer_id")
|
||||
if peer_id:
|
||||
meta["peer_id"] = str(peer_id)
|
||||
agent = ipfs.get("agent_version") or ipfs.get("agentVersion")
|
||||
if agent:
|
||||
meta["agent_version"] = str(agent)
|
||||
return meta
|
||||
|
||||
|
||||
async def _connect_ipfs_multiaddrs(addrs):
|
||||
for addr in addrs or []:
|
||||
try:
|
||||
await swarm_connect(addr)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
|
||||
async def s_api_v1_network_info(request):
|
||||
|
|
@ -44,7 +93,7 @@ async def s_api_v1_network_handshake(request):
|
|||
return response.json({"error": "RATE_LIMIT"}, status=429)
|
||||
|
||||
data = request.json or {}
|
||||
required = ["version", "public_key", "node_type", "metrics", "timestamp", "signature"]
|
||||
required = ["version", "schema_version", "public_key", "node_id", "node_type", "metrics", "timestamp", "signature"]
|
||||
for f in required:
|
||||
if f not in data:
|
||||
return response.json({"error": f"Missing field {f}"}, status=400)
|
||||
|
|
@ -60,7 +109,19 @@ async def s_api_v1_network_handshake(request):
|
|||
if not data.get("nonce") or not check_and_remember_nonce(request.app.ctx.memory, data.get("public_key"), data.get("nonce")):
|
||||
return response.json({"error": "NONCE_REPLAY"}, status=400)
|
||||
|
||||
# Base schema and identity checks
|
||||
if data.get("schema_version") != dht_config.schema_version:
|
||||
return response.json({"error": "UNSUPPORTED_SCHEMA_VERSION"}, status=400)
|
||||
|
||||
try:
|
||||
expected_node_id = compute_node_id(b58decode(data["public_key"]))
|
||||
except Exception:
|
||||
return response.json({"error": "BAD_PUBLIC_KEY"}, status=400)
|
||||
if data.get("node_id") != expected_node_id:
|
||||
return response.json({"error": "NODE_ID_MISMATCH"}, status=400)
|
||||
|
||||
peer_version = str(data.get("version"))
|
||||
ipfs_meta = _extract_ipfs_meta(data.get("ipfs") or {})
|
||||
comp = compatibility(peer_version, CURRENT_PROTOCOL_VERSION)
|
||||
if comp == "blocked":
|
||||
# We still store the node but respond with 409
|
||||
|
|
@ -68,7 +129,7 @@ async def s_api_v1_network_handshake(request):
|
|||
await upsert_known_node(
|
||||
request.ctx.db_session,
|
||||
host=data.get("public_host"),
|
||||
port=int(str(data.get("public_host") or "").split(":")[-1]) if ":" in str(data.get("public_host") or "") else 80,
|
||||
port=_port_from_public_host(data.get("public_host")),
|
||||
public_key=str(data.get("public_key")),
|
||||
meta={
|
||||
"version": peer_version,
|
||||
|
|
@ -76,6 +137,7 @@ async def s_api_v1_network_handshake(request):
|
|||
"is_public": data.get("node_type", "public") != "private",
|
||||
"public_host": data.get("public_host"),
|
||||
"unsupported_last_checked_at": datetime.utcnow().isoformat(),
|
||||
"ipfs": ipfs_meta,
|
||||
}
|
||||
)
|
||||
except Exception:
|
||||
|
|
@ -88,22 +150,90 @@ async def s_api_v1_network_handshake(request):
|
|||
"peer": peer_version,
|
||||
}, status=409)
|
||||
|
||||
# Verify signature
|
||||
# Verify signature (Ed25519). If libsodium not available, accept but log a warning.
|
||||
signed_fields = {k: v for (k, v) in data.items() if k != "signature"}
|
||||
blob = json.dumps(signed_fields, sort_keys=True, separators=(",", ":")).encode()
|
||||
ok = False
|
||||
try:
|
||||
# Verify signature over the entire payload except the signature itself
|
||||
signed_fields = {k: v for (k, v) in data.items() if k != "signature"}
|
||||
blob = json.dumps(signed_fields, sort_keys=True, separators=(",", ":")).encode()
|
||||
import nacl.signing, nacl.encoding
|
||||
vk = nacl.signing.VerifyKey(b58decode(data["public_key"]))
|
||||
sig = b58decode(data["signature"])
|
||||
import nacl.signing, nacl.encoding # type: ignore
|
||||
vk = nacl.signing.VerifyKey(b58decode(data.get("public_key", "")))
|
||||
sig = b58decode(data.get("signature", ""))
|
||||
vk.verify(blob, sig)
|
||||
ok = True
|
||||
except Exception:
|
||||
except Exception as e:
|
||||
ok = False
|
||||
if not ok:
|
||||
make_log("Handshake", f"Signature verification failed from {data.get('public_host')}", level='warning')
|
||||
return response.json({"error": "BAD_SIGNATURE"}, status=400)
|
||||
|
||||
# Update membership / reachability information
|
||||
try:
|
||||
membership_mgr = getattr(request.app.ctx.memory, "membership", None)
|
||||
if membership_mgr:
|
||||
remote_ip = (request.headers.get('X-Forwarded-For') or request.remote_addr or request.ip or '').split(',')[0].strip() or None
|
||||
# Determine caller ASN using advertised value or resolver
|
||||
remote_asn = data.get("asn")
|
||||
if remote_asn is None:
|
||||
remote_asn = await asn_resolver.resolve_async(remote_ip, request.ctx.db_session)
|
||||
else:
|
||||
if remote_ip:
|
||||
asn_resolver.learn(remote_ip, int(remote_asn))
|
||||
membership_mgr.update_member(
|
||||
node_id=data["node_id"],
|
||||
public_key=data["public_key"],
|
||||
ip=remote_ip,
|
||||
asn=int(remote_asn) if remote_asn is not None else None,
|
||||
metadata={
|
||||
"capabilities": data.get("capabilities", {}),
|
||||
"metrics": data.get("metrics", {}),
|
||||
"public_host": data.get("public_host"),
|
||||
},
|
||||
)
|
||||
for receipt in data.get("reachability_receipts") or []:
|
||||
if not receipt.get("target_id") or not receipt.get("issuer_id"):
|
||||
continue
|
||||
try:
|
||||
# Only accept receipts issued by the caller
|
||||
issuer_id = str(receipt.get("issuer_id"))
|
||||
if issuer_id != data["node_id"]:
|
||||
continue
|
||||
# Canonical message for receipt verification
|
||||
# schema_version is embedded to avoid replay across versions
|
||||
rec_asn = receipt.get("asn")
|
||||
if rec_asn is None:
|
||||
rec_asn = remote_asn
|
||||
payload = {
|
||||
"schema_version": dht_config.schema_version,
|
||||
"target_id": str(receipt.get("target_id")),
|
||||
"issuer_id": issuer_id,
|
||||
"asn": int(rec_asn) if rec_asn is not None else None,
|
||||
"timestamp": float(receipt.get("timestamp", data.get("timestamp"))),
|
||||
}
|
||||
blob = json.dumps(payload, sort_keys=True, separators=(",", ":")).encode()
|
||||
try:
|
||||
import nacl.signing # type: ignore
|
||||
from app.core._utils.b58 import b58decode as _b58d
|
||||
vk = nacl.signing.VerifyKey(_b58d(data["public_key"]))
|
||||
sig_b = _b58d(str(receipt.get("signature", "")))
|
||||
vk.verify(blob, sig_b)
|
||||
# Accept and persist
|
||||
membership_mgr.record_receipt(
|
||||
ReachabilityReceipt(
|
||||
target_id=payload["target_id"],
|
||||
issuer_id=payload["issuer_id"],
|
||||
asn=payload["asn"],
|
||||
timestamp=payload["timestamp"],
|
||||
signature=str(receipt.get("signature", "")),
|
||||
)
|
||||
)
|
||||
except Exception:
|
||||
# Ignore invalid receipts
|
||||
continue
|
||||
except Exception:
|
||||
continue
|
||||
except Exception as exc:
|
||||
make_log("Handshake", f"Membership ingest failed: {exc}", level='warning')
|
||||
|
||||
# Upsert node and respond with our info + known public nodes
|
||||
# Do not persist private peers (ephemeral)
|
||||
if data.get("node_type") != "private" and data.get("public_host"):
|
||||
|
|
@ -111,7 +241,7 @@ async def s_api_v1_network_handshake(request):
|
|||
await upsert_known_node(
|
||||
request.ctx.db_session,
|
||||
host=data.get("public_host"),
|
||||
port=int(str(data.get("public_host") or "").split(":")[-1]) if ":" in str(data.get("public_host") or "") else 80,
|
||||
port=_port_from_public_host(data.get("public_host")),
|
||||
public_key=str(data.get("public_key")),
|
||||
meta={
|
||||
"version": peer_version,
|
||||
|
|
@ -120,13 +250,31 @@ async def s_api_v1_network_handshake(request):
|
|||
"public_host": data.get("public_host"),
|
||||
"last_metrics": data.get("metrics", {}),
|
||||
"capabilities": data.get("capabilities", {}),
|
||||
"ipfs": ipfs_meta,
|
||||
}
|
||||
)
|
||||
await _connect_ipfs_multiaddrs(ipfs_meta.get("multiaddrs"))
|
||||
try:
|
||||
await record_event(
|
||||
request.ctx.db_session,
|
||||
'node_registered',
|
||||
{
|
||||
'public_key': str(data.get("public_key")),
|
||||
'public_host': data.get("public_host"),
|
||||
'node_type': data.get("node_type"),
|
||||
'version': peer_version,
|
||||
'capabilities': data.get("capabilities", {}),
|
||||
},
|
||||
origin_host=PROJECT_HOST,
|
||||
)
|
||||
except Exception as ev_exc:
|
||||
make_log("Events", f"Failed to record node_registered event: {ev_exc}", level="warning")
|
||||
except Exception as e:
|
||||
make_log("Handshake", f"Upsert peer failed: {e}", level='warning')
|
||||
|
||||
# Merge advertised peers from the caller (optional field)
|
||||
for n in data.get("known_public_nodes", []) or []:
|
||||
known_ipfs_meta = _extract_ipfs_meta(n.get("ipfs") or {})
|
||||
try:
|
||||
await upsert_known_node(
|
||||
request.ctx.db_session,
|
||||
|
|
@ -139,17 +287,22 @@ async def s_api_v1_network_handshake(request):
|
|||
"is_public": True,
|
||||
"public_host": n.get("public_host") or n.get("host"),
|
||||
"capabilities": n.get("capabilities") or {},
|
||||
"ipfs": known_ipfs_meta,
|
||||
}
|
||||
)
|
||||
await _connect_ipfs_multiaddrs(known_ipfs_meta.get("multiaddrs"))
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
node = await compute_node_info(request.ctx.db_session)
|
||||
known = await list_known_public_nodes(request.ctx.db_session)
|
||||
membership_mgr = getattr(request.app.ctx.memory, "membership", None)
|
||||
n_estimate = membership_mgr.n_estimate() if membership_mgr else 0
|
||||
resp = sign_response({
|
||||
"compatibility": comp,
|
||||
"node": node,
|
||||
"known_public_nodes": known,
|
||||
"n_estimate": n_estimate,
|
||||
})
|
||||
make_log("Handshake", f"OK with {data.get('public_host')} compat={comp}")
|
||||
status = 200
|
||||
|
|
|
|||
|
|
@ -0,0 +1,77 @@
|
|||
from __future__ import annotations
|
||||
|
||||
from typing import Dict, Any
|
||||
|
||||
from sanic import response
|
||||
from sqlalchemy import select
|
||||
|
||||
from app.core.logger import make_log
|
||||
from app.core.models import NodeEvent, KnownNode
|
||||
from app.core.network.nodesig import verify_request
|
||||
from app.core.network.guard import check_rate_limit
|
||||
from app.core._config import PROJECT_HOST
|
||||
from app.core.events.service import LOCAL_PUBLIC_KEY
|
||||
|
||||
|
||||
def _origin_host() -> str | None:
|
||||
return PROJECT_HOST.rstrip('/') if PROJECT_HOST else None
|
||||
|
||||
|
||||
async def s_api_v1_network_events(request):
|
||||
remote_ip = (request.headers.get('X-Forwarded-For') or request.remote_addr or request.ip or '').split(',')[0].strip()
|
||||
if not check_rate_limit(request.app.ctx.memory, remote_ip):
|
||||
return response.json({"error": "RATE_LIMIT"}, status=429)
|
||||
|
||||
ok, node_id, reason = verify_request(request, request.app.ctx.memory)
|
||||
if not ok:
|
||||
return response.json({"error": reason or "UNAUTHORIZED"}, status=401)
|
||||
|
||||
session = request.ctx.db_session
|
||||
trusted = (await session.execute(
|
||||
select(KnownNode).where(KnownNode.public_key == node_id)
|
||||
)).scalar_one_or_none()
|
||||
role = (trusted.meta or {}).get('role') if trusted and trusted.meta else None
|
||||
if role != 'trusted':
|
||||
make_log("Events", f"Rejected events fetch from non-trusted node {node_id}", level="warning")
|
||||
return response.json({"error": "FORBIDDEN"}, status=403)
|
||||
|
||||
try:
|
||||
since = int(request.args.get('since') or 0)
|
||||
except (TypeError, ValueError):
|
||||
since = 0
|
||||
since = max(since, 0)
|
||||
|
||||
try:
|
||||
limit = int(request.args.get('limit') or 100)
|
||||
except (TypeError, ValueError):
|
||||
limit = 100
|
||||
limit = max(1, min(limit, 200))
|
||||
|
||||
result = await session.execute(
|
||||
select(NodeEvent)
|
||||
.where(NodeEvent.origin_public_key == LOCAL_PUBLIC_KEY, NodeEvent.seq > since)
|
||||
.order_by(NodeEvent.seq.asc())
|
||||
.limit(limit)
|
||||
)
|
||||
rows = result.scalars().all()
|
||||
|
||||
events: list[Dict[str, Any]] = []
|
||||
next_since = since
|
||||
for row in rows:
|
||||
next_since = max(next_since, int(row.seq))
|
||||
events.append({
|
||||
"origin_public_key": row.origin_public_key,
|
||||
"origin_host": row.origin_host or _origin_host(),
|
||||
"seq": int(row.seq),
|
||||
"uid": row.uid,
|
||||
"event_type": row.event_type,
|
||||
"payload": row.payload,
|
||||
"signature": row.signature,
|
||||
"created_at": (row.created_at.isoformat() + 'Z') if row.created_at else None,
|
||||
})
|
||||
|
||||
payload = {
|
||||
"events": events,
|
||||
"next_since": next_since,
|
||||
}
|
||||
return response.json(payload)
|
||||
|
|
@ -16,6 +16,14 @@ from app.core.models.node_storage import StoredContent
|
|||
from app.core._config import UPLOADS_DIR
|
||||
from app.core.models.content_v3 import ContentDerivative
|
||||
from app.core._utils.resolve_content import resolve_content
|
||||
from app.core.network.nodesig import verify_request
|
||||
from app.core.models.my_network import KnownNode
|
||||
from sqlalchemy import select as sa_select
|
||||
import httpx
|
||||
from app.core._crypto.signer import Signer
|
||||
from app.core._secrets import hot_seed
|
||||
from app.core._utils.b58 import b58encode as _b58e, b58decode as _b58d
|
||||
import json, time
|
||||
|
||||
|
||||
# POST /api/v1.5/storage
|
||||
|
|
@ -305,3 +313,125 @@ async def s_api_v1_5_storage_get(request, file_hash):
|
|||
else:
|
||||
make_log("uploader_v1.5", f"Returning full file for video/audio: {final_path}", level="INFO")
|
||||
return await response.file(final_path, mime_type=mime_type)
|
||||
|
||||
|
||||
# GET /api/v1/storage.fetch/<file_hash>
# Internal endpoint for node-to-node requests (NodeSig). Returns the file if it is available locally.
async def s_api_v1_storage_fetch(request, file_hash):
    ok, node_id, reason = verify_request(request, request.app.ctx.memory)
    if not ok:
        return response.json({"error": reason or "UNAUTHORIZED"}, status=401)
    # Trusted nodes only
    try:
        session = request.ctx.db_session
        row = (await session.execute(sa_select(KnownNode).where(KnownNode.public_key == node_id))).scalars().first()
        role = (row.meta or {}).get('role') if row and row.meta else None
        if role != 'trusted':
            return response.json({"error": "DENIED_NOT_TRUSTED"}, status=403)
    except Exception:
        pass
    # Reuse the v1.5 implementation
    return await s_api_v1_5_storage_get(request, file_hash)
|
||||
|
||||
|
||||
# GET /api/v1/storage.proxy/<file_hash>
# Proxy for the web client: if the file is not available locally, try to fetch it from trusted nodes via NodeSig
async def s_api_v1_storage_proxy(request, file_hash):
    # Require either valid NodeSig (unlikely for public clients) or a signed access token
    # Token fields: pub, exp, scope, uid, sig over json {hash,scope,exp,uid}
    def _verify_access_token() -> bool:
        try:
            pub = (request.args.get('pub') or '').strip()
            exp = int(request.args.get('exp') or '0')
            scope = (request.args.get('scope') or '').strip()
            uid = int(request.args.get('uid') or '0')
            sig = (request.args.get('sig') or '').strip()
            if not pub or not exp or not scope or not sig:
                return False
            if exp < int(time.time()):
                return False
            payload = {
                'hash': file_hash,
                'scope': scope,
                'exp': exp,
                'uid': uid,
            }
            blob = json.dumps(payload, sort_keys=True, separators=(",", ":")).encode()
            import nacl.signing
            vk = nacl.signing.VerifyKey(_b58d(pub))
            vk.verify(blob, _b58d(sig))
            # Note: we do not require a session-bound user for media fetches,
            # the short-lived signature itself is sufficient.
            return True
        except Exception:
            return False
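End to end, the web client therefore requests URLs of this shape (hash and token values are made up; the query-string layout matches `_make_token_for`):

```python
# hypothetical proxied media URL carrying a signed access token
url = (
    "https://node.example.com/api/v1/storage.proxy/9vXkQ2..."
    "?pub=6Yfb...&exp=1700001000&scope=preview&uid=0&sig=3QJd..."
)
```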
|
||||
|
||||
ok_nodesig, _nid, _reason = verify_request(request, request.app.ctx.memory)
|
||||
if not ok_nodesig and not _verify_access_token():
|
||||
return response.json({'error': 'UNAUTHORIZED'}, status=401)
|
||||
# Try locally first, without returning 404
|
||||
try:
|
||||
from base58 import b58encode as _b58e
|
||||
try:
|
||||
# Accept either a raw hash or a CID
|
||||
from app.core._utils.resolve_content import resolve_content as _res
|
||||
cid, _ = _res(file_hash)
|
||||
file_hash = _b58e(cid.content_hash).decode()
|
||||
except Exception:
|
||||
pass
|
||||
final_path = os.path.join(UPLOADS_DIR, f"{file_hash}")
|
||||
if os.path.exists(final_path):
|
||||
return await s_api_v1_5_storage_get(request, file_hash)
|
||||
except Exception:
|
||||
pass
|
||||
# Not found locally, try trusted nodes
|
||||
try:
|
||||
async with request.app.ctx.memory.transaction("storage.proxy"):
|
||||
# Collect the list of trusted nodes
|
||||
session = request.ctx.db_session
|
||||
nodes = (await session.execute(sa_select(KnownNode))).scalars().all()
|
||||
candidates = []
|
||||
for n in nodes:
|
||||
role = (n.meta or {}).get('role') if n.meta else None
|
||||
if role != 'trusted':
|
||||
continue
|
||||
host = (n.meta or {}).get('public_host') or (n.ip or '')
|
||||
if not host:
|
||||
continue
|
||||
base = host.rstrip('/')
|
||||
if not base.startswith('http'):
|
||||
base = f"http://{base}:{n.port or 80}"
|
||||
candidates.append(base)
|
||||
# Proxy the request, forwarding Range and streaming the body
|
||||
range_header = request.headers.get("Range")
|
||||
timeout = httpx.Timeout(10.0, read=60.0)
|
||||
for base in candidates:
|
||||
url = f"{base}/api/v1/storage.fetch/{file_hash}"
|
||||
try:
|
||||
# Sign the request with NodeSig
|
||||
from app.core._secrets import hot_seed, hot_pubkey
|
||||
from app.core.network.nodesig import sign_headers
|
||||
from app.core._utils.b58 import b58encode as _b58e
|
||||
pk_b58 = _b58e(hot_pubkey).decode()
|
||||
headers = sign_headers('GET', f"/api/v1/storage.fetch/{file_hash}", b"", hot_seed, pk_b58)
|
||||
if range_header:
|
||||
headers['Range'] = range_header
|
||||
async with httpx.AsyncClient(timeout=timeout) as client:
|
||||
r = await client.get(url, headers=headers)
|
||||
if r.status_code == 404:
|
||||
continue
|
||||
if r.status_code not in (200, 206):
|
||||
continue
|
||||
# Forward content-related response headers
|
||||
resp = await request.respond(status=r.status_code, headers={
|
||||
k: v for k, v in r.headers.items() if k.lower() in ("content-type", "content-length", "content-range", "accept-ranges")
|
||||
})
|
||||
async for chunk in r.aiter_bytes(chunk_size=1024*1024):
|
||||
await resp.send(chunk)
|
||||
await resp.eof()
|
||||
return resp
|
||||
except Exception as e:
|
||||
continue
|
||||
except Exception:
|
||||
pass
|
||||
return response.json({"error": "File not found"}, status=404)
|
||||
|
|
|
|||
|
|
@ -1,5 +1,8 @@
|
|||
from sanic import response
|
||||
from app.core.models.content_v3 import UploadSession
|
||||
from sqlalchemy import select
|
||||
|
||||
from app.core.models.content_v3 import UploadSession, EncryptedContent, ContentDerivative
|
||||
from app.core._utils.resolve_content import resolve_content
|
||||
|
||||
|
||||
async def s_api_v1_upload_status(request, upload_id: str):
|
||||
|
|
@ -7,11 +10,48 @@ async def s_api_v1_upload_status(request, upload_id: str):
|
|||
row = await session.get(UploadSession, upload_id)
|
||||
if not row:
|
||||
return response.json({"error": "NOT_FOUND"}, status=404)
|
||||
|
||||
encrypted_hash = None
|
||||
conversion = {"state": "not_started", "details": []}
|
||||
|
||||
if row.encrypted_cid:
|
||||
cid_obj, err = resolve_content(row.encrypted_cid)
|
||||
if not err:
|
||||
encrypted_hash = cid_obj.content_hash_b58
|
||||
ec = (await session.execute(select(EncryptedContent).where(EncryptedContent.encrypted_cid == row.encrypted_cid))).scalars().first()
|
||||
if ec:
|
||||
derivative_rows = (await session.execute(
|
||||
select(ContentDerivative.kind, ContentDerivative.status).where(ContentDerivative.content_id == ec.id)
|
||||
)).all()
|
||||
details = [
|
||||
{"kind": kind, "status": status}
|
||||
for kind, status in derivative_rows
|
||||
]
|
||||
if ec.content_type and ec.content_type.startswith("audio/"):
|
||||
required = {"decrypted_high", "decrypted_low"}
|
||||
elif ec.content_type and ec.content_type.startswith("video/"):
|
||||
required = {"decrypted_high", "decrypted_low", "decrypted_preview"}
|
||||
else:
|
||||
required = {"decrypted_original"}
|
||||
statuses = {kind: status for kind, status in derivative_rows}
|
||||
if required and all(statuses.get(k) == "ready" for k in required):
|
||||
conv_state = "ready"
|
||||
elif any(statuses.get(k) == "failed" for k in required):
|
||||
conv_state = "failed"
|
||||
elif any(statuses.get(k) in ("processing", "pending") for k in required):
|
||||
conv_state = "processing"
|
||||
elif required:
|
||||
conv_state = "pending"
|
||||
else:
|
||||
conv_state = "not_started"
|
||||
conversion = {"state": conv_state, "details": details}
|
||||
|
||||
return response.json({
|
||||
"id": row.id,
|
||||
"state": row.state,
|
||||
"encrypted_cid": row.encrypted_cid,
|
||||
"encrypted_hash": encrypted_hash,
|
||||
"size_bytes": row.size_bytes,
|
||||
"error": row.error,
|
||||
"conversion": conversion,
|
||||
})
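For orientation, a sketch of the JSON this endpoint returns for an audio upload whose conversion is still in progress (all values are hypothetical):

```python
# hypothetical upload.status response
{
    "id": "upl_7f3c",
    "state": "pinned",
    "encrypted_cid": "bafy...",
    "encrypted_hash": "9vXk...",
    "size_bytes": 7340032,
    "error": None,
    "conversion": {
        "state": "processing",
        "details": [
            {"kind": "decrypted_low", "status": "ready"},
            {"kind": "decrypted_high", "status": "processing"},
        ],
    },
}
```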
|
||||
|
||||
|
|
|
|||
|
|
@ -6,16 +6,23 @@ import os
|
|||
from datetime import datetime
|
||||
from typing import Dict, Any
|
||||
|
||||
import aiofiles
|
||||
from base58 import b58encode
|
||||
from sanic import response
|
||||
import magic # type: ignore
|
||||
|
||||
from app.core._config import UPLOADS_DIR, PROJECT_HOST
|
||||
from app.core._secrets import hot_pubkey
|
||||
from app.core.crypto.aes_gcm_stream import encrypt_file_to_encf, CHUNK_BYTES
|
||||
from app.core.crypto.keywrap import wrap_dek, KeyWrapError
|
||||
from app.core.ipfs_client import add_streamed_file
|
||||
from app.core.logger import make_log
|
||||
from app.core.models.content_v3 import EncryptedContent, ContentKey, IpfsSync, ContentIndexItem, UploadSession
|
||||
from app.core.models.node_storage import StoredContent
|
||||
from app.core.storage import db_session
|
||||
from app.core._utils.resolve_content import resolve_content
|
||||
from app.core.events.service import record_event
|
||||
from sqlalchemy import select
|
||||
|
||||
|
||||
def _b64(s: bytes) -> str:
|
||||
|
|
@ -27,14 +34,29 @@ async def s_api_v1_upload_tus_hook(request):
|
|||
tusd HTTP hook endpoint. We mainly handle post-finish to: encrypt -> IPFS add+pin -> record DB.
|
||||
"""
|
||||
try:
|
||||
payload: Dict[str, Any] = request.json or {}
|
||||
payload: Dict[str, Any] = request.json
|
||||
except Exception:
|
||||
payload = {}
|
||||
event = payload.get("Type") or payload.get("type") or payload.get("Event") or payload.get("event")
|
||||
payload = None
|
||||
if payload is None:
|
||||
raw_body = request.body or b''
|
||||
try:
|
||||
payload = json.loads(raw_body) if raw_body else {}
|
||||
except Exception:
|
||||
payload = {}
|
||||
event = (payload.get("Type") or payload.get("type") or
|
||||
payload.get("Event") or payload.get("event") or
|
||||
payload.get("Hook") or payload.get("hook") or
|
||||
payload.get("HookName") or payload.get("hook_name") or
|
||||
request.headers.get("Hook-Name") or request.headers.get("hook-name"))
|
||||
upload = payload.get("Upload") or payload.get("upload") or {}
|
||||
|
||||
if not event:
|
||||
return response.json({"ok": False, "error": "NO_EVENT"}, status=400)
|
||||
hook_name = (payload.get("HookName") or payload.get("hook") or
|
||||
payload.get("hook_name") or request.headers.get("Hook-Name"))
|
||||
raw = request.body or b''
|
||||
preview = raw[:512]
|
||||
make_log("tus-hook", f"Missing event type in hook payload; ignoring (hook={hook_name}, keys={list(payload.keys())}, raw={preview!r})", level="warning")
|
||||
return response.json({"ok": True, "skipped": True})
|
||||
|
||||
if event not in ("post-finish", "postfinish"):
|
||||
# accept but ignore other events
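As a reference point, one payload shape this parser accepts (a sketch only; tusd's exact hook envelope differs between hook versions, and every value below is made up):

```python
hook_payload = {
    "Type": "post-finish",
    "Upload": {
        "ID": "b7c9d2e4",
        "Size": 7340032,
        "MetaData": {
            "title": "Demo track",
            "artist": "Unknown",
            "content_type": "audio/mpeg",
            "preview_start_ms": "0",
        },
        "Storage": {"Path": "/srv/tusd-data/b7c9d2e4"},
    },
}
```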
|
||||
|
|
@ -49,9 +71,40 @@ async def s_api_v1_upload_tus_hook(request):
|
|||
meta = upload.get("MetaData") or {}
|
||||
# Common metadata keys
|
||||
title = meta.get("title") or meta.get("Title") or meta.get("name") or "Untitled"
|
||||
artist = (meta.get("artist") or meta.get("Artist") or "").strip()
|
||||
description = meta.get("description") or meta.get("Description") or ""
|
||||
content_type = meta.get("content_type") or meta.get("Content-Type") or "application/octet-stream"
|
||||
preview_enabled = content_type.startswith("audio/") or content_type.startswith("video/")
|
||||
detected_content_type = None
|
||||
try:
|
||||
raw_detected = magic.from_file(file_path, mime=True)
|
||||
if raw_detected:
|
||||
detected_content_type = raw_detected.split(";")[0].strip()
|
||||
except Exception as e:
|
||||
make_log("tus-hook", f"magic MIME detection failed for {file_path}: {e}", level="warning")
|
||||
|
||||
def _is_av(mime: str | None) -> bool:
|
||||
if not mime:
|
||||
return False
|
||||
return mime.startswith("audio/") or mime.startswith("video/")
|
||||
|
||||
if detected_content_type:
|
||||
if not _is_av(detected_content_type):
|
||||
if content_type != detected_content_type:
|
||||
make_log(
|
||||
"tus-hook",
|
||||
f"Overriding declared content_type '{content_type}' with detected '{detected_content_type}' (binary upload)",
|
||||
level="info",
|
||||
)
|
||||
content_type = detected_content_type
|
||||
elif not _is_av(content_type):
|
||||
make_log(
|
||||
"tus-hook",
|
||||
f"Detected audio/video MIME '{detected_content_type}' replacing non-AV declaration '{content_type}'",
|
||||
level="info",
|
||||
)
|
||||
content_type = detected_content_type
|
||||
|
||||
preview_enabled = _is_av(content_type)
|
||||
# Optional preview window overrides from tus metadata
|
||||
try:
|
||||
start_ms = int(meta.get("preview_start_ms") or 0)
|
||||
|
|
@ -124,11 +177,18 @@ async def s_api_v1_upload_tus_hook(request):
|
|||
except Exception:
|
||||
enc_size = None
|
||||
|
||||
encrypted_cid_obj, cid_err = resolve_content(encrypted_cid)
|
||||
if cid_err:
|
||||
make_log("tus-hook", f"Encrypted CID resolve failed: {cid_err}", level="error")
|
||||
return response.json({"ok": False, "error": "INVALID_ENCRYPTED_CID"}, status=500)
|
||||
encrypted_hash_b58 = encrypted_cid_obj.content_hash_b58
|
||||
|
||||
# Persist records
|
||||
async with db_session() as session:
|
||||
ec = EncryptedContent(
|
||||
encrypted_cid=encrypted_cid,
|
||||
title=title,
|
||||
artist=artist or None,
|
||||
description=description,
|
||||
content_type=content_type,
|
||||
enc_size_bytes=enc_size,
|
||||
|
|
@ -150,6 +210,7 @@ async def s_api_v1_upload_tus_hook(request):
|
|||
allow_auto_grant=True,
|
||||
)
|
||||
session.add(ck)
|
||||
await session.flush()
|
||||
|
||||
sync = IpfsSync(
|
||||
content_id=ec.id,
|
||||
|
|
@ -160,17 +221,47 @@ async def s_api_v1_upload_tus_hook(request):
|
|||
)
|
||||
session.add(sync)
|
||||
|
||||
existing_encrypted_content = (await session.execute(
|
||||
select(StoredContent).where(StoredContent.hash == encrypted_hash_b58)
|
||||
)).scalars().first()
|
||||
if not existing_encrypted_content:
|
||||
placeholder_meta = {
|
||||
'content_type': content_type,
|
||||
'storage': 'ipfs',
|
||||
'encrypted_cid': encrypted_cid,
|
||||
'upload_id': upload_id,
|
||||
'source': 'tusd',
|
||||
'title': title,
|
||||
'artist': artist or None,
|
||||
}
|
||||
encrypted_stored_content = StoredContent(
|
||||
type="local/encrypted_ipfs",
|
||||
hash=encrypted_hash_b58,
|
||||
content_id=encrypted_cid,
|
||||
filename=os.path.basename(file_path),
|
||||
meta=placeholder_meta,
|
||||
user_id=request.ctx.user.id if request.ctx.user else None,
|
||||
owner_address=None,
|
||||
encrypted=True,
|
||||
decrypted_content_id=None,
|
||||
key_id=None,
|
||||
created=datetime.utcnow(),
|
||||
)
|
||||
session.add(encrypted_stored_content)
|
||||
|
||||
# Publish signed index item
|
||||
item = {
|
||||
"encrypted_cid": encrypted_cid,
|
||||
"title": title,
|
||||
"description": description,
|
||||
"artist": artist,
|
||||
"content_type": content_type,
|
||||
"size_bytes": enc_size,
|
||||
"preview_enabled": preview_enabled,
|
||||
"preview_conf": ec.preview_conf,
|
||||
"issuer_node_id": key_fpr,
|
||||
"salt_b64": _b64(salt),
|
||||
"artist": artist or None,
|
||||
}
|
||||
try:
|
||||
from app.core._crypto.signer import Signer
|
||||
|
|
@ -182,6 +273,25 @@ async def s_api_v1_upload_tus_hook(request):
|
|||
sig = ""
|
||||
session.add(ContentIndexItem(encrypted_cid=encrypted_cid, payload=item, sig=sig))
|
||||
|
||||
try:
|
||||
await record_event(
|
||||
session,
|
||||
'content_uploaded',
|
||||
{
|
||||
'encrypted_cid': encrypted_cid,
|
||||
'content_hash': encrypted_hash_b58,
|
||||
'title': title,
|
||||
'description': description,
|
||||
'content_type': content_type,
|
||||
'size_bytes': enc_size,
|
||||
'user_id': request.ctx.user.id if getattr(request.ctx, 'user', None) else None,
|
||||
'telegram_id': getattr(getattr(request.ctx, 'user', None), 'telegram_id', None),
|
||||
},
|
||||
origin_host=PROJECT_HOST,
|
||||
)
|
||||
except Exception as exc:
|
||||
make_log("Events", f"Failed to record content_uploaded event: {exc}", level="warning")
|
||||
|
||||
await session.commit()
|
||||
|
||||
# Update upload session with result and purge staging to avoid duplicates
|
||||
|
|
@ -191,6 +301,9 @@ async def s_api_v1_upload_tus_hook(request):
|
|||
if us:
|
||||
us.state = 'pinned'
|
||||
us.encrypted_cid = encrypted_cid
|
||||
us.error = None
|
||||
if size:
|
||||
us.size_bytes = size
|
||||
# prefer using IPFS for downstream conversion; remove staging
|
||||
try:
|
||||
if file_path and os.path.exists(file_path):
|
||||
|
|
@ -201,4 +314,15 @@ async def s_api_v1_upload_tus_hook(request):
|
|||
await session.commit()
|
||||
|
||||
make_log("tus-hook", f"Uploaded+encrypted {file_path} -> {encrypted_cid}")
|
||||
placeholder_path = os.path.join(UPLOADS_DIR, encrypted_hash_b58)
|
||||
if not os.path.exists(placeholder_path):
|
||||
try:
|
||||
async with aiofiles.open(placeholder_path, "wb") as ph:
|
||||
await ph.write(json.dumps({
|
||||
"ipfs_cid": encrypted_cid,
|
||||
"note": "Encrypted payload stored in IPFS"
|
||||
}).encode())
|
||||
except Exception as e:
|
||||
make_log("tus-hook", f"Failed to create placeholder for {encrypted_hash_b58}: {e}", level="warning")
|
||||
|
||||
return response.json({"ok": True, "encrypted_cid": encrypted_cid, "upload_id": upload_id})
|
||||
|
|
|
|||
|
|
@ -7,6 +7,9 @@ from app.bot.middleware import UserDataMiddleware
|
|||
from app.bot.routers.index import main_router
|
||||
|
||||
|
||||
dp = Dispatcher(storage=MemoryStorage())
dp.update.outer_middleware(UserDataMiddleware())
dp.include_router(main_router)
def create_dispatcher() -> Dispatcher:
    """Create aiogram Dispatcher lazily to avoid event loop issues at import time."""
    dp = Dispatcher(storage=MemoryStorage())
    dp.update.outer_middleware(UserDataMiddleware())
    dp.include_router(main_router)
    return dp
|
||||
|
|
|
|||
|
|
@ -1,12 +1,16 @@
|
|||
import base58
|
||||
from aiogram import types, Router, F
|
||||
from collections import defaultdict
|
||||
from datetime import datetime
|
||||
from typing import Optional
|
||||
|
||||
from app.core._config import WEB_APP_URLS
|
||||
from app.core._keyboards import get_inline_keyboard
|
||||
from app.core._utils.tg_process_template import tg_process_template
|
||||
from app.core.logger import make_log
|
||||
from app.core.models.node_storage import StoredContent
|
||||
from sqlalchemy import select, and_
|
||||
from app.core.models.content_v3 import UploadSession, EncryptedContent, ContentDerivative
|
||||
from sqlalchemy import select, and_, or_
|
||||
import json
|
||||
|
||||
router = Router()
|
||||
|
|
@@ -18,26 +22,147 @@ def chunks(lst, n):
        yield lst[i:i + n]


async def _compute_content_status(db_session, encrypted_cid: Optional[str], fallback_content_type: Optional[str] = None):
    if not encrypted_cid:
        return {
            'final_state': 'uploaded',
            'conversion_state': 'pending',
            'upload_state': None,
            'summary': {},
            'details': [],
            'title': None,
            'content_type': fallback_content_type,
        }

    ec = (await db_session.execute(select(EncryptedContent).where(EncryptedContent.encrypted_cid == encrypted_cid))).scalars().first()
    content_type = fallback_content_type or (ec.content_type if ec else None) or 'application/octet-stream'

    derivative_rows = []
    if ec:
        derivative_rows = (await db_session.execute(select(ContentDerivative).where(ContentDerivative.content_id == ec.id))).scalars().all()
    upload_row = (await db_session.execute(select(UploadSession).where(UploadSession.encrypted_cid == encrypted_cid))).scalars().first()

    derivative_sorted = sorted(derivative_rows, key=lambda row: row.created_at or datetime.min)
    derivative_latest = {}
    summary = defaultdict(int)
    details = []
    for row in derivative_sorted:
        derivative_latest[row.kind] = row
    for kind, row in derivative_latest.items():
        summary[row.status] += 1
        details.append({
            'kind': kind,
            'status': row.status,
            'size_bytes': row.size_bytes,
            'error': row.error,
            'updated_at': (row.last_access_at or row.created_at).isoformat() + 'Z' if (row.last_access_at or row.created_at) else None,
        })

    if content_type.startswith('audio/'):
        required = {'decrypted_low', 'decrypted_high'}
    elif content_type.startswith('video/'):
        required = {'decrypted_low', 'decrypted_high', 'decrypted_preview'}
    else:
        required = {'decrypted_original'}

    statuses_by_kind = {kind: derivative_latest[kind].status for kind in required if kind in derivative_latest}
    conversion_state = 'pending'
    if required and all(statuses_by_kind.get(kind) == 'ready' for kind in required):
        conversion_state = 'ready'
    elif any(statuses_by_kind.get(kind) == 'failed' for kind in required):
        conversion_state = 'failed'
    elif any(statuses_by_kind.get(kind) in ('processing', 'pending') for kind in required):
        conversion_state = 'processing'
    elif statuses_by_kind:
        conversion_state = 'partial'

    upload_state = upload_row.state if upload_row else None
    final_state = 'ready' if conversion_state == 'ready' else None
    if not final_state:
        if conversion_state == 'failed' or upload_state in ('failed', 'conversion_failed'):
            final_state = 'failed'
        elif conversion_state in ('processing', 'partial') or upload_state in ('processing', 'pinned'):
            final_state = 'processing'
        else:
            final_state = 'uploaded'

    return {
        'final_state': final_state,
        'conversion_state': conversion_state,
        'upload_state': upload_state,
        'summary': dict(summary),
        'details': details,
        'title': ec.title if ec else None,
        'content_type': content_type,
    }
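For orientation, a hedged example of the dictionary this helper returns for an audio item whose low-quality derivative is still converting; the values are illustrative, only the keys and state names come from the code above:

```python
# Illustrative return value of _compute_content_status for an audio upload
# whose 'decrypted_low' derivative is still in progress.
{
    'final_state': 'processing',
    'conversion_state': 'processing',
    'upload_state': 'pinned',
    'summary': {'ready': 1, 'processing': 1},
    'details': [
        {'kind': 'decrypted_high', 'status': 'ready', 'size_bytes': 31457280,
         'error': None, 'updated_at': '2025-01-01T12:00:00Z'},
        {'kind': 'decrypted_low', 'status': 'processing', 'size_bytes': None,
         'error': None, 'updated_at': '2025-01-01T12:00:05Z'},
    ],
    'title': 'Demo track',
    'content_type': 'audio/mpeg',
}
```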
||||
async def t_callback_owned_content(query: types.CallbackQuery, memory=None, user=None, db_session=None, chat_wrap=None, **extra):
|
||||
message_text = user.translated("ownedContent_menu")
|
||||
content_list = []
|
||||
user_addr = await user.wallet_address_async(db_session)
|
||||
result = await db_session.execute(select(StoredContent).where(
|
||||
and_(StoredContent.owner_address == user_addr, StoredContent.type == 'onchain/content')
|
||||
))
|
||||
for content in result.scalars().all():
|
||||
try:
|
||||
metadata_content = await StoredContent.from_cid_async(db_session, content.json_format()['metadata_cid'])
|
||||
with open(metadata_content.filepath, 'r') as f:
|
||||
metadata_content_json = json.loads(f.read())
|
||||
except BaseException as e:
|
||||
make_log("OwnedContent", f"Can't get metadata content: {e}", level='warning')
|
||||
continue
|
||||
conditions = []
|
||||
if user_addr:
|
||||
conditions.append(and_(StoredContent.owner_address == user_addr, StoredContent.type.like('onchain%')))
|
||||
conditions.append(and_(StoredContent.user_id == user.id, StoredContent.type.like('local/%')))
|
||||
|
||||
if not conditions:
|
||||
conditions = [StoredContent.user_id == user.id]
|
||||
|
||||
stmt = select(StoredContent).where(
|
||||
StoredContent.disabled.is_(None),
|
||||
or_(*conditions) if len(conditions) > 1 else conditions[0]
|
||||
).order_by(StoredContent.created.desc())
|
||||
|
||||
rows = (await db_session.execute(stmt)).scalars().all()
|
||||
|
||||
onchain_hashes = set()
|
||||
local_items = []
|
||||
|
||||
icon_map = {
|
||||
'ready': '✅',
|
||||
'processing': '⏳',
|
||||
'failed': '⚠️',
|
||||
'uploaded': '📦',
|
||||
}
|
||||
|
||||
for content in rows:
|
||||
meta = content.meta or {}
|
||||
encrypted_cid = meta.get('content_cid') or meta.get('encrypted_cid') or content.content_id
|
||||
status_info = await _compute_content_status(db_session, encrypted_cid, meta.get('content_type'))
|
||||
icon = icon_map.get(status_info['final_state'], '📦')
|
||||
|
||||
if content.type.startswith('onchain'):
|
||||
try:
|
||||
metadata_content = await StoredContent.from_cid_async(db_session, content.json_format()['metadata_cid'])
|
||||
with open(metadata_content.filepath, 'r') as f:
|
||||
metadata_content_json = json.loads(f.read())
|
||||
except BaseException as e:
|
||||
make_log("OwnedContent", f"Can't get metadata content: {e}", level='warning')
|
||||
continue
|
||||
|
||||
onchain_hashes.add(content.hash)
|
||||
display_name = metadata_content_json.get('name') or content.cid.serialize_v2()
|
||||
content_list.append([
|
||||
{
|
||||
'text': f"{icon} {display_name}"[:64],
|
||||
'callback_data': f'NC_{content.id}'
|
||||
}
|
||||
])
|
||||
else:
|
||||
local_items.append((content, status_info, icon))
|
||||
|
||||
for content, status_info, icon in local_items:
|
||||
if content.hash in onchain_hashes:
|
||||
continue
|
||||
meta = content.meta or {}
|
||||
encrypted_cid = meta.get('encrypted_cid') or content.content_id
|
||||
display_name = status_info['title'] or content.filename or content.cid.serialize_v2()
|
||||
button_text = f"{icon} {display_name}"
|
||||
content_list.append([
|
||||
{
|
||||
'text': metadata_content_json['name'],
|
||||
'callback_data': f'NC_{content.id}'
|
||||
'text': button_text[:64],
|
||||
'callback_data': f'LC_{content.id}'
|
||||
}
|
||||
])
|
||||
|
||||
|
|
@@ -77,3 +202,51 @@ async def t_callback_node_content(query: types.CallbackQuery, memory=None, user=

router.callback_query.register(t_callback_owned_content, F.data == 'ownedContent')
router.callback_query.register(t_callback_node_content, F.data.startswith('NC_'))


async def t_callback_local_content(query: types.CallbackQuery, memory=None, user=None, db_session=None, chat_wrap=None, **extra):
    content_oid = int(query.data.split('_')[1])
    content = (await db_session.execute(select(StoredContent).where(StoredContent.id == content_oid))).scalars().first()
    if not content:
        return await query.answer(user.translated('error_contentNotFound'), show_alert=True)

    upload_id = (content.meta or {}).get('upload_id')
    upload_session = await db_session.get(UploadSession, upload_id) if upload_id else None

    encrypted_cid = (content.meta or {}).get('encrypted_cid') or content.content_id
    status_info = await _compute_content_status(db_session, encrypted_cid, (content.meta or {}).get('content_type'))
    display_name = status_info['title'] or content.filename or content.cid.serialize_v2()
    state_label = {
        'ready': 'Готов',
        'processing': 'Обработка',
        'failed': 'Ошибка',
        'uploaded': 'Загружено',
    }.get(status_info['final_state'], 'Статус неизвестен')

    lines = [
        f"<b>{display_name}</b>",
        f"Состояние: {state_label}"
    ]
    if upload_session:
        lines.append(f"Статус загрузки: {upload_session.state}")
        if upload_session.error:
            lines.append(f"Ошибка: {upload_session.error}")
    if status_info['summary']:
        lines.append("Конвертация:")
        for status, count in status_info['summary'].items():
            lines.append(f"• {status}: {count}")

    await chat_wrap.send_message(
        '\n'.join(lines),
        message_type='notification',
        message_meta={'content_id': content.id},
        reply_markup=get_inline_keyboard([
            [{
                'text': user.translated('back_button'),
                'callback_data': 'ownedContent'
            }]
        ])
    )


router.callback_query.register(t_callback_local_content, F.data.startswith('LC_'))
@@ -7,6 +7,7 @@ from sqlalchemy import select, and_
from app.core._keyboards import get_inline_keyboard
from app.core._utils.tg_process_template import tg_process_template
from app.core.models.wallet_connection import WalletConnection
from app.core._config import PROJECT_HOST

main_router = Router()

@@ -83,6 +84,35 @@ async def t_home_menu(__msg, **extra):
    return await send_home_menu(chat_wrap, user, wallet_connection, message_id=message_id)


async def t_admin_panel(message: types.Message, **extra):
    user = extra.get('user')
    chat_wrap = extra.get('chat_wrap')
    admin_host = (PROJECT_HOST or '').rstrip('/')
    if not user or not getattr(user, 'is_admin', False):
        await chat_wrap.send_message("Доступ к админ-панели ограничен.")
        return
    if not admin_host:
        await chat_wrap.send_message("Адрес админ-панели не настроен на этой ноде.")
        return
    admin_url = f"{admin_host}/admin"
    buttons = []
    if admin_url.startswith('https://'):
        buttons.append({
            'text': 'Открыть в Telegram',
            'web_app': types.WebAppInfo(url=admin_url),
        })
    buttons.append({
        'text': 'Открыть в браузере',
        'url': admin_url,
    })
    keyboard = get_inline_keyboard([buttons]) if buttons else None
    await chat_wrap.send_message(
        "Админ-панель доступна по кнопке ниже.",
        keyboard=keyboard,
    )


main_router.message.register(t_home_menu, Command('start'))
main_router.message.register(t_admin_panel, Command('admin'))
main_router.callback_query.register(t_home_menu, F.data == 'home')
router = main_router
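The handler only offers the in-Telegram button when `PROJECT_HOST` is an HTTPS origin, since Telegram accepts only HTTPS URLs in `WebAppInfo`. A hedged operational sketch of how a user could be flagged as admin so that `/admin` responds at all; it relies on the `users.is_admin` column created by the startup migration later in this change set, while the `telegram_id` filter column is an assumption:

```python
# Illustrative: grant admin access so the /admin command offers the panel.
# users.is_admin comes from this PR's startup migration; the telegram_id
# column name on users is an assumption for the example.
from sqlalchemy import text


async def grant_admin(engine, telegram_id: int) -> None:
    async with engine.begin() as conn:
        await conn.execute(
            text("UPDATE users SET is_admin = TRUE WHERE telegram_id = :tid"),
            {"tid": telegram_id},
        )
```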
@@ -6,6 +6,9 @@ from aiogram.fsm.storage.memory import MemoryStorage
from app.bot.middleware import UserDataMiddleware
from app.client_bot.routers.index import main_router

dp = Dispatcher(storage=MemoryStorage())
dp.update.outer_middleware(UserDataMiddleware())
dp.include_router(main_router)

def create_dispatcher() -> Dispatcher:
    dp = Dispatcher(storage=MemoryStorage())
    dp.update.outer_middleware(UserDataMiddleware())
    dp.include_router(main_router)
    return dp
@@ -9,6 +9,7 @@ from app.core._utils.tg_process_template import tg_process_template
from app.core.logger import make_log
from app.core.models.wallet_connection import WalletConnection
from app.core.models.node_storage import StoredContent
from app.core._config import PROJECT_HOST

main_router = Router()

@@ -86,12 +87,44 @@ async def t_home_menu(__msg, **extra):
    make_log("Home", f"Home menu args: {args}", level='debug')
    if args:
        if args[0].startswith('C'):
            content = StoredContent.from_cid(db_session, args[0][1:])
            payload = args[0][1:]
            if '!' in payload:
                payload = payload.split('!', 1)[0]
            content = StoredContent.from_cid(db_session, payload)
            return await chat_wrap.send_content(db_session, content, message_id=message_id)

    return await send_home_menu(chat_wrap, user, wallet_connection, message_id=message_id)


async def t_admin_panel(message: types.Message, **extra):
    user = extra.get('user')
    chat_wrap = extra.get('chat_wrap')
    admin_host = (PROJECT_HOST or '').rstrip('/')
    if not user or not getattr(user, 'is_admin', False):
        await chat_wrap.send_message("Доступ к админ-панели ограничен.")
        return
    if not admin_host:
        await chat_wrap.send_message("Адрес админ-панели не настроен на этой ноде.")
        return
    admin_url = f"{admin_host}/admin"
    buttons = []
    if admin_url.startswith('https://'):
        buttons.append({
            'text': 'Открыть в Telegram',
            'web_app': types.WebAppInfo(url=admin_url),
        })
    buttons.append({
        'text': 'Открыть в браузере',
        'url': admin_url,
    })
    keyboard = get_inline_keyboard([buttons]) if buttons else None
    await chat_wrap.send_message(
        "Админ-панель доступна по кнопке ниже.",
        keyboard=keyboard,
    )


main_router.message.register(t_home_menu, Command('start'))
main_router.message.register(t_admin_panel, Command('admin'))
main_router.callback_query.register(t_home_menu, F.data == 'home')
router = main_router
@@ -1,4 +1,6 @@
from aiogram import types, Router, F
from sqlalchemy import select

from app.core.logger import make_log
from app.core.models import StarsInvoice

@@ -12,9 +14,10 @@ async def t_pre_checkout_query_stars_processing(pre_checkout_query: types.PreChe

    invoice_id = pre_checkout_query.invoice_payload

    existing_invoice = db_session.query(StarsInvoice).filter(
        StarsInvoice.external_id == invoice_id
    ).first()
    result = await db_session.execute(
        select(StarsInvoice).where(StarsInvoice.external_id == invoice_id)
    )
    existing_invoice = result.scalars().first()
    if not existing_invoice:
        return await pre_checkout_query.answer(ok=False, error_message="Invoice not found")
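Context for the change above, as a hedged sketch: the async `AsyncSession` does not expose the legacy `.query()` API, so invoice lookups have to go through `execute(select(...))`. A minimal standalone form of the same lookup:

```python
# Illustrative only: the SQLAlchemy 2.0-style async lookup that replaces the
# legacy db_session.query(...) call; the helper name is hypothetical.
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession

from app.core.models import StarsInvoice


async def find_invoice(session: AsyncSession, external_id: str) -> StarsInvoice | None:
    result = await session.execute(
        select(StarsInvoice).where(StarsInvoice.external_id == external_id)
    )
    return result.scalars().first()
```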
Binary file not shown.
@@ -56,7 +56,7 @@ _now_str = datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
LOG_FILEPATH = f"{LOG_DIR}/{_now_str}.log"

WEB_APP_URLS = {
    'uploadContent': f"https://my-public-node-8.projscale.dev/uploadContent"
    'uploadContent': f"https://my-public-node-103.projscale.dev/uploadContent"
}

ALLOWED_CONTENT_TYPES = [
Binary file not shown.
Binary file not shown.
@@ -1,24 +1,58 @@
import base58
import nacl.encoding
import nacl.signing
from app.core._utils.b58 import b58encode, b58decode

try:
    import nacl.encoding
    import nacl.signing
    import nacl.exceptions
    _HAS_NACL = True
except Exception:  # pragma: no cover - fallback path
    _HAS_NACL = False

from app.core._utils.hash import blake3_digest


class Signer:
    def __init__(self, seed: bytes):
        if len(seed) != 32:
            raise ValueError("Seed must be 32 bytes")
        self.signing_key = nacl.signing.SigningKey(seed)
        self.verify_key = self.signing_key.verify_key

if _HAS_NACL:

    def sign(self, data_bytes: bytes) -> str:
        signed_message = self.signing_key.sign(data_bytes)
        signature = signed_message.signature
        return base58.b58encode(signature).decode()

    class Signer:
        def __init__(self, seed: bytes):
            if len(seed) != 32:
                raise ValueError("Seed must be 32 bytes")
            self.signing_key = nacl.signing.SigningKey(seed)
            self.verify_key = self.signing_key.verify_key

    def verify(self, data_bytes: bytes, signature: str) -> bool:
        signature_bytes = base58.b58decode(signature)
        try:
            self.verify_key.verify(data_bytes, signature_bytes)
            return True
        except nacl.exceptions.BadSignatureError:
            return False

        def sign(self, data_bytes: bytes) -> str:
            signed_message = self.signing_key.sign(data_bytes)
            signature = signed_message.signature
            return b58encode(signature).decode()

        def verify(self, data_bytes: bytes, signature: str) -> bool:
            signature_bytes = b58decode(signature)
            try:
                self.verify_key.verify(data_bytes, signature_bytes)
                return True
            except nacl.exceptions.BadSignatureError:
                return False

else:

    class _VerifyKey:
        def __init__(self, key_bytes: bytes):
            self._key_bytes = key_bytes

        def encode(self) -> bytes:
            return self._key_bytes

    class Signer:
        def __init__(self, seed: bytes):
            if len(seed) != 32:
                raise ValueError("Seed must be 32 bytes")
            self.seed = seed
            self.verify_key = _VerifyKey(seed)

        def sign(self, data_bytes: bytes) -> str:
            digest = blake3_digest(self.seed + data_bytes)
            return b58encode(digest).decode()

        def verify(self, data_bytes: bytes, signature: str) -> bool:
            expected = self.sign(data_bytes)
            return expected == signature
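A small, hedged round-trip check for the Signer above; the seed is a dummy value and the module path is assumed. It holds for either branch, since both expose the same base58-string sign/verify API:

```python
# Illustrative only: sign and verify a handshake payload with Signer.
# Works with the PyNaCl-backed branch and with the blake3 fallback alike.
import os

from app.core.crypto.signer import Signer  # assumed module path

seed = os.urandom(32)                      # dummy 32-byte seed
signer = Signer(seed)

payload = b'{"endpoint": "/api/v1/network.handshake"}'
signature = signer.sign(payload)

assert signer.verify(payload, signature)
assert not signer.verify(payload + b"tampered", signature)
```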
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,51 @@
from __future__ import annotations

try:
    # Prefer external package if available
    from base58 import b58encode, b58decode  # type: ignore
except Exception:
    # Minimal fallback (compatible subset)
    ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"
    ALPHABET_INDEX = {c: i for i, c in enumerate(ALPHABET)}

    def _to_bytes(value: bytes | bytearray | str) -> bytes:
        if isinstance(value, (bytes, bytearray)):
            return bytes(value)
        if isinstance(value, str):
            return value.encode()
        raise TypeError("value must be bytes or str")

    def b58encode(data: bytes | bytearray | str) -> bytes:
        data = _to_bytes(data)
        if not data:
            return b""
        n = int.from_bytes(data, "big")
        out = []
        while n > 0:
            n, rem = divmod(n, 58)
            out.append(ALPHABET[rem])
        enc = "".join(reversed(out))
        leading = 0
        for b in data:
            if b == 0:
                leading += 1
            else:
                break
        return ("1" * leading + enc).encode()

    def b58decode(data: bytes | bytearray | str) -> bytes:
        data_b = _to_bytes(data)
        if not data_b:
            return b""
        num = 0
        for ch in data_b.decode():
            num = num * 58 + ALPHABET_INDEX[ch]
        full = num.to_bytes((num.bit_length() + 7) // 8, "big")
        leading = 0
        for ch in data_b:
            if ch == ord('1'):
                leading += 1
            else:
                break
        return b"\x00" * leading + full
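A quick, hedged sanity check for this codec (the sample values are arbitrary); the import matches the path used by the Signer module earlier in the diff:

```python
# Illustrative round-trip check for the base58 helper, covering the
# leading-zero handling where each 0x00 byte maps to a literal '1'.
from app.core._utils.b58 import b58encode, b58decode

samples = [b"", b"\x00", b"\x00\x00hello", b"arbitrary bytes \xff\xfe"]
for raw in samples:
    encoded = b58encode(raw)
    assert b58decode(encoded) == raw, (raw, encoded)

print(b58encode(b"\x00abc"))  # b'1ZiCa' — the leading zero becomes '1'
```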
@@ -1,4 +1,6 @@
from sqlalchemy.ext.asyncio import AsyncEngine
from sqlalchemy import text

from app.core.models import BlockchainTask
from app.core.models.base import AlchemyBase

@@ -9,4 +11,36 @@ async def create_db_tables(engine: AsyncEngine):
    BlockchainTask()
    async with engine.begin() as conn:
        await conn.run_sync(AlchemyBase.metadata.create_all)
        await conn.execute(text("""
            ALTER TABLE users
            ADD COLUMN IF NOT EXISTS is_admin BOOLEAN DEFAULT FALSE
        """))
        await conn.execute(text("""
            ALTER TABLE stars_invoices
            ADD COLUMN IF NOT EXISTS telegram_id BIGINT
        """))
        await conn.execute(text("""
            ALTER TABLE stars_invoices
            ADD COLUMN IF NOT EXISTS paid_at TIMESTAMPTZ
        """))
        await conn.execute(text("""
            ALTER TABLE stars_invoices
            ADD COLUMN IF NOT EXISTS payment_tx_id VARCHAR(256)
        """))
        await conn.execute(text("""
            ALTER TABLE stars_invoices
            ADD COLUMN IF NOT EXISTS payment_node_id VARCHAR(128)
        """))
        await conn.execute(text("""
            ALTER TABLE stars_invoices
            ADD COLUMN IF NOT EXISTS payment_node_public_host VARCHAR(256)
        """))
        await conn.execute(text("""
            ALTER TABLE stars_invoices
            ADD COLUMN IF NOT EXISTS bot_username VARCHAR(128)
        """))
        await conn.execute(text("""
            ALTER TABLE stars_invoices
            ADD COLUMN IF NOT EXISTS is_remote BOOLEAN DEFAULT FALSE
        """))
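A hedged sketch of how the ad-hoc `ALTER TABLE ... IF NOT EXISTS` migrations above could be verified after startup; the table and column names come from the diff, the check itself is illustrative and not part of it:

```python
# Illustrative post-startup check against information_schema to confirm the
# stars_invoices columns added above actually exist.
from sqlalchemy import text
from sqlalchemy.ext.asyncio import AsyncEngine

EXPECTED = {"telegram_id", "paid_at", "payment_tx_id", "payment_node_id",
            "payment_node_public_host", "bot_username", "is_remote"}


async def missing_stars_invoice_columns(engine: AsyncEngine) -> set[str]:
    async with engine.connect() as conn:
        rows = await conn.execute(text(
            "SELECT column_name FROM information_schema.columns "
            "WHERE table_name = 'stars_invoices'"
        ))
        present = {row[0] for row in rows}
    return EXPECTED - present   # empty set means every expected column exists
```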
@@ -0,0 +1,29 @@
from __future__ import annotations

import hashlib
from typing import Iterable


def _to_bytes(data: Iterable[int] | bytes | bytearray | str) -> bytes:
    if isinstance(data, (bytes, bytearray)):
        return bytes(data)
    if isinstance(data, str):
        return data.encode()
    return bytes(data)


def blake3_digest(data: Iterable[int] | bytes | bytearray | str) -> bytes:
    try:
        from blake3 import blake3  # type: ignore
        return blake3(_to_bytes(data)).digest()
    except Exception:
        return hashlib.blake2s(_to_bytes(data)).digest()


def blake3_hex(data: Iterable[int] | bytes | bytearray | str) -> str:
    try:
        from blake3 import blake3  # type: ignore
        return blake3(_to_bytes(data)).hexdigest()
    except Exception:
        return hashlib.blake2s(_to_bytes(data)).hexdigest()
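A short, hedged illustration of the helper, and of the caveat it implies: when the optional `blake3` package is absent, the blake2s fallback produces different digests, so identifiers derived on differently provisioned nodes would not match.

```python
# Illustrative use of blake3_hex; the printed digest depends on whether the
# optional `blake3` package is installed (blake3) or not (blake2s fallback),
# but both produce 32-byte digests, i.e. 64 hex characters.
from app.core._utils.hash import blake3_hex  # assumed module path

node_pubkey = bytes.fromhex("aa" * 32)   # dummy Ed25519 public key bytes
node_id = blake3_hex(node_pubkey)        # hex NodeID as described in the overview
print(len(node_id))                      # 64
```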
@@ -2,8 +2,9 @@ from app.core.content.content_id import ContentId


def resolve_content(content_id) -> ContentId:  # -> [content, error]
    if isinstance(content_id, ContentId):
        return content_id, None
    try:
        return ContentId.deserialize(content_id), None
    except BaseException as e:
        return None, f"{e}"
@@ -0,0 +1,20 @@
from typing import Optional
from urllib.parse import urlencode

STARTAPP_LIMIT = 64


def build_content_links(content_token: str, ref_id: Optional[str], *, project_host: str, bot_username: str):
    """Return tuple of (startapp_payload, telegram_url, web_url)."""
    payload = (content_token or '').strip()
    if len(payload) > STARTAPP_LIMIT:
        payload = payload[:STARTAPP_LIMIT]

    telegram_url = f"https://t.me/{bot_username}/content?startapp={payload}"

    query = [('content', content_token)]
    if ref_id:
        query.append(('ref', ref_id))
    web_url = f"{project_host}/viewContent?{urlencode(query)}"

    return payload, telegram_url, web_url
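An illustrative call; the content token, referral id and bot username are made-up placeholders, only the host mirrors the WEB_APP_URLS value used elsewhere in this change set:

```python
# Illustrative use of build_content_links with placeholder inputs.
payload, tg_url, web_url = build_content_links(
    "Cabc123",                      # hypothetical content token
    ref_id="ref42",
    project_host="https://my-public-node-103.projscale.dev",
    bot_username="example_bot",
)
# tg_url  -> https://t.me/example_bot/content?startapp=Cabc123
# web_url -> https://my-public-node-103.projscale.dev/viewContent?content=Cabc123&ref=ref42
```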
@@ -136,8 +136,7 @@ async def convert_loop(memory):
    ]
    if trim_value:
        cmd.extend(["--trim", trim_value])
    if content_kind == "audio":
        cmd.append("--audio-only")  # audio-only flag
    # converter auto-detects audio/video, no explicit flag required

    process = await asyncio.create_subprocess_exec(
        *cmd,
@ -2,10 +2,13 @@ import asyncio
|
|||
import os
|
||||
import json
|
||||
import shutil
|
||||
import tempfile
|
||||
from dataclasses import dataclass
|
||||
from datetime import datetime
|
||||
from typing import List, Tuple
|
||||
from pathlib import Path
|
||||
from typing import List, Optional, Tuple
|
||||
|
||||
from sqlalchemy import select
|
||||
from sqlalchemy import select, and_, or_
|
||||
|
||||
from app.core.logger import make_log
|
||||
from app.core.storage import db_session
|
||||
|
|
@ -22,9 +25,41 @@ from app.core.crypto.encf_stream import decrypt_encf_auto
|
|||
from app.core.crypto.keywrap import unwrap_dek, wrap_dek, KeyWrapError
|
||||
from app.core.network.key_client import request_key_from_peer
|
||||
from app.core.models.my_network import KnownNode
|
||||
from app.core._utils.resolve_content import resolve_content
|
||||
from app.core.content.content_id import ContentId
|
||||
|
||||
|
||||
CONCURRENCY = int(os.getenv("CONVERT_V3_MAX_CONCURRENCY", "3"))
|
||||
STAGING_SUBDIR = os.getenv("CONVERT_V3_STAGING_SUBDIR", "convert-staging")
|
||||
UPLOADS_PATH = Path(UPLOADS_DIR).resolve()
|
||||
_host_uploads_env = os.getenv("BACKEND_DATA_DIR_HOST")
|
||||
HOST_UPLOADS_PATH = Path(_host_uploads_env).resolve() if _host_uploads_env else None
|
||||
|
||||
|
||||
@dataclass
|
||||
class PlainStaging:
|
||||
container_path: str
|
||||
host_path: str
|
||||
|
||||
|
||||
def _container_to_host(path: str) -> str:
|
||||
"""Map a container path under UPLOADS_DIR to the host path for docker -v."""
|
||||
if not HOST_UPLOADS_PATH:
|
||||
raise RuntimeError("BACKEND_DATA_DIR_HOST is not configured for convert_v3")
|
||||
real_path = Path(path).resolve()
|
||||
try:
|
||||
real_path.relative_to(UPLOADS_PATH)
|
||||
except ValueError:
|
||||
# Not under uploads; best effort fallback to original string
|
||||
return str(real_path)
|
||||
rel = real_path.relative_to(UPLOADS_PATH)
|
||||
return str(HOST_UPLOADS_PATH / rel)
|
||||
|
||||
|
||||
MEDIA_CONVERTER_CPU_LIMIT = os.getenv("MEDIA_CONVERTER_CPU_LIMIT")
|
||||
MEDIA_CONVERTER_MEM_LIMIT = os.getenv("MEDIA_CONVERTER_MEM_LIMIT")
|
||||
MEDIA_CONVERTER_CPUSET = os.getenv("MEDIA_CONVERTER_CPUSET") or os.getenv("CONVERT_CPUSET")
|
||||
ERROR_TRUNCATE_LIMIT = 512
|
||||
|
||||
|
||||
def _ensure_dir(path: str):
|
||||
|
|
@ -57,27 +92,49 @@ async def _save_derivative(file_path: str, filename: str) -> Tuple[str, int]:
|
|||
return file_hash, size
|
||||
|
||||
|
||||
async def _run_media_converter(input_host_path: str, input_ext: str, quality: str, trim_value: str | None, is_audio: bool) -> Tuple[str, dict]:
|
||||
async def _run_media_converter(staging: PlainStaging, input_ext: str, quality: str, trim_value: Optional[str], is_audio: bool):
|
||||
if not os.path.exists(staging.container_path):
|
||||
raise FileNotFoundError(f"Plain input missing at {staging.container_path}")
|
||||
|
||||
host_input_path = staging.host_path
|
||||
if not host_input_path or not host_input_path.startswith('/'):
|
||||
host_input_path = os.path.abspath(host_input_path)
|
||||
|
||||
rid = __import__('uuid').uuid4().hex[:8]
|
||||
output_dir_container = f"/tmp/conv_{rid}"
|
||||
output_dir_host = f"/tmp/conv_{rid}"
|
||||
_ensure_dir(output_dir_host)
|
||||
logs_dir_host = BACKEND_LOGS_DIR_HOST
|
||||
_ensure_dir(logs_dir_host)
|
||||
output_dir_container = UPLOADS_PATH / "convert-output" / f"conv_{rid}"
|
||||
output_dir_host = _container_to_host(output_dir_container)
|
||||
_ensure_dir(str(output_dir_container))
|
||||
|
||||
logs_dir_candidate = os.getenv("BACKEND_LOGS_DIR_HOST", "")
|
||||
logs_dir_host = logs_dir_candidate if logs_dir_candidate else str(HOST_UPLOADS_PATH / "logs" / "converter") if HOST_UPLOADS_PATH else "/tmp/converter-logs"
|
||||
if not logs_dir_host.startswith('/'):
|
||||
logs_dir_host = os.path.join(os.getcwd(), logs_dir_host)
|
||||
try:
|
||||
os.makedirs(logs_dir_host, exist_ok=True)
|
||||
except Exception:
|
||||
fallback_logs = HOST_UPLOADS_PATH / "logs" / "converter" if HOST_UPLOADS_PATH else Path("/tmp/converter-logs")
|
||||
logs_dir_host = str(fallback_logs)
|
||||
os.makedirs(logs_dir_host, exist_ok=True)
|
||||
|
||||
cmd = [
|
||||
"docker", "run", "--rm",
|
||||
"-v", f"{input_host_path}:/app/input:ro",
|
||||
"-v", f"{host_input_path}:/app/input:ro",
|
||||
"-v", f"{output_dir_host}:/app/output",
|
||||
"-v", f"{logs_dir_host}:/app/logs",
|
||||
"media_converter",
|
||||
"--ext", input_ext,
|
||||
"--quality", quality,
|
||||
]
|
||||
if MEDIA_CONVERTER_CPU_LIMIT:
|
||||
cmd.extend(["--cpus", str(MEDIA_CONVERTER_CPU_LIMIT)])
|
||||
if MEDIA_CONVERTER_MEM_LIMIT:
|
||||
cmd.extend(["--memory", str(MEDIA_CONVERTER_MEM_LIMIT)])
|
||||
if MEDIA_CONVERTER_CPUSET:
|
||||
cmd.extend(["--cpuset-cpus", MEDIA_CONVERTER_CPUSET])
|
||||
|
||||
cmd.append("media_converter")
|
||||
cmd.extend(["--ext", input_ext, "--quality", quality])
|
||||
if trim_value:
|
||||
cmd.extend(["--trim", trim_value])
|
||||
if is_audio:
|
||||
cmd.append("--audio-only")
|
||||
|
||||
make_log('convert_v3', f"Run media_converter cmd: {' '.join(cmd)}")
|
||||
|
||||
proc = await asyncio.create_subprocess_exec(
|
||||
*cmd,
|
||||
|
|
@ -90,15 +147,15 @@ async def _run_media_converter(input_host_path: str, input_ext: str, quality: st
|
|||
|
||||
# Find produced media file and optional output.json
|
||||
try:
|
||||
files = os.listdir(output_dir_host)
|
||||
files = os.listdir(output_dir_container)
|
||||
except Exception as e:
|
||||
raise RuntimeError(f"Read output dir error: {e}")
|
||||
media_files = [f for f in files if f != "output.json"]
|
||||
if len(media_files) != 1:
|
||||
raise RuntimeError(f"Expected one media file, found {len(media_files)}: {media_files}")
|
||||
output_media = os.path.join(output_dir_host, media_files[0])
|
||||
output_media = os.path.join(output_dir_container, media_files[0])
|
||||
ffprobe_meta = {}
|
||||
out_json = os.path.join(output_dir_host, "output.json")
|
||||
out_json = os.path.join(output_dir_container, "output.json")
|
||||
if os.path.exists(out_json):
|
||||
try:
|
||||
with open(out_json, 'r') as f:
|
||||
|
|
@ -108,24 +165,102 @@ async def _run_media_converter(input_host_path: str, input_ext: str, quality: st
|
|||
return output_media, ffprobe_meta
|
||||
|
||||
|
||||
async def _convert_content(ec: EncryptedContent, input_host_path: str):
|
||||
content_kind = 'audio' if ec.content_type.startswith('audio/') else ('video' if ec.content_type.startswith('video/') else 'other')
|
||||
if content_kind == 'other':
|
||||
return
|
||||
async def _update_upload_session(ec: EncryptedContent, all_success: bool, errors: List[str]):
|
||||
async with db_session() as session:
|
||||
upload_row = (await session.execute(
|
||||
select(UploadSession).where(UploadSession.encrypted_cid == ec.encrypted_cid)
|
||||
)).scalars().first()
|
||||
if upload_row:
|
||||
if all_success:
|
||||
upload_row.state = 'converted'
|
||||
upload_row.error = None
|
||||
elif upload_row.state != 'converted':
|
||||
upload_row.state = 'conversion_failed'
|
||||
if errors:
|
||||
upload_row.error = _short_error(errors[0])
|
||||
await session.commit()
|
||||
|
||||
|
||||
async def _convert_content(ec: EncryptedContent, staging: PlainStaging):
|
||||
content_kind = 'audio' if ec.content_type.startswith('audio/') else ('video' if ec.content_type.startswith('video/') else 'other')
|
||||
input_ext = (ec.content_type.split('/')[-1] or 'bin')
|
||||
is_audio = content_kind == 'audio'
|
||||
# Required outputs
|
||||
required = ['high', 'low', 'low_preview']
|
||||
encrypted_hash_b58 = ContentId.deserialize(ec.encrypted_cid).content_hash_b58
|
||||
|
||||
# Preview interval
|
||||
if content_kind == 'other':
|
||||
errors: List[str] = []
|
||||
all_success = True
|
||||
try:
|
||||
file_hash, size_bytes = await _save_derivative(staging.container_path, staging.container_path)
|
||||
plain_path = os.path.join(UPLOADS_DIR, file_hash)
|
||||
plain_filename = f"{ec.encrypted_cid}.{input_ext}" if input_ext else ec.encrypted_cid
|
||||
async with db_session() as session:
|
||||
existing = (await session.execute(select(StoredContent).where(StoredContent.hash == file_hash))).scalars().first()
|
||||
if existing:
|
||||
sc = existing
|
||||
sc.type = sc.type or "local/content_bin"
|
||||
sc.filename = plain_filename
|
||||
sc.meta = {
|
||||
**(sc.meta or {}),
|
||||
'encrypted_cid': ec.encrypted_cid,
|
||||
'kind': 'original',
|
||||
'content_type': ec.content_type,
|
||||
}
|
||||
sc.updated = datetime.utcnow()
|
||||
else:
|
||||
sc = StoredContent(
|
||||
type="local/content_bin",
|
||||
hash=file_hash,
|
||||
user_id=None,
|
||||
filename=plain_filename,
|
||||
meta={
|
||||
'encrypted_cid': ec.encrypted_cid,
|
||||
'kind': 'original',
|
||||
'content_type': ec.content_type,
|
||||
},
|
||||
created=datetime.utcnow(),
|
||||
)
|
||||
session.add(sc)
|
||||
await session.flush()
|
||||
|
||||
encrypted_records = (await session.execute(select(StoredContent).where(StoredContent.hash == encrypted_hash_b58))).scalars().all()
|
||||
for encrypted_sc in encrypted_records:
|
||||
meta = dict(encrypted_sc.meta or {})
|
||||
converted = dict(meta.get('converted_content') or {})
|
||||
converted['original'] = file_hash
|
||||
meta['converted_content'] = converted
|
||||
if 'content_type' not in meta:
|
||||
meta['content_type'] = ec.content_type
|
||||
encrypted_sc.meta = meta
|
||||
encrypted_sc.decrypted_content_id = sc.id
|
||||
encrypted_sc.updated = datetime.utcnow()
|
||||
|
||||
derivative = ContentDerivative(
|
||||
content_id=ec.id,
|
||||
kind='decrypted_original',
|
||||
local_path=plain_path,
|
||||
content_type=ec.content_type,
|
||||
size_bytes=size_bytes,
|
||||
status='ready',
|
||||
)
|
||||
session.add(derivative)
|
||||
await session.commit()
|
||||
make_log('convert_v3', f"Stored original derivative for {ec.encrypted_cid}")
|
||||
except Exception as e:
|
||||
all_success = False
|
||||
errors.append(str(e))
|
||||
make_log('convert_v3', f"Convert error {ec.encrypted_cid} opt=original: {e}", level='error')
|
||||
await _update_upload_session(ec, all_success, errors)
|
||||
return
|
||||
|
||||
# audio/video path
|
||||
required = ['high', 'low', 'low_preview']
|
||||
conf = ec.preview_conf or {}
|
||||
intervals = conf.get('intervals') or [[0, int(conf.get('duration_ms', 30000))]]
|
||||
main_interval = intervals[0]
|
||||
trim_value = None
|
||||
start_s = max(0, int(main_interval[0]) // 1000)
|
||||
dur_s = max(1, int((main_interval[1] - main_interval[0]) // 1000) or 30)
|
||||
trim_value = f"{start_s},{dur_s}"
|
||||
trim_value = f"{start_s}-{start_s + dur_s}"
|
||||
|
||||
qualities = {
|
||||
'high': 'high',
|
||||
|
|
@ -133,96 +268,160 @@ async def _convert_content(ec: EncryptedContent, input_host_path: str):
|
|||
'low_preview': 'low',
|
||||
}
|
||||
|
||||
all_success = True
|
||||
errors: List[str] = []
|
||||
|
||||
for opt in required:
|
||||
derivative_kind = f"decrypted_{opt if opt != 'low_preview' else 'preview'}"
|
||||
derivative_id: Optional[int] = None
|
||||
try:
|
||||
# Mark derivative processing
|
||||
async with db_session() as session:
|
||||
cd = ContentDerivative(
|
||||
content_id=ec.id,
|
||||
kind=f"decrypted_{opt if opt != 'low_preview' else 'preview'}",
|
||||
kind=derivative_kind,
|
||||
interval_start_ms=main_interval[0] if opt == 'low_preview' else None,
|
||||
interval_end_ms=main_interval[1] if opt == 'low_preview' else None,
|
||||
local_path="",
|
||||
status='processing',
|
||||
)
|
||||
session.add(cd)
|
||||
await session.flush()
|
||||
derivative_id = cd.id
|
||||
await session.commit()
|
||||
|
||||
out_path, ffprobe = await _run_media_converter(
|
||||
input_host_path=input_host_path,
|
||||
staging=staging,
|
||||
input_ext=input_ext,
|
||||
quality=qualities[opt],
|
||||
trim_value=trim_value if opt == 'low_preview' else None,
|
||||
is_audio=is_audio,
|
||||
)
|
||||
|
||||
# Save into store and StoredContent
|
||||
file_hash, size_bytes = await _save_derivative(out_path, os.path.basename(out_path))
|
||||
|
||||
async with db_session() as session:
|
||||
sc = StoredContent(
|
||||
type="local/content_bin",
|
||||
hash=file_hash,
|
||||
user_id=None,
|
||||
filename=os.path.basename(out_path),
|
||||
meta={'encrypted_cid': ec.encrypted_cid, 'kind': opt, 'ffprobe_meta': ffprobe},
|
||||
created=datetime.utcnow(),
|
||||
)
|
||||
session.add(sc)
|
||||
await session.flush()
|
||||
sc = (await session.execute(select(StoredContent).where(StoredContent.hash == file_hash))).scalars().first()
|
||||
meta_payload = {'encrypted_cid': ec.encrypted_cid, 'kind': opt, 'ffprobe_meta': ffprobe}
|
||||
if sc:
|
||||
sc.type = sc.type or "local/content_bin"
|
||||
sc.filename = os.path.basename(out_path)
|
||||
sc.meta = meta_payload
|
||||
sc.updated = datetime.utcnow()
|
||||
else:
|
||||
sc = StoredContent(
|
||||
type="local/content_bin",
|
||||
hash=file_hash,
|
||||
user_id=None,
|
||||
filename=os.path.basename(out_path),
|
||||
meta=meta_payload,
|
||||
created=datetime.utcnow(),
|
||||
)
|
||||
session.add(sc)
|
||||
await session.flush()
|
||||
|
||||
# Update derivative record
|
||||
cd = (await session.execute(select(ContentDerivative).where(
|
||||
ContentDerivative.content_id == ec.id,
|
||||
ContentDerivative.kind == (f"decrypted_{opt if opt != 'low_preview' else 'preview'}"),
|
||||
ContentDerivative.status == 'processing'
|
||||
))).scalars().first()
|
||||
encrypted_sc = (await session.execute(select(StoredContent).where(StoredContent.hash == encrypted_hash_b58))).scalars().first()
|
||||
if encrypted_sc:
|
||||
meta = dict(encrypted_sc.meta or {})
|
||||
converted = dict(meta.get('converted_content') or {})
|
||||
converted[opt] = file_hash
|
||||
meta['converted_content'] = converted
|
||||
encrypted_sc.meta = meta
|
||||
if opt == 'high':
|
||||
encrypted_sc.decrypted_content_id = sc.id
|
||||
encrypted_sc.updated = datetime.utcnow()
|
||||
|
||||
cd = await session.get(ContentDerivative, derivative_id) if derivative_id else None
|
||||
if cd:
|
||||
cd.local_path = os.path.join(UPLOADS_DIR, file_hash)
|
||||
cd.size_bytes = size_bytes
|
||||
cd.content_type = ('audio/mpeg' if is_audio else 'video/mp4') if opt != 'high' else ec.content_type
|
||||
if is_audio:
|
||||
cd.content_type = 'audio/flac' if opt == 'high' else 'audio/mpeg'
|
||||
else:
|
||||
cd.content_type = ec.content_type if opt == 'high' else 'video/mp4'
|
||||
cd.status = 'ready'
|
||||
cd.error = None
|
||||
await session.commit()
|
||||
|
||||
output_parent = Path(out_path).parent
|
||||
shutil.rmtree(output_parent, ignore_errors=True)
|
||||
make_log('convert_v3', f"Converted {ec.encrypted_cid} opt={opt} -> {file_hash}")
|
||||
except Exception as e:
|
||||
make_log('convert_v3', f"Convert error {ec.encrypted_cid} opt={opt}: {e}", level='error')
|
||||
all_success = False
|
||||
errors.append(_short_error(e))
|
||||
async with db_session() as session:
|
||||
cd = ContentDerivative(
|
||||
content_id=ec.id,
|
||||
kind=f"decrypted_{opt if opt != 'low_preview' else 'preview'}",
|
||||
status='failed',
|
||||
error=str(e),
|
||||
local_path="",
|
||||
)
|
||||
session.add(cd)
|
||||
cd = await session.get(ContentDerivative, derivative_id) if derivative_id else None
|
||||
if cd:
|
||||
cd.status = 'failed'
|
||||
cd.error = _short_error(e)
|
||||
else:
|
||||
session.add(ContentDerivative(
|
||||
content_id=ec.id,
|
||||
kind=derivative_kind,
|
||||
status='failed',
|
||||
error=_short_error(e),
|
||||
local_path="",
|
||||
))
|
||||
await session.commit()
|
||||
|
||||
await _update_upload_session(ec, all_success, errors)
|
||||
|
||||
async def _pick_pending(limit: int) -> List[Tuple[EncryptedContent, str]]:
|
||||
|
||||
async def _pick_pending(limit: int) -> List[Tuple[EncryptedContent, PlainStaging]]:
|
||||
async with db_session() as session:
|
||||
# Find A/V contents with preview_enabled and no ready low/low_preview derivatives yet
|
||||
ecs = (await session.execute(select(EncryptedContent).where(
|
||||
EncryptedContent.preview_enabled == True
|
||||
).order_by(EncryptedContent.created_at.desc()))).scalars().all()
|
||||
# Include preview-enabled media and non-media content that need decrypted originals
|
||||
non_media_filter = and_(
|
||||
EncryptedContent.content_type.isnot(None),
|
||||
~EncryptedContent.content_type.like('audio/%'),
|
||||
~EncryptedContent.content_type.like('video/%'),
|
||||
)
|
||||
ecs = (await session.execute(
|
||||
select(EncryptedContent)
|
||||
.where(or_(EncryptedContent.preview_enabled == True, non_media_filter))
|
||||
.order_by(EncryptedContent.created_at.desc())
|
||||
)).scalars().all()
|
||||
|
||||
picked: List[Tuple[EncryptedContent, str]] = []
|
||||
picked: List[Tuple[EncryptedContent, PlainStaging]] = []
|
||||
for ec in ecs:
|
||||
try:
|
||||
cid_obj, cid_err = resolve_content(ec.encrypted_cid)
|
||||
if cid_err:
|
||||
make_log('convert_v3', f"Skip {ec.encrypted_cid}: resolve error {cid_err}", level='debug')
|
||||
continue
|
||||
encrypted_hash_b58 = cid_obj.content_hash_b58
|
||||
except Exception as exc:
|
||||
make_log('convert_v3', f"Skip {ec.encrypted_cid}: resolve exception {exc}", level='warning')
|
||||
continue
|
||||
|
||||
sc = (await session.execute(select(StoredContent).where(StoredContent.hash == encrypted_hash_b58))).scalars().first()
|
||||
if not sc or sc.onchain_index is None:
|
||||
continue
|
||||
|
||||
# Check if derivatives already ready
|
||||
rows = (await session.execute(select(ContentDerivative).where(ContentDerivative.content_id == ec.id))).scalars().all()
|
||||
kinds_ready = {r.kind for r in rows if r.status == 'ready'}
|
||||
required = {'decrypted_low', 'decrypted_high'} if ec.content_type.startswith('audio/') else {'decrypted_low', 'decrypted_high', 'decrypted_preview'}
|
||||
if ec.content_type.startswith('audio/'):
|
||||
required = {'decrypted_low', 'decrypted_high'}
|
||||
elif ec.content_type.startswith('video/'):
|
||||
required = {'decrypted_low', 'decrypted_high', 'decrypted_preview'}
|
||||
else:
|
||||
required = {'decrypted_original'}
|
||||
if required.issubset(kinds_ready):
|
||||
continue
|
||||
# Always decrypt from IPFS using local or remote key
|
||||
storage_path: str | None = None
|
||||
staging: Optional[PlainStaging] = None
|
||||
ck = (await session.execute(select(ContentKey).where(ContentKey.content_id == ec.id))).scalars().first()
|
||||
if ck:
|
||||
storage_path = await stage_plain_from_ipfs(ec, ck.key_ciphertext_b64)
|
||||
if not storage_path:
|
||||
staging = await stage_plain_from_ipfs(ec, ck.key_ciphertext_b64)
|
||||
if not staging:
|
||||
peers = (await session.execute(select(KnownNode))).scalars().all()
|
||||
for peer in peers:
|
||||
base_url = f"http://{peer.ip}:{peer.port}"
|
||||
meta = peer.meta or {}
|
||||
public_host = meta.get('public_host')
|
||||
if not public_host:
|
||||
last_resp = (meta.get('last_response') or {}).get('node', {}) if isinstance(meta, dict) else {}
|
||||
public_host = last_resp.get('public_host')
|
||||
base_url = public_host or f"http://{peer.ip}:{peer.port}"
|
||||
dek = await request_key_from_peer(base_url, ec.encrypted_cid)
|
||||
if not dek:
|
||||
continue
|
||||
|
|
@ -240,12 +439,12 @@ async def _pick_pending(limit: int) -> List[Tuple[EncryptedContent, str]]:
|
|||
)
|
||||
session.add(session_ck)
|
||||
await session.commit()
|
||||
storage_path = await stage_plain_from_ipfs(ec, dek_b64)
|
||||
if storage_path:
|
||||
staging = await stage_plain_from_ipfs(ec, dek_b64)
|
||||
if staging:
|
||||
break
|
||||
if not storage_path or not os.path.exists(storage_path):
|
||||
if not staging or not os.path.exists(staging.container_path):
|
||||
continue
|
||||
picked.append((ec, storage_path))
|
||||
picked.append((ec, staging))
|
||||
if len(picked) >= limit:
|
||||
break
|
||||
return picked
|
||||
|
|
@ -254,14 +453,14 @@ async def _pick_pending(limit: int) -> List[Tuple[EncryptedContent, str]]:
|
|||
async def worker_loop():
|
||||
sem = asyncio.Semaphore(CONCURRENCY)
|
||||
|
||||
async def _run_one(ec: EncryptedContent, input_path: str):
|
||||
async def _run_one(ec: EncryptedContent, staging: PlainStaging):
|
||||
async with sem:
|
||||
try:
|
||||
await _convert_content(ec, input_path)
|
||||
await _convert_content(ec, staging)
|
||||
# After successful conversion, attempt to remove staging file to avoid duplicates
|
||||
try:
|
||||
if input_path and input_path.startswith("/data/") and os.path.exists(input_path):
|
||||
os.remove(input_path)
|
||||
if staging and staging.container_path and os.path.exists(staging.container_path):
|
||||
os.remove(staging.container_path)
|
||||
except Exception:
|
||||
pass
|
||||
except Exception as e:
|
||||
|
|
@ -273,7 +472,7 @@ async def worker_loop():
|
|||
if not batch:
|
||||
await asyncio.sleep(3)
|
||||
continue
|
||||
tasks = [asyncio.create_task(_run_one(ec, path)) for (ec, path) in batch]
|
||||
tasks = [asyncio.create_task(_run_one(ec, staging)) for (ec, staging) in batch]
|
||||
await asyncio.gather(*tasks)
|
||||
except Exception as e:
|
||||
make_log('convert_v3', f"loop error: {e}", level='error')
|
||||
|
|
@ -285,15 +484,20 @@ async def main_fn(memory):
|
|||
await worker_loop()
|
||||
|
||||
|
||||
async def stage_plain_from_ipfs(ec: EncryptedContent, dek_wrapped: str) -> str | None:
|
||||
"""Download encrypted ENCF stream from IPFS and decrypt on the fly into a temp file."""
|
||||
import tempfile
|
||||
async def stage_plain_from_ipfs(ec: EncryptedContent, dek_wrapped: str) -> Optional[PlainStaging]:
|
||||
"""Download encrypted ENCF stream from IPFS and decrypt on the fly into shared staging."""
|
||||
os.makedirs(UPLOADS_PATH / STAGING_SUBDIR, exist_ok=True)
|
||||
try:
|
||||
dek = unwrap_dek(dek_wrapped)
|
||||
except KeyWrapError as exc:
|
||||
make_log('convert_v3', f"unwrap failed for {ec.encrypted_cid}: {exc}", level='error')
|
||||
return None
|
||||
tmp = tempfile.NamedTemporaryFile(prefix=f"dec_{ec.encrypted_cid[:8]}_", delete=False)
|
||||
|
||||
tmp = tempfile.NamedTemporaryFile(
|
||||
prefix=f"dec_{ec.encrypted_cid[:8]}_",
|
||||
dir=UPLOADS_PATH / STAGING_SUBDIR,
|
||||
delete=False,
|
||||
)
|
||||
tmp_path = tmp.name
|
||||
tmp.close()
|
||||
try:
|
||||
|
|
@ -301,7 +505,8 @@ async def stage_plain_from_ipfs(ec: EncryptedContent, dek_wrapped: str) -> str |
|
|||
async for ch in cat_stream(ec.encrypted_cid):
|
||||
yield ch
|
||||
await decrypt_encf_auto(_aiter(), dek, tmp_path)
|
||||
return tmp_path
|
||||
host_path = _container_to_host(tmp_path)
|
||||
return PlainStaging(container_path=tmp_path, host_path=host_path)
|
||||
except Exception as e:
|
||||
make_log('convert_v3', f"decrypt from ipfs failed: {e}", level='error')
|
||||
try:
|
||||
|
|
@ -312,3 +517,8 @@ async def stage_plain_from_ipfs(ec: EncryptedContent, dek_wrapped: str) -> str |
|
|||
|
||||
|
||||
|
||||
def _short_error(message: str, limit: int = ERROR_TRUNCATE_LIMIT) -> str:
|
||||
if not message:
|
||||
return message
|
||||
message = str(message)
|
||||
return message if len(message) <= limit else message[: limit - 3] + '...'
|
||||
|
|
|
|||
|
|
@@ -25,14 +25,14 @@ async def _evict_over_ttl(now: datetime) -> int:
    removed = 0
    # Pull TTL from ServiceConfig each time
    async with db_session() as session:
        ttl_days = await ServiceConfig(session).get('DERIVATIVE_CACHE_TTL_DAYS', ENV_TTL_DAYS)
        if int(ttl_days) <= 0:
        ttl_days = int(await ServiceConfig(session).get('DERIVATIVE_CACHE_TTL_DAYS', ENV_TTL_DAYS))
        if ttl_days <= 0:
            return 0
    async with db_session() as session:
        rows = (await session.execute(select(ContentDerivative).where(ContentDerivative.status == 'ready'))).scalars().all()
        for r in rows:
            la = r.last_access_at or r.created_at
            if la and (now - la) > timedelta(days=TTL_DAYS):
            if la and (now - la) > timedelta(days=ttl_days):
                try:
                    if r.local_path and os.path.exists(r.local_path):
                        os.remove(r.local_path)

@@ -80,7 +80,11 @@ async def _evict_to_fit():


async def main_fn(memory):
    make_log('derivative_janitor', f"Started (MAX_GB={MAX_GB}, TTL_DAYS={TTL_DAYS})", level='info')
    async with db_session() as session:
        cfg = ServiceConfig(session)
        runtime_max_gb = float(await cfg.get('DERIVATIVE_CACHE_MAX_GB', ENV_MAX_GB))
        runtime_ttl_days = int(await cfg.get('DERIVATIVE_CACHE_TTL_DAYS', ENV_TTL_DAYS))
        make_log('derivative_janitor', f"Started (MAX_GB={runtime_max_gb}, TTL_DAYS={runtime_ttl_days})", level='info')
    while True:
        try:
            now = datetime.utcnow()
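For clarity, the eviction rule the janitor now applies with the runtime TTL, restated as a small hedged sketch (function name is illustrative):

```python
# Illustrative restatement of the TTL eviction predicate used above: a ready
# derivative becomes eligible for removal once its last access (or creation)
# timestamp is older than the ServiceConfig-driven TTL.
from datetime import datetime, timedelta
from typing import Optional


def is_expired(last_access_at: Optional[datetime], created_at: Optional[datetime],
               ttl_days: int, now: datetime) -> bool:
    la = last_access_at or created_at
    return bool(la) and (now - la) > timedelta(days=ttl_days)
```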
@ -0,0 +1,152 @@
|
|||
import asyncio
|
||||
from typing import Dict, List, Optional, Tuple
|
||||
from urllib.parse import urlencode
|
||||
|
||||
import httpx
|
||||
from sqlalchemy import select
|
||||
|
||||
from app.core.logger import make_log
|
||||
from app.core.storage import db_session
|
||||
from app.core.models import KnownNode, NodeEvent
|
||||
from app.core.events.service import (
|
||||
store_remote_events,
|
||||
upsert_cursor,
|
||||
LOCAL_PUBLIC_KEY,
|
||||
)
|
||||
from app.core.models.events import NodeEventCursor
|
||||
from app.core._secrets import hot_pubkey, hot_seed
|
||||
from app.core.network.nodesig import sign_headers
|
||||
from base58 import b58encode
|
||||
|
||||
|
||||
def _node_public_base(node: KnownNode) -> Optional[str]:
|
||||
meta = node.meta or {}
|
||||
public_host = (meta.get('public_host') or '').strip()
|
||||
if public_host:
|
||||
base = public_host.rstrip('/')
|
||||
if base.startswith('http://') or base.startswith('https://'):
|
||||
return base
|
||||
scheme = 'https' if node.port == 443 else 'http'
|
||||
return f"{scheme}://{base.lstrip('/')}"
|
||||
scheme = 'https' if node.port == 443 else 'http'
|
||||
host = (node.ip or '').strip()
|
||||
if not host:
|
||||
return None
|
||||
default_port = 443 if scheme == 'https' else 80
|
||||
if node.port and node.port != default_port:
|
||||
return f"{scheme}://{host}:{node.port}"
|
||||
return f"{scheme}://{host}"
|
||||
|
||||
|
||||
async def _fetch_events_for_node(node: KnownNode, limit: int = 100) -> Tuple[List[Dict], int]:
|
||||
base = _node_public_base(node)
|
||||
if not base:
|
||||
return [], 0
|
||||
async with db_session() as session:
|
||||
cursor = (await session.execute(
|
||||
select(NodeEventCursor).where(NodeEventCursor.source_public_key == node.public_key)
|
||||
)).scalar_one_or_none()
|
||||
since = cursor.last_seq if cursor else 0
|
||||
query = urlencode({"since": since, "limit": limit})
|
||||
path = f"/api/v1/network.events?{query}"
|
||||
url = f"{base}{path}"
|
||||
pk_b58 = b58encode(hot_pubkey).decode()
|
||||
headers = sign_headers("GET", path, b"", hot_seed, pk_b58)
|
||||
async with httpx.AsyncClient(timeout=20.0) as client:
|
||||
try:
|
||||
resp = await client.get(url, headers=headers)
|
||||
if resp.status_code == 403:
|
||||
make_log("Events", f"Access denied by node {node.public_key}", level="warning")
|
||||
return [], since
|
||||
resp.raise_for_status()
|
||||
data = resp.json()
|
||||
except Exception as exc:
|
||||
make_log("Events", f"Fetch events failed from {node.public_key}: {exc}", level="debug")
|
||||
return [], since
|
||||
events = data.get("events") or []
|
||||
next_since = int(data.get("next_since") or since)
|
||||
return events, next_since
|
||||
|
||||
|
||||
async def _apply_event(session, event: NodeEvent):
|
||||
if event.event_type == "stars_payment":
|
||||
from app.core.models import StarsInvoice
|
||||
payload = event.payload or {}
|
||||
invoice_id = payload.get("invoice_id")
|
||||
telegram_id = payload.get("telegram_id")
|
||||
content_hash = payload.get("content_hash")
|
||||
amount = payload.get("amount")
|
||||
if not invoice_id or not telegram_id or not content_hash:
|
||||
return
|
||||
invoice = (await session.execute(select(StarsInvoice).where(StarsInvoice.external_id == invoice_id))).scalar_one_or_none()
|
||||
if not invoice:
|
||||
invoice = StarsInvoice(
|
||||
external_id=invoice_id,
|
||||
user_id=payload.get("user_id"),
|
||||
type=payload.get('type') or 'access',
|
||||
telegram_id=telegram_id,
|
||||
amount=amount,
|
||||
content_hash=content_hash,
|
||||
paid=True,
|
||||
paid_at=event.created_at,
|
||||
payment_node_id=payload.get("payment_node", {}).get("public_key"),
|
||||
payment_node_public_host=payload.get("payment_node", {}).get("public_host"),
|
||||
bot_username=payload.get("bot_username"),
|
||||
is_remote=True,
|
||||
)
|
||||
session.add(invoice)
|
||||
else:
|
||||
invoice.paid = True
|
||||
invoice.paid_at = invoice.paid_at or event.created_at
|
||||
invoice.payment_node_id = payload.get("payment_node", {}).get("public_key")
|
||||
invoice.payment_node_public_host = payload.get("payment_node", {}).get("public_host")
|
||||
invoice.bot_username = payload.get("bot_username") or invoice.bot_username
|
||||
invoice.telegram_id = telegram_id or invoice.telegram_id
|
||||
invoice.is_remote = invoice.is_remote or True
|
||||
if payload.get('type'):
|
||||
invoice.type = payload['type']
|
||||
event.status = 'applied'
|
||||
event.applied_at = event.applied_at or event.received_at
|
||||
elif event.event_type == "content_indexed":
|
||||
# The index scout will pick up via remote_content_index; we only mark event applied
|
||||
event.status = 'recorded'
|
||||
elif event.event_type == "node_registered":
|
||||
event.status = 'recorded'
|
||||
else:
|
||||
event.status = 'recorded'
|
||||
|
||||
|
||||
async def main_fn(memory):
|
||||
make_log("Events", "Sync service started", level="info")
|
||||
while True:
|
||||
try:
|
||||
async with db_session() as session:
|
||||
nodes = (await session.execute(select(KnownNode))).scalars().all()
|
||||
trusted_nodes = [
|
||||
n for n in nodes
|
||||
if isinstance(n.meta, dict) and n.meta.get("role") == "trusted" and n.public_key != LOCAL_PUBLIC_KEY
|
||||
]
|
||||
trusted_keys = {n.public_key for n in trusted_nodes}
|
||||
for node in trusted_nodes:
|
||||
events, next_since = await _fetch_events_for_node(node)
|
||||
if not events:
|
||||
if next_since:
|
||||
async with db_session() as session:
|
||||
await upsert_cursor(session, node.public_key, next_since, node.meta.get("public_host") if isinstance(node.meta, dict) else None)
|
||||
await session.commit()
|
||||
continue
|
||||
async with db_session() as session:
|
||||
stored = await store_remote_events(
|
||||
session,
|
||||
events,
|
||||
allowed_public_keys=trusted_keys,
|
||||
)
|
||||
for ev in stored:
|
||||
await _apply_event(session, ev)
|
||||
if stored:
|
||||
await session.commit()
|
||||
await upsert_cursor(session, node.public_key, next_since, node.meta.get("public_host") if isinstance(node.meta, dict) else None)
|
||||
await session.commit()
|
||||
except Exception as exc:
|
||||
make_log("Events", f"Sync loop error: {exc}", level="error")
|
||||
await asyncio.sleep(10)
|
||||
|
|
@ -1,16 +1,21 @@
|
|||
import asyncio
|
||||
from typing import List
|
||||
import os
|
||||
from datetime import datetime
|
||||
from typing import List, Optional
|
||||
|
||||
import httpx
|
||||
from urllib.parse import urlparse
|
||||
import random
|
||||
import shutil
|
||||
from sqlalchemy import select
|
||||
|
||||
from app.core.logger import make_log
|
||||
from app.core.storage import db_session
|
||||
from app.core.models.my_network import KnownNode
|
||||
from app.core.models.my_network import KnownNode, RemoteContentIndex
|
||||
from app.core.models.events import NodeEvent
|
||||
from app.core.models.content_v3 import EncryptedContent, ContentDerivative
|
||||
from app.core.ipfs_client import pin_add, find_providers, swarm_connect
|
||||
from app.core.ipfs_client import pin_add, pin_ls, find_providers, swarm_connect, add_streamed_file
|
||||
from app.core.events.service import LOCAL_PUBLIC_KEY
|
||||
|
||||
|
||||
INTERVAL_SEC = 60
|
||||
|
|
@ -18,7 +23,7 @@ ENV_PIN_CONCURRENCY = int(os.getenv('SYNC_MAX_CONCURRENT_PINS', '4'))
|
|||
ENV_DISK_WATERMARK_PCT = int(os.getenv('SYNC_DISK_LOW_WATERMARK_PCT', '90'))
|
||||
|
||||
|
||||
async def fetch_index(base_url: str, etag: str | None, since: str | None) -> tuple[List[dict], str | None]:
|
||||
async def fetch_index(base_url: str, etag: Optional[str], since: Optional[str]) -> tuple[List[dict], Optional[str]]:
|
||||
try:
|
||||
headers = {}
|
||||
params = {}
|
||||
|
|
@ -27,7 +32,8 @@ async def fetch_index(base_url: str, etag: str | None, since: str | None) -> tup
|
|||
url = f"{base_url.rstrip('/')}/api/v1/content.delta" if since else f"{base_url.rstrip('/')}/api/v1/content.index"
|
||||
if etag:
|
||||
headers['If-None-Match'] = etag
|
||||
async with httpx.AsyncClient(timeout=20) as client:
|
||||
# follow_redirects handles peers that force HTTPS and issue 301s
|
||||
async with httpx.AsyncClient(timeout=20, follow_redirects=True) as client:
|
||||
r = await client.get(url, headers=headers, params=params)
|
||||
if r.status_code != 200:
|
||||
if r.status_code == 304:
|
||||
|
|
@ -102,6 +108,71 @@ async def upsert_content(item: dict):
|
|||
make_log('index_scout_v3', f"thumbnail fetch failed for {cid}: {e}", level='warning')
|
||||
|
||||
|
||||
def _node_base_url(node: KnownNode) -> Optional[str]:
|
||||
meta = node.meta or {}
|
||||
public_host = (meta.get('public_host') or '').strip()
|
||||
if public_host:
|
||||
base = public_host.rstrip('/')
|
||||
if base.startswith('http://') or base.startswith('https://'):
|
||||
return base
|
||||
scheme = 'https' if node.port == 443 else 'http'
|
||||
return f"{scheme}://{base.lstrip('/')}"
|
||||
scheme = 'https' if node.port == 443 else 'http'
|
||||
host = (node.ip or '').strip()
|
||||
if not host:
|
||||
return None
|
||||
default_port = 443 if scheme == 'https' else 80
|
||||
if node.port and node.port != default_port:
|
||||
return f"{scheme}://{host}:{node.port}"
|
||||
return f"{scheme}://{host}"
|
||||
|
||||
|
||||
async def _update_remote_index(node_id: int, items: List[dict], *, incremental: bool):
|
||||
if not items:
|
||||
return
|
||||
async with db_session() as session:
|
||||
existing_rows = (await session.execute(
|
||||
select(RemoteContentIndex).where(RemoteContentIndex.remote_node_id == node_id)
|
||||
)).scalars().all()
|
||||
existing_map = {row.encrypted_hash: row for row in existing_rows if row.encrypted_hash}
|
||||
seen = set()
|
||||
now = datetime.utcnow()
|
||||
for item in items:
|
||||
cid = item.get('encrypted_cid')
|
||||
if not cid:
|
||||
continue
|
||||
seen.add(cid)
|
||||
payload_meta = {
|
||||
'title': item.get('title'),
|
||||
'description': item.get('description'),
|
||||
'size_bytes': item.get('size_bytes'),
|
||||
'preview_enabled': item.get('preview_enabled'),
|
||||
'preview_conf': item.get('preview_conf'),
|
||||
'issuer_node_id': item.get('issuer_node_id'),
|
||||
'salt_b64': item.get('salt_b64'),
|
||||
}
|
||||
meta_clean = {k: v for k, v in payload_meta.items() if v is not None}
|
||||
row = existing_map.get(cid)
|
||||
if row:
|
||||
row.content_type = item.get('content_type') or row.content_type
|
||||
row.meta = {**(row.meta or {}), **meta_clean}
|
||||
row.last_updated = now
|
||||
else:
|
||||
row = RemoteContentIndex(
|
||||
remote_node_id=node_id,
|
||||
content_type=item.get('content_type') or 'application/octet-stream',
|
||||
encrypted_hash=cid,
|
||||
meta=meta_clean,
|
||||
last_updated=now,
|
||||
)
|
||||
session.add(row)
|
||||
if not incremental and existing_map:
|
||||
for hash_value, row in list(existing_map.items()):
|
||||
if hash_value not in seen:
|
||||
await session.delete(row)
|
||||
await session.commit()
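The upsert/prune semantics above reduce to a simple rule: incremental batches only add or refresh entries, while a full snapshot also drops entries the peer no longer advertises. A minimal in-memory sketch of that rule (no DB, field names are illustrative):

```python
# Sketch of the merge rule _update_remote_index applies to a node's remote index.
from typing import Dict

def merge_index(existing: Dict[str, dict], items: list[dict], *, incremental: bool) -> Dict[str, dict]:
    merged = dict(existing)
    seen = set()
    for item in items:
        cid = item.get('encrypted_cid')
        if not cid:
            continue
        seen.add(cid)
        meta = {k: v for k, v in item.items() if v is not None and k != 'encrypted_cid'}
        merged[cid] = {**merged.get(cid, {}), **meta}      # upsert: incoming fields win
    if not incremental:
        merged = {cid: row for cid, row in merged.items() if cid in seen}   # prune vanished rows
    return merged

state = {'cidA': {'title': 'Old'}, 'cidB': {'title': 'Gone'}}
# A full snapshot that mentions only cidA removes cidB and refreshes cidA.
assert merge_index(state, [{'encrypted_cid': 'cidA', 'title': 'New'}], incremental=False) == {'cidA': {'title': 'New'}}
# An incremental delta leaves unmentioned rows alone.
assert 'cidB' in merge_index(state, [{'encrypted_cid': 'cidA', 'title': 'New'}], incremental=True)
```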
async def main_fn(memory):
|
||||
make_log('index_scout_v3', 'Service started', level='info')
|
||||
sem = None
|
||||
|
|
@ -116,8 +187,70 @@ async def main_fn(memory):
|
|||
sem = asyncio.Semaphore(max_pins)
|
||||
async with db_session() as session:
|
||||
nodes = (await session.execute(select(KnownNode))).scalars().all()
|
||||
node_by_pk = {n.public_key: n for n in nodes if n.public_key}
|
||||
async with db_session() as session:
|
||||
pending_events = (await session.execute(
|
||||
select(NodeEvent)
|
||||
.where(NodeEvent.event_type == 'content_indexed', NodeEvent.status.in_(('recorded', 'local', 'processing')))
|
||||
.order_by(NodeEvent.created_at.asc())
|
||||
.limit(25)
|
||||
)).scalars().all()
|
||||
for ev in pending_events:
|
||||
if ev.status != 'processing':
|
||||
ev.status = 'processing'
|
||||
await session.commit()
|
||||
for ev in pending_events:
|
||||
payload = ev.payload or {}
|
||||
cid = payload.get('encrypted_cid') or payload.get('content_cid')
|
||||
if ev.origin_public_key == LOCAL_PUBLIC_KEY:
|
||||
async with db_session() as session:
|
||||
ref = await session.get(NodeEvent, ev.id)
|
||||
if ref:
|
||||
ref.status = 'applied'
|
||||
ref.applied_at = datetime.utcnow()
|
||||
await session.commit()
|
||||
continue
|
||||
if not cid:
|
||||
async with db_session() as session:
|
||||
ref = await session.get(NodeEvent, ev.id)
|
||||
if ref:
|
||||
ref.status = 'applied'
|
||||
ref.applied_at = datetime.utcnow()
|
||||
await session.commit()
|
||||
continue
|
||||
node = node_by_pk.get(ev.origin_public_key)
|
||||
if not node:
|
||||
async with db_session() as session:
|
||||
node = (await session.execute(select(KnownNode).where(KnownNode.public_key == ev.origin_public_key))).scalar_one_or_none()
|
||||
if node:
|
||||
node_by_pk[node.public_key] = node
|
||||
if not node:
|
||||
make_log('index_scout_v3', f"Event {ev.uid} refers to unknown node {ev.origin_public_key}", level='debug')
|
||||
async with db_session() as session:
|
||||
ref = await session.get(NodeEvent, ev.id)
|
||||
if ref:
|
||||
ref.status = 'recorded'
|
||||
await session.commit()
|
||||
continue
|
||||
try:
|
||||
await _pin_one(node, cid)
|
||||
async with db_session() as session:
|
||||
ref = await session.get(NodeEvent, ev.id)
|
||||
if ref:
|
||||
ref.status = 'applied'
|
||||
ref.applied_at = datetime.utcnow()
|
||||
await session.commit()
|
||||
except Exception as exc:
|
||||
make_log('index_scout_v3', f"Event pin failed for {cid}: {exc}", level='warning')
|
||||
async with db_session() as session:
|
||||
ref = await session.get(NodeEvent, ev.id)
|
||||
if ref:
|
||||
ref.status = 'recorded'
|
||||
await session.commit()
|
||||
for n in nodes:
|
||||
base = f"http://{n.ip}:{n.port}"
|
||||
base = _node_base_url(n)
|
||||
if not base:
|
||||
continue
|
||||
# jitter 0..30s per node to reduce stampede
|
||||
await asyncio.sleep(random.uniform(0, 30))
|
||||
etag = (n.meta or {}).get('index_etag')
|
||||
|
|
@ -141,6 +274,10 @@ async def main_fn(memory):
|
|||
if not items:
|
||||
continue
|
||||
make_log('index_scout_v3', f"Fetched {len(items)} from {base}")
|
||||
try:
|
||||
await _update_remote_index(n.id, items, incremental=bool(since))
|
||||
except Exception as exc:
|
||||
make_log('index_scout_v3', f"remote index update failed for node {n.id}: {exc}", level='warning')
|
||||
|
||||
# Check disk watermark
|
||||
try:
|
||||
|
|
@ -153,9 +290,23 @@ async def main_fn(memory):
|
|||
except Exception:
|
||||
pass
|
||||
|
||||
async def _pin_one(cid: str):
|
||||
async def _pin_one(node: KnownNode, cid: str):
|
||||
async with sem:
|
||||
try:
|
||||
node_ipfs_meta = (node.meta or {}).get('ipfs') or {}
|
||||
multiaddrs = node_ipfs_meta.get('multiaddrs') or []
|
||||
for addr in multiaddrs:
|
||||
try:
|
||||
await swarm_connect(addr)
|
||||
except Exception:
|
||||
pass
|
||||
try:
|
||||
existing = await pin_ls(cid)
|
||||
if existing and existing.get('Keys'):
|
||||
make_log('index_scout_v3', f"pin {cid} already present", level='debug')
|
||||
return
|
||||
except Exception:
|
||||
pass
|
||||
# Try to pre-connect to discovered providers
|
||||
try:
|
||||
provs = await find_providers(cid, max_results=5)
|
||||
|
|
@ -167,16 +318,57 @@ async def main_fn(memory):
|
|||
pass
|
||||
except Exception:
|
||||
pass
|
||||
await pin_add(cid, recursive=True)
|
||||
try:
|
||||
await asyncio.wait_for(pin_add(cid, recursive=True), timeout=60)
|
||||
return
|
||||
except httpx.HTTPStatusError as http_err:
|
||||
body = (http_err.response.text or '').lower() if http_err.response else ''
|
||||
if 'already pinned' in body or 'pin already set' in body:
|
||||
make_log('index_scout_v3', f"pin {cid} already present", level='debug')
|
||||
return
|
||||
raise
|
||||
except Exception as e:
|
||||
make_log('index_scout_v3', f"pin {cid} failed: {e}", level='warning')
|
||||
# Attempt HTTP gateway fallback before logging failure
|
||||
fallback_sources = []
|
||||
node_host = node.meta.get('public_host') if isinstance(node.meta, dict) else None
|
||||
try:
|
||||
# Derive gateway host: prefer public_host domain if present
|
||||
parsed = urlparse(node_host) if node_host else None
|
||||
gateway_host = parsed.hostname if parsed and parsed.hostname else (node.ip or '').split(':')[0]
|
||||
gateway_port = parsed.port if (parsed and parsed.port not in (None, 80, 443)) else 8080
|
||||
if gateway_host:
|
||||
gateway_url = f"http://{gateway_host}:{gateway_port}/ipfs/{cid}"
|
||||
make_log('index_scout_v3', f"fallback download start {cid} via {gateway_url}", level='debug')
|
||||
async with httpx.AsyncClient(timeout=None) as client:
|
||||
resp = await client.get(gateway_url)
|
||||
resp.raise_for_status()
|
||||
data = resp.content
|
||||
chunk_bytes = int(os.getenv('CRYPTO_CHUNK_BYTES', '1048576'))
|
||||
add_params = {
|
||||
'cid-version': 1,
|
||||
'raw-leaves': 'true',
|
||||
'chunker': f'size-{chunk_bytes}',
|
||||
'hash': 'sha2-256',
|
||||
'pin': 'true',
|
||||
}
|
||||
result = await add_streamed_file([data], filename=f'{cid}.bin', params=add_params)
|
||||
if str(result.get('Hash')) != str(cid):
|
||||
raise ValueError(f"gateway add returned mismatched CID {result.get('Hash')}")
|
||||
make_log('index_scout_v3', f"pin {cid} fetched via gateway {gateway_host}:{gateway_port}", level='info')
|
||||
return
|
||||
else:
|
||||
fallback_sources.append('gateway-host-missing')
|
||||
except Exception as fallback_err:
|
||||
fallback_sources.append(str(fallback_err))
|
||||
make_log('index_scout_v3', f"pin {cid} failed: {e}; fallback={'; '.join(fallback_sources) if fallback_sources else 'none'}", level='warning')
|
||||
|
||||
tasks = []
|
||||
for it in items:
|
||||
await upsert_content(it)
|
||||
cid = it.get('encrypted_cid')
|
||||
if cid:
|
||||
tasks.append(asyncio.create_task(_pin_one(cid)))
|
||||
make_log('index_scout_v3', f"queue pin {cid}")
|
||||
tasks.append(asyncio.create_task(_pin_one(n, cid)))
|
||||
if tasks:
|
||||
await asyncio.gather(*tasks)
|
||||
except Exception as e:
|
||||
|
|
|
|||
|
|
@ -1,4 +1,5 @@
|
|||
import asyncio
|
||||
import os
|
||||
from base64 import b64decode
|
||||
from datetime import datetime
|
||||
|
||||
|
|
@ -6,7 +7,8 @@ from base58 import b58encode
|
|||
from sqlalchemy import String, and_, desc, cast
|
||||
from tonsdk.boc import Cell
|
||||
from tonsdk.utils import Address
|
||||
from app.core._config import CLIENT_TELEGRAM_BOT_USERNAME
|
||||
from app.core._config import CLIENT_TELEGRAM_BOT_USERNAME, PROJECT_HOST
|
||||
from app.core.events.service import record_event
|
||||
from app.core._blockchain.ton.platform import platform
|
||||
from app.core._blockchain.ton.toncenter import toncenter
|
||||
from app.core._utils.send_status import send_status
|
||||
|
|
@ -16,8 +18,10 @@ from app.core.models.user import User
|
|||
from app.core.models.node_storage import StoredContent
|
||||
from app.core._utils.resolve_content import resolve_content
|
||||
from app.core.models.wallet_connection import WalletConnection
|
||||
from app.core._keyboards import get_inline_keyboard
|
||||
from app.core.models._telegram import Wrapped_CBotChat
|
||||
|
||||
|
||||
MIN_ONCHAIN_INDEX = int(os.getenv("MIN_ONCHAIN_INDEX", "8"))
|
||||
from sqlalchemy import select, and_, desc
|
||||
from app.core.storage import db_session
|
||||
import os
|
||||
|
|
@ -57,45 +61,79 @@ async def indexer_loop(memory, platform_found: bool, seqno: int) -> [bool, int]:
|
|||
)
|
||||
))).scalars().all()
|
||||
for new_license in new_licenses:
|
||||
licensed_content = (await session.execute(select(StoredContent).where(
|
||||
StoredContent.id == new_license.content_id
|
||||
))).scalars().first()
|
||||
if not licensed_content:
|
||||
make_log("Indexer", f"Licensed content not found: {new_license.content_id}", level="error")
|
||||
try:
|
||||
licensed_content = (await session.execute(select(StoredContent).where(
|
||||
StoredContent.id == new_license.content_id
|
||||
))).scalars().first()
|
||||
if not licensed_content:
|
||||
make_log("Indexer", f"Licensed content not found: {new_license.content_id}", level="error")
|
||||
new_license.meta = {**(new_license.meta or {}), 'notification_sent': True, 'notification_error': 'content_not_found'}
|
||||
await session.commit()
|
||||
continue
|
||||
|
||||
content_metadata = await licensed_content.metadata_json_async(session)
|
||||
assert content_metadata, "No content metadata found"
|
||||
|
||||
if not (licensed_content.owner_address == new_license.owner_address):
|
||||
try:
|
||||
user = await session.get(User, new_license.user_id)
|
||||
if user.telegram_id and licensed_content:
|
||||
await (Wrapped_CBotChat(memory._client_telegram_bot, chat_id=user.telegram_id, user=user, db_session=session)).send_content(
|
||||
session, licensed_content
|
||||
)
|
||||
|
||||
wallet_owner_connection = (await session.execute(
|
||||
select(WalletConnection).where(
|
||||
WalletConnection.wallet_address == licensed_content.owner_address,
|
||||
WalletConnection.invalidated == False
|
||||
).order_by(desc(WalletConnection.id))
|
||||
)).scalars().first()
|
||||
wallet_owner_user = await session.get(User, wallet_owner_connection.user_id) if wallet_owner_connection else None
|
||||
if wallet_owner_user.telegram_id:
|
||||
wallet_owner_bot = Wrapped_CBotChat(memory._telegram_bot, chat_id=wallet_owner_user.telegram_id, user=wallet_owner_user, db_session=session)
|
||||
await wallet_owner_bot.send_message(
|
||||
user.translated('p_licenseWasBought').format(
|
||||
username=user.front_format(),
|
||||
nft_address=f'"https://tonviewer.com/{new_license.onchain_address}"',
|
||||
content_title=content_metadata.get('name', 'Unknown'),
|
||||
),
|
||||
message_type='notification',
|
||||
)
|
||||
content_metadata = await licensed_content.metadata_json_async(session)
|
||||
except BaseException as e:
|
||||
make_log("IndexerSendNewLicense", f"Error: {e}" + '\n' + traceback.format_exc(), level="error")
|
||||
make_log("Indexer", f"Metadata fetch failed for content_id={licensed_content.id}: {e}", level="warning")
|
||||
content_metadata = None
|
||||
|
||||
new_license.meta = {**new_license.meta, 'notification_sent': True}
|
||||
await session.commit()
|
||||
# Metadata is best-effort here: it should never block indexer loop progress.
|
||||
if not content_metadata:
|
||||
content_metadata = {
|
||||
'name': licensed_content.meta.get('title') or licensed_content.filename or 'Unknown',
|
||||
'artist': licensed_content.meta.get('artist'),
|
||||
'title': licensed_content.meta.get('title'),
|
||||
}
|
||||
|
||||
if not (licensed_content.owner_address == new_license.owner_address):
|
||||
try:
|
||||
user = await session.get(User, new_license.user_id)
|
||||
if user and user.telegram_id:
|
||||
await (
|
||||
Wrapped_CBotChat(
|
||||
memory._client_telegram_bot,
|
||||
chat_id=user.telegram_id,
|
||||
user=user,
|
||||
db_session=session,
|
||||
)
|
||||
).send_content(session, licensed_content)
|
||||
|
||||
wallet_owner_connection = (await session.execute(
|
||||
select(WalletConnection).where(
|
||||
WalletConnection.wallet_address == licensed_content.owner_address,
|
||||
WalletConnection.invalidated == False
|
||||
).order_by(desc(WalletConnection.id))
|
||||
)).scalars().first()
|
||||
wallet_owner_user = await session.get(User, wallet_owner_connection.user_id) if wallet_owner_connection else None
|
||||
if wallet_owner_user and wallet_owner_user.telegram_id:
|
||||
wallet_owner_bot = Wrapped_CBotChat(
|
||||
memory._telegram_bot,
|
||||
chat_id=wallet_owner_user.telegram_id,
|
||||
user=wallet_owner_user,
|
||||
db_session=session,
|
||||
)
|
||||
meta_title = content_metadata.get('title') or content_metadata.get('name') or 'Unknown'
|
||||
meta_artist = content_metadata.get('artist')
|
||||
formatted_title = f"{meta_artist} – {meta_title}" if meta_artist else meta_title
|
||||
await wallet_owner_bot.send_message(
|
||||
user.translated('p_licenseWasBought').format(
|
||||
username=user.front_format(),
|
||||
nft_address=f'"https://tonviewer.com/{new_license.onchain_address}"',
|
||||
content_title=formatted_title,
|
||||
),
|
||||
message_type='notification',
|
||||
)
|
||||
except BaseException as e:
|
||||
make_log("IndexerSendNewLicense", f"Error: {e}" + '\n' + traceback.format_exc(), level="error")
|
||||
|
||||
# Preserve current behavior: do not retry notifications indefinitely.
|
||||
new_license.meta = {**(new_license.meta or {}), 'notification_sent': True}
|
||||
await session.commit()
|
||||
except BaseException as e:
|
||||
# Never allow a single broken license/metadata record to block the whole indexer loop.
|
||||
make_log("Indexer", f"Error processing new license {getattr(new_license, 'id', None)}: {e}" + '\n' + traceback.format_exc(), level="error")
|
||||
new_license.meta = {**(new_license.meta or {}), 'notification_sent': True, 'notification_error': str(e)[:256]}
|
||||
await session.commit()
|
||||
|
||||
content_without_cid = (await session.execute(select(StoredContent).where(StoredContent.content_id == None))).scalars().all()
|
||||
for target_content in content_without_cid:
|
||||
|
|
@ -110,11 +148,15 @@ async def indexer_loop(memory, platform_found: bool, seqno: int) -> [bool, int]:
|
|||
)).scalars().first()
|
||||
last_known_index = last_known_index_.onchain_index if last_known_index_ else 0
|
||||
last_known_index = max(last_known_index, 0)
|
||||
if last_known_index < (MIN_ONCHAIN_INDEX - 1):
|
||||
make_log(
|
||||
"Indexer",
|
||||
f"Adjusting last_known_index from {last_known_index} to {MIN_ONCHAIN_INDEX - 1} (MIN_ONCHAIN_INDEX)",
|
||||
level="debug"
|
||||
)
|
||||
last_known_index = MIN_ONCHAIN_INDEX - 1
|
||||
make_log("Indexer", f"Last known index: {last_known_index}", level="debug")
|
||||
if last_known_index_:
|
||||
next_item_index = last_known_index + 1
|
||||
else:
|
||||
next_item_index = 0
|
||||
next_item_index = last_known_index + 1
|
||||
|
||||
resolve_item_result = await toncenter.run_get_method(platform.address.to_string(1, 1, 1), 'get_nft_address_by_index', [['num', next_item_index]])
|
||||
make_log("Indexer", f"Resolve item result: {resolve_item_result}", level="debug")
|
||||
|
|
@ -141,6 +183,13 @@ async def indexer_loop(memory, platform_found: bool, seqno: int) -> [bool, int]:
|
|||
|
||||
assert item_get_data_result['stack'][2][0] == 'num', "Item index is not a number"
|
||||
item_index = int(item_get_data_result['stack'][2][1], 16)
|
||||
if item_index < MIN_ONCHAIN_INDEX:
|
||||
make_log(
|
||||
"Indexer",
|
||||
f"Skip on-chain item {item_index}: below MIN_ONCHAIN_INDEX={MIN_ONCHAIN_INDEX}",
|
||||
level="info"
|
||||
)
|
||||
return platform_found, seqno
|
||||
assert item_index == next_item_index, "Item index mismatch"
|
||||
|
||||
item_platform_address = Cell.one_from_boc(b64decode(item_get_data_result['stack'][3][1]['bytes'])).begin_parse().read_msg_addr()
|
||||
|
|
@ -221,19 +270,38 @@ async def indexer_loop(memory, platform_found: bool, seqno: int) -> [bool, int]:
|
|||
user = await session.get(User, user_wallet_connection.user_id)
|
||||
|
||||
if user:
|
||||
user_uploader_wrapper = Wrapped_CBotChat(memory._telegram_bot, chat_id=user.telegram_id, user=user, db_session=session)
|
||||
await user_uploader_wrapper.send_message(
|
||||
user.translated('p_contentWasIndexed').format(
|
||||
item_address=item_address.to_string(1, 1, 1),
|
||||
item_index=item_index,
|
||||
),
|
||||
message_type='notification',
|
||||
reply_markup=get_inline_keyboard([
|
||||
[{
|
||||
'text': user.translated('viewTrackAsClient_button'),
|
||||
'url': f"https://t.me/{CLIENT_TELEGRAM_BOT_USERNAME}?start=C{encrypted_stored_content.cid.serialize_v2()}"
|
||||
}],
|
||||
])
|
||||
# Notify user about indexed content via client bot (main UX bot),
|
||||
# but keep ability to clean up uploader-bot hint messages.
|
||||
user_client_wrapper = Wrapped_CBotChat(
|
||||
memory._client_telegram_bot,
|
||||
chat_id=user.telegram_id,
|
||||
user=user,
|
||||
db_session=session,
|
||||
)
|
||||
user_uploader_wrapper = Wrapped_CBotChat(
|
||||
memory._telegram_bot,
|
||||
chat_id=user.telegram_id,
|
||||
user=user,
|
||||
db_session=session,
|
||||
)
|
||||
ref_id = (user.meta or {}).get('ref_id')
|
||||
if not ref_id:
|
||||
ref_id = user.ensure_ref_id()
|
||||
await session.commit()
|
||||
|
||||
message_text = user.translated('p_contentWasIndexed').format(
|
||||
item_address=item_address.to_string(1, 1, 1),
|
||||
item_index=item_index,
|
||||
)
|
||||
|
||||
await user_client_wrapper.send_message(
|
||||
message_text,
|
||||
message_type='notification'
|
||||
)
|
||||
|
||||
await user_client_wrapper.send_content(
|
||||
session,
|
||||
encrypted_stored_content
|
||||
)
|
||||
|
||||
try:
|
||||
|
|
@ -246,7 +314,11 @@ async def indexer_loop(memory, platform_found: bool, seqno: int) -> [bool, int]:
|
|||
)
|
||||
))
|
||||
for hint_message in result.scalars().all():
|
||||
await user_uploader_wrapper.delete_message(hint_message.message_id)
|
||||
# Delete the hint with the bot that originally sent it.
|
||||
if hint_message.bot_id == user_client_wrapper.bot_id:
|
||||
await user_client_wrapper.delete_message(hint_message.message_id)
|
||||
elif hint_message.bot_id == user_uploader_wrapper.bot_id:
|
||||
await user_uploader_wrapper.delete_message(hint_message.message_id)
|
||||
except BaseException as e:
|
||||
make_log("Indexer", f"Error while deleting hint messages: {e}" + '\n' + traceback.format_exc(), level="error")
|
||||
elif encrypted_stored_content.type.startswith('onchain') and encrypted_stored_content.onchain_index == item_index:
|
||||
|
|
@ -263,6 +335,22 @@ async def indexer_loop(memory, platform_found: bool, seqno: int) -> [bool, int]:
|
|||
**encrypted_stored_content.meta,
|
||||
**item_metadata_packed
|
||||
}
|
||||
encrypted_stored_content.content_id = item_content_cid_str
|
||||
try:
|
||||
await record_event(
|
||||
session,
|
||||
'content_indexed',
|
||||
{
|
||||
'onchain_index': item_index,
|
||||
'content_hash': item_content_hash_str,
|
||||
'encrypted_cid': item_content_cid_str,
|
||||
'item_address': item_address.to_string(1, 1, 1),
|
||||
'owner_address': item_owner_address.to_string(1, 1, 1) if item_owner_address else None,
|
||||
},
|
||||
origin_host=PROJECT_HOST,
|
||||
)
|
||||
except Exception as exc:
|
||||
make_log("Events", f"Failed to record content_indexed event: {exc}", level="warning")
|
||||
|
||||
await session.commit()
|
||||
return platform_found, seqno
|
||||
|
|
@ -283,9 +371,25 @@ async def indexer_loop(memory, platform_found: bool, seqno: int) -> [bool, int]:
|
|||
encrypted=True,
|
||||
decrypted_content_id=None,
|
||||
key_id=None,
|
||||
content_id=item_content_cid_str,
|
||||
updated=datetime.now()
|
||||
)
|
||||
session.add(onchain_stored_content)
|
||||
try:
|
||||
await record_event(
|
||||
session,
|
||||
'content_indexed',
|
||||
{
|
||||
'onchain_index': item_index,
|
||||
'content_hash': item_content_hash_str,
|
||||
'encrypted_cid': item_content_cid_str,
|
||||
'item_address': item_address.to_string(1, 1, 1),
|
||||
'owner_address': item_owner_address.to_string(1, 1, 1) if item_owner_address else None,
|
||||
},
|
||||
origin_host=PROJECT_HOST,
|
||||
)
|
||||
except Exception as exc:
|
||||
make_log("Events", f"Failed to record content_indexed event: {exc}", level="warning")
|
||||
await session.commit()
|
||||
make_log("Indexer", f"Item indexed: {item_content_hash_str}", level="info")
|
||||
last_known_index += 1
|
||||
|
|
|
|||
|
|
@ -18,9 +18,12 @@ from app.core.models.wallet_connection import WalletConnection
|
|||
from app.core._keyboards import get_inline_keyboard
|
||||
from app.core.models._telegram import Wrapped_CBotChat
|
||||
from app.core.storage import db_session
|
||||
from app.core._config import CLIENT_TELEGRAM_API_KEY
|
||||
from app.core._config import CLIENT_TELEGRAM_API_KEY, CLIENT_TELEGRAM_BOT_USERNAME, PROJECT_HOST
|
||||
from app.core.models.user import User
|
||||
from app.core.models import StarsInvoice
|
||||
from app.core.events.service import record_event
|
||||
from app.core._secrets import hot_pubkey
|
||||
from base58 import b58encode
|
||||
import os
|
||||
import traceback
|
||||
|
||||
|
|
@ -53,15 +56,42 @@ async def license_index_loop(memory, platform_found: bool, seqno: int) -> [bool,
|
|||
|
||||
if star_payment.amount == existing_invoice.amount:
|
||||
if not existing_invoice.paid:
|
||||
user = (await session.execute(select(User).where(User.id == existing_invoice.user_id))).scalars().first()
|
||||
existing_invoice.paid = True
|
||||
existing_invoice.paid_at = datetime.utcnow()
|
||||
existing_invoice.telegram_id = getattr(user, 'telegram_id', None)
|
||||
existing_invoice.payment_tx_id = getattr(star_payment, 'id', None)
|
||||
existing_invoice.payment_node_id = b58encode(hot_pubkey).decode()
|
||||
existing_invoice.payment_node_public_host = PROJECT_HOST
|
||||
existing_invoice.bot_username = CLIENT_TELEGRAM_BOT_USERNAME
|
||||
existing_invoice.is_remote = False
|
||||
await record_event(
|
||||
session,
|
||||
'stars_payment',
|
||||
{
|
||||
'invoice_id': existing_invoice.external_id,
|
||||
'content_hash': existing_invoice.content_hash,
|
||||
'amount': existing_invoice.amount,
|
||||
'user_id': existing_invoice.user_id,
|
||||
'telegram_id': existing_invoice.telegram_id,
|
||||
'bot_username': CLIENT_TELEGRAM_BOT_USERNAME,
|
||||
'type': existing_invoice.type,
|
||||
'payment_node': {
|
||||
'public_key': b58encode(hot_pubkey).decode(),
|
||||
'public_host': PROJECT_HOST,
|
||||
},
|
||||
'paid_at': existing_invoice.paid_at.isoformat() + 'Z' if existing_invoice.paid_at else None,
|
||||
'payment_tx_id': existing_invoice.payment_tx_id,
|
||||
},
|
||||
origin_host=PROJECT_HOST,
|
||||
)
|
||||
await session.commit()
|
||||
|
||||
licensed_content = (await session.execute(select(StoredContent).where(StoredContent.hash == existing_invoice.content_hash))).scalars().first()
|
||||
user = (await session.execute(select(User).where(User.id == existing_invoice.user_id))).scalars().first()
|
||||
|
||||
await (Wrapped_CBotChat(memory._client_telegram_bot, chat_id=user.telegram_id, user=user, db_session=session)).send_content(
|
||||
session, licensed_content
|
||||
)
|
||||
if user and user.telegram_id and licensed_content:
|
||||
await (Wrapped_CBotChat(memory._client_telegram_bot, chat_id=user.telegram_id, user=user, db_session=session)).send_content(
|
||||
session, licensed_content
|
||||
)
|
||||
except BaseException as e:
|
||||
make_log("StarsProcessing", f"Local error: {e}" + '\n' + traceback.format_exc(), level="error")
|
||||
|
||||
|
|
@ -100,6 +130,7 @@ async def license_index_loop(memory, platform_found: bool, seqno: int) -> [bool,
|
|||
process_content = (await session.execute(select(UserContent).where(
|
||||
and_(
|
||||
UserContent.type.startswith('nft/'),
|
||||
UserContent.type != 'nft/ignored',
|
||||
UserContent.updated < (datetime.now() - timedelta(minutes=60)),
|
||||
)
|
||||
).order_by(UserContent.updated.asc()))).scalars().first()
|
||||
|
|
|
|||
|
|
@ -1,3 +1,6 @@
|
|||
from base64 import b32decode
|
||||
from typing import Optional, Tuple
|
||||
|
||||
from base58 import b58encode, b58decode
|
||||
|
||||
from tonsdk.boc import begin_cell
|
||||
|
|
@ -12,25 +15,50 @@ from app.core._utils.string_binary import string_to_bytes_fixed_size, bytes_to_s
|
|||
# cid_v2#_ cid_version:uint8 content_sha256:uint256 *[Param]s = CIDv2;
|
||||
|
||||
class ContentId:
|
||||
"""Unified abstraction for legacy ContentID and ENCF/IPFS CID strings."""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
version: int = None,
|
||||
content_hash: bytes = None, # only SHA256
|
||||
onchain_index: int = None,
|
||||
accept_type: str = None,
|
||||
encryption_key_sha256: bytes = None,
|
||||
version: Optional[int] = None,
|
||||
content_hash: Optional[bytes] = None, # only SHA256
|
||||
onchain_index: Optional[int] = None,
|
||||
accept_type: Optional[str] = None,
|
||||
encryption_key_sha256: Optional[bytes] = None,
|
||||
*,
|
||||
raw_value: Optional[str] = None,
|
||||
cid_format: Optional[str] = None,
|
||||
multibase_prefix: Optional[str] = None,
|
||||
multicodec: Optional[int] = None,
|
||||
multihash_code: Optional[int] = 0x12,
|
||||
multihash_length: Optional[int] = 32,
|
||||
):
|
||||
self.version = version
|
||||
self.content_hash = content_hash
|
||||
|
||||
self.onchain_index = onchain_index or -1
|
||||
self.onchain_index = onchain_index if onchain_index is not None else -1
|
||||
self.accept_type = accept_type
|
||||
self.encryption_key_sha256 = encryption_key_sha256
|
||||
if self.encryption_key_sha256:
|
||||
assert len(self.encryption_key_sha256) == 32, "Invalid encryption key length"
|
||||
|
||||
self._raw_value = raw_value
|
||||
if cid_format:
|
||||
self.cid_format = cid_format
|
||||
else:
|
||||
if self.version == 1:
|
||||
self.cid_format = 'content_id_v1'
|
||||
elif self.version == 2:
|
||||
self.cid_format = 'content_id_v2'
|
||||
else:
|
||||
self.cid_format = 'content_id_v2'
|
||||
self.multibase_prefix = multibase_prefix
|
||||
self.multicodec = multicodec
|
||||
self.multihash_code = multihash_code
|
||||
self.multihash_length = multihash_length
|
||||
|
||||
@property
|
||||
def content_hash_b58(self) -> str:
|
||||
assert self.content_hash, "Content hash is not set"
|
||||
return b58encode(self.content_hash).decode()
|
||||
|
||||
@property
|
||||
|
|
@ -38,6 +66,11 @@ class ContentId:
|
|||
return self.onchain_index if (not (self.onchain_index is None) and self.onchain_index >= 0) else None
|
||||
|
||||
def serialize_v2(self, include_accept_type=False) -> str:
|
||||
if self.cid_format == 'ipfs':
|
||||
if self._raw_value:
|
||||
return self._raw_value
|
||||
return self._serialize_ipfs()
|
||||
|
||||
cid_bin = (
|
||||
(2).to_bytes(1, 'big') # cid version
|
||||
+ self.content_hash
|
||||
|
|
@ -60,6 +93,8 @@ class ContentId:
|
|||
return b58encode(cid_bin).decode()
|
||||
|
||||
def serialize_v1(self) -> str:
|
||||
if self.cid_format == 'ipfs':
|
||||
raise ValueError("Cannot serialize IPFS CID as ContentId v1")
|
||||
at_bin = string_to_bytes_fixed_size(self.accept_type, 15)
|
||||
assert len(self.content_hash) == 32, "Invalid hash length"
|
||||
if self.onchain_index < 0:
|
||||
|
|
@ -133,13 +168,31 @@ class ContentId:
|
|||
|
||||
@classmethod
|
||||
def deserialize(cls, cid: str):
|
||||
cid_version = int.from_bytes(b58decode(cid)[0:1], 'big')
|
||||
if not cid:
|
||||
raise ValueError("Empty content id provided")
|
||||
|
||||
first_char = cid[0]
|
||||
if first_char in ('b', 'B', 'z', 'Z'):
|
||||
return cls.from_ipfs(cid)
|
||||
|
||||
try:
|
||||
cid_version = int.from_bytes(b58decode(cid)[0:1], 'big')
|
||||
except Exception:
|
||||
cid_version = None
|
||||
|
||||
if cid_version == 1:
|
||||
return cls.from_v1(cid)
|
||||
elif cid_version == 2:
|
||||
return cls.from_v2(cid)
|
||||
else:
|
||||
raise ValueError("Invalid cid version")
|
||||
obj = cls.from_v1(cid)
|
||||
obj._raw_value = cid
|
||||
return obj
|
||||
if cid_version == 2:
|
||||
obj = cls.from_v2(cid)
|
||||
obj._raw_value = cid
|
||||
return obj
|
||||
|
||||
try:
|
||||
return cls.from_ipfs(cid)
|
||||
except Exception as exc:
|
||||
raise ValueError(f"Invalid cid format: {exc}") from exc
|
||||
|
||||
def json_format(self):
|
||||
return {
|
||||
|
|
@ -147,7 +200,130 @@ class ContentId:
|
|||
"content_hash": self.content_hash_b58,
|
||||
"onchain_index": self.safe_onchain_index,
|
||||
"accept_type": self.accept_type,
|
||||
"encryption_key_sha256": b58encode(self.encryption_key_sha256).decode() if self.encryption_key_sha256 else None
|
||||
"encryption_key_sha256": b58encode(self.encryption_key_sha256).decode() if self.encryption_key_sha256 else None,
|
||||
"format": self.cid_format,
|
||||
"raw": self.serialize_v2() if self.cid_format == 'ipfs' else None,
|
||||
}
|
||||
|
||||
# --- helpers for IPFS/ENCF CID handling ---------------------------------
|
||||
|
||||
@staticmethod
|
||||
def _decode_multibase(cid_str: str) -> Tuple[bytes, Optional[str]]:
|
||||
prefix = cid_str[0]
|
||||
if prefix in ('b', 'B'):
|
||||
payload = cid_str[1:]
|
||||
padding = (-len(payload)) % 8
|
||||
decoded = b32decode(payload.upper() + ('=' * padding), casefold=True)
|
||||
return decoded, prefix.lower()
|
||||
if prefix in ('z', 'Z'):
|
||||
return b58decode(cid_str[1:]), prefix.lower()
|
||||
# CIDv0 without explicit prefix
|
||||
return b58decode(cid_str), None
|
||||
|
||||
@staticmethod
|
||||
def _read_varint(data: bytes, offset: int) -> Tuple[int, int]:
|
||||
result = 0
|
||||
shift = 0
|
||||
while True:
|
||||
if offset >= len(data):
|
||||
raise ValueError("truncated varint")
|
||||
byte = data[offset]
|
||||
offset += 1
|
||||
result |= (byte & 0x7F) << shift
|
||||
if not (byte & 0x80):
|
||||
break
|
||||
shift += 7
|
||||
if shift > 63:
|
||||
raise ValueError("varint overflow")
|
||||
return result, offset
|
||||
|
||||
@classmethod
|
||||
def from_ipfs(cls, cid: str):
|
||||
cid = cid.strip()
|
||||
payload, multibase_prefix = cls._decode_multibase(cid)
|
||||
|
||||
idx = 0
|
||||
version: Optional[int] = None
|
||||
codec: Optional[int] = None
|
||||
|
||||
if multibase_prefix is not None:
|
||||
version, idx = cls._read_varint(payload, idx)
|
||||
if version not in (0, 1):
|
||||
raise ValueError(f"unsupported CID version: {version}")
|
||||
if version == 1:
|
||||
codec, idx = cls._read_varint(payload, idx)
|
||||
else:
|
||||
codec = 0x70 # dag-pb default for CIDv0
|
||||
else:
|
||||
# CIDv0 without explicit version/codec
|
||||
version = 0
|
||||
codec = 0x70
|
||||
|
||||
multihash_code, idx = cls._read_varint(payload, idx)
|
||||
multihash_length, idx = cls._read_varint(payload, idx)
|
||||
digest = payload[idx:idx + multihash_length]
|
||||
if len(digest) != multihash_length:
|
||||
raise ValueError("truncated multihash digest")
|
||||
if multihash_length != 32:
|
||||
raise ValueError("unsupported multihash length (expected 32 bytes)")
|
||||
if multihash_code != 0x12:
|
||||
raise ValueError(f"unsupported multihash code: {hex(multihash_code)}")
|
||||
|
||||
return cls(
|
||||
version=version,
|
||||
content_hash=digest,
|
||||
onchain_index=None,
|
||||
accept_type=None,
|
||||
encryption_key_sha256=None,
|
||||
raw_value=cid,
|
||||
cid_format='ipfs',
|
||||
multibase_prefix=multibase_prefix,
|
||||
multicodec=codec,
|
||||
multihash_code=multihash_code,
|
||||
multihash_length=multihash_length,
|
||||
)
|
||||
|
||||
def _serialize_ipfs(self) -> str:
|
||||
if not self.content_hash:
|
||||
raise ValueError("Cannot serialize IPFS CID without content hash")
|
||||
if self.multibase_prefix is None:
|
||||
# default to CIDv0 (base58btc) dag-pb
|
||||
multihash = self._encode_varint(self.multihash_code or 0x12) + self._encode_varint(self.multihash_length or len(self.content_hash)) + self.content_hash
|
||||
return b58encode(multihash).decode()
|
||||
|
||||
version_bytes = self._encode_varint(self.version or 1)
|
||||
codec_bytes = b''
|
||||
if (self.version or 1) == 1:
|
||||
codec_bytes = self._encode_varint(self.multicodec or 0x70)
|
||||
|
||||
multihash = (
|
||||
version_bytes +
|
||||
codec_bytes +
|
||||
self._encode_varint(self.multihash_code or 0x12) +
|
||||
self._encode_varint(self.multihash_length or len(self.content_hash)) +
|
||||
self.content_hash
|
||||
)
|
||||
|
||||
if self.multibase_prefix == 'z':
|
||||
return 'z' + b58encode(multihash).decode()
|
||||
if self.multibase_prefix == 'b':
|
||||
from base64 import b32encode
|
||||
encoded = b32encode(multihash).decode().rstrip('=').lower()
|
||||
return 'b' + encoded
|
||||
# Fallback to base58btc without prefix
|
||||
return b58encode(multihash).decode()
|
||||
|
||||
@staticmethod
|
||||
def _encode_varint(value: int) -> bytes:
|
||||
if value < 0:
|
||||
raise ValueError("varint cannot encode negative values")
|
||||
out = bytearray()
|
||||
while True:
|
||||
to_write = value & 0x7F
|
||||
value >>= 7
|
||||
if value:
|
||||
out.append(to_write | 0x80)
|
||||
else:
|
||||
out.append(to_write)
|
||||
break
|
||||
return bytes(out)
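The two varint helpers implement unsigned LEB128 (7 data bits per byte, MSB as continuation flag), as used for the CID version, codec, and multihash header. A quick round-trip sanity check, restated standalone for illustration:

```python
# Round-trip check for the varint encoding used by the CID helpers above.
def encode_varint(value: int) -> bytes:
    out = bytearray()
    while True:
        to_write = value & 0x7F
        value >>= 7
        if value:
            out.append(to_write | 0x80)      # continuation bit: more bytes follow
        else:
            out.append(to_write)
            break
    return bytes(out)

def read_varint(data: bytes, offset: int = 0) -> tuple[int, int]:
    result, shift = 0, 0
    while True:
        byte = data[offset]
        offset += 1
        result |= (byte & 0x7F) << shift
        if not (byte & 0x80):
            return result, offset
        shift += 7

assert encode_varint(0x70) == b'\x70'        # dag-pb codec fits in a single byte
assert encode_varint(300) == b'\xac\x02'     # multi-byte example
assert read_varint(b'\xac\x02') == (300, 2)  # decodes back to 300
```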
|
||||
|
|
|
|||
|
|
@ -115,6 +115,7 @@ def _clean_text_content(text: str, is_hashtag: bool = False) -> str:
|
|||
async def create_metadata_for_item(
|
||||
db_session,
|
||||
title: str = None,
|
||||
artist: str = None,
|
||||
cover_url: str = None,
|
||||
authors: list = None,
|
||||
hashtags: list = [],
|
||||
|
|
@ -128,6 +129,15 @@ async def create_metadata_for_item(
|
|||
cleaned_title = cleaned_title[:100].strip() # Truncate and strip after cleaning
|
||||
assert len(cleaned_title) > 3, f"Cleaned title '{cleaned_title}' (from original '{title}') is too short or became empty after cleaning."
|
||||
|
||||
cleaned_artist = None
|
||||
if artist:
|
||||
cleaned_artist = _clean_text_content(artist, is_hashtag=False)
|
||||
cleaned_artist = cleaned_artist[:100].strip()
|
||||
if not cleaned_artist:
|
||||
cleaned_artist = None
|
||||
|
||||
display_name = f"{cleaned_artist} – {cleaned_title}" if cleaned_artist else cleaned_title
|
||||
|
||||
# Process and clean hashtags
|
||||
processed_hashtags = []
|
||||
if hashtags and isinstance(hashtags, list):
|
||||
|
|
@ -142,17 +152,21 @@ async def create_metadata_for_item(
|
|||
processed_hashtags = list(dict.fromkeys(processed_hashtags))[:10]
|
||||
|
||||
item_metadata = {
|
||||
'name': cleaned_title,
|
||||
'attributes': [
|
||||
# {
|
||||
# 'trait_type': 'Artist',
|
||||
# 'value': 'Unknown'
|
||||
# },
|
||||
],
|
||||
'name': display_name,
|
||||
'title': cleaned_title,
|
||||
'display_name': display_name,
|
||||
'downloadable': downloadable,
|
||||
'tags': processed_hashtags, # New field for storing the list of cleaned hashtags
|
||||
'attributes': [],
|
||||
}
|
||||
|
||||
if cleaned_artist:
|
||||
item_metadata['artist'] = cleaned_artist
|
||||
item_metadata['attributes'].append({
|
||||
'trait_type': 'Artist',
|
||||
'value': cleaned_artist,
|
||||
})
|
||||
|
||||
# Generate description from the processed hashtags
|
||||
item_metadata['description'] = ' '.join([f"#{h}" for h in processed_hashtags if h])
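For a concrete picture, this is roughly what the resulting `item_metadata` looks like for one cleaned input; the title, artist, and hashtags below are made-up examples and assume `_clean_text_content` leaves them unchanged:

```python
# Illustrative output of the metadata assembly above for a sample input.
cleaned_title = "Midnight Drive"
cleaned_artist = "Example Artist"            # hypothetical values
processed_hashtags = ["synthwave", "night"]
downloadable = False

display_name = f"{cleaned_artist} – {cleaned_title}" if cleaned_artist else cleaned_title

item_metadata = {
    'name': display_name,                    # "Example Artist – Midnight Drive"
    'title': cleaned_title,
    'display_name': display_name,
    'downloadable': downloadable,
    'tags': processed_hashtags,
    'attributes': [],
}
if cleaned_artist:
    item_metadata['artist'] = cleaned_artist
    item_metadata['attributes'].append({'trait_type': 'Artist', 'value': cleaned_artist})

item_metadata['description'] = ' '.join(f"#{h}" for h in processed_hashtags if h)
assert item_metadata['description'] == "#synthwave #night"
```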
|
||||
|
||||
|
|
|
|||
|
|
@ -4,7 +4,6 @@ import os
|
|||
import struct
|
||||
from typing import BinaryIO, Iterator, AsyncIterator
|
||||
|
||||
from Crypto.Cipher import SIV
|
||||
from Crypto.Cipher import AES
|
||||
|
||||
|
||||
|
|
@ -62,7 +61,7 @@ def encrypt_file_to_encf(src: BinaryIO, key: bytes, chunk_bytes: int, salt: byte
|
|||
block = src.read(chunk_bytes)
|
||||
if not block:
|
||||
break
|
||||
siv = SIV.new(key=key, ciphermod=AES) # new object per message
|
||||
siv = AES.new(key, AES.MODE_SIV) # new object per message
|
||||
siv.update(_ad(salt, idx))
|
||||
ciph, tag = siv.encrypt_and_digest(block)
|
||||
yield struct.pack(">I", len(block))
|
||||
|
|
@ -76,7 +75,6 @@ async def decrypt_encf_to_file(byte_iter: AsyncIterator[bytes], key: bytes, out_
|
|||
Parse ENCF v1 stream from async byte iterator and write plaintext to out_path.
|
||||
"""
|
||||
import aiofiles
|
||||
from Crypto.Cipher import SIV as _SIV
|
||||
from Crypto.Cipher import AES as _AES
|
||||
|
||||
buf = bytearray()
|
||||
|
|
@ -129,7 +127,7 @@ async def decrypt_encf_to_file(byte_iter: AsyncIterator[bytes], key: bytes, out_
|
|||
c = bytes(buf[:p_len])
|
||||
t = bytes(buf[p_len:p_len+TAG_LEN])
|
||||
del buf[:p_len+TAG_LEN]
|
||||
siv = _SIV.new(key=key, ciphermod=_AES)
|
||||
siv = _AES.new(key, _AES.MODE_SIV)
|
||||
siv.update(_ad(salt, idx))
|
||||
p = siv.decrypt_and_verify(c, t)
|
||||
await out.write(p)
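Both paths now use pycryptodome's `AES.new(key, AES.MODE_SIV)` with a fresh cipher object per chunk and the chunk index bound into the associated data. A minimal round trip in the same style; the exact `_ad(salt, idx)` encoding is not shown in this hunk, so a simple salt-plus-index packing is assumed purely for illustration:

```python
# Minimal AES-SIV round trip mirroring the per-chunk scheme above (pycryptodome).
import os
import struct
from Crypto.Cipher import AES

def ad(salt: bytes, idx: int) -> bytes:
    # Assumption: associated data is salt followed by the big-endian chunk index.
    return salt + struct.pack(">Q", idx)

key = os.urandom(64)                         # MODE_SIV needs a double-length key (AES-256-SIV here)
salt = os.urandom(16)
block = b"example plaintext chunk"

siv = AES.new(key, AES.MODE_SIV)             # fresh object per message, as in the diff
siv.update(ad(salt, 0))
ciphertext, tag = siv.encrypt_and_digest(block)

siv = AES.new(key, AES.MODE_SIV)
siv.update(ad(salt, 0))
assert siv.decrypt_and_verify(ciphertext, tag) == block

# Binding the chunk index into the AD means a replayed chunk at another index fails:
siv = AES.new(key, AES.MODE_SIV)
siv.update(ad(salt, 1))
try:
    siv.decrypt_and_verify(ciphertext, tag)
except ValueError:
    pass                                     # MAC check failed, as expected
```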
|
||||
|
|
|
|||
|
|
@ -0,0 +1,17 @@
|
|||
from .service import (
|
||||
record_event,
|
||||
store_remote_events,
|
||||
verify_event_signature,
|
||||
next_local_seq,
|
||||
upsert_cursor,
|
||||
prune_events,
|
||||
)
|
||||
|
||||
__all__ = [
|
||||
'record_event',
|
||||
'store_remote_events',
|
||||
'verify_event_signature',
|
||||
'next_local_seq',
|
||||
'upsert_cursor',
|
||||
'prune_events',
|
||||
]
|
||||
|
|
@ -0,0 +1,185 @@
|
|||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
from datetime import datetime, timedelta, timezone
|
||||
from typing import Any, Dict, Iterable, List, Optional
|
||||
from uuid import uuid4
|
||||
|
||||
from base58 import b58decode, b58encode
|
||||
import nacl.signing
|
||||
from sqlalchemy import select, delete
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
|
||||
from app.core.logger import make_log
|
||||
from app.core._secrets import hot_pubkey, hot_seed
|
||||
from app.core.models import NodeEvent, NodeEventCursor
|
||||
|
||||
|
||||
LOCAL_PUBLIC_KEY = b58encode(hot_pubkey).decode()
|
||||
|
||||
|
||||
def _normalize_dt(value: Optional[datetime]) -> datetime:
|
||||
if value is None:
|
||||
return datetime.utcnow()
|
||||
if value.tzinfo is not None:
|
||||
return value.astimezone(timezone.utc).replace(tzinfo=None)
|
||||
return value
|
||||
|
||||
|
||||
def _parse_iso_dt(iso_value: Optional[str]) -> datetime:
|
||||
if not iso_value:
|
||||
return datetime.utcnow()
|
||||
try:
|
||||
parsed = datetime.fromisoformat(iso_value.replace('Z', '+00:00'))
|
||||
except Exception:
|
||||
return datetime.utcnow()
|
||||
return _normalize_dt(parsed)
|
||||
|
||||
|
||||
def _canonical_blob(data: Dict[str, Any]) -> bytes:
|
||||
return json.dumps(data, sort_keys=True, separators=(",", ":")).encode()
|
||||
|
||||
|
||||
def _sign_event(blob: Dict[str, Any]) -> str:
|
||||
signing_key = nacl.signing.SigningKey(hot_seed)
|
||||
signature = signing_key.sign(_canonical_blob(blob)).signature
|
||||
return b58encode(signature).decode()
|
||||
|
||||
|
||||
def verify_event_signature(event: Dict[str, Any]) -> bool:
|
||||
try:
|
||||
origin_key = event["origin_public_key"]
|
||||
signature = event["signature"]
|
||||
payload = {
|
||||
"origin_public_key": origin_key,
|
||||
"origin_host": event.get("origin_host"),
|
||||
"seq": event["seq"],
|
||||
"uid": event["uid"],
|
||||
"event_type": event["event_type"],
|
||||
"payload": event.get("payload") or {},
|
||||
"created_at": event.get("created_at"),
|
||||
}
|
||||
verify_key = nacl.signing.VerifyKey(b58decode(origin_key))
|
||||
verify_key.verify(_canonical_blob(payload), b58decode(signature))
|
||||
return True
|
||||
except Exception as exc:
|
||||
make_log("Events", f"Signature validation failed: {exc}", level="warning")
|
||||
return False
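Signing and verification both operate on the canonical JSON blob (sorted keys, compact separators) over the same field set, excluding the signature itself. A self-contained round trip with throwaway keys; the host, uid, and payload values are placeholders:

```python
# Illustrative Ed25519 sign/verify round trip over the canonicalised event body.
import json
import nacl.signing
from base58 import b58encode, b58decode

def canonical_blob(data: dict) -> bytes:
    return json.dumps(data, sort_keys=True, separators=(",", ":")).encode()

signing_key = nacl.signing.SigningKey.generate()          # throwaway; the node uses hot_seed
origin_public_key = b58encode(bytes(signing_key.verify_key)).decode()

event_body = {
    "origin_public_key": origin_public_key,
    "origin_host": "node.example.org",                    # hypothetical host
    "seq": 1,
    "uid": "0" * 32,
    "event_type": "content_indexed",
    "payload": {"encrypted_cid": "example-cid"},
    "created_at": "2024-01-01T00:00:00Z",
}
signature = b58encode(signing_key.sign(canonical_blob(event_body)).signature).decode()

# Receiver side: rebuild the same field set and verify against the claimed origin key.
verify_key = nacl.signing.VerifyKey(b58decode(origin_public_key))
verify_key.verify(canonical_blob(event_body), b58decode(signature))   # raises BadSignatureError on mismatch
```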
|
||||
|
||||
|
||||
async def next_local_seq(session: AsyncSession) -> int:
|
||||
result = await session.execute(
|
||||
select(NodeEvent.seq)
|
||||
.where(NodeEvent.origin_public_key == LOCAL_PUBLIC_KEY)
|
||||
.order_by(NodeEvent.seq.desc())
|
||||
.limit(1)
|
||||
)
|
||||
row = result.scalar_one_or_none()
|
||||
return int(row or 0) + 1
|
||||
|
||||
|
||||
async def record_event(
|
||||
session: AsyncSession,
|
||||
event_type: str,
|
||||
payload: Dict[str, Any],
|
||||
origin_host: Optional[str] = None,
|
||||
created_at: Optional[datetime] = None,
|
||||
) -> NodeEvent:
|
||||
seq = await next_local_seq(session)
|
||||
created_dt = _normalize_dt(created_at)
|
||||
event_body = {
|
||||
"origin_public_key": LOCAL_PUBLIC_KEY,
|
||||
"origin_host": origin_host,
|
||||
"seq": seq,
|
||||
"uid": uuid4().hex,
|
||||
"event_type": event_type,
|
||||
"payload": payload,
|
||||
"created_at": created_dt.replace(tzinfo=timezone.utc).isoformat().replace('+00:00', 'Z'),
|
||||
}
|
||||
signature = _sign_event(event_body)
|
||||
node_event = NodeEvent(
|
||||
origin_public_key=LOCAL_PUBLIC_KEY,
|
||||
origin_host=origin_host,
|
||||
seq=seq,
|
||||
uid=event_body["uid"],
|
||||
event_type=event_type,
|
||||
payload=payload,
|
||||
signature=signature,
|
||||
created_at=created_dt,
|
||||
status='local',
|
||||
)
|
||||
session.add(node_event)
|
||||
await session.flush()
|
||||
make_log("Events", f"Recorded local event {event_type} seq={seq}")
|
||||
return node_event
|
||||
|
||||
|
||||
async def upsert_cursor(session: AsyncSession, source_public_key: str, seq: int, host: Optional[str]):
|
||||
existing = (await session.execute(
|
||||
select(NodeEventCursor).where(NodeEventCursor.source_public_key == source_public_key)
|
||||
)).scalar_one_or_none()
|
||||
if existing:
|
||||
if seq > existing.last_seq:
|
||||
existing.last_seq = seq
|
||||
if host:
|
||||
existing.source_public_host = host
|
||||
else:
|
||||
cursor = NodeEventCursor(
|
||||
source_public_key=source_public_key,
|
||||
last_seq=seq,
|
||||
source_public_host=host,
|
||||
)
|
||||
session.add(cursor)
|
||||
await session.flush()
|
||||
|
||||
|
||||
async def store_remote_events(
|
||||
session: AsyncSession,
|
||||
events: Iterable[Dict[str, Any]],
|
||||
allowed_public_keys: Optional[set[str]] = None,
|
||||
) -> List[NodeEvent]:
|
||||
stored: List[NodeEvent] = []
|
||||
for event in events:
|
||||
if not verify_event_signature(event):
|
||||
continue
|
||||
origin_pk = event["origin_public_key"]
|
||||
if allowed_public_keys is not None and origin_pk not in allowed_public_keys:
|
||||
make_log("Events", f"Ignored event from untrusted node {origin_pk}", level="warning")
|
||||
continue
|
||||
seq = int(event["seq"])
|
||||
exists = (await session.execute(
|
||||
select(NodeEvent).where(
|
||||
NodeEvent.origin_public_key == origin_pk,
|
||||
NodeEvent.seq == seq,
|
||||
)
|
||||
)).scalar_one_or_none()
|
||||
if exists:
|
||||
continue
|
||||
created_dt = _parse_iso_dt(event.get("created_at"))
|
||||
received_dt = datetime.utcnow()
|
||||
node_event = NodeEvent(
|
||||
origin_public_key=origin_pk,
|
||||
origin_host=event.get("origin_host"),
|
||||
seq=seq,
|
||||
uid=event["uid"],
|
||||
event_type=event["event_type"],
|
||||
payload=event.get("payload") or {},
|
||||
signature=event["signature"],
|
||||
created_at=created_dt,
|
||||
status='recorded',
|
||||
received_at=received_dt,
|
||||
)
|
||||
session.add(node_event)
|
||||
stored.append(node_event)
|
||||
await upsert_cursor(session, origin_pk, seq, event.get("origin_host"))
|
||||
make_log("Events", f"Ingested remote event {event['event_type']} from {origin_pk} seq={seq}", level="debug")
|
||||
if stored:
|
||||
await session.flush()
|
||||
return stored
|
||||
|
||||
|
||||
async def prune_events(session: AsyncSession, max_age_days: int = 90):
|
||||
cutoff = datetime.utcnow() - timedelta(days=max_age_days)
|
||||
await session.execute(
|
||||
delete(NodeEvent).where(NodeEvent.created_at < cutoff)
|
||||
)
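Ingestion therefore deduplicates on the `(origin_public_key, seq)` pair and only ever moves a cursor forward. The same rule, sketched over plain dicts without the ORM:

```python
# Sketch of the dedup/cursor rule behind store_remote_events and upsert_cursor.
from typing import Dict, Tuple

cursors: Dict[str, int] = {}                 # origin_public_key -> highest seq seen
seen: set[Tuple[str, int]] = set()           # (origin_public_key, seq) already stored

def ingest(origin_pk: str, seq: int) -> bool:
    """Return True if the event is new and should be stored."""
    if (origin_pk, seq) in seen:
        return False                         # duplicate (origin, seq): ignored
    seen.add((origin_pk, seq))
    if seq > cursors.get(origin_pk, 0):      # cursor only moves forward
        cursors[origin_pk] = seq
    return True

assert ingest("peerA", 1) is True
assert ingest("peerA", 1) is False           # replay is ignored
assert ingest("peerA", 3) is True
assert cursors["peerA"] == 3
```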
|
||||
|
|
@ -26,8 +26,19 @@ async def add_streamed_file(stream_iter: Iterable[bytes], filename: str = "file.
|
|||
}
|
||||
q = {**default_params, **params}
|
||||
|
||||
class _StreamAdapter:
|
||||
def __init__(self, iterable):
|
||||
self._iter = iter(iterable)
|
||||
|
||||
def read(self, size=-1):
|
||||
try:
|
||||
return next(self._iter)
|
||||
except StopIteration:
|
||||
return b''
|
||||
|
||||
stream = _StreamAdapter(stream_iter)
|
||||
async with httpx.AsyncClient(timeout=None) as client:
|
||||
files = {"file": (filename, stream_iter, "application/octet-stream")}
|
||||
files = {"file": (filename, stream, "application/octet-stream")}
|
||||
r = await client.post(f"{IPFS_API_URL}/api/v0/add", params=q, files=files)
|
||||
r.raise_for_status()
|
||||
# /api/v0/add may respond with NDJSON (one JSON object per line); usually it is a single JSON object
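The adapter exists because httpx's multipart encoder expects a file-like object with `read()`, while callers hand in an iterable of byte chunks; returning one chunk per call and `b''` at exhaustion is enough for the encoder to stream the body. A standalone illustration of that behaviour:

```python
# Standalone illustration of the _StreamAdapter pattern used above.
class StreamAdapter:
    def __init__(self, iterable):
        self._iter = iter(iterable)

    def read(self, size: int = -1) -> bytes:
        # size is ignored: each read() returns the next chunk as-is, b'' signals EOF.
        try:
            return next(self._iter)
        except StopIteration:
            return b''

def chunks():
    yield b'first-chunk-'
    yield b'second-chunk'

adapter = StreamAdapter(chunks())
collected = b''
while True:
    piece = adapter.read()
    if not piece:
        break
    collected += piece
assert collected == b'first-chunk-second-chunk'
```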
|
||||
|
|
@ -117,3 +128,10 @@ async def repo_stat() -> Dict[str, Any]:
|
|||
r = await client.post(f"{IPFS_API_URL}/api/v0/repo/stat")
|
||||
r.raise_for_status()
|
||||
return r.json()
|
||||
|
||||
|
||||
async def id_info() -> Dict[str, Any]:
|
||||
async with httpx.AsyncClient(timeout=10) as client:
|
||||
r = await client.post(f"{IPFS_API_URL}/api/v0/id")
|
||||
r.raise_for_status()
|
||||
return r.json()
|
||||
|
|
|
|||
|
|
@ -11,6 +11,7 @@ from app.core.models.content.user_content import UserContent, UserAction
|
|||
from app.core.models._config import ServiceConfigValue, ServiceConfig
|
||||
from app.core.models.asset import Asset
|
||||
from app.core.models.my_network import KnownNode, KnownNodeIncident, RemoteContentIndex
|
||||
from app.core.models.events import NodeEvent, NodeEventCursor
|
||||
from app.core.models.promo import PromoAction
|
||||
from app.core.models.tasks import BlockchainTask
|
||||
from app.core.models.content_v3 import (
|
||||
|
|
|
|||
|
|
@ -11,6 +11,8 @@ import json
|
|||
import urllib
|
||||
|
||||
from app.core.models.transaction import StarsInvoice
|
||||
from app.core._utils.share_links import build_content_links
|
||||
from app.core.models.content_v3 import EncryptedContent
|
||||
|
||||
|
||||
class PlayerTemplates:
|
||||
|
|
@ -21,6 +23,8 @@ class PlayerTemplates:
|
|||
template_kwargs = {}
|
||||
inline_keyboard_array = []
|
||||
text = ""
|
||||
content_metadata_json = {}
|
||||
description_block = ""
|
||||
if content:
|
||||
assert content.type.startswith('onchain/content'), "Invalid nodeStorage content type"
|
||||
cd_log = f"Content (SHA256: {content.hash}), Encrypted: {content.encrypted}, TelegramCID: {content.telegram_cid}. "
|
||||
|
|
@ -33,60 +37,126 @@ class PlayerTemplates:
|
|||
cd_log += f"Decrypted: {local_content.hash}. "
|
||||
else:
|
||||
cd_log += "Can't decrypt content. "
|
||||
|
||||
user_wallet_address = await self.user.wallet_address_async(self.db_session)
|
||||
user_existing_license = (await self.db_session.execute(select(UserContent).where(
|
||||
and_(UserContent.user_id == self.user.id, UserContent.content_id == content.id)
|
||||
))).scalars().first()
|
||||
|
||||
content_meta = content.json_format() if content else {}
|
||||
if local_content:
|
||||
content_meta = content.json_format()
|
||||
local_content_meta = local_content.json_format()
|
||||
make_log("TG-Player", f"Content meta: {content_meta}. Local content meta: {local_content_meta}. ")
|
||||
make_log("TG-Player", f"Content meta: {content_meta}. Local content meta: {local_content.json_format()}. ")
|
||||
try:
|
||||
content_type, content_encoding = "audio", "aac"
|
||||
except:
|
||||
content_type, content_encoding = 'application', 'x-binary'
|
||||
|
||||
content_metadata = await StoredContent.from_cid_async(db_session, content_meta.get('metadata_cid') or None)
|
||||
with open(content_metadata.filepath, 'r') as f:
|
||||
content_metadata_json = json.loads(f.read())
|
||||
|
||||
metadata_cid = content_meta.get('metadata_cid') if content_meta else None
|
||||
if metadata_cid:
|
||||
try:
|
||||
cover_content = await StoredContent.from_cid_async(self.db_session, content_meta.get('cover_cid') or None)
|
||||
cd_log += f"Cover content: {cover_content.cid.serialize_v2()}. "
|
||||
content_metadata = await StoredContent.from_cid_async(db_session, metadata_cid)
|
||||
with open(content_metadata.filepath, 'r') as f:
|
||||
content_metadata_json = json.loads(f.read())
|
||||
except BaseException as e:
|
||||
cd_log += f"Can't get cover content: {e}. "
|
||||
cover_content = None
|
||||
make_log("TG-Player", f"Can't get metadata content: {e}", level='warning')
|
||||
|
||||
content_share_link = {
|
||||
'text': self.user.translated('p_shareLinkContext').format(title=content_metadata_json.get('name', "")),
|
||||
'url': f"https://t.me/{CLIENT_TELEGRAM_BOT_USERNAME}/content?startapp={content.cid.serialize_v2()}"
|
||||
}
|
||||
if user_existing_license:
|
||||
content_share_link['url'] = f"https://t.me/{CLIENT_TELEGRAM_BOT_USERNAME}/content?startapp={user_existing_license.onchain_address}"
|
||||
try:
|
||||
cover_content = await StoredContent.from_cid_async(self.db_session, content_meta.get('cover_cid') if content_meta else None)
|
||||
cd_log += f"Cover content: {cover_content.cid.serialize_v2()}. "
|
||||
except BaseException as e:
|
||||
cd_log += f"Can't get cover content: {e}. "
|
||||
cover_content = None
|
||||
|
||||
if cover_content:
|
||||
template_kwargs['photo'] = URLInputFile(cover_content.web_url)
|
||||
share_target = user_existing_license.onchain_address if user_existing_license else content.cid.serialize_v2()
|
||||
ref_id = (self.user.meta or {}).get('ref_id')
|
||||
if not ref_id:
|
||||
ref_id = self.user.ensure_ref_id()
|
||||
if self.db_session:
|
||||
await self.db_session.commit()
|
||||
|
||||
if not local_content:
|
||||
text = self.user.translated('p_playerContext_unsupportedContent').format(
|
||||
content_type=content_type,
|
||||
content_encoding=content_encoding
|
||||
_, startapp_url, web_app_url = build_content_links(
|
||||
share_target,
|
||||
ref_id,
|
||||
project_host=PROJECT_HOST,
|
||||
bot_username=CLIENT_TELEGRAM_BOT_USERNAME
|
||||
)
|
||||
|
||||
content_share_link = {
|
||||
'text': self.user.translated('p_shareLinkContext').format(title=content_metadata_json.get('name', "")),
|
||||
'url': startapp_url,
|
||||
'web_url': web_app_url,
|
||||
'ref_id': ref_id
|
||||
}
|
||||
|
||||
if cover_content:
|
||||
template_kwargs['photo'] = URLInputFile(cover_content.web_url)
|
||||
|
||||
encrypted_cid_candidates = []
|
||||
if content_meta:
|
||||
encrypted_cid_candidates.extend([
|
||||
content_meta.get('content_cid'),
|
||||
content_meta.get('encrypted_cid'),
|
||||
])
|
||||
if local_content and isinstance(local_content.meta, dict):
|
||||
encrypted_cid_candidates.append(local_content.meta.get('encrypted_cid'))
|
||||
if content and content.content_id:
|
||||
encrypted_cid_candidates.append(content.content_id)
|
||||
|
||||
encrypted_content_row = None
|
||||
if self.db_session:
|
||||
for candidate in encrypted_cid_candidates:
|
||||
if not candidate:
|
||||
continue
|
||||
encrypted_content_row = (await self.db_session.execute(
|
||||
select(EncryptedContent).where(EncryptedContent.encrypted_cid == candidate)
|
||||
)).scalars().first()
|
||||
if encrypted_content_row:
|
||||
break
|
||||
|
||||
description = (content_metadata_json.get('description') or '').strip()
|
||||
encrypted_description = (encrypted_content_row.description or '').strip() if encrypted_content_row and encrypted_content_row.description else ''
|
||||
if not description and encrypted_description:
|
||||
description = encrypted_description
|
||||
if description:
|
||||
description_block = f"{description}\n"
|
||||
|
||||
metadata_title = content_metadata_json.get('title') or content_metadata_json.get('name')
|
||||
if not metadata_title:
|
||||
metadata_title = (
|
||||
(encrypted_content_row.title if encrypted_content_row and encrypted_content_row.title else None)
|
||||
or (local_content.filename if local_content else None)
|
||||
or (content.filename if content else None)
|
||||
or content.cid.serialize_v2()
|
||||
)
|
||||
inline_keyboard_array = []
|
||||
extra_buttons = []
|
||||
else:
|
||||
content_hashtags = content_metadata_json.get('description').strip()
|
||||
if content_hashtags:
|
||||
content_hashtags += '\n'
|
||||
metadata_artist = content_metadata_json.get('artist')
|
||||
if metadata_artist in ('', None):
|
||||
metadata_artist = None
|
||||
if not metadata_artist:
|
||||
encrypted_artist = getattr(encrypted_content_row, 'artist', None)
|
||||
metadata_artist = encrypted_artist if encrypted_artist else metadata_artist
|
||||
title = f"{metadata_artist} – {metadata_title}" if metadata_artist else metadata_title
|
||||
|
||||
text = f"""<b>{content_metadata_json.get('name', 'Unnamed')}</b>
|
||||
{content_hashtags}
|
||||
Этот контент был загружен в MY
|
||||
text = f"""<b>{title}</b>
|
||||
{description_block}Этот контент был загружен в MY
|
||||
\t/ p2p content market /
|
||||
<blockquote><a href="{content_share_link['url']}">🔴 «открыть в MY»</a></blockquote>"""
|
||||
|
||||
if self.db_session and content:
|
||||
processing_messages = (await self.db_session.execute(
|
||||
select(KnownTelegramMessage).where(
|
||||
and_(
|
||||
KnownTelegramMessage.type == 'content/processing',
|
||||
KnownTelegramMessage.chat_id == self._chat_id,
|
||||
KnownTelegramMessage.bot_id == self.bot_id,
|
||||
KnownTelegramMessage.deleted == False,
|
||||
KnownTelegramMessage.content_id == content.id,
|
||||
)
|
||||
)
|
||||
)).scalars().all()
|
||||
|
||||
if local_content and processing_messages:
|
||||
for msg in processing_messages:
|
||||
await self.delete_message(msg.message_id)
|
||||
|
||||
make_log("TG-Player", f"Send content {content_type} ({content_encoding}) to chat {self._chat_id}. {cd_log}")
|
||||
kmsgs = (await self.db_session.execute(select(KnownTelegramMessage).where(
|
||||
and_(
|
||||
|
|
|
|||
|
|
@ -82,7 +82,7 @@ class Wrapped_CBotChat(T, PlayerTemplates):
|
|||
|
||||
return result
|
||||
|
||||
async def send_message(self, text: str, message_type='common', message_meta={}, **kwargs):
|
||||
async def send_message(self, text: str, message_type='common', message_meta={}, content_id=None, **kwargs):
|
||||
assert self._chat_id, "No chat_id"
|
||||
try:
|
||||
make_log(self, f"Send message to {self._chat_id}. Text len: {len(text)}", level='debug')
|
||||
|
|
@ -93,7 +93,7 @@ class Wrapped_CBotChat(T, PlayerTemplates):
|
|||
disable_web_page_preview=True,
|
||||
**kwargs
|
||||
)
|
||||
return await self.return_result(r, message_type=message_type, message_meta=message_meta)
|
||||
return await self.return_result(r, message_type=message_type, message_meta=message_meta, content_id=content_id)
|
||||
except BaseException as e:
|
||||
make_log(self, f"Error sending message to {self._chat_id}. Error: {e}", level='warning')
|
||||
return None
|
||||
|
|
|
|||
|
|
@ -1,7 +1,9 @@
import os
import traceback

import base58
from sqlalchemy import and_, select
from sqlalchemy import select
from datetime import datetime

from app.core.logger import make_log
from app.core.models import StoredContent

@ -42,6 +44,9 @@ class NodeStorageIndexationMixin:
    pass # async def fetch_onchain_metadata(self):


MIN_ONCHAIN_INDEX = int(os.getenv("MIN_ONCHAIN_INDEX", "8"))


class UserContentIndexationMixin:
    async def sync_with_chain(self, db_session):
        errored = False

@ -54,12 +59,30 @@ class UserContentIndexationMixin:
            cc_indexator_data = unpack_item_indexator_data(cc_indexator_result)
            assert cc_indexator_data['type'] == 1, "Type is not a content"
            assert cc_indexator_data['address'] == self.onchain_address, "Address is not equal"
            license_type = cc_indexator_data.get('license_type')
            if cc_indexator_data['index'] < MIN_ONCHAIN_INDEX and (license_type is None or license_type == 0):
                make_log(
                    "UserContent",
                    f"Skip license {self.onchain_address} with index {cc_indexator_data['index']} < MIN_ONCHAIN_INDEX={MIN_ONCHAIN_INDEX}",
                    level="info"
                )
                self.type = 'nft/ignored'
                self.content_id = None
                self.updated = datetime.utcnow()
                await db_session.commit()
                return
            values_slice = cc_indexator_data['values'].begin_parse()
            content_hash_b58 = base58.b58encode(bytes.fromhex(hex(values_slice.read_uint(256))[2:])).decode()
            make_log("UserContent", f"License ({self.onchain_address}) content hash: {content_hash_b58}", level="info")
            stored_content = (await db_session.execute(select(StoredContent).where(
                and_(StoredContent.type == 'onchain/content', StoredContent.hash == content_hash_b58)
                StoredContent.hash == content_hash_b58
            ))).scalars().first()
            if not stored_content:
                raise AssertionError(f"Stored content not found for hash={content_hash_b58}")
            if not (stored_content.type or '').startswith('onchain/content'):
                stored_content.type = 'onchain/content' if stored_content.key_id else 'onchain/content_unknown'
            stored_content.onchain_index = stored_content.onchain_index or cc_indexator_data['index']
            stored_content.owner_address = stored_content.owner_address or cc_indexator_data['owner_address']
            trusted_cop_address_result = await toncenter.run_get_method(stored_content.meta['item_address'], 'get_nft_address_by_index', [['num', cc_indexator_data['index']]])
            assert trusted_cop_address_result.get('exit_code', -1) == 0, "Trusted cop address error"
            trusted_cop_address = Cell.one_from_boc(b64decode(trusted_cop_address_result['stack'][0][1]['bytes'])).begin_parse().read_msg_addr().to_string(1, 1, 1)

@ -68,6 +91,7 @@ class UserContentIndexationMixin:
            self.owner_address = cc_indexator_data['owner_address']
            self.type = 'nft/listen'
            self.content_id = stored_content.id
            self.meta = {**(self.meta or {}), 'license_type': license_type}
            await db_session.commit()
        except BaseException as e:
            errored = True

@ -77,5 +101,3 @@ class UserContentIndexationMixin:
            self.type = 'nft/unknown'
            self.content_id = None
            await db_session.commit()

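The hunks above gate indexation on `MIN_ONCHAIN_INDEX` (env-driven, default 8): a license whose on-chain index is below the threshold and whose `license_type` is missing or `0` is marked `nft/ignored` and skipped. A minimal sketch of that predicate; the standalone helper name is illustrative, not part of the codebase:

```python
# Minimal sketch of the skip rule above; `should_ignore_license` is an illustrative name.
from typing import Optional

MIN_ONCHAIN_INDEX = 8  # default from os.getenv("MIN_ONCHAIN_INDEX", "8")

def should_ignore_license(index: int, license_type: Optional[int]) -> bool:
    """True when the item sits below the indexation threshold and has no explicit license_type."""
    return index < MIN_ONCHAIN_INDEX and (license_type is None or license_type == 0)

assert should_ignore_license(3, None) is True    # low index, no license_type -> nft/ignored
assert should_ignore_license(3, 2) is False      # explicit license_type keeps it indexable
assert should_ignore_license(8, None) is False   # at or above the threshold
```
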
@ -16,6 +16,7 @@ class EncryptedContent(AlchemyBase):

    # Public metadata
    title = Column(String(512), nullable=False)
    artist = Column(String(512), nullable=True)
    description = Column(String(4096), nullable=True)
    content_type = Column(String(64), nullable=False) # e.g. audio/flac, video/mp4, application/octet-stream

@ -0,0 +1,22 @@
from __future__ import annotations

from sqlalchemy import Column, String, Integer, Float, JSON, DateTime
from datetime import datetime

from .base import AlchemyBase


class DHTRecordRow(AlchemyBase):
    __tablename__ = 'dht_records'

    # fingerprint = blake3(serialized key)
    fingerprint = Column(String(128), primary_key=True)
    key = Column(String(512), nullable=False, index=True)
    schema_version = Column(String(16), nullable=False, default='v1')
    logical_counter = Column(Integer, nullable=False, default=0)
    timestamp = Column(Float, nullable=False, default=0.0)
    node_id = Column(String(128), nullable=False)
    signature = Column(String(512), nullable=True)
    value = Column(JSON, nullable=False, default=dict)
    updated_at = Column(DateTime, nullable=False, default=datetime.utcnow, onupdate=datetime.utcnow)

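`dht_records` keys every record by `fingerprint = blake3(serialized key)` and carries `(logical_counter, timestamp, node_id)` alongside the signed value. One plausible way such rows get reconciled is a last-writer-wins comparison over that tuple; the sketch below is an assumption about the merge rule, not code from this change:

```python
# Assumed last-writer-wins ordering for dht_records rows; the real merge logic
# lives in the DHT package and may differ.
import blake3

def fingerprint_for(serialized_key: str) -> str:
    # mirrors the "fingerprint = blake3(serialized key)" comment above
    return blake3.blake3(serialized_key.encode()).hexdigest()

def lww_winner(local: "DHTRecordRow", remote: "DHTRecordRow") -> "DHTRecordRow":
    local_key = (local.logical_counter, local.timestamp, local.node_id)
    remote_key = (remote.logical_counter, remote.timestamp, remote.node_id)
    return remote if remote_key > local_key else local
```
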
@ -0,0 +1,48 @@
from __future__ import annotations

from datetime import datetime
from sqlalchemy import (
    Column,
    Integer,
    BigInteger,
    String,
    DateTime,
    JSON,
    UniqueConstraint,
)

from .base import AlchemyBase


class NodeEvent(AlchemyBase):
    __tablename__ = 'node_events'
    __table_args__ = (
        UniqueConstraint('origin_public_key', 'seq', name='uq_node_events_origin_seq'),
        UniqueConstraint('uid', name='uq_node_events_uid'),
    )

    id = Column(Integer, autoincrement=True, primary_key=True)
    origin_public_key = Column(String(128), nullable=False)
    origin_host = Column(String(256), nullable=True)
    seq = Column(BigInteger, nullable=False)
    uid = Column(String(64), nullable=False)
    event_type = Column(String(64), nullable=False)
    payload = Column(JSON, nullable=False, default=dict)
    signature = Column(String(512), nullable=False)
    created_at = Column(DateTime, nullable=False, default=datetime.utcnow)
    received_at = Column(DateTime, nullable=False, default=datetime.utcnow)
    applied_at = Column(DateTime, nullable=True)
    status = Column(String(32), nullable=False, default='recorded')


class NodeEventCursor(AlchemyBase):
    __tablename__ = 'node_event_cursors'
    __table_args__ = (
        UniqueConstraint('source_public_key', name='uq_event_cursor_source'),
    )

    id = Column(Integer, autoincrement=True, primary_key=True)
    source_public_key = Column(String(128), nullable=False)
    last_seq = Column(BigInteger, nullable=False, default=0)
    updated_at = Column(DateTime, nullable=False, default=datetime.utcnow, onupdate=datetime.utcnow)
    source_public_host = Column(String(256), nullable=True)

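`node_events` is an append-only, signed per-origin log (unique on `(origin_public_key, seq)` and on `uid`), and `node_event_cursors` tracks how far each source has been consumed. A hedged sketch of idempotent intake under those constraints, assuming the models above are in scope; session handling and signature verification are simplified assumptions:

```python
# Hedged sketch: record a signed event once and advance the per-source cursor.
# Signature verification and error handling are intentionally omitted here.
from sqlalchemy import select

async def ingest_node_event(db_session, origin_pk: str, seq: int, uid: str,
                            event_type: str, payload: dict, signature: str) -> NodeEvent:
    existing = (await db_session.execute(
        select(NodeEvent).where(NodeEvent.uid == uid)
    )).scalars().first()
    if existing:
        return existing  # duplicate delivery; unique constraints make replay safe

    event = NodeEvent(origin_public_key=origin_pk, seq=seq, uid=uid,
                      event_type=event_type, payload=payload,
                      signature=signature, status='recorded')
    db_session.add(event)

    cursor = (await db_session.execute(
        select(NodeEventCursor).where(NodeEventCursor.source_public_key == origin_pk)
    )).scalars().first()
    if cursor is None:
        db_session.add(NodeEventCursor(source_public_key=origin_pk, last_seq=seq))
    else:
        cursor.last_seq = max(cursor.last_seq, seq)

    await db_session.commit()
    return event
```
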
@ -4,9 +4,19 @@ from datetime import datetime
from datetime import timedelta

from aiogram import Bot
from app.core._utils.b58 import b58encode

from app.core._config import TELEGRAM_API_KEY, CLIENT_TELEGRAM_API_KEY
from app.core._crypto.signer import Signer
from app.core._secrets import hot_pubkey, hot_seed
from app.core.logger import make_log
from app.core.network.dht import (
    MembershipManager,
    ReplicationManager,
    MetricsAggregator,
    compute_node_id,
)
from app.core.network.dht.store import PersistentDHTStore


class Memory:

@ -46,6 +56,15 @@ class Memory:
        self._handshake_rl = {"minute": 0, "counts": {}}
        self._handshake_nonces = {}

        # Decentralised storage components
        self.node_id = compute_node_id(hot_pubkey)
        self.signer = Signer(hot_seed)
        self.dht_store = PersistentDHTStore(self.node_id, self.signer)
        self.membership = MembershipManager(self.node_id, self.signer, self.dht_store)
        self.replication = ReplicationManager(self.node_id, self.signer, self.dht_store)
        self.metrics = MetricsAggregator(self.node_id, self.signer, self.dht_store)
        self.membership.register_local(public_key=b58encode(hot_pubkey).decode(), ip=None, asn=None)

    @asynccontextmanager
    async def transaction(self, desc=""):
        make_log("Memory.transaction", f"Starting transaction; {desc}", level='debug')

@ -80,4 +99,3 @@ class Memory:
                make_log("Queue.add_task", f"Error when adding task to memory: {e}", level='error')

        self._execute_queue.append([_fn, args, kwargs])

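`Memory` now derives its identity and wires the DHT components from the hot Ed25519 key: `compute_node_id(hot_pubkey)` for the NodeID and `Signer(hot_seed)` for signing. Assuming the blake3-over-public-key scheme described in the architecture overview, the derivation is roughly the following; the real implementation lives in `app.core.network.dht.crypto` and may differ in detail:

```python
# Assumed shape of compute_node_id: 256-bit blake3 of the raw Ed25519 public key, hex-encoded.
import blake3

def compute_node_id(public_key: bytes) -> str:
    return blake3.blake3(public_key).hexdigest()
```
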
@ -53,6 +53,11 @@ class StoredContent(AlchemyBase, AudioContentMixin):

    @property
    def cid(self) -> ContentId:
        if self.content_id:
            try:
                return ContentId.deserialize(self.content_id)
            except Exception as exc:
                make_log("StoredContent", f"Failed to deserialize stored content_id '{self.content_id}': {exc}", level='warning')
        return ContentId(
            content_hash=b58decode(self.hash),
            onchain_index=self.onchain_index,

@ -0,0 +1,16 @@
from __future__ import annotations

from datetime import datetime
from sqlalchemy import Column, String, Integer, DateTime

from .base import AlchemyBase


class RdapCache(AlchemyBase):
    __tablename__ = 'rdap_cache'

    ip = Column(String(64), primary_key=True)
    asn = Column(Integer, nullable=True)
    source = Column(String(64), nullable=True)
    updated_at = Column(DateTime, nullable=False, default=datetime.utcnow, onupdate=datetime.utcnow)

@ -1,4 +1,4 @@
from sqlalchemy import Column, Integer, String, ForeignKey, DateTime, Boolean, Float
from sqlalchemy import Column, Integer, BigInteger, String, ForeignKey, DateTime, Boolean, Float
from sqlalchemy.orm import relationship
from datetime import datetime

@ -49,8 +49,15 @@ class StarsInvoice(AlchemyBase):

    user_id = Column(Integer, ForeignKey('users.id'), nullable=True)
    content_hash = Column(String(256), nullable=True)
    telegram_id = Column(BigInteger, nullable=True)

    invoice_url = Column(String(256), nullable=True)
    paid = Column(Boolean, nullable=False, default=False)
    paid_at = Column(DateTime, nullable=True)
    payment_tx_id = Column(String(256), nullable=True)
    payment_node_id = Column(String(128), nullable=True)
    payment_node_public_host = Column(String(256), nullable=True)
    bot_username = Column(String(128), nullable=True)
    is_remote = Column(Boolean, nullable=False, default=False)

    created = Column(DateTime, nullable=False, default=datetime.utcnow)

@ -1,5 +1,5 @@
from datetime import datetime
from sqlalchemy import Column, Integer, String, BigInteger, DateTime, JSON
from sqlalchemy import Column, Integer, String, BigInteger, DateTime, JSON, Boolean
from sqlalchemy.orm import relationship

from app.core.auth_v1 import AuthenticationMixin as AuthenticationMixin_V1

@ -9,6 +9,10 @@ from app.core.translation import TranslationCore
from ..base import AlchemyBase


_BASE62_ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"
_BASE62 = len(_BASE62_ALPHABET)


class User(AlchemyBase, DisplayMixin, TranslationCore, AuthenticationMixin_V1, WalletMixin):
    LOCALE_DOMAIN = 'sanic_telegram_bot'

@ -19,6 +23,7 @@ class User(AlchemyBase, DisplayMixin, TranslationCore, AuthenticationMixin_V1, W
    username = Column(String(512), nullable=True)
    lang_code = Column(String(8), nullable=False, default="en")
    meta = Column(JSON, nullable=False, default=dict)
    is_admin = Column(Boolean, nullable=False, default=False)

    last_use = Column(DateTime, nullable=False, default=datetime.utcnow)
    updated = Column(DateTime, nullable=False, default=datetime.utcnow)

@ -32,3 +37,26 @@ class User(AlchemyBase, DisplayMixin, TranslationCore, AuthenticationMixin_V1, W
    def __str__(self):
        return f"User, {self.id}_{self.telegram_id} | Username: {self.username} " + '\\'

    def ensure_ref_id(self):
        """Return a short referral identifier, generating it if missing."""
        meta = self.meta or {}
        ref_id = meta.get('ref_id')
        if isinstance(ref_id, str) and ref_id:
            return ref_id

        ref_id = self._generate_ref_id()
        self.meta = {**meta, 'ref_id': ref_id}
        return ref_id

    def _generate_ref_id(self):
        user_id = int(self.id or 0)
        if user_id <= 0:
            return '000'

        value = user_id % (_BASE62 ** 3)
        chars = []
        for _ in range(3):
            chars.append(_BASE62_ALPHABET[value % _BASE62])
            value //= _BASE62

        return ''.join(reversed(chars)) or '000'

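`_generate_ref_id` base62-encodes `id mod 62**3` into exactly three characters, so referral ids stay short and repeat only after 238,328 users. A standalone restatement with a worked example:

```python
# Standalone restatement of the 3-character base62 ref_id above, for a quick check.
_ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"

def ref_id_for(user_id: int) -> str:
    value = user_id % (62 ** 3)   # 62**3 == 238328, so codes repeat only after that many ids
    chars = []
    for _ in range(3):
        chars.append(_ALPHABET[value % 62])
        value //= 62
    return ''.join(reversed(chars))

assert ref_id_for(1) == '001'
assert ref_id_for(12345) == '3d7'
```
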
@ -7,6 +7,11 @@ from tonsdk.utils import Address
from datetime import datetime, timedelta
from app.core.logger import make_log
from httpx import AsyncClient
from app.core.models.content.indexation_mixins import unpack_item_indexator_data, MIN_ONCHAIN_INDEX

def _platform_address_str() -> str:
    from app.core._blockchain.ton.platform import platform
    return platform.address.to_string(1, 1, 1)


class WalletMixin:

@ -43,9 +48,69 @@ class WalletMixin:
            item_address = Address(nft_item['address']).to_string(1, 1, 1)
            owner_address = Address(nft_item['owner']['address']).to_string(1, 1, 1)

            platform_address = _platform_address_str()
            collection_address = None
            if isinstance(nft_item, dict):
                collection_data = nft_item.get('collection')
                if isinstance(collection_data, dict):
                    collection_address = collection_data.get('address')
                collection_address = collection_address or nft_item.get('collection_address')
            if collection_address:
                try:
                    collection_address = Address(collection_address).to_string(1, 1, 1)
                except Exception:
                    pass

            item_index = None
            license_type = None
            # Prefer index from tonapi payload if available
            raw_index = nft_item.get('index') if isinstance(nft_item, dict) else None
            if isinstance(raw_index, int):
                item_index = raw_index

            need_chain_probe = item_index is None or item_index < MIN_ONCHAIN_INDEX
            platform_address_onchain = None
            if need_chain_probe:
                try:
                    indexator_raw = await toncenter.run_get_method(item_address, 'indexator_data')
                    if indexator_raw.get('exit_code', -1) == 0:
                        indexator_data = unpack_item_indexator_data(indexator_raw)
                        item_index = indexator_data['index']
                        license_type = indexator_data.get('license_type')
                        platform_address_onchain = indexator_data.get('platform_address')
                except BaseException as err:
                    make_log(self, f"Failed to fetch indexator data for {item_address}: {err}", level='warning')

            if item_index is None:
                make_log(self, f"Skip NFT {item_address}: unable to resolve on-chain index", level='warning')
                continue

            if platform_address_onchain and platform_address_onchain != platform_address:
                make_log(
                    self,
                    f"Skip foreign NFT {item_address}: platform mismatch {platform_address_onchain} != {platform_address}",
                    level='debug'
                )
                continue

            if item_index < MIN_ONCHAIN_INDEX and (license_type is None or license_type == 0):
                make_log(
                    self,
                    f"Ignore NFT {item_address} with index {item_index} < MIN_ONCHAIN_INDEX={MIN_ONCHAIN_INDEX} (license_type={license_type})",
                    level='debug'
                )
                continue

            from sqlalchemy import select
            user_content = (await db_session.execute(select(UserContent).where(UserContent.onchain_address == item_address))).scalars().first()
            if user_content:
                if license_type is not None and license_type != 0 and user_content.type == 'nft/ignored':
                    user_content.type = 'nft/unknown'
                    user_content.meta = {**(user_content.meta or {}), 'license_type': license_type}
                user_content.owner_address = owner_address
                user_content.status = 'active'
                user_content.updated = datetime.fromtimestamp(0)
                await db_session.commit()
                continue

            user_content = UserContent(

@ -57,7 +122,7 @@ class WalletMixin:
                updated=datetime.fromtimestamp(0),
                content_id=None, # not resolved yet
                created=datetime.now(),
                meta={},
                meta={'license_type': license_type} if license_type is not None else {},
                user_id=self.id,
                wallet_connection_id=(await self.wallet_connection_async(db_session)).id,
                status="active"

@ -83,6 +148,33 @@ class WalletMixin:
            item_address = Address(nft_item['address']).to_string(1, 1, 1)
            owner_address = Address(nft_item['owner_address']).to_string(1, 1, 1)

            platform_address = _platform_address_str()
            collection_address = nft_item.get('collection_address') if isinstance(nft_item, dict) else None
            if collection_address:
                try:
                    normalized_collection = Address(collection_address).to_string(1, 1, 1)
                except Exception:
                    normalized_collection = collection_address
                if normalized_collection != platform_address:
                    make_log(self, f"Skip foreign NFT {item_address} from collection {normalized_collection}", level='debug')
                    continue

            item_index = None
            try:
                indexator_raw = await toncenter.run_get_method(item_address, 'indexator_data')
                if indexator_raw.get('exit_code', -1) == 0:
                    item_index = unpack_item_indexator_data(indexator_raw)['index']
            except BaseException as err:
                make_log(self, f"Failed to fetch indexator data for {item_address}: {err}", level='warning')

            if item_index is None:
                make_log(self, f"Skip NFT {item_address}: unable to resolve on-chain index", level='warning')
                continue

            if item_index is not None and item_index < MIN_ONCHAIN_INDEX:
                make_log(self, f"Ignore NFT {item_address} with index {item_index} < MIN_ONCHAIN_INDEX={MIN_ONCHAIN_INDEX}", level='debug')
                continue

            from sqlalchemy import select
            user_content = (await db_session.execute(select(UserContent).where(UserContent.onchain_address == item_address))).scalars().first()
            if user_content:

Binary file not shown.

@ -0,0 +1,94 @@
from __future__ import annotations

import ipaddress
from dataclasses import dataclass, field
from typing import Dict, Optional

from app.core.logger import make_log


@dataclass
class ASNResolver:
    cache: Dict[str, int] = field(default_factory=dict)

    def normalise(self, ip: str | None) -> Optional[str]:
        if not ip:
            return None
        try:
            return str(ipaddress.ip_address(ip))
        except Exception:
            return None

    def resolve(self, ip: str | None) -> Optional[int]:
        norm = self.normalise(ip)
        if not norm:
            return None
        return self.cache.get(norm)

    def learn(self, ip: str, asn: int) -> None:
        norm = self.normalise(ip)
        if not norm:
            make_log("ASNResolver", f"Invalid IP provided for learn: {ip}", level="warning")
            return
        self.cache[norm] = asn

    async def resolve_async(self, ip: str | None, db_session=None) -> Optional[int]:
        """Resolve ASN via persistent cache; fallback to RDAP API; store result.

        - Checks in-memory cache first.
        - If not found, checks DB table rdap_cache when available.
        - If still not found, queries a public API and persists.
        """
        norm = self.normalise(ip)
        if not norm:
            return None
        # In-memory cache first
        if norm in self.cache:
            return self.cache[norm]
        # DB lookup if possible
        try:
            if db_session is not None:
                from sqlalchemy import select
                from app.core.models.rdap import RdapCache
                row = (await db_session.execute(select(RdapCache).where(RdapCache.ip == norm))).scalars().first()
                if row and row.asn is not None:
                    self.cache[norm] = int(row.asn)
                    return int(row.asn)
        except Exception as e:
            make_log("ASNResolver", f"DB lookup failed for {norm}: {e}", level="warning")

        # Remote lookup (best-effort)
        asn: Optional[int] = None
        try:
            import httpx
            url = f"https://api.iptoasn.com/v1/as/ip/{norm}"
            async with httpx.AsyncClient(timeout=5.0) as client:
                r = await client.get(url)
                if r.status_code == 200:
                    j = r.json()
                    num = j.get("as_number")
                    if isinstance(num, int) and num > 0:
                        asn = num
        except Exception as e:
            make_log("ASNResolver", f"RDAP lookup failed for {norm}: {e}", level="warning")

        if asn is not None:
            self.cache[norm] = asn
            # Persist to DB if possible
            try:
                if db_session is not None:
                    from app.core.models.rdap import RdapCache
                    row = await db_session.get(RdapCache, norm)
                    if row is None:
                        row = RdapCache(ip=norm, asn=asn, source="iptoasn")
                        db_session.add(row)
                    else:
                        row.asn = asn
                        row.source = "iptoasn"
                    await db_session.commit()
            except Exception as e:
                make_log("ASNResolver", f"DB persist failed for {norm}: {e}", level="warning")
        return asn


resolver = ASNResolver()

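The module-level `resolver` singleton is the intended entry point: `resolve()` is a pure in-memory lookup, while `resolve_async()` falls back to the `rdap_cache` table and then to the public iptoasn.com API. A small usage sketch; the IP and the returned ASN are illustrative:

```python
# Usage sketch for the resolver singleton above; values are illustrative.
import asyncio

async def main():
    asn = await resolver.resolve_async("8.8.8.8", db_session=None)  # memory + HTTP only, no DB
    print(asn)                          # e.g. 15169, or None if the lookup failed
    if asn is not None:
        resolver.learn("8.8.8.8", asn)
        print(resolver.resolve("8.8.8.8"))  # now a pure in-memory hit

asyncio.run(main())
```
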
@ -1,3 +1,4 @@
import json
import os
from typing import List

@ -9,6 +10,23 @@ def _csv_list(val: str) -> List[str]:
    return [x.strip() for x in (val or "").split(",") if x.strip()]


def _json_value(val: str, fallback):
    if not val:
        return fallback
    try:
        return json.loads(val)
    except Exception:
        return fallback


def _as_list(value):
    if isinstance(value, list):
        return value
    if value is None:
        return []
    return [value]


# Handshake / network config driven by env
NODE_PRIVACY = os.getenv("NODE_PRIVACY", NODE_TYPE_PUBLIC).strip().lower()
if NODE_PRIVACY not in (NODE_TYPE_PUBLIC, NODE_TYPE_PRIVATE):

@ -30,6 +48,12 @@ NETWORK_TLS_VERIFY = int(os.getenv("NETWORK_TLS_VERIFY", "1")) == 1
HANDSHAKE_TS_TOLERANCE_SEC = int(os.getenv("HANDSHAKE_TS_TOLERANCE_SEC", "300"))
HANDSHAKE_RATE_LIMIT_PER_MIN = int(os.getenv("HANDSHAKE_RATE_LIMIT_PER_MIN", "60"))

# IPFS discovery/peering
IPFS_PRIVATE_BOOTSTRAP_ADDRESSES = _as_list(_json_value(os.getenv("IPFS_PRIVATE_BOOTSTRAP"), []))
IPFS_PEERING_PEERS = _json_value(os.getenv("IPFS_PEERING_PEERS"), [])
IPFS_ANNOUNCE_ADDRESSES = _as_list(_json_value(os.getenv("IPFS_ANNOUNCE_ADDRESSES"), []))
IPFS_NOANNOUNCE_ADDRESSES = _as_list(_json_value(os.getenv("IPFS_NOANNOUNCE_ADDRESSES"), []))

# Capabilities
NODE_IS_BOOTSTRAP = int(os.getenv("NODE_IS_BOOTSTRAP", "0")) == 1
MAX_CONTENT_SIZE_MB = int(os.getenv("MAX_CONTENT_SIZE_MB", "512"))

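`_json_value` plus `_as_list` let the IPFS settings accept either a JSON array or a single bare JSON value, and fall back to an empty list when the variable is unset. An illustration, assuming the two helpers from the hunk above are in scope; the multiaddrs are placeholders:

```python
# Illustration of the env parsing above; the multiaddrs are placeholders.
import json, os

os.environ["IPFS_ANNOUNCE_ADDRESSES"] = json.dumps(["/ip4/203.0.113.7/tcp/4001"])
os.environ["IPFS_PRIVATE_BOOTSTRAP"] = '"/ip4/203.0.113.8/tcp/4001"'  # single value, not a list

print(_as_list(_json_value(os.getenv("IPFS_ANNOUNCE_ADDRESSES"), [])))  # ['/ip4/203.0.113.7/tcp/4001']
print(_as_list(_json_value(os.getenv("IPFS_PRIVATE_BOOTSTRAP"), [])))   # ['/ip4/203.0.113.8/tcp/4001']
print(_json_value(os.getenv("IPFS_PEERING_PEERS"), []))                 # [] when unset
```
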
@ -0,0 +1,35 @@
"""
Decentralised storage, replication, and metrics layer.
"""

from .config import dht_config, DHTConfig
from .crypto import compute_node_id, compute_content_id, compute_view_id, bits_from_hex, rendezvous_score
from .keys import MetaKey, MetricKey, MembershipKey
from .membership import MembershipManager, MembershipState, ReachabilityReceipt
from .replication import ReplicationManager, ReplicationState, ReplicaLease
from .metrics import MetricsAggregator, ContentMetricsState, MetricDelta
from .store import DHTStore

__all__ = [
    "dht_config",
    "DHTConfig",
    "compute_node_id",
    "compute_content_id",
    "compute_view_id",
    "bits_from_hex",
    "rendezvous_score",
    "MetaKey",
    "MetricKey",
    "MembershipKey",
    "MembershipManager",
    "MembershipState",
    "ReachabilityReceipt",
    "ReplicationManager",
    "ReplicationState",
    "ReplicaLease",
    "MetricsAggregator",
    "ContentMetricsState",
    "MetricDelta",
    "DHTStore",
]

Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Some files were not shown because too many files have changed in this diff.