
Add firo support to enkrypt #797

Open
reubenyap wants to merge 113 commits into enkryptcom:main from firoorg:main

Conversation

@reubenyap

@reubenyap reubenyap commented Apr 23, 2026

Add Firo network support (transparent + Spark private addresses)

Summary

This PR adds full support for the Firo network to Enkrypt, including both the
standard transparent UTXO wallet and the Spark privacy protocol (private
addresses, mints and spends). It also adds a Firo-specific Electrum client,
activity handler, synchronization flow for the Spark anonymity set, and all of
the UI needed to deposit, send and view balances on both address types.

No changes are made to existing networks; everything is additive under the
Bitcoin-family provider and a new spark-handler library.

What's included

Network / chain integration

  • New networks: firo (mainnet) and firo-testnet, added under the Bitcoin
    provider with icon, config and test coverage.
  • Firo-specific Electrum client (electrum-client.ts, abstract-electrum.ts)
    used for transparent UTXOs, anonymity-set sync, and broadcasting.
  • Firo REST API wrapper (api-firo.ts) and activity handler
    (activity-handlers/providers/firo).

Transparent wallet

  • firo-wallet/base-firo-wallet.ts + configs/utils implement Firo-flavoured
    transparent address derivation, UTXO selection and signing.
  • Transparent send flow (tabs/transparent-send-tab.vue) and updated
    verify-transaction screen.

Spark (private) support

  • spark-handler/ library wrapping the Spark WASM module:
    generateSparkWallet, getMintTxData, getSparkCoinInfo, getFee,
    getSerializedCoin, getTotalMintedAmount, createTempTx,
    createTempFromSparkTx, serializeMintContext.
  • Bundled Spark WASM (libs/utils/wasmModule/spark.{js,wasm}) and a worker copy
    (wasmWorkerModule/) with wasm-loader.ts / wasm-worker-loader.ts.
  • Background worker sparkCoinInfoWorker.ts to decode coin info off the main
    thread.
  • Anonymity-set / tags sync pipeline under libs/utils/updateAndSync/:
    updateCoinSet, updateTagsSet, handleCoinSetUpdates, markCoinsAsUsed,
    calculateCurrentSparkBalance, getSetsFromDb.
  • IndexedDB store (ui/action/db/indexedDB.ts) persisting coins, tags and the
    last scanned coin-set id so re-opening the wallet resumes incrementally
    instead of re-scanning.
  • Composables: synchronize-spark-state, get-spark-address,
    update-activity-state, async-computed.
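As a rough illustration of the resume behaviour described above, the sync only needs to fetch coin sets newer than the last persisted coin-set id; everything else can be served from IndexedDB. A minimal sketch (the helper and field names here are illustrative, not the actual indexedDB.ts API):

```typescript
// Sketch of the incremental-resume logic: a persisted lastScannedCoinSetId
// lets a re-opened wallet skip coin sets it has already processed.
interface CoinSetMeta {
  id: number;
  size: number;
}

// Returns only the sets that still need scanning, so re-opening the wallet
// resumes where the last sync stopped instead of re-scanning everything.
function setsToScan(
  remoteSets: CoinSetMeta[],
  lastScannedId: number | null,
): CoinSetMeta[] {
  if (lastScannedId === null) return remoteSets; // first sync: full scan
  return remoteSets.filter(set => set.id > lastScannedId);
}
```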

UI

  • Spark send tab (tabs/spark-send-tab.vue) and Spark address input
    (send-spark-address-input.vue) with support for both Spark→Spark and
    Spark→transparent sends.
  • Verify screens for Spark send flows
    (verify-send-from-spark-transaction, verify-send-to-spark-transaction).
  • synchronize-state.vue progress UI shown while the anonymity set syncs.
  • Deposit view updated to expose both the transparent and Spark addresses; QR
    encodes Firo:{address}.
  • Accounts list / modal show the Spark address alongside the transparent
    address; non-compatible accounts are filtered out on Firo.
  • Reset-wallet flow clears the Spark IndexedDB.
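The deposit QR payload mentioned above is just the address behind a Firo: prefix; a minimal sketch (the helper name is hypothetical, not a function in the PR):

```typescript
// Builds the string encoded into the deposit QR code, per the PR
// description's "Firo:{address}" format. Works for both the transparent
// and the Spark address shown in the deposit view.
function depositQrPayload(address: string): string {
  return `Firo:${address}`;
}
```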

Keyring / infrastructure

  • packages/keyring bumped and extended with Firo signer hooks.
  • public-keyring / keyring.ts updated to expose Firo key material.
  • Vite config and tsconfig.app.json updated to bundle the WASM assets; new
    .prettierignore entry to skip the generated spark.js.

Notes for reviewers

A few things are worth flagging explicitly so they are not mistaken for bugs:

  • Token list balance is transparent-only. The account row / token list in
    the main wallet view displays only the transparent FIRO balance. It does
    not aggregate transparent + private (Spark) balance. The private balance
    is visible inside the Firo-specific views (send flow, deposit view, activity
    screen) once the anonymity set has synced.
  • Spark spends show as "swapped". Both incoming and outgoing Spark spends
    are rendered with the "swapped" activity icon/label. This is a limitation of
    Enkrypt's activity UI (there is no native "shielded transfer" category to
    map to); we were advised to accept this rather than patch the shared
    activity types.
  • Initial sync is long. On first unlock with a Firo account, the extension
    downloads the full Spark coin set and tag set, decodes coins in a worker,
    and writes them to IndexedDB. The synchronize-state.vue modal reflects
    progress. Subsequent unlocks resume from the last stored coin-set id.
  • WASM memory. Several commits late in the branch tighten WASM pointer
    lifetimes (getMintTxData, getSparkCoinInfo, and the serialContext
    buffer allocation). If you see review questions about manual _free calls,
    that is why.
  • Scope. Only Firo code paths are touched; other Bitcoin-family networks
    (BTC, LTC, DOGE, …) continue to use the existing send-transaction view and
    are unaffected.
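The pointer-lifetime tightening flagged above amounts to the usual Emscripten alloc/copy/free discipline. A minimal sketch, assuming an Emscripten-style module surface (_malloc, _free, HEAPU8 exist on the bundled spark.js; the withWasmBuffer helper itself is illustrative, not part of the PR):

```typescript
// Emscripten-style module surface (illustrative subset).
interface EmscriptenLike {
  _malloc(size: number): number;
  _free(ptr: number): void;
  HEAPU8: Uint8Array;
}

// Copies `data` into WASM memory, runs `fn` on the pointer, and frees the
// buffer even if `fn` throws, so the pointer never outlives the call.
// Note the allocation is sized by the buffer actually being written.
function withWasmBuffer<T>(
  wasm: EmscriptenLike,
  data: Uint8Array,
  fn: (ptr: number, len: number) => T,
): T {
  const ptr = wasm._malloc(data.length);
  try {
    wasm.HEAPU8.set(data, ptr);
    return fn(ptr, data.length);
  } finally {
    wasm._free(ptr);
  }
}
```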

Test plan

  • Import a seed, confirm both transparent and Spark addresses are derived
    and shown in the accounts list and deposit view.
  • Receive transparent FIRO; confirm balance + activity entry.
  • Receive private FIRO (mint); confirm Spark balance appears after sync.
  • Send transparent → transparent.
  • Send transparent → Spark (mint).
  • Send Spark → Spark.
  • Send Spark → transparent (spend).
  • Lock / unlock and confirm sync resumes from IndexedDB.
  • Reset wallet and confirm Spark DB is cleared.
  • Confirm other networks (BTC, ETH, …) are unaffected.

kvhnuke and others added 30 commits December 23, 2024 15:39
…ture/add-firo-network

# Conflicts:
#	packages/extension/src/libs/utils/wasmModule/spark.js
Add TX activity history after sending from Spark address
…ed-spark

Disable activity when spark to spark transaction is failed or dropped
VahanSargsyan and others added 28 commits February 10, 2026 08:00
Fix activity state and add reset DB on reset
2. Fix state init
3. In QR now shows Firo:{address}
1. Accounts list show spark addresses
In getSparkCoinInfo, the WASM buffer for the serial context was allocated
with serializedCoin.length and then populated with serialContext via
HEAPU8.set(serialContext, ...). When serialContext.length exceeds
serializedCoin.length, this writes past the allocated region, corrupting
WASM heap memory and producing non-deterministic failures or undefined
behavior in js_deserializeCoin. Use serialContext.length, matching the
pattern already used in libs/spark-handler/index.ts.

Co-authored-by: Reuben Yap <reuben@firo.org>
@coderabbitai

coderabbitai Bot commented Apr 23, 2026

Walkthrough

This PR adds comprehensive support for the Firo blockchain network and its Spark privacy protocol. It introduces WASM-backed Spark wallet functionality, cryptographic operations, privacy-focused transaction handling, anonymity set synchronization, and associated UI components for transparent and private transaction sending.

Changes

Cohort / File(s) Summary
Network Configuration & Gitignore
.gitignore, README.md, packages/extension/.prettierignore, packages/extension/vite.config.ts
Adds IDE configuration exclusion, updates supported chains list, configures Prettier formatting rules, and reorders import statements in Vite config.
Dependencies
package.json, packages/extension/package.json, packages/keyring/package.json
Adds runtime dependencies: ecpair, bitcoinerlab/secp256k1, bitcoinjs-lib-firo, electrum-client-browser; replaces bip39 with bip32 in keyring.
Firo Network Provider
packages/extension/src/providers/bitcoin/networks/firo.ts, packages/extension/src/providers/bitcoin/networks/firo-testnet.ts, packages/extension/src/providers/bitcoin/networks/index.ts
Introduces new Firo and FiroTestnet network definitions with configuration, API provider type, activity handlers, fee rates, and derivation paths.
Firo API Implementation
packages/extension/src/providers/bitcoin/libs/api-firo.ts
Implements Firo-specific API provider with transaction status retrieval, balance fetching, UTXO listing, and transaction broadcasting via Insight API endpoints.
Electrum Client for Firo
packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts, packages/extension/src/providers/bitcoin/libs/electrum-client/abstract-electrum.ts
Adds FiroElectrum client with WSS peer management and high-level APIs for coin metadata, balances, transaction histories, anonymity sets, and used-coin tags.
Firo Wallet Implementation
packages/extension/src/providers/bitcoin/libs/firo-wallet/base-firo-wallet.ts, packages/extension/src/providers/bitcoin/libs/firo-wallet/configs.ts, packages/extension/src/providers/bitcoin/libs/firo-wallet/utils.ts
Implements BaseFiroWallet with ECDSA support, mnemonic persistence, address derivation, UTXO management, and Spark anonymity set operations; adds utility helpers for chunked processing and random selection.
Spark Transaction Handlers
packages/extension/src/libs/spark-handler/createTempFromSparkTx.ts, packages/extension/src/libs/spark-handler/createTempTx.ts, packages/extension/src/libs/spark-handler/generateSparkWallet.ts, packages/extension/src/libs/spark-handler/getMintTxData.ts, packages/extension/src/libs/spark-handler/getSparkCoinInfo.ts, packages/extension/src/libs/spark-handler/getTotalMintedAmount.ts, packages/extension/src/libs/spark-handler/serializeMintContext.ts, packages/extension/src/libs/spark-handler/utils.ts, packages/extension/src/libs/spark-handler/constants.ts
Comprehensive Spark transaction building, WASM-backed coin info processing, mint transaction construction, fee calculation, and serialization utilities for privacy-focused transactions.
Spark Wallet Sync & State
packages/extension/src/libs/spark-handler/index.ts
Implements sendFromSparkAddress for constructing and broadcasting privacy transactions with coin set management, fee computation, and WASM-based transaction signing.
IndexedDB Persistence
packages/extension/src/ui/action/db/indexedDB.ts
Introduces IndexedDB storage layer for Spark anonymity sets, coin tracking, and metadata with transaction-based read/write operations and set update management.
Spark State Synchronization
packages/extension/src/libs/utils/updateAndSync/calculateCurrentSparkBalance.ts, packages/extension/src/libs/utils/updateAndSync/getSetsFromDb.ts, packages/extension/src/libs/utils/updateAndSync/handleCoinSetUpdates.ts, packages/extension/src/libs/utils/updateAndSync/markCoinsAsUsed.ts, packages/extension/src/libs/utils/updateAndSync/updateCoinSet.ts, packages/extension/src/libs/utils/updateAndSync/updateTagsSet.ts
Implements coin and tag synchronization workflows, IndexedDB persistence for anonymity sets, balance calculation, and marking coins as spent with transaction detail fetching.
WASM Loading & Workers
packages/extension/src/libs/utils/wasm-loader.ts, packages/extension/src/libs/utils/wasm-worker-loader.ts, packages/extension/src/ui/action/workers/sparkCoinInfoWorker.ts
Introduces WASM module loaders with caching and error handling; implements web worker for concurrent coin info processing with WASM integration.
Transaction Signing
packages/extension/src/providers/bitcoin/ui/libs/signer.ts
Adds FiroTransactionSigner for PSBT construction, input signing via keypair mapping, and signature validation for Firo transactions.
Activity Handlers
packages/extension/src/providers/bitcoin/libs/activity-handlers/index.ts, packages/extension/src/providers/bitcoin/libs/activity-handlers/providers/firo/index.ts
Implements Firo activity handler using Insight API to fetch and transform transactions into Activity objects with price data and token metadata.
Keyring & Background
packages/extension/src/libs/keyring/keyring.ts, packages/extension/src/libs/keyring/public-keyring.ts, packages/extension/src/libs/background/index.ts
Exposes getPrivateKey and getSavedMnemonic APIs in KeyRing; updates BackgroundHandler to initialize BaseFiroWallet and handle unlock flow with mnemonic extraction.
Bitcoin Provider Utils
packages/extension/src/providers/bitcoin/libs/utils.ts, packages/extension/src/providers/bitcoin/libs/filter-ordinals.ts, packages/extension/src/libs/utils/accounts.ts, packages/extension/src/libs/utils/initialize-wallet.ts, packages/extension/src/libs/utils/number-formatter.ts
Adds isSparkAddress validator, extends account filtering for FIRO base path, initializes Firo accounts on setup, and adds logging to ordinal filtering.
Type Definitions
packages/extension/src/providers/bitcoin/types/bitcoin-network.ts, packages/extension/src/providers/bitcoin/types/index.ts, packages/extension/src/types/activity.ts, packages/extension/src/types/provider.ts, packages/extension/src/ui/action/types/account.ts
Adds Firo network types, Spark account structures, activity type for spark transactions, and provider storage namespace; extends UTXO/transaction types with Firo-specific fields.
UI Views & Components
packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue, packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue, packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue, packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue, packages/extension/src/providers/bitcoin/ui/send-transaction/verify-transaction/index.vue, packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue, packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue
Restructures send-transaction UI with tab-based switching between transparent and Spark send modes; adds Spark address input validation, send-from-Spark verification, and send-to-Spark verification views.
Activity & Account UI
packages/extension/src/ui/action/views/network-activity/components/network-activity-total.vue, packages/extension/src/ui/action/views/network-activity/components/network-activity-transaction.vue, packages/extension/src/ui/action/views/network-activity/components/synchronize-state.vue, packages/extension/src/ui/action/views/accounts/components/accounts-list-item.vue, packages/extension/src/ui/action/views/accounts/index.vue, packages/extension/src/ui/action/views/modal-accounts/index.vue
Adds Spark balance display, coin synchronization controls, transaction rendering for Spark spends, and Spark address display in account lists.
Deposit & Balance Views
packages/extension/src/ui/action/views/deposit/index.vue, packages/extension/src/ui/action/views/network-activity/index.vue, packages/extension/src/ui/action/views/network-assets/index.vue, packages/extension/src/ui/action/views/send-process/components/send-process-account.vue
Adds Spark address display and QR code generation in deposit view; extends activity views with Spark balance and synchronization state tracking; updates asset views with sync flags.
Lock Screen & Wallet Reset
packages/extension/src/ui/action/views/lock-screen/index.vue, packages/extension/src/ui/action/views/lock-screen/components/lock-screen-forgot.vue, packages/extension/src/ui/action/views/reset-wallet/index.vue
Adds IndexedDB reset on wallet reset/forgot actions; emits spark-state init event during unlock.
App Core & Routing
packages/extension/src/ui/action/App.vue, packages/extension/src/ui/action/main.ts, packages/extension/src/ui/action/router/index.ts, packages/extension/src/ui/onboard/App.vue
Integrates Spark state synchronization, Firo account initialization, IndexedDB connectivity, Electrum connection lifecycle, and new Spark transaction verification routes.
Composables & Utilities
packages/extension/src/ui/action/composables/async-computed.ts, packages/extension/src/ui/action/composables/get-spark-address.ts, packages/extension/src/ui/action/composables/synchronize-spark-state.ts, packages/extension/src/ui/action/composables/update-activity-state.ts, packages/extension/src/ui/action/utils/array-utils.ts, packages/extension/src/ui/action/utils/set-utils.ts, packages/extension/src/ui/action/views/app-menu/index.vue
Implements async computed properties, Spark address derivation, coin set synchronization with worker coordination, activity state updates, array/set utilities, and dev-only fullscreen mode.
App Menu & UI Tweaks
packages/extension/src/ui/action/views/app-menu/index.vue
Adds development-only fullscreen button in app menu.
TypeScript Configuration
packages/extension/tsconfig.app.json
Enables skipLibCheck and allowJs compiler options.
Tests & Mocking
packages/extension/src/providers/bitcoin/tests/bitcoin.address.derivation.test.ts, packages/name-resolution/tests/resolver.test.ts, packages/name-resolution/tests/sid.test.ts, packages/name-resolution/tests/ud.test.ts, packages/keyring/src/index.ts
Adds Spark wallet mocks in address derivation tests; adds commented Vitest mocks for name SDKs; updates name resolution test assertions to Vitest style.

Sequence Diagrams

sequenceDiagram
    participant User as User/UI
    participant App as App Component
    participant Sync as Coin Sync Service
    participant IDB as IndexedDB
    participant Electrum as Electrum Client
    participant WASM as WASM Module
    participant Worker as Coin Info Worker

    User->>App: Network initialized (Firo)
    App->>Sync: startCoinSetSync()
    Sync->>Electrum: Fetch remote anonymity sets
    Electrum-->>Sync: Set metadata
    Sync->>IDB: Load local anonymity sets
    IDB-->>Sync: Stored sets
    Sync->>Electrum: Fetch anonymity set sectors
    Electrum-->>Sync: Coin data
    Sync->>IDB: Update sets with new coins
    Sync->>Worker: Process coin info (via worker)
    Worker->>WASM: Load module + view keys
    Worker->>WASM: getSparkCoinInfo per coin
    WASM-->>Worker: Coin metadata (value, used state)
    Worker->>IDB: Store processed coins
    Worker-->>App: Coin processing complete
    App->>App: Update sparkBalance
    App-->>User: Display private balance
sequenceDiagram
    participant User as User/UI
    participant SendTab as Spark Send Tab
    participant Handler as Spark Handler
    participant Electrum as Electrum Client
    participant WASM as WASM Module
    participant IDB as IndexedDB
    participant Signer as Firo Signer

    User->>SendTab: Enter address & amount
    SendTab->>Handler: sendFromSparkAddress()
    Handler->>IDB: Load owned coins + used tags
    Handler->>Handler: Filter and select coins
    Handler->>Electrum: Fetch UTXO data
    Handler->>WASM: Initialize Spark keys
    Handler->>WASM: Build recipients/memos
    Handler->>IDB: Load anonymity set data
    Handler->>WASM: Create spend transaction
    WASM-->>Handler: Serialized tx + scripts
    Handler->>Signer: Sign transaction
    Signer->>WASM: Validate + finalize
    Handler->>Electrum: Broadcast tx
    Electrum-->>Handler: txid
    Handler-->>SendTab: Broadcast success
    SendTab-->>User: Transaction confirmed

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

Suggested reviewers

  • gamalielhere
  • kvhnuke
  • NickKelly1


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 14

Note

Due to the large number of review comments, Critical severity comments were prioritized as inline comments.

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (5)
packages/extension/src/providers/bitcoin/types/index.ts (1)

91-128: ⚠️ Potential issue | 🟠 Major

Move vin.addr into Firo-only type definition.

Lines 100–102 make addr required in the shared SSTxType.vin, but the SS activity handler (line 34 of activity-handlers/providers/ss/index.ts) filters by vi.addresses and never accesses addr, while the Firo activity handler (line 32 of activity-handlers/providers/firo/index.ts) filters by vi.addr and never uses addresses. These are incompatible shapes from different explorers. Requiring addr in the shared type breaks existing SS consumers that only receive addresses in their API responses.

Make addr optional in SSTxType.vin and override vin in FiroTxType to require addr:

vin: {
  txid: string;
  addresses?: string[];
-  addr: string;
+  addr?: string;
  value: string;
}[];

export interface FiroTxType extends Omit<
  SSTxType,
-  'fee' | 'timestamp' | 'vout' | 'blockHeight'
+  'fee' | 'timestamp' | 'vout' | 'blockHeight' | 'vin'
> {
  fees: number;
  time: number;
  blockheight: number;
+  vin: (Omit<SSTxType['vin'][number], 'addr'> & {
+    addr: string;
+  })[];
  vout: {
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/providers/bitcoin/types/index.ts` around lines 91 -
128, Make SSTxType.vin's addr optional (addr?: string) so the shared SS type no
longer requires a field SS explorers don't provide, and then explicitly override
vin in FiroTxType to a Firo-specific shape that requires addr (e.g., add a vin
declaration in FiroTxType with addr: string non-optional and the same other
properties). Update only the type definitions SSTxType and FiroTxType (leave
activity handlers as-is) so SS consumers get addresses[] and Firo consumers get
addr.
packages/extension/src/ui/action/views/lock-screen/index.vue (1)

57-90: ⚠️ Potential issue | 🔴 Critical

Declared update:init-spark-state emit is never fired.

defineEmits now declares (e: 'update:init-spark-state', password: string), but unlockAction only emits update:init and the password is cleared immediately after (line 85). Whatever listener binds @update:init-spark-state in the parent will never be invoked, so the Firo Spark initialization flow this emit is meant to trigger will silently not run. Either emit it alongside update:init before clearing the password, or drop the declaration.

🔧 Proposed fix
     isError.value = false;
-    password.value = '';
     emit('update:init');
+    emit('update:init-spark-state', password.value.trim());
+    password.value = '';
     setTimeout(() => (isUnlocking.value = false), 750);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/ui/action/views/lock-screen/index.vue` around lines 57
- 90, The declared emit 'update:init-spark-state' is never used; update the
unlockAction in this component to emit('update:init-spark-state',
password.value.trim()) before clearing password.value (and before or alongside
emit('update:init')) so the parent receives the password for Firo Spark
initialization, or remove the ('update:init-spark-state', password: string)
signature from defineEmits if you choose to drop that flow; locate defineEmits
and the unlockAction function to apply the change.
packages/extension/src/ui/action/components/accounts-header/components/header-accounts.vue (1)

6-28: ⚠️ Potential issue | 🟡 Minor

Handle empty sparkAddress gracefully on Firo.

sparkAddress defaults to '' and only becomes populated once accountInfo.sparkAccount?.defaultAddress resolves. While the prop is empty on the Firo network, network.identicon('') and replaceWithEllipsis('', 6, 4) will render a placeholder identicon and an empty/ellipsed address label. Consider falling back to address (the transparent address) until sparkAddress is ready, to avoid a transient broken-looking header:

-      network.name === NetworkNames.Firo ? sparkAddress : address,
+      network.name === NetworkNames.Firo && sparkAddress
+        ? sparkAddress
+        : address,

The same guard should apply to the replaceWithEllipsis call at lines 22-28.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/ui/action/components/accounts-header/components/header-accounts.vue`
around lines 6 - 28, When NetworkNames.Firo is active, the template currently
passes sparkAddress directly to network.identicon and
$filters.replaceWithEllipsis even when sparkAddress is an empty string; update
both usages to fall back to address when sparkAddress is falsy so the
transparent address is shown until accountInfo.sparkAccount?.defaultAddress
resolves. Specifically change the identicon src binding (network.identicon(...))
and the display span (the $filters.replaceWithEllipsis(...) call) to use
(sparkAddress || address) when network.name === NetworkNames.Firo, keeping all
other logic intact.
packages/extension/src/ui/action/views/network-activity/index.vue (1)

303-328: ⚠️ Potential issue | 🟠 Major

Pending spark_transaction activities are never polled.

setActivities only dispatches pending items for ActivityType.transaction (to checkActivity) and ActivityType.swap (to checkSwap). Pending entries with the new ActivityType.spark_transaction are silently skipped, so a Spark spend that is in ActivityStatus.pending at load time will stay pending in the UI indefinitely — no status transition to success/failed/dropped will be persisted via updateActivity. Consider adding a branch (e.g., checkActivity-equivalent using the Firo API's getTransactionStatus) for spark_transaction.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/ui/action/views/network-activity/index.vue` around
lines 303 - 328, setActivities currently only triggers polling for
ActivityType.transaction (checkActivity) and ActivityType.swap (checkSwap), so
pending activities with ActivityType.spark_transaction are never polled; add a
branch in setActivities to detect (act.type === ActivityType.spark_transaction
&& (act.status === ActivityStatus.pending || act.status ===
ActivityStatus.dropped)) and call a new or existing routine that uses the Firo
API (e.g., getTransactionStatus) to poll and then persist results via
updateActivity (mirror behavior of checkActivity), or implement a dedicated
checkSparkTransaction function that performs the getTransactionStatus calls and
calls updateActivity when status changes.
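A minimal sketch of the routing that comment asks for, with simplified stand-ins for Enkrypt's activity types and checker functions (none of these names are the real API; the point is only that spark_transaction gets its own branch instead of being skipped):

```typescript
// Simplified stand-ins for ActivityType / ActivityStatus / Activity.
type ActivityType = 'transaction' | 'swap' | 'spark_transaction';
type ActivityStatus = 'pending' | 'success' | 'failed' | 'dropped';

interface Activity {
  type: ActivityType;
  status: ActivityStatus;
  transactionHash: string;
}

type Checker = (act: Activity) => void;

// Routes every pending activity to a type-specific poller, so pending
// spark_transaction entries are polled instead of staying pending forever.
function dispatchPending(
  activities: Activity[],
  checkers: Record<ActivityType, Checker>,
): void {
  for (const act of activities) {
    if (act.status !== 'pending') continue;
    checkers[act.type](act);
  }
}
```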
packages/extension/src/ui/action/views/deposit/index.vue (1)

161-180: ⚠️ Potential issue | 🟠 Major

Default to a real transparent fallback when no Spark account exists.

SparkAccount | null currently defaults to {}, so !!sparkAccount becomes true and the Spark tab can render undefined. Also, resetting activeTab to spark hides the transparent address when no Spark account is available.

Proposed fix
 const isCopied = ref(false);
-const activeTab = ref<'transparent' | 'spark'>('spark');
+const activeTab = ref<'transparent' | 'spark'>('transparent');
@@
   sparkAccount: {
     type: Object as PropType<SparkAccount | null>,
-    default: () => {
-      return {};
-    },
+    default: null,
   },
@@
 watch(
-  () => props.showDeposit,
-  v => {
-    if (!v) {
+  () => [props.showDeposit, props.sparkAccount] as const,
+  ([showDeposit, sparkAccount]) => {
+    if (!showDeposit) {
       isCopied.value = false;
-      activeTab.value = 'spark';
     }
+    activeTab.value = sparkAccount ? 'spark' : 'transparent';
   },
 );

Also applies to: 214-220

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/ui/action/views/deposit/index.vue` around lines 161 -
180, The Spark account prop currently declares type SparkAccount | null but
defaults to an empty object, causing !!sparkAccount to be truthy and allowing
the Spark tab to render undefined; update the prop default for sparkAccount to
return null (or undefined) so the value is truly falsy when absent, and adjust
activeTab initialization and any reset logic (activeTab ref and places where
it’s reset to 'spark') to fall back to 'transparent' when sparkAccount is falsy
(use the sparkAccount prop and checks around activeTab in the component to
conditionally set 'spark' only when sparkAccount exists); apply the same
default-null fix to the other identical prop occurrence referenced in the file.
♻️ Duplicate comments (1)
packages/name-resolution/tests/sid.test.ts (1)

4-24: ⚠️ Potential issue | 🟡 Minor

Commented-out mocks won't work and may be unnecessary.

This commented mock scaffolding has the same issues as in resolver.test.ts:

  1. Uses vi.fn() and vi.mock() without importing vi from Vitest
  2. Unclear relevance to Firo blockchain support

Additionally, the comment on line 4 says these mocks are "to pass the tests," but the entire test suite is skipped (line 25: describe.skip(...)). If the tests are intentionally disabled, these mock stubs serve no purpose.

Consider either removing the commented code or properly implementing it if you plan to re-enable these tests.

🔧 Proposed fix to add missing import

If these mocks are needed when tests are re-enabled:

-import { describe, it, expect } from "vitest";
+import { describe, it, expect, vi } from "vitest";
 import SIDResolver from "../src/sid";
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/name-resolution/tests/sid.test.ts` around lines 4 - 24, The
commented-out vi.mock/vi.fn scaffolding in sid.test.ts is dead or broken: either
remove these commented mocks (they're unnecessary while the suite is skipped via
describe.skip) or implement them correctly before re-enabling tests; to
implement, import vi from Vitest (e.g. import { vi } from 'vitest'), register
mocks for the exact modules used (e.g. "@web3-name-sdk/core",
"@web3-name-sdk/core/solName", "@web3-name-sdk/core/paymentIdName") and ensure
the mocked methods (getDomainName/getAddress) reflect the actual Firo SDK
surface if Firo support is intended, then re-enable the suite by removing
describe.skip once mocks are correct.
🟡 Minor comments (27)
packages/name-resolution/tests/resolver.test.ts-4-24 (1)

4-24: ⚠️ Potential issue | 🟡 Minor

Commented-out mocks use undefined vi without importing it.

Lines 4-22 contain mock scaffolding that calls vi.fn() and vi.mock(), but vi is not imported from Vitest. If uncommented, these would immediately fail with a ReferenceError.

Since no Firo-related changes appear in this test file, this commented code appears to be unrelated scaffolding. Either:

  • Remove it if not needed
  • Implement it properly by adding import { vi } from "vitest" to the imports
  • Move it to a separate PR if it's preparatory work
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/name-resolution/tests/resolver.test.ts` around lines 4 - 24, The
commented mock scaffolding uses Vitest's global "vi" (vi.mock, vi.fn) but there
is no import for vi, so if you keep/uncomment these lines either remove the
unrelated mock block altogether, move it to a separate preparatory PR, or add an
explicit import "import { vi } from 'vitest'" at the top of the test file;
specifically modify the mocked calls that reference vi (the vi.mock(...) blocks
and vi.fn() usages that create createWeb3Name/createSolName/createPaymentIdName
stubs) to ensure vi is imported when enabling these mocks.
packages/extension/src/libs/utils/number-formatter.ts-316-317 (1)

316-317: ⚠️ Potential issue | 🟡 Minor

Remove the @ts-ignore suppression or replace replaceAll with replace(/%/g, '').

Line 316 suppresses a type error instead of fixing it. However, other files in the same package (substrate.ts, acala.ts, asset-handler.ts) use replaceAll without suppression and compile successfully, indicating the @ts-ignore is likely unnecessary. Either remove the comment, or use the proposed fix to ensure type safety without suppression:

Proposed fix
-  // `@ts-ignore`
-  const value = new BigNumber(_value.toString().replaceAll('%', ''));
+  const value = new BigNumber(_value.toString().replace(/%/g, ''));
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/libs/utils/number-formatter.ts` around lines 316 -
317, Remove the TypeScript suppression and fix the string replacement in the
BigNumber construction: replace the line using _value.toString().replaceAll('%',
'') with _value.toString().replace(/%/g, '') (or simply remove the // `@ts-ignore`
if replaceAll is allowed), so update the code that builds value (const value =
new BigNumber(...)) to use the regex replace on _value and eliminate the
`@ts-ignore` comment.
packages/extension/src/providers/bitcoin/libs/utils.ts-21-26 (1)

21-26: ⚠️ Potential issue | 🟡 Minor

Normalize the boolean flag to a number before ccall.

Line 25 passes a boolean isTestNetwork directly, but the WASM function signature declares the second argument as 'number'. This is inconsistent with the same pattern in getMintTxData.ts (line 58), which converts the value explicitly using Number(isTestNetwork).

Proposed fix
  const isAddressValid = wasm.ccall(
    'js_isValidSparkAddress',
    'number',
    ['string', 'number'],
-   [address, isTestNetwork],
+   [address, Number(isTestNetwork)],
  );
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/providers/bitcoin/libs/utils.ts` around lines 21 - 26,
The call to wasm.ccall for js_isValidSparkAddress passes a boolean for the
second parameter but the WASM signature expects a number; update the wasm.ccall
invocation in isAddressValid to normalize isTestNetwork to a numeric value
(e.g., Number(isTestNetwork)) before passing it so the argument types match the
declared ['string','number'] signature.
packages/extension/src/libs/utils/updateAndSync/calculateCurrentSparkBalance.ts-7-18 (1)

7-18: ⚠️ Potential issue | 🟡 Minor

Validate the IndexedDB payload before filtering.

readData returns persisted local data, so a malformed myCoins value will make .filter throw and break the Spark balance refresh. Mirror the defensive fallback used by getSetsFromDb.

Proposed fix
-export const calculateCurrentSparkBalance = async () => {
-  const myCoins = await db.readData<MyCoinModel[]>(DB_DATA_KEYS.myCoins);
+export const calculateCurrentSparkBalance = async (): Promise<string> => {
+  const myCoins = (await db.readData<MyCoinModel[]>(DB_DATA_KEYS.myCoins)) ?? [];
 
-  if (!myCoins?.length) {
+  if (!Array.isArray(myCoins) || !myCoins.length) {
     return '0';
   }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/libs/utils/updateAndSync/calculateCurrentSparkBalance.ts`
around lines 7 - 18, The calculateCurrentSparkBalance function assumes
readData(DB_DATA_KEYS.myCoins) returns a valid array and directly calls .filter,
which can throw on malformed IndexedDB payloads; before filtering, validate
myCoins with Array.isArray and normalize it (e.g., fallback to an empty array)
and ensure items have the expected shape (check existence of isUsed and
numeric/value fields) before reducing; update the logic in
calculateCurrentSparkBalance to mirror the defensive checks used in
getSetsFromDb so a bad payload returns '0' rather than throwing.
packages/extension/src/ui/action/views/accounts/components/accounts-list-item.vue-14-22 (1)

14-22: ⚠️ Potential issue | 🟡 Minor

Initial-render flicker and wrong-address display for Firo accounts.

computedAsync resolves asynchronously, so on first render sparkDisplayAddress is undefined (no evaluating/initial value provided). During that window, with network.name === NetworkNames.Firo && active, replaceWithEllipsis(undefined, 6, 4) runs on undefined, which will either throw or produce a garbled ellipsis string depending on the filter's implementation. Pass an initial value to computedAsync and guard the template against falsy values (e.g., fall back to the transparent address until the spark address resolves).

🔧 Proposed fix
-const sparkDisplayAddress = computedAsync(async () => {
-  if (props.network.name !== NetworkNames.Firo) return null;
-  return await getSparkAddress(props.network, props.address);
-});
+const sparkDisplayAddress = computedAsync(async () => {
+  if (props.network.name !== NetworkNames.Firo) return null;
+  return (await getSparkAddress(props.network, props.address)) ?? props.address;
+}, props.address);

Also applies to: 98-101

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/ui/action/views/accounts/components/accounts-list-item.vue`
around lines 14 - 22, The Firo account label uses sparkDisplayAddress (from
computedAsync) before it resolves, causing flicker/garbled output; update the
computedAsync that produces sparkDisplayAddress to supply a safe initial value
(e.g., address or empty string) and change the template expression so it only
calls $filters.replaceWithEllipsis(sparkDisplayAddress, 6, 4) when
sparkDisplayAddress is truthy, otherwise fall back to using address; reference
the sparkDisplayAddress computed, the computedAsync initializer, and the
template usage around network.name === NetworkNames.Firo && active, and ensure
the same guard is applied in the other occurrence (lines referenced 98-101).
packages/extension/src/libs/keyring/public-keyring.ts-109-118 (1)

109-118: ⚠️ Potential issue | 🟡 Minor

Dev mock account missing isTestWallet: true.

Every other __IS_DEV__ mock in this block sets isTestWallet: true; the new Firo entry omits it. If any UI filters dev accounts by isTestWallet, this one will leak through as a "real" account in dev builds.

         walletType: WalletType.mnemonic,
         isHardware: false,
+        isTestWallet: true,
       };
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/libs/keyring/public-keyring.ts` around lines 109 -
118, The dev mock entry in public-keyring.ts for allKeys key
'TMSnbcpSw9JhteaJqioT2sz2sW1Qhqyf2Q' is missing the isTestWallet flag; update
that object in allKeys (the Firo account mock) to include isTestWallet: true so
it matches other __IS_DEV__ mocks and will be filtered correctly by UI code that
checks isTestWallet.
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue-241-245 (1)

241-245: ⚠️ Potential issue | 🟡 Minor

setMaxValue does not guard against empty/NaN assetMaxValue.

assetMaxValue starts as '' (the initial default of useAsyncComputed). parseFloat('') yields NaN, so the NaN < 0 guard evaluates to false and amount.value becomes '', which then flows into toBase('', decimals). This is likely benign but also does not actually set a max. Guard with a check that assetMaxValue.value is a non-empty numeric string before applying it.
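A minimal sketch of such a guard (helper names are illustrative, not from the codebase):

```typescript
// Accept only non-empty, finite, non-negative numeric strings.
function isNumericBalance(value: string): boolean {
  const trimmed = value.trim();
  if (trimmed === '') return false;
  const parsed = Number(trimmed);
  return Number.isFinite(parsed) && parsed >= 0;
}

// Guarded "max" assignment: keep the current amount when the
// async balance has not resolved to a usable numeric string yet.
function guardedMax(assetMaxValue: string, currentAmount: string): string {
  return isNumericBalance(assetMaxValue) ? assetMaxValue : currentAmount;
}
```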

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue`
around lines 241 - 245, The setMaxValue handler should guard against empty or
non-numeric assetMaxValue before assigning amount: in the setMaxValue function
(where isMaxSelected.value, amount.value and assetMaxValue.value are used) check
that assetMaxValue.value is a non-empty numeric string (e.g. trim and use
parseFloat + isNaN or Number.isFinite) and only then set amount.value to
assetMaxValue.value; otherwise set amount.value to '0' (or leave it unchanged
per UI expectation) to avoid passing ''/NaN into toBase or downstream
conversions.
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue-233-238 (1)

233-238: ⚠️ Potential issue | 🟡 Minor

Don't overwrite the asset name and drop the commented-out code.

  • Line 238 sets availableAsset.value.name = 'Available Balance', which changes the token's identity to a UI label. If name is consumed anywhere else (send review screen, metrics, tx building) this will surface the literal string "Available Balance" instead of "Firo". Either rename to a local label variable or avoid mutating the token's name.
  • Remove the commented-out block at lines 233–235 (and the similar block at 148–158) to keep the file clean.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue`
around lines 233 - 238, The code incorrectly mutates the token identity by
setting availableAsset.value.name = 'Available Balance' after calling
calculateCurrentSparkBalance(); revert that mutation and instead introduce or
use a separate local label (e.g., availableAssetLabel or a computed property
used by the UI) so the token's name (e.g., "Firo") remains unchanged where it's
consumed (send review, metrics, tx building), and remove the stale commented-out
balance assignment blocks (the commented snippet around
calculateCurrentSparkBalance and the similar commented block elsewhere) to clean
up the file.
packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue-100-100 (1)

100-100: ⚠️ Potential issue | 🟡 Minor

Remove debug console.log.

console.log({ isLoadingAssets: isLoadingAssets.value }) is a debug artifact; please remove before release.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue` at
line 100, Remove the debug console.log statement that prints the reactive
flag—delete the line calling console.log({ isLoadingAssets:
isLoadingAssets.value }) from the send-transaction component (index.vue) so no
debug output remains; locate the usage of isLoadingAssets.value in the component
and simply remove the console.log call, leaving the reactive variable untouched.
packages/extension/src/ui/action/App.vue-191-210 (1)

191-210: ⚠️ Potential issue | 🟡 Minor

Remove debug console.log and clean up the balance-refresh closure.

console.log('UPDATING BALANCE') at line 194 is a debug artifact and should be removed before release. Also consider using ?? when defaulting sparkBalance, since 0 (a legitimate zero balance) would be replaced with '0' string by the || operator — currently harmless because calculateCurrentSparkBalance() likely returns a string, but worth keeping consistent.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/ui/action/App.vue` around lines 191 - 210, Remove the
debug console.log inside the callback passed to useSynchronizeSparkState and
replace the sparkBalance fallback from using || to nullish coalescing (??) so a
legitimate zero balance is preserved; specifically, in the closure that calls
calculateCurrentSparkBalance() and updates accountHeaderData (referencing
accountHeaderData.value.sparkAccount and sparkBalance), delete
console.log('UPDATING BALANCE') and set availableBalance to sparkBalance ?? '0';
keep the rest of the update logic intact and leave useUpdateActivityState and
sparkUnusedTxDetails usage unchanged.
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue-192-222 (1)

192-222: ⚠️ Potential issue | 🟡 Minor

sparkAccount.defaultAddress may be undefined when Send is clicked.

sendAction reads props.sparkAccount.defaultAddress unconditionally. If sparkAccount is {} (its default), fromAccountAddress will be undefined and the verification route will receive fromAddress: undefined in the base64 payload. There's no guard here — consider either returning early with an error or disabling the Send button when sparkAccount.defaultAddress is missing.
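One way to express the guard, sketched with an assumed prop shape (the interface and helper are illustrative):

```typescript
// Assumed shape of the sparkAccount prop; defaultAddress may be absent
// when the prop defaults to {}.
interface SparkAccount {
  defaultAddress?: string;
}

// Resolve the from-address, or return null so the caller can abort the
// send (and the template can disable the Send button on the same check).
function resolveFromAddress(
  sparkAccount: SparkAccount | undefined,
): string | null {
  const address = sparkAccount?.defaultAddress;
  return address && address.length > 0 ? address : null;
}
```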

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue`
around lines 192 - 222, sendAction currently reads
props.sparkAccount.defaultAddress without checking it, which can cause
fromAddress to be undefined in the router.push payload; update sendAction
(and/or the Send button logic) to guard against a missing address: inside
sendAction check props.sparkAccount && props.sparkAccount.defaultAddress and if
missing either return early and surface an error/notification (e.g., call a
provided showError method) or prevent navigation, and additionally ensure the
Send button is disabled when props.sparkAccount.defaultAddress is falsy so
router.push (to RouterNames.verifySendFromSpark.name) is never called with an
undefined fromAddress.
packages/extension/src/libs/utils/updateAndSync/handleCoinSetUpdates.ts-54-67 (1)

54-67: ⚠️ Potential issue | 🟡 Minor

Possible sparse array when setId skips indices.

If updatedSets.length < setId - 1, assigning updatedSets[setIndex] = {...} creates holes (undefined entries) that are later persisted. Depending on how saveData serializes, holes could become nulls and later break consumers that expect contiguous StoredAnonymitySet entries (and the deduplication relying on existingSet?.coins). Consider filling any gap explicitly (e.g. with an empty set shell) or at least validating that setId - 1 <= updatedSets.length.
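A sketch of the gap-filling option (the shell shape mirrors the StoredAnonymitySet fields named in this review; the helper is illustrative):

```typescript
interface StoredAnonymitySet {
  blockHash: string;
  setHash: string;
  coins: string[];
}

// Pad the array with empty shells up to setIndex so a subsequent
// assignment at setIndex appends densely instead of creating holes.
function ensureDense(
  sets: StoredAnonymitySet[],
  setIndex: number,
): StoredAnonymitySet[] {
  const padded = sets.slice();
  while (padded.length < setIndex) {
    padded.push({ blockHash: '', setHash: '', coins: [] });
  }
  return padded;
}
```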

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/libs/utils/updateAndSync/handleCoinSetUpdates.ts`
around lines 54 - 67, The loop over normalizedUpdates can create sparse arrays
when setId skips indices; in the updatedSets assignment inside the forEach,
ensure contiguous entries by checking setIndex against updatedSets.length and
filling any gaps with an empty StoredAnonymitySet shell (e.g., { blockHash: '',
setHash: '', coins: [] }) before the assignment; keep using
mergeCoins(existingSet?.coins ?? [], update.set.coins) so deduplication still
works, or alternatively validate and throw if setIndex > updatedSets.length to
avoid holes—update the logic in the normalizedUpdates.forEach block (referencing
normalizedUpdates, setId/setIndex, updatedSets, existingSet, and mergeCoins).
packages/extension/src/ui/action/composables/synchronize-spark-state.ts-135-135 (1)

135-135: ⚠️ Potential issue | 🟡 Minor

Remove debug console.log.

console.log({ coinDone, tagDone, workerDone }) inside the sync-state watcher is a debug artifact that will run on every sync-flag change. Please remove before release. (Same applies to console.log('Worker already running, skip start') at line 45.)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/ui/action/composables/synchronize-spark-state.ts` at
line 135, Remove the debug console.log statements in the sync-state watcher and
worker start check: delete the console.log({ coinDone, tagDone, workerDone })
inside the watcher (where coinDone, tagDone, workerDone are referenced) and
remove the console.log('Worker already running, skip start') in the worker-start
logic (the branch that checks if the worker is already running). Leave any real
logging mechanism intact if present or replace with a proper logger call if
needed; ensure no stray console.log calls remain in synchronize-spark-state.ts.
packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue-156-160 (1)

156-160: ⚠️ Potential issue | 🟡 Minor

Fix activity status and address inconsistency for Spark transactions.

On success, the activity remains in ActivityStatus.pending after broadcast (line 177—no status update). Set it to ActivityStatus.success before recording. Additionally, on failure (line 207), the activity is written under the raw txData.fromAddress, while success uses network.value.displayAddress(txData.fromAddress), causing failed transactions to appear under a different address key than successful ones.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue`
around lines 156 - 160, The activity status and address key are inconsistent:
after broadcast via sendFromSparkAddress you must update the activity from
ActivityStatus.pending to ActivityStatus.success before recording (set
ActivityStatus.success on the successful branch where rawTx is returned), and
when writing the activity in the failure branch use the same normalized display
address as success (network.value.displayAddress(txData.fromAddress)) instead of
raw txData.fromAddress so both success and failure entries get stored under the
same address key.
packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue-246-263 (1)

246-263: ⚠️ Potential issue | 🟡 Minor

isIncoming: txData.fromAddress === txData.toAddress is semantically incorrect for a send flow.

We're in a send-to-Spark confirmation flow, so the activity is intrinsically outgoing from the user's perspective. Setting isIncoming true only when fromAddress === toAddress (sending to self) misclassifies self-sends as "incoming" in the activity list, while every other send is correctly classified as outgoing. Spark's mint-to-self is still an outgoing transaction from the wallet's viewpoint (fees are paid, UTXOs consumed). Set isIncoming: false.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue`
around lines 246 - 263, The isIncoming flag on the txActivity object is wrong
for the send-to-Spark flow: change the isIncoming assignment in the txActivity
construction (the isIncoming field on the txActivity object created in
index.vue) to false so all send-to-Spark activities are recorded as outgoing
(keep other fields like ActivityType.transaction, transactionHash, token, value,
etc. unchanged); do not use txData.fromAddress === txData.toAddress to infer
incoming for this send flow.
packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue-181-195 (1)

181-195: ⚠️ Potential issue | 🟡 Minor

Magic number 500 in change calc obscures intent and risks over/under-funding.

changeAmount: inputAmountBn.minus(amountToSendBN).minus(new BigNumber(500)),

500 appears to be a placeholder fee used solely to build tempTx for fee estimation, but it's not labeled and the same literal is not reconciled anywhere else. If getFee later returns a value that, when combined with requestedAmount, exceeds inputAmountBn - 500, the real psbt build below still proceeds and only later the inputAmountBn.lt(amountToSendBN.plus(feeBn)) guard throws. Extract a named constant and document why.
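A sketch of the named-constant approach, using bigint in place of BigNumber for a self-contained example (constant and function names are illustrative):

```typescript
// Conservative placeholder fee (base units) used only to build the
// initial tempTx for fee estimation; the real fee returned by getFee
// must be reconciled against the same constant later.
const ESTIMATED_FEE_BUFFER = 500n;

// Change = inputs - amount - buffer; fail fast instead of letting the
// shortfall surface later in the psbt-build guard.
function estimateChange(inputAmount: bigint, amountToSend: bigint): bigint {
  const change = inputAmount - amountToSend - ESTIMATED_FEE_BUFFER;
  if (change < 0n) {
    throw new Error('Insufficient funds for amount plus estimated fee buffer');
  }
  return change;
}
```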

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue`
around lines 181 - 195, The change calculation in createTempTx uses a magic
literal new BigNumber(500) which obscures intent and can mismatch later fee
checks; replace that literal with a clearly named constant (e.g.
ESTIMATED_FEE_BUFFER = new BigNumber(500)) and add a short comment documenting
that it is a conservative fee buffer used only for initial tempTx fee
estimation; update the changeAmount expression in createTempTx and ensure any
subsequent fee comparison logic (e.g. where getFee is used and the
inputAmountBn.lt(amountToSendBN.plus(feeBn)) guard) consistently references the
same named constant or is reconciled with it so the placeholder buffer is
obvious and consistent across createTempTx, changeAmount and fee validation.
packages/extension/src/ui/action/db/indexedDB.ts-136-144 (1)

136-144: ⚠️ Potential issue | 🟡 Minor

if (index) treats index 0 as unset.

Passing a valid index = 0 takes the return data?.length branch instead of returning data[0].coins.length. Use if (index !== undefined).
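A minimal sketch of the corrected branch (types simplified for illustration):

```typescript
interface SetEntry {
  coins: unknown[];
}

// `index !== undefined` keeps 0 as a valid index; a bare `if (index)`
// would silently fall through to the array-length branch.
function getLengthOf(data: SetEntry[] | undefined, index?: number): number {
  if (index !== undefined) {
    const entry = data?.[index];
    if (!entry) throw new Error('Invalid index');
    return entry.coins.length;
  }
  return data?.length ?? 0;
}
```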

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/ui/action/db/indexedDB.ts` around lines 136 - 144, In
getLengthOf, the check "if (index)" treats 0 as unset; change the condition to
explicitly detect a provided index (e.g., use "index !== undefined" or "index !=
null") so index = 0 is handled; keep the existing validation of data[index] and
error throw (Invalid index) and return data[index].coins.length when an index is
provided, otherwise return data?.length; references: getLengthOf and readData.
packages/extension/src/ui/action/db/indexedDB.ts-119-129 (1)

119-129: ⚠️ Potential issue | 🟡 Minor

removeSector is a misleading name — it keeps the slice, it doesn't remove it.

data[index].coins = data[index].coins.slice(startIndex, endIndex) retains the [startIndex, endIndex) range and discards the rest. That's the opposite of "removing a sector". Either rename (e.g. keepSector/sliceCoins) or actually splice-out the sector. Also consider validating startIndex <= endIndex and clamping to array bounds.
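If removal is the intended behavior, it can be sketched like this (shown as a pure function over the coins array for clarity; the real method mutates data[index].coins in place):

```typescript
// Remove coins[startIndex, endIndex) — the opposite of slice-and-keep —
// with both bounds clamped to the array and start <= end enforced.
function removeSector(
  coins: number[],
  startIndex: number,
  endIndex: number,
): number[] {
  const start = Math.max(0, Math.min(startIndex, coins.length));
  const end = Math.max(start, Math.min(endIndex, coins.length));
  const result = coins.slice();
  result.splice(start, end - start);
  return result;
}
```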

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/ui/action/db/indexedDB.ts` around lines 119 - 129, The
method removeSector currently assigns data[index].coins =
data[index].coins.slice(startIndex, endIndex) which keeps the slice rather than
removing it; update the implementation of removeSector (or rename it to reflect
behavior, e.g., keepSector or sliceCoins) to either (A) actually remove the
range using an in-place splice on data[index].coins (e.g.,
data[index].coins.splice(startIndex, endIndex - startIndex)) or (B) if you
intend to keep the slice, rename removeSector to sliceCoins/keepSector; in
either case add validation in removeSector (or the renamed method) to ensure
startIndex <= endIndex and clamp both indices to [0, data[index].coins.length]
before performing the operation and preserve throwing the existing 'Invalid
index' if data[index] is missing.
packages/extension/src/libs/spark-handler/index.ts-321-332 (1)

321-332: ⚠️ Potential issue | 🟡 Minor

Hardcoded 'Private memo' string is embedded in every outgoing Spark spend.

Every Spark send created through this path ships the literal string 'Private memo' as the encrypted memo. While memos are encrypted in Spark, this is still a confusing default and leaks a fingerprint that identifies transactions as originating from this extension build. Make this configurable (empty string by default, or thread the user's memo from the UI).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/libs/spark-handler/index.ts` around lines 321 - 332,
The code currently passes the hardcoded string 'Private memo' into Module.ccall
when invoking js_addPrivateRecipientForCreateSparkSpendTransaction; replace this
literal with a memo variable (e.g., memoString) that is either passed into the
surrounding function or read from the UI/options so it defaults to an empty
string when not provided. Locate the call to Module.ccall and the surrounding
function that builds privateRecipientsVector (and where isSparkTransaction,
recipientAddressObj, senderAddressObj, privateRecipientAmount are available) and
change the argument from the literal to the memo variable, ensuring the ccall
still uses the 'string' type for that parameter and add minimal validation
(coerce undefined/null to '') before calling.
packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts-33-38 (1)

33-38: ⚠️ Potential issue | 🟡 Minor

Hardcoded peer port 50004 ignores the Peer.tcp/Peer.ssl config you just defined.

hardcodedPeers entries carry tcp: '50001', ssl: '50002', but connectMain passes literal 50004, 'wss' and never reads either field. Either drop the unused fields from Peer or actually use peer.ssl/peer.tcp so the configuration is honored.

Also applies to: 77-80
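A sketch of honoring the configured ports (the Peer shape matches the fields named above; the helper and the wss/ws mapping are illustrative assumptions, since the client connects over WebSocket):

```typescript
interface Peer {
  host: string;
  tcp?: string;
  ssl?: string;
}

// Prefer the configured ssl port (secure WebSocket), fall back to tcp;
// return null when the peer carries neither so the caller can skip it.
function peerEndpoint(peer: Peer): { url: string; protocol: 'wss' | 'ws' } | null {
  if (peer.ssl) return { url: `wss://${peer.host}:${peer.ssl}`, protocol: 'wss' };
  if (peer.tcp) return { url: `ws://${peer.host}:${peer.tcp}`, protocol: 'ws' };
  return null;
}
```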

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts`
around lines 33 - 38, The hardcodedPeers array defines tcp and ssl ports but
connectMain (and the other call sites around the same area) currently ignore
them by passing literal 50004 and 'wss'; update the connection logic to read and
use the Peer fields (peer.tcp and peer.ssl) and the desired protocol value from
the Peer or connection options instead of hardcoded values, or remove tcp/ssl
from the Peer type if you intend to always use fixed ports; specifically modify
connectMain and the other connection invocation(s) near lines 77-80 to use
peer.ssl/peer.tcp (and peer.protocol if you add it) so the configured ports are
honored.
packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts-367-378 (1)

367-378: ⚠️ Potential issue | 🟡 Minor

Debug string "23333" leaked into error message.

throw new Error(`${result.message} 23333`);

Appears to be leftover instrumentation. Strip it.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts`
around lines 367 - 378, In getLatestSetId, remove the stray debug token "23333"
from the thrown error: when mainClient?.request('spark.getsparklatestcoinid',
[]) returns an Error (check in getLatestSetId), either rethrow that Error or
throw a new Error using only result.message (no extra debug string); update the
throw in getLatestSetId to not append "23333" and preserve the original error
information from result.
packages/extension/src/providers/bitcoin/libs/activity-handlers/providers/firo/index.ts-17-18 (1)

17-18: ⚠️ Potential issue | 🟡 Minor

Prefer checking HTTP status over stringly-typed message property and drop the any cast.

(txs as any).message depends on the Insight API's shape-on-error. If the node returns non-200 with JSON { error: ... }, or returns no txs field at all, the code proceeds to txs.txs.map and throws a confusing error that gets swallowed in .catch. Check res.ok first and validate the presence of txs.txs before mapping.
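The shape validation can be sketched as a pure function over the parsed body (names are illustrative; the real code would also check res.ok at the fetch call site):

```typescript
// Validate the parsed Insight-style response before mapping:
// anything other than { txs: [...] } on a successful response yields [].
function extractTxs(responseOk: boolean, body: unknown): unknown[] {
  if (!responseOk || typeof body !== 'object' || body === null) return [];
  const txs = (body as { txs?: unknown }).txs;
  return Array.isArray(txs) ? txs : [];
}
```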

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/providers/bitcoin/libs/activity-handlers/providers/firo/index.ts`
around lines 17 - 18, In the .then(async (txs: { txs: FiroTxType[] }) => { ...
}) callback replace the stringy error check and any-cast with proper HTTP and
shape validation: ensure the fetch response was ok (check res.ok at the fetch
response handling point) and that the parsed body contains a populated txs.txs
array before calling txs.txs.map; remove the (txs as any).message check and any
casts, and return [] early if the response is not ok or txs.txs is missing/not
an array to avoid throwing when mapping in the callback.
packages/extension/src/ui/action/db/indexedDB.ts-131-154 (1)

131-154: ⚠️ Potential issue | 🟡 Minor

Helpers throw raw errors when the underlying key is absent.

getLength, getBlockHashes, getSetHashes all call .reduce/.map on the result of readData, which returns undefined when the key was never written (first-run, post-reset). That throws a TypeError rather than returning a sensible 0/[]. Defensive null-check (const data = (await this.readData<any[]>(key)) ?? [];).
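A sketch of the pattern for one of the helpers (getLength), with the readData result modeled as a possibly-undefined array:

```typescript
interface StoredSet {
  coins: unknown[];
}

// Default an absent key (first-run, post-reset) to [] so reduce
// returns 0 instead of throwing a TypeError on undefined.
function getLength(sets: StoredSet[] | undefined): number {
  const data = sets ?? [];
  return data.reduce((sum, s) => sum + s.coins.length, 0);
}
```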

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/ui/action/db/indexedDB.ts` around lines 131 - 154,
getLength, getBlockHashes and getSetHashes can throw when readData returns
undefined; update each to defensively default the readData result to an empty
array (e.g. const data = (await this.readData<any[]>(keyOrDB_DATA_KEYS.sets)) ??
[]) before calling .reduce or .map so they return 0 or [] instead of raising a
TypeError; adjust getLength to sum over the defaulted array and
getBlockHashes/getSetHashes to map the defaulted array (referencing methods
getLength, getBlockHashes, getSetHashes and DB_DATA_KEYS.sets).
packages/extension/src/libs/utils/updateAndSync/updateTagsSet.ts-82-89 (1)

82-89: ⚠️ Potential issue | 🟡 Minor

onComplete fires after the first error, which may mislead callers.

fireCompleteOnce() is invoked from both the success and error branches, meaning an immediate network/DB failure on the very first run will trigger onComplete even though nothing has ever succeeded. If UI uses "complete" as a gate to flip away from a "syncing" indicator, users will see a green/completed state despite a stale DB. Consider firing complete only on the first successful run, and letting onError drive failure UX.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/libs/utils/updateAndSync/updateTagsSet.ts` around
lines 82 - 89, The current flow calls fireCompleteOnce() from both success and
error paths, causing onComplete to fire after the first failure; change this so
fireCompleteOnce() is invoked only when a successful run occurs (e.g., set or
check a "hasSucceeded" flag inside the try-success path and call
fireCompleteOnce() there) and remove the call from the catch block that handles
options?.onError?.(error); keep isRunning=false and scheduleNext() in finally so
loop behavior is unchanged. Ensure you update references in the same function
(fireCompleteOnce, options?.onError, isRunning, scheduleNext) so only the first
successful completion triggers onComplete.
packages/extension/src/providers/bitcoin/libs/firo-wallet/base-firo-wallet.ts-173-173 (1)

173-173: ⚠️ Potential issue | 🟡 Minor

Remove shipped debug log.

-    console.log(`🔹 Total Spendable UTXOs: ${allUtxos.length}`);

Same applies to similar emoji-prefixed logs elsewhere in the file.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/providers/bitcoin/libs/firo-wallet/base-firo-wallet.ts`
at line 173, Remove the shipped debug console.log from base-firo-wallet (the
line printing `🔹 Total Spendable UTXOs`) and any other emoji-prefixed
console.log statements in this file; locate occurrences in functions/methods
such as where `allUtxos` is processed and delete those console.log calls or
replace them with proper structured logging (e.g., use the configured logger at
debug/trace level) so no emoji debug prints remain in production code.
packages/extension/src/libs/utils/updateAndSync/updateCoinSet.ts-70-72 (1)

70-72: ⚠️ Potential issue | 🟡 Minor

Remove debug logging before release.

These console.log calls run on every sync cycle and leak set metadata to the console.

-      console.log('===>>>localCoinsCount', localCoinsCount);
-
-      console.log('===>>>Meta for set', meta, localCoinsCount);
-
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/libs/utils/updateAndSync/updateCoinSet.ts` around
lines 70 - 72, Remove the two debug console.log statements that print
localCoinsCount and meta inside updateCoinSet (the lines logging
'===>>>localCoinsCount' and '===>>>Meta for set'); either delete them or replace
them with a proper logger call at a debug/trace level (using the project's
logging facility) so sensitive set metadata is not printed to stdout on every
sync cycle, referencing the variables localCoinsCount and meta and the
updateCoinSet function to locate the statements.
packages/extension/src/providers/bitcoin/libs/firo-wallet/base-firo-wallet.ts-263-305 (1)

263-305: ⚠️ Potential issue | 🟡 Minor

index = index * 1 is not an int cast, and unknown nodes return undefined silently.

  • Line 264: index * 1 only coerces a string operand to Number; it does not truncate floats as the comment claims. If the intent is integer coercion, use Math.trunc(index) (or index | 0 for small ints). Otherwise drop the misleading line.
  • The function quietly returns undefined for any node value other than 0 or 1 (falls off the end). Callers use non-null assertions downstream, so a bad node manifests as a runtime error far from the source. Add an explicit throw for unsupported nodes.
-  async _getNodeAddressByIndex(node: number, index: number) {
-    index = index * 1; // cast to int
+  async _getNodeAddressByIndex(node: number, index: number) {
+    if (node !== EXTERNAL_INDEX && node !== INTERNAL_INDEX) {
+      throw new Error(`Unsupported node ${node}`);
+    }
+    index = Math.trunc(index);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/providers/bitcoin/libs/firo-wallet/base-firo-wallet.ts`
around lines 263 - 305, The _getNodeAddressByIndex function currently uses a
misleading "index = index * 1" cast and can silently return undefined for
unsupported node values; change the index handling to an explicit integer
conversion (e.g. index = Math.trunc(Number(index)) and validate it's finite and
>= 0) and add an early guard that throws a clear Error if node is not
EXTERNAL_INDEX or INTERNAL_INDEX; keep the existing logic using getXpub,
bip32.fromBase58, _node0/_node1, and the
external_addresses_cache/internal_addresses_cache but ensure the function always
either returns a cached/derived address or throws for invalid input instead of
falling off the end.
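The coercion difference and the explicit guard can be shown in a self-contained sketch. The `EXTERNAL_INDEX`/`INTERNAL_INDEX` values of 0 and 1 and the `normalizeNodeIndex` name are assumptions for illustration, not the wallet's actual constants:

```typescript
// Standalone sketch of the suggested fix: explicit truncation plus an
// early throw for unsupported nodes instead of silently returning undefined.
const EXTERNAL_INDEX = 0;
const INTERNAL_INDEX = 1;

function normalizeNodeIndex(node: number, index: number): number {
  if (node !== EXTERNAL_INDEX && node !== INTERNAL_INDEX) {
    throw new Error(`Unsupported node ${node}`);
  }
  // `index * 1` would leave 3.9 as 3.9; Math.trunc actually drops the fraction.
  const normalized = Math.trunc(Number(index));
  if (!Number.isFinite(normalized) || normalized < 0) {
    throw new Error(`Invalid index ${index}`);
  }
  return normalized;
}
```

Note that `3.9 * 1` evaluates to `3.9` while `Math.trunc(3.9)` evaluates to `3`, which is exactly why the `// cast to int` comment is misleading.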

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 78717869-9804-4756-865e-800a10d53529

📥 Commits

Reviewing files that changed from the base of the PR and between 1362fcf and c69d076.

⛔ Files ignored due to path filters (4)
  • packages/extension/src/libs/utils/wasmModule/spark.wasm is excluded by !**/*.wasm
  • packages/extension/src/libs/utils/wasmWorkerModule/spark.wasm is excluded by !**/*.wasm
  • packages/extension/src/providers/bitcoin/networks/icons/firo.svg is excluded by !**/*.svg
  • yarn.lock is excluded by !**/yarn.lock, !**/*.lock
📒 Files selected for processing (99)
  • .gitignore
  • README.md
  • package.json
  • packages/extension/.prettierignore
  • packages/extension/package.json
  • packages/extension/src/libs/background/index.ts
  • packages/extension/src/libs/keyring/keyring.ts
  • packages/extension/src/libs/keyring/public-keyring.ts
  • packages/extension/src/libs/spark-handler/constants.ts
  • packages/extension/src/libs/spark-handler/createTempFromSparkTx.ts
  • packages/extension/src/libs/spark-handler/createTempTx.ts
  • packages/extension/src/libs/spark-handler/generateSparkWallet.ts
  • packages/extension/src/libs/spark-handler/getFee.ts
  • packages/extension/src/libs/spark-handler/getMintTxData.ts
  • packages/extension/src/libs/spark-handler/getSerializedCoin.ts
  • packages/extension/src/libs/spark-handler/getSparkCoinInfo.ts
  • packages/extension/src/libs/spark-handler/getTotalMintedAmount.ts
  • packages/extension/src/libs/spark-handler/index.ts
  • packages/extension/src/libs/spark-handler/serializeMintContext.ts
  • packages/extension/src/libs/spark-handler/utils.ts
  • packages/extension/src/libs/utils/accounts.ts
  • packages/extension/src/libs/utils/initialize-wallet.ts
  • packages/extension/src/libs/utils/number-formatter.ts
  • packages/extension/src/libs/utils/updateAndSync/calculateCurrentSparkBalance.ts
  • packages/extension/src/libs/utils/updateAndSync/getSetsFromDb.ts
  • packages/extension/src/libs/utils/updateAndSync/handleCoinSetUpdates.ts
  • packages/extension/src/libs/utils/updateAndSync/markCoinsAsUsed.ts
  • packages/extension/src/libs/utils/updateAndSync/updateCoinSet.ts
  • packages/extension/src/libs/utils/updateAndSync/updateTagsSet.ts
  • packages/extension/src/libs/utils/wasm-loader.ts
  • packages/extension/src/libs/utils/wasm-worker-loader.ts
  • packages/extension/src/libs/utils/wasmModule/spark.js
  • packages/extension/src/libs/utils/wasmWorkerModule/spark.js
  • packages/extension/src/providers/bitcoin/libs/activity-handlers/index.ts
  • packages/extension/src/providers/bitcoin/libs/activity-handlers/providers/firo/index.ts
  • packages/extension/src/providers/bitcoin/libs/api-firo.ts
  • packages/extension/src/providers/bitcoin/libs/api.ts
  • packages/extension/src/providers/bitcoin/libs/electrum-client/abstract-electrum.ts
  • packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts
  • packages/extension/src/providers/bitcoin/libs/filter-ordinals.ts
  • packages/extension/src/providers/bitcoin/libs/firo-wallet/base-firo-wallet.ts
  • packages/extension/src/providers/bitcoin/libs/firo-wallet/configs.ts
  • packages/extension/src/providers/bitcoin/libs/firo-wallet/utils.ts
  • packages/extension/src/providers/bitcoin/libs/utils.ts
  • packages/extension/src/providers/bitcoin/networks/firo-testnet.ts
  • packages/extension/src/providers/bitcoin/networks/firo.ts
  • packages/extension/src/providers/bitcoin/networks/index.ts
  • packages/extension/src/providers/bitcoin/tests/bitcoin.address.derivation.test.ts
  • packages/extension/src/providers/bitcoin/types/bitcoin-network.ts
  • packages/extension/src/providers/bitcoin/types/index.ts
  • packages/extension/src/providers/bitcoin/ui/libs/signer.ts
  • packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue
  • packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue
  • packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue
  • packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue
  • packages/extension/src/providers/bitcoin/ui/send-transaction/verify-transaction/index.vue
  • packages/extension/src/providers/bitcoin/ui/types.ts
  • packages/extension/src/providers/ethereum/libs/transaction/types.ts
  • packages/extension/src/types/activity.ts
  • packages/extension/src/types/provider.ts
  • packages/extension/src/ui/action/App.vue
  • packages/extension/src/ui/action/components/accounts-header/components/header-accounts.vue
  • packages/extension/src/ui/action/components/accounts-header/index.vue
  • packages/extension/src/ui/action/components/app-menu/index.vue
  • packages/extension/src/ui/action/composables/async-computed.ts
  • packages/extension/src/ui/action/composables/get-spark-address.ts
  • packages/extension/src/ui/action/composables/synchronize-spark-state.ts
  • packages/extension/src/ui/action/composables/update-activity-state.ts
  • packages/extension/src/ui/action/db/indexedDB.ts
  • packages/extension/src/ui/action/main.ts
  • packages/extension/src/ui/action/router/index.ts
  • packages/extension/src/ui/action/types/account.ts
  • packages/extension/src/ui/action/utils/array-utils.ts
  • packages/extension/src/ui/action/utils/set-utils.ts
  • packages/extension/src/ui/action/views/accounts/components/accounts-list-item.vue
  • packages/extension/src/ui/action/views/accounts/index.vue
  • packages/extension/src/ui/action/views/deposit/index.vue
  • packages/extension/src/ui/action/views/lock-screen/components/lock-screen-forgot.vue
  • packages/extension/src/ui/action/views/lock-screen/index.vue
  • packages/extension/src/ui/action/views/modal-accounts/index.vue
  • packages/extension/src/ui/action/views/network-activity/components/network-activity-total.vue
  • packages/extension/src/ui/action/views/network-activity/components/network-activity-transaction.vue
  • packages/extension/src/ui/action/views/network-activity/components/synchronize-state.vue
  • packages/extension/src/ui/action/views/network-activity/index.vue
  • packages/extension/src/ui/action/views/network-assets/index.vue
  • packages/extension/src/ui/action/views/reset-wallet/index.vue
  • packages/extension/src/ui/action/views/send-process/components/send-process-account.vue
  • packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue
  • packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue
  • packages/extension/src/ui/action/workers/sparkCoinInfoWorker.ts
  • packages/extension/src/ui/onboard/App.vue
  • packages/extension/tsconfig.app.json
  • packages/extension/vite.config.ts
  • packages/keyring/package.json
  • packages/keyring/src/index.ts
  • packages/name-resolution/tests/resolver.test.ts
  • packages/name-resolution/tests/sid.test.ts
  • packages/name-resolution/tests/ud.test.ts
  • packages/types/src/networks.ts

Comment on lines 177 to 181
case InternalMethods.unlock:
const password = message?.params?.[0] as string;
const mnemonic = await this.#keyring.getSavedMnemonic(password);
this.#wallet.setSecret(mnemonic);
return unlock(this.#keyring, message);


⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

cat -n packages/extension/src/libs/background/index.ts | sed -n '170,190p'

Repository: enkryptcom/enKrypt

Length of output: 1133


🏁 Script executed:

# Search for getSavedMnemonic and related keyring methods
rg -A 5 "getSavedMnemonic|getMnemonic" packages/extension/src --type ts

Repository: enkryptcom/enKrypt

Length of output: 2257


🏁 Script executed:

# Find the unlock function implementation
rg -B 2 -A 10 "(?:export\s+)?(?:function|const)\s+unlock\s*\(" packages/extension/src --type ts

Repository: enkryptcom/enKrypt

Length of output: 44


🏁 Script executed:

# Check InternalMethods enum
rg -A 20 "enum InternalMethods|InternalMethods\s*=" packages/extension/src --type ts

Repository: enkryptcom/enKrypt

Length of output: 1663


🏁 Script executed:

# Look for other cases in the same switch statement to understand scope leaking
cat -n packages/extension/src/libs/background/index.ts | sed -n '160,210p'

Repository: enkryptcom/enKrypt

Length of output: 2110


🏁 Script executed:

# Find the unlock function implementation
rg -B 2 -A 15 "export.*unlock\s*\(|const unlock\s*=" packages/extension/src --type ts

Repository: enkryptcom/enKrypt

Length of output: 1694


🏁 Script executed:

# Check the MnemonicWithExtraWord type definition
rg -B 2 -A 5 "MnemonicWithExtraWord|interface.*Mnemonic" packages/extension/src --type ts

Repository: enkryptcom/enKrypt

Length of output: 1237


🏁 Script executed:

# Check getSavedMnemonic implementation in private keyring
rg -B 3 -A 10 "getSavedMnemonic\s*\(" packages/extension/src/libs/keyring --type ts

Repository: enkryptcom/enKrypt

Length of output: 1878


🏁 Script executed:

# Check for Firo-specific logic or wallet.setSecret usage
rg -B 3 -A 3 "setSecret|Firo" packages/extension/src/libs/background --type ts

Repository: enkryptcom/enKrypt

Length of output: 2443


🏁 Script executed:

# Check Biome configuration
cat biome.json 2>/dev/null || cat .biome.json 2>/dev/null || rg "noSwitchDeclarations" . 2>/dev/null

Repository: enkryptcom/enKrypt

Length of output: 1321


🏁 Script executed:

# Find the actual private keyring implementation
find packages/extension/src/libs/keyring -name "*.ts" -type f | head -20

Repository: enkryptcom/enKrypt

Length of output: 162


🏁 Script executed:

# Search for the BaseKeyRing or private keyring class
rg -l "class.*Keyring" packages/extension/src/libs/keyring --type ts

Repository: enkryptcom/enKrypt

Length of output: 44


🏁 Script executed:

# Get full unlock.ts to see error handling
cat -n packages/extension/src/libs/background/internal/unlock.ts

Repository: enkryptcom/enKrypt

Length of output: 1288


🏁 Script executed:

# Search for getSavedMnemonic and getMnemonic actual implementations
rg -B 3 -A 12 "getSavedMnemonic\s*\(" packages/extension/src/libs/keyring/base-keyring.ts --type ts

Repository: enkryptcom/enKrypt

Length of output: 153


🏁 Script executed:

# Get getMnemonic implementation
rg -B 3 -A 12 "getMnemonic\s*\(" packages/extension/src/libs/keyring/base-keyring.ts --type ts

Repository: enkryptcom/enKrypt

Length of output: 153


🏁 Script executed:

# Look inside keyring.ts to see the private keyring implementation
cat -n packages/extension/src/libs/keyring/keyring.ts | head -100

Repository: enkryptcom/enKrypt

Length of output: 3792


🏁 Script executed:

# Search for the actual keyring implementations
rg -l "class.*" packages/extension/src/libs/keyring --type ts

Repository: enkryptcom/enKrypt

Length of output: 162


🏁 Script executed:

# Search for getSavedMnemonic implementation that throws on decrypt failure
rg -B 5 -A 15 "getSavedMnemonic" packages/extension/src/libs/keyring/keyring.ts --type ts

Repository: enkryptcom/enKrypt

Length of output: 754


🏁 Script executed:

# Find where `#keyring` is instantiated in keyring.ts
rg -B 5 -A 5 "#keyring\s*=" packages/extension/src/libs/keyring/keyring.ts --type ts

Repository: enkryptcom/enKrypt

Length of output: 403


🏁 Script executed:

# List all files in the keyring directory
ls -la packages/extension/src/libs/keyring/

Repository: enkryptcom/enKrypt

Length of output: 304


🏁 Script executed:

# Search for other usages of getSavedMnemonic to understand its error behavior
rg -B 3 -A 3 "getSavedMnemonic" packages/extension/src --type ts | grep -A 5 -B 5 "try\|catch"

Repository: enkryptcom/enKrypt

Length of output: 44


🏁 Script executed:

# Check if there are other places where getSavedMnemonic is called with error handling
rg -B 5 -A 5 "getSavedMnemonic" packages/extension/src --type ts

Repository: enkryptcom/enKrypt

Length of output: 2859


🏁 Script executed:

# Verify the actual call flow - does the switch case need braces?
ast-grep --pattern 'case InternalMethods.unlock: { $$$}'

Repository: enkryptcom/enKrypt

Length of output: 44


Unlock flow has several correctness/security regressions.

  1. Breaks error handling on wrong password. getSavedMnemonic(password) throws on decrypt failure, so the exception occurs before unlock(...) is reached. The existing unlock handler's .catch() (which translates wrong-password errors into structured InternalOnMessageResponse with error set) is bypassed, and all callers receive a raw rejection instead of the documented response shape.

  2. Unconditionally decrypts and stores a raw mnemonic on every unlock, even for users who never touch Firo. This expands the extension's in-memory secret footprint and runs a decryption on every unlock just to support one network.

  3. Loses extraWord. getSavedMnemonic returns only the base mnemonic; getMnemonic is available and returns MnemonicWithExtraWord. Users who configured an extra word will have #wallet.setSecret(mnemonic) seeded inconsistently with the rest of the keyring's derivations.

  4. Switch-declaration scoping — flagged by Biome (noSwitchDeclarations): const password / const mnemonic leak into sibling cases.

Wrap the Firo setup in a try/catch (or only run it after unlock succeeds), gate it on the currently selected network being Firo, use getMnemonic and include the extra word, and wrap the case in a block.

🔧 Proposed fix sketch
-      case InternalMethods.unlock:
-        const password = message?.params?.[0] as string;
-        const mnemonic = await this.#keyring.getSavedMnemonic(password);
-        this.#wallet.setSecret(mnemonic);
-        return unlock(this.#keyring, message);
+      case InternalMethods.unlock: {
+        const response = await unlock(this.#keyring, message);
+        if (!response.error) {
+          try {
+            const password = message?.params?.[0] as string;
+            const { mnemonic, extraWord } =
+              await this.#keyring.getMnemonic(password);
+            this.#wallet.setSecret(
+              extraWord ? `${mnemonic} ${extraWord}` : mnemonic,
+            );
+          } catch (_) {
+            // non-fatal: Firo wallet will be unavailable until re-unlock
+          }
+        }
+        return response;
+      }
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
case InternalMethods.unlock:
const password = message?.params?.[0] as string;
const mnemonic = await this.#keyring.getSavedMnemonic(password);
this.#wallet.setSecret(mnemonic);
return unlock(this.#keyring, message);
case InternalMethods.unlock: {
const response = await unlock(this.#keyring, message);
if (!response.error) {
try {
const password = message?.params?.[0] as string;
const { mnemonic, extraWord } =
await this.#keyring.getMnemonic(password);
this.#wallet.setSecret(
extraWord ? `${mnemonic} ${extraWord}` : mnemonic,
);
} catch (_) {
// non-fatal: Firo wallet will be unavailable until re-unlock
}
}
return response;
}
🧰 Tools
🪛 Biome (2.4.11)

[error] 178-178: Other switch clauses can erroneously access this declaration.
Wrap the declaration in a block to restrict its access to the switch clause.

(lint/correctness/noSwitchDeclarations)


[error] 179-179: Other switch clauses can erroneously access this declaration.
Wrap the declaration in a block to restrict its access to the switch clause.

(lint/correctness/noSwitchDeclarations)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/libs/background/index.ts` around lines 177 - 181, The
unlock case currently calls getSavedMnemonic(password) and sets
`#wallet.setSecret` before calling unlock, which leaks raw mnemonic, loses
extraWord, bypasses unlock's error translation, and breaks switch-scoped consts;
fix by enclosing the case body in its own block, first call
unlock(this.#keyring, message) and await its success (so its .catch() handles
wrong-password responses), then only if the selected network is Firo retrieve
the full MnemonicWithExtraWord via this.#keyring.getMnemonic(password) (not
getSavedMnemonic), call `#wallet.setSecret` with the mnemonic that includes
extraWord, and wrap the Firo setup in try/catch to avoid throwing unhandled
exceptions back to callers.
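The Biome `noSwitchDeclarations` point in finding 4 can be illustrated with a minimal sketch (the `handle` function and its method names are hypothetical, not the extension's actual dispatcher):

```typescript
// Minimal illustration of the noSwitchDeclarations fix: braces make the
// declarations block-scoped to their own case instead of the whole switch.
function handle(method: string, params: string[]): string {
  switch (method) {
    case 'unlock': {
      const password = params[0]; // scoped to this case only
      return `unlocked:${password}`;
    }
    case 'lock': {
      // Without the braces above, `password` would be visible here (but
      // uninitialized until the other case runs), which is what Biome flags.
      return 'locked';
    }
    default:
      return 'unknown';
  }
}
```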

Comment on lines +103 to +319
export async function sendFromSparkAddress(
networkInfo: bitcoin.Network,
to: string,
amount: string,
): Promise<string | undefined> {
let spendKeyObj = 0;
let fullViewKeyObj = 0;
let incomingViewKeyObj = 0;
let senderAddressObj = 0;
let recipientAddressObj = 0;
let recipientsVector = 0;
let privateRecipientsVector = 0;
let coinsList = 0;
let coverSetDataMap = 0;
let idAndBlockHashesMap = 0;
let result = 0;
let deserializedCoinObj = 0;
let coverSetData = 0;
const diversifier = 1n;
const isTestNetwork = networkInfo.bech32 === 'tb';
const requestedAmount = getRequestedAmount(amount);
const db = new IndexedDBHelper();
const Module = await wasmInstance.getInstance();
const cleanup = () => {
if (spendKeyObj && spendKeyObj !== 0) {
Module.ccall('js_freeSpendKey', null, ['number'], [spendKeyObj]);
}
if (fullViewKeyObj && fullViewKeyObj !== 0) {
Module.ccall('js_freeFullViewKey', null, ['number'], [fullViewKeyObj]);
}
if (incomingViewKeyObj && incomingViewKeyObj !== 0) {
Module.ccall(
'js_freeIncomingViewKey',
null,
['number'],
[incomingViewKeyObj],
);
}
if (senderAddressObj && senderAddressObj !== 0) {
Module.ccall('js_freeAddress', null, ['number'], [senderAddressObj]);
}
if (recipientAddressObj && recipientAddressObj !== 0) {
Module.ccall('js_freeAddress', null, ['number'], [recipientAddressObj]);
}
if (recipientsVector) {
Module.ccall(
'js_freeSparkSpendRecipientsVector',
null,
['number'],
[recipientsVector],
);
}
if (coinsList) {
Module.ccall('js_freeSparkSpendCoinsList', null, ['number'], [coinsList]);
}
if (coverSetDataMap && coverSetDataMap !== 0) {
Module.ccall(
'js_freeCoverSetDataMapForCreateSparkSpendTransaction',
null,
['number'],
[coverSetDataMap],
);
}
if (privateRecipientsVector) {
Module.ccall(
'js_freeSparkSpendPrivateRecipientsVector',
null,
['number'],
[privateRecipientsVector],
);
}
if (idAndBlockHashesMap) {
Module.ccall(
'js_freeIdAndBlockHashesMap',
null,
['number'],
[idAndBlockHashesMap],
);
}
if (deserializedCoinObj) {
Module._free(deserializedCoinObj);
}
if (result && result !== 0) {
Module.ccall(
'js_freeCreateSparkSpendTxResult',
null,
['number'],
[result],
);
}
if (coverSetData && coverSetData !== 0) {
Module.ccall('js_freeCoverSetData', null, ['number'], [coverSetData]);
}
};

spendKeyObj = await getSpendKeyObj(Module);
const isSparkTransaction = await isSparkAddress(to);

const ownedCoins = ((await db.readData('myCoins')) || []) as OwnedCoinData[];

const usedCoinsTags = await db.readData<{ tags: string[] }>(
DB_DATA_KEYS.usedCoinsTags,
);
const coinsTagsSet = new Set(usedCoinsTags?.tags ?? []);
const myCoinsTagsSet = new Set((ownedCoins ?? []).map(coin => coin.tag));

const usedMyCoinsTagsSet = intersectSets(coinsTagsSet, myCoinsTagsSet);
const revalidatedCoins = ownedCoins.filter(ownedCoin => {
return !usedMyCoinsTagsSet.has(ownedCoin.tag);
});

const revalidatedUniqueCoins = Array.from(
removeDuplicates(new Set(revalidatedCoins)),
);

const spendCoinList: SpendCoin[] = [];

const keyObjects = await getIncomingViewKey(Module, spendKeyObj);

incomingViewKeyObj = keyObjects.incomingViewKeyObj;
fullViewKeyObj = keyObjects.fullViewKeyObj;

senderAddressObj = Module.ccall(
'js_getAddress',
'number',
['number', 'number'],
[incomingViewKeyObj, diversifier],
);

if (isSparkTransaction) {
recipientAddressObj = Module.ccall(
'js_decodeAddress',
'number',
['string', 'number'],
[to, Number(isTestNetwork)],
);

if (recipientAddressObj === 0) {
throw new Error('Failed to decode Spark recipient address');
}
}

// Create recipients vector for spend transaction
recipientsVector = Module.ccall(
'js_createRecipientsVectorForCreateSparkSpendTransaction',
'number',
['number'],
[1], // intended final size
);

if (!isSparkTransaction) {
Module.ccall(
'js_addRecipientForCreateSparkSpendTransaction',
null,
['number', 'number', 'number'],
[recipientsVector, BigInt(requestedAmount), 0],
);
}

privateRecipientsVector = Module.ccall(
'js_createPrivateRecipientsVectorForCreateSparkSpendTransaction',
'number',
['number'],
[1], // intended final size
);

coinsList = Module.ccall(
'js_createCoinsListForCreateSparkSpendTransaction',
'number',
[],
[],
);

const coinMetaPromiseList: Promise<void>[] = [];

revalidatedUniqueCoins.forEach(ownedCoin => {
const myCoinMetaData = getSparkCoinInfo({
coin: ownedCoin.coin,
fullViewKeyObj,
incomingViewKeyObj,
wasmModule: Module,
keepMemory: true,
})
.then(data => {
if (!data.isUsed) {
spendCoinList.push({
coin: ownedCoin.coin,
setId: ownedCoin.setId,
metadata: data.metaData,
value: Number(data.value),
});
}
})
.catch(err => {
console.error('Error getting spark coin info', err);
});
coinMetaPromiseList.push(myCoinMetaData);
});

await Promise.allSettled(coinMetaPromiseList);

const selectedSpendCoins = selectSpendCoins(spendCoinList, requestedAmount);

const selectedSpendAmount = selectedSpendCoins.reduce(
(total, coin) => total + coin.value,
0,
);
const privateRecipientAmount = isSparkTransaction
? requestedAmount
: selectedSpendAmount - requestedAmount;

if (!selectedSpendCoins.length || selectedSpendAmount < requestedAmount) {
cleanup();
throw new Error(
`Not enough Spark balance to cover transaction amount. Requested ${requestedAmount}, selected ${selectedSpendAmount}.`,
);
}

⚠️ Potential issue | 🔴 Critical

No try/finally around WASM allocations — any throw before the explicit cleanup() leaks everything.

cleanup() is only invoked in three places: inside the insufficient-balance branch, inside the !result branch, and at the happy-path exit (Line 601). Any synchronous throw from the many Module.ccall invocations above (e.g. a failed js_decodeAddress for a malformed to, a null/invalid coin, a bad setId in getSetById, base64ToReversedHex on bad data at Line 483, etc.) will bubble up to the caller with every already-allocated WASM handle leaked — and because this path runs on user intent, users can retry several times before a page reload reclaims memory.

Wrap the body in try { ... } finally { cleanup(); } and let the throw propagate naturally; remove the manual cleanup() calls before throws.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/libs/spark-handler/index.ts` around lines 103 - 319,
The function sendFromSparkAddress allocates many WASM handles (spendKeyObj,
fullViewKeyObj, incomingViewKeyObj, senderAddressObj, recipientsVector,
coinsList, etc.) but does not guarantee cleanup on synchronous throws; wrap the
main allocation/logic body of sendFromSparkAddress in a try { ... } finally {
cleanup(); } block so cleanup() always runs, remove the ad-hoc cleanup() calls
placed immediately before throws (e.g., the insufficient-balance branch and the
!result branch), and keep throws (like from Module.ccall('js_decodeAddress'),
getSparkCoinInfo, selectSpendCoins, base64ToReversedHex, etc.) propagating so
memory is not leaked even on errors.
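The guaranteed-cleanup shape the prompt describes can be sketched generically. The `withWasmCleanup` tracker helper below is hypothetical, standing in for the real function body with its many `Module.ccall` free calls:

```typescript
// Generic sketch of the try/finally shape: every allocation registers its
// free callback, and `finally` releases them all even when a later step
// throws. Throws still propagate to the caller unchanged.
function withWasmCleanup<T>(body: (track: (free: () => void) => void) => T): T {
  const frees: Array<() => void> = [];
  try {
    return body(free => frees.push(free));
  } finally {
    // Release in reverse allocation order; individual failures are best-effort.
    for (const free of frees.reverse()) {
      try { free(); } catch { /* best-effort release */ }
    }
  }
}
```

An exception mid-body (say, a failed address decode) still runs every registered free before the error reaches the caller, so retries no longer accumulate leaked handles.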

Comment on lines +119 to +120
let deserializedCoinObj = 0;
let coverSetData = 0;

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Confirm deallocation functions exposed by the WASM glue for coins/cover-set data.
rg -nP 'js_(freeCoin|freeDeserializedCoin|freeCoverSetData|deserializeCoin)\b'

Repository: enkryptcom/enKrypt

Length of output: 1405


🏁 Script executed:

sed -n '119,121p' packages/extension/src/libs/spark-handler/index.ts

Repository: enkryptcom/enKrypt

Length of output: 142


🏁 Script executed:

sed -n '180,210p' packages/extension/src/libs/spark-handler/index.ts

Repository: enkryptcom/enKrypt

Length of output: 1006


🏁 Script executed:

sed -n '389,425p' packages/extension/src/libs/spark-handler/index.ts

Repository: enkryptcom/enKrypt

Length of output: 1165


🏁 Script executed:

sed -n '430,465p' packages/extension/src/libs/spark-handler/index.ts

Repository: enkryptcom/enKrypt

Length of output: 1145


🏁 Script executed:

cat -n packages/extension/src/libs/spark-handler/getSparkCoinInfo.ts | sed -n '140,155p'

Repository: enkryptcom/enKrypt

Length of output: 647


Critical WASM memory leaks in loops — each iteration's allocations except the last are never freed.

Two outer-scope let variables (deserializedCoinObj at Line 119 and coverSetData at Line 120) are reassigned inside loops without freeing prior values. Only the final assignment persists to cleanup(), leaving N−1 allocations leaked per function call:

  • deserializedCoinObj = Module.ccall('js_deserializeCoin', ...) (Line 405) runs once per coin in the forEach loop (Lines 389–421). All but the last allocation leak.
  • coverSetData = Module.ccall('js_createCoverSetData', ...) (Line 436) runs once per set in the for loop (Lines 430–460). All but the last allocation leak.

Three temporary _malloc'd pointers are never freed:

  • serializedCoinPointer (Line 394)
  • serialContextPointer (Line 401)
  • coverSetRepresentationPointer (Line 431)

These should be freed immediately after their WASM calls complete (as shown in getSparkCoinInfo.ts lines 144–145), not deferred to cleanup.

Additionally, there is no try/finally wrapping the loops. Any exception during iteration leaves WASM state permanently leaked.

Use js_freeCoin (not Module._free) as the deallocator for deserialized coins, matching the pattern in getSparkCoinInfo.ts:147.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/libs/spark-handler/index.ts` around lines 119 - 120,
The loops leak WASM allocations because outer-scope vars deserializedCoinObj and
coverSetData are overwritten without freeing previous values and three malloc'd
pointers (serializedCoinPointer, serialContextPointer,
coverSetRepresentationPointer) are never freed; fix by freeing each pointer
immediately after the corresponding Module.ccall returns
(Module._free(serializedCoinPointer), Module._free(serialContextPointer),
Module._free(coverSetRepresentationPointer)), free the previous
deserializedCoinObj before reassigning using js_freeCoin(deserializedCoinObj)
(not Module._free), free coverSetData with the appropriate deallocator before
reassigning, and wrap the affected loops (the forEach that calls
js_deserializeCoin and the for loop that calls js_createCoverSetData) in
try/finally so any thrown exception still runs cleanup for the most recent
allocations and avoids leaks.
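The free-before-reassign discipline inside a loop can be demonstrated with plain numbers standing in for WASM handles (`freeCoin` here is a stand-in for the real `js_freeCoin` deallocator, and `0` is the null handle):

```typescript
// Sketch: release the previous iteration's handle before reassigning the
// tracking variable, and release the final handle in `finally`.
let handle = 0;
const freed: number[] = [];
const freeCoin = (h: number) => { if (h !== 0) freed.push(h); };

const allocations = [11, 22, 33]; // pretend each is a js_deserializeCoin result
try {
  for (const next of allocations) {
    freeCoin(handle); // free the previous allocation first
    handle = next;    // only then overwrite the tracking variable
  }
} finally {
  freeCoin(handle);   // final handle released on normal exit or throw
}
```

All three stand-in handles end up freed; without the in-loop free, only the last one would survive to cleanup and the first N−1 would leak.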

Comment on lines +41 to +88
const unusedCoins = updatedCoins.filter(el => !el.isUsed);

const mintIds = unusedCoins.map(unusedCoin => unusedCoin.coin[1]);

const txIdsDecoded = mintIds
.map(base64 => {
try {
return base64ToReversedHex(base64);
} catch {
return '';
}
})
.filter(Boolean);

console.log('===>>>Unused Spark Coins TX IDs:', txIdsDecoded);

if (!!txIdsDecoded?.length) {
const results = await firoElectrum.multiGetTransactionByTxid(txIdsDecoded);

unusedTxDetails = txIdsDecoded
.map((txId, index) => {
const tx = results[txId];

if (!tx) {
console.warn(`Transaction details not found for txId: ${txId}`);
return null;
}

const coin = unusedCoins[index];
const coinValue = coin?.value;

if (coinValue === undefined) {
console.warn(
`Coin value not found for txId: ${txId}, index: ${index}`,
);

return {
...tx,
value: '0',
};
}

return {
...tx,
value: Number(coinValue),
};
})
.filter((tx): tx is SparkUnusedTxDetails => tx !== null);

⚠️ Potential issue | 🔴 Critical

Index misalignment after .filter(Boolean) causes wrong coin values to be attached to transactions.

txIdsDecoded is produced by mapping over mintIds (parallel to unusedCoins) and then .filter(Boolean) is applied. When any base64 decode fails and returns '', the filter drops entries and re-packs the array, so the index passed into the subsequent .map((txId, index) => ...) no longer corresponds to unusedCoins[index]. As a result, coinValue (and therefore the reported balance/value attached to each returned tx) gets misattributed — silently, because the try/catch swallows decode errors.

Also, the returned object in the coinValue === undefined branch sets value: '0' (string), but SparkUnusedTxDetails declares value: number, so consumers receive a type-violating field.

🐛 Proposed fix – keep coin alignment by filtering and mapping in one pass
-    const mintIds = unusedCoins.map(unusedCoin => unusedCoin.coin[1]);
-
-    const txIdsDecoded = mintIds
-      .map(base64 => {
-        try {
-          return base64ToReversedHex(base64);
-        } catch {
-          return '';
-        }
-      })
-      .filter(Boolean);
-
-    console.log('===>>>Unused Spark Coins TX IDs:', txIdsDecoded);
-
-    if (!!txIdsDecoded?.length) {
-      const results = await firoElectrum.multiGetTransactionByTxid(txIdsDecoded);
-
-      unusedTxDetails = txIdsDecoded
-        .map((txId, index) => {
-          const tx = results[txId];
-
-          if (!tx) {
-            console.warn(`Transaction details not found for txId: ${txId}`);
-            return null;
-          }
-
-          const coin = unusedCoins[index];
-          const coinValue = coin?.value;
-
-          if (coinValue === undefined) {
-            console.warn(
-              `Coin value not found for txId: ${txId}, index: ${index}`,
-            );
-
-            return {
-              ...tx,
-              value: '0',
-            };
-          }
-
-          return {
-            ...tx,
-            value: Number(coinValue),
-          };
-        })
-        .filter((tx): tx is SparkUnusedTxDetails => tx !== null);
-    }
+    const decoded = unusedCoins
+      .map(coin => {
+        try {
+          return { coin, txId: base64ToReversedHex(coin.coin[1]) };
+        } catch {
+          return null;
+        }
+      })
+      .filter((x): x is { coin: typeof unusedCoins[number]; txId: string } => !!x && !!x.txId);
+
+    if (decoded.length) {
+      const txIds = decoded.map(d => d.txId);
+      const results = await firoElectrum.multiGetTransactionByTxid(txIds);
+
+      unusedTxDetails = decoded
+        .map(({ coin, txId }) => {
+          const tx = results[txId];
+          if (!tx) return null;
+          return { ...tx, value: Number(coin.value ?? 0) };
+        })
+        .filter((tx): tx is SparkUnusedTxDetails => tx !== null);
+    }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/libs/utils/updateAndSync/markCoinsAsUsed.ts` around
lines 41 - 88, The txIdsDecoded filtering breaks alignment with unusedCoins
causing wrong coin values to be attached; change the pipeline to produce a
single filtered array of pairs (e.g., {txId, coin}) by decoding each mintId
alongside its corresponding unusedCoins entry (decode each base64 with
base64ToReversedHex, skip and optionally log failures), pass the list of txIds
from that paired array into firoElectrum.multiGetTransactionByTxid, and then map
results back using the paired coin so the right coin.value is used; also ensure
the value field on the returned SparkUnusedTxDetails is a number (use 0, not the
string '0') and remove the separate mintIds/txIdsDecoded filtering that causes
index misalignment.

Comment on lines +146 to +163
      const updatedCoinsSet = differenceSets(
        new Set(localSets?.[index]?.coins ?? []),
        new Set(newCoins),
      );

      localSets[index] = {
        blockHash: remoteMeta.blockHash,
        setHash: remoteMeta.setHash,
        coins: [...updatedCoinsSet, ...(localSets?.[index]?.coins ?? [])],
      };

      if (!localSets[index] || isFullReplacement) {
        await db.saveData(DB_DATA_KEYS.sets, localSets);
      } else {
        // await db.appendSetData(DB_DATA_KEYS.sets, index, {
        //   ...localSets[index],
        // });
      }

⚠️ Potential issue | 🔴 Critical

Two defects in the incremental update path: broken dedup and dropped persistence.

  1. string[][] items put into a Set are compared by reference, not value, so differenceSets(new Set(localCoins), new Set(newCoins)) will never consider any array "equal" and will always yield all of newCoins. Combined with line 154 prepending them to the existing coins, duplicates will accumulate on every sync.
  2. On line 157, localSets[index] has just been assigned on line 151, so !localSets[index] is always falsy. That means only isFullReplacement === true ever persists to IndexedDB — the entire incremental branch is a no-op (the actual persist call is commented out on lines 160‑162). Incremental updates are effectively lost across sessions.
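The reference-identity pitfall can be reproduced in isolation (a standalone sketch; the coin tuples here are placeholders):

```typescript
// Two structurally identical coin tuples are distinct objects.
const stored: string[] = ['coinHashA', 'txidA'];
const incoming: string[] = ['coinHashA', 'txidA'];

// Set<string[]> compares elements by reference, so the duplicate is missed.
const byReference = new Set<string[]>([stored]);
const refDedup = byReference.has(incoming); // false — always treated as new

// Keying on the canonical identifier (coin[0]) dedups correctly.
const byId = new Set<string>([stored[0]]);
const idDedup = byId.has(incoming[0]); // true — recognized as existing
```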
🛠️ Suggested direction

Dedup by the canonical coin identifier (e.g. coin[0]) using a Set<string>, and unconditionally persist:

-      const updatedCoinsSet = differenceSets(
-        new Set(localSets?.[index]?.coins ?? []),
-        new Set(newCoins),
-      );
-
-      localSets[index] = {
-        blockHash: remoteMeta.blockHash,
-        setHash: remoteMeta.setHash,
-        coins: [...updatedCoinsSet, ...(localSets?.[index]?.coins ?? [])],
-      };
-
-      if (!localSets[index] || isFullReplacement) {
-        await db.saveData(DB_DATA_KEYS.sets, localSets);
-      } else {
-        // await db.appendSetData(DB_DATA_KEYS.sets, index, {
-        //   ...localSets[index],
-        // });
-      }
+      const existingCoins = localSets?.[index]?.coins ?? [];
+      const existingHashes = new Set(existingCoins.map(c => c?.[0]));
+      const trulyNew = isFullReplacement
+        ? newCoins
+        : newCoins.filter(c => !existingHashes.has(c?.[0]));
+
+      localSets[index] = {
+        blockHash: remoteMeta.blockHash,
+        setHash: remoteMeta.setHash,
+        coins: isFullReplacement ? newCoins : [...trulyNew, ...existingCoins],
+      };

Persist once after the Promise.all finishes instead of inside each task (see separate comment on the race).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/libs/utils/updateAndSync/updateCoinSet.ts` around
lines 146 - 163, The incremental-update path is broken: differenceSets(new
Set(localCoins), new Set(newCoins)) compares array references so it never dedups
and duplicates accumulate, and the persistence branch is effectively skipped
because localSets[index] is always truthy and appendSetData is commented out.
Fix by computing deduped new entries using a Set of canonical coin IDs (e.g.
coin[0]) instead of Set<string[]>—for example build a Set<string> of existing
coin IDs from localSets[index].coins, filter newCoins to only those whose
coin[0] is not present, then merge (prepend or append per desired order) without
duplicates into localSets[index].coins; remove or uncomment the appendSetData
code and ensure you call db.saveData(DB_DATA_KEYS.sets, localSets)
unconditionally (or persist once after the surrounding Promise.all) so
incremental updates are actually written. Ensure you update references in the
code around updatedCoinsSet, differenceSets, newCoins, localSets,
isFullReplacement, appendSetData and db.saveData accordingly.

Comment on lines +170 to +235
  try {
    let signedTx;
    if (network.value.name === NetworkNames.Firo) {
      signedTx = await FiroTransactionSigner({
        account: account.value!,
        network: network.value as BitcoinNetwork,
        payload: JSON.parse(txData.TxInfo),
      });
    } else {
      signedTx = await TransactionSigner({
        account: account.value!,
        network: network.value as BitcoinNetwork,
        payload: JSON.parse(txData.TxInfo),
      });
    }

    api
      .broadcastTx(signedTx.toHex())
      .then(() => {
        trackSendEvents(SendEventType.SendComplete, {
          network: network.value.name,
        });
        activityState.addActivities(
          [
            {
              ...txActivity,
              ...{ transactionHash: signedTx.getId() },
            },
          ],
          {
            address: network.value.displayAddress(txData.fromAddress),
            network: network.value.name,
          },
        );
        isSendDone.value = true;
        if (getCurrentContext() === 'popup') {
          setTimeout(() => {
            isProcessing.value = false;
            callToggleRate();
            router.go(-2);
          }, 4500);
        } else {
          setTimeout(() => {
            isProcessing.value = false;
            callToggleRate();
            window.close();
          }, 1500);
        }
      })
      .catch(error => {
        trackSendEvents(SendEventType.SendFailed, {
          network: network.value.name,
          error: error.message,
        });
        txActivity.status = ActivityStatus.failed;
        activityState.addActivities([txActivity], {
          address: txData.fromAddress,
          network: network.value.name,
        });
        console.error('ERROR', error);
      });
  } catch (error) {
    isProcessing.value = false;
    console.error('error', error);
    errorMsg.value = JSON.stringify(error);
  }

⚠️ Potential issue | 🔴 Critical

Broadcast failure leaves the UI stuck in the "processing" state.

The .catch handler on api.broadcastTx(signedTx.toHex()) (lines 219-230) records the failed activity and logs, but never sets isProcessing.value = false. Because api.broadcastTx(...) is not awaited inside the outer try block, a rejection there does not propagate to the outer catch at line 231 either. Net effect: when broadcast fails, <send-process v-if="isProcessing" ...> remains mounted indefinitely with no "done" signal and no path back — the user has to close the window.

Two fixes:

  1. Reset isProcessing (and surface errorMsg) inside the broadcast .catch, or await the broadcast so the outer catch handles it uniformly.
  2. Line 234: JSON.stringify(error) returns "{}" for native Error instances because message/stack are non-enumerable. Use error?.message ?? String(error) so the user sees something actionable.
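Both points are easy to verify in isolation:

```typescript
// Error's message and stack are non-enumerable own properties,
// so JSON.stringify serializes a plain Error to an empty object.
const error = new Error('broadcast rejected');
const serialized = JSON.stringify(error); // '{}'

// Reading message (with a String fallback) preserves the detail.
const readable = (error as Error | undefined)?.message ?? String(error);
// readable === 'broadcast rejected'
```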
Proposed fix
-    api
-      .broadcastTx(signedTx.toHex())
-      .then(() => {
+    try {
+      await api.broadcastTx(signedTx.toHex());
         trackSendEvents(SendEventType.SendComplete, {
           network: network.value.name,
         });
         activityState.addActivities(
-          [
-            {
-              ...txActivity,
-              ...{ transactionHash: signedTx.getId() },
-            },
-          ],
+          [{ ...txActivity, transactionHash: signedTx.getId() }],
           {
             address: network.value.displayAddress(txData.fromAddress),
             network: network.value.name,
           },
         );
         isSendDone.value = true;
-        if (getCurrentContext() === 'popup') {
-          setTimeout(() => {
-            isProcessing.value = false;
-            callToggleRate();
-            router.go(-2);
-          }, 4500);
-        } else {
-          setTimeout(() => {
-            isProcessing.value = false;
-            callToggleRate();
-            window.close();
-          }, 1500);
-        }
-      })
-      .catch(error => {
+        // ...existing timeout logic...
+    } catch (error: any) {
         trackSendEvents(SendEventType.SendFailed, {
           network: network.value.name,
-          error: error.message,
+          error: error?.message,
         });
         txActivity.status = ActivityStatus.failed;
         activityState.addActivities([txActivity], {
           address: txData.fromAddress,
           network: network.value.name,
         });
-        console.error('ERROR', error);
-      });
-  } catch (error) {
-    isProcessing.value = false;
-    console.error('error', error);
-    errorMsg.value = JSON.stringify(error);
+        console.error('ERROR', error);
+        isProcessing.value = false;
+        errorMsg.value = error?.message ?? String(error);
+      }
+  } catch (error: any) {
+    isProcessing.value = false;
+    console.error('error', error);
+    errorMsg.value = error?.message ?? String(error);
   }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/providers/bitcoin/ui/send-transaction/verify-transaction/index.vue`
around lines 170 - 235, The broadcast rejection handler never clears the
processing UI and the outer catch won't catch it because api.broadcastTx(...)
isn't awaited; update the code around api.broadcastTx(signedTx.toHex()) so that
either the broadcast is awaited (await api.broadcastTx(...)) letting the outer
try/catch handle failures, or inside the .catch guarantee you set
isProcessing.value = false and populate errorMsg.value (use error?.message ??
String(error) instead of JSON.stringify(error)) and preserve existing activity
tracking and trackSendEvents; reference TransactionSigner/FiroTransactionSigner,
api.broadcastTx, isProcessing, errorMsg, and txActivity when making the change.

Comment on lines 543 to +552
const onSelectedAddressChanged = async (newAccount: EnkryptAccount) => {
const defaultAddress = await getSparkAddress(
currentNetwork,
newAccount.address,
);

if (defaultAddress && accountHeaderData.value.sparkAccount) {
accountHeaderData.value.sparkAccount['defaultAddress'] = defaultAddress;
}


⚠️ Potential issue | 🔴 Critical

getSparkAddress is being passed a Ref, not the unwrapped network.

currentNetwork here is a ref<BaseNetwork>, but getSparkAddress is called elsewhere in this file (getSparkAccountState, setNetwork) with the unwrapped network value. Passing the ref object will almost certainly break the function (it expects network.name, network.displayAddress(...) etc. on the network itself, not on a Ref). This likely prevents defaultAddress from being resolved on address change for the Firo account.
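A plain-object stand-in for Vue's Ref shows the failure mode (`RefLike` and `BaseNetworkLike` are illustrative shapes, not the real types):

```typescript
interface BaseNetworkLike {
  name: string;
  displayAddress(addr: string): string;
}
interface RefLike<T> {
  value: T;
}

const currentNetwork: RefLike<BaseNetworkLike> = {
  value: { name: 'firo', displayAddress: addr => addr },
};

// Accessing network fields on the Ref itself yields undefined…
const onRef = (currentNetwork as unknown as BaseNetworkLike).name; // undefined
// …while the unwrapped value behaves as getSparkAddress expects.
const unwrapped = currentNetwork.value.name; // 'firo'
```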

🐛 Proposed fix
 const onSelectedAddressChanged = async (newAccount: EnkryptAccount) => {
   const defaultAddress = await getSparkAddress(
-    currentNetwork,
+    currentNetwork.value,
     newAccount.address,
   );
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
 const onSelectedAddressChanged = async (newAccount: EnkryptAccount) => {
   const defaultAddress = await getSparkAddress(
-    currentNetwork,
+    currentNetwork.value,
     newAccount.address,
   );
   if (defaultAddress && accountHeaderData.value.sparkAccount) {
     accountHeaderData.value.sparkAccount['defaultAddress'] = defaultAddress;
   }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/extension/src/ui/action/App.vue` around lines 543 - 552,
onSelectedAddressChanged passes the Ref currentNetwork into getSparkAddress
causing it to receive a Ref instead of a BaseNetwork; change the call in
onSelectedAddressChanged to pass the unwrapped network (e.g.
currentNetwork.value or use unref(currentNetwork)) so getSparkAddress can access
network.name and displayAddress correctly; update any nearby calls in
onSelectedAddressChanged that expect the resolved defaultAddress to ensure
accountHeaderData.value.sparkAccount['defaultAddress'] is set using the result.

@@ -1,62 +1,113 @@
packages/extension/src/ui/action/views/network-activity/components/network-activity-total.vue

⚠️ Potential issue | 🔴 Critical

Remove the stray filename from the SFC body.

If this line is present in the actual .vue file, the component will fail to compile before <template>.

Proposed fix
-packages/extension/src/ui/action/views/network-activity/components/network-activity-total.vue
-
 <template>
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
-packages/extension/src/ui/action/views/network-activity/components/network-activity-total.vue
 <template>
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/ui/action/views/network-activity/components/network-activity-total.vue`
at line 1, There is a stray filename string sitting before the component's
<template> in network-activity-total.vue causing the SFC to fail compile; open
network-activity-total.vue, remove the stray filename line that appears above
the <template> tag (or any text outside the <template>, <script>, <style>
blocks), then save so the file begins with a valid SFC block (e.g., <template>)
and the component (network-activity-total) will compile.

Comment on lines +113 to +116
const selectedNetwork: string = route.query.id as string;
const txData: VerifyTransactionParams = JSON.parse(
  Buffer.from(route.query.txData as string, 'base64').toString('utf8'),
);

⚠️ Potential issue | 🔴 Critical

Unhandled JSON.parse on route.query.txData — a malformed URL crashes the view at setup time.

Buffer.from(route.query.txData as string, 'base64').toString('utf8') followed by JSON.parse is executed at script-setup time with no try/catch. If txData is missing, not valid base64, or not valid JSON (e.g. user hits refresh, state lost, or deep link is tampered with), the component throws before mount and the user is stuck on a broken view with no recourse. Wrap it and fail gracefully (navigate back / show error).
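One way to guard the decode, sketched with a hypothetical helper (`parseTxData` is illustrative; returning null signals the caller to navigate back or show an error state):

```typescript
// Hypothetical guard around the base64 + JSON decode of route.query.txData.
function parseTxData(raw: unknown): Record<string, unknown> | null {
  if (typeof raw !== 'string' || raw.length === 0) return null;
  try {
    const json = Buffer.from(raw, 'base64').toString('utf8');
    return JSON.parse(json);
  } catch {
    return null; // invalid base64 payload or invalid JSON
  }
}

const valid = Buffer.from(JSON.stringify({ amount: '1.0' })).toString('base64');
const parsed = parseTxData(valid);      // { amount: '1.0' }
const garbage = parseTxData('%%%');     // null — decodes to '', JSON.parse throws
const missing = parseTxData(undefined); // null — query param absent
```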

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue`
around lines 113 - 116, Wrap the base64 decode + JSON.parse of
route.query.txData (where txData: VerifyTransactionParams is currently created
alongside selectedNetwork) in a try/catch and handle failures gracefully;
specifically, validate that route.query.txData exists, attempt Buffer.from(...,
'base64').toString('utf8') and JSON.parse inside the try block, and on any error
(missing, invalid base64, or invalid JSON) either programmatically navigate back
or set a component-level error state to show a user-friendly message instead of
letting the component throw during setup; update any downstream logic that
relies on txData (the code using txData / VerifyTransactionParams) to handle the
error state.

"@polkadot/util": "^14.0.3",
"assert": "^2.1.0",
"bip39": "^3.1.0"
"bip32": "2.0.6"

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Description: Check for remaining bip39 imports and inspect bip32 usage/import style.
rg -n -C4 "from ['\"]bip39['\"]|require\\(['\"]bip39['\"]\\)|from ['\"]bip32['\"]|require\\(['\"]bip32['\"]\\)|BIP32Factory|fromSeed" --glob '!node_modules/**'

fd -i '^package\.json$' --exec sh -c '
  file="$1"
  printf "\n--- %s ---\n" "$file"
  jq -r '"'"'.name as $name | "name=\($name // "<root>") deps.bip39=\(.dependencies.bip39 // "-") deps.bip32=\(.dependencies.bip32 // "-")"'"'"' "$file"
' sh {}

Repository: enkryptcom/enKrypt

Length of output: 13147


Add bip39 back to @enkryptcom/keyring dependencies.

The keyring package still imports bip39 functions at line 16 (entropyToMnemonic, generateMnemonic, mnemonicToEntropy) but the dependency was removed from package.json. This will fail under Yarn PnP because the import cannot be resolved. The bip32@2.0.6 usage (namespace import with fromSeed() and derivePath()) is compatible.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/keyring/package.json` at line 35, Restore the missing bip39
dependency in package.json so imports of entropyToMnemonic, generateMnemonic,
and mnemonicToEntropy resolve under Yarn PnP; add an appropriate bip39 entry
(e.g., "bip39": "3.0.4" or the project’s pinned version) to the dependencies and
run yarn install to ensure the package is available for the code that imports
those functions.
