Add firo support to enkrypt #797
Conversation
…ture/add-firo-network
# Conflicts:
#   packages/extension/src/libs/utils/wasmModule/spark.js
Add TX activity history after sending from Spark address
…ed-spark Disable activity when a spark-to-spark transaction fails or is dropped
Revert "Fix get balance"
Fix activity state and add reset DB on reset
1. Accounts list shows spark addresses
2. Fix state init
3. In QR now shows Firo:{address}
In `getSparkCoinInfo`, the WASM buffer for the serial context was allocated with `serializedCoin.length` and then populated with `serialContext` via `HEAPU8.set(serialContext, ...)`. When `serialContext.length` exceeds `serializedCoin.length`, this writes past the allocated region, corrupting WASM heap memory and producing non-deterministic failures or undefined behavior in `js_deserializeCoin`. Use `serialContext.length`, matching the pattern already used in `libs/spark-handler/index.ts`.

Co-authored-by: Reuben Yap <reuben@firo.org>
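The corruption is easy to reproduce outside WASM. Below is a small illustrative sketch (not the extension's actual code): a `Uint8Array` stands in for the Emscripten `HEAPU8`, and all pointers and sizes are made up for the demonstration.

```typescript
// Simulated WASM heap; offsets and sizes here are illustrative only.
const HEAPU8 = new Uint8Array(32);

// The region was sized from serializedCoin.length (8 bytes)...
const serializedCoin = new Uint8Array(8);
const bufferPtr = 0;
const neighborPtr = bufferPtr + serializedCoin.length; // next allocation

// ...but the copy is driven by serialContext, which is longer (10 bytes).
const serialContext = Uint8Array.from([1, 2, 3, 4, 5, 6, 7, 8, 9, 10]);

// The buggy pattern: HEAPU8.set writes serialContext.length bytes into a
// serializedCoin.length-sized region, spilling into the neighbor.
HEAPU8.set(serialContext, bufferPtr);

// The bytes at neighborPtr and neighborPtr + 1 now hold the spilled 9 and 10.
const neighborCorrupted = HEAPU8[neighborPtr] !== 0;
```

In the real module `HEAPU8` spans the entire heap, so no `RangeError` fires; the extra bytes silently land in whatever was allocated next, which is why the failures are non-deterministic. Allocating with `serialContext.length` makes the region exactly as large as the copy.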
Feature/add firo network
Walkthrough

This PR adds comprehensive support for the Firo blockchain network and its Spark privacy protocol. It introduces WASM-backed Spark wallet functionality, cryptographic operations, privacy-focused transaction handling, anonymity set synchronization, and associated UI components for transparent and private transaction sending.
Sequence Diagrams

```mermaid
sequenceDiagram
    participant User as User/UI
    participant App as App Component
    participant Sync as Coin Sync Service
    participant IDB as IndexedDB
    participant Electrum as Electrum Client
    participant WASM as WASM Module
    participant Worker as Coin Info Worker
    User->>App: Network initialized (Firo)
    App->>Sync: startCoinSetSync()
    Sync->>Electrum: Fetch remote anonymity sets
    Electrum-->>Sync: Set metadata
    Sync->>IDB: Load local anonymity sets
    IDB-->>Sync: Stored sets
    Sync->>Electrum: Fetch anonymity set sectors
    Electrum-->>Sync: Coin data
    Sync->>IDB: Update sets with new coins
    Sync->>Worker: Process coin info (via worker)
    Worker->>WASM: Load module + view keys
    Worker->>WASM: getSparkCoinInfo per coin
    WASM-->>Worker: Coin metadata (value, used state)
    Worker->>IDB: Store processed coins
    Worker-->>App: Coin processing complete
    App->>App: Update sparkBalance
    App-->>User: Display private balance
```
```mermaid
sequenceDiagram
    participant User as User/UI
    participant SendTab as Spark Send Tab
    participant Handler as Spark Handler
    participant Electrum as Electrum Client
    participant WASM as WASM Module
    participant IDB as IndexedDB
    participant Signer as Firo Signer
    User->>SendTab: Enter address & amount
    SendTab->>Handler: sendFromSparkAddress()
    Handler->>IDB: Load owned coins + used tags
    Handler->>Handler: Filter and select coins
    Handler->>Electrum: Fetch UTXO data
    Handler->>WASM: Initialize Spark keys
    Handler->>WASM: Build recipients/memos
    Handler->>IDB: Load anonymity set data
    Handler->>WASM: Create spend transaction
    WASM-->>Handler: Serialized tx + scripts
    Handler->>Signer: Sign transaction
    Signer->>WASM: Validate + finalize
    Handler->>Electrum: Broadcast tx
    Electrum-->>Handler: txid
    Handler-->>SendTab: Broadcast success
    SendTab-->>User: Transaction confirmed
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
Actionable comments posted: 14
Note
Due to the large number of review comments, Critical severity comments were prioritized as inline comments.
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (5)
packages/extension/src/providers/bitcoin/types/index.ts (1)
91-128: ⚠️ Potential issue | 🟠 Major

**Move `vin.addr` into Firo-only type definition.**

Lines 100–102 make `addr` required in the shared `SSTxType.vin`, but the SS activity handler (line 34 of `activity-handlers/providers/ss/index.ts`) filters by `vi.addresses` and never accesses `addr`, while the Firo activity handler (line 32 of `activity-handlers/providers/firo/index.ts`) filters by `vi.addr` and never uses `addresses`. These are incompatible shapes from different explorers. Requiring `addr` in the shared type breaks existing SS consumers that only receive `addresses` in their API responses.

Make `addr` optional in `SSTxType.vin` and override `vin` in `FiroTxType` to require `addr`:

```diff
   vin: {
     txid: string;
     addresses?: string[];
-    addr: string;
+    addr?: string;
     value: string;
   }[];

 export interface FiroTxType
   extends Omit<
     SSTxType,
-    'fee' | 'timestamp' | 'vout' | 'blockHeight'
+    'fee' | 'timestamp' | 'vout' | 'blockHeight' | 'vin'
   > {
   fees: number;
   time: number;
   blockheight: number;
+  vin: (Omit<SSTxType['vin'][number], 'addr'> & {
+    addr: string;
+  })[];
   vout: {
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/providers/bitcoin/types/index.ts` around lines 91 - 128, Make SSTxType.vin's addr optional (addr?: string) so the shared SS type no longer requires a field SS explorers don't provide, and then explicitly override vin in FiroTxType to a Firo-specific shape that requires addr (e.g., add a vin declaration in FiroTxType with addr: string non-optional and the same other properties). Update only the type definitions SSTxType and FiroTxType (leave activity handlers as-is) so SS consumers get addresses[] and Firo consumers get addr.

packages/extension/src/ui/action/views/lock-screen/index.vue (1)
57-90: ⚠️ Potential issue | 🔴 Critical

**Declared `update:init-spark-state` emit is never fired.**

`defineEmits` now declares `(e: 'update:init-spark-state', password: string)`, but `unlockAction` only emits `update:init` and the password is cleared immediately after (line 85). Whatever listener binds `@update:init-spark-state` in the parent will never be invoked, so the Firo Spark initialization flow this emit is meant to trigger will silently not run. Either emit it alongside `update:init` before clearing the password, or drop the declaration.

🔧 Proposed fix

```diff
   isError.value = false;
-  password.value = '';
   emit('update:init');
+  emit('update:init-spark-state', password.value.trim());
+  password.value = '';
   setTimeout(() => (isUnlocking.value = false), 750);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/ui/action/views/lock-screen/index.vue` around lines 57 - 90, The declared emit 'update:init-spark-state' is never used; update the unlockAction in this component to emit('update:init-spark-state', password.value.trim()) before clearing password.value (and before or alongside emit('update:init')) so the parent receives the password for Firo Spark initialization, or remove the ('update:init-spark-state', password: string) signature from defineEmits if you choose to drop that flow; locate defineEmits and the unlockAction function to apply the change.

packages/extension/src/ui/action/components/accounts-header/components/header-accounts.vue (1)
6-28: ⚠️ Potential issue | 🟡 Minor

**Handle empty `sparkAddress` gracefully on Firo.**

`sparkAddress` defaults to `''` and only becomes populated once `accountInfo.sparkAccount?.defaultAddress` resolves. While the prop is empty on the Firo network, `network.identicon('')` and `replaceWithEllipsis('', 6, 4)` will render a placeholder identicon and an empty/ellipsed address label. Consider falling back to `address` (the transparent address) until `sparkAddress` is ready, to avoid a transient broken-looking header:

```diff
-      network.name === NetworkNames.Firo ? sparkAddress : address,
+      network.name === NetworkNames.Firo && sparkAddress
+        ? sparkAddress
+        : address,
```

The same guard should apply to the `replaceWithEllipsis` call at lines 22-28.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/ui/action/components/accounts-header/components/header-accounts.vue` around lines 6 - 28, When NetworkNames.Firo is active, the template currently passes sparkAddress directly to network.identicon and $filters.replaceWithEllipsis even when sparkAddress is an empty string; update both usages to fall back to address when sparkAddress is falsy so the transparent address is shown until accountInfo.sparkAccount?.defaultAddress resolves. Specifically change the identicon src binding (network.identicon(...)) and the display span (the $filters.replaceWithEllipsis(...) call) to use (sparkAddress || address) when network.name === NetworkNames.Firo, keeping all other logic intact.

packages/extension/src/ui/action/views/network-activity/index.vue (1)
303-328: ⚠️ Potential issue | 🟠 Major

**Pending `spark_transaction` activities are never polled.**

`setActivities` only dispatches pending items for `ActivityType.transaction` (to `checkActivity`) and `ActivityType.swap` (to `checkSwap`). Pending entries with the new `ActivityType.spark_transaction` are silently skipped, so a Spark spend that is in `ActivityStatus.pending` at load time will stay pending in the UI indefinitely — no status transition to `success`/`failed`/`dropped` will be persisted via `updateActivity`. Consider adding a branch (e.g., a `checkActivity` equivalent using the Firo API's `getTransactionStatus`) for `spark_transaction`.
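As a sketch only (the activity shape and the `getTransactionStatus`/`updateActivity` signatures below are assumptions for illustration, not Enkrypt's actual APIs), a `checkSparkTransaction`-style poller could look like:

```typescript
type SparkActivityStatus = 'pending' | 'success' | 'failed' | 'dropped';

interface SparkActivity {
  type: string;
  status: SparkActivityStatus;
  transactionHash: string;
}

// Hypothetical poller mirroring checkActivity for spark_transaction entries:
// look up each pending Spark spend and persist any status transition.
async function checkSparkTransactions(
  activities: SparkActivity[],
  getTransactionStatus: (txHash: string) => Promise<SparkActivityStatus>,
  updateActivity: (activity: SparkActivity) => Promise<void>,
): Promise<SparkActivity[]> {
  const updated: SparkActivity[] = [];
  for (const activity of activities) {
    if (activity.type !== 'spark_transaction') continue;
    if (activity.status !== 'pending') continue;
    const status = await getTransactionStatus(activity.transactionHash);
    if (status !== activity.status) {
      const next = { ...activity, status };
      await updateActivity(next); // persist the transition
      updated.push(next);
    }
  }
  return updated;
}
```

Non-spark and already-settled entries pass through untouched, matching how the existing `checkActivity`/`checkSwap` branches only look at their own activity type.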
🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/ui/action/views/network-activity/index.vue` around lines 303 - 328, setActivities currently only triggers polling for ActivityType.transaction (checkActivity) and ActivityType.swap (checkSwap), so pending activities with ActivityType.spark_transaction are never polled; add a branch in setActivities to detect (act.type === ActivityType.spark_transaction && (act.status === ActivityStatus.pending || act.status === ActivityStatus.dropped)) and call a new or existing routine that uses the Firo API (e.g., getTransactionStatus) to poll and then persist results via updateActivity (mirror behavior of checkActivity), or implement a dedicated checkSparkTransaction function that performs the getTransactionStatus calls and calls updateActivity when status changes.

packages/extension/src/ui/action/views/deposit/index.vue (1)
161-180: ⚠️ Potential issue | 🟠 Major

**Default to a real transparent fallback when no Spark account exists.**

`SparkAccount | null` currently defaults to `{}`, so `!!sparkAccount` becomes true and the Spark tab can render `undefined`. Also, resetting `activeTab` to `spark` hides the transparent address when no Spark account is available.

Proposed fix

```diff
 const isCopied = ref(false);
-const activeTab = ref<'transparent' | 'spark'>('spark');
+const activeTab = ref<'transparent' | 'spark'>('transparent');
@@
   sparkAccount: {
     type: Object as PropType<SparkAccount | null>,
-    default: () => {
-      return {};
-    },
+    default: null,
   },
@@
 watch(
-  () => props.showDeposit,
-  v => {
-    if (!v) {
+  () => [props.showDeposit, props.sparkAccount] as const,
+  ([showDeposit, sparkAccount]) => {
+    if (!showDeposit) {
       isCopied.value = false;
-      activeTab.value = 'spark';
     }
+    activeTab.value = sparkAccount ? 'spark' : 'transparent';
   },
 );
```

Also applies to: 214-220
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/ui/action/views/deposit/index.vue` around lines 161 - 180, The Spark account prop currently declares type SparkAccount | null but defaults to an empty object, causing !!sparkAccount to be truthy and allowing the Spark tab to render undefined; update the prop default for sparkAccount to return null (or undefined) so the value is truly falsy when absent, and adjust activeTab initialization and any reset logic (activeTab ref and places where it’s reset to 'spark') to fall back to 'transparent' when sparkAccount is falsy (use the sparkAccount prop and checks around activeTab in the component to conditionally set 'spark' only when sparkAccount exists); apply the same default-null fix to the other identical prop occurrence referenced in the file.
♻️ Duplicate comments (1)
packages/name-resolution/tests/sid.test.ts (1)
4-24: ⚠️ Potential issue | 🟡 Minor

**Commented-out mocks won't work and may be unnecessary.**

This commented mock scaffolding has the same issues as in `resolver.test.ts`:

- Uses `vi.fn()` and `vi.mock()` without importing `vi` from Vitest
- Unclear relevance to Firo blockchain support

Additionally, the comment on line 4 says these mocks are "to pass the tests," but the entire test suite is skipped (line 25: `describe.skip(...)`). If the tests are intentionally disabled, these mock stubs serve no purpose. Consider either removing the commented code or properly implementing it if you plan to re-enable these tests.

🔧 Proposed fix to add missing import

If these mocks are needed when tests are re-enabled:

```diff
-import { describe, it, expect } from "vitest";
+import { describe, it, expect, vi } from "vitest";
 import SIDResolver from "../src/sid";
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/name-resolution/tests/sid.test.ts` around lines 4 - 24, The commented-out vi.mock/vi.fn scaffolding in sid.test.ts is dead or broken: either remove these commented mocks (they're unnecessary while the suite is skipped via describe.skip) or implement them correctly before re-enabling tests; to implement, import vi from Vitest (e.g. import { vi } from 'vitest'), register mocks for the exact modules used (e.g. "@web3-name-sdk/core", "@web3-name-sdk/core/solName", "@web3-name-sdk/core/paymentIdName") and ensure the mocked methods (getDomainName/getAddress) reflect the actual Firo SDK surface if Firo support is intended, then re-enable the suite by removing describe.skip once mocks are correct.
🟡 Minor comments (27)
packages/name-resolution/tests/resolver.test.ts-4-24 (1)
4-24: ⚠️ Potential issue | 🟡 Minor

**Commented-out mocks use undefined `vi` without importing it.**

Lines 4-22 contain mock scaffolding that calls `vi.fn()` and `vi.mock()`, but `vi` is not imported from Vitest. If uncommented, these would immediately fail with a ReferenceError.

Since no Firo-related changes appear in this test file, this commented code appears to be unrelated scaffolding. Either:

- Remove it if not needed
- Implement it properly by adding `import { vi } from "vitest"` to the imports
- Move it to a separate PR if it's preparatory work
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/name-resolution/tests/resolver.test.ts` around lines 4 - 24, The commented mock scaffolding uses Vitest's global "vi" (vi.mock, vi.fn) but there is no import for vi, so if you keep/uncomment these lines either remove the unrelated mock block altogether, move it to a separate preparatory PR, or add an explicit import "import { vi } from 'vitest'" at the top of the test file; specifically modify the mocked calls that reference vi (the vi.mock(...) blocks and vi.fn() usages that create createWeb3Name/createSolName/createPaymentIdName stubs) to ensure vi is imported when enabling these mocks.

packages/extension/src/libs/utils/number-formatter.ts-316-317 (1)
316-317: ⚠️ Potential issue | 🟡 Minor

**Remove the `@ts-ignore` suppression or replace `replaceAll` with `replace(/%/g, '')`.**

Line 316 suppresses a type error instead of fixing it. However, other files in the same package (substrate.ts, acala.ts, asset-handler.ts) use `replaceAll` without suppression and compile successfully, indicating the `@ts-ignore` is likely unnecessary. Either remove the comment, or use the proposed fix to ensure type safety without suppression:

Proposed fix

```diff
-  // @ts-ignore
-  const value = new BigNumber(_value.toString().replaceAll('%', ''));
+  const value = new BigNumber(_value.toString().replace(/%/g, ''));
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/libs/utils/number-formatter.ts` around lines 316 - 317, Remove the TypeScript suppression and fix the string replacement in the BigNumber construction: replace the line using _value.toString().replaceAll('%', '') with _value.toString().replace(/%/g, '') (or simply remove the // `@ts-ignore` if replaceAll is allowed), so update the code that builds value (const value = new BigNumber(...)) to use the regex replace on _value and eliminate the `@ts-ignore` comment.

packages/extension/src/providers/bitcoin/libs/utils.ts-21-26 (1)
21-26: ⚠️ Potential issue | 🟡 Minor

**Normalize the boolean flag to a number before `ccall`.**

Line 25 passes a boolean `isTestNetwork` directly, but the WASM function signature declares the second argument as `'number'`. This is inconsistent with the same pattern in `getMintTxData.ts` (line 58), which converts the value explicitly using `Number(isTestNetwork)`.

Proposed fix

```diff
 const isAddressValid = wasm.ccall(
   'js_isValidSparkAddress',
   'number',
   ['string', 'number'],
-  [address, isTestNetwork],
+  [address, Number(isTestNetwork)],
 );
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/providers/bitcoin/libs/utils.ts` around lines 21 - 26, The call to wasm.ccall for js_isValidSparkAddress passes a boolean for the second parameter but the WASM signature expects a number; update the wasm.ccall invocation in isAddressValid to normalize isTestNetwork to a numeric value (e.g., Number(isTestNetwork)) before passing it so the argument types match the declared ['string','number'] signature.

packages/extension/src/libs/utils/updateAndSync/calculateCurrentSparkBalance.ts-7-18 (1)
7-18: ⚠️ Potential issue | 🟡 Minor

**Validate the IndexedDB payload before filtering.**

`readData` returns persisted local data, so a malformed `myCoins` value will make `.filter` throw and break the Spark balance refresh. Mirror the defensive fallback used by `getSetsFromDb`.

Proposed fix

```diff
-export const calculateCurrentSparkBalance = async () => {
-  const myCoins = await db.readData<MyCoinModel[]>(DB_DATA_KEYS.myCoins);
+export const calculateCurrentSparkBalance = async (): Promise<string> => {
+  const myCoins =
+    (await db.readData<MyCoinModel[]>(DB_DATA_KEYS.myCoins)) ?? [];
-  if (!myCoins?.length) {
+  if (!Array.isArray(myCoins) || !myCoins.length) {
     return '0';
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/libs/utils/updateAndSync/calculateCurrentSparkBalance.ts` around lines 7 - 18, The calculateCurrentSparkBalance function assumes readData(DB_DATA_KEYS.myCoins) returns a valid array and directly calls .filter, which can throw on malformed IndexedDB payloads; before filtering, validate myCoins with Array.isArray and normalize it (e.g., fallback to an empty array) and ensure items have the expected shape (check existence of isUsed and numeric/value fields) before reducing; update the logic in calculateCurrentSparkBalance to mirror the defensive checks used in getSetsFromDb so a bad payload returns '0' rather than throwing.

packages/extension/src/ui/action/views/accounts/components/accounts-list-item.vue-14-22 (1)
14-22: ⚠️ Potential issue | 🟡 Minor

**Initial-render flicker and wrong-address display for Firo accounts.**

`computedAsync` resolves asynchronously, so on first render `sparkDisplayAddress` is `undefined` (no `evaluating`/initial value provided). During that window, with `network.name === NetworkNames.Firo && active`, `replaceWithEllipsis(undefined, 6, 4)` runs on `undefined`, which will either throw or produce a garbled ellipsis string depending on the filter's implementation. Pass an initial value to `computedAsync` and guard the template against falsy values (e.g., fall back to the transparent `address` until the spark address resolves).

🔧 Proposed fix

```diff
-const sparkDisplayAddress = computedAsync(async () => {
-  if (props.network.name !== NetworkNames.Firo) return null;
-  return await getSparkAddress(props.network, props.address);
-});
+const sparkDisplayAddress = computedAsync(async () => {
+  if (props.network.name !== NetworkNames.Firo) return null;
+  return (await getSparkAddress(props.network, props.address)) ?? props.address;
+}, props.address);
```

Also applies to: 98-101
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/ui/action/views/accounts/components/accounts-list-item.vue` around lines 14 - 22, The Firo account label uses sparkDisplayAddress (from computedAsync) before it resolves, causing flicker/garbled output; update the computedAsync that produces sparkDisplayAddress to supply a safe initial value (e.g., address or empty string) and change the template expression so it only calls $filters.replaceWithEllipsis(sparkDisplayAddress, 6, 4) when sparkDisplayAddress is truthy, otherwise fall back to using address; reference the sparkDisplayAddress computed, the computedAsync initializer, and the template usage around network.name === NetworkNames.Firo && active, and ensure the same guard is applied in the other occurrence (lines referenced 98-101).

packages/extension/src/libs/keyring/public-keyring.ts-109-118 (1)
109-118: ⚠️ Potential issue | 🟡 Minor

**Dev mock account missing `isTestWallet: true`.**

Every other `__IS_DEV__` mock in this block sets `isTestWallet: true`; the new Firo entry omits it. If any UI filters dev accounts by `isTestWallet`, this one will leak through as a "real" account in dev builds.

```diff
   walletType: WalletType.mnemonic,
   isHardware: false,
+  isTestWallet: true,
 };
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/libs/keyring/public-keyring.ts` around lines 109 - 118, The dev mock entry in public-keyring.ts for allKeys key 'TMSnbcpSw9JhteaJqioT2sz2sW1Qhqyf2Q' is missing the isTestWallet flag; update that object in allKeys (the Firo account mock) to include isTestWallet: true so it matches other __IS_DEV__ mocks and will be filtered correctly by UI code that checks isTestWallet.

packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue-241-245 (1)
241-245: ⚠️ Potential issue | 🟡 Minor

**`setMaxValue` does not guard against empty/`NaN` `assetMaxValue`.**

`assetMaxValue` starts as `''` (initial default of `useAsyncComputed`). `parseFloat('')` is `NaN`, so `NaN < 0` is `false` and `amount.value` becomes `''`, which then flows into `toBase('', decimals)`. This is likely benign but also does not actually set a max. Guard with a check that `assetMaxValue.value` is a non-empty numeric string before applying.
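A hedged sketch of such a guard (the helper names below are invented; the real handler works with the `amount` and `assetMaxValue` refs directly):

```typescript
// Hypothetical guard for the max-value string coming from useAsyncComputed,
// which starts life as '' before the balance resolves.
function isNumericString(value: string): boolean {
  const trimmed = value.trim();
  return trimmed !== '' && Number.isFinite(Number(trimmed));
}

// Returns the new amount; leaves the current amount untouched for '' / NaN.
function applyMaxValue(assetMaxValue: string, currentAmount: string): string {
  if (!isNumericString(assetMaxValue)) return currentAmount;
  return parseFloat(assetMaxValue) < 0 ? '0' : assetMaxValue;
}
```

With this shape, clicking Max before the balance has loaded is a no-op instead of writing `''` into the amount field.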
🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue` around lines 241 - 245, The setMaxValue handler should guard against empty or non-numeric assetMaxValue before assigning amount: in the setMaxValue function (where isMaxSelected.value, amount.value and assetMaxValue.value are used) check that assetMaxValue.value is a non-empty numeric string (e.g. trim and use parseFloat + isNaN or Number.isFinite) and only then set amount.value to assetMaxValue.value; otherwise set amount.value to '0' (or leave it unchanged per UI expectation) to avoid passing ''/NaN into toBase or downstream conversions.

packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue-233-238 (1)
233-238: ⚠️ Potential issue | 🟡 Minor

**Don't overwrite the asset `name`, and drop the commented-out code.**

- Line 238 sets `availableAsset.value.name = 'Available Balance'`, which changes the token's identity to a UI label. If `name` is consumed anywhere else (send review screen, metrics, tx building) this will surface the literal string "Available Balance" instead of "Firo". Either rename to a local label variable or avoid mutating the token's `name`.
- Remove the commented-out block at lines 233–235 (and the similar block at 148–158) to keep the file clean.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue` around lines 233 - 238, The code incorrectly mutates the token identity by setting availableAsset.value.name = 'Available Balance' after calling calculateCurrentSparkBalance(); revert that mutation and instead introduce or use a separate local label (e.g., availableAssetLabel or a computed property used by the UI) so the token's name (e.g., "Firo") remains unchanged where it's consumed (send review, metrics, tx building), and remove the stale commented-out balance assignment blocks (the commented snippet around calculateCurrentSparkBalance and the similar commented block elsewhere) to clean up the file.

packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue-100-100 (1)
100-100: ⚠️ Potential issue | 🟡 Minor

**Remove debug `console.log`.**

`console.log({ isLoadingAssets: isLoadingAssets.value })` is a debug artifact; please remove before release.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue` at line 100, Remove the debug console.log statement that prints the reactive flag—delete the line calling console.log({ isLoadingAssets: isLoadingAssets.value }) from the send-transaction component (index.vue) so no debug output remains; locate the usage of isLoadingAssets.value in the component and simply remove the console.log call, leaving the reactive variable untouched.

packages/extension/src/ui/action/App.vue-191-210 (1)
191-210: ⚠️ Potential issue | 🟡 Minor

**Remove debug `console.log` and clean up the balance-refresh closure.**

`console.log('UPDATING BALANCE')` at line 194 is a debug artifact and should be removed before release. Also consider using `??` when defaulting `sparkBalance`, since `0` (a legitimate zero balance) would be replaced with the `'0'` string by the `||` operator — currently harmless because `calculateCurrentSparkBalance()` likely returns a string, but worth keeping consistent.
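The `||` vs `??` point is quick to demonstrate; the values below are stand-ins, not the actual `sparkBalance`:

```typescript
// With ||, every falsy balance (0, '', NaN) collapses to the fallback;
// with ??, only null and undefined do, so a real zero balance survives.
const zeroBalance: number | null = 0;
const missingBalance: number | null = null;

const withOr = zeroBalance || '0'; // '0': the legitimate zero is lost
const withNullish = zeroBalance ?? '0'; // 0: zero is kept
const defaulted = missingBalance ?? '0'; // '0': null still falls back
```

The difference only matters if the balance can ever be a falsy non-null value, which is exactly the case the comment flags as a future hazard.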
🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/ui/action/App.vue` around lines 191 - 210, Remove the debug console.log inside the callback passed to useSynchronizeSparkState and replace the sparkBalance fallback from using || to nullish coalescing (??) so a legitimate zero balance is preserved; specifically, in the closure that calls calculateCurrentSparkBalance() and updates accountHeaderData (referencing accountHeaderData.value.sparkAccount and sparkBalance), delete console.log('UPDATING BALANCE') and set availableBalance to sparkBalance ?? '0'; keep the rest of the update logic intact and leave useUpdateActivityState and sparkUnusedTxDetails usage unchanged.

packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue-192-222 (1)
192-222: ⚠️ Potential issue | 🟡 Minor

**`sparkAccount.defaultAddress` may be undefined when Send is clicked.**

`sendAction` reads `props.sparkAccount.defaultAddress` unconditionally. If `sparkAccount` is `{}` (its default), `fromAccountAddress` will be `undefined` and the verification route will receive `fromAddress: undefined` in the base64 payload. There's no guard here — consider either returning early with an error or disabling the Send button when `sparkAccount.defaultAddress` is missing.
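One possible shape for the guard (the result type, helper name, and error message are invented for illustration):

```typescript
interface SparkAccount {
  defaultAddress?: string;
}

// Hypothetical pre-flight check for sendAction: resolve the from-address or
// fail loudly instead of pushing `fromAddress: undefined` into the route.
function resolveFromAddress(
  sparkAccount: SparkAccount | null | undefined,
): { ok: true; fromAddress: string } | { ok: false; error: string } {
  const address = sparkAccount?.defaultAddress;
  if (!address) {
    return { ok: false, error: 'Spark account has no default address yet' };
  }
  return { ok: true, fromAddress: address };
}
```

The same predicate can drive a `:disabled` binding on the Send button, so the failing path is unreachable from the UI in the first place.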
🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue` around lines 192 - 222, sendAction currently reads props.sparkAccount.defaultAddress without checking it, which can cause fromAddress to be undefined in the router.push payload; update sendAction (and/or the Send button logic) to guard against a missing address: inside sendAction check props.sparkAccount && props.sparkAccount.defaultAddress and if missing either return early and surface an error/notification (e.g., call a provided showError method) or prevent navigation, and additionally ensure the Send button is disabled when props.sparkAccount.defaultAddress is falsy so router.push (to RouterNames.verifySendFromSpark.name) is never called with an undefined fromAddress.

packages/extension/src/libs/utils/updateAndSync/handleCoinSetUpdates.ts-54-67 (1)
54-67: ⚠️ Potential issue | 🟡 Minor

**Possible sparse array when `setId` skips indices.**

If `updatedSets.length < setId - 1`, assigning `updatedSets[setIndex] = {...}` creates holes (`undefined` entries) that are later persisted. Depending on how `saveData` serializes, holes could become `null`s and later break consumers that expect contiguous `StoredAnonymitySet` entries (and the deduplication relying on `existingSet?.coins`). Consider filling any gap explicitly (e.g. with an empty set shell) or at least validating that `setId - 1 <= updatedSets.length`.
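A sketch of the gap-filling option (the `StoredAnonymitySet` shell and helper name are assumptions based on the comment, not the actual implementation):

```typescript
interface StoredAnonymitySet {
  blockHash: string;
  setHash: string;
  coins: unknown[];
}

// Hedged sketch: write the updated set at setId - 1, padding any skipped
// indices with empty shells so the array persisted to IndexedDB stays dense.
function placeAnonymitySet(
  sets: StoredAnonymitySet[],
  setId: number,
  updated: StoredAnonymitySet,
): StoredAnonymitySet[] {
  const result = sets.slice();
  const index = setId - 1;
  while (result.length < index) {
    result.push({ blockHash: '', setHash: '', coins: [] }); // fill the gap
  }
  result[index] = updated;
  return result;
}
```

Because every index up to `setId - 1` is explicitly pushed, the result never contains holes, so serialization cannot turn skipped entries into `null`s.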
🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/libs/utils/updateAndSync/handleCoinSetUpdates.ts` around lines 54 - 67, The loop over normalizedUpdates can create sparse arrays when setId skips indices; in the updatedSets assignment inside the forEach, ensure contiguous entries by checking setIndex against updatedSets.length and filling any gaps with an empty StoredAnonymitySet shell (e.g., { blockHash: '', setHash: '', coins: [] }) before the assignment; keep using mergeCoins(existingSet?.coins ?? [], update.set.coins) so deduplication still works, or alternatively validate and throw if setIndex > updatedSets.length to avoid holes—update the logic in the normalizedUpdates.forEach block (referencing normalizedUpdates, setId/setIndex, updatedSets, existingSet, and mergeCoins).

packages/extension/src/ui/action/composables/synchronize-spark-state.ts-135-135 (1)
135-135: ⚠️ Potential issue | 🟡 Minor

**Remove debug `console.log`.**

`console.log({ coinDone, tagDone, workerDone })` inside the sync-state watcher is a debug artifact that will run on every sync-flag change. Please remove before release. (Same applies to `console.log('Worker already running, skip start')` at line 45.)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/ui/action/composables/synchronize-spark-state.ts` at line 135, Remove the debug console.log statements in the sync-state watcher and worker start check: delete the console.log({ coinDone, tagDone, workerDone }) inside the watcher (where coinDone, tagDone, workerDone are referenced) and remove the console.log('Worker already running, skip start') in the worker-start logic (the branch that checks if the worker is already running). Leave any real logging mechanism intact if present or replace with a proper logger call if needed; ensure no stray console.log calls remain in synchronize-spark-state.ts.

packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue-156-160 (1)
156-160: ⚠️ Potential issue | 🟡 Minor

**Fix activity status and address inconsistency for Spark transactions.**

On success, the activity remains in `ActivityStatus.pending` after broadcast (line 177—no status update). Set it to `ActivityStatus.success` before recording. Additionally, on failure (line 207), the activity is written under the raw `txData.fromAddress`, while success uses `network.value.displayAddress(txData.fromAddress)`, causing failed transactions to appear under a different address key than successful ones.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue` around lines 156 - 160, The activity status and address key are inconsistent: after broadcast via sendFromSparkAddress you must update the activity from ActivityStatus.pending to ActivityStatus.success before recording (set ActivityStatus.success on the successful branch where rawTx is returned), and when writing the activity in the failure branch use the same normalized display address as success (network.value.displayAddress(txData.fromAddress)) instead of raw txData.fromAddress so both success and failure entries get stored under the same address key.

packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue-246-263 (1)
246-263: ⚠️ Potential issue | 🟡 Minor

**`isIncoming: txData.fromAddress === txData.toAddress` is semantically incorrect for a send flow.**

We're in a send-to-Spark confirmation flow, so the activity is intrinsically outgoing from the user's perspective. Setting `isIncoming` true only when `fromAddress === toAddress` (sending to self) misclassifies self-sends as "incoming" in the activity list while correctly classifying them as outgoing for everyone else. Spark's mint-to-self is still an outgoing transaction from the wallet's viewpoint (fees are paid, UTXOs consumed). Set `isIncoming: false`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue` around lines 246 - 263, The isIncoming flag on the txActivity object is wrong for the send-to-Spark flow: change the isIncoming assignment in the txActivity construction (the isIncoming field on the txActivity object created in index.vue) to false so all send-to-Spark activities are recorded as outgoing (keep other fields like ActivityType.transaction, transactionHash, token, value, etc. unchanged); do not use txData.fromAddress === txData.toAddress to infer incoming for this send flow.packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue-181-195 (1)
packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue, lines 181-195 (1)

181-195: ⚠️ Potential issue | 🟡 Minor

Magic number `500` in change calc obscures intent and risks over/under-funding.

changeAmount: inputAmountBn.minus(amountToSendBN).minus(new BigNumber(500)),

`500` appears to be a placeholder fee used solely to build `tempTx` for fee estimation, but it's not labeled and the same literal is not reconciled anywhere else. If `getFee` later returns a value that, when combined with `requestedAmount`, exceeds `inputAmountBn - 500`, the real `psbt` build below still proceeds and only later does the `inputAmountBn.lt(amountToSendBN.plus(feeBn))` guard throw. Extract a named constant and document why.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue` around lines 181 - 195: the change calculation in createTempTx uses a magic literal new BigNumber(500) which obscures intent and can mismatch later fee checks. Replace that literal with a clearly named constant (e.g. ESTIMATED_FEE_BUFFER = new BigNumber(500)) and add a short comment documenting that it is a conservative fee buffer used only for the initial tempTx fee estimation; update the changeAmount expression in createTempTx and ensure any subsequent fee comparison logic (e.g. where getFee is used and the inputAmountBn.lt(amountToSendBN.plus(feeBn)) guard) consistently references the same named constant or is reconciled with it, so the placeholder buffer is obvious and consistent across createTempTx, changeAmount and fee validation.
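A sketch of the named-constant approach suggested above, using `bigint` in place of `BigNumber` so the example is dependency-free; the constant name follows the prompt's suggestion and the function names are illustrative:

```typescript
// Conservative placeholder fee (in satoshis) used ONLY to build tempTx for the
// initial fee estimate; the real fee from getFee() is validated separately.
const ESTIMATED_FEE_BUFFER = 500n;

// Stand-in for the changeAmount expression in createTempTx.
function tempChangeAmount(inputAmount: bigint, amountToSend: bigint): bigint {
  return inputAmount - amountToSend - ESTIMATED_FEE_BUFFER;
}

// The later guard reuses no magic literal: it compares against the real fee.
function hasEnoughFunds(
  inputAmount: bigint,
  amountToSend: bigint,
  realFee: bigint,
): boolean {
  return inputAmount >= amountToSend + realFee;
}
```

The point is that the buffer and the final validation are visibly related: a reviewer can see that `ESTIMATED_FEE_BUFFER` is only an estimate and that the real check happens against `realFee`.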
packages/extension/src/ui/action/db/indexedDB.ts, lines 136-144 (1)

136-144: ⚠️ Potential issue | 🟡 Minor

`if (index)` treats index `0` as unset.

Passing a valid `index = 0` takes the `return data?.length` branch instead of returning `data[0].coins.length`. Use `if (index !== undefined)`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/ui/action/db/indexedDB.ts` around lines 136 - 144: in getLengthOf, the check "if (index)" treats 0 as unset; change the condition to explicitly detect a provided index (e.g., use "index !== undefined" or "index != null") so index = 0 is handled. Keep the existing validation of data[index] and the error throw (Invalid index), and return data[index].coins.length when an index is provided, otherwise return data?.length. References: getLengthOf and readData.
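The `0`-is-falsy pitfall flagged for `getLengthOf` can be reproduced in isolation. A minimal sketch — the `CoinSet` shape and function names are hypothetical stand-ins for the real `getLengthOf`/`readData`:

```typescript
// Hypothetical stand-in for the array that readData() returns.
type CoinSet = { coins: number[] };

// Buggy variant: `if (index)` is false for index === 0.
function getLengthBuggy(sets: CoinSet[], index?: number): number {
  if (index) return sets[index].coins.length;
  return sets.length; // index 0 erroneously lands here
}

// Fixed variant: explicitly test whether an index was provided.
function getLengthFixed(sets: CoinSet[], index?: number): number {
  if (index !== undefined) {
    const entry = sets[index];
    if (!entry) throw new Error('Invalid index');
    return entry.coins.length;
  }
  return sets.length;
}

const demoSets: CoinSet[] = [{ coins: [1, 2, 3] }, { coins: [4] }];
```

With `demoSets`, the buggy variant returns the number of sets (`2`) for `index = 0`, while the fixed variant returns the coin count of set 0 (`3`).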
packages/extension/src/ui/action/db/indexedDB.ts, lines 119-129 (1)

119-129: ⚠️ Potential issue | 🟡 Minor

`removeSector` is a misleading name — it keeps the slice, it doesn't remove it.

`data[index].coins = data[index].coins.slice(startIndex, endIndex)` retains the `[startIndex, endIndex)` range and discards the rest. That's the opposite of "removing a sector". Either rename (e.g. `keepSector` / `sliceCoins`) or actually splice out the sector. Also consider validating `startIndex <= endIndex` and clamping to array bounds.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/ui/action/db/indexedDB.ts` around lines 119 - 129: the method removeSector currently assigns data[index].coins = data[index].coins.slice(startIndex, endIndex), which keeps the slice rather than removing it. Either (A) actually remove the range using an in-place splice on data[index].coins (e.g., data[index].coins.splice(startIndex, endIndex - startIndex)) or (B), if you intend to keep the slice, rename removeSector to sliceCoins/keepSector. In either case add validation to ensure startIndex <= endIndex and clamp both indices to [0, data[index].coins.length] before performing the operation, and preserve throwing the existing 'Invalid index' error if data[index] is missing.
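The slice/splice distinction driving this finding, shown in isolation. The `removeRange` helper is a hypothetical sketch of option A with the suggested validation and clamping:

```typescript
// slice(start, end) RETURNS the [start, end) range and leaves the array alone;
// assigning the result back therefore KEEPS the sector and discards the rest.
const coins = [10, 20, 30, 40, 50];
const kept = coins.slice(1, 3); // [20, 30]

// splice(start, deleteCount) removes in place — what "removeSector" implies.
// Clamp and validate indices as the review suggests.
function removeRange<T>(arr: T[], startIndex: number, endIndex: number): T[] {
  const len = arr.length;
  const s = Math.min(Math.max(0, startIndex), len);
  const e = Math.min(Math.max(s, endIndex), len); // enforces s <= e, clamps to bounds
  arr.splice(s, e - s); // drops the [s, e) range
  return arr;
}
```

`removeRange([10, 20, 30, 40, 50], 1, 3)` yields `[10, 40, 50]` — the sector is actually gone, and out-of-bounds indices degrade to a no-op instead of throwing.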
packages/extension/src/libs/spark-handler/index.ts, lines 321-332 (1)

321-332: ⚠️ Potential issue | 🟡 Minor

Hardcoded `'Private memo'` string is embedded in every outgoing Spark spend.

Every Spark send created through this path ships the literal string `'Private memo'` as the encrypted memo. While memos are encrypted in Spark, this is still a confusing default and leaks a fingerprint that identifies transactions as originating from this extension build. Make this configurable (empty string by default, or thread the user's memo from the UI).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/libs/spark-handler/index.ts` around lines 321 - 332: the code currently passes the hardcoded string 'Private memo' into Module.ccall when invoking js_addPrivateRecipientForCreateSparkSpendTransaction. Replace this literal with a memo variable (e.g., memoString) that is either passed into the surrounding function or read from the UI/options so it defaults to an empty string when not provided. Locate the call to Module.ccall and the surrounding function that builds privateRecipientsVector (where isSparkTransaction, recipientAddressObj, senderAddressObj and privateRecipientAmount are available) and change the argument from the literal to the memo variable, ensuring the ccall still uses the 'string' type for that parameter; add minimal validation (coerce undefined/null to '') before calling.
packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts, lines 33-38 (1)

33-38: ⚠️ Potential issue | 🟡 Minor

Hardcoded peer port `50004` ignores the `Peer.tcp` / `Peer.ssl` config you just defined.

`hardcodedPeers` entries carry `tcp: '50001', ssl: '50002'`, but `connectMain` passes the literals `50004, 'wss'` and never reads either field. Either drop the unused fields from `Peer` or actually use `peer.ssl` / `peer.tcp` so the configuration is honored.

Also applies to: 77-80

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts` around lines 33 - 38: the hardcodedPeers array defines tcp and ssl ports, but connectMain (and the other call sites around lines 77 - 80) ignore them by passing literal 50004 and 'wss'. Update the connection logic to read and use the Peer fields (peer.tcp and peer.ssl) and the desired protocol value from the Peer or connection options instead of hardcoded values, or remove tcp/ssl from the Peer type if you intend to always use fixed ports; specifically modify connectMain and the other connection invocation(s) to use peer.ssl/peer.tcp (and peer.protocol if you add it) so the configured ports are honored.
packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts, lines 367-378 (1)

367-378: ⚠️ Potential issue | 🟡 Minor

Debug string `"23333"` leaked into error message.

throw new Error(`${result.message} 23333`);

Appears to be leftover instrumentation. Strip it.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts` around lines 367 - 378: in getLatestSetId, remove the stray debug token "23333" from the thrown error. When mainClient?.request('spark.getsparklatestcoinid', []) returns an Error, either rethrow that Error or throw a new Error using only result.message (no extra debug string); update the throw in getLatestSetId so it does not append "23333" and preserves the original error information from result.
packages/extension/src/providers/bitcoin/libs/activity-handlers/providers/firo/index.ts, lines 17-18 (1)

17-18: ⚠️ Potential issue | 🟡 Minor

Prefer checking the HTTP status over the stringly-typed `message` property, and drop the `any` cast.

`(txs as any).message` depends on the Insight API's shape-on-error. If the node returns non-200 with JSON `{ error: ... }`, or returns no `txs` field at all, the code proceeds to `txs.txs.map` and throws a confusing error that gets swallowed in `.catch`. Check `res.ok` first and validate the presence of `txs.txs` before mapping.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/providers/bitcoin/libs/activity-handlers/providers/firo/index.ts` around lines 17 - 18: in the .then(async (txs: { txs: FiroTxType[] }) => { ... }) callback, replace the stringy error check and any-cast with proper HTTP and shape validation. Ensure the fetch response was ok (check res.ok at the fetch response handling point) and that the parsed body contains a populated txs.txs array before calling txs.txs.map; remove the (txs as any).message check and any casts, and return [] early if the response is not ok or txs.txs is missing/not an array, to avoid throwing when mapping in the callback.
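The guard the prompt describes can be factored into a pure helper so it's testable without the network. A sketch — the `FiroTxType` shape is an assumption (only `txid` is shown), and in the real handler `ok` would come from `res.ok` right after the fetch:

```typescript
interface FiroTxType {
  txid: string;
}

// Validate the HTTP status and response shape before mapping, instead of
// probing a stringly-typed `message` property through an `any` cast.
function extractTxs(ok: boolean, body: unknown): FiroTxType[] {
  if (!ok) return []; // non-200: bail out before touching the body
  const txs = (body as { txs?: unknown } | null)?.txs;
  if (!Array.isArray(txs)) return []; // missing or malformed `txs` field
  return txs as FiroTxType[];
}
```

Returning `[]` early keeps the activity list empty on a bad response instead of throwing a confusing `TypeError` inside `.map` that the outer `.catch` swallows.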
packages/extension/src/ui/action/db/indexedDB.ts, lines 131-154 (1)

131-154: ⚠️ Potential issue | 🟡 Minor

Helpers throw raw errors when the underlying key is absent.

`getLength`, `getBlockHashes`, and `getSetHashes` all call `.reduce` / `.map` on the result of `readData`, which returns `undefined` when the key was never written (first run, post-reset). That throws a TypeError rather than returning a sensible `0` / `[]`. Add a defensive null check: `const data = (await this.readData<any[]>(key)) ?? [];`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/ui/action/db/indexedDB.ts` around lines 131 - 154: getLength, getBlockHashes and getSetHashes can throw when readData returns undefined. Update each to defensively default the readData result to an empty array (e.g. const data = (await this.readData<any[]>(key)) ?? []) before calling .reduce or .map so they return 0 or [] instead of raising a TypeError; adjust getLength to sum over the defaulted array and getBlockHashes/getSetHashes to map the defaulted array (referencing methods getLength, getBlockHashes, getSetHashes and DB_DATA_KEYS.sets).
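A minimal sketch of the `?? []` default — synchronous `Map`-backed stand-ins for the async IndexedDB helpers; the names mirror the real ones but are purely illustrative:

```typescript
// Stand-in for IndexedDB: readData returns undefined for never-written keys,
// exactly the first-run / post-reset situation the review describes.
function readData<T>(store: Map<string, unknown>, key: string): T | undefined {
  return store.get(key) as T | undefined;
}

function getLength(store: Map<string, unknown>): number {
  // Without `?? []` this reduce would throw a TypeError on first run.
  const data = readData<{ coins: unknown[] }[]>(store, 'sets') ?? [];
  return data.reduce((sum, set) => sum + set.coins.length, 0);
}
```

An empty store now yields `0` instead of crashing, and a populated store sums normally.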
packages/extension/src/libs/utils/updateAndSync/updateTagsSet.ts, lines 82-89 (1)

82-89: ⚠️ Potential issue | 🟡 Minor

`onComplete` fires after the first error, which may mislead callers.

`fireCompleteOnce()` is invoked from both the success and error branches, meaning an immediate network/DB failure on the very first run will trigger `onComplete` even though nothing has ever succeeded. If the UI uses "complete" as a gate to flip away from a "syncing" indicator, users will see a green/completed state despite a stale DB. Consider firing complete only on the first successful run, and letting `onError` drive the failure UX.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/libs/utils/updateAndSync/updateTagsSet.ts` around lines 82 - 89: the current flow calls fireCompleteOnce() from both success and error paths, causing onComplete to fire after the first failure. Change this so fireCompleteOnce() is invoked only when a successful run occurs (e.g., set or check a "hasSucceeded" flag inside the try-success path and call fireCompleteOnce() there) and remove the call from the catch block that handles options?.onError?.(error); keep isRunning = false and scheduleNext() in finally so loop behavior is unchanged. Ensure you update references in the same function (fireCompleteOnce, options?.onError, isRunning, scheduleNext) so only the first successful completion triggers onComplete.
packages/extension/src/providers/bitcoin/libs/firo-wallet/base-firo-wallet.ts, line 173 (1)

173-173: ⚠️ Potential issue | 🟡 Minor

Remove shipped debug log.

- console.log(`🔹 Total Spendable UTXOs: ${allUtxos.length}`);

The same applies to similar emoji-prefixed logs elsewhere in the file.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/providers/bitcoin/libs/firo-wallet/base-firo-wallet.ts` at line 173: remove the shipped debug console.log from base-firo-wallet (the line printing `🔹 Total Spendable UTXOs`) and any other emoji-prefixed console.log statements in this file. Locate occurrences in functions/methods such as where `allUtxos` is processed and delete those console.log calls, or replace them with proper structured logging (e.g., use the configured logger at debug/trace level) so no emoji debug prints remain in production code.
packages/extension/src/libs/utils/updateAndSync/updateCoinSet.ts, lines 70-72 (1)

70-72: ⚠️ Potential issue | 🟡 Minor

Remove debug logging before release.

These `console.log` calls run on every sync cycle and leak set metadata to the console.

- console.log('===>>>localCoinsCount', localCoinsCount);
- console.log('===>>>Meta for set', meta, localCoinsCount);

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/libs/utils/updateAndSync/updateCoinSet.ts` around lines 70 - 72: remove the two debug console.log statements that print localCoinsCount and meta inside updateCoinSet (the lines logging '===>>>localCoinsCount' and '===>>>Meta for set'); either delete them or replace them with a proper logger call at a debug/trace level (using the project's logging facility) so sensitive set metadata is not printed to stdout on every sync cycle, referencing the variables localCoinsCount and meta and the updateCoinSet function to locate the statements.
packages/extension/src/providers/bitcoin/libs/firo-wallet/base-firo-wallet.ts, lines 263-305 (1)

263-305: ⚠️ Potential issue | 🟡 Minor

`index = index * 1` is not an int cast, and unknown nodes return `undefined` silently.

- Line 264: `index * 1` only coerces a string operand to `Number`; it does not truncate floats as the comment claims. If the intent is integer coercion, use `Math.trunc(index)` (or `index | 0` for small ints). Otherwise drop the misleading line.
- The function quietly returns `undefined` for any `node` value other than `0` or `1` (falls off the end). Callers use non-null assertions downstream, so a bad `node` manifests as a runtime error far from the source. Add an explicit `throw` for unsupported nodes.

- async _getNodeAddressByIndex(node: number, index: number) {
-   index = index * 1; // cast to int
+ async _getNodeAddressByIndex(node: number, index: number) {
+   if (node !== EXTERNAL_INDEX && node !== INTERNAL_INDEX) {
+     throw new Error(`Unsupported node ${node}`);
+   }
+   index = Math.trunc(index);

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/extension/src/providers/bitcoin/libs/firo-wallet/base-firo-wallet.ts` around lines 263 - 305: the _getNodeAddressByIndex function currently uses a misleading "index = index * 1" cast and can silently return undefined for unsupported node values. Change the index handling to an explicit integer conversion (e.g. index = Math.trunc(Number(index)), validating it is finite and >= 0) and add an early guard that throws a clear Error if node is not EXTERNAL_INDEX or INTERNAL_INDEX; keep the existing logic using getXpub, bip32.fromBase58, _node0/_node1 and the external_addresses_cache/internal_addresses_cache, but ensure the function always either returns a cached/derived address or throws for invalid input instead of falling off the end.
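The coercion claim above is easy to verify: `* 1` converts strings to numbers but leaves fractional parts intact, while `Math.trunc(Number(...))` does both jobs. A sketch of the validated conversion (the function name and guard values are illustrative):

```typescript
// `'7' * 1` yields 7, but `3.9 * 1` is still 3.9 — multiplication is not an int cast.
function toDerivationIndex(index: number | string): number {
  const n = Math.trunc(Number(index));
  if (!Number.isFinite(n) || n < 0) {
    throw new Error(`Invalid index: ${index}`);
  }
  return n;
}
```

Throwing on non-finite or negative inputs surfaces bad callers at the source instead of as a non-null-assertion failure several frames later.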
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 78717869-9804-4756-865e-800a10d53529
⛔ Files ignored due to path filters (4)
packages/extension/src/libs/utils/wasmModule/spark.wasm is excluded by !**/*.wasm
packages/extension/src/libs/utils/wasmWorkerModule/spark.wasm is excluded by !**/*.wasm
packages/extension/src/providers/bitcoin/networks/icons/firo.svg is excluded by !**/*.svg
yarn.lock is excluded by !**/yarn.lock, !**/*.lock
📒 Files selected for processing (99)
.gitignore
README.md
package.json
packages/extension/.prettierignore
packages/extension/package.json
packages/extension/src/libs/background/index.ts
packages/extension/src/libs/keyring/keyring.ts
packages/extension/src/libs/keyring/public-keyring.ts
packages/extension/src/libs/spark-handler/constants.ts
packages/extension/src/libs/spark-handler/createTempFromSparkTx.ts
packages/extension/src/libs/spark-handler/createTempTx.ts
packages/extension/src/libs/spark-handler/generateSparkWallet.ts
packages/extension/src/libs/spark-handler/getFee.ts
packages/extension/src/libs/spark-handler/getMintTxData.ts
packages/extension/src/libs/spark-handler/getSerializedCoin.ts
packages/extension/src/libs/spark-handler/getSparkCoinInfo.ts
packages/extension/src/libs/spark-handler/getTotalMintedAmount.ts
packages/extension/src/libs/spark-handler/index.ts
packages/extension/src/libs/spark-handler/serializeMintContext.ts
packages/extension/src/libs/spark-handler/utils.ts
packages/extension/src/libs/utils/accounts.ts
packages/extension/src/libs/utils/initialize-wallet.ts
packages/extension/src/libs/utils/number-formatter.ts
packages/extension/src/libs/utils/updateAndSync/calculateCurrentSparkBalance.ts
packages/extension/src/libs/utils/updateAndSync/getSetsFromDb.ts
packages/extension/src/libs/utils/updateAndSync/handleCoinSetUpdates.ts
packages/extension/src/libs/utils/updateAndSync/markCoinsAsUsed.ts
packages/extension/src/libs/utils/updateAndSync/updateCoinSet.ts
packages/extension/src/libs/utils/updateAndSync/updateTagsSet.ts
packages/extension/src/libs/utils/wasm-loader.ts
packages/extension/src/libs/utils/wasm-worker-loader.ts
packages/extension/src/libs/utils/wasmModule/spark.js
packages/extension/src/libs/utils/wasmWorkerModule/spark.js
packages/extension/src/providers/bitcoin/libs/activity-handlers/index.ts
packages/extension/src/providers/bitcoin/libs/activity-handlers/providers/firo/index.ts
packages/extension/src/providers/bitcoin/libs/api-firo.ts
packages/extension/src/providers/bitcoin/libs/api.ts
packages/extension/src/providers/bitcoin/libs/electrum-client/abstract-electrum.ts
packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts
packages/extension/src/providers/bitcoin/libs/filter-ordinals.ts
packages/extension/src/providers/bitcoin/libs/firo-wallet/base-firo-wallet.ts
packages/extension/src/providers/bitcoin/libs/firo-wallet/configs.ts
packages/extension/src/providers/bitcoin/libs/firo-wallet/utils.ts
packages/extension/src/providers/bitcoin/libs/utils.ts
packages/extension/src/providers/bitcoin/networks/firo-testnet.ts
packages/extension/src/providers/bitcoin/networks/firo.ts
packages/extension/src/providers/bitcoin/networks/index.ts
packages/extension/src/providers/bitcoin/tests/bitcoin.address.derivation.test.ts
packages/extension/src/providers/bitcoin/types/bitcoin-network.ts
packages/extension/src/providers/bitcoin/types/index.ts
packages/extension/src/providers/bitcoin/ui/libs/signer.ts
packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue
packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue
packages/extension/src/providers/bitcoin/ui/send-transaction/verify-transaction/index.vue
packages/extension/src/providers/bitcoin/ui/types.ts
packages/extension/src/providers/ethereum/libs/transaction/types.ts
packages/extension/src/types/activity.ts
packages/extension/src/types/provider.ts
packages/extension/src/ui/action/App.vue
packages/extension/src/ui/action/components/accounts-header/components/header-accounts.vue
packages/extension/src/ui/action/components/accounts-header/index.vue
packages/extension/src/ui/action/components/app-menu/index.vue
packages/extension/src/ui/action/composables/async-computed.ts
packages/extension/src/ui/action/composables/get-spark-address.ts
packages/extension/src/ui/action/composables/synchronize-spark-state.ts
packages/extension/src/ui/action/composables/update-activity-state.ts
packages/extension/src/ui/action/db/indexedDB.ts
packages/extension/src/ui/action/main.ts
packages/extension/src/ui/action/router/index.ts
packages/extension/src/ui/action/types/account.ts
packages/extension/src/ui/action/utils/array-utils.ts
packages/extension/src/ui/action/utils/set-utils.ts
packages/extension/src/ui/action/views/accounts/components/accounts-list-item.vue
packages/extension/src/ui/action/views/accounts/index.vue
packages/extension/src/ui/action/views/deposit/index.vue
packages/extension/src/ui/action/views/lock-screen/components/lock-screen-forgot.vue
packages/extension/src/ui/action/views/lock-screen/index.vue
packages/extension/src/ui/action/views/modal-accounts/index.vue
packages/extension/src/ui/action/views/network-activity/components/network-activity-total.vue
packages/extension/src/ui/action/views/network-activity/components/network-activity-transaction.vue
packages/extension/src/ui/action/views/network-activity/components/synchronize-state.vue
packages/extension/src/ui/action/views/network-activity/index.vue
packages/extension/src/ui/action/views/network-assets/index.vue
packages/extension/src/ui/action/views/reset-wallet/index.vue
packages/extension/src/ui/action/views/send-process/components/send-process-account.vue
packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue
packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue
packages/extension/src/ui/action/workers/sparkCoinInfoWorker.ts
packages/extension/src/ui/onboard/App.vue
packages/extension/tsconfig.app.json
packages/extension/vite.config.ts
packages/keyring/package.json
packages/keyring/src/index.ts
packages/name-resolution/tests/resolver.test.ts
packages/name-resolution/tests/sid.test.ts
packages/name-resolution/tests/ud.test.ts
packages/types/src/networks.ts
case InternalMethods.unlock:
  const password = message?.params?.[0] as string;
  const mnemonic = await this.#keyring.getSavedMnemonic(password);
  this.#wallet.setSecret(mnemonic);
  return unlock(this.#keyring, message);
🧩 Analysis chain
🏁 Script executed:
cat -n packages/extension/src/libs/background/index.ts | sed -n '170,190p'

Repository: enkryptcom/enKrypt
Length of output: 1133
🏁 Script executed:
# Search for getSavedMnemonic and related keyring methods
rg -A 5 "getSavedMnemonic|getMnemonic" packages/extension/src --type ts
Length of output: 2257
🏁 Script executed:
# Find the unlock function implementation
rg -B 2 -A 10 "(?:export\s+)?(?:function|const)\s+unlock\s*\(" packages/extension/src --type ts
Length of output: 44
🏁 Script executed:
# Check InternalMethods enum
rg -A 20 "enum InternalMethods|InternalMethods\s*=" packages/extension/src --type ts
Length of output: 1663
🏁 Script executed:
# Look for other cases in the same switch statement to understand scope leaking
cat -n packages/extension/src/libs/background/index.ts | sed -n '160,210p'
Length of output: 2110
🏁 Script executed:
# Find the unlock function implementation
rg -B 2 -A 15 "export.*unlock\s*\(|const unlock\s*=" packages/extension/src --type ts
Length of output: 1694
🏁 Script executed:
# Check the MnemonicWithExtraWord type definition
rg -B 2 -A 5 "MnemonicWithExtraWord|interface.*Mnemonic" packages/extension/src --type ts
Length of output: 1237
🏁 Script executed:
# Check getSavedMnemonic implementation in private keyring
rg -B 3 -A 10 "getSavedMnemonic\s*\(" packages/extension/src/libs/keyring --type ts
Length of output: 1878
🏁 Script executed:
# Check for Firo-specific logic or wallet.setSecret usage
rg -B 3 -A 3 "setSecret|Firo" packages/extension/src/libs/background --type ts
Length of output: 2443
🏁 Script executed:
# Check Biome configuration
cat biome.json 2>/dev/null || cat .biome.json 2>/dev/null || rg "noSwitchDeclarations" . 2>/dev/null
Length of output: 1321
🏁 Script executed:
# Find the actual private keyring implementation
find packages/extension/src/libs/keyring -name "*.ts" -type f | head -20
Length of output: 162
🏁 Script executed:
# Search for the BaseKeyRing or private keyring class
rg -l "class.*Keyring" packages/extension/src/libs/keyring --type ts
Length of output: 44
🏁 Script executed:
# Get full unlock.ts to see error handling
cat -n packages/extension/src/libs/background/internal/unlock.ts
Length of output: 1288
🏁 Script executed:
# Search for getSavedMnemonic and getMnemonic actual implementations
rg -B 3 -A 12 "getSavedMnemonic\s*\(" packages/extension/src/libs/keyring/base-keyring.ts --type ts
Length of output: 153
🏁 Script executed:
# Get getMnemonic implementation
rg -B 3 -A 12 "getMnemonic\s*\(" packages/extension/src/libs/keyring/base-keyring.ts --type ts
Length of output: 153
🏁 Script executed:
# Look inside keyring.ts to see the private keyring implementation
cat -n packages/extension/src/libs/keyring/keyring.ts | head -100
Length of output: 3792
🏁 Script executed:
# Search for the actual keyring implementations
rg -l "class.*" packages/extension/src/libs/keyring --type ts
Length of output: 162
🏁 Script executed:
# Search for getSavedMnemonic implementation that throws on decrypt failure
rg -B 5 -A 15 "getSavedMnemonic" packages/extension/src/libs/keyring/keyring.ts --type ts
Length of output: 754
🏁 Script executed:
# Find where `#keyring` is instantiated in keyring.ts
rg -B 5 -A 5 "#keyring\s*=" packages/extension/src/libs/keyring/keyring.ts --type ts
Length of output: 403
🏁 Script executed:
# List all files in the keyring directory
ls -la packages/extension/src/libs/keyring/
Length of output: 304
🏁 Script executed:
# Search for other usages of getSavedMnemonic to understand its error behavior
rg -B 3 -A 3 "getSavedMnemonic" packages/extension/src --type ts | grep -A 5 -B 5 "try\|catch"
Length of output: 44
🏁 Script executed:
# Check if there are other places where getSavedMnemonic is called with error handling
rg -B 5 -A 5 "getSavedMnemonic" packages/extension/src --type ts
Length of output: 2859
🏁 Script executed:
# Verify the actual call flow - does the switch case need braces?
ast-grep --pattern 'case InternalMethods.unlock: { $$$}'
Length of output: 44
Unlock flow has several correctness/security regressions.
- Breaks error handling on wrong password. `getSavedMnemonic(password)` throws on decrypt failure, so the exception occurs before `unlock(...)` is reached. The existing `unlock` handler's `.catch()` (which translates wrong-password errors into a structured `InternalOnMessageResponse` with `error` set) is bypassed, and all callers receive a raw rejection instead of the documented response shape.
- Unconditionally decrypts and stores a raw mnemonic on every unlock, even for users who never touch Firo. This expands the extension's in-memory secret footprint and runs a decryption on every unlock just to support one network.
- Loses `extraWord`. `getSavedMnemonic` returns only the base mnemonic; `getMnemonic` is available and returns `MnemonicWithExtraWord`. Users who configured an extra word will have `#wallet.setSecret(mnemonic)` seeded inconsistently with the rest of the keyring's derivations.
- Switch-declaration scoping — flagged by Biome (`noSwitchDeclarations`): `const password` / `const mnemonic` leak into sibling cases.
Wrap the Firo setup in a try/catch (or only run it after unlock succeeds), gate it on the currently selected network being Firo, use getMnemonic and include the extra word, and wrap the case in a block.
🔧 Proposed fix sketch
- case InternalMethods.unlock:
- const password = message?.params?.[0] as string;
- const mnemonic = await this.#keyring.getSavedMnemonic(password);
- this.#wallet.setSecret(mnemonic);
- return unlock(this.#keyring, message);
+ case InternalMethods.unlock: {
+ const response = await unlock(this.#keyring, message);
+ if (!response.error) {
+ try {
+ const password = message?.params?.[0] as string;
+ const { mnemonic, extraWord } =
+ await this.#keyring.getMnemonic(password);
+ this.#wallet.setSecret(
+ extraWord ? `${mnemonic} ${extraWord}` : mnemonic,
+ );
+ } catch (_) {
+ // non-fatal: Firo wallet will be unavailable until re-unlock
+ }
+ }
+ return response;
🧰 Tools
🪛 Biome (2.4.11)
[error] 178-178: Other switch clauses can erroneously access this declaration.
Wrap the declaration in a block to restrict its access to the switch clause.
(lint/correctness/noSwitchDeclarations)
[error] 179-179: Other switch clauses can erroneously access this declaration.
Wrap the declaration in a block to restrict its access to the switch clause.
(lint/correctness/noSwitchDeclarations)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/extension/src/libs/background/index.ts` around lines 177 - 181, The
unlock case currently calls getSavedMnemonic(password) and sets
`#wallet.setSecret` before calling unlock, which leaks raw mnemonic, loses
extraWord, bypasses unlock's error translation, and breaks switch-scoped consts;
fix by enclosing the case body in its own block, first call
unlock(this.#keyring, message) and await its success (so its .catch() handles
wrong-password responses), then only if the selected network is Firo retrieve
the full MnemonicWithExtraWord via this.#keyring.getMnemonic(password) (not
getSavedMnemonic), call `#wallet.setSecret` with the mnemonic that includes
extraWord, and wrap the Firo setup in try/catch to avoid throwing unhandled
exceptions back to callers.
```ts
export async function sendFromSparkAddress(
  networkInfo: bitcoin.Network,
  to: string,
  amount: string,
): Promise<string | undefined> {
  let spendKeyObj = 0;
  let fullViewKeyObj = 0;
  let incomingViewKeyObj = 0;
  let senderAddressObj = 0;
  let recipientAddressObj = 0;
  let recipientsVector = 0;
  let privateRecipientsVector = 0;
  let coinsList = 0;
  let coverSetDataMap = 0;
  let idAndBlockHashesMap = 0;
  let result = 0;
  let deserializedCoinObj = 0;
  let coverSetData = 0;
  const diversifier = 1n;
  const isTestNetwork = networkInfo.bech32 === 'tb';
  const requestedAmount = getRequestedAmount(amount);
  const db = new IndexedDBHelper();
  const Module = await wasmInstance.getInstance();
  const cleanup = () => {
    if (spendKeyObj && spendKeyObj !== 0) {
      Module.ccall('js_freeSpendKey', null, ['number'], [spendKeyObj]);
    }
    if (fullViewKeyObj && fullViewKeyObj !== 0) {
      Module.ccall('js_freeFullViewKey', null, ['number'], [fullViewKeyObj]);
    }
    if (incomingViewKeyObj && incomingViewKeyObj !== 0) {
      Module.ccall(
        'js_freeIncomingViewKey',
        null,
        ['number'],
        [incomingViewKeyObj],
      );
    }
    if (senderAddressObj && senderAddressObj !== 0) {
      Module.ccall('js_freeAddress', null, ['number'], [senderAddressObj]);
    }
    if (recipientAddressObj && recipientAddressObj !== 0) {
      Module.ccall('js_freeAddress', null, ['number'], [recipientAddressObj]);
    }
    if (recipientsVector) {
      Module.ccall(
        'js_freeSparkSpendRecipientsVector',
        null,
        ['number'],
        [recipientsVector],
      );
    }
    if (coinsList) {
      Module.ccall('js_freeSparkSpendCoinsList', null, ['number'], [coinsList]);
    }
    if (coverSetDataMap && coverSetDataMap !== 0) {
      Module.ccall(
        'js_freeCoverSetDataMapForCreateSparkSpendTransaction',
        null,
        ['number'],
        [coverSetDataMap],
      );
    }
    if (privateRecipientsVector) {
      Module.ccall(
        'js_freeSparkSpendPrivateRecipientsVector',
        null,
        ['number'],
        [privateRecipientsVector],
      );
    }
    if (idAndBlockHashesMap) {
      Module.ccall(
        'js_freeIdAndBlockHashesMap',
        null,
        ['number'],
        [idAndBlockHashesMap],
      );
    }
    if (deserializedCoinObj) {
      Module._free(deserializedCoinObj);
    }
    if (result && result !== 0) {
      Module.ccall(
        'js_freeCreateSparkSpendTxResult',
        null,
        ['number'],
        [result],
      );
    }
    if (coverSetData && coverSetData !== 0) {
      Module.ccall('js_freeCoverSetData', null, ['number'], [coverSetData]);
    }
  };

  spendKeyObj = await getSpendKeyObj(Module);
  const isSparkTransaction = await isSparkAddress(to);

  const ownedCoins = ((await db.readData('myCoins')) || []) as OwnedCoinData[];

  const usedCoinsTags = await db.readData<{ tags: string[] }>(
    DB_DATA_KEYS.usedCoinsTags,
  );
  const coinsTagsSet = new Set(usedCoinsTags?.tags ?? []);
  const myCoinsTagsSet = new Set((ownedCoins ?? []).map(coin => coin.tag));

  const usedMyCoinsTagsSet = intersectSets(coinsTagsSet, myCoinsTagsSet);
  const revalidatedCoins = ownedCoins.filter(ownedCoin => {
    return !usedMyCoinsTagsSet.has(ownedCoin.tag);
  });

  const revalidatedUniqueCoins = Array.from(
    removeDuplicates(new Set(revalidatedCoins)),
  );

  const spendCoinList: SpendCoin[] = [];

  const keyObjects = await getIncomingViewKey(Module, spendKeyObj);

  incomingViewKeyObj = keyObjects.incomingViewKeyObj;
  fullViewKeyObj = keyObjects.fullViewKeyObj;

  senderAddressObj = Module.ccall(
    'js_getAddress',
    'number',
    ['number', 'number'],
    [incomingViewKeyObj, diversifier],
  );

  if (isSparkTransaction) {
    recipientAddressObj = Module.ccall(
      'js_decodeAddress',
      'number',
      ['string', 'number'],
      [to, Number(isTestNetwork)],
    );

    if (recipientAddressObj === 0) {
      throw new Error('Failed to decode Spark recipient address');
    }
  }

  // Create recipients vector for spend transaction
  recipientsVector = Module.ccall(
    'js_createRecipientsVectorForCreateSparkSpendTransaction',
    'number',
    ['number'],
    [1], // intended final size
  );

  if (!isSparkTransaction) {
    Module.ccall(
      'js_addRecipientForCreateSparkSpendTransaction',
      null,
      ['number', 'number', 'number'],
      [recipientsVector, BigInt(requestedAmount), 0],
    );
  }

  privateRecipientsVector = Module.ccall(
    'js_createPrivateRecipientsVectorForCreateSparkSpendTransaction',
    'number',
    ['number'],
    [1], // intended final size
  );

  coinsList = Module.ccall(
    'js_createCoinsListForCreateSparkSpendTransaction',
    'number',
    [],
    [],
  );

  const coinMetaPromiseList: Promise<void>[] = [];

  revalidatedUniqueCoins.forEach(ownedCoin => {
    const myCoinMetaData = getSparkCoinInfo({
      coin: ownedCoin.coin,
      fullViewKeyObj,
      incomingViewKeyObj,
      wasmModule: Module,
      keepMemory: true,
    })
      .then(data => {
        if (!data.isUsed) {
          spendCoinList.push({
            coin: ownedCoin.coin,
            setId: ownedCoin.setId,
            metadata: data.metaData,
            value: Number(data.value),
          });
        }
      })
      .catch(err => {
        console.error('Error getting spark coin info', err);
      });
    coinMetaPromiseList.push(myCoinMetaData);
  });

  await Promise.allSettled(coinMetaPromiseList);

  const selectedSpendCoins = selectSpendCoins(spendCoinList, requestedAmount);

  const selectedSpendAmount = selectedSpendCoins.reduce(
    (total, coin) => total + coin.value,
    0,
  );
  const privateRecipientAmount = isSparkTransaction
    ? requestedAmount
    : selectedSpendAmount - requestedAmount;

  if (!selectedSpendCoins.length || selectedSpendAmount < requestedAmount) {
    cleanup();
    throw new Error(
      `Not enough Spark balance to cover transaction amount. Requested ${requestedAmount}, selected ${selectedSpendAmount}.`,
    );
  }
```
No try/finally around WASM allocations — any throw before the explicit cleanup() leaks everything.
cleanup() is only invoked in three places: inside the insufficient-balance branch, inside the !result branch, and at the happy-path exit (Line 601). Any synchronous throw from the many Module.ccall invocations above (e.g. a failed js_decodeAddress for a malformed to, a null/invalid coin, a bad setId in getSetById, base64ToReversedHex on bad data at Line 483, etc.) will bubble up to the caller with every already-allocated WASM handle leaked — and because this path runs on user intent, users can retry several times before a page reload reclaims memory.
Wrap the body in try { ... } finally { cleanup(); } and let the throw propagate naturally; remove the manual cleanup() calls before throws.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/extension/src/libs/spark-handler/index.ts` around lines 103 - 319,
The function sendFromSparkAddress allocates many WASM handles (spendKeyObj,
fullViewKeyObj, incomingViewKeyObj, senderAddressObj, recipientsVector,
coinsList, etc.) but does not guarantee cleanup on synchronous throws; wrap the
main allocation/logic body of sendFromSparkAddress in a try { ... } finally {
cleanup(); } block so cleanup() always runs, remove the ad-hoc cleanup() calls
placed immediately before throws (e.g., the insufficient-balance branch and the
!result branch), and keep throws (like from Module.ccall('js_decodeAddress'),
getSparkCoinInfo, selectSpendCoins, base64ToReversedHex, etc.) propagating so
memory is not leaked even on errors.
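The try/finally shape the comment recommends can be sketched in isolation. Everything here (`withWasmHandles`, the numeric handles, the `free` callback) is a hypothetical stand-in for the real WASM glue, not Enkrypt's actual API:

```typescript
// Minimal sketch of the try/finally cleanup pattern, assuming a module
// where allocations return non-zero numeric handles that must each be
// released exactly once.
type Handle = number;

function withWasmHandles<T>(
  body: (alloc: () => Handle) => T,
  free: (h: Handle) => void,
): T {
  const handles: Handle[] = [];
  let next = 1;
  const alloc = () => {
    const h = next++;
    handles.push(h);
    return h;
  };
  try {
    // Any throw inside body still reaches the finally block below,
    // so every handle allocated so far is released.
    return body(alloc);
  } finally {
    for (const h of handles) free(h);
  }
}
```

Applied to `sendFromSparkAddress`, the body would allocate through the module exactly as it does today, and `cleanup()` would move into the single `finally` block.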
```ts
let deserializedCoinObj = 0;
let coverSetData = 0;
```
🧩 Analysis chain
🏁 Script executed:

```bash
#!/bin/bash
# Confirm deallocation functions exposed by the WASM glue for coins/cover-set data.
rg -nP 'js_(freeCoin|freeDeserializedCoin|freeCoverSetData|deserializeCoin)\b'
```

Repository: enkryptcom/enKrypt
Length of output: 1405

🏁 Script executed:

```bash
sed -n '119,121p' packages/extension/src/libs/spark-handler/index.ts
```

Length of output: 142

🏁 Script executed:

```bash
sed -n '180,210p' packages/extension/src/libs/spark-handler/index.ts
```

Length of output: 1006

🏁 Script executed:

```bash
sed -n '389,425p' packages/extension/src/libs/spark-handler/index.ts
```

Length of output: 1165

🏁 Script executed:

```bash
sed -n '430,465p' packages/extension/src/libs/spark-handler/index.ts
```

Length of output: 1145

🏁 Script executed:

```bash
cat -n packages/extension/src/libs/spark-handler/getSparkCoinInfo.ts | sed -n '140,155p'
```

Length of output: 647
Critical WASM memory leaks in loops — each iteration's allocations except the last are never freed.

Two outer-scope `let` variables (`deserializedCoinObj` at Line 119 and `coverSetData` at Line 120) are reassigned inside loops without freeing prior values. Only the final assignment persists to `cleanup()`, leaving N−1 allocations leaked per function call:

- `deserializedCoinObj = Module.ccall('js_deserializeCoin', ...)` (Line 405) runs once per coin in the `forEach` loop (Lines 389–421). All but the last allocation leak.
- `coverSetData = Module.ccall('js_createCoverSetData', ...)` (Line 436) runs once per set in the `for` loop (Lines 430–460). All but the last allocation leak.

Three temporary `_malloc`'d pointers are never freed:

- `serializedCoinPointer` (Line 394)
- `serialContextPointer` (Line 401)
- `coverSetRepresentationPointer` (Line 431)

These should be freed immediately after their WASM calls complete (as shown in getSparkCoinInfo.ts lines 144–145), not deferred to cleanup.

Additionally, there is no try/finally wrapping the loops. Any exception during iteration leaves WASM state permanently leaked.

Use `js_freeCoin` (not `Module._free`) as the deallocator for deserialized coins, matching the pattern in getSparkCoinInfo.ts:147.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/extension/src/libs/spark-handler/index.ts` around lines 119 - 120,
The loops leak WASM allocations because outer-scope vars deserializedCoinObj and
coverSetData are overwritten without freeing previous values and three malloc'd
pointers (serializedCoinPointer, serialContextPointer,
coverSetRepresentationPointer) are never freed; fix by freeing each pointer
immediately after the corresponding Module.ccall returns
(Module._free(serializedCoinPointer), Module._free(serialContextPointer),
Module._free(coverSetRepresentationPointer)), free the previous
deserializedCoinObj before reassigning using js_freeCoin(deserializedCoinObj)
(not Module._free), free coverSetData with the appropriate deallocator before
reassigning, and wrap the affected loops (the forEach that calls
js_deserializeCoin and the for loop that calls js_createCoverSetData) in
try/finally so any thrown exception still runs cleanup for the most recent
allocations and avoids leaks.
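The free-before-reassign pattern the comment describes can be sketched generically. `deserialize` and `freeCoin` below are hypothetical stand-ins for `js_deserializeCoin` and `js_freeCoin`, and the numeric handles model WASM pointers:

```typescript
// Sketch: release the previous iteration's handle before overwriting the
// shared variable, and use finally so the last handle is freed even when
// an iteration throws.
function deserializeAll(
  coins: string[],
  deserialize: (c: string) => number,
  freeCoin: (h: number) => void,
): number[] {
  const values: number[] = [];
  let handle = 0;
  try {
    for (const coin of coins) {
      // Without this, only the final allocation would ever be freed.
      if (handle) freeCoin(handle);
      handle = deserialize(coin);
      values.push(handle);
    }
  } finally {
    if (handle) freeCoin(handle);
  }
  return values;
}
```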
```ts
const unusedCoins = updatedCoins.filter(el => !el.isUsed);

const mintIds = unusedCoins.map(unusedCoin => unusedCoin.coin[1]);

const txIdsDecoded = mintIds
  .map(base64 => {
    try {
      return base64ToReversedHex(base64);
    } catch {
      return '';
    }
  })
  .filter(Boolean);

console.log('===>>>Unused Spark Coins TX IDs:', txIdsDecoded);

if (!!txIdsDecoded?.length) {
  const results = await firoElectrum.multiGetTransactionByTxid(txIdsDecoded);

  unusedTxDetails = txIdsDecoded
    .map((txId, index) => {
      const tx = results[txId];

      if (!tx) {
        console.warn(`Transaction details not found for txId: ${txId}`);
        return null;
      }

      const coin = unusedCoins[index];
      const coinValue = coin?.value;

      if (coinValue === undefined) {
        console.warn(
          `Coin value not found for txId: ${txId}, index: ${index}`,
        );

        return {
          ...tx,
          value: '0',
        };
      }

      return {
        ...tx,
        value: Number(coinValue),
      };
    })
    .filter((tx): tx is SparkUnusedTxDetails => tx !== null);
```
Index misalignment after .filter(Boolean) causes wrong coin values to be attached to transactions.
txIdsDecoded is produced by mapping over mintIds (parallel to unusedCoins) and then .filter(Boolean) is applied. When any base64 decode fails and returns '', the filter drops entries and re-packs the array, so the index passed into the subsequent .map((txId, index) => ...) no longer corresponds to unusedCoins[index]. As a result, coinValue (and therefore the reported balance/value attached to each returned tx) gets misattributed — silently, because the try/catch swallows decode errors.
Also, the returned object in the coinValue === undefined branch sets value: '0' (string), but SparkUnusedTxDetails declares value: number, so consumers receive a type-violating field.
🐛 Proposed fix – keep coin alignment by filtering and mapping in one pass

```diff
-  const mintIds = unusedCoins.map(unusedCoin => unusedCoin.coin[1]);
-
-  const txIdsDecoded = mintIds
-    .map(base64 => {
-      try {
-        return base64ToReversedHex(base64);
-      } catch {
-        return '';
-      }
-    })
-    .filter(Boolean);
-
-  console.log('===>>>Unused Spark Coins TX IDs:', txIdsDecoded);
-
-  if (!!txIdsDecoded?.length) {
-    const results = await firoElectrum.multiGetTransactionByTxid(txIdsDecoded);
-
-    unusedTxDetails = txIdsDecoded
-      .map((txId, index) => {
-        const tx = results[txId];
-
-        if (!tx) {
-          console.warn(`Transaction details not found for txId: ${txId}`);
-          return null;
-        }
-
-        const coin = unusedCoins[index];
-        const coinValue = coin?.value;
-
-        if (coinValue === undefined) {
-          console.warn(
-            `Coin value not found for txId: ${txId}, index: ${index}`,
-          );
-
-          return {
-            ...tx,
-            value: '0',
-          };
-        }
-
-        return {
-          ...tx,
-          value: Number(coinValue),
-        };
-      })
-      .filter((tx): tx is SparkUnusedTxDetails => tx !== null);
-  }
+  const decoded = unusedCoins
+    .map(coin => {
+      try {
+        return { coin, txId: base64ToReversedHex(coin.coin[1]) };
+      } catch {
+        return null;
+      }
+    })
+    .filter((x): x is { coin: typeof unusedCoins[number]; txId: string } => !!x && !!x.txId);
+
+  if (decoded.length) {
+    const txIds = decoded.map(d => d.txId);
+    const results = await firoElectrum.multiGetTransactionByTxid(txIds);
+
+    unusedTxDetails = decoded
+      .map(({ coin, txId }) => {
+        const tx = results[txId];
+        if (!tx) return null;
+        return { ...tx, value: Number(coin.value ?? 0) };
+      })
+      .filter((tx): tx is SparkUnusedTxDetails => tx !== null);
+  }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/extension/src/libs/utils/updateAndSync/markCoinsAsUsed.ts` around
lines 41 - 88, The txIdsDecoded filtering breaks alignment with unusedCoins
causing wrong coin values to be attached; change the pipeline to produce a
single filtered array of pairs (e.g., {txId, coin}) by decoding each mintId
alongside its corresponding unusedCoins entry (decode each base64 with
base64ToReversedHex, skip and optionally log failures), pass the list of txIds
from that paired array into firoElectrum.multiGetTransactionByTxid, and then map
results back using the paired coin so the right coin.value is used; also ensure
the value field on the returned SparkUnusedTxDetails is a number (use 0, not the
string '0') and remove the separate mintIds/txIdsDecoded filtering that causes
index misalignment.
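The misalignment is easy to reproduce with plain arrays. `decode` below is a hypothetical stand-in for `base64ToReversedHex` that throws on one input; the types are simplified from the real coin shape:

```typescript
// Demonstrates why filter-then-index breaks pairing, and the paired
// alternative the review proposes.
type Coin = { tag: string; value: number };

function decode(tag: string): string {
  if (tag === 'bad') throw new Error('invalid base64');
  return `tx-${tag}`;
}

// Broken: after .filter(Boolean) repacks the array, index i no longer
// corresponds to coins[i].
function misaligned(coins: Coin[]): Array<{ txId: string; value: number }> {
  const txIds = coins
    .map(c => { try { return decode(c.tag); } catch { return ''; } })
    .filter(Boolean);
  return txIds.map((txId, i) => ({ txId, value: coins[i].value }));
}

// Fixed: decode result and coin travel together, so values stay attached.
function paired(coins: Coin[]): Array<{ txId: string; value: number }> {
  return coins.flatMap(c => {
    try { return [{ txId: decode(c.tag), value: c.value }]; }
    catch { return []; }
  });
}
```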
```ts
const updatedCoinsSet = differenceSets(
  new Set(localSets?.[index]?.coins ?? []),
  new Set(newCoins),
);

localSets[index] = {
  blockHash: remoteMeta.blockHash,
  setHash: remoteMeta.setHash,
  coins: [...updatedCoinsSet, ...(localSets?.[index]?.coins ?? [])],
};

if (!localSets[index] || isFullReplacement) {
  await db.saveData(DB_DATA_KEYS.sets, localSets);
} else {
  // await db.appendSetData(DB_DATA_KEYS.sets, index, {
  //   ...localSets[index],
  // });
}
```
Two defects in the incremental update path: broken dedup and dropped persistence.

- `string[][]` items put into a `Set` are compared by reference, not value, so `differenceSets(new Set(localCoins), new Set(newCoins))` will never consider any array "equal" and will always yield all of `newCoins`. Combined with line 154 prepending them to the existing `coins`, duplicates will accumulate on every sync.
- On line 157, `localSets[index]` has just been assigned on line 151, so `!localSets[index]` is always falsy. That means only `isFullReplacement === true` ever persists to IndexedDB — the entire incremental branch is a no-op (the actual persist call is commented out on lines 160-162). Incremental updates are effectively lost across sessions.
🛠️ Suggested direction

Dedup by the canonical coin identifier (e.g. `coin[0]`) using a `Set<string>`, and unconditionally persist:

```diff
-  const updatedCoinsSet = differenceSets(
-    new Set(localSets?.[index]?.coins ?? []),
-    new Set(newCoins),
-  );
-
-  localSets[index] = {
-    blockHash: remoteMeta.blockHash,
-    setHash: remoteMeta.setHash,
-    coins: [...updatedCoinsSet, ...(localSets?.[index]?.coins ?? [])],
-  };
-
-  if (!localSets[index] || isFullReplacement) {
-    await db.saveData(DB_DATA_KEYS.sets, localSets);
-  } else {
-    // await db.appendSetData(DB_DATA_KEYS.sets, index, {
-    //   ...localSets[index],
-    // });
-  }
+  const existingCoins = localSets?.[index]?.coins ?? [];
+  const existingHashes = new Set(existingCoins.map(c => c?.[0]));
+  const trulyNew = isFullReplacement
+    ? newCoins
+    : newCoins.filter(c => !existingHashes.has(c?.[0]));
+
+  localSets[index] = {
+    blockHash: remoteMeta.blockHash,
+    setHash: remoteMeta.setHash,
+    coins: isFullReplacement ? newCoins : [...trulyNew, ...existingCoins],
+  };
```

Persist once after the Promise.all finishes instead of inside each task (see separate comment on the race).
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/extension/src/libs/utils/updateAndSync/updateCoinSet.ts` around
lines 146 - 163, The incremental-update path is broken: differenceSets(new
Set(localCoins), new Set(newCoins)) compares array references so it never dedups
and duplicates accumulate, and the persistence branch is effectively skipped
because localSets[index] is always truthy and appendSetData is commented out.
Fix by computing deduped new entries using a Set of canonical coin IDs (e.g.
coin[0]) instead of Set<string[]>—for example build a Set<string> of existing
coin IDs from localSets[index].coins, filter newCoins to only those whose
coin[0] is not present, then merge (prepend or append per desired order) without
duplicates into localSets[index].coins; remove or uncomment the appendSetData
code and ensure you call db.saveData(DB_DATA_KEYS.sets, localSets)
unconditionally (or persist once after the surrounding Promise.all) so
incremental updates are actually written. Ensure you update references in the
code around updatedCoinsSet, differenceSets, newCoins, localSets,
isFullReplacement, appendSetData and db.saveData accordingly.
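The reference-equality pitfall can be demonstrated directly. `mergeCoins` is a hypothetical sketch of the dedup-by-`coin[0]` direction, not the real `updateCoinSet` code:

```typescript
// Arrays are Set members by reference: two arrays with identical
// contents are still distinct members, so Set<string[]> never dedups.
const a: string[] = ['hash1', 'txid1'];
const b: string[] = ['hash1', 'txid1'];

const byReference = new Set([a]);
const referenceHit = byReference.has(b); // false: same content, different reference

// Dedup by a canonical identifier instead (coin[0] in the review):
function mergeCoins(existing: string[][], incoming: string[][]): string[][] {
  const seen = new Set(existing.map(c => c[0]));
  const fresh = incoming.filter(c => !seen.has(c[0]));
  // New coins are prepended, mirroring the original merge order.
  return [...fresh, ...existing];
}
```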
```ts
try {
  let signedTx;
  if (network.value.name === NetworkNames.Firo) {
    signedTx = await FiroTransactionSigner({
      account: account.value!,
      network: network.value as BitcoinNetwork,
      payload: JSON.parse(txData.TxInfo),
    });
  } else {
    signedTx = await TransactionSigner({
      account: account.value!,
      network: network.value as BitcoinNetwork,
      payload: JSON.parse(txData.TxInfo),
    });
  }

  api
    .broadcastTx(signedTx.toHex())
    .then(() => {
      trackSendEvents(SendEventType.SendComplete, {
        network: network.value.name,
      });
      activityState.addActivities(
        [
          {
            ...txActivity,
            ...{ transactionHash: signedTx.getId() },
          },
        ],
        {
          address: network.value.displayAddress(txData.fromAddress),
          network: network.value.name,
        },
      );
      isSendDone.value = true;
      if (getCurrentContext() === 'popup') {
        setTimeout(() => {
          isProcessing.value = false;
          callToggleRate();
          router.go(-2);
        }, 4500);
      } else {
        setTimeout(() => {
          isProcessing.value = false;
          callToggleRate();
          window.close();
        }, 1500);
      }
    })
    .catch(error => {
      trackSendEvents(SendEventType.SendFailed, {
        network: network.value.name,
        error: error.message,
      });
      txActivity.status = ActivityStatus.failed;
      activityState.addActivities([txActivity], {
        address: txData.fromAddress,
        network: network.value.name,
      });
      console.error('ERROR', error);
    });
} catch (error) {
  isProcessing.value = false;
  console.error('error', error);
  errorMsg.value = JSON.stringify(error);
}
```
Broadcast failure leaves the UI stuck in the "processing" state.

The `.catch` handler on `api.broadcastTx(signedTx.toHex())` (lines 219-230) records the failed activity and logs, but never sets `isProcessing.value = false`. Because `api.broadcastTx(...)` is not awaited inside the outer try block, a rejection there does not propagate to the outer catch at line 231 either. Net effect: when broadcast fails, `<send-process v-if="isProcessing" ...>` remains mounted indefinitely with no "done" signal and no path back — the user has to close the window.

Two fixes:

- Reset `isProcessing` (and surface `errorMsg`) inside the broadcast `.catch`, or `await` the broadcast so the outer `catch` handles it uniformly.
- Line 234: `JSON.stringify(error)` returns `"{}"` for native `Error` instances because `message`/`stack` are non-enumerable. Use `error?.message ?? String(error)` so the user sees something actionable.
Proposed fix

```diff
-      api
-        .broadcastTx(signedTx.toHex())
-        .then(() => {
+      try {
+        await api.broadcastTx(signedTx.toHex());
         trackSendEvents(SendEventType.SendComplete, {
           network: network.value.name,
         });
         activityState.addActivities(
-          [
-            {
-              ...txActivity,
-              ...{ transactionHash: signedTx.getId() },
-            },
-          ],
+          [{ ...txActivity, transactionHash: signedTx.getId() }],
           {
             address: network.value.displayAddress(txData.fromAddress),
             network: network.value.name,
           },
         );
         isSendDone.value = true;
-        if (getCurrentContext() === 'popup') {
-          setTimeout(() => {
-            isProcessing.value = false;
-            callToggleRate();
-            router.go(-2);
-          }, 4500);
-        } else {
-          setTimeout(() => {
-            isProcessing.value = false;
-            callToggleRate();
-            window.close();
-          }, 1500);
-        }
-      })
-      .catch(error => {
+        // ...existing timeout logic...
+      } catch (error: any) {
         trackSendEvents(SendEventType.SendFailed, {
           network: network.value.name,
-          error: error.message,
+          error: error?.message,
         });
         txActivity.status = ActivityStatus.failed;
         activityState.addActivities([txActivity], {
           address: txData.fromAddress,
           network: network.value.name,
         });
-        console.error('ERROR', error);
-      });
-    } catch (error) {
-      isProcessing.value = false;
-      console.error('error', error);
-      errorMsg.value = JSON.stringify(error);
+        console.error('ERROR', error);
+        isProcessing.value = false;
+        errorMsg.value = error?.message ?? String(error);
+      }
+    } catch (error: any) {
+      isProcessing.value = false;
+      console.error('error', error);
+      errorMsg.value = error?.message ?? String(error);
     }
```
Verify each finding against the current code and only fix it if needed.
In
`@packages/extension/src/providers/bitcoin/ui/send-transaction/verify-transaction/index.vue`
around lines 170 - 235, The broadcast rejection handler never clears the
processing UI and the outer catch won't catch it because api.broadcastTx(...)
isn't awaited; update the code around api.broadcastTx(signedTx.toHex()) so that
either the broadcast is awaited (await api.broadcastTx(...)) letting the outer
try/catch handle failures, or inside the .catch guarantee you set
isProcessing.value = false and populate errorMsg.value (use error?.message ??
String(error) instead of JSON.stringify(error)) and preserve existing activity
tracking and trackSendEvents; reference TransactionSigner/FiroTransactionSigner,
api.broadcastTx, isProcessing, errorMsg, and txActivity when making the change.
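The `JSON.stringify` behavior called out above is standard JavaScript and easy to verify. `toUserMessage` is a hypothetical helper mirroring the suggested fallback, not code from the PR:

```typescript
// Error's message/stack properties are own but non-enumerable, so
// JSON.stringify serializes a native Error to "{}".
const err = new Error('broadcast rejected');

const viaStringify = JSON.stringify(err); // "{}"

// Suggested fallback: prefer .message, degrade to String() for non-Errors.
function toUserMessage(e: unknown): string {
  return (e as { message?: string })?.message ?? String(e);
}
```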
```ts
const onSelectedAddressChanged = async (newAccount: EnkryptAccount) => {
  const defaultAddress = await getSparkAddress(
    currentNetwork,
    newAccount.address,
  );

  if (defaultAddress && accountHeaderData.value.sparkAccount) {
    accountHeaderData.value.sparkAccount['defaultAddress'] = defaultAddress;
  }
```
getSparkAddress is being passed a Ref, not the unwrapped network.
currentNetwork here is a ref<BaseNetwork>, but getSparkAddress is called elsewhere in this file (getSparkAccountState, setNetwork) with the unwrapped network value. Passing the ref object will almost certainly break the function (it expects network.name, network.displayAddress(...) etc. on the network itself, not on a Ref). This likely prevents defaultAddress from being resolved on address change for the Firo account.
🐛 Proposed fix

```diff
 const onSelectedAddressChanged = async (newAccount: EnkryptAccount) => {
   const defaultAddress = await getSparkAddress(
-    currentNetwork,
+    currentNetwork.value,
     newAccount.address,
   );
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
const onSelectedAddressChanged = async (newAccount: EnkryptAccount) => {
  const defaultAddress = await getSparkAddress(
    currentNetwork.value,
    newAccount.address,
  );

  if (defaultAddress && accountHeaderData.value.sparkAccount) {
    accountHeaderData.value.sparkAccount['defaultAddress'] = defaultAddress;
  }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/extension/src/ui/action/App.vue` around lines 543 - 552,
onSelectedAddressChanged passes the Ref currentNetwork into getSparkAddress
causing it to receive a Ref instead of a BaseNetwork; change the call in
onSelectedAddressChanged to pass the unwrapped network (e.g.
currentNetwork.value or use unref(currentNetwork)) so getSparkAddress can access
network.name and displayAddress correctly; update any nearby calls in
onSelectedAddressChanged that expect the resolved defaultAddress to ensure
accountHeaderData.value.sparkAccount['defaultAddress'] is set using the result.
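A stripped-down model shows why passing the wrapper breaks property access. `BaseNetworkLike` and `RefLike` are hypothetical stand-ins for Enkrypt's `BaseNetwork` and Vue's `Ref`; no Vue runtime is involved:

```typescript
// A ref wraps the network in { value }, so reading .name on the wrapper
// itself yields undefined.
interface BaseNetworkLike {
  name: string;
  displayAddress(a: string): string;
}
type RefLike<T> = { value: T };

const firo: BaseNetworkLike = {
  name: 'firo',
  displayAddress: a => a,
};
const currentNetwork: RefLike<BaseNetworkLike> = { value: firo };

function getNetworkName(network: BaseNetworkLike): string | undefined {
  return network.name;
}

// Passing the ref object: .name is not a property of the wrapper.
const wrong = getNetworkName(currentNetwork as unknown as BaseNetworkLike);
// Passing the unwrapped value works as intended.
const right = getNetworkName(currentNetwork.value);
```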
```diff
@@ -1,62 +1,113 @@
+packages/extension/src/ui/action/views/network-activity/components/network-activity-total.vue
```
Remove the stray filename from the SFC body.
If this line is present in the actual .vue file, the component will fail to compile before <template>.
Proposed fix
```diff
-packages/extension/src/ui/action/views/network-activity/components/network-activity-total.vue
-
 <template>
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```vue
<template>
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In
`@packages/extension/src/ui/action/views/network-activity/components/network-activity-total.vue`
at line 1, There is a stray filename string sitting before the component's
<template> in network-activity-total.vue causing the SFC to fail compile; open
network-activity-total.vue, remove the stray filename line that appears above
the <template> tag (or any text outside the <template>, <script>, <style>
blocks), then save so the file begins with a valid SFC block (e.g., <template>)
and the component (network-activity-total) will compile.
```ts
const selectedNetwork: string = route.query.id as string;
const txData: VerifyTransactionParams = JSON.parse(
  Buffer.from(route.query.txData as string, 'base64').toString('utf8'),
);
```
Unhandled JSON.parse on route.query.txData — a malformed URL crashes the view at setup time.
Buffer.from(route.query.txData as string, 'base64').toString('utf8') followed by JSON.parse is executed at script-setup time with no try/catch. If txData is missing, not valid base64, or not valid JSON (e.g. user hits refresh, state lost, or deep link is tampered with), the component throws before mount and the user is stuck on a broken view with no recourse. Wrap it and fail gracefully (navigate back / show error).
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In
`@packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue`
around lines 113 - 116, Wrap the base64 decode + JSON.parse of
route.query.txData (where txData: VerifyTransactionParams is currently created
alongside selectedNetwork) in a try/catch and handle failures gracefully;
specifically, validate that route.query.txData exists, attempt Buffer.from(...,
'base64').toString('utf8') and JSON.parse inside the try block, and on any error
(missing, invalid base64, or invalid JSON) either programmatically navigate back
or set a component-level error state to show a user-friendly message instead of
letting the component throw during setup; update any downstream logic that
relies on txData (the code using txData / VerifyTransactionParams) to handle the
error state.
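One way to satisfy this, sketched with an assumed helper name (`parseTxData`) rather than the component's real code:

```typescript
// Guarded decode: returns null for a missing query param, invalid
// base64, or invalid JSON instead of throwing during component setup.
function parseTxData<T>(raw: unknown): T | null {
  if (typeof raw !== 'string' || raw.length === 0) return null;
  try {
    const json = Buffer.from(raw, 'base64').toString('utf8');
    return JSON.parse(json) as T;
  } catch {
    return null;
  }
}
```

Component setup can then branch: on `null`, navigate back or render an error state instead of crashing before mount.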
```diff
     "@polkadot/util": "^14.0.3",
     "assert": "^2.1.0",
-    "bip39": "^3.1.0"
+    "bip32": "2.0.6"
```
🧩 Analysis chain
🏁 Script executed:

```bash
#!/bin/bash
# Description: Check for remaining bip39 imports and inspect bip32 usage/import style.
rg -n -C4 "from ['\"]bip39['\"]|require\(['\"]bip39['\"]\)|from ['\"]bip32['\"]|require\(['\"]bip32['\"]\)|BIP32Factory|fromSeed" --glob '!node_modules/**'
fd -i '^package\.json$' --exec sh -c '
  file="$1"
  printf "\n--- %s ---\n" "$file"
  jq -r '"'"'.name as $name | "name=\($name // "<root>") deps.bip39=\(.dependencies.bip39 // "-") deps.bip32=\(.dependencies.bip32 // "-")"'"'"' "$file"
' sh {}
```

Repository: enkryptcom/enKrypt
Length of output: 13147
Add bip39 back to @enkryptcom/keyring dependencies.
The keyring package still imports bip39 functions at line 16 (entropyToMnemonic, generateMnemonic, mnemonicToEntropy) but the dependency was removed from package.json. This will fail under Yarn PnP because the import cannot be resolved. The bip32@2.0.6 usage (namespace import with fromSeed() and derivePath()) is compatible.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/keyring/package.json` at line 35, Restore the missing bip39
dependency in package.json so imports of entropyToMnemonic, generateMnemonic,
and mnemonicToEntropy resolve under Yarn PnP; add an appropriate bip39 entry
(e.g., "bip39": "3.0.4" or the project’s pinned version) to the dependencies and
run yarn install to ensure the package is available for the code that imports
those functions.
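Assuming the project keeps the caret range shown in the removed diff line, the restored `dependencies` fragment of `packages/keyring/package.json` would look roughly like this (a sketch, not the full file):

```json
{
  "dependencies": {
    "@polkadot/util": "^14.0.3",
    "assert": "^2.1.0",
    "bip32": "2.0.6",
    "bip39": "^3.1.0"
  }
}
```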
Add Firo network support (transparent + Spark private addresses)
Summary
This PR adds full support for the Firo network to Enkrypt, including both the
standard transparent UTXO wallet and the Spark privacy protocol (private
addresses, mints and spends). It also adds a Firo-specific Electrum client,
activity handler, synchronization flow for the Spark anonymity set, and all of
the UI needed to deposit, send and view balances on both address types.
No changes are made to existing networks; everything is additive under the
Bitcoin-family provider and a new
`spark-handler` library.

What's included
Network / chain integration
- `firo` (mainnet) and `firo-testnet`, added under the Bitcoin provider with icon, config and test coverage.
- Firo Electrum client (`electrum-client.ts`, `abstract-electrum.ts`) used for transparent UTXOs, anonymity-set sync, and broadcasting.
- API provider (`api-firo.ts`) and activity handler (`activity-handlers/providers/firo`).

Transparent wallet
- `firo-wallet/base-firo-wallet.ts` plus configs/utils implement Firo-flavoured transparent address derivation, UTXO selection and signing.
- Send tab (`tabs/transparent-send-tab.vue`) and updated `verify-transaction` screen.

Spark (private) support
- `spark-handler/` library wrapping the Spark WASM module: `generateSparkWallet`, `getMintTxData`, `getSparkCoinInfo`, `getFee`, `getSerializedCoin`, `getTotalMintedAmount`, `createTempTx`, `createTempFromSparkTx`, `serializeMintContext`.
- WASM module (`libs/utils/wasmModule/spark.{js,wasm}`) and a worker copy (`wasmWorkerModule/`) with `wasm-loader.ts` / `wasm-worker-loader.ts`.
- `sparkCoinInfoWorker.ts` to decode coin info off the main thread.
- Sync helpers in `libs/utils/updateAndSync/`: `updateCoinSet`, `updateTagsSet`, `handleCoinSetUpdates`, `markCoinsAsUsed`, `calculateCurrentSparkBalance`, `getSetsFromDb`.
- IndexedDB layer (`ui/action/db/indexedDB.ts`) persisting coins, tags and the last scanned coin-set id so re-opening the wallet resumes incrementally instead of re-scanning.
- Supporting libs: `synchronize-spark-state`, `get-spark-address`, `update-activity-state`, `async-computed`.

UI
- Spark send tab (`tabs/spark-send-tab.vue`) and Spark address input (`send-spark-address-input.vue`) with support for both Spark→Spark and Spark→transparent sends.
- Verify views (`verify-send-from-spark-transaction`, `verify-send-to-spark-transaction`).
- `synchronize-state.vue` progress UI shown while the anonymity set syncs.
- The deposit-view QR code now encodes `Firo:{address}`.
- Accounts list shows Spark addresses; non-compatible accounts are filtered out on Firo.

Keyring / infrastructure
- `packages/keyring` bumped and extended with Firo signer hooks.
- `public-keyring/keyring.ts` updated to expose Firo key material.
- `tsconfig.app.json` updated to bundle the WASM assets; new `.prettierignore` entry to skip the generated `spark.js`.

Notes for reviewers
A few things are worth flagging explicitly so they are not mistaken for bugs:
- The main wallet view displays only the transparent FIRO balance; it does not aggregate transparent + private (Spark) balance. The private balance is visible inside the Firo-specific views (send flow, deposit view, activity screen) once the anonymity set has synced.
- Spark→Spark transactions are rendered with the "swapped" activity icon/label. This is a limitation of Enkrypt's activity UI (there is no native "shielded transfer" category to map to); we were advised to accept this rather than patch the shared activity types.
- The first sync downloads the full Spark coin set and tag set, decodes coins in a worker, and writes them to IndexedDB. The `synchronize-state.vue` modal reflects progress. Subsequent unlocks resume from the last stored coin-set id.
- The WASM interop manages buffer lifetimes manually (`getMintTxData`, `getSparkCoinInfo`, and the `serialContext` buffer allocation). If you see review questions about manual `_free` calls, that is why.
- Other Bitcoin-family networks (BTC, LTC, DOGE, …) continue to use the existing send-transaction view and are unaffected.
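The manual buffer-lifetime pattern mentioned in the notes above can be sketched as follows. `SparkModule` and `withWasmBuffer` are illustrative stand-ins (the `_malloc`/`_free`/`HEAPU8` names follow Emscripten conventions, not necessarily the PR's exact API); the key invariant, the same one behind the `serialContext` fix, is that the allocation size must come from the buffer actually being written:

```typescript
// Stand-in interface for an Emscripten-style WASM module (assumption,
// not the real spark.js exports).
interface SparkModule {
  _malloc(size: number): number;
  _free(ptr: number): void;
  HEAPU8: Uint8Array;
}

// Copies `data` into WASM heap memory sized by data.length (never by some
// other buffer's length), runs `fn`, and always frees the allocation.
function withWasmBuffer<T>(
  module: SparkModule,
  data: Uint8Array,
  fn: (ptr: number, len: number) => T,
): T {
  const ptr = module._malloc(data.length); // size taken from the data itself
  try {
    module.HEAPU8.set(data, ptr); // safe: allocation matches data.length
    return fn(ptr, data.length);
  } finally {
    module._free(ptr); // manual lifetime management, hence the _free calls
  }
}
```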
Test plan
- … and shown in the accounts list and deposit view.