Cannot import into BitwardenRS
-
I have an existing vault on Bitwarden.com.
It exports OK.
The import into my Cloudron instance of BitwardenRS fails repeatedly for all of the .json, .csv and encrypted .json formats.
No details are given, just "unexpected error".
It is difficult to check 60k lines of JSON for format errors, but it's coming from Bitwarden to Bitwarden, so I would not expect that "otherwise obvious" issue. Thoughts, all you clever people?
-
@timconsidine 60k lines of JSON indicates some larger set of data. Is it possible that you need to temporarily give the app more memory during the import?
-
@nebulon said in Cannot import into BitwardenRS:
@timconsidine 60k lines of JSON indicates some larger set of data. Is it possible that you need to temporarily give the app more memory during the import?
Ah, good point. Thank you.
-
@nebulon said in Cannot import into BitwardenRS:
@timconsidine 60k lines of JSON indicates some larger set of data. Is it possible that you need to temporarily give the app more memory during the import?
In case this helps someone else:
I increased the app's memory to 8GB and set the utilisation limit to 75%. The import still failed.
I split the .json into 4 smaller files.
They then imported individually.
-
@timconsidine said in Cannot import into BitwardenRS:
I split the .json into 4 smaller files.
Any instructions on how you did this? I want to put a note in our docs.
-
@girish said in Cannot import into BitwardenRS:
Any instructions on how you did this? I want to put a note in our docs.
Not very sophisticated, I'm afraid.
It was an unencrypted .json, so it was easy to edit in a text editor. I just duplicated the full export file (~60k lines for about 500 entries) 3 times, so I had 4 files: V1, V2, V3 and V4.
I then edited them so V1 was approximately lines 1-15,000, V2 was 15,001-30,000, etc., just deleting the lines before the start item and after the end item.
Of course the lines of an individual item could not be split across files, so it was essential to adjust the start and end line for each file so that the content started on:

  {
    "id": "xxxx",
    "organizationId": null,
    . . . .

and to ensure the last item and the file ended as:

    }   <-- no comma after the last item (not sure if mandatory)
  ]
}
For V2, V3 and V4, after deleting the preliminary entries, the file should start with the following (just copy it in before the first item):

{
  "encrypted": false,
  "folders": [],
  "items": [

NB: in the above, the folders line is empty.
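If you'd rather script the split than count lines by hand, here is a rough Python sketch of the same idea. It assumes the standard unencrypted export structure shown above ("encrypted", "folders", "items" at the top level); the input filename and the chunk count of 4 are just placeholders:

import json

CHUNKS = 4  # number of smaller files to produce (placeholder, as above)

# Load the full unencrypted Bitwarden export (hypothetical filename)
with open("bitwarden_export.json") as f:
    data = json.load(f)

items = data["items"]
per_file = -(-len(items) // CHUNKS)  # ceiling division, so no item is split

for n in range(CHUNKS):
    chunk = {
        "encrypted": False,
        "folders": [],  # folders blanked, as in the manual steps above
        "items": items[n * per_file:(n + 1) * per_file],
    }
    with open(f"V{n + 1}.json", "w") as out:
        json.dump(chunk, out, indent=2)

This slices whole items rather than raw lines, so the start/end adjustment described above is not needed.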
I 'cheated' and made sure the folders in the export file were blanked, so the import didn't complain about mismatched folder names or IDs. I wanted to reorganise them anyway, so this was not a problem for me.
It looked like this originally:

"folders": [
  {
    "id": "xxxxxxx",
    "name": "Folder1"
  },
  {
    "id": "yyyyyyy",
    "name": "Folder2"
  }
],

I did a global search for "xxxxxxx" and "yyyyyyy" (including the double quotes), replaced them with null (without the quotes), and then deleted these folder descriptions.
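The same de-foldering could be scripted; a minimal sketch, assuming each item references its folder via a "folderId" field (the filenames are hypothetical):

import json

with open("bitwarden_export.json") as f:  # hypothetical filename
    data = json.load(f)

# Drop the folder definitions and detach every item from its folder,
# so the import has no folder names or IDs to mismatch.
data["folders"] = []
for item in data["items"]:
    item["folderId"] = None  # json.dump writes None as JSON null

with open("bitwarden_export_nofolders.json", "w") as out:
    json.dump(data, out, indent=2)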
I then imported each of V1, V2, V3, V4 with increased app memory. All went smoothly.
I then re-exported the whole collection from the new instance, so I could check that the total line count of "exported imports" matched the total line count of the original export from the old instance.
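Comparing item counts rather than line counts is probably more robust, since formatting can differ between instances; a quick sketch with hypothetical filenames:

import json

with open("original_export.json") as f:  # export from the old instance
    old = json.load(f)
with open("reexport.json") as f:         # re-export from the new instance
    new = json.load(f)

print(len(old["items"]), len(new["items"]))  # the two counts should match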
I then deleted all the source .json files, so they weren't left hanging around in plain text.
And I reduced the app's allocated memory back down. It sounds long-winded and is certainly not elegant, but it didn't take more than 20 minutes.
I'm sure this can be documented more neatly; this is just a brain dump.
If I get bored some day, I might try repeated imports of increasing size to see at what point the import process falls over. Or maybe someone with code access can check whether a limit is hard-coded (hopefully not). But my collection has quite simple content, with no long notes and no attachments; that could well affect the number of items which can be imported in a single pass.
Hope this helps.
-
@timconsidine Thanks for the excellent write up!