I hope I haven't done anything dumb


Hi @Akito, :wave:

no sooner did you post "Cannot wait for the next stones on the road of your next adventure" than I have to take you at your word. So sorry about that. :slightly_frowning_face:

I don't know whether I might have done something stupid (at least I hope not) out of sheer clumsiness. :hushed:

What I did was the following:

I wanted to look for the search word “pass” in a dedicated trans.profile I set up for using translate-shell (GitHub - soimort/translate-shell: Command-line translator using Google Translate, Bing Translator, Yandex.Translate, etc.) with firejail.

The command I wanted to use was:
cat /home/rosika/.config/firejail/trans.profile | grep pass

However, I was typing too quickly and forgot to enter “grep”. So the command I actually entered was:
cat /home/rosika/.config/firejail/trans.profile | pass

In reaction to that I got a new line in the terminal but apparently no prompt.
I didn't enter anything and wanted to abort with “CTRL+C”, which didn't have any effect. :thinking:

So I tried it with “CTRL+D”. That didn't shut down whatever process was triggered either. :thinking: :thinking:
Yet I doubt that any process was triggered at all …
… because this stupidity of mine didn't result in an entry in fish_history. :relaxed:

I finally managed to close the window by hitting the “X” in the upper right corner of the terminal window.

Hmm, having become a little scared of what I did, I looked up what the command “pass” was all about and found out:

whatis pass
pass (1)             - stores, retrieves, generates, and synchronizes passwords securely

So sorry to bother you again, @Akito (really, I am :frowning_face:), but might I be bold enough to ask you what you think about the whole matter :question:

I hope I haven't done anything too dumb.
At least I couldn't find anything with respect to “pass” with lnav.

Many thanks in advance and many greetings.
Rosika :slightly_smiling_face:

It’s probably no biggie:

Seems harmless. Do not worry!

1 Like

Hi @Akito, :wave:

thank you very much for helping me out once again. :heart:

That's great news. Phew. :blush:

Thanks also for the link. I read it through, and once more I seem to have learnt quite a bit.
I have to admit I hadn't used (or even heard of) pass before. It looks like a very interesting topic though.

To list the names of all your passwords, run the pass command without any arguments.

So feeding pass the output of my cat command through a pipe wouldn't have made any sense either.

O.K. then … I'm really glad. :blush:

Still, I should be more careful with my typing in the future.
It worries me to quite some extent that I haven't become a tad more “professional” after all this time. Seems I'm getting old … :slightly_frowning_face:

I'm always impressed by how quickly and effortlessly you come up with a variety of qualified links on any given subject.
I tip my hat to you, @Akito :+1:

Many thanks again for your kind help.
Have a nice day and many greetings from Rosika :slightly_smiling_face:

1 Like

You might check the directory ~/.password-store
to see if it created one.

Reminds me of a young technician who worked for me years ago. She wanted to remove some files ending in xxx, so she did
rm *xxx
but she slipped up and typed
rm * xxx
and lost 3 months work.
One little blank in the wrong place.
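The danger is easy to demonstrate without deleting anything: the shell expands the glob before the command ever runs, so the stray space turns “everything in the directory” into the argument list. A safe sketch, using echo instead of rm and made-up file names:

```shell
# Safe demonstration of the glob-expansion slip, using echo instead of rm.
# File names are invented for this sketch.
mkdir -p /tmp/globdemo && cd /tmp/globdemo
touch report.xxx draft.xxx notes.txt

echo rm *xxx    # expands to: rm draft.xxx report.xxx  (only the intended files)
echo rm * xxx   # expands to: rm draft.xxx notes.txt report.xxx xxx  (everything, plus a literal "xxx")
```

With the stray space, `*` matches every file in the directory, and `xxx` becomes one more (nonexistent) argument.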

You must be feeling lucky

1 Like

I mean, if people keep backing up stuff, it won’t ever get this bad. You just have to stay disciplined about making backups.

It was on a mainframe. User backups did not exist. Old days, old nightmares.

Yes: backups, backups, backups.
Users' data is the most important thing; you can always put the OS back, but there is usually only one copy of your private stuff.

Hi all, :wave:


How unfortunate. The poor fellow. :slightly_frowning_face:
I feel for him …

@Akito and @nevj:

You´re right. Backups are of utmost importance.

I do it this way:

  • once a month: a complete disk backup with clonezilla live
  • once a month: data backup to 3 USB sticks with rsync (grsync, actually)
  • once a week: automatic timeshift snapshots of the root partition

I think and hope this scenario should serve me well. :blush:

The “most crucial” backup strategy with clonezilla certainly takes some effort. Often enough I'm two days behind my schedule, but I try to stay as disciplined as possible. :blush:

Many greetings
Rosika :slightly_smiling_face:


Rosika, that is very well thought out.
My variation is

  • once a month: clonezilla all partitions … same as you
  • data backup anytime, depending on how much activity, with tar
  • don't do snapshots, but I should
  • don't have off-site backup, except for GitHub directories, but I should look at getting some cloud space or whatever

That poor fellow was actually a lady. She was pregnant at the time and was so embarrassed she came back while on maternity leave and redid the work to recover it. I had some very dedicated assistants.


1 Like


I forgot to reply to your info. Sorry. :slightly_smiling_face:

Yes, there is a ~/.password-store. But it is empty.
Curiously, Thunar says the last modification date was 2021-02-21.

Hmm, I just looked up the installation date of my system: 2021-01-20. So the password store was created basically 32 days after the installation.
Alas, I cannot remember what I might or might not have done at that point.

I guess everything is alright then. :blush:

Thanks a lot and many greetings.
Rosika :slightly_smiling_face:

1 Like

Hi @nevj, :wave:

Thanks a lot for the confirmation. :heart:

Well, your backup scenario looks superb, especially this part:

Personal data is very important indeed.
May I ask what kind of programme or command you employ for this task (data backup)?

I see. Well, that's really great. I'm glad everything could be set right in the end. :slightly_smiling_face:

Many greetings from Rosika :slightly_smiling_face:

I have all my data in one area called /common. I access it from several Linux versions. All I have in my home directories are the user dot files.

So I just make a tar file (I assume you know how to use tar) of /common onto an external USB disk. After a while I get too many tar files, so I delete the oldest ones.
It is a very primitive method. It is not incremental; it makes a new full copy every time. I need to take care with the name of each tarfile so I can keep track.

rsync is a more sophisticated method, but it does not pack the files into an archive like tar does. You can see the individual files on the backup disk with rsync. With tar all you see is one big file.

I can show you how I do the tar command if you have never used it. I expect you know.


1 Like


Thanks a lot, Neville, for elaborating on how you do data backups. :heart:

On a few occasions I have used the tar command, like:

tar -czf 2kgw_archiv.tar.gz tor-browser_neue_version/  # create a compressed tar archive, gzip
tar -tvf 2kgw_archiv.tar.gz  # show the contents of the tar archive
tar -xzf 2kgw_archiv.tar.gz  # extract the compressed tar archive

That's pretty much the extent of it … :blush:

I understand. Thanks for pointing out the differences. :+1:

Thanks for the offer, Neville.
I think, apart from the three example commands I provided, I don't need to use tar as much as you do …
Yet a single example would certainly be appreciated. Thanks so much. :heart:


I also experimented with borg at one time (https://www.borgbackup.org/).
I have to say I loved it, actually. :slightly_smiling_face:
The beauty of it: compression and still the ability to mount archives :wink: :exclamation:

It's available as a standalone binary, too (if preferred).

Deduplicating archiver
with compression and encryption

BorgBackup (short: Borg) gives you:

  • Space efficient storage of backups.
  • Secure, authenticated encryption.
  • Compression: LZ4, zlib, LZMA, zstd (since borg 1.1.4).
  • Mountable backups with FUSE. # which is great :exclamation:
  • Easy installation on multiple platforms: Linux, macOS, BSD, …
  • Free software (BSD license).
  • Backed by a large and active open source community.

Good demo videos can be found here: Demo — BorgBackup

Very well documented: Borg Documentation — Borg - Deduplicating Archiver 1.2.0 documentation

Thanks again and many greetings.
Rosika :slightly_smiling_face:


Was thinking about mentioning Borg, but I’ve talked about that a lot on this forum already and didn’t want to spam about it again.

Thanks for taking this burden off my shoulders. :laughing:

1 Like

Thanks @Akito,

sorry then. :slightly_frowning_face:
I didn't mean to flood this forum with information that is more commonly known than I thought.

I hope no one is offended hereby. :face_with_hand_over_mouth:

Many greetings.
Rosika :slightly_smiling_face:

No, it’s the opposite. I’ve talked about it a long time ago and I’m sure not everyone in this discussion knew about it. So it’s good!

1 Like

O.K., I'm relieved then. :blush:

Thanks again @Akito

Many greetings from Rosika :slightly_smiling_face:

1 Like

Borg was new to me.
@Rosika did the right thing providing info on Borg.
I want to look into it for off-site backups.


Hi Rosika,
I have one running at the moment, so let's walk through it.

First the filesystems
nevj@trinity:~$ df
Filesystem      1K-blocks      Used  Available Use% Mounted on
/dev/sda4      1116282132  58208400 1001346796   6% /common
/dev/sdc2       604627664 194819452  379071828  34% /media/nevj/Debian_backup

So /common has 58 GB of data, and the tarfile is going onto a mounted partition /media…

Now run the tar job

nevj@trinity:~$ su
root@trinity:/home/nevj# cd /
root@trinity:/# tar -c -f /media/nevj/Debian_backup/common.2022.14.01.tar common/
root@trinity:/# exit

Now look at the stored tarfile

nevj@trinity:/media/nevj/Debian_backup$ ls -l
total 252730728
-rw-r--r-- 1 root root 44606627840 Mar 19  2021 common.2021.03.16.tar
-rw-r--r-- 1 root root 39556444160 Jul 30  2021 common.2021.07.30.tar
-rw-r--r-- 1 root root 39843768320 Jan 16 20:22 common.2022.01.16.tar
-rw-r--r-- 1 root root 59373916160 Apr  1 13:37 common.2022.14.01.tar
drwx------ 2 root root       16384 Aug 11  2012 lost+found
-rw-r--r-- 1 root root  9727723520 Jul 30  2021 nevj.deb10.2021.07.30.tar
-rw-r--r-- 1 root root 18430914560 Jan 16 20:12 nevj.deb11.2022.01.16.tar
-rw-r--r-- 1 root root 21468958720 Jan 27  2021 nevj.deb8.2021.01.27.tar
-rw-r--r-- 1 root root 22409963520 Jul 30  2021 nevj.deb8.2021.07.30.tar
-rw-r--r-- 1 root root   723783680 Jan 16 20:51 nevj.dev4.2022.01.16.tar
-rw-r--r-- 1 root root   405043200 Jan 16 21:08 nevj.solus.2022.01.16.tar
-rw-r--r-- 1 root root  2249052160 Jan 16 20:40 nevj.void.2022.01.16.tar

You can see the new 59 GB file called common.2022.14.01.tar plus a few older backups. The ones called nevj... are home directories.
Because email folders are in a dot file, I need to back up the Debian 11 home directory in particular to cover email.

It took about half an hour to do the 58 GB backup.

To restore all the files, use tar -xpf tarfilename.
Because I used common/ when making the tarfile, the restore will put back the contents of /common but not the common mount point.

To list the contents of a tarfile, do tar -tf tarfilename.
To extract one file, do tar -xvf tarfilename filename-to-extract.
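Put together as a runnable sketch (with made-up paths, so nothing real is touched), the create/list/restore cycle looks like this:

```shell
# Minimal demo of the create/list/restore cycle described above.
# All paths are invented for this sketch.
mkdir -p /tmp/tar-demo/common /tmp/tar-demo/restore
echo "data" > /tmp/tar-demo/common/file.txt
cd /tmp/tar-demo

tar -cf common.backup.tar common/                     # create the archive
tar -tf common.backup.tar                             # list its contents
tar -xpf common.backup.tar -C /tmp/tar-demo/restore   # restore into another directory
```

Because the archive was created from the relative path common/, extracting it recreates common/file.txt under whatever directory -C points at.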

tar is a very old part of Unix. It was originally used to archive files to magnetic tape (hence the name: tape archive). On a tape it is better to write one large file - that is why it packs everything into one file.

There is also ar. It is used to make library files, e.g.

nevj@trinity:/usr/lib$ file libRmath.a
libRmath.a: current ar archive

ar is also part of original Unix.

I found Borg - it is in the Debian repository


1 Like

Why not compress it, too?

The sizes are very hard to read without the h option given to ls.

I think this backup approach is not very user friendly, because it's not very efficient, and yet it's not very easy to carry out, either. There are simpler, more efficient and certainly more elegant solutions.

For example, rsync seems simpler and yet is much more efficient, because it does not need to tar anything and it also does not waste space with duplicate files, etc. - if, for example, you apply the usual --archive option.


I get it - ls -lh gives sizes in M or G.

I basically agree. I do it because I have always done it that way. Compression is a waste of time - disk space is cheap today.

There are issues: if I delete a file, it eventually disappears from the backups. I might not want that.

rsync is too complicated for me - too much opportunity for mistakes. I tried it once but was not happy. I want something simple, like dump, that I can control.

I just did it because @Rosika asked. Not trying to push it on anyone.