Upgrading EOL Ubuntu 18.04 with backup

Thanks for your reply. I’ll have to try that. However, I’ll have to pause the exercises. Something more urgent has come up. My laptop’s OS is at EOL and I have to deal with that before anything else. This may not be the place to go into that, so I’m going to search the forum for a thread on updating or upgrading from Ubuntu 18.04. (e.g. what does ‘a fresh 22.04 version … will format your data’ mean?) Hopefully, I’ll get back to the Terminal Basics series soon.

Hey Frank,

From Ubuntu 18.04, you can upgrade to Ubuntu 20.04. And if you want, you can upgrade your 20.04 to 22.04.

You cannot upgrade 18.04 to 22.04 directly. You’ll have to upgrade to 20.04 first.

Another way is to download the 22.04 ISO, put it on a USB drive and then boot from that USB to install 22.04 afresh. However, this will format the system: it wipes all existing data and then installs Ubuntu 22.04.

That may not be the ideal solution for you.

What you can do is make a backup of your (important) data on an external disk (optional but recommended) and then upgrade to 20.04.

How do you do that? Go with the GUI method 1 discussed here:

It is for upgrading to 22.04 but it will work the same for upgrading to 20.04 as well.


Hi, :wave:

about the update topic:

Of course the way of updating is entirely up to Frank. I just want to provide my suggestion here.
This method has proven to be successful in the past.

  • create 3 partitions on the HDD/SSD: root, home and a data partition.
  • for me it’s like this: root: 25 GB, home: 38 GB, 3rd: 400 GB (or the rest of the HDD/SSD), but of course choose the values you consider best for you
  • when doing a fresh/clean install: choose “something else” (e.g. in the ubiquity installer)
  • format root and home but not the third partition, so all of your personal data will be retained in future when doing a fresh install
  • so only root and home will get wiped, which seems to be perfect for a clean install
  • of course back up your existing OS first (or at least your personal data)
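The three-partition layout above ends up as mount points like these. This is just an illustrative sketch: the device names and the data partition’s mount point are assumptions (a real install references partitions by UUID - see blkid):

```
# Illustrative /etc/fstab for the root / home / data split described above
/dev/sda1  /          ext4  defaults  0  1   # root, ~25 GB
/dev/sda2  /home      ext4  defaults  0  2   # home, ~38 GB
/dev/sda3  /mnt/data  ext4  defaults  0  2   # data, rest of the disk
```

At reinstall time you would tell the installer to reuse (but not format) the third partition.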

Well, like I said, that’s just one of many possibilities. :wink:

Many greetings from Rosika :slightly_smiling_face:


Hi @Rosika,

I like the way you set up your Linux partitions. I do it the same way.
Just an FYI, and maybe not all distros will allow this, but some distros give you the option
to not format home. So the only thing that is wiped is root.
As always, backing up personal data is a good idea before doing an upgrade or install.

Kind Regards,


Hi Howard, :wave:

Thanks for the additional info. :heart:

Yes, I can confirm this. When I installed Linux Lite using ubiquity, I could’ve left home unformatted as well.

But for a fresh and clean install I think it’s best to have it formatted.
Otherwise config files of programmes from the old installation would be left untouched, and who knows in advance whether that would be good for the potentially new versions of those programmes… :thinking:
There might be some changes…

Someone once said this might be a double-edged sword…

Better not take any chances, I guess.

Many thanks and many greetings
Rosika :slightly_smiling_face:

Hi @Abishek,

I read your article on upgrading my system. I’ll try to follow the instructions. However, the suggestion to first back up the personal files in my Home directory that I don’t want to lose has me stymied. My search for information on how to copy files from Ubuntu to a USB stick has me confused. I’ve tried searching with different words and phrases and keep getting involved instructions that often seem to be for more than I was asking for. Many of them are for running several commands in a terminal. I’m not comfortable in a terminal.

When I’m told to be sure there is enough free space on the disk, does that mean I need to have a separate partition for the stick? Partition 1 has 236 GB free out of 496 GB. There is also Extended Partition 2 with 4.3 GB, and Swap Partition 5 with 4.3 GB Swap, also Free Space 1.1 MB. The files I don’t want to lose total about 8 GB. The USB stick has 32 GB.

When I insert the stick in the USB port, I can’t find it listed anywhere to select as a target to copy to. Am I supposed to do something else before selecting and copying the files? The search results used a lot of terminology I don’t understand. People who know even less about computers than I do are told they should regularly do an external backup. Am I missing something here? Thanks for your help.

Hi Frank,
If you are not happy using terminal commands and you need to do a backup, I suggest you use one of the backup utility programs… choose one that has a graphical interface. For example, Timeshift:
How to Backup and Restore Linux System Settings With Timeshift
It is important that you make a second copy of your important files, before attempting any system changes or upgrades.

Having learnt the hard way, to make sure of saving important files:
get an external storage device, e.g. an HDD in its own USB enclosure, then use a file manager such as Dolphin to put the important stuff there. Unplug the external HDD and do whatever! You only need to restore if things don’t go as planned.


@nevj Following your suggestion, I’m now using Timeshift as well as Déjà Dup. I still can’t find how to copy a few files from my Home directory to an external drive. @0bill0 says to use the file manager to put important stuff there. How do I do that?

You just copy the whole home directory using Timeshift or whatever.
The whole idea of a backup is to copy everything. That protects you from forgetting something important.

If you really want to just copy a few files

  • use the cp command at the terminal
  • or use the file manager… I think you can drag and drop files from one place to another using the File Manager. Not sure, never done it. Maybe one of our gui experts can tell you.
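For those willing to try the cp route, it can be rehearsed risk-free first. In this sketch the real locations (something like /home/frank/Documents and /media/frank/USBSTICK - both hypothetical paths) are replaced by temporary directories, so nothing on the system is touched:

```shell
# Simulate the source folder and the USB stick with throwaway directories
src=$(mktemp -d)     # stands in for /home/frank/Documents
dest=$(mktemp -d)    # stands in for /media/frank/USBSTICK
echo "my notes" > "$src/notes.txt"

# -r copies directories recursively, -v lists each file as it is copied;
# "$src/." means "the contents of $src"
cp -rv "$src/." "$dest/"

ls "$dest"    # shows notes.txt
```

On a real run you would replace the two temporary paths with your home folder and the stick’s mount point (usually under /media/, once the stick is mounted).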

@nevj By a few files, I meant the approx. 8 GB in my home directory, as distinct from the 200+ GB on the hard drive. If possible, I’d also like to copy everything in my Thunderbird account, to avoid having to set everything up there all over again. It’s not immediately apparent to me how to use my new Timeshift to do this.

To see what will happen before actually starting the process, I’m using the USB stick I have on hand. I’m assuming the process will be the same on a larger external drive, which I have been advised to get. Am I supposed to do something else besides inserting the stick in the port to have it recognised as being there? I can’t find it anywhere I look. Or will Timeshift find it when I start copying? Can you tell me exactly what I should click on in Timeshift? Thank you for any help you can give me.

OK, I get you Frank. You want to backup the whole home directory.

not immediately apparent to me how to use my new Timeshift to do this. 

Well, I don’t use Timeshift, so I can’t give you specific help. I can say this… most backup apps back up whole partitions rather than selected directories. So if your /home directory is on a separate partition, that should be easy.
If your /home directory is in the same partition as the root filesystem, i.e. ‘/’, you will have to work out how to back up all the files in /home. I am sure Timeshift can do it, but we need a Timeshift expert to tell us how.
There is a way using the command line. You use the tar command. Something like this

Make sure the external disk is mounted… let’s say it is /media/frank/mydisk
Then issue the tar command:
tar -c -f /media/frank/mydisk/myhomebackup.tar  /home/frank
This is best done as root or with sudo.

That will bundle up all of /home/frank and put it in a file called myhomebackup.tar.

To get the files back from myhomebackup.tar, you do

tar xf  /media/frank/mydisk/myhomebackup.tar -C /home/frankrecovered

The files will all come back, so I put the recovery in a different directory.
You can read about tar with man tar
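The whole round trip above can be rehearsed safely in temporary directories before touching the real disk. The paths here are illustrative stand-ins; note this sketch also uses tar’s -C option at create time, so the archive stores relative paths instead of /home/frank:

```shell
home=$(mktemp -d)       # stands in for /home/frank
disk=$(mktemp -d)       # stands in for /media/frank/mydisk
restore=$(mktemp -d)    # stands in for /home/frankrecovered

mkdir -p "$home/Documents"
echo "important" > "$home/Documents/file.txt"

# Create (-c) an archive file (-f), taking paths relative to $home (-C)
tar -c -f "$disk/myhomebackup.tar" -C "$home" .

# List (-t) the archive's contents without extracting anything
tar -tf "$disk/myhomebackup.tar"

# Extract (-x) into a different directory (-C)
tar -xf "$disk/myhomebackup.tar" -C "$restore"

cat "$restore/Documents/file.txt"    # prints "important"
```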

I suggest you try and get help to do it with Timeshift and only try tar as a last resort, because I understand you are not comfortable with command line stuff.

Another option is grsync, a GUI version of rsync. It will do backups by synchronising an external disk with your home directory, and is graphically driven.



I’d recommend compression as well - gzip is the default with -z

And you don’t need the hyphen on “c” or “f” et cetera with modern implementations of tar (GNU tar), and you can make them into a single string - me? I’d do it this way:
tar cvzpf /media/frank/mydisk/myhomebackup.tar.gz /home/frank

“v” shows some progress (verbose output on screen), and “p” preserves ownership and permissions…


Yep, that is better than my effort.

I should qualify that statement - anybody can unarchive / unzip that archive - the permissions are stored as UID:GID (the source system’s local user ID number and group ID number, from /etc/passwd and /etc/group) and as octal for “-rwxrwxrwx” (i.e. rwx = 7, so “rwxrwxrwx” = 777)… UID:GID will be dependent on each system…

e.g. you archive a folder owned by fred:fenackipan on System A, where Fred’s UID is 501, and group fenackipan’s GID is 504, but on System B UID 501 is someone else, and GID 504 is someone else again - or - if you unarchive as your local user account, it will get unarchived under your username…

So ‘p’ preserves permissions, but ownership depends on who retrieves it? Not sure about that.
My man page says ‘p’ is the default if root runs the process.
One would normally run tar as root if using it for backup. At least that is what I do.

We should point out that tar works recursively, ie it archives contents of a directory and all its subdirectories to any number of levels.
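That recursion is easy to see for yourself with a throwaway example (paths are temporary, so it is safe to run):

```shell
top=$(mktemp -d)
mkdir -p "$top/a/b/c"
echo hi > "$top/a/b/c/deep.txt"

# Archive just "a"; tar descends into every subdirectory on its own
tar -cf "$top.tar" -C "$top" a

# Listing the archive shows the nested file was picked up
tar -tf "$top.tar"    # ends with a/b/c/deep.txt
```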

This is why I’ll take debian based distros all day over RHEL. No upgrade path like that with Fedora or CentOS, you have to nuke and pave.

Use a rolling release distro. No need for a nuke or an upgrade path.


I’ve been waiting for the entire time I’ve been here to say this …

I use Arch btw. LOL


On my last foray into Fedora land, sometime last year, I distrohopped a bit - ran it for maybe 2-3 weeks on my AMD desktop and AMD ThinkPad - and was able to in-place upgrade Fedora 36 to 37 with relatively little pain that I recall…

However - I can’t quite put my finger on it - there was something “not quite right”. I think it was a dependency conflict between the Fedora repos and RPM-Fusion, and on a gaming setup with a decent GPU (Radeon XT6600) you need RPM-Fusion…

So I ended up back on Ubuntu, then Pop!_OS, and now a mix of Ubuntu and Pop!_OS…

I must admit, I quite like Ubuntu’s “do-release-upgrade” - never had a problem with it…

In 2020 - I had to use a hideous Checkpoint SNX VPN for work, and I could only make it work on Ubuntu 18.04… So I’d install Ubuntu 18.04, get SNX working, then “do-release-upgrade” to 20.04, and I think I even did a 18.04 > 20.04 > 21.04 or 21.10 once - it just worked…

Anyway - work’s long decom’d that SNX VPN, and while I say it was hideous, it’s better than what replaced it: “Azure P2S VPN” - which I’ve NEVER been able to make work on Linux, and is pretty flaky on my personal MacBook (but stays connected most of the time), and used to work with some nudging and cajoling on my work MacBook but now point-blank refused to connect (due to the UGLY CORPORATE big brother shite on this “managed” MacOS [but still better than their managed Win10 / Win11 SOE!])…