"Samba doesn't work." Rant

I have been using Ubuntu-based Linux exclusively for decades. One thing that has annoyed me for a long time is that although “Files” (Nautilus, etc.) makes it very easy to create a SHARED FOLDER, I then find that it is near impossible to access that folder from another Linux machine on the same local network. In creating the shared folder I always choose options that allow “anyone to do anything” in that folder.

Then I sit down at the other machine, which recognizes the share, and try to open it, and I almost always get “wrong user name or password”, or “you don’t have permission”, or “failed to open”. So I spend half an hour trying every conceivable variation of parameters I can think of, and still can’t open the folder. So I give up and use Google Drive, or a portable drive, to share files. There have been a few occasions when it worked, but I never know what I did to achieve that.

To share a folder should not require editing of .conf files or executing console commands. The GUI share set-up should do whatever is necessary to make the share accessible, don’t you think? Is Windows compatibility the problem? Why don’t we have a GUI-based Linux-only app for this? Google Drive is too slow to move a big file collection from one machine to another.

I second that.
I got it to work ONCE and only once, and to this day I do not know how. I vaguely remember there being an additional Windows app that needed to be made active to get it to work with Windows.
I tried Warpinator and KDE Connect with ZERO success.

I loathe Samba so much that I (mostly) refuse to use it. If you’re going Linux to Linux, then NFS is the way to go. It sounds like you’re shy of the terminal and of editing conf files (/etc/exports), so that’s probably not a solution for you, but the syntax is unbelievably simple. For example, in /etc/exports:

/path-to-folder/folder *(rw,no_root_squash)

That’s it! Once you’ve edited the file, run “exportfs” and the folder is shared by NFS. You might have to do some permissions tweaking: /path-to-folder/folder might need mode 777 on the folder, and UIDs might need to match. For example, if user account 1000 owns the files on the NFS server, then user 1000 on the NFS client gets full permissions (1000 is the default UID that Ubuntu assigns to the account it creates during install when it asks you what username to create).
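To make those steps concrete, here is a minimal sketch of the server-side sequence. The path /path-to-folder/folder is a placeholder; substitute your own folder:

```
# /etc/exports — one line per exported folder
/path-to-folder/folder *(rw,no_root_squash)

# then, as root on the server:
#   exportfs -ra    # re-read /etc/exports and apply it
#   exportfs -v     # list what is currently exported, with options
# if clients get permission errors, check ownership and mode:
#   ls -ln /path-to-folder/folder    # -n shows numeric UIDs (e.g. 1000)
#   chmod 777 /path-to-folder/folder # the blunt fix mentioned above
```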

To mount it from another host:

sudo mount -t nfs IP.ADDR.OF.NFSSERVER:/path-to-folder/folder /mnt/mountpoint

Compare that with the stanzas of unintelligible guff in /etc/samba/smb.conf. And NFS is much more “performant” than SMB anyway: less overhead, etc.

In the past, where I’ve tried to set up an “on the fly” Samba share on a Linux desktop, I’ve run into the symptoms you describe. It’s a travesty! I’ve resorted to installing Webmin to create Samba shares, as I don’t walk around with the nitty-gritty details of smb.conf syntax in my head.

Anyway - I use my NAS (FreeNAS) to share files - and the beauty of that is that it allows me to share the same folders over BOTH protocols - NFS and SMB.

Apple shifted away from their own sharing protocol, and SMB is now the default on macOS too, but with a bit of tweaking macOS also supports NFS “natively”. So my Movies folder on my NAS is shared via NFS, which I use from Linux, and via SMB, which I use from macOS, because it’s easier.


I agree with @daniel.m.tripp - use NFS

There are good instructions here

I don’t know about the GUI. You need the command line to set it up, but once it is set up you might be able to get NFS mounts to work through the file manager: not in the shared section, just where you mount other filesystems on the same computer.


Thanks for the replies. I spent about 2 hours with NFS today, following the instructions that @nevj recommended, but it still doesn’t work. The best I can get is
client:~$ sudo mount -t nfs4 /home/james/share
mount.nfs4: access denied by server while mounting
I’ve messed with folder ownership and permissions with no luck.
…come on, now. Every day my desktop automatically links up with my phone and shows me all my text messages. Why is it so difficult for two desktops running the same OS to talk to each other???


Had another go at it this evening, following another set of instructions I found on the internet for setting up an NFS server and client, involving the /etc/fstab file. The first attempt did not work; the client complained that it couldn’t connect to the server. So I made some modifications to the setup, and BINGO: the server crashed and refused to boot up again. It was in “emergency mode” or something like that. Finally I booted the server machine from a bootable Zorin installation USB drive, and using the Disks app I was able to identify and mount the Ubuntu partition, edit the fstab file, and save the system. I’m done, though, with Samba and NFS, unless something new gets invented. A friend once told me, “With computers, it’s always 3 steps forward and 2 steps back.” Please. Someone. Fix Nautilus so that file sharing is easy.
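For reference, the boot hang described here is the classic symptom of an fstab entry for a filesystem that cannot be mounted at boot. A commonly recommended shape for an NFS line in /etc/fstab uses the nofail option so booting continues even when the server is unreachable. A sketch, with hypothetical server address and paths:

```
# /etc/fstab — NFS entry that won't block booting if the server is down
# nofail  = don't drop to emergency mode if this mount fails
# _netdev = wait until the network is up before trying to mount
192.168.1.10:/home/james/share  /mnt/share  nfs  defaults,nofail,_netdev  0  0
```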

Not sure why. It used to work fine for me with BSD. Never tried with Linux.
It may be all the modern security nonsense.
Will have a try at setting it up myself.
I hope you realize it is a client-server system, not peer to peer.

Just a thought: do you have Samba still running? They might clash.

It is very easy to lock yourself out like that if you make the slightest error editing fstab. Make a backup copy of fstab before you edit.
Then you can get back in by booting from a DVD and copying the backup over fstab.

Nice panic experience

@jimofadel ,
Another option is to use rsync or unison.
That will copy all the files across rather than doing a mount.


If this is for a home network, this is the “bee’s knees.” :wink: I did this several years ago, and have used it on a few systems. Before following altair4’s instructions, run this in terminal:

sudo apt-get install avahi-daemon avahi-discover avahi-utils libnss-mdns mdns-scan

Post by altair4 » Sat Dec 20, 2014 3:01 pm

If by “network” you mean file sharing, then no, you do not need to use samba. There are alternatives.

However I have an all Linux / OSX network (with a few Windows machines) and I use samba because it’s the default system these days on all operating systems. There are two ways to approach the issue of browsing to find the other samba machines on the network.

* You can use the Microsoft method, which is the traditional approach: Samba Browsing Problems Checklist

* Or you can use a native Linux way, which is how my systems are set up. On both systems:

[1] Make sure avahi-daemon is running on both machines:


sudo service avahi-daemon start

[2] Make sure port 5353 is open. If you are not sure just disable the firewall if you are using one:


sudo ufw disable

[3] Create an avahi samba service file:


gksu gedit /etc/avahi/services/samba.service

[4] Then copy and paste the following into that file:


<?xml version="1.0" standalone='no'?>
<!DOCTYPE service-group SYSTEM "avahi-service.dtd">
<service-group>
  <!-- %h expands to the machine's hostname; this is the display name -->
  <name replace-wildcards="yes">%h SMB</name>
  <service>
    <type>_smb._tcp</type>
    <port>445</port>
  </service>
</service-group>

Please note: the very first line cannot have any leading spaces in front of it or else it won’t work. If you use gedit, you will know you did it right when the file gets XML syntax colouring after you save it.


[5] You don’t need to do this but just in case restart avahi:


sudo service avahi-daemon restart

When you use Nemo on either machine and select “Network” you should see the other machine as “hostname SMB”

Also: you are using an avahi samba.service file. Since Mint 19, Samba does the equivalent by itself without requiring that file. You can keep the samba.service file if you want (since it allows you to change the display name), but if you do, I would suggest disabling samba’s own mechanism by adding the following line to the [global] section of /etc/samba/smb.conf:

multicast DNS register = no

Then restart samba.
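Putting the two steps above together, a sketch of the smb.conf fragment and the restart (the service name varies by distro; smbd is typical on Ubuntu/Mint):

```
# /etc/samba/smb.conf
[global]
    multicast DNS register = no

# then restart samba, e.g.:
#   sudo systemctl restart smbd
```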

Samba done the Linux way. No more workgroups. No more netbios names. No more name resolve orders. No more nmbd or any other Windows specific services required.

Use the editor of your choice, but take note of getting the samba service file correct!


Neville – Thanks for your inputs. <I hope you realize it is a client-server system, not peer to peer.> What I mainly need is to transfer large numbers of files from my “main” desktop to my “secondary” desktop. I assume that is a transaction initiated by the client. Once in a while it would be useful to transfer a single file from secondary to main, but that’s not a must. Can a client stuff a file into the server? Can the server suck a file out of the client? In my efforts to use Samba and NFS I was consistently trying to initiate, from the client, transfer of a file from server to client. In the past, though, with Samba, I usually had Samba installed on BOTH machines – was that a mistake?

We need to be careful with terminology.
NFS is for mounting a filesystem that is on a server so that you can see it on a client machine. You don’t need to copy files with this; you can just read and write on the remote filesystem from the client.

If you really want to transfer files between machines (i.e. make duplicate copies), there are many ways: ftp, scp, rsync, unison, and they are all quite simple to set up and use. If there are many files and directories, it is best to bundle them with tar, transfer just the tarfile, and unpack it at the destination.

So what do you want, mounts or transfers?

Neville – Thank you so much for clearing that up! My attempts to set up client/server were a miserable failure, but even had I succeeded I would have been disappointed. I don’t want mounted access, I want copies transferred, because I consider my secondary computer(s) to also be backups for the main one at least for a lot of content, if not all. I will look at the other methods you mentioned. I still think if Nautilus could do what it purports to be able to do, that would be really nice. When I did have it working, I could copy and paste from one machine to another. --Jim

1 Like

Great, we have progress.

The easiest, and oldest, method is ftp
You need

On the server: ftpd … the daemon which allows other computers to access the server via ftp

On the server and all client machines: ftp … the client program which does the logging in and the transfers

So you need to find the ftpd package (it is called ftpd in my Debian) and install it on the server,
and you need to find the ftp client package (it is called pure-ftp in my Debian) and install it on all machines.

Then, from the client machine, type
ftp servername
and it will offer you a login (use your own username and password).
Then you get a prompt:
use cd to get to the right directory
get filename to fetch one file
mget * to fetch multiple files
when using mget it is best to set prompt first
and it is also best to always set bin
say bye when finished
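Putting those steps together, a sample session might look like this (the hostname “server” and the Music directory are hypothetical; your prompts may differ slightly by ftp client):

```
$ ftp server
Name (server:james): james
Password:
ftp> bin          # binary mode, so files are not mangled in transit
ftp> prompt       # toggle off per-file confirmation for mget
ftp> cd Music
ftp> mget *
ftp> bye
```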

That is about it. Read the instructions; there is more.

Don’t forget: to transfer lots of files, tar them into a tarfile first, then use ftp to shift the one file, then untar it.

The next simplest thing after ftp is rsync. That will automatically keep two filesystems in sync, i.e. all files identical.

Don’t let anyone tell you to use the ssh, scp, … commands. They are a nightmare of security requirements. You would spend your lifetime managing keys.

I think there is a GUI version of ftp in some distros. I have never used it. I like to see what it is doing, and I recommend you do the same, at least at the start.


Neville – I set up ftp (server) on my main desktop today and the secondary desktop was able to download individual files from it. I’m pleased with that. However, what I would like to transfer at times is my entire Music directory, which has hundreds of folders and sub-folders, and around 8000 files. I don’t expect this to be done quickly; an hour or two is OK. However, I need all the directory structure to be re-created. I tried tar.xz to compress a 318MB mp3 file and it took about 5 minutes, so it could take about 10 hours for 40GB of music just to make the tarfile! And the compressed file was 316MB, so mp3 doesn’t compress much. But still, I can use ftp for a lot of other transfers that are small. You mentioned rsync, and I remember that a backup program I have used is based on that. Is it possible for one computer to present itself as an external hard drive to another? I would love to tell my main box to back up the Music directory to the other box, using Lucky Backup, not duplicity or Deja Dup (those do incremental archives, not file copying). I guess I could do that using a portable drive, then move the drive. Thanks again for your consultation. Automatically keeping them in sync would be great too.


Not sure about Lucky Backup, but I think some backup programs will do that. Also, you can definitely use rsync to do it; I have done that, and rsync will work across a network.

 I guess I could do that using a portable drive, then move the drive.

That might be the fastest way to do the initial transfer, then use rsync to keep it synchronised. You would probably only need to run rsync about once a month.

– I set up ftp (server) on my main desktop today and the secondary desktop was able to download individual files from it - I’m pleased with that.

Congratulations. At least you have something now. Yes a big transfer will be slow, but over a wire nothing would beat ftp for speed.

You can use ftp over the internet. There used to be lots of ftp sites from which you could download software. Before www that used to be the only way to download software. ftp has a long history, so it is at least bug free and reliable.

I reckon just make the big tarfile and let ftp handle it overnight. It won’t break anything. Computers are made for large repetitive jobs; let them do the work for you. You sure have lots of music.
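One note on the 10-hour estimate above: the slow part of tar.xz is the xz compression pass, and mp3 is already compressed, so it buys almost nothing. A plain uncompressed tar is fast and still preserves the whole directory structure. A sketch with throwaway paths:

```shell
# Build a plain (uncompressed) tar: -c create, -f output file,
# no -J (xz) or -z (gzip) flag, so there is no compression pass.
mkdir -p /tmp/music_demo/album
echo "fake mp3 data" > /tmp/music_demo/album/track01.mp3
tar -cf /tmp/music.tar -C /tmp/music_demo album

# List the archive contents to verify the directory structure survived
tar -tf /tmp/music.tar

# Then ship /tmp/music.tar with ftp, and on the other machine:
#   tar -xf music.tar -C ~/Music   # re-creates folders and sub-folders
```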


I use ssh, not knowing any better. I haven’t set up any keys, though. I just click on “network” in my file browser, find my wife’s PC, put in the password, and I’m in just like it was a drive on my machine.
How to Use SSH to Connect to a Remote Server in Linux or Windows is the tutorial I used (I think).
No, Samba doesn’t work well. It’s for Windows to Linux. Time to get rid of everything Windows related.
I use KDE Connect to talk to my phone, but the Android part of it is flaky at times. But it’s fun typing long replies to text messages without using thumbs on a tiny screen.

Yes, ssh is good for logins across machines.
You will find, though, that if you change anything,
for example install an updated OS, or change to another OS on a multiboot machine, you have to reset the keys.

I liked the original rsh but that has been deprecated because of security concerns.


Today I was able to transfer some file folders from one machine to another by purging Samba on both machines, then letting Nautilus re-install it on one of them by asking it to share a folder. I tried repeating the whole process with the roles reversed, and it doesn’t work that way. Putting it back the original way, it works again. I don’t know why it doesn’t work both ways. Well, it’s better than nothing.

Both machines can now move files in and out of the shared folder. The Public folder appears as a mounted external disk on the “no Samba” machine. Items placed in the shared folder by the “no Samba” machine are seen as “locked” on the “Samba” machine until they are moved into a normal folder; at that point they are intact and usable. Your results may vary. It could be that having no Samba on one machine has nothing to do with this working. It is interesting that the two machines don’t work symmetrically; both are Ubuntu-based systems.

Anonymous connection doesn’t work, even though it was requested in the setup; you have to connect as a registered user with a password. Once you get connected, the mount takes place and stays there during the session. One “moved” folder contained 19 sub-folders, ~25 sub-sub-folders, and a total size of 3.7GB. It took ~2 min. to copy INTO the shared folder, and ~1 min. to copy OUT of the shared folder on the other machine. Not bad. Still, I won’t be surprised if I have to go through this all again before long!

I can’t think of an answer to that either.
The only time I had anything to do with Samba, it was on a server, not on the clients.
Your two machines must not be as symmetric as you think.

Anyway, at least you have some options now