FSF Free Software Foundation NVIDIA official disinformation

I opened a bottle of champagne when I left Windows. I don’t want to get it back.

Why didn’t you choose a recent AMD graphics card? That would work exceptionally well for you, both under Windows and Linux.

Agreed.
:wink:

I try not to react to strawman accusations. I never said it would work “exceptionally” at all. I was just comparing the two and seeing everything from a relative, instead of an absolute, perspective. Everything is relative. The only thing that is absolute in our universe is that everything is absolutely relative. :wink:

I assume that you are comparing anecdotal evidence to “knowing one’s Linux shit”. In this case, I am not talking about some random non-technical wannabe tech pros, like some here try to appear, for example just by making a “rEviiEw” of an OS without having any technical background knowledge at all (but that is another topic anyway) – I am talking about people who actually work with Linux all day long and actually know what they are talking about. Like actual C or Linux developers or (Linux) DevOps engineers or somebody like that. Not just randoms talking about crap they don’t know about.

For example, this forum has a couple of people whom I value a lot, mainly, but not only, for their experience-based technical knowledge. One of those people here is @daniel.m.tripp . He was already working with *NIX systems when most of the It’sFOSS writers were still shitting into their diapers.

OK, then I’m convinced, Nvidia sucks on Linux.
Despite sucking, it still works very well.
If I ever buy a newer graphics card, I’ll do it with my software in mind: so if Nvidia works better, it will be Nvidia again. I won’t care whether it sucks or not, I don’t care about AMD’s policies, I’ll just look at how I can use it.
If AMD comes up with something I can’t deny, it may be an AMD.
If my current hardware broke suddenly and I had to buy something next week, I’d go for Nvidia again for sure (100%).

Perfect. This is the most reasonable approach. I agree 100%. :smiley: :ok_hand:

Hey @kovacslt - Take no notice of the idiotic content about NVIDIA and just treat all that Wayland crap as irrelevant - with FSF GNU/Linux Trisquel there are no NVIDIA problems at all - just record-breaking efficiency giving stunning performance.

However, dear FOSSers - gamers too - take no notice of all this silly disinformation content from @Akito @daniel.m.tripp @clatterfordslim and read what forensic-grade professional benchmark testers have to say… :thinking: :wink:

Even our staff can self-educate from these savvy guys :face_with_hand_over_mouth: :kissing_heart: :wink:

Enjoy … :slightly_smiling_face: … much more later :face_with_monocle: :wink:

ps @Akito paid how much … :astonished: :flushed: :joy: :joy: :joy: :face_with_hand_over_mouth: :kissing_heart: :wink:

You still have to do all the things I mentioned with NVIDIA in Linux when using Xorg. Trust me, I game in Linux and have done for years. I’m currently using an NVIDIA 1660 graphics card on my Ryzen 2600 AMD build. NVIDIA for gaming has come a long way; it’s the terrible drivers with older cards, like the 1030, NVIDIA’s run-of-the-mill card for older hardware. This is where the true problem is: when there aren’t any decent drivers out there to cope with the horrible screen tearing whilst watching video or scrolling your mouse wheel up and down a website.

AMD wins hands down for dealing with that issue, as I have an old AMD-only build from 2015, with on-board graphics on the CPU and a nice script to stop screen tearing.

EVERY graphics card I had the chance to try and use so far had this tearing problem.
I must be an idiot, like that picture above with the girl and the gun shows (see: FSF Free Software Foundation NVIDIA official disinformation - #36 by Akito ), because I found the solution for screen tearing on all those cards regardless of manufacturer.
The list starts with the Intel graphics built into my laptop (i5 5200U and its HD 5500); my kids’ current desktops have an i3 8100 and its UHD 630, but before that they had C2D/Q based machines, one with an ATI HD4650 and two others with a GeForce GT210; my R9 380 had the tearing phenomenon too, and yes, my current GTX1060 really showed it as well.
Trust me, the tearing issue is not the fault of the card or the driver, but of how the driver works (is configured) by default.
The problem is the bad default, and they all have this bad default, not just Nvidia, which of course sucks so much.
There are options to eliminate it: either use compiz/compton, or use a non-default config of the Xorg driver. I chose the second one in all cases.
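For the first option, a rough sketch of how compton is typically started with vsync enabled - assuming the classic compton flags here; forks like picom renamed some of these options:

# run compton with the GLX backend and vsync, daemonized into the background
compton --backend glx --vsync opengl-swc -b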
The HD4650 required a .conf like this:
Section "Device"
    Identifier "Radeon"
    Driver "radeon"
    Option "TearFree" "on"
EndSection

All the Intel graphics required a conf like this:
Section "Device"
    Identifier "Intel Graphics"
    Driver "intel"
    Option "TearFree" "true"
EndSection

Nvidia needs “Force Full Composition Pipeline” to be switched on to eliminate screen tearing. That was true for both GT210 cards, and now for my GTX1060 as well.
An extremely hard job to do, because Nvidia sucks.
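For reference, the same switch can be flipped from a script instead of clicking through nvidia-settings - a sketch assuming a single monitor; with multiple monitors the full MetaMode string for each screen is needed:

# enable Force Full Composition Pipeline on the current MetaMode
nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"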

The R9 380 was the trickiest for me in this regard!!!
It also needed a “TearFree” option in the config file, but that alone did not work…
I had to read through half of DuckDuckGo until I found that I also needed the BusID, which I got with lspci.
This was on a GA-AB350M-HD3 board, which has two PCIe x16 slots; I probably placed the card in the other slot, and probably that’s why the “works everywhere” BusID 1:0:0 did not work for me.
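For reference, this is roughly how the BusID can be looked up - note that lspci prints the bus number in hex, while xorg.conf expects decimal values:

# find the graphics card and its PCI address
lspci | grep -i vga
# example output: 02:00.0 VGA compatible controller: ... [Radeon R9 380]
# "02:00.0" (hex) becomes BusID "PCI:2:0:0" (decimal) in the Device section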
I ended up with a config like this:

Section "Device"
    Identifier "Card0"
    Driver "amdgpu"
    BusID "PCI:2:0:0"
    Option "TearFree" "on"
EndSection
AMD wins…

Dear @Andy2
This is my last post here, I quit this debate.
I don’t really care if the majority’s opinion goes against my own experience.
I believe my own experience 1000 times more than anyone’s opinion.
Well, in fact, for me it will be good if the vast majority thinks Nvidia is like sh!t :smiley:
That will make it possible for me to buy my next Nvidia dirt cheap :wink:
But this doesn’t mean I’m addicted to Nvidia, I’m not.
If I can get an AMD card for less, which performs better or similarly and has all the features I need working on Linux, I’ll buy it…

Not sure if you really got my point, so I want to clarify by using your example:

If you want to configure something, it will on average be easier and more successful with AMD graphics cards, because of AMD’s policies. This fact does not mean it is impossible with NVIDIA, as is obvious from your graphics-card-unrelated xorg.conf example. However, since AMD cards tend to be much more open than NVIDIA cards, they are more likely to be a success in your setup than the latter.

Again, just for clarification:

It does not suck. The company behind NVIDIA pursues bad policies, resulting in bad Linux performance in general, on average. If you use a Windows 10 PC, you will have no issue with NVIDIA.

And that is perfectly fine and acceptable. The point I was trying to make the whole time is that your experience does not matter at all to other people in general. It only matters to you.

So, as I said earlier, if you think NVIDIA works better for you, that is perfectly fine, and I think everyone should understand that.

BUT do not try to make it seem like your experience applies to everyone else’s life.

Again, people have not cared so much about exactly what they are getting for years now, because all graphics cards have been pretty expensive for years. So people will just be happy to buy any graphics card, as long as it reaches the performance they need.

And for your scenario I have a couple of possible speculative explanations in mind:

  • It just seemed cheap to you, but actually wasn’t, because in your country everything is generally cheaper, though maybe not from your personal perspective.
  • The guy who sold it to you had no idea what he was doing.
  • The guy who sold it to you found a small defect in the hardware and wanted to get rid of it while the card still worked for normal users. At first, some defects only surface when stress testing the hardware or using it to its full potential through heavyweight games, for example.
  • You just got lucky.

I find those guesses much more likely than some anti-NVIDIA conspiracy leading you to this, as you say, dirt cheap card.
That said, there is one thing I have noticed since then, which is fact:
Paying ~130 USD is actually not dirt cheap for a GTX 1060. This is about the normal price, or actually slightly higher than average. If you want a cheap GTX 1060, you need to pay no more than ~110 USD.
(I changed the currency for easier understanding, as everyone knows how much a dollar is.)

Again, this is the approach I would recommend to everyone. Buy the one that works best for you, no matter how it is branded.

I bought it on 05.01.2019.
I mean cheap because it cost a fraction of what a new card of the same model cost in a shop at that time.
The card was still under warranty when I bought it - however, only for a short time, a couple of weeks I think - not sure.
OK, “dirt cheap” is an exaggeration… :wink:

He bought a 1080Ti, and did not need this one.

I have been using it with DaVinci Resolve successfully since then… I’m completely satisfied with it, regarding performance as well as stability… no problems so far. For my full HD stuff it is simply perfect.
If I had to go for 4K, I’d have to trade it for a more powerful card, but then my other components (CPU, RAM, storage, monitors, etc.) would need to be upgraded too.

I think so :slight_smile:

Yes, it has to go into /usr/share/X11/xorg.conf.d and the file must be called 20-radeon.conf

For Intel, laptops especially, I have kept the script that Mark Greaves wrote.

On Intel:
Copy and paste into the terminal. The 2nd command, beginning with echo, is all one line.

sudo mkdir -v /etc/X11/xorg.conf.d

echo -e 'Section "Device"\n Identifier "Intel Graphics"\n Driver "intel"\n Option "AccelMethod" "sna"\n Option "TearFree" "true"\nEndSection' | sudo tee /etc/X11/xorg.conf.d/20-intel.conf
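After a reboot (or restarting X) you can check whether the option was actually picked up - a rough check; the log location varies by distro, e.g. ~/.local/share/xorg/Xorg.0.log on rootless X setups:

# look for the TearFree option in the X server log
grep -i tearfree /var/log/Xorg.0.log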

Just for accuracy, and this is really my last post here.

“20-radeon.conf” is just a sensible file name; there’s no “must” about calling it that. You could name it “a_n_y_t_h_i_n_g.conf”.
The name must end with “.conf”, that’s all - otherwise it won’t be included :wink:

It depends on the distro too. On Debian it’s /etc/X11/xorg.conf.d; on Ubuntu and derivatives (like Linux Mint) I have seen /usr/share/X11/xorg.conf.d used.
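A quick way to see which of the two drop-in directories a given system has (and what’s already in them):

# list both candidate directories; one of them may not exist
ls -l /etc/X11/xorg.conf.d /usr/share/X11/xorg.conf.d 2>/dev/null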

If you have any trouble reading my post…

…I am ready to assist you at any time - on average, professionals have the opposite opinion to your amateur content and, importantly, back it up with forensic-grade evidence from testing :thinking: :slightly_smiling_face: :kissing_heart: :wink:

However, good FOSSers, transient visitors and noobs - I have great confidence that you can read, without any problems at all, a thoroughly reliable, highly professional reference work by those Phoronix guys…? :wink: Rather than being amateurish without any tests at all…

Hey @Akito - No problemo :robot: Hasta la vista :baby: baby :grin: :laughing: :joy: :joy: :joy: :joy: :kissing_heart: :wink:

… the professionals clearly show that with Linux - NVIDIA is the better graphics card.
So please stop this silly disinformation …

So there you have it - 7 pages of highly professional, forensic-grade benchmark testing. Now, who do you want to trust - these Phoronix pros or the spreaders of disinformation here at It’sFOSS? I leave it to you to decide, as I do not have any NVIDIA problems and don’t play such games. :slightly_smiling_face: :kissing_heart:

It is clear that these guys at Phoronix have done forensic-grade tests so that our FOSSers and others have a clear statement on what is best - not some uninformed content by an amateur without a clue. :thinking: :slightly_smiling_face: :kissing_heart: :wink:
I do not want to waste time with that unreliable Wayland crap… the vast majority do not either. Top marks to @kovacslt for ignoring all the bad advice and disinformation and buying a second-hand NVIDIA graphics card, with instant confirmation that he made the correct choice. :yum:

… I will just continue my workflow in a highly efficient manner, with NVIDIA graphics working faultlessly and unnoticed. :grinning: :kissing_heart:

Hey FOSSers, would you buy a car or anything else recommended by… :woozy_face:
Nope; me neither.

Paid how much for a car - sorry, card… $1200 :astonished: :flushed: you are joking… :rofl: :kissing_heart: :wink:

Hey @Akito please try to get up to date as it might cost you a lot more than $1200 :woozy_face: :astonished: :flushed: :joy: :joy: :joy: :joy: :joy: :kissing_heart: :wink:

https://itsfoss.community/t/cryptocurrency-mining-processor-nvidia-leads-the-way/6431

Yes, we are talking about the FUTURE - hope you can handle that concept…?

Enjoy - :kissing_heart: :wink:

Edit: Oooooooooops! Someone has removed the post entirely so you cannot read it - please send a PM and I will show it to you - I hate this secrecy :frowning_face:

One thing I have noticed about newer nVidia cards: the quality control isn’t as good as it used to be.
Surface-mount capacitors are often not attached very well and don’t always make good contact with the board.
I know some of the issue is due to the lead-free solder now used (which causes other problems as well, over time or in relatively high-humidity environments). I’m also wondering if built-in ‘hardware obsolescence’ is now part of the design? (Even though cards are obsolete by the time they hit the market, hardware issues never used to be a problem.)
