If you mean the most secure desktop, then no, Linux is not, not by a long shot. Use Windows.
https://madaidans-insecurities.github.io/security-privacy-advice.html#desktop-os
https://madaidans-insecurities.github.io/linux.html
If you mean the most free, then it's Linux. Personally, I use Linux.
I want to agree with you, and I do to a large extent. I like new codecs, and a more open-source codec is better than one covered by many patents; long patent terms (the current situation) slow technological progress.
Where I don't agree with you is on some of the details.
First, Netflix, YouTube and so on need low bitrates, and they (especially Google/YouTube) don't care that much about quality. YouTube videos are really bit-starved for their resolution; Netflix is a bit better.
Second, many people who discuss codecs are referring to a different use case: archiving, as in the best quality at the same size. So they compare the original (raw video, no lossy codec used) with the encoded versions. Their conclusion is that AV1 is great for size reduction, but it can't beat H.264 for fidelity at any size. I think H.264 has a placebo or transparent profile, while AV1 doesn't (see the sketch below for how that kind of comparison is usually run).
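For what it's worth, this is roughly how people measure it: encode the same untouched source with both codecs and score each encode against the original with an objective metric like VMAF. A minimal sketch, assuming ffmpeg built with libx264, libsvtav1 and libvmaf; the file names and CRF values here are made up and would need tuning.

```python
# Minimal sketch: encode one reference source with x264 and SVT-AV1,
# then score each encode against the original using the libvmaf filter.
import subprocess

SOURCE = "source.mkv"  # hypothetical lossless/raw reference

encodes = {
    "h264_crf18.mkv": ["-c:v", "libx264", "-preset", "slow", "-crf", "18"],
    "av1_crf30.mkv":  ["-c:v", "libsvtav1", "-preset", "6", "-crf", "30"],
}

for out, args in encodes.items():
    # Encode (video only, audio dropped to keep the comparison simple).
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *args, "-an", out], check=True)

    # Score: first input is the distorted encode, second is the reference;
    # ffmpeg prints an aggregate VMAF score at the end of the run.
    subprocess.run(
        ["ffmpeg", "-i", out, "-i", SOURCE,
         "-lavfi", "libvmaf", "-f", "null", "-"],
        check=True,
    )
```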
So when I download a fi… I mean a Linux ISO from torrents, I usually go for the newest codec, but recently I don't go for the smallest size, because it takes detail away from the picture.
But if I want to archive a movie (one that I like a lot, which is rare), I get the bigger H.264 encode (or H.265 if it's a UHD Blu-ray).
Third: a lot of people's idea of codec quality is formed by downloading or streaming other people's encodes, and they never compare the quality themselves (since they don't have the time, or a good raw source to compare against).
Fourth: I have heard AV1 has issues with film grain, as in it removes it. Film grain is an artifact of physical (non-digital) film that, unfortunately, many directors try (or used to try) to replicate: they grew up watching movies on film, think movies should look like that, and so add grain back in post-production. This is despite it literally being a defect, and human eyes don't produce anything like it, so it isn't even natural. But this is still a flaw in AV1 (if I read that correctly), because a codec should aim for high fidelity, not high smoothness.
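That said, from what I understand AV1 does have a film grain synthesis tool: the encoder can denoise the source and signal grain parameters so the decoder re-adds similar grain on playback, rather than just smoothing it away. A minimal sketch of turning it on, assuming ffmpeg with libsvtav1; the file names and the grain strength value are made up and would need tuning per source.

```python
# Minimal sketch: encode with SVT-AV1 while modelling the source's grain
# (film-grain=10 on a 0-50 scale) and denoising before encoding, so the
# grain is re-synthesized at decode time instead of being blurred out.
import subprocess

subprocess.run(
    ["ffmpeg", "-y", "-i", "grainy_source.mkv",
     "-c:v", "libsvtav1", "-preset", "6", "-crf", "30",
     "-svtav1-params", "film-grain=10:film-grain-denoise=1",
     "-an", "grainy_av1.mkv"],
    check=True,
)
```

Whether the re-synthesized grain looks faithful enough is another question; it's modelled, not the original noise, so for archiving I'd still lean toward the higher-bitrate H.264/H.265 copy.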