  • oh don’t get me wrong. as I said, I agree with most of your original post (and now your second one).

    my gripe with grain was not about AV1 per se; it was with filmmakers who add it just because they think that is how movies should look.

    this makes no sense to me: “Reasons to Keep Film Grain On: Artistic Effect: Film grain can add a nostalgic or artistic quality to video and photography, evoking a classic film look”, because the reason is just the director’s nostalgia; if he had been born after the digital era, he would (usually) have an issue with grain rather than add it.

    about H.264 and transparency: the issue is not that H.264 can only get there at high bitrates; the issue is that AV1 (as I read) can’t get there at any bitrate.

    but overall I agree with you.

    I was even shocked recently to see how much faster AV1 encoding has gotten. I would have thought it was still orders of magnitude slower, but with some settings it now encodes at about the same speed as x265 on its slow preset (a rough way to check that yourself is sketched below).
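
    a minimal timing sketch of that comparison, assuming ffmpeg built with libx265 and libsvtav1 is installed; the input file name, presets and CRF values are placeholders I picked, not tuned equivalents:

        # time one x265 "slow" encode against one SVT-AV1 mid-preset encode
        import subprocess
        import time

        def time_encode(codec_args, out_name):
            """Run a single ffmpeg encode and return wall-clock seconds."""
            cmd = ["ffmpeg", "-y", "-i", "input.mkv", *codec_args, "-an", out_name]
            start = time.monotonic()
            subprocess.run(cmd, check=True, capture_output=True)
            return time.monotonic() - start

        x265_s = time_encode(["-c:v", "libx265", "-preset", "slow", "-crf", "22"], "out_x265.mp4")
        av1_s = time_encode(["-c:v", "libsvtav1", "-preset", "6", "-crf", "30"], "out_av1.mkv")

        print(f"x265 slow : {x265_s:.1f} s")
        print(f"SVT-AV1 p6: {av1_s:.1f} s")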


  • I want to agree with you, and I do to a large extent. I like new codecs, and having a more open-source codec is better than using one covered by many patents; long-term patents (the current situation) slow technological progress.

    where I don’t agree with you is on some of the details.

    first, Netflix, YouTube and so on need low bitrates, and they (especially Google/YouTube) don’t care that much about quality. YouTube videos are really bit-starved for their resolutions; Netflix is a bit better.

    second, many people discussing codecs are referring to a different use case: archiving, as in the best quality at the same size. so they compare the original (raw video, no lossy codec used) with the encoded versions. their conclusion is that AV1 is great for size reduction but can’t beat H.264 for fidelity at any size; I think H.264 can reach a placebo/transparent level but AV1 can’t (a rough way to sanity-check this yourself is sketched at the end of this comment).

    so when I download a fi…I mean a Linux ISO from torrents, I usually go for the newest codec, but recently I don’t go for the smallest size because it takes away detail from the picture.

    but if I want to archive a movie (one that I like a lot, which is rare), I get the bigger H.264 release (or H.265 if it’s a UHD Blu-ray).

    third: a lot of people’s idea of codec quality is formed by downloading or streaming other people’s encodes, and they don’t compare the quality themselves (they don’t have the time or a good raw source to compare against).

    fourth: I have heard AV1 has issues with film grain, as in it removes it. film grain is an artifact of physical film (non-digital) that unfortunately many directors try (or used to try) to reproduce because they grew up watching movies on film and think that is how movies should look, so they add it back in post-production, even though it is literally a defect and the human eye doesn’t produce it, so it is not even natural. but this is still a flaw in AV1 (if I read correctly), because a codec should aim for high fidelity, not high smoothness.
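
    as a rough way to sanity-check that fidelity claim yourself, you can measure each encode against its untouched source with ffmpeg’s ssim filter (or libvmaf if your build has it) on top of eyeballing it; a minimal sketch, assuming ffmpeg is on PATH and using placeholder file names:

        # print the mean SSIM of an encode measured against its source
        import subprocess

        def mean_ssim(distorted, reference):
            """ffmpeg's ssim filter takes the distorted file first, the reference second."""
            cmd = ["ffmpeg", "-i", distorted, "-i", reference,
                   "-lavfi", "ssim", "-f", "null", "-"]
            result = subprocess.run(cmd, capture_output=True, text=True)
            # the summary line ("... All:0.98 ...") is printed on stderr
            for line in result.stderr.splitlines():
                if "All:" in line:
                    return line.strip()
            return "no ssim output (filter missing in this build?)"

        print(mean_ssim("encode_av1.mkv", "source.mkv"))
        print(mean_ssim("encode_x264.mkv", "source.mkv"))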


  • you didn’t do the wrong thing.

    what many people don’t notice is that GPU (hardware) support for a codec has two parts: one is decoding and one is encoding.

    for quality video nobody does hardware encoding (at least not on consumer hardware like this NVIDIA 3050).

    for most users the important thing is hardware support for decoding, so they can watch their 4K movies with no issues (a quick way to check what your own setup exposes is sketched below).

    so you are in the clear.

    you can watch AV1 right now, and AV2 won’t be popular enough to matter for at least another 4 years.
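
    a quick way to see which AV1 decode and encode paths your own ffmpeg build exposes (names such as av1_cuvid or av1_qsv on the decode side and av1_nvenc or av1_vaapi on the encode side vary by build); a small sketch, assuming ffmpeg is on PATH:

        # grep ffmpeg's decoder and encoder lists for anything AV1-related
        import subprocess

        def av1_entries(list_flag):
            out = subprocess.run(["ffmpeg", "-hide_banner", list_flag],
                                 capture_output=True, text=True).stdout
            return [line.strip() for line in out.splitlines() if "av1" in line.lower()]

        print("AV1 decoders:", *av1_entries("-decoders"), sep="\n  ")
        print("AV1 encoders:", *av1_entries("-encoders"), sep="\n  ")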


  • maybe, maybe not.

    when H.264 was introduced (Aug 2004), even Intel had hardware encoding for it with Sandy Bridge in 2011; NVIDIA had it in 2012.

    so less than 7 years.

    AV1 was first introduced 7 years ago, and for at least two years Android TVs have been required to support hardware decoding for it.

    and AMD’s RDNA 2 had hardware AV1 decoding 4 years ago.

    so from introduction to hardware decoding it took 3 years.

    I have no idea why 10 years is thrown around.

    and AV1 had to compete with both H.264 and H.265 (vendors had to decide whether it was worth implementing).


  • I get that Intel draws lower power when not doing much (I wish AMD had a high/low configuration for its cores too), but what I meant was: for laptop CPUs that are not on battery (just connected to power), does AMD do more with the same power usage?

    if the comparison can’t be done with same-generation CPUs from the two companies, then take a similar-power-usage CPU from AMD and one from Intel (laptops, of course): do they, for example, have similar Geekbench results? (for lack of a better tool)

    so what I am asking is: I don’t care whether it’s the same generation of AMD and Intel laptop CPU, both connected to wall power, if AMD is better. I want to know whether, for the same power usage (working, not idling), AMD is better or not (a rough points-per-watt comparison is sketched below).
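
    what you are describing boils down to performance per watt; a trivial sketch, assuming you already have a benchmark score (e.g. Geekbench multi-core) and an average package-power reading for each laptop while the benchmark runs (the numbers below are made up):

        # points-per-watt from a benchmark score and average package power under load
        def points_per_watt(score, avg_package_watts):
            return score / avg_package_watts

        amd_ppw = points_per_watt(score=12000, avg_package_watts=35.0)
        intel_ppw = points_per_watt(score=12500, avg_package_watts=45.0)

        print(f"AMD   : {amd_ppw:.0f} points/W")
        print(f"Intel : {intel_ppw:.0f} points/W")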


  • I don’t think that is how it works. For example, I don’t think you can get Google security updates for older Android versions without updating the whole system. There used to be monthly Google security updates, and those are going to become less frequent.

    Also, I think the part that blocks APK installation (for apps not signed with a signature in Google’s database) is checked by Google Play services, which is updated in the background; that is a result of Project Treble and Mainline, which Google implemented for modular updates without a ROM update. So you probably can’t avoid this policy even if you stay on an older ROM.