← Back to MegaDB Search

Luigi_Gamer | 123 points | Feb 14 2018 17:51:50

[TV] The Legend of Zelda 1989 Season 1 [480p DVD x265 HEVC 10bit AAC 2.0 Langbard] [QxR] (1.32 GB)

Poster

E01 Media Info

MEGA

Any pass = the name of this subreddit

Enjoy!

Zippyshare



[-] petrocity06 | 59 points | Feb 14 2018 18:23:11

WELL EXCUUUUUUUUUSE ME PRINCESS


[-] OkayDragon | 22 points | Feb 14 2018 20:23:30

I came here for this comment. Was not disappointed.


[-] chriscrowder | 12 points | Feb 14 2018 21:09:14

I came here to say this. Got beat :(


[-] redrosebluesky | 3 points | Feb 15 2018 03:47:48

MUH BOI


[-] LL_Cruel_J | 10 points | Feb 14 2018 18:45:56

Hey, Paisanos!


[-] kaching335 | 3 points | Feb 14 2018 18:40:57

Thanks!


[-] Vex99 | 1 point | Feb 15 2018 03:53:23

Hell Yea Thanks!


[-] grognakbabarian | 1 point | Feb 15 2018 05:29:06

What's the point of making x265 DVD rips? I get it for Blu-ray rips, as it will severely reduce the filesize, but good-quality DVD rips can already be small, so the filesize reduction is basically moot at that point.


[-] hawtsause | 2 points | Feb 16 2018 03:04:42

Some people like using/keeping everything with the latest and greatest codec.
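
For context, a rough sketch of the kind of encode those release tags describe (x265, 10-bit, AAC 2.0), driven from Python; the file names, CRF, and preset are illustrative guesses, not QxR's actual settings:

```python
import subprocess

# Hypothetical names and settings - adjust for the actual source.
subprocess.run([
    "ffmpeg", "-i", "episode01.vob",            # DVD source (assumed name)
    "-c:v", "libx265",                          # HEVC via the x265 encoder
    "-pix_fmt", "yuv420p10le",                  # 10-bit output
    "-crf", "18", "-preset", "slow",            # quality/speed trade-off
    "-c:a", "aac", "-b:a", "128k", "-ac", "2",  # AAC 2.0 audio
    "episode01.mkv",
], check=True)
```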


[-] [deleted] | -31 points | Feb 14 2018 19:04:35

[removed]


[-] darkshadow6400 | 28 points | Feb 14 2018 19:43:14

Nothing except reduced banding and increased compression due to the encoder being more precise.


[-] 12mo | -5 points | Feb 15 2018 08:50:10

That is a retarded misconception. You can't reduce 8-bit banding by re-encoding into 10-bit, just like you can't reduce 8-bit audio artifacts in an MP3 by re-encoding it in 16-bit.

The increased compression is also a lie that stems from comparing MPEG-2 to HEVC, not 8-bit HEVC to 10-bit HEVC.

Basically you're another retard who's spreading misinformation.


[-] darkshadow6400 | 7 points | Feb 15 2018 09:13:21

Sorry, but you're wrong. Of course the source isn't going to magically become 10-bit, but when you encode an 8-bit source using a 10-bit encoder you do get better compression and less banding compared to encoding with the same settings using an 8-bit encoder, as the encoder is more precise and produces fewer rounding errors.

Just to make sure you don't think I'm comparing different codecs again, I'm talking about both x264 8bit vs x264 10bit, AND x265 8bit vs x265 10bit.

I'd suggest you do a bit of reading up on how the encoders read and calculate data before trying to insult others.


[-] AHGJ | 2 points | Feb 15 2018 12:59:22

Relevant


[-] 12mo | -2 points | Feb 15 2018 16:36:11

> Relevant

They are comparing MPEG-2, a codec finalized in the '90s, to H.264, a codec finalized in the 2000s. Of course H.264 has better compression than MPEG-2.

However, feel free to read incomplete information off of an advertisement and shut off your brain.


[-] grognakbabarian | 1 point | Feb 17 2018 04:15:41

If an encoder can't reduce banding in an 8-bit encode, then that just shows it's incompetent.


[-] 12mo | -2 points | Feb 15 2018 16:34:27

> but when you encode an 8-bit source using a 10-bit encoder you do get better compression and less banding compared to encoding with the same settings using an 8-bit encoder

No, you don't. If you have a black-and-white gradient (1-bit, 2 colors), no matter how many bits you encode it with, it will remain two colors: 1 bit. Same with 8-bit: if you have 256 bands, they are going to remain 256 bands, no matter how many bits you add.

> I'd suggest you do a bit of reading up on how the encoders read and calculate data before trying to insult others.

How about others do some reading? Because that brain-dead-simple example shows exactly what happens when you encode a lower-bit-depth image as a higher-bit-depth image. You gain nothing.
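
That example, as a minimal sketch (illustrative, not from the thread): a pure bit-depth conversion leaves the number of distinct levels untouched.

```python
import numpy as np

# A banded 8-bit "gradient": 8 flat steps instead of a smooth ramp.
bands = np.repeat(np.arange(0, 256, 32, dtype=np.uint16), 8)

# Pure bit-depth conversion: scale the 8-bit values into the 10-bit range.
as_10bit = bands << 2

# Both print 8 - the conversion itself creates no new levels.
print(len(np.unique(bands)), len(np.unique(as_10bit)))
```

That much holds; the reply below is about what happens during an encoder's lossy internal steps, not the conversion itself.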


[-] darkshadow6400 | 6 points | Feb 15 2018 20:51:18

I feel like you're missing the point, and your example is too simple. I'm not talking about creating additional colours; I'm talking about retaining the existing colour data.

When you encode anything, data is being thrown away due to the lower bitrate. Sometimes you lose information from a smooth gradient, which is what introduces banding. With an 8-bit encoder this is more common, as the encoder has less of a colour space to work with internally, which often leads to rounding errors and a less smooth gradient.

With a 10-bit encoder of the same codec, it's much easier to retain the data in the gradient, as the encoder has higher internal precision thanks to the additional steps between each colour in the same colour space. This in turn leads to a smoother gradient in the final output.

So it's not so much that 10-bit magically reduces banding by making more colours; it's just more efficient at retaining the existing data due to its higher internal precision.
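
A toy numeric model of that rounding argument (illustrative, not from the thread; the 0.4 factor is an arbitrary stand-in for an encoder's fractional internal maths): push an 8-bit gradient through one lossy round-trip at 8-bit and at 10-bit working precision and count the levels that survive.

```python
import numpy as np

# An 8-bit greyscale gradient, one sample per code value.
src = np.arange(256, dtype=np.float64)

def lossy_roundtrip(x, working_bits):
    """One toy lossy step: apply a fractional scale, round to the working
    precision, invert, and round back to 8-bit. Real encoders chain many
    such fractional ops (transforms, quantisation, motion compensation)."""
    headroom = 2.0 ** (working_bits - 8)   # 1x at 8-bit, 4x at 10-bit
    y = np.round(x * headroom * 0.4)       # forward op, rounded
    z = np.round(y / 0.4) / headroom       # inverse op, rounded
    return np.clip(np.round(z), 0, 255)    # final 8-bit output

for bits in (8, 10):
    out = lossy_roundtrip(src, bits)
    print(f"{bits}-bit internals: {len(np.unique(out))} of 256 levels survive")
```

With 8-bit intermediates, only 103 of the 256 levels survive the round-trip (those are the bands); with 10-bit intermediates, the extra headroom carries the fractions through and all 256 come back intact.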


[-] VyndrosNoldor | 2 points | Feb 17 2018 11:09:20

If you have existing banding in the source you can also use filters to correct that (such as f3kdb, which works pretty damn well).

I don't understand why the OP is against 10-bit encoding unless he has a machine from 15 years ago that has trouble with newer technologies.
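
A minimal VapourSynth sketch of such a deband pass (the source plugin, file name, and strength values are illustrative assumptions, not anything from this thread):

```python
import vapoursynth as vs
core = vs.core

clip = core.ffms2.Source("source.mkv")   # assumes the ffms2 source plugin
# Deband with f3kdb, outputting 16-bit to avoid re-introducing banding;
# the strengths here are illustrative.
deb = core.f3kdb.Deband(clip, range=15, y=64, cb=32, cr=32,
                        grainy=16, grainc=0, output_depth=16)
# Dither down to 10-bit, e.g. for a subsequent x265 main10 encode.
final = core.resize.Point(deb, format=vs.YUV420P10,
                          dither_type="error_diffusion")
final.set_output()
```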
