ecchh | 46 points | Sep 11 2017 01:25:47

[META] For those confused/angry about 10-bit encodes, look no further.

This is why many release groups use 10bit encoding regardless of the source bit-depth.

TL;DR: it saves bandwidth, allowing for a reduced bitrate while keeping the same level of perceived quality as a higher-bitrate 8-bit encode. 10-bit also reduces color-banding artifacts at low bitrates.
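
For anyone who wants to try one, here's a minimal sketch of a 10-bit x265 encode driven from Python. It assumes an ffmpeg build with libx265 is on your PATH, and the filenames are placeholders:

    import subprocess

    # 10-bit x265 encode of an 8-bit source. The encoder does the
    # 8-bit -> 10-bit conversion internally via the pixel format.
    subprocess.run([
        "ffmpeg",
        "-i", "in.mkv",              # placeholder source file
        "-c:v", "libx265",           # HEVC software encoder
        "-pix_fmt", "yuv420p10le",   # 10-bit 4:2:0
        "-crf", "22",                # constant-quality mode
        "-preset", "medium",
        "-c:a", "copy",              # pass the audio through untouched
        "out.mkv",                   # placeholder output file
    ], check=True)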

[-] ndg5800 | 5 points | Sep 11 2017 10:52:20

Dear all, can someone elaborate on the difference between a plain x265 vid and a 10-bit x265 vid? Also, I have a question: the whole purpose of x265 is to reduce size without quality loss, so why are some 10-bit x265 vids massive compared to a 1080p x264 or even a plain x265? Someone told me that 10-bit x265 has awesome colour reproduction.

Forgive my ignorance and many thanks in advance.

[-] Jhoopes517 | 4 points | Sep 11 2017 01:33:29

That's why I'm in the market for a 1050ti.

10-bit NVENC encoding, here I come!

[-] ecchh | 2 points | Sep 11 2017 01:36:18

Hope you find a good deal on one!

If you're looking for a good NVENC encoder, I'd suggest StaxRip, if you haven't heard of it already.

[-] Jhoopes517 | 4 points | Sep 11 2017 01:54:24

Got a 4GB one for $100 that I'm supposed to pick up tomorrow when the storm passes. Upgrading from a 960, so I'm not expecting much more performance-wise, but encoding should definitely see a boost.

Edit: also, love StaxRip! Bounced between it and MediaCoder, but I prefer Stax.

[-] chagmed | 1 points | Sep 11 2017 01:45:03

Can one use staxrip with nvenc without having to install nVidia's CUDA toolkit?

[-] Jhoopes517 | 4 points | Sep 11 2017 01:53:12

Yessir. I use it with my GTX 960 with nothing special installed.

[-] ecchh | 1 points | Sep 11 2017 03:26:43

Yeah, you don't need the CUDA toolkit anymore with the newest version of StaxRip.

[-] chagmed | 2 points | Sep 11 2017 07:27:22

Excellent, thanks. I will give it a try. My last attempt to install the CUDA toolkit failed due to its arcane prereqs (e.g., an old version of Visual Studio).

[-] douchebanner | 1 points | Sep 11 2017 08:05:19

I've encoded some test files with a 1060, and for an 8-bit source I can't tell the difference between an 8-bit encode and a 10-bit one. Whatever difference there is isn't easily perceivable (at least by me).

[-] Jhoopes517 | 2 points | Sep 11 2017 09:52:13

I'm mainly in it for the faster encoding speeds.

[-] siddoshi02 | 1 points | Sep 18 2017 05:49:01

Is there any way to encode 10-bit NVENC with an RX 470?

[-] Jhoopes517 | 1 points | Sep 18 2017 12:48:09

Not through NVENC.

There's something available to you called VCEEnc.

Dunno if it does 10-bit, but you should have 8-bit.

[-] anastusfocht | 3 points | Sep 11 2017 12:15:04

My only issue with 10-bit files with h264/265 encoding is that most standalone players either barely work with them or, more commonly, don't at all. My WD Live box won't play them, my Pi box vomits on 10-bit files regardless of bitrate or frame size, and none of my Dune boxes do either. Apparently I have to build an HTPC or get an Android box, all of which seem to have very mixed reviews on 10-bit playback.

Anyone have a standalone box they'd recommend?

[-] coheedcollapse | 3 points | Sep 11 2017 23:11:55

I know this isn't a straight-out solution, but if you've got issues watching stuff on certain devices, you can run a Plex server for free on your main PC and simply stream through that instead of playing files directly. It should automatically transcode to something your target device can handle.

I mean, in a few years it's going to be hard to find anything that won't be able to handle 265, I suspect, so a temporary workaround while keeping the "better" files is a decent halfway point between finding other files and buying a new device altogether.

[-] ecchh | 1 points | Sep 11 2017 12:21:08

I suppose that's true. In my use case, the file is always being played back on a PC of some sort (custom built), so I haven't noticed any issues myself.

As far as boxes go, an Nvidia Shield can do 4K x265. Might be worth looking into. [Source]

[-] Axelstrife | 2 points | Sep 11 2017 01:41:59

How do you encode in 10bit?

[-] ecchh | 7 points | Sep 11 2017 03:27:16

I personally use Handbrake Nightly. You need to download the nightly version, and then the 10-bit library files from their forums. Handbrake, as opposed to StaxRip, doesn't need to extract & remux the files to convert them, and encoding on the processor side always produces better quality at the expense of time. Still, StaxRip is incredibly useful for hardware-accelerated encoding, and I still sometimes use it to quickly shrink a Bluray for my personal collection.

[-] littlebuck2007 | 5 points | Sep 11 2017 04:00:18

I know very little about encoding and all that jazz, but could you explain a little as to why using the CPU in lieu of the GPU returns a better product? I understand the speed differences and have above-average knowledge of computer hardware; I just don't understand how the two processes end in different results.

[-] ecchh | 2 points | Sep 11 2017 04:31:47

I'm not an engineer or anything, but I understand a little bit of how video encoding works.

Video encoding involves some pretty heavy and unpredictable integer and floating point maths. GPUs are generally better at floating point operations, but usually when the operations are more predictable.

With video encoding, the operations that need to be performed from frame to frame are so complex and unpredictable that the GPU has trouble retaining all the information. Why? Bits.

CPUs operate in 80-bit extended precision mode when performing floating point operations, while the GPU operates either at single-precision 32-bit or double-precision 64-bit. Either way, more information is being lost with every single cycle compared to the CPU.
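
If you're curious, here's a quick toy comparison in Python (using numpy; it has nothing to do with any actual encoder) showing how rounding error piles up faster in narrower float types:

    import numpy as np

    # Add 0.1 a million times at different precisions. 0.1 can't be
    # represented exactly in binary, so every add loses a little, and
    # narrower types lose more.
    def accumulate(dtype, steps=1_000_000):
        total = dtype(0)
        inc = dtype(0.1)
        for _ in range(steps):
            total += inc
        return total

    print("exact  :", 100000.0)
    print("float32:", accumulate(np.float32))     # noticeably off
    print("float64:", accumulate(np.float64))     # much closer
    print("float80:", accumulate(np.longdouble))  # x87 extended precision on most x86 builds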

Again, not an engineer, this is just what I've gathered from research and asking around. Anyone with more accurate knowledge, please, kindly educate me!

[-] littlebuck2007 | 3 points | Sep 11 2017 04:51:52

Interesting. I assume you can't answer this, but is there a way to marry the two together? You'd think that there would be a way that the CPU could do some stuff with precision while the GPU did all the leg work. Good to know though. Thanks for the information.

[-] tiiiiimmmm | 7 points | Sep 11 2017 08:34:45

That's just not how computers work, unfortunately. The architecture of CPUs and GPUs is fundamentally different, and they actually use different processes to perform math. You can see a little of this in what u/ecchh posted: the CPU can work with 80-bit values while the GPU maxes out at 64-bit values. Because of how binary math works, adding an extra bit doubles the number of values you can store. For example (common notation puts the letter 'b' in front of a number to indicate it's a binary value; the decimal value is in parentheses):

1 bit can represent 2 values, b0 (0) and b1 (1).

2 bits can represent 4 values, b00 (0), b01 (1), b10 (2), and b11 (3)

3 bits can represent 8 values, b000 (0), b001 (1), b010 (2), b011 (3), b100 (4), b101 (5), b110 (6), and b111 (7)

And so on...
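
The same arithmetic in a few lines of Python, if you want to play with it:

    # Distinct values representable in n bits, per the list above.
    for bits in (1, 2, 3, 8, 10, 64, 80):
        print(f"{bits:2d} bits -> {2 ** bits:,} values")

    # Going from 64 to 80 bits multiplies the number of bit patterns
    # by 2**16 = 65,536.
    print(2 ** 80 // 2 ** 64)  # 65536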

This means that the CPU can work with a number range that includes 2^16 (65,536) times more values than a GPU's. So why is this so important for video encoding?

Video encoding and compression work primarily via a system called 'predictive motion compensation'. Basically, for every frame of the video, the encoder tries to figure out how few pixels need to be updated (and therefore stored in the file) relative to the previous frame. The first frame of a video is a 'key' frame and has a value written for every pixel on the screen. The next frame only holds the values for the pixels that change color; every other pixel is left blank, and the value from the same location in the previous frame will be used while decoding instead. The computer keeps making frames like this until it comes to a frame where most of the information changes (typically a camera cut of some kind or part of a high-motion scene) and it needs to establish a new 'key' frame.
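
A toy sketch of that scheme in Python - real encoders predict motion per block rather than diffing single pixels, and the frames here are made up, but the key-frame/delta bookkeeping is the same in spirit:

    import numpy as np

    # Store a full 'key' frame, then only per-pixel deltas, until too
    # much of the frame changes and a new key frame is needed.
    def encode(frames, key_threshold=0.5):
        out, prev = [], None
        for frame in frames:
            changed = None if prev is None else (frame != prev)
            if changed is None or changed.mean() > key_threshold:
                out.append(("key", frame.copy()))             # every pixel stored
            else:
                idx = np.argwhere(changed)                    # which pixels changed
                out.append(("delta", (idx, frame[changed])))  # only those values stored
            prev = frame
        return out

    frames = [np.array([[1, 2], [3, 4]]),
              np.array([[1, 2], [3, 5]]),   # one pixel changed -> delta frame
              np.array([[9, 9], [9, 9]])]   # everything changed -> new key frame
    print([kind for kind, _ in encode(frames)])  # ['key', 'delta', 'key']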

Without getting too technical, the lack of precision on the GPU means that the computer cannot explore as many possible ways each frame can be represented, and therefore the GPU never finds as 'optimal' a solution for compressing as the CPU does. This generally means smaller areas of each frame are recognized as being redundant with the previous frame (and therefore not needing to be stored), and more key frames are created, making the file size bigger (this is what people refer to when they say GPU encoding is less efficient).

I would imagine the hybrid algorithms that work via CPU and GPU try to identify different sections of the video to offload to either the CPU or the GPU, with smarter ones sending more complex scenes to the CPU and less complex ones to the GPU. This is because frames have to be encoded in order for the reasons previously explained: the computer can't start working on frame B until A is finished, it can't work on C until B is finished, and so forth. If you had the CPU and GPU trying to encode the same frame at the same time, or even different frames in the same scene at the same time, you would constantly have timing issues where the GPU would have to wait for the CPU to complete its task before continuing, which would be an inefficient nightmare.

Bonus explanation: this is why encoding videos is slow in general. A lot of making programs more efficient is done via parallelization, i.e. breaking a task into multiple steps and doing those steps at the same time. Because the encoder must create frames in order (the output of frame A affects the output of frame B, and the output of frame B affects the output of frame C, and so on), encoders are really limited in the ways they can parallelize. GPU encoders are much faster than CPU encoders because GPUs have special instruction blocks made specifically for encoding video, while the CPU has a set of generic instructions that it uses to do every math operation. This means a GPU can do some operations natively in one instruction, while a CPU must perform the same operation through whatever generic instructions it needs, in order, over multiple cycles (each cycle can only perform part of an instruction because of how CMOS-based logic works; CMOS transistors are what your CPU and GPU are made of, so you can see how reducing the number of instructions given to the processor really speeds up processing). The GPU just can't find as optimal a way to compress the frames as the CPU does. This is why you get bigger files encoded much faster on a GPU.

[-] littlebuck2007 | 2 points | Sep 11 2017 08:57:14

Wow, that's a lot of information. It makes sense the way that you've explained it. I didn't know what exactly was taking place when a video is getting encoded, so the bit values and stuff didn't really mean anything. I know binary and to a very limited extent, instructions, but it makes sense when you put it all together like this. Thanks for the info!

Now all I need to do is learn what all of the settings in these programs do, and what is an ideal setting for me.

[-] ecchh | 2 points | Sep 11 2017 04:57:47

That's kind of what NVENC is; it's hardware-accelerated encoding. Some of the work is offloaded to the GPU, but the CPU is still working pretty hard as well. It just isn't as good as doing it solely on the CPU. It really comes down to:

  1. How small/large do you want your file? Quick, high-quality encoding always takes up more hard drive space than a painfully slow encode.
  2. How important is accuracy to the source material in your encode? Grain is always preserved better on the CPU side. Films without grain will look better through NVENC than films with grain, but there will still be some artifacts around hard edges.
  3. How much time do you really have? Are you willing to leave your computer on all night & have it shut down when it's finished to do x265 10-bit encodes? (It takes me ~6 hours to encode one film @ 1080p x265 10-bit.)

[-] Axelstrife | 1 points | Sep 11 2017 08:33:42

I just encoded an episode with StaxRip 10-bit x265 that I previously encoded with MeGUI 8-bit x265, and idk which is better.

The bitrate between the two only differs by 1.

http://screenshotcomparison.com/comparison/117098

Interested to know which you think looks better?

I didn't mess with the standard quality settings, just set a 700MB filesize, x265, and 10-bit in StaxRip.

[-] ecchh | 2 points | Sep 11 2017 08:39:29

The mouseover one looks better to my eyes - it appears sharper, while the mouse-out one is definitely softened. The mouseover appears to have more grain in it as well. In motion it might be different; the sharpness of that one could mean visible ringing-type artifacts if the bitrate is low enough, so it's best to do a comparison of stills and motion shots to choose which one looks best to you.

[-] Axelstrife | 1 points | Sep 11 2017 08:53:26

is grain a good thing or?

http://i.cubeupload.com/W9KRVU.png

Original x264.

This is a re-encode; struggled to even find this online, but 380GB for all seasons of According to Jim isn't worth the space lol.

[-] ecchh | 3 points | Sep 11 2017 09:05:22

Usually there is grain on a Bluray source. Some people don't like it and encode to remove it; I personally like the grain and try to retain it in my encodes. Again, it comes down to personal preference.

Yes, I'd say the mouseover looks closer to the source material than the mouse-out picture.

[-] Axelstrife | 1 points | Sep 11 2017 09:08:43

Thanks for your help.

StaxRip has an option for 12-bit. Is this much better?

[-] ecchh | 2 points | Sep 11 2017 09:18:15

12-bit isn't very compatible right now. I'd have to assume it would save more bandwidth. Handbrake has an option for it as well, but I would assume encoding would take even longer. I'd say give it a shot, especially if you're doing NVENC.

[-] Lordmau5 | 1 points | Sep 11 2017 11:30:02

Do you have a good preset config that you could share? :)

I've got a few things over here that I'd really like to encode to 10-bit x265, but I can't find any good presets and don't want to risk a loss of quality or similar.

[-] ecchh | 2 points | Sep 11 2017 11:53:08

Sure - I can share some of the settings I use. I'll just run down the tabbed sections in Handbrake and list them off. If a tab isn't mentioned, there's nothing to change there.

Dimensions - not really a setting per se, but sometimes the autocrop in Handbrake isn't zealous enough, or perhaps a little too overzealous. Make sure the image is cropped equally on either side, e.g. 140 on the top and bottom, as opposed to 140 top and 142 bottom. Common dimensions are 1920x800 for letterboxed 1080p, 1280x533 for letterboxed 720p, and 1440x1080 for 4:3.

Filters - I turn everything here off. I personally like to preserve the film grain from the original source file, though some people prefer their encodes without it. If you prefer no grain, you can turn on hqdn3d under Denoise.

Video - Framerate: same as source. Optimize video: medium preset. If you like film grain and don't mind larger filesizes, set Encoder Tune to grain; if you don't, just leave it set to "None". Set the encode mode to Quality @ CRF 22. This should be enough for perceived quality very close to the original Bluray source. CRF mode will also take much less time than 2-pass.

If you're more interested in targeting a specific filesize, you can turn on Avg Bitrate 2-pass encoding. Leave turbo first pass unchecked. The average bitrate can be as low as 2,000 for animated sources and as high as you want for more quality; I usually go to about 8,000 on 2-pass if the image is very grainy. 2-pass will usually produce better results than CRF at similar bitrates, since the first pass calculates how each frame should be composed before a single byte is written, rather than trying to predict how to encode on the fly. Be warned though, this is a long process: ~6 hours or more in some cases for one film.

Audio - this is up to personal preference really, but I mainly use Opus, though AAC is more compatible. 448k should be enough bitrate for a 5.1-channel sound source, though some push it to 384 or lower. 128k for stereo commentary tracks, 224k for stereo main tracks.

I don't think I've missed anything. I don't really use custom switches for x265 as much as x264. The presets seem to do well enough for me.
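
For reference, here's my rough ffmpeg approximation of those settings, run from Python - this isn't Handbrake's exact pipeline, and the filenames and crop values are placeholders you'd adjust per source:

    import subprocess

    # Letterboxed 1080p source: crop to 1920x800 (140px off top and
    # bottom), x265 10-bit at CRF 22, medium preset, grain tune,
    # 5.1 Opus audio at 448k.
    subprocess.run([
        "ffmpeg",
        "-i", "source.mkv",             # placeholder input
        "-vf", "crop=1920:800:0:140",   # width:height:x:y - equal top/bottom crop
        "-c:v", "libx265",
        "-pix_fmt", "yuv420p10le",      # 10-bit
        "-preset", "medium",
        "-tune", "grain",               # drop this if you denoise instead
        "-crf", "22",
        "-c:a", "libopus",
        "-b:a", "448k",
        "encode.mkv",                   # placeholder output
    ], check=True)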

[-] dkimmortal | 3 points | Sep 11 2017 02:05:38

Apps like Handbrake have 10-bit encode builds I think, idk about other programs. Once you have the correct app, there should be an option to choose it.

[-] Jhoopes517 | 2 points | Sep 11 2017 02:07:38

Most apps have it as an option. I recommend StaxRip. It's a nice program to use. Easy, to be honest.

[-] triadwarfare | 2 points | Sep 11 2017 09:09:17

Welp. Looks like it's time to buy a new phone. :(

Current phone: LG G4

[-] Avid28 | 1 points | Sep 11 2017 11:57:49

10-bit is fine, it's the 265 that's killing me.

[-] mimecry | 1 points | Sep 12 2017 08:24:49

what's wrong with x265?

[-] Avid28 | 2 points | Sep 12 2017 12:23:57

Nothing. It's just that my TV doesn't support it and I don't have a streaming device right now. I use a USB.

[-] Obi_Wanka_Noobie1 | 1 points | Sep 11 2017 14:25:10

Are 10-bit encoded videos useful if I have an 8-bit monitor?

[-] ecchh | 1 points | Sep 11 2017 15:04:34

Yes, a 10-bit encode is mostly for saving bandwidth. It's an upconvert of 8-bit content unless the HDR tag is specifically in the title.

[-] [deleted] | 1 points | Sep 11 2017 22:50:52

[deleted]

[-] ecchh | 3 points | Sep 11 2017 22:53:18

1080p 10-bit content is a no-go - lots of stuttering. Tried it on a friend's Raspberry Pi 3. Not sure about 720p content.

[-] greatryry | 1 points | Sep 14 2017 19:07:38

x265 has been total crap for me as well. I download stuff and try to play it, and the video always lags behind. Always.

[-] ecchh | 1 points | Sep 14 2017 19:32:54

Your device isn't fast enough to play x265, in that case.

[-] greatryry | 1 points | Sep 14 2017 19:35:52

And if that's the case, it's going to be the case for a lot of people. Thus, x265 today isn't a sound solution. I've seen a lot of people with this problem. So forcing 265 on everyone rather than 264 is not great. At least do both.

[-] ecchh | 2 points | Sep 14 2017 20:52:19

No one is forcing x265 on anyone. There are still plenty of people doing x264 encodes. Hell, I even still see XviD encodes out there.

x265 has been around for almost five years now. It makes sense that eventually it would pick up traction with people. A lot of TVs and set-top boxes play HEVC. 4K content is exclusively encoded in HEVC because of the bandwidth savings. It's just the way things are looking going forward.
