New H.265 video standard approved, will allow for high-quality video at half the bitrate


The International Telecommunication Union (ITU) announced today that its members had agreed upon the format for the successor to H.264 video — a development that the body believes will set the stage for a new generation of high-definition video. Casually dubbed High Efficiency Video Coding (HEVC), the format's official name is Recommendation ITU-T H.265, and it brings with it one very specific benefit: the ability to reproduce quality imagery at half the bitrate required with H.264.

H.264 has become a de facto standard in recent years, turning up in everything from Blu-ray players to web video. The ITU said that it envisions H.265 being able to support video needs for the next decade, and with 4K streaming video already on the horizon and consumers wanting more video despite the limitations of their ISPs, there will no doubt be a need for more efficient codecs in the very near future. While it will undoubtedly take quite some time for the new format to reach the kind of ubiquity its predecessor currently enjoys, the ITU says companies like Mitsubishi have already demonstrated implementations of the new format.
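As a very rough, hypothetical illustration of what "half the bitrate" would mean in practice, the sketch below encodes one source clip twice with ffmpeg: once in H.264 at a reference bitrate and once in H.265 at half that rate. The file names and bitrates are placeholders, it assumes an ffmpeg build that includes libx265, and whether the two outputs actually look comparable depends heavily on the content and settings.

```python
# Hedged sketch: encode one source with H.264 and with H.265 at half the
# bitrate, to eyeball the "same quality at half the rate" claim.
# Assumes an ffmpeg build with libx264 and libx265; "input.mp4" and the
# bitrates are placeholders.
import subprocess

SOURCE = "input.mp4"          # hypothetical source clip
H264_BITRATE = "4000k"        # arbitrary reference bitrate
H265_BITRATE = "2000k"        # half of the H.264 rate

# H.264 reference encode
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "libx264", "-b:v", H264_BITRATE,
    "-an", "h264_reference.mp4",
], check=True)

# H.265 encode at half the bitrate
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "libx265", "-b:v", H265_BITRATE,
    "-an", "h265_half_rate.mp4",
], check=True)
```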


Comments

Yeah, take that, fansubbers, your 10-bit encodings are out of date now! Start encoding yours in H.265 so nobody can play them.

CCCP…

By default, CCCP can’t play 10-bit…

Ever since last year, CCCP can play 10-bit by default.

In CCCP, 10-bit plays you!

K-Lite all the way.

K-Lite is for the illiterate.

It’s only fitting that you would like CCCP!

Kind of, since it is very simple and useful, but Russian does not mean Soviet by default.

Well, those 10-bit encodings do save hard drive space.

Very, very little. I will never understand the purpose. You eliminate 95% of all hardware decoders just to save an average of 10-20 MB on a 300+ MB file.

The image quality is much better. Less banding.

Compared to 8-bit at the same image quality? You save a ton of space.

I’ll take the slightly larger file that I can watch on any device I have.

I watch stuff on my laptop anyway… Also:

Newer ARM chips can do 10-bit 720p at least (and unless you have the Xperia Z or something, why would you need FHD?). Pretty much any laptop better than an Atom should be able to as well.

Unless you have an ancient or low-end device, or really want 1080p, 10-bit works fine.

Also, you’d need a much larger file to achieve the same image quality with 8-bit as with 10-bit. Most 8-bit stuff, even the really huge files, shows a lot of banding.
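A toy way to see where that banding comes from (my own sketch, not anything from an actual encoder): quantize a smooth, dark gradient at 8-bit and at 10-bit depth and count how many distinct levels survive.

```python
# Toy illustration of banding: quantize a smooth dark gradient at 8-bit
# and 10-bit depth and count how many distinct levels survive.
# This is a simplification; real encoders add dithering, noise, etc.
import numpy as np

# Smooth dark gradient: luminance ramps from 0.00 to 0.05 of full scale
gradient = np.linspace(0.00, 0.05, 1920)

levels_8bit = np.unique(np.round(gradient * 255)).size    # 8-bit quantization
levels_10bit = np.unique(np.round(gradient * 1023)).size  # 10-bit quantization

print(f"distinct 8-bit levels:  {levels_8bit}")   # ~14 steps -> visible bands
print(f"distinct 10-bit levels: {levels_10bit}")  # ~52 steps -> much smoother
```

Roughly four times as many steps means each band is about a quarter as wide, which is why the difference is most visible in dark gradients.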

Yeah, that "better image quality" is going to look great on the crappy TN panels most laptops have.

Newer ARM chips do 10-bit decoding in software, which is why performance and reliability are a crapshoot. If you are going to watch the content on tablets/phones, does 10-bit encoding really matter? Why not just use standard encoding so your devices can play it natively, hardware accelerated and all? I dare you to tell the difference under normal viewing conditions, unless all you do is pause the video and pixel-peep.

10-bit doesn’t work "fine." It only works "reliably" on PCs, thanks to raw processing power. Even then, not all software supports it. The point of having a video file is that it’s portable and I can play it on any device, which is the advantage over physical media. What’s the point of restricting it even more? To save a couple of megabytes and to see some improved color banding that I won’t even care about when I’m actually watching the video? I don’t know about you, but when I watch video content, I watch it for the content, the story, not for finding where the color banding is.

To be precise, 10-bit coding has nothing to do with raising image quality; the point of the setting is to save space.

It allows more efficient compression by switching from 8 bits × 3 colours = 24 bits = 3 bytes to 10 bits × 3 colours = 30 bits, padded to 32 bits = 4 bytes.

The algorithm works more efficiently when the data fits the CPU’s 1-byte/2-byte/4-byte/8-byte power-of-two paradigm and instruction sets, rather than an odd 3 bytes.
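Spelling out the arithmetic from that comment (this just restates the commenter’s numbers; it isn’t a claim about how any particular encoder actually packs samples):

```python
# The arithmetic from the comment above: bits per pixel for 8-bit vs 10-bit
# with three colour components, and the byte boundary each one lands on.
import math

for depth in (8, 10):
    bits = depth * 3                    # three colour components per pixel
    bytes_padded = math.ceil(bits / 8)  # 24 bits -> 3 bytes, 30 bits -> 4 bytes
    print(f"{depth}-bit: {bits} bits per pixel, padded to {bytes_padded} bytes")
```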

> Yeah, that "better image quality" is going to look great on the crappy TN panels most laptops have.

Actually, banding shows up on pretty much every monitor, no matter how crappy…

> Newer ARM chips do 10-bit decoding in software, which is why performance and reliability are a crapshoot. If you are going to watch the content on tablets/phones, does 10-bit encoding really matter? Why not just use standard encoding so your devices can play it natively, hardware accelerated and all? I dare you to tell the difference under normal viewing conditions, unless all you do is pause the video and pixel-peep.

It’s very easy to notice the difference in banding in darker scenes even if you aren’t trying.

> 10-bit doesn’t work "fine." It only works "reliably" on PCs, thanks to raw processing power.

And it works on most new high-end phones, thanks to raw processing power.

> Even then, not all software supports it.

So find a player that does?

> The point of having a video file is that it’s portable and I can play it on any device, which is the advantage over physical media.

Are you saying that we should downconvert all video to some crappy resolution and encoding to make sure it works with devices we don’t use?

> What’s the point of restricting it even more? To save a couple of megabytes and to see some improved color banding that I won’t even care about when I’m actually watching the video?

I care about image quality, and I think those extra megabytes are pretty nice too. If you don’t, feel free to watch crappy video in large files.

> I don’t know about you, but when I watch video content, I watch it for the content, the story, not for finding where the color banding is.

I watch it for the content, the story, without getting annoyed by low image quality.

And hard drive space is so expensive today? Besides, most MP4 encodes (regular H.264) actually end up smaller than those "space saving" 10-bit encodes.

But with worse image quality. I can’t link here, but you can find small 10-bit encodes with better image quality than those MP4 encodes.

Worse image quality? Maybe. Significantly? Doubtful. Do you watch a video by pausing it and pixel-peeping the picture? No. Until I see a double-blind comparison showing people can tell the difference between a 10-bit encode and a regular encode (especially if the source is DVD/Blu-ray) under normal viewing conditions, those claims of "better image quality" are just empty claims.

Go to any dark scene. Most small 8-bit encodes will show significant banding; 10-bit will show a lot less.

And would you notice that under normal viewing conditions? There is probably going to be a lot more quality reduction from the lousy TN panels that most people use as PC monitors.

Stop talking when you don’t even know the slightest thing about encoding or quality. You can start by reading this and this. If that is too hard for you to comprehend, then at least read this.

The main advantages come when you are compressing cartoons, since they often produce a lot of banding, which means two things. First, 10-bit lets you avoid using a low CRF value, and therefore saves a lot of space; second, 10-bit compresses better, which can give up to a 30% reduction in file size, although a source with a lot of grain will not see much of a reduction. It also makes things easier for encoders, since they don’t have to apply an extra filter to deal with the new banding they create, which means there is one less factor they can screw up.
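If you want to check the file-size claim yourself, something like the hypothetical sketch below would do it: the same clip, the same CRF, once at 8-bit and once at 10-bit. The file names and CRF value are placeholders, and the 10-bit pass assumes your ffmpeg was built with a 10-bit-capable libx264.

```python
# Hedged sketch: encode the same clip at the same CRF in 8-bit and 10-bit
# H.264 and compare file sizes. Assumes ffmpeg with a 10-bit-capable libx264;
# "source.mkv" and CRF 18 are placeholders.
import os
import subprocess

SOURCE = "source.mkv"

# 8-bit encode
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "libx264", "-crf", "18", "-pix_fmt", "yuv420p",
    "-an", "encode_8bit.mkv",
], check=True)

# 10-bit encode at the same CRF
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "libx264", "-crf", "18", "-pix_fmt", "yuv420p10le",
    "-an", "encode_10bit.mkv",
], check=True)

for name in ("encode_8bit.mkv", "encode_10bit.mkv"):
    print(name, os.path.getsize(name) // 1024, "KiB")
```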

The scene is always behind; it was only a couple of years ago that they started encoding at constant quality instead of at a constant file size. What difference does that make, you might think? The quality of what you got was very inconsistent through a series. Even today they haven’t moved up to L5.1 or 16 ref frames; the majority are still using something like L3.1 with 5 ref frames, which doesn’t offer much compression, yet they still aim for the same file sizes they did back when they used Xvid. That is goddamn stupid when internet speeds have increased a lot since the start of Xvid and the price-to-GB ratio has also improved a lot.
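For reference, those ref-frame numbers fall out of the level limits: the decoded picture buffer size (MaxDpbMbs) in the H.264 level table caps how many reference frames fit at a given resolution. A quick sketch, using MaxDpbMbs values quoted from memory (worth double-checking against the spec):

```python
# Rough sketch: the maximum number of reference frames an H.264 level allows
# at a given resolution, min(MaxDpbMbs // picture size in macroblocks, 16).
# MaxDpbMbs values quoted from memory from the H.264 level table; this ignores
# other level constraints such as maximum frame size and bitrate.
MAX_DPB_MBS = {"3.1": 18000, "4.1": 32768, "5.1": 184320}

def max_ref_frames(width, height, level):
    # Picture size in 16x16 macroblocks (dimensions rounded up to the MB grid)
    mbs = ((width + 15) // 16) * ((height + 15) // 16)
    return min(MAX_DPB_MBS[level] // mbs, 16)

for res in ((1280, 720), (1920, 1080)):
    for level in ("3.1", "4.1", "5.1"):
        print(res, f"L{level}:", max_ref_frames(*res, level), "ref frames")
# e.g. 720p at L3.1 comes out to 5 ref frames; 1080p at L5.1 hits the cap of 16.
```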

Then why does the scene encode it like that? There are a few reasons; one of them is that the majority don’t care about quality in the slightest (or just have bad vision). Just look at how many people actually download cam-rips >.>

TL;DR: ignorance is bliss.

Oh, such great image quality… viewed on a laptop with a crappy TN panel. Yeah, the extra effort is definitely worth it. Meanwhile, I’ll take my "lousy" 8-bit encodes that I can watch on any device I want.
